Stock Market Prediction of S&P 500 and DJIA using Bacterial Foraging Optimization Technique

Ritanjali Majhi, G. Panda, Senior Member IEEE, G. Sahoo, P. K. Dash, Senior Member IEEE and D. P. Das

Abstract - The present paper introduces the Bacterial Foraging Optimization (BFO) technique to develop an efficient forecasting model for the prediction of various stock indices. The connecting weights of the adaptive linear combiner based model are optimized by the BFO so that its mean square error (MSE) is minimized. The short and long term prediction performance of the model is evaluated with test data and the results obtained are compared with those obtained from a multilayer perceptron (MLP) based model. It is in general observed that the proposed model is computationally more efficient, more accurate in prediction and takes less training time than the standard MLP based model.
I. INTRODUCTION
Financial forecasting, or specifically stock market prediction, is one of the hottest fields of research lately due to its commercial applications, owing to the high stakes and the kinds of attractive benefits that it has to offer. As more and more money is invested in the stock market, investors become nervous and anxious about the future trends of stock prices in the markets. The primary concern is to determine the appropriate time to buy, hold or sell. Unfortunately, stock market prediction is not an easy task, because stock market indices are essentially dynamic, non-linear, complicated, nonparametric and chaotic in nature [1]. The time series of these processes are multi-stationary, noisy and random, and have frequent structural breaks [2], [3]. In addition, stock market movements are affected by many macro-economic factors [4] such as political events, firms' policies, general economic conditions, investors' expectations, institutional investors' choices, movements of other stock markets, the psychology of investors, etc.
There has been a lot of research in the field of stock market prediction across the globe on numerous stock exchanges. Generally there are three schools of thought regarding such prediction. The first school believes that no investor can achieve above-average trading advantages based on historical and present information. The major theories include the Random Walk Hypothesis and the Efficient Market Hypothesis [5]. If these hypotheses held true, all prediction methods would be worthless. Taylor [6], however, gives compelling evidence to reject the random walk hypothesis and thus encourages researchers to develop better models for market price prediction.


Ritanjali Majhi and P. K. Dash are with the College of Engineering, Bhubaneswar, INDIA.
G. Panda is with the Department of Electronics & Communication Engineering, National Institute of Technology, Rourkela 769008, India, phone: +91-9437048906 (e-mail: ganapati.panda@gmail.com).
G. Sahoo is with the Department of CSE, BIT, Mesra, Ranchi, India.


The second view is that of fundamental analysis. Analysts undertake in-depth studies of the various macro-economic factors and look into the financial conditions and results of the industry concerned to discover the extent of correlation that may exist with changes in the stock prices.
Technical analysts present the third view on market price prediction. They believe that there are recurring patterns in market behavior which can be identified and predicted. In the process they use a number of statistical parameters called technical indicators, together with charting patterns derived from historical data. However, these techniques often yield contradictory results due to their heavy dependence on human expertise and justification.
The recent trend is to develop models for forecasting financial data. These models can be broadly divided into statistical models and soft-computing models. One of the well known statistical methods is the ARIMA method. Recent advances in the field of soft computing have given a new dimension to the field of financial forecasting. Most Artificial Neural Network (ANN) based models use historical stock index data such as technical indicators [12] to predict future prices. Tools based on ANN have increasingly gained popularity due to their inherent capability to approximate any nonlinear function to a high degree of accuracy. Neural networks are less sensitive to error term assumptions and they tolerate noise, chaotic components and heavy tails better than most other methods [7].
The three most popular ANN tools for the task are the Radial Basis Function network (RBF) [8], the Recurrent Neural Network (RNN) [9] and the Multilayer Perceptron (MLP). More recently, new models based on Multi Branch Neural Networks (MBNN) [10] and Local Linear Wavelet Neural Networks (LLWNN) [11], among others, have been reported. Use of the Genetic Algorithm (GA) has also been reported recently [1], [12]. Very little work has been reported on the use of evolutionary computing tools for training the weights of forecasting models. Recently a new evolutionary computing technique known as Bacterial Foraging Optimization (BFO) has been proposed [13] and successfully applied in many fields [14].
In this paper a BFO based model for the prediction of the prices of leading stock indices is proposed. The paper demonstrates the ability of the BFO based model to predict stock index movements, both for short term (next day, 3 days and 5 days) and long term (7 days and 15 days) prediction. A simulation study with real data shows that the proposed BFO method offers superior prediction performance for both short and long term horizons.


The paper is organized as follows. Section II deals with the basic principle of the BFO. The application of BFO to stock market prediction is discussed in Section III, where the algorithm for developing the BFO based forecasting model is given. The experimental data, the technical indicators used and the simulation set-up are described in Section IV. The results are presented and discussed in Section V and, finally, conclusions are given in Section VI.

II. BASICS OF BACTERIAL FORAGING OPTIMIZATION

Bacterial Foraging Optimization (BFO) is a new evolutionary computation technique which has been proposed and introduced recently by Passino [13]. It is inspired by the foraging behaviour of bacteria. Bacteria tend to gather in nutrient-rich areas by an activity called chemotaxis. It is known that bacteria swim by rotating whip-like flagella driven by a reversible motor embedded in the cell wall. E. coli has 8-10 flagella placed randomly on the cell body. When all flagella rotate counterclockwise, they form a compact helical bundle that propels the cell along a trajectory, which is called a run. When the flagella rotate clockwise, they pull on the bacterium in different directions, which causes the bacterium to tumble.
(1) Chemotaxis: An E. coli bacterium can move in two different ways; it can run (swim for a period of time) or it can tumble, and it alternates between these two modes of operation throughout its lifetime. In the BFO, a unit walk in a random direction represents a tumble and a unit walk in the same direction as the last step indicates a run. After one step the position of the i-th bacterium can be represented as

    θ^i(j+1, k, l) = θ^i(j, k, l) + C(i) φ(j)        (1)

where θ^i(j, k, l) represents the i-th bacterium at the j-th chemotactic, k-th reproduction and l-th elimination-dispersal step, C(i) is the length of the unit walk, which is set to be constant, and φ(j) is the direction angle of the j-th step. When the activity is a run, φ(j) is the same as φ(j-1); otherwise, φ(j) is a random angle directed within the range [0, 2π].
If the cost at θ^i(j+1, k, l) is better than the cost at θ^i(j, k, l), then the bacterium takes another step of size C(i) in that direction; otherwise it tumbles. This process continues until the number of steps taken exceeds N_c.
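As an illustration, a minimal NumPy sketch of one chemotactic move of a single bacterium is given below. It uses the normalised random direction of the Move step of Section III in place of an explicit angle, and all names are illustrative rather than taken from the paper.

    import numpy as np

    def chemotactic_move(theta, C, improved, last_phi, rng):
        # Run if the last move improved the cost, otherwise tumble to a new
        # random unit direction, then take one step of length C(i) (eq. (1)).
        if improved and last_phi is not None:
            phi = last_phi                                   # run: keep the old direction
        else:
            delta = rng.uniform(-1.0, 1.0, size=theta.shape)
            phi = delta / np.sqrt(delta @ delta)             # tumble: random unit direction
        return theta + C * phi, phi

The returned pair (new position and direction) is fed back into the next call, so that successive improving steps form a run.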
(2) Reproduction: After all N_c chemotactic steps have been covered, a reproduction step takes place. The bacteria are sorted according to their accumulated cost (health). The S_r = S_b/2 bacteria with the poorest health (highest cost) die and the remaining S_r bacteria are allowed to split into two identical ones, which occupy the same positions in the environment as their parents. This keeps the population size constant.
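A sketch of this reproduction step under the same assumptions; health is taken as the cost accumulated by each bacterium over the chemotactic loop, so lower values are better.

    import numpy as np

    def reproduce(population, health):
        # Sort by accumulated cost; the poorer half of the population dies and
        # the healthier half splits into two identical copies, keeping S_b fixed.
        order = np.argsort(health)                    # ascending: healthiest first
        survivors = population[order[: population.shape[0] // 2]]
        return np.concatenate([survivors, survivors.copy()], axis=0)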
(3) Elimination and Dispersal: Since bacteria may get stuck around their initial positions or local optima, the diversity of the BFO population is changed, either gradually or suddenly, to avoid being trapped in local minima. A dispersal event happens after a certain number of reproduction steps. A bacterium is chosen, according to a preset probability p_ed, to be dispersed and moved to another position within the environment. These events may effectively prevent trapping in local minima, but they can also disturb the optimization process. The detailed mathematical treatment of this new concept is presented in [13].
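A sketch of elimination and dispersal under the same assumptions; the probability p_ed and the search bounds low and high are illustrative.

    import numpy as np

    def eliminate_disperse(population, p_ed, low, high, rng):
        # With probability p_ed a bacterium is eliminated and re-created at a
        # random location in the search domain, so the population size stays fixed.
        for i in range(population.shape[0]):
            if rng.random() < p_ed:
                population[i] = rng.uniform(low, high, size=population.shape[1])
        return population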

III. APPLICATION OF BFO TO STOCK MARKET PREDICTION

Referring to Fig. 1, the adaptive linear combiner is essentially an adaptive FIR filter with a number of inputs equal to the number of features in the input pattern. The weight vectors of the combiner are treated as the bacteria; initially their values are set to random numbers and a population of bacteria is chosen. Each bacterium updates its values using the BFO principle so as to minimize the mean square error (MSE), which serves as the cost function. The details of the optimization are discussed in sequence below.











Fig. 1 BFO based forecasting model: the input features are weighted by an adaptive linear combiner to produce the output y(k); the error e(k) between y(k) and the desired stock index d(k) drives the BFO based training.
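The cost assigned to one bacterium can be sketched as below, where a candidate weight vector w (the position of one bacterium) is applied to the N_is x p matrix X of input feature patterns and compared with the vector d of desired index values; the names are illustrative.

    import numpy as np

    def mse_cost(w, X, d):
        # Adaptive linear combiner output: weighted sum of the features of each pattern.
        y = X @ w
        e = d - y                 # error against the desired stock index d(k), Fig. 1
        return np.mean(e ** 2)    # mean square error used as the BFO cost function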


BFO algorithm for training of the forecasting model:

Step 1: Initialization of parameters
(i) S_b : number of bacteria to be used for searching the total region
(ii) N_is : number of input samples
(iii) p : number of parameters to be optimized
(iv) N_s : swimming length after which tumbling of the bacteria is undertaken in a chemotactic loop
(v) N_c : number of iterations to be undertaken in a chemotactic loop; always N_c > N_s
(vi) N_re : maximum number of reproduction steps to be undertaken
(vii) N_ed : maximum number of elimination and dispersal events to be imposed on the bacteria


(viii) P_ed : probability with which the elimination and dispersal events occur
(ix) The location of each bacterium, P(1:p, 1:S_b, 1), is specified by random numbers on [0, 1]
(x) The value of C(i) (i.e. the run length unit), assumed to be constant for all bacteria
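For reference, Step 1 can be collected into a small configuration block; the numerical values shown are those reported for the simulation study in Section V, while the dictionary and the initialization code are only an illustrative way of holding them.

    import numpy as np

    bfo_params = {
        "S_b": 16,      # number of bacteria searching the total region (Section V value)
        "N_is": 2510,   # number of input (training) samples
        "p": 10,        # number of parameters (combiner weights) to be optimized
        "N_s": 3,       # swim length before tumbling in a chemotactic loop
        "N_c": 5,       # chemotactic iterations per loop (N_c > N_s)
        "N_re": 140,    # maximum number of reproduction steps
        "N_ed": 2,      # maximum number of elimination and dispersal events
        "P_ed": 0.25,   # probability of elimination and dispersal
        "C": 0.0075,    # run length unit C(i), constant for all bacteria
    }
    # (ix) location of each bacterium: an S_b x p matrix of random numbers on [0, 1]
    rng = np.random.default_rng(0)
    P = rng.uniform(0.0, 1.0, size=(bfo_params["S_b"], bfo_params["p"]))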

Step 2: Iterative algorithm for optimization
This section models the bacterial population, chemotaxis, reproduction, and elimination and dispersal. Initially j = k = l = 0.
(i) Elimination-dispersal loop: l = l + 1
(ii) Reproduction loop: k = k + 1
(iii) Chemotaxis loop: j = j + 1
(a) For i = 1, 2, ..., S_b, the cost function (in this case the mean squared error) J(i, j, k, l) of the i-th bacterium is calculated as follows:
(1) The N_is training samples are passed through the model (10 technical indicators used at a time as input) and weighted to give the output.
(2) The output is then compared with the corresponding desired signal to calculate the error.
(3) The sum of squared errors averaged over N_is is finally stored in J(i, j, k, l). The cost function calculated over the N_is input samples is

    J = (1/N_is) Σ_{k=1}^{N_is} e²(k) = (1/N_is) Σ_{k=1}^{N_is} [y(k) - ŷ(k)]²        (2)

(4) End of the For loop.
(b) For i = 1, 2, ..., S_b the tumbling/swimming decision is taken.
Tumble: Generate a random vector Δ(i) whose elements Δ_m(i), m = 1, 2, ..., p, are random numbers in the range [-1, 1].
Move: Let

    P^i(j+1, k, l) = P^i(j, k, l) + C(i) Δ(i) / sqrt(Δ^T(i) Δ(i))

This results in an adaptable step size in the direction of the tumble for bacterium i. The cost function (mean squared error) J(i, j+1, k, l) is then computed.
Swim: (i) Let c = 0 (counter for the swim length).
(ii) While c < N_s (i.e. the bacterium has not been swimming too long):
Let c = c + 1.
If J(j) < J(j-1) then

    P^i(j+1, k, l) = P^i(j, k, l) + C(i) Δ(i) / sqrt(Δ^T(i) Δ(i))        (3)

and the new J(i, j+1, k, l) is computed.
ELSE let c = N_s. This is the end of the WHILE statement.
(c) Go to the next bacterium (i + 1) if i ≠ S_b and process it in the same way.
(d) If min(J) {the minimum value of J among all the bacteria} is less than the tolerance limit, then break all the loops.
Step 3: If j < N_c, go to (iii), i.e. continue the chemotaxis loop.
Step 4: Reproduction:
(a) For the given k and l, and for each i = 1, 2, ..., S_b, let J be the health of the i-th bacterium. Sort the bacteria in ascending order of cost J (higher cost means lower health).
(b) The S_r = S_b/2 bacteria with the highest J values die and the other S_r bacteria with the best values split; the copies that are made are placed at the same location as their parent.
Step 5: If k < N_re, go to (ii), i.e. continue the reproduction loop.
Step 6: Elimination and dispersal:
For i = 1, 2, ..., S_b, with probability P_ed, eliminate and disperse each bacterium (this keeps the number of bacteria in the population constant). To achieve this, a bacterium is eliminated by simply dispersing it to a random location in the optimization domain.
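For concreteness, the whole training procedure can be condensed into the following sketch. It follows the steps above with the MSE cost of eq. (2) evaluated directly inside the loop; the bounds, seed and function name are illustrative assumptions, and the tolerance-based early exit of Step 2(d) is omitted for brevity.

    import numpy as np

    def train_bfo(X, d, S_b=16, p=10, N_c=5, N_s=3, N_re=140, N_ed=2,
                  P_ed=0.25, C=0.0075, seed=0):
        rng = np.random.default_rng(seed)
        P = rng.uniform(0.0, 1.0, size=(S_b, p))           # bacteria = candidate weight vectors
        cost = lambda w: np.mean((d - X @ w) ** 2)         # eq. (2): MSE over the N_is samples

        for _ in range(N_ed):                              # elimination-dispersal events
            for _ in range(N_re):                          # reproduction steps
                health = np.zeros(S_b)
                for _ in range(N_c):                       # chemotactic steps
                    for i in range(S_b):
                        J = cost(P[i])
                        delta = rng.uniform(-1.0, 1.0, size=p)
                        phi = delta / np.sqrt(delta @ delta)
                        P[i] = P[i] + C * phi              # tumble and move
                        J_new = cost(P[i])
                        m = 0
                        while m < N_s and J_new < J:       # swim while the cost keeps improving
                            J = J_new
                            P[i] = P[i] + C * phi
                            J_new = cost(P[i])
                            m += 1
                        health[i] += J_new                 # accumulate cost (lower = healthier)
                order = np.argsort(health)                 # reproduction: best half splits
                best = P[order[: S_b // 2]]
                P = np.concatenate([best, best.copy()], axis=0)
            for i in range(S_b):                           # elimination and dispersal
                if rng.random() < P_ed:
                    P[i] = rng.uniform(0.0, 1.0, size=p)

        costs = np.array([cost(w) for w in P])
        return P[np.argmin(costs)]                         # best weight vector found

For the stock index application, X would hold the ten technical indicator values of the N_is training days and d the corresponding desired close prices, so that the returned vector is the set of combiner weights used during testing.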


IV. SIMULATION STUDY

A. Experimental Data

The data for the stock market prediction experiments has been collected for the Standard & Poor's 500 (S&P 500), USA and the Dow Jones Industrial Average (DJIA), USA. The experimental data used consists of technical indicators and the daily close price of the indices. The total number of samples for the stock indices is 3228 trading days, from 3rd January 1994 to 23rd October 2006. Each sample consists of the closing price, opening price, lowest price, highest price and the total volume of stocks traded for the day. Ten technical indicators are selected as feature subsets on the basis of a review by domain experts and prior research. These are computed from the raw data as indicated in Table 1.
The data is divided into two sets: a training set and a testing set. The training set consists of 2510 samples and the rest is set aside for testing. All the inputs are normalized to values between -1 and 1. The normalization is carried out by expressing the data in terms of the maximum and minimum values of the dataset.
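One common way of carrying out such min-max scaling is sketched below; the paper does not give the exact expression, so the formula and names are assumptions. Here x is a NumPy array holding one raw input series.

    import numpy as np

    def normalise(x):
        # Scale a raw series to the range [-1, 1] using the minimum and maximum
        # values of the data set.
        x = np.asarray(x, dtype=float)
        return 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0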










TABLE 1
SELECTED TECHNICAL INDICATORS AND THEIR FORMULAE

Exponential Moving Average (EMA) - three inputs: EMA10, EMA20, EMA30
    EMA = (P x A) + (previous EMA x (1 - A)), with A = 2/(N + 1)
    P = current price, A = smoothing factor, N = time period
Accumulation/Distribution Oscillator (ADO)
    ADO = [(C.P - L.P) - (H.P - C.P)] / (H.P - L.P) x period's volume
    C.P = closing price, H.P = highest price, L.P = lowest price
Stochastic Indicator (STI)
    %K = (today's close - lowest low in K periods) / (highest high in K periods - lowest low in K periods) x 100
    %D = SMA of %K for the period
Relative Strength Index (RSI) - two inputs: RSI9, RSI14
    RSI = 100 - 100 / (1 + U/D)
Price Rate Of Change (PROC) - input: PROC27
    PROC = (today's close - close X periods ago) / (close X periods ago) x 100
Closing Price Acceleration (CPACC)
    CPACC = (close price - close price N periods ago) / (close price N periods ago) x 100
High Price Acceleration (HPACC)
    HPACC = (high price - high price N periods ago) / (high price N periods ago) x 100
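As an example of how the entries of Table 1 translate into code, a sketch of the EMA and RSI computations is given below (plain NumPy, illustrative function names; the handling of the first samples of each series is an assumption, since the paper does not specify it).

    import numpy as np

    def ema(prices, N):
        # Exponential moving average with smoothing factor A = 2 / (N + 1), Table 1.
        prices = np.asarray(prices, dtype=float)
        A = 2.0 / (N + 1)
        out = np.empty_like(prices)
        out[0] = prices[0]                     # seed the recursion with the first price
        for t in range(1, len(prices)):
            out[t] = prices[t] * A + out[t - 1] * (1.0 - A)
        return out

    def rsi(prices, N):
        # Relative strength index: RSI = 100 - 100 / (1 + U/D), with U and D the average
        # upward and downward close-to-close changes over the last N periods.
        prices = np.asarray(prices, dtype=float)
        changes = np.diff(prices)
        out = np.full(len(prices), np.nan)     # undefined until N changes are available
        for t in range(N, len(prices)):
            window = changes[t - N:t]
            U = window[window > 0].sum() / N
            D = -window[window < 0].sum() / N
            out[t] = 100.0 if D == 0 else 100.0 - 100.0 / (1.0 + U / D)
        return out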

B. Training and testing of the forecasting model

Training of the forecasting models is carried out using the BFO algorithm given in Section III and the optimized weight values are obtained. Using these weights, the same forecasting models are then used for testing. The experiments are carried out to test the performance of the model in predicting the close price of the index one, three, five, seven and fifteen days in advance.
The Mean Absolute Percentage Error (MAPE) is used to gauge the performance of the trained prediction model on the test data. The MAPE is defined as

    MAPE = (1/N) Σ_{j=1}^{N} |(y_j - ŷ_j) / y_j| x 100        (4)

where y_j and ŷ_j are the actual and predicted values of the j-th test sample and N is the number of test samples.
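A direct sketch of eq. (4), where y and y_hat are the arrays of actual and predicted close prices over the N test samples.

    import numpy as np

    def mape(y, y_hat):
        # Mean absolute percentage error over the N test samples, eq. (4).
        y = np.asarray(y, dtype=float)
        y_hat = np.asarray(y_hat, dtype=float)
        return 100.0 * np.mean(np.abs(y - y_hat) / np.abs(y))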


Fig. 2(a). Learning characteristics (MSE versus number of generations) of S&P 500 for one day ahead prediction using BFO

Fig. 2(b). Learning characteristics (mean square error versus number of generations) of DJIA for one day ahead prediction using BFO


V. RESULTS AND DISCUSSION

The simulated BFO model is used to predict the S&P 500 and DJIA stock index closing prices one, three, five, seven and fifteen days in advance. Figs. 2(a) and (b) show the learning characteristics of the model obtained through simulation for one day ahead prediction for the S&P 500 and DJIA respectively. They indicate that the mean square error (MSE) falls substantially during training and then settles at a minimum value, indicating the convergence of the weights.
The results provided here are the best evaluations obtained using the testing set. Various experiments are carried out, varying the selection of technical indicators used as input to the network. This is done in an attempt to identify and eliminate less important statistical parameters and rogue parameters which adversely affect the prediction performance of the network. The ten technical indicators used for this simulation are EMA10, EMA20, EMA30, ADO, STI, RSI9, RSI14, PROC27, CPACC and HPACC. The parameter values used in the simulation study for the BFO are: S_b = 16, N_is = 2510, p = 10, N_s = 3, N_c = 5, N_re = 140, N_ed = 2, P_ed = 0.25 and C(i) = 0.0075.


To compare the performance of the proposed model, the best possible MLP model with a {9-3-1} neuron structure is also simulated using the same technical indicators. The value of the convergence factor is chosen as 0.01. The training is carried out for 10000 iterations.
Fig. 3 Comparison of actual and one day ahead predicted S&P 500 values during testing using BFO

Fig. 4 Comparison of actual and one day ahead predicted S&P 500 values during testing using MLP


Fig. 5 Comparison of actual and fifteen days ahead predicted S&P 500 values during testing using BFO

Fig. 6 Comparison of actual and fifteen days ahead predicted S&P 500 values during testing using MLP

Fig. 7 Comparison of actual and one day ahead predicted DJIA values during testing using BFO


Fig. 8 Comparison of actual and one day ahead predicted DJIA values during testing using MLP




Fig. 9 Comparison of actual and fifteen days ahead predicted DJIA values during testing using BFO

Fig. 10 Comparison of actual and fifteen days ahead predicted DJIA values during testing using MLP

Figs. 3 and 5 display the actual versus predicted values for the S&P 500 index for one day and fifteen days ahead prediction respectively using BFO. The corresponding actual versus predicted values for the S&P 500 index using MLP are presented in Figs. 4 and 6. Table 2 shows the MAPE and computation time of the BFO based model in comparison with the MLP based model for one, three, five, seven and fifteen days ahead prediction for the S&P 500. The respective values for the DJIA index are shown in Table 3.
The plots in Figs. 3-10 indicate that during testing the prediction performance of the BFO based model is superior to that of its MLP counterpart. This is true for both short term and long term prediction. Tables 2 and 3 show that the MAPE of the BFO model is lower than that of the MLP model for both stock indices and for all prediction horizons. Similarly, the BFO model takes less time to train than its MLP counterpart.






TABLE 2
COMPARISON OF MAPE AND COMPUTATION TIME BETWEEN BFO AND MLP BASED FORECASTING MODELS FOR S&P 500

Days ahead      MAPE                  Computation time (minutes)
                BFO       MLP         BFO         MLP
  1             0.8108    1.0049      19.2674     26.4081
  3             1.0105    2.0922      19.3594     26.6443
  5             1.2399    2.5097      19.5667     26.7836
  7             1.3912    3.1170      19.7543     26.8925
 15             1.8365    5.9866      19.8211     27.0888

TABLE 3
COMPARISON OF MAPE AND COMPUTATION TIME BETWEEN BFO AND MLP BASED FORECASTING MODELS FOR DJIA

Days ahead      MAPE                  Computation time (minutes)
                BFO       MLP         BFO         MLP
  1             0.6623    1.0648      16.6096     26.9130
  3             0.9534    2.0068      18.7013     27.3456
  5             1.1870    3.7658      19.0289     27.9007
  7             1.3811    5.6478      19.9532     28.5708
 15             1.8893    6.7248      20.6643     29.7406


VI. CONCLUSION

A BFO based adaptive model for short and long term forecasting of stock indices is developed in this paper. The structure of the model is basically an adaptive linear combiner whose weights are updated using the BFO tool. To demonstrate the performance of the proposed model, a simulation study is carried out using known stock indices and its prediction performance is compared with that of a standard MLP based forecasting model. The comparison indicates that the proposed model offers lower computational complexity, better prediction accuracy and less training time than the MLP model.

REFERENCES

[1] T. Z. Tan, C. Quek, G. S. Ng, "Brain inspired genetic complementary learning for stock market prediction," IEEE Congress on Evolutionary Computation, vol. 3, pp. 2653-2660, 2-5 Sep. 2005.
[2] K. J. Oh, K.-j. Kim, "Analyzing stock market tick data using piecewise non linear model," Expert Systems with Applications, vol. 22, issue 3, pp. 249-255, April 2002.
[3] Y. Wang, "Mining stock prices using fuzzy rough set system," Expert Systems with Applications, vol. 24, issue 1, pp. 13-23, Jan. 2003.
[4] Y. Wang, "Predicting stock price using fuzzy grey prediction system," Expert Systems with Applications, vol. 22, pp. 33-39, Jan. 2002.
[5] E. Peters, Chaos and Order in Capital Markets: A New View of Cycles, Prices, Market Volatility, John Wiley & Sons Inc.
[6] S. Taylor, Modelling Financial Time Series, John Wiley & Sons, 1986.
[7] T. Masters, Practical Neural Network Recipes in C++, Academic Press, New York, 1993.
[8] J. Han, M. Kamber, Data Mining: Concepts and Techniques, San Francisco: Morgan Kaufmann Publishers, 2001.
[9] E. W. Saad, D. V. Prokhorov, D. C. Wunsch, "Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks," IEEE Transactions on Neural Networks, vol. 9, issue 6, pp. 1456-1470, Nov. 1998.
[10] T. Yamashita, K. Hirasawa, J. Hu, "Application of multi-branch neural networks to stock market prediction," International Joint Conference on Neural Networks, Montreal, Canada, vol. 4, pp. 2544-2548, 31 July - 4 Aug. 2005.
[11] Y. Chen, X. Dong, Y. Zhao, "Stock index modeling using EDA based local linear wavelet neural network," International Conference on Neural Networks and Brain, vol. 3, pp. 1646-1650, 13-15 Oct. 2005.
[12] Kyoung-jae Kim, "Artificial neural networks with evolutionary instance selection for financial forecasting," Expert Systems with Applications, vol. 30, pp. 519-526, April 2006.
[13] K. M. Passino, "Biomimicry of bacterial foraging for distributed optimization and control," IEEE Control Systems Magazine, pp. 52-66, June 2002.
[14] S. Mishra, "Hybrid least-square adaptive bacterial foraging strategy for harmonic estimation," IEE Proc. Gener. Transm. Distrib., vol. 152, no. 3, pp. 379-389, May 2005.
