Prediction of Shanghai Index based on Additive Legendre Neural Network

In this paper, a novel Legendre neural network model, namely the additive Legendre neural network (ALNN), is proposed. A new hybrid evolutionary method based on the binary particle swarm optimization (BPSO) algorithm and the firefly algorithm is proposed to optimize the structure and parameters of the ALNN model. The Shanghai Stock Exchange Composite Index is used to evaluate the performance of ALNN. Results reveal that ALNN performs better than the LNN model.


Introduction
Artificial neural networks (ANNs) are powerful mathematical methods that can learn complex linear and non-linear continuous functions, and have been successfully applied to many areas in the past decades [1]. Because traditional neural networks suffer from disadvantages such as low efficiency, long training time, and a tendency to fall into local minima, the Legendre neural network (LNN) was proposed. The LNN model has no hidden layer and expands the dimensionality of the input layer with a set of nonlinear functions [4].
The structure of the Legendre neural network is very simple and its learning speed is fast. However, the number of input Legendre polynomials is large, because each input variable is expanded into n-order Legendre polynomials. The structure of LNN is fixed, and the only task is to optimize its parameters. To reduce optimization complexity and improve efficiency, this paper proposes a novel Legendre neural network model, namely the additive Legendre neural network (ALNN). The binary particle swarm optimization (BPSO) algorithm is used to select proper input Legendre polynomials in order to construct a proper structure, and the firefly algorithm is used to optimize the parameters of ALNN. The Shanghai Stock Exchange Composite Index is used to evaluate the performance of ALNN.

Structure of ALNN
Legendre neural network (LNN) was first proposed by Yang and Tseng for function approximation in 1996 [5]. LNN has fewer parameters and no hidden layer; it uses Legendre orthogonal polynomials as the nonlinear functions that expand the input. Due to the absence of a hidden layer, LNN provides a computational advantage over the MLP.
The Legendre polynomials of each input variable are defined by the three-term recurrence

L_0(x) = 1,  L_1(x) = x,  L_{k+1}(x) = [(2k+1) x L_k(x) - k L_{k-1}(x)] / (k+1).

The output of f_i is defined as

f_i(x_i) = tanh( sum_{k=0}^{n} b_ik w_ik L_k(x_i) ),

and the final output y is defined as

y = sum_{i=1}^{m} f_i(x_i).

A new velocity v_i(t+1) for particle i is updated in the same way as in standard PSO:

v_i(t+1) = w v_i(t) + c_1 r_1 (Pbest_i - x_i(t)) + c_2 r_2 (Gbest - x_i(t)),
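The Legendre recurrence above can be sketched in Python (a minimal illustration; the function name is my own, not from the paper):

```python
def legendre_polys(x, n):
    """Evaluate Legendre polynomials L_0..L_n at x using the three-term
    recurrence (k+1) L_{k+1}(x) = (2k+1) x L_k(x) - k L_{k-1}(x)."""
    polys = [1.0, x]  # L_0(x) = 1, L_1(x) = x
    for k in range(1, n):
        polys.append(((2 * k + 1) * x * polys[k] - k * polys[k - 1]) / (k + 1))
    return polys[: n + 1]

# Example: L_2(x) = (3x^2 - 1) / 2, so L_2(0.5) = -0.125
vals = legendre_polys(0.5, 2)
```

These n+1 values per input variable are exactly the candidate inputs that the BPSO stage later selects among.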
where w is the inertia weight, c_1 and c_2 are positive constants, and r_1 and r_2 are uniformly distributed random numbers.
Pbest_i is the best position found so far by particle i and Gbest is the best position found among all particles.
where r is created randomly from the range [0,1], and the function Sig is the sigmoid function

Sig(v) = 1 / (1 + e^(-v)).

The bit x_i(t+1) is set to 1 if r < Sig(v_i(t+1)), and to 0 otherwise.
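One BPSO update step under these definitions can be sketched as follows (a minimal illustration; the parameter values w, c1, c2 are hypothetical defaults, not the paper's settings):

```python
import math
import random

def bpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One binary PSO update: velocities follow the standard PSO rule,
    then each bit is resampled to 1 with probability Sig(v)."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    new_x, new_v = [], []
    for xi, vi, pb, gb in zip(x, v, pbest, gbest):
        r1, r2 = random.random(), random.random()
        vi = w * vi + c1 * r1 * (pb - xi) + c2 * r2 * (gb - xi)
        new_v.append(vi)
        new_x.append(1 if random.random() < sig(vi) else 0)  # stochastic bit update
    return new_x, new_v

x, v = bpso_step([0, 1, 1, 0], [0.0] * 4, [1, 1, 0, 0], [1, 0, 1, 0])
```

Because the position is a bit string rather than a real vector, the velocity only controls the probability of each bit, which is what makes BPSO suitable for the structure-selection task here.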

Parameter optimization of ALNN
According to the optimal structure of ALNN, tally the number (p) of 1s in the optimal particle. The firefly algorithm is a stochastic optimization method that simulates the luminescence behavior of fireflies in nature. A firefly searches for partners and moves toward the position of a better firefly according to its brightness. Each firefly represents a potential solution. To solve the optimization problem, a firefly population is first initialized.
tanh() is the hyperbolic tangent function and b_ik is a Boolean value (0 or 1). When b_ik is equal to 1, the Legendre polynomial L_k(x_i) is selected as input data and the weight w_ik is assigned.
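Putting these pieces together, the ALNN forward pass might be sketched as follows. This is my reading of the text (an additive model with a per-variable tanh over the selected weighted Legendre terms); the function names are illustrative:

```python
import math

def legendre_polys(x, n):
    """Legendre polynomials L_0..L_n at x via the three-term recurrence."""
    polys = [1.0, x]
    for k in range(1, n):
        polys.append(((2 * k + 1) * x * polys[k] - k * polys[k - 1]) / (k + 1))
    return polys[: n + 1]

def alnn_output(xs, b, w, n):
    """Additive model: y = sum_i tanh(sum_k b[i][k] * w[i][k] * L_k(x_i)),
    where b[i][k] in {0, 1} selects which Legendre terms are active."""
    y = 0.0
    for i, xi in enumerate(xs):
        L = legendre_polys(xi, n)
        s = sum(b[i][k] * w[i][k] * L[k] for k in range(n + 1))
        y += math.tanh(s)
    return y

# Two inputs, order-2 expansion; only terms with b=1 contribute.
y = alnn_output([0.2, -0.5],
                [[1, 0, 1], [0, 1, 0]],
                [[0.3, 0.1, 0.2], [0.4, 0.5, 0.6]], 2)
```

Terms with b_ik = 0 contribute nothing, so only the p selected weights ever need to be trained.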

Structure optimization of ALNN

The additive Legendre neural network does not allow all Legendre polynomials as input data; it uses an evolutionary method to select proper Legendre polynomials. In this paper, binary particle swarm optimization (BPSO) is used, in which the moving trajectory and velocity of each particle are defined in terms of probability. The moving trajectory represents changes in the probability of a certain value, and the moving velocity is defined as the probability of switching from one state to another. Thus each bit x_i(t) of a particle is restricted to 0 or 1. Suppose that ALNN has m input variables and n-order Legendre polynomials; then the length of each particle is m*(n+1). Each v_i(t) represents the probability of bit x_i(t) taking the value 1.
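The particle encoding can be illustrated as follows: a flat bit string of length m*(n+1) is reshaped into the m x (n+1) selection matrix, and counting its 1s gives the number p of weights that remain to be optimized (a minimal sketch; names are my own):

```python
def decode_particle(particle, m, n):
    """Reshape a flat bit string of length m*(n+1) into the m x (n+1)
    selection matrix b used by ALNN; b[i][k] = 1 selects L_k(x_i)."""
    width = n + 1
    assert len(particle) == m * width
    return [particle[i * width:(i + 1) * width] for i in range(m)]

# m = 2 input variables, order n = 2 -> particle length 2 * 3 = 6
b = decode_particle([1, 0, 1, 0, 1, 0], m=2, n=2)
p = sum(sum(row) for row in b)  # number of selected terms (weights to optimize)
```

Only these p weights are passed on to the firefly-based parameter optimization stage.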

According to the optimal structure, the p weights corresponding to the selected Legendre polynomials need to be optimized. Firefly algorithm (FA) is an efficient optimization algorithm which was proposed by Xin-She Yang in 2009 [7]. It is very simple, has few parameters, and is easy to apply and implement, so this paper uses the firefly algorithm to optimize the parameters of the Legendre neural network.
The population size is the number of fireflies. As attractiveness is directly proportional to the brightness of a firefly, a less bright firefly is always attracted by and moves toward a brighter firefly. The brightness of firefly i is determined by its fitness value.
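A single firefly move toward a brighter partner can be sketched with the standard FA rule, attractiveness beta = beta0 * exp(-gamma * r^2) plus a small random step (the paper does not give its parameter values, so beta0, gamma, and alpha here are illustrative):

```python
import math
import random

def move_firefly(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
    """Move firefly i toward a brighter firefly j:
    x_i <- x_i + beta0 * exp(-gamma * r^2) * (x_j - x_i) + alpha * (rand - 0.5),
    where r^2 is the squared distance between the two fireflies."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]

# Each position vector holds the p selected ALNN weights being optimized.
new_pos = move_firefly([0.1, 0.4], [0.3, 0.2])
```

The exponential decay of attractiveness with distance is what lets the swarm explore locally while still converging on the brightest (best-fitness) solutions.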