Computer Engineering and Intelligent Systems                                                    www.iiste.org
ISSN 2222-1719 (Paper) ISSN 2222-2863 (Online)
Vol 3, No.3, 2012


         A Novel Neural Network Classifier for Brain Computer
                                               Interface
           Aparna Chaparala1* Dr. J.V.R.Murthy2 Dr. B.Raveendra Babu3 M.V.P.Chandra Sekhara Rao1
    1.    R.V.R.&J.C. College of Engineering, Guntur - 522019, AP, India
    2.    Dept. of CSE, JNTU College of Engineering, Kakinada, AP, India
    3.    DELTA Technology & Management Services Pvt. Ltd., Hyderabad, AP, India
    * E-mail of the corresponding author: chaparala_aparna@yahoo.com


Abstract
Brain computer interfaces (BCI) provide a non-muscular channel for controlling a device through
electroencephalographic signals. The BCI system records the electroencephalogram (EEG) and detects
specific patterns that initiate control commands for the device. The efficiency of the BCI depends upon the
methods used to process the brain signals and to classify the various patterns of brain signal accurately.
Due to the presence of artifacts in the raw EEG signal, the signals must be preprocessed for efficient
feature extraction. This paper proposes a BCI system which extracts EEG features using the Discrete
Cosine Transform. Two stages of filtering, the first stage being a Butterworth filter and the second a
15-point Spencer moving-average filter, are used to remove random noise while maintaining a sharp step
response. The signals are classified using the proposed Semi Partial Recurrent Neural Network. The
proposed method shows very good classification accuracy compared to conventional neural network
classifiers.
Keywords: Brain Computer Interface (BCI), Electroencephalography (EEG), Discrete Cosine
Transform (DCT), Butterworth filters, Spencer filters, Semi Partial Recurrent Neural Network, Laguerre
polynomial


1. Introduction
A Brain Computer Interface (BCI) system records brain signals through electroencephalography (EEG),
preprocesses the raw signals to remove artifacts and noise, and employs signal processing algorithms to
translate patterns into meaningful control commands. The purpose of a BCI is to allow individuals with
severe motor disabilities to control devices such as computers, speech synthesizers, assistive appliances
and neural prostheses through brain signals. Signal processing plays an important role in BCI system
design, as meaningful patterns must be extracted from the brain signal.
Figure 1 depicts a generic BCI system (Mason S G et al. 2003). The device is controlled through a series of
functional components. Electrodes record signals from the user's scalp; these are converted into electrical
signals and amplified. The artifact processor removes artifacts from the amplified signals. The feature
generator transforms the signals into feature values that are the base for the control of device. The feature
generator is generally made up of three steps, signal enhancement, feature extraction and dimensionality
reduction. Signal enhancement refers to the preprocessing of the signals to increase the signal-to-noise ratio
of the signal. Most commonly used preprocessing methods are Surface Laplacian (Mc Farland D et al.
1998 ; Dornhege G et al. 2004), Independent Component Analysis (ICA) (Serby H et al. 2005), and
Principal Component Analysis (Guan J et al. 2005). Feature extraction generates the feature vectors and
dimensionality reduction, reduces the number of feature. Thus features useful for classification is identified
and chosen while artifacts and noise are eliminated in feature generator step. Genetic algorithm (Peterson D
A et al. 2005), PCA (Bashashati A et al. 2005), and distinctive sensitive learning vector quantization
(DSLVQ) (Pfurtscheller et al. 2001) are some of the feature selectors used. The feature translator translates the
features into control signals. Various classification algorithms based on linear or nonlinear classification
methods are available in literature for classifying the features. Bayesian (Curran E et al. 2004), Gaussian
(Millan J R 2004), k-nearest neighbor (Blankertz B et al. 2002), SVM (Peterson D A et al. 2005) and MLP
(Hung C I et al. 2005) are some of the classifiers used. The BCI transducer translates the brain signals into
logical control signals. The logical control signals from the feature translator are converted into semantic
control signals in the control interface. The device controller converts the semantic control signals into physical
control signals which control the device.




                                   Fig 1: Functional model of a BCI system
In this paper, the proposed BCI system extracts features from the EEG signals using the Discrete Cosine
Transform. The signals are classified using a Semi Partial Recurrent Neural Network with a Laguerre
function in the input layer and a tanh function in the hidden layer, trained with the delta learning rule. The
paper is organized into four sections: section I introduces BCI systems, section II covers the materials and
methods used, section III discusses the results, and section IV concludes.


2. Materials and Methods
The discrete cosine transform (DCT) is closely related to Karhunen-Loeve-Hotelling (KLH) transform, a
transform that produces uncorrelated coefficients (N Ahmed et al. 1983). The DCT converts a time-series
signal into its basic frequency components, decomposing the signal into a set of waveforms. The process
of decomposing a signal into a set of cosine basis functions is called the forward discrete cosine transform
(FDCT), and the process of reconstructing it is called the inverse discrete cosine transform (IDCT). The
FDCT (N Ahmed et al. 1983) of a list of n real numbers s(x), x = 0, ..., n-1, is the list S(u) of length n
given by:

    S(u) = \sqrt{2/n}\, C(u) \sum_{x=0}^{n-1} s(x) \cos\frac{(2x+1)u\pi}{2n}, \quad u = 0, \dots, n-1          (1)

where C(u) = 1/\sqrt{2} for u = 0 and C(u) = 1 for all other values.
The constant factors are chosen so that the basis vectors are orthogonal and normalized. The inverse cosine
transform (IDCT) is given by:



    s(x) = \sqrt{2/n} \sum_{u=0}^{n-1} C(u) S(u) \cos\frac{(2x+1)u\pi}{2n}, \quad x = 0, \dots, n-1          (2)

where C(u) is defined as in Eq. (1).
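As an illustration, the two transforms above can be implemented directly in NumPy with the orthonormal scaling of Eqs. (1)-(2). This is a toy sketch, not the authors' LabVIEW/VC++ code:

```python
import numpy as np

def fdct(s):
    # Forward DCT per Eq. (1): S(u) = sqrt(2/n) C(u) sum_x s(x) cos((2x+1)u*pi / 2n)
    n = len(s)
    x = np.arange(n)
    S = np.empty(n)
    for u in range(n):
        c = 1.0 / np.sqrt(2.0) if u == 0 else 1.0
        S[u] = np.sqrt(2.0 / n) * c * np.sum(s * np.cos((2 * x + 1) * u * np.pi / (2 * n)))
    return S

def idct(S):
    # Inverse DCT per Eq. (2); C(u) stays inside the sum
    n = len(S)
    u = np.arange(n)
    c = np.where(u == 0, 1.0 / np.sqrt(2.0), 1.0)
    s = np.empty(n)
    for x in range(n):
        s[x] = np.sqrt(2.0 / n) * np.sum(c * S * np.cos((2 * x + 1) * u * np.pi / (2 * n)))
    return s

sig = np.array([1.0, 2.0, 3.0, 4.0])
coeffs = fdct(sig)
recon = idct(coeffs)        # the orthonormal pair reconstructs the signal exactly
```

Because the constant factors make the basis orthonormal, applying the IDCT to the FDCT output recovers the original samples.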
A 15-point Spencer filter is used to compute moving averages of the EEG signals and reduce noise
spikes. The data are then filtered using a Butterworth filter to retain the 5-30 Hz frequency range and
remove noise and artifacts outside it. The Butterworth filter is a signal processing filter which gives as flat
a frequency response as possible in the pass-band (Giovanni Bianchi et al. 2007). It is one of the most
commonly used digital filters and is also called a maximally flat magnitude filter. A Butterworth filter
forms no ripples in the pass-band, and its response decays to zero in the stop-band. It has a slower roll-off
and a more linear phase response than comparable filters such as the Chebyshev and elliptic filters.
Butterworth filters are well suited to filtering EEG signals, as the pass-band and stop-band are maximally
flat, which yields a quality output signal in each frequency band.
In a low-pass filter, all low-frequency components of the signal are passed through and the high-frequency
components are stopped; the cutoff frequency divides the pass-band from the stop-band. Artifacts in the
EEG signal are thus easily filtered out using a low-pass filter. A low-pass filter can be modified into a
high-pass filter and, when placed in series with others, can form band-pass and band-stop filters. The gain
G(ω) of an n-order Butterworth low-pass filter (S. Butterworth 1930), in terms of the transfer function
H(s), is given as

    G^2(\omega) = |H(j\omega)|^2 = \frac{G_0^2}{1 + (\omega/\omega_c)^{2n}}          (3)
where n is the order of the filter, ωc is the cutoff frequency and G0 is the DC gain (the gain at zero
frequency). The Butterworth filter is used to preprocess the EEG signal, removing high-frequency noise
and artifacts, with cutoff frequencies of 5 and 30 Hz.
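A minimal sketch of this preprocessing step using SciPy's Butterworth design. The sampling rate (256 Hz) and filter order (4) are assumptions for the example; the paper does not state them:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0                                       # assumed sampling rate (not given in the paper)
b, a = butter(4, [5.0, 30.0], btype="bandpass", fs=fs)

t = np.arange(0, 2, 1 / fs)
# 10 Hz component (inside the 5-30 Hz band) plus 60 Hz interference (outside it)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
filtered = filtfilt(b, a, raw)                   # zero-phase filtering preserves EEG timing
```

After filtering, the 10 Hz component passes nearly unchanged while the 60 Hz interference is strongly attenuated, which is exactly the artifact-suppression behavior described above.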
The trend of a time series is estimated using a linear filtering operation as follows:

    \gamma_t = \sum_{r=0}^{q} a_r X_{t+r}          (4)


where a_r is a set of weights with \sum_r a_r = 1; such a filter is a moving-average or finite impulse response (FIR) filter.
The 15-point Spencer filter for moving averages is symmetric. Its weights are

    \frac{1}{320}\,(-3,\ -6,\ -5,\ 3,\ 21,\ 46,\ 67,\ 74,\ 67,\ 46,\ 21,\ 3,\ -5,\ -6,\ -3)
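For illustration, these weights can be applied with a simple convolution (a NumPy sketch; a useful property to note is that Spencer's filter reproduces polynomial trends up to cubic order exactly):

```python
import numpy as np

# Spencer's symmetric 15-point moving-average weights; they sum to 320/320 = 1
SPENCER_15 = np.array([-3, -6, -5, 3, 21, 46, 67, 74,
                       67, 46, 21, 3, -5, -6, -3]) / 320.0

def spencer_smooth(x):
    # Linear filtering as in Eq. (4); mode="same" keeps the output length equal to the input
    return np.convolve(x, SPENCER_15, mode="same")

trend = np.arange(100.0)
smoothed = spencer_smooth(trend)
# Away from the zero-padded edges, the linear trend passes through unchanged
```

In practice the filter is run over each EEG channel, damping the random noise spikes while leaving the slow trend of the signal intact.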
The maximum and average energy from each channel are computed and used as attributes. A support vector
machine is used to reduce the feature vector.
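A sketch of computing these two per-channel attributes (the array shapes are hypothetical; the paper does not specify the channel count here):

```python
import numpy as np

def energy_features(eeg):
    """eeg: array of shape (channels, samples); energy is the squared amplitude."""
    sq = eeg ** 2
    # One (max energy, average energy) pair per channel
    return np.column_stack([sq.max(axis=1), sq.mean(axis=1)])

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 500))    # toy recording: 8 channels, 500 samples
feats = energy_features(eeg)           # shape (channels, 2)
```

Each row of `feats` is the attribute pair for one channel; the resulting feature matrix is what the subsequent feature-reduction stage would operate on.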


2.1 Partial Recurrent Neural Network
A neural network in which the input is fed forward through successive layers to the output is called a
feedforward network. A neural network with a feedback loop is known as a Recurrent Neural Network
(RNN). If the feedback is present in only one of the layers, the network is referred to as a Semi Partial
Recurrent Neural Network (SPRNN). Recurrent networks are dynamic in nature, as the feedback loops use
unit-delay elements. A PRNN has feedback in only one of its layers and is easier to use than a full RNN;
time is represented implicitly. A simple PRNN consists of a two-layer network with feedback in the
hidden layer, as shown in figure 2. The output of the hidden layer at time t is fed back as additional input
at time t+1; thus the PRNN works in discrete time steps. The proposed PRNN has a Laguerre function in
the input layer and a tanh function in the hidden layer. The tanh function, being antisymmetric, helps the
network train faster.




                              Fig 2: A simple Partial Recurrent Neural network
The output of the PRNN, when an input vector x is propagated through the weight layer V and the previous
state activation is fed back through the recurrent weight layer U, is

    y_j(t) = f(net_j(t))                                                              (5)

    net_j(t) = \sum_{i=1}^{n} x_i(t)\, v_{ji} + \sum_{h=1}^{m} y_h(t-1)\, u_{jh} + \theta_j          (6)

where n is the number of inputs, θ_j is the bias, f is the activation function, m is the number of state nodes,
and the indices i, j/h and k denote the input, hidden and output nodes respectively.
The output of the network with output weights W is,

    net_k(t) = \sum_{j=1}^{m} y_j(t)\, w_{kj} + \theta_k                              (7)


Learning in the PRNN proceeds at each time step: the input vector is fed into the network and generates an
error; the error is backpropagated to find the error gradient for each weight and bias. The weights are then
updated by the learning function using the error gradients.
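Eqs. (5)-(7) can be sketched as a single discrete time step in NumPy. The layer sizes follow the experimental setup described below; the random weight initialization is an assumption made for the example:

```python
import numpy as np

def prnn_step(x, y_prev, V, U, W, theta_h, theta_o):
    # Hidden state, Eqs. (5)-(6): tanh of input drive plus recurrent feedback y(t-1)
    net_h = V @ x + U @ y_prev + theta_h
    y = np.tanh(net_h)
    # Output layer, Eq. (7), also with tanh activation
    net_o = W @ y + theta_o
    return y, np.tanh(net_o)

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 25, 4, 2                  # sizes from the experimental setup
V = rng.standard_normal((n_hid, n_in)) * 0.1   # input -> hidden weights
U = rng.standard_normal((n_hid, n_hid)) * 0.1  # recurrent hidden -> hidden weights
W = rng.standard_normal((n_out, n_hid)) * 0.1  # hidden -> output weights
th_h, th_o = np.zeros(n_hid), np.zeros(n_out)

y = np.zeros(n_hid)                            # initial state y(0)
for t in range(3):                             # unroll a few discrete time steps
    x = rng.standard_normal(n_in)
    y, out = prnn_step(x, y, V, U, W, th_h, th_o)
```

Passing the hidden activation `y` back into the next call is exactly the unit-delay feedback loop that distinguishes the PRNN from a plain feedforward network.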
In this paper it is proposed to implement a Laguerre function in the input layer to provide details of the
input's past memory recursively. The Laguerre polynomial is given by







    L_k(u) = \frac{e^u}{k!}\, \frac{d^k}{du^k}\!\left(u^k e^{-u}\right)                (8)




where k is the order of the polynomial and u is the value at which it is evaluated. It is proposed to use the
first-order polynomial, i.e. k = 1, for which L_1(u) = 1 - u.
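As a quick sanity sketch, the first-order case of Eq. (8) can be checked against NumPy's Laguerre-series evaluator:

```python
import numpy as np
from numpy.polynomial import laguerre

def laguerre_1(u):
    # First-order Laguerre polynomial from Eq. (8): L1(u) = 1 - u
    return 1.0 - u

u = np.array([0.0, 0.5, 2.0])
mine = laguerre_1(u)
# lagval(u, c) evaluates sum_i c_i * L_i(u); coefficients [0, 1] select L1
ref = laguerre.lagval(u, [0, 1])
```

Both evaluations agree, confirming that the closed form 1 - u follows from the Rodrigues formula in Eq. (8) for k = 1.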
The experimental setup consists of 25 neurons in the input layer, 4 neurons in the hidden layer and two
neurons in the output layer (one neuron per class). The hidden-layer and output-layer activation functions
are both tanh.


3. Results and Discussion
The dataset used for this work was provided by the University of Tübingen, Germany (Dept. of Computer
Engineering and Institute of Medical Psychology and Behavioral Neurobiology), the Max Planck Institute
for Biological Cybernetics, Tübingen, Germany, and the University of Bonn, Germany, Dept. of
Epileptology (Thomas Lal et al. 2004). 168 instances from a single patient were used to test the proposed
algorithm; 80% of the data was used for training and the remainder for testing. The classification accuracy
obtained, along with the classification accuracy of an MLP neural network, is shown in figure 3.




                       Figure 3 : The classification accuracy of the proposed system
From figure 3, the classification accuracy of the proposed system improves by 10%, which is a considerable
improvement over both the regular MLP neural network and the regular partial recurrent neural network.

4. Conclusion
In this paper a novel neural network was proposed, based on the partial recurrent neural network with a
Laguerre polynomial in the input layer. Features from the EEG data in the time domain were extracted
using the discrete cosine transform. The frequencies of interest were extracted using a Butterworth
band-pass filter. The maximum and average energy for each channel were calculated. The proposed method was
implemented using LabVIEW and VC++. The results obtained with the proposed classification method are
better than those of currently available classification algorithms. Further investigation needs to be carried
out with other EEG data.


References
Mason S G and Birch G E (2003), “A general framework for brain computer interface design”, IEEE
Transactions on Neural Systems and Rehabilitation Engineering, vol.11, 70–85.
McFarland D and Wolpaw J R (1998), "EEG-based communication and control: short-term role of
feedback", IEEE Transactions on Rehabilitation Engineering, vol. 6, 7–11.
Dornhege G, Blankertz B, Curio G and Muller K R (2004), “Boosting bit rates in noninvasive EEG
single-trial classifications by feature combination and multiclass paradigms”, IEEE Transactions on
Biomedical Engineering, vol. 51, 993–1002.
Serby H, Yom-Tov E and Inbar G F (2005), “An improved P300-based brain–computer interface”, IEEE
Transactions on Neural Systems and Rehabilitation Engineering, vol. 13, 89–98
Guan J, Chen Y, Lin J, Yun Y and Huang M (2005), “N2 components as features for brain computer
interface”, Proceedings of 1st International Conference on Neural Interface and Control (Wuhan, China),
45–9.
Peterson D A, Knight J N, Kirby M J, Anderson C W and Thaut M H (2005), "Feature selection and blind
source separation in an EEG based brain computer interface", EURASIP Journal on Applied Signal
Processing, vol. 19, 3128–40.
Bashashati A, Ward R K and Birch G E (2005), “A new design of the asynchronous brain computer
interface using the knowledge of the path of features”, Proceedings of 2nd IEEE-EMBS Conference on
Neural Engineering (Arlington, VA), 101–4
Pfurtscheller G and Neuper C (2001), “Motor imagery and direct brain–computer communication”,
Proceedings of IEEE vol.89, 1123–34
Curran E, Sykacek P, Stokes M, Roberts S J, Penny W, Johnsrude I and Owen A M (2004), “Cognitive tasks
for driving a brain–computer interfacing system: a pilot study", IEEE Transactions on Neural Systems and
Rehabilitation Engineering, vol. 12, 48–54.
Millan J R (2004), “On the need for on-line learning in brain–computer interfaces”,          Proceedings of
Annual International Joint Conference on Neural Networks (Budapest, Hungary)
Blankertz B, Curio G and Muller K R (2002), “Classifying single trial EEG: Towards brain–computer
interfacing”, Advances in Neural Information Processing Systems vol 14, 157–64
Hung C I, Lee P L, Wu Y T, Chen L F, Yeh T C and Hsieh J C (2005), “Recognition of motor imagery
electroencephalography using independent component analysis and machine classifiers", Annals of
Biomedical Engineering, vol. 33, 1053–70.
N. Ahmed, T. Natarajan (1983), "Discrete-Time Signals and Systems", Reston Publishing Company.
Giovanni Bianchi and Roberto Sorrentino (2007). “Electronic filter simulation & design”, McGraw-Hill
Professional. 17–20. ISBN 9780071494670.
S. Butterworth (1930), "On the Theory of Filter Amplifiers", Wireless Engineer, vol. 7, 536–541.
Thomas Lal, Thilo Hinterberger, Guido Widman, Michael Schröder, Jeremy Hill, Wolfgang Rosenstiel,
Christian Elger, Bernhard Schölkopf, Niels Birbaumer (2004), "Methods Towards Invasive Human Brain
Computer Interfaces", Advances in Neural Information Processing Systems (NIPS).





Aparna Chaparala is working as an Associate Professor in the computer science and engineering department
of R.V.R. & J.C. College of Engineering, Chowdavaram, Guntur. She has 9 years of teaching experience.
She completed her M.Tech in Computer Science & Engineering. Her research is in the data mining area, and
she is presently pursuing a Ph.D from J.N.T.U, Hyderabad. She has published 5 papers in international
journals.


Dr J.V.R. Murthy is presently working as a professor in the department of CSE at J.N.T.U., Kakinada. He
did M.Tech in CSE at IIT. He has over 20 years of teaching experience and 3 years of industrial
experience. A Memento of Appreciation was awarded for “good performance and on schedule completion
of People Soft HRMS project” by Key Span Energy Corporation, New York. He has more than 15
publications in national and international journals. His interested areas of research include Data
Warehousing, data mining and VLDB.


Dr B. Raveendra Babu has obtained Masters degree in Computer Science and Engineering from Anna
University, Chennai. He received Ph.D. in Applied Mathematics from S.V University, Tirupati. He is
currently leading a Team as Director (Operations), M/s.Delta Technologies (P) Ltd.,Madhapur, Hyderabad.
He has 26 years of teaching experience. He has more than 25 international & national publications to his
credit. His interested areas of research include VLDB, Image Processing, Pattern analysis and Wavelets.


M.V.P. Chandra Sekhara Rao is an Associate Professor in the department of computer science and
engineering in R.V.R. & J.C. College of Engineering, Chowdavaram, Guntur. He has over 15 years of
experience in teaching. He completed his B.E and M.Tech in Computer Science & Engineering. He is
doing research in the area of Data Mining and is presently pursuing a Ph.D from J.N.T.U, Hyderabad. He has
published 5 papers in international journals and presented a paper in an international conference.




A theory of efficiency for managing the marketing executives in nigerian banks
Alexander Decker
 
A systematic evaluation of link budget for
A systematic evaluation of link budget forA systematic evaluation of link budget for
A systematic evaluation of link budget for
Alexander Decker
 
A synthetic review of contraceptive supplies in punjab
A synthetic review of contraceptive supplies in punjabA synthetic review of contraceptive supplies in punjab
A synthetic review of contraceptive supplies in punjab
Alexander Decker
 
A synthesis of taylor’s and fayol’s management approaches for managing market...
A synthesis of taylor’s and fayol’s management approaches for managing market...A synthesis of taylor’s and fayol’s management approaches for managing market...
A synthesis of taylor’s and fayol’s management approaches for managing market...
Alexander Decker
 
A survey paper on sequence pattern mining with incremental
A survey paper on sequence pattern mining with incrementalA survey paper on sequence pattern mining with incremental
A survey paper on sequence pattern mining with incremental
Alexander Decker
 
A survey on live virtual machine migrations and its techniques
A survey on live virtual machine migrations and its techniquesA survey on live virtual machine migrations and its techniques
A survey on live virtual machine migrations and its techniques
Alexander Decker
 
A survey on data mining and analysis in hadoop and mongo db
A survey on data mining and analysis in hadoop and mongo dbA survey on data mining and analysis in hadoop and mongo db
A survey on data mining and analysis in hadoop and mongo db
Alexander Decker
 
A survey on challenges to the media cloud
A survey on challenges to the media cloudA survey on challenges to the media cloud
A survey on challenges to the media cloud
Alexander Decker
 
A survey of provenance leveraged
A survey of provenance leveragedA survey of provenance leveraged
A survey of provenance leveraged
Alexander Decker
 
A survey of private equity investments in kenya
A survey of private equity investments in kenyaA survey of private equity investments in kenya
A survey of private equity investments in kenya
Alexander Decker
 
A study to measures the financial health of
A study to measures the financial health ofA study to measures the financial health of
A study to measures the financial health of
Alexander Decker
 

Plus de Alexander Decker (20)

Abnormalities of hormones and inflammatory cytokines in women affected with p...
Abnormalities of hormones and inflammatory cytokines in women affected with p...Abnormalities of hormones and inflammatory cytokines in women affected with p...
Abnormalities of hormones and inflammatory cytokines in women affected with p...
 
A validation of the adverse childhood experiences scale in
A validation of the adverse childhood experiences scale inA validation of the adverse childhood experiences scale in
A validation of the adverse childhood experiences scale in
 
A usability evaluation framework for b2 c e commerce websites
A usability evaluation framework for b2 c e commerce websitesA usability evaluation framework for b2 c e commerce websites
A usability evaluation framework for b2 c e commerce websites
 
A universal model for managing the marketing executives in nigerian banks
A universal model for managing the marketing executives in nigerian banksA universal model for managing the marketing executives in nigerian banks
A universal model for managing the marketing executives in nigerian banks
 
A unique common fixed point theorems in generalized d
A unique common fixed point theorems in generalized dA unique common fixed point theorems in generalized d
A unique common fixed point theorems in generalized d
 
A trends of salmonella and antibiotic resistance
A trends of salmonella and antibiotic resistanceA trends of salmonella and antibiotic resistance
A trends of salmonella and antibiotic resistance
 
A transformational generative approach towards understanding al-istifham
A transformational  generative approach towards understanding al-istifhamA transformational  generative approach towards understanding al-istifham
A transformational generative approach towards understanding al-istifham
 
A time series analysis of the determinants of savings in namibia
A time series analysis of the determinants of savings in namibiaA time series analysis of the determinants of savings in namibia
A time series analysis of the determinants of savings in namibia
 
A therapy for physical and mental fitness of school children
A therapy for physical and mental fitness of school childrenA therapy for physical and mental fitness of school children
A therapy for physical and mental fitness of school children
 
A theory of efficiency for managing the marketing executives in nigerian banks
A theory of efficiency for managing the marketing executives in nigerian banksA theory of efficiency for managing the marketing executives in nigerian banks
A theory of efficiency for managing the marketing executives in nigerian banks
 
A systematic evaluation of link budget for
A systematic evaluation of link budget forA systematic evaluation of link budget for
A systematic evaluation of link budget for
 
A synthetic review of contraceptive supplies in punjab
A synthetic review of contraceptive supplies in punjabA synthetic review of contraceptive supplies in punjab
A synthetic review of contraceptive supplies in punjab
 
A synthesis of taylor’s and fayol’s management approaches for managing market...
A synthesis of taylor’s and fayol’s management approaches for managing market...A synthesis of taylor’s and fayol’s management approaches for managing market...
A synthesis of taylor’s and fayol’s management approaches for managing market...
 
A survey paper on sequence pattern mining with incremental
A survey paper on sequence pattern mining with incrementalA survey paper on sequence pattern mining with incremental
A survey paper on sequence pattern mining with incremental
 
A survey on live virtual machine migrations and its techniques
A survey on live virtual machine migrations and its techniquesA survey on live virtual machine migrations and its techniques
A survey on live virtual machine migrations and its techniques
 
A survey on data mining and analysis in hadoop and mongo db
A survey on data mining and analysis in hadoop and mongo dbA survey on data mining and analysis in hadoop and mongo db
A survey on data mining and analysis in hadoop and mongo db
 
A survey on challenges to the media cloud
A survey on challenges to the media cloudA survey on challenges to the media cloud
A survey on challenges to the media cloud
 
A survey of provenance leveraged
A survey of provenance leveragedA survey of provenance leveraged
A survey of provenance leveraged
 
A survey of private equity investments in kenya
A survey of private equity investments in kenyaA survey of private equity investments in kenya
A survey of private equity investments in kenya
 
A study to measures the financial health of
A study to measures the financial health ofA study to measures the financial health of
A study to measures the financial health of
 

Dernier

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
WSO2
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
?#DUbAI#??##{{(☎️+971_581248768%)**%*]'#abortion pills for sale in dubai@
 

Dernier (20)

Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challenges
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectors
 
A Beginners Guide to Building a RAG App Using Open Source Milvus
A Beginners Guide to Building a RAG App Using Open Source MilvusA Beginners Guide to Building a RAG App Using Open Source Milvus
A Beginners Guide to Building a RAG App Using Open Source Milvus
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
AXA XL - Insurer Innovation Award Americas 2024
AXA XL - Insurer Innovation Award Americas 2024AXA XL - Insurer Innovation Award Americas 2024
AXA XL - Insurer Innovation Award Americas 2024
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
Manulife - Insurer Transformation Award 2024
Manulife - Insurer Transformation Award 2024Manulife - Insurer Transformation Award 2024
Manulife - Insurer Transformation Award 2024
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : Uncertainty
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 

A novel neural network classifier for brain computer

transforms (DCT), Butterworth filters, Spencer filters, Semi-Partial Recurrent Neural Network, Laguerre polynomial
1. Introduction

A Brain Computer Interface (BCI) system records brain signals through electroencephalography (EEG), preprocesses the raw signals to remove artifacts and noise, and employs signal processing algorithms to translate signal patterns into meaningful control commands. The purpose of a BCI is to let individuals with severe motor disabilities control devices such as computers, speech synthesizers, assistive appliances and neural prostheses through brain signals alone. Signal processing plays an important role in BCI system design, since meaningful patterns must be extracted from the brain signal.

Figure 1 depicts a generic BCI system (Mason S G et al. 2003). The device is controlled through a series of functional components. Electrodes record signals from the user's scalp and convert them into electrical signals, which are then amplified. The artifact processor removes artifacts from the amplified signals. The feature generator transforms the signals into feature values that form the basis for device control. It is generally made up of three steps: signal enhancement, feature extraction and dimensionality reduction. Signal enhancement refers to preprocessing the signals to increase their signal-to-noise ratio. The most commonly used preprocessing methods are the Surface Laplacian (McFarland D et al. 1998; Dornhege G et al. 2004), Independent Component Analysis (ICA) (Serby H et al. 2005) and Principal Component Analysis (Guan J et al. 2005). Feature extraction generates the feature vectors, and dimensionality reduction reduces the number of features. Thus features useful for classification are identified and chosen, while artifacts and noise are eliminated, in the feature generation step. Genetic algorithms (Peterson D A et al. 2005), PCA (Bashashati A et al. 2005) and Distinctive Sensitive Learning Vector Quantization (DSLVQ)
(Pfurtscheller et al. 2001) are some of the feature selectors used. The feature translator translates the features into control signals. Various classification algorithms based on linear or nonlinear methods are available in the literature, including Bayesian (Curran E et al. 2004), Gaussian (Millan J R 2004), k-nearest neighbour (Blankertz B et al. 2002), SVM (Peterson D A et al. 2005) and MLP (Hung C I et al. 2005) classifiers. The BCI transducer translates the brain signals into logical control signals. The logical control signals from the feature translator are converted into semantic control signals in the control interface. The device controller converts the semantic control signals into physical control signals which control the device.

Fig 1: Functional model of a BCI system

In this paper, the proposed BCI system extracts features from the EEG signals using Discrete Cosine transforms. The classification of the signals is done using a Semi-Partial Recurrent Neural Network with a Laguerre function in the input layer, a tanh function in the hidden layer, and the delta learning rule. The paper is organized into four sections: section I introduces BCI systems, section II covers the materials and methods used, section III discusses the results, and section IV concludes.

2. Materials and Methods

The discrete cosine transform (DCT) is closely related to the Karhunen-Loeve-Hotelling (KLH) transform, a transform that produces uncorrelated coefficients (N Ahmed et al. 1983). The DCT converts a time series signal into its basic frequency components, decomposing the signal into a set of cosine waveforms. The process of decomposing a signal into a set of cosine basis functions is called the forward discrete cosine transform (FDCT), and the process of reconstructing it is called the inverse discrete cosine transform (IDCT).
The DCT is used to preprocess the provided EEG data for the BCI system as follows. The FDCT (N Ahmed et al. 1983) of a list of n real numbers s(x), x = 0, ..., n-1, is the list S(u) of length n given by:

    S(u) = \sqrt{2/n}\, C(u) \sum_{x=0}^{n-1} s(x) \cos\big( (2x+1)u\pi / (2n) \big),  u = 0, ..., n-1    (1)

where C(u) equals 1/\sqrt{2} for u = 0 and equals 1 for all other values. The constant factors are chosen so that the basis vectors are orthogonal and normalized. The inverse cosine transform (IDCT) is given by:
    s(x) = \sqrt{2/n} \sum_{u=0}^{n-1} C(u) S(u) \cos\big( (2x+1)u\pi / (2n) \big),  x = 0, ..., n-1    (2)

where C(u) equals 1/\sqrt{2} for u = 0 and equals 1 for all other values.

A 15-point Spencer filter is used to compute moving averages of the EEG signals and reduce noise spikes. The data in the frequency domain is filtered with a Butterworth filter to remove noise and artifacts outside the frequency range of 5-30 Hz. The Butterworth filter is a signal processing filter that gives as flat a frequency response as possible in the pass-band (Giovanni Bianchi et al. 2007). It is one of the most commonly used digital filters and is also called the maximally flat magnitude filter. In a Butterworth filter no ripples form in the pass-band, and the response falls to zero in the stop-band. It has a slower roll-off and a more linear phase response than filters such as the Chebyshev and elliptic filters. Butterworth filters are advantageous for filtering EEG signals because the pass-band and stop-band are maximally flat, which yields a quality output signal in each frequency band.

In a low-pass filter, all low-frequency components of the signal are passed through while the high-frequency components are stopped; the cutoff frequency divides the pass-band from the stop-band. Artifacts in the EEG signal are thus easily filtered out using a low-pass filter. The low-pass design can be modified into a high-pass filter, and placed in series with others to form band-pass and band-stop filters. The gain G(\omega) of an n-th order Butterworth low-pass filter (S. Butterworth 1930) in terms of its transfer function H(s) is:

    G^2(\omega) = |H(j\omega)|^2 = G_0^2 / \big( 1 + (\omega/\omega_c)^{2n} \big)    (3)

where n is the order of the filter, \omega_c is the cutoff frequency and G_0 is the DC gain, i.e. the gain at zero frequency.
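As an illustration, the preprocessing steps described above (the FDCT/IDCT pair of Eqs. (1)-(2), the 15-point Spencer moving average, and the Butterworth magnitude response of Eq. (3)) can be sketched in pure Python. This is a minimal sketch for clarity, not the LabVIEW/VC++ implementation used in the paper; the function names are illustrative.

```python
import math

def fdct(s):
    """Forward DCT, Eq. (1): S(u) = sqrt(2/n) C(u) sum_x s(x) cos((2x+1)u*pi/(2n))."""
    n = len(s)
    C = lambda u: 1 / math.sqrt(2) if u == 0 else 1.0
    return [math.sqrt(2.0 / n) * C(u) *
            sum(s[x] * math.cos((2 * x + 1) * u * math.pi / (2 * n)) for x in range(n))
            for u in range(n)]

def idct(S):
    """Inverse DCT, Eq. (2): reconstructs s(x) from the coefficients S(u)."""
    n = len(S)
    C = lambda u: 1 / math.sqrt(2) if u == 0 else 1.0
    return [math.sqrt(2.0 / n) *
            sum(C(u) * S[u] * math.cos((2 * x + 1) * u * math.pi / (2 * n)) for u in range(n))
            for x in range(n)]

# Standard 15-point Spencer weights: symmetric, and they sum to 320/320 = 1.
SPENCER_15 = [w / 320.0 for w in
              (-3, -6, -5, 3, 21, 46, 67, 74, 67, 46, 21, 3, -5, -6, -3)]

def spencer_smooth(x):
    """Centred 15-point Spencer moving average; returns len(x) - 14 smoothed values."""
    k = len(SPENCER_15)
    return [sum(w * v for w, v in zip(SPENCER_15, x[i:i + k]))
            for i in range(len(x) - k + 1)]

def butterworth_gain_sq(omega, omega_c, order, g0=1.0):
    """Squared gain of an order-n Butterworth low-pass filter, Eq. (3)."""
    return g0 ** 2 / (1.0 + (omega / omega_c) ** (2 * order))
```

A quick check of the expected properties: `idct(fdct(sig))` reproduces `sig` (the basis is orthonormal), the Spencer filter leaves a linear trend unchanged at interior points, and the Butterworth response is monotone with its -3 dB point at omega = omega_c.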
The Butterworth filter is used to preprocess the EEG signal, removing high-frequency noise and artifacts, with cutoff frequencies spanning the 5-30 Hz band. The trend of a time series is estimated using a linear filtering operation:

    \gamma_t = \sum_{r=0}^{q} a_r X_{t+r}    (4)

where the a_r are a set of weights with \sum a_r = 1; this is a moving average, or finite impulse response, filter. The 15-point Spencer filter for moving averages is symmetric. Its weights are:

    (1/320) (-3, -6, -5, 3, 21, 46, 67, 74, 67, 46, 21, 3, -5, -6, -3)

The maximum and average energy from each channel are computed and used as attributes. A support vector machine is used to reduce the feature vector.

2.1 Partial Recurrent Neural Network

A neural network in which the input is fed through successive layers of the network to the output is called a feedforward network. A neural network with feedback loops is known as a Recurrent Neural Network (RNN); if the feedback occurs in only one of the layers, it is referred to as a Semi-Partial Recurrent Neural Network (SPRNN). Recurrent networks are dynamic in nature, as the feedback loops use unit delay elements. A PRNN has feedback in only one of its layers, which makes it easier to use than a full RNN, and time is represented implicitly. A simple PRNN consists of a two-layer network with feedback in the
hidden layer, as shown in figure 2. The output of the hidden layer at time t is fed back as additional input at time t+1; thus the PRNN works in discrete time steps. The proposed PRNN has a Laguerre function in the input layer and a tanh function in the hidden layer; the tanh function, being antisymmetric, helps the network train faster.

Fig 2: A simple Partial Recurrent Neural network

When an input vector x is propagated through a weight layer V, together with the previous state activation through the recurrent weight layer U, the hidden-layer output of the PRNN is

    y_j(t) = f(net_j(t))    (5)

    net_j(t) = \sum_{i=1}^{n} x_i(t) v_{ji} + \sum_{h=1}^{m} y_h(t-1) u_{jh} + \theta_j    (6)

where n is the number of inputs, \theta_j is a bias, f is the output function, m is the number of state nodes, and i, j/h, k denote the input, hidden and output nodes respectively. The output of the network with output weights W is

    net_k(t) = \sum_{j=1}^{m} y_j(t) w_{kj} + \theta_k    (7)

At each time step, learning proceeds as follows: the input vector is fed into the network and generates an error; the error is backpropagated to find the error gradient for each weight and bias; and the weights are updated by the learning function using these gradients. In this paper it is proposed to implement a Laguerre function in the input layer to provide details of the input's past memory recursively. The Laguerre polynomial is given by
    L_k(u) = (e^u / k!) \, \frac{d^k}{du^k} \big( e^{-u} u^k \big)    (8)

where k is the order of the polynomial and u is the value at which the polynomial is evaluated. It is proposed to use the first-order polynomial, i.e. k = 1. The experimental setup consists of 25 neurons in the input layer, 4 neurons in the hidden layer and two neurons in the output layer (one neuron per class). The activation function used in both the hidden layer and the output layer is tanh.

3. Results and Discussion

The dataset used for this work was provided by the University of Tübingen, Germany, Dept. of Computer Engineering and Institute of Medical Psychology and Behavioral Neurobiology; the Max Planck Institute for Biological Cybernetics, Tübingen, Germany; and Universität Bonn, Germany, Dept. of Epileptology (Thomas Lal et al. 2004). 168 instances from a single patient were used to test the proposed algorithm; 80% of the data was used for training and the remainder for testing. The classification accuracy obtained, along with the classification accuracy of an MLP neural network, is shown in figure 3.

Figure 3: The classification accuracy of the proposed system

As figure 3 shows, the classification accuracy of the proposed system improves by 10%, a considerable improvement over both the regular MLP neural network and the regular partial recurrent neural network.

4. Conclusion

In this paper it was proposed to implement a novel neural network based on the partial recurrent neural network with a Laguerre polynomial in the input layer. Features from the EEG data in the time domain were extracted using the discrete cosine transform. The frequency band of interest was extracted using a Butterworth band
pass filter. The maximum and average energy for each channel were calculated. The proposed method was implemented using LabVIEW and VC++. The results obtained with the proposed classification method are better than those of currently available classification algorithms. Further investigation needs to be carried out with other EEG data.

References

Mason S G and Birch G E (2003), "A general framework for brain computer interface design", IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 11, 70-85.

McFarland D and Wolpaw J R (1998), "EEG-based communication and control: short-term role of feedback", IEEE Transactions on Rehabilitation Engineering, vol. 6, 7-11.

Dornhege G, Blankertz B, Curio G and Muller K R (2004), "Boosting bit rates in noninvasive EEG single-trial classifications by feature combination and multiclass paradigms", IEEE Transactions on Biomedical Engineering, vol. 51, 993-1002.

Serby H, Yom-Tov E and Inbar G F (2005), "An improved P300-based brain-computer interface", IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 13, 89-98.

Guan J, Chen Y, Lin J, Yun Y and Huang M (2005), "N2 components as features for brain computer interface", Proceedings of the 1st International Conference on Neural Interface and Control (Wuhan, China), 45-49.

Peterson D A, Knight J N, Kirby M J, Anderson C W and Thaut M H (2005), "Feature selection and blind source separation in an EEG based brain computer interface", EURASIP J. Appl. Signal Process.
vol. 19, 3128-3140.

Bashashati A, Ward R K and Birch G E (2005), "A new design of the asynchronous brain computer interface using the knowledge of the path of features", Proceedings of the 2nd IEEE-EMBS Conference on Neural Engineering (Arlington, VA), 101-104.

Pfurtscheller G and Neuper C (2001), "Motor imagery and direct brain-computer communication", Proceedings of the IEEE, vol. 89, 1123-1134.

Curran E, Sykacek P, Stokes M, Roberts S J, Penny W, Johnsrude I and Owen A M (2004), "Cognitive tasks for driving a brain-computer interfacing system: a pilot study", IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 12, 48-54.

Millan J R (2004), "On the need for on-line learning in brain-computer interfaces", Proceedings of the Annual International Joint Conference on Neural Networks (Budapest, Hungary).

Blankertz B, Curio G and Muller K R (2002), "Classifying single trial EEG: Towards brain-computer interfacing", Advances in Neural Information Processing Systems, vol. 14, 157-164.

Hung C I, Lee P L, Wu Y T, Chen L F, Yeh T C and Hsieh J C (2005), "Recognition of motor imagery electroencephalography using independent component analysis and machine classifiers", Artificial Neural Networks and Biomedical Engineering, 33, 1053-1070.

N. Ahmed and T. Natarajan (1983), "Discrete-Time Signals and Systems", Reston Publishing Company.

Giovanni Bianchi and Roberto Sorrentino (2007), "Electronic Filter Simulation & Design", McGraw-Hill Professional, 17-20. ISBN 9780071494670.

S. Butterworth (1930), Wireless Engineer, vol. 7, 536-541.

Thomas Lal, Thilo Hinterberger, Guido Widman, Michael Schröder, Jeremy Hill, Wolfgang Rosenstiel, Christian Elger, Bernhard Schölkopf and Niels Birbaumer (2004), "Methods Towards Invasive Human Brain Computer Interfaces", Advances in Neural Information Processing Systems (NIPS).
Aparna Chaparala is working as an Associate Professor in the computer science and engineering department of R.V.R. & J.C. College of Engineering, Chowdavaram, Guntur. She has 9 years of teaching experience. She completed her M.Tech in Computer Science & Engineering and is doing research in the data mining area, presently pursuing a Ph.D. from J.N.T.U, Hyderabad. She has published 5 papers in international journals.

Dr J.V.R. Murthy is presently working as a professor in the department of CSE at J.N.T.U., Kakinada. He did his M.Tech in CSE at IIT. He has over 20 years of teaching experience and 3 years of industrial experience. A memento of appreciation was awarded to him for "good performance and on schedule completion of the People Soft HRMS project" by Key Span Energy Corporation, New York. He has more than 15 publications in national and international journals. His research interests include data warehousing, data mining and VLDB.

Dr B. Raveendra Babu obtained his Masters degree in Computer Science and Engineering from Anna University, Chennai, and received his Ph.D. in Applied Mathematics from S.V. University, Tirupati. He is currently leading a team as Director (Operations), M/s. Delta Technologies (P) Ltd., Madhapur, Hyderabad. He has 26 years of teaching experience and more than 25 international and national publications to his credit. His research interests include VLDB, image processing, pattern analysis and wavelets.

M.V.P. Chandra Sekhara Rao is an Associate Professor in the department of computer science and engineering in R.V.R. & J.C. College of Engineering, Chowdavaram, Guntur. He has over 15 years of teaching experience. He completed his B.E and M.Tech in Computer Science & Engineering and is doing research in the area of data mining, presently pursuing a Ph.D. from J.N.T.U, Hyderabad.
He has published 5 papers in international journals and presented a paper in an international conference.