ACEEE Int. J. on Network Security, Vol. 02, No. 03, July 2011



Analysis of Neocognitron of Neural Network Method in the String Recognition

Amit Kumar Gupta 1, Yash Pal Singh 2
1 KIET / MCA Department, Ghaziabad (U.P.), India
  Email: amitid29@gmail.com
2 BIET / CSE Department, Bundelkhand, Jhansi (U.P.), India
  Email: yash_biet@yahoo.co.in

Abstract: This paper analyses a neural network method for pattern recognition. A neural network is a processing device whose design was inspired by the design and functioning of the human brain and its components. The proposed solution focuses on applying the Neocognitron algorithm model to pattern recognition, the primary function of which is to retrieve a pattern stored in memory when an incomplete or noisy version of that pattern is presented. An associative memory is a storehouse of associated patterns that are encoded in some form. In auto-association, an input pattern is associated with itself and the states of the input and output units coincide. When the storehouse is excited with a given distorted or partial pattern, the associated pattern pair stored in its perfect form is recalled. Pattern recognition techniques associate a symbolic identity with the image of the pattern. This problem of replication of patterns by machines (computers) involves machine-printed patterns. There is no idle memory containing data and program; instead, each neuron is programmed and continuously active.

Keywords: Neural network, machine-printed string, pattern recognition, perceptron-type network, learning, recognition, connection.

I. INTRODUCTION

An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by biological nervous systems, such as the brain. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network is configured for pattern recognition or data classification through a learning process. In biological systems, learning involves adjustments to the synaptic connections that exist between the neurons. Neural networks process information in a way similar to the human brain. The network is composed of a large number of highly interconnected processing elements working in parallel to solve a specific problem. Neural networks learn by example. A neuron has many inputs and one output, and it has two modes of operation: (i) the training mode and (ii) the using mode. In the training mode, the neuron can be trained for particular input patterns. In the using mode, when a taught input pattern is detected at the input, its associated output becomes the current output; if the input pattern does not belong to the taught list of input patterns, the training rule is used. Neural networks have many applications. The most likely applications for neural networks are (1) classification, (2) association and (3) reasoning. An important application of neural networks is pattern recognition. Pattern recognition can be implemented by using a feed-forward neural network that has been trained accordingly. During training, the network is trained to associate outputs with input patterns. When the network is used, it identifies the input pattern and tries to output the associated output pattern. Four significant approaches to PR have evolved [5]:

Statistical pattern recognition:
Here, the problem is posed either as one of composite hypothesis testing, with each hypothesis pertaining to the premise of the datum having originated from a particular class, or as one of regression from the space of measurements to the space of classes. The statistical methods for solving it involve the computation of the class-conditional probability densities, which remains the main hurdle in this approach. The statistical approach is one of the oldest, and is still widely used [8].

Syntactic pattern recognition:
In syntactic pattern recognition, each pattern is assumed to be composed of sub-patterns, or primitives, strung together in accordance with the generation rules of a grammar of the associated class. Class identification is accomplished by way of parsing operations using automata corresponding to the various grammars [15, 16]. Parser design and grammatical inference are two difficult issues associated with this approach to PR and are responsible for its somewhat limited applicability.

Knowledge-based pattern recognition:
This approach to PR [17] evolved from advances in rule-based systems in artificial intelligence (AI). Each rule is in the form of a clause that reflects evidence about the presence of a particular class. The sub-problems spawned by the methodology are:
1. How the rule base may be constructed, and
2. What mechanism might be used to integrate the evidence yielded by the invoked rules?

Neural pattern recognition:
Artificial Neural Networks (ANNs) provide an emerging paradigm in pattern recognition. The field of ANNs encompasses a large variety of models [18], all of which share two important characteristics:
1. They are composed of a large number of structurally and functionally similar units, called neurons, usually connected in various configurations by weighted links.
2. The ANN's model parameters are derived from supplied I/O paired data sets by an estimation process called training.
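The association of input patterns with output patterns described above can be made concrete with a small sketch. The following Python/NumPy example is illustrative only: the pattern grids, targets, network size, learning rate and the single-layer delta rule are assumptions chosen for brevity, not the configuration studied in this paper.

```python
import numpy as np

# Illustrative sketch: a tiny feed-forward (single-layer) network trained to
# associate input pattern vectors with target output vectors. All sizes and
# values below are arbitrary choices for illustration.

rng = np.random.default_rng(0)

# Two 9-pixel (3x3 grid) bipolar patterns and their one-hot target outputs.
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1,  1],   # "X"-like pattern
                     [ 1,  1,  1,  1, -1,  1,  1,  1,  1]])  # "O"-like pattern
targets  = np.array([[1, 0],
                     [0, 1]])

W = rng.normal(scale=0.1, size=(patterns.shape[1], targets.shape[1]))  # weight matrix

def forward(x, W):
    """Sigmoid output of the single-layer network."""
    return 1.0 / (1.0 + np.exp(-x @ W))

# Delta-rule training: adjust weights until outputs match the associated targets.
for epoch in range(500):
    out = forward(patterns, W)
    W += 0.5 * patterns.T @ (targets - out)

print(np.round(forward(patterns, W)))  # recalls the associated output for each pattern
```

When the trained network is later shown one of the taught patterns, it reproduces the associated output, which is exactly the behaviour described above.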
II. METHODOLOGY

Different neural network algorithms are used for recognizing patterns, and the various algorithms differ in their learning mechanism. Information is stored in the weight matrix of a neural network; learning is the determination of the weights. All learning methods used for adaptive neural networks can be classified into two major categories: supervised learning and unsupervised learning. Supervised learning incorporates an external teacher; after training the network, we should get a response equal to the target response. During the learning process, global information may be required. The aim is to determine a set of weights which minimizes the error. Unsupervised learning uses no external teacher and is based on clustering of the input data. There is no prior information about an input's membership in a particular class; the character of the patterns and a history of training are used to assist the network in defining classes. The network self-organizes the data presented to it and detects their emergent collective properties. The characteristics of the neurons and the initial weights are specified based upon the training method of the network. The pattern set is applied to the network during training. The patterns to be recognized are in the form of vectors whose elements are obtained from a pattern grid; the elements are either 0 and 1 or -1 and 1. In some of the algorithms the weights are calculated from the patterns presented to the network, and in some algorithms the weights are initialized. The network acquires its knowledge from the environment. It stores the patterns presented during training; in other words, it extracts the features of the patterns. As a result, the information can be retrieved later.
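As a concrete illustration of these pattern vectors, the short sketch below flattens a small character grid into a binary (0/1) vector and a bipolar (-1/1) vector. The 7×5 grid and the character drawn in it are arbitrary examples, not the training data used in this paper.

```python
import numpy as np

# A 7x5 pattern grid for an arbitrary character ('T'), drawn with '#' for on pixels.
grid = ["#####",
        "..#..",
        "..#..",
        "..#..",
        "..#..",
        "..#..",
        "..#.."]

# Binary vector: on pixels become 1, off pixels become 0.
binary = np.array([[1 if ch == '#' else 0 for ch in row] for row in grid]).ravel()

# Bipolar vector: on pixels become +1, off pixels become -1.
bipolar = 2 * binary - 1

print(binary)    # 35-element vector of 0s and 1s
print(bipolar)   # the same pattern encoded with -1 and 1
```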
III. PROBLEM STATEMENT

The aim of this paper is to show that neural networks have demonstrated their capability for solving complex pattern recognition problems. Commonly solved pattern recognition problems have limited scope, and a single neural network architecture can recognize only a few patterns. This paper discusses a neural network algorithm, with its implementation details, for solving pattern recognition problems. A relative performance evaluation of the algorithm has been carried out. The comparisons of the algorithm have been performed based on the following criteria:
(1) Noise in weights
(2) Noise in inputs
(3) Loss of connections
(4) Missing information and adding information.

IV. NEOCOGNITRON

The Neocognitron is self-organized by unsupervised learning and acquires the ability for correct pattern recognition. For self-organization of the network, the patterns are presented repeatedly to the network. It is not necessary to give prior information about the categories to which these patterns should be classified; the Neocognitron itself acquires the ability to classify and correctly recognize these patterns according to the differences in their shapes. The network has a multilayer structure. Each layer has two planes: one, called the S-plane, consists of S-units; the other, the C-plane, consists of C-units. The units can have both excitatory and inhibitory connections. The network has forward connections from the input layer to the output layer and backward connections from the output layer to the input layer. The forward signals are for pattern classification and recognition; the backward signals are for selective attention, pattern segmentation and associative recall [23].

Figure 4.1 shows the character identification process schematically.

Figure 4.1: A schematic representation of the interconnection strategy of the Neocognitron. (a) On the first level, each S-unit receives input connections from a small region of the retina. Units in corresponding positions on all planes receive input connections from the same region of the retina. The region from which an S-cell receives input connections defines the receptive field of the cell. (b) On intermediate levels, each unit on an S-plane receives input connections from corresponding locations on all C-planes in the previous level. C-cells have connections from a region of S-cells on the S-level preceding it. If the number of C-planes is the same as that of S-planes at that level, then each C-cell has connections from S-cells on a single S-plane. If there are fewer C-planes than S-planes, some C-cells may receive connections from more than one S-plane.
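To make the layered S-plane/C-plane organization concrete before the formal definitions of the next section, the sketch below builds a minimal configuration object for a Neocognitron-style hierarchy. The stage count, plane counts and receptive-field sizes are placeholders chosen for illustration (the values actually used in this paper appear later, in Section VI and Table 6.1), so this is a structural sketch rather than the paper's exact network.

```python
from dataclasses import dataclass, field
from typing import List

# Structural sketch of a Neocognitron-style hierarchy: alternating S-layers
# (feature-extracting cells) and C-layers (shift-tolerant cells), each made of
# several cell planes. Plane counts and field sizes here are illustrative only.

@dataclass
class Layer:
    kind: str              # "S" (feature extraction) or "C" (tolerance to shift)
    planes: int            # number of cell planes in the layer
    field_size: int        # side of the square region each cell connects to

@dataclass
class Neocognitron:
    input_size: int                     # side of the square input layer (retina)
    layers: List[Layer] = field(default_factory=list)

    def add_stage(self, s_planes: int, c_planes: int, s_field: int, c_field: int):
        """Append one S-layer/C-layer pair (one stage) to the hierarchy."""
        self.layers.append(Layer("S", s_planes, s_field))
        self.layers.append(Layer("C", c_planes, c_field))

# Example: a 4-stage hierarchy on a 19x19 retina (all numbers illustrative).
net = Neocognitron(input_size=19)
for _ in range(4):
    net.add_stage(s_planes=21, c_planes=20, s_field=5, c_field=3)

for i, layer in enumerate(net.layers, start=1):
    print(f"layer {i}: {layer.kind}-layer, {layer.planes} planes, "
          f"{layer.field_size}x{layer.field_size} connectable area")
```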
V. NEOCOGNITRON ALGORITHM

Let u(1), u(2), …, u(N) be the excitatory inputs and v be the inhibitory input. The output of the S-cell is given by

    w = φ[ (1 + Σν a(ν)·u(ν)) / (1 + b·v) − 1 ],

where a(ν) and b represent the excitatory and inhibitory interconnecting coefficients respectively. The function φ is defined by

    φ[x] = x for x ≥ 0, and φ[x] = 0 for x < 0.

Let e be the sum of all excitatory inputs weighted with the interconnecting coefficients, and h be the inhibitory input multiplied by its interconnecting coefficient:

    e = Σν a(ν)·u(ν),   h = b·v.

So w can be written as

    w = φ[ (1 + e) / (1 + h) − 1 ] = φ[ (e − h) / (1 + h) ].

When e ≥ h the cell is active and its output equals (e − h)/(1 + h); otherwise the output is zero. In the Neocognitron, the interconnecting coefficients a(ν) and b increase as learning progresses. If the interconnecting coefficients increase, and if e >> 1 and h >> 1, then

    w ≈ φ[ e/h − 1 ],

where the output depends on the ratio e/h, not on e and h individually. Therefore, if both the excitatory and inhibitory coefficients increase at the same rate, the output of the cell approaches a fixed value. Similarly, the input-output characteristic of the C-cell is given by the saturating function

    ψ[x] = φ[x] / (α + φ[x]),

where α is a positive constant which determines the degree of saturation of the output. The outputs of the excitatory cells S and C are connected only to the excitatory input terminals of other cells. Vs and Vc cells are inhibitory cells, whose outputs are connected only to the inhibitory input terminals of other cells. A Vs-cell gives an output which is proportional to the weighted arithmetic mean of all its excitatory inputs. A Vc-cell also has only excitatory inputs, but its output is proportional to the weighted root mean square of its inputs. Let u(1), u(2), …, u(N) be the excitatory inputs and c(1), c(2), …, c(N) be the interconnecting coefficients of its input terminals. The output w of this Vc-cell is defined by

    w = √( Σν c(ν)·u(ν)² ).
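A compact way to check how these definitions interact is to code them directly. The sketch below implements the φ and ψ nonlinearities, the S-cell response and the Vc-cell weighted root mean square following the formulas above; note that the equation forms are reconstructions consistent with the standard Neocognitron, and all numeric values are arbitrary.

```python
import numpy as np

# Sketch of the cell response functions defined above. Coefficient values are
# arbitrary illustrations; the functional forms follow the standard Neocognitron.

def phi(x):
    """phi[x] = x for x >= 0, 0 otherwise."""
    return np.maximum(x, 0.0)

def psi(x, alpha=0.5):
    """Saturating C-cell characteristic: psi[x] = phi[x] / (alpha + phi[x])."""
    p = phi(x)
    return p / (alpha + p)

def s_cell(u, a, v, b):
    """S-cell output w = phi[(1 + sum a(nu)*u(nu)) / (1 + b*v) - 1]."""
    e = np.dot(a, u)          # weighted excitatory sum
    h = b * v                 # weighted inhibitory input
    return phi((1.0 + e) / (1.0 + h) - 1.0)

def vc_cell(u, c):
    """Vc-cell output: weighted root mean square of its excitatory inputs."""
    return np.sqrt(np.dot(c, np.square(u)))

# Example: scaling the excitatory and inhibitory coefficients up together leaves
# the S-cell output governed by the ratio e/h, as stated above.
u = np.array([0.2, 0.8, 0.5])
a = np.array([1.0, 2.0, 1.5])
v = vc_cell(u, c=np.array([0.3, 0.3, 0.4]))
for scale in (1.0, 10.0, 100.0):
    print(scale, s_cell(u, scale * a, v, b=scale * 2.0))

print("C-cell style saturation of 2.0:", psi(2.0))
```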
Storage capacity: by increasing the number of planes in each layer, the capacity of the network can be increased. But if the number of planes in each layer is increased, the computer cannot simulate the network because of the lack of memory capacity.

VI. RESULT

For the Neocognitron algorithm, a nine-layered network U0, Us1, Uc1, Us2, Uc2, Us3, Uc3, Us4 and Uc4 is taken, so the network has four stages preceded by an input layer. There are 21 cell planes in each layer Us1-Us4 and 20 cell planes in each layer Uc1-Uc4. The parameters are listed in Table 6.1. As can be seen from the table, the total number of C-cells in layer Uc4 is 20, because each C-plane has only one cell. The number of cells in a connectable area is always 5×5 for every S-layer. Hence the number of excitatory inputs to each S-cell is 5×5 in layer Us1, and 5×5×20 in layers Us2, Us3 and Us4, because these layers are preceded by C-layers consisting of 20 cell planes each. During learning, five training patterns 0, 1, 2, 3 and 4 were presented repeatedly to the input layer. After repeated presentation of these five patterns, the Neocognitron acquired the ability to classify them according to the differences in their shape. Each pattern is presented five times. It has been observed that a different cell responds to each pattern.

TABLE 6.1: ARCHITECTURE AND VALUES OF PARAMETERS USED IN THE NEOCOGNITRON

The various neural network algorithms differ in their learning mechanism: some networks learn by supervised learning and in some, learning is unsupervised. The capacity for storing patterns and correctly recognizing them differs from network to network, and even when some of the networks are capable of storing the same number of patterns, they differ in the complexity of their architecture. The total number of neurons needed to classify or recognize the patterns is different for the various algorithms, and so is the total number of unknowns to be computed. The networks are compared on this basis. The performance of the various algorithms under different criteria is presented below. These criteria are:

A. Noise in Weights
Noise has been introduced in two ways:
1. By adding random numbers to the weights.
2. By adding a constant number to all weights of the network.

Noise in inputs: The network is trained with characters without noise. Then the network is tested by presenting each of the characters; it has been observed that the number of characters recognized correctly differs for the various algorithms. Noise is introduced into the input vectors at the time of testing and its effect has been observed on the various algorithms. This has been done by adding random numbers to the test vector.
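The two noise criteria above can be expressed as a small test harness. In the sketch below, the stored patterns, the nearest-match recognizer and all noise levels are hypothetical stand-ins chosen for illustration; the evaluation reported in this paper is of course run on the trained Neocognitron itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: five stored bipolar pattern grids act as the "trained"
# weights of a nearest-match recognizer.
patterns = rng.choice([-1.0, 1.0], size=(5, 49))   # five 7x7 grids, flattened
labels = list(range(5))
weights = patterns.copy()

def recognize(W, pattern):
    """Return the index of the stored pattern that matches best."""
    return int(np.argmax(W @ pattern))

def accuracy(W, test_patterns):
    hits = sum(recognize(W, p) == y for p, y in zip(test_patterns, labels))
    return hits / len(labels)

# Criterion (1): noise in weights, either random perturbations or a constant
# added to every weight of the network.
noisy_w_random = weights + rng.normal(scale=0.2, size=weights.shape)
noisy_w_const  = weights + 0.05

# Criterion (2): noise in inputs, added to the test vectors at testing time.
noisy_inputs = [p + rng.normal(scale=0.3, size=p.shape) for p in patterns]

print("clean:                ", accuracy(weights, patterns))
print("random weight noise:  ", accuracy(noisy_w_random, patterns))
print("constant weight shift:", accuracy(noisy_w_const, patterns))
print("noisy inputs:         ", accuracy(weights, noisy_inputs))
```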
B. Loss of Connection
In the network, neurons are interconnected and every interconnection has an interconnecting coefficient called a weight. If some of these weights are set equal to zero, how this affects the classification or recognition is studied in this section. The number of connections that can be removed such that the network performance is not affected has also been found for each algorithm. If the connections from an input neuron to all the output neurons are removed and the pixel corresponding to that neuron is off, then it makes no difference; but if that pixel is on, in the output it becomes off. There are two sets of weights: the first connecting the input neurons to the hidden neurons, and the second connecting the hidden neurons to the output neurons. If the weights connecting 49 input neurons to 15 hidden neurons and the weights connecting 15 hidden neurons to 2 output neurons are equated to zero, then for 25 characters the actual output is equal to the target output.
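The loss-of-connection test can be mimicked by zeroing selected weights and re-checking recognition. The sketch below uses the same kind of hypothetical nearest-match recognizer as the noise harness above (a stand-in, not the paper's trained network) and removes all outgoing connections of one input neuron at a time.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in trained network: stored bipolar patterns used directly as weights.
patterns = rng.choice([-1.0, 1.0], size=(5, 49))
weights = patterns.copy()

def recognize(W, pattern):
    return int(np.argmax(W @ pattern))

def removable_connections():
    """Zero all connections leaving one input neuron (one weight column) at a
    time and count how many such removals leave every pattern still recognized."""
    safe = 0
    for j in range(weights.shape[1]):
        W = weights.copy()
        W[:, j] = 0.0                     # remove every connection from input j
        if all(recognize(W, p) == i for i, p in enumerate(patterns)):
            safe += 1
    return safe

print(f"{removable_connections()} of {weights.shape[1]} input neurons can lose "
      "all their outgoing connections without affecting recognition")
```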
C. Missing Information
Missing information means that some of the on pixels in the pattern grid are made off. For the algorithm, how much information can be missed while the strings are still recognized correctly varies from string to string. We cannot switch off a pixel from just any place; which pixel is being switched off also matters. For a few strings, Table 6.2 shows the number of pixels that can be switched off for all the stored strings in the algorithm.

TABLE 6.2: MISSING INFORMATION: NUMBER OF PIXELS THAT CAN BE MADE OFF IN THIS ALGORITHM

D. Adding Information
There are three weight matrices: weights connecting the inputs to the input layer, the input layer to the hidden layer, and the hidden layer to the output layer. If we add 0.0002 to all the weights, then for all 26 characters we get an output equal to the target output. If this number is increased, then for some of the characters we do not get an output the same as the target output. Table 6.3 gives a detailed description of the number of pixels that can be made on for all the strings that can be stored in the network.

TABLE 6.3: ADDING INFORMATION: NUMBER OF PIXELS THAT CAN BE MADE ON IN THIS ALGORITHM
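The missing-information test of section C amounts to switching off on-pixels of a stored pattern before recognition. The sketch below (again with a hypothetical nearest-match recognizer standing in for the trained network) counts, for each stored pattern, how many on-pixels can be turned off in one fixed order before recognition fails; as noted above, which pixels are switched off also matters.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in: stored bipolar pattern grids act as the trained weights
# of a nearest-match recognizer.
patterns = rng.choice([-1.0, 1.0], size=(5, 49))   # five 7x7 grids, flattened
weights = patterns.copy()

def recognize(pattern):
    return int(np.argmax(weights @ pattern))

def max_pixels_off(index):
    """Missing information: switch off on-pixels one at a time (in a fixed order)
    and count how many can be turned off before the pattern is misrecognized."""
    p = patterns[index].copy()
    on_positions = np.flatnonzero(p > 0)
    for count, pos in enumerate(on_positions):
        p[pos] = -1.0                      # turn the pixel off
        if recognize(p) != index:
            return count                   # pixels removed while still recognized
    return len(on_positions)

for i in range(len(patterns)):
    print(f"pattern {i}: {max_pixels_off(i)} pixels can be switched off")
```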




Demerits
In this Neocognitron method, the network architecture is very complex and involves a large number of neurons.

CONCLUSION

The performance of the Neocognitron algorithm has been studied under the above criteria. It has been observed that a certain algorithm performs best under a particular criterion. The algorithms have also been compared based on the number of neurons and the number of unknowns to be computed. The detailed description is given in Table 7.1.

TABLE 7.1: PERFORMANCE OF THE NEOCOGNITRON ALGORITHM UNDER DIFFERENT CRITERIA

REFERENCES

[1] Basit Hussain and M. R. Kabuka, "A Novel Feature Recognition Neural Network and its Application to Character Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 16, No. 1, pp. 98-106, Jan. 1994.
[2] Z. B. Xu, Y. Leung and X. W. He, "Asymmetrical Bidirectional Associative Memories", IEEE Transactions on Systems, Man and Cybernetics, Vol. 24, pp. 1558-1564, Oct. 1994.
[3] R. J. Schalkoff, "Pattern Recognition: Statistical, Structural and Neural Approaches", John Wiley and Sons, 1992.
[4] K. Fukunaga, "Statistical Pattern Recognition", 2nd ed., Academic Press, 1990.
[5] V. K. Govindan and A. P. Shivaprasad, "Character Recognition - A Review", Pattern Recognition, Vol. 23, No. 7, pp. 671-683, 1990.
[6] K. S. Fu, "Syntactic Pattern Recognition", Prentice Hall, 1996.
[7] Y. K. Chen and J. F. Wang, "Segmentation of Single- or Multiple-Touching Handwritten Numeral Strings Using Background and Foreground Analysis", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, pp. 1304-1317, 2000.
[8] Jacek M. Zurada, "Introduction to Artificial Neural Systems", Jaico Publishing House, New Delhi, India.
[9] C. L. Liu, K. Nakashima, H. Sako and H. Fujisawa, "Handwritten Digit Recognition: Benchmarking of State-of-the-Art Techniques", Pattern Recognition, Vol. 36, pp. 2271-2285, 2003.
[10] R. Plamondon and S. N. Srihari, "On-line and Off-line Handwriting Recognition: A Comprehensive Survey", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, pp. 62-84, 2000.
[11] C. L. Liu, K. Nakashima, H. Sako and H. Fujisawa, "Handwritten Digit Recognition: Investigation of Normalization and Feature Extraction Techniques", Pattern Recognition, Vol. 37, pp. 265-279, 2004.
[12] L. S. Oliveira, R. Sabourin, F. Bortolozzi and C. Y. Suen, "Automatic Segmentation of Handwritten Numerical Strings: A Recognition and Verification Strategy", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 24, No. 11, pp. 1438-1454, 2002.
[13] U. Bhattacharya, T. K. Das, A. Datta, S. K. Parui and B. B. Chaudhuri, "A Hybrid Scheme for Handprinted Numeral Recognition Based on a Self-Organizing Network and MLP Classifiers", Int. J. Pattern Recognition and Artificial Intelligence, Vol. 16, pp. 845-864, 2002.
[14] C. L. Liu, H. Sako and H. Fujisawa, "Effects of Classifier Structures and Training Regimes on Integrated Segmentation and Recognition of Handwritten Numeral Strings", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 26, No. 11, pp. 1395-1407, 2004.
[15] K. K. Kim, J. H. Kim and C. Y. Suen, "Segmentation-Based Recognition of Handwritten Touching Pairs of Digits Using Structural Features", Pattern Recognition Letters, Vol. 23, pp. 13-24, 2002.
[16] A. Jain, R. P. W. Duin and J. Mao, "Statistical Pattern Recognition: A Review", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, No. 1, pp. 4-37, 2000.
[17] D. M. J. Tax, M. van Breukelen, R. P. W. Duin and J. Kittler, "Combining Multiple Classifiers by Averaging or by Multiplying?", Pattern Recognition, Vol. 33, pp. 1475-1485, 2000.
[18] L. S. Oliveira, R. Sabourin, F. Bortolozzi and C. Y. Suen, "Impacts of Verification on a Numeral String Recognition System", Pattern Recognition Letters, Vol. 24, pp. 1023-1031, 2003.
[19] U. Pal and B. B. Chaudhuri, "Indian Script Character Recognition: A Survey", Pattern Recognition, Vol. 37, pp. 1887-1899, 2004.
[20] Xiangyun Ye, Mohamed Cheriet and Ching Y. Suen, "StrCombo: Combination of String Recognizers", Pattern Recognition Letters, Vol. 23, pp. 381-394, 2002.
[21] Simon Haykin, "Neural Networks: A Comprehensive Foundation", Macmillan College Publishing Company, New York, 1994.
[22] Javad Sadri, Ching Y. Suen and Tien D. Bui, "A Genetic Framework Using Contextual Knowledge for Segmentation and Recognition of Handwritten Numeral Strings", Pattern Recognition, Vol. 40, pp. 898-919, 2007.
[23] C. M. Bishop, "Neural Networks for Pattern Recognition", Oxford University Press, Oxford, 2003.





 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxThe Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
 

Analysis of Neocognitron of Neural Network Method in the String Recognition

I. INTRODUCTION

An Artificial Neural Network (ANN) is an information-processing paradigm inspired by biological nervous systems such as the brain. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network is configured for pattern recognition or data classification through a learning process. In biological systems, learning involves adjustments to the synaptic connections that exist between the neurons. Neural networks process information in a similar way to the human brain: the network is composed of a large number of highly interconnected processing elements working in parallel to solve a specific problem, and it learns by example. A neuron has many inputs and one output, and it has two modes of operation: (i) the training mode and (ii) the using mode. In the training mode, the neuron can be trained for particular input patterns. In the using mode, when a taught input pattern is detected at the input, its associated output becomes the current output; if the input pattern does not belong to the taught list of input patterns, the training rule is used. Neural networks have many applications; the most likely applications are (1) classification, (2) association and (3) reasoning. An important application of neural networks is pattern recognition.
Class identification in the syntactic approach is accomplished by way of parsing operations using automata corresponding to the various grammars [15, 16]. Parser design and grammatical inference are two difficult issues associated with this approach to PR and are responsible for its somewhat limited applicability.

Knowledge-based pattern recognition:
This approach to PR [17] evolved from advances in rule-based systems in artificial intelligence (AI). Each rule is in the form of a clause that reflects evidence about the presence of a particular class. The sub-problems spawned by the methodology are: 1. how the rule base may be constructed, and 2. what mechanism might be used to integrate the evidence yielded by the invoked rules.

Neural pattern recognition:
Artificial Neural Networks (ANNs) provide an emerging paradigm in pattern recognition. The field of ANNs encompasses a large variety of models [18], all of which have two important characteristics: 1. they are composed of a large number of structurally and functionally similar units, called neurons, usually connected in various configurations by weighted links; 2. an ANN's model parameters are derived from supplied input/output paired data sets by an estimation process called training.
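As a concrete illustration of deriving model parameters from supplied input/output pairs and then recalling the associated output (the training mode and using mode mentioned above), the sketch below trains a minimal single-layer associator with a delta-rule update. This is not the paper's Neocognitron; the pattern sizes, values and learning rate are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's method): a single-layer associator whose
# weights are estimated from supplied input/output pairs (training mode) and
# then used to recall the associated output for a presented pattern (using mode).
inputs = np.array([[1, -1, 1, -1], [1, 1, -1, -1]], dtype=float)   # bipolar input patterns (assumed)
targets = np.array([[1, -1], [-1, 1]], dtype=float)                # associated output patterns (assumed)

weights = np.zeros((inputs.shape[1], targets.shape[1]))

# Training mode: adjust the weights so each input pattern maps to its target.
for _ in range(100):
    for x, t in zip(inputs, targets):
        y = np.sign(x @ weights)
        weights += 0.1 * np.outer(x, t - y)        # delta-rule update

# Using mode: present a (possibly distorted) pattern and read out the associated output.
probe = np.array([1, -1, 1, 1], dtype=float)        # noisy version of the first stored pattern
print(np.sign(probe @ weights))
```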
II. METHODOLOGY

Different neural network algorithms are used for recognizing patterns, and the various algorithms differ in their learning mechanism. Information is stored in the weight matrix of a neural network, and learning is the determination of the weights. All learning methods used for adaptive neural networks can be classified into two major categories: supervised learning and unsupervised learning. Supervised learning incorporates an external teacher; after training the network, we should get a response equal to the target response. During the learning process, global information may be required. The aim is to determine a set of weights which minimizes the error. Unsupervised learning uses no external teacher and is based on clustering of the input data. There is no prior information about an input's membership in a particular class. The character of the patterns and a history of training are used to assist the network in defining classes; the network self-organizes the data presented to it and detects their emergent collective properties. The characteristics of the neurons and the initial weights are specified based upon the training method of the network. The pattern set is applied to the network during training. The patterns to be recognized are in the form of vectors whose elements are obtained from a pattern grid; the elements are either 0 and 1 or -1 and 1. In some of the algorithms, weights are calculated from the patterns presented to the network, and in some algorithms the weights are initialized. The network acquires knowledge from the environment: it stores the patterns presented during training or, put another way, it extracts the features of the patterns, and as a result the information can be retrieved later.

III. PROBLEM STATEMENT

Neural networks have demonstrated their capability for solving complex pattern recognition problems, but commonly solved pattern recognition problems have limited scope, and a single neural network architecture can recognize only a few patterns. This paper discusses a neural network algorithm and its implementation details for solving pattern recognition problems, and a relative performance evaluation of the algorithm has been carried out. The comparisons have been performed based on the following criteria: (1) noise in weights, (2) noise in inputs, (3) loss of connections, (4) missing information and adding information.
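The Methodology section above states that each pattern is presented as a vector whose elements, 0/1 or -1/1, are read off a pattern grid. The following is a minimal sketch of that encoding; the 5x5 grid and the glyph drawn on it are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Minimal sketch of turning a pattern grid into a training vector as described
# in the Methodology section; the 5x5 grid and the glyph are assumed examples.
grid = np.array([
    [0, 1, 1, 1, 0],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
])  # a rough "A" on a 5x5 grid: 1 = pixel on, 0 = pixel off

binary_vector = grid.flatten()                 # elements in {0, 1}
bipolar_vector = 2 * binary_vector - 1         # elements in {-1, 1}
print(binary_vector.shape, bipolar_vector[:5])
```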
IV. NEOCOGNITRON

The Neocognitron is self-organized by unsupervised learning and acquires the ability for correct pattern recognition. For self-organization of the network, the patterns are presented repeatedly to the network; it is not necessary to give prior information about the categories to which these patterns should be classified. The Neocognitron itself acquires the ability to classify and correctly recognize these patterns according to the differences in their shapes. The network has a multilayer structure. Each layer has two planes: one, called the S-plane, consists of S-units; the other, the C-plane, consists of C-units. The units can have both excitatory and inhibitory connections. The network has forward connections from the input layer to the output layer and backward connections from the output layer to the input layer. The forward signals are for pattern classification and recognition; the backward signals are for selective attention, pattern segmentation and associative recall [23].

Figure 4.1 shows the character identification process schematically.

Figure 4.1: A schematic representation of the interconnection strategy of the Neocognitron. (a) On the first level, each S-unit receives input connections from a small region of the retina. Units in corresponding positions on all planes receive input connections from the same region of the retina; the region from which an S-cell receives input connections defines the receptive field of the cell. (b) On intermediate levels, each unit on an S-plane receives input connections from corresponding locations on all C-planes in the previous level. C-cells have connections from a region of S-cells on the S-level preceding it. If the number of C-planes is the same as the number of S-planes at that level, then each C-cell has connections from S-cells on a single S-plane; if there are fewer C-planes than S-planes, some C-cells may receive connections from more than one S-plane.

V. NEOCOGNITRON ALGORITHM

Let u(1), u(2), …, u(N) be the excitatory inputs and v be the inhibitory input. The output of the S-cell is computed from these inputs, where a(ν) and b represent the excitatory and inhibitory interconnecting coefficients respectively, and is passed through a threshold function φ.
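The S-cell equation referenced above appears only as an image in the source pages. As a hedged reconstruction based on the standard Neocognitron formulation (Fukushima), which matches the verbal description here, it can be written as

\[
  u_s = \varphi\!\left[\frac{1 + \sum_{\nu=1}^{N} a(\nu)\,u(\nu)}{1 + b\,v} - 1\right],
  \qquad
  \varphi(x) = \begin{cases} x, & x \ge 0 \\ 0, & x < 0 \end{cases}
\]

where the numerator collects the weighted excitatory inputs and the denominator the weighted inhibitory input. Treat this as a sketch of the intended relation rather than the authors' exact notation.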
Let e be the sum of all the excitatory inputs weighted with the interconnecting coefficients, and let h be the inhibitory input multiplied by its interconnecting coefficient, so that the S-cell output w can be written in terms of e and h. In the Neocognitron, the interconnecting coefficients a(ν) and b increase as learning progresses. If the interconnecting coefficients increase and e >> 1 and h >> 1, the output depends on the ratio e/h, not on e and h individually. Therefore, if both the excitatory and inhibitory coefficients increase at the same rate, the output of the cell approaches a certain value. Similarly, the input/output characteristic of a C-cell is a saturating function of its input; α is a positive constant which determines the degree of saturation of the output. The outputs of the excitatory cells S and C are connected only to the excitatory input terminals of other cells. Vs and Vc cells are inhibitory cells whose outputs are connected only to the inhibitory input terminals of other cells. A Vs-cell gives an output which is proportional to the weighted arithmetic mean of all its excitatory inputs. A Vc-cell also has only inhibitory inputs, but its output is proportional to the weighted root mean square of its inputs. Let u(1), u(2), …, u(N) be its inputs and c(1), c(2), …, c(N) be the interconnecting coefficients of its input terminals; the output w of the Vc-cell is the weighted root mean square of these inputs.

Storage capacity: by increasing the number of planes in each layer, the capacity of the network can be increased; but with more planes in each layer, the computer cannot simulate the network because of the lack of memory capacity.
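The individual equations for e, h, w, the C-cell characteristic and the Vc-cell output are also images in the source and are not reproduced in this transcript. The sketch below implements the standard Neocognitron forms that the descriptions above correspond to (Fukushima's formulation); the value of α, the input vectors and the coefficients are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the cell responses described above, using the standard
# Neocognitron relations; constants and array values are assumed, not the
# paper's parameters.

def phi(x):
    """Threshold function: phi(x) = x for x >= 0, 0 otherwise."""
    return np.maximum(x, 0.0)

def s_cell(u, a, v, b):
    """S-cell output.

    e = sum of excitatory inputs u weighted by coefficients a
    h = inhibitory input v multiplied by coefficient b
    w = phi((1 + e) / (1 + h) - 1); for e, h >> 1 this approaches phi(e/h - 1),
    so the response depends on the ratio e/h rather than on e and h separately.
    """
    e = float(np.dot(a, u))
    h = float(b * v)
    return phi((1.0 + e) / (1.0 + h) - 1.0)

def c_cell(x, alpha=0.5):
    """C-cell input/output characteristic psi(x) = x / (alpha + x) for x >= 0;
    alpha controls the degree of saturation of the output."""
    x = phi(x)
    return x / (alpha + x)

def v_c_cell(u, c):
    """Vc-cell output: weighted root mean square of its inputs."""
    return float(np.sqrt(np.dot(c, np.square(u))))

# Illustrative values (assumed, not from the paper)
u = np.array([0.2, 0.8, 0.5])      # inputs u(1)..u(N)
a = np.array([1.0, 0.6, 0.3])      # excitatory coefficients a(1)..a(N)
c = np.array([0.4, 0.4, 0.2])      # Vc-cell coefficients c(1)..c(N)
v = v_c_cell(u, c)                 # inhibitory input to the S-cell
print(s_cell(u, a, v, b=1.2), c_cell(0.7))
```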
VI. RESULT

For the Neocognitron algorithm, a nine-layered network U0, Us1, Uc1, Us2, Uc2, Us3, Uc3, Us4 and Uc4 is taken, so the network has four stages preceded by an input layer. There are 21 cell planes in each layer Us1-Us4 and 20 cell planes in Uc1-Uc4; the parameters are listed in Table 6.1. As can be seen from the table, the total number of C-cells in layer Uc4 is 20, because each C-plane has only one cell. The number of cells in a connectable area is always 5×5 for every S-layer; hence the number of excitatory inputs to each S-cell is 5×5 in layer Us1 and 5×5×20 in layers Us2, Us3 and Us4, because these layers are preceded by C-layers consisting of 20 cell planes each. During learning, five training patterns 0, 1, 2, 3 and 4 were presented repeatedly to the input layer. After repeated presentation of these five patterns, the Neocognitron acquired the ability to classify them according to the differences in their shapes. Each pattern was presented five times, and it has been observed that a different cell responds to each pattern.

TABLE 6.1: THE ARCHITECTURE AND VALUES OF PARAMETERS USED IN THE NEOCOGNITRON

The various neural network algorithms differ in their learning mechanism: some networks learn by supervised learning and in some, learning is unsupervised. The capacity for storing patterns and correctly recognizing them differs between networks, and even networks capable of storing the same number of patterns differ in the complexity of their architecture. The total number of neurons needed to classify or recognize the patterns, and the total number of unknowns, also differ between algorithms. The networks are compared on this basis, and the performance of the various algorithms under different criteria is presented. These criteria are:

A. Noise in Weights
Noise has been introduced in two ways: (1) by adding random numbers to the weights, and (2) by adding a constant number to all the weights of the network.

Noise in inputs: the network is trained with characters without noise, and it is then tested by presenting each of the characters. It has been observed that the number of characters recognized correctly differs for the various algorithms. Noise is introduced into the input vectors at the time of testing and its effect has been observed; this has been done by adding random numbers to the test vector.
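A minimal sketch of the two noise tests just described, namely random (or constant) noise added to the weights and random noise added to the test vectors at recognition time, is given below. The stored-prototype recognizer, the noise levels and the 7x7 pattern size are illustrative assumptions, not the paper's networks.

```python
import numpy as np

# Minimal sketch of the noise criteria described above (assumed recognizer:
# a stored-prototype matcher; the paper's networks are not reproduced here).
rng = np.random.default_rng(1)

prototypes = rng.choice([-1.0, 1.0], size=(5, 49))      # 5 stored 7x7 patterns (assumed)

def recognize(x, w):
    """Return the index of the stored pattern whose weighted response is largest."""
    return int(np.argmax(w @ x))

# Noise in weights: add random numbers, or a constant, to every weight.
noisy_weights = prototypes + rng.normal(scale=0.3, size=prototypes.shape)
shifted_weights = prototypes + 0.0002

# Noise in inputs: train on clean patterns, test on perturbed vectors.
test = prototypes[2] + rng.normal(scale=0.5, size=49)
print(recognize(test, prototypes),
      recognize(prototypes[2], noisy_weights),
      recognize(prototypes[2], shifted_weights))
```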
B. Loss of Connections
In the network, neurons are interconnected and every interconnection has an interconnecting coefficient called a weight. How the classification or recognition is affected if some of these weights are set to zero is studied under this section. The number of connections that can be removed such that the network performance is not affected has also been found for the algorithm. If the connections from an input neuron to all the output neurons are removed and the pixel corresponding to that neuron is off, it makes no difference; but if that pixel is on, the corresponding output becomes off. There are two sets of weights: the first connecting the input neurons to the hidden neurons, and the second connecting the hidden neurons to the output neurons. If the weights connecting 49 input neurons to 15 hidden neurons and the weights connecting 15 hidden neurons to 2 output neurons are equated to zero, then for 25 characters the actual output is equal to the target output.

C. Missing Information
Missing information means that some of the on pixels in the pattern grid are made off. How much information can be missed while the strings are still recognized correctly varies from string to string. We cannot switch off pixels from just any place; which pixel is switched off also matters. Table 6.2 shows the number of pixels that can be switched off for all the stored strings in the algorithm.

TABLE 6.2: MISSING INFORMATION: NUMBER OF PIXELS THAT CAN BE MADE OFF IN THIS ALGORITHM

D. Adding Information
There are three weight matrices: the weights connecting the inputs to the input layer, the input layer to the hidden layer, and the hidden layer to the output layer. If we add 0.0002 to all the weights, then for all 26 characters we get an output equal to the target output; if this number is increased, then for some of the characters the output is no longer the same as the target output. Table 6.3 gives a detailed description of the number of pixels that can be made on for all the strings that can be stored in the network.

TABLE 6.3: ADDING INFORMATION: NUMBER OF PIXELS THAT CAN BE MADE ON IN THIS ALGORITHM
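A minimal sketch of the remaining perturbations, zeroing interconnection weights (loss of connections), switching off on-pixels in the pattern grid (missing information) and switching on off-pixels (adding information), is shown below; the weight matrix, grid size and perturbation counts are again illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the perturbations studied above, applied to an assumed
# weight matrix and pattern grid (not the paper's actual network).
rng = np.random.default_rng(2)

weights = rng.normal(size=(15, 49))                 # e.g. 49 inputs -> 15 hidden units (assumed)
pattern = rng.choice([0, 1], size=49)               # stored pattern on a 7x7 grid (assumed)

# Loss of connections: set a fraction of the interconnecting weights to zero.
lost = rng.random(weights.shape) < 0.10
pruned_weights = np.where(lost, 0.0, weights)

# Missing information: switch off some of the pixels that are on.
on_pixels = np.flatnonzero(pattern == 1)
missing = pattern.copy()
missing[rng.choice(on_pixels, size=3, replace=False)] = 0

# Adding information: switch on some pixels that were off.
off_pixels = np.flatnonzero(pattern == 0)
added = pattern.copy()
added[rng.choice(off_pixels, size=3, replace=False)] = 1

print(int(lost.sum()), missing.sum(), added.sum())
```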
Demerits
In this Neocognitron method, the network architecture is very complex and it involves a large number of neurons.

CONCLUSION
The performance of the Neocognitron algorithm has been studied under the criteria described above. It has been observed that a given algorithm performs best under a particular criterion. The algorithms have also been compared based on the number of neurons and the number of unknowns to be computed; a detailed description is given in Table 7.1.

TABLE 7.1: PERFORMANCE OF THE NEOCOGNITRON ALGORITHM UNDER DIFFERENT CRITERIA

REFERENCES

[1] Basit Hussain and M.R. Kabuka, "A Novel Feature Recognition Neural Network and its Application to Character Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 16, No. 1, pp. 98-106, Jan. 1994.
[2] Z.B. Xu, Y. Leung and X.W. He, "Asymmetrical Bidirectional Associative Memories", IEEE Transactions on Systems, Man and Cybernetics, Vol. 24, pp. 1558-1564, Oct. 1994.
[3] R.J. Schalkoff, "Pattern Recognition: Statistical, Structural and Neural Approaches", John Wiley and Sons, 1992.
[4] K. Fukunaga, "Statistical Pattern Recognition", 2nd ed., Academic Press, 1990.
[5] V.K. Govindan and A.P. Shivaprasad, "Character Recognition: A Review", Pattern Recognition, Vol. 23, No. 7, pp. 671-683, 1990.
[6] K.S. Fu, "Syntactic Pattern Recognition", Prentice Hall, 1996.
[7] Y.K. Chen and J.F. Wang, "Segmentation of single- or multiple-touching handwritten numeral string using background and foreground analysis", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, pp. 1304-1317, 2000.
[8] Jacek M. Zurada, "Introduction to Artificial Neural Systems", Jaico Publishing House, New Delhi, India.
[9] C.L. Liu, K. Nakashima, H. Sako and H. Fujisawa, "Handwritten digit recognition: benchmarking of state-of-the-art techniques", Pattern Recognition, Vol. 36, pp. 2271-2285, 2003.
[10] R. Plamondon and S.N. Srihari, "On-line and off-line handwriting recognition: a comprehensive survey", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, pp. 62-84, 2000.
[11] C.L. Liu, K. Nakashima, H. Sako and H. Fujisawa, "Handwritten digit recognition: investigation of normalization and feature extraction techniques", Pattern Recognition, Vol. 37, pp. 265-279, 2004.
[12] L.S. Oliveira, R. Sabourin, F. Bortolozzi and C.Y. Suen, "Automatic segmentation of handwritten numerical strings: a recognition and verification strategy", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 24, No. 11, pp. 1438-1454, 2002.
[13] U. Bhattacharya, T.K. Das, A. Datta, S.K. Parui and B.B. Chaudhuri, "A hybrid scheme for handprinted numeral recognition based on a self-organizing network and MLP classifiers", Int. J. Pattern Recognition Artif. Intell., Vol. 16, pp. 845-864, 2002.
[14] C.L. Liu, H. Sako and H. Fujisawa, "Effects of classifier structures and training regimes on integrated segmentation and recognition of handwritten numeral strings", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 26, No. 11, pp. 1395-1407, 2004.
[15] K.K. Kim, J.H. Kim and C.Y. Suen, "Segmentation-based recognition of handwritten touching pairs of digits using structural features", Pattern Recognition Lett., Vol. 23, pp. 13-24, 2002.
[16] A. Jain, R. Duin and J. Mao, "Statistical pattern recognition: a review", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, No. 1, pp. 4-37, 2000.
[17] D.M.J. Tax, M. van Breukelen, R.P.W. Duin and J. Kittler, "Combining multiple classifiers by averaging or by multiplying?", Pattern Recognition, Vol. 33, pp. 1475-1485, 2000.
[18] L.S. Oliveira, R. Sabourin, F. Bortolozzi and C.Y. Suen, "Impacts of verification on a numeral string recognition system", Pattern Recognition Letters, Vol. 24, pp. 1023-1031, 2003.
[19] U. Pal and B.B. Chaudhuri, "Indian script character recognition: a survey", Pattern Recognition, Vol. 37, pp. 1887-1899, 2004.
[20] Xiangyun Ye, Mohamed Cheriet and Ching Y. Suen, "StrCombo: combination of string recognizers", Pattern Recognition Letters, Vol. 23, pp. 381-394, 2002.
[21] Simon Haykin, "Neural Networks: A Comprehensive Foundation", Macmillan College Publishing Company, New York, 1994.
[22] Javad Sadri, Ching Y. Suen and Tien D. Bui, "A genetic framework using contextual knowledge for segmentation and recognition of handwritten numeral strings", Pattern Recognition, Vol. 40, pp. 898-919, 2007.
[23] C.M. Bishop, "Neural Networks for Pattern Recognition", Oxford University Press, Oxford, 2003.