2. Perceptron
- A perceptron can be used for linear classification.
- If the instances belonging to different classes can be divided in the instance space by a hyperplane, they are called linearly separable.
- If the instances are linearly separable, we can use the perceptron learning rule to learn a classifier.
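The perceptron learning rule described above can be sketched as follows. This is a minimal illustration (the data set, learning rate, and epoch count are chosen here for the example, not taken from the slides); it learns the logical AND function, which is linearly separable.

```python
import numpy as np

def perceptron_train(X, y, epochs=100, lr=1.0):
    """Perceptron learning rule: w += lr * (target - prediction) * x.
    Labels in y are 0/1; a constant bias input is appended to each example."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # add bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            if pred != target:
                w += lr * (target - pred) * xi  # update only on mistakes
                mistakes += 1
        if mistakes == 0:  # converged: separating hyperplane found
            break
    return w

def perceptron_predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xb @ w > 0).astype(int)

# Linearly separable toy data: logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = perceptron_train(X, y)
print(perceptron_predict(w, X))  # reproduces y: [0 0 0 1]
```

Because the data are linearly separable, the perceptron convergence theorem guarantees the loop terminates with zero mistakes.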
3. Multilayer Perceptron
- We can create a network of perceptrons to approximate arbitrary target concepts.
- The multilayer perceptron (MLP) is an example of an artificial neural network.
- It consists of an input layer, one or more hidden layers, and an output layer.
- The structure of an MLP is usually found by experimentation.
- The parameters (weights) can be found using backpropagation or Monte Carlo simulations.
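A forward pass through a one-hidden-layer MLP can be sketched as below. The weights here are hand-picked for illustration (not learned) so that the network computes XOR, a target concept that a single perceptron cannot represent:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass: input -> hidden layer -> output layer."""
    h = sigmoid(W1 @ x + b1)     # hidden layer activations
    return sigmoid(W2 @ h + b2)  # output layer activation

# Hand-picked weights: the two hidden units compute OR and NAND,
# and the output unit computes their AND, i.e. XOR overall.
W1 = np.array([[20.0, 20.0], [-20.0, -20.0]])
b1 = np.array([-10.0, 30.0])
W2 = np.array([[20.0, 20.0]])
b2 = np.array([-30.0])

outputs = [round(float(mlp_forward(np.array(x, float), W1, b1, W2, b2)[0]))
           for x in ([0, 0], [0, 1], [1, 0], [1, 1])]
print(outputs)  # [0, 1, 1, 0] — the XOR truth table
```

In practice these weights would be found by backpropagation or a Monte Carlo search rather than by hand, as the slides note.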
5. Error metric
- The parameters are to be selected such that the minimum error is produced.
- Activation function used (the sigmoid): f(x) = 1/(1 + exp(-x))
- Error metric used (squared error): Error = ½(y - f(x))²
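The two formulas above can be computed directly. This small sketch (the example inputs and weights are illustrative, not from the slides) evaluates the squared error ½(y − f(w·x))² for a single training example:

```python
import math

def sigmoid(x):
    """Activation function f(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def squared_error(y, x, w):
    """Error = 1/2 * (y - f(w.x))^2 for one example with target y."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 0.5 * (y - sigmoid(s)) ** 2

# Weights that push the sigmoid toward the target give near-zero error;
# weights that push it the wrong way give error near the maximum 0.5.
print(squared_error(1.0, [1.0, 1.0], [5.0, 5.0]))    # sigmoid(10) ~ 1, error ~ 0
print(squared_error(1.0, [1.0, 1.0], [-5.0, -5.0]))  # sigmoid(-10) ~ 0, error ~ 0.5
```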
6. Using Monte Carlo to get the parameters
- Distribute some random samples in the weight-vector space.
- Choose the ones that minimize the error.
- Repeat the process until convergence.
- Finally, the points at convergence give us the values of the parameters.
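The steps above can be sketched as a random search over the weight-vector space. The details below are one reasonable reading of the slides, not a prescribed algorithm: sample candidate weight vectors around the current best, keep the one with the lowest total squared error, and narrow the sampling spread each round until convergence. The toy data set (logical AND) is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def total_error(w, X, y):
    """Sum over the data set of 1/2 * (y - f(w.x))^2."""
    preds = sigmoid(X @ w)
    return 0.5 * np.sum((y - preds) ** 2)

def monte_carlo_fit(X, y, n_samples=200, n_rounds=30, spread=4.0):
    """Distribute random samples in weight space, keep the best,
    and repeat with a shrinking spread until convergence."""
    best = np.zeros(X.shape[1])
    best_err = total_error(best, X, y)
    for _ in range(n_rounds):
        candidates = best + rng.normal(0.0, spread, (n_samples, X.shape[1]))
        errs = [total_error(w, X, y) for w in candidates]
        i = int(np.argmin(errs))
        if errs[i] < best_err:          # keep the sample that minimizes error
            best, best_err = candidates[i], errs[i]
        spread *= 0.9                   # narrow the search as we converge
    return best, best_err

# Toy data: logical AND, with a constant bias column appended
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)
y = np.array([0.0, 0.0, 0.0, 1.0])
w, err = monte_carlo_fit(X, y)
print(err)  # far below the initial error of 0.5 at w = 0
```

Unlike backpropagation, this search needs no gradients, but it scales poorly as the number of weights grows.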