Multi-Layer-Perceptron

GUI

User Input:

Enter number of hidden layers

Enter number of neurons in each hidden layer

Enter learning rate (eta)

Enter number of epochs (m)

Add bias or not (Checkbox)

Choose Sigmoid or Hyperbolic Tangent (tanh) as the activation function
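The two activation choices above can be sketched as follows (a minimal sketch; the function names are our own, and the derivatives are written in terms of the unit's output, which is the convenient form during back-propagation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(out):
    # derivative expressed via the output: s'(x) = s(x) * (1 - s(x))
    return out * (1.0 - out)

def tanh(x):
    return np.tanh(x)

def tanh_deriv(out):
    # derivative expressed via the output: tanh'(x) = 1 - tanh(x)^2
    return 1.0 - out ** 2
```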


Initialization:

Number of features = 5

Number of classes = 3

Weights and biases = small random numbers
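A minimal initialization sketch under these constraints. The hidden-layer sizes, the ±0.1 range for "small random numbers", and the seed are illustrative assumptions; the spec only fixes 5 input features and 3 output classes:

```python
import numpy as np

def init_network(hidden_sizes, n_features=5, n_classes=3, use_bias=True, seed=0):
    """Build one weight matrix and bias vector per layer transition."""
    rng = np.random.default_rng(seed)
    sizes = [n_features, *hidden_sizes, n_classes]
    # small random weights in [-0.1, 0.1] (assumed range)
    weights = [rng.uniform(-0.1, 0.1, size=(n_in, n_out))
               for n_in, n_out in zip(sizes[:-1], sizes[1:])]
    # biases are zeroed out when the GUI checkbox is off
    biases = [rng.uniform(-0.1, 0.1, size=n_out) if use_bias else np.zeros(n_out)
              for n_out in sizes[1:]]
    return weights, biases
```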


Classification:

Sample (single sample to be classified)
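Classifying a single sample is one forward pass through the trained layers. A sketch, assuming the predicted class is the output neuron with the largest activation (the argmax rule is our assumption, not stated in the spec):

```python
import numpy as np

def classify(sample, weights, biases, act=np.tanh):
    """Forward-propagate one sample and return the predicted class index."""
    a = np.asarray(sample, dtype=float)
    for W, b in zip(weights, biases):
        # each layer: weighted sum plus bias, then the chosen activation
        a = act(a @ W + b)
    return int(np.argmax(a))
```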


Description:

Implement the Back-Propagation learning algorithm on a multi-layer neural network that can classify a stream of input data into one of a set of predefined classes.
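One online back-propagation epoch with sigmoid units might look like the sketch below (an illustration of the standard delta rule, not the repository's actual code; `eta` matches the learning rate entered in the GUI):

```python
import numpy as np

def train_epoch(X, Y, weights, biases, eta=0.5):
    """One pass over the data, updating weights after every sample."""
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for x, y in zip(X, Y):
        # forward pass, caching every layer's activations
        acts = [np.asarray(x, dtype=float)]
        for W, b in zip(weights, biases):
            acts.append(sig(acts[-1] @ W + b))
        # output-layer delta: error times sigmoid derivative
        delta = (y - acts[-1]) * acts[-1] * (1.0 - acts[-1])
        # backward pass: compute each layer's gradient, then push delta back
        for i in range(len(weights) - 1, -1, -1):
            grad_W = np.outer(acts[i], delta)
            grad_b = delta.copy()
            if i > 0:  # propagate through the pre-update weights
                delta = (delta @ weights[i].T) * acts[i] * (1.0 - acts[i])
            weights[i] += eta * grad_W
            biases[i] += eta * grad_b
```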

Use the Penguins or Iris data in both your training and testing processes. (Each class has 50 samples: train the network with the first 30 non-repeated samples of each class, and test it with the remaining 20 samples.)
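The per-class 30/20 split can be sketched as follows (assumes the labels are available as an integer array alongside the feature matrix):

```python
import numpy as np

def per_class_split(X, y, n_train=30):
    """First 30 samples of each class go to training, the remaining 20 to test."""
    train_idx, test_idx = [], []
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        train_idx.extend(idx[:n_train])
        test_idx.extend(idx[n_train:])
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]
```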


After training:

Test the classifier with the remaining 20 samples of each selected class, build the confusion matrix, and compute the overall accuracy.
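Since built-in metrics functions are off limits, the confusion matrix and accuracy can be computed by hand (the function names below are our own):

```python
def confusion_matrix(y_true, y_pred, n_classes=3):
    """cm[i][j] counts samples of true class i predicted as class j."""
    cm = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        cm[t][p] += 1
    return cm

def overall_accuracy(cm):
    """Correct predictions (the diagonal) over all predictions."""
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total
```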


Notes:

You should not drop any row from the dataset.

Using scikit-learn metrics library or any similar built-in function for the confusion matrix is not allowed.

About

MLP implementation with LMS Algorithm
