Enter number of hidden layers
Enter number of neurons in each hidden layer
Enter learning rate (eta)
Enter number of epochs (m)
Add bias or not (Checkbox)
Choose either Sigmoid or Hyperbolic Tangent (tanh) as the activation function
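The two activation choices above can be sketched as follows; this is a minimal illustration (the function names are my own, not part of the assignment). Each derivative is written in terms of the layer's output, which is the convenient form for back-propagation.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):
    # Derivative expressed via the output y = sigmoid(x)
    return y * (1.0 - y)

def tanh_deriv(y):
    # Derivative of tanh expressed via the output y = np.tanh(x)
    return 1.0 - y ** 2
```

Expressing the derivatives in terms of the outputs avoids recomputing the pre-activation values during the backward pass.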
Number of features = 5
Number of classes = 3
Weights + bias = initialized to small random numbers
Sample (single sample to be classified)
Implement the back-propagation learning algorithm on a multi-layer neural network that can classify a stream of input data into one of a set of predefined classes.
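A minimal sketch of such a network, assuming sigmoid activations throughout and online (per-sample) gradient descent; the function names, the learning-rate default, and the weight-initialization scale are my own choices, not mandated by the assignment:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_layers(n_features, hidden_sizes, n_classes, use_bias=True, seed=0):
    # Weights (and optional biases) start as small random numbers
    rng = np.random.default_rng(seed)
    sizes = [n_features] + list(hidden_sizes) + [n_classes]
    weights = [rng.normal(0.0, 0.1, (sizes[i], sizes[i + 1]))
               for i in range(len(sizes) - 1)]
    biases = [rng.normal(0.0, 0.1, sizes[i + 1]) if use_bias
              else np.zeros(sizes[i + 1])
              for i in range(len(sizes) - 1)]
    return weights, biases

def forward(x, weights, biases):
    # Returns the activations of every layer, input included
    activations = [np.asarray(x, dtype=float)]
    for W, b in zip(weights, biases):
        activations.append(sigmoid(activations[-1] @ W + b))
    return activations

def backward(activations, target, weights, biases, eta, use_bias=True):
    # One gradient step for a single sample; delta is the error signal
    out = activations[-1]
    delta = (out - target) * out * (1.0 - out)
    for i in range(len(weights) - 1, -1, -1):
        grad_W = np.outer(activations[i], delta)
        if i > 0:
            # Propagate the error through the (pre-update) weights
            a = activations[i]
            delta_prev = (delta @ weights[i].T) * a * (1.0 - a)
        weights[i] -= eta * grad_W
        if use_bias:
            biases[i] -= eta * delta
        if i > 0:
            delta = delta_prev

def train(X, Y, hidden_sizes, eta=0.1, epochs=100, use_bias=True):
    # X: samples as rows; Y: one-hot targets as rows
    weights, biases = init_layers(X.shape[1], hidden_sizes, Y.shape[1], use_bias)
    for _ in range(epochs):
        for x, y in zip(X, Y):
            activations = forward(x, weights, biases)
            backward(activations, y, weights, biases, eta, use_bias)
    return weights, biases
```

The predicted class for a sample is the index of the largest output, e.g. `np.argmax(forward(x, weights, biases)[-1])`.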
Use the Penguins or Iris dataset for both your training and testing processes. (Each class has 50 samples: train the NN with the first 30 non-repeated samples, and test it with the remaining 20 samples.)
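The per-class split can be done as in the sketch below, assuming the labels are available as an integer array and each class contributes exactly 50 rows in dataset order (the helper name is mine):

```python
import numpy as np

def split_per_class(X, y, classes, n_train=30):
    # First n_train samples of each class go to training,
    # the remaining samples of that class go to testing.
    train_idx, test_idx = [], []
    for c in classes:
        idx = np.flatnonzero(y == c)   # indices of this class, in dataset order
        train_idx.extend(idx[:n_train])
        test_idx.extend(idx[n_train:])
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]
```

With 3 classes of 50 samples each, this yields 90 training rows and 60 test rows, and no row is dropped.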
Test the classifier with the remaining 20 samples of each selected class, build the confusion matrix, and compute the overall accuracy.
You should not drop any row from the dataset.
Using the scikit-learn metrics library or any similar built-in function for the confusion matrix is not allowed.
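Since built-in confusion-matrix functions are disallowed, the matrix and the overall accuracy can be computed by hand; a minimal sketch, assuming integer class labels in 0..n_classes-1:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    # cm[i][j] counts samples whose true class is i and predicted class is j
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t][p] += 1
    return cm

def overall_accuracy(cm):
    # Correct predictions lie on the diagonal of the confusion matrix
    return np.trace(cm) / cm.sum()
```

For the 3-class test set of 60 samples, `cm.sum()` should equal 60 and each row should sum to 20.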