MNIST without training the weights #7
Is the command

"python model.py mnist256test -e 1"

the right way to test the WANN on MNIST without training the weights? I only get an accuracy of 0.0568.

Comments
Are you sure everything is set up right? When I call this in the WANN_tool directory I get a different result. That command is for using the trained weights, though. If you are interested in untrained weights, you could use the sweep command to look at the accuracies of different single shared-weight values. Even with a single weight of 0 you should be getting about 18%.
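A minimal runnable sketch of what such a single-weight sweep amounts to, using a toy topology and a linear stand-in for the real forward pass (nothing here is the tool's actual API):

```python
import numpy as np

def eval_outputs(wMat, shared, x):
    """Toy stand-in for the real evaluator: every existing (non-NaN)
    connection is rewritten to the single shared weight, absent
    connections become 0, then one linear propagation step is applied."""
    w = np.where(np.isnan(wMat), 0.0, shared)
    return x @ w

# Toy 4-node topology: NaN marks "no connection"; the two stored values
# do not matter because they are rewritten before evaluation anyway.
wMat = np.full((4, 4), np.nan)
wMat[0, 2], wMat[1, 3] = 0.7, -1.2
x = np.ones((1, 4))

# Sweep a handful of single shared-weight values over the fixed topology.
for shared in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"shared weight {shared:+.1f} -> outputs {eval_outputs(wMat, shared, x).ravel()}")
```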
"champions/mnist.out" is only the topology, in a giant square weight matrix. every weight is set to 0. Well...almost. it looks like the initial weights were still set, though they are normally rewritten before evaluation (every non-nan weight is set to the shared weight). This is why it is so bad, being a bit worse than chance, it looks like an almost completely inactive ANN -- except for a couple pixels that are still hooked up to the output layer. |