I trained the model on the SVHN dataset and got around 80% accuracy on my own dataset, so I decided to fine-tune it on my dataset, updating only the following layers:
import tensorflow as tf

# Fine-tune only the last hidden layer and the output heads.
train_layers = ['hidden10', 'digit_length', 'digit1', 'digit2', 'digit3', 'digit4']
fine_tune_var_list = [v for v in tf.trainable_variables() if v.name.split('/')[0] in train_layers]
train_op = optimizer.minimize(loss, global_step=global_step, var_list=fine_tune_var_list)
I tried learning rates from 1e-2 down to 1e-5, but the accuracy stays around 80% and the loss stays around 1-2.
How can I make it perform better?
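For reference, a minimal sketch of how the optimizer might be wired up for each run (the `GradientDescentOptimizer` and the decay settings here are illustrative assumptions, not necessarily what the training script uses; `loss`, `global_step`, and `fine_tune_var_list` come from the snippet above):

# Illustrative optimizer setup for one fine-tuning run (assumed, not the exact script).
initial_lr = 1e-3  # swept from 1e-2 down to 1e-5 across runs
learning_rate = tf.train.exponential_decay(
    initial_lr, global_step, decay_steps=10000, decay_rate=0.9, staircase=True)
optimizer = tf.train.GradientDescentOptimizer(learning_rate)
train_op = optimizer.minimize(loss, global_step=global_step,
                              var_list=fine_tune_var_list)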