Dear community!
I have a question about layer sizes when using Keras Tuner (Hyperband). I would like to build a stacked LSTM model with, let's say, 3 layers, where each subsequent layer may have at most as many LSTM units as the layer before it. (So if the first LSTM has 400 units, the next one should have 400 units or fewer, never more, as can happen with my current code.)
Is this possible, or can such a constraint not be expressed? Or is it simply not wanted? My understanding is that a subsequent layer should be at most as large as the layer before it, and preferably a bit smaller, to 'force' the model to learn different relations.
I have tried something like this, as well as conditional_scope, but it only works for the first trial and afterwards the sizes are "random" again.
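For reference, one workaround I am aware of is to let the tuner sample each layer's units from the full range and then clamp the value to the previous layer's size with min() inside the model-building function, rather than trying to shrink max_value dynamically (hyperparameter ranges are registered per trial, which may explain the "first trial only" behaviour). A minimal sketch of the clamping idea, with the sampling abstracted behind a hypothetical sample_unit callback standing in for hp.Int:

```python
def sample_stacked_units(sample_unit, n_layers=3, max_units=512):
    """Clamp each sampled layer size to the previous layer's size so the
    stack of unit counts is non-increasing, whatever the tuner proposes.

    sample_unit(i) stands in for a call like
    hp.Int(f"units_{i}", min_value=32, max_value=max_units, step=32).
    """
    units = []
    prev = max_units  # upper bound for the first layer
    for i in range(n_layers):
        proposed = sample_unit(i)
        prev = min(proposed, prev)  # enforce units_i <= units_{i-1}
        units.append(prev)
    return units

# simulate three tuner draws of 400, 480 and 256 units
draws = [400, 480, 256]
print(sample_stacked_units(lambda i: draws[i]))  # -> [400, 400, 256]
```

The tuner still records the raw proposed values, but the model that is actually built always satisfies the constraint; the second draw of 480 above is capped at the first layer's 400.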
Thanks for your answer,
Barry