
Different inference results under different batch size #85

Open
HelloWorldLTY opened this issue on Dec 10, 2024 · 2 comments
Labels: bug (Something isn't working)

Comments

@HelloWorldLTY

Hi, I noticed that the inference results from model.predict_on_dataset are slightly different under different batch sizes. Is this normal? Thanks.

[two screenshots attached showing the slightly differing predictions]
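For reference, one way to check batch-size sensitivity is to run the same inputs through the model at two batch sizes and compare the outputs numerically. The sketch below uses a plain PyTorch module as a stand-in; the model, input shapes, and tolerance are assumptions and not the reporter's actual setup.

```python
import torch

# Stand-in model; substitute the actual trained model here (assumption).
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
model.eval()

x = torch.randn(256, 128)  # one fixed set of inputs

def predict(model, x, batch_size):
    """Run inference in chunks of `batch_size` and concatenate the outputs."""
    outs = []
    with torch.no_grad():
        for i in range(0, x.shape[0], batch_size):
            outs.append(model(x[i:i + batch_size]))
    return torch.cat(outs)

pred_small = predict(model, x, batch_size=8)
pred_large = predict(model, x, batch_size=256)

# Differences at the level of float32 rounding error can occur, especially on GPU;
# anything larger suggests batch-dependent behaviour somewhere in the pipeline.
print(torch.max(torch.abs(pred_small - pred_large)))
print(torch.allclose(pred_small, pred_large, atol=1e-6))
```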

@avantikalal
Collaborator

Hi, this looks like a small difference, but it shouldn't happen. Could you give us a reproducible example?

@avantikalal added the bug label on Dec 16, 2024
@HelloWorldLTY
Author

Hi, thanks for your answers. I think I can send you a small set of datasets via email to reproduce the error, since some of the datasets, such as GTEx, are not appropriate to share publicly. I tried various datasets and this problem happened again:

[screenshot attached showing the same discrepancy on another dataset]
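As a side note, small batch-size-dependent differences on GPU often trace back to reduced-precision matmul (TF32 on Ampere GPUs) or nondeterministic kernels rather than to the model weights themselves. The switches below are standard PyTorch settings that can help rule those causes out; whether they apply to this issue is an assumption, since the root cause has not been confirmed here.

```python
import torch

# Force full float32 matmuls/convolutions instead of TF32 (relevant on Ampere+ GPUs).
torch.backends.cuda.matmul.allow_tf32 = False
torch.backends.cudnn.allow_tf32 = False

# Prefer deterministic kernels; warn_only avoids hard errors for ops
# that have no deterministic implementation.
torch.backends.cudnn.benchmark = False
torch.use_deterministic_algorithms(True, warn_only=True)
```

Even with these settings, bitwise-identical results across batch sizes are not guaranteed, because different batch shapes can select different kernels with different floating-point summation orders; differences at the level of float32 rounding error are usually benign.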
