
WARNING:lightrag: Didn't extract any entities and relationships, maybe your LLM is not working ; NEED HELP; LLM isn't creating any entities and is writing graphs with 0 nodes and 0 edges #537

Open
shalini-agarwal opened this issue Jan 2, 2025 · 0 comments

Hello,

I have been trying to create a knowledge graph from a 200-page Word document, but I keep running into this issue:

[Screenshot (2025-01-02, 4:20 PM) of the LightRAG warning]

I have tried running the mxbai-embed-large:335m embedding model with both the gemma 2b and llama 3.2 1b models, but I get the same error each time. I am using the lightrag_ollama_demo.py example file from the repo.
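
For context, this is essentially how the models are wired up in my copy of lightrag_ollama_demo.py (a simplified sketch: the working directory and document path are placeholders, I'm assuming the Word file has been exported to plain text first, and the helper names follow the example file in the repo I'm running):

```python
import os

from lightrag import LightRAG
from lightrag.llm import ollama_model_complete, ollama_embedding
from lightrag.utils import EmbeddingFunc

WORKING_DIR = "./rag_storage"  # placeholder path
os.makedirs(WORKING_DIR, exist_ok=True)

rag = LightRAG(
    working_dir=WORKING_DIR,
    # generation model served by the local Ollama instance
    llm_model_func=ollama_model_complete,
    llm_model_name="llama3.2:1b",  # also tried "gemma:2b"
    llm_model_kwargs={
        "host": "http://localhost:11434",
        "options": {"num_ctx": 32768},
    },
    # embedding model; mxbai-embed-large produces 1024-dimensional vectors
    embedding_func=EmbeddingFunc(
        embedding_dim=1024,
        max_token_size=8192,
        func=lambda texts: ollama_embedding(
            texts, embed_model="mxbai-embed-large:335m"
        ),
    ),
)

# the 200-page document, assumed here to be exported to plain text
with open("./document.txt", "r", encoding="utf-8") as f:
    rag.insert(f.read())
```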

This is what my Ollama log shows. Does anyone know how to resolve this issue? Any help would be greatly appreciated. Thanks!

[Ollama server log screenshots]