Possible memory leak in SPARQL endpoint? #123
Hi @cKlee,

A more sophisticated logic is needed here in order to take advantage of the underlying inverted index and, most importantly, to avoid what you're seeing. Specifically, even small queries containing joins or count() with simple graph patterns completely scan the entire index; I guess that's the underlying reason for your out-of-memory issue. The bad thing is that I don't have a precise idea of when this will be done, as it is complicated and is taking me a lot of time.

AG
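For example, a query of the following shape (an illustrative sketch, not taken from the original report) combines a join between two triple patterns with a count() aggregate, which is exactly the kind of query described above as forcing a scan of the whole index:

```sparql
# Hypothetical example: the join on ?s across the two patterns and the
# COUNT aggregate both require visiting every matching entry, so the
# current engine ends up scanning the entire index regardless of LIMIT.
SELECT ?s (COUNT(?o) AS ?total)
WHERE {
  ?s a ?type .
  ?s ?p ?o .
}
GROUP BY ?s
```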
Hi,
I'm using the solrdf-1.0 branch with JRE 1.7.0 and have successfully loaded approx. 125,000,000 documents into the store (and optimized the index). Solr queries work fine, but with simple SPARQL queries I always get a "Java heap space" OutOfMemoryError (even with LIMIT 1).
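To illustrate the scale of the problem: even a minimal query of this shape (a representative example, not necessarily the reporter's exact query) triggers the exception:

```sparql
# Hypothetical minimal query; even asking for a single solution
# runs the endpoint out of heap on a ~125M-document store.
SELECT * WHERE { ?s ?p ?o } LIMIT 1
```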
I'm running a 4-CPU, 64-bit Debian machine with 16 GB of RAM. I'm not very familiar with Maven, but I tried to give SolRDF a bit more RAM with
... but I got the same exception.
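The actual command is elided above, but a common way to raise the heap for a Maven-launched process is the MAVEN_OPTS environment variable. The following is a hedged sketch: the heap sizes are assumptions picked for a 16 GB machine, and the Maven goal shown in the comment is an assumption, not the reporter's actual command:

```shell
# Assumption: -Xms/-Xmx values chosen for a 16 GB machine; adjust as needed.
export MAVEN_OPTS="-Xms2g -Xmx8g"
echo "MAVEN_OPTS=$MAVEN_OPTS"
# Then start SolRDF as usual from the project checkout, e.g.
# (the exact goal is an assumption):
# mvn clean install
```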