Can I make an index with my ttl data? #41
You can use
If there are still errors, please also provide the concrete error message.
Thanks for your response. I successfully indexed my ttl data, but I got a new error when I run
After
The error message said I have no
The
I reindexed my test file to run
My
It has not progressed since then. As you can see below, the tmp files are not removed, and
Thanks for your kind answers 😺
The progress messages (
Jumping in here because I'm trying to do something similar. How do you make qlever index use your own Qleverfile? OK, never mind, I figured it out.
@tomersagi Why not just call your Qleverfile
@givemetarte Can you provide a link to your data, so that we can check if we can reproduce the error? You can also try the following yourself, if you are sufficiently computer-savvy: When calling
@givemetarte Here is a simpler thing which you can try first. In the Qleverfile, in the value for
@Qup42 I tried to test the Olympics dataset, but I encountered the same issue. My server runs Ubuntu 20.04 with 64 GB of memory and a 2 TB hard disk, which seems to be enough for running qlever. I also tested the same indexing on my local laptop (macOS) and got the same issue when I indexed the Olympics dataset.
@hannahbast I changed
I have no idea how to debug with
You can download my data via here. This link is only temporary, so it will be gone in a week. The final Qleverfile is below:
@givemetarte I just tried it with your dataset and your Qleverfile and it works without problems. It is also very unusual that the
Is there anything that comes to your mind?
That's great. It would be nice if that information were in the README or somewhere.
@hannahbast Thanks for testing!
@givemetarte Thanks for the update. That is really strange and it would be great to find out what the problem is. Can you provide information about each of the two machines with Ubuntu: which Ubuntu version is it, how much RAM does the respective machine have, and which version of Docker is installed on the respective machine?
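As a side note, the details requested above can be gathered with standard commands; a sketch (exact tool availability varies by system, and `docker --version` only works where Docker is installed):

```shell
# Distribution name and version (Linux; falls back to sw_vers on macOS).
head -n 2 /etc/os-release 2>/dev/null || sw_vers

# Total and used RAM.
free -h 2>/dev/null || echo "free not available"

# Installed Docker version, if any.
docker --version 2>/dev/null || echo "docker not installed"
```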
@hannahbast The Ubuntu servers' specs are below:
I downgraded the Docker version, but that was not the reason. Qlever is still not working...
Hi,
Thanks for your answers. @dssib
The tmp files are not deleted, and also
I think it's indeed due to a memory error. On my side, to avoid this, I ended up merging the source .ttl files "by hand" first (I had a few hundred of them); this avoids having to merge vocabularies after the indexing. You could try this if that's also an option on your side. But perhaps the developers also have some suggestions; this one is based only on my trial and error (in the end I did succeed in getting everything indexed and running).
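For reference, merging Turtle files by plain concatenation is legal, because Turtle allows `@prefix` to be re-declared anywhere in a document, so each self-contained part file stays valid after the merge. A toy sketch with hypothetical file names:

```shell
# Toy demo: each part file is a complete Turtle document with its own prefixes.
mkdir -p ttl-merge-demo
printf '@prefix ex: <http://example.org/> .\nex:a ex:p ex:b .\n' > ttl-merge-demo/part1.ttl
printf '@prefix ex: <http://example.org/> .\nex:c ex:p ex:d .\n' > ttl-merge-demo/part2.ttl

# Merge and compress in one stream; QLever can read gzipped Turtle input.
cat ttl-merge-demo/*.ttl | gzip > ttl-merge-demo/merged.ttl.gz

# Count the merged triples (2 in this toy demo).
gunzip -c ttl-merge-demo/merged.ttl.gz | grep -c ' ex:p '
```

This keeps the whole dataset as a single input stream, which is what the "merge by hand" workaround above amounts to.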
@dssib Thanks for sharing your trial and error. Unfortunately, merging all the ttl files did not work for me. I still got the same error, so I'll try to run qlever on another machine (with much more memory). @hannahbast Could you recommend how much machine memory is needed relative to the number of triples? What is the minimum required to create an index on about 90 million triples?
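When an indexing run stalls or dies silently, one common cause is running out of memory. A hedged way to check: grep the indexing log for error symptoms (the `<NAME>.index-log.txt` file name is an assumption based on qlever-control's usual conventions), and on Linux check the kernel log for OOM kills. The snippet uses a stand-in log file so the pattern is reproducible:

```shell
# Stand-in log to illustrate the pattern; on a real machine, grep the actual
# <NAME>.index-log.txt written by `qlever index` (name is an assumption).
printf 'INFO: parsing input\nterminate called after throwing std::bad_alloc\n' > demo.index-log.txt

# Count lines with typical out-of-memory symptoms (1 in this stand-in log).
grep -icE 'error|bad_alloc|killed' demo.index-log.txt

# On Linux, the kernel log also records OOM kills (may require root):
# dmesg | grep -i 'out of memory'
```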
I changed the Qleverfile options to be like the Wikidata example.
Hi, I'm new to qlever, and I have a few questions about indexing my ttl data.
In my lab, we have built about 2 billion triples related to Korean addresses, and we want to test qlever's performance for querying.
My questions are below:
qlever get-data downloads data from an external source, but I don't need that function since all my files are local. Do I need to compress them into one ttl.gz file?
How should I write the INPUT_FILES path, and how can I write the other settings?
I would be very grateful for an answer!
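For context, here is a minimal Qleverfile sketch for purely local input, with no GET_DATA_CMD at all. The section and key names follow the example Qleverfiles shipped with qlever-control at the time of writing; the dataset name, file pattern, and resource limits are placeholders to adjust:

```ini
[data]
NAME         = korean-address
# No GET_DATA_CMD: the files are already local.

[index]
INPUT_FILES     = *.ttl.gz
CAT_INPUT_FILES = zcat ${INPUT_FILES}
SETTINGS_JSON   = { "num-triples-per-batch": 1000000 }

[server]
PORT               = 7001
MEMORY_FOR_QUERIES = 10G

[runtime]
SYSTEM = docker
IMAGE  = docker.io/adfreiburg/qlever:latest
```

Note that INPUT_FILES can be a glob over several gzipped files fed through CAT_INPUT_FILES, so compressing everything into a single ttl.gz is not strictly necessary.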