
IRC log for #dataverse, 2021-05-19

Connect via chat.dataverse.org to discuss Dataverse (dataverse.org, an open source web application for sharing, citing, analyzing, and preserving research data) with users and developers.


All times shown according to UTC.

Time S Nick Message
06:25 Virgile joined #dataverse
06:37 Virgile joined #dataverse
06:47 Virgile joined #dataverse
06:53 Virgile_ joined #dataverse
06:58 VJ joined #dataverse
07:14 VJ joined #dataverse
07:41 VJ joined #dataverse
09:36 JonathanNeal_ joined #dataverse
10:03 juancorr joined #dataverse
11:40 donsizemore joined #dataverse
13:02 Virgile joined #dataverse
13:24 Philipp82 joined #dataverse
13:25 Philipp82 left #dataverse
13:27 philipp14 joined #dataverse
13:35 philipp63 joined #dataverse
13:37 philipp63 Q: Why do I get assigned a new user name, e.g. ...63, each time I connect to this chat, while you guys, e.g. donsizemore, seem to have the same user name all the time??
13:37 donsizemore @philipp63 I sometimes get assigned dated usernames when I get kicked off but my session isn't terminated
13:38 donsizemore @philipp63 I do like it when the appended number makes me appear to be younger than I am.
13:38 Virgile haha ^^
13:41 philipp63 Haha :) But now you're logged on as donsizemore. Why don't you get the message that "your" username is already taken?
13:43 donsizemore my previous session closed properly
13:45 philipp63 Ok, so I have to click the Leave button?
13:51 philipp63 Anyway, I was wondering if anyone knows why the Dataverse SchemaSpy isn't working? http://phoenix.dataverse.org/schemaspy/latest/relationships.html
14:04 pdurbin joined #dataverse
14:04 pdurbin philipp63: when you join, you need to pick a more unique name :)
14:06 VJ joined #dataverse
14:11 dataverse-user joined #dataverse
14:12 philipp63 ah, now I get it, this service is not just about Dataverse, so there might be a user called philipp using this chat service for something else?
14:14 pdurbin Yeah, #dataverse is just one of thousands of channels here on the freenode IRC network.
14:14 dataverse-user left #dataverse
14:16 pdurbin philipp63: but to answer your question, yes, that old SchemaSpy link is dead but here's one that works (thanks to donsizemore!): https://guides.dataverse.org/en/5.4/schemaspy/index.html
14:17 pdurbin (a new look too, by the way)
14:31 lincoln joined #dataverse
14:55 dataverse-user joined #dataverse
14:56 dataverse-user Hi! I'm trying to upload a 1.7 GB file but Dataverse doesn't seem to process it
14:57 dataverse-user Is there a way to upload this file via ssh or something similar?
14:57 dataverse-user I'm using v4.9.4 with AWS S3
14:59 pdurbin dataverse-user: are you using S3? If so, you can enable "direct upload".
14:59 dataverse-user really? and how can I enable it? 😅😅
15:00 pdurbin dataverse.files.<id>.upload-redirect
15:00 pdurbin From the table at https://guides.dataverse.org/en/5.4.1/installation/config.html#s3-storage-options
15:00 pdurbin You can also configure "direct download".
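(For reference, a rough sketch of how those options are typically set, assuming the S3 store is named "s3" and Payara's asadmin is on the PATH; option names and values should be checked against the S3 storage table linked above for your version:)

    # enable direct upload to S3 (browser uploads go straight to the bucket)
    ./asadmin create-jvm-options "-Ddataverse.files.s3.upload-redirect=true"
    # optionally enable direct download as well
    ./asadmin create-jvm-options "-Ddataverse.files.s3.download-redirect=true"
    # restart the app server so the new JVM options take effect
    ./asadmin stop-domain && ./asadmin start-domain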
15:01 dataverse-user Will that work for a 4.9.4 dataverse version?
15:02 pdurbin 4.9.4 had direct download but not direct upload.
15:03 dataverse-user Oh... ok I'll try to enable it on a 5.3 dev version that I have
15:03 pdurbin Sounds good.
15:03 dataverse-user thank you very much!
15:04 pdurbin oh sure
15:05 dataverse-user34 joined #dataverse
15:15 Virgile joined #dataverse
15:20 dataverse-user joined #dataverse
15:28 dataverse-user Hi again, I set the dataverse.files.<id>.upload-redirect to true and dataverse.files.<id>.min-part-size to 2048mb but I don't see another option to upload data, just the "htpp"
15:29 dataverse-user Http*
15:30 pdurbin Oh. The direct upload happens behind the scenes. You should look at the network activity in your browser's dev tools window.
15:32 dataverse-user Oh, so I keep using the Http option and check for the network activity in my chrome and see if it works?
15:33 pdurbin exactly
15:33 dataverse-user Ok, I'll try that
15:33 dataverse-user brb
15:38 lincoln sorry for interrupting the conversation. As a superuser, is there a curl command to display a) a username (with their API key) and b) a Dataverse name (if I know its identifier)?
15:40 pdurbin lincoln: not with the API key but otherwise yes
15:41 pdurbin please see https://guides.dataverse.org/en/5.4.1/api/native-api.html#list-users
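(A minimal sketch of both lookups, assuming a superuser API token in $API_TOKEN and the installation URL in $SERVER_URL; note that /api/admin endpoints are normally reachable only from localhost unless unblocked, and API tokens themselves are not included in the output:)

    # a) list user accounts (superuser only)
    curl -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/admin/list-users"
    # b) show a Dataverse collection by its alias/identifier
    curl "$SERVER_URL/api/dataverses/$ALIAS"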
15:47 dataverse-user Ok so that seems to work. Dataverse saved the file, however, the file keeps showing the "ingest in progress" message
15:50 pdurbin Oh right. I think there are some caveats about ingest and unzipping. Might act differently.
15:50 pdurbin If anything, I would think ingest might be skipped entirely. I forget what behavior to expect.
15:50 pdurbin There might be some docs on it somewhere.
16:22 dataverse-user joined #dataverse
16:23 dataverse-user Ok I think that I was missing the dataverse.files.<id>.ingestsizelimit, but now the file keeps ingesting after a payara restart and the dataset is locked
16:24 dataverse-user how could I work this out? 😅
16:45 donsizemore @dataverse-user you might need to remove the job from the queue https://guides.dataverse.org/en/latest/admin/troubleshooting.html?highlight=purge#long-running-ingest-jobs-have-exhausted-system-resources
16:48 philipp63 Thanks @pdurbin and @donsizemore for the updated SchemaSpy link!
16:48 philipp63 left #dataverse
16:49 philippc joined #dataverse
16:50 dataverse-user @donsizemore I ran /usr/local/payara5/mq/bin/imqcmd -u admin purge dst -t q -n DataverseIngest however the ingest keeps on going
16:53 donsizemore my only suggestion might be to try restarting payara now that you've purged the queue?
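(Roughly, the troubleshooting page above boils down to purging the ingest queue and then restarting, assuming a default Payara 5 layout with the message broker under /usr/local/payara5/mq and asadmin on the PATH:)

    # purge the DataverseIngest queue on the embedded message broker
    /usr/local/payara5/mq/bin/imqcmd -u admin purge dst -t q -n DataverseIngest
    # then restart the app server to clear any stuck ingest threads
    ./asadmin stop-domain && ./asadmin start-domain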
17:05 dataverse-user Ok so I deleted the lock with curl -X DELETE /api/datasets/id/locks
17:06 dataverse-user but every time I try to publish the dataset it locks again because the file still shows the "ingest in progress" message
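(For anyone chasing the same lock later, a fuller sketch of the lock API calls being used here, assuming a superuser token in $API_TOKEN and the dataset's database id in $ID; the exact parameters should be checked against the native API guide:)

    # list the locks currently held on the dataset
    curl "$SERVER_URL/api/datasets/$ID/locks"
    # remove all locks (superuser only)
    curl -H "X-Dataverse-key:$API_TOKEN" -X DELETE "$SERVER_URL/api/datasets/$ID/locks"
    # or remove only the ingest lock
    curl -H "X-Dataverse-key:$API_TOKEN" -X DELETE "$SERVER_URL/api/datasets/$ID/locks?type=Ingest"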
17:42 poikilotherm Pdurbin Donsizemore have you seen this? https://blog.bofh.it/debian/id_461
17:42 dataverse-user Hi, I've even set my like this <jvm-options>-Ddataverse.files.myid.ingestsizelimit=500000000</jvm-options> but dataverse keeps ingesting the file... my file size is 1.7 GB 😥
17:44 dataverse-user set my jvm-options like this***
17:49 donsizemore @poikilotherm I saw there was a kerfuffle
17:50 poikilotherm Oh wait a nice word I've never seen before. "kerfuffle"
17:51 poikilotherm I looked up the translation, but what's the subtext of it? In what context is it to be used?
17:56 philippc left #dataverse
18:21 dataverse-user OK! so I've fixed the issue with this curl -X PUT -d 200000000 http://localhost:8080/api/admin/settings/:TabularIngestSizeLimit
18:21 dataverse-user thank you all for your recommendations
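(:TabularIngestSizeLimit is a database setting in bytes; files larger than the limit are stored as uploaded and tabular ingest is skipped. The PUT above sets it to roughly 200 MB; the related calls, run on the server itself since /api/admin is normally localhost-only, look like this:)

    # set the tabular ingest size limit to ~200 MB
    curl -X PUT -d 200000000 http://localhost:8080/api/admin/settings/:TabularIngestSizeLimit
    # remove the limit again if needed
    curl -X DELETE http://localhost:8080/api/admin/settings/:TabularIngestSizeLimit
    # list all database settings to confirm
    curl http://localhost:8080/api/admin/settings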
18:30 donsizemore @poikilotherm today it was used in the context you posted above =)
19:35 Virgile joined #dataverse
21:07 pdurbin poikilotherm: yeah, I saw it. :(
21:08 pdurbin left #dataverse
22:22 Virgile joined #dataverse
