02:24
nanoz joined #dataverse
02:55
jri joined #dataverse
05:39
jri joined #dataverse
06:46
isullivan joined #dataverse
08:03
poikilotherm joined #dataverse
08:18
jri joined #dataverse
11:23
donsizemore joined #dataverse
11:51
pdurbin joined #dataverse
12:42
pdurbin
poikilotherm: I referenced your "grrr" about geotoolkit at https://github.com/IQSS/dataverse/issues/5312 :)
12:47
poikilotherm
:-D
12:48
poikilotherm
Thx :-)
12:48
poikilotherm
Since the merge of 5313 I am happy again - builds aren't failing anymore
12:53
pdurbin
awesome
12:54
poikilotherm
Indeed :-D
13:12
donsizemore joined #dataverse
13:30
pdurbin
Oh good, it looks like Craig is planning on making a pull request for https://github.com/IQSS/dataverse-docker/issues/8
13:31
poikilotherm
I was wondering if the problem of test accounts not being easy to get could be helped by the Dataverse Consortium becoming a member of DataCite?
13:32
pdurbin
I really don't know. Even if DataCite test accounts are given out, I don't like how it's a manual process. People should be able to install Dataverse and play with the "publish" feature right away. This is how it was when EZID was the default.
13:36
poikilotherm
But if Dataverse were a member, maybe there would be an option to create a test system account that can be used for demo purposes and shared publicly?
13:37
poikilotherm
Being a member is a different story than being just some organization
13:40
donsizemore
@poikilotherm I asked them about that... they didn't want me to share our test credentials.
13:45
pdurbin
To be fair, Martin Fenner did say they have an API to create test DataCite accounts: https://github.com/IQSS/dataverse/issues/5024#issuecomment-423610278
13:46
pdurbin
poikilotherm: and that's also where he reinforces what donsizemore is saying. No shared credentials.
13:47
pdurbin
donsizemore: did you ever install DVN 3? Do you remember how you could publish immediately with fake DOIs or whatever? I think they all had "TEST" in them.
13:48
donsizemore
@pdurbin i only installed 3 for purposes of upgrading to 4
13:49
donsizemore
@pdurbin p.s. i hope i'm coming up for air today and can peck away at a couple of open issues (and, sadly, possibly open another in dataverse proper)
13:51
pdurbin
donsizemore: you'll have to take my word for it. :) Please open issues! Please fix issues! :)
13:55
poikilotherm
pdurbin: there is progress at payara - they finally are back to maintaining the Dockerfiles
13:56
donsizemore
@pdurbin mandy's having trouble with original-format zip files from a large dataset (at Harvard). i want to troubleshoot more but i've been doing vmware upgrades for 3 days
13:57
poikilotherm
They recently decided to switch from "domain1" to "production" domain, which has some implications. https://docs.payara.fish/documentation/payara-server/production-ready-domain.html
13:58
poikilotherm
Personally I think this is a good idea, as it makes dev use the same env as production. Do you see anything that could be a show-stopper?
14:01
pdurbin
poikilotherm: please tell me what you think of "Don't make me think about payaradomain vs. domain1 (have `asadmin start-domain` "just work")" https://github.com/payara/Payara/issues/349
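A rough sketch of the ambiguity discussed here, for readers who haven't hit it: older Payara zips shipped two domains, so a bare `asadmin start-domain` had to guess. Command names are standard asadmin; the `production` domain name comes from the Payara docs linked above, and the delete step mirrors what poikilotherm describes the container doing.

```shell
# List the domains in this Payara install; historically a stock
# distribution could contain both "domain1" and "payaradomain"/"production".
asadmin list-domains

# Naming the domain explicitly avoids the "which one starts?" guesswork
# complained about in the linked issue.
asadmin start-domain production

# The container image sidesteps the problem entirely by deleting the
# extra domain, leaving only one candidate for a bare start-domain:
asadmin delete-domain domain1
```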
14:01
poikilotherm
The automated reloading of stuff doesn't seem to be an issue IMHO, because when using mvn docker:watch, as soon as the WAR file is updated, the container is built and restarted
14:02
pdurbin
donsizemore: can she download files in smaller batches as a work around?
14:03
donsizemore
@pdurbin this was a few days ago so i imagine she's around/past it now. if i can pinpoint something actionable/correctable i was going to open an issue and/or submit a pull request
14:03
pdurbin
donsizemore: perfect. Thank you. And good luck with VMWare. It's been a while for me. Thankfully. :)
14:04
donsizemore
@pdurbin head cook and chief bottle washer =)
14:04
poikilotherm
pdurbin: Uhm this has been fixed, hasn't it? Within the container they delete domain1, so Payara is just happy to start the only domain left, which is production
14:05
poikilotherm
Should be just like you wanted it - "don't make me think"...
14:06
pdurbin
poikilotherm: all I know is that they fixed that issue I reported back in 2015. I haven't touched Payara in a while.
14:06
poikilotherm
;-)
14:06
pdurbin
I touched Payara more recently than VMWare, I guess.
14:06
poikilotherm
Alright, then let's stick with the upstream decision for now.
14:07
pdurbin
Did Gustavo or Matthew email you back?
14:07
poikilotherm
This might be a good hint for future docs, describing the deployment process on "old fashioned installations" ;-)
14:07
poikilotherm
Matthew reported on Monday that they will catch up... Nothing else since...
14:08
poikilotherm
Oh sorry - was in Tuesday
14:08
poikilotherm
-i +o
14:08
pdurbin
The future is so bright I have to wear sunglasses.
14:09
poikilotherm
But that seems fair - in total it has been less than 48h since his response
14:10
poikilotherm
And I expect them to have more important stuff to look at first... ;-)
14:11
pdurbin
everybody's busy
14:11
poikilotherm
;-)
14:13
poikilotherm
Actually I thought you guys just party all day and sometimes, when you feel lucky, you might touch the keyboard.
14:13
pdurbin
lots of emojis in slack
14:14
poikilotherm
:-D
14:15
pdurbin
donsizemore: do you know anything about the consortium website?
14:23
donsizemore
@pdurbin i can ask jon (though he's home sick today)
14:24
donsizemore
@pdurbin kasha is here and has done some design work for it but doesn't think she has admin access. what do you want to know?
14:25
pdurbin
The URL. Maybe you can email it to me. :)
14:27
donsizemore
http://dataversecommunity.global/
14:28
pdurbin
thanks!!!
14:28
pdurbin
and I hope Jon feels better
15:22
pameyer joined #dataverse
15:25
donsizemore joined #dataverse
15:34
pameyer
@donsizemore do you know offhand if 1.5GB is under the size limit for harvard dataverse?
15:41
donsizemore
@pameyer i was just looking at that (and realizing UNC Dataverse hadn't increased the limit)
15:42
pdurbin
Dataverse should probably expose limits like this via API.
15:42
pameyer
that would be nice
15:42
donsizemore
@pdurbin you'll be glad to know that harvard dataverse's admin api is restricted to localhost =)
15:42
pdurbin
phew
15:43
poikilotherm
And maybe a config option?
15:43
donsizemore
@poikilotherm it is... i forgot about it in testing for our archivist
15:43
poikilotherm
Or even per dataverse limits?
15:43
donsizemore
@poikilotherm yes yes yes
15:43
pdurbin
quotas?
15:44
pameyer
corrupted zip file has me thinking about /tmp getting full and ending up with an invalid file... but could be completely off base there
15:44
poikilotherm
We will have different communities in our Dataverse instance and there might be different concerns about "what is a reasonable size"...
15:44
pdurbin
If you want quotas please leave a comment on https://github.com/IQSS/dataverse/issues/4339
15:45
pameyer
@poikilotherm "reasonable size" sounds like it could be as interpretable as "big data" ;)
15:45
poikilotherm
Hmm IMHO limits != quotas
15:47
pdurbin
a number of limits are already configurable
15:50
poikilotherm
Alright guys: gotta run again... ;-)
15:50
poikilotherm
Cu all
15:58
pameyer
we need an ascii art world clock for dataverse chat :)
15:59
pdurbin
I put "timezone" in the "who's who" doc in the topic.
15:59
pdurbin
pameyer: and I'm hoping we can add your installation to that doc soon. :)
16:00
pameyer
zeno's paradox of going into production
16:01
pdurbin
:)
16:14
pdurbin
andrewSC donsizemore pameyer the "upgrade across versions" pull request has been made: https://github.com/IQSS/dataverse/pull/5317
16:14
andrewSC
just opened the email!
16:15
andrewSC
then the PR and saw the LOC change xD
16:15
andrewSC
lol
16:18
andrewSC
I'm hyped though man
16:18
andrewSC
we're about to do a big lift operation into AWS (finally!!) from our private cloud infra..
16:19
andrewSC
things are running now but we're doing the cutover this coming monday
16:33
dataverse-user joined #dataverse
16:33
isullivan joined #dataverse
16:45
xarthisius
pdurbin: I'm sure you've noticed but https://dataverse.harvard.edu/ looks down
16:45
pameyer
xarthisius: it looks up to me
16:46
xarthisius
is there a chance I might have been blocked for some reason?
16:46
xarthisius
I get 503
16:46
pameyer
`curl -i https://dataverse.harvard.edu/api/info/version ` ?
16:47
xarthisius
interesting
16:47
xarthisius
I was about to say curl works
16:47
xarthisius
but chrome shows me 504
16:47
xarthisius
*503
16:47
pameyer
could you try `https://dataverse.harvard.edu/hosts.txt `
16:47
pameyer
probably one of the glassfish app servers fell over, and chrome's getting one but curl gets the other
16:48
xarthisius
yeah I get 503 for that with curl too
16:49
pameyer
that's because I typo'd it - should've been `host.txt` not `hosts.txt`
16:49
pameyer
you're not blocked; and I'm relatively sure the folks that can fix it are in the loop
16:50
xarthisius
host.txt gives me dvn-cloud-app-
16:50
xarthisius
1
16:53
pameyer
xarthisius: could you give it another try?
16:53
xarthisius
works fine now!
16:54
pameyer
glad to hear
17:30
nanoz joined #dataverse
17:55
pameyer
glassfish supports creating http listeners, and deleting them - but not editing them :<
18:01
donsizemore joined #dataverse
18:15
pameyer
as frequently happens, I'm wrong - you can use the generic asadmin set commands to edit them
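For anyone hunting the same incantation: the "generic asadmin set" approach pameyer means works on GlassFish's dotted configuration names. A minimal sketch; the exact property path below is an assumption and should be verified on your version with `asadmin get` first.

```shell
# Inspect the existing listener attributes (dotted-name tree):
asadmin get "server-config.network-config.network-listeners.network-listener.*"

# Edit an existing listener's bind address in place (no need to
# delete and recreate it), e.g. restrict it to localhost:
asadmin set server-config.network-config.network-listeners.network-listener.admin-listener.address=127.0.0.1
```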
18:32
donsizemore
@pameyer i was about to ask if you wanted the listener address to become an install-time variable, but then i saw your PR
18:32
pameyer
@donsizemore does dataverse-ansible support having the web server and app server on different boxes?
18:32
donsizemore
@pameyer not currently, no. it's a one-shot used mostly for testing
18:33
pameyer
probably doesn't make a difference then
18:34
pameyer
yeah, folks should firewall off ports they don't want exposed - but my paranoia tends to lead me to want as many things as possible to need to go wrong
18:34
donsizemore
@pameyer i was thinking localhost-by-default
18:35
pameyer
@donsizemore that sounds good to me
18:36
pameyer
if there's a better PR I'm happen to nuke the one I made documenting it - just didn't want somebody else to have to hunt for the incantation ;)
18:36
pameyer
s/happen/happy/g
18:36
donsizemore
@pameyer admittedly i didn't think about it until i saw your issue and PR but i like the inversion plan.
18:37
pameyer
@donsizemore I didn't think of it until I was installing on a system where I hadn't provisioned the OS
18:37
donsizemore
@pameyer yowch!
18:38
pameyer
was dmz only, fortunately
18:46
pdurbin
xarthisius: I heard. Thanks. If you notice it again, please feel free to email support@dataverse.harvard.edu
18:47
xarthisius
I'm trying to figure out if it's possible to use /api/search to get a match for the file described by persistentId=doi:10.7910/DVN/MJKCHZ/8CKNCD
18:47
xarthisius
but I'm not sure which field to use
18:47
xarthisius
pdurbin: will do next time
18:49
* pdurbin
thinks
18:50
xarthisius
If I may throw more onto the pile: what's a good/fast way to get dataset title and doi, if you only have fileId for something that's part of that dataset
18:50
xarthisius
short of parsing citation string
18:57
pdurbin
xarthisius: this seems to work: https://dataverse.harvard.edu/api/search?q=filePersistentId:8CKNCD
18:58
xarthisius
\o/
18:59
pdurbin
xarthisius: it's messy but you could parse the citation to get the title: curl https://dataverse.harvard.edu/api/search?q=filePersistentId:8CKNCD | jq '.data.items[0].dataset_citation'
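The jq one-liner above can be mirrored in a few lines of Python. A minimal sketch: the sample payload below is fabricated for illustration; the only fields assumed real are the ones the jq filter touches (`data.items[].dataset_citation`). The quote-based title extraction is the naive approach discussed next, and it breaks if the title itself contains quotes.

```python
import json

# Fabricated Search API response; shape inferred from the jq filter
# .data.items[0].dataset_citation used in the chat.
sample = json.loads("""
{
  "status": "OK",
  "data": {
    "items": [
      {
        "name": "some-file.tab",
        "dataset_citation": "Doe, Jane, 2018, \\"Example Title\\", https://doi.org/10.7910/DVN/XXXXXX, Harvard Dataverse, V1"
      }
    ]
  }
}
""")

citation = sample["data"]["items"][0]["dataset_citation"]

# Naive title extraction: whatever sits between the first and last
# double quote. Fails when the title itself contains quotes.
title = citation[citation.index('"') + 1 : citation.rindex('"')]
print(title)  # Example Title
```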
19:00
xarthisius
that's what I currently do, is it safe to assume that whatever's between " " is the dataset title?
19:01
pdurbin
well, sometimes titles have quotes in them, unfortunately
19:02
pdurbin
xarthisius: you should feel free to make some test datasets at https://demo.dataverse.org with quotes in the title or whatever
19:02
pdurbin
xarthisius: and you should definitely feel free to open an issue at https://github.com/IQSS/dataverse/issues with something like "As a developer using the Dataverse API ..."
19:16
pameyer
pdurbin: would I be guessing right that the bounce bot is on a 5 minute cron job?
19:17
* pdurbin
checks his "monitoring" folder
19:17
pdurbin
pameyer: it looks like it sent an email at 1:01 am and 1:51 am. That's all I know.
19:19
pameyer
pdurbin: it seemed like folks started noticing at ~11:45, and that it was back ~11:50
19:19
pameyer
might've just been a coincidence
19:20
pdurbin
yeah
19:57
donsizemore joined #dataverse
21:28
jri joined #dataverse
22:15
donsizemore joined #dataverse