06:51
Virgile joined #dataverse
10:34
icarito[m] joined #dataverse
12:01
donsizemore joined #dataverse
14:12
pdurbin joined #dataverse
14:34
Virgile joined #dataverse
14:35
Virgile
hey there :)
14:35
Virgile
Thanks Phil for your answer in the Google group, I'll check all that
14:37
Virgile
and I'll probably send an e-mail to support, but it's about to be the end of my working hours. ^^
14:38
Virgile
So I wish you all a nice day, afternoon or night depending on your timezone !
14:38
pdurbin
Virgile: thanks. :)
14:38
pdurbin
Tickets are nice for tracking.
14:38
donsizemore
O ye @pdurbin master of Sphinx
14:39
Virgile
indeed ;)
14:39
donsizemore
refresh my memory: you're manually building with 1.5.3?
14:39
pdurbin
And sometimes the conversation goes on and on on the mailing list.
14:40
Virgile
Yeah, I won't bother the community any further, except to share the solution or say what my mistake was ;-)
14:40
pdurbin
donsizemore: well, I think (hope) that everyone who contributes to docs does a `make html` before making a pull request. That's what I do. And I use the version that's in requirements.txt
14:40
pdurbin
Sphinx==1.5.6
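A minimal sketch of that pre-pull-request check, assuming the guides' Makefile and requirements.txt sit together in a doc/sphinx-guides directory (the path is an assumption, not something stated in the log):
```python
# Install the pinned Sphinx from requirements.txt, then build the HTML guides
# locally before opening a pull request -- the same check described above.
# DOCS_DIR is a hypothetical path; adjust it to wherever the Makefile lives.
import subprocess

DOCS_DIR = "doc/sphinx-guides"

subprocess.run(["pip", "install", "-r", "requirements.txt"], cwd=DOCS_DIR, check=True)
subprocess.run(["make", "html"], cwd=DOCS_DIR, check=True)
```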
14:40
donsizemore
:thumbs up:
14:40
pdurbin
That's the version that was on old Jenkins.
14:40
Virgile
Anyway, cya! And thanks again Don and Phil for your awesome work and precious help!
14:41
pdurbin
And rather than upgrading to 2.x and then to 3.x, I've been sticking with it. :)
14:41
pdurbin
Virgile: bye!
14:42
donsizemore
though that's higher than the 1.5.1 complained about on Stack Overflow
14:43
pdurbin
maybe the search feature easily breaks :)
14:43
pdurbin
keeps breaking
14:43
pdurbin
brittle
14:43
pdurbin
it is JavaScript after all :)
14:51
pdurbin
bricas_ donsizemore juancorr poikilotherm Virgile: community call in 10 minutes: https://dataverse.org/community-calls (plus another 5 minutes of waiting for people to join, realistically). :)
14:54
Virgile
wish I could... kids to pick up ;o I'm already late - but I'll gladly read the notes! *is gone*
14:56
pdurbin
I've been there. My kids are finally old enough to walk home from school, just when there is no school.
15:18
donsizemore
whoops, missed the call I bet
15:18
pdurbin
yeah
15:19
pdurbin
You didn't miss much. And it's all in the notes.
15:22
donsizemore
so, on the Sphinx downgrade - good thing we didn't close 7233: http://guides.dataverse.org/en/5.1/search.html?q=dataset
15:23
pdurbin
Sorry, what do you mean?
17:49
nightowl313 joined #dataverse
17:50
nightowl313
hi all! :pdurbin we at ASU are wondering if someone from your Dataverse team would be willing to work with us to identify any ASU-owned dataverses/datasets that we may need to transfer to our Dataverse installation? We are just looking to see what the storage use is and what we might need to do (whether it be harvesting or transferring somehow), if that is even an option
17:51
nightowl313
we are new to this all ... but thought it would be good to get some actual data in our dataverse since it is live!
17:53
nightowl313
ASU dataverse, that is
17:55
pdurbin
nightowl313: hi! The best thing to do is email support@dataverse.harvard.edu
18:00
nightowl313
got it .. will do ... thanks!
18:12
donsizemore
@nightowl313 our archivists keep threatening to move some datasets from Harvard to Odum. Keep me posted on how it goes!
18:16
nightowl313
okay, will do! Is that a typical thing, moving dataverses/datasets from one org to another, or is it typically all done via harvesting? And if moved, does the DOI/handle have to be destroyed and recreated, or does it just move with it? So many questions ... wondering if this would be a use case for the new BagIt upload process?
18:17
donsizemore
harvesting would grab and register the metadata from a remote dataverse, but each harvested dataset only serves as a pointer back home
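A rough sketch of what that pointer-only harvest looks like over OAI-PMH, assuming the remote installation exposes the standard /oai endpoint; the set name here is hypothetical:
```python
# Harvesting pulls only metadata records; the data files stay at the remote
# installation and each harvested record links back to it. The set name
# "asu_example" is made up for illustration.
import urllib.parse
import urllib.request

base = "https://dataverse.harvard.edu/oai"
params = {
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",  # Dublin Core metadata, not data files
    "set": "asu_example",        # hypothetical OAI set
}
with urllib.request.urlopen(base + "?" + urllib.parse.urlencode(params)) as resp:
    print(resp.read(500).decode("utf-8", "replace"))  # start of the XML response
```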
18:17
pdurbin
You'd probably want a new DOI, but you can put the old DOI under "Other ID".
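A hedged sketch of how that could be recorded in the native-API dataset JSON, assuming the "otherId" compound field of the citation metadata block; the agency and DOI value are placeholders:
```python
# Fragment of the citation block in Dataverse's native-API dataset JSON,
# recording the old DOI under "Other ID" when the dataset is re-deposited.
# The agency name and DOI suffix are example placeholders, not real values.
other_id_field = {
    "typeName": "otherId",
    "typeClass": "compound",
    "multiple": True,
    "value": [
        {
            "otherIdAgency": {
                "typeName": "otherIdAgency", "typeClass": "primitive",
                "multiple": False, "value": "Harvard Dataverse",
            },
            "otherIdValue": {
                "typeName": "otherIdValue", "typeClass": "primitive",
                "multiple": False, "value": "doi:10.7910/DVN/EXAMPLE",  # placeholder old DOI
            },
        }
    ],
}
```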
18:18
donsizemore
I've never "moved" a dataset between Dataverse instances, so I'm interested to see how it goes
18:20
pdurbin
My understanding of a DOI is that it is, in part, a declaration of which institution is standing behind the data and will make sure the data stays available for the long term.
18:22
nightowl313
that makes sense that if we move it in, we would want our own DOI on it ... and it's good that we can still refer to the old one (now I remember seeing that field) ... :donsizemore will let you know what we find ... you've helped us so much!!!
18:27
nightowl313
oh sorry, I think I'm referencing users wrong on here ... should have used <name>: or @<name>
18:30
pdurbin
nightowl313: I think it works either way :)
18:32
donsizemore
I've been called many things; an @ or a : makes no difference!
18:36
nightowl313
haha ... so I already heard back from Harvard support (they are fast!), and per the terms of use that users agree to when creating datasets, it doesn't appear that we can move them, given the preservation policies (which makes complete sense, as I do understand that is the whole purpose of preservation!) ... so I think we would just be harvesting the datasets at this point
18:36
pdurbin
Oh well. At least they'll be searchable from your installation.
18:37
nightowl313
which is okay, just wasn't sure if there was a way to transfer ... but that really does make sense, given all of the archival efforts that have already taken place on them
18:38
nightowl313
yea, and that is what's important ... and it gives us a chance to test the harvesting functionality
18:39
pdurbin
:)
18:39
pdurbin
should work
18:43
pameyer joined #dataverse
18:46
nightowl313
oops, I read that email wrong and someone was responding to a different question (I ask a lot of questions), so I'm not sure that is the case regarding moving the datasets ... although it would make sense ... will update
18:46
nightowl313
if I hear otherwise
18:46
pdurbin
👍
19:20
nightowl313 left #dataverse
20:33
pdurbin left #dataverse
22:55
nightowl313 joined #dataverse
22:58
nightowl313 left #dataverse