07:49
Virgile joined #dataverse
07:51
Virgile joined #dataverse
07:56
Virgile joined #dataverse
09:18
juancorr joined #dataverse
10:23
Virgile joined #dataverse
12:53
donsizemore joined #dataverse
15:09
Virgile joined #dataverse
15:09
Virgile
Hi there
15:09
Virgile
ansible on multiple servers issues - episode 2 ^^
15:10
Virgile
the problem i talked about on dvinstall seems to come from this :
15:10
Virgile
remote failure: Error occurred during deployment: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.7.6.payara-p1): org.eclipse.persistence.exceptions.DatabaseException
15:10
Virgile
Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: Class name is wrong or classpath is not set for : org.postgresql.ds.PGPoolingDataSource
15:10
Virgile
we're using postgresql 10
15:11
Virgile
Could it be an issue with jdbc?
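A quick way to sanity-check this particular error is to confirm that a PostgreSQL JDBC driver jar is actually sitting where Payara will load it from. A minimal sketch, assuming a stock dataverse-ansible layout; the Payara domain path is an assumption, adjust it to your install:

    # Minimal sketch: check that a PostgreSQL JDBC driver jar is visible to Payara.
    # The domain lib path below is an assumption (typical dataverse-ansible layout).
    import glob
    import os

    PAYARA_DOMAIN_LIB = "/usr/local/payara5/glassfish/domains/domain1/lib"  # assumed path

    jars = glob.glob(os.path.join(PAYARA_DOMAIN_LIB, "postgresql-*.jar"))
    if jars:
        print("Found PostgreSQL JDBC driver(s):", jars)
    else:
        print("No postgresql-*.jar under", PAYARA_DOMAIN_LIB,
              "- Payara cannot load org.postgresql.ds.PGPoolingDataSource")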
15:15
pdurbin joined #dataverse
15:16
pdurbin
Virgile: it works fine on one server, right?
15:17
Virgile
I didn't try the playbook on a single server - I assumed it was fine since it was on GitHub ^^
15:17
Virgile
And we installed everything manually when we installed DV on one server
15:18
Virgile
and it was with postgresql 9.7
15:18
pdurbin
Just so I understand... you want dataverse-ansible to support multiple servers?
15:19
Virgile
We're building a playbook that is doing this, yes.
15:19
Virgile
one payara, one postgresql, one solr/rserv
15:19
Virgile
solr/rserv and postgresql are operational
15:19
pdurbin
Oh I thought you meant multiple payara serverse.
15:20
pdurbin
servers*
15:20
Virgile
once the payara server is install, we'll do an scp of payara directort on additional servers, as don recommended it
15:20
Virgile
is installed*
15:20
pdurbin
right, makes sense
15:20
Virgile
directory*
15:21
pdurbin
The installer supports a remote database server. I've never tried this with dataverse-ansible.
15:21
Virgile
but right now the issue is when the playbook arrives at dvinstall
15:22
Virgile
the more precise log about thar is what i copy/pasted up there
15:22
Virgile
that* (wow i need to go easy on that enter key ^^)
15:23
pdurbin
Once you figure this out, do you plan to make a pull request for dataverse-ansible?
15:25
Virgile
it's planned :) I asked for some time to actively participate in the DV community
15:27
Virgile
right the idea is to start from the "one server" install, so each time it's updated we only need to modify 1 file (roles/defaults/main.yml)
15:27
Virgile
right = right now
15:27
pdurbin
Nice. If that's the case, I'd suggest starting by creating an issue in the dataverse-ansible repo about the need you have.
15:29
Virgile
I must admit I'm not comfortable with git right now. I'll take an internal course on it soon, then I'll start the pull/push thingies
15:32
pdurbin
No problem. I only suggested it because donsizemore is busy testing a pull request by poikilotherm right now and an issue might be a way to put a pin in it until we can all give it more attention.
15:32
Virgile
sure
15:33
Virgile
anyway i think i'll try switching the dvinstall to a postgresql 9 server, see if it works like that
15:33
donsizemore
@Virgile that error means you don't have a connection pool
15:34
donsizemore
@Virgile which the Dataverse installer attempts to create. so I'd look in install.out
15:35
donsizemore
@Virgile and Dataverse doesn't really care where the Postgres service lives, as long as it can contact it
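Before re-running the installer, one rough way to rule out network or credential problems is to connect to the remote PostgreSQL directly from the app server. A sketch using psycopg2; the host, database name and credentials are placeholders, not values from this conversation:

    # Rough connectivity check from the app server to the remote PostgreSQL.
    # Host, database name and credentials are placeholders - substitute the
    # values from your dataverse-ansible configuration.
    import psycopg2

    try:
        conn = psycopg2.connect(
            host="db.example.org",   # placeholder: the remote PostgreSQL server
            port=5432,
            dbname="dvndb",          # placeholder: the Dataverse database name
            user="dvnapp",           # placeholder: the Dataverse database user
            password="changeme",
            connect_timeout=5,
        )
        print("Connected, server version:", conn.server_version)
        conn.close()
    except psycopg2.OperationalError as err:
        print("Connection failed:", err)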
15:35
Virgile
i see
15:36
Virgile
thanks don, i'll chack again install.out then. :)
15:36
Virgile
check* ><
15:37
Virgile
it's slowly getting to the end of the day for me
15:38
Virgile
Take care :)
15:39
pdurbin
bye!
15:59
pdurbin
poikilotherm: did you create issues for all these? https://twitter.com/poi_ki_lo_therm/status/1331709519391502340
15:59
pdurbin
I don't see an issue for this one: - Make more secrets configurable via MicroProfile Config (DOI, S3, ...)
16:00
pdurbin
Or this one: Create containers from Maven in @dataverseorg
16:10
poikilotherm
Aye
16:10
poikilotherm
Ah, that S3 one might not have an issue yet
16:10
poikilotherm
All the others have
16:10
poikilotherm
(IIRC)
16:11
pdurbin
And you want all this stuff in Dataverse 5.3?
16:19
poikilotherm
Virgile donsizemore: the class for the PG Pool is not available. I bet that means the pg driver is not installed properly for Payara to pick up
16:21
poikilotherm
pdurbin: well if it does not get into 5.3, it will be 5.4. I will not create images for Dataverse 5.x as long as these issues aren't resolved. Pushing myself to do clean solutions. Like with refactoring Payara to support config in a directory.
16:22
pdurbin
Makes sense. All the pushing is healthy for the project.
16:25
donsizemore
@poikilotherm yes I wondered about the driver
16:25
poikilotherm
BTW I'm learning a lot about the java.nio API from the Payara hacking. Maybe this is useful for Dataverse, too.
16:26
pdurbin
Maybe. I'm vaguely aware of it. The n is for new but I don't think it's so new anymore.
16:26
pameyer joined #dataverse
16:32
nightowl313 joined #dataverse
16:55
nightowl313
hey folks ... happy monday! i hope you all don't mind me asking yet another question ... I spent a lot of time over the weekend messing around with the dataverse apis ... was able to use the native apis to get all kinds of info from various dataverses/datasets in ours and other dataverses; should i theoretically be able to export the metadata from a dataset in one dataverse and import that json file into a dataset in another dataverse? (
16:56
nightowl313
i know that I can probably use pydataverse for this, but I'm just wondering if i should be able to manually use these apis to do this (and if anyone has done this)?
16:57
pameyer
I'm pretty sure GET on the dataset API produces json that needs minor fiddling before it's usable for the create dataset API
16:57
pameyer
I should, but don't, remember for the import API
16:58
pdurbin
generally speaking, it should work
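As a rough illustration of the "minor fiddling" mentioned above: the JSON returned by a GET on the native dataset API is wrapped differently from what the create-dataset endpoint expects, so the metadataBlocks have to be pulled out and re-wrapped. The URLs, DOI, collection alias and API tokens below are all placeholders:

    # Sketch: copy a dataset's metadata from one installation to another by
    # reshaping the GET response into a create-dataset payload. All URLs,
    # tokens, the DOI and the collection alias are placeholders.
    import requests

    SRC = "https://source.example.edu"        # placeholder source installation
    DST = "https://destination.example.edu"   # placeholder destination installation
    PID = "doi:10.5072/FK2/EXAMPLE"           # placeholder persistent identifier
    SRC_TOKEN = "xxxxxxxx"                    # placeholder API tokens
    DST_TOKEN = "yyyyyyyy"

    # Fetch the dataset from the source installation.
    r = requests.get(
        f"{SRC}/api/datasets/:persistentId/",
        params={"persistentId": PID},
        headers={"X-Dataverse-key": SRC_TOKEN},
    )
    r.raise_for_status()
    latest = r.json()["data"]["latestVersion"]

    # The create endpoint wants a datasetVersion object with metadataBlocks,
    # not the wrapper that GET returns.
    payload = {"datasetVersion": {"metadataBlocks": latest["metadataBlocks"]}}

    # Create the dataset (it will be assigned a new PID) in a collection
    # on the destination installation.
    r = requests.post(
        f"{DST}/api/dataverses/mycollection/datasets",   # placeholder alias
        json=payload,
        headers={"X-Dataverse-key": DST_TOKEN},
    )
    print(r.status_code, r.json())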
16:59
nightowl313
i did try exporting the metadata from one, and then using that to copy/paste the correct values into the supplied template
16:59
nightowl313
that seemed to work pretty well ... but I could never get the api to export as dataverse_json
17:00
pdurbin
Hmm. You should be able to download dataverse_json from the GUI: Export Metadata.
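For completeness, the native API also has a metadata export endpoint that should return the same dataverse_json the GUI offers under Export Metadata; a small sketch, with the base URL and DOI as placeholders (export only works on published datasets):

    # Small sketch of the metadata export API. Base URL and DOI are placeholders;
    # the dataset has to be published for export to work.
    import requests

    BASE = "https://dataverse.example.edu"    # placeholder installation
    PID = "doi:10.5072/FK2/EXAMPLE"           # placeholder persistent identifier

    r = requests.get(
        f"{BASE}/api/datasets/export",
        params={"exporter": "dataverse_json", "persistentId": PID},
    )
    r.raise_for_status()
    with open("dataset.json", "w") as fh:
        fh.write(r.text)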
17:01
nightowl313
it would never accept the syntax .. kept giving me errors or just giving me the html of my site :)
17:01
nightowl313
likely something I did wrong
17:01
nightowl313
oh from the GUI ! I'll try that1
17:01
nightowl313
that
17:04
nightowl313
and if I try to use the api to import, keeping the DOI (still haven't gotten it to work) ... does it import the DOI even if it contains the specific 10.<whatever> number and shoulder of the original site?
17:05
pdurbin
It might import the DOI into the database (I don't remember) but you can only update metadata on the DataCite site for DOIs you actually control.
17:05
pameyer
if I'm remembering right, the import API will only import if the DOI matches what the dataverse installation is configured to use
17:06
pameyer
otherwise, it gives you an error about trying to import a PID you don't control
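For reference, the import call looks roughly like the sketch below. It needs a superuser API token, the PID's authority and shoulder must match what the destination installation is configured to mint, and the JSON body is assumed to be in the same shape as for dataset creation (a datasetVersion object with metadataBlocks):

    # Rough sketch of importing a dataset while keeping an existing PID.
    # Requires a superuser API token; the DOI authority/shoulder must match the
    # destination installation's configuration. All values are placeholders, and
    # the payload shape (datasetVersion + metadataBlocks) is an assumption.
    import json
    import requests

    DST = "https://destination.example.edu"   # placeholder installation
    SUPERUSER_TOKEN = "zzzzzzzz"              # placeholder superuser API token
    PID = "doi:10.5072/FK2/EXAMPLE"           # placeholder: must be a PID you control

    with open("dataset-for-import.json") as fh:   # prepared as in the create sketch
        payload = json.load(fh)

    r = requests.post(
        f"{DST}/api/dataverses/mycollection/datasets/:import",   # placeholder alias
        params={"pid": PID, "release": "no"},
        json=payload,
        headers={"X-Dataverse-key": SUPERUSER_TOKEN},
    )
    print(r.status_code, r.json())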
17:07
nightowl313
oh okay ... it is giving me errors that I'm using the wrong format for the DOI, so maybe that's what I'm seeing .. not sure we really want to do that, but just checking
17:08
pdurbin
nightowl313: are you planning on giving that dataset a new DOI that you control?
17:09
nightowl313
ultimately that is what we want to do, but for people who still want the old DOI to work we are kind of stuck trying to figure out the best way to do that ... we have one that we moved and it has the dataverse/datasets in both places currently
17:10
nightowl313
don't think that is the best option, so then we were thinking we would just put the old DOI in the "other ID" field and contact datacite to redirect the DOI
17:10
nightowl313
to the new DOI
17:10
nightowl313
which is probably what we will do, but just looking at options
17:11
nightowl313
if we keep the original DOI, not sure what happens with the original dataverse/dataset ... I guess we ask to have it deaccessioned
17:11
nightowl313
which then brings us back to just leaving the dataverses/dataset and harvesting them! lol
17:11
nightowl313
we originally just wanted to get some content into our dataverse, but now not sure it is worth it ... but I'm still using it as an opportunity to learn how all this works
17:12
nightowl313
sorry I keep asking questions about this ... i know we have discussed it before, but we are just still wrestling with all of the considerations
17:13
pdurbin
If it were me, I'd probably harvest it in for now.
17:13
nightowl313
agreed! =) wonder if other orgs have gone through this and arrived at the same conclusion!
17:14
pdurbin
Check out https://dataverse.ucla.edu and how it has 570 harvested datasets and 27 original datasets. That's fine. I remember when they were excited about their first original dataset.
17:14
pdurbin
And now they have 27. Great!
17:15
nightowl313
oh man! that is a lot of harvested datasets! I guess they were in the same situation!
17:15
pdurbin
They started off by hosting with Harvard Dataverse. Then they set up their own installation.
17:18
nightowl313
ah okay .. makes sense ... so many things to consider when building a dataverse! so glad there are others who have gone before... :D
17:18
nightowl313
thanks for the info ... that helps a LOT
17:19
pdurbin
Sure. I mean, it's a way of deferring the decision of what to do long term, perhaps. But at least it gives you some content. :)
17:20
nightowl313
absolutely! and then we can just encourage the researchers to add to our dataverse moving forward
17:22
pdurbin
exactly
17:28
nightowl313
=)
22:02
pdurbin left #dataverse