
IRC log for #dataverse, 2017-10-31

Connect via chat.dataverse.org to discuss Dataverse (dataverse.org, an open source web application for sharing, citing, analyzing, and preserving research data) with users and developers.


All times shown according to UTC.

Time S Nick Message
01:42 jri joined #dataverse
07:58 jri joined #dataverse
12:01 jri joined #dataverse
12:15 donsizemore joined #dataverse
13:33 donsizemore joined #dataverse
14:42 Thalia_UM joined #dataverse
14:42 Thalia_UM Hiii!!! Good morning! :)
14:43 pdurbin Thalia_UM: have you had enough coffee? ;)
14:44 Thalia_UM Hahahaha no :S
14:45 Thalia_UM I can't drink coffee because I have gastritis
14:45 Thalia_UM :-$
14:45 pdurbin :(
14:45 pdurbin I'll drink an extra cup for you. :)
15:11 rebecabarros joined #dataverse
15:17 rebecabarros pameyer: about the DCM /hold setting, is it enough to point to '/usr/local/glassfish4/glassfish/domains/domain1/files', or do I have to put the full user@host:/fullpath? Do the user on the DCM machine and the user on the Dataverse machine have to be the same?
15:43 pameyer joined #dataverse
15:50 pameyer rebecabarros: for NFS, the UID/GID that the DCM sets the files to after moving them to `/hold` needs to match the UID/GID that glassfish is running as. I may not be understanding your question though (aka - where `user@host:/fullpath` comes in)
15:55 rebecabarros pameyer: https://pastebin.com/5vhNLZmh - is that right for main.yml?
15:57 pameyer two questions: which user is glassfish running as, and are you trying to run DCM and DV on the same system?
15:58 rebecabarros since I'm running locally I tried: ansible-playbook -i local.hostlist dcm.yml, but I got an error that may be related to my ansible config - https://pastebin.com/b6yVAb8t
15:59 pameyer that error is an ssh error from ansible
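(As an aside: if the playbook is being run against the same machine it lives on, one common way to avoid ssh entirely - an assumption here, not something confirmed in the channel - is to mark the host as a local connection in the inventory, e.g.
    localhost ansible_connection=local
in local.hostlist; the actual inventory layout for this playbook may differ.)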
16:01 rebecabarros they're running in different systems. And I'm afraid that glassfish is running as root
16:22 Thalia_UM I have a question
16:22 Thalia_UM Can I change dcterms to dc???
16:23 Thalia_UM in the metadata file
16:23 Thalia_UM ???
16:56 pameyer joined #dataverse
17:13 pameyer rebecabarros: glassfish running as root isn't a problem for this. do you have NFS setup between the DV and DCM systems?
17:13 pameyer Thalia_UM: is this the metadata file for exports, or something else?
17:13 Thalia_UM no
17:14 pdurbin Thalia_UM: are you using SWORD?
17:14 pdurbin I'm confused.
17:14 Thalia_UM yes
17:15 Thalia_UM It's to load the metadata and I want to know if I can attach email and other attributes
17:19 pdurbin Thalia_UM: have you seen http://guides.dataverse.org/en/4.8/api/sword.html#dublin-core-terms-dc-terms-qualified-mapping-dataverse-db-element-crosswalk ?
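(For reference, SWORD dataset creation takes an Atom entry whose dcterms elements are limited to the ones in that crosswalk; a rough sketch, with made-up values and only some of the elements the guide lists, would be:
    <?xml version="1.0"?>
    <entry xmlns="http://www.w3.org/2005/Atom"
           xmlns:dcterms="http://purl.org/dc/terms/">
      <dcterms:title>Example Dataset</dcterms:title>
      <dcterms:creator>Uranga, Thalia</dcterms:creator>
      <dcterms:subject>Other</dcterms:subject>
      <dcterms:description>A short description of the dataset.</dcterms:description>
    </entry>
Attributes on those elements, such as an email on dcterms:creator, are not part of the crosswalk.)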
17:20 rebecabarros joined #dataverse
17:22 Thalia_UM Can I add other labels?
17:22 Thalia_UM such as <dcterms:format> <dcterms:size>
17:23 Thalia_UM <dcterms:creator email="">
17:23 pdurbin no
17:23 rebecabarros pameyer: this is NFS, right? https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/3/html/Reference_Guide/ch-nfs.html I think I don't have it set up, because I didn't change anything regarding this; I will check.
17:24 pdurbin rebecabarros: yes, that's what NFS is
17:24 Thalia_UM So, Can I only load the ones described in the example?
17:24 pameyer it doesn't have to be NFS, but DCM and DV need to have a shared filesystem
17:25 pdurbin Thalia_UM: that's correct. If you need more metadata fields, you will need to switch from SWORD (XML) to the "native" API (JSON).
17:27 pdurbin pameyer: I was just thinking to myself what else one could use for a shared filesystem besides NFS. I guess NFS is the oldest and most popular.
17:27 pameyer SMB *should* work, although in my experience it does odd things with permissions
17:29 pdurbin yeah
17:35 pdurbin Thalia_UM: you could try this: curl -s -X POST -H "Content-type:application/json" -d @scripts/search/tests/data/dataset-finch1.json "http://localhost:8080/api/dataverses/finches/datasets/?key=$FINCHKEY"
17:35 pdurbin Thalia_UM: here's the JSON file https://github.com/IQSS/dataverse/blob/develop/scripts/search/tests/data/dataset-finch1.json
17:37 rebecabarros in the case of NFS, is the DCM my NFS server and DV my NFS client? How should I set the access options in /etc/exports?
17:40 Thalia_UM Philip, I am confused about this subject. I would like you to explain to me the point of loading the XML file through SWORD versus the JSON with the native API
17:40 pameyer rebecabarros: DCM as NFS server, DV as NFS client would work (so would the other way around, as would having them both be NFS clients of another fileserver).
17:41 pameyer for glassfish as root, you'll need `no_root_squash` in /etc/exports
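(A minimal /etc/exports entry along those lines might look like this - the exported path and client hostname are placeholders, not values from this conversation:
    /hold    dataverse.example.edu(rw,sync,no_root_squash)
followed by `exportfs -ra` on the server to re-read the exports file.)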
17:42 pdurbin Thalia_UM: can you please email support@dataverse.org about this?
17:42 Thalia_UM Yes philip
17:42 Thalia_UM thank you
17:46 Thalia_UM I just got confused by a coworker :-$
18:02 rebecabarros pameyer: should I mount the shared NFS directory in my Dataverse files directory?
18:04 pameyer rebecabarros: that should work (or create a symbolic link from your dataverse files directory)
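(As a sketch of those two options, with illustrative hostnames and paths only:
    # mount the DCM's exported /hold on the Dataverse host
    mount -t nfs dcm.example.edu:/hold /mnt/dcm-hold
    # then either point the Dataverse files directory at that mount,
    # or symlink the relevant directory into it, e.g.
    ln -s /mnt/dcm-hold/somedir /usr/local/glassfish4/glassfish/domains/domain1/files/somedir
where `somedir` stands in for whatever directory the DCM actually delivers.)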
18:05 Thalia_UM Philip
18:06 Thalia_UM is the file you get when exporting the metadata the same as the metadata example for SWORD?
18:07 pdurbin Thalia_UM: no, it is not equal, unfortunately :(
18:07 Thalia_UM In the repository that Harvard has, how do they load the metadata?
18:07 Thalia_UM Ah ok
18:08 pdurbin I think rebecabarros was asking a similar question once: can you create a dataset using the DDI XML that Dataverse exports? The answer is no. (but please feel free to open an issue about this, rebecabarros!)
18:12 jri joined #dataverse
18:21 jri joined #dataverse
18:21 donsizemore joined #dataverse
18:28 rebecabarros yes, pdurbin and Thalia_UM. the metadata JSON that you get from http://$SERVER/api/datasets/$id/versions/$versionNumber?key=$apiKey is different from the one in the API export option. For now you need to use the first JSON model to be able to update metadata using the API.
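(For completeness, that editable JSON model can be fetched with something like the following, where the server, id, version, and key are placeholders:
    curl "https://$SERVER/api/datasets/$ID/versions/$VERSION?key=$API_TOKEN"
and the result is the JSON shape to start from when updating.)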
18:30 pdurbin It would be more logical if data in matched data out. Sorry.
18:34 pameyer huh - I hadn't known there was a way other than `http://$SERVER/api/datasets/$id/versions/$versionNumber`
18:34 pdurbin pameyer: export was shipped along with harvesting
18:34 pameyer ah - gotcha
18:35 pameyer pdurbin: any ideas if what folks have been referring to as the "edit API" is the "replace all metadata items with this json" API, or if this is something that allows changing a single field?
18:36 pameyer rebecabarros's mention of updating metadata reminded me
18:37 pdurbin Edit via API is always a replace of the entire contents of the metadata. You can't simply change a single field such as the title of the dataset. You have to include all the metadata in either XML (SWORD) or JSON (native) format.
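(A hedged sketch of such a whole-document replacement with the native API - the exact endpoint and payload shape should be checked against the API guide for the version you run:
    curl -H "Content-type:application/json" -X PUT \
      "https://$SERVER/api/datasets/$ID/versions/:draft?key=$API_TOKEN" \
      --upload-file dataset-metadata.json
where dataset-metadata.json contains the complete metadata for the draft version, not just the changed field.)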
18:38 Thalia_UM Can i write that? ---> <dcterms:creator Contact="thalia@gmail.com">Thalia Uranga </dcterms:creator>
18:39 rebecabarros pameyer: all set regarding NFS. About the list of dependencies: do I need to install these packages before running the playbook, or is that included in the playbook execution? I'm still getting the ssh error from ansible. Trying to solve this.
18:40 pdurbin Thalia_UM: are you trying to fill in "Author" or "Contact"?
18:41 pameyer rebecabarros: if I'm remembering correctly, the playbook doesn't attempt to configure NFS during install
18:42 Thalia_UM Author with contact
18:43 rebecabarros pameyer: not the NFS, but the packages: lighttpd, sshd, rsync...
18:43 pdurbin Thalia_UM: are you able to fill in the email address of an author using the GUI?
18:43 Thalia_UM Complete Author
18:43 pameyer pdurbin: guess I'm not understanding the difference between "create" and "edit" - replace-all seems like create to me (but I don't need to get un-confused on this today)
18:43 Thalia_UM Complete Author
18:44 Thalia_UM Can it be completed?
18:44 Thalia_UM Can I complete the author's email on the <dcterms:creator> tag?
18:47 pdurbin Thalia_UM: I'm asking you to look at the GUI because it is not possible via the GUI. It is also not possible via API.
18:47 Thalia_UM Why?
18:48 pdurbin No one has wanted to enter email addresses for authors. If you want this, please open a GitHub issue. Most people enter the email address under "Contact".
18:48 pameyer joined #dataverse
18:49 Thalia_UM are there more tags than in the GUI? Is there another website that shows the metadata example?
18:50 pdurbin The GUI is the best place to see all of the available metadata fields.
18:50 Thalia_UM Who is the tag of Contact?
18:51 Thalia_UM dcterms:contributor           datasetContactEmail
18:51 Thalia_UM What is the tag of Contact?
18:53 pdurbin '“Contact E-mail” is automatically populated from dataset owners email.' http://guides.dataverse.org/en/4.8/api/sword.html Thalia_UM
18:58 pameyer joined #dataverse
18:59 Thalia_UM Yes, I read that part
19:01 pameyer joined #dataverse
19:05 rebecabarros pameyer: in dev.hostlist should I set the IP of DV machine?
19:07 pameyer IP or hostname should work
19:12 rebecabarros pameyer: ok. I've got an error in the task: rsync symlink
19:14 rebecabarros I have to run the playbook with sudo?
19:15 pameyer `ansible-playbook -u root` might help. is the error permission related?
19:17 rebecabarros seems like: TASK [dcm : rsync symlink] ******************************************​******************************************​******************************************​******************************************​****************************************** fatal: [172.16.9.36]: FAILED! => {"changed": false, "failed": true, "gid": 0, "group": "root", "mode": "0755", "msg": "refusing to convert between file and link for /bin/rsync", "owner": "root",
19:23 pameyer you found a point where different linux distributions are different - if rsync is at `/bin/rsync`, then that part of the playbook isn't needed (so you can comment it out)
19:32 rebecabarros what should I comment out? And where? config.yml?
19:39 rebecabarros I will have to go. Thanks for all the patience and help today, pameyer and pdurbin. I guess I'm almost there.
19:39 pameyer joined #dataverse
19:39 rebecabarros left #dataverse
20:04 axfelix joined #dataverse
20:05 pameyer joined #dataverse
20:20 pameyer roles/dcm/tasks/config.yml for the symlink job
20:25 pdurbin left #dataverse
20:30 pameyer joined #dataverse
20:33 donsizemore joined #dataverse
20:34 donsizemore @pdurbin just curious if each build still runs schemaspy per https://github.com/IQSS/dataverse/commit/184e687a33ac3ff99b8622ef858f563e61d1e393 ?
20:35 donsizemore @pdurbin i ask because https://apitest.dataverse.org/guides/developers/database/schemaspy/relationships.html is dated june 2015
20:39 donsizemore joined #dataverse
20:48 pameyer donsizemore: it looks to me like it should be running
20:49 pameyer I'm wondering about the hostname though
21:15 donsizemore joined #dataverse
21:29 pameyer joined #dataverse
22:37 axfelix joined #dataverse
23:48 pdurbin joined #dataverse
23:49 pdurbin The script is now at https://github.com/IQSS/dataverse/blob/v4.8.1/scripts/deploy/phoenix.dataverse.org/post#L15 and the schemaspy data is now here: http://phoenix.dataverse.org/schemaspy/latest/
