
IRC log for #dataverse, 2019-05-07

Connect via chat.dataverse.org to discuss Dataverse (dataverse.org, an open source web application for sharing, citing, analyzing, and preserving research data) with users and developers.


All times shown according to UTC.

Time S Nick Message
00:33 jri joined #dataverse
01:34 jri joined #dataverse
02:35 jri joined #dataverse
03:36 jri joined #dataverse
04:36 jri joined #dataverse
05:37 jri joined #dataverse
07:00 jri joined #dataverse
08:45 stefankasberger joined #dataverse
09:11 pvx joined #dataverse
09:19 pdurbin Yes, you need a burrito but if you're using docker-aio it should give you one.
11:39 stefankasberger joined #dataverse
12:35 pdurbin stefankasberger: not sure if you saw my last message: http://irclog.iq.harvard.edu/dataverse/2019-05-06#i_92470
12:49 donsizemore joined #dataverse
12:55 donsizemore @pdurbin so the API test suite currently depends on the docker-dcm dockerfiles...
12:56 donsizemore @pdurbin ima set DV_API_KEY, DV_APIKEY and DVAPIKEY and see how far i get
13:15 donsizemore @pdurbin also: https://access.redhat.com/products/red-hat-enterprise-linux
13:22 pdurbin rhel 8 is out for real now?
13:27 donsizemore so they say
13:29 pdurbin heh
13:30 pdurbin Should I go ahead and create the dataverse-ansible issue asking for CentOS 8 support? :)
13:31 donsizemore you're welcome to, but centos 8 won't be out for a while
13:32 pdurbin yeah, I'll wait
13:32 pdurbin please ping me when it's out :)
13:35 pdurbin donsizemore: I wouldn't say that the API test suite depends on docker-dcm. I've been running the test suite on phoenix for three years. Is that longer than Docker has existed? :)
13:36 donsizemore @pdurbin so is this create-all-and-test (instead of run_test_suite.sh)?
13:38 pdurbin no, no, you have the right script
13:39 pdurbin see "Goals and options" under "Build" at https://user-images.githubusercontent.com/21006/28791389-4797d3ee-75fa-11e7-8269-dca817459a22.png ?
13:41 donsizemore must it be called on any particular port? i was passing it localhost but can make it 8080 if need be
13:41 donsizemore (actually, i'm calling it against the siteUrl variable, which in vagrant is localhost.localdomain)
13:43 pdurbin docker-aio uses a particular port (whatever is in the script) but when I run the api test suite from old Jenkins across the wire to phoenix I use port 80.
13:44 pdurbin You can't see it in that screenshot above, which is old anyway, so let me grab the latest from old Jenkins.
13:44 donsizemore so it's just not getting its burrito: "No API key defined for built in user management"
13:45 pdurbin Here's what I have on old Jenkins under "Goals and options": test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,MoveIT,MakeDataCountApiIT -Ddataverse.test.baseurl='http://phoenix.dataverse.org'
13:45 pdurbin Currently, I have to remember to copy anything new from the run test suite script over to old Jenkins so they stay in sync.
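For reference, the "Goals and options" above map directly to a plain Maven invocation; a minimal sketch (the localhost:8080 base URL below is an assumption for a local install, not phoenix):

    # run a single REST Assured test class against a local Dataverse assumed to be on port 8080
    mvn test -Dtest=DatasetsIT -Ddataverse.test.baseurl='http://localhost:8080'
    # or pass the full comma-separated list of *IT classes shown above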
13:47 donsizemore Base URL for tests: http://localhost.localdomain
13:47 pdurbin hmm? :)
13:48 donsizemore i'm passing     DVAPIKEY: '{{ dataverse.usermgmtkey }}'     DV_APIKEY: '{{ dataverse.usermgmtkey }}'     DV_API_KEY: '{{ dataverse.usermgmtkey }}'     as environment variables
13:48 pdurbin The whole setup we have now is confusing. I'm sorry about that.
13:48 donsizemore just to be sure i got them all
13:48 pdurbin You'll need a burrito for sure.
13:48 pdurbin So that the api test suite can make users to test with.
13:50 pdurbin All (most?) of the stuff you need to successfully run the api test suite is described at http://guides.dataverse.org/en/4.13/developers/testing.html#getting-set-up-to-run-rest-assured-tests ... burrito key, the ability for anyone to create dataverses, etc.
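For context, the "burrito" being discussed is the BuiltinUsers.KEY setting. A minimal sketch of setting it and then using it, assuming a local install on port 8080 (user.json is a hypothetical file describing the user to create):

    # allow the builtin-users API to create users, with "burrito" as the key
    curl -X PUT -d burrito http://localhost:8080/api/admin/settings/BuiltinUsers.KEY
    # the API test suite creates throwaway users the same way, passing that key
    curl -d @user.json -H "Content-type: application/json" \
      "http://localhost:8080/api/builtin-users?password=changeme&key=burrito"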
13:50 donsizemore i had college roommates that didn't puke as spectacularly as maven
13:50 pdurbin "In short, Dataverse needs to be placed into an insecure mode that allows arbitrary users and datasets to be created and destroyed." :)
13:52 donsizemore aha. i thought that curl line was already fired, but it's only in sampledata, which is disabled by default
13:52 pdurbin This is all the stuff I do on phoenix to put it into an insecure mode and make it ready for running the api test suite: https://github.com/IQSS/dataverse/blob/v4.13/scripts/deploy/phoenix.dataverse.org/post
13:52 donsizemore i was assuming it was already set
13:53 pdurbin no, I drop the database on every phoenix run
13:53 pdurbin so I have to run setup-all.sh again
13:53 donsizemore i mean in ansible... but that case was off
13:53 pdurbin oh, I don't know
13:53 donsizemore i see what happened
13:53 pdurbin I am a student of dataverse-ansible. :)
13:54 donsizemore i can't even claim taalibe at this point
13:55 pdurbin stefankasberger: wait. Your big reveal of PyDataverse during the community call is in... 2 hours?!? Do I have that right?
14:20 stefankasberger @pdurbin: the call starts in 1h40, yes, but the reveal will not be that big. :)
14:21 stefankasberger @pdurbin: and, yes, i saw your post from yesterday. thanks!
14:28 donsizemore joined #dataverse
14:29 donsizemore @pdurbin woo woo! "[ERROR] Tests run: 19, Failures: 14, Errors: 0, Skipped: 0, Time elapsed: 35.746 s"
14:31 pdurbin donsizemore: sounds like progress
14:32 pdurbin stefankasberger: ok but no one has posted anything to our mailing list yet.
14:32 pdurbin Did a tweet go out at least?
14:32 pdurbin I don't see a tweet.
14:32 pdurbin stefankasberger: what should I say in an email to the google group?
14:36 stefankasberger how much time do i have for presenting it? and what's the format usually? should i do a screenshare and show around a bit? for me a discussion afterwards would be most valuable, to get feedback and also to show how others can contribute.
14:38 pdurbin The format is usually very informal. On most calls there is no agenda and we talk for maybe 20 minutes or so. We never go longer than an hour. So it's really up to you. Do you care what I write to the Google Group? I'll come up with something if you don't mind. :)
14:52 pdurbin stefankasberger: this is what I just posted. I hope it looks ok: https://groups.google.com/d/msg/dataverse-community/op753PRmoUY/0E9rTnLOAQAJ
14:54 donsizemore joined #dataverse
14:58 Richard_Valdivia joined #dataverse
14:59 stefankasberger That's great, thanks.
15:00 pdurbin phew :)
15:00 pdurbin I also posted this: https://github.com/CenterForOpenScience/osf.io/pull/8939#issuecomment-490114059
15:02 Richard_Valdivia Hello everyone!! Sorry if this is not the right channel to ask about this. I'm trying to understand how to add new metadata to our Dataverse installation (version 4.4). After reading the documentation, I still don't understand how this works. Do I have to create a new TSV file?
15:04 pdurbin Richard_Valdivia: hi! Yes, this is a good channel to ask in. Yes, you will probably need to create a TSV file. What kind of metadata do you want to add?
15:16 Richard_Valdivia Hi pdurbin!! Well, our team of librarians is still working on the metadata, and my partner and I are studying the process so we can do the deployment later (in the next few weeks).
15:17 pdurbin Cool. I'm wondering what field or discipline.
15:18 Richard_Valdivia So there is no way to do the metadata creation through the Dataverse interface, correct?
15:18 pdurbin That's correct. It's command line (API) only right now. Sorry.
15:19 pdurbin It sounds like you found our documentation. I wrote a lot of it and can probably answer questions about it. :)
15:20 Richard_Valdivia About your question, at the moment we are working on an important project for us. This is research on the bones of missing persons. We call it CAAF.
15:21 Richard_Valdivia haha... sorry pdurbin... English is not my native language. Maybe I make some mistakes... :)
15:21 pdurbin We all make mistakes. :)
15:23 Richard_Valdivia Yes, you're right. I'm reading the docs. Just to check if I understand correctly: first I have to create a text file (TSV) with all the fields, and then use "curl" to import this customization into Dataverse?
15:23 Richard_Valdivia Is this correct?
15:23 pdurbin Yes, that's correct.
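A minimal sketch of the curl step being described, assuming a local install on port 8080 and a hypothetical my-block.tsv file:

    # load (or update) a custom metadata block from a TSV file
    curl http://localhost:8080/api/admin/datasetfield/load -X POST \
      -H "Content-type: text/tab-separated-values" --upload-file my-block.tsv
    # after updating the Solr schema for the new fields, kick off a reindex
    curl http://localhost:8080/api/admin/index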
15:25 pdurbin Do you think other installations of Dataverse might be able to use your custom metadata block?
15:25 Richard_Valdivia Hmmm ... I got it. I thought I had missed something because I found the whole process a bit complex. I was imagining that I was following a very complicated path and that there should be a shortcut.
15:26 pdurbin I'm asking because sometimes a member of the Dataverse community will create a custom metadata block and make it available for other installations. For example, Harvard Dataverse has been looking at the custom metadata block for software created by Johns Hopkins.
15:26 pdurbin Yes, the process is extremely complex, unfortunately.
15:27 Richard_Valdivia Well pdurbin!! At this time I believe this is not useful for others. But this is our first project. We call it here an "opening project"...
15:28 pdurbin Ok. That's fine. Harvard Dataverse has a few custom metadata blocks that are not useful to other installations. I was just curious. :)
15:29 Richard_Valdivia I'm from Universidade Federal de São Paulo - Brazil. After this first implementation on the "bones", we will create blocks for several disciplines and research projects at UNIFESP. So I believe that in the future we will be able to contribute our metadata.
15:30 Richard_Valdivia But your curiosity is important. I am working on going to the Dataverse Meeting 2019 to exchange ideas with other partners.
15:31 Richard_Valdivia pdurbin!!! If I succeed in this implementation, can I contribute our experiences here?
15:31 pdurbin It's a great place to exchange ideas. donsizemore and stefankasberger will be there. Not sure about other people who are here in this channel.
15:32 Richard_Valdivia Contribute and ask more questions, of course... :)
15:34 pdurbin Richard_Valdivia: right now the best way to contribute is to clone the Dataverse git repo and edit doc/sphinx-guides/source/admin/metadatacustomization.rst to suggest improvements to the doc you're reading. Or you could open an issue if you want someone else to edit it.
15:35 pdurbin Richard_Valdivia: also, we have a community call in a little less than half an hour. I'll be there and you can call in and ask questions about metadata blocks if you'd like: https://dataverse.org/community-calls
15:36 pdurbin Which disciplines in UNIFESP do you want to support? Which disciplines would benefit from a custom metadata block?
16:04 pdurbin We're about to start the community call: https://dataverse.org/community-calls
16:05 pdurbin andrewSC bjonnh bricas_ donsizemore jri juancorr pmauduit Richard_Valdivia stefankasberger yoh ^^
16:13 pdurbin stefankasberger: you're doing great. pyDataverse sounds awesome! :)
16:24 jri joined #dataverse
16:54 jri joined #dataverse
17:35 donsizemore joined #dataverse
17:43 donsizemore @pdurbin knock knock, mr. institutional memory?
17:44 donsizemore @pdurbin see, you've graduated from "mr. shib"
17:52 dataverse-user joined #dataverse
17:57 stefankasberger thanks. am so looking forward to the release on PyPI. :)
17:59 Richard_Valdivia joined #dataverse
18:02 jri joined #dataverse
18:03 Richard_Valdivia Hello! pdurbin!!! Unfortunately my network went down and I could not see the rest of the previous messages. But to answer your question, the disciplines for which we will include data in the future are those related to marine sciences, bioinformatics, and neurology.
18:20 pdurbin Richard_Valdivia: those sound like interesting metadata blocks. Do you know if any of those disciplines have standards to base their metadata on? For example, social science uses a metadata standard called DDI. For astronomy we use VO. For life sciences, we use ISA-Tab.
18:22 pdurbin donsizemore: I just re-watched Indiana Jones and the Last Crusade. Indy: "Can't you remember?" Henry: "I wrote it down so I wouldn't HAVE to remember!"
18:26 Richard_Valdivia pdurbin: I do not have this answer at this time. I'm new to this project and I'm a software developer. But at the earliest opportunity I'll ask the librarians about it. I understand that this subject is interesting to the whole Dataverse community and that it would be a good contribution!
18:28 pdurbin Yes, it would be a great contribution. I'm thinking that perhaps our metadata expert could help you and your librarians create a custom metadata block for bioinformatics, neurology, or (what I would call) marine biology. Let me go ask him what he thinks.
18:28 * pdurbin walks down the hall
18:31 dataverse-user joined #dataverse
18:35 pdurbin Hmm. He isn't at his desk.
18:36 pdurbin Richard_Valdivia: my thought was that maybe he could work with you to create a custom metadata block that would be useful for multiple installations of Dataverse. Then, once you understand the process better, you could create the one for your bones. :)
18:37 pdurbin Of course, it sounds like your priority is the bones so you may not like that idea. :)
19:07 donsizemore joined #dataverse
20:09 pdurbin Richard_Valdivia: still around? I just talked to our metadata expert.
23:04 jri joined #dataverse
