00:50 djbrooke joined #dataverse 
 
        
04:10 djbrooke joined #dataverse 
 
        
04:35 djbrooke joined #dataverse 
 
        
04:41 djbrooke joined #dataverse 
 
        
04:56 djbrooke joined #dataverse 
 
        
05:09 djbrooke joined #dataverse 
 
        
05:14 djbrooke joined #dataverse 
 
        
05:40 djbrooke joined #dataverse 
 
        
05:48 djbrooke joined #dataverse 
 
        
06:03 djbrooke joined #dataverse 
 
        
06:28 djbrooke joined #dataverse 
 
        
06:37 djbrooke joined #dataverse 
 
        
06:40 djbrooke joined #dataverse 
 
        
06:55 djbrooke joined #dataverse 
 
        
07:14 djbrooke joined #dataverse 
 
        
07:15 djbrooke joined #dataverse 
 
        
07:36 djbrooke joined #dataverse 
 
        
07:52 djbrooke joined #dataverse 
 
        
08:09 djbrooke joined #dataverse 
 
        
08:42 djbrooke joined #dataverse 
 
        
08:51 djbrooke joined #dataverse 
 
        
08:59 djbrooke joined #dataverse 
 
        
09:15 djbrooke joined #dataverse 
 
        
11:11 jri joined #dataverse 
 
        
11:25 jri joined #dataverse 
 
        
11:31 jri joined #dataverse 
 
        
13:40 danmcp joined #dataverse 
 
        
13:49 danmcp joined #dataverse 
 
        
14:19 djbrooke joined #dataverse 
 
        
14:58 djbrooke joined #dataverse 
 
        
14:59 djbrooke joined #dataverse 
 
        
15:12 andrewSC joined #dataverse 
 
        
15:16 pameyer joined #dataverse 
 
        
15:17 pameyer 
it looks like djbrooke's connection is intermittent :shrug: 
 
        
15:22 andrewSC 
is there a way to export some or all unpublished datasets? 
 
        
15:22 andrewSC 
I've been looking at http://guides.dataverse.org/en/latest/admin/metadataexport.html but it seems to imply it's for published datasets only? 
 
        
15:22 pameyer 
andrewSC: might depend on what you mean by export 
 
        
15:22 andrewSC 
hmmmm 
 
        
15:22 pameyer 
nm - messages crossing, looks like 
 
        
15:23 pameyer 
I'd expect APIs to work (with a key) 
 
        
15:23 pameyer 
but not completely sure 
 
        
15:23 andrewSC 
basically I'm just trying to create a text file per dataset, where the title of the file is the title of the dataset and the body of the file is the description 
 
        
15:23 andrewSC 
yeah i was thinking of just writing something simple and using the api 
 
        
15:24 pameyer 
the native API for datasets sounds like it may get you close to what you're looking for. It'll give you JSON for a dataset; with a curl loop that could give you one file per dataset 
 
        
15:25 andrewSC 
mhmmm 
 
        
15:25 pameyer 
might need some reformatting to go from JSON to plain text, or to make the file body prettier 
 
        
15:25 andrewSC 
right right 
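
A rough sketch of the curl loop pameyer describes, assuming a hypothetical server URL, API token, and collection alias; the list-contents and fetch-dataset endpoints are the ones in the native API guide, and unpublished drafts are only visible with the `X-Dataverse-key` header:

    SERVER=https://dataverse.example.edu          # hypothetical server URL
    KEY=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx      # API token of a user who can see the drafts
    ALIAS=root                                    # collection (dataverse) to walk

    # list the datasets in the collection, then save each one's native-API JSON to its own file
    curl -s -H "X-Dataverse-key: $KEY" "$SERVER/api/dataverses/$ALIAS/contents" \
      | jq -r '.data[] | select(.type == "dataset") | .id' \
      | while read -r id; do
          curl -s -H "X-Dataverse-key: $KEY" "$SERVER/api/datasets/$id" > "dataset-$id.json"
        done

Pulling the title and description out of each JSON file afterwards (e.g. with jq) is the "reformatting to plain text" step pameyer mentions.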
 
        
15:38 pdurbin 
pameyer: if you use weechat as an IRC  client like I do, you can avoid seeing people coming and going by running `/filter add irc_smart * irc_smart_filter *`. See http://wiki.greptilian.com/weechat  
 
        
15:41 pdurbin 
andrewSC: are you all set? Are you going to write this in Python? I'm supposed to follow up with you post-April about dataverse-client-python: http://irclog.iq.harvard.edu/dataverse/2017-08-16#i_55808  :) 
 
        
15:42 andrewSC 
oh boy 
 
        
15:44 pdurbin 
oh boy? :) 
 
        
15:46 andrewSC 
does the lib need a full rewrite or just improvement? 
 
        
15:47 andrewSC 
iirc it was out of date and didn't leverage the api fully? 
 
        
15:48 pdurbin 
It was written back when Dataverse only supported SWORD, which uses XML. These days Dataverse has a "native" API that supports JSON. SWORD is still supported too. 
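
For contrast, a rough sketch of creating a dataset both ways, reusing the placeholder $SERVER, $KEY, and $ALIAS from the sketch above (atom-entry-study.xml and dataset.json are placeholder files); both calls are described in the API guides:

    # SWORD (XML): deposit an Atom entry, authenticating with the API token as the username
    curl -u "$KEY:" --data-binary @atom-entry-study.xml \
      -H "Content-Type: application/atom+xml" \
      "$SERVER/dvn/api/data-deposit/v1.1/swordv2/collection/dataverse/$ALIAS"

    # native API (JSON): post a dataset description to the target collection
    curl -H "X-Dataverse-key: $KEY" -X POST \
      --upload-file dataset.json \
      "$SERVER/api/dataverses/$ALIAS/datasets"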
 
        
15:48 pdurbin 
andrewSC: you would be welcome to start fresh with an empty git repo if you prefer. It's really up to you. 
 
        
15:50 andrewSC 
hmmmmmm 
 
        
15:50 pdurbin 
andrewSC: or I could remove you from the "Dev Efforts by the Dataverse Community" spreadsheet at https://docs.google.com/spreadsheets/d/1pl9U0_CtWQ3oz6ZllvSHeyB0EG1M_vZEC_aZ7hREnhE/edit?usp=sharing . It's really up to you. 
 
        
15:51 andrewSC 
man you know what 
 
        
15:51 andrewSC 
i could actually justify this for work 
 
        
15:52 andrewSC 
so I'm currently spinning up/developing four new services for the org to help with submissions and project deliverables, stuff like that. The current prepublication system we have, the posters, and the deliverables service will all need to integrate with dataverse 
 
        
15:53 andrewSC 
right now we have a semi janky module that uses the native api to create the datasets 
 
        
15:54 andrewSC 
that could be replaced with the actual dataverse lib 
 
        
15:54 pdurbin 
sounds great 
 
        
15:55 pdurbin 
tyrel was looking at it recently 
 
        
15:56 andrewSC 
I would need to speak with my boss to make sure they're fine with me writing OSS and the licensing around it 
 
        
15:56 andrewSC 
yeah? 
 
        
15:56 andrewSC 
is he interested in taking it on or? 
 
        
16:01 pdurbin 
I don't want to speak for tyrel but I think he just wanted to dabble a bit. Maybe fix a bug or two. He got the test suite to pass last week, which was awesome: https://github.com/IQSS/dataverse-client-python/issues/44#issuecomment-370035174  
 
        
16:03 andrewSC 
mmm gotcha gotcha 
 
        
16:04 djbrooke joined #dataverse 
 
        
16:04 andrewSC 
I mean if he wants to make some contributions I'm all for it 
 
        
16:04 andrewSC 
I take it SWORD will not be deprecated any time soon? 
 
        
16:05 pdurbin 
Nope. The S is for "simple" and we intend to keep it around. 
 
        
16:06 andrewSC 
gotcha gotcha 
 
        
16:11 andrewSC 
So here's what I'm thinking for next steps. Let me speak with my boss between sometime this afternoon and tomorrow afternoon and I'll be able to give you guys a definitive yes or no 
 
        
16:11 andrewSC 
does that work for you guys? 
 
        
16:12 andrewSC 
yes or no by EOD tomorrow 
 
        
16:12 pdurbin 
andrewSC: sounds great! And no pressure! And we're open to whatever license you want to use. 
 
        
16:13 andrewSC 
cool cool :) 
 
        
16:37 djbrooke joined #dataverse 
 
        
17:11 djbrooke joined #dataverse 
 
        
17:26 PatrickFromBU joined #dataverse 
 
        
17:27 PatrickFromBU 
pdurbin: I have a question about trying to create the dataverse image for our development docker hub repo 
 
        
17:28 pdurbin 
PatrickFromBU: hit me 
 
        
17:29 PatrickFromBU 
the docker image needs dvinstall.zip, right? How should we create that? 
 
        
17:30 PatrickFromBU 
I tried zipping up my dvinstall folder but that was missing an install binary 
 
        
17:30 pameyer 
Hi PatrickFromBU - which docker image? 
 
        
17:30 PatrickFromBU 
glassfish 
 
        
17:32 pdurbin 
PatrickFromBU: hmm, we don't seem to document how to create the installer (dvinstall.zip) at http://guides.dataverse.org/en/4.8.5/developers/making-releases.html#make-artifacts-available-for-download . We should. You should `cd scripts/installer` and run `make`. There's a Makefile in there. 
 
        
17:33 pameyer 
`conf/docker` | `conf/docker-aio` ? 
 
        
17:35 PatrickFromBU 
pdurbin: easy enough! 
 
        
17:36 PatrickFromBU 
conf/docker is what I have been looking at--what is aio? 
 
        
17:36 pameyer 
be aware that the Makefile doesn't notice updates to dataverse.war - so if you do `mvn package ; cd scripts/installer ; make` it won't rebuild without a `make clean` first 
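
Putting pdurbin's and pameyer's steps together, a rough sketch of rebuilding dvinstall.zip from a fresh war, run from the root of a Dataverse source checkout (`-DskipTests` is just to keep the sketch short):

    mvn package -DskipTests   # rebuild dataverse.war
    cd scripts/installer
    make clean                # the Makefile won't notice the new war without this
    make                      # produces dvinstall.zip for the docker image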
 
        
17:36 pameyer 
aio - docker for running integration tests 
 
        
17:38 PatrickFromBU 
ah ok. I was building from netbeans but I will keep that in mind 
 
        
17:47 pdurbin 
PatrickFromBU: are you using the Minishift registry? I never got that to work. :( 
 
        
17:48 PatrickFromBU 
pdurbin: no. we created a docker hub repo: https://hub.docker.com/u/ec528dv/  
 
        
17:54 danmcp joined #dataverse 
 
        
17:54 pameyer 
I'm pretty sure netbeans would only see the pom file (aka - just maven), so probably doesn't know anything about scripts/installer/Makefile 
 
        
17:55 pdurbin 
Ah. Ok. That's what I do too. Push to DockerHub. 
 
        
18:20 djbrooke joined #dataverse 
 
        
18:20 PatrickFromBU 
pdurbin: so using the makefile resolved my issue. but now glassfish is not finding postgres when starting. It works fine when pointing at the iqss repo but not the repo we created. I don't think there are any changes in our fork. 
 
        
18:20 pdurbin 
PatrickFromBU: did you already push your code to a branch on github? If so, can you please link to it? 
 
        
18:26 PatrickFromBU 
pdurbin: No, not yet, I will do that. I will also check that I'm not missing any changes. At this point, I'm just trying to change one line, so the only difference would be the repo address and image. Let me double-check that 
 
        
18:27 pdurbin 
PatrickFromBU: in my experience, the Glassfish container loops and loops until it finds Postgres. 
 
        
18:27 pdurbin 
keeps rebooting or whatever 
 
        
18:30 PatrickFromBU 
pdurbin: right, I know what you're talking about. I think if it doesn't eventually find postgres it will stop, and that's what's happening 
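
A rough sketch of watching that loop with stock `oc` commands, assuming the deploymentconfig is named dataverse-glassfish as in the errors quoted later:

    oc get pods                           # watch for restarts / crash loops on the glassfish pod
    oc logs -f dc/dataverse-glassfish     # follow the container output as it retries the Postgres connection
    oc describe pod <glassfish-pod-name>  # placeholder pod name; events show image pulls and probe failures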
 
        
18:32 pdurbin 
Ok. That makes sense. 
 
        
18:35 pdurbin 
There's a lot of technical debt in `conf/docker`. I'm open to doing things in a better way. 
 
        
18:38 pdurbin 
There's a reason I documented it in the dev guide: it's not for production use. I don't even mention it in the installation guide. 
 
        
18:38 pdurbin 
But it's something and something is better than nothing, I guess. :) 
 
        
18:43 PatrickFromBU 
the docs have been very helpful with everything; this is my first time working with many of these frameworks 
 
        
18:45 pdurbin 
There are a lot of moving parts and some of them will cut you. 
 
        
18:55 PatrickFromBU 
pdurbin: here is the link: https://github.com/EC528-Dataverse-Scaling/dataverse  
 
        
18:55 djbrooke joined #dataverse 
 
        
18:57 PatrickFromBU 
pdurbin: I am going to try recreating the image 
 
        
18:58 pdurbin 
PatrickFromBU: I'm using Minishift v1.5.0+ae62cf2. How about you? 
 
        
18:59 PatrickFromBU 
pdurbin: hm I'm using v1.13.0+ee338c6 
 
        
18:59 PatrickFromBU 
i thought I updated recently too 
 
        
19:00 pdurbin 
Ok, so you are using Minishift too. Not OpenShift proper. 
 
        
19:02 PatrickFromBU 
right 
 
        
19:03 pdurbin 
I should probably upgrade to the version you're using. 
 
        
19:03 PatrickFromBU 
yes I was confused, I have a newer version than you 
 
        
19:03 PatrickFromBU 
to my mind 1.5 > 1.14 
 
        
19:03 pdurbin 
Before I get on your branch: it looks like you're using "develop" in the repo you linked. 
 
        
19:04 pdurbin 
Heh. Mathematically you are correct, of course. I've been doing this too long. :) 
 
        
19:05 pdurbin 
And haven't taken a math class in ages. 
 
        
19:07 PatrickFromBU 
"develop" in the docker hub repo? I pushed it by accident but not using it. trying to use "latest" 
 
        
19:09 pdurbin 
PatrickFromBU: I see that "develop" was most recently updated at https://github.com/EC528-Dataverse-Scaling/dataverse/branches/all  
 
        
19:10 PatrickFromBU 
pdurbin, sorry, I thought you meant something else. Yes, I put the Docker Hub changes in the develop branch. Should I put them in a different one? 
 
        
19:12 pdurbin 
Well... 
 
        
19:12 pdurbin 
I don't mean to be picky! 
 
        
19:12 pdurbin 
I can work with this for sure. 
 
        
19:13 pdurbin 
So please don't worry about it. 
 
        
19:13 PatrickFromBU 
haha. be picky. now is a good time because there is nothing there 
 
        
19:13 PatrickFromBU 
I do have another branch, but I don't think it is named properly 
 
        
19:14 pdurbin 
When you have a moment, please read through http://guides.dataverse.org/en/4.8.5/developers/version-control.html#create-a-new-branch-off-the-develop-branch  (and the whole page, ideally) 
 
        
19:15 pdurbin 
We're still learning how to work with external contributors, really. We never got external contributions back when we were on SourceForge. I don't think. 
 
        
19:16 PatrickFromBU 
so I think we need an issue number to properly name our branch; you had one, 4040, but that was merged 
 
        
19:16 PatrickFromBU 
would that be our issue number? 
 
        
19:16 pdurbin 
Yeah, please just use 4040. 
 
        
19:16 pdurbin 
It's easier than thinking hard about what the scope should be. :) 
 
        
19:17 pameyer 
pdurbin mentioned a useful trick to me a little while back - when you've got a branch without an issue number that you've been working on, create a branch from that with the issue number when you're ready to make a PR 
 
        
19:17 pameyer 
I haven't tried it yet, but it seems like a really good idea 
 
        
19:17 PatrickFromBU 
ok thanks! 
 
        
19:19 pdurbin 
yeah something like `git checkout -b 4040-glassfish` then `git push origin 4040-glassfish`. Assuming you're on the develop branch. 
 
        
19:28 djbrooke joined #dataverse 
 
        
20:02 djbrooke joined #dataverse 
 
        
20:27 pdurbin 
PatrickFromBU: sorry, distracted. Haven't gotten a chance to try your DockerHub image yet. 
 
        
20:28 tyrel 
oy good afternoon. o/ pdurbin 
 
        
20:28 tyrel 
I'm mostly interested in dabbling, not taking it on full time. Your assessment was right, pdurbin, thanks. 
 
        
20:29 tyrel 
Working on a coding assessment for an interview right now, but maybe next week once I'm done moving I can help dive deeper. :) 
 
        
20:31 tyrel 
pdurbin - also I have some free time tomorrow if you're interested in lunch (doesn't have to be at the usual places, i'm flexible), would love to talk in person about dataverse stuff 
 
        
20:32 PatrickFromBU 
pdurbin: np. it might be better to just discuss tomorrow. michael on my team says he is not having this issue, plus this is something we will change 
 
        
20:34 pdurbin 
andrewSC: see comments from tyrel ^^ 
 
        
20:34 pdurbin 
PatrickFromBU: oh, glad it's working for Michael. I'm still curious. :) 
 
        
20:34 PatrickFromBU 
pdurbin: I have another simple question though. As just a simple test, I am adding a print statement to the install perl script. I then make clean and make to create a new dvinstall.zip (no need to rebuild, I think). I run docker/build.sh to push to our docker hub repo and then create a new openshift app 
 
        
20:35 PatrickFromBU 
but I don't see the printouts in the install output. What am I missing? 
 
        
20:35 pdurbin 
hmm 
 
        
20:35 pdurbin 
Is something cached? 
 
        
20:35 PatrickFromBU 
oh 
 
        
20:35 PatrickFromBU 
hm 
 
        
20:35 PatrickFromBU 
good question 
 
        
20:35 pdurbin 
Do you delete the app completely? 
 
        
20:35 PatrickFromBU 
yes, but the image could be cached 
 
        
20:36 pdurbin 
`oc delete project project1` 
 
        
20:36 pdurbin 
yeah, I'm not sure. danmcp might know 
 
        
20:36 djbrooke joined #dataverse 
 
        
20:36 pameyer 
`docker rmi $image` - but don't recall the `oc` syntax to pass it through 
 
        
20:36 PatrickFromBU 
hm actually I don't think the image is cached because I used a new tag 
 
        
20:37 PatrickFromBU 
pdurbin: I might hit up danmcp, but do those sound like the right steps? 
 
        
20:38 pdurbin 
I haven't tried yet. 
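
Roughly the rebuild-and-redeploy loop Patrick describes, written out as a sketch; the build script path, image name, tag, and project name are taken from elsewhere in this log and may differ in your checkout:

    # rebuild and push the image (the build.sh Patrick mentions, under conf/docker)
    ./conf/docker/build.sh

    # if you suspect a locally cached image, remove it first
    docker rmi ec528dv/dataverse-glassfish:latest

    # start clean on the Minishift/OpenShift side
    oc delete project project1
    oc new-project project1
    oc new-app conf/openshift/openshift.json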
 
        
20:49 pdurbin 
I'm busy writing to the guy who popped in here last Friday asking about Kubernetes. :) 
 
        
21:11 djbrooke joined #dataverse 
 
        
21:12 pdurbin 
sent. cc'd pameyer, who talked to him (thanks again) 
 
        
21:15 pameyer 
pdurbin: no problem 
 
        
21:18 pdurbin 
PatrickFromBU: whoa! I'm now on Minishift 1.14.0 and the UI looks totally different. 
 
        
21:19 PatrickFromBU 
pdurbin: the ui for the openshift web console? 
 
        
21:19 pdurbin 
yeah 
 
        
21:19 pdurbin 
not in a bad way :) 
 
        
21:20 PatrickFromBU 
right! It is the same as what I'm used to, but when I was interning at Red Hat over the summer I learned about PatternFly, which I think is what they use to make it 
 
        
21:20 PatrickFromBU 
not sure if you do much front end, but if you're used to Bootstrap, PatternFly is really cool 
 
        
21:21 pdurbin 
I just heard all about PatternFly from a member of their team at this talk the other night: https://twitter.com/monicagr/status/971424071404703744 
 
        
21:22 pdurbin 
PatrickFromBU: ok, I tweaked my config to pull from your DockerHub account and I'm running `oc new-app conf/openshift/openshift.json` 
 
        
21:23 PatrickFromBU 
pdurbin: cool. I'm trying one last thing then I'm going to email everyone. unfortunately I have to leave in a moment to pick up my wife but will be back online later 
 
        
21:24 PatrickFromBU 
pdurbin: that is in about 10-15 minutes 
 
        
21:25 pdurbin 
It says "Rolling deployment is running" 
 
        
21:25 pdurbin 
"pulling image "ec528dv/dataverse-glassfish 
 
        
21:26 pdurbin 
no worries if you need to run 
 
        
21:27 PatrickFromBU 
pdurbin: it should eventually show 1 pod, but that pod fails with "Is postgresql running?" 
 
        
21:27 PatrickFromBU 
Michael now says he has this problem :) 
 
        
21:28 PatrickFromBU 
What I'm a bit more confused about is that my simple print statements do not appear in the installer output. But I will explain this in the email 
 
        
21:30 pdurbin 
It's saying "error: update acceptor rejected dataverse-glassfish-1: pods for rc 'project1/dataverse-glassfish-1' took longer than 300 seconds to become available" 
 
        
21:31 pdurbin 
screenshot 1: https://i.imgur.com/9CLj1Co.png  , screenshot 2: https://i.imgur.com/HG30D1D.png  
 
        
21:33 pdurbin 
also "error: update acceptor rejected dataverse-postgresql-1: pods for rc 'project1/dataverse-postgresql-1' took longer than 300 seconds to become available" 
 
        
21:33 PatrickFromBU 
hm looks like it is crashing earlier 
 
        
21:35 pdurbin 
danmcp: do you think I should increase the timeout somehow? 
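
One way to do that, sketched with a placeholder value: the 300-second limit in the error is the rolling strategy's timeoutSeconds on the deploymentconfig, which `oc patch` can bump (or it can be changed in conf/openshift/openshift.json if it's set there):

    # raise the rolling-deployment timeout on the glassfish deploymentconfig
    oc patch dc dataverse-glassfish \
      -p '{"spec":{"strategy":{"rollingParams":{"timeoutSeconds":900}}}}'

    # then kick off a fresh rollout
    oc rollout latest dc/dataverse-glassfish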
 
        
21:35 PatrickFromBU 
maybe dan or solly will provide some insight; unfortunately I have to run after I send this, but thank you for your help! 
 
        
21:35 pdurbin 
Sure, see you tomorrow. 
 
        
21:43 djbrooke joined #dataverse 
 
        
22:14 djbrooke joined #dataverse 
 
        
22:34 danmcp joined #dataverse 
 
        
22:49 djbrooke joined #dataverse 
 
        
23:21 djbrooke joined #dataverse 
 
        
23:55 djbrooke joined #dataverse