00:38
axfelix joined #dataverse
02:03
axfelix joined #dataverse
02:20
djbrooke joined #dataverse
02:38
djbrooke joined #dataverse
04:19
djbrooke joined #dataverse
04:22
djbrooke joined #dataverse
04:26
axfelix joined #dataverse
11:34
donsizemore joined #dataverse
13:50
pameyer joined #dataverse
13:53
pameyer
pdurbin: is there a jenkinsfile around somewhere?
13:56
pdurbin
pameyer: you're making me think I should re-open https://github.com/IQSS/dataverse/issues/1936 :)
13:57
pdurbin
pameyer: would you like an account on our Jenkins server? Shouldn't be a problem.
13:57
pameyer
1936 might be too much overhead. installing jenkins isn't a problem; but I'd be guessing at the job definition and pipeline
13:58
pdurbin
I'm happy to copy and paste to you any configs you'd like.
13:59
pdurbin
Such as the configs for the three chained Jenkins jobs described at http://guides.dataverse.org/en/4.7.1/developers/testing.html#the-phoenix-server
14:00
pameyer
that'd be great - I'm not sure if the job def (aka jenkins-cli $host get-job $foo) will be something I could figure out or not
14:01
pameyer
no hurry (aka - it can wait until you don't have other stuff to do)
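A minimal sketch of one way those job definitions could be pulled down, assuming Jenkins' standard per-job config.xml endpoint; the host, credentials, and job names below are placeholders, not the actual phoenix-server jobs:

```python
# Sketch only: fetch the XML job definitions (the same XML that
# `jenkins-cli get-job` returns) over Jenkins' config.xml endpoint.
# Host, credentials, and job names are placeholders.
import base64
import urllib.request

JENKINS = "https://jenkins.example.edu"              # placeholder host
AUTH = base64.b64encode(b"user:apitoken").decode()   # placeholder credentials

for job in ("phoenix-build", "phoenix-deploy", "phoenix-api-test"):  # hypothetical job names
    req = urllib.request.Request(
        f"{JENKINS}/job/{job}/config.xml",
        headers={"Authorization": f"Basic {AUTH}"},
    )
    with urllib.request.urlopen(req) as resp, open(f"{job}.config.xml", "wb") as out:
        out.write(resp.read())  # save each job definition locally
```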
14:13
donsizemore joined #dataverse
14:14
djbrooke joined #dataverse
14:30
bsilverstein joined #dataverse
14:35
bsilverstein
pdurbin: meant to mention to you yesterday, #3919's branch has a code coverage boost of 0.7% :D
14:55
pdurbin
bsilverstein: fantastic!
14:56
pdurbin
pameyer: I'd like to get https://github.com/IQSS/dataverse/issues/3942 farther along first and I'll probably have more questions for you.
15:00
djbrooke joined #dataverse
15:04
djbrooke joined #dataverse
15:33
djbrooke joined #dataverse
15:35
bsilverstein joined #dataverse
15:51
djbrooke joined #dataverse
16:05
djbrooke joined #dataverse
16:18
jhand joined #dataverse
16:44
pdurbin
pameyer: does this "file package" screenshot look right? http://i.imgur.com/yLq87rC.png
17:41
djbrooke joined #dataverse
17:47
pameyer
pdurbin: "look right" how?
17:52
pdurbin
pameyer: heh. I'm trying to bring all the pieces together. Kicking off the old import/crawling code on "validation passed".
17:52
pdurbin
I'm just proud that I got a "package" to show up.
17:53
pameyer
gotcha - looks right for something from batch-import to me
17:53
pdurbin
cool
17:53
pdurbin
pameyer: what do you expect to see instead of "myUploadFolder1"?
17:54
pameyer
?
17:54
pdurbin
Do you see "myUploadFolder1" at http://i.imgur.com/yLq87rC.png ?
17:54
pameyer
yes
17:55
pdurbin
good :)
17:55
pdurbin
should I hard code it to that? ;)
17:55
pameyer
ok - now I've got context ;)
17:56
pameyer
batch import api parameter; but since new endpoints will be calling that...
17:56
pameyer
I'd go with the dataset identifier
17:56
pdurbin
huh. ok
17:57
pdurbin
so PXTYLE or whatever
17:58
pameyer
for default identifier scheme, yup
17:58
pdurbin
I know what you mean.
18:07
pdurbin
This code has changed a lot since I last looked at it.
18:14
pdurbin
pameyer: right now I'm sort of guessing what JSON the DCM is going to send on success and failure of checksum validation. Has this been finalized?
18:14
pameyer
still flexible
18:14
pameyer
want some more test scripts?
18:15
pdurbin
are there existing test scripts?
18:15
pameyer
not *existing* yet
18:15
pameyer
not for dcm -> DV (validation ok | validation fail)
18:16
pdurbin
you don't need to cook up something special for me right now
18:20
pdurbin
ok, files are now being uploaded here, for example: /Users/pdurbin/dataverse/files/10.5072/FK2/WWXTUC/WWXTUC
18:20
pdurbin
the "WWXTUC" seems a little weird but whatever :)
18:42
pameyer
it's an artifact of putting everything into a subdirectory
18:43
pameyer
when they're published, it'd be visible as `10.5072/FK2/WWXTUC/`
18:45
djbrooke joined #dataverse
19:08
pdurbin
well, with "doi:" at the front
19:08
pameyer
I'd meant filesystem or rsync path
19:08
pdurbin
I'm probably confused about what you mean but that's ok.
19:08
pdurbin
oh, oh
19:09
pdurbin
makes sense
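For reference, a rough sketch of how that path appears to be composed, based only on the example above; the base directory is site-specific and the splitting logic here is illustrative, not the actual Dataverse code:

```python
# Illustrative only: reconstructing the upload path pasted above.
# On disk the "doi:" prefix is dropped, and the uploadFolder becomes a
# subdirectory of the dataset directory, which is why "WWXTUC" appears twice.
import os

files_root = "/Users/pdurbin/dataverse/files"   # site-specific base directory
persistent_id = "doi:10.5072/FK2/WWXTUC"        # from the example above
upload_folder = "WWXTUC"                        # uploadFolder = dataset identifier, per the earlier discussion

dataset_dir = os.path.join(files_root, *persistent_id.removeprefix("doi:").split("/"))
rsync_target = os.path.join(dataset_dir, upload_folder)
print(rsync_target)  # /Users/pdurbin/dataverse/files/10.5072/FK2/WWXTUC/WWXTUC
```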
19:10
pdurbin
pameyer: speaking of still flexible, how does this look? jsonFromDcm: {"userId":133,"datasetId":268,"status":"validation passed","apiToken":"e95c18d8-3368-4722-bf29-3c9e17c9b8d1","uploadFolder":"REWBQA"}
19:11
pameyer
why put the API token in the JSON, instead of re-using it from the http header?
19:11
pameyer
and would it break things to add "datasetIdentifier":"REWBQA" ?
19:14
pdurbin
Are you sending the API token in the header?
19:14
pameyer
I tend to assume that's where a dataverse API token should go
19:18
pdurbin
I can change it. I'm flexible too. :)
19:21
pdurbin
pameyer: is this better? jsonFromDcm: {"userId":141,"datasetId":286,"status":"validation passed","uploadFolder":"JUITEZ","datasetIdentifier":"JUITEZ"}
19:21
djbrooke joined #dataverse
19:23
pdurbin
pameyer: oh, I'm supposed to pass "totalSize" too, right?
19:24
pameyer
yeah
19:24
pdurbin
pameyer: do you need userId? Steve was asking if he can take it out.
19:25
pameyer
nope
19:25
pameyer
I don't need it
19:34
pdurbin
pameyer: how does this look? jsonFromDcm: {"datasetId":301,"status":"validation passed","uploadFolder":"DO7BKX","datasetIdentifier":"DO7BKX","totalSize":1234567890}
19:36
pdurbin
corresponding screenshot: http://i.imgur.com/nJAXyxq.png
19:39
pameyer
looks reasonable to me
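A minimal sketch of what that DCM-to-Dataverse callback could look like with the payload agreed on above, sending the API token in an HTTP header rather than in the JSON body; the endpoint path is a guess for illustration (the real one is being settled in issue #3942):

```python
# Sketch of the DCM notifying Dataverse that checksum validation passed.
# JSON fields match the final example above; the API token rides in the
# X-Dataverse-key header instead of the JSON body. Endpoint path is hypothetical.
import json
import urllib.request

DATAVERSE = "http://localhost:8080"                  # placeholder Dataverse host
API_TOKEN = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   # placeholder token
ENDPOINT = f"{DATAVERSE}/api/datasets/301/dcm/checksumValidation"  # hypothetical path

payload = {
    "datasetId": 301,
    "status": "validation passed",
    "uploadFolder": "DO7BKX",
    "datasetIdentifier": "DO7BKX",
    "totalSize": 1234567890,
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "X-Dataverse-key": API_TOKEN},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())
```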
19:42
pdurbin
Cool. I'm still not in love with "datasetIdentifier" as the name for the directory of files and thumbnails and whatnot for a dataset. In the Java code, to get this string, you call dataset.getFileSystemDirectory().
19:42
pdurbin
to get the fully qualified path
19:46
pameyer joined #dataverse
19:47
djbrooke joined #dataverse
19:57
djbrooke joined #dataverse
20:04
djbrooke joined #dataverse
20:11
pameyer
pdurbin: Dataset.getIdentifier()
20:12
pameyer
I added javadoc, but I'm pretty sure I didn't add that function
20:16
pdurbin
yes but
20:18
pdurbin
Maybe we should have dataset.getFileSystemDirectoryAbsolutePath vs. dataset.getFileSystemDirectoryLeaf or something. The second would be the final directory in the path.
20:19
pdurbin
I just posted a comment with the JSON and screenshot from earlier to https://github.com/IQSS/dataverse/issues/3942
20:19
pdurbin
pameyer: is the DCM ready to send that JSON?
20:20
pameyer
I have to replace commented out TODO lines and check the syntax
20:20
pameyer
define "ready"
20:21
pdurbin
In my tests I'm pretending to be the DCM sending that JSON.
20:22
pdurbin
Somebody should test the DCM sending that JSON. Do you want that to be me or you or Kevin or all of us? :)
20:22
pameyer
since I'm typo-man; I vote for all of us
20:23
pdurbin
heh, ok. that's what I was telling djbrooke this morning
20:23
pdurbin
that we need to own this feature
20:24
pdurbin
I also put a bug in his ear about moving the repo to IQSS.
20:25
pdurbin
pameyer: previously, we stopped at "can you download the rsync script from the DCM via API?" We didn't care if the script only printed "hello world". For this issue we now need to execute the rsync script, right?
20:25
pameyer
not necessarily
20:25
pdurbin
huh, ok
20:25
pdurbin
This issue needs more "definition of done" then. I'm confused. :)
20:26
pdurbin
or... what did we call it... acceptance criteria?
20:27
pameyer
if there's a directory with a name matching `uploadFolder` for `datasetIdentifier` with a valid manifest when the "happy-path" API is called, that should suffice
20:27
pameyer
"sad-path" API doesn't even need that
20:27
pameyer
does that help?
20:28
pdurbin
I guess. You seem to be saying that we can manually do the job of the rsync script. We can put the files into the right place on disk and create a manifest file.
20:28
pameyer
correct
20:29
pdurbin
And then we hit the new API endpoint we added to Dataverse. With curl? With the DCM?
20:29
pameyer
current version of the DCM uses curl to make dataverse API calls
20:29
pdurbin
interesting
20:30
pdurbin
thanks
20:30
pameyer
no problem.
20:31
pameyer
I don't have as much automated testing infrastructure as you do, so I try to make the interface points somewhat obvious and easy to check in isolation
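A sketch of what manually doing the rsync script's job might look like for that happy-path check: create the uploadFolder under the dataset directory with a file and a checksum manifest, then call the endpoint. The manifest name ("files.sha") and its format are assumptions, not something pinned down in this conversation:

```python
# Stand in for the rsync script: drop one file plus a checksum manifest into
# the uploadFolder so the "validation passed" call has something real behind it.
# Base path mirrors the earlier example; manifest name/format are assumed.
import hashlib
import os

files_root = "/Users/pdurbin/dataverse/files"
upload_dir = os.path.join(files_root, "10.5072", "FK2", "DO7BKX", "DO7BKX")
os.makedirs(upload_dir, exist_ok=True)

data_path = os.path.join(upload_dir, "example.dat")
with open(data_path, "wb") as f:
    f.write(b"stand-in for a real upload\n")

with open(data_path, "rb") as f:
    digest = hashlib.sha1(f.read()).hexdigest()

with open(os.path.join(upload_dir, "files.sha"), "w") as manifest:
    manifest.write(f"{digest} example.dat\n")

# With the directory and manifest in place, call the "happy-path" endpoint
# (via curl or the POST sketch above) and check that the package shows up.
```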
20:37
djbrooke joined #dataverse
20:51
djbrooke joined #dataverse
20:54
djbrooke joined #dataverse
20:57
djbrooke_ joined #dataverse
21:34
axfelix joined #dataverse
21:47
pdurbin
makes sense
21:49
pdurbin left #dataverse