07:06
jri joined #dataverse
07:13
jri joined #dataverse
11:00
pdurbin joined #dataverse
11:57
jri joined #dataverse
12:54
donsizemore joined #dataverse
13:12
pameyer joined #dataverse
14:01
pdurbin
pameyer: good morning. I'm looking at Michael's branch.
14:02
pdurbin
On a fresh clone: COPY failed: stat /var/lib/docker/tmp/docker-builder586409924/dv/install: no such file or directory
14:04
pameyer
pdurbin: I'll try a fresh clone; but I didn't see that last night when I ran ITs on 18a42cb
14:04
pameyer
staging migration is top-of-mind currently though
14:05
pdurbin
I'm sure I can work around it. I just thought I'd mention it.
14:08
pdurbin
cp: directory ../../conf/docker-aio/dv/install does not exist
14:09
pdurbin
I guess I'll throw a `mkdir -p` in there unless you object to that.
14:09
pameyer
no particular objections
14:09
pameyer
surprised I missed it though
14:10
pameyer
... ok, more slightly annoyed with myself than surprised
14:10
pdurbin
heh
14:10
pdurbin
that script could use more error checking
14:10
pdurbin
checking $?
14:10
pdurbin
but it seems to get the job done
14:12
pameyer
more than 0 would be more ;)
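The missing-directory errors and the `$?` discussion above can be sketched as a small defensive shell pattern. This is a hypothetical illustration, not the actual docker-aio script; all paths below are made-up demo placeholders:

```shell
#!/bin/sh
# Hypothetical sketch of the error checking discussed above; the paths
# are demo placeholders, not the real conf/docker-aio layout.
set -eu                        # stop on the first failing command

SRC="/tmp/dv-demo-src/dv/install"
DEST="/tmp/dv-demo-dest/dv/install"

mkdir -p "$SRC"                # demo only: fabricate a source tree
mkdir -p "$(dirname "$DEST")"  # guards against "directory does not exist"

if cp -r "$SRC" "$DEST"; then
    echo "copy ok"
else
    echo "copy failed with status $?" >&2
    exit 1
fi
```

With `set -eu` (or explicit checks of `$?` after each `cp`), a missing directory or jar fails the script loudly at the point of the problem, instead of surfacing later as a Docker `COPY failed` error.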
14:18
pdurbin
cp: cannot stat 'pgdriver/postgresql-8.4-703.jdbc4.jar': No such file or directory
14:18
pdurbin
I'll fix that too while I'm in here.
14:27
pdurbin
pameyer: I just noticed that this branch has merge conflicts.
14:28
pdurbin
And one test is failing on my laptop: Failed tests: testSetCreation(edu.harvard.iq.dataverse.api.HarvestingServerIT): expected:<201> but was:<500>
14:30
pdurbin
pameyer: since Michael didn't drag the pull request to code review yet, does that mean he's still working on it?
14:31
pdurbin
I doubt he'll be bothered by the fixes I just pushed to get the api test suite to execute.
14:42
pdurbin
pameyer: ok, I left a comment. I hope it helps: https://github.com/IQSS/dataverse/pull/4606#issuecomment-401828758
14:44
pameyer
I didn't see the HarvestingServerIT failure on that branch late yesterday
14:45
pdurbin
I wonder if the failure is on "develop".
14:48
pdurbin
I'm doing another run on my laptop, this time on develop.
14:48
pdurbin
pameyer: thanks for the scripts for making this easier!
14:48
pameyer
np - I like making things easier
15:04
pdurbin
That's fun. I get a *different* single failing test on my laptop on develop: testAddUpdateDatasetViaNativeAPI(edu.harvard.iq.dataverse.api.DatasetsIT): JSON path message doesn't match.(..)
15:05
pdurbin
I kicked off a run on phoenix. It's been a while. Would like another data point.
15:07
pdurbin
Bleh. phoenix is in an even worse state. Stage 2 (deployment) is failing: "FinalizeDatasetPublicationCommand 6898dc2b failed: null" https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-deploy-develop/220/console
15:14
pdurbin
I opened an issue: https://github.com/IQSS/dataverse/issues/4799
15:16
pameyer
huh
15:17
pameyer
at some point, it would be good if we took the "person pushing a button" out of the loop on these
15:32
pdurbin
Yes. Danny keeps asking us to bring him a GitHub issue to estimate.
15:33
pameyer
cron job from iqss jenkins?
15:34
pdurbin
I'm not sure I follow.
15:35
pameyer
every $x minutes, use the CLI to kick off a phoenix deploy/test run
15:36
pameyer
does that make sense, or am I confusing things more?
15:38
pdurbin
We should probably talk it out in person. You can tell Jenkins to check for new code every hour or every day or whatever.
15:39
pameyer
that sounds like it might be the same thing - but let's talk in person
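For the record, the two options being compared might look like this. The Jenkins URL and job name below are hypothetical, invented purely for illustration:

```shell
# Option 1 (external cron): a crontab entry that uses the Jenkins CLI
# to trigger the deploy/test job every hour.
#   0 * * * *  java -jar jenkins-cli.jar -s https://jenkins.example.edu/ \
#                  build phoenix-deploy-and-test
#
# Option 2 (no external cron): configure the Jenkins job itself with
# "Poll SCM" and a cron-style schedule, e.g.
#   H * * * *
# so Jenkins checks for new commits hourly and runs only when there are any.
```

Both put a timer in charge instead of a person pushing a button, which is the point of the discussion above.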
15:42
pdurbin
Ok. I wonder if the new community consortium would be interested in hosting Jenkins.
15:42
donsizemore
@pdurbin yes =)
15:42
pdurbin
donsizemore: orly? :)
15:43
donsizemore
@pdurbin i'm totally speaking out of turn, as usual
15:44
pameyer
@donsizemore - I've got signups to do once I'm done migrating stuff
15:47
pdurbin
I can't publish through the GUI either.
15:50
donsizemore
@pameyer i'm standing up my first DCM - clear your calendar this afternoon
15:50
pameyer
that's not good
15:51
pameyer
@donsizemore - I should be around most of the afternoon
15:52
pameyer
it should be easier - but *easier* != *easy* sometimes :(
16:01
pdurbin
Any topics to discuss on the community call tomorrow? See https://groups.google.com/d/msg/dataverse-community/Sd_DD-iGrNs/ZC_YeAO0AgAJ
16:11
donsizemore
@pameyer i did start with a cent7 box, so i may send you a systemd unit file?
16:26
jri joined #dataverse
17:38
pameyer
@donsizemore - absolutely. I haven't tested much with cent7 yet
17:39
pameyer
@donsizemore - any suggestions for places to improve documentation (or the install process) are also very welcome
18:02
donsizemore joined #dataverse
18:04
donsizemore
@pameyer so if i started with CentOS7 and the 0.1-1 RPM and i'm in soft error territory, should i drop to centos6 first or blow away the VM and start with the dev-installation instructions?
18:05
pdurbin
soft error?
18:05
pameyer
@donsizemore centos6 might be the path of least resistance.
18:06
pameyer
it's possible that some of the restricted shell config changed with cent7, and that might need more dev
18:07
donsizemore
@pameyer this is all for play-play, all good
18:08
pameyer
@donsizemore until dual-mode gets sorted, that's the setup I'm guessing most folks would be interested in
18:08
pameyer
"dual-mode" ~= "don't break the dataverse uploads users are used to"
19:12
donsizemore
@pdurbin soft error = 403 forbidden from lighttpd, nothing further logged. same result on centos6, gonna retrace my steps
19:12
pdurbin
ok
19:25
pameyer
@donsizemore sounds like lighttpd.conf config for the allowed IP addresses
19:39
donsizemore
@pameyer yeah my laptop may be sending ipv6 but it's not allowing localhost either. must head for my 3:30 zoom
19:40
pameyer
@donsizemore ipv6 :( . I've got a 4:10 zoom, enjoy your 3:30
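The allow-list theory above would correspond to something like the following lighttpd.conf fragment. This is a sketch of the general pattern, not the actual DCM configuration:

```
# Hypothetical lighttpd.conf fragment: return 403 for any client whose
# address is not explicitly allowed. A client connecting over IPv6 (::1)
# will not match an IPv4-only entry for 127.0.0.1, which fits the
# symptom of even localhost being refused.
$HTTP["remoteip"] !~ "^(127\.0\.0\.1|::1)$" {
    url.access-deny = ( "" )
}
```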
19:48
EPIC_HSPH joined #dataverse
19:49
EPIC_HSPH
Hello!
19:50
EPIC_HSPH
I'm having difficulties with implementing my guestbook (many answers from people were just 'guest' with no further answers). Is this the right place to find support?
20:04
pdurbin
EPIC_HSPH: hi! You're welcome to ask here. Which installation of Dataverse are you using?
20:05
EPIC_HSPH
The Harvard dataverse? I'm not sure
20:05
EPIC_HSPH
https://dataverse.harvard.edu/dataverse/EPIC2
20:05
EPIC_HSPH
I have installed a dataverse guestbook there
20:06
EPIC_HSPH
but nowadays more and more people don't fill it...
20:06
pdurbin
EPIC_HSPH: the best thing to do would probably be to click the "Support" button at the top of the page to open a ticket. Then you can let us know the ticket number.
20:07
EPIC_HSPH
I only see a "guest" username and there's no further information collected
20:11
EPIC_HSPH
has an ID of #263682.
20:12
pdurbin
EPIC_HSPH: thanks! I see it. That way there will be some tracking around this issue. How long ago was the most recent response in your guestbook?
20:13
EPIC_HSPH
today was the most recent response. it was a good response
20:14
EPIC_HSPH
11 files were downloaded on 6/4/18, and all of them were 'guest'
20:15
EPIC_HSPH
104 files downloaded on 5/25/18, and 3 of them were 'guest'
20:21
pdurbin
EPIC_HSPH: are you saying files are being downloaded without the required fields in your guestbook being enforced? Also, I have to leave in 10 minutes but someone will reply on the ticket you created.
20:24
EPIC_HSPH
yes
20:25
pdurbin
EPIC_HSPH: can you please link to a file where this happens?
20:33
EPIC_HSPH
this happens in many different files
20:34
EPIC_HSPH
they are all under the same dataverse in the link above
20:34
EPIC_HSPH
all folders have this issue, all files
20:38
pdurbin
EPIC_HSPH: Huh, I just tried downloading one of the files without filling out the guestbook and it wouldn't let me. It says "Required field" like this: https://i.imgur.com/wwqPiig.png
20:44
EPIC_HSPH
in general that's the case too
20:45
EPIC_HSPH
but 238 of 801 records are blanks
20:47
EPIC_HSPH
https://imgur.com/a/VgvXp74
20:48
EPIC_HSPH
i have to leave too, but can you please respond to this at the ticket? the email is epic@hsph.harvard.edu thanks!
20:49
EPIC_HSPH
i have to leave too, but can you please respond to this at the ticket? thanks!
20:50
EPIC-HSPH joined #dataverse
21:24
pameyer left #dataverse
23:03
jri joined #dataverse