04:14
yoh joined #dataverse
07:18
jri joined #dataverse
08:25
stefankasberger joined #dataverse
09:10
poikilotherm joined #dataverse
09:50
pdurbin
I'm pretty excited about this new integration between Dataverse and Renku: https://groups.google.com/d/msg/dataverse-community/2H21moBIRgU/PUuai7UNBgAJ :)
09:55
poikilotherm
Mornin' pdurbin :-)
09:55
poikilotherm
Just back to keyboard from lunch :-)
09:56
poikilotherm
Looks pretty cool :-)
09:58
pdurbin
Yeah, there are still some details to work out. And next steps: https://github.com/SwissDataScienceCenter/renku/issues/593#issuecomment-525943482
09:58
pdurbin
Cool that the initial integration "just works" though. Christmas came early. :)
09:59
poikilotherm
:-D
09:59
poikilotherm
You sound pretty enthusiastic :-)
10:00
poikilotherm
pdurbin I saw yesterday evening that you guys tagged v4.16
10:00
poikilotherm
Is this stable or should I wait a day or two?
10:03
pdurbin
It's probably stable.
10:04
pdurbin
Famous last words.
10:04
pdurbin
You don't have much risk, do you? Nothing in production yet. What are you worried about? :)
10:11
poikilotherm
Other people :-D
10:11
poikilotherm
I'll just go ahead and create a new release then ;-)
10:12
poikilotherm
Oh BTW, did you see my https://github.com/IQSS/dataverse/issues/6132
10:14
poikilotherm
I pushed it to code review already... Hope that was not too quick...
10:15
pdurbin
poikilotherm: interesting. Do I have any control over whether Maven is in offline mode or not? If so, can you please add something about this in the dev guide?
10:15
poikilotherm
Maven is only on offline mode when you demand so: mvn -o
10:15
poikilotherm
s/on/in/
10:15
pdurbin
poikilotherm: ok, a line in the dev guide about this would be great
10:16
poikilotherm
Sure. Any particular place you'd prefer?
10:16
poikilotherm
Words on use case, too?
10:17
pdurbin
Maybe you could add something in () at http://guides.dataverse.org/en/4.16/developers/tips.html#deploying-with-asadmin where it says "mvn package"
10:20
pdurbin
Maybe change that line to "1. Build the war file: ``mvn package`` (``mvn -o package`` for faster offline builds)"
10:20
poikilotherm
Huh? Faster offline builds?
10:20
poikilotherm
This is only faster due to caching and not waiting for downloading the world...
10:21
poikilotherm
So this is only relevant for CI and Docker scenarios
10:21
poikilotherm
Everybody else will have their .m2 repo filled
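For anyone following along, here is a minimal sketch of the offline-build idea using stock Maven goals; whether #6132 wires things up exactly this way is an assumption:

    # Pre-fetch dependencies and plugins into the local repository (~/.m2 by default)
    mvn dependency:go-offline

    # Subsequent builds can then run without touching remote repositories
    mvn -o package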
10:23
poikilotherm
I added a clarification to the issue description
10:26
pdurbin
Maybe I should go make coffee. I haven't had any yet and I don't think I understand. Are you saying build times over at https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-build-develop/ will be faster? That's the Jenkins job I use to run ``mvn package`` on the develop branch before scp'ing the war file to the phoenix server.
10:27
poikilotherm
Yeah. I like that shirt... "Ok, but coffee first".
10:27
pdurbin
It looks like those builds take a little over a minute right now.
10:27
poikilotherm
Build times on Jenkins might show no difference when you are not cleaning up the build env.
10:27
pdurbin
Yes, great shirt.
10:28
poikilotherm
Then the maven cache is not dumped
10:28
poikilotherm
This will be an issue for everyone using ephemeral build environments like Travis or Gitlab
10:29
poikilotherm
And it might become an issue in Jenkins when using a different build env for every branch
10:29
poikilotherm
Depends on where the caching happens: per env or inside the jenkins user home directory
10:30
pdurbin
So the builds at https://travis-ci.org/IQSS/dataverse will speed up? They seem to take 3-4 minutes.
10:30
poikilotherm
Yes, as long as you will do proper caching. Some tweaks might be necessary for that.
10:31
poikilotherm
https://docs.travis-ci.com/user/caching/#arbitrary-directories
10:31
pdurbin
Tweaks to a future version of https://github.com/IQSS/dataverse/blob/v4.16/.travis.yml ?
10:31
poikilotherm
Travis build times might suffer from their high load, too. It takes quite some time for their VMs to boot
10:31
poikilotherm
Yarp
10:32
poikilotherm
Currently there is no caching in use
10:32
poikilotherm
IIRC there is no implicit caching on Travis
10:33
poikilotherm
This also is an issue for building in Docker
10:33
poikilotherm
Like I do for k8s #64
10:34
poikilotherm
The whole "maven downloads the world" thing can be packed into a layer, which is untouched as long as the pom.xml does not change. Huge difference in build time
10:36
pdurbin
The way this Travis doc reads it seems like we can just add those cache lines to our .travis.yml to enable caching, to speed things up. Without your pull request. But I could be wrong. Yes, keep telling me about Docker. :) I don't think your pull request would speed up docker-aio on my Mac because it uses the same ~/.m2 cache I use for regular development.
10:37
poikilotherm
Not so fast about Travis
10:37
poikilotherm
Yes, you could cache the .m2
10:37
pdurbin
The place where I suffer from slowness due to downloading Maven dependencies is when I spin up a branch on EC2. Do you know what I mean? Can you help me with that? :)
10:38
poikilotherm
But anything plugin related will not necessarily be cached. You cannot simply switch to offline mode
10:38
poikilotherm
Well, you could use that maven plugin to create a cache, save that in a tarball on S3 and preseed it when booting the VM
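Roughly, that tarball-on-S3 idea could look like the following sketch, assuming the AWS CLI is available; the bucket name is a placeholder:

    # On a machine with a warm Maven cache: pack ~/.m2 and upload it
    tar czf m2-cache.tar.gz -C "$HOME" .m2
    aws s3 cp m2-cache.tar.gz s3://example-bucket/m2-cache.tar.gz

    # On a freshly booted VM: preseed ~/.m2 before the first mvn run
    aws s3 cp s3://example-bucket/m2-cache.tar.gz .
    tar xzf m2-cache.tar.gz -C "$HOME"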
10:39
pdurbin
Yeah, I've had that thought. Mmmm, tarballs.
10:40
pdurbin
If I tried to do a dataverse-kubernetes release, would I feel the pain you're feeling?
10:40
poikilotherm
Tarballs are really unhandy when talking about preseeding a docker build
10:40
poikilotherm
I don't feel any pain :-D
10:41
pdurbin
You don't? Are you solving a theoretical problem? :)
10:42
poikilotherm
?
10:42
poikilotherm
Pardon?
10:42
poikilotherm
I lost your train of thought, I fear
10:42
poikilotherm
My Maven stuff is unrelated to a new release
10:43
poikilotherm
That might have been mixed up a little
10:44
pdurbin
I still need to go make coffee. My fault. Did I tell you about the silly putty?
10:45
poikilotherm
Nope. But coffee first :-D
10:46
pdurbin
I guess I called it Silly Putty Day. Please see my writeup and picture of the wall outside my office at https://groups.google.com/d/msg/dataverse-dev/Pkces_MBqR8/fz_IRD2lBgAJ
10:51
pdurbin
Silly Putty Day is my way of asking, "To whom will the resolution of this issue bring joy?" In the case of your pull request, who is going to be excited? kcondon? pameyer? donsizemore? I personally don't see the benefit yet, which is fine. :) Will it bring joy to someone who is building Docker images for Dataverse?
10:51
poikilotherm
Definitely the latter
10:52
poikilotherm
Maybe it will ease donsizemore and others on their way to branch builds
10:52
poikilotherm
It will not hurt anyone else
10:52
pdurbin
Ok, so this might spark joy for 4tikhonov?
10:52
poikilotherm
Absolutely
10:54
pdurbin
Can you add a line to a future version of http://guides.dataverse.org/en/4.16/developers/containers.html then? To explain the benefit and how to make use of the plugin or at least why it's there?
10:54
poikilotherm
Holy cow...
10:55
poikilotherm
Maybe I should refactor the whole page...
10:55
pdurbin
Yes, please! Maybe do that first. In a separate pull request. :)
10:55
poikilotherm
..
10:56
poikilotherm
You can create an issue for me :-D
10:56
poikilotherm
And put it to your column ;-)
10:58
poikilotherm
I dunno if you saw my removal of that endorsed stuff
10:58
poikilotherm
This seems to be completely unrelated these days
10:58
poikilotherm
I dunno if this is necessary for Netbeans
10:59
poikilotherm
Code, Installer, Docs have no reference to this
10:59
poikilotherm
Maybe you remember why this had been introduced before 2013 ;-)
11:18
poikilotherm
pdurbin I just created a little helper for my local env
11:19
poikilotherm
Using a git hook to put the current branch name into the buildnumber.properties
11:19
poikilotherm
We could create a Gist and add it to the dev docs...
11:20
poikilotherm
Interested=
11:20
poikilotherm
?
11:26
poikilotherm
Meh, orgs cannot create gists. I could add it as a script.
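The hook being described might look roughly like this; this is a sketch only, and the hook type, properties file path, and property key are assumptions rather than the contents of the eventual script:

    #!/bin/sh
    # .git/hooks/post-checkout: record the current branch and commit as the build number
    # (properties file path and key here are assumptions)
    BRANCH=$(git rev-parse --abbrev-ref HEAD)
    COMMIT=$(git rev-parse --short HEAD)
    echo "build.number=${BRANCH}-${COMMIT}" > src/main/java/BuildNumber.properties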
11:56
pdurbin
poikilotherm: sorry, coffee led to breakfast and a long conversation. You could call into custom-build-number like this if you want: https://github.com/IQSS/dataverse/blob/v4.16/conf/docker-aio/1prep.sh#L17
11:57
poikilotherm
Yeah, I just came across this script :-D
11:57
poikilotherm
Did see it before
11:57
poikilotherm
Did not
11:58
poikilotherm
Not sure if I want to reuse it - it will pester people with a strange message about "No custom build number specified. Using $BRANCH_COMMIT"
11:58
poikilotherm
On every checkout
11:58
donsizemore joined #dataverse
11:59
poikilotherm
Reusing the commands obviously is a good idea
12:01
donsizemore
@poikilotherm don't make things easier for me. not worth it =)
12:01
poikilotherm
I'm doing this primarily for me :-D
12:01
poikilotherm
I want a proper layer
12:01
poikilotherm
Reducing build time with Docker
12:02
poikilotherm
Hopefully this will be beneficial for others
12:02
poikilotherm
Either for other CI things or in using my stuff for CI
12:02
donsizemore
I support this. BTW, ibiblio was a primary Maven mirror for years before they moved to a CDN
12:03
pdurbin
poikilotherm: oh, I mentioned to donsizemore yesterday that our current sprint is just starting today and for three weeks (longer than our usual two weeks) we will be focusing almost entirely on improving automated testing. Would you like to help us with that? :)
12:03
donsizemore
so if you want to set up FLOSS mirrors with plenty of space and good connectivity... i know just the place
12:03
poikilotherm
Sure thing!
12:03
donsizemore
@pdurbin i was going to focus on grafana today, then hop back on testing
12:03
poikilotherm
Oh speaking of Grafana
12:03
poikilotherm
I have a nice little graph here for my notebook
12:03
poikilotherm
Using prometheus and node exporter
12:04
poikilotherm
Showing load avg, PSI, saturation, ...
12:04
poikilotherm
Maybe this is helpful?
12:04
poikilotherm
Happy to share the code with you guys
12:04
poikilotherm
Can share a screen shot, too
12:04
pdurbin
poikilotherm: great! Do any of the issues currently in https://github.com/orgs/IQSS/projects/2#column-5298408 (the current sprint) interest you?
12:05
pdurbin
bjonnh bricas_ icarito[m] jri pmauduit stefankasberger: you are all welcome to contribute to our automated testing efforts over the next 3 weeks too, of course! :)
12:06
poikilotherm
https://i.imgur.com/ZlNYFNq.png
12:07
pdurbin
poikilotherm: nice screenshot. Do you want to add any of those metrics to https://github.com/IQSS/dataverse-ansible/tree/99_collectd_grafana ?
12:09
poikilotherm
If you guys think those are useful I am happy to share
12:10
pdurbin
I don't know what PSI is. Except when I'm inflating my bike tires. :)
12:11
poikilotherm
https://facebookmicrosites.github.io/psi/
12:12
pdurbin
huh, interesting, thanks
12:12
donsizemore
@pdurbin is the AWS VM you and @pmauduit set up still up? (and if so, would you share your .pem?)
12:12
pdurbin
poikilotherm: would sometime in the next 3 weeks be a good time to work on https://github.com/IQSS/dataverse-kubernetes/issues/64 ?
12:14
poikilotherm
I am working on this right now!
12:15
poikilotherm
#6132 is about this :-D
12:16
pdurbin
donsizemore: sure, sent. Does it work? :) It's for http://ec2-3-81-53-52.compute-1.amazonaws.com
12:17
pdurbin
poikilotherm: great! How can I help? :)
12:17
donsizemore
thank you. i've plugged in the conf files you two stuck in the issue, but am getting errors. i'm retracing my ansible task manually on dataverse-test.irss.unc.edu and can compare against your VM.
12:17
poikilotherm
Do a code review, merge :-D
12:18
pdurbin
poikilotherm: did you add any docs yet?
12:18
poikilotherm
Nope
12:19
pdurbin
Please add at least one line to the docs. :)
12:19
poikilotherm
Because I am very unsure there is an appropriate place
12:19
poikilotherm
There is no real build or CI section
12:19
poikilotherm
I fear this will be lost
12:20
pdurbin
poikilotherm: you can add it to a future version of http://guides.dataverse.org/en/4.16/developers/testing.html#continuous-integration
12:21
poikilotherm
Ok
12:22
pdurbin
thanks!
12:22
pdurbin
If you could fix the header level while you're in there I would appreciate it.
12:23
pdurbin
It should be at the same level as "Load/Performance Testing" and "The Phoenix Server".
12:23
donsizemore
@pdurbin thank you, there are a number of differences in collectd.conf =)
12:24
pdurbin
donsizemore: I was worried about that. :/
12:24
pdurbin
I also messed with the httpd proxy conf.d file.
12:25
donsizemore
i can read them now, and diff them ;)
12:25
pdurbin
yeah
12:26
pdurbin
if you `locate .orig` you can hopefully find some of the files pmauduit and I made copies of before editing them. So you can diff them.
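In other words, something along these lines; the collectd path is just an example:

    # find the pre-edit backup copies (relies on an up-to-date locate database)...
    locate .orig

    # ...then compare each backup against the edited file, e.g.:
    diff /etc/collectd.conf.orig /etc/collectd.conf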
12:33
poikilotherm
pdurbin: you want me to level up CI, right?
12:33
pdurbin
poikilotherm: yes! Please! You can even run up our AWS bill!
12:33
poikilotherm
hehehe. I meant the docs
12:34
donsizemore
@pdurbin well, i want this on our test masheen for starters, so i'm diffing and cobbling into ansible as i go
12:34
poikilotherm
CI is now located below "load testing"
12:34
poikilotherm
Maybe CI should be top level on its own and Phoenix below it
12:35
pmauduit
poikilotherm: FYI for collectd and some other, we just need to put some files that could be customized, but I'd say that we don't need to use "replaceinfile"-like ansible modules
12:35
pmauduit
in the "<tool>.d/"-like directories
12:36
poikilotherm
Hi pmauduit!
12:36
poikilotherm
What are you using collectd for right now?
12:36
pdurbin
poikilotherm: you are very welcome to reorganize that page
12:36
poikilotherm
Argh
12:36
poikilotherm
pdurbin: too much load :-D
12:36
poikilotherm
Small steps :-D
12:36
poikilotherm
I'll fix the headers
12:37
pdurbin
poikilotherm: you can just add the one line if you want :)
12:37
poikilotherm
That whole page needs a refactoring, too ;-)
12:37
pdurbin
yeah
12:38
pdurbin
poikilotherm: I refactored the whole API Guide recently! Did you see? I'm even using your graphviz stuff: http://guides.dataverse.org/en/4.16/api/intro.html . Feedback is welcome!
12:39
poikilotherm
pmauduit: I'm asking because collectd seems like a big shotgun to grab those JMX things
12:39
poikilotherm
But I don't have on my radar whether you are using it for other things, too
12:40
poikilotherm
pdurbin: I saw the issue flow, but had no time to take a look yet :-)
12:40
donsizemore
@pdurbin yis yis minor discrepancies but things are making sense
12:44
pmauduit
for sure, if the java app you are using is exporting a prometheus endpoint, we can get rid of collectd
12:44
pmauduit
but as far as I know, relying on the discussions we had with pdurbin in the past few days, this is not the case yet for glassfish
12:44
pmauduit
(it is for solr)
12:45
pmauduit
collectd can also provide some other metrics (more focused on the system ones, browsing through /proc and so on)
12:54
poikilotherm
pdurbin: I just updated https://github.com/IQSS/dataverse/pull/6133
12:54
poikilotherm
Feel free to nuke or add
12:55
poikilotherm
Well when using Prometheus, shouldn't we go with things like node_exporter and jmx_exporter?
12:55
poikilotherm
Node exporter is more actively maintained compared to collectd
12:55
poikilotherm
E.g. PSI is not available via collectd
12:56
poikilotherm
https://github.com/prometheus/jmx_exporter
12:56
poikilotherm
https://github.com/prometheus/node_exporter
13:09
pmauduit
I don't understand why we'd run it as a java agent if we can specify the jmx endpoint where we want to gather metrics as a configuration option, but yes, it might be more relevant
13:10
pmauduit
(i.e. why not run it as a separate java application? no need to instrument our running java app with an agent)
13:10
poikilotherm
You can do either - or
13:11
poikilotherm
Either extend Glassfish with a tool like the one Payara has onboard
13:11
poikilotherm
By adding it as an agent to the running Glassfish JVM
13:11
poikilotherm
Or you can scrape the JMX values by "remote" connecting to the Glassfish JMX service
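For reference, the two modes being contrasted look roughly like this; jar paths, ports, and config files are placeholders:

    # Option 1: attach jmx_exporter as a Java agent to the app server JVM
    # (for Glassfish/Payara this would go into the domain's JVM options)
    java -javaagent:/path/to/jmx_prometheus_javaagent.jar=9404:/path/to/config.yaml ...

    # Option 2: run the standalone jmx_prometheus_httpserver, which connects to the
    # app server's remote JMX port as configured in config.yaml
    java -jar /path/to/jmx_prometheus_httpserver.jar 9404 /path/to/config.yaml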
13:48
poikilotherm
pdurbin: I just created issue and PR for my git hooks: https://github.com/IQSS/dataverse/issues/6134
13:48
poikilotherm
Including a git tip in the docs :-)
13:50
poikilotherm
Added this to Code Review, too ;-)
13:50
poikilotherm
Maybe you could give it a +1
14:04
donsizemore
@pdurbin knock knock?
14:04
poikilotherm
Maybe too much coffee :-D
14:04
pdurbin
poikilotherm: I just left you a review on your githook thing
14:05
pdurbin
donsizemore: how can I help?
14:05
pdurbin
pmauduit: you caught that Payara includes an exporter for Prometheus out of the box, right? Payara is a fork of Glassfish that actually gets security patches, etc. It's maintained. :)
14:07
donsizemore
@pdurbin so, the "explore" button. jon is giving a demo. he gets "internal error, no more information available" — with absolutely nothing of use in the server log
14:07
donsizemore
@pdurbin he tried three browsers. the same explore button on the same dataset works for me for a time - then i get the error. now they're all working again
14:07
pdurbin
donsizemore: which external tool? Data Explorer?
14:08
donsizemore
@pdurbin yes. if i cobble together the url with the fileId and load it in a browser, Data Explorer is fine. Dataverse is opening a new tab but the URL points at the dataset
14:08
pdurbin
donsizemore: you're on 4.16?
14:08
donsizemore
@pdurbin so, when i'm looking at the "internal error" with no information available in server.log and press reload, i get a clean dataset. we just got to 4.11.
14:09
donsizemore
@pdurbin i had IQSS' "inconsistent behavior" ticket in dataverse-ansible in mind as things are going haywire during Jon's demo
14:09
donsizemore
https://github.com/IQSS/dataverse-ansible/issues/98
14:09
pdurbin
so sad to hear things are going haywire :(
14:09
pdurbin
Is the server public?
14:10
donsizemore
but i'm not making it that far. Dataverse is opening a new tab; the URL bar contains the URL to the dataset, Dataverse shows a red error and stops. reload that tab and i get the dataset
14:10
donsizemore
https://dataverse.unc.edu/dataset.xhtml?persistentId=hdl:1902.29/H-828462
14:10
donsizemore
which works for me (now) but not for jon or for me minutes ago.
14:11
pdurbin
worked for me just now, a new tab with data explorer
14:11
donsizemore
i'm seeing a ton of BeanValidation errors in server.log but we're harvested constantly so who knows
14:12
pdurbin
violence in schools, sad
14:12
donsizemore
yup, they all worked for me during his talk, until they stopped working. then they all broke. now they're all working again
14:12
pdurbin
Can you make them break again? :)
14:15
donsizemore
they come and go, i haven't figured out how to trigger the condition
14:16
pdurbin
sounds frustrating
14:16
pdurbin
I'm frustrated for you. :)
14:16
donsizemore
eh, only because my coffee cup is empty but i dare not get more while jon's presenting
14:17
donsizemore
i was thinking of nuking the .prep files but then the URL launch started working again
14:18
pdurbin
donsizemore: he's been going for over an hour? You started at 9, right?
14:20
pdurbin
poikilotherm: I left you a comment asking if you want to refactor that other script right now or not. Up to you. :)
14:20
poikilotherm
Just on it :-D
14:21
donsizemore
@pdurbin yeah, it may be a 2-hr call tho
14:22
pdurbin
poikilotherm: I'm confused. Are you still pushing commits to your pull request? I don't want to move it to QA until commits stop. :)
14:23
poikilotherm
Let me push my refactoring commit, then you can move to QA
14:23
pdurbin
ok
14:25
poikilotherm
Pushed, left a comment and resolved
14:25
poikilotherm
Plz review :-)
14:26
poikilotherm
Ah damn, I think I just made a mistake :-/
14:27
poikilotherm
Please don't move to QA yet
14:27
pdurbin
You can revert that refactoring if you're worried about breaking something.
14:27
poikilotherm
I just forgot sth.
14:27
poikilotherm
Git hooks always run in the root of the repo
14:27
poikilotherm
The manual script does not necessarily get called from there...
14:30
pdurbin
I'm getting an error when I run scripts/installer/custom-build-number from the root of the repo. I'm on 3753a9d74.
14:33
poikilotherm
Could you be a bit more precise?
14:33
poikilotherm
An error message would be cool ;-)
14:33
pdurbin
Did you test it?
14:34
poikilotherm
Sure
14:34
poikilotherm
I can test again
14:35
pdurbin
readlink: illegal option -- f
14:35
poikilotherm
Meh.
14:35
poikilotherm
Grml.
14:35
poikilotherm
Unix
14:35
poikilotherm
-f is "fullpath" which is GNU specific
14:36
pdurbin
If you want, you can just put scripts/installer/custom-build-number back to how it was.
14:36
poikilotherm
I could just use a simpler version, simply relying on $0
14:36
poikilotherm
And not trying to cover for symlinks
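A commonly used portable (Linux and macOS) alternative to GNU readlink -f, shown only as a sketch:

    # resolve the directory containing the script without GNU readlink -f;
    # note this does not follow symlinks, which is the trade-off mentioned above
    SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"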
14:37
pdurbin
whatever works on both linux and mac. or again, please feel free to put it back how it was, maybe with a TODO comment to consolidate in the future
14:38
poikilotherm
Please checkout latest commit
14:38
poikilotherm
Works on Linux
14:41
pdurbin
Works but it doesn't output something like this anymore: No custom build number specified. Using 5028-dataset-explore-btn-336a42437
14:41
poikilotherm
Yeah.
14:41
poikilotherm
I told you :-D
14:42
pdurbin
I want my output back.
14:42
poikilotherm
If you insist on getting this, I can obviously read back the written file :-D
14:42
poikilotherm
heheh
14:43
jri joined #dataverse
14:45
poikilotherm
Here ya go
14:45
poikilotherm
5 minutes left before I need to run...
14:45
pdurbin
works fine now. I'll move it to QA. Thanks!
14:46
poikilotherm
Great
14:48
poikilotherm
Added some SLOPI comment to the PR, done for today
14:48
poikilotherm
Read you guys tomorrow
14:48
pdurbin
bye!
15:05
donsizemore
@pdurbin update on grafana: i have more to push but haven't yet gotten ansible (or manually-passed JSON) to create the datasource
15:28
pdurbin
donsizemore: ok. Bummer. Can you please remind me if you're using a grafana playbook or module or whatever it's called that someone else wrote?
17:01
pdurbin
pmauduit: https://twitter.com/GenevievMichaud/status/1167110418344620044
17:54
donsizemore
@pdurbin it's the ansible module
18:05
donsizemore
and all i'm hearing about today is the externaltool feature of dataverse-4.11 not working correctly
18:10
pdurbin
woof
18:10
pdurbin
any luck reproducing the external tool problem?
18:11
pdurbin
and does the ansible module provide a sample json to test with?
18:30
donsizemore
the trick to reproducing it seems to be to give a presentation
18:30
donsizemore
our director clicked an explore button in a meeting with a client and got the red bar of "no further information available"
18:30
donsizemore
i'm using the sample json from the docs https://docs.ansible.com/ansible/latest/modules/grafana_datasource_module.html
18:31
donsizemore
@pdurbin i did verify for gustavo, BTW, that docker-aio is correctly setting FAKE as the DOI provider
18:56
pdurbin
donsizemore: ok, I forget what's next on your list :)
18:57
donsizemore
i'm going to try the grafana playbook in EC2 to see if the bump in ansible version makes a difference
18:57
donsizemore
note that i haven't gotten the API to accept datasource creation via JSON, either.
18:57
donsizemore
i created one in the web interface, exported its JSON, and compared that against what we're passing in Ansible
19:00
pdurbin
does it support other formats? YAML?
19:00
donsizemore
EC2 bumps us to ansible-2.7.9
19:01
pdurbin
When you say it doesn't work via API either do you mean curl?
19:05
donsizemore
yes, a PUT to /api/datasources
19:07
pdurbin
ok, maybe we should try https://grafana.com/docs/http_api/data_source/#get-a-single-data-source-by-id
19:08
pdurbin
or the "get all"
19:12
pdurbin
donsizemore: I don't know if this helps but I see stuff from this: curl http://admin:admin@localhost:3000/api/datasources
19:13
pdurbin
[{"id":1,"orgId":1,"name":"Prometheus","type":"prometheus","typeLogoUrl":"public/app/plugins/datasource/prometheus/img/prometheus_logo.svg","access":"proxy","url":"http://localhost:9090/prometheus ","password":"","user":"","database":"","basicAuth":false,"isDefault":true,"jsonData":{"httpMethod":"GET","keepCookies":[]},"readOnly":false}]
19:17
pdurbin
donsizemore: this worked: curl -H 'Content-Type: application/json' http://admin:admin@localhost:3000/api/datasources -X POST --upload-file test_datasource.json
19:17
pdurbin
with test_datasource.json being this: { "name":"test_datasource", "type":"graphite", "url":"http://mydatasource.com", "access":"proxy", "basicAuth":false }
19:20
pdurbin
donsizemore: am I even in the right ballpark? You're having trouble adding the datasource? Or did you say you're having trouble adding the dashboard?
19:21
donsizemore
i couldn't curl the datasources. but i'm setting a username and password so i may be my own worst enemy
19:22
pdurbin
donsizemore: basic auth should work so you should be able to replace "admin:admin" in the curl above.
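That is, either form of basic auth should work; the credentials and host here are placeholders:

    # credentials embedded in the URL, as in the curl above...
    curl http://admin:yourpassword@localhost:3000/api/datasources

    # ...or equivalently via curl's -u flag
    curl -u admin:yourpassword http://localhost:3000/api/datasources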
19:22
pdurbin
for what it's worth, I can see the datasource I just added at http://ec2-3-81-53-52.compute-1.amazonaws.com/grafana/datasources
19:22
donsizemore
i could get a mixture of 403s or 404s from the api using the -u construct
19:23
pdurbin
hmm, well, try it without -u I guess. like I did above :)
19:23
donsizemore
i bet i have to ditch the ansible grafana module and use a curl module instead
19:23
* pdurbin
hugs curl
19:24
donsizemore
it hasn't been the primary focus of my day, though. i can't even disable other previewers - we only have Data Explorer in the externaltool table and... sometimes it launches, sometimes it doesn't
19:24
pdurbin
donsizemore: is the problem that we are trying to add files/grafana-dashboard.json but we need to add a datasource first?
19:24
donsizemore
no, it's dying at the datasource step
19:25
pdurbin
ok, dying in "create grafana datasource"?
19:45
donsizemore joined #dataverse
19:46
donsizemore
it's the ansible version
19:46
donsizemore
|dls irss-dls:~/Desktop| curl http://admin:admin@ec2-54-85-112-243.compute-1.amazonaws.com/grafana/api/datasources [{"id":1,"orgId":1,"name":"prometheus","type":"prometheus","typeLogoUrl":"public/app/plugins/datasource/prometheus/img/prometheus_logo.svg","access":"direct","url":"http://localhost:9090/prometheus","password":"","user":"","database":"prometheus","basicAuth":false,"isDefault":true,"jsonData":{"tlsAuth":false,"tlsAuthWithCACert":f
19:47
pdurbin
Oh! You fixed it? :)
19:47
donsizemore
the same code didn't work in vagrant (ansible-2.7.0) but does in EC2. weird.
19:48
pdurbin
sounds like progress :)
19:48
pdurbin
lemme know when you want me to try your branch in ec2 :)