02:52
jri joined #dataverse
07:05
jri joined #dataverse
07:30
jri joined #dataverse
10:22
juancorr joined #dataverse
11:54
danmcp joined #dataverse
13:19
danmcp joined #dataverse
13:42
dataverse-user joined #dataverse
13:42
dataverse-user
hi
13:42
dataverse-user left #dataverse
13:48
danmcp joined #dataverse
13:58
Venki joined #dataverse
14:01
Venki
Hi Phil
14:01
Venki
Quick question on metadatablock deletion
14:02
Venki
I have sent an email to the google usergroup
14:06
pameyer joined #dataverse
14:07
pameyer
Hi Venki
14:08
pameyer
which version of dataverse are you trying to delete the metadata / metadata block from?
14:08
pameyer
it's possible that the database schema has changed since that sql was posted to the google group
14:25
Venki
Hi @pameyer
14:25
Venki
Latest version
14:25
Venki
I did this testing a few days back too: I deleted a metadatablock after creating it, but no datasets had been created with it.
14:27
pameyer
was this before or after creating datasets with the new block?
14:27
Venki
after creating the datasets with the new block
14:27
Venki
delete from datasetfield where datasetfield.id in ( select datasetfield.id from datasetfield join datasetfieldtype on datasetfield.datasetfieldtype_id=datasetfieldtype.id where datasetfieldtype.metadatablock_id=12 );
14:28
Venki
ERROR: update or delete on table "datasetfield" violates foreign key constraint "fk_datasetfieldcompoundvalue_parentdatasetfield_id" on table "datasetfieldcompoundvalue" DETAIL: Key (id)=(11481780) is still referenced from table "datasetfieldcompoundvalue".
14:28
Venki
This is the error I get for deleting the data from the datasetfield table..
14:29
pameyer
have you tried deleting / destroying the dataset? I'm assuming that this isn't a production system, and that that would be a viable option - but let me know if I'm wrong
14:29
Venki
This is in the development server so no worries...
14:30
Venki
I was trying to use the metadata delete sql that you had posted in the google groups...
14:30
* pameyer
sigh of relief
14:30
Venki
It worked previously for another metadatablock that I had created but had not created any datasets with.
14:30
pameyer
I *think* that if you destroy the dataset, then you'll be able to satisfy the constraint without more sql
14:31
Venki
Oh yeah... I remember now... I did delete the datasets from the UI and then deleted the metadata blocks...
14:32
Venki
Let me try it now.. Hang on..
14:40
Venki
I tried to delete the dataset from the UI but it doesn't work. I go through the delete process successfully and receive a message that the dataset has been successfully deleted
14:40
Venki
But the dataset still exists under that dataverse
14:41
Venki
@pameyer any help?
14:42
pameyer
has this dataset been published?
14:42
Venki
nope..
14:43
pameyer
I vaguely remember that there's a difference between destroy and delete
14:43
pameyer
but if it hasn't been published, that may not matter. taking a look to see where/if there's a destroy api
14:44
pameyer
yup - looks like there is
14:45
pameyer
DELETE to `api/datasets/$id/destroy`
14:45
Venki
wow is it to delete or destroy the dataset?
14:46
pameyer
that's to destroy the dataset - if you're still seeing it, it might be worth a try. definitely an API to be careful with though
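A sketch of the destroy call pameyer describes. The endpoint is from the thread; the server URL, API token, and dataset id below are hypothetical placeholders. Since destroy is irreversible, this sketch only builds the command as a string so it can be inspected before being run for real.

```shell
# Hypothetical values -- substitute your own server, token, and dataset database id.
SERVER_URL="http://localhost:8080"
API_TOKEN="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
DATASET_ID=42

# Build (don't run) the destroy request so it can be double-checked first.
DESTROY_CMD="curl -H X-Dataverse-key:$API_TOKEN -X DELETE $SERVER_URL/api/datasets/$DATASET_ID/destroy"
echo "$DESTROY_CMD"
```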
14:46
Venki
I found this from the guides page
14:46
Venki
curl -u $API_TOKEN: -i -X DELETE https://$HOSTNAME/dvn/api/data-deposit/v1.1/swordv2/edit/study/doi:TEST/12345
14:46
Venki
how do I get the dataset's ID?
14:47
Venki
From db?
14:47
pameyer
from the db would do - $id is the database id
14:47
pameyer
it should be in the dataverse json produced by the native API too; but if you're already in the db, that's probably easier
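A sketch of the database lookup pameyer suggests. The `dvndb` database name and the assumption that the `dataset` table carries an `identifier` column reflect a default 2018-era install and may differ on yours; the identifier value is made up.

```shell
# Hypothetical persistent-identifier suffix -- adjust to your own dataset.
IDENTIFIER="ITI8AB"
SQL="SELECT id FROM dataset WHERE identifier = '$IDENTIFIER';"
# Would be run as, e.g.:  psql dvndb -c "$SQL"
echo "$SQL"
```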
14:48
Venki
ok will give it a try now..
14:48
* pameyer
cross my fingers for good luck
14:49
* pdurbin
crosses his fingers too
14:52
pameyer
pdurbin: when / if you've got time, I'm curious if you think https://github.com/pameyer/dataverse/blob/exp-docker_it_iterating/conf/docker-aio/prep_it.bash is worth another PR (short version, docker-aio setup for running integrations tests is "run this script")
14:54
pdurbin
pameyer: correct me if I'm wrong but this looks like a single script to run instead of half a dozen commands.
14:57
pameyer
pdurbin: exactly
14:58
pdurbin
pameyer: I'd probably be more likely to use it (but time would tell). Ship it! :)
14:58
pameyer
pdurbin: will do ;) going to add a sentence about what's assumed to be installed to the readme first (aka - javac, mvn, docker, make)
14:58
pameyer
low urgency, and maybe not today - but will be incoming
14:59
pameyer
@Venki - any change after destroy?
15:00
pdurbin
pameyer: ah, sure. I added checking for if mvn is even installed here: https://github.com/IQSS/dataverse/blob/e7a56c7ee07ef3ff931c18927a2f737eeae33dba/conf/docker/build.sh#L56
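The kind of up-front check pdurbin links to can be sketched as a small function that reports any missing tool; the tool list is the one pameyer mentions for docker-aio.

```shell
# Report any of the given tools that are not on PATH.
check_tools() {
  for t in "$@"; do
    command -v "$t" > /dev/null 2>&1 || echo "missing: $t"
  done
}

# Check the docker-aio prerequisites pameyer lists.
check_tools javac mvn docker make
```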
15:02
Venki
@pameyer still figuring out how to use the destroy api since I had created the dataset under some other person's dataverse
15:05
pameyer
@Venki - definitely worth taking your time with it
15:05
pdurbin
Venki: here's how to destroy a dataset: https://github.com/IQSS/dataverse/issues/2593#issuecomment-339636907
15:11
Venki
@pdurbin I used the code as shown in the github issue 2593 but I get the following error: curl: (35) Encountered end of file
15:13
pdurbin
Venki: huh. Could it be an SSL problem? Maybe try it from localhost?
15:15
Venki
@pdurbin oops it works now... I changed from HTTPS to HTTP ..
15:15
Venki
But I get the error message as {"status":"ERROR","message":"Dataset with Persistent ID 10.5072/FK2/ITI8AB not found."}
15:15
Venki
But I am able to see the details from the UI?
15:19
Thali_UM joined #dataverse
15:31
Venki
Oops... I found the problem... Solr's index didn't get updated after I deleted the dataset from the UI... so now that I've run the reindex command the dataset doesn't show under the Dataverse
15:36
pameyer
ah - that makes sense
15:36
pameyer
surprised that delete doesn't reindex though
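The reindex Venki mentions is presumably the admin index API; a sketch, with the localhost URL as an assumption (the admin API is normally reachable only from the server itself). As above, the command is only echoed here.

```shell
# Full reindex via the unauthenticated admin API (usually blocked off-host).
SERVER_URL="http://localhost:8080"
REINDEX_CMD="curl $SERVER_URL/api/admin/index"
echo "$REINDEX_CMD"
```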
15:39
Venki
yeah but I am still not able to use the delete sql and it still throws the foreign key constraint issues..
15:39
Venki
I think the deletion was not complete or perfect...
15:46
pameyer
can you still see a dataset with that identifier in postgres?
15:50
Venki
yes
15:50
Venki
I am able to see it in the dataset table
15:52
pameyer
that does sound like the delete didn't work
15:54
Venki
so what is the best way now to get rid of that dataset and its values and field types from the db?
15:55
Venki74 joined #dataverse
15:55
pameyer
I think it's back to postgres; looking like datasetfieldcompoundvalue
15:56
venki83 joined #dataverse
15:56
venki83
sorry @pameyer
15:56
venki83
Connection lost and back again as venki83
15:56
pameyer
no problem
15:57
venki83
Any help to do a cascading delete of the two tables datasetfield and datasetfieldtype?
15:57
pameyer
and it looks like there's a typo in that sql - it's *trying* to clean up the compound fields, and I'd guess failing because it's trying to delete from `datasetfield_controlledvocabularyvalue`
15:57
pameyer
instead of `datasetfieldcompoundvalue`
15:58
pameyer
so I *think* if you change that delete / subquery, it's got a chance of working. I'd also run it in a transaction, even if it's a non-production site
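pameyer's two suggestions (fix the subquery to target `datasetfieldcompoundvalue`, and run everything in a transaction) might be combined like this. The table and column names are the ones from the thread; the block id 12 and the `dvndb` database name are placeholders for your install, and the exact statements are a sketch, not the posted google-group sql.

```shell
# Sketch: wrap the cleanup in a transaction so a wrong subquery can be rolled back.
METADATABLOCK_ID=12   # hypothetical -- your custom block's id
cat > /tmp/cleanup.sql <<EOF
BEGIN;
-- children first: compound values reference datasetfield rows
DELETE FROM datasetfieldcompoundvalue WHERE parentdatasetfield_id IN (
  SELECT df.id FROM datasetfield df
  JOIN datasetfieldtype dft ON df.datasetfieldtype_id = dft.id
  WHERE dft.metadatablock_id = $METADATABLOCK_ID );
DELETE FROM datasetfield WHERE datasetfieldtype_id IN (
  SELECT id FROM datasetfieldtype WHERE metadatablock_id = $METADATABLOCK_ID );
-- check the reported row counts, then change ROLLBACK to COMMIT
ROLLBACK;
EOF
# Would be executed as:  psql dvndb -f /tmp/cleanup.sql
```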
15:58
venki83
which sql is that?
15:58
pameyer
"delete from datasetfield_controlledvocabularyvalue where datasetfield_id in"
15:58
pameyer
from the google group posting
15:58
venki83
oh ok
16:00
pameyer
... and from slightly closer reading, this wasn't a typo - this was missing the datasetfieldcompoundvalue table
16:00
pameyer
didn't notice it because I didn't have any in my block
16:01
pameyer
may need to adjust the subquery...
16:04
pameyer
depending on how many compound fields were in your custom block, it may be easier to do manual deletes from that table
16:05
pameyer
subquery is looking non-intuitive to me, and probably not something I could figure out without testing it
16:08
venki83
delete from datasetfieldcompoundvalue where parentdatasetfield_id in ( select datasetfield.id from datasetfield join datasetfieldtype on datasetfield.datasetfieldtype_id=datasetfieldtype.id where datasetfieldtype.metadatablock_id=12 );
16:08
venki83
Still shows the error..
16:09
venki83
How do I do manual deletes from the table?
16:09
venki83
Since the foreign key is set...
16:10
pameyer
select datasetfieldvalue.id from datasetfieldvalue join datasetfield on datasetfieldvalue.datasetfield_id=datasetfield.id join datasetfieldtype on datasetfield.datasetfieldtype_id=datasetfieldtype.id where datasetfieldtype.metadatablock_id=7
16:10
pameyer
^ should tell you the ids of the dataset fields
16:11
pameyer
for each of those, `delete from datasetfieldcompoundvalue where parentdatasetfield_id=$datasetfield_id`
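pameyer's per-id loop, sketched with made-up ids: it just generates one delete statement per parent datasetfield id returned by the select above.

```shell
# Hypothetical datasetfield ids returned by the SELECT above.
STATEMENTS=""
for datasetfield_id in 11481780 11481781; do
  STATEMENTS="${STATEMENTS}DELETE FROM datasetfieldcompoundvalue WHERE parentdatasetfield_id=$datasetfield_id;
"
done
printf '%s' "$STATEMENTS"
```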
16:24
venki83
nope... for that select query I get zero rows for metadatablock_id=12; if I change the value to other metadatablocks then I get some rows returned...
16:25
jri joined #dataverse
16:27
venki83
ok @pameyer I am taking a break now... if you think of any other sql statement that would help to delete from the two tables, please post it in the google groups... I will check tomorrow morning..
16:27
venki83
Thanks for all your kind help.. Really appreciate it...
16:27
pameyer
glad to help
17:12
Thali_UM
Help me with this problem:
17:13
Thali_UM
Attempting to deploy the application.
17:13
Thali_UM
Command line: /usr/local/glassfish4/bin/asadmin deploy dataverse.war
17:13
Thali_UM
remote failure: Error occurred during deployment: Exception while loading the app : CDI deployment failure:Truncated class file. Please see server.log for more details.
17:13
Thali_UM
Command deploy failed.
17:13
Thali_UM
Failed to deploy the application! WAR file: dataverse.war.
17:13
Thali_UM
(exit code: 256)
17:13
Thali_UM
Aborting.
17:35
pdurbin
Thali_UM: bleh. Can you please email your server.log file to support@dataverse.org?
17:40
pameyer
Thali_UM: does `jar tf dataverse.war` return any errors?
17:40
pameyer
might've been a build failure
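pameyer's integrity check, sketched: a truncated war makes `jar tf` fail, so branching on its exit status distinguishes a bad build from a deployment problem. The war path is hypothetical.

```shell
# A truncated archive makes `jar tf` exit non-zero; use that as a quick check.
WAR="target/dataverse.war"   # hypothetical path to the built war
if jar tf "$WAR" > /dev/null 2>&1; then
  RESULT="war looks intact"
else
  RESULT="war is corrupt or missing; rebuild with mvn clean package"
fi
echo "$RESULT"
```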
17:43
pdurbin
pameyer: commencement today. I'm disappointed that I didn't see you at the beer tent. :)
17:43
pameyer
pdurbin: I'm on the wrong side of the river :(
17:43
pameyer
... but also generally in favor of avoiding crowds
17:44
pdurbin
I was just told about this article written by the guy in the office next to mine: https://www.bostonglobe.com/magazine/2012/05/05/don-let-charles-river-dictate-whom-you-date/sIupo1cXlbPYXOmbSnmr7H/story.html
17:48
Thali_UM
It doesn't throw errors
17:51
pameyer
that's a good sign - it should've listed the contents of the war file
17:53
Thali_UM
Yes it is
17:53
pameyer
just to double-check - you followed the instructions in the guides for the weld/osgi jar removal and replacement?
17:56
pdurbin
Thali_UM: which version of Glassfish are you using?
17:58
Thali_UM
4.1
18:00
pdurbin
Ok, that's the right version. Thanks.
18:00
Thali_UM
Yesterday I did the following:
18:00
Thali_UM
I could not start with asadmin start-domain, so I ran asadmin start-domain --verbose; then I think glassfish was stopped and it showed an error. Philip suggested that I start from Netbeans, but it sends a message that it cannot install dataverse because the glassfish service's ports are already in use.
18:01
pdurbin
Thali_UM: are you coming to https://projects.iq.harvard.edu/dcm2018 ?
18:05
Thali_UM
Yesterday I did the following:
18:05
Thali_UM
I could not start with asadmin start-domain, so I ran asadmin start-domain --verbose; then glassfish stopped and it showed an error. Philip suggested that I start glassfish from Netbeans, but it sends an error that it cannot install dataverse because the glassfish service's ports are already in use.
18:07
pameyer
glassfish doesn't always stop cleanly. if you're trying to start from netbeans, I'd suggest doing `kill -9 $pid`, where $pid is the process id from `ps aux | grep glassfish`
18:07
pameyer
you'd probably want to have netbeans not running when doing that ps command
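pameyer's cleanup, sketched: find a leftover glassfish process id, then force-kill it. `pgrep -f` matches the full command line (a tidier equivalent of the `ps aux | grep glassfish` he mentions); the kill itself is only echoed here so nothing gets terminated by accident.

```shell
# Find a leftover glassfish process, if any (takes the first match).
GF_PID=$(pgrep -f glassfish | head -n 1)
if [ -n "$GF_PID" ]; then
  echo "would run: kill -9 $GF_PID"
else
  echo "no glassfish process found"
fi
```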
18:11
pdurbin
Thali_UM: When did you run the installer? Yesterday? Was Dataverse running after you ran the installer? One of the jobs of the installer is to get Dataverse running for the first time. The installer does this without Netbeans.
18:13
Thali_UM
I executed the installer yesterday
18:13
Thali_UM
It did not work; it shows me errors
18:14
pdurbin
:(
18:17
Thali_UM
yes, but when starting glassfish from netbeans, it does not recognize the service; it mentions that the ports are used by another program. I tried to start with start-domain --verbose=true but it gives me java permission errors.
18:18
Thali_UM
this message shows me in the server.log file
18:18
Thali_UM
[2018-05-24T12:01:35.787-0500] [glassfish 4.1] [INFO] [] [javax.enterprise.system.container.ejb.org.glassfish.ejb.per$
18:18
Thali_UM
No timers to be deleted for id: 100085350753370112]]
18:18
Thali_UM
[2018-05-24T12:01:35.790-0500] [glassfish 4.1] [SEVERE] [] [javax.enterprise.system.core] [tid: _ThreadID=45 _ThreadN$
18:18
Thali_UM
Exception while loading the app: CDI deployment failure: Truncated class file
18:18
Thali_UM
java.lang.ClassFormatError: Truncated class file
18:19
pdurbin
I've never seen that error before.
18:20
pdurbin
Thali_UM: you might want to run `mvn clean` based on what I'm seeing here: https://stackoverflow.com/questions/18252775/maven-is-suddenly-throwing-error-truncated-class-file
18:21
Thali_UM
in which directory do I apply the command?
18:23
Thali_UM
already
18:23
Thali_UM
BUILD FAILURE
18:23
pdurbin
to run `mvn clean` you must be in the directory with pom.xml
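The distinction pdurbin and pameyer are making, as a sketch: `mvn clean` must run from the directory holding `pom.xml` (the root of the dataverse checkout), not from `~/.m2/repository`, which is just maven's download cache. The repo path below is hypothetical, and the command is only echoed.

```shell
# Run the build from the checkout root, where pom.xml lives.
REPO="$HOME/dataverse"   # hypothetical checkout location
BUILD_CMD="cd $REPO && mvn clean package"   # rebuilds a fresh, non-truncated war
echo "$BUILD_CMD"
```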
18:24
Thali_UM
The goal you specified requires a project to execute but there is no POM in this directory (/Users/luisolartegervacio/.m2/repository). Please verify you invoked Maven from the correct directory.
18:25
Thali_UM
in the folder does not contain any pom.xml file
18:27
pameyer
dataverse project root - the directory that you have the dataverse repository checked-out into from github
18:29
Thali_UM
yes
18:29
Thali_UM
already
18:29
Thali_UM
build success
18:30
Thali_UM
but in the .m2/repository with mvn clean install the answer is Build failure
18:30
pameyer
that's expected - .m2/repository is where maven keeps a local copy of dependencies it has downloaded
18:34
Thali_UM
ah ok
18:41
Thali_UM
it always stops at the command line: /usr/local/glassfish4/bin/asadmin deploy dataverse.war
18:43
pameyer
it == the dataverse install script?
18:43
pameyer
or netbeans?
18:54
Thali_UM
dataverse install script
18:56
pdurbin
A lot of people struggle to get a Dataverse dev environment set up. You're not alone, Thali_UM
18:57
Thali_UM
Thank you Philip
18:58
pameyer
depending on how you consider it, I still don't have a dataverse dev environment setup (never tried netbeans on os x)
18:58
pameyer
without a vm/container, I mean
18:58
Thali_UM
but I am not configuring for development, just installing dataverse to view it in a web page... or is it the same?
19:00
Thali_UM
Yes, and it is also the first time I have tried on a Mac
19:00
pameyer
it's debatable
19:00
pameyer
https://github.com/IQSS/dataverse/blob/develop/conf/docker-aio/readme.txt might be worth a look. it would mean installing docker on the mac you're using
19:01
pdurbin
Thali_UM: ok you could use Vagrant for development if Docker doesn't work for you.
19:01
pdurbin
or* you could use, I meant
19:01
pameyer
pdurbin: right - I keep forgetting about vagrant
19:02
pameyer
don't know why - I use it for other stuff
19:02
pdurbin
it's a thing
19:02
pameyer
my jenkins is in vagrant at the moment - I should remember
19:04
Thali_UM
I'll try
20:20
Thali_UM joined #dataverse
20:32
danmcp joined #dataverse
21:54
pameyer left #dataverse
22:51
jri joined #dataverse