
Changing values in a database makes all items private, removes them from their collections #21

Open
jovi-juan opened this issue May 5, 2022 · 1 comment


@jovi-juan

Hi Daniel,
Thank you for this time-saving plugin. We wouldn't have been able to create and maintain our Omeka site (https://philippinestudies.uk/mapping) without it. I know you haven't updated it in a long time, but I just wanted to let you know about a few problems I've been having.

First off, importing into an existing collection doesn't work, either by setting it as the Default in the pulldowns on the import screen or by setting it in the spreadsheet. It always comes in as an Untitled collection (at least for me).

I'm mentioning this because I recently tried to update all 18k items in the database, filling in mostly blank values in existing columns. Really just two fields. It seemed to go OK, but when we checked, all the items had been set to private and removed from their collections. The only ones that were spared were the ones CSV+ skipped.

It was a pretty standard import. I was using the Dublin Core: Identifier field as I always do. I used the "Update values of specific fields" selection in the "Action" dropdown instead of "Update record if it exists, if not create one". I also set the default item type to "Physical Object" and the collection to "No Default Collection". Everything else was standard for a TSV.

There was no need to use the "extra data fields". There were only 4 columns total, all solidly within the standard structure for an import, so there was no need to touch anything in "Special Values".
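Just to make the shape of the file concrete, it was laid out roughly like this (tab-separated in the real file, spaces used here for readability; the headings other than the identifier are placeholders rather than my actual field names):

    Dublin Core:Identifier    Dublin Core:Title    Field A     Field B
    item-0001                 Example title        value       value
    item-0002                 Example title                    value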

So before I try this again, I thought I would ask if you had any advice about how to approach it. Should I keep the "Public" and "Collection" values from the CSV export? Should I import collection by collection? (Groan, there are like 40 collections; one has 11k items, some have 2 items.) It would take a lot of manual editing to restore the db to its previous state in that case.

As long as I'm here, I am still mystified, after years of use, by what "Extra data" means on the second screen. What's the standard there? Also, it was nerve-racking trying to set the values correctly for a geo-import. I settled on setting all the geovariables to "Record Type", ticking the enigmatic "Extra data" box, and hoping for the best. I have no idea why this works as opposed to other seemingly appropriate settings for "Special Values".

Anyway, that last stuff is not too important. The stuff about importing and updating existing columns is.

Again, thanks for the excellent plugin.

Jovi

@margalini

I have the same problem when using the "Replace values for all fields" action. In addition, all the items were set to unfeatured. The only solution I have found is to check the "Make records public" option and to include the collection and featured columns in the CSV. This is a problem when there are many items.
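To spell out the workaround, the extra columns look something like the sketch below (headings and values are only illustrative, and "Make records public" is ticked separately on the import screen):

    Identifier    Collection      Featured
    item-0001     Maps            1
    item-0002     Photographs     0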

I really wanted to replace the values of all the tags, but what it actually does is add the new ones to the old ones, so some of them end up duplicated.
