# OpenRefine - Wikimedia Commons: integration scenarios
We give an overview of what it should take to build a Commons integration in OpenRefine, starting with a minimalistic scenario (Goal A) and then a more ambitious one (Goal B).
Goal A: structured metadata editing
-----------------------------------
We make minimal changes to the Wikibase extension to let it edit MediaInfo entities in the same way that it currently edits items.
### Overview of the user workflow
Users would start off from tables like this one, where one column contains filenames of Commons media files.
| Title | Caption | Player | Creator | License | Date |
|---------------------------|----------------------------| ----|---- |--------------|------|
| Katie_Adams_QWC2021_1.jpg | Katie Adams scoring a goal | Katie Adams | Claire Kempf | CC-BY | 2021-08-03 |
| Susan_Beck_QWC2021_1.jpg | Susan Beck shooting a penalty | Susan Beck | Claire Kempf | CC-BY | 2021-08-03 |
The `Title` column can be "reconciled" to Commons, which checks that files with those names exist and extracts their Mids (MediaInfo entity IDs). This column can then be used in a Wikibase schema to add metadata to these files. Other columns can be reconciled to Wikidata and used as values in statements on MediaInfo entities.
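As a sketch of the lookup behind this reconciliation, the snippet below resolves file names to Mids through MediaWiki's query API (the Mid of a Commons file is "M" followed by its page ID). Function names and the response-parsing details are illustrative, not part of any existing OpenRefine code.

```python
from urllib.parse import urlencode

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def mid_lookup_url(filenames):
    """Build a MediaWiki API query URL that resolves file names to page IDs.
    The Mid of a Commons file is "M" followed by its page ID."""
    params = {
        "action": "query",
        "titles": "|".join("File:" + f for f in filenames),
        "format": "json",
    }
    return COMMONS_API + "?" + urlencode(params)

def mids_from_query_response(response):
    """Extract {file name -> Mid} from the API's JSON answer.
    Missing files (no page ID) are reported as None."""
    result = {}
    for page in response["query"]["pages"].values():
        name = page["title"].removeprefix("File:")
        result[name] = "M%d" % page["pageid"] if "pageid" in page else None
    return result
```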
### Development effort
This consists of four steps. Time estimates are for one full-time developer.
* *(2 months)* Develop a reconciliation service for Commons, which translates file names to Mids (just like [Minefield](https://hay.toolforge.org/minefield/) but following the [Reconciliation API](https://reconciliation-api.github.io/specs/latest/)). This could initially be a wrapper on top of MediaWiki's API, but we could also aim for it to be developed as a MediaWiki extension on a longer term (but deploying such an extension in Commons would not be doable for a one-year project).
* *(1 month)* Rework the Wikibase extension to be compatible not just with items but with any entity type which has statements and terms (labels, descriptions and aliases). The underlying library we use, [Wikidata Toolkit](https://github.com/Wikidata/Wikidata-Toolkit), is already built with this problem in mind, so in theory it should be fairly simple.
* *(2 months)* Generalize the manifest to make it possible to specify a federation scenario. This should be expressive enough to capture the current federation relation between Commons and Wikidata, and ideally be flexible enough to capture other scenarios envisioned by the teams from Wikimedia Foundation / Wikimedia Germany. Our questions to these teams are summarized [in this pad](https://hackmd.io/ZYWPoLrZSUSE9paRnXe7hg?view).
* *(0.5 months)* Optional: Make it possible to disable the parts of the Wikibase datamodel which are not used by the target instance, to avoid offering them to users. For instance, descriptions, aliases and references are not used on Commons yet, so should not be offered to users in the schema editing area. Ideally, we should also be able to rename parts of the datamodel in the UI ("Label" in Wikidata means "Caption" in Commons). This step is not needed if we go for Goal B afterwards.
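To make the reconciliation service in the first step above concrete, here is a minimal sketch of how it could shape its answers following the Reconciliation API: one candidate list per query key, empty when the file does not exist. The dictionary shapes are assumptions based on the spec, not an existing implementation; `known_mids` stands in for the MediaWiki lookup.

```python
def reconcile_response(queries, known_mids):
    """Answer a Reconciliation API batch: `queries` maps keys like "q0" to
    {"query": <file name>}; `known_mids` maps existing file names to Mids.
    Unknown names get an empty candidate list, so their cells can later be
    reconciled to "New item"."""
    out = {}
    for key, q in queries.items():
        mid = known_mids.get(q["query"])
        candidates = []
        if mid is not None:
            # An exact title hit is unambiguous, hence score 100 and match=True.
            candidates.append({"id": mid, "name": q["query"],
                               "score": 100, "match": True})
        out[key] = {"result": candidates}
    return out
```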
With this plan, users should be able to edit structured metadata on existing files in Commons. They should be able to do the same on other instances of MediaWiki which add structured data to their media files (do we know of any?).
Goal B: support for file upload
-------------------------------
This builds on Goal A and adds the ability to upload new files to Commons.
### Overview of the user workflow
Start with a table where one of the columns contains paths to media files:
| Path | Title | Caption | Creator | License | Date |
| -------------------------|---------------------------|----------------------------| -------- |--------------|------|
| D:\Data\QWC\IMG_001.jpg | Katie_Adams_QWC2021_1.jpg | Katie Adams scoring a goal | Claire Kempf | CC-BY | 2021-08-03 |
| D:\Data\QWC\IMG_002.jpg | Susan_Beck_QWC2021_1.jpg | Susan Beck shooting a penalty | Claire Kempf | CC-BY | 2021-08-03 |
* Reconcile the second column, which contains the desired file names for the newly uploaded files on Commons. When a filename is free, no results are returned and the cell can be reconciled to "New item". When a filename already exists, a match is found; such filenames must be changed and reconciled again.
* Build a Wikibase schema, using the reconciled column as subject entity id. Structured metadata can be added via statements in the schema. An additional field is available in the schema, into which the `Path` column can be dropped to indicate the path of the file to upload. If this field is used in combination with a matched cell in the `Title` column, the edit uploads a new version of the existing file (a QA warning should be generated to make sure people don't do this unintentionally).
* Another additional field could be provided for the wikitext that should be added alongside the file and structured metadata.
Then, for the upload itself, we have two scenarios (we could support one or both of them):
#### B1: Direct upload from OpenRefine
Just like the "Performs edits on Wikibase" operation, we can do the file uploads directly from OpenRefine as a long-running operation. Once the operation completes, the reconciled cells in the `Title` column are all matched to the newly-created Mids.
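OpenRefine would do this in Java, but as an illustration of the MediaWiki interaction involved, here is a hedged Python sketch of building an `action=upload` call and coarsely classifying its outcome; the outcome classification is where richer error reporting would plug in. Field values and the classification policy are illustrative assumptions.

```python
def upload_params(filename, wikitext, csrf_token):
    """Form fields for MediaWiki's action=upload call; the file bytes travel
    in a separate multipart "file" part. ignorewarnings=0 surfaces problems
    (duplicate files, bad titles) instead of silently overwriting."""
    return {
        "action": "upload",
        "filename": filename,
        "text": wikitext,        # initial file-description wikitext
        "token": csrf_token,     # CSRF token from action=query&meta=tokens
        "ignorewarnings": "0",
        "format": "json",
    }

def upload_outcome(response):
    """Coarse classification of the API's JSON answer: 'success', 'warning'
    (e.g. duplicate or bad file name), or 'error'."""
    if "error" in response:
        return "error"
    upload = response.get("upload", {})
    if upload.get("result") == "Success":
        return "success"
    if upload.get("result") == "Warning" or "warnings" in upload:
        return "warning"
    return "error"
```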
#### B2: Export to a batch format which is ingested by another tool
Just like the QuickStatements export currently offered, we could export the batch to a file format which would embed the structured metadata, the wikitext and the media files themselves.
This file could then be uploaded in one go to Commons itself or an external tool similar to QuickStatements which would do the upload in the background.
We are not aware of any such file format, but we could design one in consultation with other stakeholders (WMF, batch upload tool maintainers, the Wikimedia Commons community).
### Development effort
This builds on top of *Goal A*: the steps listed there are prerequisites for this goal.
We need to add new fields to the schema (file path and wikitext) which only make sense for MediaInfo entities; we therefore need proper support for different entity types. It makes sense to develop this support for Wikibase in general anyway (making it easier to add support for editing Properties or Lexemes, for instance).
* *(2 months)* Refactor the Wikibase extension to allow for editing multiple types of entities. A Wikibase schema can contain Item templates, but also Property templates, MediaInfo templates, and so on. Each contains different fields (Property templates contain datatypes, MediaInfo templates contain files and wikitext).
* Develop export operations, depending on the scenario chosen:
* *(2 months)* **B1** (direct upload): this requires embedding a Java library to interface with MediaWiki's file upload API. We'll also need to rethink the error reporting of the editing operation (which is overdue anyway) since file uploads are likely to fail even more than regular Wikibase edits (larger payloads, more server-side checks).
 * *(2.5 months)* **B2** (export as a bundled file): this requires designing the format and developing the external tool which will consume it. There is more freedom in the choice of upload library, since the external tool can be written in any language.
* *(0.5 months)* Add quality assurance constraints specific to MediaInfo entities (validity of filenames, checks on Wikitext, statement values should all be existing items and not new ones…). This can be done progressively, as issues are discovered during testing.
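As one example of such a constraint, a filename check could look like the sketch below. The rejected character set and the ~240-byte length limit are approximations of MediaWiki's title rules, not an authoritative list; a real check would also cover leading/trailing whitespace, control characters and the file extension whitelist.

```python
import re

# Characters MediaWiki rejects in page titles (approximate subset).
ILLEGAL = re.compile(r'[#<>\[\]|{}]')

def filename_warnings(name):
    """Return QA warnings for a candidate Commons file name (incomplete sketch)."""
    warnings = []
    if ILLEGAL.search(name):
        warnings.append("illegal character")
    if "." not in name:
        warnings.append("missing file extension")
    if len(name.encode("utf-8")) > 240:  # assumed limit, to be confirmed
        warnings.append("name too long")
    return warnings
```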