JUNG RILKE CORRESPONDENCE NETWORK

A joint project bringing together three separate projects: the Rilke correspondence, the Jung correspondence and the ETH Library.

Objectives:

  • agree on a common metadata structure for correspondence datasets
  • clean and enrich the existing datasets
  • build a database that can be used not just by these two projects but by others as well, and that works well with visualization software for viewing correspondence networks
  • experiment with existing visualization tools

ACTUAL INPUT DATA

Comment: The Rilke data is cleaner than the Jung data. Some cleaning is needed to make the two match: 1) separate sender and receiver, then clean up and cluster the names (OpenRefine); 2) clean up the dates and convert them to the format the IT developers need (Perl); 3) clean up the placenames and match them to geolocators (DARIAH-DE); 4) match senders and receivers to Wikidata where possible (OpenRefine; problems with the data volume).

METADATA STRUCTURE

The following fields were included in the common basic data structure:

sysID; callNo; titel; sender; senderID; recipient; recipientID; place; placeLat; placeLong; datefrom; dateto; language
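
Purely as an illustration (not project code), a minimal Python sketch of how this field list could serve as a shared CSV header when exporting cleaned records; the column names come from the list above, while the function and file names are assumptions:

<code python>
import csv

# Common metadata fields agreed on by the projects (names as listed above).
FIELDS = [
    "sysID", "callNo", "titel", "sender", "senderID",
    "recipient", "recipientID", "place", "placeLat", "placeLong",
    "datefrom", "dateto", "language",
]

def write_common_csv(rows, path="letters_common.csv"):
    """Write letter records to a CSV file using the shared column layout.

    `rows` is an iterable of dicts keyed by the field names in FIELDS;
    missing keys are left empty so partially cleaned records still export.
    """
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS, restval="")
        writer.writeheader()
        for row in rows:
            writer.writerow({k: row.get(k, "") for k in FIELDS})
</code>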

DATA CLEANSING AND ENRICHMENT

Description of the steps taken and issues encountered in the process:

The main issue with the Jung correspondence is its data structure: sender and recipient are combined in a single column. The dates also need both cleaning for consistency (e.g. removal of “ca.”) and transformation to meet the developers' specifications (Basil, using Perl scripts).
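
The actual transformation is done with Perl scripts; the Python sketch below merely illustrates the two operations described above, splitting a combined sender/recipient cell and normalizing date strings. The separator and date patterns are assumptions, not the real file layout:

<code python>
import re

def split_correspondents(value):
    """Split a combined 'sender / recipient' cell into two fields.

    Assumes the two names are separated by a slash or semicolon; returns
    (sender, recipient), with recipient empty if no separator is found.
    """
    parts = re.split(r"\s*[/;]\s*", value.strip(), maxsplit=1)
    return (parts[0], parts[1]) if len(parts) == 2 else (parts[0], "")

def clean_date(value):
    """Normalize a date string for the developers' import format.

    Removes qualifiers such as 'ca.' and reorders 'DD.MM.YYYY' to ISO
    'YYYY-MM-DD'; anything unrecognized is returned unchanged for review.
    """
    value = re.sub(r"\b(ca\.?|circa)\s*", "", value.strip(), flags=re.IGNORECASE)
    m = re.match(r"^(\d{1,2})\.(\d{1,2})\.(\d{4})$", value)
    if m:
        day, month, year = m.groups()
        return f"{year}-{int(month):02d}-{int(day):02d}"
    return value
</code>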

For geocoding the placenames, OpenRefine was used to normalize the placenames and the DARIAH-DE GeoBrowser for the actual geocoding (there were some issues with handling large files). Tests combining OpenRefine with OpenStreetMap were done as well.
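
The production geocoding was done with the DARIAH-DE GeoBrowser; as a rough sketch of the same step using a different, public service, the snippet below queries the OpenStreetMap Nominatim API for a normalized placename (illustration only; usage limits and a descriptive User-Agent are the caller's responsibility):

<code python>
import requests

NOMINATIM_URL = "https://nominatim.openstreetmap.org/search"

def geocode_place(name):
    """Look up a normalized placename and return (lat, lon) or None.

    Uses OpenStreetMap's Nominatim service purely as an illustration;
    the project itself used the DARIAH-DE GeoBrowser for this step.
    """
    resp = requests.get(
        NOMINATIM_URL,
        params={"q": name, "format": "json", "limit": 1},
        headers={"User-Agent": "correspondence-network-demo"},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json()
    if not results:
        return None
    return float(results[0]["lat"]), float(results[0]["lon"])
</code>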

The C. G. Jung dataset contains sending-location information for 16,619 of its 32,127 letters; 10,271 places were georeferenced. In the Rilke dataset, all sending locations were georeferenced.

For matching senders and recipients to Wikidata Q codes, OpenRefine was used. Issues were encountered with large files and with recovering the Q codes after successful matching; scholarly expertise was also needed to identify people who lack a clear identification. The Wikidata Q codes that OpenRefine linked to seem to have disappeared? Instructions on how to add the Q codes are here: https://github.com/OpenRefine/OpenRefine/wiki/reconciliation.
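
If Q codes ever need to be recovered outside OpenRefine, one option is a direct lookup against the public Wikidata search API (wbsearchentities). A hedged sketch follows; the top hit is not necessarily the right person, so the scholarly review mentioned above is still required:

<code python>
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def search_q_code(name, language="en"):
    """Return the Q code of the top Wikidata search hit for a name, or None.

    Only illustrates the API call; ambiguous correspondents still need
    specialist review before a Q code is accepted.
    """
    resp = requests.get(
        WIKIDATA_API,
        params={
            "action": "wbsearchentities",
            "search": name,
            "language": language,
            "format": "json",
            "limit": 1,
        },
        timeout=10,
    )
    resp.raise_for_status()
    hits = resp.json().get("search", [])
    return hits[0]["id"] if hits else None
</code>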

Doing all of this at once poses some project-management challenges, since several people may be working on the same files to clean different data. All of the files need to be integrated at the end.

DATA AFTER CLEANING

https://github.com/basimar/hackathon17_jungrilke

DATABASE

Issues with the target database: the fields have been defined, and SQL databases and visualization programs are being evaluated. How, and whether, to integrate with Wikidata is still not clear.

Issues: the letters themselves are too detailed to be imported as Wikidata items, although the senders and recipients appear to have the notability and networks to make it worthwhile. The aim is to keep the options open.

While the developers build the database to be used with the visualization tool, the data is being cleaned and the Q codes are being extracted. The cleaned CSV files were converted to SQL and then to JSON.
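
A minimal sketch of that conversion chain, assuming the common field list above, an SQLite database and a single letters table; the developers' actual schema and pipeline may differ:

<code python>
import csv
import json
import sqlite3

def csv_to_sqlite(csv_path, db_path="letters.db"):
    """Load a cleaned correspondence CSV into a single SQLite table."""
    con = sqlite3.connect(db_path)
    with open(csv_path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        cols = reader.fieldnames
        col_defs = ", ".join(f"{c} TEXT" for c in cols)
        con.execute(f"CREATE TABLE IF NOT EXISTS letters ({col_defs})")
        placeholders = ", ".join("?" for _ in cols)
        con.executemany(
            f"INSERT INTO letters VALUES ({placeholders})",
            ([row[c] for c in cols] for row in reader),
        )
    con.commit()
    return con

def sqlite_to_json(con, json_path="letters.json"):
    """Dump the letters table to JSON for the visualization tools."""
    cur = con.execute("SELECT * FROM letters")
    cols = [d[0] for d in cur.description]
    records = [dict(zip(cols, row)) for row in cur.fetchall()]
    with open(json_path, "w", encoding="utf-8") as fh:
        json.dump(records, fh, ensure_ascii=False, indent=2)
</code>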

Additional issues encountered:

- Visualization: three tools are being tested: 1) Palladio (Stanford), with concerns about limits on large files; 2) Viseyes; and 3) Gephi (a sketch of a Gephi-ready edge list follows this list).

- Ensuring that the files from the different projects respect the same structure in their final, cleaned-up versions.
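
For the Gephi test in particular, the network can be supplied as plain node and edge tables. A hedged sketch, assuming records keyed by the senderID and recipientID fields of the common structure (Gephi's spreadsheet import reads CSV edge lists with Source, Target and Weight columns):

<code python>
import csv
from collections import Counter

def export_gephi_edges(letters, path="edges.csv"):
    """Write a weighted sender-to-recipient edge list for import into Gephi.

    `letters` is an iterable of dicts using the common field names; letters
    missing either ID are skipped, and parallel letters between the same
    pair are collapsed into one weighted edge.
    """
    weights = Counter(
        (row["senderID"], row["recipientID"])
        for row in letters
        if row.get("senderID") and row.get("recipientID")
    )
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh)
        writer.writerow(["Source", "Target", "Weight"])
        for (source, target), weight in weights.items():
            writer.writerow([source, target, weight])
</code>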

PARTICIPANTS

  • Flor Méchain (Wikimedia CH): working on cleaning and matching with Wikidata Q codes using OpenRefine.
  • Lena Heizman (Dodis / histHub): Mentoring with OpenRefine.
  • Hugo Martin
  • Samantha Weiss
  • Michael Gasser (Archives, ETH Library): provider of the dataset C. G. Jung correspondence
  • Irina Schubert
  • Sylvie Béguelin
  • Basil Marti
  • Jérome Zbinden
  • Deborah Kyburz
  • Paul Varé
  • Laurel Zuckerman
  • Christiane Sibille (Dodis / histHub)
  • Adrien Zemma
  • Dominik Sievi