===== Jung - Rilke Correspondence Networks =====
Joint project bringing together three separate projects: the Rilke correspondence, the Jung correspondence, and the visualization of correspondence networks.

Objectives:
  * agree on a common metadata structure for correspondence datasets
  * clean and enrich the existing datasets
  * build a database that can be used not just by these two projects but by other correspondence projects as well
  * experiment with existing visualization tools
===== Data =====
**ACTUAL INPUT DATA**

  * For Jung correspondence: …
  * For Rilke correspondence: …
4) match senders and receivers to Wikidata where possible (OpenRefine, …)
**METADATA STRUCTURE**
The following fields were included in the common basic data structure:

sysID; callNo; titel; sender; senderID; recipient; recipientID; …
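The agreed fields can be sketched as a small record type. This is a minimal illustration, not the project's actual schema: it uses only the field names listed above (the source list is truncated after recipientID), and the sample values are invented placeholders.

```python
from dataclasses import dataclass, asdict

# Sketch of the common metadata structure; field names come from the
# project notes, comments and sample values are assumptions.
@dataclass
class Letter:
    sysID: str        # system ID in the source database
    callNo: str       # archival call number
    titel: str        # title as recorded in the catalogue (German field name kept as-is)
    sender: str       # sender name as written in the source
    senderID: str     # e.g. a Wikidata Q code, once matched
    recipient: str
    recipientID: str

# Placeholder example record.
letter = Letter("1", "Hs 0000:1", "Brief", "Rainer Maria Rilke", "Q0",
                "Carl Gustav Jung", "Q0")
assert asdict(letter)["sender"] == "Rainer Maria Rilke"
```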
**DATA CLEANSING AND ENRICHMENT**
  * Description of steps and issues in the process (please correct and refine).
The main issue with the Jung correspondence is the data structure: sender and recipient are in one column.
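Splitting that combined column might look like the sketch below. The separator `" an "` ("to") is an assumption about the export format, not something documented in these notes.

```python
# Hypothetical sketch: split a combined "Sender an Recipient" value into
# separate sender and recipient fields. The " an " separator is assumed.
def split_correspondents(combined: str) -> tuple[str, str]:
    sender, _, recipient = combined.partition(" an ")
    return sender.strip(), recipient.strip()

assert split_correspondents("Rilke, Rainer Maria an Jung, Carl Gustav") == (
    "Rilke, Rainer Maria", "Jung, Carl Gustav")
```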
Dates also need both cleaning for consistency and transformation to meet developer specs (Basil is using Perl for this).
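A sketch of what such date normalization could involve is shown below. The input patterns are assumptions about typical archival metadata; the team's actual transformation was done in Perl.

```python
import re
from datetime import date

# Sketch of date cleaning: normalize a few assumed input formats to
# ISO 8601; unparseable values are returned empty for manual review.
def normalize_date(raw: str) -> str:
    raw = raw.strip()
    m = re.fullmatch(r"(\d{1,2})\.(\d{1,2})\.(\d{4})", raw)  # e.g. 3.10.1912
    if m:
        d, mth, y = map(int, m.groups())
        return date(y, mth, d).isoformat()
    if re.fullmatch(r"\d{4}", raw):                          # year only
        return raw
    return ""  # flag for manual cleaning

assert normalize_date("3.10.1912") == "1912-10-03"
assert normalize_date("1912") == "1912"
```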
For geocoding the placenames, OpenRefine was used for the normalization of the placenames and the DARIAH GeoBrowser for the actual geocoding (there were some issues with handling large files). Tests were also made with OpenRefine in combination with …
+ | |||
The C. G. Jung dataset contains sending-location information for 16,619 of 32,127 letters; 10,271 places were georeferenced. In the Rilke dataset, all sending locations were georeferenced.
+ | |||
For matching senders and recipients, OpenRefine is used to look up Wikidata Q codes; matched codes are noted, and names that still need to be created in Wikidata are listed for future use.
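The outcome of that step can be sketched as below. A local lookup table stands in for OpenRefine's Wikidata reconciliation service, and the Q codes shown are for illustration only.

```python
# Sketch of the Q-code matching step: names found in the lookup table are
# matched; the rest are collected for later creation in Wikidata.
# The table and its Q codes are illustrative, not project data.
QCODES = {"Carl Gustav Jung": "Q41532", "Rainer Maria Rilke": "Q76483"}

def match_names(names):
    matched, missing = {}, []
    for name in names:
        q = QCODES.get(name)
        if q:
            matched[name] = q
        else:
            missing.append(name)  # to be created in Wikidata later
    return matched, missing

matched, missing = match_names(["Carl Gustav Jung", "Unknown Correspondent"])
assert matched == {"Carl Gustav Jung": "Q41532"}
assert missing == ["Unknown Correspondent"]
```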
+ | |||
Doing this all at once poses some project-management challenges, since several people may be working on the same files to clean different data. All the files need to be integrated afterwards.
+ | |||
DATA after cleaning:

https://…
**DATABASE**
Issues with the target database:
How (and whether) to integrate with Wikidata is still not clear.
Issues: letters are too detailed to be imported as Wikidata items, although it looks like the senders and recipients have the notability and networks to make it worthwhile. Trying to keep options open.
While the developers are building the database to be used with the visualization tool, the data is being cleaned and Q codes are being extracted.
They took the cleaned CSV files, converted them to SQL, and then to JSON.
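That CSV → SQL → JSON pipeline can be sketched as follows. SQLite stands in for the actual target database (which is not specified in these notes), and the CSV sample is invented.

```python
import csv
import io
import json
import sqlite3

# Sketch of the CSV -> SQL -> JSON conversion, with invented sample data
# and SQLite as a stand-in for the real target database.
CSV_DATA = """sysID;sender;recipient
1;Rainer Maria Rilke;Carl Gustav Jung
2;Carl Gustav Jung;Rainer Maria Rilke
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE letters (sysID TEXT, sender TEXT, recipient TEXT)")
rows = list(csv.DictReader(io.StringIO(CSV_DATA), delimiter=";"))
conn.executemany("INSERT INTO letters VALUES (:sysID, :sender, :recipient)", rows)

# Export the table as JSON for the visualization tool.
letters = [dict(zip(("sysID", "sender", "recipient"), r))
           for r in conn.execute("SELECT sysID, sender, recipient FROM letters")]
as_json = json.dumps(letters)
assert len(letters) == 2
```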
Additional issues encountered:
  - Visualization: …
  - Ensuring that the files from the different projects respect the same structure in their final, cleaned-up versions.
===== Visualization (examples) =====
{{:…}}
Heatmap of Rainer Maria Rilke's correspondence (visualized with Google Fusion Tables)
{{:…}}
Correspondence from and to C. G. Jung, visualized as a network. The two large nodes are Carl Gustav Jung (below) and his secretary's office (above). Visualized with the tool Gephi.
===== Video of the presentation =====
{{vimeo>…}}
+ | |||
{{tag>…}}
===== Team =====
  * Flor Méchain (Wikimedia CH): working on cleaning and matching with Wikidata Q codes using OpenRefine.
  * Lena Heizman (Dodis / histHub): Mentoring with OpenRefine.
  * Hugo Martin
  * Samantha Weiss
  * Michael Gasser (Archives, ETH Library): provider of the dataset
+ | | ||
  * Sylvie Béguelin
  * Basil Marti
  * Jérome Zbinden
  * Deborah Kyburz
  * Paul Varé
  * Laurel Zuckerman
  * Christiane Sibille (Dodis / histHub)
  * Adrien Zemma
  * Dominik Sievi [[user:wdparis2017]]