Mapping Digital Humanities



Christof Schöch, Maria Hinzmann

Trier Center for Digital Humanities
Trier University, Germany

FAIR data in the Wikibase Ecosystem
Digital Humanities Conference 2025
Universidade NOVA, Lisbon, Portugal

14 Jul 2025

Overview

  • 1 – Context: LOD and Wikibase at the TCDH
  • 2 – What is Mapping Digital Humanities?
  • 3 – Highlighting EntitySchemas for validation
  • 4 – Conclusion

Context:
LOD and Wikibase at the TCDH

TCDH

  • Established in 1998 to work on the Grimms’ Deutsches Wörterbuch
  • Now a central research unit of Trier University
  • Focus areas:
    • Digital Lexicography
    • Scholarly Digital Editing
    • Research Software Engineering
    • Computational Literary Studies
    • Digital Cultural Heritage

LOD projects at TCDH

  • Mining and Modeling Text (French Enlightenment Novel; 2019–2023 => Tuesday 9am: B2)
  • Linked Open Data in the Humanities (cross-domain, 2024–2028)
    • Historical Wine Labels (=> Thursday 2pm: B2)
    • Historical biodiversity research
    • Semantic enrichment of scholarly publications
  • ‘Romantikerbriefe’ (networked scholarly edition)
  • ‘Princesses Libraries’ (private library collections)
  • Mapping Digital Humanities (institutional history of the field)

Why we’re using Wikidata / Wikibase / Wikibase.cloud

  • Why Wikidata?
    • Wikidata as a hub (disambiguation, identifiers)
    • Federation with Wikidata (in both directions)
    • Helpful community ;-)
  • Why Wikibase?
    • Great toolkit: wiki, SPARQL endpoint, QuickStatements, EntitySchema
    • Built-in collaboration, multilingualism, provenance
    • Flexible data model under our control
    • Federation is (relatively) easy (see the SPARQL sketch below)
  • Why Wikibase.cloud (sometimes)?
    • Very low entry barrier
    • Key features are present
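
A minimal sketch of what the federation point above looks like in practice: a federated SPARQL query run on a local Wikibase that pulls English labels live from Wikidata. The instance URL (example.wikibase.cloud) and the ‘Wikidata QID’ property (P3) are placeholders, not the project’s actual identifiers.

    # Federated SPARQL sketch: join local items with labels fetched from Wikidata.
    # example.wikibase.cloud and P3 ('Wikidata QID') are placeholders.
    PREFIX wdt:  <https://example.wikibase.cloud/prop/direct/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

    SELECT ?item ?wdItem ?wdLabel WHERE {
      ?item wdt:P3 ?qid .                                        # local statement linking to Wikidata
      BIND(IRI(CONCAT("http://www.wikidata.org/entity/", ?qid)) AS ?wdItem)
      SERVICE <https://query.wikidata.org/sparql> {              # remote call to the Wikidata endpoint
        ?wdItem rdfs:label ?wdLabel .
        FILTER(LANG(?wdLabel) = "en")
      }
    }
    LIMIT 10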

What is Mapping Digital Humanities?

Practical objectives of Mapping DH

  • Collect information about Digital Humanities initiatives
  • Broad geographical, temporal and domain-related scope
  • Primary classes of interest:
    • Centers, networks and research groups
    • Study programmes and training opportunities
    • Associations and their working groups
    • Journals and conference series
    • People involved in these initiatives

Strategic goals of Mapping DH

  • Respond to a need for documentation about DH
  • Provide visibility to a wide range of initiatives
  • Serve as entry point and hub about DH
    • Link out to richer information sources (DHCR, IDHC, etc.)
    • Link to authority files (Wikidata, ORCiD, ROR)
  • Allow studying the DH community
    • disciplinary drivers
    • institutional history
    • publication models

What’s currently in Mapping DH?

  • 267 centers, networks and research groups
  • 220 study programmes
  • 61 journals
  • 34 associations
  • 500+ people

What’s to be added next to Mapping DH?

  • Conference series: more complete coverage
  • Centers, associations, journals: add people in leadership roles
  • People: add year and field of Ph.D. 
  • Generally: add more reference statements

What’s currently not on the agenda

  • Individual conference contributions (IDHC)
  • Individual publications (OpenAlex)
  • Tools (TAPoR Tool Registry)
  • Projects (national databases)

Highlighting EntitySchemas for data validation

What are EntitySchemas?

  • Several mechanisms to validate Wikibase data
    • Property constraints (Wikidata)
    • EntitySchemas (Wikibase + Wikidata)
  • EntitySchemas
    • Focus on a given entity class (in subject position)
    • Define valid triple patterns: S + P + O
    • Check whether the items belonging to the class respect the patterns
  • Open-world approach: properties not covered by the schema remain valid!

How are the triple patterns defined?

  • Subject, property, object (+cardinality)
    • Subject: an entity class
    • Property: a relevant property
    • Object: data type or item class
    • Cardinality: number of values
  • Any EntitySchema can have multiple triple patterns

Example: EntitySchema for ‘Units’

  • Focus class: Q31 (Unit) and its subclasses (e.g. centers)
  • Patterns (sketched in ShEx below)
    • website, IRI, exactly one
    • abbreviation, xsd:string, zero or one
    • part of, class institution, one or two
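
The patterns listed above can be written down as an EntitySchema in ShEx compact syntax, roughly as in the following sketch. The instance URL (example.wikibase.cloud) and all property and item identifiers in the code are placeholders, not the project’s actual IDs.

    # ShEx sketch of the 'Unit' EntitySchema described above.
    # example.wikibase.cloud and the P-/Q-identifiers are placeholders.
    PREFIX wd:  <https://example.wikibase.cloud/entity/>
    PREFIX wdt: <https://example.wikibase.cloud/prop/direct/>
    PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

    start = @<#Unit>

    <#Unit> {
      wdt:P10 IRI ;                     # website: exactly one IRI
      wdt:P11 xsd:string? ;             # abbreviation: zero or one string
      wdt:P12 @<#Institution> {1,2}     # part of: one or two institutions
    }

    <#Institution> {
      wdt:P1 [ wd:Q100 ]                # instance of -> 'institution' (placeholder class)
    }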

EntitySchema validation in action

  1. EntitySchema (defined in the Wikibase)
  2. Online validator (a separate tool)
  3. SPARQL query (defines the items to check; see the sketch below)
  4. Validation results (items with errors flagged)
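
Step 3 above can be as simple as the following SPARQL sketch, which collects all items of the focus class (Q31, ‘Unit’, including its subclasses) so the validator knows which items to check. The instance URL and the ‘instance of’ / ‘subclass of’ property IDs (P1/P2) are placeholders.

    # SPARQL sketch: items to validate against the 'Unit' EntitySchema.
    # example.wikibase.cloud, P1 ('instance of') and P2 ('subclass of') are placeholders.
    PREFIX wd:  <https://example.wikibase.cloud/entity/>
    PREFIX wdt: <https://example.wikibase.cloud/prop/direct/>

    SELECT ?item WHERE {
      ?item wdt:P1/wdt:P2* wd:Q31 .    # instance of / subclass of* -> Unit (Q31)
    }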

Uses, advantages, limitations

  • Machine-readable documentation of our data model
  • Improved completeness and coherence of our data
  • Assurance that SPARQL queries return (all) intended results
  • Limitation: not deeply integrated into Wikibase (still clunky)

Conclusion

Achievements so far

  • Solid basis for the data model
  • Good coverage for centers, associations, journals
  • Today is the first public presentation (soft launch)

Challenges and next steps

  • Expanding coverage and deeper information
    • Activities in Africa, Asia, South America
    • Dig deeper into the past
  • Keeping the Wikibase up to date
  • Adjusting the data model as needs develop

Solution / next steps (in my opinion)

  • Build up a team / small community around Mapping DH
  • Find regional ambassadors / contributors
  • Train team members to work on the Wikibase

Thank you!
