DanNet - Danish WordNet with rich lexical relationships and SPARQL access.
DanNet is a WordNet for the Danish language. It uses RDF as its native representation at the database level, in the application space, and as its primary serialisation format.
DanNet is available in multiple formats to maximise compatibility:
| Format | Description |
|---|---|
| RDF (Turtle) | Native representation. Load into any RDF graph database (such as Apache Jena) and query with SPARQL. |
| CSV | Published with column metadata as CSVW. |
| WN-LMF | XML format compatible with Python libraries like wn. |
For example, using the wn library:

```python
import wn

wn.add("dannet-wn-lmf.xml.gz")
for synset in wn.synsets('kage'):
    print((synset.lexfile() or "?") + ": " + (synset.definition() or "?"))
```

While every format includes all synsets/senses/words, the CSV and WN-LMF variants do not include every data point.
For the complete dataset, use the RDF format or browse at wordnet.dk.
Several companion datasets expand the RDF graph with additional data:
| Dataset | Description |
|---|---|
| COR | Links DanNet resources to IDs from the COR project. |
| DDS | Adds sentiment data to DanNet resources. |
| OEWN extension | Provides DanNet-style labels for the Open English WordNet to facilitate browsing connections between the two datasets. |
Additional data is implicitly inferred from the base dataset, companion datasets, and ontological metadata. These inferences can be browsed at wordnet.dk. Releases containing fully inferred graphs are specifically marked as such.
DanNet is based on the Ontolex-lemon standard combined with relations defined by the Global Wordnet Association as used in the official GWA RDF standard.
| Ontolex-lemon class | Represents |
|---|---|
| ontolex:LexicalConcept | Synsets |
| ontolex:LexicalSense | Word senses |
| ontolex:LexicalEntry | Words |
| ontolex:Form | Forms |
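The four classes link together in a chain from word to synset. A hand-written Turtle sketch of that chain, using standard Ontolex-lemon properties (the specific IDs are made up for illustration; real DanNet IDs differ):

```turtle
@prefix dn:      <https://wordnet.dk/dannet/data/> .
@prefix ontolex: <http://www.w3.org/ns/lemon/ontolex#> .

dn:word-1 a ontolex:LexicalEntry ;
    ontolex:canonicalForm dn:form-1 ;      # the word's canonical written form
    ontolex:evokes dn:synset-1 .           # the entry evokes a concept

dn:form-1 a ontolex:Form .

dn:sense-1 a ontolex:LexicalSense ;
    ontolex:isSenseOf dn:word-1 ;          # the sense belongs to the entry
    ontolex:isLexicalizedSenseOf dn:synset-1 .

dn:synset-1 a ontolex:LexicalConcept .
```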

| Prefix | URI | Purpose |
|---|---|---|
| dn | https://wordnet.dk/dannet/data/ | Dataset instances |
| dnc | https://wordnet.dk/dannet/concepts/ | Ontological type members |
| dns | https://wordnet.dk/dannet/schema/ | Schema definitions |
All DanNet URIs resolve to HTTP resources. Accessing one of these URIs via a GET request returns the data for that resource.
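A minimal sketch with Python's standard library, assuming the server honours `Accept: text/turtle` for the RDF representation (the synset ID below is made up, and the actual network call is left commented out):

```python
from urllib.request import Request

# Hypothetical resource URI in the dn: namespace; any dn: resource
# should resolve the same way.
uri = "https://wordnet.dk/dannet/data/synset-1876"

# Ask for Turtle rather than the default HTML representation.
req = Request(uri, headers={"Accept": "text/turtle"})

# To actually fetch the data:
# from urllib.request import urlopen
# print(urlopen(req).read().decode("utf-8"))

print(req.full_url, "->", req.get_header("Accept"))
```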
DanNet defines its own proprietary relations in the DanNet schema in an Ontolex-compatible way. There is also a schema for EuroWordNet concepts. Both schemas follow the RDF conventions listed by Philippe Martin.
DanNet can be connected to AI tools like Claude via MCP (Model Context Protocol).
The MCP server is available at https://wordnet.dk/mcp (`io.github.kuhumcst/dannet`).

To connect in e.g. Claude Desktop: go to Settings > Connectors > Browse Connectors, click "add a custom one", then enter a name (e.g., "DanNet") and the MCP server URL.

Once connected, you can query DanNet's semantic relations directly through Claude.
The database backend is Apache Jena, a mature RDF triplestore with OWL inference support. When represented in Jena, DanNet's relations form a queryable knowledge graph. DanNet is developed in Clojure, using libraries like Aristotle to interact with Jena.
See rationale.md for more on the design decisions.
The production deployment at wordnet.dk consists of three services managed via Docker Compose.
DanNet can be queried in various ways from Clojure (see queries.md). Apache Jena transactions are built-in and enable persistence via the TDB 2 layer.
The frontend is written in ClojureScript using Rum, served by Pedestal. The app works both as a single-page application (with JavaScript) and as a regular HTML website (without). Content negotiation serves different representations (HTML, RDF, Transit+JSON) based on the request.
See doc/web.md for details.
New releases are bootstrapped from the preceding release. The process (in dk.cst.dannet.db.bootstrap):
Bootstrap data should be located in `./bootstrap` relative to the execution directory.
DanNet requires Java and Clojure's official CLI tools. Dependencies are specified in deps.edn.
To start the backend, run `(restart)` in dk.cst.dannet.web.service — the service is then available at localhost:3456. To watch and rebuild the frontend:

```shell
npx shadow-cljs watch app
```

Using Docker (requires Docker daemon running):
```shell
# From the docker/ directory
docker compose up --build
```

Or manually:
```shell
shadow-cljs --aliases :frontend release app
clojure -T:build org.corfield.build/uber :lib dk.cst/dannet :main dk.cst.dannet.web.service :uber-file "\"dannet.jar\""
java -jar -Xmx4g dannet.jar
```

The system uses ~1.5 GB of RAM when idle and ~3 GB when rebuilding the database; a server should have at least 4 GB of available RAM.
To validate the WN-LMF export:

```shell
python3 -m venv examples/venv
source examples/venv/bin/activate
python3 -m pip install wn
python -m wn validate --output-file examples/wn-lmf-validation.json export/wn-lmf/dannet-wn-lmf.xml
```

The production server at wordnet.dk runs as a systemd service delegating to Docker.
```shell
cp system/dannet.service /etc/systemd/system/dannet.service
systemctl enable dannet
systemctl start dannet
```

To update the web service software without changing the database:
```shell
# From the docker/ directory
docker compose up -d dannet --build
```

When releasing a new version of the database:
Build the database locally via REPL in dk.cst.dannet.web.service:

```clojure
(restart)
;; Then in dk.cst.dannet.db:
(export-rdf! @dk.cst.dannet.web.resources/db)
(export-csv! @dk.cst.dannet.web.resources/db)
(export-wn-lmf! "export/wn-lmf/")
```

Stop the service on production:
```shell
docker compose stop dannet
```

Transfer database and export files via SFTP, then:
```shell
unzip -o tdb2.zip -d /dannet/db/
mv cor.zip dannet.zip dds.zip oewn-extension.zip /dannet/export/rdf/
mv dannet-csv.zip /dannet/export/csv/
mv dannet-wn-lmf.xml.gz /dannet/export/wn-lmf/
```

Restart:
```shell
docker compose up -d dannet --build
```