The influence of the widespread adoption of JSON-LD-based international standards.
By Noel Vizcaino
De jure or de facto international standards
- Set the stage for development to happen. They are prerequisites.
- They have staying power, e.g. Dublin Core and ISO 19115.
- Beyond codifying domain knowledge, they bring structure to the current environment.
- There are already massive amounts of cross-pollinated, compliant data that cannot be ignored.
Metadata serialisation
- XML standards played a role and are still strong, but...
- they have largely been replaced by JSON, particularly for APIs.
- A downgrade in disguise: an easier learning curve, and thus broad adoption...
- but luckily something happened ...
W3C JSON-LD: foundational standard
- An RDF serialisation that is valid JSON, and thus compatible with existing JSON tooling.
- Major search engines agreed to make it the future of web metadata.
- Web metadata: both schema.org (recommended) and DCAT are accepted. Other standards have settled on these two.
- Google's Knowledge Graph emerged, as did others.
- Along with a JSON-LD context document, any JSON becomes semantic linked data.
- Modularity and extension by addition favour specialist domains.
- The most advanced RDF serialisation yet, AFAIK. (Note: YAML-LD will follow.)
- A graph as the most flexible schema.
- Perfectly suited for metadata.
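The point above about any JSON becoming linked data can be sketched concretely. This is a minimal illustration (the record and its fields are invented; the schema.org IRIs are real terms) of how adding a `@context` turns a plain JSON record into JSON-LD:

```python
import json

# A plain JSON record, as any API might return it (illustrative data).
record = {"name": "Sea surface temperature 2023", "creator": "Example Org"}

# Adding a @context maps each key to an unambiguous IRI (here schema.org
# terms), turning the same record into semantic linked data. Existing JSON
# tooling still parses it unchanged.
jsonld_doc = {
    "@context": {
        "name": "https://schema.org/name",
        "creator": "https://schema.org/creator",
    },
    "@type": "https://schema.org/Dataset",
    **record,
}

print(json.dumps(jsonld_doc, indent=2))
```

Consumers that ignore `@context` see ordinary JSON; JSON-LD processors see RDF.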
JSON-LD momentum: a reality today
- Used by W3C in many standards.
- FAIR by design.
- Google Dataset Search and the Google Knowledge Graph.
- The entire US administration, at all levels (via DCAT-US).
- CERN Open Data portal.
- Financial: Bloomberg professional services / hypermedia API.
- Geospatial: GeoJSON -> GeoJSON-LD -> OGC Earth Observation GeoJSON-LD.
- Smart cities: FIWARE / ETSI NGSI-LD.
- International Data Spaces: data sovereignty (IDS information model); a brokered-architecture standardisation effort.
- Mappings: ISO 19115 (geospatial) <-> DCAT <-> schema.org. None can be ignored.
- Life sciences success: bioschemas.org
JSON-LD core ecosystem
- More than just data or metadata.
- The serialisation processing is also standardised (W3C JSON-LD Processing Algorithms and API).
- Shaping JSON-LD data is standardised too: W3C JSON-LD Framing.
- Many data views, independent of the original data shape.
- Ready for ingestion into many databases, though RDF-aware ones are preferred (to offload processing and storage).
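To make the "processing" bullet concrete, here is a deliberately naive sketch of what a JSON-LD processor does at its core: mapping keyed JSON values to RDF triples via the context. Real processors (per the W3C spec) also handle nesting, lists, datatypes and IRI resolution; this toy function does not, and the example document is invented:

```python
def to_triples(node, context):
    """Naively map a flat JSON-LD node to (subject, predicate, object)
    triples. A real JSON-LD processor handles nesting, @list, datatypes
    and relative IRI resolution; this sketch covers only the flat case."""
    subject = node.get("@id", "_:b0")  # use a blank node if no @id
    triples = []
    for key, value in node.items():
        if key.startswith("@"):  # skip JSON-LD keywords
            continue
        predicate = context.get(key, key)  # context maps keys to IRIs
        triples.append((subject, predicate, value))
    return triples

doc = {
    "@context": {"name": "https://schema.org/name"},
    "@id": "https://example.org/dataset/1",
    "name": "Sample dataset",
}
triples = to_triples(doc, doc["@context"])
print(triples)
# -> [('https://example.org/dataset/1', 'https://schema.org/name', 'Sample dataset')]
```

Once data is triples, any RDF-aware database can store and query it natively.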
Software Engineering: a Copernican moment for practitioners
- Where Clean/Hexagonal/Onion architecture meets Domain-Driven Design.
- Hard dependencies used to point towards the lowest levels of abstraction, at the bottom.
- Now the core, the domain model, sits at the centre.
- The presentation layer, databases, object stores and networking move to the exterior ring.
- The core is expected to be (more) stable.
- The idea is to shield the architecture from disruptive change, among other benefits.
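A minimal sketch of that inversion, with invented names: the domain core defines an entity and a port (an abstraction it depends on), while the concrete storage adapter lives on the exterior ring and can be swapped without touching the core:

```python
from dataclasses import dataclass
from typing import Protocol

# Domain core: stable, no knowledge of storage or transport.
@dataclass
class Dataset:
    identifier: str
    title: str

# Port: the interface the core depends on, not a concrete database.
class DatasetRepository(Protocol):
    def save(self, dataset: Dataset) -> None: ...
    def get(self, identifier: str) -> Dataset: ...

# Adapter on the exterior ring: one interchangeable implementation.
# Replacing it with an RDF store leaves the domain core unchanged.
class InMemoryDatasetRepository:
    def __init__(self) -> None:
        self._items: dict[str, Dataset] = {}

    def save(self, dataset: Dataset) -> None:
        self._items[dataset.identifier] = dataset

    def get(self, identifier: str) -> Dataset:
        return self._items[identifier]

repo: DatasetRepository = InMemoryDatasetRepository()
repo.save(Dataset("ds-1", "Sample"))
print(repo.get("ds-1").title)
```

The hard dependency now points inward: the adapter conforms to the core's port, not the other way around.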
Software Engineering and Ontologies
- Ontologies are an integral part of the scientific software development lifecycle.
- They drive complexity out of the domain core.
- Also part of the functional end product.
- Ontology standards (bringing the domain terminology) will affect the source code, the design (and its modularity), the queries, the metadata and data, etc.
- True interdisciplinary collaboration becomes essential.
- The Unix philosophy could be applied to ontology engineering: minimalism and conciseness.
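One hypothetical way ontology terminology reaches the source code: pin domain terms to ontology IRIs in a single shared vocabulary, so serialisation, metadata and queries all use the same names. The function and term table below are invented for illustration; the schema.org IRIs are real:

```python
# Shared vocabulary: domain terms pinned to ontology IRIs in one place.
SCHEMA = "https://schema.org/"

TERMS = {
    "title": SCHEMA + "name",
    "license": SCHEMA + "license",
}

def to_jsonld(record: dict) -> dict:
    """Serialise a domain record as JSON-LD, deriving the @context
    from the shared vocabulary so every layer speaks the same terms."""
    context = {key: TERMS[key] for key in record if key in TERMS}
    return {"@context": context, **record}

doc = to_jsonld({"title": "Ocean model output", "license": "CC-BY-4.0"})
print(doc)
```

Keeping the term table small and single-purpose is the Unix-style minimalism the bullet above alludes to.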
JSON-LD as metadata
- For domain-intensive inter-layer communication and eventual storage.
- RDF-aware databases abound, as do other graph-related technologies. JSON-LD is just the beginning.
- It becomes relevant for querying, communication and storage support.
- Regulatory compliance will play a role. Each domain has different challenges.
- An ISO graph query standard (GQL) is being developed by the same committee as SQL, beyond W3C SPARQL and Apache TinkerPop Gremlin.
- REST- and GraphQL-friendly. In the case of the latter, there are overlaps.
- APIs should be simple and stable.
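As a sketch of the querying side: once JSON-LD metadata lands in an RDF-aware store, a standard SPARQL query retrieves it over the SPARQL 1.1 Protocol. The endpoint URL is hypothetical; the DCAT/Dublin Core prefixes and the protocol usage are standard. The network call is shown but not executed:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint; any SPARQL 1.1 Protocol service would do.
ENDPOINT = "https://example.org/sparql"

# List DCAT datasets and their titles (real vocabularies: dcat:, dcterms:).
QUERY = """
PREFIX dcat: <http://www.w3.org/ns/dcat#>
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?dataset ?title WHERE {
  ?dataset a dcat:Dataset ;
           dcterms:title ?title .
}
"""

def run_query(endpoint: str, query: str) -> dict:
    """POST a query per the SPARQL 1.1 Protocol and parse JSON results."""
    data = urllib.parse.urlencode({"query": query}).encode()
    req = urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# run_query(ENDPOINT, QUERY)  # requires a live endpoint, so not run here
```

The same data could equally be served through a REST or GraphQL facade; the point of the slide is that the query layer stays simple and stable regardless.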