Unwind Data
Semantic Layer

Semantic Interoperability: The Open Standard Connecting Your Entire Data Stack

Semantic interoperability enables different systems, tools, and AI agents to share business logic consistently. The Open Semantic Interchange (OSI) specification is the vendor-neutral standard making it happen across your entire data stack.


Semantic interoperability is the ability for different systems, tools, and AI agents to share and interpret business logic consistently without manual translation. It means your definition of "revenue" in dbt compiles the same way in Snowflake, renders the same result in Omni, and returns a trustworthy answer when an AI agent queries it through natural language.

Interconnected data systems sharing semantic definitions across a modern data stack

The semantic layer has been discussed for years, but mostly in the context of BI tools. Looker had LookML. Power BI had its proprietary semantic model. Tableau had data sources. Each one defined business logic, but only for its own consumption. The moment data left that tool, the definitions stayed behind.

That era is ending. The Open Semantic Interchange (OSI) specification, released as v1.0 in January 2026 under an Apache 2.0 license, creates a vendor-neutral format for semantic definitions that travels across your entire data stack. From the warehouse to the transformation layer to BI to AI agents. One standard. Everywhere.

What is semantic interoperability

Semantic interoperability goes beyond technical connectivity. Your data warehouse can already talk to your BI tool through SQL. That is technical interoperability. The problem is that SQL carries no business context. A query returns rows and columns, but it does not tell the consuming system what "active customer" means, how "churn" is calculated, or which revenue accrual method applies to Q4.

Semantic interoperability solves this by encoding business logic into a portable, machine-readable format that any tool in the stack can interpret. When two systems achieve semantic interoperability, they do not just exchange data. They exchange meaning.

This distinction matters enormously for AI agents. A foundation model can write SQL. It cannot infer that your organization calculates net revenue differently from gross revenue, or that "active" means "logged in within 30 days" for the product team and "made a purchase within 90 days" for the finance team. Without semantic interoperability, AI agents produce answers that are syntactically correct and semantically wrong.
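The ambiguity is easy to demonstrate. In the sketch below (table and column names are hypothetical), an agent answering "how many active customers?" produces syntactically valid SQL either way, but the two queries answer different questions:

```python
# Hypothetical example: one business term, "active customer",
# resolved against two different team-level definitions.
DEFINITIONS = {
    "product": "last_login >= CURRENT_DATE - INTERVAL '30 days'",
    "finance": "last_purchase >= CURRENT_DATE - INTERVAL '90 days'",
}

def active_customers_sql(team: str) -> str:
    """Build the query an agent might generate for 'how many active customers?'."""
    predicate = DEFINITIONS[team]
    return f"SELECT COUNT(*) FROM customers WHERE {predicate}"

# Both queries are valid SQL; without a shared semantic definition,
# nothing tells the agent which one the business actually means.
print(active_customers_sql("product"))
print(active_customers_sql("finance"))
```

Semantic interoperability removes the ambiguity by making one governed definition the input to every query, rather than leaving the choice to whichever system answers.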

Where semantic definitions live today

The reason semantic interoperability has been so hard to achieve is that business logic is scattered across three distinct layers of the data stack. Each layer has its own way of defining meaning, and none of them naturally share those definitions with the others.

The data warehouse

Snowflake, BigQuery, and Databricks all store data, but they also increasingly store semantic context. Snowflake's Horizon Catalog introduced Semantic Views that let teams define business logic once and compile it across query engines. Databricks Unity Catalog governs datasets, lineage, and access controls. BigQuery has metadata services and policy tags. These warehouse-level semantics are powerful, but they are tied to the platform. A Semantic View in Snowflake does not automatically translate to Databricks.

The transformation layer

dbt is where most modern data teams define their business logic today. Models, tests, and documentation in dbt describe what the data means, how it should be joined, and what quality standards it must meet. dbt's Semantic Layer with MetricFlow takes this further by defining metrics as code: dimensions, measures, time grains, and relationships all declared in YAML. This is arguably the richest source of business logic in most data stacks, but until recently, those definitions were locked inside dbt. Learn more about how [the dbt Semantic Layer fits into modern data architecture](https://unwinddata.com/what-is-a-semantic-layer) and why centralizing metric definitions in your transformation layer is the most OSI-ready approach available today.

The BI layer

BI tools have always maintained their own semantic models. LookML in Looker defines joins, dimensions, and measures. Omni builds on this with a shared modeling layer. ThoughtSpot's Spotter Semantics translates business context for AI agents. Power BI has its own tabular model with DAX calculations. Each tool re-implements business logic from scratch, often leading to subtle differences that erode trust in reporting.

The result is that a single organization might have the same metric defined in three different places with three slightly different calculations. Data engineers know the dbt version is correct. Analysts trust the BI version. The AI agent picks whichever one it finds first. This is the semantic interoperability problem.

The Open Semantic Interchange specification

The Open Semantic Interchange (OSI) is an open-source initiative that creates a universal, vendor-neutral specification for sharing semantic model definitions across the entire data stack. Released as v1.0 on January 27, 2026, the specification is hosted on GitHub under an Apache 2.0 license.

OSI defines a standard format for the building blocks of business logic:

Semantic models are the top-level containers. A semantic model represents a complete set of business definitions for a domain, including all datasets, relationships, and metrics within it.

Datasets represent logical business entities. Think of them as the fact and dimension tables that your business logic operates on. Customer, Order, Product, Subscription. Each dataset contains fields and defines the structure of the data.

Fields are the row-level attributes within datasets. They can be used for grouping, filtering, and as building blocks for metric expressions. A field might be "signup_date" or "country" or "plan_type."

Metrics are the quantitative measures that matter to the business. Revenue, churn rate, customer lifetime value, average order value. OSI defines these as calculations with explicit expressions, so every consuming tool computes them identically.

Relationships define how datasets connect through keys. They support both simple and composite key relationships, allowing complex multi-table logic to be expressed portably.

Critically, the expression objects within OSI support multiple SQL dialects. This means a metric defined once can compile correctly in Snowflake SQL, Spark SQL, BigQuery Standard SQL, and Trino. The specification handles the dialect translation, not the individual tool.
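A rough sketch of how these building blocks fit together, with per-dialect expressions on the metric. The class and field names here are illustrative assumptions, not the actual OSI v1.0 schema:

```python
from dataclasses import dataclass

# Illustrative model of the OSI building blocks described above.
# Names are assumptions for the sketch, not the real specification.

@dataclass
class Field:
    name: str                    # e.g. "signup_date", "country", "plan_type"

@dataclass
class Dataset:
    name: str                    # logical business entity, e.g. "Order"
    fields: list[Field]

@dataclass
class Metric:
    name: str
    # One expression per SQL dialect, so each engine compiles it natively.
    expressions: dict[str, str]

@dataclass
class Relationship:
    left: str                    # dataset.field on the "many" side
    right: str                   # dataset.field on the "one" side

@dataclass
class SemanticModel:
    name: str
    datasets: list[Dataset]
    metrics: list[Metric]
    relationships: list[Relationship]

revenue = Metric(
    name="net_revenue",
    expressions={
        "snowflake": "SUM(amount) - SUM(refunds)",
        "bigquery":  "SUM(amount) - SUM(refunds)",
    },
)

model = SemanticModel(
    name="sales",
    datasets=[
        Dataset("Order", [Field("amount"), Field("refunds")]),
        Dataset("Customer", [Field("country")]),
    ],
    metrics=[revenue],
    relationships=[Relationship("Order.customer_id", "Customer.id")],
)
```

The point of the structure is that every consuming tool reads the same `net_revenue` expression rather than maintaining its own copy.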

Data engineers and analysts collaborating on a unified semantic model definition

Who is building the standard

The OSI partner list reveals how seriously the industry is taking semantic interoperability. This is not a single-vendor initiative. It is a cross-stack collaboration spanning warehouses, transformation tools, BI platforms, data catalogs, governance platforms, and financial institutions.

Data platforms: Snowflake, Databricks, Firebolt, and Starburst provide the compute and storage layer. Their participation means OSI definitions can be compiled natively within the warehouse, not just consumed externally.

Transformation: dbt Labs is a founding partner and has open-sourced MetricFlow as part of their OSI commitment. This connects the transformation layer directly to the standard. Coalesce, another dbt ecosystem player, has also joined.

BI and analytics: ThoughtSpot, Omni, Sigma, Preset, Hex, and Lightdash represent the analytics consumption layer. Their adoption means dashboards, explorations, and AI-powered analytics all read from the same semantic definitions.

Data governance and catalogs: Alation, Atlan, Collibra, Select Star, and DataHub bring the governance angle. When these platforms adopt OSI, data lineage, access policies, and quality rules can reference the same semantic definitions that BI tools and AI agents use.

AI and enterprise: Mistral AI, Salesforce, and Elementum AI represent the agentic and enterprise application layer. Salesforce's participation is particularly telling. They published a blog titled "The Agentic Future Demands an Open Semantic Layer," directly linking OSI to the performance of AI agents on enterprise data.

Financial services: BlackRock and JPMC joined the working group, signaling that even heavily regulated industries see semantic interoperability as critical infrastructure for AI governance and compliance.

Data virtualization: Denodo, which joined in March 2026, brings data virtualization into the standard. Their platform creates logical data views across multiple physical sources, and OSI ensures those virtual views carry consistent semantic context.

Others: JetBrains, Qlik, Informatica, AtScale, Credible, Blue Yonder, Domo, and RelationalAI round out the ecosystem, covering everything from IDE tooling to supply chain analytics.

How semantic interoperability changes the data stack

When OSI reaches broad adoption, the architecture of the modern data stack shifts in meaningful ways.

Define once, use everywhere

A data engineering team defines metrics in dbt using MetricFlow. Those definitions export to OSI format. Snowflake imports them into Horizon Catalog. ThoughtSpot reads them for natural language analytics. An AI agent consults them before writing a query. Nobody re-implements the calculation. Nobody maintains a separate version. The single source of truth is literal, not aspirational.
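In code, "define once, use everywhere" reduces to every consumer compiling from one shared definition. A minimal sketch (the metric format and the two-dialect renderer are hypothetical, not the real OSI file layout):

```python
# One shared metric definition, consumed by every tool in the stack.
# The structure is illustrative, not the actual OSI format.
METRIC = {
    "name": "monthly_revenue",
    "dataset": "orders",
    "expression": "SUM(amount)",
    "time_grain": "month",
    "time_field": "order_date",
}

def compile_metric(metric: dict, dialect: str) -> str:
    """Render the same definition for different engines.

    DATE_TRUNC takes ('month', col) in Snowflake but (col, MONTH) in BigQuery;
    the compiler absorbs the dialect difference so consumers never do."""
    trunc = {
        "snowflake": f"DATE_TRUNC('{metric['time_grain']}', {metric['time_field']})",
        "bigquery":  f"DATE_TRUNC({metric['time_field']}, {metric['time_grain'].upper()})",
    }[dialect]
    return (f"SELECT {trunc} AS period, {metric['expression']} AS {metric['name']} "
            f"FROM {metric['dataset']} GROUP BY 1")

# A BI tool, a notebook, and an AI agent all call compile_metric();
# none of them maintains a separate version of the calculation.
print(compile_metric(METRIC, "snowflake"))
print(compile_metric(METRIC, "bigquery"))
```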

Governance follows the logic

Today, data governance is applied at the platform level. Snowflake governs access to Snowflake tables. Looker governs access to Looker explores. OSI enables governance at the semantic level. An access policy on the "revenue" metric can travel with the metric definition across every tool that consumes it. Gartner predicts that by 2030, 50% of organizations will use autonomous AI agents to interpret governance policies into machine-verifiable data contracts. OSI is the format those contracts will be written in. For a deeper look at how governance layers intersect with semantic definitions, see our guide on [data governance best practices for the modern data stack](https://unwinddata.com/what-is-data-governance).

AI agents get business context

This is the driving force behind the convergence. 88% of companies use AI in at least one business function, but only 1 in 10 have scaled AI agents beyond pilot. MIT Technology Review reported in March 2026 that the bottleneck is not model capability. It is data architecture. Agents lack the business context to deliver trustworthy answers at scale.

OSI solves this by giving agents a machine-readable vocabulary of business terms, metric calculations, and data relationships. Instead of inferring what "churn" means from column names, the agent reads the OSI definition. Instead of guessing how tables join, the agent follows declared relationships. The result is deterministic, auditable, explainable AI output.
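The difference between guessing and reading a definition can be sketched like so (the lookup structure and metric expression are hypothetical):

```python
# Hypothetical governed semantic definitions an agent consults before writing SQL.
SEMANTIC_MODEL = {
    "metrics": {
        "churn_rate": {
            "expression": "COUNT_IF(status = 'churned') / COUNT(*)",
            "dataset": "subscriptions",
        },
    },
    "relationships": {
        ("subscriptions", "customers"): "subscriptions.customer_id = customers.id",
    },
}

def resolve_metric(term: str) -> dict:
    """Deterministic lookup: either a governed definition exists, or we refuse.

    An agent without this step would infer the calculation from column names,
    which is exactly the failure mode described above."""
    metric = SEMANTIC_MODEL["metrics"].get(term)
    if metric is None:
        raise KeyError(f"No governed definition for '{term}'; refusing to guess.")
    return metric

churn = resolve_metric("churn_rate")
sql = f"SELECT {churn['expression']} AS churn_rate FROM {churn['dataset']}"
```

Because the expression and the join path come from declared definitions, the generated query is reproducible and auditable rather than a per-request inference.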

Platform migration becomes portable

One of the most practical benefits of semantic interoperability is that switching platforms no longer means rebuilding your business logic. Moving from one warehouse to another, or from one BI tool to another, becomes a matter of importing the same OSI file. The logic travels with you. This reduces migration cost, shortens timelines, and removes vendor lock-in as a strategic risk.

The OSI roadmap

Phase 1 of the initiative ran from Q4 2025 through Q1 2026 and covered specification finalization, reference implementations, and community governance. This phase is largely complete with the v1.0 release and the formation of the working group.

Phase 2 runs from Q2 through Q4 2026. The targets are native support in 50+ platforms, domain-specific extensions for verticals like finance and healthcare, and pilot programs with early adopters. dbt Labs is already shipping OSI compatibility through MetricFlow. Snowflake has embedded it in Horizon Catalog. ThoughtSpot launched Spotter Semantics with OSI support in March 2026.

By 2027, the initiative aims for de facto standard status. Given the breadth of the partner list and the speed of adoption, that timeline looks realistic.

What data teams should do now

Semantic interoperability is not a future problem. The standard exists. The tools are adopting it. Data teams that start aligning now will have a structural advantage over those that wait.

Audit where your business logic lives. Map every place where metrics, dimensions, and business rules are defined. The warehouse, the transformation layer, BI tools, spreadsheets, and application code. Identify duplicates and conflicts. This audit is the starting point for any semantic interoperability initiative.

Centralize definitions in the transformation layer. dbt with MetricFlow is the most OSI-ready tool available today. If your business logic lives in BI tool configurations or warehouse views, consider migrating it to dbt where it can be version-controlled, tested, and exported to OSI format.

Evaluate your BI tool's OSI support. Omni, ThoughtSpot, Sigma, Preset, Hex, and Lightdash are all OSI partners. If your current BI tool is not on the list, it does not mean you need to switch today. But it should factor into your next evaluation cycle.

Treat semantic definitions as infrastructure. Business logic should be code-reviewed, version-controlled, and tested like any other infrastructure component. The days of defining metrics through a BI tool's point-and-click interface and hoping everyone uses the same one are ending.

Prepare your AI strategy. If you are building or deploying AI agents that query enterprise data, semantic interoperability is a prerequisite for trustworthy output. An agent without access to governed semantic definitions is an agent that will confidently deliver wrong answers. Companies with mature data governance see 24% higher revenue from AI. Semantic interoperability is how you operationalize that maturity.
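The "treat semantic definitions as infrastructure" recommendation above is concrete enough to automate. A minimal sketch of a CI-style consistency check (the tool inventory and metric names are hypothetical) that flags the same metric defined differently in different places:

```python
# Hypothetical inventory of how each tool currently defines "net_revenue".
DEFINITIONS_BY_TOOL = {
    "dbt":    {"net_revenue": "SUM(amount) - SUM(refunds)"},
    "looker": {"net_revenue": "SUM(amount) - SUM(refunds)"},
    "sheets": {"net_revenue": "SUM(amount)"},  # drifted copy
}

def find_conflicts(defs: dict) -> dict:
    """Return metrics whose expression differs across tools."""
    by_metric: dict[str, set[str]] = {}
    for tool, metrics in defs.items():
        for name, expr in metrics.items():
            by_metric.setdefault(name, set()).add(expr)
    return {name: exprs for name, exprs in by_metric.items() if len(exprs) > 1}

conflicts = find_conflicts(DEFINITIONS_BY_TOOL)
# In CI this would fail the build until the drifted definition is reconciled:
# assert not conflicts, f"Conflicting metric definitions: {conflicts}"
```

Running a check like this during the audit step surfaces exactly the duplicates and conflicts that a move to a shared OSI definition would eliminate.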

The semantic layer used to be a feature inside a BI tool. Then it became a dedicated layer in the data stack. Now it is becoming an open, portable standard that connects every system touching your data. That progression is not slowing down. The question is whether your data architecture is ready for it.


Speak with a data expert

We've helped scale-ups and enterprises across Europe move faster on exactly this kind of work — without the trial and error. Strategy, architecture, and hands-on delivery.

Schedule a consultation