BI Migration Approach: What Actually Works and What Breaks
Most BI migrations fail because teams migrate dashboards instead of fixing the logic underneath them. Here is an honest account of what works, what breaks, and why the semantic layer is where every migration should start.
80% of BI migration projects run over time or over budget. That number is not surprising to anyone who has been inside one. What is surprising is that the failures are almost always the same failure, repeated across different organizations, different tools, and different teams.
The failure is this: the team treats the BI migration approach as a dashboard-moving exercise. They inventory the old tool, catalog every report, assign owners, and start rebuilding dashboards in the new platform. Six months later they have the same inconsistent metric definitions, the same conflicting revenue numbers, the same trust problems — just on newer infrastructure.
Moving dashboards is not a BI migration approach. It is a copy-paste operation with a six-month timeline.
A real BI migration is an opportunity to fix the logic underneath your dashboards. The teams that understand this are the ones whose migrations deliver value. The teams that treat it as a technical lift-and-shift are the ones that end up explaining to their boards why the new platform looks different but feels the same.
Why Most Organizations Migrate in the First Place
Understanding the trigger matters because it shapes the right approach. In my experience, BI migrations happen for one of four reasons, and each one implies a different scope.
Cost pressure. The existing platform has become expensive relative to the value it delivers. Looker pricing post-Google acquisition. Tableau licensing at scale. MicroStrategy maintenance contracts that outlived the use case. When cost is the trigger, the temptation is to move fast and minimize scope. This is where the copy-paste trap is most dangerous — speed pressure overrides the opportunity to fix the underlying problems.
Platform obsolescence. The existing tool cannot support what the business now needs. Legacy on-premise BI tools that cannot connect to cloud warehouses. Platforms that have no AI integration story. Tools that cannot serve mobile users or embed into products. When obsolescence is the trigger, the migration has a natural forcing function to rethink the architecture — use it.
Consolidation. The organization has ended up with three or four BI tools across different teams and is trying to standardize. This is the most complex BI migration scenario and the one most likely to surface the metric definition problem at full force. When marketing runs Sigma, finance runs Tableau, and product runs Looker, every team has its own definition of every metric. Consolidation migrations fail when teams try to pick a winner before resolving the definitional conflicts.
AI readiness. The existing BI layer cannot support the AI analytics use cases the organization wants to build. No semantic layer. No API access for AI agents. No governed metric definitions that LLMs can query reliably. This is the fastest-growing migration trigger in 2026, and it is the one that most clearly demands a semantic-layer-first approach.
The Four Things That Break in Every Migration
Regardless of which platforms are involved — Tableau to Power BI, Looker to Omni, MicroStrategy to Snowflake plus a modern BI layer — the same four things break in every BI migration. Knowing them in advance is the only way to plan around them.
1. Metric definitions. This is the one that causes the most damage and gets the least upfront attention. In most organizations, metric definitions are not written down anywhere accessible. Revenue is calculated in a SQL query that one analyst wrote in 2021 and nobody has reviewed since. Active customer is defined differently in the CRM, the warehouse, and the BI tool. When you migrate, you have to decide: do you rebuild these definitions as they exist today, or do you fix them? Teams that try to fix definitions mid-migration extend the timeline by months. Teams that rebuild definitions as they exist and plan to fix them later never fix them. The right BI migration approach resolves definitional conflicts before migrating any dashboards at all.
2. Data source connections. Every dashboard has upstream dependencies that are not always visible until you try to move them. A report that looks simple on the surface pulls from four data sources, two of which require custom connectors that are not supported by the new platform. A financial dashboard depends on a stored procedure that was written for the legacy database's SQL dialect. An executive report relies on a manual Excel upload that nobody documented. Discovery work on data source dependencies consistently takes two to three times longer than expected. Budget for it explicitly.
3. Business logic buried in the BI layer. This is the dirty secret of every mature BI environment. Over time, analysts embed business rules directly in dashboard-level calculated fields, report-specific filters, and tool-specific functions that have no equivalent in the target platform. A Tableau calculated field that reconciles two data sources using a workaround for a known data quality issue. A Looker derived table that pre-aggregates data in a way that compensates for a slow query pattern in the warehouse. A MicroStrategy metric that encodes a fiscal calendar adjustment that finance agreed on in 2019 and nobody remembers why. This logic is invisible until migration breaks it.
4. Adoption, not just access. A BI migration is complete when users trust the new tool and use it to make decisions. It is not complete when dashboards are rebuilt and access is provisioned. Most migration projects plan for the technical cutover and underplan for the change management. Business users who were productive in the old tool need time, training, and patience to reach the same productivity in the new one. The organizations that nail this build a parallel running period into the migration plan, keep both tools active for 60 to 90 days, and let usage data tell them when adoption has actually happened rather than declaring success at go-live.
The BI Migration Approach That Works: Semantic Layer First
The BI migration approach that consistently delivers the best outcomes inverts the intuitive sequence. Instead of starting with dashboards and working backward to the data, it starts with business definitions and works forward to the dashboards.
Here is what that looks like in practice.
Step 1: Audit what you actually use. Before touching the new platform, run a usage analysis on the existing tool. In most mature BI environments, 20% of dashboards generate 80% of the business value. The other 80% are zombie dashboards: published, technically active, used by nobody. Every dashboard you migrate costs time and money. Migrate only what is actively used, and use the migration as an opportunity to sunset everything else. This single step typically reduces migration scope by 40 to 60%.
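The audit itself can be mechanical once you have a usage-log export from the legacy tool. A minimal sketch, assuming a hypothetical export with `dashboard_id` and `viewed_at` fields (the field names and the 90-day window are illustrative, not any vendor's actual schema):

```python
from collections import Counter
from datetime import datetime, timedelta

def audit_dashboard_usage(rows, days=90, now=None):
    """Count views per dashboard within the lookback window.

    `rows` is an iterable of dicts with hypothetical keys
    'dashboard_id' and 'viewed_at' (ISO-8601 timestamps), e.g. parsed
    from a usage-log export of the legacy BI tool. Dashboards absent
    from the result are zombie candidates for sunsetting.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    return Counter(
        r["dashboard_id"]
        for r in rows
        if datetime.fromisoformat(r["viewed_at"]) >= cutoff
    )

# Example: two live dashboards; anything not in the log is a zombie.
log = [
    {"dashboard_id": "revenue_overview", "viewed_at": "2026-01-10T09:00:00"},
    {"dashboard_id": "revenue_overview", "viewed_at": "2026-01-12T09:00:00"},
    {"dashboard_id": "churn_weekly", "viewed_at": "2026-01-11T14:30:00"},
]
usage = audit_dashboard_usage(log, days=90, now=datetime(2026, 1, 15))
```

In practice you would join this against the full dashboard inventory, so that dashboards with zero recent views appear explicitly on the sunset list.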
Step 2: Extract and document the business logic. Go through the high-value dashboards and extract every metric definition, every calculated field, every business rule embedded in the BI layer. Write them down in plain language, not SQL. Bring the metric owners — finance, product, marketing — into a room and resolve the conflicts. This is the hardest step because it is organizational, not technical. But it is the step that determines whether the new platform inherits the same confusion or starts with a clean foundation.
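Even though this step is organizational, a simple structured registry makes the conflicts impossible to ignore. A sketch, with hypothetical metric names and definitions, that groups extracted definitions by metric and surfaces every name with more than one meaning:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    name: str        # metric name as used in the business
    owner: str       # team accountable for the definition
    definition: str  # plain language, not SQL

def find_conflicts(extracted):
    """Return {metric_name: {definitions}} for every metric that was
    extracted with more than one distinct plain-language definition.
    These are the conflicts to resolve before any dashboard moves."""
    by_name = {}
    for m in extracted:
        by_name.setdefault(m.name, set()).add(m.definition)
    return {name: defs for name, defs in by_name.items() if len(defs) > 1}

# Hypothetical extraction results from two dashboards:
extracted = [
    MetricDefinition("active_customer", "product",
                     "customer with a login in the last 30 days"),
    MetricDefinition("active_customer", "finance",
                     "customer with a paid invoice in the current quarter"),
    MetricDefinition("revenue", "finance",
                     "recognized revenue net of refunds"),
]
conflicts = find_conflicts(extracted)
```

The output of the workshop with the metric owners is exactly this registry with the conflict set reduced to zero: one definition, one owner, per metric.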
Step 3: Build the semantic layer before you build dashboards. Once the business logic is documented and agreed upon, encode it in a governed semantic layer before rebuilding a single dashboard. This might be dbt's Semantic Layer with MetricFlow if your team runs dbt. It might be Snowflake Semantic Views if you are standardizing on Snowflake. It might be Cube or AtScale if you run multiple BI tools and need a portable, headless layer that serves all of them. The specific tool is secondary to the principle: business logic lives in one place, owned by one team, versioned and tested, consumed by everything downstream.
This is where the semantic-layer-first approach earns its name. Tableau to Power BI migrations that follow this sequence produce semantic models that serve dashboards, Copilot, and AI agents from a single governed definition. Looker to Omni migrations that follow this sequence produce dbt Semantic Layer definitions that survive any future BI tool change because they live outside the BI tool entirely.
Step 4: Migrate dashboards in priority order. With the semantic layer in place, dashboard migration becomes the fastest part of the project. Rebuild the top 20% of high-value dashboards first, connect them to the semantic layer, validate the outputs against known historical values, and get sign-off from the metric owners before anything goes live. Run parallel for 60 to 90 days on critical dashboards. Let usage data confirm adoption before decommissioning the old tool.
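The validation step above can also be automated during the parallel-run period. A minimal sketch, assuming you can export the same metric by period from both tools (the tolerance and the example figures are illustrative):

```python
def validate_parallel_run(legacy, rebuilt, tolerance=0.001):
    """Compare metric values from the legacy tool and the rebuilt
    dashboard for the same periods. Any relative difference above
    `tolerance`, or any missing period, is flagged for metric-owner
    review before sign-off."""
    mismatches = []
    for period, old_value in sorted(legacy.items()):
        new_value = rebuilt.get(period)
        if new_value is None:
            mismatches.append((period, old_value, None))
            continue
        denom = abs(old_value) or 1.0  # avoid division by zero
        if abs(new_value - old_value) / denom > tolerance:
            mismatches.append((period, old_value, new_value))
    return mismatches

# Hypothetical quarterly revenue from both tools during parallel run:
legacy  = {"2025-Q1": 1_200_000.0, "2025-Q2": 1_310_000.0}
rebuilt = {"2025-Q1": 1_200_000.0, "2025-Q2": 1_355_000.0}  # definition drift
issues = validate_parallel_run(legacy, rebuilt)
```

An empty mismatch list across the 60-to-90-day window is a far stronger decommissioning signal than a go-live date on a project plan.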
Step 5: Document the decisions you made. Every migration surfaces decisions about metric definitions, data source precedence, and business rule interpretation. These decisions need to live somewhere accessible after the migration team disbands. The organizations that fail to do this find themselves relitigating the same definitional conflicts 18 months after migration because nobody can remember what was decided or why.
The Semantic Layer Decision During Migration
Migration is the natural moment to implement a semantic layer if you do not have one, because you are already rebuilding metric definitions. The cost of encoding those definitions into a governed layer at migration time is marginal compared to the cost of rebuilding them after the fact.
The architecture choice depends on what you are migrating to and where you want your governance to live.
If you are migrating to a single-platform environment — all Snowflake or all Databricks — platform-native semantic layers are the path of least resistance. Snowflake Semantic Views or Databricks Unity Catalog Metric Views give you governed definitions with zero additional infrastructure. The tradeoff is lock-in: definitions live inside the platform and do not travel easily to other tools.
If you are migrating to a multi-BI environment — Power BI for finance, Sigma for operations, a data app for customers — a standalone semantic layer is the right investment. Cube or AtScale sit above the warehouse and serve governed metrics to every consuming tool through standard APIs. The metric is defined once. Every downstream system inherits it. When you change the definition, every tool gets the updated version automatically.
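To make the "defined once" idea concrete, here is a sketch of what such a definition might look like in Cube's YAML data-model format. The table and column names are hypothetical, and this is an illustration of the pattern rather than a production model:

```yaml
# Hypothetical Cube data model: the revenue metric is defined once here
# and served to Power BI, Sigma, and embedded apps through Cube's APIs.
cubes:
  - name: orders
    sql_table: analytics.fct_orders
    measures:
      - name: revenue
        sql: net_amount
        type: sum
    dimensions:
      - name: order_date
        sql: order_date
        type: time
```

Change `net_amount` to a different column or add a refund adjustment, and every consuming tool inherits the change on its next query.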
If your team already runs dbt, dbt's Semantic Layer with MetricFlow is the natural extension. Metrics are defined in YAML alongside your transformation models, version-controlled in Git, tested in CI, and served to any downstream BI tool or AI agent via the Semantic Layer API. The migration is an opportunity to pull business logic out of the BI layer and into dbt where it belongs.
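As a sketch of what that looks like, here is a hypothetical metric definition in dbt's Semantic Layer (MetricFlow) YAML. The model, entity, and measure names are illustrative assumptions, not taken from any real project:

```yaml
# Hypothetical MetricFlow definition, versioned in Git alongside the
# dbt models and tested in CI like any other transformation code.
semantic_models:
  - name: orders
    model: ref('fct_orders')
    defaults:
      agg_time_dimension: order_date
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: order_date
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_amount
        agg: sum

metrics:
  - name: revenue
    description: "Recognized revenue net of refunds, as agreed with finance."
    type: simple
    type_params:
      measure: order_amount
```

Because the definition lives in the dbt project rather than in any BI tool, it survives a future change of BI platform untouched.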
What the BI Migration Cost Conversation Is Missing
Most BI migration cost discussions focus on platform licensing, implementation time, and training. Those are real costs. But the cost that is almost never quantified upfront is the cost of migrating bad definitions.
When you migrate metric inconsistencies from one platform to another, you pay for those inconsistencies twice: once in the migration work itself, and again in the reconciliation work that follows when users discover the new tool produces the same conflicting numbers as the old one. The trust problem you had with the old platform travels to the new one.
The organizations that treat migration as an opportunity to fix business logic — rather than just moving it — report dramatically better outcomes. Finance stops arguing about revenue in the first leadership meeting after go-live because the definition was agreed on before the migration, not after. AI agents deployed on top of the new platform return reliable answers because the metric definitions they query were resolved at migration time, not deferred.
The additional time spent on Step 2, extracting and resolving business logic, is the highest-ROI investment in any BI migration. It is also the step most likely to be cut when timelines get compressed. For a deeper look at what migration actually costs — licensing, implementation, and the hidden work — see our analysis of BI migration cost.
The Migration Decision Nobody Talks About
The most important decision in a BI migration is not which platform to migrate to. It is whether to migrate at all right now.
A BI migration undertaken before the data foundation is ready will inherit the data foundation's problems. If your ingestion pipelines are unreliable, the new BI tool will surface unreliable data faster. If your transformation layer has undocumented business logic embedded in hundreds of SQL models, the new BI tool will expose that complexity more visibly. If your organization has not resolved what metrics mean and who owns them, the new BI tool will reveal that conflict in the first week.
The question to ask before committing to a migration timeline is: are we migrating because the new platform will make our analytics better, or are we migrating to delay confronting the underlying data problems? If the answer is the latter, fix the data foundation first. A migration on top of a clean foundation is a six-month project. A migration on top of a broken foundation is a two-year project that ends in the same place.
The sequence that works is always the same: data foundation, semantic layer, BI tool. In that order. A BI migration approach that respects that sequence will deliver what the organization actually needs. One that skips to the BI layer first will deliver a new tool for the same old problems.
Unwind Data
Speak with a data expert
We've helped scale-ups and enterprises move faster on exactly this kind of work — without the trial and error. Strategy, architecture, and hands-on delivery.
Schedule a consultation