Deploy to Microsoft Fabric
In the deploy phase of your Fabric adoption journey, you carry out the technical implementation of the migration. This step turns architectural plans and assessments into tangible components such as Lakehouses, Warehouses, Eventstreams, and Pipelines. You’ll also execute remediation tasks, prepare governance, and validate technical readiness before release.
This phase can be one of the most technically demanding in your journey. At its conclusion, workloads should be stable, secure, performant, and ready for end-user testing and production rollout.
Deployment Checklist
| Activity | Description | Responsible roles |
|---|---|---|
| Deploy supporting Fabric services | Provision workspaces, Lakehouses, Warehouses, KQL databases, Eventstreams, Pipelines, and access controls. Validate workspace naming, tagging, and integration with Purview and Entra ID (see the provisioning sketch after this table). | Fabric Platform Admin, Landing Zone Architect |
| Remediate assets | Convert incompatible sources, rebuild legacy connectors, refactor Power BI reports, and optimize ingestion pipelines (for example, XML → JSON conversion; see the remediation sketch after this table). | Fabric Solution Architect, Data Engineer, Project Manager |
| Replicate datasets or source data | Load initial historical data from on-premises or legacy cloud platforms into OneLake using Pipelines, Dataflows Gen2, or third-party tools (see the OneLake upload sketch after this table). | Data Engineer |
| Prepare for monitoring and governance | Onboard to Fabric Monitoring, enable lineage in Purview, and validate access policies and workspace alignment (see the role-assignment sketch after this table). | Cloud Operations Manager, Fabric Platform Admin, Workload Owner |
| Test deployment | Validate data quality (DQ) rules, Fabric capacity unit (CU) sizing, workspace isolation, report refresh times, and integration points (APIs, scheduling, data sharing); see the validation sketch after this table. | Project Manager, Fabric Solution Architect, Test Engineer |
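The following is a minimal sketch of provisioning a workspace and a Lakehouse through the Fabric REST API. The workspace and Lakehouse names are placeholders, and the environment variables FABRIC_TOKEN (an Entra ID access token for the Fabric API scope) and CAPACITY_ID (the target capacity) are assumptions; in practice you would typically run this from an automation identity rather than interactively.

```python
# Hedged sketch: provision a workspace and a Lakehouse via the Fabric REST API.
# FABRIC_TOKEN and CAPACITY_ID are assumed environment variables; names are placeholders.
import os
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
headers = {"Authorization": f"Bearer {os.environ['FABRIC_TOKEN']}"}

# Create the workspace and assign it to a Fabric capacity.
ws = requests.post(
    f"{FABRIC_API}/workspaces",
    headers=headers,
    json={"displayName": "sales-analytics-prod", "capacityId": os.environ["CAPACITY_ID"]},
)
ws.raise_for_status()
workspace_id = ws.json()["id"]

# Create a Lakehouse item inside the new workspace.
lh = requests.post(
    f"{FABRIC_API}/workspaces/{workspace_id}/items",
    headers=headers,
    json={"displayName": "sales_lakehouse", "type": "Lakehouse"},
)
lh.raise_for_status()
print("Created workspace", workspace_id, "- Lakehouse creation status:", lh.status_code)
```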
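One common remediation pattern named in the table is converting XML extracts to JSON before ingestion. The sketch below uses only the Python standard library; the file names and element structure are illustrative, not taken from any specific source system.

```python
# Hedged sketch: convert a legacy XML extract to JSON before loading it into Fabric.
# "orders.xml" and "orders.json" are illustrative file names.
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an XML element into a plain dict."""
    node = {**elem.attrib}
    children = list(elem)
    if children:
        for child in children:
            node.setdefault(child.tag, []).append(element_to_dict(child))
    elif elem.text and elem.text.strip():
        node["value"] = elem.text.strip()
    return node

tree = ET.parse("orders.xml")                      # legacy XML extract
payload = {tree.getroot().tag: element_to_dict(tree.getroot())}

with open("orders.json", "w", encoding="utf-8") as f:
    json.dump(payload, f, indent=2)
```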
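For the initial historical load, OneLake exposes an ADLS Gen2-compatible endpoint, so existing Azure SDK tooling can be reused. The sketch below assumes the azure-identity and azure-storage-file-datalake packages, a signed-in identity with access to the workspace (for example via the Azure CLI), and placeholder workspace, Lakehouse, and file names.

```python
# Hedged sketch: seed historical data into OneLake via its ADLS Gen2-compatible endpoint.
# Workspace, Lakehouse, and file names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# In OneLake, the file system is the workspace and the path starts at the Lakehouse item.
fs = service.get_file_system_client("sales-analytics-prod")
file_client = fs.get_file_client("sales_lakehouse.Lakehouse/Files/history/orders_2019.parquet")

with open("orders_2019.parquet", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```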
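As part of governance preparation, workspace role assignments can be reviewed programmatically rather than by eye. The sketch below lists role assignments through the Fabric REST API and flags unexpected Admin grants; FABRIC_TOKEN, WORKSPACE_ID, and the allowed_admins set are assumptions you would replace with your own values.

```python
# Hedged sketch: review workspace role assignments via the Fabric REST API.
# FABRIC_TOKEN and WORKSPACE_ID are assumed environment variables; the allow-list is illustrative.
import os
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
headers = {"Authorization": f"Bearer {os.environ['FABRIC_TOKEN']}"}
workspace_id = os.environ["WORKSPACE_ID"]
allowed_admins = {"fabric-platform-admins"}        # expected admin principal(s)

resp = requests.get(f"{FABRIC_API}/workspaces/{workspace_id}/roleAssignments", headers=headers)
resp.raise_for_status()

for assignment in resp.json().get("value", []):
    name = assignment.get("principal", {}).get("displayName", "<unknown>")
    role = assignment.get("role")
    flag = " <-- review" if role == "Admin" and name not in allowed_admins else ""
    print(f"{name}: {role}{flag}")
```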
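Deployment testing can include simple, repeatable data quality assertions on the migrated tables. The sketch below is intended to run in a Fabric notebook attached to the target Lakehouse, where a `spark` session is provided automatically; the table and column names are illustrative.

```python
# Hedged sketch: post-deployment data quality checks in a Fabric notebook
# (the `spark` session is provided by the notebook runtime). Names are illustrative.
from pyspark.sql import functions as F

df = spark.read.table("sales_lakehouse.orders")     # Lakehouse table loaded earlier

checks = {
    "row_count_positive": df.count() > 0,
    "no_null_order_ids": df.filter(F.col("order_id").isNull()).count() == 0,
    "no_future_dates": df.filter(F.col("order_date") > F.current_date()).count() == 0,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise AssertionError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```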
Recommended Tools
- Microsoft Fabric Deployment Pipelines (see the automation sketch after this list)
- Azure DevOps or GitHub Actions
- Power BI Deployment and Workspace Inventory scripts
- Microsoft Purview & Azure Policy (data classification, tagging, compliance)
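To combine these tools, a CI/CD job in Azure DevOps or GitHub Actions can trigger promotions between deployment pipeline stages. The sketch below is one possible approach using the Power BI "Deploy All" REST endpoint; PBI_TOKEN and PIPELINE_ID are assumptions, and the token must belong to an identity with permission to deploy that pipeline.

```python
# Hedged sketch: trigger a deployment pipeline promotion from a CI/CD job.
# PBI_TOKEN and PIPELINE_ID are assumed environment variables.
import os
import requests

pipeline_id = os.environ["PIPELINE_ID"]
headers = {"Authorization": f"Bearer {os.environ['PBI_TOKEN']}"}

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll",
    headers=headers,
    json={
        "sourceStageOrder": 0,  # promote from the first (source) stage to the next
        "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
    },
)
resp.raise_for_status()
print("Deployment started:", resp.json().get("id"))
```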
Continue to the next section: Release workloads to Fabric.