Step-by-Step: Using Copy Job to Move Data Across Tenants in Fabric Data Factory
The Microsoft Fabric Blog presents a practical tutorial for moving data across Azure tenants using Copy job in Fabric Data Factory. Authored by the Microsoft Fabric Blog team, this walkthrough emphasizes secure service principal authentication, configuration, and transfer into a Fabric Data Warehouse.
Author: Microsoft Fabric Blog
Overview
Copy job in Microsoft Fabric Data Factory streamlines the process of moving data across clouds, on-premises systems, and Azure services. This guide focuses on migrating data securely between different Azure tenants using service principal authentication and highlights scenarios like bulk copy, incremental copy, and change data capture (CDC).
Scenario
- Tenant A: Owns Fabric Data Warehouse and will execute the Copy job.
- Tenant B: Hosts the Azure Data Lake Storage Gen2 source account.
The task: move data from Azure Data Lake Storage Gen2 (Tenant B) into a Fabric Data Warehouse (Tenant A), using a service principal for secure authentication.
Prerequisites
- Azure Data Lake Gen2 account in Tenant B with a registered service principal.
- Service principal credentials: Application (client) ID, Directory (tenant) ID, and a client secret.
- Proper role assignment for the service principal on the storage account (e.g., Storage Blob Data Contributor).
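Before touching the Copy job UI, it can help to confirm that the three credential values actually work. Below is a minimal sketch, assuming Python with the azure-identity package; every ID and the secret are placeholders you substitute with your own values.

```python
from azure.identity import ClientSecretCredential

# Placeholder values recorded during app registration in Tenant B.
TENANT_B_ID = "<directory-tenant-id>"
CLIENT_ID = "<application-client-id>"
CLIENT_SECRET = "<client-secret-value>"  # store in a vault, never in source

credential = ClientSecretCredential(TENANT_B_ID, CLIENT_ID, CLIENT_SECRET)

# Acquiring a token for the Azure Storage resource proves the three values
# are valid and the app registration exists in Tenant B.
token = credential.get_token("https://storage.azure.com/.default")
print("Token acquired; expires at (epoch):", token.expires_on)
```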
Setting up Azure Data Lake Gen2 with Service Principal
- Sign in to the Azure portal with Tenant B credentials.
- Create a Storage Account and add a data container.
- Register a new application in Microsoft Entra ID (formerly Azure AD) to create a service principal.
- Record the Application (client) ID and Directory (tenant) ID.
- Generate and securely store a client secret.
- Assign appropriate RBAC roles (e.g., Storage Blob Data Contributor) to the service principal for the Data Lake account.
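To confirm the role assignment took effect, you can list the container with the service principal before configuring the Copy job. The following is a hedged sketch using the azure-identity and azure-storage-file-datalake packages; the account, container, and credential values are placeholders.

```python
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(
    "<directory-tenant-id>", "<application-client-id>", "<client-secret-value>"
)

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=credential,
)

# If RBAC is configured correctly, enumerating paths succeeds; a 403 here
# usually means the role assignment is missing or still propagating.
fs = service.get_file_system_client("<container-name>")
for path in fs.get_paths():
    print(path.name)
```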
Initiating the Copy Job in Fabric
- Sign in to Microsoft Fabric (app.powerbi.com) using Tenant A credentials.
- In your Fabric workspace, create a new Copy job.
- Choose Azure Data Lake Gen2 as the source, and select service principal as the authentication method.
- Enter the Tenant ID, Client ID, and Client Secret obtained from setup.
- Select files to copy from the Data Lake container.
- Choose the Fabric Data Warehouse in Tenant A as the destination.
- Optionally, configure table or column mapping.
- Select full or incremental copy mode as desired.
- Review and run the job. Monitor progress in the job panel.
- After a successful run, validate that data appears in the target Fabric Data Warehouse.
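Besides checking in the Fabric UI, one way to validate the result is to query the warehouse's SQL analytics endpoint. Here is a sketch assuming Python with pyodbc and ODBC Driver 18 for SQL Server, signing in interactively under Tenant A; the endpoint, warehouse, and table names are placeholders, and the connection string for your warehouse can be copied from its settings in the Fabric portal.

```python
import pyodbc

# Placeholder connection values; a Fabric warehouse SQL endpoint typically
# has the form <endpoint>.datawarehouse.fabric.microsoft.com.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<warehouse-sql-endpoint>.datawarehouse.fabric.microsoft.com;"
    "DATABASE=<warehouse-name>;"
    "Authentication=ActiveDirectoryInteractive;"
)

conn = pyodbc.connect(conn_str)
try:
    cursor = conn.cursor()
    # A row count is a quick smoke test that the copied data landed.
    cursor.execute("SELECT COUNT(*) FROM dbo.<target_table>")
    print("Rows in target table:", cursor.fetchone()[0])
finally:
    conn.close()
```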
Key Features and Options
- Bulk and incremental copy: Supports one-time and ongoing data sync; for intuition on the incremental pattern, see the sketch after this list.
- CDC replication: Facilitates change-based data movement.
- Authentication: Uses service principals for secure, cross-tenant access.
- Role-based access control: Granular permission management.
- Table/column mapping: Customize destination schemas.
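Copy job manages incremental state natively, so none of this is required in practice; but for intuition, the watermark pattern underlying incremental copy looks roughly like the following Python sketch (all names are illustrative, not Copy job's actual implementation).

```python
from datetime import datetime, timezone

# Watermark persisted from the prior run (illustrative starting point).
last_watermark = datetime(2024, 1, 1, tzinfo=timezone.utc)

def fetch_changed_rows(since: datetime) -> list[dict]:
    """Hypothetical source read, e.g. SELECT * FROM src WHERE modified_at > :since."""
    return []  # stand-in for real source access

# Copy only rows changed since the last watermark, then advance it
# so the next run picks up where this one left off.
rows = fetch_changed_rows(last_watermark)
last_watermark = datetime.now(timezone.utc)
print(f"Copied {len(rows)} changed rows; next run starts from {last_watermark}")
```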
Additional Resources
- [What is Copy job in Data Factory – Microsoft Fabric | Microsoft Learn](https://learn.microsoft.com/en-us/fabric/data-factory/what-is-copy-job)
- Register a Microsoft Entra app and create a service principal
- Access storage using a service principal & Microsoft Entra ID
- Microsoft Fabric documentation
- Fabric Community – Copy job discussions
Summary
The Copy job capability in Microsoft Fabric Data Factory is a practical tool for data professionals tasked with transferring datasets across Azure tenants. With service principal authentication, straightforward configuration, and support for full, incremental, and CDC workflows, it enables robust, secure data migration into Fabric Data Warehouse environments.
This post appeared first on the Microsoft Fabric Blog.