Automate the copy of a database in Azure SQL using Azure Data Factory
Proof of concept to copy your Azure SQL database using Azure Data Factory (ADF)
Introduction
A database copy in Azure SQL is a transactionally consistent snapshot of the source database at a given point in time. Once the copy completes, the new database can be used as a fully functional, independent database. Copy is commonly used to stand up and refresh your test/dev environment from a production database. Behind the scenes, the copy is based on the same technology as geo-replication and is created from a snapshot of the underlying blobs.
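At its core, a copy is created with a single T-SQL statement run in the master database of the target server. A minimal sketch is shown below; the server and database names are placeholders for your own.
--Minimal example; the server and database names are placeholders.
CREATE DATABASE demo_copy
AS COPY OF [source-server-name].demo;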
Important Points
- When you copy a database to a different server, the security principal that initiated the copy operation on the target server becomes the owner of the new database.
- Database copy is an asynchronous operation, but the target database is created immediately after the request is accepted.
- You can copy a database in Azure SQL across subscriptions. However, this is only supported when using SQL authentication to log in to the target server.
- The minimum necessary level of access is membership in the dbmanager role in the master database on both servers.
- If you decide to cancel the copying while it’s in progress, execute the DROP DATABASE statement on the new database (see the example after this list).
- Database copy using T-SQL isn’t supported when connecting to the destination server over a private endpoint. If a private endpoint is configured but public network access is allowed, database copy is supported when connected to the destination server from a public IP address using SQL authentication. Once the copy operation completes, public access can be denied.
- The Azure portal, PowerShell, and the Azure CLI don’t support database copy to a different subscription.
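For example, canceling an in-progress copy is simply a DROP of the new database, run on the target server. The database name below is a placeholder for your own.
--Cancel an in-progress copy by dropping the new database on the target server.
DROP DATABASE [database_copy_name];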
Setup
In this proof of concept (PoC), we have created two Azure SQL Database servers in two separate subscriptions.
You can follow the steps outlined below to copy a SQL database from one server to another. These servers can be in different subscriptions.
Note: Make sure you use a login that has the same name and password as the database owner of the source database. Additionally, the login must be a member of the dbmanager role or a server administrator on both the source and target servers.
T-SQL for source server
Launch SSMS and run the following on the source server. Step 1 runs in the master database; Step 2 runs in the source database itself, as noted in the comments.
--Step# 1
--Create login and user in the master database of the source server.
GO
CREATE LOGIN copydemo WITH PASSWORD = 'strongpassword'
GO
CREATE USER [copydemo] FOR LOGIN [copydemo] WITH DEFAULT_SCHEMA=[dbo];
GO
ALTER ROLE dbmanager ADD MEMBER copydemo;
GO
--Step# 2
--Create the user in the source database and add it to the db_owner role.
CREATE USER [copydemo] FOR LOGIN [copydemo] WITH DEFAULT_SCHEMA=[dbo];
GO
ALTER ROLE db_owner ADD MEMBER copydemo;
GO
Now capture the SID of the user created above from the master database. You will need it in the next steps.
--Step# 3
--Capture the SID of the user "copydemo" from the master database
SELECT [sid] FROM sysusers WHERE [name] = 'copydemo';
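Alternatively, the SID can be read from the sys.sql_logins catalog view in the master database; this is offered as an equivalent sketch and should return the same value as the query above.
--Alternative: read the login's SID from sys.sql_logins in the master database.
SELECT [name], [sid] FROM sys.sql_logins WHERE [name] = 'copydemo';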
T-SQL for target server
Now connect to the destination server. Make sure the login name and password are the same as on the source server. Run the following T-SQL in the master database.
--Create login and user in the master database, same as of the source server.
CREATE LOGIN copydemo WITH PASSWORD = 'strongpassword', SID = 0x01060000000000640000000000000000211BED7C93AC454F950AE0BC1A706F5B;
GO
CREATE USER [copydemo] FOR LOGIN [copydemo] WITH DEFAULT_SCHEMA=[dbo];
GO
ALTER ROLE dbmanager ADD MEMBER copydemo;
GO
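Optionally, before automating anything in ADF, you can verify the setup by running a copy once by hand from the master database of the target server while connected as copydemo. This is only a suggested check; the source server name below is a placeholder for your environment.
--Optional manual test: run in the master database of the target server, connected as copydemo.
CREATE DATABASE demo_manual_copy
AS COPY OF [source-server-name].demo;
--Drop demo_manual_copy afterwards if it was only a test.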
Configuration in Azure Data Factory
In this PoC, we have used Azure Data Factory (ADF) to automate the copy of a database. However, several other solutions can be used to automate this, such as Azure Functions, an Automation account, a SQL Agent job, and so on.
Create a linked service in ADF for the target server
- Launch ADF. Click Manage → Linked services
- Click +New to launch the new linked service wizard and supply the SQL connection information for your environment as shown below.
- Click Test connection to verify that the connection is successful.
Create an ADF Pipeline
Perform the following steps to create a pipeline.
- In the ADF portal, click on Pipeline. Click “+” to create a new pipeline as shown below.
- Expand Activities, drag the “Script” activity to the designer pane, and give it a name as shown below.
- Click on Settings tab.
- Select the previously created link service for SQL database from the dropdown.
- For Script, select “NonQuery”
- In the script box, enter the following T-SQL:
--Drop any previous copy, then create a fresh copy of the source database.
DROP DATABASE IF EXISTS [new_demo];
CREATE DATABASE new_demo
AS COPY OF [azuresql-eastus-4455].demo;
Publish the pipeline.
Run the pipeline by clicking Add trigger → Trigger now as shown below.
During the pipeline execution, you can run the following query on the target server to check the status:
SELECT name, state_desc FROM sys.databases;
The state_desc column will show “COPYING” for the new database while the copy is in progress.
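If you want more detail than state_desc, the sys.dm_database_copies dynamic management view in the master database of the target server reports copy progress. The query below is a sketch and assumes the standard columns of that view.
--Check copy progress from the master database of the target server.
SELECT partner_server, partner_database, percent_complete, error_desc
FROM sys.dm_database_copies;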
Once the pipeline is successful, you can verify it under the Monitor tab in ADF.
You can also verify the new database in target server as shown in the below figure.
- If the copying fails, the state_desc column of the sys.databases view for the new database is set to SUSPECT. Execute the DROP DATABASE statement on the new database, and try again later.
- If the copying succeeds, the state_desc column of the sys.databases view for the new database is set to ONLINE. The copying is complete as shown below.
Conclusion
The new database is a regular database that can be changed independently of the source database. You can also schedule your ADF pipeline to refresh the destination database in test/dev scenarios.