How to transfer a table from one database to another?

I was working in Azure Data Studio. By mistake I created a table in the system administrator's database. I want to transfer it to another database that I created. How can I transfer that table?

With Azure Data Studio, we can't transfer a table to another database directly. Azure SQL Database also doesn't support the USE statement, and Azure Data Studio doesn't support import or export jobs. The only options are:
Create the table with its data again in your user DB, then delete it from the system administrator's database.
Use elastic query (CREATE EXTERNAL DATA SOURCE, CREATE EXTERNAL TABLE) to cross-database query the table data in the system administrator's database, and then import it into your user DB.
I tested this in Azure Data Studio and it works well. Since you can create a table in the system administrator's database, you should have enough permissions for these operations.
If you can use SSMS, this is much easier and there are many ways to achieve it. For example:
Follow this blog, which #Json Pan provided in a comment:
https://blog.atwork.at/post/How-to-copy-table-data-between-Azure-SQL-Databases
Export the table to a CSV file and then import it into your user DB.
You can also use elastic query.
Just choose the way you like.
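The elastic query route can be sketched as follows. This is a minimal sketch run in the user database; all server, database, credential, table, and column names are placeholders you would replace with your own, and the external table's column list must match the source table exactly:

```sql
-- Run in the user (destination) database.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL CrossDbCred
WITH IDENTITY = '<sql login>', SECRET = '<password>';

-- Points at the database that currently holds the table.
CREATE EXTERNAL DATA SOURCE AdminDbSource
WITH (
    TYPE = RDBMS,
    LOCATION = '<yourserver>.database.windows.net',
    DATABASE_NAME = '<admin database>',
    CREDENTIAL = CrossDbCred
);

-- Column definitions must mirror the remote table.
CREATE EXTERNAL TABLE dbo.MyTable_Remote (
    Id INT,
    Name NVARCHAR(100)
)
WITH (DATA_SOURCE = AdminDbSource, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'MyTable');

-- Materialize the remote data as a local table.
SELECT * INTO dbo.MyTable FROM dbo.MyTable_Remote;
```

After the SELECT INTO completes, the external table and data source can be dropped and the original table deleted from the administrator's database.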

Related

Data Migration from On Prem to Azure SQL (PaaS)

We have an on-prem SQL Server DB (SQL Server 2017, compatibility level 140) that is about 1.2 TB. We need to do a repeatable migration of just the data to a cloud SQL database (PaaS). The on-prem DB has procedures and functions that do cross-DB queries, which rules out the Data Migration Assistant. Many of the tables we need to migrate are system-versioned tables (just to make this more fun). Ideally we would like to move the data into a different schema of a different DB so we can avoid the use of external tables (we're worried about performance).
Moving the data is just the first step as we also need to do an ETL job on the data to massage it into the new table structure.
We are looking at using ADF but it has trouble with versioned tables unless we turn them off first.
What are other options that we can look and try to be able to do this quickly and repeatedly? Do we need to change to IaaS or use a third party tool? Did we miss options in ADF to handle this?
If I summarize your requirements, you are not just migrating a database to cloud but a complete architecture of your SQL Server, which includes:
1.2 TB of data,
Continuous data migration afterwards,
Procedures and functions for cross DB queries,
Versioned tables
Points 1, 3, and 4 can be done easily by creating a .bacpac file using SQL Server Management Studio (SSMS), exporting it from on premises to Azure Blob storage, and then importing that file into Azure SQL Database. The .bacpac file we create in SSMS can include all system-versioned tables, which we can then import at the destination database.
Follow this third-party tutorial by sqlshack to migrate data to Azure SQL Database.
The stored procedures can also be moved using SQL scripts. Follow the steps below:
Go to the server in Management Studio.
Select the database, right-click on it, and go to Tasks.
Select the Generate Scripts option under Tasks.
Once the wizard has started, select the desired stored procedures you want to copy,
create a file from them, and then run the script from that file against the Azure SQL DB, which you can log in to from SSMS.
The repeatable migration of data is the challenging part. You can try Change Data Capture (CDC), though I'm not sure it matches your exact requirement. You can enable CDC at the database level using the command below:
Use <databasename>;
EXEC sys.sp_cdc_enable_db;
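Enabling CDC at the database level alone doesn't capture anything; each table to be tracked must also be enabled. A minimal sketch, where the schema and table names are placeholders for your own (note that on-prem, the SQL Server Agent must be running for capture jobs to work):

```sql
-- Enable change capture for one table.
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',   -- placeholder table name
    @role_name     = NULL;        -- NULL = no gating role required to read changes

-- Captured changes land in cdc.dbo_Orders_CT and can be queried
-- through the generated cdc.fn_cdc_get_all_changes_dbo_Orders function.
```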
Refer to this to learn more: https://www.qlik.com/us/change-data-capture/cdc-change-data-capture#:~:text=Change%20data%20capture%20(CDC)%20refers,a%20downstream%20process%20or%20system.

How to store hot and cold data with Azure SQL

I have a huge order table in Azure SQL. I have one boolean field "IsOrderActive" to separate hot and cold orders. Is it possible to automatically transfer cold data to a separate database with Azure SQL?
One way to accomplish this task is to divide the order table into two tables using T-SQL, then transfer the table with cold data to a different database (on a different server) using SSMS.
Please follow the repro steps below.
Create a table
create table hotcoldtable (orderID int, IsOrderActive char(3))
Insert demo data into the table
insert into hotcoldtable
values (1,'yes')
,(2,'no')
,(3,'yes')
,(4,'yes')
,(5,'no')
,(6,'no')
,(7,'yes')
Divide the table into cold and hot data tables using the commands below
cold data table: select OrderID, IsOrderActive into coldtable from hotcoldtable where IsOrderActive = 'no'
hot data table: select OrderID, IsOrderActive into hottable from hotcoldtable where IsOrderActive = 'yes'
You can see two new tables in your database.
In SQL Server Management Studio (SSMS), log in to your Azure SQL server: fill in the connection details and click Connect.
Right-click the database that contains the order tables and click Generate Scripts...
Choose Select specific database objects and mark the objects you want to script (here, the cold data table).
Set the scripting options (output file location and so on).
Review the details and click Next. This will generate your script.
Go to the location where the script was saved, open the file in any editor, and copy the script.
Now, in the Azure portal, go to the database where you want to transfer the cold data table. Open the Query editor, paste the copied script, and run it; the table will be created in this database.
Are you referring to SQL Server Stretch Database to Azure? Check this out https://www.mssqltips.com/sqlservertip/5526/how-to-setup-and-use-a-sql-server-stretch-database
If you are interested in saving space by archiving the cold data, you can use two separate tables in the same or different databases. The key point is to use a columnstore index for the archive (cold) table. Depending on your data, you should be able to achieve between 30% and 60% data compression.
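As a sketch of that idea, the archive table can be declared with a clustered columnstore index up front; the table and column names here are illustrative, and actual compression depends heavily on how repetitive the data is:

```sql
-- Columnstore-compressed archive table for inactive orders.
CREATE TABLE dbo.ColdOrders (
    OrderID    INT,
    OrderDate  DATE,
    CustomerID INT,
    Amount     DECIMAL(10, 2),
    INDEX cci_ColdOrders CLUSTERED COLUMNSTORE
);

-- Periodically move inactive rows out of the hot table.
INSERT INTO dbo.ColdOrders (OrderID, OrderDate, CustomerID, Amount)
SELECT OrderID, OrderDate, CustomerID, Amount
FROM dbo.Orders
WHERE IsOrderActive = 'no';

DELETE FROM dbo.Orders WHERE IsOrderActive = 'no';
```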
However, this can't be done without running some queries, though it can be automated using Azure workbooks.
I built similar functionality that saved 58% of the space in an Azure SQL database.
Please comment if you feel this might help; I can share more details.
Database sharding seems like a possible solution for this scenario: cold orders can be put on Azure serverless databases, which have auto-pause and auto-resume capabilities, so when they are not in use you pay only for storage. Azure SQL Database provides a good number of tools to support sharding.

How to Import data into Azure SQL database from Data Lake Storage Gen 1?

I created an empty SQL database using Azure portal.
I also added some sample data to a data lake in Data Lake Storage Gen 1.
I downloaded SSMS, linked it to the server containing the SQL database, and added a new table using SSMS in order to have a target location to import the data into the SQL database.
Questions: 1. Will the new table I added in SSMS be recognized in Azure?; 2. How do I get the sample data from the data lake I created into the new table I created in the Azure SQL database?
An article suggested using Azure HDInsight to transfer the data, but it's not part of my free subscription and I don't know how much I would be charged for using it.
Will the new table I added in SSMS be recognized in Azure?
Yes. When we connect to the Azure SQL database with SSMS, all operations in the query editor are applied directly to the Azure SQL database. Once the SQL statements have run, we can refresh Object Explorer and the new table will be there.
How do I get the sample data from the data lake I created into the new table I created in the Azure SQL database?
What is the sample data you added to Data Lake Storage Gen1 — CSV files? If the files are not very large, I would suggest you download them to your local computer and then use the SSMS Import and Export Wizard to load the data into the Azure SQL database. It may take some time, but it's free.
Or you could follow the tutorial "Copy data between Data Lake Storage Gen1 and Azure SQL Database using Sqoop". I didn't find the price of Azure HDInsight when creating it, so I'm not sure whether it's free, but you could try it.
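Another option, assuming the CSVs can be staged in Azure Blob storage (Data Lake Storage Gen1 is not a supported BULK INSERT source), is to have Azure SQL Database load them directly. A sketch with placeholder account, container, and file names:

```sql
-- One-time setup: a credential and an external data source for the container.
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token, without the leading ?>';

CREATE EXTERNAL DATA SOURCE MyBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://<account>.blob.core.windows.net/<container>',
    CREDENTIAL = BlobCred
);

-- Load a CSV into the table created earlier in SSMS.
BULK INSERT dbo.SampleTable
FROM 'sample.csv'
WITH (DATA_SOURCE = 'MyBlobStorage',
      FORMAT = 'CSV',
      FIRSTROW = 2);   -- skip the header row
```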
Hope this helps.

How to convert database in SQL Server to schema based object in Azure SQL

I'm trying to migrate a database to Azure SQL. As I'm trying to minimize the number of servers, the situation is this: say the DB name is college, and under the college DB I have a table named "forms". Currently, accessing forms looks like college.dbo.forms.
Now I'm migrating the database and tables to Azure SQL, and I want the database to be converted into a schema-based object. That is, accessing the table "forms" under college should look like "college.forms", with no separate "college" database.
As far as I know, you cannot do that during migration.
No matter which way you migrate your database to Azure SQL, you always need to do two steps:
specify the source server and database;
select the schema objects.
[dbo] is the default schema in Azure SQL, and a new database will certainly appear in Azure SQL.
Reference: SQL Server database migration to Azure SQL Database.
But Azure SQL Database does let you create or alter schemas after migration.
You can see:
CREATE SCHEMA (Transact-SQL);
ALTER SCHEMA (Transact-SQL);
DROP SCHEMA (Transact-SQL).
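Putting those statements together, once the migrated tables are in a single Azure SQL database, the schema-per-old-database layout from the question can be built in two steps (the names follow the question's college/forms example):

```sql
-- Create a schema named after the old database
-- (CREATE SCHEMA must be the only statement in its batch).
CREATE SCHEMA college;
GO

-- Move the migrated table out of dbo into the new schema.
ALTER SCHEMA college TRANSFER dbo.forms;

-- The table is now referenced as college.forms.
```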
Hope this helps.

Create a blank database with Oracle DBCA

I use Oracle Database Configuration Assistant to create database. There are 3 default templates:
General purpose or transaction processing
Custom database
Data warehouse
However, choosing any of these 3 templates leads to the creation of a database with thousands of default tables (which I don't understand).
Is there any way to create a blank database (no tables, no data) with the Oracle Database Configuration Assistant?
As Bob Jarvis noted in the comments under the question, those thousands of default tables appear because the SYSTEM username is used when accessing the database; they are part of Oracle's data dictionary.
It's impossible to create a totally blank database (when viewed as SYSTEM). Connect as a newly created ordinary user instead and these default tables won't appear in your schema.
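In practice, "blank" means connecting as an ordinary user whose own schema is empty. A minimal sketch, where the user name and password are placeholders:

```sql
-- Run as a privileged user (e.g. SYSTEM) after DBCA finishes.
CREATE USER appuser IDENTIFIED BY "ChangeMe123";
GRANT CREATE SESSION, CREATE TABLE TO appuser;
ALTER USER appuser QUOTA UNLIMITED ON USERS;

-- Connecting as appuser, the schema starts empty:
-- SELECT table_name FROM user_tables;  -- no rows
```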
