I have two SQL Server databases running on Azure - one Prod and one Test. Every night, I want the Test database to automatically sync its data and structure (including any table or stored procedure changes) with Prod, so that Test is a mirror of Prod that can be used for development/QA.
How can I achieve this on Azure? Please note I would like this to happen automatically and on a schedule. Thanks, everybody.
At the moment, we can use the Azure SQL Data Sync service to sync data between databases, but we cannot use the service to sync stored procedures. For more details, please refer to here and here. So if we want to sync stored procedures, the SQL Server Management Studio Generate Scripts Wizard is an easy way to produce a script that will copy all of your stored procedures to another database.
Besides, regarding how to create a schedule to start Azure SQL Data Sync, please refer to the blog.
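If you would rather script the procedures yourself than click through the wizard each night, a minimal T-SQL sketch like the one below (run against Prod) pulls every stored procedure definition out of the catalog views so it can be replayed on Test. The schema and procedure names are simply whatever exists in your database.

    -- List every stored procedure definition in the current database
    -- (run against Prod, then execute the returned definitions against Test).
    SELECT
        s.name       AS schema_name,
        p.name       AS procedure_name,
        m.definition AS procedure_definition
    FROM sys.procedures AS p
    JOIN sys.schemas     AS s ON s.schema_id = p.schema_id
    JOIN sys.sql_modules AS m ON m.object_id = p.object_id
    ORDER BY s.name, p.name;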
Related
I want to move all data that is more than 90 days old from one Azure SQL server to a different Azure SQL server, and after moving I need to delete the moved data from the first server.
I want to run these steps on a daily basis.
I am new to Azure and was able to do this with Azure Data Factory. Can you please suggest any other approach that might be better suited?
You are already using the best approach.
Azure Data Factory is easy to use when it comes to extracting and copying data between services. It also provides scheduled triggers, i.e., triggering the copy pipeline after a specific interval of time or on an event. Refer to Create a trigger that runs a pipeline on a schedule.
If the volume of data is large, you can re-configure the Integration Runtime (IR) resources (compute type and core count) to overcome performance issues, if required.
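For the delete-after-copy step, the copy pipeline can be followed by a Stored Procedure activity that runs against the source database. A minimal sketch, assuming a hypothetical dbo.Transactions table with a CreatedAt column, batched so a single huge delete does not bloat the transaction log:

    -- Hypothetical table/column names. Run on the source database only
    -- after the copy activity has landed the same rows in the destination.
    DECLARE @cutoff datetime2 = DATEADD(DAY, -90, SYSUTCDATETIME());

    WHILE 1 = 1
    BEGIN
        -- Delete in small batches to avoid one long-running transaction
        DELETE TOP (10000) FROM dbo.Transactions
        WHERE CreatedAt < @cutoff;

        IF @@ROWCOUNT = 0 BREAK;
    END;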
I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has their own database and for security reasons can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop csv/xml/json files for ingestion. If they'll grant you access to pull, you can set up a linked service that more or less functions the same as a linked server in MSSQL. As of the last release, ADF now supports Azure-hosted SSIS packages too.
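To illustrate the Blob Storage drop-zone idea: Azure SQL Database can bulk-load files directly from a blob container through an external data source. A minimal sketch with hypothetical names (storage account, container, staging table, SAS token); it assumes a database master key already exists:

    -- One-time setup: credential + external data source pointing at the container.
    -- The SAS token value below is a placeholder.
    CREATE DATABASE SCOPED CREDENTIAL BlobDropCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<sas-token-without-leading-question-mark>';

    CREATE EXTERNAL DATA SOURCE ClientDropZone
    WITH ( TYPE = BLOB_STORAGE,
           LOCATION   = 'https://yourstorage.blob.core.windows.net/incoming',
           CREDENTIAL = BlobDropCredential );

    -- Load a dropped CSV into a staging table for validation before merging it in.
    BULK INSERT dbo.StagingOrders
    FROM 'orders.csv'
    WITH ( DATA_SOURCE = 'ClientDropZone',
           FORMAT = 'CSV',
           FIRSTROW = 2 );

The validation logic can then run against the staging table before anything touches the live tables.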
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS will do bulk inserts using small batches, so you shouldn't have transaction log issues and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.
We have a SQL Server 2005 database on our local server.
I have to transfer it to our SQL Server 2012 instance on Amazon RDS. Here is what I did (of course, I have to repeat the procedure for the other databases):
I right-clicked the database, selected Generate Scripts - All tables - Copy Schema and Data, and saved everything as a .sql file.
At this point I attempted to use the SQL Azure Migration Wizard v5.15 (in a question here I saw that it works with AWS too, way to go Microsoft!) to transfer the database to AWS.
However, it crashes.
No problem, I tried to use SQL Server Management Studio to import the file instead, but as soon as the RAM consumed by the program reaches 1 GB (as you can see, the DB is 3.4 GB) - BOOM - out of memory error!
What should I do now?
You'll need to do the creation part by part. I faced that problem some time ago; my scripts reached about 4 GB with only the schemas, tables, etc. So I think you should, first of all, generate your scripts for creating schemas, users and logins. After that, tables, views and procedures. Then other objects, like jobs, functions... Finally, export all the data you have to RDS through the Import/Export Wizard in SSMS.
I've followed those steps and it worked for me.
Good luck!
What we do
We run a website that provides statistics. We used to run Access as a backend database, but have now made the transition to SQL Server.
How we work
When we receive new statistics we put them in a staging table for proofreading and testing, before exporting them to the live database. Now we are using Access as a frontend for SQL Server with linked tables. This works fine.
What is the best way to have an Access database with staging tables that, when ready, can be exported to a table in SQL Server? Mind you, the final process should be fairly simple and not technical, as the reason for using Access is its relatively user-friendly UI. Using SQL Server Management Studio would be too technical for the users handling the data.
Let me stress that the solution we need is not a one-time conversion of a table or database, but one for staging changes and then pushing them to SQL Server.
We ended up using linked tables and a local staging table, which we upsize when the data is ready to be pushed to the production database.
https://support.office.com/en-us/article/Move-Access-data-to-a-SQL-Server-database-by-using-the-Upsizing-Wizard-5d74c0df-c8cd-4867-8d07-e6e759d72924
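Once the staging table has been upsized (or linked) into SQL Server, the "push to live" step can be as small as an insert that skips rows already present. A minimal sketch with hypothetical table and key names:

    -- Hypothetical names: dbo.StatisticsStaging feeds dbo.StatisticsLive,
    -- keyed on StatisticId. Only rows not already live are appended.
    INSERT INTO dbo.StatisticsLive (StatisticId, Category, Value, ReportDate)
    SELECT s.StatisticId, s.Category, s.Value, s.ReportDate
    FROM dbo.StatisticsStaging AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.StatisticsLive AS l
        WHERE l.StatisticId = s.StatisticId
    );

    -- Clear the staging table once the rows are live.
    TRUNCATE TABLE dbo.StatisticsStaging;

Wrapped in a stored procedure, this could be run from a single button in the Access frontend, keeping the process non-technical for the users handling the data.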
Here's some documentation on what I discussed in the comments. If you have never used Integration Services before, the beauty of it is that once the import procedure has been created, it can be used over and over again.
So once you have the data in Excel format (or even Access, if you really wanted to), you can follow the steps at the link below:
Creating A Simple SSIS Package
I have a few databases that I want to migrate to another server. These are production databases; what is the best way to migrate?
1) Take a full backup of the current database and then restore it onto the other server.
or
2) Detach the database, copy the MDF/LDF files to the destination server, and then attach the files there.
I know that after migrating, SQL logins and SQL Agent jobs will have to be created manually. Are there any other risks that come to mind?
Any help would be appreciated.
Thanks,
Ben
For login transfer, use "sp_help_revlogin"; you can get the script here:
https://support.microsoft.com/en-us/kb/918992
This stored procedure lists out all the instance-level logins, not just the logins for a particular database. One special thing about this stored procedure is that there is no need to fix orphaned users afterwards. Just migrate the logins and check. It works.
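For option 1 in the question, the backup/restore itself is straightforward T-SQL. A minimal sketch with hypothetical database names and file paths; the WITH MOVE clauses map the data and log files to locations on the destination server:

    -- On the source server
    BACKUP DATABASE MyProdDb
    TO DISK = N'D:\Backups\MyProdDb.bak'
    WITH COMPRESSION, CHECKSUM;

    -- On the destination server, after copying the .bak file across
    -- (logical file names 'MyProdDb' and 'MyProdDb_log' are assumptions;
    --  check them with RESTORE FILELISTONLY FROM DISK = N'...\MyProdDb.bak')
    RESTORE DATABASE MyProdDb
    FROM DISK = N'D:\Backups\MyProdDb.bak'
    WITH MOVE 'MyProdDb'     TO N'E:\Data\MyProdDb.mdf',
         MOVE 'MyProdDb_log' TO N'F:\Logs\MyProdDb_log.ldf',
         RECOVERY;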