sync data from remote SQL Server to local server - sql-server

I have read-only access to two SQL Server databases used by our company's primary application. For reporting and testing purposes, I would like to sync them to two databases I have on-premises at our corporate offices. Because it's a hosted application on Azure, replication is not an option. I want to run queries against both databases; I can query one at a time, but I can't query both together.
I can link to one server via SSMS with linked servers, but not both, because they have the same server name. ODBC might be a solution for that, but we will see.
The issue here is that I want to run a task that copies the data from the hosted SQL Server to the local server, without replication as an option.
I have access to SQL Server 2012 and SQL Server 2019, if that matters.
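On the identical-server-name problem: the local alias you register a linked server under does not have to match the remote machine's name, so each hosted database can get its own alias. A minimal sketch, assuming hypothetical aliases, hostnames, and credentials (an Azure SQL Database target may also require a `@catalog` value):

```sql
-- Hypothetical names throughout; adjust @datasrc to each hosted instance's address.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'HOSTED_DB1',    -- local alias, need not match the remote server name
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'myapp-db1.database.windows.net';

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname  = N'HOSTED_DB1',
    @useself     = N'false',
    @rmtuser     = N'readonly_user',
    @rmtpassword = N'********';
```

Repeating this with a second alias (say, `HOSTED_DB2`) pointing at the other instance would let both be queried from the same session despite the matching remote names.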

I have done manual synchronization tasks before. For example, our phone system has a large call log table in MySQL, and I periodically have to replicate it over to a SQL Server instance used for reporting. I basically create a synchronization log table to record when the last synchronization occurred. Each time the stored procedure is run, it grabs all records between the last synchronization and now. You can expand this example to synch multiple tables (i.e., you can add server_name and table_name columns to db_synch_log).
The problem I see is that you want to replicate an entire DB. If you're updating more than just a few tables, this method is not scalable IMO. For my purposes, I was replicating 3 tables and nothing else.
--You need a log table to record when the last synchronization was performed.
CREATE TABLE db_synch_log (
    last_synch_date datetime
);
--Create variables to hold the current and last-run datetimes.
DECLARE @run_date datetime = GETDATE();
DECLARE @last_run_date datetime;
--Get the last run date from the log table.
SET @last_run_date = (SELECT MAX(last_synch_date) FROM db_synch_log);
--Get all records from the remote database between the last run and this run.
INSERT INTO myTable (column1Name, column2Name, ...)
SELECT column1Name, column2Name, ...
FROM remoteserver.dbo.myTableRemote AS rt
WHERE rt.date_created > @last_run_date
  AND rt.date_created <= @run_date;
--Update the synch log.
INSERT INTO db_synch_log (last_synch_date)
VALUES (@run_date);
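The multi-table extension mentioned above (adding server_name and table_name columns) might look like this sketch; all object and column names are illustrative:

```sql
-- One watermark row per (server, table) pair instead of a single global one.
CREATE TABLE db_synch_log (
    server_name     sysname,
    table_name      sysname,
    last_synch_date datetime
);

DECLARE @run_date datetime = GETDATE();
DECLARE @last_run_date datetime;

-- Look up the watermark for the specific table being synced.
SELECT @last_run_date = MAX(last_synch_date)
FROM db_synch_log
WHERE server_name = 'remoteserver'
  AND table_name  = 'myTableRemote';

-- ...copy the rows as shown earlier, then record the new watermark:
INSERT INTO db_synch_log (server_name, table_name, last_synch_date)
VALUES ('remoteserver', 'myTableRemote', @run_date);
```

A loop or one stored procedure per table can then drive the same pattern for every table that needs syncing.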

Related

Syncing Data from local database to online database SQL Server and VB.Net

Is there a way of syncing a local database to an online database?
Let's say I have a local database with a bunch of data stored. When I click a button, all data from my local database will be synced to my online database. I am using SQL Server 2012 and VB.Net.
Thank you.
As Squirrel recommended, use a LINKED SERVER, a built-in feature of SQL Server (configurable from SQL Server Management Studio; I am using SSMS 2012) that enables the Database Engine to execute a Transact-SQL statement that includes tables in another instance of SQL Server, or another database product such as Oracle. See linked-servers. I was able to synchronize data using queries.
I achieved inserting data from the local database into the online database using the following queries:
INSERT INTO [linkedserverhostname].databasename.dbo.tablename
SELECT * FROM dbo.tablename WHERE CONVERT(date, StartTime) = '08/26/2018'
EXPLANATION:
Insert into
INSERT INTO [linkedserverhostname].databasename.dbo.tablename
whatever rows this select statement returns:
SELECT * FROM dbo.tablename WHERE CONVERT(date, StartTime) = '08/26/2018'
Same applies to UPDATE and DELETE.
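For completeness, the UPDATE and DELETE cases might look like this sketch; table, column, and linked server names are placeholders:

```sql
-- Push changed values from the local table to the linked (online) table.
UPDATE t
SET t.ColumnA = s.ColumnA
FROM [linkedserverhostname].databasename.dbo.tablename AS t
INNER JOIN dbo.tablename AS s
        ON t.Id = s.Id;

-- Remove a day's rows from the linked table.
DELETE t
FROM [linkedserverhostname].databasename.dbo.tablename AS t
WHERE CONVERT(date, t.StartTime) = '2018-08-26';
```

Note that updates and deletes through a linked server run row-by-row against the remote side, so they can be slow on large tables.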
Look into SQL Server Replication Services.

How to store sql result without creating a table to store the result of a stored procedure

We are running exec xp_fixeddrives to get the free space for each physical drive associated with the SQL Server.
I am running this code as part of an SSIS package which fetches all the SQL servers free space.
Currently I am creating a table in tempdb and inserting the results of exec xp_fixeddrives into it. But the issue I am facing is that whenever the server is restarted, I face access issues because the table is in tempdb.
I don't really like the idea of creating a table in the master or model database. A challenge I am facing is that we execute on different instances of SQL Server, with versions ranging from 2000 to 2014. So obviously there are issues I have to keep in mind.
Any suggestion on this are much appreciated.
That's expected: SQL Server resets tempdb whenever the SQL Server service is restarted, so you will face access issues because that table won't exist anymore. I would probably create my own table (in a user database) if I also wanted to store historical check information.
If you are running your code from SSIS and you just want to send mail after validating the results, you don't even have to create a table. Just fill an object variable in SSIS from an Execute SQL Task running the query below:
DECLARE @t TABLE
(Drive VARCHAR(1), Size INT);
INSERT INTO @t
EXEC master..xp_fixeddrives;
SELECT * FROM @t;
Read this object variable in a Script Task to send the mail.

Inserting results from a very large OPENQUERY without distributed transactions

I am trying to insert rows into a Microsoft SQL Server 2014 table from a query that hits a linked Oracle 11g server. I have read-only access rights on the linked server. I have traditionally used OPENQUERY to do something like the following:
INSERT INTO <TABLE> SELECT * FROM OPENQUERY(LINKED_SERVER, <SQL>)
Over time the SQL queries I run have been getting progressively more complex and recently surpassed the OPENQUERY limit of 8000 characters. The general consensus on the web appears to be to switch to something like the following:
INSERT INTO <TABLE> EXECUTE(<SQL>) AT LINKED_SERVER
However, this seems to require that distributed transactions are enabled on the linked server, which isn't an option for this project. Are there any other possible solutions I am missing?
Can you get your second method to work if you disable the "remote proc transaction promotion" linked server option?
EXEC master.dbo.sp_serveroption
    @server   = 'YourLinkedServerName',
    @optname  = 'remote proc transaction promotion',
    @optvalue = 'false';
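With that option disabled, the `EXECUTE ... AT` form should no longer try to promote the statement to a distributed transaction, which sidesteps the 8000-character OPENQUERY limit. A minimal sketch, with placeholder object names:

```sql
-- The remote query can be arbitrarily long when held in an nvarchar(max) variable.
DECLARE @sql nvarchar(max) =
    N'SELECT col1, col2 FROM remote_schema.remote_table';

-- Run it on the linked server and land the rows locally.
INSERT INTO dbo.LocalTable (col1, col2)
EXECUTE (@sql) AT YourLinkedServerName;
```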
If SQL Server Integration Services is installed/available, you could do this with an SSIS package. SQL Server Import/Export Wizard can automate a lot of the package configuration/setup for you.
Here's a previous question with some useful links on SSIS to Oracle:
Connecting to Oracle Database using Sql Server Integration Services
If you're interested in running it via T-SQL, here's an article on executing SSIS packages from a stored proc:
http://www.databasejournal.com/features/mssql/executing-a-ssis-package-from-stored-procedure-in-sql-server.html
I've been in a similar situation before; what worked for me was to decompose the large query string while still using the query method below. (I did not have the luxury of SSIS.)
FROM OPENQUERY(LINKED_SERVER, < SQL >)
Instead of inserting directly into your table, move your main result set into a local landing table first (this could be a physical or temp table).
Decompose your < SQL > query by moving transformation and business logic out of the < SQL > query and into the SQL Server side.
If you have joins in your < SQL > query, bring those result sets across to SQL Server as well and then join locally to your main result set.
Finally, perform your insert locally.
Clear your staging area.
There are various approaches (like wrapping your open queries in views), but I like flexibility and found that reducing my OPENQUERY calls to the minimum, then storing and transforming locally, yielded better results.
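The steps above can be sketched as follows; all object names are illustrative, and the remote query strings stay well under the OPENQUERY limit because the joins and transformations happen locally:

```sql
-- 1. Land the trimmed-down main result set locally.
SELECT *
INTO #staging_main
FROM OPENQUERY(LINKED_SERVER, 'SELECT id, amount FROM remote_table');

-- 2. Bring any lookup/join sets across separately.
SELECT *
INTO #staging_lookup
FROM OPENQUERY(LINKED_SERVER, 'SELECT id, category FROM remote_lookup');

-- 3. Join and transform on the SQL Server side, then insert into the target.
INSERT INTO dbo.TargetTable (id, amount, category)
SELECT m.id, m.amount, l.category
FROM #staging_main AS m
INNER JOIN #staging_lookup AS l
        ON l.id = m.id;

-- 4. Clear the staging area.
DROP TABLE #staging_main, #staging_lookup;
```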
Hope this helps.

Creating a snapshot database to another SQL Server

I'm trying to save the values of several columns of one table to another table on a different server. I am using SQL Server. I would like to do this without running external programs that query from this database and insert the results into the new database. Is there any way to do this from within the SQL Server Management Studio?
This is a recurring event that occurs every hour. I have tried scheduling maintenance tasks that execute custom T-SQL scripts but I'm having trouble getting the connection to the remote server.
Any help would be appreciated.
If you can set up the remote server as a linked server you should be able to configure the SQL Server Agent to execute jobs that contain queries that access tables on both the local and linked server. Remember that you might have to configure the access rights for the account used to run SQL Server Agent so that it has permissions to read/write tables on both servers.
This practice might not be without issues though as this article discusses.
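Since the requirement is an hourly recurring copy, the Agent side of this might look like the following sketch (job, schedule, linked server, and table names are all illustrative):

```sql
-- Create an hourly SQL Server Agent job that runs the snapshot copy.
USE msdb;

EXEC dbo.sp_add_job
    @job_name = N'Hourly snapshot copy';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Hourly snapshot copy',
    @step_name = N'Copy columns to remote table',
    @subsystem = N'TSQL',
    @command   = N'INSERT INTO LinkedServer.TargetDb.dbo.TargetTable (col1, col2)
                   SELECT col1, col2 FROM SourceDb.dbo.SourceTable;';

EXEC dbo.sp_add_jobschedule
    @job_name             = N'Hourly snapshot copy',
    @name                 = N'Every hour',
    @freq_type            = 4,   -- daily
    @freq_interval        = 1,
    @freq_subday_type     = 8,   -- repeat in units of hours
    @freq_subday_interval = 1;   -- every 1 hour

EXEC dbo.sp_add_jobserver
    @job_name = N'Hourly snapshot copy';
```

As noted above, the account SQL Server Agent runs under needs read/write permissions on both sides of the copy.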
You can use a four-part name like:
INSERT [InstanceName].[DatabaseName].[SchemaName].[TableName]
SELECT * FROM [SourceInstanceName].[SourceDatabaseName].[SourceSchemaName].[SourceTableName]
But first you will have to set up the remote server as a linked server, as described here:
https://msdn.microsoft.com/en-us/library/aa560998.aspx

SQL Server Service Broker to insert data on a SQL Server 2000

I have two servers, the first with SQL Server 2005 and the second with SQL Server 2000. I want to insert data from the 2005 server into the 2000 server, and I want to do it asynchronously (without distributed transactions, because "save transaction" is used).
Once the information is inserted in the tables of the 2000 server, some instead-of triggers are fired to process this information.
In that scenario I decided to use Service Broker. I have a stored procedure to insert information from one server into the other, and it works perfectly.
But when I call this procedure from the target queue's message-processing procedure, it fails, and I don't know why!
Also, I know the design works, because the same structure (queues and stored procedures) copies data correctly from one database to another on the same SQL 2005 server.
So it fails only between machines. Does anyone know why, or how to get more information about the cause of the failure? Or how else to insert data asynchronously? (I can't use SQL Agent because I want to insert the information more frequently than once a minute.)
The usual issue is that the SSB procedure uses WAITFOR, and WAITFOR is incompatible with distributed transactions. The only solution is to get rid of the WAITFOR(RECEIVE) and use an ordinary RECEIVE instead.
Consider using a linked server instead of service broker. With a linked server, you can:
insert into LinkedServer.The2000Db.dbo.The2000Table
(col1, col2, col3)
select col1, col2, col3
from The2005Table
