How to drop and recreate table in destination server? - sql-server

Here is the scenario for which I am trying to create an SSIS package using VS2013. We have two SQL Server 2014 servers - A & B.
From ServerA.Database1, I need to copy the data for a few specific tables to ServerB.Database2.
But before the copy, I have to drop each table and recreate it at the destination (ServerB.Database2) using the table schema from the source (ServerA.Database1), because schema changes happen frequently. This needs to be scheduled weekly.
How can I accomplish this using SSIS (i.e., how do I retrieve the source table schema in the SSIS package so it can be used to create the table at the destination)? Or is there another way?
Thanks
Bhanu.

Add an Execute SQL Task and provide SQL text that drops the table ('DROP TABLE tablename') and then recreates it ('CREATE TABLE tablename (columns and data types)').
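A minimal sketch of the SQL text such an Execute SQL Task could run against ServerB.Database2; the table and column names below are placeholders for the real source schema:
IF OBJECT_ID('dbo.tablename', 'U') IS NOT NULL
    DROP TABLE dbo.tablename;

CREATE TABLE dbo.tablename
(
    Id        INT          NOT NULL,
    SomeValue VARCHAR(100) NULL,
    LoadDate  DATETIME     NULL
);
A Data Flow Task (OLE DB Source on ServerA.Database1, OLE DB Destination on ServerB.Database2) can then reload the data once this task completes.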

The approach below requires a linked server, but then there is no need to worry about the table schema: SELECT ... INTO recreates the destination table from whatever the source schema currently is. Run it on the target server (ServerB) with a linked server pointing at the source:
DROP TABLE dbo.tablename;
SELECT * INTO dbo.tablename FROM SourceServer.database.dbo.tablename;

Related

How to speed up tables transfer between Access and SQL Server using VBA?

I am trying to move tables from Access to SQL Server programmatically.
I have some limitations in the system permissions, i.e. I cannot use OPENDATASOURCE or OPENROWSET.
What I want to achieve is to transfer some tables from Access to SQL Server and then work on those tables through VBA (Excel)/Python and T-SQL.
The problem is the time it takes to move the tables.
My current process is:
I work with VBA macros, importing data from Excel and making some transformations in Access, to then import into SQL Server
destroy the table on the server: "DROP TABLE"
re-import the table with DoCmd.TransferDatabase
What I have noticed is that the operation seems to be done in batches of rows rather than all at once. It is taking about a minute and a half per 1,000 rows. The same operation in Access would have taken a few seconds.
I understand that this is specific to the way SQL Server imports in batches of 10 rows, probably to keep the data more accessible: Microsoft details
But in the above process I just want to copy the table from Access to SQL Server as fast as possible, as I would then avoid cross-platform links and perform operations only on SQL Server.
Which would be the fastest way to achieve this goal?
Why are functions like OPENDATASOURCE or OPENROWSET blocked? Do you work in a bank?
I can't say for sure which solution is the absolute fastest, but you may want to consider exporting all Access tables as separate CSV files (or Excel files), and then running a small script to load each of those files into SQL Server.
Here is some VBA code that saves separate tables as separate files.
Dim obj As AccessObject, dbs As Object
Set dbs = Application.CurrentData

' Export every user table (skipping the MSys* system tables) as both a CSV and an Excel file
For Each obj In dbs.AllTables
    If Left(obj.Name, 4) <> "MSys" Then
        DoCmd.TransferText acExportDelim, , obj.Name, obj.Name & ".csv", True
        DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, obj.Name, obj.Name & ".xls", True
    End If
Next obj
Now, you can very easily, and very quickly, load CSV files into SQL Server using Bulk Insert.
-- Create the test table
USE TestData
GO
CREATE TABLE CSVTest
(
    ID INT,
    FirstName VARCHAR(40),
    LastName VARCHAR(40),
    BirthDate SMALLDATETIME
)
GO
-- Load the CSV file
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO
-- Check the content of the table.
SELECT *
FROM CSVTest
GO
-- Drop the table to clean up the database.
DROP TABLE CSVTest
GO
https://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
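Since the export above produces one file per table, a small driver script can loop over the table names and bulk insert each file. A rough sketch, assuming the target tables already exist, the CSVs sit in a hypothetical C:\export\ folder named after the tables, and each file has a header row (the True argument passed to TransferText):
DECLARE @tables TABLE (TableName SYSNAME);
INSERT INTO @tables VALUES ('CSVTest'), ('AnotherTable');  -- hypothetical table list

DECLARE @name SYSNAME, @sql NVARCHAR(MAX);
DECLARE file_cursor CURSOR FOR SELECT TableName FROM @tables;
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build and run one BULK INSERT per exported CSV
    SET @sql = N'BULK INSERT ' + QUOTENAME(@name) +
               N' FROM ''C:\export\' + @name + N'.csv''' +
               N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM file_cursor INTO @name;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;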
Also, you may want to consider one of these options.
https://www.online-tech-tips.com/ms-office-tips/ms-access-to-sql-database/
https://support.office.com/en-us/article/move-access-data-to-a-sql-server-database-by-using-the-upsizing-wizard-5d74c0df-c8cd-4867-8d07-e6e759d72924

ETL Script to dynamically map multiple EXECUTE SQL resultset to multiple tables (table name based on sql file provided)

I have a source folder with SQL files (I could set them up as stored procedures as well). I know how to loop over and execute SQL tasks in a Foreach container. The part where I'm stuck is that I need to take the final result set of each SQL query and shove it into a table with the same name as the SQL file.
So: folder -> script1.sql, script2.sql, etc. -> ETL -> goes to table script1, table script2, etc.
EDIT: Based on the comment made by Joe, I just want to say that I'm aware of using an insert within a script, but I need to insert into a table on a different server, and linked servers are not the ideal solution.
Any pseudocode or links to tutorials would be extremely helpful. Thanks!
I would add the table creation to the script. It is probably the simplest way to do this. If your script is SELECT SomeField FROM Table1, you could change it to SELECT SomeField INTO script1 FROM Table1. Then there is no need to map columns in SSIS, which is not easy to do in my experience.
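For example, a script file named script1.sql could end with something like the following (names here are purely illustrative), so that executing the file both creates the table and fills it:
-- original final query in script1.sql
SELECT SomeField
FROM Table1;

-- rewritten so the result lands in a table named after the file
SELECT SomeField
INTO script1
FROM Table1;
The table name still has to be edited in each script (or generated from the file name by whatever builds the SQL text), and note that SELECT ... INTO creates the table on the connection the script runs against, which matters given the different-server requirement.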

insert into table from another table if there is any duplicates do not insert

OK, I have a database with a table LOOKUP (1st), and the same database on another server also with LOOKUP (2nd).
Is there a way I can insert into the 1st database from the 2nd, so that if a duplicate exists it is skipped, and all other values present in the 2nd are inserted into the 1st? Basically I want the exact same database!
The thing that confuses me is that they are on different servers.
Can I export one to something like Excel and import it again to replace my database, or anything like that?
You will have to use two MERGE queries if you want to make both databases identical. The first merge only inserts into DB2 the records that are available in DB1; DB1 will still be missing the records that are present in DB2 but not in DB1.
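A hedged sketch of one of those MERGE statements, assuming a linked server named OtherServer pointing at the other instance, that ID is the key of LOOKUP, and that Value stands in for the remaining columns; the mirror-image statement would run against the other server:
MERGE dbo.LOOKUP AS target
USING OtherServer.DatabaseName.dbo.LOOKUP AS source
    ON target.ID = source.ID
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ID, Value)
    VALUES (source.ID, source.Value);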
I would suggest doing this task in SSIS.
You can use two sources, DB1 and DB2, and a Lookup transformation on each source (LKP1 and LKP2).
Then you can insert the No Match output of LKP1 into a DB2 destination and the No Match output of LKP2 into a DB1 destination.
This will solve the multi-server issue as well, because you can create a connection to any server in SSIS.

Create a copy of a table within the same database with SSIS

I want to create a copy of a table, say TestTable, with a new name, say TestTableNew, in the same database with the use of an SSIS package. I've created a "Transfer SQL Server Objects Task" for this with the source database specified as both the SourceDatabase and the DestinationDatabase. When I run this task, the original table TestTable is overwritten with a new, empty TestTable.
This might well be something really obvious that I've overlooked, but can I somehow specify another name for the destination table somewhere in this transfer task? Or should I solve this in another way?
You can't use the "Transfer SQL Server Objects Task" to copy a table to the same database because there isn't an option to specify the new table name. You would be copying table "TestTable" to table "TestTable", which will fail because they both have the same name.
You can set the "DropObjectsFirst" property to true, but that will make you lose your original table and its data, which I think you did on your test, otherwise you would have received a failure message.
The best option here is to use an "Execute SQL Task" to create the structure of TestTableNew based on TestTable, and then do a simple OLE DB Source -> OLE DB Destination data flow to load all the data from one table to the other.
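One possible statement for that Execute SQL Task, as a minimal sketch, is a SELECT TOP 0 ... INTO, which copies the column definitions (though not indexes, keys, or constraints):
IF OBJECT_ID('dbo.TestTableNew', 'U') IS NOT NULL
    DROP TABLE dbo.TestTableNew;

-- Creates an empty TestTableNew with the same columns as TestTable
SELECT TOP 0 *
INTO dbo.TestTableNew
FROM dbo.TestTable;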
My knowledge of SSIS is very limited, but I assume you can run SQL commands, passing in parameters, and therefore generate something like the following dynamically:
SELECT *
INTO TestTableNew
FROM TestTable

How to merge table from access to SQL Express?

I have one table named "Staff" in Access and also have this table (same name) in SQL Server 2008.
Both tables have thousands of records. I want to merge records from the Access table into the SQL table without affecting the existing records in SQL. Normally I just export using the ODBC driver, and that works fine if the table doesn't exist in SQL Server. Please advise. Thanks.
A simple append query from the local Access table to the linked SQL Server table should work just fine in this case.
So just drop the first (from) table into the query builder, then change the query type to Append, and you are prompted for the append table name.
From that point on, just drop in the columns you want (do not drop in the PK column, as it need not be used nor transferred in this case).
You can also type in the sql directly in the query builder. Either way, you will wind up with something like:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT ADMINID, Amount, Notes, Status
FROM custsql1;
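If the append might be run more than once, a hedged variant of the same query skips rows that already exist in the linked table, assuming ADMINID identifies a record (Access's classic unmatched-rows pattern):
INSERT INTO dbo_custsql ( ADMINID, Amount, Notes, Status )
SELECT c.ADMINID, c.Amount, c.Notes, c.Status
FROM custsql1 AS c LEFT JOIN dbo_custsql AS d
    ON c.ADMINID = d.ADMINID
WHERE d.ADMINID IS NULL;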
This may help: http://www.red-gate.com/products/sql-development/sql-compare/
Or you could write a simple program to read from each data set and do the comparison, adding, updating, and deleting, etc.
