How to automate a script to insert data into a SQL Server table

I have an Excel file which has a column, and the data in the column is:
INSERT INTO dbo.Test (Id1, Id2, Id3)
VALUES (1, 2, 3)
Can someone please let me know how I can automate this through SSIS so that the INSERT statements directly load data into the table?
I can create an SSIS package to load this Excel file into a table, but I need the statements in the column to execute and insert the data into the Test table.
Any suggestions would be helpful.
Thanks

So, as I understand it, you already have a complete SSIS package with a connection to Excel, a connection to the database, and a transformation in the pipeline (grab data from the source, push it into the destination).
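For the actual "execute the column" part, one hedged approach: first land the Excel column in a staging table, then run each stored statement with dynamic SQL, for example from an Execute SQL Task. A minimal sketch, where the staging table dbo.SqlStatements and its SqlText column are assumed names:

DECLARE @stmt NVARCHAR(MAX);
-- Walk the staged statements; the LIKE filter is a crude sanity check.
DECLARE stmt_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT SqlText FROM dbo.SqlStatements WHERE SqlText LIKE N'INSERT INTO%';
OPEN stmt_cursor;
FETCH NEXT FROM stmt_cursor INTO @stmt;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Runs e.g. INSERT INTO dbo.Test (Id1, Id2, Id3) VALUES (1, 2, 3)
    EXEC sys.sp_executesql @stmt;
    FETCH NEXT FROM stmt_cursor INTO @stmt;
END
CLOSE stmt_cursor;
DEALLOCATE stmt_cursor;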
There are different ways to execute an SSIS package from within SQL Server (running it as a bat or cmd file, for example), but I suggest the official one.
You can find a complete step-by-step tutorial on how to deploy SSIS projects to SQL Server here:
https://learn.microsoft.com/en-us/sql/integration-services/packages/deploy-integration-services-ssis-projects-and-packages?view=sql-server-ver15
Basically, right-click the project in Visual Studio and click Deploy, then follow the steps.
Take a look here:
https://learn.microsoft.com/en-us/sql/integration-services/packages/deploy-integration-services-ssis-projects-and-packages?view=sql-server-ver15#to-deploy-and-execute-a-package-using-stored-procedures
There is a step-by-step tutorial there on how to run your SSIS package from a stored procedure.
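In short, for a project deployed to the SSIS catalog, execution comes down to a couple of catalog stored procedures. A minimal sketch, where the folder, project, and package names are assumptions to replace with your own:

DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
    @folder_name  = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'LoadExcel.dtsx',
    @use32bitruntime = 0,
    @execution_id = @execution_id OUTPUT;
-- Optional: make the call synchronous so the caller waits for the package.
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id, @object_type = 50,
    @parameter_name = N'SYNCHRONIZED', @parameter_value = 1;
EXEC SSISDB.catalog.start_execution @execution_id;

You can wrap exactly this in your own stored procedure, which is what the tutorial above does.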
Now you have to decide how to automate running the stored procedure itself. Here, too, you have many approaches, for example your own timer implementation outside of SQL Server, or the Windows Task Scheduler.
As I don't really know your environment, I'll suggest an easy way: create a scheduled job with SQL Server Agent (it has to be installed on your SQL Server instance).
A step-by-step tutorial is here:
https://learn.microsoft.com/en-us/sql/ssms/agent/create-a-job?view=sql-server-ver15
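The wizard in that tutorial can also be expressed in T-SQL against msdb. A hedged sketch; the job name, schedule, and catalog path in /ISSERVER are illustrative assumptions, and the easiest way to get the quoting in @command exactly right is to script an existing job from SSMS:

USE msdb;
EXEC dbo.sp_add_job @job_name = N'Nightly Excel load';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Nightly Excel load',
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',
    @command   = N'/ISSERVER "\SSISDB\MyFolder\MyProject\LoadExcel.dtsx" /SERVER "(local)"';
EXEC dbo.sp_add_jobschedule
    @job_name = N'Nightly Excel load',
    @name     = N'Daily 2 AM',
    @freq_type = 4,               -- daily
    @freq_interval = 1,           -- every day
    @active_start_time = 020000;  -- HHMMSS
EXEC dbo.sp_add_jobserver @job_name = N'Nightly Excel load';  -- target the local server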

Related

SQL Server data transfer from one server to another server

I want to transfer multiple tables and their data from one SQL Server to another SQL Server on the local network automatically every 1 hour.
There is a built-in tool in SSMS to do this.
In SSMS, right-click on the destination database name and select Tasks > Import Data... You will be prompted to provide connection information for the source database. Internally, this uses SSIS.
Create an SSIS package. Use a Type 2 SCD if you want inserts and updates; a minimal sketch is shown below. Using a staging table between source and destination is good practice and an industry standard; if you don't have a staging environment, you can use temp tables within the SSIS package to achieve the same thing.
Schedule a job and run that SSIS package in the job every half an hour.
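For the Type 2 SCD part mentioned above, a minimal T-SQL sketch; the table and column names (dbo.StagingCustomer, dbo.DimCustomer, IsCurrent, ValidFrom, ValidTo) are assumptions, and the SSIS Slowly Changing Dimension transformation can do the same inside the package:

-- Expire the current version of rows whose attributes changed.
UPDATE d
SET d.IsCurrent = 0, d.ValidTo = SYSDATETIME()
FROM dbo.DimCustomer AS d
JOIN dbo.StagingCustomer AS s ON s.CustomerId = d.CustomerId
WHERE d.IsCurrent = 1
  AND (d.Name <> s.Name OR d.City <> s.City);

-- Insert a fresh current version for new customers and just-expired ones.
INSERT INTO dbo.DimCustomer (CustomerId, Name, City, IsCurrent, ValidFrom, ValidTo)
SELECT s.CustomerId, s.Name, s.City, 1, SYSDATETIME(), NULL
FROM dbo.StagingCustomer AS s
LEFT JOIN dbo.DimCustomer AS d
    ON d.CustomerId = s.CustomerId AND d.IsCurrent = 1
WHERE d.CustomerId IS NULL;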

Is there a way to save all queries present in an SSIS package/dtsx file?

I need to run some analysis on my queries (specifically, finding all the tables which an SSIS package calls).
Right now I'm opening up every single SSIS package and every single step in it, and copying and pasting the tables from it manually.
As you can imagine it's very time consuming and mind-numbing.
Is there a way to export all the queries automatically?
By the way, I'm using SQL Server 2012.
Retrieving the queries is not a simple process; you can work in two ways to achieve it:
Analyzing the .dtsx package XML content using regular expressions
SSIS packages (.dtsx) are XML files, so you can read them as text files and use regular expressions to retrieve tables (as an example, you may search for all statements that start with the SELECT, UPDATE, DELETE, DROP, ... keywords); a sketch follows the links below.
There are some questions asking how to retrieve information from .dtsx files that you can refer to for ideas:
Reverse engineering SSIS package using C#
Automate Version number Retrieval from .Dtsx files
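T-SQL itself has no native regular expressions, so as a hedged alternative to the regex approach you can load one .dtsx as XML and query the places where SQL text usually lives. The file path is an assumption, ad hoc OPENROWSET(BULK ...) permissions are required, and property names vary by component type:

DECLARE @pkg XML;
SELECT @pkg = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK N'C:\packages\MyPackage.dtsx', SINGLE_BLOB) AS f;

-- OLE DB source/destination components keep their query in a "SqlCommand" property.
SELECT p.value('.', 'NVARCHAR(MAX)') AS SqlText
FROM @pkg.nodes('//*[local-name()="property"][@name="SqlCommand"]') AS t(p)
WHERE p.value('.', 'NVARCHAR(MAX)') <> '';

-- Execute SQL Tasks keep theirs in a SqlStatementSource attribute.
SELECT a.value('.', 'NVARCHAR(MAX)') AS SqlText
FROM @pkg.nodes('//@*[local-name()="SqlStatementSource"]') AS t(a);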
Using SQL Profiler
You can create and run a SQL Profiler trace on the SQL Server instance and filter on all T-SQL commands executed while the SSIS package runs. Some examples can be found in the following posts:
How to capture queries, tables and fields using the SQL Server Profiler
How to monitor just t-sql commands in SQL Profiler?
SSIS OLE DB Source Editor Data Access Mode: “SQL command” vs “Table or view”
Is there a way in SQL profiler to filter by INSERT statements?
Filter Events in a Trace (SQL Server Profiler)
You can also use Extended Events (which has more options than Profiler) to monitor the server and collect SQL commands:
Getting Started with Extended Events in SQL Server 2012
Capturing queries run by user on SQL Server using extended events
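A minimal Extended Events sketch along those lines; the session name and file path are assumptions, and the application-name filter assumes your SSIS connection managers report a name starting with 'SSIS' (check sys.dm_exec_sessions to confirm before relying on it):

CREATE EVENT SESSION CaptureSsisSql ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.client_app_name)
    WHERE sqlserver.like_i_sql_unicode_string(sqlserver.client_app_name, N'SSIS%')),
ADD EVENT sqlserver.rpc_completed (
    ACTION (sqlserver.client_app_name)
    WHERE sqlserver.like_i_sql_unicode_string(sqlserver.client_app_name, N'SSIS%'))
ADD TARGET package0.event_file (SET filename = N'C:\traces\ssis_sql.xel');

ALTER EVENT SESSION CaptureSsisSql ON SERVER STATE = START;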
You could also create a schema for this specific project and then keep all the SQL in views on that schema. That will help keep things tidy and help with issues like this.

SQL Server - copy all data from all tables from one server to another server with identical structure

I want to copy all data from all tables from one SQL Server database to another existing SQL Server database with the same structure. I have a script that initially deletes all contents of all tables in my output database before proceeding, so that it is 'fresh' for the copy.
I understand that SELECT INTO statements can get this done, but I want to be able to do it in bulk. I want to emulate the behavior that works very well in Management Studio:
Right-click a DB
Select 'Tasks'
Select 'Export Data...'
In here, I can select an output DB and then select all tables. The transfer goes straight through without issue. I cannot find a command-line way to achieve this.
The reason I am after this is that we want a daily copy of the prod database in a testing environment, so I need to schedule this process to run each night.
Due to some constraints, I can't use a bacpac in this case.
Using the import/export task in SSMS, the last step has two options: run immediately, or save as an SSIS package. So save it as an SSIS package. You can then run this package whenever you want. And yes, you will need to do this twice: once for the export, once for the import. You can also do exactly the same thing in SSIS itself, by the way.
So how do you execute a package from the command line? As with any question, you should search first; some suggestions and examples are here.
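For reference, typical dtexec invocations, with the package paths as assumptions; /FILE is for packages saved to disk, /ISSERVER for packages deployed to the SSIS catalog:

dtexec /FILE "C:\packages\ExportProd.dtsx"
dtexec /ISSERVER "\SSISDB\MyFolder\MyProject\ExportProd.dtsx" /SERVER "localhost"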
And if needed, you can schedule this using SQL Server Agent.

Automate SQL Server join to Excel

I have an automated SSRS report that runs off a SQL Server SP. A subscription then exports this to a folder each morning. I'm not sure if this next step is possible, but maybe someone knows:
Another Excel report from an outside source will be dropped into the same folder each morning. Is it possible to then automatically compare the two reports and delete records from my automated report that do not match?
I know I can manually import the new Excel file into SQL Server, do a join and then delete the records. But is it possible to do this automatically?
This sounds like a job for SSIS; it is designed to handle ETL workflows like this. I would create a package that pulls in the outside Excel file and either reads the output from your SSRS subscription or, preferably, calls the stored procedure itself. You can do all the data comparison you need and then output in whatever format you need, or run T-SQL statements against your source database based on the results of your comparison, as appropriate to your use case. You'd deploy your package to a SQL Server instance set up for SSIS and create a SQL Agent job to run it at the appropriate interval.
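For the delete step itself, a hedged T-SQL sketch, assuming the package has already landed the outside Excel file in dbo.OutsideReport, the report data lives in dbo.MyReport, and RecordKey is the matching column (all hypothetical names):

-- Remove report rows with no match in the outside file.
DELETE m
FROM dbo.MyReport AS m
LEFT JOIN dbo.OutsideReport AS o ON o.RecordKey = m.RecordKey
WHERE o.RecordKey IS NULL;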

Dynamically create destination table from source server with SSIS

I need a bit of advice on how to solve the following task:
I have a source system based on IBM DB2 (IBMDA400) which has a lot of tables whose structure changes rapidly, on a daily basis. I must load specified tables from DB2 into a MSSQL 2008 R2 server, so I thought SSIS would be the best choice.
My first attempt was just to add both data sources, drop all tables in MSSQL, and recreate them with a "Select * Into #Table From #Table". But I was not able to get this working because I could not connect the two OLE DB connections. I also tried this with an OPENROWSET statement, but the SQL Server does not allow that for security reasons and I am not allowed to change that.
My second try was to manually read the tables from the source, drop and recreate the tables in a Foreach Loop, and then load the data via the Data Flow Task. But I got stuck on getting the metadata from the Execute SQL Task, so I didn't get the column names and types.
I cannot believe that this is so hard to achieve. Why is there no "create table if not exists" checkbox on the Data Flow Task?
Of course I searched for the problem here before but could not find a solution.
Thanks in advance,
Pad
This is the solution I ended up with (a sketch of one loop iteration follows the steps):
Create a file/table which is used to select the source tables.
Important: create a linked server on your SQL instance, or a working connection string for OPENROWSET (I was not able to get the latter working, so I chose the linked server).
Query the source file/table.
Loop through the result set.
Use variables and a Script Task to build your query.
Drop the destination table.
Build another query string with SELECT ... INTO ... FROM OPENROWSET (or, if you used a linked server, OPENQUERY); since the destination table was just dropped, SELECT INTO recreates it with the source's current structure.
Execute this statement.
Done.
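A minimal T-SQL sketch of one loop iteration (drop, rebuild, load), assuming a linked server named DB2SRC and a fully qualified source table name in @table:

DECLARE @table SYSNAME = N'MYLIB.MYTABLE';  -- assumed source table
DECLARE @sql NVARCHAR(MAX);

-- Drop the old copy if it exists (SQL Server 2008 R2 compatible syntax).
SET @sql = N'IF OBJECT_ID(N''dbo.' + PARSENAME(@table, 1) + N''') IS NOT NULL
    DROP TABLE dbo.' + PARSENAME(@table, 1) + N';';
EXEC sys.sp_executesql @sql;

-- Recreate and load in one shot; the structure comes from the source.
SET @sql = N'SELECT * INTO dbo.' + PARSENAME(@table, 1)
    + N' FROM OPENQUERY(DB2SRC, ''SELECT * FROM ' + @table + N''');';
EXEC sys.sp_executesql @sql;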
As I said above, I am not quite happy with this, but for now it should be OK. I will update this if I find another solution.
