Dynamically load .xlsx files in SSIS - sql-server

I created an SSIS package to load Excel files. It loops through a specified folder for relevant files, reads the data in each file into a raw data table, and then SQL scripts do the validation and place the data in the relevant tables, and it all works fine.
But I now need to make the SSIS package handle loading Excel files with 3 different file structures, i.e. one file will have 50 columns, one will have 55 and one will have 60.
I have tried using a script task to load the data
Insert into <rawdatatable> select * from openrowset('Microsoft.Jet.OLEDB.4.0','excel 8.0; database=D:\SSIS\FileToLoad.xlsx', 'Select * from [Sheet1$]')
but I keep getting the error below, and adding error logging doesn't give any further detail:
Exception has been thrown by the target of an invocation
I am using SQL Server 2014 and VS 2013
I'm not really sure what I am doing here; any help or guidance would be appreciated.
Thanks

You must use the Microsoft.ACE.OLEDB.12.0 provider; try the following:
Insert into <rawdatatable>
select * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=D:\SSIS\FileToLoad.xlsx;HDR=YES',
'SELECT * FROM [Sheet1$]')
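If OPENROWSET still fails even with the ACE provider installed, ad hoc distributed queries may be disabled on the instance. A sketch of how to enable them (requires sysadmin rights; run once per instance):

```sql
-- Enable ad hoc distributed queries, which OPENROWSET/OPENDATASOURCE depend on
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
```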
References
Import/Export Excel (.Xlsx) or (.Xls) File into SQL Server

Related

How to generate Insert statement from PGAdmin4 Tool?

We are writing a new application, and while testing we will need a bunch of dummy data. I've added that data by using MS Access to dump Excel files into the relevant tables in the Postgres database.
What should I do now to generate INSERT statements from the pgAdmin 4 tool, similar to what SQL Server Management Studio allows us to generate for SQL Server? There are no options available to me. I can't use the closest one, which is to export and import the data via CSV.
I understand that you cannot import the CSV file into the actual DB, as that needs to be done through ASP.NET Core EF. However, you can create a test schema and import the CSV file into it. Once you have the data imported into the test schema, you can use it to generate SQL statements using the steps below:
Right click on target table and select "Backup".
Select a file path to store the backup. You can save the file name as data.backup
Choose "Plain" as Format.
Open the "Options" tab and check "Use Column Inserts".
Click the Backup button.
Once the file is generated, you can open it with Notepad++ or VS Code to get the SQL INSERT statements.
You can use the generated statements and then delete the test schema you created.
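With "Plain" format and "Use Column Inserts" selected, the backup file contains one INSERT per row, along the lines of the following (table and data are hypothetical, for illustration only):

```sql
-- Excerpt of a plain-format pg_dump backup with column inserts
INSERT INTO public.customers (id, name, city) VALUES (1, 'Alice', 'Oslo');
INSERT INTO public.customers (id, name, city) VALUES (2, 'Bob', 'Leeds');
```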
Here is a resource that might help you load data from an Excel file into PostgreSQL if you still need to take this path: Transfer Data from Excel to PostgreSQL

SSIS package is executing while modification done on source table

I created an SSIS package using a simple DFT, transferring data from an OLE DB source to an OLE DB destination. When I add a new column to my source table, the package still executes successfully, but I want the package to fail. Could anyone suggest how to achieve this?
In the Control Flow area, add an Execute SQL Task before your DFT.
Set up the connection to your database and for your SQLStatement use the following:
CREATE TABLE #temp (<define all columns currently in your OLEDB Source Table>)
INSERT INTO #temp
SELECT TOP 1 *
FROM <your OLEDB Source Table>
By using this "worst practice" insert syntax, you can cause a failure if your OLEDB Source Table has any columns added or removed.
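As a hypothetical illustration, if the source table currently had exactly three columns, the task's SQLStatement might look like this (the table name and column list are assumptions, not from the original question):

```sql
-- INSERT ... SELECT * requires the column counts to match exactly,
-- so this fails at runtime if dbo.SourceTable gains or loses a column.
CREATE TABLE #temp
(
    Id       INT,
    Name     NVARCHAR(100),
    LoadDate DATETIME
);

INSERT INTO #temp
SELECT TOP 1 *
FROM dbo.SourceTable;
```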
When selecting a Table in OLEDB Source, the metadata of the table is mapped to the OLEDB Source component.
Each time you run the package, it sends a SELECT * FROM Table command to the SQL Server, retrieves the data, and maps every column from the SQL table to an OLE DB Source column.
If a column exists in SQL but is not defined in the OLE DB Source, it will be ignored. On the other hand, if a column is defined in the OLE DB Source but not found in SQL, an exception will be thrown.
The only way to validate metadata before running the package is to add an Execute SQL Task or Script Task to check the metadata before the DFT is executed.
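One hedged sketch of such a check: compare the table's current column count against the count the DFT metadata was built for, and raise an error to fail the task (the table name and expected count here are assumptions):

```sql
-- Fail the Execute SQL Task (and thus the package) if the schema changed
DECLARE @Expected INT = 55;  -- column count the DFT metadata was built against
DECLARE @Actual INT;

SELECT @Actual = COUNT(*)
FROM sys.columns
WHERE object_id = OBJECT_ID(N'dbo.SourceTable');

IF @Actual <> @Expected
    RAISERROR('Source table schema changed: expected %d columns, found %d.',
              16, 1, @Expected, @Actual);
```

A stricter variant could also compare column names and types from sys.columns, since a count check alone misses a renamed or retyped column.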
References
SSIS OLE DB Source Editor Data Access Mode: “SQL command” vs “Table or view”
OLEDB Source

SSIS - How to insert into OLE DB destination using SQL command while the source is flat file?

I want to know how to insert values into a SQL Server database from a flat file source in SSIS using a SQL command. I've done the insert using Table or view mode; now I have to do it using a SQL command.
Well, you need a good query to set in an Execute SQL Task in SSIS.
You can parameterize the query in the Execute SQL Task of SSIS.
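With an OLE DB connection, the Execute SQL Task uses ? placeholders that you bind to package variables on the Parameter Mapping tab. A minimal hypothetical statement (table and column names are assumptions):

```sql
-- Each ? is mapped to an SSIS variable in the task's Parameter Mapping tab,
-- in the order the placeholders appear.
INSERT INTO dbo.TargetTable (FileName, LoadDate)
VALUES (?, ?);
```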
BCP
This is one of the most widely used options. One reason is that it has been around for a while, so DBAs have become quite familiar with the command. It allows you to both import and export data, but is primarily used for text data formats. In addition, it is generally run from a Windows command prompt, but it could also be called from a stored procedure by using xp_cmdshell or from an SSIS package.
Here is a simple command for importing data from file C:\ImportData.txt into table dbo.ImportTest.
bcp dbo.ImportTest in "C:\ImportData.txt" -c -T -S serverName\instanceName
BULK INSERT
This command is a T-SQL command that allows you to import data directly from within SQL Server by using T-SQL. This command imports data from file C:\ImportData.txt into table dbo.ImportTest.
BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH ( FIELDTERMINATOR =',', FIRSTROW = 2 )
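When some rows fail to convert, BULK INSERT can be told to log the offending rows instead of aborting on the first error. A sketch (the error-file path and error limit are assumptions):

```sql
-- Tolerate a few bad rows and capture them for inspection
BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,
    MAXERRORS       = 10,                          -- allow up to 10 bad rows
    ERRORFILE       = 'C:\ImportData_errors.txt'   -- rejected rows land here
);
```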
I forgot to say that you can also write a SELECT query, like the samples above, in an OLE DB Source using "SQL command" as the data access mode.

Export Sql Server (64 bit) to Excel (32 bit)

I tried to export data from SQL Server to Excel, but it does not work.
My code is
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0 Xml;HDR=YES;
Database=D:\FATXL.xlsx;',
'SELECT * FROM [Sheet1$]')
SELECT * FROM tabel1
Error
The 32-bit OLE DB provider "Microsoft.ACE.OLEDB.12.0" cannot be
loaded in-process on a 64-bit SQL Server.
I need to know if it's possible to export data from 64-bit SQL Server to 32-bit MS Excel, and how to do it.
Do we need the same bitness of SQL Server and Excel for this to work?
Here are some alternative solutions which never fail.
Alternative solution 1
Caveat: You'd need to run this query manually in the SSMS.
Use Export to save your results to a CSV file, which can be opened in Excel and saved as XLSX.
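The same CSV export can also be scripted from T-SQL via bcp's queryout mode through xp_cmdshell, as mentioned for BCP above. A sketch (xp_cmdshell must be enabled, and the server name and output path are placeholders/assumptions):

```sql
-- Hypothetical: export query results to a comma-delimited file Excel can open
EXEC xp_cmdshell
    'bcp "SELECT * FROM dbo.tabel1" queryout "D:\FATXL.csv" -c -t, -T -S serverName\instanceName';
```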
Alternative solution 2
Caveat: You'd need to run this query manually in the SSMS.
You can change the query results output from grid to file, which can be a CSV file, and also enable the SET NOCOUNT option in the query options.
Alternative solution 3
Caveat: Once again need SSMS
Right-click the database name and select Tasks > Export Data from the menu, which opens the SQL Server Import and Export Wizard. You can save it as a package, deploy it under the SSIS catalog, and run it as a scheduled job too.
Alternative solution 4
Write an SSIS package to get data from SQL server and put into Excel.

Importing Excel files into SQL Server using a script

I have a bunch of Excel files that I want to import into a SQL Server database using a script.
I have imported one of the files (BasicCompanyData-2015-02-01-part1_5.csv) using the Import Wizard to set up the table correctly, and then run:
DELETE FROM dbo.temp
But when I run the following query I get the error:
Bulk load data conversion error(type mismatch or invalid character for
the specified codepage)
My SQL code:
BULK INSERT [UK DATABASE 2014].dbo.temp
FROM 'D:\The Data Warehouse.co.uk\BasicCompanyData-2015-02-01-part1_5.csv'
WITH
(
FIELDTERMINATOR=',',
ROWTERMINATOR='\n',
FIRSTROW=2
)
Any ideas for what I am doing wrong?