I have an SSIS package. Inside it are two Execute Package Task elements. The two packages are linked to run in order.
Package One - downloads a CSV file from an external site (SalesForce.com) to:
E:\SalesForce\MSSQL_Import\PricebookEntry.csv
E: is a mapped network drive.
Package Two - an Execute SQL Task that runs BULK INSERT to load the contents of the CSV file into an empty table:
BULK INSERT SalesForce.dbo.PriceBookEntry
FROM 'E:\SalesForce\MSSQL_Import\PricebookEntry.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n',
CODEPAGE = 'ACP',
ERRORFILE = 'E:\SalesForce\Data_Loader\Logs\Error\Bulk_Insert\BI_PriceBookEntry.csv',
TABLOCK
)
I am getting the following error when the second SSIS package is triggered:
[Execute SQL Task] Error: Executing the query "BULK INSERT
SalesForce.dbo.PriceBookEntry
FROM..." failed with the following error: "Cannot bulk load. The file "E:\SalesForce\MSSQL_Import\PricebookEntry.csv" does not exist.".
Possible failure reasons: Problems with the query, "ResultSet"
property not set correctly, parameters not set correctly, or
connection not established correctly.
I have checked the location of the file. The path and spelling are good. The file exists while the script runs.
I have tried saving the file on the C: drive just to check. The BULK INSERT package still fails.
I have run the BULK INSERT query manually in Management Studio and it works fine.
If I chain the packages together inside another package, the BULK INSERT fails. If I run the SSIS packages separately in order, they work fine.
I have other SSIS packages that do the same thing and work fine.
I've been Googling high and low for a possible solution and still nothing. Anyone have any ideas on what might be the issue?
I found the problem.
The quick answer is that the CSV file download has not finished before the BULK INSERT command is fired in SSIS, so SQL Server cannot get access to the file.
Previous jobs were only working with very small files (2,000 or so lines). This job is working with a lot more (~1 million lines).
The solution will involve holding off the BULK INSERT until the file has finished downloading, as sketched below.
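One way to do that inside the Execute SQL Task itself is a retry loop that keeps attempting the load until the file can actually be read. This is only a rough sketch (the retry count and delay are arbitrary, and it assumes the failing BULK INSERT raises an error that TRY/CATCH can trap):

DECLARE @tries INT = 0, @done BIT = 0;

WHILE @done = 0 AND @tries < 30
BEGIN
    BEGIN TRY
        BULK INSERT SalesForce.dbo.PriceBookEntry
        FROM 'E:\SalesForce\MSSQL_Import\PricebookEntry.csv'
        WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', CODEPAGE = 'ACP', TABLOCK);

        SET @done = 1;                  -- load succeeded, leave the loop
    END TRY
    BEGIN CATCH
        SET @tries += 1;                -- file missing or still being written
        WAITFOR DELAY '00:00:30';       -- wait 30 seconds before retrying
    END CATCH
END;

IF @done = 0
    RAISERROR('PricebookEntry.csv never became readable for BULK INSERT.', 16, 1);

Another option is to put the wait in the control flow instead, for example a Script Task or For Loop Container that polls for the file before Package Two runs.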
Related
I have an SSIS package containing an Execute SQL Task. This task invokes a SQL file and executes it against SQL Server.
The script file has roughly 2000 statements, any of which can fail depending on certain scenarios and other source input files.
In the SSIS package, for the Execute SQL Task, I've set MaximumErrorCount to 0 and even 5000. The problem is that the task stops as soon as the first error occurs during execution of the SQL script.
How should I change the package so that SSIS moves on to the next statement in the SQL file after an errored line? I tried the Error Handler approach, but it didn't work out.
I'm trying to load multiple files from a location into a database using a Foreach Loop Container and a Data Flow Task in SSIS.
The package crashes when I try to execute it. There is no error message; whenever I execute the package, it crashes and closes Visual Studio immediately. I have to kill the debug task in Task Manager before the next execution of the package.
So I tried the steps below:
I used a File System Task instead of the Data Flow Task to just move all the files from the source to the archive directory, which ran fine without any issues.
I ran the Data Flow Task individually to load a single file into the database, which also executed successfully.
I couldn't figure out what was going wrong here. Any help would be appreciated! Thanks!
Screenshots
All screenshots look fine to me. I will give some tips to try to figure out the issue.
Since the File System Task executes without any problem, there is no problem with the Foreach Loop Container. You can try to remove the OLE DB Destination and replace it with a dummy task to check whether it is causing the issue. If the issue remains, that means the Flat File Source could be the cause.
Things to try
Make sure that the TargetServerVersion is accurate. You can learn more about this property in the following article: How to change TargetServerVersion of my SSIS Project
Try running the package in 32-bit mode. You can do this by changing the Run64bitRuntime property to False. You can learn more about this property in the following article: Run64bitRunTime debugging property
Try running Visual Studio in safe mode. You can use the following command: devenv.exe /safemode.
Workaround - Using Bulk Insert
Since you are inserting flat files into the SQL database without performing any transformation, why not use the SSIS Bulk Insert Task? You can refer to the following step-by-step guide for more information:
SSIS Basics: Bulk-Import various text files into a table
As mentioned in the official documentation, make sure that the following requirements are met:
The server must have permission to access both the file and the destination database.
The server runs the Bulk Insert task. Therefore, any format file that the task uses must be located on the server.
The source file that the Bulk Insert task loads can be on the same server as the SQL Server database into which data is inserted, or on a remote server. If the file is on a remote server, you must specify the file name using the Universal Naming Convention (UNC) name in the path.
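To illustrate that last requirement, here is the shape of the underlying BULK INSERT with a UNC path (both the table and the share name are just placeholders):

BULK INSERT dbo.ImportTest
FROM '\\fileserver\share\ImportData.txt'   -- UNC path, resolved by the SQL Server service account
WITH (FIELDTERMINATOR = ',', FIRSTROW = 2);

A mapped drive letter such as E:\ typically exists only in the session that mapped it, which is why a server-side load of a remote file needs the \\server\share form.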
Currently I am building a database in SQL Server Management Studio (SSMS). Several data sources are imported into this database via the import/export wizard. On completing the wizard, I not only run the import immediately, I also save an SSIS package. This SSIS package is meant to schedule a weekly refresh of the data source.
My Windows Server 2012 R2 has the Express edition installed, so I have manually created several batch files that run every week (as scheduled via Task Scheduler). This works fine for most tables; however, for some tables I encounter some strange (?) behaviour.
The behaviour is as follows: when creating the SSIS package via the import/export wizard and directly running the import, the table shows up correctly in the database. That is, with all column names and the thousands of rows it contains.
The strange thing is that, when executing the SSIS package (via the batch file), the table is empty (column names are correct though). For some tables, I do not encounter this behaviour. For others, this behaviour is encountered all the time.
The batch script is as follows (quite straightforward):
"C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /F "C:\Users\username\ssispackage.dtsx"
The batch file seems to run correctly at all times, as the table's 'creation_date' changes when I run the batch file. Moreover, for all the tables that do correctly 'refresh', these same batch files do the job.
Some settings of the SSIS package:
Data source: Oracle OLE DB provider
Destination: SQL Server Native Client / MS OLE DB Provider for SQL Server (tried both)
Data via query (as I am querying several tables from Oracle); query is parsing correctly
Mappings: Create destination table & Drop and re-create destination table
Dropping and re-creating is done because the data source is rather small and some of its rows change weekly/monthly.
For most data sources imported (and refreshed weekly) via this method, the tables show up correctly each week (simply dropping the previous table and re-creating the source).
I hope someone can explain to me why this issue occurs. If some more information is needed from my side, please ask.
Thanks in advance!
UPDATE:
When looking at the log of the batch file, this is (part of) the output:
Source: ..... "SourceConnectionOLEDB"
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "OraOLEDB" Hresult: 0x80004005 Description: "ORA-01005: null password given; logon denied".
End Error
.....next error.... "SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.
Thus, it seems that the password is not remembered/saved correctly in the SSIS package?
This is strange though, as for most tables it does correctly store the password (as those do refresh correctly).
When setting the properties of the data source, namely Oracle Provider for OLE DB, I select the option "Allow saving password". So it should store the password correctly?
Found the answer after all. The saved .dtsx file (the SSIS package) contains the variables for the connection string, and it shows that the Password (Sensitive="1") is there. But in the wizard, I had not selected 'Save sensitive data with user key'. When I selected this option, an encryption string was added. Now the SSIS packages run well!
I'm updating an SSIS task and getting a weird error: when executed, it says it can't bulk load a file that the same SSIS task created.
This task retrieves a .zip file, extracts it to a path, and then runs a stored procedure to bulk load the XML file, insert its contents into some tables, and so on.
It's printing the following error in the logs:
Empresas:Error: Executing the query "execute carga.sp_cargaInicialEmpresas ?, ?"
failed with the following error: "Error in procedure xxxxx2016_CI.carga.sp_cargaInicialEmpresas)
Line: 1 Message: Cannot bulk load because the file "C:\xxxxx2016\arquivos\Empresa\2017\2403\02\151423_ExecucaoEmpresas\ExecucaoEmpresas.xml" could not be opened.
Operating system error code 3(The system cannot find the path specified.).".
Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
This is a local path, not a network path, and the .xml file is created by the SSIS task itself. I checked the path in Windows Explorer and the file is there, waiting to be read.
Also the SQLSERVERAGENT user has all permissions in the "C:\xxxxx2016\arquivos\Empresa\2017\2403\02\151423_ExecucaoEmpresas\" directory.
How can I solve this?
This is an OS issue -- error 3: the path does not exist. Please see the system error codes page and check the full path to the file.
An access issue would be error code 5.
Also, it depends on who is executing the SSIS package. Are you doing it from the server using your login? Are you running the package from SQL Agent under that credential? Did you change the 'run as' credential in the job?
Like someone said, try executing it by hand. If it fails, then it is the Stored Procedure code. If it passes, then it is the path passed from SSIS to the SP. Make sure you know what your working directory is.
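One quick way to see the path exactly as the database engine sees it (BULK INSERT resolves the path on the SQL Server machine, not on the machine running the SSIS package) is the undocumented xp_fileexist procedure. A small sketch:

-- Run this on the instance that executes the stored procedure
EXEC master.dbo.xp_fileexist
    'C:\xxxxx2016\arquivos\Empresa\2017\2403\02\151423_ExecucaoEmpresas\ExecucaoEmpresas.xml';
-- Returns three columns: File Exists, File is a Directory, Parent Directory Exists.
-- If File Exists comes back 0 here, the engine cannot see the path even though
-- Windows Explorer on your workstation can.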
Tell me how you make out.
That was a very misleading error message.
After a few days trying to work it out, I found out that the database was misconfigured in my SSIS task connections.
I want to know how to insert values into a SQL Server database from a Flat File Source in SSIS using a SQL command. I've done the insert using the table or view option; now I have to do it using a SQL command.
Well, you need a good query to put into an Execute SQL Task in SSIS.
You can get help with queries at the site below:
----here is the link ----
You can also parameterize the query in the Execute SQL Task of SSIS.
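As a sketch of what that can look like (the table and column names are placeholders): with an OLE DB connection, the Execute SQL Task uses ? as the parameter marker, and each marker is bound on the Parameter Mapping page to a package variable.

-- SQLStatement of the Execute SQL Task (OLE DB connection manager)
INSERT INTO dbo.ImportTest (Col1, Col2)
VALUES (?, ?);
-- On the Parameter Mapping page, map parameter names 0 and 1
-- to package variables such as User::Col1Value and User::Col2Value.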
BCP
This is one of the most widely used options. One reason for this is that it has been around for a while, so DBAs have become quite familiar with the command. It allows you to both import and export data, but it is primarily used for text data formats. In addition, the command is generally run from a Windows command prompt, but it could also be called from a stored procedure using xp_cmdshell or from an SSIS package.
Here is a simple command for importing data from the file C:\ImportData.txt into the table dbo.ImportTest (-c for character format, -T for a trusted connection):
bcp dbo.ImportTest in C:\ImportData.txt -c -T -S serverName\instanceName
BULK INSERT
BULK INSERT is a T-SQL command that lets you import data directly from within SQL Server. The following statement imports data from the file C:\ImportData.txt into the table dbo.ImportTest.
BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH ( FIELDTERMINATOR =',', FIRSTROW = 2 )
Forgot to say that you can also write a SELECT query, using the same sample objects, in an OLE DB Source with the SQL command access mode.
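For example, with the Data access mode of the OLE DB Source set to 'SQL command', a query along these lines works (the table and columns are the same placeholders as above):

SELECT Col1, Col2
FROM dbo.ImportTest
WHERE Col1 IS NOT NULL;   -- filtering or joining can happen here instead of in the data flow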