Currently I am building a database in SQL Server Management Studio (SSMS). Several data sources are imported into this database via the Import/Export Wizard. On completing each import, I not only run it immediately but also save it as an SSIS package, which is meant to schedule a weekly refresh of the data source.
On my Windows Server 2012 R2 machine only the Express edition is installed (so SQL Server Agent is not available), therefore I have manually created several batch files that run every week, scheduled via Task Scheduler. This works fine for most tables; for some tables, however, I encounter some strange behaviour.
The behaviour is as follows: when I create the SSIS package via the Import/Export Wizard and run the import directly, the table shows up correctly in the database, with all column names and the thousands of rows it contains.
The strange thing is that when the SSIS package is executed via the batch file, the table ends up empty (the column names are correct, though). Some tables never show this behaviour; for others it happens every time.
The batch script is as follows (quite straightforward):
"C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /F "C:\Users\username\ssispackage.dtsx"
The batch file itself seems to run correctly every time, as the table's creation date changes whenever the batch file runs. Moreover, for all the tables that do refresh correctly, these same batch files do the job.
Some settings of the SSIS package:
Data source: Oracle OLE DB provider
Destination: SQL Server Native Client / MS OLE DB Provider for SQL Server (tried both)
Data via query (as I am querying several tables from Oracle); query is parsing correctly
Mappings: Create destination table & Drop and re-create destination table
I drop and re-create the tables because the data sources are rather small and some rows change weekly or monthly.
For most data sources imported (and refreshed weekly) via this method, the tables show up correctly each week: the previous table is simply dropped and re-created from the source.
I hope someone can explain to me why this issue occurs. If some more information is needed from my side, please ask.
Thanks in advance!
UPDATE:
When looking at the log of the batch file, this is (part of) the output:
Source: ..... "SourceConnectionOLEDB"
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "OraOLEDB" Hresult: 0x80004005 Description: "ORA-01005: null password given; logon denied".
End Error
.....next error.... "SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.
Thus, it seems that the password is not remembered/saved correctly in the SSIS package?
This is strange though, as for most tables it does correctly store the password (as those do refresh correctly).
When setting the properties of the data source (Oracle Provider for OLE DB), I select the option "Allow saving password", so it should store the password correctly?
Found the answer after all. The saved .dtsx file (the SSIS package) contains the variables for the connection string, and it shows that the Password attribute (Sensitive="1") is present. In the wizard, however, I had not selected 'Save sensitive data with user key'. After selecting this option, an encrypted value was added to the package, and now the SSIS packages run fine!
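One caveat worth adding (this is general SSIS behaviour, not something from the wizard dialog itself): 'Save sensitive data with user key' (EncryptSensitiveWithUserKey) ties the package to the Windows account and machine that saved it, so the scheduled task must run under that same account. An alternative sketch, assuming the package is instead saved with EncryptSensitiveWithPassword and "PackagePassword" is a placeholder, passes the decryption password on the DTExec command line with the /De option:
"C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /F "C:\Users\username\ssispackage.dtsx" /De "PackagePassword"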
Related
On our servers (Windows Server 2016, SQL Server Reporting Services 2016, Microsoft Access Database Engine 2016) we run two SSIS packages: one imports data from an Excel file into the database, and one exports data from the database to an Excel file. Both are .xlsx files.
We run this exact package on TST, ACC, RES and PRD (same server and access setup). We didn't have any issues until a week ago, when the packages on PRD started getting stuck in the "beginning validation phase" of the Data Flow Task. The other environments are fine.
We've determined that it is not a problem in the application, since a simple read-only package that we created for this issue showed the same problem. It doesn't seem to be an access issue either: the account that runs the script is sysadmin in SQL Server and local admin on the file server.
We also tried:
• Using only one import flow instead of two in the Data Flow Task: no change (https://social.msdn.microsoft.com/Forums/sqlserver/en-US/781c855f-833e-4578-a43a-1729482bbabd/dtspipeline-validation-phase-is-beginning-but-never-stop?forum=sqlintegrationservices)
• Setting DelayValidation to True on all OLE DB source connection managers: no change (see "SSIS pre-evaluation phase taking long")
• Setting ValidateExternalMetadata to False for the Excel sources: no change (see "SSIS pre-evaluation phase taking long")
• Reinstalling the Microsoft Access Database Engine on the server: no change
• Testing a read of a flat file (txt), which worked without issue
We're fresh out of ideas so any help would be greatly appreciated.
UPDATE:
When I manually run the Import/Export Wizard (and select an Excel file) I get "The operating system is not presently configured to run this application". I am investigating this message as well.
If you had no problem reading a text file, that would point me to the Excel driver (32-bit vs. 64-bit); however, I would expect you to see a connection error if that were the issue. Try this as a test.
Go to the console and open Excel on the server. This will tell you whether a licensing issue or something else is preventing Excel from opening on the server.
Import a small amount of data from Excel into SQL Server using the Import Data wizard, into a test database (or just make a test table). Be sure to use the same driver you are using in the SSIS package.
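As an additional check (my own suggestion, and the file path and sheet name are placeholders): you can exercise the same ACE OLE DB provider directly from T-SQL, which isolates the driver from SSIS entirely. This assumes 'Ad Hoc Distributed Queries' is enabled via sp_configure, and note it tests the provider at the bitness of the SQL Server engine:
SELECT TOP (10) *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0 Xml;HDR=YES;Database=C:\temp\test.xlsx',
                'SELECT * FROM [Sheet1$]');
If the provider or its registration is broken, this should fail with a provider error.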
I'm having an issue exporting a large dataset (500k+ rows) to Excel via SSIS, where the output file ends up with 0 rows exported. Before you say that I shouldn't be exporting that many records to Excel, let me state that I know, and normally I wouldn't. Accounting does not want a CSV and is unwilling to open a CSV in Excel.
I'm using Visual Studio 2012 SSDT; here are the components involved:
Execute SQL Task -> Creates the empty file with headers
Data Flow Task ->
OLE DB Source -> SQL Query
Excel Destination
While the package is running, you can see records flowing from the source to the destination. The package completes without error, but when you open the file, it's empty. The only thing in there is the header.
If I select the Top 1000 records and export to Excel, it works as intended.
Some things I've tried:
• Exporting to Excel on the network
• Exporting to Excel locally
• Exporting to CSV, then to Excel, both on the network and locally
• Exporting to an OLE DB Destination using Office Access Database Engine 12.0 with "Excel 12.0" extended properties
• Running as different users
All with the same outcome.
Can anyone provide any insight into why this may be happening and how to proceed?
We experienced similar behaviour when running the ETL in a SQL Server Agent job. Debugging it in Visual Studio worked, however, so I do not know whether this solution applies to you.
The reason was that the user under which the package ran did not have access to C:\Users\Default.
I found this out by using Sysinternals Process Monitor.
I was inspired by this post: Empty Excel File permissions issue: SSIS Excel Destination buffers large record sets through C:\Users\Default
(I described my search for the bug in my blog: https://www.csopro.de/biblog/2018/04/ssis-fehlerbehebung-bei-excel-destination-schreibt-keine-zeilen/ ; unfortunately it is in German.)
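As a sketch of the kind of fix we applied (the account name is a placeholder; use the service account that actually runs the package), granting modify rights on C:\Users\Default from an elevated prompt:
icacls "C:\Users\Default" /grant "DOMAIN\SsisServiceAccount:(OI)(CI)M"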
I got this SSIS package working this past December. It only runs on Friday mornings. Last Friday it failed with this error message:
Package:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft Access Database Engine" Hresult: 0x80004005 Description: "The Microsoft Access database engine cannot open or write to the file '\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx'. It is already opened exclusively by another user, or you need permission to view and write its data.".
I've checked out a couple of other questions that were similar, but they did not answer my question. I have checked to make sure that no one has the file open.
The file in question is a template that is copied over and then populated, so no one should have it open to begin with.
I've tried changing RetainSameConnection to True, but it made no difference. I have run the package in debug mode, and it works fine.
Has anyone got any ideas how to clear this up so it runs automatically again?
UPDATE
After some more testing, it appears that the file is getting the data but isn't being copied. Here is what I have set up:
• A File System Task that copies a template from my template folder to my Export folder.
• A Data Flow Task that:
  • begins with an OLE DB Source that runs a SQL script to pull the data;
  • runs a Data Conversion to update a couple of fields to the correct format;
  • uses an Excel Destination for the output (this is the template that was copied to the Export folder);
  • also has a Flat File Destination, just in case there are any errors.
• Back in the Control Flow, another File System Task moves the file from the Export folder to its final destination on a shared drive.
When I run this from VS 2015, it works fine and creates the file. When I run it from the SQL Agent job, it fails with the above error message. The only thing I can think of is that the Excel Destination in the Data Flow Task isn't releasing the file before the final File System Task tries to copy it? But if that were the case, why did it only start happening now?
I think there is a problem in the file path; as you can see, the file path starts with a single backslash:
\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx
Maybe it should start with a double backslash \\ if it is on the network:
\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx
Or the drive letter is missing --> incomplete path.
Check the way the file path is provided to the connection manager (especially if you are using expressions).
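One illustration of how this can go wrong (an assumption about the setup, not something shown in the question): in the SSIS expression language every backslash in a string literal must itself be escaped, so a UNC path needs four leading backslashes and doubled separators throughout:
"\\\\ServerShare\\IT\\Reports\\Export Templates\\YoderReport.xlsx"
If only two leading backslashes are written in the expression, the resulting path starts with a single backslash, exactly as in the error message.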
Also, based on this Microsoft article, there are two possible causes:
You must have permission to read data in the specified file in order to view its data. To change your permission assignments, see your system administrator or the table or query's creator.
You tried to open the indicated file exclusively, but another user already has the file open.
Looks like an access issue. Ensure that the Agent service account has full rights to the network share path. Maybe you can try with your own credentials on the Agent server.
Also, ensure that your Excel destination connection string supports .xlsx:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\path\xxx.xlsb;Extended Properties="Excel 12.0;HDR=YES";
Changing "Excel 12.0" to "Excel 12.0 Xml" tells the provider to output in .xlsx format instead.
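So the corrected string for an .xlsx destination would look like this (the path is a placeholder):
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\path\xxx.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES";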
I have a simple SSIS package in which I'm trying to export the same set of data from a table to both a flat file and an Excel destination. The package works fine when I run it locally; it creates both the text file and the Excel file with data.
But when it is deployed to a different server, the SQL Agent job runs fine, and the log in the Integration Services catalog says the package wrote some 9000 rows to Excel; a new Excel file is also created, but no data is written to it (it is blank except for the headers). The text file works fine and has all the data I need.
SSIS package flow:
I'm working with SQL Server 2014 and Visual Studio 2013 with SSDT, and I used Excel 2007 in the Excel destination.
We had the same issue.
The solution is that the user that runs the SSIS package must have full access to C:\Users\Default.
You can check this by running Sysinternals Process Monitor on the machine that executes the SSIS job.
You can find more information here:
Empty Excel File permissions issue: SSIS Excel Destination buffers large record sets through C:\Users\Default (this post made me find the solution)
https://www.csopro.de/biblog/2018/04/ssis-fehlerbehebung-bei-excel-destination-schreibt-keine-zeilen/ (my blog, where I describe the issue; unfortunately it is in German)
I had the same problem writing to several worksheets in an Excel file from a scheduled SQL Agent job. It worked fine for about four months. Then suddenly, with no changes to the package, one of the five worksheets was no longer populated with data. No error message was generated, and it worked fine on every test from Visual Studio and Data Tools (the old "BIDS" tools, as we used to call them).
I never did find a solution, and it continues to write no data to that single worksheet of the five in the Excel file. (So the answers above suggesting that the account the SQL Agent job runs under lacks the appropriate permissions are NOT correct for this issue.)
Plus, a new package I built today is having the same issue, and this one has only a single worksheet. Again, it works fine in the development environment, but no data appears in the destination file and no errors are raised. Not only that, but the timestamp on the file is the same as the template file's; it seems that it never even TRIES to write to the file.
Each run log for the package in the Integration Services catalog shows 9K+ records "written" by the data flow task.
Lastly, if I change the destination file name, the SQL Agent job generates the expected error, so that rules out the answers guessing that the path is wrong.
This is bizarre. And exasperating.
I have encountered odd behaviour with scheduled SSIS packages that use an Excel connection.
The fix for me was to edit the Agent job step properties: on the Execution options tab, enable the "Use 32-bit runtime" option to force the SSIS package to run in 32-bit mode instead of 64-bit mode.
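For packages launched via DTExec rather than an Agent job, the equivalent (as far as I know; the version folder 140 and the package path are just examples) is to call the 32-bit binary under Program Files (x86) instead of the 64-bit one:
"C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /F "C:\path\package.dtsx"
"C:\Program Files\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /F "C:\path\package.dtsx"
The first is the 32-bit runtime, the second the 64-bit one.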
I have a problem with a SQL Server Agent job.
I am using SSIS to import data into a database. When I run the SSIS package through Visual Studio, it works as expected. It also works fine when I run it from the Integration Services catalog in SQL Server. But when I try to import the data using a SQL Agent job (running the same SSIS package as in the previous cases), the job succeeds and there is no error message, yet the destination table in my DB stays empty (no data is imported).
When I checked the execution report for the SSIS package run by the SQL Agent job, I saw this warning: "Foreach Loop Container: Warning: The for each file enumerator is empty. The for each file enumerator did not find any files that matched the file pattern, or the specified directory was empty."
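One thing I still need to verify myself (an assumption, not something I have confirmed): the SQL Agent job runs under a service account in a non-interactive session, so a drive letter mapped in my own session would not exist there. The enumerator's folder should therefore be an absolute local or UNC path; for illustration (placeholder values, not my real configuration):
Folder: \\fileserver\imports   (not a mapped drive such as Z:\imports)
Files: *.csv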
Once again, everything works OK when I run the SSIS package in any way other than via the SQL Agent job.
Am I doing something wrong? How can I solve this problem? Please help me, because I am not able to solve this issue.
Thank you.