I have a simple SSIS package and I'm trying to export the same set of data from a table to both a flat file and an Excel destination. The package works fine when I run it locally, and it creates both the text file and the Excel file with data.
But when it's deployed to a different server, the SQL Agent job runs fine, and the log in the Integration Services catalog says the package wrote around 9,000 rows to Excel. A new Excel file is also created, but no data is written to it (it's blank except for the headers). The text file works fine and has all the data I need.
SSIS package flow: a single data flow reading from the table and feeding both the flat file and Excel destinations.
I'm working with SQL Server 2014 and Visual Studio 2013 with SSDT, and the Excel destination uses the Excel 2007 format.
We had the same issue.
The solution is that the user which runs the SSIS package must have full access to C:\Users\Default.
You can check this by running Sysinternals' Process Monitor on the machine that executes the SSIS job.
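If you'd rather not trace it, a quick probe run under the job's account shows the same thing. A minimal C# sketch; it just tries to create and delete a file under the folder the Excel destination buffers through:

```csharp
using System;
using System.IO;
using System.Security.Principal;

class DefaultProfileProbe
{
    static void Main()
    {
        // C:\Users\Default is where the Excel destination buffers large record sets
        string probe = Path.Combine(@"C:\Users\Default", "ssis_probe.tmp");
        try
        {
            File.WriteAllText(probe, "probe");   // fails without write access
            File.Delete(probe);
            Console.WriteLine(WindowsIdentity.GetCurrent().Name + ": write access OK");
        }
        catch (UnauthorizedAccessException)
        {
            Console.WriteLine(WindowsIdentity.GetCurrent().Name + ": NO write access - grant full access here");
        }
    }
}
```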
You can find more information here:
Empty Excel File permissions issue: SSIS Excel Destination buffers large record sets through C:\Users\Default - This post made me find this solution
https://www.csopro.de/biblog/2018/04/ssis-fehlerbehebung-bei-excel-destination-schreibt-keine-zeilen/ - my blog, where I describe the issue (unfortunately in German)
I had the same problem writing to several worksheets in an Excel file from a scheduled SQL Agent job. It worked fine for about 4 months. Then suddenly, with no changes to the package, one of the 5 worksheets was no longer populated with data. No error message was generated, and it worked fine on every test from Visual Studio and Data Tools (the old "BIDS" tools, as we used to call them).
I never did find a solution, and it continues to write no data to that single worksheet of the 5 in the Excel file. (So the answers above suggesting that the account the SQL Agent job runs under lacks the appropriate permissions are NOT correct for this issue.)
Plus, a new package I built today is having the same issue, and this one has only a single worksheet. Again, it works fine in the development environment, but no data appears in the destination file and no errors are raised. Not only that, but the timestamp on the file is the same as the template file's -- it seems it never even TRIES to write to the file.
Each run log for the package in the Integration Services catalog has an entry showing 9K+ records "written" by the data flow task.
Lastly, if I change the destination file name, the SQL Agent job generates the expected error, so that rules out the answers guessing that the path is wrong.
This is bizarre. And exasperating.
I have encountered odd behaviour with scheduled SSIS packages that use the Excel object.
The fix for me was to edit the Agent job properties: on the Execution Properties tab of the job step, enable the "Use 32-bit runtime" option to force SSIS to run in 32-bit mode instead of 64-bit mode.
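If you want to test the same switch outside the Agent UI, the SSISDB catalog exposes it as a parameter on catalog.create_execution. A C# sketch (the folder, project, and package names are placeholders; adjust the connection string for your server):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class Run32Bit
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=myServer;Database=SSISDB;Integrated Security=true"))
        {
            conn.Open();

            // @use32bitruntime is the same switch as the "32-bit runtime" checkbox
            var create = new SqlCommand("catalog.create_execution", conn) { CommandType = CommandType.StoredProcedure };
            create.Parameters.AddWithValue("@folder_name", "MyFolder");        // placeholder
            create.Parameters.AddWithValue("@project_name", "MyProject");      // placeholder
            create.Parameters.AddWithValue("@package_name", "MyPackage.dtsx"); // placeholder
            create.Parameters.AddWithValue("@reference_id", DBNull.Value);
            create.Parameters.AddWithValue("@use32bitruntime", true);
            var execId = create.Parameters.Add("@execution_id", SqlDbType.BigInt);
            execId.Direction = ParameterDirection.Output;
            create.ExecuteNonQuery();

            var start = new SqlCommand("catalog.start_execution", conn) { CommandType = CommandType.StoredProcedure };
            start.Parameters.AddWithValue("@execution_id", (long)execId.Value);
            start.ExecuteNonQuery();
        }
    }
}
```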
I have designed an SSIS project, deployed it to SQL Server, and created a job to run it on a daily basis, but it gives me a COM error referencing a CLSID when executed as a job (there's no error within VS).
The error message contains a CLSID, but there is no application associated with it under Component Services -> Computers -> My Computer -> DCOM Config, even though the CLSID is registered in the Registry Editor.
About the particular task where this error occurs: it is a script task that modifies the Excel file, deleting unwanted rows from it, before I write the SQL table data into it.
The script task code looks roughly like this (a simplified sketch; it uses the Excel interop to delete the unwanted rows):
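```csharp
using System;
using Excel = Microsoft.Office.Interop.Excel;

public void Main()   // inside the SSIS script task's ScriptMain
{
    Excel.Application app = new Excel.Application();
    Excel.Workbook wb = null;
    try
    {
        // The path comes from a package variable in the real task (placeholder here)
        wb = app.Workbooks.Open(@"C:\Exports\Report.xlsx");
        Excel.Worksheet ws = (Excel.Worksheet)wb.Worksheets[1];

        // Delete the unwanted rows before the data flow writes the SQL data
        ws.get_Range("A2", "A11").EntireRow.Delete(Excel.XlDeleteShiftDirection.xlShiftUp);
        wb.Save();
    }
    finally
    {
        if (wb != null) wb.Close(false);
        app.Quit();
    }
}
```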
I have been working on this for hours now trying to fix the problem, with no success. Kindly guide me on how to fix this issue. If any other information related to this project is required, please let me know.
Doing Excel automation in a SQL Server agent job is totally unsupported and probably won't work.
To have even a ghost of a chance of making this work you'll need to run a real desktop session on the server and automate Excel in that. Excel expects a real user to be logged in with a full profile. And Excel has failure conditions where it displays a popup window, which you'll need to be able to access via remote desktop.
You can read and write Excel files on a server with the OpenXML SDK, without actually having to run Excel. There's also a wrapper library called ClosedXML, which you may find easier than using OpenXML directly.
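For example, deleting your unwanted rows with ClosedXML needs no Excel install at all and is safe in a service context. A sketch (the path is a placeholder; add the ClosedXML NuGet package):

```csharp
using ClosedXML.Excel;  // NuGet: ClosedXML -- no Excel install required

class RemoveRows
{
    static void Main()
    {
        // Same effect as the interop script task, without automating Excel
        using (var wb = new XLWorkbook(@"C:\Exports\Report.xlsx"))  // placeholder path
        {
            wb.Worksheet(1).Rows(2, 11).Delete();  // drop the unwanted rows
            wb.Save();
        }
    }
}
```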
tl;dr;
You need to install Office (Excel) on the server AND ensure that you install it in a manner that matches the SQL Agent's expected bitness. The default for Agent is going to be 64-bit; the default for Office is still 32-bit :(
Error guessing
You have a script task that uses the Office interop libraries to delete some rows (2 through 11?) out of a spreadsheet.
You have Office installed on your machine, and therefore you have the libraries installed. Excel still has COM-based "stuff" in it, hence the interop and the errors shrieking about the CLSID, registry, etc., but those are likely just secondary errors, since there is no base "application is not installed" exception to be thrown.
If Office is installed, then ensure your Agent execution model matches the version of Office. If 32-bit Excel is already installed, don't risk breaking everyone else's stuff by uninstalling and reinstalling as 64-bit; just go to the Advanced section of the SQL Agent job step and check the 32-bit box.
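If you're not sure which way Office went on a given box, the installer records a documented "Bitness" value in the registry. A C# sketch (the 16.0 version key is an assumption; adjust for your Office version):

```csharp
using System;
using Microsoft.Win32;

class BitnessCheck
{
    static void Main()
    {
        // "x86" = 32-bit Office, "x64" = 64-bit. A 32-bit Office on 64-bit
        // Windows lands under WOW6432Node, hence the second lookup.
        using (var key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Microsoft\Office\16.0\Outlook")
                      ?? Registry.LocalMachine.OpenSubKey(@"SOFTWARE\WOW6432Node\Microsoft\Office\16.0\Outlook"))
        {
            Console.WriteLine("Office bitness: " + (key == null ? "not found" : key.GetValue("Bitness")));
        }
        Console.WriteLine("This process:   " + (Environment.Is64BitProcess ? "64-bit" : "32-bit"));
    }
}
```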
Once all that's done, if you're still getting errors, but new ones, then the existing comments mentioning permissions may come into play - it depends on where the Excel document actually lives (on a computer where the SQL Agent can access it, versus one where it cannot, versus a networked drive).
Good luck in not finding people on the sanctions lists.
This is a new problem that began about three weeks ago on several jobs that had been running for years without problems - with both SSIS Flat File Sources and script tasks. The error occurs on READ and never makes it to the write.
The files are semicolon-separated .csv files, and the columns throwing the error hold decimal values. Since the locale is Danish, the decimal character is a comma.
The job in SSMS only points to the SSIS package, which consists of a single data flow task containing a Flat File Source and an OLE DB Destination. The job fails before it can get to the destination, throwing the error: "Data conversion failed. The data conversion for column 'Pris' returned status value 2 and status text "The value could not be converted because of a potential loss of data"."
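To illustrate the kind of mismatch I suspect - this is just a plain C# repro of locale-sensitive parsing, not what SSIS does internally:

```csharp
using System;
using System.Globalization;

class PrisParseDemo
{
    static void Main()
    {
        string pris = "1234,56"; // a Danish-formatted value from the .csv

        // Under the Danish locale the comma is the decimal separator: value 1234.56
        Console.WriteLine(decimal.Parse(pris, CultureInfo.GetCultureInfo("da-DK")));

        // Under the invariant (English-style) locale the comma is a *group*
        // separator, so the same text silently becomes 123456 -- exactly the
        // class of "loss of data" a conversion can trip over.
        Console.WriteLine(decimal.Parse(pris, CultureInfo.InvariantCulture));
    }
}
```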
The package runs without error manually from within VS 2019 using the latest SSIS extension. The errors come when running the package from a SQL Server Agent job.
My first instinct was to change the column properties on the flat file connection manager. The column was DT_NUMERIC with a precision of 9 and a scale of 4 (the destination is numeric(9,4)). I changed it to DT_R4 instead (float) and... well, that fixed it.
It's not THE fix, though. I mean, it works for this particular file, but what about all the others? We have a LOT of them, and even though I could modify the packages for all of them, something bothers me about the fact that these jobs ran just fine for years...
I haven't found anything that might have triggered the change. On the day the problems started, the service account user was added to the local administrators group on another of our machines in order to run a PS script there. SSMS 18 was also installed on the server that day. Other than that, pretty uneventful.
The job runs as the SQL Server Agent service account. That account runs 99% of our jobs; this is the only type I can see it acting strangely with.
I've checked the locale on the server, the file, the package, everything seems to be matching.
We have a parallel test server. Putting .csv files in the corresponding folders and running the jobs works fine. It's something either with the SQL Server or the Service Account.
I've just been fighting with this so much I can't see straight. Can anyone help me figure out what's going on and point me in the right direction so I can get this fixed?
On our servers (Windows 2016, SQL Server Reporting Services 2016, Microsoft Access Database Engine 2016) we run 2 SSIS packages: one imports data from an Excel file into the database, and one exports data from the database to an Excel file. Both are .xlsx files.
We run this exact package on TST, ACC, RES and PRD (same server and access setup). We didn't have any issues until a week ago, when the packages on PRD started getting stuck in the "beginning validation phase" of the data flow task. The other environments are fine.
We've determined that it is not a problem in the application, since a simple read package that we created for this issue gave the same problem. It doesn't seem to be an access issue either: the account that runs the script is sysadmin in SQL and a local admin on the file server.
We also tried:
• Using only one import flow instead of two in the data flow task: no change (https://social.msdn.microsoft.com/Forums/sqlserver/en-US/781c855f-833e-4578-a43a-1729482bbabd/dtspipeline-validation-phase-is-beginning-but-never-stop?forum=sqlintegrationservices)
• Setting DelayValidation to True on all the OLE DB source connection managers: no change ("SSIS pre-evaluation phase taking long")
• Setting ValidateExternalMetadata to False for the Excel sources: no change ("SSIS pre-evaluation phase taking long")
• Reinstalling the Microsoft Access Database Engine on the server: no change
• Reading a flat file (txt) instead, which worked without issue.
We're fresh out of ideas so any help would be greatly appreciated.
UPDATE:
When manually trying to run the Import/Export Wizard (and selecting an Excel file), I get "The operating system is not presently configured to run this application". I'm investigating this message as well.
If you had no problem reading a text file, that points me to the Excel driver (32- or 64-bit); however, I would think that if that were the issue, you would see a connection error. Try these tests:
Go to the console and open Excel on the server. This will tell you whether you have a licensing issue or something else preventing Excel from opening on the server.
Import a small amount of data from Excel into SQL Server using the Import Data wizard against a test database (or just make a test table). Be sure to use the same driver you are using in the SSIS package.
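Or take SSIS and the wizard out of the picture entirely and open the same ACE connection string from a small console app compiled for each bitness; if the driver is the problem, you'll hit the same "not presently configured" error here too. A sketch (provider version and path are placeholders):

```csharp
using System;
using System.Data.OleDb;

class AceProbe
{
    static void Main()
    {
        // The same connection string style the Excel connection manager uses
        string cs = @"Provider=Microsoft.ACE.OLEDB.16.0;Data Source=C:\temp\test.xlsx;" +
                    "Extended Properties='Excel 12.0 Xml;HDR=YES';";
        try
        {
            using (var conn = new OleDbConnection(cs))
            {
                conn.Open();
                Console.WriteLine("ACE provider loaded fine ({0}-bit process).",
                    Environment.Is64BitProcess ? "64" : "32");
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("ACE open failed: " + ex.Message);
        }
    }
}
```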
Let me start by saying I've read just about everything on 'Unexpected Termination' errors the last few days and am none the wiser!
I have a number of XLSX files in a network folder, and 2 packages in a project that read them and load them into SQL Server (2017) using the Excel connector (ACE.16) on dev and prod.
One package loops through about 35 workbooks and appends them all to a single SQL table.
The second package simply reads a single (~5 MB) file and writes it to a SQL table (with a single data transformation along the way - it's not that; I removed it and the package still failed).
Both packages use same source folder.
Both packages write to same destination DB.
Both packages run fine on my local machine.
Both packages can be executed correctly from the SSISDB catalog on the server (Right-click-->Execute).
When scheduled in Agent (we have a proxy account with the correct folder permissions), the looping package runs just fine, whilst the simpler package fails with 'Unexpected Termination'.
Verbose logging reveals nothing, as does Event Viewer on the SQL box.
I started to look into other options, such as converting to CSV first using a script task and Excel Interop, but we are a 365 site now, so we'd need Office client installs on the dev and production servers (and not knowing much C# won't help!).
My next route is to see if I can get the package to write to a CSV destination successfully. If so, I might be able to go XLSX -> CSV -> SQL without having to use Interop or external libraries.
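For what it's worth, the conversion itself wouldn't need Interop or an Office install - the ACE driver the packages already use can be read directly. A rough C# sketch (paths, sheet name, and provider version are guesses, and the CSV writing is naive, with no quoting):

```csharp
using System;
using System.Data.OleDb;
using System.IO;

class XlsxToCsv
{
    static void Main()
    {
        // Read the first worksheet via ACE (no Excel automation) and spool to CSV
        string cs = @"Provider=Microsoft.ACE.OLEDB.16.0;Data Source=\\server\share\input.xlsx;" +
                    "Extended Properties='Excel 12.0 Xml;HDR=YES;IMEX=1';";
        using (var conn = new OleDbConnection(cs))
        using (var writer = new StreamWriter(@"\\server\share\input.csv"))
        {
            conn.Open();
            var cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", conn);  // sheet name is a guess
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var fields = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = Convert.ToString(reader[i]);
                    writer.WriteLine(string.Join(",", fields)); // naive CSV: no quoting/escaping
                }
            }
        }
    }
}
```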
Unfortunately I've not been able to turn up anything further in my searches, so wondered if someone more enlightened than me might be able to suggest where to look next.
I'm having an issue exporting a large dataset (500k+) to Excel via SSIS, where the output file ends up with 0 rows exported. Before saying that I shouldn't be exporting that many records to Excel, let me state that I know and normally wouldn't. Accounting does not want a CSV and is unwilling to open a CSV in Excel.
Using Visual Studio 2012 SSDT, here are the components involved:
Execute SQL Task -> Creates the empty file with headers
Data Flow Task ->
OLE DB Source -> SQL Query
Excel Destination
While the package is running, you can see records flowing from the source to the destination. The package completes without error, but when you open the file, it's empty. The only thing in there is the header.
If I select the Top 1000 records and export to Excel, it works as intended.
Some things I've tried:
Export to Excel on the network
Export to Excel locally
Export to CSV to Excel on both network and locally
Export to an OLE DB Destination using Office Access Database Engine 12.0 with "Excel 12.0" extended properties
Tried running as different users
All with the same outcome.
Can anyone provide any insight into why this may be happening and how to proceed?
We experienced similar behaviour when running the ETL in a SQL Server Agent job; debugging it in Visual Studio worked, however. So I do not know whether this solution applies to you.
The reason was that the user under which the package ran did not have access to C:\Users\Default.
I found this out by using Sysinternals Process Monitor.
I was inspired by that post: Empty Excel File permissions issue: SSIS Excel Destination buffers large record sets through C:\Users\Default
[I explained my search for the bug in my blog: https://www.csopro.de/biblog/2018/04/ssis-fehlerbehebung-bei-excel-destination-schreibt-keine-zeilen/ Unfortunately it is in German]