SSIS package hangs on validation - sql-server

I have created my first SSIS package. I have a Foreach Loop container that loops through a folder. The data flow has a derived column task and 4 lookups. When I run the package in Visual Studio (2013), it starts with the first file and reaches the destination, but it does not insert the data; it just hangs with the text "Validating OLE DB Destination 1" in the status bar.
The files are located on my hard drive and the destination database is on the local network. I'm using a sysadmin account to be sure that the user has sufficient access rights.
I'm also unable to query the destination database table from SSMS.
Does anyone have an idea what the problem could be and how I could solve it?
Sorry for the unspecific question. In my SSIS control flow I have a Foreach Loop container that contains a data flow task to import the data from every file the container loops over. Connected to that task are two move-file tasks, depending on the success or failure of the import. The strange part is that one file is moved, but no data is inserted in the database, and the Foreach Loop hangs after the first iteration (the folder contains 150 files). While this SSIS process hangs, I'm unable to query the database with SELECT *; there is no error, it just says "executing query".
The latter. It finishes the first round (moves the file to my success folder) and then halts with the "still working" icon on the data import task. But the data is not inserted even though the file is moved. Does the transaction only commit once it has finished processing all the files?
Edit: Image of the control flow and the data flow

The answer was found in the "Table lock" option. Since both destinations point to the same table, I guess the first destination locks it; when the second destination hits the same table, it is already locked and waits until the lock is released. That never happens, because the first destination is not ready to commit yet.

Related

SSIS Foreach Loop Container to read files and load into DB crashes during execution

I'm trying to load multiple files from a location into DB using Foreach Loop Container & DataFlow task in SSIS.
It crashes when I try to execute the package. It doesn't give any error message; whenever I execute the package, it crashes and closes the Visual Studio app immediately. I have to kill the debug task in Task Manager before the next execution of the package.
So I tried the below steps:
I used a File System Task instead of the Data Flow Task to just move all the files from the source to the archive directory, which ran fine without any issues.
I ran the Data Flow Task individually to load a single file into the DB, which also executed successfully.
I couldn't figure out what was going wrong here. Any help would be appreciated! Thanks!
Screenshots
All screenshots look fine to me. I will give some tips to try to figure out the issue.
Since the File System Task executes without any problem, there is no problem with the Foreach Loop Container. You can try to remove the OLE DB Destination and replace it with a dummy task to check whether it is causing the issue. If the issue remains, the Flat File Source could be the cause.
Things to try
Make sure that the TargetServerVersion is accurate. You can learn more about this property in the following article: How to change TargetServerVersion of my SSIS Project
Try running the package in 32-bit mode. You can do this by changing the Run64bitRuntime property to False. You can learn more about this property in the following article: Run64bitRunTime debugging property
Try running Visual Studio in safe mode. You can use the following command: devenv.exe /safemode.
Workaround - Using Bulk Insert
Since you are inserting flat files into the SQL database without performing any transformation, why not use the SSIS Bulk Insert Task? You can refer to the following step-by-step guide for more information:
SSIS Basics: Bulk-Import various text files into a table
As mentioned in the official documentation, make sure that the following requirements are met:
The server must have permission to access both the file and the destination database.
The server runs the Bulk Insert task. Therefore, any format file that the task uses must be located on the server.
The source file that the Bulk Insert task loads can be on the same server as the SQL Server database into which data is inserted, or on a remote server. If the file is on a remote server, you must specify the file name using the Universal Naming Convention (UNC) name in the path.
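If the Bulk Insert Task feels like too big a change to try blind, the same operation can be tested outside SSIS first. The following is only a sketch and not part of the original answer: the table name, file path, and connection string are placeholders, and it issues a plain T-SQL BULK INSERT statement (essentially what the task runs) from C# via SqlCommand. Note that the file path is read by the SQL Server instance, not by the machine running the code, which is why the documentation above insists on server-accessible (UNC) paths.

using System;
using System.Data.SqlClient;

class BulkInsertSketch
{
    static void Main()
    {
        // Hypothetical connection string and object names; adjust to your environment.
        const string connectionString =
            "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;";

        // The path is resolved by the SQL Server service account, so it must be a UNC
        // path that the server itself can reach, exactly as the Bulk Insert Task requires.
        const string bulkInsertSql = @"
            BULK INSERT dbo.TargetTable
            FROM '\\ServerShare\Imports\file1.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(bulkInsertSql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
            Console.WriteLine("BULK INSERT completed.");
        }
    }
}

If this statement works from a plain connection but the package still crashes, the problem is more likely in the package itself than in file access.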

SSIS Package stopped working with Error code: 0x80004005

I got this SSIS package working this past December. It only runs on Friday mornings. Last Friday it failed with this error message:
Package:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft Access Database Engine" Hresult: 0x80004005 Description: "The Microsoft Access database engine cannot open or write to the file '\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx'. It is already opened exclusively by another user, or you need permission to view and write its data.".
I've checked out a couple of other questions that were similar, but they did not answer my question. I have checked to make sure that no one has that file open.
The file in question is a template that is copied over and then populated, so no one should have it open to begin with.
I've tried changing RetainSameConnection to True, but it made no difference. I have run it in debug mode, and it works fine.
Anyone got any ideas how to clear this up so it runs automatically again?
UPDATE
After some more testing, it appears that the file is getting the data but isn't being copied. Here is what I have set up:
I have a File System Task that copies a template from my template folder to my Export folder.
Then I have a Data Flow Task:
It begins with an OLE DB Source that runs a SQL script to pull data.
It runs a Data Conversion to update a couple of fields to the correct format.
An Excel Destination is used for the output (this is the template that was copied to the Export folder).
There is also a Flat File Destination, just in case there are any errors.
Then back to the Control Flow with another File System Task; this one moves the file from the Export folder to its final destination on a shared drive.
When I run this from VS 2015 it works fine and creates the file. When I run this from the SQL Agent job it fails with the above error message. The only thing that I can think of is that in the Data Flow Task the Excel Destination isn't releasing the file before the final File System Task tries to copy it? But if that is the case, why did that just start happening now?
I think there is a problem in the file path; as you can see, the file path starts with a single backslash:
\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx
Maybe it should start with a double backslash (\\) if it is on the network:
\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx
Or the drive letter (partition) is missing, which leaves an incomplete path.
Check the way the file path is provided to the connection manager (if you are using expressions).
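If the path comes from an expression, it can be worth validating it before the connection manager ever sees it. This is only a rough sketch with a hard-coded path taken from the error message; in a real package the value would come from a variable or expression, and the check could live in a Script Task:

using System;
using System.IO;

class PathCheckSketch
{
    static void Main()
    {
        // In a real package this would come from a package variable or expression.
        string reportPath = @"\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx";

        // A network path must start with \\ (UNC); a local path needs a drive letter such as C:\.
        bool looksLikeUnc = reportPath.StartsWith(@"\\");
        bool isRooted = Path.IsPathRooted(reportPath);

        Console.WriteLine("UNC: " + looksLikeUnc + ", rooted: " + isRooted +
                          ", exists: " + File.Exists(reportPath));
    }
}

Running such a check under the same account the SQL Agent job uses also helps separate path problems from permission problems.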
Also, based on this Microsoft article, there are two possible causes:
You must have permission to read data in the specified file in order to view its data. To change your permission assignments, see your system administrator or the table or query's creator.
You tried to open the indicated file exclusively, but another user already has the file open.
Looks like an access issue. Ensure that the Agent Service Account is running with full rights to the network share path. Maybe you can try with your credentials on the Agent Server.
Also, ensure that your Excel destination connection string supports .xlsx.
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\path\xxx.xlsb;Extended Properties="Excel 12.0;HDR=YES";
Changing “Excel 12.0” to “Excel 12.0 Xml” will tell the provider to output in .xlsx format instead.
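As a quick sanity check (this sketch is not from the original answer; the path is simply the one from the error message), the corrected provider string can be tried directly from C# before wiring it into the Excel connection manager:

using System;
using System.Data.OleDb;

class ExcelConnectionSketch
{
    static void Main()
    {
        // "Excel 12.0 Xml" targets .xlsx workbooks; plain "Excel 12.0" targets the binary .xlsb format.
        const string connectionString =
            @"Provider=Microsoft.ACE.OLEDB.12.0;" +
            @"Data Source=\\ServerShare\IT\Reports\Export Templates\YoderReport.xlsx;" +
            @"Extended Properties=""Excel 12.0 Xml;HDR=YES"";";

        using (var connection = new OleDbConnection(connectionString))
        {
            connection.Open();   // throws if the provider, path, or permissions are wrong
            Console.WriteLine("Opened the workbook successfully.");
        }
    }
}

If this opens the workbook when run under the Agent service account, permissions are probably not the issue.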

Shared Access 2010 database is in an inconsistent state

I have a shared accde file on a network drive. Occasionally we have an inconsistent-state problem; the error message appears below. It seems to be associated with network connection interruptions for even one user. We have one example where a user unplugged the Ethernet cable and switched automatically to wireless, and other examples where users have left the database open overnight, perhaps while a machine hibernated.
Once this happens the one user cannot work and no one can open the accde file. Other users who have the database open can continue to work.
After the problem occurs it remains until everyone closes the database. At that time it completes whatever recovery it requires and all users can get back in.
This was disruptive when we had six users in one room. Now we have 17 in two cities and a few work-from-home users. It's becoming intolerable.
The obvious answer is to move away from Access. We're working on it, but it's a long way off. In the meantime I would appreciate any advice.
Is there a way to prevent the problem entirely?
Is there a VBA way to detect the problem in the instances that are not showing the error message?
Is there something I'm not thinking of?
What would you do?
Error message:
Microsoft Access has detected that this database is in an inconsistent state, and will attempt to recover the database. During this process, a backup copy of the database will be made and all recovered objects will be placed in a new database. Access will then open the new database. The names of objects that were not successfully recovered will be logged in the "Recovery Error" table.
The solution that Microsoft gives is splitting the database, which just means putting the data tables in a back-end file on the shared server while everyone has their own copy of the front end.
This might cause problems if that front end needs to be updated (e.g. additional forms). Details here:
http://answers.microsoft.com/en-us/office/forum/office_2007-access/microsoft-office-has-detected-that-this-database/3fb41c70-f7ba-41dd-a847-e62203071466?auth=1
Check the row count in the tables; they most likely contain large amounts of data, creating latency on the read and write queries and causing the locking.
Archive older data and keep the database small and neat; perhaps create referenced databases for archived information.
I gather that your MS Access database is getting corrupted when you put it on a shared drive. A Microsoft Access database may get corrupted in a multi-user environment. Here is a workaround that you can use to fix it.
Step 1: Run Command Prompt as Administrator
Click on the Windows icon and type Command Prompt. Then right-click on the Command Prompt and choose Run as administrator option.
Step 2: Execute Compact and Repair Database Command
In the command prompt window, type the following command and then press ‘Enter’.
msaccess <database file name> /compact
In the command, replace <database file name> with the database path. For instance,
msaccess "C:\Program Files\Reports.mdb" /compact
This will start the process to compact and repair the faulty Access database file.
Otherwise, you can check out this thread for an alternative solution: https://dba.stackexchange.com/questions/71906/ms-access-mdb-ldb-database-corrupted/171275#171275

SSIS process files from folder

Background:
I have a folder that gets pumped with files continuously. My SSIS package needs to process the files and delete them. The SSIS package is scheduled to run once every minute. I'm picking up the files in ascending order of file creation time. I'm building an array of files and then processing and deleting them one at a time.
Problem:
If an instance of my package takes longer than one minute to run, the next instance of the SSIS package will pick up some of the files the previous instance has in its buffer. By the time the second instance of the package gets around to processing a file, it may already have been deleted by the first instance, creating an exception condition.
I was wondering whether there was a way to avoid the exception condition.
Thanks.
How are you scheduling the job? If you are using the SQL Server job scheduler, I'm under the impression it should not re-run a job that is already running; see this SO question: Will a SQL Server Job skip a scheduled run if it is already running?
Alternatively, rather than trying to move the file around, you could build a step into your job to test whether it is already running. I've not done this myself, but it would appear to be possible; have a read of this article: Detecting The State of a SQL Server Agent Job.
Can you check for the existence of the file before you delete it?
File.Exists(filepathandname)
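A minimal sketch of that guard (the path is a placeholder; in SSIS it would normally come from the Foreach Loop variable and sit in a Script Task):

using System.IO;

class DeleteIfExistsSketch
{
    static void Main()
    {
        // Hypothetical path; in the package this comes from the loop's file variable.
        string filePathAndName = @"C:\Import\Inbox\file1.csv";

        // Only delete if the file still exists; another instance may already have removed it.
        if (File.Exists(filePathAndName))
        {
            File.Delete(filePathAndName);
        }
    }
}

This narrows the window for the exception but does not remove it entirely, since the other instance can still delete the file between the check and the delete.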
To make sure your packages are not stepping on the same files, you could just create an empty file named like the data file but with another extension (like mydata.csv.being_processed) and make sure your Data Flow Task only runs on files that don't have such a companion file.
This acts as a lock.
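A rough sketch of that marker-file idea, assuming a hypothetical inbox folder and the extension suggested above:

using System;
using System.IO;

class MarkerLockSketch
{
    static void Main()
    {
        // Hypothetical inbox folder and file pattern.
        foreach (string dataFile in Directory.GetFiles(@"C:\Import\Inbox", "*.csv"))
        {
            string marker = dataFile + ".being_processed";
            try
            {
                // CreateNew throws if the marker already exists, so claiming a file is atomic.
                using (new FileStream(marker, FileMode.CreateNew)) { }
            }
            catch (IOException)
            {
                continue;   // another package instance already claimed this file
            }

            try
            {
                // ... process the file here (run the data flow / load into the DB) ...
                File.Delete(dataFile);
            }
            finally
            {
                File.Delete(marker);
            }
        }
    }
}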
Of course you could change the way you're scheduling your jobs, but often, when we encounter such an issue, it's because we have no leverage over those things :)
You can create a "lock file" to prevent parallel execution of packages. To protect yourself from case with crashed package consider using file's create date to emulate lock timeout.
I.e.: at the beginning of package you will check for existence of lock file. If it's non-existent OR it was created more than X hours ago - then continue with import. Otherwise exit.
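A minimal sketch of that check, assuming a made-up lock-file location and a two-hour timeout:

using System;
using System.IO;

class PackageLockSketch
{
    static void Main()
    {
        // Hypothetical lock-file path and timeout.
        string lockFile = @"\\ServerShare\Import\import.lock";
        TimeSpan timeout = TimeSpan.FromHours(2);

        if (File.Exists(lockFile) &&
            DateTime.UtcNow - File.GetCreationTimeUtc(lockFile) < timeout)
        {
            return;   // another instance appears to be running and the lock is still fresh
        }

        // Either no lock, or a stale lock from a crashed run: (re)take it and continue.
        File.WriteAllText(lockFile, DateTime.UtcNow.ToString("o"));
        File.SetCreationTimeUtc(lockFile, DateTime.UtcNow);
        try
        {
            // ... run the import here ...
        }
        finally
        {
            File.Delete(lockFile);   // release the lock for the next scheduled run
        }
    }
}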
I have a similar situation. What you do is have your SSIS package read all files within a folder and create a work file like 'process.txt'. This creates a list of valid files at that point in time. If you have multiple packages, then create the file with a name like 'process_.txt'. Each package will only process the files named in its process file. This will prevent overlap.

DTS - Manual step execution gives different results than just hitting "Run"

I have a DTS (not SSIS) package that hasn't been touched in years, in which I had to update a query. When I run the package by manually executing each step in the editor, everything works out fine and generates a file of a couple thousand records as expected. When I hit the "Execute" button at the top of the editor to run the whole package, it doesn't error, but the file is generated with only 1 record.
All tasks inside the package are either transformation steps or SQL Tasks. There aren't any ActiveX Script tasks. When I watch the process run the steps on its own, the execution follows the mapping correctly.
I'm at a loss on this one. Has anyone seen this issue before or have any idea where to start?
I just ran into a similar issue recently. While working with the senior DBA, we found that the server where the package ran did not have the right permissions to a directory on the network. The package ran fine on my box but died on the production server. We needed to give the SQL Server service account on the production box permission to write to the directory on the network.
You might also want to check out any ActiveX Script step that changes the connection string or destination of Data Pump steps. I've had cases where these were different on the destination server where the DTS packages run.
After going through all of the lines of all of the stored procedures and straight SQL tasks used in the package, I located a SET ROWCOUNT 1 that was never reset. While I was manually executing each step separately, the row count would be automatically reset; however, when it was run as a complete package, it was never reset. Adding SET ROWCOUNT 0 at the end of the particular script resolved the issue.
