I want to create a SQL Server SSIS package that watches a folder and, once all the required files (20 of them) have arrived, executes a SQL statement. The files may arrive at different times, and sometimes they will be CSVs and sometimes they will come as ZIPs. I know SSIS has a WMI Event Watcher Task, but I'm not sure how I can tell it to look for all 20 files. I guess I want the WMI Event Watcher to look into that folder every 30 minutes and, once it sees all the files, move to the next step (an Execute SQL Task). Can someone tell me how I can specify the file name in the WMI Event Watcher Task? Thanks.
This article seems relevant to your plan. You need to create the proper WQL code.
http://blogs.technet.com/b/heyscriptingguy/archive/2007/05/22/how-can-i-monitor-the-number-of-files-in-a-folder.aspx
("ASSOCIATORS OF {Win32_Directory.Name='C:\Logs'} Where " _
& "ResultClass = CIM_DataFile")
I'm not sure how that will behave in the WMI Event Watcher Task, though. Have you looked at the docs for the SSIS task?
Here is a more step-by-step approach:
http://microsoft-ssis.blogspot.com/2010/12/continuously-watching-files-with-wmi.html
Some good points there, even if it doesn't address the pesky 20-file requirement.
You could also have a PowerShell script on the server monitor the files and then move them into a subfolder (which SSIS would be monitoring) once they are all there.
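A rough sketch of that idea: the folder paths, the 20-file count, and the schedule are assumptions based on the question (you could also check against an explicit list of expected file names), and it could run every 30 minutes as a scheduled task or an Agent job step.

# Hypothetical paths - adjust to your environment
$incoming = 'C:\Drop\Incoming'   # where the CSV/ZIP files land
$ready    = 'C:\Drop\Ready'      # the folder the SSIS package watches

# Count both .csv and .zip arrivals
$files = Get-ChildItem -Path $incoming -File |
    Where-Object { $_.Extension -in '.csv', '.zip' }

if ($files.Count -ge 20) {
    # All expected files have arrived; hand them over to SSIS
    $files | Move-Item -Destination $ready
}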
Here is a doc page showing how to specify one file:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa394594(v=vs.85).aspx
With that, I'm sure you could set up a chain of WMI checks in your SSIS package.
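For instance, the data-query form for one specific file looks roughly like this (the path and file name are placeholders, and note that WQL needs the backslashes doubled); you can test it from PowerShell before wiring it into the event query that the WMI Event Watcher Task expects:

# Hypothetical file path - WQL requires escaped backslashes
Get-CimInstance -Query "SELECT * FROM CIM_DataFile WHERE Name = 'C:\\Drop\\Incoming\\file01.csv'"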
Related
I'm trying to load multiple files from a location into a DB using a Foreach Loop Container and a Data Flow Task in SSIS.
The package crashes when I try to execute it. It doesn't give any error message; whenever I execute the package, it crashes and closes the Visual Studio app immediately. I have to kill the debug task in Task Manager before I can execute the package again.
So I tried the below steps:
I used a File System Task instead of the Data Flow Task to just move all the files from the source to the archive directory, which ran fine without any issues.
I ran the Data Flow Task individually to load a single file into the DB, which also executed successfully.
I couldn't figure out what was going wrong here. Any help would be appreciated! Thanks!
Screenshots
All screenshots look fine to me. I will give some tips to try to figure out the issue.
Since the File System Task executes without any problem, there is no problem with the Foreach Loop Container. You can try removing the OLE DB Destination and replacing it with a dummy task to check whether it is causing the issue. If the issue remains, the Flat File Source could be the cause.
Things to try
Make sure that the TargetServerVersion is accurate. You can learn more about this property in the following article: How to change TargetServerVersion of my SSIS Project
Try running the package in 32-bit mode. You can do this by changing the Run64bitRuntime property to False. You can learn more about this property in the following article: Run64bitRunTime debugging property
Try running Visual Studio in safe mode. You can use the following command: devenv.exe /SafeMode.
Workaround - Using Bulk Insert
Since you are inserting flat files into the SQL database without performing any transformation, why not use the SSIS Bulk Insert Task? You can refer to the following step-by-step guide for more information:
SSIS Basics: Bulk-Import various text files into a table
As mentioned in the official documentation, make sure that the following requirements are met:
The server must have permission to access both the file and the destination database.
The server runs the Bulk Insert task. Therefore, any format file that the task uses must be located on the server.
The source file that the Bulk Insert task loads can be on the same server as the SQL Server database into which data is inserted, or on a remote server. If the file is on a remote server, you must specify the file name using the Universal Naming Convention (UNC) name in the path.
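The Bulk Insert task essentially wraps the T-SQL BULK INSERT statement, so one way to sanity-check the load outside SSIS is to run the equivalent statement yourself. A minimal sketch from PowerShell (requires the SqlServer module); the server, database, table, file path, and delimiters below are all hypothetical and need to match your actual flat files:

# Hypothetical table, UNC path, and server - replace with your own
$query = @"
BULK INSERT dbo.StagingTable
FROM '\\fileserver\share\input\file01.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the flat file
    ROWTERMINATOR   = '\n',  -- row delimiter
    FIRSTROW        = 2      -- skip the header row
);
"@

# The file path is resolved by the SQL Server service account, not the client
Invoke-Sqlcmd -ServerInstance 'MYSQLSERVER' -Database 'MyDatabase' -Query $query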
This is one that has me stumped, and I've been doing this a long while.
Migrating to SQL Server 2016, a large number of ETL packages. Easy enough.
One of the ETL packages has a simple Script Task that takes a table of files and runs a file-exists check in a foreach loop.
It uses a project parameter to build the UNC path (\\servername\share) and then binds that to the file name in the Script Task.
It uses an environment config set up in SSISDB.
Executing in SSDT works fine; deploy it to the catalog and it can't see the file. I know you'll say permissions, but I've granted the Everyone group access to the share and the drive in case it's that. SSISDB execution means it should be running under my security context, and I'm a domain admin, a local admin, and the creator/owner of the share.
Even stranger, I created a simple package to grab the contents of one of the files and import it into a dump table, in case the permissions or the path were duff (even though they work in SSDT, it might have been the environment config in SSISDB). This works fine, so it can't be the environment setup in SSISDB being referenced.
Please note this is not running from an Agent job yet, so it won't be due to the Agent service account. I need to get it running from SSISDB first, then I'll create an Agent job.
So -- the Script Task can't see a UNC share built from two variables, even though it works in SSDT and it's running under the same credentials...
Go
For what it's worth, the script task code is:
Dts.Variables("BolFileExists").Value = File.Exists(Dts.Variables("StrLoadFileLocation").Value.ToString & Dts.Variables("StrCurrentFile").Value.ToString)
This is a slightly different answer, as it shows a different approach and removes the script task: I use a Foreach loop to check whether the file exists, using the GUI tools provided by SSIS.
Well I found the answer and I deserve to punch myself in the face.
I tried everything. It was a file variable and a path variable being pulled together in the script task, so I tried concatenating them before the script task and pumped the result into a table to check the value it was building.
Literally everything was fine, and it still didn't work.
The issue....
Building it as a 2017 package and deploying it onto a 2016 SQL Server.
I've not found what was missing DLL-wise, but it must have been one of those, and it meant the script task couldn't find the files. Weird that it didn't break and just said the files weren't there!
Thanks all for the input. I'm going to go put my head in the door and slam it.
I have an SSIS package which looks for 8 files in a pre-defined location. Using a Script Task, I check whether any files are missing. If any files are missing, I send an email stating so. Now I want to stop the current package after the Send Mail task if any files are missing.
From the Microsoft link:
RunningPackage.Stop Method
I can see I can stop the SSIS package by stopping the SSIS service (on the SQL Server that is running the SSIS service), which I do not want to do, as I am not sure whether it will start the SSIS service again automatically. Also, I do not have permission to see and run the packages on the SSIS server and test it that way.
I am not sure how to stop it using the DTEXEC tool either. I would appreciate any kind of help.
Just fail the container by adding the code Dts.TaskResult = ScriptResults.Failure to the script.
After that, you can add a "Failure" container (just add a container and change the green arrow to a red one) to send out the email.
Once you do this, you must force a "fail" on the email container: go to the container's properties (you can just right-click and choose "Properties"), look for "FailPackageOnFailure", and change it to "True". Hope this helps.
I need to import a flat file daily. The file changes its name every day. After the file is processed, it needs to be moved to another folder.
I noticed I can schedule jobs in SQL Server Agent, tell them to run every hour or so, and add CMD commands to them.
The solution I found was to run a script to check if the file exists, since the folder should be empty or have at least one file.
If the file exists, the script renames the file to one used in the SSIS package and then it runs the SSIS package.
After the whole thing is done, it should rename the file again based on today's date and move it to another folder.
If the file does not exist, then it should do nothing and wait another hour or so to run again.
What's the best solution for this scenario? Is the script a good idea? Is it possible to add the if/else (for whether the file exists) into the SSIS package? Or even run the script from the SSIS package itself instead of adding it to SQL Server Agent?
EDIT:
It seems I was a little naïve; it is possible to run VB scripts from the server. Would that be the recommended solution? It does solve my problem, but I'm just wondering if it's a good idea.
This solves all my questions:
http://www.sqlservercentral.com/articles/Integration+Services+%28SSIS%29/90571/
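For reference, a rough PowerShell sketch of the flow described in the question (check for a file, rename it to the name the SSIS package expects, run the package, then archive it with today's date). The paths, package location, and fixed file name are all hypothetical, and a file-system deployment run with dtexec is assumed:

# All paths and names below are hypothetical placeholders
$inbox     = 'C:\ETL\Inbox'
$archive   = 'C:\ETL\Archive'
$fixedName = 'daily_import.txt'

$file = Get-ChildItem -Path $inbox -File | Select-Object -First 1
if ($file) {
    # Rename to the name the SSIS package expects
    Rename-Item -Path $file.FullName -NewName $fixedName

    # Run the package (file-system deployment assumed)
    & dtexec /File 'C:\ETL\Packages\DailyImport.dtsx'

    if ($LASTEXITCODE -eq 0) {
        # Rename with today's date and move to the archive folder
        $archivedName = 'daily_{0:yyyyMMdd}.txt' -f (Get-Date)
        Move-Item -Path (Join-Path $inbox $fixedName) -Destination (Join-Path $archive $archivedName)
    }
}
# If no file exists, do nothing; the Agent job simply runs again in an hour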
I have a requirement. We have an FTP server where the data changes every day. There are around 9 files, and each file is data for an MS SQL ETL. What I want to do is this: as soon as a file arrives in the FTP location, PowerShell should read the date modified of the file and trigger the job in SQL Server. Is that possible with PowerShell?
Challenges involved
We are limited in technology (only PowerShell and T-SQL can be used).
The old (day - 1) data in each file is completely replaced, which takes about 15 minutes per file; the job should not be triggered before that finishes.
Need your inputs on this.
You may want to try a FileSystemWatcher. A similar-ish question has been asked before, so I won't try to regurgitate the answer:
Watch file for changes and run command with powershell
See also on MSDN:
FileSystemWatcher class
FileSystemWatcher events
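For illustration, a minimal sketch of that approach (the watch folder, server name, and Agent job name are hypothetical, and it assumes the SqlServer module is available). It waits until the new file stops growing before starting the job, since each file takes roughly 15 minutes to be replaced:

# Hypothetical folder, server, and Agent job name
$watcher = New-Object System.IO.FileSystemWatcher '\\ftpserver\incoming', '*.*'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    $path = $Event.SourceEventArgs.FullPath

    # Wait until the file stops growing (the replacement copy can take ~15 minutes)
    do {
        $size = (Get-Item $path).Length
        Start-Sleep -Seconds 60
    } until ((Get-Item $path).Length -eq $size)

    # Start the SQL Agent job (requires the SqlServer module)
    Invoke-Sqlcmd -ServerInstance 'MYSQLSERVER' -Database 'msdb' `
        -Query "EXEC dbo.sp_start_job @job_name = N'Load FTP Files';"
}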