I have a strange situation. I've caused strange situations before, but now one is happening to me. I have a .txt file (log.txt) being created on a server drive, and I don't know where it's coming from.
The contents of the .txt file show the actual date and time an application process ran. This is the format of what's in the .txt file:
(date) (time) AM Starting Job: (date) (time) AM
I've checked a number of things to try to see what's causing this. I have identified a SQL Server Agent job that runs at that specific time. It runs an SSIS package. Part of that package runs a PowerShell script that starts 16 processes of an application.
The .txt file is definitely showing data from when the PowerShell script executes the 16 or so Start-Process calls in that script.
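For context, the relevant part of such a script presumably looks something like this (a hypothetical sketch; the executable path and arguments are placeholders, not taken from the actual script):

# Hypothetical launcher: start 16 instances of the application
$exe = 'C:\Apps\TheApp\TheApp.exe'   # placeholder path
1..16 | ForEach-Object {
    Start-Process -FilePath $exe -ArgumentList "/instance:$_"
}

Worth noting: Start-Process only creates files itself when something like -RedirectStandardOutput is passed, so a stray log file usually points at the launched application rather than the launcher.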
The Agent job doesn't have any steps that create such a file.
The SSIS package doesn't have logging turned on. (Right-click the design surface in Visual Studio > Logging.)
There are no tasks in the SSIS project that create the .txt file.
An application runs a process as part of this, and I think that's what's creating the file, but the developer doesn't think it's the app creating it.
Is there anything else I should check to see what's generating this?
I found the answer by using SysInternals > ProcMon. I scheduled the job for a time when I could start ProcMon and monitor the creation of the file. In this case, it was the tool I thought it was. The developer fixed the issue.
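For unattended captures like this, ProcMon can also be started from the command line and left logging to a file. A minimal sketch (the trace path is a placeholder; check the ProcMon documentation for the exact switches available in your version):

Procmon.exe /AcceptEula /Quiet /Minimized /BackingFile C:\Temp\trace.pml
# ...let the scheduled job run, then stop the capture:
Procmon.exe /Terminate

In the resulting trace, filtering on Operation is CreateFile and Path ends with log.txt narrows things down to the process writing the file.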
In case anyone else would like to learn more about SysInternals, here are a few links:
Info on SysInternals:
https://learn.microsoft.com/en-us/sysinternals/
ProcMon:
https://learn.microsoft.com/en-us/sysinternals/downloads/procmon
Video on how to find issues from the creator of SysInternals:
https://channel9.msdn.com/Events/Ignite/2016/BRK4028
Related
This is one that has me stumped, and I've been doing this a long while.
Migrating to SQL Server 2016, with a large number of ETL packages. Easy enough.
One of the ETL packages has a simple Script Task that takes a table of files and runs a file-exists check in a Foreach Loop.
It uses a project parameter to create the UNC path (\\servername\share) and then binds that to the file name in the Script Task.
It uses an environment configuration set up in SSISDB.
Executing in SSDT works fine; deploy it to the catalog and it can't see the file. I know you'll say permissions, but I've granted the Everyone group access to both the share and the drive in case it's that. SSISDB execution means it should be running under my security context, and I'm a domain admin, a local admin, and the creator/owner of the share.
Even stranger, I created a simple package to grab the contents of one of the files and import it into a dump table, in case the permissions or path were duff (even though they work in SSDT, it might have been the environment config in SSISDB). This works fine, therefore it can't be the environment setup in SSISDB being referenced.
Please note this is not running from an Agent job yet, so it won't be due to an Agent service account issue. I need to get it running from SSISDB first; then I'll create an Agent job.
So: a Script Task can't see a UNC share that is built from two variables, even though it works in SSDT and is running under the same credentials...
Go!
For what it's worth, the Script Task code is:
' Requires Imports System.IO at the top of the Script Task code
Dts.Variables("BolFileExists").Value = File.Exists(Dts.Variables("StrLoadFileLocation").Value.ToString() & Dts.Variables("StrCurrentFile").Value.ToString())
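One way to sanity-check what that path resolves to, outside of SSIS but under the same account, is to run the equivalent test in PowerShell. A minimal sketch (the variable values are placeholders for whatever your SSIS variables resolve to):

# Substitute the values your SSIS variables actually resolve to
$loadFileLocation = '\\servername\share\loads\'
$currentFile      = 'extract.csv'
Test-Path -LiteralPath ($loadFileLocation + $currentFile)   # True/False, like File.Exists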
This is a slightly different answer, as it shows a different approach and removes the Script Task: I use a Foreach Loop to check whether the file exists, using the GUI tools provided by SSIS.
Well I found the answer and I deserve to punch myself in the face.
I tried everything. It was a file variable and a path variable being pulled together in the Script Task, so I tried concatenating them before the Script Task and pumped the result into a table to make sure the path it was going to use was right.
Literally everything was fine, and it still didn't work.
The issue....
Building it as a 2017 package and deploying it onto a 2016 SQL Server.
I've not found what was missing DLL-wise, but it must have been one of those that meant the Script Task couldn't find the files. It's weird that it didn't break and just said the files weren't there, though that is consistent with File.Exists, which returns False on any error rather than throwing. (In SSDT, the build target is controlled by the project's TargetServerVersion property.)
Thanks all for the input. I'm going to go put my head in the door and slam it.
Modifying this post as a friend has helped me figure it out.
The culprit was that SQL Server was not able to map a drive letter to the network shared folder, so the deployed SSIS package was not able to write. The execution report showed all green and success, so I was confused as a beginner. See also the comments below.
A backup of the original post is below:
SSIS package text file write works in Visual Studio but not when deployed on SQL Server
I have narrowed the issue down to the text-file write in a Script Task (C#); the experiment simply writes the current timestamp into a text file.
It works in Visual Studio 2015, both with (F5) and without (Ctrl+F5) the debugger. The project is in package deployment mode. When it is deployed to a database server running SQL Server 2016 and execution is triggered manually with an Administrator login, the write never happens, although the execution report shows all success, and the Windows system log shows no clue to me either.
I am a beginner with SSIS, and hints and tips will be highly appreciated.
Yes, SSIS doesn't like mapping a shared folder to a drive letter. Thanks @Nick.McDermaid.
Never use mapped drives. Use UNC instead, i.e. \\server\share\folder –
Nick.McDermaid Nov 8 '18 at 23:40
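The underlying reason is that mapped drive letters belong to the interactive session that created them, so a catalog execution or service typically never sees them. Writing straight to the UNC path sidesteps that; for example (paths hypothetical):

# A drive mapped in an interactive session (e.g. Z:\) is invisible to services;
# write to the UNC path directly instead:
Set-Content -Path '\\server\share\folder\timestamp.txt' -Value (Get-Date)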
Hopefully this question is unique enough not to be a duplicate. I have a PowerShell script which does two things.
Inserts records into a SQL Server table
Writes text to a text file
For the purpose of this post, I have simplified the script. On my computer, the script is located at C:\Temp\ssis.ps1. Following is the contents of the script.
# Run the SSIS package, then append a timestamp to a log file
DTEXEC.EXE /F "C:\Temp\ssisjob.dtsx"
$date = Get-Date
Write-Output "This PowerShell script file was last run on $date" >> C:\Temp\test.txt
When I manually run this PowerShell script, records are inserted into the SQL Server table, and a line of text is written to the test.txt file. If I schedule this script to run using Windows Task Scheduler, a new line of text is written to the text file, but the records are not inserted into the SQL Server table. This tells me that Windows Task Scheduler is able to run the PowerShell script. However, for some unknown reason, Windows Task Scheduler does not seem to run the SSIS job (DTEXEC.EXE) part of the script. Event Viewer confirms there is an issue with the SSIS job. I am running Microsoft SQL Server 2014, Developer Edition.
In my task, on the Actions tab, the Add arguments field has the following reference: C:\Temp\ssis.ps1. Task Scheduler is configured to run with the highest privileges.
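For reference, a Task Scheduler action for a PowerShell script usually puts powershell.exe in Program/script and passes the script file via the arguments, along these lines (the flags are standard powershell.exe parameters):

Program/script: powershell.exe
Add arguments: -NoProfile -ExecutionPolicy Bypass -File "C:\Temp\ssis.ps1"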
I have tried all of the following Execution Policies in PowerShell. Regardless of the Execution Policy I select, my experience does not change.
Bypass
Unrestricted
RemoteSigned
The History tab in Task Scheduler has information events, but no error events.
I do not have the permission to view the SQL Server logs (this is a production server).
I have been debugging this issue for a few weeks, and I have read numerous posts here on Stack Overflow, yet I still cannot seem to find the answer, so hopefully I have done my due diligence before making a new post here. I could add some additional observations, but I do not want my post here to get extensively long. If anyone has any hints or tips or insight that might lead me down the right path, it would be greatly appreciated.
Here is the solution I came up with. Instead of exporting the file to Excel, I exported to a flat file (a .txt file). Also, following Nick McDermaid's excellent recommendations, instead of running PowerShell from Task Scheduler, I started dtexec.exe from Task Scheduler directly.
Task Scheduler Actions tab
Keep the action as Start a program
In Program/script, type dtexec.exe
In Add arguments, type /f "C:\path\to\example.dtsx"
Leave the Start In box empty
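If you'd rather script the task creation itself, the built-in schtasks utility can set up the same action; a rough sketch (the task name and schedule are placeholders):

schtasks /Create /TN "Run SSIS package" /TR "dtexec.exe /f \"C:\path\to\example.dtsx\"" /SC HOURLY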
We've run into a strange issue. We have some SSIS 2008 packages that had been working fine for quite a while. As part of the workflow, they copy files to a backup folder and then delete the original file. (No, I don't know why they didn't just make it a File Move task.) Suddenly, the delete task has stopped working. It's not just our packages, either; it's the packages of a different group on a different SSIS server. Same target folders, though.
The server team has gone over the folder permissions; the account the jobs run under has full access. And if it were an access issue, how could we move the file but not delete it?
We finally found the answer. The file server we're using is Linux-based. It is case sensitive, but it doesn't throw an error if you use the wrong case. It was a very tough issue for us to track down.
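One defensive option in a script is to list the file as the server actually stores it and delete by that exact name, rather than by the casing you expect. A minimal PowerShell sketch (the share and file names are hypothetical):

# Find the file regardless of expected casing, then delete by the name the server returned
$file = Get-ChildItem -Path '\\fileserver\share\backup' -Filter 'dailyextract.csv'
if ($file) { Remove-Item -LiteralPath $file.FullName }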
I need to import a flat file daily. The file changes its name every day. After the file is processed, it needs to be moved to another folder.
I noticed I can schedule jobs in SQL Server Agent, tell them to run every hour or so, and add CMD commands to the steps.
The solution I found was to run a script to check whether the file exists, since the folder should either be empty or have at least one file.
If the file exists, the script renames it to the name used in the SSIS package and then runs the SSIS package.
After the whole thing is done, it should rename the file again based on today's date and move it to another folder.
If the file does not exist, it should do nothing and wait another hour or so to run again. (A sketch of this flow is below.)
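A minimal PowerShell sketch of that flow (folder, file, and package paths are hypothetical placeholders):

# Check the inbound folder for today's file; the exact name varies daily
$inbound = 'C:\ETL\Inbound'
$archive = 'C:\ETL\Archive'
$file = Get-ChildItem -Path $inbound -Filter '*.txt' | Select-Object -First 1
if ($file) {
    # Rename to the fixed name the SSIS package expects, then run the package
    Rename-Item -LiteralPath $file.FullName -NewName 'daily.txt'
    dtexec.exe /F 'C:\ETL\Packages\DailyImport.dtsx'
    # Stamp with today's date and move to the archive folder
    $stamped = 'daily_{0:yyyyMMdd}.txt' -f (Get-Date)
    Rename-Item -LiteralPath (Join-Path $inbound 'daily.txt') -NewName $stamped
    Move-Item -LiteralPath (Join-Path $inbound $stamped) -Destination $archive
}
# If no file exists, do nothing; the scheduled job tries again next hour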
What's the best solution to this scenario? Is the script a good idea? Is it possible to add the if/else (for whether the file exists) into the SSIS package instead? Or even make the script run from the SSIS package itself, instead of adding it to the Server Agent?
EDIT:
It seems I was a little naïve; it's possible to run VB scripts from the server. Would that be the recommended solution? It does solve my problem, but I'm just wondering if it's a good idea.
This solves all my questions:
http://www.sqlservercentral.com/articles/Integration+Services+%28SSIS%29/90571/