I have a PowerShell script that does the following tasks:
Loop through a big database table
Generate a text file
Zip the text file
FTP upload the zipped file
Write to the log table
The text file generation step may take a short or a long time depending on the data, and the FTP upload takes a while. So I want to make at least these two steps asynchronous. Is SQL Server Service Broker a viable choice? Is there any example? Any other options?
You can't make individual steps asynchronous within a single PowerShell pipeline, but you could use the Start-Job cmdlet to run them as background jobs and wait until they complete.
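For example, a minimal sketch that pushes the zip-and-upload work onto a background job while the main script continues; the paths, FTP server, and credentials are placeholders:

# Minimal sketch: zip + FTP upload in a background job (placeholders throughout).
$textFile = 'C:\export\data.txt'
$zipFile  = 'C:\export\data.zip'

$job = Start-Job -ScriptBlock {
    param($txt, $zip)

    # Zip the text file (Compress-Archive requires PowerShell 5+).
    Compress-Archive -Path $txt -DestinationPath $zip -Force

    # Upload the zip via FTP using .NET's FtpWebRequest.
    $request = [System.Net.FtpWebRequest]::Create('ftp://ftp.example.com/data.zip')
    $request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $request.Credentials = New-Object System.Net.NetworkCredential('user', 'password')
    $bytes  = [System.IO.File]::ReadAllBytes($zip)
    $stream = $request.GetRequestStream()
    $stream.Write($bytes, 0, $bytes.Length)
    $stream.Close()
    $request.GetResponse().Close()
} -ArgumentList $textFile, $zipFile

# ... continue with other work (e.g. the next table) ...

# Block until the upload finishes, collect output/errors, then log the result.
Wait-Job $job | Receive-Job
Remove-Job $job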
Using Service Broker will by default make the steps run asynchronously. The tricky part is if you still want to run some of them sequentially, in which case you need to assign them a conversation group ID.
Ideally I would like to run something from a SQL query or a SQL Agent job to FTP upload a file to an external site, but I cannot use xp_cmdshell.
Yes. You need to split your work into two separate tasks:
How to run an executable or a batch program from within SQL Server without resorting to xp_cmdshell.
An example of how to do it can be found in:
https://www.mssqltips.com/sqlservertip/2014/replace-xpcmdshell-command-line-use-with-sql-server-agent/.
You should modify this example to suit your particular needs. The suggested stored procedure would (a rough sketch follows this list):
run the command passed as a parameter in a SQL Agent job created on the fly (using the CmdExec subsystem)
wait for job completion (query msdb.dbo.sysjobactivity) or kill the job if a predefined timeout has been reached
return the results of the job execution (query msdb.dbo.sysjobhistory)
delete the job
Note: a full version of SQL Server is required. If you are using the Express edition, you would have to define a Windows scheduled task manually.
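Not the tip's code, but a rough sketch of that create/start/wait/read/delete sequence, driven here from PowerShell with Invoke-Sqlcmd; the instance name, job name, and command are placeholders, and the linked tip wraps the same msdb calls in a stored procedure:

# Rough sketch only; the linked tip packages these msdb calls in a stored procedure.
$server  = 'MYSQLSERVER'                              # placeholder instance
$jobName = 'AdHocCmd_' + [guid]::NewGuid()
$command = 'cmd /c echo hello > C:\temp\out.txt'      # placeholder command

# 1. Create a throwaway job with a single CmdExec step, target the local server, start it.
Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query @"
EXEC dbo.sp_add_job       @job_name = N'$jobName';
EXEC dbo.sp_add_jobstep   @job_name = N'$jobName', @step_name = N'run',
                          @subsystem = N'CmdExec', @command = N'$command';
EXEC dbo.sp_add_jobserver @job_name = N'$jobName', @server_name = N'(local)';
EXEC dbo.sp_start_job     @job_name = N'$jobName';
"@

# 2. Wait for completion (add a timeout check here to kill long-running jobs).
do {
    Start-Sleep -Seconds 5
    $running = Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query @"
SELECT COUNT(*) AS cnt
FROM dbo.sysjobactivity a
JOIN dbo.sysjobs j ON j.job_id = a.job_id
WHERE j.name = N'$jobName' AND a.start_execution_date IS NOT NULL
  AND a.stop_execution_date IS NULL;
"@
} while ($running.cnt -gt 0)

# 3. Read the outcome from the job history, then 4. delete the job.
Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query "SELECT run_status, message FROM dbo.sysjobhistory WHERE step_id = 0 AND job_id = (SELECT job_id FROM dbo.sysjobs WHERE name = N'$jobName');"
Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query "EXEC dbo.sp_delete_job @job_name = N'$jobName';"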
How to send a file via FTP using a batch program.
Please see:
How to ftp with a batch file?
Is there any way to retrieve a file from a web server using the FTP protocol without using the "Get a file with FTP" job step? I can only use a transformation in Pentaho. Any ideas?
You can use the REST Client step to do that in a transformation. I am doing the same thing in some of my transformations: GET the file content into the stream using the REST Client step, then use a Text File Output step to store the content in a file. The file format is your choice.
Even if you can only use a transformation, a transformation can call a job using the "Job Executor" step. You can do the FTP in the job you are calling.
A note of caution: you have to explicitly check the output of the Job Executor step for errors.
I have a requirement. We have an FTP server where the data changes every day. There are around 9 files, and each file is data for an MS SQL ETL. As soon as a file arrives in the FTP location, PowerShell should read the file's modified date and trigger the job in SQL Server. Is that possible with PowerShell?
Challenges involved
Limited technology (only PowerShell and T-SQL can be used)
Old (day - 1) data: completely replacing each file takes about 15 minutes, and the job should not be triggered before the replace is complete.
I need your input on this.
You may want to try a FileSystemWatcher. A similar-ish question has been asked before, so I won't try to regurgitate the answer:
Watch file for changes and run command with powershell
See also on MSDN:
FileSystemWatcher class
FileSystemWatcher events
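A minimal sketch of that idea, assuming the files land in a local folder the watcher can see, the ETL is started by a SQL Agent job, and the 15-minute replace window from the question applies; the folder, server, and job name are placeholders:

# Minimal sketch; folder, instance, and job name are placeholders.
$watcher = New-Object System.IO.FileSystemWatcher 'C:\ftp\incoming', '*.*'
$watcher.NotifyFilter = [System.IO.NotifyFilters]'FileName, LastWrite'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Changed -Action {
    $path = $Event.SourceEventArgs.FullPath

    # The FTP replace can take ~15 minutes, so only trigger once the file
    # has stopped changing for at least that long.
    while (((Get-Date) - (Get-Item $path).LastWriteTime).TotalMinutes -lt 15) {
        Start-Sleep -Seconds 60
    }

    # Start the SQL Agent job that runs the ETL.
    Invoke-Sqlcmd -ServerInstance 'MYSQLSERVER' -Database msdb -Query "EXEC dbo.sp_start_job @job_name = N'DailyETL';"
}

# Keep the script alive so the event subscription stays registered.
while ($true) { Start-Sleep -Seconds 300 }

Note that Changed can fire more than once per file, so in practice you would also want to de-duplicate triggers per file and per day.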
I want to create a SQL Server SSIS package where I can watch a folder, and once I have all the required files (20 files) I want to execute a SQL statement. The files may come at different times, and sometimes they will be CSV and sometimes ZIP. I know SSIS has a WMI Event Watcher task, but I'm not sure how I can specify that it should look for all 20 files. I guess I want the WMI Event Watcher to look into that folder every 30 minutes and, once it sees all the files, move to the next step (Execute SQL task). Can someone tell me how I can specify the file name in the WMI Event Watcher task? Thanks.
This article seems relevant to your plan. You need to create the proper WQL code.
http://blogs.technet.com/b/heyscriptingguy/archive/2007/05/22/how-can-i-monitor-the-number-of-files-in-a-folder.aspx
("ASSOCIATORS OF {Win32_Directory.Name='C:\Logs'} Where " _
& "ResultClass = CIM_DataFile")
I'm not sure how that will behave in the WMI Event watcher though. Have you looked at the docs for the SSIS task?
Here is a more step-by-step approach:
http://microsoft-ssis.blogspot.com/2010/12/continuously-watching-files-with-wmi.html
Some good points there, even if it doesn't address the pesky 20-file requirement.
You could also have a PowerShell script on the server monitor the files and then move them into a subfolder once they are all there, and have SSIS monitor that subfolder.
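A rough sketch of that monitor-and-move script; the paths, expected file count, and polling interval are placeholders:

# Rough sketch: wait until all expected files are present, then move them for SSIS.
$incoming = 'C:\incoming'
$ready    = 'C:\incoming\ready'
$expected = 20

while ($true) {
    $files = Get-ChildItem -Path $incoming -File
    if ($files.Count -ge $expected) {
        $files | Move-Item -Destination $ready
        break
    }
    Start-Sleep -Seconds 1800   # poll every 30 minutes
}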
Here is a doc page showing how to specify one file:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa394594(v=vs.85).aspx
With that, I'm sure you could set up a chain of WMI checks in your SSIS package.
What is the most secure and easiest way to send approx. 1000 different records into a database that is not directly accessible (a MySQL database on a web provider's server) from a Windows application?
The data will be stored in different tables.
Edited:
The application will be distributed to users who have no idea what a database or PuTTY is... They just install my application, open it, enter some data, and press Submit.
Currently I'm using PHP to upload the generated script to the web server and process it there. I think I should also include some signature with the file to avoid "DROP ..." hacks.
If you can export the data as a SQL script, you can just run it against the remote server using your application of choice. 1000 records won't create that big a script.
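For instance, a sketch of running such an exported script from PowerShell with the stock mysql command-line client; the host, credentials, database, and path are placeholders, and this assumes the provider allows remote MySQL connections at all:

# Sketch only; requires the mysql client on the PATH and a host that accepts remote connections.
# Embedding the password on the command line is shown only for brevity.
Get-Content 'C:\export\records.sql' -Raw |
    & mysql.exe --host=db.example.com --user=appuser --password=secret --database=appdb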
In the current project at my job we have the same situation: a remote (faraway) database.
I made the following solution: serialize the SQL query into XML and send it via HTTP to a web daemon running on the remote server, instead of exposing the SQL server. The daemon checks the credentials and executes the query.
As I can't execute any external programs on the external server, I created the following solution (a rough client-side sketch follows below):
My program creates a script file and calculates its salted hash
The program sends this file, together with the user credentials and the hash, to a PHP page on the server
The PHP page checks the username and password, then checks the hash, and then executes the script. Only INSERT and UPDATE commands are allowed.
Is this approach secure enough?
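For reference, a minimal sketch of the client side of that flow in PowerShell; the upload.php URL, form field names, salt, and credentials are all hypothetical placeholders, and the PHP-side checks are not shown:

# Client-side sketch only; endpoint, field names, salt, and credentials are hypothetical.
$scriptFile = 'C:\export\records.sql'
$salt       = 'shared-secret-salt'    # known to both the client and the PHP page

# Salted SHA-256 of the script contents.
$bytes  = [System.IO.File]::ReadAllBytes($scriptFile)
$sha256 = [System.Security.Cryptography.SHA256]::Create()
$hash   = [BitConverter]::ToString(
              $sha256.ComputeHash([byte[]]($bytes + [System.Text.Encoding]::UTF8.GetBytes($salt)))
          ).Replace('-', '')

# POST the script, hash, and credentials to the PHP page over HTTPS.
Invoke-RestMethod -Uri 'https://example.com/upload.php' -Method Post -Body @{
    username = 'appuser'
    password = 'secret'
    hash     = $hash
    script   = [System.IO.File]::ReadAllText($scriptFile)
}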