I have a user that needs a CSV from SQL Server once or twice each month. I have been running the query and then exporting the CSV to a network location manually, but I want to automate this so that they can pull the data on their own or have it scheduled to update the file every other week.
It has been a long time since I have provided a solution like this, and I am overthinking it, so I am looking for a suggestion on the proper way to deliver a file like this now.
I currently just have a database connection set up in Excel that runs the query when the user wants it, but that feels unprofessional to provide as a solution.
Thanks for any recommendations.
Use PowerShell:
PS C:\Users\david> Invoke-Sqlcmd "select * from sys.objects" | Export-Csv -Path "C:\temp\data.csv" -NoTypeInformation
I need to create a CSV file with shipping details from a stored procedure that runs every day on SQL Server.
What would be the best way to go about this?
The solution we found best was to call the stored procedure from PowerShell and have a Windows Task Scheduler task that runs the PowerShell script every day.
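In case it is useful, here is a minimal sketch of that setup. The server, procedure, share and task names are placeholders, not what we actually used, and Invoke-Sqlcmd needs the SqlServer module installed.

# export-shipping.ps1 -- hypothetical script the scheduled task runs (names and paths are placeholders)
Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "Shipping" `
    -Query "EXEC dbo.usp_GetShippingDetails" |
    Export-Csv -Path "\\fileshare\exports\shipping.csv" -NoTypeInformation

# one-time registration of a daily task that runs the script above
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\scripts\export-shipping.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Daily shipping CSV export" -Action $action -Trigger $trigger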
Another option is BCP, the bulk copy program utility. You can build the export into the stored procedure itself, into the SQL Server Agent job that runs it, or into an SSIS package, wherever you prefer.
Here’s a link to Microsoft’s BCP utility:
https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15
Note that you may need to enable some advanced options in SQL Server first, such as xp_cmdshell if you shell out to bcp from inside a stored procedure. BCP is highly configurable, though, including the ability to dump entire tables or ad hoc queries to a CSV file.
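As a rough illustration only (the table, server and file names here are invented), a queryout call that writes a comma-separated file could look like:

bcp "SELECT OrderID, ShipDate, Carrier FROM SalesDB.dbo.Orders" queryout "C:\temp\orders.csv" -c -t, -S "server\instance" -T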
I don't know about Oracle but for SQL Server the easiest method I found was to use PowerShell.
$hostname = hostname
Invoke-Sqlcmd -ServerInstance $hostname -Database master -Query "select * from sys.sysdatabases" | Export-Csv "d:\result.csv" -NoTypeInformation
This gives a CSV file at the desired location.
If your SQL Server is at a remote location, make sure the machine running the script can connect to it, and if encryption or certificate validation is in place, that you have everything needed for that.
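For example, something like this should work against a remote instance (the instance name is a placeholder, and the -Encrypt/-TrustServerCertificate parameters assume a recent version of the SqlServer module):

Invoke-Sqlcmd -ServerInstance "remote-sql.contoso.com" -Database master `
    -Query "select * from sys.sysdatabases" `
    -Encrypt Mandatory -TrustServerCertificate |
    Export-Csv "d:\result.csv" -NoTypeInformation
# -TrustServerCertificate skips certificate validation; only use it when you cannot install or trust the server's certificate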
I have a dbscript.sql file that I would like to run against roughly 30 databases sitting on a named SQL Server instance. The script is fairly lengthy (1000+ lines) and contains numerous quotes that would have to be escaped to accommodate sp_MSforeachdb. I tried a mass Ctrl+H replace-all on the apostrophes, but that just produced other errors.
I have started down the road of using dbatools and PowerShell to accomplish this task, but I'm wondering if there is a simpler trick to apply this script to multiple databases at once.
dbatools is a good option, as you can use the pipeline to pass the databases into the Invoke-DbaQuery command. An example would be:
Get-DbaDatabase -SqlInstance "server1", "server1\nordwind", "server2" | Invoke-DbaQuery -File "C:\scripts\sql\rebuild.sql"
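For the scenario in the question (every user database on one named instance), something along these lines should work; the instance name and script path are placeholders:

Get-DbaDatabase -SqlInstance "server\instance" -ExcludeSystem |
    Invoke-DbaQuery -File "C:\scripts\dbscript.sql"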
I have a requirement. We have an FTP server where the data changes every day. There are around 9 files, and each file is data for an MS SQL ETL. What I want to do is this: as soon as a file arrives in the FTP location, PowerShell should read the file's modified date and trigger the job in SQL Server. Is that possible with PowerShell?
Challenges involved:
We are limited in technology (only PowerShell and T-SQL can be used).
The old file holds day-minus-1 data, and completely replacing each file takes about 15 minutes; the job should not be triggered before that finishes.
I need your input on this.
You may want to try a FileSystemWatcher. A similar-ish question has been asked before, so I won't try to regurgitate the answer:
Watch file for changes and run command with powershell
See also on MSDN:
FileSystemWatcher class
FileSystemWatcher events
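In the spirit of those links, here is a very rough sketch; the folder, instance and job names are placeholders, and the 15-minute wait is taken from the question. It starts a SQL Agent job when a file shows up:

$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path   = "D:\ftproot\incoming"    # folder the FTP server drops the files into
$watcher.Filter = "*.*"
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    Start-Sleep -Seconds 900               # give the file ~15 minutes to finish being replaced
    Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database msdb `
        -Query "EXEC dbo.sp_start_job @job_name = N'Daily ETL Load';"
}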
I have a PowerShell script that does the following tasks:
Loop a big database table
Generate text file
Zip the text file
FTP upload the zipped file
Write to the log table
The text-file generation step may take a short or a long time depending on the data, and the FTP upload takes a while, so I want to make at least these two steps asynchronous. Is SQL Server Service Broker a viable choice? Is there any example? Any other options?
You can't make them async inline in PowerShell, but you could use the Start-Job cmdlet to run them in the background and wait until they complete.
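For example, a minimal sketch (the FTP host, credentials and paths are placeholders for whatever you already use) that pushes the upload into a background job:

$ftpJob = Start-Job -ScriptBlock {
    param($zipPath)
    # placeholder FTP details -- swap in your real host and credentials
    $client = New-Object System.Net.WebClient
    $client.Credentials = New-Object System.Net.NetworkCredential("ftpuser", "ftppassword")
    $client.UploadFile("ftp://ftp.example.com/outbound/extract.zip", $zipPath)
} -ArgumentList "C:\work\extract.zip"

# ...write to the log table (or start on the next file) here while the upload runs...

Wait-Job $ftpJob | Receive-Job    # surface any upload errors before the script ends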
Using Service Broker will by default make them run asynchronously. The tricky part is if you still want to run some of them sequentially, for which you need to put those messages on the same conversation group.
A while back I needed to parse a bunch of Serv-U FTP log files and store them in a database so people could report on them. I ended up developing a small C# app to do the following:
Look for all files in a dir that have not been loaded into the db (there is a table of previously loaded files).
Open a file and load all the lines into a list.
Loop through that list and use RegEx to identify the kind of row (CONNECT, LOGIN, DISCONNECT, UPLOAD, DOWNLOAD, etc), parse it into a specific kind of object corresponding to the kind of row and add that obj to another List.
Loop through each of the different object lists and write each one to the associated database table.
Record that the file was successfully imported.
Wash, rinse, repeat.
It's ugly but it got the job done for the deadline we had.
The problem is that I'm in a DBA role and I'm not happy with running a compiled app as the solution to this problem. I'd prefer something more open and more DBA-oriented.
I could rewrite this in PowerShell but I'd prefer to develop an SSIS package. I couldn't find a good way to split input based on RegEx within SSIS the first time around and I wasn't familiar enough with SSIS. I'm digging into SSIS more now but still not finding what I need.
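For comparison, the regex-classification step would only be a few lines if I did rewrite it in PowerShell; the patterns and field names below are invented, not the real Serv-U log formats:

$connects = [System.Collections.Generic.List[object]]::new()
$uploads  = [System.Collections.Generic.List[object]]::new()

foreach ($line in Get-Content "D:\logs\servu-sample.log") {
    switch -Regex ($line) {
        '^\[(?<time>[^\]]+)\]\s+CONNECT\s+(?<ip>\S+)' {
            $connects.Add([pscustomobject]@{ Time = $Matches.time; Ip = $Matches.ip })
        }
        '^\[(?<time>[^\]]+)\]\s+UPLOAD\s+(?<file>.+)$' {
            $uploads.Add([pscustomobject]@{ Time = $Matches.time; File = $Matches.file })
        }
        # ...one branch per row type (LOGIN, DISCONNECT, DOWNLOAD, ...)
    }
}
# each list could then be bulk-inserted into its table, e.g. with Write-DbaDbTableData from dbatools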
Does anybody have any suggestions about how I might approach a rewrite in SSIS?
I have to do something similar with Exchange logs. I have yet to find an easier all-SSIS solution. Having said that, here is what I do:
First, I use Log Parser from Microsoft and the bulk copy functionality of SQL Server 2005.
I copy the log files to a directory that I can work with them in.
I created a sql file that will parse the logs. It looks similar to this:
SELECT TO_Timestamp(REPLACE_STR(STRCAT(STRCAT(date,' '), time),' GMT',''),'yyyy-M-d h:m:s') AS DateTime,
       [client-ip], [Client-hostname], [Partner-name], [Server-hostname], [server-IP],
       [Recipient-Address], [Event-ID], [MSGID], [Priority], [Recipient-Report-Status],
       [total-bytes], [Number-Recipients],
       TO_Timestamp(REPLACE_STR([Origination-time], ' GMT',''),'yyyy-M-d h:m:s') AS [Origination Time],
       Encryption, [service-Version], [Linked-MSGID], [Message-Subject], [Sender-Address]
INTO '%outfile%'
FROM '%infile%'
WHERE [Event-ID] IN (1027;1028)
I then run the previous sql with logparser:
logparser.exe file:c:\exchange\info\name_of_file_goes_here.sql?infile=c:\exchange\info\logs\*.log+outfile=c:\exchange\info\logs\name_of_file_goes_here.bcp -i:W3C -o:TSV
Which outputs a bcp file.
Then I bulk copy that bcp file into a premade database table in SQL server with this command:
bcp databasename.dbo.table in c:\exchange\info\logs\name_of_file_goes_here.bcp -c -t"\t" -T -F 2 -S server\instance -U userid -P password
Then I run queries against the table. If you can figure out how to automate this with SSIS, I'd be glad to hear what you did.