I have a folder with .sql files, one file per query. I want to execute all of the queries/SQL files and save each result as a CSV.
Is there a way to automate that without using the Windows CLI (it is disabled in my environment)? I do have SQL Server Management Studio.
I would approach this task using SSIS, provided you have Business Intelligence Development Studio (BIDS) installed.
First, create a 'Foreach Loop Container' pointed at the folder with the SQL files, then use a variable to retrieve each file name.
Next, create a file connection and set its 'ConnectionString' property to the variable that contains the file location.
Next, using the 'Execute SQL Task' component, set the 'SQLSourceType' to 'File Connection' and the 'FileConnection' to the one created in the previous step.
Finally, depending on how the data is returned, you have a couple of options. If the result set is small (only a single row or a single column), you can save the results to a variable and, in a 'Data Flow' task, use a 'Derived Column' component to write the contents of that variable out to a CSV file. If the result set is larger, you can dump the results to a temp table and then use an 'OLE DB Source' with a 'Flat File Destination' to push the full result set straight into a CSV.
Hopefully this isn't too convoluted a solution. This approach has the advantage that it can be run either from a remote machine or from the server itself, and you can automate its execution with a SQL Agent job.
Create a VB.NET console application.
Generate a list of files that end in .SQL from the folder in question.
Load the contents of each file into individual SQL commands.
Execute each command, storing the results in DataSets.
For each table in each DataSet, create a new .csv file.
For each .csv file, iterate over each cell in the DataTable and apply proper escaping for .csv output.
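A minimal sketch of those steps, written in C# (the translation to VB.NET is direct). The folder path, server, and database names are placeholders for your environment; each result set in a script becomes its own CSV.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

class RunQueriesToCsv
{
    static void Main()
    {
        string folder = @"C:\Queries";  // placeholder: folder holding the .sql files
        string connStr = "Server=.;Database=MyDb;Integrated Security=true";  // placeholder

        foreach (string file in Directory.GetFiles(folder, "*.sql"))
        {
            var ds = new DataSet();
            using (var conn = new SqlConnection(connStr))
            using (var da = new SqlDataAdapter(File.ReadAllText(file), conn))
                da.Fill(ds);  // one DataTable per result set in the script

            for (int i = 0; i < ds.Tables.Count; i++)
            {
                string suffix = ds.Tables.Count > 1 ? "_" + i : "";
                File.WriteAllLines(Path.ChangeExtension(file, null) + suffix + ".csv",
                                   CsvLines(ds.Tables[i]));
            }
        }
    }

    static IEnumerable<string> CsvLines(DataTable t)
    {
        // header row, then data rows, all passed through the CSV escaper
        yield return string.Join(",", t.Columns.Cast<DataColumn>().Select(c => Escape(c.ColumnName)));
        foreach (DataRow r in t.Rows)
            yield return string.Join(",", r.ItemArray.Select(v => Escape(Convert.ToString(v))));
    }

    static string Escape(string s)
    {
        // quote any field containing a comma, quote, or line break; double embedded quotes
        return s.IndexOfAny(new[] { ',', '"', '\r', '\n' }) >= 0
            ? "\"" + s.Replace("\"", "\"\"") + "\""
            : s;
    }
}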
Use 'for' in combination with either the sqlcmd or bcp command for each file in the script folder.
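For example, run from the script folder (server and database are placeholders; sqlcmd writes one CSV per script, but note it does not add CSV quoting, so fields containing commas need extra handling):

for %f in (*.sql) do sqlcmd -S YourServer -d YourDb -E -i "%f" -s"," -W -o "%~nf.csv"

Double the percent signs (%%f, %%~nf) if you put this in a .bat file.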
We are writing a new application, and while testing we will need a bunch of dummy data. I added that data by using MS Access to dump Excel files into the relevant tables in the Postgres database.
How do I now generate INSERT statements from the pgAdmin 4 tool, similar to the way SQL Server Management Studio lets us generate INSERT statements for SQL Server? I don't see any options for it, and I can't use the closest alternative, which is to export and import the data via CSV.
I understand that you cannot import the CSV file into the actual DB, as that needs to be done through ASP.NET Core EF. However, you could create a test schema and import the CSV file into it. Once you have the data in the test schema, you can use it to generate SQL statements with the steps below:
Right-click the target table and select "Backup".
Select a file path to store the backup. You can save the file name as data.backup.
Choose "Plain" as the format.
Open the "Options" tab and check "Use Column Inserts".
Click the "Backup" button.
Once the file is generated, you can open it with Notepad++ or VS Code to get the SQL INSERT statements.
You can then use the generated statements and delete the test schema you created.
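If you prefer to script the same thing rather than click through the dialog, pg_dump can produce the column-style INSERTs directly (the user, schema, table, and database names below are placeholders):

pg_dump -U user1 --data-only --column-inserts --table=test_schema.customer_table --file=data.sql test_db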
Here is a resource that might help with loading data from an Excel file into PostgreSQL, if you still need to take that path: Transfer Data from Excel to PostgreSQL.
Every week I have to upload text files from a server location into Microsoft SQL Server, using SQL Server Management Studio. I wish to automate the task so that the files are uploaded automatically. Can somebody suggest a way?
Methods I know of:
Via SQL:
Use OPENROWSET to open the file and obtain the records to write into a table.
Use BULK INSERT to open the file and insert directly into a table; you may need to pair it with xp_cmdshell to get a directory listing to loop through (a sketch follows after this list).
Via SSMS:
Create a Data Flow to import from the file.
SSMS makes it easier to do clever things with the import process, but it can be very finicky.
With both of these, you can set up an Agent job to run the script/package automatically.
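A minimal sketch of the BULK INSERT route above, assuming a tab-delimited text file with a header row (the table name and file path are placeholders):

BULK INSERT dbo.WeeklyUpload
FROM '\\fileserver\drop\weekly.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);

Wrapped in a stored procedure or an Agent job step, this runs on a schedule without any client-side tooling.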
I have an ogr2ogr batch file that reprojects SQL data into a new SQL Server table.
It works fine when I run the .bat file manually, but it fails when I run it via a SQL Server stored procedure. I have given the GDAL folders permissions for the SQL Server service account, and xp_cmdshell is enabled. I'm using
EXECUTE xp_CMDShell 'blah'
in the T-SQL script.
For some reason the ogr_MSSQLSpatial.dll causes it to fail.
ERROR 1: Can't load requested DLL: Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins\ogr_MSSQLSpatial.dll
If I remove this DLL, the script runs via SQL, but it means I need to add extra commands for things the DLL would otherwise take care of, such as setting the source coordinate system. I haven't managed to get it working 100%. The furthest I got was producing the reprojected table, but with an empty geometry field.
The DLL does issue SQL commands against the system tables. Could a SQL Server security restriction be stopping it from working?
I hit this problem again with another ogr2ogr .bat executed from SQL. If I put the .bat in the same folder as the DLLs, it works fine.
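If keeping the .bat next to the DLLs is inconvenient, another option that may work is pointing GDAL at the plugin folder explicitly at the top of the batch file; GDAL_DRIVER_PATH is the environment variable GDAL uses to locate its plugins (the path below is the one from the error message):

set GDAL_DRIVER_PATH=Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins

Note also that a mapped drive letter such as Z: may not exist in the SQL Server service account's session, so a UNC path may be needed when running via xp_cmdshell.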
I am working on 64-bit Windows Server 2012. I want to be able to import data from a .dbf file into a SQL Server table. I used the Import Wizard and it worked correctly. However, I have SQL Server Express and can't schedule this insertion.
Is there another way to schedule the insertion of the .dbf data into the SQL Server tables, without the use of the SSIS package loader?
Update
I ended up using Python and writing a script to import from XML. However, I believe the answer by @Oleg was the most accurate, given the circumstances.
Thank you all!
You can also use DBF Commander Pro for this task:
Create a command line for your insertion: choose 'File -> Export to DBMS'. Specify the transfer options in the window that appears, then copy the command line from the bottom of the window.
Create a text .BAT file and insert the copied command line, e.g.:
"c:\Program Files\DBFCommander\DBFCommander.exe" -edb "D:\Data\customer.dbf" customer_table "Provider=SQLOLEDB.1;User ID=user1;Initial Catalog=test_db;Data Source=test_server"
Make a schedule using the Windows Task Scheduler that will execute this .BAT file.
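For example, a weekly task could be registered like this (the task name, path, and schedule are placeholders):

schtasks /create /tn "DBF weekly import" /tr "C:\Scripts\dbf_import.bat" /sc weekly /d MON /st 06:00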
Additional info that may be useful for you:
Using DBF in batch mode
Export DBF file to SQL database
I suggest the following approach:
Create a C# script that uses the OleDbConnection (to fetch) and SqlConnection (to upload) objects to import data from the .DBF file into a SQL Server table (sketched below).
Automate the execution of the script file using LINQPad, the LINQPad command-line utility (lprun.exe), and the Windows Task Scheduler.
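A rough sketch of step 1 (connection strings, file, and table names are placeholders; the Jet dBASE provider is 32-bit only, so the process must run as x86, and SqlBulkCopy is used here for the upload over the SqlConnection):

using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class DbfToSql
{
    static void Main()
    {
        // connect to the folder; the table name is the .dbf file name without extension
        var table = new DataTable();
        using (var dbf = new OleDbConnection(
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data;Extended Properties=dBASE IV;"))
        using (var da = new OleDbDataAdapter("SELECT * FROM customer", dbf))
            da.Fill(table);

        using (var sql = new SqlConnection("Server=.;Database=test_db;Integrated Security=true"))
        {
            sql.Open();
            // assumes the destination table already exists with matching column order
            using (var bulk = new SqlBulkCopy(sql) { DestinationTableName = "dbo.customer_table" })
                bulk.WriteToServer(table);
        }
    }
}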
Useful links:
How to get data from DBF file using C#
How to load data into a database using C#
About LINQPad command-line utility
Another way is to create a SQL Server linked server over an ODBC data source that points at the DBF. Use the Windows Task Scheduler to call SQLCMD.EXE to run some SQL that copies the data in.
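A rough sketch, assuming an ODBC DSN named MyDbfDsn has already been set up to point at the DBF folder (the linked server and table names are placeholders):

EXEC sp_addlinkedserver @server = N'DBFSRV', @srvproduct = N'', @provider = N'MSDASQL', @datasrc = N'MyDbfDsn';

INSERT INTO dbo.customer_table
SELECT * FROM OPENQUERY(DBFSRV, 'SELECT * FROM customer');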
So I have a lot of table-creation scripts that I need to run on a fresh database. They are all in one folder, and there are over 250 of them.
Is there a good way to go about doing this using SSMS?
Option 1: USE A BATCH FILE
Use a batch file to run all the scripts in one click; under the hood, cmd will use sqlcmd.
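A minimal sketch of such a batch file (the server, database, and folder are placeholders; -E uses Windows authentication):

@echo off
cd /d C:\CreateScripts
for %%f in (*.sql) do sqlcmd -S YourServer -d FreshDb -E -i "%%f"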
Option 2: MERGE MULTIPLE SQL FILES INTO A SINGLE SQL FILE AND THEN EXECUTE IN SSMS IN ONE SHOT
Create a simple batch file in the folder where your 250 table-creation scripts reside, as follows:
type *.sql > allinone.sql
Note: open a text document, copy and paste the above content, and save it with the .bat extension (e.g. merge.bat).
Double-clicking that batch file will give you a single SQL file, which you can run in SSMS in one go.