I have a linked database in SQL Server Management Studio (SSMS). I can query it and get exactly the data I need, in the format I need, headers and all.
I need a way to run this query on a schedule and export the results as a CSV automatically.
Currently my process is to open the query in SSMS, run it (which, mind you, takes 15 minutes), then right-click in the results area and choose "Save Results As".
I would like to either automate it within SSMS or be able to script it. What's confusing to me is how to write a script for a linked database.
I noticed that when you right-click on the linked server there are a couple of options to script it. For example, there's "Create To", "DROP To", and "DROP And CREATE To". Each of the scripts seems to be formatted in a similar way, along the lines of:
USE [master]
GO
/****** Object: LinkedServer [QREMOTE] Script Date: 2/24/2022 10:59:26 AM ******/
EXEC master.dbo.sp_dropserver @server=N'QREMOTE', @droplogins='droplogins'
GO
I'm not familiar with this scripting and I'm not sure how to use it. Any assistance would be appreciated!
Some background: this is a QuickBooks database connected through the QODBC driver, which I'm querying in SSMS for specific fields from a table that need to be exported in a specific order so they can be imported into another system.
Using SQL Server Management Studio 18.4 on 2019 servers.
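One way to script the export, assuming sqlcmd is installed and the query is saved to a file (the server, database, and path names below are placeholders), is a small batch file that Windows Task Scheduler can then run on whatever schedule you like. Note that sqlcmd's output is delimiter-separated rather than a fully quoted CSV, and it prints a dashed separator row under the headers unless you post-process it or pass -h-1 to drop the headers entirely:

```bat
rem Run the saved linked-server query and write comma-separated output
rem -s"," sets the column separator, -W trims trailing whitespace
sqlcmd -S MYSERVER -d master -E -i "C:\queries\quickbooks_export.sql" ^
    -o "C:\exports\quickbooks_export.csv" -s"," -W
```

Because the query itself lives in the .sql input file, the fact that it goes through a linked server (OPENQUERY or four-part names) changes nothing about the automation.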
Is there an easier way to allow an end user with NO access to anything SQL-related to fire off some SQL commands that:
1.) create and update a SQL table, and
2.) then create a file from that table (a CSV in my case) that they can access in a folder share?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, hence I am not in control of ANY permissions, access, etc. For example:
declare @bcpCommandIH varchar(200)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
So the way I achieve this now is by letting the end users run a Crystal report, which fires a SQL stored procedure that creates and updates a SQL table and then writes a CSV file the end user can access. Creating and updating the table is easy. Getting the file into the hands of the end user is nothing but trouble in this hosted environment.
We always end up with permission or other folder-share issues and it's a complete waste of time. The cloud service admins tell me "this is a huge security issue and you need to start and stop xp_cmdshell with some commands every time you want to generate this file, to be safe".
Well, this is nonsense to me. I don't want to have to touch any of this; it needs to be AUTOMATED for the end user, start to finish.
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file without all the administration trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
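For what it's worth, the "start and stop xp_cmdshell every time" the admins describe can itself be scripted inside the stored procedure, so nobody has to touch it manually. A sketch, reusing the bcp command above (it requires ALTER SETTINGS permission, and the security objection may still stand):

```sql
-- Enable xp_cmdshell just for this export, then turn it back off
EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;            RECONFIGURE;

DECLARE @bcpCommandIH varchar(500);
SET @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, ';
EXEC master..xp_cmdshell @bcpCommandIH;

EXEC sp_configure 'xp_cmdshell', 0;            RECONFIGURE;
EXEC sp_configure 'show advanced options', 0;  RECONFIGURE;
```

This keeps the feature disabled except for the few seconds the export actually runs, which is usually what the "be safe" request amounts to.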
Thanks,
MP
Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC Export. There are 3rd-party tools that allow that to happen even if the table already exists (giving you the option to replace or append records in an existing target table).
But it's not clear why you can't get the data directly into the Crystal report and simply export to csv.
There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.
I have a SQL Server database set up that I manage using SQL Server Management Studio 17.
In that database, I have 27 tables that I maintain by running pretty simple OPENQUERY scripts every morning, something to the effect of:
DROP TABLE IF EXISTS [databasename].[dbo].[table27]
SELECT * INTO [databasename].[dbo].[table27] FROM OPENQUERY(OracleInstance, '
SELECT
table27.*
FROM
table27
INNER JOIN table26 ON table27.criteria = table26.criteria
WHERE
< filter >
< filter >
');
And this works great! But it is cumbersome, every morning, to sign into SSMS, right-click on my database, hit "New Query", copy in 27 individual SQL scripts, and run them. I am looking for a way to automate that. My scripts live in a single directory, one .sql file per table.
I don't know if this is achievable in SSMS or in something like a batch script. I would imagine, for the latter, some pseudocode looking like:
connect to sql server instance
given instance:
    for each sql_script in directory:
        sql_script.execute
I have tried creating a script in SSMS, by following:
Tasks -> Script Database ->
But there is no option to execute a .sql file on the tables in question.
I have tried looking at the following resources on using T-SQL to schedule nightly jobs, but have not had any luck conceiving of how to do so:
https://learn.microsoft.com/en-us/sql/ssms/agent/schedule-a-job?view=sql-server-2017
Scheduled run of stored procedure on SQL server
The expected result would be the ability to automatically run the 27 sql queries in the directory above to update the tables in SQL Server, once a day, preferably at 6:00 AM EST. My primary issue is that I cannot access anything but SQL Server Management Studio; I can't access the configuration manager to use things like SQL Server Agent. So if I am scheduling a task, I need to do so through SSMS.
You actually can't access the SQL Server Agent via Object Explorer?
This is located below "Integration Services Catalog"
You describe not being able to access that in the question, for some reason. If you can't access it, then something is wrong with SQL Server, or perhaps you don't have admin rights to do things like schedule jobs (a guess there).
In SSIS you would want to use an Execute T-SQL Statement Task and write your statement in the SQL Statement field in the General tab.
However, I would look at sqlcmd. Simply make a batch script and schedule it in Task Scheduler (if you're using Windows). For example:
for %%G in (*.sql) do sqlcmd /S servername /d databaseName -E -i"%%G"
pause
From this post: Run all SQL files in a directory.
So basically you have to create a PowerShell script that calls and executes the SQL scripts.
After that you can add your PowerShell script to Task Scheduler.
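A minimal sketch of such a script, assuming the SqlServer PowerShell module (which provides Invoke-Sqlcmd) is installed; the server, database, and directory names are placeholders:

```powershell
# Run every .sql file in the directory, in name order, against the target database
Get-ChildItem -Path 'C:\sql_scripts' -Filter '*.sql' |
    Sort-Object Name |
    ForEach-Object {
        Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Database 'databasename' -InputFile $_.FullName
    }
```

Sorting by name makes the run order deterministic, which matters if any of the 27 scripts depend on tables refreshed by an earlier one.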
I suggest you add these scripts as jobs for the SQL Server Agent.
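If Agent is available, the job doesn't even need the GUI: it can be created entirely in T-SQL through the stored procedures in msdb. A sketch with placeholder names, showing one job step and a daily 6:00 AM schedule to match the question:

```sql
USE msdb;
-- Create the job, add one T-SQL step per script (one shown), schedule it daily at 06:00
EXEC dbo.sp_add_job @job_name = N'Refresh Oracle tables';
EXEC dbo.sp_add_jobstep
    @job_name      = N'Refresh Oracle tables',
    @step_name     = N'table27',
    @subsystem     = N'TSQL',
    @database_name = N'databasename',
    @command       = N'DROP TABLE IF EXISTS dbo.table27; SELECT * INTO dbo.table27 FROM OPENQUERY(OracleInstance, ''SELECT ...'');';
EXEC dbo.sp_add_schedule @schedule_name = N'Daily 6AM',
    @freq_type = 4, @freq_interval = 1, @active_start_time = 060000;  -- HHMMSS
EXEC dbo.sp_attach_schedule @job_name = N'Refresh Oracle tables', @schedule_name = N'Daily 6AM';
EXEC dbo.sp_add_jobserver @job_name = N'Refresh Oracle tables';
```

Each of the 27 scripts would become its own sp_add_jobstep call (or you can paste them all into a single step's @command).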
I have to upload text files weekly from a server location into a SQL Server database (which I manage through SQL Server Management Studio). I wish to automate the task so that the files are uploaded automatically. Can somebody suggest a way?
Methods I know of:
Via SQL:
Use OPENROWSET to open the file and obtain the records to write into a table.
Use BULK INSERT to open the file and insert directly into a table (you may need to pair it with xp_cmdshell to get a directory listing to loop through).
Via SSIS:
Create a Data Flow to import from the file.
SSIS makes it easier to do clever things with the import process, but it can be very finicky.
With both of those you can set up an Agent job to run the script / package automatically.
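For the BULK INSERT route, a minimal sketch (the table name and file path are placeholders; the target table must already exist with columns matching the file layout):

```sql
-- Load a tab-delimited text file straight into a staging table
BULK INSERT dbo.WeeklyImportStaging
FROM '\\fileserver\incoming\weekly_upload.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- tab-delimited columns
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2       -- skip the header row
);
```

Wrapped in an Agent job step, this gives you the weekly automation with no client machine involved; the file path is resolved on the server, so the share must be readable by the SQL Server service account.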
Say I already created my database but forgot to save the SQL commands to create it.
How could I reverse engineer the code from an already existing database?
I'm using Microsoft SQL Server Express 2008.
You can do this pretty easily by using SQL Server Management Studio (SSMS) - it's available for free if you don't already have it installed.
Connect to the database
Expand out Databases > YourDataBaseName.
Right-click on the database and select the option "Script database as" then "Create To" then finally "File".
That will create the necessary scripts to recreate your database.
To script out all the tables in your database:
Right-click on the database node
Select "Tasks" then "Generate Scripts".
When the wizard appears, click Next.
Select the database. At this point you can check the "Script all objects in the selected database" which does exactly what it says, or if you leave it unchecked you will get the option later in the process to pick which items are scripted.
Click next. Now you're given some scripting options.
I'd suggest scrolling down the list and checking the option to Script Indexes/Script Triggers. You can also script the data if necessary (though I wouldn't do this if you've got a lot of data in your database).
Modify any options you'd like and click Next.
Select the database types you'd like to script (Users/Tables/Views). Click Next.
Now you've got the opportunity to select more specific items. Hit Next and repeat the process of any of your other database types.
Hit next one more time, then select where you'd like the script written to. You get the chance to review your selections.
Click Finish.
Here's a link for the 2008 version SSMS Express 2008
Your RDBMS comes with some sort of "dump" tool that will give you the structure and content of your database, in the form of SQL statements.
As others have mentioned, if you have SQL Server Management Studio (you should; it's free as part of SQL Server Express): fire it up, connect to your instance, then expand the Databases tree.
Right-click on your database and select Tasks -> Generate Scripts...
Click next, then Next again (which selects all objects in the database by default), pick an output option (defaults as "Save to File"), click next and voila!
If you also want to script the data as well as the schema, in the "Set Scripting Options" window, click on the Advanced button, scroll down to "Types of data to script" (just above the Table/View Options header) and select "schema and data".
[Edit] Tested - The Generate Scripts option exists and works in the (free) 2008 R2 edition of SSMS. See the link in my comment below for the URI for the R2 version.
I use the Database Publishing Wizard in Visual Studio to create a script of a test database (schema and data): The wizard creates a file that I store in my source control system. When I make a few changes to the database I want to publish again in order to save the changes in the source control system.
The problem for me is that the wizard creates a line like the following for every object:
/****** Object: Schema [dbo] Script Date: 06/18/2010 15:47:19 ******/
Since these lines contain a date, I get thousands of changed lines even if I just added one record to the database, and thus comparing against the previous version is virtually impossible.
Anyone know how to suppress these lines?
The Database Publishing Wizard options seem really limited to me too. If you script the database using the SQL Server Management Studio Generate Scripts option, you can opt to turn off the "Script Descriptive Headers" option, which takes these headers out. In 2008 this can do the data too.
If that breaks your workflow you could just strip the lines out with search / replace.
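That search / replace can itself be automated so the stripped file is what lands in source control. A PowerShell sketch (the file names are placeholders) that drops every "Script Date" header comment:

```powershell
# Remove the /****** Object: ... Script Date: ... ******/ comment lines
(Get-Content 'database_script.sql') |
    Where-Object { $_ -notmatch 'Script Date:' } |
    Set-Content 'database_script_clean.sql'
```

Matching on the literal text "Script Date:" is enough here because the wizard only emits it in those header comments; with the date lines gone, diffs show only real schema and data changes.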