End user initiating SQL commands to create a file from a SQL table? - sql-server

Using SQL Server Management Studio 18.4 on SQL Server 2019.
Is there an easier way to allow an end user with NO access to anything SQL-related to fire off some SQL commands that:
1.) create and update a SQL table, and
2.) then create a file from that table (CSV in my case) in a folder share they can access?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, so I am not in control of ANY permissions, access, etc. For example:
declare @bcpCommandIH varchar(200)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
The way I achieve this now is by letting the end users run a Crystal Report which fires a SQL STORED PROCEDURE that runs some code to create and update a SQL table, then creates a CSV file the end user can access. Creating and updating the table is easy. Getting the table into the hands of the end user is nothing but trouble in this hosted environment.
We always end up with permission or other folder-share issues, and it's a complete waste of time. The cloud service admins tell me "this is a huge security issue and you need to enable and disable xp_cmdshell with some commands every time you want to generate this file to be safe".
Well, this is nonsense to me. I don't want to have to touch any of this, and it needs to be AUTOMATED for the end user from start to finish.
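For reference, what the admins are describing is presumably the standard sp_configure toggle wrapped around every export; a minimal sketch:

-- Enable xp_cmdshell just for the export (requires ALTER SETTINGS permission)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- ... run the bcp export shown above ...

-- Turn it back off afterwards
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;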
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file, without all the administrative trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
Thanks,
MP

Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC Export. There are third-party tools that allow that to happen even if the table already exists (giving you the option to replace or append records in an existing target table).
But it's not clear why you can't get the data directly into the Crystal Report and simply export it to CSV.

There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.

Related

Execute Stored Procedure on multiple databases on multiple servers SQL Server

I have multiple databases on multiple servers (SQL Server 2008) with similar schemas. I want to execute a stored procedure on each of them. Right now I have to execute it one by one on every server via SQL Server Management Studio.
Is there any option in SQL Server Management Studio to execute the SP just once against all the databases?
You can use a group query to run a script against more than one server. Look here.
Then use the sp_MSForEachDB mentioned by @Ram.
There are two ways I can suggest if you want to avoid doing it programmatically.
1) Use Registered Servers in SSMS. Each target database can be created as a Registered Server within a Server Group. You can then right click on the Server Group and select "New Query". This query will execute against all Registered Servers in the Group. This is explained in detail on MSSQLTips.
2) SQL Multi Script is a dedicated tool we developed at Red Gate to satisfy this use case. However, this isn't integrated into SSMS.
Using the sp_MSForEachDB stored procedure, you should be able to execute against multiple databases on the same server.
EXEC sp_msforeachdb 'IF ''?'' NOT IN (''DBs'',''to'',''exclude'')
BEGIN
    EXEC sp_whatever_you_want_to
END'
Looking around, I'm sure you could write a PowerShell or batch script to do this, but I don't have time to learn, build and test one.
So I'll do it in the languages I'm happiest in: SQL and batch script.
Paste the below query into SSMS and run it, substituting:
Your server list
The path to a file containing the script you want to run (i.e. replace C:\YourSqlScript.SQL)
The path to a log file (i.e. replace C:\YourOutputLog.TXT)
You might want to alter your script and add SELECT @@SERVERNAME at the start, to log the server name to your output file.
WITH ServerList AS (
    SELECT 'Server1' ServerName UNION ALL
    SELECT 'Server2' UNION ALL
    SELECT 'Server3' UNION ALL
    SELECT 'Server4' UNION ALL
    SELECT 'Server5'
)
SELECT 'SQLCMD -S ' + ServerName + ' -E -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT'
FROM ServerList
UNION ALL
SELECT 'PAUSE'
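For illustration, the result set (which you copy out as plain text) should look roughly like this. Note that SQLCMD's -o switch overwrites the output file on every run, so as written only the last server's log survives; appending with >> C:\YourOutputLog.TXT instead is one way around that.

SQLCMD -S Server1 -E -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server2 -E -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server3 -E -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server4 -E -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
SQLCMD -S Server5 -E -i C:\YourSqlScript.SQL -o C:\YourOutputLog.TXT
PAUSE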
So in this example, the file C:\YourSqlScript.SQL should probably contain something like:
SELECT @@SERVERNAME
EXEC sp_msforeachdb 'USE [?]; SELECT ''?''; EXEC p_YourStoredProcedure;'
(Thanks to @Ram for providing this)
(You should definitely test this script in just one database first)
Copy the output and paste it into a text file. Save the text file as MyFirstBatchFile.CMD. Double-click this file.
Check the output file (C:\YourOutputLog.TXT)
This is not going to work the first time - I just built it on the fly to show you how it can be done. If/when you get your first error, sit back, take a look, and see if you can solve it yourself.
If you need to do this regularly, then you can have a think about how you want to automate it. For example, there is a way to automate getting a list of servers (hint: SQLCMD -L).
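A rough sketch of that step (the output includes a header line and indented names, so it needs a little cleanup before it can feed the ServerList CTE above):

SQLCMD -L > C:\MyServerList.txt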
If you are going to regularly administer multiple servers, you should consider using PowerShell.

Update SQL Server 2005 view with new database name?

I have approximately 100 SQL views that are a variation of this:
select * from RTC.dbo.MyTable
...now I find I need to change the name of the RTC database to something else. Rather than edit one view at a time, is there a way to script out all their drop/create statements to a text file so that I can do a global replacement?
In SSMS, right-click the database, go to Tasks and select 'Generate Scripts...'. Select 'Views', select the views you want exported, and export.
I'd use PowerShell. If you're not using the SQL 2008 client tools, install them. Then get the PowerShell client, add the registered snap-ins (plenty of information out there on how to do that), and then use the directory structure to get to the folder representing your Views.
Then script them using something like:
Get-ChildItem | % {$_.Script()}
Use ScriptOptions to tell it to use an Alter script.
And replace "RTC." with the new database name... and run them using sqlcmd.
PowerShell actually becomes a really nice deployment option too.

How to import data from a view in another database (in another server) into a table in SQL Server 2000?

I was thinking about using the bcp command to solve the user authentication issue, but is bcp capable of importing into a table in my database? By the way, I am using a SQL Server 2000 environment.
Here's the code I have got so far:
SET @Command = 'bcp "SELECT vwTest.* from [myserver\sql].test.dbo.vwTest" queryout dbo.Test -C ACP -c -r \n -t ";" -S myserver\sql -Umyuser -Puser1'
EXEC master.dbo.xp_cmdshell @Command
Based on the comparison of BCP, BULK INSERT, OPENROWSET (infer Linked Server) here:
...the bcp utility runs out-of-process. To move data across process memory spaces, bcp must use inter-process data marshaling. Inter-process data marshaling is the process of converting parameters of a method call into a stream of bytes. This can add significant load to the processor. However, because bcp [both] parses the data and [converts the] data into [the] native storage format in the client process, they can offload parsing and data conversion from the SQL Server process.
...bcp possibly isn't the most efficient means of transferring data. You might be better off to:
Create a linked server instance to the other database
Use INSERT statements, so that the tables are populated based on records from the database exposed in the linked server instance.
Besides potentially being more efficient, you only need to set up the linked server instance once, versus running BCP to create output files every time you want to move data.
Note that the linked server instance is tied to a user on the other database, so permissions on the other database are based on that user's permissions.
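A minimal sketch of that approach, reusing the names from the question (SRCSRV stands in for the other server's name, and the columns Col1/Col2 are made up):

-- One-time setup: register the other instance as a linked server.
-- With @srvproduct = N'SQL Server', @server must be the remote instance's actual network name.
EXEC sp_addlinkedserver @server = N'SRCSRV', @srvproduct = N'SQL Server'

-- Then populate the local table straight from the remote view.
INSERT INTO dbo.Test (Col1, Col2)
SELECT Col1, Col2
FROM SRCSRV.test.dbo.vwTest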
Sure! Use this command (adapt it to your needs) on your source machine:
bcp database.dbo.viewname out c:\temp\viewname.bcp
and then import the data back into your destination system using:
bcp newdatabase.dbo.importtable in c:\temp\viewname.bcp -c -S(servername) -U(username) -P(password)
That should grab the contents of your "viewname" from the source server, put it in a temporary file, and insert that file back into the new database on the new server.
Typically, you would load those data rows into a new, temporary staging table, and from there, use T-SQL or other means to insert the data into your actual tables.
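That pattern, sketched with made-up column names:

-- Staging table matching the shape of the bcp file
CREATE TABLE dbo.Staging_Test (Col1 varchar(50), Col2 varchar(50))

-- Load it with the "in" command from above:
--   bcp newdatabase.dbo.Staging_Test in c:\temp\viewname.bcp -c -S(servername) -U(username) -P(password)

-- Then move the rows into the real table with T-SQL
INSERT INTO dbo.Test (Col1, Col2)
SELECT Col1, Col2
FROM dbo.Staging_Test

DROP TABLE dbo.Staging_Test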
Check out the MSDN documentation on bcp in SQL Server 2000 for details on all those switches and their meanings.

How to version control SQL Server databases?

I have SQL Server databases and make changes to them. Some database tables have starting records required for my app to run. I would like to version control the database and these records (rows). Is it possible to do this and bundle it into the SVN version control I have for my source code, or are there other solutions to this? I would like to be able to return to a previous version of the database and compare changes between database revisions. It would be nice if the tools for this were free, open source, or not very expensive.
My environment is Visual C# Express, SQL Server 2008 Express and Tortoise SVN.
Late answer but hopefully useful to other readers
I can suggest using the SSMS add-in called ApexSQL Source Control. By utilizing this add-in, developers can easily map database objects to the source control system via a wizard directly from SSMS. It includes support for Git, Mercurial, Subversion, and TFS (including Visual Studio Online), among other source control systems. It also supports source-controlling static data (so you can version control records as well).
After downloading and installing ApexSQL Source Control, simply right-click the database you want to version control and navigate to the ApexSQL Source Control sub-menu in SSMS. Click the "Link database to source control" option and select the source control system and the database development model.
After that, you may exclude objects you don’t want to be linked to source control. It is possible to exclude specific objects by owner or type.
On the next step, you will be prompted to provide the log-in information for the source control management system.
Once done, just click the "Finish" button and the "Action center" window will be shown, offering the objects that will be committed to the repository (this is the default behavior when the repository is empty).
Once the database has been linked to source control, all the operations that can be executed from a source control client will be available from the "Object Explorer" pane. These include:
checking out the versioned objects, with or without a lock,
viewing the history of an object and applying a specific revision,
viewing the changes that were made to an object, and
placing data from a table under source control using the "Link static data" option.
You can read this article for more information: http://solutioncenter.apexsql.com/sql-source-control-reduce-database-development-time/
We've just started doing the following on some of our projects, and it seems to work quite well for populating "static" tables.
Our scripts follow a pattern where a temp table is constructed, and is then populated with what we want the real table to resemble. We only put human readable values here (i.e. we don't include IDENTITY/GUID columns). The remainder of the script takes the temp table and performs appropriate INSERT/UPDATE/DELETE statements to make the real table resemble the temp table. When we have to change this "static" data, all we have to update is the population of the temp table. This means that DIFFing between versions works as expected, and rollback scripts are as simple as getting a previous version from source control.
The INSERT/UPDATE/DELETEs only have to be written once. In fact, our scripts are slightly more complicated, and have two sets of validation run before the actual DML statements. One set validates the temp table data (i.e. that we're not going to violate any constraints by attempting to make the database resemble the temp table). The other validates the temp table against the target database (i.e. that foreign keys are available).
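A minimal sketch of the pattern, using a hypothetical dbo.Colour lookup table (the validation steps are omitted):

-- Temp table holds the human-readable target state.
CREATE TABLE #Colour (ColourCode varchar(10) NOT NULL PRIMARY KEY, ColourName varchar(50) NOT NULL);

INSERT INTO #Colour (ColourCode, ColourName)
VALUES ('R', 'Red'), ('G', 'Green'), ('B', 'Blue');

-- Make the real table resemble the temp table.
UPDATE t SET ColourName = s.ColourName
FROM dbo.Colour t
JOIN #Colour s ON s.ColourCode = t.ColourCode;

INSERT INTO dbo.Colour (ColourCode, ColourName)
SELECT s.ColourCode, s.ColourName
FROM #Colour s
WHERE NOT EXISTS (SELECT * FROM dbo.Colour t WHERE t.ColourCode = s.ColourCode);

DELETE t
FROM dbo.Colour t
WHERE NOT EXISTS (SELECT * FROM #Colour s WHERE s.ColourCode = t.ColourCode);

DROP TABLE #Colour;

When the static data changes, only the VALUES list changes, so the source control diff shows exactly which rows were added, changed, or removed.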
Static data support is being added to SQL Source Control 2.0, currently available in beta. More information on how to try this can be found here:
http://www.red-gate.com/messageboard/viewtopic.php?t=12298
There is a free Microsoft product called the Database Publishing Wizard which you can use to script an entire database (schema and data). It is great for taking snapshots of the current state of a DB and will enable you to recreate it from scratch at any point.
For database (schema) versioning we use custom extended properties, which are added to the database when the installer is run. The contents of these scripts are generated by our build scripts.
The script to set the properties looks like this:
DECLARE @AssemblyDescription sysname
SET @AssemblyDescription = N'DailyBuild_20090322.1'
DECLARE @AssemblyFileVersion sysname
SET @AssemblyFileVersion = N'0.9.3368.58294'
-- The extended properties DatabaseDescription and DatabaseFileVersion contain the
-- AssemblyDescription and AssemblyFileVersion of the build that was used for the
-- database script that creates the database structure.
--
-- The current value of these properties can be displayed with the following query:
-- SELECT * FROM sys.extended_properties
IF EXISTS (SELECT * FROM sys.extended_properties WHERE class_desc = 'DATABASE' AND name = N'DatabaseDescription')
BEGIN
    EXEC sys.sp_updateextendedproperty @name = N'DatabaseDescription', @value = @AssemblyDescription
END
ELSE
BEGIN
    EXEC sys.sp_addextendedproperty @name = N'DatabaseDescription', @value = @AssemblyDescription
END
IF EXISTS (SELECT * FROM sys.extended_properties WHERE class_desc = 'DATABASE' AND name = N'DatabaseFileVersion')
BEGIN
    EXEC sys.sp_updateextendedproperty @name = N'DatabaseFileVersion', @value = @AssemblyFileVersion
END
ELSE
BEGIN
    EXEC sys.sp_addextendedproperty @name = N'DatabaseFileVersion', @value = @AssemblyFileVersion
END
GO
You can get a version of SQL Server Management Studio for SQL Server Express. I believe you'll be able to use it to produce scripts of your database schema. That will leave you to create scripts by hand for inserting the starting records.
Then, put all the scripts into source control, along with a master script that runs the individual scripts in the correct order.
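A sketch of such a master script, using the :r include directive from SQLCMD mode (the file names are hypothetical):

:r .\01_CreateSchema.sql
:r .\02_CreateViews.sql
:r .\03_SeedStartingRecords.sql

Run it with something like sqlcmd -S .\SQLEXPRESS -E -i Master.sql, or enable SQLCMD Mode from the Query menu in Management Studio.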
You'll be able to run diffs using WinDiff (free with the Visual Studio SDK); alternatively, Beyond Compare is inexpensive and a great diff/merge/sync tool.
MS Visual Studio Team System for Database Developers has functionality to easily generate create scripts for the whole schema. The only drawback is the cost!
Have you considered using SubSonic?
Alternatively, you could use DB-specific versioning.
http://msdn.microsoft.com/en-us/library/ms189050.aspx
When either the READ_COMMITTED_SNAPSHOT or ALLOW_SNAPSHOT_ISOLATION database options are ON, logical copies (versions) are maintained for all data modifications performed in the database. Every time a row is modified by a specific transaction, the instance of the Database Engine stores a version of the previously committed image of the row in tempdb. Each version is marked with the transaction sequence number of the transaction that made the change. The versions of modified rows are chained using a link list. The newest row value is always stored in the current database and chained to the versioned rows stored in tempdb.
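For reference, those options are enabled per database; a sketch with a hypothetical database name:

ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON;
ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;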
I use bcp for this (the bulk loading utility, part of a standard SQL Server install, Express edition included).
Each table with data needs a control file Table.ctl and a data file Table.csv (these are text files that can be generated from an existing database using bcp). As text files, these can very easily be versioned.
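A sketch of how the two files can be produced for one table (the database and instance names match the batch below; bcp's format mode writes the control/format file):

bcp MyDatabaseName.dbo.MyTable out MyTable.csv -T -S .\SQLEXPRESS -c
bcp MyDatabaseName.dbo.MyTable format nul -f MyTable.ctl -T -S .\SQLEXPRESS -c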
As part of my generation batches (see my answer there for more information), I iterate through every control file like this:
SET BASE_NAME=MyDatabaseName
SET CONNECT_STRING=.\SQLEXPRESS
FOR /R %%i IN (.) DO (
    FOR %%j IN ("%%~fi\*.ctl") DO (
        ECHO + %%~nj
        bcp %BASE_NAME%..%%~nj in "%%~dpsj%%~nj.csv" -T -E -S %CONNECT_STRING% -f "%%~dpsj%%~nj.ctl" >"%TMP%\%%~nj.log"
        REM IF ERRORLEVEL 1 is needed here: inside a parenthesized block,
        REM %ERRORLEVEL% would be expanded before bcp even runs.
        IF ERRORLEVEL 1 (
            TYPE "%TMP%\%%~nj.log"
            GOTO ERROR_USAGE
        )
    )
)
A current limitation of this script is that the name of the file must be the name of the table, which may not be possible if the table name contains certain special characters.
This project has a good example of deploy and rollback

How can I parse Serv-U FTP logs with SSIS?

A while back I needed to parse a bunch of Serv-U FTP log files and store them in a database so people could report on them. I ended up developing a small C# app to do the following:
Look for all files in a dir that have not been loaded into the db (there is a table of previously loaded files).
Open a file and load all the lines into a list.
Loop through that list and use RegEx to identify the kind of row (CONNECT, LOGIN, DISCONNECT, UPLOAD, DOWNLOAD, etc), parse it into a specific kind of object corresponding to the kind of row and add that obj to another List.
Loop through each of the different object lists and write each one to the associated database table.
Record that the file was successfully imported.
Wash, rinse, repeat.
It's ugly but it got the job done for the deadline we had.
The problem is that I'm in a DBA role and I'm not happy with running a compiled app as the solution to this problem. I'd prefer something more open and more DBA-oriented.
I could rewrite this in PowerShell but I'd prefer to develop an SSIS package. I couldn't find a good way to split input based on RegEx within SSIS the first time around and I wasn't familiar enough with SSIS. I'm digging into SSIS more now but still not finding what I need.
Does anybody have any suggestions about how I might approach a rewrite in SSIS?
I have to do something similar with Exchange logs. I have yet to find an easier all-SSIS solution. Having said that, here is what I do:
First I use LogParser from Microsoft and the bulk copy functionality of SQL Server 2005.
I copy the log files to a directory where I can work with them.
I created a SQL file that parses the logs. It looks similar to this:
SELECT TO_Timestamp(REPLACE_STR(STRCAT(STRCAT(date,' '), time),' GMT',''),'yyyy-M-d h:m:s') as DateTime,
    [client-ip], [Client-hostname], [Partner-name], [Server-hostname], [server-IP],
    [Recipient-Address], [Event-ID], [MSGID], [Priority], [Recipient-Report-Status],
    [total-bytes], [Number-Recipients],
    TO_Timestamp(REPLACE_STR([Origination-time],' GMT',''),'yyyy-M-d h:m:s') as [Origination Time],
    Encryption, [service-Version], [Linked-MSGID], [Message-Subject], [Sender-Address]
INTO '%outfile%'
FROM '%infile%'
WHERE [Event-ID] IN (1027;1028)
I then run the previous SQL with LogParser:
logparser.exe file:c:\exchange\info\name_of_file_goes_here.sql?infile=c:\exchange\info\logs\*.log+outfile=c:\exchange\info\logs\name_of_file_goes_here.bcp -i:W3C -o:TSV
Which outputs a bcp file.
Then I bulk copy that bcp file into a premade database table in SQL server with this command:
bcp databasename.dbo.table in c:\exchange\info\logs\name_of_file_goes_here.bcp -c -t"\t" -T -F 2 -S server\instance -U userid -P password
Then I run queries against the table. If you can figure out how to automate this with SSIS, I'd be glad to hear what you did.
