Using Microsoft products, I have a collection of SQL scripts and a VS project that I use to refresh a DW on a weekly basis. I know the process can be automated, but the documentation for VS is so vast I don't know where to start. The HIGH LEVEL process is outlined below:
Open SSMS and MANUALLY run scripts to truncate tables and drop indexes
Open the VS project and in dev mode MANUALLY press Start to extract the data from the application into the truncated tables in the DW
Open SSMS and MANUALLY run transformation scripts to create analysis cubes end users can access
I am trying to get to a point where I can just schedule this process to run every X period so I don't have to press any buttons.
From what you described, it seems like SSIS can cover the tasks you listed. SSDT will need to be installed to use it. Since you're using scripts, I'm assuming you have SQL script files saved that you execute. These can definitely be run in SSIS using an Execute SQL Task, with a file connection as the SQL source. With a collection of scripts, I'd suggest looking into a Foreach Loop that iterates through the folder(s) containing the scripts and runs each one via an Execute SQL Task. As far as running the Visual Studio project in development mode, configurations in Visual Studio can be used to accomplish this for SSIS. SSIS has both an Analysis Services Processing Task and an Analysis Services Execute DDL Task, and it sounds like you're looking for the latter. Both XMLA and TMSL commands can be executed from an Analysis Services Execute DDL Task. Below are some links to get you started. The Data Flow Task may help with your data extraction.
- SSIS
- SSDT
- Execute SQL Task
- Analysis Services Processing Task
- Analysis Services Execute DDL Task
- Configurations
- Data Flow Task
- Foreach Loop
- DTEXEC
I think you can use SQL Server Agent jobs. A description is here.
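For example, an Agent job can chain the manual steps and run them on a schedule. A minimal sketch, where the wrapper procedure, package path, and database name are hypothetical placeholders:

```sql
-- Minimal sketch: create a two-step Agent job plus a weekly schedule.
-- usp_PrepareDW, the package path, and the DW database are hypothetical.
USE msdb;
GO
DECLARE @jobId BINARY(16);
EXEC dbo.sp_add_job @job_name = N'Weekly DW Refresh', @job_id = @jobId OUTPUT;

-- Step 1: truncate tables and drop indexes (T-SQL subsystem).
-- @on_success_action = 3 means "go to the next step".
EXEC dbo.sp_add_jobstep @job_id = @jobId, @step_name = N'Prepare DW',
     @subsystem = N'TSQL', @database_name = N'DW',
     @command = N'EXEC dbo.usp_PrepareDW;',
     @on_success_action = 3;

-- Step 2: run the extract package with DTEXEC (CmdExec subsystem).
EXEC dbo.sp_add_jobstep @job_id = @jobId, @step_name = N'Extract',
     @subsystem = N'CmdExec',
     @command = N'dtexec /F "C:\Packages\ExtractToDW.dtsx"';

-- Run every Sunday at 02:00.
EXEC dbo.sp_add_jobschedule @job_id = @jobId, @name = N'Weekly',
     @freq_type = 8, @freq_interval = 1, @freq_recurrence_factor = 1,
     @active_start_time = 20000;

EXEC dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)';
```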
I am in the process of researching the best way to track all the queries which run within an SSIS package. The plan is to create an automated report, pulled from SQL Server, that shows the queries the SSIS package runs and where it loads the data. I was wondering: is it possible to customise a log message? My plan would be to insert a log entry after each task has run, recording the logic which ran. For example, if I had an Execute SQL Task, I could write a custom message saying which logic just ran. I have tried various other solutions but couldn't really see much that would help me create a directory of this info for all SSIS packages.
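One way to get a custom message per task: keep a small audit table and add a follow-on Execute SQL Task that writes a row into it after each task of interest. A minimal sketch with hypothetical table and column names; the ? placeholders are how an Execute SQL Task over an OLE DB connection maps SSIS variables (e.g. System::PackageName, or a user variable describing the logic that ran) to parameters:

```sql
-- Hypothetical audit table to collect per-task log messages.
CREATE TABLE dbo.SsisAuditLog (
    LogId       INT IDENTITY(1,1) PRIMARY KEY,
    PackageName NVARCHAR(200) NOT NULL,
    TaskName    NVARCHAR(200) NOT NULL,
    Message     NVARCHAR(MAX) NULL,
    LoggedAt    DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Statement for the follow-on Execute SQL Task. Map the ? markers to
-- System::PackageName and to user variables holding the task name and
-- a description of the SQL that just ran.
INSERT INTO dbo.SsisAuditLog (PackageName, TaskName, Message)
VALUES (?, ?, ?);
```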
I'm trying to do data comparison, and to automate database upgrade script generation, using SQL Server Database Tools > Schema Comparison / Data Comparison.
Using the VS Command Window, I am able to run
> Tools.NewDataComparison [args...]
> SQL.DataCompareExportToFile [file]
which produces a .sql file containing the required inserts/updates/deletes.
Now I would like to go a step further and automate this.
I have been looking at Microsoft.VisualStudio.Data.Tools.DataCompare, Microsoft.SqlServer.Dac and similar assemblies but haven't found any of the methods above.
Which DLL exposes these?
I have also tried to run these methods with devenv.exe (which only starts VS and executes the arguments in the command window) but was only successful at running Tools.NewDataComparison. Is there a way to chain the commands or reuse the context, so I can run SQL.DataCompareExportToFile afterwards?
I know I can use SqlPackage.exe to export dacpac and bacpac, but comparison is possible only for schema. Data comparison is not. Is there a way to use bacpac for comparison?
I would accept any solution which would allow me to compare the databases and generate the upgrade .sql script. Either in C# or any other script.
Unfortunately I cannot comment to ask for clarification. If this is a database project in Visual Studio, you can set up Pre- and Post-Deployment scripts to manage data updates.
We break ours down into:
Reference Data - static lookup data; we kill and fill so the reference data is always clean and controlled (sketched below).
Migration Data - copies data from another source; it is tied to the release being run and executes only once.
Sandbox Data - aligns data for repeatable front-end tests (think unit testing); runs only in non-Prod environments.
So once this is in place, we use SqlPackage to upgrade the DAC, which includes the Pre- and Post-Deployment scripts.
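For the reference data, the kill and fill usually boils down to a MERGE in the Post-Deployment script. A minimal sketch, assuming a hypothetical dbo.OrderStatus lookup table:

```sql
-- Hypothetical snippet from Script.PostDeployment.sql: force the
-- lookup table to match the source-controlled values exactly.
MERGE dbo.OrderStatus AS target
USING (VALUES (1, N'Open'), (2, N'Shipped'), (3, N'Closed'))
      AS source (StatusId, StatusName)
ON target.StatusId = source.StatusId
WHEN MATCHED AND target.StatusName <> source.StatusName THEN
    UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StatusId, StatusName) VALUES (source.StatusId, source.StatusName)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```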
If you are talking about actually comparing two databases, then I only know of client tools like VS Data Compare and Red Gate that can do that. But you could try using MERGE between tables, or maybe explore an SSIS solution.
I'm trying to design a program in which 100 people input information on 300 projects into a relational database (SQL Server), and then an SSIS package runs the ETL and returns calculation results from SSAS. Is it possible to pass a project-name variable individually to the SSIS package, and to run the package on demand (not on a SQL Server Agent schedule), so that I can get the SSAS results immediately for a given project? Is there any reference I can learn from?
Thanks.
What SQL Server version do you have? If 2012 or newer, you can use the SSIS Project Deployment Model and Parameters, and run packages using stored procedures, that is, on demand.
Here you have information about running packages using stored procedures.
Here you have information about SSIS parameters.
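To sketch what "run packages using stored procedures" looks like (the folder, project, package, and parameter names here are hypothetical placeholders):

```sql
-- Start a catalog-deployed package on demand, passing the project
-- name in as a package parameter.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name  = N'ETL',
     @project_name = N'ProjectCalc',
     @package_name = N'Calculate.dtsx',
     @execution_id = @execution_id OUTPUT;

-- @object_type = 30 targets a package parameter (20 = project parameter).
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 30,
     @parameter_name  = N'ProjectName',
     @parameter_value = N'Project A';

EXEC SSISDB.catalog.start_execution @execution_id;
```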
I am trying to find a better way to test our SSIS application, which is used to load data from a file into SQL Server and validate the file data.
I have created a SQL script which can be run to insert 'bad' data into the database tables, and ensure that our validations are performing correctly.
The SQL script (sketched below):
- loads 'bad' data
- executes the SSIS validations
- ensures the errors in the data were detected
- outputs a PASS or FAIL
- deletes the TEST data if it passed
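A rough sketch of the shape such a script can take; the table and procedure names are hypothetical, and the validation step is assumed here to be callable from T-SQL (in the real script it triggers the SSIS validations):

```sql
BEGIN TRY
    -- 1. Load known-bad rows into the staging table.
    INSERT INTO dbo.StagingCustomer (CustomerId, Email)
    VALUES (-1, NULL);  -- deliberately invalid

    -- 2. Run the validation logic (assumed wrapped in a procedure).
    EXEC dbo.usp_ValidateStagingCustomer;

    -- 3. Assert that the bad row was flagged; clean up on success.
    IF EXISTS (SELECT 1 FROM dbo.ValidationError WHERE CustomerId = -1)
    BEGIN
        PRINT 'PASS';
        DELETE FROM dbo.ValidationError WHERE CustomerId = -1;
        DELETE FROM dbo.StagingCustomer WHERE CustomerId = -1;
    END
    ELSE
        PRINT 'FAIL';
END TRY
BEGIN CATCH
    PRINT 'FAIL: ' + ERROR_MESSAGE();
END CATCH;
```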
Is there any way I can get this script to run automatically, for example after someone checks in some code? Should I add it as a stored procedure?
I had a look at the Default template Build definition but I couldn't see how to run this SQL script.
The wrong way to do this is using a build server. There are lots of parts to continuous integration, and a build server is really designed for compilation and for validation that does not require an instance of your environment. There are two good practices that you can adopt:
Create a test harness that allows you to load a single package from your SSIS scripts and test the inputs and outputs. Essentially unit tests. I did this for a customer a few years ago and it worked well.
Use a release management tool to push out your packages, the data, and everything else that you need. Release Management for Visual Studio can do this easily.
A rule of thumb that I always go by: if I do not need an instance of anything but in-memory objects, then all I need is a build server. However, if I need an instance of my app, then I want to add release management tools to my continuous integration strategy.
It doesn't appear that you can do it as part of a check-in, based on my 2 minutes of searching.
How to modify the default Check-in Action in TFS?
http://msdn.microsoft.com/en-us/library/ms243849(VS.80).aspx
Can I modify the default Check-in Action in TFS 2012? (pretty picture)
http://www.codeproject.com/Tips/562994/Perform-a-custom-action-for-Check-in-event-in-Micr
Instead, you'll need to set up a build server to handle the action. Out of the box, MSBuild doesn't have the ability to run a SQL script, so you'll need to snag the MSBuild Extension Pack and leverage SqlExecute.
http://msdn.microsoft.com/en-us/library/ms181712.aspx
http://msdn.microsoft.com/en-us/library/ms181710(v=vs.90).aspx
http://msdn.microsoft.com/en-us/library/cc668755(v=vs.90).aspx
http://www.msbuildextensionpack.com/help/4.0.5.0/html/0d864b98-649a-5454-76ea-bd3069fde8bd.htm
You can use SQL Server Agent to embed your SQL code in a job, and you can then run this job from your application.
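For instance, with a (hypothetical) job named 'Run Validation Tests' already created, the application only needs:

```sql
-- sp_start_job returns as soon as the job is started; the job itself
-- runs asynchronously, so check the job history for the outcome.
EXEC msdb.dbo.sp_start_job @job_name = N'Run Validation Tests';

-- Later, inspect the result:
EXEC msdb.dbo.sp_help_jobhistory @job_name = N'Run Validation Tests';
```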
I need to restore a backup from a production database and then automatically reapply SQL scripts (e.g. ALTER TABLE, INSERT, etc) to bring that db schema back to what was under development.
There will be lots of scripts, from a handful of different developers. They won't all be in the same directory.
My current plan is to list the scripts, with their full filesystem paths, in a table in a pseudo-system database. Then I'd create a stored procedure in this database which first runs RESTORE DATABASE and then runs a cursor over the list of scripts, building a SQLCMD command string for each script and executing each string with xp_cmdshell.
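For illustration, that plan would look roughly like this (a sketch only, with hypothetical table, column, and database names; xp_cmdshell must be enabled):

```sql
-- (RESTORE DATABASE would run before this loop.)
DECLARE @script NVARCHAR(260), @cmd VARCHAR(4000);

-- Iterate over the registered scripts in execution order.
DECLARE script_cursor CURSOR FOR
    SELECT ScriptPath FROM dbo.UpgradeScript ORDER BY RunOrder;

OPEN script_cursor;
FETCH NEXT FROM script_cursor INTO @script;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build a SQLCMD call for this file and shell out to run it.
    SET @cmd = 'sqlcmd -S . -E -d TargetDb -i "' + @script + '"';
    EXEC master..xp_cmdshell @cmd;
    FETCH NEXT FROM script_cursor INTO @script;
END

CLOSE script_cursor;
DEALLOCATE script_cursor;
```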
The sequence of cursor->sqlstring->xp_cmdshell->sqlcmd feels clumsy to me. Also, it requires turning on xp_cmdshell.
I can't be the only one who has done something like this. Is there a cleaner way to run a set of scripts that are scattered around the filesystem on the server? Especially, a way that doesn't require xp_cmdshell?
First off and A-number-one, collect all the database scripts in one central location. Some form of Source Control or Version Control is best, as you can then see who modified what when and (using diff tools if nothing else) why. Leaving the code used to create your databases hither and yon about your network could be a recipe for disaster.
Second off, you need to run scripts against your database. That means you need someone or something to run them, which means executing code. If you're performing this code execution from within SQL Server, you pretty much are going to end up using xp_cmdshell. The alternative? Use something else that can run scripts against databases.
My current solution to this kind of problem is to store the scripts in text (.sql) files, store the files in source control, and keep careful track of the order in which they are to be executed (for example, CREATE TABLEs get run before ALTER TABLEs that add subsequent columns). I then have a batch file--yeah, I've been around for a while, you could do this in most any language--to call SQLCMD (we're on SQL 2005, I used to use osql) and run these scripts against the necessary database(s).
If you don't want to try and "roll your own", there may be more formal tools out there to help manage this process.
Beyond the suggestions about centralization and source control made by Phillip Kelley, if you are familiar with .NET, you might consider writing a small WinForms or WebForms app that uses SMO (SQL Server Management Objects). With it, you can pass an entire script to the database just as if you had dropped it into Management Studio. That avoids the need for xp_cmdshell and SQLCMD. Another option would be to create a DTS/SSIS package that would read the files and use the Execute T-SQL task in a loop.