I am in the process of researching the best way to track all the queries that run within an SSIS package. The plan is to create an automated report, pulled from SQL Server, that shows the queries each SSIS package runs and where it loads the data. I was wondering: is it possible to customize a log message? My plan would be to insert a log entry after each task has run, recording the logic that ran. For example, if I had an Execute SQL Task, I could write a custom message describing the logic that just executed. I have been trying various other solutions but couldn't really find much that would help me create a directory of this information for all our SSIS packages.
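One way to do the custom message (a minimal sketch; the log table and its columns are hypothetical) is a small log table plus an Execute SQL Task placed after each step:

```sql
-- Hypothetical custom log table; create once in a reporting database.
CREATE TABLE dbo.PackageQueryLog (
    LogId       INT IDENTITY(1,1) PRIMARY KEY,
    PackageName NVARCHAR(260) NOT NULL,
    TaskName    NVARCHAR(260) NOT NULL,
    LogicRun    NVARCHAR(MAX) NOT NULL,   -- description of the query that ran
    TargetTable NVARCHAR(260) NULL,       -- where the data was loaded
    LoggedAt    DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Statement for the follow-on Execute SQL Task; with an OLE DB connection,
-- map SSIS variables (e.g. System::PackageName, System::TaskName) to the ? markers.
INSERT INTO dbo.PackageQueryLog (PackageName, TaskName, LogicRun, TargetTable)
VALUES (?, ?, ?, ?);
```

If the packages are deployed to the SSIS catalog, the built-in SSISDB views such as catalog.executions and catalog.executable_statistics already record much of this per execution and may get you part of the way to that directory.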
I've currently got an ETL process that dynamically builds and executes SQL jobs based on job steps that are saved in my database. Included in these jobs are steps that call SSIS packages to move data from one server to another and/or call stored procs on target servers to do further processing. I'm looking at what it would take to migrate our process from SQL Server to an Azure SQL Managed Instance. One of the specific things I'm looking at is the feasibility of replacing the steps that call the SSIS packages with steps that execute Azure Data Factory pipelines or other ADF actions that accomplish the same results. So far I have not run across any examples of this. Does anyone have experience with accessing Data Factory functionality from SQL Agent jobs?
You can run PowerShell scripts via SQL Agent, as described in the Microsoft docs below:
https://learn.microsoft.com/en-us/sql/powershell/run-windows-powershell-steps-in-sql-server-agent?view=sql-server-ver16
And via PowerShell and the ADF REST APIs, you can trigger the ADF pipelines.
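As a minimal sketch of wiring the two together (the job, resource group, factory, and pipeline names are placeholders, and it assumes the Az.DataFactory module is installed and already authenticated on the Agent host), a PowerShell job step could trigger a pipeline like this:

```sql
-- Add a PowerShell step to an existing Agent job (all names are placeholders).
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'RunAdfPipeline',
    @step_name = N'Trigger ADF pipeline',
    @subsystem = N'PowerShell',
    @command   = N'Invoke-AzDataFactoryV2Pipeline `
                     -ResourceGroupName "rg-etl" `
                     -DataFactoryName "adf-etl" `
                     -PipelineName "CopySalesData"';
```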
Using Microsoft products, I have a collection of SQL scripts and a VS project that I use to refresh a DW on a weekly basis. I know the process can be automated, but the documentation for VS is so vast I don't know where to start. The HIGH LEVEL process is outlined below:
Open SSMS and MANUALLY run scripts to truncate tables and drop indexes
Open the VS project and, in dev mode, MANUALLY press START to extract the data from the application into the truncated tables in the DW
Open SSMS and MANUALLY run transformation scripts to create analysis cubes end users can access
I am trying to get to a point where I can just schedule this process to run every X period so I don't have to press any buttons.
From what you described, SSIS can cover the tasks you listed; SSDT will need to be installed to use it. Since you're using scripts, I'm assuming you have SQL script files saved that you execute. These can definitely be run in SSIS using an Execute SQL Task with a file connection as the SQL source. With a collection of scripts, I'd suggest using a Foreach Loop to iterate through the folder(s) that contain the scripts and run each one via an Execute SQL Task.

As for running the Visual Studio project in development mode, Visual Studio configurations can be used to accomplish this for SSIS, and the Data Flow Task may help with your data extraction. SSIS also has both an Analysis Services Processing Task and an Analysis Services Execute DDL Task, and it sounds like you're looking for the latter; both XMLA and TMSL commands can be executed from an SSAS Execute DDL Task.

Below are some links to get you started, followed by a sketch of how SQL Server Agent could schedule the finished package.
SSIS
SSDT
Execute SQL Task
Analysis Services Processing Task
Analysis Services Execute DDL Task
Configurations
Data Flow Task
Foreach Loop
DTEXEC
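For the "run every X period" part, here is a minimal sketch of a SQL Server Agent job that runs the deployed package on a weekly schedule; the job name, package path, server, and schedule values are all placeholders for your environment:

```sql
-- All names and paths below are placeholders.
EXEC msdb.dbo.sp_add_job @job_name = N'WeeklyDWRefresh';

-- Step that runs the deployed SSIS package (dtexec-style arguments;
-- adjust the quoting and the SSISDB path to your setup).
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'WeeklyDWRefresh',
    @step_name = N'Run refresh package',
    @subsystem = N'SSIS',
    @command   = N'/ISSERVER "\SSISDB\DW\RefreshProject\Refresh.dtsx" /SERVER "."';

-- Weekly schedule: Sundays at 02:00.
EXEC msdb.dbo.sp_add_jobschedule
    @job_name               = N'WeeklyDWRefresh',
    @name                   = N'Weekly refresh',
    @freq_type              = 8,       -- weekly
    @freq_interval          = 1,       -- Sunday
    @freq_recurrence_factor = 1,       -- every week
    @active_start_time      = 020000;  -- HHMMSS

EXEC msdb.dbo.sp_add_jobserver @job_name = N'WeeklyDWRefresh';
```

The truncate scripts and the SSAS processing would simply become additional steps of the same job.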
I think you can use SQL Server Agent jobs. The description is here
We are currently using TeamCity and I am wondering if it is possible to have it handle our database process. Here is what I am trying to accomplish.
User runs a build
TeamCity remotes into the database server (or tells a program to via the command line)
SQL script is run that updates record(s)
Copies the MDF/LDF back to TeamCity for manipulation in the build
Alternatively it could work like this if this is easier
User logs in to database server and runs batch file which does the following:
SQL script is run that updates record(s)
MDF/LDF is copied and then uploaded to the repository
Build process is called through web hook with parameter
I can't seem to find anything that even gets me started. Any help getting pointed in the right direction would be helpful.
From your description above I am going to guess you are trying to make a copy of a shared (development) database, which you then want to modify and run tests against on the CI server.
There is nothing to stop you doing what you describe with TeamCity (as it can run any arbitrary code as a build step) but it is obviously a bit clunky and it provides no specific support for what you are trying to do.
Some alternative approaches:
Consider connecting directly to your shared database, but place all your operations within a transaction so you can discard all changes. If your database offers the capability, consider database snapshots (a minimal sketch follows this list).
Deploy a completely new database on the CI server when you need one. Automate the schema deployment and populate it with test data. Use a lightweight database such as SQL Server LocalDB or SQLite.
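For the snapshot route, a minimal T-SQL sketch (database, logical file, and path names are placeholders; NAME must match the logical data file name of the source database):

```sql
-- Create a snapshot of the shared dev database.
CREATE DATABASE DevDb_Snapshot
ON (NAME = DevDb_Data,                        -- logical name of DevDb's data file
    FILENAME = 'C:\Snapshots\DevDb_Data.ss')
AS SNAPSHOT OF DevDb;

-- After the test run, discard every change by reverting to the snapshot.
RESTORE DATABASE DevDb
FROM DATABASE_SNAPSHOT = 'DevDb_Snapshot';
```

Note that reverting requires dropping any other snapshots of the database first, and before SQL Server 2016 SP1 snapshots were an Enterprise Edition feature.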
I am trying to build a tool to facilitate some redundant importing of data into a SQL Server database. The flat text files we get are mostly static, but there is often about a 5-10% variance in field names, and sometimes extra fields are added (in which case we add columns to the table in the database before importing).
I'd like to build a front-end interface for an SSIS package so that field mapping is the only real work left for the user, as I don't think we can fully automate it. Is there anything out there that would allow this? Should I consider something other than SSIS? Appreciate any input, thanks!
SSIS packages are generally headless because they typically will run as a scheduled job somewhere on a database server. That said, there are definitely ways to do this.
One option that I have used is SQL Management Objects (SMO) to connect to the SQL Server Agent where the job is hosted. A client can interactively run such a job and even update the user on execution status. The same client could ask the user for input prior to kicking off the job, and you could store such input in a place where the package can access it.
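Under the covers, the Agent objects that SMO exposes map to the msdb stored procedures, so a thin sketch of what the client ends up driving looks like this (the job name is a placeholder):

```sql
-- Start the Agent job that hosts the import package (name is a placeholder).
EXEC msdb.dbo.sp_start_job @job_name = N'ImportFlatFiles';

-- Poll execution status so the client can keep the user informed;
-- stop_execution_date stays NULL while the job is still running.
SELECT j.name,
       a.start_execution_date,
       a.stop_execution_date
FROM msdb.dbo.sysjobactivity AS a
JOIN msdb.dbo.sysjobs        AS j ON j.job_id = a.job_id
WHERE j.name = N'ImportFlatFiles'
ORDER BY a.start_execution_date DESC;
```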
I am trying to find a better way to test our SSIS application, which is used to load data from a file into SQL Server and validate the file data.
I have created a SQL script which can be run to insert 'bad' data into the database tables, and ensure that our validations are performing correctly.
The SQL script:
- loads 'bad' data
- executes the SSIS validations
- ensures the errors in the data were detected
- outputs a PASS or FAIL
- deletes the TEST data if it passed
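A rough T-SQL skeleton of such a script, with hypothetical table and procedure names (it assumes the validations can be invoked as a stored procedure; if they only exist inside the package, you would execute the package between the load and the assertion):

```sql
BEGIN TRY
    -- 1. Load deliberately bad data (table and columns are hypothetical).
    INSERT INTO dbo.StagingOrders (OrderId, Amount)
    VALUES (NULL, -1);  -- violates the validation rules on purpose

    -- 2. Run the same validation logic the SSIS package applies.
    EXEC dbo.usp_ValidateStagingOrders;

    -- 3. Assert that the bad rows were flagged; clean up only on a pass.
    IF EXISTS (SELECT 1 FROM dbo.ValidationErrors WHERE OrderId IS NULL)
    BEGIN
        PRINT 'PASS';
        DELETE FROM dbo.StagingOrders WHERE OrderId IS NULL;  -- remove TEST data
    END
    ELSE
        PRINT 'FAIL: bad rows were not detected';
END TRY
BEGIN CATCH
    PRINT 'FAIL: ' + ERROR_MESSAGE();
END CATCH;
```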
Is there any way I can get this script to run automatically, for example after someone checks in some code? Should I add it as a stored proc?
I had a look at the Default template Build definition but I couldn't see how to run this SQL script.
The wrong way to do this is with a build server. There are lots of parts to continuous integration, and a build server is really designed for compilation and validation that does not require an instance of your environment. There are two good practices that you can adopt:
Create a test harness that allows you to load a single package from your SSIS project and test the inputs and outputs, essentially unit tests. I did this for a customer a few years ago and it worked well.
Use a release management tool to push out your packages, the data, and everything else that you need. Release Management for Visual Studio can do this easily.
A rule of thumb that I always go by: if I do not need an instance of anything but in-memory objects, then all I need is a build server. However, if I need an instance of my app, then I want to add release management tools to my continuous integration strategy.
It doesn't appear that you can do it as part of a check-in, based on my 2 minutes of searching.
How to modify the default Check-in Action in TFS?
http://msdn.microsoft.com/en-us/library/ms243849(VS.80).aspx
Can I modify the default Check-in Action in TFS 2012? (pretty picture)
http://www.codeproject.com/Tips/562994/Perform-a-custom-action-for-Check-in-event-in-Micr
Instead, you'll need to set up a build server to handle the action. Out of the box, MSBuild doesn't have the ability to run a SQL script, so you'll need to snag the MSBuild Extension Pack and leverage its SqlExecute task:
http://msdn.microsoft.com/en-us/library/ms181712.aspx
http://msdn.microsoft.com/en-us/library/ms181710(v=vs.90).aspx
http://msdn.microsoft.com/en-us/library/cc668755(v=vs.90).aspx
http://www.msbuildextensionpack.com/help/4.0.5.0/html/0d864b98-649a-5454-76ea-bd3069fde8bd.htm
You can use a SQL Server Agent job to embed your SQL code, for example, and you can then run that job from your application.
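A minimal sketch of that idea (job, procedure, and database names are placeholders): create the job once with a T-SQL step wrapping your SQL, then have the application start it on demand:

```sql
-- One-time setup (names are placeholders).
EXEC msdb.dbo.sp_add_job @job_name = N'AppDrivenJob';

EXEC msdb.dbo.sp_add_jobstep
    @job_name      = N'AppDrivenJob',
    @step_name     = N'Run embedded SQL',
    @subsystem     = N'TSQL',
    @database_name = N'YourDatabase',
    @command       = N'EXEC dbo.usp_DoTheWork;';  -- your SQL code goes here

EXEC msdb.dbo.sp_add_jobserver @job_name = N'AppDrivenJob';

-- From the application, whenever needed:
EXEC msdb.dbo.sp_start_job @job_name = N'AppDrivenJob';
```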