I want to find out when an SSIS 2008 package was deployed under MSDB in an instance of SQL Server. In the table dbo.sysssispackages I can see the package creation date, but where can I find the last modified/deployed date of a package?
The date an SSIS package was deployed to the MSDB is not tracked, so there is no way to know when a package was deployed, who deployed it, and so on.
With SQL Server 2012+ and the project deployment model, the SSISDB supports the ability to track when a project was deployed and by whom.
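If you are on the project deployment model, a query along these lines against the built-in SSISDB catalog views shows who deployed each project and when (this assumes a 2012+ instance with a catalog created):

    -- Who deployed each project, and when (project deployment model only)
    SELECT
        f.name AS folder_name,
        p.name AS project_name,
        p.deployed_by_name,
        p.last_deployed_time
    FROM SSISDB.catalog.projects AS p
    JOIN SSISDB.catalog.folders AS f
        ON f.folder_id = p.folder_id
    ORDER BY p.last_deployed_time DESC;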
The best answer I have for you is much the same as Tab has just posted except I tied mine to VerBuild, which is a monotonically increasing number that VS updates whenever you save a package.
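As a rough sketch of using that, you can pull the version values back out of msdb and at least compare the build number against what is in source control (column names per 2008's dbo.sysssispackages):

    -- Version info recorded inside each package stored in msdb
    SELECT
        name,
        createdate,
        vermajor,
        verminor,
        verbuild,      -- incremented by BIDS/Visual Studio each time the package is saved
        vercomments
    FROM msdb.dbo.sysssispackages
    ORDER BY name;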
If it's absolutely crucial that you have this information, you could look at modifying msdb.dbo.sp_ssis_putpackage. That's definitely unsupported territory, so buyer beware; but depending on your appetite for risk, you could either extend dbo.sysssispackages by adding your custom columns there, or create a new table dbo.sysssispackages_extended and record there who did what and when.
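A minimal sketch of that extended-table idea, assuming you're comfortable touching msdb (the table name and columns are my own invention, and you would need to check the actual parameter names inside sp_ssis_putpackage before wiring in the INSERT):

    -- Hypothetical audit table recording who deployed what, and when
    CREATE TABLE msdb.dbo.sysssispackages_extended
    (
        package_name sysname          NOT NULL,
        folder_id    uniqueidentifier NULL,
        deployed_by  sysname          NOT NULL DEFAULT (SUSER_SNAME()),
        deployed_at  datetime         NOT NULL DEFAULT (GETDATE())
    );

    -- Statement you could append inside msdb.dbo.sp_ssis_putpackage,
    -- using whatever parameters the proc actually exposes (e.g. @name, @folderid)
    INSERT INTO msdb.dbo.sysssispackages_extended (package_name, folder_id)
    VALUES (@name, @folderid);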
This information is not stored, and is not available for retrieval from SQL Server.
The best way to make this information available that I have found is to use the Version-related fields (VersionMajor, VersionMinor, VersionComments) in the SSIS package. Combined with use of source control, you can see which version of your package is currently live on your server, and find that version in source control to find which version of the code it is.
Related
As a company we have grown and we are now moving a couple of SQL Server 2016 databases over to a new server. We have SSIS packages that run off the databases that we are moving from server 1 to server 2.
Is there a way to easily identify using SSMS which SSIS packages use the current server and databases we are moving? Some of the old SSIS packages don't have documentation so we are trying to avoid physically opening up all the SSIS packages. We would prefer to identify the SSIS packages that are impacted.
Thank you!
Here are the solutions off the top of my head. I'm not an expert by any means, so don't be surprised if someone comes up with something better.
1. In SSMS, you can view the data sources a package uses via Object Explorer > SQL Server Agent > Jobs > (Your Job) > Steps > Edit... > Data Sources (Tab). This is slightly faster than opening all your SSIS packages, but it isn't a great solution either.

2. Alternatively, recognize that .dtsx files are simply plain-text (XML) files. You can scan all of them for keywords with any number of scripts (PowerShell, Python, an SSIS package with a Script Task, etc.). What you can use depends on the tech stack your organization supports, but I imagine Googling for such a program/script would not be difficult.

3. If you are utilizing SQL Server configurations in your packages, and you do so consistently for every package, you can query the [SSIS_Configurations].[dbo].[SSIS Configurations] table for the connection strings each package uses (see the example queries after this list).

NOTE: Solutions (1) and (2) do not take configurations into account.
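If the packages live in msdb rather than on the file system, you can get the same keyword-scan effect from T-SQL by casting the stored package XML to text, and the same LIKE trick works against the configurations table from option (3). This is only a sketch: replace 'OldServerName' with the server you are retiring, and the configuration table name assumes the wizard's default [SSIS Configurations]:

    -- Option (2) without opening each package: msdb-stored packages whose XML mentions the old server
    SELECT p.name
    FROM msdb.dbo.sysssispackages AS p
    WHERE CAST(CAST(p.packagedata AS varbinary(max)) AS varchar(max)) LIKE '%OldServerName%';

    -- Option (3): connection strings held in a SQL Server configuration table
    SELECT ConfigurationFilter, PackagePath, ConfiguredValue
    FROM [SSIS_Configurations].dbo.[SSIS Configurations]
    WHERE ConfiguredValue LIKE '%OldServerName%';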
Hopefully, some of these solutions are helpful to you. I would be interested in an efficient means to do this without delving into scripts as well.
I have a huge number of SSIS 2012 packages from different SSIS projects, and the SSIS package names in the different projects are often identical.
I use SSIS logging from all the packages to one single table in a log database. I would like to keep it that way, so that there is only one database to search for all the SSIS logging.
When I am using SSIS logging in the packages, is it possible to also log the project name, so I can identify which SSIS packages and projects are affected?
BR
Carsten
It sounds like you are using package deployment, in which case there is no ssis catalog and no project (per se), hence, no further information about your package. Logging that writes to sysssislog was designed before the concept of project deployment so that's why that piece of information is missing. As well, the use of MSDB also predated project deployment so it has no information either.
So there's no simple solution. I would guess that you could convert to project deployment and take advantage of all of the built-in logging and reporting there (which you already said you don't want). Or you could modify all the packages to log their package ID and project name into an additional table.
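A rough sketch of that last option, with made-up table and parameter names: add a string parameter or variable (say, ProjectName) to each package, create one extra table next to sysssislog in the central log database, and have an Execute SQL Task at the start of each package run a parameterised INSERT, mapping System::PackageID, System::PackageName and the ProjectName value to the ? placeholders:

    -- Hypothetical companion table in the central log database
    CREATE TABLE dbo.ssis_package_project
    (
        package_id   uniqueidentifier NOT NULL,
        package_name sysname          NOT NULL,
        project_name sysname          NOT NULL,
        logged_at    datetime         NOT NULL DEFAULT (GETDATE())
    );

    -- SQLStatement for the Execute SQL Task (OLE DB connection, ? parameter markers)
    INSERT INTO dbo.ssis_package_project (package_id, package_name, project_name)
    VALUES (?, ?, ?);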
I would need to migrate a SQL database from Sybase to MS SQL Server. Before doing the actual migration on the production server I first created an SSIS-package with SQL Server Management Studio's Import/Export Wizard for testing with other databases. The test migration was successful and I would now like to deploy my SSIS-package to the real servers.
However, it seems I cannot simply run the package in Management Studio choosing different data sources for it - it only runs against the same databases for which it was created. Now, it can be edited in something called SQL Server Business Intelligence Development Studio (or BIDS for short; I am using the SQL Server 2008 version), but going through every data flow task and changing the destination manually for each of the ~150 tables I am moving is inefficient and also introduces the possibility of error.
Is there a way to quickly change which data source is used for ALL the destinations in ALL the data flow tasks of an SSIS package? If not, what simple method is there for testing the migration with test databases first and simply changing the data sources when deploying?
I am using ODBC data sources, but for some the package shows OLE DB sources in BIDS instead.
I hope I was clear enough. If you have additional questions, please ask! Thank you!
I would use a variable for the ConnectionString property of the connection manager. A package level configuration can be very useful for accomplishing this task. Several ways to do this. I prefer to use a table in SQL Server that holds all the configurations for all packages. This can be especially effective if you have multiple packages and need to dynamically change a set of connection managers across those multiple packages.
The basic steps are:
1. Right-click on your SSIS design surface and select "Package Configurations..."
2. Create a package-level configuration of Configuration Type "SQL Server"
3. Store your connection in a Configuration table in SQL Server
4. Alter your Connection Manager to use a variable for the ConnectionString property
5. Populate that variable from the Configuration table via your package-level configuration
6. When it comes time to switch from Test to Production, simply update the connection string in your configuration table (an example follows below)
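For reference, the wizard-generated SQL Server configuration table looks roughly like the one below, and the Test-to-Production switch is then a single UPDATE. The table shape matches the default the wizard creates, but the filter name and connection string are purely illustrative:

    -- Shape of the table the Package Configurations wizard creates by default
    CREATE TABLE dbo.[SSIS Configurations]
    (
        ConfigurationFilter nvarchar(255) NOT NULL,
        ConfiguredValue     nvarchar(255) NULL,
        PackagePath         nvarchar(255) NOT NULL,
        ConfiguredValueType nvarchar(20)  NOT NULL
    );

    -- Point the destination connection at the production server
    UPDATE dbo.[SSIS Configurations]
    SET ConfiguredValue = 'Data Source=PRODSERVER;Initial Catalog=TargetDb;Provider=SQLNCLI10.1;Integrated Security=SSPI;'
    WHERE ConfigurationFilter = 'DestinationConnection';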
This is part of a larger package management framework that I implemented using this book:
Microsoft SQL Server 2008 Integration Services: Problem, Design, Solution
I highly recommend it. It should take less than a day to set up, and the book has step-by-step instructions.
This question and its associated answers are also helpful.
At the moment we manually push changes from our DEV SQL environment to TEST and production (using Schema Compare in Visual Studio, plus some scripts we create while making changes to DEV), but this is very time-consuming and error-prone.
We were wondering if there was a better way of doing this and how would we need to implement this.
I've read about maybe using versioning (how would this work?), or maybe using Red Gate's SQL Source Control (but can that be used to push changes to TEST, or is it only for keeping track of local changes?).
We want a reliable way to update our TEST & Production servers so that data won't be corrupted/lost... We use SQL Server 2008 R2 and Visual Studio 2012.
We are starting a new project, so it's time for a change! Thank you for your time!
One simple way to do this would be to have a simple version table in the db, with one row and one column that stores the version number.
Now, every time you push changes to dev, create an incremental SQL script. Have a master script which, based on the current version of the db, calls the necessary incremental SQL scripts to upgrade the schema to the latest version.
Be careful about dropping columns, changing column types, or reducing column sizes (e.g. varchar(100) to varchar(10)) in your incremental scripts, as that could result in data loss if not planned properly.
Your incremental scripts should be idempotent, so that they can be run over and over, to handle the case where the db crashes during an upgrade.
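A minimal sketch of that pattern (the table, version numbers and the sample ALTER are illustrative only):

    -- One-row table holding the current schema version
    CREATE TABLE dbo.SchemaVersion (Version int NOT NULL);
    INSERT INTO dbo.SchemaVersion (Version) VALUES (0);

    -- Master script: apply each incremental change only if the db is still behind,
    -- so re-running the whole script is harmless (idempotent upgrades)
    IF (SELECT Version FROM dbo.SchemaVersion) < 1
    BEGIN
        ALTER TABLE dbo.Customer ADD Email varchar(255) NULL;  -- incremental change #1
        UPDATE dbo.SchemaVersion SET Version = 1;
    END;

    IF (SELECT Version FROM dbo.SchemaVersion) < 2
    BEGIN
        -- incremental change #2 goes here
        UPDATE dbo.SchemaVersion SET Version = 2;
    END;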
Although there are many benefits in using SQL Source Control (and I'd love for you to give it a go, as I'm the product manager!), its purpose is limited to versioning and not managing and deploying to your various environments. The correct Red Gate tool for this would be Deployment Manager.
http://www.red-gate.com/delivery/deployment-manager/
There is a blog maintained by the Deployment Manager project team here, which should give you an idea of where the tool is headed:
http://thefutureofdeployment.com/
Does Schema Compare in VS have a CLI? If so, you can probably automate it to run several times during the day. If not, you can try some other 3rd-party tools that support a CLI, such as ApexSQL Diff for schema and ApexSQL Data Diff for synchronizing data.
In the 2008 R2 version I was using SSIS logging to a sysssislog table in a defined database. 2012 now brings the concept of Integration Services catalogs, which have their own SSISDB database. Is it still necessary to log to the sysssislog tables, or does the information that ends up there now live somewhere in SSISDB (which I would expect, since all the reporting for SSIS execution is based on this db as well)?
The logging you are familiar with remains available to you with the 2012 release of SQL Server. That said, database logging no longer has to be explicitly defined in your package if you are using the Project Deployment model (SSISDB).
Out of the box, you'll get the Basic logging level when you run a package. The other options are None, Performance, and Verbose. Read more about how to set these and other execution parameters from MVP Phil Brammer. Matt Masson of the actual SSIS team points out what events those levels correspond to in his post on What Events are Included in the SSIS Catalog Log Levels.
Finally, SSIS Reporting Pack is an open source project from MVP Jamie Thomson that provides different insight into the basic data being captured in the new integration services catalog.
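If you want to pull the same kind of detail straight out of the catalog yourself, the built-in views cover it; a sketch against a 2012+ SSISDB:

    -- Recent package executions and any warnings/errors they raised
    SELECT
        e.execution_id,
        e.folder_name,
        e.project_name,
        e.package_name,
        e.status,          -- 7 = succeeded, 4 = failed
        e.start_time,
        m.message_time,
        m.message
    FROM SSISDB.catalog.executions AS e
    LEFT JOIN SSISDB.catalog.event_messages AS m
        ON m.operation_id = e.execution_id
       AND m.event_name IN (N'OnWarning', N'OnError')
    ORDER BY e.start_time DESC;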
My thoughts: necessary, no. But if you already have a framework built out culling data from that log (we use it for an alerting system), it is still supported, so you can keep using it. If you run Integration Services packages from multiple servers, there is no built-in functionality to combine the logging from all those disparate SSISDB catalogs to give insight into your entire universe. You can get that if all the packages log to a centralized server using the classic technique.