SSIS Package External columns for component are out of synchronization - sql-server

I have a solution in BIDS that contains many SSIS packages.
I had to restore the database that is used by the OLE DB connection managers in my SSIS packages. Even though the data structures stayed the same as when I developed those packages, all of my packages now show error/warning icons stating "external columns for component are out of synchronization" or "column cannot convert between unicode and non-unicode string data types".
When I go into each individual task that shows an error/warning, all I have to do is click OK and it goes away.
I have about 50 packages in the solution, so refreshing it like this for each package is a pain and time-consuming.
Is there a fast and easy way to get rid of these errors/warnings for all the packages at once?
Thanks!

Sorry, but since you changed the back end, BIDS has no way to be sure it has the correct version of the metadata. If this package were running as a scheduled job, you wouldn't have to do this, IMO.
To answer your question, there is no automatic way of doing it.

Related

SSIS Logging in SQL Server 2012

I have a huge number of SSIS 2012 packages from different SSIS projects, and often the SSIS package names in the different projects are identical.
I use SSIS logging from all the packages to one single table in a log database. I would like to keep it that way, so I only have one database to search for all the SSIS logging.
When I am using SSIS logging in the packages, is it possible to capture the project name as well, so I can identify which SSIS package and project are affected?
BR
Carsten
It sounds like you are using package deployment, in which case there is no SSIS catalog and no project (per se), hence no further information about your package. Logging that writes to sysssislog was designed before the concept of project deployment, so that's why that piece of information is missing. The use of MSDB also predates project deployment, so it has no project information either.
So there's no simple solution. You could convert to project deployment and take advantage of all the built-in logging and reporting there (which you already said you don't want), or you could modify all the packages to log their package ID and project name into an additional table.
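If you go the extra-table route, here is a minimal sketch of what the table and the logging statement might look like. The table and column names are placeholders I've made up, not anything SSIS creates for you:

-- Hypothetical companion table to sysssislog that carries the project context
CREATE TABLE dbo.PackageProjectLog (
    LogId       INT IDENTITY(1,1) PRIMARY KEY,
    ProjectName NVARCHAR(128) NOT NULL,
    PackageName NVARCHAR(128) NOT NULL,
    PackageId   UNIQUEIDENTIFIER NOT NULL,
    ExecutionId UNIQUEIDENTIFIER NOT NULL,
    LoggedAt    DATETIME NOT NULL DEFAULT GETDATE()
);

-- Run from an Execute SQL Task at the start of each package (OLE DB connection),
-- mapping System::PackageName, System::PackageID and System::ExecutionInstanceGUID
-- to the parameters; the project name has to come from a variable you set yourself.
INSERT INTO dbo.PackageProjectLog (ProjectName, PackageName, PackageId, ExecutionId)
VALUES (?, ?, ?, ?);

You should then be able to join ExecutionId against the executionid column in sysssislog to tie the standard log rows back to a project.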

Shared Data Sources vs. OLE DB Connections In SSIS

I've been using Shared Data Sources in all of my SSIS projects because I thought it was a "best practice". However, now that everything is under source control (TFS) just about every time I open a package it updates the Data Source connection in the package. I either have to roll the change back or check it in with some nonsense description.
I saw this SSIS Best Practice blog entry and it got me thinking about whether Shared Data Sources are really the way to go.
Don't use Data Sources: No, I don't mean data source components. I mean the .ds files that you can add to your SSIS projects in Visual Studio in the "Data Sources" node that is there in every SSIS project you create. Remember that Data Sources are not a feature of SSIS - they are a feature of Visual Studio, and this is a significant difference. Instead, use package configurations to store the connection string for the connection managers in your packages. This will be the best road forward for a smooth deployment story, whereas using Data Sources is a dead-end road. To nowhere.
What are your experiences with data sources, configuration and source control?
We use SVN, so it doesn't integrate in the same way TFS does. When starting out with SSIS, I used Shared Data Sources, but they got me into all sorts of trouble when I finally uploaded the package to run on a schedule. So now I use XML configuration files (package configurations) to provide the connection properties, and I've never had any trouble with these.
So I agree: shared data sources = bad idea / loss of hair.
When we were migrating from SSIS 2005 to 2008, data sources were quite painful. Configurations, on the other hand, are pretty flexible, especially if you store them in a single database table - that way you can change connections with just one UPDATE statement!
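For what it's worth, when you point a SQL Server package configuration at a database, the configuration wizard generates a table roughly like the one below ([SSIS Configurations] by default), and repointing every package at a new server really is one statement. The server names here are made up:

-- Default-style configuration table created by the SQL Server configuration wizard
CREATE TABLE dbo.[SSIS Configurations] (
    ConfigurationFilter NVARCHAR(255) NOT NULL,
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- Swap every stored connection string from the old server to the new one in one go
UPDATE dbo.[SSIS Configurations]
SET ConfiguredValue = REPLACE(ConfiguredValue, 'OldServer', 'NewServer')
WHERE PackagePath LIKE '%ConnectionString%';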

SSIS DTSX File Repair Tool

I'm working with an SSIS 2005 file that crashes Visual Studio 2005 on my workstation. This happens when I open the data flow diagram and Visual Studio attempts to validate the package. I can open it successfully on another computer though. The package itself is fairly simple and only has two control flow tasks and maybe ten tasks in the data flow.
I'm wondering if there is a tool that goes through the XML in the dtsx file and repairs any issues or if this is even necessary. The dtsx file is about 171 kB and it seems like there's a lot in it considering what a simple package it is.
I've had the same problem with a seemingly corrupt package crashing VS. Trying to analyze the XML directly is pretty painful, so work by process of elimination: you can sometimes clear things up by moving an object in and out of a container, as this seems to fix orphan references in the dtsx.
The "big red button" option is to recreate the package, which is less arduous than it sounds: you can just copy paste components between two VS sessions and you'll maybe discover the container causing the problem this way.
There's no tool to automate this unfortunately: SSIS is just an unfinished and quirky product.

Are my issues with SSIS justified?

I've only been using SSIS briefly, but I find that my complaints are numerous. Here are my current issues:
In order for a package to store a password, you need to encrypt it. Even if the package is part of a larger solution, you need to supply a password every time you open any of the encrypted packages. Why can't you just encrypt the whole solution with one password? I have a solution with 10 encrypted packages. When I hit "Build", I have to enter 10 passwords.
Encrypting credentials is great. Deploying an encrypted package to the server, supplying your password, testing it successfully, scheduling it, and then having it fail during the schedule because it can't decrypt itself SUCKS. It seems to do this randomly, and I have had to redeploy a given package several times before it could actually decrypt its credentials successfully during a scheduled job.
Windows authentication only? Maybe this is a security feature, but it makes it a real pain in the butt to remotely manage the server. It basically forces me to use remote desktop. Does it really matter that I can't access SSIS when I have access directly to the DB Engine???
DTS Support. DTS was pretty ugly, but it worked, and was pretty straightforward. Why didn't they provide the DTS 2000 package designer WITH SSIS??? Now I need to go download it and install it with admin privileges.
UPSERTS??? I replicate some data to an external database, and upserting to that database is SUCH A PAIN. Why isn't this functionality built in? Why can't I just say "This is the key column. Update if exists, create if it doesn't".
Are these valid issues, or am I just too new to the product to know how to do things the right way?
Do others have the same issues, or other issues?
Are there easy alternatives to using SSIS?
The following links from #SQLServerSleuth might shed some light on the situation - a back and forth re: SSIS in 2005. Are you on SQL 2008, or still working with SQL 2005? This picture changed a bit in 2008.
SSIS' 15 Faults
SSIS: The backlash (1)
SSIS: The backlash (2)
SSIS: A response from Microsoft to some growing criticism
In my system it was overall easier to just develop data loads in C#. The loads are rock solid and do not change unless we want them to change, so we don't spend any more time after we are done with development.
Check out Package Configuration files for some of the security issues.
Do you actually need the encryption on each package? You can choose not to store sensitive data at all if you aren't storing FTP or other authentication passwords. Configuration files are also a good idea. I recommend BIxpress (www.pragmaticworks.com/products/Business-Intelligence/BIxpress/), as it will create all the config files for you, log the heck out of your packages, and provide impressive graphical reporting for next to nothing cost-wise...
Let me preface this by saying that SSIS sucks. It's a pain to work with, manage, and develop in. While there are tools that make things better, those features should have been included from the start. Let me also say that I haven't found (and don't believe there currently exists) a better tool for scalable, high-performance data loads than SSIS.
1, 2: Set the package to "Don't Save Sensitive" and use either configurations or "Set Values" inside whatever execution context you are using.
3: Agreed, partially. Browsing the package store through SQL auth would be nice, but executing the package should absolutely not be allowed (under what context would you execute?)
You can always execute through the job.
4: Not related to SSIS. Besides, DTS is deprecated and, in most ways, considerably less flexible and harder to manage than even SSIS.
5: Upserts are admittedly trickier than they could be, but done right they work flawlessly: use a Lookup to determine whether you need to insert or update, and define your logic accordingly.
Side note: seriously consider setting up a package template. Done right, it can alleviate many of these concerns from the start. I may need to publicly post my package template at some point.
We ran into many of the same issues, especially #5, so I agree these are valid. In general, I found SSIS to be a massive pain to work with.
For 1, 2, I use package configurations.
For 5, you can use the Slowly Changing Dimension task or the third-party Table Difference component. I personally prefer to load into a staging table and code the UPSERT in SQL.
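If you go the staging-table route, the SQL side can be as simple as a MERGE on SQL Server 2008+ (or an UPDATE followed by an INSERT on 2005). The table and column names below are purely illustrative:

-- Upsert from a staging table the data flow has just loaded (names are placeholders)
MERGE dbo.TargetTable AS tgt
USING dbo.StagingTable AS src
    ON tgt.BusinessKey = src.BusinessKey
WHEN MATCHED THEN
    UPDATE SET tgt.SomeColumn  = src.SomeColumn,
               tgt.LastUpdated = GETDATE()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, SomeColumn, LastUpdated)
    VALUES (src.BusinessKey, src.SomeColumn, GETDATE());

Run it from an Execute SQL Task after the data flow and truncate the staging table afterwards.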
I've been using SSIS fairly intensively on a DW project for the last two years, and I find it has a few quirks, but it is far more powerful than DTS.

DTS - Debugging Tips

In a legacy project that I'm on, we have several processes that are performed via DTS. DTS is not something I worked with much back in its heyday... I was in college.
More specifically, these processes are in ActiveX code blocks -- basically VBScript against the database -- which is really hard to debug.
Anyway, I'm wondering if any past or present experienced DTS professionals can offer tips on how to deal with debugging, troubleshooting or otherwise dealing with DTS package development.
Since this question is marked as community wiki, I'm hoping to collect general and targeted ideas and methods for all types of DTS package implementations.
I had a complex DTS package that imported some data, ran some batch scripts, made a CSV file and uploaded the resulting output via FTP. Sometimes the FTP process would fail.
I created a "DTS LOG" table and after each step I simply added a SQL insert task and wrote a time stamp and function name into the table. I made a view to show me any process that did not complete.
While this may not be as granular as you need, but at least you'll know where the problem is in the execution.
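Something along these lines is all it takes; the table, view, and step names are just what I happened to use, not anything DTS gives you:

-- Simple step log written to after each DTS task
CREATE TABLE dbo.DtsLog (
    LogId    INT IDENTITY(1,1) PRIMARY KEY,
    StepName VARCHAR(100) NOT NULL,
    LoggedAt DATETIME NOT NULL DEFAULT GETDATE()
);
GO

-- Each Execute SQL task dropped in after a step just runs something like:
-- INSERT INTO dbo.DtsLog (StepName) VALUES ('FTP upload complete');

-- View of today's entries, so a missing step shows where the package stopped
CREATE VIEW dbo.vw_DtsLogToday
AS
SELECT StepName, LoggedAt
FROM dbo.DtsLog
WHERE LoggedAt >= DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0);
GO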
In the scripting portions, I have used MsgBox to display "I got here" or "xfer worked" or whatever else indicates that something happened which isn't obvious at run time.
You can also use conditional statements to branch off to an 'End' if you are testing a particular portion of the flow.
If you are stuck working with DTS but are also running a SQL Server 2005 instance, you might see if you can't upgrade the DTS packages to DTSX (SQL Server Integration Services) and redo them there. I know this isn't a 'trick', but you get to work in the VS2005 IDE, can write in .NET, and can set breakpoints, which will make life in the 'DTS' world much easier.
There are also some articles here:
http://www.databasejournal.com/article.php/1503191
Scroll down and you will see the "SQL Server 2000 DTS" articles.
