I'm working with an SSIS 2005 file that crashes Visual Studio 2005 on my workstation. This happens when I open the data flow diagram and Visual Studio attempts to validate the package. I can open it successfully on another computer though. The package itself is fairly simple and only has two control flow tasks and maybe ten tasks in the data flow.
I'm wondering if there is a tool that goes through the XML in the .dtsx file and repairs any issues, or whether that's even necessary. The .dtsx file is about 171 kB, which seems like a lot considering how simple the package is.
I've had the same problem with a seemingly corrupt package crashing VS. Analyzing the XML directly is pretty painful, so work by process of elimination: for example, moving an object out of a container and back in again seems to fix issues with orphaned references in the .dtsx.
The "big red button" option is to recreate the package, which is less arduous than it sounds: you can just copy paste components between two VS sessions and you'll maybe discover the container causing the problem this way.
There's no tool to automate this unfortunately: SSIS is just an unfinished and quirky product.
I recently had the unfortunate experience of having the size of a field changed in a table that was being used in SSIS packages. The developer who wrote the packages had since retired, and had grouped them all in one VS BI project. She developed them solely on her local PC and moved them to a shared drive when she left.
Anyone with any knowledge of Protection Levels in SSIS knows what happened next. She saved them with the default EncryptSensitiveWithUserKey option, so I couldn't modify the packages because I wasn't her and I wasn't on her machine. Her AD account has long since been deactivated, and her machine may be checking people out at a grocery store now for all I know.
I had to recreate the packages from scratch. Fortunately, the protection level prevents you from saving or building, not from looking at things in Design, so I was able to replicate what she did, but it was a long, tedious process that took the better part of a full day to complete.
My question is: We use SourceSafe to maintain our projects, so that's where the new SSIS project will be going. Given the following:
We each have our own PCs with working folders that sync to SourceSafe
We do not have the option of saving the project on the database server itself; we can only deploy the packages.
Software details: MS Visual Studio 2008, VSS 8.0, MS SQL Server 2008 R2
What would be the best security option to configure our project with? I immediately see that DontSaveSensitive would be the logical approach, but I don't know where the passwords would then be supplied from. I would think a config file, but I don't know how to set that up for an SSIS deployment.
Thanks!
Use VSS as the source control provider for Visual Studio and add the SSIS project just like any other source project. After installing VSS, go to Tools\Options\Source Control and set VSS as the provider.
I've been using this setup for several months. There's no difference, as far as I can tell, between any of my C# projects and my SSIS projects in relation to VSS.
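On the DontSaveSensitive side of the question: the usual pattern is to keep passwords out of the package entirely and push the full connection strings in at run time through package configurations, either XML .dtsConfig files or a SQL Server configuration table. As a rough sketch of the table-based flavor (this is roughly the layout the Package Configurations wizard generates; the filter, path, server, and credential values below are placeholders):

```sql
-- Configuration table roughly as generated by the SSIS Package Configurations wizard.
CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- One row per property to push into the package when it loads.
INSERT INTO [dbo].[SSIS Configurations]
    (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES
    ('ProdConnections',
     'Data Source=MyServer;Initial Catalog=MyDb;User ID=etl_user;Password=<secret>;Provider=SQLNCLI10.1;',
     '\Package.Connections[SourceDB].Properties[ConnectionString]',
     'String');
```

An XML .dtsConfig file works the same way, with one Configuration element per property path; either kind can be wired up from the SSIS > Package Configurations menu in BIDS.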
I have a solution in BIDS that contains many SSIS packages.
I had to restore the database that is used by the OLEDB connection managers in my SSIS packages. Even though the data structures stayed the same as when I developed those packages, all of my packages are now showing error/warning icons stating that the "external columns for the component are out of synchronization" or the "column cannot convert between Unicode and non-Unicode string data types".
When I go into each individual task in a package that shows an error/warning, all I have to do is click OK and the error/warning goes away.
I have about 50 packages in the solution, so refreshing it like this for each package is a pain and time-consuming.
Is there a fast and easy way to get rid of those errors/warnings for all the packages at once?
Thanks!
Sorry, but since you changed the back end, BIDS has no way to make sure it has the correct version of the metadata. If these packages were just running from a scheduled job, you wouldn't have to do this at all, IMO.
To answer your question, there is no automatic way of doing it.
I've been using Shared Data Sources in all of my SSIS projects because I thought it was a "best practice". However, now that everything is under source control (TFS) just about every time I open a package it updates the Data Source connection in the package. I either have to roll the change back or check it in with some nonsense description.
I saw this SSIS Best Practice blog entry and it got me thinking about whether Shared Data Sources are really the way to go.
Don’t use Data Sources: No, I don't mean data source components. I mean the .ds files that you can add to your SSIS projects in Visual Studio in the "Data Sources" node that is there in every SSIS project you create. Remember that Data Sources are not a feature of SSIS - they are a feature of Visual Studio, and this is a significant difference. Instead, use package configurations to store the connection string for the connection managers in your packages. This will be the best road forward for a smooth deployment story, whereas using Data Sources is a dead-end road. To nowhere.
What are your experiences with data sources, configuration and source control?
We use SVN, so it doesn't integrate in the same way that TFS does. When I was starting out with SSIS, I used shared data sources, but they got me into all sorts of trouble when I finally uploaded the package to run on a schedule. So now I use XML configuration files (package configurations) to provide the connection properties, and I've never had any trouble with them.
So I agree: shared data sources = bad idea/loss of hair.
When we were migrating from SSIS 2005 to 2008, data sources were quite painful. Configurations, on the other hand, are pretty flexible, especially if you store them in one database table: that way you can easily change connections with just one UPDATE statement!
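To make that concrete: with SQL Server configurations, all the configured properties live in one wizard-generated table (by default [SSIS Configurations]), so repointing every package at a new server really is a single statement. A sketch, with made-up filter and server names:

```sql
-- Repoint every connection string under the 'ProdConnections' filter at a new server.
UPDATE [dbo].[SSIS Configurations]
SET    ConfiguredValue = REPLACE(ConfiguredValue,
                                 'Data Source=OldServer;',
                                 'Data Source=NewServer;')
WHERE  ConfigurationFilter = 'ProdConnections';
```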
This is starting to frustrate me, mostly because it should be easy to do but I can't find anything.
I am trying to version control my database. The method I have decided on is to generate, at each check-in, a SQL script for every object in my database. Most of my edits are done via the visual designer in SSMS, and I don't trust myself to remember to generate change scripts every time. I am only concerned with the schema for now, although soon I would like to include data from certain lookup tables.
I have tried the following with not much success:
I attempted to generate scripts via SSMS Tasks->Generate Scripts. I don't like this for two reasons:
It does not seem to save the settings you give it, so every time I want to generate scripts I have to change the settings to what I want, change the output path, and so on. I really just want to press a button or two and have a quick pre-commit process.
It seems to generate scripts using Unicode (UCS-2) encoding, which means version control (Git) sees them as binary files and can't diff between versions (which is one of the big reasons I want to store the DB in version control).
I also attempted to use dbSourceTools. This tool fixed issue #1 from SSMS and made it really easy to update the schema definitions, but the .sql files it generated were also Unicode-encoded, so version control still saw them as binary files.
I am looking for a free method to do this as I am working on a hobby project right now.
If you are using Visual Studio Developer Edition or Visual Studio Team Suite, you can use Visual Studio Database Professional for free. This is a fantastic tool for managing your database schema.
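If you want something quick and scriptable in the meantime, the catalog views will at least hand you the definitions of the programmable objects, which you can write out to files in an encoding Git can diff. It doesn't cover tables (they have no single stored definition), so treat it as a partial workaround rather than a real scripting tool; a sketch:

```sql
-- Source of all programmable objects (procedures, views, functions, triggers).
-- Tables are not included; SSMS/SMO scripting is still needed for those.
SELECT
    s.name      AS schema_name,
    o.name      AS object_name,
    o.type_desc AS object_type,
    m.definition
FROM sys.sql_modules AS m
JOIN sys.objects     AS o ON o.object_id = m.object_id
JOIN sys.schemas     AS s ON s.schema_id = o.schema_id
ORDER BY s.name, o.name;
```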
In a legacy project that I'm on, we have several processes that are performed via DTS. DTS is not something I worked with a lot back in its heyday... I was in college.
More specifically, these processes are in ActiveX code blocks -- basically VBScript for the database -- and they are really hard to debug.
Anyway, I'm wondering if any past or present experienced DTS professionals can offer tips on how to deal with debugging, troubleshooting or otherwise dealing with DTS package development.
Since this question is marked as community wiki, I'm hoping to gather general and targeted ideas and methods for all types of DTS package implementations.
I had a complex DTS package that imported some data, ran some batch scripts, made a CSV file and uploaded the resulting output via FTP. Sometimes the FTP process would fail.
I created a "DTS LOG" table and after each step I simply added a SQL insert task and wrote a time stamp and function name into the table. I made a view to show me any process that did not complete.
This may not be as granular as you need, but at least you'll know where in the execution the problem is.
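A rough sketch of that kind of setup (the table, step, and view names here are illustrative placeholders, not the originals):

```sql
-- Simple step log: an Execute SQL task after each DTS step writes one row.
CREATE TABLE dbo.DTS_LOG
(
    LogID       INT IDENTITY(1, 1) PRIMARY KEY,
    PackageName VARCHAR(128) NOT NULL,
    StepName    VARCHAR(128) NOT NULL,
    LoggedAt    DATETIME     NOT NULL DEFAULT GETDATE()
);
GO

-- What each logging task executes (the step name varies per step).
INSERT INTO dbo.DTS_LOG (PackageName, StepName)
VALUES ('DailyExport', 'FTP upload complete');
GO

-- Packages whose latest activity is newer than their latest final step,
-- i.e. the most recent run apparently never reached the end.
CREATE VIEW dbo.vw_IncompleteRuns
AS
SELECT PackageName,
       MAX(LoggedAt) AS LastActivity
FROM   dbo.DTS_LOG
GROUP BY PackageName
HAVING MAX(LoggedAt) > COALESCE(
           MAX(CASE WHEN StepName = 'FTP upload complete' THEN LoggedAt END),
           '19000101');
GO
```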
In the scripting portion, I have used MsgBox to display "I got here" or "xfer worked" or whatever else indicates that something happened which is not otherwise obvious at run time.
You can also use conditional statements to branch off to an 'End' if you are testing a particular portion of the flow.
If you are stuck working with DTS but are also running a SQL Server 2005 instance, you might see if you can't upgrade the DTS packages to DTSX (SQL Server Integration Services) and redo them there. I know this isn't a 'trick', but you get to work in the VS 2005 IDE, you can write in .NET, and you can set breakpoints, which makes life in the 'DTS' world much easier.
There are also some articles here:
http://www.databasejournal.com/article.php/1503191
Scroll down and you will see the "SQL Server 2000 DTS" articles.