I have been involved in discussions in this forum about whether SSIS overwrites packages when you modify XML config files. The two posts linked at the bottom of this post disagree with each other: one says packages do get overwritten, the other says they don't. I believe I have the answer, but I'm looking for others' thoughts on this: if you modify XML package configurations at design time and run the package through Visual Studio, then when the package is saved, the package code is actually modified by SSIS to reflect the configuration values you changed. However, if you run packages through DTEXEC or SQL Server Agent, the package code is not modified.
If I have a package in Visual Studio and I modify the server name in the connection string (change Server A to Server B) in the XML configurations, and run the package:
The behavior I expect is fine: it writes the data to the table on the second server.
The behavior I DON'T expect is this: when I pull up the connection manager in the package in Visual Studio, Server A has been changed to Server B.
SSIS has modified the code in my package.
This behavior can be very confusing and has cost me a lot of time on the project I am working on. IMHO, SSIS should only override values at run time, not overwrite packages at design time!!
Do you have any thoughts on this?
Post 1 states it does overwrite
ssis xml configuration modifies package xml - am i crazy?
Post 2 states it does not overwrite
ssis xml configurations - configs overwrite packages - does ssis change config without you telling it to?
If you execute a package from Visual Studio 2005 / 2008 (for SQL 2005 / 2008 / 2008 R2), then yes, it does overwrite the package using the configured values.
Visual Studio 2010 (for SQL 2012) using the Project Deployment Model appears to avoid this issue.
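For context on the run-time half of the question (DTEXEC / SQL Server Agent applying values without touching the package), this is roughly what an Agent job step that applies a config file at execution time looks like; the job name and both paths below are placeholders, and the step assumes the job already exists:

    -- Hedged sketch: add an SSIS step to an existing Agent job.
    -- dtexec applies the config file at run time; the saved .dtsx is not modified.
    EXEC msdb.dbo.sp_add_jobstep
        @job_name  = N'Nightly Load',                      -- placeholder: an existing job
        @step_name = N'Run package with XML configuration',
        @subsystem = N'SSIS',
        @command   = N'/FILE "C:\Packages\LoadData.dtsx" /CONFIGFILE "C:\Configs\LoadData.dtsConfig"';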
I need to understand what a certain job in SQL Server 2012 does. It's a job someone created and then left the company before I started working here, and nobody on my team knows what this job does either.
The job isn't based on a SQL command but on an SSIS package (which I'm not familiar with); the package points to a Maintenance Plan with the same name as the job. Following what I read on the internet, I connected via the Integration Services connection type, went to Stored Packages -> MSDB -> Maintenance Plans, right-clicked the Maintenance Plan, and exported it to a .dtsx file.
I opened it in Visual Studio 2010 Shell, but I can't edit anything because of this error:
The task with the name "" and the creation name "" is not registered for use on this computer.
I also don't have a Solution Explorer for that package, and the icons of the tasks seem a bit faded compared to the task icons of a new project if I create one.
Maybe the dated version of VS is the problem? Or perhaps there is another way to see what this job does?
I have never worked with SSIS before, so maybe I'm missing something very basic, but I've been searching the Internet for days already and can't find any solution.
Please help,
Thanks in advance
By way of background, until Visual Studio 2019, VS, by itself, couldn't open an SSIS package.
Prior to VS 2010, you needed to use a different product, called Business Intelligence Development Studio (BIDS), which was based on VS but built for SQL Server functionality. With the release of SQL Server 2012, Microsoft created SQL Server Data Tools (SSDT) as a plugin for VS that required a separate installation. Until SSDT(VS) 2017, you needed to have stand-alone versions of BIDS or SSDT for each version of SSIS you were working with in your environment. The 2017 version, though, is backward compatible to SQL Server 2012, and forward compatible (!) to SQL Server 2019. Visual Studio 2019 has SQL Server Data Tools sort of built in, but you need to add extensions through the extension manager for SQL Server Integration Services (SSIS), Analysis Services (SSAS), and/or Reporting Services (SSRS).
That's kind of a lot to take in, and is sort of irrelevant to you, but it forms the context for Larnu's comments. Rather than messing around with antique versions of Visual Studio, you should just install either VS2017 and also SSDT, or VS2019 with the SSIS extension. Both are in current widespread use, so support is easy to find, and either will allow you to work with the 2012 package you're trying to open up.
Once you have the software installed, you should be able to create a "dummy" integration services solution that you can use to open random packages. I keep one around called MiscSolution. Right click on the solution name, select Add -> Existing Item. Then find your .dtsx package and import it. (Or right click SSIS Packages -> Add Existing Package. I always do it at the Solution level because that interface allows you to select multiple packages.)
Note that when you do this, you're making a new copy of the package in your local solution directory. You are NOT working on the copy of the package that sits in the folder where you picked it up. This matters if you're going to be making any changes, since it will need to be redeployed. It also matters if you accidentally hose up the package, because you've done no harm as long as you just delete that copy.
Once the package imports, which might take a minute or so depending on how complex it is, you should be able to open it up and see what's going on.
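As a quick complement, if you just want to see what the Agent job actually invokes before opening the package, a query against the standard msdb job tables shows each step's command; the job name in the WHERE clause is a placeholder:

    -- List the steps of a specific Agent job; for SSIS steps the command column
    -- holds the dtexec-style arguments (package location, config file, etc.).
    SELECT j.name AS job_name,
           s.step_id,
           s.step_name,
           s.subsystem,
           s.command
    FROM msdb.dbo.sysjobs AS j
    JOIN msdb.dbo.sysjobsteps AS s
        ON s.job_id = j.job_id
    WHERE j.name = N'YourMaintenancePlanJob'   -- placeholder job name
    ORDER BY s.step_id;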
I'd like to know the difference between SSDT in Visual Studio and Integration Services in SQL Server. When developing an SSIS package locally using SSDT in VS, I could pass data to my local SQL Server without Integration Services being installed on the SQL Server instance. I'm wondering if I need Integration Services to allow communication between servers, say when the SQL Server DB is on one server and the .dtsx package is on another.
This is where Microsoft's marketing department has run amok. It is important to understand that Microsoft is not a company that writes code. Microsoft is a marketing company that happens to write code.
The simple answer is that SSDT is the package of project templates for what is intended to be Business Intelligence.
SSDT contains three template types:
SSIS: Integration Services
SSRS: Reporting Services
SSAS: Analysis Services
UPDATE
I failed to answer the question of running the packages. Basically, you can run any SSDT package against SQL Server within Visual Studio. However, if you want to deploy the SSDT packages to the SQL Server, then you must have those services installed. The services can be installed via the SQL Server instance installation wizard. You will need to be mindful of another hairy concept that is known as SQL Server Version Targeting: Click Here
For example, if you want to deploy and run SSIS packages on SQL Server, then you will need to install Integration Services (this will include DTExec.exe and ISDeploymentWizard.exe). You will also need to install SSISDB on the SQL Server in order to be able to deploy SSIS packages to it; this is performed via SQL Server Management Studio (SSMS). The actual packages are both deployed to, and managed from, a folder called the Integration Services Catalog. The packages can then be scheduled to run automatically via SQL Server Agent. It is extremely unlikely that you will ever work directly with the SSISDB, other than perhaps querying it for information: see here.
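To make the deploy-and-run side concrete, here is a rough sketch of starting a package that has already been deployed to the catalog, using the standard SSISDB procedures; the folder, project, and package names are placeholders:

    -- Create and start an execution for a package deployed to the
    -- Integration Services Catalog (SSISDB).
    DECLARE @execution_id BIGINT;

    EXEC SSISDB.catalog.create_execution
        @folder_name     = N'MyFolder',        -- placeholder catalog folder
        @project_name    = N'MyProject',       -- placeholder project
        @package_name    = N'MyPackage.dtsx',  -- placeholder package
        @use32bitruntime = 0,
        @reference_id    = NULL,
        @execution_id    = @execution_id OUTPUT;

    EXEC SSISDB.catalog.start_execution @execution_id;

    -- The catalog views can then be queried for status and messages.
    SELECT execution_id, package_name, status, start_time
    FROM SSISDB.catalog.executions
    WHERE execution_id = @execution_id;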
See Microsoft's instructions: click here
SSRS packages are managed through a separate Web-UI and I have not dealt with SSAS packages. Isn't this fun!?
A note on DTExec.exe
I have run into purists who disdain SSIS, largely because they do not understand it. The general argument I get is that it is slower than PowerShell or Stored Procedures. This may cause someone's head to explode.
Basically, PowerShell runs through the .NET Framework, which is in C# and has to run through a few layers in the OS in order to execute. While SSIS components are written in C#, the DTExec.exe application is written in C++, which can access system resources directly (C# cannot do this because it is managed code!). So, SSIS is going to blow PowerShell and Stored Procedures away in large tasks.
Stored Procedures are a different animal, but still slower because they lack the pipeline buffer (i.e., the data flow tab). Another major limitation is how SQL Server executes Stored Procedures: sequentially. So, let us imagine that a comparable SSIS job is broken into multiple stored procedures and that those procedures are called by a main stored procedure - a super stored procedure, so to speak. SQL Server will execute the stored procedures sequentially, one at a time - this is a huge performance bottleneck. SSIS's pipeline buffer obliterates that by processing a default of 10,000 rows (this is configurable, btw) in each task and then passing them off to the next task. So, we can think of data flow tasks as their own stored procedures.
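To illustrate the sequential pattern being criticised, a 'super' stored procedure of the kind described would look something like the sketch below; the procedure names are made up for the example:

    -- Each EXEC completes before the next one starts, so the three loads
    -- run strictly one after another; there is no pipelining between them.
    CREATE PROCEDURE dbo.usp_LoadEverything   -- hypothetical wrapper procedure
    AS
    BEGIN
        EXEC dbo.usp_LoadCustomers;  -- step 1
        EXEC dbo.usp_LoadOrders;     -- step 2 waits for step 1 to finish
        EXEC dbo.usp_LoadInvoices;   -- step 3 waits for step 2 to finish
    END;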
Additional Context
There is long-running confusion about what constitutes SSIS as it relates to Visual Studio, not necessarily SQL Server.
Pre 2005: It was Data Transformation Services (DTS)
2005 & 2008: In 2005, DTS was substantially overhauled and renamed to SSIS late in the development. That is why everything in SSIS still references DTS (i.e., *.dtsx files). It remains the case to this day. Odd, only a masochist likes DTS. BUT! The package of templates was renamed to Business Intelligence Development Studio (BIDS)
2012 & 2013: It was renamed to SSDT-BI. Apparently, there was already another product named SSDT
2015 and forward: It is now named SSDT
See Microsoft's attempt to explain SSDT: click here
Up through Visual Studio 2017 (VS 2017), SSDT and its various incarnations have been largely treated like the idiot stepchild of Visual Studio. I say this because VS was installed as a standalone product for these project types only. I don't know why it was done this way - my best guess is because SSDT is free. Anyway, if you wanted to use Visual Studio for other application development, then you had to install a separate instance of Visual Studio. So, we developers quite literally have two standalone installations on our dev box and have to use the specific install for whatever we are doing (i.e., SSDT or non-SSDT development).
Now, with VS 2019, Microsoft is doing away with this model and has finally integrated the SSDT package into the product. Though, the initial rollout of VS 2019 for SSDT was a comedy of errors right out of the box. See my explanation by clicking here. Basically, SSIS does not install with the package and has to be added separately, though you still have one instance of VS 2019. Additionally, the SQL11 data provider has been deprecated, and it apparently does not come with the installation package either and needs to be installed separately. So, any existing packages that use it will need to be upgraded and re-deployed (see Known Issue #1).
I am holding off on upgrading to VS 2019 for now. VS 2017 has been a pain to say the least. I personally still use VS 2013 Update 5. All VS instances are targeting SQL Server 2014.
Besides JWeezy's good and detailed answer, I'd like to add a brief explanation:
SQL Server Data Tools for Visual Studio are the development environment for the SQL Server business intelligence suite (SQL Server Integration Services, Reporting Services, Analysis Services).
SQL Server Integration Services (installed from the SQL Server installation) installs all the files needed to run SSIS packages on the local machine.
Both products can run .dtsx packages, but the first one is only for development and testing purposes while the second one is for production servers.
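One quick way to check whether the production side has been set up on a given instance is to look for the catalog database, which by default is named SSISDB:

    -- Returns a row only if the SSIS catalog database has been created on this
    -- instance (done from SSMS after installing Integration Services).
    SELECT name, create_date
    FROM sys.databases
    WHERE name = N'SSISDB';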
References
Previous releases of SQL Server Data Tools (SSDT and SSDT-BI)
SQL Server Integration Services
We recently updated our production SQL Server 2016 Enterprise instance from SP1 to SP2. We are currently on version 13.0.5026. Prior to the upgrade, a user with connect rights to SSISDB and proper rights on the Integration Services Catalog folder could deploy an ISPAC file successfully.
After the upgrade, the same users can still deploy to the SSISDB, but when you execute the .DTSX, the script task inside fails validation. If I deploy the exact same ISPAC as a sysadmin, there's no issue. The usual solution I've seen is to confirm that SSDT Configuration Properties are set to SQL Server 2016. We have verified this is set correctly prior to building the ISPAC.
I saw a similar issue when migrating from SQL Server 2014 to 2016 a couple years back, but the solution at that time was to give the Proxy account that runs the package modify rights to the C:\Windows\Temp folder so it can generate temp files. This new issue is hard to pin down, and I don't want to give out sysadmin just so others can do the simple deployment steps.
Any thoughts or suggestions are appreciated.
Update/edit:
The server has a SQL Server 2016 deployment tool located under SQL Server/130/DTS/Binn - ISDeploymentWizard.exe. This deployment tool works. There is another identical wizard under the 140/DTS/Binn location, same name but 1 KB larger (assuming this is because SSMS is a separate install now, and I installed latest and greatest on server). This one fails deployment. I'm banging my head against the wall as to why one works but the other doesn't. Locally we all use SSMS 2017, and with that we get the 140/DTS/Binn ISDeployment file, not the 130 (since that's SQL Server 2016 and we're using SSMS 2017, which I thought was backwards compatible). Either way, this problem just started occurring and we've been on the same version of SSMS for a few months.
Image of the Execution information report from SSMS
I resolved a similar issue with C# scripts recently. In short: don't use the 140 version of ISDeploymentWizard.exe with SQL Server 2016. It apparently mangles something in the C# code, or the components' properties, and the 2016 runtime stops recognising them.
In my case, a package with a C# script source started to throw the following error during the validation phase:
Error: Microsoft.SqlServer.Dts.Pipeline.ComponentVersionMismatchException: The version of C# source component name is not compatible with this version of the DataFlow. [[The version or pipeline version or both for the specified component is higher than the current version. This package was probably created on a new version of DTS or the component than is installed on the current PC.]]
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostCheckAndPerformUpgrade(IDTSManagedComponentWrapper100 wrapper, Int32 lPipelineVersion)
The first comment here helped me to ultimately identify the cause.
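If you need the full error text from the server rather than from the SSMS report, the standard SSISDB catalog views expose it; a query along these lines (the row limit is arbitrary) pulls recent error messages:

    -- Pull error messages for recent catalog executions; message_type 120 = Error.
    SELECT TOP (50)
           e.execution_id,
           e.package_name,
           m.message_time,
           m.message
    FROM SSISDB.catalog.executions AS e
    JOIN SSISDB.catalog.event_messages AS m
        ON m.operation_id = e.execution_id
    WHERE m.message_type = 120
    ORDER BY m.message_time DESC;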
I have a data transfer package with a .DTSX extension.
I want to edit that package.
Please help me.
DTS was the old extension name used for Integration Services packages.
DTSX is the new one (I guess "new" refers to SQL Server 2005).
With recent releases, SQL Server introduced SQL Server Data Tools, which is very similar to Visual Studio but concentrates on SQL Server tasks.
SQL Server Business Intelligence developers can use SQL Server Data Tools for creating and editing Analysis Services, Integration Services and Reporting Services solutions.
Here is a download link
Follow these steps:
Open BIDS or SSDT (depending on which version of the SQL Server development tools you have installed).
Create a new SQL Server Integration Services project.
Copy the .dtsx package by pressing Ctrl + C, then go back to the newly created project. You will see the "SSIS Packages" node in Solution Explorer; paste your copied package there by selecting that node.
Double-click this copied package; you can now edit your package.
I am trying to create, test and deploy a SQL CLR (database project) using Visual Studio 2012 (update 2) and it is getting very frustrating.
The coding itself was trivial but now I cannot deploy or execute unit tests on the project. I have developed SQL CLR in VS 2010 and it was much simpler (Test.sql).
So here are my questions:
How does one set up a local SQL script to test the newly minted functions/sprocs? [I have tried adding a .sql file and marking it to be run on debug but I get error SQL71006.]
How does one deploy the project? I am deploying to SQL 2008 R2 and I know I need to use .NET 2.0 but I still have not been able to deploy anything. When I execute the generated .SQL on the target SQL Server instance, nothing happens (no errors, nothing).
This worked so well with previous versions of Visual Studio that one wonders if MSFT is perhaps now trying to discourage the generation of SQL CLR modules.
For the first point, ensure that in the .sql file's properties the Build Action field is set to anything other than Build (e.g., None).
For the second point, try using the Publish option, where you will get a dialog box to set up the database connection and other config options.
The Deploy option does nothing when you work with database projects.
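If the generated .SQL appears to do nothing, it can help to know roughly what a CLR deployment boils down to on the target instance. A minimal hand-written sketch, assuming a SAFE assembly and using made-up names and paths, looks like this:

    -- Enable CLR on the instance (one-time setting).
    EXEC sp_configure 'clr enabled', 1;
    RECONFIGURE;
    GO

    -- Register the compiled assembly (path and names are placeholders).
    CREATE ASSEMBLY MySqlClrAssembly
    FROM 'C:\Deploy\MySqlClrAssembly.dll'
    WITH PERMISSION_SET = SAFE;
    GO

    -- Expose one of its static methods as a stored procedure.
    CREATE PROCEDURE dbo.usp_MyClrProc
    AS EXTERNAL NAME MySqlClrAssembly.[MyNamespace.MyClass].MyMethod;
    GO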