SSIS package (SQL Server 2016) fails when executed from a batch file

We developed an SSIS package whose target version was SQL Server 2014.
The package has 2 steps:
1. Truncate the required SQL Server table.
2. Pull data from a SharePoint list via a web service call and load it into the truncated SQL Server table.
Later on, the IT department upgraded the server to SQL Server 2016, so the dev team had to change the target version in SSIS to SQL Server 2016. This works fine from the SSDT tool (all steps execute perfectly).
I found the generated ".dtsx" file in the "~/obj/Development" path of the source code directory. We then developed a batch file that executes the generated .dtsx file. If we execute the batch file as administrator it doesn't work; it runs for some time and then shows "Operation TimeOut" at the end. Why is this happening, any clue?

Related

SSIS package fails on Excel Source when run as a job from SQL Server Agent but succeeds when run locally from Visual Studio 2019

I have an SSIS package that reads an Excel file located in a NAS folder.
The excel file has multiple sheets, but I'm interested in only one named "GDP".
The SSIS package correctly runs and loads data to a table in the SQL Database.
I deployed the package and added it as a step in a SQL Server Agent job.
The job fails giving the following error:
Opening a rowset for "GDP$" failed. Check that the object exists in the database.
Any suggestion about fixing this issue?

Multiple SSIS Package Execution by Agent: one step fails with "must install standard edition of Integration Services"

We have seven SSIS packages that are stored as File System packages on server A, which is running Integration Services 15.0.2000. All of the packages use configuration files and all of them run successfully when run individually on Server A (by connecting to Server A integration services using SSMS).
Each package does essentially the same task: run a stored procedure, export the data to a text file on a network share.
We are trying to execute the Server A packages using a SQL Server Agent Job on Server B (SQL Server 15.0.2000.5). The job has seven steps: one for each package. Each step has the identical setup. The only difference is the package name and the configuration file.
When the Agent Job is run, six of the seven steps run successfully. One fails, returning this error:
Code: 0xC000F427 Source: Data Flow Task SSIS.Pipeline Description: To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition (64-bit) of Integration Services or higher.
This is counterintuitive since the other six packages already ran successfully.
The packages all share the same package type and package format version.
I've created a new version of one of the failing packages from scratch and it still fails.
Not sure where to go from here.
While trying to document the problem thoroughly, I came across a fix.
The package that was failing is the only one of seven that had a Derived Column object in the data flow. I moved the derived column into the stored procedure, removed the Derived Column object, mapped the column to the flat file destination, redeployed to Server A, and voila. It worked.
I don't know why the Derived Column object was an issue. Again, the package ran fine from the SSIS server itself. The problem presented itself only when attempting to execute the package from a different server using a SQL Server Agent.

The database 'xxx' cannot be opened because it is version 904

I can't attach my database. When I try to attach a database in SQL Server Management Studio, I get this error:
The database 'C:\FILES\ACCOUNTING.MDF' cannot be opened because it is version 904. This server supports version 852 and earlier. A downgrade path is not supported. Could not open new database 'C:\FILES\ACCOUNTING.MDF'. CREATE DATABASE is aborted. (.Net SqlClient Data Provider)
and I have tried these commands:
cd "C:\Program Files\Microsoft SQL Server\130\LocalDB\Binn"
SqlLocalDB.exe delete "MSSQLLocalDB"
SqlLocalDB.exe create "MSSQLLocalDB"
but I still get the error.
You CANNOT do this - you cannot attach/detach or backup/restore a database from a newer version of SQL Server (v904 = SQL Server 2019) down to an older version (v852 which is SQL Server 2016) - the internal file structures are just too different to support backwards compatibility.
You can either get around this problem by
using the same version of SQL Server on all your machines - then you can easily backup/restore databases between instances
otherwise you can create the database scripts for both structure (tables, views, stored procedures, etc.) and for contents (the actual data contained in the tables), either in SQL Server Management Studio (Tasks > Generate Scripts) or using a third-party tool
or you can use a third-party tool like Red-Gate's SQL Compare and SQL Data Compare to do "diffing" between your source and target, generate update scripts from those differences, and then execute those scripts on the target platform; this works across different SQL Server versions.
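If you are not sure which version of SQL Server a given instance is actually running before you attach, a quick command-line check is shown below; the instance name is just an example (here the LocalDB instance from the question):

rem Show the engine version of the instance you are attaching to.
sqlcmd -S "(localdb)\MSSQLLocalDB" -Q "SELECT @@VERSION;"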
The error message in the problem statement occurs because SQL Server database files (*.mdf, *.ndf and *.ldf) and backups are not backward compatible. This lack of backward compatibility is why we cannot restore or attach a database created on a higher version of SQL Server to a lower version of SQL Server. However, there are a few options that can help us downgrade a database from a higher version of SQL Server to a lower version of SQL Server. These options include:
Use the Generate Scripts Wizard in SQL Server Management Studio
Use SQL Server Integration Services
Create Custom Scripting and BCP
In this tip we will use the Generate Scripts Wizard in SQL Server Management Studio.
Here are the basic steps we need to follow:
Script the database schema and data from the higher version of SQL Server by using the Generate Scripts Wizard in SSMS.
Connect to the lower version of SQL Server, and run the SQL scripts that were generated in the previous step, to create the database schema and data.
In the next section, I will demonstrate the steps for downgrading a SQL Server 2012 database to a SQL Server 2008 R2 database.
Steps to Downgrade a SQL Server Database Using SSMS Generate Scripts Wizard
Step 1: Script the schema of the OUTLANDER database on the SQL Server 2012 instance (IITCUK\DEV01) using the Generate Scripts wizard in SSMS.
In Object Explorer connect to IITCUK\DEV01, right-click on the OUTLANDER database, expand Tasks and choose "Generate Scripts...".
This launches the Generate and Publish Scripts wizard. Click Next to skip the Introduction screen and proceed to the Choose Objects page.
On the Choose Objects page, choose the option "Script entire database and all database objects", and then click Next to proceed to the "Set Scripting Options" page.
On the Set Scripting Options page, click the Advanced button. In the Advanced Scripting Options dialog box:
set Script for Server Version to SQL Server 2008 R2 (or whatever version you want)
under the Table/View Options, set Script Triggers, Script Indexes and Script Primary Keys to True
and set Types of data to script to Schema and Data - this last option is key, because this is what generates the data for each table.
Once done, click OK to close the Advanced Scripting Options dialog box and return to the Set Scripting Options page. On the Set Scripting Options page, click Next to continue to the Summary page.
After reviewing your selections on the Summary page, click Next to generate the scripts.
Once the scripts are generated successfully, click the Finish button to close the Generate and Publish Scripts wizard.
Step 2: Connect to the SQL Server 2008 R2 instance (IITCUK\SQLSERVER2008), and then run the SQL scripts that were generated in Step 1, to create the OUTLANDER database schema and data.
In Object Explorer connect to IITCUK\SQLServer2008, then in SQL Server Management Studio open the SQL script you saved in Step 1.
Modify the script to specify the correct location for the OUTLANDER database data and log files. Once done, run the script to create the OUTLANDER database on the IITCUK\SQLServer2008 instance.
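If the generated script is very large and awkward to open in SSMS, it can also be executed from the command line; a sketch with sqlcmd is shown below (the script and log file names are placeholders):

rem Run the generated script against the lower-version instance using Windows authentication.
rem OUTLANDER_script.sql is a placeholder name for the file saved in Step 1.
sqlcmd -S "IITCUK\SQLSERVER2008" -E -i "C:\Scripts\OUTLANDER_script.sql" -o "C:\Scripts\OUTLANDER_script.log"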
Upon successful execution, refresh the Databases folder in Object Explorer; the OUTLANDER database has been successfully downgraded.
Notes
There are a few things to be aware of when using this approach.
This solution creates one large SQL file that has the scripts to create the database objects and also INSERT statements for the data in the tables.
For large databases, the SQL file can get very large if you script out both the schema and the data, and it could be hard to load into an editor. Also, you may get a memory-related error message from the editor if the file is too big.
For large databases, around 1GB or more, if this approach does not work, then you should look at using SSIS to migrate the database or create custom scripts to script out the objects and BCP out the data for each of the tables. You can use this Generate Scripts wizard to just generate the schema without the data and use SSIS or BCP to export and import the data.
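As a rough illustration of the BCP route mentioned above (table, instance and file names are placeholders, and the schema-only script is assumed to have already been run on the target):

rem Export the data of one table from the higher-version instance in character format.
bcp OUTLANDER.dbo.SomeTable out "C:\Export\SomeTable.dat" -S "IITCUK\DEV01" -T -c

rem Import the same data into the empty table on the lower-version instance.
bcp OUTLANDER.dbo.SomeTable in "C:\Export\SomeTable.dat" -S "IITCUK\SQLSERVER2008" -T -c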
This approach works from SQL Server 2017 down to SQL Server 2005. Some of the scripting options might be a bit different in newer versions, but the process is still the same.
Before just executing the script, you should review the script to make sure everything looks correct such as the path of the database files, database options, etc.
Also if you are using new functionality that does not exist in the lower version, SQL Server won't be able to create the objects and you will need to review the scripts that were generated and update the code accordingly.
For a very simple database this approach should work pretty easily, but you might need to spend some time making modifications to the script for a more complex database.
The Advanced Scripting Options dialog box lists all of the scripting options; if you click on an item, the bottom part of the screen gives you a short definition of the option.
Next Steps
To avoid this issue, always make sure that you perform a full backup of the database before you upgrade the SQL Server and database to a higher version of SQL Server. In addition, be sure to thoroughly test the application prior to releasing the application to the users.
Consider this downgrade approach as your last option to roll back from an upgrade, because the time and storage needed can be very large.
With a very large database be sure you have sufficient storage to support the data needs.
Be sure to verify row and object counts as well as test your application before releasing to production.
Additional Resources:
Why Can't I Restore a Database to an Older Version of SQL Server?
SQL Server Database Engine Backward Compatibility
SQL Server Upgrade Tips

SSIS package: permissions

I'm fairly certain at this point that my trouble is rooted in permissions.
My SSIS package simply pulls data from an Access DB and populates a SQL table. When I run this package in VS (by clicking Start), it works and my SQL table populates. As soon as I try to execute the package within SSMS or in a SQL Agent job, it falls apart with errors.
I am running SQL Server 2016 and SQL Server Data Tools in VS 2013, in case that is relevant information.
Great appreciation for anyone who can point me in the right direction.

SSIS Package Store vs. MSDB

When setting up a SQL Agent job, there are options for choosing, among others, "SQL Server" or "SSIS Package Store". If I choose "SSIS Package Store" and browse to MSDB instead of File System, I can choose SSIS packages stored in sysssispackages.
However, the documentation for the /DTS switch, which SSMS automatically chooses when selecting SSIS Package Store, appears to be only for packages stored on the file system.
I've also noticed that if the user executing the job doesn't have file system rights, the job will fail when choosing SSIS Package Store --> MSDB --> Package, with an "Access is denied" error or something like "Object doesn't exist". Since the package doesn't exist on the file system, though, why does execution work when choosing SQL Server and fail when using SSIS Package Store? Is dtexec actually looking for a .dtsx file on the file system even when MSDB is chosen? I have a (likely false) understanding that SQL Server and SSIS Package Store --> MSDB are the same thing when referencing the server on which the agent is running.
I've done some research and I can't find a clear distinction of the differences, if any, between using SQL Server and SSIS Package Store when selecting from the MSDB folder.
Can anyone shed some light on the differences?
There are 2 locations to store SSIS packages: The file system or SQL Server. The rest is semantics.
File System
You can use the SSIS Package Store, which is nothing but a well-known location under the installation folder:
%Program Files%\Microsoft SQL Server\{Version}\DTS\Packages
Or you can pick anywhere on the file system you like. If you go this route, then you'll need to ensure the SQL Agent account, the credentialed proxies, or, if you are running packages from xp_cmdshell, the SQL Server service account has access to that location.
The only advantage, if you want to call it that, of using the Package Store (i.e. the folder I mentioned) is that you can use the Integration Services management tool in SSMS (by connecting to Integration Services instead of the database engine).
However, that has a lot of pitfalls, such as not being able to handle multiple instances, packages only running in 64-bit mode, no access to proxy accounts, etc. You shouldn't run packages from SSMS anyway.
SQL Server
If memory serves correctly:
2005 - stored in msdb.dbo.sysdtspackages90
2008 - stored in msdb.dbo.sysssispackages (I seem to recall 2008 RTM using a different table, or reused the 90 table but that got patched out)
2008 R2 - stored in msdb.dbo.sysssispackages
2012 (package deployment model) - stored in msdb.dbo.sysssispackages
2012 (project deployment model) - stored in SSISDB.catalog.packages*
2014 (package deployment model) - stored in msdb.dbo.sysssispackages
2014 (project deployment model) - stored in SSISDB.catalog.packages*
*With the project deployment model, packages are "compiled" (zipped with a manifest) into a .ispac file, which is stored in the bowels of the SSISDB.internals.* tables.
Wrapup
Ultimately, where you store your packages does not affect your ability to run them. You can run packages using DTEXEC, SQL Agent or custom .NET code. The choice of storing packages is primarily dependent upon your management style.
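To make the storage distinction concrete, here are assumed dtexec calls for each source (server, folder and package names are placeholders): /SQL reads a package from msdb through the database engine, /DTS reads it through the SSIS service (which exposes both the MSDB and File System folders of the package store), and /File reads a .dtsx straight from disk.

rem "SQL Server" source: package stored in msdb, read through the database engine.
dtexec /SQL "\MyFolder\MyPackage" /Server "ServerA" /Rep E

rem "SSIS Package Store" source: read through the SSIS service, which exposes
rem both the MSDB folder and the File System folder.
dtexec /DTS "\MSDB\MyPackage" /Server "ServerA" /Rep E
dtexec /DTS "\File System\MyPackage" /Server "ServerA" /Rep E

rem A .dtsx file run directly from disk or a network share.
dtexec /File "\\ServerA\Packages\MyPackage.dtsx" /Rep E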
References
Package Management
