SSIS package for loading data into a SQL Server database

I have a SQL Server database with (1) tables, (2) corresponding views, and (3) stored procedures defined.
To my knowledge there is exactly one SSIS package, which takes care of loading the data into this database.
At the end there is a web interface for the end user, which simply runs SELECT statements on top of one view in the database and displays the data.
The problem I am facing is that some values (specific logs) are outdated.
That is, the load job (the SSIS package) executes successfully, but the newest data does not show up.
I am assuming the problem is that either the log files are not placed where they should be, or the source for the logs has changed.
Therefore, I opened the productive version of the SSIS package. But then I got these error messages:
The script is corrupted.
There were errors during task validation.
There should be no errors, because the package runs successfully every day.
I tried to find something like a 'Rebuild the Project' option, but could not find anything.
How can I test the package in order to find the cause of the outdated rows?
For example, where in the package can I see where a particular table is being filled?

You should open the SSIS package with the proper version of BIDS or Visual Studio; SSIS packages differ between, for example, SQL Server 2008 and SQL Server 2012. Which SQL Server version was the package built with, and with which tool are you opening it?
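If you only have the .dtsx file, one way to tell the format is to read its PackageFormatVersion property (roughly: 3 = SSIS 2008, 6 = SSIS 2012, 8 = SSIS 2014/2016). A minimal T-SQL sketch, assuming the file is readable from the server at a placeholder path:

    -- Read the raw .dtsx (it is just XML) and pull out PackageFormatVersion.
    -- C:\SSIS\LoadJob.dtsx is a placeholder path.
    DECLARE @pkg XML;

    SELECT @pkg = CAST(BulkColumn AS XML)
    FROM OPENROWSET(BULK N'C:\SSIS\LoadJob.dtsx', SINGLE_BLOB) AS src;

    WITH XMLNAMESPACES ('www.microsoft.com/SqlServer/Dts' AS DTS)
    SELECT @pkg.value('(//DTS:Property[@DTS:Name="PackageFormatVersion"])[1]',
                      'INT') AS PackageFormatVersion;

You can just as easily open the file in a text editor and search for PackageFormatVersion near the top.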

Related

Is there a way to save all queries present in an SSIS package/dtsx file?

I need to run some analysis on my queries (specifically, finding all the tables that an SSIS package calls).
Right now I'm opening up every single SSIS package and every single step in it, and manually copying and pasting the tables out of it.
As you can imagine, it's very time-consuming and mind-numbing.
Is there a way to export all the queries automatically?
By the way, I'm using SQL Server 2012.
Retrieving the queries is not a simple process; you can work in two ways to achieve it:
Analyzing the .dtsx package XML content using regular expressions
SSIS packages (.dtsx) are XML files, so you can read them as text files and use regular expressions to retrieve the statements (as an example, you may search for all statements that start with the SELECT, UPDATE, DELETE, DROP, ... keywords). A T-SQL sketch of this idea follows the links below.
There are some questions asking how to retrieve information from .dtsx files that you can refer to for ideas:
Reverse engineering SSIS package using C#
Automate Version number Retrieval from .Dtsx files
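As a rough illustration of the XML approach, here is a hedged T-SQL sketch that loads a .dtsx file as XML and scans every attribute value and leaf-element text for things that look like SQL statements. The file path is a placeholder, and the exact property names that hold SQL (SqlStatementSource, SqlCommand, ...) vary by task type and package format version, which is why it scans everything:

    DECLARE @pkg XML;

    -- Load the package file as XML; the path is a placeholder.
    SELECT @pkg = CAST(BulkColumn AS XML)
    FROM OPENROWSET(BULK N'C:\SSIS\MyPackage.dtsx', SINGLE_BLOB) AS src;

    -- Scan attribute values and leaf-element text for SQL-looking strings.
    -- (LIKE matching is case-insensitive under typical collations.)
    SELECT q.CandidateStatement
    FROM (
        SELECT a.value('.', 'NVARCHAR(MAX)') AS CandidateStatement
        FROM @pkg.nodes('//@*') AS attrs(a)
        UNION ALL
        SELECT e.value('.', 'NVARCHAR(MAX)')
        FROM @pkg.nodes('//*[not(*)]') AS leaves(e)
    ) AS q
    WHERE LTRIM(q.CandidateStatement) LIKE 'SELECT %'
       OR LTRIM(q.CandidateStatement) LIKE 'INSERT %'
       OR LTRIM(q.CandidateStatement) LIKE 'UPDATE %'
       OR LTRIM(q.CandidateStatement) LIKE 'DELETE %';

Run this per file (or wrap it in a loop over a directory listing) and you get a raw list of statements you can then parse for table names.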
Using SQL Profiler
You can create and run a SQL Server Profiler trace on the SQL Server instance and filter on all T-SQL commands executed while the SSIS package runs. Some examples can be found in the following posts:
How to capture queries, tables and fields using the SQL Server Profiler
How to monitor just t-sql commands in SQL Profiler?
SSIS OLE DB Source Editor Data Access Mode: “SQL command” vs “Table or view”
Is there a way in SQL profiler to filter by INSERT statements?
Filter Events in a Trace (SQL Server Profiler)
You can also use Extended Events (which has more options than Profiler) to monitor the server and collect SQL commands (a sample session definition follows these links):
Getting Started with Extended Events in SQL Server 2012
Capturing queries run by user on SQL Server using extended events
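For example, a minimal Extended Events session along those lines might look like the sketch below. It is a hedged example: the session name and target path are placeholders, and the filter assumes the package's connections report an application name containing 'SSIS' (a common OLE DB connection manager default), which you should verify for your environment.

    CREATE EVENT SESSION CaptureSsisSql ON SERVER
    ADD EVENT sqlserver.sql_batch_completed
        (ACTION (sqlserver.client_app_name, sqlserver.database_name)
         WHERE (sqlserver.client_app_name LIKE N'%SSIS%')),
    ADD EVENT sqlserver.rpc_completed
        (ACTION (sqlserver.client_app_name, sqlserver.database_name)
         WHERE (sqlserver.client_app_name LIKE N'%SSIS%'))
    ADD TARGET package0.event_file (SET filename = N'C:\Temp\CaptureSsisSql.xel');
    GO

    ALTER EVENT SESSION CaptureSsisSql ON SERVER STATE = START;

    -- After the package has run, read the captured commands back with:
    -- SELECT CAST(event_data AS XML)
    -- FROM sys.fn_xe_file_target_read_file(N'C:\Temp\CaptureSsisSql*.xel', NULL, NULL, NULL);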
You could create a schema for this specific project and then keep all the SQL stored within views on that schema. That will help keep things tidy and help with issues like this.
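A hedged sketch of that idea, with made-up schema, view, and table names:

    -- One schema per project; the packages select only from views in it.
    CREATE SCHEMA etl;
    GO

    CREATE VIEW etl.LogSource
    AS
    SELECT LogId, LogDate, Message
    FROM dbo.AppLogs;   -- hypothetical source table
    GO

    -- Finding every query the packages rely on then becomes one catalog query:
    SELECT v.name, m.definition
    FROM sys.views AS v
    JOIN sys.sql_modules AS m ON m.object_id = v.object_id
    WHERE v.schema_id = SCHEMA_ID('etl');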

Tables and stored procedures not getting included in the generated script

I have created a SQL Server database project using Visual Studio 2015. The underlying database is SQL Server 2016. I have created the project by importing an existing database into the project. I have the structure ready. When I click on publish and select generate, it doesn't include the tables and stored procedures in the generated script. I can see one odd script added. Am I missing something?
Please see the screenshot of my project
The issue has now been resolved. The mistake I was making was importing the database objects from the database and then trying to generate the script pointing at that same database. It was not generating the scripts because the database objects already existed. Pointing the publish action at another database let me generate the scripts.

SQL Server - copy all data from all tables from one server to another server with identical structure

I want to copy all data from all tables from one SQL server database to another existing SQL server database of the same structure. I have a script to initially delete all contents of all tables in my output database before proceeding so that it is 'fresh' for the copy.
I understand that 'select into' statements can get this done but I want to be able to do it in bulk. I want to emulate the behavior that works very well in Management Studio of:
Right-click a DB
Select 'Tasks'
Select 'Export Data...'
In here, I can select an output DB and then select all tables. The transfer goes straight through without issue. I cannot find a command line way to achieve this.
The reason I am after this is that we want a daily copy of the prod database in a testing environment, so need to task schedule this process to run each night.
Due to some constraints, I can't use a bacpac in this case.
Using the import/export task in SSMS, the last step has two options: run immediately, or save as an SSIS package. So save it as an SSIS package. You can then run this package whenever you want. And yes, you will need to do this twice: once for export, once for import. You can also do exactly the same thing directly in SSIS, by the way.
So how do you execute a package from the command line? As with any question, search first; there are plenty of suggestions and examples around.
And if needed, you can schedule this using SQL Server Agent; see the sketch below.
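As an illustration, here is a hedged sketch of an Agent job that runs a file-system package nightly; the job name, schedule, and package path are all placeholders:

    -- Create the job and a step that uses the SSIS subsystem.
    EXEC msdb.dbo.sp_add_job
        @job_name = N'Nightly prod-to-test copy';

    EXEC msdb.dbo.sp_add_jobstep
        @job_name  = N'Nightly prod-to-test copy',
        @step_name = N'Run export package',
        @subsystem = N'SSIS',
        @command   = N'/FILE "C:\SSIS\ProdToTest.dtsx"';  -- dtexec-style arguments

    -- Run every day at 01:00.
    EXEC msdb.dbo.sp_add_schedule
        @schedule_name = N'Nightly at 1am',
        @freq_type = 4,               -- daily
        @freq_interval = 1,
        @active_start_time = 10000;   -- HHMMSS, i.e. 01:00:00

    EXEC msdb.dbo.sp_attach_schedule
        @job_name = N'Nightly prod-to-test copy',
        @schedule_name = N'Nightly at 1am';

    EXEC msdb.dbo.sp_add_jobserver
        @job_name = N'Nightly prod-to-test copy';

The same /FILE argument string works with dtexec from a plain command prompt if you want to test the package outside the Agent first.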

Table not found error in SSAS

I am working on SSAS. I developed a data cube and deployed it successfully, but when I process that SSAS database through my SSIS package, it says that some table is not found, even though the table exists in my SSAS model.
Everything was working fine on previous days; this suddenly happened. Kindly help me with this issue.
Internally the SSIS [Analysis Services Process task] uses the Id of the table and not the friendly name. This means that if you have redeployed the cube or pointed it at a different database then it will not find the table you have specified even though you know it is there. You would then need to update the package every time you recreated the SSAS object.
A more manageable solution would be to use the [Analysis services Execute DDL Task] task or use AMO inside of a script task.
You can get the XMLA command by clicking on the cube, dimension, measure group, or partition in Management Studio: select the Process option, then choose the Script option at the top. Paste this XMLA code into the command window in the task.
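For reference, the scripted output looks roughly like the following; the DatabaseID, CubeID, and processing type are placeholders you would replace with the values SSMS generates for your objects (the IDs are often not the same as the friendly names, which is exactly the trap described above):

    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Process>
        <Object>
          <DatabaseID>MyOlapDb</DatabaseID>
          <CubeID>MyCube</CubeID>
        </Object>
        <Type>ProcessFull</Type>
      </Process>
    </Batch>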

Mimic DTS Copy SQL Server Objects Task

I am in the process of migrating a web application database from SQL Server 2000 to SQL Server 2008. Currently there is a DTS package that is used to deploy content changes from a staging database to the production database.
The DTS package is using a Copy SQL Server Objects task with the following options selected: Copy Data (Append Data) and Use Collation. The specific tables to copy are selected in the "Select Objects" dialog.
Because this is the only DTS package we have, it doesn't make much sense to learn and implement an SSIS solution, IMO, so I want to recreate the functioning of the DTS package using only T-SQL.
Writing the Insert and Select is not a problem. What I need to know is how the "Append Data" option works.
Is it looking at each row in the source, finding matching rows in the destination, and comparing and updating as necessary, OR is it ignoring existing rows and simply appending new rows?
If it is indeed comparing and updating, is it safe to use the SQL Server CHECKSUM function on the data as a method of comparison against the target, or is there a better way? Ideally, I'd like to avoid any schema changes.
Please check this MSDN article: Migrating DTS Packages to Integration Services.
You might be able to migrate the single DTS package to an SSIS package very easily using the tool noted in the article.
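If you do go the pure T-SQL route, here is a hedged sketch of an append-only copy (insert rows that do not yet exist, touch nothing else) for one hypothetical table keyed on ContentID:

    -- Append rows from staging that are not yet present in production.
    -- Database, table, and key names are placeholders; repeat per table.
    INSERT INTO Prod.dbo.Content (ContentID, Title, Body)
    SELECT s.ContentID, s.Title, s.Body
    FROM Staging.dbo.Content AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM Prod.dbo.Content AS p
                      WHERE p.ContentID = s.ContentID);

    -- For compare-and-update semantics you could join on the key and compare
    -- BINARY_CHECKSUM(*) values instead, bearing in mind that checksums can
    -- collide and so may miss some changed rows.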
