I am working on SSAS. I developed a data cube and deployed it successfully, but when I process that SSAS database through my SSIS package it says that some table is not found, even though the table exists in my SSAS model.
Everything was working fine on previous days; this suddenly happened. Kindly help me with this issue.
Internally the SSIS [Analysis Services Process task] uses the Id of the table and not the friendly name. This means that if you have redeployed the cube or pointed it at a different database then it will not find the table you have specified even though you know it is there. You would then need to update the package every time you recreated the SSAS object.
A more manageable solution would be to use the [Analysis Services Execute DDL Task] or to use AMO inside a Script Task.
You can get the XMLA command by right-clicking the cube, dimension, measure group or partition in Management Studio, selecting the Process option, and then choosing the Script option at the top of the Process dialog. Paste this XMLA code into the command window in the task.
I have an Excel file which has a column, and the data in the column is:
INSERT INTO dbo.Test (Id1, Id2, Id3)
VALUES (1, 2, 3)
Can someone please let me know how I can automate this through SSIS so that the insert statements directly load the data into the table?
I can create an SSIS package to load this Excel file into a table; however, I need the column contents to be executed so that the data is inserted into the Test table.
Any suggestions would be helpful.
Thanks
So, as I understand it, you already have a complete SSIS package with a connection to Excel, a connection to the database, and a transformation in the pipeline (grab data from the source, push it into the destination).
There are different ways to execute an SSIS package from within SQL Server (exec as bat or cmd), but I suggest the official one.
You can find a complete step-by-step tutorial on how to deploy SSIS to SQL Server here:
https://learn.microsoft.com/en-us/sql/integration-services/packages/deploy-integration-services-ssis-projects-and-packages?view=sql-server-ver15
Basically, right-click the project in Visual Studio and click Deploy... Follow the steps...
Take a look here:
https://learn.microsoft.com/en-us/sql/integration-services/packages/deploy-integration-services-ssis-projects-and-packages?view=sql-server-ver15#to-deploy-and-execute-a-package-using-stored-procedures
There is a step-by-step tutorial on how to run your SSIS package from a stored procedure.
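
For reference, here is a minimal T-SQL sketch of what that tutorial boils down to, assuming the project was deployed to the SSISDB catalog; the folder, project and package names below are placeholders:

DECLARE @execution_id BIGINT;

-- Create an execution for the deployed package (names are placeholders).
EXEC SSISDB.catalog.create_execution
     @folder_name  = N'MyFolder',
     @project_name = N'MyProject',
     @package_name = N'LoadExcel.dtsx',
     @execution_id = @execution_id OUTPUT;

-- Start the execution that was just created.
EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;

Wrap these two calls in a stored procedure and you have the "run the package from a stored procedure" part covered.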
Now you have to decide how to automate running the stored procedure itself. Here, too, you have many approaches, for example your own timer implementation outside of SQL Server, or the Windows Task Scheduler.
As I don't really know your environment, I'll suggest an easy way: create a scheduled job with SQL Server Agent (it has to be installed on your SQL Server instance).
A step-by-step tutorial is here:
https://learn.microsoft.com/en-us/sql/ssms/agent/create-a-job?view=sql-server-ver15
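
If you prefer scripting the job rather than clicking through the wizard, a rough sketch using the msdb procedures could look like this; the job, schedule and wrapper procedure names are made up for illustration:

USE msdb;

-- Job with one T-SQL step that calls a hypothetical wrapper procedure
-- containing the SSISDB catalog calls shown earlier.
EXEC dbo.sp_add_job     @job_name = N'NightlyExcelLoad';
EXEC dbo.sp_add_jobstep @job_name  = N'NightlyExcelLoad',
                        @step_name = N'Run package',
                        @subsystem = N'TSQL',
                        @command   = N'EXEC dbo.usp_RunExcelLoad;';

-- Run it every day at 06:00.
EXEC dbo.sp_add_schedule @schedule_name     = N'Daily0600',
                         @freq_type         = 4,      -- daily
                         @freq_interval     = 1,
                         @active_start_time = 60000;  -- 06:00:00
EXEC dbo.sp_attach_schedule @job_name      = N'NightlyExcelLoad',
                            @schedule_name = N'Daily0600';

-- Register the job on the local server so the Agent picks it up.
EXEC dbo.sp_add_jobserver @job_name = N'NightlyExcelLoad';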
I want to copy all data from all tables from one SQL server database to another existing SQL server database of the same structure. I have a script to initially delete all contents of all tables in my output database before proceeding so that it is 'fresh' for the copy.
I understand that 'select into' statements can get this done but I want to be able to do it in bulk. I want to emulate the behavior that works very well in Management Studio of:
Right-click a DB
Select 'Tasks'
Select 'Export Data...'
In here, I can select an output DB and then select all tables. The transfer goes straight through without issue. I cannot find a command line way to achieve this.
The reason I am after this is that we want a daily copy of the prod database in a testing environment, so need to task schedule this process to run each night.
Due to some constraints, I can't use a bacpac in this case.
Using the Import/Export task in SSMS, the last step has two options: run immediately or save as an SSIS package. So, save it as an SSIS package. You can then run this package whenever you want. And yes, you will need to do this twice: once for export, once for import. You can also do exactly the same thing directly in SSIS, by the way.
So how do you execute a package from the command line? As with any question, you should search first. Some suggestions/examples are here.
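
To make that concrete with one option (a sketch only, the package path is made up): the package saved from the wizard can be run with the dtexec utility, either from a .bat file, from Task Scheduler, or, if xp_cmdshell is enabled on the instance, straight from T-SQL:

-- Run the saved package file with dtexec via xp_cmdshell.
-- Requires xp_cmdshell to be enabled; the path is a placeholder.
EXEC master.dbo.xp_cmdshell 'dtexec /FILE "C:\SSIS\CopyProdToTest.dtsx"';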
And if needed, you can schedule this using the agent.
I have an automated SSRS report that runs off a SQL Server SP. A subscription then exports this to a folder each morning. I'm not sure if this next step is possible, but maybe someone knows:
Another Excel report from an outside source will be dropped into the same folder each morning. Is it possible to then automatically compare the two reports and delete records from my automated report that do not match?
I know I can manually import the new Excel file into SQL Server, do a join and then delete the records. But is it possible to do this automatically?
This sounds like a job for SSIS. SSIS is designed to handle ETL workflows like this. I would create a package that pulls in the outside Excel file and either the output from your SSRS subscription or, preferably, calls the stored procedure itself. You can do all the data comparison you need and then output in whatever format you need, or run T-SQL statements against your source database based on the results of your comparison, as appropriate to your use case. You'd deploy your package to a SQL Server instance set up for SSIS and create a SQL Agent job to run it at the appropriate interval.
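
As a rough illustration of the comparison step (the table and column names here are invented, assuming the package has already staged both files into tables):

-- Delete rows from the report staging table that have no match
-- in the outside-source staging table.
DELETE r
FROM dbo.ReportStaging AS r
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.OutsideStaging AS o
                  WHERE o.RecordId = r.RecordId);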
I have a SQL Server database with defined (1) tables, (2) corresponding views, and (3) stored procedures.
To my knowledge there is one and only one SSIS package, which takes care of the process of loading the data in the mentioned database.
At the end there is a web interface; the end user simply runs SELECT statements on top of one view in the database and the data is displayed.
The problem I am facing is that some values (specific logs) are outdated.
That means the load job (the SSIS package) executes successfully, but the newest data does not show up.
I am assuming that the problem is either that the log files are not placed where they should be, or that the source for the logs has changed.
Therefore, I opened the productive version of the SSIS package. But then I got these error messages:
The script is corrupted.
There were errors during task validation.
There should be no errors, because the package runs successfully every day.
I tried to find something like 'Rebuild the Project' option, but could not find anything.
How can I test it in order to find the cause of the outdated rows?
-> Where in the package can I see, for example, where a particular table is being filled?
You should open the SSIS package with the proper version of BIDS or Visual Studio; SSIS packages differ between SQL 2008 and SQL 2012. Which version of SQL Server was the package created with, and with which tool are you opening it?
I need to create a copy of all of our production databases (SQL Server), without any data.
I need to do this on a regular basis, preferably scheduled and not manually.
Do I have to write code that extracts the metadata from the system tables and builds the SQL statements itself, or is there a better way to do this?
There is a method to do that in SQL Server Management Studio:
Select a database
Right-click
Tasks
Generate Scripts
etc ...
You can then save the generated script as a .sql file.
Once the script is set up, and if it is T-SQL, you can just add it to the jobs of your server. And if you only have a SQL Express server (which has no job scheduling), I remember it was possible, a few years ago, to find some free products on the net that would do the job.
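
As a sketch of the "add it to the jobs" part, assuming you saved the generated script to a file (the job name and path are placeholders), an Agent job step can replay it with sqlcmd:

USE msdb;

-- CmdExec step on an existing Agent job that re-runs the generated schema script.
EXEC dbo.sp_add_jobstep
     @job_name  = N'RefreshEmptyCopies',
     @step_name = N'Run generated schema script',
     @subsystem = N'CmdExec',
     @command   = N'sqlcmd -S . -i "C:\Scripts\EmptySchemaCopy.sql"';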