SQL Server 2005 Analysis Services (SSAS) Partial Synchronization

I'm new to SSAS (this is my first project that involves it).
I have a SQL Server 2005 server (say Blah) that runs both a database engine instance and Analysis Services. I want to synchronize some of the data in Blah (based on some condition) to another server, Blah2. Partial data sync is quite straightforward with the help of replication. However, I'm not sure how to do a partial synchronization for Analysis Services.
I have a table in the Blah database that lists all of the cubes in its Analysis Services instance. I can filter this table to list the necessary cubes, so that part is fine, but I'm not sure how to continue from there.
I've looked into the SSAS Database Sync Wizard, but I couldn't find any command-line tool for it or a way to run it as a procedure from a SQL script (I will need to do this as a regular SQL Server job, so it can't rely on the GUI). Even if I wanted to use the GUI, there doesn't seem to be a way to filter the cubes/measures from it.
I'm thinking of getting the cube, measure, data source view, etc. definitions dynamically, but I can't find a way to get them from a SQL script. I tried a simple OPENQUERY to get the list of cubes (olap_server is a linked server to the SSAS instance):
select *
from openquery(olap_server, 'select [CATALOG_NAME]
from $system.dbschema_catalogs')
with no luck. Instead I get the error: "An error occurred while preparing the query "select [CATALOG_NAME]
from $system.dbschema_catalogs" for execution against OLE DB provider "MSOLAP" for linked server "olap_server"."
Is there any straightforward way to do this task?

I am looking at doing something similar...
You can synchronize without using the Database Sync Wizard GUI by issuing an XMLA Synchronize command against the target database, as described here: http://msdn.microsoft.com/en-us/library/ms187156.aspx
or, for 2005, here: http://msdn.microsoft.com/en-us/library/ms187156(SQL.90).aspx
Your SQL Server job will need to have a step of type "SQL Server Analysis Services Command"; a sketch of creating one follows below.
An example and some more background info is here: http://dwbijourney.blogspot.com/2008/01/ssas-database-synchronization-for.html
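As a rough sketch of wiring that up from T-SQL (the job, step, server, and database names are placeholders; the XMLA body follows the Synchronize command documented in the links above, and it is issued against the target instance, Blah2 in the question):

-- Sketch only: add an Agent job step of type "SQL Server Analysis Services Command".
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Sync SSAS database',   -- assumes the job already exists
     @step_name = N'Synchronize from Blah',
     @subsystem = N'ANALYSISCOMMAND',      -- the Analysis Services command step type
     @server    = N'Blah2',                -- SSAS instance the command runs against
     @command   = N'
<Synchronize xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP;Data Source=Blah;Integrated Security=SSPI</ConnectionString>
    <Object>
      <DatabaseID>MyOlapDatabase</DatabaseID>
    </Object>
  </Source>
  <SynchronizeSecurity>SkipMembership</SynchronizeSecurity>
  <ApplyCompression>true</ApplyCompression>
</Synchronize>';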

Related

Is there a way to save all queries present in an SSIS package/dtsx file?

I need to run some analysis on my queries (specifically, finding all the tables that an SSIS package calls).
Right now I'm opening every single SSIS package and every single step in it, and manually copying and pasting the tables out of it.
As you can imagine, it's very time consuming and mind-numbing.
Is there a way to export all the queries automatically?
By the way, I'm using SQL Server 2012.
Retrieving the queries is not a simple process; there are two ways to achieve it:
Analyzing the .dtsx package XML content using regular expressions
SSIS packages (.dtsx) are XML files, so you can read them as text files and use regular expressions to retrieve tables (for example, you may search for all statements that start with SELECT, UPDATE, DELETE, DROP, ... keywords). A small T-SQL sketch of this idea follows the linked questions below.
There are some existing questions about retrieving information from .dtsx files that you can refer to for ideas:
Reverse engineering SSIS package using C#
Automate Version number Retrieval from .Dtsx files
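T-SQL has no built-in regular expressions, so as a rough equivalent of the approach above, here is a minimal sketch that loads a .dtsx file as XML and pulls out the Execute SQL Task statements with XQuery (the file path is a placeholder and the namespace matches 2012-format packages; OLE DB source queries are stored as component properties and would need a similar extra query):

-- Sketch: read a .dtsx file and list the SQL of its Execute SQL Tasks.
-- Requires permission to use OPENROWSET(BULK ...).
DECLARE @pkg XML;

SELECT @pkg = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK N'C:\Packages\MyPackage.dtsx', SINGLE_BLOB) AS f;

;WITH XMLNAMESPACES ('www.microsoft.com/sqlserver/dts/tasks/sqltask' AS SQLTask)
SELECT t.n.value('@SQLTask:SqlStatementSource', 'NVARCHAR(MAX)') AS sql_statement
FROM @pkg.nodes('//SQLTask:SqlTaskData') AS t(n);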
Using SQL Profiler
You can create and run a SQL Profiler trace on the SQL Server instance and filter on all T-SQL commands executed while the SSIS package runs. Some examples can be found in the following posts:
How to capture queries, tables and fields using the SQL Server Profiler
How to monitor just t-sql commands in SQL Profiler?
SSIS OLE DB Source Editor Data Access Mode: “SQL command” vs “Table or view”
Is there a way in SQL profiler to filter by INSERT statements?
Filter Events in a Trace (SQL Server Profiler)
You can also use Extended Events (which offer more options than Profiler) to monitor the server and collect SQL commands; a minimal session definition is sketched after these links:
Getting Started with Extended Events in SQL Server 2012
Capturing queries run by user on SQL Server using extended events
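A minimal Extended Events session along those lines might look like this (the session name and the application-name filter are assumptions; SSIS connections usually report a client application name containing "SSIS", but check sys.dm_exec_sessions for the exact value in your environment):

-- Sketch: capture T-SQL batches and RPC calls issued while the package runs,
-- filtered on the client application name (adjust the pattern to your packages).
CREATE EVENT SESSION [capture_ssis_sql] ON SERVER
ADD EVENT sqlserver.sql_batch_completed
(
    ACTION (sqlserver.client_app_name, sqlserver.sql_text)
    WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name], N'%SSIS%'))
),
ADD EVENT sqlserver.rpc_completed
(
    ACTION (sqlserver.client_app_name, sqlserver.sql_text)
    WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name], N'%SSIS%'))
)
ADD TARGET package0.event_file (SET filename = N'capture_ssis_sql');

ALTER EVENT SESSION [capture_ssis_sql] ON SERVER STATE = START;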
You could create a schema for this specific project and then have all the SQL stored within views on that schema. That will help keep things tidy and help with issues like this.

Move data between different servers

I'm working on a project where I need to automatically run an insert statement to insert a result set. The problem is that it needs to go from a SQL Server instance over to a DB2 server. I can't create a file or script and then import it or run it on the other side; I need to insert or update the DB2 side from the SQL Server side.
Is this possible? I need this to run all by itself as part of a stored procedure in SQL Server.
You're looking for the linked server feature.
Typically linked servers are configured to enable the Database Engine to execute a Transact-SQL statement that includes tables in another instance of SQL Server, or another database product such as Oracle. Many types of OLE DB data sources can be configured as linked servers, including Microsoft Access and Excel. Linked servers offer the following advantages:
The ability to access data from outside of SQL Server.
The ability to issue distributed queries, updates, commands, and transactions on heterogeneous data sources across the enterprise.
The ability to address diverse data sources similarly.
(I believe most of the major RDBMSs have a similar feature)
For the most part, this essentially allows you to treat tables or sources in the other database as if they were part of the SQL Server instance; an INSERT statement should just work "normally", as in the sketch below.
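A minimal sketch, assuming a linked server named DB2SRV already points at the DB2 database and the provider accepts four-part names (database, schema, table, and column names are placeholders; OPENQUERY or EXECUTE ... AT is the usual fallback when four-part names don't work with a particular DB2 provider):

-- Sketch: push rows from a local SQL Server table into a DB2 table through a linked server.
INSERT INTO DB2SRV.MYDB.MYSCHEMA.TARGET_TABLE (ID, AMOUNT)
SELECT Id, Amount
FROM dbo.SourceTable
WHERE ExportFlag = 1;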
As mentioned, you can use a linked server on the SQL Server side to perform operations between the two servers. I haven't done much with running DML on DB2 from SQL Server, but in my experience SSIS performs far better than linked servers for transfers pulling data from DB2 into SQL Server over an OLE DB connection. You can read more about OLE DB connections in SSIS here, and you'll want to check the DB2 documentation for the specific DB2 type (mainframe, LUW, etc.) for details on setting up the connection on that side. If you set up the SSIS catalog, you can run packages using SQL Server stored procedures, which you can either call directly or execute from your existing stored procedures; a sketch follows.
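For the SSIS catalog route mentioned above, starting a deployed package from a stored procedure is only a couple of calls (the folder, project, and package names are placeholders):

-- Sketch: start a deployed SSIS package from T-SQL via the SSIS catalog (SSISDB).
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name  = N'ETL',
     @project_name = N'Db2Transfer',
     @package_name = N'LoadFromDb2.dtsx',
     @execution_id = @execution_id OUTPUT;

EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;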

Creating a snapshot database to another SQL Server

I'm trying to save the values of several columns of one table to another table on a different server. I am using SQL Server. I would like to do this without running external programs that query this database and insert the results into the new database. Is there any way to do this from within SQL Server Management Studio?
This is a recurring event that occurs every hour. I have tried scheduling maintenance tasks that execute custom T-SQL scripts, but I'm having trouble getting the connection to the remote server.
Any help would be appreciated.
If you can set up the remote server as a linked server, you should be able to configure SQL Server Agent to execute jobs containing queries that access tables on both the local and the linked server. Remember that you might have to configure the access rights of the account used to run SQL Server Agent so that it has permission to read/write the tables on both servers.
This practice might not be without issues though as this article discusses.
You can use a four-part name like this:
INSERT [InstanceName].[DatabaseName].[SchemaName].[TableName]
SELECT * FROM [SourceInstanceName].[SourceDatabaseName].[SourceSchemaName].[SourceTableName]
But first you will have to set up the remote server as a linked server, as described here:
https://msdn.microsoft.com/en-us/library/aa560998.aspx
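For the common case where the remote instance is another SQL Server, registering it can be as small as this (the instance name is a placeholder), after which the four-part INSERT above can run from an hourly SQL Server Agent job:

-- Sketch: register the remote SQL Server instance as a linked server.
EXEC master.dbo.sp_addlinkedserver
     @server     = N'RemoteServer\RemoteInstance',
     @srvproduct = N'SQL Server';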

How do you pull data from SQL Server to Oracle?

I want to take data from a SQL Server table and populate an Oracle table. Right now, my solution is to dump the data into an Excel sheet and write a macro to create a SQL file that I can load into Oracle. The problem with this is that I want to automate the process and I'm not sure I can.
Is there an easy way to automate populating an Oracle table with data from a SQL Server table?
Thanks in advance
I suppose it depends on your definition of "easy".
The most robust approach would be either to use heterogeneous connectivity in Oracle to create a database link to the SQL Server database and pull the data from SQL Server, or to create a linked server in SQL Server that connects to Oracle and push the data from SQL Server to Oracle; a sketch of the latter option follows.
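A rough sketch of the linked-server option (the linked server name, TNS alias, credentials, and table names are all placeholders, and the Oracle client/OLE DB provider has to be installed on the SQL Server machine first):

-- Sketch: create a linked server to Oracle and push rows into an Oracle table.
EXEC master.dbo.sp_addlinkedserver
     @server     = N'ORA_LINK',
     @srvproduct = N'Oracle',
     @provider   = N'OraOLEDB.Oracle',
     @datasrc    = N'ORCL';             -- TNS alias from tnsnames.ora

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'ORA_LINK',
     @useself     = 'FALSE',
     @rmtuser     = N'APP_USER',
     @rmtpassword = N'change_me';

-- Oracle four-part names omit the catalog: linked_server..schema.table
INSERT INTO ORA_LINK..APP_USER.TARGET_TABLE (ID, NAME)
SELECT Id, Name
FROM dbo.SourceTable;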
Yes. Take a look at MS SQL's SSIS, which stands for SQL Server Integration Services. SSIS offers all sorts of advanced capabilities, including automation with SQL Server Agent jobs, for moving data between disparate data sources. In your case, connecting to Oracle can be achieved in a variety of ways.
There are three ways to automate this:
1) You can do as Paul suggested and create an SSIS package that will do this; it can be scheduled via SQL Agent.
2) If you don't want to deal with SSIS, you can download the free SQL# (SQLsharp) CLR library from http://www.SQLsharp.com/ and use the DB_BulkCopy stored procedure to do this in a T-SQL stored procedure, which can also be scheduled via SQL Agent. [note: I am the author of SQL#]
3) You can also set up a linked server from SQL Server to Oracle, but this has the drawback of being a potential security hole. Of course, you could use an Oracle login that only has write access to that single table (or something similar).
There are lots and lots of ways to do it. Which you choose depends on your requirements.
Using Excel is fine if it's a one-time thing.
If it's a once-in-a-while thing, then you could write a simple .NET app that uses a single DataSet and multiple DataAdapters to do the data dump. C# code example here.
If it's a regular thing, then you could put the above in a scheduled task (schtasks), or you could use SSIS. I think SSIS is an extra-cost option.
If the requirement is for "online access", then a linked database is probably appropriate.

How can I migrate a database from SQL Server 2008 to SQL Server 2000?

I am replacing an Access application with a web app, but the client is using SQL Server 2000, and I am using SQL Server 2008.
So, I have the database redesigned, with foreign keys, but now I need to get the data onto the client's system.
Part of the problem is that they have images that are over 32 KB, so osql failed as the command buffer filled up.
I should be able to use osql to import the new schema at least, and perhaps all of the data except for the images.
The Export wizard just wouldn't work, even though I tried both the SQL Server Native Client driver and the OLE DB SQL driver.
Flat files seem like a bad choice, as I don't know whether they can handle the images.
So, what is a good way to copy a 330 MB database from 2008 to 2000?
Not sure about performance or time needed, but you could always try a tool like:
Red-Gate SQL Compare / SQL Data Compare
Apex SQL Diff / SQL Data Diff
These will let you compare both the schema and the data of two databases, and will let you create synchronization scripts or synchronize online.
Marc
I set the image column to null, which reduced the size of the insert statements.
This enabled me to import the data into the target database.
