Backup SSAS via SSMS Job with Date Appended

I'd like to back up my SSAS database via a SQL Server Agent job in SSMS, and I'd also like to append the day number to the end of the file name and allow overwrites.
This will ensure that I only ever have a month of backups (e.g. the Jan 1st backup will be called backup_01.abf, Jan 2nd = backup_02.abf, and so on, with each month overwriting the last).
I connected to the SSAS DB via SSMS and scripted out the backup procedure, which is as follows:
{
  "backup": {
    "database": "ExampleDB",
    "file": "Backup.abf",
    "allowOverwrite": true,
    "applyCompression": false
  }
}
I believe I can simply add this as a step in a SQL Server Agent job, as a SQL Server Analysis Services Command.
But how can I then append the day of the month to the file name?
I can't seem to find much about this online.
[EDIT]
Using a combination of Vaishali's answer below and another article I found online, I've done the following:
Created a linked server (Link_SSAS)
Generated the following script:
DECLARE @XMLA nvarchar(1000),
        @DateSerial nvarchar(35);

-- Get the two-digit day number from GETDATE()
SET @DateSerial = RIGHT('0' + RTRIM(DAY(GETDATE())), 2);

-- Create the XMLA string
SET @XMLA = N'<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>ExampleDB</DatabaseID>
  </Object>
  <File>C:\bak\Backup_' + @DateSerial + '.abf</File>
</Backup>';

-- Execute the string across the linked server
EXEC (@XMLA) AT Link_SSAS;
The above XMLA runs perfectly if I just execute it through a query window in SSMS.
However, when I try to put the XMLA into a job step, I get an error.
With the job type set to T-SQL I get:
The specified '@server' is invalid (valid values are returned by sp_helpserver)
If I run sp_helpserver, it does show my linked server as being there.
I've tried disconnecting and reconnecting to the server.

You can do this with a dynamic XMLA script executed via SQL. The link below will be useful:
https://www.mssqltips.com/sqlservertip/2790/dynamic-xmla-using-tsql-for-sql-server-analysis-services/

Related

How to load data in different servers

I am designing an ETL project in SSIS and I want it to be dynamic. I will use this project for many customers, so these extraction queries will run against different servers.
For example, I have this query in a step with an Execute SQL Task component:
INSERT DataWarehouse.schema.fact1
SELECT *
FROM Database.schema.table1
My data warehouse is always on localhost, but Database.schema.table1 could be on different servers, so I will have different linked servers on our customers' servers to retrieve the data.
This means, for example, that the query needs to change like this for customer1:
INSERT DataWarehouse.schema.fact1
SELECT *
FROM [192.168.1.100].Database.schema.table1
And for customer2 I will need the query to be like this :
INSERT DataWarehouse.schema.fact1
SELECT *
FROM [10.2.5.100].Database.schema.table1
I've tried extracting and loading with SSIS components, but because of my complex queries it became messy.
Any ideas on how to make my query dynamic?
As per this link: Changing Properties of a Linked Server in SQL Server.
One way to solve your problem is to make sure that the linked server's logical name is always the same, regardless of what the actual physical host is.
So the process here would be:
Create the linked server with the linked server wizard
Use this to rename the server to a consistent name that can be used in your code
i.e.
EXEC master.dbo.sp_serveroption
    @server = N'192.168.1.100',
    @optname = N'name',
    @optvalue = N'ALinkedServer';
Now you can refer to ALinkedServer in your code.
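For example, the extraction query from the question can then stay the same for every customer (a sketch reusing the question's database, schema, and table names):

INSERT DataWarehouse.schema.fact1
SELECT *
FROM ALinkedServer.Database.schema.table1

Only the linked server definition changes per customer; the query text never does.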
A better way is to script the linked server creation properly, rather than using the SSMS wizard.
Here's the template; you need to do more research to find out the correct values here:
USE master;
GO
EXEC sp_addlinkedserver
    @server = 'ConsistentServerName',
    @srvproduct = 'product name',
    @provider = 'provider name',
    @datasrc = 'ActualPhysicalServerName',
    @location = 'location',
    @provstr = 'provider string',
    @catalog = 'catalog';
GO
But the last word is: Don't use linked servers. Use SSIS
I would suggest the steps below to execute the same statement across multiple servers. As suggested by @Nick.McDermaid, I would strongly recommend against linked servers. It is better to use the exact server name in SSIS.
Put the INSERT statement into a separate variable
Create a foreach container in SSIS.
Inside the Foreach container, have a Script Task get the current server name from the list of server names. You can keep a comma-separated list of server names and pick the current one.
Again inside the Foreach container, create an Execute Process Task and call sqlcmd.exe with connection information specific to each server, based on the server name from the previous step, using SSIS expressions. Refer to this Stack Overflow post on using expressions for Execute Process Task for more information.
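The per-server command line can be sketched as follows. The server list, database, and table names are illustrative assumptions, and the loop only prints the commands that SSIS would build via expressions and hand to the Execute Process Task:

```shell
# Illustrative sketch: SSIS would generate one such command per server
# via an expression and run it with an Execute Process Task.
# Server names, database, and table names here are assumptions.
for SERVER_NAME in 192.168.1.100 10.2.5.100; do
  echo sqlcmd -S "$SERVER_NAME" -d Database \
    -Q "INSERT DataWarehouse.schema.fact1 SELECT * FROM Database.schema.table1"
done
```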
How about making an SSIS package that works for one of your systems:
Parameterize your working package to accept a connection string
Create another package that loops through your connection strings, calls your working package, and passes the connection string in

SQL Server 2008 R2 Standard data-driven subscriptions workaround

So after some research I figured out that Standard edition doesn't have the ability to manage data-driven subscriptions, so you have to write a custom script or stored procedure to work around it and get the same result.
My goal is to be able to edit our existing reports (most of them are done with SSRS, but a number of them are created as SQL Server Agent jobs) so that they only e-mail the report if data is available; if no rows come up, I want it to cancel sending the e-mail.
Before tackling an existing report, I tried creating a simple test script to get a better understanding of DB Mail and stored procedures, and came up with this:
IF EXISTS (SELECT cht_number, cht_itemcode, cht_description
           FROM chargetype
           WHERE last_updatedate > '11/11/2014')
    EXECUTE msdb.dbo.sp_send_dbmail
        @profile_name = NULL,
        @recipients = 'email@company.com',
        @subject = 'Test',
        @execute_query_database = 'DB_Name',
        @query = 'SELECT cht_number, cht_itemcode, cht_description FROM chargetype WHERE last_updatedate > ''11/11/2014''',
        @attach_query_result_as_file = 1,
        @query_attachment_filename = 'TEST.csv',
        @query_result_no_padding = 1,
        @query_result_header = 1,
        @query_result_width = 256,
        @query_result_separator = ' ';

IF @@ROWCOUNT = 0 RAISERROR ('No Data', 16, 1);
To test this script, I would edit something in that table so that only the most recent items would be sent in the report. If no data was available, it would just raise the error "No Data".
Can anyone suggest another way of getting this result, or how I could shorten this and use it as a sort of template I could adapt to fit an existing script?
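One way to template it (a sketch only, reusing the table, recipient, and date from the script above; not tested against a live Database Mail profile) is to let the existence check drive an IF/ELSE, so the query, recipient, and threshold date are the only parts you swap per report:

IF EXISTS (SELECT 1 FROM chargetype WHERE last_updatedate > '11/11/2014')
BEGIN
    -- Data exists: send the report as an attached query result
    EXEC msdb.dbo.sp_send_dbmail
        @recipients = 'email@company.com',
        @subject = 'Test',
        @execute_query_database = 'DB_Name',
        @query = 'SELECT cht_number, cht_itemcode, cht_description
                  FROM chargetype WHERE last_updatedate > ''11/11/2014''',
        @attach_query_result_as_file = 1,
        @query_attachment_filename = 'TEST.csv';
END
ELSE
    -- No rows: skip the e-mail entirely and flag the job step
    RAISERROR ('No Data', 16, 1);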

Running SQL Transport Schema Generation Wizard with datetime parameter in biztalk

I am trying to run the SQL Transport Schema Generation Wizard against a SQL 2012 server. The stored procedure has a datetime parameter. If I simply put in a date like 12/26/2013 05:00:00 AM, the "Generate" button doesn't show an argument. If I try putting the date/time in single quotes, or using a string like 2013-12-26T05:00:00.000, the parameter is generated, but I get the following error when I try to execute: "Failed to execute SQL Statement. Please ensure that the supplied syntax is correct."
I got to this point by creating a SQL query that outputs its response using FOR XML AUTO, ELEMENTS. I then open my BizTalk solution in VS 2012, go to "Add Items -> Add Generated Items", and select Add Adapter Metadata. From there, it asks for the location of the message box; I use my local server. It then asks for the connection string for the SQL Server with the stored procedure. I enter that (it's the same as the server with the message box). I specify the namespace and the root element name for the document. This is set as a receive port. I next select stored procedure and move to the next screen, then select the stored proc from a drop-down list. Below, in a grid, I am shown the parameters for the stored proc. Here is where I am having trouble: I cannot seem to get it to accept the datetime argument no matter what I put in here.
Is there something I am doing wrong?
It is better to do the following steps and to use the new WCF-SQL adapter rather than the old, deprecated SQL adapter:
Add Items
Add Generated Items
Consume Adapter Service
Select sqlBinding and Configure the URI
Click Connect
Select Client (Outbound operations)
Select Strongly-Type Procedures
Select the Stored Procedure from Available categories and operations
Click Add
Give it a Filename Prefix
Click OK
This will generate the schemas plus binding files to create the port.
You also don't need to have the FOR XML AUTO, ELEMENTS in your stored procedure any more.

How can I create a job to back up an SSAS database?

I use SQL Server 2008 R2 and SSAS.
I have some dimensional databases in SSAS.
I need to create a job to back up all the SSAS databases each day.
How can I do it?
So there are a few ways:
1/ Create a SQL job using Analysis Services Command steps, one for each DB, using code similar to this:
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>DBName</DatabaseID>
  </Object>
  <File>\\server\share\folder\backupFile.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>
This is nice and easy, and if you only have a few SSAS databases that you don't delete or add to on a regular basis, it's relatively painless to maintain.
2/ Create an SSAS linked server and then use a regular SQL statement in a job step to pass in your command (this gives you a bit more flexibility, as you can use variables to modify things like the filename):
Create SSAS Linked Server
DECLARE @XMLA nvarchar(1000),
        @timeStamp nvarchar(35);

SET @timeStamp = CONVERT(nvarchar, GETDATE(), 112) + '_'
    + REPLACE(CONVERT(varchar(8), GETDATE(), 108), ':', '');

SET @XMLA = N'<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>dbName</DatabaseID>
  </Object>
  <File>\\server\share\folder\dbName_' + @timeStamp + '.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
  <ApplyCompression>false</ApplyCompression>
</Backup>';

EXEC (@XMLA) AT [SSAS_LinkedServer];
EDIT: What you can also do with this method is easily get a list of SSAS databases from the Analysis Server using the following:
SELECT [catalog_name]
FROM OPENQUERY([SSAS_LinkedServer],
    'select [catalog_name] from $system.dbschema_catalogs');
You can then just cursor through this list, executing the SSAS command through the linked server for each DB, meaning you don't need to maintain the job manually when you add a new SSAS database.
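A minimal sketch of that cursor loop, reusing the linked server and file share from the example above. It assumes each DatabaseID matches its catalog name (they can differ if a database has been renamed), and it obviously needs a live SSAS linked server to run:

DECLARE @db nvarchar(128), @XMLA nvarchar(1000);

-- Cursor over the catalog list returned by the DMV query above
DECLARE db_cursor CURSOR FOR
    SELECT [catalog_name]
    FROM OPENQUERY([SSAS_LinkedServer],
        'select [catalog_name] from $system.dbschema_catalogs');

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build and run one Backup command per database
    SET @XMLA = N'<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>' + @db + N'</DatabaseID>
  </Object>
  <File>\\server\share\folder\' + @db + N'.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>';
    EXEC (@XMLA) AT [SSAS_LinkedServer];

    FETCH NEXT FROM db_cursor INTO @db;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;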
3/ Use PowerShell. The method below won't work on 2008 R2 as-is, but it could be modified to use SMO directly rather than the nice provider that ships with SQL Server 2012 (this is the method that we use; the following is a sample from my script):
import-module sqlps -disablenamechecking
$server = "SERVER\INSTANCE"
$backupPath = "\\backup-server\share\SSAS\$server\"

# Get a list of the SSAS databases on the server
$ssasDBs = dir SQLSERVER:\SQLAS\$server\databases

# Back up each SSAS database on the server
foreach ($db in $ssasDBs)
{
    $extension = Get-Date -UFormat "_%Y_%m_%d_%A_%H%M%S.abf"
    $backupFile = $backupPath + $db + $extension
    try
    {
        $db.Backup($backupFile)
    }
    catch
    {
        Write-Warning "Backup of $db Failed!!!"
    }
}
This method has the advantage that you don't have to change the job if you add a new SSAS database, it will get backed up automatically by the script with no changes.
I'm sure there are other ways to achieve this, but these are three ways I have used to manage SSAS backups. I generally find the management side of SSAS rather painful, as it seems to be a bit lacking, but maybe that's just me. The new PowerShell bits in SQL 2012 make it much easier, though.
Hope this helps with at least some ideas of what you can do.

Call DB2 stored procedure from SQL Server 2008 linked server

I have a linked server from SQL Server 2008 to DB2. The linked server uses the IBM Drivers and not the Microsoft ones.
So this works from SQL Server:
exec ('call RERTEBT.GET_DEFINITION (69,'''','''')') AT MyLinkedDB2Server
This also works using OPENQUERY (it returns data from another table):
select
RPMG_ETY_CD,
ROW_CU_DATA_IN,
ROW_EF_DT,
ROW_XPR_DT,
RPMG_ETY_NM
from
OPENQUERY
(MyLinkedDB2Server,
'select
RPMG_ETY_CD,
ROW_CU_DATA_IN,
ROW_EF_DT,
ROW_XPR_DT,
RPMG_ETY_NM
from RERTEBT.V1RERRMM')
However, I cannot get a SELECT to return data from the DB2 stored procedure.
This fails:
SELECT FLT_DFN_ID, FLT_SRC_DFN_NO, FLT_VRSN_NO, FLT_STAT_CD, FLT_TY_CD, FLT_NAME
FROM OPENQUERY (MyLinkedDB2Server,
'call RERTEBT.GET_DEFINITION (69,'''','''')')
Has anyone any idea how to call a DB2 stored procedure from a SQL Server linked server and return the data, or whether this can be done at all? I read somewhere that DB2 can't do this, but I haven't seen any real documentation on it.
Thanks, D
More explanation for Josef's answer:
You need to right-click the linked server and choose Properties,
then go to Server Options.
The "RPC" and "RPC Out" options in the right pane need to be True.
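The same two options can also be set from T-SQL with sp_serveroption, using the linked server name from the question:

EXEC master.dbo.sp_serveroption
    @server = N'MyLinkedDB2Server', @optname = N'rpc', @optvalue = N'true';
EXEC master.dbo.sp_serveroption
    @server = N'MyLinkedDB2Server', @optname = N'rpc out', @optvalue = N'true';

'rpc out' is the one that matters for EXEC ... AT; 'rpc' enables RPC from the remote server.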
(Edited: I can't comment on the answer yet, as I don't have 50 rep.)
You should be able to do this:
EXEC ('{CALL RERTEBT.GET_DEFINITION (69,'''','''')}') AT MyLinkedDB2Server;
Or even cleaner, passing variables:
EXEC ('{CALL RERTEBT.GET_DEFINITION (?,?,?)}', 69, '', '') AT MyLinkedDB2Server;
