Master Data Services - sql-server

I'm a SQL Server developer learning MDS. I loaded some entities via staging tables and via Excel add-in.
I'm trying to update members in an entity in MDS via the staging table. I can successfully add new members, but any attribute updates to existing members aren't populated to the entity view. The import process runs successfully with no errors.
I've tried ImportType = 0 and 2; neither works. When I set it to 1, I get an error, as expected. I also tried to update the code value using the NewCode column, and that does not get updated either.
I've set up the staging data with an SSIS package, and also with a direct T-SQL INSERT INTO statement.
I am using almost the same T-SQL INSERT statement for a test entity I created: first to load a new member, and then to modify the new member's attributes in a second batch.
Do you have any ideas why the updates would be ignored, or suggestions for things I can try?

Look at your batch in the staging table to see if any errors occurred. If ImportStatus_ID = 2, the record failed to import. You can see the reason for the failure by querying the view that shows import error details. The view will be named "stg.viw_EntityName_MemberErrorDetails".
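For example, a minimal sketch of such a query (the "Product" entity name is a hypothetical placeholder; substitute your own):
-- Inspect the error details recorded for failed staging records
SELECT *
FROM stg.viw_Product_MemberErrorDetails;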
Here is a Microsoft link for reference:
https://technet.microsoft.com/en-us/library/ff486990(v=sql.110).aspx
Hope this helps.

As suggested above, the member error details view describes the error.
Make sure you check the points below when updating members in MDS (see the sketch that follows):
1) Include the Code column in your INSERT statement.
2) Include all columns of the staging table in the INSERT query when using ImportType = 2 (otherwise the omitted columns will be updated to NULL).
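Here is a minimal sketch of such an update row (the "Product" entity and its Color and Price attribute columns are hypothetical; list whatever attribute columns your staging table actually has):
-- ImportType 2 overwrites existing members, so every attribute column is
-- listed and given its intended value; anything left out would become NULL.
INSERT INTO stg.Product_Leaf
    (ImportType, ImportStatus_ID, BatchTag, Code, Name, Color, Price)
VALUES
    (2, 0, N'Batch1', N'P-001', N'Widget', N'Blue', 9.99);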

You should insert the data into the staging table with ImportType 0 or 2 along with a BatchTag, and then run the staging stored procedure to load the data from the staging table into the entity table. The stored procedure compares the staged data with the data in the entity table based on the Code value and updates the entity table accordingly.
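As a sketch, the staging stored procedure call looks like this (reusing the hypothetical "Product" entity from above):
-- Load the staged batch into the entity; parameter values are examples.
EXEC stg.udp_Product_Leaf
     @VersionName = N'VERSION_1',
     @LogFlag     = 1,           -- 1 = log the transactions
     @BatchTag    = N'Batch1';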

You can also reset ImportStatus_ID in the stg.<EntityName>_Leaf table:
update stg.C_Leaf
set
ImportStatus_ID = 0
I think this will force the rows to be treated as ready for staging again and loaded into the MDS entity.

Using ImportType = 0 will update attributes only when the staged attribute value is not NULL; NULL values are ignored with ImportType 0, so an update that relies on them will appear to fail. Recheck the data in the entity.
If that doesn't work, please try to refresh the cache in the model and retrieve the entity details again.
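For example, a minimal sketch (the "Product" entity and its Color attribute are hypothetical placeholders):
-- With ImportType 0, the NULL Color below is ignored and the member's existing
-- Color is kept; with ImportType 2 the same NULL would overwrite the value.
INSERT INTO stg.Product_Leaf (ImportType, ImportStatus_ID, BatchTag, Code, Color)
VALUES (0, 0, N'Batch2', N'P-001', NULL);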
Learn more about import types in MDS from the link below:
https://learn.microsoft.com/en-us/sql/master-data-services/leaf-member-staging-table-master-data-services?view=sql-server-2017
Hope this helps.

Related

Copy tables into database

I imported data from Power BI into SQL Server (a screenshot showed what the imported data looks like).
Additionally I created own database with commands below:
CREATE DATABASE MY_DW
GO
USE MY_DW
GO
Now I want to copy all these tables into my database named MY_DW. Can anybody help me solve this problem and copy all the tables into my database?
Please check https://www.sqlshack.com/how-to-copy-tables-from-one-database-to-another-in-sql-server/.
This link suggests various methods to copy the data tables from one database to another.
Thanks,
Rajan
The following approach could resolve your issue:
1) In SSMS, right-click the imported database and choose Tasks -> Generate Scripts.
2) On the Introduction page, click the Next button.
3) Select the database objects to script (tables, in your case), then click Next.
4) On the "Specify how scripts should be saved" page, go to Advanced -> Types of data to script -> Schema and data, then click Next.
5) Review your selections and click Next.
6) Script generation will take place and the script will be saved; run it under the database you created, MY_DW.
Another approach:
Assuming that the databases are on the same server, the query below will create the table in your database (without constraints) and copy the data:
SELECT * INTO MY_DW.dbo.Table_Name
FROM ImportedDB.dbo.Table_Name
And if the destination table already exists, the query below will insert the data into it:
INSERT INTO MY_DW.dbo.Table_Name
SELECT * FROM ImportedDB.dbo.Table_Name
(Note the three-part names: a schema name such as dbo is needed so the database name is not misread as a schema.)
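Since the goal is to copy all the tables, here is a minimal sketch that generates one such statement per user table (assuming the same server; ImportedDB stands in for your source database name, and each schema must already exist in MY_DW):
-- Produce a SELECT ... INTO statement for each base table in the source DB,
-- then run the generated statements.
SELECT 'SELECT * INTO MY_DW.' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
     + ' FROM ImportedDB.' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) + ';'
FROM ImportedDB.INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';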
Final approach:
Assuming that the databases are on linked servers, the four-part database object naming convention applies, as below. Note that SELECT ... INTO cannot create a table through a four-part name, so run this query on the destination server to create the table there (without constraints):
SELECT * INTO MY_DW.dbo.Table_Name
FROM [SourceServer].[ImportedDB].[dbo].[Table_Name]
And if the destination table already exists, the query below will insert the data into it (INSERT does accept a four-part target, so it can run from either side):
INSERT INTO [DestinationServer].[MY_DW].[dbo].[Table_Name]
SELECT * FROM [SourceServer].[ImportedDB].[dbo].[Table_Name]

How to run SSIS packages dynamically?

We have a large production MSSQL database (mdf approx. 400 GB) and I have a test database. All the tables, indexes, views, etc. are the same in both. I need to make sure that the data in the tables of these two databases stays consistent, so I need to insert all the new rows and update all the changed rows in the test DB from production every night.
I came up with the idea of using SSIS packages to keep the data consistent by checking for updated rows and new rows in all the tables. My SSIS flow is as follows (I have a separate package in SSIS for each table). In order:
1) I get the timestamp value in the table so I can fetch only the last day's rows instead of the whole table.
2) I get the rows of the table from production.
3) Then I use the Lookup tool to compare this data with the test database table data.
4) Then I use a Conditional Split to get a clue whether the data is new or updated.
5_1) If the data is new, I insert it into the destination.
5_2) If the data is updated, I update the data in the destination table.
The data flow is in the MTRule and STBranch packages (shown in a picture, not included here).
The problem is that I am repeating this whole flow for each table, and I have more than 300 tables like this. It takes hours and hours :(
What I am asking is: is there any way in SSIS to do this dynamically?
PS: Every single table has its own columns and PK values, but my data flow schema is always the same.
You can look into BimlScript, which lets you create packages dynamically based on metadata.
I believe the best way to achieve this is to use Expressions. They empower you to set the source and destination dynamically.
One possible solution might be as follows (a sketch of the config table follows below):
1) Create a table which stores all your table names and PK columns.
2) Define a package which loops through this table and parses a SQL statement.
3) Call your main package and pass the statement to it.
4) Use the statement as the data source for your Data Flow.
5) If applicable, pass the destination table as a parameter as well (another column in your config table).
This is how I processed several really huge tables: the data had to be fetched from 20 tables and moved to one single table.
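As a sketch of the config table feeding such a loop (all names here are hypothetical):
-- Metadata table driving the dynamic packages
CREATE TABLE dbo.SyncConfig (
    TableName SYSNAME NOT NULL,
    PKColumn  SYSNAME NOT NULL
);
INSERT INTO dbo.SyncConfig (TableName, PKColumn)
VALUES (N'MTRule',   N'RuleId'),    -- example rows only
       (N'STBranch', N'BranchId');
An Execute SQL Task can read this table into an object variable, and a Foreach Loop container can then feed each row into the expressions that build the source statement and set the destination table.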
You are better off writing a stored procedure that takes the table name as a parameter and doing your CRUD there.
Then call the stored procedure from a Foreach Loop container in SSIS.
Why do you need to use SSIS at all? As suggested above, a stored procedure that takes the table name as a parameter can do the CRUD; in fact, you might be able to do everything with a stored procedure and schedule it in a SQL Agent job (see the sketch below).
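A minimal sketch of such a procedure (the database names, the LastModified change-tracking column, and the one-day window are all assumptions; tables with identity columns would additionally need IDENTITY_INSERT handling):
-- Re-syncs the rows changed in the last day from production to test for one table.
CREATE PROCEDURE dbo.usp_SyncTable
    @TableName SYSNAME,
    @PKColumn  SYSNAME
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(MAX) = N'
        -- remove the test-side copies of rows that changed in production...
        DELETE t
        FROM TestDb.dbo.' + QUOTENAME(@TableName) + N' AS t
        INNER JOIN ProdDb.dbo.' + QUOTENAME(@TableName) + N' AS p
            ON p.' + QUOTENAME(@PKColumn) + N' = t.' + QUOTENAME(@PKColumn) + N'
        WHERE p.LastModified >= DATEADD(DAY, -1, SYSUTCDATETIME());

        -- ...then copy over the current production versions (changed and new rows)
        INSERT INTO TestDb.dbo.' + QUOTENAME(@TableName) + N'
        SELECT p.* FROM ProdDb.dbo.' + QUOTENAME(@TableName) + N' AS p
        WHERE p.LastModified >= DATEADD(DAY, -1, SYSUTCDATETIME());';
    EXEC sys.sp_executesql @sql;
END
A SQL Agent job, or an SSIS Foreach Loop over the config table above, can then call EXEC dbo.usp_SyncTable once per table.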

Master Data Services deleting rows from UI

We have duplicate data in entities in Master Data Services, but not in the staging tables. How can we delete these? We cannot delete each row by hand because there are more than 100.
Did you create a view for this entity? See: https://msdn.microsoft.com/en-us/library/ff487013.aspx
Do you have access to the database via SQL Server Management Studio?
If so (a sketch follows below):
1) Write a query against the view that returns the value of the Code field for each record you want to delete.
2) Write a query that inserts the following into the staging table for that entity: the Code (from step 1), a BatchTag, and an ImportType of 4 (delete).
3) Run the import stored proc, EXEC [stg].[udp_YourEntityName_Leaf]. See: https://msdn.microsoft.com/en-us/library/hh231028.aspx
4) Run the validation stored proc. See: https://msdn.microsoft.com/en-us/library/hh231023.aspx
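A minimal sketch of steps 2 and 3 (the "Customer" entity, the view name mdm.v_Customer, and the duplicate codes are hypothetical placeholders):
-- Stage the codes to delete with ImportType 4
INSERT INTO stg.Customer_Leaf (ImportType, ImportStatus_ID, BatchTag, Code)
SELECT 4, 0, N'DeleteDupes', v.Code
FROM mdm.v_Customer AS v                  -- the view from step 1
WHERE v.Code IN (N'C-100', N'C-101');     -- the duplicate codes you identified

-- Load the batch into the entity (step 3)
EXEC stg.udp_Customer_Leaf
     @VersionName = N'VERSION_1',
     @LogFlag     = 1,
     @BatchTag    = N'DeleteDupes';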
Use ImportType 6 instead of 4: with ImportType 4, the deletion will fail if the Code you are trying to delete is referenced by a domain-based attribute in another entity. All the other steps remain the same as Daniel described.
I deleted the duplicate data from the transaction tables, which cleared the duplicates from the UI as well.
MDS comes out-of-the-box with two front-end UIs:
Web UI
Excel plugin
You can use either of them to delete multiple records easily. I'd suggest using the Excel plugin.
Are there any domain-based attributes linked to the entity you're deleting values from? If so, and the values are related to child entity members, you'll have to delete those values first.

How to copy a single view with all its data from one SQL Server database to another?

I am looking for some idea of whether we can generate a script for just one view and run it on another database to create that view with its data intact. Please help, thank you.
If your destination server is not linked with the source, getting this data out will take a few more steps. I am assuming that you only want to transport the data from the view, but the steps below could also be applied to the source table(s), making the view instantiation part unnecessary.
First, since a view does not store data (it only references data), you will need to instantiate the view into a table:
SELECT *
INTO tblNewTable -- this creates a new table from the data selected through the view
FROM dbTest.dbo.Tester;
Next, open SSMS, right-click the database, select Tasks, then Generate Scripts.
Then select the newly created table, and click Next.
You will need to select Advanced and change 'Types of data to script' to 'Schema and data' (it is 'Schema only' by default). Select Next and Finish.
SSMS will export a file, or load a new query window, with the code to create the new table plus the INSERT statements to load it exactly as it was on the source server.
Use the following as an example:
USE dbNew;
GO
CREATE VIEW dbo.ViewTest AS
SELECT * FROM dbTest.dbo.Tester;
The following code will create a table using another table; the new table will contain all the data of the previous table:
SELECT * INTO DBName1.SchemaName.NewTableName FROM DBName2.SchemaName.PreviousTableName
You can use this query to create a new table in any database and schema.

Copying tables between databases with different authentication (DB2)

Hey StackOverflow community,
My question is as follows:
I have a table, say USER_ADDR, with a bunch of columns in one database, say DB001.
I need to copy the contents of this table (based on certain criteria) to a similar table USER_ADDR (same name, yes) in another database, DB002, with a different user ID and password.
I need to do this in a stored procedure that will be executed using the .NET framework.
I tried this:
INSERT INTO "DB002".USER_ADDR (--column names--)
SELECT *
FROM "DB001".USER_ADDR
WHERE ID = "APPLICATION_NO_IN";
I get:
0: Error occurred: [IBM][DB2/NT64] SQL0204N "DB002.USER_ADDR" is an undefined name. LINE NUMBER=15. SQLSTATE=42704 : -204: IBM.Data.DB2: 42704
What am I doing wrong?
Thanks in advance
Vashist
I'm deleting my other answer after seeing the additional info about your use case; LOAD is mainly for bulk loads of large numbers of records.
In this case I'd recommend something like this: open connection1 in .NET to your data source, select the data, and hold it in a .NET DataTable. If required, you can do that SELECT in a stored proc that returns either individual column values for a single row or a cursor (row set) that contains all the columns (and rows). Then open connection2 in .NET and insert the data from the DataTable into your destination. Again, that can be done with a stored proc.
Another approach is using an external script that connects to both databases.
Doing it from just one database is not possible unless you use, as already mentioned, information integration (federation), or you export the data and then load it.
