Incrementally load data from SQL Server views into D365 - sql-server

I'm implementing an SSIS package where I want to incrementally load data from SQL Server to D365. From source to staging, we load data into tables every 15 minutes, with incremental changes identified by the DATELASTMAINT (last maintenance date) column. We have created a few views on top of these tables, and we load data from these views into D365 entities.
In this workflow, however, we want to load only the incremental data into D365, because the full INSERTs and UPDATEs take a long time. We are using KingswaySoft for the D365 connection.
I tried a couple of scenarios to fetch the data incrementally, but couldn't get them to work. What is the best way to incrementally fetch data from views (which are based on multiple tables) and push that data into D365?
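One pattern that tends to work here (a minimal sketch; every table, view, and column name other than DATELASTMAINT is hypothetical) is to expose the greatest DATELASTMAINT of the view's base tables as a single change-marker column, and keep a watermark table that the extraction query filters against:

```sql
-- Hypothetical watermark table: last successful load time per D365 entity.
CREATE TABLE etl.LoadWatermark (
    EntityName   sysname      NOT NULL PRIMARY KEY,
    LastLoadedAt datetime2(3) NOT NULL
);
GO

-- Expose the most recent DATELASTMAINT of all joined base tables as one column.
CREATE OR ALTER VIEW dbo.vw_CustomerForD365
AS
SELECT  c.AccountNum,
        c.Name,
        a.Street,
        a.City,
        -- Greatest maintenance date across the base tables in the join.
        (SELECT MAX(d)
         FROM (VALUES (c.DATELASTMAINT), (a.DATELASTMAINT)) AS v(d)) AS ChangeDate
FROM    dbo.CustomerStaging AS c
JOIN    dbo.AddressStaging  AS a ON a.AccountNum = c.AccountNum;
GO

-- Source query for the SSIS data flow: only rows changed since the last run.
SELECT  v.*
FROM    dbo.vw_CustomerForD365 AS v
WHERE   v.ChangeDate > (SELECT LastLoadedAt
                        FROM   etl.LoadWatermark
                        WHERE  EntityName = N'Customer');
```

Once the KingswaySoft destination succeeds, an Execute SQL Task can advance etl.LoadWatermark to the maximum ChangeDate just loaded, so the next run only picks up newer rows. Pairing this with an upsert-style write in the destination (KingswaySoft's D365 destination supports an Upsert action) avoids having to split INSERTs and UPDATEs yourself.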

Related

Load data from one Snowflake database into another within the same account cluster

Can you suggest an approach to load data from one Snowflake (SF) database into another SF database within the same cluster?
I have to:
Perform data transformations and incremental loads while loading into the destination SF table
Schedule the load like an ETL job
Thanks,
Nikhil
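Not an authoritative answer, but a minimal Snowflake SQL sketch of a scheduled, transforming, incremental load between two databases in the same account (all database, schema, table, and warehouse names are hypothetical; the one-hour sliding window is a simplification, and a stream on the source table would be a more robust change marker):

```sql
-- Hourly task that upserts changed rows from src_db into dest_db,
-- applying a simple transformation on the way through.
CREATE OR REPLACE TASK dest_db.etl.load_orders
  WAREHOUSE = etl_wh
  SCHEDULE  = 'USING CRON 0 * * * * UTC'   -- run at the top of every hour
AS
MERGE INTO dest_db.public.orders AS tgt
USING (
    SELECT order_id,
           UPPER(customer_name) AS customer_name,            -- example transformation
           order_ts
    FROM   src_db.public.orders
    WHERE  order_ts > DATEADD(hour, -1, CURRENT_TIMESTAMP()) -- incremental window
) AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE
    SET customer_name = src.customer_name,
        order_ts      = src.order_ts
WHEN NOT MATCHED THEN INSERT (order_id, customer_name, order_ts)
    VALUES (src.order_id, src.customer_name, src.order_ts);

-- Tasks are created suspended, so switch the schedule on.
ALTER TASK dest_db.etl.load_orders RESUME;
```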

Tableau incremental refresh from Snowflake

I have a question regarding incremental refresh from Snowflake to Tableau. I know Tableau has a feature for incremental refresh/incremental extracts, but can it be used for incremental loads from Snowflake? And how does it work?
The reason I'm asking is that query folding, which other BI tools on the market use for incremental refreshes, isn't possible with Snowflake.
Thanks!
/P
Tableau incremental refreshes work the same for Snowflake as they do for other databases.
"Query folding" looks like a Microsoft (and specifically Power BI) term. According to this article https://exceleratorbi.com.au/how-query-folding-works/ "query folding" is the process of pushing the workload down to the database, which is what Tableau does when querying Snowflake tables directly.
With Snowflake I would recommend querying the tables directly, as they are already set up in columnar format, and you avoid moving the data to a Tableau Server and waiting on refreshes. Snowflake has effectively unlimited storage, whereas you might be limited by your Tableau Server.
If you need the tables in Snowflake to only show data as of a point in time, there are different ways you could accomplish this, including:
- Preset date filters (or parameters as filters within Tableau) that are pushed down to Snowflake
- Using Tasks in Snowflake to run at a specific time to:
  - Clone your tables, and use the clones for reporting (see the sketch after this list)
  - Update existing reporting tables
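A minimal sketch of the clone-and-report option from the list above (all object names are hypothetical); zero-copy cloning makes this cheap in Snowflake:

```sql
-- Daily task that rebuilds a stable "as of 06:00 UTC" snapshot for reporting.
CREATE OR REPLACE TASK reporting.etl.refresh_sales_snapshot
  WAREHOUSE = reporting_wh
  SCHEDULE  = 'USING CRON 0 6 * * * UTC'
AS
CREATE OR REPLACE TABLE reporting.public.sales_snapshot
    CLONE analytics.public.sales;

ALTER TASK reporting.etl.refresh_sales_snapshot RESUME;
```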
I agree with Chris' answer except for avoiding extracts on Tableau Server. There can be a lot of performance gains from using Tableau extracts; we run extracts out of Snowflake for most of our data sources, and we test both live connections and extracts for each source to see which performs best. If timing is an issue, extracts can be refreshed as often as every 15 minutes, which is the most frequent schedule available.
To get extracts loaded and refreshing, use the following steps:
1. Switch your data source to an extract in Tableau Desktop. This creates a local copy of the data that will be used when publishing.
2. Select Server > Publish Workbook.
3. In the Publish settings, choose your refresh schedule and publish to Tableau Server. The workbook and data source will be loaded to the Server.
You can also update the refresh schedules directly in Server by navigating to the new data source and going to the Extract Refreshes tab. If you don't have the correct schedule available, you can create one in the Schedules menu for the site.

Long-running view in SSAS Tabular

I have a SQL Server database where we have created some views based on dim and fact tables, and I need to build an SSAS tabular model from these tables and views. One of the views, however, takes 1.5 hours to run as a SQL query (in SSMS). I need this same view for my tabular model, and 1.5 hours is not acceptable. The view is made up of more than 10 table joins and a lot of WHERE conditions.
1) Can I bring all the tables used in this view into my SSAS tabular model? I am not sure how to join them all and apply the WHERE clauses inside SSAS to build something similar to my view. Is that possible? If yes, how?
or
2) If I build the SSAS model from that view once, what is the best way to incrementally load the data daily after that?
The best option is to set up a proper ETL process. That is:
Extract the tables from your source SQL database into a new SQL database that you control.
Transform the data into a star schema.
Load the data from the star schema into SSAS.
On SQL Server, the most common approach is to use SSIS packages for data extraction, movement, and orchestration, and SQL Server Agent jobs for scheduling.
To answer your questions:
Yes, it is certainly possible to bring all of the tables from your source system directly into your tabular model, but please don't do this! You will only create problems for yourself later on when creating DAX calculations. More information here.
Incremental loading is something you decide per table imported into your tabular model. Again, this is much easier with a proper star schema: you would typically run full processing on all your dimension tables and incremental processing only on the largest fact tables.
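To illustrate that second point, here is a minimal T-SQL sketch of an incremental fact load in the star schema (every table and column name is hypothetical); the tabular model then only has to process what this appends:

```sql
-- Insert only staging rows newer than the fact table's current high-water mark.
DECLARE @watermark datetime2(3) =
    (SELECT ISNULL(MAX(LoadDate), '19000101') FROM dw.FactSales);

INSERT INTO dw.FactSales (DateKey, CustomerKey, ProductKey, Amount, LoadDate)
SELECT  d.DateKey,
        c.CustomerKey,   -- surrogate keys resolved from the dimensions
        p.ProductKey,
        s.Amount,
        s.ModifiedDate
FROM    staging.Sales  AS s
JOIN    dw.DimDate     AS d ON d.FullDate     = CAST(s.OrderDate AS date)
JOIN    dw.DimCustomer AS c ON c.CustomerCode = s.CustomerCode
JOIN    dw.DimProduct  AS p ON p.ProductCode  = s.ProductCode
WHERE   s.ModifiedDate > @watermark;
```

On the SSAS side, the dimensions would then get a Process Full while the fact table gets a Process Add (or a Process Data on just its most recent partition), so the 1.5-hour view never has to be evaluated in full again.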

SSIS: Loading fact table IDs (looking up dimension IDs) and measure data from an Excel file

I'm having some trouble loading a fact table (Fact_Servicio) of a star schema I made in SQL Server; here is the diagram:
All the IDs are identity columns.
Our case is the following: we have a Service Desk application that produces daily reports, and we want to use this data for business intelligence by creating a data mart (star schema), populating it with this data, and then displaying it through Power BI.
Current problem: our issue is in the ETL process with SSIS. After creating the database in SQL Server, we made an SSIS package to populate all the data from the Excel file into this star schema. We start by populating the dimensions, and afterwards we attempt to populate the fact table Fact_Servicio. However, we don't know exactly how to take the IDs of each dimension, join them with the fields we need from the Excel file (our previously defined measures), and then insert everything into the fact table.
We tried using the Lookup transformation, but we cannot match any dimension ID with any column in the Excel file, because these IDs are created in the database and are autogenerated per record. (The Lookup needs to match the columns we retrieved against columns in the Excel file, but the Excel file has no ID columns, and we would rather not create fields for them, because that would be a repetitive manual task every time we export the data from the Service Desk software.)
Here are some pictures of our SSIS package structure:
Control Flow View:
Data Flow View of the Fact Table:
Look Up View:
Connection Tab
Columns Tab
Here is where we can't match columns, because the IDs are created in the database.
If there is another way to do this data load, please go ahead and propose how you would do it; otherwise, what can we fix here, or which transformations from the toolbox can we use? We were also thinking about loading the big Excel sheet into one big staging table in SQL Server and working from there, but we aren't sure whether we would gain any advantage by doing this.
Thank you all!
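For what it's worth, the staging-table idea at the end of the question is a common way out of this: land the raw sheet in one staging table, then resolve the dimension surrogate keys with set-based joins on the business keys, i.e. the same descriptive columns that were used to populate the dimensions. A minimal sketch, in which every name except Fact_Servicio is hypothetical:

```sql
-- Assumes the Excel sheet has already been bulk-loaded into staging.ServiceDeskExport
-- (e.g., by a plain SSIS Excel-to-OLE DB data flow with no lookups at all).
INSERT INTO dbo.Fact_Servicio (ClienteID, TecnicoID, TiempoID, HorasServicio, CostoServicio)
SELECT  dc.ClienteID,        -- surrogate keys come from the dimension rows,
        dt.TecnicoID,        -- matched on their natural/business keys
        dd.TiempoID,
        s.HorasServicio,     -- measures taken straight from the staged sheet
        s.CostoServicio
FROM    staging.ServiceDeskExport AS s
JOIN    dbo.Dim_Cliente AS dc ON dc.NombreCliente = s.NombreCliente
JOIN    dbo.Dim_Tecnico AS dt ON dt.NombreTecnico = s.NombreTecnico
JOIN    dbo.Dim_Tiempo  AS dd ON dd.Fecha         = CAST(s.Fecha AS date);
```

The same principle rescues the pure data-flow design: point each Lookup at the business-key columns (not the identity IDs), and map the surrogate ID the Lookup returns into the fact destination.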

Sync two databases using the CDC component in SSIS

I am trying to keep two databases in sync with each other using the CDC (Change Data Capture) components in SSIS (for VS 2013). So far I have created an SSIS package which transfers all the data from one database to the other. However, I am having some difficulty with the incremental loading of records.
The problem is that I have about 30 tables in my database. Although I can incrementally move data for a single table as shown below, it is really time-consuming to create this process for each individual table. Is there any workaround to apply the same procedure to all tables automatically?
Figure 1: Incremental loading for one table.
Figure 2: For each table I have to write the UPDATE and DELETE commands manually, as depicted.
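One part that can be automated outright is the table-level CDC setup: rather than enabling CDC table by table, a short T-SQL loop (sketch below, assuming default capture instances and no gating role) can enable it for every user table that is not yet tracked:

```sql
-- Enable CDC at the database level once.
EXEC sys.sp_cdc_enable_db;

-- Then enable table-level capture for every untracked user table.
DECLARE @schema sysname, @table sysname;

DECLARE table_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT s.name, t.name
    FROM   sys.tables  AS t
    JOIN   sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE  t.is_tracked_by_cdc = 0
      AND  t.is_ms_shipped = 0
      AND  s.name <> N'cdc';          -- skip CDC's own bookkeeping tables

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @schema, @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sys.sp_cdc_enable_table
         @source_schema = @schema,
         @source_name   = @table,
         @role_name     = NULL;       -- NULL = no gating role
    FETCH NEXT FROM table_cursor INTO @schema, @table;
END
CLOSE table_cursor;
DEALLOCATE table_cursor;
```

This only removes the clicking for the capture setup; the 30 per-table data flows still exist, which is why repetitive SSIS packages like these are often generated from a template with BIML instead of being built by hand.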
