We are trying to insert entities into a local storage table using an Azure Logic Apps action. We are using the Microsoft Azure Storage Emulator for storage, but we could not find a built-in action in Azure Logic Apps to insert entities into a table. As of now we can only see the options below:
Delete table (preview)
Create table (preview)
List tables (preview)
Is there any option to insert entities into a local storage table?
You will find an Insert Entity action under the Azure Table Storage connector; search for it in the designer's action picker.
Here I used the Consumption plan while creating the logic app.
I want to create a daily process that reloads all rows from table A into table B. Over time, table A's rows will change due to changes in the source system and also because of aging/deletion of records in the origin table. Table A gets truncated/reloaded daily in step 1. Table B is the master table that just gets new/updated rows.
From a historical point of view, I want to keep track of ALL the rows in table B and be able to do a point-in-time comparison for analytics purposes.
So I need to do two things: daily, insert rows from table A into table B if they don't exist, and also create a new record in table B if the record already exists but ANY of the columns have changed. At one point I attempted to use temporal tables, but I had too many false positives on 'real' changes; certain columns were throwing things off because a date/time column being updated was the only real change in the row.
I'm using an Azure SQL Managed Instance database (Microsoft SQL Azure (RTM) - 12.0.2000.8).
At my disposal I have SSMS, SQL Server and also Azure Data Factory.
Any suggestions on the best way to do this or tools to help with this?
There are two approaches, either of which you can implement:
Temporal table
Change Data Capture (CDC)
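For reference, option 1 looks like this in T-SQL; a minimal sketch, with placeholder table and column names:

-- A system-versioned (temporal) table: every update automatically
-- preserves the prior version of the row in the history table.
CREATE TABLE dbo.TableB
(
    Id        INT           NOT NULL PRIMARY KEY CLUSTERED,
    SomeValue NVARCHAR(100) NULL,
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo   DATETIME2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.TableB_History));

-- Point-in-time comparison for analytics (date is a placeholder):
SELECT * FROM dbo.TableB FOR SYSTEM_TIME AS OF '2023-01-01T00:00:00';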
CDC is the more commonly used approach: you create an Azure Data Factory pipeline that loads delta data, based on the change data capture (CDC) information in the source Azure SQL Managed Instance database, to Azure Blob storage.
To implement CDC, you can follow this simple Microsoft tutorial: Incrementally load data from Azure SQL Managed Instance to Azure Storage using change data capture (CDC)
Note: you also need to create a storage account; this is required but not covered in the tutorial above.
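The SQL-side setup in the tutorial boils down to enabling CDC on the database and on the source table; a minimal sketch, with placeholder schema/table names:

-- Enable CDC at the database level, then on the table the pipeline reads
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'TableA',
    @role_name     = NULL;  -- NULL = no gating role on the change data

-- The Data Factory pipeline then reads changes via the generated function,
-- e.g. cdc.fn_cdc_get_all_changes_dbo_TableA(@from_lsn, @to_lsn, 'all')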
In our SQL tables we have columns such as UpdatedBy and CreatedBy with a reference key to a User table. This is useful for keeping track of who created/updated a business entity.
However, we are migrating from this local User table to Azure AD, which we will use for authentication and authorization in our client applications.
There should be no need for a local User table (or any other identity-related tables, such as Role, etc.), but then how do I reference user IDs from Azure AD in my audit columns? Obviously I can no longer have a foreign key constraint.
What is the usual approach to this?
Have you read about System for Cross-Domain Identity Management (SCIM)? It provides an out-of-the-box mechanism for syncing users and groups created in Azure AD.
There is a whole tutorial about how to do that:
https://learn.microsoft.com/en-gb/azure/active-directory/app-provisioning/use-scim-to-provision-users-and-groups#step-4-integrate-your-scim-endpoint-with-the-azure-ad-scim-client
There is a ready-to-go CRUD- and SCIM-based reference application in C# created by Microsoft: https://github.com/AzureAD/SCIMReferenceCode
That will also show you what to do when, for example, Azure AD removes a user but you want to keep it.
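In database terms, the SCIM-provisioned users can land in a small shadow table keyed by the Azure AD object id, so your existing CreatedBy/UpdatedBy foreign keys keep working. A minimal sketch, with placeholder names (this table is not part of the reference code):

-- Local shadow of Azure AD users, populated by the SCIM provisioning flow.
-- The soft-delete flag lets rows referenced by audit columns survive
-- the user being removed from Azure AD.
CREATE TABLE dbo.[User]
(
    ObjectId    UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,  -- Azure AD object id
    DisplayName NVARCHAR(256)    NOT NULL,
    IsActive    BIT              NOT NULL DEFAULT (1)   -- set to 0 on SCIM delete
);

-- Audit columns then reference the shadow table as before, e.g.:
-- ALTER TABLE dbo.Invoice ADD CONSTRAINT FK_Invoice_CreatedBy
--     FOREIGN KEY (CreatedBy) REFERENCES dbo.[User] (ObjectId);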
I have an empty search service created in my Azure portal. What I need to do is: create an index, fill it with data from one table in a SQL Server instance hosted on an Azure VM (so not an Azure SQL database itself, but a database hosted on an Azure VM), and create an indexer so this data is pulled into the index when changes happen.
I'm not understanding what the exact order should be for creating these items (Index, Indexer, DataSource) and how to tie the Indexer to the Index. I'll be making API calls, since I can't seem to use the Azure portal for importing data from a SQL Server hosted on an Azure VM. Direction on how to make these API calls would be great as well.
Generally speaking, the logical ordering would be Index > DataSource > Indexer.
However, you could potentially flip Index and DataSource.
An Index is where your search data is stored and is the entity against which you'll perform queries.
DataSource describes the location of the data you want to pull into an Index; for SQL, it names the table or view holding the rows that should be included in your Index.
The Indexer is the glue between these two things: it performs, optionally on a schedule, the pull from a DataSource into an Index.
This is the reason that DataSource and Index are interchangeable in the ordering. A DataSource has no concept of (and no need to know about) which Index its data might end up in. You could even use one DataSource against multiple Indexes if you wanted; each would just need its own Indexer that links the DataSource to that Index on a particular schedule.
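On the SQL side, the usual way to control exactly which rows and columns a DataSource pulls is to point it at a view; a minimal sketch, with placeholder names:

-- A view shapes exactly what the indexer will pull into the Index
CREATE VIEW dbo.SearchableProducts
AS
SELECT Id, Name, Description, LastModified
FROM dbo.Products
WHERE IsPublished = 1;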
A typical workflow is described by Microsoft at: https://learn.microsoft.com/en-us/rest/api/searchservice/Indexer-operations
Please take a look at these articles:
Connecting Azure SQL Database to Azure Search using indexers
Configure a connection from an Azure Search indexer to SQL Server on an Azure VM
You can configure data sources and indexers directly in the Azure portal by using the Import data wizard, but it's still good to go over the more in-depth articles linked above.
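For the "pull changes as they happen" part, the indexer can use SQL Server's integrated change tracking, which must first be switched on in the source database; a minimal sketch, with placeholder database/table names:

-- Turn on integrated change tracking so the indexer reads only changed rows
ALTER DATABASE MyDatabase
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Products
ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = OFF);

The data source definition then names the SQL integrated change tracking policy as its change-detection policy, as covered in the indexer articles above.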
I have an existing Azure web application backed by an Azure SQL database. My plan is to use this same database in new mobile applications I am building; however, I originally made a design decision that doesn't work with Azure Mobile Services. I made the keys in the existing database integer IDs, and as of recently, to use a database with the Mobile Service, the IDs need to be string GUIDs. I have over 200 users in my existing database, with other associated tables all tied to these IDs.
My question is: is there a feature or methodology for converting all of these integer keys to string keys without dropping all of the data and requiring everyone to manually set things up again?
My database knowledge is limited, but from all I've seen, Azure Mobile Services now requires that the keys be strings and there isn't a workaround for it.
Any help is much appreciated, Thanks!
To change the datatype of a column in SQL, run the following command:

ALTER TABLE table_name
ALTER COLUMN column_name column_type

For example, assume your table name is table1 and the column is called keys. A GUID in string form is 36 characters, so:

ALTER TABLE table1
ALTER COLUMN keys NVARCHAR(36)
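That command will fail as-is if the column participates in a primary key or is referenced by foreign keys, which is the case for your user IDs. The constraints have to be dropped and recreated around the type change. A sketch of the fuller sequence, with placeholder table and constraint names:

-- Drop the FK on the referencing table, then the PK on the key table
ALTER TABLE dbo.Orders DROP CONSTRAINT FK_Orders_Users;
ALTER TABLE dbo.Users  DROP CONSTRAINT PK_Users;

-- Widen both sides; existing integer values convert to their string form
ALTER TABLE dbo.Users  ALTER COLUMN Id     NVARCHAR(36) NOT NULL;
ALTER TABLE dbo.Orders ALTER COLUMN UserId NVARCHAR(36) NULL;

-- Recreate the keys against the new type
ALTER TABLE dbo.Users  ADD CONSTRAINT PK_Users PRIMARY KEY (Id);
ALTER TABLE dbo.Orders ADD CONSTRAINT FK_Orders_Users
    FOREIGN KEY (UserId) REFERENCES dbo.Users (Id);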
I would like to be able to store the tracking tables in a different database than the original, for a couple of reasons:
I would like to be able to drop them on demand if I change versions of my application.
I would like to have multiple sync scopes separated by user permissions.
I am sure there is a way through the SqlMetadataStore class, but I have not found it yet.
The SqlMetadataStore will not help you in any way with what you're trying to achieve; I'm pretty sure it's not in any way exposed in the database sync providers you're using.
Note that the tracking tables are not the only objects Sync Framework provisioning creates: you will have triggers, tracking tables, stored procedures, and user-defined table types. You're not supposed to drop them separately or even drop them yourself; you should be using the deprovisioning API.
Now, if you really want to have the tracking tables in a separate database, the provisioning API has a Script method that can generate the SQL statements required to create the Sync Framework objects.
You can alter that script to create the tracking tables in another database, but you will have to alter the triggers as well so that they insert into this other database.
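For illustration only, a redirected insert trigger could look roughly like this; a heavily simplified sketch with placeholder names, since the real generated tracking tables and triggers carry more columns and logic:

-- Simplified stand-in for a Sync Framework insert trigger whose
-- tracking table lives in a separate database
CREATE TRIGGER dbo.Orders_insert_trigger
ON dbo.Orders
AFTER INSERT
AS
BEGIN
    INSERT INTO SyncMetadataDb.dbo.Orders_tracking (Id, last_change_datetime)
    SELECT i.Id, GETUTCDATE()
    FROM inserted AS i;
END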