Copy tables from an on-prem database to an Azure SQL database - sql-server

I am quite new to Logic Apps. I am trying to copy tables from an on-prem database to an Azure SQL database. Everything is set up (gateway, etc.). I tried to loop over the tables with a For each action, but I get stuck at the Insert row (V2) action.
When I run the Logic App, I get this error:
{
  "status": 400,
  "message": "A value must be provided for item. XXX",
  "error": {
    "message": "A value must be provided for item."
  },
  "source": "sql-we.azconn-we.p.azurewebsites.net"
}
I know from Insert row (V2) that I have to provide an item (the Row parameter), but I don't know how to add it from the dynamic content, because the only option it shows is "Body".
How can I add the item?

Can you include your table schema and the Logic App model you are using, so I can answer more precisely?
I assume you have only one column, named 'Body', in your on-prem DB table. Make sure the preceding Get rows (V2) action is configured so that the result contains multiple rows.
The outputs of the Get rows (V2) operation are dynamic. Usually the default parameter "value" contains the complete result set.
When the result contains more than one row, picking a row-level field automatically wraps the action in a For each loop.
Values that do not match the field data types are not listed under dynamic content.
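In code view the pattern looks roughly like the fragment below. This is a minimal sketch, assuming the source action is named Get_rows_(V2) and the loop For_each; the dataset and table segments of the path are elided placeholders. The key point is that the Row parameter (the body of Insert row (V2)) is the current loop item:

"For_each": {
  "type": "Foreach",
  "foreach": "@body('Get_rows_(V2)')?['value']",
  "actions": {
    "Insert_row_(V2)": {
      "type": "ApiConnection",
      "inputs": {
        "host": {
          "connection": { "name": "@parameters('$connections')['sql']['connectionId']" }
        },
        "method": "post",
        "path": "/v2/datasets/.../tables/.../items",
        "body": "@items('For_each')"
      }
    }
  },
  "runAfter": { "Get_rows_(V2)": [ "Succeeded" ] }
}

If the designer only offers "Body", you can still type an expression such as items('For_each') directly into the Row field via the expression editor.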

Related

Power Automate - Insert/Update rows in on-prem SQL Server table

My goal is to copy data from one SQL table to another SQL table.
I'm trying to create a conditional Insert/Update statement in Power Automate based on Row ID. I have two tables right now with the same columns.
Source SQL Table
Destination SQL Table
I would like to update a row if its Row ID already exists, or create a new one if it does not.
I tried Execute SQL query, but this is not supported (known issue).
I also tried "Transform data using Power Query" to fetch rows from Source and Destination, and then an If condition to compare Source.ProjectName = Dest.ProjectName; it goes into two Apply to each loops but still does not create items.
Nothing like searching for an answer to your specific problem and finding exactly one other person with the same issue, but no resolution.
Fortunately, I've managed to work out a decent solution for an SQL Upsert with an On-Premises SQL Connector in Power Automate.
Here's the general overview; I'll go through it step by step:
The first step is to retrieve the single row by ID using Get row (V2).
The next step is to parse the JSON body of the previous call.
Here is the Schema that I used:
{
  "type": "object",
  "properties": {
    "status": {
      "type": "integer"
    },
    "message": {
      "type": "string"
    },
    "error": {
      "type": "object",
      "properties": {
        "message": {
          "type": "string"
        }
      }
    },
    "source": {
      "type": "string"
    }
  }
}
Now the key bit: hit Configure run after for the Parse JSON action and have it run on both the success and the failure of the previous action.
Then we add a conditional that checks the status code of the Get row action (as output by the Parse JSON action). If it failed with a 404 status, we do an Insert. Otherwise, we do an Update.
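In expression form the check is short. A sketch, assuming the parse step is named Parse_JSON:

equals(body('Parse_JSON')?['status'], 404)

If this evaluates to true, the row was not found, so the If yes branch runs Insert row (V2); the If no branch runs Update row (V2).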
Hopefully this helps anyone else trying to work around the limitations of the On-Premises connector.

Power Automate populate Excel

I want the following information to be entered into an Excel table when a contract is signed: Agreement ID, Agreement Name, Creation Date, Status. And if the row already exists, only the status should change.
But it doesn't really work: it creates several rows each time.
The For each evaluates to "False" on every iteration even though the row exists.
After reproducing this on our end, we observed that adding a List rows present in a table action before the condition (it retrieves all the rows present in the table) and changing the condition to the expression below updated the changes in Excel:
array(body('List_rows_present_in_a_table')?['value']?[0]?['Agreement ID'])
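A more explicit way to build the existence check (a sketch with assumed action and column names, not the exact flow above) is to filter the listed rows on the current agreement's ID and branch on whether anything matched:

Filter array
  From:  body('List_rows_present_in_a_table')?['value']
  Where: equals(item()?['Agreement ID'], triggerBody()?['agreementId'])

Condition
  greater(length(body('Filter_array')), 0)
  If yes: Update a row (key column Agreement ID, set Status)
  If no:  Add a row into a table

Here triggerBody()?['agreementId'] stands in for wherever the signed agreement's ID comes from in your trigger.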

Azure Search - hard-code a string value into a field in an index

This one is for the MS Azure Search team, unless someone else has run into this and found a resolution for it.
I'm creating an index which is importing data from a SQL Server Database. I would like to add a field to the index whose value is just "OK" for every document. This field does not exist in the database and we do not want to add it there.
Is it possible to add a hard-coded field to an Azure Search index which auto-populates with the given string (in this case, "OK") for all documents that get imported?
Injecting constant values isn't currently possible with indexers - you would need to add this to the table, or create a SQL query that SELECTs that value, and use that query as your Azure Search datasource.
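For example, a view along these lines (table and column names are hypothetical) materializes the constant and can be used as the datasource instead of the table itself:

-- Every document imported from this view gets StatusField = 'OK'.
CREATE VIEW dbo.DocumentsForSearch AS
SELECT d.*, 'OK' AS StatusField
FROM dbo.Documents AS d;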
However, we've seen several customers ask for this, so please vote for this suggestion: Provide field mapping function to inject a constant value into the index. Thanks!

Preventing 'The data has been changed' error when using Me.Refresh with ODBC linked tables

In my Access application's data entry form I'm using Me.Refresh to show the user the new value of a calculated field based on data they entered in a non-calculated field. The purpose is to give them the deadline for doing a job next.
This Access app is using linked SQL Server 2012 tables via ODBC. The calculated field is in the underlying SQL Server table, not part of the Access Record Source query because I want to store the calculated value in the actual data, not just as an interface element.
The calculation is very simple:
nextjob = jobdate + 79
So I have the field for jobdate set to run Me.Refresh after update. All well and good.
The problem is that if the user updates jobdate, triggers the refresh by moving to another field, then returns to the jobdate field and changes the date they entered, Access throws a "The data has been changed by another user" error.
I tested the method using native Access tables and the problem does not occur. However the data needs to stay on the server, so moving to native tables is not a solution for me.
There are several possible solutions:
1- If it's always jobdate + 79, don't store it at all; use a view that has the calculated field.
2- Use Me.Requery instead of Me.Refresh. If the form shows multiple records, you must navigate back to the current record; you can use Me.Bookmark for that (see the sketch after this list).
3- Move the calculation into the Access frontend: make nextjob an ordinary column and set it in the form, so it isn't another user (the server) that updates the data.
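A minimal sketch of option 2 in the form's module, assuming the table has a numeric primary key field named ID:

Private Sub jobdate_AfterUpdate()
    Dim lngID As Long
    lngID = Me!ID                      ' remember the current record's key
    Me.Requery                         ' re-read the rows from the server
    With Me.RecordsetClone
        .FindFirst "ID = " & lngID     ' relocate the record we were editing
        If Not .NoMatch Then Me.Bookmark = .Bookmark
    End With
End Sub

Requery invalidates old bookmarks, which is why the record is found again via FindFirst before the form's bookmark is set.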

How do you delete all the documents from a Cloudant Database or even Delete the database in Bluemix

I have been working on the Bluemix demo templates and have filled up a Cloudant database with 12k documents. Now I want to start over as this is test data, and want to start with a clean database.
I found I could interactively delete documents 100 at a time, but I need to just start over with a clean database.
Is there a command or menu option for this?
I tried making a new database, which was easy, but I didn't find a way to delete the old one, or to rename the new one.
thanks
john
You could perform a DELETE request to your database:
curl -X DELETE https://username:password@username.cloudant.com/yourdatabase
The response indicates whether you succeeded.
As an alternative you can use the Dashboard (GUI). Once you are in the database panel and have selected a single database, click the settings icon and select Delete.
If you want to keep the existing database but remove the documents, you can use CouchDB's Bulk API.
Make a call to the _all_docs endpoint to retrieve the current list of all document identifiers and revisions. For each document retrieved, consisting of the _id and _rev fields, add a _deleted field with the value true.
Now send an HTTP POST to the _bulk_docs endpoint of the database with the updated document list.
POST /database/_bulk_docs HTTP/1.1
Content-Type: application/json

{
  "docs": [
    {
      "_id": "some_document_id",
      "_rev": "1-6a466d5dfda05e613ba97bd737829d67",
      "_deleted": true
    },
    ...
  ]
}
Full details on the Bulk API are here.
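Putting the two calls together from the command line, a sketch (assuming jq is installed; the credentials and database name are placeholders):

# Build the deletion list from _all_docs, then post it to _bulk_docs.
curl -s https://username:password@username.cloudant.com/yourdatabase/_all_docs \
  | jq '{docs: [.rows[] | {_id: .id, _rev: .value.rev, _deleted: true}]}' \
  > bulk_delete.json

curl -X POST https://username:password@username.cloudant.com/yourdatabase/_bulk_docs \
  -H "Content-Type: application/json" \
  -d @bulk_delete.json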
Deleted documents will remain in your database until you run the purge command.
If you want to do this within the dashboard, select the database you want, then select the settings cog by the DB name. You'll see a Delete option.
