Azure Search - Error detecting index schema from data source

I created a data source in Azure Search via the REST API. I used the API instead of the portal because I have a rowversion data type that isn't handled in the portal yet. I am able to view the data source in the portal.
When I try to import the data source into an index, I get the following error:
"Error detecting index schema from data source: 'Data source payload should specify at least one of datasource name and type'"
What am I missing here?
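
For reference, a data source created through the REST API has to carry at least a name and a type in its payload, which is exactly what the error is complaining about. Below is a minimal sketch of the create call, assuming an Azure SQL source and the 2020-06-30 api-version; the service name, admin key, connection string, and column names are all placeholders:

import requests

# Placeholders -- substitute your own service, admin key, and connection string.
SERVICE = "https://<your-service>.search.windows.net"
HEADERS = {"Content-Type": "application/json", "api-key": "<admin-key>"}

# The payload must specify at least "name" and "type";
# "azuresql" is an assumed example type here.
datasource = {
    "name": "my-sql-datasource",
    "type": "azuresql",
    "credentials": {"connectionString": "<connection-string>"},
    "container": {"name": "MyTable"},
    # High water mark change detection on the rowversion column,
    # matching the scenario in the question.
    "dataChangeDetectionPolicy": {
        "@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
        "highWaterMarkColumnName": "RowVersion",
    },
}

resp = requests.put(
    f"{SERVICE}/datasources/{datasource['name']}",
    params={"api-version": "2020-06-30"},
    headers=HEADERS,
    json=datasource,
)
resp.raise_for_status()

If a payload like this succeeds, the data source itself is fine, and the message points at whatever payload the import wizard sends when it references the data source.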

Related

Getting "Data Not Available" error while importing BCFKS certificate into Salesforce

I am getting this error while importing the BCFKS file into the Salesforce system.
Data Not Available
The data you were trying to access could not be found. It may be due to another user deleting the data or a system error. If you know the data is not deleted but cannot access it, please look at our support page.

Invalid schema definition: Could not parse schema when creating PubSub subscription

I would like to create a subscription with the delivery type "Write to BigQuery" and the "Use topic schema" and "Write metadata" options.
So I created a topic with the following protobuf schema:
and a BigQuery table with the additional metadata columns specified in the docs.
When I try to create the subscription with both "Write metadata" and "Use topic schema" enabled, I get the following error:
API returned error: "Invalid schema definition: Could not parse schema."
Using just "Write metadata" or just "Use topic schema" works fine, and I am able to create the subscription and receive messages.
This appears to be an internal issue with how some nested records are processed. Please follow https://issuetracker.google.com/issues/267444753 for the investigation and fix.
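
For anyone trying to reproduce this, a topic schema with a nested record can be created like so. This is a minimal sketch with an invented Customer/Address message (not the OP's schema), and the project and schema IDs are placeholders:

from google.cloud.pubsub import SchemaServiceClient
from google.pubsub_v1.types import Schema

project_id = "my-project"    # placeholder
schema_id = "nested-schema"  # placeholder

# A nested message of the kind the linked issue tracker entry is about.
proto_source = """
syntax = "proto3";
message Customer {
  message Address {
    string street = 1;
    string city = 2;
  }
  string name = 1;
  Address address = 2;
}
"""

schema_client = SchemaServiceClient()
schema = Schema(
    name=schema_client.schema_path(project_id, schema_id),
    type_=Schema.Type.PROTOCOL_BUFFER,
    definition=proto_source,
)
result = schema_client.create_schema(
    request={
        "parent": f"projects/{project_id}",
        "schema": schema,
        "schema_id": schema_id,
    }
)
print(result)

The nested Address field is the kind of structure the linked issue describes as triggering the parse failure when both options are enabled.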

Azure Data Factory API config basic

Hi, I am trying to connect to an HR system to simply pull down some data and copy it into Azure SQL Database (ASDB).
I have managed to get it to work in Excel using the code below in the Advanced Editor, but I don't even know where to start in terms of where to put this information in Azure Data Factory.
let
    url = "https://api.peoplehr.net/Query",
    body = "{""APIKey"": ""ENTER API KEY HERE"",""Action"": ""GetQueryResultByQueryName"",""QueryName"": ""ENTER QUERY NAME HERE""}",
    Source = Json.Document(
        Web.Contents(
            url,
            [
                Headers = [#"Content-Type" = "application/json"],
                Content = Text.ToBinary(body)
            ]
        )
    )
in
    Source
My understanding is that this is a REST API, and I have an API key. I am very new to ADF and to APIs in general; I have spent days on Google trying to find a solution that works.
There is a guide here: https://help.peoplehr.com/en/articles/2492019-people-queries-and-excel-power-bi
It describes how to do this in Excel, which is basically what I want to replicate in Azure Data Factory, then create a table in a SQL database.
Thanks in advance.
I have used the ingest wizard in ADF with the "Copy" function and the source set to REST.
The options on the source only allow me to pass auth headers, not additional headers and a body (hope that makes sense).
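
For context, the M query above boils down to a single POST with a JSON body and a Content-Type header. Here is a sketch of the same call in Python (the API key and query name placeholders are kept from the M version); this is the request the ADF REST source has to reproduce:

import requests

# Same placeholder values the Power Query version asks for.
API_KEY = "ENTER API KEY HERE"
QUERY_NAME = "ENTER QUERY NAME HERE"

resp = requests.post(
    "https://api.peoplehr.net/Query",
    headers={"Content-Type": "application/json"},
    json={
        "APIKey": API_KEY,
        "Action": "GetQueryResultByQueryName",
        "QueryName": QUERY_NAME,
    },
)
resp.raise_for_status()
print(resp.json())

In a Copy activity with a REST source, these pieces map onto the source settings rather than the auth headers: set the request method to POST and supply the JSON above as the request body (the REST source exposes requestMethod, requestBody, and additionalHeaders properties for this).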

Log Logic Apps error messages to Azure storage file

I have some steps in my Logic App, e.g. Parse JSON. If they fail, I can see the reason for the failure when I open the step in Logic Apps, e.g. a string instead of an integer.
How can I log these error messages to my Azure storage account?
The dynamic content dialogue box doesn't offer the error messages.
I have created a storage account, created files, populated them with a string, and put them into the storage account. I just need to get hold of the error message.
I will be processing JSON from HTTP requests. If the JSON is invalid, i.e. does not conform to the expected schema, I need the error logged so people can query it with the provider of the data.
If you just want to log the run's error message, it does not need to be so involved; you can implement it with outputs alone.
Set the Create blob action to run after the Parse JSON action fails. The blob content can be outputs('Parse_JSON')['errors'], or, if you just want the error message itself, outputs('Parse_JSON')['errors'][0]['message'].
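
If you ever need the same validate-and-log pattern outside Logic Apps, here is a rough Python equivalent; jsonschema stands in for the Parse JSON action and azure-storage-blob for Create blob, and the connection string, container, and schema are placeholders:

import json

from azure.storage.blob import BlobClient
from jsonschema import ValidationError, validate

# Placeholders -- substitute your own storage account and schema.
CONN_STR = "<storage-connection-string>"
EXPECTED_SCHEMA = {
    "type": "object",
    "properties": {"id": {"type": "integer"}},
    "required": ["id"],
}

def handle_request(raw_body):
    try:
        payload = json.loads(raw_body)
        validate(instance=payload, schema=EXPECTED_SCHEMA)  # Parse JSON equivalent
    except (ValueError, ValidationError) as err:
        # Create blob equivalent: persist the message so it can be
        # queried with the provider of the data later.
        blob = BlobClient.from_connection_string(
            CONN_STR, container_name="errors", blob_name="last-error.txt"
        )
        blob.upload_blob(str(err), overwrite=True)
        raise

# A string where an integer is expected, like the failure in the question.
handle_request('{"id": "not-an-integer"}')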

Where to find the OSB Business service configuration details in the underlying database?

In the OSB layer, when an endpoint URI is changed, I need to alert the core group that the endpoint has changed so they can review it. I tried SLA alert rules, but they have no option for this. My question is: the endpoint URI should be saved somewhere in the underlying database. If so, what are the schema and table name to query it?
The URI, or in fact any other part of an OSB artifact, is not stored in a relational database but rather kept in memory in its original XML structure. It can only be accessed through the dedicated session management API. The interfaces you will need are part of the com.bea.wli.sb.management.configuration and com.bea.wli.sb.management.query packages. Unfortunately it is not as straightforward as it sounds. In short, to extract the URI information you will need to (see the WLST sketch after this list):
Create a session instance (SessionManagementMBean)
Obtain an ALSBConfigurationMBean instance that operates on the SessionManagementMBean
Create a query object instance (BusinessServiceQuery) and run it on the ALSBConfigurationMBean to get a Ref object to the OSB artifact of your interest
Invoke getServiceDefinition on your Ref object to get the XML service definition
Extract the URI from the XML service definition with XPath
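
A rough WLST (Jython) sketch of those steps follows. This is an assumption-laden sketch, not tested code: the MBean names follow the OSB 11g API reference, and the URL, credentials, and session name are placeholders.

from com.bea.wli.sb.management.configuration import ALSBConfigurationMBean
from com.bea.wli.sb.management.configuration import SessionManagementMBean
from com.bea.wli.sb.management.query import BusinessServiceQuery

connect('weblogic', 'welcome1', 't3://localhost:7001')  # placeholders
domainRuntime()

# 1. Create a session instance (SessionManagementMBean)
sessionName = 'uri-check-session'
sessionMBean = findService(SessionManagementMBean.NAME, SessionManagementMBean.TYPE)
sessionMBean.createSession(sessionName)

# 2. Obtain the ALSBConfigurationMBean that operates on that session
alsbMBean = findService(ALSBConfigurationMBean.NAME + '.' + sessionName,
                        ALSBConfigurationMBean.TYPE)

# 3. Query all business services and get Refs to them
refs = alsbMBean.getRefs(BusinessServiceQuery())

# 4./5. Fetch each XML service definition; the endpoint URI can then be
# pulled out of the XML with XPath
for ref in refs:
    print ref, alsbMBean.getServiceDefinition(ref)

# Read-only work, so discard the session instead of activating it
sessionMBean.discardSession(sessionName)

Since this only reads configuration, the point from the answer below applies: you could skip the session handling entirely and use the session-less ALSBConfigurationMBean instance.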
The downside of this approach is that you are basically polling the configuration each time you want to check whether anything has changed.
More information, including Java/WLST examples, can be found in the Oracle Fusion Middleware Java API Reference for Oracle Service Bus.
There is also a good blog post describing OSB customization with WLST: ALSB/OSB customization using WLST.
The information about services and all their properties can be obtained via the Java API. The API documentation contains sample code, so you can get it up and running quite quickly; see the "Querying resources" paragraph when following the given link.
We use the API to read the configuration of services (both proxy and business) and for simple management.
As long as you only read the properties, you do not need to handle management sessions. Once you change values, you need to start a session and activate it once you are done -- a very similar approach to the Service Bus console.
