Unable to send AWS IoT MQTT messages into AWS Kinesis Firehose in batch mode - aws-iot

This is my MQTT payload:
{
  "type": "1",
  "data": [
    {
      "customer_org_id": "777233",
      "vdms_id": "2",
      "vendor_id": "2",
      "remote_session_type": "open",
      "timestamp": "125434115",
      "email": "rajweeeeeath@acdercessonline.io",
      "network": "iptv"
    },
    {
      "customer_org_id": "555777233",
      "vdms_id": "5552",
      "vendor_id": "5552",
      "remote_session_type": "555open",
      "timestamp": "1255555434115",
      "email": "rajweeeeeat5555h@acdercessonline.io",
      "network": "iptv5555"
    }
  ]
}
Here is my IoT SQL rule:
SELECT (SELECT * FROM data) AS messages FROM 'test/fairfield' WHERE type = '1'
I am trying to generate a JSON array so that I can ingest this into Kinesis Firehose in batch mode.
However, I am getting an error message:
"errorMessage": "Failed to send message to Firehose. The error received was The payload must be a valid json array when batchMode=true, e.g. '[\"a\", \"b\", \"c\"]'. Message arrived on: test/fairfield"
Note: I even tried SELECT data FROM 'test/fairfield' WHERE type = '1'
But I am getting the same error.
Any help here will be highly appreciated.

I think the following rule did the magic; it is part of the new SQL version 2016-03-23:
SELECT VALUE data FROM 'test' WHERE type = '1'
However, I would still like to know whether we can pick specific elements of each object within the array and rename the keys; a possible approach is sketched below.
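A hedged sketch: the 2016-03-23 SQL version supports nested object queries, so a rule along the following lines may project and rename the fields of each element of data (the aliases org_id and session_type are illustrative, and aliasing inside a nested SELECT should be verified against the AWS IoT SQL reference):

SELECT VALUE (SELECT customer_org_id AS org_id, remote_session_type AS session_type FROM data) FROM 'test' WHERE type = '1'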

Related

Azure Logic Apps and Microsoft Forms - Get field descriptors

I have a Logic App that retrieves the responses submitted by users through Microsoft Forms.
When I look at the Logic App run, I can see the descriptor for each field (MuleSoft, IoT & Integration, Encuesta de tecnologías, ...), for example:
But in the "Show raw outputs" I can't see those fields; I get an identifier (rcb6ccf0fc9e44f74b44fa2715fec4f27, ...):
How can I retrieve those descriptors?
The solution is to add a 'Send an HTTP request to SharePoint' action to get the details of the form.
The Site Address is: https://forms.office.com
The Method is: GET
The Uri is: /formapi/api/forms('')?select=id,title,questions&$expand=questions($expand=choices)
This returns a JSON with all the questions and for each question the ID, Title and more info about the question.
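For illustration, the returned JSON has roughly this shape; the exact fields beyond id, title, and questions are assumptions based on the description above:

{
  "id": "...",
  "title": "Encuesta de tecnologías",
  "questions": [
    {
      "id": "rcb6ccf0fc9e44f74b44fa2715fec4f27",
      "title": "MuleSoft",
      "choices": [ ... ]
    }
  ]
}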
We can implement a loop through these questions and, with each ID, extract the response from Microsoft Forms:
"foreach": "@body('Send_an_HTTP_request_to_SharePoint')['questions']"
And Compose the result:
"Compose": {
  "inputs": {
    "Id": "@{items('For_each')['id']}",
    "Name": "@items('For_each')['title']",
    "Value": "@{body('Get_response_details')[item()['id']]}"
  },
  "runAfter": {},
  "type": "Compose"
}
These are field identifiers. You can retrieve them directly from the Dynamic content of Get response details.
Alternatively, you can build your own JSON body (in your case, from Get response details) with the Compose connector.

Microsoft Graph API - Create Contact doesn't work

I am attempting to use the Create Contact endpoint of the Microsoft Graph API (documentation here: https://learn.microsoft.com/en-us/graph/api/user-post-contacts?view=graph-rest-1.0&tabs=http) to register a new contact for my user. I created the body as described in the API documentation but am getting the error below:
{
  "error": {
    "code": "Request_BadRequest",
    "message": "A value without a type name was found and no expected type is available. When the model is specified, each value in the payload must have a type which can be either specified in the payload, explicitly by the caller or implicitly inferred from the parent value.",
    "innerError": {
      "request-id": "daf78520-50e6-444b-97a2-779762b3e6ed",
      "date": "2020-01-23T14:20:18"
    }
  }
}
Requests used:
1. POST https://graph.microsoft.com/v1.0/{{tenant_id}}/contacts
2. POST https://graph.microsoft.com/v1.0/me/contacts
Request body example:
{
  "givenName": "Yulia",
  "surname": "Lukianenko",
  "emailAddresses": [
    {
      "address": "yulia@lukianenko.onmicrosoft.com",
      "name": "Yulia Lukianenko"
    }
  ],
  "businessPhones": [
    "+1 732 555 0102"
  ]
}
Has anybody run into this kind of issue? How did you resolve it?
Thank you in advance for your help!
The POST request is incorrect here.
It should be:
https://graph.microsoft.com/v1.0/me/contacts
Also, you need to make sure the "Contacts.ReadWrite" permission is granted to the app registered in AAD.
P.S.: I used the same JSON as in your example in Graph Explorer, and the contact was created successfully.
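For reference, a minimal sketch of the full working request, with a placeholder access token and the same JSON body shown in the question:

POST https://graph.microsoft.com/v1.0/me/contacts
Authorization: Bearer {access_token}
Content-Type: application/json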

Gmail returns base64 encoded data but lists it as quoted-printable

When I call the Gmail API I get the following back (just an excerpt, obviously, since the body is massive):
{
  ...
  "payload": {
    ...
    "parts": [
      {
        "partId": "1",
        "mimeType": "text/html",
        "filename": "",
        "headers": [
          {
            "name": "Content-Type",
            "value": "text/html; charset=\"UTF-8\""
          },
          {
            "name": "Content-Transfer-Encoding",
            "value": "quoted-printable"
          }
        ],
        "body": {
          "size": 4696,
          "data": "PCFET0NUWVBFIGh0bWw-PGh0bWwgbGFuZz1lbj48....
I have only included the relevant parts. You will see that the HTML body part has the email encoded as base64url, but the Content-Transfer-Encoding header clearly says quoted-printable. I ran it through a base64url decoder and it gives the correct data. But the header explicitly says it is quoted-printable.
What am I missing?
The plain body part is this, which seems perfectly correct:
"headers": [
  ...
  {
    "name": "Content-Transfer-Encoding",
    "value": "base64"
  }
],
"body": {
  "size": 601,
  "data": "R29vZ2xlIEFQSXMgRXhwbG9yZXIgd2FzIGdyYW5
When you request a Message resource, the Gmail API can deliver the message data in one of four formats, which you set via a query string (see the documentation).
Below is a description of each format option, taken from the official docs:
"full" (default): Returns the full email message data with body content parsed in the payload field; the raw field is not used.
"metadata": Returns only email message ID, labels, and email headers.
"minimal": Returns only email message ID and labels; does not return the email headers, body, or payload.
"raw": Returns the full email message data with body content in the raw field as a base64url encoded string; the payload field is not used.
"full" is the default option, where the body content is parsed and automatically stored as a base64url encoded string in the data property.
Keep in mind that the Message resource object is provided as a convenience for interacting with the RFC 5322 payload, and it always provides its data payload in base64 regardless of the value of the Content-Transfer-Encoding header.
If you want to wrangle with the raw IMF (Internet Message Format, RFC 5322) text, set the format to "raw" in your query string. You'll have to base64 decode the raw string to get the IMF data. It will look exactly as you expect, but you'll have to write your own parser to manage its contents.
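A minimal Java sketch of both paths, assuming data comes from payload.parts[i].body.data (the "full" format) or from the raw field (the "raw" format); the class and method names are illustrative:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class GmailBodyDecoder {
    // "full" format: each part's body.data is base64url encoded, regardless
    // of what that part's Content-Transfer-Encoding header claims.
    public static String decodePartData(String data) {
        byte[] bytes = Base64.getUrlDecoder().decode(data);
        return new String(bytes, StandardCharsets.UTF_8);
    }

    // "raw" format: the whole RFC 5322 message is one base64url string;
    // decoding yields the IMF text, which you must then parse yourself.
    public static String decodeRawMessage(String raw) {
        byte[] bytes = Base64.getUrlDecoder().decode(raw);
        return new String(bytes, StandardCharsets.UTF_8);
    }
}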

Is it possible to process JSON and access parameters using Service Bus?

I have seen that it is possible to add a JSON schema when you are using the "HTTP Request" trigger and adding the JSON schema in the "Request Body JSON Schema" box.
I have also looked at adding a schema in the "Integration Account", but the section in the documentation says it is "to confirm that XML documents you receive are valid", which is not what I am looking for.
I am using an Azure Service Bus queue.
In this case I have PeekLock as a trigger; the idea is that the input in the Service Bus will be of a certain format, all in JSON. I don't "care" or need to know what happens before the Service Bus; all I know is that each message will have the same format. What my Logic App is supposed to do is receive the message from the Service Bus, mail it to whoever it is supposed to go to, and add anything there is to add from blob storage. I want to be able to access certain "tags" or "parameters", since Service Bus only has its own few tags.
I used jsonschema.net to get the schema, and here is the JSON showing what a message will look like:
{
  "items": [
    {
      "Key": "XXXXXX-XXXX-XXXX-XXXX-XXXXXXX",
      "type": "Email",
      "data": {
        "subject": "Who is the father?",
        "bodyBlobUID": "00000000-0000-0000-0000-000000000000",
        "to": [
          "darth.vader@hotmail.com"
        ],
        "cc": [
          "luke.skywalker@nomail.com"
        ],
        "bcc": [
          "leia.skywalker@nomail.com"
        ],
        "encoding": "System.Text.UTF8Encoding",
        "isBodyHtml": false,
        "organisationUID": "00000000-0000-0000-0000-000000000000",
        "BlobUIDs": [
          "luke.skywalker@nomail.com"
        ]
      }
    }
  ]
}
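For reference, the schema jsonschema.net generates for this message has roughly the following shape (an abbreviated sketch; the real output enumerates every field of data):

{
  "type": "object",
  "properties": {
    "items": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Key": { "type": "string" },
          "type": { "type": "string" },
          "data": { "type": "object" }
        }
      }
    }
  }
}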
So my questions are in two parts:
1. Is it possible to add JSON schemas without using the HTTP Request trigger, for use with Service Bus?
2. If #1 is possible, or maybe it can be done in another way: how do I access the tags or parameters of the JSON format? At the moment I am trying to do transformations using schemas and maps with the Integration Account, but it seems unnecessary.
UPDATE: Parse JSON is now available in Logic Apps.
We will be releasing an action called Parse JSON next week, in which you can specify the Service Bus output as the payload, define the schema of the payload, and then use custom friendly tokens in subsequent steps.
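A minimal sketch of that action in the workflow definition language, assuming the Service Bus connector delivers the message body base64-encoded in ContentData (the action name is illustrative, and the schema is abbreviated; the full schema sketched earlier can be pasted in):

"Parse_JSON": {
  "type": "ParseJson",
  "inputs": {
    "content": "@base64ToString(triggerBody()?['ContentData'])",
    "schema": {
      "type": "object",
      "properties": {
        "items": { "type": "array" }
      }
    }
  },
  "runAfter": {}
}

Subsequent actions can then reference friendly tokens such as @body('Parse_JSON')?['items'][0]?['data']?['subject'].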

Google App Engine - Deployment not working

I have a Google App Engine project (an API) working against a Google Cloud SQL instance. I have an exact copy of both the API and the SQL instance running locally.
When I run the project in development (locally), I can explore the API just fine. Yet when I deploy, all the calls return empty.
This is the API code (Java):
@ApiMethod(
    name = "test.users.list",
    path = "test/users/list",
    httpMethod = "get"
)
public UserList testUserList() {
    UserList users = UserList.getAll();
    return users;
}
This is what it returns locally:
{
  "users": [
    {
      "id": "3",
      "firstName": "Test",
      "lastName": "User",
      "email": "test@test.com",
      "password": "12fc892642c48a8227410f5b6722e1edeeefedfb",
      "logins": 0,
      "lastLogin": 0,
      "roles": [
        {
          "id": "1",
          "name": "user",
          "loaded": true
        }
      ],
      "fullName": "Test User",
      "admin": false,
      "lastLoginDat": "1969-12-31T21:00:00.000-03:00",
      "loaded": true
    },
    {
      "id": "6",
      "firstName": "Test",
      "lastName": "User 2",
      "email": "test2@test.com",
      "password": "c5bc2a33ddffcfb3d61779ab44d7d933e1336b02",
      "logins": 0,
      "lastLogin": 0,
      "fullName": "Test User 2",
      "admin": false,
      "lastLoginDat": "1969-12-31T21:00:00.000-03:00",
      "loaded": true
    }
  ]
}
This is what it returns in the server explorer:
{
  "kind": "myapp#usersItem",
  "etag": "\"l4AE0sdQvyB-SkumpjWQFJVUZzo/MSGC-asdfasdf\""
}
Some insight:
I have access (through MySQL Workbench) to both the local SQL instance and the external Google Cloud SQL instance.
The local and external SQL instances are identical to each other.
I've also granted access to the project I'm using to hit the DB (through the Google Cloud panel).
I don't really know what else to do or check. I've searched for something similar but couldn't find anything related.
Any thoughts are appreciated.
As Jan pointed out, GAE local and production are two separate stories. But I'm going to give you something even better than this particular solution, namely the tools to resolve this and future problems:
1. Go to Console > Compute > App Engine > Versions, and make sure the backend version you uploaded is the right one (look at the time it was uploaded) and is selected or default.
2. Create an API test method that does a rather basic operation against the DB, but without calling other methods or using try/catch blocks. You're trying to raise an exception here, if any. All the code you need to hit the DB and return the data should be inside that method (see the sketch after this list).
3. Go to the explorer panel and execute your test method. If it works, then your problem is not in the connection but in the logic. Use the same approach as in (2) to test the logic.
4. If the method doesn't work, go to Console > Monitoring > Logs. There you'll find the exception trace.
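A minimal sketch of such a test method, assuming plain JDBC from the same endpoint class shown above; the connection URL, credentials, and table name are placeholders for whatever the project already uses:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

@ApiMethod(name = "test.db.ping", path = "test/db/ping", httpMethod = "get")
public List<String> testDbPing() throws Exception {
    // Deliberately no try/catch: any exception propagates, so its trace
    // shows up under Console > Monitoring > Logs (step 4).
    Connection conn = DriverManager.getConnection(
            "jdbc:google:mysql://my-project:my-instance/mydb", "user", "password");
    List<String> emails = new ArrayList<>();
    ResultSet rs = conn.createStatement().executeQuery("SELECT email FROM users");
    while (rs.next()) {
        emails.add(rs.getString("email"));
    }
    conn.close();
    return emails;
}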
