I want to send accelerometer data to the IBM IoT cloud using the C language.
I am using JSON format to publish events to the cloud.
char *data = "{\"d\" : {\"x\" : 43 }}"; // this works correctly
I want to send live sensor values through this pointer in JSON format. Please help me understand how to send these values using JSON on the IBMIOTfclient side.
If you use the IBMIOTfclient (iotfclient.h), you can call publishEvent, like this:
publishEvent(&client, "accel","json", "{\"d\" : {\"x\" : 43 }}", QOS0);
There is a sample on GitHub.
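To publish live readings instead of a hard-coded string, you can format the values into a buffer before the call. A minimal sketch, assuming the same client setup and publishEvent call as above; readAccelX/Y/Z are hypothetical placeholders for however you read your sensor:

/* Format live accelerometer readings into the JSON payload at runtime. */
char payload[128];
float x = readAccelX();  /* hypothetical sensor-read helpers */
float y = readAccelY();
float z = readAccelZ();
snprintf(payload, sizeof(payload),
         "{\"d\" : {\"x\" : %.2f, \"y\" : %.2f, \"z\" : %.2f }}", x, y, z);
publishEvent(&client, "accel", "json", payload, QOS0);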
I have a simple problem: I am not able to insert an array into a JSON payload when calling the Graph API to update a user profile from Bot Framework Composer.
Example:
Step 1: I created a property Conversation.UpdatedSkills which holds the following value:
["SharePoint","Dotnet","Analytics"]
Now I want to build the JSON object for the MS Graph service; a working sample payload looks like this:
{
"skills":["SharePoint","Dotnet","Analytics"]
}
Step 2: To build this JSON dynamically, I need to pass the body as a JSON object in the Send an HTTP Request activity, and I have the following code to generate the payload:
{
"skills":"${conversation.UpdatedSkills}"
}
The output from Step 2 looks like this:
{
"skills": "[\r\n "SharePoint",\r\n "Dotnet",\r\n "Analytics"\r\n]"
}
The desired JSON was this:
{
"skills":["SharePoint","Dotnet","Analytics"]
}
My question is: how do I pass my array from Step 1 so that it creates a JSON object that works with the service? The object created in Step 2 is not the right object for the service.
Any ideas?
I tried different string manipulations, but I think this is basic and there has to be a simpler way.
Don't use string interpolation (${ }); use an expression (=) instead. For your example above, it should be:
{
"skills": "=conversation.UpdatedSkills"
}
I don't see a solution to this in the available API documentation.
It is also not available in the web console.
Is it possible to get the file URL using the Watson Discovery Service?
If you need to store the original source/file URL, you can include it as a field within your documents in the Discovery service; you will then be able to query that field back out when needed.
I also struggled with this but ultimately got it working using the Python bindings for Watson Discovery. The online documentation and API reference are quite poor; here's what I used to get it working:
(This assumes you have a Watson Discovery service instance and have created a collection.)
# Programmatic upload and retrieval of documents and metadata with Watson Discovery
from watson_developer_cloud import DiscoveryV1
import os
import json
discovery = DiscoveryV1(
    version='2017-11-07',
    iam_apikey='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
    url='https://gateway-syd.watsonplatform.net/discovery/api'
)
environments = discovery.list_environments().get_result()
print(json.dumps(environments, indent=2))
This gives you your environment ID. Now append to your code:
collections = discovery.list_collections('{environment-id}').get_result()
print(json.dumps(collections, indent=2))
This shows you the collection ID for uploading documents into programmatically. You should have a document to upload (in my case, an MS Word document) and its accompanying URL from your own source-document system. I'll use a trivial fictitious example.
NOTE: the documentation DOES NOT tell you to append 'rb' to the open call, but it is required when uploading a Word document, as in my example below. Raw text / HTML documents can be uploaded without the 'rb' parameter.
url = {"source_url": "http://mysite/dis030.docx"}
with open(os.path.join(os.getcwd(), '{path to your document folder with trailing / }', 'dis030.docx'), 'rb') as fileinfo:
    add_doc = discovery.add_document('{environment-id}', '{collections-id}', metadata=json.dumps(url), file=fileinfo).get_result()
print(json.dumps(add_doc, indent=2))
print(add_doc["document_id"])
Note the metadata being set up as a JSON dictionary and then encoded with json.dumps within the parameters. So far I've only wanted to store the original source URL, but you could extend this with other parameters as your own use case requires.
This call to Discovery gives you the document ID.
You can now query the collection and extract the metadata using something like a Discovery query:
my_query = discovery.query('{environment-id}', '{collection-id}', natural_language_query="chlorine safety")
print(json.dumps(my_query.result["results"][0]["metadata"], indent=2))
Note: I'm extracting just the stored metadata here from within the overall returned results. If you instead just use
print(my_query)
you'll get the full response from Discovery, but there's a lot to go through to identify just your own custom metadata.
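To pull the original file URL itself back out, you can read it from a result's metadata. A small sketch, assuming the "source_url" metadata key used in the upload above:

first_result = my_query.result["results"][0]
source_url = first_result.get("metadata", {}).get("source_url")  # the URL stored at upload time
print(source_url)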
I was able to use the Java API to decode binary data read from Kafka, since there is a BinaryDecoder in the DecoderFactory. I am trying to repeat this using the C API.
However, no luck so far. It is straightforward to set values and then get values (schema and data are bound to each other). In the case where the encoded data is read from Kafka, we need to decode the data using a provided schema. Can anyone shed some light on how to do this using the C API? Thanks a lot!
For example, we will get the following encoded data from Kafka
" site_visit^Z111.222.333.444^#^T1446231845^#^N1209600^#^F1.0"
and we have the schema available separately from the data, as we do in Java.
I want the decoded data to be:
{"segment_name": "site_visit", "ip_address": "111.222.333.444", "received_at": "T1446231845", "ttl": "1209600", "probability": "1.0"}
You can use libserdes for this purpose. There are examples (serdes-kafka-avro-client.c, kafka-serdes-avro-console-consumer.cpp) showing how to get such a JSON result.
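If you prefer to use the plain avro-c library directly instead of libserdes, a sketch along these lines should work. The schema literal here is hypothetical, built from the field names in the question, and error handling is minimal:

#include <avro.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int decode_to_json(const char *buf, size_t len)
{
    /* Hypothetical schema matching the fields shown in the question. */
    const char *schema_json =
        "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
        "{\"name\":\"segment_name\",\"type\":\"string\"},"
        "{\"name\":\"ip_address\",\"type\":\"string\"},"
        "{\"name\":\"received_at\",\"type\":\"long\"},"
        "{\"name\":\"ttl\",\"type\":\"long\"},"
        "{\"name\":\"probability\",\"type\":\"float\"}]}";

    avro_schema_t schema;
    if (avro_schema_from_json_length(schema_json, strlen(schema_json), &schema)) {
        fprintf(stderr, "schema error: %s\n", avro_strerror());
        return -1;
    }

    avro_value_iface_t *iface = avro_generic_class_from_schema(schema);
    avro_value_t value;
    avro_generic_value_new(iface, &value);

    /* Wrap the raw bytes from Kafka in a memory reader and decode. */
    avro_reader_t reader = avro_reader_memory(buf, len);
    int rc = avro_value_read(reader, &value);
    if (rc == 0) {
        char *json = NULL;
        if (avro_value_to_json(&value, 1, &json) == 0) {
            printf("%s\n", json);  /* the decoded record as JSON */
            free(json);
        }
    }

    avro_reader_free(reader);
    avro_value_decref(&value);
    avro_value_iface_decref(iface);
    avro_schema_decref(schema);
    return rc;
}

One caveat: if the producer used Confluent's Schema Registry wire format, the payload carries a small header (a magic byte plus a 4-byte schema ID) before the Avro body, which is one reason the libserdes examples above are the easier route.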
Hi, I have some problems with Go endpoints and the Dart client library.
I use the Go library https://github.com/crhym3/go-endpoints and the Dart generator https://github.com/dart-lang/discovery_api_dart_client_generator
The easy examples work fine, but they never show how to use time.Time.
In my project, I have a struct with a field:
Created time.Time `json:"created"`
The output in the explorer looks like this:
"created": "2014-12-08T20:42:54.299127593Z",
When I use it in the Dart client library, I get the error:
FormatException: Invalid date format 2014-12-08T20:53:56.346129718Z
Do I really have to format every time field in the Go app (see "Format Timestamp in outgoing JSON in Golang?")?
My research suggests that Dart accepts something like:
t.Format(time.RFC3339) >> 2014-12-08T20:53:56Z
Second problem: if I comment out the Created field or leave it blank, I get another error:
The null object does not have a method 'map'.
NoSuchMethodError: method not found: 'map' Receiver: null Arguments:
[Closure: (dynamic) => dynamic]
But I can't figure out which object is null. I'm not sure if I'm using the Dart client correctly:
import 'package:http/browser_client.dart' as http;
...
var nameValue = querySelector('#name').value;
var json = {'name':nameValue};
LaylistApi api = new LaylistApi(new http.BrowserClient());
api.create(new NewLayListReq.fromJson(json)).then((LayList l) {
print(l);
}).catchError((e) {
querySelector('#err-message').innerHtml=e.toString();
});
Does anyone know of a larger project on GitHub with Go endpoints and Dart?
Thanks for any advice.
UPDATE[2014-12-11]:
I fixed the NoSuchMethodError with the correct discovery URL https://constant-wonder-789.appspot.com/_ah/api/discovery/v1/apis/greeting/v1/rest
The problem with the time FormatException is still open, but I'm one step further: if I create a new item it doesn't work, but if I load the items from the datastore and send them back, it works.
I guess this can be fixed by implementing the Marshaler interface (thanks Alex); I will update my source soon. See the sketch after the links below.
See my example:
http://constant-wonder-789.appspot.com/
The full source code:
https://github.com/cloosli/greeting-example
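A minimal sketch of the Marshaler approach, assuming a wrapper type around time.Time (the type and struct names here are illustrative, not from the project): implementing json.Marshaler lets you emit plain RFC 3339 timestamps without the nanoseconds that the Dart-generated client fails to parse.

package model

import "time"

// APITime wraps time.Time so we can control its JSON encoding.
type APITime struct {
	time.Time
}

// MarshalJSON emits "2014-12-08T20:53:56Z" instead of
// "2014-12-08T20:53:56.346129718Z" (time.RFC3339 has no fractional seconds).
func (t APITime) MarshalJSON() ([]byte, error) {
	return []byte(`"` + t.UTC().Format(time.RFC3339) + `"`), nil
}

// Use the wrapper in the struct instead of time.Time directly.
type LayList struct {
	Created APITime `json:"created"`
}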
I have been struggling with a very basic understanding of how Google App Engine stores data.
I have defined a class for a client profile as follows:
class ClientProfile(ndb.Model):
    nickname = ndb.StringProperty(required=True)
    photo = ndb.BlobProperty()
    uuid = ndb.StringProperty(required=True)
I retrieve the image data by uploading only image.src via a POST using jquery.ajax(...).
The data is properly sent to Google App Engine and I can assign it to a variable with
imagesrc = self.request.get('photosrcbase64')
The data content looks something like:
"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...
So far so good: the data is an image/png and the encoding is Base64, but should I care, if it ends up in a Blob?
Now if I try to put the data into the photo Blob, for example with:
clientprofile.photo = imagesrc
I get a BadValueError, in this case:
BadValueError: Expected str, got
u'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAA
I tried all kinds of combinations and different solutions and got back all kinds of BadValue or type errors.
My questions are:
1) Why does the Blob care? If it is binary storage, I should be able to dump anything into it without having to interpret and/or convert it. Is a Blob really binary/raw storage, and why does it care how things are stored in it?
2) I started having problems with this two years ago, when still using db instead of ndb. I found a solution I did not understand: stripping out the MIME information at the beginning of the data string, decoding the Base64 string, and using db.Blob(...) to convert my string to a Blob. The problem is that db.Blob() does not seem to exist in ndb, so I can't do this any more.
I am convinced that I am missing something fundamental in the way information is exchanged between Google App Engine and the browser, and I thank you in advance for a mind-clearing answer.
A BlobProperty is meant to hold binary data. The str type in Python 2 is equivalent to a byte string, since the only characters allowed are
[chr(byte_value) for byte_value in range(2**8)]
So before storing the value from self.request.get('photosrcbase64'), which is of type unicode, you'll need to cast it to type str.
You can do this either directly:
imagesrc = str(self.request.get('photosrcbase64'))
or by first trying to decode to ASCII.
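Building on that, a minimal sketch (Python 2, matching the question) that also strips the data-URL prefix and Base64-decodes the payload before storing it in the BlobProperty; the field and variable names are taken from the question:

import base64

imagesrc = self.request.get('photosrcbase64')

# Drop the "data:image/png;base64," prefix so only the Base64 payload remains.
header, _, encoded = imagesrc.partition(',')

# b64decode returns a plain byte string (str in Python 2), which is what
# ndb.BlobProperty expects.
clientprofile.photo = base64.b64decode(encoded)
clientprofile.put()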