mosquitto MQTT parsing issue - C

I am using libmosquitto and the parson library in a C application running on my IoT device to handle MQTT messages and parse JSON payloads.
When I publish a message to my device using the mosquitto_pub command on a Linux terminal as follows:
mosquitto_pub -t "mytopic/test" -u "admin" -P "admin" -h 192.168.5.100 -m "{"value1": 1, "value2": 2, "value3": 3}"
I am successfully receiving the message on the device as I am subscribed to it; however, I cannot parse the values at all using json_object_dotget_value
JSON_Value* root_value = json_parse_string(payload);            /* NULL if the payload is not valid JSON */
JSON_Object* root_object = json_value_get_object(root_value);   /* NULL if root_value is NULL or not an object */
JSON_Value* value1 = json_object_dotget_value(root_object, "value1");
JSON_Value* value2 = json_object_dotget_value(root_object, "value2");
JSON_Value* value3 = json_object_dotget_value(root_object, "value3");
The values returned are NULL.
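(A quick way to confirm whether the payload parsed at all is parson's own type check, placed right after json_parse_string; a minimal sketch:)
if (root_value == NULL || json_value_get_type(root_value) != JSONObject) {
    printf("payload is not a JSON object: %s\n", payload);  /* parse failed or payload is not an object */
}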
I know that I'm leaving out lots of code here. However, the problem is not on the receiving end, because when another application in Python publishes this message, the parsing works fine.
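(The Python application isn't shown here; it presumably does something along these lines with paho-mqtt, where json.dumps guarantees the quotes around the keys survive:)
import json
import paho.mqtt.publish as publish

# build the payload with json.dumps so the key quotes are preserved
payload = json.dumps({"value1": 1, "value2": 2, "value3": 3})
publish.single("mytopic/test", payload, hostname="192.168.5.100",
               auth={"username": "admin", "password": "admin"})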
There is something wrong with my mosquitto_pub command, but I can't see it.
I'd appreciate any help. Thanks.

Per your comment, your program is fine.
In the invocation, you may have to escape the " characters in the argument to your sender program (or use single quotes).
With -m "{"value1": 1, "value2": 2, "value3": 3}", the args [as the program sees them] are:
-m
{value1: 1, value2: 2, value3: 3}
Notice that the " characters got stripped around (e.g.) value1, and JSON needs them.
So, change to -m '{"value1": 1, "value2": 2, "value3": 3}' and the args become:
-m
{"value1": 1, "value2": 2, "value3": 3}

`vespa` tutorial : ./src/python/user_search.py U33527 10 KeyError: 'children'

I'm following the Vespa tutorial step by step: https://docs.vespa.ai/en/tutorials/news-5-recommendation.html
(vespa) raphy@pc:~/vespa/sample-apps/news$ python3 src/python/train_cold_start.py mind 10
Total loss after epoch 9: 534.6995239257812 (0.4087916910648346 avg)
{'auc': 0.8578, 'mrr': 0.4789, 'ndcg@5': 0.5482, 'ndcg@10': 0.6013}
{'auc': 0.6265, 'mrr': 0.2846, 'ndcg@5': 0.3117, 'ndcg@10': 0.3747}
Total loss after epoch 10: 517.1571044921875 (0.39538004994392395 avg)
{'auc': 0.8758, 'mrr': 0.5073, 'ndcg@5': 0.5817, 'ndcg@10': 0.6315}
{'auc': 0.6246, 'mrr': 0.2843, 'ndcg@5': 0.3113, 'ndcg@10': 0.3732}
(vespa) raphy@pc:~/vespa/sample-apps/news$
But I'm encountering this problem:
(vespa) raphy@pc:~/vespa/sample-apps/news$ ./src/python/user_search.py U33527 10
Traceback (most recent call last):
File "./src/python/user_search.py", line 58, in <module>
main()
File "./src/python/user_search.py", line 51, in main
user_vector = query_user_embedding(user_id)
File "./src/python/user_search.py", line 21, in query_user_embedding
embedding = parse_embedding(result["root"]["children"][0])
KeyError: 'children'
(vespa) raphy@pc:~/vespa/sample-apps/news$
(vespa) raphy@pc:~/vespa/sample-apps/news$ grep "U33527" mind/vespa_user_embeddings.json
{"put": "id:user:user::U33527", "fields": {"user_id":"U33527", "embedding": {"values": [0.000000,0.060903,0.158397,0.003585,0.230960,0.005171,-0.300856,-0.295116,-0.042150,-0.416067,-0.173345,-0.241960,-0.140207,-0.000399,0.463869,-0.294422,-0.080257,-0.208765,-0.070218,0.189583,0.031040,-0.073909,-0.147883,-0.164819,-0.229605,-0.248327,0.174647,-0.168265,-0.370106,-0.209611,-0.206252,-0.288447,0.091576,-0.122662,0.000394,0.172982,-0.147844,0.326629,-0.103831,-0.312612,-0.209032,0.190745,-0.335539,0.261593,0.699852,0.041234,0.241921,0.052331,0.103968,-0.216830,-0.279406]} }},
OS: Ubuntu 20.04
How can I solve it?
The Vespa index has no user documents here, so most likely the user and news embeddings have not been fed to the system. After they are calculated in the previous step (https://docs.vespa.ai/en/tutorials/news-4-embeddings.html), be sure to feed them to Vespa:
$ java -jar vespa-http-client-jar-with-dependencies.jar \
--file mind/vespa_user_embeddings.json \
--endpoint http://localhost:8080
$ java -jar vespa-http-client-jar-with-dependencies.jar \
--file mind/vespa_news_embeddings.json \
--endpoint http://localhost:8080
That will solve the problem.
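To confirm the feed worked, you can also fetch one of the user documents back directly (a hedged example assuming the default Document V1 API on localhost:8080):
$ curl -s 'http://localhost:8080/document/v1/user/user/docid/U33527'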

Parsing stdout chunks into arrays in bash or ruby

I am trying to find the most efficient way to turn stdout log entries from racadm (Dell chassis/iDRAC) into individual arrays or JSON arrays so I can evaluate each entry one at a time. The output always has the same fields. The output below is pretty typical:
$ racadm chassislog view -c Storage -b PDR
SeqNumber = 11700
Message ID = PDR17
Category = Storage
AgentID = CMC
Severity = Information
Timestamp = 2020-03-21 00:02:06
Message Arg 1 = Physical Disk 0:0:15
FQDD = Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1
Message = Global hot spare assigned to Physical Disk 0:0:15.
--------------------------------------------------------------------------------
SeqNumber = 11699
Message ID = PDR26
Category = Storage
AgentID = CMC
Severity = Information
Timestamp = 2020-03-21 00:02:04
Message Arg 1 = Physical Disk 0:0:3
FQDD = Disk.Bay.3:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1
Message = Physical Disk 0:0:3 is online.
--------------------------------------------------------------------------------
SeqNumber = 11696
Message ID = PDR71
Category = Storage
AgentID = CMC
Severity = Information
Timestamp = 2020-03-21 00:02:01
Message Arg 1 = Physical Disk 0:0:15
Message Arg 2 = Physical Disk 0:0:3
FQDD = Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1
Message = Copyback completed from Physical Disk 0:0:15 to Physical Disk 0:0:3.
--------------------------------------------------------------------------------
SeqNumber = 11670
Message ID = PDR70
Category = Storage
AgentID = CMC
Severity = Information
Timestamp = 2020-03-20 21:45:47
Message Arg 1 = Physical Disk 0:0:15
Message Arg 2 = Physical Disk 0:0:3
FQDD = Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1
Message = Copyback started from Physical Disk 0:0:15 to Physical Disk 0:0:3.
--------------------------------------------------------------------------------
SeqNumber = 11667
Message ID = PDR8
Category = Storage
AgentID = CMC
Severity = Information
Timestamp = 2020-03-20 21:45:44
Message Arg 1 = Physical Disk 0:0:3
FQDD = Disk.Bay.3:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1
Message = Physical Disk 0:0:3 is inserted.
--------------------------------------------------------------------------------
I'd really love to read the entire output into an associative array so I could step through
each entry in a for loop for events. Looking for guidance in Ruby (Chef) or bash.
This Perl one-liner converts input like the above into an array of JSON objects, which you can then process in any JSON-aware tool.
racadm chassislog view -c Storage -b PDR | \
perl -MJSON::PP -lne 'if (/([^=]*?)\s*=\s*(.*)/) { $obj{$1} = $2 }
elsif (/^-+$/) { push @records, { %obj }; undef %obj }
END { push @records, { %obj } if %obj;
print encode_json(\@records) }'
outputs (after pretty-printing):
[
{
"Timestamp": "2020-03-21 00:02:06",
"Message ID": "PDR17",
"Category": "Storage",
"Message": "Global hot spare assigned to Physical Disk 0:0:15.",
"AgentID": "CMC",
"Severity": "Information",
"SeqNumber": "11700",
"FQDD": "Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1",
"Message Arg 1": "Physical Disk 0:0:15"
},
{
"Category": "Storage",
"Message ID": "PDR26",
"Timestamp": "2020-03-21 00:02:04",
"SeqNumber": "11699",
"Message": "Physical Disk 0:0:3 is online.",
"Severity": "Information",
"AgentID": "CMC",
"Message Arg 1": "Physical Disk 0:0:3",
"FQDD": "Disk.Bay.3:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1"
},
{
"FQDD": "Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1",
"Message Arg 2": "Physical Disk 0:0:3",
"Message Arg 1": "Physical Disk 0:0:15",
"Severity": "Information",
"AgentID": "CMC",
"Message": "Copyback completed from Physical Disk 0:0:15 to Physical Disk 0:0:3.",
"SeqNumber": "11696",
"Timestamp": "2020-03-21 00:02:01",
"Category": "Storage",
"Message ID": "PDR71"
},
{
"Message Arg 1": "Physical Disk 0:0:15",
"FQDD": "Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1",
"Message Arg 2": "Physical Disk 0:0:3",
"SeqNumber": "11670",
"Message": "Copyback started from Physical Disk 0:0:15 to Physical Disk 0:0:3.",
"Severity": "Information",
"AgentID": "CMC",
"Category": "Storage",
"Message ID": "PDR70",
"Timestamp": "2020-03-20 21:45:47"
},
{
"Timestamp": "2020-03-20 21:45:44",
"Message ID": "PDR8",
"Category": "Storage",
"Message": "Physical Disk 0:0:3 is inserted.",
"AgentID": "CMC",
"Severity": "Information",
"SeqNumber": "11667",
"FQDD": "Disk.Bay.3:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1",
"Message Arg 1": "Physical Disk 0:0:3"
}
]
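Since the result is plain JSON, any JSON-aware tool can take it from there. For example, assuming the one-liner's output was saved to a (hypothetical) records.json, jq can pull the FQDD of every copyback-start event:
jq -r '.[] | select(.Message | test("Copyback started")) | .FQDD' records.json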
Not bash, since the shell is for handling files and launching commands, but GNU awk, which is often falsely perceived as part of the shell; it's a simple yet powerful programming language. "Step through each entry in a for loop for events" is not really a requirement, so here is a small sample:
$ gawk -v item="Message Arg 2" ' # queried item as parameter
BEGIN {
RS="\n-+$\n" # record is separated by a bunch of -:s
FS="\n" # a line is a field within a record
}
{
for(nf=1;nf<=NF;nf++) { # loop all lines in a record
split($nf,t,/ *= */) # split lines by = and surrounding space
a[NR][t[1]]=t[2] # hash to a 2 dimensional array indexed by
} # record no. and the item, value as value
}
END { # after lines are hashed, make queries
for(nr in a) # for each record in hash
if(item in a[nr]) # if queried item is found in it
printf "%d: %s = %s\n", nr,item,a[nr][item] # output
}' file
Output for query item Message Arg 2:
3: Message Arg 2 = Physical Disk 0:0:3
4: Message Arg 2 = Physical Disk 0:0:3
Here is an alternate ending for "match a condition I'm looking for in "Message" and reference the corresponding FQDD":
$ gawk -v item=Message -v cond=started -v output=FQDD '
BEGIN {
RS="\n-+$\n" # record is separated by a bunch of -:s
FS="\n" # a line is a field within a record
}
{
for(nf=1;nf<=NF;nf++) { # loop all lines in a record
split($nf,t,/ *= */) # split lines by = and surrounding space
a[NR][t[1]]=t[2] # hash to a 2 dimensional array indexed by
} # record no. and the item, value as value
}
END {
for(nr in a)
if((item in a[nr]) && a[nr][item]~cond)
printf "%d: %s = %s\n", nr,output,a[nr][output]
}' file
Output now:
4: FQDD = Disk.Bay.15:Enclosure.Internal.0-0:RAID.ChassisIntegrated.1-1
I.e., if variable item is found in a[nr] and that array element's value matches cond, print the value of a[nr][output] (here FQDD) in the same record.
In SQL that would be SELECT output FROM file WHERE item LIKE '%cond%'.
Based on Shawn's one-liner as a pattern, a colleague ended up finding a Python 2.7-compatible way to do exactly what we want; the code is below and offers the exact functionality I need.
import re
import json
from pprint import pprint

regex_string_1 = r'([^=]*?)\s*=\s*(.*)'  # "key = value" lines
regex_string_2 = r'^-+$'                 # record-separator lines (all dashes)
regex1 = re.compile(regex_string_1)
regex2 = re.compile(regex_string_2)

current_entry = {}
entries = []
# `test` is assumed to already hold the captured racadm output as one string
lines = test.split('\n')
for line in lines:
    if regex1.match(line):
        # split on the first '=' only, in case a value contains '='
        key, value = [element.strip() for element in line.split('=', 1)]
        current_entry[key] = value
    elif regex2.match(line):
        entries.append(current_entry)
        current_entry = {}
pprint(entries)
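One hedged way to populate `test` directly from the command under Python 2.7 is subprocess:

import subprocess

# capture the racadm output as a single string for the parser above
test = subprocess.check_output(
    ['racadm', 'chassislog', 'view', '-c', 'Storage', '-b', 'PDR'])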

Formatting a JSON file into CSV using jq

I have some data in a file called myfile.json that I need to format using jq. In JSON it looks like this:
{
"result": [
{
"service": "ebsvolume",
"name": "gtest",
"resourceIdentifier": "vol-999999999999",
"accountName": "g-test-acct",
"vendorAccountId": "12345678912",
"availabilityZone": "ap-southeast-2c",
"region": "ap-southeast-2",
"effectiveHourly": 998.56,
"totalSpend": 167.7,
"idle": 0,
"lastSeen": "2018-08-16T22:00:00Z",
"volumeType": "io1",
"state": "in-use",
"volumeSize": 180,
"iops": 2000,
"throughput": 500,
"lastAttachedTime": "2018-08-08T22:00:00Z",
"lastAttachedId": "i-086f957ee",
"recommendations": [
{
"action": "Rightsize",
"preferenceOrder": 2,
"risk": 0,
"savingsPct": 91,
"savings": 189.05,
"volumeType": "gp2",
"volumeSize": 120,
},
{
"action": "Rightsize",
"preferenceOrder": 4,
"risk": 0,
"savingsPct": 97,
"savings": 166.23,
"volumeType": "gp2",
"volumeSize": 167,
},
{
"action": "Rightsize",
"preferenceOrder": 6,
"risk": 0,
"savingsPct": 91,
"savings": 111.77,
"volumeType": "gp2",
"volumeSize": 169,
}
]
}
]
}
I have it formatted better with the following:
jq '.result[] | [.service,.name,.resourceIdentifier,.accountName,.vendorAccountId,.availabilityZone,.region,.effectiveHourly,.totalSpend,.idle,.lastSeen,.volumeType,.state,.volumeSize,.iops,.throughput,.lastAttachedTime,.lastAttachedId] |@csv' ./myfile.json
This nets the following output:
"\"ebsvolume\",\"gtest\",\"vol-999999999999\",\"g-test-acct\",\"12345678912\",\"ap-southeast-2c\",\"ap-southeast-2\",998.56,167.7,0,\"2018-08-16T22:00:00Z\",\"io1\",\"in-use\",180,2000,500,\"2018-08-08T22:00:00Z\",\"i-086f957ee\""
I figured this out, but it's not exactly what I am trying to achieve. I want to have each recommendation listed underneath on a separate line, and not at the end of the same line.
jq '.result[] | [.service,.name,.resourceIdentifier,.accountName,.vendorAccountId,.availabilityZone,.region,.effectiveHourly,.totalSpend,.idle,.lastSeen,.volumeType,.state,.volumeSize,.iops,.throughput,.lastAttachedTime,.lastAttachedId,.recommendations[].action] |@csv' ./myfile.json
This nets:
"\"ebsvolume\",\"gtest\",\"vol-999999999999\",\"g-test-acct\",\"12345678912\",\"ap-southeast-2c\",\"ap-southeast-2\",998.56,167.7,0,\"2018-08-16T22:00:00Z\",\"io1\",\"in-use\",180,2000,500,\"2018-08-08T22:00:00Z\",\"i-086f957ee\",\"Rightsize\",\"Rightsize\",\"Rightsize\""
What I want is
"\"ebsvolume\",\"gtest\",\"vol-999999999999\",\"g-test-acct\",\"12345678912\",\"ap-southeast-2c\",\"ap-southeast-2\",998.56,167.7,0,\"2018-08-16T22:00:00Z\",\"io1\",\"in-use\",180,2000,500,\"2018-08-08T22:00:00Z\",\"i-086f957ee\",
\"Rightsize\",
\"Rightsize\",
\"Rightsize\""
So I'm not entirely sure how to deal with the array inside the "recommendations" section in jq; I think it might be called unflattening?
You can try this:
jq '.result[] | [ flatten[] | try(.action) // . ] | @csv' file
"\"ebsvolume\",\"gtest\",\"vol-999999999999\",\"g-test-acct\",\"12345678912\",\"ap-southeast-2c\",\"ap-southeast-2\",998.56,167.7,0,\"2018-08-16T22:00:00Z\",\"io1\",\"in-use\",180,2000,500,\"2018-08-08T22:00:00Z\",\"i-086f957ee\",\"Rightsize\",\"Rightsize\",\"Rightsize\""
flatten does what it says.
try(.action) attempts to read .action from each flattened element and suppresses the error when the element isn't an object; the alternative operator // then falls back to the element itself (.) whenever .action produced nothing (empty, null, or false).
The filtered values are put into an array so they can be converted with the @csv operator.
That didn't really work for me, actually - it omitted all the data in the previous array - but thanks!
I ended up with the following; granted, it doesn't put the Rightsize details on a separate line, but it will have to do:
jq -r '.result[] | [.service,.name,.resourceIdentifier,.accountName,.vendorAccountId,.availabilityZone,.region,.effectiveHourly,.totalSpend,.idle,.lastSeen,.volumeType,.state,.volumeSize,.iops,.throughput,.lastAttachedTime,.lastAttachedId,.recommendations[][]] |@csv' ./myfile.json
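For the record, one way to actually get each recommendation onto its own line (a hedged sketch; most of the volume-level fields are elided here for brevity) is to emit a separate @csv row per recommendation:
jq -r '.result[]
  | ([.service, .name, .resourceIdentifier, .volumeType, .volumeSize] | @csv),
    (.recommendations[] | [.action, .volumeType, .volumeSize, .savings] | @csv)' ./myfile.json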

Attribute Syntax for JSON query in check_json.pl

So, I'm trying to set up check_json.pl in NagiosXI to monitor some statistics. https://github.com/c-kr/check_json
I'm using the code with the modification I submitted in pull request #32, so line numbers reflect that code.
The json query returns something like this:
[
{
"total_bytes": 123456,
"customer_name": "customer1",
"customer_id": "1",
"indices": [
{
"total_bytes": 12345,
"index": "filename1"
},
{
"total_bytes": 45678,
"index": "filename2"
}
],
"total": "765.43gb"
},
{
"total_bytes": 123456,
"customer_name": "customer2",
"customer_id": "2",
"indices": [
{
"total_bytes": 12345,
"index": "filename1"
},
{
"total_bytes": 45678,
"index": "filename2"
}
],
"total": "765.43gb"
}
]
I'm trying to monitor the size of specific files, so a check should look something like:
/path/to/check_json.pl -u https://path/to/my/json -a "SOMETHING" -p "SOMETHING"
...where I'm trying to figure out the SOMETHINGs so that I can monitor the total_bytes of filename1 in customer2 where I know the customer_id and index but not their position in the respective arrays.
I can monitor customer1's total bytes by using the string "[0]->{'total_bytes'}", but I need to be able to specify which customer and dig deeper into the file name (known) and file size (the stat to monitor), AND the working query only gives me the status (OK, WARNING, or CRITICAL). Adding -p, all I get are errors...
The error with -p no matter how I've been able to phrase it is always:
Not a HASH reference at ./check_json.pl line 235.
Even when I can get a valid OK from the example "[0]->{'total_bytes'}", using that in -p still gives the same error.
Links pointing to documentation on the format to use would be very helpful. Examples in the README for the script or in the -h output are failing me here. Any ideas?
I really have no idea what your question is. I'm sure I'm not alone, hence the downvotes.
Once you have the decoded json, if you have a customer_id to search for, you can do:
my ($customer_info) = grep {$_->{customer_id} eq $customer_id} @$json_response;
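From there, digging down to a known index name is one more grep (a hedged sketch; $index_name is illustrative):
my ($index_info) = grep { $_->{index} eq $index_name } @{ $customer_info->{indices} };
print "$index_info->{total_bytes}\n" if $index_info;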
Regarding the error on line 235, this looks odd:
foreach my $key ($np->opts->perfvars eq '*' ? map { "{$_}"} sort keys %$json_response : split(',', $np->opts->perfvars)) {
# ....................................... ^^^^^^^^^^^^^
$perf_value = $json_response->{$key};
If perfvars eq "*", you appear to be looking for $json_response->{"{total}"}, for example. You might want to validate the user's input:
die "no such key in json data: '$key'\n" unless exists $json_response->{$key};
This entire business of stringifying the hash ref lookups just smells bad.
A better question would look like:
I have this JSON data. How do I get the sum of total_bytes for the customer with id 1?
See https://stackoverflow.com/help/mcve

sms using adb shell not working

I am trying to send an SMS by both adb shell am and adb shell service call, and I am getting an error for both.
a) I am trying to send the SMS with the following adb shell am command:
adb shell am start -a android.intent.action.SENDTO -d sms:"phoneNumber" --es sms_body "SMS BODY GOES HERE" --ez exit_on_sent true
adb shell input keyevent 66
and I am getting error-
Starting: Intent { act=android.intent.action.SENDTO dat=sms:xxxxxxxxxxxx pkg=BODY (has extras) }
Error: Activity not started, unable to resolve Intent { act=android.intent.action.SENDTO dat=sms:xxxxxxxxxxxx flg=0x10000000 pkg=BODY (has extras) }
b) When sending by service call:
adb shell service call isms 5 s16 "phonenumber" i32 0 i32 0 s16 "BodyText"
Result-
Result: Parcel(
0x00000000: ffffffff 00000061 006f0043 00790070 '....a...C.o.p.y.'
0x00000010: 006e0069 00200067 0065006d 00730073 'i.n.g. .m.e.s.s.'
0x00000020: 00670061 00200065 006f0074 00490020 'a.g.e. .t.o. .I.'
0x00000030: 00630063 0020003a 0065004e 00740069 'c.c.:. .N.e.i.t.'
0x00000040: 00650068 00200072 00730075 00720065 'h.e.r. .u.s.e.r.'
0x00000050: 00320020 00300030 00200030 006f006e ' .2.0.0.0. .n.o.'
0x00000060: 00200072 00750063 00720072 006e0065 'r. .c.u.r.r.e.n.'
0x00000070: 00200074 00720070 0063006f 00730065 't. .p.r.o.c.e.s.'
0x00000080: 00200073 00610068 00200073 006e0061 's. .h.a.s. .a.n.'
0x00000090: 00720064 0069006f 002e0064 00650070 'd.r.o.i.d...p.e.'
0x000000a0: 006d0072 00730069 00690073 006e006f 'r.m.i.s.s.i.o.n.'
0x000000b0: 0052002e 00430045 00490045 00450056 '..R.E.C.E.I.V.E.'
0x000000c0: 0053005f 0053004d 0000002e '_.S.M.S..... ')
but there is no SMS on my device.
Please help me figure out what is missing in both cases.
Thanks!!!
Maybe there is an issue with the number. Try without quotes.
Have you tried this:
adb shell am start -a android.intent.action.SENDTO -d sms:+1-222-333-4444
adb shell input text "sampletexthere"
adb shell input keyevent 66
I had the same problem.
You have to escape the spaces in your message body. Use \:
"message\ body\ goes\ here"
