I am building an application using Mule 4.2.2 in which I have to retrieve data from MongoDB. For this I am using the MongoDB connector version 6.3.0. I am using the "Find documents" operation as shown in the image below, where you can see I have written the query as
{"eventCode": #[vars.eventCode]}
where eventCode is the field on which I am querying and eventCode is the variable where I am storing the incoming eventCode.
When I run the mule application I see an error in the logs that says -
org.bson.json.JsonParseException: Invalid JSON input. Position: 15. Character: '#'.
I thought that I could combine literals and Mule expressions using #[], but that doesn't seem to work. Any pointers on how to solve this?
If it is an expression, then you cannot use #[...] inside it. Just write the expression:
{"eventCode": vars.eventCode}
If it is not an expression (is the fx button clicked?), you might need to enclose the entire expression in #[...]:
#[{"eventCode": vars.eventCode}]
I am working on establishing an MQTT connection to a server.
I need to send data to the server in JSON format using AT commands.
The module used is the Neoway N58 module. The connection was established using AT commands, and publishing or subscribing data to/from the server should happen in JSON format.
The AT Command used is:
AT+MQTTPUB=1,1,<"topic_name">,<"data">
I need to send the JSON payload in place of <"data">.
Looking for any suggestions/help.
The source code is based on C.
The problem with sending JSON through AT commands is that it contains double quotes ("), which are unfortunately interpreted, according to the AT command ETSI specification, as the beginning of a string parameter. So what happens in many modules is that it is impossible to send a JSON string as a parameter.
Some modem vendors solve this issue by starting an online mode in which data can be sent raw.
The N58 instead uses a different strategy, which consists of escaping the special characters. In the AT command guide it is called data link escape.
Though the guide could be better (there is no explicit explanation of data link escape), we can infer it from the examples (see for example the one for AT+UDPSEND): in order to escape the " character, just write \" as you would in a C string. Example:
AT+MQTTPUB=1,1,"topic_name","{\"menu\":{\"id\":\"1\",\"value\":\"2\"}}"
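Since your source code is in C, note that every \" that has to reach the modem must itself be escaped again in the C string literal. A minimal sketch, reusing the topic name and payload from the example above and printing the command instead of writing it to the modem UART:

#include <stdio.h>

int main(void)
{
    /* The AT command exactly as it must travel over the serial link:
     * every " inside the JSON payload is escaped as \" (data link escape),
     * which in C source is written as \\\" so that the compiled string
     * contains the two characters backslash and double quote. */
    const char *cmd =
        "AT+MQTTPUB=1,1,\"topic_name\","
        "\"{\\\"menu\\\":{\\\"id\\\":\\\"1\\\",\\\"value\\\":\\\"2\\\"}}\"\r\n";

    /* For illustration the command is printed; in real firmware you would
     * write cmd to the modem's UART instead. */
    fputs(cmd, stdout);
    return 0;
}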
I am pretty new to the Snowflake Cloud offering and was just trying to load a simple .csv file from an AWS S3 staging area to a table in Snowflake using the copy command.
Here is what I used as the command:
copy into "database name"."schema"."table name"
from @S3_ACCESS
file_format = (format_name = format name);
When I run the above code, I get the following error: Numeric value '63' is not recognized
Please see the attached image. Not sure what this error is, and I'm not able to find any lead in the Snowflake UI itself to find out what could be wrong with the value.
Thanks in Advance!
The error says it was expecting a numeric value, but it got "63", and that value cannot be converted to a numeric value.
From the image you shared, I can see that there are some weird characters around the 6 and the 3. There could be an issue with the file encoding, or the data is corrupted.
Please check the ENCODING option for the file format:
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#format-type-options-formattypeoptions
By the way, I recommend you always use UTF-8.
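For example, a CSV file format that forces UTF-8 could look roughly like this (the format name and the other CSV options are assumptions; adjust them to match your file):

create or replace file format my_csv_format   -- hypothetical name
  type = 'CSV'
  field_delimiter = ','
  skip_header = 1
  field_optionally_enclosed_by = '"'
  encoding = 'UTF8';

copy into "database name"."schema"."table name"
  from @S3_ACCESS
  file_format = (format_name = 'my_csv_format');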
I wrote a query using the query tool in pgAdmin 4. Now I want to download the results as a CSV. I've got two problems with that.
The 'Download as CSV' button sometimes does not work, especially when the result contains 1000+ rows.
When I finally have a csv and I want to open it, this message is all I see:
"'ascii' codec can't encode character u'\xbb' in position 26: ordinal not in range(128)"
Since I'm fairly new to all of this, could someone enlighten me as to what is wrong?
On your questions:
The broken CSV download was a known bug that was fixed in pgAdmin v1.5 (Bug summary at the login-required https://redmine.postgresql.org/issues/2253; the gist is that there were multiple issues with exporting JSON data and Unicode). If you're not on that version, try updating and see whether you continue to have the issue.
You didn't specify where you're seeing that message regarding encoding, but the character referenced in the error is a "Right-Pointing Double Angle Quotation Mark" (») (http://www.codetable.net/hex/bb).
As per the official Camel document, "Expressions and Predicates can then be used to create the various Enterprise Integration Patterns in the DSL or Xml Configuration like the Recipient List.
To support dynamic rules Camel supports pluggable Expression strategies using a variety of different Languages."
Could anyone please elaborate or explain it in plain English?
Expressions are used to return any value from the current message exchange. An example of an expression would be using XPath to retrieve a node from the XML in the body of a message. When the documentation says pluggable strategies, it's essentially saying you can use different approaches/languages, including Groovy, JavaScript, etc. A predicate is a specialized expression used to evaluate a condition on the message exchange. It is similar to the condition you would find in a Java if statement. A predicate always returns a Boolean value.
Here's a code example where a message comes from ActiveMQ and, based on an XPath predicate, is routed to another queue.
from("activemq:queue:ORDER_ITEM_PROCESSING").
choice().
when().xpath("/o:Order/o:OrderType/o:FulfillmentCenter = '" +
com.pluralsight.orderfulfillment.generated.FulfillmentCenter.ABC_FULFILLMENT_CENTER.value()
+ "'", namespace).to("activemq:queue:ABC_FULFILLMENT_REQUEST")
Hope this helps.
I want to read a database result into variables so I can use them in later requests.
How can I do it?
What if I want to return multiple columns, or even rows, from the database? Can I loop over the returned table the same way I can with "CSV Data Set Config"?
--edit--
OK, I found this solution that uses a regular expression to parse the response, but this solution and others like it don't work for me, because they require me to change my SQL queries so JMeter can parse them more "easily". I'm using JMeter for load testing, and the last thing I want is to maintain two different versions of the code, one for "testing" and another for "runtime".
Is there a "specific" JDBC Request solution that enables me to read the result into variables using the concept of result sets and columns?
Using the Regular Expression Extractor shouldn't affect what your SQL statement looks like. If you need to control which part of the response you store in a variable, use a Beanshell sampler with Java code to parse out the response and store it into a variable.
You can loop through the returned table by using a ForEach Controller, referencing the variable name from the regex. Make sure that in your regex you set the Match No. to -1 to capture every possible match.
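As a rough sketch of that loop in a Beanshell/JSR223 Sampler (the reference name colValue is an assumption; use whatever name you gave the Regular Expression Extractor, with Match No. set to -1):

// Beanshell/JSR223 Sampler placed after the JDBC Request and the extractor.
// With Match No. = -1 the extractor creates colValue_matchNr plus colValue_1, colValue_2, ...
int matches = Integer.parseInt(vars.get("colValue_matchNr"));
for (int i = 1; i <= matches; i++) {
    String value = vars.get("colValue_" + i);
    log.info("row " + i + ": " + value);
    vars.put("currentValue", value);   // available to later requests as ${currentValue}
}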