Please forgive me for the potentially basic question, but I am a z/OS person trying to learn cURL and Cloudant. I have gotten the following example to work to add a record to a database (using the Windows command prompt):
curl -X POST -b /tmp/cloudant.cookie -H "Content-Type: application/json" -d "{\"_id\":\"2\",\"empName\":\"John Doe\",\"phone\":\"646-598-4133\",\"age\":\"28\"}" --url https://xxxxxxxxxx-bluemix.cloudant.com/rcdb
Now I would like to add image1.jpg to that record as an _attachments entry...
Could anyone please tell me what the syntax on Windows would be? I have tried a few combinations but nothing so far works.
To add an attachment follow the instructions in the Cloudant documentation at https://docs.cloudant.com/attachments.html
Example:
Assuming you have already created a document with ID "2" and revision number "1-954695fb9642f02975d76b959d0b5e98" in database rcdb, run the following command:
curl -X PUT -H "Content-Type: image/jpeg" --data-binary "@image1.jpg" --url https://xxxxxxxxxx-bluemix.cloudant.com/$DATABASE/$DOCUMENT_ID/$ATTACHMENT?rev=$REV
Replace $DATABASE with rcdb, $DOCUMENT_ID with 2, $REV with 1-954695fb9642f02975d76b959d0b5e98, and $ATTACHMENT with the desired attachment name, e.g. mypic. (Note the @ before the file name: it tells curl to read the request body from that file.)
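For example, a fully substituted sketch for the Windows command prompt, assuming image1.jpg is in the current directory, the revision above is still current, and you reuse the cookie jar from your POST:
curl -X PUT -b /tmp/cloudant.cookie -H "Content-Type: image/jpeg" --data-binary "@image1.jpg" --url "https://xxxxxxxxxx-bluemix.cloudant.com/rcdb/2/mypic?rev=1-954695fb9642f02975d76b959d0b5e98"
A successful upload returns a JSON body containing "ok":true and a new _rev, which you must use for any further update to document 2.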
The Kafka Connect distributed service started successfully:
[2021-10-17 18:04:29,693] INFO Started o.e.j.s.ServletContextHandler@1422ac7f{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:916)
[2021-10-17 18:04:29,693] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:319)
[2021-10-17 18:04:29,693] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:57)
But I am not able to register the connector:
curl -X PUT -H "Content-Type:application/json" --data '{"name": "file-stream-demo-distributed","config":{"connector.class":"com.snowflake.kafka.connector.SnowflakeSinkConnector","topic":"demo-2-distributed","file":"/home/ramakrishnakonda/kafka_2.13-2.8.0/config/connect-distributed.properties"}}' http://localhost:8083/connectors
{"error_code":405,"message":"HTTP 405 Method Not Allowed"}
Please help
Use POST rather than PUT, as in this example:
$ curl -X POST -H "Content-Type: application/json" --data '{"name": "local-file-sink", "config": {"connector.class":"FileStreamSinkConnector", "tasks.max":"1", "file":"test.sink.txt", "topics":"connect-test" }}' http://localhost:8083/connectors
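For completeness: Kafka Connect does accept PUT, but on the per-connector config endpoint rather than on /connectors, and the body there is only the config map, without the name/config wrapper. A hedged sketch, reusing the worker address and connector name from above (in practice you would send the full Snowflake config shown in the next command):
curl -X PUT -H "Content-Type: application/json" --data '{"connector.class":"com.snowflake.kafka.connector.SnowflakeSinkConnector","tasks.max":"1","topics":"demo-2-distributed"}' http://localhost:8083/connectors/file-stream-demo-distributed/config
This endpoint creates the connector if it does not exist yet and updates it otherwise.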
curl -X POST "http://localhost:8083/connectors" -H "Content- type:application/json" --data
"{
"name":"file-stream-demo-distributed",
"config":{
"connector.class":"com.snowflake.kafka.connector.SnowflakeSinkConnector",
"tasks.max":"1",
"topics":"demo-2-distributed",
"buffer.count.records":"10000",
"buffer.flush.time":"60",
"buffer.size.bytes":"5000000",
"snowflake.url.name":"XXXXXXXX.XXXXXXXXX.snowflakecomputing.com:443",
"snowflake.user.name":"kafka_connector_user_1",
"snowflake.private.key":"XXXXXXXXXXXXXXXXXXX+XXXXXXXXXXXXXXXXXXXXXXXXXXXXX,
"snowflake.database.name":"KAFKA_DB",
"snowflake.schema.name":"KAFKA_SCHEMA",
"key.converter":"org.apache.kafka.connect.storage.StringConverter",
"key.converter.schemas.enable":"true", "value.converter":"com.snowflake.kafka.connector.records.SnowflakeJsonConverter",
"value.converter:schemas.enable":"true",
"value.converter.schema.registry.url":"http://localhost:8081"
}
}"
I was trying to send a POST request to the Flink JobManager running on a Kubernetes cluster. A POST to /jars/:jarid/run for a class which doesn't need any command-line arguments works fine, but trying to submit a different class in the same jar which requires command-line arguments gives the following error:
{"errors":["Request did not match expected format JarRunRequestBody."]}
However, passing the command-line arguments and submitting the job directly like the following works:
./flink run -m localhost:30287 -c com.class.name ~/path/to/jar/1.0-1.0-SNAPSHOT.jar --bootstrap.servers izac-cp-kafka:9092 --group.id test --topic bank_transaction --schema.registry http://mysr-schema-registry:8081 --CepJson """{\"keyId\": \"customer_id\",\"pattern\": [{\"patternName\": \"p1\",\"simpleCondition\":{\"columnName\": \"amount\",\"operator\": \">\",\"value\": \"50\",\"dataType\": \"Int\"}},{\"patternName\":\"p2\",\"simpleCondition\":{\"columnName\":\"amount\",\"operator\":\">\",\"value\":\"30\",\"dataType\":\"Int\"}}],\"connector\":[{\"name\":\"begin\",\"connectorType\":\"next\",\"start\":\"p1\",\"end\":\"p2\"}]}"""
To convert the above command to a Flink REST-based POST request, I did the following:
curl -k -v -X POST -H "Content-Type: application/json" --data '{ "entryClass":"com.class.name", "programArgsList": [ "--bootstrap.servers izac-cp-kafka:9092", "--group.id test", "--topic bank_transaction", "--schema.registry http://mysr-schema-registry:8081", "--CepJson """{\"keyId\": \"customer_id\",\"pattern\": [{\"patternName\": \"p1\",\"simpleCondition\":{\"columnName\": \"amount\",\"operator\": \">\",\"value\": \"50\",\"dataType\": \"Int\"}},{\"patternName\":\"p2\",\"simpleCondition\":{\"columnName\":\"amount\",\"operator\":\">\",\"value\":\"30\",\"dataType\":\"Int\"}}],\"connector\":[{\"name\":\"begin\",\"connectorType\":\"next\",\"start\":\"p1\",\"end\":\"p2\"}]}""""]}' http://localhost:30287/jars/2a788e33-c92d-47c4-84af-31e3dff28666_1.0-1.0-SNAPSHOT.jar/run
However, this gave the error mentioned earlier. I just want to convert the above command-line job submission into a REST API based submission to the Flink cluster.
Note: the POST request targets a Flink cluster that already contains the required jar; I only want to submit a job using a particular class from it.
I don't think curl treats """ as string interpolation the way Scala does, so it is certainly not sending the correct CepJson parameter; I would start by changing that.
I changed the above curl request to the following and it worked:
curl -k -v -X POST -H "Content-Type: application/json" --data '{ "entryClass":"com.class.name", "programArgsList": [ "--bootstrap.servers", "izac-cp-kafka:9092", "--group.id"," test", "--topic","bank_transaction", "--schema.registry", "http://mysr-schema-registry:8081", "--CepJson", "{\"keyId\": \"customer_id\",\"pattern\": [{\"patternName\": \"p1\",\"simpleCondition\":{\"columnName\": \"amount\",\"operator\": \">\",\"value\": \"50\",\"dataType\": \"Int\"}},{\"patternName\":\"p2\",\"simpleCondition\":{\"columnName\":\"amount\",\"operator\":\">\",\"value\":\"30\",\"dataType\":\"Int\"}}],\"connector\":[{\"name\":\"begin\",\"connectorType\":\"next\",\"start\":\"p1\",\"end\":\"p2\"}]}"]}' http://localhost:30287/jars/2a788e33-c92d-47c4-84af-31e3dff28666_1.0-1.0-SNAPSHOT.jar/run
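Side note for anyone debugging the same thing: the jar id used in the /jars/<id>/run URL can be looked up from the cluster itself, assuming the same REST port as above:
curl http://localhost:30287/jars
The response lists every uploaded jar with its id and detected entry classes, which is handy before building the run request.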
I crossposted this in the curl mailing list (curl-users).
I know how to get multiple parts of an email header like this:
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE%20FROM%20TO%20SUBJECT)" -u "user:password"
But is it possible to get multiple sections (received date and text in my case) at once? I want to combine these two commands:
1: Fetch email receiving date
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE)" -u "user:password"
2: Fetch email text
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=TEXT" -u "user:password"
I tried something like this:
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE);SECTION=TEXT" -u "user:password"
–––––––––––––––––––––
Thank you diciu, this works just fine. I have an additional question:
When I add --output output.txt, the file only contains the last FETCH (SECTION=TEXT in this case). Is it possible to write all FETCH results to one output file? I know I can use >> output.txt to append, but I already use that redirection to write a log file within the same command.
Here my full command:
curl --url "imaps://imap.ionos.de/INBOX;UID=295;SECTION=HEADER.FIELDS%20(DATE)" "imaps://imap.ionos.de/INBOX;UID=295;SECTION=TEXT"-u "user:password" --output output.txt --verbose >> logfile.log 2>&1
You have to add the two requests one after another, e.g.
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE)" "imaps://imap.ionos.de/INBOX;UID=216;SECTION=TEXT" -u "user:password"
curl will issue the two requests on the same session; that is, after SELECT-ing the folder it will send two FETCH commands.
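Regarding the follow-up about --output: curl pairs each --output/-o option with one URL, so one way to keep both results (and then combine them yourself) is a sketch like the following, assuming the same mailbox and UID as in your full command:
curl -u "user:password" --url "imaps://imap.ionos.de/INBOX;UID=295;SECTION=HEADER.FIELDS%20(DATE)" -o date.txt --url "imaps://imap.ionos.de/INBOX;UID=295;SECTION=TEXT" -o text.txt --verbose >> logfile.log 2>&1
copy date.txt+text.txt output.txt
The copy line is the Windows way to concatenate the two parts into a single output.txt; on a Unix shell, cat date.txt text.txt > output.txt does the same.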
I am trying to update a classifier. The zip folder contains more than 10 images, but I am still not able to update it.
Tried via the Swagger URL: https://watson-api-explorer.ng.bluemix.net/apis/visual-recognition-v3#!/Custom/updateClassifier
URL: https://gateway.watsonplatform.net/visual-recognition/api/v3/classifiers/sports_cars_1042527461?version=2018-03-19&api_key=xxxxxxxxxxxxxx
CURL: curl -X POST --header 'Content-Type: multipart/form-data' --header 'Accept: application/json' {"type":"formData"} 'https://gateway.watsonplatform.net/visual-recognition/api/v3/classifiers/sports_cars_1042527461?version=2018-03-19&api_key=xxxxx'
RESPONSE: {
"error": {
"code": 400,
"error_id": "input_error",
"description": "Cannot execute learning task. : Unable to create retraining task - previous training data not present."
}
}
I tried it with Node.js code too but got the same error.
Is there anything I missed or did wrong?
One thing I noticed is that you are mixing the URL and authentication styles for older and newer classifiers.
For classifiers created before May 23, you use the URL gateway-a. and &api_key=....
For classifiers created afterward, you use the URL gateway. and IAM authentication (-u "apikey:{apikey}").
so
curl -X POST \
  -F "sportscars_positive_examples=@sc.zip" \
  -F "negative_examples=@suvs.zip" \
  "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classifiers/sports_cars_1042527461?version=2018-03-19&api_key=xxxxxxxxxxxxxx"
or
curl -X POST -u "apikey:yyyyyyyyyyyyyyyyyyy" \
  -F "sportscars_positive_examples=@sc.zip" \
  -F "negative_examples=@suvs.zip" \
  "https://gateway.watsonplatform.net/visual-recognition/api/v3/classifiers/sports_cars_1042527461?version=2018-03-19"
For details, look at the API reference.
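If it helps, once the retraining request is accepted you can poll the classifier until its status goes from retraining back to ready; a hedged sketch using the newer IAM style from above:
curl -u "apikey:yyyyyyyyyyyyyyyyyyy" "https://gateway.watsonplatform.net/visual-recognition/api/v3/classifiers/sports_cars_1042527461?version=2018-03-19"
The response includes a status field (training, retraining, ready, or failed) and an explanation when something has gone wrong.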
I'm working on a simple Slack integration for SVN projects via a client-side post-commit hook in TortoiseSVN on Windows, using curl to send JSON. I would like to post some basic information, ideally the revision number and comment, when a commit is made. I added a client-side post-commit hook that executes a postcommit.bat file, and it succeeds in sending my Slack channel a message. What I can't get working is including the variables passed in from TortoiseSVN. According to the TortoiseSVN docs, a client-side post-commit hook receives the following arguments: PATH DEPTH MESSAGEFILE REVISION ERROR CWD (https://tortoisesvn.net/docs/release/TortoiseSVN_en/tsvn-dug-settings.html).
Here is my working code, which sends a static string (not including the message/revision) to my Slack channel:
C:\p\curl -H "Content-Type: application/json" -X POST https://hooks.slack.com/services/MYSECRETSERVICE -d "{\"text\":\"Test message\"}"
I would like to do something like the following, but it does not work; it simply posts "RevMessage", in other words the parameters are not substituted:
C:\p\curl -H "Content-Type: application/json" -X POST https://hooks.slack.com/services/MYSECRETSERVICE -d "{\"text\":\"Rev%REVISION%Message%MESSAGE%\"}"
Okay, this may have been so specific (judging by the view count) that I was bound to answer it myself, lol. Anyway, perhaps someone else will find this useful; here is what ended up working for me. First the variables need to be set from the batch arguments, then they can be referenced as shown:
REM TortoiseSVN passes: PATH DEPTH MESSAGEFILE REVISION ERROR CWD
set tsvnPath=%1
set tsvnDepth=%2
set tsvnMsgFile=%3
REM Read the commit message (first line of the message file) into THEMSG
set /p THEMSG=<%3
set tsvnRevision=%4
set tsvnError=%5
set tsvnCwd=%6
REM Post the commit message and revision number to Slack
C:\p\curl -H "Content-Type: application/json" -X POST https://hooks.slack.com/services/MYSECRET -d "{\"text\":\"MSG %THEMSG% REV %tsvnRevision%\"}"
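In case it saves someone a round trip through TortoiseSVN: the script can be smoke-tested from a plain command prompt by passing six dummy arguments in the hook's order (the paths, message file, and revision below are hypothetical):
echo Test commit message> C:\temp\msg.txt
postcommit.bat C:\repo\working-copy 0 C:\temp\msg.txt 1234 "" C:\repo\working-copy
A test post should then show up in the Slack channel as MSG Test commit message REV 1234.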