Use curl to post a file from a pipe

How might I take the output from a pipe and use curl to post it as a file?
E.g. the following works:
curl -F 'file=@data/test.csv' -F 'filename=test.csv' https://mydomain#apikey=secret
I'd like to get the file contents from a pipe instead, but I can't quite figure out how to specify it as a file input. My first guess is -F 'file=@-', but that's not quite right.
cat data/test.csv | curl -F 'file=@-' -F 'filename=test.csv' https://mydomain#apikey=secret
(Here cat is just a substitute for a more complex sequence of events that would get the data)
Update
The following works:
cat test/data/test.csv | curl -XPOST -H 'Content-Type: multipart/form-data' --form 'file=@-;filename=test.csv' $url

If you add --trace-ascii - to the command line you'll see that curl already uses that Content-Type by default (and -XPOST doesn't help either). It was rather your fixed -F option that did the trick!
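As a quick way to see this for yourself, here is a minimal sketch (the endpoint is a placeholder): --trace-ascii - dumps the full outgoing request, including the multipart Content-Type header that curl generates on its own:

printf 'a,b\n1,2\n' | curl --trace-ascii - --form 'file=@-;filename=test.csv' https://example.com/upload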

Related

Why does quoting the password variable in curl lead to an authorization failure? (Bash)

I have a very specific problem with bash and curl.
What we do is:
read a password from Jenkins and paste it into a config file (I don't have access to the password)
read the parameters from the config file in bash (host, user, password, etc.) and store them in variables
post something with curl to a database and store the result in a variable
Recently we added shellcheck to our deploy-scripts and therefore we need to put the variables in quotes.
That's the request we want to send (shellcheck-approved):
result=$(curl -s -XPOST "${dbURL}" --header "Authorization: Basic $(echo -n "${dbUser}:${dbPwd}" | base64)" --data-binary "blabla")
And here's the error message we get in return:
{"error":"authorization failed"}
It does work when I unquote the password variable ("${dbUser}":${dbPwd}). But then shellcheck complains that I need to put all variables in quotes. It also works on another machine with a different password (which I have no access to either).
It is the same when I use --user username:password. So it seems like the problem lies within the password.
Googling and testing the procedure (without the curl) with different special characters couldn't solve it either.
Has anyone experienced something like this?
Edit1:
This is an extract from the Jenkins deploy file:
stage('config files') {
    withCredentials([string(credentialsId: "${env_params.db_password}", variable: 'db_pw')]) {
        sshagent(credentials: ["${env_params.user}"]) {
            sh "echo \"dbPwd=${db_pw}\" >> environment_variables/config.properties"
        }
    }
}
This is how the shell script stores the password:
dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2)
Thanks for your support.
It seems like there was trailing whitespace in the stored password.
I removed it using sed and now it works.
dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2 | sed -e 's/[[:space:]]*$//')
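To see why trailing whitespace breaks Basic auth even when it is invisible in the shell, compare the tokens (the credentials here are made up): a single trailing space yields a different base64 string, so the server sees different credentials:

echo -n 'user:secret' | base64    # dXNlcjpzZWNyZXQ=
echo -n 'user:secret ' | base64   # dXNlcjpzZWNyZXQg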
You can just set the password in another file and use the content of the file as your password variable.
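A minimal sketch of that approach (the file name is hypothetical): keep the password alone in its own file, and let command substitution strip the trailing newline:

dbPwd=$(<environment_variables/db_password)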

IMAP: Get multiple sections from an email with one command

I crossposted this in the curl mailing list (curl-users).
I know how to get multiple parts of an email header like this:
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE%20FROM%20TO%20SUBJECT)" -u "user:password"
But is it possible to get multiple sections (received date and text in my case) at once? I want to combine these two commands:
1: Fetch email receiving date
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE)" -u "user:password"
2: Fetch email text
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=TEXT" -u "user:password"
I tried something like this:
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE);SECTION=TEXT" -u "user:password"
–––––––––––––––––––––
Thank you diciu, this works just fine. I have an additional question:
When I add --output output.txt the file only contains the last FETCH (SECTION=TEXT in this case). Is it possible to output all FETCH results to one output file? I know I can use >> output.txt to append, but I use that to write a log file within the same command.
Here my full command:
curl --url "imaps://imap.ionos.de/INBOX;UID=295;SECTION=HEADER.FIELDS%20(DATE)" "imaps://imap.ionos.de/INBOX;UID=295;SECTION=TEXT"-u "user:password" --output output.txt --verbose >> logfile.log 2>&1
You have to add the two requests one after another, e.g.
curl --url "imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE)" "imaps://imap.ionos.de/INBOX;UID=216;SECTION=TEXT" -u "user:password"
curl will issue the two commands on the same session, that is after SELECT-ing the folder it will yield two FETCH commands.
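Regarding the --output follow-up: curl pairs one -o/--output with each URL on the command line, so one way to get both FETCH results into a single file (UID and credentials are placeholders, as above) is to write each section to its own file and concatenate afterwards:

curl -u "user:password" \
"imaps://imap.ionos.de/INBOX;UID=216;SECTION=HEADER.FIELDS%20(DATE)" -o date.txt \
"imaps://imap.ionos.de/INBOX;UID=216;SECTION=TEXT" -o text.txt
cat date.txt text.txt > output.txt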

How to convert multiple documents using the Document Conversion service in a bash script?

How can I convert more than one document using the Document Conversion service.
I have between 50 and 100 MS Word and PDF documents that I want to convert using the convert_document API method.
For example, can you supply multiple *.pdf or *.doc files like this?
curl -u "username":"password" -X POST \
-F "config={\"conversion_target\":\"ANSWER_UNITS\"};type=application/json" \
-F "file=@\*.doc;type=application/msword" \
"https://gateway.watsonplatform.net/document-conversion-experimental/api/v1/convert_document"
That gives an error unfortunately: curl: (26) couldn't open file "*.doc".
I have also tried "file=@file1.doc,file2.doc,file3.doc" but that gives errors as well.
The service only accepts one file at a time, but you can call it multiple times.
#!/bin/bash
USERNAME="<service-username>"
PASSWORD="<service-password>"
URL="https://gateway.watsonplatform.net/document-conversion-experimental/api/v1/convert_document"
DIRECTORY="/path/to/documents"
for doc in "$DIRECTORY"/*.doc
do
  echo "Converting - $doc"
  curl -u "$USERNAME:$PASSWORD" \
    -F 'config={"conversion_target":"ANSWER_UNITS"};type=application/json' \
    -F "file=@$doc;type=application/msword" "$URL"
done
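Since the question mentions PDF files as well, here is a variant sketch (same hypothetical credentials and URL) that picks the MIME type per file extension:

for doc in "$DIRECTORY"/*.doc "$DIRECTORY"/*.pdf
do
  [ -e "$doc" ] || continue   # skip patterns that matched nothing
  case "$doc" in
    *.pdf) type="application/pdf" ;;
    *)     type="application/msword" ;;
  esac
  echo "Converting - $doc"
  curl -u "$USERNAME:$PASSWORD" \
    -F 'config={"conversion_target":"ANSWER_UNITS"};type=application/json' \
    -F "file=@$doc;type=$type" "$URL"
done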
Document Conversion documentation and API Reference.

Why is checking a URL failing when run through Icinga?

I created my own command to check a specific URL
define command{
    command_name    check_url
    command_line    /usr/lib/nagios/plugins/check_http -f follow -H '$HOSTNAME$' -I '$HOSTADDRESS$' -u '$ARG1$'
}
If I run my command from the command line, it works:
/usr/lib/nagios/plugins/check_http -f follow -H www.example.com -u http://www.example.com/server-status
HTTP OK: HTTP/1.1 200 OK - 4826 bytes in 0.011 second response time |time=0.010625s;;;0.000000 size=4826B;;;0
But when run through Icinga, I'm getting
HTTP WARNING: HTTP/1.1 404 NOT FOUND - 314 bytes in 0.011 second response time
My guess is that for the check_http plugin's -u option you should provide the path appended after the server name, not the whole URL.
Ex.
/usr/lib/nagios/plugins/check_http -f follow -H www.example.com -u /server-status
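Applied to the original setup, a hypothetical service definition would then pass only the path as $ARG1$:

define service{
    use                     generic-service
    host_name               www.example.com
    service_description     server-status
    check_command           check_url!/server-status
}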
Your manual test is not equivalent to your command definition.
The distinction with -H/-I is subtle, but very important.
When I have problems like this, where Icinga abstracts away exactly how it executes the command, I find it helpful to find out precisely what Icinga is executing. I would accomplish this as follows:
Move check_http to a temporary location
# mv /usr/lib/nagios/plugins/check_http /usr/lib/nagios/plugins/check_http_actual
Make a bash script that Icinga will call instead of the actual check_http script
# vi /usr/lib/nagios/plugins/check_http
In that file, create this simple bash script, which simply echoes the command line arguments it was called with, then exits:
#!/bin/bash
echo "$@"
Then of course, make that bash script executable:
# chmod +x /usr/lib/nagios/plugins/check_http
Now in Icinga, run the check_url check. At that point, the return status shown in the Icinga web interface will show exactly how Icinga is calling check_http. Seeing the raw command, it should be obvious what Icinga is doing wrong. Once you correct Icinga's mistake, you can simply move the original check_http script back into place:
# mv /usr/lib/nagios/plugins/{check_http_actual,check_http}

Solr Server Posting Error

How do I post 5000 files to a Solr server?
When posting them with "java -jar post.jar dir/*.xml", the shell fails with "Argument list too long".
The quickest solution would be using a bash script like the following:
for i in *.xml; do
  cat "$i" | curl -X POST -H 'Content-Type: text/xml' -d @- http://localhost:8080/solr/update
  echo "item: $i"
done
which adds to Solr, using curl, all the xml files within the current directory.
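One caveat worth adding: unless the server is set to auto-commit, the documents won't be searchable until a commit is sent. Assuming the same classic XML update handler as in the loop above, that can be done with one more curl call:

curl -X POST -H 'Content-Type: text/xml' -d '<commit/>' http://localhost:8080/solr/update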
Otherwise you can write a Java main similar to the one included in post.jar, which adds all the xml files within a directory instead of having to pass all of them as arguments.
