Why does quoting the password variable in curl lead to an authorization failure? (Bash)

I have a very specific problem with bash and curl.
What we do is:
read a password from Jenkins and paste it into a config file (I don't have access to the password)
read the parameters from the config file in bash (host, user, password, etc.) and store them in variables
post something with curl to a database and store the result in a variable
Recently we added shellcheck to our deploy scripts, and therefore we need to put the variables in quotes.
That's the request we want to send (shellcheck-approved):
result=$(curl -s -XPOST "${dbURL}" --header "Authorization: Basic $(echo -n "${dbUser}:${dbPwd}" | base64)" --data-binary "blabla")
And here's the error message we get in return:
{"error":"authorization failed"}
It does work when I unquote the password variable ("${dbUser}":${dbPwd}), but then shellcheck complains that I need to put all variables in quotes. It also works on another machine with a different password (which I have no access to either).
The same happens when I use --user username:password, so it seems the problem lies within the password.
Googling and testing the procedure (without the curl call) with different special characters didn't solve it either.
Has anyone experienced something like this?
Edit 1:
This is an extract from the Jenkins deploy file:
stage('config files') {
    withCredentials([string(credentialsId: "${env_params.db_password}", variable: 'db_pw')]) {
        sshagent(credentials: ["${env_params.user}"]) {
            sh "echo \"dbPwd=${db_pw}\" >> environment_variables/config.properties"
This is how the shell script reads the password into a variable:
dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2)

Thanks for your support.

It seems like there was trailing whitespace in the stored password. I removed it using sed and now it works.
dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2 | sed -e 's/[[:space:]]*$//')
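For reference, the trailing whitespace can also be stripped with pure bash parameter expansion instead of sed (a minimal sketch, assuming the same config.properties layout as above):

dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2)
dbPwd="${dbPwd%"${dbPwd##*[![:space:]]}"}"  # remove trailing whitespace in place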

Alternatively, you can store the password in a separate file and use the file's content as your password variable.
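A minimal sketch of that approach, assuming a hypothetical /etc/myapp/db_pw file that contains nothing but the password:

dbPwd=$(</etc/myapp/db_pw)  # command substitution strips the trailing newline
result=$(curl -s -XPOST "${dbURL}" --user "${dbUser}:${dbPwd}" --data-binary "blabla")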

Use curl to post file from pipe

How might I take the output from a pipe and use curl to post that as a file?
E.g. the following works:
curl -F 'file=@data/test.csv' -F 'filename=test.csv' https://mydomain#apikey=secret
I'd like to get the file contents from a pipe instead, but I can't quite figure out how to specify it as a file input. My first guess is -F 'file=@-' but that's not quite right.
cat data/test.csv | curl -F 'file=@-' -F 'filename=test.csv' https://mydomain#apikey=secret
(Here cat is just a substitute for a more complex sequence of events that would get the data)
Update
The following works:
cat test/data/test.csv | curl -XPOST -H 'Content-Type:multipart/form-data' --form 'file=@-;filename=test.csv' "$url"
If you add --trace-ascii - to the command line you'll see that curl already uses that Content-Type by default (and -XPOST doesn't help either). It was rather your fixed -F option that did the trick!
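In other words, the command can be trimmed down to the part that matters (a sketch reusing the same $url placeholder):

cat test/data/test.csv | curl --form 'file=@-;filename=test.csv' "$url"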

Bash: works at command line but getting 'curl: (1) Protocol ""https" not supported or disabled in libcurl' when used in a script

Got stuck in a situation where a curl request works fine on the command line but does not work in a script, giving weird errors. Here is the relevant part of the script:
#we already got the $token
#vars defining curl parameters
dc="NA EU AU SEA JP"
curl_url='https://example.com/api/config/v1/'
curl_request='alertingProfiles?'
curl_properties='tenant=all&stage=prd&cluster='
curl_auth='"Authorization: Bearer '"$token"'"'
for k in $dc; do
# Download the json for each DC using the curl request
curl_combined=\""$curl_url$curl_request$curl_properties$k"\"; echo "$curl_combined"
curlCMD=( -vv --location --request GET "$curl_combined" -H "$curl_auth" -o \""$k.json"\")
echo "${curlCMD[#]}"
curl "${curlCMD[#]}"
done
When I run this in a script I see the following output:
First, the output from the first echo command
"https://example.com/api/config/v1/alertingProfiles?tenant=all&stage=prd&cluster=NA"
Then, the output from the second echo command, which is what curl should take as parameters. I've omitted the bearer token, but trust me, it is not the problem here.
-vv --location --request GET "https://example.com/api/config/v1/alertingProfiles?tenant=all&stage=prd&cluster=NA" -H "Authorization: Bearer " -o "NA.json"
And I am getting the following output:
Note: Unnecessary use of -X or --request, GET is already inferred.
* Protocol ""https" not supported or disabled in libcurl
* Closing connection -1
curl: (1) Protocol ""https" not supported or disabled in libcurl
Note the two double quotes in front of https in the message above.
If I simply copy/paste the above second echo output onto the command line right after curl and execute it, it works as expected and downloads the JSON into the file.
If I remove the \" from the curl_combined var, the request is sent without double quotes, and I then get curl: (56) Unexpected EOF from the script.
If I copy/paste the second echo output (without double quotes), supply it to curl, and execute from the command line, I get a 403 until I put double or single quotes around the https request.
In short, it seems that double quotes around the https request are necessary, but for some reason curl does not accept them.
So what went wrong here? I don't even mind writing the constructed URL to a file and reading it back, or using a herestring, if that helps, but I want to know what exactly went wrong here.
Thanks.
P.S. Removing --location or --request GET or both does not change the outcome.
The script is passing the "${curlCMD[@]}" array as parameters to curl. This approach handles the correct expansion of the URL, regardless of any special characters in it. There is no need to wrap curl_combined in additional quotes using the \" construct.
Suggestion: replace the setting of curl_combined with:
curl_combined="$curl_url$curl_request$curl_properties$k"
echo "$curl_combined"
This solution works, replacing the array-based curl call with the following:
curl -vv \
--location \
--request GET \
"$curl_combined" \
--header "Authorization: Bearer ${token}" \
-o "$k.json"
But it is ugly, to put it politely...
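For completeness, the array approach itself also works once the embedded \" quotes are dropped; the shell's own quoting is all that is needed (a sketch using the same variables as above):

curl_combined="$curl_url$curl_request$curl_properties$k"
# note: curl_auth also had literal double quotes baked in, so the header is rebuilt here without them
curlCMD=( -vv --location "$curl_combined" -H "Authorization: Bearer $token" -o "$k.json" )
curl "${curlCMD[@]}"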

psql Batch File - Escaping "Not Equal" Operator

I'm working on a batch file that will import data into the PostgreSQL database I use for testing. The batch file drops all of the databases, then recreates/reloads them from a previous dump file made from our production database. However, I sometimes run into a problem if I've accidentally left a connection open to that server/database. The "drop" portion fails because there are still users connected (me).
I've been trying to "tweak" my batch file with a command to disconnect all users from the database(s) prior to issuing the command to drop them, but I can't get that part (disconnection) to work. I've taken the disconnect code from another SO question How to drop a PostgreSQL database if there are active connections to it?, and I've been looking at other questions like How to execute postgres' sql queries from batch file? for help with the syntax.
I've also seen the "alternate" syntax for a not equal operator on the 9.2. Comparison Functions and Operators page of the official PostgreSQL documentation, but that seems to also be using "special" characters that would require escaping, so I'm not sure how to proceed.
At this point, the batch file looks like this:
@Echo OFF
SET PGPASSWORD=PASSWORD
cd /D "C:\PostgreSQL\bin"
psql.exe -h localhost -p 5432 -d postgres -U username -c 'SELECT pg_terminate_backend(pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = ''betadb'' AND pid \<\> pg_backend_pid();'
dropdb.exe -h localhost -p 5432 -U username betadb
psql.exe -h localhost -p 5432 -d postgres -U username < "C:\PostgresSQL\prodserverdump.sql"
Everything else works except for the pg_terminate_backend query. Every time I run that, I get strange errors indicating a problem with a path, or a file, or something else like that. I believe I've narrowed the problem down to the "not equal" operator (<>) in the query, but I can't seem to find the correct way to escape this so it doesn't try to pipe in data from a file that's not being defined.
I've tried using single backslashes (\) and double backslashes (\\), in front of one or both of the characters in the operator, but that doesn't appear to work. Is there a special way to escape the "greater than" and "less than" characters for the -c command line option in psql?
Using a combination of suggestions and "trial & error", I believe I found the correct syntax for executing this particular SQL command through a batch file.
Trying the "alternative" not equal operator (!=), I was still getting errors. They were different errors (it was giving me some nonsense about too many parameters), but it still wouldn't execute.
Using #Compo's suggestion from the comments, I then tried to enclose the entire SELECT statement in double quotes instead of single quotes. Still not quite there.
Finally, I removed the "extra" single quotes I was using around the database names from before. The query appears to have executed properly.
The final result looks like this:
@Echo OFF
SET PGPASSWORD=PASSWORD
cd /D "C:\PostgreSQL\bin"
psql.exe -h localhost -p 5432 -d postgres -U username -c "SELECT pg_terminate_backend(pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = 'betadb' AND pid != pg_backend_pid();"
dropdb.exe -h localhost -p 5432 -U username betadb
psql.exe -h localhost -p 5432 -d postgres -U username < "C:\PostgresSQL\prodserverdump.sql"
I suppose I had assumed that, because all of the examples I had found were using single quotes to surround the SQL statement, that's what I had to use. Apparently, that assumption was incorrect.
Regardless, it all seems to be working correctly now. Hope this helps someone else who's looking to accomplish something similar.
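As an aside, quoting headaches like this can be sidestepped entirely by putting the statement in its own file and running it with psql's -f option, so cmd.exe never parses the SQL. A sketch, assuming a hypothetical terminate_connections.sql saved next to the batch file and containing the SELECT pg_terminate_backend(...) statement:

psql.exe -h localhost -p 5432 -d postgres -U username -f terminate_connections.sql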

How to convert multiple documents using the Document Conversion service in a bash script?

How can I convert more than one document using the Document Conversion service?
I have 50-100 MS Word and PDF documents that I want to convert using the convert_document API method.
For example, can you supply multiple *.doc or *.pdf files like this?
curl -u "username":"password" -X POST
-F "config={\"conversion_target\":\"ANSWER_UNITS\"};type=application/json"
-F "file=#\*.doc;type=application/msword"
"https://gateway.watsonplatform.net/document-conversion-experimental/api/v1/convert_document"
That gives an error unfortunately: curl: (26) couldn't open file "*.doc".
I have also tried "file=@file1.doc,file2.doc,file3.doc" but that gives errors as well.
The service only accepts one file at a time, but you can call it multiple times, as in the script below.
#!/bin/bash
USERNAME="<service-username>"
PASSWORD="<service-password>"
URL="https://gateway.watsonplatform.net/document-conversion-experimental/api/v1/convert_document"
DIRECTORY="/path/to/documents"
for doc in "$DIRECTORY"/*.doc
do
  echo "Converting - $doc"
  curl -u "$USERNAME:$PASSWORD" \
    -F 'config={"conversion_target":"ANSWER_UNITS"};type=application/json' \
    -F "file=@$doc;type=application/msword" "$URL"
done
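Since the question mentions both Word and PDF documents, the loop can also pick the media type per extension (a sketch reusing the same placeholder credentials and directory):

for doc in "$DIRECTORY"/*.doc "$DIRECTORY"/*.pdf
do
  case "$doc" in
    *.pdf) type="application/pdf" ;;
    *)     type="application/msword" ;;
  esac
  echo "Converting - $doc"
  curl -u "$USERNAME:$PASSWORD" \
    -F 'config={"conversion_target":"ANSWER_UNITS"};type=application/json' \
    -F "file=@$doc;type=$type" "$URL"
done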
Document Conversion documentation and API Reference.

Solr Server Posting Error

How do I post 5000 files to a Solr server?
When posting via the command "java -jar post.jar dir/*.xml", the shell complains that the argument list is too long.
The quickest solution would be using a bash script like the following:
for i in *.xml; do
  # --data-binary keeps the XML exactly as-is; plain -d would strip newlines
  curl -X POST -H 'Content-Type: text/xml' --data-binary @"$i" http://localhost:8080/solr/update
  echo "item: $i"
done
which adds to Solr, using curl, all the xml files within the current directory.
Otherwise you can write a Java main similar to the one included in post.jar, which adds all the xml files within a directory instead of having to pass all of them as arguments.
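Alternatively, find can stream the file names to curl one at a time, which never hits the argument-list limit and also copes with unusual file names (a sketch against the same local Solr URL):

find dir -name '*.xml' -print0 |
  xargs -0 -I{} curl -s -X POST -H 'Content-Type: text/xml' --data-binary @{} http://localhost:8080/solr/update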
