Looping Through a (CSV) in Bash as variables - arrays

I'm trying to use a CSV file with 3 columns (Name, Date, Manager) and run a script that reads the CSV and uses its fields to send an email for every row.
The test rows look like this:
Name Date Manager
user1 11/11/2018 test1
user2 11/12/2018 test2
user3 11/13/2018 test3
What I have so far is this
while IFS="," read Name Date Manager
do
echo $Name, $Date $Manager
sendEmail -o tls=yes -f YourEmail@gmail.com -t $Manager@domain.com -s smtp.gmail.com:587 -xu my@gmail.com -xp YOURPASSWORD -u "Hello $Name" -m "How are you? Testing $Date."
done < test.csv
echo $Name, $Date $Manager only prints the first row:
user1 11/11/2018 test1
so I think it will only send one email instead of 3.
Any help would be amazing.
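One common cause of the single-row symptom (an assumption, since behaviour depends on the sendEmail version): a command inside the loop that reads stdin consumes the rest of test.csv, so `read` finds nothing after the first iteration. Feeding the loop through a separate file descriptor keeps the CSV away from the loop body. A minimal sketch, with sample data standing in for the real file:

```shell
# sample CSV standing in for the real test.csv (hypothetical data)
cat > test.csv <<'EOF'
user1,11/11/2018,test1
user2,11/12/2018,test2
user3,11/13/2018,test3
EOF

# read via fd 3 so a stdin-hungry command in the body cannot swallow the rows
while IFS=, read -r -u 3 Name Date Manager
do
  echo "$Name, $Date, $Manager"
  # sendEmail ... < /dev/null   # alternatively, starve the mailer's stdin
done 3< test.csv
```

Quoting the variables also protects names and dates that contain spaces from word splitting.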

Script to add users to mariadb

I am trying to add users to MariaDB. Male users use an airline as the password and female users use a continent; all this info is taken from a CSV file.
For some reason my script is not working. Here is the script:
#!/bin/bash -p
FILENAME=mgarr048.csv
while IFS=: read -r username first last gender dob state municipality season continent elective f1 airline
do
(( gender == "m" )) || continue
mysql -e GRANT INSERT ON *.* to $username@'%' IDENTIFIED by $airline;
done
CSV headers:
username,first,last,gender,dob,state,municipality,season,continent,elective,f1,airline
Nowhere in your script do you tell it to read FILENAME. Modify your script like this:
#!/bin/bash
FILENAME="mgarr048.csv"
while IFS=, read -r username first last gender dob state municipality season continent elective f1 airline
do
if [[ "$gender" == "m" ]]
then
echo "mysql -e GRANT INSERT ON *.* to $username@'%' IDENTIFIED by $airline"
fi
done < "$FILENAME"
You said you have a CSV file, so I changed IFS=: to IFS=,. The C in CSV is "comma".
You can read a full description of bash loops for reading files here: https://mywiki.wooledge.org/BashFAQ/001
EDIT
Since I do not have your database to test against, I added echo to the mysql line. Remove it to use this on your system. If you get a mysql error, it is not the script; it is related to your mysql command and/or database.
Tested with this data set:
username1,user1,name1,m,1111-11-11,State1,City1,season1,continent1,elective1,f1,airline1
username2,user2,name2,f,2222-22-22,State2,City2,season2,continent2,elective2,f2,airline2
Output:
mysql -e GRANT INSERT ON *.* to username1@'%' IDENTIFIED by airline1
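Extending this to the question's full requirement (males get the airline as password, females the continent) only needs a case statement. A sketch, with the two sample rows inlined via a here-document standing in for the real CSV:

```shell
# hypothetical rows in the same column order as the question's CSV
while IFS=, read -r username first last gender dob state municipality season continent elective f1 airline
do
  case $gender in
    m) password=$airline ;;    # males: airline as password
    f) password=$continent ;;  # females: continent as password
    *) continue ;;             # skip rows with an unexpected gender field
  esac
  echo "GRANT INSERT ON *.* TO '$username'@'%' IDENTIFIED BY '$password';"
done <<'EOF'
username1,user1,name1,m,1111-11-11,State1,City1,season1,continent1,elective1,f1,airline1
username2,user2,name2,f,2222-22-22,State2,City2,season2,continent2,elective2,f2,airline2
EOF
```

This prints one GRANT per row: username1 gets airline1, username2 gets continent2.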

Return SQL Query as bash array

First post. Bash novice. I couldn't find an effective solution.
I'm looking for an efficient parsing method / alternative method.
My initial attempt (with an edit thanks to @larks):
services=($($PSQL "SELECT array(select name from services);"))
echo ${services[@]}
>array -------------------------------- {Shampoo,Dying,Long-cut} (1 row)
echo ${#services[@]}
>5
echo ${services[2]}
>{Shampoo,Dying,Long-cut}
I'm looking to end up with an array identical to the one below, but without creating a CSV file in the process.
echo $($PSQL "\copy (select name from services) to 'services.csv';")
readarray -t arr < services.csv
echo ${arr[@]}
>Shampoo Dying Long-cut
echo ${#arr[@]}
>3
Your services variable is not an array; to create an array you need to surround the value with (...). For example, compare this:
$ example=$(echo one two three)
$ echo ${example[0]}
one two three
With this:
$ example=( $(echo one two three) )
$ echo ${example[0]}
one
So assuming that your $PSQL command generates output in an appropriate format, you want:
services=( $($PSQL "SELECT array(select name from services);") )
For what you're trying to do in your question, I don't see any reason to use the array function. Given a table like this:
CREATE TABLE services (
id serial primary key,
name text
);
INSERT INTO services (name) VALUES ('foo');
INSERT INTO services (name) VALUES ('bar');
INSERT INTO services (name) VALUES ('qux');
A query like this will produce results amenable to turning into a bash array:
$ psql -t --csv -U postgres -d arraytest -c 'select name from services'
foo
bar
qux
In a bash script:
services=( $(psql -t --csv -U postgres -d arraytest -c 'select name from services') )
for service in "${services[@]}"; do
echo "SERVICE: $service"
done
Which produces:
SERVICE: foo
SERVICE: bar
SERVICE: qux
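One caveat with the `services=( $(...) )` form: it relies on word splitting, so a name containing a space would break into two elements. `mapfile` (alias `readarray`) reads one line per element instead. A sketch, with a `printf` function standing in for the `psql -t --csv` call:

```shell
# stand-in for: psql -t --csv -U postgres -d arraytest -c 'select name from services'
query_output() { printf '%s\n' 'foo' 'bar' 'qux baz'; }

# one array element per output line, even when a name contains spaces
mapfile -t services < <(query_output)

echo "${#services[@]}"    # prints 3
echo "${services[2]}"     # prints "qux baz", spaces intact
```

With word splitting, the same output would have produced four elements instead of three.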

How to write a query to find all tables in a db that have a specific column name? HIVE

I've got a database with about 400 tables and I need to find all the tables searching by the column names. Basically I need something like:
select <tables> from <database> where table.columnName='tv';
How can I do this?
The shell script below will give you the desired result:
hive -e 'show tables in <database_name>' |
while read -r line
do
echo "TABLE NAME : $line"
if hive -e "describe <database_name>.$line" | grep -q "tv"; then
echo "Required table name: $line"
fi
done
Note that appending to a variable inside the `while` loop would be lost anyway, since the pipeline runs the loop in a subshell; printing the match directly avoids that trap.
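The pattern can be exercised without a Hive installation by swapping the two hive calls for stand-in functions (all names below are hypothetical); using process substitution instead of a pipe also keeps any variables set inside the loop visible afterwards:

```shell
# stand-ins for `hive -e 'show tables in db'` and `hive -e 'describe db.t'`
show_tables() { printf '%s\n' 'sales' 'users' 'logs'; }
describe_table() {
  case $1 in
    users|logs) printf 'tv\tstring\n' ;;   # these tables have a tv column
    *)          printf 'id\tint\n' ;;
  esac
}

# -w matches "tv" as a whole word, so a column named "tv_show" won't hit
while read -r line
do
  if describe_table "$line" | grep -qw 'tv'; then
    echo "Required table name: $line"
  fi
done < <(show_tables)
```

This prints the "Required table name" line for users and logs only.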

How to generate a table from a CSV in PostgreSQL

I am new to database management and we are using psql. All I need to do is migrate CSVs (around 200 tables) to our database. Manually creating a table for every CSV file is a bit tiresome, so please help me out: is there any way to generate a table from a CSV file?
Answered at DBA Stack Exchange by the OP. I'm copying the answer here because this was the first link returned by my search engine.
The OP made a script like this:
DATADIR='data' # this directory name
PREFIX='jobd'
DBNAME='divacsv'
function createSchema {
COLUMNS=`head -n 1 $1 |
awk -F, '{for(i=1; i<=NF; i++){out=out $i" text, ";} print out;}' |
sed 's/ text, $/MYEXTRA text/' |
sed 's/"//g'`
CMD_CREATE="psql $DBNAME -c \"CREATE TABLE $2 ($COLUMNS);\""
echo $CMD_CREATE
sh -c "$CMD_CREATE"
CMD_COPY="psql divacsv -c \"COPY $2 FROM '"`pwd`"/$1' DELIMITER ',' CSV;\""
echo $CMD_COPY
sh -c "$CMD_COPY"
}
for file in $DATADIR/*.csv; do
table=$PREFIX"_"`echo $file | sed 's/.*\///' | sed 's/\.csv$//'`
createSchema "$file" $table
done
Comments advise that HEADER might be needed to avoid loading the first line (the header text) into the table, which is true.
I've tested this code but couldn't make it work under CentOS.
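The core of createSchema, turning a CSV header line into a column list, can be tested in isolation. A sketch with a hypothetical header (every column typed as text, as in the script above):

```shell
header='id,name,price'   # hypothetical CSV header line

# emit one "col text" entry per header field, comma-separated
columns=$(printf '%s\n' "$header" |
  awk -F, '{ for (i = 1; i <= NF; i++) printf "%s%s text", (i > 1 ? ", " : ""), $i }')

echo "CREATE TABLE jobd_prices ($columns);"
# prints: CREATE TABLE jobd_prices (id text, name text, price text);
```

Checking this step on its own makes it easier to see why the script above needs the extra sed passes to strip quotes and the trailing separator.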

mysqldump puts CREATE DATABASE on the first line of the dump

Here's my full bash script:
#!/bin/bash
logs="$HOME/sitedb_backups/log"
mysql_user="user"
mysql_password="pass"
mysql=/usr/bin/mysql
mysqldump=/usr/bin/mysqldump
tbackups="$HOME/sitedb_backups/today"
ybackups="$HOME/sitedb_backups/yesterday"
echo "`date`" > $logs/backups.log
rm $ybackups/* >> $logs/backups.log
mv $tbackups/* $ybackups/ >> $logs/backups.log
databases=`$mysql --user=$mysql_user -p$mysql_password -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema)"`
for db in $databases ; do
$mysqldump --force --opt --user=$mysql_user -p$mysql_password --databases $db | gzip > "$tbackups/$db.gz"
echo -e "\r\nBackup of $db successful" >> $logs/backups.log
done
mail -s "Your DB backups are ready!" yourmail@gmail.com <<< "Today: "`date`"
DB backups of every site are ready."
exit 0
The problem is that when I try to import it with mysql I get error 1044, error connecting to oldname_db. When I opened the SQL file I noticed a CREATE statement on the first line, so it tries to create the database with the old name. How can I solve this problem?
SOLVED.
The --databases parameter was not necessary in my case, and because of --databases the dump included CREATE and USE statements at the beginning of the SQL file. Hope this helps somebody else.
Use the --no-create-db option of mysqldump.
From man mysqldump:
--no-create-db, -n
This option suppresses the CREATE DATABASE statements that are
otherwise included in the output if the --databases or --all-databases
option is given.
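For dumps that already contain the statements, the offending lines can be filtered out before import. A sketch over a fabricated three-line dump (a real file would be filtered the same way, e.g. `sed '/^CREATE DATABASE /d; /^USE /d' dump.sql`):

```shell
# fabricated miniature dump with the problematic header statements
printf '%s\n' \
  'CREATE DATABASE oldname_db;' \
  'USE oldname_db;' \
  'CREATE TABLE t (id INT);' |
  sed '/^CREATE DATABASE /d; /^USE /d'
```

Only the CREATE TABLE line survives, so the result can be imported into a database of any name.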
