Unable to connect to database using MySQLdb in Robot Framework

Using Robot Framework 2.6.3 and the Python Database Library, I want to connect to a MySQL database.
I have downloaded the DatabaseLibrary and MySQLdb; however, when I try to connect using:
Library    DatabaseLibrary
DatabaseLibrary.Connect To Database    MySQLdb    cts1    root    password    172.16.7.20
I get the following error when I run this using pybot:
OperationalError: (2003, "Can't connect to MySQL server on '172.16.7.20' (10061)")
Note:
cts1 is the DB name and 172.16.7.20 is the IP address of the DB server.
This works when I use mysql from the command prompt. Why would it not be able to connect?

The issue was that I did not pass a port to Robot Framework, and the library does not fall back to the default MySQL port of 3306. Once I added the port, it worked.
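For reference, a minimal sketch of the same connection made directly with MySQLdb, using the values from the question; the explicit port argument is the piece that was missing (3306 is assumed to be the server's port):
import MySQLdb

# Hedged example: connect with the port given explicitly rather than relying on a default.
conn = MySQLdb.connect(
    host="172.16.7.20",
    port=3306,            # must be an integer
    user="root",
    passwd="password",
    db="cts1",
)
cursor = conn.cursor()
cursor.execute("SELECT VERSION()")
print(cursor.fetchone())
conn.close()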

Related

Airflow 2.0 SQL Server connector

I want to use the Airflow 2.0.1 docker-compose file from Apache's GitHub.
Here is the link to the docker-compose.yaml, and here is the link to the Dockerfile.
I want to run a DAG that grabs data out of my SQL Server database. Currently I get the following error:
no module named pymssql
After I manually installed it, I get a similar error: no module named pyodbc.
When I try to install that manually, I get a gcc error and the installation fails.
Does anyone have any clue about this?
Is there any docker-compose file that can handle a SQL Server connection for Airflow 2?
Thanks in advance

Connect cloud instance locally for laravel project running as service

I am trying to connect a Laravel project to a Google Cloud SQL (MySQL) instance locally.
.env file
APP_NAME=Laravel
APP_ENV=local
APP_KEY=base64:GptV9AIxUX7TuPkKxbLs54DLjivt2wCmuG37qsnqxJU=
APP_DEBUG=true
APP_URL=http://localhost
LOG_CHANNEL=stack
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=**********
DB_USERNAME=**********
DB_PASSWORD=**********
DB_SOCKET=/cloudsql/[INSTANCE NAME]
BROADCAST_DRIVER=log
CACHE_DRIVER=file
SESSION_DRIVER=file
SESSION_LIFETIME=120
QUEUE_DRIVER=sync
REDIS_HOST=127.0.0.1
REDIS_PASSWORD=null
REDIS_PORT=6379
MAIL_DRIVER=smtp
MAIL_HOST=smtp.mailtrap.io
MAIL_PORT=2525
MAIL_USERNAME=null
MAIL_PASSWORD=null
MAIL_ENCRYPTION=null
PUSHER_APP_ID=
PUSHER_APP_KEY=
PUSHER_APP_SECRET=
PUSHER_APP_CLUSTER=mt1
MIX_PUSHER_APP_KEY="${PUSHER_APP_KEY}"
MIX_PUSHER_APP_CLUSTER="${PUSHER_APP_CLUSTER}"
I am using this cloud_sql_proxy command:
~/Downloads/cloud_sql_proxy -instances=/cloudsql/[INSTANCE_ID]=tcp:3306
and it is working properly:
2018/07/18 13:46:29 Listening on 127.0.0.1:3306 for /cloudsql/[INSTANCE ID]
2018/07/18 13:46:29 Ready for new connections
I also enabled the Google Cloud SQL and Cloud SQL Admin APIs for my project.
The error I receive is:
SQLSTATE[HY000] [2002] No such file or directory (SQL: select * from `campaigns`)
Below are my observations from my struggle to find a solution.
In Stackdriver Logging, the error that I see is:
severity: "ERROR"
textPayload: "2018-07-18T05:57:08.204597Z 859643 [Note] Aborted connection 859643 to db: '[DB_NAME]' user: '[DB_USER]' host: 'cloudsqlproxy~173.194.90.33' (Got an error reading communication packets)"
When I check ~/Downloads/cloud_sql_proxy info:
2018/07/18 14:10:50 Using gcloud's active project: [PROJECT ID]
2018/07/18 14:10:55 must set -dir: using a unix socket for [INSTANCE NAME]
I am not actually able to tell whether this is related to the error.
I have tried connecting to Cloud SQL using a service account; it connects properly, but I am still not able to resolve this issue.
Thanks in advance.
There are two ways to connect to Cloud SQL: via TCP or via a Unix Socket.
If you are using a Unix socket, you need to specify -dir /cloudsql (or similar) as the directory to create your socket in. The full command looks something like this:
./cloud_sql_proxy -dir=/cloudsql -instances=myProject:us-central1:myInstance
You would then use DB_SOCKET=/cloudsql/myProject:us-central1:myInstance to connect. (You don't need a host or port, since you're connecting via the unix socket.)
If you are using TCP, you append the port you want to listen on to your instance name, so the command looks like this:
./cloud_sql_proxy -instances=myProject:us-central1:myInstance=tcp:3306
In this case, you use DB_HOST=127.0.0.1 and DB_PORT=3306 (but not DB_SOCKET).
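As a rough .env sketch for the two modes (placeholder instance name, database, and credentials; use one mode or the other, not both):
# Unix socket mode (proxy started with -dir=/cloudsql)
DB_CONNECTION=mysql
DB_SOCKET=/cloudsql/myProject:us-central1:myInstance
DB_DATABASE=my_database
DB_USERNAME=my_user
DB_PASSWORD=my_password

# TCP mode (proxy started with =tcp:3306)
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=my_database
DB_USERNAME=my_user
DB_PASSWORD=my_password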

"Login timeout expired" error when accessing MS SQL db via sqlalchemy and pyodbc

So I am having some trouble getting sqlalchemy and pyodbc working with a remote MS SQL Server. A local sqlcmd works properly, but not when I try to read the DB via Python code. Any help would be appreciated.
Environment:
Centos 7
SQLCmd version: Version 17.1.0000.1 Linux
MS SQL Server 6.01.7601.17514
Python 2.7
The following sqlcmd worked properly
sqlcmd -S {Host},{Port} -U {USER} -P {PWD} -Q "use {Database};"
Attempts to work with sqlalchemy or pyodbc directly didn't work. Error:
pyodbc.OperationalError: ('HYT00', u'[HYT00] [unixODBC][Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0) (SQLDriverConnect)')
Code:
Attempt with pyodbc
conn = pyodbc.connect(
r'DRIVER={ODBC Driver 17 for SQL Server};'
r'SERVER=HOST,PORT;'
r'DATABASE=DATABASE;'
r'UID=UID;'
r'PWD=PWD'
)
Attempt with sqlalchemy:
create_engine('mssql+pyodbc://{user}:{password}@{host}:{port}/{database}?driver={driver}'.format(
    user=user,
    password=password,
    host=host,
    database=database,
    port=port,
    driver="ODBC+Driver+17+for+SQL+Server"
)).connect()
I can reproduce the error with sqlcmd if I remove the port from the command, so maybe the conn_string I am passing to pyodbc is not in the correct format?
The problem might be DNS related, as you can read here.
Try to use an IP address, instead of the hostname, in the connection string, or check your DNS configuration.
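For example, the pyodbc attempt from the question with a literal IP address and port in place of the hostname (192.0.2.10 and 1433 are placeholders):
import pyodbc

# Hedged sketch: same connection string shape, but with SERVER given as IP,port.
conn = pyodbc.connect(
    r'DRIVER={ODBC Driver 17 for SQL Server};'
    r'SERVER=192.0.2.10,1433;'
    r'DATABASE=DATABASE;'
    r'UID=UID;'
    r'PWD=PWD'
)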
In my case, this happened when I didn't properly escape passwords that have special characters. This was my solution:
from urllib.parse import quote
...
passwd = 'p#ssw0rd!'
...
engine_string = f"mssql+pyodbc://{user}:{quote(passwd)}@{host}/{name}?driver=ODBC+Driver+17+for+SQL+Server"
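As a quick illustration of what the escaping does (output shown for the example password above):
from urllib.parse import quote

# Percent-encodes the characters that would otherwise break URL parsing.
print(quote('p#ssw0rd!'))   # prints p%23ssw0rd%21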
What does your Python code do? The problem might be multiple connection calls. Don't open the connection in a loop, or call conn.close() at the wrong point.
Another possibility is a firewall rule issue; check that as well.
I use pymssql to access my SQL Server. Read the documentation and install pymssql and freetds-dev on your CentOS system. You may need to edit freetds.conf and add the IP and port of your SQL Server.
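A minimal pymssql sketch, assuming the same host, port and credential placeholders as in the question:
import pymssql

# Hedged example with placeholder connection details.
conn = pymssql.connect(server='HOST', port=1433, user='USER',
                       password='PWD', database='DATABASE')
cursor = conn.cursor()
cursor.execute('SELECT @@VERSION')
print(cursor.fetchone())
conn.close()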

Laravel/Homestead - can't connect to the DB

I have installed Homestead on my machine and cloned a project from a remote repository; now I am trying to create a DB for my project.
This is what my .env file looks like:
APP_ENV=local
APP_DEBUG=true
APP_KEY=base64:e7K7SUDgogNqTtP9TO1CfpbXFAC6FmLLSJ1l6K8pbWs=
DB_HOST=localhost
DB_DATABASE=homestead
DB_USERNAME=homestead
DB_PASSWORD=secret
CACHE_DRIVER=file
SESSION_DRIVER=file
QUEUE_DRIVER=sync
REDIS_HOST=localhost
REDIS_PASSWORD=null
REDIS_PORT=6379
MAIL_DRIVER=smtp
MAIL_HOST=mailtrap.io
MAIL_PORT=2525
MAIL_USERNAME=null
MAIL_PASSWORD=null
MAIL_ENCRYPTION=null
I have tried to connect to the DB using Sequel Pro's standard options:
Host: localhost
Password: secret
But I get:
Unable to connect to host 127.0.0.1, or the request timed out.
Be sure that the address is correct and that you have the necessary
privileges, or try increasing the connection timeout (currently 10
seconds).
MySQL said: Can't connect to MySQL server on '127.0.0.1' (61)
What am I doing wrong?
Some options:
Check that the MySQL service is running;
Check that the user and password are correct;
Check the privileges of the user, or try the root user.
For a MySQL connection, you should check the username and password in your .env file:
DB_HOST=localhost
DB_DATABASE=Your Database name
DB_USERNAME=username
DB_PASSWORD=password
Along with that, you can update the database.php file for the database configuration.
I had to add port 33060 to be able to connect. Hope that will help others who got stuck like me.
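For anyone connecting from the host machine (e.g. with Sequel Pro) rather than from inside the VM, Homestead forwards MySQL to port 33060, so settings along these lines should work (default Homestead credentials assumed):
Host: 127.0.0.1
Port: 33060
Username: homestead
Password: secret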
I had the same problem with Laravel 7 Homestead. I solved the error with these steps:
Use MySQL port 33060
Clear the Laravel config: php artisan config:clear
Reload Vagrant: vagrant reload --provision
And make sure your database name and password are correct.
Happy coding.

Connect to SQL Server using SQLAlchemy

I'm trying to connect to a SQL Server Express database using SQLAlchemy and pyodbc, but I continuously get the error:
(pyodbc.Error) ('IM002', '[IM002] [unixODBC][Driver Manager]Data
source name not found, and no default driver specified (0)
(SQLDriverConnect)')
And I really don't understand whether my engine URL is wrong or whether it's something else.
My scenario is the following:
I'm on a Mac
I have a docker container (based on a Debian image with unixodbc and unixodbc-dev) in which my python app tries to connect to...
a VirtualBox virtual machine running Windows 8 with SQL Server Express 2014...
I configured a user for SQL Server Express, with SQL Server authentication:
user: ar_user
password: ar_psw
...then:
I configured the TCP port as 1433 and disabled dynamic ports (SQL Server Configuration Manager > Network Configurations > Protocols).
I turned off Windows Firewall.
I used a host-only adapter for the VM running Windows 8.
now...
The VM is accessible from the host (my Mac), since
ping -c 3 vm-ip
succeeds!
But although I tried every possible permutation of user, password, ip, server name and port:
'mssql+pyodbc://ar_user:ar_psw@vm-ip/master'
'mssql+pyodbc://ar_user:ar_psw@vm-ip:1433/master'
'mssql+pyodbc://IE10WIN8\\SQLEXPRESS'
'mssql+pyodbc://ar_user:ar_psw@IE10WIN8\\SQLEXPRESS'
'mssql+pyodbc://ar_user:ar_psw@IE10WIN8\\SQLEXPRESS:1433'
'mssql+pyodbc://ar_user:ar_psw@IE10WIN8\\SQLEXPRESS:1433/master'
...and many more!
I always get the "data source not found" error.
What should I do?
PS: the VM is pingable even from inside the Docker container!
UPDATE (solved but not 100%):
I solved it in this way:
I configured the FreeTDS driver in /etc/odbcinst.ini like this:
[FreeTDS]
Description = TDS driver (Sybase/MS SQL)
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so
Setup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so
client charset = UTF-8
and in /etc/freetds/freetds.conf:
[global]
tds version = 7.3
client charset = UTF-8
Then I created the engine using the following string:
'mssql+pyodbc://my_user:my_psw@machine_ip:1433/my_db?driver=FreeTDS'
It seems to work properly, but I get this warning:
SAWarning: Unrecognized server version info '95.12.255'. Version
specific behaviors may not function properly. If using ODBC with
FreeTDS, ensure TDS_VERSION 7.0 through 7.3, not 4.2, is configured in
the FreeTDS configuration.
I also defined the TDS version using environment variables but it doesn't fix the issue... any idea?
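One thing worth trying (an assumption based on how mssql+pyodbc passes unrecognized query-string arguments through to the ODBC connection string): pin the TDS version in the engine URL itself instead of via environment variables.
from sqlalchemy import create_engine

# Placeholders as in the question; TDS_Version is forwarded to the FreeTDS ODBC driver.
engine = create_engine(
    'mssql+pyodbc://my_user:my_psw@machine_ip:1433/my_db'
    '?driver=FreeTDS&TDS_Version=7.3'
)
engine.connect()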
I wrote a tutorial here on how to do this. Essentially, you need to:
brew install unixodbc
brew install freetds --with-unixodbc
Add the freetds driver to odbcinst.ini
Add a DSN (Data Source Name) to odbc.ini named "MY_DSN" (see the sketch below)
pip install pyodbc
e = create_engine("mssql+pyodbc://username:password@MY_DSN")
The walkthrough here does a much more thorough job of explaining this, including issues with SQL Server/FreeTDS Protocol Version Compatibility.
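For step 4, a minimal odbc.ini entry might look something like this (a sketch: the Driver value must match the section name added to odbcinst.ini, and the server details are placeholders):
[MY_DSN]
Driver = FreeTDS
Server = your-sql-server-host
Port = 1433
Database = master
TDS_Version = 7.3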
