Django on Google Cloud - database

I am new to both GKE and Django. I built an app in Django, packaged it in a Docker container, pushed it to GCR, and deployed it via GKE. The deployment works fine, but when I try to log in, I get an OperationalError. For the database connection I am using the Cloud SQL proxy. I have collected the static files and stored them in Google Cloud Storage. Any help will be highly appreciated.
I have tried many of the suggestions already available online, but without success.
When I try to log in as admin, I get the following output after entering my username and password:
OperationalError at /admin/login
server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
Following are my database settings in Django:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'polls',
        'USER': os.getenv('DATABASE_USER'),
        'PASSWORD': os.getenv('DATABASE_PASSWORD'),
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}

You should check the Docker logs and see if there is any error connecting to the database.
If it is a database connection issue, then you can try the following in your docker-compose.yml. You can customize the rest of the variables as needed for your polls application:
web:
  build: ./app
  image: {imagename}
  depends_on:
    - cloud-sql-proxy
  environment:
    - SQL_ENGINE=django.db.backends.postgresql_psycopg2
    - SQL_DATABASE=test_db
    - SQL_USER=postgres1
    - SQL_PASSWORD=6728298
    - SQL_HOST=cloud-sql-proxy
    - SQL_PORT=5432
    - DATABASE=postgres
cloud-sql-proxy:
  image: gcr.io/cloudsql-docker/gce-proxy:1.11
  command: /cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:0.0.0.0:5432 -credential_file=/config
  volumes:
    - {service_account_creds_path.json}:/config
You could read this article https://adilsoncarvalho.com/how-to-use-cloud-sql-proxy-on-docker-compose-f7418c53eed9 for reference. The article is about MySQL, but the concepts are the same. Good luck!
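Since the deployment in the question runs on GKE rather than Compose, the equivalent pattern there is to run the proxy as a sidecar container in the same pod, so that the HOST of 127.0.0.1:5432 in the Django settings points at the proxy. A minimal sketch under that assumption (the image path, secret names, and instance connection name below are placeholders, not taken from the question):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: polls
spec:
  replicas: 1
  selector:
    matchLabels:
      app: polls
  template:
    metadata:
      labels:
        app: polls
    spec:
      containers:
        # The Django app reads DATABASE_USER / DATABASE_PASSWORD from a secret
        - name: polls
          image: gcr.io/<PROJECT_ID>/polls   # placeholder image path
          env:
            - name: DATABASE_USER
              valueFrom:
                secretKeyRef:
                  name: cloudsql-db-credentials   # placeholder secret
                  key: username
            - name: DATABASE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: cloudsql-db-credentials
                  key: password
        # Cloud SQL proxy sidecar: listens on 127.0.0.1:5432 inside the pod
        - name: cloudsql-proxy
          image: gcr.io/cloudsql-docker/gce-proxy:1.11
          command: ["/cloud_sql_proxy",
                    "-instances=<INSTANCE_CONNECTION_NAME>=tcp:5432",
                    "-credential_file=/secrets/cloudsql/credentials.json"]
          volumeMounts:
            - name: cloudsql-instance-credentials
              mountPath: /secrets/cloudsql
              readOnly: true
      volumes:
        - name: cloudsql-instance-credentials
          secret:
            secretName: cloudsql-instance-credentials   # holds the service account key

This sidecar approach is the same one the Cloud SQL documentation describes for GKE; if the proxy is missing or crashing in the pod, dropped connections like the one in the question are the usual symptom.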

Related

Connect a dockerized app to a database from a remote machine via a VPN connection

I'm currently working on a small app that has to fetch data from a SQL Server DB and push it to the cloud. It works correctly, but I would like to dockerize it to make its deployment easier.
The database is on a private network, and I have to use a VPN connection to access it for development (in red in the diagram below). In production, the app will be on a VM in the database's network.
I'm still confused by Docker networks and the --publish option.
Here is my docker-compose file for now.
version: "3.4"
services:
myapp:
build:
context: .
network: host
restart: always
ports:
- "128.1.X.Y:1433:1433"
container_name: myapp
But when I connect to the VPN from my machine (remote) and run my image with this configuration, I get this error:
driver failed programming external connectivity on endpoint myapp (bbb3cc...):
Error starting userland proxy: listen tcp4 128.1.X.Y:1433: bind: cannot assign requested address
Simply "1433:1433" does not work either. The database cannot be accessed. Not really sure about "network: host" either...
Does anyone know what I could be doing wrong?
And another thing I'm wondering is, will the Docker config be the same when I will deploy my container on the VM?
Thank you!
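As a hedged note on what is likely going on here: ports: publishes a container port on an address owned by the Docker host, and 128.1.X.Y belongs to the database's network rather than to the host, hence "cannot assign requested address". An outbound connection to the DB needs no published ports at all. A sketch along those lines, assuming the container should reuse the host's VPN routes:

version: "3.4"
services:
  myapp:
    build:
      context: .
    restart: always
    container_name: myapp
    # No "ports:" entry: publishing is only for inbound traffic, and
    # this app only opens an outbound connection to 128.1.X.Y:1433.
    # On Linux, host networking lets the container use the host's VPN
    # routes directly (the runtime counterpart of build-time "network: host"):
    network_mode: host

On the production VM inside the database's network, the default bridge network should also work without network_mode: host, since outbound container traffic is NATed through the host, which has a route to 128.1.X.Y either way.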

Database and Spring Boot in the same Docker container without a password: safe?

I was wondering if it is safe to configure a DB without a password when you deploy the Spring Boot app alongside it in the same Docker setup, so that you don't have to expose the ports of the DB.
Roughly like this docker-compose.yml:
version: '3'
services:
  myspringapp:
    ...
    depends_on:
      - mydb
    ports:
      - 8080:8080
  mydb:
    ...
Now the DB should not be accessible to outsiders, or am I missing something?
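For context, a sketch of the pattern being described, with placeholder images and an illustrative passwordless Postgres setting: as long as the DB service has no ports: mapping, it is reachable only by service name on the Compose network, not from outside the Docker host.

version: '3'
services:
  myspringapp:
    image: myspringapp:latest               # placeholder image
    depends_on:
      - mydb
    ports:
      - 8080:8080                           # only the app is published
    environment:
      # reach the DB by its service name on the Compose network
      - SPRING_DATASOURCE_URL=jdbc:postgresql://mydb:5432/app
  mydb:
    image: postgres:12
    # no "ports:" entry here, so nothing is published on the host
    environment:
      - POSTGRES_HOST_AUTH_METHOD=trust     # illustrative passwordless mode

The usual caveat is that anything else running on the same Docker host, or attached to the same network, can still connect, so "no password" is only as safe as the host itself.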

Error connecting to Google Cloud SQL from App Engine custom environment using TCP

I'm trying to connect to a Google Cloud SQL instance from a custom runtime environment in App Engine.
When I follow the doc to connect using a unix domain socket, it works. The problem is when I try to connect using a TCP connection. It shows:
Warning: mysqli_connect(): (HY000/2002): Connection refused in
/var/www/html/index.php on line 3
Connect error: Connection refused
This is my app.yaml file:
runtime: custom
env: flex
beta_settings:
  cloud_sql_instances: testing-mvalcam:europe-west1:testdb=tcp:3306
resources:
  cpu: 1
  memory_gb: 0.5
  disk_size_gb: 10
The Dockerfile:
FROM php:7.0-apache
ENV PORT 8080
CMD sed -i "s/80/$PORT/g" /etc/apache2/sites-available/000-default.conf /etc/apache2/ports.conf && docker-php-entrypoint apache2-foreground
RUN docker-php-ext-install mysqli
RUN a2enmod rewrite
COPY ./src /var/www/html
EXPOSE $PORT
And index.php:
<?php
$link = mysqli_connect('127.0.0.1', 'root', 'root', 'test');
if (!$link) {
    die('Connect error: ' . mysqli_connect_error());
}
echo 'successfully connected';
mysqli_close($link);
?>
What am I doing wrong?
The IP address '172.17.0.1' belongs to the Docker bridge network of the container where the web server is running; you can get more context on that in this documentation.
The documentation page you're using might not fully cover the case where you deploy with your own Dockerfile. In the following documentation you can read more about App Engine flexible runtimes.
As demonstrated by the documentation you're using (remember to click on the TCP CONNECTION tab on that page), the beta_settings section of app.yaml that lists Cloud SQL instances needs the TCP port the database server listens on.
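Putting those hints together, a hedged sketch of the fix based on the public App Engine flexible docs: keep the =tcp:3306 suffix in beta_settings, and have the PHP code connect to 172.17.0.1 (the bridge address above) instead of 127.0.0.1, since that is where the flexible environment exposes the proxied TCP port:

beta_settings:
  # expose the instance on TCP port 3306; in the flexible environment the
  # proxy listens on the Docker bridge address, so mysqli_connect should
  # use '172.17.0.1' rather than '127.0.0.1'
  cloud_sql_instances: testing-mvalcam:europe-west1:testdb=tcp:3306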

React + Express on Azure: Invalid Host Header

The Error
When deploying to Azure Web Apps with Multi-container support, I receive an "Invalid Host Header" message from https://mysite.azurewebsites.com
Local Setup
This runs fine.
I have two Docker containers: client, a React app, and server, an Express app hosting my API. I am using a proxy so that client can reach my API on server.
In client's package.json I have defined:
"proxy": "http://localhost:3001"
I use the following docker compose file to build locally.
version: '2.1'
services:
  server:
    build: ./server
    expose:
      - ${APP_SERVER_PORT}
    environment:
      API_HOST: ${API_HOST}
      APP_SERVER_PORT: ${APP_SERVER_PORT}
    ports:
      - ${APP_SERVER_PORT}:${APP_SERVER_PORT}
    volumes:
      - ./server/src:/app/project-server/src
    command: npm start
  client:
    build: ./client
    environment:
      - REACT_APP_PORT=${REACT_APP_PORT}
    expose:
      - ${REACT_APP_PORT}
    ports:
      - ${REACT_APP_PORT}:${REACT_APP_PORT}
    volumes:
      - ./client/src:/app/project-client/src
      - ./client/public:/app/project-client/public
    links:
      - server
    command: npm start
Everything runs fine.
On Azure
When deploying to Azure I have the following. client and server images have been stored in Azure Container Registry. They appear to load just fine from the logs.
In my App Service > Container Settings I am loading the images from Azure Container Registry (ACR) and I'm using the following configuration (Docker compose) file.
version: '2.1'
services:
  client:
    image: <clientimage>.azurecr.io/clientimage:v1
    build: ./client
    expose:
      - 3000
    ports:
      - 3000:3000
    command: npm start
  server:
    image: <serverimage>.azurecr.io/<serverimage>:v1
    build: ./server
    expose:
      - 3001
    ports:
      - 3001:3001
    command: npm start
I have also defined in Application Settings:
WEBSITES_PORT to be 3000.
This results in the error on my site "Invalid Host Header"
Things I've tried
• Serving the app from the static folder in server. This works in that it serves the app, but it messes up my authentication. I need to be able to serve the static portion from client's App.js and have that talk to my Express API for database calls and authentication.
• In my docker-compose file, binding the front end to:
  ports:
    - 3000:80
• A few other port combinations but no luck.
Also, I think this has something to do with the proxy in client's package.json, based on this repo.
Any help would be greatly appreciated!
Update
It is the proxy setting.
This somewhat solves it. By removing "proxy": "http://localhost:3001" I am able to load the website, but the answer suggested in that issue does not work for me, i.e. I am now unable to access my API.
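A hedged aside before the answer below: "Invalid Host Header" is typically the react-scripts dev server rejecting an unexpected host name, and both compose files start the client with npm start (the dev server) rather than serving a production build. Assuming the client was created with create-react-app, the check can be relaxed for a quick test with an environment variable, though a static production build is the proper fix:

client:
  image: <clientimage>.azurecr.io/clientimage:v1
  ports:
    - 3000:3000
  environment:
    # create-react-app dev-server option; for testing only, not production
    - DANGEROUSLY_DISABLE_HOST_CHECK=true
  command: npm start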
I've never used Azure before and I also don't use a proxy (due to its random connection issues), but if your application is basically running Express, you can utilize CORS. (As a side note, it's more common to run your Express server on 5000 than 3001.)
I first set up an env/config.js folder and file like so:
module.exports = {
  development: {
    database: 'mongodb://localhost/boilerplate-dev-db',
    port: 5000,
    portal: 'http://localhost:3000',
  },
  production: {
    database: 'mongodb://localhost/boilerplate-prod-db',
    port: 5000,
    portal: 'http://example.com',
  },
  staging: {
    database: 'mongodb://localhost/boilerplate-staging-db',
    port: 5000,
    portal: 'http://localhost:3000',
  },
};
Then, depending on the environment, I can implement cors where I'm defining express middleware:
const cors = require('cors');
const config = require('./path/to/env/config.js');
const env = process.env.NODE_ENV;

app.use(
  cors({
    credentials: true,
    origin: config[env].portal,
  }),
);
Please note the portal and the AJAX requests MUST have matching host names. For example, if my application is hosted on http://example.com, my front-end API requests must be made to http://example.com/api/ (not http://localhost:3000/api/ -- click here to see how I implement it for my website), and the portal env must match the host name http://example.com. This setup is flexible and necessary when running multiple environments.
Or if you're using create-react-app, then simply eject your app and implement a proxy inside the webpack production configuration.
Or migrate your application to my fullstack boilerplate, which implements the cors example above.
So, I ended up having to move off of containers and serve the React app in a more typical MERN architecture, with the Express server hosting the React app from the static build folder. I set up some routes with PassportJS to handle my authentication.
It's not my preferred solution (I would have preferred to use containers), but it works. Hope this points someone out there in the right direction!

Lumen app deployed on GAE is unable to connect to DB in Cloud SQL

I developed an API in Lumen and deployed it on GAE, following this tutorial. It worked fine up to the deployment of the API, but the problem came when I tried to access the DB on the Cloud SQL instance. I am unable to access the DB in the way described in the tutorial. I then tried different solutions, and none of them worked. As a last resort, just for testing, I added the IP address of the instance where GAE is deployed to the authorized list of the Cloud SQL instance, and it is working now. But this is a temporary solution: the instances are deployed on VMs whose IP addresses can change at any time, which would break the connection between GAE and Cloud SQL. Any solution or suggestion will be appreciated.
I have tried changing my app.yaml in multiple ways; the current version is given below:
runtime: php
env: flex

runtime_config:
  document_root: public

# Ensure we skip ".env", which is only for local development
skip_files:
  - .env

env_variables:
  # Put production environment variables here.
  #APP_ENV: production
  APP_DEBUG: true
  #QUEUE_DRIVER: sync
  APP_LOG: errorlog
  APP_KEY: 32 char key
  STORAGE_DIR: /tmp
  CACHE_DRIVER: database
  SESSION_DRIVER: database
  #APP_TIMEZONE: UTC
  ## Set these environment variables according to your CloudSQL configuration.
  DB_HOST: 127.0.0.1
  DB_DATABASE: dbname
  DB_USERNAME: root
  DB_PASSWORD: pass
  DB_SOCKET: "/cloudsql/Instance Connection Name"
  MYSQL_DSN: "mysql:unix_socket=/Instance Connection Name;dbname=dbname"
  MYSQL_USER: root
  MYSQL_PASSWORD: pass

beta_settings:
  # for Cloud SQL, set this value to the Cloud SQL connection name,
  # e.g. "project:region:cloudsql-instance"
  cloud_sql_instances: "Instance Connection Name"
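One hedged observation on the config above: the unix socket path in MYSQL_DSN is missing the /cloudsql/ prefix that DB_SOCKET already has, and on App Engine the Cloud SQL socket is created at /cloudsql/<INSTANCE_CONNECTION_NAME>. If that is the cause, IP allowlisting should become unnecessary once the paths match:

env_variables:
  DB_SOCKET: "/cloudsql/Instance Connection Name"
  # the DSN needs the same /cloudsql/ prefix before the instance
  # connection name for the socket to resolve
  MYSQL_DSN: "mysql:unix_socket=/cloudsql/Instance Connection Name;dbname=dbname"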
