I am using GAE for a Laravel application. I use wkhtmltopdf to create PDFs. For my local deployment I use a php-fpm Docker container, and I have to install a few libraries to make it work.
# Install all dependencies
apt-get update -yqq && \
apt-get install -y \
libxrender1 \
libfontconfig1 \
libjpeg62 \
libxtst6 \
libssl1.0-dev \
wget \
&& wget https://github.com/h4cc/wkhtmltopdf-amd64/blob/master/bin/wkhtmltopdf-amd64?raw=true -O /usr/local/bin/wkhtmltopdf \
&& chmod +x /usr/local/bin/wkhtmltopdf \
How do I add these libraries to my GAE deployment?
Here is an example of how to deploy an application with a Dockerfile to App Engine Flex using a custom runtime. For more information on how to build custom runtimes, check this document.
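As a rough sketch (the base image and layout below are assumptions, not the exact example from the linked document), the app.yaml only needs to declare a custom Flex runtime, and the Dockerfile can install the same libraries you already use locally:
app.yaml
runtime: custom
env: flex
Dockerfile
# Assumed base image; use whichever PHP image your Laravel app already runs on
FROM php:7.4-fpm
# Same wkhtmltopdf dependencies as in the local php-fpm container
# (add libssl1.0-dev here as well if your base image's Debian release still ships it)
RUN apt-get update -yqq && \
    apt-get install -y \
        libxrender1 \
        libfontconfig1 \
        libjpeg62 \
        libxtst6 \
        wget \
    && wget https://github.com/h4cc/wkhtmltopdf-amd64/blob/master/bin/wkhtmltopdf-amd64?raw=true -O /usr/local/bin/wkhtmltopdf \
    && chmod +x /usr/local/bin/wkhtmltopdf
# ... the rest of your existing image setup (copy the app, configure php-fpm, etc.)
With both files in the project root, gcloud app deploy builds the Dockerfile and deploys it to App Engine Flex.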
I have created a Laravel project with React scaffolding, which means React is used as the UI.
Before this project, I used both as two separate apps, for example App-frontend (React) and App-backend (Laravel).
But now I have created only one app, and it will run in one container. The container I used before for the App-backend is this one:
FROM php:8.0.10-fpm-alpine
LABEL Maintainer="me" \
Description="Docker container with Nginx 1.15 & PHP-FPM 7.4 based on Alpine Linux. Nginx Confg Ready for Laravel/Lumen"
ENV BUILD_DEPS \
cmake \
autoconf \
g++ \
gcc \
make \
pcre-dev \
gmp-dev \
zip \
libzip-dev \
imagemagick-dev
RUN apk update && apk add --no-cache --virtual .build-deps $BUILD_DEPS $PHPIZE_DEPS
RUN set -ex && apk --no-cache add sudo
RUN apk --no-cache add nginx supervisor curl postgresql-dev libuv-dev openldap-dev ssmtp libxml2-dev
RUN docker-php-ext-install mysqli pdo_mysql pgsql pdo_pgsql
RUN apk add --no-cache libpng libpng-dev && docker-php-ext-install gd && apk del libpng-dev
RUN docker-php-ext-configure zip
RUN docker-php-ext-install zip
#Install Locales
ENV MUSL_LOCALE_DEPS musl-dev gettext-dev libintl
ENV MUSL_LOCPATH /usr/share/i18n/locales/musl
RUN apk add --no-cache \
$MUSL_LOCALE_DEPS \
&& wget https://gitlab.com/rilian-la-te/musl-locales/-/archive/master/musl-locales-master.zip \
&& unzip musl-locales-master.zip \
&& cd musl-locales-master \
&& cmake -DLOCALE_PROFILE=OFF -D CMAKE_INSTALL_PREFIX:PATH=/usr . && make && make install \
&& cd .. && rm -r musl-locales-master
# Add Envsubst
ENV TZ="Europe/Berlin" \
RUNTIME_DEPS="libintl"
RUN apk add --update $RUNTIME_DEPS \
&& apk add --virtual build_deps gettext \
&& cp /usr/bin/envsubst /usr/local/bin/envsubst \
# Add default timezone
&& apk add tzdata \
&& cp /usr/share/zoneinfo/${TZ} /etc/localtime \
&& echo "${TZ}" > /etc/timezone
# Configure nginx
COPY .deploy/nginx.conf /nginx.conf
# Configure PHP-FPM
COPY .deploy/fpm-pool.conf /usr/local/etc/php-fpm.d/zzz_custom.conf
COPY .deploy/php.ini /usr/local/etc/php/conf.d/laravel_custom.ini
# Configure supervisord
COPY .deploy/supervisord.conf /etc/supervisor/conf.d/supervisord.conf
# Configure ssmtp config
COPY .deploy/ssmtp.conf /etc/ssmtp/ssmtp.conf
# Configure Cron Job for Scheduler
#COPY .deploy/scheduler /var/www/html/scheduler
#RUN chmod +x /var/www/html/scheduler
COPY .deploy/crontab /etc/crontabs/root
RUN chmod 0644 /etc/crontabs/root
# Add application
WORKDIR /var/www/html
COPY src/. /var/www/html/
RUN chown -R www-data:www-data /var/www/html/storage/
RUN chown -R www-data:www-data /var/www/html/bootstrap/cache
RUN sudo chmod -R 777 /var/www/html/storage/
RUN sudo chmod -R 777 /var/www/html/bootstrap/cache
# RUN php artisan cache:clear
# RUN php artisan config:cache && php artisan view:cache
CMD ["/usr/bin/supervisord", "-c", "/etc/supervisor/conf.d/supervisord.conf"]
Now I need help adding the Docker commands to build the React app and copy it into the src directory where it runs. Can you help me, please?
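Not a definitive recipe, but a minimal multi-stage sketch of how this is usually done (the paths under src/, the build command npm run production, and the output folder public/ are assumptions; adjust them to your project layout). A Node stage builds the React assets, and the final stage is your existing PHP image plus one extra COPY:
# --- Stage 1: build the React assets (assumed paths and build script) ---
FROM node:16-alpine AS frontend
WORKDIR /app
COPY src/package.json src/package-lock.json ./
RUN npm ci
COPY src/ ./
RUN npm run production   # Laravel Mix typically writes the compiled assets to public/

# --- Stage 2: your existing image, unchanged except for the extra COPY ---
FROM php:8.0.10-fpm-alpine
# ... all the instructions from your Dockerfile above ...
WORKDIR /var/www/html
COPY src/. /var/www/html/
# Overwrite the compiled assets with the ones built in the Node stage
COPY --from=frontend /app/public/ /var/www/html/public/
Everything else from the original Dockerfile, including the supervisord CMD, stays as it is; only the first stage and the COPY --from line are new.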
I have a project in Docker. When I recreate the Docker app, Docker keeps deleting the old databases on localhost. I did not find any solution on the internet. Does anyone know how this problem can be solved?
Thanks for responding.
Here is my Dockerfile:
FROM php:7.2-apache
ENV DOCKER=1
ENV MASTER_URL_DOCKERFILE='http://website/'
RUN docker-php-ext-install mysqli pdo_mysql
RUN apt-get update -y && apt-get install -y \
libpng-dev \
libwebp-dev \
libjpeg62-turbo-dev \
libpng-dev libxpm-dev \
libfreetype6-dev
RUN docker-php-ext-configure gd \
--with-gd \
--with-webp-dir \
--with-jpeg-dir \
--with-png-dir \
--with-zlib-dir \
--with-xpm-dir \
--with-freetype-dir
RUN docker-php-ext-install gd
RUN docker-php-ext-configure calendar && docker-php-ext-install calendar
RUN a2enmod rewrite
RUN ln -s /etc/apache2/mods-available/expires.load /etc/apache2/mods-enabled/
COPY core /var/www/core/
COPY chainway/src /var/www/html/
COPY chainway/docker/app/ /usr/local/bin/
RUN service apache2 restart
And here is how I run the containers:
#!/bin/bash
DIR=$(dirname $0)
cd $DIR
wget -V
wget -O "$DIR/docker/db/dump.sql" "http://website/senddatabasetolocalhost.php?auth=authkey"
docker-compose stop
docker-compose build
docker-compose up -d
You will have to use volumes in docker-compose.yml, like this:
volumes:
- $PWD/my_sql:/var/lib/mysql
You can store your DB data using volumes.
Add this to the mysql section of your docker-compose.yml file:
mysql:
volumes:
- db_data:/var/lib/mysql
And to the end of the file:
volumes:
db_data:
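For reference, a minimal docker-compose.yml sketch (service name, image and password are assumptions, for illustration only) showing how the named volume keeps the data when the container is recreated:
version: "3"
services:
  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: secret   # assumed credentials, for illustration only
    volumes:
      - db_data:/var/lib/mysql      # named volume survives docker-compose up/down

volumes:
  db_data:
Note that docker-compose down -v (with the -v flag) would still delete the named volume, and with it the database.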
I am developing a Python web server on Google App Engine.
I want to debug it in VS Code, so I want to get the Dockerfile for the latest Python 3 version of the gcr.io/google-appengine/python image.
Where do I get it?
Here is the Dockerfile you can use:
FROM gcr.io/google-appengine/python
# Create a virtualenv for dependencies. This isolates these packages from
# system-level packages.
# Use -p python3 or -p python3.7 to select python version. Default is version 2.
RUN virtualenv -p python3 /env
# Setting these environment variables are the same as running
# source /env/bin/activate.
ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH
# Copy the application's requirements.txt and run pip to install all
# dependencies into the virtualenv.
ADD requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt
# Add the application source code.
ADD . /app
WORKDIR /app
# Run a WSGI server to serve the application. gunicorn must be declared as
# a dependency in requirements.txt.
ENTRYPOINT ["gunicorn", "-b", ":8080", "server:app"]
You can also look at the GitHub repository.
This is the GitHub repo of the Python runtime for App Engine Flex. In that repository you can find the Dockerfile and all the scripts to create a Docker container similar to the one used on App Engine Flex:
# The Google App Engine base image is debian (jessie) with ca-certificates
# installed.
# Source: https://github.com/GoogleCloudPlatform/debian-docker
FROM ${OS_BASE_IMAGE}
ADD resources /resources
ADD scripts /scripts
# Install Python, pip, and C dev libraries necessary to compile the most popular
# Python libraries.
RUN /scripts/install-apt-packages.sh
# Setup locale. This prevents Python 3 IO encoding issues.
ENV LANG C.UTF-8
# Make stdout/stderr unbuffered. This prevents delay between output and cloud
# logging collection.
ENV PYTHONUNBUFFERED 1
RUN wget https://storage.googleapis.com/python-interpreters/latest/interpreter-3.4.tar.gz && \
wget https://storage.googleapis.com/python-interpreters/latest/interpreter-3.5.tar.gz && \
wget https://storage.googleapis.com/python-interpreters/latest/interpreter-3.6.tar.gz && \
wget https://storage.googleapis.com/python-interpreters/latest/interpreter-3.7.tar.gz && \
tar -xzf interpreter-3.4.tar.gz && \
tar -xzf interpreter-3.5.tar.gz && \
tar -xzf interpreter-3.6.tar.gz && \
tar -xzf interpreter-3.7.tar.gz && \
rm interpreter-*.tar.gz
# Add Google-built interpreters to the path
ENV PATH /opt/python3.7/bin:/opt/python3.6/bin:/opt/python3.5/bin:/opt/python3.4/bin:$PATH
RUN update-alternatives --install /usr/local/bin/python3 python3 /opt/python3.7/bin/python3.7 50 && \
update-alternatives --install /usr/local/bin/pip3 pip3 /opt/python3.7/bin/pip3.7 50
# Upgrade pip (debian package version tends to run a few version behind) and
# install virtualenv system-wide.
RUN /usr/bin/pip install --upgrade -r /resources/requirements.txt && \
/opt/python3.4/bin/pip3.4 install --upgrade -r /resources/requirements.txt && \
rm -f /opt/python3.4/bin/pip /opt/python3.4/bin/pip3 && \
/opt/python3.5/bin/pip3.5 install --upgrade -r /resources/requirements.txt && \
rm -f /opt/python3.5/bin/pip /opt/python3.5/bin/pip3 && \
/opt/python3.6/bin/pip3.6 install --upgrade -r /resources/requirements.txt && \
rm -f /opt/python3.6/bin/pip /opt/python3.6/bin/pip3 && \
/opt/python3.7/bin/pip3.7 install --upgrade -r /resources/requirements.txt && \
rm -f /opt/python3.7/bin/pip /opt/python3.7/bin/pip3 && \
/usr/bin/pip install --upgrade -r /resources/requirements-virtualenv.txt
# Setup the app working directory
RUN ln -s /home/vmagent/app /app
WORKDIR /app
# Port 8080 is the port used by Google App Engine for serving HTTP traffic.
EXPOSE 8080
ENV PORT 8080
# The user's Dockerfile must specify an entrypoint with ENTRYPOINT or CMD.
CMD []
I am new to Hugo and I can't find a way to address this.
My idea is to use Hugo in Docker and get the content from another source; this way the Hugo site will stay updated.
The source can be almost anything, but I would prefer a repository.
Is there any way to do it?
Based on https://firepress.org/en/best-practices-for-getting-code-into-a-container/, one option is to use wget to download the data you want.
There might be better ways, but this one seemed like a pretty viable way to accomplish what you want.
I removed some "logging" features from the given sample and added some comments explaining what each part does.
##############################################################################
# Install App
##############################################################################
WORKDIR $APP
# Some of the APK's are installed that will be removed later in this process.
RUN apk update && \
apk upgrade && \
apk --no-cache add tar curl tini \
&& apk --no-cache add --virtual devs gcc make python wget unzip ca-certificates \
# (wget, unzip and ca-certificates are needed below to fetch the theme;
#  the whole "devs" group is removed again in the Clean up step at the end)
&& npm cache clean \
&& rm -rf /tmp/npm*
##############################################################################
# PART ONE
# Install/copy FirePress_Klimax into casper from Github
##############################################################################
# (the shell commands below, joined with "; \", belong in a single RUN instruction)
RUN set -eux; \
# directory name, for url building and renaming the unpacked zip.
THEME_NAME_FROM="FirePress_Klimax"; \
# directory where the file should be
THEME_NAME_INTO="casper"; \
# The url where to get you data from.
GIT_URL="https://github.com/firepress-org/$THEME_NAME_FROM/archive/master.zip"; \
# Local directory names.
DIR_FROM="$DIR_THEMES/$THEME_NAME_FROM"; \
DIR_INTO="$DIR_THEMES/$THEME_NAME_INTO"; \
# enter the themes directory.
cd $DIR_THEMES; \
# download the master.zip
wget --no-check-certificate -O master.zip $GIT_URL; \
# unzip the master.zip that was downloaded from github.
unzip $DIR_THEMES/master.zip; \
# remove the zip file, since the contents are on disk now
rm $DIR_THEMES/master.zip; \
# rename the "master" directory that's on disk now to it's proper name
mv $THEME_NAME_FROM-master $THEME_NAME_INTO; \
##############################################################################
# Clean up
##############################################################################
# delete the apk cache of unneeded cached downloads
rm -rf /var/cache/apk/*; \
# we don't need the build packages anymore; remove the whole virtual group
apk del devs;
Hugo has built-in functionality for this: Modules. A module can be used to fetch the files from Git, and you can also specify which folder from the Git repo should be placed where in your Hugo site. This happens at build time.
You may need to install the Go language to get this to work.
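As a sketch of what that can look like (the repository path and folder names below are assumptions, not from the answer), a module import with mounts in config.yaml:
# config.yaml (assumed repository and folders; adjust to your content source)
module:
  imports:
    - path: github.com/your-org/your-content-repo
      mounts:
        - source: posts             # folder inside the remote repo
          target: content/posts     # where Hugo mounts it in the site
The site itself has to be initialized as a module first (hugo mod init), and running hugo mod get -u before the build then pulls the latest content from the repository.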
I am building my Spring Boot application using Maven and Google Cloud Build, but somehow I get different deployment results depending on whether I run it locally using mvn appengine:run or deploy it using Cloud Build.
If I run locally using mvn appengine:run, I can access my controller as expected. Using Cloud Build, I get a 404 error.
My cloudbuild.yaml is the following:
steps:
- name: 'gcr.io/cloud-builders/mvn'
args: ['package']
- name: 'gcr.io/cloud-builders/gcloud'
args: ['app', 'deploy', 'target/myapp/WEB-INF/appengine-web.xml']
How would you recommend configuring a cloud build in order to build and deploy a spring boot application on google app engine?
After additional digging, the issue seems to be related to some kind of error being returned:
javax.servlet.ServletContext log: 2 Spring WebApplicationInitializers detected on classpath
I do not get this message in the stack trace when deploying from my local machine using mvn appengine:deploy.
My question still remains: how do I go about creating a cloudbuild.yaml that can invoke mvn appengine:deploy?
In order to build a Spring Boot project and deploy it to Google App Engine using Google Cloud Build, I ended up having to first build a "builder" image using the Dockerfile below and reference this image when performing my actual application builds.
Dockerfile
FROM debian:stretch
#
# Google Cloud SDK installation
# https://cloud.google.com/sdk/docs/quickstart-debian-ubuntu
RUN apt-get update -y && \
apt-get install \
apt-utils \
dialog \
gnupg \
lsb-release \
curl -y && \
export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)" && \
echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | tee -a /etc/apt/sources.list.d/google-cloud-sdk.list && \
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add - && \
apt-get update -y && \
apt-get install google-cloud-sdk -y
# Install all available components
RUN apt-get install google-cloud-sdk \
google-cloud-sdk-app-engine-go \
google-cloud-sdk-app-engine-java \
google-cloud-sdk-app-engine-python \
google-cloud-sdk-app-engine-python-extras \
google-cloud-sdk-bigtable-emulator \
google-cloud-sdk-cbt \
google-cloud-sdk-datastore-emulator \
google-cloud-sdk-cloud-build-local \
google-cloud-sdk-datalab \
kubectl \
google-cloud-sdk-pubsub-emulator -y
#
# OpenJDK installation
# https://linuxhint.com/install-openjdk-8-on-debian-9-stretch/
RUN apt-get install openjdk-8-jdk -y
#
# MAVEN installation
# https://github.com/carlossg/docker-maven/blob/f581ea002e5d067deb6213c00a4d217297cad469/jdk-8/Dockerfile
ARG MAVEN_VERSION=3.5.4
ARG USER_HOME_DIR="/root"
ARG SHA=ce50b1c91364cb77efe3776f756a6d92b76d9038b0a0782f7d53acf1e997a14d
ARG BASE_URL=https://apache.osuosl.org/maven/maven-3/${MAVEN_VERSION}/binaries
RUN mkdir -p /usr/share/maven /usr/share/maven/ref \
&& curl -fsSL -o /tmp/apache-maven.tar.gz ${BASE_URL}/apache-maven-${MAVEN_VERSION}-bin.tar.gz \
&& echo "${SHA} /tmp/apache-maven.tar.gz" | sha256sum -c - \
&& tar -xzf /tmp/apache-maven.tar.gz -C /usr/share/maven --strip-components=1 \
&& rm -f /tmp/apache-maven.tar.gz \
&& ln -s /usr/share/maven/bin/mvn /usr/bin/mvn
ENV MAVEN_HOME /usr/share/maven
ENV MAVEN_CONFIG "$USER_HOME_DIR/.m2"
WORKDIR /workspace
cloudbuild.yaml
# In this directory, run the following command to build this builder.
# $ gcloud builds submit . --config=cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
args: ['build', '--tag=gcr.io/$PROJECT_ID/gcloud-maven', '.']
# Simple sanity check: invoke java to confirm that it was installed correctly.
- name: 'gcr.io/$PROJECT_ID/gcloud-maven'
args: ['java', '-version']
# Simple sanity check: invoke gcloud to confirm that it was installed correctly.
- name: 'gcr.io/$PROJECT_ID/gcloud-maven'
args: ['gcloud', 'projects', 'list']
# Simple sanity check: invoke maven to confirm that it was installed correctly.
- name: 'gcr.io/$PROJECT_ID/gcloud-maven'
args: ['mvn', '--version']
images: ['gcr.io/$PROJECT_ID/gcloud-maven']
timeout: 1200s
My spring boot project's cloudbuild.yaml now references this image:
steps:
- name: 'gcr.io/$PROJECT_ID/gcloud-maven'
args: ['mvn', 'appengine:deploy']
I will try to put this Docker image on Docker Hub and GitHub for others to find. I would also appreciate it if people more familiar with Docker and Linux could help improve this image to reduce its size (for example, by using Alpine instead of Debian, or Debian Stretch Slim). In the meantime, I hope this helps others like me.