endpointscfg gets an error: google.appengine.ext - google-app-engine

I want to create the endpoints client lib with this command:
python lib/endpoints/endpointscfg.py get_client_lib java -bs src.service.mobile_api.MobileApi
but I'm getting this error:
from google.appengine.ext import vendor
ImportError: No module named appengine.ext

Download the google-cloud-sdk from
https://cloud.google.com/sdk/docs/downloads-versioned-archives
and put the extracted folder in your Documents directory, then run:
cd Documents/google-cloud-sdk
./install.sh
After that, install the App Engine Python component:
gcloud components install app-engine-python
Then install the endpoints libraries into the lib directory of your project, like this:
cd /project/
ls
lib src etc
sudo pip install -t lib google-endpoints
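Once the SDK components are installed, the usual way to make the packages in lib importable is an appengine_config.py at the project root. A minimal sketch, assuming the packages were installed into ./lib with the pip command above:
# appengine_config.py -- minimal sketch; assumes third-party packages
# were installed into ./lib with `pip install -t lib google-endpoints`
from google.appengine.ext import vendor

# Add lib/ to sys.path so the vendored packages can be imported
vendor.add('lib')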

Related

Which ChromeDriver & headless Chrome versions are compatible with Ruby 2.7?

The issue
I have a web scraper running in AWS Lambda, but in a few weeks AWS Lambda will stop supporting Ruby 2.7. I built my scraper last year using this tutorial.
I need to find a version of ChromeDriver & headless Chrome that is compatible with Ruby 2.7, but I don't know exactly where to start.
I have looked at ChromeDriver's downloads portal, but I don't see any indication there that ChromeDriver will work with Ruby 2.7, or any other specific version of Ruby for that matter.
The code I have works by accessing the ChromeDriver binary and starting it inside a specific folder.
I downloaded the specific binaries I am using by running these commands:
# serverless chrome
wget https://github.com/adieuadieu/serverless-chrome/releases/download/v1.0.0-37/stable-headless-chromium-amazonlinux-2017-03.zip
unzip stable-headless-chromium-amazonlinux-2017-03.zip -d bin/
rm stable-headless-chromium-amazonlinux-2017-03.zip
# chromedriver
wget https://chromedriver.storage.googleapis.com/2.37/chromedriver_linux64.zip
unzip chromedriver_linux64.zip -d bin/
rm chromedriver_linux64.zip
Solution
I found the solution to this problem. The Ruby 2.7 runtime that Lambda offers by default runs on top of Amazon Linux 2, which lacks many important libraries and dependencies, and unfortunately there's nothing you can do to change that.
However, Amazon offers you the ability to run your code in a custom Docker image that can be up to 10 GB in size.
I fixed the problem by creating my own image using the following Dockerfile:
FROM public.ecr.aws/lambda/ruby:2.7
# Install dependencies needed to run MySQL & Chrome
RUN yum -y install libX11
RUN yum -y install dejavu-sans-fonts
RUN yum -y install procps
RUN yum -y install mysql-devel
RUN yum -y install tree
RUN mkdir /var/task/lib
RUN cp /usr/lib64/mysql/libmysqlclient.so.18 /var/task/lib
RUN gem install bundler
RUN yum -y install wget
RUN yum -y groupinstall 'Development Tools'
# Ruby Gems
ADD Gemfile ${LAMBDA_TASK_ROOT}/
ADD Gemfile.lock ${LAMBDA_TASK_ROOT}/
RUN bundle config set path 'vendor/bundle' && \
bundle install
# Install chromedriver & chromium
RUN mkdir ${LAMBDA_TASK_ROOT}/bin
# Chromium
RUN wget https://github.com/adieuadieu/serverless-chrome/releases/download/v1.0.0-37/stable-headless-chromium-amazonlinux-2017-03.zip
RUN unzip stable-headless-chromium-amazonlinux-2017-03.zip -d ${LAMBDA_TASK_ROOT}/bin/
RUN rm stable-headless-chromium-amazonlinux-2017-03.zip
# Chromedriver
RUN wget https://chromedriver.storage.googleapis.com/2.37/chromedriver_linux64.zip
RUN unzip chromedriver_linux64.zip -d ${LAMBDA_TASK_ROOT}/bin/
RUN rm chromedriver_linux64.zip
# Copy function code
COPY app.rb ${LAMBDA_TASK_ROOT}
WORKDIR ${LAMBDA_TASK_ROOT}
RUN tree
RUN ls ${LAMBDA_TASK_ROOT}/bin
# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "app.handle" ]
Notes
If your code was previously deployed using a zip file, you will have to either destroy the previous function or create a second function with the code update; it all comes down to how you want to handle deployment.
It is possible to automate the deployment process using the Serverless Framework.

ImportError libSDL2-2.0.so.0 in wxPython wx.adv

I am running Ubuntu 18.04.3 LTS, Python 3.6.9, wx.version: 4.0.7.post2 gtk3 (phoenix) wxWidgets 3.0.5
When I import wx.adv, I get the error
ImportError: libSDL2-2.0.so.0: cannot open shared object file: No such file or directory
If I run:
sudo apt-file search libSDL_image-1.2.so.0
I get:
libsdl-image1.2: /usr/lib/x86_64-linux-gnu/libSDL_image-1.2.so.0
libsdl-image1.2: /usr/lib/x86_64-linux-gnu/libSDL_image-1.2.so.0.8.4
What is wrong?
Seems I was missing some libraries :(
I ran
sudo apt-get install git curl libsdl2-mixer-2.0-0 libsdl2-image-2.0-0 libsdl2-2.0-0
and all is fine
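For reference, a quick sanity check once the packages are installed (a minimal sketch that just imports the module which previously failed and prints the build info):
# Minimal check: importing wx.adv previously raised
# "ImportError: libSDL2-2.0.so.0: cannot open shared object file"
import wx
import wx.adv

app = wx.App(False)
print(wx.version())     # e.g. 4.0.7.post2 gtk3 (phoenix) wxWidgets 3.0.5
print(wx.adv.__name__)  # confirms wx.adv imported without the libSDL2 error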

react-360 did not create project

I installed react-360 using npm install -g react-360-cli.
But react-360 init Hello did not create any directory.
I am using Ubuntu 14.04.
Have you encountered any errors?
1) You could be having permission issues while trying to install packages into the node_modules folder. Try changing ownership of the folder, for example:
sudo chown -R USERNAME /FILE_PATH
2) If the first suggestion doesn't work, try updating your Node version.

Cannot finish Wagtail install (No such file or directory)

I am running virtualenv on a Mac and have successfully created a new MySite directory. Now I want to follow the remaining instructions:
$ pip install -r requirements.txt
$ ./manage.py migrate
$ ./manage.py createsuperuser
$ ./manage.py runserver
I get the following error:
bad interpreter: No such file or directory
I checked that Django and Pillow are properly installed. They are. Any ideas?
The "bad interpreter" error means the shebang line at the top of manage.py points to a Python interpreter that doesn't exist; invoking the script as python manage.py instead of ./manage.py sidesteps that (see the diagnostic sketch after the commands below). To do a clean Wagtail install on a Mac, simply do this:
→ virtualenv-3.4 ~/installation_test # installed via sudo port install py34-virtualenv
→ source ~/installation_test/bin/activate
→ pip install wagtail
→ wagtail start my_site
→ cd my_site/
→ python manage.py migrate
→ python manage.py runserver
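If you'd rather diagnose the original error than start over, a quick check is to compare what ./manage.py would try to run with what your virtualenv actually resolves to (a diagnostic sketch, not part of the official Wagtail instructions):
# Diagnostic sketch: a "bad interpreter" error usually means the shebang in
# manage.py points at an interpreter that no longer exists. Compare it with
# the interpreter of the active virtualenv.
import sys

with open('manage.py') as f:
    shebang = f.readline().strip()

print('manage.py shebang :', shebang)
print('active interpreter:', sys.executable)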

Managed VM add to PATH

In Google App Engine's Python runtime for Managed VMs, I want to install the Splinter (Selenium) ChromeDriver. According to the documentation for Linux, I have the following in my Dockerfile:
# Dockerfile extending the generic Python image with application files for a
# single application.
FROM gcr.io/google_appengine/python-compat
RUN apt-get update && apt-get install -y apt-utils zip unzip wget
ADD requirements.txt /app/
RUN pip install -r requirements.txt
RUN cd $HOME/
RUN wget https://chromedriver.googlecode.com/files/chromedriver_linux64_20.0.1133.0.zip
RUN unzip chromedriver_linux64_20.0.1133.0.zip
RUN mkdir -p $HOME/bin
RUN mv chromedriver /bin
ENV PATH "$PATH:$HOME/bin"
ADD . /app
I can't get the web application to start Splinter with the Chrome webdriver, as chromedriver is not found in the PATH.
WebDriverException: Message: 'chromedriver' executable needs to be
available in the path. Please look at
http://docs.seleniumhq.org/download/#thirdPartyDrivers
and read up at
http://code.google.com/p/selenium/wiki/ChromeDriver
And if I run docker exec -it <container id> chromedriver, as expected, it doesn't work.
Also, the environment variables printed out in Python are:
➜ ~ docker exec -it f4d9541c4ba6 python
Python 2.7.3 (default, Mar 13 2014, 11:03:55)
[GCC 4.7.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> print os.environ
{'GAE_MODULE_NAME': 'parsers', 'API_HOST': '10.0.2.2', 'GAE_SERVER_PORT': '8082', 'MODULE_YAML_PATH': 'parsers.yaml', 'HOSTNAME': 'f4d9541c4ba6', 'SERVER_SOFTWARE': 'Development/2.0', 'GAE_MODULE_INSTANCE': '0', 'DEBIAN_FRONTEND': 'noninteractive', 'GAE_MINOR_VERSION': '580029170989395749', 'API_PORT': '59768', 'GAE_PARTITION': 'dev', 'GAE_LONG_APP_ID': 'utix-app', 'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin', 'GAE_MODULE_VERSION': 'parsers-0-0-1', 'HOME': '/root'}
What would be the correct way to make chromedriver available on the PATH, or is there any workaround?
Thanks a lot
You need to check the ENTRYPOINT and CMD associated with that image (do a docker inspect on the container you launched).
If the image is set to open a new bash session, the profile or .bashrc associated with the account running that session might redefine $PATH, overriding the Dockerfile ENV PATH "$PATH:$HOME/bin" directive.
If that is the case, making sure the profile or .bashrc defines the right PATH is easier (with a COPY of a custom .bashrc, for instance) than modifying the ENV.
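If adjusting the image's PATH handling proves fiddly, a workaround that sidesteps PATH entirely is to point the driver at the binary explicitly. A minimal sketch: the /bin/chromedriver location is an assumption based on the RUN mv chromedriver /bin line in the Dockerfile above, and executable_path is the Selenium 2/3-era keyword argument:
# Workaround sketch: don't rely on the container's PATH at all.
import os
from selenium import webdriver

CHROMEDRIVER = '/bin/chromedriver'  # assumption: where the Dockerfile moved the binary

# Option 1: extend PATH at runtime, before the driver starts.
os.environ['PATH'] += os.pathsep + os.path.dirname(CHROMEDRIVER)

# Option 2: hand the executable path to the driver explicitly.
driver = webdriver.Chrome(executable_path=CHROMEDRIVER)
driver.get('https://example.com')
print(driver.title)
driver.quit()
Splinter generally forwards extra keyword arguments to the underlying Selenium driver, so Browser('chrome', executable_path=CHROMEDRIVER) should behave the same way.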
