Base Directory for a module in App Engine yaml file - google-app-engine

I started working with App Engine today and I am trying to find a way to set a root folder for each of my modules/services. Example:
Folder Structure
/mod1/*
/mod2/*
dispatch.yaml
app.yaml
mod1.yaml
mod2.yaml
Is it possible to set the base directory for a module in App Engine yaml file?
Something similar to RewriteBase / in Apache. That way, in my mod1.yaml, I wouldn't have to specify the mod1 directory 30 times, once for each endpoint.
Maybe a command in the dispatch.yaml:
- url: "api-dot-lyreka-com.appspot.com/"
module: api
path: /mod1 -- just an example, something like that
I have been looking for a couple of hours now.

Just move each module's .yaml file inside the respective module dir, which makes that module dir the "root" of the module, so you don't need to specify it anymore. More details here:
Run Google App Engine application with microservice
New project structure for Google App Engine
Note: each module only sees what's inside its "root" dir; nothing above it is deployed when the module is deployed. But you can symlink stuff into each of the module dirs to share it across modules: Sharing entities between App Engine modules
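For example, the structure from the question would then look roughly like this (a sketch; file names taken from the question):
/mod1/mod1.yaml   # mod1/ is now the root of the mod1 module
/mod1/...         # mod1 code, referenced with paths relative to mod1/
/mod2/mod2.yaml
/mod2/...
app.yaml          # default module stays at the top level
dispatch.yaml
Each module is then deployed from its own yaml path, e.g. gcloud app deploy mod1/mod1.yaml mod2/mod2.yaml dispatch.yaml.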

Related

Typeorm + Firebase functions: "No connection options were found in any orm configuration files" after deployed

Env: nodejs12
Folder structure:
#root
  /functions
    /src
      ...
      /models
      /resolvers
      index.ts
    ormconfig.json
    package.json
    tsconfig.json
    ...
  .firebaserc
  firebase.json
Everything worked when developing in the local environment. After deploying to Firebase Functions, the error No connection options were found in any orm configuration files shows up.
What might be the cause?
I'll update with more information if needed.
=========================================
update
Below is the folder structure of the deployed code. (Cloud Functions can't show more than 50 files, so I downloaded the source code from GCP.)
As you can see, ormconfig.json does exist in the root, but somehow it cannot be located. I have to create the connection manually with typeorm.createConnection({type: "postgres",...}) to make the code work.
This is likely being caused by a known bug with app-root-path (which TypeORM uses for config file resolution) when used in conjunction with Google Cloud Functions.
The workaround / fix that worked for me was to set the environment variable APP_ROOT_PATH to /workspace when I deployed my Google Cloud Function (app-root-path will short-circuit when it sees that variable).
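For completeness, the manual-connection fallback mentioned in the question looks roughly like this (a sketch; every connection option besides type is a placeholder to adapt to your setup):
import { Connection, createConnection } from "typeorm";

// Sketch: pass the options explicitly instead of relying on ormconfig.json,
// which app-root-path may fail to locate when running on Cloud Functions.
let connection: Connection | undefined;

async function getConnection(): Promise<Connection> {
  if (!connection) {
    connection = await createConnection({
      type: "postgres",
      host: process.env.DB_HOST,                // placeholder env vars
      port: Number(process.env.DB_PORT) || 5432,
      username: process.env.DB_USER,
      password: process.env.DB_PASS,
      database: process.env.DB_NAME,
      entities: ["lib/models/**/*.js"],         // adjust to your compiled output
    });
  }
  return connection;
}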

How to support react js app for 2 different sub domains

I am struggling a bit to get a React app to support 2 different subdomains. Following are the subdomains I need my app to support:
www-dev.somedomain/apps/myapp
app-dev.somedomain/myapp
As you can see, react-app-path is also changing with the subdomains. I have defined PUBLIC_URL and REACT_APP_PATH in my .env file as below.
REACT_APP_PATH=/myapp
GENERATE_SOURCEMAP=false
PUBLIC_URL=/myapp
With the above env vars, the app-dev... URL works. If I change the path to /apps/myapp, then the www subdomain works. I need a way to support both subdomains at once.
How can I achieve this?
Finally, I solved this problem with the following steps. I was using Nginx to redirect both subdomains to the same host; the problem I had was with the paths.
www-dev.somedomain/apps/myapp
app-dev.somedomain/myapp
According to my Nginx configuration, both URLs were redirected to / on the server. The app then couldn't find the static files because the paths differed between the domains. So to fix this, I did as below (a combined Nginx sketch follows after these steps).
First, I removed PUBLIC_URL from my .env file, so the app files are hosted at the / location.
Then I added the homepage attribute to the package.json file; setting homepage serves assets relative to index.html. Read more about homepage here: Using `"homepage"` in package.json, without messing up paths for localhost
Since the app uses internal routing, I added a simple Nginx rule to rewrite the location of static files as below.
rewrite /static/(.*)$ /static/$1 break;
This solved my problem with supporting two domains.
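Putting those steps together, the relevant part of the Nginx config looked roughly like this (a sketch; server names, the root path, and serving the build output directly are assumptions to adapt):
server {
    listen 80;
    server_name www-dev.somedomain app-dev.somedomain;
    root /var/www/myapp;   # assumed location of the React build output

    # Strip whatever prefix precedes /static/ so that both
    # /myapp/static/... and /apps/myapp/static/... resolve to /static/...
    location ~ /static/ {
        rewrite /static/(.*)$ /static/$1 break;
    }

    # Everything else falls back to index.html for client-side routing.
    location / {
        try_files $uri /index.html;
    }
}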
No way. Your React app is compiled into static HTML, JS, and CSS files, and the only time you can pass parameters to the build tool is before building, which is what you're doing right now. Once the build is complete, it can't be changed.
You can write two shell scripts with different environment variables; invoking each of them will build a different web app (see the sketch below).
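A minimal sketch of that approach, using the env vars from the question (the script names are arbitrary):
# build-www.sh -- build for www-dev.somedomain/apps/myapp
PUBLIC_URL=/apps/myapp REACT_APP_PATH=/apps/myapp GENERATE_SOURCEMAP=false npm run build

# build-app.sh -- build for app-dev.somedomain/myapp
PUBLIC_URL=/myapp REACT_APP_PATH=/myapp GENERATE_SOURCEMAP=false npm run build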

Google Cloud Bucket and ReactJS app Access

Using ReactJS I made a build (React static files, npm run build) and uploaded it to a Google Cloud Storage bucket, but I'm getting an issue with the path and build folder files. The app (static website) runs but could not fetch the files from the bucket directory, e.g. the index.html and logo (404 or 403 errors).
Structure: Parent Bucket > Build folder (index.html, static folder & other files inside Build)
Does anyone have any suggestion on this? How can I resolve it?
Do I need to create an app.yaml for the GCS bucket, or is there an alternative?
I have gone through a quite similar article, but it is for App Engine instead of a bucket: https://medium.com/google-cloud/how-to-deploy-a-static-react-site-to-google-cloud-platform-55ff0bd0f509
I have tried with an app.yaml file but it does not work for me.
I had exactly the same issue as mentioned by the OP. I am sharing my version of the solution just in case anyone else ends up here.
As shown in the screenshots by the OP, the 403 errors showed up for me because the URL of the static files in the build/static folder was not correctly configured by the react-scripts build script.
E.g.:
The URL for the index.html file was https://storage.googleapis.com/{bucket-name}/index.html.
However, when the page loaded, it requested files with the URL https://storage.googleapis.com/static/js/main.f555c3e0.chunk.js. It should rather be
https://storage.googleapis.com/{bucket-name}/static/js/main.f555c3e0.chunk.js
This is happening because, by default, react-scripts build assumes that your files will be served from the root folder of your host.
To fix this, add the following field to the package.json of your project:
"homepage": "https://storage.googleapis.com/{bucket-name}"
This tells the build script that it needs to add a relative path for serving static files.
For details, please refer to: https://create-react-app.dev/docs/deployment/#building-for-relative-paths
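Deployment then amounts to rebuilding (so the homepage value is baked into the asset URLs) and syncing the build output to the bucket, roughly like this (a sketch; {bucket-name} is a placeholder):
npm run build
gsutil -m rsync -r build gs://{bucket-name}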
In order to set the routes of a static website stored in Google Cloud Storage, you need to assign suffixes to your objects. In other words, using suffixes is the intended way to configure your website. You can see more information in the Hosting a static website document.
For your main index page you should set the MainPageSuffix, and for the not-found page (404.html) you should set the NotFoundPage suffix.
You can see more information on how to configure your static website here.
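For example, both suffixes can be assigned with gsutil roughly like this (a sketch; {bucket-name} is a placeholder):
gsutil web set -m index.html -e 404.html gs://{bucket-name}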

How to properly deploy node apps to GAE with secret keys?

I am exploring GAE with nconf and I'm wondering if the following setup is secure after I deploy an app.
What concerns me is whether both my "config.dev.json" and "config.prod.json" files are deployed despite being included in ".gitignore".
I am unsure what information is passed along to GAE (I don't want my config keys exposed) after I do:
$ git add .
$ git commit -m 'Commiting'
$ gcloud app deploy
My Node App structure looks like this:
- /myProject
  - /node_modules
  - .gitignore
  - app.js
  - app.yaml
  - config.js
  - keys.dev.json
  - keys.prod.json
  - package-lock.json
  - package.json
// .gitignore
node_modules
keys.dev.json
keys.prod.json
// config.js:
const nconf = require("nconf");
nconf.argv().env();
if (nconf.get("NODE_ENV") === "production") {
  nconf.file("keys.prod.json");
} else {
  nconf.file("keys.dev.json");
}
...
Including files in .gitignore has no implications whatsoever for deployment to GAE; that file is only used by git.
If you want to prevent deployment of a file to GAE you need to use the skip_files option in your app.yaml file's General settings:
skip_files
Optional. The skip_files element specifies which files in the
application directory are not to be uploaded to App Engine. The value
is either a regular expression, or a list of regular expressions. Any
filename that matches any of the regular expressions is omitted from
the list of files to upload when the application is uploaded.
For example, to skip files whose names end in .bak, add a
skip_files section like the following:
skip_files:
- ^(.*/)?\.bak$
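For illustration, entries matching your key files would look roughly like this (a sketch; note the side notes below: your app reads these files at runtime, so skipping them means supplying the keys some other way):
skip_files:
- ^(.*/)?keys\.dev\.json$
- ^(.*/)?keys\.prod\.json$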
Side notes:
if I understand correctly, your app uses those files, so it appears to me like you will have to deploy them together with your app.
even if a file is deployed on GAE, it is your app's responsibility (and it has complete control) to decide whether the file is exposed to outside requests or not.
if you want to know exactly which files are included in the deployment you can see them displayed during deployment by using the --verbosity option for the gcloud app deploy command.

Setting a Python Script Module with Classes on the Directory of Google App Engine

My question is similar to google app engine app.yaml url handlers. But somehow, my question includes classes.
I just recently transferred customers.py to resources/customers.py. customers.py contains a class named CustomersResources. Here is the app.yaml configuration:
- url: /resources/customers
script: resources.CustomersResources.app
I got the following error:
ImportError('%s has no attribute %s' % (handler, name))
ImportError: <module 'resources' from 'C:\xampp\htdocs\pawnsoftware\trunk\pawnsoftware-0.0.1\resources.pyc'> has no attribute CustomersResources
Edit:
Since I have a conflict between the resources directory and resources.py, I have decided to remove the file resources.py from the root directory. Now I get the following error:
ImportError: No module named resources
The name of your class is resources.customers.CustomersResources and your app is defined in the resources.customers module, so it would be resources.customers.app.
EDIT to reflect changes in question:
It seems you have both a resources folder and a resources.py file. They can't coexist. In your resources folder you need an __init__.py file.
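Put together, the handler would then look roughly like this (a sketch, assuming the WSGI app object in resources/customers.py is named app and that resources/__init__.py exists, even if empty):
- url: /resources/customers
  script: resources.customers.app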
