"Error: Could not load the default credentials." in App Engine production environment - google-app-engine

I have been using Datastore in App Engine for a few weeks and there was no such access issue in production. Today at 1pm SGT, my service suddenly started returning 500 errors with the error message below, although I had not deployed anything to production.
Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
The error probably happened when accessing Datastore with my GCP project's default credentials:
const {Datastore} = require('@google-cloud/datastore');
const datastore = new Datastore();
const [shop] = await datastore.get(
  datastore.key(['Shop', Number(phone)])
);
My stack:
Standard AppEngine with nodejs10
Koa, Next, Datastore
In the App Engine dashboard, as a shot in the dark, I switched to the previous version. It suddenly started working. Then I switched back to the original version, and it worked fine too. Any clue?
Suspect 1: Around that time I was working locally. Although I never deployed, I was executing a few GCP commands. These should not affect production, though:
export GOOGLE_APPLICATION_CREDENTIALS="[my local credential json file]"
gcloud config set project [project-name]

Seems like this problem has to do with App Engine/Cloud Functions cold starts. In this scenario, the environment has not loaded the credentials yet when the client library is being initialized.
Apparently it is fixed in newer versions of the client libraries: https://github.com/googleapis/gapic-generator-typescript/issues/287
But if not, or if you do not want to upgrade your @google-cloud/ client libraries, this is the suggested workaround:
const {Datastore} = require('@google-cloud/datastore');
let datastore;

// If you are using it in a request handler, create the client lazily on the first request
exports.handler = async (req, res) => {
  if (!datastore) {
    datastore = new Datastore();
  }
  const [shop] = await datastore.get(
    datastore.key(['Shop', Number(phone)]) // phone comes from the request, e.g. req.query.phone
  );
};
See also: https://github.com/googleapis/google-auth-library-nodejs/issues/798#issuecomment-591622283
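Since the question's stack uses Koa, here is a minimal sketch of the same lazy-initialization idea inside a Koa handler. The route and the key value are made up for illustration; only the pattern of deferring new Datastore() until the first request is taken from the workaround above.
const Koa = require('koa');
const {Datastore} = require('@google-cloud/datastore');

const app = new Koa();
let datastore; // created on first request, after the runtime has finished loading credentials

app.use(async (ctx) => {
  if (!datastore) {
    datastore = new Datastore();
  }
  // Hypothetical lookup mirroring the question's code
  const [shop] = await datastore.get(datastore.key(['Shop', 123]));
  ctx.body = shop || {};
});

app.listen(process.env.PORT || 8080);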

Related

How to connect a React frontend on Netlify to a Flask backend on PythonAnywhere

TLDR: React app interfaces properly with Flask API on PythonAnywhere when hosted locally but not when a static build is hosted on Netlify. Perhaps the proxy information is missing from the build?
EDIT 1:
Here are the errors in the browser console:
[browser console error screenshot not reproduced here]
I've created a Flask API that pulls machine learning models from Amazon S3 and returns predictions on data input from POST requests. I've put this API on PythonAnywhere.
I've also created a React frontend which allows me to input data, submit it, and then receive the prediction. When I host this app locally, it behaves appropriately (i.e. connecting to the Flask app on PythonAnywhere, loading the models properly, and returning the predictions).
I've tried deploying a static build of the React app on Netlify. It behaves as expected, except for anything that requires interacting with the Flask App. I have a button for testing that simply calls the Flask app in a GET request, and even this is throwing a 404 error.
I checked the error and server logs on PythonAnywhere and see nothing. The only thing I can think of is that the proxy that lists the domain of the PythonAnywhere app in my package.json file is for some reason not included in the build, but I don't know why that would be the case.
Has anyone else run into a similar issue or know how I can check to see if the proxy information is included in the static build? Thanks in advance!
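For reference, the proxy entry mentioned above typically looks like the sketch below (the hostname is a placeholder). In Create React App this proxy is only honoured by the development server started with npm start; it is not baked into a static production build, which fits the symptom described here.
{
  "name": "my-react-frontend",
  "version": "0.1.0",
  "proxy": "https://username.pythonanywhere.com"
}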
Thanks to @Glenn for the help.
Solution:
I realized (embarrassingly late) that the requests were not going to the right address, as can be seen in the browser console error above. I was using a proxy during development, so the Netlify app was calling itself rather than the PythonAnywhere API. I simply went into my React code and edited the paths to point at PythonAnywhere. E.g.
onClick={ async () => {
  const response = await fetch("/get", {...});
  // ...
}}
became
onClick={ async () => {
  const response = await fetch("https://username.pythonanywhere.com/get", {...});
  // ...
}}
As @Glenn mentioned, there may have been a CORS issue as well, so in my Flask application I used flask_cors. I can't say for sure that this was necessary, given that I didn't test removing it after the fetch addresses had changed, but I suspect that it is.
Hopefully this can help someone else
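As a small follow-up (not part of the original answer): one way to avoid hard-coding the host in every fetch call is to keep the base URL in a single helper driven by a build-time environment variable. REACT_APP_API_URL below is a hypothetical name you would set in Netlify's build settings.
// api.js - sketch of a shared helper; REACT_APP_API_URL is assumed, not from the question
const API_BASE = process.env.REACT_APP_API_URL || "https://username.pythonanywhere.com";

export async function apiGet(path) {
  const response = await fetch(`${API_BASE}${path}`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json(); // adjust if the endpoint returns plain text
}

// usage: const result = await apiGet("/get");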

NextJs: The Serverless Function exceeds the maximum size limit of 50mb

I'm new working with NextJs and when trying to deploy my project to Vercel I'm getting the following error:
Error! The Serverless Function "api/auth" is 50.55mb which exceeds the maximum size limit of 50mb.
I have spent a lot of time trying to find a proper answer but didn't find any. Here is the code of the API route I'm using:
const { auth: adminAuth } = require("firebase/admin");

export default async function auth(req, res) {
  const tokenId = req.query.token;
  return new Promise((resolve) => {
    adminAuth
      .verifyIdToken(tokenId)
      .then((user) => {
        res.json(user);
        resolve();
      })
      .catch(() => {
        res.status(302).send("Invalid authentication");
        resolve();
      });
  });
}
I'll be really grateful if anybody can help me, thanks y'all!
I've been dealing with the same issue. It appears that when bundling the serverless function, Vercel is pulling in ALL assets within your project, so 50.55 MB is likely the size of your entire current build. I'm researching how to include only certain files via vercel.json but have so far not figured out exactly how to do that. For now you could probably just remove a few files from your public assets to get under the limit.
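If I remember correctly, vercel.json has a functions property with includeFiles/excludeFiles options, so something along these lines might help; the glob patterns here are made up and untested, so treat this as a sketch rather than a known-good config.
{
  "functions": {
    "api/**/*.js": {
      "excludeFiles": "{public,assets}/**"
    }
  }
}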
This is likely caused by firebase/admin including everything in the firebase package, not just the "admin" parts.
You can verify this by creating a file with only the import and running @vercel/nft to trace the files.
npm init -y
npm add firebase
echo "const { auth: adminAuth } = require('firebase/admin')" > index.js
npm i -g @vercel/nft
nft print index.js
The entire firebase package is quite large, so it's best to follow the recommendation from the Firebase team and use the firebase-admin package inside Serverless Functions.
This SDK (firebase) is intended for end-user client access from environments such as the Web, mobile Web (e.g. React Native, Ionic), Node.js desktop (e.g. Electron), or IoT devices running Node.js. If you are instead interested in using a Node.js SDK which grants you admin access from a privileged environment (like a server), you should use the Firebase Admin Node.js SDK (firebase-admin).
source: firebase NPM
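For illustration, the question's API route rewritten against firebase-admin might look roughly like the sketch below. It assumes default credentials are available to initializeApp (service account setup is not shown), and it returns 401 instead of the original 302; adjust to taste.
// api/auth.js - sketch using firebase-admin instead of the client-side firebase package
const admin = require("firebase-admin");

// Initialize once per serverless instance; assumes credentials come from the environment
if (!admin.apps.length) {
  admin.initializeApp();
}

export default async function auth(req, res) {
  try {
    const user = await admin.auth().verifyIdToken(req.query.token);
    res.json(user);
  } catch (err) {
    res.status(401).send("Invalid authentication");
  }
}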
You could add a .vercelignore file to avoid this.
Ref: https://vercel.com/guides/prevent-uploading-sourcepaths-with-vercelignore
# Ignore everything (folders and files) on root only
/*
!api
!vercel.json
!*.html
!*.css

getUserMedia() audio fails in static website after working in sdk

2 new important facts here
As stated on this Google App Engine page, no app.yaml file is employed. As mentioned in the answer, app.yaml is in fact required, duh.
[contracalls.appspot.com] works when I launch it from my own Mac Terminal from within its own directory as follows:
server:contracalls brian$ gcloud app browse --project contracalls
But, of course, others can't launch from my desktop computer, so I still need a fix. Are the instructions at the Google App Engine page incomplete, perhaps?
jsfiddle.net added
This shows a working jsfiddle version
Line 3 of the following code produces Uncaught (in promise) TypeError: Cannot read property 'getUserMedia' of undefined when deployed here, but works fine in the GAE sdk. (You can try the link yourself.) Any ideas?
Index.js:
const recordAudio = () =>
  new Promise(async resolve => {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    const mediaRecorder = new MediaRecorder(stream);
    let audioChunks = [];
    // ...
  });
Notice:
This is a static website application with only html and javascript employed.
Using Chrome Version 79.0.3945.88 (Official Build) (64-bit).
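For what it's worth, navigator.mediaDevices is only exposed in secure contexts (https:// or localhost), which is consistent with the error above. A defensive check like this sketch makes that failure mode visible instead of throwing on an undefined property:
// Sketch: guard against running on an insecure origin, where navigator.mediaDevices is undefined
if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
  console.error("getUserMedia is unavailable; this page is probably not served over https (or localhost).");
} else {
  navigator.mediaDevices
    .getUserMedia({ audio: true })
    .then(stream => console.log("Microphone access granted", stream))
    .catch(err => console.error("getUserMedia failed:", err));
}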
I chatted and then emailed with a Google support professional who finally figured out the problem (see below). Along the way I discovered that contrary to my updated question, there is an app.yaml file as was described at the first link in the question.
The problem was that the app must be called with https:// and to force that I needed to place secure: always in the app.yaml file.
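For reference, a minimal app.yaml for a static html/javascript site with HTTPS enforced might look roughly like this; the runtime and file layout are assumptions, not taken from the question.
# Sketch of an app.yaml for a purely static App Engine app, forcing https
runtime: python39

handlers:
- url: /
  static_files: index.html
  upload: index.html
  secure: always
- url: /(.*)
  static_files: \1
  upload: (.*)
  secure: always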

Running Firestore locally, e.g. for testing

Is there a way to run Firestore locally (e.g. for testing purposes)?
What would be the approach to writing tests against the DB (other than using mocks)?
Update 2020:
There's now also a Firebase Emulator Suite.
Update Nov 2018:
Local emulation, at least for the purpose of testing Firestore rules, was demoed at Firebase Summit 2018 using @firebase/testing and documented under Test your Cloud Firestore Security Rules.
It looks like it's along the lines of:
const firebase = require('@firebase/testing')
const app = firebase.initializeTestApp({
  projectId: 'my-project',
  auth: { uid: '123', email: 'name@domain.com' }
})
const attempt = app.firestore()
  .collection('colId').doc('docId').get()

// assert one or the other, depending on what your security rules should allow
firebase.assertFails(attempt)
firebase.assertSucceeds(attempt)
It seems early on, as it has not been noted in the release notes, but I'm sure it's coming along.
There is not currently, but stay tuned, as it's something we want to provide.
In the meantime, we suggest using a separate testing project to cover this. The daily free tier per project helps with this too.
You can run the Firestore emulator by running:
gcloud beta emulators firestore start
and then set the FIRESTORE_EMULATOR_HOST environment variable as per the console output (e.g. run export FIRESTORE_EMULATOR_HOST=::1:8505).
This requires the Google Cloud SDK and a Java 8+ JRE installed and on your system PATH.
For Firestore testing, write a JS file, e.g. test.js.
You can test a write event with data in this format:
var data = {
  value: {
    createTime: new Date(),
    updateTime: new Date(),
    fields: {
      name: { stringValue: 'new value data' },
      age: { integerValue: 50 }
    }
  },
  oldValue: {
    createTime: new Date(), // old create time
    updateTime: new Date(), // old update time
    fields: {
      name: { stringValue: 'old value data' },
      age: { integerValue: 50 }
    }
  }
};
testFireStoreEvent(data);
To run it, execute:
firebase experimental:functions:shell < test.js
UPDATE: the following format is valid for write and update events:
var data = {
  before: {
    // your before data
  },
  after: {
    // your after data
  }
};
testFireStoreEvent(data);
There are two libraries which attempt to facilitate mocking of the firebase sdk.
1) https://github.com/soumak77/firebase-mock
2) https://github.com/mikkopaderes/mock-cloud-firestore
I currently use the first one, since it seems to have a bit more of the SDK implemented.
They're not perfect, but they're currently sufficient for my needs, and are preferable to the other approaches since they're entirely in-process.
Note that firebase-mock (#1) does cause a webpack error if used as-is from Webpack/web code. To resolve, you can use option #2 (mock-cloud-firestore), or use the workaround mentioned here (until a fix gets merged): https://github.com/soumak77/firebase-mock/issues/157#issuecomment-545387665
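Going by firebase-mock's README, in-process usage looks roughly like the sketch below. The class and method names are taken from that README and not verified here, so double-check them against the project docs before relying on this.
// Rough sketch of firebase-mock's Firestore mock; API names are assumptions, check the project README
const firebasemock = require('firebase-mock');

const mockfirestore = new firebasemock.MockFirestore();
mockfirestore.autoFlush(); // resolve queued operations immediately

async function run() {
  await mockfirestore.collection('shops').doc('shop-1').set({ name: 'Test shop' });
  const snapshot = await mockfirestore.collection('shops').doc('shop-1').get();
  console.log(snapshot.data());
}

run().catch(console.error);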
Other options:
3) Firestore emulator: needs the google-cloud-sdk, and relies on a separate process
4) Separate test project: relies on connection to the internet, which also means possible quota limitations/costs
5) firebase-server: Only supports the realtime-database api, not Firestore
Firestore can be set up locally using gcloud.
Start the Firestore emulator by running gcloud beta emulators firestore start --host-port=localhost:8081; if it started successfully you will see Dev App Server is now running.
In case you are using @google-cloud/firestore, create the Firestore instance this way:
// Firestore instance only for test env
const { Firestore } = require('@google-cloud/firestore')
const instance = new Firestore({ projectId: 'your-project-id', host: 'localhost', port: 8081 })
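A quick usage sketch against that emulator-backed instance (the collection and field names are made up):
// Write and read a document through the local emulator instance created above
async function smokeTest() {
  const doc = instance.collection('shops').doc('shop-1');
  await doc.set({ name: 'Test shop', phone: 12345678 });
  const snapshot = await doc.get();
  console.log(snapshot.exists, snapshot.data());
}

smokeTest().catch(console.error);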
Now you have the option to work with the local Firestore emulator by pointing the web SDK at localhost:
var db = firebaseApp.firestore();
if (location.hostname === "localhost") {
  db.settings({
    host: "localhost:8080",
    ssl: false
  });
}
https://firebase.google.com/docs/emulator-suite/connect_and_prototype#instrument_your_app_to_talk_to_the_emulators

A second node server (or port) won't start in production (Elastic Beanstalk)

I have a Node/Angular app I'm trying to deploy. It uses two Node servers: one essentially serves the app; the other fetches data from an API when the app requests a specific port, and stores that data locally.
I've got it working perfectly on my own local machine. However, when I deploy to production environments -- either Heroku or AWS Elastic Beanstalk -- I find that the second script either won't run or won't start properly. The end result is, it doesn't get the data I need.
Here are the two scripts; they're both set to run in package.json under "start": "node main.js & node node-server.js"
main.js (again, this one seems to be serving the app just fine):
var express = require('express');
var app = express();
app.use(express.static(__dirname + '/app'));
app.listen(process.env.PORT || 3000);
node-server.js (the one that doesn't seem to work; no data is gathered or populated in the app):
var http = require('http');
var request = require('request'); // note: request must be required for request.get below
var fs = require('fs');
var port2 = 1234;

// We need a function which handles requests and sends a response
function handleRequest(req, res) {
  request.get({
    url: 'http://sample-url.json',
    qs: {
      url: 'http://sampletool/pb/newsletter/?content=true'
    }
  }, function (err, result) {
    res.end(result.body);
    fs.writeFile('app/data.json', result.body, function (err) {
      if (err) return console.log(err);
      console.log('API data > data.json');
    });
  });
}

// Create a server
var server = http.createServer(handleRequest);

// Let's start our server
server.listen(port2, function () {
  // Callback triggered when server is successfully listening. Hurray!
  console.log("Server listening on: http://0.0.0.0:%d", port2);
});
Then, the main Angular app calls this port (http://0.0.0.0:1234) when the page is loaded, to request new data.
Elastic Beanstalk is using nginx, something I'm not super familiar with and that I don't have running on my local machine.
Is there something big I'm missing in configuring multiple node.js servers to be running on different ports in a production environment? Thanks in advance for any help.
For security reasons, cloud service providers typically allow an application to listen on only one port, which is dynamically and randomly assigned via the PORT environment variable. Read this section from the Heroku documentation to understand more about this.
This is why the main app (main.js) that uses process.env.PORT is working and the other app (node-server.js) that uses hard-coded 1234 is not.
This question has some pointers about the feasibility of multiple ports on Heroku (though, there is no good news there, I am afraid).
As for how to go about fixing this, one thing that could be tried is to split this into two separate apps that are deployed separately, each with its own package.json etc.
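Another common way to live within the single-port constraint (not part of the answer above, just a frequently used pattern) is to serve the static files and the data route from one Express app on the same process.env.PORT. A rough sketch, reusing the request call from the question:
// Sketch: one Express app serving both the static frontend and the data endpoint on a single port
var express = require('express');
var request = require('request');

var app = express();
app.use(express.static(__dirname + '/app'));

// The Angular app would call /api/data on the same origin instead of a second server on port 1234
app.get('/api/data', function (req, res) {
  request.get({
    url: 'http://sample-url.json', // placeholder URL copied from the question
    qs: { url: 'http://sampletool/pb/newsletter/?content=true' }
  }, function (err, result) {
    if (err) return res.status(500).send(err.message);
    res.send(result.body);
  });
});

app.listen(process.env.PORT || 3000);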
