I have a Node.js App Engine app that uses request-promise-native to fetch external URLs. App Engine handlers launched from Cloud Tasks are supposed to have up to a 24-hour deadline, but all my tasks are being killed after 60 seconds. I know other runtimes have to use a built-in URLFetch library, which gives them an automatic 60-second task deadline, but the docs don't say anything about Node.js-based App Engine tasks. This task fetches JSON from an external URL, then processes the results and inserts them into Firebase, which is typically what is happening when the task is killed with an HTTP 504 Deadline Exceeded error:
// assumes: rp = require('request-promise-native'), fsdb = a Firestore instance,
// and getConfigById / getUnusedUPC / cleanProductBody / EXTERNAL_URL defined elsewhere
app.post('/tasks/import', async (req, res) => {
  const paramsObj = JSON.parse(Buffer.from(req.body, 'base64').toString('utf-8'));
  const page = paramsObj.page * 1;
  const storeID = paramsObj.storeid;
  const pageSize = 50;
  const config: StoreConfig = await getConfigById(storeID);
  const options = {
    uri: EXTERNAL_URL,
    json: true,
    resolveWithFullResponse: false,
    qs: {
      limit: pageSize,
      page: page,
    },
  };
  try {
    const results = await rp.get(options);
    if (results.products.length === 0) {
      return res.status(200).end();
    }
    const prodLooper = idx => {
      const product = results.products[idx];
      product.store_id = storeID;
      product.body_html = cleanProductBody(product.body_html);
      getUnusedUPC(storeID)
        .then(upcID => {
          product.upc = upcID;
          fsdb
            .collection('products')
            .add(product)
            .then(() => {
              idx++;
              if (idx < results.products.length) {
                prodLooper(idx);
              } else {
                return res.send('OK').end();
              }
            });
        })
        .catch(err => {
          console.log(err.code, ':', page, ':', idx);
          if (err.code === 10) {
            setTimeout(() => {
              prodLooper(idx);
            }, 10);
          }
        });
    };
    prodLooper(0);
  } catch (error) {
    console.log('caught error:');
    console.log(error.statusCode);
    console.log(error.message);
    return res.status(500).end();
  }
});
As per the documentation:
google.appengine.runtime.DeadlineExceededError: raised if the overall
request times out, typically after 60 seconds, or 10 minutes for task
queue requests.
The same documentation also lists some common errors that can cause this, along with suggestions on how to avoid them.
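If the work regularly needs more time than the runtime allows, one common pattern along those lines is to keep each task small and have the handler enqueue a follow-up task for the next page instead of looping through everything in one request. Below is a minimal sketch using the @google-cloud/tasks client; the PROJECT/LOCATION/QUEUE values and the enqueueNextPage helper are placeholders, not something from the question:

const { CloudTasksClient } = require('@google-cloud/tasks');
const tasksClient = new CloudTasksClient();

// hypothetical helper: enqueue the next page as its own task so no single
// request has to outlive the runtime's deadline
async function enqueueNextPage(storeid, page) {
  const parent = tasksClient.queuePath('PROJECT', 'LOCATION', 'QUEUE');
  // the handler above base64-decodes req.body, so ship the base64 text
  const payload = Buffer.from(JSON.stringify({ storeid, page })).toString('base64');
  const task = {
    appEngineHttpRequest: {
      httpMethod: 'POST',
      relativeUri: '/tasks/import',
      body: Buffer.from(payload),
    },
  };
  await tasksClient.createTask({ parent, task });
}

With that in place, the handler can process one page, call enqueueNextPage(storeID, page + 1), and return well inside the deadline.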
After calling the refresh-token endpoint to refresh the user's auth tokens, local storage does not update the token field consistently. Sometimes local storage is updated properly and the app works well; other times the token and admin/student fields are deleted from local storage even though no error is logged and the endpoint returns a success response. How do I fix this? Code below:
import { parseTokens, parseAdmin, parseUser } from "../utils/auth-parser";
import { adminAuthFetch } from "../config/axios/axios-admin.config";
import { studentAuthFetch } from "../config/axios/axios-user.config";

export const refresher = async () => {
  const admin = parseAdmin();
  const student = parseUser();
  const token = parseTokens();
  if (!admin && !student) {
    return;
  }
  if (admin && !student) {
    console.log(
      "==========================refreshing token==================",
      new Date().getMilliseconds()
    );
    try {
      const response = await adminAuthFetch.post(`/auth/refresh-tokens`, {
        refresh_token: token.refresh,
      });
      const data = response?.data;
      console.log(data);
      localStorage.setItem(
        "tokens",
        JSON.stringify({
          access: data?.data?.auth_token,
          refresh: data?.data?.refresh_token,
        })
      );
    } catch (error) {
      console.log(error);
      localStorage.removeItem("tokens");
      localStorage.removeItem("admin");
    }
  } else if (student && !admin) {
    console.log(
      "==========================refreshing student token==================",
      new Date().getMilliseconds()
    );
    try {
      const response = await studentAuthFetch.post(`/auth/refresh-tokens`, {
        refresh_token: token.refresh,
      });
      const data = response?.data;
      console.log(data);
      localStorage.setItem(
        "tokens",
        JSON.stringify({
          access: data?.data?.auth_token,
          refresh: data?.data?.refresh_token,
        })
      );
    } catch (error) {
      console.log(error);
      localStorage.removeItem("tokens");
      localStorage.removeItem("student");
    }
  }
};
Here's the effect that is called from the root app:
const refreshFunction = () => {
  if (!refreshRef.current) {
    refreshRef.current = true;
    refresher();
  } else {
    refreshRef.current = false;
  }
};

useEffect(() => {
  const timer = setInterval(refreshFunction, 1000 * 60 * 2);
  return () => clearInterval(timer);
}, []);
Despite receiving a success response from the endpoint and ensuring the refresher function is called only once with the useRef check, the token field in local storage doesn't update consistently. Sometimes the values are updated; sometimes they are deleted without an error being logged to the console. I tried removing Strict Mode, but it still does not work.
Without being certain about how everything in your code works, it's possible that, despite your best intentions, the refresher function is running twice.
Could you share more context around the React version you're using? If you're using version 17, try doing something like this:
let log = console.log
at the top level of your code, and use it for logging instead. My working theory is that some form of console.log suppression is happening on a second render, which is why you're not seeing the logs even though the localStorage.removeItem call is still executing.
Let me know the React version, and we can continue debugging.
Currently, I'm working on a messaging platform and am implementing pagination for my messages, loading earlier messages when a user scrolls to the top of the page. Right now, it calls this function multiple times and adds the same messages to the current array. Can anyone help me solve this issue so each message is only added to the array once?
setMessages((prev) => { return [...res.data.results, ...prev];});
This is the code in my frontend that prepends the messages fetched for the new page whenever a user scrolls to the top of the page:
useEffect(() => {
  const getLimitMessages = async () => {
    try {
      let $chatBoxTop = $(".chatBoxTop")[0];
      // get the highest page number
      const page = Math.round(messageLength / 10);
      // On initial load - load the first 10 messages
      if (messages.length === 0) {
        const res = await axios.get(
          `/messages/pagination/${currentChatId}?page=${page}&limit=10`
        );
        setMessages(res.data.results);
      }
      $($chatBoxTop).on("scroll", async function () {
        // if user scrolls to the top of the div
        if ($($chatBoxTop).scrollTop() === 0) {
          pageRef.current = page;
          // get the previous page
          pageRef.current = pageRef.current - 1;
          console.log("page number is " + pageRef.current);
          if (parseInt(pageRef.current) !== 0) {
            const res = await axios.get(
              `/messages/pagination/${currentChatId}?page=${pageRef.current}&limit=10`
            );
            // add to the front of current list of messages
            setMessages((prev) => {
              return [...res.data.results, ...prev];
            });
            console.log("Pagination res.data is ", res.data.results.length);
          } else {
            return;
          }
        }
      });
    } catch (err) {
      console.log(err);
    }
  };
  getLimitMessages();
}, [currentChat, messages]);
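One likely source of the duplicates: the effect above re-runs on every messages change and attaches a fresh scroll handler each time without removing the previous one, so a single scroll-to-top fires several identical fetches. Below is a minimal sketch of the same handler with a cleanup, assuming the same jQuery setup and that pageRef.current is seeded with the highest page number elsewhere; this is a sketch, not a drop-in fix:

useEffect(() => {
  const $chatBoxTop = $(".chatBoxTop")[0];
  const onScroll = async () => {
    if ($($chatBoxTop).scrollTop() === 0 && pageRef.current > 1) {
      pageRef.current -= 1;
      const res = await axios.get(
        `/messages/pagination/${currentChatId}?page=${pageRef.current}&limit=10`
      );
      setMessages((prev) => [...res.data.results, ...prev]);
    }
  };
  $($chatBoxTop).on("scroll", onScroll);
  // detach this exact handler when the effect re-runs or the component
  // unmounts, so only one handler is ever attached at a time
  return () => $($chatBoxTop).off("scroll", onScroll);
}, [currentChatId]);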
This is my backend endpoint to paginate the messages
router.get(
  "/messages/pagination/:conversationId",
  middleware.isLoggedIn,
  async (req, res) => {
    const page = parseInt(req.query.page);
    const limit = parseInt(req.query.limit);
    const offset = parseInt(req.query.offset); // parsed but never used below
    const startIndex = (page - 1) * limit;
    const endIndex = page * limit;
    try {
      const messages = await Message.find({
        conversationId: req.params.conversationId,
      });
      const results = {};
      if (startIndex > 0) {
        results.previous = {
          page: page - 1,
          limit: limit,
        };
      }
      if (endIndex < messages.length) {
        results.next = {
          page: page + 1,
          limit: limit,
        };
      }
      results.results = messages.slice(startIndex, endIndex);
      res.status(200).json(results);
    } catch (err) {
      res.status(500).json(err);
    }
  }
);
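As an aside, the endpoint above loads every message in the conversation and slices the array in memory; MongoDB can do that windowing server-side. Here is a sketch of just the query portion using Mongoose's skip/limit, where the countDocuments call and the createdAt sort field are assumptions about the schema:

// page in the query itself instead of slicing the full result set
const total = await Message.countDocuments({
  conversationId: req.params.conversationId,
});
const pageOfMessages = await Message.find({
  conversationId: req.params.conversationId,
})
  .sort({ createdAt: 1 }) // assumes a createdAt field for a stable order
  .skip(startIndex)
  .limit(limit);

results.results = pageOfMessages;
if (endIndex < total) {
  results.next = { page: page + 1, limit: limit };
}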
I'm having trouble figuring out how to delete a document from MongoDB after a timeout. Can anyone help me out with a simple way of doing it, and maybe explain to me why this one is wrong? (It works, the document is deleted after some time, but I get an error message and the server stops.)
This is the code I used. To note: the document is deleted after the setTimeout runs out, but the server then stops with a terminal error message.
The documents are pretty simple and consist of these:
server.js
import express from "express";
import cors from "cors";
import mongoose from "mongoose";
import shareRoutes from "./routes/shares.js";

const app = express();
app.use(cors());

app.get("/", (req, res) => {
  res.json({ message: "API running..." });
});

app.use(express.json());
app.use("/radar", shareRoutes);

mongoose
  .connect(
    "mongodb+srv://<credentials>@cluster0.dqlf2.mongodb.net/locations?retryWrites=true&w=majority",
    { useNewUrlParser: true, useFindAndModify: false }
  )
  .then(() => {
    app.listen(5000, () => {
      console.log("Server Running on port 5000");
    });
  })
  .catch((err) => {
    console.log(err);
  });
shares.js for the route
import express from "express";
import {
  createLocation,
  getLocations,
} from "../controllers/shareController.js";

const router = express.Router();

// create location
router.post("/", createLocation);
// get locations
router.get("/", getLocations);

export default router;
shareController.js
import express from "express";
import shareLocation from "../models/shareLocation.js";

const router = express.Router();

export const createLocation = async (req, res) => {
  const { latitude, longitude, dateShared, timeShared, city, road } = req.body;
  const newLocation = shareLocation({
    latitude,
    longitude,
    dateShared,
    timeShared,
    city,
    road,
  });
  try {
    await newLocation.save();
    res.status(201).json(newLocation);
    setTimeout(() => {
      (async () => {
        try {
          await shareLocation.findByIdAndRemove(newLocation._id);
          res.json({ message: "Shared location deleted" });
        } catch (error) {
          res.status(409).json({ message: error });
        }
      })();
    }, 30000);
  } catch (error) {
    res.status(409).json({ message: newLocation });
  }
};

export const getLocations = async (req, res) => {
  try {
    const locations = await shareLocation.find({});
    res.status(200).json(locations);
  } catch (error) {
    console.log(error);
    res.status(409).json("Unable to fetch Locations");
  }
};

export const deleteLocation = async (req, res) => {
  const { id } = req.params;
  try {
    await shareLocation.findByIdAndRemove(id);
    res.json({ message: "Shared location deleted" });
  } catch (error) {
    res.status(409).json({ message: error });
  }
};

export default router;
shareLocations.js for the schema
import mongoose from "mongoose";

const hours = new Date().getHours().toLocaleString();
const minutes = new Date().getMinutes().toLocaleString();
const actualHours = hours.length < 2 ? "0" + hours : hours;
const actualMinutes = minutes.length < 2 ? "0" + minutes : minutes;

const locationSchema = mongoose.Schema({
  timeShared: {
    type: String,
    default: actualHours + ":" + actualMinutes,
  },
  dateShared: {
    type: String,
    default: new Date().toDateString(),
  },
  latitude: {
    type: String,
    required: true,
  },
  longitude: {
    type: String,
    required: true,
  },
  city: {
    type: String,
  },
  road: {
    type: String,
  },
});

export default mongoose.model("shareLocation", locationSchema);
I'll start with the "proper" solution; we'll look at what's wrong with the code afterwards.
Mongo provides a built-in way to remove documents after a certain time period: the "proper" way is to build a TTL index on the relevant field and specify after how many seconds you want the document deleted.
Mongo will then periodically check the index and clear documents when their time is up. This removes several layers of complexity from your app (for example, the timeout approach can easily cause a memory leak if too many calls arrive in a short time window).
A TTL index is created by using the simple createIndex syntax:
db.collection.createIndex( { "created_at": 1 }, { expireAfterSeconds: 30 } )
This will make documents expire 30 seconds after creation; you'll just have to add this timestamp in your code:
const newLocation = shareLocation({
  latitude,
  longitude,
  dateShared,
  timeShared,
  city,
  road,
  created_at: new Date(), // this new line is required
});
I can also tell you're using Mongoose, and Mongoose provides a creation timestamp (createdAt) automatically if you set the schema to include timestamps, meaning your app doesn't even need to set that field itself.
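For illustration, a sketch of what that could look like on this schema; with timestamps enabled, the TTL index goes on the auto-generated createdAt field:

const locationSchema = new mongoose.Schema(
  {
    latitude: { type: String, required: true },
    longitude: { type: String, required: true },
    city: String,
    road: String,
  },
  { timestamps: true } // Mongoose fills in createdAt / updatedAt automatically
);

// TTL index: Mongo's background task deletes each document roughly
// 30 seconds after its createdAt value
locationSchema.index({ createdAt: 1 }, { expireAfterSeconds: 30 });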
Now, what's wrong with your code?
It's simple: you first respond to the request on this line:
res.status(201).json(newLocation);
But then, after the 30-second timeout, you try to respond again to the same request:
res.json({ message: "Shared location deleted" });
Node does not allow this behavior: you can only send the headers once (which happens when you respond). I won't go into detail about why, as there are many Stack Overflow answers that explain it.
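Node even exposes this state on the response object as res.headersSent, which makes the failure mode easy to see; a minimal sketch:

res.status(201).json(newLocation); // the headers are sent here
console.log(res.headersSent);      // true from this point on
// ...30 seconds later, inside the setTimeout callback:
res.json({ message: "Shared location deleted" }); // throws ERR_HTTP_HEADERS_SENT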
Apart from the obvious issue that crashes your app, other issues can arise from this code. As mentioned before, a memory leak can easily crash your app, and if your app restarts for whatever reason, the locations that were "pending" in memory will never be cleared from the DB.
This is why it's recommended to let the DB handle the deletion.
I am synchronising clocks in two JavaScript clients by writing to Firestore every second and then subscribing the second, remote client (the "slave") to the document.
This works fine, and I can read the document changes in real time based on this method of creating the document reference:
const useCloudTimer = (isMaster, opponentCCID, userCCID) => {
  const dispatch = useDispatch();
  const { localRemainingTimeMS, clockTimeOut } = useSelector((state) => state.clock);
  const timerRef = useRef(null);

  useEffect(() => {
    const db = firebase.firestore();
    timerRef.current = db.collection('timers').doc(`${isMaster
      ? userCCID + opponentCCID
      : opponentCCID + userCCID
    }`);
    dispatch(clock(CLOCK, { cloudTimerDbRef: timerRef.current }));
  }, []);

  const pushTimer = async (duration) => {
    try {
      await timerRef.current.set({ duration });
    } catch (error) {
      logger.error(error, 'cloud timer');
    }
  };

  useEffect(() => { if (isMaster) pushTimer(localRemainingTimeMS); }, [localRemainingTimeMS]);

  const getTimer = async () => {
    try {
      const unsubscribeRemoteTimer = await timerRef.current.onSnapshot((doc) => {
        if (!clockTimeOut && doc.exists) {
          const duration = Number(doc.data().duration);
          dispatch(clock(CLOCK, { remoteRemainingTimeMS: duration }));
        }
      });
      if (clockTimeOut) unsubscribeRemoteTimer().then((arg) => console.log('unsubscribeRemoteTimer', arg));
    } catch (error) {
      logger.error(error, 'getTimer');
    }
  };

  useEffect(() => { if (!isMaster) getTimer(); }, []);
};

export default useCloudTimer;
The problem is when I want to delete the document. If the client that did not create the document tries to delete it, a new document with the same name is created for a split second, and then that one is deleted; at that exact moment, two documents with the same name are visible in the console. (A document shows green when it's being written to and red when it's being deleted.)
My document ref is being stored in Redux and then used via the store when required:
export const deleteCloudTimer = async (timerRef) => {
  if (timerRef) {
    try {
      await timerRef.delete();
    } catch (error) {
      logger.error(error, 'Error removing document: ');
    }
  }
};
How can my Firebase client app delete a document if it didn't create it?
Tasks requiring precise timing, especially with a high volume of reads/writes, are not a good fit for Firestore. You might consider an alternative GCP product instead.
I am using loggingPrefs in my test to log all network calls/HTTP requests made during the test. I want to know if it's possible to start logging performance after a certain test step and stop once that step is done, so that I can validate that certain HTTP requests were made after the step is executed.
For example:
//Start Recording Performance after user is logged in
//and validate certain HTTP request is made
//Stop Recording Performance
//Log out from application - I do not want to record performance for Logout step
This should do the job; adapt the code to your needs. Do not forget to attach the browser capabilities:
'goog:chromeOptions': {
  'perfLoggingPrefs': {
    'enableNetwork': true,
    'enablePage': false
  }
},
loggingPrefs: {
  'performance': 'ALL'
}
Example spec:
import { browser } from 'protractor';

describe('dummy test', () => {
  it('should do something', async () => {
    const startTime = Date.now();
    await browser.waitForAngularEnabled(false);
    await browser.get('https://google.com');
    await browser.sleep(5000);
    const endTime = Date.now();
    await collectPerformLog(startTime, endTime);
  });
});
export async function collectPerformLog(timeFrom, timeTo) {
  await browser.manage().logs().get('performance').then((browserLog) => {
    const fs = require('fs-extra');
    const filePath = '/performance/performance.log';
    fs.ensureDirSync('/performance/');
    // keep only the entries logged between the two timestamps
    const filteredData = browserLog
      .filter(data => timeFrom < data.timestamp && timeTo > data.timestamp);
    fs.appendFileSync(filePath, JSON.stringify(filteredData));
  }).catch((e) => {
    console.log('Error collecting performance logs.', e);
  });
}
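Each entry returned by the performance log carries a message field that is a JSON string in Chrome DevTools Protocol format, so once you have the filtered slice you can assert that a particular request happened inside the time window. A sketch of such a check, where the helper name and URL fragment are placeholders:

// hypothetical helper: did any network request to a URL containing
// urlPart get recorded in these log entries?
export function hasNetworkRequest(logEntries, urlPart) {
  return logEntries.some((entry) => {
    const { message } = JSON.parse(entry.message);
    return (
      message.method === 'Network.requestWillBeSent' &&
      message.params.request.url.includes(urlPart)
    );
  });
}

Called with the filteredData collected in collectPerformLog, this lets the test assert that a request fired between startTime and endTime without recording anything for the logout step.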