I am trying to figure out the best way to prevent random people from making requests to the server.
Currently, everyone who has the URL and the endpoint can easily make requests, and I believe that is something of a security hole.
I am using Express.js to host a static build of a React.js app.
Here is an example of a call:
// Client:
async function getData(id) {
  try {
    const res = await axios.get(`${backendDomain}/data/${id}`)
    const data = res.data
    return data
  } catch (error) {
    ...
  }
}
// Server:
app.get('/data/:id', function(req, res) {
  ...logic
  res.send(data);
});
I tried adding an "x-api-key" header on the client, passing an API key that only I have, and adding a middleware that checks whether the key arrived correctly from the client. But obviously it is not a good solution, because you can see the key in the "Network" tab while inspecting.
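Roughly, the middleware I tried looks like this (simplified; the environment variable name is just illustrative):

// Express middleware sketch of the x-api-key check described above.
// API_KEY is an assumed environment variable holding the shared secret.
app.use('/data', (req, res, next) => {
  if (req.get('x-api-key') !== process.env.API_KEY) {
    return res.status(401).send('Unauthorized');
  }
  next();
});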
What can I do?
Looking for the best way to prevent random people from accessing the data.
I am having trouble receiving array data as it flows from another source into my client. My goal is to have the HTML document populate with data from the array as it is received by the server.
Server:
const keywordsList = [];
const keywordsListSerialize = JSON.stringify(keywordsList);
const arrayToString = JSON.stringify(Object.assign({}, keywordsList))
// const keywordsListObject = JSON.parse(arrayToString);

app.use(cors())
const bodyParser = require('body-parser');

app.get('/', (req, res) => {
  // res.writeHead(200, {
  //   'Connection': 'keep-alive',
  //   'Content-Type': 'text/event-stream',
  //   'Cache-Control': 'no-cache',
  // });
  // res.flushHeaders();
  res.send(keywordsList);
  // res.write(arrayToString);
  // res.status(200).send(arrayToString);
  // res.status(500).send({ error: 'something blew up' })
});
As you can see from the server code, I have tried multiple variations of sending the data as a JSON object, a plain array, or a string, and I can't seem to get ANY of it to even show up on my client.
Client:
var xhttp = new XMLHttpRequest();
console.log(xhttp);
xhttp.onreadystatechange = function() {
  if (this.readyState == 4 && this.status == 200) {
    console.log(xhttp.responseText)
    var jsonObj = JSON.parse(xhttp.responseText);
    for (i = 0; i < jsonObj.length; i++) {
      document.getElementById("keywordsList").innerHTML = jsonObj;
    }
  }
};
xhttp.open("GET", "/", true);
xhttp.send();
The client does two very basic but interesting things: 1. the xhttp.responseText is the whole HTML file, which makes me think I should be handling the request somehow on my server (I figured I could get away with just constantly streaming data to the client), and 2. it sends me an error:
VM812:1 Uncaught SyntaxError: Unexpected token < in JSON at position 0 at JSON.parse (<anonymous>) at XMLHttpRequest.xhttp.onreadystatechange
I believe I am having multiple issues. I understand that I will want to send the data as a JSON object and parse it on the client, but even in the most basic examples I can't get that to work. I have double-checked that my client and server are set up correctly, and if need be, I can describe how they are set up to make sure they are linked correctly locally.
I am looking for guidance as well as possibly a technical answer. Please do not just link me the AJAX documentation, because I have probably read it three times over (minimum) at this point. I thought of just throwing everything away and using a WebSocket, as it would accomplish what I am trying to achieve, but that means learning all of that.
Thank you for your time.
So I figured it out eventually. The only post above from someone else is correct; it really is that simple to send data across. Unfortunately, when trying to understand AJAX and Express at the same time, you get confused about whether you are really sending data AND whether that data is structured correctly so the client doesn't just throw errors. I was conflating AJAX and Express issues, which could have been resolved quicker had I understood how to test things correctly, but alas, I am working this all out by myself.
For any poor sap who might come across this: probably don't use XMLHttpRequest, and just use jQuery.
Thanks for the response.
Server:
const express = require('express')
const app = express()
const port = 3000

const keywordsList = ['apple', 'banana', 'cucumber']

app.get('/', (req, res) => {
  res.send(JSON.stringify(keywordsList))
})

app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`)
})
Testing the web server:
curl http://localhost:3000
["apple","banana","cucumber"]
To iterate over the response in the client:
const arrayOnClientSide = JSON.parse(xhttp.responseText);
arrayOnClientSide.forEach(element => console.log(element));
which outputs:
apple
banana
cucumber
Note that every time you call .innerHTML you will overwrite whatever value it previously held, so construct the presentation format of your display value first, then assign it to innerHTML at the end.
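For example, a minimal sketch of that pattern (assuming the page has a container element with id "keywordsList"):

const arrayOnClientSide = JSON.parse(xhttp.responseText);
// Build the full markup first...
const listMarkup = arrayOnClientSide.map(element => `<li>${element}</li>`).join('');
// ...then assign it to innerHTML once at the end.
document.getElementById("keywordsList").innerHTML = listMarkup;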
I am currently using a React hook powered component to record my screen and subsequently upload it to Google Cloud Storage. However, when it finishes, the file created inside Google Cloud Storage appears to be corrupt.
This is the gist of the code within my React component, where useMediaRecorder is from here: https://github.com/wmik/use-media-recorder -
let {
  error,
  status,
  mediaBlob,
  stopRecording,
  getMediaStream,
  startRecording,
  liveStream,
} = useMediaRecorder({
  onCancelScreenShare: () => {
    stopRecording();
  },
  onDataAvailable: (chunk) => {
    // do the uploading here:
    onChunk(chunk);
  },
  recordScreen: true,
  blobOptions: { type: "video/webm;codecs=vp8,opus" },
  mediaStreamConstraints: { audio: audioEnabled, video: true },
});
As data becomes available through this hook, it calls onChunk(chunk), passing a binary Blob to that method. To perform the upload, I tie in with this section of code:
const onChunk = (binaryData) => {
  var formData = new FormData();
  formData.append("data", binaryData);
  let customerApi = new CustomerVideoApi();
  customerApi.uploadRecording(
    videoUUID,
    formData,
    (res) => {},
    (err) => {}
  );
};
customerApi.uploadRecording looks like this (using axios).
const uploadRecording = (uuid, data, fn, fnErr) => {
  axios
    .post(endpoint + "/stream/upload", data, {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    })
    .then(function (response) {
      fn(response);
    })
    .catch(function (error) {
      fnErr(error.response);
    });
};
The HTTP request succeeds, and all is well with the world. The server-side upload code is based on Laravel:
// this is inside the controller.
public function index( Request $request )
{
    // Set file attributes.
    $filepath = '/public/chunks/';
    $file = $request->file('data');
    $filename = $uuid . ".webm";

    // streamupload
    File::streamUpload($filepath, $filename, $file, true);

    return response()->json(['uploaded' => true, 'uuid' => $uuid]);
}
// there's a service provider used to create a new macro on the File:: facade, providing the facility for appropriately handling the stream:
public function boot()
{
    File::macro('streamUpload', function ($path, $fileName, $file, $overWrite = true) {
        $resource = fopen($file->getRealPath(), 'r+');

        $storageClient = new StorageClient([
            'projectId' => 'myprjectid',
            'keyFilePath' => '/my/path/to/servicejson.json',
        ]);
        $bucket = $storageClient->bucket('mybucket');
        $adapter = new GoogleStorageAdapter($storageClient, $bucket);
        $filesystem = new Filesystem($adapter);

        return $overWrite
            ? $filesystem->putStream($fileName, $resource)
            : $filesystem->writeStream($fileName, $resource);
    });
}
So to reiterate:
1) The React app chunks out blobs,
2) the server side determines if it should create or append in Google Cloud Storage,
3) the server side succeeds,
4) the video inside the Google Cloud platform is corrupted.
However, the video file inside the Google Cloud container is corrupted and won't play. I'm unsure exactly why it is corrupted, but my guesses so far:
Some sort of dodgy MIME type problem: different browsers seem to handle the codec/filetype from the MediaRecorder differently, e.g. Chrome seems to produce x-matroska (.mkv?), Firefox something different again. Ideally I would have a .webm container. Notice how I set the file name server side and it isn't coming from the client; should it? I'm unsure how to force the MediaRecorder to use a specific mimeType. I thought the blobOptions option should do it, but changing the extension and MIME type seems to have little to no impact on the corruption occurring (see the mimeType sketch after this list of guesses).
Some sort of problem during upload where an HTTP request doesn't execute and finish in order - e.g.
1 onDataAvailable completes second
2 onDataAvailable completes first
3 onDataAvailable completes third
I've sort of ruled this out because I think the chunks should be small enough.
Some sort of problem with Google Cloud Storage APIs that I'm using, perhaps in the wrong way? Does the cloud platform support streaming, and does this library send the correct params to do so?
Some sort of problem with how I'm uploading - should the axios headers be multipart formdata, or something else?
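For reference on the MIME type guess, here is roughly how pinning the container with the raw MediaRecorder API might look (the useMediaRecorder hook may expose this differently, so treat the wiring as an assumption):

// Pick the first container/codec combination this browser can actually record,
// then pass it to MediaRecorder explicitly instead of relying on the default.
async function startTypedRecording() {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
  const preferred = [
    'video/webm;codecs=vp8,opus',
    'video/webm',
    'video/x-matroska;codecs=avc1', // the fallback Chrome sometimes produces
  ];
  const mimeType = preferred.find((t) => MediaRecorder.isTypeSupported(t));
  const recorder = new MediaRecorder(stream, { mimeType });
  recorder.start(5000); // emit a chunk roughly every 5 seconds
  return recorder;
}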
This is the package I'm using for the Server side: https://github.com/Superbalist/flysystem-google-cloud-storage
Can anyone shed any light on how to achieve this goal of streaming up into Google Cloud without the video from the MediaRecorder being corrupted? Hopefully there's enough detail here in the question to help figure it out. The problem, as illustrated, isn't getting the file as far as Google Cloud, but rather the resulting file being unplayable in any video format.
Update
I've ordered my chunks client side now, and queued them properly before letting them reach the server. It made no difference to the output. As some have suggested, a single-blob upload request works fine.
I also tried using the 'resumable' config param (from reading the source code it seems like chunks need to be a certain size before Google recognises them as a resumable upload):
$filesystem = new Filesystem($adapter, [
    'resumable' => true
]);
I'm not sure how https://cloud.google.com/storage/docs/performing-resumable-uploads is implemented within the libraries I'm using (or within the Google Cloud APIs themselves, if at all). Do I need to implement that myself? Documentation is light on Google's part.
Short version:
The first thing you should do is buffer the whole video locally and send it as a single payload to the server and on to Google Cloud Storage. This will validate that your code is actually correct for a small video. Once you can verify this, you can move on to handling multi-chunk uploads.
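A minimal sketch of that sanity check, reusing the names from the question (onChunk, CustomerVideoApi, videoUUID); the onStop wiring is an assumption:

// Buffer every chunk locally instead of uploading immediately...
const chunks = [];

const onChunk = (binaryData) => {
  chunks.push(binaryData);
};

// ...and send one Blob once recording has stopped.
const onStop = () => {
  const fullRecording = new Blob(chunks, { type: 'video/webm' });
  const formData = new FormData();
  formData.append('data', fullRecording, `${videoUUID}.webm`);
  new CustomerVideoApi().uploadRecording(
    videoUUID,
    formData,
    (res) => console.log('uploaded', res),
    (err) => console.error('upload failed', err)
  );
};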
Longer version:
For starters, you aren't actually sending the uuid with the request, even though the server-side code relies on it:
const uploadRecording = (uuid, data, fn, fnErr) => {
  axios
    .post(endpoint + "/stream/upload", data, {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    })
    .then(function (response) {
      fn(response);
    })
    .catch(function (error) {
      fnErr(error.response);
    });
};
Next, you can't trust how chunking will work; I think you verified this behavior with the out-of-order results in your chunk logging. You need to assume your server will get chunks out of order and handle them correctly.
Each chunk you get on the server needs to be put in the right place; you can't just "writeStream", you need to write to the explicit binary block. Specifically, on every request specify the byte range, per the Google docs:
curl -i -X PUT --data-binary @CHUNK_LOCATION \
  -H "Content-Length: CHUNK_SIZE" \
  -H "Content-Range: bytes CHUNK_FIRST_BYTE-CHUNK_LAST_BYTE/TOTAL_OBJECT_SIZE" \
  "SESSION_URI"
CHUNK_LOCATION is the local path to the chunk that you're currently uploading.
CHUNK_SIZE is the number of bytes you're uploading in the current request, for example 524288.
CHUNK_FIRST_BYTE is the starting byte in the overall object that the chunk you're uploading contains.
CHUNK_LAST_BYTE is the ending byte in the overall object that the chunk you're uploading contains.
TOTAL_OBJECT_SIZE is the total size of the object you are uploading.
SESSION_URI is the value returned in the Location header when you initiated the resumable upload.
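Expressed with fetch instead of curl, one such chunk PUT might look roughly like this (sessionUri, chunk as a Blob, firstByte, and totalSize are assumed to come from your own bookkeeping; fetch fills in Content-Length itself):

// PUT one chunk of a resumable upload to the session URI with an explicit byte range.
async function putChunk(sessionUri, chunk, firstByte, totalSize) {
  const lastByte = firstByte + chunk.size - 1;
  const res = await fetch(sessionUri, {
    method: 'PUT',
    headers: {
      'Content-Range': `bytes ${firstByte}-${lastByte}/${totalSize}`,
    },
    body: chunk,
  });
  // 308 means the upload is incomplete and more chunks are expected; 200/201 means done.
  return res.status;
}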
Try to eliminate as many variables as possible and pinpoint where exactly the file is getting corrupted.
Since you are using a React (JS) -> Laravel (PHP) -> Google Cloud path, the first thing I would suggest is to test each step separately:
React -> Laravel: save the file on your server and check if it's corrupted at this point.
Laravel -> Google Cloud: load a file from the server filesystem, upload it to the cloud, and see if it gets corrupted.
I don't have experience with Google cloud, but I did something very similar with AWS and found that their video uploading service was extremely picky about the requests (including order of headers that were sent).
Try to compare the specs on the service you are using with your input, make the smallest possible thing that works and start adding variables until you get to the final state.
Also, I don't see any kind of data ordering in your code.
If your chunks are sent close to each other, which is highly likely with streaming, then there is a chance that they will arrive in a different order than they were originally sent. If you just append them to a file without any control of the ordering, then the file will indeed get corrupted. I'm not sure whether, for webm, that would break just parts of the video or kill the entire thing.
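One way to add that ordering, sketched with the names from the question (the index field and its server-side handling are assumptions):

// Tag every uploaded chunk with a monotonically increasing sequence number so the
// server can reassemble chunks in order instead of appending them in arrival order.
let chunkIndex = 0;

const onChunk = (binaryData) => {
  const formData = new FormData();
  formData.append('data', binaryData);
  formData.append('index', String(chunkIndex++)); // ordering key for the server
  new CustomerVideoApi().uploadRecording(videoUUID, formData, (res) => {}, (err) => {});
};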
I’m using next.js to build static HTML webpages.
One of my webpages needs data from a third-party API, which I’d like to fetch at build time and bake into the resulting HTML.
I don’t want this call to ever happen on the client, because:
CORS prevents the request from succeeding anyway
I would have to expose an API key on the client (no thank you)
I thought getInitialProps was the answer, because the fetched data is indeed baked in during the build/export process, but when I navigate away from the page and return from it, getInitialProps gets triggered on the client, breaking everything.
My current code in getInitialProps is something like:
static async getInitialProps(){
  // Get Behance posts
  const behanceEndpoint = `https://www.behance.net/v2/users/${process.env.BEHANCE_USERNAME}/projects?api_key=${process.env.BEHANCE_API_KEY}`
  const behanceRes = await fetch(behanceEndpoint)
  let behancePosts = await behanceRes.json()

  // Return only the required number of posts
  return {
    behancePosts: behancePosts
  }
}
Any advice or best practice on how to handle this? I know Gatsby.js does it out of the box.
One possibility, if you just want to execute this once on the server, would be to check whether the req parameter is present in getInitialProps (see the documentation):
req - HTTP request object (server only).
One dirty approach:
static async getInitialProps({ req }){
  if (req) {
    // only executed on server
    // Get Behance posts
    const behanceEndpoint = `https://www.behance.net/v2/users/${process.env.BEHANCE_USERNAME}/projects?api_key=${process.env.BEHANCE_API_KEY}`
    const behanceRes = await fetch(behanceEndpoint)
    let behancePosts = await behanceRes.json()

    // Return only the required number of posts
    return {
      behancePosts: behancePosts
    }
  } else {
    // client context
  }
}
Hope this helps a little bit.
I am trying to save a variable's data into a text file and update the file every time the variable changes. I found solutions in Node.js and vanilla JavaScript, but I cannot find a particular solution in React.js.
Actually, I am trying to store a Facebook long-lived access token in a text file and would like to use it in the future, but when I try importing 'fs' and using the createFile and appendFile methods, I get an error saying the method doesn't exist.
Please help me out. Here is the code below:
window.FB.getLoginStatus((resp) => {
  if (resp.status === 'connected') {
    const accessToken = resp.authResponse.accessToken;
    try {
      axios.get(`https://graph.facebook.com/oauth/access_token?client_id=CLIENT_id&client_secret=CLIENT_SECRET&grant_type=fb_exchange_token&fb_exchange_token=${accessToken}`)
        .then((response) => {
          console.log("Long Live Access Token " + response.data.access_token + " expires in " + response.data.expires_in);
          let longLiveAccessToken = response.data.access_token;
          let expiresIn = response.data.expires_in;
        })
        .catch((error) => {
          console.log(error);
        });
    }
    catch (e) {
      console.log(e.description);
    }
  }
});
React is a frontend library. It's supposed to be executed in the browser, which for security reasons does not have access to the file system. You can make React render on the server, but the example code you're showing is clearly frontend code: it uses the window object. It doesn't even include anything React-related at first sight; it mainly consists of an Ajax call to Facebook made via the Axios library.
So your remaining options are basically these:
Create a text file and let the user download it.
Save the file content in local storage for later access from the same browser (a minimal sketch of this follows the list).
Save the contents in online storage (which could also be localhost).
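For option 2, a minimal sketch (reusing the longLiveAccessToken and expiresIn variables from the code in the question):

// Persist the long-lived token in the browser's localStorage.
// Note: localStorage is readable by any script running on the page, so only
// keep data here that you would also be comfortable storing in a cookie.
localStorage.setItem('fbLongLivedAccessToken', longLiveAccessToken);
localStorage.setItem('fbTokenExpiresIn', String(expiresIn));

// Later, e.g. on page load:
const savedToken = localStorage.getItem('fbLongLivedAccessToken');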
Could you specify whether any of these methods would fit your needs, so I can explain it further with sample code if needed?