How to send state data to an API that expects a gzipped JSON file as payload (e.g. sample.json.gz) in ReactJS?

I have constructed JSON data:
const sample = [];
if (this.state.data) {
  console.log(this.state.data);
  this.state.data.forEach((number) => {
    console.log(number);
    sample.push({
      Number: number,
      Attributes: {
        subAttribut: {
          Name: 'Name_ok',
          GroupName: 'GroupName_ok',
        },
      },
    });
  });
}
I need to store this JSON data in a JSON file, say sample.json,
then I need to gzip it, say sample.json.gz,
then I need to pass this file as the payload to an API from ReactJS.
Or is there any other way to send this sample data as a sample.json.gz file payload to the API?

If you have access to any kind of backend serving this JavaScript client, the best solution would be to proxy your JSON through your own backend, gzip it there, and send it to the final destination.
If you must do it in the browser, you could try the Compression Streams API, but browser compatibility still seems quite narrow.
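For illustration, here is a minimal sketch of that browser-side approach, assuming CompressionStream is available; the endpoint and form field name are placeholders:
async function sendGzippedJson(data, endpoint) {
  // Serialize the state data to a JSON blob
  const jsonBlob = new Blob([JSON.stringify(data)], { type: 'application/json' });
  // Pipe the JSON bytes through a gzip compressor
  const gzipped = jsonBlob.stream().pipeThrough(new CompressionStream('gzip'));
  const gzippedBlob = await new Response(gzipped).blob();
  // Attach as a file-like payload; adjust the field name to whatever the API expects
  const formData = new FormData();
  formData.append('file', gzippedBlob, 'sample.json.gz');
  return fetch(endpoint, { method: 'POST', body: formData });
}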
Edit: @ant also pointed out this library: https://www.npmjs.com/package/wasm-gzip

Related

How to read the contents of a JSON file returned from S3 storage as a blob in React

I have successfully retrieved a JSON file from S3 storage. It is returned as a blob. I am able to turn the blob into text with this code (taken from https://docs.amplify.aws/lib/storage/download/q/platform/js/#monitor-progress-of-download):
export async function getS3Item(filename) {
  const result = await Storage.get(filename, { download: true });
  result.Body.text().then((string) => {
    // handle the string data
    console.log(string);
  });
}
but the text is all gibberish (I'm assuming because the object is binary?), such as: "h�b```f�d`a}��ǀ|#1V ..."
Is there a way I can directly read this as a JSON object in JavaScript so that I can extract data from it?
Optionally, I can download the JSON file (which is shown in the link above, and I've gotten this to work), but I'd prefer not to download it, just to extract legible data from the file.
Thanks so much (I'm quite unfamiliar with blobs).
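One possible direction, sketched under the assumption that the stored object is gzip-compressed (unreadable bytes like these often indicate compression) and that the browser supports DecompressionStream:
export async function getS3Json(filename) {
  const result = await Storage.get(filename, { download: true });
  // result.Body is a Blob; decompress it before reading it as text
  const stream = result.Body.stream().pipeThrough(new DecompressionStream('gzip'));
  const text = await new Response(stream).text();
  return JSON.parse(text);
}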

How to convert a data array object to a JSON file and use this file to send requests to the S3 server in ReactJS?

EX: const jsonData = [{ age: 12, name: 'Someone' }];
And I want to convert jsonData to a JSON file and use this file to send a request to the S3 server.
I don't think you can do something like that; writing to a file from React is not quite straightforward, refer: this
Refer to the Node solution here
Well, what do you mean by 'use this file to send request'? Do you want to store the created JSON file in S3? Then please do it in the backend.
Ok... the S3 thing is a bit complicated for such a short question, but in general you can do something like this...
const fs = require('fs');

// yourJSON should be a string here; use JSON.stringify(yourJSON) if it's an object
fs.writeFile("../../pathToFile.json", yourJSON, 'utf8', (err) => {
  if (err) {
    console.log("Ahh mother****");
    return console.log(err);
  }
  console.log("Now I have to send it to S3");
});
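If the goal is just to hand S3 a JSON file from the browser, a sketch that avoids fs entirely is to wrap the data in a File object (the filename here is an assumption):
const jsonData = [{ age: 12, name: 'Someone' }];
// Build an in-memory JSON file; no filesystem access needed in the browser
const jsonFile = new File([JSON.stringify(jsonData)], 'data.json', {
  type: 'application/json',
});
// jsonFile can then be passed to whatever call uploads to S3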

Corrupt video uploads when chunking MediaRecorder to Google Cloud platform

I am currently using a React-hook-powered component to record my screen and subsequently upload it to Google Cloud Storage. However, when it finishes, the file created inside Google Cloud appears to be corrupt.
This is the gist of the code within my React component, where useMediaRecorder is from https://github.com/wmik/use-media-recorder:
let {
  error,
  status,
  mediaBlob,
  stopRecording,
  getMediaStream,
  startRecording,
  liveStream,
} = useMediaRecorder({
  onCancelScreenShare: () => {
    stopRecording();
  },
  onDataAvailable: (chunk) => {
    // do the uploading here:
    onChunk(chunk);
  },
  recordScreen: true,
  blobOptions: { type: "video/webm;codecs=vp8,opus" },
  mediaStreamConstraints: { audio: audioEnabled, video: true },
});
As data becomes available through this hook, it calls onChunk(chunk), passing a binary Blob to that method. To perform the upload, I tie in with this section of code:
const onChunk = (binaryData) => {
  var formData = new FormData();
  formData.append("data", binaryData);
  let customerApi = new CustomerVideoApi();
  customerApi.uploadRecording(
    videoUUID,
    formData,
    (res) => {},
    (err) => {}
  );
};
customerApi.uploadRecording looks like this (using axios).
const uploadRecording = (uuid, data, fn, fnErr) => {
  axios
    .post(endpoint + "/stream/upload", data, {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    })
    .then(function (response) {
      fn(response);
    })
    .catch(function (error) {
      fnErr(error.response);
    });
};
The HTTP request succeeds, and all is well with the world. The server-side upload code is based on Laravel:
// this is inside the controller.
public function index( Request $request )
{
    // Set file attributes.
    $filepath = '/public/chunks/';
    $file = $request->file('data');
    $filename = $uuid . ".webm";
    // stream upload
    File::streamUpload($filepath, $filename, $file, true);
    return response()->json(['uploaded' => true, 'uuid' => $uuid]);
}
// there's a service provider used to create a new macro on the File:: facade, providing the facility for appropriately handling the stream:
public function boot()
{
    File::macro('streamUpload', function ($path, $fileName, $file, $overWrite = true) {
        $resource = fopen($file->getRealPath(), 'r+');
        $storageClient = new StorageClient([
            'projectId' => 'myprjectid',
            'keyFilePath' => '/my/path/to/servicejson.json',
        ]);
        $bucket = $storageClient->bucket('mybucket');
        $adapter = new GoogleStorageAdapter($storageClient, $bucket);
        $filesystem = new Filesystem($adapter);
        return $overWrite
            ? $filesystem->putStream($fileName, $resource)
            : $filesystem->writeStream($fileName, $resource);
    });
}
So to reiterate:
1) React app chunks out blobs,
2) server side determines if it should create or append in Google Cloud Storage,
3) server side succeeds,
4) video inside Google Cloud platform is corrupted.
However, the video file inside the Google Cloud container is corrupted and won't play. I'm unsure exactly why, but my guesses so far:
Some sort of dodgy MIME-type problem. Different browsers seem to handle the codec/filetype from the MediaRecorder differently: e.g. Chrome seems to be x-matroska (.mkv?), Firefox different again. Ideally I would have a .webm container; notice how I set the file name server side, and it isn't coming from the client. Should it? I'm unsure how to force the MediaRecorder to use a specific mimeType. I thought the blobOptions option should do it, but changing the extension and MIME type seems to have little to no impact on the corruption occurring.
Some sort of problem during upload where an HTTP request doesn't execute and finish in order - e.g.
1 onDataAvailable completes second
2 onDataAvailable completes first
3 onDataAvailable completes third
I've sort of ruled this out because I think the chunks should be small enough.
Some sort of problem with Google Cloud Storage APIs that I'm using, perhaps in the wrong way? Does the cloud platform support streaming, and does this library send the correct params to do so?
Some sort of problem with how I'm uploading - should the axios headers be multipart formdata, or something else?
This is the package I'm using for the server side: https://github.com/Superbalist/flysystem-google-cloud-storage
Can anyone shed any light on how to achieve this goal of streaming up into Google Cloud without the video from the MediaRecorder being corrupted? Hopefully there's enough detail here in the question to help figure it out. The problem, as illustrated, isn't getting the file as far as Google Cloud, but rather the resulting file being unplayable as video.
Update
I've ordered my chunks client side now and queued them properly before letting them reach the server. No difference to the output. As some have suggested, a single-blob upload request works fine.
I tried using the resumable config param (from reading the source code, it seems like chunks need to be a certain size before Google recognises them as a resumable upload):
$filesystem = new Filesystem($adapter, [
    'resumable' => true
]);
I'm not sure how https://cloud.google.com/storage/docs/performing-resumable-uploads is implemented within the libraries I'm using (or within the Google Cloud APIs themselves, if at all). Do I need to implement that myself? Documentation is light on Google's part.
Short version:
The first thing you should do is buffer the whole video locally and send a single payload to the server and on to Google Cloud Storage. This will validate that your code is actually correct for a small video. Once you can verify this, you can move on to handling multi-chunk uploads.
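As a hedged sketch of that single-payload approach, reusing the names from the question (onStopped is a hypothetical callback for when recording ends):
const chunks = [];
const onChunk = (binaryData) => {
  // buffer locally instead of uploading each chunk
  chunks.push(binaryData);
};
const onStopped = () => {
  // stitch the buffered chunks into one complete webm blob and upload once
  const fullVideo = new Blob(chunks, { type: "video/webm" });
  const formData = new FormData();
  formData.append("data", fullVideo);
  new CustomerVideoApi().uploadRecording(videoUUID, formData, (res) => {}, (err) => {});
};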
Longer version:
For starters, you aren't passing the uuid in the request, yet it's being used server side:
const uploadRecording = (uuid, data, fn, fnErr) => {
  axios
    .post(endpoint + "/stream/upload", data, {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    })
    .then(function (response) {
      fn(response);
    })
    .catch(function (error) {
      fnErr(error.response);
    });
};
Next, you can't trust how chunking will work; I think you verified this behavior with the out-of-order chunk logging. You need to assume your server will get chunks out of order and handle them correctly.
Each chunk you get on the server needs to be put in the right place; you can't just "writeStream", you need to write to an explicit binary block. Specifically, on every request, specify the byte range (from the Google docs):
curl -i -X PUT --data-binary @CHUNK_LOCATION \
  -H "Content-Length: CHUNK_SIZE" \
  -H "Content-Range: bytes CHUNK_FIRST_BYTE-CHUNK_LAST_BYTE/TOTAL_OBJECT_SIZE" \
  "SESSION_URI"
CHUNK_LOCATION is the local path to the chunk that you're currently uploading.
CHUNK_SIZE is the number of bytes you're uploading in the current request. For example, 524288.
CHUNK_FIRST_BYTE is the starting byte in the overall object that the chunk you're uploading contains.
CHUNK_LAST_BYTE is the ending byte in the overall object that the chunk you're uploading contains.
TOTAL_OBJECT_SIZE is the total size of the object you are uploading.
SESSION_URI is the value returned in the Location header when you initiated the resumable upload.
Try to eliminate as many variables as possible and pinpoint where exactly the file is getting corrupted.
Since you are using a React (JS) -> Laravel (PHP) -> Google Cloud path, the first thing I would suggest is to test each step separately:
React -> Laravel: save the file on your server and check if it's corrupted at this point.
Laravel -> Google Cloud: load a file from the server filesystem, upload it to the cloud, and see if it gets corrupted.
I don't have experience with Google Cloud, but I did something very similar with AWS and found that their video uploading service was extremely picky about the requests (including the order of headers that were sent).
Try to compare the specs of the service you are using with your input; make the smallest possible thing that works and start adding variables until you get to the final state.
Also, I don't see any kind of data ordering in your code.
If your chunks are sent close together, which is highly likely when streaming, there is a chance they will arrive in a different order than originally sent. If you just append them to a file without any control over the ordering, the file will indeed get corrupted. I'm not sure whether, for webm, that would break just parts of the video or kill the entire thing.
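One hedged sketch of adding such ordering client side, reusing the question's names (the sequence field and the server-side sorting on it are assumptions):
let sequence = 0;
const onChunk = (binaryData) => {
  const formData = new FormData();
  formData.append("data", binaryData);
  // tag each chunk so the server can reassemble in the original order
  formData.append("sequence", String(sequence++));
  new CustomerVideoApi().uploadRecording(videoUUID, formData, (res) => {}, (err) => {});
};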

POST request via request.js to send an array of JSON objects and image files

I'm building a REST API with Node.js, using Express routers and the Multer middleware to handle multiple body fields and files.
My endpoint route 127.0.0.1/api/postData expects JSON data with fields, one of which is an array of JSON objects (I have a nested mongoose schema), plus 2 named images (png/jpg).
I need to send a POST request via cURL with the following 5-object data structure:
name String
description String
usersArray Array of JSON objects like: [{"id": "123"}, {"id": "456"}]
imgIcon Png/Image providing /path/to/imageIcon.png
imgHeader Png/Image providing /path/to/imageHeader.png
Any idea how to write this request with the help of the request.js Node HTTP request library?
Try the following:
request.post({
  url: 'http://127.0.0.1:7777/api/postData',
  formData: formData,
  qsStringifyOptions: {
    arrayFormat: 'brackets' // [indices(default)|brackets|repeat]
  }
}, function (err, httpResponse, body) {
  // do something...
});
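For completeness, a sketch of how that formData might be assembled from the five fields in the question (values and paths are placeholders; verify how your Multer setup parses the bracket-serialized array):
const request = require('request');
const fs = require('fs');

const formData = {
  name: 'John',
  description: 'Some description',
  // array of JSON objects, serialized according to qsStringifyOptions.arrayFormat
  usersArray: [{ id: '123' }, { id: '456' }],
  // image fields are attached as file streams
  imgIcon: fs.createReadStream('/path/to/imageIcon.png'),
  imgHeader: fs.createReadStream('/path/to/imageHeader.png'),
};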
I found three options for arrayFormat in https://www.npmjs.com/package/qs (used by https://www.npmjs.com/package/request):
'indices' sends in postbody: (this is the default case)
usersArray%5B0%5D%5Bid%5D=a667cc8f&usersArray%5B1%5D%5Bid%5D=7c7960fb
decoded:
usersArray[0][id]=a667cc8f&usersArray[1][id]=7c7960fb
'brackets' sends in postbody:
usersArray%5B%5D%5Bid%5D=a667cc8f&usersArray%5B%5D%5Bid%5D=7c7960fb
decoded:
usersArray[][id]=a667cc8f&usersArray[][id]=7c7960fb
'repeat' sends in postbody:
usersArray%5Bid%5D=a667cc8f&usersArray%5Bid%5D=7c7960fb
decoded:
usersArray[id]=a667cc8f&usersArray[id]=7c7960fb
These are three different ways to serialize arrays before posting. Basically, it depends on the receiving end how these need to or can be formatted. In my case it helped to use 'brackets'.

Send array of objects to RESTful API

How can I send an array (a set of similar items) in a POST request to a RESTful API built with CodeIgniter?
I can send singular items like this:
name=John&email=demo@demo.com
But how can I send items like this and use them on the server side like an array?
keywords=blue,sky,weather,car,vacation
I would think you could send a JSON array, depending on the capabilities of the backend. If the RESTful API consumes this content type, you can then send an array by simply POSTing to the RESTful path, using the JSON array as the payload.
One way to send the payload would be with jQuery / AJAX:
$.ajax({
  url: "",
  type: "POST",
  contentType: "application/json",
  data: JSON.stringify(jsonArray)
});
Is it possible to send it in JSON?
{"keywords" : ["blue","sky","weather","car","vacation"]}
