Uploading and reading images to a database in base64 format in React - reactjs

I have an assignment to do and I'm new to React. There are a lot of resources about uploading pictures on the internet, but they all do it differently and none of them match what I need. I have a product-upload component written as a functional component. When creating a product, the user should be able to select an image from their computer and upload it with the product, but in base64 format. I also need to read the products' images back from the database, since I fetch the image along with the products, and the user should be able to cancel the image they selected while creating the product. Can you show a sample React component that covers these points? It's really important. I'm new to React and don't know much about it.
To summarize briefly:
1. The user selects an image from their computer, and when they select it, it appears on the screen at a certain size.
2. If the user clicks the upload button, the image is uploaded to the database in base64 format.
3. If the user presses a cancel button (for example, next to the picture), the selected picture is discarded.
4. Lastly, how can I read the picture data back from the database and show the picture on the screen?

I will try to briefly summarize what you need to do and provide a minimal working example.
You need to understand the following before continuing:
- what a client and a server are
- HTTP requests (GET, POST, etc.)
- SQL databases
You need a server. The server will receive the base64 image, store it in the database, and provide a way to request the image back.
For a basic example you can use Python's built-in web server, with sqlite3 to manage the database.
from http.server import BaseHTTPRequestHandler, HTTPServer
from sqlite3 import connect

if __name__ == "__main__":
    db = connect('database.db')
    cursor = db.cursor()
    cursor.execute(
        'CREATE TABLE IF NOT EXISTS images (id INTEGER PRIMARY KEY, base64 TEXT)')

    class ServerHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Return the first stored image as a plain-text base64 data URL.
            cursor.execute("SELECT base64 FROM images LIMIT 1")
            row = cursor.fetchone()
            image = row[0].encode() if row else b''
            self.send_response(200)
            self.send_header('Content-type', 'text/plain')
            self.send_header('Content-Length', str(len(image)))
            # Allow the React dev server (a different origin) to read the response.
            self.send_header('Access-Control-Allow-Origin', '*')
            self.end_headers()
            self.wfile.write(image)

        def do_POST(self):
            # Read the raw request body (the base64 data URL) and store it.
            content_length = int(self.headers['Content-Length'])
            post_body = self.rfile.read(content_length).decode()
            cursor.execute(
                "INSERT INTO images (base64) VALUES (?)", (post_body,))
            db.commit()
            self.send_response(200)
            self.send_header('Access-Control-Allow-Origin', '*')
            self.end_headers()

    webServer = HTTPServer(("localhost", 8080), ServerHandler)
    print("Server started")
    try:
        webServer.serve_forever()
    except KeyboardInterrupt:
        pass
    webServer.server_close()
    db.close()
    print("Server stopped.")
For the client side, i.e. your React app, take a look at the following component. It has an HTML input element that accepts images.
On change, the selected image is converted to base64, and fetch is then used to send the data to the POST handler of our Python server.
import React from "react";

export default function ImageUpload() {
  function convertBase64(file) {
    return new Promise((resolve, reject) => {
      const fileReader = new FileReader();
      fileReader.readAsDataURL(file);
      fileReader.onload = () => {
        resolve(fileReader.result);
      };
      fileReader.onerror = (error) => {
        reject(error);
      };
    });
  }

  async function uploadImage(event) {
    const file = event.target.files[0];
    const base64 = await convertBase64(file);
    fetch("http://localhost:8080/", {
      method: "POST",
      headers: { "Content-Type": "text/plain" },
      body: base64,
    });
  }

  return (
    <input
      type="file"
      id="img"
      name="img"
      accept="image/*"
      onChange={uploadImage}
    />
  );
}
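The question also asks for a preview and a cancel button before anything is uploaded. Here is a minimal sketch of that flow, building on the component above and assuming the same http://localhost:8080/ endpoint; the sizing and button labels are placeholders:
import React, { useState } from "react";

// Sketch of the preview / cancel flow: the file is only converted and
// uploaded when "Upload" is clicked, and "Cancel" clears the selection.
export default function ImageUploadWithPreview() {
  const [file, setFile] = useState(null);
  const [preview, setPreview] = useState(null);

  function handleSelect(event) {
    const selected = event.target.files[0];
    if (!selected) return;
    setFile(selected);
    // An object URL is enough for a local preview; no base64 needed yet.
    setPreview(URL.createObjectURL(selected));
  }

  function handleCancel() {
    if (preview) URL.revokeObjectURL(preview);
    setFile(null);
    setPreview(null);
  }

  async function handleUpload() {
    if (!file) return;
    const base64 = await new Promise((resolve, reject) => {
      const reader = new FileReader();
      reader.onload = () => resolve(reader.result);
      reader.onerror = reject;
      reader.readAsDataURL(file);
    });
    await fetch("http://localhost:8080/", {
      method: "POST",
      headers: { "Content-Type": "text/plain" },
      body: base64,
    });
  }

  return (
    <div>
      <input type="file" accept="image/*" onChange={handleSelect} />
      {preview && (
        <div>
          <img src={preview} alt="preview" width={200} />
          <button onClick={handleUpload}>Upload</button>
          <button onClick={handleCancel}>Cancel</button>
        </div>
      )}
    </div>
  );
}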
Now you have the basic idea, but there is still a lot of work to do.
Make the Python server better by sending correct responses, validating that the input really is base64, and so on. Come up with a good SQL table to store your images. Make sure the image you want is the one being returned. For the frontend, make the component pretty, write another component that displays the image from the database (a sketch follows below), catch errors, and lots more.
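As a starting point for that display component, here is a minimal sketch that fetches the stored base64 data URL back from the example server above and renders it; the endpoint and the fixed width are assumptions:
import React, { useEffect, useState } from "react";

// Minimal sketch: fetches the stored base64 data URL from the example
// server (http://localhost:8080/) and shows it in an <img> tag.
export default function ImageDisplay() {
  const [src, setSrc] = useState(null);

  useEffect(() => {
    fetch("http://localhost:8080/")
      .then((response) => response.text())
      .then((dataUrl) => setSrc(dataUrl))
      .catch((error) => console.error(error));
  }, []);

  if (!src) return <p>Loading image...</p>;
  // The stored value is a data URL (data:image/...;base64,...),
  // so it can be used directly as the img src.
  return <img src={src} alt="stored" width={200} />;
}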

Related

How to send a local image instead of URL to Computer Vision API using React

I would like to upload a local image file and extract text from it. I followed the link below and it works as expected when I pass a URL: https://learn.microsoft.com/en-us/azure/developer/javascript/tutorial/static-web-app/add-computer-vision-react-app
I managed to handle a local image and get the base64-encoded data URL of the uploaded file. But when I pass the base64 data URL to the Computer Vision API, it says "Input data is not a valid image" (POST, 400 status code). I am getting the error on the line shown below:
const analysis = await computerVisionClient.analyzeImage(urlToAnalyze, { visualFeatures });
The code I have included for handling local image:
const handleChange = (e) => {
  var file = e.target.files[0];
  var reader = new FileReader();
  reader.onloadend = function () {
    setFileSelected(reader.result); // this is the base64 encoded data URL
  };
  reader.readAsDataURL(file);
};
In the computerVision.js file, I have changed the 'contentType' in the header as below.
const computerVisionClient = new ComputerVisionClient(
  new ApiKeyCredentials({ inHeader: { 'Ocp-Apim-Subscription-Key': key, 'Content-Type': 'application/octet-stream' } }),
  endpoint
);
I tried replacing client.read() with readTextInStream() as per the docs in computerVision.js (please refer to the link above), but it still throws the error.
May I know why I get the error "Input data is not a valid image"? Thanks.
Here is the link for input requirements.
There is a brand new online portal provided by Microsoft: https://preview.vision.azure.com/demo/OCR
The advantage is that it directly lists your available resources, so you just have to pick the right one and then test; there are also some samples.
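If you still want to analyze the local file from code rather than through the portal, note that the SDK's *InStream methods accept raw binary rather than a base64 data URL. A minimal sketch, assuming the @azure/cognitiveservices-computervision client (computerVisionClient) and the visualFeatures variable from the question:
// Hedged sketch: pass the file's raw bytes to analyzeImageInStream
// instead of passing a base64 data URL to analyzeImage.
const handleChange = async (e) => {
  const file = e.target.files[0];
  const arrayBuffer = await file.arrayBuffer(); // raw binary, not base64
  const analysis = await computerVisionClient.analyzeImageInStream(
    arrayBuffer,
    { visualFeatures }
  );
  console.log(analysis);
};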

Corrupt video uploads when chunking MediaRecorder to Google Cloud platform

I am currently using a React-hook-powered component to record my screen and subsequently upload the recording to Google Cloud Storage. However, when it finishes, the file created inside Google Cloud Storage appears to be corrupt.
This is the gist of the code within my React component, where useMediaRecorder is from https://github.com/wmik/use-media-recorder:
let {
  error,
  status,
  mediaBlob,
  stopRecording,
  getMediaStream,
  startRecording,
  liveStream,
} = useMediaRecorder({
  onCancelScreenShare: () => {
    stopRecording();
  },
  onDataAvailable: (chunk) => {
    // do the uploading here:
    onChunk(chunk);
  },
  recordScreen: true,
  blobOptions: { type: "video/webm;codecs=vp8,opus" },
  mediaStreamConstraints: { audio: audioEnabled, video: true },
});
As data becomes available through this hook, it calls onChunk(chunk), passing a binary Blob to that method. To perform the upload, I tie in with this section of code:
const onChunk = (binaryData) => {
  var formData = new FormData();
  formData.append("data", binaryData);
  let customerApi = new CustomerVideoApi();
  customerApi.uploadRecording(
    videoUUID,
    formData,
    (res) => {},
    (err) => {}
  );
};
customerApi.uploadRecording looks like this (using axios).
const uploadRecording = (uuid, data, fn, fnErr) => {
  axios
    .post(endpoint + "/stream/upload", data, {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    })
    .then(function (response) {
      fn(response);
    })
    .catch(function (error) {
      fnErr(error.response);
    });
};
The HTTP request succeeds, and all is well with the world. The server-side upload code is based on Laravel:
// this is inside the controller.
public function index( Request $request )
{
    // Set file attributes.
    $filepath = '/public/chunks/';
    $file = $request->file('data');
    $filename = $uuid . ".webm";

    // stream upload
    File::streamUpload($filepath, $filename, $file, true);

    return response()->json(['uploaded' => true, 'uuid' => $uuid]);
}

// There's a service provider used to create a new macro on the File:: facade,
// providing the facility for handling the stream appropriately:
public function boot()
{
    File::macro('streamUpload', function($path, $fileName, $file, $overWrite = true) {
        $resource = fopen($file->getRealPath(), 'r+');

        $storageClient = new StorageClient([
            'projectId' => 'myprjectid',
            'keyFilePath' => '/my/path/to/servicejson.json',
        ]);

        $bucket = $storageClient->bucket('mybucket');
        $adapter = new GoogleStorageAdapter($storageClient, $bucket);
        $filesystem = new Filesystem($adapter);

        return $overWrite
            ? $filesystem->putStream($fileName, $resource)
            : $filesystem->writeStream($fileName, $resource);
    });
}
So to reiterate:
1) The React app chunks out blobs.
2) The server side determines whether it should create or append in Google Cloud Storage.
3) The server side succeeds.
4) The video inside Google Cloud Storage is corrupted.
However, the video file inside the Google Cloud bucket is corrupted and won't play. I'm unsure exactly why, but my guesses so far:
Some sort of dodgy MIME-type problem. Different browsers seem to handle the codec/file type from the MediaRecorder differently: Chrome seems to produce x-matroska (.mkv?), Firefox something different again. Ideally I would have a .webm container. Notice how I set the file name server side and it isn't coming from the client; should it? I'm unsure how to force the MediaRecorder to use a specific mimeType. I thought the blobOptions option should do it, but changing the extension and MIME type seems to have little to no impact on the corruption.
Some sort of problem during upload where the HTTP requests don't execute and finish in order, e.g.
1) onDataAvailable completes second
2) onDataAvailable completes first
3) onDataAvailable completes third
I've sort of ruled this out because I think the chunks should be small enough.
Some sort of problem with the Google Cloud Storage APIs I'm using, perhaps in the wrong way? Does the cloud platform support streaming, and does this library send the correct params to do so?
Some sort of problem with how I'm uploading. Should the axios headers be multipart form data, or something else?
This is the package I'm using for the server side: https://github.com/Superbalist/flysystem-google-cloud-storage
Can anyone shed any light on how to achieve this goal of streaming up into Google Cloud without the video from the MediaRecorder being corrupted? Hopefully there's enough detail here in the question to help figure it out. The problem as illustrated isn't getting the file as far as Google Cloud, but rather that the resulting file is unplayable in any video format.
Update
I've ordered my chunks client side now and queued them properly before letting them reach the server. No difference to the output. As some have suggested, a single-blob upload request works fine.
I tried using the resumable config param (from reading the source code it seems like chunks need to be a certain size before Google recognises them as a resumable upload):
$filesystem = new Filesystem($adapter, [
'resumable'=>true
]);
I'm not sure how https://cloud.google.com/storage/docs/performing-resumable-uploads is implemented within the libraries I'm using (or within the Google Cloud APIs themselves, if at all). Do I need to implement that myself? Documentation is light on Google's part.
Short version:
The first thing you should do is buffer the whole video locally and send a single payload to the server and on to Google Cloud Storage (see the sketch below). This will validate that your code is actually correct for a small video. Once you can verify this, you can move on to handling multi-chunk uploads.
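A minimal sketch of that single-payload approach, reusing onChunk, CustomerVideoApi, uploadRecording and videoUUID from the question (the onRecordingStopped hook-up is assumed):
// Buffer every chunk locally; only upload once recording has stopped.
const chunks = [];

const onChunk = (binaryData) => {
  chunks.push(binaryData); // no request yet, just collect
};

const onRecordingStopped = () => {
  // One Blob, one request: the server writes a single complete file.
  const fullVideo = new Blob(chunks, { type: "video/webm" });
  const formData = new FormData();
  formData.append("data", fullVideo);
  let customerApi = new CustomerVideoApi();
  customerApi.uploadRecording(videoUUID, formData, (res) => {}, (err) => {});
};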
Longer version:
For starters, you aren't passing the uuid along with the request; it is received by the function but never used:
const uploadRecording = (uuid, data, fn, fnErr) => {
  axios
    .post(endpoint + "/stream/upload", data, {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    })
    .then(function (response) {
      fn(response);
    })
    .catch(function (error) {
      fnErr(error.response);
    });
};
Next, you can't trust how chunking will work; I think you verified this behaviour with the out-of-order chunk logging. You need to assume that your server will receive chunks out of order and handle them correctly.
Each chunk you get on the server needs to be put in the right place; you can't just "writeStream", you need to write to the explicit binary block. Specifically, on every request specify the byte range (from the Google docs):
curl -i -X PUT --data-binary @CHUNK_LOCATION \
  -H "Content-Length: CHUNK_SIZE" \
  -H "Content-Range: bytes CHUNK_FIRST_BYTE-CHUNK_LAST_BYTE/TOTAL_OBJECT_SIZE" \
  "SESSION_URI"
Where:
- CHUNK_LOCATION is the local path to the chunk that you're currently uploading.
- CHUNK_SIZE is the number of bytes you're uploading in the current request, for example 524288.
- CHUNK_FIRST_BYTE is the starting byte in the overall object that the chunk you're uploading contains.
- CHUNK_LAST_BYTE is the ending byte in the overall object that the chunk you're uploading contains.
- TOTAL_OBJECT_SIZE is the total size of the object you are uploading.
- SESSION_URI is the value returned in the Location header when you initiated the resumable upload.
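For illustration, the same request expressed with fetch from the client, assuming you already have a sessionUri from initiating the resumable upload and are tracking each chunk's byte offset yourself (the browser fills in Content-Length from the Blob automatically):
// Hedged sketch of one resumable-upload request. sessionUri, chunk (a Blob),
// offset and totalSize are assumed to be tracked by the caller.
async function uploadChunk(sessionUri, chunk, offset, totalSize) {
  const lastByte = offset + chunk.size - 1;
  return fetch(sessionUri, {
    method: "PUT",
    headers: {
      "Content-Range": `bytes ${offset}-${lastByte}/${totalSize}`,
    },
    body: chunk,
  });
}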
Try to eliminate as many variables as possible and pinpoint where exactly the file is getting corrupted.
Since you are using a React (JS) -> Laravel (PHP) -> Google Cloud path,
the first thing I would suggest is to test each step separately:
React -> Laravel: save the file on your server and check if it's corrupted at this point.
Laravel -> Google Cloud: load a file from the server filesystem, upload it to the cloud, and see if it gets corrupted.
I don't have experience with Google Cloud, but I did something very similar with AWS and found that their video uploading service was extremely picky about the requests (including the order of the headers that were sent).
Try to compare the specs of the service you are using with your input; make the smallest possible thing that works and start adding variables until you get to the final state.
Also, I don't see any kind of data ordering in your code.
If your chunks are close to each other (and with streaming that is highly likely), there is a chance that they will arrive in a different order than they were originally sent. If you just append them to a file without any control over the ordering, the file will indeed get corrupted. I'm not sure whether, for webm, that would break just parts of the video or kill the entire thing.
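One simple way to make the ordering explicit, sketched against the onChunk function from the question: tag each chunk with a sequence number so the server can reassemble them in recording order rather than arrival order (the sequence field name is an assumption):
let sequence = 0;

const onChunk = (binaryData) => {
  const formData = new FormData();
  formData.append("data", binaryData);
  formData.append("sequence", String(sequence++)); // ordering key for the server
  let customerApi = new CustomerVideoApi();
  customerApi.uploadRecording(videoUUID, formData, (res) => {}, (err) => {});
};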

Sending a file from Flask using send_file to a React frontend returns a string of weird characters

In my current program, the user selects a file in React, which is sent to Flask like this:
return axios
  .post(`http://localhost:5000/time`, data, {
    headers: {
      'Content-Type': 'multipart/form-data',
    },
  })
  .then(res => {
    this.onReturnProcessed(res);
    return res;
  });
}
and receives data from Flask like this:
onReturnProcessed = res => {
  console.log(res.data);
  this.setState({ img: res.data });
  this.setState({ ImgReturned: true });
};
The Flask backend takes these files, converts them into a NumPy array and then a PIL Image object. It then saves the Image object to the Flask folder like this:
img = Image.fromarray((np.uint8(cm(img)*255)))
img.save("./thenewimg.png")
Flask then sends the file to the React frontend:
return send_file('./thenewimg.png', mimetype='image/png', as_attachment=True)
The problem is that when React renders the file sent by Flask
render() {
  return (
    <div>
      <img src={this.state.img} />
    </div>
  );
}
it is unable to interpret the file and displays nothing except the broken-image icon in the corner. When the file sent by Flask is logged, it outputs many characters like ���}>�{���o��n�_����|��t����Jm~�\Ӳ���. I'm not sure how to change the file on the Python backend so that React can interpret it and display it, or how to change the frontend to display an image made in Python.
This worked
@app.route('/img/<filename>', methods=['GET'])
def give(filename):
    filen = './UPLOADS/' + filename + '.png'
    return send_file(filen)
I believe what is happening here is that you are assigning res.data to your img variable. res.data is not the image itself but a textual representation of it, which is why you are seeing the weird characters instead of the actual image.
An easier way to accomplish what you want might be to send back the URL of the image, not the image itself. This way you can store the image URL in your state and then present the image like so:
console.log(this.state.img); // <-- this now returns 'https://whateveryourapiaddris.com/img/ID'
...
<img src={this.state.img} />
This means that you will need to create a separate Flask route that ingests the image ID you assign to the image and returns the corresponding image using send_file at that endpoint.
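A minimal sketch of that change on the React side, assuming the /img/<filename> route shown above and that the frontend knows (or is told) the filename:
// Store the URL of the Flask route instead of the raw response body.
onReturnProcessed = res => {
  // 'thenewimg' is a placeholder; ideally the backend returns the filename/ID.
  this.setState({ img: 'http://localhost:5000/img/thenewimg' });
  this.setState({ ImgReturned: true });
};
// ...and render it as before:
<img src={this.state.img} />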
Related questions:
How to get the image response from a Flask api server and then display it in a react application
https://www.reddit.com/r/reactjs/comments/bifsrt/display_image_from_flask_send_file_function/

How to send a JSON and an image file to a server?

I am trying to send a JSON file and an image file together to a server, but am really struggling.
1) If I send just the quilt item, skipping the formData and changing the $http part below to $http.post('quilts/create/', quilt), and set the server endpoint to expect (@RequestBody QuiltRequest quiltRequest) without the bits about transformRequest and headers, it processes the data quite happily, but I don't have an image to add to the records.
2) If I don't add the quilt item to the formData and tell the server to expect (@RequestParam("image") MultipartFile image), I can save the image file on my server and generate a URL string for it, but I have no other quilt information with which to make the corresponding database entry.
How can I send both the quilt and the image in one request, and have the server receive and process both?
Many thanks!
Client-side service:
this.create = function (quilt, image) {
    quilt.size = JSON.parse(quilt.size);
    quilt.maker = JSON.parse(quilt.maker);
    const formData = new FormData();
    formData.append('quiltRequest', quilt);
    formData.append('image', image);
    $http.post('quilts/create/', formData, {
        transformRequest: angular.identity,
        headers: {'Content-Type': undefined}
    }).then(function (response) {
        return window.location = '#!/quilts/created/' + response.data;
    })
};
Server-side endpoint:
@PostMapping(path = "/create")
public BigInteger create(@RequestPart QuiltRequest quiltRequest, @RequestPart MultipartFile image) throws IOException {
    // do stuff based on parameters received
}
Apart from that, I think you can try to encode the image to a base64 string, send it to the server, and decode it on the server.
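A minimal sketch of that idea, reusing the $http service and the quilt object from the question; the imageBase64 field name is an assumption, and the server would need to decode it back into bytes:
this.createWithBase64 = function (quilt, image) {
    const fileReader = new FileReader();
    fileReader.onload = function () {
        quilt.imageBase64 = fileReader.result; // data:image/...;base64,...
        $http.post('quilts/create/', quilt);
    };
    fileReader.readAsDataURL(image);
};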
The solution I needed was provided by a real-world hero, and is posted here in case anyone else with a similar problem stumbles upon this thread :) (But thank you to user3562932 for taking the time to read and make a suggestion.)
On the client side, we have moved the five lines of data preparation into a separate method, so that the original create() now takes a bunch of parameters and jumps straight to $http.post(url, data that has been transformed into something appropriate to send, {rules on how to send the data}).
$http.post('quilts/create/', formData(quilt, image), {
    transformRequest: angular.identity,
    headers: {'Content-Type': undefined}
}).then(function (response) {
    return window.location = '#!/quilts/created/' + response.data;
})
The magical transformation happens in the new function formData(), which takes as its parameters the data we want to send and makes the necessary changes:
1) make a FormData container for the data to be POSTed.
2) stringify information from the HTML form (e.g. text, numbers) into a JSON string and append it to formData.
2a) in this particular case, my quilt structure contains size and maker details which arrived from the backend as JSONs and were selected in the webpage from drop-down lists of various sizes and makers, hence the parsing rows to get these items ready to be included in the formData.
3) convert files into Blobs, and likewise append them.
4) return formData, with all required information neatly wrapped up and ready to go!
Note: in the services.js file, this formData() method actually appears above the create() method, but it feels more logical to talk about them this way around.
function formData(quilt, image) {
    let formData = new FormData();
    quilt.size = JSON.parse(quilt.size);
    quilt.maker = JSON.parse(quilt.maker);
    formData.append('quiltRequest', JSON.stringify(quilt));
    formData.append('image', new Blob([image]));
    return formData;
}
On the server side, we can now happily receive this through:
@PostMapping(path = "/create", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
public BigInteger create(@RequestParam(value = "quiltRequest") String quiltRequest,
                         @RequestParam(value = "image") MultipartFile image) throws IOException {
    QuiltRequest quilt = new ObjectMapper().readValue(quiltRequest, QuiltRequest.class);
    QuiltResponse quiltResponse = quiltService.create(quilt, image);
    return quiltResponse.getQuilt().getId();
}
In order to enable the endpoint to consume our exciting multimedia input, we have to add the following import at the top of the class:
import org.springframework.http.MediaType;
We use another import to enable the use of the MultipartFile class that we have designated for the incoming image file:
import org.springframework.web.multipart.MultipartFile;
The JSON object from the webpage has come through as a String, but it needs to be parsed into its underlying components to actually be of use. This is where the ObjectMapper comes into play. Call its readValue() method and pass in the string argument plus a template of what the information should look like when unwrapped (here, a QuiltRequest class with properties corresponding to the information we fed into the JSON back in the client-side code). Remember to include the necessary import to access the ObjectMapper:
import com.fasterxml.jackson.databind.ObjectMapper;
Hopefully this breakdown of the changes makes sense, with enough explanation to help other developers build end-to-end POST requests to suit their own projects.

React / Rails API Image Uploading

I've built a React frontend along with a Rails API-only backend. I want to allow the user to create a task, enter a title and description, and upload an image.
So I've attempted to use Dropzone to get access to the image and then send the image info along with the title and description to my Rails API via a POST request using axios.
I set up CarrierWave on my Rails API in the hope of uploading to an AWS S3 bucket once my Task has been added to the database by the POST request.
None of this is working, so my question is: should I take care of the image upload to AWS on the React side, and if so, how do I associate that image with the additional information I'm saving to my Rails database (title and description)?
Thanks!
First, on the React side there should be no problem with the title and description, but for the image you need to encode it to a base64 string. It is something like this:
getBase64 = (file, callback) => {
  const fileReader = new FileReader();
  fileReader.onload = () => {
    callback(fileReader.result); // base64-encoded data URL
  };
  fileReader.onerror = (error) => {
    console.log('Error :', error);
  };
  fileReader.readAsDataURL(file);
}
Then, with axios, send those three parameters all together in one POST request, for example as sketched below.
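Something along these lines, assuming getBase64 from above; the /api/tasks route and parameter names are placeholders for your actual Rails route:
getBase64(file, (base64Image) => {
  axios.post('/api/tasks', {
    task: { title: title, description: description, image_data: base64Image },
  });
});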
For Rails, you need to set up code that can read the base64 string. Usually you can use the Paperclip or CarrierWave gem to add image attachments. It will look like this:
property_image = listing.property_images.new(param_image)
if param_image[:file_data]
  image_file = Paperclip.io_adapters.for(param_image[:file_data])
  image_file.original_filename = param_image[:image_file_name]
  image_file.content_type = "image/png"
  property_image.image = image_file
end

private

def param_image
  params.permit(:image, :image_file_name, :file_data)
end
