react-player uploading and playing new video - reactjs

I am uploading a video through an Express server to React's public folder, and using a hook to store the file name. If it's a large video file, react-player starts playing it correctly after the upload. But if the file is small, the player doesn't run it, and I have to reload after some time before it works.
Also, if a small file is uploaded after a large file, it returns a "range not satisfiable" error.
React-player:
<ReactPlayer
  url={filePath}
  className="react-player"
  width="100%"
  ref={ref}
  controls={true}
  onDuration={setDuration}
/>
Axios connection:
useEffect(() => {
  const request = () => {
    if (serverPath === 'http://localhost:5000/upload') {
      const datatoSend = new FormData()
      datatoSend.append('file', myFile)
      const fetchdata = async () => await axios.post(serverPath, datatoSend, {
        onUploadProgress: ProgressEvent => {
          setLoaded(ProgressEvent.loaded / ProgressEvent.total * 100)
        }
      })
      const result = fetchdata()
      result.then(res => {
        if (res.data === 'Server Connected') {
          setFilePath('CurrentMedia.mp4')
        }
      })
    }
  }
  request()
}, [serverPath])
Express Code:
app.post("/upload", async (req, res) => {
  console.log('/ route called')
  const file = req.files.file
  await file.mv(`${__dirname}/client/public/CurrentMedia.mp4`, (err) => {
    if (err) {
      console.error(err);
      return res.status(500).send(err);
    }
    res.send('Server Connected');
  });
})

I was able to solve it by saving the file under a different name every time a new file is uploaded. The problem was that react-player was asked to play a new video file with the same name as the old one; since the name was the same, the browser served the version it had cached before.
Doing the following in Express (and updating the hooks in React accordingly) fixed it:
app.post("/upload", async (req, res) => {
  console.log('/ route called')
  const file = req.files.file
  const name = file.name
  file.mv(`${__dirname}/public/${name}`, (err) => {
    if (err) {
      console.error(err);
      return res.status(500).send(err);
    }
    res.send('Server Connected');
    console.log('Server Connected')
  });
})
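An alternative to renaming the file on every upload, not from the original answer but a common cache-busting pattern, is to keep one file name and append a changing query parameter so the browser treats each upload as a new URL:

```javascript
// Hypothetical cache-busting helper: appending a version query string
// makes the URL unique per upload, so react-player fetches the new file
// instead of reusing the cached one.
function bustCache(path, version = Date.now()) {
  return `${path}?v=${version}`;
}

// e.g. in the axios .then(): setFilePath(bustCache('CurrentMedia.mp4'))
```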

Related

Uploading React - Node image(s) to Cloudinary

Has anyone uploaded an image using Node, React, and the Antd library to Cloudinary? I'm getting an error saying that the file is missing. I'm not sure if I should be sending the entire file object or just the name. I have tried sending the thumbUrl (sometimes it works, other times it doesn't, so there must be something I'm doing wrong).
This is my backend
const uploadImage = async (req, res) => {
  try {
    const result = await cloudinary.uploader.upload(req.body.image, {
      public_id: `${Date.now()}`,
      resource_type: 'auto' // jpeg or png
    });
    res.json({
      public_id: result.public_id,
      url: result.secure_url
    });
  } catch (error) {
    console.log(error);
    res.status(400).send(error)
  }
}
This is my frontend:
const Uploader = () => {
  const { token } = useSelector(state => state.user);
  const handleChange = ({ file }) => {
    console.log(file)
    axios.post(`${process.env.REACT_APP_API}/uploadImages`, { file }, {
      headers: {
        authtoken: token
      }
    })
      .then(res => console.log(res))
      .catch(err => console.log(err))
  };
  return (
    <Upload
      listType="picture"
      showUploadList
      onChange={handleChange}
    >
      <Button>Upload</Button>
    </Upload>
  )
}
export default Uploader
I don't know why the onChange triggers three times; when it has worked, I've sent the thumbUrl and it uploads three times. I have seen that I can use beforeUpload, but I believe that runs before uploading. I want to upload the image, preview it, send it to the server, and then pass it to my Form component to add it to the values I have there.
Anyone who has already done an upload: any help or ideas would be appreciated.
When it comes to the file not uploading properly, I am guessing it is because req.body.image isn't the actual file. I would look at what the value is there. I would guess you are missing a middleware.
As far as your frontend issue, I'm still a little unclear about what exactly the problem is. For an example of a clean frontend upload, I would check out https://codesandbox.io/embed/jq4wl1xjv. You could also consider using the upload widget, which handles the preview as well as giving you some easy editing options.
https://cloudinary.com/documentation/upload_widget
I was able to figure it out. Indeed there is a need for a middleware; I used formidable on the routes of my backend:
router.post('/uploadImages', authCheck, adminCheck, formidable({maxFileSize: 5 * 1024 * 1024}), imageUpload);
And fixed the controller
const imageUpload = async (req, res) => {
  try {
    // console.log('req files:', req.files.file.path)
    const result = await cloudinary.uploader.upload(req.files.file.path, {
      public_id: `${Date.now()}`,
      resource_type: 'auto'
    });
    res.json({
      public_id: result.public_id,
      url: result.secure_url
    })
  } catch (error) {
    console.log(error)
  }
}
As far as the frontend goes, Ant Design's Upload component takes an action that makes the POST to the backend, and it also takes headers if needed (in my case, an authtoken for Firebase).
When the image is uploaded, the component makes the POST to the backend based on the "action". The response brings back the "fileList" with the upload URL to access it.
<Upload
  listType="picture"
  showUploadList
  multiple
  action={`${process.env.REACT_APP_API}/uploadImages`}
  headers={{ authtoken: token }}
>
  <Button>Upload Images</Button>
</Upload>
I hope this helps somebody else too
I tried this and it worked for me.
Here I used the Multer middleware for handling form data (for uploading files).
I used this route:
router.post('/uploadImages', authMiddleware, multer.single("image"), imageUpload);
Corrected controller:
const imageUpload = async (req, res) => {
  try {
    let result;
    if (req.file) {
      // multer.single("image") puts the uploaded file on req.file
      result = await cloudinary.uploader.upload(req.file.path, {
        public_id: `${Date.now()}`,
        resource_type: 'auto'
      });
    }
    res.json({
      public_id: result.public_id,
      url: result.secure_url
    })
  } catch (error) {
    console.log(error)
  }
}
// Multer config file
const multer = require("multer");
const path = require("path");

module.exports = multer({
  storage: multer.diskStorage({}),
  fileFilter: (req, file, cb) => {
    let ext = path.extname(file.originalname);
    if (
      ext !== ".jpg" &&
      ext !== ".jpeg" &&
      ext !== ".png" &&
      ext !== ".PNG" &&
      ext !== ".JPG" &&
      ext !== ".JPEG"
    ) {
      cb(new Error("File type is not supported"), false);
      return;
    }
    cb(null, true);
  },
});

Is it possible to upload an image in firebase storage through a Cypress test in GitHub Actions?

I am trying to upload a file (a JPEG) to Firebase Storage in a Cypress e2e test. It runs fine and the test case passes locally: I log in the user and perform all the required actions (clicking the button, attaching a file from the fixtures folder to the file input, saving changes by clicking the button).
But the issue occurs in the GitHub Actions CI/CD pipeline. All the test cases pass except this one, which fails with "Cannot read properties of undefined (reading 'name')". I am accessing fileSelected.name when changes are saved.
My assumption is that the file is not found by Cypress, or that there is a permissions issue in Firebase Storage, and that is why it is failing. Can anybody help me understand why it is not working?
My Cypress test:
it("adds company successfully", () => {
  cy.visit("http://127.0.0.1:3000/dashboard/company", {
    headers: { "Accept-Encoding": "gzip, deflate" },
  }).then(() => {
    cy.wait(10000);
    cy.get("#newCompany").click();
    cy.get("#companyName").type("Test company").blur();
    cy.get("#companyEmail").type("testCompany#gmail.com").blur();
    cy.get("input[type=file]").selectFile(
      {
        contents: "cypress/fixtures/file.jpeg",
        fileName: `file${new Date().toISOString()}.jpeg`,
      },
      {
        force: true,
      }
    );
    cy.get(".purple-button").contains("Save Changes").click();
  });
});
The handleSave() function runs on clicking the Save Changes button.
const handleSave = () => {
  setIsSubmitted(true);
  if (isFormValid()) {
    if (fileSelected.name) {
      handleUpload();
    } else {
      saveCompany(companyPictureUrl);
      props.handleClose();
    }
  }
};

const handleUpload = () => {
  try {
    setIsUploading(true);
    const storageRef = ref(storage, "company/" + fileSelected.name);
    const uploadTask = uploadBytesResumable(storageRef, fileSelected);
    uploadTask.on(
      "state_changed",
      (snapshot: { bytesTransferred: number; totalBytes: number }) => {
        const progress =
          Math.round(
            (snapshot.bytesTransferred / snapshot.totalBytes) * 1000
          ) / 10;
        setUploadProgress(progress);
      },
      (error: any) => {
        console.log(error);
      },
      async () => {
        try {
          const url = await getDownloadURL(storageRef);
          setCompanyPictureUrl(url);
          saveCompany(url);
          props.handleClose();
        } catch (error: any) {
          console.log(error);
        }
      }
    );
  } catch (ex) {
    console.log("Error while uploading file :: ", ex);
  }
};
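One thing worth checking independently of CI: if selectFile silently fails in the pipeline, fileSelected stays undefined and fileSelected.name throws exactly this error. A defensive sketch (hypothetical helper, not from the question) that picks the save path without crashing:

```javascript
// Hypothetical guard: optional chaining avoids the crash when no file
// was actually attached, falling back to saving the existing picture URL.
function pickSaveAction(fileSelected) {
  return fileSelected?.name ? "upload" : "saveExisting";
}
```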
GitHub Actions error: (screenshot omitted)

Compare SSE local and Global versions when using eventSource and Server Sent Events

I am using server-sent events in an Express server like this:
const sendEventDashboard = async (req, res) => {
  try {
    const orders = await Order.find({ agent_id: req.params.id })
      .populate("agent_id")
      .sort({ _id: -1 });
    res.writeHead(200, {
      "Cache-Control": "no-cache",
      "Content-Type": "text/event-stream",
      Connection: "keep-alive",
    });
    const sseId = new Date().toDateString();
    const intervalId = setInterval(() => {
      writeEvent(res, sseId, JSON.stringify(orders));
    }, SEND_INTERVAL);
    res.on("close", () => {
      clearInterval(intervalId);
      res.end();
      // console.log("Client closed connection browser");
    });
  } catch (error) {
    console.log(error);
  }
};

export const getOrdersStreamDashboard = async (req, res) => {
  if (req.headers.accept === "text/event-stream") {
    sendEventDashboard(req, res);
  } else {
    res.json({ message: "Okay" });
  }
};
And this is how I use it in a React app with a useEffect hook:
useEffect(() => {
  const es = new EventSource(
    `${process.env.REACT_APP_SERVER_URL}/weborders/${agentId}/stream_dashboard`
  );
  es.addEventListener("open", () => {
    console.log("Dashboard stream opened!");
  });
  es.addEventListener("message", (e) => {
    const data = JSON.parse(e.data);
    setTrackOrderCount(data);
  });
  return () => {
    // es.removeAllEventListeners();
    es.close();
    es.removeEventListener("message", (e) => {
      const data = JSON.parse(e.data);
      setTrackOrderCount(data);
    });
  };
}, [trackOrderCount]);
Everything runs as desired, except that the event source keeps running until the app/browser crashes. I get no error when it stops running, and I have to refresh for it to start again. This happens after about 10 minutes of inactivity or being on the same page for a long duration. Is there a way to send SSE only when the state on the server differs from that on the client? I think the browser crashes because server-sent events run continuously even when there's no new event. I tried removing the dependency array [trackOrderCount] in the useEffect and the setInterval in the server, but that didn't solve the issue.
The solution might be in comparing the local and global versions before the event is sent, but I've failed to figure out where to put that logic. In the browser's console, this is what I get:
and this will run for some time, then crash!
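For the "compare local and global versions" idea, one sketch (hypothetical, with made-up names, not from the question) is a change gate on the server: remember the last payload sent and only call writeEvent when the serialized orders differ, so idle connections stay quiet instead of re-sending the same data every interval.

```javascript
// Hypothetical change gate: returns true only when the payload differs
// from the last one it saw, so the SSE write can be skipped otherwise.
function makeChangeGate() {
  let last = null;
  return (payload) => {
    const serialized = JSON.stringify(payload);
    if (serialized === last) return false; // unchanged: skip writeEvent
    last = serialized;
    return true;
  };
}

// Inside the interval, roughly:
//   if (gate(orders)) writeEvent(res, sseId, JSON.stringify(orders));
```

Note that the code in the question runs Order.find once before the interval starts, so the same snapshot would be re-sent forever; the query would need to move inside the interval callback for new orders to ever appear in the stream.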

FormData with NextJS API

Background
I am trying to create a simple CRUD application using Next.js along with react-redux, which saves people's contacts. So when adding a contact, I am trying to send some data along with a file to a Next.js API route.
Issue
ContactAction.js
Make a POST request from a redux action to add a contact:
export const addContact = (data) => async (dispatch) => {
  try {
    var formData = new FormData();
    formData.append('name', data.Name);
    formData.append('email', data.Email);
    formData.append('phone', data.Phone);
    formData.append('image', data.Image);
    let response = await Axios.post(`http://localhost:3000/api/contact/addContact`, formData, {
      headers: {
        'x-auth-token': localStorage.getItem('token')
      }
    });
  } catch (error) {
    console.log(error);
  }
}
addContact.js
This is the API route in /api/contact/
const handler = async (req, res) => {
  switch (req.method) {
    case "POST": {
      await addContact(req, res)
    }
  }
}

const addContact = async (req, res) => {
  console.log(req.body);
  // do some stuff here and send response
}
This is what I get in the terminal after the log; the file is gibberish as well when logging req.files.
Current Effort
I tried using third-party packages such as formidable and formidable-serverless but had no luck. After a day, I made it work with a package called multiparty.
addContact.js
const handler = async (req, res) => {
  switch (req.method) {
    case "POST": {
      let form = new multiparty.Form();
      let FormResp = await new Promise((resolve, reject) => {
        form.parse(req, (err, fields, files) => {
          if (err) reject(err)
          resolve({ fields, files })
        });
      });
      const { fields, files } = FormResp;
      req.body = fields;
      req.files = files;
      await addContact(req, res)
    }
  }
}

const addContact = async (req, res) => {
  console.log(req.body); // Now I get an object which I can use
  // do some stuff here and send response
}
The above solution is obviously redundant and probably not the best way to go about it, and I don't want to add these 7-8 lines to each route.
So if someone could help me understand what I am doing wrong, and why FormData doesn't seem to work with the Next.js API (when it works with the Express server), I would be grateful.
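The repeated parse block can at least be factored into one small promise wrapper, a sketch of my own with hypothetical names, so each route only awaits a single call; the same shape works for multiparty's form.parse or any parser with a (req, callback) signature:

```javascript
// Hypothetical helper: wrap a callback-style parser (such as
// multiparty's form.parse) in a Promise so API routes can await it.
function promisifyParse(parse, req) {
  return new Promise((resolve, reject) => {
    parse(req, (err, fields, files) => {
      if (err) reject(err);
      else resolve({ fields, files });
    });
  });
}

// e.g. in a route:
//   const form = new multiparty.Form();
//   const { fields, files } = await promisifyParse(form.parse.bind(form), req);
```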
FormData uses the multipart/form-data format. That is not a simple POST request with a body; it is generally used for uploading files, which is why it needs special handling. As an alternative, you could use JSON.
Here is my solution; I hope it helps somebody.
First of all you need to install next-connect and multer as your dependencies.
Now you can use this API route code.
import nextConnect from "next-connect";
import multer from "multer";

const apiRoute = nextConnect({
  onError(error, req, res) {
    res.status(501).json({ error: `Sorry something Happened! ${error.message}` });
  },
  onNoMatch(req, res) {
    res.status(405).json({ error: `Method "${req.method}" Not Allowed` });
  },
});

apiRoute.use(multer().any());

apiRoute.post((req, res) => {
  console.log(req.files); // Your files here
  console.log(req.body); // Your form data here
  // Any logic with your data here
  res.status(200).json({ data: "success" });
});

export default apiRoute;

export const config = {
  api: {
    bodyParser: false, // Disallow body parsing, consume as stream
  },
};
Here is an example about uploading file with Next.js:
https://codesandbox.io/s/thyb0?file=/pages/api/file.js
The most important code is in pages/api/file.js
import formidable from "formidable";
import fs from "fs";

export const config = {
  api: {
    bodyParser: false
  }
};

const post = async (req, res) => {
  const form = new formidable.IncomingForm();
  form.parse(req, async function (err, fields, files) {
    await saveFile(files.file);
    return res.status(201).send("");
  });
};

const saveFile = async (file) => {
  const data = fs.readFileSync(file.path);
  fs.writeFileSync(`./public/${file.name}`, data);
  await fs.unlinkSync(file.path);
  return;
};
Generally speaking, in your API file you should disable the default bodyParser and parse the body yourself.

React Dropzone cancel file upload on a button click

I'm using react-dropzone to upload files to my server in my react app. Everything is working great but I want to add a feature where if a file is taking too long to upload due to its size, the user can cancel the process with the click of a button.
<Dropzone
  multiple={false}
  accept={allowedMimeTypes}
  onDrop={this.onDrop}
  onDragEnter={this.onDragEnter}
  onDragLeave={this.onDragLeave}
  className={classes.dropzone}
  maxSize={MAX_UPLOAD_BYTES}
>
</Dropzone>
<button onClick={this.onCancelUpload}>Cancel</button>
Please advise if it is possible using react-dropzone. I can't think of a way to stop an event that has already been triggered and is uploading the file.
I was able to solve the above problem using the Axios Cancel Token.
Use Axios to handle the upload as a promise.
Create a source at the start of your code.
const CancelToken = axios.CancelToken;
let source = CancelToken.source();
Pass on the source to the request in config.
const {
  acceptedFiles
} = this.state;
const uploaders = acceptedFiles.map((file) => {
  const formData = new FormData();
  // data must be set BEFORE sending file
  formData.append('document', file);
  const uploadConfig = {
    onUploadProgress: (progressEvent) => {
      const progressUpload = (progressEvent.loaded * 100) / progressEvent.total;
      this.setState({
        progressUpload
      });
    },
    cancelToken: source.token,
  };
  return inkerzApi.post('/uploads/file', formData, uploadConfig)
    .then(response => response.data)
    .catch(() => console.log('Upload canceled'));
});
Promise.all(uploaders).then((filesMetadata) => {
  filesMetadata.forEach((metadata) => {
    if (metadata && metadata.mediaLink && metadata.totalPages) {
      this.onNewFileUploaded(metadata);
      // show success message here
    } else if (this.state.uploadCanceled) {
      // show cancelation notification here
      this.setState({
        uploadCanceled: false
      });
    }
  });
  this.setState({
    acceptedFiles: [],
    progressUpload: 0,
  });
});
On Cancel Button Click
onCancelUpload = () => {
  source.cancel('Operation canceled by the user.');
  source = CancelToken.source();
  this.setState({ uploadCanceled: true });
}
This worked out for me. Hope this helps others as well.
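As a side note: CancelToken is deprecated in newer axios releases (v0.22+), which accept the standard AbortController signal in the request config instead. A minimal sketch of the controller itself; the axios call is shown only as a comment, since the pattern otherwise matches the answer above:

```javascript
// AbortController is the standard replacement for axios's CancelToken.
// Pass controller.signal as `signal` in the request config, then call
// controller.abort() from the cancel button handler.
const controller = new AbortController();

// e.g. axios.post('/uploads/file', formData, { signal: controller.signal })

console.log(controller.signal.aborted); // false: nothing canceled yet
controller.abort();
console.log(controller.signal.aborted); // true: pending requests reject
```

A fresh controller is needed per upload, since an aborted signal cannot be reused, which mirrors the `source = CancelToken.source()` reset in onCancelUpload above.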
