`S3Client.send()` doesn't return `VersionId` - reactjs

I have a bucket which has versioning enabled. In my ReactJS app, I need to upload files to the bucket and receive the new object metadata. I use S3 Client with PutObjectCommand to do the upload. The documentation here states:
Versioning
If you enable versioning for a bucket, Amazon S3 automatically generates a unique version ID for the object being stored. Amazon S3 returns this ID in the response. When you enable versioning for a bucket, if Amazon S3 receives multiple write requests for the same object simultaneously, it stores all of the objects.
So I expect to receive a `VersionId`. But not only is this field undefined; other fields like `requestId` and `cfId` are undefined as well.
Here is my module code:
import {
  PutObjectCommand,
  S3Client
} from '@aws-sdk/client-s3';

const S3_BUCKET_NAME = process.env.S3_BUCKET_NAME;
const AWS_REGION = process.env.AWS_REGION;
const AWS_ACCESS_KEY = process.env.AWS_ACCESS_KEY;
const AWS_SECRET_KEY = process.env.AWS_SECRET_KEY;

const client = new S3Client({
  region: AWS_REGION,
  credentials: {
    accessKeyId: AWS_ACCESS_KEY,
    secretAccessKey: AWS_SECRET_KEY
  }
});

const uploadToS3 = async function (fileToUpload) {
  const data = await fileToUpload.arrayBuffer();
  const params = {
    Bucket: S3_BUCKET_NAME,
    Key: fileToUpload.name,
    Body: data
  };
  const command = new PutObjectCommand(params);
  const result = await client.send(command);
  console.log(result); // requestId: undefined, extendedRequestId: undefined, cfId: undefined
  console.log(`VersionId: ${result.VersionId}`); // VersionId: undefined
}

export default {
  uploadToS3
}
Have I missed anything here?
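If the upload itself succeeds, a likely explanation (an assumption, not confirmed in this thread) is CORS: in the browser, VersionId, requestId, and cfId are all parsed from response headers (x-amz-version-id, x-amz-request-id, x-amz-id-2), and browsers hide any response header that the bucket's CORS configuration does not explicitly expose. A minimal sketch of a bucket CORS rule that exposes them:

[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["PUT", "POST", "GET"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": [
      "x-amz-version-id",
      "x-amz-request-id",
      "x-amz-id-2"
    ]
  }
]

With the headers exposed, result.VersionId should be populated on a versioning-enabled bucket.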

Related

Uploading image to azure blob storage using React

I am trying to upload an image from React to Azure blob storage, but the request fails with this error:
TypeError: Cannot read properties of undefined (reading 'size')
    at BlockBlobClient.uploadFile
Here is a sample of the code trying to achieve it:
import { BlobServiceClient } from '@azure/storage-blob';

const account = process.env.REACT_APP_AZURE_ACCOUNT_NAME;
const sas = process.env.REACT_APP_SAS_TOKEN;
const containerName = 'usercontainer';

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net/?${sas}`,
);

export const uploadToBlob = async (file) => {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blobName = file.src + new Date().getTime();
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  const uploadBlobResponse = await blockBlobClient.uploadFile(file.src);
  console.log(
    `Upload block blob ${blobName} successfully`,
    uploadBlobResponse.requestId,
  );
};
There are a few issues here:
1. You are calling the uploadFile method, which is only available in the Node.js runtime and not in the browser. Please see the documentation of this method here: https://learn.microsoft.com/en-us/javascript/api/@azure/storage-blob/blockblobclient?view=azure-node-latest#@azure-storage-blob-blockblobclient-uploadfile.
2. The method you would want to use is uploadData, which expects an object of type Blob. Considering you have a data URL, you would need to create a Blob object out of it. Please see this question regarding converting a data URL to a Blob: Blob from DataURL?
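Putting both fixes together, a minimal sketch (assuming file.src holds the data URL, as in the question; the fetch-based conversion is one common way to build a Blob from a data URL):

export const uploadToBlob = async (file) => {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  // file.src is a data URL, so it makes a poor blob name; use a timestamp instead.
  const blobName = 'image-' + new Date().getTime();
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  // Convert the data URL string into a Blob, which uploadData accepts.
  const blob = await fetch(file.src).then((res) => res.blob());
  // uploadData works in the browser, unlike uploadFile (Node.js only).
  const uploadBlobResponse = await blockBlobClient.uploadData(blob);
  console.log(`Uploaded block blob ${blobName} successfully`, uploadBlobResponse.requestId);
};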

Send FormData object and File in a single Axios POST request

I currently have a Spring Boot API controller method (below) that accepts an object as well as a MultipartFile. I am able to successfully send a POST request via Postman; however, I am now struggling to make this POST request from my front-end ReactJS application using Axios.
@PostMapping(
    path = "/upload",
    consumes = {
        MediaType.APPLICATION_JSON_VALUE,
        MediaType.MULTIPART_FORM_DATA_VALUE
    },
    headers = {
        "Content-Type=multipart/form-data"
    }
)
public SoundProfile uploadSoundProfile(Authentication auth,
        @RequestPart("soundProfileRequest") SoundProfileRequest soundProfileRequest,
        @RequestPart("audio_file") MultipartFile audio_file) {
    return soundProfileService.uploadSoundProfile(auth, soundProfileRequest, audio_file);
}
The following is my service method, which saves the object to a MySQL database and then stores the file in an Amazon S3 bucket.
public SoundProfile uploadSoundProfile(Authentication auth, SoundProfileRequest soundProfileRequest, MultipartFile audio_file) {
    if (audio_file.isEmpty()) {
        throw new IllegalStateException("no audio file received");
    }
    AppUser current_user = appUserRepository.findByEmail(auth.getName())
        .orElseThrow(
            () -> new IllegalStateException("User not found")
        );
    Map<String, String> metadata = new HashMap<>();
    metadata.put("Content-Type", audio_file.getContentType());
    metadata.put("Content-Length", String.valueOf(audio_file.getSize()));
    String soundPath = UUID.randomUUID().toString();
    SoundProfile soundProfile = new SoundProfile(
        soundPath, // SoundPath = S3 key
        soundProfileRequest.getCaseBrand(),
        soundProfileRequest.getCaseModel(),
        soundProfileRequest.getSwitches(),
        soundProfileRequest.getKeycaps(),
        soundProfileRequest.getStabilizers(),
        soundProfileRequest.getLube(),
        soundProfileRequest.getMods(),
        current_user
    );
    // save sound profile to database
    soundProfileRepository.save(soundProfile);
    String path = String.format("%s/%s", BucketName.KEYBOARD_AUDIO_BUCKET.getBucketName(), current_user.getUserId());
    String filename = String.format("%s-%s", audio_file.getOriginalFilename(), soundPath);
    // Save audio file to s3 bucket
    try {
        fileStore.saveAudio(
            path,
            filename,
            Optional.of(metadata),
            audio_file.getInputStream()
        );
    } catch (IOException e) {
        throw new IllegalStateException(e);
    }
    return soundProfile;
}
I would like to send the SoundProfileRequest object and the multipart file separately, meaning I don't want to append the file to a FormData object, but I would still like to send the file along with the form fields in a single post request.
For example in my front-end React Component:
export default function UploadSoundProfile() {
  const [caseBrand, setCaseBrand] = useState("");
  const [caseModel, setCaseModel] = useState("");
  const [switches, setSwitches] = useState("");
  const [keycaps, setKeycaps] = useState("");
  const [lube, setLube] = useState("");
  const [stabilizers, setStabilizers] = useState("");
  const [mods, setMods] = useState("");
  const [selectedFile, setSelectedFile] = useState("");
  const history = useHistory();

  const createSoundProfile = (e) => {
    e.preventDefault();
    const url = "/sound-profile/upload";
    const formData = new FormData();
    formData.append('caseBrand', caseBrand);
    formData.append('caseModel', caseModel);
    formData.append('switches', switches);
    formData.append('keycaps', keycaps);
    formData.append('lube', lube);
    formData.append('stabilizers', stabilizers);
    formData.append('mods', mods);
    // SHOULD FILE ALSO BE APPENDED TO FORMDATA OBJECT HERE?
    formData.append('audio_file', selectedFile);
    const config = {
      headers: {
        "content-type": "multipart/form-data"
      }
    }
    uploadProfileService.createSoundProfile(url, formData, config);
    history.push("/sound-profile/profile");
  };

  return (
    ...
  )
}
Is there a way to make the POST request with Axios without appending the file to the FormData object, while still making a single POST request?
I am unsure how to accomplish this, or whether it is possible. I have seen other posts where a file is appended to a FormData object with .append(), but I am unsure whether this will cause an error on the backend.
Thanks for any help in advance!
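The usual pattern for a Spring controller with two @RequestPart parameters is to keep FormData but append the object as a JSON Blob, so the object and the file travel as two named parts of one multipart request. A minimal sketch (the part names soundProfileRequest and audio_file must match the controller above; the rest is illustrative):

const createSoundProfile = (e) => {
  e.preventDefault();
  const formData = new FormData();
  // Part 1: the SoundProfileRequest fields, serialized as a JSON part.
  const soundProfileRequest = { caseBrand, caseModel, switches, keycaps, lube, stabilizers, mods };
  formData.append(
    'soundProfileRequest',
    new Blob([JSON.stringify(soundProfileRequest)], { type: 'application/json' })
  );
  // Part 2: the file itself, under the name the controller expects.
  formData.append('audio_file', selectedFile);
  // Omit the Content-Type header so the browser sets the multipart boundary itself.
  axios.post('/sound-profile/upload', formData)
    .then(() => history.push('/sound-profile/profile'));
};

This is still a single POST request; FormData is just the container that the multipart/form-data encoding requires.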

Error 400 Bad Request while Uploading Image to firebase storage in React Native

I am working on a React Native project connected to Firebase. I am using Firebase Storage and trying to upload a file to it, but I get the following error.
{code: 400, message: "Bad Request. Could not access bucket quickbuy-a0764.appspot.com","status":"Access_Bucket"}
I tried configuring my permissions, but that did not work for me.
An example of the image URI I am providing to put() is as follows:
data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAk and so on
Now what should I do to resolve this issue?
let filename = values.images + Date.now();
let uri = values.images[0];
const uploadTask = storage.ref(`images/${filename}`).put(uri);
uploadTask.on("state_changed", (snapshot) => {
  console.log(snapshot);
});
firebase.storage.Reference#put() accepts a Blob, Uint8Array, or an ArrayBuffer. Because you are trying to upload a data URI, which is a string, you need to use firebase.storage.Reference#putString() (https://firebase.google.com/docs/reference/js/firebase.storage.Reference#putstring).
To do this for a data URI, you would use:
someStorageRef.putString(uri, firebase.storage.StringFormat.DATA_URL);
Next, based on these lines:
const filename = values.images + Date.now();
let uri = values.images[0];
values.images is an array, which means that filename will end up being something similar to "[object Object],[object Object]1620528961143".
As I covered in this answer on your question yesterday, this is a poor way to generate IDs as it can lead to duplicates & collisions - use a Push ID instead.
const uri = /* ... */;
const rootRef = firebase.database().ref();
const filename = rootRef.push().key;

const uploadTask = storage.ref(`images/${filename}`)
  .putString(uri, firebase.storage.StringFormat.DATA_URL);

uploadTask.on("state_changed", (snapshot) => {
  console.log(snapshot);
});
Future Use with Version 9 of the SDK
import { getStorage, ref, uploadString } from "firebase/storage";

const uploadImage = async (values) => {
  const filename = values.images + Date.now(); // consider a push ID here too, as above
  const uri = values.images[0];
  // Create a root reference
  const storage = getStorage();
  // Create a reference to 'images/<filename>.jpg'
  const filesImagesRef = ref(storage, `images/${filename}.jpg`);
  // uri is a data URL (a string), so use uploadString rather than uploadBytes
  await uploadString(filesImagesRef, uri, 'data_url').then((snapshot) => {
    console.log('Uploaded a data URL string!');
  });
}
Let us know how this works for you!

Firestore Data not getting uploaded to Algolia

I am facing a problem uploading data (records) from Firebase Firestore to the Algolia database. I am using the Algolia API to upload the data.
Here is the code of my index.js (the code for uploading data from Firestore into Algolia):
// Import all needed modules.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const algoliasearch = require("algoliasearch");

// Set up Firestore.
admin.initializeApp();
const db = admin.firestore();
const env = functions.config();

// Set up Algolia.
const client = algoliasearch(env.algolia.appid, env.algolia.apikey);
const index = client.initIndex("users");

// Create a HTTP request cloud function.
async function sendCollectionToAlgolia() {
  // This array will contain all records to be indexed in Algolia.
  var algoliaRecords = [];
  const querySnapshot = await db.collection('users').get();
  querySnapshot.docs.forEach(doc => {
    const document = doc.data();
    const record = {
      objectID: doc.id,
      name: document.name,
      age: document.age,
      gender: document.gender,
      certificate: document.certificate,
      about: document.about,
      interestedIn: document.interestedIn,
      isVerified: document.isVerified,
      joinDate: document.joinDate,
      level: document.level,
      location: document.location,
      meetups: document.meetups,
      geoHash: document.geoHash,
      height: document.height,
      photoUrl: document.photoUrl,
      points: document.points,
      preferredBuddyFitnessLevel: document.preferredBuddyFitnessLevel,
      uid: document.uid,
      weight: document.weight,
      userFitnessLevel: document.userFitnessLevel
    };
    algoliaRecords.push(record);
  });
  // After all records are created, we save them to Algolia.
  index.saveObjects(algoliaRecords, { autoGenerateObjectIDIfNotExist: true })
    .then(objects => { console.log(objects.firstname); })
    .catch(err => { console.log(err); });
}

module.exports = sendCollectionToAlgolia;
Now, after I execute the firebase deploy --only functions:sendCollectionToAlgolia command in my terminal, the output reports that the deployment is complete and successful, but the data never shows up in my Algolia dashboard: no records are visible there.
Help will be appreciated!
Thanks in advance.
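One thing stands out in the code above (an observation, not a confirmed diagnosis): module.exports = sendCollectionToAlgolia exports a plain async function rather than a Cloud Function, so after deployment nothing ever invokes it. A minimal sketch of exposing it as an HTTPS-triggered function instead:

// Expose the indexing routine as an HTTPS function so hitting its URL runs it.
exports.sendCollectionToAlgolia = functions.https.onRequest(async (req, res) => {
  try {
    await sendCollectionToAlgolia();
    res.status(200).send("Indexed users collection to Algolia.");
  } catch (err) {
    console.error(err);
    res.status(500).send(err.toString());
  }
});

For this to help, index.saveObjects(...) inside sendCollectionToAlgolia would also need to be awaited (or returned), otherwise the function can terminate before the records reach Algolia.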

React Application Update Data from DynamoDB Change

I am building a React application with GraphQL, using AWS AppSync with DynamoDB. My use case is that I have a table of data that is pulled from a DynamoDB table and displayed to the user via GraphQL. A few of the fields are updated by Step Functions running on AWS. I need those fields to be automatically updated for the user, much like a GraphQL subscription would do, but I found out that subscriptions are tied to mutations, and thus an update to the database from Step Functions will not trigger a subscription update on the frontend. To get around this I am using the following:
useEffect(() => {
  setTimeout(getSubmissions, 5 * 1000)
})
Obviously this is a lot of overfetching and will probably incur unnecessary expense. I have looked for a better solution and come across DynamoDB streams but DynamoDB streams can't help me if they can't trigger the frontend to refresh the component. There has to be a better solution than what I have come up with.
Thanks!
You are correct: in AWS AppSync, to trigger a subscription publish you must trigger a GraphQL mutation.
"...but I found out that subscriptions are tied to mutations and thus an update to the database from step functions will not trigger a subscription update on the frontend."
If you update your DynamoDB table directly via step functions or via DynamoDB streams, then AppSync has no way to know the data refreshed.
Why don't you have your step function use an AppSync mutation instead of updating your table directly? That way you can link a subscription to the mutation and have your interested clients get pushed updates when the data is refreshed.
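In schema terms, the link between the two looks something like this (a sketch with assumed type and field names):

type Mutation {
  updateSubmission(id: ID!, status: String): Submission
}

type Subscription {
  onUpdateSubmission: Submission
    @aws_subscribe(mutations: ["updateSubmission"])
}

Any client subscribed to onUpdateSubmission is then pushed the result each time the Step Function calls the updateSubmission mutation.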
Assuming you are using Cognito as your authentication for your AppSync application, you could set a Lambda trigger on the DynamoDB table that generates a Cognito token and uses that to make an authorized request to your mutation endpoint. NOTE: in your Cognito user pool > App clients page, you will need to check the Enable username password auth for admin APIs for authentication (ALLOW_ADMIN_USER_PASSWORD_AUTH) box, to generate a client secret.
const AWS = require('aws-sdk');
const crypto = require('crypto');
const axios = require('axios');
var jwt = require('jsonwebtoken');
const secrets = require('./secrets.js');

var cognitoidentityserviceprovider = new AWS.CognitoIdentityServiceProvider();
var config;

const adminAuth = () => new Promise((res, rej) => {
  const digest = crypto.createHmac('SHA256', config.SecretHash)
    .update(config.userName + config.ClientId)
    .digest('base64');
  var params = {
    AuthFlow: "ADMIN_NO_SRP_AUTH",
    ClientId: config.ClientId, /* required */
    UserPoolId: config.UserPoolId, /* required */
    AuthParameters: {
      'USERNAME': config.userName,
      'PASSWORD': config.password,
      "SECRET_HASH": digest
    },
  };
  cognitoidentityserviceprovider.adminInitiateAuth(params, function(err, data) {
    if (err) {
      console.log(err.stack);
      rej(err);
    }
    else {
      data.AuthenticationResult ? res(data.AuthenticationResult) : rej("Challenge requested, to verify, login to app using admin credentials");
    }
  });
});

const decode = auth => new Promise(res => {
  const decoded = jwt.decode(auth.AccessToken);
  auth.decoded = decoded;
  res(auth);
});

// example gql query
const testGql = auth => {
  const url = config.gqlEndpoint;
  const payload = {
    query: `
      query ListMembers {
        listMembers {
          items {
            firstName
            lastName
          }
        }
      }
    `
  };
  console.log(payload);
  const options = {
    headers: {
      "Authorization": auth.AccessToken
    },
  };
  console.log(options);
  return axios.post(url, payload, options).then(data => data.data)
    .catch(e => console.log(e.response.data));
};

exports.handler = async (event, context, callback) => {
  await secrets() // some promise that returns your keys object (i use secrets manager)
    .then(keys => {
      // keys = {
      //   ClientId: YOUR_COGNITO_CLIENT,
      //   UserPoolId: YOUR_USERPOOL_ID,
      //   SecretHash: (obtained from cognito > userpool > app clients > app client secret),
      //   gqlEndpoint: YOUR_GRAPHQL_ENDPOINT,
      //   userName: YOUR_COGNITO_USER,
      //   password: YOUR_COGNITO_USER_PASSWORD,
      // }
      config = keys;
      return adminAuth();
    })
    .then(auth => decode(auth))
    .then(auth => testGql(auth))
    .then(data => {
      console.log(data);
      callback(null, data);
    })
    .catch(e => {
      callback(e);
    });
};
