SvelteKit TypeError: immutable

I'm using SvelteKit 1.0.0-next.483, running with npm run dev -- --host.
When connecting to an endpoint from a mobile device I get this error:
TypeError: immutable
at Headers.append ([..]node_modules/undici/lib/fetch/headers.js:227:13)
This error only occurs on the mobile device, when connecting to the local network IP address.
My endpoint: src/routes/gqlendpoint/+server.ts
const base = 'http://localhost:4000/graphql';

export async function POST(opts: { request: Request }): Promise<Response> {
    const { request } = opts;
    const body = await request.json();
    const response = await fetch(base, {
        //credentials: "include",
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(body)
    });
    return response;
}
The only way I found to unblock this situation is by commenting out a line of code inside node_modules/undici/lib/fetch/headers.js:
// 3. If headers’s guard is "immutable", then throw a TypeError.
// 4. Otherwise, if headers’s guard is "request" and name is a
//    forbidden header name, return.
// Note: undici does not implement forbidden header names
if (this[kGuard] === 'immutable') {
    // throw new TypeError('immutable')
} else if (this[kGuard] === 'request-no-cors') {
    // 5. Otherwise, if headers’s guard is "request-no-cors":
    // TODO
}
which is certainly not a good solution.

You have to return a new Response with the body to avoid this issue; see the example code below:
return new Response('test example')
in place of return response;
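Applied to the endpoint above, a minimal sketch of that approach (assuming the upstream GraphQL server answers with JSON) copies the upstream body and status into a fresh Response instead of forwarding the fetch result, whose headers are guarded as immutable:

const base = 'http://localhost:4000/graphql';

export async function POST(opts: { request: Request }): Promise<Response> {
    const { request } = opts;
    const body = await request.json();
    const upstream = await fetch(base, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(body)
    });
    // Build a new Response: its headers are mutable, so SvelteKit can
    // append its own headers without hitting the "immutable" guard.
    return new Response(await upstream.text(), {
        status: upstream.status,
        headers: { 'Content-Type': 'application/json' }
    });
}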

Related

403 when uploading a file to an S3 bucket using axios

I'm using axios to upload an audio file to an AWS S3 bucket.
The workflow is: React => AWS API Gateway => Lambda.
Here is the backend Lambda code that generates the S3 presigned URL:
PutObjectRequest putObjectRequest = PutObjectRequest.builder()
        .bucket(AUDIO_S3_BUCKET)
        .key(objectKey)
        .contentType("audio/mpeg")
        .build();
PutObjectPresignRequest putObjectPresignRequest = PutObjectPresignRequest.builder()
        .signatureDuration(Duration.ofMinutes(10))
        .putObjectRequest(putObjectRequest)
        .build();
PresignedPutObjectRequest presignedPutObjectRequest = s3Presigner.presignPutObject(putObjectPresignRequest);
AwsProxyResponse awsProxyResponse = new AwsProxyResponse();
awsProxyResponse.setStatusCode(HttpStatus.SC_OK);
awsProxyResponse.setBody(
        GetS3PresignedUrlResponse.builder()
                .s3PresignedUrl(presignedPutObjectRequest.url().toString())
                .build().toString());
return awsProxyResponse;
Here is the Java code that sets the bucket CORS configuration:
private void setBucketCorsSettings(@NonNull final String bucketName) {
    s3Client.putBucketCors(PutBucketCorsRequest.builder()
            .bucket(bucketName)
            .corsConfiguration(CORSConfiguration.builder()
                    .corsRules(CORSRule.builder()
                            .allowedHeaders("*")
                            .allowedMethods("GET", "PUT", "POST")
                            .allowedOrigins("*") // TODO: Replace with domain name
                            .exposeHeaders("ETag")
                            .maxAgeSeconds(3600)
                            .build())
                    .build())
            .build());
    log.info("Set bucket CORS settings successfully for bucketName={}.", bucketName);
}
In my frontend, here is the part that tries to upload the file:
const uploadFile = (s3PresignedUrl: string, file: File) => {
    let formData = new FormData();
    formData.append("file", file);
    formData.append('Content-Type', file.type);
    const config = {
        headers: {
            "Content-Type": 'multipart/form-data; boundary=---daba-boundary---'
            //"Content-Type": file.type,
        },
        onUploadProgress: (progressEvent: { loaded: any; total: any; }) => {
            const { loaded, total } = progressEvent;
            let percent = Math.floor((loaded * 100) / total);
            if (percent < 100) {
                setUploadPercentage(percent);
            }
        },
        cancelToken: new axios.CancelToken(
            cancel => (cancelFileUpload.current = cancel)
        )
    };
    axios(
        {
            method: 'post',
            url: s3PresignedUrl,
            data: formData,
            headers: {
                "Content-Type": 'multipart/form-data; boundary=---daba-boundary---'
            }
        }
    )
        .then(res => {
            console.log(res);
            setUploadPercentage(100);
            setTimeout(() => {
                setUploadPercentage(0);
            }, 1000);
        })
        .catch(err => {
            console.log(err);
            if (axios.isCancel(err)) {
                alert(err.message);
            }
            setUploadPercentage(0);
        });
};
However, when I try to upload the file, it returns a 403 error.
If I use fetch instead of axios, it works, like this:
export async function putToS3(presignedUrl: string, fileObject: any) {
    const requestOptions = {
        method: "PUT",
        headers: {
            "Content-Type": fileObject.type,
        },
        body: fileObject,
    };
    //console.log(presignedUrl);
    const response = await fetch(presignedUrl, requestOptions);
    //console.log(response);
    return response;
}
putToS3(getPresignedUrlResponse['s3PresignedUrl'], values.selectdFile).then(
    (putToS3Response) => {
        console.log(putToS3Response);
        Toast("Success!!", "File has been uploaded.", "success");
    }
);
It seems to me that the only difference between these two is that when using fetch the request's Content-Type header is Content-Type: audio/mpeg, but when using axios it is Content-Type: multipart/form-data; boundary=----WebKitFormBoundaryClLJS3r5Xetv3rN7.
How can I make it work with axios? I'm switching to axios for its ability to monitor request progress, as I want to show an upload progress bar.
I followed this blog and am not sure what I missed: https://bobbyhadz.com/blog/aws-s3-presigned-url-react
You are using POST in your axios call. It should be PUT instead.
Also, I think the content type has to match the one specified when requesting the pre-signed URL, which is audio/mpeg as you rightly pointed out.
Correspondingly, your data should be just file, instead of formData.
axios(
    {
        method: 'put',
        url: s3PresignedUrl,
        data: file,
        headers: {
            "Content-Type": 'audio/mpeg'
        }
    }
    ...
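Putting those points together, a sketch of the upload helper (assuming the URL was presigned with Content-Type audio/mpeg, and reusing the progress callback and setUploadPercentage state setter from the question) might look like this:

const uploadFile = (s3PresignedUrl: string, file: File) => {
    // PUT the raw File object, not a FormData wrapper, and keep the
    // Content-Type identical to the one used when presigning the URL.
    return axios.put(s3PresignedUrl, file, {
        headers: { 'Content-Type': 'audio/mpeg' },
        onUploadProgress: (progressEvent: { loaded: number; total: number }) => {
            const percent = Math.floor((progressEvent.loaded * 100) / progressEvent.total);
            if (percent < 100) {
                setUploadPercentage(percent);
            }
        }
    });
};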
You didn't mark any answer as accepted, so I guess you didn't solve it.
For any future viewers out there: the reason you are getting a 403 Forbidden error is that the Content-Type on your server and client side do not match. I'm assuming you set up the AWS policies correctly.
Your code in the backend should look like this:
const presignedPUTURL = s3.getSignedUrl("putObject", {
    Bucket: "bucket-name",
    Key: String(Date.now()),
    Expires: 100,
    ContentType: "image/png", // important
});
and in the front-end (assuming you are using axios):
const file = e.target.files[0]
const result = await axios.put(url, file, {
    withCredentials: true,
    headers: { "Content-Type": "image/png" },
});
In practice, you would normally send the file type in the POST body (or however you request the pre-signed URL) when generating it, and then in axios use file.type to get the type of the uploaded file.
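As a hedged sketch of that flow (the /presign endpoint name and its response shape are assumptions for illustration, not something from the original answer):

async function uploadWithPresign(file: File) {
    // Ask the backend to sign for this specific content type...
    const { data } = await axios.post('/presign', { contentType: file.type });
    // ...and use exactly the same content type for the PUT.
    await axios.put(data.s3PresignedUrl, file, {
        headers: { 'Content-Type': file.type }
    });
}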
Check your Lambda execution role. It may be the culprit. Perhaps it does not grant enough permissions to allow PUTting files into your bucket.
URL signing is a delegation of power on behalf of the signer, restricted to a specified object, action... Signing does not magically grant full read/write permissions on S3, even on the specific object related to the presigned URL.
The "user" who generates the signature requires sufficient permissions to allow the actions you want to delegate through that presigned URL. In this case, this is the execution role of your Lambda function.
You can add the AmazonS3FullAccess managed policy to the execution role and see if it solves your situation. This change took me out of a blocked situation after days of struggle. Afterwards, before going to production, restrict that policy to the specific bucket you want to allow uploads into (least privilege principle).
If you develop using SAM local emulation, those execution roles seem not to be taken into account as long as you run your functions locally; the signed links work in that context even without S3 permissions.

My fetch doesn't upload the JSON string, I can't see the error in my code

I'm using Slim v4 for my REST API, for testing purposes.
I want to POST a JSON data string to my REST API to save some events.
public async callSaveEvent(event: EventList) {
    let url: string = config.basePath + "eventList/saveEventList";
    console.table(JSON.stringify(event));
    await fetch(url, {
        method: 'POST',
        mode: 'no-cors',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ event })
    }).then(response => {
        if (!response.ok) {
            throw new Error("Something is bad");
        }
    }).catch(error => {
        console.error("This is what happened!: ", error);
    });
}
This is my current Code. If I use the fetch.options.mode == "cors", I recieve in Slim that this Method is not allowed. Method is OPTIONS instead of POST. Because of this I using mode == "no-cors".
$param = $req->getParsedBody();
$param_ = $param;
$resp->getBody()->write($param);
return $resp;
}
This is my backend code. When I try to read the parsed body, it's just empty.
If I send a request with Postman, it accepts the data and I get the data in the $param variable.
Can someone spot the error? I can't find it.
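One thing worth noting about the snippet above: with mode: 'no-cors' the browser only sends CORS-safelisted request headers, so the Content-Type: application/json header is silently dropped and the body reaches Slim without a JSON content type, which is consistent with getParsedBody() returning nothing. A minimal client-side sketch with CORS enabled (it assumes the Slim app answers the OPTIONS preflight and has addBodyParsingMiddleware() enabled; the server side is not shown here) could be:

await fetch(url, {
    method: 'POST',
    mode: 'cors',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ event })
});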

How to check whether response JSON of an API is empty or has an error?

I am new to ReactJS and I am stuck on one problem. I am calling an update API which is of PUT type. I use the fetch function to call the API in ReactJS and check the response of the API. If the response is 200 OK, then I return response.json() and check whether the JSON object has an error in it or not. If it has an error, I print the error; otherwise I update.
But when there is no error present in the response, I get a syntax error in the return response.json() statement, and if there actually is an error present in the response, no syntax error is shown. So is there a method to check whether the response is empty or not, so that I can return response.json() accordingly?
I have tried putting a condition like if (response.json() != ''), but it shows an error in the response.json() statement.
fetch(API + name, {
    method: 'PUT',
    headers: {
        'Accept': 'application/json',
        'Content-Type': 'application/json',
        'Authorization': 'Bearer ' + localStorage.getItem('access_token')
    },
    body: JSON.stringify({
        name: name,
        description: updateDesc
    }),
}).then(function(response) {
    if (response.status == '200') {
        flag = true;
        return response.json();
    }
    else {
        flag = false;
    }
})
    .then(json => {
        if (flag) {
            if (json.Error != "") {
                that.createNotification('error', 'update');
            }
            else {
                this.createNotification('success', 'update');
            }
        }
    });
Unhandled Rejection (SyntaxError): Unexpected end of JSON input
There are multiple issues with this imo:
The callback should be refactored to avoid the use of the flag variable. The code in the functions supplied to handlers like then, catch and finally of promises is executed asynchronously, so you cannot be sure (and should not assume) when this value will be assigned and what state your context is in at that time.
In .then(json => { ... }), if there is an error, this will actually be chained on the promise returned by fetch (aka response) and not on the promise returned by response.json(). Currently, return response.json() is only executed in the success case.
Note that this happens (and currently works in the error case) because you can chain promises. You can find more info and examples about this here.
I would refactor the handling of the fetch promise like this:
You can shorten the following example and avoid assigning the promises, but assigning them makes the example more readable.
More information about the response object
const fetchPromise = fetch(<your params>);
fetchPromise.then(response => {
    if (response.ok) {
        // Your request was successful
        const jsonPromise = response.json();
        jsonPromise.then(data => {
            console.log("Successful request, parsed json body", data);
        }).catch(error => {
            // error handling for json parsing errors (empty body etc.)
            console.log("Successful request, could not parse body as json", error);
        });
    } else {
        // Your request was not successful
        /*
          You can check the body of the response here anyway. Maybe your api does return a json error?
        */
    }
}).catch(error => {
    // error handling for fetch errors
});
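If the API can legitimately return an empty body on success, another option is to read the body as text first and only parse it when it is non-empty. This is a sketch that reuses the notification helper and field names from the question; the updateItem wrapper name is just for illustration:

async function updateItem(name: string, updateDesc: string) {
    const response = await fetch(API + name, {
        method: 'PUT',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name, description: updateDesc })
    });
    if (!response.ok) {
        return createNotification('error', 'update');
    }
    const text = await response.text();
    // An empty body counts as success; otherwise parse it and look for an Error field.
    const json = text ? JSON.parse(text) : null;
    if (json && json.Error) {
        createNotification('error', 'update');
    } else {
        createNotification('success', 'update');
    }
}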

InvalidSignatureException from POST request

I have a Lambda function that handles reading data from a file (stored in an S3 bucket) as well as inserting data into a DynamoDB table. This Lambda function is exposed as a REST endpoint using API Gateway. The function accepts GET requests as well as POST requests. I'm making GET/POST requests from my React project using the axios and aws4 (for signing) libraries. The GET request reads data from a file stored in S3 and works just fine. The POST request inserts data into the DynamoDB table. However, it doesn't work and AWS returns an InvalidSignatureException error as a response. This is an excerpt of my code:
createAWSSignedRequest(postData) {
    let request = {};
    if (postData) {
        request = {
            host: process.env.AWS_HOST,
            method: 'POST',
            url: process.env.AWS_URL,
            path: process.env.AWS_PATH,
            headers: {
                'Content-Type': 'application/json'
            },
            body: JSON.stringify(postData)
        }
    } else {
        request = {
            host: process.env.AWS_HOST,
            method: 'GET',
            url: process.env.AWS_URL,
            path: process.env.AWS_PATH
        }
    }
    let signedRequest = aws4.sign(request, {
        secretAccessKey: process.env.AWS_SECRET_KEY,
        accessKeyId: process.env.AWS_ACCESS_KEY
    });
    return signedRequest;
}
This is how the GET request is made:
let signedRequest = this.createAWSSignedRequest('GET');
axios(signedRequest)
    .then(response => {
    })
    .catch((error) => {
        console.log("error", error);
    });
This is how the POST request is made:
const data = {
    uuid: "916b7d90-0137-11e8-94e6-116965754e23", // just a mock value
    date: "22/jan/2018",
    user_response: [
        {
            question: "this is quesiton1",
            choice: "user selected A"
        },
        {
            question: "this is quesiton2",
            choice: "user selected b"
        },
        {
            question: "this is quesiton3",
            choice: "user selected C"
        }
    ]
};
let signedRequest = this.createAWSSignedRequest(data);
axios(signedRequest)
    .then(response => {
        ......
    })
    .catch((error) => {
        console.log("error", error);
    });
As you can see, the code for the GET and POST requests is exactly the same (except for the payload and method type). I'm signing with the same secret access key and access key ID for both requests. I'm not sure why one request results in "InvalidSignatureException" when the other doesn't. Can anyone shed some light on this issue for me?
Thanks
After having a discussion with the aws4 lib developer, I figured out what I did wrong. aws4 uses the "body" attribute of the request as the payload when computing the signature, whereas Axios uses the "data" attribute as the payload. My mistake was setting only one of them. When I set just the "data" attribute, the payload was present in the request and the content length was computed correctly, but the signature was incorrect since the payload was not taken into consideration when computing it. When I set just "body", the payload was not present in the request at all because Axios does not use the "body" attribute for the payload. The solution is to set both attributes to the payload. I hope this helps anyone who is having the same issue I had.
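As a sketch of that fix inside the createAWSSignedRequest method shown above, the payload is assigned to both body (read by aws4.sign when computing the signature) and data (read by axios when sending the request):

const payload = JSON.stringify(postData);
request = {
    host: process.env.AWS_HOST,
    method: 'POST',
    url: process.env.AWS_URL,
    path: process.env.AWS_PATH,
    headers: { 'Content-Type': 'application/json' },
    body: payload,  // used by aws4.sign() for the signature
    data: payload   // used by axios as the request payload
};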
If you use the AWS Amplify library, it has a module called API which should fit your use case, and it will perform SigV4 signing for you with either authenticated or unauthenticated roles. The Auth category uses Cognito as the default implementation. For instance:
npm install aws-amplify --save
Then import and configure the lib:
import Amplify, { API } from 'aws-amplify';

Amplify.configure({
    Auth: {
        identityPoolId: 'XX-XXXX-X:XXXXXXXX-XXXX-1234-abcd-1234567890ab',
        region: 'XX-XXXX-X'
    },
    API: {
        endpoints: [
            {
                name: "APIName",
                endpoint: "https://invokeURI.amazonaws.com"
            }
        ]
    }
});
Then for your API Gateway endpoint calling a Lambda:
let apiName = 'APIName'; // must match the name used in Amplify.configure
let path = '/path';
let options = {
    headers: {...} // OPTIONAL
}
API.get(apiName, path, options).then(response => {
    // Add your code here
});
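Since the failing request in the question is a POST that inserts into DynamoDB, the equivalent call through Amplify's API module would be along these lines (the init object shape is the standard Amplify one; the body content here is copied from the question's mock data, trimmed for brevity):

let initOptions = {
    body: {
        uuid: '916b7d90-0137-11e8-94e6-116965754e23',
        date: '22/jan/2018',
        user_response: []
    }
};
API.post(apiName, path, initOptions).then(response => {
    // Add your code here
});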
More info here: https://github.com/aws/aws-amplify

Authentication in Angular 2, handling the observables

I just started with an Angular 2 project and am trying to get authentication up and running. Inspired by this tutorial, I decided to do the following:
Create a custom RouterOutlet class (extending it) to handle the authentication logic whenever a URL is called.
I succeeded in creating this custom class, but am still not sure how to check whether a user is authenticated. My situation is as follows: I need to make a GET call to an external API; for my development process it looks like this:
getAdmin() {
    let headers = new Headers({ 'Content-Type': 'application/json' });
    let options = new RequestOptions({ headers: headers });
    return this.http.get('http://localhost:3000/admin/is_admin.json', options)
        .map(res => res)
        .catch(this.handleError)
}
This API call returns true or false. I was wondering what would be the best way to use this information. Should I, for example, call the following function each time a URL needs to be checked?
isAdmin() {
    this.getAdmin().subscribe(
        data => this.authenticationResult = data,
        error => console.log("Error: ", error),
        () => { return JSON.parse(this.authenticationResult._data); }
    );
}
I can't get this up and running because my observable is undefined when using the function I gave as an example.
The "problem" is that your method is asynchronous so you need to be careful the way and when you use it.
If you want to use within the activate method of your custom RouterOutlet, you need to leverage observables and reactive programming.
I don't know exactly the way you want to check admin roles:
activate(instruction: ComponentInstruction) {
    return this.userService.getAdmin().flatMap((isAdmin) => {
        if (this.userService.isLoggIn()) {
            if (this._canActivate(instruction.urlPath, isAdmin)) {
                return Observable.fromPromise(super.activate(instruction));
            } else {
                this.router.navigate(['Forbidden']);
                return Observable.throw('Forbidden');
            }
        } else {
            this.router.navigate(['Login']);
            return Observable.throw('Not authenticated');
        }
    }).toPromise();
}

_canActivate(url, admin) {
    return this.publicRoutes.indexOf(url) !== -1
        || this.userService.isLoggedIn();
}
In order to optimize things, you could lazily (and only once) issue the request that checks whether the user is an admin:
isAdmin: boolean;

getAdmin() {
    if (this.isAdmin !== undefined) {
        // Return the cached value without calling the API again
        return Observable.of(this.isAdmin);
    } else {
        let headers = new Headers({ 'Content-Type': 'application/json' });
        let options = new RequestOptions({ headers: headers });
        return this.http.get('http://localhost:3000/admin/is_admin.json', options)
            .map(res => res.json())
            .do(isAdmin => this.isAdmin = isAdmin) // cache for subsequent calls
            .catch(this.handleError);
    }
}
Another approach would be to load this hint when authenticating the user. This way, the implementation of the activate method would be simpler:
activate(instruction: ComponentInstruction) {
    if (this.userService.isLoggIn()) {
        if (this.userService.isAdmin()) {
            return super.activate(instruction);
        } else if (this._canActivate(instruction.urlPath, this.userService.isAdmin())) {
            return super.activate(instruction);
        } else {
            this.router.navigate(['Forbidden']);
        }
    } else {
        this.router.navigate(['Login']);
    }
}

_canActivate(url, admin) {
    return this.publicRoutes.indexOf(url) !== -1
        || this.userService.isLoggedIn();
}
I would consider calling getAdmin() as a first step of your app, storing the result in a SessionService object which you pass around using Dependency Injection. That way, any time you need the result of getAdmin, you can ask the SessionService instance.
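A minimal sketch of such a SessionService (the class and field names here are assumptions for illustration, not something prescribed by the answer) could be:

import { Injectable } from '@angular/core';

@Injectable()
export class SessionService {
    // Populated once, e.g. right after login or during application bootstrap
    isAdmin: boolean = false;
}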
I hope this helps
