Authentication to serve static files on Next.js?

So, I looked for a few authentication options for Next.js that wouldn't require any work on the server side of things. My goal was to block users from entering the website without a password.
I've set up a few tests with NextAuth (after a few other tries) and apparently I can block pages with sessions and cookies, but after a few hours of research I still can't figure out how to block assets (e.g. /image.png served from the /public folder) from unauthenticated requests.
Is that even possible without a custom server? Am I missing some core understanding here?
Thanks in advance.

I stumbled upon this problem too. It took me a while, but I figured it out in the end.
As you said, for auth you can use whatever you like, such as NextAuth.
And for file serving: I set up a new API endpoint and used Node.js to read the file and pipe it into the response. It's pretty similar to what you would do in Express. Don't forget to set the proper headers on your response.
Here is a little snippet to demonstrate (TypeScript version):
import { NextApiRequest, NextApiResponse } from 'next'
import { stat } from 'fs/promises'
import { createReadStream, existsSync } from 'fs'
import path from 'path'
import mime from 'mime'

// basic Next.js API route
export default async function getFile(req: NextApiRequest, res: NextApiResponse) {
  // Don't forget to auth first!

  // For this I created a "private" folder in the project root (at the same level as the
  // normal Next.js "public" folder), and "somefile.png" is inside it.
  const someFilePath = path.resolve('./private/somefile.png');

  // If the file is not located in the specified folder, stop and end with a 404.
  if (!existsSync(someFilePath)) return res.status(404).end();

  // Create a read stream from the path; it is now ready to serve to the client.
  const file = createReadStream(someFilePath);

  // Set cache headers so it's cached properly. Not strictly necessary.
  // The 'private' part means it should only be cached by an individual user's browser cache
  // (it is intended for a single user), not by a shared cache. More about this in
  // https://stackoverflow.com/questions/12908766/what-is-cache-control-private#answer-49637255
  res.setHeader('Cache-Control', `private, max-age=5000`);

  // Set the size header so the browser knows how large the file really is.
  // I'm using the native fs/promises stat here since there's nothing special about it - no need for external packages.
  const stats = await stat(someFilePath);
  res.setHeader('Content-Length', stats.size);

  // Set the mime type, in case the browser can't determine what file it's getting.
  // You can get the mime type in lots of ways, but this works, so yay.
  const mimetype = mime.getType(someFilePath) || 'application/octet-stream';
  res.setHeader('Content-Type', mimetype);

  // Pipe it to the client via the "res" stream we were given.
  file.pipe(res);
}
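For the auth check mentioned in the comment above, a minimal sketch with NextAuth might look like this (assuming the getSession({ req }) helper; the exact import path depends on your NextAuth version, e.g. 'next-auth/client' or 'next-auth/react'):

import { getSession } from 'next-auth/client'

export default async function getFile(req: NextApiRequest, res: NextApiResponse) {
  // Refuse to serve the file to unauthenticated requests.
  const session = await getSession({ req });
  if (!session) return res.status(401).end();
  // ...continue with the file streaming shown above.
}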
Cheers

Related

Is there a way to rename the automatically generated routes JSON file in Next.js?

I have a problem: when I navigate to the /analytics page on my site, ad blockers block the analytics.json file that Next.js requests because they think it's an analytics tracker (it's not; it's a page listing analytics products).
Is there a way to rename the route files Next.js uses when navigating to server-side rendered pages on the client-side?
I want to either obfuscate the names so they're not machine readable, or have a way to rename them all.
Any help appreciated.
With thanks to @gaston-flores I've managed to get something working.
In my instance /analytics is a dynamic page for a category, so I moved my pages/[category]/index.tsx file to pages/[category]/category.tsx and added the following rewrite:
// next.config.js
module.exports = {
  async rewrites() {
    return [
      {
        source: "/:category",
        destination: "/:category/category",
      },
    ];
  },
};
This now requests the category.json file rather than analytics.json, which passes the ad blockers' checks and renders as expected.
Note that because I also had a dynamic file name in the pages/[category] directory (pages/[category]/[product].tsx), I had to move it to pages/[category]/product/[product].tsx; without this tweak the /analytics page was redirected to /analytics/category for some reason. The resulting layout is sketched below.
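For clarity, a sketch of the resulting pages directory based on the description above:

pages/
  [category]/
    category.tsx        // was [category]/index.tsx, now targeted by the rewrite
    product/
      [product].tsx     // moved from [category]/[product].tsx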

React Load Binary File / URL scheme "file" is not supported

Background
I built an app which converts files from type A to type B (a binary file). I want to import and use a dummy file of type B to fill in the data of file type A. The dummy always stays the same. The app has no backend. I want to share the HTML, so anything that requires turning off browser security, etc., isn't an option.
Problem
At the moment I load the file the way I found here, but this only works with a backend server:
Requesting blob images and transforming to base64 with fetch API
import dummy from '../templates/Grid2.shp';

let hex = await fetch(dummy)
  .then(response => response.blob())
  .then(blob => new Promise(callback => {
    let reader = new FileReader();
    reader.onload = function () {
      const serumShp = atob(this.result.substring(37)); // 37 strips the base64 info data:...
      callback(binaryToHex(serumShp));
    };
    reader.readAsDataURL(blob);
  }));
It works in my development environment but not in the built app, since the browser then requests the file from the filesystem.
I found a solution using a file loader, but this solution also throws an error:
Using file-loader to load binary file in react
import/no-webpack-loader-syntax
Also, I don't see any configuration files for webpack. As far as I have seen, I would need to eject to get them, which is also not recommended.
Question:
How can I import binary files into my app without a backend server/any changes, etc.?
Sorry, I cannot help beyond pointing out that there is a general discussion in CRA about supporting a more elegant way of importing binary/raw data. Sadly there doesn't seem to be much progress; the proposal is from 2018.

Next.js: How can we have dynamic routing redirect to static pages?

Using Next.js, I currently have an app with a single entry point in the form of /pages/[...slug]/index.ts.
It contains a getServerSideProps function which analyses the slug and decides whether to redirect.
In some cases a redirection is needed, but it will always be towards a page that can be statically rendered. Example: redirect /fr/uid to /fr/blog/uid, which can be static.
In other cases the slug is already the URL of a page that can be static.
How can I mix this dynamic element with a static generation of all pages?
Thanks a lot for your help!
If I understood your problem correctly, you cannot use getServerSideProps if you are going to export a static site.
You have two solutions:
Configure your redirection rules in your web hosting solution (e.g. Amazon S3/CloudFront).
Create client-side redirects (when _app.tsx mounts you can check whether router.asPath matches any of the redirections you would like to have configured); see the sketch below.
Please remember that the first solution is better for SEO purposes (it returns real 301 redirects from the server instead of redirecting in the browser).
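A minimal sketch of the second option (the redirect table is hypothetical; adjust the paths to your own routes):

// pages/_app.tsx
import { useEffect } from 'react';
import { useRouter } from 'next/router';
import type { AppProps } from 'next/app';

// Hypothetical client-side redirect table.
const redirects: Record<string, string> = {
  '/fr/uid': '/fr/blog/uid',
};

export default function MyApp({ Component, pageProps }: AppProps) {
  const router = useRouter();

  useEffect(() => {
    const destination = redirects[router.asPath];
    if (destination) router.replace(destination);
  }, [router.asPath]);

  return <Component {...pageProps} />;
}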
EDIT: @juliomalves rightly pointed out that the OP is looking at two different things: redirection and hybrid builds.
However, the question should be clarified a bit more to really be able to solve the problem.
Because you will need to host a web server for SSR anyway, you can leverage Next.js 9.5's built-in redirect support to get permanent server-side redirects, as sketched below.
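A minimal sketch of such a redirect in next.config.js, reusing the /fr/uid → /fr/blog/uid example from the question (the exact path pattern is an assumption):

// next.config.js
module.exports = {
  async redirects() {
    return [
      {
        source: "/fr/:uid",
        destination: "/fr/blog/:uid",
        permanent: true, // emits a permanent (308) redirect
      },
    ];
  },
};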
When it comes to SSR vs SSG, Next.js allows you to adopt a hybrid approach, letting you choose the data fetching strategy on a per-page basis.
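For the static side of that hybrid approach, a minimal sketch of a statically generated page, assuming a hypothetical pages/fr/blog/[uid].tsx (the path and data source are illustrative only):

// pages/fr/blog/[uid].tsx
import type { GetStaticPaths, GetStaticProps } from 'next';

export const getStaticPaths: GetStaticPaths = async () => ({
  // Pre-render the pages you know about; generate the rest on demand.
  paths: [{ params: { uid: 'my-first-post' } }],
  fallback: 'blocking',
});

export const getStaticProps: GetStaticProps = async ({ params }) => {
  // Fetch the page data however you normally would (CMS, filesystem, etc.).
  return { props: { uid: params?.uid } };
};

export default function BlogPost({ uid }: { uid: string }) {
  return <h1>{uid}</h1>;
}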
In case you are using AWS CloudFront, then you can redirect with CloudFront Functions.
CloudFront Functions is ideal for lightweight, short-running functions for use cases like the following:
URL redirects or rewrites – You can redirect viewers to other pages based on information in the request, or rewrite all requests from one path to another.
Here is what we are using to redirect clients (e.g. a native app, the Google search index, etc.) to a new location when a Next.js page was moved or removed.
// NOTE: Choose "viewer request" for the event trigger when you associate this function with a CloudFront distribution.
function makeRedirectResponse(location) {
  var response = {
    statusCode: 301,
    statusDescription: 'Moved Permanently',
    headers: {
      'location': { value: location }
    }
  };
  return response;
}

function handler(event) {
  var mappings = [
    { from: "/products/decode/app.html", to: '/products/decode.html' },
    { from: "/products/decode/privacy/2021_01_25.html", to: '/products/decode/privacy.html' }
  ];

  var request = event.request;
  var uri = request.uri;

  for (var i = 0; i < mappings.length; i++) {
    var mapping = mappings[i];
    if (mapping.from === uri) {
      return makeRedirectResponse(mapping.to);
    }
  }
  return request;
}

Upload images to Azure blob from front end (React)

The front end enables people to upload their photos, so initially I was sending the base64 to the server and working with it there, but there are problems with a firewall that blocks requests containing base64. As an alternative I am trying to upload the image to Azure blob storage, get the file name, and then send that to the server for processing, where I generate a SAS token for blob validation and processing.
This works perfectly fine when I work locally; the front end uses @azure/storage-blob and uploadBrowserData(), with the arrayBuffer as the param:
export const uploadSelfieToBlob = async arrayBuffer => {
  try {
    const blobURL = `https://${accountName}.blob.core.windows.net${sasString}`;
    const blobServiceClient = new BlobServiceClient(blobURL, anonymousCredential);
    const containerClient = blobServiceClient.getContainerClient(containerName);

    let randomString = Math.random().toString(36).substring(7);
    const blobName = `${randomString}_${new Date().getTime()}.jpg`;
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);

    const uploadBlobResponse = await blockBlobClient.uploadBrowserData(arrayBuffer);
    return { blobName, blobId: uploadBlobResponse.requestId };
  } catch (error) {
    console.log('error when uploading to blob', error);
    throw new Error('Error Uploading the selfie to blob');
  }
};
When I deploy, this does not work; the front end is deployed in the EastUS2 location and the local development location is different.
I thought the sasString generated for anonymous access had a timezone component, so I generated two different ones, one for local and one for the hosted server, with the same location selected.
Failed to send request to https://xxxx.blob.core.windows.net/contanainer-name/26pcie_1582087489288.jpg?sv=2019-02-02&ss=b&srt=c&sp=rwdlac&se=2023-09-11T07:57:29Z&st=2020-02-18T00:57:29Z&spr=https&sig=9IWhXo5i%2B951%2F8%2BTDqIY5MRXbumQasOnY4%2Bju%2BqF3gw%3D
What am I missing? Any lead would be helpful, thanks.
First, as mentioned in the comments, there was an issue with the CORS settings, which is why you're getting the initial error.
AuthorizationResourceTypeMismatch: This request is not authorized to perform this operation using this resource type.
RequestId:7ec96c83-101e-0001-4ef1-e63864000000
Time:2020-02-19T06:57:31.2867563Z
I looked up this error code here and then closely looked at your SAS URL.
One thing I noticed in your SAS URL is that you have set the signed resource type (srt) to c (container) while trying to upload a blob. If you look at the description of the kinds of operations you can perform with srt=c here, you will notice that blob-related operations are not supported.
In order to perform blob related operations (like blob upload), you would need to set signed resource type value to o (for object).
Please regenerate your SAS token and include the signed resource type object (you can also include container and/or service in there as well), and then your request should work. So essentially the srt in your SAS URL should be something like srt=o, srt=co or srt=sco. A sketch of generating such a token is below.
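For reference, a minimal sketch of generating such an account SAS on the server with @azure/storage-blob (the account name, key and lifetime are placeholders, not values from the question):

const {
  StorageSharedKeyCredential,
  generateAccountSASQueryParameters,
  AccountSASPermissions,
  AccountSASServices,
  AccountSASResourceTypes,
} = require('@azure/storage-blob');

const credential = new StorageSharedKeyCredential(accountName, accountKey);

// Note that resourceTypes includes "o" (object), so blob-level operations such as upload are allowed.
const sasString = '?' + generateAccountSASQueryParameters(
  {
    permissions: AccountSASPermissions.parse('rwdlac'),
    services: AccountSASServices.parse('b').toString(),
    resourceTypes: AccountSASResourceTypes.parse('co').toString(),
    expiresOn: new Date(Date.now() + 60 * 60 * 1000), // valid for 1 hour
  },
  credential
).toString();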
I couldn't spot anything wrong with the code you mentioned, but I have been using a different method to upload files to Azure Blob Storage from React; it is exactly the same as in this blog article, which works perfectly for me.
https://medium.com/@stuarttottle/upload-to-azure-blob-storage-with-react-34f37805fdfc

How to encode/compress gltf to draco

I want to compress/encode a glTF file using Draco programmatically in three.js with React. I don't want to use any command-line tool; I want it to be done programmatically. Please suggest a solution.
I tried using gltf-pipeline but it's not working on the client side. The Cesium library was showing an error when I used it in React.
Yes, this can be implemented with glTF-Transform. There's also an open feature request on three.js, not yet implemented.
First you'll need to download the Draco encoder/decoder libraries (the versions currently published to NPM do not work client side), host them in a folder, and then load them as global script tags. There should be six files, and two script tags (which will load the remaining files).
Files:
draco_decoder.js
draco_decoder.wasm
draco_wasm_wrapper.js
draco_encoder.js
draco_encoder.wasm
draco_encoder_wrapper.js
<script src="assets/draco_encoder.js"></script>
<script src="assets/draco_decoder.js"></script>
Then you'll need to write code to load a GLB file, apply compression, and do something with the compressed result. This will require first installing the two packages shown below, and then bundling the web application with your tool of choice (I used https://www.snowpack.dev/ here).
import { WebIO } from '@gltf-transform/core';
import { DracoMeshCompression } from '@gltf-transform/extensions';

const io = new WebIO()
  .registerExtensions([DracoMeshCompression])
  .registerDependencies({
    'draco3d.encoder': await new DracoEncoderModule(),
    'draco3d.decoder': await new DracoDecoderModule(),
  });

// Load an uncompressed GLB file.
const document = await io.read('./assets/Duck.glb');

// Configure compression settings.
document.createExtension(DracoMeshCompression)
  .setRequired(true)
  .setEncoderOptions({
    method: DracoMeshCompression.EncoderMethod.EDGEBREAKER,
    encodeSpeed: 5,
    decodeSpeed: 5,
  });

// Create compressed GLB, in an ArrayBuffer.
const arrayBuffer = io.writeBinary(document); // ArrayBuffer
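To "do something with the compressed result", a minimal sketch of offering it as a download in the browser (the file name is arbitrary; note that the document variable above is the glTF Document, so the DOM is reached through window.document):

const blob = new Blob([arrayBuffer], { type: 'model/gltf-binary' });
const url = URL.createObjectURL(blob);

// Create a temporary link and click it to trigger the download.
const link = window.document.createElement('a');
link.href = url;
link.download = 'compressed.glb';
link.click();

URL.revokeObjectURL(url);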
In the latest version, 1.5.0, what are these two files, draco_decoder_gltf.js and draco_encoder_gltf.js? Does this mean we no longer need the draco_encoder and draco_decoder files? And how do we invoke the transcoder interface without using MeshBuilder? A simpler API would be much better.
