.NET / React App not serving /.well-known/filename

I'm trying to set up Apple Pay via Stripe, which requires access to a file to verify domain ownership. The problem I'm having is that this file has no extension, so either React, Azure, or my .NET application doesn't like it!
I've tried various solutions to date within the web.config but to no avail.
The file in question is: /.well-known/apple-developer-merchantid-domain-association
My project is a .NET project running a React SPA. I've added the file mentioned above to the /public folder, too.
I can access .txt files in the same folder, so the path/folder is accessible; it seems to be a problem with the missing extension.

I've managed to resolve this with the following, after ensuring the .well-known folder and its files are in my project root.
// Serve the extensionless verification file from the .well-known folder in the content root
// (requires using Microsoft.Extensions.FileProviders;)
app.UseStaticFiles(new StaticFileOptions
{
    FileProvider = new PhysicalFileProvider(Path.Combine(env.ContentRootPath, ".well-known")),
    RequestPath = new PathString("/.well-known"),
    DefaultContentType = "application/json",
    ServeUnknownFileTypes = true
});
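Depending on how the project is published, the .well-known folder may also need to be included explicitly so it reaches the publish output. A sketch, assuming an SDK-style .csproj (adjust the path to where the file actually lives):

<ItemGroup>
  <!-- copy the extensionless verification file(s) to the publish output -->
  <Content Include=".well-known\**" CopyToPublishDirectory="PreserveNewest" />
</ItemGroup>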

Related

How to access files uploaded to the public folder in Next.js?

I have a Next.js project in which I upload files and share them via a public URL.
With npm run dev I uploaded files to the public folder and it worked fine, but when I switch to npm run start and upload files, the files do land in the public folder, yet the URL http://mydomain/fileuploaded.jpg doesn't show them. It's strange, because the file is there.
I searched the Internet but didn't find a solution to this problem.
From Next.js documentation:
Only assets that are in the public directory at build time will be served by Next.js. Files added at runtime won't be available.
You'll have to persist the uploaded files somewhere else if you want to have access to them in the app at run time.
Alternatively, you could set up your own custom server in Next.js, which would give you more control to serve static files/assets.
You can also achieve something similar using API routes instead. See Next.js serving static files that are not included in the build or source code for details.
A bit late, but in case someone needs the same thing:
If your goal is to upload pictures to and serve them from your Next server, then instead of going through the Next router you can serve the image yourself: create a route /api/images/[id], where [id] is your file name, and send the picture back manually with fs.
something like:
// using fs/promises so that readFile can be awaited
const file = await fs.readFile('./uploads/image.png')
res.setHeader('Content-Type', 'image/png')
res.send(file)
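A fuller sketch of that [id] route might look like the following (the uploads folder, the png-only Content-Type, and the error handling are assumptions for illustration, not part of the original answer):

// pages/api/images/[id].js
import { promises as fs } from 'fs'
import path from 'path'

export default async function handler(req, res) {
  const { id } = req.query
  // resolve inside ./uploads only, to avoid path traversal
  const filePath = path.join(process.cwd(), 'uploads', path.basename(id))
  try {
    const file = await fs.readFile(filePath)
    res.setHeader('Content-Type', 'image/png')
    res.send(file)
  } catch {
    res.status(404).end()
  }
}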
Try using nginx or another web server to serve the public directory. That way it will serve newly added files without your having to write extra code in Next.js.
server {
    location /images/ {
        root /var/www/site/public;
    }
}

Removing the need for pathing in CloudFront distribution of S3 bucket requiring .html at the end of the page name, in Next.js project

I have a Next.js, React, TS project that lives in an S3 bucket as a static site and is distributed via CloudFront.
The problem I'm running into is that to go to a different page I have to append .html to the page name.
So mysite.com/profile returns a <Code>NoSuchKey</Code> error, whereas mysite.com/profile.html routes me correctly.
Is there some way to remove this necessity?
In case this is a Next issue, I'm using
npx next build
npx next export
to build and export the /out directory, which I then upload to my S3 bucket.
My next.config.js:
module.exports = {
  target: "serverless"
}
I had it like this because I was originally using serverless for Next, but I've since moved away from it, as I'm largely using client-side rendering and don't need any of the features it provided; I'm still in the process of cleaning up the project.
Routing in S3 is done by exact match of the object key. You can remove the .html extension from the uploaded objects to get the routing you want, and set the Content-Type metadata to text/html so the pages render properly in the browser.
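For example, a rough sketch of uploading one exported page without the extension using the AWS CLI (the bucket name is a placeholder):

aws s3 cp out/profile.html s3://my-bucket/profile --content-type text/html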

React SPA dynamic environment configuration

I'm building a React SPA from the ground up using create-react-app and setting up the URI of the API server for my SPA. According to the official documentation, the suggested way is to create .env environment files for this kind of need. I'm using continuous delivery as part of my development workflow. After deployment, the React SPA goes into one Docker container and the API into another. Those containers are deployed on separate servers, and I don't know exactly what the API URI will be, so there is no way to create a separate .env file for each deployment. Is there a "right way" to provide dynamic configuration for my SPA so that I can easily change environment parameters?
API URI examples in SPA
// api.config.js
export const uriToApi1 = process.env.REACT_APP_API1_URI;
export const uriToApi2 = process.env.REACT_APP_API2_URI;
// in App.js
import { uriToApi1, uriToApi2 } from '../components/config/api.config.js';
/* More code */
<DataForm apiDataUri={`${uriToApi1}/BasicService/GetData`} />
/* More code */
<DataForm apiDataUri={`${uriToApi2}/ComplexService/UpdateData`} />
Let's imagine that you build your frontend code into some dist folder that gets packed by Docker into the image. You need to create a config folder in your project that will also end up in the dist folder (and, obviously, be packed into the Docker image). In this folder you store config files with server-specific data, and you load these files when your React application starts.
The flow will be like this:
User opens your app.
Your app shows a loader and fetches a config file (e.g. ./config/api-config.json), as in the sketch below.
Then your app reads this config and continues its work.
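A minimal sketch of that bootstrap step (the file path, the global variable, and the entry-point structure are illustrative assumptions):

// index.js - fetch runtime config before rendering the app
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

fetch('/config/api-config.json')
  .then((res) => res.json())
  .then((config) => {
    // expose the runtime config to the rest of the app (context or a module would also work)
    window.runtimeConfig = config;
    ReactDOM.render(<App />, document.getElementById('root'));
  });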
You need to set up Docker volumes in your Docker config and map the config folder inside the container to a config folder on your server. Then you will be able to substitute the config files in the Docker container with the files on your server, which lets you override the config on each server.
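For instance, a docker-compose sketch of that volume mapping (the image name, the paths, and the use of docker-compose itself are assumptions; the answer only says to use Docker volumes):

services:
  spa:
    image: my-spa-image            # hypothetical image serving the built SPA
    ports:
      - "80:80"
    volumes:
      # override the baked-in config folder with a per-server one
      - ./config:/usr/share/nginx/html/config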

Uploading image files to AmazonS3 using ReactJs

I am fairly new to web development (currently enrolled in a bootcamp) and have struggled finding the needed resources to incorporate uploading to Amazon S3 in my project. I apologize for the vagueness ahead of time.
I currently have a React app that is pulling images from my Amazon S3 account, but I intend to give the user the ability to upload to my bucket and use/view the images on my website.
I have tried watching tutorials and looking at various GitHub repos to identify what I am missing, but have been unable to locate a tutorial that involves React, JSX, and JavaScript (I've seen jQuery, PHP, etc.). Ultimately, I know this task is difficult and I am willing to put in the work, but I felt the need to ask if anyone knows of a useful resource that can help me.
I've tried using the 'aws-nodejs-sample' repo and the 'themetoerchef/uploading-with-react' repo, watched a YouTube tutorial, looked into FineUploader, and read the react-s3-uploader npm files, but I'm unable to connect the dots. Additionally, I've included my AWS access keys in my .env file and tried making query strings to access the S3 bucket.
Is there a better way to go about this, or are there other ways to upload with React that may be useful outside of S3?
To upload to S3 from the browser you need to get a signed URL from an AWS SDK, which is how AWS verifies your identity. In my last application I used the SDK for Node.js to generate the signed URL and passed it to my front-end application to use when pushing files to S3. You don't have to go that route, though: there is an SDK that can be used from JavaScript within the browser.
Check this AWS link for more.
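A rough sketch of that flow, assuming aws-sdk v2 on a small Express server plus a fetch call in the React app (the bucket name, route, region, and expiry are placeholders, not from the answer):

// server.js - issue short-lived presigned PUT URLs
const express = require('express');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({ region: 'us-east-1' });   // assumed region
const app = express();

app.get('/api/presign', (req, res) => {
  const params = {
    Bucket: 'my-upload-bucket',                    // hypothetical bucket
    Key: req.query.filename,
    ContentType: req.query.type,
    Expires: 60,                                   // URL valid for 60 seconds
  };
  s3.getSignedUrl('putObject', params, (err, url) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json({ url });
  });
});

app.listen(3001);

// In the React app: ask the server for a presigned URL, then PUT the file straight to S3
async function uploadToS3(file) {
  const res = await fetch(
    `/api/presign?filename=${encodeURIComponent(file.name)}&type=${encodeURIComponent(file.type)}`
  );
  const { url } = await res.json();
  await fetch(url, { method: 'PUT', headers: { 'Content-Type': file.type }, body: file });
}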
Go to your project directory and run
npm install --save react-aws-s3
https://www.npmjs.com/package/react-aws-s3
Then add the code to your component as per the npm documentation:
import S3 from 'react-aws-s3';

const config = {
  bucketName: 'myBucket',
  dirName: 'media', /* optional */
  region: 'eu-west-1',
  accessKeyId: 'JAJHAFJFHJDFJSDHFSDHFJKDSF',
  secretAccessKey: 'jhsdf99845fd98qwed42ebdyeqwd-3r98f373f=qwrq3rfr3rf',
  s3Url: 'https://your-custom-s3-url.com/', /* optional */
};

const ReactS3Client = new S3(config);
/* Notice that if you don't provide a dirName, the file will be automatically uploaded to the root of your bucket */

/* This is optional */
const newFileName = 'test-file';

ReactS3Client
  .uploadFile(file, newFileName)
  .then(data => console.log(data))
  .catch(err => console.error(err));
/**
 * {
 *   Response: {
 *     bucket: "myBucket",
 *     key: "image/test-image.jpg",
 *     location: "https://myBucket.s3.amazonaws.com/media/test-file.jpg"
 *   }
 * }
 */
Now that everything is done, make sure to load your keys and secrets from process env rather than hard-coding them.
NOTE: Please don't forget to add a CORS policy on the AWS bucket if you see a CORS error; see here for a detailed example.
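For example, with create-react-app the keys could come from REACT_APP_-prefixed environment variables (the variable names here are illustrative):

const config = {
  bucketName: process.env.REACT_APP_BUCKET_NAME,
  region: process.env.REACT_APP_REGION,
  accessKeyId: process.env.REACT_APP_ACCESS_KEY_ID,
  secretAccessKey: process.env.REACT_APP_SECRET_ACCESS_KEY,
};

Keep in mind that anything bundled into a client-side build is still visible in the browser.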

Angular JS + Laravel 4: How to compile for production mode?

So I have watched the 5-part YouTube videos by David Mosher about AngularJS (the vids are great, by the way). Part 2 (http://www.youtube.com/watch?v=hqAyiqUs93c) has a practical MySQL database usage, which is almost what I wanted.
I'm going to use AngularJS along with Laravel 4, but I have no idea which files I should upload to the web host later. I'm trying to run the web app under the "/public" folder in my root on localhost (localhost/public/), but the CSS and JS point to the wrong directory (they point to the root: '/css/style.css').
Another method I tried was copying all the files to the root and moving everything inside "public" to the root, then navigating to "localhost/public/". The script paths all work fine, except that it doesn't seem to make any connection to the database (either Laravel or Angular failed).
Is there any proper way to do this for practical use (without using php artisan serve or grunt run or lineman run on the server)? Which files should I upload later?
EDIT: the reason is that my web host doesn't allow me to install nginx or run code remotely using PuTTY, so I need a manual way to do this. Thanks.
First, install the latest Laravel on your localhost; see the docs.
This assumes you have completed the composer install command.
Then move all of your public folder contents to the project root.
Next, change line 21 in index.php from
require __DIR__.'/../bootstrap/autoload.php';
to
require __DIR__.'/bootstrap/autoload.php';
and the line 35 content from
$app = require_once __DIR__.'/../bootstrap/start.php';
to
$app = require_once __DIR__.'/bootstrap/start.php';
Now you can access the project without the public folder.
Place your css, js, and other asset folders in the root, e.g. http://localhost/laravel/css.
Note that Laravel Blade and Angular both use the {{ syntax for compilation, so you need to change the Laravel Blade syntax to {= and =}. Otherwise you will get a conflict.
To do this, open the vendor/laravel/framework/src/Illuminate/View/Compilers/BladeCompiler.php file and change line 45 to this:
protected $contentTags = array('{=', '=}');
and line 52 to this
protected $escapedTags = array('{={', '}=}');
Now you can use {{ for Angular and {= for Blade.
For linking your assets, use the HTML builder functions; see the docs here.
Now use these in Blade:
{= HTML::style('css/style.css') =} // links localhost/project/css/style.css
{= HTML::script('js/jquery.js') =}
Use migrations and DB seeds on localhost and make an exported copy of the DB for the online host.
After completing the project, copy the entire project contents to the online server, change the DB configuration, and import the database.
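In Laravel 4 those DB settings live in app/config/database.php; a minimal sketch of the connection block you would change on the live server (all values are placeholders):

// app/config/database.php
'connections' => array(
    'mysql' => array(
        'driver'    => 'mysql',
        'host'      => 'localhost',
        'database'  => 'your_database',
        'username'  => 'your_user',
        'password'  => 'your_password',
        'charset'   => 'utf8',
        'collation' => 'utf8_unicode_ci',
        'prefix'    => '',
    ),
),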
Directory Structure for Online
On the host there will be a public directory, the web root, where you put your files.
That may be htdocs or public_html, and it now acts as your project's public root. The directory structure will be:
-- app
-- bootstrap
-- css
-- images
-- js
-- vendor
