Should an AngularJS + nginx codebase be dockerized?

I have an AngularJS front-end project that runs on nginx and communicates with a back-end Java server (separate from this codebase). I find myself running the following commands to set up the project:
# make sure node, npm, and gulp are installed
npm install
gulp watch
Should the above be dockerized, or is it preferred to run these projects via the commands directly? The code will be modified locally as we develop (so we'd probably need to configure a volume that maps to the project's directory).
What would be the advantages or disadvantages of dockerizing the above vs. just running the above commands to get the project started? The main goal here is to reduce the time it takes for a new developer to get started/comfortable with the project.

The main benefit I can think of right now for dockerizing this application is that it lets someone else deploy it a little more easily, with the only dependencies being Docker and access to a registry where the built images are stored. That is, they could simply issue a docker run command referencing the application image and tag, and they'd have a running containerized application.
The other possible benefit I can foresee is portability across target environments; again, the only dependency is Docker.
Then you have the added benefits of automatic container builds and built-in versioning, to name a few.
Also note that you could set up a remote SCM to store the code and Dockerfiles, and automate builds and deploys, if you'd like to move away from local-host development.
If your main goal is to reduce the time it takes for a new developer to get started and comfortable with the project, then the biggest issue you will face is OS differences (Windows vs. Linux). An alternative solution to Docker would be Vagrant.
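As a rough sketch of what the dev image described in the question could look like (the paths, base image tag, and entry point are assumptions; the source tree is bind-mounted at run time so local edits are picked up):

```dockerfile
# Dev image: only the node toolchain lives in the image;
# the project directory is mounted as a volume at run time.
FROM node:18
WORKDIR /app
# install dependencies from the package manifest first, for layer caching
COPY package.json ./
RUN npm install && npm install -g gulp
# rebuild on change
CMD ["gulp", "watch"]
```

It would then be run with something like `docker run -v "$(pwd)":/app angular-dev` so the container sees the live working copy.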

How to build React.js apps in Visual Studio Code?

I have created two apps using Visual Studio Code and Node.js. I run them with the command npm start, and they show in the browser. I want to build or deploy them so they can be used by anyone. The documentation says to use the command npm run build. How do I do that, and what technique do you use to build them?
It depends on what configuration you used for building the React app. If you used create-react-app, npm run build is the correct command for building it.
If you used a different configuration (e.g. webpack), you should use the relevant command for that configuration.
Either way, deploying it will be as easy as copying the build folder's contents to the server where you want to host it, after running the build command.
Visual Studio Code or any other Code Editor for that matter is not relevant. You can develop, build and deploy any React app using any Code Editor you want, it's just a matter of preference.
"Building" refers to the task of preparing (transforming, minifying, compressing, etc.) all the relevant project files so that they're ready for production (assuming that your build scripts are configured to do so).
"Deploying" an app is usually a separate task that will deploy (upload) your current project build to a development platform provider like Firebase, Netlify, Azure, etc. Note that you have to register with a provider and setup a new project on their end before your deploy your project.
Which provider you use is totally up to you. You'll also have to configure your current project once you've chosen a provider; they'll provide instructions on how to deploy your project.
On a side note, keep in mind that you can configure your own npm scripts so that they run whatever you want. More about that here
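For reference, a minimal sketch of the scripts block that create-react-app generates (these names are the CRA defaults; anything beyond them would be a custom addition):

```json
{
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test"
  }
}
```

Running `npm run build` here just invokes `react-scripts build`, which writes the production bundle to the build folder mentioned above.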

create-react-app + docker = QA and PROD Deploy

I'm using create-react-app for my projects using docker as my dev env.
Now I would like to know what the best practice is for deploying my project to AWS (I'll deploy the Docker image).
Maybe this is a dumb question, but I'm really stuck on it.
My Dockerfile has the command yarn start. For dev that's enough; I don't need to build anything, since my bundle runs in memory. But for QA or PROD I would like to build using npm run build, which as I understand it creates a new folder with the files that should be used in the prod environment.
That said, my question is: what is the best practice for this kind of situation?
Thanks.
This is what I did:
Use npm run build to build all static files.
Use _/nginx image to customize an HTTP server which serves those static files. (Dockerfile)
Upload the customized image to Amazon EC2 Container Service (ECS).
Load the image in an ECS task, then use ELBv2 to start a load balancer that forwards all outside requests to ECS.
(Optional) Enable HTTPS in ELBv2.
One time things:
Figure out the mechanics of ECS. You need to create at least one host server for ECS; I used the Amazon ECS-Optimized AMI.
Create a Docker repository (ECR) so you can upload your customized Docker image.
Create ECS task definition(s) for your service.
Create ECS cluster(s) and add task(s).
Configure ELBv2 so it can forward the traffic to your internal ECS dynamic port.
(Optional) Write script to automate everyday deployment.
I would get paid if someone wants me to do those things for them, or you can figure it out yourself by following these clues.
However, if your website is a simple static site, I recommend using GitHub Pages: it's free and simple. My solution is for multiple static and dynamic applications which may involve other services (e.g. Redis, Elasticsearch) and require daily/hourly deployments.
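The image build-and-push part of the steps above can be sketched as follows (the account ID, region, and repository name are placeholders; the login flow assumes AWS CLI v2):

```shell
# build the customized nginx image from the Dockerfile
docker build -t my-app .

# authenticate Docker against the ECR registry (AWS CLI v2)
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# tag and push the image so ECS tasks can pull it
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
```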
You would have to run npm run build and then copy the resulting files into your container. You could use a separate Dockerfile.build to build the files, extract them and add them to your final container. Your final container should be able to serve the files. You can base it on nginx or another server. You can also use it as a data volume container in your existing server container.
Recent versions of Docker make this process easier by allowing you to combine the two Dockerfiles: the build container and the final container can both be defined in the same file.
Here's a simple example for your use case:
FROM node:latest AS builder
WORKDIR /usr/src/app
COPY . .
RUN npm install && npm run build
FROM nginx:latest
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
You'd probably want to include your own nginx configuration file.
More on multistage builds here:
https://docs.docker.com/engine/userguide/eng-image/multistage-build/
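If you do add your own nginx configuration (as suggested above), a minimal sketch for serving a single-page app could look like this (the root path matches the COPY destination in the example; server_name and the fallback rule are assumptions for client-side routing):

```nginx
server {
    listen 80;
    server_name _;
    root /usr/share/nginx/html;
    # serve files directly, falling back to index.html
    # so client-side routes still resolve
    location / {
        try_files $uri $uri/ /index.html;
    }
}
```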

Loopback 3 & Angular 2 generation of /reset-password endpoint unreliable

We are building an application that so far has a simple user management implementation. This question relates to the built-in password-reset functionality of Loopback v3. User management is implemented in a model derived from the built-in User, called MyCustomUser.
Each time code changes are pushed to a GitHub repo, we have Jenkins build a Docker container and, inside of it, run npm install, then lb-sdk (with suitable parameters), then ng build --env=prod, and finally node .. After this happens, the application runs normally, BUT:
When performing the same deployment commands locally (on my own linux laptop), the API endpoints /MyCustomUsers/reset and /MyCustomUsers/reset-password are created (i.e. they are visible and manipulable via the Strongloop Explorer)
When the deployment is run by Jenkins in the Docker container, only one of the two API endpoints is created, /MyCustomUsers/reset. God only knows where the other endpoint, /MyCustomUsers/reset-password, ends up.
Obviously, all deployments are run against the same codebase (i.e. the same commit ID of the GitHub repo). It is bewildering how the service behaves perfectly on localhost but not on the cloud-based docker container.
It sounds like you are running two different versions of the Loopback-Angular2-SDK. From what I've understood, the SDK for Angular 2 is still in heavy beta and not yet ready for production. That doesn't explain the difference by itself, but it really sounds like two different versions.
We are using the same build flow as you. Are your package.json files identical when it comes to @mean-expert/loopback-sdk-builder?
The guys working on the SDK generator are really responsive in their issues section; I'd recommend asking there otherwise.
It turns out that the remote docker was running node 6.9.2 and npm 3.10.9, whereas I was running node 6.10.3 and npm 3.10.10. After making the docker instance run the same versions as I had locally and deploying the package.json along with its npm-shrinkwrap.json, the endpoint was correctly generated.
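To help catch this kind of drift earlier, the expected versions can be pinned in package.json (the engines field is advisory unless engine-strict is enabled; the versions below are the ones from this answer):

```json
{
  "engines": {
    "node": "6.10.3",
    "npm": "3.10.10"
  }
}
```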

angular app with docker - production & development

I have a simple AngularJS application. The backend can be treated as a service (external API), so no server side is needed at all. I would like to run it in Docker; however, I'm not sure what the best practice is here.
What I'm expecting to achieve is the following:
The Docker setup should be able to run everything I was doing locally with Node.js (webpack/grunt/gulp) without the need to install anything on my local machine, while making sure every team member is working on the same version of basically everything.
The Docker image should be easy to deploy to production and run as lightly as possible (it's just static content!).
The real issue is that, as far as I understand, the dev image should be based on Node.js with a mounted volume and everything, whereas the production image should be a super-simple nginx server that serves static content. So I might end up with two separate images that use the same codebase. Not sure if this is the right way to go.
Can anyone shed some light on this topic? Thanks.
Your idea seems OK. I generally create a bash script (for me it's flexible enough) to deploy different environments (dev and prod) according to the requirement.
Assuming you've created a bash script deployApp.sh:
sh deployApp.sh `{dev or prod}`
This way you can also create (or switch) the Dockerfile on the fly according to your environment and build your app with it, so you can manage your prod environment's requirements (e.g. only deploying webpack's created bundles to nginx) as needed.
An example of deployApp.sh:
#!/bin/sh
webpack   # creates bundle.js etc.
# after the webpack operations, choose the Dockerfile for prod or dev:
# ./prod/Dockerfile , ./dev/Dockerfile
if [ "$1" = "prod" ]; then
  # nginx-based image that copies the needed files and folders
  docker build -f ./prod/Dockerfile .
else
  docker build -f ./dev/Dockerfile .
fi
This is just one approach along the lines of your idea; I use a similar one. You set this up once, and you can apply it to other projects if it's suitable.

Yeoman angular grunt-serve vs http-serve

I've just used Yeoman to create an Angular project that looks great when I run grunt serve. But then I decided to view it by running http-server, and the page gets displayed without the formatting and without the images. Does anyone know why that is and if I'll run into this issue when I push it up to my web hosting server?
I discovered that I had to run grunt to build the project, which fixes the references and places an uglified distribution version of the project in a dist folder. This ran just fine on my other server.
"Does anyone know why that is and if I'll run into this issue when I push it up to my web hosting server?"
Yes, you will run into this problem on your web hosting server.
grunt serve serves the app using the setup on your local machine.
http-server mimics how a real web hosting server would evaluate your references.
My development routine is to use grunt serve until I have a working version, then use http-server to test it and see whether it would work before I push it to my web hosting server. As @cdavid mentioned, running grunt build and serving from your dist directory should be sufficient for general dependency issues.
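That routine can be sketched as follows (assuming the standard Yeoman Angular layout, with the built site written to dist/):

```shell
grunt serve            # develop against the local setup, with live reload
grunt build            # fix references and write the minified site to dist/
cd dist
http-server -p 8080    # check the built references as a real host would serve them
```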
