We are building an application that so far has a simple user management implementation. This question relates to the built-in password reset functionality of LoopBack v3. User management is implemented on a model derived from the built-in User, called MyCustomUser.
Each time code changes are pushed to the GitHub repo, Jenkins builds a Docker container and, inside it, runs npm install, then lb-sdk (with suitable parameters), then ng build --env=prod, and finally node .. After this, the application runs normally, BUT:
When running the same deployment commands locally (on my own Linux laptop), the API endpoints /MyCustomUsers/reset and /MyCustomUsers/reset-password are both created (i.e. they are visible and usable via the StrongLoop Explorer).
When the deployment is run by Jenkins in the Docker container, only one of the two API endpoints is created, /MyCustomUsers/reset. God only knows where the other endpoint, /MyCustomUsers/reset-password, ends up.
Obviously, all deployments are run against the same codebase (i.e. the same commit ID of the GitHub repo). It is bewildering that the service behaves perfectly on localhost but not in the cloud-based Docker container.
Sounds like you are running two different versions of the LoopBack Angular 2 SDK. From what I've understood, the SDK for Angular 2 is still in heavy beta and not yet ready for production. That doesn't explain the difference by itself, but it really sounds like two different versions are involved.
We are using the same build flow as you. Is your package.json identical when it comes to @mean-expert/loopback-sdk-builder?
The people working on the SDK generator are really good at responding in their issue section; I would recommend asking there otherwise.
It turns out that the remote Docker container was running node 6.9.2 and npm 3.10.9, whereas I was running node 6.10.3 and npm 3.10.10 locally. After making the Docker instance run the same versions as I had locally and deploying package.json along with its npm-shrinkwrap.json, the endpoint was correctly generated.
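For anyone running into the same thing, a rough sketch of the steps involved (the version numbers are just the ones from my case):

# on the machine where the endpoints are generated correctly
node --version        # v6.10.3 in my case
npm --version         # 3.10.10 in my case
npm shrinkwrap        # writes npm-shrinkwrap.json; commit it next to package.json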
I live in Kyiv, Ukraine, and due to the constant terrorist acts of russia I very often have no electricity. However, there is still the desire to learn React. But there is a dilemma: how do I create, view, and deploy a simple React app, because I know that
npm init vite
npm create-react-app
they all need an internet connection to download data etc.
So how can I create a React app locally and, once the internet is restored, publish it?
Probably CodeSandbox?
Really hope the circumstances get better.
You don't need internet to develop in React once you have installed all the npm dependencies. So whenever you have internet, just execute those commands and all the dependencies will be downloaded into a folder called node_modules.
Once that is complete, you can run npm start, which will serve the app locally (without internet) on localhost:3000 and automatically open that URL in a web browser.
But keep in mind that if your app accesses a remote API or data, you would need internet for that.
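In other words, something along these lines (using the npx form of create-react-app; the app name is just an example):

# while online: scaffold the app, which downloads all dependencies into node_modules
npx create-react-app my-offline-app
cd my-offline-app

# from then on, development works offline
npm start             # serves the app at http://localhost:3000 and opens it in the browser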
As soon as you have all the needed npm packages installed, you can develop locally without an internet connection. create-react-app also comes with a preconfigured local web server that your app is served from when you run npm start.
In addition, you can use Verdaccio, which is a proxy/cache for npm. Once correctly configured, it caches all used npm packages on your local machine. Then you can also create new React apps without an internet connection.
Setting up Verdaccio is pretty straightforward.
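A minimal setup could look something like this (4873 is Verdaccio's default port):

# install and start verdaccio (listens on http://localhost:4873 by default)
npm install -g verdaccio
verdaccio &

# point npm at the local registry so every install is cached for offline use
npm config set registry http://localhost:4873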
All the answers I've read are valid, but I want to contribute a bit. I'm the verdaccio maintainer, so feel free to ask me for more details via comments or asynchronously via GitHub discussions.
While you have internet you can use verdaccio (as already recommended) and install packages through the proxy. verdaccio works on demand, so if you run npm create-react-app --registry http://localhost:4873, all packages will end up in the local storage. The storage is just a folder that hosts all the raw packages and can be consumed offline with any package manager. Each package manager has its own local cache, but it is not shareable; verdaccio gives you that. If you suddenly lose internet, the command npm create-react-app --registry http://localhost:4873 should still work without any issue, but if you need new packages (i.e. you modify package.json) you definitely need to wait until you are back online again.
The default behaviour should be good enough for your needs, but there are also plugins that can improve your experience, like:
https://www.npmjs.com/package/verdaccio-offline-storage
If you need to move your storage to another computer, you can use a USB drive: just zip the storage folder (its location differs depending on the OS you are using), move it, and install it again on that computer.
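As a concrete sketch, assuming the storage folder lives under ~/.config/verdaccio on Linux (the exact location depends on your OS and configuration):

# zip the storage folder, move the archive (e.g. via USB), then unpack it on the other machine
tar czf verdaccio-storage.tar.gz -C ~/.config/verdaccio storage
tar xzf verdaccio-storage.tar.gz -C ~/.config/verdaccio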
The recommendation is to always use the proxy for all your projects while you are online and keep caching as much as you can; there is no way to fetch packages in advance at this point, but maybe in the future.
Verdaccio is maintained for many reasons, but the main one is to allow everyone to keep learning Node.js/JavaScript regardless of network access. Hopefully you can get back to normality and circumstances get better; in the meantime, feel free to ask follow-up questions.
I have an AngularJS front-end project that runs on nginx and communicates with a back-end Java server (separate from this codebase). I find myself running the following commands to get the project running:
# make sure node, npm, and gulp are installed
npm install
gulp watch
Should the above be dockerized, or is it preferable to run these projects via the commands? The code will be modified locally as we develop (so we'd probably need to configure a volume that maps to the project's directory).
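For example, such a dev container could be run roughly like this (the node image tag, port, and paths are assumptions, not part of our current setup):

# run npm install and gulp watch inside a container, with the project directory mounted as a volume
docker run --rm -it \
  -v "$(pwd)":/usr/src/app \
  -w /usr/src/app \
  -p 3000:3000 \
  node:6 \
  sh -c "npm install -g gulp && npm install && gulp watch"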
What would be the advantages or disadvantages of dockerizing this vs. just running the commands above to get the project started? The main goal here is to reduce the time it takes for a new developer to get started/comfortable with the project.
Well, the only benefit I can think of right now for dockerizing this application is that someone else could deploy it a little more easily (with the only dependencies being Docker and access to a repository where any built images are stored), i.e. they could simply issue a docker run command referencing the image and build tag, and they'd have a running containerized application.
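For example, assuming the image has been pushed to a repository under a hypothetical name and tag, that could be as simple as:

# image name, tag, and port mapping are placeholders; adjust to however the image exposes the app
docker run -d -p 8080:80 my-registry/angular-frontend:1.0.0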
The other possible benefit I can foresee is portability across systems that are target environments. The only dependency again is Docker.
Then you have the added benefits that come with support for automated image builds and built-in versioning, to name a few.
Also note that you could set up a remote SCM to store code/Dockerfiles and automate builds/deploys, if you would like to move away from localhost development.
If your main goal is to reduce the time it takes for a new developer to get started/comfortable with the project, then the biggest issue you will face is the OS (Windows/Linux use). An alternative solution to Docker would be to use Vagrant.
I am new to working with React and have followed this tutorial here to assist me. Everything runs fine locally following these videos, but after doing an npm run build and then pushing to Azure via a local git repo, the UI runs as expected, yet whenever the UI tries to hit the Express/Node backend, it gets an error that I don't understand how to resolve. Looking at the build scripts that run on both, I do not see where or whether I need to change an environment variable, as it is already hitting the correct port on Azure. What I get is the following:
What do I need to revise for this? Since webpack with the build script in create-react-app seems to do what it needs to, I am not quite sure where things are going wrong.
I have a simple AngularJS application. The backend can be treated as a service (an external API), so no server side is needed at all. I would like to run it in Docker; however, I'm not sure what the best practice is here.
What I'm expecting to achieve is the following:
The Docker setup should be able to run everything I was doing locally with Node.js (using webpack/grunt/gulp) without the need to install anything on my local machine, while making sure every team member is working with the same version of basically everything.
The Docker setup should be easy to deploy to production and run as lightly as possible (it's just static content!).
The real issue is that, as far as I understand, the dev container should be based on Node.js with a mounted volume and everything, whereas the production container should be a super simple nginx server that serves static content. So I might end up with two separate Docker images that use the same code base. I'm not sure if this is the right way to go.
Can anyone shed some light on this topic? Thanks
Your ideas seem OK. I generally create a bash script (for me it's flexible enough) to deploy to different environments according to the requirement (dev & prod).
Assume you have created a bash script deployApp.sh that is invoked as:
sh deployApp.sh dev    # or: sh deployApp.sh prod
Inside it you can create (or switch to) the Dockerfile for your environment on the fly and build your app with that Dockerfile. That way you can handle your prod environment's requirements (e.g. deploy only webpack's built bundles to nginx) as needed.
An example of what deployApp.sh could look like:
#!/bin/sh
ENV="$1"                                   # first parameter: dev or prod
webpack                                    # creates bundle.js etc. (add other required parameters here)
# after the webpack step, pick the Dockerfile for the chosen environment:
# ./prod/Dockerfile or ./dev/Dockerfile
docker build -f "./${ENV}/Dockerfile" .    # the prod Dockerfile builds the nginx-based image and copies the needed files & folders
That is just one approach based on your idea; I use a similar one myself. You only have to create this setup once, and you can apply it to other projects too if it is suitable.
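As a rough illustration (assuming webpack writes its output to dist/; adjust to your actual output directory), the prod Dockerfile referenced above can be as small as an nginx image that copies in the static bundle, and the script could even generate it on the fly:

# sketch only: writes a minimal nginx-based ./prod/Dockerfile
mkdir -p ./prod
cat > ./prod/Dockerfile <<'EOF'
FROM nginx:alpine
COPY dist/ /usr/share/nginx/html/
EOF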
I cannot get my server code to update. I'm running a PHP instance on GAE and no matter what I do, the files won't update. In the source code view, I can see the files have updated, but when I attempt to access the updated file, I'm still viewing the old version. I've also attempted disconnecting my Bitbucket repo and using the appcfg.py update project-name command, but the files aren't refreshing when I attempt to access them. I'm not sure what to do to force the changes to take place.
My app.yaml contains the following code
- url: /(.+\.php)$
script: \1
secure: always
So the files should be getting read, right?
I was able to figure out what went wrong. I downloaded my code using appcfg.py download_app -A <your_app_id> -V <your_app_version> <output-dir> and noticed that I was downloading the old versions of the files (and wasn't downloading the new files). Turns out using source control within GAE will upload new code, but won't deploy it. I attempted to use appcfg.py update project-name one more time, but it didn't work. Turns out I didn't disconnect my Bitbucket account (could have sworn that I did...). Once disconnected, I was able to update the project using appcfg.py update project-name. While I was figuring this out, I reached out to Google support and received this message:
To use the feature of push to deploy you need to spin up the Jenkins instance on GCE (Google Compute Engine), and then it will take the updated code and execute it in the environment. Go through [1] for how to enable the Jenkins instance and its configuration for the different runtimes.

In your issue, you just mirrored the code from Bitbucket to Cloud Repository, and as that only does version control for the application, it does not execute the application. So basically you have the option of using a Jenkins instance as described above to test the different versions of the code, or using the appcfg.py update command from your local repository.
I haven't attempted to install and use Jenkins (since I fixed the issue after disconnecting my Bitbucket account), but it may help others who have run into this problem.