I have a DC/OS cluster running a local instance of 'universe'. What is the specific procedure for adding custom packages to a local 'universe'? The only documentation I have found so far (which is very limited) relates to adding packages to the global universe repo. While this is great for the DC/OS community, it does not help with maintaining private universes and repos.
The only procedures I found say:
1-Create a fork of the public universe repo: https://github.com/mesosphere/universe
2-Create a custom package and then resubmit it back to the community.
This is not exactly what I was expecting to see. I was hoping for a simple local package creation process. Is there such a thing?
Thanks,
GAOTU
Fork this universe repo and clone the fork:
git clone https://github.com/<user>/universe.git /path/to/universe
Add your package to the repo in the right folder. Write the necessary markup files
(config.json, marathon.json.mustache, resource.json, package.json etc.)
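As a rough sketch (the exact layout depends on the universe version you forked, and new_package is just a placeholder name), a package usually ends up in a first-letter folder with a numeric revision directory:
repo/packages/N/new_package/0/
    config.json                # schema for the user-configurable options
    marathon.json.mustache     # Marathon app definition template
    package.json               # package metadata (name, version, description, tags)
    resource.json              # URIs for assets, images, and CLI resources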
Run the verification and build script to validate and build the Universe artifacts:
scripts/build.sh
This verifies the syntax of the files you added.
Build the Universe Server Docker image:
DOCKER_TAG="my-package" docker/server/build.bash
This will create a Docker image of the local universe (an nginx server) and a marathon.json for starting the universe server.
Run Universe Server
dcos marathon app add marathon.json
Point DC/OS to the local universe server
dcos package repo add --index=0 dev-universe http://universe.marathon.mesos:8085/repo
Install your newly added package in the DC/OS cluster
dcos package install new_package
You want to add the package to your local universe (i.e., not the Mesosphere universe), correct?
In this case, after creating your custom package (and yes, there should be better documentation...) you can add this local/custom universe to a DC/OS cluster: https://dcos.io/docs/1.7/usage/repo/#adding
In general, you don't even have to fork the universe: a package repo is basically a simple folder structure. Check out the universe_builder.py here: it builds a zip file and uploads it to S3, which you can then add as a new package repository as described above (and as output by the script).
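For example (the repo name and the S3 URL below are placeholders), adding the zip the script produced works just like adding the Docker-hosted repo above:
dcos package repo add --index=0 my-dev-universe https://my-bucket.s3.amazonaws.com/universe-my-package.zip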
In general, feel free to contribute and help improve the documentation!
Primarily, I'm trying to integrate a react application (created and built separately) with Drupal.
Problem
Unable to install a private package from Bitbucket using npm install git@bitbucket.org:user/shared-package.git in the Drupal app, because no package.json is found.
Implementation Details
Development Environment
To achieve this in the development environment, I run npm run build, which produces the build output (bundle.js among other files) in the dist directory.
Without going into the details of what the other files are for: to make things work, I just need to copy the bundle.js file and paste it inside a directory under app/web/themes/custom/abc_themes/js/.
Copying a folder from one project and pasting it into another is okay for the development environment. However, for the production environment it's not viable.
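For reference, the manual development step boils down to something like this (using the paths mentioned above):
cp dist/bundle.js app/web/themes/custom/abc_themes/js/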
Production Environment
For production, we thought we would create a private package on Bitbucket: through Bitbucket Pipelines, on every commit we trigger a build and push that build's result into a separate repo (i.e. the private package).
Here is the content that is pushed to the so-called private package. Since it's the entire react application (not a library), the build produces compiled JS and doesn't contain a package.json.
Now if I try to install this package through npm install, I get:
code ENOLOCAL
npm ERR! Could not install from "bitbucket.org:user/shared-package.git" as it does not contain a package.json file.
That is obvious, but to solve it I can't just convert my project into a library, because even if I did, Drupal would still need the built JS file at the specified directory to work.
Expectation
I want to know if there is a way I could install that private package (which doesn't have a package.json) into the Drupal application.
OR any other way to achieve the same result.
NOTE: I know one solution could be to host the build file on a CDN and pull it from there. But the problem is, the Drupal app might be running behind a corporate network and users won't be able to access the internet openly. Therefore, we want to make the react app part of the build process, so that once Drupal is served, the react application is already part of it. No loading at runtime.
I have a fork of Wagtail that I need to install into my Docker container for deployment to production. In dev, I've been using a complicated combination of building the static resources, mounting the git repo into my container, and then running manage.py collectstatic, but that's clearly not going to work in prod.
So I somehow need to do whatever it is that the Wagtail devs do when they package Wagtail for release on PyPI (or something to that effect). I have no experience in this, and thus I haven't got the faintest clue how that might be accomplished.
From the root of your Wagtail git checkout (and assuming the tooling for building the static assets has previously been installed using npm install), run:
python ./setup.py sdist
This will create a .tar.gz package within dist/, which can be installed with pip. For remote deployments, it's usually most convenient to upload this to a public URL somewhere and place that URL in your project's requirements in place of the standard wagtail line.
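For example, if you upload the sdist to your own host (the URL and version below are placeholders), the requirements entry could look like this:
# requirements.txt: replace the standard "wagtail" line with the URL of your fork's sdist
https://example.com/dist/wagtail-2.0.dev1.tar.gz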
I'm using create-react-app for my projects using docker as my dev env.
Now I would like to know the best practice for deploying my project to AWS (I'll deploy the Docker container).
Maybe my question is a dumb one, but I'm really stuck on it.
My Dockerfile has the command yarn start... For dev that is enough: I don't need to build anything, since my bundle runs in memory. But for QA or PROD I would like to build using npm run build, which, as I understand it, creates a new folder with the files that should be used in the prod env.
That said, my question is: what is the best practice for this kind of situation?
Thanks.
This is what I did:
Use npm run build to build all static files.
Use the _/nginx image to customize an HTTP server which serves those static files (Dockerfile; see the sketch after this list).
Upload the customized image to Amazon EC2 Container Service (ECS).
Load the image in an ECS task. Then use ELBv2 to start a load balancer that forwards all outside requests to ECS.
(Optional) Enable HTTPS in ELBv2.
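A minimal Dockerfile for the nginx step above might look roughly like this (assuming the default create-react-app output folder, build/):
# Serve the static files produced by npm run build with nginx
FROM nginx:latest
COPY build/ /usr/share/nginx/html/
# Optionally drop in your own server configuration
# COPY nginx.conf /etc/nginx/conf.d/default.conf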
One time things:
Figure out the mechanism of ECS. You need to create at least one host server for ECS. I used the Amazon ECS-Optimized AMI.
Create a Docker repository on ECS (i.e. ECR) so you can upload your customized Docker image (see the example push commands after this list).
Create ECS task definition(s) for your service.
Create ECS cluster(s) and add task(s).
Configure ELBv2 so it can forward the traffic to your internal ECS dynamic port.
(Optional) Write script to automate everyday deployment.
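As an illustration of uploading the customized image (the account ID, region, and image name below are placeholders), the push typically looks like this:
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest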
I would expect to get paid if someone wanted me to do those things for them. Or you can figure it out by yourself following these clues.
However, if your website is a simple static site, I recommend using GitHub Pages: it's free and simple. My solution is for multiple static + dynamic applications which may involve other services (e.g. Redis, Elasticsearch) and require daily/hourly deployments.
You would have to run npm run build and then copy the resulting files into your container. You could use a separate Dockerfile.build to build the files, extract them and add them to your final container. Your final container should be able to serve the files. You can base it on nginx or another server. You can also use it as a data volume container in your existing server container.
Recent versions of Docker make this process easier by allowing you to combine the two Dockerfiles. You can have a build container and then the final container both be defined in the same file.
Here's a simple example for your use case:
# Build stage: node:onbuild copies the app into /usr/src/app and runs npm install
FROM node:onbuild AS builder
RUN npm run build

# Final stage: serve the compiled files from the build output with nginx
FROM nginx:latest
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
You'd probably want to include your own nginx configuration file.
More on multistage builds here:
https://docs.docker.com/engine/userguide/eng-image/multistage-build/
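Assuming the multistage Dockerfile above sits in your project root, building and running the final image is just (image name and host port chosen arbitrarily):
docker build -t my-react-app .
docker run -d -p 8080:80 my-react-app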
I have been using an Acquia trial account. I have set up the project repository on it. Now I want to clone the staging environment on my local machine, but I am not able to do that, whereas I am able to clone the dev environment successfully as per the mentioned steps.
Can anyone please help with the steps for cloning the Acquia stage environment?
We are developing a project built with the yeoman angular generator. Now the need has appeared to "puppetize" it for deployment.
Obviously the machine serving the client part should be provided with a compiled (minified, optimized) version of the angular project. But I have no idea whether we should store it in our bitbucket repo - for example on the master branch when tagging a new release.
I couldn't find any post about this practice and I could use some help.
There are some facts about the minified angular version:
It is uglified and minified, so the code is unreadable and hard to change.
It requires compilation with a tool like gruntjs, which takes some time to build each time.
It works on the server, but when you choose to deploy a non-minified, revisioned version, you can have other problems when adding new versions to the same repo - scripts have the same names and are cached in the browser, and possibly other problems.
You decided to deploy the compiled version to a client machine.
If you are using version control like git, you can add a folder with the compiled version to the repo, so your repository has sources and dist in the same place. Possibly you also have backend code; sometimes you can add the compiled version to the backend code to host it on the server. It's better to have all code and builds in one repo, so you can do this with one command.
In my case, I wrote scripts in Java to copy the built folder to another folder. We also use Visual Studio for the backend, so I wrote a script that adds the new filenames to a .cs file, so they can be seen by the continuous integration tool.
Finally, create a new release branch in git from the master branch. It is useful to have a copy of your partial work.
I don't know how often you have releases, but you can solve it by having branches in git.
So your branches can look like this:
master
release1
release2
...
Assuming you are doing development on master and copying new versions to releases.
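A sketch of that flow with git and a grunt build (branch name and dist folder are illustrative):
git checkout master
grunt build                 # produce the minified dist/ folder
git checkout -b release2    # snapshot this state as a release branch
git add -f dist/            # force-add in case dist/ is ignored on master
git commit -m "release2: add compiled version"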