I need some help identifying the right solution for creating seamless documentation for my friends.
We have several repositories, each of which will contain a docs folder with several .MD files:
Repo1
|- Readme.MD
|- docs
   |- Installation.MD
   |- Usage.MD
Repo2
|- Readme.MD
|- docs
   |- Installation.MD
   |- Usage.MD
Repo3
|- Readme.MD
|- docs
   |- Installation.MD
   |- Usage.MD
We would like to use something like VuePress to generate a static site.
If there is any tool/framework that can easily solve this, I would be grateful.
Thanks a lot for any response; we will definitely post below what we have done.
You can accomplish this using MkDocs as your static site generator and the multirepo plugin. Below are the steps to get it all set up. I assume you have Python installed and that you have created a Python venv.
python -m pip install git+https://github.com/jdoiro3/mkdocs-multirepo-plugin
mkdocs new my-project
cd my-project
Add the following to your newly created mkdocs.yml to configure the plugin:
plugins:
  - multirepo:
      repos:
        - section: Repo1
          import_url: {Repo1 url}
        - section: Repo2
          import_url: {Repo2 url}
        - section: Repo3
          import_url: {Repo3 url}
Now you can run mkdocs serve or mkdocs build, which will build a single static site containing all the documentation.
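For reference, a minimal end-to-end sketch of the workflow above (the virtual environment name .venv is an assumption; the plugin should pull in mkdocs as a dependency, otherwise install mkdocs explicitly):
# create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# install the plugin, scaffold the site, add the plugin configuration
# shown above to mkdocs.yml, then preview or build
python -m pip install git+https://github.com/jdoiro3/mkdocs-multirepo-plugin
mkdocs new my-project
cd my-project
mkdocs serve   # live preview at http://127.0.0.1:8000
mkdocs build   # writes the combined static site to ./site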
Here is an idea:
Initialize a repository; let's call it TheRepo.
Add all the dependency repos (in our case Repo1, Repo2, Repo3) as git submodules to TheRepo.
Now we have everything from Repo1, Repo2 and Repo3 in TheRepo, which I assume might be undesirable for a documentation website.
Create a bash script (using find, grep, rm or similar bash charm) to retain the desired *.md files and remove the unwanted source files. Here is a sample of how to do it:
# remove everything that is not a markdown file
find . -type f ! -name '*.md' -delete
Initialize VuePress at TheRepo's root and let VuePress generate the consolidated documentation.
If you do a good job at step 3, you can have a separate section/classification for each individual dependency repo in the resulting documentation.
To refresh the contents, simply run git submodule update to update the documentation and then chain it with the script created in step 3 to automate the refresh process (a fuller sketch of such a script follows after the command below).
git submodule update --remote && ./remove-undesirable-files.sh
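A minimal sketch of what remove-undesirable-files.sh could look like (the kept extensions and the hard-coded submodule names are assumptions; adjust them to your layout):
#!/usr/bin/env bash
# Keep only markdown files inside the submodule checkouts, then drop the
# directories left empty so VuePress sees a clean docs tree. The submodules'
# .git entries are preserved so the refresh command above keeps working.
set -euo pipefail

for repo in Repo1 Repo2 Repo3; do
  # delete every regular file that is not markdown (and is not git metadata)
  find "$repo" -type f -not -path '*/.git/*' ! -name '.git' \
       ! -name '*.md' ! -name '*.MD' -delete
  # remove directories left empty by the deletion above
  find "$repo" -type d -empty -delete
done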
I am working on a project that uses both React and Spring Boot apps, so I have individual folders for each of those and am trying to get them both into my GitHub repo. I was easily able to drag and drop the Spring Boot folder, but my React one does not upload at all when I drag it into the upload box. Is there an easy terminal command in VS Code (the editor I'm using) to add the entire folder?
Since you want to add your 2 different projects in 1 repository, you can first put both of your project folders inside 1 main folder, for example:
MyFolder:
- MyReactProject
- MySpringbootProject
Here MyFolder is your main folder, which contains both of your projects, React and Spring Boot.
Then make a file inside MyFolder named .gitignore, and put this line inside it:
**/node_modules
What this file does is: when you push your code to GitHub, it will ignore all the files and folders listed inside the .gitignore file.
You don't need the node_modules folder
When uploading your code somewhere online, you don't need to upload the node_modules folder, because it only contains the dependencies your project needs. People who download your code can restore them by running npm install, which reads the required modules from your package.json file and downloads them onto their machine.
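For example, someone who clones the repository later would restore the React app's dependencies like this (the URL and folder names reuse the placeholders from this answer):
# clone the repository and reinstall the React app's dependencies
git clone https://github.com/username/repository-name.git
cd repository-name/MyReactProject
npm install   # recreates node_modules from package.json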
To upload your code online, you first have to authenticate your local git installation with GitHub; to do that, you can read this article.
After authenticating, first make a new empty GitHub repository, by going to GitHub or just by clicking this link. After making the repository, open a Terminal/Command Prompt in the folder where both of your projects are; in this context both my projects were in MyFolder.
After opening the Terminal/Command Prompt, enter these commands:
git init
git branch -M main
git add .
git commit -m "Added all the files"
After running these commands, run these two final commands:
git remote add origin https://github.com/username/repository-name.git
Here, replace username with your GitHub username, and replace repository-name with the name of the repository you specified while making the new repository.
And finally run this command:
git push -u origin main
and your code should be pushed to GitHub.
If you don't want to use git commands you can also use GitHub Desktop, but it is recommended that you first learn the basics of git and then use GitHub Desktop.
I have completely finished the tutorial for sphinx-versioning. After running the following commands, I obtained a new index.html.
sphinx-versioning build -r feature_branch docs docs/_build/html
open docs/_build/html/index.html
However, I want to have version numbers such as 0.5.0 and 0.6.0 instead of the branches. How can I stack the documentation with version numbers instead of branch names? I can't seem to find it in the official sphinx-versioning documentation.
In Pyramid, we create branches with the version number, e.g., 1.10-branch. Alternatively you can use git to tag a version number, then go into the RTD Admin for your project, and under "Versions" activate it for publication.
In the end I realized that sphinx-versioning does not recognize tags; it will only recognize releases.
However, another problem with sphinx-versioning is that if you are using the autodoc extension, the generated documentation will be the same as your currently installed version across all branches and versions. The reason is that autodoc only generates documentation from the package you currently have on your machine; it will not automatically download the old package and generate the older version of the documentation for you. But there is a workaround.
Complete solution (hacks)
Say you have two releases v1.0 and v2.0 on GitHub.
Then you do a git checkout v1.0, and build the html with sphinx-versioning build <your source location> output1.
Similarly, do git checkout v2.0, and build the html with sphinx-versioning build <your source location> output2.
Then you will have two output folders like this:
output1
├── index.html
├── master
├── v1.0
└── v2.0
output2
├── index.html
├── master
├── v1.0
└── v2.0
I am omitting other unimportant files here.
Now, we just need to delete the v1.0 folder under output2 and move the v1.0 folder from output1 into output2. Then you will have perfectly working autodoc-generated documentation together with working versioning.
Of course, the drawback is that the total build time grows with every version you add, and you need to build each one manually. But it still works as a quick fix. Maybe we can write a script to do this for us so that we do not need to build them manually? A sketch of one is below.
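Here is a rough sketch of such a script, assuming the releases are git tags named v1.0 and v2.0 and that sphinx-versioning is invoked exactly as above (tag names and folder names are assumptions):
#!/usr/bin/env bash
# Build the docs once per release tag, then merge: use the newest build as the
# base and replace each version folder with the one built from its own tag.
set -euo pipefail

SRC=docs            # sphinx source location
FINAL=final_output  # merged site ends up here
TAGS="v1.0 v2.0"    # oldest ... newest

for tag in $TAGS; do
  git checkout "$tag"
  sphinx-versioning build "$SRC" "build_$tag"
done
# return to your working branch when done, e.g. git checkout master

# the newest build provides the shared files (index.html, master, ...)
rm -rf "$FINAL"
cp -r build_v2.0 "$FINAL"

# overwrite each version folder with the one built from its own checkout
for tag in $TAGS; do
  rm -rf "$FINAL/$tag"
  cp -r "build_$tag/$tag" "$FINAL/$tag"
done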
TL;DR
sphinx-versioning does not support versioning that well, especially when you use autodoc. There are hacks to make it work, but they are slow and tedious.
If you want something convenient and don't mind having ads on your documentation, just follow Steve Piercy's suggestions and use the hosting service provided by RTD.
For the main git repository, devanshdalal.github.io, I am unable to choose the src branch to use for deploying; github.com always picks the master branch. It becomes difficult now, because I have to push my build/ folder to master. Is there a way to automate this?
Currently GitHub doesn't support choosing a custom folder for a repo named like {GITUSER}.github.io.
From the community help post at https://help.github.com/articles/configuring-a-publishing-source-for-github-pages/, the only three options that GitHub Pages recognizes are:
master branch
docs/ folder on the master branch
gh-pages branch
But for a repo like {GITUSER}.github.io, having a docs/ folder also doesn't work (I couldn't make it work). I faced a similar issue a while back; I was using Jekyll to build the static pages for my site. I know it's really frustrating, but as of now what you want is not possible.
However, I made a workaround to version-control my Jekyll project as well as the generated static github.io pages.
I maintain a separate repo for the Jekyll version of the project (which in your case I guess would be the React project). So locally I have two separate repos:
{my_username}.github.io -> which contains the static pages; the remote for this local repo is the {my_username}.github.io repo (the static site repo).
I also have a separate repo for the Jekyll project, which has a different remote set up. I configured this project so that after a build, the static pages are written into the local {my_username}.github.io folder. Then I can just commit and push separately in the two repos.
This way I can keep track of the static pages as well as the Jekyll project that builds them; a sketch of the flow is below.
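A minimal sketch of that two-repo flow, assuming the Jekyll sources live in a folder called my-site-src next to the {my_username}.github.io checkout (the folder name and commit messages are placeholders):
# build the Jekyll project straight into the static-pages repo checkout
cd my-site-src
bundle exec jekyll build --destination ../{my_username}.github.io

# commit and push the generated pages in their own repo
cd ../{my_username}.github.io
git add . && git commit -m "Rebuild site" && git push origin master

# commit and push the Jekyll sources in their own repo
cd ../my-site-src
git add . && git commit -m "Update source" && git push origin master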
You only need to make sure that before you configure a publishing source, the branch or folder you want to use as your publishing source already exists in your repository.
This GitHub Pages link will solve your problem.
I'm using Google App Engine Flexible with the Python environment. Right now I have two services, default and worker, that share the same codebase, configured by app.yaml and worker.yaml. Now I need to install a native C++ library, so I had to switch to a custom runtime and add a Dockerfile.
Here is the Dockerfile generated by the gcloud beta app gen-config --custom command:
FROM gcr.io/google-appengine/python
LABEL python_version=python3.6
RUN virtualenv --no-download /env -p python3.6
# Set virtualenv environment variables. This is equivalent to running
# source /env/bin/activate
ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH
ADD requirements.txt /app/
RUN pip install -r requirements.txt
ADD . /app/
CMD exec gunicorn --workers=3 --threads=3 --bind=:$PORT aces.wsgi
Previously, my app.yaml and worker.yaml each had its own entrypoint: config that specified the command needed to start the service.
So, my question is how can I use two different commands to start the services?
EDIT 1
So far I have been able to solve this by rewriting the CMD line in the Dockerfile for each deploy of each service. However, I'm not quite satisfied with this solution.
The gcloud app deploy command has an --image-url flag that allows setting the image URL from GCR. I haven't researched that yet, but it seems that I can just upload images to GCR and use the URLs, since they don't change that often.
Yes, as you mentioned, I think using the --image-url flag is a good option here.
Specify a custom runtime.
Build the image locally, tag it, and push it to Google Container Registry (GCR).
Then deploy your service, specifying a custom service file and the remote image on GCR using the --image-url option.
Here's an example that accomplishes different entrypoints in 2 services that share the same code:
...this is assuming that the "flex" and not "standard" App Engine offering is being used.
Let's say you have a project called my-proj
with a default service that is not important
and a second service called queue-processor which uses much of the same code from the same directory.
Create a separate Dockerfile for it called QueueProcessorDockerfile
and a separate app.yaml called queue-processor-app.yaml to tell Google App Engine what I want to happen.
QueueProcessorDockerfile
FROM node:10
# Create app directory
WORKDIR /usr/src/app
COPY package.json ./
COPY yarn.lock ./
RUN npm install -g yarn
RUN yarn
# Bundle app source
COPY . .
CMD [ "yarn", "process-queue" ]
*Of course, I have a "process-queue" script in my package.json
queue-processor-app.yaml
runtime: custom
env: flex
... other stuff...
...
build and tag the docker image
Check out Google's guide here: https://cloud.google.com/container-registry/docs/pushing-and-pulling
docker build -t eu.gcr.io/my-proj/queue-processor -f QueueProcessorDockerfile .
push it to GCR
docker push eu.gcr.io/my-proj/queue-processor
deploy the service, specifying which yaml config file google should use, as well as the image url you have pushed
gcloud app deploy queue-processor-app.yaml --image-url eu.gcr.io/my-proj/queue-processor
Since the Dockerfile name cannot be changed, the only way to avoid modifying the Dockerfile would be to store each service in its own separate directory. Clean separation: each service has its own Dockerfile and/or startup configuration.
But this raises a question: how to deal with the code shared by multiple services? Using symlinks (which works great for sharing code across standard env services) doesn't work for the flexible env services, see Sharing code between flexible environment modules in a GAE project.
I see a few possible approaches, none really ideal, but maybe more appealing than what you currently have:
hard-link each and every shared source code file (since hardlinking directories is not possible). A bit tedious and error-prone, but you only have to do that once per file
package and publish your shared code as an external library, added to the requirements.txt file of each service using it
split the shared code into a separate repository and have a copy of that repository in each service using it (maybe as a git submodule if using git?). You just need to ensure at service deployment time that the shared repository is pulled at the proper version, which can be done quite reliably through automation. It is a bit more complicated if you have uncommitted changes in this repo: you'd have to patch the same changes into all services.
have multiple copies of the Dockerfile with different names, which you simply copy over instead of always editing the same file (a sketch of this approach follows the list). Symlinking instead of copying might work as well, since the symlink doesn't need to be followed outside of the service directory; if it's just replicated as a symlink it'll work.
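A sketch of the copy-per-deploy variant, assuming two Dockerfile copies named Dockerfile.default and Dockerfile.worker kept next to the existing app.yaml and worker.yaml (the copy names are assumptions):
# deploy the default service with its own CMD
cp Dockerfile.default Dockerfile
gcloud app deploy app.yaml

# deploy the worker service with a different CMD
cp Dockerfile.worker Dockerfile
gcloud app deploy worker.yaml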
So I had a very similar issue with my Java applications. We were looking to migrate from Heroku to GAE and were attempting to simulate the Heroku Procfile with GAE services. Effectively, what we did was create separate directories in our application, src/main/appengine/web and src/main/appengine/worker, where each directory contained the app.yaml and Dockerfile specific to that process. Then, using the mvn appengine:deploy capabilities, we specified -Dapp.stage.dockerDirectory and -Dapp.stage.appEngineDirectory respectively for each service we wanted to deploy. With just a few parameters we were able to script out parallel deployments of each service from the same code base. Not sure if this works in your situation, but it was very useful for us. Here are the two example commands in their entirety:
Web Process:
mvn appengine:deploy -Dapp.stage.dockerDirectory=src/main/appengine/web -Dapp.stage.appEngineDirectory=src/main/appengine/web -Dapp.stage.stagingDirectory=target/appengine-web -Dapp.deploy.projectId=${project-id} -Dapp.deploy.version=${project-version}
Worker Process:
mvn appengine:deploy -Dapp.stage.dockerDirectory=src/main/appengine/worker -Dapp.stage.appEngineDirectory=src/main/appengine/worker -Dapp.stage.stagingDirectory=target/appengine-worker -Dapp.deploy.projectId=${project-id} -Dapp.deploy.version=${project-version}
How do I deploy an Angular 2 web application on GitHub? I am new to Git and GitHub, so I just learned the basics on the internet and created a repository on GitHub. Finally a URL was generated in my Git Bash after running all the steps, and when I tried to open it a GitHub 404 error page was shown.
These are the commands which I ran:
git remote add origin https://github.com/Muraliduke/MuraliDukeResume.git
git push -u origin master
ng github-pages:deploy
Is there any difficulty in hosting a single page application on GitHub? I tried with normal HTML content and my website on GitHub works fine, but this with ng2 is not working. I saw on the internet that some prefix must be configured to support an SPA on GitHub, but since I am not familiar with GitHub I didn't understand it. So kindly suggest a solution.
There are a few things:
Deploying to GitHub Pages using the Angular CLI has been deprecated. Use angular-cli-ghpages
Add the 404.html fix
Ensure you have "turned on" GitHub pages for your gh-pages branch from the repository settings
Optionally, add a custom domain
This blog has everything you need.
Make sure to do a build to get the necessary files into dist:
ng build --prod
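For example, a possible deploy sequence with angular-cli-ghpages (this is a sketch; the dist path depends on your Angular CLI version, so check the folder your build actually produces):
# build the app for production
ng build --prod

# publish the contents of dist/ to the gh-pages branch of the origin repo
npx angular-cli-ghpages --dir dist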
First get all the relevant files from the dist folder of your application.
For me it was the CSS files in the assets folder, main.bundle.js, polyfills.bundle.js and vendor.bundle.js.
Then push these files to the repo which you have created.
1 -- If you want the application to run in the root directory, create a special repo with the name [yourgithubusername].github.io and push these files to the master branch.
2 -- Whereas if you want to serve these pages from a subdirectory, in a branch other than the root, create a branch gh-pages and push these files to that branch.
In the two cases, the way we access the deployed pages is different.
For the first case it will be https://[yourgithubusername].github.io and for the second case it will be [yourgithubusername].github.io/[Repo name].
If you want to deploy using the second case, make sure to change the base URL of the index.html file in the dist folder, as all the route mappings depend on the path you give; it should be set to /[Repo name]/ (see the build example below).
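For instance, assuming the repository is named Deployment (as in the link below), the base href can be set at build time instead of editing index.html by hand:
# set the base href to the repository name so client-side routes resolve
ng build --prod --base-href "/Deployment/"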
Github Repository - https://github.com/rahulrsingh09/Deployment