I have the code of a website in a subversion repository.
The admin of the site can upload images via a CMS.
These images go to different directories inside "webroot/uploads/".
This directory forms part of the repository, too.
I have a cron task that periodically backs up the repository (via svnadmin dump), but the images uploaded through the CMS aren't in the backup because they never get added to the repository.
At the moment only the CMS admin can upload images, not any other user of the site.
I'm thinking about backing up "webroot/uploads/*" with tar and gzip.
A different idea is to somehow automatically add the uploaded images to the repository.
An added advantage of this is that I would get all the images on my development machine when I update my working copy.
What do you think is the best way?
Thank you!
Usually, binary content (exe, dll, images, ...), which doesn't benefit from version control features (diffs, labels, merges, ...), isn't put under version control.
However:
if those images don't change much (i.e. the same image doesn't get modified over and over), and
if their number is limited (i.e. you don't upload thousands of images a day to webroot/uploads/*),
then you can consider adding them to your SVN repo (since it is a centralized repo, you don't have to worry about cloning the full history, as you would with a DVCS).
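If you go that route, one way to automate it is to have the CMS (or a cron job) add and commit new uploads right after they land on disk. A minimal sketch, assuming webroot/uploads/ lives inside a checked-out working copy on the server and the svn client has cached commit credentials (the path and commit message are placeholders):

```java
import java.io.IOException;

/**
 * Sketch of a post-upload hook: add any new files under the uploads
 * directory and commit them, by shelling out to the svn command-line client.
 */
public class UploadsAutoCommit {

    // Placeholder path to the uploads directory inside the working copy
    private static final String UPLOADS_DIR = "/var/www/webroot/uploads";

    public static void commitNewUploads() throws IOException, InterruptedException {
        // "svn add --force" picks up unversioned files and skips ones already under version control
        run("svn", "add", "--force", "--depth", "infinity", UPLOADS_DIR);
        run("svn", "commit", "-m", "Auto-commit uploaded images", UPLOADS_DIR);
    }

    private static void run(String... command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command)
                .inheritIO()   // forward svn output to the server log
                .start();
        if (p.waitFor() != 0) {
            throw new IOException("Command failed: " + String.join(" ", command));
        }
    }

    public static void main(String[] args) throws Exception {
        commitNewUploads();
    }
}
```

The same two svn commands work just as well from a plain cron entry; the point is simply to add and commit right after each upload so that the svnadmin dump backup includes the images.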
My website changes every day: I run a news website with new stories daily. I want Google to index my site as often as possible, and I want/need to autogenerate the sitemap.
I use Google App Engine (with Node.js) to run my site. With GAE I do not have write access to the root directory, so to publish the sitemap I would need to re-deploy my whole site after generating the map. That is an unnecessarily complex step.
I have searched far and wide and cannot see how to save my sitemap. So I considered using a static sitemap with a dynamically generated child that I store in another location where I have write access, but Google says it wants all linked sitemaps in the same directory, so that appears to be a dead end.
Can I use "App Deploy" in such a way that only the sitemap is uploaded? Any other possibilities? Appreciate any and all suggestions. It seems unlikely that Google didn't provide some way to solve this.
For a site where new URLs are being created regularly (like a news or blog site), don't 'store' your sitemap. It should be generated on demand, i.e. your app should include code that generates the content whenever <your_website>/sitemap.xml is requested.
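The shape of that is roughly the following sketch (shown as a plain Java servlet purely for illustration; the same idea applies to a Node.js handler on GAE, and the story URLs here are placeholders you would load from your datastore):

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Arrays;
import java.util.List;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Sketch: serve /sitemap.xml dynamically instead of storing a file,
 * so every request reflects the latest published stories.
 */
public class SitemapServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Placeholder data; in practice, query whatever holds your story URLs
        List<String> storyUrls = Arrays.asList(
                "https://example.com/stories/todays-headline",
                "https://example.com/stories/another-story");

        resp.setContentType("application/xml");
        resp.setCharacterEncoding("UTF-8");

        PrintWriter out = resp.getWriter();
        out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
        out.println("<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">");
        for (String url : storyUrls) {
            out.println("  <url><loc>" + url + "</loc></url>");
        }
        out.println("</urlset>");
    }
}
```

Map the handler (or the equivalent route in your framework) to /sitemap.xml and there is nothing to write to disk at all.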
Separately, you should note that gcloud app deploy doesn't always deploy all your files. It usually deploys only the files that have changed. You can easily confirm this by running the deploy command, changing a single file, and then running the deploy command again. You will see that the logs say something like "Uploading 1 files to Google Cloud Storage" and the deploy will be faster. If you change X files and deploy again, the message will be updated to indicate that only X files are being deployed.
However, I'm not sure what it uses to compute the diff. Maybe it compares against the files currently in your staging bucket, and if the files in the staging bucket have been deleted (they have a default lifespan of 15 days) it will deploy all the files again (but, as I said, I'm not sure of this).
I'm developing a web application in which I want to insert users and display the files they upload via a search option. I have all the logic I need sorted out and the files are uploaded into the correct directory. However, when I insert a new user into the DB, the web app cannot find their file in the directory until I restart the server.
How can I make it so that the resources directory of my web app is automatically refreshed by the server? I'm developing in Java/JSP and using Tomcat as my server.
Thanks!!
I'm guessing you're putting the files into the src/main/resources folder, which gets packaged into the artifact, and then accessing them as classpath resources. In that case, newly added resources only become available after the next packaging.
Instead, you should store the uploads on the regular file system, outside the packaged artifact, and access them via absolute or relative paths.
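A minimal sketch of that approach (the upload directory location is an assumption; any directory outside the deployed webapp works, as long as Tomcat can read and write it):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

/**
 * Sketch: keep user uploads in a directory outside the packaged webapp
 * and read them through the file system, so new files are visible
 * immediately without repackaging or restarting Tomcat.
 */
public class UploadStore {

    // Placeholder location; in practice make this configurable (env var, context param, ...)
    private static final Path UPLOAD_ROOT = Paths.get("/var/webapp-data/uploads");

    /** Save an uploaded file for a given user. */
    public static void save(String userName, String fileName, InputStream content) throws IOException {
        Path userDir = UPLOAD_ROOT.resolve(userName);
        Files.createDirectories(userDir);                  // create the user's folder if missing
        Files.copy(content, userDir.resolve(fileName));    // write the upload to disk
    }

    /** Check whether a user's file exists right now, no restart required. */
    public static boolean exists(String userName, String fileName) {
        return Files.exists(UPLOAD_ROOT.resolve(userName).resolve(fileName));
    }

    /** List a user's files, e.g. to back the search feature. */
    public static Stream<Path> list(String userName) throws IOException {
        Path userDir = UPLOAD_ROOT.resolve(userName);
        return Files.exists(userDir) ? Files.list(userDir) : Stream.empty();
    }
}
```

Because the files live on disk rather than on the classpath, a freshly uploaded file is visible to the very next request, with no repackaging or server restart.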
I'm trying to set up a deployment workflow, but I am completely new to it. I'm considering Git and Bamboo, and in this whole thing I am stuck on the database.
Let's say I want to make changes to a CMS website and keep my files versioned in Git (Bitbucket). I understand how to set up Bamboo so that it can SCP the files to my web server, but I don't get how I can bring the database into this whole system. Are there any tools I am missing?
What I want:
I want to be able to check out my website files from the Git server, make changes and push them back, and this (via Bamboo) should deploy the files to the live or test server.
But even after searching for hours, I haven't found a smooth way to handle the database (getting it locally, making changes, and pushing it to the server via Git), or any other smooth way.
I know there are tools to quickly dump the DB for WordPress sites, but for other CMSes there are no such tools.
Any advice how to do this right?
In my GWT application, a 'root' user uploads a specific text file with data, and that data should be available to anyone who has access to the app (it runs on GAE).
What's the classic way to store data that will be available to all users? I don't want to use any database (Objectify!?) since this is a relatively small amount of information and it is only changed from time to time by root.
I was wondering if there is some static map at the 'engine level' (not in a user's session) where this info could be stored (and if the server goes down, no biggie, root will upload it again).
Thanks
You have three primary options:
Add this file to your /war/ directory and deploy it with the app. This is what we typically do with static files that rarely change (like .css files, images, etc.). The file will be available to all users, whether they are authenticated or not.
Add this file to your /war/WEB-INF/ directory and deploy it with the app. The file will be available to your server-side code, so you can read it on the server side and show it to a user. This way you can decide which users can see the file and which users should not have access to it (see the sketch just below).
Upload this file to Google Cloud Storage. You can do it through an app (see the upload sketch further below), or you can simply upload it manually to a bucket using the GCS console or the gsutil command-line tool. Then you simply provide a link to your users. The advantage of this option is that you do not have to redeploy your app when the file changes.
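For the second option, reading the file from /war/WEB-INF/ on the server side looks roughly like this (a sketch; the file name "data.txt", the access check, and the servlet mapping are all assumptions):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Sketch: serve a file that was deployed under /war/WEB-INF/.
 * WEB-INF is never served directly, so this server-side code decides
 * which users are allowed to read the file.
 */
public class SharedDataServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {

        // Placeholder access check; put your real "who may see this" logic here
        if (req.getUserPrincipal() == null) {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }

        // "data.txt" is a placeholder name for the file deployed under WEB-INF
        try (InputStream in = getServletContext().getResourceAsStream("/WEB-INF/data.txt")) {
            if (in == null) {
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            resp.setContentType("text/plain");
            OutputStream out = resp.getOutputStream();
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```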
The only reason to go with the first two options is to have this file under version control. If you don't need that, I would recommend going with the GCS option.
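And for the GCS option, uploading the file from the app (rather than via the console or gsutil) can look like this sketch using the google-cloud-storage Java client library (the bucket and object names are placeholders, and the link printed at the end assumes the object is made publicly readable):

```java
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

/**
 * Sketch: upload the shared data file to a Google Cloud Storage bucket,
 * then hand users a link to it. Re-uploading replaces the object, so the
 * app does not need to be redeployed when the data changes.
 */
public class SharedFileUploader {

    public static void main(String[] args) throws IOException {
        // Placeholder bucket and object names
        String bucketName = "my-app-shared-data";
        String objectName = "shared/data.txt";

        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucketName, objectName))
                .setContentType("text/plain")
                .build();

        // Upload the local file's bytes to the bucket
        storage.create(blobInfo, Files.readAllBytes(Paths.get("data.txt")));

        // Works as a direct link only if the object is publicly readable
        System.out.println("Uploaded to https://storage.googleapis.com/" + bucketName + "/" + objectName);
    }
}
```

When root uploads a new version of the file, re-running the upload simply overwrites the object and nothing needs to be redeployed.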
Is it possible for DNN to read files directly off a server? What I would like to do is drag files into a folder I've created on my server and have them automatically uploaded and shown on my DNN site. My main concern in this scenario is that I want to be able to go into the file on the server, make changes, and have the changes automatically reflected on my DNN site without having to re-upload the file.
You can put things into the Portals/#/ folder, or a subfolder there, and have DNN AutoSynchronize the file-system (there's a scheduled task) so that the files show up in the File Manager in DNN.
I have a client who has a specific Uploads directory they FTP files into, and the synchronization process makes them available to their editors.