Git - fetch changes made to files on remote - database

First, I would like to thank you! Every time I get stuck, I always find a solution here. So thanks a lot!
But this time I need to ask a question: I am currently building a small website with a web interface where an admin can change text saved in an SQLite database. I use git locally and to deploy to the server, with a post-receive hook in the bare repo on the server that checks the files out into another directory. That directory is then used by nginx and gunicorn to serve the Flask app and the files.
Everything works fine so far, but I have a question: since the database is stored in a file (SQLite) in a different directory than the repo, how can I fetch the changes made to that remote file back into my local development repo? The user can also upload pictures that are rendered on the website; how can I fetch those to my local repo as well? Should I init a repo in that directory that I could then fetch from? Or is there another solution?
I know this question may be silly, but I am just beginning with databases and web interfaces. Thank you in advance for any help!

As I understand it, you have the following setup:
- local repo L is pushed to remote R
- a post-receive hook on the remote clones R to U, which is served by an HTTP server
- the user updates U with new content
- you'd like to pull changes from remote U back to local L
I'm assuming that U is a clone of R. You can simply fetch changes from U into your local L - no problem there, since all commits made by the user share a parent commit with what you have in L. The only troublesome part is committing the changes the user makes in U. You'll probably need some scripting there, so that when the user uploads content - say, via FTP - a git commit is triggered. Repository U must also be accessible somehow - SSH seems like a good fit in this setup.
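The "some scripting" part might look like the following minimal sketch (the repo path and commit message here are assumptions; you could run this from a cron job on the server or from the upload handler itself):

```python
import subprocess

def commit_user_changes(repo, message="Auto-commit user-uploaded content"):
    """Commit any files the web app changed in checkout U (database, pictures)."""
    # Stage everything that changed inside the checkout.
    subprocess.run(["git", "add", "-A"], cwd=repo, check=True)
    # `git diff --cached --quiet` exits non-zero only when something is staged,
    # so we avoid creating empty commits.
    staged = subprocess.run(["git", "diff", "--cached", "--quiet"], cwd=repo)
    if staged.returncode != 0:
        subprocess.run(["git", "commit", "-m", message], cwd=repo, check=True)
```

Locally you could then add the served directory as a second remote, for example `git remote add live ssh://user@server/path/to/U`, and run `git fetch live` to pull the user's commits into L.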

Related

How do I access the file data changes that were made in Heroku?

I added my app on Heroku, and it makes changes to a file inside it. Every time I make a change in GitHub, all the data is lost. Is there any way I can get access to the file inside Heroku?
Heroku has an "ephemeral" filesystem: you can write files to disk, but those files will not persist after the application restarts or its dynos are cycled. The best solution to this problem is to use cloud storage and dump all the new data there from your worker script.
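As a sketch of that "dump to cloud storage" step, assuming an S3 bucket (the bucket name is a placeholder, and boto3 credentials are assumed to be configured in the environment):

```python
import os

BUCKET = "my-app-uploads"  # hypothetical bucket name

def object_key(local_path):
    """Derive a stable S3 key for a file written to the ephemeral disk."""
    return "uploads/" + os.path.basename(local_path)

def persist_upload(local_path):
    """Copy a freshly written file to S3 so it survives dyno restarts."""
    import boto3  # third-party: pip install boto3
    s3 = boto3.client("s3")
    s3.upload_file(local_path, BUCKET, object_key(local_path))
```

On the next restart, the app reads the file back from the bucket instead of relying on the local disk copy.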

Create json server in github.io

I made an AngularJS application and it runs from github.io. But I need to use a REST API to handle data. I use json-server on my localhost. Is it possible to create and use json-server on GitHub?
I know it's an old question, but I'll still post an answer here in case someone else needs it.
It is possible to have json-server running on top of a GitHub repository: see https://my-json-server.typicode.com/
Basically, create a db.json file in your repository just like you do on your local system. You can then access your API with https://my-json-server.typicode.com/<user>/<repo> as your root URL.
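For instance, a minimal db.json committed to the repository might look like this (the resource names here are made up):

```json
{
  "posts": [
    { "id": 1, "title": "hello" }
  ],
  "profile": { "name": "example" }
}
```

With that file in the repo, a GET request to https://my-json-server.typicode.com/<user>/<repo>/posts would return the posts array.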
No, it is not possible because github.io only delivers static files and doesn't run any server.
You will need to use another hosting provider to run the json-server.
See also this question

How to restore websites in localhost

I am a newbie with Composite. I have already backed up and downloaded my C1 website (www.solve.sg), and I also have a working Composite installation on localhost. So how do I restore the website on localhost?
I already tried renaming the folder CompositeC1 to CompositeC1old, then extracting the backed-up site into the CompositeC1 folder on localhost, but it is not working and gives me an error.
Please help with the right method to restore the online backup of the website on localhost.
I don't know what Composite is, but you probably need to go back to your cPanel / the backend of your web host, dump the DB powering solve.sg, and import that into your local instance.
Your hosting provider will probably have a tool called phpMyAdmin; go in there and find the 'Export' tab at the top of your DB and dump it out as a .sql file. You will then import that file into your local environment, probably also using phpMyAdmin.
You may also have to change around some of the settings files that show where your database is relative to your application.

WebLogic domain server .out log file locked after being manually modified

I have two questions regarding WebLogic.
I am using WebLogic 10.3.6.
Yesterday I deployed a war file.
Following are my two questions:
1) When I restart the server, the sysout logs at
domains/<domain>/servers/<server>/logs/Server-name.out
domains/<domain>/servers/<server>/logs/Server-name.log
are not getting updated.
Actually, the logs were getting updated initially, but I cleared the log file by manually opening it and deleting its contents.
Later I found on the official Oracle website that:
"Oracle recommends that you do not modify log files by editing them manually. Modifying a file changes the timestamp and can confuse log file rotation. In addition, editing a file might lock it and prevent updates from WebLogic Server, as well as interfere with the Accessor"
I think my log files got locked for the above reason.
Is there anything I can do to get the log files updating again?
I have restarted the server as well, but the logs are still not getting updated.
2) I deployed my web application using a packed war file. When deploying a war file, it is expected that the war gets exploded at some temporary location on the WebLogic server. The war deploys successfully, but when I checked the contents of
WEBLOGIC/bea10.3.6.0_BI/user_projects/domains/Managedserver_7011_7012/servers/Server-chanakya/tmp/_WL_user
it is blank.
I was expecting the war to get exploded inside the _WL_user folder, but that is not happening right now.
Please let me know what I can do about the above problems.
Thanks in advance.
First question:
Generally speaking the .out file is created during server start and not updated once the server reaches the RUNNING state. The .log file should be updated continually however. It is safe to delete both of these files and once the server is restarted they should be regenerated. If for some reason they are not, go to server name -> Logging tab -> Log file name and specify the full path and name for a new log file.
Second question:
If you chose nostage for your deployment, it will not be copied to your server and will live wherever the file originally was. stage mode should copy the file to tmp/_WL_user after starting out under a stage directory. You can remove your deployment from the weblogic admin console and also delete the tmp and cache folders and try the deployment again if you need to. It's also possible the deployment failed... check the Deployments link in the admin console to make sure it reached the Active state.
Last - welcome to Stack Overflow. In general you should ask one question at a time.

App Engine upload mechanism for changed data: does it upload the whole app or a delta?

I upload my_app to App Engine with:
appcfg.py update c:\my_app ...
If I have already uploaded my_app and then make a minor change to a file,
does it upload the whole project to App Engine and overwrite the whole previous project?
Or does it upload only the relevant change and overwrite the relevant part?
And what is the behavior in this case for this command:
bulkloader.py --restore --filename=my_kind.dump ...
Did you try it?
update uploads the whole application each time. There's no concept of a delta. Normally, when you upload a new version I would suggest changing the version setting - that way you can keep up to 10 previous versions of your app on the site, and only set the new one to be the default once you are sure it is working.
If you upload without changing the version, AppEngine actually creates a new version before deleting the old one, so you need a spare slot in your versions list.
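For example, bumping the version field in app.yaml before running appcfg.py update deploys alongside the old version instead of over it (the app id and handler below are placeholders):

```yaml
application: my-app   # placeholder app id
version: 2            # bump this before each risky deploy
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: main.app
```

You can then switch the default version back and forth in the admin console without re-uploading anything.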
I don't understand your question about the bulkloader. Are you asking if that does a delta? No, it can't, because it sends the data serially via the remote API - there's no way for it to know in advance which rows in your data file already exist in the datastore.
