I do my web development and testing on my laptop running an installation of XAMPP. I upload things to my host, but I always go through cPanel's file manager to do it. I realize there's definitely a better way to go about it, but I need to be pointed in the right direction, and any other tips on how to manage things would be appreciated.
FTP: can I keep my site synced with a local directory in htdocs, so the site stays backed up on my computer while I push whatever changes I make locally to the server? Can anyone recommend a good (preferably free) client I can use to do this?
Database stuff: how do I back up / sync databases in the same way? Ideally I'd like to do the same as with my files: merge / upload whatever I've developed with a click or two. Is this possible? Is it wise?
Any help and advice would be appreciated. :)
I do my development in Eclipse, which lets me combine development and FTP sync in one environment. It will also tell you if a file has changed on the server and let you decide whether to overwrite it. You can also exclude certain file types from syncing using pattern matching, and sync over other technologies such as WebDAV or SSH (if supported by your host, of course).
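If you ever want to script the push yourself rather than rely on an IDE or FTP client, here is a minimal one-way mirror sketch using Python's ftplib. The host, credentials, and paths are hypothetical placeholders; it re-uploads everything rather than detecting changes, which is exactly the bookkeeping a real client like Eclipse's sync does for you.

```python
# Minimal one-way FTP mirror: push a local htdocs folder to the server.
# HOST, USER, PASSWORD, LOCAL_ROOT and REMOTE_ROOT are placeholders.
import os
from ftplib import FTP

HOST, USER, PASSWORD = "ftp.example.com", "user", "secret"
LOCAL_ROOT = "C:/xampp/htdocs/mysite"
REMOTE_ROOT = "/public_html"

ftp = FTP(HOST)
ftp.login(USER, PASSWORD)

for dirpath, _dirnames, filenames in os.walk(LOCAL_ROOT):
    rel = os.path.relpath(dirpath, LOCAL_ROOT).replace(os.sep, "/")
    remote_dir = REMOTE_ROOT if rel == "." else f"{REMOTE_ROOT}/{rel}"
    try:
        ftp.mkd(remote_dir)   # create the directory if it is missing
    except Exception:
        pass                  # most servers error if it already exists
    for name in filenames:
        with open(os.path.join(dirpath, name), "rb") as fh:
            ftp.storbinary(f"STOR {remote_dir}/{name}", fh)

ftp.quit()
```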
I am using WSO2 API Manager 02.01.00 on a Linux system. The API Manager is deployed in folder A. The (H2) databases are located in folder B, which is not inside folder A. The datasources in /repository/conf/datasources/master-datasources.xml correctly point to the databases in folder B. I configured it like that because I want to preserve the databases across deployments (a few developers are using the API Manager and they don't want to lose their data). But it seems that WSO2AM_DB.h2.db is created anew on each API Manager deployment. I think this because I kept an eye on the DB size: I started with a size of 1750 KB for WSO2AM_DB.h2.db, published a few APIs in the Manager and the size increased to 2774 KB, then I did a deployment and the size returned to 1750 KB.
The effect is that the API Store/Publisher says "There are no APIs published yet".
But I can still see the APIs under Application Subscriptions and in Carbon Resources at /_system/governance/apimgt/applicationdata/provider/admin.
I tried to force a new indexing with this, but it doesn't change anything.
Can I configure somewhere that the database should not be created/manipulated at startup?
Meanwhile I'm getting really desperate about this problem.
Maybe you could help me.
Thank you for your time.
WSO2 does not recommend running on the H2 database. You need to use a production database such as MySQL, Oracle, etc.; H2 is only for tryouts.
Basically, WSO2 servers store data in databases as well as on the file system. For this kind of deployment, you need to do the following:
Point to an external database. If you are using this for demo purposes, you can still go with the current mode (H2 database).
Use dep-sync. The content under WSO2_HOME/repository/deployment/server needs to be preserved. You can use SVN-based dep-sync or rsync. The basic idea is that a new deployment needs the data of the previous deployment (see the sketch after this list).
Preserve the Solr index. If you have hundreds or thousands of APIs in the system, indexing them takes time. To avoid that, you can copy the content of WSO2_HOME/solr to the new deployment.
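As a rough illustration of the last two points, here is a small Python sketch that carries the deployable artifacts and the Solr index from an old installation to a new one. OLD_HOME and NEW_HOME are hypothetical paths, and it assumes rsync is available on the Linux box; adjust to your actual layout.

```python
# Sketch: preserve deployment artifacts and the Solr index across a
# redeployment. OLD_HOME / NEW_HOME are placeholder paths.
import shutil
import subprocess

OLD_HOME = "/opt/wso2am-old"
NEW_HOME = "/opt/wso2am-2.1.0"

# Carry over the deployed artifacts (the content dep-sync would manage).
subprocess.check_call([
    "rsync", "-a",
    f"{OLD_HOME}/repository/deployment/server/",
    f"{NEW_HOME}/repository/deployment/server/",
])

# Copy the Solr index so the new deployment does not have to re-index.
shutil.rmtree(f"{NEW_HOME}/solr", ignore_errors=True)
shutil.copytree(f"{OLD_HOME}/solr", f"{NEW_HOME}/solr")
```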
I've got a WordPress site that I have been running for a year now, hosted with HostGator. I have a few tests I would like to run on the site, but I would like to test them offline using WAMP first before going live.
The problem is that previously I always made changes to the live site, usually at hours when I got little to no traffic. That has changed, and I now get traffic at most hours throughout the day.
So my problem is:
How do I download my existing website to my laptop (WAMP) and make those changes with the new theme? (Total newbie, sorry!)
I use Windows 7, so I'm not sure what I need to do to get the site working like a live site offline.
Once I have implemented the new changes, what is the best way to upload the updated site back to the HostGator server without any downtime or errors for site visitors?
Is there anything else I need to install or do in order for this to work? I hope you can give me as much information as possible, or links to any guides or articles that explain how to do this.
Thanks so much for any help you can offer!!!
If you're using Hostgator, the process is simple:
Install XAMPP or WAMP on your computer;
Go to your cPanel, back up and download your website;
Extract the backup on your computer, especially the homedir and the SQL dump;
Go to your local environment and access http://localhost/phpmyadmin;
Create a new database; the name doesn't matter, but for this example let's call it "database";
Inside that database, import the one taken from the backup (these two database steps can also be scripted; see the sketch after this list);
Create a new folder inside your htdocs with the name of your website, e.g. "example.com";
Extract the contents of the homedir there;
Edit wp-config.php with the following data:
Database name: 'database' (the one you created)
Host: 'localhost'
Username: 'root'
Password: blank
Access http://localhost/example.com
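For reference, here is a minimal sketch of the database steps done from the command line instead of phpMyAdmin. It assumes XAMPP's bundled mysql client is on your PATH and uses a hypothetical dump file name; the default local XAMPP credentials are root with an empty password.

```python
# Sketch: create the local database and import the cPanel SQL dump.
# DB_NAME and DUMP are placeholders; adjust them to your backup.
import subprocess

DB_NAME = "database"             # must match what you put in wp-config.php
DUMP = r"C:\backups\mysite.sql"  # hypothetical path to the extracted dump

# Create the database (backticks because "database" is a reserved word).
subprocess.run(
    ["mysql", "-u", "root", "-e",
     f"CREATE DATABASE IF NOT EXISTS `{DB_NAME}`"],
    check=True,
)

# Import the dump into it.
with open(DUMP, "rb") as fh:
    subprocess.run(["mysql", "-u", "root", DB_NAME], stdin=fh, check=True)
```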
You can check a good tutorial about the subject here.
About putting the site live: I recommend using a Git repository, though understandably that might be a little complicated and perhaps too much work for what you're trying to achieve.
Try moving your files directly from your local to the live environment using FileZilla or WinSCP; the drag and drop should replace the files on the live site, and the downtime should be minimal.
Instead of WAMP, you can always use VirtualBox to install CentOS or Ubuntu/Debian.
You can go one step further and install either CentminMod to automate creating a LAMP stack, or a full panel like ISPConfig or Virtualmin.
That takes care of creating the environment.
Create a new account on the LAMP stack, using the same domain name.
You can FTP from Windows to get the files, but networking Windows and Linux is a pain. The better option is to use the command line (CLI) in the Linux VM to FTP the files from HostGator to the VM. This guide will help with that process: http://www.tldp.org/HOWTO/FTP-3.html
Then your only concern is the MySQL database. And for this, you have several options.
For me, the easiest is to buy (or try!) SQLyog on Windows, and then copy the database from the HostGator source to the localhost destination. Some mild networking is needed for Windows to see the Linux VM, but nothing as complex as file sharing (the FTP issue). SQLyog is far quicker than backing up the database and then restoring it, especially since you can run into memory issues doing it that way. It all depends on the size of the database.
The cheap/free backup-then-restore method is to use phpMyAdmin.
WordPress also has plugins, of varying cost, but you still have the potential backup-and-restore memory issue there as well.
When done, just copy it the other way, again using SQLyog and CLI FTP. You'll still have some downtime, but it will hopefully be minimal.
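If you would rather script the dump-and-restore than click through phpMyAdmin, the MySQL command-line tools avoid the web upload limits entirely. A sketch, with hypothetical hostnames, users and database names, run from wherever the mysql client tools are installed (HostGator must allow remote MySQL connections from your IP for the dump step):

```python
# Sketch: dump the live database, then restore it into the local VM.
# Hostnames, users and database names below are placeholders.
import subprocess

with open("site.sql", "wb") as out:
    subprocess.run(
        ["mysqldump", "-h", "myhost.hostgator.com", "-u", "cpaneluser",
         "-p", "wp_live"],           # -p prompts for the password
        stdout=out, check=True,
    )

with open("site.sql", "rb") as dump:
    subprocess.run(
        ["mysql", "-h", "192.168.56.101", "-u", "root", "-p", "wp_local"],
        stdin=dump, check=True,
    )
```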
As a newbie, this probably seems like rocket science, but at least it gives you a good place to start. Welcome to the world of locally dev'ing sites!
Hi guys, I've dumped (made a backup of) my App Engine datastore entities following this tutorial. Now I wonder if there is a way to restore the data locally, so I can do some testing and debugging.
On Windows, the datastore is in the directory
C:\Users\UserName\AppData\Local\Temp\AppName
On OS X, this question can help you.
In this directory is stored datastore.db (the local storage). Rename it (the app should not be running, and if the file is locked, kill all the Python processes).
Now go to the App Engine dashboard:
click on your app's link;
click on Blob Viewer (I'm assuming you did the backup into the blobstore);
click on the file name;
click Download;
rename the downloaded file to datastore.db;
copy it to the path above (the rename-and-copy can be scripted; see the sketch after these steps);
start the app.
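The file shuffling at the end can be done with a few lines of Python; the paths and the downloaded file name below are hypothetical, so adjust them to your user name, app name and actual download.

```python
# Sketch: keep the old local datastore aside and swap the backup in.
import os
import shutil

TEMP_DIR = r"C:\Users\UserName\AppData\Local\Temp\AppName"  # placeholder
BACKUP = r"C:\Downloads\datastore-backup.blob"              # placeholder

current = os.path.join(TEMP_DIR, "datastore.db")
if os.path.exists(current):
    shutil.move(current, current + ".bak")   # keep the previous datastore
shutil.copy(BACKUP, current)                 # the backup becomes the new one
```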
Remote API (as koma mentions) is the main GAE-documented approach, and it's a good approach. Alternatively, you can download the entities using the cloud download tool, write your own store reader/deserializer, and execute it within your local dev server instance: http://gbayer.com/big-data/app-engine-datastore-how-to-efficiently-export-your-data. Read the part about the new approach.
While these options are not automatic and require engineering effort, I really want to point out a side effect of doing this: we have been facing performance issues in the local development server for months now, specifically when the datastore has more than 1,000 entities with over 50 indexes. Just search for "require_indexes slow" and you'll see what I'm talking about.
I'm sure you have a solid reason to import lots of data locally for testing and debugging; just be aware that your application will run extremely slowly, and debug mode will be impossibly slow. We can't even use debug mode with our setup anymore.
If you want to get some test data into your local db, you could copy some using the remote API.
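For completeness, a sketch of the remote API route using the old Python SDK's bulk loader (appcfg.py). The app id is a placeholder, and both the production app and the local dev server need the remote_api handler enabled in app.yaml for this to work:

```python
# Sketch: pull entities from production, then push them into the local
# dev server. "your-app-id" is a placeholder.
import subprocess

APP_ID = "your-app-id"

# Download entities from the deployed app.
subprocess.check_call([
    "appcfg.py", "download_data",
    f"--url=https://{APP_ID}.appspot.com/_ah/remote_api",
    "--filename=entities.dat",
])

# Upload them into the local development server.
subprocess.check_call([
    "appcfg.py", "upload_data",
    "--url=http://localhost:8080/_ah/remote_api",
    "--filename=entities.dat",
])
```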
I am trying to deploy my WPF application to some users who are outside of our corporate network. Everything works great on our LAN, but I can't get the updates working when I turn on security, as the user is never prompted for their login details.
Does anyone know of a way to secure my ClickOnce files so that only my users can access them? I am not allowed to put this software up without it being secure.
Any help much appreciated.
There is no way to secure your files, as the ClickOnce runtime will blindly return to its deployment point and never keeps hold of the user's original credentials. I have heard of ways of getting around this using various techniques, but it's a fair bit of work.
This might be of use: www.clickoncerevolution.com.
You could also always consider an MSI installer but you won't get the automatic updates.
Marty
Internally, you can restrict access to the files on the webserver. Externally, there's not much you can do easily.
We handle this by having our customers log in when they run the application, and we verify their credentials against backend services (running on Azure). So they can't run it unless they can log in.
If you don't want to do that, I'll share this article with you. It shows how to serve your ClickOnce files from a SQL Server database by intercepting the requests to the webserver and responding to them. If you're smarter with web applications than I am (not a high bar, mind you), maybe you can figure out how to intercept the request and ask for authentication credentials at that point.
And here's an article from CodeProject where they show one solution for what you're trying to do.
I've got a website that runs in a shared hosting environment, using ASP.NET 2.0 (C#) and MS SQL Server 2005. I've recently been asked whether I can integrate my website with a piece of third-party desktop software that uses the Access runtime as its database (transparent to the end user).
Primarily I want to be able to offer users of my website the option of exporting their data into the Access database on their local machine. The data schemas match sufficiently; the question is how to actually do this, in the simplest way possible for the user.
Simply having a webpage update the local Access database isn't possible due to the obvious security restrictions. I've considered asking them to upload the Access database to the server so I can migrate the data and then let them download it again; however, the competency of the users of this software is such that even locating the Access database, let alone uploading and downloading it from the website, might be too complicated.
I've also considered whether Adobe AIR or Silverlight could help here, but I don't know them well enough to be sure. Similarly, I assume another exe could be written to perform this task that the user could simply download and run; however, my experience is in web development, not desktop development, so this isn't a 100% certainty for me, or an ideal development option.
So, can this be done? If so, what technique can achieve it, with the stated aims being ease of use for the end user, followed by ease of development by someone with web development as their main skill? Many thanks!
You may find this answer of interest: Best way to stream files in ASP.NET
It is about transferring a file from the server. You could serve the data as an Excel or CSV file and have the user import that into Access.
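Just to illustrate the shape of such an export (your site would generate this in C# on the server, of course), here is a tiny sketch with made-up columns; Access can then pull the file in through its standard text import:

```python
# Illustrative sketch: write rows as a CSV file that Access can import.
# The column names and rows are made up for the example.
import csv

rows = [
    {"CustomerID": 1, "Name": "Alice", "Balance": "10.50"},
    {"CustomerID": 2, "Name": "Bob", "Balance": "3.20"},
]

with open("export.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["CustomerID", "Name", "Balance"])
    writer.writeheader()   # Access's import wizard can use this header row
    writer.writerows(rows)
```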
Instead of trying to do this in a web page, you might just expose some views from your SQL Server to some client-specific logins.
Then, within the Access application, allow them to link to your SQL Server. You might even provide an Access application for getting the data from your site and stuffing it into their local Access database.
In my work we have done something similar, transparently to the user, by creating an ActiveX control. The problem is that you are limiting your users to Internet Explorer only.
I think the best way to achieve what you are trying to do is by installing a service on the client's computer. If creating a service is beyond your experience, you can post a project on a site like oDesk and find somebody who can help you with the development for the money you are willing to pay to complete your project.
Good Luck.