I have a client who owns a business with a handful of employees. He has a product website with several hundred static product pages that are updated periodically via FTP.
We want to change this to a data-driven website, but the database (which will be hosted at an ISP) will have to be updated from data on my client's servers.
How best to do this on a shoestring? Can the database be hot-swapped via FTP, or do we need to build a web service we can push changes to?
Ask the ISP about the options. Some ISPs allow you to FTP-upload the .mdf (database file).
Some will allow you to connect with SQL Server Management Studio.
Some will allow both.
Either way, you've got to ask the ISP.
Last time I did this we created XML documents that were FTP'd to the website. We had an admin page that would clear out the old data by running some stored procs to truncate the tables, then import the XML docs into the SQL tables.
Since we didn't have the whole server to ourselves, there was no access to SQL Server DTS to schedule this stuff.
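For what it's worth, the import half of that can live in one stored procedure. Here's a minimal sketch, assuming a dbo.Products table and a /Products/Product element layout (both invented for illustration, not the original poster's schema):

```sql
-- Minimal sketch of the "clear out and re-import" approach described above.
-- Table and element names (dbo.Products, /Products/Product) are assumptions.
CREATE PROCEDURE dbo.ImportProductsFromXml
    @xml XML
AS
BEGIN
    SET NOCOUNT ON;

    -- Clear out the old data first
    TRUNCATE TABLE dbo.Products;

    -- Shred the uploaded XML document into rows
    INSERT INTO dbo.Products (ProductId, Name, Price)
    SELECT
        p.value('(ProductId)[1]', 'INT'),
        p.value('(Name)[1]',      'NVARCHAR(200)'),
        p.value('(Price)[1]',     'DECIMAL(10,2)')
    FROM @xml.nodes('/Products/Product') AS T(p);
END
```

The admin page then only has to read the uploaded file and pass its contents in as the @xml parameter.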
There is a Database Publishing Wizard from MS which will take all your data and create a SQL file that can then be run on the ISP. It will also, though I've never tried it, go directly to an ISP database. There is an option button on one of the wizard screens that does it.
It does require the user to have a little training, and it's still a manual process, so maybe not what you're after, but I think it will do the job.
Long-term, building a service to upload the data is probably the cleanest solution, as the app can then control its import procedures. You could go grossly simple with this and just have the local copy dump some sort of XML that the app could read, making it not much harder than uploading the file while still keeping it automatable. Having this import procedure would also help with development, as you now have an automated and repeatable way to sync data.
This is what I usually do:
You could use a tool like Red Gate's SQL Data Compare to do this. The tool compares data between two catalogs (on the same or different servers) and generates a script for syncing them.
I have a web application that has an SQL database.
For clarity, I'm using ASP.NET 4.0 / C# / SQL Server 2008 Web Edition.
I recently published the site, which was my first, by creating a deployment package for the database.
Now, a couple of months down the line, I need to update the database structure. The web application now has data that has been entered via the web, so I'll need to update the structure, then copy the data across.
As this is the first time I've done it, I'm unsure of the process I should follow - is there a standard practice for this kind of update?
Also, since some of the tables use incremental IDs, I need to ensure they remain the same in the newly updated database.
Any tips, links, advice appreciated.
Important Guidelines:
I assume you have not changed the structure entirely (meaning the key columns are the same, though there are solutions for that too).
Steps are as follows:
Take an export of the database
Add or remove columns, or make whatever other changes you want
Import the database back (use the sketch below if you need to preserve identity values)
Check the log for any rows/tables that were not updated successfully
Write SQL queries for those and run them to sync
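Since the question specifically mentions keeping the incremental IDs the same, here is a hedged sketch of the copy step using SET IDENTITY_INSERT; the dbo.Orders_Old / dbo.Orders_New names and their columns are made up for illustration:

```sql
-- Copy rows into the restructured table while keeping the original identity values.
-- Table and column names are assumptions, not the poster's actual schema.
SET IDENTITY_INSERT dbo.Orders_New ON;

INSERT INTO dbo.Orders_New (OrderId, CustomerId, Total)  -- explicit column list is required
SELECT OrderId, CustomerId, Total
FROM dbo.Orders_Old;

SET IDENTITY_INSERT dbo.Orders_New OFF;
-- SQL Server moves the identity seed past the highest value inserted,
-- so rows added later via the web carry on from there.
```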
Here are some general steps for this:
Take a backup of your online database and restore it locally
Modify the local database to suit your needs
Use a third-party comparison and synchronization tool to publish the changes to your production database
There are many of these available, and you can use them in trial mode to get the job done if you're on a tight budget. You can try tools from Red Gate, ApexSQL, Idera, DevArt and others…
I have two applications, each with its own database.
1.) A desktop application which has a VB.NET WinForms interface, runs in an offline enterprise network and stores data in a central database [SQL Server]
**All the data entry and other office operations are carried out and stored in the central database
2.) The second application is built on PHP. It has HTML pages and runs as a website in an online environment. It stores all its data in a MySQL database.
**This application is accessed by registered members only, and they are provided with various reports of the data processed by the 1st application.
Now I have to synchronize data between the online and offline database servers. I am planning the following:
1.) Write a small program to export all the data from SQL Server [the offline server] to a file in CSV format.
2.) Log in to the admin section of the live server.
3.) Upload the exported CSV file to the server.
4.) Import the data from the CSV file into the MySQL database.
Is the method I am planning good, or can it be tuned to perform better? I would also appreciate other good ways to do the data synchronisation that don't involve changing the applications, i.e. converting the network application to use the MySQL database.
What you are asking for does not actually sound like bidirectional sync (or movement of data both ways, from SQL Server to MySQL and from MySQL to SQL Server), which is a good thing, as it really simplifies things for you. Although I suspect your method of using CSVs (which I assume you would produce with something like BCP) would work, one of the issues is that you are moving ALL of the data every time you run the process, and you are basically overwriting the whole MySQL db every time. This is obviously somewhat inefficient, not to mention that during that time the MySQL db would not be in a usable state.
One alternative (assuming you have SQL Server 2008 or higher) would be to look into using this technique along with the built-in Change Tracking or Change Data Capture features. These are capabilities within SQL Server that allow you to determine which data has changed since a certain point in time. What you could do is create a process that extracts just the changes since the last time you checked to a CSV file and then applies those to MySQL. If you do this, don't forget to apply the deletes as well.
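To make that concrete, here is a rough sketch of Change Tracking on SQL Server 2008+; the SalesDb database and dbo.Orders table (with OrderId as its primary key) are placeholders, not the poster's actual schema:

```sql
-- Enable Change Tracking at the database and table level (placeholder names).
ALTER DATABASE SalesDb
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Orders
ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = OFF);

-- Remember this value after each export; it is the "last time you checked".
SELECT CHANGE_TRACKING_CURRENT_VERSION() AS CurrentVersion;

-- Later: pull only the rows that changed since the saved version.
DECLARE @last_sync_version BIGINT = 42;  -- value saved from the previous run

SELECT ct.OrderId,
       ct.SYS_CHANGE_OPERATION,          -- I = insert, U = update, D = delete
       o.CustomerId, o.Total
FROM CHANGETABLE(CHANGES dbo.Orders, @last_sync_version) AS ct
LEFT JOIN dbo.Orders AS o ON o.OrderId = ct.OrderId;  -- deleted rows have no match
```

You would persist the version number returned by CHANGE_TRACKING_CURRENT_VERSION() after each export and feed it back in as @last_sync_version on the next run.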
I don't think there's an off-the-shelf solution for what you want that you can use without customization - but the MS Sync Framework (http://msdn.microsoft.com/en-us/sync/default) sounds close.
You will probably need to write a provider for MySQL to make it go - which may well be less work than writing the whole data synchronization logic from scratch. Voclare is right about the challenges you could face writing your own synchronization mechanism...
Do look into SQL Server Integration Services as a good alternative.
What's the best way to automatically deploy changes to a database driven web application? Is there a single product out there that can modify the following...
Website (dlls, aspx, css files etc)
Database Schema (add tables, columns, etc)
Database data (modify table contents)
Reporting Services reports
I've seen various separate products, but not one that does everything.
Yes, there is a PowerShell method posted at http://www.codeproject.com/KB/install/DeploySite.aspx
Note: in addition, for the schema stuff you might need to upload a schema.version file and then have a process on that server detect that a new schema file was uploaded and apply it. For new database rows you could maybe do something similar. Another idea is that you could expose the SQL database as a web service and talk to it directly from your PowerShell script.
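To illustrate the schema.version idea, here is a rough T-SQL sketch of a self-guarding upload script; the dbo.SchemaVersion table, the version number 42 and the column being added are all invented for illustration:

```sql
-- Keep the applied version in a table and make each uploaded script guard itself.
IF OBJECT_ID('dbo.SchemaVersion') IS NULL
    CREATE TABLE dbo.SchemaVersion
    (
        Version   INT      NOT NULL PRIMARY KEY,
        AppliedOn DATETIME NOT NULL DEFAULT GETDATE()
    );
GO

IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE Version >= 42)
BEGIN
    -- The actual schema change this upload carries
    ALTER TABLE dbo.Products ADD LongDescription NVARCHAR(MAX) NULL;

    INSERT INTO dbo.SchemaVersion (Version) VALUES (42);
END
```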
I've seen lots of questions regarding moving data from Access to SQL Server, but I'd like to go the other route. Here's why:
I've been working on a sizable project with a SQL Server 2008 back-end and Access 2007 front-end. I'd like to be able to do some work on the front-end from home over the weekend, but I don't have access (VPN or otherwise) to SQL Server from there. I'd like to change my linked tables from SQL Server to another Access database file where I imported a snapshot of the SQL Server data. Then, come Monday morning, switch links back to SQL Server.
My problem is when I go to the Linked Table Manager and attempt to change the link, all I get is the ODBC Select Data Source dialog. If I try to link to an Access database, it tells me ODBC can't be used to link, import or export to another Access file.
One thought has occurred to me, but I haven't tried it yet; maybe someone could tell me if it's a good or bad idea: would I be able to delete the links, re-create them pointing at the other Access file, and not lose any functionality in my queries/forms/reports?
My proposal would be to have SQL Server Express (free) installed on your computer. You can then have all the data available on that machine. Create a publication on your main server, and have your local machine subscribe to this publication (if you don't need to save/synchronize the data changes made on your machine, you can stick with basic 'snapshot' replication).
You then just have to change your connection string from your network MSSQLSERVER to your localhost SQLEXPRESS server instance to have your app work.
If, for any reason, you have to make changes to the database model while offline, you will then have to unsubscribe from the main server before making the changes on the local server. When you're back at the office, make sure the same changes are applied to the main server. My advice is to write your changes in T-SQL, save them in a file, and run the file against the main server once you're back at work.
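As an example of what such a saved change file could look like (table and procedure names are placeholders, not a prescribed format), writing each change with a guard makes it safe to run the same file against both the local and the main server:

```sql
-- Re-runnable change script saved locally over the weekend,
-- then run unchanged against the main server on Monday.
IF NOT EXISTS (SELECT 1 FROM sys.columns
               WHERE object_id = OBJECT_ID('dbo.Orders') AND name = 'ShippedDate')
BEGIN
    ALTER TABLE dbo.Orders ADD ShippedDate DATETIME NULL;
END
GO

IF OBJECT_ID('dbo.Orders_GetUnshipped', 'P') IS NOT NULL
    DROP PROCEDURE dbo.Orders_GetUnshipped;
GO

CREATE PROCEDURE dbo.Orders_GetUnshipped
AS
    SELECT OrderId, CustomerId
    FROM dbo.Orders
    WHERE ShippedDate IS NULL;
GO
```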
My opinion: don't work too much on weekends, or make sure your client is being billed for that.
For linked tables, you need to delete them entirely and recreate them. Using the Linked Table Manager to refresh ODBC links doesn't even work reliably when you're still using ODBC, as there is data cached in the table link definitions that doesn't get refreshed (e.g., if you change the number of columns in a SQL Server view, refreshing the table link with the Linked Table Manager will update the number of columns, but you won't necessarily get an updateable view (assuming the original was updateable)).
But it's not clear whether you'll lose functionality. That all depends on how much of your application's logic is server-side, in SQL Server views and stored procedures. None of those will work if you link to a Jet back end.
This is probably the most classic database problem.
I have an E-commerce software solution hosted on a SQL server for data, and a web server for the frontend. Every instance/customer has its own database on SQL Server 2008.
During development of the next version, I might change or add tables, views, stored procedures etc.
How do I publish these changes to all the databases without losing data? It should be done via a script or something similar. Centralized management is the key...
Perhaps it's something you've already considered, but my company uses software from Red Gate (http://www.red-gate.com/) which compares our development version of the DB and the production one, generates and executes scripts to bring production on par with development.
(I'm not a sales person from Red Gate, but I think this might be what you're looking for)
I use SQL Compare for schema changes and SQL Data Compare for data changes. Works like a charm!
This problem is essentially one of automating the manual process of logging on to a SQL Server, and running a script against one or more databases, that does the modifications you need.
It's made worse, of course, if the instances of SQL Server that you need to update are remote from you, and therefore not directly accessible.
It's also vital to ensure that the scripts are applied in sequence - there would be no point running the "add index" script before the "create table" script.
The way we've solved this is with a web service that packages script files as datasets, and delivers them in the correct sequence to the remote systems when they call home.
On the remote SQL Server, we have a .NET application which calls the web service, downloads the script files, unpacks them and applies them to the database.
When the remote system calls in, it supplies the ID of the most recent upgrade it has. When the web service completes, it knows the last one it delivered. It's therefore trivial to know what level the remote systems are at.
The only manual intervention required is to create the scripts in the first place, and upload them to the central server.
A script should be executed on the SQL Server machine by the DB admin.
The basic algorithm of such a script is: create a backup, then for each table in a loop, lock the table, alter it, and release the lock.
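As a rough sketch of how that could be centralized on one instance, the script below loops over every customer database and applies the same guarded change; the 'Shop_%' naming convention and the change itself are assumptions for illustration:

```sql
-- Apply one change to every customer database on the instance from a single,
-- centrally managed script. Database naming and the change are placeholders.
DECLARE @db SYSNAME, @sql NVARCHAR(MAX);

DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE name LIKE N'Shop_%';

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'USE ' + QUOTENAME(@db) + N';
        IF COL_LENGTH(''dbo.Products'', ''IsFeatured'') IS NULL
            ALTER TABLE dbo.Products ADD IsFeatured BIT NOT NULL DEFAULT 0;';
    EXEC sys.sp_executesql @sql;  -- runs the guarded change in that database

    FETCH NEXT FROM db_cursor INTO @db;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;
```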
Another poster mentioned the Red Gate products, and I'll throw another commercial product out there - Quest Change Director:
http://www.quest.com/change-director-for-sql-server/
Disclaimer: I work for Quest, although I'm not in sales. Change Director does comparisons, syncing, links to a change management system, can use your dev/qa server as a source or use T-SQL scripts, has an audit trail and rollback capabilities, etc.
Like you said, central management is key, and this product focuses on that.