So far, whenever I make changes to my local DB, I have to manually write a script and run it against the SQL Azure DB to apply those changes there. Is there any solution that would help my SQL Azure DB get updated more quickly? For example, right after I check my code into source control, my SQL Azure DB would be updated automatically. I do have a DB project in my solution. I've heard that Redgate might have some products that can solve my problem, but I'm not sure. Any suggestions?
PS: If this question is not appropriate to ask here, please warn me so I can delete it.
Edit: Another approach I normally use is to run a schema compare between the DB project and the SQL Azure DB and then update. That way I don't need to write the script manually, but I still need to do multiple steps to get the DB in Azure updated.
Related
I have two SQL Server databases running on Azure - one Prod and one Test. I want the Test database to automatically sync its data and structure (including any table or stored procedure changes) with Prod every night, so that Test is a mirror of Prod that can be used for development / QA.
I am wondering how to achieve this on Azure. Please note that I would like this to happen automatically and on a schedule. Thanks, everybody.
At the moment, we can use the Azure SQL Data Sync service to sync data between databases, but we cannot use the service to sync stored procedures. For more details, please refer to here and here. So if we want to sync stored procedures, the SQL Server Management Studio Generate Scripts Wizard is an easy means of producing a script that will copy all of your stored procedures to another database.
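If you prefer T-SQL over the wizard, a minimal sketch along these lines (assuming all you need are the procedure definitions themselves) would pull them out of the Prod database so they can be re-applied to Test:

-- Run against the Prod database: list every stored procedure definition
-- so the CREATE PROCEDURE text can be re-applied to the Test database.
SELECT s.name AS schema_name,
       p.name AS procedure_name,
       OBJECT_DEFINITION(p.object_id) AS procedure_definition
FROM sys.procedures AS p
JOIN sys.schemas AS s ON s.schema_id = p.schema_id
ORDER BY s.name, p.name;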
Besides, regarding how to create a schedule to start Azure SQL Data Sync, please refer to the blog.
I have an Access database stored on a network drive, where all users open that one file. The database is linked to SQL Server tables located on a local, on-site server, and in my VBA code the connection is to the Access database. I know it's possible to connect to SQL Server directly from VBA, but all my queries are stored in Access, so will my code still be able to run the Access queries if it's connected to SQL Server, or would I need to rewrite all the queries? The problem we are having is that more than one user may have the same record pulled up, and they are overwriting each other's changes. Also, a user may need to take the program on their laptop instead of having to remote in to their desktop at the office; I was thinking I could just give each of them a copy and that would solve the problem. Does anyone have any answers?
Just rewrite the queries in SQL Server. It may be painful now, but it shouldn't be too bad, and down the road a bit you'll be glad you moved everything to SQL Server (much faster, more stable, you're using a real DB, etc.).
You will want to pull all the queries into VBA and rewrite them with the appropriate parameters.
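For example, an Access saved query could become a parameterized stored procedure on SQL Server. The object names below (dbo.Orders, usp_GetCustomerOrders, @CustomerID) are purely illustrative:

-- Illustrative only: a parameterized stored procedure standing in for a
-- hypothetical Access saved query that looks up a customer's orders.
CREATE PROCEDURE dbo.usp_GetCustomerOrders
    @CustomerID INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT o.OrderID, o.OrderDate, o.TotalAmount
    FROM dbo.Orders AS o
    WHERE o.CustomerID = @CustomerID
    ORDER BY o.OrderDate DESC;
END;

The VBA side would then call the procedure through an ADODB.Command with a parameter instead of opening an Access QueryDef.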
I am trying to use the Copy Database Wizard to copy a database from my live server (shared hosting) to my local machine. Both the live and local servers are SQL Server 2008 R2.
I have used CDW for several years with perfect success when copying from a live SQL 2000 server to my local 2008 R2. But now that I have migrated my live database to SQL 2008 R2, the CDW is giving me this error:
Could not read metadata, possibly due to insufficient access rights.
I've learned that this error can be predicted before you even complete the CDW setup: On the page where the CDW asks you for your desired destination database name, it is SUPPOSED to populate the .mdf and .ldf files with their name-to-be and size (e.g. MB, GB).
But in my case these file names and sizes are not being shown (area is simply blank in the wizard) and then of course when I attempt to execute the package it gives me the error.
After much research I believe the reason for this error is the CDW requirement that "You must be a member of the sysadmin fixed server role on both the source and destination servers."
On my local server, my Windows Authentication login is listed as a Role Member for the sysadmin Server Role. However on my live server (keep in mind it is a shared SQL server with 250+ databases) the only Role Member listed is [sa].
Am I right in thinking that the only way to satisfy this requirement would be to add my specific SQL user to the live/source Server > Security > Server Roles > sysadmin role? I'm guessing that would never be done on a shared server right? Or is there some other way to make it work by messing with the specific database properties/users/roles?
I can't explain why CDW is working from the live SQL 2000 server and not the 2008 R2. I HOPE it is simply that something isn't set up right on the live database, but maybe it is due to changes that were made to SQL security over the years.
In case it matters, I must use the SMO method instead of detach/attach because it is a live database that I don't want to take down. Historically the CDW from SQL 2000 only takes 3 minutes with SMO method so speed isn't an issue anyway.
Here's my preference for a solution:
1) Find a way to get CDW to work, most likely by changing something on the live server. Is this possible? What would it be?
If that fails, then...
2) What about the idea of using CDW to create the package, but then going into BIDS and manipulating something in the package to circumvent the sysadmin role requirement? (Does it really need the metadata? I don't need anything besides the actual data tables.) Is this possible?
UPDATE 6/14/2016: Editing a CDW package in BIDS won't work as it appears to simply use the .mdf and .ldf files, which of course I don't have access to on the shared server. I think an alternative is to use Import/Export Wizard to create a package, then edit in BIDS. The annoying part is that without access to metadata the Import/Export Wizard doesn't seem to be aware of Foreign Keys, and thus doesn't know what order to process the tables in.
If that fails, then...
3) Is there any other way to easily automate a daily copy from my live server to my local machine? The reason I like CDW is that it is super simple to use (when it works), it can be scheduled to run daily as a SQL Agent job, and it requires no manual work on my part. Is there a "next best thing" if CDW can't be made to work?
You'd think that a very common scenario for all websites out there would be "how do I get a copy of my live database onto my local SQL server, daily, automatically"? But maybe I'm the weird one!
Another simple solution would be the Import/Export Wizard.
In SSMS, right-click the database you want transferred and select 'Tasks' and then 'Export Data...'. It will open a wizard that is very similar to the CDW. The difference here is that I could not find a sysadmin requirement to use it.
At the end it will give you the option to run immediately and/or save the SSIS package. If you save the SSIS package (I prefer to save it to disk), you can then schedule it via a SQL Agent job.
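Once the package is saved to disk, a job along these lines could run it nightly. This is only a sketch; the job name, package path, and start time are placeholders:

-- Sketch: schedule a saved SSIS package via SQL Agent (placeholder names/paths).
USE msdb;
GO
EXEC dbo.sp_add_job         @job_name = N'Nightly copy of live DB';
EXEC dbo.sp_add_jobstep     @job_name = N'Nightly copy of live DB',
                            @step_name = N'Run export package',
                            @subsystem = N'SSIS',
                            @command = N'/FILE "C:\Packages\ExportLiveDb.dtsx"';
EXEC dbo.sp_add_jobschedule @job_name = N'Nightly copy of live DB',
                            @name = N'Nightly at 2 AM',
                            @freq_type = 4,            -- daily
                            @freq_interval = 1,
                            @active_start_time = 020000;
EXEC dbo.sp_add_jobserver   @job_name = N'Nightly copy of live DB',
                            @server_name = N'(local)';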
I have a few databases that I want to migrate to another server. These are production databases; what is the best way to migrate them?
1) Take a full backup of the current database and then restore it onto the other server (a rough sketch of this option is below).
or
2) Detach the database, copy the .mdf/.ldf files to the destination server, and then attach the files there.
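For reference, a minimal sketch of option 1; the database name, logical file names, and paths are placeholders, and COPY_ONLY is used so the backup doesn't disturb an existing backup chain:

-- On the source server (placeholder names and paths):
BACKUP DATABASE MyProdDb
    TO DISK = N'\\FileShare\Backups\MyProdDb.bak'
    WITH COPY_ONLY, STATS = 10;

-- On the destination server:
RESTORE DATABASE MyProdDb
    FROM DISK = N'\\FileShare\Backups\MyProdDb.bak'
    WITH MOVE N'MyProdDb'     TO N'D:\Data\MyProdDb.mdf',
         MOVE N'MyProdDb_log' TO N'E:\Logs\MyProdDb_log.ldf',
         STATS = 10;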
I know that after migrating, SQL logins and SQL Agent jobs will have to be created manually. Are there any other risks that come to mind?
Any help will be appreciated.
Thanks,
Ben
For login transfer, use "sp_help_revlogin". You can get the script here:
https://support.microsoft.com/en-us/kb/918992
This stored procedure lists out all of the instance-level logins, not just the logins for a particular database. One special thing about this stored procedure is that, because it scripts the logins with their original SIDs and password hashes, there is no need to fix orphaned users afterwards. Just migrate the logins and check. It works.
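Once the procedures from that article have been created in the master database of the source instance, usage is simply:

-- Run on the source instance; the output is a set of CREATE LOGIN statements
-- (with the original SIDs and hashed passwords) to run on the destination.
EXEC sp_help_revlogin;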
I'll start by explaining what the current setup is and then go on to where it needs to be.
Currently we have a local SQL Server database for a CMS. The database is updated from other servers on site with the product information that is displayed on the website, and the CMS information is updated from an MVC application.
Moving forward, we need a remote server with a SQL Server database that is an identical copy of the local database; this database will never be updated from the remote location.
The problem arises when attempting to design a method to sync the data from the local database to the remote server with no downtime on either end. I know SQL Server Enterprise has features that would help in this case, but we do not have a licence for it at this time.
The best idea we have come up with is to log ship to the remote server, create a restored copy from the database that is receiving the shipped logs, and then update the website's web.config to point to the newly restored database. This could work, but it seems overly complicated, and we would have the issue of an ever-changing database name.
If anyone can think of a better/simpler solution, or a way to improve the current idea, it would be much appreciated.
If anything is not clear or more info is needed, let me know.
I think the log shipping solution fits your needs. After a one-time setup process, the logs would be continually shipped from the local DB to the remote DB, keeping the remote in sync with the local one and providing you with the read-only copy.
Log shipping is available in all editions except Express.
There would be no need for a continuous restore process.
You can find more information here.
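One detail worth noting: for the remote copy to stay readable, the log backups have to be restored WITH STANDBY rather than WITH NORECOVERY (in the log shipping wizard this is the "Standby mode" option for the secondary). A minimal sketch of what that restore step does, with placeholder names and paths:

-- The log shipping restore job does the equivalent of this automatically
-- when the secondary is configured in standby mode (placeholder names/paths).
RESTORE LOG CmsDb
    FROM DISK = N'\\RemoteServer\LogShip\CmsDb_20240101_0100.trn'
    WITH STANDBY = N'D:\Data\CmsDb_undo.tuf';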