I have a problem using Amazon RDS SQL Server (2014) with Visual Studio SQL Server Data Tools (SSDT), and there's not much in the way of help on the AWS support site.
I have spun up an RDS instance and accessed it with SQL Server Management Studio with no problems. I then create a database, run a schema compare from SSDT, and hit Update.
The first thing the update process does is amend the db_owner authorisation, which completely locks the master user out of the database on RDS. The change is identified when you hit Compare in SSDT, but there is no way of turning it off that I can see.
Can anyone tell me a way around the problem?
If you want to stop it deploying permissions: I wrote this for an environment where I wasn't dbo and the same thing kept happening:
http://agilesqlclub.codeplex.com/
The IgnoreSecurity option will stop you from hurting yourself.
Ed
On the Schema Compare page, click the settings icon and go to the Object Types tab. Within the Application-scoped list there is an option for Role Memberships.
Uncheck this and re-run the compare - the line item forcing the change of authorisation disappears. This keeps rdsa as the db_owner and everything syncs correctly.
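For reference, the change SSDT keeps trying to deploy is an ownership transfer. In the generated update script it looks something like this (the database name here is just a placeholder):

ALTER AUTHORIZATION ON DATABASE::[YourDatabase] TO [dbo];

On RDS the master user (rdsa in this case) is not dbo, so once that statement runs the master user loses its rights over the database, which is why excluding Role Memberships from the compare avoids the lock-out.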
I am trying to back up my entire SQL Server database so I can restore it in case I mess something up (I'm about to revamp my entire Umbraco site). According to Microsoft's guide (and others as well), there should be a task called Back Up. However, there is not.
That is for SSMS 17 (version 14). SSMS 2016 (version 13) shows the exact same thing.
To back up a SQL Azure database you need to select the "Export Data-tier Application..." option.
This will create a .bacpac file which you can then restore to either another SQL Azure database or an on-premises SQL Server.
See the Microsoft Documentation here for more details.
In Azure SQL DB, backups occur automatically. If you want to export a database, you can export to a BACPAC (make sure active transactions are not occurring during the export). See https://learn.microsoft.com/en-us/azure/sql-database/sql-database-automated-backups and https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export.
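If you'd rather script the export than click through SSMS, the SqlPackage utility produces the same .bacpac from the command line. A minimal sketch, where the server name, credentials, and file paths are placeholders:

SqlPackage /Action:Export /SourceServerName:yourserver.database.windows.net /SourceDatabaseName:YourDb /SourceUser:youradmin /SourcePassword:yourpassword /TargetFile:C:\Backups\YourDb.bacpac

The resulting .bacpac can then be brought into a local instance with /Action:Import and the corresponding Target* parameters.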
For me the problem was the database name, of all things. I know it sounds weird, but while researching this issue I discovered a bug in SSMS that causes the context menu to change (including not having backup/restore options) based simply on the database name.
I'm using 18.4, but the bug likely exists in earlier versions as well.
I've reported the bug, but I'll summarize the steps to reproduce the issue here for convenience:
1. In SSMS, right-click the Databases node and choose New Database...
2. Name it 1.2.3.4 and click OK.
3. Right-click the newly created database: there is no option to Back Up (or Restore).
4. Click on the name (so that you can rename it).
5. Rename the 1.2.3.4 database to 1234 (remove the periods).
6. Perform step 3 again. This time you'll see the expected Back Up... and Restore options.
It appears that the context menu in SSMS is incorrectly inferring the database type from some pattern in the name involving periods. As for the specific pattern, I don't know. I do know that periods in general aren't a problem, but names like 1.2.3.4 are problematic; Test_1.2.3.4 is fine. I'll leave it to someone actually debugging the problem to figure it out.
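If you want to reproduce (or work around) this without the GUI, the same steps in T-SQL look like this. Note that BACKUP DATABASE itself works regardless of the name, so it also serves as a workaround while the menu is broken (the backup path is just an example):

CREATE DATABASE [1.2.3.4];
GO
-- The context menu is broken for this name, but a T-SQL backup still works:
BACKUP DATABASE [1.2.3.4] TO DISK = N'C:\Backups\1.2.3.4.bak';
GO
-- Renaming the database restores the Back Up... / Restore menu items:
ALTER DATABASE [1.2.3.4] MODIFY NAME = [1234];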
Hopefully this will help someone else that comes along looking for answers.
My scenario: my live database, named vlproduction, is on Azure, and I am using SQL Server 2014 on my local machine with a database named testvlproduction. For some reason, my testvlproduction database was deleted.
I want to regenerate testvlproduction so it is the same as vlproduction, but I found there is no way to take a direct backup of the live database, so I generated a script with data. The script is too big (300 MB), and whenever I try to run it on my local machine it shows:
System.OutOfMemoryException
Please tell me what to do to fix this, or is there another way to generate a local database that is the same as live?
Is there any such functionality already built into SQL Server?
Maybe this question is a repeat, but I still have no solution for my issue.
Feel free to ask any questions.
Thanks
This is a limitation of SQL Server Management Studio: it happens because Management Studio is running out of memory, not the SQL Server service.
This is likely caused by the size of the result set that you are returning to Management Studio.
See https://support.microsoft.com/en-in/kb/2874903 for more details.
For more, see sql-server-management-studio-cant-handle-large-files.
Since your SSMS runs out of memory, you should try a newer release. It should be compatible with SQL Server 2014.
If you have some idle time on your production database, take a backup and restore it locally: use Export Data-tier Application and select the proper version where you want to restore. If this is not an option for you, I'd suggest taking @mohan111's suggestion of running your script in batches.
I believe you have two problems:
System.OutOfMemoryException - this usually happens while running big scripts in SSMS. One workaround is to run the script from the command prompt using the sqlcmd utility (see https://msdn.microsoft.com/en-us/library/ms180944.aspx and the sketch after this list).
Creating a test database with the same schema as production - you can make use of the SSDT tooling in Visual Studio. It lets you create a database project that mimics the production database, and you can then use the publish functionality to create a database with the same schema wherever required.
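For the first problem, a minimal sqlcmd invocation might look like the following; the server name, database name, and file paths are placeholders, and -E uses Windows authentication:

sqlcmd -S localhost -d testvlproduction -E -i C:\scripts\vlproduction_with_data.sql -o C:\scripts\run_output.txt

Because sqlcmd streams the file batch by batch instead of loading the whole script into an editor window, the 300 MB script that exhausts SSMS's memory should run without trouble.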
I am trying to use Copy Database Wizard to copy from my live server (shared hosting) to my local machine. Both the live and local servers are SQL 2008 R2.
I have used CDW for several years with perfect success when copying from a live SQL 2000 server to my local 2008 R2. But now that I have migrated my live database to SQL 2008 R2 the CDW is giving me this error:
Could not read metadata, possibly due to insufficient access rights.
I've learned that this error can be predicted before you even complete the CDW setup: On the page where the CDW asks you for your desired destination database name, it is SUPPOSED to populate the .mdf and .ldf files with their name-to-be and size (e.g. MB, GB).
But in my case these file names and sizes are not being shown (area is simply blank in the wizard) and then of course when I attempt to execute the package it gives me the error.
After much research I believe the reason for this error is the CDW requirement that "You must be a member of the sysadmin fixed server role on both the source and destination servers."
On my local server, my Windows Authentication login is listed as a Role Member for the sysadmin Server Role. However on my live server (keep in mind it is a shared SQL server with 250+ databases) the only Role Member listed is [sa].
Am I right in thinking that the only way to satisfy this requirement would be to add my specific SQL user to the live/source Server > Security > Server Roles > sysadmin role? I'm guessing that would never be done on a shared server right? Or is there some other way to make it work by messing with the specific database properties/users/roles?
I can't explain why CDW is working from the live SQL 2000 server and not the 2008 R2. I HOPE it is simply that something isn't set up right on the live database, but maybe it is due to changes that were made to SQL security over the years.
In case it matters, I must use the SMO method instead of detach/attach because it is a live database that I don't want to take down. Historically the CDW from SQL 2000 only takes 3 minutes with SMO method so speed isn't an issue anyway.
Here's my preference for a solution:
Find a way to get CDW to work, most likely by changing something on the live server. Is this possible? What would it be?
If that fails, then...
What about the idea of using CDW to create the package, but then going into BIDS and manipulating something in the package to circumvent the sysadmin role requirement? (Does it really need the metadata? I don't need anything besides the actual data tables.) Is this possible?
UPDATE 6/14/2016: Editing a CDW package in BIDS won't work as it appears to simply use the .mdf and .ldf files, which of course I don't have access to on the shared server. I think an alternative is to use Import/Export Wizard to create a package, then edit in BIDS. The annoying part is that without access to metadata the Import/Export Wizard doesn't seem to be aware of Foreign Keys, and thus doesn't know what order to process the tables in.
If that fails, then...
Is there any other way to easily automate a daily copy from my live server to local machine? The reason I like CDW is because it is super simple to use (when it works), it can be scheduled to run daily as a SQL agent job, and requires no manual work on my part. Is there a "next best thing" if CDW can't be made to work?
You'd think that a very common scenario for all websites out there would be "how do I get a copy of my live database onto my local SQL server, daily, automatically"? But maybe I'm the weird one!
Another simple solution would be the Import/Export Wizard.
In SSMS right-click the database you want transferred and select 'Tasks' and then 'Export Data...'. It will open a wizard that is very similar to that of CDW. The difference here is that I could not find a sysadmin requirement to use it.
At the end it will give the option to run immediately and/or save the SSIS package. If you save the SSIS package (I prefer to save it to disk) you can then create a schedule via a SQL Agent job.
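For illustration, the saved package can then be run from a SQL Agent job step of type Operating system (CmdExec); the package path below is a placeholder:

dtexec /F "C:\SSIS\CopyLiveToLocal.dtsx"

Schedule that job daily and you have the hands-off copy described in the question.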
I have a set of databases on an Amazon RDS Instance. The version is SQL Server 2008 R2 and as far as I understand I cannot simply set up an audit via Management Studio. I am considering creating a table which will be filled by my ASP.Net application upon attempting a query, however this will not track a user that has made changes directly to my databases outside the application.
Does the Amazon console have anything to track database changes/user activity?
Thank you SO.
You might consider using audit triggers on the tables you wish to track.
This article describes how to implement a simple audit trigger:
https://www.simple-talk.com/sql/database-administration/pop-rivetts-sql-server-faq-no.5-pop-on-the-audit-trail/
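As a minimal sketch of the idea: the dbo.Customer table and its CustomerId column below are hypothetical, so adapt the column list to the tables you want to track. Because the trigger fires inside the database, it also captures changes made directly in Management Studio, not just those from the ASP.Net application.

-- Audit table recording who changed what, and when
CREATE TABLE dbo.CustomerAudit (
    AuditId    INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT      NOT NULL,
    Operation  CHAR(1)  NOT NULL,  -- 'I', 'U', or 'D'
    ChangedBy  SYSNAME  NOT NULL DEFAULT SUSER_SNAME(),
    ChangedAt  DATETIME NOT NULL DEFAULT GETDATE()
);
GO
CREATE TRIGGER dbo.trg_Customer_Audit
ON dbo.Customer
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Rows in inserted are INSERTs, or the new side of an UPDATE
    INSERT INTO dbo.CustomerAudit (CustomerId, Operation)
    SELECT i.CustomerId,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM inserted AS i;
    -- Rows only in deleted are DELETEs
    INSERT INTO dbo.CustomerAudit (CustomerId, Operation)
    SELECT d.CustomerId, 'D'
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted);
END;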
First a little introduction.
We have a SQL Server Express 2008 database whose schema is kept in source control in the form of scripts, originally created by opening the context menu on the database in Management Studio and selecting Tasks | Generate Scripts....
Each developer has a local SQL Server Express 2008 database and there is one central SQL Server 2008 database used by the CI server. Any time a developer changes the schema of the local database, (s)he regenerates the scripts, commits them along with the source code, and updates the schema on the central database used by the CI server.
Sometimes, the changes are vast and it is easier to simply delete the entire database and create it from scratch using the scripts, which is really easy.
BUT, there is one problem. Although the scripts do create the right database user (say, ServerUser), they do not grant that user the db_owner role membership required by the application. The developer must remember to open the properties dialog for the ServerUser user in Management Studio and check the db_owner checkbox in the Database role membership list. Needless to say, we forget this step frequently. I, myself, have just wasted about an hour trying to figure out why the server does not start.
Being somewhat a naive person, I thought I could manipulate the [master].[sys].[database_role_members] catalog with a simple SQL INSERT statement to add the necessary role membership automatically, as part of the scripts' execution. Of course, that failed with the error message Ad hoc updates to system catalogs are not allowed. Stupid me.
Anyway, my question is this: can we have a SQL script, run when the database and the ServerUser are created, that gives this user db_owner role membership in the new database?
Thanks.
You can use sp_addrolemember.
http://msdn.microsoft.com/en-us/library/ms187750.aspx
EXEC sp_addrolemember 'db_owner', 'username'
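A minimal sketch that could be appended to the generated scripts; the database name YourDatabase is a placeholder, and ServerUser is the user from the question. sp_addrolemember is the right call for SQL Server 2008, while on 2012 and later ALTER ROLE ... ADD MEMBER is the preferred replacement:

USE [YourDatabase];
GO
-- Grant the application user db_owner in the newly created database
EXEC sp_addrolemember 'db_owner', 'ServerUser';
GO
-- On SQL Server 2012+ the equivalent would be:
-- ALTER ROLE db_owner ADD MEMBER [ServerUser];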