I am trying to restore data from a series of databases onto a new server. The steps I have taken so far are:
Copy backup files (and transaction logs) to new server
Restore each database using SQL Server 2012
Attempt to run the Application Tier Only configuration tool in TFS 2015 (same version as the "LIVE" server).
When the wizard runs, I get through the initial checks, and then when it attempts to start the configuration I get the error below.
Can anyone suggest what the problem may be? (I have tried RemapDBs but keep getting the syntax wrong.)
Error Text:
TF255356: The following error occurred when configuring the Team Foundation databases: TF246083: The configuration of Team Foundation Server is not valid. You must remap the databases in order to fix the configuration. The following error was received from the server: TF400673: Unable to find any compatible SQL Analysis Services database within the specified instance.
'2' hosts have been given updated connection strings.
It seems you are trying to clone a server. Please make sure you follow the instructions exactly: Move or Clone Team Foundation Server from one hardware to another, or Restore data to a different server than the current one for TFS.
Anyway, you can refer to the information below to fix the issue quickly:
According to the error message, it seems you have not restored the TFS_Analysis database.
(For more information, see Chaminda's Blog - Prepare Restored Databases.)
Restore the TFS_Analysis database first. (If it was already restored, try removing it and then restoring it again.)
Then run the PrepareClone command to check whether there are any errors. (You must run this command before configuring the application tier of your cloned TFS.)
TFSConfig PrepareClone /SQLInstance:ServerName /DatabaseName:TFSConfigurationDatabaseName /notificationURL:ApplicationTierURL
After PrepareClone, execute the ChangeServerID command to change the server GUIDs associated with the TFS databases.
TFSConfig ChangeServerID /SQLInstance:ServerName /DatabaseName:ConfigurationDatabaseName
After that, execute the RemapDBs command to redirect TFS to its databases on the new hardware.
TFSConfig RemapDBs /DatabaseName:ServerName;ConfigurationDatabaseName /SQLInstances:ServerName1,ServerName2 [/AnalysisInstance:ServerName] [/AnalysisDatabaseName:DatabaseName] [/preview] [/continue] [/usesqlalwayson]
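For example, if the restored configuration database kept its default name and everything now lives on a single new SQL instance, the sequence might look like this (all server, database, and URL names here are illustrative, not taken from the question; run the commands from an elevated command prompt in the TFS Tools folder, typically ...\Microsoft Team Foundation Server 14.0\Tools for TFS 2015):
TFSConfig PrepareClone /SQLInstance:NEWSQL01 /DatabaseName:Tfs_Configuration /notificationURL:http://newtfs:8080/tfs
TFSConfig ChangeServerID /SQLInstance:NEWSQL01 /DatabaseName:Tfs_Configuration
TFSConfig RemapDBs /DatabaseName:NEWSQL01;Tfs_Configuration /SQLInstances:NEWSQL01 /AnalysisInstance:NEWSQL01 /AnalysisDatabaseName:Tfs_Analysis
Note the RemapDBs quirk that often trips up the syntax: /DatabaseName takes the server and database joined by a semicolon, while /SQLInstances is a comma-separated list.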
I have taken a look at several articles including this unanswered question: SQL Server Job runs successfully but doesn't execute packages
I have the exact same problem in SQL Server 2012 using the Integration Services MSDB catalog. I can execute the SSIS packages manually from that catalog, but the Agent job doesn't do anything except state that it completed successfully. I have also executed my SSIS packages from within Visual Studio and they worked just fine. Here's the situation; I am wondering if it may be permissions:
SSIS packages look for Excel files matching criteria in a network location.
Once found, the SSIS packages write the data into the database and archive the file to another folder on that same network location.
Emails are sent upon any failure of import of data into the database or migration into the archive folders.
I have the SQL Agent job running the SSIS packages from a package store (MSDB) under the SQL Server Agent service account. Currently we are not doing any sort of project deployment to these servers, so I am sticking with package deployment. Here are some steps I've taken:
Run packages manually from Visual Studio 2010 (fully successful).
Run packages manually from SQL Server MSDB catalog (fully successful).
Run job manually from SQL Server Agent using parent package as a step that will execute child packages as an external reference (success but nothing happens).
Run job manually from SQL Server Agent using each package as its own step excluding the parent package (success but nothing happens).
Any ideas? Could it be permissions to the network location, or do I need a proxy? Again, I am running Microsoft SQL Server 2012 Enterprise Edition 64-bit. Many thanks for any help you can provide.
Found the problem. My SSIS package has a Foreach Loop container, and while the tasks inside the loop container couldn't access the destination, the loop container technically completed successfully. To correct that, we had to grant the account the job steps run under permissions on the network location, so that account can read and write there. Additionally, my Excel connection was 64-bit, so we set the job to use the 32-bit runtime, and this allowed that portion of the process to complete successfully. I re-enabled any disabled tasks and it looks good to go now. Thanks!
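For reference, granting that access from an elevated command prompt can look like the line below; the UNC path and account are placeholders, not the actual values from this environment, and for a network share the share-level permissions must allow the account as well:
icacls "\\fileserver\imports" /grant "DOMAIN\svc-sqlagent:(OI)(CI)M"
(OI)(CI)M grants Modify with object and container inheritance, i.e., read/write on the folder and everything under it.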
I have also faced this scenario many times. When I ran the package manually it completed successfully, because I was using a Foreach Loop container and a Sequence container as well; in both cases the Foreach Loop and Sequence containers completed without validating the components inside them. So I checked the precedence constraints and changed them, and now everything works and all the components run successfully.
Sometimes we fail to choose the appropriate precedence constraint. There are several options for the constraint itself (Success, Failure, Completion), and for the evaluation operation you can choose Constraint, Expression, Expression AND Constraint, or Expression OR Constraint.
Initially I was using Expression OR Constraint for Success; after changing it to Expression AND Constraint, it works fine for me.
Please try this as well and let me know if it works for you.
When I extracted a data-tier application from a Microsoft Azure SQL database that has a master key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However, the steps provided in the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with the DROP MASTER KEY command.
Microsoft Tech Support verified this solution did not work on my installation of SQL Server; after actually taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the master key from the bacpac file. I have a PowerShell script to remove the master key from the BACPAC file, but it requires extracting, renaming files, and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts that would automate the process of removing the master key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code in my scripts on other blogs, etc. I am not able to give credit to those folks, as I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to credit you for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The PowerShell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are accomplished as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
Back up the current DB as MyLocalDB.bak.
Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231)
Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on)
Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
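In T-SQL, steps 1-3 look roughly like this (a sketch only; the backup path and logical file names are illustrative and should be checked with RESTORE FILELISTONLY):
-- Step 1: back up the current local DB
BACKUP DATABASE MyLocalDB TO DISK = 'C:\Backups\MyLocalDB.bak' WITH INIT;
-- Step 2: restore it as a date-stamped copy
RESTORE DATABASE MyLocalDB20171231 FROM DISK = 'C:\Backups\MyLocalDB.bak'
WITH MOVE 'MyLocalDB' TO 'C:\Data\MyLocalDB20171231.mdf',
     MOVE 'MyLocalDB_log' TO 'C:\Data\MyLocalDB20171231_log.ldf';
-- Step 3: drop the original so its name is free for the import
DROP DATABASE MyLocalDB;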
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1'
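xp_cmdshell is disabled by default, so if that EXEC fails with a "component turned off" error, it has to be enabled first (requires sysadmin, and is part of the open security posture described above):
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;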
The PowerShell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash. This is required so that the file does not appear to have been tampered with when SQL opens the bacpac file.
Re-zip the files into a new file, today-patched.bacpac. (A sketch of these zip-patching steps appears after this list.)
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
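The zip-patching steps above (remove the key node, fix the hash, re-zip) can be sketched in PowerShell roughly as follows. This is a sketch only, not the author's actual script (that is in the GitHub repo): the XML element and checksum tag names vary by DacFx version and are assumed here, as is the detail that the hash is the SHA-256 of model.xml stored in Origin.xml.
# Sketch: patch a bacpac (an ordinary zip file) to strip the master key and fix the checksum.
# Paths, element names, and checksum format are assumptions; verify against your own files.
$src  = 'C:\Git\GetUpdatedAzureDB\today.bacpac'
$work = 'C:\Git\GetUpdatedAzureDB\bacpac_work'
Copy-Item $src "$work.zip" -Force                       # Expand-Archive wants a .zip extension
Expand-Archive "$work.zip" -DestinationPath $work -Force
# Remove the master key element from model.xml (element name assumed)
(Get-Content "$work\model.xml" -Raw) -replace '(?s)<Element Type="SqlMasterKey".*?</Element>\s*', '' |
    Set-Content "$work\model.xml" -Encoding UTF8
# Recompute model.xml's SHA-256 and patch it into Origin.xml (tag shape assumed)
$hash = (Get-FileHash "$work\model.xml" -Algorithm SHA256).Hash
(Get-Content "$work\Origin.xml" -Raw) -replace '(<Checksum Uri="/model.xml">)[0-9A-Fa-f]+(</Checksum>)', "`${1}$hash`$2" |
    Set-Content "$work\Origin.xml" -Encoding UTF8
# Re-zip into the patched bacpac that SqlPackage.exe then imports
Compress-Archive -Path "$work\*" -DestinationPath "$work-patched.zip" -Force
Move-Item "$work-patched.zip" 'C:\Git\GetUpdatedAzureDB\today-patched.bacpac' -Force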
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!
The goal: decommission the old server where TFS/SQL was originally installed, and run TFS/SQL on a new server. To add insult to injury, the old server I will reference here is SBS 2011; if you know anything about that environment, you may understand why it is slated for decommissioning.
I performed a restoration-based move last week. While it was successful with respect to functionality, I now have what I would describe as a dual data + application tier implementation. Today, I have TFS/SQL installed on two servers, both with TFS Version: 11.0.60610.1 (Tfs2012.Update3) and SQL Version: 2008R2. Both servers in the same domain.
My curiosity lies in the behavior of the Tfs_Configuration DB. I restored both the Tfs_ DB and the Tfs_Configuration DB (via .BAK files) to the new SQL server, but I still see activity happening on the old server in "C:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\Web Services\_tfs_data", and no updated/recent files in the same location on the new server, suggesting the Tfs_Configuration DB really did not move/restore properly.
In the TFS console on the new server, I see the URLs in the “Application Tier Summary” section referring to the old server, but the Machine Name is the new server. I also see in the "Application Tiers" section, a reference to the old server Machine name. Yet, in verifying change logs, the Tfs_ db is now resident on the new server and accepting Visual Studio commits/check-ins. There is a Tfs_Configuration db on the new server, but it seems to be the default install copy and not my restored db.
In the various guides I have read, I do understand the web.config file holds the pointer to the configuration catalog, here: "appSettings … add key="applicationDatabase" value="Data Source=instance name;Initial Catalog=Tfs_Configuration;Integrated Security=True". I was expecting to change that entry once it migrated to the new server, but it is still parked on the old server.
I have turned off the TFS and SQL services on the old server as a trial to see if the new installation would pick up the load, but as you might expect, TFS then goes into an unavailable state to the users.
The primary questions are:
Why did the Tfs_Configuration db not restore to the new server in the same fashion as the Tfs_ db?
How can I move that Tfs_Configuration db and turn off that old SBS 2011 unit?
Any tips or tricks are welcomed and appreciated.
Thank you.
What you did is a non-trivial operation (see Move Team Foundation Server from One Hardware Configuration to Another).
Typical missing steps:
Changing URLs
Cleaning caches
Changing server ID if you want to keep both instances live
Changing accounts in case you used local user accounts
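On TFS 2012 the database-side fixes correspond to TFSConfig commands run on the new application tier; a sketch with an illustrative instance name (TFSConfig lives in the Tools folder of the TFS 11.0 install):
TFSConfig RemapDBs /DatabaseName:NEWSQL;Tfs_Configuration /SQLInstances:NEWSQL
TFSConfig ChangeServerID /SQLInstance:NEWSQL /DatabaseName:Tfs_Configuration
TFSConfig RegisterDB /SQLInstance:NEWSQL /DatabaseName:Tfs_Configuration
RegisterDB is what repoints an application tier at a restored Tfs_Configuration database, which matches the symptom described above (the AT still referencing the old server).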
I have completed this process successfully. It required a triangulation between the three servers. The essential aspects involved solid SQL backups, coupled with the settings.xml from healthy TFS Console backups.
It was certainly a process that took a lot of planning and anticipating snags.
All-in-all, it was a great exercise in watching the data flow and understanding the roles of the configuration and collection DBs. Thank you for responding to my inquiry.
I want to install DB2 UDW on my machine for learning purposes, but I am having a hard time configuring the local instance. Any help would be highly appreciated.
I installed DB2 Express-C and selected all the default choices. I am trying to connect using IBM Data Studio 4.1. In the "DB2 First Steps" GUI I chose to create the SAMPLE database. I am getting the below error:
Creating database "SAMPLE" on path "C:"...
Existing "SAMPLE" database found...
The "-force" option was not specified...
Attempt to create the database "SAMPLE" failed
'db2sampl' processing complete.
I tried connecting from Data Studio using the following options:
Database- SAMPLE
Port- 50000
host - localhost
The error I am getting:
Explanation:
An attempt was made to access a database that was not found, has not been started, or does not support transactions.
User response:
Ensure that the specified database name exists in the system database directory. If the database name does not exist in the system database directory, either the database does not exist or the database name has not been cataloged. If needed, issue a db2start command and then resubmit the current command.
SQL4499N A fatal error occurred that resulted in a disconnect from the data source.
SQLSTATE: 08004
The problem is I have zero knowledge of DB2. If I need to run the db2start command, where should I run it from? Please help.
The instance is probably not started.
Once you have installed DB2, you need a started instance in order to use any database. The instance could have been created at the same time as the installation. You can verify which instances exist on your computer by issuing:
/opt/IBM/db2/V10.1/instance/db2ilist
The output should give you a set of users for which an instance has been configured.
You can switch to that user and start the instance. For example, if the user is db2inst1:
su - db2inst1
db2start
Once the instance is started, you can now create a database and then connect to it.
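On Windows, where the original poster is (note the C: path in the error), the same checks can be run from a DB2 command window (started with db2cmd) under the instance owner's account; and since db2sampl reported that SAMPLE already exists, the -force option it mentions will drop and recreate it:
db2start
db2 list database directory
db2sampl -force
db2 connect to SAMPLE
If SAMPLE then shows up in the database directory and the connect succeeds, Data Studio should be able to reach it on localhost:50000.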
I'm running the copy database wizard on a 2008 R2 instance of SQL Server.
The database I want to copy is a SQL 2000 database.
I'm copying that database to another SQL Server 2008 R2 instance.
The wizard uses SQL authentication for both servers, and both logins are sysadmins.
When I run it, I get the following error (FYI I have tried both copying the logins and leaving them out):
Event Name: OnError
Message: ERROR : errorCode=-1073548784 description=Executing the query "sys.sp_addrolemember @rolename = N'RandomRoleName..." failed with the following error: "The role 'RandomRoleName' does not exist in the current database.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
helpFile= helpContext=0 idofInterfaceWithError={C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
StackTrace: at Microsoft.SqlServer.Management.Dts.DtsTransferProvider.ExecuteTransfer()
at Microsoft.SqlServer.Management.Smo.Transfer.TransferData()
at Microsoft.SqlServer.Dts.Tasks.TransferObjectsTask.TransferObjectsTask.TransferDatabasesUsingSMOTransfer()
Any help would be appreciated!
Jim
My suggestion is: don't use the Copy Database Wizard. Create a full backup of the database on the 2000 server and then restore it on the 2008 server.
If you google "Microsoft.SqlServer.Management.Dts.DtsTransferProvider.ExecuteTransfer Copy Database Wizard" you will find that many, many people have gotten this same error or other nearly identical SMO errors; no one appears to have gotten past it.
That isn't to say it's impossible; it's just that restoring a backup is so much easier than the wizard, or than troubleshooting the wizard. Good luck.
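A minimal sketch of that backup/restore in T-SQL, with illustrative names and paths (run RESTORE FILELISTONLY first to confirm the logical file names used in the MOVE clauses):
-- On the SQL Server 2000 source:
BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb.bak' WITH INIT;
-- Copy MyDb.bak to the 2008 R2 server, then:
RESTORE DATABASE MyDb FROM DISK = 'C:\Backups\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'D:\Data\MyDb.mdf',
     MOVE 'MyDb_Log' TO 'D:\Data\MyDb_log.ldf';
Restoring a 2000 backup on 2008 R2 upgrades the database in place; remember to fix any orphaned users afterwards if you also move logins.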
The copy wizard missed some security objects; IIRC this is caused by subtle differences in the security tables, principals, etc. between the two versions.
Frankly, the easiest way is to do one of these two:
backup/restore
detach, copy, attach
If you don't have access to the OS and can't get it, another option is to create the missing role(s) in the background as the copy runs. You have to catch it between the creation of the files and the moment it tries to reference the roles, but there are a few seconds in which to create them if you keep clicking execute; I managed to create 9 roles this way (see the one-liner below).
Unfortunately, you'll also end up with the roles in another database (while yours cannot be used), so those need to be deleted afterwards.
Of course, this is only an option when you really can't use the other method.
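If you try this race, a role can be created quickly in the half-created database with a one-liner like this; the role and database names are whatever the wizard's error message reports (RandomRoleName in the error above):
USE TargetDb;
CREATE ROLE [RandomRoleName] AUTHORIZATION [dbo];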
Though the answer using the backup technique solves the problem in general, after facing the same issue several times I was able to trace the root of the problem using the Windows Event Viewer: the Copy Database Wizard creates a job for the SQL Server Agent to run, and the Agent then runs it under its own credentials (i.e., the account you can look up in Windows Services; in my case, NT Service\SQLAgent$SQL2014).
All you need to do is go to the folder where SQL Server creates DB files (e.g., C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA by default for SQL 2014) and give the SQL Agent Windows account read/write access to that folder.
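For example, from an elevated command prompt (the path and account are the ones named in this answer; adjust them for your instance, and note the $ is literal):
icacls "C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA" /grant "NT Service\SQLAgent$SQL2014:(OI)(CI)M"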
The reason can be that a file with the new database name already exists on the filesystem. We encountered this when we renamed database X to X_Old and tried to copy database Y to X. This cannot be done, because database X_Old is still associated with the filename X.
Either delete the conflicting database, or rename the file on the file system.
See http://codecopy.wordpress.com/2012/01/03/error-while-copying-a-database/