Related: Visual Studio 2013 (Professional Edition)
I am trying to create a data migration script to deploy changes to a staging server.
This works fine locally, but when I try to run the generated script against an Azure database, I get "TextPtr is not supported on Azure platform." I studied this further and found that newer editions of SQL Server (especially SQL Azure / Windows Azure, possibly SQL 2014) have dropped some keywords and functionality; the list can be found here.
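For reference, the deprecated pattern and a supported rewrite look roughly like this (dbo.Orders and its Notes column are made-up examples):

-- Legacy text/ntext access that Azure SQL Database rejects:
DECLARE @ptr varbinary(16);
SELECT @ptr = TEXTPTR(Notes) FROM dbo.Orders WHERE OrderID = 1;
READTEXT dbo.Orders.Notes @ptr 0 100;

-- Supported equivalent once the column is migrated to varchar(max)/nvarchar(max):
SELECT SUBSTRING(Notes, 1, 100) FROM dbo.Orders WHERE OrderID = 1;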
The SQL Database Project only provides Schema Compare; Data Compare is available in the Tools section (where we cannot set the Target Project Type property).
I wonder how I can deploy/migrate the changes made in one environment to another in such a situation. Currently I have to overwrite the existing database on the Azure platform.
But that is not ideal either: it might work the first time, but not afterwards, because changes may have been made to the staging or other environments in the meantime.
I had a similar problem when trying to migrate between test and staging environments in Azure. As a quick fix, I got around the problem by just doing a "copy" of the dev database via the Azure dashboard.
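For what it's worth, the same copy can be done in T-SQL on Azure; a minimal sketch, assuming a source database named DevDb:

-- Run against the master database of the target Azure SQL server:
CREATE DATABASE StagingDb AS COPY OF DevDb;
-- Copying from another logical server prefixes the source server name:
-- CREATE DATABASE StagingDb AS COPY OF devserver.DevDb;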
I'm currently working at an enterprise that uses TFS 2017, and we intend to upgrade to Azure DevOps. So far I'm just studying how to implement this tool. I'm new to DevOps and I have the following questions... Why do we need SQL Server to install Azure DevOps? What kind of information is stored in there?
In the server configuration wizard, I have the option to select an existing database to use for the Azure DevOps Server that's being deployed... Can I select the database that's currently used by TFS 2017?
Azure DevOps Server and Team Foundation Server store just about everything in massive SQL Server databases.
The main server configuration is stored in the tfs_configuration database, and each project collection is stored in a separate database; the default is tfs_defaultcollection.
The collection database holds all version-controlled files (TFVC and Git), all work items (Product Backlog, Sprints, etc.), Test Cases and all test run attachments, your Pipelines, Builds and Releases, as well as all of the artifacts produced by these pipelines.
These databases can grow considerably.
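A quick way to see how big they've become is to sum the file sizes from the catalog views; a sketch (no TFS-specific assumptions):

-- File sizes per database, in MB (size is in 8 KB pages):
SELECT DB_NAME(database_id) AS database_name,
       SUM(size) * 8 / 1024 AS size_mb
FROM sys.master_files
GROUP BY database_id
ORDER BY size_mb DESC;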
Whether you can keep your current database server depends on what version you're currently running. SQL Server 2016 SP1+ happens to be supported by both Azure DevOps Server 2020 and Team Foundation Server 2017, so you could keep using that for the upgraded installation.
But my recommendation would be to install SQL Server 2019: you'll get all of the performance and security benefits of the new server, support for the latest Windows Server platform, and a support window that matches your new Azure DevOps Server installation.
You can find the SQL Server compatibility matrix for TFS/ADS here:
Azure SQL Database and SQL Server
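To check what your current instance is running before deciding, a quick sketch:

-- ProductVersion 13.0.4001.0 or later indicates SQL Server 2016 SP1+:
SELECT SERVERPROPERTY('ProductVersion') AS version,
       SERVERPROPERTY('ProductLevel')   AS service_pack,
       SERVERPROPERTY('Edition')        AS edition;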
TFS 2017 was one of the last versions to require a database for the Warehouse, which is a form of replicated data. Reports can be written to pull data from the warehouse without impacting the user experience. That database is somewhat deprecated now, especially for reporting; Microsoft promotes the use of the API to pull data from the live database. TFS does, however, still need its "live" database to store all of the data presented to users: work items, discussion comments, project templates, user mappings to AD, among other things.
You will need to upgrade your 2017 Schema to conform to the new standard as defined by Azure, which will be taken care of as part of the upgrade.
What kind of information are stored in there?
Issues, templates, build results, lots of things.
Can I select the database that's used by TFS 2017?
Yes, that will be upgraded during the installation.
My team and I are developing a SQL Server database. We want to work on different PCs, but with the same database. Is it possible to synchronise our work on each PC, or to somehow share our database while working?
If possible, how can Team Foundation Server be used for that?
You can use SQL Server Database Tools (SSDT) to represent your database as a series of scripts. These scripts can then be added to source control. Git is by far the most popular source control system there is and Team Foundation Server and Visual Studio Team Services have great Git support.
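To give a feel for it, each object in an SSDT project is just a CREATE statement in its own .sql file; a made-up example:

-- e.g. dbo\Tables\Customer.sql in the database project:
CREATE TABLE [dbo].[Customer]
(
    [CustomerId] INT           NOT NULL PRIMARY KEY,
    [Name]       NVARCHAR(100) NOT NULL
);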
Each developer will use Visual Studio (or VS Code) on their own machine to do their database work. When the developer wants to share their changes, they commit them to source control. Other developers can then update their local version of the code with the new changes. SSDT adds support for bringing your database and database project in sync.
Now that your code is in source control you can go a step further and add things like continuous integration builds and automated deployments with VSTS Build and Release Management. That way you can automatically test database changes and even run unit and integration tests before deploying to test and production environments.
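As a sketch of what such a pipeline step might run (project name and connection details are placeholders):

REM Build the project into a dacpac, then publish it to a test environment:
msbuild MyDatabase.sqlproj /p:Configuration=Release
SqlPackage.exe /Action:Publish /SourceFile:bin\Release\MyDatabase.dacpac /TargetConnectionString:"Server=testserver;Database=MyDatabase;Integrated Security=true"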
The following Channel9 video gives an introduction to these tools: Database Continuous Integration and Deployment with Visual Studio SQL Server Data Tools in < 10 minutes
If you only care about schema changes (and not data changes), you can use Visual Studio's SQL Server projects, and a source control system, to help manage this. You can then use the Schema Compare tool to compare your project to the server, the server to your project, or server to server.
There are some tools from the likes of Redgate, etc., that allow this process to be automated. I've not used those, but they may be another option.
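Another automated option is SqlPackage's Script action, which generates the difference script a schema compare would apply; a sketch with assumed names:

REM Produce (but don't run) an upgrade script from a built dacpac against a live server:
SqlPackage.exe /Action:Script /SourceFile:MyDatabase.dacpac /TargetServerName:myserver /TargetDatabaseName:MyDatabase /OutputPath:upgrade.sql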
I am having problems publishing an SSDT database project and registering it as a data-tier application. Let me explain.
I have a database (A) which references two other databases (B & C) through linked servers. I have created projects based on B and C and snapshotted those projects to create dacpacs for databases B and C. I have created a database project for database A which has database references to B and C through the dacpacs. I have set SQLCMD variables and modified the database project DDL scripts to use the SQLCMD variables in place of the unresolved linked server names. The project builds!
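(For context, the substitution in the DDL scripts looks something like this; all names are placeholders:)

-- Before: a hard-coded linked server reference that won't resolve at build time:
-- SELECT SomeColumn FROM [ServerB].[DatabaseB].[dbo].[SomeTable];
-- After: a SQLCMD variable that is supplied a value at publish time:
SELECT SomeColumn FROM [$(LinkedServerB)].[DatabaseB].[dbo].[SomeTable];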
I am trying to publish the project as a data-tier application but keep receiving the following error: "Databases registered as a DAC database must be hosted by an instance of SQL 2005 SP4, SQL 2008 SP2, SQL 2008 R2, SQL 2012 or SQL Azure". Incidentally, I am running SQL Server 2012.
I thought I would test whether I could register it as a data-tier application through SSMS. Within SSMS the option to "Register as Data-tier Application" is grayed out. I therefore tried "Export Data-tier Application" and received a number of errors referring to the linked server objects.
My question is: is it possible to deploy an SSDT database project and register it as a data-tier application when the project uses linked servers, or am I doing something wrong? If it is possible, could someone provide some advice?
I have broken Google looking for the answer, so any help would be greatly appreciated...
I had this error recently, so I'll add my solution for anyone else who comes across it (already posted on the DBA Stack Exchange).
It turns out that in my publish.xml I had RegisterDataTierApplication set to True. The first time I published the database it worked fine, but after that I got the same error, because the database was already registered as a data-tier application.
By setting it to False (or unchecking the checkbox in the GUI) it works fine.
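The relevant fragment of the publish profile looks like this (the profile name is just an example):

<!-- MyProfile.publish.xml: False avoids the error once the database is already registered -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <RegisterDataTierApplication>False</RegisterDataTierApplication>
  </PropertyGroup>
</Project>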
I am running SQL Server 2012 and VS 2010 with SSDT (SQL Server Data Tools) installed. My dev DB uses stored procs, functions, CLR objects, etc. It has a snapshot of prod data of about 500GB.
I created a SQL Server Database Project and then imported the database. This created all tables, views, procs and functions as files under schema names. Great stuff -- now I can do version control just like in other VS projects, create deployments, etc. So far, so good.
But, I am confused as to what my development process should be for changing/adding procs/tables under SQL Server Database Project. It appears that any changes I make are applied to some LocalDb/Projects database and NOT to my dev database.
Am I supposed to author all my objects in that LocalDB, then build and deploy to my dev database via Publish? I am worried about my existing tables in the dev DB, since if the publish process drops and recreates tables, I will lose my prod data snapshot.
What is the right development process to follow in SQL Server Database Project?
Think of the source database (in your case, your database project) as being the "to be" state after deployment. When a deployment is initiated, the executable (SqlPackage.exe) compares the source with the target and generates a difference/delta script to make the target look like the source. This is why we no longer have to specify CREATE or ALTER; the tool figures it out.

To answer your question about ongoing development, you can develop either way. You can develop in the project files and publish them to a common dev database (say, if you're on a team), or you can develop in the database with tools like SQL Server Management Studio (SSMS) and synchronize with the project files with a schema compare (I use the latter technique because I like SSMS).
For deployment, you'll have to have SSDT installed on the machine from which you execute the deployment (SSDT ships with SQL Server 2012 and later; I don't know about SQL Server 2008). You can create scripts to simplify deployment. You'll essentially call SqlPackage.exe (it lives in x:\Program Files (x86)\Microsoft SQL Server\nnn\DAC\bin) with an action and a source. I use Publish Profiles as well to take care of most command properties. So an example deployment might look like this:
SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /Profile:MyProfile.publish.xml
For more information:
SQL Server Data Tools Documentation
http://msdn.microsoft.com/en-us/library/hh272686(v=vs.103).aspx
SqlPackage.exe Documentation
http://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx
1. Make changes inside the VS DB project.
2. Deploy the changes to LocalDB to test.
3. Publish the database to your production server. I prefer to use Schema Compare to do this manually, but you can also publish the project via the right click --> Publish menu (which will also create a publishing profile), or using command-line arguments. The publish process won't drop and create tables (unless you tell it to drop & recreate the entire db).
Alternatively, in the project settings you can change the connection string to point to your production server (as pointed out in the comment). However, I recommend against this, as it will then attempt to publish to the production server every time you run a local build (F5).
I'm using Visual Studio Team System 2008's Database tools to develop my databases. On my local dev machine, when I want to deploy schema changes to the SQL Server instance on my machine, I just use the Data --> Schema Compare feature of VS2008.
But with live databases I can't do this, because I can't connect to the database directly from my machine and the server hasn't got VS2008 installed.
So I was thinking about the SQLCMD tool. Isn't that what VS2008 uses "under the hood"?
I want to use it as part of an automatic deployment strategy. I want to be able to publish SQL scripts generated by VS2008 to the server and have an application run the scripts on the live database to update the schema.
UPDATE
I'm trying to achieve automatic change script generation by taking the deploy script VS2008 Database Edition generates and comparing it against a live database. However, I want to do it through code, with no tool or anything. It must be able to run from a Windows Service on the server.
SqlCmd would work. If you also want automatic versioning support, you should check out Tarantino.Net, a database management tool that keeps track of which SQL files have already been run.
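A minimal sqlcmd invocation for running a generated change script (server and file names are placeholders):

REM -b aborts and returns a non-zero exit code if the script raises an error:
sqlcmd -S myserver -d MyDatabase -i ChangeScript.sql -b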