Is there support for SSDT in a replication environment? - sql-server

I could not locate specific, up-to-date documentation on this issue. Does publishing support already exist using SSDT in an environment with replication enabled?
If it does, can you give me a link to the documentation?

SSDT supports neither replication nor jobs.
You need to write scripts that stop replication (or do whatever else you need) before you publish, and restart it after publishing. These can be pre/post-deployment scripts, or other scripts run from PowerShell/batch when deployment is automated.
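As a sketch of the automated approach, a PowerShell wrapper could remove replication, publish, and then re-create replication. All names, paths, and the instance are placeholders; `sp_removedbreplication` drops every replication object from the database, so only use it if you have a script to rebuild your publications afterwards:

```powershell
# Hypothetical automated deployment; instance, database and file names are assumptions.
$instance = "SQLPROD01"
$db       = "MyDatabase"

# 1. Remove replication from the target database before publishing.
sqlcmd -S $instance -d master -Q "EXEC sp_removedbreplication N'$db';"

# 2. Publish the dacpac produced by the SSDT project build.
SqlPackage /Action:Publish /SourceFile:"MyDatabase.dacpac" `
           /TargetServerName:$instance /TargetDatabaseName:$db

# 3. Re-create replication afterwards, e.g. from a setup script
#    previously generated in SSMS (Generate Scripts on the publication).
sqlcmd -S $instance -i "recreate-replication.sql"
```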

Related

Upgrade TFS2017.3 to Azure DevOps Server

I'm trying to upgrade my TFS2017 Update 3 environment, to a new Azure DevOps Server (on-premise) environment.
I've created a new server for Azure DevOps Server, as I'd like a newer version of Windows Server, and in general just want a completely fresh environment. I took backups of my databases, shut down the old TFS2017, without deleting anything.
I migrated the databases to a new SQL Server instance (where I have all my other databases), as I see no need to use an SQL Server license just for source control.
Now comes the fun part. I tried to configure Azure DevOps Server to use the existing database (after the migration to the new SQL server instance was done). I had some issues with the TfsJobAgent service, but got those resolved.
I then tried to reconfigure Azure DevOps Server (as it failed the first time), but during configuration, it now tells me that data is corrupt, and that the existing database cannot be used. Good thing that I took backups :)
It should be said, that the new SQL server instance is a 2019 version, so that shouldn't be a problem.
I'm not quite sure what is happening here, and why it's giving me this error. Am I migrating in a wrong way? There's not much documentation out there describing this flow.
Please go through the documentation below before upgrading:
https://learn.microsoft.com/en-us/azure/devops/server/upgrade/get-started?view=azure-devops
Then follow the steps in the article "Upgrade scenario walkthrough for Team Foundation Server" to upgrade your TFS. To summarize the steps:
1. Prepare your environment. The first step is to check the system requirements for TFS 2018. Upgrading SQL Server is necessary in your scenario; besides SQL Server, you also need to check the other system requirements and prepare the environment.
2. Expect the best, prepare for the worst. You must have a complete and consistent set of database backups in case something goes wrong.
3. Do the upgrade. Once the preparation is done, install the new version of TFS to get the new binaries, and then run through the upgrade wizard to upgrade your databases.
4. Configure new features. Depending on which version you upgraded from, you may need to configure each team project to gain access to some of the newly available features.
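For the backup step, a minimal sketch (instance name, backup path, and collection database name are all assumptions; TFS databases are conventionally named Tfs_Configuration and Tfs_<CollectionName>):

```powershell
# Assumed instance and backup path; adjust for your environment.
$instance  = "SQL2019\TFS"
$backupDir = "D:\Backups"

# Back up the configuration database and every collection database together,
# so the set can be restored as a consistent unit.
foreach ($db in @("Tfs_Configuration", "Tfs_DefaultCollection")) {
    $file = Join-Path $backupDir "$db.bak"
    sqlcmd -S $instance -Q "BACKUP DATABASE [$db] TO DISK = N'$file' WITH COPY_ONLY, CHECKSUM;"
}
```

Note that the built-in Scheduled Backups tool in the administration console uses marked transactions to keep the databases in sync, so prefer it over hand-rolled backups where possible.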

3rd Party dependencies and SVN Integration with Visual Studio Team Services

I'm assessing moving from TeamCity to VSTS and there are two steps I have in my pipeline that I'm not sure how to setup in VSTS.
How do I include 3rd party dlls in my build? Currently we use a tool that must be installed on the developers' computers and that has separate dlls for x86 and x64. The x86 dlls are included in the project and are needed for the designer, but the x64 dlls are copied from the Program Files folder with an after-build command in Visual Studio. For it to work in TeamCity, the tool was installed on the server, so the same after-build command copies the dlls into the build directory just as on any other developer's computer.
I don't see a way to achieve this in VSTS without including the x64 dlls in the source code, which isn't desirable due to the tool's license.
How do I publish to SVN? Currently our binaries are hosted in an SVN server. In TeamCity I have a PowerShell script that (in short) updates the SVN local repo in the server, copies all the files from the build directory into the SVN repo and commits the changes.
Storing your dependencies
Lots of options are available here:
Put them in a NuGet package and store them in VSTS Package Management. Have your build restore the package during build.
Put them in source control, either SVN or TFVC, and fetch them during the build.
Store them in Azure Blob storage and fetch them on-demand by downloading them at the start of your build.
Use a custom build agent (Azure VM?) and install the software and the VSTS build agent onto it.
Store them as Build Artifacts in one build definition and fetch them using the Fetch Build Artifacts task, which is available from the marketplace.
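As one example of the blob-storage option, a minimal sketch of a build step that fetches the x64 dlls on demand (the storage URL, SAS token, and archive name are all assumptions):

```powershell
# Hypothetical: download the vendor's x64 dlls from a private blob container
# at the start of the build. URL and SAS token are placeholders.
$uri = "https://mystorage.blob.core.windows.net/deps/VendorTool-x64.zip?<sas-token>"
$zip = Join-Path $env:AGENT_TEMPDIRECTORY "VendorTool-x64.zip"

Invoke-WebRequest -Uri $uri -OutFile $zip
# Unpack into the location the after-build command expects.
Expand-Archive -Path $zip -DestinationPath "$(Get-Location)\libs\x64" -Force
```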
I'm not sure what kind of license issues you're facing, but I'd expect that each has the same issues if you're not allowed to put the binaries anywhere other than on a licensed machine. Maybe the vendor offers a better option or can be persuaded to offer a Cloud/VM license option.
Publish to SVN
I don't see why the same PowerShell script couldn't be used, though I'd recommend not altering your repository from the build pipeline; it makes future CI/CD scenarios much harder. You can attach the binaries as artifacts to VSTS builds, and that way they can also easily be linked to release pipelines. You may need to fetch the latest version of svn and store it somewhere in order to run your script. When running on an Azure VM, you can simply install Subversion directly on the agent.
There is no built-in task available.
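If you do keep the SVN publish step, a sketch of the script (working-copy path and credentials are assumptions; an svn client must be available on the agent):

```powershell
# Update the local working copy, overlay the build output, and commit.
# $env:BUILD_BINARIESDIRECTORY and $env:BUILD_BUILDNUMBER are standard
# VSTS agent variables; the working-copy path is a placeholder.
$workingCopy = "C:\svn\binaries"
$buildOutput = $env:BUILD_BINARIESDIRECTORY

svn update $workingCopy
Copy-Item -Path (Join-Path $buildOutput "*") -Destination $workingCopy -Recurse -Force
svn add $workingCopy --force          # pick up any newly added files
svn commit $workingCopy -m "Build $($env:BUILD_BUILDNUMBER)" `
    --username builduser --password $env:SVN_PASSWORD
```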

How to work in SQL Server as a team?

My team and I are developing a SQL Server database. We want to work on different PCs but with the same database. Is it possible to synchronise our work on each PC, or somehow share our database while working?
If possible, how can Team Foundation Server be used for that?
You can use SQL Server Database Tools (SSDT) to represent your database as a series of scripts. These scripts can then be added to source control. Git is by far the most popular source control system there is and Team Foundation Server and Visual Studio Team Services have great Git support.
Each developer will use Visual Studio (or VS Code) on their own machine to do their database work. When the developer wants to share their changes, they commit them to source control. Other developers can then update their local version of the code with the new changes. SSDT adds support for bringing your database and database project in sync.
Now that your code is in source control you can go a step further and add things like continuous integration builds and automated deployments with VSTS Build and Release Management. That way you can automatically test database changes and even run unit and integration tests before deploying to test and production environments.
The following Channel9 video gives an introduction to these tools: Database Continuous Integration and Deployment with Visual Studio SQL Server Data Tools in < 10 minutes
If you only care about schema changes (and not data changes), you can use Visual Studio's SQL Server projects, together with a source control system, to help manage this. You can then use the Schema Compare tool to compare your project to the server, the server to your project, or server to server.
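The same comparison can also be scripted with SqlPackage, which ships with SSDT. A sketch (server, database, and dacpac names are assumptions):

```powershell
# Compare the compiled project (.dacpac) against a live server and write
# a report of the differences without changing anything. Names are placeholders.
SqlPackage /Action:DeployReport `
           /SourceFile:"bin\Debug\MyDatabase.dacpac" `
           /TargetServerName:"DEVSQL01" /TargetDatabaseName:"MyDatabase" `
           /OutputPath:"diff-report.xml"
```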
There are some tools from the likes of Redgate, etc, that allow this process to be automated. I've not used those, but they may be another option.

Deploying SSIS 2016 packages (Project Deployment Model) to the file system

Working on a project to migrate SSIS 2008 projects to 2016 deployed to a File Server. Currently have the packages on the file server and prefer to keep it that way. I'm aware that the Project Deployment Model has been introduced since 2012.
Questions:
Can I change the migrated projects to Project Deployment Model and still deploy to the File System? Is changing to a Project Deployment Model a best practice?
Researching online, I can only find tutorials on how to deploy to the SSISDB (Catalogue). Is deployment to the file system still the same as in previous versions, i.e. build the project > SSIS creates a manifest file in the project directory > open the manifest file to deploy?
Well, it is possible with certain limitations.
First, let's state that "deploying to the file system" usually means that you store your package in a file system folder and run it with dtexec. In that sense, deploying an SSIS project to the file system is certainly possible; you can run any package from the project file. For more details and examples, see the MS Docs on dtexec.
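For example, a package can be executed straight from a built project file (.ispac) on disk. The path, package, and parameter names below are placeholders:

```powershell
# Run a package from a project file on the file system, overriding a
# package variable on the command line. Names are hypothetical.
dtexec /Project "C:\SSIS\MyProject.ispac" `
       /Package "LoadSales.dtsx" `
       /Set "\Package.Variables[User::RunDate].Properties[Value];2023-01-01"
```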
However, this is not practical. By doing so, you lose a significant part of the SSIS functionality introduced in the 2012 version: for example, execution reports in the SSIS Catalogue, and project environments, which allow fine-grained control and management of package parameters, including encryption of sensitive data like passwords. The SSIS Catalogue also keeps versions of deployed packages, so you can easily roll back to a previous version.
Besides, the SSIS Catalogue is fully supported in SSMS; when running a package from the project file, you are on your own to supply parameters, and connection strings are usually passed from environments.
Yes, it's possible, but not recommended (and not always possible). The package deployment model exists for backward compatibility. Once you convert your packages to the Project Deployment Model, you should deploy only to the SSISDB catalog on an instance of SQL Server.
The Project Deployment Model contains packages, parameters, connection managers and more very useful features introduced in 2012. It is the best option for working with SSIS these days.
https://learn.microsoft.com/en-us/sql/integration-services/packages/deploy-integration-services-ssis-projects-and-packages

Generate Change Scripts from a TFS and Visual Studio Database Project

Is there a way to generate change scripts from a Visual Studio and TFS project? I would like to either choose two versions of the database or enter a date range and get change scripts that I can pass off to a DBA to apply to the production database.
I do not have access to the production schema in order to do a schema compare.
My DBA wants SQL scripts and will not use snapshot method.
I am using Visual Studio 2013 and SQL Server 2008 and later.
Help!
I would suggest that your DBA's attitude is blocking your ability to deliver high-quality software on a regular cadence. Comparing with production is the correct and supported method to achieve a change script. And remember, the tools you are using are built and supported by the SQL Server team.
If you know when the current production version was shipped, you can create a workspace and get that specific version of your code. If you compile it, you will have a .dacpac for that version, and you can use that in the compare.
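Once you have a dacpac for each version, SqlPackage can generate the SQL change script your DBA asked for without touching production. A sketch (file and database names are assumptions):

```powershell
# Generate an upgrade script between two built versions of the database
# project. The source is the new version, the target is what production
# is assumed to look like; all names are placeholders.
SqlPackage /Action:Script `
           /SourceFile:"bin\Release\MyDatabase_v2.dacpac" `
           /TargetFile:"bin\Release\MyDatabase_v1.dacpac" `
           /TargetDatabaseName:"MyDatabase" `
           /OutputPath:"upgrade_v1_to_v2.sql"
```

The resulting .sql file can be reviewed and applied by the DBA by hand, which fits the constraint of having no access to the production schema.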
Another option would be to ask the DBA to restore the last production version to another server so that you can 'test' the script. You can then generate your script from there.
Whatever you choose you need to raise this with management as a blocker for continuous delivery.
