I'm learning about the environments and machines of Octopus. I have a Web project that is packaged into a Nuget package and deployed to Azure Websites, and I also have a DB project that is packaged into a separate Nuget package to SQL Azure. When Octo picks them up and deploys, is it better to have two separate machines have tentacles for each in the same environment, or should they be on one machine (in the case that the website deployment passes and the DB doesn't)?
If you're deploying to Azure, it doesn't really matter - one tentacle is enough for ALL environments (regardless of project type). We do this all the time for our Azure projects. You can think of the tentacle as a PowerShell script runner against Azure; nothing really happens on the actual server itself.
You can have multiple Octopus "environments" using the same tentacle (especially for Azure), since the same tentacle can be reused. This allows you to scope your variables so that the appropriate values apply to each logical environment, all the while targeting just one server that simply runs scripts against Azure.
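For example, a custom PowerShell step run by that single tentacle can pick up environment-scoped values through Octopus's $OctopusParameters collection. This is only a minimal sketch; "WebSite.Name" and "Db.ConnectionString" are hypothetical variable names you would define and scope yourself in the project's variable editor.

```powershell
# Minimal sketch of a custom PowerShell step executed by the single tentacle.
# "WebSite.Name" and "Db.ConnectionString" are hypothetical project variables,
# each scoped to a logical environment (Dev / Test / Prod) in Octopus.
$environment = $OctopusParameters["Octopus.Environment.Name"]
$siteName    = $OctopusParameters["WebSite.Name"]
$dbConnStr   = $OctopusParameters["Db.ConnectionString"]

Write-Host "Deploying to Azure website '$siteName' for environment '$environment'"
# ...call the Azure cmdlets / scripts you already use for the actual deployment here...
```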
Working on a project to migrate SSIS 2008 projects to 2016 deployed to a File Server. Currently have the packages on the file server and prefer to keep it that way. I'm aware that the Project Deployment Model has been introduced since 2012.
Questions:
Can I change the migrated projects to Project Deployment Model and still deploy to the File System? Is changing to a Project Deployment Model a best practice?
Researching online, I can only find tutorials on how to deploy to the SSISDB (Catalogue). Is deployment to the File System still the same as in previous versions, i.e. build the project > SSIS creates a manifest file in the project directory > open the manifest file to deploy?
Well, it is possible with certain limitations.
First, let's state that "deploying to the File System" usually means that you store your package in a file system folder and run it with dtexec. In that sense, deploying an SSIS project to the File System is certainly possible; you can run any package from the project file. For more details and examples, see the MS Docs on dtexec.
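As a rough illustration (the paths, package and parameter names here are made up), running a package straight from a compiled .ispac project file with dtexec looks something like the call below; check the dtexec documentation for the exact switches supported by your version.

```powershell
# Hypothetical project path, package and parameter names - adjust to your build output.
& dtexec.exe `
    /Project "C:\Deploy\MyEtlProject.ispac" `
    /Package "LoadSales.dtsx" `
    /Par "Env_ConnectionString;Data Source=DEVSQL01;Initial Catalog=Staging;Integrated Security=SSPI"
```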
However, this is not practical. By doing so, you lose a significant part of the SSIS functionality introduced in the 2012 version: for example, execution reports in the SSIS Catalogue, and project environments, which allow fine control and management of package parameters, including encryption of sensitive data like passwords. The SSIS Catalogue also keeps versions of deployed packages, so you can easily roll back to a previous version.
Besides, the SSIS Catalogue is fully supported in SSMS; when running a package from the project file, you are on your own to supply parameters, whereas connection strings are usually passed from environments.
Yes, it's possible but not recommended (and not always possible). The package deployment model exists for backward compatibility. Once you convert your packages to the Project Deployment Model, you should deploy only to the SSISDB catalog on an instance of SQL Server.
The Project Deployment Model contains packages, parameters, connection managers and more very cool features introduced in 2012. It is the best option for working with SSIS these days.
https://learn.microsoft.com/en-us/sql/integration-services/packages/deploy-integration-services-ssis-projects-and-packages
I use SQL Server 2012 with Visual studio 2010 with Integration Services Catalog. I would like to know if it is possible to deploy my ssis 2012 project to only one of the three environments I have created.
I am trying to understand if this is possible or if environments are created only for different variable values (for different test scenarios etc).
So my question is if I can have deployment version 1 on my DEV environment and deployment version 2 on another environment.
Thank you
Sure it's possible. It's the opposite of what most people want, but you can create a DEV version of your package and only deploy it to your DEV environment, and then create a 2nd version of the package and only deploy it to your 2nd environment.
Note that I am talking about having two separate .dtsx packages, which will have to be maintained separately. That is what I think you are asking. If I have misunderstood, then your question wasn't clear.
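If you want to script that, deploying each .ispac to its own catalogue folder can be done with ISDeploymentWizard.exe in silent mode (or with the catalog.deploy_project procedure). A rough sketch, with made-up server, folder and file names:

```powershell
# Hypothetical servers, folders and file paths - one .ispac per environment.
$wizard = "C:\Program Files\Microsoft SQL Server\110\DTS\Binn\ISDeploymentWizard.exe"

# Version 1 of the project goes only to the DEV folder...
& $wizard /Silent `
    /SourcePath:"C:\Builds\v1\MyEtlProject.ispac" `
    /DestinationServer:"SQLDEV01" `
    /DestinationPath:"/SSISDB/DEV/MyEtlProject"

# ...and version 2 goes only to the TEST folder.
& $wizard /Silent `
    /SourcePath:"C:\Builds\v2\MyEtlProject.ispac" `
    /DestinationServer:"SQLTEST01" `
    /DestinationPath:"/SSISDB/TEST/MyEtlProject"
```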
One of our customers uses Visual Studio Online ( http://www.visualstudio.com/en-us/products/what-is-visual-studio-online-vs.aspx ), which is based on the capabilities of Team Foundation Server (TFS).
We were researching how to do automated Builds and automated Unit Tests using the Visual Studio Online account.
Using the Visual Studio 2012 IDE, I was able to set up a build in the Hosted Build Service. However, my Unit Tests need a Microsoft SQL Server database to run properly, which I have installed on my development workstation.
In order to run my tests I need to have my database running in a SQL instance on the build agent.
Is it possible to install SQL server on the Hosted Build Agent, and if so, how?
It is not possible to add non-standard capabilities to a hosted build agent. You are assigned a clean pre-built VM with all of the standard prerequisites for compiling binaries and running unit tests. Since what you are describing above are integration tests, which require an instance of your application, you can't run them there.
Here are the options that you have in order of recommendation:
1 - Re-write your tests as Unit and not Integration
The tests that you are describing above do not meet the standard definition of a "unit test". You can use mocking, stubs, and other testing techniques to decouple your tests from the database so that they are only testing a single unit of functionality rather than the integration between your code and the database. Having sets of tests for each will better help you isolate the root cause of an issue and fix it more quickly, at less cost to your customer. It will also aid in reducing the bugs and other issues that can be introduced over time.
Large, complicated integration tests should be kept to a minimum, as they increase the amount of time required to support them and maximise the cost to your customer.
One obviously needs integration tests at some point, and option #2 is the best way to run them.
2 - Use Release Management to deploy and test
You can set up a VM to host a working version of your application and use Release Management to deploy the instance, then execute your integration tests there. If you use the VSO hosted Release Management server you are limited to Azure VMs as targets. If you deploy your own RM server then you need to manage that as well. I would, and do, use the hosted RM.
The bit we care about is the DevOps bit on the right. Any time you need an instance of your application, it should be deployed properly, and separately from your build server.
http://nakedalm.com/create-release-management-pipeline-professional-developers/
Once you have your application deployed it is fairly simple to get the integration tests you want executing against the instance.
http://blogs.msdn.com/b/visualstudioalm/archive/2014/11/11/deploying-and-testing-web-applications-using-release-management.aspx
If you download the example from the link above, it has a PowerShell script for executing the tests in the environment.
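The idea in that kind of script is simply to point the test runner at the deployed environment rather than at the build agent. A minimal sketch of that approach (the test DLL path, .runsettings file and environment variable name are assumptions):

```powershell
# Minimal sketch: run the integration tests against an already-deployed environment.
# The paths, settings file and environment variable name are hypothetical.
$vstest = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe"

# Tell the tests where the deployed instance lives (read this in your test setup).
$env:TargetConnectionString = "Data Source=QA-SQL01;Initial Catalog=MyAppQA;Integrated Security=SSPI"

& $vstest "C:\Drops\MyApp\MyApp.IntegrationTests.dll" `
    /Settings:"C:\Drops\MyApp\qa.runsettings" `
    /Logger:trx
```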
3 - Setup your own agent and controller
You can create your own build controller & agent attached to the VSO account and build the app yourself. Most folks create an Azure VM with all of the bits they need and run it there. This is your server, which you will need to manage and pay for. You can also use a local server.
Note: This is the wrong approach as an instance of your application should not be available on a build server. A build server is for compiling your binaries and running actual Unit Tests, not integration or UI tests.
Thanks for all your help.
I contacted #tamasf, who is one of the developers of the Effort Testing Tool for Entity Framework, and he told me that the Effort Testing Tool only needs to know about the EDMX file in order to mock Entity Framework (which means the database won't need to be up and running, therefore allowing us to follow proper unit testing practices). For more information, please read the following post: How would I configure Effort Testing Tool to mock Entity Framework's DbContext without the actual SQL Server Database up and running?
Can I have multiple environments of the same MVC4 Web application in the same Windows 2008 R2 Web Edition server?
I would like to have:
QA (Quality/Pre-production) environment - SQL-Server Database with QA MDF file
Production environment - SQL-Server Database with PRD MDF file
Demonstration environment - SQL-Server Database with DEMO MDF file
Is it safe and reliable to publish from Visual Studio to each one?
Thanks.
Yes, you can deploy multiple instances of the app to the server. I would measure "safe" and "reliable" as aspects of the deployment process, not where the application is deployed to.
I try to keep my production and non-production environments separate, so that if anything goes wrong in testing I know it won't impact production. This separation can be physical (different hardware) or logical (same hardware, different configuration).
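On a single Windows 2008 R2 server, a logical separation usually comes down to one IIS site and application pool per environment, each pointing at its own config and MDF. A hedged sketch using the WebAdministration module (site names, ports and paths are made up):

```powershell
# Rough sketch: one IIS site + app pool per environment on the same box.
# Names, ports and paths are hypothetical - adjust to your setup.
Import-Module WebAdministration

$environments = @{ "QA" = 8081; "PRD" = 80; "DEMO" = 8082 }

foreach ($env in $environments.Keys) {
    New-WebAppPool -Name "MyApp-$env"
    New-Website -Name "MyApp-$env" `
                -PhysicalPath "C:\Sites\MyApp\$env" `
                -Port $environments[$env] `
                -ApplicationPool "MyApp-$env"
}
```

Each deployed copy then carries its own web.config with the connection string for that environment's database.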
There are multiple factors to consider when deciding where to host production vs non-production. The main thing to consider is you don't want non-production environments competing for resources with the production application.
Some other factors to consider are:
size of the application
# of users
nature of the app (is the app critical to the success of the business, or a nice to have)
public vs. private application, etc.
We have recently migrated to using the Visual Studio database projects. What we want to do is for the database to deploy when the TFS build server builds.
This is relatively simple and we have this working for a single database. However, what we need is for it to deploy to multiple databases, as we have a SaaS product with multiple databases. So, for example, when we do a QA build, all the different databases with various configurations on the QA DB server should be updated.
Is there a 'proper' way to do this?
Our current plan is to take the deployment .sql script that will be generated from the database configured for deployment, then create a custom build task which runs this script against the rest of the databases.
I don't think there is a standard way of doing this, so we created a custom build task that iterates over the databases we want to deploy to, executing the deployment script generated by the standard database project deploy against each DB.
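For what it's worth, the core loop in a task like that can be as simple as the sketch below (the server, database names and script path are assumptions), using sqlcmd to replay the generated deployment script against each tenant database:

```powershell
# Rough sketch of the custom build task's core loop - names and paths are hypothetical.
$deployScript = "C:\Builds\Drop\MyDatabase.sql"
$qaServer     = "QA-SQL01"
$databases    = @("Tenant1_QA", "Tenant2_QA", "Tenant3_QA")

foreach ($db in $databases) {
    Write-Host "Deploying schema changes to $db on $qaServer"
    # -E uses integrated security, -b aborts the batch on the first error.
    & sqlcmd.exe -S $qaServer -d $db -E -b -i $deployScript
}
```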