I basically want to deploy the bin folder to a VM hosted on Azure. I want to deploy from my TFS 2015 server to a Windows 7 VM, both of which are hosted in Azure. I have set up a Machine File Copy task and tried to point it at the correct VM, but nothing seems to work. It continually comes back with this error:
2015-12-29T14:27:30.5763871Z ##[debug]Initiating copy on machine-name
2015-12-29T14:27:57.0124127Z ##[debug]Finished copy operation on machine-name
2015-12-29T14:27:57.0280368Z ##[debug]Deployment logs for copy operation on machine-name
2015-12-29T14:27:57.0280368Z ##[debug]System.AggregateException: Failed to execute the powershell script. Consult the logs below for details of the error.
2015-12-29T14:27:57.0280368Z System.Management.Automation.RuntimeException: Failed to connect to the path \\machine-name with the user username for copying. System error 53 has occurred.
2015-12-29T14:27:57.0280368Z The network path was not found.
I changed the machine name and username in the above snippet.
I have followed most of the steps in these articles:
http://www.codewrecks.com/blog/index.php/2015/06/20/build-vnext-support-for-deploying-bits-to-windows-machines/
http://blogs.msdn.com/b/visualstudioalm/archive/2015/07/31/dev-test-in-azure-and-deploy-to-production-on-premises.aspx
But I could not see the option for Azure File Copy, so that's out the window. When I use PsPing I can connect from the TFS server to the target VM when targeting the WinRM port 5985 specifically.
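For completeness, the same check can be done with PowerShell from the TFS server; a rough sketch, where machine-name is a placeholder and the extra port 445 test is only my assumption about what the \\machine-name copy path needs:
# Rough connectivity sketch; 'machine-name' is a placeholder for the target VM.
Test-NetConnection -ComputerName machine-name -Port 5985   # WinRM HTTP
Test-NetConnection -ComputerName machine-name -Port 445    # SMB share access for \\machine-name (assumption)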
I am not sure if what I am trying to achieve is even possible with TFS + Azure, but I would have imagined this is exactly the type of scenario Azure was built for?!
Any help would be greatly appreciated.
EDIT: Here is the Add Step dialog:
You need to use the Azure File Copy task to copy files to Azure VMs. Open the build definition, click Add build step..., and go to the Deploy page of the ADD TASKS dialog. There you will find the Azure File Copy task.
However, since you mentioned that the task is missing, can you share a screenshot of the ADD TASKS dialog?
I posted a similar question before, but I have now come back to the point where I need to deal with this, and after some tuning I have managed to get rid of all the errors except one warning, which basically tells me that the path provided to the package did not find any files in the directory (which is false). This almost makes me want to believe it is again a permissions issue, as when I run the package locally it all works fine. Here is the warning:
The same message appears when I right-click the SSIS package under the catalogue and try to execute the job directly.
I made sure that the caller has full permissions to the folder in question
The SQL Server Agent job calls it like so
With a single step in it to execute the SSIS package from the catalogue:
The history of the executions is all successful and the history of the SQL Agent job is all green.
One of the suggestions I got was to open SQL Server Configuration Manager, but I don't appear to have access to that. I am new to this whole process, so I am not sure whether authentication has something to do with it, or the sa account's permissions.
Note: I am working on this on a remote dev server and not directly on my PC.
Any help would be greatly appreciated.
Since you're running the package as the Agent Service Account:
The permissions need to be applied to the SQL Server Agent Service Account or its per-service SID.
You can see the service account with PowerShell like this:
PS C:\Users\david> (Get-WmiObject win32_service | where Name -eq "SQLSERVERAGENT").StartName
NT Service\SQLSERVERAGENT
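Once you have that name, grant it access to the folder the package reads from; a minimal sketch, assuming a hypothetical path D:\SsisFiles and the per-service SID shown above:
# Grant read/execute (with inheritance) on the folder to the Agent's per-service SID.
# 'D:\SsisFiles' is a placeholder for the folder the package actually uses.
icacls "D:\SsisFiles" /grant "NT SERVICE\SQLSERVERAGENT:(OI)(CI)RX"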
Turns out the database instance was on a whole different server that I wasn't even given access to. That server obviously didn't have the path specified in the SSIS package, so what I had to do was create a folder for the files on the correct server, reroute my files there, and change the SSIS package path after obtaining access to the server where the DB instance lives. Being new to all this, it was absolute frustration, especially since our team is small and, being new in it, I can't just ask someone questions about this all the time. Hopefully this will save someone a lot of time.
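For anyone in the same spot, a quick way to double-check which machine actually hosts the instance (and therefore which paths the package will see) is to ask it directly; a small sketch, where the instance name is a placeholder and Invoke-Sqlcmd comes from the SqlServer/SQLPS module:
# Placeholder instance name; returns the host the catalogue and Agent job really run on.
Invoke-Sqlcmd -ServerInstance 'ServerFromAgentJob' -Query "SELECT @@SERVERNAME AS HostServer, SERVERPROPERTY('MachineName') AS MachineName"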
I've seen this question asked several times but never with a satisfactory answer. I have read through all the posts I can find related to this and tried as much as I possibly can.
I have an SSIS package that loops through a network folder of Excel files; I won't explain what it does inside the loop container as that is not relevant.
I reference the folder via a UNC path: \\servername\folder
The package works fine from within Visual Studio.
I deployed the package to the Integration Services Catalogs on the server.
After I deploy, I connect to the server from my local PC via SSMS and then I execute it from SSMS via "Integration Services Catalogs"... Execute. This fails.
However, if I remote desktop onto the SQL Server box, then start SSMS, connect to the SQL Server using my own credentials, and execute the package using exactly the same method as above, it works fine.
When I look in the logs of the failed attempts I see a warning that "The For Each File enumerator is empty". I'm not sure if this is telling the full story, as, if I rename the network folder, I get the same message (I expected to see an error that the folder was not found) - this may or may not be relevant.
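As a sanity check, something like this run on the SQL box (the share below is a placeholder for the real one) shows whether a given logon context can actually see the files the enumerator points at:
# Placeholder share; run it in the context you want to test (e.g. an RDP session on the SQL box).
$folder = '\\servername\folder'
if (Test-Path -Path $folder) {
    (Get-ChildItem -Path $folder -Filter '*.xlsx').Count   # how many files the enumerator would see
} else {
    "Cannot reach $folder from this context"
}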
I've sketched up a quick overview of what works and what doesn't
NOTE: The "script execution" method is just a T-SQL script for running these packages (roughly the pattern sketched below). This is the method I will use eventually, but at the moment I'm focusing on the simple right-click Execute method, as this essentially does the same thing.
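A minimal sketch of that script, wrapped in PowerShell via Invoke-Sqlcmd; the folder, project and package names are placeholders:
# Sketch of the T-SQL "script execution" method; all names below are placeholders.
$sql = @"
DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'LoopExcelFiles.dtsx',
    @use32bitruntime = 0,
    @execution_id = @execution_id OUTPUT;
EXEC SSISDB.catalog.start_execution @execution_id;
"@
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Query $sql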
Since I did this I have tried a few more things...
I've tried accessing the folder as a UNC path, as a mapped drive, and also as a UNC path using the IP address instead of the server name.
I recreated the issue on our development server so I could change service accounts etc.
I tried the default accounts on both SSIS and SQL Agent services.
I tried changing these to domain accounts and network service accounts.
I get the package to log the folder name, which is an expression - this always looks correct. I do the same with the user name, which always shows me as the execution user.
I can change most things, as I can test on the development server, with the exception of testing with a domain admin account, so any suggestions would be greatly appreciated.
I cannot seem to find any concise information on this at all, so I'm trying my luck here.
I have recently set up Integration Services Catalogs because at present all our SSIS packages are stored within a folder and just run as files. We wish to move these, which has worked fine.
I have created a basic SSIS package that puts a username into a SQL table and also has a File System Task to delete a file.
When the package is run from the catalog via SSMS, it completes the SQL side of things perfectly fine, placing both the username running the package and the data into the SQL table; however, it fails on the File System Task with
File System Task: Error: An error occurred with the following error message: "Access to the path "xxxx" is denied."
I have changed the SQL Agent job on the SQL Server to use the permissions of our administrator account, with no luck.
I can confirm the folder in question is shared and has full read/write permission for "Everyone", and yet I still get this error.
I even went to the trouble of creating a new folder and just enabled full sharing for everybody on it - I still get the same "access is denied" error.
I have seen a previous post on Stack Overflow about NETWORK SERVICE needing to be added; I can also confirm that this account has full permission to the folder, which rules this out as well.
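For what it's worth, both the share-level and the NTFS-level permissions can be dumped from PowerShell on the machine that hosts the folder; a sketch, where the share name and local path are placeholders (Get-SmbShareAccess needs Windows 8 / Server 2012 or later):
# Placeholders: 'MyShare' is the share name, D:\Shares\MyShare is its local path.
Get-SmbShareAccess -Name 'MyShare'                 # share-level permissions
Get-Acl -Path 'D:\Shares\MyShare' | Format-List    # NTFS permissions on the folder itself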
Any thoughts would be appreciated.
I am trying to implement this idea https://www.sqlshack.com/continuous-integration-sql-server-data-tools-team-foundation-server/ and I am getting this error:
Error Deploy72002: Unable to connect to master or target server 'DatabaseTest'. You must have a user with the same password in master or target server 'DatabaseTest'.
I implemented the idea in a very simple way: I created a database for the test on a development server (just one table with ID and Name columns), created a database project in Visual Studio, and wrote a script to insert a few rows into the only table in the database. Then I created a publish profile and added the connection to 'DatabaseTest' on the 'DEV' server; the user I am using to access the database has admin permissions, and the script associated with the publish profile is the only script in the solution, the one that inserts the rows. I checked everything in and created a build definition. I am trying to make the project build successfully, so I just added a Build Solution task, and in the MSBuild Arguments this is what I am passing:
/t:build /t:publish /p:SqlPublishProfilePath=Database_Testing_Profile.publish.xml
And I am getting the error from the beginning of the question.
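For reference, the equivalent run from a developer command prompt on my machine would be roughly this (a sketch; the MSBuild path and project name are placeholders):
# Placeholder paths; same targets and property that the build definition passes.
& "C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe" .\DatabaseTest.sqlproj /t:Build /t:Publish /p:SqlPublishProfilePath=Database_Testing_Profile.publish.xml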
Can someone please give me an idea about what the problem is?
Thank you.
First, the password won't be stored in the publish profile file after saving it; you need to add it manually (User Name=XXX;Password=XXX).
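For example, the credentials go into the TargetConnectionString element of the .publish.xml; a small sketch that sets it from PowerShell (the file name comes from the question, the server and credentials are placeholders, and note that SqlClient normally spells the keyword User ID):
# Write SQL credentials into the publish profile; plain-text password, so for test use only.
$path = 'Database_Testing_Profile.publish.xml'
[xml]$pub = Get-Content -Path $path -Raw
$node = (Select-Xml -Xml $pub -XPath '//*[local-name()="TargetConnectionString"]').Node
$node.InnerText = 'Data Source=DEV;Initial Catalog=DatabaseTest;User ID=xxx;Password=xxx'
$pub.Save((Resolve-Path -Path $path).Path)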
Secondly, make sure the Target Platform is correct (right-click the project => Properties => Project Settings).
In addition, there is an article that may benefit you: Using MSBuild to publish a VS 2012 SSDT .sqlproj database project the same way as a VS 2010 .dbproj database project (using command line arguments to specify the database to publish to)
A LightSwitch solution with an external database used to build and debug fine.
It has lain dormant, and in the meantime the development machine and I have moved onto a domain.
Now an attempt to build the solution gives:
Error 15 An error occurred while establishing a connection to SQL Server instance '(LocalDB)\v11.0'.
Login failed for user 'MyNewDomain\MyNewUserName'. C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\LightSwitch\v4.0\Microsoft.LightSwitch.targets 160 10 TrueTrackLightSwitch2013
It seems I need to tell (LocalDB)\v11.0 about a new user (MyNewUserName), but I don't know how to do this.
It also raises concerns in that I want to pass the maintenance of this project to another developer once I get it going again.
The project is accessing data from an external SQL Server 2008 database. This external database holds the ASP.NET tables for forms-based security.
I don't understand where '(LocalDB)\v11.0' comes into the picture
thanks
Bob
The solution was to delete the folder as per the steps in the following link.
The delete-database step failed for me, saying the database did not exist.
But I was able to open my project and rebuild it.
MSDN LightSwitch Forum Post
The answer that worked was:
Matt on the LightSwitch team got me running again:
1) Delete the directory at:
C:\Users\{your user name}\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\v11.0
Note: replace "{your user name}" with the name you logged in with.
2) On your Windows Start menu, type CMD and press Enter. When the command window comes up, type:
cd "C:\Program Files\Microsoft SQL Server\110\Tools\Binn"
then:
sqllocaldb.exe delete v11.0
It will say:
LocalDB instance "v11.0" deleted.
Then open a LightSwitch project and run it, and it should work.
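If you want to confirm what exists before and after the delete, sqllocaldb can also list instances; a small sketch, run from a PowerShell prompt in the same Binn directory:
# Lists all LocalDB instances on the machine, then the state/version of v11.0 specifically.
.\sqllocaldb.exe info
.\sqllocaldb.exe info v11.0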