Firstly, apologies if this has already been asked. I searched, but couldn't find anything specifically relating to this problem...
I'm having a big issue with Coded UI. In my case, I deploy my software to an environment, and then run the automated tests on this environment via Visual Studio. As the test run starts, the test agent copies the files it deems necessary into a temporary directory from which to run the tests, as described on this page:
https://msdn.microsoft.com/en-us/library/ms182475(v=vs.100).aspx
The following files and folders are copied to the deployment folder before the tests are run:
•The test assembly file
•All dependent assemblies
•Files that you have specified, such as XML files and configuration files, on which tests depend.
You can configure deployment both by specifying additional deployment items to be copied and by changing the deployment folder.
My issue with this is that my tests reference a file that is already present on the environment. Into the directory
"C:\Program Files (x86)\Common Files\Microsoft
Shared\VSTT\12.0\UITestExtensionPackages"
I have placed an extension package that allows me to interface with some third-party WPF UI components. This file is in place on the environment, and my test projects within Visual Studio all reference it in that location as a dependency, with "Copy Local" set to false and "Specific Version" set to true.
The problem occurs when the test agent starts the tests and copies everything it thinks it requires to the temporary test directory. It also copies this extension file, resulting in every test failing with the error:
System.InvalidCastException: [A]<type here> cannot be cast to [B]<same
type here>
The runtime appears to be loading both the copy I intentionally placed in UITestExtensionPackages and the copy the agent made. If I manually go in and remove the copied file, the tests start to pass.
My question is: how do I prevent the test agent from copying this file? The file will always be in the location where I placed it, so it does not need to be copied over.
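One avenue suggested by the documentation quoted above is the test settings file: deployment can be configured (or disabled wholesale) in a .testsettings file. A minimal sketch, using the VS 2010-era schema; the name and id values are placeholders, and whether disabling deployment entirely is viable for a Coded UI run is something to verify against your setup:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="NoDeployment"
              id="00000000-0000-0000-0000-000000000000"
              xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <!-- Prevent the agent from copying assemblies to a temporary deployment folder -->
  <Deployment enabled="false" />
</TestSettings>
```

With deployment disabled, tests run from the build output directory, so references resolved from fixed machine paths such as UITestExtensionPackages are not duplicated.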
I'm trying to get the direct path of my local database.
I put the database inside my main project folder:
Then I used this code to get the path:
Dim cn As New SqlConnection("Data Source=.\SQLEXPRESS;AttachDbFilename=" + My.Application.Info.DirectoryPath + "\database\database.mdf;Integrated Security=True;User Instance=True")
Everything works now. So why is the database copied into the \bin\Debug folder?
And if I open the solution, run the project, and try to find data that I previously saved from the application running in \bin\Debug, why don't I find it?
What I mean is:
If I run the application from \bin\Debug, it saves the data there.
If I run the project from the project1 folder (via the .sln) and try to view the data table, I don't find my saved data. The opposite is also true.
Here's how it works. You add a data file to your project and it is a source file. You build your schema in that file and you also add any default data to that file. You don't use that file for testing and debugging though. How would it make sense to pollute that file with test data and then you've got no clean database to deploy with your application when you release it?
When you build, that source file gets copied to the output folder. If you build a Debug version, the data file gets copied to the Debug folder and that's where you mess it up with your test data. When your application is ready to deploy, you switch to a Release build and a nice clean copy of your source database is created in the Release folder.
By default, the Copy to Output Directory property of that source file is set to Copy Always. That means that any time you run your project and there are changes to any source file, a new copy will overwrite the one already in the Debug folder, and any changes you made the last time you debugged will be lost. If you change that property to Copy if Newer, a new copy will only be made if you change the source data file. That allows you to keep changes between debugging runs. To force a refresh, just do a Clean or delete the copy manually.
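In the project file, the Copy if Newer setting described above corresponds to the `PreserveNewest` value of the `CopyToOutputDirectory` metadata. A sketch of what that looks like, assuming the .mdf was added as a Content item at the path used in the question:

```xml
<ItemGroup>
  <Content Include="database\database.mdf">
    <!-- Copy if Newer: only overwrite the \bin\Debug copy when the source file changes -->
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```

The item type (`Content` vs. `None`) depends on how the file was originally added; the metadata element is the same either way.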
I am building a database dacpac using sqlpackage on a Windows machine. The project contains a reference to master.dacpac.
I then move the dacpac to a Linux machine (the mssql-server-linux Docker image) and deploy the database.
deploy-database.sh
# publish dacpac using sqlpackage
./sqlpackage/sqlpackage /Action:Publish /sf:"/MyDb.dacpac" /tu:sa /tp:Password1 /tdn:MyDb /tsn:localhost
Error:
No file was supplied for reference master.dacpac; deployment might fail. When package was created, the original referenced file was located C:$(windows machine path)\MASTER.DACPAC.
Initializing deployment (Failed)
An error occurred during deployment plan generation. Deployment cannot continue.
Error SQL0: The reference to external elements from the source named 'master.dacpac' could not be resolved, because no such source is loaded.
Warning SQL72025: No file was supplied for reference master.dacpac; deployment might fail. When package was created, the original referenced file was located C:$(windows machine path)\MASTER.DACPAC.
An error occurred while adding references. Deployment cannot continue.
The command '/bin/sh -c sh /deploy-database.sh' returned a non-zero code: 1
I have tried adding master.dacpac to the project directly and also copying it into the Docker image, but the same error occurs.
How can I deploy a dacpac that references master.dacpac in a Linux environment?
I had a similar issue. My solution was to rename the dacpac files to UPPERCASE (e.g. MASTER.DACPAC), which worked for me, along with making the directory containing the dacpac files the working directory.
I have tried adding master.dacpac to the project directly and also
copying it to the docker image but the same error occurs.
Make sure master.dacpac file is in the current working directory. Since your MyDb.dacpac file exists in the root directory, copy the master.dacpac file there and execute the sqlpackage command in the context of the root directory.
The example below specifies an absolute reference to sqlpackage (in case it's not already in your path) and a relative reference to your user dacpac (although an absolute reference will work too).
cd /
/sqlpackage/sqlpackage /Action:Publish /sf:"MyDb.dacpac" /tu:sa /tp:Password1 /tdn:MyDb /tsn:localhost
In a SQL Server Data Tools (SSDT) project in Visual Studio, we have a "core" set of SQL objects that are included in each SQL project we do - kind of like a class library. We keep these "core" SQL objects in a separate Git repo, and then include them in other projects as a Git submodule.
Once the "core" submodule is linked to the main project, we include the submodule files in our .SQLPROJ file like so:
<Content Include="..\CoreSubmodule\ProjectFolder\Scripts\**\*.*">
<Link>Scripts\%(RecursiveDir)%(FileName)%(Extension)</Link>
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
This works great for regular .sql files in the project - they show up with a special icon in Visual Studio indicating that they're referenced files, and the build engine is able to resolve references/dependencies just fine. Where we're running into a snag, though, is with our Pre- and Post-Deployment scripts.
We have a series of "core" Pre- and Post-Deployment master scripts that are common among projects, which we've just introduced into our "core" submodule. Here's how the directory structure looks at a high-level:
/Scripts/
/PostDeploy/
_PostDeployMaster.sql
/ReferenceData/
ReferenceDataScript1.sql
In the above structure:
•_PostDeployMaster.sql is a local file in the project and is set to Build Action = "PostDeploy". It has a reference to the ReferenceDataScript1.sql file.
•ReferenceDataScript1.sql is a reference to a file that physically exists in the submodule directory (one level up from our project) and is set to Build Action = "None". Note that Visual Studio displays it in the /ReferenceData/ folder as a linked file.
The _PostDeployMaster script references the sub-scripts via a SQLCMD reference:
:r .\ReferenceData\ReferenceDataScript1.sql
go
Trying to build the project in this manner produces a SQL72001 error in Visual Studio ("The included file does not exist"). Obviously if we physically place the ReferenceDataScript1.sql file in the directory (without having a reference), it builds just fine.
Options we've explored include having a non-Build "buffer" script between the PostDeploy master and the core subscripts (same error), and having pre and post build actions set to physically copy files back and forth from the submodule to the project to satisfy the build engine (a little too hacky for our taste).
Has anyone run into this issue, or have a serviceable workaround?
We wound up working around this issue by using Peter Schott's suggested fix in the original question's comments - using relative paths back to the submodule on disk instead of the "virtual" link inside of the actual Visual Studio SQL project.
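Concretely, that means the SQLCMD reference in the post-deploy script points back through the submodule on disk rather than through the linked /ReferenceData/ folder. A sketch; the exact relative path is an assumption based on the layout described in the question:

```sql
-- Reference the physical file in the submodule (one level up from the project)
-- instead of the "virtual" linked location inside the SQL project.
:r ..\CoreSubmodule\ProjectFolder\Scripts\ReferenceData\ReferenceDataScript1.sql
GO
```

Because SQLCMD resolves `:r` paths on disk at build time, the reference no longer depends on the build engine understanding the linked-file mapping.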
I was searching for how to organize an SSDT project with submodules and found your question.
I implemented something similar, but I kept the "virtual" link inside the actual Visual Studio SQL project. Here are my test projects:
https://github.com/gnevashev/Core
https://github.com/gnevashev/Installation01
In .sqlproj, I added:
<Build Include="Core\**\*.sql" Exclude="Core\**\*PostDepl*.sql" />
And in the project's PostDeploy script, I added a link to the Core PostDeploy script:
:r .\Core\Script.PostDeploymentPopulateData.sql
The question here indicates that when an ExePackage has a DownloadUrl, it also needs a copy of the SourceFile.
We keep the copy of the Sql Server Setup in a separate Release folder that is not part of the development environment. We do this so our daily backup doesn't have to copy the same 300+MB every time.
However, when Burn builds our Setup, it copies the SourceFile to the output folder along with the .exe it creates. The copy is named after the DisplayName and is the same size as the file in the Release folder.
The result is similar to setting CopyLocal on a project reference.
Can I tell Burn not to copy this file on build?
Edit
I am now deleting the file with a post-build event in Visual Studio. However, this doesn't answer the original question.
Further Information
After I delete the file and run the Setup, I get an error in the MSI log: "Failed to resolve source for file."
This happens at run time, and the file referenced is located in the project output folder. How is it possible that Burn looks for the source file at run time?
That question also mentioned that if you provide the RemotePayload element, Burn doesn't need the SourceFile. So use RemotePayload so that it never copies the file.
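A sketch of what that looks like in the bundle authoring (WiX 3.x). All attribute values below are placeholders; in WiX 3.x, `heat payload <file> -out payload.wxs` can generate the real RemotePayload values from the actual setup file:

```xml
<ExePackage Id="SqlSetup"
            Name="SQLSetup.exe"
            DownloadUrl="https://example.com/SQLSetup.exe">
  <!-- No SourceFile: the payload metadata below replaces it,
       so nothing is copied into the build output. -->
  <RemotePayload ProductName="SQL Server Setup"
                 Description="SQL Server Setup"
                 Version="12.0.0.0"
                 Size="315000000"
                 Hash="0000000000000000000000000000000000000000" />
</ExePackage>
```

Because the payload is described by metadata instead of a file on disk, Burn downloads it at install time rather than resolving it from the build output folder.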
I have a WPF application; when I double-click it in its own location, it runs with no problems.
When I try to start the app from the command prompt, I get an error.
I need to run the app from the command prompt (it is also launched via a startup registry entry).
Files that help describe the problem:
C:\Users\xxx\AppData\Local\Temp\WERC6F9.tmp.WERInternalMetadata.xml
C:\Users\xxx\AppData\Local\Temp\WERE6F8.tmp.appcompat.txt
C:\Users\xxx\AppData\Local\Temp\WERE718.tmp.mdmp
What could be the problem?
I hope this can help you.
This could be because of the path. When you double-click the app, the files the application requires are in the current folder, so the application is able to find and load them. But if you execute it from a path other than the app's path, say:
Your application is at D:\Data\Example.exe, and you launch it from a different location such as C:\Program Files. The app then runs with C:\Program Files as its current directory, searches for its required files there and in some temp locations, and throws an exception when it doesn't find them.
You can work around this as follows:
Write a batch file. First change the current directory to the application directory (D:\Data in this case), or add your application path to the PATH environment variable and use that.
Then invoke the exe.
Save the file, and call it from wherever you want.
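A minimal sketch of such a batch file, using the example paths from above (adjust D:\Data and Example.exe to your actual application):

```bat
@echo off
rem Switch to the application's directory so relative file lookups succeed
rem (/d also changes the drive if needed).
cd /d D:\Data
rem Launch the app; the empty "" is the window title argument for start.
start "" Example.exe
```

The `cd /d` is the important part: it makes the app's own folder the current directory before launch, so the relative-path file lookups behave the same as a double-click.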
I had the same issue.
I found out that it was because I built my application in Debug rather than Release; the Microsoft DLLs in the redistributable are for Release builds.