In SSIS I am using a Foreach loop container to loop through a list of files.
My project works fine locally - it finds the files - but when I deploy it and run it from SSMS, it gives the error:
The for each file enumerator is empty. The for each file enumerator did not find any files that match the pattern, or specified directory was empty.
The directory I am pointing to is a network location, and I have permissions on this directory.
Has anyone experienced this issue before?
Related
I have a package that loads CSV files from a folder. I have set up a Foreach File enumerator to loop through the directory and pick up the files that start with "Example*.csv", including the extension.
The package runs fine in SSDT, but once I deploy it and run it through the Agent job, it does not load anything, yet the job runs successfully. I have removed the Foreach Loop and connected the Flat File connection to a single file, and it works perfectly.
Please assist; I am new to this.
Expression passed to the Flat File connection manager:
FolderLocation = "\Sharedrive\Business\Company-info
I am building a database dacpac using sqlpackage on a Windows machine. The project contains a reference to master.dacpac.
I then move the dacpac to a Linux machine (the mssql-server-linux Docker image) and restore the database.
deploy-database.sh
# publish dacpac using sqlpackage
./sqlpackage/sqlpackage /Action:Publish /sf:"/MyDb.dacpac" /tu:sa /tp:Password1 /tdn:MyDb /tsn:localhost
Error:
No file was supplied for reference master.dacpac; deployment might fail. When package was created, the original referenced file was located C:$(windows machine path)\MASTER.DACPAC.
Initializing deployment (Failed)
An error occurred during deployment plan generation. Deployment cannot continue.
Error SQL0: The reference to external elements from the source named 'master.dacpac' could not be resolved, because no such source is loaded.
Warning SQL72025: No file was supplied for reference master.dacpac; deployment might fail. When package was created, the original referenced file was located C:$(windows machine path)\MASTER.DACPAC.
An error occurred while adding references. Deployment cannot continue.
The command '/bin/sh -c sh /deploy-database.sh' returned a non-zero code: 1
I have tried adding master.dacpac to the project directly and also copying it to the Docker image, but the same error occurs.
How can I restore a dacpac that has a reference to master.dacpac in a Linux environment?
I had a similar issue. My solution was to rename the dacpac files to UPPERCASE (e.g. MASTER.DACPAC) and to make the directory containing the dacpac files the working directory; that worked for me.
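For example (the /dacpacs path below is only an assumption for illustration):
cd /dacpacs                      # make the folder that holds the dacpacs the working directory
mv master.dacpac MASTER.DACPAC   # upper-case the referenced dacpac's file name
Then run the sqlpackage publish command from that directory.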
I have tried adding master.dacpac to the project directly and also copying it to the Docker image, but the same error occurs.
Make sure the master.dacpac file is in the current working directory. Since your MyDb.dacpac file is in the root directory, copy the master.dacpac file there and execute the sqlpackage command from the root directory.
The example below specifies an absolute reference to sqlpackage (in case it's not already in your path) and a relative reference to your user dacpac (although an absolute reference will work too).
cd /
/sqlpackage/sqlpackage /Action:Publish /sf:"MyDb.dacpac" /tu:sa /tp:Password1 /tdn:MyDb /tsn:localhost
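A minimal sketch of the whole sequence, assuming master.dacpac has already been copied into the image at /dacpacs/master.dacpac (that location is an assumption):
# place the referenced dacpac next to the package being published
cp /dacpacs/master.dacpac /master.dacpac
cd /
/sqlpackage/sqlpackage /Action:Publish /sf:"MyDb.dacpac" /tu:sa /tp:Password1 /tdn:MyDb /tsn:localhost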
I am trying to archive all the files into a zip file that is created in the workspace, in a Jenkins pipeline script. I tried using this:
archiveArtifacts 'C:\Program Files (x86)\Jenkins\jobs\pipeline CI_MS\workspace'
but an error was shown: "file not found".
Thanks for any help
Do you really want to archive everything in the entire workspace? Hardcoding the path like that is a bad idea. The workspace moves, and if you are using a more recent version of Jenkins (one that wasn't upgraded from an old version), you are probably not even looking in the right place.
Use this:
archiveArtifacts "${WORKSPACE}"
Add /** to the end of the path if you want to archive files in subdirectories.
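If you only need particular files rather than the whole workspace, an Ant-style pattern relative to the workspace also works. A minimal scripted-pipeline sketch (the stage name and pattern are assumptions):
node {
    stage('Archive') {
        // the pattern is resolved relative to the workspace, so no hardcoded path is needed
        archiveArtifacts artifacts: '**/*.zip', fingerprint: true
    }
}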
Firstly, apologies if this has already been asked. I searched, but couldn't find anything specifically relating to this problem...
I'm having a big issue with Coded UI. In my case, I deploy my software to an environment and then run the automated tests via Visual Studio against that environment. As the tests start, the test agent appears to copy whatever files it wishes into a temporary directory from which to run the tests, as described on this page:
https://msdn.microsoft.com/en-us/library/ms182475(v=vs.100).aspx
The following files and folders are copied to the deployment folder before the tests are run:
• The test assembly file
• All dependent assemblies
• Files that you have specified, such as XML files and configuration files, on which tests depend
You can configure deployment both by specifying additional deployment items to be copied and by changing the deployment folder.
My issue with this is that my tests reference a file that is present on my environment. Into the directory "C:\Program Files (x86)\Common Files\Microsoft Shared\VSTT\12.0\UITestExtensionPackages" I have placed an extension package that allows me to interface with some third-party WPF UI components. This is in place on the environment, and my test projects within Visual Studio all reference the file in that location as a dependency, with "Copy Local" set to false and "Specific Version" set to true.
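For context, that reference looks roughly like this in each test project's .csproj (the assembly name and version are placeholders):
<Reference Include="ThirdParty.UITestExtension, Version=1.0.0.0, Culture=neutral, processorArchitecture=MSIL">
  <SpecificVersion>True</SpecificVersion>
  <HintPath>C:\Program Files (x86)\Common Files\Microsoft Shared\VSTT\12.0\UITestExtensionPackages\ThirdParty.UITestExtension.dll</HintPath>
  <!-- Private corresponds to "Copy Local" in the Visual Studio properties window -->
  <Private>False</Private>
</Reference>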
The problem occurs when the test agent starts the tests, and copies all of the files it thinks it requires to the temporary test directory. It also copies this extension file, resulting in every test failing with the error:
System.InvalidCastException: [A]<type here> cannot be cast to [B]<same type here>
It appears to be referencing the copy I intentionally placed there as well as the one the agent copied over. If I manually go in and remove the file, the tests start to pass.
My question is: how do I prevent the test agent from copying this file? I know the file will always be in the location where I placed it, and I do not need it to be copied over.
The question here indicates that when an ExePackage has a DownloadUrl, it also needs a copy of the SourceFile.
We keep the copy of the SQL Server setup in a separate Release folder that is not part of the development environment. We do this so our daily backup doesn't have to copy the same 300+ MB every time.
However, when Burn builds our Setup, it copies the SourceFile to the output folder along with the .exe it creates. The filename is the DisplayName and the file is the same size as the file in the Release folder.
The result is similar to setting CopyLocal on a project reference.
Can I tell Burn not to copy this file on build?
Edit
I am deleting the file with a post-build event in Visual Studio. However, this doesn't answer the original question.
Further Information
After I delete the file and run the Setup, I get an error in the MSI log: Failed to resolve source for file.
This happens at run-time, and the file referenced is located in the project output folder. How is it possible that Burn is looking at the source file at run-time?
That question also mentioned that if you provide the RemotePayload element, then it doesn't need the SourceFile. So use RemotePayload so that Burn never copies the file.
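A minimal sketch of an ExePackage that uses RemotePayload instead of SourceFile (every value below is a placeholder; substitute your package's real name, URL, version, size, and hash):
<ExePackage Id="SqlServerSetup"
            Name="redist\SqlServerSetup.exe"
            DownloadUrl="https://example.com/SqlServerSetup.exe"
            Compressed="no"
            PerMachine="yes"
            InstallCommand="/q">
  <!-- RemotePayload describes the remote file, so Burn never needs a local
       SourceFile and nothing is copied to the output folder at build time. -->
  <RemotePayload ProductName="SQL Server Setup"
                 Description="SQL Server Setup"
                 Version="1.0.0.0"
                 Size="123456789"
                 Hash="0000000000000000000000000000000000000000" />
</ExePackage>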