Artifactory deletes artifacts of previous builds - versioning

We want to use Artifactory with TeamCity CI for our production. I am in the process of testing it and have run into a problem. This is what happens:
I am using the generic repository.
There are 2 jobs: a producer (deploys some artifacts) and a consumer (only downloads them).
The producer, with build number #1, deploys 5 artifacts into Artifactory via the Artifactory Plugin.
The consumer is set to resolve the artifacts of that producer build. All 5 are downloaded.
The producer runs again with build number #2, again deploying 5 artifacts.
The consumer is now set to resolve the artifacts of producer build #1. Only 2 of the 5 artifacts are downloaded.
The same thing happens when using the REST API: downloading the artifacts of the newest build works fine, and all of them are downloaded. But when I try to download artifacts of older builds, only some of them are.
I have not set up any cleanup policy, so artifacts should not be deleted.
The artifacts from producer build #1 and #2 may or may not be the same - in both cases I want to download all of them.
Is there something I am not getting right? It looks as if the older artifacts are, for some reason, thrown away with the new build.
When I view the published artifacts of the build I want in the Artifactory web GUI, it says "No path found (externally resolved or deleted/overwritten)" next to them.

If you want to keep versions of the deployed artifacts, they must be deployed with a unique path/file name.
Otherwise, Artifactory will overwrite the artifact that already exists at that path. Notice that this is different behavior from a version control tool, which keeps revisions of the same file. That makes less sense for binaries, as a binary diff is usually not that useful.
The build info only keeps metadata about your build; it does not take care of versions.
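
As a rough sketch, deploying with the build number in the path keeps each run separate. The host, repository name and credentials below are made up for illustration:

# Deploy the artifact under a build-specific path via the REST API,
# so build #2 does not overwrite what build #1 published.
curl -u user:password -T app.zip \
  "https://artifactory.example.com/artifactory/generic-local/producer/build-1/app.zip"

# The consumer can then fetch exactly the files of a given build:
curl -u user:password -O \
  "https://artifactory.example.com/artifactory/generic-local/producer/build-1/app.zip"

With unique paths, nothing from an earlier build is ever overwritten, so older builds stay fully downloadable.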

Related

How to add a service worker to an existing, old React project?

I'm working on an old React project which I need to add functionality to, but when I deploy the React build on the server, it fails, claiming it cannot find several CSS and JS files, although I published all the files within the build folder. I tried different things:
First, I kept the old service-worker.js in the production folder the IIS uses, but replaced everything else.
Then, I tried deleting the service-worker.js as well, since I thought it was optional and my npm run build didn't create a service-worker.js file.
Then, I tried copying the service-worker.js file that existed on production, and manually changing it to point to my css and js files in the /static/ folder of my build folder.
All of these solutions have yielded the same result. So I have a few questions:
Is the service worker necessary? If not, could this error be caused by something entirely different from the service worker?
If it is necessary, why could my npm run build command not create the service worker with the rest of the files in the build folder?
If I do need it, how can I manually add it to a project that already exists?
If the production folder already had a service worker and my build is not producing one, I could assume my React version is newer. But I find that odd, since the computer I use is one a former employee of my company used, and I didn't manually change anything about this project.

Karaf: feature:install restarts previous bundles

I'm facing an irritating behavior from my Karaf server. The title says it all: installed bundles get restarted when I use a feature:install command.
* Project context *
Most of the bundles I deal with are camel routes; the other ones are common tools shared by the routes.
As a result, I have a two-level project: a common part that is installed first, and the camel routes, which all depend on the common part (from a Maven dependency point of view).
* Scenario *
start a fresh instance of Karaf
install the common features
install a camel route feature: no trouble so far
install a second camel route feature: the bundles from the previously installed feature restart.
* Breakthrough made *
All the bundles declared a common config file with the option "update-strategy=reload". This means that Karaf would notify each bundle of any modification of this file, and the bundle would restart to take it into account.
As a matter of fact, when I installed a new bundle with a dependency on this file, the file would be read in order to initialize the bundle's properties, and Karaf considered that a file modification. Therefore, installing a new bundle made all the others restart.
As you'd expect, I dealt with that problem by removing the update-strategy option, and most of my features are now clean.
* Leftovers *
BUT, some of them still hold the bug: installing any of those troublesome features makes all the other installed features restart. This is a ONE-WAY problem: installing a clean bundle will not make the troublesome ones restart.
I checked anyway, but no other config file could be responsible for that.
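
For reference, a crude way to sweep the sources for leftover reload declarations is a recursive grep (the file patterns are just an illustrative guess at where such declarations live):

# Search blueprint XML and config files for any remaining reload strategy
grep -R --include="*.xml" --include="*.cfg" "update-strategy" .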
Any help or advice would be appreciated. I can also provide anonymized examples of any file that would help you understand, like an osgi-context or a feature's pom.xml.
One last thing: my features group around 50 bundles each, so I can barely make sense of the Karaf logs, and I can't pinpoint which bundle is restarted first.
Thanks for your time and attention!
I think there are some misconceptions in what you describe.
update-strategy=reload does not cause a bundle to reload. It causes a blueprint context to reload.
You should also not share the same config between bundles; it is known to mess up your deployments.
There are also other reasons why a bundle may restart. A Karaf feature install tries to provide the optimal set of bundles needed overall in Karaf to satisfy the set of currently installed features.
A typical case is that you first install a feature with a bundle containing an optional package import. At that moment nothing can provide the package. Then you install a second feature that exports the package. Now the optional dependency of the bundle can be satisfied, and the bundle will be restarted by Karaf.
You can look into such cases by using feature:install -v. This will show you which bundles are restarted and also why. So maybe this can help you debug why the restart happens.
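
For example (the feature name here is made up), run the install from the Karaf console with the verbose flag:

karaf@root()> feature:install -v my-camel-route-feature

The verbose output lists each bundle that gets started, stopped or refreshed during resolution, which should point to the bundle that triggers the cascade.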

Deploying WebSite builds to Azure from VSTS Release Management

I'm kicking the tires on the preview for the Visual Studio Team Services new Release Management system. My scenario is a classic website (ASP.NET 4.5) with a Git repo hosted in VSTS. The build definition is successful as seen here:
It is set up to publish as an artifact that can be picked up by Release Manager as shown here:
On the Release Manager side I have that artifact linked properly as shown here:
And here you can see my environments as well as the associated tasks (all 3 are clones).
When I run the release, the build publishes fine and it connects to my subscription, but when it attempts to find the package file, it fails with the following error on line 101 of the output log:
"No files were found to deploy with the search pattern 'C:\a\4fe43dd1a***.zip'"
Here is the full output:
This is where I am stuck, as I assumed my artifact link via VSTS would resolve this path for me. Obviously I am missing an important piece of the puzzle somewhere, but I've followed the available documentation as best as I can.
If anyone has a solution or can point me in the right direction it would be much appreciated!
--- EDIT ---
I used the file picker to select a web deploy package (see below). I tried using the root website as well as the bin folder. Both attempts result in an error stating: "No files were found to deploy with search pattern 'C:\a\4fe43dd1a\Classic Website Definition\drop\ClassicWebsite\bin'"
--- EDIT 2 ---
I added an MSBuild task to my BUILD process with the following MSBuild arguments:
/p:OutDir=$(build.stagingDirectory) /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true
and in my Copy/Publish Artifacts task I limited the output to only copy .zip files. Now in my RELEASE process, when I navigate to find a "Web Deploy Package", the "drop" folder is empty. Here is a screenshot:
I think I'm on the right path; I just need help figuring out how to tune my BUILD tasks to generate the right artifacts for my RELEASE process to use. Any help would be appreciated.
The deploy package isn't copied to the artifacts folder. That's why Release Management cannot find the package. Setting "Copy and Publish Build Artifacts" to the following should fix your problem:
Change $(System.DefaultWorkingDirectory) to $(Agent.ReleaseDirectory). The artifacts will be put in that folder. I don't know exactly what $(System.DefaultWorkingDirectory) maps to, but my impression is that it's something outside of the folder used by the agent for your release.
Also make sure that the published artifacts contain the expected zip file -- if the deployment package isn't getting created, or if you're not publishing the output folder that contains the package, you obviously won't be able to release it later.
When you build your web application, make sure it's packaging for deployment by using MSBuild arguments that package it up. Something like this should work:
/p:OutDir=$(build.stagingDirectory) /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true
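
To sanity-check locally that these arguments really produce a Web Deploy .zip, you can run the same build by hand; the solution name and staging path below are placeholders:

rem Build with the same packaging arguments the build task uses; a .zip
rem package should appear under the chosen output directory.
msbuild ClassicWebsite.sln /p:OutDir=C:\staging\ /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true

If the .zip shows up there, the remaining work is making sure the Copy/Publish Artifacts task picks it up into the drop folder.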

Git didn't add x64\SQLite.Interop.dll

I installed SQLite into my WPF project via NuGet, then added the entire project to a remote repo. Then I cloned the project on another machine and had a broken build.
x64\SQLite.Interop.dll was missing.
I'm puzzled why Git didn't include one file from my project. I checked the repo on Bitbucket and confirmed it is not there. Git status reports "nothing to commit, working directory clean".
It added the x86 version but not the x64 version; I can't imagine why.
(project)\x64\SQLite.Interop.dll  <-- Git ignored this file!
(project)\x86\SQLite.Interop.dll
You might want to check the .gitignore file at the root of the repo. If it contains, for example, x64, it would ignore this file.
There would be two main possibilities then:
edit this file to fit your needs
or force this file to be added, i.e.: git add -f x64/SQLite.Interop.dll
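
To see exactly which rule is catching the file, Git can tell you directly; the output line shown is just an example of the format:

# Ask Git which ignore rule matches the file, and where that rule lives
git check-ignore -v x64/SQLite.Interop.dll
# Example output: .gitignore:40:x64/    x64/SQLite.Interop.dll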
However, committing binary files is often frowned upon. That is true in particular if you want to keep up to date with the latest package, and hence plan to commit new versions of the DLLs on a regular basis.
You might rather want to consider the NuGet package restore feature. Basically, the idea is that you commit a config file, and the client will automatically download the corresponding packages.
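
A minimal sketch of that workflow, assuming a standard layout (the solution name is a placeholder):

# Keep the downloaded packages out of the repo instead of committing binaries
echo "packages/" >> .gitignore

# On a fresh clone, restore every package listed in packages.config
nuget restore MySolution.sln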

Deployment error on Google App Engine - Uploading 0 files

I recently relocated my Eclipse Java GAE project to a different location on my computer (both locations are under Dropbox). Since then, I've been having issues with deployment. When I make changes to files and save them, sometimes it doesn't recognize the changes and doesn't upload them (so when I deploy through Eclipse, it says "uploading 0 files" and the live deployment is not updated).
Sometimes it does work (after several cleans and restarts of Eclipse).
Any help on how to fix this would be greatly appreciated, thanks!
It seems solved now; some notes to consider:
Ensure that you are not manually modifying (adding/removing) libraries under the war/WEB-INF/lib folder. Add libraries to that folder only when you are actually using them. (I had some unused libraries there.)
Ensure that war/bin/classes is cleaned after you clean the project.
I reinstalled the plugin as well, just in case
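
If Eclipse still reports "uploading 0 files", deploying from the command line with the SDK's appcfg tool bypasses Eclipse's change detection; the SDK path and war location below are assumptions:

# Deploy the exploded war directly with the App Engine Java SDK
./appengine-java-sdk/bin/appcfg.sh update myproject/war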
