How to keep multiple versions of an artifact distinguished by properties in Artifactory?

Is it possible to somehow keep different versions of the same file in Artifactory, where the versions are differentiated by their properties?
For example, let's have a file foo.
I upload the file to Artifactory via the REST API and set ver=1 property.
The file changes, I upload it again, this time it has ver=2 property.
Later I try to access the ver=1 file, but get a 404 error.
I understand that Artifactory keeps different versions of artifacts that are associated with different builds. However, there is no build info other than the custom property for the files I am uploading. How can I version them?

You must make sure that each artifact is also deployed with a unique path/file name. It is not enough to have a different set of properties.
Usually the best way to version the file is to include the version number in the file name, and possibly also in the path.
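A minimal sketch of the unique-path approach: embed the version in the path, and keep the ver property as searchable metadata via Artifactory's matrix-parameter syntax. The repository name and base URL below are hypothetical placeholders.

```python
# Sketch: build a versioned deploy path so each upload gets a unique URL,
# instead of overwriting the same path with different properties.

def versioned_path(repo, name, version):
    """Return a unique path embedding the version, e.g. my-repo/foo/1/foo-1."""
    return f"{repo}/{name}/{version}/{name}-{version}"

def upload_url(base, repo, name, version):
    """Target URL for the PUT; ;ver=N sets the property as matrix metadata."""
    return f"{base}/{versioned_path(repo, name, version)};ver={version}"

print(upload_url("https://example.com/artifactory", "my-repo", "foo", 1))
```

The actual upload can then be done with any HTTP client, e.g. `curl -T foo "<that URL>"`; because each version lives at its own path, requesting the ver=1 artifact later no longer 404s.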

How to get information about libraries in qooxdoo?

I want to get info about the libraries listed in compile.json (libraries section) at runtime. How could I do it?
Specifically, I would like to get meta information about the themes used in the libraries.
There are two things here: first, metadata (which you would have to serve up to your client manually), and second, runtime information which is compiled in.
Meta Data
This data is used by the API viewer https://github.com/qooxdoo/qxl.apiviewer
There are two sets of metadata. First, when you compile with qx compile, the compiler will output a db.json in the root of the target's output (e.g. ./compiled/source/db.json); this contains information for each class, for every application, including dependency information.
The snag here is that if you add and remove dependencies, there could be classes listed in db.json which are no longer used, and if you have several applications you will need to walk the dependency tree to find out which classes are used by which application.
The second set of data is that every class is transpiled into the compiled/source/transpiled/ directory and has a .js, .map, and .json file - so for qx.core.Object you will have ./compiled/source/transpiled/qx/core/Object.{js,map,json}
The class's .json file contains loads of information about the superclass, interfaces, mixins, members, and properties - and for each one, it records the parsed JSDoc as well as where in the original source file it occurs.
Runtime Information
There is a global environment variable called qx.libraryInfoMap that lists information about all of the libraries which are compiled in; the contents come from each library's Manifest.json.
There is also qx.$$libraries, a global variable which gives URIs.
Safety
None of the data above is documented or guaranteed to be immune from changes in structure; technically these are internal data structures. That said, they are reasonably unlikely to change, and using them is supported.
Because of the impact on other people's code, as well as on our internal tools such as the API viewer, we would try not to change the structure, but it is not guaranteed, and a minor version bump could (potentially) change it.

How to implement a tool to validate content and push it to the cloud

I am currently committing text files and images to a GitHub repository that need to have a certain format and content style, with validation done via pre-commit hooks, so all files committed to the repository are valid. I also need git to keep track of when files are updated and the versions that existed previously.
In the future I want to move to storing the files in a cloud service instead of the repository. This is the solution I thought of:
Have a script that takes the directory you are trying to upload; the name would need to follow a certain format, e.g. <City><Street>.
If the folder already exists in the cloud, the script compares the folder contents to the cloud copy; if not, the whole folder gets uploaded.
Before upload we run content format validation; if it doesn't pass, we report the errors to the user.
If there were previous file versions, we store them in a different folder, appending the date/time to the filename.
The cloud now has the latest version.
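The versioning step could be sketched like this; the archive naming scheme and the validation rule are placeholders for whatever format checks the pre-commit hooks currently enforce:

```python
# Sketch: before overwriting a file in the cloud, compute an archive name
# for the previous version by appending a UTC timestamp.
from datetime import datetime, timezone

def backup_name(filename, now=None):
    """foo.txt -> foo_20240101T120000.txt"""
    now = now or datetime.now(timezone.utc)
    stamp = now.strftime("%Y%m%dT%H%M%S")
    stem, dot, ext = filename.rpartition(".")
    return f"{stem}_{stamp}.{ext}" if dot else f"{filename}_{stamp}"

def validate(text):
    """Placeholder content check; replace with the real format rules."""
    return bool(text.strip())
```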
I would lose a lot of the advantages that I had with version control and the pre-commit hooks before. I would gain the ability to pull just a specific folder, something that GitHub doesn't allow me to do. What would be a better way to implement this? Is there a tool that would be good for this?

How to update .env file and share among teammates?

I created a .env file with params, pushed it to GitHub, and my teammates downloaded the repo. In the next push I added the .env file to .gitignore. Now I need to make changes to the .env file, but how will they get it if .env is ignored? What is the right way of doing this kind of manipulation?
UPDATE:
I used two libraries to manage env variables:
https://www.npmjs.com/package/dotenv
https://www.npmjs.com/package/config
You do not store the configured .env file in the repository; instead, you create a .env.dist (or anything named like that) and commit that file. Your .dist file can contain all keys commented out, or all keys with default values. It's all up to you, but you need to ensure your template does not contain any sensitive data either:
DB_HOST=
DB_USER=
The main benefit is that you do not have .env in the repo, so each developer can easily set up their own system as they like/need (i.e. local db, etc.) and there's no risk of having such a file accidentally overwritten on the next pull, which would be frustrating.
Also (again), you do not store any sensitive data in the repository, so while your .env.dist can and maybe even should be pre-configured with your defaults, you must ensure any credentials are left empty, so no one can e.g. run the code against a production machine (or figure out anything sensitive based on that file).
Depending on the development environment you use, you can automate the creation of the .env file, using the provided .env.dist as a template (which is useful e.g. with CI/CD servers). As the dotenv format is pretty simple, processing it is easy. I wrote such a helper tool for PHP myself; it is pretty simple code and can easily be ported to any other language if needed. See process-dotenv on GitHub for reference.
Finally, if for any reason config setup is complicated in your project, you may want to create an additional small script that can collect all the data and write an always up-to-date config file (or upgrade an existing one, etc.).
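A minimal sketch of such a helper, generating a developer's .env from the committed .env.dist. The file names are the ones from the answer; the merge rule (existing local values win over template defaults) is an assumption:

```python
# Sketch: parse simple KEY=VALUE dotenv lines and fill the .env.dist
# template with any values already set in a local .env.

def parse_env(text):
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    out = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        out[key.strip()] = value.strip()
    return out

def render_env(dist_text, local_text=""):
    """Start from the template's keys; keep local overrides where present."""
    merged = parse_env(dist_text)
    local = parse_env(local_text)
    merged.update({k: v for k, v in local.items() if k in merged})
    return "\n".join(f"{k}={v}" for k, v in merged.items())
```

A CI job could then call render_env on .env.dist alone to get a defaults-only .env, while developers keep their local values across template updates.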

How to debug the error "The requested property 'current-activity' is not available"?

My Project is using IBM ClearCase as version control tool. I recently checked in some of the files, modified them and even checked out the modified files.
The next day, when I try to check in some of the files, I get the following error:
the property is not available locally: stream
and
the property is not available locally: current-activity
Is there any way to resolve this error? I am stuck.
I recently checked in some of the files, modified them and even checked out the modified files.
You actually check out files first, then modify them, then check them in.
You check out in order to make a file modifiable.
the property is not available locally: current-activity
The exact error message is actually:
The requested property 'current-activity' is not available
See this IBM technote:
Cause
The information within the .Rational folder had become stale or corrupted.
ClearQuest database connections require refreshing as they are no longer seen.
The view tag (as stored in the registry server) is no longer visible on the Change Management (CM) Server when performing a cleartool lsview.
This issue can also occur if the region used no longer holds the right VOBs: while trying to create new views or change load rules, the VOBs are missing. This can happen due to a changed region map file.
So it depends on your OS, your version of ClearCase, and whether or not it is integrated with ClearQuest.

Dart: Accessing a resource out side the project|web/ directory

I have a web app (browser based) which needs to access a folder full of icons that resides outside the web folder.
This folder MUST be outside the web folder, and would ideally exist outside the project folder altogether.
However, when specifying the path to the folder, neither "../" nor making use of a symlink will work.
When the page attempts to load the image I always get
"[web] GET /Project|web/icons/img.png => Could not find asset Project|web/icons/img.png."
even though I set the image source to "../icons/img.png".
How can I get Dart to access this file properly?
PS: I attempted a symlink to another part of the filesystem (where the images would ideally be kept); however, this did not work either.
The web server integrated into DartEditor or pub serve only serves directories that are added as folders to the files view. When you add the folder to DartEditor you should be able to access the files. This is just for development.
You also have to find a solution for when you deploy your server app. It would be a hazardous security issue if you could access files outside the project directory. Where should the server draw the line? If this were possible, your entire server would be accessible to the world.
Like #Robert asked, I also have a hard time imagining why the files must not be in the project folder.
If you want to reuse the icons/images between different projects you could create a resource package that contains only those images and add them as a dependency to your project.
If you want a better answer you need to provide more information about your requirements.
If you wrote your own server (using the HttpServer class), it may be possible to use VirtualDirectory to serve your external files.
Looking at the dartiverse_search example may give you some ideas.
You could put them in the lib directory and refer to them via /packages/Project/...
Or in another package, in which case they would be in a different place in the file system. But as other people have said, your requirement seems odd.
