Is the Raven.Client.Authorization 1.0.960 package compatible with Raven.Client 1.0.972?

The current version of Raven.Client.Authorization is one version behind Raven.Client. The new Raven.Client allows you to use the latest Json.NET package, and therefore RestSharp, etc.
I hope to save some time / avoid a deep valley of frustration here. Can 1.0.972 support the 1.0.960 Authorization?

I'm not sure it works out of the box. Try an assembly binding redirect (see the sketch below). If that doesn't work, you can get the source of build 972 from this branch and compile Raven.Client.Authorization yourself, after updating the Json.NET reference there.
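For reference, a binding redirect lives in your app.config; here is a minimal sketch, assuming Json.NET is the conflicting dependency (the version numbers and public key token below are illustrative, so check the assemblies you actually ship):
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- redirect the old Json.NET version the 960 package expects
             to the newer one shipped alongside Raven.Client 972 -->
        <assemblyIdentity name="Newtonsoft.Json"
                          publicKeyToken="30ad4fe6b2a6aeed"
                          culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-4.0.0.0"
                         newVersion="4.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>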
P.S. If your app is in development and will be for the next two months, I strongly recommend trying out v1.2, which has already started the stabilization process.

Related

How to create Salesforce incremental package.xml automatically?

Has anyone experimented with creating a Salesforce package.xml automatically for continuous integration? If there is any script or idea, please share.
As you know, an incremental package.xml helps deploy only the modified files, rather than a complete package.xml that redeploys unmodified files as well, which takes a lot of time.
Thanks in advance!
Tricky. And not really a programming-related problem; consider cross-posting this to https://salesforce.stackexchange.com/ or maybe even https://devops.stackexchange.com/
I don't think there's a clear answer; you'll have to experiment. Especially since you tagged "migration tool" (the old-school, battle-tested but lower-priority Metadata API; it seems all the focus is now on the SFDX style of deployments). Do you use any version control (ideally Git), or do you hope to somehow compare the source and target orgs, figure out the deltas, and deploy only those?
Remember that SF often gets better at detecting "no changes" with every release (how old is your migration tool's jar file?). For example, when I deploy my current project to an empty sandbox (an exact copy of prod, with no custom objects, code, etc. yet), the initial deploy takes ~7 minutes, but any subsequent deploy with the same content or slight changes takes just 3-4. So try to calculate the time lost in the grand scheme of things, and decide what gains you want to see and how much time you want to spend experimenting and tweaking the solution.
You could look into dedicated deployment solutions such as Gearset, Autorabit, or Odaseva (I'm not affiliated with any of them, and this list is not exhaustive). They are often capable of running a comparison for you.
There are several projects that try to compose a package.xml based on the Git diff between two commits. Of course, you need to have a repo first, and some discipline:
https://github.com/cloudsandbox/sfdx-gen-pack (I saw a presentation about it at Cloudforce London 2019)
https://github.com/Accenture/sfpowerkit seems to have a "diff" command (disclaimer: I used to work for Accenture but am not affiliated now; I haven't worked on the tool and haven't used it personally)
https://cumulusci.readthedocs.io/en/latest/ - this one seems interesting and mature. Built by SF employees; not an official tool, but used to CI-deploy the non-profit packages they build (maybe you've heard of the Non Profit Starter Pack, especially if you ever considered enabling Person Accounts). I'm not sure they do delta deployments as such, but there seems to be a command that updates package.xml with the files in the repository, so it's a start? https://cumulusci.readthedocs.io/en/latest/tutorial.html#part-4-running-tasks
I'm not saying CumulusCI will be a silver bullet, but of these three it seems to be the most actively maintained ;) It sounds like you'd have to get familiar with SFDX though (if not the whole thing, then at least the commands to convert a project back and forth between the "source" (SFDX) structure and the Metadata API structure).
Answering my own question: I found that git diff master feature/vat | force-dev-tool changeset create vat works!
Thanks to Roman, who answered it in https://salesforce.stackexchange.com/questions/184332/is-there-a-pre-build-solution-for-generating-a-package-xml-from-a-git-repo

ClickOnce deployment strategy: updating specific users for beta testing

Question is plain and simple:
I want to publish my latest version to selected users for beta testing. Is there any quick and dirty way to do this?
Here is a link I found on MSDN, which is a bit old and suggests an approach, but I don't think I need to put in that much effort; I would rather just use a different installer for beta testing.
https://msdn.microsoft.com/en-us/library/aa480721.aspx
I think you should create a new site with a different publish URL; it is the most manageable option I can think of. Users connect to that site and download the beta version, and they can run it in parallel with the other version for comparison purposes.
You are going to have two versions of the client application, so you are going to have two installers; I don't think you can get around that. Having multiple publish URLs means people don't have to uninstall/reinstall the app to get the "right" version.
Embrace Mage and script it out; a sketch follows.
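For illustration, a rough sketch of what scripting it out with Mage might look like; the paths, URL, version, and certificate below are placeholders:
rem update the deployment manifest to point at the beta publish URL,
rem bump the version, and re-sign it
mage -Update beta\MyApp.application ^
     -AppManifest beta\1.2.0.0\MyApp.exe.manifest ^
     -ProviderUrl "https://example.com/beta/MyApp.application" ^
     -Version 1.2.0.0
mage -Sign beta\MyApp.application -CertFile beta.pfx -Password secret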

Chef Package Versioning

If a Chef recipe (or any of its cookbook dependencies) uses the package resource without specifying a version, then the latest version of the package is installed. If you want to control and test exactly what you are installing, then you must always supply the package version. What can you do when the cookbooks that you depend on do not take the same precautions?
See, for example, the default recipe in the ark cookbook. If this recipe is used on a production server, it could install packages that have not been tested. This is just one example (from a cookbook with over 5 million downloads), so I am wondering how people get around this problem.
What can you do when the cookbooks that you depend on do not take the same precautions?
I don't think there is a simple answer. This is basically "the Chef way" ...
(Actually, I would suggest that hard-wiring package versions could do more harm than good. One of the good things about using a (good) distribution's package repo is that it regularly releases updates with patches for security issues and bugs. But if you wired fixed package versions into your recipes, roles, nodes, or the like, you would prevent any such patches from propagating to your system.)
However, if it is critical to you that package versions are stable then maybe ...
Clone and hack the cookbooks in question to use specific versions. (Actually, you probably need to do this anyway, to avoid being bitten by unstable cookbooks!)
Use a distro (such as RHEL or its "clones") that values long-term stability, and only pushes out package updates that are "really important".
Create your own private mirror of the distro's package repos with only the "good" versions of the critical packages in it.
Modify Chef so that the package resources pick/install specified versions by default. (I don't imagine this would be easy. But if you did come up with a good solution at this level, it would be a pretty useful addition to Chef! IMO.)
UPDATE
Actually, there is a way to do this for (at least) Debian-based systems; see the apt cookbook and in particular the references to "pinning".
Or with yum, you could "lock" particular versions using "yum versionlock ..." as described here: https://www.zulius.com/how-to/yum-install-specific-package-version/
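For example, with the apt cookbook's apt_preference resource, a pin looks something like this (the package name and version string are illustrative):
# pin libmysqlclient16 to a known-good version via apt pinning
apt_preference 'libmysqlclient16' do
  pin          'version 5.1.49-3'
  pin_priority '700'
end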
UPDATE - 2
Another possible trick would be to "inject" a version attribute into the "unsafe" package resources. Something like this:
# first, include_recipe a recipe that declares 'package "foo"' without
# a version attribute
# then look the resource up in the resource collection and set its
# version attribute ...
r = resources("package[foo]")
r.version("1.2.3")
With a little ingenuity, one could create a "package version lock" recipe that pulled the versions from a data bag and dealt with missing-resource exceptions and version attributes that were actually provided; something like the sketch below. But I don't know if this is "A Good Idea" (tm).
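For what it's worth, a hypothetical sketch of such a recipe; the data bag name and layout are invented for illustration:
# pin versions listed in a data bag onto package resources declared
# elsewhere in the run list
locks = data_bag_item('version_locks', node.chef_environment)
locks['packages'].each do |pkg_name, pkg_version|
  begin
    resources("package[#{pkg_name}]").version(pkg_version)
  rescue Chef::Exceptions::ResourceNotFound
    Chef::Log.warn("no package[#{pkg_name}] resource to lock")
  end
end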
Chef's package resource uses the package manager of the node's operating system (apt, yum, etc.). These tools install the most recent version available from the configured repositories, which is why Chef's package resource installs that version too.
What the ark cookbook does is download the source code and then compile it; obviously you can then specify the version to install (through the URL you pass).
So it depends on your actual need. If you want to install the version that is available through the distro's (or your own) package repo, then it's totally fine (and that's what most cookbooks do). If you want to compile everything from source (where you usually have the option to specify the version), the coverage of Chef cookbooks supporting this is lower.
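If the repo does carry the version you need, pinning it at the resource level is a one-liner (the version string here is illustrative and distro-specific):
# install an explicit package version instead of whatever is newest
package 'nginx' do
  version '1.18.0-0ubuntu1'
  action :install
end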
Personally, I'd suggest that you set up your own apt/yum/whatever repo for the software for which you have specific version requirements.
In short: I don't manage this.
In a more complete answer:
Every distro release goes through a validation phase before new packages are released; I'm confident in that process, and it helps me stay in sync with security fixes.
As far as I know, all package managers take care not to upgrade a package in a breaking way if it is a dependency of a package that was installed manually; again, you have to trust the package maintainers on this.
That is, the package resource without a version won't update make or gcc if either is a dependency of a package you installed at a fixed version.
For example, under Ubuntu, if you set the nagios package to manual, apt will never update the libc package across a breaking change, since doing so could break the installation of other packages whose dependencies would no longer be satisfied.
If you're absolutely concerned about it, you have some choices:
Rewrite each package resource to use a fixed version of the package.
Fork the offending cookbooks to fix those issues (you can write a Foodcritic rule to help you detect package resources without specific versions; see the sketch below).
Maintain your own repos, stable and testing; move packages into the stable repo once tested, and point your staging/QA environment at the testing repo.
Option 3 is the most conservative, as you choose what goes into the stable repo and it won't change magically. The drawback is that you'll have to manage security fixes yourself.
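For the Foodcritic idea above, a rough sketch of a custom rule, assuming Foodcritic's find_resources and resource_attribute API helpers (the rule code MYCO001 is made up):
rule 'MYCO001', 'Package resource without a pinned version' do
  tags %w(packages)
  recipe do |ast|
    # flag every package resource that does not set a version attribute
    find_resources(ast, type: 'package').reject do |resource|
      resource_attribute(resource, 'version')
    end
  end
end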
Hope it helps.
We had the same issue in our cookbook. So we decided to use data bags.
Data bags can be easily changed, for example:
knife data bag from file my_data_bag host1
OR
knife data bag edit my_data_bag host1
Your recipe will be able to read the specified version from the data bag using code like this:
my_bag = data_bag_item('my_data_bag', 'host1')
Chef::Log.info("You have changed the version to: #{my_bag['version']}")
package 'java' do
version my_bag['version']
action :install
end
So in the end you don't need to modify the cookbook or recipe; all you need to do is put the version in the data bag.

Migrating from Cake 1.3 to 2.0 and beyond - migrate existing, or only use for new?

I'm nearing completion of my first CakePHP-driven website and just saw that they're already working on CakePHP 2.0 (not a stable release yet).
My questions:
Is it incredibly time-consuming to move to a new version of CakePHP (when it becomes the "stable" release, that is)? I know they have migration guides, but I've never used a framework before, so I've never had to migrate anything.
Do you migrate your code for existing projects, or leave it as is and use the new stable version for future projects only?
Where can I find what version of CakePHP I currently have installed? I've looked at the LICENCE and VERSION files, but cannot find the installed/current version listed in them.
These seem like simple questions, but I greatly appreciate any thoughts/advice - searching this on Google just brings up how-to-migrate pages, not pros/cons, etc.
I've migrated a few sites from CakePHP 1.2 to 1.3. In my experience, it takes 2-3 hours on sites that have 5-10 controllers and no custom plugins, etc. I find I typically only have to change the syntax on a handful of function calls, and when I figure out which ones, it is just a matter of doing a find / replace across the site. Of course it could be more of an issue going from 1.3 to 2.0, but I don't get the sense that it will be an especially drastic API change.
UPDATE: I'm now in the process of migrating to CakePHP 2.0 beta, and thought I should update this, as I'm finding the updates are more extensive and far-reaching than I had assumed when I wrote this. Migration guide here: https://github.com/cakephp/docs/blob/master/en/appendices/2-0-migration-guide.rst
ANOTHER UPDATE: Since people seem to be finding this useful, I just thought I'd point out that Cake now helpfully provides an upgrade shell that does some of the work for you. Note that although the documentation says it will do "most" of the work, I have found there are still quite a few function calls, etc. that will need to be updated manually (see migration guide).
http://book2.cakephp.org/en/console-and-shells/upgrade-shell.html
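If I remember right, the invocation is along these lines (verify against the shell's own help output):
# run all upgrade subtasks against your app
./Console/cake upgrade all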
As dhofstet said, it will all depend on the size and complexity of your site.
Whether you upgrade at all is usually a judgment call, but sometimes you have to (e.g. Cake 1.2 has some code that will break if your host upgrades to PHP 5.3). You certainly wouldn't have the kind of security issues that an old WordPress, Drupal, etc. install would have. I have seen some noticeable speed increases with Cake upgrades, so depending on the situation it could be worth the trouble just for that (Cake 2.0 finally drops PHP 4 support). Look at the release notes and see if there are things that appeal to you in the new version.
To see your version, open the cake/VERSION.txt file and look at the very last line. It's easy to miss, but it should be just a number, e.g. 1.3.8.
This question is difficult to answer, as it depends on the size and complexity of your project(s). The "big" releases (1.1 -> 1.2, 1.2 -> 1.3, 1.3 -> 2.0) usually break stuff, so you have to budget for some migration work. Migration between "smaller" releases (for example, from 1.3.9 to 1.3.10), on the other hand, is usually easy; often it just means replacing the cake folder. In both cases it is useful to have tests.
I migrate the projects which are actively maintained.
You can find the CakePHP version in cake/config/config.php
I'm migrating an app from 1.3 to 2.0-RC1 right now, and I've had no big trouble.
I had to change the names of folders/files, e.g. app_controller.php » Controller/AppController.php
Follow the migration guide (temporary link): http://book2.cakephp.org/en/appendices/2-0-migration-guide.html
Plugins/components/etc. from various sources won't work (if only because of point 1).
To update the code (which in my case wasn't needed, as the app worked fine), I shell-baked a dummy table and looked at the differences in the generated code. It's a good starting point.
Authentication/authorization configuration changed somewhat, but requires only a few changes.
Trees still work.
ACLs don't, but I'm quite sure that's my fault.
That's all for now. Good luck!

Apply upgrades (application related) to database

Since I've not done this before, I am not sure whether the way I am planning to do it is okay, or whether there is a better way, such as using Windows Installer, InstallShield, or the Windows Installer XML (WiX) toolset. Any help would be great, as I have no clue.
We have a product and we ship a new version every few months. So far we've only been rolling out complete versions, i.e. either version 1.0 or version 1.5, but no upgrades from 1.0 to 1.2 to 1.3 to... you get the picture, right? So any customer that has version 1.0 cannot upgrade to version 1.2 or 1.3 or even the latest; they have to uninstall the old version and install the latest one. This is not right, but it's what we have been able to do until now. We'd like to change it.
My plan is to have an install file with SQL scripts for each upgrade path: check the table in the database that stores the version info and, depending on it, run a different script to upgrade the database (sketched below).
My concern is that this method may not be scalable, once we have more than 5 or 6 different versions.
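To make the idea concrete, here is a minimal sketch of that version-table pattern; the table, column, and step numbers are made up:
-- a table that records which upgrade steps have been applied
CREATE TABLE SchemaVersion (
    Version   INT      NOT NULL PRIMARY KEY,
    AppliedOn DATETIME NOT NULL
);

-- each upgrade script guards itself, so the full set can be run in
-- sequence and already-applied steps are skipped
IF NOT EXISTS (SELECT 1 FROM SchemaVersion WHERE Version = 12)
BEGIN
    -- ... the actual schema changes for step 12 go here ...
    INSERT INTO SchemaVersion (Version, AppliedOn) VALUES (12, GETDATE());
END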
If you could point to any articles or books on this topic, that would help a lot too.
Also, could we use Windows Installer or Install Shield for this?
thanks,
_UB
We've been using DB Ghost for a year or so now to keep our database under source control along with our codebase, and it makes this kind of thing dead easy. It's not just well thought through; they've been using it to roll out their own code for years, so it's dead solid.
Your problem is a pretty common one, and I had to deal with this kind of thing at my last job. There is another tool, aside from the Red Gate tool, that may help you do what you need: DB Ghost. They explicitly address the versioning problem and have a packager as well. I would suggest doing a trial of the DB Ghost product, because they make some interesting claims about multiple-version upgrades. This is taken from their FAQ (http://www.innovartis.co.uk/faqs/faqs.aspx):
Q: Our problem is going to be managing data structure changes during upgrades. Our product line is shrink-wrapped, or downloadable from the website. So when a user downloads an upgrade, they can be upgrading from a very recent version, with few database structure changes, or the upgrade may be from a very old version with a multitude of structural changes. One upgrade needs to manage it all. The user would be offsite, so we can't hold their hand. We have users in Greece, Australia, Malaysia, Norway, etc. How would DB Ghost, if at all, handle updates in remote locations?
A: The DB Ghost Packager Plus product was designed specifically to address this issue, as it can dynamically handle the required updates to a target database seamlessly.
I'm just mentioning this because our company is trying to do something similar and I was doing research on this tool.
Thanks,
Eric
Do you insist on doing it yourself, or could you see yourself committing to and investing in a tool?
I really like the idea of Red Gate's SQL Packager, which will "diff" your two database versions and then create a SQL script, a C# project, or a stand-alone executable to upgrade from version 1 to version 2.
I'm not 100% sure how you'd upgrade from 1.0, 1.1, 1.2, or 1.3 all the way to 2.0 - check out their website and see if they offer something for that scenario!
Otherwise, I guess it'll get quite thorny and messy...
Marc
In the Rails world they use a tool/method called migrations.
Basically, it boils down to creating a small SQL script to upgrade and downgrade each little change to the database.
When you are testing the application, you migrate your database to the version you want, and on deployment the application can check what version it needs and migrate to that version.
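For example, a minimal Rails-style migration (the class and column names are invented):
class AddVersionToWidgets < ActiveRecord::Migration
  # applied when migrating the database up to this version
  def up
    add_column :widgets, :version, :string
  end

  # applied when rolling the database back below this version
  def down
    remove_column :widgets, :version
  end
end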
There are free migration toolkits for most popular languages, though they might be part of some MVC framework.
A nice side effect of migrations is that you have database source code that is easily stored in your source control repository.
