What are the new changes in Apache POI 3.9? Is the memory leak issue fixed in 3.9?

The Apache POI 3.9 release notes say that the memory leak from temp file creation is fixed (bug 53493). But how do I make use of that fix? Are there any changes to the imported packages in 3.9 compared to 3.8? If so, what are they?

The change log for Apache POI is available online; it lists the changes for each release, so you can compare the 3.9 entries against those for 3.8.
Unless otherwise detailed in the release notes included in the download, you should be fine to just drop the new jars in place of the old ones. Make sure you really remove the old ones, though! All sorts of odd things go wrong if you have both old and new POI jars on your classpath.
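To confirm that no stale jar survived the swap, one quick check is to print where a core POI class was actually loaded from. This is only a sketch using standard JDK calls plus one POI interface; the class name PoiJarCheck is made up for illustration:

import org.apache.poi.ss.usermodel.Workbook;

public class PoiJarCheck {
    public static void main(String[] args) {
        // Prints the URL of the jar that supplied the Workbook interface;
        // after the upgrade this should point at the 3.9 jar and nothing older.
        System.out.println(Workbook.class.getProtectionDomain()
                .getCodeSource().getLocation());
    }
}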

Related

How do I find out which files Wicket writes in the "generated" directory?

I'm new to Wicket, but I have to maintain a project with Wicket 6.20 components (running on a Payara application server). Now I have a disk-space problem on my live server. In the directory
payara41/glassfish/domains/domain1/generated/jsp/my-ear-21628-LIVE-CLUSTER/my-web_war/wicket.my-web-filestore
the web file store accumulates a huge amount of data files over time:
./7857
./7857/9907
./7857/9907/6cb5a7b9cb89ad9e0d8b1422af63
./7857/9907/6cb5a7b9cb89ad9e0d8b1422af63/data
./7857/1851
./7857/1851/6bc6a644f4ab91a7674b0e91b4fe
./7857/1851/6bc6a644f4ab91a7674b0e91b4fe/data
...
How can I find the cause of this file generation?
Wicket stores stateful pages on disk by default, creating a file for each HTTP session. On session expiration the file is deleted, so the contents of the temp folder should not grow endlessly.
But there was a bug in the past that left such files there.
I don't remember in which version exactly the bug was fixed.
The best option would be to upgrade to the latest available version. 6.x has not been supported for several years, but you can still upgrade to the latest 6.x release.
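As a stop-gap until the upgrade, Wicket 6 also lets you cap and relocate the disk store from your application's init(). The sketch below uses the Wicket 6 IStoreSettings API; the folder path, the 10 MB limit, and the HomePage class are illustrative assumptions, not taken from the question:

import java.io.File;
import org.apache.wicket.Page;
import org.apache.wicket.protocol.http.WebApplication;
import org.apache.wicket.util.lang.Bytes;

public class MyApplication extends WebApplication {
    @Override
    protected void init() {
        super.init();
        // Cap how much page data a single http session may write to disk
        getStoreSettings().setMaxSizePerSession(Bytes.megabytes(10));
        // Move the store out of the container's "generated" directory
        getStoreSettings().setFileStoreFolder(new File("/var/tmp/wicket-filestore"));
    }

    @Override
    public Class<? extends Page> getHomePage() {
        return HomePage.class; // placeholder for your real home page class
    }
}

This does not fix the leak itself, but it keeps a misbehaving store from filling the partition.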

Which is the most stable and tested version of Apache Solr on HDFS or MapR-FS?

I am trying to set up Solr in my project and want to know which is the most stable and tested version of Solr available. I want to use the MapR filesystem.
Basically, there are two rules for all Solr releases:
They all have a large number of tests to pass before release.
Still, there's some issue with every .0 release, so it's wise to choose the bugfixed .1 or even later releases.
Bonus: This applies to every type of file system you want Solr to run on.
Then there is a trade-off between having the latest features and having a release that has been in field use for a longer time. Of course, version 3.6 was thoroughly tested in the field, because it's been around for many years. But it's so outdated you should not choose it. The same applies to the 4.x branch.
On the other end, there's the 6.x branch, which has many cool new features but is relatively young. So personally, I recommend you go with the latest release of the 5.x branch. While the 5.0 release introduced many new features, the work up to the latest released version, 5.5.4, applied many fixes, and the branch still gets backports for things that are fixed in the 6.x branch.

Jackrabbit locks up with many open ACEs

I am running into an issue where a lot of processes block due to having more than 1000 access control entries active at a time; this is a known issue in Jackrabbit; a work-around has been identified and rolled out into 2.4.1, but CQ 5.5 / CRX 2.3 uses Jackrabbit 2.4.0. Are there any workarounds available under 2.4.0?
I ran into this article, which refers to CRX 2.2: http://helpx.adobe.com/crx/kb/cacheentrycollector-cache-size-is-too-small.html
The resolution says to install CRX hotfix pack 2.2.0.56. This makes CachingEntryCollector configurable via a JVM parameter:
-Dorg.apache.jackrabbit.core.security.authorization.acl.CachingEntryCollector.maxsize=10000
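If you want to verify that the flag actually reached the JVM hosting the repository, a plain-JDK check like the following works (run it inside the same JVM, e.g. from a scripting console, since a separate process will not see the flag). The class name is made up; the property key is the one from the hotfix above:

public class AclCacheFlagCheck {
    public static void main(String[] args) {
        String key = "org.apache.jackrabbit.core.security.authorization.acl."
                + "CachingEntryCollector.maxsize";
        // Prints the configured cache size, or null if the -D flag was not passed
        System.out.println(key + " = " + System.getProperty(key));
    }
}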
I have not been able to locate hotfix 2.2.0.56, but the fix shows up in 2.2.0.68.
This has been addressed before. The question is whether this made it into CRX 2.3. I am still digging through CQ, looking for org.apache.jackrabbit.core, to see if this fix made it to the new version.
Update: sadly, this change did not make it into 2.3.

Apply upgrades (application related) to database

Since I've not done this before, I am not sure whether the way I am planning to do this is okay, or whether there is a better way, like using Windows Installer, InstallShield, or the Windows Installer XML (WiX) toolset. Any help would be great, as I have no clue.
We have a product and we ship a new version every few months. So far we've only been rolling out complete versions, i.e. either version 1.0 or version 1.5, but no upgrades from 1.0 to 1.2 to 1.3 to... you get the picture, right? So any customer that got version 1.0 cannot upgrade to version 1.2 or 1.3 or even the latest; they have to uninstall the old version and install the latest one. This is not right, but that's what we could do until now. We'd like to change it.
My plan is to have an install file with SQL scripts for each upgrade path: check the table in the database that stores the version info and, depending on it, run a different script to upgrade the database.
My concern is that this method may not be scalable once we have more than 5 or 6 different versions.
If you could point to any articles or books on this topic, that would help a lot too.
Also, could we use Windows Installer or Install Shield for this?
thanks,
_UB
We've been using DBGhost for a year or so now to keep our database under source control along with our codebase, and it makes this kind of thing dead easy. It's not just well thought through, but they've been using it to roll out their own code for years, so it's dead solid.
Your problem is a pretty common one, and I've had to deal with this kind of problem at my last job. There is another tool, aside from the Red Gate tool, that may help you do what you need: DB Ghost. They explicitly address the versioning problem and have a packager as well. I would suggest doing a trial of the DB Ghost product, because they have some interesting claims concerning multiple-version upgrades. This was taken from their FAQ (http://www.innovartis.co.uk/faqs/faqs.aspx):
Q: Our problem is going to be managing data structure changes during upgrades. Our product line is shrink-wrapped, or downloadable from the website. So when a user downloads an upgrade, they can be upgrading from a very recent version, with few database structure changes, or the upgrade may be from a very old version with a multitude of structural changes. One upgrade needs to manage it all. The user would be offsite, so we can't hold their hand. We have users in Greece, Australia, Malaysia, Norway, etc. How would DB Ghost, if at all, handle updates in remote locations?
A: The DB Ghost Packager Plus product was designed to specifically address this issue, as it can dynamically handle the required updates to a target database seamlessly.
I'm just mentioning this because our company is trying to do something similar and I was doing research on this tool.
Thanks,
Eric
Do you insist on doing it yourself, or could you see yourself committing and investing in a tool?
I really like the idea of Red Gate's SQL Packager, which will "diff" your two database versions and then create a SQL script, a C# project, or a stand-alone executable to upgrade from version 1 to version 2.
I'm not 100% sure how you'd upgrade from 1.0, 1.1, 1.2, or 1.3 all to 2.0; check out their website and see if they offer something for that scenario!
Otherwise, I guess it'll get quite thorny and messy...
Marc
In the Rails world they use a tool/method called Migrations.
Basically it boils down to creating a small SQL script to upgrade and downgrade each little change to the database.
When you are testing the application you migrate your database to the version you want, and on deployment the application can check what version it needs and migrate to that version.
There are free migration toolkits for most popular languages, though they might be part of some MVC framework.
A nice side effect of migrations is that you have database source code that is easily stored in your source control repository. A sketch of the idea follows.
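To make the mechanism concrete, here is a minimal migration-runner sketch in plain JDBC. Everything specific in it is an assumption for illustration rather than something from the answers above: scripts are named 1.sql, 2.sql, ... in the working directory, each holds a single statement, and the database understands CREATE TABLE IF NOT EXISTS (H2, MySQL, and SQLite do):

import java.nio.file.*;
import java.sql.*;

public class Migrator {
    public static void main(String[] args) throws Exception {
        // args[0] is a JDBC URL, e.g. jdbc:h2:./appdb
        try (Connection con = DriverManager.getConnection(args[0]);
             Statement st = con.createStatement()) {
            // Bookkeeping table that records how far this database has been migrated
            st.execute("CREATE TABLE IF NOT EXISTS schema_version (version INT)");
            // Replay every script the target has not seen yet, so a customer on
            // any old version is upgraded step by step to the current schema
            for (int v = currentVersion(st) + 1;
                 Files.exists(Paths.get(v + ".sql")); v++) {
                st.execute(new String(Files.readAllBytes(Paths.get(v + ".sql"))));
                st.execute("DELETE FROM schema_version");
                st.execute("INSERT INTO schema_version VALUES (" + v + ")");
            }
        }
    }

    // Returns 0 for a fresh database so that all scripts get applied
    static int currentVersion(Statement st) throws SQLException {
        try (ResultSet rs = st.executeQuery("SELECT version FROM schema_version")) {
            return rs.next() ? rs.getInt(1) : 0;
        }
    }
}

Real toolkits add downgrade scripts and transactional safety on top, but the version-table-plus-ordered-scripts core is exactly this.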

DotNetNuke upgrade

I need to upgrade my current version of DNN this week. I am currently using 2.1.1. I don't want to do everything twice, so I have several questions.
Is there an upgrade tool, or some scripts somewhere, that will help me do an upgrade?
Am I better off installing 4.9 or 5.0? It is production.
If I go with 4.9, will I be able to upgrade to 5.0 when it is released?
I personally strongly disagree with ALassek: you can upgrade DotNetNuke; you just have to follow the steps listed, and as long as you do that it isn't a big deal at all. But there are a few key things to keep in mind as you set down the road to do your migration.
DO NOT USE 5.0 in production at this time. 5.0 is only in RC2 stage at this time and using it in production is NOT recommended and an upgrade path from RC2 -> Final might not be possible!
If you plan on trying to upgrade from 2.1.1, go from it to the most current version of 2, then to 3, then to 3.3.7, then to 4.4.1, then to 4.6.2, then to 4.9.0. Typically you are able to make it, but some sites do not.
Some modules, though, will need to be updated to work with DNN 4.x; depending on the number of modules and the vendors, this can be an easy process or can involve finding other providers for the specific functionality at hand.
As for the potential to upgrade to 5.0 from 4.9, yes, that will be 100% supported once 5.0 is in a production ready state.
It's been my experience that DotNetNuke has a tendency to release breaking changes without documenting them (or documenting much of anything, for that matter). Without knowing exactly what you have installed in it, it's impossible to say exactly how screwed you are. But I can guarantee you the transition will likely not be easy, especially if you have a lot of modules installed.
Between 2.1.1 => 4.9, so much has changed that I can't imagine there is any automated way to upgrade. You're better off starting from scratch and seeing what still works. Most likely you will need to find newer versions of any modules you're using, or replacements for those that aren't being kept current.
To be honest, I don't know. But I see that the DNN download page very strongly states that the 5.0 release-candidates are "NOT RECOMMENDED FOR PRODUCTION USE".
There were a huge number of breaking changes between 2.x and 3.x, which will cause pretty much any custom modules you have to need upgrading or replacement. Other than that, Mitchel is the DNN man and I would defer to him.
