How can I test DB2 9.5 -> 9.7 Migration/Upgrade? - database

I am going to migrate a DB2 9.5 database into DB2 9.7. I hope that it will be a smooth process. But I want to be sure that the applications are not affected in a negative manner.
Running the applications and seeing whether everything works would be one way to verify the needed database functionality.
What I want to know is:
Is there any subject or DB2 functionality that I need to pay particular attention to?
What is the best testing approach I can choose?
Is there any functionality that improves performance which I should add to my old DB structure?
Any help will be welcome.

IBM has a very nice upgrade guide that covers those topics (and more!) on the Information Center. I highly recommend you check it out!

You can use a capture replay tool named iReplay to capture your db traffic and test it against DB2 9.7. http://exact-solutions.com/products/iReplay


How do you handle versioning of Spotfire dashboards?

The natural thing about software is that you enhance it, and thus create successive versions of it. How do you handle that with Spotfire?
At least two ways I can think of.
First, in 7.5 and above you can spin up a test node and copy down any dxp you want from live to develop on in test. Once the "upgrade" or changes are complete, you would then back up the live version to disk somewhere... anywhere you do other backups, and deploy the new version to live.
For pre-7.5 the idea is the same but you would have to create a test folder in live with restricted access to test your upgrade on a web player.
Strictly speaking, "what version are you on" doesn't really apply to analytics the way it does to software, in my opinion. There should only be one version of the truth. If you were to run multiple versions, you'd have to manage their updates separately for caching, which is cumbersome in my opinion. Also, since the analytic has a GUID which relates to its information sources, running them in parallel in the same environment will cause duplication.
If this isn't what you were shooting for I'd love for you to elaborate on the original post and clarify anything I assumed. Cheers mate.
EDIT
Regarding the changes in 7.5, see this article from Tibco starting on p. 42, which explains that Spotfire has a new topology with a service-oriented architecture. From 7.5 onward, IIS is no longer used, and to access the web player you don't even go to the "web server" anymore. The application server handles all access and is the central point for authentication and management.

Would Ncache Express solve this scalability issue or would I need to upgrade to premier edition?

I have an asp.net-mvc3 website using NHibernate and SQL Server. I have 2 web servers that are load balanced. This is a read-heavy DB (not so concerned with write performance), but as the queries get more and more complicated (lots of table joins), performance is slowing down considerably.
Based on comments I read, the biggest win would be to put a distributed cache in front. I looked for free options on Windows that support NHibernate and found NCache Express. I am obviously going to do a bunch of testing and playing around, but I wanted to see (before I wasted a lot of time) whether the Express version would limit me at all in terms of a workable solution. I see the version comparisons here and I don't think I see any blockers, but I wanted feedback from anyone who has used NCache Express with NHibernate on whether there are any issues.
Also, alternative product suggestions for solving this problem more efficiently would be great as well.
As mentioned before, you should first optimize your database, but of course you are already doing that.
I also work on a website with 2 servers, and in the process of choosing a cache provider I settled on Memcached. It is very robust and really simple to set up. NCache Express would work fine too, there is no mystery to it, but I recommend going with Memcached because NCache Express has the 2-server limit, so if you ever need to add an additional node you'll have to change anyway.
Also, if your servers run Windows 2008 you should check out Microsoft's AppFabric; it is very good.
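The cache-aside pattern these answers describe (which NCache, Memcached and AppFabric all support) can be sketched in a few lines. This is a minimal illustration in Python, using a plain dict as a stand-in for the distributed cache and sqlite3 as a stand-in for SQL Server; the table, key and function names are all hypothetical:

```python
import sqlite3
import time

# In-process stand-in for a distributed cache such as Memcached; in production
# you would use a client library against a shared cache server instead.
cache = {}
CACHE_TTL = 60  # seconds; tune to how stale a read is allowed to be

def get_product(conn, product_id):
    """Cache-aside read: check the cache first, fall back to the DB on a miss."""
    key = f"product:{product_id}"
    entry = cache.get(key)
    if entry is not None and time.time() - entry[0] < CACHE_TTL:
        return entry[1]  # cache hit
    row = conn.execute(
        "SELECT id, name FROM products WHERE id = ?", (product_id,)
    ).fetchone()
    cache[key] = (time.time(), row)  # populate the cache on a miss
    return row

def update_product(conn, product_id, name):
    """On writes, update the DB and invalidate the cached entry."""
    conn.execute("UPDATE products SET name = ? WHERE id = ?", (name, product_id))
    cache.pop(f"product:{product_id}", None)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO products VALUES (1, 'widget')")

print(get_product(conn, 1))   # miss: reads from the DB, fills the cache
print(get_product(conn, 1))   # hit: served from the cache
update_product(conn, 1, "gadget")
print(get_product(conn, 1))   # miss again after the invalidation
```

The key point is that writes must invalidate (or update) the cached entry, otherwise two load-balanced web servers can serve stale data. A real distributed cache makes that invalidation visible to both nodes, which a per-process dict cannot.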
Use this to evaluate which features you require that NCache offers, e.g. SQL dependencies and such:
http://www.alachisoft.com/ncache/edition-comparison.html
Other than that, I don't think you will be required to upgrade.
Regarding alternatives, I haven't used many, so I can't say anything in this regard :)
PS: A replicated topology is great for read-intensive applications and bad for write-intensive ones.
You could try AppFabric as the NHibernate second-level cache. You should run it on separate servers from your application nodes, though.
Have you tried Microsoft Velocity?
http://msdn.microsoft.com/en-us/magazine/dd861287.aspx

Version Controlling and Release Logs Maintaining Mechanism for Oracle

We have an application developed with Oracle 10g (DS) Forms connected to an Oracle database, in which from time to time we need to make changes to the defined scripts and procedures.
The task assigned to our group is to find a possible version-control and release-log mechanism that can record every change made and every release finalized in the database.
I would like a word of advice from all the experienced people out here: what would be the best possible solution to our problem, ideally a single solution, or multiple ones?
(I am not very Oracle Forms-literate, so apologies if I sounded confusing.)
Have a look at this and this.
The first link is about .Net projects, but gives you concrete examples for how to set up your development processes; the second link is a general approach from Martin Fowler, who is a bit of an authority on software development.
The basics are that you have to script/automate as much of the deployment lifecycle as possible, and version everything.
I don't know much about Oracle Forms, but as far as I know, this approach should work.
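"Version everything" for a database usually comes down to keeping numbered migration scripts in source control and recording in the database itself which ones have been applied. A minimal sketch of that idea in Python, using sqlite3 as a stand-in for Oracle (the migrations and table names here are made up; tools such as Flyway and Liquibase implement the same idea for real):

```python
import sqlite3

# Hypothetical ordered list of (version, SQL) migrations; in a real project
# these would live as numbered .sql files checked into source control.
MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn):
    """Apply every migration newer than the recorded schema version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:
            conn.execute(sql)
            # Record the applied version: this table IS the release log.
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
            print(f"applied migration {version}")
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)   # applies migrations 1 and 2
migrate(conn)   # re-running is a no-op: applied versions are recorded
```

Because every change is a numbered script under version control, and the `schema_version` table records what each database has received, you get both the change history and the release log the question asks for.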

Which DB Server should I use?

I have to develop a new (desktop) app for a small business. This business currently has an Access database with millions of records. The file size is about 1.5 GB. The boss told me that searching on this DB is very slow. The DB consists of a single table with about 20 fields.
I also think the overall DB design isn't great. I'm considering another DB server with a new design to improve both performance and efficiency.
Considering this is a relatively small business, I don't want to spend much for a DB license, so I want to ask you what would you do.
Continue to use Access, maybe improving and optimizing the DB in some way
Buy a DB server license (in this case, which one?)
? (any idea?)
Things like SQL Server Express, MySQL and PostgreSQL are available for free, no license purchase necessary.
For improving search speeds, you will probably also want to look at things like which indexes are defined on the table, what exactly the searches are doing, et cetera.
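To see how much an index matters for searches like these, it helps to compare the query plan before and after adding one. A small sketch using Python's built-in sqlite3 module (the table and data are invented; SQL Server, MySQL and PostgreSQL offer similar EXPLAIN facilities under different names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
# Invented sample data: 1000 orders spread over 100 customers.
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return SQLite's query plan for a statement as one string."""
    return " ".join(r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer = 'cust42'"
print(plan(query))   # full table scan: no index on customer yet

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
print(plan(query))   # now resolved via the index
```

Before the index the plan is a full table scan; afterwards it is an index search, which is the difference between reading every row and reading only the matching ones. On a 1.5 GB table that difference dominates search time.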
Seconding the recommendation for Firebird. We've been using it for about 5 years and never had an issue. Cross-platform, embedded & server deployments... brilliant. Oh, and free as in beer: Mozilla Public License.
Put me down as another recommendation for Firebird. We use it with our commercial Point of Sale product. We have it installed at over 1,000 sites, with databases as large as 40+ Gigabytes. It's fast, stable, simple, easy to deploy, and requires no management.
You could replace the Access database with a SQL Server database that will scale well moving forward. You can use SQL Server Express, which is free and supports databases up to 4 GB, I believe.
SQL Server Express. Free for database size up to 10 GB.
SQL Server Express is a perfect fit for this. http://www.microsoft.com/express/database/
I warmly recommend MySQL. It's free for most uses and easy to install on both Windows and Linux.
There are also lots of great free tools to manage its content: tables, users, indexes, etc.
I would look into whether the big table needs to be broken up into smaller ones (rarely needed, but still) and also what indexes are on it. For database software I would recommend PostgreSQL. It is free, easy to use (and I consider it easy to set up, though others beg to differ), and fast enough for enterprise applications.
You can take a look at Firebird.
Firebird is one of the best databases for desktop applications and will always be free.
Tools also exist to convert a database from Access to Firebird.
I also recommend Firebird.
Its key advantages for your scenario are (off the top of my head):
embedded version. You can ship it with your application - no separate installation kit needed, no .NET dependencies etc.
later on you can scale seamlessly to the full client-server model. No code changes required.
very small footprint
the entire database is stored in a single file. Much easier to deploy compared with other solutions.
you can have your server on any platform you want: Windows, Linux, Mac OS X etc. Of course, you can have your client on the same platforms too, but since you mentioned Access, I suppose you have a Windows application.
no need for server administration. It just works.
I recommend PostgreSQL as well (especially as an alternative to MySQL)
No one ever seems to mention it, but Oracle also does a free-as-in-free-beer version of its database: Oracle Express Edition (aka XE). It is limited to 1 CPU, 1 GB RAM and 4 GB of user data, but that sounds plenty big enough for your application.
As for your database design, just one table sounds more like a spreadsheet than a database application. Probably you have lots of denormalised data. Splitting those out into smaller de-duplicated tables might well speed up certain queries. However if you only have twenty columns there may not be a lot of scope for tuning.
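To make the "splitting denormalised data out" suggestion concrete, here is a small sketch in Python using sqlite3. The flat table and column names are invented, but the pattern is the general one: extract the repeated values into their own table, reference them by key, and answer the same questions with a join:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Denormalized single table: the customer's city is repeated on every row.
conn.execute("""CREATE TABLE orders_flat (
    id INTEGER PRIMARY KEY, customer TEXT, city TEXT, total REAL)""")
conn.executemany(
    "INSERT INTO orders_flat (customer, city, total) VALUES (?, ?, ?)",
    [("alice", "Boston", 10.0), ("alice", "Boston", 20.0), ("bob", "Denver", 5.0)],
)

# Normalized: customer details de-duplicated into their own table.
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT UNIQUE, city TEXT)"
)
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    total REAL)""")
conn.execute("""INSERT INTO customers (name, city)
                SELECT DISTINCT customer, city FROM orders_flat""")
conn.execute("""INSERT INTO orders (customer_id, total)
                SELECT c.id, f.total FROM orders_flat f
                JOIN customers c ON c.name = f.customer""")

# The same question answered against the normalized schema with a join.
rows = conn.execute("""SELECT c.name, SUM(o.total) FROM orders o
                       JOIN customers c ON c.id = o.customer_id
                       GROUP BY c.name ORDER BY c.name""").fetchall()
print(rows)   # [('alice', 30.0), ('bob', 5.0)]
```

With millions of rows, storing each repeated value once and indexing the key columns can shrink the table substantially and speed up exactly the kind of searches the question complains about, though as the answer above notes, with only twenty columns the gains may be modest.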
As for recommendation, the question is, which products do you know? If you have a familiarity with Access then I suggest you try to optimize your existing database. I have worked with Access databases which store several million rows and they performed well enough. After all, there is no guarantee that moving the same design to a different product will automatically make things run x times faster. Another advantage of Access as a tool is that it comes with a built-in front-end tool. If you move from Access you may need to think about re-building your application.
Wow, folks have a lot of platform recommendations for you.
Let me just say this.
If you feel like there are design issues as well as platform issues, why not try the design changes first? These are changes you are likely to make anyway.
If they make no performance difference in Access, you are no worse off, since these changes improve maintainability on any platform.
Then you can try other platforms with the knowledge that you have a solid design, and that you have not wasted any time.
SQLite can also be a candidate.
I would recommend not going with any Express edition. Yes, you're small now, but what if the business takes off and needs much higher loads in the future? Surely that's the way you want it to go. You don't want to have to switch DB layers or run two of them.
I would look at MySQL or Firebird (formerly Borland InterBase); both are very high quality.

Monitoring a database instance

Does anyone have any ideas? And is there any open source software which also seems to perform this kind of functionality?
I'm not sure what you need, but would http://www.nagios.org/ be enough for your purposes?
What database? What platform?
If it's MySQL, there are many monitoring applications around - for example, the MySQL GUI Tools include a Health Monitor widget (on OS X)
Also, phpMyAdmin shows statistics from the MySQL server.
You could also write a simple script that connects to the database, executes some trivial command and check it returns a known value. If it doesn't, send an alert somewhere.
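That "connect, run a trivial command, check the known value" script is only a few lines. A minimal sketch in Python, using sqlite3 so it runs anywhere; for MySQL, Oracle etc. you would swap in the appropriate driver and connection details, and the alert would go to email, Nagios, or a pager rather than stdout:

```python
import sqlite3

def db_is_healthy(connect):
    """Connect, run a trivial query, and check it returns the known value."""
    try:
        conn = connect()
        (value,) = conn.execute("SELECT 1").fetchone()
        conn.close()
        return value == 1
    except Exception:
        return False  # connection refused, timeout, bad credentials, ...

# Stand-in connection factory; replace with your real database driver.
if db_is_healthy(lambda: sqlite3.connect(":memory:")):
    print("database OK")
else:
    print("ALERT: database check failed")  # hook up email/paging here
```

Run it from cron every few minutes and send the alert whenever the check fails; that covers the basic "is it still up" case while the tools above handle the richer monitoring.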
This depends a lot on what kind of database and what you're monitoring for.
Things you might be monitoring for:
Is the database still up?
How heavily loaded is the database?
Deadlocks?
Security events?
Exceptions?
Perhaps you could edit your question to fill in a bit more info?
Have you looked at OpenNMS?
You might want to look at Cacti (http://www.cacti.net/what_is_cacti.php), which is a general-purpose tool for giving graphical representations of any type of data. We use it to see how healthy our web servers and MySQL servers are. But as far as I know it does not have any alert system (in case something critical happens and you need to take immediate action), for which you might want to consider Nagios, as pointed out by someone already. See the MySQL screenshots below to get an idea; they show various graphs of the state of a MySQL server over a period of time:
http://www.xaprb.com/blog/2008/05/25/screenshots-of-improved-mysql-cacti-templates/
If your database is other than MySQL, then google for "your_database_name cacti" to find templates for your database.
I'm not sure if I understand your question, but I use Nagios to monitor just about anything on my server...
What about Nagios?
Here are some recommended scripts for MySQL, MS-SQL, Oracle:
http://www.consol.de/opensource/nagios/
+1 to the suggestion that you give us some more details as to what you want to monitor and your platform.
I use Hyperic and am largely happy.
I also looked at OpenNMS, same with Nagios. I'd suggest downloading the three of them, or doing a little reading about them, then picking one and going for it. Hyperic, in my opinion, was a lot easier to get implemented than Nagios; OpenNMS I didn't try myself. Those three are, as far as I know, the big open source monitoring solutions.
