What do you think is the best tool for testing database performance? I'm looking for a tool that will help me find weak performance spots in my database while my application is using it.
There are at least two non-obvious tools that can help you:
SoapUI has support for JDBC
JMeter has a JDBC sampler (don't miss these wonderful plugins!)
I said these tools are non-obvious because they are typically used for different purposes (SOAP web service functional testing and HTTP load testing, respectively). JMeter seems a bit better suited here, as it is aimed at performance testing, but SoapUI can do this as well.
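If you want a quick sanity check before building a full JMeter test plan, the measurement a JDBC sampler performs boils down to something like this plain-JDBC sketch (the driver URL, credentials, and query are placeholders to replace with your own):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class QueryTimer {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; substitute your own driver URL and credentials.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Statement stmt = conn.createStatement()) {

            long start = System.nanoTime();
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM orders WHERE status = 'OPEN'")) {
                int rows = 0;
                while (rs.next()) {
                    rows++; // consume every row so fetch time is part of the measurement
                }
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                System.out.printf("%d rows in %d ms%n", rows, elapsedMs);
            }
        }
    }
}
```

Note that the loop deliberately consumes the whole result set; otherwise you only measure the time to the first row.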
I'd just use SQL Server Profiler to capture a database-side trace, then sort by duration.
I do stuff like this five times a day.
You could use a source-code-level profiler to profile the application that accesses your database. Profilers can identify the slowest lines of code, and most can filter their results by namespace or naming pattern, so you could filter out all the non-database-access code. You can then look at which database queries are made on those slow lines.
In some database systems you can set up logging to record which queries were run and how long they took (for example, PostgreSQL's log_min_duration_statement setting). Database monitoring applications can show you which queries are running at the moment, so you can identify the slowest or most frequently executed queries very easily. If that isn't an option, you can log the queries your app issues to a text file and then run them manually against the database; the time taken is usually displayed.
A good feature for optimizing database queries is the EXPLAIN command supported by many DBMSs.
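For instance, on PostgreSQL you can fetch a plan with real execution timings straight over JDBC. Here is a minimal sketch (the table and filter are made up, and other systems use different syntax, e.g. plain EXPLAIN on MySQL or SET SHOWPLAN_ALL on SQL Server):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExplainDemo {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Statement stmt = conn.createStatement();
             // EXPLAIN ANALYZE actually executes the query and reports measured timings.
             ResultSet rs = stmt.executeQuery(
                     "EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // each row is one line of the plan
            }
        }
    }
}
```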
If you told us exactly which database you are running, we could help more.
I'm looking for a portable database solution I can use with a website that is designed to handle service outages. I need to retrieve a list of users from SQL Server nightly and upsert their details into a portable database. It's roughly 250,000 users (and growing), and each one has about 25 required fields. Of those fields, I'd say fewer than 5 need to be searchable; the rest just need retrieving.
The idea is that in times of a service outage, we can use a website that's designed to work from the portable database rather than SQL Server. Our long-term goal is to move to the cloud and handle things in an entirely different way, but for the short term this is our aim.
The website is going to be a .NET Core web API, so it will be accessed by multiple users on multiple threads. The website will only ever need read access; it will not be updating these details whatsoever.
To keep the portable database up to date, I'm thinking of having another application that runs nightly to update the data. Our business runs 24 hours a day (albeit more quietly overnight), so there is a chance the updater will be in use while the website is in use. While a service outage would suggest the SQL Server is down, that may not be the case; there are other factors in play that could cause what we would describe as outages. This will be the only piece of software updating the database.
I've tried LiteDB, but I couldn't get it working in a way that met my concurrency requirements. It did seem to do part of the job and was easy to get running; however, I'd often run into locked files due to the nature of a web API. I worked out a solution for that, but then the updater app couldn't access the database file.
Does anyone have any recommendations I can look into?
Given the description of the problem (one table, 250k rows with, I assume, a relatively fast growth rate) and your requirements, I don't think a relational database is what you are looking for.
I think NoSQL databases, or more specifically document-oriented databases, are a better fit for your requirements. There are many choices: Mongo, Cassandra, CouchDB, ... the choice is yours.
Personally, I have some experience with ElasticSearch (https://www.elastic.co/elasticsearch), which is quite easy to learn, portable (it runs on Linux, Windows, containers, etc.), scalable, and fast. I mean really, really fast: you can get results in 10-20 milliseconds, sometimes even less.
The NEST NuGet package acts as a high-level client for working with ElasticSearch (https://www.elastic.co/guide/en/elasticsearch/client/net-api/7.x/nest-getting-started.html).
In our company we have several application stacks running on different types of databases (MySQL, PostgreSQL, MS SQL, Azure SQL, ...). For monitoring purposes we run some scripted queries against the databases of all these application stacks, with Nagios reporting the results back in an email.
Now, since our support team would also like easy access to these queries in order to run or modify them manually, we are considering building an application designed specifically to store, run, and modify queries that can be executed against any of the database types listed above. It would offer both a user-friendly web interface and a REST API with JSON output for our new reporting stack based on SENSU, to be deployed in a few months.
My personal belief is that a tool like this must already be out there, since the use case is so generic. However, googling did not yield any results even closely resembling what I am looking for.
So my question to you is: do you know of such a tool? And if you had to build it yourself, what would your approach be? We're mostly a Java/C++ team, but are open to all options.
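To show the level of genericity we're after, the core of what we imagine building is little more than a JDBC wrapper like the sketch below (the class is a placeholder, not an existing tool); each stored query would carry its own JDBC URL, so the same code path serves MySQL, PostgreSQL, MS SQL, and Azure SQL, provided their drivers are on the classpath:

```java
import java.sql.*;
import java.util.*;

/**
 * Runs a stored query against any JDBC-reachable database and returns the rows
 * as plain maps, ready to be serialized to JSON by the REST layer.
 */
public class GenericQueryRunner {
    public static List<Map<String, Object>> run(String jdbcUrl, String user,
                                                String password, String sql) throws SQLException {
        List<Map<String, Object>> rows = new ArrayList<>();
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            ResultSetMetaData meta = rs.getMetaData();
            while (rs.next()) {
                Map<String, Object> row = new LinkedHashMap<>();
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    row.put(meta.getColumnLabel(i), rs.getObject(i));
                }
                rows.add(row);
            }
        }
        return rows;
    }
}
```

The result maps could then be handed to any JSON serializer (e.g. Jackson) for the REST side.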
Some, or maybe all, of this can be done with an existing API called NAGIRA; look it up on Google. It will definitely give you all the results in JSON format, and I think it also allows you to run checks manually, so you could build a little front end and call this API to achieve what you want.
A little late to reply, but check out http://cloudmonix.com -- it offers the ability to create metrics based on custom SQL queries and supports SQL Azure, SQL Server, MySQL, and Oracle. It also integrates with Nagios (and Zabbix).
This is the first time my team has asked me to do some testing on a database, and I have no clue how to approach it. By testing the database I mean I need to see how fast it can insert records and how much pressure it can handle; essentially load and performance testing for a database. The database that we are about to use is XPRESSmp.
So can anyone help me with what kind of testing we usually do when we need to test a database, and what tools I can look into for this? Most of the articles I have seen are related to Oracle and MySQL, but this is a new database altogether.
One approach I can think of is to write a multithreaded program with X threads that pump data into XMP at very high speed, measuring how much time each thread takes; a rough sketch of what I mean is below.
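Everything in this sketch is invented (the jdbc:xpressmp:// URL, the table, and the columns), since I don't know XPRESSmp's real driver details; it just shows the shape of the load generator:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class InsertLoadTest {
    private static final int THREADS = 16;
    private static final int ROWS_PER_THREAD = 100_000;

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(THREADS);
        long start = System.nanoTime();
        for (int t = 0; t < THREADS; t++) {
            final int threadId = t;
            pool.submit(() -> {
                // Each thread gets its own connection; sharing one would serialize the load.
                // The URL below is hypothetical -- XPRESSmp's real JDBC URL will differ.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:xpressmp://host:port/testdb", "user", "password");
                     PreparedStatement ps = conn.prepareStatement(
                             "INSERT INTO load_test (thread_id, seq, payload) VALUES (?, ?, ?)")) {
                    long t0 = System.nanoTime();
                    for (int i = 0; i < ROWS_PER_THREAD; i++) {
                        ps.setInt(1, threadId);
                        ps.setInt(2, i);
                        ps.setString(3, "some-payload-" + i);
                        ps.addBatch();
                        if (i % 1000 == 0) ps.executeBatch(); // batch to avoid per-row round trips
                    }
                    ps.executeBatch();
                    System.out.printf("thread %d done in %d ms%n",
                            threadId, (System.nanoTime() - t0) / 1_000_000);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        System.out.printf("total: %d ms%n", (System.nanoTime() - start) / 1_000_000);
    }
}
```

The plan would be to re-run it with increasing THREADS values and record when throughput stops scaling or errors start appearing.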
What else can I do to test the database? My team has also asked me to break the database with this testing, but we need to know at what point it broke and what the reason was.
And what important points should I know and take into consideration while testing the database?
P.S. I will be doing this testing on separate LnP machines.
Usually, SysBench is used to test query performance on MySQL. It is not just for MySQL, though. I have only a basic knowledge of it, so rather than asking me, read the documentation:
http://sysbench.sourceforge.net/
You can use these tools:
HammerDB is an open source database load testing and benchmarking tool for Oracle, SQL Server, TimesTen, PostgreSQL, Greenplum, Postgres Plus Advanced Server, MySQL and Redis. HammerDB is automated, multi-threaded and extensible with dynamic scripting support. HammerDB includes complete built-in workloads based on industry standard benchmarks as well as capture and replay for the Oracle database.
To download it or for more information, visit http://hammerora.sourceforge.net/
p-unit
Description:
An open source framework for unit tests and performance benchmarks, initiated by Andrew Zhang under the GPL license. p-unit supports running the same tests single-threaded or multi-threaded, tracks memory and time consumption, and generates results as plain text, images, or PDF files.
http://p-unit.sourceforge.net/
DBMonster
Description:
DBMonster is an application to generate random data for testing SQL database driven applications under heavy load.
http://sourceforge.net/projects/dbmonster/
Replied here, use the k6 SQL extension.
I have an ASP.NET MVC 3 website using NHibernate and SQL Server, with two web servers that are load balanced. The database is read-heavy (I'm not so concerned with write performance), but as the queries get more and more complicated (lots of table joins), performance is slowing down considerably.
Based on comments I have read, the biggest win would be to put a distributed cache in front. I looked for free options on Windows that support NHibernate and found NCache Express. I am obviously going to do a bunch of testing and playing around, but I wanted to check (before I waste a lot of time) whether the Express version would limit me at all in terms of a workable solution. I see the version comparison here and I don't think I see any blockers, but I wanted feedback from anyone who has used NCache Express with NHibernate on whether there are any issues.
Also, suggestions for alternative products that solve this problem more efficiently would be great as well.
As mentioned before, you should first optimize your database, but of course you are already doing that.
I also work on a website with two servers, and in the process of choosing a cache provider I settled on Memcached. It is very robust and really simple to set up. NCache Express would work fine too, there is no mystery to it, but I recommend going with Memcached because NCache Express has a two-server limit, so if you ever need to add an additional node you'll have to switch anyway.
Also, if your servers run Windows 2008 you should check out Microsoft's AppFabric; it is very good.
Use this to evaluate which features you require that NCache offers (e.g. SQL dependencies and the like):
http://www.alachisoft.com/ncache/edition-comparison.html
Other than that, I don't think you will be required to upgrade.
As for alternatives, I haven't used many, so I can't say anything in this regard. :)
PS: Replicated caching is great for read-intensive applications and bad for write-intensive ones.
You could try AppFabric as the NHibernate second-level cache. You should run it on servers separate from your application nodes, though.
Have you tried Microsoft Velocity?
http://msdn.microsoft.com/en-us/magazine/dd861287.aspx
A lot of people on this site state that "premature optimization is the root of all evil." My problem is that I have a lot of complex SQL queries, many of them using user-created functions in PL/pgSQL or PL/Python, and I do not have any performance profiling tool to show me which functions actually make the queries slow. My current method is to exclude the various functions one by one and time the query for each. I know that I could use EXPLAIN ANALYZE as well, but I do not think it will provide me with information about user-created functions.
My current method is quite tedious, especially since there is no query progress reporting in PostgreSQL, so I sometimes have to wait 60 seconds for a query to finish if I choose to run it on too much data.
Therefore, I am wondering whether it would be a good idea to create a tool that automatically profiles SQL queries by modifying the query and measuring the actual processing time of various versions of it. Each version would be a simplified one, perhaps containing just a single user-created function. I know I am not describing this clearly, and I can think of a lot of complicating factors, but I can also see workarounds for many of them. I basically need your gut feeling on whether such a method is feasible.
Another similar idea is to run the query with the server setting work_mem set to various values and show how that impacts performance.
Such a tool could be written using JDBC so that it works across all major databases; in that case it might even be a viable commercial product. A sketch of the work_mem idea is below.
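To make the work_mem idea concrete, here is the kind of harness I have in mind (plain JDBC against PostgreSQL; big_table, big_function, and the connection details are stand-ins for a real slow query):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class WorkMemSweep {
    public static void main(String[] args) throws Exception {
        String[] settings = {"4MB", "16MB", "64MB", "256MB"};
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Statement stmt = conn.createStatement()) {
            for (String workMem : settings) {
                // SET only affects this session, so each run sees a different work_mem.
                stmt.execute("SET work_mem = '" + workMem + "'");
                long start = System.nanoTime();
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT big_function(col) FROM big_table ORDER BY col")) {
                    while (rs.next()) { /* consume all rows */ }
                }
                System.out.printf("work_mem=%s: %d ms%n",
                        workMem, (System.nanoTime() - start) / 1_000_000);
            }
        }
    }
}
```

Extending the same loop to other planner settings, or to simplified variants of the query, would be straightforward.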
Apache JMeter can be used to load test and monitor the performance of SQL queries (using JDBC). It will, however, not modify your SQL.
Actually, I don't think any tool out there could simplify and then re-run your SQL. How would that "simplifying" work?