Is there a way to check the DB2 SQL log for the actual SQL operations executed against it, e.g. how many rows were fetched?

I am using DB2 v10.5, and I am pushing messages into a database I created, using a gateway. Is there a way to check the DB2 SQL logs for the actual SQL operations executed, e.g. how many rows were fetched? While googling, I found logs inside the DB2 server in the DIAGPATH /db2/db2inst1/sqllib/db2dump/, but I don't see any SQL statements in there.
I have been checking the DB2 guides as well, but any ideas to help me with this would be greatly appreciated. Thank you.

Activity event monitoring
Briefly: it acts as a "logger" for executed statements. The information is written to the event monitor's tables for sessions that have this "logging" enabled.

There is also the package cache. It holds aggregate metrics for all executions of a statement that is still in the package cache (entries get evicted from the cache as newer statements arrive); see MON_GET_PKG_CACHE_STMT.
You can also use the Db2 Database Management Console which is
A new browser-based console that helps you administer, monitor, manage and optimize the performance of IBM Db2 for Linux, UNIX and Windows databases.
and which itself collects data via functions such as MON_GET_PKG_CACHE_STMT and activity event monitors.
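As a sketch of the package-cache approach, a query along these lines pulls per-statement row counts from MON_GET_PKG_CACHE_STMT (the column choice and ordering here are just one possibility; adjust to what you need):

```sql
-- Top 10 statements in the Db2 package cache by rows read.
-- Metrics are aggregated over all executions still cached.
SELECT NUM_EXECUTIONS,
       ROWS_READ,
       ROWS_RETURNED,
       STMT_EXEC_TIME,
       SUBSTR(STMT_TEXT, 1, 100) AS STMT_TEXT
FROM TABLE(MON_GET_PKG_CACHE_STMT(NULL, NULL, NULL, -2)) AS T
ORDER BY ROWS_READ DESC
FETCH FIRST 10 ROWS ONLY;
```

Note that this only covers statements still in the cache; for a durable per-execution record you need the activity event monitor described above.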

Related

How to synchronize SQLite database with SQL Server in xamarin?

I am in the process of developing an app for picking up orders for billing. It has to work offline, because users travel to places where there is no signal, so the app will use a local SQLite database. When the device connects to the internet, I want it to synchronize the data bidirectionally with SQL Server, like a MERGE-type replica. It should be noted that the app is built with Xamarin Forms. I wanted to know if there is any information about this.
Thank you for your attention.
It's a manual process: you read from your SQLite db and write to SQL Server and vice versa, and keep track of the data you already synced using flags and date columns. (By manual I mean write code.)
I am currently working on a project similar to yours, using the Dotmim.Sync tool.
Syncing around 10k rows works with no problem, but beyond 100k rows it starts to take time. Maybe that sync tool will help you.
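If you go the manual route, the server-side half of the sync can be expressed as a T-SQL MERGE, using a modified-date column as the conflict tiebreaker. This is only a sketch; the table and column names (dbo.Orders, staging.OrdersFromDevice, OrderId, Total, ModifiedAt) are made up for illustration:

```sql
-- Hypothetical upsert of rows pulled from the device's SQLite copy
-- into a central SQL Server table; last-writer-wins on ModifiedAt.
MERGE dbo.Orders AS target
USING staging.OrdersFromDevice AS source
    ON target.OrderId = source.OrderId
WHEN MATCHED AND source.ModifiedAt > target.ModifiedAt THEN
    UPDATE SET target.Total      = source.Total,
               target.ModifiedAt = source.ModifiedAt
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderId, Total, ModifiedAt)
    VALUES (source.OrderId, source.Total, source.ModifiedAt);
```

You would run the mirror-image logic on the device to pull server changes down into SQLite, and mark synced rows with the flag/date columns mentioned above.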

How can I use Microsoft Stream Insight to monitor Audit Logs in SQL Server?

I heard Microsoft StreamInsight is powerful and capable of handling 5,000 events per second. We have sensitive data in a SQL Server database and have enabled the SQL Audit Log. The function sys.fn_get_audit_file('auditlogfile') shows all the content of the audit log file. I have seen some examples on the internet where StreamInsight only reads historical data in CSV format, or where some simulated event is generated. How can I use StreamInsight to monitor the SQL audit log continuously and store the captured logs in a SQL Server table? I could do it with pure C#.NET or SSIS, but our manager is so impressed with StreamInsight that he wants it implemented badly.
It's been a few years since I've used StreamInsight so I may be a bit rusty.
If you are going to use StreamInsight, you, the developer, are on the hook for creating any needed input and output adapters. For your situation, you will need a mechanism for reading/parsing the audit log into custom events (IObservable). On the output side, you will need to create an output adapter (IObserver) that writes the data to the desired SQL table. I've written a generic SQL output adapter in the past and it is not terribly difficult.
On another note, there is a maximum size for an event in StreamInsight: 16 KB.
I hope that helps.
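Whatever the adapter ends up looking like, the polling side reduces to re-running sys.fn_get_audit_file and filtering out records you have already processed. A rough sketch, where the file path and the timestamp watermark are purely illustrative:

```sql
-- Read audit events newer than the last processed watermark.
-- 'C:\AuditLogs\*.sqlaudit' and the literal date are placeholders;
-- in a real poller the watermark would come from your own state table.
SELECT event_time,
       action_id,
       session_server_principal_name,
       database_name,
       statement
FROM sys.fn_get_audit_file('C:\AuditLogs\*.sqlaudit', DEFAULT, DEFAULT)
WHERE event_time > '2024-01-01T00:00:00';
```

The input adapter would run something like this on a timer, turn each row into a StreamInsight event, and advance the watermark after each batch.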

SQL Server 2005- Investigate what caused tempdb to grow huge

The tempdb of my instance grew huge, eating up all the available disk space and causing applications to go down. I had to restart the instance in an emergency. However, I want to investigate and dig deep into what caused tempdb to grow so large all of a sudden. What were the queries and processes that caused this? Can someone help me pull the required info? I know I won't get much historical data from the SQL Server. I do have the Idera SQL Diagnostic Manager (third-party tool) deployed. Any help on using the tool would be really appreciated.
For postmortem analysis, you can use the tools already installed on your server. For future proactive analysis, you can use SQL traces directly in SQL Profiler, or query the traces using SQL statements:
sys.fn_trace_gettable
sys.trace_events
You can also use an auditing tool that tracks every event that happens on a SQL Server instance and its databases, such as ApexSQL Comply. It also uses SQL traces, configures them automatically, and processes the captured information. It tracks object and data access and changes, failed and successful logins, security changes, etc. ApexSQL Comply loads all captured information into a centralized repository.
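As a sketch of querying the trace functions above, you can read the default trace (its path is whatever sys.traces reports on your server) and translate event IDs into names:

```sql
-- Read the default trace and join event IDs to readable names.
DECLARE @path NVARCHAR(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT e.name AS event_name,
       t.DatabaseName,
       t.TextData,
       t.StartTime,
       t.LoginName
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events AS e
    ON t.EventClass = e.trace_event_id
ORDER BY t.StartTime DESC;
```

Keep in mind the default trace is a rolling file, so it only covers recent activity; for a longer window you need your own trace or an auditing tool.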
There are several reasons that might cause your tempdb to get very big.
A lot of sorting – if this requires more memory than your SQL Server has available, then it will store all temporary results in tempdb
DBCC commands – if you're frequently running commands such as DBCC CHECKDB, this might be the cause; these commands store their results in tempdb
Very large result sets – these also use tempdb to run properly
A lot of heavy transactions, such as bulk inserts
Check out this article for more details on how to troubleshoot this: http://msdn.microsoft.com/en-us/library/ms176029.aspx
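For a live view of the causes listed above, SQL Server 2005's tempdb DMVs show which sessions currently hold the most tempdb pages. A minimal sketch (counters are in 8 KB pages, and this only covers sessions alive right now, not the historical spike):

```sql
-- Sessions ranked by tempdb pages currently allocated (SQL Server 2005+).
-- internal_objects = sorts, hashes, spools; user_objects = temp tables.
SELECT session_id,
       user_objects_alloc_page_count,
       internal_objects_alloc_page_count
FROM sys.dm_db_session_space_usage
ORDER BY internal_objects_alloc_page_count DESC;
```

Capturing this query's output on a schedule is one way to build the history that is otherwise lost after a restart.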
AK2,
We have the Idera DM tool as well. If you know the time frame around which your tempdb was used heavily, you can go to History in the Idera tool to see what query was running at that time and what led the server to hose... On the "Tempdb Space Used Over Time" chart you would usually see a straight line or a smooth graph, but at the time of heavy tempdb use there is a spike and then a straight drop. Referring to this time frame, you can check Sessions > Details to see the exact query and who was running it.
On our server this usually happens when there is a long query doing lots of joins, or when there is an expensive query that dumps into a temp table or table variable.
Hope this will help.
You can use SQL Profiler. Please try the link below
Sql Profiler

Application Hangs on SQL Server - restart required every time

We have an application with a SQL Server 2000 database attached to it. Every couple of days the application hangs, and we have to restart the SQL Server service, after which it works fine. The SQL Server logs show nothing about the problem. Can anyone tell me how to identify this issue? Is it an application problem or a SQL Server problem?
Thanks.
Is it an application problem or a SQL Server problem?
Is it possible to connect to MS SQL Server using Query Analyzer or another instance of your application?
General tips:
Use Activity Monitor to find information about concurrent processes, locks and resource utilization.
Use Sql Server Profiler to trace server and database activity, to capture and save data to a table or file to analyze it later.
You can use Dynamic Management Views (\Database name\Views\System Views folder (in the Management Studio)) to get more detailed information about MS SQL Server internals.
If you have problems with performance (not your case), you can use Performance Monitor and Data Collector Sets to gather performance information.
It is hard to predict the issue, so I suggest you check your application first. Check which operations you are performing against the database, and whether you are taking care of connection pooling; unused open connections can create issues.
Check if you can get any log output from your application. Without any log information we can hardly suggest anything.
Read this
The application may be hanging due to a deadlock.
Check the stored procedures running at that time using Profiler,
check the table manipulation (consider NOLOCK),
and check the buffer size and consider segregating the DB into two or three modules.

Identifying connections and active SQL in SQL Server

How do I see the currently executing SQL statements in SQL Server? I've poked around in SQL Server Management Studio, but I don't see anything "canned".
Profiler will log, and allow you to view, all activity on the server, if that's what you're looking for.
http://msdn.microsoft.com/en-us/library/ms187929.aspx
The active connections can be listed with the built-in stored procedures sp_who and sp_who2. At least one of them (I don't remember which one right now) shows the commands executing at the exact time the procedure is run.
As mentioned in another answer, SQL Server Profiler is a great tool which gives much more detail and logging of activity. The stored procedures just provide a quick overview.
Activity Monitor (in SSMS under Management) is a GUI version of sp_who2. To identify the T-SQL being executed, run DBCC INPUTBUFFER with the SPID, e.g.,
DBCC INPUTBUFFER(54)
SQL Server DMVs are great for finding information like this. For example, the sys.dm_exec_connections DMV will show you a lot about the users connected to your database.
If it is of interest, our Cotega service has the ability to do an analysis of your database and show you this as well as a lot of other things (such as top queries by CPU, IO, etc) which is available to even the free accounts.
I hope that helps.
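The DMV approach above can be taken a step further: joining the currently executing requests to their SQL text gives you essentially what DBCC INPUTBUFFER shows, for all sessions at once. A minimal sketch (SQL Server 2005+):

```sql
-- Currently executing requests with their full SQL text,
-- excluding the session running this query itself.
SELECT r.session_id,
       r.status,
       r.start_time,
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;
```

Unlike Profiler, this is a point-in-time snapshot rather than a log, but it needs no setup and is cheap to run.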