I'm looking to schedule a resource-heavy job on the database.
I'd like to see the historical load of the MSSQL server to determine a time slot.
Is there a way to do that?
There is DBCC MEMORYSTATUS to get a ton of memory information. You can also refer to previous SO questions on measuring utilization.
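For example, this returns a point-in-time snapshot of memory clerks and buffer pool usage (not historical data, so you would need to capture the output periodically yourself):

    -- Point-in-time memory breakdown; capture periodically to build history
    DBCC MEMORYSTATUS;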
A nice easy way to get the information that you want is to use a SQL Server feature called the data collector.
There is a nice step by step tutorial on how to set this up here:
SQL SERVER – Configure Management Data Collection in Quick Steps
You can also create your own data collectors so that you can persist DMV information (remember DMVs only show information since the last time SQL Server was restarted).
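As a minimal sketch of that idea (the history table is something you would create yourself; all names are illustrative), a scheduled job could snapshot a DMV such as sys.dm_os_wait_stats so you can compare load across time slots:

    -- Illustrative history table for periodic DMV snapshots
    CREATE TABLE dbo.WaitStatsHistory
    (
        capture_time        datetime2    NOT NULL,
        wait_type           nvarchar(60) NOT NULL,
        wait_time_ms        bigint       NOT NULL,
        waiting_tasks_count bigint       NOT NULL
    );

    -- Run this from a scheduled job (e.g. hourly) to accumulate history
    INSERT INTO dbo.WaitStatsHistory (capture_time, wait_type, wait_time_ms, waiting_tasks_count)
    SELECT SYSDATETIME(), wait_type, wait_time_ms, waiting_tasks_count
    FROM sys.dm_os_wait_stats;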
Is it possible to check Client Statistics (the "Include Client Statistics" option in SSMS) for an SSIS query (Data Flow)? I wonder if it is possible to capture this information using any DMV.
I use SQL Server 2008 R2.
My experience has been to tune the queries prior to making an SSIS package. You can also tune the SSIS package after creating it (one example: http://sqlmag.com/sql-server-integration-services/designing-ssis-packages-high-performance).
In answer to your question, though, I do not believe that you can capture the client stats within SSIS. A Profiler trace or an Extended Events session might give you some good information, though.
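As a rough sketch of the Extended Events route (the session name is illustrative, and on 2008 R2 you are limited to targets such as the ring buffer):

    -- Capture completed statements; add a predicate on client_app_name
    -- if you want to isolate the SSIS connection specifically
    CREATE EVENT SESSION [ssis_query_stats] ON SERVER
    ADD EVENT sqlserver.sql_statement_completed
        (ACTION (sqlserver.client_app_name, sqlserver.sql_text))
    ADD TARGET package0.ring_buffer;

    ALTER EVENT SESSION [ssis_query_stats] ON SERVER STATE = START;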
MongoDB has something called an oplog which you can tail to read/replay all operations (insert, update, delete, etc) that happen to a database. I am looking to do something similar in SQL Server but have been unable to find anything equivalent. Does anything similar to this exist in SQL Server, and more specifically, SQL Azure?
Depending on what version of SQL Server you're running, I believe Change Data Capture will cover your need. There are built-in functions that allow you to query all the changes that took place on CDC-enabled tables. I've included a link from the Microsoft TechNet library and another from a blog that provides an introduction to CDC.
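As a minimal sketch of the built-in functions involved (the dbo.Orders table and its default dbo_Orders capture instance are purely illustrative):

    -- Enable CDC at the database level, then on a table
    EXEC sys.sp_cdc_enable_db;
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'Orders',
        @role_name     = NULL;

    -- Read every change captured so far for that table
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn(N'dbo_Orders');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');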
Hope this helps!
SQL Server has the Transaction Log facility that does just that - it records all transactions in order to be able to roll back to a certain point.
As stated here, you can use DBCC LOG(databasename, typeofoutput) to access that information.
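For example (DBCC LOG is undocumented, so treat this as a sketch and run it on a test system; the second argument, roughly 0 through 4, controls the level of detail):

    -- Dump log records for the named database at a fairly verbose level
    DBCC LOG('YourDatabase', 3);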
An easy way to accomplish this is SQL Server Profiler. Other ways are given here. You can save the SQL Server Profiler output to a file or a table, then use other means to read it.
I am using two similar SQL databases on two different servers, one local and one online. I want to transfer data at the end of the day from the local server to the online server.
What is the best method to automatically transfer the data while protecting primary keys effectively?
Thank you
Use Red-Gate Data Compare. It's commercial, though.
(I'm just a satisfied customer and in no way related to Red-gate)
Open SQL Server Management Studio and connect to both servers. In the Object Explorer, right-click on a server, choose Tasks, and select Import Data or Export Data; it's a simple wizard from there.
SSMS can also do a schema compare (no need to pay for RedGate Comparison software) if needed.
Have you considered using Replication?
replication tutorial
I believe what you are trying to do is a mirror database, updated daily. If that is the case, using Database Mirroring is a best practice (instead of doing this manually yourself). I suggest:
Read about Mirroring here: Database Mirroring
Follow this guide: Setting Up Database Mirroring
Your local server should be the principal and your online server will be the mirror.
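As a rough sketch of the core step, once the guides above have walked you through endpoints and security (server names and port are illustrative):

    -- On the mirror (online) server, whose copy was restored WITH NORECOVERY:
    ALTER DATABASE YourDb SET PARTNER = 'TCP://local-server.yourdomain.com:5022';

    -- Then on the principal (local) server:
    ALTER DATABASE YourDb SET PARTNER = 'TCP://online-server.yourdomain.com:5022';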
I highly recommend this approach instead of manually scripting the data (see the link to my answer below). It will give you benefits such as automatic failover (when your local server crashes, it will use the remote one); you can read all about the benefits in the links above.
If you eventually want to do it manually for any reason, or you don't have SQL Server Enterprise edition, then read my answer to this question:
sql-server-copying-tables-from-one-database-to-another
My colleague and I are trying to find the best query to sync up an Oracle database with a SQL Server one. There are about 80k+ rows with ~19 columns of data in each row. We have a linked server set up between the two servers, and we have a query that works, but for 80k records the query took 10 hours to copy them over. I can post the query we used, but I would like to have a fresh set of eyes. This is a new process, so we aren't trying to retrofit a solution to existing code. Like I said before, permissions aren't an issue; it is just a matter of getting the data from Point A to Point B in the quickest time. This is to be used on a ColdFusion-supported web site, and the client would like to click a button to sync up the data, but again, this is just the "wish list" of requirements we are working with.
Additional thoughts I'd like to add:
We have tried OPENQUERY and going through the linked server directly, but both took about the same time to complete.
Most columns are varchar(64), a couple are varchar(128), and a couple are varchar(12).
One suggestion someone else made was to write the data to a flat file, FTP the flat file to Point B, and then import it. That is a viable solution, but the more steps we include, the more chances there are of something breaking.
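For reference, the import side of that suggestion would look something like this (path, table, and delimiters are illustrative):

    -- Load the FTP'd flat file into the SQL Server side in one bulk operation
    BULK INSERT dbo.TargetTable
    FROM 'C:\transfer\oracle_export.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);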
Thanks in advance. I look forward to seeing y'alls solutions.
I've had more success with an SSIS package than with linked servers. If you use the Oracle DLLs, it's not too bad.
Have you looked at Oracle Transparent Gateway? Here is the reference manual. It drives SQL Server from Oracle instead of the other way around.
With Zidsoft CompareData you can set up the sync task visually and also schedule it to run via the command line. Disclosure: I am the developer of this product.
I want to do this because I would like to know how many times a particular row has been changed.
Is this possible?
Thanks
Reading the log file takes either a commercial tool or an incredible amount of SQL internals knowledge. You can see some of the raw output by using:
    SELECT * FROM fn_dblog(NULL, NULL);
Actually decoding that output to find the same record being altered, and ensuring each alteration was committed, would be a difficult task, to put it lightly. So it is 'possible' but not very 'probable' that you will be able to do it.
If you need that functionality within a database, then you should be looking at triggers / logic within the code.
Late answer but I hope it will be useful to new readers…
One more function you can try is DBCC LOG, but unfortunately this is an undocumented function, just like fn_dblog.
The problem with the transaction log in SQL Server is that it was never meant to be used for this; it exists only to enable point-in-time recovery and to uphold transactional properties.
There is a commercial log reader from ApexSQL that you can try.
Here are also a couple of similar posts that might point you in the right direction.
Read the log file (*.LDF) in sql server 2008
SQL Server Transaction Log Explorer/Analyzer
You can use this program to do it:
http://www.red-gate.com/products/SQL_Log_Rescue/index.htm
Consider using SQL Server 2008.
There is a feature new to SQL Server 2008 called Change Data Capture that does exactly what you require, that is, tracking data modifications over time.
Looking to inspect the log file in order to track changes is not a wise practice. Doing so will provide you with a limited history, the scope of which also depends on the Recovery Model you use for your database.
You could "roll your own" solution with a small amount of development, by using a log table and populating it with SQL Server triggers. The suitability of such a solution is of course dependent on your business case.
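A minimal sketch of that idea, assuming a hypothetical dbo.Orders table keyed on OrderId (all names are illustrative):

    -- Audit table recording one row per modification
    CREATE TABLE dbo.OrderAudit
    (
        AuditId   int IDENTITY(1,1) PRIMARY KEY,
        OrderId   int NOT NULL,
        Operation char(1) NOT NULL,                        -- 'U' = update
        ChangedAt datetime2 NOT NULL DEFAULT SYSDATETIME()
    );
    GO

    CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.OrderAudit (OrderId, Operation)
        SELECT OrderId, 'U' FROM inserted;
    END;

Counting how many times a particular row has changed is then a simple GROUP BY against dbo.OrderAudit.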
Take a look at the following TechNet article for some interesting reading:
Tracking Changes in Your Enterprise Database