Is there a way to look at the report server execution log from more than two months back? I would like to dispose of a ReportServer database and want to see when each report was last run. The SQL Server execution log only keeps entries for 2 months, but I want to see the log from before that. I am using Microsoft SQL Server 2005.
Thank you very much.
Short answer is no... As you've seen, this is 60 days by default; after that SSRS will delete any older entries from its internal ExecutionLog table.
What you can do is change this default:
Server Properties (Logging Page)
This is not much use for your existing data but at least this might let you collect the information you need from now on.
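If you also want to keep that information for longer than the retention setting allows, one option going forward is to copy the rows into an archive table of your own on a schedule. A minimal sketch, assuming the default ReportServer database name and the ExecutionLog table that SSRS 2005 ships with (adjust names to your environment):

USE ReportServer;   -- default name; adjust if yours differs

-- one-time setup: an empty archive table with the same shape as ExecutionLog
SELECT *
INTO dbo.ExecutionLogArchive
FROM dbo.ExecutionLog
WHERE 1 = 0;

-- run periodically (e.g. from a SQL Agent job) to append rows that have
-- appeared since the last archive run; TimeStart is assumed to exist in
-- the 2005 ExecutionLog schema
INSERT INTO dbo.ExecutionLogArchive
SELECT el.*
FROM dbo.ExecutionLog AS el
WHERE el.TimeStart > ISNULL((SELECT MAX(TimeStart)
                             FROM dbo.ExecutionLogArchive), '19000101');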
I have an odd issue on a Microsoft SQL Server that I manage. Two of the largest tables in a database are not visible in the Object Explorer.
When doing Right Click > Tasks > Shrink > Files on the database, it shows the data file as 99% unused. However, in the following screenshot it is clear that over 500 GB is used:
The Disk Usage by Table report shows these two tables have over 1B records and account for the majority of the space reserved in the data file.
However, when looking in Object Explorer, the tables do not appear:
I know the tables exist because I am able to run SELECT queries against them. The SQL Server version is Microsoft SQL Server 2019 (RTM-GDR) Standard Edition (64-bit). I am also using a sysadmin account, and have confirmed that they are not views.
Any idea what could be causing this?
Cheers,
It looks to me like you have temporal tables in your environment. The history table will show up underneath the base table in SSMS. Here's a screenshot from the WideWorldImporters sample database from MS:
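If you want to confirm it from the catalog views rather than the screenshot, a query along these lines should list the system-versioned tables and their history tables (a sketch against sys.tables, which exposes temporal metadata in SQL Server 2016 and later):

SELECT t.name               AS base_table,
       t.temporal_type_desc,
       h.name               AS history_table
FROM sys.tables AS t
LEFT JOIN sys.tables AS h
       ON h.object_id = t.history_table_id
WHERE t.temporal_type = 2;   -- 2 = SYSTEM_VERSIONED_TEMPORAL_TABLE

History tables themselves carry temporal_type = 1, which is why SSMS nests them under their base table instead of listing them at the top level of Object Explorer.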
Is there a way to automate SQL Server Profiler to record data and then save that data to a table continuously?
The reason: I am supporting a fragile SQL Server application and there is no auditing. I receive a lot of support calls regarding the deletion of records. I want a quick way to see who has changed what data.
You can configure Profiler to save the trace directly to a table, as described here: How To Save a SQL Server Trace Data to a Table
But it's not a good idea, for two reasons: first, Profiler itself will add load to your server; second, writing to a table is the most costly option and you can even lose some events.
If you are on Enterprise Edition, you may be able to use SQL Server database audit instead, which is more lightweight. And here you can find a complete example of setting up a database audit that audits DELETE events.
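In outline, such an audit looks roughly like the following; the file path, database name and schema are placeholders:

USE master;
GO

-- where the audit records are written; the path is a placeholder
CREATE SERVER AUDIT Audit_Deletes
    TO FILE (FILEPATH = N'C:\SqlAudit\');
ALTER SERVER AUDIT Audit_Deletes WITH (STATE = ON);
GO

USE YourDatabase;   -- placeholder database name
GO

-- capture DELETE statements against everything in the dbo schema
CREATE DATABASE AUDIT SPECIFICATION Audit_Deletes_Spec
    FOR SERVER AUDIT Audit_Deletes
    ADD (DELETE ON SCHEMA::dbo BY public)
    WITH (STATE = ON);
GO

-- read back what has been captured
SELECT event_time, server_principal_name, database_name,
       object_name, statement
FROM sys.fn_get_audit_file(N'C:\SqlAudit\*', DEFAULT, DEFAULT);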
Here are a few articles for your reference.
Save trace results to a database table
https://learn.microsoft.com/en-us/sql/tools/sql-server-profiler/save-trace-results-to-a-table-sql-server-profiler
Save Trace Results to a Table
https://technet.microsoft.com/en-us/library/ms191276(v=sql.110).aspx
9 Steps to an Automated Trace
http://sqlmag.com/t-sql/9-steps-automated-trace
Alternatively, you may try this automated solution (https://www.lepide.com/lepideauditor/sql-server-auditing.html) to accomplish this task.
I'm quite a newbie to the topic of databases. It may be that this topic already exists, but I don't even know how to search for it.
Querying and so on seems to be no problem.
And I also know how to write a trigger on an INSERT/DELETE/UPDATE statement.
I would like to know whether it is possible to execute some code that generates a data row once every day at a specific time, without an external program.
Thanks, Chris
PS: I'm from Germany, so don't get angry about my expressions, please^^
If you are not on Express Edition, you can create a SQL Server Agent job to do that (see also the T-SQL sketch after these steps).
Open SSMS
Drill Down your server instance
Drill Down SQL Server Agent
Right click Jobs folder
Choose New Job
Fill in the details in the General Tab
Go to the Steps tab
Click New
Enter a name, choose which db to run against and enter your SQL statement.
Go to Schedules tab
Add a schedule to run once each day at the time you wish.
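If you prefer to script the job instead of clicking through the dialogs, the same thing can be done with the msdb job procedures. A minimal sketch; the job name, target database, INSERT statement and schedule time are all placeholders:

USE msdb;

-- create the job container
EXEC dbo.sp_add_job
     @job_name = N'DailyRowGenerator';

-- the T-SQL step that generates the row; database and statement are placeholders
EXEC dbo.sp_add_jobstep
     @job_name      = N'DailyRowGenerator',
     @step_name     = N'Insert daily row',
     @subsystem     = N'TSQL',
     @database_name = N'YourDatabase',
     @command       = N'INSERT INTO dbo.DailyLog (LogDate) VALUES (GETDATE());';

-- run once every day at 06:00
EXEC dbo.sp_add_jobschedule
     @job_name          = N'DailyRowGenerator',
     @name              = N'Every day at 06:00',
     @freq_type         = 4,        -- daily
     @freq_interval     = 1,        -- every 1 day
     @active_start_time = 060000;   -- HHMMSS

-- attach the job to the local server so the Agent will run it
EXEC dbo.sp_add_jobserver
     @job_name    = N'DailyRowGenerator',
     @server_name = N'(local)';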
In SQL Server, I have a data source server which has 22 databases, and in each database there are 5 tables. Every database has the same tables, holding different data separated by year.
I want to collect all this data into one single database. The destination database will have only 5 tables, while the source has 22 x 5 = 110 tables. I'm using the import-export wizard to transfer the data, but it takes too long and is really annoying; for 110 tables I would have to run the import-export wizard again and again.
Is there a simple way or tool to do this? There is no linked server between the servers.
Here is a simple figure that explains my situation.
Posting my comment as an answer:
Back up each database, restore it to server 2, and then insert the records across using a simple INSERT .. SELECT statement; then drop the restored database and restore the next. You should be able to script this to work unattended; even the creation of all the backups could be scripted so that a single run covers all databases.
Your other option (if space permits) is to create a new database on server 1 (potentially a restore of the database from server 2 if it already has data in it), then import all the records into this new database, then back up this database and restore it on server 2.
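Once everything is on the same instance, the "insert the records across" step itself can be generated instead of being run 110 times by hand. A rough sketch using dynamic SQL, assuming the source databases follow a predictable naming pattern and the tables have identical column layouts; every name below is a placeholder:

DECLARE @sql nvarchar(max);
SET @sql = N'';

-- build one INSERT ... SELECT per source database / table combination
SELECT @sql = @sql + N'
INSERT INTO ConsolidatedDB.dbo.' + QUOTENAME(t.name) + N'
SELECT * FROM ' + QUOTENAME(d.name) + N'.dbo.' + QUOTENAME(t.name) + N';'
FROM sys.databases AS d
CROSS JOIN (VALUES (N'Table1'), (N'Table2'), (N'Table3'),
                   (N'Table4'), (N'Table5')) AS t(name)    -- the 5 table names
WHERE d.name LIKE N'SourceDB%';                            -- assumed naming pattern

EXEC sys.sp_executesql @sql;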
It depends on several things, like how often you want the data to be moved and whether it will be changed in the destination DBs.
There are 4 methods of high availability in SQL Server, and one of them will surely fit your scenario (probably merge replication):
http://msdn.microsoft.com/en-us/library/ms190202.aspx
I use around 3 SQL Server 2008 databases. Every time I need to query a database, I need to log in to that DB and then query. Is there a way to retain the last opened database in SQL Server 2008?
As an analogy, think of how Firefox can display the last opened websites.
SQL Server does retain the last opened database; it actually never closes them (auto_close and user instances notwithstanding). Do you mean Management Studio, by any chance? You can add a USE statement to your saved query. You can use sqlcmd extensions in your query to simply run it in one shot against all servers/DBs. Or you can use something like SSMS Tools Pack, a free add-on that enhances SSMS with things like query history.
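For instance, with SQLCMD mode switched on in SSMS (Query > SQLCMD Mode), a single saved script can pick its own server and database each time you run it; the server, database and table names here are placeholders:

:CONNECT MyServer1
USE SalesDb;
SELECT TOP (10) * FROM dbo.Orders;
GO

:CONNECT MyServer2
USE InventoryDb;
SELECT COUNT(*) AS StockRows FROM dbo.Stock;
GO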
You would need to have the same login for all 3 databases and have auto_close set to OFF.