I have created two triggers: one on sysssispackages and another on sysjobs. These two triggers simply send a mail whenever anyone modifies a maintenance plan or a job (excluding job schedules).
My problem is that whenever someone modifies a maintenance plan, I get alerts from both triggers (one for sysssispackages and 2-4 alerts for the sysjobs table).
Is there any way I can restrict these alerts to the following cases?
If the change is made from a maintenance plan, I get an alert only for the sysssispackages table, i.e. one alert only.
If the change is made on a job, I get an alert only for the sysjobs table.
One workaround is to alter the trigger on the sysjobs table so that it checks the mail queue and sends mail accordingly.
But, I want to know if there is some other way for this.
Thanks.
P.S. My basic requirement is tracking who made changes to maintenance plans and jobs.
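For reference, a minimal sketch of the kind of audit trigger described above, assuming Database Mail is configured; the profile name, recipient, and trigger name are placeholders, not values from the original setup:

```sql
-- Sketch: alert when a row in msdb.dbo.sysssispackages changes.
-- Profile name and recipient are assumptions.
CREATE TRIGGER trg_sysssispackages_audit
ON msdb.dbo.sysssispackages
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @who sysname = SUSER_SNAME(), @msg nvarchar(500);

    -- Note: if several rows are updated at once, this picks one of them.
    SELECT TOP (1) @msg = N'Maintenance plan ' + name + N' modified by ' + @who
    FROM inserted;

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'AlertProfile',      -- assumption
        @recipients   = N'dba@example.com',   -- assumption
        @subject      = N'Maintenance plan changed',
        @body         = @msg;
END;
```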
Right now I have a challenge that I'm not sure how to solve in the best manner. I searched the internet but did not find a suitable solution.
I want to copy data from a view on a linked server (read only; the view has several sub-views and tables) to a table located in my database. This view contains live data, basically showing the last 100 occurring events. However, what I need is the whole history of the data shown by the view. As I have only read permission on that specific view, and the admin of the linked server is not able (or willing) to grant further rights or change the view, I was wondering what the best way is to copy the view data and basically build up the whole history in my database.
I was thinking about a stored procedure run on a schedule, but as the last 100 events can change very quickly, this does not seem like an appropriate approach. Another option would be to build a trigger that takes new rows and copies them to my table. However, I'm not sure if this is possible.
I appreciate any hints, tips or impulses.
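A trigger on a remote view you can only read is not an option, so a frequently scheduled Agent job that appends unseen rows is the realistic route. A minimal sketch, where the linked server, view, table, and key column names are all assumptions:

```sql
-- Sketch: append rows from a linked-server view into a local history table.
-- LINKEDSRV, RemoteDb.dbo.vLastEvents, dbo.EventHistory, and EventId are assumptions.
INSERT INTO dbo.EventHistory (EventId, EventTime, Payload)
SELECT v.EventId, v.EventTime, v.Payload
FROM LINKEDSRV.RemoteDb.dbo.vLastEvents AS v
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.EventHistory AS h
    WHERE h.EventId = v.EventId   -- skip rows already captured
);
```

Run often enough that fewer than 100 new events can occur between runs; if the view exposes no stable key, gaps can't be ruled out.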
I have added some custom steps to some of my SSRS jobs; however, they are being removed after a couple of days every time. I know that if you add custom steps and then change the report or the subscription in the UI, it overwrites the jobs. However, these jobs are not being touched, yet the steps are still disappearing.
Has anyone else come across this problem?
Although I often customize jobs that run subscriptions - no, I haven't come across that problem. I did not customize the jobs that were created automatically; instead, I created my own.
For a subscription to fire successfully, the name of the job isn't important. Instead, the SQL code that executes the subscription (to be more specific: the SubscriptionID) is what you need to know. Since you were able to find the jobs that execute specific subscriptions, I think you won't have a problem finding this information either. The code you need looks like this:
exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='<YourSubscriptionID>'
You can use this code in your own jobs as well, and it will work as long as the subscription is there.
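If you need to look up a SubscriptionID directly, a query along these lines against the ReportServer catalog works (the report path here is an assumption):

```sql
-- Sketch: find the SubscriptionID(s) for a given report.
SELECT s.SubscriptionID, c.Path, s.Description
FROM ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
WHERE c.Path = N'/Sales/DailySales';  -- report path is an assumption
```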
The name of the SSRS-generated job is the ID of the report schedule you define for the subscription. SSRS needs this name to know which job to change when you change the subscription's schedule. As you found out, SSRS resets these jobs, and not only when a subscription is changed. But you don't need any of these jobs once you create your own jobs that run the subscription.
To get rid of the auto-generated job with the cryptic name, don't just delete it yourself (SSRS would re-create it); instead, change the schedule of the subscription to a shared schedule that will never run. For this, I created a shared schedule (under Site Settings) named "Disabled Schedule" and disabled it.
I'm looking at the features of SymmetricDS (latest version symmetric-server-3.7.24), and in their forum I read that it is actually possible to sync from a view.
So I tried to sync from a view, but when I run the program I get an error because SymmetricDS cannot create a trigger on the view.
I also read that if I use a materialized view, then the trigger should be created.
The view is on SQL Server 2008. I dropped the view, created a new one WITH SCHEMABINDING, and added a clustered index on it. I also checked that all the options are set as required by the MSDN guide for creating an indexed view.
I ran SymmetricDS again, but it still fails to create the trigger on the view.
Can anyone help me?
If what I ask is actually not possible, is it possible to create an extension that does not use triggers to synchronize the tables? I don't care whether the two databases are synced in real time; I can use a scheduled job, and that will be just fine.
Thank you for your help and suggestions.
BTW: I can also change tools if you know a better one :)
I don't think that's a supported use case. However, you can try setting the sync_on_insert/update/delete fields to 0 on the sym_trigger row. Then you would be able to sync the view with an initial load or by scheduling reloads (see the "symadmin reload-table" command).
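A sketch of that configuration row, assuming the SymmetricDS 3.x sym_trigger layout; the trigger ID, view name, and channel are placeholders:

```sql
-- Sketch: register the view without change-capture triggers,
-- so it can only be moved via initial load / scheduled reloads.
INSERT INTO sym_trigger
    (trigger_id, source_table_name, channel_id,
     sync_on_insert, sync_on_update, sync_on_delete,
     create_time, last_update_time)
VALUES
    ('my_view_reload', 'my_view', 'default',   -- names are assumptions
     0, 0, 0,
     current_timestamp, current_timestamp);
```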
I need to add serial numbers to most of the entities in my application because I'm going to be running a Lucene search index side by side.
Rather than having to run an ongoing polling process, or manually run my indexer from my application, I'm thinking of the following:
Add a Created column with a default value of GETUTCDATE().
Add a Modified column with a default value of GETUTCDATE().
Add an ON UPDATE trigger to the table that updates Modified to GETUTCDATE() (can this happen as the UPDATE is executed, i.e. it adds SET [Modified] = GETUTCDATE() to the SQL query instead of updating the row individually afterwards?).
The ON UPDATE trigger will call my Lucene indexer to update its index (this would presumably have to be an xp_cmdshell call, but is there a way of sending a message to the process instead of starting a new one? I heard I could use named pipes, but how do you use named pipes from within a sproc or trigger? Searching for "SQL Server named pipes" gives me irrelevant results, of course).
Does this sound okay, and how can I solve the small sub-problems?
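The first two steps above can be sketched as follows; the table and constraint names are assumptions:

```sql
-- Sketch: audit columns with UTC defaults on an existing table.
-- dbo.Items and the constraint names are assumptions.
ALTER TABLE dbo.Items ADD
    Created  datetime2 NOT NULL
        CONSTRAINT DF_Items_Created  DEFAULT (GETUTCDATE()),
    Modified datetime2 NOT NULL
        CONSTRAINT DF_Items_Modified DEFAULT (GETUTCDATE());
```

Existing rows get the current UTC time as their initial value for both columns.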
As I understand it, you have to introduce two columns to your existing tables, have them maintained (at least one of them) at runtime, and have them used by an external component.
Your first three points are nothing unusual. There are two types of triggers in SQL Server according to when the trigger is processed: the INSTEAD OF trigger (actually processed before the insert happens) and the AFTER trigger. However, inside an INSTEAD OF trigger you have to provide the logic that actually inserts the data into the table, along with any other custom processing you require. I usually avoid that if it's not really necessary.
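An AFTER UPDATE trigger that keeps the Modified column current might look like this sketch; the table, key, and trigger names are assumptions:

```sql
-- Sketch: touch Modified on every UPDATE.
-- dbo.Items and its Id key are assumptions.
CREATE TRIGGER trg_Items_Modified
ON dbo.Items
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE i
    SET Modified = GETUTCDATE()
    FROM dbo.Items AS i
    JOIN inserted  AS ins ON ins.Id = i.Id;  -- only rows touched by the UPDATE
END;
```

This runs as a second statement after the UPDATE, inside the same transaction; SQL Server doesn't rewrite the original UPDATE statement itself.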
Now about your fourth point - it's tricky, and there are several approaches to solving this in SQL Server, but all of them are at least a bit ugly. Basically, you have to either execute an external process or send a message to it. I don't have any experience with the Lucene indexer, but I guess one of these methods (execute or send a message) would apply.
So, you can do one of the following to access the external component directly or indirectly, meaning to access the Lucene indexer directly or via some proxy module:
Implement an unsafe CLR trigger; basically you execute .NET code inside the trigger and thus get access to the whole .NET Framework (be careful with that - it's not entirely true).
Implement an unsafe CLR procedure; the only difference from a CLR trigger is that you wouldn't call it immediately after the INSERT - a database job that runs periodically will do fine.
Use xp_cmdshell; you already know about this one, but you can combine this approach with the job-wrapping technique from the previous point.
Call a web service; this technique is usually marked as experimental, AND you have to implement the service yourself (unless the Lucene indexer installs some web service of its own).
There surely are other methods I can't think of right now...
I would personally go with the third option (job + xp_cmdshell) because of its simplicity, but that's just because I lack any knowledge of how the Lucene indexer works.
EDIT (another option):
Use Query Notifications; SQL Server Service Broker allows an external application to connect and monitor interesting changes. You even have several options for how to do that (basically synchronous or asynchronous); the only precondition is that your Service Broker is up, running and available to your application. This is a more sophisticated method of informing an external component that something has changed.
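The precondition can be checked and satisfied with a couple of statements; the database name here is an assumption:

```sql
-- Sketch: enable Service Broker so Query Notifications can be used.
-- MyAppDb is an assumption.
ALTER DATABASE MyAppDb SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;

-- Verify it is on:
SELECT name, is_broker_enabled
FROM sys.databases
WHERE name = N'MyAppDb';
```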
I need info about updating grid when database is changed.
What I have:
A client application in PHP which uses Ext JS to provide an interface to the db.
What seems to be a problem:
A problem occurs when, for example, two users are using the app. The first user changes and commits data to the db, but the second user isn't able to see the changed data.
Other example is when admin changes data directly in db.
What I'm trying to do:
I need to be able to load changed data into the grid in a timely manner. Reloading the store is an issue because the data returned is huge (it takes a few seconds to load), and since the app would be used by hundreds of users at a time, it would create quite an overhead.
Can I load only changed rows (can be checked by timestamp)?
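With the timestamp approach, the server-side query behind each poll can be a sketch like this; the table and column names are assumptions, and @lastSync would come from the client's previous poll:

```sql
-- Sketch: return only rows changed since the client's last poll.
-- dbo.GridData and its columns are assumptions.
DECLARE @lastSync datetime2 = '2024-01-01T00:00:00';  -- placeholder value

SELECT Id, Name, Modified
FROM dbo.GridData
WHERE Modified > @lastSync
ORDER BY Modified;
```

The client then merges these rows into the store instead of reloading it, and records the newest Modified value as the next @lastSync.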
There are 2 parts I see in this question:
Q) Can you update only the changed rows?
A) Sure. Find the record (using the store's find function), change the appropriate values (using record.set(key, value)), and commit the changes.
Q) Can you synchronize multiple users?
A) You have two options. Option #1: lock the object so only one user can edit at a time. This is the simplest solution. Option #2: add a synchronization mechanism (Comet would probably be the best option) and push the changes as they occur.