Please help me. I have a table with my booking status, check-in date, and check-out date. The flow is: when a booking is not paid after 2 days, the status should automatically update to 'NOT PAID'. I'm a newbie at creating triggers. Can anybody please help me? Thanks in advance.
There is no way to do the above with a SQL trigger.
You have several options:
Creating a SQL Job, as @GuillermoZooby says,
or
You could make the Status column a computed column that checks the booking's age.
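For the computed-column option, here is a minimal sketch. It assumes hypothetical columns BookingDate and IsPaid on a dbo.Booking table, and that Status is not already a regular column (you would drop or rename the existing one first). Because the expression uses GETDATE(), the column cannot be persisted or indexed:

ALTER TABLE dbo.Booking
ADD Status AS (
    CASE
        WHEN IsPaid = 0 AND DATEDIFF(DAY, BookingDate, GETDATE()) >= 2
            THEN 'NOT PAID'
        ELSE 'PENDING'   -- placeholder for whatever other states you need
    END
);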
The triggers you are talking about are executed after an INSERT, UPDATE, or DELETE statement on a table or view, so they are not what you are looking for.
You should create a stored procedure that checks the date (I imagine you have a datetime column in the mentioned table) and, when a booking is not paid after 2 days, updates the status field to 'NOT PAID'.
Then create a SQL Agent job (the SQL Server Agent service must be enabled and running) and configure its schedule to run the previously created stored procedure at whatever interval you want, for example once a day.
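A minimal sketch of such a procedure, assuming hypothetical columns BookingDate, IsPaid, and Status on a dbo.Booking table:

CREATE PROCEDURE dbo.usp_MarkUnpaidBookings
AS
BEGIN
    SET NOCOUNT ON;

    -- Flag bookings that are still unpaid 2 or more days after the booking date.
    UPDATE dbo.Booking
    SET    Status = 'NOT PAID'
    WHERE  IsPaid = 0
      AND  BookingDate <= DATEADD(DAY, -2, GETDATE())
      AND  Status <> 'NOT PAID';
END

Schedule it with a daily SQL Agent job as described above.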
I am just wondering: can I find out whether somebody ran a query and updated a row in a specific table on some date?
I tried this:
SELECT id, name
FROM sys.sysobjects
WHERE NAME = ''
SELECT TOP 1 *
FROM ::fn_dblog(NULL,NULL)
WHERE [Lock Information] LIKE '%TheOutoput%'
It does not show me anything. Any suggestions?
No, row-level history/change stamps are not built into SQL Server. You need to add that in the table design. If you want an automatic update-date column, it would typically be set by a trigger on the table.
There is, however, a way if you really need to find out what happened in a forensic scenario, but it is only available if you have the right backups. What you can do then is use the database transaction log to find out when the modification was made. Note that this is not something an application can or should do at runtime.
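Going back to the design-time option, here is a minimal sketch of an update-date column maintained by a trigger. The table dbo.MyTable, its key column Id, and the trigger name are assumptions for illustration:

ALTER TABLE dbo.MyTable ADD LastModified DATETIME2 NULL;
GO

CREATE TRIGGER dbo.trg_MyTable_LastModified
ON dbo.MyTable
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Stamp every inserted/updated row; with RECURSIVE_TRIGGERS OFF (the default)
    -- this UPDATE does not re-fire the trigger.
    UPDATE t
    SET    LastModified = SYSDATETIME()
    FROM   dbo.MyTable AS t
    INNER JOIN inserted AS i ON i.Id = t.Id;
END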
My question is a mess, but what I mean is basically this: let's say I have a table called EXECUTELATER (in my SQL Server database for my app) with a column called DATE. Every time a row in that table has DATE <= the current system time, I want to run some queries: do something with the data, put some data in another table, and delete that row from EXECUTELATER.
One idea I have is to create another simple app, logged in as another user, that queries every millisecond and checks whether there is a matching DATE, but that sounds absurd.
Is there a way I can do this within the web app? Can anyone suggest a smart way to make this work? I don't mind whether I have to work with the full SQL date format or store the date as a long, do it in the web app, create a new application for it, use something in SQL Server itself, or whatever.
Again, sorry for the badly structured question; I'm not sure how to put this together. Thank you for checking!
You can use a trigger like this:
CREATE TRIGGER [dbo].[TrgX]
ON [dbo].[EXECUTELATER]
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE p
    SET    [columnx] = x
    FROM   [dbo].[tblx] AS p
    INNER JOIN inserted AS i
            ON i.[Relatedcoulumn] = p.[Relatedcoulumn]
    WHERE  i.[Date] >= GETDATE();
END
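If rows also need to be processed when their time arrives without any new insert or update happening, the trigger alone will not see them; the scheduled stored procedure pattern from the first question above applies instead. A minimal sketch, assuming placeholder columns (Id, [Date], Payload) and a destination table dbo.ProcessedRows, to be run from a SQL Agent job on a short schedule:

CREATE PROCEDURE dbo.usp_ProcessExecuteLater
AS
BEGIN
    SET NOCOUNT ON;

    -- Atomically move the rows that are due: delete them from EXECUTELATER and
    -- capture the deleted values into the destination table in one statement.
    DELETE e
    OUTPUT deleted.Id, deleted.[Date], deleted.Payload
        INTO dbo.ProcessedRows (Id, [Date], Payload)
    FROM dbo.EXECUTELATER AS e
    WHERE e.[Date] <= GETDATE();
END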
Is there a function in the Netezza database to select only recently updated records?
I tried:
select *
from myTable
where last_modified_timestamp > current_timestamp - '5 minute'::interval
but it doesn't work: ERROR: Attribute 'LAST_MODIFIED_TIMESTAMP' not found.
Thanks
Netezza does not track row-level modification metadata automatically. You would have to implement that yourself, using a timestamp column and ETL logic to update it properly.
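A minimal sketch of that approach, reusing the table and column names from the question; it assumes your Netezza version supports ALTER TABLE ... ADD COLUMN and that every load/update path sets the column explicitly:

-- One-time DDL: add the audit column yourself.
ALTER TABLE myTable ADD COLUMN last_modified_timestamp TIMESTAMP;

-- Every ETL/UPDATE statement must maintain it (column name and predicate below are placeholders):
UPDATE myTable
SET some_column = 'new value',
    last_modified_timestamp = CURRENT_TIMESTAMP
WHERE id = 42;

-- After that, the filter from the question works:
SELECT *
FROM myTable
WHERE last_modified_timestamp > CURRENT_TIMESTAMP - '5 minute'::interval;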
I've created an SSIS package that pulls data from various sources and aggregates it as needed for the business. The goal of this processing is to create a single table, for example "Data_Tableau". This table is the datasource for connected Tableau dashboards.
The Tableau dashboards need to be available during the processing, so I don't truncate "Data_Tableau" and re-populate with the SSIS package. Instead, the SSIS package steps create "Data_Stage". Then the final step of the package is a drop/rename, wherein I drop "Data_Tableau" and sp_rename "Data_Stage" to "Data_Tableau".
USE dbname
DROP TABLE Data_Tableau
EXEC sp_rename Data_Stage, Data_Tableau
Before this final step, I expect max(buydate) from "Data_Stage" to be greater than max(buydate) from "Data_Tableau", since "Data_Stage" would have additional records since the last time the process ran.
However, sometimes there are issues with upstream data and I end up with max(buydate) from "Data_Stage" = max(buydate) from "Data_Tableau". In such cases, I would not want the final drop/rename process to run. Instead, I want the job to fail and I'll send an alert to the appropriate upstream data team when I get the failure notification.
That's the long-winded background. My question is: how do I check the dates and cause a failure within the SSIS package? I'm using VS 2012.
I was thinking of creating a precedence constraint before the final drop/rename step, but I haven't created variables or expressions before and am unsure how to achieve this.
I was also considering creating a 2-row table as follows:
SELECT MAX(buydate) 'MaxDate', 'Tableau' 'FieldType' FROM dbname.dbo.Data_Tableau
UNION ALL
SELECT MAX(buydate) 'MaxDate', 'Stage' 'FieldType' FROM dbname.dbo.Data_Stage
and then using a query against that table as some sort of constraint, but not sure if that makes any sense and/or is better than the option of creating variables/expressions.
Goal: If MAX(buydate) from "Data_Stage" > MAX(buydate) from "Data_Tableau", then I'd want the drop/rename step to run, otherwise it should fail and "Data_Tableau" will contain the same data as before the package ran.
Suggestions? Step-by-step instructions would be greatly appreciated.
I would do this by putting this:
Then the final step of the package is a drop/rename, wherein I drop
"Data_Tableau" and sp_rename "Data_Stage" to "Data_Tableau".
into a stored procedure that gets called by the SSIS package.
Then it's simply a matter of using an IF block before that part of the code:
--pseudocode
IF (SELECT MAX(buydate) FROM Data_Stage) > (SELECT MAX(buydate) FROM Data_Tableau)
BEGIN
DROP TABLE Data_Tableau
EXEC sp_rename Data_Stage, Data_Tableau
END
ELSE
--do something else (or nothing at all)
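Putting that together, here is a sketch of the stored procedure the SSIS Execute SQL Task could call (the procedure name is a placeholder). Raising an error with severity 16 in the ELSE branch is one way to satisfy the goal in the question: it makes the calling task, and therefore the package/job, fail:

CREATE PROCEDURE dbo.usp_SwapDataTableau
AS
BEGIN
    SET NOCOUNT ON;

    IF (SELECT MAX(buydate) FROM dbo.Data_Stage) >
       (SELECT MAX(buydate) FROM dbo.Data_Tableau)
    BEGIN
        DROP TABLE dbo.Data_Tableau;
        EXEC sp_rename 'dbo.Data_Stage', 'Data_Tableau';
    END
    ELSE
    BEGIN
        -- Fail the calling SSIS step so the job's failure notification fires.
        RAISERROR('Data_Stage is not newer than Data_Tableau; drop/rename skipped.', 16, 1);
    END
END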
Recently I dropped all automatically created statistics (named _WA_Sys_%) and ran the following T-SQL command to create statistics for all columns of the database:
EXEC sp_createstats @indexonly = 'NO', @fullscan = 'FULLSCAN', @norecompute = 'NO'
Everything worked fine until I had to drop a column in a table; then error 5074 occurred, indicating that the statistics had to be dropped before dropping the column.
Is there a way to get SQL Server to silently drop user-created statistics when a column is dropped?
I don't think you can make SQL Server do it silently; this is a common problem for people using user-created statistics. There is a way to drop the relevant statistics with a custom query - would that work for your needs?
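One way to build that custom query (a sketch; dbo.YourTable and YourColumn are placeholders) is to generate DROP STATISTICS statements from the catalog views for the column you are about to drop:

SELECT 'DROP STATISTICS '
       + QUOTENAME(OBJECT_SCHEMA_NAME(s.[object_id])) + '.'
       + QUOTENAME(OBJECT_NAME(s.[object_id])) + '.'
       + QUOTENAME(s.name) AS drop_stmt
FROM sys.stats AS s
INNER JOIN sys.stats_columns AS sc
        ON sc.[object_id] = s.[object_id] AND sc.stats_id = s.stats_id
INNER JOIN sys.columns AS c
        ON c.[object_id] = sc.[object_id] AND c.column_id = sc.column_id
WHERE s.[object_id] = OBJECT_ID('dbo.YourTable')
  AND c.name = 'YourColumn'
  AND s.user_created = 1;   -- only explicitly created statistics block the column drop

Run the generated statements (for example via dynamic SQL) just before the ALTER TABLE ... DROP COLUMN.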