DB Link Query Error while writing Trigger Function - database

I am using PostgreSQL 9.6 and trying to write a trigger. The issue is that the following query fails, and the terminal output isn't verbose enough for me to tell where I am going wrong.
The Function goes like this:
CREATE OR REPLACE FUNCTION insert_dblink_func()
RETURNS trigger AS
$BODY$
BEGIN
perform dblink_exec("insert into Some_Table select * from dblink( 'host=SOMEPRODSERVER dbname=DBNAME user=USERNAME password=PASSWORD', 'select id,name,helpline,email,created_at,updated_at') as sourceTable (id integer,name character varying,helpline character varying,email character varying,created_at timestamp with time zone,updated_at timestamp with time zone) on conflict(id) do update set id=excluded.id,name=excluded.name,helpline=excluded.helpline,email=excluded.email,created_at=excluded.created_at,updated_at=excluded.updated_at");
perform dblink_disconnect();
RETURN NULL;
END;
$BODY$
LANGUAGE plpgsql;
This is the output I get when I execute the function:
LINE 5: perform dblink_exec("insert into TABLENAME sel...
^
CREATE FUNCTION
So I am confused as to what the error could be.
TIA

I'm also confused, because I've tried splitting your dblink query into small steps, and I see some issues in the SELECT syntax (plus the use of double quotes instead of single quotes):
INSERT INTO
some_table
SELECT *
FROM DBLINK('host=SOMEPRODSERVER dbname=DBNAME user=USERNAME password=PASSWORD',
'SELECT id,name,helpline,email,created_at,updated_at') --<< there is no table here: the FROM clause is missing
AS sourcetable (id INTEGER,
name CHARACTER VARYING,
helpline CHARACTER VARYING,
email CHARACTER VARYING,
created_at TIMESTAMP WITH TIME ZONE,
updated_at TIMESTAMP WITH TIME ZONE)
ON CONFLICT(id) DO UPDATE SET
id=excluded.id,
name=excluded.name,
helpline=excluded.helpline,
email=excluded.email,
created_at=excluded.created_at,
updated_at=excluded.updated_at
Also, from the description it is not clear which action this trigger should fire on. Can you split your code into small steps and provide some more info? Here is a dbfiddle example that you can improve with your data.
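For reference, here is one way the corrected function could look once the quoting and the missing FROM clause are addressed. This is only a sketch: the remote table name some_remote_table is a placeholder (the original query never names it), and since the INSERT itself runs locally, it can be executed directly from PL/pgSQL rather than being passed to dblink_exec:
CREATE OR REPLACE FUNCTION insert_dblink_func()
RETURNS trigger AS
$BODY$
BEGIN
    -- The INSERT runs on the local database; only the inner SELECT is sent to the remote server via dblink().
    INSERT INTO Some_Table
    SELECT *
    FROM dblink('host=SOMEPRODSERVER dbname=DBNAME user=USERNAME password=PASSWORD',
                'SELECT id, name, helpline, email, created_at, updated_at FROM some_remote_table')
         AS sourcetable (id integer,
                         name character varying,
                         helpline character varying,
                         email character varying,
                         created_at timestamp with time zone,
                         updated_at timestamp with time zone)
    ON CONFLICT (id) DO UPDATE SET
        name       = excluded.name,
        helpline   = excluded.helpline,
        email      = excluded.email,
        created_at = excluded.created_at,
        updated_at = excluded.updated_at;
    RETURN NULL;
END;
$BODY$
LANGUAGE plpgsql;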

Related

Conversion failed when converting date and/or time from character string during update only

I have been searching for a resolution for a long time now and just can't seem to formulate a search query that brings back an answer, so as a last resort I have posted here.
I have a SQL Server table with a varchar column that stores the date and time in this format:
"1/1/2013 11:38:31 PM Some other text"
I need the date and time portion of this data to be stored in another column with the datetime datatype, so I created a new column called DateTimeLog of type datetime.
I then used LEFT to chop off the extra text and CONVERT to change the value to datetime, and got the result I would expect.
select CONVERT(DATETIME,(rtrim(left(olddate, 21)))) from mytable
results:
"2013-01-01 23:38:31.000"
So far, so good; this is what I would expect. My troubles begin when I attempt to update my new datetime column with the results of this CONVERT statement:
update mytable
SET DateTimeLog = CONVERT(DATETIME,(rtrim(left(olddate, 21)))) from mytable
I get the infamous "Conversion failed when converting date and/or time from character string" error message.
Conversion failed when converting date and/or time from character string.
I have also attempted to use cast
update mytable
SET DateTimeLog = (cast(CONVERT(DATETIME,(rtrim(left(olddate, 21)))) as datetime)) from mytable
The error persists. As best I can tell, the convert is working correctly because I can see the result set from a select, but getting that result into a new column has eluded me thus far.
thanks,
Your string isn't going to consistently be 21 characters long. Your sample data shows a single character month and a single character date. What if it's, say, 12/13/2018?
That said, you need a more robust way to isolate that timestamp. I used PATINDEX to capture the position of the last colon in the time component, with a couple of wildcard patterns in there to account for the digits and the AM/PM part. Then I added 6 to it to get to the end of the string of interest.
This seems to work:
DECLARE @t TABLE (olddate VARCHAR(100));

INSERT @t (olddate)
VALUES
    ('12/13/2018 11:38:31 PM Some other text')
   ,('1/1/2018 11:38:31 PM Some other text');

SELECT CAST(LEFT(olddate, PATINDEX('%:[0-9][0-9] [AP]M%', olddate) + 6) AS DATETIME)
FROM @t;
Results:
+-------------------------+
| 2018-12-13 23:38:31.000 |
| 2018-01-01 23:38:31.000 |
+-------------------------+
Rextester: https://rextester.com/BBPO51381 (although the date format's a little funky on the output).
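Applied to the UPDATE from the question, the same expression would look something like this (assuming the mytable, olddate, and DateTimeLog names from the question; the WHERE clause skips rows that don't contain a recognizable timestamp):
UPDATE mytable
SET DateTimeLog = CAST(LEFT(olddate, PATINDEX('%:[0-9][0-9] [AP]M%', olddate) + 6) AS DATETIME)
WHERE PATINDEX('%:[0-9][0-9] [AP]M%', olddate) > 0;  -- only rows where the pattern was found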

Snowsql two digit century start date cast issue

I want the result 2000-02-05 from the query below in SnowSQL.
alter session set TWO_DIGIT_CENTURY_START=2000;
select cast ('05-FEB-00' as date) from dual;
But I am getting 0001-02-05.
I am using an existing script to load dates into Snowflake that works for Oracle. I know I can get the expected result using the to_date function, but I don't want to do that; I would have to change the script in many places, which is tedious.
I want a solution using the cast function. Does anyone know what is happening here?
You first need to specify the non-default date format for your input data. In the case of the example above:
alter session set date_input_format = 'DD-MON-YY';
Then
alter session set TWO_DIGIT_CENTURY_START=2000;
select cast ('05-FEB-00' as date) from dual;
yields:
2000-02-05
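If you're not sure what the session is currently using, you can inspect the relevant parameters before changing them, for example:
show parameters like 'DATE_INPUT_FORMAT' in session;
show parameters like 'TWO_DIGIT_CENTURY_START' in session;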

My stored procedure in SQL Server 2008 fails when executed from a job, but works great when run directly

I have this stored procedure, which looks like this:
ALTER PROCEDURE [dbo].[P_ALIMENTATION_VolumeVentes]
AS
BEGIN
SELECT EDS, NomEDS,AgenceEDS, AgenceNomEDS,SecteurEDS, SecteurNomEDS,DirectionEDS,DirectionNomEDS,
(SELECT count(*) FROM CPListeVentesNonConformes WHERE CPListeVentesNonConformes.EDS = CPRT.EDS AND TypePart='PP') AS ListeVenteNC_PP,
(SELECT count(*) FROM CEListeVentesNonConformes WHERE CEListeVentesNonConformes.EDS = CPRT.EDS AND TypePart='ET') AS ListeVenteNC_ET,
(SELECT count(*) FROM CPListeVentesNonConformes WHERE CPListeVentesNonConformes.EDS = CPRT.EDS AND TypePart='PP' OR TypePart='ET') AS ListeVenteNC_PPET,
(SELECT count(*) FROM ListeVentes WHERE IDES01 = CPRT.EDS AND TypePart='PP') AS ListeVentes
INTO VolumeVentes
FROM CPR CPRT
GROUP BY EDS, NomEDS,AgenceEDS, AgenceNomEDS,SecteurEDS, SecteurNomEDS,DirectionEDS,DirectionNomEDS,TypePart
END
When I execute it from the command line with EXEC [dbo].[P_ALIMENTATION_VolumeVentes], it works great and my table is created.
But when I use SQL Server Agent to schedule it as a job, I get a nice surprise in the form of this error:
Executed as user: ZRES\CSAPREP10IUCRADM. The conversion of a varchar
data type to a datetime data type resulted in an out-of-range value.
[SQLSTATE 22007] (Error 242) The statement has been terminated.
[SQLSTATE 01000] (Error 3621). The step failed.
The table that will be created, VolumeVentes, has no fields of type datetime.
Here is the structure of the table VolumeVentes.
I don't understand exactly where the error is.
Thank you for your help.
Actually, it should never work, since you already have the VolumeVentes table.
SELECT INTO creates a new table with the columns described in the select statement:
https://msdn.microsoft.com/en-us/library/ms188029(v=sql.120).aspx
You should modify this code to use INSERT ... SELECT instead.
But you will probably still get the same conversion error, because (I guess) the column order in the select statement does not match the column order in the existing table. That is why you should always explicitly define the column list in the INSERT INTO clause, so the final script will look like:
INSERT INTO VolumeVentes(EDS, NomEDS,AgenceEDS, AgenceNomEDS,SecteurEDS, ...)
SELECT EDS, NomEDS,AgenceEDS, AgenceNomEDS,SecteurEDS
FROM ...
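Keeping the original SELECT intact and just adding the explicit column list, the modified procedure might look roughly like this (a sketch only; the four count columns are named after the aliases in the original procedure, and VolumeVentes is assumed to already exist with matching column types):
ALTER PROCEDURE [dbo].[P_ALIMENTATION_VolumeVentes]
AS
BEGIN
    INSERT INTO VolumeVentes
        (EDS, NomEDS, AgenceEDS, AgenceNomEDS, SecteurEDS, SecteurNomEDS, DirectionEDS, DirectionNomEDS,
         ListeVenteNC_PP, ListeVenteNC_ET, ListeVenteNC_PPET, ListeVentes)
    SELECT EDS, NomEDS, AgenceEDS, AgenceNomEDS, SecteurEDS, SecteurNomEDS, DirectionEDS, DirectionNomEDS,
        (SELECT count(*) FROM CPListeVentesNonConformes WHERE CPListeVentesNonConformes.EDS = CPRT.EDS AND TypePart='PP') AS ListeVenteNC_PP,
        (SELECT count(*) FROM CEListeVentesNonConformes WHERE CEListeVentesNonConformes.EDS = CPRT.EDS AND TypePart='ET') AS ListeVenteNC_ET,
        (SELECT count(*) FROM CPListeVentesNonConformes WHERE CPListeVentesNonConformes.EDS = CPRT.EDS AND TypePart='PP' OR TypePart='ET') AS ListeVenteNC_PPET,
        (SELECT count(*) FROM ListeVentes WHERE IDES01 = CPRT.EDS AND TypePart='PP') AS ListeVentes
    FROM CPR CPRT
    GROUP BY EDS, NomEDS, AgenceEDS, AgenceNomEDS, SecteurEDS, SecteurNomEDS, DirectionEDS, DirectionNomEDS, TypePart
END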
Use an INSERT INTO ... statement instead of a SELECT INTO FROM ... statement.

insert record when datetime = system time

I am new to SQL, so bear with me. I want to insert a record into another table when a datetime column equals the system time. I know this creates an infinite loop, but I am not sure of any other way to handle what I am trying to solve.
INSERT INTO dbo.Que
(Name, Time)
SELECT ProspectName, ProspectDate
FROM myProspects where ProspectDate = CURRENT_TIMESTAMP
I need to place a phone call at a certain time. I need to insert the record into another table to trigger the call when that time arrives. If you have a better way of handling this, please tell me.
Thanks
If you are using SQL Server, you can use the getdate() function.
You would need to insert the creation date into a column of the call row on insert, also using getdate(), or System.DateTime.Now if you are using .NET.
For example, to select all calls that happened in the last x amount of time in SQL Server:
select * from calls where createdate > getdate() - .1
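Applied to the tables in the question, a job that runs on a schedule could pick up prospects whose time has arrived within a small window rather than requiring an exact match. A sketch, assuming the dbo.Que and myProspects tables from the question and a job that runs about once a minute:
INSERT INTO dbo.Que (Name, Time)
SELECT ProspectName, ProspectDate
FROM myProspects
WHERE ProspectDate <= GETDATE()                       -- the call time has arrived
  AND ProspectDate > DATEADD(MINUTE, -1, GETDATE());  -- and it arrived within the last minute, so rows are not queued twice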

SQL Server: Find out what row caused the TSQL to fail (SSIS)

SQL Server 2005 Question:
I'm working on a data conversion project where I'm taking 80k+ rows and moving them from one table to another. When I run the TSQL, it bombs with various errors having to do with converting types, or whatever. Is there a way to find out what row caused the error?
=====================
UPDATE:
I'm performing an INSERT INTO TABLE1 (...) SELECT ... FROM TABLE2
Table2 is just a bunch of varchar fields where TABLE1 has the right types.
This script will be put into a sproc and executed from an SSIS package. The SSIS package first imports 5 large flat files into TABLE2.
Here is a sample error message: "The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value."
There are many date fields. In TABLE2, there are data values like '02/05/1075' for Birthdate. I want to examine each row that is causing the error, so I can report to the department responsible for the bad data so they can correct it.
This is not the way to do it with SSIS. You should have the data flow from your source, to your destination, with whatever transformations you need in the middle. You'll be able to get error details, and in fact, error rows by using the error output of the destination.
I often send the error output of a destination to another destination - a text file, or a table set up to permit everything, including data that would not have been valid in the real destination.
Actually, if you do this the standard way in SSIS, then data type mismatches should be detected at design time.
What I do is split the rowset in half with a WHERE clause:
INSERT MyTable(id, datecol) SELECT id, datecol FROM OtherTable WHERE ID BETWEEN 0 AND 40000
and then keep changing the values on the between part of the where clause. I've done this by hand many times, but it occurs to me that you could automate the splitting with a little .Net code in a loop, trapping exceptions and then narrowing it down to just the row throwing the exception, little by little.
I assume you are doing the update with INSERT INTO ...
Instead, try doing it with a cursor; use exception handling to catch the error and log everything you need: the row number it failed on, etc.
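A minimal sketch of that cursor-plus-TRY/CATCH approach, using the TABLE1/TABLE2 names from the question; the id and datecol column names are only illustrative, and the real column list would need to be filled in:
DECLARE @id INT, @rawdate VARCHAR(50), @rownum INT;
SET @rownum = 0;

DECLARE row_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT id, datecol FROM TABLE2;

OPEN row_cursor;
FETCH NEXT FROM row_cursor INTO @id, @rawdate;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @rownum = @rownum + 1;
    BEGIN TRY
        -- convert and insert one row at a time, so a bad value only loses that row
        INSERT INTO TABLE1 (id, datecol)
        VALUES (@id, CAST(@rawdate AS DATETIME));
    END TRY
    BEGIN CATCH
        -- log the offending row instead of aborting the whole load
        PRINT 'Row ' + CAST(@rownum AS VARCHAR(10)) + ' failed: ' + ERROR_MESSAGE();
    END CATCH;
    FETCH NEXT FROM row_cursor INTO @id, @rawdate;
END;

CLOSE row_cursor;
DEALLOCATE row_cursor;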
Not exactly a cursor, but just as effective: I had over 4 million rows to examine, with multiple conversion failures. Here is what I used. It resulted in two temp tables: one with all my values and assigned row numbers, and one that simply contained a list of the rows in the first temp table that failed to convert.
select row_number() over (order by TimeID) as rownum, timeID into #TestingTable from MyTableWithBadData
set nocount on
declare @row as int
declare @last as int
set @row = 0
select @last = count(*) from #TestingTable
declare @timeid as decimal(24,0)
create table #fails (rownum int)
while @row <= @last
begin
    begin try
        select @timeid = cast(timeID as decimal(24,0)) from #TestingTable where rownum = @row
    end try
    begin catch
        print cast(@row as varchar(25)) + ' : failed'
        insert into #fails(rownum) values(@row)
    end catch
    set @row = @row + 1
end
If you are looping, add prints in the loop.
If you are using set-based operations, add a restrictive WHERE condition and run it. Keep running it (each time making it more and more restrictive) until you can find the offending row in the data. If you can run it for blocks of N rows, then just select out those rows and look at them.
Add CASE statements to catch the problems (converting the bad value to NULL or whatever) and put a value in a new FlagColumn telling you the type of problem:
CASE WHEN ISNUMERIC(x)!=1 then NULL ELSE x END as x
,CASE WHEN ISNUMERIC(x)!=1 then 'not numeric' else NULL END AS FlagColumn
Then select out the newly converted data where FlagColumn IS NOT NULL.
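Wrapped into a full query, that flagging approach might look like this (a sketch using the TABLE2 name from the question and the x column from the snippet above):
SELECT *
FROM (
    SELECT CASE WHEN ISNUMERIC(x) != 1 THEN NULL          ELSE x    END AS x,
           CASE WHEN ISNUMERIC(x) != 1 THEN 'not numeric' ELSE NULL END AS FlagColumn
    FROM TABLE2
) AS checked
WHERE FlagColumn IS NOT NULL;   -- only the rows that had a problem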
You could try using SELECT statements with the isnumeric() or isdate() functions on the various columns of the source data.
EDIT
There are many date fields. In TABLE2, there are data values like '02/05/1075' for Birthdate. I want to examine each row that is causing the error, so I can report to the department responsible for the bad data so they can correct it.
Use this to return all bad date rows:
SELECT * FROM YourTable WHERE ISDATE(YourDateColumn)!=1
If you are working with cursors, yes, and it is trivial. If you are not working with cursors, I don't think so, because set-based SQL statements are atomic transactions: the whole statement succeeds or fails as one.
John Sauders has the right idea; there are better ways to do this kind of processing using SSIS. However, learning SSIS and redoing your package to completely change the process may not be an option at this time, so I offer this advice. You appear to be having trouble with the dates being incorrect, so first run a query to identify the records which are bad and insert them into an exceptions table. Then do your insert with only the records that are left. Something like:
insert exceptiontable (field1, field2)
select field1, field2 from table2 where isdate(field2) = 0
insert table1 (field1, field2)
select field1, field2 from table2 where isdate(field2) = 1
Then of course you can send the contents of the exception table to the people who provide the bad data.
