I am using the Snowflake Node.js driver.
I have this table:
CREATE TABLE IF NOT EXISTS SCHEMA.TABLE (
id VARCHAR(36) NOT NULL,
received_at DATETIME
);
I insert data into the table like this, using squel:
const statement = squel.insert()
.into('SCHEMA.TABLE')
.set('ID', uuid())
.set('RECEIVED_AT', moment().format('YYYY-DD-M H:mm:ss.SSS'))
.toString()
const queryResult = connection.execute({
sqlText: statement,
complete: (err, stmt, rows) => {
if (err) {
console.log(err.message)
} else {
console.log(`${body.event} event is sent to Snowflake`)
}
}
});
Last week this was working fine; I could insert without any error. But when I tried it again, I kept getting:
Timestamp '2021-14-9 10:16:53.511' is not recognized
I tried doing this in the Snowflake worksheet:
INSERT INTO SCHEMA.TABLE (ID, RECEIVED_AT) VALUES ('fe12cef0-9d3e-489b-8658-00b302501123', CURRENT_TIMESTAMP())
and it works.
The question is: how can I apply this statement in Node.js?
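For context, the error appears to come from the moment format string: 'YYYY-DD-M' puts the day where Snowflake expects the month, so September 14 is rendered as 2021-14-9 and month 14 is rejected. A literal in the usual year-month-day order is accepted by the default timestamp parsing; a minimal sketch (the UUID and timestamp values are only illustrative, taken from the post above):
INSERT INTO SCHEMA.TABLE (ID, RECEIVED_AT)
VALUES ('fe12cef0-9d3e-489b-8658-00b302501123', '2021-09-14 10:16:53.511');
So formatting with 'YYYY-MM-DD HH:mm:ss.SSS' on the Node.js side, or letting the column default fill it in as in the UPDATE below, should both work.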
UPDATE
I just set a default value for the datetime column using this:
CREATE TABLE IF NOT EXISTS SCHEMA.TABLE (
id VARCHAR(36) NOT NULL,
received_at DATETIME DEFAULT CURRENT_TIMESTAMP()
);
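Note that the DEFAULT only fires when the column is left out of the INSERT entirely (or the DEFAULT keyword is used), so the squel statement should no longer set RECEIVED_AT. The generated SQL would then look roughly like this sketch:
INSERT INTO SCHEMA.TABLE (ID) VALUES ('fe12cef0-9d3e-489b-8658-00b302501123');
-- RECEIVED_AT is omitted, so DEFAULT CURRENT_TIMESTAMP() fills it in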
Related
I have a SQL Server table with an identity column, set to autoincrement.
Coded in Perl, the insert in the code below works fine, but in the while loop the fetchrow_array() call returns no data in the @row array.
How do I best retrieve the identity value for use in subsequent SQL statements?
my $term_sql = "INSERT INTO reminder_term(site, name, description, localization) OUTPUT \@\@IDENTITY VALUES(?,?,?,?)";
my $t_stmt = $dbh->prepare($term_sql);
...
$t_stmt->execute($site, $name, $description, $localizer);
while (@row = $t_stmt->fetchrow_array()) {
$referential_key = $row[0];
}
Avoid using the @@IDENTITY value since it's unreliable in the presence of triggers.
Given the example table schema...
create table [dbo].[reminder_term] (
[id] int not null identity(1,1),
[site] nvarchar(10),
[name] nvarchar(10),
[description] nvarchar(10),
[localization] nvarchar(10)
);
If you rework your OUTPUT clause slightly, you can capture the new id value by way of the special inserted row source...
INSERT INTO reminder_term(site, name, description, localization)
OUTPUT inserted.id
VALUES(?,?,?,?)
If I have an nvarchar column in SQL Server 2012, how do I default the value to a GUID when the value is NULL or an empty string?
To be more clear:
Not only should it work when people do:
INSERT INTO table(second_column) VALUES('test')
It should also work if people do:
INSERT INTO table(column_with_default, second_column) VALUES('', 'test')
I tried to set the default to
LEFT(NEWID(), 36)
but it doesn't work...
When I insert into the table with NULL for that column, it's just NULL (same with '').
EDIT 2: According to your comments, I have changed my answer.
CREATE TABLE dbo.Test(ID NVARCHAR(MAX) NOT NULL DEFAULT NEWID()
,TestData NVARCHAR(MAX));
GO
CREATE TRIGGER dbo.TestTrigger ON dbo.Test
INSTEAD OF INSERT
AS
BEGIN
INSERT INTO dbo.Test(ID,TestData)
SELECT CASE WHEN i.ID IS NULL OR i.ID='' THEN CAST(NEWID() AS NVARCHAR(MAX)) ELSE i.ID END,i.TestData
FROM inserted AS i;
END
GO
INSERT INTO dbo.Test(ID,TestData) VALUES(NULL,'test with NULL');
INSERT INTO dbo.Test(ID,TestData) VALUES('','test with EMPTY');
INSERT INTO dbo.Test(ID,TestData) VALUES('abc','test with real data');
SELECT * FROM Test;
GO
DROP TABLE dbo.Test;
The result
69604546-47BD-4E0D-9924-FAD39054BFFD test with NULL
D9F38DB0-1155-464B-89C7-43C2CE8381BF test with EMPTY
abc test with real data
You can use NEWID() in a Default Value constraint. If the field is not large enough then use LEFT(NEWID(),<field-length>) as the constraint expression.
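As a concrete sketch of that suggestion (the table, column, and constraint names here are hypothetical), a default constraint covers inserts that simply omit the column, while something like the INSTEAD OF trigger above is still needed for explicit NULL or '' values; NEWID() is cast to a string type first, since LEFT() may not accept a uniqueidentifier directly:
ALTER TABLE dbo.SomeTable
ADD CONSTRAINT DF_SomeTable_ID
DEFAULT (LEFT(CAST(NEWID() AS NVARCHAR(36)), 36)) FOR ID;
-- fires only when ID is omitted from the INSERT (or the DEFAULT keyword is used)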
I'm working with PostgreSQL 8.1 and I created a trigger for a table, so that anything updated or inserted is also registered in a log table, but I'm having issues with NULL values:
If I update a column to a NULL value, it won't insert anything into the log, but a char-to-char change works fine, so maybe I'm missing something.
The table was created like this:
CREATE TABLE log_test(
id_test integer NOT NULL DEFAULT nextval('id_test_seq'),
type char(3),
product char(10),
table char(15),
field char(10),
old_val char(10),
new_val char(10),
user char(10),
event char(10),
date timestamp with time zone NOT NULL DEFAULT now(),
CONSTRAINT "log_test_prim" PRIMARY KEY ("id_test")
);
The trigger function was created like this:
CREATE FUNCTION "public"."log_test_trigger" () RETURNS "trigger" AS 'BEGIN
...
IF (TG_OP = ''UPDATE'') THEN
IF (NEW.name <> OLD.name) THEN
INSERT INTO log_test (type, table, field, old_val, new_val, user, event) VALUES (NEW.type, TG_RELNAME, ''name'', OLD.name, NEW.name, NEW.user, ''UPDATE'');
END IF;
...
END;' LANGUAGE "plpgsql"
VOLATILE
CALLED ON NULL INPUT
SECURITY INVOKER
Any help?
You cannot compare NULL to a value, i.e. both NULL = 2 and NULL <> 2 are NULL. Add ISNULL:
...
IF (TG_OP = ''UPDATE'') THEN
IF (NEW.name <> OLD.name OR NEW.name ISNULL) THEN
INSERT INTO log_test (type, table, field, old_val, new_val, user, event) VALUES (NEW.type, TG_RELNAME, ''name'', OLD.name, NEW.name, NEW.user, ''UPDATE'');
END IF;
...
or (probably better) coalesce() for both NEW and OLD records:
...
IF (TG_OP = ''UPDATE'') THEN
IF (coalesce(NEW.name, '') <> coalesce(OLD.name, '')) THEN
INSERT INTO log_test (type, table, field, old_val, new_val, user, event) VALUES (NEW.type, TG_RELNAME, ''name'', OLD.name, NEW.name, NEW.user, ''UPDATE'');
END IF;
...
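Another option, if your PostgreSQL version supports it (IS DISTINCT FROM should be available on 8.1), is a null-safe comparison, which avoids picking '' as a sentinel; a sketch in the same style as the function body above:
...
IF (TG_OP = ''UPDATE'') THEN
IF (NEW.name IS DISTINCT FROM OLD.name) THEN
INSERT INTO log_test (type, table, field, old_val, new_val, user, event) VALUES (NEW.type, TG_RELNAME, ''name'', OLD.name, NEW.name, NEW.user, ''UPDATE'');
END IF;
...
This way a change from a value to NULL (or back) is logged, without treating NULL and '' as equal the way the coalesce() version does.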
I have created a SQL Server table that uses uniqueidentifier as the primary key. I set the Default Value or Binding to newid(). (I would like to set the Identity Specification for this column, but that isn't supported for uniqueidentifier types.)
I'm then using ADO.NET to add a row to this table.
SqlCommand command = new SqlCommand("INSERT INTO [User] (Name) VALUES (@name); SELECT SCOPE_IDENTITY()", Connection);
command.Parameters.AddWithValue("@name", "Joe Smoe");
Guid userId = (Guid)command.ExecuteScalar();
However, the last line fails because ExecuteScalar() returns null. It appears that, since a uniqueidentifier cannot be the table's identity, SCOPE_IDENTITY() returns null (as does @@IDENTITY).
Okay, so is there another way to retrieve the newly added ID using ADO.NET?
SCOPE_IDENTITY() only works for identity values; for GUID values you need to use the OUTPUT clause with a table variable.
DECLARE @NewGuid TABLE(NewValue UNIQUEIDENTIFIER);
INSERT INTO [User] (Name)
OUTPUT inserted.pk_ColName INTO @NewGuid(NewValue)
VALUES (@name);
SELECT * FROM @NewGuid --<-- here you will have the new GUID Value
C# code would look something like....
string cmd = "DECLARE #NewGuid TABLE(NewValue UNIQUEIDENTIFIER);
INSERT INTO [User] (Name)
OUTPUT inserted.pk_ColName INTO #NewGuid(NewValue)
VALUES (#name);
SELECT #newID = NewValue FROM #NewGuid;"
SqlCommand command = new SqlCommand(cmd, Connection);
cmd.Parameters.AddWithValue("#name", "Joe Smoe");
cmd.Parameters.Add("#newID", SqlDbType.UniqueIdentifier).Direction = ParameterDirection.Output;
Guid userId = (Guid)cmd.ExecuteScalar();
Personally I would put the whole thing in a stored procedure.
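A rough sketch of that stored-procedure variant (the procedure name and parameter length are made up, and pk_ColName again stands in for the real primary key column):
CREATE PROCEDURE dbo.InsertUser
@name NVARCHAR(50),
@newId UNIQUEIDENTIFIER OUTPUT
AS
BEGIN
DECLARE @NewGuid TABLE (NewValue UNIQUEIDENTIFIER);
INSERT INTO [User] (Name)
OUTPUT inserted.pk_ColName INTO @NewGuid (NewValue) -- replace pk_ColName with the real PK column
VALUES (@name);
SELECT @newId = NewValue FROM @NewGuid;
END
The caller would then add @newId as an output parameter on a SqlCommand with CommandType.StoredProcedure and read it after ExecuteNonQuery().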
Scope_Identity() focuses on an IDENTITY field, so it will never yield anything. You need to output from INSERTED instead. Even though this page is not focused on your particular problem, it should give you some clues:
Return ID on INSERT?
My normal direction is a stored procedure, but you can chain commands, as you have done. The stored procedure makes things a bit easier, as you can create an output parameter for the procedure, but outputting a value works fine.
EDITED to show specific example:
Assume the following table:
CREATE TABLE [dbo].[MyTable]
(
[Id] [uniqueidentifier] PRIMARY KEY NOT NULL DEFAULT NEWID(),
[Name] [varchar](50) NOT NULL
)
The following program will output the new GUID created from NewID():
using System;
using System.Configuration;
using System.Data.SqlClient;
class Program
{
static void Main(string[] args)
{
var connString = ConfigurationManager.ConnectionStrings["testDB"].ToString();
var cmdString = "INSERT INTO MyTable (Name) OUTPUT Inserted.Id VALUES ('Name')";
var connection = new SqlConnection(connString);
var command = new SqlCommand(cmdString, connection);
Guid outputValue;
try
{
connection.Open();
//Convert to Guid here instead
Console.WriteLine(command.ExecuteScalar().ToString());
}
finally
{
connection.Dispose();
}
Console.Read();
}
}
I am having a problem migrating a NUMERIC(8,0) column from SQL Server to a column in ASE (v16.0) which has a user-defined type of NUMERIC(8,0).
I have a simple user-defined type:
IF EXISTS (SELECT * FROM systypes WHERE name='typ_small_id')
BEGIN
EXEC sp_droptype 'typ_small_id'
IF EXISTS (SELECT * FROM systypes WHERE name='typ_small_id')
PRINT '<<< FAILED DROPPING DATATYPE typ_small_id >>>'
ELSE
PRINT '<<< DROPPED DATATYPE typ_small_id >>>'
END
go
EXEC sp_addtype 'typ_small_id','numeric(8,0)','NOT NULL'
go
IF EXISTS (SELECT * FROM systypes WHERE name='typ_small_id')
PRINT '<<< CREATED DATATYPE typ_small_id >>>'
ELSE
PRINT '<<< FAILED CREATING DATATYPE typ_small_id >>>'
go
This is used in a small table:
create TABLE dbo.target
(
e_type_id typ_small_id /*NOT*/ NULL
)
I am trying to bulk copy a single value from SQL Server using AseBulkCopy.WriteToServer. The source value is defined as NUMERIC(8,0) in SQL Server.
I am using the .NET client for ASE, and my app code is C#:
public void BulkCopyFromSqlServer_t_sec_exchange(string sourceConnectionString, string targetConnectionString)
{
SqlConnection sourceConnection = null;
AseConnection targetConnection = new AseConnection(targetConnectionString);
try
{
IDataReader dataSource;
//The next method call returns a single row with a single column, defined as a NUMERIC (8,0) in SqlSvr.
MssqlReader.GetDataReaderSelect_t_sec_exchange(out sourceConnection, out dataSource);
targetConnection.Open();
AseBulkCopy blk = new AseBulkCopy(targetConnection, new AseBulkCopyOptions(), null);
blk.BulkCopyTimeout = 1200;
blk.DestinationTableName = "dbo.target";
blk.ColumnMappings.Clear();
blk.WriteToServer(dataSource);
blk.Close();
}
catch (AseException ex)
{
Console.WriteLine(ex.Message);
}
finally
{
sourceConnection.Dispose();
targetConnection.Dispose();
}
}
The value coming from SqlServer is 1.
This all works fine until I define target.e_type_id as typ_small_id NOT NULL instead of being nullable.
When I do this I get an AseError:
The data type or the function is not supported.
Stack Trace:
at Sybase.Data.AseClient.AseBulkCopyBusinessBulk.TdsToAseDBType(TdsTypesDefines type, Int32 len, Int32 usertype, Boolean IsNullable, Int32& DateTimeLen)
at Sybase.Data.AseClient.AseBulkCopyBusinessBulk.InitMetadata()
at Sybase.Data.AseClient.AseBulkCopyBusinessBulk.InitFmtData()
at Sybase.Data.AseClient.AseBulkCopyBusinessBulk.BulkOperation()
at Sybase.Data.AseClient.AseBulkCopyBusinessBulk.RunInsertsRowsReader()
at Sybase.Data.AseClient.AseBulkCopy.WriteToServer(IDataReader reader)
at SybaseBulkCopy.SybaseCommand.BulkCopyFromSqlServer(String sourceConnectionString, String targetConnectionString) in c:..\SybaseCommand.cs:line 32
When the column is defined as typ_small_id NOT NULL, the queries below both successfully insert data into the column from a query editor, so it looks like an error in the WriteToServer code:
insert into dbo.target ( e_type_id )
SELECT top 1 cast(1 as numeric(8,0)) as e_type_id from source_tbl
and
insert into dbo.target ( e_type_id )
SELECT top 1 cast(1 as int) as e_type_id from source_tbl
Needless to say, the customer's table has the column defined as NOT NULL (and I cannot alter the schema), so I need to find out how to migrate the data into their ASE table.
Has anyone seen this or found a workaround?
This has been confirmed by SAP (Sybase Tech Support) as a bug.
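Until a fix is available, one possible workaround, sketched here as an assumption rather than a confirmed solution, follows from the fact that plain INSERT ... SELECT into the NOT NULL column works: bulk copy into a nullable staging table (the staging table name is made up) and then copy across on the ASE side:
create table dbo.target_stage
(
e_type_id typ_small_id NULL
)
go
-- point AseBulkCopy.DestinationTableName at dbo.target_stage instead, then:
insert into dbo.target (e_type_id)
select e_type_id from dbo.target_stage
go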