What is the purpose of Dapper's [ExplicitKey] attribute?
In this example, the generated SQL is identical regardless of whether the attribute is applied:
exec sp_executesql
N'insert into InvoiceDetail ([InvoiceID], [Detail]) values (@InvoiceID, @Detail);
select SCOPE_IDENTITY() id',N'@Detail nvarchar(4000),@InvoiceID int',@Detail=N'Insert_Single_1',@InvoiceID=4
This is why the attribute exists:
https://github.com/StackExchange/Dapper/issues/351
The [Key] attribute assumes an auto-incremented key, and when you try to pass a custom (e.g. non-int) key instead, Insert and InsertAsync fail with a NullReferenceException, even though the passed value clearly isn't null. [ExplicitKey] was introduced to handle those cases instead.
Related
Since SQL Server 2008 does not have a TRY_CONVERT function, I wanted to implement it for use in my project (no, simply updating to a more recent version is not an option).
Problem is: how can I pass a data type as a parameter to my function?
Or
More generically: what type does a variable need to be in order to hold a data type?
Basically I want to build a Function like this:
CREATE FUNCTION my_Try_Convert
(
-- Add the parameters for the function here
@value varchar(max), @entrytype type,
@format int = NULL
)
RETURNS typeof(@entrytype)
AS
BEGIN
if @entrytype = typeof(datetime)
if (select ISDATE(@value))=1
return convert(@entrytype,@value,@format)
if @entrytype = typeof(int)
if (select isnumeric(@value))=1
return convert(@entrytype,@value,Null)
END
GO
What do I need to pass for @entrytype's type, and what for the typeof pseudo-function?
Well, there is no TRY_CONVERT() or TRY_CAST() in SQL Server 2008, but you can use XML's internal casts to nullable types.
Try this
DECLARE #tbl TABLE(SomeInt VARCHAR(100));
INSERT INTO #tbl VALUES('123')
,('blah') --bad data
SELECT t.*
,CAST('' AS XML).value('sql:column("t.SomeInt") cast as xs:int?','int')
FROM #tbl t;
Bit of a hack... But somehow beautiful ;-)
Some explanation
The CAST('' AS XML) is just a trick to get a real XML value in order to use XML's native methods. sql:column() is an XQuery function that lets you include a set's column in the XQuery expression (use sql:variable() for variables).
The cast as xs:int? will try to parse the string as an int value and will return NULL if this does not work.
The limit is: this will always use your system's defaults (similar to TRY_CAST). With TRY_CONVERT you'd have more control over the output via the third parameter...
Hint: XQuery/XPath is strictly case-sensitive. So there is xs:dateTime?, but you'd get an error with xs:datetime...
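If you want to package this as a my_Try_Convert-style function, note that T-SQL cannot parameterize a function's return type, so you need one wrapper per target type. A sketch along those lines (function name is made up, and I have not verified it on 2008 itself):

```sql
-- One wrapper per target type; dbo.Try_Convert_Int is an illustrative name.
CREATE FUNCTION dbo.Try_Convert_Int (@value VARCHAR(4000))
RETURNS INT
AS
BEGIN
    DECLARE @result INT;
    -- sql:variable() exposes the parameter to the XQuery expression;
    -- "cast as xs:int?" yields NULL when @value is not a valid int.
    SELECT @result = CAST('' AS XML).value(
        'sql:variable("@value") cast as xs:int?', 'int');
    RETURN @result;
END
GO
SELECT dbo.Try_Convert_Int('123');   -- 123
SELECT dbo.Try_Convert_Int('blah');  -- NULL
```

A dbo.Try_Convert_DateTime twin would use xs:dateTime? (mind the casing) and a datetime return type.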
I'm using this query:
INSERT INTO [LepidoliteDB].[dbo].[EGelLayerWeighingData] ([EGelWeighingDataID], [Amount], [ContainerID], [Portions],
[PortionNumber], [Canceled], [LayerID], [WeighingDone], [InsertDone],
[MeasuresDone], [StartDateAndTime], [EndDateAndTime])
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
SELECT SCOPE_IDENTITY()
But SCOPE_IDENTITY() always returns 1.
What am I missing?
Maybe you have an INSTEAD OF / FOR trigger there? What version of MSSQL are you using? Have you tried doing this without a database name? Have you tried this with a BEGIN-END block?
Have you read this article: @@IDENTITY vs SCOPE_IDENTITY() vs IDENT_CURRENT?
I know this is an old thread, but I was having this same issue, and I managed to solve it by going into my XSD and on the query, changing the ExecuteMode of the query from NonQuery to Scalar. This stopped it always returning 1 and instead returning the ID of the previously added row.
It was probably down to me trying to add the SELECT to the INSERT afterwards, rather than doing it all at once and the UI making it correctly for me.
The best way to write it is:
RETURN SCOPE_IDENTITY()
You can also use @@IDENTITY and IDENT_CURRENT for this.
SCOPE_IDENTITY, IDENT_CURRENT, and @@IDENTITY are similar functions because they all return values that were inserted into identity columns.
IDENT_CURRENT is not limited by scope and session; it is limited to a specified table. IDENT_CURRENT returns the value generated for a specific table in any session and any scope. For more information, see IDENT_CURRENT (Transact-SQL).
SCOPE_IDENTITY and @@IDENTITY return the last identity values generated in any table in the current session. However, SCOPE_IDENTITY returns values inserted only within the current scope; @@IDENTITY is not limited to a specific scope.
Therefore, RETURN SCOPE_IDENTITY() is the best choice.
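The difference is easiest to see with a trigger. A minimal sketch (table and trigger names are made up):

```sql
CREATE TABLE T1 (Id INT IDENTITY(1,1),   Val VARCHAR(10));
CREATE TABLE T2 (Id INT IDENTITY(100,1), Val VARCHAR(10));
GO
-- An AFTER INSERT trigger on T1 that also inserts into T2
CREATE TRIGGER trg_T1 ON T1 AFTER INSERT AS
    INSERT INTO T2 (Val) VALUES ('audit');
GO
INSERT INTO T1 (Val) VALUES ('row');

SELECT SCOPE_IDENTITY()    AS ScopeId,   -- 1: identity from T1, current scope
       @@IDENTITY          AS SessionId, -- 100: identity from T2, set by the trigger
       IDENT_CURRENT('T1') AS TableId;   -- 1: last identity for T1, any session
```

This is why a trigger that inserts into another identity table (as suggested above) makes @@IDENTITY return a surprising value while SCOPE_IDENTITY() stays correct.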
I had this problem and I must admit it was a dumb mistake, but I thought it may happen to someone else. I was using the Execute command that returns the number of rows affected instead of a Query command.
By the way, I was combining my Insert and SELECT SCOPE_IDENTITY() together in one command, which is why I started with the Execute command. I was still able to combine them, but I used ExecuteScalar instead.
Late to the thread, but if someone is using Dapper ORM and they are doing an UPDATE or INSERT a single row and want to know the primary key of that updated record:
var connection = new SqlConnection("ConnectionString");
string sql = @"INSERT INTO [dbo].[Student] ([Name],[Subject])
VALUES (@Name, @Subject);
SELECT CAST(SCOPE_IDENTITY() as int);";
For the above SQL string, if you execute the SQL connection like:
Result=connection.Execute(sql,new {Name,Subject});
The value of Result will be the number of rows affected, in this case only one since we have inserted only one row.
Instead do this:
Result=connection.Query<int>(sql,new {Name,Subject}).Single();
Now Result will have the primary key value of the newly inserted/updated row.
Remember, this answer is only relevant to single-row insertion/update.
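As an aside, the OUTPUT clause is an alternative to SELECT SCOPE_IDENTITY() that also works for multi-row inserts. A sketch using the same table and column names as above (StudentId is assumed to be the identity column):

```sql
INSERT INTO [dbo].[Student] ([Name], [Subject])
OUTPUT INSERTED.StudentId        -- returns the generated key(s) as a result set
VALUES (@Name, @Subject);
```

It is consumed the same way, e.g. connection.Query<int>(sql, new {Name, Subject}). One caveat: a plain OUTPUT clause is not allowed when the target table has triggers; in that case you need OUTPUT ... INTO a table variable.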
We are creating a schema version for our database. Right now we have a database table with a column version and a single row with the version.
Is it possible to make a scalar variable on the database, rather than a table?
Something like:
SET DBNAME.dbo.DBVersion = 1;
You may consider storing a scalar value in an extended property attached to your database.
To add an extended property, use sp_addextendedproperty:
EXEC sp_addextendedproperty @name = N'DBVersion', @value = '1';
To get the value of your property, use fn_listextendedproperty:
SELECT value FROM fn_listextendedproperty(N'DBVersion', default, default, default, default, default, default);
Output:
Value
-----
1
Further reading:
Using Extended Properties on Database Objects
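Since a schema version changes over time, the companion procedures are worth knowing (same signature family as sp_addextendedproperty; for database-level properties the level parameters can simply be omitted):

```sql
-- Bump the version stored in the extended property
EXEC sp_updateextendedproperty @name = N'DBVersion', @value = '2';

-- Remove the property entirely
EXEC sp_dropextendedproperty @name = N'DBVersion';
```

Note that sp_updateextendedproperty fails if the property does not exist yet, so the initial sp_addextendedproperty call is still required once.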
When I need to store constant values in the database that aren't necessarily related, I usually end up with a dbo.Config table that has one column for each constant it needs to hold.
If you really want to store this somewhere outside the table you can consider extended properties, views, functions or stored procedures.
You can always create a function that looks like
CREATE FUNCTION dbo.GetSomeConst ()
RETURNS int
AS
BEGIN
    RETURN 12;
END
For static (read-only) data, you can also define a single-row view:
CREATE VIEW V_Constants AS
SELECT '1' AS DBVersion
as sketched in an earlier answer
Is there a way to call a User defined function without using "dbo." before the function name and parameters?
Using:
SELECT USERFUNCTION(PARAM1, PARAM2, PARAM3, PARAMN)
instead of:
SELECT dbo.USERFUNCTION(PARAM1, PARAM2, PARAM3, PARAMN)
This isn't possible with the SELECT syntax. BOL states: "Scalar-valued functions must be invoked by using at least the two-part name of the function."
This syntax works, however:
CREATE FUNCTION USERFUNCTION
(#p INT)
RETURNS INT
AS
BEGIN
RETURN (2)
END
GO
DECLARE @rc INT
EXEC @rc = USERFUNCTION 1
SELECT @rc
It is best practice to always schema-qualify the objects you reference anyway, though, both to avoid some overhead in resolving the schema and to avoid the possibility that the schema cannot be resolved implicitly, or is resolved in an undesired way.
There are various ways to do this, if we take it that you have a negative reaction to seeing "dbo.".
In SQL Server 2000, there is a way to turn UDFs into system functions by toggling a bit. This "feature" has been removed from SQL Server 2005 onwards, so I won't go into detail unless you really are still using 2000.
You can use OPENQUERY with PROC syntax similar to what Martin has shown.
You can turn the Scalar function into a Table Valued Function, either by rewriting it, or by wrapping it in a TVF. The syntax changes however, so
select dbo.udf(a,b) from c
--becomes
select d
from c
cross apply tvf(a,b) e(d) -- look, no "dbo"!
But none of the above looks simpler than just tacking a simple "dbo." prefix to the function name, so why would you do it?
Yes, it's possible.
When a function returns a scalar value, you must call it with the schema name, like dbo.yourfunction. If you want to call a function without the schema name, you should create it as a table-valued function, as follows.
Sample Code:
CREATE FUNCTION [dbo].[FN_MAPCOUNT]
(
    @countValue int
)
RETURNS @TEMPTABLE TABLE (cntValue int)
AS
BEGIN
    SELECT @countValue = COUNT(*) FROM mappings;
    INSERT @TEMPTABLE (cntValue) VALUES (@countValue);
    RETURN;
END
GO
SELECT * FROM FN_MAPCOUNT(1);
The reason this works is that you are returning the value as a table.
While updating a DataTable to a SQL Server database I get the error message "Column 'PK_Column' does not allow nulls" after calling GetErrors()
I don't want to provide a value for PK_Column because it is an auto-increment primary key column in the database. My insert statement looks like this:
INSERT INTO [Order] ([Customer_Id], [OrderTime], [OrderType])
VALUES (@Customer_Id, @OrderTime, @OrderType)
SELECT CAST(SCOPE_IDENTITY() AS int) AS '@PK_Column'
It works as expected in SQL Server Management Studio, so the query is obviously not the problem.
I have four parameters on the insert command: one output parameter (@PK_Column) and three input parameters (@Customer_Id, @OrderTime, @OrderType). I figured out that I don't get the error if I set @PK_Column to an InputOutput parameter, but then PK_Column does not get updated with the correct value created by the database.
Try
SELECT @PK_Column = SCOPE_IDENTITY()
This way, you assign to the local variable @PK_Column, which is picked up as a parameter.
When you do ... AS '@PK_Column', you are creating a data set with one column called "@PK_Column", when what you want is to assign a value to the local variable/parameter @PK_Column.
Note: ... AS @PK_Column would fail because this is a column alias. You're relying on a setting that allows '@PK_Column' (quoted) to be a valid alias, whereas @PK_Column would fail.
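For reference, the same assignment pattern inside a stored procedure with an OUTPUT parameter (names follow the question; the procedure itself is a sketch, not the asker's actual command):

```sql
CREATE PROCEDURE dbo.InsertOrder
    @Customer_Id INT,
    @OrderTime   DATETIME,
    @OrderType   INT,
    @PK_Column   INT OUTPUT
AS
BEGIN
    INSERT INTO [Order] ([Customer_Id], [OrderTime], [OrderType])
    VALUES (@Customer_Id, @OrderTime, @OrderType);

    -- Assign to the output parameter; do not alias a result-set column
    SELECT @PK_Column = CAST(SCOPE_IDENTITY() AS INT);
END
```

On the client side the @PK_Column parameter is then declared with ParameterDirection.Output and read back after ExecuteNonQuery.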