I am creating a WinForms application in C# using a SQL Server database.
I have one table, employee_master, with columns such as Id, name, address and phone no. Id is an auto-increment (identity) column and all the other columns are varchar.
I am using this code to get the next auto-increment value:
string s = "select max(id) as Id from Employee_Master";
SqlCommand cmd = new SqlCommand(s, obj.con);
SqlDataReader dr = cmd.ExecuteReader();
dr.Read();
int i = Convert.ToInt16(dr["Id"].ToString());
txtId.Text = (i + 1).ToString();
I am displaying the value in a TextBox.
But when the last row is deleted from the table, the textbox still shows the value of the row that was just deleted.
How should I get the next auto-increment value?
To get the next auto-increment value from SQL Server:
This will fetch the present auto-increment value.
SELECT IDENT_CURRENT('table_name');
This will fetch the next auto-increment value (assuming the increment is 1):
SELECT IDENT_CURRENT('table_name')+1;
This will work even if you add a row and then delete it, because IDENT_CURRENT returns the last identity value generated for a specific table in any session and any scope.
try this:
SELECT IDENT_CURRENT('tbl_name') + IDENT_INCR('tbl_name');
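If you want to read that value from the WinForms side, a minimal ADO.NET sketch could look like the following (assuming an open SqlConnection named con and the Employee_Master table from the question; keep in mind a value read this way is only a hint and can already be taken by another user by the time you insert):
string sql = "SELECT IDENT_CURRENT('Employee_Master') + IDENT_INCR('Employee_Master')";
using (SqlCommand cmd = new SqlCommand(sql, con))
{
    object result = cmd.ExecuteScalar();          // numeric result, or DBNull if the table has no identity column
    if (result != DBNull.Value)
        txtId.Text = Convert.ToInt32(result).ToString();
}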
If you are using Microsoft SQL Server, use this statement to get the current identity value of the table, and then add the increment value you specified when designing the table if you want the next id.
SELECT IDENT_CURRENT('<TableName>')
As for me, the best answer is:
DBCC CHECKIDENT('table_name', NORESEED)
You will see two values (probably the same): the current identity value and the current maximum column value.
The NORESEED option only reports them without changing anything.
When you delete a row from the table, the next number stays the same; it does not decrement in any way.
So if you have 100 rows and you delete row 100, you will have 99 rows but the next number is still going to be 101.
select isnull((max(AddressID)+1),1) from AddressDetails
max(id) gets you the maximum value currently in employee_master.
E.g. for ids 10, 20, 100, max returns 100.
But the record you deleted may not have been 100, so you still get 100 back.
One reason I say this might be the issue is that you are not using ORDER BY id in your query.
For MS SQL 2005 and greater:
Select Cast(IsNULL(last_value,seed_value) As Int) + Cast(increment_value As Int) As NextID
From sys.identity_columns
WHERE OBJECT_NAME(object_id) = '<Table_Name>'
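A parameterized C# version of that lookup might look like this (a sketch; the open connection con and the table name are assumptions carried over from the question):
string sql = @"SELECT CAST(ISNULL(last_value, seed_value) AS int) + CAST(increment_value AS int)
               FROM sys.identity_columns
               WHERE object_id = OBJECT_ID(@table)";
using (SqlCommand cmd = new SqlCommand(sql, con))
{
    cmd.Parameters.AddWithValue("@table", "dbo.Employee_Master");
    object next = cmd.ExecuteScalar();   // null if the table has no identity column
    if (next != null)
        txtId.Text = Convert.ToInt32(next).ToString();
}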
Just a thought: if what you want is the last auto-number that you inserted on an already open connection, try using
SELECT @@IDENTITY
from that connection. That's the best way to keep track of what has just happened on a given connection and avoids race conditions with other connections. Getting the maximum identity is not generally feasible.
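Here is a sketch of that idea in C# (the connection con, the TextBox names and the Employee_Master column names are assumptions based on the question; SCOPE_IDENTITY() is used because it is the scope-safe variant of @@IDENTITY):
string sql = @"INSERT INTO Employee_Master (name, address, phone)
               VALUES (@name, @address, @phone);
               SELECT CAST(SCOPE_IDENTITY() AS int);";
using (SqlCommand cmd = new SqlCommand(sql, con))
{
    cmd.Parameters.AddWithValue("@name", txtName.Text);        // txtName/txtAddress/txtPhone are hypothetical TextBoxes
    cmd.Parameters.AddWithValue("@address", txtAddress.Text);
    cmd.Parameters.AddWithValue("@phone", txtPhone.Text);
    int newId = (int)cmd.ExecuteScalar();                      // the id SQL Server actually assigned
    txtId.Text = newId.ToString();
}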
SqlConnection con = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=databasename;User ID=sa;Password=123");
con.Open();
SqlCommand cmd = new SqlCommand("SELECT TOP(1) UID FROM InvoiceDetails ORDER BY 1 DESC", con);
SqlDataReader reader = cmd.ExecuteReader();
// Only one row is returned, so a simple if is enough.
if (reader.Read())
{
    string data = reader["UID"].ToString();
    int i = Int32.Parse(data);
    i++;
    txtuid.Text = i.ToString();
}
reader.Close();
con.Close();
I have a couple of insert queries that are combined in a transaction. The first insert creates a new product article number by incrementing the highest number in the table by one. Unfortunately, I just noticed during testing that if two users in two different application instances click the button that triggers my transaction's method at roughly the same time, they can end up with the same new product number. How can I avoid that situation? Is there something like a lock on the first insert, so that while the first user is inserting, other users are blocked and have to wait in a queue until the first user's insert is finished? I had assumed that while someone is inserting, other users would not be able to insert. I added comments in the code to help you understand.
Part of my transaction code is below:
Public Sub ProcessArticle(ByRef artikel As ArticlesVariations)
    Dim strcon = New AppSettingsReader().GetValue("ConnectionString", GetType(System.String)).ToString()
    Using connection As New SqlConnection(strcon)
        connection.Open()
        Using transaction = connection.BeginTransaction()
            Try
                For Each kvp As KeyValuePair(Of Integer, Artikel) In artikel.collection
                    articleIndex = kvp.Key
                    Dim art As Artikel = kvp.Value
                    Using cmd As New SqlCommand("INSERT INTO tbArtikel (Nummer) VALUES (@Nummer); SELECT SCOPE_IDENTITY()", connection, transaction)
                        cmd.CommandType = CommandType.Text
                        'Get next product number from table tbArtikel (this will be the new product number)'
                        Dim NewArtNummer As String = New DALArtikel().GetNewArtikelNumber(transaction)
                        art.Nummer = NewArtNummer
                        cmd.Parameters.AddWithValue("@Nummer", art.Nummer)
                        'Get the inserted product id for the other inserts below'
                        newArticleRowId = CInt(cmd.ExecuteScalar())
                    End Using
                    '... other INSERT queries to other tables ...'
                Next
                transaction.Commit()
            Catch ex As Exception
                transaction.Rollback()
                Throw 'Rethrow the exception.'
            End Try
        End Using
    End Using
End Sub
Just about the only way to ensure that users are not assigned the same values is to issue them from the server when the row is inserted. That is the entire premise behind letting the server issue auto-increment values for primary keys.
BUT since your value is a multi-segment "numeric string", that presents a problem. Rather than tearing the string apart to find Max()+1 for one segment with a WHERE clause on parts of the string, consider something like this:
Start with a table used to increment and issue the values:
{DocId Int, SegmentB int, SegmentC Int}
This will simply track the values to use in the other table. Then use a stored procedure to create/increment a new code (MySQL; this is a conceptual answer):
CREATE DEFINER=`root`@`localhost` PROCEDURE `GetNextProductCode`(in docId int,
    in Minr int,
    in Rev int
)
BEGIN
    SET @maxR = 0;
    SET @retCode = '';
    if Minr = -1 then
        Start transaction;
            SET @maxR = (SELECT Max(SegmentB) FROM articlecode WHERE MainId = docId) + 1;
            UPDATE articlecode SET SegmentB = @maxR WHERE MainId = docId;
        Commit;
        Select concat(Cast(docId As char), '.',
                      Cast(@maxR AS char), '.',
                      Cast(Rev As char)
        );
    end if;
END
This is a rough idea of the process. As such, it only works on the second segment (I don't know what happens when you create a new SegmentB - does SegmentC reset to 1?). The idea is:
pass numbers so there is no need to tear up a string
pass -1 for the segment you need the next value for
the sp gets the Max()+1 and updates the counter table so the next user will get a new value
If for some reason you end up not saving the row, there will be gaps
the sp uses a transaction (probably only needs to protect the update) so that only 1 update can happen at a time
returns the new code; it could just return the two values, but you're going to glue them together anyway
There is still much to do:
It only does SegmentB
For a NEW DocId (-1), insert a new row with 1000 and 1(?) defaults
Same for a NEW segmentB (whatever it is): insert a new row for that DocId with default values
To get a new code before you insert a row:
Dim cmd As New MySqlCommand("GetNextProductCode", dbcon)  ' dbcon: an existing MySqlConnection
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.Add("docId", MySqlDbType.Int32).Value = 3
cmd.Parameters.Add("Minr", MySqlDbType.Int32).Value = -1
cmd.Parameters.Add("Rev", MySqlDbType.Int32).Value = 1
dbcon.Open()
Using rdr = cmd.ExecuteReader()
    rdr.Read()
    Console.WriteLine(rdr(0))
End Using
The obvious downside is that each insert requires an extra round trip to the DB in order to... well, save to the DB. If these were plain int values, it could be done with a trigger.
I'm a SQL developer and my VB skills are about fifteen years out of date, but instead of creating the incremented number yourself in VB, just let SQL Server generate it with an IDENTITY column. SQL Server will never allow duplicates, and then you just need to return SCOPE_IDENTITY():
ALTER TABLE dbo.tbArtikel
ADD [ArtikelID] INT IDENTITY(1,1) PRIMARY KEY;
I have two suggestions:
First suggestion: move your code to a stored procedure. This way all your users will execute the same transaction, where you can set the isolation level the way you want. Read this.
Second suggestion: I would create a unique index on your Nummer column. This way, when an insert tries to use a duplicate value, it will raise an error that you can handle by telling the user to retry the same operation, or by retrying it automatically (see the sketch below).
Trying to lock the record or the table for your operation is not advisable; however, you can check this article on CodeProject, where you might find what you are looking for. Make sure you provide a mechanism for releasing all locks if your program stops in the middle of the transaction.
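A rough sketch of the second suggestion, written in C# for brevity (the table, column and helper names are taken from the question's code; 2601 and 2627 are the error numbers SQL Server raises for unique index / unique constraint violations):
const int maxAttempts = 3;
for (int attempt = 1; attempt <= maxAttempts; attempt++)
{
    try
    {
        using (var cmd = new SqlCommand(
            "INSERT INTO tbArtikel (Nummer) VALUES (@Nummer); SELECT CAST(SCOPE_IDENTITY() AS int);",
            connection, transaction))
        {
            // GetNewArtikelNumber is the question's own helper for computing the next number.
            cmd.Parameters.AddWithValue("@Nummer", new DALArtikel().GetNewArtikelNumber(transaction));
            int newArticleRowId = (int)cmd.ExecuteScalar();
            break;                                   // success, stop retrying
        }
    }
    catch (SqlException ex) when (ex.Number == 2601 || ex.Number == 2627)
    {
        // Another session inserted the same Nummer first; on the next pass a fresh number is computed.
        if (attempt == maxAttempts) throw;
    }
}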
Suppose I generate the PK for my SQL Server DB table with the help of the newid() function. In Java I can do something like this:
...
String query = "DECLARE @newGuid uniqueidentifier "+
"SET @newGuid = newid() "+
"INSERT INTO myTable(id, stringval) "+
"VALUES (@newGuid, 'Hello') "+
"SELECT uid FROM @newGuid";
PreparedStatement ps = conn.prepareStatement(query);
ResultSet rs = ps.executeQuery();
String uid = rs.getString("uid");
But when I try to do that with Delphi + ADO, I get stuck because ADO can either get data from the DB (the Open method of TADOQuery) or put data into the DB (the ExecSQL method). So I can't insert the new value into the table and then read the parameter value afterwards.
You could solve this problem in at least two ways.
You can put both of your SQL statements into one string (just like in your example) and call TADOQuery.Open or set TADOQuery.Active := True. It doesn't matter that the string contains an INSERT statement, as long as the query returns something.
You can define the parameter's direction as pdOutput in the ADOQuery.Parameters collection and read the value of that parameter after executing the query.
You are treating @newGuid as if it were a table. The last row of your query should be:
SELECT @newGuid as uid
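For comparison, here is a sketch of the same round trip from C#/ADO.NET using an output parameter (the table and column names follow the question; conn is assumed to be an open SqlConnection):
string sql = @"DECLARE @newGuid uniqueidentifier;
               SET @newGuid = NEWID();
               INSERT INTO myTable (id, stringval) VALUES (@newGuid, 'Hello');
               SET @uid = @newGuid;";
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    SqlParameter uid = cmd.Parameters.Add("@uid", SqlDbType.UniqueIdentifier);
    uid.Direction = ParameterDirection.Output;
    cmd.ExecuteNonQuery();
    Guid newId = (Guid)uid.Value;   // the GUID that was inserted
}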
When I execute this stored procedure:
SELECT * FROM Users
INNER JOIN BloodBankUser ON Users.UserID = BloodBankUser.UserID
it gives me the result fine.
But now, on the .NET side,
dt.Rows[0]["Address"].ToString();
gives me the Address of the BloodBankUser table, whereas
dt.Rows[0]["Users.Address"].ToString();
throws this error when executed:
Column 'Users.Address' does not belong to table.
How can I get the value of Users.Address?
While the first answer would be to change your SQL query to specify a distinct name for each of your fields, it is still possible to retrieve the table name associated with each field.
In this example, I am not filling a DataTable using a DataAdapter; instead, I am using the SqlDataReader directly.
Be aware that this may fail if you are unable to retrieve the database schema for any reason.
When calling ExecuteReader on a SqlCommand, there is an overload that allows you to specify a CommandBehavior. In our case, the behavior that we want is CommandBehavior.KeyInfo.
var reader = command.ExecuteReader(CommandBehavior.KeyInfo);
Now, on the reader, you can invoke the GetSchemaTable method. It returns a DataTable that contains the structure of your query.
var schema = reader.GetSchemaTable();
You can read about that table on MSDN.
Our goal now is to match a field and table name against its ordinal position in the column list. Three fields from the schema table are relevant here:
ColumnName
BaseTableName
ColumnOrdinal
You can then create an extension method to do that reading:
public static T Field<T>(this SqlDataReader reader, DataTable schema, string table, string field)
{
// Search for the ordinal that match the table and field name
var row = schema.AsEnumerable().FirstOrDefault(r => r.Field<string>("BaseTableName") == table && r.Field<string>("ColumnName") == field);
var ordinal = row.Field<int>("ColumnOrdinal");
return (T)reader.GetValue(ordinal);
}
You can then call that extension method
using (SqlConnection connection = new SqlConnection("your connection string"))
{
connection.Open();
using (SqlCommand command = new SqlCommand("SELECT * FROM Users INNER JOIN BloodBankUser ON Users.UserID = BloodBankUser.UserID;", connection))
using (var reader = command.ExecuteReader(CommandBehavior.KeyInfo))
{
var schema = reader.GetSchemaTable();
while (reader.Read())
{
Console.WriteLine(reader.Field<string>(schema, "Users", "Address"));
}
}
}
Rename the FIELD in the output (Select FIELDNAME as NEWNAME)
Specify the column names rather than using SELECT *. You will then be able to do the following:
Select Users.Username,
Users.Address as 'UserAddress',
BloodBankUser.Address as 'BloodbankAddress'
FROM Users
INNER JOIN BloodBankUser ON Users.UserID = BloodBankUser.UserID
Avoid the use of * in SELECT queries. Select only the columns you need and name them explicitly to avoid ambiguity.
Instead of SELECT * ..., specify the columns you want explicitly, and alias those that may be duplicated:
SELECT Users.Address as UsersAddress
I am using the VFPOLEDB driver to read DBF files, and I keep getting this error. I am not sure why, or how to fix it:
The provider could not determine the Decimal value. For example, the row was just created, the default for the Decimal column was not available, and the consumer had not yet set a new Decimal value.
Here is the code. I call this routine to return a DataSet of the DBF file and display the data in a DataGridView.
public DataSet GetDBFData(FileInfo fi, string tbl)
{
using (OleDbConnection conn = new OleDbConnection(
#"Provider=VFPOLEDB.1;Data Source=" + fi.DirectoryName + ";"))
{
conn.Open();
string command = "SELECT * FROM " + tbl;
OleDbDataAdapter da = new OleDbDataAdapter(command, conn);
DataSet ds = new DataSet();
da.Fill(ds);
return ds;
}
}
I found the solution here:
Error reading certain numeric values with VFPOLEDB driver
SELECT CAST(FieldName As NUMERIC(11, 3)) From TableName
I finally solved the problem by getting the table schema and then casting all non-character fields to varchar in the SELECT statement. That is good enough for previewing the contents of the table.
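A rough sketch of that approach is below; it reads the column schema and builds a SELECT that casts every non-character field. The OLE DB type codes checked and the VARCHAR(40) width are assumptions to adjust for your tables (VFP accepts CAST in queries, as the linked answer shows).
// Requires System.Data, System.Data.OleDb and System.Collections.Generic.
static string BuildCastSelect(OleDbConnection conn, string table)
{
    DataTable cols = conn.GetOleDbSchemaTable(
        OleDbSchemaGuid.Columns, new object[] { null, null, table, null });

    var parts = new List<string>();
    foreach (DataRow row in cols.Rows)
    {
        string name = (string)row["COLUMN_NAME"];
        var type = (OleDbType)Convert.ToInt32(row["DATA_TYPE"]);
        bool isCharacter = type == OleDbType.Char || type == OleDbType.VarChar ||
                           type == OleDbType.WChar || type == OleDbType.VarWChar;
        parts.Add(isCharacter ? name : "CAST(" + name + " AS VARCHAR(40)) AS " + name);
    }
    return "SELECT " + string.Join(", ", parts) + " FROM " + table;
}
The resulting statement can then be passed to the OleDbDataAdapter in GetDBFData instead of SELECT *.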
It is a known issue.
It is especially inconvenient when you need to select all columns, where it would be much more comfortable to simply write:
Select * from some_table
One working solution is to use another provider, for example Microsoft.Jet.OLEDB.4.0.
An example connection string can be found here: http://docs.30c.org/conn/dbf-foxpro.html
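For illustration, such a connection string can be plugged into the question's method roughly like this (a sketch reusing fi and tbl from the question; the folder, not the file, is the Data Source, and "dBASE IV" is one common Extended Properties value; note the Jet provider only runs in 32-bit processes):
using (OleDbConnection conn = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + fi.DirectoryName +
    ";Extended Properties=dBASE IV;"))
{
    conn.Open();
    OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM " + tbl, conn);
    DataSet ds = new DataSet();
    da.Fill(ds);
    return ds;   // mirrors the body of GetDBFData from the question
}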
If you add a row from your grid view, it doesn't necessarily use a default value, but rather NULLs, so you may need to pre-set your defaults or set the schema to NOT allow nulls.
You could iterate through the columns after the query is done and force defaults based on the column data types, such as:
foreach (DataColumn oDC in YourDataSet.Tables[0].Columns)
{
if (oDC.DataType.ToString().Contains("String"))
oDC.DefaultValue = "";
else if (oDC.DataType.ToString().Contains("Int32"))
oDC.DefaultValue = 0;
else if (oDC.DataType.ToString().Contains("DateTime"))
oDC.DefaultValue = DateTime.MinValue;
}
These are just three default types, but there could be others (boolean, decimal, float, and so on); just add them to the if/else and supply whatever "default" values make sense. It may help in cases where "NULL" values would otherwise get injected when adding new rows.
I get this error when I do a bulk insert with select * from [table_name] into another table:
the locale id '0' of the source column 'PAT_NUM_ADT' and the locale id '1033'
of the destination column 'PAT_ID_OLD' do not match
I tried resetting my db collation but this did not help.
Has anyone seen this error?
If you are copying less than a full set of fields from one table to another, whether that table is in another domain across the world or collocated in the same database, you just have to select them in order. SqlBulkCopyColumnMappings do not work. Yes, I tried. I used all four possible constructors, and I used them both as SqlBulkCopyColumnMapping objects and by providing the same information directly to SqlBulkCopy.ColumnMappings.Add.
My columns are named the same. If you're using a different name as well as a different order, you may well find that you have to actually rename the columns. Good luck.
I just had this error message when bulk copying some data. While it might not have been the exact same problem you were having, I was getting the same error.
Specifically, I was doing the following:
SELECT NULL AS ColumnName ...
And the destination was a nullable varchar(3).
In this case, all I needed to do was update my select statement as follows:
SELECT CONVERT(VARCHAR(3),NULL) AS ColumnName...
This worked perfectly and the error message went away!
It is true that SqlBulkCopy sometimes gives this error; the best approach is to map the columns explicitly when you are using SqlBulkCopy.
My previous code:
SqlConnectionStringBuilder cb = new SqlConnectionStringBuilder("Data Source=ServerName;User Id=userid;Password=****;Initial Catalog=Deepak; Pooling=true; Max pool size=200; Min pool size=0");
SqlConnection con = new SqlConnection(cb.ConnectionString);
SqlCommand cmd = new SqlCommand("select Name,Class,Section,RollNo from Student", con);
con.Open();
SqlDataReader rdr = cmd.ExecuteReader();
SqlBulkCopy sbc = new SqlBulkCopy("Data Source=DestinationServer;User Id=destinationserveruserid;Password=******;Initial Catalog=DeepakTransfer; Pooling=true; Max pool size=200; Min pool size=0");
sbc.DestinationTableName = "StudentTrans";
sbc.WriteToServer(rdr);
sbc.Close();
rdr.Close();
con.Close();
That code was giving me the error:
The locale id '0' of the source column 'RollNo' and the locale id '1033' of the destination column 'Section' do not match.
After adding the column mappings, my code runs successfully.
My modified code is:
SqlConnectionStringBuilder cb = new SqlConnectionStringBuilder("Data Source=ServerName;User Id=userid;Password=****;Initial Catalog=Deepak;");
SqlConnection con = new SqlConnection(cb.ConnectionString);
SqlCommand cmd = new SqlCommand("select Name,Class,Section,RollNo from Student", con);
con.Open();
SqlDataReader rdr = cmd.ExecuteReader();
SqlBulkCopy sbc = new SqlBulkCopy("Data Source=DestinationServer;User Id=destinationserveruserid;Password=******;Initial Catalog=DeepakTransfer;");
sbc.DestinationTableName = "StudentTrans";
sbc.ColumnMappings.Add("Name", "Name");
sbc.ColumnMappings.Add("Class", "Class");
sbc.ColumnMappings.Add("Section", "Section");
sbc.ColumnMappings.Add("RollNo", "RollNo");
sbc.WriteToServer(rdr);
sbc.Close();
rdr.Close();
con.Close();
This code runs successfully.
The answer by sal:
If you are copying less than a full set of fields from one table to another, whether that table is on another domain across the world, or is collocated in the same database, you just have to select them in order. SqlBulkCopyColumnMappings do not work.
is, in my experience, absolutely right! Thanks for posting it. Everything has to be the same: data types, etc. Each time it finds a mismatch it throws the mysterious locale id error, which is funny yet frustrating as h###.
I was getting the same error, and it turned out I was copying from a VARCHAR column in the DataTable to an INT column.
After I changed the data type, it worked flawlessly. I successfully copied a subset of fields, specifying the proper field mappings (mappings worked both by field name and by ordinal).
So make sure your data types are correct.
I would check what your default locale settings are. Also, you'll need to check the locale of both tables using sp_help to verify they are the same. If they aren't, you'll need to convert to the correct locale.
When you change the collation of a database, the table columns keep the old collation, so you need to drop the tables and create them again.
A great way to debug this is to take the sql query being used in your SqlBulkCopy and run it in management studio as a select-into, for instance, change select * from [table_name] to select * into newTable from [table_name], then look at the nullability and data types of 'newTable' versus 'table_name'. If there are any differences then you are likely to end up with this misleading error. Adjust the query or target table until they match, and then your command will work.
Many thanks to Deepak Dwivedi for the help. Here is one more little hack with COLLATE DATABASE_DEFAULT, which finally solved the problem for me:
SqlConnectionStringBuilder cb = new SqlConnectionStringBuilder("Data Source=ServerName;User Id=userid;Password=****;Initial Catalog=Deepak;");
SqlConnection con = new SqlConnection(cb.ConnectionString);
SqlCommand cmd = new SqlCommand("select Name COLLATE DATABASE_DEFAULT Name ,Class COLLATE DATABASE_DEFAULT Class ,Section COLLATE DATABASE_DEFAULT Section ,RollNo COLLATE DATABASE_DEFAULT RollNo from Student", con);
con.Open();
SqlDataReader rdr = cmd.ExecuteReader();
SqlBulkCopy sbc = new SqlBulkCopy("Data Source=DestinationServer;User Id=destinationserveruserid;Password=******;Initial Catalog=DeepakTransfer;");
sbc.DestinationTableName = "StudentTrans";
sbc.ColumnMappings.Add("Name", "Name");
sbc.ColumnMappings.Add("Class", "Class");
sbc.ColumnMappings.Add("Section", "Section");
sbc.ColumnMappings.Add("RollNo", "RollNo");
sbc.WriteToServer(rdr);
sbc.Close();
rdr.Close();
con.Close();