How to get SQL Server CE TableAdapter to commit to database? - sql-server

VS2008 project. Created a local database (SSCE). Created a dataset with a TableAdapter. Dragged the dataset and TableAdapter to the form so I can reference them in code.
New records are successfully added to the dataset but will not commit back to the database. There is no error or any clue as to why it won't work.
The TableAdapter insert statement was created automatically and is parameterized (@p1, @p2, etc.), but I am trying to avoid those Parameters.Add statements, as I want to be able to use a field-by-field format without having to essentially repeat the schema of my database in code. Command objects and INSERT statements work fine, but then you always have to construct an INSERT statement -- a pain if they're complicated. I want something as simple as working with an ADO recordset, but in .NET.
I know the last statement with the update is wrong without the parameters, but I'm looking for an alternative method.
What can I do to accomplish the following without Parameters.Add statements?
DocsTableAdapter1.Fill(Documents1.Docs)
Debug.Print("Starting Row Count is: " & Documents1.Docs.Count.ToString)
Dim dr As DataRow = Documents1.Docs.NewRow
dr("Name") = "John Smith"
dr("Reference") = "My new reference code"
Documents1.Docs.Rows.Add(dr)
Debug.Print("New Row Count is: " & Documents1.Docs.Count.ToString)
DocsTableAdapter1.Update(Documents1.Docs)
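For reference, the designer-generated commands already handle the parameter mapping when Update runs; a minimal sketch using the names from the question (the file-property note at the end is an assumption about a common SSCE pitfall, not something stated in the question):

```vb
' The typed TableAdapter's Update overload maps the DataTable's columns onto
' its generated @p1/@p2 parameters internally, so no Parameters.Add calls
' are needed in user code.
Dim rowsAffected As Integer = DocsTableAdapter1.Update(Documents1.Docs)
Debug.Print("Rows committed: " & rowsAffected.ToString())

' If rowsAffected looks right but the data seems gone on the next run, check
' the .sdf file's "Copy to Output Directory" property: with "Copy always",
' Visual Studio overwrites the database in bin\Debug on every build.
```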

Related

Fast way to transfer table to a remote server as import/export does

I have a stored procedure with statements similar to this:
DELETE FROM [LinkedServer].[DB1].[dbo].[Table1]
DELETE FROM [LinkedServer].[DB1].[dbo].[Table2]
DELETE FROM [LinkedServer].[DB1].[dbo].[Table3]
INSERT INTO [LinkedServer].[DB1].[dbo].[Table3]
SELECT * FROM [DB1].[dbo].[Table3]
INSERT INTO [LinkedServer].[DB1].[dbo].[Table2]
SELECT * FROM [DB1].[dbo].[Table2]
INSERT INTO [LinkedServer].[DB1].[dbo].[Table1]
SELECT * FROM [DB1].[dbo].[Table1]
This is extremely slow, even though my goal is as simple as that.
I cannot use replication, just a method to empty the tables and fill them again.
If I do the same action using the import/export functionality in SSMS, it empties the remote tables and fills them up very quickly.
Is there a way to simulate what import/export is doing, using T-SQL commands?
It would not be a problem to disable restrictions while copying data.
Perhaps there is some kind of BULK INSERT I could use in this scenario? After searching for information about this option, it seems useful for transferring data from SQL Server to a file or from a file to SQL Server, but I can't find examples of transferring from table to table.
Is there a way to simulate what import/export is doing, using Transact SQL commands?
Not quite. But try running the INSERT from the other end.
INSERT INTO [DB1].[dbo].[Table3] SELECT * FROM [LinkedServer].[DB1].[dbo].[Table3]
Or install an instance of SQL Server 2008 and backup/restore to upgrade the database; then backup/restore that to the target version.
I finally opted for @Jeroen Mostert's proposal, since no other working solution has turned up.
I have created a small command line tool that receives the parameters for the source and destination connections. I've tested it with a table and it goes just as fast as the import/export does.
Using sourceConn As New SqlConnection(sourceConnStr)
    Using destinationConn As New SqlConnection(destinationConnStr)
        Dim cM As New SqlCommand("SELECT * FROM " & sourceTable, sourceConn)
        sourceConn.Open()
        Using dR As SqlDataReader = cM.ExecuteReader()
            Dim bC As New SqlClient.SqlBulkCopy(destinationConn)
            destinationConn.Open()
            'Truncate the destination table
            cM = New SqlCommand("TRUNCATE TABLE " & destinationTable, destinationConn)
            cM.ExecuteNonQuery()
            'Bulk-copy the source rows across
            bC.DestinationTableName = destinationTable
            bC.WriteToServer(dR)
            'Close connections
            dR.Close()
            sourceConn.Close()
            destinationConn.Close()
        End Using
    End Using
End Using

T-SQL - SELECT SCOPE_IDENTITY does not work in vba code

I am working on a project which consists of transferring a few thousand Excel rows to a SQL Server (its dialect is also called T-SQL, if I'm right?) database. I put together some logic in VBA to shape up the data.
Let me give you some context first. The data I'm about to transfer are invoice files. On each row there are stock item codes, prices, the invoice number, the invoice date, the name of the client, etc. These need to be transferred to the database of a proprietary ERP system.
There are two tables on the database which I'm interested in for now. First one holds the header data for the invoice (client data, date, invoice number, invoice total etc.). Second table holds the information on the stock items (what has been sold, how many and for how much money etc).
After each insert onto the first table, I have to get the inserted row's primary key, in order to insert rows to the second table, which requires the PK of the first table on each row.
Now, my approach was to use the SCOPE_IDENTITY() function of the T-SQL. When I try to do it on the database directly via SQL Server Management Studio, it works without a hitch.
But when I try to use it in the code, it returns an empty recordset.
Code I'm using is as follows:
Public Function Execute(query As String, Optional is_batch As Boolean = False) As ADODB.Recordset
If conn.State = 0 Then
OpenConnection
End If
Set rs = conn.Execute(query) 'this is the actual query to be executed
Dim identity As ADODB.Recordset 'this rs supposed to hold the PK of inserted row, but returns an empty recordset
Set identity = conn.Execute("SELECT SCOPE_IDENTITY();")
If TypeName(identity.Fields(0).Value) = "Null" Then
pInsertedId = -1
Else
pInsertedId = identity.Fields(0).Value 'I'm saving it in an object variable, to access it easily later on
End If
Set Execute = rs 'to be returned to the caller
'closing the connection is handled outside this procedure
End Function
When I run this in VBA, the second query, SELECT SCOPE_IDENTITY();, just returns an empty recordset. The same query works successfully when run on the db directly.
Actually, I'm able to pull this off by other means. There is a UUID column which I'm supposed to insert into the row in the first table. I can simply query the table with this UUID and get the PK, but I'm just curious why this won't work.
Any ideas?
Your code doesn't insert any data, so no identity values are generated in the current scope, as defined in the official documentation for SCOPE_IDENTITY():
Returns the last identity value inserted into an identity column in the same scope. A scope is a module: a stored procedure, trigger, function, or batch. Therefore, if two statements are in the same stored procedure, function, or batch, they are in the same scope.
Your code effectively is the same as inserting data in one query window in SSMS and querying SCOPE_IDENTITY() in another query window. Well, this isn't how it works. You must query it in the same scope, i.e. a stored procedure, trigger, function, or batch. Otherwise, use ID values generated by you and insert them with the data.
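As a sketch of the single-scope approach (the table and column names here are invented for illustration): send the INSERT and the SELECT SCOPE_IDENTITY() in one conn.Execute call, so both statements run in the same batch. SET NOCOUNT ON suppresses the rows-affected result that would otherwise be returned first.

```vb
Dim sql As String
sql = "SET NOCOUNT ON; " & _
      "INSERT INTO InvoiceHeader (InvoiceNumber, ClientName) " & _
      "VALUES ('INV-001', 'ACME Ltd'); " & _
      "SELECT SCOPE_IDENTITY() AS NewId;"

Dim identity As ADODB.Recordset
Set identity = conn.Execute(sql)   'one batch, one scope
pInsertedId = identity.Fields("NewId").Value
```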

Excel - SQL Query - ## Temp Table

I am trying to create a global temp table using the results from one query, which can then be selected as a table and manipulated further several times without having to reprocess the data over and over.
This works perfectly in SQL Server Management Studio, but when I try to create the table through an Excel query, the table can be referenced at that time, but it is not created under Temporary Tables in the tempdb database.
I have broken it down into a simple example.
If I run this in SQL management studio, the result of 1 is returned as expected, and the table ##testtable1 is created in Temporary Tables
set nocount on;
select 1 as 'Val1', 2 as 'Val2' into ##testtable1
select Val1 from ##testtable1
I can then run another select on this table, even in a different session, as you'd expect. E.g.
Select Val2 from ##testtable1
If I don't drop ##testtable1, running the below in a query in Excel returns the result of 2 as you'd expect.
Select Val2 from ##testtable1
However, if I run the same SELECT ... INTO ##testtable1 query directly in Excel, it correctly returns the result of 1, but the temp table is not created.
If I then try to run
Select Val2 from ##testtable1
as a separate query, it errors saying "Invalid object name '##testtable1'".
The table is not listed within Temporary Tables in SQL management studio.
It is as if it is performing a drop on the table after the query has finished executing, even though I am not calling a drop.
How can I resolve this?
Read up on global temp tables(GTT). They persist as long as there is a session referencing it. In SSMS, if you close the session that created the GTT prior to using it in another session, the GTT would be discarded. This is what is happening in Excel. Excel creates a connection, executes and disconnects. Since there are no sessions using the GTT when Excel disconnects, the GTT is discarded.
I would highly recommend you create a normal table rather than use a GTT. Because of their temporary nature and dependence on an active session, you may get inconsistent results when using a GTT. If you create a normal table instead, you can be certain it will still exist when you try to use it later.
The code to create/clean the table is pretty simple.
IF OBJECT_ID('db.schema.tablename') IS NOT NULL
    TRUNCATE TABLE [db].[schema].[tablename]
ELSE
    CREATE TABLE [db].[schema].[tablename] (...)
GO
You can change the truncate to a delete to clean up a specific set of data and place it at the start of each one of your queries.
Is it possible you could use a view? Assuming that you are connecting to 5 DBs on the same server, can you union the data together in a view:
CREATE VIEW [dbo].[testView]
AS
SELECT *
FROM database1.dbo.myTable
UNION
SELECT *
FROM database2.dbo.myTable
Then in Excel:
Data > New Query > From Database > From SQL Server Database
Enter the DB server
Select the view from the appropriate DB - done :)
Or call the view however you are doing it (e.g. VBA etc.)
Equally, you could use a stored procedure and call that from VBA... basically anything that moves more of the complexity to the server side to make your life easier :D
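Calling such a stored procedure from VBA might look like this sketch (dbo.GetUnionedData is a hypothetical procedure name):

```vb
Dim cmd As New ADODB.Command
With cmd
    .ActiveConnection = conn            'an already-open ADODB.Connection
    .CommandType = adCmdStoredProc
    .CommandText = "dbo.GetUnionedData" 'hypothetical procedure name
End With

Dim rs As ADODB.Recordset
Set rs = cmd.Execute
Sheets("Data").Range("A2").CopyFromRecordset rs
rs.Close
```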
You can absolutely do this. Notice how I'm building a temp table from a SQL string called TmpSql ... this could be any query you want. Then I set it to recordset 1. Then I create another recordset, 2, that goes and gets the temp table data.
Imagine if you were looping on the first cn.Execute while TmpSql is changing. This allows you to build a temporary table coming from many sources or changing variables. This is a powerful solution.
cn.Open "Provider= ..."
sql = "Select t.* Into #TTable From (" & TmpSql & ") t"
Set rs1 = cn.Execute(sql)
Dim rs2 As New ADODB.Recordset
GetTmp = "Select * From #TTable"
rs2.Open GetTmp, cn, adOpenDynamic, adLockBatchOptimistic
If Not rs2.EOF Then Call Sheets("Data").Range("A2").CopyFromRecordset(rs2)
rs2.Close
rs1.Close
cn.Close

How to avoid errors when multiple Users Increment number field in SQL Server

I am using SQL Server Express 2016 and Excel VBA to generate unique lot numbers for a bunch of different Excel documents. Currently, I create a new row, then a script in SQL Server increments the lot number. Then I run a select statement to grab the field from the row that was just created. So far I have not had any issues; however, I am concerned that if the Excel file is fired by different users at the same time, the select query for one user may grab the row that another user created. Is this a concern? If so, how do I avoid it?
statement = "INSERT INTO LotInfo (RefNum, DateCreated, UserName) VALUES ('" & RefNum _
    & "','" & DateCreated & "','" & user & "')"
conn.Execute statement
Set lot = conn.Execute("SELECT TOP 1 Lot FROM LotInfo ORDER BY ID DESC;")
I don't believe that SCOPE_IDENTITY() will work.
The statements are entirely separate: first you do the insert, and that ends; then you send the select. While I'm not 100% familiar with VBA, I'm not sure that the second select will know anything about the result of the first insert.
I would suggest you create a stored procedure which you can call from VBA.
The procedure would perform the insert and then return the lot number.
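A sketch of such a procedure, using the LotInfo columns from the question (treating ID as an IDENTITY column is my assumption): because the insert and the lookup run in the same scope, concurrent callers each get back the row they themselves inserted.

```sql
CREATE PROCEDURE dbo.CreateLot
    @RefNum      varchar(50),   -- sizes/types are assumptions
    @DateCreated datetime,
    @UserName    varchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO LotInfo (RefNum, DateCreated, UserName)
    VALUES (@RefNum, @DateCreated, @UserName);

    -- SCOPE_IDENTITY() sees only this procedure's insert, so another
    -- user's simultaneous insert cannot be picked up by mistake.
    SELECT Lot FROM LotInfo WHERE ID = SCOPE_IDENTITY();
END
```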

Update ADO Recordset for a Calculated Field

I'm using the following sql for an ADO recordset against a SQLServer backend in VB6:
select c.name,
       taxid = case when exists (select 1 from sometable where fld = 'abc')
                    then c.SSN else null end
When I try to update the taxid field in a row within the recordset locally, ADO complains with the error "Multiple-step operation generated errors. Check each status value." I assume it's bothered by the fact that the taxid field comes from a calculated expression and not a raw table column. For my purposes I'm never going to persist these changes back to the database, so I'm looking for a way to tell ADO that I have no intent to persist changes so that it will allow me to change the data locally.
I think that @HK1's suggestion is a good one, though I'm not sure what happens to your ability to alter any of the values in the recordset, whether the column you're trying to update is computed or not. It's been a long time since I played with classic ADO, but if the recordset is disconnected it may become read-only at that point.
But if you have no interest in using the recordset to perform updates, and you need to alter values locally, perhaps you should consider storing the results in a local array first? That way you can minimize the locking and cursor options of the recordset, for example, and immediately close the recordset and free up those resources.
rs.Open cmd, conn, adOpenForwardOnly, adLockReadOnly
Dim MyArray As Variant
MyArray = rs.GetRows()
rs.Close: Set rs = Nothing
Now you can manipulate MyArray however you want...