How to insert, update & delete data in the database in PowerBuilder - Sybase

Can anyone please help me with how to insert data into the database from a window form, how to fetch data to show on the window form, and likewise how to update data in the database? I am looking for code that contains the SQL query within the code, not from the quick select DataWindow. I am very new to PowerBuilder. I want to write code that fetches and updates data from anywhere in the code and shows it anywhere.
Thanks

I'm not quite sure about your question. Try going to this website: http://powerbuilder.hyderabad-colleges.com.
Look for the DataWindow control and DataWindow object topics.
There are other ways to manipulate data in PowerBuilder, like using embedded SQL (stored procedures and cursors).
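For example, here is a minimal embedded SQL sketch; the table and column names (customer, id, name) are made up for illustration, and SQLCA is assumed to be already connected:

// host variables
string ls_name
long ll_id = 1

// fetch a single value into a host variable
SELECT name INTO :ls_name FROM customer WHERE id = :ll_id USING SQLCA;

// insert, update and delete work the same way
INSERT INTO customer (id, name) VALUES (:ll_id, :ls_name) USING SQLCA;
UPDATE customer SET name = :ls_name WHERE id = :ll_id USING SQLCA;
DELETE FROM customer WHERE id = :ll_id USING SQLCA;

// always check sqlca.sqlcode after each statement
if sqlca.sqlcode = 0 then
    commit using sqlca;
else
    rollback using sqlca;
end if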
I hope this will help you.

The whole point of the DataWindow is that it does all that work for you.
Retrieve data:
dw_1.Retrieve(arguments)
Update the database:
dw_1.Update()

I'm not understanding the question entirely; you must be having trouble with a multi-table update, as they can be challenging for a new developer.
This will do an update into two tables. I did it in a hurry, so there might be a syntax error or two.
// insert a row
long li_row, li_rowcount
integer li_rtn

li_row = dw_1.InsertRow(0)
dw_1.SetItem(li_row, 'col1', 'try reading')
dw_1.SetItem(li_row, 'col2', 'the PowerBuilder')
dw_1.SetItem(li_row, 'col3', 'manual next time')
// AcceptText call left out for purposes of brevity
// Update the first table; don't bother with another AcceptText
// since we've already done one, and don't reset the update flags,
// so the second half of the update creates the correct SQL statement
li_rtn = dw_1.Update(false, false)
if li_rtn = 1 then
    // switch the updateable table from tbl1 to tbl2
    dw_1.Modify('tbl1_col1.Update = No')
    dw_1.Modify('tbl1_col2.Update = No')
    dw_1.Modify('tbl1_col3.Update = No')
    dw_1.Modify('tbl1_id.Key = No')
    dw_1.Modify("DataWindow.Table.UpdateTable = 'tbl2'")
    dw_1.Modify('tbl2_col1.Update = Yes')
    dw_1.Modify('tbl2_col2_id.Key = Yes')
    li_rtn = dw_1.Update(false, true)
    if li_rtn = 1 then
        commit using sqlca;
    else
        rollback using sqlca;
    end if
end if
// clean up the temp recs
li_rowcount = dw_1.RowCount()
for li_row = li_rowcount to 1 step -1
    dw_1.DeleteRow(li_row)
next
dw_1.Update()

Related

Entity Framework Insert Into Table With AFTER INSERT Trigger

I am working on a Web API with Entity Framework 6 that does a "bulk" insert of under 500 records at any given time into a Microsoft SQL Server table. The DbContext.SaveChanges() method will insert all the records into the table in a couple of seconds, so I have no issues with that. However, when the method is called to insert the same number of records into the same table with a semi-extensive trigger attached to it, the process can take many minutes. The trigger has some calls to table joins, inserts into other tables, and then deletes the newly inserted record.
I do not have much control of the table or the trigger, so I am looking for suggestions on how to improve performance. I made a suggestion to move the trigger to a stored procedure and have the trigger call the stored procedure, but I am uncertain if that will achieve any gains.
EDIT: Since my question was kind of generic, I will post some of my code in case it helps. The SQL is not mine, so I will see what I can actually post.
Here is the part of my Web API method that does the call to SaveChanges():
string[] stringArray = results[0].Split(new[] { "\r\n", "\r", "\n" }, StringSplitOptions.None);
var profileObjs = db.Set<T_ProfileStaging>();
foreach (var s in stringArray)
{
    string[] columns = s.Split(new[] { ",", "\t" }, StringSplitOptions.None);
    if (columns.Length == 9) // nine fields are consumed below
    {
        T_ProfileStaging profileObj = new T_ProfileStaging();
        profileObj.CompanyCode = columns[0];
        profileObj.SubmittedBy = columns[1];
        profileObj.VersionName = columns[2];
        profileObj.DMName = columns[3];
        profileObj.Zone = columns[4];
        profileObj.DMCode = columns[5];
        profileObj.ProfileName = columns[6];
        profileObj.Advertiser = columns[7];
        profileObj.OriginalInsertDate = columns[8];
        profileObjs.Add(profileObj);
    }
}
try
{
    db.SaveChanges();
    return Ok();
}
catch (Exception)
{
    return Content(HttpStatusCode.BadRequest, "SQL Server Insert Exception");
}
When you load with SaveChanges(), EF sends each row in a separate INSERT statement, so even a statement-level trigger will run once for each row.
To work around this you need to either:
use a bulk-load API from the client instead of EF's SaveChanges(): SqlBulkCopy directly, or one of the many EF extensions that wrap it;
or
configure EF to insert into a different table and then INSERT ... SELECT into the target table.
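At the SQL end, the second option would look something like this (a sketch only; T_ProfileStaging is the table from the question, while T_Profile is a made-up name for the triggered target table):

-- One set-based statement, so the trigger fires once for the whole batch
-- instead of once per row
INSERT INTO dbo.T_Profile (CompanyCode, SubmittedBy, VersionName, DMName,
                           Zone, DMCode, ProfileName, Advertiser, OriginalInsertDate)
SELECT CompanyCode, SubmittedBy, VersionName, DMName,
       Zone, DMCode, ProfileName, Advertiser, OriginalInsertDate
FROM dbo.T_ProfileStaging;

-- clear the staging table for the next batch
DELETE FROM dbo.T_ProfileStaging;

This only helps if the trigger logic is written (or can be rewritten) to handle multi-row inserts.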

Same data is inserted during insert

I have a couple of insert queries which are merged in a transaction. The first of those inserts creates a new product article number by incrementing the highest one in the table by one. Unfortunately, I just noticed during tests that if two users from two different applications click the button which triggers my transaction's method, they can get the same new product number. How can I avoid that situation? Is there something like a lock on the first insertion, so that if the first user accesses the table to insert, other users are restricted and have to wait in a queue until the first user's insert is finished? Is there something like that? Besides, I thought that while someone inserts, other users are not able to insert. I made comments in the code for you to understand.
Part of my transaction query below:
Public Sub ProcessArticle(ByRef artikel As ArticlesVariations)
    Dim strcon = New AppSettingsReader().GetValue("ConnectionString", GetType(System.String)).ToString()
    Dim articleIndex As Integer
    Dim newArticleRowId As Integer
    Using connection As New SqlConnection(strcon)
        connection.Open()
        Using transaction = connection.BeginTransaction()
            Try
                For Each kvp As KeyValuePair(Of Integer, Artikel) In artikel.collection
                    articleIndex = kvp.Key
                    Dim art As Artikel = kvp.Value
                    Using cmd As New SqlCommand("INSERT INTO tbArtikel (Nummer) VALUES (@Nummer); SELECT SCOPE_IDENTITY()", connection, transaction)
                        cmd.CommandType = CommandType.Text
                        'Get next product number from table tbArtikel (this will be the new product number)'
                        Dim NewArtNummer As String = New DALArtikel().GetNewArtikelNumber(transaction)
                        art.Nummer = NewArtNummer
                        cmd.Parameters.AddWithValue("@Nummer", art.Nummer)
                        'Get the inserted product id for the other inserts below'
                        newArticleRowId = CInt(cmd.ExecuteScalar())
                        '... other INSERT queries to other tables ...'
                    End Using
                Next
                transaction.Commit()
            Catch ex As Exception
                transaction.Rollback()
                Throw 'Rethrow the exception.'
            End Try
        End Using
    End Using
End Sub
Just about the only way to assure that users are not assigned the same values is to issue them from the server when the row is inserted. That is the entire premise behind the server issuing auto-increment (AI) values for PKs.
BUT since your thing is a multi-segment "numeric string", that presents a problem. Rather than tearing the string apart to find the Max()+1 for one segment with a WHERE clause on parts of the string, consider something like this:
Start with a table used to increment and issue the values:
{DocId Int, SegmentB int, SegmentC Int}
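As DDL, that could look like this (a sketch; the procedure below refers to the table as articlecode with a MainId column, so those names are used here):

CREATE TABLE articlecode (
    MainId   INT NOT NULL PRIMARY KEY,  -- the DocId
    SegmentB INT NOT NULL,
    SegmentC INT NOT NULL
);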
This will simply track the values to use in the other table. Then a stored procedure to create/increment a new code (MySQL - this is a conceptual answer):
CREATE DEFINER=`root`@`localhost` PROCEDURE `GetNextProductCode`(in docId int,
    in Minr int,
    in Rev int
)
BEGIN
    SET @maxR = 0;
    SET @retCode = '';
    if Minr = -1 then
        Start transaction;
        SET @maxR = (SELECT Max(SegmentB) FROM articlecode WHERE MainId = docId) + 1;
        UPDATE articlecode SET SegmentB = @maxR WHERE MainId = docId;
        Commit;
        Select concat(Cast(docId As char), '.',
                      Cast(@maxR AS char), '.',
                      Cast(Rev As char)
                     );
    end if;
END
This is a rough idea of the process. As such, it only works on the second segment (I don't know what happens when you create a NEW SegmentB - does SegmentC reset to 1?). The idea is:
pass numbers so there is no need to tear up a string
pass -1 for the segment you need the next value for
the sp gets the Max()+1 and updates the counter table so the next user will get a new value
If for some reason you end up not saving the row, there will be gaps
the sp uses a transaction (it probably only needs to protect the update; see the sketch after this list) so that only 1 update can happen at a time
returns the new code. It could just return 2 values, but you're going to glue them together anyway
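As an aside on that transaction point: the SELECT Max()/UPDATE pair could also be collapsed into one atomic statement using a common MySQL counter idiom, which removes the race without an explicit transaction (a sketch, assuming the same articlecode table):

UPDATE articlecode SET SegmentB = LAST_INSERT_ID(SegmentB + 1) WHERE MainId = docId;
-- each session then reads back its own incremented value
SELECT LAST_INSERT_ID();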
There is much To Do:
It only does SegmentB
For a NEW DocId (-1), insert a new row with 1000 and 1(?) defaults
Same for a NEW segmentB (whatever it is): insert a new row for that DocId with default values
To get a new code before you insert a row:
'dbcon is the MySqlConnection assumed to exist elsewhere in the code'
Dim cmd As New MySqlCommand("GetNextProductCode", dbcon)
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.Add("docId", MySqlDbType.Int32).Value = 3
cmd.Parameters.Add("Minr", MySqlDbType.Int32).Value = -1
cmd.Parameters.Add("Rev", MySqlDbType.Int32).Value = 1
dbcon.Open()
Using rdr = cmd.ExecuteReader()
    rdr.Read()
    Console.WriteLine(rdr(0)) 'the new code returned by the procedure'
End Using
The obvious downside is that each insert requires you to hit the DB in order to...well save to the DB. If they were int values it could be a Trigger.
I'm a SQL developer and my VB skills are about fifteen years out of date, but instead of creating the incremented number yourself in VB just let SQL generate them with an IDENTITY field. SQL will never allow duplicates and then you just need to return the SCOPE_IDENTITY():
ALTER TABLE dbo.tbArtikel
ADD [ArtikelID] INT IDENTITY(1,1) PRIMARY KEY;
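Then the insert can read the generated key back in the same batch (a sketch, reusing the tbArtikel table from the question):

INSERT INTO dbo.tbArtikel (Nummer) VALUES (@Nummer);
-- SCOPE_IDENTITY() is scoped to this session and batch, so two concurrent
-- users can never read each other's value
SELECT SCOPE_IDENTITY();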
I have two suggestions:
First suggestion: move your code to a stored procedure; this way all your users will execute the same transaction, where you can set the isolation level the way you want.
Second suggestion: I would create a unique index on your field Nummer. This way, when someone tries to insert a duplicate value, it will raise an error that I can deal with by telling the user to retry the same operation, or by retrying it automatically.
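A minimal sketch of that index, using the table and column names from the question:

-- a second insert with the same Nummer now fails (error 2601),
-- which the calling code can catch and retry
CREATE UNIQUE INDEX UX_tbArtikel_Nummer ON dbo.tbArtikel (Nummer);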
Trying to lock the record or the table for your operation is not advisable; however, you can check this article on CodeProject, where you might find what you are looking for. Make sure that you provide a mechanism for releasing all locks if your program stops in the middle of the transaction.

SQL SERVER: Loading data all at ONCE or Checking ONE by ONE?

Which one could be a better practice? In my situation, I need to check if a specific piece of data exists in a table. I am iterating through an Excel file and verifying, using VB.NET, whether a code from it exists in my table. I have two options to do this (or if there is a better way, I am open to suggestions).
First is to check it one by one, this code is executed per loop:
SQL = "SELECT TOP 1 * FROM Table1 WHERE Code = '" & codeFromExcel & "'"
rs = dbConn.Execute(SQL)
If Not rs.EOF Then
isFound = True
Else
isFound = False
End If
The other one is to load all the codes into a List(Of String):
Dim myList As New List(Of String)()
rs = dbConn.Execute("Select Code from Table1")
If Not rs.EOF Then
    Do While Not rs.EOF
        myList.Add(rs.Fields("Code").Value.ToString)
        rs.MoveNext()
    Loop
End If
Then check whether each record is in the List(Of String) while iterating through the Excel file:
If myList.Contains(codeFromExcel) Then
    isFound = True
Else
    isFound = False
End If
I've been working with this kind of stuff most of the time, and I want to know which one is the most efficient way. At the moment I only have a few records in my database, but I want my code to be ready and efficient for when the time comes that I have to deal with numerous records. Thanks in advance!
Additional info: The data doesn't need to be "fresh" as that table is meant for one-time entry only.
Personally, I prefer to open as few connections to the database as possible.
So:
If the table is not very large (a few hundred rows), I would go with the "cache" option.
Generally:
I would gather all the Excel codes in a list (excelCodes).
Then I would query something like Select Distinct Code from Table1 Where Code In (excelCodesList) and store the result in a second list (foundCodes).
Then I would compare these lists.
I tested it on a table with 6,143,993 rows.
Selecting just one column (description) to "cache" took 1'29".
On the other hand, a query like:
select distinct description from ItemDetail where description in ( 'abc','cddd','xxx' )
took 0'58".
UPDATE
An index on the Code column might help with performance.
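For example (a sketch, assuming the Table1/Code names from the question):

CREATE INDEX IX_Table1_Code ON Table1 (Code);

With that index, the IN (...) query can seek on Code instead of scanning the whole table.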

Save huge XML from SQL to web

In SQL Server I have a function which generates a complex XML of all products, with several tables joined: location, suppliers, orders, etc.
No problem there; it runs in 68 seconds and produces around 450 MB.
It should only be called occasionally, during migration to another server, so it doesn't matter that it takes some time.
I want to make this available for download over the web server.
I've tried some variations of this in classic asp:
Response.Buffer = false
set rs=conn.execute("select cast(dbo.exportXML() as varchar(max)) as res")
response.write rs("res")
But I just get a standard
An error occurred on the server when processing the URL. Please contact the system administrator.
If you are the system administrator please click here to find out more about this error.
Not my usual custom 500 error handler, so I'm not sure how to find the error.
The problem is in response.write rs("res"); if I just do
temp = rs("res")
the script runs, but displays nothing of course; if I then
response.write temp
I get the same failure.
So the problem is writing such a long string.
Can I save the file from T-SQL directly and run the job periodically from SQL Agent?
I found that there seems to be a limit on how much data can be written at once using Response.Write. The workaround I used was to break the data into chunks like this:
Dim Data, Done
Done = False
Do While Not Done
    Data = RecordSet(0).GetChunk(8192)
    If IsNull(Data) Or Len(Data) = 0 Then
        Done = True ' no more data to send
    Else
        Response.Write Data ' stream this chunk to the client
    End If
Loop
Try this:
Response.ContentType = "text/xml"
rs.CursorLocation = 3
rs.Open "select cast(dbo.exportXML() as varchar(max)) as res",conn
'Persist the Recordset in XML format to the ASP Response object.
'The constant value for adPersistXML is 1.
rs.Save Response, 1

LINQ to SQL testing stored procedures - call to procedure, verify and rollback in one transaction

I'm trying to use LINQ to SQL for integration testing of stored procedures. I want to call an updating stored procedure and, after that, retrieve the updated row from the DB to verify the change. All this should happen in one transaction so that I can roll back the transaction after the verification.
The code fails in the assert, because the row I retrieved does not seem to be updated. I know that my SP works when called from ordinary code. Is it even possible to see the updated row in the same transaction?
I'm using SQL Server 2008 and used sqlmetal.exe to create the LINQ to SQL mapping.
I've tried many different things, and right now my code looks following:
DbTransaction transaction = null;
try
{
    var context =
        new DbConnection(
            ConfigurationManager.ConnectionStrings["MyConnectionString"].ConnectionString);
    context.Connection.Open();
    transaction = context.Connection.BeginTransaction();
    context.Transaction = transaction;

    const string newUserName = "TestUserName";
    context.SpUpdateUserName(136049, newUserName);
    context.SubmitChanges();

    // select to verify
    var user =
        (from d in context.Users where d.NUserId == 136049 select d).First();
    Assert.IsTrue(user.UserName == newUserName);
}
finally
{
    if (transaction != null) transaction.Rollback();
}
I believe you are coming across a stale DataContext issue.
Your update is done through a stored procedure, so your context does not "see" the changes and has no way to update the Users.
If you use a new DataContext to do the assert, it usually works well. However, since you are using a transaction, you probably have to add the second DataContext to the same transaction.
