First of all, I am working in VB 2012.
I have a problem with searching my database. It is very slow; actually, filling the ListView is what bothers me.
I have a text box with a TextChanged event for instant search. As soon as I start typing in that text box, it filters the database and fills the ListView with the results.
This is the code for the text box and the Load procedure:
Private Sub txtID_TextChanged(sender As Object, e As EventArgs) Handles txtID.TextChanged
    Load("SELECT * FROM table WHERE id LIKE '" & txtID.Text & "%'")
End Sub

Private Sub Load(ByVal strQ As String)
    List.Items.Clear()
    cmd = New SqlClient.SqlCommand(strQ, con)
    dr = cmd.ExecuteReader()
    If dr.HasRows = True Then
        While dr.Read
            Dim X As ListViewItem
            X = List.Items.Add(dr(0))
            X.SubItems.Add(dr(2))
            X.SubItems.Add(dr(3))
            X.SubItems.Add(dr(4))
            X.SubItems.Add(dr(1))
            X.SubItems.Add(dr(5))
        End While
    End If
End Sub
So every time I type a letter it calls the Load procedure, and since there is a lot of data it is very slow. Can you help me somehow? Is there any solution?
I don't know how you're ever going to speed that up as written. Connecting to and querying the database is a lot of overhead, especially compared to the speed of pressing a key or typing a word. There's just no way to do this on every keystroke without severely affecting the user who's typing.
What I suggest instead is that you wait for the user to tell you they're done typing before you make a search. If you're trying to do fancy auto-completion, you're going to need to cache the data a lot closer to the app than the database.
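One common way to "wait until they're done" is to debounce the keystrokes with a WinForms Timer, so the query only fires after a short pause in typing. A minimal sketch, assuming a form-level timer named SearchTimer and a RunSearch routine you would write yourself:
'Minimal debounce sketch: restart a short timer on every keystroke and
'only run the search once the user has paused. SearchTimer and RunSearch
'are assumed names; wire them up to fit your form.
Private WithEvents SearchTimer As New System.Windows.Forms.Timer()

Private Sub txtID_TextChanged(sender As Object, e As EventArgs) Handles txtID.TextChanged
    SearchTimer.Stop()          'reset the countdown on every keystroke
    SearchTimer.Interval = 400  'wait roughly 400 ms of silence before searching
    SearchTimer.Start()
End Sub

Private Sub SearchTimer_Tick(sender As Object, e As EventArgs) Handles SearchTimer.Tick
    SearchTimer.Stop()
    RunSearch(txtID.Text)       'your actual query/filter goes here
End Sub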
You need to create a class to hold the search results, like this:
Public Class SearchResult
    Private _propID As String
    Public Property PropID() As String
        Get
            Return _propID
        End Get
        Set(value As String)
            _propID = value
        End Set
    End Property

    Private _propName As String
    Public Property PropName() As String
        Get
            Return _propName
        End Get
        Set(value As String)
            _propName = value
        End Set
    End Property

    '... one property per column you display
End Class
Now you query the database to get all the results to display in the list view, storing it in a List(Of SearchResult), like this:
Private Function Load(ByVal strQ As String) As List(Of SearchResult)
    Dim ListOfResults As New List(Of SearchResult)
    cmd = New SqlClient.SqlCommand(strQ, con)
    dr = cmd.ExecuteReader()
    If dr.HasRows Then
        While dr.Read()
            Dim X As New SearchResult()
            X.PropID = dr(0).ToString()
            X.PropName = dr(1).ToString()
            '... copy the remaining columns the same way
            ListOfResults.Add(X)
        End While
    End If
    dr.Close()
    Return ListOfResults
End Function
You call this code once, up front (for example in the form's Load event), storing the result in a form-level field (AllSearchResults) so it acts as your cached copy of everything:
AllSearchResults = Load("SELECT * FROM table")
Now when you want to do a search you can apply the following LINQ query against that cached list (AllSearchResults), like this:
Public Function DoSearch(searchText As String) As List(Of SearchResult)
    Return (From s In AllSearchResults Where s.PropID.Contains(searchText) Select s).ToList()
End Function
Finally, you can call this LINQ filtering on each key press by the user and rebuild the ListView from the (much smaller) filtered list:
Private Sub txtID_TextChanged(sender As Object, e As EventArgs) Handles txtID.TextChanged
    Dim results = DoSearch(txtID.Text)
    'repopulate the ListView from results (see the sketch below)
End Sub
One way to cut the overhead down by a fair degree is to wait to issue the query until the length of the text entered is 3 (or better yet 5) characters. It is very, very unlikely that listing every customer (or whatever these are) starting with 'S' is going to be meaningful or helpful to anyone except the rare person looking for "Sab....". Users will already be typing the 2nd and 3rd characters before the first query has even completed and displayed!
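As a sketch of that guard (the 3-character threshold is only an example, and the call mirrors the Load sub you already have):
'Skip the query entirely until there is enough text to narrow the results.
Private Sub txtID_TextChanged(sender As Object, e As EventArgs) Handles txtID.TextChanged
    If txtID.Text.Length < 3 Then
        Exit Sub    'too little to filter on; wait for more input
    End If
    Load("SELECT * FROM table WHERE id LIKE '" & txtID.Text & "%'")
End Sub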
To try to make a really bad idea less bad, I'd look into a way to issue the query ONCE (on the first character, if I really, really had to), then filter those results down on subsequent keystrokes (a la Aaron's "caching the data a lot closer to the app").
The next thing is SELECT *. I don't know what's in the table, but do you REALLY need every column? This appears to be some sort of pick list; do you really need 6 fields to give the user enough information to make a selection? If there are more than 6 columns in the table, immediately narrow the query to the 6 used in the ListView. Once they make their choice you can go back and get exactly what you need by ID or whatever.
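For example, a narrowed and parameterized version of the query might look like this; the column names are made up, so substitute the six actually shown in your ListView:
'Hypothetical column names: list only what the ListView actually shows,
'and pass the search text as a parameter instead of concatenating it.
Dim sql As String = "SELECT id, name, city, phone, balance, lastorderdate " &
                    "FROM table WHERE id LIKE @search + '%'"
Dim searchCommand As New SqlClient.SqlCommand(sql, con)
searchCommand.Parameters.AddWithValue("@search", txtID.Text)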
I'd personally use a faster control, but that's subjective.
ALL databases eventually acquire dormant data. Customers (or whatever) that were one-time shoppers and never returned: do they need to be in the list? If there is a column somewhere like lastorderdate or lastupdateddate, construct your query to pick only those active in the last XX months (and if there isn't one, see if you can add it, because the issue is not going to get better as the database gets larger!). Then add a checkbox for the user to widen the range as needed, something like "See All". Users are not likely to balk at the idea if it speeds things up the other 80% of the time.
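A rough sketch of building that query around a "See All" checkbox; chkSeeAll, the 18-month window, and the lastorderdate column are all assumptions to adjust to your schema:
'Assumed names: chkSeeAll checkbox and a lastorderdate column.
'Default to recent customers; let the user opt in to the full history.
Dim sql As String = "SELECT id, name, city, phone, balance, lastorderdate " &
                    "FROM table WHERE id LIKE @search + '%'"
If Not chkSeeAll.Checked Then
    sql &= " AND lastorderdate >= DATEADD(month, -18, GETDATE())"
End If
Dim searchCommand As New SqlClient.SqlCommand(sql, con)
searchCommand.Parameters.AddWithValue("@search", txtID.Text)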
...those are just off the top of my head.
I'm doing a project where I need to read data from a SQL table (called Table_IDs_Names). In that table I have to read the column Variable_Name and get every row into an array (called Names_Array in the code). I'm trying the following code, and I get the values with the reader, but how can I put them into an array?
It's quite important, so I hope you can help me with this.
Public SQL_Connection As SqlConnection
Public SQL_Command As SqlCommand
Public SQL_Connection_String As String
Public Names_Array() As String

SQL_Connection_String = "---------------------------"
SQL_Connection = New SqlConnection(SQL_Connection_String)
SQL_Connection.Open()

Dim SQL_Statement_Array As String = "SELECT Variable_Names From Table_IDs_Names"
SQL_Command = New SqlCommand(SQL_Statement_Array, SQL_Connection)
Dim Reader As SqlDataReader
Dim i As Integer

Reader = SQL_Command.ExecuteReader()
While Reader.Read()
    Console.WriteLine(Reader("Variable_Name").ToString().ToArray())
    Names_Array(i) = Reader("Variable_Name").ToString().ToArray()
    i = i + 1
    Console.WriteLine("PROBANDO {0}", Names_Array(i))
End While
SQL_Command.Dispose()
Several things here can trip up people who are new to this:
1. Do NOT try to re-use the same connection object. It interferes with a feature called connection pooling and will end up creating bottlenecks, making things slower, and causing you to use more memory, not less.
2. Do put your data access into its own class or module, separate from your UI and business logic, but only re-use the connection string within this module, not the full connection. This class/module will have a separate method for each query or operation you want to run.
3. Do put these short-lived connections in a Using block to make sure they are disposed correctly.
4. Arrays have a very specific meaning in formal computer-science terms. Many languages define arrays in a more colloquial sense, but that is not true for .NET: when you have an array in .NET, you have a real array in the full formal definition. These formal arrays are rarely the right tool for modern work. You almost always want a generic List instead, or even the raw data-access objects like IDataReader or DataTable provided by ADO.NET. Databinding is also an option. Arrays are just bad for this, m'kay? Don't conflate them with other collections.
5. Be sure to always use parameterized queries, and NEVER string concatenation, to build your SQL statements. I don't see evidence you missed on this one, but it's important enough to make sure it's listed.
6. Similar to #5 (too important to overlook, even if it's not relevant to the question), NEVER store passwords in your database. To use your database to support authentication, instead salt new passwords with a unique nonce value, and then hash the result with a secure cryptographic hash a la BCrypt. You can save the resulting hash. When someone tries to log in, you do the same operations on the attempted password and then compare the hashes; NEVER compare passwords directly.
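As a rough illustration of that hash-and-verify flow (this assumes the BCrypt.Net NuGet package; SaveHashToDatabase and LoadHashFromDatabase are hypothetical data-access calls standing in for your own):
'Sketch only; assumes the BCrypt.Net package. BCrypt generates and embeds
'its own salt, so you only ever store the resulting hash, never the password.
Dim hash As String = BCrypt.Net.BCrypt.HashPassword(newPassword)
SaveHashToDatabase(userName, hash)                          'hypothetical call

'Checking a login attempt: verify the attempt against the stored hash.
Dim storedHash As String = LoadHashFromDatabase(userName)   'hypothetical call
If BCrypt.Net.BCrypt.Verify(attemptedPassword, storedHash) Then
    'password is correct
End If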
All that out of the way, we can look at some code:
Imports System.Data
Imports System.Data.SqlClient

Public Module DB
    Private ConnectionString As String = "---------------------------"

    'I'm extending this to filter by TableID, just so I can demonstrate a parameterized query.
    'Members of a Module are implicitly Shared, so no Shared keyword is needed here.
    Public Iterator Function GetIDVariables(TableID As Integer) As IEnumerable(Of String)
        Dim SQL As String = "SELECT Variable_Names FROM Table_IDs_Names WHERE TableID = @TableID"
        Using cn As New SqlConnection(ConnectionString),
              cmd As New SqlCommand(SQL, cn)

            cmd.Parameters.Add("@TableID", SqlDbType.Int).Value = TableID
            cn.Open()
            Using rdr As SqlDataReader = cmd.ExecuteReader()
                While rdr.Read()
                    Yield DirectCast(rdr("Variable_Names"), String)
                End While
            End Using
        End Using
    End Function
End Module
And then in other code:
Dim data = DB.GetIDVariables(12345)
For Each variable As String In data
    Console.WriteLine($"PROBANDO {variable}")
Next
John suggested that you can use a List of String, which comes from System.Collections.Generic.
Imports System.Collections.Generic
'... rest of the code

'declare a list of strings to store the results
Dim lst As New List(Of String)

'iterate over the query results
While Reader.Read()
    'add each value to the list
    lst.Add(Reader("Variable_Name").ToString())
End While
You can then use the List of String named lst in a loop, or convert it to an array with lst.ToArray().
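For instance, after the loop you could finish the conversion like this (reusing the asker's Names_Array variable):
'Convert the populated list into the string array the original code wanted.
Names_Array = lst.ToArray()
Console.WriteLine("PROBANDO {0}", String.Join(", ", Names_Array))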
Note: I haven't written VB.NET in a very long time.
I've got a WBS (Work Breakdown Structure) with multiple rows (the top level of a group outline), and each top-level row is an activity. Directly under the activity are the roles involved.
Based on the value of the activity in the top level ("plan", for example), the cells in the level below are populated according to their values in a related table on another sheet (the "defaults" tab).
Currently, the rows under the activity (which correspond to roles) each do an ugly INDEX/MATCH lookup, which, multiplied by 25 roles, can grind the spreadsheet to a halt.
What I think will solve this is taking the Role Defaults table, putting it in a persistent array, and reusing the values in that array as the user enters the top-level activities. I just can't figure out how to make the array persistent (so the VBA doesn't repopulate it every time a user changes a cell). If the values in the Role Defaults table change, I can handle that with a Worksheet_Change event, so that's not an issue.
Row 3 "Activity 1" is what the Activity Rows look like with the group outline collapsed.
Rows 4-9 are what the Activity Rows look like with the group outline expanded, showing the underlying roles.
For each of the roles, this is the table on another tab that's used to look up the value that should be in the corresponding Activity/Role cell on the WBS tab.
I'm a proponent of using Dictionary objects whenever the need for lookups arises. In my solution below, I use nested dictionaries to return a combination of Top-Level and Activity. (Note: I tried to understand your business need as best I could, but I'm sure I didn't nail it. I also assumed some knowledge of VBA above a beginner's level. If you have follow-up questions, please ask and we'll try to help.)
First, create a new module to hold the globally available Dictionary. This cannot be a Worksheet module. (In the VBE, go to Insert --> Module). At the very top of the module, before creating a subroutine, declare a publicly available Dictionary
Public oDictWbs As Object
We only want one instance of this dictionary, so I like to use a Singleton-like pattern that returns the Dictionary if it has already been created, and otherwise creates and returns a new one. (Note: I factored the routine that builds a new dictionary out into RefreshWBS so it can be reused to rebuild the dictionary based on your business rules. So, for example, in the Default worksheet's Worksheet_Change event you can call RefreshWBS [code reuse is always fun].)
Private Function GetWBS() As Object
    If Not oDictWbs Is Nothing Then
        Set GetWBS = oDictWbs
        Exit Function
    End If
    Set GetWBS = RefreshWBS()
End Function
Private Function RefreshWBS() As Object
    Dim sDefault As Worksheet
    Dim rTopLevels As Range
    Dim rActivities As Range
    Dim rIterator As Range
    Dim rInnerIter As Range

    Set oDictWbs = Nothing

    'Both ranges below establish where the fixed info lives (the Default worksheet).
    'Instead of hard-coding the ranges, build your own logic based on your needs and rules.
    Set sDefault = Sheets("Default")
    Set rTopLevels = sDefault.Range("B1:C1")
    Set rActivities = sDefault.Range("A3:A4")

    Set oDictWbs = CreateObject("Scripting.Dictionary")
    For Each rIterator In rTopLevels
        If Not oDictWbs.exists(rIterator.Value) Then
            Set oDictWbs(rIterator.Value) = CreateObject("Scripting.Dictionary")
        End If
        For Each rInnerIter In rActivities
            If Not oDictWbs(rIterator.Value).exists(rInnerIter.Value) Then
                oDictWbs(rIterator.Value)(rInnerIter.Value) = sDefault.Cells(rInnerIter.Row, rIterator.Column).Value
            End If
        Next rInnerIter
    Next rIterator

    Set RefreshWBS = oDictWbs
End Function
Finally, we create a function that can be accessed from within the Worksheet itself, allowing the user to access information in the WBS Dictionary. You can enter into an Excel cell a function like =GetWbsActivityTime(B1, A4) presuming that cell B1 contains the top-level descriptor and A4 describes the activity. So long as that value is in the dictionary, it will return the value associated with it.
Function GetWbsActivityTime(sTopLevel As String, sActivity As String) As Variant
    Dim oDict As Object
    Set oDict = GetWBS()
    If Not oDict.exists(sTopLevel) Then
        GetWbsActivityTime = CVErr(xlErrRef)
        Exit Function
    End If
    If Not oDict(sTopLevel).exists(sActivity) Then
        GetWbsActivityTime = CVErr(xlErrRef)
        Exit Function
    End If
    GetWbsActivityTime = oDict(sTopLevel)(sActivity)
End Function
I know it's a lot to absorb, so review it and let me know of any questions or quirks with which I can help. Also, if I totally missed the point of the exercise, let me know and I'll see if we can salvage parts of the solution.
Please see the code below:
Public Class Form1

    Private _ConString As String

    Private Sub Form1_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
        Dim objDR As SqlDataReader
        Dim objCommand As SqlCommand
        Dim objCon As SqlConnection
        Dim id As Integer

        Try
            _ConString = ConfigurationManager.ConnectionStrings("TestConnection").ToString
            objCon = New SqlConnection(_ConString)
            objCommand = New SqlCommand("SELECT * FROM Person")
            objCommand.Connection = objCon
            objCon.Open()
            objDR = objCommand.ExecuteReader(CommandBehavior.CloseConnection)
            Do While objDR.Read
                ProcessPerson(objDR("URN"))
            Loop
            objDR.Close()
        Catch ex As Exception
            Throw
        Finally
        End Try
    End Sub
End Class
Say there are one million records in the Person table and it takes 24 hours to run. Say I deleted the Person table, or updated its data significantly, halfway through. Would it still process the one million records? Please assume that ProcessPerson does not use the Person table.
I have spent some time Googling this but I have not found an answer.
The best way to find out whether something will happen under a certain condition is simply to test it. That beats all documentation, since occasionally documentation (whether it be "official" MSDN and/or TechNet pages, blog posts, etc.) is incorrect. You already have the app code written, so it is a rather simple matter to throw 1-2 million rows into the [Person] table. It should be fine if all fields outside of the PK field are the same, since you are just testing for a simple effect. Or, make sure that at least one field has unique values to make it easier to see if something is missing. Then just run the code that you have posted here, delete a few targeted rows, and see whether or not they end up in your collection.
After you do that test, look into enabling Snapshot Isolation, as it is designed to help with this. If this process might take 24 hours then you probably shouldn't be wrapping it in a transaction (well, if you want to keep your job ;-) ).
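For reference, here is a minimal sketch of what reading under snapshot isolation looks like from ADO.NET, reusing _ConString and ProcessPerson from your code. It assumes ALLOW_SNAPSHOT_ISOLATION has already been enabled on the database, and keep in mind that a snapshot transaction held open for 24 hours keeps row versions alive in tempdb for that whole time:
'Sketch: the snapshot transaction gives the SELECT a consistent point-in-time
'view, even while other sessions delete or update rows in [Person].
Using con As New SqlConnection(_ConString)
    con.Open()
    Using tran As SqlTransaction = con.BeginTransaction(IsolationLevel.Snapshot)
        Using cmd As New SqlCommand("SELECT * FROM Person", con, tran)
            Using rdr As SqlDataReader = cmd.ExecuteReader()
                Do While rdr.Read()
                    ProcessPerson(rdr("URN"))
                Loop
            End Using
        End Using
        tran.Commit()
    End Using
End Using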
Another option is to create a stored procedure that does the following:
1. Create a temp table
2. INSERT INTO the temp table from your SELECT query
3. SELECT * FROM the temp table
That does separate out, and hence preserve, your result set for as long as it takes the app to read it. But it also duplicates the full result set in [tempdb], whereas Snapshot Isolation:
1. only uses the space it needs, which will be much less than the full result set if the rows aren't changing that much, and
2. prevents blocking while the initial query is running, which can still happen while populating the temp table.
I have no idea what is going on here and it's a little bizarre.
I'm adapting a VBA macro into a VB.NET project, and I'm experiencing what I would describe as some extremely unusual behavior from the method I'm using to pass data around in VB.NET. Here's the setup...
I have, for indexing reasons, a collection that consists of all open orders:
Public allOpenOrders As New Collection
Within this collection, I store other collections, indexed by account number, that each contain information about each open order in an array that is three elements long. Here is how I'm populating it:
openOrderData(0) = some information
openOrderData(1) = some information
openOrderData(2) = some information

SyncLock allOpenOrders
    If allOpenOrders.Contains(accountNumber) Then
        'Already in the collection...
        accountOpenOrders = allOpenOrders(accountNumber)
        accountOpenOrders.Add(openOrderData)
    Else
        'Not already in the collection
        accountOpenOrders = New Collection
        accountOpenOrders.Add(openOrderData)
        allOpenOrders.Add(accountOpenOrders, accountNumber)
    End If
End SyncLock
Here's the thing: if I place a stop after End SyncLock and check the collection, I can clearly see that the array, with all its data, is there, plain as day. However, when I move on in my code (this occurs in another thread, after the preceding code has executed) to retrieve it and write it to a workbook...
If allOpenOrders.Contains(accountNumber) Then
    accountOpenOrders = allOpenOrders(accountNumber)
    For Each openOrderArray In accountOpenOrders
        OutputSheet.Cells(1, 1).Value = accountNumber
        For counter = 0 To 2
            OutputSheet.Cells(1, counter + 2).Value = openOrderArray(counter)
        Next counter
    Next openOrderArray
End If
I get the first element of the array in column B, but C and D are blank. Even more puzzling, if I put a stop right after the allOpenOrders.Contains line, I can look at the collection and the last two elements of the array are now blank. Most puzzling of all, they aren't just empty: they are strings of blanks, equal in length to the original value I recorded in that element of the array?!
Any ideas are appreciated. I can tell you I'm using the same kind of approach to load other data in this workbook with no problems, and these are the only places where the allOpenOrders collection is touched... I'm so confused by these results.
I am a self-taught VB6 programmer who uses DAO. Below is an example of a typical piece of code that I could churn out:
Sub cmdMultiplier_Click() 'Button on form, user interface '
    Dim Rec1 As Recordset
    Dim strSQL As String

    strSQL = "select * from tblCustomers where ID = " & CurrentCustomerID 'inline SQL '
    Set Rec1 = GlobalDataBase.OpenRecordset(strSQL) 'Data access '

    If Rec1.BOF <> True Or Rec1.EOF <> True Then
        If Rec1.Fields("Category").Value = 1 Then
            PriceMultiplier = 0.9 'Business logic '
        Else
            PriceMultiplier = 1
        End If
    End If
End Sub
Please pretend that the above is the entire source code of a CRUD application.
I know this design is bad; everything is mixed up together. Ideally it should have three distinct layers: user interface, business logic and data access. I sort of get why this is desirable, but I don't know how it's done, and I suspect that's why I don't fully get why such a separation is good.
I think I'd be a lot further down the road if someone could refactor the above ridiculously trivial example into 3 tiers.
A trivial example, yes, but it has all the basic elements; they just belong in three different classes (see below). The main reason for this is the "separation of concerns" principle: the GUI is only concerned with GUI things, the business-logic layer is only concerned with the business rules, and the data-access layer is only concerned with data representations. This allows each layer to be maintained independently and reused across applications:
'in Form class - button handler
Sub cmdMultiplier_Click()
    PriceMultiplier = ComputePriceMultiplier(CurrentCustomerId)
End Sub

'in Biz Logic class
Function ComputePriceMultiplier(custId As Integer) As Double
    Dim cust As Customer = GetCustomer(custId)
    If cust.Category = 1 Then 'please ignore magic number, real code uses enums
        Return 0.9
    End If
    Return 1
End Function

'in Data Access Layer class
Function GetCustomer(custId As Integer) As Customer
    Dim cust As Customer = New Customer 'all fields/properties set to default values
    Dim strSQL As String = "select * from tblCustomers where ID = " & custId
    Dim rec1 = GlobalDataBase.OpenRecordset(strSQL) 'Data access
    If rec1.BOF <> True Or rec1.EOF <> True Then
        cust.SetPropertiesFromRecord(rec1)
    End If
    Return cust
End Function
[A 'real' application would cache the current customer, use constants or stored procedures for the customer query, etc.; that's ignored here for brevity.]
Contrast this with your original everything-in-the-button-handler example (which is appallingly common in VB code because it is so easy to do it that way): if you needed the price-multiplier rule in another application, you'd have to copy, paste, and edit the code into that application's button handler. Now there would be two places to maintain the same business rule, and two places where the same customer query was executed.
Typically you will have your UI code responding to the events raised by the user, in this case the Button Click.
After that it really depends on how your program is designed; the most basic design would be to reference a Customer instance that contains a multiplier property.
Your customer object is populated from data in your DAL.
Validation for UI would go in UI layer, business validation rules could go into your business object, and then your DAL is your persistence layer.
Here is a very basic pseudo-code example:
btnClick
    Dim Cust As New Customer(ID)
    multiplier = Cust.DiscountMultiplier
End Click

Class Customer
    Sub New(ID)
        Data = DAL.GetCustomerData(ID)
        Me.Name = Data("Name")
        Me.Address = Data("Address")
        Me.DiscountMultiplier = Data("DiscountMultiplier")
    End Sub

    Property ID
    Property Name
    Property Address
    Property DiscountMultiplier
        Return _discountMultiplier
    End
End Class

Class DAL
    Function GetCustomerData(ID)
        SQL = "Parameterized SQL"
        Return Data
    End Function
End Class
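If it helps to see that same layering as compilable VB.NET rather than pseudo-code, here is a minimal sketch. The table and column names, My.Settings.ConnectionString, and the form-level CurrentCustomerID/PriceMultiplier/cmdMultiplier members are assumptions carried over from the examples above:
Imports System.Data.SqlClient

'Business object: holds the rule for the discount multiplier.
Public Class Customer
    Public Property ID As Integer
    Public Property Name As String
    Public Property Category As Integer

    Public ReadOnly Property DiscountMultiplier As Double
        Get
            'The business rule from the original example.
            Return If(Category = 1, 0.9, 1.0)
        End Get
    End Property
End Class

'Data access layer: the only code that knows about SQL and connections.
Public Class CustomerDal
    Private ReadOnly _connectionString As String

    Public Sub New(connectionString As String)
        _connectionString = connectionString
    End Sub

    Public Function GetCustomer(customerId As Integer) As Customer
        Using con As New SqlConnection(_connectionString),
              cmd As New SqlCommand("SELECT ID, Name, Category FROM tblCustomers WHERE ID = @id", con)
            cmd.Parameters.AddWithValue("@id", customerId)
            con.Open()
            Using rdr As SqlDataReader = cmd.ExecuteReader()
                If rdr.Read() Then
                    Return New Customer With {
                        .ID = CInt(rdr("ID")),
                        .Name = CStr(rdr("Name")),
                        .Category = CInt(rdr("Category"))}
                End If
            End Using
        End Using
        Return New Customer()   'defaults when the customer is not found
    End Function
End Class

'UI layer (inside the form): the handler only coordinates, no rules or SQL.
'My.Settings.ConnectionString is an assumed setting name.
Private Sub cmdMultiplier_Click(sender As Object, e As EventArgs) Handles cmdMultiplier.Click
    Dim dal As New CustomerDal(My.Settings.ConnectionString)
    Dim cust As Customer = dal.GetCustomer(CurrentCustomerID)
    PriceMultiplier = cust.DiscountMultiplier
End Sub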
Knowing how to refactor is a good thing. Now you know how to separate layers.
However, I think your time would be better spent upgrading the tools you are using at the same time. Have you considered doing it in VB.NET?
One way to do that while preserving your existing code base is to write the data layer and the business rules in VB.NET, then expose the business rules through a COM interface (this is a checkbox option in the project settings). You can then use the new business-rule layer from your current VB6 interface.
Once all the business rules and the DAL are done, you will be a step away from a completely new platform.
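As a rough illustration, a VB.NET class exposed to COM (so VB6 can call it) can be as small as the following; PriceRules is just a made-up name, and the <ComClass()> attribute is what the "COM Class" item template generates, used together with the "Register for COM interop" compile option:
'Sketch of a business-rule class made callable from VB6 via COM.
'ComClassAttribute comes from Microsoft.VisualBasic.
<ComClass()>
Public Class PriceRules

    Public Function ComputePriceMultiplier(ByVal category As Integer) As Double
        'Same business rule as before, now living in one reusable place.
        If category = 1 Then
            Return 0.9
        End If
        Return 1.0
    End Function

End Class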
What is the purpose of the button?
My first steps would be:
Extract the part that accesses the database (warning: air code ahead):
Function getCustomer(CurrentCustomerID As Long) As Recordset
    Dim strSQL As String
    Dim rec1 As Recordset

    strSQL = "select * from tblCustomers where ID = " & CurrentCustomerID
    Set rec1 = GlobalDataBase.OpenRecordset(strSQL)

    If rec1.RecordCount > 0 Then
        Set getCustomer = rec1
    Else
        Set getCustomer = Nothing
    End If
End Function
compose the business logic function:
Function getCustomerDiscount(customerID As Long) As Double
    Dim customer As Recordset
    Dim res As Double

    Set customer = getCustomer(customerID)
    res = 1
    If Not customer Is Nothing Then
        If customer("category") = 1 Then
            res = 0.9
        End If
    End If
    getCustomerDiscount = res
End Function
Then, change the button:
Sub cmdMultiplier_Click()
    PriceMultiplier = getCustomerDiscount(CurrentCustomerID)
End Sub