I am working on a Windows Forms application that reads a text file containing information stored in the form of a table. It looks somewhat like this:
ID Name URL
1 client1 client1.com
2 client2 client2.com
3 client3 client3.com
And so on...
What I need to do is read this data with a StreamReader into a string (including the vbTab characters and newlines), and then create an array out of that information so that it acts as a table I can later pull information from based on the column names (i.e. ID, Name, URL) and the ID number. I do not have a lot of experience with arrays, so I was hoping to get some help here on how to do this.
The code I have so far for this functionality is:
Dim readCLientTxtListReader As New StreamReader(strReplicationDataClientAccessListPath)
Dim strClientAccessList As String = readCLientTxtListReader.ReadToEnd
Console.Write(strClientAccessList)
readCLientTxtListReader.Close()
Dim i As Integer
Dim aryClientAccessList() As String
aryClientAccessList = strClientAccessList.Split(vbTab)
For i = 0 To UBound(aryClientAccessList)
Console.WriteLine(aryClientAccessList(i))
Next i
The problem with this is that each element of the array ends up being just the individual string of characters between each vbTab, which means the array looks like:
ID
Name
URL
1
client1
client1.com
2
client2
client2.com
3
client3
client3.com
Which is not really what I need.
Any ideas?
If you need more info, let me know.
Edit: As an added side note, I believe multidimensional arrays are what I am looking for, and I am currently reading up on them, but if you have any more information on these, I would greatly appreciate it.
The way I am reading your end goal is that you want an array that contains each row of the file, where each row is itself an array holding the tokens for that row. If that is correct, then you could do something like this:
var lines = new List<string[]>();
using (var strReplicationDataClientAccessListPath = new FileStream("path", FileMode.Open))
using (var streamReader = new StreamReader(strReplicationDataClientAccessListPath))
{
while (streamReader.Peek() >= 0)
{
var line = streamReader.ReadLine();
if (!string.IsNullOrEmpty(line))
lines.Add(line.Split('\t'));
}
}
foreach (var line in lines)
{
foreach (var token in line)
Console.Write(token);
Console.WriteLine();
}
I have not done VB in a long time, so here is the C#, which can be run through a converter to get VB.NET. Also, I did it using a List; if you need an array at the end you can do:
lines.ToArray();
This is rclements' code converted to VB:
Dim lines = New List(Of String())()
Using strReplicationDataClientAccessListPath = New FileStream("path", FileMode.Open)
Using streamReader = New StreamReader(strReplicationDataClientAccessListPath)
While streamReader.Peek() >= 0
Dim line = streamReader.ReadLine()
If Not String.IsNullOrEmpty(line) Then
lines.Add(line.Split(ControlChars.Tab))
End If
End While
End Using
End Using
For Each line As String() In lines
For Each token As String In line
Console.Write(token)
Next
Console.WriteLine()
Next
You are trying to simulate a CSV -> DataTable workflow, where vbTab is your CSV delimiter and DataTable is your storage structure (you can query by field name and by row index). There are numerous solutions on the internet; just google for "CSV to DataTable".
Here is one, linked from here (SO answer).
If you still want multi-dimensional arrays, I recommend a List(Of String()) instead of String(,), because you would then not need to manage memory allocation. Each line of data would be an element of the List and a single-dimension array, where column values are array elements 0-N.
To read from the file, you can use IO.File.ReadAllLines.
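For instance, here is a minimal sketch of that approach, reusing the strReplicationDataClientAccessListPath variable from your code and assuming a tab-delimited file like your sample:
Dim rows As New List(Of String())
For Each line As String In IO.File.ReadAllLines(strReplicationDataClientAccessListPath)
    'Skip blank lines; split each remaining line into its column values
    If line.Trim().Length > 0 Then rows.Add(line.Split(ControlChars.Tab))
Next
'rows(0) holds the header tokens: ID, Name, URL
'Example lookup: print the Name (column 1) of the row whose ID (column 0) is "2"
For i As Integer = 1 To rows.Count - 1
    If rows(i)(0) = "2" Then Console.WriteLine(rows(i)(1))
Next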
Is this what you need?
'Read Values from the text file and store in a DataTable
Dim dt As New DataTable
Using TextReader As New IO.StreamReader("C:\Data.txt")
Dim Line As String = TextReader.ReadLine
If String.IsNullOrEmpty(Line) Then
MsgBox("No Data")
Exit Sub
End If
With Line.Split(vbTab)
dt.Columns.Add(.GetValue(0))
dt.Columns.Add(.GetValue(1))
dt.Columns.Add(.GetValue(2))
End With
Do
Line = TextReader.ReadLine
If Line Is Nothing Then Exit Do
dt.Rows.Add(Line.Split(vbTab))
Loop
End Using
'Print the DataTable header
For Each Column As DataColumn In dt.Columns
Console.Write(Column.ColumnName & vbTab)
Next
Console.WriteLine(vbCrLf & New String("-"c, 24))
'Print the DataTable contents
For Each Row As DataRow In dt.Rows
Console.WriteLine(Row("ID") & vbTab & Row("Name") & vbTab & Row("URL"))
Next
I have added another solution using a List in case you prefer it over the DataTable:
'Read Values from the text file and store in a List of type Tuples
Dim Values As New List(Of Tuple(Of String, String, String))
Using TextReader As New IO.StreamReader("C:\Data.txt")
Dim Line As String = TextReader.ReadLine
Do Until Line Is Nothing
With Line.Split(vbTab)
Values.Add(Tuple.Create(.GetValue(0).ToString, .GetValue(1).ToString, .GetValue(2).ToString))
End With
Line = TextReader.ReadLine
Loop
End Using
'Print the List contents
For Each T As Tuple(Of String, String, String) In Values
Console.WriteLine(T.Item1 & vbTab & T.Item2 & vbTab & T.Item3)
Next
Related
Using Visual Basic. I am trying to load a series of reports onto a ListView; the ListView consists of 3 columns (location, date and severity level). Every time it loads it crashes due to 'index being outside the bounds of the array', specifically around DOI = ReportDetails(1) in my code. It is loading off of a text file, and I have the data within the text file, so I am unsure of why it is saying I am asking for information that doesn't exist. The program also encrypts the text file.
Dim locate, DOI, SeverityLevel, ReportTitles, EReportTitles, ReportDetails(2) As String
Dim Index As Integer 'Define Variables
Dim FileNum As Integer = FreeFile()
Dim IncidentReport As ListViewItem
lstReports.Items.Clear()
If Dir("ReportTitles.txt") <> "" Then 'If the directory of the file exits then continue
FileOpen(FileNum, "ReportTitles.txt", OpenMode.Input) 'open file
Do Until EOF(FileNum) 'Repeat until the end of the file is reached
EReportTitles = "" 'Clear variables, to safeguard against crashes or errors
ReportTitles = ""
EReportTitles = LineInput(FileNum) 'EReportTitles is equal to the current file line
Dim FileName As String = "ReportTitles.txt" 'Define variables
Dim I, C As Integer
Dim Last As Integer = EReportTitles.Length - 1
Dim ThisChar As Char
For I = 0 To Last 'Begin for loop
ThisChar = EReportTitles.Chars(I) 'Decryption of file
C = Asc(ThisChar) Xor 22
ThisChar = Chr(C)
ReportTitles += ThisChar
Next
If ReportTitles <> "" Then
ReportDetails = Split(ReportTitles, ",") 'Split the lines when a "," is encountered
locate = ReportDetails(0) 'Assosciate to relevant value in array
DOI = ReportDetails(1)
SeverityLevel = ReportDetails(2)
IncidentReport = New ListViewItem
IncidentReport.Text = locate 'Add relevant values to IncidentReport ListViewItem variable
IncidentReport.SubItems.Add(DOI)
IncidentReport.SubItems.Add(SeverityLevel)
lstReports.Items.Add(IncidentReport) 'Transfer IncidentReport to listview
Else
End If
Loop
FileClose(FileNum) 'close file
End If
The expected result is to load all of the report locations, dates and severity levels onto the ListView.
Also, sorry about the formatting of this question; I'm new to Stack Overflow.
There's no point declaring ReportDetails like this:
ReportDetails(2) As String
because that creates an array that you never use. Here:
ReportDetails = Split(ReportTitles, ",")
you are creating a new array anyway and the length of that array will be determined by the number of delimiters in ReportTitles. If you're being told that 1 is an invalid index for that array then that array must only contain 1 element, which means that ReportTitles didn't contain any delimiters.
This is not something that we should have to explain to you because you can easily see it for yourself by debugging and you should ALWAYS debug BEFORE posting here. Set a breakpoint at the top of the code, step through it line by line and examine the state at each step. You can easily see the contents of ReportTitles and ReportDetails and anything else to see whether they are what you expect them to be.
If the point here is to read a CSV file then you really ought to be using the TextFieldParser class. The documentation for that class includes a code example.
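As a rough sketch of what that parsing could look like here (assuming the whole file has already been decrypted into a string, called decryptedText below, since TextFieldParser cannot read the XOR-obfuscated file directly):
Using parser As New Microsoft.VisualBasic.FileIO.TextFieldParser(New IO.StringReader(decryptedText))
    parser.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited
    parser.Delimiters = New String() {","}
    While Not parser.EndOfData
        Dim fields As String() = parser.ReadFields()
        'fields(0) = location, fields(1) = date, fields(2) = severity level
    End While
End Using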
This requires .Net Standard 2.1, and so I'm not sure if VB.Net can use the required SpanAction for the String.Create() method, but if it is supported it should greatly outperform the original.
lstReports.Items.Clear()
'Read and "Decrypt" (and I use that term loosely) the file with only a single heap allocation
Dim file As String
Using fs As FileStream = File.OpenRead("ReportTitles.txt")
file = String.Create(CInt(fs.Length), fs,
Sub(chars, stream)
For i As Integer = 0 To stream.Length - 1
'THIS IS NOT ENCRYPTION! At best, it's obfuscation.
chars(i) = Chr(stream.ReadByte() Xor 22)
Next
End Sub)
End Using
'Use an actual CSV parser
Using reader As New StringReader(file), _
parser As New TextFieldParser(reader)
parser.TextFieldType = FileIO.FieldType.Delimited
parser.Delimiters = New String() {","}
Dim row As String()
While Not parser.EndOfData
row = parser.ReadFields()
If row.Length >= 3 Then
Dim IncidentReport As New ListViewItem()
IncidentReport.Text = row(0)
IncidentReport.SubItems.Add(row(1))
IncidentReport.SubItems.Add(row(2))
lstReports.Items.Add(IncidentReport)
End If
End While
End Using
If you are not able to use that version, this is not quite as good, but still a better approach than the original:
lstReports.Items.Clear()
'Load and "Decrypt" the file
Dim file As String
Using fs As FileStream = File.OpenRead("ReportTitles.txt")
Dim builder As New StringBuilder(CInt(fs.Length))
For i As Integer = 0 To fs.Length - 1
'THIS IS NOT ENCRYPTION! At best, it's obfuscation.
builder.Append(Chr(fs.ReadByte() Xor 22))
Next
file = builder.ToString()
End Using
'Use an actual CSV parser
Using reader As New StringReader(file), _
parser As New TextFieldParser(reader)
parser.TextFieldType = FileIO.FieldType.Delimited
parser.Delimiters = New String() {","}
Dim row As String()
While Not parser.EndOfData
row = parser.ReadFields()
If row.Length >= 3 Then
Dim IncidentReport As New ListViewItem()
IncidentReport.Text = row(0)
IncidentReport.SubItems.Add(row(1))
IncidentReport.SubItems.Add(row(2))
lstReports.Items.Add(IncidentReport)
End If
End While
End Using
In both cases, use Try/Catch rather than Dir() to check whether the location exists. Just try to open the file. Dir() costs an extra disk seek, and there are precious few things in programming slower than disk I/O.
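A sketch of that pattern (file name taken from your code, error messages are placeholders):
Try
    Using fs As FileStream = File.OpenRead("ReportTitles.txt")
        '... read and parse as shown above ...
    End Using
Catch ex As FileNotFoundException
    MessageBox.Show("ReportTitles.txt was not found.")
Catch ex As IOException
    MessageBox.Show("Could not read ReportTitles.txt: " & ex.Message)
End Try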
My goal is to take two rows (FirstName and Surname) and convert them to a single array of "FirstName, Surname".
This is my terrible code I eventually put together:
Private Sub Search_Load(sender As Object, e As EventArgs) Handles MyBase.Load
'TODO: This line of code loads data into the 'DbaPatientDataSet.tblPatientData' table. You can move, or remove it, as needed.
Me.TblPatientDataTableAdapter.Fill(Me.DbaPatientDataSet.tblPatientData)
listFirst.DataSource = Me.TblPatientDataBindingSource
listFirst.DisplayMember = "FirstName"
listLast.DataSource = Me.TblPatientDataBindingSource
listLast.DisplayMember = "Surname"
Dim Lenth As Integer = Me.listFirst.Items.Count - 1
Dim count As Integer = 1
Dim ArrFirst(Lenth) As String
Dim ArrLast(Lenth) As String
For count = 1 To Lenth
ArrFirst(count) = listFirst.Items(count).ToString
ArrLast(count) = listLast.Items(count).ToString
Next count
count = 1
For count = 1 To Lenth
arrFullName(count) = ArrLast(count) & ", " & ArrFirst(count)
Next count
'Arrays Set =====================================================
But with this code I get an array of:
"System.Data.DataRowView, System.Data.DataRowView"
"System.Data.DataRowView, System.Data.DataRowView"
"System.Data.DataRowView, System.Data.DataRowView"
"System.Data.DataRowView, System.Data.DataRowView"
There must be an easy way to convert both DataRows to strings and then concatenate them together in an array.
I am going to search this array using a Binary Search to find a desired name
Thanks
First, I think you are confusing your rows and your columns. You have 2 columns. I went directly to full name but I think you can break it out if you need to.
Dim arrNames(ListBox1.Items.Count - 1) As String
For i As Integer = 0 To ListBox1.Items.Count - 1
arrNames(i) = $"{ListBox1.Items(i)} {ListBox2.Items(i)}"
Next
For Each item In arrNames
Debug.Print(item)
Next
The string with the $ in front is an interpolated string. Sort of an improvement to String.Format.
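For example, the last two lines below produce the same string:
Dim surname As String = "Smith"
Dim firstName As String = "Jane"
Dim formatted As String = String.Format("{0}, {1}", surname, firstName)
Dim interpolated As String = $"{surname}, {firstName}"   'same result; requires Visual Studio 2015 / VB 14 or later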
I know there is an answer already, but for now you could go directly to the data table to get what you need.
Dim arrNames(ListBox1.Items.Count - 1) As String
Dim i As Integer = 0
Dim dt As DataTable = DbaPatientDataSet.Tables(0)
For Each row As DataRow In dt.Rows
arrNames(i) = $"{row("Surname")}, {row("FirstName")}"
i += 1
Next
For Each item In arrNames
Debug.Print(item)
Next
'assume the names of your columns are Surname and FirstName
If I run your code, I get the result you are looking for, so I'm not sure what you are missing. In saying that though, you are making things hard on yourself by messing around with arrays :). Just use the dataset rows directly - they are strongly typed and you can check for nulls etc. as needed... something like this:
Dim fullNames As New List(Of String) '-- or you could fill your array.
For Each row As DbaPatientDataSet.tblPatientDataRow In ds.tblPatientData
fullNames.Add(row.Surname & ", " & row.FirstName)
Next
Just looking at what you are trying to achieve, if it was me, I would be bringing back the formatted data in my query that fills the dataset i.e. a third, FullName, column.
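If changing the query is not convenient, a similar effect can be had with a computed DataColumn after the Fill. This is only a sketch; the column names are taken from your code, and listFull is a hypothetical third ListBox:
Me.TblPatientDataTableAdapter.Fill(Me.DbaPatientDataSet.tblPatientData)
Dim dt As DataTable = Me.DbaPatientDataSet.tblPatientData
If Not dt.Columns.Contains("FullName") Then
    'Expression column computed from the two existing columns
    dt.Columns.Add("FullName", GetType(String), "Surname + ', ' + FirstName")
End If
listFull.DataSource = Me.TblPatientDataBindingSource
listFull.DisplayMember = "FullName"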
It has been in the back of my mind. Finally got it for the List Box directly.
Dim arrFullNames(ListBox1.Items.Count - 1) As String
Dim i As Integer = 0
For Each item As DataRowView In ListBox1.Items
arrFullNames(i) = $"{DirectCast(item("Surname"), String)}, {DirectCast(item("Firstname"), String)}"
i += 1
Next
For Each item As String In arrFullNames
Debug.Print(item)
Next
The aim of my application is to extract text from documents and search for specific entries matching records in a database.
My application extracts text from documents and populates a textbox with the extracted text.
Each document can have anywhere from 200 to 600,000 words (including a large amount of normal plain text).
Extracted text is compared against database entries for specific values and matches are pushed into an array.
My Database contains approximately 125,000 records
My code below loops through the database records, comparing against the extracted text. If a match is found in the text it is inserted into an array which I use later.
txtBoxExtraction.Text = "A whole load of text goes in here, " & _
"including the database entries I am trying to match," & _
"i.e. AX55F8000AFXZ and PP-Q4681TX/AA up to 600,000 words"
Dim dv As New DataView(_DBASE_ConnectionDataSet.Tables(0))
dv.Sort = "UNIQUEID"
'There are 125,000 entries here in my sorted DataView dv e.g.
'AX40EH5300
'GB46ES6500
'PP-Q4681TX/AA
For i = 0 To maxFileCount
Dim path As String = Filename(i)
Try
If File.Exists(path) Then
Try
Using sr As New StreamReader(path)
txtBoxExtraction.Text = sr.ReadToEnd()
End Using
Catch e As Exception
Console.WriteLine("The process failed: {0}", e.ToString())
End Try
End If
For dvRow As Integer = 0 To dv.Table.Rows.Count - 1
strUniqueID = dv.Table.Rows(dvRow)("UNIQUEID").ToString()
If txtBoxExtraction.Text.ToLower().Contains(strUniqueID.ToLower) Then
' Add UniqueID to array and do some other stuff..
End if
Next dvRow
Next i
Whilst the code works, I am looking for a faster way of performing the database matching (the 'For dvRow' Loop).
If a document is small with around 200 words, the 'For dvRow..' Loop completes quickly, within a few seconds.
Where the document contains a large amount of text... 600,000 words and upwards, it can take several hours or longer to complete.
I came across a couple of posts that are similar, but not close enough to my issue to implement any of the recommendations.
High performance "contains" search in list of strings in C#
https://softwareengineering.stackexchange.com/questions/118759/how-to-quickly-search-through-a-very-large-list-of-strings-records-on-a-databa
Any help is most gratefully appreciated.
This is an example of the comment I wrote:
If that's the actual code, I don't understand why you need to put the information in a textbox. You could save a bit of speed by not displaying the text on screen. If you have 125,000 UNIQUEIDs, then it might be better to pull the IDs from your file and then search from that list, instead of searching the whole text every time. Even just splitting your text by spaces and filtering by the "words" that are between a specific size could make it go faster.
Since it seems you want to do a word check and not a per-character check, and you only want to check for those IDs and not each word, you should pull the IDs out of each text before doing any search. This will reduce the searching that needs to be done by a lot. This list of IDs could also be saved if the text never changes.
Module Module1
Private UNIQUEID_MIN_SIZE As Integer = 8
Private UNIQUEID_MAX_SIZE As Integer = 12
Sub Main()
Dim text As String
Dim startTime As DateTime
Dim uniqueIds As List(Of String)
text = GetText()
uniqueIds = GetUniqueIds()
'--- Very slow
startTime = DateTime.Now
' Search
For Each uniqueId As String In uniqueIds
text.Contains(uniqueId)
Next
Console.WriteLine("Took {0}s", DateTime.Now.Subtract(startTime).TotalSeconds)
'--- Very fast
startTime = DateTime.Now
' Split the text by words
Dim words As List(Of String) = text.Split(" "c).ToList()
' Get all the unique key, assuming keys are between a specific size
Dim uniqueIdInText As New Dictionary(Of String, String)
For Each word As String In words
If word.Length >= UNIQUEID_MIN_SIZE AndAlso word.Length <= UNIQUEID_MAX_SIZE Then
If Not uniqueIdInText.ContainsKey(word) Then
uniqueIdInText.Add(word, "")
End If
End If
Next
' Search
For Each uniqueId As String In uniqueIds
uniqueIdInText.ContainsKey(uniqueId)
Next
Console.WriteLine("Took {0}s", DateTime.Now.Subtract(startTime).TotalSeconds)
Console.ReadLine()
End Sub
' This only randomly generate words for testing
' You can ignore
Function GetRandomWord(ByVal len As Integer) As String
Dim builder As New System.Text.StringBuilder
Dim alphabet As String = "abcdefghijklmnopqrstuvwxyz"
Dim rnd As New Random()
For i As Integer = 0 To len - 1
builder.Append(alphabet.Substring(rnd.Next(0, alphabet.Length - 1), 1))
Next
Return builder.ToString()
End Function
Function GetText() As String
Dim builder As New System.Text.StringBuilder
Dim rnd As New Random()
For i As Integer = 0 To 600000
builder.Append(GetRandomWord(rnd.Next(1, 15)))
builder.Append(" ")
Next
Return builder.ToString()
End Function
Function GetUniqueIds() As List(Of String)
Dim wordCount As Integer = 600000
Dim ids As New List(Of String)
Dim rnd As New Random()
For i As Integer = 0 To 125000
ids.Add(GetRandomWord(rnd.Next(UNIQUEID_MIN_SIZE, UNIQUEID_MAX_SIZE)))
Next
Return ids
End Function
End Module
This works fine, assuming there are no line breaks in any of the cells.
dgvResults.SelectAllCells()
dgvResults.ClipboardCopyMode = DataGridClipboardCopyMode.IncludeHeader
ApplicationCommands.Copy.Execute(Nothing, dgvResults)
Dim result As [String] = DirectCast(Clipboard.GetData(DataFormats.CommaSeparatedValue), String)
Clipboard.Clear()
dgvResults.UnselectAllCells()
Try
Dim file As New System.IO.StreamWriter("c:\export.csv")
file.WriteLine(result)
file.Close()
Process.Start("c:\export.csv")
Catch ex As Exception
MessageBox.Show(ex.Message, "Error")
End Try
This is how I add line breaks
Dim x As New List(Of String)
For Each item In res.Properties("proxyaddresses")
x.Add(item)
Next
AllSMTPAddresses = String.Join(ControlChars.Lf, x)
When I export this, it doesn't take into consideration that there are line breaks, and completely ignores them... so the Excel formatting is a little wonky. I've tried Environment.NewLine, vbCrLf, and now ControlChars.Lf. I think Excel doesn't know what to do with the line breaks, so it just does whatever it wants and creates new rows with them.
Any idea on how I would attack this?
Updated results #Jimmy
This is what it's supposed to look like...
I don't believe this will be possible without modifying those rows prior to the export. I found some example code that may help:
Public Sub writeCSV(grid1 As Object, outputFile As String)
' Create the CSV file to which grid data will be exported.
Dim sw As New StreamWriter(outputFile)
' First we will write the headers.
Dim dt As DataTable = DirectCast(grid1.DataSource, DataSet).Tables(0)
Dim iColCount As Integer = dt.Columns.Count
For i As Integer = 0 To iColCount - 1
sw.Write(dt.Columns(i))
If i < iColCount - 1 Then
sw.Write(",")
End If
Next
sw.Write(sw.NewLine)
' Now write all the rows.
For Each dr As DataRow In dt.Rows
For i As Integer = 0 To iColCount - 1
sw.Write("""") 'lets encapsulate those fields in quotes, quoted csv file!
If Not Convert.IsDBNull(dr(i)) Then
sw.Write(dr(i).ToString())
End If
sw.Write("""")
If i < iColCount - 1 Then
sw.Write(System.Globalization.CultureInfo.CurrentCulture.TextInfo.ListSeparator)
End If
Next
sw.Write(sw.NewLine)
Next
sw.Close()
End Sub
modified from here
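Calling it would look something like this (a sketch only; it assumes the grid has a DataSource holding a DataSet, which is what the DirectCast inside the method expects, so a WPF DataGrid bound through ItemsSource would need the method adjusted):
writeCSV(dgvResults, "c:\export.csv")
Process.Start("c:\export.csv")
Because every field is wrapped in quotes, Excel treats an embedded line break as part of the cell rather than as the start of a new row.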
In need of your opinions. Currently developing an application in VB.NET.
I have a text file which contains more than one thousand rows. Each rows contains the data needed to be inserted into the database. A sample of a row is as follows:
0001--------SCOMNET--------0.100---0.105
At first glance, one might figure that each column is separated by a tab, but actually each column is separated by blank spaces (I used '-' to denote the blank spaces because I somehow could not get the SO text editor to show them).
The first column is defined by
Substring(0, 12) which is the data [0001--------]
second column
Substring(12, 12) in which the data is [SCOMNET-----]
third column is
Substring(24, 8) in which the data is [---0.100]
and last column is
Substring(32, 8) in which the data is [---0.105]
My initial thought is to extract the lines from the text file and store them as a list of strings, then, while looping, separate the columns of each string item in the list with the Substring() function and insert them one by one until the end of the list. But this will no doubt take time.
In my scenario, how can I take advantage of SqlBulkCopy? Or is there any other way to approach this for a faster insert? Say:
open file
start loop
read line
separate each column in the line with substring
save in a DataTable
end loop
BCP(DataTable)
This includes a method that may be a more efficient way of reading your text file.
Sub readFixWidthTextFileIntoSqlTable()
Dim sqlConn As New SqlConnection("Connection String Goes Here")
sqlConn.Open()
Dim sqlComm As New SqlCommand
sqlComm.Connection = sqlConn
sqlComm.CommandType = CommandType.Text
sqlComm.CommandText = "INSERT INTO YourTableNameHere VALUES(#Field1, #Field2, #Field3, #Field4)"
sqlComm.Parameters.Add("#Field1", SqlDbType.Text)
sqlComm.Parameters.Add("#Field2", SqlDbType.Text)
sqlComm.Parameters.Add("#Field3", SqlDbType.Text)
sqlComm.Parameters.Add("#Field4", SqlDbType.Text)
Using IFReader As New FileIO.TextFieldParser(FileNameWithPath)
IFReader.TextFieldType = FileIO.FieldType.FixedWidth
IFReader.SetFieldWidths(12, 12, 8, 8)
While Not IFReader.EndOfData
Dim fields As String() = IFReader.ReadFields
sqlComm.Parameters("#Field1").Value = fields(0)
sqlComm.Parameters("#Field2").Value = fields(1)
sqlComm.Parameters("#Field3").Value = fields(2)
sqlComm.Parameters("#Field4").Value = fields(3)
sqlComm.ExecuteNonQuery()
End While
End Using
sqlConn.Close()
End Sub
You've got it pretty much right. This approach is one that I take a lot. Here's a bit of sample code to get you started. It is ONLY an example; there's absolutely no validation and no consideration for Primary Keys on the table. If you update your question with more details of the structure of the destination table, then I can make this example much more specific.
Read_File:
Dim sFileContents As String = ""
Using sRead As New StreamReader("e:\ExampleFile.txt")
sFileContents = sRead.ReadToEnd
End Using
Dim sFileLines() As String = sFileContents.Split(New String() {vbCrLf}, StringSplitOptions.None)
Connect_To_DB:
Dim sqlConn As New SqlConnection
sqlConn.ConnectionString = "Data Source=YourServerName;Initial Catalog=YourDbName;Integrated Security=True"
sqlConn.Open()
Setup_DataTable:
Dim ExampleTable As New DataTable
ExampleTable.Load(New SqlCommand("Select Top 0 * From Example_Table", sqlConn).ExecuteReader)
'This is not absolutely necessary but avoids trouble with NOT NULL columns (like keys)'
For Each dcColumn As DataColumn In ExampleTable.Columns : dcColumn.AllowDBNull = True : Next dcColumn
Save_To_DataTable:
For Each sLine In sFileLines
Dim ExampleRow As DataRow = ExampleTable.NewRow
ExampleRow("First_Column_Name") = sLine.Substring(0, 12).TrimEnd
ExampleRow("Second_Column_Name") = sLine.Substring(12, 12).TrimEnd
ExampleRow("Third_Column_Name") = sLine.Substring(24, 8).TrimEnd
ExampleRow("Fourth_Column_Name") = sLine.Substring(32, 8).TrimEnd
ExampleTable.Rows.Add(ExampleRow)
Next
Update_Database:
If ExampleTable.Rows.Count <> 0 Then
Dim sqlBulk As SqlBulkCopy = New SqlBulkCopy(sqlConn)
sqlBulk.DestinationTableName = "Example_Table"
sqlBulk.WriteToServer(ExampleTable)
End If
Disconnect_From_DB:
sqlConn.Close()
Also, as commented above, if you have access to it, SSIS will do this in a jiffy.