Loop Running VERY Slow - sql-server

Please help me make my app a little faster; it's taking forever to loop through and give me results right now.
Here is what I'm doing:
1. load gridview from an uploaded excel file (this would probably be about 300 records or so)
2. compare manufacturer, model and serial No to my MS SQL database (about 20K records) to see if there is a match.
'find source ID based on make/model/serial No combination.
'find source ID based on make/model/serial No combination.
Dim cSource As New clsSource()
Dim ds As DataSet = cSource.GetSources()
Dim found As Boolean = False

'populate db datatables
Dim dt As DataTable = ds.Tables(0)
Dim rows As Integer = gwResults.Rows.Count()

For Each row As GridViewRow In gwResults.Rows
    'move through rows and check data in each row against the dataset
    '1 - make
    For Each dataRow As DataRow In dt.Rows
        found = False
        If dataRow("manufacturerName") = row.Cells(1).Text Then
            If dataRow("modelName") = row.Cells(2).Text Then
                If dataRow("serialNo") = row.Cells(3).Text Then
                    found = True
                End If
            End If
        End If
        'display results
        If found Then
            lblResults.Text += row.Cells(1).Text & "/" & row.Cells(2).Text & "/" & row.Cells(3).Text & " found"
        Else
            lblResults.Text += row.Cells(1).Text & "/" & row.Cells(2).Text & "/" & row.Cells(3).Text & " not found "
        End If
    Next
Next
Is there a better way to find a match between the two? I'm dying here.

For each of your 300 gridview rows, you are looping through all 20k datarows. That makes 300 * 20k = 6 million loop iterations. No wonder your loop is slow. :-)
Let me suggest the following algorithm instead (pseudo-code):
For Each gridviewrow
    Execute a SELECT statement on your DB with a WHERE clause that compares all three components
    If the SELECT statement returns a row
        --> found
    Else
        --> not found
    End If
Next
With this solution, you only have 300 loop iterations. Within each loop iteration, you make a SELECT on the database. If you have indexed your database correctly (i.e., if you have a composite index on the fields manufacturerName, modelName and serialNo), then this SELECT should be very fast -- much faster than looping through all 20k datarows.
From a mathematical point of view, this would reduce the time complexity of your algorithm from O(n * m) to O(n * log m), with n denoting the number of rows in your gridview and m the number of records in your database.
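For illustration, the composite index and per-row lookup might look like this in T-SQL (the table name Sources and the sourceID column are assumptions based on the code above; adjust to your schema):

```sql
-- One-time setup: composite index covering all three lookup columns
CREATE INDEX IX_Sources_MakeModelSerial
    ON Sources (manufacturerName, modelName, serialNo);

-- Per-row lookup, executed once for each gridview row
-- (@make, @model and @serial are parameters filled from the row's cells)
SELECT TOP 1 sourceID
FROM Sources
WHERE manufacturerName = @make
  AND modelName = @model
  AND serialNo = @serial;
```

Because the index covers all three WHERE columns, each lookup is an index seek rather than a scan of all 20k rows.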

While Heinzi's answer is correct, it may be more beneficial to carry out the expensive SQL query once, before the loop, and filter using data views so you aren't hitting the DB 300 times:
Execute a SELECT statement on your DB
For Each gridviewrow
    Filter the results, e.g. dataTable.Select(String.Format("manufacturerName='{0}'", row.Cells(1).Text))
    If the filter returns a row
        --> found
    Else
        --> not found
    End If
Next
NOTE: I only compared a single criterion to illustrate the point; you could filter on all three here.
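Filtering on all three criteria could look something like this sketch (it reuses dt, gwResults and the cell positions from the question; the quote-doubling guards against apostrophes in the cell text, since DataTable.Select takes a filter expression string):

```vb
' Load the 20k records once, before the loop
Dim dt As DataTable = cSource.GetSources().Tables(0)

For Each row As GridViewRow In gwResults.Rows
    Dim filter As String = String.Format( _
        "manufacturerName = '{0}' AND modelName = '{1}' AND serialNo = '{2}'", _
        row.Cells(1).Text.Replace("'", "''"), _
        row.Cells(2).Text.Replace("'", "''"), _
        row.Cells(3).Text.Replace("'", "''"))
    Dim matches() As DataRow = dt.Select(filter)
    Dim found As Boolean = (matches.Length > 0)
    ' ...display found / not found as before...
Next
```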

Hmm... how about loading the data from the spreadsheet into a table in tempdb and then writing a select that compares the rows in the way that you want to compare them? This way, all of the data comparisons happen server-side and you'll be able to leverage all of the power of your SQL instance.
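Sketched in T-SQL, that approach might look like this (a #temp table lives in tempdb; all table and column names are assumptions based on the question):

```sql
-- Stage the ~300 uploaded rows in a temp table
CREATE TABLE #Uploaded (
    manufacturerName varchar(100),
    modelName        varchar(100),
    serialNo         varchar(50)
);
-- ...bulk-insert the Excel/gridview rows into #Uploaded...

-- One set-based query returns the match status for every row at once
SELECT u.manufacturerName, u.modelName, u.serialNo,
       CASE WHEN s.sourceID IS NULL THEN 'not found' ELSE 'found' END AS result
FROM #Uploaded u
LEFT JOIN Sources s
  ON  s.manufacturerName = u.manufacturerName
  AND s.modelName        = u.modelName
  AND s.serialNo         = u.serialNo;
```

With this, the comparison is a single round trip instead of 300, and the server can use the same composite index for the join.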

Related

VBA group by process using Public Types

I want to create a counter in a routine that will count how many times a specific entry has appeared so far.
The routine that I have created so far populates data in a spreadsheet through a For..Next loop. For each of these rows I have an extra column that will represent the counter and count how many times a characteristic of the entry row has appeared so far in the previous rows. For that, I am using the Application.WorksheetFunction.CountIf function, but the reference range has to be dynamic.
For example, I have the following table
Example Table
The overall idea is to group by month and expense type and get the sum amount. The role of the counter is to identify the rows that can be grouped together, loop through their values, and sum them. The table has approximately 10,000 rows and 53 columns. For this process, I have created the following public type:
Public Type OP
    Month As String
    expense_type As String
    amount As Double
End Type
Sub NewOuput()
    Dim op As OP
    With Sheet1
        'output is the existing table that I get the data from; I want to
        'manipulate it and then populate it into another table of the same format
        For i = 1 To lastrow
            op.Month = output(i, 1)
            op.expense_type = output(i, 2)
            op.amount = output(i, 3)
            '----------------------------
            'this is the population of the data in the new table
            .Cells(i, 1) = op.Month
            .Cells(i, 2) = op.expense_type
            .Cells(i, 3) = op.amount
        Next i
    End With
End Sub
Through functions, I try to identify the rows that need to be summed up and then call the respective functions in the output part of the loop.
The CountIf worksheet function cannot be applied to arrays, so that is now out of the question. I have read many posts on various ways of grouping, including data connections, collections and other customised approaches. Collections appeared to be the best option, but I miss some of the background there.
Does this make any sense? Any suggestions are appreciated.
I didn't actually grasp your exact needs, but going by the example table image, I'd proceed as follows:
Sub NewOuput()
    With sheet1
        'fill in the voids of 1st column
        With .Range("A1:A" & .Cells(.Rows.Count, "B").End(xlUp).row) '<--| change "A" and "B" to your actual 1st and 2nd columns index
            .SpecialCells(xlCellTypeBlanks).FormulaR1C1 = "=R[-1]C"
            .Value = .Value
        End With
        'more code to exploit a "full" database structure
    End With
End Sub

Excel VBA: More Efficient Way of Removing Rows From Worksheet

Please note the edit after the original function code block
I've got this data set in Excel that I download from my company's cost management system each month. On average, this data set is around 100,000 rows with 32 columns. One of my job functions is to filter out line items that don't belong to my work group and arrange the data in the required format for a separate analysis system. Typically, I re-arrange the columns, enter a bunch of formulas into cells, and then use a series of autofilter checks to identify line items that need to be moved to other worksheets. This normally takes me about a couple of hours tops, but it's quite arduous and I'd rather automate the process to save time and reduce chances for me to make mistakes.
So I went ahead and wrote a VBA procedure that satisfies all of the requirements and everything seems to be checking out. The only problem is that the procedure itself takes about an hour to check 10,000 line items (I stopped it at that point). Wasting 10 hours watching a progress bar tick is not going to cut it. So now I'm trying to re-think how I've written this procedure to see if there's a better way (I'm certain there is).
Here's the code as it stands (I omitted a lot of code before and after the main loop for clarity, but I left comments there so you can see what happens in a 'pseudo-code' manner. The vast majority of time is spent in that loop, so it's really my main concern):
ORIGINAL FUNCTION
Function Prepare_CICTDF()
    'Rename and set worksheet
    wbRawFile.Worksheets("Sheet1").Name = "Excluded"
    Set wsSheet = wbRawFile.Worksheets("Excluded")

    'Update progress bar
    status_message = "Rearranging columns in CICT Dedicated Facility. This may take several minutes."
    Call Progress_Bar(current_row, status_message)

    'Rearrange columns
    'Omitted to shorten code block

    'Create worksheet for included rows
    wbRawFile.Worksheets.Add().Name = "Self Service"

    'Copy header row to other worksheets
    wsSheet.Rows("4").Copy Destination:=Sheets("Self Service").Range("A4")

    'Import Lookup List
    Dim wbLookupList As Workbook
    Set wbLookupList = Workbooks.Open("\\server\path\to\file\Dedicated Facility Lookup List.xlsx")
    Dim wsLookupList As Worksheet
    Set wsLookupList = wbLookupList.Worksheets("Lookup List")
    wsLookupList.Copy Before:=wbRawFile.Worksheets("Excluded")
    wbLookupList.Close SaveChanges:=False

    'Get first and last data row
    Dim FirstRow As Long
    Dim LastRow As Long
    FirstRow = 5
    LastRow = wsSheet.UsedRange.Rows.Count - 1

    'Update progress bar
    status_message = "Preparing rows in CICT Dedicated Facility."
    Call Progress_Bar(current_row, status_message)

    'Loop through the rows to add formulas
    Dim NextBlankRow As Long
    Dim RowDeleted As Boolean
    Dim i As Long
    i = FirstRow

    '-------------------------LOOP STARTS HERE-------------------------
    Do While i <= LastRow
        RowDeleted = False

        'Add "CICTDF" before project ID
        wsSheet.Range("B" & i).Value = "CICTDF" & wsSheet.Range("B" & i).Text

        'Add formula for "Total Impact" column in column T
        wsSheet.Range("T" & i).FormulaR1C1 = "=IF(AND(RC[-10]=""Complete"",RC[7]=""Manual Part Number Line Item""),RC[5],IF(AND(RC[-10]=""Complete"",RC[5]=0),0,IF(RC[-10]=""Complete"",RC[5]/RC[-5]*RC[4],RC[5])))"

        'Add formula for rows with blank "Cost Impact - Part" column
        If wsSheet.Range("V" & i).Value = "" Then
            wsSheet.Range("V" & i).FormulaR1C1 = "=IF(RC[-7]>0,RC[3]/RC[-7]*-1,0)"
        End If

        'Change GLOBAL SUPPLY NETWORK to GLOBAL PURCHASING
        If wsSheet.Range("F" & i).Value = "GLOBAL SUPPLY NETWORK" Then
            wsSheet.Range("F" & i).Value = "GLOBAL PURCHASING"
        End If

        'Change numbers stored as text back to numbers
        wsSheet.Range("M" & i).NumberFormat = "General"
        wsSheet.Range("M" & i).Value = wsSheet.Range("M" & i).Value
        wsSheet.Range("P" & i).NumberFormat = "General"
        wsSheet.Range("P" & i).Value = wsSheet.Range("P" & i).Value
        wsSheet.Range("AB" & i).NumberFormat = "General"
        wsSheet.Range("AC" & i).NumberFormat = "General"
        wsSheet.Range("AD" & i).NumberFormat = "General"
        wsSheet.Range("AE" & i).NumberFormat = "General"

        'Insert Cab Part # Formula
        wsSheet.Range("AB" & i).Formula = "=VLOOKUP(M" & i & ",'Lookup List'!A:A,1,FALSE)"
        'Insert Cabs DC formula
        wsSheet.Range("AC" & i).Formula = "=VLOOKUP(N" & i & ",'Lookup List'!B:B,1,FALSE)"
        'Insert Cab Localization HEX & MG Formula
        wsSheet.Range("AD" & i).Formula = "=VLOOKUP(B" & i & ",'Lookup List'!C:C,1,FALSE)"
        'Insert Already in MOASS formula
        wsSheet.Range("AE" & i).Formula = "=VLOOKUP(B" & i & ",'Lookup List'!D:D,1,FALSE)"

        'Include part numbers that match the inclusion criteria
        If wsSheet.Range("AB" & i).Text <> "#N/A" And wsSheet.Range("AC" & i).Text = "#N/A" And wsSheet.Range("AD" & i).Text = "#N/A" _
            And wsSheet.Range("AE" & i).Text = "#N/A" And wsSheet.Range("P" & i).Value = "14" Then
            NextBlankRow = Worksheets("Self Service").UsedRange.Rows.Count + 1
            wsSheet.Rows(i).Copy Destination:=Worksheets("Self Service").Range("A" & NextBlankRow)
            wsSheet.Rows(i).Delete
            RowDeleted = True
        End If

        'Check if the row was included or not
        If RowDeleted = True Then
            LastRow = LastRow - 1
        Else
            i = i + 1
        End If

        'Update the progress completion
        current_row = current_row + 1
        Call Progress_Bar(current_row, status_message)
    Loop
    '-------------------------LOOP STOPS HERE-------------------------

    'Autofilter header row in Self Service tab
    Worksheets("Self Service").Range("B4:AG4").AutoFilter

    'Save as new file format
    Worksheets("Self Service").Select
    wbRawFile.SaveAs Filename:=output_directory & "CICT 2014 Dedicated Facility.xlsx", FileFormat:=51
    wbRawFile.Application.DisplayAlerts = True
    wbRawFile.Close SaveChanges:=False
End Function
Basically I loop through all of the lines in the file. For each line, I enter the formulas and values that I need and then check if they satisfy the inclusion requirements. If they do, I move the line to the "Self Service" worksheet, delete the line from the "Excluded" worksheet, and move on to the next line.
After running the first 10,000 lines of data, the elapsed time was just over 58 minutes. I think most of this can be attributed to the copy/paste/delete processes at the tail end of the loop. I've read that a common suggestion is to work within arrays instead of manipulating cells/rows/ranges in Excel, but I'm not exactly sure how I would go about doing this.
----------Edit:----------
After some input from Ron Rosenfeld, I revisited my process a little bit and made a bunch of changes. In the end, the new procedure processes and prepares over 100,000 rows (of 32 columns) in just over 49 minutes. The original procedure would have taken over 9.75 hours, so the changes have resulted in a procedure that's over 10x faster than its predecessor. Rather than paste the entire code block again, I'll describe the procedure in "pseudo-code":
Rearrange columns (takes the raw server download and puts it in the order I need).
Create a new worksheet for the included rows. Note that for my purposes, I process over 100,000 rows but end up keeping only about 10,000. Thus, I made the decision to look for those that I would INCLUDE instead of those that I would EXCLUDE.
Enter formulas in the first row of data and drag down the column. I used Ron's suggestion of e.g. Range("A" & FirstRow & ":A" & LastRow) = "=B1+C1" for any columns that I could.
There was one column that only needed formulas if the cell was blank. So I used the SpecialCells(xlCellTypeBlanks) method to enter these.
AutoFiltered the data so that only the rows I wanted to include were visible. Again I used the SpecialCells(xlCellTypeVisible) method to find these and stored them in an array. This array was then entered into the new worksheet.
Finally, I did a little bit of massaging format-wise to make sure everything looked consistent (since storing the values in the array lost the cell formats).
It should also be noted that I think Tim's suggestion of using SQL in this scenario could be a very efficient alternative--I simply wasn't versed well enough on the topic to try it out. I'll be looking for ways to use it in the future, though!
Thanks everyone for the help!
Without knowing the exact layout of your worksheets, it is hard to say. In general, with regard to values, the process of reading a large DB into an array, looping through the array to decide which rows/items to keep, then writing that back to a new sheet, is usually at least an order of magnitude (10x) faster than looping through rows. Sometimes a challenge is to figure out how big the results array needs to be. If that cannot be done with some simple formulas, I have taken the interim step of gathering each row into a collection before dimensioning the results array.
Another thought after looking at your code: why not just filter on the values for columns 15, 27-30, and then copy/paste the visible cells to your new worksheet?
After you write the data to the worksheet, you can select all the blanks in the range with the SpecialCells method and write the formula that way:
R.Columns(X).SpecialCells(xlCellTypeBlanks).FormulaR1C1 = "=RC[1] + RC[2]"
To get the size of arrIncluded, it seems you could either use CountIfs, or you could add the desired rows to a Collection, then use its Count property to dimension arrIncluded and write the Collection to the array. I prefer the Collection method, but test to see which way is faster.
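The read-into-array / Collection / write-back pattern described above might be sketched like this (the sheet names match the question, but the keep-row test is a placeholder, not the OP's actual criteria):

```vb
Sub FilterViaArray()
    Dim vIn As Variant, vOut() As Variant
    Dim keep As New Collection
    Dim r As Long, c As Long, i As Long

    '1. Read the whole used range into an array in one operation
    vIn = Worksheets("Excluded").UsedRange.Value

    '2. Decide which rows to keep; store row indices in a Collection
    '   so we don't need to know the output size up front
    For r = 1 To UBound(vIn, 1)
        If vIn(r, 16) = "14" Then keep.Add r   '<-- placeholder criterion
    Next r

    '3. Size the output array from Collection.Count and fill it
    ReDim vOut(1 To keep.Count, 1 To UBound(vIn, 2))
    For i = 1 To keep.Count
        For c = 1 To UBound(vIn, 2)
            vOut(i, c) = vIn(keep(i), c)
        Next c
    Next i

    '4. Write the result back in one operation
    Worksheets("Self Service").Range("A5") _
        .Resize(keep.Count, UBound(vIn, 2)).Value = vOut
End Sub
```

Reading and writing each happen once, so the per-cell overhead of the original loop disappears.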

Count the number of specific words in a list

I have some big computation to do, since I have an Excel file with a column representing a list of unique IDs of the people that worked on every incident in our system. I would like to know the total number of interventions that have been done on all incidents. For example, let's say I have this:
ID|People working on that incident
¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯
0|AA0000 BB1111 CC2222 ZZ1234
1|BB1111
2|CC2222 ZZ1234 CC2222 ZZ1234
3|BB1111 CC2222 AA0000 BB1111
I have a list named List which has a zone with the list of people IDs I actually want to include. For example, let's say that the first zone of List = {"AA0000","CC2222"}.
Now, I would like to know how many interventions have been done by our employees (in List) on all the incidents I have (we have 4 in the array above). The result would be 6: 2 interventions for incident ID 0, 0 for ID 1, 2 for ID 2 and 2 for ID 3.
Assuming the data are in a different (closed) workbook, how can I calculate that using my list List and the range above A1:B4 (I would like to eventually use the whole columns, so let's say A:B)?
EDIT:
I already got something working that counts the number of times a specific word appears in a whole column.
SUM(
LEN('[myFile.xlsx]Sheet1'!$A:$A)
-LEN(
SUBSTITUTE('[myFile.xlsx]Sheet1'!$A:$A;$Z$1;"")
)
)
/LEN($Z$1)
Z1 is the word I'm looking for (example: CC2222) and '[myFile.xlsx]Sheet1'!$A:$A is the column I'm searching in.
Isn't there a really simple way to make this work with an array instead of Z1? The length is always the same (six characters plus a space).
Source: http://office.microsoft.com/en-ca/excel-help/count-the-number-of-words-in-a-cell-or-range-HA001034625.aspx
Split your source data ColumnB with Text to Columns. Unpivot the result, delete the middle column and pivot what's left.
You could do this fairly easily with a User Defined Function. The function below takes two arguments. The first is the range constituting your second column, labelled above "People working on that incident". The second is your List, a range consisting of a single entry for each ID you wish to count. As shown in your example, if multiple identical IDs appear in a single entry (e.g. your ID 2 has CC2222 repeated twice), each will be counted.
To enter this User Defined Function (UDF), open the Visual Basic Editor.
Ensure your project is highlighted in the Project Explorer window.
Then, from the top menu, select Insert/Module and
paste the code below into the window that opens.
To use this User Defined Function (UDF), enter a formula like
=InterventionCount(B2:B5,H1:H2)
in some cell.
Option Explicit
Function InterventionCount(myRange As Range, myList As Range) As Long
    Dim RE As Object, MC As Object
    Dim vRange As Variant, vList As Variant
    Dim sPat As String
    Dim I As Long

    vRange = myRange
    vList = myList

    'Build an alternation pattern from the list of IDs
    If IsArray(vList) Then
        For I = 1 To UBound(vList)
            If Not vList(I, 1) = "" Then _
                sPat = sPat & "|" & vList(I, 1)
        Next I
    Else
        sPat = "|" & vList
    End If
    sPat = "\b(?:" & Mid(sPat, 2) & ")\b"

    Set RE = CreateObject("vbscript.regexp")
    With RE
        .Global = True
        .IgnoreCase = True
        .Pattern = sPat
    End With

    'Count the matches in every cell of the data range
    For I = 1 To UBound(vRange)
        Set MC = RE.Execute(vRange(I, 1))
        InterventionCount = InterventionCount + MC.Count
    Next I
End Function
For a non-VBA solution you could use a helper column. Again, List is a single column which contains the list of people you want to add up, one entry per cell.
If your data is in Column B, then add a column and enter this formula in B2:
This formula must be array-entered. The $A:$J terms represent a counter allowing for up to ten items per entry in column B. If there might be more than that, expand as needed: e.g. for up to 26 items, change them to $A:$Z.
=SUM(N(TRIM(MID(SUBSTITUTE(B2," ",REPT(" ",99)),(COLUMN($A:$J)=1)+(COLUMN($A:$J)>1)*(COLUMN($A:$J)-1)*99,99))=(List)))
Fill down as far as necessary, then SUM the column to get your total.
To array-enter a formula, after entering the formula into the cell or formula bar, hold down Ctrl+Shift while hitting Enter. If you did this correctly, Excel will place braces {...} around the formula.
I finally went for a completely different solution based on my working formula for 1 employee:
SUM(
LEN('[myFile.xlsx]Sheet1'!$A:$A)
-LEN(
SUBSTITUTE('[myFile.xlsx]Sheet1'!$A:$A;$Z$1;"")
)
)
/LEN($Z$1)
Instead of trying something more complicated, I just added a new column to my employee list where the total is evaluated for each employee (it was already needed elsewhere anyway). Then I just have to sum up all the employees to get my total.
It is not as elegant as I would like and it feels like a workaround, but since it is the easiest solution from a programming standpoint, and I need the individual data anyway, it's what I really need for now.
+1 to all the other answers for your help, though.

Load time variance with .CacheSize/.PageSize in ADODB.Recordset

I am working on a project for a client using a classic ASP application I am very familiar with, but in his environment it is performing more slowly than I have ever seen in a wide variety of other environments. I'm on it with many solutions; however, the sluggishness has got me looking at something I've never had to look at before -- it's more of an "academic" question.
I am curious to understand why a category page with, say, 1800 product records takes ~3 times as long to load as a category page with, say, 54, when both are set to display 50 products per page. That is, when the number of items to loop through is the same, why does the variance in the total number of records make a difference in how long it takes to display a constant number of products?
Here are the methods used, abstracted to the essential aspects:
SELECT {tableA.fields} FROM tableA, tableB WHERE tableA.key = tableB.key AND {other refining criteria};
set rs = Server.CreateObject("ADODB.Recordset")
rs.CacheSize = iPageSize
rs.PageSize = iPageSize
pcv_strPageSize = iPageSize
rs.Open query, connObj, adOpenStatic, adLockReadOnly, adCmdText

dim iPageCount, pcv_intProductCount
iPageCount = rs.PageCount
If Cint(iPageCurrent) > Cint(iPageCount) Then iPageCurrent = Cint(iPageCount)
If Cint(iPageCurrent) < 1 Then iPageCurrent = 1

if NOT rs.eof then
    rs.AbsolutePage = Cint(iPageCurrent)
    pcArray_Products = rs.getRows()
    pcv_intProductCount = UBound(pcArray_Products, 2) + 1
end if
set rs = nothing

tCnt = Cint(0)
do while (tCnt < pcv_intProductCount) and (count < pcv_strPageSize)
    {display stuff}
    count = count + 1
loop
The recordset is converted to an array via getRows() and then destroyed; the records displayed will always number iPageSize or fewer.
Here's the big question:
Why, on the initial page load for the larger record set (~1800 records), does it take significantly longer to loop through the page size (say 50 records) than on the smaller record set (~54 records)? It's running through 0 to 49 either way, but takes a lot longer the larger the initial record set/getRows() array is. That is, why would it take longer to loop through the first 50 records when the initial record set/getRows() array is larger, when it's still looping through the same number of records/rows before exiting the loop?
Running MS SQL Server 2008 R2 Web edition
You are not actually limiting the number of records returned, and it will take longer to load 36 times as many records. You should change your query to limit the records directly, rather than retrieving all of them and terminating your loop after the first 50.
Try this:
SELECT *
FROM
(SELECT *, ROW_NUMBER() OVER(ORDER BY tableA.Key) AS RowNum
FROM tableA
INNER JOIN tableB
ON tableA.key = tableB.key
WHERE {other refining criteria}) AS ResultSet
WHERE RowNum BETWEEN 1 AND 50
Also make sure the columns you are using to join are indexed.
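For example, assuming the join column really is named key:

```sql
-- bracketed because KEY is a reserved word in T-SQL
CREATE INDEX IX_tableA_key ON tableA ([key]);
CREATE INDEX IX_tableB_key ON tableB ([key]);
```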

Excel VBA: Chart-making macro that will loop through unique name groups and create corresponding charts?

Alright, I've been racking my brain, reading up on Excel programming for dummies, and looking all over the place, but I'm stressing over this little problem I have here. I'm completely new to VBA programming, or really any programming language, but I'm trying my best to get a handle on it.
The Scenario and what my goal is:
The picture below is a sample of a huge long list of data I have from different stream stations. The sample only holds two (niobrara and snake) to illustrate my problem, but in reality I have a little over 80 stations worth of data, each varying in the amount of stress periods (COLUMN B).
COLUMN A, is the station name column.
COLUMN B, stress period number
COLUMN C, modeled rate
COLUMN D, estimated rate
What I have been TRYING to figure out is how to make a macro that will loop through the station names (COLUMN A) and, for each UNIQUE group of station names, make a chart that pops out to the right of the group, say in the COLUMN E area.
The chart is completely simple, it just needs two series scatterplot/line chart; one series with COLUMN B as x-value and COLUMN C as y-value; and the other series needs COLUMN B as x-value and COLUMN D as y-value.
Now my main ordeal is that I don't know how to make the macro distinguish between station names, use all the data relating to one name to make its chart, then loop on to the next station group and create the corresponding chart for that, continuing through all 80+ station names in COLUMN A and making the corresponding 80+ charts to the right of it all, somewhere like COLUMN E.
If I had enough points to "bounty" this, I would in a heartbeat. But since I do not, whoever can solve my dilemma would receive my sincere gratitude for helping me run this process smoothly and hopefully bettering my understanding of scenarios like this in the future. If there is any more information I need to clarify to make my question more understandable, please comment and I'd be happy to explain the subject in more detail.
Cheers.
Oh, and for extra credit: now that I think about it, I manually entered the numbers in COLUMN B. Since the loop would need to use that column as the x-value, it would be important if it could fill that column on its own before it made the chart (I would imagine it would be something as simple as counting out the rows that correspond to the station name). But again, I know not the proper terminology, hence the pickle I'm in; however, if the veteran programmer savvy enough to answer this question could add it, I'd imagine such a piece of code would be simple enough, yet crucial to the success of the macro I seek.
Try this
Sub MakeCharts()
    Dim sh As Worksheet
    Dim rAllData As Range
    Dim rChartData As Range
    Dim cl As Range
    Dim rwStart As Long, rwCnt As Long
    Dim chrt As Chart

    Set sh = ActiveSheet
    With sh
        ' Get reference to all data
        Set rAllData = .Range(.[A1], .[A1].End(xlDown)).Resize(, 4)
        ' Get reference to first cell in data range
        rwStart = 1
        Set cl = rAllData.Cells(rwStart, 1)
        Do While cl <> ""
            ' cl points to first cell in a station data set
            ' Count rows in current data set
            rwCnt = Application.WorksheetFunction. _
                CountIfs(rAllData.Columns(1), cl.Value)
            ' Get reference to current data set range
            Set rChartData = rAllData.Cells(rwStart, 1).Resize(rwCnt, 4)
            With rChartData
                ' Auto fill sequence number
                .Cells(1, 2) = 1
                .Cells(2, 2) = 2
                .Cells(1, 2).Resize(2, 1).AutoFill _
                    Destination:=.Columns(2), Type:=xlFillSeries
            End With
            ' Create Chart next to data set
            Set chrt = .Shapes.AddChart(xlXYScatterLines, _
                rChartData.Width, .Range(.[A1], cl).Height).Chart
            With chrt
                .SetSourceData Source:=rChartData.Offset(0, 1).Resize(, 3)
                ' --> Set any chart properties here
                ' Add Title
                .SetElement msoElementChartTitleCenteredOverlay
                .ChartTitle.Caption = cl.Value
                ' Adjust plot size to allow for title
                .PlotArea.Height = .PlotArea.Height - .ChartTitle.Height
                .PlotArea.Top = .PlotArea.Top + .ChartTitle.Height
                ' Name series
                .SeriesCollection(1).Name = "=""Modeled"""
                .SeriesCollection(2).Name = "=""Estimated"""
                ' Turn off markers
                .SeriesCollection(1).MarkerStyle = -4142
                .SeriesCollection(2).MarkerStyle = -4142
            End With
            ' Get next data set
            rwStart = rwStart + rwCnt
            Set cl = rAllData.Cells(rwStart, 1)
        Loop
    End With
End Sub