How many times will the function be invoked? - c

Here I have the loop:
for (i = n; i < 2*n; i += 4) {
    for (j = 0; j < 3*i; j += 2) {
        function();
    }
}
How can I count the number of calls to function() (in terms of n) without running this code?
My idea is to use an arithmetic progression, whose sum is S = (a1 + ak) * k / 2, where a1 is the number of inner-loop iterations when i has its initial value and ak is the number of inner-loop iterations when i has its final value.
But I cannot express it as a single formula with n as a variable.
Do you have any ideas about that?

The inner loop performs 3*i/2 calls (exactly, when i is even). The outer loop has i = n, n+4, n+8, ..., 2n-4; assuming n is a multiple of 4, that is n/4 values. Therefore we have:
count = 3*n/2 + 3*(n+4)/2 + 3*(n+8)/2 + ... + 3*(2n-4)/2 =
= 3/2 * (n + (n+4) + (n+8) + ... + (2n-4)) =
= 3/2 * (3n^2 - 4n) / 8 =
= (9n^2 - 12n) / 16
(Edit: there may still be small inaccuracies that need to be fixed)
Edit #2 - I followed self's correction, and now I get the expected result.
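To sanity-check the closed form, here is a small C program (my own sketch, assuming n is a multiple of 4 so that every i is even and the last outer value is 2n-4) that counts the calls by brute force and compares the total with (9n^2 - 12n)/16:

#include <stdio.h>

int main(void) {
    /* Brute-force count of function() calls versus the closed form,
       for n = 4, 8, ..., 400 (multiples of 4 only). */
    for (long n = 4; n <= 400; n += 4) {
        long count = 0;
        for (long i = n; i < 2 * n; i += 4)
            for (long j = 0; j < 3 * i; j += 2)
                count++;                               /* stands in for function() */
        long formula = (9 * n * n - 12 * n) / 16;
        if (count != formula)
            printf("mismatch at n=%ld: loop=%ld formula=%ld\n", n, count, formula);
    }
    printf("done\n");
    return 0;
}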

Well, you've got the formula for an arithmetic progression. When i = n, the inner loop goes 3n/2 times (more or less -- you may have to round to a whole number). You may have to tweak the upper end a bit because there's no guarantee that n is divisible by 4, but you can do the same for the final value of i. The outer loop itself will run n/4 times (again, rounded to a whole number).

Below are the formal steps that would allow you to deduce the exact number of times function() would execute:
The outer loop will execute ceil(n/4) times; the number of inner-loop iterations depends on the current value of i in the outer loop. I tried to detail as much as possible to make this solution clear enough.

Related

Nested loop analyzing (each loop bounds inner loop)

In my data structures lecture, I got homework about algorithms and time complexity. I could not actually figure out what I need to do for this.
Question: What is the time complexity of this algorithm?
My approach was to analyze loop by loop, removing constant factors and lower-order terms from each loop. Since there are three loops nested within each other, the complexity should be O(n^3). The critical point is that the innermost loop is bounded dynamically.
What is the mistake here (if there is one):
int c = 0;
for (int i = 0; i < n * n; ++i)
    for (int j = 0; j < n; ++j)
        for (int k = 0; k < j * 2; ++k)
            c = c + 1;
return c;
All answers are welcomed.
In order to compute the time complexity, you can try and evaluate the number of iterations of the innermost loop.
the loop on k evaluates a simple expression 2 * j times.
the loop on j runs n times, with j going from 0 to n - 1. Hence, for each iteration of the outer loop, the inner loop runs 2 * (0 + 1 + ... + (n - 1)) = n * (n - 1) times.
the outer loop runs n * n times. Hence the innermost statement runs exactly n * n * n * (n - 1) times.
Simplifying for the dominant term, the resulting time complexity is O(n^4).
Yet a very shrewd compiler would reduce this complexity to constant time O(1) and generate code for:
return n * n * n * (n - 1);
Trying this on Godbolt's compiler explorer shows that none of the common compilers achieve this as of now, albeit clang goes to great lengths trying to optimize the code with unfathomable SIMD code.
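If you would rather verify the closed form than trust the algebra, a quick brute-force check is easy to write (my own sketch, not part of the answer above):

#include <stdio.h>

int main(void) {
    /* Compare the triple loop's counter with the closed form n^3 * (n - 1). */
    for (long n = 1; n <= 40; ++n) {
        long c = 0;
        for (long i = 0; i < n * n; ++i)
            for (long j = 0; j < n; ++j)
                for (long k = 0; k < j * 2; ++k)
                    c = c + 1;
        long closed = n * n * n * (n - 1);
        if (c != closed)
            printf("mismatch at n=%ld: loop=%ld closed=%ld\n", n, c, closed);
    }
    printf("done\n");
    return 0;
}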

What is the time complexity of the following dependent loops?

I have a question that needs an answer before an exam I'm supposed to have this week.
i = 1;
while (i <= n)
{
    for (j = 1; j < i; j++)
        printf("*");
    j *= 2;
    i *= 3;
}
I have these dependent loops, and I calculated the outer loop's big O to be O(log n).
The inner loop goes from 1 to i - 1 for every iteration of the outer loop.
The problem I'm having is that I do not know how to calculate the inner loop's time complexity, and then the overall complexity (I'm used to just multiplying both complexities, but I'm not sure about this one).
Thanks a lot!
P.S: I know that the j *= 2 doesn't affect the for loop.
As you recognized, computing the complexity of a loop nest where the bounds of an inner loop vary for different iterations of the outer loop is not as easy as a simple multiplication of two iteration counts. You need to look more deeply to get the tightest possible bound.
The question can be taken to be asking how many times the body of the inner loop is executed, as a function of n. On the first outer-loop iteration, i is 1, so j is never less than i, and there are no inner-loop iterations. Next, i is 3, so there are two inner-loop iterations, then eight the next time, then 26 ... in short, 3^(i-1) - 1 inner-loop iterations on the i-th pass of the outer loop. You need to add those all up to compute the overall complexity.
Well, that sum is Σ_{i=1}^{floor(log n)} (3^(i-1) - 1), so you could say that the complexity of the loop nest is
O(Σ_{i=1}^{floor(log n)} (3^(i-1) - 1))
, but such an answer is unlikely to get you full marks.
We can simplify that by observing that our sum is bounded by a related one:
= O(Σ_{i=1}^{floor(log n)} 3^(i-1))
. At this point (if not sooner) it would be useful to recognize the sum-of-powers pattern therein. It is often useful to know that 2^0 + 2^1 + ... + 2^(k-1) = 2^k - 1. This is closely related to base-2 numeric representation, and a similar formula can be written for any other natural-number base. For example, for base 3, it is 2*3^0 + 2*3^1 + ... + 2*3^(k-1) = 3^k - 1. This might be enough for you to intuit the answer: the total number of inner-loop iterations is bounded by a constant multiple of the number of inner-loop iterations on the last pass of the outer loop, which in turn is bounded by n.
But if you want to prove it, then you can observe that the sum in the previous bound expression is itself bounded by a related definite integral:
= O(∫ from 0 to log n of 3^i di)
... and that has a closed-form solution:
= O((3^(log n) - 3^0) / log 3)
, which clearly has a simpler bound itself:
= O(3^(log n))
. Exponentials of logarithms reduce to linear functions of the logarithm argument (with the logs taken base 3, 3^(log n) is just n). Since we need only an asymptotic bound, we don't care about the details, and thus we can go straight to
= O(n)
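As an empirical cross-check (my own sketch, not part of the answer above), you can count the inner-loop iterations directly and watch the total stay within a small constant multiple of n, as an O(n) bound predicts:

#include <stdio.h>

int main(void) {
    /* Count how many times printf would run; the ratio total/n stays below 3/2. */
    for (long n = 10; n <= 10000000; n *= 10) {
        long total = 0;
        long i = 1;
        while (i <= n) {
            for (long j = 1; j < i; j++)
                total++;                    /* stands in for printf("*") */
            i *= 3;                         /* the dead j *= 2 is omitted */
        }
        printf("n=%-9ld inner iterations=%-9ld ratio=%.3f\n",
               n, total, (double)total / n);
    }
    return 0;
}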

Calculating Time Complexity for nested loops

I got this question on a test and have been stuck on it for a few days; it's about Big O time complexity analysis:
Below is the C code:
if ( A > B ) {
    for ( i=0; i<n^2/100; i++ ){    //1
        for ( j=n^2; j>i; j-- ){    //2
            A += B;}}
}
else {
    for ( i=0; i<2n; i++ ){         //3
        for ( j=3n; j>i; j-- ){     //4
            A += B;}}
}
My first instinct was that this algorithm would have a big O of O(n^2) with the nested for loops and such, but that wasn't one of the multiple-choice answers. I tried to count each loop's iterations manually, but I'm having trouble accounting for the changing i in each inner loop (2 and 4). I'm also having trouble trying to write it as a summation.
Consider the first case where A > B. The inner loop executes a number of iterations equal to n^2 - i for each value of i iterated over by the outer loop. Consider n = 2 and i = 1. n^2 = 4 and the inner loop iterates over j = 4, j = 3, j = 2, three iterations, consistent with our finding.
The total number of iterations of the inner loop is therefore the sum of n^2 - i over all i from 0 to floor(n^2/100) - 1. Let us define K := floor(n^2/100), the number of outer-loop iterations. Then this sum equals K*n^2 - K(K-1)/2. Since K is roughly n^2/100, the first term grows like n^4/100 and the subtracted term like n^4/20000, so the whole sum grows like n^4 * (1/100 - 1/20000) plus lower-order terms. From this we can see that the time complexity of the first case is O(n^4). Indeed, it is also Omega(n^4) and therefore Theta(n^4).
In the case where A <= B, the analysis is similar. It is easy to show that the time complexity of the second case is O(n^2), Omega(n^2) and thus Theta(n^2).
Therefore, we may confidently say that:
The worst-case time complexity is O(n^4);
The best-case time complexity is Omega(n^2);
Each of these bounds may actually be given as Theta bounds instead.
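A quick way to convince yourself of the n^4 count is to measure it (my own sketch; it assumes n^2 in the pseudo-code means n*n, not C's XOR operator) and compare it with the closed form K*n^2 - K*(K-1)/2, where K = floor(n^2/100):

#include <stdio.h>

int main(void) {
    for (long n = 10; n <= 200; n += 10) {
        long count = 0;
        long K = (n * n) / 100;              /* number of outer-loop iterations */
        for (long i = 0; i < K; i++)
            for (long j = n * n; j > i; j--)
                count++;                      /* stands in for A += B */
        long closed = K * n * n - K * (K - 1) / 2;
        printf("n=%-4ld count=%-10ld closed=%-10ld %s\n",
               n, count, closed, count == closed ? "ok" : "MISMATCH");
    }
    return 0;
}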

What is the complexity of the following code?

Find the time complexity of the following code.
The answer given is O(log(n) * n^(1/2)), but I am not getting it.
I want someone to explain this.
i = n;
while (i > 0)
{
    k = 1;
    for (j = 1; j <= n; j += k)
        k++;
    i = i / 2;
}
Take this code segment:
k = 1;
for (j = 1; j <= n; j += k)
    k++;
The values of j over various iterations will be 1, 3, 6, 10, 15, 21, 28, ....
Note that these numbers have the closed form (m+1)(m+2)/2, where m is the number of iterations that have gone by. If we want to know how many iterations this loop will run for, we need to solve (m+1)(m+2)/2 = n, which has the solution m = (sqrt(8n + 1) - 3)/2 = O(sqrt(n)). So this loop will run O(sqrt(n)) times.
The outer loop will run O(log(n)) times (this is rather easy to see). So overall, we have O(log(n)sqrt(n)).
edit: Or perhaps easier than solving (m+1)(m+2)/2 = n directly would simply be to note that (m+1)(m+2)/2 = O(m^2), and so O(m^2) = n implies m = O(sqrt(n)).
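As a rough numerical check (my own sketch, not from either answer), you can count how often k++ executes and compare the total with sqrt(n) * log2(n); the ratio settles near a constant (roughly sqrt(2)), which is what O(sqrt(n) * log n) predicts:

#include <stdio.h>
#include <math.h>

int main(void) {
    for (long n = 100; n <= 100000000; n *= 10) {
        long total = 0;
        long i = n;
        while (i > 0) {
            long k = 1;
            for (long j = 1; j <= n; j += k) {
                k++;
                total++;                      /* one unit of inner-loop work */
            }
            i = i / 2;
        }
        double estimate = sqrt((double)n) * log2((double)n);
        printf("n=%-10ld total=%-8ld total/estimate=%.3f\n",
               n, total, total / estimate);
    }
    return 0;
}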
The complexity would be :
(log n + 1) * (-1 + sqrt(1 + 8n))/2 = O(sqrt(n) * log n)
Here log n is base 2.
Suppose n is 36.
The outer loop will iterate log n + 1 times because the value is halved every time: 36, 18, 9, 4, 2, 1.
The inner loop has j values 1, 3, 6, 10, 15, 21, 28, 36. Every j value is the sum of the terms of the AP 1 + 2 + 3 + ... + w = w(w+1)/2. So w(w+1)/2 = n. Solving this quadratic equation we get w = (-1 + sqrt(1 + 8n))/2, i.e. the number of iterations of the inner loop.
For n = 36, w = 8.
The total complexity thus comes out to be O(log n * sqrt(n)).

Replace for loop with formula

I have this loop that runs in O(end - start) and I would like to replace it with something O(1).
If "width" wouldn't be decreasing, it would be pretty simple.
for (int i = start; i <= end; i++, width--)
    if (i % 3 > 0) // 1 or 2, but not 0
        z += width;
start, end and width have positive values
As someone else mentioned, this is probably easiest to think of as the sum of two series.
x x+3 x+6 ... x+3N
+ x+3N x+3(N-1) x+3(N-2) ... x
-----------------------------------
2x+3N 2x+3N 2x+3N ... 2x+3N
The above can be simplified to
(2x+3N)(N+1)
Which means the sum of one of them is really ...
(2x+3N)(N+1)/2
This equation would need to be applied for both series. It is possible that N would be different for both.
Thus, all you have to do is determine your starting point, and the number of items in the series. That shall be left as an exercise for the student.
Hope this helps.
Notice that
width == initial_width - (i - start)
so the summation can be rewritten as
Σ_{i = start..end, i mod 3 ≠ 0} (initial_width + start - i)
  = Σ_{i = start..end} (initial_width + start - i) - Σ_{j = ⌈start/3⌉..⌊end/3⌋} (initial_width + start - 3j)
The rest should be simple.
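Carrying that through gives one possible O(1) replacement (my own sketch; the helper names z_loop and z_formula are made up for illustration): sum width over every i in [start, end], subtract the arithmetic series of the terms where i % 3 == 0, and check the result against the original loop:

#include <stdio.h>
#include <stdlib.h>

/* Original loop, kept for comparison. */
long z_loop(long start, long end, long width) {
    long z = 0;
    for (long i = start; i <= end; i++, width--)
        if (i % 3 > 0)
            z += width;
    return z;
}

/* Closed form: (sum of width over all i) minus (sum over multiples of 3).
   Assumes start, end and width are positive, as stated in the question. */
long z_formula(long start, long end, long width) {
    long m = end - start + 1;                    /* number of iterations           */
    long all = m * width - m * (m - 1) / 2;      /* width decreases by 1 each step */
    long m0 = (start + 2) / 3 * 3;               /* first multiple of 3 >= start   */
    long m1 = end / 3 * 3;                       /* last multiple of 3 <= end      */
    if (m0 > m1)
        return all;                              /* no multiples of 3 in range     */
    long c = (m1 - m0) / 3 + 1;                  /* how many terms get skipped     */
    /* Arithmetic series of the skipped terms width - (i - start), i = m0, m0+3, ..., m1. */
    long skipped = c * (2 * width - (m0 - start) - (m1 - start)) / 2;
    return all - skipped;
}

int main(void) {
    for (int t = 0; t < 100000; t++) {
        long start = rand() % 50 + 1;
        long end = start + rand() % 50;
        long width = end - start + 1 + rand() % 50;   /* keep width positive throughout */
        if (z_loop(start, end, width) != z_formula(start, end, width))
            printf("mismatch: start=%ld end=%ld width=%ld\n", start, end, width);
    }
    printf("done\n");
    return 0;
}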
It's probably easiest to think of this as the sum of two separate series, one for when i%3 == 1 and the other for when i%3 == 2. Alternatively, you could figure it as the sum for all values of i minus the sum for i%3 == 0. For the sake of argument, let's look at the first half of the latter approach: summing all the values of width.
In this case, width will start at some initial value, and each iteration its value will be reduced by 1. In the last iteration, its value will have been reduced by (end-start). Perhaps it's easiest to think of it as a triangle. Just to keep things simple, we'll use small numbers -- we'll start with width = 5, start = 1 and end = 5 -- and draw a diagram:
Values of width:
*
**
***
****
*****
What we're really looking for is the area of that triangle -- which is a pretty well-known formula from elementary geometry -- a*b/2, where a and b are the lengths of the two sides (in this case, defined by the initial value of width and end-start). That assumes it really is a triangle though -- i.e. that it decrements down to 0. In reality, there's a good chance that we're dealing with a truncated triangle -- but the formula for that is also well known (a1*b/2 + a2*b/2, where the a's are the heights of the right and left sides, and b is the width).
I came up with this ugly method:
int start; // = some number
int end; // = ...
int initialwidth; // = ...
int each = (end+1)/3 - (start-1)/3 - 1;
int loop = 2*(3-(start+2)%3)+1;
int total = each*loop + 3*each*(each-1) + (start%3==1) + (end-start)*(end%3==1);
int result = -total + initialwidth*(1 + end - start - end/3 + (start-1)/3);
total will give the sum of (i-start)s when (i%3 > 0) for i=start to end.
result will give the sum of widths added to z.
The closed form of sum(i = 1 ... n) of i is n(n+1)/2. You should be able to use this with a little algebra to find a closed form that provides you with the result you're looking for.
Do you want something like z = 2 * width * (start - end) / 3 + (start - end) % 3? (Not quite right, but close enough to get you on the right track.)
This isn't a full answer, but you should notice that:
x = end - start;
k = ~(-1 << x); // I think (width * k)>>x would be your z if you didn't have the conditional
and that a value that, from the LSB up, has two bits set, one bit cleared, two bits set, one bit cleared (0x...011011011) could be used to compute where the %3 is 0.
R = k - (k & 0x...011011011); // This is the series 3 + (3 << 3) + (3 << 6) ...
z = (R * width)>>x; // I think.
Just something to try. I've probably made some kind of mistake.
