Finding the Complexity of Nested Loops

I'm given this loop pseudocode (where "to" is equivalent to "<="):
sum = 0;
for i = 1 to n
    for j = 1 to i^3
        for k = 1 to j
            sum++
I know the outermost loop runs n times.
Do the two inner loops also run n times though (making the entire complexity O(n^3))?
For instance, with n = 5:
i: 1 <= 5, 2 <= 5, ...
j: 1 <= 1^3 = 1, 2 <= 2^3 = 8, ...
k: 1 <= 1, 2 <= 2, ...
And this would continue n times for each loop, making it n^3?

This seems like a tricky problem; those inner loops are more complex than just n.
The outer loop runs n times.
The second loop goes up to i^3. On the last iteration of the outer loop, i equals n, so by then this loop runs n^3 times. Averaged over all outer iterations it is only a constant fraction of n^3, but we ignore constant factors in Big O.
The third loop goes up to j, and by the end of the second loop j equals i^3, which we already determined reaches n^3.
So it looks like:
1st loop: n
2nd loop: n^3
3rd loop: n^3
Which multiplies out to n^7. I'd want someone else to verify this though. Gotta love Big O.

You can use Sigma notation to explicitly count the number of basic operations in the loop (let sum++ be a basic operation):
Σ_{i=1}^{n} Σ_{j=1}^{i^3} Σ_{k=1}^{j} 1
  = Σ_{i=1}^{n} Σ_{j=1}^{i^3} j
  = Σ_{i=1}^{n} i^3(i^3 + 1)/2        (i)
  = (1/2) Σ_{i=1}^{n} (i^6 + i^3)     (ii)
Where
(i): partial sum formula (from Wolfram Alpha);
(ii): expanding the expression (from Wolfram Alpha).
The dominant term is Σ i^6, which grows as n^7. Hence, the complexity, using Big-O notation, is O(n^7).

Related

What is the time complexity of the following dependent loops?

I have a question that needs an answer before an exam I'm supposed to have this week.
i = 1;
while (i <= n)
{
    for (j = 1; j < i; j++)
        printf("*");
    j *= 2;
    i *= 3;
}
I have these dependent loops. I calculated the outer loop's big O to be O(log n).
The inner loop goes from 1 to i - 1 for every iteration of the outer loop.
The problem I'm having is that I don't know how to calculate the inner loop's time complexity, and then the overall complexity (I'm used to just multiplying both complexities, but I'm not sure about this one).
Thanks a lot!
P.S: I know that the j *= 2 doesn't affect the for loop.
As you recognized, computing the complexity of a loop nest where the bounds of an inner loop vary across iterations of the outer loop is not as easy as a simple multiplication of two iteration counts. You need to look more deeply to get the tightest possible bound.
The question can be taken to be asking how many times the body of the inner loop is executed, as a function of n. On the first outer-loop iteration, i is 1, so j is never less than i, and there are no inner-loop iterations. Next, i is 3, so there are two inner-loop iterations, then eight the next time, then 26 ... in short, 3^(i-1) - 1 inner-loop iterations on the i-th pass. You need to add those all up to compute the overall complexity.
Well, that sum is Σ_{i=1}^{⌊log n⌋} (3^(i-1) - 1), so you could say that the complexity of the loop nest is
O(Σ_{i=1}^{⌊log n⌋} (3^(i-1) - 1)),
but such an answer is unlikely to get you full marks.
We can simplify that by observing that our sum is bounded by a related one:
= O(Σ_{i=1}^{⌊log n⌋} 3^(i-1)).
At this point (if not sooner) it would be useful to recognize the sum-of-powers pattern therein. It is often useful to know that 2^0 + 2^1 + ... + 2^(k-1) = 2^k - 1. This is closely related to base-2 numeric representations, and a similar formula can be written for any other natural-number base. For example, for base 3, it is 2·3^0 + 2·3^1 + ... + 2·3^(k-1) = 3^k - 1. This might be enough for you to intuit the answer: the total number of inner-loop iterations is bounded by a constant multiple of the number of inner-loop iterations on the last iteration of the outer loop, which in turn is bounded by n.
But if you want to prove it, then you can observe that the sum in the previous bound expression is itself bounded by a related definite integral:
= O(∫_0^{log n} 3^i di)
... and that has a closed-form solution:
= O((3^(log n) - 3^0) / ln 3),
which clearly has a simpler bound itself:
= O(3^(log n)).
Exponentials of logarithms reduce to linear functions of the logarithm's argument (with logarithms base 3 here, 3^(log n) = n). Since we need only an asymptotic bound, we don't care about the details, and thus we can go straight to
= O(n)

How is this loop's time complexity O(n^2)?

How is this loop's time complexity O(n^2)?
for (int i = n; i > 0; i -= c)
{
    for (int j = i + 1; j <= n; j += c)
    {
        // some O(1) expressions
    }
}
Can anyone explain?
Assumptions
n > 0
c > 0
First loop
The first loop starts with i = n and subtracts c from i at each step. If c is big, the first loop iterates only a few times (try n = 50, c = 20 and you will see). If c is small (say c = 1), it iterates n times.
Second loop
The second loop follows the same reasoning: if c is big it iterates only a few times; if c is small, many times, and in the worst case n times.
Combined / Big O
Big O notation gives you an upper bound on the time complexity of an algorithm. In your case, combining the upper bounds of the first and second loops gives O(n·n) = O(n^2).

Why is the time complexity O(n) instead of O(n^2) in this code?

Why isn't the time complexity here O(n^2), and instead it is O(n)?
Doesn't the first loop run n times, and the second one as well, making it O(n*n)? What is wrong here?
void f(int n){
    for ( ; n > 0; n /= 2){
        int i;
        for (i = 0; i < n; i++){
            printf("hey");
        }
    }
}
Doesn't the first loop run n times, and the second one as well, making it O(n*n)?
Above statement is false, since:
The outer loop does not run n times. (The outer loop runs O(log n) times, but it does not matter in this case.)
For the inner loop, the number of loops differs as the value of n changes.
To get the time complexity of this code, we should count the total number of times the body of the inner loop is executed.
Clearly, the body of the inner loop is executed n times, for each value of n.
The value of n is determined by the for statement of the outer loop. It starts from the value given as the argument to the function, and is halved each time the body of outer loop is executed.
So as already stated by the comments, since n + n/2 + n/4 + n/8 + ... = 2n, the time complexity for this algorithm is O(n).
For some more concrete mathematical proof on this:
Find an integer k such that 2^(k-1) < n <= 2^k. For this k:
A lower bound for the total number of inner loops is 1 + 2 + 4 + ... + 2^(k-1) = 2^k - 1 >= n - 1 ∈ Ω(n).
An upper bound for the total number of inner loops is 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 < 4n - 1 ∈ O(n).
Therefore the total number of inner loops is Θ(n), as well as O(n).

Calculating Time Complexity for nested loops

Got this question on a test that I've been stuck on for a few days regarding Big O time complexity analysis:
Below is the C code:
if ( A > B ) {
    for ( i = 0; i < n^2/100; i++ ) {   //1
        for ( j = n^2; j > i; j-- ) {   //2
            A += B;
        }
    }
}
else {
    for ( i = 0; i < 2n; i++ ) {        //3
        for ( j = 3n; j > i; j-- ) {    //4
            A += B;
        }
    }
}
My first instinct was that this algorithm would be O(n^2) given the nested for loops, but that wasn't among the multiple-choice answers. I tried to count each loop's iterations manually but had trouble accounting for the changing i in each inner loop (2 and 4), and I'm also having trouble writing it as a summation.
Consider the first case where A > B. The inner loop executes a number of iterations equal to n^2 - i for each value of i iterated over by the outer loop. Consider n = 2 and i = 1. n^2 = 4 and the inner loop iterates over j = 4, j = 3, j = 2, three iterations, consistent with our finding.
The total number of iterations of the inner loop is therefore the sum of n^2 - i for i from 0 to floor(n^2/100) - 1. Let us define m := floor(n^2/100), the number of outer iterations. Then this sum equals m*n^2 - m(m-1)/2. Since m <= n^2/100, the sum is at most (n^2/100)*n^2 = n^4/100. Conversely, each term is at least n^2 - m >= (99/100)n^2, so the sum is at least m*(99/100)n^2, a constant multiple of n^4 for large n. From this we can see that the time complexity of the first case is O(n^4); indeed, it is also Omega(n^4) and therefore Theta(n^4).
In the case where A <= B, the analysis is similar. It is easy to show that the time complexity of the second case is O(n^2), Omega(n^2) and thus Theta(n^2).
Therefore, we may confidently say that:
The worst-case time complexity is O(n^4);
The best-case time complexity is Omega(n^2);
Each of these bounds may actually be given as Theta bounds instead.

How to determine the time complexity of this C program

void mystery2 (int n)
{
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while ( x > 0 )
            x -= delta;
    }
}
How to determine the time complexity of this program using tracking tables like here http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html#application and not by guessing?
For each iteration of the outer loop, initially x = i, and then x is decremented by 1/i each time, so the while loop repeats about i/(1/i) = i^2 times.
So for each iteration of for (i = 1; i <= n; i++), the inner part has a complexity of O(i^2). As i grows from 1 to n, this adds up to 1^2 + 2^2 + 3^2 + ... + n^2, which is roughly n^3/3. Thus it's O(n^3).
Outer loop (for)    Inner loop iterations
i = 1               1
i = 2               4
i = 3               9
...                 ...
i = n               n^2
Total               ~n^3/3
This is relatively straightforward: you need to determine how many times each of the two nested loops executes, then consider the complexities together.
The outer loop is a trivial for loop; it executes n times.
The inner loop requires a little more attention: it keeps subtracting 1/i from x until x reaches zero or goes negative. It takes i iterations of the while loop to subtract 1 from x, and since x is initially set to i, the total time taken by the inner loop is about i^2.
The total is, therefore, the sum of i^2 for i between 1 and n.
Wolfram Alpha tells us that this sum is n(n+1)(2n+1)/6,
which expands to the polynomial n^3/3 + n^2/2 + n/6 and thus has the complexity O(n^3).
