How to determine the time complexity of this C program

void mystery2(int n)
{
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while (x > 0)
            x -= delta;
    }
}
How do I determine the time complexity of this program using tracking tables like the ones at http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html#application, rather than by guessing?

For each iteration of the outer loop, x initially equals i and is then decremented by 1/i each time, so the inner loop repeats i / (1/i) = i^2 times.
So, for each iteration of for (i = 1; i <= n; ++i), the inner part does O(i^2) work. As i grows from 1 to n, this adds up to 1^2 + 2^2 + 3^2 + ... + n^2, which is roughly n^3/3. Thus it's O(n^3).
Outer loop (i)    Inner loop iterations
i = 1             1
i = 2             4
i = 3             9
...               ...
i = n             n^2
Total             ~n^3/3

This is relatively straightforward: you need to determine how many times each of the two nested loops executes, and then combine the two complexities.
The outer loop is a trivial for loop; it executes n times.
The inner loop requires a little more attention: it keeps subtracting 1/i from x (which starts at i) until x reaches zero or goes negative. It takes i iterations of the while loop to subtract 1 from x, so the total time taken by the inner loop is i * i = i^2.
The total is therefore the sum of i^2, for i between 1 and n.
Wolfram Alpha tells us that the answer to this is n*(n+1)*(2n+1)/6
This expands to the polynomial n^3/3 + n^2/2 + n/6, which is O(n^3).
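If you want to go beyond the table and check the growth empirically, one option (a sketch of my own, with the hypothetical helper name count_inner) is to add a counter to the loop and watch how fast the count grows with n:

#include <stdio.h>

/* Instrumented copy of mystery2 (hypothetical helper): instead of just doing
   the work, it counts how many times the inner while-loop body executes. */
static long long count_inner(int n)
{
    long long count = 0;
    for (int i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while (x > 0) {
            x -= delta;
            count++;
        }
    }
    return count;
}

int main(void)
{
    /* For O(n^3), doubling n should multiply the count by roughly 8,
       and count / (n^3 / 3) should stay close to 1. */
    for (int n = 100; n <= 800; n *= 2) {
        long long c = count_inner(n);
        printf("n = %3d  iterations = %10lld  ratio to n^3/3 = %f\n",
               n, c, c / (n * n * (double)n / 3));
    }
    return 0;
}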

Related

What is the time complexity of fun()?

Is the running time of this function O(log n)?
int fun(int n) {
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
It's not O(log n), but it is O(n). You can think about it like this: Each run of the outer loop sends the remaining data (originally n) into the inner loop for processing, and then removes one half of it. The inner loop is clearly linear in the data it processes.
At first iteration, the outer loop sends the whole n into the inner loop, which "pays" n steps for processing it.
At the second iteration, there is n / 2 data left, so the inner loop pays n / 2 for it; it has paid 1.5n in total.
At the next iteration, there is n / 2 / 2 == n/4 data left, for which the inner loop pays an extra n/4, so 1.75n in total.
And so on, until the entire n has been paid for twice, so the cost is 2n, which is O(n), actually even ϴ(n).
The complexity would be O(n).
For example, suppose we take n = 32. Then across the iterations of the outer loop, the inner loop runs
32, 16, 8, 4, 2, 1
times. Adding these gives 63, which is the total number of times the inner loop ran, and that is 2*n - 1.
Mathematically, for any n this is the sum of a geometric progression, where the series is n, n/2, n/4, n/8, ..., 1.
Taking n = 32 again:
sum = a * (1 - r^nof) / (1 - r) = 32 * (1 - (1/2)^6) / (1 - 1/2) = 63
where nof (the number of times the outer loop runs) = log2(n) + 1 = 6, a = 32, and r = 1/2.
For any n this sum is less than 2*n, so the time complexity of your code is O(n).
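As a quick sanity check (a sketch of my own, not part of the answers above), fun already returns the iteration count, so you can print it next to 2n - 1:

#include <stdio.h>

int fun(int n) {
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}

int main(void) {
    /* For n = 32 the count is exactly 63 = 2*32 - 1; for other values of n
       it stays below 2n, which is what makes the loop O(n). */
    for (int n = 1; n <= 100000; n *= 10)
        printf("n = %6d  fun(n) = %6d  2n - 1 = %6d\n", n, fun(n), 2 * n - 1);
    return 0;
}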

Time complexity of nested for loop

What would be the time complexity of the following block of code, void function(int n)?
My attempt: the outermost loop runs n/2 times and the inner two run 2^q times. Then I equated 2^q with n and got q as (1/2) log n with base 2. Multiplying the time complexities, I get O(n log(n)), while the answer is O(n log^2(n)).
void function(int n) {
    int count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 1; j <= n; j = 2 * j)
            for (int k = 1; k <= n; k = k * 2)
                count++;
}
Time to apply the golden rule of understanding loop nests:
When in doubt, work inside out!
Let’s start with the original loop nest:
for (int i = n / 2; i <= n; i++)
    for (int j = 1; j <= n; j = 2 * j)
        for (int k = 1; k <= n; k = k * 2)
            count++;
That inner loop will run Θ(log n) times, since after m iterations of the loop we see that k = 2^m, and we stop when k ≥ n = 2^(lg n). So let’s replace that inner loop with this simpler expression:
for (int i = n / 2; i <= n; i++)
    for (int j = 1; j <= n; j = 2 * j)
        do Theta(log n) work;
Now, look at the innermost remaining loop. With exactly the same reasoning as before we see that this loop runs Θ(log n) times as well. Since we do Θ(log n) iterations that each do Θ(log n) work, we see that this loop can be replaced with this simpler one:
for (int i = n / 2; i <= n; i++)
    do Theta(log^2 n) work;
And here that outer loop runs Θ(n) times, so the overall runtime is Θ(n log^2 n).
I think that, based on what you said in your question, you had the right insights but just forgot to multiply in two copies of the log term, one for each of the two inner loops.
In your code there are 3 nested loops.
The first loop runs n/2 times, which is treated as n when calculating complexity.
The second loop runs log n times.
The third loop runs log n times.
So, finally, the time complexity is O(n * log n * log n) = O(n log^2 n).
Now, one may wonder why the running time of each of the two inner loops is log n. This can be generalized as follows:
Since we are multiplying by 2 in each iteration, we need the number of iterations q such that:
n = 2^q.
Taking log base 2 on both sides:
log2(n) = log2(2^q)
log2(n) = q * log2(2)
log2(n) = q * 1 [since log2(2) is 1]
So q is equal to log2(n).
So the overall time complexity is O(n log^2 n).
First loop: n/2
Second loop: log(n)
Third loop: log(n)
Notice that because the counter of each inner loop is multiplied by two in every step, it grows exponentially and reaches n in log(n) steps, in terms of time complexity.
Also notice that constants like 1/2 can safely be ignored in that case, which results in O(n * log(n) * log(n)), thus:
O(n log^2 n)
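To see both log factors empirically, a rough sketch (my own addition, using a made-up helper count_ops) counts the increments and divides by n * log2(n)^2; the ratio should settle near a constant:

#include <stdio.h>
#include <math.h>

/* Same loop nest as in the question, but returning the number of increments. */
static long long count_ops(int n) {
    long long count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 1; j <= n; j = 2 * j)
            for (int k = 1; k <= n; k = k * 2)
                count++;
    return count;
}

int main(void) {
    /* For Theta(n log^2 n), count / (n * log2(n)^2) should approach a constant
       (about 1/2 here, because the outer loop runs only n/2 + 1 times). */
    for (int n = 1 << 10; n <= 1 << 18; n <<= 4) {
        double lg = log2((double)n);
        printf("n = %7d  ratio = %f\n", n, count_ops(n) / (n * lg * lg));
    }
    return 0;
}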

Why is the time complexity O(n) instead of O(n^2) in this code?

Why isn't the time complexity here O(n^2), and instead it is O(n)?
Doesn't the first loop run n times, and likewise the second one, so that it becomes O(n*n)? What is wrong here?
void f(int n) {
    for (; n > 0; n /= 2) {
        int i;
        for (i = 0; i < n; i++) {
            printf("hey");
        }
    }
}
Doesn't the first loop run n times, and likewise the second one, so that it becomes O(n*n)?
The above statement is false, since:
The outer loop does not run n times. (It runs O(log n) times, but that does not matter in this case.)
For the inner loop, the number of iterations differs as the value of n changes.
To get the time complexity of this code, we should count the total number of times the body of the inner loop is executed.
Clearly, the body of the inner loop is executed n times, for each value of n.
The value of n is determined by the for statement of the outer loop. It starts from the value given as the argument to the function, and is halved each time the body of outer loop is executed.
So as already stated by the comments, since n + n/2 + n/4 + n/8 + ... = 2n, the time complexity for this algorithm is O(n).
For some more concrete mathematical proof on this:
Find an integer k such that 2^(k-1) < n <= 2^k. For this k:
A lower bound for the total number of inner loop iterations is 1 + 2 + 4 + ... + 2^(k-1) = 2^k - 1 >= n - 1 ∈ Ω(n).
An upper bound for the total number of inner loop iterations is 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 < 4n - 1 ∈ O(n).
Therefore the total number of inner loop iterations is Θ(n), and hence also O(n).
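A small sketch of my own (replacing the printf with a counter named count_f, which is not in the original code) makes the Θ(n) bound easy to observe:

#include <stdio.h>

/* Instrumented version of f: counts how many times the inner body would run. */
static long long count_f(int n) {
    long long count = 0;
    for (; n > 0; n /= 2)
        for (int i = 0; i < n; i++)
            count++;
    return count;
}

int main(void) {
    /* The count stays between n - 1 and 4n - 1, so count / n stays bounded. */
    int values[] = { 10, 1000, 100000, 10000000 };
    for (int t = 0; t < 4; t++) {
        long long c = count_f(values[t]);
        printf("n = %8d  count = %9lld  count / n = %f\n",
               values[t], c, c / (double)values[t]);
    }
    return 0;
}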

What is the complexity (Big O) of this function's loops?

void mystery2(int n)
{
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while (x > 0)
            x -= delta;
    }
}
Why is the Big-O time complexity of this function O(n^3) and not O(n^2)?
What I did: for i=1 ==> 1 iteration, i=2 ==> 2 iterations (in the while), i=3 ==> 3 iterations, ..., i=n ==> n iterations. If we sum all the iterations we get 1+2+3+4+...+n = n*(n+1)/2. So what am I missing here?
This is because the inner loop runs like this:
For i = 1, the inner loop runs 1 time.
For i = 2, the inner loop runs 4 times
// because x = 2 and delta = 0.5, so for x to reach 0 it has to iterate 4 times.
For i = 3, the inner loop runs 9 times
// because x = 3 and delta ≈ 0.33, so for x to reach 0 it has to iterate (at least) 9 times.
And so on.
So the inner loop runs i^2 times, and the total becomes 1^2 + 2^2 + 3^2 + ... + n^2 = n(n+1)(2n+1)/6, which is O(n^3).
I think you are looking at it as a standard integer decrement loop, which I also did at first, but the numbers are doubles, and delta is not 1 but 1 / (double)i, so the number of inner loop iterations it takes to fully decrement x does not increase linearly as i increases, but much more sharply, because delta gets smaller as i gets larger.
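To convince yourself that the inner count grows like i^2 rather than i, here is a small sketch of my own (the helper name inner_iterations is made up) that counts the subtractions for a single value of i:

#include <stdio.h>

/* Count how many subtractions of 1/i it takes to drive x from i down to 0. */
static long long inner_iterations(int i) {
    long long count = 0;
    double x = i;
    double delta = 1 / (double)i;
    while (x > 0) {
        x -= delta;
        count++;
    }
    return count;
}

int main(void) {
    /* The count tracks i^2 (up to floating-point rounding), not i. */
    for (int i = 1; i <= 1024; i *= 4)
        printf("i = %5d  iterations = %8lld  i^2 = %8lld\n",
               i, inner_iterations(i), (long long)i * i);
    return 0;
}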

Finding the Complexity of Nested Loops

I'm given the loop pseudocode:
where "to" is equivalent to "<="
sum = 0;
for i = 1 to n
    for j = 1 to i^3
        for k = 1 to j
            sum++
I know the outermost loop runs n times.
Do the two inner loops also run n times, though? (Making the entire complexity O(n^3).)
Where, for instance, n = 5:
i:  1 <= 5          2 <= 5
j:  1 <= 1^3 = 1    2 <= 2^3 = 8
k:  1 <= 1          2 <= 2
And this would continue n times for each loop, making it n^3?
This seems like a tricky problem; those inner loops are more complex than just n.
The outer loop is n.
The next loop goes to i^3. At the end of the outer loop, i will be equal to n, which means that by then this loop runs up to n^3 times. Technically it is smaller for most earlier values of i, but we ignore that since this is Big O.
The third loop goes to j, but at the end of the previous loop j will be equal to i^3. And we already determined that i^3 reaches n^3.
So it looks like:
1st loop: n
2nd loop: n^3
3rd loop: n^3
Multiplying these gives n^7. I'd want someone else to verify this, though. Gotta love Big O.
You can use Sigma notation to explicitly unroll the number of basic operations in the loop (let sum++ be the basic operation):
sum_{i=1}^{n} sum_{j=1}^{i^3} sum_{k=1}^{j} 1 = sum_{i=1}^{n} sum_{j=1}^{i^3} j = sum_{i=1}^{n} i^3 * (i^3 + 1) / 2
The partial sum formula for the two inner sums (which Wolfram Alpha confirms) is i^3 * (i^3 + 1) / 2, and expanding the outer sum gives a polynomial whose leading term is n^7/14.
Hence the complexity, using Big-O notation, is O(n^7).
