What is the complexity/Big O of this function? (loops, C)

void mystery2(int n)
{
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while (x > 0)
            x -= delta;
    }
}
Why is the time complexity of this function O(n^3) and not O(n^2)?
What I did: when i=1 the while loop does 1 iteration, when i=2 it does 2 iterations, when i=3 it does 3 iterations, ..., when i=n it does n iterations. Summing all the iterations gives 1+2+3+...+n = n*(n+1)/2. So what am I missing here?

This is because the inner loop runs like this:
For i=1, the inner loop runs 1 time.
For i=2, the inner loop runs 4 times
// because x=2 and delta=0.5, so it takes 4 subtractions for x to reach 0.
For i=3, the inner loop runs 9 times
// because x=3 and delta≈0.33, so it takes (at least) 9 subtractions for x to reach 0.
and so on...
So the inner loop runs i^2 times, and the total becomes 1^2+2^2+3^2+...+n^2 = n(n+1)(2n+1)/6, which gives O(n^3) complexity.

I think you are looking at it as a standard integer decrement loop, which I also did at first. But the numbers are doubles, and delta is not 1 but 1 / (double)i, so the number of inner-loop iterations it takes to fully decrement x does not grow linearly with i; it grows much faster, because delta gets smaller as i gets larger.
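If you want to check this empirically rather than by reasoning alone, here is a small instrumented copy of mystery2 (a hypothetical test harness; the counter and the printf are additions, not part of the original function). Because delta = 1/i is rounded to the nearest double, the measured count can differ from i^2 by a little.

#include <stdio.h>

/* Instrumented copy of mystery2: counts how many times the body of the
   inner while loop runs for each value of i. */
void mystery2_counted(int n)
{
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        long long count = 0;
        while (x > 0) {
            x -= delta;
            count++;
        }
        /* Expect a count close to i*i; floating-point rounding in the
           repeated subtraction may shift it slightly. */
        printf("i = %2d: inner iterations = %lld (i^2 = %lld)\n",
               i, count, (long long)i * i);
    }
}

int main(void)
{
    mystery2_counted(10);   /* prints counts close to 1, 4, 9, ..., 100 */
    return 0;
}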

Related

Time complexity of nested for loop

What would be the time complexity of the following block of code, void function(int n)?
My attempt was that the outermost loop would run n/2 times and each of the two inner loops would run 2^q times. Then I equated 2^q with n and got q as (1/2) log n with base 2. Multiplying the time complexities together I got O(n log(n)), while the answer is O(n log^2(n)).
void function(int n) {
    int count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 1; j <= n; j = 2 * j)
            for (int k = 1; k <= n; k = k * 2)
                count++;
}
Time to apply the golden rule of understanding loop nests:
When in doubt, work inside out!
Let’s start with the original loop nest:
for (int i = n / 2; i <= n; i++)
    for (int j = 1; j <= n; j = 2 * j)
        for (int k = 1; k <= n; k = k * 2)
            count++;
That inner loop will run Θ(log n) times, since after m iterations of the loop we have k = 2^m, and we stop once k exceeds n = 2^(lg n). So let's replace that inner loop with this simpler expression:
for (int i = n / 2; i <= n; i++)
    for (int j = 1; j <= n; j = 2 * j)
        do Theta(log n) work;
Now, look at the innermost remaining loop. With exactly the same reasoning as before we see that this loop runs Θ(log n) times as well. Since we do Θ(log n) iterations that each do Θ(log n) work, we see that this loop can be replaced with this simpler one:
for (int i = n / 2; i <= n; i++)
    do Theta(log^2 n) work;
And here that outer loop runs Θ(n) times, so the overall runtime is Θ(n log^2 n).
I think that, based on what you said in your question, you had the right insights but just forgot to multiply in two copies of the log term, one for each of the two inner loops.
In your code there are 3 nested loops.
The first loop runs n/2 times, which is treated as n when calculating complexity (constant factors are dropped).
The second loop runs log n times.
The third loop runs log n times.
So the final time complexity is O(n * log n * log n) = O(n log^2 n).
Now, one may wonder why the running time of each of the two inner loops is log n. This can be generalized as follows:
Since we are multiplying by 2 in each iteration, we need the value of q such that:
n = 2^q
Taking log base 2 on both sides:
log2 n = log2 (2^q)
log2 n = q * log2 2
log2 n = q * 1 [since log2 2 = 1]
So q equals log n.
So, overall time complexity is: O(n*log^2n).
First loop takes: n/2
Second loop: log(n)
Third loop: log(n)
Notice that because the step of each inner loop is multiplied by two, its counter grows exponentially, reaching n after log(n) steps.
Then, also notice that constants like 1/2 can safely be ignored here, which results in O(n * log(n) * log(n)), thus:
O(n log^2 n)
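As a sanity check of the Θ(n log^2 n) bound, the sketch below (a hypothetical counting harness, not part of any of the answers above) measures how often count++ runs and divides by n * (log2 n)^2; the ratio settles near a constant as n grows.

#include <stdio.h>
#include <math.h>

/* Counts how many times the innermost statement executes. */
long long count_ops(int n)
{
    long long count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 1; j <= n; j = 2 * j)
            for (int k = 1; k <= n; k = k * 2)
                count++;
    return count;
}

int main(void)
{
    for (int n = 1024; n <= (1 << 18); n *= 4) {
        long long c = count_ops(n);
        double lg = log2((double)n);
        /* Theta(n log^2 n) predicts this ratio approaches a constant. */
        printf("n = %7d  count = %12lld  count/(n*lg^2) = %.3f\n",
               n, c, c / (n * lg * lg));
    }
    return 0;
}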

Why is the time complexity O(n) instead of O(n^2) in this code?

Why isn't the time complexity here O(n^2), and why is it O(n) instead?
Isn't the first loop executed n times, and the second one as well, so it becomes O(n*n)? What is wrong here?
void f(int n) {
    for (; n > 0; n /= 2) {
        int i;
        for (i = 0; i < n; i++) {
            printf("hey");
        }
    }
}
"Isn't the first loop executed n times, and the second one as well, so it becomes O(n*n)?"
The above statement is false, since:
The outer loop does not run n times. (The outer loop runs O(log n) times, but it does not matter in this case.)
For the inner loop, the number of loops differs as the value of n changes.
To get the time complexity of this code, we should count the total number of times the body of the inner loop is executed.
Clearly, the body of the inner loop is executed n times, for each value of n.
The value of n is determined by the for statement of the outer loop. It starts from the value given as the argument to the function, and is halved each time the body of outer loop is executed.
So, as already stated in the comments, since n + n/2 + n/4 + n/8 + ... is at most 2n, the time complexity of this algorithm is O(n).
For some more concrete mathematical proof on this:
Find an integer k such that 2^(k-1) < n <= 2^k. For this k:
A lower bound for the total number of inner loops is 1 + 2 + 4 + ... + 2^(k-1) = 2^k - 1 >= n - 1 ∈ Ω(n).
An upper bound for the total number of inner loops is 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 < 4n - 1 ∈ O(n).
Therefore the total number of inner loops is Θ(n), as well as O(n).
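A quick empirical check of that bound: the counting version below (a hypothetical harness; the printf is replaced by a counter) shows the total number of inner-loop iterations staying below 2n for a range of n.

#include <stdio.h>

/* Same loop structure as f(), but counts how often the inner body runs
   instead of printing "hey". */
long long count_f(int n)
{
    long long count = 0;
    for (; n > 0; n /= 2) {
        int i;
        for (i = 0; i < n; i++)
            count++;
    }
    return count;
}

int main(void)
{
    int n;
    /* The measured total is always below 2n, matching the O(n) analysis. */
    for (n = 1; n <= 1000000; n *= 10)
        printf("n = %8d  inner iterations = %8lld  2n = %8d\n",
               n, count_f(n), 2 * n);
    return 0;
}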

Converting nested for loop into lower than O(n^3)

So, I have this code and I need to make it run in less than O(n^3) time. I've just started learning about complexity and I really have no idea what to do.
int n, i, j, k, x = 0;
printf("Type n: \n");
scanf("%d", &n);
for (i = 1; i < n; i++)
{
    for (j = 1; j < i; j++)
    {
        for (k = 1; k < j; k++)
        {
            x = x + 1;
        }
    }
}
printf("%d\n", x);
I think I get why it's O(n^3), but I don't really know how to make it more efficient. I tried turning it into a recursive function, is it possible?
You're adding 1 to the result for each i, j, k with 0 < k < j < i < n. There are choose(n-1, 3) such triples (i, j, k), one for each size-3 subset of {1, 2, ..., n-1}. (Here "choose" is the binomial coefficient function.)
Thus, you can replace your loop-based computation with choose(n-1, 3), which is (n-1)(n-2)(n-3)/6 for positive n.
int n;
printf("Type n: \n");
scanf("%d",&n);
printf("%d\n", n > 0 ? (n-1)*(n-2)*(n-3)/6 : 0);
This is O(1) to compute the result, and O(log N) to output it (since the result has O(log N) digits).
Your current function is just a lousy O(n^3) way to calculate some mathematical function ...
In Out
0 0
1 0
2 0
3 0
4 1
5 4
6 10
7 20
8 35
9 56
10 84
x will end up being equal to the number of iterations of the innermost loop.
Your assignment is likely asking you to reinterpret that loop nest as an equation.
We know that the outer loop will execute its block (n-1) times. The next inner loop will execute its block a total of 1+2+...+(n-2) times, which is (n-1)(n-2)/2 times. (At this point I get stuck myself; none of my extrapolations get (n-1)(n-2)(n-3)/6.)
Another way: since we know that n = 1, 2, 3 are all roots (the output is zero there), we know the function contains the factors (n-1)(n-2)(n-3). Solve for n = 4 and you get 1/6 as the constant factor.
I refactored your loop as follows:
for (i = 1; i < n - 2; i++)
{
    x = x + ((i * (i + 1)) / 2);
}
This works because ( ( i * ( i + 1 ) ) / 2 ) = sum of all values in the series 1 through i.
Your innermost loop (using variable k) is the equivalent of adding the value of j to x. Your second loop (using variable j) is then the equivalent of calculating the sum of the series 1 through i.
So I've replaced your second and third loop with the sum of the series 1 through i. We keep your first loop, and at each iteration add the sum of the series 1 through i to your previous value.
Note that I've added a -2 to your outer loop to simulate the < sign in your two inner loops. If your requirement was <= on each inner loop then that -2 would not be needed.
This is an O(n) solution, which is not as good as Paul Hankin's O(1) solution.
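To tie the three answers together, here is a sketch (the function names triple_loop, linear_sum, and closed_form are illustrative, not taken from the answers) that checks the original O(n^3) loops, the O(n) summation refactor, and the O(1) closed form all produce the same values.

#include <stdio.h>

/* Original O(n^3) triple loop. */
static int triple_loop(int n)
{
    int x = 0;
    for (int i = 1; i < n; i++)
        for (int j = 1; j < i; j++)
            for (int k = 1; k < j; k++)
                x = x + 1;
    return x;
}

/* O(n) refactor: add the triangular number i*(i+1)/2 for each i. */
static int linear_sum(int n)
{
    int x = 0;
    for (int i = 1; i < n - 2; i++)
        x = x + (i * (i + 1)) / 2;
    return x;
}

/* O(1) closed form: choose(n-1, 3). */
static int closed_form(int n)
{
    return n > 0 ? (n - 1) * (n - 2) * (n - 3) / 6 : 0;
}

int main(void)
{
    /* All three columns should agree for every n. */
    for (int n = 0; n <= 12; n++)
        printf("n = %2d  loops = %4d  sum = %4d  formula = %4d\n",
               n, triple_loop(n), linear_sum(n), closed_form(n));
    return 0;
}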

What is the complexity of this piece of code

I had to determine the Big O complexity of this piece of code.
I thought the answer was n log n, but apparently it's n. Can anyone help explain why that is so?
void funct(int n)
{
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            printf("%d\n", j % 2);
}
That's a geometric progression.
The first time, the inner loop is executed n times.
The second time, it is executed n/2 times.
etc...
So we have the sequence:
n + n/2 + n/4 + ... + 1
The sum of this geometric series is:
n * (1 - (1/2)^(log n + 1)) / (1 - 1/2)
which is less than 2n, i.e. O(n).
These can also be solved using double summations (sigma notation).
Let Σ represent a sum and let 1 represent a unit cost. The problem is then:
Σ (i = n down to 1, halving each time) Σ (j = 0 to i-1) 1
The inner sum adds 1 exactly i times, so it equals i. The problem becomes:
Σ (i = n down to 1, halving each time) i
which is the sum of the values n + n/2 + n/4 + ... + 1 (there are about log n terms, since n/2^x = 1 after log n halvings), i.e.
n * (1 + 1/2 + 1/4 + ... for log n terms)
which is a convergent geometric progression. The result is n * (1 - (1/2)^(log n + 1)) / (1 - 1/2) < 2n, i.e. O(n).
The outer loop, as I'm sure you can see, is executed about log(n) times. The inner loop is executed n, n/2, n/4, ... times on successive passes, which is roughly 2n/log(n) times on average. So the printf statement is executed about log(n) * (2n/log(n)) = 2n times, and the complexity of the code is O(n).
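The geometric sequence is easy to see by printing the inner-loop count for each pass of the outer loop. The sketch below is a hypothetical counting version of funct (the printf body is replaced by a counter), not code from the question.

#include <stdio.h>

/* Counting version of funct: reports the inner iterations per outer pass
   and the running total. */
void funct_counted(int n)
{
    long long total = 0;
    for (int i = n; i > 0; i /= 2) {
        long long inner = 0;
        for (int j = 0; j < i; j++)
            inner++;            /* stands in for the printf call */
        total += inner;
        printf("pass with i = %6d: %6lld inner iterations\n", i, inner);
    }
    /* The passes form the sequence n, n/2, n/4, ..., 1, whose sum is < 2n. */
    printf("total = %lld (2n = %d)\n", total, 2 * n);
}

int main(void)
{
    funct_counted(100);   /* passes of 100, 50, 25, 12, 6, 3, 1 iterations */
    return 0;
}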

How to determine the time complexity of this C program

void mystery2(int n)
{
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while (x > 0)
            x -= delta;
    }
}
How to determine the time complexity of this program using tracking tables like here http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html#application and not by guessing?
For each iteration of the outer loop, x starts at i and is then decremented by 1/i each time, so the inner loop repeats i / (1/i) = i^2 times.
So, for each iteration of for (i = 1; i <= n; i++), the inner part has a complexity of O(i^2). As i grows from 1 to n, this adds up to (1^2 + 2^2 + 3^2 + ... + n^2), which is roughly n^3/3. Thus it's O(n^3).
Outer loop (for)    Inner loop (while)
i = 1               1
i = 2               4
i = 3               9
...                 ...
i = n               n^2
Total               ~ n^3/3
This is relatively straightforward: you need to determine how many times each of the two nested loops executes, then combine the two complexities.
The outer loop is a trivial for loop; it executes n times.
The inner loop requires a little more attention: it keeps subtracting 1/i from x (which starts at i) until x reaches zero or goes negative. It takes i iterations of the while loop to subtract 1 from x, and since x is initially set to i, the total time taken by the inner loop is proportional to i^2.
The total is, therefore, a sum of i squared, for i between 1 and n.
Wolfram Alpha tells us that the answer to this is n*(n+1)*(2n+1)/6
This expands to n^3/3 + n^2/2 +n/6 polynomial, which has the complexity of O(n^3).
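To connect the tracking table to the closed form, the harness below (hypothetical, not part of the original program) totals the inner-loop iterations of mystery2 and prints them next to n(n+1)(2n+1)/6; the two agree up to small floating-point rounding effects.

#include <stdio.h>

/* Total number of times the inner while-loop body runs in mystery2(n). */
long long total_inner(int n)
{
    long long total = 0;
    int i;
    for (i = 1; i <= n; i++) {
        double x = i;
        double delta = 1 / (double)i;
        while (x > 0) {
            x -= delta;
            total++;
        }
    }
    return total;
}

int main(void)
{
    int n;
    for (n = 10; n <= 1000; n *= 10) {
        long long formula = (long long)n * (n + 1) * (2LL * n + 1) / 6;
        /* Rounding in the repeated subtraction can nudge the measured
           total slightly away from the exact sum of squares. */
        printf("n = %5d  measured = %12lld  n(n+1)(2n+1)/6 = %12lld\n",
               n, total_inner(n), formula);
    }
    return 0;
}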
