Determining the time complexity of this code? - c

I think the time complexity of this code will be O(n^2), but I am not sure. If someone can explain what the time complexity of this code is, it would be really helpful.
int func2()
{
    int i, j, k = 0;
    scanf("%d", &n);
    for (i = 1; i < n; i++)
    {
        i -= 1;
        i *= 2;
        k = k + i;
        for (j = 1; j < n; j++);
    }
}

It looks like an infinite loop to me, so the time complexity is O(infinity).
On the first iteration of the outer loop, i -= 1 will set i to 0. Multiplying by 2 leaves it still 0.
The loop increment i++ will then bring i back to 1, and the next iteration will repeat the above computations.

I am a beginner at time complexity, but these are my views:
The outer for loop is effectively an infinite loop: on the first iteration of the outer loop, execution starts with i=1.
Executing i -= 1 sets i=0.
Executing i *= 2 leaves i unchanged at 0.
In the increment phase, i is incremented, so i=1 again.
So the same process repeats.
Thus the value of i never advances, causing the loop to run indefinitely.
Now, inside the outer for loop there is a nested for loop (in the variable j) whose body is just a semicolon. The body is empty, but the loop still iterates about n times, so each pass over it costs O(n), not O(1).
So the resultant overall time complexity can be expected to be O(infinity).

First, 'n' is not declared here, yet an input value is being read into it.
Second, this code is technically an infinite loop (written in a ridiculously convoluted way), and for non-terminating, forever-running programs the time complexity is undefined: by the principles of algorithm analysis, time complexity is only computed for algorithms that are certain to terminate.
If this had been a terminating loop, the time complexity of this function would have been O(n^2), quadratic in nature, due to the nesting of one for loop inside another with enclosed O(1) statements. The higher-order term, O(n^2), dominates.

Related

time complexity of nested loops - is it always just a multiplication of each of them separately?

When looking at this code, for example:
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < i; j += 2)
    {
        // some constant time operations
    }
Is it as simple as saying that because the outer loop is log n and the inner loop is n, the combined result is big-O of n log n?
Here is the analysis of the example in the question. For simplicity I will neglect the increment of 2 in the inner loop and treat it as 1, because in terms of complexity it does not matter: the inner loop is linear in i, and the constant factor of 2 does not matter.
We can notice that the outer loop produces values of i which are powers of 2 capped by n, that is:
1, 2, 4, 8, ... , 2^(log2 n)
and these are also the numbers of times the "constant time operation" in the inner loop runs for each value of i.
So all we have to do is sum up the above series. It is easy to see that this is a geometric series:
2^0 + 2^1 + 2^2 + ... + 2^(log2 n)
and it has a well-known closed form: for a geometric series with first term a, common ratio r, and m+1 terms,
a + a*r + a*r^2 + ... + a*r^m = a * (1 - r^(m+1)) / (1 - r)
Here a=1, r=2, and m = log2 n. (The usual index of the series formula shares the name n with our input size, so it is renamed m here to avoid confusion.)
Now let's substitute and get that the sum equals
(1 - 2^((log2 n) + 1)) / (1 - 2) = (1 - 2n) / (1 - 2) = 2n - 1
Which is a linear O(n) complexity.
Generally, we take the big-O time complexity to be the number of times the innermost loop body is executed (and here we assume the innermost loop consists of statements with O(1) time complexity).
Consider your example. The first loop executes O(log N) times, and the second innermost loop executes O(N) times. If something O(N) is being executed O(log N) times, then yes, the final time complexity is just them multiplied: O(N log N).
Generally, this holds true with most nested loops: you can assume their big-O time complexity to be the time complexity of each loop, multiplied.
However, there are exceptions to this rule, for example when a break statement is involved. If the loop can break out early, the time complexity may be different.
Take a look at this example I just came up with:
for (int i = 1; i <= n; ++i) {
    int x = i;
    while (true) {
        x = x / 2;
        if (x == 0) break;
    }
}
Well, the innermost loop looks like O(infinity), so can we say that the total time complexity is O(N) * O(infinity) = O(infinity)? No. In this case we know the innermost loop always breaks after about log2(i) halvings, i.e. in O(log N), giving a total O(N log N) time complexity.

c loop function computing time complexity

I am learning to compute the time complexity of algorithms.
Simple loops and nested loops I can compute, but how do I compute it when there are assignments to the loop variable inside the loop?
For example :
void f(int n){
    int count = 0;
    for (int i = 2; i <= n; i++) {
        if (i % 2 == 0) {
            count++;
        }
        else {
            i = (i - 1) * i;
        }
    }
}
i = (i-1)*i affects how many times the loop will run. How can I compute the time complexity of this function?
Since i * (i-1) is always even (that is, (i * (i-1)) % 2 == 0), once the else branch runs, the loop's i++ makes i odd again. As a result, after the first odd i in the loop, the condition always falls into the else branch.
Therefore, after the first iteration i equals 3, which is odd and goes into the else branch, and from then on each iteration replaces i with (i-1)*i + 1, so i roughly squares. Hence, if we denote the iteration count of the loop by T(n), we can write asymptotically: T(n) = T(sqrt(n)) + 1. So, if n = 2^(2^k), then T(n) = k = log(log n).
There is no general rule to calculate the time complexity for such algorithms. You have to use your knowledge of mathematics to get the complexity.
For this particular algorithm, I would approach it like this.
Since initially i=2 and it is even, let's ignore that first iteration.
So I am only considering from i=3; from there on, i will always be odd.
Your expression i = (i-1)*i, together with the i++ in the for loop, finally evaluates to i = (i-1)*i + 1.
If you consider i=3 as the 1st iteration and i(j) is the value of i in the jth iteration, then i(1)=3.
Also
i(j) = [i(j-1)]^2 - i(j-1) + 1
The above equation is called a recurrence relation, and there are standard mathematical ways to solve it and get the value of i as a function of j. Sometimes it is possible, and sometimes it is very difficult or impossible. Frankly, I don't know how to solve this one.
But generally, we don't get situations where you need to go that far. In practical situations, I would just note that i grows at least exponentially (it roughly squares each iteration), so the number of iterations is at most logarithmic in n; here it is in fact O(log log n).

Calculating the complexity of this function

This is the function:
void f(int n)
{
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            for (int k = i * j; k > 0; k /= 2)
                printf("~");
}
In my opinion, the calculation of the time complexity ends up being something like this:
log((n-1)(n-2)) + log((n-1)(n-3)) + ... + log(n-1) + log((n-2)(n-3)) + ... + log(n-2) + ... + log(2)
So I get a time complexity of n*log(n!) (because log a + log b = log(a*b), and because each of n-1, n-2, n-3, ... appears at most n-1 times in total).
However, the correct answer is n^2*logn, and I have no idea where my mistake is. Could anyone here help?
Thanks a lot!
log(n!) can be approximated as (n + 1/2)*log(n) - n + constant (see https://math.stackexchange.com/questions/138194/approximating-log-of-factorial).
So your n*log(n!) is indeed n*n*log(n), as expected.
Simpler: compute the complexity of each loop independently and multiply them.
The first two outer loops are trivial: n iterations each, which makes n^2.
The inner loop has log(n^2) complexity, which is the same as log(n).
So n^2 * log(n) is the correct answer.
The complexity is O(N*N*LOG_2(N^2)).
The first and the second loop are both O(N), and the last loop in k has logarithmic growth.
LOG_2(N^2) = 2*LOG_2(N), and
O(N*M) = O(N)*O(M),
O(constant) = 1.
So for the growth of the last loop you can also write O(LOG_2(N^2)) = O(LOG(N)).

How is this loop's time complexity O(n^2)?

How is this loop's time complexity O(n^2)?
for (int i = n; i > 0; i -= c)
{
    for (int j = i+1; j <= n; j += c)
    {
        // some O(1) expressions
    }
}
Can anyone explain?
Assumption
n > 0
c > 0
First loop
The first loop starts with i=n and at each step it subtracts c from i. On one hand, if c is big, the first loop iterates only a few times (try n=50, c=20 and you will see). On the other hand, if c is small (say c=1), it iterates n times.
Second loop
The second loop follows the same reasoning: if c is big it iterates only a few times; if c is small, many times, up to n times in the worst case.
Combined / Big O
Big O notation gives an upper bound on the time complexity of an algorithm. In your case, combining the upper bounds of the first and second loops gives O(n*n) = O(n^2).

Is the time complexity of this code correct?

Calculate the time complexity of this code fragment if the function "process" has a complexity of O(logn)
void funct (int a[], int n)
{
    int i = 0;
    while (i < n) {
        process(a, n);
        if (a[i] % 2 == 0)
            i = i*2 + 1;
        else
            i = i + 1;
    }
}
I tried to calculate the best and worst case time complexity.
The worst case is when the "else" branch is always taken, so it should just be:
Worst case: T(n) = O(n log n)
I have some problems with the best case. I tried this way, but I don't know if it is correct.
Since in the "if" branch i is updated to i*2+1, after k iterations
i = 2^k - 1
and the loop keeps running while 2^k - 1 < n, i.e.
2^k < n + 1
so k < log_2(n + 1).
Is it correct to say that the while loop executes about log_2(n+1) times, because that is the last value of k for which i < n?
If so, is the time complexity O(log n * log n) in the best case?
The best case is if the sampled values in a are all even. In that case, the complexity is O(log(n)*log(n)), since the loop trip count is O(log(n)).
The worst case is if the sampled values in a are all odd. In that case, the complexity is O(n*log(n)), since the loop trip count is O(n).
