if else recursion worst time complexity - c

I'm having trouble figuring out the worst-case time complexity of the code below.
(This is not homework; see https://leetcode.com/problems/integer-replacement/description/.)
int min(int a, int b) { return a < b ? a : b; }  /* C has no built-in min */

int recursion(int n) {
    if (n == 1)
        return 0;
    if (n % 2 == 0) {
        return recursion(n / 2) + 1;
    } else {
        return min(recursion(n / 2 + 1) + 1, recursion(n - 1)) + 1;
    }
}
The only thing I know is that when N == 2^k (k > 0), the worst-case time complexity is O(logN).
However, I am unclear about the case when N is not 2^k, because an even number divided by 2 can still be odd. Some people say it is still O(logN), but I am not convinced.
I know the code is not the best solution; I just want to analyze its time complexity. I tried a recursion tree and aggregate analysis, but neither seemed to help.

If n is even, we know that T(n) = T(n/2) + 1, and if n is odd we know that
T(n) = T(n/2 + 1) + T(n-1) + 1. In the latter case n-1 is even, so T(n-1) = T(n/2) + 1 (note that for odd n, integer division gives n/2 = (n-1)/2). Now observe that n/2 and n/2 + 1 are consecutive integers, so exactly one of them is even, and the even one unwinds in a single further step to roughly T(n/4) + 1. So in the worst case T(n) is governed by the recurrence
T(n) = T(n/2) + T(n/4) + O(1).
From the Akra-Bazzi theorem, the exponent p satisfies (1/2)^p + (1/4)^p = 1, which gives 2^p = (1 + sqrt(5))/2, so T(n) = O(n^((log(1+sqrt(5))-log(2))/log(2))) ~ O(n^0.69).
However, for a tighter bound we would have to scrutinize the analysis more, since the +1 offsets in the arguments were treated loosely above.
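A quick way to sanity-check the n^0.69 growth is to count calls empirically. Below is a sketch (the calls counter, the min helper, and the driver are mine, not part of the original code): if T(n) = Theta(n^0.694), the worst call count over all n below 2^k should grow by a factor of roughly 2^0.694 ~ 1.62 each time k increases by 1.

#include <stdio.h>

static long calls;   /* total number of recursion() invocations */

static int min(int a, int b) { return a < b ? a : b; }

static int recursion(int n) {
    calls++;
    if (n == 1)
        return 0;
    if (n % 2 == 0)
        return recursion(n / 2) + 1;
    return min(recursion(n / 2 + 1) + 1, recursion(n - 1)) + 1;
}

int main(void) {
    /* track the worst call count over all n < 2^k as k grows */
    for (int k = 8; k <= 16; k++) {
        long worst = 0;
        for (int n = 1; n < (1 << k); n++) {
            calls = 0;
            recursion(n);
            if (calls > worst) worst = calls;
        }
        printf("k=%2d worst=%ld\n", k, worst);
    }
    return 0;
}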

Related

Finding Time Complexity Of This Algorithm

I'm trying to find the time complexity of this algorithm. I tried calculating it using T(n): I assumed that T(n) = 2T(n-1) + Const and got O(n) as a result.
#include <stdbool.h>

bool sum(int arr[], int x, int n, int index) {
    if (x == 0 && index == 3)
        return true;
    if (n == 0)
        return false;
    if (index == 3)
        return false;
    if (sum(arr + 1, x - arr[0], n - 1, index + 1))
        return true;
    return sum(arr + 1, x, n - 1, index);
}
The top call starts with index = 0. Basically I'm trying to see if there is a triple that sums to a given value x.
Am I missing something?
First of all, T(n) = 2T(n-1) + Const would make T(n) be in O(2^n), not O(n). This would be the runtime if you didn't have the index == 3 stopping condition. But this stopping condition decreases the runtime significantly.
One way to find the time complexity is to count the number of leaves in the recursion tree (i.e. the number of times a stop condition is reached). Each leaf with index == 3 corresponds to a choice of 3 out of n elements, so there are C(n, 3) such leaves. Leaves with n == 0 and index < 3 correspond to a choice of 0, 1, or 2 elements, i.e. C(n, 0) + C(n, 1) + C(n, 2) leaves. The total number of leaves is thus O(n^3).
Since the number of inner nodes (calls which do not reach a stop condition and thus make recursive calls) is about equal to the number of leaves, and each call does O(1) work not including the recursive calls, the total runtime is O(n^3).
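You can verify the leaf count directly by instrumenting the function with a counter and calling it on an input with no solution, so no branch exits early. A sketch (the leaves counter, the all-zero test array, and the unreachable target x = 1 are mine):

#include <stdio.h>
#include <stdbool.h>

static long leaves;   /* number of times a stop condition is reached */

bool sum(int arr[], int x, int n, int index) {
    if (x == 0 && index == 3) { leaves++; return true; }
    if (n == 0 || index == 3) { leaves++; return false; }
    if (sum(arr + 1, x - arr[0], n - 1, index + 1))
        return true;
    return sum(arr + 1, x, n - 1, index);
}

int main(void) {
    int arr[16] = {0};   /* all zeros, so x = 1 can never be reached */
    for (int n = 4; n <= 16; n += 4) {
        leaves = 0;
        sum(arr, 1, n, 0);
        long expect = 1 + n + (long)n * (n - 1) / 2
                        + (long)n * (n - 1) * (n - 2) / 6;
        printf("n=%2d leaves=%5ld C(n,0)+...+C(n,3)=%5ld\n", n, leaves, expect);
    }
    return 0;
}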
Another way to get the same result is to consider T(n, index):
T(n, 3) = C = O(1)
T(n, 2) = T(n-1, 3) + T(n-1, 2) + C = O(n)
T(n, 1) = T(n-1, 2) + T(n-1, 1) + C = O(n^2)
T(n, 0) = T(n-1, 1) + T(n-1, 0) + C = O(n^3)
Given that the top-level call is made with index == 0 (confirmed in the comments), the algorithm is O(n^3). Ignore the details of the implementation, and consider more abstractly what it is doing:
It performs a linear scan over array arr, where
for each element e, it performs a linear scan over the tail of arr starting after e, where
for each element f, it performs a linear scan over the tail of arr starting after f, where
for each element g, it checks whether e + f + g == x
The boundary case is the one in which no triplet of elements sums to x, and in that case the procedure does not end until all the scans are complete. As should be clear from that description, the recursion is equivalent to a triply-nested loop.
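Spelled out as loops, the same computation looks like this (a sketch; sum3 is a hypothetical name, not from the question):

#include <stdbool.h>

bool sum3(const int arr[], int x, int n) {
    for (int i = 0; i < n; i++)                 /* choose e = arr[i] */
        for (int j = i + 1; j < n; j++)         /* choose f = arr[j] */
            for (int k = j + 1; k < n; k++)     /* choose g = arr[k] */
                if (arr[i] + arr[j] + arr[k] == x)
                    return true;
    return false;
}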

Space complexity of recursive function (Time & Space)

There is a recursive function below whose time & space complexity I could not work out. I looked at some resources, but they were not clear enough for me to understand. Could someone explain the simplest way to solve this, and answer the question?
By the way, I tried to work out the time complexity myself and got O(2^n). Is that correct?
int func(int n) {
    if (n < 3)
        return 3;
    else
        return func(n - 3) * func(n - 3);
}
Yes, the time complexity is indeed O(2 ^ n).
The recurrence relation for time complexity is:
T(n) = 2 * T(n - 3)
Applying the above equation k times:
T(n) = 2 * 2 * ... * 2 (k times) * T(n - 3k) = 2^k * T(n - 3k)
When k reaches n/3 the recursion bottoms out, so T(n) = 2^(n/3) = (2^(1/3))^n ~ 1.26^n, which is indeed O(2^n) (more precisely, Theta(2^(n/3))).
Only one chain of calls is active at any moment, and the stack depth is at most k = n/3 frames.
So the space complexity is O(n/3) = O(n).
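One detail worth noting from the analysis: the two recursive calls in func(n-3)*func(n-3) compute exactly the same value, so storing the result of a single call collapses the 2^(n/3)-node call tree into a chain of n/3 calls, i.e. O(n) time. A sketch (func_fast is a name I made up; the returned value overflows int very quickly, but the point here is the call count):

int func_fast(int n) {
    if (n < 3)
        return 3;
    int t = func_fast(n - 3);   /* compute the shared subproblem once */
    return t * t;               /* same value as func(n-3) * func(n-3) */
}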

time complexity of randomized array insertion

So I had to insert N elements in random order into a size-N array, but I am not sure about the time complexity of the program.
The program is basically:
for (i = 0; i < n; i++) {
    index = random(0, n);        /* n is exclusive */
    while (array[index] != null)
        index = random(0, n);
    array[index] = i;            /* place the i-th element */
}
Here is my assumption: a normal insertion of N numbers is of course exactly N steps, but how much will the collisions at random positions cost? For each insertion, the collision probability grows like 0, 1/n, 2/n, ..., (n-1)/n, so I expect the number of insertion attempts to be 1, 2, 3, ..., n-1; that is O(n) per element, so the total time complexity would be O(n^2). Is this the average cost? That seems really bad. Am I right?
And what happens if I do a linear search for a free slot instead of repeatedly generating random numbers? Its worst case is obviously O(n^2), but I don't know how to analyze its average case, which depends on the input distribution.
First, consider the inner loop. When do we expect our first success (finding an open position) when there are already i values in the array? For this we use the geometric distribution:
Pr(X = k) = (1-p)^{k-1} p
Where p is the probability of success for an attempt.
Here p is the probability that the array index is not already filled.
There are i filled positions so p = (1 - (i/n)) = ((n - i)/n).
From the wiki, the expectation for the geometric distribution is 1/p = 1 / ((n-i)/n) = n/(n-i).
Therefore, we should expect to make (n / (n - i)) attempts in the inner loop when there are i items in the array.
To fill the array, we insert a new value when the array has i = 0..n-1 items in it. The total number of attempts we expect to make is the sum:
sum_{i=0}^{n-1} n/(n-i)
= n * sum_{i=0}^{n-1} 1/(n-i)
= n * (1/n + 1/(n-1) + ... + 1/1)
= n * sum_{i=1}^{n} 1/i
This is n times the nth harmonic number H_n, and H_n ~ ln(n) + gamma, where gamma is the Euler-Mascheroni constant. So overall, the expected number of attempts is approximately n * (ln(n) + gamma), which is O(n log n). Remember that this is only an expectation; there is no deterministic upper bound, since the inner loop is random and may in principle keep missing an open spot.
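If you want to check the n * H_n estimate empirically, here is a sketch of a simulation (fill is a name I made up; rand() % n has a slight modulo bias, which is irrelevant for this measurement):

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

long fill(int n) {
    char *used = calloc(n, 1);   /* tracks which slots are taken */
    long attempts = 0;
    for (int i = 0; i < n; i++) {
        int index;
        do {
            index = rand() % n;  /* random(0, n), n exclusive */
            attempts++;
        } while (used[index]);
        used[index] = 1;
    }
    free(used);
    return attempts;
}

int main(void) {
    for (int n = 1000; n <= 1000000; n *= 10)
        printf("n=%7d attempts=%8ld n*(ln n + gamma)=%8.0f\n",
               n, fill(n), n * (log(n) + 0.5772));
    return 0;
}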
The expected number of failed attempts at step i (when i positions are already filled) is
sum_{t=0}^infinity (i/n)^t * ((n-i)/n) * t
= ((n-i)/n) * (i/n) * (1 - i/n)^{-2}
= i/(n-i)
Summing over i you get
sum_{i=0}^{n-1} i/(n-i)
>= sum_{i=n/2}^{n-1} i/(n-i)
>= (n/2) * sum_{x=1}^{n/2} 1/x
= (n/2) * log(n) + O(n)
And
sum_{i=0}^{n-1} i/(n-i)
<= n * sum_{x=1}^{n} 1/x
= n * log(n) + O(n)
So the asymptotic complexity is exactly Theta(n * log(n)), which is not as bad as you feared.
About doing a linear search: I don't know how you would do it while keeping the array random. If you really want an efficient algorithm to shuffle your array, you should check out the Fisher-Yates shuffle.
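For reference, a standard Fisher-Yates sketch; it produces a uniformly random permutation in O(n) with no retry loop at all (shuffle is a name I made up):

#include <stdlib.h>

void shuffle(int a[], int n) {
    for (int i = n - 1; i > 0; i--) {
        int j = rand() % (i + 1);   /* index in [0, i]; modulo bias aside */
        int tmp = a[i];             /* swap a[i] and a[j] */
        a[i] = a[j];
        a[j] = tmp;
    }
}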

Finding the time complexity of code

Given is an infinite sorted array containing only numbers 0 and 1. Find the transition point efficiently.
For example : 00000000000111111111111111
Output : 11 which is the index where the transition occurs
I have coded a solution for this ignoring some edge cases.
int findTransition(int start)
{
    int i;
    if (a[start] == 1)
        return start;
    for (i = 1; ; i *= 2) {
        /* assume this condition becomes true for some index */
        if (a[start + i] == 1)
            break;
    }
    if (i == 1)
        return start + 1;
    return findTransition(start + (i / 2));
}
I am not really sure about the time complexity of this solution here. Can someone please help me in figuring this out?
Is it O(log(N))?
Let n be the position of the transition point.
This block
for (i = 1; ; i *= 2) {
    if (a[start + i] == 1)
        break;
}
runs about log2(n) times.
So we have
T(n) = log2(n) + T(n/2)
T(n) = log2(n) + log2(n/2) + T(n/4) = log2(n) + (log2(n) - 1) + (log2(n) - 2)...
T(n) = log2(n) * (log2(n) + 1) / 2
So the worst-case complexity is O(log(n)^2).
Note: instead of the recursive call, you can run an ordinary binary search over the interval found by the doubling loop; then the total cost is just log2(n) + log2(n/2), which is O(log(n)) guaranteed (see the sketch below).
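A sketch of that variant, assuming the same global array a[] of 0s and 1s as the question's code (findTransitionFast is a name I made up):

int findTransitionFast(int start)
{
    if (a[start] == 1)
        return start;
    int i = 1;
    while (a[start + i] == 0)        /* gallop: about log2(n) probes */
        i *= 2;
    int lo = start + i / 2 + 1;      /* a[lo - 1] == 0 is known */
    int hi = start + i;              /* a[hi] == 1 is known */
    while (lo < hi) {                /* binary search: about log2(n) more probes */
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == 1)
            hi = mid;
        else
            lo = mid + 1;
    }
    return lo;                       /* first index holding a 1 */
}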

Space complexity of a given recursive program

Consider the following C-function:
double foo(int n) {
    int i;
    double sum;
    if (n == 0) return 1.0;
    else {
        sum = 0.0;
        for (i = 0; i < n; i++)
            sum += foo(i);
        return sum;
    }
}
The space complexity of the above function is
1) O(1)
2) O(n)
3) O(n!)
4) O(n^n)
In the above question, I think the answer should be (2), but the given answer is option (3). Although it is a recursive function, the stack will never hold more than O(n) frames at once. Can anyone explain why the answer is (3) and where my reasoning goes wrong?
If you want the time complexity, it is certainly not O(N!) as many suggest, but much less than that: O(2^N).
Proof:
T(N) = T(N-1) + T(N-2) + T(N-3) + ... + T(1)
Moreover, by the same formula,
T(N-1) = T(N-2) + T(N-3) + ... + T(1)
Hence T(N) = T(N-1) + T(N-1) = 2 * T(N-1).
Solving this gives T(N) = O(2^N).
If instead you want the space complexity: for a recursive function, the space is the maximum stack space occupied at any single moment, and here that can never exceed O(N), because at any time only one chain of calls foo(n) -> foo(n-1) -> ... -> foo(0) is live.
In any case the answer is not O(N!): that many computations are never performed at all, so the stack certainly cannot occupy that much space.
Note: try running the function for n = 20. If the textbook answer were right, the stack would need on the order of 20! frames, which is more than any memory can hold; in practice it finishes in roughly 2^20 steps without any stack overflow.
Space complexity is O(N): at any given time, the space used is limited to
N * sizeof(one stack frame), and a frame has constant size.
Think of it like this: to calculate foo(n), the program has to calculate foo(0) + foo(1) + ... + foo(n-1), and it makes those calls one after another. So the only frames alive at the same time form a single chain foo(n) -> foo(n-1) -> ... -> foo(0), of depth at most n + 1.
The total number of calls, by contrast, satisfies C(n) = 1 + C(0) + C(1) + ... + C(n-1), which works out to C(n) = 2^n, matching the O(2^n) time bound above.
Hope this is clear.
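To see both bounds at once, you can instrument the function with a call counter and a depth tracker (a sketch; the counters and driver are mine). The call count doubles with each increment of n, while the maximum depth grows only linearly:

#include <stdio.h>

static long calls;             /* total calls: grows like 2^n */
static int depth, max_depth;   /* live stack frames: at most n + 1 */

double foo(int n) {
    calls++;
    if (++depth > max_depth)
        max_depth = depth;
    double sum = 1.0;
    if (n != 0) {
        sum = 0.0;
        for (int i = 0; i < n; i++)
            sum += foo(i);
    }
    depth--;
    return sum;
}

int main(void) {
    for (int n = 1; n <= 20; n++) {
        calls = 0;
        depth = 0;
        max_depth = 0;
        foo(n);
        printf("n=%2d calls=%8ld max_depth=%2d\n", n, calls, max_depth);
    }
    return 0;
}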
