In my data structures lecture, I was given homework about algorithms and time complexity, and I could not figure out what I am supposed to do here.
Question: What is the time complexity of this algorithm?
My approach was to analyze it loop by loop, removing the constant factors and lower-order terms of each loop. Since there are three loops nested within each other, the complexity should be O(n^3). The critical point is that the innermost loop is bounded dynamically.
What is the mistake in this reasoning (if there is one)?
int c = 0;
for (int i = 0; i < n * n; ++i)
for (int j = 0; j < n; ++j)
for (int k = 0; k < j * 2; ++k)
c = c + 1;
return c;
All answers are welcome.
In order to compute the time complexity, you can try to evaluate the number of iterations of the innermost loop.
The loop on k evaluates a simple expression 2 * j times.
The loop on j runs n times, with j going from 0 to n - 1. Hence, for each iteration of the outer loop, the innermost statement runs 2 * (0 + 1 + ... + (n - 1)) times, which simplifies to n * (n - 1).
The outer loop runs n * n times. Hence the innermost statement runs exactly n * n * n * (n - 1) times.
Keeping only the dominant term, the resulting time complexity is O(n^4).
Yet a very shrewd compiler would reduce this complexity to constant time O(1) and generate code for:
return n * n * n * (n - 1);
Trying this on Godbolt's Compiler Explorer shows that none of the common compilers achieve this as of now, although clang goes to great lengths trying to optimize the code with unfathomable SIMD code.
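For a quick sanity check, here is a minimal sketch (my own addition; the helper name count_loops is just for illustration) that compares the brute-force count with the closed form n^3 * (n - 1):

#include <iostream>

// Brute-force count of how many times c is incremented by the loops above.
long long count_loops(long long n) {
    long long c = 0;
    for (long long i = 0; i < n * n; ++i)
        for (long long j = 0; j < n; ++j)
            for (long long k = 0; k < j * 2; ++k)
                c = c + 1;
    return c;
}

int main() {
    for (long long n : {4, 8, 16}) {
        std::cout << "n=" << n
                  << "  counted=" << count_loops(n)
                  << "  n^3*(n-1)=" << n * n * n * (n - 1) << '\n';
    }
}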
I have been given the following code:
void sort(int a[], int n)
{
for(int i = 0; i < n - 1; i++)
for(int j = 0; j < n - i - 1; j++)
if(a[j] > a[j+1])
swap(a + j, a + j + 1);
}
I have to calculate the worst-case running time O(n) of this code.
n is 20, so I was thinking: is O(n) = 20 - 1, or is it O(n) = n - 1?
Any time you see a double-nested for loop, your first reaction should be: likely O(N²).
But let's prove it.
Start with the outer loop. It will iterate n-1 times. As n gets really large, the -1 part is negligible, so the outer loop runs pretty close to n times.
When i==0, the inner loop will iterate n-1 times (j runs from 0 to n-2). Again, there's not much difference between n and n-1 in terms of scale, so let's just say it iterates n times.
When i==n-2, the inner loop will iterate exactly once.
Hence, the inner loop iterates an average of n/2 times.
Hence, if(a[j] > a[j+1]) is evaluated approximately n * n/2 times, or n²/2 times. For Big-O notation, we only care about the highest-order term and not the constant factor in front of it. Hence, O(N²) is the running time. Best, worst, and average: the comparisons happen regardless of the input.
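To make that concrete for the n = 20 mentioned in the question, here is a minimal sketch (the comparison counter is an addition for illustration, not part of the original code) that counts how often if(a[j] > a[j+1]) is evaluated; it prints 190, which is exactly n*(n-1)/2:

#include <iostream>
#include <utility>  // std::swap

// The sort from the question, instrumented to count comparisons.
long long sort_and_count(int a[], int n) {
    long long comparisons = 0;
    for (int i = 0; i < n - 1; i++)
        for (int j = 0; j < n - i - 1; j++) {
            ++comparisons;
            if (a[j] > a[j + 1])
                std::swap(a[j], a[j + 1]);
        }
    return comparisons;
}

int main() {
    const int n = 20;
    int a[n];
    for (int i = 0; i < n; ++i)
        a[i] = n - i;  // reverse-sorted input, the worst case for swaps
    std::cout << "comparisons=" << sort_and_count(a, n)
              << "  n*(n-1)/2=" << n * (n - 1) / 2 << '\n';
}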
I get O(n^2 log n) as the answer for the following code, yet I am unable to understand why.
int unknown(int n) {
int i, j, k = 0;
for (i = n / 2; i <= n; i++)
for (j = 2; j <= n; j = j * 2)
k = k + n / 2;
return k;
}
A fixed constant starting point will make no difference to the inner loop in terms of complexity.
Starting at two instead of one will mean one less iteration but the ratio is still a logarithmic one.
Think in terms of what happens when you double n. This adds one more iteration to that loop regardless of whether you start at one or two. Hence it's O(log N) complexity.
However, you should keep in mind that the outer loop is an O(N) one, since the number of iterations is proportional to N. That makes the function as a whole O(N log N), not the O(N^2 log N) you posit.
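If you want to see the O(N log N) growth numerically, a minimal sketch like this (the counting helper is my own naming, just for illustration) tallies how often the inner statement runs and compares it with (n/2 + 1) * floor(log2 n):

#include <cmath>
#include <iostream>

// Counts how many times k = k + n / 2 executes in unknown(n).
long long count_unknown(int n) {
    long long count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 2; j <= n; j = j * 2)
            ++count;
    return count;
}

int main() {
    for (int n : {16, 64, 256, 1024}) {
        long long logn = static_cast<long long>(std::floor(std::log2(n)));
        std::cout << "n=" << n
                  << "  counted=" << count_unknown(n)
                  << "  (n/2+1)*log2(n)=" << (n / 2 + 1) * logn << '\n';
    }
}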
What would be the time complexity of the following block of code, void function(int n)?
My attempt: the outermost loop would run n/2 times and the inner two would run 2^q times. Then I equated 2^q with n and got q as (1/2)(log n) with base 2. Multiplying the time complexities together, I get O(n log(n)), while the answer is O(n log^2(n)).
void function(int n) {
int count = 0;
for (int i=n/2; i<=n; i++)
for (int j=1; j<=n; j = 2 * j)
for (int k=1; k<=n; k = k * 2)
count++;
}
Time to apply the golden rule of understanding loop nests:
When in doubt, work inside out!
Let’s start with the original loop nest:
for (int i=n/2; i<=n; i++)
for (int j=1; j<=n; j = 2 * j)
for (int k=1; k<=n; k = k * 2)
count++;
That inner loop will run Θ(log n) times, since after m iterations of the loop we see that k = 2^m and we stop when k ≥ n = 2^(lg n). So let’s replace that inner loop with this simpler expression:
for (int i=n/2; i<=n; i++)
for (int j=1; j<=n; j = 2 * j)
do Theta(log n) work;
Now, look at the innermost remaining loop. With exactly the same reasoning as before we see that this loop runs Θ(log n) times as well. Since we do Θ(log n) iterations that each do Θ(log n) work, we see that this loop can be replaced with this simpler one:
for (int i=n/2; i<=n; i++)
do Theta(log^2 n) work;
And here that outer loop runs Θ(n) times, so the overall runtime is Θ(n log^2 n).
I think that, based on what you said in your question, you had the right insights but just forgot to multiply in two copies of the log term, one for each of the two inner loops.
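To see both log factors show up, here is a minimal counting sketch (my own naming, restricted to powers of two so the floor terms are exact); the count matches (n/2 + 1) * (log2(n) + 1)^2:

#include <cmath>
#include <iostream>

// Counts how many times count++ runs in function(n) from the question.
long long count_function(int n) {
    long long count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 1; j <= n; j = 2 * j)
            for (int k = 1; k <= n; k = k * 2)
                ++count;
    return count;
}

int main() {
    for (int n : {16, 256, 4096}) {   // powers of two keep the formula exact
        long long lg = static_cast<long long>(std::log2(n));
        long long predicted = (n / 2 + 1) * (lg + 1) * (lg + 1);
        std::cout << "n=" << n
                  << "  counted=" << count_function(n)
                  << "  predicted=" << predicted << '\n';
    }
}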
In your code there are 3 nested loops.
The first loop runs n/2 times, which is essentially equivalent to n when calculating complexity.
The second loop runs log n times.
The third loop runs log n times.
So the time complexity is O(n * log n * log n) = O(n log^2 n).
Now, one may wonder how the run-time complexity of the two inner loops is log n. This can be generalized as follows:
Since we are multiplying by 2 in each iteration, we need the value of q such that:
n = 2^q
Taking log base 2 on both sides:
log2(n) = log2(2^q)
log2(n) = q * log2(2)
log2(n) = q * 1    [since log2(2) is 1]
So q is equal to log n.
So the overall time complexity is O(n log^2 n).
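As a quick numeric illustration of that derivation (the helper below is a hypothetical name of my own), counting how many doublings it takes to go from 1 up to n gives exactly log2(n):

#include <cmath>
#include <iostream>

// How many doublings take us from 1 up to n, i.e. the q with 2^q >= n.
int doublings(long long n) {
    int q = 0;
    for (long long v = 1; v < n; v *= 2)
        ++q;
    return q;
}

int main() {
    for (long long n : {8LL, 1024LL, 1LL << 20}) {
        std::cout << "n=" << n
                  << "  doublings=" << doublings(n)
                  << "  log2(n)=" << std::log2(static_cast<double>(n)) << '\n';
    }
}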
First loop takes: n/2
Second loop: log(n)
Third loop: log(n)
Notice that because the step of the inner loops is multiplied by two, their counters grow exponentially, reaching n in log(n) steps.
Then also notice that constants like 1/2 can safely be ignored in this case, which results in O(n * log(n) * log(n)), thus:
O(n log^2 n)
What is the time complexity of these loops? Correct me if I am wrong.
This loop is O(n^3) because it's got (n^3)/2 + 1 iterations.
for (int i = 0; i < n * n * n; i+=2)
{
//body
}
and
This loop is O(n^3 * m^2) since it has (n^3 + 1) * (m^2 + 1) iterations. Or would this just be O(n^3) since the inner loop is not a variable n?
for (int i = 0; i < n * n * n; i+=2)
{
for (int j = 0; j < m * m; j++)
{
//Body
}
}
In the first case the time complexity is O(n^3). Big-O captures the most significant term, so you ignore the scaling factor of 1/2 and the constant +1. In the latter case it is O(n^3 * m^2), unless you treat m as a constant rather than a variable. In Big-O notation you don't necessarily need a single variable to represent the size of the input data.
This loop is O(n^3) because it's got (n^3)/2 + 1 iterations.
Correct.
This loop is O(n^3 * m^2) since it has (n^3 + 1) * (m^2 + 1) iterations. Or would this just be O(n^3) since the inner loop is not a variable n?
Both are correct. It depends on whether you consider m a variable or a constant.
In asymptotic notation one can have more than one variable.
For the second case considering both n and m as variables the complexity will be O(n^3 * m^2). If m is treated as a constant then the complexity is O(n^3).
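A minimal two-variable sketch (the helper name is illustrative, not from the question) shows both factors: doubling n multiplies the iteration count by about 8 (the n^3 factor), while doubling m multiplies it by about 4 (the m^2 factor):

#include <iostream>

// Iteration count of the second loop nest, as a function of both n and m.
long long count_nest(long long n, long long m) {
    long long iterations = 0;
    for (long long i = 0; i < n * n * n; i += 2)
        for (long long j = 0; j < m * m; j++)
            ++iterations;
    return iterations;
}

int main() {
    std::cout << count_nest(4, 4) << '\n'   // 512
              << count_nest(8, 4) << '\n'   // 4096, 8x larger: n doubled
              << count_nest(4, 8) << '\n';  // 2048, 4x larger: m doubled
}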
I know there are many similar questions about Big-O notation, but this example is quite interesting and not trivial:
int sum = 0;
for (int i = 1; i <= N; i = i*2)
for (int j = 0; j < i; j++)
sum++;
The outer loop will iterate lg(N) times, but what about the inner loop? And what is T(N) for all the operations?
I can see only 3 possibilities:
T(N) = lg(N) * 2^N
T(N) = log(N) * (N-1)
T(N) = N
My opinion is T(N) = N, but that is just my intuition from observing the value of the sum variable as I scaled N up many times: sum was almost equal to 2N, which gives us N.
Basically, I do not know how to count it. Please help me with this task and explain the solution; it is quite important for me.
Thanks
On its last pass, the inner loop iterates at most N times. On the pass before that it iterates N/2 times, and so on. If you sum it up, N + N/2 + N/4 + N/8 + ... adds up to at most 2*N. And that's all, as you have counted every run of the inner body. T(N) = N.
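That geometric argument is easy to check numerically; a minimal sketch (my own addition, with an illustrative helper name) prints the final sum next to 2*N:

#include <iostream>

// The loop nest from the question, returning the final value of sum.
long long inner_sum(long long N) {
    long long sum = 0;
    for (long long i = 1; i <= N; i = i * 2)
        for (long long j = 0; j < i; j++)
            sum++;
    return sum;
}

int main() {
    for (long long N : {16LL, 1024LL, 1LL << 20}) {
        std::cout << "N=" << N
                  << "  sum=" << inner_sum(N)
                  << "  2*N=" << 2 * N << '\n';   // sum stays just under 2*N
    }
}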