Assume sizeof(int).
Then, what is the total number of bytes that will be allocated on the heap, and can you please explain why?
#include <stdio.h>
#include <stdlib.h>

#define MAXROW 8
#define MAXCOL 27

int main()
{
    int (*p)[MAXCOL];
    p = (int (*)[MAXCOL])malloc(MAXROW * sizeof(*p));
    return 0;
}
Assume "sizeof(int) is "(what?)... I guess you meant 4.
In the first line you declare p to be a pointer to an array of 27 integers.
In the second line you allocate heap memory for MAXROW times the size of the dereferenced p, which is 8 rows of 27 integers each: 27 * 4 * 8, so the number of bytes allocated is 864.
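If you want to check the arithmetic yourself, here is a minimal sketch that prints the sizes involved (the 4-byte int and the numbers in the comments are assumptions for a typical platform):

#include <stdio.h>
#include <stdlib.h>

#define MAXROW 8
#define MAXCOL 27

int main()
{
    int (*p)[MAXCOL] = malloc(MAXROW * sizeof(*p));

    /* *p has type int[MAXCOL], so sizeof(*p) == MAXCOL * sizeof(int) */
    printf("sizeof(int)         = %zu\n", sizeof(int));          /* 4 on many platforms */
    printf("sizeof(*p)          = %zu\n", sizeof(*p));           /* 27 * 4 = 108        */
    printf("MAXROW * sizeof(*p) = %zu\n", MAXROW * sizeof(*p));  /* 8 * 108 = 864       */

    free(p);
    return 0;
}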
In your code,
int (*p)[MAXCOL];
is equivalent to saying: declare p as a pointer to an array of MAXCOL ints.
So, considering sizeof(int) is 4 bytes (32-bit compiler / platform),
sizeof(*p) is 108, MAXROW * sizeof(*p) is 8 * 108 = 864, and malloc() allocates that many bytes, if successful.
Also, please see this discussion on why not to cast the return value of malloc() and family in C.
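For what it's worth, a cast-free version of that allocation could look like this minimal sketch:

#include <stdlib.h>

#define MAXROW 8
#define MAXCOL 27

int main()
{
    int (*p)[MAXCOL];

    /* No cast needed in C: malloc returns void *, which converts implicitly.
       Writing sizeof *p keeps the element size tied to p's type. */
    p = malloc(MAXROW * sizeof *p);
    if (p == NULL)
        return 1;

    free(p);
    return 0;
}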
The answer should be MAXROW*MAXCOL*sizeof(int). The size of int cannot be determined from the code shown. It can be 2, 4, 8... or even 42, pretty much anything greater than 0.
If your teacher or course expects 432, they are relying on extra context you have not provided. Re-reading your question: you write "assume sizeof(int)", but you need to say precisely what value should be assumed.
I was doing some practice with strings of different types and returning their address so I could understand the concept of pointer arithmetic better.
I noticed that when using the printf function with %p as the conversion specifier, the address would increment by 4 + 1 bytes when using the & operator on the variable, and by 1 byte without it.
Here is an example of my code and its output:
#include <stdio.h>
#include <string.h>

int main()
{
    char charString_1[] = "Hello";
    printf("%s\t%s\t %p\t %p\n", charString_1 + 1, charString_1 + 1, &charString_1 + 1, charString_1 + 1);
    return 0;
}
The output was the following:
ello ello 0x7ffe76aba5d0 0x7ffe76aba5cb
Looking at the last two hex digits only, the addresses are 208 and 203 (in decimal) respectively, so the first is a char + int (5 bytes) bigger than the second. If I increment by two (&charString_1 + 2), the gap becomes 2 * (char + int) = 10 bytes.
I understand this question might be ridiculous, but my search results have turned up nothing. I'm trying to understand how memory works, and become better at finding common faults in buggy code.
When you do arithmetic on a pointer, the 'base unit size' is the size of the object pointed to.
So, for charString_1, which decays to a pointer to char (size 1) in that expression, the + 1 operation adds just one byte.
However, the expression &charString_1 evaluates to a pointer to the whole array, which (in your example) has a size of six characters (including the nul terminator), so the + 1 operation on that adds 6.
The difference between the values printed by your two %p fields (5) is the difference between those two sizes (6 - 1 = 5). If you change the length of the array (e.g. char charString_1[] = "Hello, World!";) you will see a corresponding change in the value of &charString_1 + 1.
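Here is a minimal sketch of that effect; the concrete addresses will of course differ on every run:

#include <stdio.h>

int main()
{
    char charString_1[] = "Hello";   /* 6 bytes including the terminating '\0' */

    /* charString_1 decays to char *, so + 1 moves 1 byte;
       &charString_1 has type char (*)[6], so + 1 moves 6 bytes. */
    printf("charString_1      = %p\n", (void *)charString_1);
    printf("charString_1 + 1  = %p\n", (void *)(charString_1 + 1));
    printf("&charString_1 + 1 = %p\n", (void *)(&charString_1 + 1));

    return 0;
}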
"+1" will add the size of one of whatever type the compiler determines it is working with. In one case it believes it is working with a char, so it will add one byte (one "sizeof" a char). In the other case, it determines it is working with a pointer, so it will add one "sizeof" a pointer (typically 4 bytes).
(Edit: See below for the correction by Eric Postpischil, who points out that in the & case it is actually the size of the array that is added, not the size of a pointer.)
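To make the corrected claim concrete, a small sketch (the 8-byte pointer size is only an assumption for a typical 64-bit platform):

#include <stdio.h>
#include <stddef.h>

int main()
{
    char charString_1[] = "Hello";

    /* The step added by "+ 1" after & equals the size of the whole array,
       not the size of a pointer. */
    ptrdiff_t step = (char *)(&charString_1 + 1) - (char *)&charString_1;

    printf("step                = %td\n", step);                 /* 6                */
    printf("sizeof charString_1 = %zu\n", sizeof charString_1);  /* 6                */
    printf("sizeof (char *)     = %zu\n", sizeof (char *));      /* e.g. 8 on 64-bit */

    return 0;
}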
This question already has answers here: A riddle (in C) (4 answers). Closed 5 years ago.
int array[] = {23,34,12,17,204,99,16};
#define TOTAL_ELEMENTS (sizeof(array) / sizeof(array[0]))
printf("%d",sizeof(TOTAL_ELEMENTS));
Here is a piece of my sample code. The array is of integer type, so array[0] is also an integer, and division of an integer by an integer should yield an integer. But when I try to find the size of TOTAL_ELEMENTS with the sizeof() operator, it shows 8 bytes. Why?
Your use of the macro expands to something like sizeof (sizeof ....). Since the result of sizeof is a size_t, you're getting sizeof (size_t), which is evidently 8 on your platform.
You probably wanted
printf("%zu\n", TOTAL_ELEMENTS);
(Note that %d is the wrong conversion specifier for a size_t, and a good compiler will at least warn about your version.)
Here's a complete program that works:
#include <stdio.h>
int main()
{
    int array[] = {23,34,12,17,204,99,16};
    size_t const TOTAL_ELEMENTS = (sizeof array) / (sizeof array[0]);
    printf("%zu\n", TOTAL_ELEMENTS);
}
Output:
7
Note that I made TOTAL_ELEMENTS be an ordinary object, as there's no need for it to be a macro here. You may need it as a macro if you want a version that will substitute the array name, like this:
#define TOTAL_ELEMENTS(a) (sizeof (a) / sizeof (a)[0])
You'd then write:
printf("%zu\n", TOTAL_ELEMENTS(array));
When you use the #define the pre-processor replaces TOTAL_ELEMENTS with (sizeof(array) / sizeof(array[0])).
The result type of (sizeof(array) / sizeof(array[0])) is size_t. When you apply the sizeof operator to a size_t, it returns the size of size_t itself, which in your case is 8 bytes.
sizeof returns a size as type size_t, as per 6.5.3.4p5
Your platform's sizeof(size_t) is 8.
sizeof returns the size of an item, in bytes. Since many types are larger than a single byte, one way to determine an array's size in elements is to take the array's total size in bytes and divide it by the size of the element type it's composed of; a convenient object of that type is the first element of the array, at index 0.
For example, if I have an array of ints, and an int is 4 bytes on my platform, it would look like this in memory
Item 0 | Item 1 | Item 2
4bytes | 4bytes | 4bytes
sizeof(array) would be 12, the array's total size in bytes. sizeof(array)/sizeof(array[0]) would be 3, the array's total size in elements.
You're using the macro incorrectly in your code. You should be using:
printf("%zu", TOTAL_ELEMENTS);
Otherwise the code expands to sizeof(sizeof ...), which is not what you want if you're actually after the number of elements in the array. sizeof(sizeof(something)) returns the size of a size_t on your platform, which is why you're seeing 8.
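A minimal sketch of the contrast, assuming a platform where size_t is 8 bytes:

#include <stdio.h>

int array[] = {23, 34, 12, 17, 204, 99, 16};
#define TOTAL_ELEMENTS (sizeof(array) / sizeof(array[0]))

int main()
{
    printf("%zu\n", TOTAL_ELEMENTS);          /* 7: the element count              */
    printf("%zu\n", sizeof(TOTAL_ELEMENTS));  /* 8: sizeof(size_t) on this machine */
    return 0;
}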
Pointers can only move in discrete steps.
int *p;
p = malloc(sizeof(int)*8);
Therefore, formally *(p+2) is calculated as *(p+2*sizeof(int)).
However, if I actually code the two expressions above, I get different results, which seems understandable:
*p = 123;
*(p+2) = 456;
printf("%d\n", *(p+2*(sizeof(int))));   // prints 0
printf("%d\n", *(p+2));                 // prints 456
The question is, is this calculation implicit, done by the compiler at compile time?
The question is, is this calculation implicit, done by the compiler at compile time?
Yes, this is implicit: when you write ptr + n, the pointer actually advances by n times the size of the pointee type (e.g. for an int * this is n * 4 bytes, assuming an int takes four bytes on your computer).
e.g.
int *x = malloc(4 * sizeof(int)); // say x points at 0x1000
x++; // x now points at 0x1004 if size of int is 4
You can read more on pointer arithmetic.
Therefore, formally *(p+2) is calculated as *(p+2*sizeof(int)).
No, *(p+2) is calculated as *(int*)((char*)p+2*sizeof(int)).
Even a brief look reveals that the only way for your statement to hold is if sizeof(int) == 1.
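A short sketch showing that the two spellings really do name the same object:

#include <stdio.h>
#include <stdlib.h>

int main()
{
    int *p = malloc(8 * sizeof(int));
    if (p == NULL)
        return 1;

    p[2] = 456;

    /* Pointer arithmetic already scales by the pointee size, so these
       two expressions refer to the same element. */
    printf("%d\n", *(p + 2));                                /* 456 */
    printf("%d\n", *(int *)((char *)p + 2 * sizeof(int)));   /* 456 */
    printf("same address? %d\n",
           (p + 2) == (int *)((char *)p + 2 * sizeof(int))); /* 1   */

    free(p);
    return 0;
}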
Hi, I am new to C programming. Can anyone please tell me what this line of code does:
i = (sizeof (X) / sizeof (int))
The code is used together with a switch statement that takes a value of bdata and compares it to different cases.
Generally, such a statement is used to calculate the number of elements in an array.
Let's consider an integer array as below:
int a[4];
Now, when sizeof(a) is evaluated it returns 4 * 4 = 16 as the size: 4 elements, each of 4 bytes (assuming a 4-byte int).
So, when you do sizeof(a) / sizeof(int), you get 4, which is the length (number of elements) of the array.
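For example, a minimal sketch that uses the computed count to walk the array (again assuming a 4-byte int):

#include <stdio.h>

int main()
{
    int a[4] = {10, 20, 30, 40};
    size_t n = sizeof(a) / sizeof(int);   /* 16 / 4 = 4 on a platform with 4-byte int */

    for (size_t i = 0; i < n; i++)
        printf("a[%zu] = %d\n", i, a[i]);

    return 0;
}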
It computes the number of elements of the array of int named X.
It returns the length of the array X.
It computes X's size in memory divided by the size of an integer on your machine (commonly 2 or 4 bytes). Note that both operands have type size_t, so the division itself is always an integer division; the type of i only affects how the result is converted when it is stored.
The size of int can change, and the size of X depends on its type and the implementation.
All this means it computes how many ints fit into X.
Besides common practice or personal experience, there is no reason to think that i = (sizeof (X) / sizeof (int)) computes the size of the array X. Most often that is probably the case, but in theory X could be of any type, so the expression would compute the ratio of the size of your variable X to the size of an int (how much more memory, in bytes, X occupies relative to an int).
Moreover, if X were a pointer (float *X) rather than an actual array, this expression would evaluate to 1 on a 32-bit architecture: the pointer is 4 bytes and the int is also 4 bytes, so i = sizeof(X) / sizeof(int) = 1.
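Here is a small sketch of that array-versus-pointer trap; the 10 and 2 in the comments assume a 64-bit platform with 4-byte int and 8-byte pointers:

#include <stdio.h>

void takes_pointer(int *X)
{
    /* Inside a function, an array argument has decayed to a pointer,
       so this is sizeof(int *) / sizeof(int), e.g. 8 / 4 = 2. */
    printf("as pointer: %zu\n", sizeof(X) / sizeof(int));
}

int main()
{
    int X[10];

    /* Here X really is an array, so this is 40 / 4 = 10 (4-byte int assumed). */
    printf("as array:   %zu\n", sizeof(X) / sizeof(int));

    takes_pointer(X);
    return 0;
}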
On Linux, with 16 GB of RAM, why would the following segfault:
#include <stdlib.h>
#define N 44000
int main(void) {
    long width = N*2 - 1;
    int *c = (int *) calloc(width*N, sizeof(int));
    c[N/2] = 1;
    return 0;
}
According to GDB the problem is from c[N/2] = 1, but what is the reason?
It's probably because the return value of calloc was NULL.
The amount of physical RAM in your box does not directly correlate to how much memory you can allocate with calloc/malloc/realloc. That is more directly determined by the remaining amount of Virtual Memory available to your process.
Your calculation overflows the range of a 32-bit signed integer, which is what "long" may be. You should use size_t instead of long. This is guaranteed to be able to hold the size of the largest memory block that your system can allocate.
You're allocating around 14-15 GB of memory, and for whatever reason the allocator cannot give you that much at the moment; thus calloc returns NULL and you segfault as you're dereferencing a NULL pointer.
Check if calloc returns NULL.
That's assuming you're compiling a 64-bit program under a 64-bit Linux. If you're doing something else, you might overflow the calculation of the first argument to calloc if long is not 64 bits on your system.
For example, try
#include <stdlib.h>
#include <stdio.h>
#define N 44000L
int main(void)
{
    size_t width = N * 2 - 1;
    printf("Longs are %zu bytes. About to allocate %zu bytes\n",
           sizeof(long), width * N * sizeof(int));
    int *c = calloc(width * N, sizeof(int));
    if (c == NULL) {
        perror("calloc");
        return 1;
    }
    c[N / 2] = 1;
    return 0;
}
You are asking for 2.6 GB of RAM (no, you aren't -- you are asking for about 14 GB on 64-bit; the 2.6 GB figure comes from the overflowed calculation on 32-bit). Apparently, Linux's heap is utilized enough that calloc() can't allocate that much at once.
This works fine on Mac OS X (both 32 and 64 bit) -- but just barely (and would likely fail on a different system with a different dyld shared cache and frameworks).
And, of course, it should work dandy under 64 bit on any system (even the 32 bit version with the bad calculation worked, but only coincidentally).
One more detail; in a "real world app", the largest contiguous allocation will be vastly reduced as the complexity and/or running time of the application increases. The more of the heap that is used, the less contiguous space there is to allocate.
You might want to change the #define to:
#define N 44000L
just to make sure the math is being done at long resolution. You may be generating a negative number for the calloc.
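To see the wraparound concretely, here is a small sketch that simulates a 32-bit long with int32_t so it can be run anywhere; the exact wrapped value is implementation-defined, but on common machines it comes out negative:

#include <stdio.h>
#include <stdint.h>

#define N 44000

int main(void)
{
    int64_t exact   = (int64_t)(N * 2 - 1) * N;   /* 87999 * 44000 = 3,871,956,000 elements */
    int32_t wrapped = (int32_t)exact;             /* what a 32-bit long would end up holding */

    printf("exact element count : %lld\n", (long long)exact);
    printf("wrapped to 32 bits  : %ld\n",  (long)wrapped);                      /* negative on common platforms */
    printf("bytes needed        : %lld\n", (long long)(exact * (int64_t)sizeof(int)));
    return 0;
}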
calloc() may be failing and returning NULL, which would cause the problem.
Dollars to donuts calloc() returned NULL because it couldn't satisfy the request, so attempting to dereference c caused the segfault. You should always check the result of *alloc() to make sure it isn't NULL.
Create a 14 GB file, and memory map it.
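A minimal sketch of that approach on 64-bit Linux; the file name data.bin is just a placeholder and error handling is kept to a minimum:

#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

#define N 44000L

int main(void)
{
    size_t width = N * 2 - 1;
    size_t bytes = width * N * sizeof(int);   /* roughly 15 GB */

    int fd = open("data.bin", O_RDWR | O_CREAT, 0600);   /* placeholder backing file */
    if (fd < 0 || ftruncate(fd, (off_t)bytes) != 0) {
        perror("backing file");
        return 1;
    }

    /* Pages are backed by the file and faulted in only as they are touched. */
    int *c = mmap(NULL, bytes, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (c == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    c[N / 2] = 1;

    munmap(c, bytes);
    close(fd);
    return 0;
}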