Here is my structure:
struct node {
    int V, E;
    int **adj;
};
Here is my code to create a graph:
struct node* create()
{
    int i, j, x, y;
    struct node *G = malloc(sizeof(struct node));
    printf("Write the number of vertex and edges\n");
    scanf("%d%d", &G->V, &G->E);
    G->adj = malloc(sizeof(int) * (G->V * G->V));
    if (!G->adj) {
        printf("Out of memory\n");
        return NULL;
    }
    for (i = 0; i < G->V; i++)
        for (j = 0; j < G->V; j++)
            G->adj[i][j] = 0;
    printf("\nWrite the source node and destination: ");
    for (i = 0; i < G->E; i++) {
        scanf("%d%d", &x, &y);
        G->adj[x][y] = 1;
        G->adj[y][x] = 1;
    }
    return G;
}
and I am storing the pointer returned by this function in another pointer like this:
int main()
{
    struct node *G = create();
}
When I compile and run the program, I'm asked for the number of vertices and edges, but as soon as I enter the values, my program crashes. I want to know the reason. Is this because of a memory allocation failure?
C99 style variable length arrays are only useful with local variables or arguments. So, you have the two classic C methods available to implement a 2D array.
Array of pointers:
That's what your struct looks like. With that, you need separate allocations for the pointer array and data:
G->adj = calloc(G->V, sizeof (int*));
assert(G->adj != NULL); /* need assert.h for this */
for (i = 0; i < G->V; ++i)
{
    G->adj[i] = calloc(G->V, sizeof (int));
    assert(G->adj[i] != NULL);
}
It's a bit easier on memory to do one bulk allocation of the array data and then use pointer arithmetic to set the G->adj[] pointers, but the above gives the idea that, as far as C is concerned, each row is a separate array.
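For illustration, that bulk-allocation variant might look like this (a sketch; alloc_adj is a hypothetical helper name, assuming the same struct node with int **adj):

```c
#include <stdlib.h>

struct node {
    int V, E;
    int **adj;
};

/* Allocate the row-pointer array plus one contiguous block of V*V ints,
   then point each row into that block: one data allocation instead of V. */
static int alloc_adj(struct node *G)
{
    int *data;
    int i;

    G->adj = calloc(G->V, sizeof (int *));
    if (G->adj == NULL)
        return -1;

    data = calloc((size_t)G->V * G->V, sizeof (int));
    if (data == NULL) {
        free(G->adj);
        G->adj = NULL;
        return -1;
    }

    for (i = 0; i < G->V; ++i)
        G->adj[i] = data + i * G->V;   /* row i starts at offset i*V */

    return 0;
}
```

Cleanup is also simpler this way: free(G->adj[0]) releases all the data, then free(G->adj) releases the row pointers.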
Flat array:
Just one bulk array, with the cell location computed explicitly on each access. This is what C does internally with nested arrays.
Change the type of adj to just int* and then:
G->adj = calloc(G->V * G->V, sizeof (int));
assert(G->adj != NULL);
That's it. Now when you access an element, use G->adj[i*G->V + j], instead of G->adj[i][j]. Macros may help with readability.
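Such a macro might be sketched like this (ADJ is a hypothetical name; note adj is now a plain int*):

```c
#include <stdlib.h>

/* Same struct as before, but adj is a single flat array of V*V ints,
   stored row-major. */
struct node {
    int V, E;
    int *adj;
};

/* Hypothetical convenience macro: maps (i,j) to the flat index i*V + j,
   so accesses read almost like the 2D form. */
#define ADJ(G, i, j) ((G)->adj[(i) * (G)->V + (j)])
```

With it, the edge-insertion line in create() could read ADJ(G, x, y) = 1; instead of spelling out the index arithmetic each time.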
I am studying C (self-study, not in an educational institution) and have been trying to build a hashtable data structure as part of my learning.
Please refer to this hopefully reproducible example:
#include <stdio.h>
#include <stdlib.h>

struct table_item {
    char *name;
    char gender;
    char *birthdate;
    char *address;
};

struct list_node {
    struct table_item *data;
    struct list_node *next;
    unsigned long long hash_key;
};

struct hashtable {
    int table_size;
    int num_entries;
    struct list_node **entries;
};

struct hashtable* init_hashtable(int size);
void free_hashtable(struct hashtable *table);

int main(void)
{
    struct hashtable *hashtable = NULL;
    int size_entry = 0;

    printf("Input hashtable array size: ");
    while (size_entry < 1) {
        scanf(" %d", &size_entry);
    }

    hashtable = init_hashtable(size_entry);
    free_hashtable(hashtable);
    return 0;
}

struct hashtable* init_hashtable(int size) {
    struct hashtable* new_table;
    if ((new_table = malloc(sizeof(struct hashtable))) == NULL) {
        perror("Error: failed to allocate memory for hash table\n");
        exit(EXIT_FAILURE);
    }
    new_table->table_size = size;
    new_table->num_entries = 0;
    if ((new_table->entries = malloc(size*sizeof(struct list_node))) == NULL) {
        perror("Error: failed to allocate memory for hash table array\n");
        exit(EXIT_FAILURE);
    }
    return new_table;
}

void free_hashtable(struct hashtable *table) {
    for (int i = 0; i < table->table_size; i++) {
        if (table->entries[i] != NULL) {
            free_list(table->entries[i]);
            table->entries[i] = NULL;
        }
    }
    free(table->entries);
    free(table);
}
My issue is that trying to free the table always fails, even if I have not added anything to it.
I used GDB to check the issue. It seems that, in the above for loop, if (table->entries[i] != NULL) always fires (such as when i=0) even when I haven't added anything. This results in my free_list function trying to free inappropriate memory, which is why I get the stack dump.
Somehow it seems that table->entries[i] is actually not NULL but rather has a struct list_node * type, causing the if condition to fire inappropriately. Could somebody please explain to me why this is?
I was hoping that I could use this for loop to go through the entries array and only free memory where malloced nodes exist, but as it stands this will just crash my program. I am not sure how I can alter this to behave as I'd like it to.
Somehow it seems that table->entries[i] is actually not NULL
Indeed, because you never initialized it to NULL.
init_hashtable allocates space using malloc and points table->entries at it. Now, malloc does not initialize the memory it provides: its contents are garbage, and in particular, there is no reason why it should consist entirely of null pointers, as your code expects.
If you want table->entries to be full of NULL pointers then you have to explicitly initialize it, either with a loop, or with memset(entries, 0, size*sizeof(struct list_node *)). Or best of all, by calling calloc instead of malloc, which also avoids bugs in case the multiplication size*sizeof(struct list_node *) overflows.
(Technically memset and calloc initialize memory to all-bits-zero, which in theory does not have to correspond to NULL pointers, but it actually does on all systems you are likely to encounter. But to be pedantic, the loop is the only strictly conforming way to do it.)
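As a sketch, the allocation could be switched to calloc along these lines (init_entries is a hypothetical helper name; note it also sizes elements as pointers, which matters for the separate bug discussed below):

```c
#include <stdlib.h>

struct list_node;  /* only pointers to it are stored in the bucket array */

struct hashtable {
    int table_size;
    int num_entries;
    struct list_node **entries;
};

/* calloc zero-fills the allocation, so every bucket starts out as a null
   pointer (on the common platforms where all-bits-zero is NULL). */
static int init_entries(struct hashtable *t, int size)
{
    t->entries = calloc((size_t)size, sizeof *t->entries);
    if (t->entries == NULL)
        return -1;
    t->table_size = size;
    t->num_entries = 0;
    return 0;
}
```

Using sizeof *t->entries also means the element size stays correct even if the pointed-to type changes later.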
but rather has a struct list_node * type,
This has nothing to do with types. Types in C are statically determined from declarations, and there is no way for an object to have an unexpected type at runtime. The type of table->entries[i] is struct list_node * no matter what. The question is about the value of that object; you expect it to be NULL but it's not. "Null pointers" are not a separate type in C; NULL is simply a value that a pointer of any type may have.
As Avi Berger points out, there is another bug in that the size calculation in the malloc should be size*sizeof(struct list_node *) not sizeof(struct list_node). Each element is not a struct list_node but rather a pointer. In this case a struct list_node is larger than a pointer, so it's just wasting memory and not causing any other harm, but it should be fixed.
Somehow it seems that table->entries[i] is actually not NULL but rather has a struct list_node * type, causing the if condition to fire inappropriately. Could somebody please explain to me why this is?
You dynamically allocate space for table->entries. The initial contents of that allocated space are unspecified, so until you assign values to its contents, it is unsafe to have any particular expectations about them. In particular, you cannot assume that any or all elements will contain null pointers.
If you want to rely on those values to tell you something about what kind of cleanup needs to be performed, then you should set them all to appropriate values, I guess NULL, immediately after allocating the space.
Note also that there are null pointer values of every pointer type, so being null and having type struct list_node * are not mutually exclusive.
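A strictly conforming version of that initialization, setting each pointer to NULL explicitly right after allocating, might look like this (a sketch; alloc_buckets is a hypothetical name):

```c
#include <stdlib.h>

struct list_node;  /* buckets are just pointers to this */

/* Allocate `size` bucket pointers and set each one to NULL in a loop.
   Unlike memset/calloc, this is portable even on a (hypothetical)
   platform where all-bits-zero is not a null pointer representation. */
static struct list_node **alloc_buckets(int size)
{
    struct list_node **entries = malloc((size_t)size * sizeof *entries);
    if (entries != NULL) {
        for (int i = 0; i < size; i++)
            entries[i] = NULL;
    }
    return entries;
}
```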
I am struggling with the free of memory working with dynamic arrays. Considering the following code:
#include <stdlib.h>

struct element {
    float a;
    float b;
};

struct list {
    int size;
    struct element *myelements;
};

int main() {
    struct list mylist;
    mylist.size = 0;
    mylist.myelements = (struct element*) malloc(sizeof(struct element)*4); // I reserve it as if I had struct element myelements[4]

    // I do stuff like
    int i;
    for (i = 0; i < 4; i++) {
        mylist.myelements[i].a = i;
        mylist.myelements[i].b = i*2;
    }

    // I try to free myelements[2], for example, but I get an error
    free(mylist.myelements[2]);
    return 0;
}
My question is: how am I supposed to free, say, the second position of my array of elements? I have thought of some alternatives involving realloc:
mylist.myelements = realloc(mylist.myelements, sizeof(mylist.myelements) - sizeof(struct element));
but in that case wouldn't I have to reorder the elements of the array?
Thanks in advance!
My question is: how am I supposed to free, say, the second position of my array of elements?
free deallocates the entire allocation done by malloc, it cannot deallocate a part of one allocation.
A common way to manage a resizeable array is to maintain its capacity and size, and when removing elements move subsequent array elements to fill the removed element gap and reduce the size. The spare capacity is used for new elements when they get inserted. Calling realloc for every element insertion/removal is sub-optimal in terms of speed.
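A minimal sketch of that shift-down removal, reusing the struct list from the question (remove_at is a hypothetical name):

```c
#include <stdlib.h>
#include <string.h>

struct element {
    float a;
    float b;
};

struct list {
    int size;
    struct element *myelements;
};

/* Remove the element at index idx by sliding the tail down one slot.
   The allocation itself keeps its size; only lst->size shrinks, and
   the vacated slot becomes spare capacity for future insertions. */
static void remove_at(struct list *lst, int idx)
{
    memmove(&lst->myelements[idx], &lst->myelements[idx + 1],
            (size_t)(lst->size - idx - 1) * sizeof *lst->myelements);
    lst->size--;
}
```

memmove (rather than memcpy) is the right call here because source and destination ranges overlap.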
I'm trying to write a simple dictionary with an array of linked list but I keep losing data after calling the display function.
Here's my structure definition
typedef struct node {
    int elem;
    struct node *next;
} *L;

typedef L Dictionary[10];
And here's my display
void display(Dictionary A)
{
    int i;
    for (i = 0; i < 10; i++) {
        printf("A[%d]: ", i);
        while (A[i] != NULL) {
            printf("%d\t", A[i]->elem);
            A[i] = A[i]->next;
        }
        printf("\n");
    }
}
The solution for this is to make a temporary variable.
I tried
Dictionary tempStruct;
for (i = 0; i < 10; i++) {
    tempStruct[i] = A[i];
}
and it works. But are there other ways of assigning a linked list that are more efficient than this one?
tempStruct = A;
Doesn't really work, I get incompatible types node** to Dictionary{*node[10]}
You can change the loop in your display function to this:
for (i = 0; i < 10; i++) {
    printf("A[%d]: ", i);
    L tmp = A[i];
    while (tmp != NULL) {
        printf("%d\t", tmp->elem);
        tmp = tmp->next;
    }
    printf("\n");
}
There's no need to copy the whole array, a simple temporary pointer navigating through the linked list is enough.
Side note: For the copy of the array, you tried to assign it with tempStruct = A;. There are two reasons this doesn't work:
Inside your function, A doesn't have an array type. C doesn't support passing an array to a function. When a function has a parameter with an array type, this is automatically adjusted to a pointer type, and instead of passing an array, a pointer to the array's first element is passed. This effect is often described as the array decaying to a pointer, and it's the reason for your message incompatible types node** to Dictionary{*node[10]}.
Even if A had an array type, it still wouldn't work, because C doesn't allow assigning to an array. This is a bit surprising, because the same thing would work with a struct. I can't think of a good reason why assigning arrays is not allowed in C; you should just remember that you can't. Of course, you can do it manually, and if you don't want to assign every single element, you can use the function memcpy(), declared in string.h:
int foo[5];
int bar[5] = {1, 2, 3, 4, 5};
// instead of foo = bar;
memcpy(foo, bar, sizeof foo);
Unrelated to your question, but I had a hard time understanding this code. Your typedefs are catastrophic for readability. Never ever hide a pointer behind a typedef -- for understanding the code dealing with a pointer, it's important the pointer is obvious. A typedef for an array type is at least questionable as well. I would suggest the following code:
typedef struct node {
    int elem;
    struct node *next;
} node;
// not strictly necessary, but IMHO, if you want to typedef a struct type,
// it's the least confusing option to name it the same as the struct tag.

#define DICTSIZE 10

void display(node **a) // variable names are often lowercase by convention
{
    // to cope with ANY possible size, you need size_t; int might be too small.
    // include stddef.h or stdlib.h to use it. Of course, with 10 elements,
    // int is enough.
    for (size_t i = 0; i < DICTSIZE; ++i) {
        printf("a[%zu]: ", i);
        node *tmp = a[i];
        // now it's obvious tmp is a pointer, so no need to explicitly
        // write the != NULL ... (any pointer that's not NULL evaluates true)
        while (tmp) {
            printf("%d\t", tmp->elem);
            tmp = tmp->next;
        }
        printf("\n");
    }
}
Also note how some added spaces greatly improve the readability of the code (so, use them).
I would consider your original display function broken, because it modifies what it displays. This is not expected behavior for a function that displays data. If you want to further improve your code, you should use const to make it explicit that the function shouldn't modify what it receives, so the compiler can catch errors. In the example above, the signature for display would be better written like this:
void display(const node *const *a)
The first const would make any struct node immutable, the second const (after the asterisk) makes the pointers in your array immutable. With this, you also have to write
const node *tmp = a[i];
because you can't assign a const pointer to a non-const pointer.
In your display function, you modify the entries of the dictionary array with A[i] = A[i]->next;, so you corrupt the data structure and lose the data.
You should instead use a local variable to enumerate each list:
void display(Dictionary A) {
    struct node *n;
    int i;
    for (i = 0; i < 10; i++) {
        printf("A[%d]:", i);
        for (n = A[i]; n; n = n->next) {
            printf(" %d", n->elem);
        }
        printf("\n");
    }
}
Hiding pointers and arrays behind typedefs is a bad idea, it leads to confusing code for both the reader and the programmer. You should just typedef struct node node; and use explicit pointers and arrays.
I am trying to create an array of structs with dynamically allocated memory.
Here's the struct definition I'm using:
struct node {
    int key;
    double probability;
    struct node *parent;
    struct node *children[255];
};
Here is the declaration and initialization:
int base_nodes = sizeof(X)/sizeof(*X);
while ((base_nodes - 1) % (D - 1) != 0) {
    printf("Incrementing base\n");
    base_nodes++;
}
printf("base_nodes:\t%d\n", base_nodes);

struct node **nodes = malloc(base_nodes * sizeof(struct node));
if (nodes) {
    printf("Size of nodes:\t%llu\n", sizeof(nodes));
} else { printf("Failed to allocate memory\n"); return 1; }
Where X is another dynamically allocated array of numbers, defined before I use it here.
AFAIK, base_nodes is being calculated correctly; however, the Size of nodes: output reports 8 rather than 10. I have tried base_nodes less than 8 and it also returns 8.
Could someone explain why this happens? And how to do it properly?
The program I'm making is a D-ary Huffman code generator given a PMF.
I also attempted to realloc later on in the program and it seems to have had no effect:
nodes = realloc(nodes, ((sizeof(nodes) + 1) * sizeof(struct node)));
if (nodes) {
    printf("New size:\t%llu\n", sizeof(nodes));
} else { printf("Not enough memory\n"); }
You're trying to obtain the number of dynamically allocated elements of type struct node by using the sizeof() operator on the pointer itself. That just returns the size of a pointer on your machine: 8 bytes, since yours appears to be a 64-bit machine.
I think you're confused by the fact that when you declare a plain array, you can use the sizeof() operator to compute the number of elements, i.e.
myType a[N];
number_of_elements = sizeof(a) / sizeof(myType);
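Since sizeof on a pointer can never recover the element count, the usual fix is to store the count yourself, next to the pointer, e.g. in a small struct (a sketch; node_array and node_array_init are hypothetical names):

```c
#include <stdlib.h>

struct node;   /* the element type; its contents don't matter here */

/* Carry the count alongside the allocation: sizeof(arr->nodes) is
   always just the size of a pointer, so the count must be stored. */
struct node_array {
    struct node **nodes;   /* array of pointers to nodes */
    size_t count;
};

static int node_array_init(struct node_array *arr, size_t count)
{
    arr->nodes = calloc(count, sizeof *arr->nodes);
    if (arr->nodes == NULL)
        return -1;
    arr->count = count;
    return 0;
}
```

Note that sizeof *arr->nodes is the size of a struct node * here, which also sidesteps the mismatch of sizing an array of pointers with sizeof(struct node).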
I have the following tree node struct that holds pointers to other tree nodes:
struct node {
    // ...
    struct node* children[20];
};
The idea is that I want to check whether there is a node* inside children and, based on that, go deeper into the tree. So when I allocate the node, I want children to hold 20 NULL values.
Currently I am not initializing the array at all.
How should I allocate this array in order to not get errors like Conditional jump or move depends on uninitialised value(s) (Valgrind)?
Would it be better to use struct node** children and allocate fixed size each time I allocate a new node?
EDIT: Example of one place where Valgrind complains:
for (int i = 0; i < 20; i++)
    if (node->children[i] != NULL)
        do_something_with_the_node(node->children[i]);
When you allocate a new instance of struct node, you must set the contained pointers to NULL to mark them as "not pointing anywhere". This will make the Valgrind warning go away, since the pointers will no longer be uninitialized.
Something like this:
struct node *node_new(void)
{
    struct node *n = malloc(sizeof *n);
    if (n != NULL)
    {
        for (size_t i = 0; i < sizeof n->children / sizeof *n->children; ++i)
            n->children[i] = NULL;
    }
    return n;
}
You cannot portably use either memset() on n->children or calloc(), since those give you "all bits zero", which is not guaranteed to be the same as "pointer NULL" (although it is on virtually every system you are likely to encounter).
Your struct definition is valid (although it's hard to tell without more context if it fits your requirements).
Valgrind doesn't complain about your struct definition, it probably complains about how you instantiate variables of that type. Ensure that all of the array members get initialized and the complaints will most likely go away.
The problem is that you are using an uninitialized value in an if condition.
When you instantiate a struct node, its member struct node* children[20]; is an array of 20 struct node *, all of which are uninitialized.
It would be no different from this:
char *x;
if (x == NULL) {
    /* Stuff */
}
At this point, x may have literally any value. In your example, any element of the array may have any value.
To fix this, you need to initialize the elements of an array before using them, for example like this:
for (int i = 0; i < 20; ++i) {
node->children[i] = NULL;
}
Or shorter (note that memset counts bytes, so use sizeof rather than the element count):
memset(node->children, 0, sizeof node->children);
If you changed the member to, as you've suggested, struct node **children, the situation wouldn't be much different: you'd still need to initialize all the members, including the array's elements. You could make it shorter by using calloc, which initializes all bytes to 0; then again, you'd need some code for correct deallocation (and would have to remember to run it), so I think the tradeoff's not worth it.