I am curious about the maximum array size in any programming language, preferably one that allows very large arrays. After reading this link:
What is the Maximum Size that an Array can hold?
is the maximum array size theoretically equivalent to the RAM size, given that RAM also has to serve other applications? Or can we increase the virtual memory to accommodate an array of any size?
It depends on the RAM size, and RAM is constantly being consumed and released, so you cannot know for certain how much of it you can use. Theoretically speaking, if there were only your array and the RAM, keeping everything else (the OS and other programs) out of the question, then yes, the maximum array size would equal the RAM size.
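One way to see this empirically is to probe malloc() for the largest single block it will grant. This is only a sketch: the result varies from run to run, and on Linux overcommit can let huge requests "succeed" without any RAM actually backing them.

```c
#include <stdint.h>
#include <stdlib.h>

/* Probe for the largest single block malloc() will hand out by halving
   the request until an allocation succeeds. Illustrative only: the
   result varies between runs, and overcommit can distort it. */
size_t largest_malloc(void) {
    size_t size = SIZE_MAX / 2;        /* start absurdly large */
    void *p = NULL;
    while (size > 0 && (p = malloc(size)) == NULL)
        size >>= 1;                    /* halve and retry */
    free(p);
    return p ? size : 0;
}
```

On any working system this finds some nonzero size, but running it twice can give different answers, which is exactly the point made above.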
Related
I need 64 KB of RAM for a buffer that will be used once in the device's lifetime. My total RAM is running short, so I plan to reuse the already-allocated buffers to store the 64 KB of data I need.
I am thinking about a way of allocating, at compile time, a set of static arrays in contiguous locations in order to reach my goal.
I am currently using the gcc compiler; the purpose of my procedure is to temporarily store data in RAM before transferring it to FLASH.
One idea, though I don't know exactly how easy it would be, is to predetermine at the linker level which data sections are used for the already-allocated buffers. At the moment there are only 3 buffers, totalling a few KB. This solution may not scale well if new buffers are needed: asking developers to place specific areas of memory in the scatter file can be tedious and may complicate things too much.
Thanks in advance.
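One way to get the contiguous overlay without touching the scatter file is a union. A sketch, assuming hypothetically named buffers (rx_buf, tx_buf, log_buf) that are guaranteed dead by the time the 64 KB staging area is needed:

```c
#include <stdint.h>

/* Reuse already-allocated buffers as one 64 KB scratch area via a
   union overlay. The buffer names and sizes are made up for
   illustration; real code must guarantee the original buffers are
   no longer live while the overlay view is in use. */
union scratch {
    struct {                           /* the existing buffers, contiguous */
        uint8_t rx_buf[24 * 1024];
        uint8_t tx_buf[24 * 1024];
        uint8_t log_buf[16 * 1024];
    } bufs;
    uint8_t flash_staging[64 * 1024];  /* one-shot 64 KB view */
};

static union scratch g_scratch;
```

The union guarantees the members share the same storage and start address, so no linker tricks are needed; the trade-off is that every buffer must now be accessed through the union.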
In FreeRTOS, the heap is simply a global array whose size (let's call it heapSize) is defined in a header file that the user can change. This array is an uninitialized global array, which places it in the BSS section of the image, so it is filled with zeros upon loading. Every memory allocation is then taken from this array, and every address of allocated memory is an offset into it.
So, for maximal utilization of the memory, we can approximate the sizes of the Data, Text and BSS areas of our entire program and define the heap size as something like heapSize = RAM_size - Text_size - Data_size - BSS_size.
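The scheme described above can be sketched in C as a global array plus a trivial bump allocator. Here simple_alloc and HEAP_SIZE are illustrative stand-ins for the real FreeRTOS pvPortMalloc and configTOTAL_HEAP_SIZE, and unlike the real heap there is no free support:

```c
#include <stddef.h>

#define HEAP_SIZE (16 * 1024)

/* Uninitialized global array: lands in .bss, zero-filled at load. */
static unsigned char heap[HEAP_SIZE];
static size_t heap_used = 0;

/* Hand out memory as offsets into the array, 8-byte aligned. */
void *simple_alloc(size_t n) {
    n = (n + 7u) & ~(size_t)7u;        /* round up to 8-byte alignment */
    if (n == 0 || n > HEAP_SIZE - heap_used)
        return NULL;                   /* heap exhausted */
    void *p = &heap[heap_used];        /* an offset into the array */
    heap_used += n;
    return p;
}
```

Every pointer returned is an address inside `heap`, exactly as the answer describes for the FreeRTOS array.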
I would like to know what the equivalent implementation is in Linux. Can Linux scan the available RAM and determine its size at run time? Does Linux have an equivalent data structure to manage the heap? If so, how does it allocate the memory for that data structure in the first place?
I would like to know what the equivalent implementation is in Linux.
Read "Chapter 8: Allocating Memory" in Linux Device Drivers, Third Edition.
Heaps in Linux are dynamic: the heap grows whenever you request more memory. It can even extend beyond the physical memory size by using swap space, where unused portions of RAM are written out to disk.
So I think you need to think more in terms of "how much memory does my application need" rather than "how much memory is available".
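To watch the heap grow, you can compare the program break before and after a small malloc(). This is a sketch assuming glibc on Linux, where small requests are usually served by growing the break via brk/sbrk while large ones may be mmap'ed instead, so it illustrates the common case rather than a guarantee:

```c
#define _DEFAULT_SOURCE
#include <stdlib.h>
#include <unistd.h>

/* Return how far the program break moved while servicing one malloc().
   glibc/Linux assumed; sbrk(0) reports the current break. */
long heap_growth_for(size_t request) {
    char *before = (char *)sbrk(0);    /* break before the allocation */
    void *p = malloc(request);         /* likely extends the break */
    char *after = (char *)sbrk(0);     /* break after the allocation */
    free(p);
    return (long)(after - before);     /* >= 0: the break never moved down */
}
```

Requests below glibc's default mmap threshold (around 128 KB) typically show the break advancing, which is the dynamic heap growth described above.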
Is there a limit to the size of a variable (structure) that can be passed via pointer to a function? Also, how big can a variable be (structures nested within structures, and so on, with arrays of sizes varying from 100 to 500)
for the program to run safely (no stack overflow, memory problems, or pointer corruption)?
This is with reference to embedded systems with memory limited to 64K to 512K.
You seem to have three questions:
How large a block of memory can be pointed to by a pointer: the limit here is the pointer size, be it 32-bit or 64-bit. In practice, no machine is likely to have enough memory to make the 64-bit restriction an issue.
How large a structure can I declare on the stack (meaning a local declaration at some scope): the stack size is limited on all hardware, usually much more so than the heap. The stack is not intended for large objects, and stack limits can be restrictive, especially in embedded systems. The issue here is not the size of a single stack object but the total size of the stack.
How large a structure can I allocate from the heap (meaning structures allocated with new or malloc): the heap is the 'rest' of available memory. This area is generally larger and is the better place for large allocations; again, the maximum space depends entirely on the execution environment. The largest object that can be allocated at any time is bounded by the largest contiguous block of heap available to the process.
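The stack-versus-heap distinction from the second and third points can be sketched in C. Sizes and names are illustrative, and a 4-byte int is assumed:

```c
#include <stdlib.h>

/* A struct of a few hundred KB: risky as a local (stack) variable on an
   embedded target with 64K-512K of RAM, but fine from the heap if the
   allocator can find one contiguous block. */
struct big {
    int samples[100][500];             /* 100 * 500 * 4 = 200000 bytes */
};

struct big *make_big(void) {
    /* struct big local;  <- ~200 KB on the stack: likely overflow on an MCU */
    return malloc(sizeof(struct big)); /* one contiguous heap block */
}
```

Passing the resulting pointer to a function is cheap regardless of the struct's size, which answers the first point: the pointer itself carries no size cost.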
As far as the C language is concerned, the largest size of a structure is
SIZE_MAX, i.e. 2 ^ (8*sizeof(size_t)) - 1 bytes,
where ^ should be read as "power of".
Beyond that, the limits depend on your specific system.
I, not a professional software engineer, am currently extending a rather large piece of scientific software.
At runtime I get an error stating "insufficient virtual memory".
At this point during runtime, the working memory in use is about 550 MB, and the error occurs when a rather big three-dimensional array is dynamically allocated. The array, if it could be allocated, would be about 170 MB in size. Adding this to the 550 MB already in use, the program would still be well below the 2 GB boundary set for 32-bit applications. There is also more than enough working memory available on the system.
Visual Studio is currently set that it allocates arrays on the stack. Allocating them on the heap does not make any difference anyway.
Splitting the array into smaller arrays (whose sizes sum to that of the one big array) makes the program run just fine. So I guess that the dynamically allocated memory has to be available in one contiguous block.
So there I am, and I have no clue how to solve this. I cannot deallocate any of the 550 MB already in use, as the data is still required. I also cannot change much of the configuration (e.g. the compiler).
Is there a solution for my problem?
Thank you so much in advance and best regards,
phroth248
The virtual memory is the memory your program can address. It is usually the sum of the physical memory and the swap space. For example, if you have 16GB of physical memory and 4GB of swap space, the virtual memory will be 20GB. If your Fortran program tries to allocate more than those 20 addressable GB, you will get an "insufficient virtual memory" error.
To get an idea of the required memory of your 3D array:
allocate (A(nx,ny,nz))
You have nx*ny*nz elements and each element takes 8 bytes in double precision or 4 bytes in single precision. I let you do the math.
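For instance, the math for an array near the question's 170 MB figure can be written out. The dimensions below are invented to land on that size:

```c
/* Bytes needed by a 3-D double-precision array: nx*ny*nz elements,
   8 bytes each. Dimensions chosen to approximate the 170 MB array
   from the question: allocate (A(512,512,85)). */
unsigned long long array_bytes(unsigned long long nx,
                               unsigned long long ny,
                               unsigned long long nz) {
    return nx * ny * nz * 8ULL;        /* 8 bytes per double element */
}
```

512 * 512 * 85 * 8 = 178,257,920 bytes, i.e. exactly 170 MiB, and that much must be found as one contiguous block of address space.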
Some things:
1. It is usually preferable to allocate huge arrays using operating system services rather than language facilities. That will circumvent any underlying library problems.
2. You may have a problem with 550 MB in a 32-bit system. Usually there is some division of the 4 GB address space into dedicated regions.
3. You need to make sure you have enough virtual memory.
a) Make sure your page file space is large enough.
b) Make sure that your system is not configured to limit processes address space sizes to smaller than what you need.
c) Make sure that your account's settings are not limiting your process address space to less than what the system allows.
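On Unix-like systems, check (b) can also be done from code with POSIX getrlimit; a sketch (on Windows, where this question's program runs, the analogous limits live in system and account settings instead):

```c
#include <sys/resource.h>

/* Report whether the per-process address-space size is capped.
   Returns 1 if capped, 0 if unlimited, -1 if the query failed.
   RLIMIT_AS is the POSIX address-space limit; RLIM_INFINITY
   means no per-process cap is in effect. */
int address_space_capped(void) {
    struct rlimit rl;
    if (getrlimit(RLIMIT_AS, &rl) != 0)
        return -1;                     /* query failed */
    return rl.rlim_cur != RLIM_INFINITY;
}
```

A capped soft limit here would produce exactly the "insufficient virtual memory" symptom even when plenty of physical RAM is free.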
I have 4 GB of RAM installed on a Core 2 Duo PC with a 32-bit Windows 7 operating system. I have increased the paging size up to 106110 MB. But after doing all this I am not able to significantly increase the maximum array size.
Following are the specs
memory
Maximum possible array: 129 MB (1.348e+08 bytes) *
Memory available for all arrays: 732 MB (7.673e+08 bytes) **
Memory used by MATLAB: 563 MB (5.899e+08 bytes)
Physical Memory (RAM): 3549 MB (3.722e+09 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
Kindly help me at your earliest. I am not even able to read a file of 48+ MB in double format.
There are two things you can do to free up memory for MATLAB. Since you're using a 32-bit version of the program, it is normally limited to 2 GB of address space. Enabling the 3 GB boot option (on Windows 7, `bcdedit /set increaseuserva 3072`) makes an additional 1 GB of address space available to large-address-aware programs such as MATLAB.
Second, you should consider using the pack() function, which rearranges variables in RAM to free up more contiguous memory space. This, more than anything, is affecting your ability to open individual arrays.
Remember: you can figure out how many elements an array can hold by dividing the available memory by the size of the element type. Double variables take 8 bytes each, so your 129 MB of available contiguous space allows around 16.85 million double values in a single array.
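That division, as a tiny sanity check in C:

```c
/* Elements that fit = available bytes / bytes per element.
   1.348e8 bytes of contiguous space, 8 bytes per double. */
unsigned long long doubles_that_fit(unsigned long long bytes) {
    return bytes / 8ULL;               /* sizeof(double) */
}
```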
You can view information about memory usage using the memory functions included in MATLAB.
memory shows the memory information
inmem will show you the variables and functions stored in memory
clear will allow you to clear the memory of specific variables or functions.
You may try to set the 3 GB switch; maybe this increases the available memory. Otherwise, switch to a 64-bit OS. Your system wastes 547 MB of RAM simply because there are no addresses for it.