On Fri, 16 Oct 2009 20:03:36 +0200, you wrote:

>>>> It isn't. The problem is that if, say, we assume heap allocation is
>>>> twice as efficient as the static worst case
>>>
>>> You can assume whatever you like; the point is: how can you be sure?
>>
>> Because I wrote the program that makes the allocations and I understand
>> how it works.
>>
>> For a current project I am working on, I know that if I statically
>> allocated everything that could exist on the heap, it would take around
>> 10 times more memory than the heap will ever use while it is running.
>
> If you feel sure about your analysis (sure enough to sit on an electric
> chair connected to the 'out of memory' alarm)... And note that it is not
> just about memory use, but also about fragmentation.

It has a QVGA GLCD with multiple pages of user interface. Memory to
support the graphical objects and some associated strings and lists on a
page is only required while that page is displayed. I need enough heap to
support the most complex page; I would need enough static allocation to
support all pages. Or I could jump through hoops making worst-case-sized
arrays for each type of object, with mini allocation schemes to support
them all, and still use two or three times more memory than a heap.

I have no other requirement for dynamic allocation in the system, so I
know the heap will empty on every page change. I can check that this
happens and use a dumb heap management scheme which can't fragment and is
easy to monitor (sketched at the end of this message). If I flip through
all the GUI pages, I know I have seen the peak heap usage.

It is an example of the kind of system where using a heap over static
allocation is a complete no-brainer. Sure, I can envisage systems where
the opposite is true. In between there are plenty of systems where the
greater efficiency of a heap allows for enough contingency to mitigate
uncertainty in worst-case heap requirements.
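In outline the scheme is nothing more than a bump allocator that empties
wholesale on a page change and records a high-water mark. A minimal sketch
in C follows; the names, the 4 KB budget, and the 4-byte alignment are
placeholders of mine, not the real code:

#include <stddef.h>
#include <stdint.h>

#define PAGE_HEAP_SIZE 4096u   /* assumed budget for the worst-case page */

static uint8_t heap[PAGE_HEAP_SIZE];
static size_t heap_top;        /* next free byte */
static size_t heap_peak;       /* high-water mark across all pages */

/* Allocate from the top of the heap; no free list, so no fragmentation. */
void *page_alloc(size_t n)
{
    n = (n + 3u) & ~(size_t)3u;        /* keep 4-byte alignment */
    if (n > PAGE_HEAP_SIZE - heap_top)
        return NULL;                   /* out of memory: raise the alarm */
    void *p = &heap[heap_top];
    heap_top += n;
    if (heap_top > heap_peak)
        heap_peak = heap_top;          /* paging through the GUI reveals the peak */
    return p;
}

/* On a page change everything allocated for the old page is dead, so the
 * whole heap empties in one step and can never fragment. */
void page_free_all(void)
{
    heap_top = 0;
}

/* Monitoring hook: compare the peak against PAGE_HEAP_SIZE to see how much
 * contingency is left. */
size_t page_heap_peak(void)
{
    return heap_peak;
}

With this scheme the heap empties by construction on every page change, so
the only thing left to monitor is page_heap_peak() against PAGE_HEAP_SIZE
after flipping through every page.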