Years ago programmers used to hand their jobs in for an IBM 360, and they were sent to operators to run them.

Is there a way to write a program as if there were only 32K of main memory?

Can you give me an example with only 32K of memory? Something like this:

At the beginning there will be nothing running, and the screen might show something like this:

32k free

That's all I need; I've done everything else.

I want something like this: when you've used 10K of your memory, it should say 22K left. It could be about anything, like a simple calculator. I just need an example for my assignment.

Well, you need to ditch all those heavy GUI libraries to start with; those eat massive amounts of memory...

It's all a question of memory allocation; people just don't go the extra mile these days to think up memory structures that use minimal RAM.
Where are the days of using a single int to store 16 booleans?

You can make a chair from a stump of wood, too. Or mend socks rather than buying new ones. But, the world has moved on from that. RAM has become 'free' on most systems.

Those of us who work on embedded systems still have to worry about RAM and disk/flash usage, but it's almost a lost art.

For normal Windows programming, you want to rein in extremes ("store the entire database in RAM"), but most people don't set out to make a Windows program that is super small; they set out to accomplish some OTHER design goal.

First step would be to make a 'hello world' app and see how big it is. (Don't use MFC or any other library; stay away from the 'std' library too, as well as other template usage). Tinker with compiler and linker options. Get to the smallest you can, and then go from there.

32K isn't much room these days. I recall working on an IMSAI machine back in '76 that only HAD 4K! Aaahhhh, those were the days.

Programming for less waste also makes for faster programs.
Customers complain about programs being huge and slow, and a main reason is exactly the attitude you portray: that RAM and CPU cycles no longer matter.
They do. Maybe not to the extent they did 20 years ago, but they certainly matter today.

DOS 3 ran comfortably in 384KB of RAM on a 4.77MHz CPU, with enough left over to run a word processor or spreadsheet.
Today Windows XP struggles to do that with 384MB of RAM on a 2.5GHz CPU... That's 1000 times the memory and 500 times the CPU speed to run essentially the same functionality.
And guess what: that DOS program on that XT ran almost as fast as the bloated Windows program on that PIV.

I don't advocate leaving everything behind and going back to handcoding in ASM, but what I do advocate is to be aware of the RAM and CPU cycles your code uses and not to waste that.

Another example: a few years ago I was assigned to optimise a C++ application running on OS/2 on a P233. It needed to run once a day, analysing a large amount of input data and generating a report on it.
There was a problem though: over time the data had grown, and now the program needed 32 hours to complete its daily run...
By doing a few simple optimisations, like loop unrolling and inlining some small functions (one- and two-liners), I was able to get the program to complete in half the time on the same hardware (which at the time was, while not state of the art, certainly about average for what the customer was using).
One day of work on the part of one consultant saved them from buying a high-end workstation.
Had the original programmer been more aware of the consequences of his design choices, that expensive consultant (me) would not have been needed. The company would have saved my fee (about a thousand Euro for a day of work at the time) in exchange for maybe a few hours of one of their own people (at 50 Euro an hour or less).

And that isn't the only time some simple performance awareness in software design and implementation would have avoided major costs later on.
