The comments are wide of the mark. I started programming in the 1960s. On an IBM
360/25, for example, we had 48 KB of memory in which to partition F1, F2, and BG. On larger
systems we would run POWER II, HASP, GRASP, etc., and the spooler would use up memory. Now,
considering that back then one byte of memory cost a buck, it doesn't take a rocket
scientist to figure out that every little bit counted. We had 11 MB disks. Given that time
cost money (mounting/unmounting disks, multi-volume files, etc.) and that hardware was
extremely expensive, it made a lot of sense to consider every aspect of the equation:
1) Run time - I rewrote a number of COBOL programs in 360 BAL (Assembler) that were
resource hogs and took a lot of processor time. I could reduce the run time of some by a
factor of 5 or more. COBOL compilers were inefficient; multiply, divide,
and moves (between different data formats), for example, would generate a lot of machine
code instructions. A good programmer would know what was good and bad practice: for
example, group moves between fields of the same format, or multiple adds instead of a
multiply (a rough sketch of that trick follows after point 3). Assemblers were NOT
considered a poor use of machine time - totally the reverse. Assembler programmers had to
know what they were doing, though, and for that I was paid more. It took far less time to
assemble a program in BAL than to compile a COBOL program. The downside was it took a lot
longer to write.
2) Programs had to be written so that the most frequently used parts were in memory, due
to the system's need to page. I wrote BAL in CSECTs (4 KB) so that, for example, one
would perform a credit check, another the page headings, etc. Maybe it sounds familiar
- reusable code/APIs! So maybe we weren't so stupid after all.
3) Anyone appreciating earlier data processing challenges would also be aware that storage
came at a price. There was also a limit to what would fit onto an 80-column card
(punched card). Disc storage wasn't cheap, and all this needed to be weighed against
the belief that a system written in 1965 would not still be in use by Y2K (the two-digit-year
trap is sketched below). Maybe someone would like to explain to me why my QuickBooks
purchase in 1996 (C#) wasn't Y2K compliant.
I found a lot of the computer articles written prior to the Y2K switch laughable, being
written by people that really hadn't a clue as to why the systems had been
written that way in the first place.
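
To make the multiply point concrete, here is a rough sketch in C (not the original BAL,
and the constants are made up) of the kind of strength reduction we did by hand:

    /* Illustrative only: multiplying by a small constant via
       shift-and-add instead of a hardware multiply.  On machines
       like the 360/25, multiply was far slower than adds/shifts. */
    #include <stdio.h>

    int main(void)
    {
        int qty = 7;

        int slow = qty * 5;               /* general multiply        */
        int fast = (qty << 2) + qty;      /* qty*4 + qty, adds only  */

        printf("%d %d\n", slow, fast);    /* both print 35           */
        return 0;
    }

A compiler today does this for you; back then you wrote it yourself or paid for it in cycles.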
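
And for the two-digit year issue, a hypothetical C sketch of the trap and the usual
windowing workaround (the window boundary of 30 is my assumption here, not any particular
product's rule):

    /* Two-digit years: "05" vs "65" compares the wrong way round
       once dates cross the century.  A fixed window maps YY back
       to a full year. */
    #include <stdio.h>

    /* Assumed windowing rule: YY < 30 means 20YY, else 19YY. */
    static int expand_year(int yy)
    {
        return (yy < 30) ? 2000 + yy : 1900 + yy;
    }

    int main(void)
    {
        int opened = 65, matures = 5;   /* a 1965 record maturing in 2005 */

        /* Naive 2-digit compare says the record matures before it opens. */
        printf("naive:    matures %s opened\n",
               matures > opened ? "after" : "before");

        /* Windowed compare gets it right. */
        printf("windowed: matures %s opened\n",
               expand_year(matures) > expand_year(opened) ? "after" : "before");
        return 0;
    }

Every byte saved in 1965 became a windowing rule somebody had to bolt on in 1999.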
--- On Tue, 2/9/10, Ethan Dicks <ethan.dicks at gmail.com> wrote:
From: Ethan Dicks <ethan.dicks at gmail.com>
Subject: Re: Algol vs Fortran was RE: VHDL vs Verilog
To: "General Discussion: On-Topic and Off-Topic Posts" <cctalk at
classiccmp.org>
Date: Tuesday, February 9, 2010, 4:17 PM
On Tue, Feb 9, 2010 at 4:14 PM, geoffrey oltmans <oltmansg at bellsouth.net> wrote:
> I wonder how much programmers in the 50's and
> 60's argued about how many CPU cycles this programming construct took vs. that. ;)
Probably a lot. In the earliest days, assemblers were considered a
poor use of time on the machine.
Of course, these are the people who argued as to how many digits you
really need to represent a year on the modern calendar. Hint: 2
digits only takes you so far... ;-)
-ethan