Warren Wolfe wrote:
Hello, All,
I'm glad my first post on this topic started some discussion. I'm
less glad that I apparently didn't do a very good job of
communicating. Let me clear up a bit, if I may.
First, I'm not a Luddite, or troglodyte, or anything similar. I love
using the finest, newest devices of all kinds, including computers.
And, while I like playing with the old computers, and fixing them,
nothing beats screaming speed in a personal computer.
I tend to agree with this sentiment. I do love having old machines
around, but I use the most modern machine I own/can access for daily
work. It doesn't mean I don't appreciate the Lisas, Commodores, or old
Macs in my collection, nor does it mean I don't use them often. That
said, some machines I rarely turn on because I don't want to risk
harming them, and in those cases I find using emulators for them a
better choice - e.g., the 5 MB ProFile hard drives on my Lisa, or the
Xerox Star's floppy and hard drives. After all, the most common
components to fail are mechanical ones.
But if you're thinking in terms of common things to do, perhaps those
don't need a lot of CPU power (e.g., word processing, email, and surfing
the web - though lately, between heavy JavaScript and Flash sites, you
do need the CPU power.)
If, on the other hand, you do something a bit more interesting, like
development, and find yourself compiling the same code dozens of times a
day, it pays to have a faster machine. That doesn't mean you have to
run the evil OS from Redmond, nor does it mean it has to have a CPU from
Intel, but you can think in terms of efficiency: MIPS (or perhaps
GIPS these days) per watt.
While both my Xerox Star and my MacBook can be used to write something
in a word processor, and both have a very nice desktop, my MacBook
doesn't require its own 20-amp circuit.
And if a hard drive dies, a 2.5" SATA drive is fairly cheap and easy to
replace. How about the drive in the Star? And should I need to print
that document, where do I find a printer for the Star?
Doesn't mean I love the Star any less.
Teo Zenios writes:
You can fix anything sold today if you want to spend a bunch of
money for equipment to deal with surface mount chips (BGA type
equipment is not cheap). You can get spare parts (or boards to
desolder parts from) at a recycler or ebay.
This is very true, and is one of the costs of modern electronics. The
other part is that while high-end items are expensive at the start (say,
before the warranty starts to run out), they can be had in working order
for less than half the price a year or two later, once they're used -
and perhaps for less if they're sold as non-working. So it's not too bad
in all cases.
My background is as a National Institute of Standards
and Technology
Calibration Technician. Call me fixed in my ways if you like, but I
prefer finding a (small) failed component, getting a replacement for
under a dollar, and spending an hour or two troubleshooting and fixing
equipment. I no longer have the spare space to keep a few dead copies
of all my live equipment. I never cared much for cannibalizing
equipment. It seems somehow... sacrilegious.
Where you're able to do this, by all means, that's wonderful. And as you
note, not everything is easily serviceable this way. It's just the
reality of things.
When each transistor is microscopic, how would you fix just that one?
And when it's part of an SMT chip with hundreds of pins, how would you
replace it when it's surface-mount soldered onto a board? Is any
electronics factory going to care that we'd want chips socketed if it'll
cost them $1-$10 more per chip, or are they going to care that the
consumer wants them as cheap as possible and the maker wants them to
fail the day after the warranty expires? (And that the maker wants them
to be as proprietary as possible so they can control their market?)
Now, if I find the bad component, it's likely to be a $50 chip,
compared with a new $60 board replacement. While it gets things
working again, it's just not as much fun as it was to find a part that
was close to free to replace, and replace that part... a personal
taste issue, that. I didn't expect I'd need to explain that mind-set
HERE, of all places, though.
Ya, well, you know what they say about common sense. :-) A lot of this
boils down to how practical something is. Collecting and repairing
historically important computers isn't the same as reviving a somewhat
modern computer that's only a couple of years obsolete.
I'm going to care far less about throwing out a 20G IDE drive instead of
attempting to repair it (even if I could), versus an old 5MB 5.25" full
height MFM drive (which I might be able to repair.)
Perhaps in 10 years the 20G IDE drives will be scarce and have some sort
of collectible value to them, but that doesn't mean I'm going to
stockpile the dead ones in hopes for that to happen.
Sure, someone out there today will see an opportunity, and might
stockpile them now when they can be had for cheap. You can be certain
that in 10 years we'll be complaining about why they charge $100 for
them on ebay and how ebay sucks so much and how these drives shouldn't
cost so much when for $100 you could have a 200TB drive... All while
secretly kicking ourselves for not keeping more of those around. :-D
But such is the life of a collector.
I don't want to go back to the old days when machines were slow and
unreliable, when you had to fix a machine just to be able to use it.
Neither do I. On the other hand, I tend to think hardware progress
has been TOO fast, and that software, because of the speed of the
hardware, is immensely sloppy and inefficient, to a degree I find
amazing. Some years back, I was looking at a setup for a Microsoft
programming environment. It came on five CDs, IIRC, and I couldn't
help thinking of Turbo Pascal for CP/M, which came on a single floppy,
and had an editor, compiler, and debugger in about 56 K of programming
space. The efficiency factor difference is stunning. I believe that
a bit more time at each 'watermark' of hardware progress would result
in tightening of software efficiency to improve product performance,
rather than just waiting for the next generation of PC hardware to
hide one's poorly-written code. Again, that might just be me...
Not at all.
You can be certain a lot of this stuff is grossly inefficient.
The current "in" trend is to use things like garbage collection instead
of doing your own memory management, and to use dynamic languages which
are really easy and fun to work in but extremely difficult to compile
(e.g., Ruby - and I'm not knocking them by saying this). So if you
think of a popular application that's likely to be used by hundreds of
thousands of people, the trade-off is that it costs the programmer less
time to spit out a piece of code, but what they don't look at is how
hundreds of thousands of people now have an application that runs
several orders of magnitude slower than it could, and requires dozens
of times more memory. Why? Because the industry loves having
replaceable programmers, kids who have passed some introductory course
in Java and churn out whatever the spec requires.
This isn't to say that there aren't really good programmers out there
anymore who can code low to the machine (you'll find those writing
device drivers, for example), and perhaps those will get paid maybe
2x-3x more than the replaceable Java kids that corporations love so
much, but IMHO, what they produce is worth several thousand times more,
yet they'll never get paid several thousand times more.
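To put a toy number on that kind of gap (a hypothetical Python sketch,
not anything from the thread): the exact same membership test, done
against a linear list versus a hash set, differs by orders of magnitude
once the data gets big:

```python
import timeit

# Same lookup, two data structures: a linear scan vs. a hash set.
data_list = list(range(100_000))
data_set = set(data_list)

# Search for the worst-case element (the last one) a thousand times.
slow = timeit.timeit(lambda: 99_999 in data_list, number=1_000)
fast = timeit.timeit(lambda: 99_999 in data_set, number=1_000)
print(f"list scan: {slow:.4f}s  set lookup: {fast:.6f}s")
```

The point isn't this particular data structure; it's that the slow
version and the fast version look equally easy to write, which is
exactly why the slow one ships.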
That said, Java and most other languages have a plethora of libraries
available to them, so programmers don't have to reinvent the wheel
(quicksort, linked lists, dynamic arrays, etc.), so for the EXPERIENCED
programmer it can be a highly efficient and powerful tool. But that's
the trouble: most places are set up with very few experienced
programmers and hordes of code monkeys. So that's how you wind up with
things like office suites that take up a whole DVD and have billions of
unused, bug-ridden features that few people need.
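As a sketch of the reinvent-the-wheel point (a hypothetical example, not
from the post): the standard library already ships, debugged and tuned,
what a hand-rolled quicksort would reimplement.

```python
# Reinventing the wheel: a hand-rolled quicksort...
def quicksort(xs):
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

data = [5, 3, 8, 1, 9, 2]

# ...versus the one library call an experienced programmer reaches for.
assert quicksort(data) == sorted(data)
print(sorted(data))  # → [1, 2, 3, 5, 8, 9]
```

Knowing when to reach for the library instead of rewriting it is much of
what separates the experienced programmer from the code monkey.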
As long as the rest of the corporate world continues to use the bloated
inefficient tools, they remain the standard and everyone wants a
document in that format. It's a self-sustaining ecosystem.
So far no one's complained that I've used OpenOffice to edit
spreadsheets (even at work) or text documents, as long as I export them
in that non-native format they seem to think is the best thing since
sliced bread. In fact, they never even notice unless I happen to trip
on some bug, and even then they rarely care.
IMHO: if you remove all the ugly inconsistencies in the Java language,
fix up the syntax a bit more, get rid of the garbage collector, and have
a really good optimizing compiler that spits out native code, you might
be able to write very efficient code that would run very nicely on all
platforms. Just my $0.02.
There are millions of people driving cars who have no idea how to
do anything other than put gas in and maybe change the oil. Why
should the computer (another tool, like a car) be any different today?
Think about what you're saying... There ARE people who like to futz
around with their cars, like I like to futz with my computer. All the
people like that I know are not happy about computer control of the
engine, as it makes it difficult to the point of near impossibility
for an individual to work on their own car, and expect good results.
That *IS* like a car, isn't it? There is something to be said for
simplicity enough to be within one's skill set to repair with objects
at hand. A friend's father was stranded in the desert with car
trouble, and used a couple of gum foils, a paperclip, and a couple
of rubber bands to patch it up until they could get to a service
station. How cool is that? Today, unless someone is packing a spare
CPU for his model car, one would be coyote bait in the same situation.
We have not progressed smoothly -- rather, each area has gone off on
its own, totally separate from all other areas. I could get behind a
technology base where you could carry a few spare processors, and
program them with your cellphone to run your car, or your PC, or your
GPS unit, just by loading the correct program into it. But, nothing
is even similar, let alone identical today. I can't help but think we
took a wrong turn dictated to us by hyper-speed progress.
"The future is
already here. It's just not evenly distributed" --
William Gibson. Somehow he managed to say all that in 11 words instead
of two paragraphs. :-) Damn, I wish there were more writers like him
out there.
A lot of this is that the tools are kept hidden. The dealers have
access to stuff the public doesn't, and even non-dealer mechanics
don't. This is on purpose, to lock people into their market, and
perhaps it's expensive not just because of the lock-in but because it'll
incite the car owner to trade in their old car for a newer one, and
thus feed the system more money.
The fix would be to create an open source car manufacturer, where
everything is accessible and well documented. But good luck with that.
Still, the latest Ubuntu came out just last week, and I gotta say, it's
quite impressive. Maybe there's hope for this idea too. Maybe Mr.
Shuttleworth can buy a bankrupt car factory and start it up. :-D
Efficient progress would have dictated that we make
the most of a much
smaller number of available parts, each with multiple uses. Is that
so hard? Somehow, I think not. And the benefits would be amazing.
Hardware leaping ahead of software has given us a dictatorial
Microsoft, and stifled the development of software better than Windows
for many years. Slower hardware development (25% improvement per
year, perhaps) would have forced competition on software vendors, to
the detriment of Microsoft. It would be interesting to compare the
two, but the world doesn't have a "control."
But you need to remember
that Microsoft is what drives Intel and other
hardware manufacturers to improve so quickly. By producing bloatware,
they generate a market for Intel to create and sell faster chips. If
Microsoft goes away, Moore's law will slow down significantly. (Perhaps
something else will replace MSFT's role in this, but the idea stands.)
As for smaller sets of parts that do lots of things, sure, that's
wonderful for the maker, but would they sell them that way?
However, this depression we're in now (and yes, that's what it really
is) has perhaps woken MSFT up to the fact that they can optimize a bit -
or so people I know who have played with Windows 7 have told me.
(Disclosure: I've played with neither Vista nor Windows 7, so this is
2nd-hand info.)
We're also seeing a lot of interest (but, not surprisingly, few sales)
in "netbooks". (Let's see: do I plunk down $300 to buy a tiny, tiny
notebook computer that can't do very much other than surf the web and
has little storage, or do I buy a used/refurb full-sized notebook for
the same price that can do everything I'd need?)
Now that cell phones have killed off PDAs, perhaps netbooks are for
people who find cell phones not as useful and wish for something
bigger, but don't really want a full-sized notebook?
Even so, I see something like 30 posts a day on NewtonTalk, so the PDA
might be dead in the eyes of industry, but not the need for one.