>> Maximisation of processor throughput, and minimization of
>> microinstruction count, is at least half the purpose of microprogramming.
>
>Sure. And the microcode compilers I've written and used are much better
>at optimizing horizontal microcode than I have the time or patience to do
>by hand.
>
Your lack of time and patience does not substantiate the claim you make
regarding the quality of optimisation possible.
William R. Buckley
Hello, all:
Does anyone have a copy of the AIM65 Assembler ROM and BASIC ROM that
they can shoot me? I read the User's Guide and these seem like interesting
programs to have :-).
Thanks!
[ Rich Cini/WUGNET
[ ClubWin!/CW7
[ MCP Windows 95/Windows Networking
[ Collector of "classic" computers
[ http://highgate.comm.sfu.ca/~rcini/classiccmp/
[ http://highgate.comm.sfu.ca/~rcini/pdp11/
<---------------------------- reply separator
>On an interesting note is the availability of "white" LEDs (actually R, G,
>and B leds all wired together in the same case).
*Most* of the white LEDs currently on the market are GaN (blue) LEDs that
illuminate a white-producing phosphor. The chromaticity indices of the
resulting light are markedly different from what you get from mixing R,
G, and B. (Though this probably doesn't matter to anyone for this
particular application...)
> This leaves open the
>possibility of replacing lamps with white LEDs continuing to keep the 8/e
>in service as it was intended. (I also scavenge wheat lamps when I see 'em!)
It requires some small modifications, most noticeably some way of dealing
with the pre-heat current (usually done with a shunting resistor across
the LED, though I have also seen schemes using a Zener, or lifting a leg
of the pre-heat resistor).
--
Tim Shoppa Email: shoppa(a)trailing-edge.com
Trailing Edge Technology WWW: http://www.trailing-edge.com/
7328 Bradley Blvd Voice: 301-767-5917
Bethesda, MD, USA 20817 Fax: 301-767-5927
While I agree fundamentally that you really don't have to have graphics
output capability (to wit, I did without it for over 30 years of computer
use), I don't believe there's any reason to favor the terminal over the
direct-mapped monochrome video display. It's nominally a 2000 character
window that has to be managed, and whether you do it with a terminal or with
a video board is strictly up to you. I personally believe that exploiting
the approach of the 6545 chip is still a pregnant way to address the problem
of slow video updates due to low (<24.576 Mbit/sec) baud rate.
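To put a number on the slow-update problem: repainting that 2000 character
window over a serial link takes on the order of seconds, where a direct-mapped
video board does it at memory speed. The 9600 baud rate and 10 bits per
character (start + 8 data + stop) below are assumed figures for illustration.

```python
# Time for a full repaint of a 2000-character screen over a serial terminal.
CHARS = 2000
BITS_PER_CHAR = 10   # assumed framing: start + 8 data + stop
BAUD = 9600          # assumed line rate

seconds = CHARS * BITS_PER_CHAR / BAUD
print(f"{seconds:.2f} s per full-screen update")
```

At roughly two seconds per full repaint, it is easy to see why a memory-mapped
display feels so much faster than a terminal at typical baud rates.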
As for what you find difficult to get fixed . . . (a) who cares about fixing
a serial card? Another costs $3. (b) Pre-VGA monochrome cards and monitors
abound at the thrift stores. Keyboards do as well. (c) So long as hard disk
drives of the ST506 variety still abound in the thrift stores, the
controllers will too. I passed on an 'AT box a week ago, which had a VGA
card, a 200+ MB EIDE 3.5" 1/3-height hard disk, and much of the usual stuff
for $10. Had there been a decent keyboard, I might have gone for it, but
there'll be others next week.
I figure, if I can't replace it with something similar, then I'll replace it
with something more current.
Dick
-----Original Message-----
From: Tony Duell <ard(a)p850ug1.demon.co.uk>
To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
Date: Sunday, April 04, 1999 7:26 PM
Subject: Re: homemade computer for fun and experience...
>>
>> Again, I have to agree about the "waste-of-trees" nature of most "technical"
>> documents these days. Nevertheless, I find it easier to understand the
>> result of a SPICE simulation when displayed graphically, e.g. with PROBE as
>
>There are, agreed, times when a graphical output is essential. I'd hate
>it if my logic analyser could _only_ give a list of the input states for
>each sample (although sometimes that's what I want). More often, though,
>I want a timing diagram.
>
>My handheld 'scope has an RS232 output that'll send the samples in
>memory. Seeing those as a list of values is not often useful. Plotting
>them is.
>
>However, the fact remains that often graphical output is _not_ essential.
>I can't think why it would be superior to text-only output for
>programming, for example. Or text processing - I do all my word processing
>using TeX on a text-only machine. WYSIWYG would add absolutely nothing.
>
>> I already stated that the "old" machines did the "old" and in many instances
>> quite persistent tasks well, and still would, given a chance. People have
>> learned, however, that it's not as beneficial to have OLD hardware as to
>> have new, not because of what it will do, but what it won't. I don't mean
>> that it won't break. Any hardware can fail. It's a statistical reality.
>
>Although, to be fair, some of the older machines were rather
>over-engineered, and less likely to fail than modern PCs...
>
>> However, if you try to repair that old, fine, terminal you bought in the
>> '80's you'll find you can't get it fixed for less than the cost of a PC.
>
>_get_ it fixed? I fix things (anything) myself, and will continue to do so...
>
>> If, however, you break your PC, there's really nothing you can't repair or
>> replace for much less than the cost of the original.
>
>Oh yes? Where do I find an ST506 controller, new, these days? Or if
>I am using a machine that uses such a drive and the controller fails, do
>I have to replace the drive _and_ restore from backups as well? That's
>hardly a good idea. Ditto for any video card that isn't VGA or higher, etc.
>
>Fact is, swapping parts in PCs is easy, provided the PC is absolutely
>modern. If it's even one generation behind, you start having problems
>finding parts. Maintaining an old machine, where proper docs and spares
>are _still_ available is a lot easier.
>
>What if I am depending on some hardware feature of the old card (like the
>current loop interface on the IBM Async card)? Ever tried getting one of
>those, or a clone?
>
>-tony
>
For those who are putting together single-density-capable systems
for use on a PC-clone (with Teledisk, 22Disk, and the like), they
may find these articles from comp.os.cpm several months ago useful
(or they may have additional data and/or contradicting data, which
will be useful to the rest of us.)
Enjoy! -Tim.
Single Density on a PC (was Re: 22Disk and CompatiCard)
Author: Amardeep S. Chana <asc1000(a)ibm.net>
Date: 1998/12/30
Forum: comp.os.cpm
Ken Ganshirt <ken.ganshirt(a)sk.sympatico.ca> wrote in article
<36899CB0.19AD9093(a)sk.sympatico.ca>...
>
> I must have a pretty decent floppy controller, because I can even read
> my original SS/SD Os-1 floppies, even though 22Disk warns that it might
> not work for that format. (For the technically curious, this is on a
> Dell 486/50 running Win 95 and the floppy drive is one of those deals
> that has both a 3.5 and 5.25 in a single half-height drive. It's the
> only system I have left with a 5.25" floppy drive.)
>
Ken,
I recently did a study to find out what will and what won't do single
density. Here are my findings so far:
Will support single density / FM:
NS PC87306 Super I/O
SMC FDC37C65
SMC FDC37C78
Most SMC Super I/O chips
Will NOT support single density / FM:
NS 8473
NS PC87332* Super I/O
NS PC97307* Super I/O
WD FDC37C65
Most (if not all) Intel parts
Any Winbond part
Any UMC part
Reportedly will do single density / FM but NOT verified:
NS 8477
Intel 82077AA
Goldstar Super I/O
The NS PC87306 is found in a lot of Dell and Compaq machines from the
486-50MHz models to the Pentium-90 models. Most Super Micro Pentium
motherboards using the PCI HX chipset also used that Super I/O.
*NOTE: It is important to verify the part number on the chip itself. Many
of these newer NS parts will identify themselves to software as PC87306,
but do NOT support single density.
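For anyone cataloging their controllers, the study's findings collapse into a
simple lookup table. The entries come straight from the lists above; `None`
marks the "reported but not verified" parts.

```python
# FM / single-density support per the posted study.
# True = verified working, False = verified not working, None = unverified.
FM_SUPPORT = {
    "NS PC87306":    True,
    "SMC FDC37C65":  True,
    "SMC FDC37C78":  True,
    "NS 8473":       False,
    "NS PC87332":    False,
    "NS PC97307":    False,
    "WD FDC37C65":   False,
    "NS 8477":       None,
    "Intel 82077AA": None,
}

def can_do_single_density(chip):
    """Return True/False per the study, or None if unverified/unknown."""
    return FM_SUPPORT.get(chip)

print(can_do_single_density("NS PC87306"))
```

As the note warns, the chip's self-reported ID is not trustworthy; go by the
part number printed on the package.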
Best regards,
Amardeep
_________________________________________________________________
Re: Single Density on a PC (was Re: 22Disk and CompatiCard)
Author: Amardeep S. Chana <asc1000(a)ibm.net>
Date: 1998/12/31
Forum: comp.os.cpm
Don Maslin <donm(a)cts.com> wrote in article
<915064276.933215(a)optional.cts.com>...
[snip]
>
> Amardeep, I fear that I must question your study. I believe that you
> are ascribing to some of the chips the shortcomings of the FDC
> manufacturer. For example, both the NS 8473 and the WD 37C65 will
> most assuredly support FM. I have DTK FDC cards with the 8473 and
> read Osborne 1 disks with them just prior to writing this. Likewise,
> I have the WD 37C65 in the WD FOX card and it will also read/write
> FM. On that basis, I must have reservations about some of your other
> determinations.
> - don
>
Hi Don,
I understand your reservations and can address every issue. I did not go
into enough detail in the first posting to fully support my assertions.
> : Will support single density / FM:
>
> : NS PC87306 Super I/O
> : SMC FDC37C65
> : SMC FDC37C78
> : Most SMC Super I/O chips
>
The above parts are completely stand-alone with on-board filters, write
precomp generators, and data separators. They should work with FM in any
board implementation, unless something specific is done to prevent it (not
likely). This is per the National and SMSC (the new name for SMC
Semiconductor) data sheets. I have tested the NS PC87306 and SMC FDC37C65
using Jeff Vavasour's Model 4 emulator and Tim Mann's xtrs 2.8 under Linux.
They both read and write FM with no problems.
> : Will NOT support single density / FM:
>
> : NS 8473
> : NS PC87332* Super I/O
> : NS PC97307* Super I/O
> : WD FDC37C65
> : Most (if not all) Intel parts
> : Any Winbond part
> : Any UMC part
>
The 1988 data sheet for the NS 8473 states on page 8-32, "While the
controller and data separator support both FM and MFM encoding, the filter
switch circuitry only supports the IBM standard MFM data rates. To provide
both FM and MFM filters external logic may be necessary."
Every 8473 board I have tried failed to write FM. However, it may be
possible to read FM on some boards if the external filters have a wide
enough Q.
The NS PC87332 & NS PC97307 are standalone and by design do not support FM
(verified on the National data sheets).
The only information I have on the WD FDC37C65 is the Always IN2000 card I
have with that chip cannot read or write FM. I suspect it is also
dependent on implementation.
I have new information on Intel...
Intel 8272 is a NEC 765 clone and therefore dependent on implementation.
Intel 82077AA and 82077SL - data sheet clearly states these parts support FM.
Thanks to Pete Cervasio for testing and reporting that the 82077 does
indeed read and write FM.
Intel 82078 - data sheet clearly states these parts will NOT support FM.
I haven't yet investigated the new Intel Super I/O chip which is replacing
the 82078.
The Winbond and UMC chips have never worked on any adapter or motherboard
I've ever encountered them on. No idea if it's the chip or the
implementation.
> : Reportedly will do single density / FM but NOT verified:
>
> : NS 8477
> : Intel 82077AA
> : Goldstar Super I/O
>
The NS 8477 data sheet indicates that it does support FM (it is
functionally and pin-for-pin compatible with the Intel 82077). The
Goldstar Super I/O was reported to work with FM in a newsgroup posting I
read once, but I have never been able to confirm it.
Hope that clarifies things :)
Amardeep
--
Tim Shoppa Email: shoppa(a)trailing-edge.com
Trailing Edge Technology WWW: http://www.trailing-edge.com/
7328 Bradley Blvd Voice: 301-767-5917
Bethesda, MD, USA 20817 Fax: 301-767-5927
>Bill Pechter <pechter(a)pechter.dyndns.org> wrote:
>> > ABS - American Bull Shi...
>> >
>> > I have noted one difficulty with ABS, and that is its failure to operate
>> > on snow and ice. Since I live in Southern California, I do not get that
>> > much snow but, in any quick application of my Mustang's breaks, on snow
>> > covered roads, they always seem to lock up. Well, the pumping action
>> > occurs but, at each application of the pump, I notice wheel lock-up.
>> > There is no stopping.
>> >
>> > William R. Buckley
>>
>> Noted the same thing with my wife's Acura in New Jersey.
>> The bad news is (unlike the last two years) we usually get snow here.
>
>Hi
> I think people miss the point here. First, nothing short
>of retro rockets will slow you fast on snow and ice. The
>best rule here is "SLOW DOWN". Even rainy or dew slick
>roads reduce traction a lot. ABS generally does much better
>on ice and snow than can be done manually. Yes, it generally
>locks and unlocks, but that is much better than the complete lock-up
>all but a trained expert would produce under such conditions.
> I'm not all that great a fan of ABS but I think for anyone
>not trained in skid control, it will do better than most
>people would do. Skid training is something that has to
>be learned as an automatic reaction. It requires regular
>refreshing to keep the skill in tune. It can't be learned
>by reading a book, it must be experienced.
> ABS will not do magic, it will in most conditions give
>one a better chance than they would normally have. It won't
>make up for foolish drivers.
>IMHO
>Dwight
>
The detail here is that it occurs even with extremely
light pressing of the brake (sorry for the earlier misspelling)
pedal.
Sure, very slow travel is the best course, which I judiciously
demonstrate in my driving under such hazardous conditions.
I, too, wonder how the computer knows (obviously, it does
not) whether the wheels have stopped because they are locked up
or because the car itself has halted.
William R. Buckley
>[I'd suggested that we take this discussion off the list. I'm continuing
>to reply here in this case only because I don't want people to get the
>incorrect idea that Godel's Incompleteness Theorem can be used to magically
>explain away any philosophical problem regarding computers.]
>
>> Is this use of the word "assembly" not yours? I, sir, am quoting you, not
>> me!
>
>OK, that one was mine. It wasn't in the context you originally quoted, or
>even from the same message you quoted
>(<19990405060635.29296.qmail(a)brouhaha.com>). I had used it three
>hours earlier in the discussion (<19990405030452.28640.qmail(a)brouhaha.com>).
>So perhaps you see why I didn't understand what you were complaining about.
>It is customary to include a brief quote of the actual context you are
>referring to.
>
The quote was passed down several layers of reply. I expect one to
remember one's own words. Your failure to do so places no
obligation on my part.
>> That says nothing about the general case
>> that humans have superior intellectual capacity vis-a-vis the computer.
>
>In the general case, I've never claimed that they do. I've only claimed that
>in a sufficiently limited problem domain with a time limit (i.e., the solution
>value vs. time curve is flat with a sharp drop to zero), a computer may
>reach a better solution than a human would. I also claim that this is true
>for other common solution value vs. time curves; if the solution is worth $x
>today but only $x/2 tomorrow, the computer may produce a more valuable
>solution than would a human.
>
Time limits accepted, but that is not my concern. I am referring to an
ultimate issue, which is that humans have intelligence, computers
do not. Any high-speed moron has the opportunity to surpass a
considerate intellect. Witness the ability of Deep Blue to challenge
the best chess player. Yet, ultimately, a human can decide by means
not algorithmic.
>> What you have failed to address is that the human intellect is not limited
>> by the capacity to algorithmatise a solution.
>[and later:]
>> Humans have the capacity to make judgements by means outside of those
>> mathematical and logical, hence the reference to Penrose.
>
>Sure. A human may proceed in a manner that is not based upon logical
>deduction or any (obvious) deterministic algorithm.
>
>It is yet to be proven that this human ability (as manifested in complex
>problem-solving) is not equivalent to a non-deterministic algorithm,
>or even to a sufficiently complex deterministic system. Penrose claims
>that quantum uncertainty is necessary to intelligence. While he provides
>insufficient proof of this claim (really just anecdotal evidence), as an
>argument against machine intelligence it is a red herring, since it is
>not especially difficult to build a system that uses quantum uncertainty
>to influence nondeterministic algorithms.
>
This begs the question, for proof is necessarily mathematical (I, for one,
do not agree with Judicial notions of proof, such as a preponderance of
the evidence). That you hinge your argument upon the lack of a proof
of the means of some human ability simply points to flaws therein.
>> in particular, the notions of Godel: that within any axiomatic system, the
>> answers to some positable questions are indeterminable.
>
>You know, since you mentioned the book GEB, I thought you might have been
>trying to bring Godel's Incompleteness Theorem into the discussion. But
>since you didn't specifically state that, I wanted to give you the benefit
>of the doubt.
>
>The Incompleteness Theorem is very useful for certain lines of reasoning.
>And it might be relevant to the strong AI problem. But it has no relevance
>to the compiler problem we've been discussing.
>
It is relevant to the notion that humans must use methods not algorithmic.
>In the compilers "axiomatic system", it is not possible to even construct
>the kind of questions to which GIT refers.
>
>The compiler is not burdened with proving that it is correct, or that its
>own output is correct. At most we are asking it to select the more efficient
>of several proposed solutions. This in some sense does involve a "proof",
>but the required proof is not of the validity of the axioms (i.e., the
>compiler algorithm), nor is it a proof that the "system" is self-consistent.
>
>> For all the nit-picky details of the works of these masters, the points they
>> make are far grander. The real value of their works is not kept solely
>> within the realm from which their conclusions emerge, but within which
>> such conclusions find additional value.
>
>If you know where to apply them. You can't just willy-nilly claim that
>GIT applies to any random problem.
>
This is one of the wonders of human intelligence: to make leaps of logic
and application.
>If you are going to maintain that GIT precludes compilers generating code
>as efficient as the best human-generated code, you'd best be prepared to
>present a logical argument as to why GIT applies. It's not a magic wand,
>and I'm not going to concede your point at the mere mention of it.
I am not applying GIT to the operation of compilers. Instead, I am applying
it to the operation of human intelligence. Whether you concede the point
makes no difference to me. My purpose is to refute your claims of the
superiority of software versus human intelligence, and that is all.
William R. Buckley
Hi Gang:
I went on a trip to one of the local electronic/scrap stores today, and
noted that they have about twenty CPU and other boards from some sort of
Tandem machine for sale. They're marked $10 each (that's about $6.50 US)
but one could probably haggle with success.
Is anyone on the list into Tandem? There hasn't been any discussion of
Tandem and their "non-stop" line (read as "full stop" for those with any
hands-on experience with the machines) on the list at all, in my
recollection.
I'm not into Tandem either, but if anyone's interested I can call the
store owner for further info this week.
Kevin
--
Kevin McQuiggin VE7ZD
mcquiggi(a)sfu.ca
Boy, we're way off topic here.
First off, most ABS will lock up if all four wheels slip. How's it to know
that you haven't stopped? No variety of braking is going to help you on
ice, only tires (e.g. Bridgestone Blizzak) help there.
Secondly, ABS is for dry or wet roads. It actually increases stopping
distance in snow and gravel, because on those surfaces it is more
advantageous to lock up the wheels and pile up material in front of each
tire.
Kai
-----Original Message-----
From: Buck Savage [mailto:hhacker@home.com]
Sent: Monday, April 05, 1999 3:20 PM
To: Discussion re-collecting of classic computers
Subject: ABS - or is it Pure BS
>2) even if the ABS system fails it still works just like non-ABS brakes.
>Unless the vacuum (power assisted) system fails or the brake line is cut, or
>(very unlikely) the piston sticks open, the brakes will work just fine.
>
ABS - American Bull Shi...
I have noted one difficulty with ABS, and that is its failure to operate on
snow and ice. Since I live in Southern California, I do not get that much
snow but, in any quick application of my Mustang's breaks, on snow covered
roads, they always seem to lock up. Well, the pumping action occurs but, at
each application of the pump, I notice wheel lock-up. There is no stopping.
William R. Buckley
According to the jumper settings on the 5150, it appears that 4 floppy
drives can be connected to the computer. How is this possible? I'm
guessing 2 internal, and two external, but there's only one connector for an
external drive, so it would only allow 3 drives.
Or is there a special controller that has dual external ports?
Any suggestions?
ThAnX,
--
        -Jason Willgruber
      (roblwill(a)usaor.net)
          ICQ#: 1730318
<http://members.tripod.com/general_1>
Fact is, the serial protocol for communicating with your 'AT keyboard is
widely understood and well documented. I'm sure anyone who could program an
older 8-bit micro could program a PIC or other single-chipper, like the
87C42 which I believe is still made, to do what the old 8042 does. If you
get an 8742, I don't think they even have a code protection bit.
Given that you have too much principle, and perhaps not enough interest, to
replicate the 8042 (a clever choice of chips), you could simply decode the
binary you do get from the keyboard with a lookup table.
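The lookup-table approach is straightforward: translate each raw make code
clocked in from the keyboard into a character. The sketch below uses a handful
of standard scan code set 1 values; a real table would cover the whole
keyboard, and break codes (bit 7 set) are simply ignored here.

```python
# Translate raw keyboard make codes to ASCII via a lookup table.
# Scancodes shown are standard XT (set 1) values; the table is a
# small illustrative subset, not a complete keymap.
SCANCODE_TO_ASCII = {
    0x1E: 'a', 0x30: 'b', 0x2E: 'c',
    0x02: '1', 0x03: '2',
    0x39: ' ', 0x1C: '\n',
}

def decode(raw_bytes):
    """Turn a stream of raw scancodes into text, skipping break codes."""
    out = []
    for code in raw_bytes:
        if code & 0x80:                     # break (key-release) code: ignore
            continue
        ch = SCANCODE_TO_ASCII.get(code)    # unknown keys are dropped
        if ch is not None:
            out.append(ch)
    return ''.join(out)

print(decode([0x1E, 0x9E, 0x30, 0xB0]))  # make/break a, make/break b -> "ab"
```

The same table-driven structure works equally well in a PIC or 8742 firmware
loop, one byte per interrupt.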
Dick
-----Original Message-----
From: Tony Duell <ard(a)p850ug1.demon.co.uk>
To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
Date: Sunday, April 04, 1999 6:31 PM
Subject: Re: homemade computer for fun and experience...
>>
>> > True. But AFAIK the AT keyboard host interface was never implemented in
>> > TTL (it always used a programmed 8042 microcontroller), so it's a little
>> > harder to build from scratch.
>>
>> If what you're trying to do is interface the AT keyboard to some custom
>> controller that doesn't need to be otherwise AT-compatible, there's no
>> reason why you need the 8042. The AT keyboard interface is not particularly
>> harder to implement than the XT interface was. I've written code for several
>> products that bit-banged it on a microcontroller.
>
>Absolutely. BUT : if you are making a homebrew machine, the last things
>you need are (a) I/O that's timing critical (at least not for the
>keyboard) or (b) a microcontroller that you have to program and debug.
>
>And then, as you said below, the AT keyboard protocol is not that well
>documented. The XT is a little better documented, if only because there's
>a circuit using standard chips (plain TTL chips) that accepts XT keyboard
>input. You can work out any odd bits of the protocol from that.
>
>Alas IBM never published the 8042 ROM source, so you can't use that as a
>reference.
>
>>
>> The AT keyboard interface protocol is really a pathetic design, though. It's
>
>I'll not argue with that.
>
>> a pain in the ass to deal with, and it's not well documented anywhere (even
>
>The documentation is not brilliant, but you can figure out how to talk to an
>AT keyboard from the info in the TechRef if you have to. Not an ideal
>first project, though.
>
>-tony
>
> The problem now is, if I change the file that contains foo(), I have to
>apply my patch again. Or in other words, once I patch the output from the
>compiler, I can no longer use the compiler. If this is a one time shot and
>I will only work with the output from then on, then no problem. But
>otherwise ...
>
> -spc (Although from the discussion it seems that the deal was a one
> shot anyway ... )
This sort of situation (compiler doesn't quite do what the writer
wants) is actually widely encountered in some classic Unix kernels. There
are parts of the kernel that need interlocking, running at a different
priority, etc. The "classic" way of doing this is to compile the C code
into assembly code, run a program that massages the assembly code to
change the details of how some actions are done, and then assemble the
modified code.
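The "massage" step above amounts to a small text filter run between the
compiler and the assembler. This toy sketch splices priority-raising
instructions around regions the C source marked with comments; the SPLHI/SPLX
marker names and the PDP-11-ish instructions are hypothetical, purely to
illustrate the shape of such a tool.

```python
# Toy assembly-massaging pass: compiler output in, modified assembly out.
# Marker names and instructions are illustrative, not from any real kernel.
def massage(asm_lines):
    out = []
    for line in asm_lines:
        if "; SPLHI" in line:
            out.append("\tmtps\t$0340\t; raise processor priority")
        elif "; SPLX" in line:
            out.append("\tmtps\t$0\t; restore priority")
        else:
            out.append(line)        # ordinary lines pass through untouched
    return out

src = ["\tmov\tr0,r1", "; SPLHI", "\tinc\tcount", "; SPLX"]
for l in massage(src):
    print(l)
```

In the real kernels the filter was typically sed or a small C program sitting
in the makefile between the cc and as steps; the principle is the same.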
As the old fortune cookie program says, "I'd rather write programs that
write programs than write programs" :-).
--
Tim Shoppa Email: shoppa(a)trailing-edge.com
Trailing Edge Technology WWW: http://www.trailing-edge.com/
7328 Bradley Blvd Voice: 301-767-5917
Bethesda, MD, USA 20817 Fax: 301-767-5927
>> Your argument, Eric, was that the microcode compiler generated code
>> that is equally as efficient as that you, or someone else, could have
>> constructed by hand. Megan in no way implies the use of assembly code.
>> The microcode compiler would generate an object file, which by your
>> own admission above, generated more code than could fit in the
>> memory space available. You accepted her argument that the human
>> was required to generate code more efficient than that produced by
>> the microcode compiler. You protest _too loudly_ my friend.
>
Again, you used the word *assembly* and that implies my point.
>No, I accepted her argument that for conventional machine code compiled from
>a conventional high-level language, a human can fairly easily generate
>better code. But if you had read my posting *carefully*, I specifically
>protested that this is *not* the same problem as compiling horizontal
>microcode from a specialty source language.
>
>I *still* stand by my statement. The compiler produced better code
>in minutes than I could have produced in three months. Your argument seems
>to be that a compiler can't produce better code than a human with an infinite
>amount of time could. I'll concede you that point. Or maybe I won't. A
>compiler with an infinite amount of time could have simply tried every possible
>combination of control store bits (for the 512*72 example, 2^36864
>possibilities), and run a set of test vectors against each candidate to
>determine which ones meet the specifications, and of those which yields
>the highest overall performance. And by applying some relatively simple
>heuristics, the number could be reduced from 2^36864 down to a number that,
>while still huge, could at least be done during the remaining life of the
>universe. But this is irrelevant, because neither the human nor the computer
>has an infinite amount of time available.
Halting problem (P vs NP) difficulties aside, I have never seen the situation
in which the resultant output of a language translator could not be further
optimised, with the exception of trivial cases. The value that you ascribe
to your time notwithstanding.
>If my job had depended on finishing the project in question without using
>the compiler, the only way to do it would have been to expand the control
>store to 768 or 1024 words, because after spending a lot of time writing
>microcode by hand, it would probably have been larger than 512 words.
It is always easier for the human to find a wasteful application of resources
to facilitate job completion than to hunker down and produce a quality,
efficient product. Witness the ubiquitous supremacy of Windows, i.e. NT.
>It was the use of the compiler that allowed me the luxury of shrinking it to
>fit in the 512 words available. Without using the compiler, there is no way
>in hell that I would have had time to do such a thing.
Your argument, again, is the value that you place on your time, and not the
quality of your intellect. I maintain that the computer, no matter the skill
of the algorithm, is always to fall short of human productivity. In this, I
agree with such notable researchers as Roger Penrose and Douglas R.
Hofstadter. Have you read Godel, Escher, Bach: An Eternal Golden Braid?
>It is instructive to note that when I was trying to squeeze the 514 words
>down to 512, I discovered that the compiler had succeeded in combining
>several things that I wouldn't have easily found,
Here, again, you base your argument on your lack of skill and capacity, not
on the limitations of algorithms.
> because the compiler is
>actually *better* at doing data flow analysis than I am. That's not because
>the compiler is inherently more clever than I am, but because it is not
>subject to the Hrair (sp?) limit as I am. It's not more clever, but it's
>more tolerant
With regard to self-deprecation, you seem to hold the decathlon in
tolerance.
>of doing tedious recordkeeping and matching. Of course, if I
>had the time to meticulously do the same thing, I obviously could do at least
>as good a job of data flow analysis as the compiler. But in practice that's
>simply not going to happen. Life's too short.
I do accept that life is, indeed, too short. I, too, would not want to spend
my life on a single problem, a single implementation of an algorithm to the
limits of optimality. That, however, is not the point.
>Most everyone in this discussion is just parroting the conventional wisdom
>that compilers don't generate code as compact or efficient as humans can,
>without considering the possibility that for specific problems and under
>specific constraints, they actually can be *better*. I'm absolutely willing
>(and eager) to concede that in the general and unconstrained case, the
>conventional wisdom holds true.
Demonstrate a case where an algorithm provides a better solution to a
translation problem, and I'll show you a case where the algorithm provides
exactly the solution obtainable by a human, but no better a solution than
that obtainable by a human.
Your argument is that no human can perform the act of a computer, and this
is sheer lunacy.
William R. Buckley
-----Original Message-----
From: Bill Sudbrink <bill(a)chipware.com>
To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
Date: Tuesday, 6 April 1999 00:55
Subject: RE: bringing up an 8f...
>Not to mention that the Acrobat user interface _SUCKS_!
Agreed. I'd be a darn sight happier with a simple text file, or even
html. I can read that even on a Vax with a VT100 and Lynx. Acrobat is
somewhat tedious to manipulate. And I hate having to zoom on text
that's too small to read.
Just my $0.02 worth as well.
Cheers
Geoff Roberts
-----Original Message-----
From: Eric Smith <eric(a)brouhaha.com>
To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
Date: Sunday, April 04, 1999 8:03 PM
Subject: Re: microcode, compilers, and supercomputer architecture
>Megan wrote:
>> well put... I've yet to find a compiler which can produce code which
>> could not then be further optimized in some way by a person well
>> versed in that machine's architecture...
>
>Yes, but if you paid attention to the original claim, you would see that
>I asserted that it was true for horizontal microcode with large amounts
>of data dependency. This is *very* different than trying to compile C
>(or Pascal, or Bliss, or whatever) for a typical architecture (which more
>closely resembles vertical microcode).
>
>One of the systems I microcoded had 512 words of control store (of about
>72 bits each), and running my microprogram source code through the compiler
>produced 514 words of microinstructions. With about two weeks of
>concentrated effort, I was able to eventually squeeze out two
>microinstructions. Total development time: 6 weeks.
>
>If I had tried to write all of the microcode in "assembly", it would have
>taken me longer to write, and it probably would have been *bigger* on the
>first pass. And I still would have had to spend a lot of time on hand
>optimization. I think this would have taken at least 12 weeks of development
>time, although since I didn't do it that way I'll never know.
And, from your most recent posting:
>> Again, you used the word *assembly* and that implies my point.
>
>Now you've lost me completely. You were quoting your own writing, not
>mine. I didn't even *mention* "assembly" in my posting, except in quoting
>you.
>
>> Halting problem (P vs NP) difficulties aside,
Is this use of the word "assembly" not yours? I, sir, am quoting you, not
me!
>
>I've never seen the situation in which human-generated code could not be
>further optimized, with the exception of trivial cases. Your assertion
>does not contradict my claims. Of course, this brings up the issue that
>"trivial" is not objectively quantifiable. One could perhaps credibly argue
>that a trivial code sequence is one for which no further optimization is
>possible. I'm not taking that position, but simply pointing out the
>difficulty in basing arguments on non-objective statements.
>
>In point of fact, I've seen huge amounts of human-generated code that was
>nowhere near as optimal as code that a compiler would generate.
>
>All this proves is that neither humans nor compilers tend to produce
>optimal code. It says nothing about which tends to produce more optimal
>code.
>
The fact that an individual programmer is incapable of producing superior
code, relative to optimality, only serves to indicate that humans suffer a
greater degree of fallibility vis-a-vis the computer, which, as you said, is
quite happy to act on tedium. That says nothing about the general case that
humans have superior intellectual capacity vis-a-vis the computer. After all,
who invented what?
This discussion is founded upon your statement:
>> Maximisation of processor throughput, and minimization of
>> microinstruction count, is at least half the purpose of microprogramming.
>
>Sure. And the microcode compilers I've written and used are much better
>at optimizing horizontal microcode than I have the time or patience to do
>by hand.
>
>> For such optimisation to be effected, one must necessarily write directly
>> in microcode, either bit and byte streams, or coded as in assembly
>> languages.
>
>No, it doesn't. Microcode almost always has a lot of data dependencies,
>which means that a compiler can often do as well as a human at optimizing
>it.
>
And yet, you argue against yourself with:
>... when I was trying to squeeze the 514 words down to 512, I ...
Herein, you admit that your personal skills quite outweighed those of the
algorithm that you constructed for the purpose of compiling a high-level
code into a particular microcode. Recall:
>Sure. And the microcode compilers I've written and used are much better
>at optimizing horizontal microcode than I have the time or patience to do
>by hand.
Also, recall:
>Therefore if I can use four weeks of my time to write a compiler and two
>weeks to slightly tweak the output of that compiler ...
So, we are agreed that a human has greater capacity for the preparation of
optimal code. I concede the notion of sufficient time to complete a task.
What you have failed to address is that the human intellect is not limited by
the capacity to algorithmise a solution. Hence, P vs. NP, GEB, and in
particular, the notions of Gödel: that within any axiomatic system, the
answers to some positable questions are indeterminable.
Humans have the capacity to make judgements by means outside of those
mathematical and logical, hence the reference to Penrose.
For all the nit-picky details of the works of these masters, the points they
make are far grander. The real value of their works is not kept solely within
the realm from which their conclusions emerge, but within which such
conclusions find additional value.
William R. Buckley
At 12:46 AM 4/5/99 -0700, you wrote:
>
>Are you insane? The excruciatingly slow and bloated Microsoft Word
>screams compared to Acrobat. I get so antsy waiting for Acrobat to update
>a fricken PDF page on the screen that my head wants to explode.
I suspect it's like PostScript, or metafiles, or executable code
in general: it all depends on what's generating the PDF file.
Some PDFs are apparently just bitmaps, others a mix of text and
bitmap, others just text. The existence of a PDF print driver
doesn't mean that what goes through it will come out well.
- John
<If you can stick the XT keyboard (are keyboards that talk that protocol
<still being made?) then look at the circuit of the PC or XT. The keyboard
<interface is a few TTL chips hung off an 8255.
AT keyboards can be used as well, as they are similar (not the same). You'll
have to make an interface, as the serial protocol is not compatible with
UARTs; also you will have to convert the key-down/key-up codes to something
more human-readable.
<I'd make it modular (in that I'd have expansion slots), but I'd probably
<put the CPU + RAM + basic I/O on the 'motherboard'. For prototyping,
<DIN41612 connectors are easier than edge connectors because you don't
<need special boards with the connector fingers on them.
An acceptable bus is 8-bit ISA, and there are plenty of FDC, video, and HDC
cards for that bus that could easily interface to a Z80.
<SRAM is a _lot_ easier. And now that 64K SRAM is 2 chips at most (62256's
<are cheap now), I'd use that. DRAM is not too hard until you realise that
<layout and decoupling are critical if you want to avoid random errors.
Same comment, with one proviso: if you're doing over 256K, consider DRAM and
an MMU. A good article on that is at the TCJ site.
<[For the hardware wizards here : Yes you can homebrew with DRAM - I've
<done it. But not as my first real project].
For a Z80 system of 64K or 128K, static RAM is far easier. Also, 128Kx8 parts
are cheap, so even 256K or 512K RAM systems are modest.
Allison
>I was unclear.
>
>I meant that the microcode/assembly language words _corresponding to a
>particular LISP program_ were created from that program and then executed.
>If you define a function (like the ever-popular factorial function)
>something has to be stored in memory as the definition; presumably it is
>some sort of primitive (as in not-easily-decomposed) machine language, and
>presumably there is a program that converts source text into object code.
>
>So wouldn't that converter be a compiler? I believe that a number of subtle
>details happen during the conversion process, so you couldn't even say the
>compiler is a simple compiler.
>
>-- Derek
You were quite clear. The answer is no.
Consider the instruction set of the x86. The MOV instruction is actually
implemented as a small sequence of microinstructions. There is, in fact,
no dedicated series of gates and other electronic apparatus which
implements the operation of MOV. Instead, it is implemented as a
series (or sequence) of smaller operations, such as LOAD REGISTER,
ADD REGISTERS, etc. If you are not familiar with the processes of
microprogramming, then you should become so. Microprograms are
not stored in RAM. Instead, they are stored in ROM.
Also, the only processors which today are founded upon the operation
of dedicated electronics (that is, electronic circuits which implement
fully and singly the operation of a machine instruction for a computer)
are the RISC machines. This is why they are so bloody fast. All CISC
machines are microprogrammed.
For those who are aware of the operations of the HP 21MX processors,
these are microprogrammed machines. As it happens, the user of
such a computer can alter the microprogramming. This is the computer
upon which I obtained my experience as a microprogrammer.
I do not mean to say that the factorial function is microprogrammed. It
is not. However, the operators CAR, CDR, CONS, etc. are implemented
in microcode. Hence, there is no need for translation - they are executed
directly.
For confirmation of this, contact my friend, Chuck Fry at
chucko(a)ptolemy.arc.nasa.gov
Now, it is true that the printed text of the program must be converted to
the instruction set of the computer, but the process is like this:
"CAR" corresponds to the instruction with byte code 0x01
"CDR" corresponds to the instruction with byte code 0x02
and so on. Of course, the byte values I give are only examples. The
true translations are not known to me. However, each operator of
the Lisp language will correspond to a single instruction code of the
Lisp machine.
This is a far cry from the result one usually obtains from a compiler.
If a compiler were used by a Lisp machine, then the operation of
CAR would involve the production of dozens of machine
instructions, just as the call of a subroutine in C involves dozens of
machine instructions. Heck, a simple addition in C results in an
instruction sequence like
MOV AX, address of data1
ADD AX, address of data2
MOV address of result, AX
For the Lisp machine, CAR would result in an instruction like
CAR address of source operand list, address of result operand
list
William R. Buckley
>> Consider the PDP 11/44 in my living room. It is constructed using the
>> AMD 2900 series of bit-slice microprocessor chips. In this case, the
>
>Well, I've never seen a PDP11 processor (as opposed to a floating point
>processor or a VAX) that uses 2900 series. IIRC the 11/44 uses 74S181
>ALUs and a sequencer built from TTL (and maybe some 82S100 PLAs)
>
>-tony
>
Not mine. I just pulled the processor card, and it contains 16 of the
AM2901BDC chips, copyright 1978. The card has the designation M7093
imprinted in the PCB metallisation layer. Well, upon closer inspection,
this seems to be the FPP. The card designated M7094 does have
four of the 74181 type chips, and this is probably the general purpose
CPU component.
Anyway, the point that I was trying to make is that the control code
for the 2901 was contained in ROM, and not so much that the CPU
was implemented via the 2901. Let's concentrate on the issue, not
the errors associated with making the point.
William R. Buckley
Well . . . I did think one could get two short cards on one S-100 board. I
did have something concrete in mind, too. If one inserts a wire-wrap 62-pin
(8-bit ISA) connector into a DIN 41612 right-angle socket of opposite gender,
such as what one finds on a VME wire-wrap board (remembering that I once sold
S-100 wire-wrap boards with a pattern certainly suitable for this purpose,
and VME wire-wrap cards as well), one can, indeed, host two 8-bit ISA cards
on a single S-100 board. This would certainly be cheap enough, in most cases,
to warrant such an effort. The software might get to be a problem, though.
Dick
-----Original Message-----
From: CLASSICCMP(a)trailing-edge.com <CLASSICCMP(a)trailing-edge.com>
To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
Date: Saturday, April 03, 1999 6:58 PM
Subject: Re: homemade computer for fun and experience...
>>What might be fun would be an S-100 card to serve as an interface to a
>>Monochrome/Hercules equivalent card and an IBM-style keyboard
>
>Compupro did a similar thing over a decade ago ... it's called the "PC Video"
>S-100 card.
>
>>thrift store. That would save the hassle of having an extra keyboard and
>>monitor for your "extra" PC.
>
>I dunno - to me the most useful possible console interface is a serial
>port. I can hook a terminal up, I can hook a VAX up, I can hook a
>PC-clone up, etc.
>
>--
> Tim Shoppa Email: shoppa(a)trailing-edge.com
> Trailing Edge Technology WWW: http://www.trailing-edge.com/
> 7328 Bradley Blvd Voice: 301-767-5917
> Bethesda, MD, USA 20817 Fax: 301-767-5927
>> Are you insane? The excruciatingly slow and bloated Microsoft Word
>> screams compared to Acrobat. I get so antsy waiting for Acrobat to update
>> a fricken PDF page on the screen that my head wants to explode.
>>
>> Unless you have the latest and greatest 500MHz PII wonder machine, Acrobat
>> is a farce.
>Not to mention that the Acrobat user interface _SUCKS_!
There are several alternatives to Acrobat. I'm quite familiar with
them because there are no Acrobat binaries for any of the architectures
that I commonly use.
1. Any Ghostscript release from the past few years does PDF quite nicely.
You have a choice of a command line interface (very useful for doing
batch conversions from PDF to something more usable) or a point-and-drool
shell ("GhostView", aka "gv"). Ghostscript is available for many platforms,
it is independent of any specific windowing system, and it is quite usable
on platforms which lack graphic displays at all. For details, see
http://www.cs.wisc.edu/~ghost/
2. XPDF, which only does PDF (unlike Ghostscript, which does PostScript
as well) and can only display on X Window System screens, is also available.
See
http://www.foolabs.com/xpdf/
Personally, I think PostScript is a better "high-level" approach to
describing page layout than PDF is, but I also understand the commercial
reasons that force Adobe to push PDF instead.
I've been having a blast lately doing my own printed circuit board layout
in PostScript, incidentally :-). Bezier splines, here we come!
--
Tim Shoppa Email: shoppa(a)trailing-edge.com
Trailing Edge Technology WWW: http://www.trailing-edge.com/
7328 Bradley Blvd Voice: 301-767-5917
Bethesda, MD, USA 20817 Fax: 301-767-5927
This is why I'm posting about progress.
I'm well above average with tools, test gear, spare parts and debug
skills. I seriously wonder how many of the "I want a SROCTH because
I've heard they are neat (kewl, cool, gnarly...)" crowd have the skills or
the clues to even power one up successfully. These things are as far from
PCs as one gets. The end result is destruction rather than preservation.
OH, "SROCTH" is Some Rare Old Computer To Hack.
Part of the power supply work was removing and repairing an existing
bungled hack that had some of the foil burnt off the board, likely with
some soldering torch of killer proportions. When dealing with old power
supplies like this, substitutions are not a good thing.
<Great progress! The stuck bit won't be hard to find/fix. I had a couple of
<bad TTL chips when debugging the 8/f here, they were easy to find. Similar
<problem (always-off bit in certain I/O operations).
I will find it, but first back to the power supply, as the -15V rail folded.
<Let us know how your core plane works!
Initial tests indicate it was good; I was able to run a simple five-instruction
program. Short story: the core in this one came from an 8E that had two 8K
stacks I got from the Mill in '85 and gave away in '90 or '91 to another
Digit (Digit = a person who works/worked at Digital).
It was known good when given away.
<Schematics and other info's on highgate (when you get time to download the
<viewer!).
There is no free or shareware viewer for that format that I can find. If you
have one, it can likely save the file as JPG or GIF. Why an oddball version
of TIFF was used is beyond me.
Allison
Hi,
----------
> From: CLASSICCMP(a)trailing-edge.com
> To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
> Subject: RE: bringing up an 8f...
> Date: Monday, April 05, 1999 9:33 AM
> I've been having a blast lately doing my own printed circuit board layout
> in Postscript, incidentally :-). Bezier splines, here we come!
But even then, having a layout in PostScript for reference purposes is
nice. A quad Qbus board doesn't fit on letter format, but a "resize to
fit" does, which is really nice if you try to document some rewiring.
cheers,
emanuel
Hi,
----------
> From: Zane H. Healy <healyzh(a)aracnet.com>
> To: Discussion re-collecting of classic computers
<classiccmp(a)u.washington.edu>
> Subject: Re: bringing up an 8f...
> Date: Sunday, April 04, 1999 8:18 PM
>
> Actually I'm of the opinion that PDF is just about the best format out
> there for documentation, although I know there are a LOT of people that
> disagree with that.
I prefer PostScript. There are many tools to view the stuff, print it, or
change it. Most of the time I like to print the documents, but I always
reformat them (something like 2 pages on one page, landscape format,
duplex print), simply not to waste so much paper.
only my .002 $
cheers,
emanuel