> From: Eric Smith
> The power draw of the 11/45 and 11/70 is almost the same, with the
> exception of the separate memory box(es)
Well, the 11/70 does have the cache, the UNIBUS map, etc that the 11/45 does
not. I forget how many extra cards that all works out to, but my vague memory
is that it's roughly half a dozen.
Noel
Hi,
Who can help me with a source (not IBM) for logic probe tips
used with IBM MST and SLT backplanes?
See: http://home.hccnet.nl/h.j.stegeman/IBM_logic_probes.jpg
Preferably the lower one (P/N 453826).
Thanks for your replies.
Regards Henk
>Glen Slick wrote:
>>On Mon, Feb 16, 2015 at 11:47 AM, Noel Chiappa <jnc at mercury.lcs.mit.edu> wrote:
>
>
>>This may be common knowledge here, but I was unaware of it (and, AFAIK, the
>>DEC documentation doesn't point this out), so here goes...
>>
>>It turns out one doesn't need the fancy cab-kit to connect up to an 11/23+'s
>>console. (...)
>>
>If anyone is looking for the real cabkit for the M8189 PDP-11/23+ JT
>Computer currently has some listed on eBay for $45. There were 5
>listed, 2 sold last week:
>
>http://www.ebay.com/itm/151615474481
>CK-KDF11-BA M8189 CABKIT INCLUDES PANEL AND CABLES
>
>I finally got around to buying a CK-KDJ11-D cabkit for the M7554
>KDJ11-D directly from their website ( www.jtcomputer.com ) a week ago.
>I've had good results buying a few things from them.
>
Check
John Foust <jfoust at threedee.com> wrote:
> I'm trying to understand at a low level how some early computers
> and game consoles generated a non-standard form of NTSC.
>
> The Wikipedia http://en.wikipedia.org/wiki/Low-definition_television
> says: (...)
Adam Sampson <ats at offog.org> wrote:
> There's a pretty good description here, with diagrams of the video
> waveforms involved for both PAL and NTSC (both use the same idea):
> http://martin.hinner.info/vga/pal.html
>
> (...) in the kind of non-interlaced
> signal you're talking about, every frame starts with the odd field
> vertical sync, so the monitor always pulls the electron beam back to
> the same place.
A-ha, interesting to see that this sort of shortcut was actually taken
in commercial products. I pretty much accidentally ended up with that
sort of signal when, back in 2010-ish, I tried to coax a Sun cg3-style
framebuffer (the onboard FB of a SPARCclassic) into outputting a
TV-displayable (50Hz) RGBs signal by feeding it a hand-crafted "mode
line" after working out what the registers on the video timing ASIC do.
I'm pretty sure there are some accounts of that adventure in the list
archive, as a list member helped me through it.
I got the picture to display on my Commodore 1081 monitor via a
13W3-to-SCART cable I had fashioned for the purpose, but of course I
wasn't very impressed with the vertical resolution, and couldn't find
out how to enable interlaced mode on the ASIC - if it's capable of
that at all.
So Long,
Arno
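To put a number behind the "same place" remark in the quote above, here is a
quick sketch of the field arithmetic (my own back-of-envelope figures, not
from the posts):

```python
# Interlaced NTSC: 525 lines per frame, split into two 262.5-line fields.
# The half line is what offsets the second field vertically between the
# lines of the first.
LINES_PER_FRAME = 525
lines_per_field = LINES_PER_FRAME / 2   # 262.5

# The non-interlaced shortcut: emit a whole number of lines (commonly 262)
# every field, so each "frame" starts the vertical sync at the same point
# and the beam always retraces to the same place.
noninterlaced_lines = 262
```

Dropping that half line is the entire trick: the monitor no longer has any
reason to interleave successive fields.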
This may be common knowledge here, but I was unaware of it (and, AFAIK, the
DEC documentation doesn't point this out), so here goes...
It turns out one doesn't need the fancy cab-kit to connect up to an 11/23+'s
console. The headers on the card are completely compatible with standard
DLV11-J connectors, and a DLV11-J cable can be used to connect up to an
11/23+ card. (One has to select the desired baud rate with the DIP switches
on the card, of course.) I have verified this by trying it, and it worked.
The cabkits merely allow one to select the baud rate at the console connector
(via a clock generator on the cabkit, and the 'external clock' input on the
serial interface). This implies that one should be able to plug a cabkit into
an appropriately configured DLV11-J (external serial clock select), and
select the baud rate via the switch on the cabkit. I haven't tried that,
though.
Noel
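A rough sketch of the clock arithmetic behind that external-clock scheme (my
assumption: the usual 16x UART receive/transmit clock of that era; the post
doesn't state the DLV11-J's actual multiplier):

```python
# UARTs of the DL-series era conventionally run their bit clock at 16x the
# baud rate, so an external clock generator (like the one on the cabkit)
# effectively selects the baud rate directly.
OVERSAMPLE = 16  # assumed multiplier, typical for 1970s-era UARTs

def external_clock_hz(baud, oversample=OVERSAMPLE):
    """Clock frequency a generator must supply for a given baud rate."""
    return baud * oversample

console_clock = external_clock_hz(9600)  # 153600 Hz for a 9600-baud console
```

This is why a switch on the cabkit can change the rate without touching the
card: the card just counts whatever clock it is fed.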
On Mon, Mar 16, 2015 at 8:20 AM, John Foust <jfoust at threedee.com> wrote:
> I'm trying to understand at a low level how some early computers
> and game consoles generated a non-standard form of NTSC.
There were several non-standard aspects of the Apple II "NTSC" video
signal which made it not actually NTSC-compliant. It was mostly a
matter of timing. The most significant deviations were lack of
interlace (60 frames per second, rather than 30 frames per second each
composed of two interlaced fields), and that each horizontal scan
line was 228 color carrier cycles long rather than 227.5, which was
done so that the color carrier phase relative to horizontal timing was
the same on all scan lines. Early revisions of the Apple II were also
missing the serrations in the vertical interval.
Some people claim that the way color was generated in the Apple II was
somehow "fake" and refer to it as "artifacts" that somehow "trick" the
television or monitor, but in actuality it was just a clever way of
having the hardware produce various signal amplitudes and phases, just
as "real" color does. It was clever enough to be patented, but it's
not in any way "wrong" or "fake".
When other devices have an issue with Apple II video as a source, and
don't reproduce the colors properly, in my experience it is usually
the 228 vs. 227.5 color carrier cycles per scan line that is at fault.
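The line-rate consequence of 228 versus 227.5 subcarrier cycles is easy to
check (a sketch of my own, using the exact NTSC subcarrier frequency):

```python
# NTSC color subcarrier frequency: exactly 315/88 MHz (~3.579545 MHz).
FSC = 315e6 / 88

# Broadcast NTSC: 227.5 subcarrier cycles per line, so the subcarrier phase
# flips 180 degrees on successive lines (half a cycle left over per line).
std_line_hz = FSC / 227.5   # ~15734.27 Hz

# Apple II: 228 cycles per line -- a whole number, so every line starts at
# the same subcarrier phase, at the cost of a slightly low line rate.
a2_line_hz = FSC / 228.0    # ~15699.76 Hz
```

That ~34 Hz difference in horizontal rate (about 0.2%) is within what most
televisions would lock to, but it is exactly the sort of deviation that can
upset stricter downstream video equipment.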
In the early 1980s, Video Associate Labs sold a product for the Apple
II called the VB3 Microkeyer. This consisted of a long slot 7 card,
and a larger card that installed atop the power supply, connected by a
ribbon cable. You had to pull about a dozen chips out of the Apple II
motherboard (thus it would only work with the II and II+, but not the
IIe or IIgs), and install in their place ribbon cables to the VB3
boards. The results were:
1) The Apple II would generate fully-compliant broadcast-quality NTSC
output. (IIRC, there was a way to selectably disable interlace, which
would break NTSC compliance.)
2) Proc amp and genlock: The Apple II video could be overlaid on an
external video input. This could be done selectively (keying), and it
was also possible to do software-controlled "wipes" between two video
inputs, using the Apple video bitmap as the input selector.
3) A special "linear" high-res graphics mode was added, in which the
addressing of the frame buffer was linear rather than having the
normal Apple II interleaved addressing. This made it easier to program
keying and wipes, but could be used for other purposes.
I used and programmed one of these in the instructional television
studio of a community college. I wrote some "wipe" programs for it,
and licensed them to Video Associate Labs in exchange for a VB3 of my
own. Alas, I no longer have it. :-(
Years later Apple introduced a Video Overlay card, which did produce
broadcast-quality NTSC output, and would work in the II, II+, IIe, and
IIgs. It supported all of the IIgs graphics modes, even when used in
an earlier model computer.
> From: Johnny Billquist
> I need to see if I can locate the manual in the summer when I get close
> to where I had that 11/34
That would be wonderful, if you find it! In addition to showing for sure how
it basically worked, I still also have a number of questions as to exactly
how it connected up to the optional cache, EUB memory, etc.
> we still seem to talk about two different ENABLE products, though... :-)
In saying what I'm about to say, I am not in any way trying to be
argumentative (truly, I would be as happy if you were correct, just as much
as if I were, _provided that we had found out what the correct answer really
is_), but I really do think that your memory is playing tricks on you, with
the 'it didn't require any software changes at all'.
The "Enable/34" described in contemporary posts here:
http://gopher.quux.org:70/Archives/usenet-a-news/FA.unix-wizards/81.07.09_u…
http://gopher.quux.org:70/Archives/usenet-a-news/FA.unix-wizards/81.07.12_u…
works the way I'm describing... (Although note I do think Mike made some
mistakes in the diagram in the first one - I think the DMA devices have to be
behind the ENABLE/34, per his description in the second post of how it works.)
I wish we could find a copy of the paper mentioned there ("Modifications to
UNIX to Allow Four Mega Bytes of Main Memory on a 11/40 Class Processor" by
Clement T. Cole and Sterling J. Huxley), as it might also answer the questions
I have...
Noel
Probably a stupid question, I'm sure, but I'd like to disable the beeping
of SimH when VMS rings the terminal bell. I can't seem to find a
configurable parameter for this in the documentation.
Am I missing something?
Kind regards,
Sander
> From: Roe Peterson
>> (Unless you have an 11/xx with an Able ENABLE board! :-)
> What is an ENABLE board?
That's that thing we had an 'energetic' discussion about a while back; it's a
board that allows one to put more than 256KB of memory in a UNIBUS machine
(other than a /44 or /70, which already support more than 256KB - and probably
the /24 too, but I'm too lazy to check).
Noel
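For context on the 256KB figure (my arithmetic, not from the posts): the
plain UNIBUS carries an 18-bit byte address, while the extended mapping such
a board provides is 22-bit, matching the "Four Mega Bytes" in the paper
title mentioned earlier in this thread:

```python
# Address-space sizes for PDP-11 address widths (a back-of-envelope check).
def addr_space_kib(address_bits):
    """KiB addressable with a byte address of the given width."""
    return (1 << address_bits) // 1024

unibus_kib = addr_space_kib(18)    # 256 KiB: plain UNIBUS limit
extended_kib = addr_space_kib(22)  # 4096 KiB (4 MB): 22-bit mapping
```

(Note the usable memory under 18-bit addressing is actually a bit less than
256KB, since the top of the address space is reserved for the I/O page.)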