John wrote:
Seems like something was invented, probably in chip design, that made
3.3V so useful. I wonder what that was, and when that landmark was reached.
AFAIK, there's nothing particularly magic about 3.3V. CMOS can be
designed or optimized for a wide range of voltages, and 3.3V just
happened to "win" as one of the most common. But there are also
systems that run on 3.0V and other voltages. And most modern VLSI
chips actually use lower voltages for the core logic than the I/O, so
it's common to see chips with a 1.2, 1.5, 1.8, 2.0, or 2.5V core
supply, while 3.3V is still common for the I/O supply. Some chips
with 3.3V I/O are "5V tolerant", but many are not.
Anyhow, the migration from 5V to 3.3V was mainly motivated by two things:
1) reducing power consumption (rough numbers below)
2) reducing electric field strength in increasingly dense IC core
circuitry (thinner gate oxides, smaller geometries) to avoid damage
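To put rough numbers on point 1: CMOS dynamic (switching) power scales
roughly with C*V^2*f, so dropping the supply from 5V to 3.3V cuts the
switching power to about (3.3/5)^2, or ~44% of the 5V figure, before
counting any other savings. A quick back-of-the-envelope sketch in
Python; the capacitance, frequency, and activity-factor values are just
made-up placeholders to show the V^2 scaling:

# Rough comparison of CMOS dynamic (switching) power at 5V vs 3.3V.
# P_dyn ~ alpha * C * V^2 * f; alpha, C, and f below are placeholders
# chosen only to illustrate the V^2 dependence.

def dynamic_power(v_supply, c_load_f=1e-12, freq_hz=50e6, activity=0.1):
    """Estimate dynamic power (watts) for one switched node."""
    return activity * c_load_f * v_supply**2 * freq_hz

p_5v0 = dynamic_power(5.0)
p_3v3 = dynamic_power(3.3)

print(f"5.0V: {p_5v0 * 1e6:.2f} uW per node")
print(f"3.3V: {p_3v3 * 1e6:.2f} uW per node")
print(f"ratio: {p_3v3 / p_5v0:.2f}  # (3.3/5)^2 ~= 0.44")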
Eric