> On Sunday 22 July 2007 19:40, davis wrote:
> > I just read a recent article in EE Times saying that the target life
> > of new ICs is 10 years. This is due to metal migration and other
> > (I can't remember) effects that sub-micron processes exhibit. I guess
> > you should hang on to all the old gear you can get, because everything
> > built today will be landfill in 10 years. I too have that microwave,
> > stainless on the outside for no apparent reason and a painted
> > interior. The coating failed after a few years.
"Target life-time" doesn't mean that every one of the chips will be dead
in 10 years. Realistically, at worst, 1/2 will be dead in 10 years,
and more likely than not it'll be somewhere more favorable on the bell
curve than that [an insignificant amount of dead chips until 10 years
out]. There's plenty of products that were produced 20+ years ago
which had worse "target life" (though not all of it had anything to do
with design).
Yes, it's much further out on the normal curve.
Chip makers have a strong incentive to build chips with long lifetimes.
Nobody is going to buy huge volumes of chips that will all die soon.
Complicated chips in modern geometries often take many $1M's, and often
$10M's, to develop when you consider the NRE (a few $M's), the tools, and
the number of person-years that go into them. If you build chips that
die soon, you won't have any customers, and that makes it kind of hard
to recoup all of those $M's. You really want the product to become
obsolete for some reason other than the chips starting to die.
If you were creating a mass-market product, would you want to take
a $1B write-off on earnings when the world discovered your stuff dies
before its "natural" obsolescence cycle?
The general practice is to take the desired lifetime (often 10 years
for the server type of chips that I usually build), the worst-case
average temperature, worst-case voltage, maximum frequency, a pessimistic
estimate of how often each signal inside the chip switches, and worst-case
process variation.
A program then applies a formula to each of the structures in the chip
design to see whether each meets the lifetime expectation with some
probability, well over 99%. The models used by these programs take more
and more effects into account with each generation. Foundries work very
hard to have good models, to understand which structures will age poorly,
and to provide design rules to avoid them. Sometimes they mess up, and
then they spend $100M's or more to fix it because their reputation is
valuable.
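
The exact formulas are foundry-proprietary, but the classic published
model for the electromigration part of such a check is Black's equation,
MTTF = A * J^-n * exp(Ea / kT). Here is a minimal sketch of that flavor
of check in Python; the constant A, the current-density exponent, and the
activation energy below are illustrative placeholders, not any foundry's
real numbers:

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

    def black_mttf_hours(j_a_per_cm2, temp_k,
                         a_const=1e10, n_exp=2.0, ea_ev=0.7):
        # Black's equation: MTTF = A * J^-n * exp(Ea / (k*T)).
        # a_const, n_exp, and ea_ev are illustrative placeholders; real
        # values come from the foundry's process characterization.
        return (a_const * j_a_per_cm2 ** -n_exp
                * math.exp(ea_ev / (BOLTZMANN_EV * temp_k)))

    # Check one wire against a 10-year target at a worst-case 105 C.
    TEN_YEARS_HOURS = 10 * 365 * 24
    print(black_mttf_hours(j_a_per_cm2=1e6, temp_k=378.15) > TEN_YEARS_HOURS)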
For server type applications, the design teams work very hard to be sure
that all of the estimates are *not* optimistic. More often, the estimates
are fairly pessimistic. The reason is that the probability needs to
be really high if you are putting a lot of chips into a box and want
the box itself to have a lifetime of 10 years. If a box has 100 chips,
each with only a 99% probability of a 10-year lifetime, then the
contribution from the chips alone would mean that only about 37% of the
boxes last for 10 years.
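
The arithmetic, for anyone who wants to play with the numbers (a quick
back-of-the-envelope in Python; survival probabilities multiply, so the
box number is just the per-chip number raised to the chip count):

    per_chip = 0.99        # P(one chip survives 10 years)
    chips_per_box = 100

    # P(every chip in the box survives 10 years): 0.99**100 ~ 0.366
    print(per_chip ** chips_per_box)

    # Per-chip reliability needed for 99% of boxes to survive:
    # the 100th root of 0.99, ~ 0.9999
    print(0.99 ** (1.0 / chips_per_box))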
Bringing this somewhat back on topic ... one of the worst factors in the
equation is the average operating temperature. The lifetime falls
roughly exponentially with temperature. If you can get your chips 10 degrees
cooler, you will greatly extend their life; make sure boxes have good fans
and that the fans are working well. I don't know what fudge factors and
design life were used for commercial chips from the '70s, but no matter what
the target was, keep it cool!
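
To put a rough number on that: wear-out is usually modeled with an
Arrhenius term, exp(Ea/kT), so the life ratio between two temperatures
is exp((Ea/k) * (1/T_cool - 1/T_hot)). A quick sketch, assuming an
activation energy of 0.7 eV (an illustrative value; published numbers
vary by failure mechanism), shows a 10-degree drop buying close to 2x
the life:

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

    def life_ratio(t_cool_c, t_hot_c, ea_ev=0.7):
        # Arrhenius acceleration factor between two temperatures.
        # ea_ev is an assumed activation energy, not a measured one.
        t_cool_k = t_cool_c + 273.15
        t_hot_k = t_hot_c + 273.15
        return math.exp(ea_ev / BOLTZMANN_EV * (1.0 / t_cool_k - 1.0 / t_hot_k))

    print(life_ratio(75.0, 85.0))  # ~1.9x life for 10 degrees cooler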
James Markevitch