On 2018-01-27 8:16 PM, Murray McCullough via cctalk wrote:
> Three "computing" events are happening:
> 1) The iPad ...
> 2) Bitcoin mining -- The energy usage is extreme because of GPUs. Was
> the 16-bit computer era, employing the 80287, such an energy hog?
It's extreme because proof-of-work is required, and so the more machines
you have doing the work (= the more energy expended), the quicker you
can mint fantasy money. (Using GPUs actually does much _more_ work per
watt than other means, so they can't be blamed for this waste.)
We usually try to _minimise_ the work in everything else we do with
computers... and designs requiring proof-of-work don't seem sustainable.
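To make that concrete, here's a toy proof-of-work loop in Python (a
minimal sketch, not Bitcoin's real scheme, which double-SHA-256es an
80-byte block header against a full 256-bit target; the names below
are made up for illustration). The only way to satisfy the check is
brute force, so hash attempts -- and therefore energy -- are the whole
game:

import hashlib

def mine(data: bytes, difficulty: int) -> int:
    """Find a nonce such that SHA-256(data || nonce) starts with
    `difficulty` zero hex digits. Expected tries: 16**difficulty."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print("found nonce:", mine(b"block payload", 4))

And since the network periodically raises the difficulty to hold block
times roughly constant, adding more hardware doesn't mint coins any
faster overall -- it just raises the total energy burned to stay
competitive.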
> 3) Intel doesn't seem to have been hurt by Meltdown & Spectre,
> financially speaking.
Its CEO sure wasn't, personally.
--Toby
> Had excellent earnings and profits for the last quarter, but that
> "may" change this quarter. However, Intel marches on, going from the
> 4004 to the 7980XE. AMD was/is in the picture, but where does it
> stand financially?
> Happy computing all!
> Murray J