Please ID this IBM system....
Patrick Finnegan
pat at vax11.net
Wed May 22 10:30:13 CDT 2019
On Wed, May 22, 2019 at 4:24 AM Chuck Guzis via cctalk <
cctalk at classiccmp.org> wrote:
> On 5/22/19 12:49 AM, Christian Corti via cctalk wrote:
> > On Tue, 21 May 2019, Patrick Finnegan wrote:
> >> Plumbing (unless you're doing aisle containment or RDHx) shouldn't run
> >> through the IT space in the data center.
> >
> > So how exactly do you attach a modern water cooled rack system to your
> > cooling water system if not using plumbing?
>
> So how are data centers cooled with water now? Does the water cool
> coldplates directly?
>
That's an option. I support 20-30kW/rack systems using Coolcentric
(passive) rear door heat exchangers, which have a dewpoint-adjusted cooling
loop. The air is generally managed using CRAC units / building air
handlers.
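
For the curious, "dewpoint-adjusted" just means the loop supply water is
held a bit above the room's dew point so nothing sweats. Here's a
back-of-the-envelope sketch in Python using the standard Magnus
approximation (the constants and the 24C / 50% RH example are
illustrative, not Coolcentric specs):

    import math

    def dew_point_c(temp_c, rel_humidity_pct):
        """Approximate dew point (deg C) via the Magnus formula."""
        a, b = 17.62, 243.12  # Magnus coefficients for water vapor
        gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
        return (b * gamma) / (a - gamma)

    # A 24C data center at 50% RH has a dew point near 12.9C, so the
    # loop supply stays a few degrees above that to avoid condensation.
    print(round(dew_point_c(24.0, 50.0), 1))
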
> I recall visiting the Honeywell plant in Phoenix not long after they
> took it over from GE and the engineers there were tinkering with a
> direct water-cooling setup--water circulated in each rack (connected by
> what was probably vinyl tubing; I don't recall exactly, only that it was
> translucent), with copper diaphragms serving as the interface between
> the water and the semiconductors. I recall from comments made that
> algae was a problem and adding an algicide to the cooling water tended
> to corrode the copper diaphragms.
>
New versions of that approach are made by companies such as CoolIT, or
HPE's SGI systems. The materials used have progressed quite a bit, mostly
eliminating the algae and corrosion problems, and people have mostly
settled on ~25-40C (77-104F) water for cooling, to stay above the dew
point and avoid condensation.
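
As a sanity check on those rack numbers, the required water flow falls
straight out of Q = m_dot * c_p * dT. A quick sketch (the 30 kW load and
10 K temperature rise are just example figures, not vendor specs):

    def coolant_flow_lpm(heat_kw, delta_t_k):
        """Water flow (L/min) needed to remove heat_kw at a delta_t_k rise."""
        cp = 4186.0  # J/(kg*K), specific heat of water
        kg_per_s = heat_kw * 1000.0 / (cp * delta_t_k)
        return kg_per_s * 60.0  # water is ~1 kg per litre

    # A 30 kW rack with a 10 K rise across the exchanger needs
    # roughly 43 L/min (about 11 US GPM).
    print(round(coolant_flow_lpm(30.0, 10.0), 1))
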
Pat