Beginning Liquid Nitrogen Overclocking

Computer CPUs are typically "air-cooled": heat conducts into the heatsink, and cool air blown across the fins carries it away. While this is cheap, quiet and does the job effectively, it can never cool the CPU below the ambient air temperature. In liquid nitrogen (LN2) cooling, LN2 (which boils at about -196°C) is poured onto a specialised heatsink to extract heat instead of blowing air.

I had the opportunity to try this out first hand at the Gigabyte OC Workshop this Monday with the TeamAU crew (dinos22, deanzo, and uncle fester), LN2 overclocking a Core i5 2500K on the Z68X-UD4, along with a GTX 470 and a GTX 580 SOC.

Jay and Justin overclocking

How does it work?

The reason LN2 cooling works so well is not just that we're pouring a really cold liquid onto the heat source; a large share of the heat energy is absorbed by the process of evaporation itself. To make sure the LN2 doesn't just boil off and splatter everywhere, there are custom-designed CPU "pots" that are essentially a heatsink base with a cup to hold the LN2 while it evaporates.
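
To get a feel for how much heat evaporation carries away, here's a back-of-the-envelope calculation. The nitrogen constants are standard physical properties, but the CPU power figure is my own guess for a heavily overvolted chip, not a number from the workshop:

```python
# Rough estimate of how fast a CPU boils off LN2.
LATENT_HEAT = 199e3   # J/kg, latent heat of vaporisation of nitrogen at its boiling point
DENSITY = 0.807       # kg/L, density of liquid nitrogen
CPU_POWER = 300.0     # W, assumed dissipation under an extreme overclock (my guess)

kg_per_second = CPU_POWER / LATENT_HEAT
litres_per_minute = kg_per_second * 60 / DENSITY
print(f"Boil-off: {litres_per_minute:.2f} L/min")   # roughly 0.11 L/min
```

So even an extreme overclock only boils off about a tenth of a litre per minute, which is why a hand-poured flask can keep up.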

I've been told that the colder the CPU gets, the faster it can run. (That, and the cold is also what saves the CPU from burning out under the insane core voltages being pumped into it.)

Why is sub-zero overclocking more difficult?

The biggest difference between conventional air cooling and cooling below room temperature is condensation: moisture from the air settles on the cold surfaces of the equipment. As we all know, water and electronics don't mix. That's why we see so many fans, hairdryers and mounds of paper towels in LN2 setups.

Insulating around the CPU
LN2 cooling in action

The CPU only sustains the higher overclock below a certain temperature threshold, so on the one hand we need to keep it cold enough, while on the other hand we can't drop the temperature too far or we get what's called a "cold bug" and the computer freezes – as in, locks up and stops working.
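
In other words, the overclocker's job is to hold the probe temperature inside a window. A trivial sketch of that logic is below; the threshold values are made up for illustration, since real cold-bug temperatures vary from chip to chip:

```python
COLD_BUG_C = -130.0   # below this the chip locks up (hypothetical value)
TARGET_C = -100.0     # above this the overclock is no longer stable (hypothetical value)

def next_action(probe_temp_c: float) -> str:
    """Decide whether to pour more LN2, let the pot warm up, or hold."""
    if probe_temp_c <= COLD_BUG_C:
        return "stop pouring, let the pot warm up"
    if probe_temp_c >= TARGET_C:
        return "pour more LN2"
    return "hold"

print(next_action(-95.0))   # -> "pour more LN2"
```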

Usually when the computer locks up, it's easy enough to hit the reset button to reboot and try again, but with subzero cooling there's also the "coldboot bug", where the computer will not start until it's warmed up above a certain temperature. I suspect the coldboot bug is due to arithmetic underflow on the temperature sensor falsely triggering the thermal protection circuit. The quickest way to fix this is apparently to put a butane torch to the LN2 pot…
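
To see why underflow could plausibly confuse a thermal circuit, here's a toy illustration. The 8-bit two's-complement register format is my assumption, not a documented sensor spec:

```python
def int8(value):
    """Interpret an integer as an 8-bit two's complement register (hypothetical sensor format)."""
    return ((value + 128) % 256) - 128

# -196 C is outside the int8 range of -128..127, so it wraps around:
print(int8(-196))   # -> 60, i.e. the sensor "reads" +60 C
```

A chip sitting at -196°C that reads back as +60°C would look like a thermal event, which would explain why protection kicks in until the pot is warmed up.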

Because temperature and voltage regulation are so critical, external temperature probes are used to get the most accurate readings from as close to the chip as possible. We can't rely on the built-in sensors because they're not designed to operate at extreme subzero temperatures, and many motherboards simply can't report temperatures in that range.

What are the risks?

As mentioned before, the biggest risk in going sub-zero is condensation building up wherever the hardware is colder than ambient. There's a good amount of preparation work to be done before anything is plugged in, including insulating exposed circuitry on the motherboard against moisture and the cold, as well as making sure the cooling pot and temperature probes are mounted correctly.

Even when all precautions are taken, things can still go wrong. We were shown a dead $600 GTX580 SOC with a resistor gone kaput.

The Gigabyte OC Workshop

The workshop was an incredibly fun experience, and a rare opportunity to have professional overclockers share their insights and experience, along with an excellent hands-on tutorial on LN2 overclocking.

It's a shame we didn't break any records on our first try, but I did walk away with a Gigabyte X58A-OC motherboard for guessing closest to the top 3DMark11 score. The runner-up (Justin) walked away with a Gigabyte Z68X-UD4-B3 motherboard.

Final words

A big thanks to PC PowerPlay and Gigabyte for hosting this awesome event. It's incredibly difficult to even have a go at LN2 overclocking on your own because it's just a different beast. Having the pros reach out and teach us how it's done takes away the stress of not knowing what's right, and no doubt saves a bunch of hardware from an early death.

Let’s do this again!

Deanzo tops up the flask of LN2

Chinese New Year Twilight Parade 2011

6 February 2011
George Street, Sydney, Australia

IMG_4426.jpg

Craig Mundie talks NUI and Avatar Kinects with Dean of Engineering and IT at the University of Sydney

More Like Us: Computing Transformed

On Tuesday, as part of the Dean's Lecture Series, Microsoft's Chief Research and Strategy Officer Craig Mundie demonstrated a virtual-persona-based teleconferencing technology known as Avatar Kinect, which was announced at CES 2011 earlier this year.

More Like Us: Computing Transformed

This public demonstration was the second in Australia, after the Melbourne event a day before, and was attended by over 300 people in Sydney, a hundred more than the first.

The lecture focused on next-generation human-computer interaction through "Natural User Interfaces" (NUI), set to expand the possibilities of sterile computer control in the operating theatre, remote virtual gaming and virtual receptionists. Natural user interfaces are set to remove the learning curve of today's user interfaces by taking advantage of metaphors from the physical world.

More Like Us: Computing Transformed

Craig explained that just as the telephone gave way to television, the next step is telepresence. I think the use of an avatar is a great step forward, especially since many of today's users are already comfortable taking on an online persona through the many video games available today.

Microsoft Research has been rather active in NUI development, and much of its work has been explained in quite simple terms on MSDN's Channel 9 since TechFest 2010.

Grey dead pixel lines on iPod Touch screen

A year and a half into my first Apple product, the iPod touch 2G, I was hit with a screen defect that surfaced just after the product warranty expired. *thumbs up to Apple product quality*

So what exactly is the problem? The screen develops a number of permanently grey pixels: no matter what colour those pixels are supposed to display, they always show up as grey.

Grey line of pixels on iPod

No, it's not a cracked screen – the glass touch input panel is fine, and the LCD isn't cracked either; it's just dead pixels. What's worse is that those dead pixels are somewhat contagious: the grey lines extend until they reach the edge of the screen, killing off more and more pixels over time.

Grey pixels on white background
Grey pixels on black background

Now if you're unfortunate enough to have the same product defect as I did, the obvious question is, "How can we fix it?" I asked the Apple store, and they were willing to fix it for me – for about AUD $200. Why would I pay Apple that much to fix it, when for nearly the same money I could get a brand new iPod Touch?

There are sites out there that claim to fix iPhone dead pixels, but they certainly won't fix this particular problem. You see, it's a physical degradation, visible under bright sunlight even when the device is powered off.

IMG_9212

If I'm not willing to pay 4/5 of the price of a new one to fix an old iPod Touch, what are my options? Throw it away? Nope. Here's my plan: use it with the defect until it becomes annoying enough, then buy a replacement screen on eBay for about AUD $15 (with free shipping) and fix it myself. Hah – take that, consumerism!

Software Radio for Satellite TV on your Computer

Look at your TV today. It receives analog free-to-air TV signals using built-in receiver circuitry. If your TV is capable of receiving digital channels, great – but did you know that it actually uses a separate set of receiver circuitry to make that happen? That's why those with an older TV set need a set-top box to get the digital channels. Wouldn't it be great if we could use one set of hardware to receive every channel out there, be it analog TV, digital terrestrial TV, satellite TV, or even TV standards yet to be developed? Well, that's what software radio is here for.

Test setup connection

Throughout this year, I was busily working on my undergraduate thesis project, with the goal of developing a software-based transmission system for satellite TV. In particular, I wanted to implement the European standard DVB-S on a general-purpose computer, using the free software radio framework GNU Radio and a generic radio device, the USRP from Ettus Research.
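
For the curious, here's a rough idea of what the synchronisation front end of such a receive chain can look like in GNU Radio's Python API. This is a minimal sketch, not my thesis code: the file name, rates and loop bandwidths are placeholder assumptions, and the block names follow the current gnuradio.digital namespace (older releases used different names).

```python
from gnuradio import gr, blocks, digital
from gnuradio.filter import firdes

SAMP_RATE = 8e6                 # complex sample rate of the capture (assumed)
SYMBOL_RATE = 4e6               # transponder symbol rate (assumed)
SPS = SAMP_RATE / SYMBOL_RATE   # samples per symbol
ROLLOFF = 0.35                  # DVB-S root-raised-cosine roll-off

class DVBSFrontEnd(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self, "dvbs_front_end")
        # Baseband samples previously recorded from the USRP (placeholder file name)
        src = blocks.file_source(gr.sizeof_gr_complex, "capture.cfile", False)
        # Frequency-locked loop: pulls the coarse carrier frequency offset to zero
        fll = digital.fll_band_edge_cc(SPS, ROLLOFF, 45, 0.01)
        # Polyphase clock sync: recovers symbol timing against an RRC matched filter
        nfilts = 32
        taps = firdes.root_raised_cosine(
            nfilts, nfilts * SPS, 1.0, ROLLOFF, int(11 * SPS * nfilts))
        timing = digital.pfb_clock_sync_ccf(SPS, 0.05, taps, nfilts)
        # Costas loop: fine carrier phase recovery for QPSK (order 4)
        costas = digital.costas_loop_cc(0.01, 4)
        # Viterbi decoding and MPEG-TS deframing would follow here
        sink = blocks.null_sink(gr.sizeof_gr_complex)
        self.connect(src, fll, timing, costas, sink)

if __name__ == "__main__":
    DVBSFrontEnd().run()
```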

How did I go? It worked! I was able to correctly decode the captured signal from the satellite and recover an MPEG-2 transport stream that plays in MPlayer, but unfortunately that's not the end of the story. Ideally we'd like to receive the satellite signal and decode it in real time, but our processing speed isn't quite there yet. The performance is summarised in the figure below:

Results: normalised throughput

In this chart, we've taken the throughput of each signal processing block in the receive chain and normalised it so that a value of 1 means the block can just keep up in real time, assuming every block runs in parallel. A value below 1 means the block is slower than real time, while values above 1 mean it's more than fast enough. Looking at the proportion of CPU time spent in decoding, only three blocks take up most of the time: the Viterbi decoder, the M-PSK receiver for symbol timing recovery, and the frequency-correcting frequency-locked loop:

relative_duration
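
To make the normalisation concrete, here's the metric spelled out as a toy calculation. The required rate and the per-block throughputs below are invented placeholders for illustration, not my measured results:

```python
# Each block's measured throughput is divided by the rate it must sustain
# for real-time operation, so 1.0 means "just keeps up".
required_rate = 8e6  # samples/s needed for real time (assumed)

measured = {
    "viterbi_decoder": 2.1e6,     # invented figures, for illustration only
    "mpsk_receiver": 3.4e6,
    "freq_locked_loop": 5.5e6,
}

for name, rate in measured.items():
    normalised = rate / required_rate
    status = "real-time capable" if normalised >= 1.0 else "too slow"
    print(f"{name}: {normalised:.2f} ({status})")
```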

So what we need to do now is improve the efficiency and throughput of each of those blocks; then we should be well on our way to real-time satellite TV decoding on a completely generic and reconfigurable software radio on the computer!

More details are in my treatise, which can be found under my blog's Sydney Uni page.
