
The 2008 PC Builder's Bible

Find the best parts. Learn to build a rig from scratch and overclock it
to kingdom come. PC Gamer shows you how

Getting your hands dirty and building your own system is what separates PC gamers
from their console brethren, and it just so happens to be one of the most exciting parts
of our hobby. Unfortunately, it’s also a pretty daunting process for anyone who hasn’t
assembled his own rig from scratch. Thankfully, this guide not only gives you all you
need to know about every component that goes in your gaming PC, but also thoroughly
walks you through the entire building process with detailed instructions and helpful
photographs.

We’ve always written the PC Builder’s Bible not only as a guide for new system builders
who want more versatility from their computer, but also for hardcore enthusiasts who
have to be on the cutting edge of technological innovation. And guess what, that
includes us as well. Every piece of hardware you’ll find recommended in this book is
something we would buy for ourselves. The specs of our custom rigs are actually the
same as the machines we’ve built for ourselves at home. That’s because we’re just like
you; we want the most bang for our proverbial buck. And with more money saved from
building a lean super-rig, you’ll have more money to spend on the awesome new games
to play on it!
MOTHERBOARDS

Wrap your head around the various motherboard chipsets that create
the backbone of your gaming PC.
- Meet the latest motherboard technologies
- Buy the right motherboard in six easy steps
- Motherboards for Intel CPUs – LGA 775
- Motherboards for AMD CPUs – Socket AM2

CPUS
Dual core or quad core? Intel’s Penryn or AMD’s Phenom? We give you
the answers!
- Intel’s Penryn and three recommended Intel CPUs
- AMD’s Phenom and three recommended AMD CPUs

RAM
DDR2 or DDR3? Find out how your random access memory works with
answers to frequently asked questions.
- Choose the RAM that’s right for you

VIDEOCARDS
Behind every great gaming PC is a great video card. Follow our guide
when deciding your next GPU purchase.
- Videocard features to look for
- SLI and Crossfire analyzed
- Is DirectX 10 worth it?
- The best mid-range DirectX 10 cards
HARD DRIVES
The hard disk is a paradox – it’s both tiny and enormous at the same
time. We’ll help you wrap your head around the terabytes of data.
- Easy answers to common questions about your hard drive
- PC Gamer’s hard drive picks

OPTICAL DRIVES
CDs, DVDs, dual layer, Blu-ray… the optical drive scene is evolving at breakneck speed!
- Optical drives in a nutshell
- PC Gamer’s optical drive picks

SOUNDCARDS AND SPEAKERS


Nobody enjoys the sound of silence. Learn all there is to hear about the
latest audio technologies, and you’ll soon be basking in true surround
sound earphoria.
- Meet the latest technology for audiophiles
- Your speakers are sick. PC Gamer has the cure

CASES
We guide you through the ins and outs of a PC’s metal frame and
review two excellent high-class enclosures.
- Give your components a happy home
- PC Gamer’s recommended cases

MONITORS
In the case of PC gaming displays, screen size and display resolution
matter. From brilliant 24-inchers to dominating 30-inch high-definition monitors, you’ll
never see games the same way again. We tell you what to consider.
- Pick the perfect display
- Three recommended monitors

PERIPHERALS
If you’re using a standard mouse and keyboard, you’ll never really have
a chance in the gaming world. If you’re serious about gaming, get some
serious gear.
- The best gaming keyboards
- The best gaming mice
- The best gaming accessories

Everything you need to know to build a smoking-fast, no-compromises gaming PC


- Prices, parts, and lots of pictures to show you how it’s done

Learn how to wring every last drop of performance from your new rig

The importance of a good motherboard can’t be overemphasized. Every byte of data
your computer processes must pass between several components before it reaches
you, and the motherboard is the highway. The last thing you want is a metaphorical
traffic jam between CPU, RAM, and videocard when you’re trying to frag.

Should your next motherboard be BTX and support both SLI and DDR2? Don’t be
embarrassed if you don’t know the answer—our CliffsNotes primer on top-end mobo
technology will have you spouting geek-speak in less time than it takes to burn a DVD.

PCI EXPRESS
PCI Express has become a de facto motherboard standard seemingly overnight,
despite the fact it hasn’t demonstrated much of a performance boost over the older AGP
standard (at least not in single-card configurations). PCI-E joins the trend of moving
away from wide, slow interfaces with lots of pins to narrow, high-speed interfaces. It
increases the available bandwidth for graphics from AGP’s 2GB/s to a whopping 8GB/s.
But PCI-E’s real graphics promise lies in its upstream bandwidth throughput: 4GB/s
compared with AGP’s 133MB/s.
For add-in cards, the standard x1 PCI-E connectors offer about 300MB/s of
throughput—just about double that of a standard PCI slot. Considering the amount of
integration on today’s motherboards, however, few components really need to be
added. For this reason, we’ve not yet seen any real application for x1 cards; but that’s
likely to change as soon as software developers create applications that take advantage
of PCI-E.
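
If you’re curious where those bandwidth figures come from, here’s a rough back-of-the-envelope sketch in Python—purely illustrative—assuming first-generation PCI-E’s 2.5Gb/s-per-lane signaling and its 8b/10b encoding overhead:

# First-gen PCI Express signals at 2.5Gb/s per lane, per direction; 8b/10b
# encoding means only 8 of every 10 bits on the wire are actual payload.
lane_raw_gbps = 2.5
lane_usable_mbps = lane_raw_gbps * 1e9 * 0.8 / 8 / 1e6   # ~250MB/s per lane, each way

x16_per_direction = lane_usable_mbps * 16                # ~4GB/s upstream (or downstream)
x16_aggregate = x16_per_direction * 2                    # ~8GB/s total for a graphics slot

print(f"x1:  ~{lane_usable_mbps:.0f}MB/s per direction (the raw, pre-encoding rate is closer to the ~300MB/s quoted above)")
print(f"x16: ~{x16_per_direction/1000:.0f}GB/s per direction, ~{x16_aggregate/1000:.0f}GB/s aggregate")
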
BTX FORMFACTOR
The BTX motherboard formfactor moves the processor to the front of the case,
relocates the chipset to deliver higher I/O speed, and provides better component
cooling. Despite these advantages, BTX has been greeted with about as much
enthusiasm as turd casserole at a pot-luck. Much of the resistance springs from chassis
manufacturers, who are reluctant to spend $50K to retool their assembly lines. AMD,
meanwhile, has publicly stated it won’t embrace the standard unless customers demand
it. While we think BTX is a smart design improvement, it’s pretty much dead in the
water. You can safely stick with the tried and true ATX formfactor until the next
challenger comes along.
ATX 12V 2.0
PCI Express graphics cards can suck up to 75 watts of power, compared with AGP's
50-watt maximum. ATX 12v 2.01-compliant power supplies feature a 24-pin connector
that jacks into new PCI Express-capable motherboards. The good news is that you don't
necessarily have to buy a new PSU to run your new 24-pin mobo. Many motherboards
with a 24-pin connector are keyed to accept an older 20-pin PSU; the extra four pins are
simply left vacant. To make up for the lack of power, some new motherboards allow you
to supplement the mobo's main power by plugging in a second, four-pin connector.
NCQ and SATA 3Gb
SATA 3Gb is a pretty simple concept: Take SATA’s maximum transfer rate of 150MB/s,
double it to 300MB/s, and you get SATA 3Gb. Today’s hard drives don’t need the
throughput, but there’s no reason not to have it on a new motherboard. Native
command queuing is probably more important. NCQ enables a hard drive and its
controller to intelligently reorder data requests, so the combo can scoop up and write
data faster. Although we’ve seen only small performance boosts from NCQ so far, it’s a
good idea to have it on whatever motherboard you choose.
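
The reason 3Gb/s works out to 300MB/s rather than 375MB/s is the same 8b/10b encoding that PCI Express uses; a quick sketch of the math, just for illustration:

# 10 bits travel on the wire for every 8 bits of data, so usable throughput is
# the line rate times 8/10, divided by 8 bits per byte.
line_rate_gbps = 3.0
payload_mbps = line_rate_gbps * 1e9 * (8 / 10) / 8 / 1e6
print(f"SATA 3Gb/s -> ~{payload_mbps:.0f}MB/s usable")   # ~300MB/s; the original 1.5Gb/s spec gives ~150MB/s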

High-Definition Audio
High-Definition Audio bumps maximum audio resolution from AC-97’s 20 bits up to 32
bits, while sampling rates are boosted from AC-97’s 48kHz max up to 192kHz. HD
Audio supports up to eight analog channels, where AC-97 supported only six. PCs
outfitted with HD Audio will also support a host of Dolby technologies, including Dolby
Headphone, Dolby Virtual Speaker, Dolby Digital Live, and Dolby Pro Logic IIx. Dolby
Pro Logic IIx might be the most interesting. This technology can expand a stereo or 5.1-
channel audio stream—including game audio—into 6.1 or even 7.1 channels in real
time.
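
For a sense of scale, here’s the raw bandwidth that HD Audio’s maximum format implies, worked out from the numbers above (a quick sketch, not a spec requirement):

# Eight channels of 32-bit samples at 192kHz, versus AC-97's six channels of
# 20-bit samples at 48kHz.
hd_bits = 8 * 32 * 192000        # ~49.2 Mbit/s
ac97_bits = 6 * 20 * 48000       # ~5.8 Mbit/s
print(f"HD Audio max: {hd_bits/1e6:.1f} Mbit/s (~{hd_bits/8/2**20:.1f} MB/s) of uncompressed PCM")
print(f"AC-97 max:    {ac97_bits/1e6:.1f} Mbit/s")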

So what’s the catch? Most audio experts we’ve talked to contend that it will be all but
impossible for HD Audio to match the fidelity of even a three-year-old PCI soundcard
because of all the electrical noise motherboards generate.
Before you buy a motherboard, you must first decide if you’re going to recycle your old
CPU or upgrade to something new. If you’re keeping your old proc, make sure it will
work with your new mobo. If you’re going new, will it be one of AMD’s Quad FX uber chips,
Intel’s blazing Core 2 Extreme QX6800 with a whopping 8MB of onboard cache, or
something in between?

Choosing a core-logic chipset is as important as your CPU choice. Intel, NVIDIA, and
VIA all make excellent chipsets for mid- and high-end processors.

Now it’s time to decide which features you want on your mobo. In the old days (well, if
you consider 1999 the old days), motherboards were about as stripped as a Chevy
Impala left parked on a Bronx side street. These days, motherboards come with
everything you need, save a videocard, CPU, and RAM. What are you looking for? Dual
Gigabit Ethernet? HD Audio? Enough SATA ports to feed a rack of hard drives? Make
your list.

Once you find a motherboard that tickles your fancy, read the owner’s
manual before you plop down your dough. Most motherboard vendors offer their
manuals as free downloadable PDFs on their websites. The manual will reveal any of
the board’s limitations (such as the types of memory and CPUs it supports), and it will
let you know if a PSU upgrade is necessary.

If the motherboard has been out for a few months, visit the forums on the manufacturer’s website and see what
buyers are saying. But remember to keep everything in perspective: People don’t go to
the forums to wax poetic about their AM2 board, they go there to bitch. It’s all but
impossible to determine if the person complaining is a fried customer or one of the
manufacturer’s competitors looking to sow fear, uncertainty, and doubt. Always take
forum comments with a grain of salt, but if you see a pattern emerging, it could be a
warning sign.

It’s not at all uncommon for motherboard manufacturers to revise their designs without
going so far as to introduce an entirely new model. Newer revisions are almost always
better than older boards, so try to purchase the latest version of the motherboard that’s
available. You’ll find the rev numbers silk-screened on the board.

nForce 780i SLI 775 A1


Manufacturer: EVGA
Chipset: NVIDIA nForce 780i
CPU Support: Intel Pentium, Pentium EE, Core 2 Duo, Core 2 Quad, Core 2 Extreme
Memory Support: DDR2 533/800/1066/1333 MHz
PCI Slots: 3 PCIe x16, 1 PCIe x1, 2 PCI
Notable features: Triple SLI support, 6 SATA ports with support for RAID 0, RAID 0+1,
RAID 5; integrated 7.1 channel audio, 10 USB 2.0 ports (6 external, 4 internal), 2
Firewire ports (1 external, 1 internal)
Price: $260
www.evga.com

Rampage Formula
Manufacturer: ASUS
Chipset: Intel X48
CPU Support: Intel Pentium, Pentium EE, Core 2 Duo, Core 2 Quad, Core 2 Extreme
Memory Support: DDR2 667/800/1066/1200 MHz
PCI Slots: 2 PCIe x16, 3 PCIe x1, 2 PCI
Notable features: CrossFire support, 6 SATA 3Gb/s ports, Dual Gigabit LAN
controllers, SupremeFX II Audio Card, 12 USB ports (6 external, 6 internal), 2 Firewire
ports (1 external, 1 internal), External LCD post device
Price: $300
www.asus.com
Striker II Extreme
Manufacturer: ASUS
Chipset: NVIDIA nForce 790i Ultra SLI
CPU Support: Intel Pentium, Pentium EE, Core 2 Duo, Core 2 Quad, Core 2 Extreme
Memory Support: DDR3 800/1066/1333/1600 MHz
PCI Slots: 2 PCIe 2.0 x16, 1 PCIe x16, 2 PCIe x1, 2 PCI
Notable features: Triple SLI support, 6 SATA 3Gb/s ports, Dual Gigabit LAN controllers,
SupremeFX II Audio Card, 10 USB ports (6 external, 4 internal), 2 Firewire ports (1
external, 1 internal), External LCD post device
Price: $300
www.asus.com
Crosshair II Formula
Manufacturer: ASUS
Chipset: NVIDIA nForce 780a SLI
CPU Support: AMD Socket AM2+ Phenom FX, Phenom X4, Phenom X2, Athlon X2,
Athlon 64, Sempron
Memory Support: DDR2 667/800/1066MHz
PCI Slots: 3 PCIe 2.0 x16, 2 PCIe x1, 2 PCI
Notable features: Integrated graphics w/ 512MB shared memory, Hybrid SLI support, 6
SATA ports, SupremeFX II Audio Card, Dual Gigabit LAN, 12 USB ports (6 external, 6
internal), 2 Firewire ports (1 external, 1 internal), External LCD post device
Price: $300
www.asus.com
M3A32-MVP
Manufacturer: ASUS
Chipset: 790FX
CPU Support: AMD Socket AM2+ Phenom FX, Phenom X4, Phenom X2, Athlon X2,
Athlon 64, Sempron
Memory Support: DDR2 533/667/800/1066 MHz
PCI Slots: 4 PCIe 2.0 x16, 2 PCI
Notable features: CrossFire support, 6 SATA 3Gb/s ports, 1 eSATA, Gigabit LAN, 8
channel HD audio, 10 USB ports (6 external, 4 internal), 2 Firewire ports (1 external, 1
internal)
Price: $210
www.asus.com
M2R32-MVP
Manufacturer: ASUS
Chipset: 580X
CPU Support: Athlon 64 X2, Athlon 64 FX, Athlon 64, Sempron
Memory Support: DDR2 533/667/800 MHz
PCI Slots: 2 PCIe X16, 2 PCIe X1, 2 PCI
Notable features: CrossFire support, 8-channel audio card, Gigabit LAN, 4 SATA ports,
10 USB 2.0 ports, 2 Firewire ports
Price: $110
www.asus.com
CPUs are tricky beasts. There was a time when simply looking at the number of
megahertz on a chip was a surefire indication of how well it would perform, but sadly
that just isn’t the case any longer. With Intel and AMD at each other’s throats for the
biggest piece of the market, their approaches to the technology have taken different
paths. Considering the new 64-bit and quad-core chips available now, does clock speed
even mean anything?

The question of whether or not to upgrade to a multi-core processor has long been
settled—you definitely want the advantages a multi-core proc gives you. You get a
payoff right away with extra breathing room for background applications to run without
dragging your gaming performance down to a crawl, and then you’ll get another payoff
in the future, as more and more games optimized for multi-core processors (like
Company of Heroes: Opposing Fronts and Crysis) hit the shelves. And with quad-core
procs from Intel hovering at around the same price as slightly higher-clocked dual-core
chips, going quad core is a little bit of future-proofing that you can’t afford to miss.

Of course, the Intel-versus-AMD debate rages on. AMD has rolled out its “Phenom”
series of multi-core processors, along with an announcement that it’s bringing “true
quad-core” to the desktop (with the intention of scaring the crap out of potential Intel
shoppers). What that refers to is that quad-core Phenom processors have four
individual cores on a single piece of silicon, whereas Intel’s quad-core procs are really
two dual-core procs stuck together. Does this fundamental design difference matter?
We’ll give you the answers.
Q: What exactly is a Penryn?
A: Penryn is the “family” name for Intel’s follow-up to its 65nm Core 2-lineage CPUs.
For consumers, Wolfdale will be the dual-core Penryn, Yorkfield will be the quad-core
version, and Harpertown will be the quad-core Xeon workstation CPU.

The big enhancement is the process shrink from 65nm to 45nm. Intel calls its move to a
45nm process the “biggest change to computer chips in 40 years.” Intel’s tendency
toward self-aggrandizement aside, the 45nm process is a significant jump forward,
allowing twice as many transistors to fit in the space of a 65nm chip. The 45nm process
also uses high-k gate dielectrics. Not to be confused with L. Ron Hubbard’s Dianetics,
the high-k gate, which uses hafnium oxide, replaces the silicon dioxide gate that’s been in use
since the 1960s. The new transistor leaks less energy, produces less heat, and switches
about 20 percent faster than a silicon dioxide transistor. This boils down to
smaller, faster, more power-efficient CPU cores. How much smaller? The previous Core
2 Extreme quad cores packed 582 million transistors within a space of 286mm2. The
Yorkfield quad core packs 820 million transistors into 214mm2.
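
Running the numbers from that paragraph shows the density gain (a quick sanity check, not official Intel math):

# Transistors per square millimeter, 65nm quad versus 45nm Yorkfield.
density_65nm = 582e6 / 286     # ~2.0 million transistors/mm^2
density_45nm = 820e6 / 214     # ~3.8 million transistors/mm^2
print(f"65nm: {density_65nm/1e6:.1f}M/mm^2, 45nm: {density_45nm/1e6:.1f}M/mm^2, "
      f"ratio: {density_45nm/density_65nm:.1f}x")   # ~1.9x -- roughly the promised doubling
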
Q: So what else is new under the hood?
A: Penryn is more than a simple die shrink. The new CPUs are based on the Core 2
microarchitecture with a few tweaks that Intel hopes will keep it ahead of AMD. The
headliner of these tweaks is the new SSE4 instruction set designed for media encoding
and high-performance computing. Also new is a Super Shuffle Engine, which increases
the speed of many SSE media-encoding instructions by doubling the processing units
from 64-bit to 128-bit.

Penryn also includes a new Fast Radix-16 Divider that pretty much doubles the division
math speed. Intel also reportedly boosted virtual machine performance by as much as
25 to 75 percent. And Intel added a new feature called Dynamic Acceleration
Technology that essentially overclocks one of the cores when the others are sleeping.

The new chip also makes use of all the physical space freed up by the die shrink.
(Imagine if all the stuff in your garage shrank by 50 percent!) That’s what accounts for
the beefed-up L2 cache, which at 6MB per dual-core die (12MB total on the quad-core
Yorkfield) is a 50 percent increase over the L2 in 65nm quad cores. The larger L2 cache
helps in numerous ways, but its biggest
contribution is in ameliorating the potential performance hit caused by the ancient
shared front-side bus architecture Intel uses for communication between cores. To keep
the front-side bus from bogging down, the large and very efficient L2 cache ensures that
the CPU has ample data close at hand so it won’t be data starved. While Intel has
certainly proved that the FSB strategy is still workable, the company has stated it plans
to adopt an on-die memory controller in its next CPU.
Q: How significant is the new SSE4 instruction set?
A: Instruction sets in CPUs always garner the most attention but, sadly, are usually the
last feature to actually add performance benefits. While the Fast Radix-16 Divider and
the Super Shuffle Engine in Penryn will increase the performance on many existing
applications, the 47 new instructions in SSE4 will not give you any performance boost
until applications directly
support them. SSE4’s main claim to fame will be in media encoding and high
performance computing (i.e., supercomputers). In fact, Intel’s demonstrations of SSE4-
enabled encoders showed incredible performance boosts.
However, those demonstrations have been called into question, with skeptics
suggesting that while the alpha build of DivX used for the proof-of-concept benchmarks
is faster with SSE4, it’s not a realistic scenario. One developer we spoke with told us:
“The applicability of SSE4 for our codecs seems rather limited and the expected gain
seems rather small (I expect no more than a 1- to 2-percent speed gain with SSE4)
compared to the speed increment we got from SSE on pre-Core 2 Duo and SSE2 on
Core 2 Duo. The SSE4-instructions that are often advertised as being especially
targeted for video encoding are useless for us, since those instructions are only
applicable for exhaustive search algorithm (ESA), which we don’t use because of its
inherent inefficiency.”
Q: Is Penryn faster than the current Core 2 quad cores?
A: We don’t want to give away the punch line but, generally, an equivalent Penryn runs
up to 14 percent faster when compared clock-for-clock with the current Core 2 quads.
The exact speed increase depends on the benchmark. In some, you’ll see no change in
performance; in others, a healthy increase is possible. But remember, Penryn isn’t the
big leap forward. Intel’s CPU schedule dictates a little jump one year and then a big
jump the next year. This is the little jump. Intel hopes to make a big jump when it
introduces its Nehalem CPU in late 2008.
Q: Will Penryn work in my motherboard?
A: Long-time Intel lovers have been vexed by this for years, as the company’s been in
the habit of invalidating perfectly good motherboards by requiring new or updated
chipsets to run its latest CPUs. Want a 1,066MHz P4 on a 925X mobo? Sorry, you need
a 925XE. Pentium D on a 925XE? Nope, you need a 955X chipset. Pentium 955 EE on
a 955X? Guess again: 975X.
Fortunately, Intel has gotten a little better in this area, and there is a very good chance
that a QX9650 will work in many existing motherboards. Certainly motherboards that
use Intel’s P35 and X38 chipsets will support the new CPU (although a BIOS update
might be required). Some Intel 965 and 975X boards might also work with the new CPU
and we understand that the majority of 680i boards will be compatible. To be safe,
however, before you buy any board/CPU
combination, check the manufacturer’s website to see what processors it has validated
with the design. Just because the Yorkfield and Wolfdale are LGA775 doesn’t mean
they’ll work in the board of your fancy.

Above: Intel’s 45nm die shrink allows engineers to pack nearly twice the number
of transistors into the same space as a 65nm CPU
Three recommended Intel CPUs

QX9650 Core 2 Extreme


Yorkfield 3.0GHz 12MB L2 Cache LGA 775
Quad-Core Processor
$1050

Q9300 Core 2 Quad


Yorkfield 2.5GHz 6MB L2 Cache LGA 775
Quad-Core Processor
$270
E8400 Core 2 Duo
Wolfdale 3.0GHz 6MB L2 Cache LGA 775
Dual-Core Processor
$200

Q: How do you pronounce Phenom?


A: It’s fee-nom, not fuh-nom.
Q: What advances does Phenom offer?
A: Phenom is AMD’s first quad-core processor and is touted as a “true quad core.”
Based on a 65nm process, Phenom uses an enhanced version of the stellar K8 Athlon
64 core, which features many of the same “wider and faster” techniques as Intel’s Core
2 Duo. Improvements over the Athlon 64 include the ability to execute SSE instructions
in 128-bit chunks versus 64-bit. Cache speed gets a bump, as well, with L1 going from
16 bytes per cycle to 32 bytes per cycle, and L2 going from 64 bits per cycle to 128 bits.
AMD also spends silicon on increased floating-point performance; a few new
instructions; HyperTransport 3, which nearly quadruples the bandwidth over previous
implementations; and more L3 cache.
Q: What’s meant by “true quad core”?
A: Each Phenom features four execution cores on one single, contiguous die.
Architecturally, it’s far more elegant than Intel’s quad core, which fuses two dual-core
chips in a CPU and forces the dual-core islands to talk to each other over the front-side
bus. Phenom was designed from the get-go as a quad chip, and each core
communicates at HyperTransport 3 speeds—far faster than Intel’s front-side bus. All
the cores can also share data stored in the L3 cache, so a core
would have to reach out only to the L3 instead of the much slower system RAM in
certain applications. This adds up to a chip that, on paper, seems to at least equal—if
not exceed—Intel’s Core microarchitecture.
Q: Will Phenom work in my existing motherboard?
A: Phenom is designed as a Socket AM2/Socket AM2+ chip and should, therefore, drop
right into the majority of existing motherboards, provided the motherboard maker
updates the BIOS—and didn’t screw up on the board design.
Q: Does Phenom have the same RAM issues that DDR2 Athlon 64s did?
A: No. AMD corrected the issue that limited the DDR2 Athlon 64s to whole number
RAM divisors. This, in essence, would force DDR2/800 RAM to run at DDR2/766.
Phenom CPUs use a separate clock for the memory controller, so memory will run at its
intended speed. Consequently, however, the memory controller no longer runs at the
core’s speed. The memory controller on the 2.6GHz Athlon 64 FX-60 runs at 2.6GHz.
On the 2.6GHz Phenom 9900, the memory controller runs at 2GHz and notches down
to 1.8GHz for the 2.3GHz Phenom 9600. It’s not clear if or how this impacts memory
performance; it’s still a good clip faster than what the memory controller runs at in
competing Intel machines, where that part is located in the north bridge.
Q: How well does Phenom overclock?
A: It will vary from chip to chip, of course, but Phenom is not shaping up to be a great
overclocker today. We didn’t get very far with our engineering sample chip and few
other reviewers have either. And when you look at how the thermals ramp up for
relatively minor speed increases, it’s no wonder. Going from 2.3GHz to 2.4GHz takes
the thermals from 95 watts to 125 watts. Going from 2.4GHz to 2.6GHz jumps it up to
140 watts. Older AMD and many Intel enthusiast parts have high thermal ratings, but
only because they anticipate that users will overclock the hell out of them. We suspect
that the increased thermals for the two faster Phenom parts are more related to AMD’s
manufacturing issues at the fab.

Above: AMD’s “true quad core” jams all four cores onto a single 65nm, 285mm2
die
Q: What’s the deal with AMD’s tri core?
A: The tri core is being sold on the concept that if two is good and four is great, three is
a perfectly attractive middle option. AMD’s tri core is primarily aimed at people who
don’t want to pay for quad core but want some additional performance at a more
affordable price. The CPUs are, as you might suspect, dies that won’t pass muster as
quad cores but work fine with one core turned off. While some view this as selling
defective chips, AMD says it’s business as usual. In the past, if a portion of a CPU’s
1MB L2 was bad, it could be sold as a chip with 512KB or 128KB L2, with the offending
portion turned off. Like the higher-clocked Phenoms, the tri cores won’t be out until later
in the year—they will carry model designators of 7 instead of 9. Since they’re the same
chip as a quad core but with one core turned off, you can expect performance to fall in
between their quad- and dual-core brethren.
Q: Where does AMD go from here?
A: AMD’s next stop is 45nm, which it says will be online at the end of this year. There’s
likely to be a shrink of the Phenom core with some enhancements to get the
performance up, but AMD’s CPU code-named Bulldozer will be the next chip to truly
take on Intel. Bulldozer, which is due in 2009, will be a multicore design, but AMD hasn’t
revealed very many specifics. The problem for AMD is that Intel is expected to make
another jump forward with its chip code-named Nehalem, which will adopt AMD’s on-die
memory controller and chip-to-chip communication techniques and feature four cores
per die and an improved version of HyperThreading. With two quad-cores glued
together under the heat spreader, a Nehalem would have up to 16 cores (eight real,
eight virtual) available to the OS.

Above: AMD’s AM2 Socket


Three recommended AMD CPUs
Phenom X4 9850 Black Edition
Phenom 2.5GHz 4x 512KB L2 Cache 2MB
L3 Cache Socket AM2+
$240
Phenom X4 9550
Phenom 2.2GHz 4 x 512KB L2 Cache 2MB
L3 Cache Socket AM2+
$195
Phenom X3 8450
Phenom 2.1GHz 3x 512KB L2 Cache 2MB
L3 Cache Socket AM2+
$145

RAM stands for “random access memory.” Your computer uses RAM as a temporary
workspace. The CPU transfers data and applications from long-term storage devices
(your hard drive and optical drive) into RAM, then runs the programs and accesses data
from memory. New data is created within your system memory before it’s ever saved to
a storage device. Every byte of information used by a PC during its operation flows
through RAM on its way to or from an I/O device, the CPU, or a storage device. Access
to data in RAM is immediate: The CPU can read or write to any location in memory
without having to muddle through the adjoining data.

Most RAM used in PCs today is dynamic RAM, or DRAM. It’s called “dynamic” because
the memory chips must receive new electrical charges (a process known as memory
refreshing) thousands of times a second, or the data stored in the chips is lost. This is
why information saved only in RAM is lost as soon as your PC is restarted or turned off.
RAM and Paging Files

If a program or data file is too large to completely reside in RAM, PCs use dedicated
areas of the hard disk to store the overflow. This dedicated disk space is known as
“virtual memory.” The paging file (swapfile) in Windows is an example of virtual memory.
Windows uses the paging file as a holding tank for information being transferred in and
out of your system RAM. The less RAM you have, the more frequently your paging file
is used. Although a paging file enables a system with a relatively small amount of
memory to work with files that exceed the amount of available physical memory, using
the paging file instead of physical memory has a huge negative impact on performance.
Hard drives move data an order of magnitude slower than even the slowest RAM. This
means that the more memory you add to your system, the greater the number of
programs you can run, and your system can work with larger files before resorting to the
paging file. In an ideal situation, the paging file would never be used. In practical terms,
you want to install enough memory to handle the largest amount of work (or play) your
PC performs on a routine basis.
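
If you want to see how heavily your own machine leans on the paging file during a typical session, a few lines of Python will show you. This is only a minimal sketch and assumes the third-party psutil package is installed; it isn’t something this guide otherwise requires.

import psutil

ram = psutil.virtual_memory()    # physical memory stats
swap = psutil.swap_memory()      # paging-file (swap) stats

print(f"Physical RAM: {ram.used / 2**30:.1f}GB used of {ram.total / 2**30:.1f}GB ({ram.percent}%)")
print(f"Paging file:  {swap.used / 2**30:.1f}GB used of {swap.total / 2**30:.1f}GB ({swap.percent}%)")

# If the paging file sees heavy use under your normal workload, adding physical
# RAM will do more for responsiveness than almost any other upgrade.
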
Q: What about DDR3?
A: DDR2 (double data-rate 2) is the standard memory for all Intel and AMD desktop
computer systems today. However, we should see an expanded push for DDR3 RAM
this year. The new memory spec promises higher bandwidth but at the cost of higher
latencies. In late 2007, this compromise along with its higher prices made DDR3 seem
pretty irrelevant. But we have seen one promise from DDR3 – really high clock speeds.
DDR3 modules are already pushing 1,800MHz whereas DDR2 topped out at 1,066MHz.
As clock speeds increase, the latency becomes less of an issue. Combined with the
higher front-side bus speeds of Intel’s 45nm Penryn CPU, we think DDR3 is starting to
show some promise. With that said, you can’t lose with DDR2 and it’s pretty darned
cheap, too!
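
To see why clock speed eventually wins, compare peak theoretical bandwidth per 64-bit channel, which is simply the transfer rate times 8 bytes (rough numbers, for illustration only):

# Peak bandwidth per channel = transfers/second x 8 bytes per transfer.
for name, mts in (("DDR2-800", 800), ("DDR2-1066", 1066), ("DDR3-1600", 1600), ("DDR3-1800", 1800)):
    print(f"{name:>9}: {mts * 8 / 1000:.1f}GB/s per channel")
# DDR2-1066 tops out around 8.5GB/s; DDR3-1800 reaches ~14.4GB/s. And as the
# clock climbs, a given latency measured in cycles costs less real time.
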
Q: What is the significance of the numbers listed after the model number for a
module, such as 3-4-4-8?

A: The first number is the CAS latency (CL), which is the number of clock cycles
between the time a read command is sent and the data is available. The second
number is the tRCD (row address to column address delay), which is the number of
clock cycles between the active command and the read or write command. The third
number is the tRP (row precharge time), which is the number of clock cycles between a
precharge command and the active command. The fourth number is the tRAS (row
active time), the number of clock cycles between a bank active command and a bank
precharge command. The standard values for a memory module are stored in its SPD
(serial presence detect) chip, and are used by the BIOS when you select “By SPD” or
“Auto” for memory timings.
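
Here’s what a set of timings like 3-4-4-8 means in real time on DDR2-800, worked out in a few lines (DDR timings count cycles of the memory clock, which runs at half the data rate):

data_rate = 800                 # DDR2-800 moves 800 million transfers per second
clock_mhz = data_rate / 2       # but the memory clock itself runs at 400MHz
cycle_ns = 1000 / clock_mhz     # 2.5ns per cycle

for name, cycles in (("CL", 3), ("tRCD", 4), ("tRP", 4), ("tRAS", 8)):
    print(f"{name:>4}: {cycles} cycles = {cycles * cycle_ns:.2f}ns")

# CL 3 at DDR2-800 is 7.5ns of real latency; the same CL 3 on DDR2-533 would be
# 11.25ns, which is why timings only mean something alongside the module's speed.
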
Q: Can I change these values?

A: Most systems permit you to manipulate memory timings. Reducing the tRCD and
tRP values can improve memory performance, although you might need to increase the
CL value a bit to maintain stability.
Q: When I add memory to my system, what are the most important specs to look
for?

A: SPEED - (PC or PC2 rating). This should be the same or faster than your existing
memory.
SIZE - For a single-channel system, buy the largest module (in MB) you can afford. For
a dual-channel system, buy a matched set of modules providing the total size you need.
For example, two 1GB modules will run faster than a single 2GB module on a dual-channel
system. If you’re upgrading a laptop, you often have only one free memory slot, so fill it
with the biggest module available.
TIMINGS - If you’re a hardcore gamer, you’ll probably want to overclock your memory.
Look for low-latency memory, and remember to consider all the numbers, not just the
CAS latency.
Warning: If you’re serious about building a game machine, do not skimp on your
videocard! All the latest graphically intense games — like Supreme Commander, Unreal
Tournament III, and Crysis, just to name a few — look absolutely incredible with the
resolution, anti-aliasing, and detail settings cranked up to the max. This is truly the way
it was meant to be played.

But you won’t even get close to realizing this gaming dream without investing a serious
slice of your budget in a monster video card. There is no sight on Earth sadder to a
gamer’s eye than seeing a potentially beautiful game reduced to minimal graphics
settings and resolution, and still chugging along with a low frame rate. Don’t let this
happen to you.

The graphics card is the single biggest factor (though not the only factor) in determining
how fast your computer will be able to run the latest frag fest or grand strategy game.
Choosing last year’s card will earn you some pretty chunky frame rates, and that simply
won’t do.

Although there used to be two separate types of videocards — 2D cards for desktop
work and 3D cards for games — today’s videocards do everything in one sexy silicon
package. And over the years, as games have become increasingly complex and more
lifelike, videocard development has accelerated, rapidly bringing Finding Nemo-quality
graphics on your desktop closer and closer to reality. While that day is still a ways out,
modern videocards are technological wonders that are just as complex (and just as
expensive, unfortunately) as some high-end CPUs.
Above: NVIDIA’s 8800GT

The consumer videocard market is currently dominated by just two companies: ATI and
NVIDIA. Today, DirectX 10 cards like NVIDIA’s 9800 line pretty much trounce the
competition in performance, though they face tough competition from ATI’s brand-new
cards based on its RV670 chip. And with NVIDIA’s SLI or ATI’s Crossfire technology, which allows
you to run two high-powered cards in tandem for a huge bump in performance, game
graphics are experiencing an unprecedented boost in hardware power.

With that in mind, this may be the most important section of this article. We’ll help you
find the right card, starting by answering some frequently asked questions.
Q: Are onboard graphics really that bad? Are integrated graphics any good? Can
they run a game like Assassin’s Creed?
A: Integrated graphics—that is, graphics that are built directly into a motherboard—are
designed to provide minimal 3D performance in exchange for reduced cost. They’re not
designed for gaming, but rather simple 2D desktop work. As such, anyone serious
about gaming should never consider using integrated graphics.
Q: If I buy a top-of-the-line videocard today, how long will it be a viable solution
for good gaming?
A: In general, a high-end videocard should be extremely capable for at least a year, and
probably longer depending on what kind of frame rates you demand and the kind of
high-end features you’d like to be able to enable. There are games out today, for
example, that run just fine on 3-year-old cards, but that’s typically because the 3D
engine used in those games came out at roughly the same time as the card. Play a
brand-new game with a modern engine on the same card, however, and it’ll probably
run like a slide show, if at all.

Above: Assassin’s Creed in DirectX 10

While the videocard industry generally relies on a six-month refresh cycle for all of its
cards (meaning that you’ll usually see new cards from both ATI and NVIDIA twice each
year), the game industry moves at a much slower pace. All 3D games run on their own
“engine” — a massive pile of code that, among other things, determines the visual
quality of the game you eventually see on your screen. These engines take years to
develop, and are as forward-looking as possible, meaning they are designed to run on
hardware that won’t even exist until several years down the road! As a result, many
brand-new engines/games are brutal on PC hardware when they’re first released. But
over time, hardware catches up and eventually surpasses the 3D engine’s capabilities.

A classic example of this is id Software’s Quake III. When it was first released several
years ago, nothing but the most high-end card could run the game at a constant 30
frames per second. Today, the latest hardware runs that same engine at several
hundred frames per second. And a few years from now, new, as-yet-unimagined
videocards should churn through Quake 4 and Oblivion in the same way!
The boxes videocards come in are filled with mumbo-jumbo touting often obscure
features and wildly out-of-context performance numbers. Here are the key features that
really matter.

DirectX 10: The most important thing to know about DX10 is that both AMD and
NVIDIA GPUs that support it feature a unified architecture. This means that any or all of
the processor’s computational units (aka stream processors) can be dedicated to
executing any type of shader instruction, be it vertex, pixel, or geometry. That makes
DX10 compatibility a desirable feature even if you don’t plan on running Vista.
Memory Interface: In theory, a GPU with a 512-bit interface to memory will perform
faster than one with a 256-bit memory interface. But don’t be confused by AMD’s 512-
bit “ring bus” memory: inside the GPU, that architecture is 512 bits wide across the
board, but only AMD’s high-end GPUs have a true 512-bit external memory interface;
the company’s lesser parts have only 128- and 256-bit paths to memory. And don’t
judge a card based solely on its memory interface, either. NVIDIA’s 8800 GTX and
8800 Ultra are considerably faster than AMD’s ATI Radeon HD 2900 XT despite those
GPUs having a much narrower 384-bit memory interface.
Stream Processors: Unlike CPUs, which have one to four processing cores on a single
die, modern GPUs consist of dozens of computational units known as stream
processors. As with the GPU’s memory interface, however, simply counting the number
of stream processors doesn’t necessarily indicate that one videocard is more powerful
than another. AMD’s ATI Radeon HD 2900 XT, for example, is much slower than
NVIDIA’s GeForce 8800 GTX despite the fact that the latter part has only 128 stream
processors to the former’s 320.
HDMI: If you purchased a new big-screen TV, it’s probably outfitted with an HDMI port,
either instead of or in addition to a DVI port. The big difference is that HDMI is capable
of receiving both digital video and digital audio over the same cable. Videocards based
on AMD’s new GPUs are capable of taking audio from the motherboard and sending it
out through an HDMI adapter that connects to the card’s DVI port. With an NVIDIA card,
audio must be routed to your display or A/V receiver over a separate cable.
HDCP: This acronym refers to the copy-protection scheme deployed in commercial Blu-
ray and HD DVD movies. In order to transmit the audio and video material on these
discs to your display in the digital domain, both the videocard and the display must be
outfitted with an HDCP decryption ROM. This copy protection is not currently enforced if
the signal is transmitted in the analog domain. (See also Dual-Link DVI)
DUAL-LINK DVI: Driving a 30-inch LCD at its native resolution of 2560x1600 requires a
videocard with Dual-Link DVI, which is relatively common in mid-range and high-end
products. What’s not so common is a videocard that supports HDCP on Dual-Link DVI;
without that feature, the maximum resolution at which you can watch Blu-ray and HD
DVD movies is 1280x800.
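
The math behind that limit is straightforward: a single DVI link tops out at a 165MHz pixel clock, and a 30-inch panel at 60Hz needs well more than that. A rough sketch, ignoring blanking overhead:

# Active pixels per second for 2560x1600 at 60Hz.
pixel_rate = 2560 * 1600 * 60
print(f"~{pixel_rate/1e6:.0f} MHz of pixel clock needed, versus a single link's 165MHz ceiling")
# Splitting the job across two TMDS links is what "Dual-Link" means.
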
Above: Note the proprietary VIVO port next to the DVI ports on this NVIDIA
9800GX2 videocard
VIVO: The acronym stands for video in/video out—analog video, that is. Most
videocards are capable of producing composite, S-Video, or component-video output
(listed in ascending order of quality), which makes them friendly to analog TVs. Support
for these types of video input—which is useful primarily for capturing analog video from
VCRs and older camcorders—is much less common.
BLU-RAY And HD DVD Support: As backward as it sounds, high-end videocards are
less capable than mid-range videocards when it comes to decoding the high-resolution
video streams (H.264, VC1, and MPEG-2) recorded on commercial Blu-ray and HD
DVD movies. AMD’s ATI Radeon HD 2600 XT and the upcoming RV670 fully offload
the decode chores from the host CPU; the ATI Radeon HD 2900 XT does not. On the
NVIDIA side, the GeForce 8600 GTS and the 8800 GT do, but the 8800 GTS, 8800
GTX, and 8800 Ultra do not.

If one videocard can churn out 30 frames per second, two in the same machine should
be able to pump 60fps, right? Well, not exactly. Assuming your PC is even capable of
running more than one GPU at the same time, the best performance bump you can look
forward to is about 80 percent in a dual-GPU configuration. Very high-end GPUs scale
much less effectively.
The point is moot, of course, if your motherboard doesn’t support running two or more
videocards simultaneously—and that means more than simply having a mobo with two
or more PCI Express slots. Running multiple AMD ATI Radeon videocards, for instance,
requires a CrossFire compatible motherboard. Doing the same with two GeForce cards
requires an SLI-compatible motherboard (the acronym stands for scalable link
interface).
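
In concrete terms, that scaling figure works out like this (a rule-of-thumb sketch, not a guarantee for any particular game):

single_gpu_fps = 30
best_case_dual = single_gpu_fps * 1.8    # ~80 percent scaling at best
print(f"~{best_case_dual:.0f}fps from two cards, not {single_gpu_fps * 2}fps")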

Right: NVIDIA’s 9800GX2 in SLI mode is technically four GPUs on one board

It’s understandable that you can’t chain AMD and NVIDIA videocards together—the
architectures are radically different—but there’s no good reason why you can’t mix and
match videocards and motherboards. HP, in fact, recently figured out how to do just that
with its Blackbird 002 gaming PC (which can be outfitted with two Radeon HD 2900
XT videocards in CrossFire on a motherboard with an NVIDIA SLI chipset).
Unfortunately, HP isn’t sharing this firmware/driver trick with the rest of us.

Looking on the bright side, both companies support both AMD and Intel CPUs; gaining
access to SLI, however, requires a motherboard with an NVIDIA chipset. CrossFire
support is available with both AMD and Intel chipsets. Both companies’ technologies
also require that the GPUs on each videocard be identical, although they don’t
necessarily need to have the same clock speeds or even the same-size frame buffers.
You can couple an NVIDIA GeForce 8800 GTS with a 640MB frame buffer to a
GeForce 8800 GTS with a 320MB frame buffer, for instance, but you can’t pair either of
those cards with a GeForce 8800 GTX.
TRIPLE AND QUAD GPUS

NVIDIA launched quad-SLI technology some time ago, but the solution failed to gain
much traction in the market: It didn’t scale particularly well, it was wickedly expensive,
and it was available only in pre-built systems from OEMs. The solution featured four
GeForce 7900 GPUs mounted on four PCBs that fit into two PCI Express slots on the
motherboard. NVIDIA never announced a similar solution for its 8-series products; and
as we went to press, there were still no Vista drivers available for those rigs.

In the wake of Ageia shipping its PhysX physics accelerator last year, both AMD and
NVIDIA made a great deal of noise about doing physics acceleration on the GPU.
Despite several technology demos, in which a third videocard was used to accelerate
physics, this initiative also failed to get off the ground. Now that NVIDIA has acquired
Ageia and its technology, its next-generation cards may feature a built-in PhysX
processor so you won’t have to buy a separate add-in card.

AMD recently announced CrossFireX technology, which will enable three and four
videocards to operate in a single motherboard (one with three PCI Express slots,
obviously), and NVIDIA was making noises about the same thing with SLI. As with
NVIDIA’s quad SLI, all three (or four) GPUs will be used to produce graphics. NVIDIA’s
new 780 and 790 nForce motherboards all support triple-SLI. With very large monitors
becoming increasingly less expensive, gamers need all the graphics horsepower they
can lay their hands on.

We thought DirectX 10 would be the one reason to consider holding our noses and
upgrading to Vista. While there’s no reason why Microsoft could not release DirectX 10
for Windows XP, the company has so far insisted on keeping DX10 and Shader Model
4.0 exclusive to its new OS.

The new API gives game developers the tools to dramatically increase the visual
complexity of their games. However, from what we’ve seen of DX10 games so far, there
are too few compelling reasons to justify abandoning XP right now for anyone who
doesn’t fall into the “must early adopt” category. Vista’s slow adoption rate is one reason
why developers have been reluctant to move to it. Valve recently released statistics
culled from its Steam gaming service revealing that only three percent of its one million
anonymously surveyed users had machines equipped with both a DX10-compatible
videocard and Vista.

“[Microsoft’s] decision to couple DX10 with Vista was a mistake,” said Valve’s director of
marketing, Doug Lombardi. “There is no difference between running Orange Box games
[Half-Life 2: Episode 2, Team Fortress 2, and Portal] on Vista versus XP, but there are
some benefits to having a DX10 GPU.”
But this is more than just a chicken-or-the-egg problem. DX10 and Shader Model 4.0
are also more complex to program than DX9 and SM 3.0, and most of the games that
shipped last year were far along in their development cycles when Microsoft made
these new tools available.
Lombardi, for example, told us that Valve’s developers do make use of the unified
architecture that’s unique to DX10-class GPUs in order to deliver more sophisticated
facial animation in Team Fortress 2, but you don’t need Vista for this because they
didn’t tap DX10 or SM 4.0.
UNREASONABLE TRADE-OFF

The few games we’ve seen that do make use of DX10 (both new games and previously
released games with DX10 patches) don’t look significantly better running under Vista
than they do with Windows XP. But what’s worse is that they run slower on Vista. When
we patched the RTS game Company of Heroes and ran it at 1920x1200 resolution in
Windows XP (using an EVGA GeForce 8800 GTS with 640MB of memory), we
achieved a playable 42.3 frames per second. When we played the same game on the
same machine using Vista, frame rate plummeted to a creaky 20.2 frames per second.
It would be one thing if the trade-off resulted in supremely better graphics, but we
couldn’t see any significant differences. We had a similar experience with World in
Conflict.
Microsoft’s recent announcement of DirectX 10.1 and Shader Model 4.1 has rendered
the situation even more complex. These new versions were released along with Vista
Service Pack 1, but they’re supported only by AMD’s and NVIDIA’s very newest GPUs
(we’re talking about the G92 and the RV670). So if you thought buying any Radeon
2000-series or any GeForce 8000-series card rendered you future-proof, you’re in for a
rude awakening.
Microsoft, of course, insists these updates don’t render these cards obsolete. “The
updated API,” said Microsoft’s Sam Glassenberg, lead DX10.1 programmer, “provides
full support for all existing Direct3D 10 hardware and upcoming hardware that supports
the extended feature set. The API is a strict superset. No hardware support has been
removed in DirectX 10.1.” The new API makes mandatory several features that were
previously optional: compliant GPUs must now support at least 4x antialiasing and
32-bit floating-point filtering, for instance.

Considering how slowly both consumers and developers are moving to Vista, we don’t
expect these point releases to have much of an impact on the market.
Now that they’ve cashed in on the early adopters, ATI and NVIDIA are going after the
rest of us, with fast and inexpensive cards that do DirectX 10 – and beyond!

GeForce 8800 GTS 512MB $240, www.NVIDIA.com


Yet another variation on NVIDIA’s winning GeForce 8 series, the GTS 512MB is an
excellent card whose terrific performance is simply overshadowed by the
price/performance ratio of the 8800 GT. It’s a hefty card in the dual-slot form-factor of
the 9800GX2 and 9800 GTX, but with a modest bump up in the core clock and only
512MB of texture memory, instead of 768MB (as well as a slightly narrower 256-bit pipe
to squeeze frames through, compared to the 384-bit pipe of its two older brothers).

The extra forty bucks it costs over the cheaper 8800 GT doesn’t go to waste. Although
the gains seem modest in the benchmark chart below—a few extra frames per second
here, an extra 20 frames per second there—the differences become more prominent at
higher resolutions and with higher levels of postprocessing (including filtering and
antialiasing).
GeForce 8800 GT 512MB $200, www.NVIDIA.com
If you’ve got the bucks, then by all means, get yourself a GeForce 9800 GTX or a
9800GX2. But most of us are forced to cut corners every now and then on our PC
upgrading budget. That’s what makes NVIDIA’s GeForce 8800 GT such a winner of a
card: though it costs a lot less than the high end, and a hundred bucks less than the
8800 GTS 512MB version, its performance hardly plays like a second-tier card.

You can argue on and on about whether memory size, number of stream processors, or
clock speed are more important in a videocard, but in the end, it’s the balance of all
three that matters, and right now, you won’t find a more finely tuned card than the 8800
GT. It’s the one card we can recommend without reservation to gamers of all levels and
budgets.
Radeon HD 3870 $170, www.ati.com
ATI now has a very serious competitor to the cards NVIDIA’s been dealing out to mid-
range gamers in the HD 3870, a dual-slot card with a high 775MHz core clock speed
and the 512MB of memory that a mid-range videocard deserves. It clocks in virtually
neck-and-neck in Crysis and faster in Half-Life 2: Episode One compared to the
GeForce 8800 GT, but gets winded and lags behind in RTS games like World in Conflict
and Company of Heroes. Even then, we’re talking about differences of 10 to 15 frames
per second, which looks less harsh when you consider that the MSRP on the HD 3870
is merely $170.

It doesn’t seem as finely tuned as the 8800 GT, and won’t appeal to as many different
levels of gamers as that card does, but it’s fast, inexpensive, DirectX 10.1–compatible
like the 3850—and well worth the extra $40 over that card. Plus, you can pair two of
them up for some awesome Crossfire action!

WINDOWS XP / WINDOWS VISTA


3DMark06 run at its default resolution of 1280x1024; all other benchmarks run at
1600x1200 with 4x full-screen antialiasing and 16x anisotropic filtering enabled.
Scores are listed as Windows XP / Windows Vista.
Name: GeForce 8800 GT
3DMark06: 11870/11641
Crysis: 17/13
Half-Life 2: Episode One: 129/131
World in Conflict: 33/28
Company of Heroes: 58/54
Name: GeForce 8800 GTS 512MB
3DMark06: 11976/11787
Crysis: 21/15
Half-Life 2: Episode One: 151/162
World in Conflict: 37/27
Company of Heroes: 59/55
Name: Radeon 3850
3DMark06: 10078/9280
Crysis: 13/8
Half-Life 2: Episode One: 124/129
World in Conflict: 16/35
Company of Heroes: 50/47
Name: Radeon 3870
3DMark06: 11462/10378
Crysis: 17/15
Half-Life 2: Episode One: 148/158
World in Conflict: 26/15
Company of Heroes: 58/42
The hard drive is truly the unsung hero of PC components. You know the type: the kind
of component that labors away in the background while all the flashy components like
the CPU and videocard get all the credit. Yet the hard drive is the one component
that is used in almost every single task you’ll ever perform on your PC.

Whether you’re accessing folders on your hard drive, surfing the web, or copying
content from one location to another, your hard drive is constantly in use. Even when
you are just sitting in front of your computer, staring at the screen, the hard drive’s
platters are spinning furiously as the drive’s read/write heads eagerly await your next
command. The millisecond you click on a folder, these heads leap into action to deliver
the data you’ve requested, and as soon as they complete your request, they return
to their “ready and waiting” status. You could say the hard drive is the Labrador retriever
of the PC, waiting patiently with its tongue hanging out and tail wagging as you decide
what trick you’d like it to perform next. As soon as you toss the bone and say “fetch!” it’s
off and running. And the good news is hard drives today are faster than ever before
thanks to the successful proliferation of the Serial ATA spec.

The new interface’s chief benefits over the old one, called “parallel ATA,” are that it
offers more bandwidth for future drives to take advantage of and that it’s easier to add
to a system, thanks to its smaller cables and lack of jumpers. Parallel ATA drives have
to be correctly configured via jumper pins as Master
or Slave prior to use, but the newer drives have no such limitation—just plug them in
and they work. Nonetheless, eventually all hard drives will use the Serial ATA interface,
so if you are in the market for a hard drive today, you’d be wise to consider a SATA
drive in order to make your system as future-proof as possible. Plus, drive
manufacturers are only releasing their top-of-the-line drives in SATA form these days,
so if you buy one, you can be sure it’s the cream of the crop (for now).
What is SATA?
The parallel ATA connection standard for hard drives and optical drives has enjoyed an
unusually long tour of duty by PC standards, but it’s clear that the old spec is ready for
retirement.

PATA is called a “parallel” interface because multiple bits of data travel along the 40-pin
cable simultaneously on separate channels. But the parallel ATA interface tops out at a
maximum transfer rate of 133MB per second, due to crosstalk. Crosstalk occurs when
electrical signals on adjoining wires interfere with one another. It’s like trying to have a
conversation with a friend on a crowded bus while the dumbass sitting next to you is
yelling into his cellphone. Because you’re sitting so close to Mr. Cellphone, you can only
hear his conversation, so you have to talk louder to make your conversation heard. But
then he starts talking louder on the phone, and pretty soon neither of you can hear
anything and everyone else on the bus is pissed off. That’s crosstalk, and trying to push
data through IDE cables faster just generates too much of it. And because the lasagna-
size parallel cable is already too large and unwieldy to accommodate good airflow in
today’s PCs, an even wider cable just isn’t an acceptable solution. Fortunately, there’s
another way to push data at extremely high rates while eliminating the crosstalk
problem: Serial ATA.

Instead of adding more parallel wires and channels, Serial ATA eliminates the problem
of crosstalk by using an interface that pumps data through a single channel one bit at a
time. Without the worry of electrical crosstalk, these bits can be pushed along the serial
cable much faster than across parallel ATA.

The Serial ATA cable uses seven wires, three of which are ground wires, with the other
four carrying data. Two of the data wires are dedicated to moving data from the
computer to the hard drive (downstream), and two are dedicated to carrying data from
the hard drive to the computer (upstream).
Q: What makes a hard drive “fast”?

A: Many factors define a hard drive’s raw speed potential, but the most important is the
rotational speed of its platters. All drives store their data on internal platters, and the
data is retrieved when the platters spin under read/write heads. The faster these little
platters spin, the faster the data can be accessed. Today’s standard desktop drives
rotate at 7200rpm, and these drives are very fast. There are also a handful of
10,000rpm drives, which are insanely fast due to their rotational-speed advantage. On
the server side of things, where performance is king and money is no object, 15,000rpm
drives reign supreme. These drives are the absolute pinnacle of performance, but not
practical for desktop tasks due to their high cost and relatively small capacity.
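
The rotational-speed advantage is easy to quantify: on average, the drive has to wait half a revolution for the data to swing under the heads. A quick sketch, whose results line up with the average-latency specs in the drive listings below:

# Average rotational latency = time for half a platter revolution.
for rpm in (5400, 7200, 10000, 15000):
    latency_ms = 60.0 / rpm / 2 * 1000
    print(f"{rpm:>6}rpm -> ~{latency_ms:.2f}ms average rotational latency")
# ~5.56ms at 5,400rpm, ~4.17ms at 7,200rpm, ~3.00ms at 10,000rpm, ~2.00ms at 15,000rpm.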

The size of a drive’s onboard memory plays a distinct role in its overall performance as
well, with the rule of thumb being “the bigger the better.” Onboard memory buffers range
in size from 2MB to 16MB, and drives with these large buffers deliver up to 30 percent
faster performance, on average, than drives with smaller buffers. Typically, data is
delivered from the buffer as fast as the interface allows, so the more data a drive can
wedge into its buffer, the faster it can perform typical desktop tasks.
Q: What is Serial ATA?

A: Take a look at a machine equipped with Serial ATA, and the most striking feature will
be the skinny data cables. While skinny cables have a positive impact on a case’s
internal airflow, this isn’t the main reason why the PC industry is dropping parallel ATA
(and its flat, wide cables) for SATA. The main reason is that the current parallel
interface is facing a performance wall.

Parallel ATA cables send data along multiple wires within the same wide ribbon. Each
piece of data must travel along the length of the familiar ribbon cable, and arrive at the
same time in order to maintain data integrity. In order to get more speed from this
scheme, the only option is to push the data to higher frequencies or make the data path
wider. That’s where the problems lie. Making the data path wider is impractical, as there
are already 80 conductors in the ribbon. And increasing speed adds to the likelihood of
data corruption.

Because serial interfaces don’t have to deal with coordinating multiple lanes of data,
we’re able to push them to much higher speeds. SATA launched with speeds of 150MB/
s, slightly higher than the 133MB/s offered by the fastest parallel ATA spec. 3G SATA
drives have already doubled speeds to 300MB/s, and the spec is set to double again to 600MB/s by next
year.
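Those tidy numbers aren't a coincidence. SATA sends every data byte across the wire as 10 bits (a scheme called 8b/10b encoding), so usable throughput is simply the line rate divided by ten. A quick illustrative calculation, in Python:

```python
# SATA line rates use 8b/10b encoding: every data byte travels as 10 bits on
# the wire, so usable throughput is roughly the line rate divided by 10.
def sata_throughput_mb_s(line_rate_gbps):
    bits_per_byte_on_wire = 10  # 8 data bits + 2 bits of encoding overhead
    return line_rate_gbps * 1000 / bits_per_byte_on_wire

for name, rate in (("SATA 1.5Gb/s", 1.5), ("SATA 3Gb/s", 3.0), ("SATA 6Gb/s", 6.0)):
    print(f"{name}: ~{sata_throughput_mb_s(rate):.0f} MB/s")
```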

Although current hard drive transfer rates fall far short of the maximum throughput of
even parallel ATA specs, companies are laying the foundation for the future. You don’t,
after all, wait for the traffic jam before you try to build the roads (unless you run the state
of California).
ATA
Advanced Technology Attachment. This is the parallel interface used to attach hard
drives, CD-ROMs, and DVD drives to the majority of PCs on the market. The term
"ATA" is used interchangeably with the term "IDE." Officially, there are the ATA-1
through ATA-6 specifications, which are usually written as "ATA" followed by the
interface's maximum throughput. For example, the final spec of parallel ATA is
ATA/133, which allows for data transfers of up to 133MB per second.
Q: What are the different rotational velocities offered in today’s hard drives, and
what are the benefits of each?

A: Today’s desktop hard drives are offered in three rotational speeds. The slowest is
5,400 rpm, with these drives primarily being used for rudimentary storage duties where
speed is of little importance. They are affordable since they represent last-gen
technology and are not in high demand. The next fastest speed is 7200rpm, which is the
norm for today’s desktop drives.

These drives are very fast, and are more than adequate for all but the most demanding
desktop users. Finally, for those “demanding” types, there are the 10,000rpm Raptor
drives from Western Digital. These puppies are wicked-fast, and are zippier than
7200rpm drives by a wide margin. The only drawback to 10,000rpm drives is that they
are currently only offered in 150GB or 300GB capacities, while 7200rpm drives are
offered in capacities ranging from under 10GB, all the way up to 1TB!
RAID
Redundant Array of Inexpensive Disks. An arrangement whereby more than one hard
drive is combined to form a single storage volume. Depending on the configuration,
better performance, better security, or both can be attained. The only way to practically
double a hard drive’s speed is to add a second drive and divide up the work between
them. It’s a process called “R.A.I.D.,” and here we see a four-drive array (the fifth drive
is the primary volume).
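To make the striping idea concrete, here's a minimal sketch of how a two-drive RAID 0 array deals data out to its members. The 64KB stripe size is just an assumption for illustration (real controllers let you choose), and this is our simplification, not any particular controller's firmware:

```python
# Minimal RAID 0 sketch: logical blocks are dealt out across drives in
# fixed-size stripes, so large sequential transfers keep both drives busy.
STRIPE_SIZE_KB = 64   # hypothetical stripe size; real controllers vary
NUM_DRIVES = 2

def locate(logical_kb_offset):
    stripe_index = logical_kb_offset // STRIPE_SIZE_KB
    drive = stripe_index % NUM_DRIVES                     # which physical drive
    offset_on_drive = (stripe_index // NUM_DRIVES) * STRIPE_SIZE_KB \
                      + logical_kb_offset % STRIPE_SIZE_KB
    return drive, offset_on_drive

for offset_kb in (0, 64, 128, 200):
    drive, offset = locate(offset_kb)
    print(f"logical KB {offset_kb:>3} -> drive {drive}, KB {offset} on that drive")
```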
Internal Drives
Western Digital Caviar
SE16 500GB
Capacity: 500GB
Interface: SATA 3.0Gb/s
RPM: 7200
Cache: 16MB
Avg Seek Time: 8.9ms
Avg Write Time: 10.9ms
Price: $90
www.westerndigital.com
Seagate Barracuda
7200.11 1TB
Capacity: 1TB
Interface: SATA 3.0Gb/s
RPM: 7200
Cache: 32MB
Avg Latency: 4.16ms
Price: $220
www.seagate.com

Western Digital
Raptor X 150GB
Capacity: 150GB
Interface: SATA 1.5Gb/s
RPM: 10000
Cache: 16MB
Avg Seek Time: 4.6ms
Avg Write Time: 5.2ms
Avg Latency: 2.99ms
Price: $174
www.westerndigital.com
External Drives

Seagate FreeAgent
Go 250GB
Capacity: 250GB
Interface: USB 2.0
RPM: 5400
Transfer speed: 480Mb/s
Dimensions: .7”x4.8”x3.9”
Weight: 6.4oz
Price: $100
www.seagate.com

Western Digital My
Passport Elite 320GB
Capacity: 320GB
Interface: USB 2.0
RPM: 5400
Transfer speed: 480Mb/s
Dimensions: .6”x5.0”x3.1”
Weight: 0.23lb
Price: $160
www.westerndigital.com
Maxtor OneTouch
4 Plus 1TB
Capacity: 1TB
Interface: USB 2.0 /
IEEE1394a
RPM: 7200
Transfer speed: 480Mb/s
Dimensions: 2.5”x6.0”x6.8”
Weight: 2.5lbs
Price: $250
www.maxtor.com
When the compact disc was introduced by Philips and Sony in 1979, vinyl records had
the misfortune to be standing directly in its path. Those black, circular monstrosities—
with their fragile surfaces and analog data—couldn’t compete with the CD’s deadly
combination of digital clarity and rugged portability. A few years later, engineers figured
out how to adapt audio CD technology for use with computer data by adding strong
error detection and correction schemes, which led to the downfall of the floppy disk.
This storage medium then evolved to DVD, which has taken over as the standard to
distribute audio, data, and video to consumers. Today, it continues to evolve at an
astounding pace.

Both CD and DVD drives fall under the banner of “optical storage.” These drives contain
a laser, and when a disc is inserted, the laser “looks” at the surface of a disc, where
information is encoded in a single spiral track that begins in the center of the disk and
moves outward toward the edges. The laser is looking for variations in the surface of the
disc, from which it derives digital data (ones and zeroes, in other words). The spiral
track in a commercial CD-ROM contains a series of bumps and flat surfaces called
“pits” and “lands” embedded in a clear layer just below the disc’s outer surface. These
“pits” and “lands” represent ones and zeroes and are the building blocks of data.
Recordable CDs, or “burned” CDs, work in a similar way. Commercial, write-once and
recordable DVDs use these same principles to store information.
Next-gen or not?

Optical technology is currently making another giant leap forward with the introduction
of the next generation of storage discs: Blu-ray, which increases capacity from DVD's
4.7GB and 8.5GB all the way up to 25GB per layer (50GB dual-layer) using a blue laser instead of the traditional red
one. But should this new format figure into your gaming system at this time?
For starters, a next-gen burner is a big investment. Prices have certainly dropped in the
last year, starting at $1,000 or more a year ago and now resting at half that amount
today. And that trend is sure to continue, so it may be worthwhile to wait for a while
longer.

Yes, these drives are uniquely capable of burning huge amounts of data to a single disc
(25GB per layer for Blu-ray), but the media is also quite pricey, running $12 to $15
per single-layer disc and twice that for double-layer media. If data backup is your
primary concern, an external hard drive might be a more cost-effective purchase, and its transfer
times will certainly be speedier: even the fastest next-gen drive we've tested
took more than 21 minutes to fill a 25GB disc—that's not speedy, folks.

Indeed, the majority of your disc-burning needs might best be handled by a good, old-
fashioned standard DVD drive. High-performance models, capable of speedy 18x
burns, can be had for less than 100 bucks. And DVD media is itself very affordable. This
can be a compelling stop-gap measure while you wait for “next-gen” optical to come into
its own.
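For a rough sense of the speed gap, remember that optical "x" ratings are multiples of a base rate: roughly 150KB/s for CDs, about 1.385MB/s for DVDs, and about 4.5MB/s for Blu-ray. The sketch below is best-case math only (real burns run slower, since drives don't hold their top speed across the whole disc), but it shows why even a 4x Blu-ray burn lands in the 20-plus-minute range we measured:

```python
# Approximate "1x" base rates for optical media, in MB/s (decimal).
BASE_RATE_MB_S = {"CD": 0.15, "DVD": 1.385, "Blu-ray": 4.5}

def burn_minutes(media, speed_multiplier, capacity_gb):
    rate = BASE_RATE_MB_S[media] * speed_multiplier   # best-case sustained rate
    return (capacity_gb * 1000) / rate / 60           # decimal GB -> MB -> minutes

print(f"18x DVD burn, 4.7GB disc:   ~{burn_minutes('DVD', 18, 4.7):.1f} minutes (best case)")
print(f"4x Blu-ray burn, 25GB disc: ~{burn_minutes('Blu-ray', 4, 25):.1f} minutes (best case)")
```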
Q: Does it make any difference what color or brand of media I use?
A: Which one should you use? That’s easy. Check the documentation that came with
your optical drive, or the manufacturer’s website for media recommendations. These
recommendations didn’t come about as a result of back-alley deals or bribes of exotic
whiskey. You’ll find some media brands recommended over others because these discs
have been specifically tested with the manufacturers’ drives. The proper laser strength
for each type of media has been evaluated and programmed into the drive’s firmware.

In general, we do not recommend buying cheap spindles of off-brand media, no matter
how inexpensive they are. El Cheapo vendors aren't worried about brand loyalty, so
they skimp on quality control and you pay the price in discs that are error-prone or that
won't retain their data for very long.
Q: Are non-combo, non-dual-format drives even relevant anymore? Why would
anyone ever buy a dedicated DVD-ROM drive, CD-ROM drive, or CD-RW burner in
this day and age?
A: Today’s optical storage market is divided among drives that record to CD-R, DVD-R,
or both. Drives that do both are known as “combo” drives, while drives dedicated to
recording in one format are called “dedicated” drives. A general rule of thumb is that
dedicated drives tend to offer higher speeds than combo drives, but the speed
differential is often negligible. But there are still good reasons to look at dedicated or
single-format drives. For example, Plextor’s ultra-foxy PlexWriter Premium CD-RW drive
doesn’t burn DVDs, but it offers a staggering amount of one-of-a-kind features, like the
ability to tweak laser strength for higher compatibility with your audio equipment, and to
“overburn” discs so that ordinary CDs can contain as much as 1GB of data.

Another reason to covet a dedicated drive is price. For example, if you know your set-
top DVD player can read DVD-R discs, don’t waste your money on a dual-format burner
when you can buy a less expensive single-format burner.
Dual-Layer
Some factory-pressed DVDs contain data on two layers for a total capacity of around
8.5 GB. While all DVD players, including DVD-ROMs, can fully access dual-layer discs,
all current recordable DVD formats are based on single-layer technology and are limited
to 4.7GB.

Lite-On 20X DVDR Burner with LightScribe


Formats: DVD+R, DVD+RW, DVD-R, DVD-RW, CD-R, CD-RW
Read speed: 16X for DVDs, 48X for CDs
Interface: SATA
Access time: 160ms
Cache: 2MB
Price: $30

Sony BDU-X10S
Formats: Blu-Ray, DVD, CD
Read Speed: 2X for Blu-ray, 8X for DVDs, 24X for CDs
Interface: SATA 1.5Gb/s
Access time: 210ms for BRD, 170ms for DVDs, 150ms for CDs
Cache: 4MB
Price: $199
Sony BWU-200S 4X Blu-ray Disc Burner
Formats: Blu-ray, DVD, CD
Read Speed: 4X for Blu-ray, 16X for DVDs, 40X for CDs
Interface: SATA 1.5Gb/s
Access time: 210ms for BRD, 170ms for DVDs, 150ms for CDs
Cache: 8MB
Price: $699

For years, the soundcard looked as though it was headed to join the scrapheap along
with the Ethernet card, USB 2.0 card, and Firewire card. Oddly, a recent renewed
interest in soundcards indicates that this dog may still have a little hunt left in it. Creative
Labs' X-Fi has been the premier soundcard, but entries from Asus, Auzentech, Razer,
and others have recently been introduced for PC enthusiasts. Why run a soundcard
instead of the "free" onboard stuff on your motherboard? The main reason is that it
simply sounds better. Onboard audio's biggest weakness is sharing the same space as
the other electrically noisy components on a motherboard. This leads to the snap,
crackle, and humming that most people associate with bad audio. Onboard audio also
suffers because most motherboard companies' strengths aren't in making good
audio; they just need it to fill a checkbox on the packaging.

Today, gamers are faced with two choices: hardware audio-processing or host-based.
There’s only one soundcard series with hardware support: Creative’s X-Fi (and
Auzentech’s authorized copy). X-Fi cards will actually process the complex math for
audio on the digital signal processor (DSP) on the card. Newcomers, such as Asus’
Xonar or Razer's Barracuda AC-1, actually process the math on the CPU and use the
soundcard as little more than a glorified I/O card to pass the audio signal out of the
system to your speakers. The argument for the X-Fi cards is that they will put less of a
load on the CPU and thus, theoretically, increase frame rates. For the most part, we’ve
found this to be true. However, with quad-core computers becoming the norm, is the
soundcard even really working that hard?

Host-based soundcards are actually quite good and offer features that DSP-equipped
cards cannot, such as real-time encoding of content to Dolby Digital. For those looking
to use the PC with a home entertainment system, a card like the Asus Xonar is a better
fit than the X-Fi.

There's also been a push to support ever more satellite speakers, with soundcards moving from 5.1
to 6.1 and now 7.1 audio. While additional speakers do help, we don't find it
practical to run seven speakers around our PC. Plus, 6.1 and 7.1 support
tends to be mismatched between soundcards and speaker sets, with not every combination working quite right. The sweet
spot for someone looking for a good surround-sound experience is still a 5.1 speaker
setup. The good news is that cards that tout 7.1 support also work fine with most 5.1
speaker sets.

If games are the main application you consider when it comes to sound, your choice
remains simple: Creative’s X-Fi.
Q: What’s the difference between 24-bit/192KHz audio and 16-bit/44.1KHz audio?
A: 16-bit/44.1KHz audio is the specification for CD-quality audio, whereas 24-
bit/192KHz sound is recorded at a higher bit rate, meaning it includes more information
(or bits of data) about the sound than 16-bit/44.1KHz audio. With a higher bit rate,
sound is produced with increased resolution and is able to convey more subtle nuances
than with a lower bitrate. Unfortunately, it will be a while before 24-bit/192KHz media
becomes commonplace, simply because 16-bit/44.1KHz is excellent sound quality by
most people’s standards. Another roadblock to the adoption of 24-bit/192KHz audio is
that if you play a CD that was engineered at 16 bits, it won’t sound better with a
soundcard that’s capable of 24-bit resolution. Most 24-bit soundcards do let you record
at that resolution, though, which is a nice feature if you do a lot of music recording.
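If you're curious where those figures come from, the raw data rate of uncompressed stereo PCM audio is just sample rate times bit depth times channels. A quick illustrative calculation:

```python
# Raw (uncompressed) PCM data rate: sample rate x bit depth x channels.
def pcm_kbit_s(sample_rate_hz, bit_depth, channels=2):
    return sample_rate_hz * bit_depth * channels / 1000.0

cd_quality = pcm_kbit_s(44_100, 16)    # 16-bit/44.1KHz stereo
high_res   = pcm_kbit_s(192_000, 24)   # 24-bit/192KHz stereo
print(f"CD quality:    {cd_quality:,.0f} kbit/s")   # ~1,411 kbit/s
print(f"24-bit/192KHz: {high_res:,.0f} kbit/s")     # ~9,216 kbit/s
```

That's more than six times the data of a CD, which is part of the reason 24-bit/192KHz content has been slow to spread.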
Q: In specific terms, how badly might my 3D gaming frame rates suffer if I use a
“host-based” card that relies on my CPU for audio processing chores?
A: Most onboard sound chips (and even some add-in soundcards) offload audio
number crunching to the system’s CPU, which is generally bad. This is because, during
a 3D game, the CPU has its hands full feeding instructions to the videocard, so the last
thing it needs is more work. We know how it feels! However, by most benchmarks, the
difference in frame rates for a system using a host-based card and an add-in card is
usually less than 10 frames per second. If you have a monster gaming rig that has
frames to spare, you can afford to send some more work to the CPU. However, if you’re
running a “budget” system, an add-in card with its own audio processor is the way to go
for maximum gaming performance.

Q: Does it matter where I place the subwoofer?


A: A subwoofer produces tones that are so deep the human ear is unable to pinpoint
their location, which is why the conventional wisdom is to put the sub anywhere you like.
Your ears can’t tell the difference if it’s three feet behind you or five feet to your right.
However, there will always be a “sweet spot” in your listening area where the subwoofer
sounds best, so we
recommend playing a bass-heavy DVD (Saving Private Ryan’s opening sequence is a
good choice) or some thumping music and then moving the subwoofer around the room
while returning to your listening area to see how it sounds. Once you’ve pinpointed the
“sweet spot,” invite your friends over to show your home theater off a bit! For tips on
speaker placement, you also can read the article on the next page of this very issue.
2.1, 4.1, 5.1, and 6.1
Abbreviations used to denote the number of sound channels in a speaker system. The
number before the decimal point indicates the number of regular audio channels and
the number after the decimal point denotes the subwoofer (low-frequency) channel. For
example, 5.1 means five regular channels and one subwoofer channel.
Q: What do I need to play movies in Dolby Digital surround sound?
A: In order to listen to true “discrete” Dolby Digital multi-channel audio on your PC,
which is sound that is sent to separate channels from the sound source, all you need is
a piece of hardware to decode the sound into its separate channels and a 5.1 speaker
system. It’s really that simple. A Dolby Digital audio stream is a digital signal that
includes six audio channels, but these signals have to be sorted by a decoder and sent
to their respective channels in order to get that movie-theater sound separation where
you hear bullets whizzing from the front channel to the rear channel. This decoder can
either be built into the soundcard or the speakers, or it can be a separate add-on unit.
Q: How much power do I need?
A: A realistic assessment is that speaker systems that crank out 100 watts are
sufficiently loud for home use, or 200 watts for a surround sound system. However,
ultra-high-wattage speakers that are capable of 500 watts or more sound better at
moderate volumes since there is absolutely no stress on the speaker's components at
those levels, whereas lesser speakers can become considerably stressed at the same
volumes. The real reason for speakers to have high wattage ratings isn't to actually use
those high levels of output, but to ensure distortion-free playback at lower volumes. As a
yardstick, the 2.1 Logitech Z-2300 speakers are capable of pumping 200 continuous
watts and are pure sonic fury. We almost went deaf testing them.
Q: I have a 5.1 speaker system, but am seeing advertisements for 6.1 and even 7.1
speaker systems now. Is it worth it to upgrade?
A: Although the addition of one little speaker behind you or two on the sides may not
seem like it would make a big difference, it does. The traditional 5.1 surround sound
speaker system sounds fantastic but leaves huge gaps in the sound field behind you
and on the sides as well. The only catch to upgrading to 6.1 or 7.1 is that Creative Labs
is the only manufacturer selling decent systems, but it offers both a budget 7.1 speaker
system called the Inspire T7700 and high-end 6.1 and 7.1 systems. You
can also buy Creative Labs’ S700 5.1 system, which is upgradeable to 7.1 for an extra
$100. Also note that you’ll need a soundcard that supports 7.1 sound, but there are
several models on the market currently that offer this feature.

SPEAKERS FROM SPACE


Q: I bought the Klipsch GMX speakers and have a Soundblaster Live. How can I
get 5.1 sound out of them?
A: The Klipsch GMX-D5.1s were designed primarily for console gamers and include
only digital 5.1 support. For PC gamers, this setup sucks, because most soundcards
can send only a two-channel PCM signal digitally. If you want to get 5.1 sound out of
your GMX-D5.1s, you’ll need an nForce motherboard or a soundcard that can output a
Dolby Digital 5.1 stream. Unfortunately, only the Sound Blaster Audigy and Audigy 2
products can do that now. Your GMX-D5.1s are essentially 2.0 speakers, unless you
have a properly outfitted soundcard.

WORRYING ABOUT WIRES


Q: What is the best way to lengthen speaker wires that are hard-wired into the
back of the satellite?
A: The only way to deal with this tricky situation is to don your electrician cap and splice
an extra length of wire into the main speaker wire. Grab a set of wire clippers/strippers
and clip the wire at any point.

Next, strip the cabling off the leading edge of the wires to expose the internal wires and
connect the two sections of cable. Twist the new wires together and wrap the exposed
portions of wire (the parts that used to be covered in cable sheath but are now
entwined) with electrical tape and you’re done.
GOING COMMANDO
Q: Is it OK to remove the dust covers from my speakers? I like the look of the
drivers and want to show them off!
A: It’s totally fine to remove the dust covers from your speakers. After all, that’s all they
are: covers to keep dust off the drivers. In fact, some people like the look of the speaker
drivers as opposed to the speaker grills, but to each his own. Be warned, though, that
removing the safety covers exposes your speakers to errant flying objects, mischievous
kitty cats and all sorts of desktop dangers. Personally, we like to protect our PC
hardware, so we’d leave ‘em on.

Though most people upgrade videocards and hard drives on an annual basis, they
rarely upgrade their PC case, unless tragedy strikes. The reason is simple: The ATX
specification for cases has been around a long time, and it’s still getting the job done.
Simply put, there’s usually little reason to upgrade unless you’re looking for more room,
more cooling or a more pleasing aesthetic. Indeed, these are the most important
characteristics of a case: it must be able to hold all your hardware, and have enough
fans to keep everything cool and relatively quiet.
Case fans: All cases include some sort of cooling system, though whether or not the
actual fans are inside the case at the time of purchase varies. Regardless, every case
has fan mounts, and it’s important to see what size they are prior to purchase. We
typically favor large 12cm fans because they spin slowly, and therefore are relatively
nice and quiet—and move a lot of air. You’ll want to make sure there’s a fan in the lower
front of the case to suck air into the PC, and a nice, big fan in the back to blow it out.
Some cases include exhaust fans on the top or side, too, but these aren't
always necessary and can add a lot of unwanted noise.
Form factor: The lion’s share of consumer-level motherboards conform to the ATX
specification, so make sure the case in question supports this standard (most do). Once
you know it supports ATX, the only big question left is, How big do you want to go?
There are mid-tower cases, which are the size of what most consider to be a "regular"
desktop, and there are full-tower cases, which are much larger and longer than a mid-
tower. Though their size makes them unsuitable for frequent transport to LAN parties,
full-size towers are a breeze to work in given their cavernous interiors. They can also
hold a lot of hardware, which is nice if you have several hard drives, are running SLI, or
are thinking about investing in water cooling. For most users, however, a mid-tower will
be more than sufficient.
Construction: The materials that make up your prospective case don’t really matter
that much. What does impact the equation—and your arms—is the case’s weight. If a
case is cumbersome before you put anything into it, imagine its heft once it’s stuffed
with optical drives, hard drives, videocards, water-cooling reservoirs, etc. With that said,
you’ll only infrequently tote your case, so don’t skimp on quality just to get a lighter
case. It’s also important that the outside of your case is durable. If it gets all marked up
the second you run your fingernail across it or if it feels flimsy to the touch, move on.
What good is a sweet enclosure that turns ugly - or worse, broken - within a few weeks?
Features: Your case’s features can range from the truly useful to the simply cool. A
slide-out motherboard tray, for example, is a feature we’re always keen on. Toolless
drive bays are also welcome. Then, of course, there’s all the whiz-bangery (or lack
thereof) to consider: We’re talking LED fans, built-in gauges, locking systems, etc. Case
innovation can be a slippery slope, though, as sometimes these features are actually
more irritating than useful (we can’t count the number of poorly implemented screwless
PCI holders we’ve broken). A feature doesn’t have to be new to be unique—the simple
addition of changeable side panels to a case kicks ass, and there’s nothing overly fancy
about replacing a window.
Front-mounted connectors: This feature used to be relegated to USB ports mounted
on the front of the case, but with Firewire and eSATA making headway in the market,
you’re going to want a case that gives you at least two of these connectors to play with.
And one should be eSATA – its speed benefits destroy anything Firewire or USB-based,
making it a perfect connection for that external backup drive of yours. Be sure to pay
attention to the location of these connectors, as sometimes they’re on the bottom-front
of the case, and other cases put them right at the top, which is nice if your PC is resting
on the floor.
Aesthetics

Simply put, you don’t want an ugly case. But far be it from us to decide what’s atrocious
versus what’s attractive, as everyone has his own personal sense of style. While we
personally hate cases that look like they were pulled straight out of the X-Files prop
shop, some people are into that sort of thing. Of course, these same people might very
well hate a case that’s covered in branding for a particular professional gamer.
Antec 900
www.antec.com
$120
Antec’s Nine Hundred is solidly constructed and surrounded by enough air cooling to
bring Dorothy back home to Kansas. Shoot, we were effectively “blown away” by the
Nine Hundred, hereafter dubbed "the 900," which is a fine example of case
craftsmanship, despite a few minor flaws.

The case’s internals are pleasantly predictable. Three 5.25-inch bays and six 3.5-inch
bays reside behind the case’s stylish front panel, and the full grill not only looks sharp
but also improves the 900’s ability to generate ample airflow. Two 12cm blue LED fans
suck air across your hard drives and into the eye of the storm, and a 20cm fan churns
on the 900’s ceiling.

And that’s not all! Another fan at the rear of the case helps make the 900 an ideal
solution for those who prefer air cooling to water cooling. Heck, you can even install an
additional fan on the case’s side window grill—a pleasant bit of overkill.

“Hurricane” is an apt term to describe the force produced by the 900’s fans at full tilt, but
if going deaf isn’t your thing, Antec has wisely given users the ability to customize
speeds via a little switch on each fan.
The 900’s few flaws—a hard-to-remove side panel, a ton of drive-bay thumbscrews, and
no eSATA port—are hardly enough to dump rain on this case’s parade.
Coolermaster Cosmos
www.coolermaster.com
$200
We tipped our reviewing hand when we chose this case to house this year's build-it
machine. But that’s just how sweet the Cosmos is. This case looks as good as it
functions, and there’s nary a blemish in either area. More important, the case retains
enough of a unique look and feel to distance itself from the bevy of generic models we
frequently see.

You don’t need to grab a screwdriver to make major changes to any parts in the
Cosmos case (aside from the motherboard). The five front 5.25-inch bays use an
awesome push-button locking mechanism that, to date, is the best we've come across.
Tiny thumbscrews hold the six hard-drive trays in place - an elegant improvement over
standard drive bays.

The Cosmos caters to the water-cooling crowd with its ready-for-a-radiator ceiling grills,
but lovers of the air won’t be left out. A detachable 12cm fan bunker pulls in air from the
bottom of the case, and a plastic bar running horizontally across the case draws cool air
right into the videocard area. Strangely, there’s no airflow across the hard drives in this
case, one of the very few oversights we were able to find with the Cosmos. A lack of
functioning drive-activity lights on the case’s front panel is another stinger, but it’s not
enough to destroy the taste of this sweet, sweet chassis.

Q: Let’s start with the basics. What is a CRT?


A: Cathode Ray Tube. CRT monitors are just fancy implementations of the same
technology used in TVs: An electron beam originating from the base of
a vacuum-sealed tube scans across the tube’s screen, which is covered with a layer of
phosphor material. A metal grating or wire mesh limits how much of the electron beam
can hit individual phosphor clusters, thus leading to an acceptably sharp image. When
the phosphor material becomes excited, it glows either red, green or blue. Mix up
several differently colored phosphor clusters and suddenly you have millions of colors.
Although it is possible to make flat or nearly flat cathode ray tubes, most older models
exhibit some curvature, at least around the corners.
Q: But I have no desk space! What is an LCD?
A: Liquid-Crystal Displays are modern alternatives to CRTs. LCD manufacturing starts
with a flat pane of glass, which is then layered with a grid of small transistors; the
transistors are arranged in groups of three, and each triad describes a screen pixel.
When excited by electricity, these transistors can be made to open and shut. Put a
backlight behind the transistor grid and, behold, you have an image. There’s more to it
than that, but that’s the basic idea.
Q: What type of display is the best for gaming?
A: The knee-jerk recommendation has long been “a primo 19-inch CRT.” Why?
Because CRTs were better suited for gaming than LCDs, and because 19-inchers have
always been great values, price-wise. However, in today’s modern age of affordable
LCDs with blazing-fast response times that eliminate the ghosting and blurring effects that
plagued earlier generations of monitors, we at PC Gamer are all on 20” widescreen
LCDs, and we’re loving them. Not only do they save desk space (and your back, if you
ever want to move them), but the image is as sharp and clear as you could want. The
one drawback is that they don’t look as good if they’re not at their native resolution, but
that’s a small price to pay.

Q: What is an optimal refresh rate?


A: On an LCD, this is a no-brainer—use whatever refresh rate the manufacturer tells
you to use for the native resolution of the panel (this is the resolution the display has to
run at in order for everything to look normal, i.e., not stretched out, squished, or jaggy).
This is almost always 60Hz, even if the monitor may be able to handle higher. Don’t
worry - LCD pixels don’t fade and strobe the way CRT pixels do, so 60Hz won’t cause
eyestrain. CRTs are a little trickier. While some would answer "as high a refresh rate
as the CRT will allow," we recommend taking a more cautious approach. First, make
sure you’ve loaded the Windows drivers for your particular display before you mess with
the refresh rate settings. With the driver loaded, Windows won’t let you choose a rate
higher than the monitor can display without damage to its circuitry. Our advice is to stay
away from 60Hz, but 75Hz or higher will be just dandy.

Screen size
This figure is the size of the LCD panel measured diagonally from corner to corner.
Desktop screens range in size from 15 to 24 inches and beyond. We consider 19 inches
the minimum for all-purpose computing. You need at least that much screen real estate
to work in multiple windows comfortably, and to thoroughly enjoy high-definition video
and PC games.
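Diagonal measurements can be deceiving once aspect ratios enter the picture, so here's a small illustrative calculation (plain geometry, nothing vendor-specific) that turns a diagonal size and aspect ratio into real-world width and height:

```python
# Convert a diagonal screen size plus an aspect ratio into physical dimensions.
from math import hypot

def panel_dimensions(diagonal_inches, ratio_w, ratio_h):
    scale = diagonal_inches / hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

for name, diagonal, rw, rh in (("19-inch 4:3", 19, 4, 3), ("24-inch 16:10", 24, 16, 10)):
    width, height = panel_dimensions(diagonal, rw, rh)
    print(f'{name}: {width:.1f}" wide x {height:.1f}" tall')
```

A 24-inch widescreen, for instance, is less than an inch and a half taller than a 19-inch 4:3 panel; most of the extra glass goes into width.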
Aspect ratio
A display’s aspect ratio is its screen width divided by its height. The majority of desktop
monitors have an aspect ratio of 4:3, regardless of their screen size; and the majority of
software applications and computer games are designed accordingly. This is something
to bear in mind if you’re considering a widescreen model, which typically has an aspect
ratio of 16:10. If content, such as a game, insists on a 4:3 ratio, the display will stretch
the content to fill the entire screen, making everything look fatter than it should. This
situation is becoming less of a problem, as most games support at least one widescreen
mode that won’t look distorted.
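Here's the same point in numbers: a quick illustrative sketch of how much wider (read: fatter) 4:3 content gets when it's stretched to fill a widescreen panel.

```python
# How much does 4:3 content distort when stretched to fill a widescreen panel?
from fractions import Fraction

def horizontal_stretch(content_ratio, screen_ratio):
    # Ratio of the screen's aspect to the content's aspect: above 1.0 means "fatter".
    return float(Fraction(*screen_ratio) / Fraction(*content_ratio))

print(f"4:3 content stretched to 16:10: {horizontal_stretch((4, 3), (16, 10)):.0%} of original width")
print(f"4:3 content stretched to 16:9:  {horizontal_stretch((4, 3), (16, 9)):.0%} of original width")
```

A 20 percent horizontal stretch is obvious on faces and circles, which is why a proper widescreen mode in the game itself is always the better fix.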
Native Resolution
Every LCD sports a fixed number of pixels arrayed in a grid that is a certain number of
pixels high and a certain number of pixels wide. The native resolution is the width of the
display (in pixels) by the height (in pixels). The native resolution will deliver an optimum
picture. While it’s possible to run an LCD at a lower, non-native resolution, the image
will be rescaled and the display will use interpolation to fill in the missing pixels, which
can degrade image quality. Native res and interpolation quality are of particular concern
to gamers, who often run games at low resolutions to get the best frame rate. An LCD’s
native resolution is typically determined by its screen size. For example, many 19-inch
monitors have a native resolution of 1280x1024, while many 20-inch models have a
native resolution of 1600x1200. A higher resolution makes everything look smaller
onscreen, but also gives you more desktop space.
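To see why non-native resolutions go soft, consider a rough sketch of the mapping the LCD's scaler has to perform. Real scalers blend neighboring pixels rather than crudely duplicating them as we do here, but the uneven mapping is the root of the problem: 1280 source columns can't be spread evenly across 1600 physical columns.

```python
# 1280 source columns spread across a 1600-column panel: some columns end up
# one physical pixel wide and some two, which is what makes the image look uneven.
from collections import Counter

SOURCE_W, PANEL_W = 1280, 1600   # e.g. running 1280x1024 on a 1600x1200 panel

copies = Counter()
for panel_x in range(PANEL_W):
    source_x = panel_x * SOURCE_W // PANEL_W   # nearest-neighbor-style mapping
    copies[source_x] += 1

print(Counter(copies.values()))   # Counter({1: 960, 2: 320})
```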
Interface
Today’s LCDs connect to the graphics board via either an analog VGA connector or a
digital DVI connector. If your graphics board is equipped with DVI outputs (most
modern boards are), we recommend you use DVI to connect to your LCD. Unlike CRTs,
which must refresh every pixel on the screen 60-plus times a second, LCDs modify
pixels only when they change. The analog connection is less precise because the digital
information must be converted to an analog stream in order to travel to the LCD, where
it is then analyzed and converted back to a digital format. This is a recipe for data loss
or corruption in the image that is ultimately displayed on-screen.
Pixel response time
This spec has been getting a lot of play lately, so it deserves mention. A pixel’s
response time, measured in milliseconds, describes the time it takes for a pixel to
change from its on state to its off state and then back on again. If the response time is
too slow, you’ll see ghosting and other artifacts because the display’s pixels can’t keep
pace with the information sent from the graphics card. This problem is particularly
noticeable in games, which tend to have fast action sequences. A response time of 25
milliseconds was once the norm, but it’s not uncommon these days to see response
times listed in the single digits. As impressive as this spec sounds, it should be taken
with a grain of salt. Different manufacturers report response times differently, so this
spec isn’t a reliable means of comparing different brands. Some vendors report only the
pixels’ rising (turning on) or falling (turning off) time; others report how long it takes for a
pixel to turn on, turn off, and then turn on again; and still others report the time it takes
for a pixel to go from peak white to full black. (Pixels change from white to black much
faster than they change from gray to gray, but the latter is a more common occurrence
in real-world use.) Because of this inconsistency, we don’t normally report on a display’s
pixel response time, but we mention it here to illustrate a point: Response-time specs
often do not jibe with qualitatively measured performance. The best way to determine
an LCD's abilities with fast-paced content, in our opinion, is to eyeball it first-hand.
Ergonomics
Obviously, the more ability you have to adjust your screen’s height, tilt, and orientation
to fit your body, the better.
Three recommended monitors

Gateway FHD2400 24” LCD


Native Resolution: 1920x1200
Contrast Ratio: 1000:1
Brightness: 400 cd/m²
Inputs: HDMI, VGA, DVI,
Component, Composite, S-Video
Price: $499
Dell Ultrasharp 2707WFP 27” LCD
Native Resolution: 1920x1200
Contrast Ratio: 1000:1
Brightness: 400 cd/m²
Inputs: DVI, VGA, Component,
Composite, S-Video
Price: $850
Dell Ultrasharp 3008 WFP 30” LCD
Native Resolution: 2560x1600
Contrast Ratio: 3000:1
Brightness: 370 cd/m²
Inputs: DVI (2) w/ HDCP, HDMI, VGA, Component,
Composite, DisplayPort
Price: $1999
RECLUSA
Cushy and responsive keys are the hallmark of this latest gaming keyboard from
Microsoft. Designed in conjunction with the wizards at Razer, this board has blue
backlit keys and a pair of USB ports. A set of programmable keys lines the top of the
keyboard, bracketed by two 360-degree jog dials to adjust volume and whatever else
you need.
G15 GAMING KEYBOARD
The Logitech G15 Gaming Keyboard is designed for gamers of all types. FPS players
will benefit from the built-in folding LCD display that shows vital stats, and MMO junkies
have a near limitless number of macro keys to program for their every need.

ECLIPSE II
Saitek’s Eclipse II is a simple, comfortable keyboard with backlit keys and a heavy stay-
put base that gives you extra insurance against mishaps. The backlit keys can also
cycle between three colors at the press of a button. We're just a little bummed out
that the board doesn’t have any extra USB ports.
RAZER TARANTULA
Now this is typing heaven. Ten reprogrammable hot keys (with 10 included weapon key-
covers) and on-the-fly switching between up to five user profiles are just two of the
fantastic features of Razer’s flagship keyboard. The shallow-action keys for fast fingers
round it out nicely.

DAS KEYBOARD II
No frills. No hotkeys. No backlight. There aren’t even any labels on the keys! Instead,
you get the sturdiest keyboard on the planet, and keys that click with the satisfying report
of a finely machined pistol. For arrogant gamers with deep pockets only.
NATURAL KEYBOARD 4000
Some gamers like their keyboards flat, some like them ergonomic. For the latter group,
we recommend the Microsoft Natural 4000 Keyboard. It fits your hands like a glove, with
a unique concave key layout and elevated wrist rest. Carpal Tunnel begone!
G5 LASER MOUSE
The Logitech G5 Laser mouse may just be our favorite mouse ever. With a 2000dpi
laser sensor, buttons for adjusting the sensitivity on the fly, and customizable weights
that load into a slot on the bottom, this mouse is a time tested favorite among the staff in
the office. The newly updated G5 even adds a second thumb button on the left!
HABU GAMING MOUSE
The specs on this animal read like a hardcore shooter fan’s shopping list: on-the-fly
mouse-sensitivity switching up to 2000dpi, an always-on laser to eliminate lag, and two
swappable side panels for large or small paws. A thumb button changes button
assignment profiles on the fly for tactical changes, or for simply turning off buttons that
get in the way of a good fight. Plus, it’s ambidextrous! South paws rejoice!
GAMER HD7600L
We don’t know why companies give mice complex model numbers, but we sure know
that Creative’s Gamer HD7600L is a smoking mouse. Its laser is accurate up to a
2400dpi resolution, which can be toggled all the way down to 400dpi. The laser
automatically reduces power when the mouse is taken off a surface, so you won’t risk
blinding yourself.
FATAL1TY GAMING MOUSE
You get astounding tracking and a smooth ride with this mouse created to the specs of
FPS champ Jonathan “Fatal1ty” Wendel. Features include on-the-fly resolution
changing and interchangeable weights for altering its heft. It’s for the precision gamer
with an itchy trigger finger.
OBSIDIAN WIRELESS MOUSE
While gamers typically shun wireless mice because of the slight increase in latency
compared to corded mice, the Obsidian from Saitek is too slick to pass up. The
1000dpi optical sensor is fine for real-time strategy games and the touch-sensitive scroll
wheel is very easy to use. Wireless range stretches to 10 meters, and the mouse even
comes with two sets of rechargeable batteries.
IDEAZON REAPER
From the folks who brought us the saucy Fang keypad comes a sleek and subdued
gaming mouse with 1600dpi resolution and on-the-fly sensitivity switching with a small
button that’s accessible but nearly impossible to press accidentally. Three well-placed
buttons on the side promote quick grenade tosses or panicky drops to the prone
position. It's light, comfy, and perfect for the semi-pro on a budget.
X52 PRO FLIGHT CONTROL SYSTEM
Oh ho ho. This is a gaming joystick that Tim Allen would approve of. Whether you're
piloting jumbo jets, dogfighting in space, or rumbling around in mechs, there's
no better way to get immersed in a simulation than with Saitek's flagship product. Thirty
programmable buttons, LED-lit switches, and an LCD display are just a few of the
features on this joystick/throttle combination. Flying around in Battlefield 2 will never be
the same again!
INFOCUS SP5000
No mere monitor is big enough for a real gamer. This projector turns what was once a
wall into a gigantic high-definition screen for gaming or movies. Expensive, but
awesome.

SHURE E4C EARBUDS


Because of the steep price tag, we don’t recommend these for everybody – only those
who demand the very best in audio quality. For those people, you can’t beat the E4C’s
passive noise cancellation technology and comfort. Plus, they’re light and ultra-portable!

X-ARCADE TANKSTICK
Play those old arcade classics the way they were meant to be played by pounding,
beating, and punching the nearly indestructible X-Arcade Tankstick (with pinball side
buttons!). Ah, the good ol’ days.
WIRELESS GAMING RECEIVER
The Microsoft Wireless Gaming Receiver lets you use any wireless accessory available
for the Xbox 360 on your PC—including the Wireless Headset, the Wireless Racing
Wheel, and, of course, the Xbox 360 Wireless Controller.
XBOX 360 WIRELESS CONTROLLER
Now the standard gamepad that all Games for Windows–branded games must support,
the Xbox 360 controller also just happens to be comfortable, responsive, durable, and
now wireless, too. Don’t worry—no one’s going to think you’ve sold out to the console
dark side.
LOGITECH G25 RACING WHEEL
Harsh—the G25 Racing Wheel has better appointments than my real car: a leather-
wrapped wheel with 900-degree rotation and two force-feedback motors that have
surprising grit and kick. The leather-wrapped six-speed shifter pushes down with good
resistance, but it’s the hefty, solid pedals that leap far ahead of the toy wheel sets you’re
used to.

MATROX TRIPLEHEAD2GO DIGITAL EDITION


The TripleHead2Go has a DVI and a VGA input on one side, and three DVI outputs on
the other. You connect the little grey box to your PC with a DVI or VGA cable, and plug
three displays into the three DVI outputs. Your PC sees the three displays as a single,
humongously widescreen Franken-monitor, so if your displays are identical and your
videocard can deliver a good framerate at the panoramic resolution you choose (up to
3840x1024), you will be amazed at the image. That’s a lot of eye candy, and if you’re
into flight simulators, hang on to your ’chutes, because you’re really going to be blown
away by being able to glance left and right as if you were looking out port and starboard
windows. TripleHead2Go works as well in Vista as in Windows XP, but be warned that the
effect is far less impressive at some of the more modest resolutions.
SUMOSAC
What’s it like to sit on a 6-foot wide bag of pure comfort? We found out by melting in
Sumo Lounge’s new SumoSac, a massive 55-pound cushion of tangible relaxation. This
new beanbag chair stands 3 feet tall and is wrapped with a soft micro-suede cover,
making it even more comfortable than our previous favorite Omni bag. The shredded
urethane foam filling the bag also never needs to be replaced, so we could imagine
ourselves sprawled on one of these until Kingdom Come.
NOVINT FALCON
The Novint Falcon is a mouse/joystick replacement controller that looks like a robotic
Hershey’s Kiss that waddled straight out of Portal. The action takes place on the
interchangeable ball grip mounted at the front of the unit on three articulated arms,
which are connected to force-feedback motors that update about 1,000 times a second.
You can move the ball in all three dimensions, and the motors add resistance to
simulate surfaces and textures. Sounds pretty dry until you fire up the demo, which
places a giant sphere in the center of the screen for you to fondle as you change its
composition. Turn the sphere to steel and it becomes smooth and impenetrable. Cover
the sphere in sandpaper and you can feel your virtual hand stroking the fine grit. Turn
the sphere into honey and you meet resistance trying to punch through until your hand
emerges from the other side. You can also download Haptics-Life 2, a mod for Half-Life
2 that adds “haptic”—or touch effects—to one of the best games ever made for the PC,
and makes it even better. Landing from a high jump in your buggy causes the grip to
jerk wildly with every impact, and you’d best brace yourself when switching from the
relatively delicate pistol to the Magnum, because the kick delivers a hell of a meaty
punch to the palm.

If you can put together a bed from IKEA, you can build your own PC. Seriously—it’s that
easy. In fact, you can probably do it in less time, and with the same screwdriver. But
while few people curl into the fetal position at the thought of snapping together some
Swedish furniture, lots of folks are intimidated by the prospect of putting together their
own PCs. That’s too bad, because you can begin with a pile of components in the
morning and be playing Unreal Tournament 3 on your new gaming rig the same
afternoon.

That’s not to say that PC-building is for everybody. You do miss out, for instance, on
having a single point of contact for technical support. Then again, technical support can
be as unhelpful as it is unintelligible, and building your own PC means that you get to
call the shots on every component, from the motherboard to the sound system. You pay
only for the parts you want, and you decide what kind of upgrade path to leave for
yourself in the future.

Still, building a PC can be nerve-wracking if you’ve never done it before, or if the last
videocard you installed went in the AGP slot. No worries: in this super-comprehensive
guide to building your own PC—you might even call it the "Ultimate" guide—we'll show
you how to put a PC together, with a lot of "show" and a minimum of yap. We'll even
explain how to benchmark your PC to make sure it's performing up to spec, and give
you our favorite and most reliable tweaking tips, whether you're running Windows XP or
Windows Vista. We’ll admit – putting a PC together can be difficult. But then, so is
constructing your own bed, if you don’t have someone to help you select the right parts
and show you how they go together. Lucky for you, you do!
Prices and Parts list

CASE
Coolermaster Cosmos 1000 $200

MOTHERBOARD
NVIDIA nForce 780i $250

CPU
Intel Q9300 2.5GHz Quad-Core $270

RAM
2GB Corsair PC8500 DDR2 $100

VIDEO CARD
NVIDIA GeForce 9800GTX $300

POWER SUPPLY
Antec TruePower Quattro 850 Watt $200

CPU FAN
Zalman 9700NT $65

HARD DRIVE
Western Digital 500GB SE16 SATA $90

OPTICAL DRIVE
Lite-On 20X SATA DVD R/W $25

TOTAL $1500
(Optional):

MEDIA READER
Sabrent 65-in-1 Flash Card Reader $14

SOUND CARD
Creative X-Fi XtremeGamer $130

BLU-RAY
SONY BD-ROM BDU-X10S $199

ADD. HARD DRIVES


WD 10,000 RPM 150GB Raptor SATA $175
Seagate 7200.11 1TB SATA $220
Step 1
Remove the side panel of your case. This is particularly easy with a case like the
CoolerMaster Cosmos 1000, which has a lever on both sides of the chassis that
releases the side panels when lifted. This can save you a great deal of aggravation if
you frequently need to get inside your case to exchange parts or perform upgrades. If
using another model, we suggest purchasing thumbscrews that let you unscrew the side
panels without using a screwdriver.

Step 2
Screw in the motherboard stands. Your case purchase should include a bundle of
screws packed in the box. Find the narrow screw stands that need to be affixed to the
interior case platform. With motherboards based on the ATX spec (the current design
standard), 10 stands need to be screwed in. The Cosmos case includes a handy
reference sheet to let you know where the screws go. Putting a screw stand in the
wrong slot could short out your motherboard!
Step 3a
Time to install the power supply. Take the power supply out of the box and unravel its
cables. Our power supply is modular, so we only use the cables that we actually need to
power our internal devices. Find the included set of screws in the power supply box.
These will be different from the ones included with your case.
Step 3b
Screw in your power supply. Make sure all the screws are tight so the power supply
can’t wiggle around inside the case. Four screws will do the trick.
Step 3c
With the power supply mounted, this is how the interior of your system should look right
now. Move the cables that are permanently attached to the power supply out of the way for
the next few steps.
Step 4a
On to the motherboard. We’re going to be using a third-party CPU cooler for this build,
which will keep your CPU perfectly chilled for overclocking. The Zalman 9700NT
requires that you install a mounting bracket onto the motherboard. To do this, first align
the bottom part of the bracket to the underbelly of your mobo. Its four holes should line
up with four screw holes surrounding the CPU.
Step 4b
Screw in the top part of the bracket carefully. The corner of the bracket with the extra
niche should match the corner on the motherboard where the CPU locking pin rises
(bottom left in our photo). Make sure you don't screw the top bracket in too tightly, or the
bottom bracket may lose alignment with the screw holes.
Step 5
Move the excess cables coming from your case aside. This includes front-panel
connectors and eSATA cables. We’ll get to these later. A good place to store them for
now is in the optical drive cage, where they can’t get in the way.
Step 6a
One last bit of preparation before putting in the motherboard. Remove the generic
motherboard back panel port plate from the rear of the case. It should easily pop out if
you push it from the outside.

Step 6b
Replace the generic back panel plate with the one included in your motherboard box.
Snap it in by pressing its corners against the case from the inside. You should hear a
distinct click when each corner is snapped into place.
Step 7
Time to install the motherboard! Carefully place the motherboard into the case cavity,
holding it from opposite corners. Angle and slide it in from the front of the case toward
the back, so the back panel ports can fit through the back port plate. Once the
motherboard is aligned and correctly situated, screw it in with 10 small screws that were
packaged with your case. Don’t go overboard when you tighten the screws – it’s not
necessary, and you could crack your mobo if you’re overzealous.
Step 8a
Read your CPU and motherboard instructions carefully to find out how to remove the
socket shield and properly align the processor. In this case, our Q9300 has a tiny
triangle in one corner that must be aligned with the one corner of the socket that’s
missing a metal contact.
Step 8b
Close the socket shield and lock the CPU into place by clamping down on the CPU
socket pin.
Step 9a
Apply thermal paste to the processor before mounting the CPU fan. The Zalman
9700NT fan includes a small bottle of thermal paste, along with a tiny brush in the cap
for easy application. Don’t goop the paste on. Spread it evenly over the processor,
leaving a little bit of space around the edges to spread. Apply a little bit of paste to the
bottom of the CPU cooler as well.
Step 9b
Carefully place the CPU fan on top of the processor, with the fan facing away from the
back of the case (as seen in our photo). This ensures that air flows smoothly from the
front of the case through and out the back. Drop the clamping bridge under the center
of the cooler, and screw in one side with the screws packaged with the cooler. Screwing
in the other side requires a bit of strength, as the clamp will bend to lock the cooler into
place. Don’t press too hard or you’ll risk damaging both your CPU and motherboard. We
recommend screwing both sides in halfway first, then slowly tightening each side.
Step 9c
Plug in the CPU fan adapter to the three-pin port located near the processor socket.
Refer to your motherboard manual for the exact location, as it may vary from model to
model.
Step 10a
Remove a few of the PCI slot shield plates from the back of the case. You only need to
take out the ones that are blocking the holes you’ll need for your video card. With our
9800GTX, we’ll need to remove two shields.
Step 10b
Drop your video card into the PCI-E slot. If your motherboard has more than one slot
(as ours does), check the manual to locate the correct one. In our case, we’re using the
“primary” PCI-Express slot, which has the most bandwidth for graphics. Push the card
straight down into the slot gently and firmly, and make sure the white retention clip
snaps into place. Screw the video card’s bracket into your case.
Step 10c
Pull two six-pin PCI-E power cables from your power supply and plug them into your
video card. Some video cards will only require one power connector. If your power
supply doesn’t have enough PCI-E power cables, you can use a molex adapter that
should be included with your graphics card.
Step 11a
Locate two memory slots on your motherboard. Our build will have two 1GB sticks of
DDR2 memory. You'll want to insert them in alternating slots (color-coded on our
motherboard) to run the memory in dual-channel mode.
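(A quick aside on why dual-channel is worth the extra care: each channel moves 8 bytes per transfer, so two channels roughly double the theoretical peak bandwidth. The sketch below uses the DDR2-1066 rating of our Corsair PC8500 sticks; these are paper peaks for illustration, not measured throughput.)

```python
# Rough peak memory bandwidth: transfers per second x 8 bytes per transfer per channel.
def peak_bandwidth_gb_s(mega_transfers_per_sec, channels):
    bytes_per_transfer = 8   # 64-bit memory bus per channel
    return mega_transfers_per_sec * 1e6 * bytes_per_transfer * channels / 1e9

print(f"DDR2-1066, single channel: ~{peak_bandwidth_gb_s(1066, 1):.1f} GB/s")
print(f"DDR2-1066, dual channel:   ~{peak_bandwidth_gb_s(1066, 2):.1f} GB/s")
```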
Step 11b
Align the RAM so the gap in the row of pins matches the slot on the motherboard. Plug
the RAM into the slots until the locking brackets on both sides snap up. Make sure your
RAM is completely seated, as improperly installed RAM is the source of many system
malfunctions.
Step 12a
Hard drive cages like these are a feature unique to high-end cases like the
Coolermaster Cosmos 1000, and they’re much easier to work with than standard hard
drive bays, which practically have you crawling inside your own case like a hamster.
Remove the thumbscrew and pull out the cage bracket.
Step 12b
Screw the hard drive into the bracket. The washers in the Cosmos’ hard-drive bracket
are made of rubber, which helps cushion the drive against jostles (to a very limited
degree), and also cuts down on noise and rattle.
Step 13a
The Cosmos doesn’t require any tools to install or remove optical drives and drive-bay
accessories. To open a slot, just squeeze the two ends of the front panel plate.
Step 13b
Slide your optical drive in until it’s flush with the front panel. Just press the button on the
side of the interior cage to lock the drive into place.
Step 14a
Plug a SATA cable into your motherboard for every drive you have (including hard
drives and optical drives).
Step 14b
Plug a SATA power cable into your power supply (these should be included with your
PSU). These power cables have a flat SATA power connector on one end.
Step 14c
On all of your hard drives and optical drives, attach SATA and power cables from the
previous two steps. If a single drive includes both a SATA power connector and a
traditional 4-pin Molex connector, use either one, but NOT both.
Step 15a
Gather up the front panel connectors that you put aside at the beginning of this
tutorial. Separate the small ones that correlate with the front-panel power switches from
the nine-pin USB and Firewire cables.
Step 15b
Using your motherboard manual as a reference, plug in the front-panel cables to your
motherboard. Some motherboard manufacturers like ASUS include an adapter that lets
you affix all the cables at once.
Step 16a
Most cases have front panel USB, Firewire, and headphone jacks, among other ports.
Again, look to your motherboard documentation to find out where these go. Just make
sure you don’t plug a USB jack into the Firewire port, which may short the motherboard.
Step 16b
Plug all of your case fans into the various three-pin ports on the motherboard. If your
motherboard doesn’t have enough fan slots, you can always use a Molex adapter to
plug the fans directly into your power supply.
Step 17
Locate the eight-pin power cable from your power supply, which provides power to the
CPU, and plug it into your motherboard. Like the PCI-E connector, this plug is keyed
and should only fit in one way, so there’s no risk of putting it in backwards. Finally, you’ll
need to power up the motherboard by connecting the 24-pin main power connector (as
shown in the photo). With our 850-watt power supply, there should be more than
enough juice to energize our system, with power to spare for any upgrades (i.e., SLI,
additional hard drives, etc.).
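If you're wondering how we can be so relaxed about headroom, here's a rough power budget. The per-component wattages below are ballpark assumptions for illustration (TDP-class figures, not measurements), but they show how far this build sits from taxing a quality 850-watt unit.

```python
# Rough power budget for this build; the figures are ballpark estimates, not measurements.
estimated_draw_watts = {
    "Intel Q9300 quad-core":  95,   # CPU TDP class
    "GeForce 9800 GTX":      140,   # high-end GPU under load (approximate)
    "Motherboard and RAM":    50,
    "Hard drive":             10,
    "Optical drive":          15,
    "Case fans, USB, misc.":  30,
}

total = sum(estimated_draw_watts.values())
psu_rating = 850
print(f"Estimated load: ~{total} W of a {psu_rating} W supply ({total / psu_rating:.0%}), "
      f"leaving plenty of headroom for SLI or extra drives.")
```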
Step 18
The finish line is in sight! This final step isn't necessary, but we like tidying up our
cabling by hiding all the excess and loose wires out of the way. With our case, we can
pull cables and hide them behind the motherboard-mounting platform, or in empty slots
in the hard drive cage. Other cables can be neatly bundled with plastic zip ties too. Neat
cabling arrangements aren’t just aesthetically pleasing – they help improve airflow as
well!
Say hello to our Budget Badass!
So what did $1500 get you? Answer: a quad-core machine that will be able to run
DirectX 10 and has a hell of a lot of upgrade possibilities (not that you’ll need any for a
while).
Benchmarks
How does our DIY machine perform in games? We stress-tested the system with a
variety of benchmarks and games (full benchmarking details on page 88), and here are
the results:
Settings: 1280x1024, all settings on high, no AA or
AF (all games patched to latest version)
3DMark 06: 10882
3DMark Vantage: 5653
Crysis GPU Test: 37.91 fps
World in Conflict: 35 fps
Company of Heroes: Opposing Fronts: 59.4 fps
Half-Life 2: Episode One: 143.9 fps
Settings: 1600x1200, all settings on high, no AA or AF (all games patched to latest
version)
3DMark 06: 10265
Crysis GPU Test: 25.17 fps
World in Conflict: 35 fps
Company of Heroes: Opposing Fronts: 59.3 fps
Half-Life 2: Episode One: 141.2 fps
Settings: 1920x1200, all settings on high, 2x AA and 4x AF (all games patched to latest
version)
3DMark 06: 9982
Crysis GPU Test: 21.44 fps
World in Conflict: 34 fps
Company of Heroes: Opposing Fronts: 58.7 fps
Half-Life 2: Episode One: 134.8 fps
Benchmarking is the first thing you should do once you’ve finished assembling your PC,
smoothed out the wrinkles in the operating system, and installed up-to-date drivers for
all your components. If you drink beer, then at least make benchmarking the second
thing you do once you’re finished, because it tells you whether or not you’ve got a stable
system (if your system is prone to overheating, for example, it’ll likely do so during a
benchmark test); it lets you know if a component is wildly underperforming (a signal of
hardware or driver problems); and it gives you the feedback you need to tweak system
settings in games for the best framerate with the least compromise in visual quality.
There are two ways to benchmark: use a framerate display utility such as Fraps (trial
version available at www.fraps.com) to log your framerates as you play, or use the built-
in benchmarks available in many games. The Fraps method is fine, and we often use it
to see whether or not minor graphics-settings tweaks are doing you any good. But the
built-in benchmarks are generally more reliable, as they max out the action and throw in
every in-game effect in order to gauge your system’s performance during game
sequences that are the most resource intensive; after all, it doesn’t do you much good if
you know that you’re getting a smooth 72 frames per second while crossing a glade if
that figure drops to an excruciating 10 or 12 during battle sequences. Fail! Don’t forget
to run benchmarks frequently as a kind of routine physical for your gaming rig, to make
sure that bad drivers, malware, or hardware problems aren’t bringing your system down.
3DMARK06
3DMark06 is what’s known as a synthetic benchmark, meaning it was created
exclusively for the purposes of benchmarking and is designed to test a broad range of
your hardware’s capabilities. While this means that 3DMark06 isn’t a perfect indicator of
how well actual games will run (with multiple characters and action that frequently
speeds up and slows down), it is a good metric of your system’s overall power that you
can compare against scores online or PC Gamer review systems.
You can download a free trial version of 3DMark06 (currently at build 1.1.0)
at www.futuremark.com. Install, launch the application, and click on the Run 3DMark06
button to the left. You don’t have access to any options unless you purchase the full
version, but the default test at 1280x1024 is what we run anyway. Kick back, watch
some pretty sequences, and when it’s finished, you can compare your results with
others by clicking on “Review your results online.”
CRYSIS
Many games can push high-end gaming systems to their limits. Only Crysis can make
them cry in public as they grovel for mercy. But with the built-in benchmarking tools and
some judicious tweaking, we managed to get an average of 58 frames per second out
of Crysis, at 1600x1200 resolution, from our DIY system. That was with all graphics
settings on Medium and full-screen antialiasing turned off, but as you can see in the
screenshot to the right, it still looks fantastic, and it runs smoothly even during intense
action sequences. Dropping the resolution to 1280x1024 allows us to crank up the
settings to High and still enjoy a silky 38 frames per second!
You’ll find a benchmark that targets your CPU (Benchmark_CPU.bat) and another that
targets your graphics subsystem (Benchmark_GPU.bat) in the Bin32 folder of your
Crysis installation. Double-clicking on either of them starts the benchmark (which loops
four times and displays an average frames per second at the end). If the framerate is
too choppy, launch Crysis and lower the resolution and the graphics-quality settings, or
turn off full-screen antialiasing, and then run the benchmark again.
COMPANY OF HEROES: OPPOSING FRONTS
Launch the game, log in, click on Options, select the Graphics tab, and then click on
Performance Test.
WORLD IN CONFLICT
Launch the game, click on Options, then Graphics, and then click on Run Benchmark.
Note the Advanced tab at the top of the screen, where the advanced graphics settings
can be found—it’s easy to miss.
HALF-LIFE 2: EPISODE ONE
The benchmark built into Half-Life 2: Episode One requires you to “record” a bit of
gameplay that the benchmarking system then runs to measure framerates. But you
might try running Fraps instead, or better yet, download the easy-to-use and free HL:E1
benchmarking utility from HardwareOC at www.hocbench.com. You must already own a
copy of HL:E1 and you need to have launched it at least once before using the
Hardware OC benchmark. Keep in mind that games downloaded through Steam update
themselves automatically, which may affect benchmark comparisons in the future.
THE ESSENTIALS
Driver-by: This should go without saying, but check the manufacturer's website for
every component you installed for up-to-the-minute drivers. If you're not running the
latest drivers for your videocard and/or soundcard, you risk system instability and miss
out on free performance boosts.
BIOS updates: Hardcore tweakers should regularly check for BIOS updates from the
manufacturers of their motherboards. While you're unlikely to get any more frames per
second out of the deal, you may benefit from a more stable system with higher
overclocking potential. Be careful, though—applying the update incorrectly can hose
your motherboard. If this makes you nervous, don't do it.
Swap meet: Although you’ll want Windows to manage the size of its swapfile, you’ll
squeeze more muscle out of your machine if you actually host the file on a separate
hard drive from your primary OS. So, yes, do that.
Clean living: Defragment your hard drive. Set up a schedule for running your favorite
anti-spyware application. Scan your system for viruses. Delete huge files you don’t use
anymore. You know—spring-cleaning stuff.
WINDOWS XP
Low Profile: You can sweep all the background crap—applets, utilities, desktop junk—
under the rug instantly by creating a new user profile dedicated exclusively to gaming.
Name it “Pwner” or something clever like that. Install nothing else under this profile!
Virtual tweaking: For some reason, XP will sometimes default to a generic number for
its virtual memory settings. Right-click on your My Computer icon and select Properties;
click on the Advanced tab, then click on the Performance Settings button, and finally,
click on the new Advanced tab. Here, you'll see either a checkbox that tells Windows to
automatically manage the size of virtual memory or a radio button labeled "System
managed size." Select whichever one your version of Windows offers.
Self service: Many Windows “services” that run in the background are unnecessary
and memory-hogging. Hit up blackviper.com and check out the list of services that are
safe to disable. The tedium is worth the performance boost.
Task Scheduler: You can have XP automate a ton of tasks for you, like virus scanning
and defragmentation. Hit up your task scheduler and make sure that nothing’s set for
your prime gaming hours, or you’ll pay for it in performance!
VISTA
Give yourself a ReadyBoost: You may get a performance bump when using a USB
storage device with ReadyBoost in Windows Vista. Plug in your flash drive, right-click
on the device in Windows Explorer, select Properties, click the ReadyBoost tab, select
Use This Device, and then select the amount of space you’d like Vista to use.
Sideline the Sidebar: Turn off the sidebar before you launch a resource intensive
gaming session. Gadgets like the RSS feed and CPU Meter will tax your processor as
long as you have the sidebar open.
Lose control: While it won’t give you more frames per second in games, disabling
Vista’s spastic User Account Control will save you countless hours of verifying, re-
verifying, and assuring Vista that yes, you really do want to install that program.
Superfetch: This one’s tricky. Depending on how you use your computer, your
performance may suffer as a result of Vista’s default RAM-maximizing pre-caching
system. Try disabling it via the Services Console and benchmarking your system again
to see if you get a little extra perf.
Desktop Search: Vista’s indexing service can be a system hog, as it’s often scouring
your computer at the most inopportune times. Right-click on your hard drive in Windows
Explorer and uncheck “Index this drive for faster searching.”
Service Pack 1: Microsoft recently released Service Pack 1 for Windows Vista. Is this
update necessary? Our tests don’t show any framerate improvements in games, but we
recommend applying the update for a whole slew of security and reliability fixes.
When you tamper with the internal workings of your computer’s parts, you do so at your
own risk. Overclocking can damage, or even destroy, your CPU, motherboard, RAM, or
other system components, and it can void the warranty on those parts. So consider
yourself warned about the potential hazards! That said, you’re unlikely to harm your
hardware if you overclock with extreme caution and care. And following the advice and
instructions we lay out here will help you. So let’s get started!
Determining a CPU’s Speed
There’s simple math that determines the clock speed of any CPU. Each CPU has a
fixed internal number called the clock multiplier. That number multiplied by the reference
clock of the front-side bus determines the stated clock speed of the processor. For
example, an Intel 2.66GHz Core 2 Quad Q6700 has a clock multiplier of 10. The stock
system bus speed for this processor is 1,066MHz. But wait, 1,066MHz multiplied by 10
equals more than 10GHz. What gives? Intel’s front-side bus is quad-pumped, so its
actual reference clock is 266MHz (1,066MHz divided by four). That makes the clock
speed of a Core 2 Quad Q6700 10 times 266MHz for 2,660MHz, or 2.66GHz.
This same math applies to AMD’s Athlon 64 CPUs, although, technically, they have no
front-side bus; instead, a HyperTransport link connects the CPU to the chipset. A
2.6GHz Athlon 64 X2 5000+, for example, operates on a 13x multiplier using a 200MHz
link—the actual HyperTransport link connection runs at 1GHz, as it operates on a 5x
multiplier.
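If the bus-pumping arithmetic makes your head spin, here it is as a few lines of Python. The figures are the ones quoted above, so treat this as a sanity check rather than a tool.

# Effective core clock = (rated bus speed / pumping factor) x multiplier
def core_clock_mhz(rated_bus_mhz, pump_factor, multiplier):
    return rated_bus_mhz / float(pump_factor) * multiplier

# Intel Core 2 Quad Q6700: quad-pumped 1,066MHz bus, 10x multiplier
print(core_clock_mhz(1066, 4, 10))  # ~2,665MHz, sold as 2.66GHz

# AMD Athlon 64 X2 5000+: 200MHz reference clock (no pumping), 13x multiplier
print(core_clock_mhz(200, 1, 13))   # 2,600MHz, i.e. 2.6GHz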
You can overclock both Intel and AMD CPUs by increasing the multiplier setting,
increasing the “front-side bus,” or both. Depending on your situation, a combination of a
multiplier and FSB overclock may give you the best speeds with the most stability, as
your motherboard may simply not be up to running the bus at excessively high speeds.
Multiplier Locking
CPU manufacturers will take measures to ensure that a processor runs at its intended
speed by locking the multiplier. This fixes the multiplier setting, so it cannot be changed
in the BIOS. This is done primarily to keep CPU “re-markers” from selling cheaper parts
as more expensive ones, but it also serves to thwart overclockers.
But not every chip is locked. Intel’s Extreme series of CPUs does not feature multiplier
locking, nor does AMD’s FX series or some of its new Black Edition CPUs. This gives
overclockers who pay the extra price of admission more flexibility in their adventures. A
2.66GHz Core 2 Extreme QX6700 CPU, for example, can be overclocked to 2.93GHz
simply by increasing the multiplier from 10 to 11, without having to resort to front-side
bus overclocking.
The Role of Core Voltage
When you overclock, you essentially run the CPU out of spec. Upping a CPU’s core
voltage allows you to run a CPU way out of spec by further increasing your overclocking
headroom. For example, a stock Intel Core 2 Duo E6600 running at 2.4GHz eats about
1.2 volts. To get the same CPU up past 5.6GHz, one overclocker increased the core
voltage to 1.9 volts. As you can imagine, if AMD or Intel designed a CPU to operate at
a certain voltage, running it higher will greatly decrease the chip's life expectancy.
This is the most dangerous element of overclocking. The worst we’ve personally seen
from overclocking a CPU via its multiplier or front-side bus is instability or a corrupted
OS. But by adding a ton of voltage to a processor, you risk nuking it. Proceed with
caution!
How Do I Know if My CPU Is Overclockable?
As a rule of thumb, a very mature CPU production line will yield parts that are capable
of running at much higher than rated speeds. So, while it’s not a guarantee,
overclockers are generally better off with later-stepped CPUs.
An even better way to determine your processor’s overclocking credentials is to
download CPU-Z (www.cpuid.com). This freeware utility will identify your Intel or AMD
CPU and tell you such nitty-gritty details as the stepping and revision of the proc.
Steppings and revisions are internal labels that Intel and AMD use to denote versions. A
step denotes larger changes while a revision indicates fairly minor tweaks.
Once you find out that your retail 2.4GHz Core 2 Quad Q6600 is a revision G0, you can
rejoice in knowing that it runs cooler and can withstand more heat than the previous B3
stepping. You learn those particular CPU qualities only by doing research, and the
best resources are online overclocking databases. Almost every enthusiast PC site has
a section devoted to overclocking, where users post details of their own experiences
with various CPUs. MaximumPC.com, ExtremeSystems.org, and FiringSquad.com all
include areas for users to discuss overclocking exploits.
STEP 1
BACK UP YOUR DATA
While the risk of hardware loss is generally very low, there’s always the possibility of OS
corruption or data loss.
STEP 2
ENTER YOUR BIOS
Get into your BIOS by hitting the Del, F1, or F2 key during boot. The key will vary by
motherboard, so check your documentation if you’re not sure what to press.
Once in the BIOS, you will need to find the appropriate configuration screens for
overclocking. The screens we refer to in our examples are specific to the EVGA 680i
SLI motherboard—they will differ from BIOS to BIOS. Your mobo manual or an online
search can provide guidance, but often you just need to dig around.
STEP 3
GOOSE YOUR CPU’S MULTIPLIER
One way to overclock your Intel CPU is to increase its multiplier—if it’s unlocked, which
is true for any Extreme-class Intel processor. The downside to doing a multiplier-only
overclock is that there is very little granularity. Taking a 2.66GHz Core 2 Extreme
QX6700 from its stock 10x multiplier to 12x jumps you all the way to 3.2GHz. If you
want to hit 3.1GHz, a multiplier overclock won’t let you do it. Try increasing your CPU’s
multiplier just a notch or two (in our BIOS, the multiplier setting is in Advanced Chipset
Features, System Clocks). Then reboot your system and see how it runs. If your system
crashes or won’t start, see Step 7.
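To see just how coarse those steps are, run the numbers for a quad-pumped 1,066MHz bus (a 266MHz reference clock). This little loop is purely illustrative.

# Each multiplier notch on a 266MHz reference clock is worth a full 266MHz.
for mult in (10, 11, 12):
    print("%dx * 266MHz = %dMHz" % (mult, mult * 266))
# 10x = 2,660MHz (stock), 11x = 2,926MHz, 12x = 3,192MHz -- nothing in between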
STEP 4
INCREASE YOUR FRONT-SIDE BUS SPEED
The other, more likely, way to overclock your Intel CPU is through the front-side bus. By
bumping the FSB beyond its stock 800MHz or 1,066MHz, you increase your CPU’s
clock speed. On the majority of CPUs, this will be the sole overclocking option, as only
the most expensive Intel chips are unlocked. On our EVGA 680i board, we went into
Advanced Chipset Features, FSB & Memory Config. Here, we set the FSB Memory
Clock Mode to Unlinked. This effectively separates the RAM clocks from the front-side
bus. (If your chipset doesn’t allow you to unlink the RAM, you will need to choose an
FSB-to-RAM speed ratio; make sure your choice keeps you within your RAM’s spec.
See Step 6 for more info.) Increase your FSB in 20MHz increments. Reboot with
each increase to see if your machine will boot (if your system crashes or fails to reboot,
see Step 7). With the multiplier set at its stock 10x, we pushed our 2.66GHz Core 2 to
3GHz by increasing the FSB speed from its stock 1,066MHz to 1,200MHz.
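The same formula shows why a 1,200MHz FSB at the stock 10x multiplier lands us at 3GHz; again, this is just the arithmetic from the step above, not a tuning tool.

# FSB overclock at a fixed 10x multiplier: divide the quad-pumped figure by
# four to get the reference clock, then multiply.
for fsb in (1066, 1133, 1200):
    print("FSB %dMHz -> core %.0fMHz" % (fsb, fsb / 4.0 * 10))
# 1,066MHz -> 2,665MHz (stock); 1,200MHz -> 3,000MHz, our 3GHz overclock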
STEP 5
ADD SOME VOLTAGE
We wanted to go beyond the 3GHz we achieved, but our attempts at pushing the FSB
further made our system unstable. There’s still hope for more speed if we increase our
CPU’s voltage. In our BIOS’s Advanced Chipset Features, System Voltages screen, we
can increase the CPU voltage, the chipset voltage, and the memory voltage, in addition
to the voltage of a few other parts. By pushing the CPU voltage of our early-rev
2.66GHz Core 2 Extreme QX6700 from 1.11 volts to 1.39 volts, we’re able to push the
FSB up to 1,333MHz and achieve a stable 3.2GHz CPU speed. How much voltage is
safe? It’s difficult to say, as the number differs among CPUs and motherboards. We
recommend that you troll forums and overclocking databases to see how far people are
going with individual chips. We can’t give general recommendations on voltage as each
CPU has different specs and anything over stock could nuke your chip.
STEP 6
TO OVERCLOCK RAM OR NOT?
So you’re satisfied that the CPU is running far above its rated speed, but now you want
to overclock the RAM. As we noted above, our nForce 680i board offers the option to
run the RAM linked or unlinked. Linking the RAM sets the RAM speed as a ratio of the
front-side bus’s clock speed. The ratios are determined by the chipset, and in our
case, we could choose between FSB: memclock ratios of 1:1, 5:4, 3:2 or Sync mode,
which is fractionally equivalent to running at a 2:1 ratio. Picking any of the settings will
change the RAM speed. For example, if you push your FSB to 1,066MHz and choose a
1:1 ratio, your RAM speed will hit 1,066MHz—if you’re using overclockable memory
(see RAM section on page 24). If you’re not using overclockable RAM, your box will
probably just hard lock. A 5:4 ratio would give you 853MHz, 3:2 generates 711MHz, and
Sync gives you 533MHz. Which is better? Some overclockers report that linked RAM
gives better performance than unlinked. But you’ll have to test your system by running
apps you typically use to determine which setting is the most stable and provides the
best performance for your needs.
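If you want to double-check what a given ratio will do to your memory before you commit to it in the BIOS, the arithmetic is simple enough to script. The ratios below are the ones our 680i board exposes, applied to a 1,066MHz FSB; swap in your own numbers.

# FSB:memory ratios on our nForce 680i board, applied to a 1,066MHz FSB.
fsb = 1066
ratios = [("1:1", 1, 1), ("5:4", 5, 4), ("3:2", 3, 2), ("Sync (2:1)", 2, 1)]
for name, f, m in ratios:
    print("%s -> %dMHz memory" % (name, round(fsb * m / float(f))))
# 1:1 -> 1,066MHz, 5:4 -> 853MHz, 3:2 -> 711MHz, Sync -> 533MHz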
STEP 7
BEEP! BEEP!
No, your system isn’t asking you where the Dagobah system is. That constant beeping
means your overclock failed. With some motherboards, simply powering down by
unplugging the system from the wall or switching off the PSU for a few seconds will get
you back into the BIOS. In some cases, you’ll need to reset the system’s CMOS by
cutting power and then throwing the CMOS-clear jumper or removing and then
reinserting the coin-cell battery.
STEP 8
TEST IT
Just because you booted into the OS doesn’t mean you’re in the clear. You should now
stress-test the system using Prime95 or another application that really stresses the
CPU. You might be tempted to use 3DMark06, but it’s primarily a GPU test, and many
overclocked systems that pass 3DMark06 burn-ins will actually fail under heavy CPU
loads.
Keeping Your CPU Cool
Without adequate cooling, your overclocked rig is as good as toast
Aftermarket air cooling is a fine way to manage CPU temperatures, but only to a point.
Eventually, practicality and performance concerns render air coolers insufficient for
OC’d machines. That’s why there’s liquid cooling. Not only can you reach lower
temperatures when using a liquid-based setup as opposed to air, but you’ll also benefit
from a lower sound profile.
Of course, there’s an obvious caveat: Liquids plus electronics can equal a serious
monetary hit if you have to replace hardware that inadvertently gets wet. Installing a
water-cooling kit in your rig is a delicate process, and the drama only increases if you’ve
never done it before. Sure, you can go with a preassembled liquid-cooling kit, but in our
experience, a majority of these units perform on par with, if not worse than, stock air
coolers. The best liquid cooler we’ve found is CoolIT’s Boreas unit
($450, www.coolitsystems.com). A fancier, fatter version of the company’s Eliminator,
the Boreas uses 12 thermoelectric modules to rip the heat from your molten tubing into
a giant heatsink. Two 12cm fans take care of the rest, allowing the Boreas to beat our
FX-60 test bed’s stock cooler by 20°C at idle and 32°C during our burn-in test.