March 19, 2012, 8:30 AM —
In part 1 of this series I talked about my initial exploration of bringing PC gaming onto the living room TV by using (for the most part) old hardware that was sitting in the closet. Having tasted the joy of combining PC and console gaming in this way, I decided to get serious about the idea.
In my case, getting serious meant purchasing an Alienware X51. The reviews I'd read about the machine were generally positive [check out reviews from The Verge and PC World] and my days of building my own computers are behind me. If you still enjoy skinning your knuckles on circuit boards you could build a faster rig for the money I spent on the X51, but I was looking for something that was readily available and worked 'off the shelf.' I opted for the top-of-the-line X51. $1,249 got me a 3.4 GHz Core i7 processor, 8 GB of RAM, a terabyte of drive space, an Nvidia GeForce GTX 555 graphics card and a slot-loading Blu-ray drive. Hardcore PC gamers will chortle derisively at those specs, I'm sure, but for a casual PC gamer it's a nice system. I was worried about the amount of RAM until I remembered this was just for gaming; I'm not going to have a bunch of programs running at once.
The X51 looks a bit like a black Xbox that's been bulking up. It has an external power supply which means a huge power brick that can be tucked away behind the entertainment center. The back of the system has an HDMI port on the motherboard as well as a mini-HDMI and a couple of DVI ports on the graphics card. There's an array of audio-out ports, too.
Setup consisted, once again, of plugging in an HDMI cable, a network cable and the power cable. I'm using all wireless input peripherals, which makes the setup very clean, though once again I did have to use the extension cable for the Logitech Unifying wireless dongle to get it up high enough to work well. I ran that cable out the back of the entertainment center and then forward to nestle inconspicuously under the TV.
The X51 uses Nvidia's Optimus technology, and I'm not sure why. Optimus lets a computer run on its integrated graphics and switch to the power-sucking discrete GPU only when needed. That makes sense for a laptop running on batteries, but I'm not sure what advantage it brings to a desktop like the X51.
I connected the HDMI cable to the onboard HDMI port. When Windows 7 booted up it 'saw' two desktops. One it properly identified as an HDTV and the other as a "Generic PnP Monitor." This bugged me for no good reason. It just seemed...messy.
For many games, the Optimus tech worked fine and the system was powerful enough to run lots of slightly older games at their highest detail levels. Remember, the TV tops out at 1920x1080 resolution, which helps a bit. If you need to play Battlefield 3 at its highest settings while maintaining 60 FPS so that you can be competitive, you're probably not interested in playing while kicked back on the couch ten feet from the screen and using a wireless mouse anyway.
Some games, particularly new or indie titles, didn't work correctly with the Optimus tech. In some cases they'd set graphics options to very low settings, basing them on the on-board video hardware. That was more of an annoyance than a problem. In other cases the system wouldn't switch over to the Nvidia card at all. The fix for these was opening the Nvidia Control Panel and assigning the GPU to the executable for each problem game. You only have to do this once per game so it's not a huge deal, but it kind of flew in the face of having a "console experience" while playing PC titles.
And then there was Bioware's MMO, Star Wars: The Old Republic (SW:TOR). SW:TOR did not like the Optimus system in this machine, at all. For some reason the in-game 3D renderer (3D as in polygons, not as in funny glasses) was formatting the graphics for a display much wider than 1920 pixels. The result was that characters and the game world were tall and skinny. Oddly, the HUD rendered perfectly; it was just the 3D content that was the problem.
I went back and forth with Bioware tech support for about 10 days trying to solve this problem, to no avail. Finally I gave up, ordered a mini-HDMI to HDMI adapter and moved the HDMI cable from the onboard socket to the one on the GTX 555. Not only did that fix my issue with SW:TOR, but it sidestepped the entire Optimus system and all the little foibles that had come along with it. The system is still whisper quiet while web browsing and doing other non-intensive activities, so I can't see that I'm giving up anything by hooking it up this way. My advice? Don't bother with the onboard HDMI port; go right for the mini-HDMI connector on the video card. The adapter cost me about $5 including shipping from Monoprice.com.
Since banishing Optimus, life has been pretty good. All those Steam games I bought during various holiday sales are starting to get some use. I'm getting great deals on older, but still great, titles by keeping an eye on the gaming blogs, many of which pass on good tips. For instance Mass Effect 3 is a really hot ticket right now, but I hadn't played the first two games. Amazon ran a deal that let me get them both for just under $15. I picked up The Witcher 2 for $15 as well.
I still struggle a little with controls thanks to not having a desk to work on. Lately I've just been holding the wireless keyboard on my lap and either mousing on the couch cushion next to me or using a wireless trackball. The trackball isn't ideal for gaming but I'm getting used to it, and it works anywhere, even with a dog snuggled up alongside you on the couch. The games that work best, though, are the ones that support a gamepad.
Since starting this project I'm spending less and less evening time sitting at my desk. I've even gotten to where I'll check email on the 55" TV from time to time. Watching web video has become a real joy. I know I can get YouTube on the Xbox 360, but I still prefer the interface on a PC. Google offers an Android app that acts as a remote control for YouTube's Leanback mode. And of course, YouTube isn't the only web video source out there.
The Steam Box turned out to be a rumor, but Steam's Big Picture mode is real, and one of the requirements for being included in the program is controller support. When it arrives I'll be ready for it, and I'm hoping it'll prove popular enough to convince more PC game devs to support a game controller. With both Microsoft and Sony delaying the launch of their next-generation consoles, having a PC in the living room is the only way to get better game graphics on the TV in the foreseeable future (unless Nintendo's Wii U surprises us). I'm really pleased with the results of this project and would recommend it to any PC gamer getting tired of sitting at a desk all night.
Read more of Peter Smith's TechnoFile blog and follow the latest IT news at ITworld. Follow Peter on Twitter at @pasmith. For the latest IT news, analysis and how-tos, follow ITworld on Twitter and Facebook.