
Friday, June 12, 2009

Smartphone Buyers Guide: The Best of the Best

via Gizmodo by John Herrman on 6/12/09

As the dust settles from the last two weeks of mobile madness, one question remains unanswered: Which of the new generation of smartphones should you actually buy? We've collected everything you need to know.

We've selected the five phones that most feel like modern handsets to us (the iPhone 3G, the iPhone 3G S, the Palm Pre, the HTC Magic or, as we soon expect, the T-Mobile G2, and the BlackBerry Storm) and broken them down by hardware, software and cost. This is a guide in the strictest sense, meaning we aren't declaring winners or losers, just giving you the information you need to make your own choice. So! On with the matrices.

Phones' hardware specs tend to dominate carriers' marketing, but in many cases they just don't mean much, with a few exceptions: screens, storage, graphics performance and input.

The iPhones and Pre hold a sizable advantage in the screen department, trumping the G2, which doesn't have multitouch, and the Storm, which has an ill-conceived pseudo-multitouch clickscreen that left most reviewers at best underwhelmed, and at worst downright frustrated.

In terms of storage, our phones take two fundamentally different approaches. The iPhone and Pre include healthy amounts of nonremovable storage (in the case of the iPhone 3G S, up to 32GB), which makes sense: if we're going to use our phones as they're marketed (as multimedia devices), we need space. The G2, like the G1 before it, depends on a removable microSD card for file storage, since its built-in memory is measured in megabytes. So does the Storm. This is fine if the carrier bundles the handset with a capacious card; Verizon is good about this. T-Mobile, on the other hand, shipped the G1 with a pitifully small 1GB card, so we'll just have to hope they're more generous with the G2.

Technical 3D ability is actually fairly uniform across this hardware, with the exception of the iPhone 3G S, which is, in this area, a next-gen product. Only Apple and HTC, though, give developers any meaningful kind of access to their handsets' graphics accelerators, meaning the G2 and iPhones (particularly the bulked-up 3G S) will be the sole options for would-be gamers. And of the two platforms, iPhone OS has amassed plenty of serious gaming titles, while Android, let's be honest, hasn't.

The Pre is an obvious standout in that it has a hardware keyboard in addition to its touchscreen. The hardware QWERTY/onscreen keyboard debate is all about personal preference, so whether this is a boon or a burden is up to you. Typing on a screen is an acquired skill, but much more so on the Storm than on the iPhone or G2.

Battery life would seem to be a valuable metric; it's not. The differences in capacity and claimed endurance don't really matter much, since realistically, they all need to be charged nightly.

Note: the Storm is due a minor hardware refresh, possibly quite soon. The main change, it's been rumored, is a different touchscreen.

The greatest hardware in the world couldn't save a phone with shitty software, and your handset's OS is the single largest determining factor in how you'll enjoy your phone. We've explored the differences between the major smartphone platforms at length here, and there's no point getting too far into the specific differences right now.

To summarize: iPhone OS claims advantages in ease of use, its burgeoning App Store, and a respectable core feature set, but falters on multitasking and its lack of ability to install unsanctioned apps. The Pre's WebOS is extremely slick and friendly to multitasking, but its App Catalog is light on content, and its development SDK is somewhat restrictive. Android and BlackBerry OS are both more laissez-faire, letting users install apps from whatever source they choose. Neither of their app stores is spectacular, but Android's is markedly less anemic. More on app stores here.

Carrier preferences will often override prices, but here they are anyway. The Pre and G2 are the most economical options, and the Storm roughly ties the 3G S as the most expensive. (It's easy to underestimate how much a small monthly cost difference adds up over two years.) But again, carrier loyalty (or more likely, disloyalty) and coverage quality are as important as cost. If Sprint's killing your Pre buzz, it could be worth waiting until next year, when Verizon is rumored to pick it up. Likewise, if T-Mobile coverage in your area is patchy, don't worry: by the time T-Mobile actually offers the G2, we'll probably have at least one other functionally identical handset lined up for release elsewhere.
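To make that two-year point concrete, here's a quick back-of-the-envelope sketch in Python. The upfront and monthly figures are made-up placeholders, not real carrier pricing; the point is just that a small monthly gap swamps the sticker price over a contract.

```python
# Hypothetical numbers for illustration only -- not actual carrier pricing.
handsets = {
    "Phone A": {"upfront": 200, "per_month": 70},
    "Phone B": {"upfront": 200, "per_month": 80},  # just $10/month more...
}

def two_year_cost(upfront, per_month, months=24):
    """Total cost of ownership over a typical two-year contract."""
    return upfront + per_month * months

for name, price in handsets.items():
    print(name, two_year_cost(price["upfront"], price["per_month"]))
# A $10 monthly difference works out to an extra $240 over two years,
# which can dwarf the gap in upfront handset prices.
```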

So there you have it: everything you need to know about the latest crop of consumer smartphones. Go forth, and be gouged.

Wednesday, May 13, 2009

Giz Explains: GPGPU Computing, and Why It'll Melt Your Face Off

via Gizmodo by Matt Buchanan on 5/13/09

No, I didn't stutter: GPGPU (general-purpose computing on graphics processing units) is what's going to bring hot screaming gaming GPUs to the mainstream, with Windows 7 and Snow Leopard. Finally, everybody's face melts! Here's how.

What a Difference a Letter Makes
GPU sounds (and looks) a lot like CPU, but they're pretty different, and not just 'cause dedicated GPUs like the Radeon HD 4870 here can be massive. GPU stands for graphics processing unit, while CPU stands for central processing unit. Spelled out, you can already see the big differences between the two, but it takes some experts from Nvidia and AMD/ATI to get to the heart of what makes them so distinct.

Traditionally, a GPU does basically one thing: speed up the processing of image data that you end up seeing on your screen. As AMD Stream Computing Director Patricia Harrell told me, they're essentially chains of special-purpose hardware designed to accelerate each stage of the geometry pipeline, the process of matching image data or a computer model to the pixels on your screen.

GPUs have a pretty long history (you could go all the way back to the Commodore Amiga, if you wanted to), but we're going to stick to the fairly recent past: the last 10 years or so, when, as Nvidia's Sanford Russell tells it, GPUs started adding cores to distribute the workload. See, graphics calculations (the calculations needed to figure out what pixels to display on your screen as you snipe someone's head off in Team Fortress 2) are particularly suited to being handled in parallel.

Here's an example Nvidia's Russell gave to illustrate the difference between a traditional CPU and a GPU: if you were looking for a word in a book and handed the task to a CPU, it would start at page 1 and read all the way to the end, because it's a "serial" processor. Each page would go by quickly, but the job as a whole takes time because it has to go in order. A GPU, which is a "parallel" processor, "would tear [the book] into a thousand pieces" and read them all at the same time. Even if each individual word is read more slowly, the book may be read in its entirety more quickly, because words are read simultaneously.
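If you think better in code than in torn-up books, here's a minimal sketch of the same idea in Python. It uses CPU processes instead of GPU cores and a made-up "book," so it's only an illustration of the serial-versus-parallel split, not real GPU programming.

```python
from concurrent.futures import ProcessPoolExecutor

def count_in_chunk(args):
    chunk, word = args            # chunk is one torn-out "piece" of the book
    return chunk.count(word)

def serial_count(words, word):
    # CPU-style: read the whole book front to back, in order.
    return words.count(word)

def parallel_count(words, word, pieces=8):
    # GPU-style: tear the book into pieces and search them all at once.
    size = max(1, len(words) // pieces)
    chunks = [(words[i:i + size], word) for i in range(0, len(words), size)]
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(count_in_chunk, chunks))

if __name__ == "__main__":
    book = ("lorem ipsum dolor sit amet " * 200_000).split()
    print(serial_count(book, "dolor"))     # same answer either way,
    print(parallel_count(book, "dolor"))   # the pieces are just read at once
```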

All those cores in a GPU (800 stream processors in ATI's Radeon 4870) make it really good at performing the same calculation over and over on a whole bunch of data. (Hence a common GPU spec is flops, or floating point operations per second, measured in current hardware in terms of gigaflops and teraflops.) The general-purpose CPU is better at some stuff, though, as AMD's Harrell said: general programming, accessing memory randomly, executing steps in order, everyday stuff. It's true, though, that CPUs are sprouting cores, looking more and more like GPUs in some respects, as retiring Intel Chairman Craig Barrett told me.
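To make the flops unit less abstract, here's a rough way to get a number yourself: time one elementwise multiply on the CPU with NumPy (assumed installed), counting one floating point operation per element. Real benchmarks are far more careful, but the unit is the same one the GPU spec sheets use.

```python
import time
import numpy as np

n = 50_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

start = time.perf_counter()
c = a * b                          # roughly n floating point multiplies
elapsed = time.perf_counter() - start

print(f"~{n / elapsed / 1e9:.2f} GFLOPS for an elementwise multiply on the CPU")
# A 2009-era gaming GPU advertises theoretical peaks from hundreds of
# gigaflops up to around a teraflop -- that gap is the whole point of GPGPU.
```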

Explosions Are Cool, But Where's the General Part?
Okay, so the thing about parallel processing (using tons of cores to break stuff up and crunch it all at once) is that applications have to be programmed to take advantage of it. It's not easy, which is why Intel at this point hires more software engineers than hardware ones. So even if the hardware's there, you still need the software to get there, and it's a whole different kind of programming.

Which brings us to OpenCL (Open Computing Language) and, to a lesser extent, CUDA. They're frameworks that make it way easier to use graphics cards for kinds of computing that aren't related to making zombie guts fly in Left 4 Dead. OpenCL is the "open standard for parallel programming of heterogeneous systems" standardized by the Khronos Group; AMD, Apple, IBM, Intel, Nvidia, Samsung and a bunch of others are involved, so it's pretty much an industry-wide thing. In semi-English, it's a cross-platform standard for parallel programming across different kinds of hardware (both CPU and GPU) that anyone can use for free. CUDA is Nvidia's own architecture for parallel programming on its graphics cards.
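For a taste of what this actually looks like, here's a minimal sketch that adds two arrays on whatever OpenCL device it finds, written against the pyopencl bindings (assumed installed, along with a working OpenCL driver). The string in the middle is plain OpenCL C; everything around it is host-side bookkeeping.

```python
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()        # picks a GPU (or CPU) OpenCL device
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel: one work-item per array element, all running in parallel.
program = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```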

OpenCL is a big part of Snow Leopard. Windows 7 will use some graphics card acceleration too (though we're really looking forward to DirectX 11). So graphics card acceleration is going to be a big part of future OSes.

So Uh, What's It Going to Do for Me?
Parallel processing is pretty great for scientists. But what about regular people? Does it make their stuff go faster? Not everything, and to start, it won't stray too far from graphics, since that's still the easiest work to parallelize. But converting, decoding and creating videos (stuff you're probably doing more now than you did a couple of years ago) will improve dramatically soon. Say bye-bye to 20-minute renders. Ditto for image editing: there'll be less waiting for effects to propagate with giant images (Photoshop CS4 already uses GPU acceleration). In gaming, beyond straight-up graphical improvements, physics engines can get more complicated and realistic.

If you're just Twittering or checking email, no, GPGPU computing is not going to melt your stone-cold face. But anyone with anything cool on their computer is going to feel the melt eventually.


Super Sport Car Evolution