Digital Home Thoughts - News & Reviews for the Digital Home


All posts tagged "gpu"

Tuesday, June 19, 2012

Performance Of Intel Core i5 3470: HD 2500 Graphics Tested

Posted by Brad Wasson in "Digital Home Talk" @ 11:00 AM

"Intel's Core i5-3470 is a good base for a system equipped with a discrete GPU. You don't get the heavily threaded performance of the quad-core, eight-thread Core i7 but you're also saving nearly $100. For a gaming machine or anything else that's not going to be doing a lot of thread heavy work (e.g. non-QuickSync video transcode, offline 3D rendering, etc...) the 3470 is definitely good enough."

Many of our readers are interested in detailed specifications and performance analyses of CPUs and GPUs. Few sites conduct a more thorough analysis than AnandTech, as they have done for this new Intel Core i5 chip. Their conclusion? Paired with a discrete GPU, it's a fine choice for gaming and other lightly threaded work, but if you spend a lot of time on heavily threaded tasks like offline 3D rendering or CPU-based video transcoding, you'll want to step up to a quad-core, eight-thread Core i7.

Monday, October 31, 2011

A Q&A With Intel Canada on Sandy Bridge's QuickSync Feature

Posted by Jason Dunn in "Digital Home Talk" @ 09:00 AM

Intel's second generation Core i-series processors, referred to by us geeks as "Sandy Bridge CPUs", brought with them a significant boost in overall processing power. What really got me curious though was Intel's QuickSync technology. Intel has a page on their Web site that talks about this technology, but I wanted to dig deeper so I reached out to Intel Canada and Joe Ellis, Market Development Manager for Intel Canada, responded.

DHT: A key feature in the second generation of Intel Core processors, known as Sandy Bridge CPUs in tech circles, is the inclusion of an on-chip graphics processor. One of the benefits of this integration is Intel's Quick Sync video technology. Can you describe what Quick Sync technology is and how it works? Why is it better than a straight CPU-based video encode?

ELLIS: "Intel Quick Sync Video has often been described as "hardware acceleration" technology built into 2nd Gen Intel Core processors. This is partially correct. Traditional hardware acceleration has been enabled through software optimizations for general-purpose CPU resources otherwise shared with multiple PC functions. This approach was widely adopted with the first MMX instruction set in 1995, and resulted in much faster multimedia rendering and playback times - though often at the expense of other computing functions waiting for those same computing resources. Subsequent Intel CPU generations introduced ever more powerful instructions and architectural advancements to accelerate a variety of parallel tasks, but always using processor resources common to every task." Read more...

Tuesday, May 31, 2011

GIGABYTE's HD 6870 Super Overclock Video Card Reviewed

Posted by Jason Dunn in "Digital Home Hardware & Accessories" @ 11:00 AM

"The GIGABYTE HD 6870 Super Overclock is GIGABYTE's elite offering for those looking to pick up an overclocking friendly Radeon HD 6870. GIGABYTE has even started the process for you and boosted the GPU frequency by 50MHz and the memory frequency by 50MHz. This represents a decent boost these days when some companies are labeling video cards "Overclocked" with a mere 10MHz GPU increase and the memory being left at reference frequencies. Kudos to GIGABYTE for giving us higher core and memory frequencies."

If you're looking for a fast, overclocked video card, the latest and greatest from AMD is reviewed in extreme detail over on HardOCP. As much as I like having a powerful video card in my system, what I don't like is the noise that often comes with having that power. The good news is that this card is no louder than the reference design - but the review doesn't explain exactly how loud that is...

Thursday, February 17, 2011

The Real Cost Of High Powered Gaming

Posted by Hooch Tan in "Digital Home News" @ 04:00 PM

"Our feeling was that the usual extrapolations and estimates using minimum and maximum power readings don’t do justice to everyday operation. Therefore, we decided to measure the actual power consumption over a certain period of time and with different usage models, because most people do not just turn on their computers and play games without ever doing something else."

While consoles are designed for a 5-10 year lifecycle, PC gaming hardware, especially on the graphics side, advances much faster than that. New video cards come out every six months (granted, some are just existing chips with the serial numbers rubbed off and a fresh coat of paint), and those at the top end can cost more than your firstborn. With greater power comes greater power consumption, and Tom's Hardware does a check to see just how much of a difference it makes in a real-world situation. Electricity prices being what they are, that expensive toy you just bought may end up costing you far more than you originally thought. Time to reconsider just how important gaming is to you.
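The Tom's Hardware approach of tracking real usage patterns rather than peak numbers is easy to sanity-check yourself. Here's a minimal sketch of that arithmetic; the wattages and the $0.12/kWh rate are my own illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope yearly electricity cost for a gaming PC,
# using assumed figures: 350 W under load, 90 W at the desktop,
# and a hypothetical $0.12 per kilowatt-hour rate.
LOAD_WATTS = 350
IDLE_WATTS = 90
PRICE_PER_KWH = 0.12  # USD, assumed

def yearly_cost(hours_load_per_day, hours_idle_per_day):
    """Annual electricity cost for a mixed daily usage pattern."""
    daily_kwh = (LOAD_WATTS * hours_load_per_day +
                 IDLE_WATTS * hours_idle_per_day) / 1000.0
    return daily_kwh * 365 * PRICE_PER_KWH

# Three hours of gaming plus five hours of desktop use per day:
print(round(yearly_cost(3, 5), 2))
```

Plug in your own card's load and idle draw and your local rate; the point of the article is that the mix of hours matters far more than the peak number on the spec sheet.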

Friday, February 11, 2011

Did Intel's QuickSync Technology Kill CUDA/APP?

Posted by Jason Dunn in "Digital Home Talk" @ 03:00 PM

Take a look at that image above. It's from an article on Tom's Hardware that talks about how Intel's QuickSync technology - which is part of the Sandy Bridge platform - stacks up against GPU acceleration from NVIDIA's CUDA and AMD's APP (formerly Stream) technologies. The numbers above are simply shocking: GPU acceleration using CUDA or APP gives an almost 2x performance increase in HD video transcoding...yet, amazingly, Intel's QuickSync manages to get the same job done 5x faster than CUDA or APP. And let's be clear about something: it's able to do that with an integrated GPU that's part of the overall CPU, not with a beefy, power-sucking, loud graphics card that costs $300+. Read more...

Monday, January 31, 2011

AMD's 5 Watt CPU Wants to Be In Your Next Windows Tablet

Posted by Jason Dunn in "Laptop Thoughts News" @ 10:53 PM

Engadget has a brief news item that says AMD has a 5 watt version of their Fusion APU (that's a CPU plus a GPU for those of you who haven't been paying attention lately) designed for x86 tablets. Meaning, tablets and slates running Windows. With Microsoft's move to port Windows to ARM, is AMD too late? I don't think so - there's a pressing need for low-power hardware to run Windows in a variety of form factors, and x86 compatibility is still critical for application use. What I haven't seen anyone do yet is a power consumption comparison between AMD's new low-power APUs and the comparable offering from Intel in the form of an Atom CPU and an Intel HD GPU. I'd guess AMD would win that, but I'd like to know for sure. Anyone seen anything like that yet?

Wednesday, October 20, 2010

AMD's Llano APU Brings the Firepower

Posted by Jason Dunn in "Digital Home News" @ 08:00 PM

"The whole is greater than the sum of its parts. It's true in sports, just as it is in technology. An accelerated processing unit (APU) is more than just a CPU + GPU. Much more. And, AMD Fusion is more than just a technology product supported by software and hardware vendors. AMD Fusion is about an entire ecosystem changing the computing landscape as we know it."

I've been hearing about AMD's combination CPU/GPU since mid 2008 and it's finally becoming a reality - well, sometime in 2011 at least. The performance looks impressive, so as long as the power consumption and heat are held in check, this could be an impressive piece of hardware.

Thursday, September 30, 2010

Boxee's Switch from NVIDIA's Tegra to Intel CE4100

Posted by Jason Dunn in "Digital Home Articles & Resources" @ 02:05 PM

"The Boxee Box announced at the 2010 CES was based on the Tegra 2. In a post made on my personal blog right after the CES announcement, I had expressed my reservations on how it would be foolhardy to expect the same sort of performance from an app-processor based device as what one would expect from a dedicated media streamer or HTPC. Just as suspected, Boxee had to replace Tegra 2 with a much more powerful SoC. After evaluating many solutions, Boxee and D-Link decided to choose the Atom based Intel CE4100 for the Boxee Box."

A great article on the Boxee Box and how the switch from NVIDIA's Tegra 2 chip to the Intel CE4100 will enable the Boxee Box to really deliver on a high-quality experience in terms of hardware-assisted playback of HD video content. Will the software measure up? My Magic 8 Ball says "It's looking likely". Let's hope that's the case!

Wednesday, September 29, 2010

BFG Liquidating, Screwing Customers Along the Way

Posted by Jason Dunn in "Digital Home News" @ 12:00 AM

"It looks like BFG has finally received the final nail within its coffin and is officially "....winding down and liquidating its business" in their own words. Some of their customers who have recently sent in BFG video cards for RMA have begun receiving the following letter from BFG..."

It's a fairly rare thing to see a major technology brand name go down in flames; more typically they get bought up by another brand and the customers are taken care of. Not this time! I happen to own two BFG 9800GT passively cooled video cards, and both of them died on me (likely due to heat issues). I knew that BFG had exited the graphics card market earlier this year, but when I talked to tech support about six weeks ago, I was told that even if they didn't have the exact model of graphics card in stock that I needed, they'd send me the equivalent product. Imagine my surprise when I got both of my broken cards sent back to me with a note that basically said "Sorry we couldn't fix this - we're going out of business". Kind of sucks! I guess I should be glad they were relatively inexpensive video cards and not $600 monster cards. Anyone else get bitten by the BFG liquidation bug?

Thursday, March 4, 2010

NVIDIA GeForce GTX480 Demo Video

Posted by Jason Dunn in "Digital Home Hardware & Accessories" @ 04:00 PM

"Tom Petersen, Director of Technical Mkt at NVIDIA, describes how the performance of GeForce GTX 480 compares to existing solutions. We highlight the different aspects of the benchmark and how the new GeForce excels in tessellation. Tom also offers a quick peek at how 3DVision Surround is pushing PC gaming forward."

Excited about the new NVIDIA GPUs coming out? This video gives you a bit of what's in store in terms of performance. I couldn't help but notice that the GTX 480 card had two power plugs connected to it. Yikes...I shudder to think how much juice that monster will draw!

Thursday, January 7, 2010

Nvidia Announces Next Gen Tegra

Posted by David Tucker in "Zune News" @ 03:09 PM

Nvidia had their CES press event today and among the announcements were details on the next generation Tegra chip. The Tegra powers, among other things, the Zune HD and from everything I’ve seen it’s a very impressive chip already. The current Tegra runs the ARM11 chip. The new Tegra is going to be running the dual core Cortex-A9.

They also said that there will be 8 independent processors so if I understand the transcript from Engadget correctly, that means 16 cores total on the new Tegra (I didn't understand correctly. Thanks bitbyte for pointing out there are actually 2 Cortex A9 processors). All while only consuming 500 milliwatts. They claim this will allow for 140 hours of music and 16 hours of HD video but of course, that will depend on a lot of factors unrelated to the Tegra alone.
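Those runtime claims are easy to put in perspective with a little energy arithmetic of my own (not from Nvidia's presentation): if the SoC really drew a constant 500 milliwatts, the quoted runtimes would imply the following battery energy for the chip alone, ignoring the display, audio hardware, and the rest of the system:

```python
# Energy implied by Nvidia's quoted runtimes at the claimed SoC draw.
# Assumption: a flat 500 mW, which is surely a worst-case figure,
# not what the chip draws during music playback.
SOC_WATTS = 0.5  # 500 milliwatts

def battery_wh_needed(hours):
    # Energy (watt-hours) = power (watts) x time (hours)
    return SOC_WATTS * hours

print(battery_wh_needed(16))   # 16 hours of HD video
print(battery_wh_needed(140))  # 140 hours of music
```

The 140-hour music figure would require 70 watt-hours at 500 mW, which is laptop-battery territory, so the chip must throttle far below that headline number when it's only decoding audio.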

Nvidia’s biggest announcement for Tegra is that it will be powering all Audi cars starting in 2011. I assume that means the 2012 model years but they don’t specify in the live blog on Engadget. There is no mention of the Zune or other devices specifically but they do say there are 50 devices in the pipe for the new Tegra.

I think a combination of Tegra, Windows Phone 7, and the Zune all on one platform could make for a killer device.

Check out the rest of the live blog on Engadget. They have other announcements involving 3D on our laptops, which I'm sure everyone is excited about, and lots of great pictures and slides from Nvidia's presentation.

Tuesday, October 27, 2009

The Radeon 5870 is a Pretty Fast Video Card

Posted by Hooch Tan in "Digital Home News" @ 02:30 PM

"Two years ago, AMD’s ATI division decided to bow out of the game of building huge, hot chips that were expensive to make, ceding the high-end glory to Nvidia’s GT200 chip. That’s not to say AMD gave up on performance; it instead adopted the mantra of building the best performance GPU within a certain cost and power envelope. The Radeon HD 5800 series, originally code-named RV870, is the culmination of that approach."

If you're looking for some serious pixel pushing power, the newly released Radeon 5870 should be close to the top of your list. While some dual GPU configurations still sit on top, it seems as if ATI is starting to follow the trend that CPUs took 4-5 years ago. Instead of going the faster, hotter, hungrier route, requiring a nuclear reactor and wind tunnel to maintain, lean and mean is what's hot. Admittedly, the 5870 will gulp down a hefty 188 watts of power when in full use (still less than the GTX 295's 289 watts!), but when idle it putters along at a mere 27 watts. I am just glad that they are going this route. Performance is still important, but there are a lot of other factors coming into consideration, like power usage, noise, number of displays and physical size. The 5870 is definitely an enthusiast card, but its legacy, as shown in the lesser models, will benefit us all.
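That gap between the 188 W load figure and the 27 W idle figure quoted above is what actually shows up on your power bill. A quick sketch of what it means over a year, using an assumed $0.12/kWh rate and a made-up daily usage split:

```python
# Yearly running cost of the Radeon HD 5870 alone, from the load and
# idle wattages in the post. The hours per day and the $0.12/kWh
# electricity rate are illustrative assumptions.
LOAD_W = 188   # full-load draw from the review
IDLE_W = 27    # idle draw from the review
PRICE_PER_KWH = 0.12  # USD, assumed

def yearly_kwh(watts, hours_per_day):
    """Kilowatt-hours consumed in a year at a steady draw."""
    return watts * hours_per_day * 365 / 1000.0

gaming_kwh = yearly_kwh(LOAD_W, 2)  # two hours of gaming a day
idle_kwh = yearly_kwh(IDLE_W, 6)    # six hours idling at the desktop
print(round((gaming_kwh + idle_kwh) * PRICE_PER_KWH, 2))
```

The takeaway: because most of a typical day is idle time, that low 27 W idle number does more for your wallet than any shaving of the peak figure would.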

Thursday, October 1, 2009

Do You Really Need More Video RAM Than Regular RAM?

Posted by Hooch Tan in "Digital Home News" @ 02:00 PM

"The focus of this article isn't to dig into the minutia of where your graphics card RAM is being used. Instead, we're more interested in looking at the tangible impact that different amounts of graphics card RAM will have on your gaming experience. Our goal is to let you know exactly what advantage, if any, you can expect from a graphics card that has more RAM on-board."

I do not actually know anyone who has a video card with more RAM than the PC has, but I do know people who still use computers with 256 or 512MB of RAM. If you've been shopping for a video card lately, you have probably noticed that a lot of manufacturers are promoting how much RAM is on the card. Why? Well, RAM is a cheaper "upgrade" to a card than a better GPU. And who can resist sexy numbers like 768 megabytes or 1 or even 2 gigabytes? For gamers, knowing how to make the most of your money is very important and Tom's Hardware does give you an answer at the end. Unfortunately, the trend is towards laptops, notebooks and netbooks, and while they are improving, the opportunity to cherry pick a sweet video card is quickly diminishing.

Saturday, August 8, 2009

AMD Comes Full Circle. CPUs are GPUs are CPUs.

Posted by Hooch Tan in "Digital Home News" @ 01:00 PM

"AMD has announced the release of the first OpenCL SDK for x86 CPUs, and it will enable developers to target x86 processors with the kind of OpenCL code that's normally written for GPUs. In a way, this is a reverse of the normal "GPGPU" trend, in which programs that run on a CPU are modified to run in whole or in part on a GPU."

Parallel processing is a hot trend in computing. Taking advantage of the horsepower available in video cards has become more and more desirable, especially to accelerate media processing. Both NVIDIA and AMD have been working hard to provide a development platform that allows you to use their video cards for more than just pretty 3D graphics, but AMD has just taken it a step further, extending their platform to support x86 processors. The biggest benefit I see is that it makes OpenCL much more universal, giving developers more incentive to use it. Those of us with the extra horsepower can then see a great boost in more programs, but those of us without will not be left behind.

Wednesday, March 11, 2009

Photoshop CS4 Performance Examined

Posted by Hooch Tan in "Digital Home News" @ 07:00 PM

"If you've ever used Photoshop, and watched the little progress bar crawl across the screen when you apply a filter, you've no doubt wished for better performance. Performance in photo editing applications has become a little more complicated, partly because there are more photo editing apps out today, but also because the graphics chip companies are in the game, accelerating portions of current generation photo editing software."

Not being a hard-core photo editor, I've never really found myself waiting while doing anything in Photoshop. Of course, simple crops, red-eye removal and copying and pasting images together is most of what I've done. I have to admit that upon hearing that Photoshop CS4 utilizes your GPU, I had to wonder just how useful it would be. So I'm naive and ignorant. ExtremeTech took four computers with various hardware, though all fairly powerful, to see what difference the GPU integration makes. The results are quite enlightening and definitely make the case for those who edit a lot of high resolution photos to trick out their rig and upgrade to CS4. It's beyond my means or needs, but perhaps some of you could be tempted!
