Nvidia criticizes Intel's Larrabee

by Rapter posted Aug 24, 2008


To boil it down, the conclusions are:

1. Intel is trying to build Larrabee by cramming in 32 x86 cores, but can it really get 32-core parallel scaling to work well across all kinds of applications?
2. Multi-processing has been an unsolved problem for 30 years; what makes Intel think it will do any better?
3. Larrabee, due in 2010, will perform roughly on par with the graphics cards Nvidia and AMD shipped around 2006.
4. Nvidia admits it actually underestimated AMD's Radeon 4000 series.
5. AMD's Fusion processor plan is a waste of effort.

That's about the gist of it. A few of the points are sharp, but personally, what Rapter would say to Nvidia is: "Take a hard look at where you stand right now, recognize that you are in the most unfavorable position, and start working out how you'll stay competitive in the future."

- Original article below -

Nvidia has delivered a scathing criticism of Intel's Larrabee, dismissing the multi-core CPU/GPU as wishful thinking - while admitting it needs to catch up with AMD's current Radeon graphics cards.
Andy Keane, general manager of the company's GPU computing group, spoke to reporters at the company's headquarters in Sunnyvale, California, ahead of the opening of the annual NVISION expo on Monday.

"There's an incredible amount about Larrabee that's undefined," explained Keane, commenting on the specifications so far released. "You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'"

"Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. "

Keane dismissed the idea that the Intel Parallel Studio, announced last week at IDF, might solve the problem. "Multi-processing is a hard problem in computer science. It's been there for 30 years. It's not answered by software tools."

Unrealistic performance projections

John Mottram, chief architect for the company's GT200 core, raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as marketing puff.

"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it sn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."

"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."

"Intel is not a stupid company," he conceded. "They've put a lot of people behind this, so clearly they believe it's viable. But the products on our roadmap  
    
ADVERTISEMENT

are competitive to this thing as they've painted it. And the reality is going to fall short of the optimistic way they've painted it."

"As [blogger and CPU architect] Peter Glaskowsky said, the 'large' Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI."

Fighting back against ATI

Mottram then turned to the company's most direct competitor, ATI, the graphical division of AMD whose RV770 GPU has put Nvidia's GT200 based cards on the back foot.

"We underestimated ATI with respect to their product," he admitted. "We've looked very closely at this, and we know there are certain things we can do better. There will be improvements to things from all angles: there are some easy fixes in the software domain that will soon be forthcoming. Believe me, it's a very prime focus of ours."

Mottram ascribed the company's current embarrassment, at ATI's hands, to its earlier successes. "ATI has had the benefit for a long time of seeing Nvidia's products and having something to shoot for," he argued, "while Nvidia has not had the benefit of having someone to be shooting after."

He also predicted that ATI would regret its focus on raw graphical power at the expense of more general-purpose capabilities.

"ATI did not spend on things like PhysX and CUDA. But we believe that people value things beyond graphics. If you compare only on graphics, that's a relative disadvantage to us, but the notion of what you measure a GPU on will change and evolve," he argued.

"We're forward-looking. And sometimes, when someone's forward-looking, they get a little bit ahead of the game. And that's kind of where we are."

Fusion a flop?

Mottram also scorned AMD's forthcoming Fusion platform, which will combine a GPU and a CPU on one die.

"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."

Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."

"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense."

Darien Graham-Smith in San Jose
