Jul 3, 2008

Intel exec says NVIDIA's CUDA will be a "footnote" in history

Intel dismisses the threat of GPGPU computing, saying that Larrabee’s focus on IA cores will win out in the long run.
Intel has revealed that it sees no place in the future of computing for general-purpose GPU (GPGPU) programming models such as Nvidia’s CUDA, which powers Stanford’s Folding@home client for Nvidia GPUs, saying that programmers don’t have the time to learn how to program for radical new architectures.
In a Q&A session marking Intel’s 40th birthday, we asked Pat Gelsinger, Intel’s senior vice president and co-general manager of the company’s Digital Enterprise Group, where he saw GPGPU languages such as CUDA in the future. He said that they would be nothing more than ‘interesting footnotes in the history of computing annals.’ ‘The problem that we’ve seen over and over and over again in the computing industry is that there’s a cool new idea, and it promises a 10x or 20x performance improvement, but you’ve just got to go through this little orifice called a new programming model,’ Gelsinger explained to Custom PC. Those orifices, says Gelsinger, have always been ‘insurmountable as long as the general purpose computing models evolve into the future.’
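To see the kind of shift Gelsinger is describing, consider a minimal, hypothetical sketch of vector addition written for CUDA (the function names here are ours, purely for illustration, not from the article): a familiar C loop becomes a kernel launched across thousands of GPU threads, with data copied explicitly to and from a separate device memory.

```
#include <cuda_runtime.h>

// The plain C loop most programmers already know.
void vec_add_cpu(const float *a, const float *b, float *c, int n) {
    for (int i = 0; i < n; i++)
        c[i] = a[i] + b[i];
}

// The CUDA version: the loop disappears, and each of thousands of
// GPU threads computes a single element.
__global__ void vec_add_gpu(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
    if (i < n)
        c[i] = a[i] + b[i];
}

// Launching the kernel also means managing a separate device memory
// space, another part of the shift Gelsinger is talking about.
void run_on_gpu(const float *a, const float *b, float *c, int n) {
    float *da, *db, *dc;
    size_t bytes = n * sizeof(float);
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);
    vec_add_gpu<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);
    cudaFree(da); cudaFree(db); cudaFree(dc);
}
```

Even for this trivial operation, the programmer has to reason about thread blocks, grids and a second memory space; that learning curve is the ‘orifice’ Gelsinger is referring to.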
Gelsinger pointed to the Cell architecture in the PlayStation 3’s CPU to prove his point. ‘It [Cell] promised to be this radical new computing architecture,’ said Gelsinger, ‘and basically years later the application programmers have barely been able to comprehend how to write applications for it.’

This, according to Gelsinger, is one of the major reasons why Intel’s forthcoming Larrabee graphics chip will be based entirely on IA (Intel Architecture) x86 cores. ‘Our approach to this has been to not force programmers to make a radical architectural shift,’ explained Gelsinger, ‘but to take what they already know – this IA-compatible architecture – and extend the programming model to comprehend new visual computing data-parallel-throughput workloads, and that’s the strategy that we’re taking with Larrabee.’ Larrabee, according to Gelsinger, will simply expand on a standard programming model.
‘It’s an IA-compatible core,’ explained Gelsinger, ‘and we’re extending it with a graphics vector visualisation instruction set that has full support for native programming models, full support for the graphics APIs like DX and OpenGL, and then this broad set of new programming models to go with it.’

Gelsinger claims that the ISVs (independent software vendors) currently dealing with Larrabee have responded with ‘nothing but sheer passion and enthusiasm for that direction.’ As such, he added that ‘we expect things like CUDA and CTM [AMD’s Close To Metal] will end up in the same interesting footnotes in the history of computing annals – they had great promise and there were a few applications that were able to take advantage of them, but generally an evolutionary compatible computing model, such as we’re proposing with Larrabee, we expect will be the right answer long term.’
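The evolutionary route Gelsinger describes keeps the familiar loop on an x86 core and simply widens it with vector instructions. As a rough, hypothetical illustration only, here is the same vector addition using SSE intrinsics; Larrabee’s actual vector instruction set had not been disclosed at the time of writing, so SSE stands in for it here.

```
#include <immintrin.h>

// The same loop on an x86 core, widened with vector instructions.
// SSE is used purely as a stand-in for Larrabee's undisclosed
// vector instruction set; the structure of the code is unchanged.
void vec_add_sse(const float *a, const float *b, float *c, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);   // load four floats
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(c + i, _mm_add_ps(va, vb));
    }
    for (; i < n; i++)                     // scalar tail
        c[i] = a[i] + b[i];
}
```

The point of the contrast is that the programmer keeps a single memory space and a recognisable loop; only the inner arithmetic changes.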
Could GPGPU computing really take off in the future, or is Intel right in saying that a standard x86-based programming model will win out in the end? Let us know your thoughts.
