Presentation Transcript

A Game Developer Perspective on Cutting-Edge Graphics Hardware: 

GH2004 Panel Discussion

Panelists: 

- Rune Vendler, Lionhead Studios
- Julien Merceron, Ubisoft Entertainment
- Rémi Arnaud, Sony R&D
Moderator: Mark Harris, NVIDIA

Rune Vendler: 

Technical Lead, Lionhead Studios. Rune is currently leading the development of a major unannounced title. Prior to this, he was Head of R&D at Lionhead, and has contributed to the Black & White series (PC) and Fable (Xbox). Rune has logged eight years in the games industry. His background is in simulation and rendering, but the diverse nature of Lionhead's games has led him into artificial intelligence work as well.

A Game Developer Perspective on Cutting-Edge Graphics Hardware: 

Rune Vendler, Lionhead Studios
Graphics Hardware 2004

Interesting times: 

- Graphics hardware is growing up:
  - Doubling in speed roughly every 9 months
  - Growing and maturing an impressive set of features
- Consumer hardware is no longer just a graphics-accelerator toy
- It is rapidly becoming a new class of processor: the stream processor (see the sketch below)
  - Tailored for parallel processing of independent data
  - With its own performance characteristics
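To make the stream-processor idea concrete, here is a minimal C++ sketch (mine, not from the talk): a kernel is a pure function applied independently to every element of a data stream, which is exactly the structure a GPU can spread across thousands of parallel threads.

```cpp
#include <cstdio>
#include <vector>

// Stream-processing model in miniature: a kernel (a pure function of one
// element) mapped over an input stream with no cross-element dependencies.
struct Particle { float x, vx; };

Particle advance(Particle p, float dt) {   // the "kernel"
    p.x += p.vx * dt;
    return p;
}

int main() {
    std::vector<Particle> stream{{0.f, 1.f}, {2.f, -1.f}, {5.f, 0.5f}};
    // On a CPU this is a loop; on a stream processor each iteration
    // becomes an independent thread of execution.
    for (auto& p : stream) p = advance(p, 0.016f);
    for (const auto& p : stream) std::printf("%f\n", p.x);
}
```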

Freedom to be different: 

- Three shader models
- More operations, more resources, more precision
- We are no longer constrained to fixed graphics algorithms
- We can be creative: custom lighting, shadowing and animation algorithms (see the sketch below)
- More outlandish ideas: physics, collision, maths
- The goal is games that are beautiful and unique
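As one example of the "custom lighting" freedom that programmable shading buys, here is a half-Lambert diffuse term, written in C++ for readability rather than in a shading language. This particular model is my illustration, not one named by the panel.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Half-Lambert diffuse: a wrap-around remap of N.L that softens terminators.
// Trivial in a pixel shader, impossible in the fixed-function pipeline.
float halfLambert(Vec3 n, Vec3 l) {
    float wrap = dot(n, l) * 0.5f + 0.5f;  // remap [-1, 1] to [0, 1]
    return wrap * wrap;                    // square to restore contrast
}

int main() {
    Vec3 n{0.f, 1.f, 0.f}, l{0.f, 0.7071f, 0.7071f};
    std::printf("diffuse = %f\n", halfLambert(n, l));
}
```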

Disappointing results: 

Have games really embraced this? No: so far, the use of programmable shading has been fairly superficial overall. Why?
- Games take a long time to make
- Requirements and design for graphics might already be fixed
- Games require robust, generic solutions, ideally easy to integrate into an existing codebase

This will change: 

We do follow what is happening in academic institutions and research labs, and it does filter down to the games. Many techniques just aren't quite ready yet, unless you are willing to design your game with them in mind.

Promising techniques: 

An imaginary next-generation engine, 2006/2007 timeframe: roughly 3 x 9-month doublings = 8x GPU power. Techniques to investigate, problems to tackle:
- Precomputed radiance transfer (see the sketch below)
- Generalised shadowing
- Global illumination
- Occlusion testing
- Order-independent transparency
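For the first item, a minimal sketch of what precomputed radiance transfer looks like at runtime (my illustration; the slide only names the technique): per-vertex transfer coefficients, precomputed offline over a spherical-harmonic basis, are dotted with the SH-projected lighting each frame.

```cpp
#include <array>
#include <cstdio>

constexpr int kSH = 9;  // 3 spherical-harmonic bands

// Runtime PRT shading: exit radiance is the dot product of precomputed
// per-vertex transfer coefficients with the SH projection of the lighting.
float prtRadiance(const std::array<float, kSH>& transfer,
                  const std::array<float, kSH>& light) {
    float r = 0.f;
    for (int i = 0; i < kSH; ++i) r += transfer[i] * light[i];
    return r;
}

int main() {
    std::array<float, kSH> transfer{0.8f, 0.1f, 0.2f};  // dummy coefficients
    std::array<float, kSH> light{1.0f, 0.3f};
    std::printf("radiance = %f\n", prtRadiance(transfer, light));
}
```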

Almost there: 

GPUs have come a long way, but there is still a little left to go:
- Complete set of precision modes
- Orthogonality of the feature set
- Generalisation of resources
- Virtualization of memory
- Better performance

Conclusion: 

The games industry is embracing new graphics hardware and techniques, but big steps often happen only between generations of games. The next generation is the first to be designed wholly around programmable hardware, and this will mean a considerable difference in how that hardware is used. These are indeed interesting times!

Julien Merceron: 

Worldwide Technical Director, Ubisoft Entertainment. Julien started developing on the Atari Jaguar in 1993 at Shen in Paris. He joined Ubisoft Entertainment in 1994 as a programmer on Rayman for the Jaguar and PlayStation, and has since worked on many titles for Ubisoft. Julien is fond of hardware architecture, programming languages and algorithms, and loves designing game engines and production pipelines.

A Game Developer Perspective on Cutting-Edge Graphics Hardware: 

Julien Merceron, Worldwide Technical Director, Ubisoft Entertainment

Slide14: 

Graphics evolution on PC. From the GeForce 2 (early 2000):
- 0.18 micron technology, 25 million transistors
- 25 million polygons per second
- No shaders
…to the GeForce FX (early 2003):
- 0.13 micron technology, 125 million transistors
- 200 million polygons per second
- Shader Model 2.0
Graphics hardware is in constant evolution.
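For scale (my arithmetic, not on the slide): 200M / 25M is an 8x polygon-rate jump over roughly 36 months, i.e. a doubling about every 36 / log2(8) = 12 months, broadly consistent with the "doubling roughly every 9 months" figure quoted earlier in the panel.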

Slide15: 

PC graphics hardware:
- Doom 3 wouldn't be there without it
- Shortens innovation cycles
- Keeps the PC competitive and synchronous with consoles
The will to innovate is there:
- Sources of inspiration everywhere
- Huge eye-candy & gameplay potential
Staying on the cutting edge is difficult but pays:
- Strong selling point
- Strong development basis
- Easier generation transitions

Slide16: 

Today:
- Nice shadows: shadow buffer; adaptive; attenuated; alpha-sensitive; soft
- Per-pixel lighting
- FP resolution: HDR becomes possible
- Pixel processing power: actual shader capabilities
- Cinematic effects: full-screen z-buffer depth of field (see the sketch below); motion blur; color alteration
- Polygonal geometry for runtime
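A minimal sketch of the "full-screen z-buffer depth of field" idea (my own; the thin-lens parameter names are not on the slide): each pixel's scene depth is mapped to a circle of confusion that drives its blur radius.

```cpp
#include <cmath>
#include <cstdio>

// Thin-lens circle of confusion: depth read from the z-buffer drives the
// per-pixel blur radius of a full-screen depth-of-field pass.
// z, focusDist and focalLen share the same units; aperture is lens diameter.
float circleOfConfusion(float z, float focusDist, float focalLen, float aperture) {
    return std::fabs(aperture * focalLen * (z - focusDist) /
                     (z * (focusDist - focalLen)));
}

int main() {
    // A pixel far from the focal plane blurs much more than one near it.
    std::printf("far: %f  near: %f\n",
                circleOfConfusion(10.f, 5.f, 0.05f, 0.02f),
                circleOfConfusion(5.1f, 5.f, 0.05f, 0.02f));
}
```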

Slide17: 

Not close enough to CG rendering quality:
- Memory still limits texture types & resolution, and FSAA
- New lighting / material interaction models
- World animation at runtime

Slide18: 

Inner-frame computing:
- Graphic sub-sampling
- Animation detail appears even at high speed
- High animation production values needed

Slide19: 

Images Copyright Pixar

Slide22: 

Solving that completely:
- More processing power
- Hardware transparency sorting

Slide26: 

Tessellation according to distance and normal maps.

Rémi Arnaud: 

Graphics Architect, Sony Computer Entertainment America. Rémi started in the R&D team of Thomson Training & Simulation, then moved to SGI, where he worked on the Performer API. He co-founded Intrinsic Graphics after it became clear that the same software issues he had solved for high-end customers at SGI would now apply to the videogame industry. Rémi now leads development of next-generation graphics APIs at Sony. He received a PhD from the Pierre & Marie Curie University in Paris in 1994.

Sony US R&D: 

Foster City, California:
- Next-generation PlayStation R&D
- 35 people, several projects
- Lots of internal projects: EyeToy, COLLADA

HW Side Effects…: 

- Content complexity
- Content size
- Larger teams
- Shrinking production schedules
- New specialized jobs (shaders)
- Development tools yet to be invented

Current model: 

DCC tool (source) -> exporter script -> intermediate (app. data) -> conversion -> final asset -> application, plus a viewer on a fast path and dev. tools working on the intermediate data.

Communication deficiency: 

No existing standard interchange format:
- Source data is in DCC proprietary formats
- No collaboration to create such a format
- Too much work creating exporters
- Content creativity limited by the lack of this technology
Import is equally or more important than export.
Need validation capability:
- Syntax to be validated, regardless of the content
- Need to enable verification of exporter and importer quality
Need real interchange capability:
- Can use a variety of tools
- No loss: can be used as source data
- Opening the door to a variety of specialized tools

Introduction to COLLADA: 

- COLLADA: COLLAborative Design Activity
- A Sony Computer Entertainment US R&D project
- 'Open source' project: no strings attached, no hidden agenda
- Multi-platform, for all targets
- Work started exactly a year ago, at SIGGRAPH '03
- SIGGRAPH '04: COLLADA 1.0 specification; open-source COLLADA (www.collada.org)
- COLLADA 1.1 later this year
- GDC '05: COLLADA 1.2; the feature list depends on feedback

Current contributors: 

- SCEA, SCEE, SCEI, Naughty Dog, Insomniac, …
- DCC tools: Alias*, Discreet*, Softimage*, …
- Middleware: Criterion*, Havok, Hybrid, Quazal, Emdigo*, Metrowerks, Novodex*, Virtools, Vicarious Visions, …
- Game developers: Digital Eclipse, Electronic Arts, Epic, Secret Level, Ubisoft, Vicarious Visions*, …
- Movie industry: dnahelix, PDI/DreamWorks, …
(* presented at SIGGRAPH)

What makes COLLADA?: 

- COLLADA specification document
- XML schema, used to validate COLLADA files (see the sketch below)
- Source code for an OpenGL viewer with Cg 1.2 support
  - Reference code, not intended for production
  - For Windows, Linux and MacOS
- Sample data and utilities
- DCC importers/exporters directly from the vendors: 3ds Max, Maya, XSI
- www.collada.org
- www.collada.org/2004/COLLADASchema
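To show what schema validation buys in practice, here is a small C++ sketch using libxml2's XML Schema API. The library choice and the file names are my assumptions; the slide only says the published schema is used to validate COLLADA files.

```cpp
#include <libxml/xmlschemas.h>
#include <cstdio>

int main() {
    // Parse a local copy of the published schema (placeholder file name).
    xmlSchemaParserCtxtPtr pc = xmlSchemaNewParserCtxt("COLLADASchema.xsd");
    xmlSchemaPtr schema = xmlSchemaParse(pc);
    xmlSchemaValidCtxtPtr vc = xmlSchemaNewValidCtxt(schema);

    // Validate a document's syntax regardless of its content.
    int rc = xmlSchemaValidateFile(vc, "asset.dae", 0);  // 0 = valid
    if (rc == 0) std::printf("valid\n");
    else         std::printf("invalid (%d)\n", rc);

    xmlSchemaFreeValidCtxt(vc);
    xmlSchemaFree(schema);
    xmlSchemaFreeParserCtxt(pc);
    return rc;
}
```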

COLLADA model: 

DCC tools -> import/export (schema validation) -> COLLADA data -> conversion -> final asset -> application, with a viewer on a fast path, dev. tools and 3rd-party tools all working on the same COLLADA data.

COLLADA 1.0 requirements: 

- Include all the major features: geometry, hierarchy; materials, textures, shaders, lights, cameras; techniques, multi-representation, assets
- Accepted and supported by the major DCC vendors: Alias, Discreet and Softimage
- Validated with developers: lots of feedback incorporated
- Needs acceptance: collecting feedback from the public forum

Pandora's box opened: 

- Conformance test: missing even from the 1.0 release
- Dynamic content: animations, physics
- Asset management: still working on files; element asset management; distribution model; real database storage; distributed content
- Shader development: Cg? HLSL? GLSL? Sh? A low-level interface? Development tools needed (an IDE? specialized tools?)

Contact information: 

remi@playstation.sony.com
collada@collada.org
www.collada.org

Slide39: 

Q&A

Research: 

Graphics research shows the way to go; use our knowledge to get the effect without its cost.
To see solved:
- Dev/art constraint: getting rid of dealing explicitly with transparency: no more sorting issues; allows per-pixel alpha-sensitive materials.
- Getting closer to CG: no more pixel color saturation; texture resolution, advanced filtering & FSAA; inner-frame computing & blur. (See the next slide for technical explanations; illustrations & general explanations go here.)
- Soft shadowing.
- New lighting / material interaction models.
From R&D to integration: working closer together (always helps); use of software standards; use of game hardware.

Graphics Hardware: 

Latest most significant improvements:
- Floating-point resolution for computations: less saturation, more per-pixel effects.
- WGF architecture.
Looking forward to seeing:
- HDR "free" in f32f32f32f32 format.
Crazy idea that will change how games look 10 years from now: inner-frame computing (see the sketch after this list):
- N frames created for each one displayed, with per-pixel blur blending.
- Determine N based on inner-frame animation keyframe densities & physical object trajectories (to avoid obvious artefacts).
- Sub-sampling, as you do with physics/collisions.
- Will require high-quality animation, which should be true in two console generations' time.
- From a hardware architecture standpoint we're almost there, but from a processing-power one we're not.
Real-time tessellation according to normal-map information on the silhouette.
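A toy C++ sketch of the inner-frame idea as described (the structure and names are mine): render N sub-frames per displayed frame and blend them per pixel, so fast motion integrates into blur rather than strobing.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// "Inner-frame computing" in miniature: N sub-frames rendered per displayed
// frame, blended with equal weights so fast motion integrates into blur.
struct Image { std::vector<float> rgb; };  // flattened RGB buffer

// Stub standing in for the engine's scene render at a sub-frame time.
Image renderAt(float time, std::size_t pixels) {
    return Image{std::vector<float>(pixels * 3, 0.5f + 0.1f * time)};
}

Image displayedFrame(float start, float dt, int n, std::size_t pixels) {
    Image out{std::vector<float>(pixels * 3, 0.f)};
    for (int k = 0; k < n; ++k) {
        Image sub = renderAt(start + dt * k / n, pixels);
        for (std::size_t i = 0; i < out.rgb.size(); ++i)
            out.rgb[i] += sub.rgb[i] / n;   // equal-weight per-pixel blend
    }
    return out;
}

int main() {
    Image f = displayedFrame(0.f, 1.f / 60.f, 4, 2);  // N = 4 sub-frames
    std::printf("first pixel r = %f\n", f.rgb[0]);
}
```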

Games: 

From the beginning of 3D until now:
- Lighting:
  - Geometry & material aspects were statically hand-drawn into the color textures.
  - Lightmaps appeared, along with texture-projection shadowing.
  - Color texture + normal map + height map + gloss map + light map + occlusion maps + shadow-buffer techniques => per-pixel shadowed virtual displacement mapping (parallax mapping; see the sketch below).
- Materials: making materials react as they look: coherency. Coherency is appearing in games.
Next steps:
- Lighting: per-pixel self-shadowing: angle & horizon maps for per-pixel diffuse self-shadowing? Per-pixel spherical harmonics? Advanced lighting models.
- Materials: you look like you sound and like you behave.
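A small C++ sketch of the parallax-mapping step named above (the offset-limited variant; the sampler stub and constants are mine): the height map shifts the texture coordinate along the tangent-space view direction, faking per-pixel displacement.

```cpp
#include <cmath>
#include <cstdio>

struct Vec2 { float u, v; };
struct Vec3 { float x, y, z; };

// Stub height-map lookup returning a value in [0, 1].
float sampleHeight(Vec2 uv) {
    return 0.5f + 0.5f * std::sin(uv.u * 40.f) * std::sin(uv.v * 40.f);
}

// Offset-limited parallax mapping: shift the UV along the tangent-space view
// direction by the (biased) height, skipping the divide by viewTS.z that
// causes swimming at grazing angles. 'scale' is the apparent relief depth.
Vec2 parallaxUV(Vec2 uv, Vec3 viewTS, float scale) {
    float h = sampleHeight(uv) * scale - 0.5f * scale;  // center the relief
    return Vec2{uv.u + h * viewTS.x, uv.v + h * viewTS.y};
}

int main() {
    Vec2 shifted = parallaxUV({0.25f, 0.25f}, {0.577f, 0.577f, 0.577f}, 0.04f);
    std::printf("uv = (%f, %f)\n", shifted.u, shifted.v);
}
```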
