Pixar’s Story Rules, Illustrated in Lego by ICanLegoThat

Last year, Pixar story artist Emma Coats (@lawnrocket) tweeted 22 rules of storytelling, like “give your characters opinions” and “no work is ever wasted.” Alex Eylar, aka ICanLegoThat, has illustrated twelve of those rules in Lego, and gave us the chance to premiere them at Slacktory.

Science of Fluid Sims: Pt 2 – RealFlow

Last September we published a piece on fluid sims that examined the topic via one primary approach. Here is a companion piece to that original story, this time examining the topic through the work of Fusion CI Studios. Mark Stasiuk and Lauren Millar are co-founders of Fusion CI Studios, a dynamic-effects specialist facility that uses RealFlow extensively. Millar is a filmmaker with 20 years’ experience, having produced or directed more than 75 TV shows, and Stasiuk holds a PhD in geophysical fluid mechanics. Stasiuk started using RealFlow and answering questions on the forums with such insight that RealFlow’s developers at Next Limit began to take notice.

- Above: watch Fusion CI’s demo reel.

Stasiuk’s doctorate, from Bristol University, is in the fluid dynamics of volcanic eruptions. In the end, Stasiuk decided that RealFlow was more interesting than the managerial role he had found himself promoted into with volcanic observatories and government work.

RealFlow Polygonization
Path Tracing vs Ray Tracing – Dusterwald.com

Path tracing is all the rage in the offline rendering space these days. From Cycles to SuperFly (based on the open-source Cycles) to Octane, most new rendering engines seem to be using this technology. Sometimes referred to as “unbiased, physically correct rendering,” what is path tracing, how is it different from ray tracing, and is it the future of high-quality offline rendering? I will be looking to answer all of those questions in this blog post for anyone confused by the changing landscape of rendering engines (note that I will be talking about offline rendering in this post, as opposed to realtime rendering).

So first up the question: what is path tracing? A classic ray tracer fires a ray from the camera through each pixel and, at the first surface it hits, evaluates direct lighting, spawning extra rays only for perfect reflections, refractions and shadows. A path tracer is like a ray tracer on steroids: instead of stopping at direct lighting, it keeps bouncing rays in randomly sampled directions, building up complete light paths from camera to light source and thereby estimating the full rendering equation, indirect illumination included. So is path tracing the future of high-quality rendering? The crux of the problem is that with a path tracer you are locked into an all-or-nothing approach.
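The “ray tracer on steroids” idea boils down to Monte Carlo integration: at each hit point the path tracer averages randomly sampled directions instead of querying lights deterministically. The sketch below is an illustrative toy in plain Python, not code from any engine mentioned above; the function names and the constant-sky setup are my own assumptions. It estimates the outgoing radiance of a Lambertian surface under a uniform sky, which should converge to albedo × sky radiance.

```python
import math
import random

def sample_hemisphere():
    # Uniform sampling over the upper hemisphere (pdf = 1 / (2*pi)).
    u1, u2 = random.random(), random.random()
    z = u1                                  # cos(theta) is uniform in [0, 1]
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def shade_diffuse(albedo, sky_radiance, spp):
    """Path-tracing style estimate of outgoing radiance from a Lambertian
    surface lit by a constant sky: average many randomly sampled directions,
    weighting each by BRDF * cos(theta) / pdf."""
    total = 0.0
    for _ in range(spp):
        direction = sample_hemisphere()
        cos_theta = direction[2]
        pdf = 1.0 / (2.0 * math.pi)
        brdf = albedo / math.pi             # Lambertian BRDF
        total += sky_radiance * brdf * cos_theta / pdf
    return total / spp
```

With enough samples the estimate settles near albedo × sky radiance (here 0.5 × 1.0 = 0.5); with few samples it is visibly noisy, which is exactly the all-or-nothing trade-off the post goes on to discuss.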
The State of Rendering – Part 2

In Part 1 of The State of Rendering, we looked at the latest trends in the visual effects industry, including the move to physically plausible shading and lighting. Part 2 explores the major players in the current VFX and animation rendering markets and also looks at the future of rendering tech. There is more about rendering at www.fxphd.com this term.

There are many renderers, of course, but we have focused below on the primary renderers that have come up during the last 18 months of writing fxguide stories. It is not an exact science, but fxguide has a ringside seat on the industry, and the list below covers the majority of key visual effects and non-exclusive in-house animation renderers. The order is not in terms of market share – in reality the 3ds Max default renderer or Mental Ray would swamp many others, given Autodesk’s market share with Max, Maya and XSI.

2.1 RenderMan – Pixar

fxguide will soon be publishing a piece on the 25th anniversary of RenderMan.
The Science of Fluid Sims

Fluid sims have become a vital part of so many visual effects films, yet they are not well understood by most general artists. We try to explain the science behind fluid sims, and look closely at one in particular: Naiad, with help from our friends at Exotic Matter.

Introduction

One of the most significant and commonly requested areas of real-world simulation is fluid simulation. From pouring shots to ocean vistas, directors and artists have come to rely on computer-simulated water and similar fluids. Fluid sims are not confined to liquids either: they can be used to achieve fire and flames – the fluid being simulated in this scenario is the air itself (a gas). Fluid simulations (fluid sims) also have many applications outside visual effects.

History

Before the computer graphics industry got involved, fluid simulation was already being actively modeled mathematically, as early as the 1950s and ’60s.

- Watch Jerry Tessendorf talk at TED.
- A Naiad scene test: 'Bunny in Trouble'

Basic concepts
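To give a flavour of the basic concepts, here is one building block common to grid-based fluid solvers: semi-Lagrangian advection, the unconditionally stable scheme popularised by Jos Stam’s “Stable Fluids.” Each grid cell looks backwards along the velocity to find where its value came from. This is an illustrative 1-D toy in plain Python, not code from Naiad or any production solver.

```python
import math

def advect(field, velocity, dt, dx):
    """One semi-Lagrangian advection step on a periodic 1-D grid:
    for each cell, trace backwards along the (constant) velocity and
    linearly interpolate the field value at the departure point."""
    n = len(field)
    out = []
    for i in range(n):
        x = i - velocity * dt / dx          # departure point, in grid units
        i0 = math.floor(x)
        t = x - i0                          # fractional position for lerp
        a = field[i0 % n]                   # periodic boundary
        b = field[(i0 + 1) % n]
        out.append((1.0 - t) * a + t * b)
    return out
```

Advecting a one-cell spike by exactly one cell shifts it cleanly; a half-cell step smears it across two cells, which is the numerical diffusion this scheme trades for stability. A real solver would do this in 3-D with a spatially varying velocity field, plus pressure projection to keep the flow divergence-free.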
Path Tracing

The random sampling in path tracing causes noise to appear in the rendered image. The noise is reduced by letting the algorithm generate more samples, i.e. color values each resulting from a single ray. A more in-depth explanation of the path tracing algorithm is given below.

Random Sampling

In path tracing, rays are distributed randomly within each pixel in camera space, and at each intersection with an object in the scene a new reflection ray, pointing in a random direction, is generated. The samples in a path-traced image are distributed evenly over all pixels, in the sense that each pixel receives the same number of samples; it is the random components within each sample that cause the rendered image to appear noisy.

Samples Per Pixel (SPP)

The defining factor for render quality is the number of samples per pixel (SPP). The higher the SPP in a rendered image, the less noticeable the noise will be. Sunlight does not require a high SPP to give a nice image.

Render Time

There is no definite answer to how long it will take to render a scene.

More About Noise

SPP Comparisons
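The relationship between SPP and noise can be demonstrated numerically: averaging N random samples shrinks the error roughly as 1/√N, the standard Monte Carlo convergence rate. In the sketch below (plain Python; `render_pixel` is a hypothetical stand-in for a real path tracer, with each sample replaced by a noisy random value whose true mean is 0.5), quadrupling the SPP roughly halves the RMS error.

```python
import math
import random

def render_pixel(spp, rng):
    """Stand-in for path tracing one pixel: average `spp` noisy samples.
    Each sample is uniform in [0, 1), so the true pixel value is 0.5."""
    return sum(rng.random() for _ in range(spp)) / spp

def rms_error(spp, trials, seed=1):
    """RMS error of the pixel estimate over many independent renders."""
    rng = random.Random(seed)
    err2 = 0.0
    for _ in range(trials):
        err2 += (render_pixel(spp, rng) - 0.5) ** 2
    return math.sqrt(err2 / trials)
```

Comparing `rms_error(16, 500)` with `rms_error(1024, 500)` shows the 64× increase in samples cutting the noise by roughly 8× (√64), which is why the last factor-of-two of visible noise is so expensive to remove.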
Art of Destruction (or Art of Blowing Crap Up)

Destruction pipelines today are key aspects of any major visual effects pipeline. Many current pipelines are based on Rigid Body Simulations (RBS), also referred to as Rigid Body Dynamics (RBD), but a new solution – Finite Element Analysis (FEA) – is beginning to emerge. In this ‘Art Of’ article, we talk to some of the major visual effects studios – ILM, Imageworks, MPC, Double Negative and Framestore – about how they approach their destruction toolsets.

In VFX and CGI, RBS is most often relevant to the subdivision of objects due to collision or destruction. Unlike particles, which move only through 3D space and can be defined by a position vector, rigid bodies occupy space and have geometrical properties, such as a center of mass and moments of inertia, and, most importantly, they can have six degrees of freedom (translation along all three axes plus rotation about three axes).

The ‘explosion’ in destruction tools

A scene from '2012', visual effects by Digital Domain. Another scene from 2012.
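The six degrees of freedom can be made concrete with a toy state object. The sketch below is an illustrative Python toy of my own, not any studio’s solver: it carries three translational and three rotational degrees of freedom and steps them with explicit Euler integration. For brevity it uses three Euler-style angles and a pre-divided torque; a production RBD solver would use a quaternion for orientation and a full inertia tensor.

```python
class RigidBody:
    """Minimal rigid-body state: six degrees of freedom
    (3 translational + 3 rotational), stepped with explicit Euler."""

    def __init__(self, mass):
        self.mass = mass
        self.pos = [0.0, 0.0, 0.0]       # translation along x, y, z
        self.vel = [0.0, 0.0, 0.0]
        self.ang = [0.0, 0.0, 0.0]       # rotation about x, y, z (radians)
        self.ang_vel = [0.0, 0.0, 0.0]

    def step(self, force, torque_over_inertia, dt):
        """Advance one timestep; `torque_over_inertia` is angular
        acceleration, i.e. torque already divided by inertia."""
        for k in range(3):
            self.vel[k] += force[k] / self.mass * dt
            self.pos[k] += self.vel[k] * dt
            self.ang_vel[k] += torque_over_inertia[k] * dt
            self.ang[k] += self.ang_vel[k] * dt
```

A fracture pipeline would then subdivide geometry into pieces, give each piece such a state, and let collision response feed forces and torques back into the integration, while an FEA approach instead deforms and breaks the material itself.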
INF584 – Image Synthesis: Theory and Practice

A course at the Ecole Polytechnique by Tamy Boubekeur

Description

Image synthesis, or “rendering”, is a central topic of 3D computer graphics. It covers a set of artificial imaging methods for automatically generating digital images from models of virtual 3D scenes, drawing on computer science, physics, applied mathematics and perception. This course presents the principles, algorithms and techniques of image synthesis. The course has a strong practical dimension: students implement the models and algorithms throughout the term, notably in the C/C++ languages and with the OpenGL API.

Organisation

Each session is organised as follows:

Programme

Practical Work

The practical sessions aim to implement the principles of image synthesis in the form of C++ and OpenGL programs.

Seminar

Period: the final course session.