
The Art of Rendering (updated)

Science of Fluid Sims: Pt 2 – RealFlow Last September we published a piece on fluid sims that examined the topic via one primary approach. Here is a companion piece to that original story, this time examining the topic via the work of Fusion CI Studios. Mark Stasiuk and Lauren Millar are co-founders of Fusion CI Studios, a dynamic effects specialist facility that uses RealFlow extensively. Millar is a filmmaker of 20 years' experience, having produced or directed more than 75 TV shows, and Stasiuk holds a PhD in geophysical fluid mechanics, with a doctorate from Bristol University on the fluid dynamics of volcanic eruptions. Stasiuk started using RealFlow and answering questions on the forums with such insight that RealFlow's authors at Next Limit began to take notice. In the end, Stasiuk decided that RealFlow was more interesting than the managerial role he had found himself promoted into with volcano observatories and government work. - Above: watch Fusion CI's demo reel.

Bokeh Lens Shader – Closer look – Tutorial | PixelCG Tips & Tricks Bokeh is a Japanese word for "blurred or fuzzy" (暈け). It refers to a real-life phenomenon in photography where light sources in the out-of-focus areas of an image render as soft discs. Different lenses produce bokeh with different aesthetic qualities in out-of-focus backgrounds, which is often used to reduce distractions and emphasize the primary subject. We can simulate the same effect in Maya using the bokeh lens shader in mental ray. Our starting point is a render without bokeh. We begin by selecting the camera of choice and, in that camera's Attribute Editor, going to mental ray > Lens Shader (you can also expand the Lens Shaders section and add it there). Locate the mia_lens_bokeh shader under the mental ray tab > Lenses. By default the render will look something like this. The shader's parameters depend on the scene's scale, so results will vary from one file to another. An on/off toggle lets you disable the shader if needed. Plane: Plane value = 18.8 vs. Plane value = 8.7.
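The defocus blur such a lens shader reproduces follows the thin-lens model. Here is a minimal sketch in Python, assuming the standard circle-of-confusion formula; the function and parameter names are illustrative, not mental ray attributes:

```python
# Circle of confusion (CoC) from the thin-lens model -- a rough sketch of the
# optics a bokeh lens shader simulates. Names are illustrative.

def circle_of_confusion(focal_mm, f_number, focus_dist_mm, subject_dist_mm):
    """Diameter (mm) of the blur disc for a point at subject_dist_mm
    when the lens is focused at focus_dist_mm."""
    aperture = focal_mm / f_number  # aperture diameter in mm
    return (aperture * focal_mm * abs(subject_dist_mm - focus_dist_mm)
            / (subject_dist_mm * (focus_dist_mm - focal_mm)))

# A point on the focus plane renders sharp; points far from it blur out.
print(circle_of_confusion(50, 1.8, 2000, 2000))   # 0.0 -> in focus
print(circle_of_confusion(50, 1.8, 2000, 10000))  # larger -> visible bokeh
```

Opening the aperture (a lower f-number) enlarges the blur discs, which is why fast lenses produce the most pronounced bokeh.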

Path Tracing vs Ray Tracing – Dusterwald.com Path tracing is all the rage in the offline rendering space these days. From Cycles to SuperFly (based on the open-source Cycles) to Octane, most new rendering engines seem to be using this technology. Sometimes referred to as "unbiased, physically correct rendering", what is path tracing, how does it differ from ray tracing, and is it the future of high-quality offline rendering? I will be looking to answer all of those questions in this blog post for anyone confused by the changing landscape of rendering engines (note that I will be talking about offline rendering here, as opposed to realtime rendering). So first up: what is path tracing? A path tracer is like a ray tracer on steroids. So is path tracing the future of high-quality rendering? The crux of the problem is that with a path tracer you are locked into an all-or-nothing approach.
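The "ray tracer on steroids" distinction can be sketched in a few lines: a classic Whitted-style ray tracer spawns a single deterministic mirror ray at each hit, while a path tracer draws a random direction over the hemisphere and averages many such paths. This is illustrative Python, not code from any of the engines named above:

```python
import math
import random

def mirror_reflect(d, n):
    """Deterministic bounce (Whitted-style ray tracing): r = d - 2(d.n)n."""
    dn = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dn * ni for di, ni in zip(d, n))

def sample_hemisphere(n, rng=random):
    """Stochastic bounce (path tracing): a uniform random unit direction on
    the hemisphere around normal n, via simple rejection sampling."""
    while True:
        v = tuple(rng.uniform(-1.0, 1.0) for _ in range(3))
        length = math.sqrt(sum(c * c for c in v))
        if 0.0 < length <= 1.0:                      # inside unit sphere
            v = tuple(c / length for c in v)         # normalize
            if sum(vi * ni for vi, ni in zip(v, n)) > 0.0:
                return v                             # same side as normal
```

Because the path tracer's bounce is random, every material and light path is handled by the same machinery, which is the source of both its generality and its noise.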

The Science of Fluid Sims Fluid sims have become a vital part of so many visual effects films, yet they are not well understood by most general artists. We try to explain the science behind fluid sims, and look closely at one in particular: Naiad, with help from our friends at Exotic Matter. Introduction One of the most significant and commonly requested areas of real-world simulation is fluid simulation. From pouring shots to ocean vistas, directors and artists have come to rely on computer-simulated water and similar fluids. Fluid sims are not confined to liquids either; they can be used to achieve fire and flames, the fluid being simulated in that scenario being the air itself (a gas). Fluid simulations (fluid sims) have many applications outside visual effects. History Before the computer graphics industry got involved, fluids were being actively modeled mathematically as early as the 1950s and '60s. - Watch Jerry Tessendorf talk at TED. - A Naiad scene test: 'Bunny in Trouble' Basic concepts
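As a taste of the basic concepts, here is a minimal sketch of one building block of a grid-based fluid solver: an explicit finite-difference diffusion step in one dimension. This is the textbook scheme in illustrative Python, not Naiad's actual solver:

```python
# One explicit Euler diffusion step for a 1D quantity (e.g. smoke density)
# on a grid, with zero-flux (clamped) boundaries.

def diffuse(field, nu, dt, dx):
    """Spread the field out at rate nu; stable while nu*dt/dx**2 <= 0.5."""
    n = len(field)
    out = field[:]
    for i in range(n):
        left = field[max(i - 1, 0)]
        right = field[min(i + 1, n - 1)]
        out[i] = field[i] + nu * dt / dx**2 * (left - 2.0 * field[i] + right)
    return out

density = [0.0, 0.0, 1.0, 0.0, 0.0]  # a blob of smoke in the middle cell
for _ in range(10):
    density = diffuse(density, nu=0.1, dt=0.1, dx=1.0)
# the blob spreads out, while total mass stays constant
```

Real solvers combine steps like this (diffusion, advection, pressure projection) over 3D grids or particles, but each step is the same idea: update every cell from its neighbours.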

“My render times are high. How do I fix that?” | elemental ray If you don’t have access to an infinite render farm, chances are you are concerned about render times. With a certain amount of flexibility and exposed controls, you may be tempted to try lots of different things, or even to combine techniques seen on the internet. In some cases this can be useful; in others the combination doesn’t work well. For example: if you use an inexpensive Final Gather solution, you may add Ambient Occlusion, or increase its quality, to recover detail. Where’s a good place to see what might be eating your render time? The Output Window, and the Time Diagnostic Buffer with Unified Sampling. The Maya Output Window What effects cost you the most time? Well, that depends on what you are rendering. Let’s look at some output from a render. In the Render Current Frame options box I usually choose “Progress Messages”. So, I have rendered a decently complex scene from a project at 1280 by 720. I haven’t included the image here because we’re going to look at the numbers.

Path Tracing The random sampling in path tracing causes noise to appear in the rendered image. The noise is reduced by letting the algorithm generate more samples, i.e. color values resulting from single rays. A more in-depth explanation of the path tracing algorithm is given below. Random Sampling In path tracing, rays are distributed randomly within each pixel in camera space, and at each intersection with an object in the scene a new reflection ray, pointing in a random direction, is generated. The samples in a path-traced image are distributed evenly over all pixels. These random components are what cause the rendered image to appear noisy. Samples Per Pixel (SPP) The defining factor for render quality is the number of samples per pixel (SPP). The higher the SPP in a rendered image, the less noticeable the noise will be. Sunlight does not require high SPP to give a nice image. Render Time There is no definite answer to how long it will take to render a scene. More About Noise SPP Comparisons
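The SPP-versus-noise relationship is plain Monte Carlo statistics: a pixel value is an average of random samples, so its standard error falls like 1/sqrt(SPP). A toy sketch, where a uniform random number stands in for the radiance along one ray:

```python
import random

def render_pixel(spp, rng):
    """Average spp random samples of a toy 'radiance'; the true mean is 0.5."""
    return sum(rng.random() for _ in range(spp)) / spp

def noise(spp, pixels=2000, seed=1):
    """RMS error of many independently rendered pixels at a given SPP."""
    rng = random.Random(seed)
    errs = [(render_pixel(spp, rng) - 0.5) ** 2 for _ in range(pixels)]
    return (sum(errs) / pixels) ** 0.5

# Quadrupling the SPP roughly halves the noise.
print(noise(4), noise(16), noise(64))
```

This square-root law is why the last bit of cleanup is so expensive: halving the remaining noise always costs four times the samples.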

Art of Destruction (or Art of Blowing Crap Up) Destruction pipelines today are key aspects of any major visual effects pipeline. Many current pipelines are based on Rigid Body Simulations (RBS), otherwise referred to as Rigid Body Dynamics (RBD), but a new solution, Finite Element Analysis (FEA), is beginning to emerge. In this ‘Art Of’ article, we talk to some of the major visual effects studios – ILM, Imageworks, MPC, Double Negative and Framestore – about how they approach their destruction toolsets. In VFX and CGI, RBS is most often relevant to the subdivision of objects due to collision or destruction. Unlike particles, which move only in three-space and can be defined by a vector, rigid bodies occupy space and have geometrical properties, such as a center of mass and moments of inertia, and, most importantly, they can have six degrees of freedom (translation along all three axes plus rotation about all three axes). The ‘explosion’ in destruction tools A scene from '2012', visual effects by Digital Domain. Another scene from 2012.
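Those six degrees of freedom can be sketched as a minimal state object with an explicit Euler step. This is illustrative Python only; production RBD solvers use quaternions for orientation plus collision and constraint solving:

```python
# A rigid body's six degrees of freedom: 3 of translation + 3 of rotation.

class RigidBody:
    def __init__(self, position, orientation, velocity, angular_velocity):
        self.position = list(position)            # x, y, z
        self.orientation = list(orientation)      # Euler angles, radians
        self.velocity = list(velocity)            # linear velocity
        self.angular_velocity = list(angular_velocity)

    def step(self, dt):
        """Advance all six degrees of freedom by one explicit Euler step."""
        for i in range(3):
            self.position[i] += self.velocity[i] * dt
            self.orientation[i] += self.angular_velocity[i] * dt

body = RigidBody((0, 0, 0), (0, 0, 0), (1, 0, 0), (0, 0.5, 0))
for _ in range(10):
    body.step(0.1)
# after 1 second: moved 1 unit along x, rotated 0.5 rad about y
```

A particle, by contrast, would carry only the first pair of attributes; the orientation state is what lets debris tumble.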

6 Tips for Better Lighting | Blender Guru The difference between a boring image and an outstanding image can often come down to just the lighting. But lighting is such a complex and rarely discussed topic that a lot of artists are left to guesswork. So in this article, I’ll break down some of the common mistakes in lighting and share with you 6 of my own tips for better lighting in Blender. Feel free to download this model by Ben Simonds if you want to experiment. The size of your lamp affects the resulting render in big ways! Soft shadows can create a calm atmosphere or imitate overcast lighting (example), whereas sharp shadows can bring out detail or imitate harsh daylight (example). Changing the size of the lamp can completely change the mood of your scene. Seriously. Trying to light everything is one of the most common mistakes beginners make. Overlighting a scene eliminates the shadows, which play an important role! So instead of fearing shadows – celebrate them! Got some tips of your own? Want more tips?
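The lamp-size effect is simple geometry: by similar triangles, the penumbra a shadow caster throws on a surface widens with the light's diameter. A small sketch (all names are illustrative, not Blender settings):

```python
# Approximate penumbra width on a receiving surface behind an occluder,
# from similar triangles: a wider light is "seen" partially occluded over
# a wider band, giving softer shadow edges.

def penumbra_width(light_diameter, light_to_occluder, occluder_to_receiver):
    """Width of the soft shadow edge, in the same units as the inputs."""
    return light_diameter * occluder_to_receiver / light_to_occluder

# Same scene geometry, two lamp sizes:
print(penumbra_width(1.0, 2.0, 1.0))   # 0.5 -> soft, overcast-style shadow
print(penumbra_width(0.05, 2.0, 1.0))  # 0.025 -> sharp, harsh-daylight shadow
```

This is why scaling up an area lamp (or a sun lamp's angular size) softens shadows without touching its brightness.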

INF584 - Image Synthesis: Theory and Practice An Ecole Polytechnique course by Tamy Boubekeur Description | Organisation | Programme | Practical Work | Seminar | Toolbox | Links Description Image synthesis, or "rendering", is a central topic of 3D computer graphics. It covers a set of artificial imaging methods for automatically generating digital images from models of virtual 3D scenes, drawing on computer science, physics, applied mathematics and perception. This course presents the principles, algorithms and techniques of image synthesis. It has a strong practical dimension: the students implement the models and algorithms throughout the term, notably using the C/C++ languages and the OpenGL API. Organisation Each session is organised as follows: Programme Practical Work The practical sessions put the principles of image synthesis into practice in the form of C++ and OpenGL programs. Seminar Period: last course session.
