Welcome back to part two of our ‘VFX for dummies’ series! Even a veteran of the industry can be unsure about this rapidly changing field of TV production – it isn’t always obvious what CGI and VFX actually are. And if you’ve ever tried to research the topic to gain a clearer understanding, you may have come away more confused than when you started.
Like any highly technical specialisation, VFX has its own dictionary of industry terms that can – on the surface – appear confusing. But we’re here to demystify some of the most common terms and break them down for you.
Bit depth

This refers to the bit depth of your footage, i.e. how much colour information is stored in your imagery. The higher the bit depth, the more colour information an image can store, and the more colours you have available. The term comes up often in discussions of ultra-high definition (UHD) and high dynamic range (HDR).
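For the mathematically minded, the relationship is simple: each extra bit doubles the number of shades per channel. A quick illustrative snippet (not from any particular package):

```python
# Illustrative sketch: each extra bit doubles the shades per channel.
def colours_for_bit_depth(bits_per_channel: int, channels: int = 3) -> int:
    """Total distinct colours an image can store at a given bit depth."""
    levels = 2 ** bits_per_channel   # shades per channel, e.g. 256 for 8-bit
    return levels ** channels        # combined across R, G and B

print(colours_for_bit_depth(8))   # 8-bit: 16,777,216 colours
print(colours_for_bit_depth(10))  # 10-bit (common for HDR): over a billion
```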
CG assets

These are your 3D models – what you probably picture when you think ‘VFX’. They can be as large or as small, as noticeable or as inconsequential, as you want: anything from a plane to a three-legged war machine, a tree to an animal, or a box to an even smaller box.
CGI

Computer-generated imagery. As the name suggests, these are the visual elements of your production that are created on a computer. The term is often used to refer specifically to 3D computer animation, or as a synonym for visual effects (VFX). CGI and VFX are not the same, though: CGI – and its integration into live-action footage – is one part of the wider VFX process.
Compositing

The combination of at least two source images to create a new, integrated image. Compositing happens when you put all your different ‘elements’ together – your 3D assets, your backgrounds, your particle effects, and your actual on-set footage.
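At its heart, compositing often comes down to the classic ‘over’ operation: blending a foreground over a background using an alpha (transparency) value. Here is a toy sketch of that maths for a single channel, assuming straight (non-premultiplied) alpha – real packages like Nuke handle far more:

```python
# Toy 'over' operation for one colour channel, with straight alpha in 0..1.
def over(fg: float, bg: float, alpha: float) -> float:
    """Blend a foreground value over a background value by alpha."""
    return fg * alpha + bg * (1.0 - alpha)

# A half-transparent white element over a black background:
print(over(1.0, 0.0, 0.5))  # 0.5
```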
High dynamic range (HDR)
This is a common term used in relation to next-generation TVs, which can handle a larger than normal dynamic range. Dynamic range describes the brightness values in a scene or image, from brightest to darkest, often expressed as a ratio. In a digital image, it also relates to the total number of different colours in the image. Streaming content providers like Netflix and Amazon have spearheaded the industry’s drive to deliver films and series to HDR standards.
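If you like numbers: dynamic range is often quoted in photographic ‘stops’, where each stop is a doubling of brightness. A quick illustrative calculation (the values here are hypothetical):

```python
import math

# Dynamic range expressed in photographic stops (doublings of brightness).
def dynamic_range_stops(brightest: float, darkest: float) -> float:
    return math.log2(brightest / darkest)

# A scene whose highlights are 1000x brighter than its shadows:
print(round(dynamic_range_stops(1000.0, 1.0), 2))  # ~9.97 stops
```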
Keying

Keying is the process of algorithmically extracting an object from its background and combining it with a different background. To help with this process, productions shoot against a ‘green screen’ or ‘blue screen’, so that during keying you can cut out the green or blue colour and insert your own background digitally. It is ideal when a location does not exist and needs creating (such as an alien planet), or when it would be too dangerous to put the actor in the situation for real (such as a special effects explosion).
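Conceptually, a keyer asks of every pixel: ‘is this green screen, or is it my subject?’. A deliberately simplified sketch of that decision (real keyers also handle spill suppression, edge softness, and much more):

```python
# Toy green-screen key, assuming pixels are (r, g, b) floats in 0..1.
def is_green_screen(pixel, threshold=0.2):
    r, g, b = pixel
    return g - max(r, b) > threshold  # green clearly dominates

def key(fg_pixel, bg_pixel):
    """Replace green-screen pixels with the new background pixel."""
    return bg_pixel if is_green_screen(fg_pixel) else fg_pixel

print(key((0.1, 0.9, 0.1), (0.5, 0.3, 0.8)))  # green pixel -> new background
print(key((0.8, 0.6, 0.4), (0.5, 0.3, 0.8)))  # skin tone -> kept as-is
```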
Matte painting

From the small to the large, sometimes you need to create entire landscapes. Here you may be able to use a matte painting: a 2D, digitally drawn background that can be added to your scene. Before digital production became the industry standard, matte paintings were painted onto glass. Today they are created digitally, using software like Photoshop, Nuke, Mari, and ZBrush.
OpenEXR

This is a specific image file format designed for use with high dynamic range (HDR) imagery.
Parallax

Parallax is the perceived difference in an object’s position or spatial relationship when it is seen from different vantage points. In VFX it can be used to add depth to 2D shots: by adjusting focus and depth of field, you can make certain elements appear closer to or further from the camera.
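The core idea is easy to sketch: the closer a layer is to the camera, the more it shifts on screen when the camera moves. A toy illustration (the units and values here are hypothetical):

```python
# Toy parallax: near layers shift more than far layers for the same move.
def parallax_shift(camera_move: float, layer_depth: float) -> float:
    """On-screen shift of a layer for a given sideways camera move."""
    return camera_move / layer_depth

for depth in (1.0, 5.0, 50.0):  # foreground, midground, background
    print(f"depth {depth}: shifts {parallax_shift(2.0, depth):.2f}")
```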
Particle system / particle effects
A 3D computer graphics technique that is used to create a large number of objects that obey well-defined behavioural rules. Useful for controlling multitudes of discrete objects, such as asteroids or flocks of birds, but also as a tool for creating natural phenomena such as fire, smoke or water. Particles are small 3D elements that add tiny details to a shot. If there’s a fire, you’ll need rising embers and smoke. If there’s rain, you’ll need small droplets.
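As a toy example, here is what those ‘well-defined behavioural rules’ might look like for rising embers – a deliberately minimal sketch, not production code:

```python
import random

# Minimal particle system: each particle rises, drifts, and fades,
# which is enough to fake embers lifting off a fire.
class Particle:
    def __init__(self):
        self.x, self.y = random.uniform(-1, 1), 0.0
        self.life = 1.0

    def step(self, dt=0.1):
        self.y += 1.0 * dt                     # embers rise
        self.x += random.uniform(-0.05, 0.05)  # and drift sideways
        self.life -= 0.1 * dt                  # and slowly fade out

particles = [Particle() for _ in range(100)]
for _ in range(10):                 # simulate ten frames
    for p in particles:
        p.step()
particles = [p for p in particles if p.life > 0]  # cull dead particles
print(len(particles), "particles still alive")
```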
Pipeline

A pipeline is the generic term for a set of processes that achieve a certain result – most commonly, in our world, the VFX pipeline. The VFX pipeline covers everything from pre-production through to post-production and delivery, and involves many of the terms in this glossary, including previz, matte painting, and tracking. Building a robust and efficient pipeline is a key part of developing a successful VFX company.
Previz (abbreviation for previsualisation)
Previz is a collaborative process that generates preliminary versions of shots or sequences, predominantly using 3D animation tools and a virtual environment. Filmmakers use previz to explore creative ideas, plan technical solutions for shooting, and help the whole production team visualise how finished 3D elements will look in the final project before final animation is completed.
Rotoscoping

A rotoscope was originally the name of a device patented in 1917 to aid cel animation. It is now used as a generic term for ‘rotoing’: the process of cutting someone or something out of a more complex background to use them in another way. For example, you might want to put a VFX explosion behind your actors as they walk away from it towards the camera. You would need to rotoscope them out of the shot so you can place the explosion behind them but in front of the background scenery.
SFX

Special effects. While these aren’t visual effects, it’s worth defining how the two differ. Visual effects are digitally created assets; special effects are real, physical effects done on set – explosions or stunts, for example, but also camera tricks and makeup. People often confuse SFX and VFX.
Texturing

When 3D models are first created, they are just blank shapes with no realistic detail. Texturing is akin to painting the model – giving it a skin or surface.
Tracking

Tracking is the process of determining the movement of objects in a scene (relative to the camera) by analysing the captured footage of that scene. 2D tracking relies on tracking points in the image – either tracking markers placed on set or identifiable points on the objects being tracked. 3D tracking – also referred to as match moving – is the process of extracting the camera move from a live-action plate in order to replicate it in a computer-generated (CG) environment. A match move was traditionally created by hand, whereas 3D tracking is done with specialist software. 3D tracking is used to recreate the movements of a camera in a digital space: if, for example, a shot pans from left to right, any 3D asset you add needs to move in the same way, at the same speed, so it looks as if it was actually there.
VFX stands for ‘visual effects’. It is a very broad term used to describe just about anything that cannot be captured through standard photographic means.
Speaking of which, it’s time to end another entry in our VFX for dummies series. We hope this has helped you understand the often complex world of VFX. There’s more to come in the series so check back regularly. Next time, we’ll discuss how to plan your CGI elements to make filming your TV show that much easier.
We know VFX can be confusing to even the most experienced industry veteran. But at REALTIME, we make the process as stress-free and easy as possible. If you need CGI elements in your upcoming project, get in touch with me on firstname.lastname@example.org to see how we can help.