Despite the production being ‘virtual’, the same rules of physical production apply and have a flow-on effect on the content creation process. As a 3D artist, understanding the constraints of the physical space, the lenses and the lighting is critical before you embark on creating your scenes.
Just because you can move your camera in Notch to the perfect position doesn’t mean the on-set camera can move to that position. Physical constraints on-site are one of the biggest causes of on-set changes to virtual sets/scenes. While you can use Notch’s live editing features, you’ll save yourself a lot of on-set pain by thinking upfront about the physical constraints on the camera boom/crane/track and how they will impact your shot.
- How big is the room/sound stage?
- Are there obstructions such as pillars/trusses to the tracked camera?
- What is the distance from the edge of your LED stage/green screen set to the walls of the room/sound stage? Can the camera move back far enough for your lens selection?
Camera & Lens Selection
Just like with physical constraints, the camera & lens selection has a significant bearing on the creative / framing choices in Notch. Pre-visualising in Notch will help you significantly, but you will need to capture the key bits of information upfront.
- What lenses are available to you? (And therefore, respective FOVs)
- If using camera tracking, does your tracking vendor have calibration profiles for the lenses?
- If using non-prime lenses, are you happy with the barrel distortion at the zoom levels you will need to use?
- When working with LED, are you and the team clear on how moiré patterns will constrain the depth of field, camera positions etc. and how the rendered content needs to play to this?
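When comparing lens options for pre-visualisation, it helps to convert focal lengths into fields of view so your Notch camera matches what the physical glass will actually see. A minimal sketch of the standard rectilinear FOV formula, assuming a hypothetical prime lens kit and a Super 35-sized sensor (the sensor width and lens list here are illustrative, not from the source):

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view for an ideal rectilinear lens (ignores barrel distortion)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical prime kit on a ~24.9 mm wide (Super 35) sensor.
sensor_width = 24.9
for focal in (18, 25, 35, 50, 85):
    fov = horizontal_fov_deg(focal, sensor_width)
    print(f"{focal} mm -> {fov:.1f} deg horizontal FOV")
```

Real zoom lenses deviate from this ideal (hence the distortion and calibration questions above), so treat these numbers as a starting point, not a substitute for the tracking vendor’s calibrated lens profiles.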
Physical lighting is often the make or break of a good virtual production shot. Your virtual and physical lighting need to gel for the look to be convincing. This is complicated significantly on LED stages, and you’ll need to work very closely from the outset with the lighting team (if you’re lucky enough to have one) to get a good outcome. Just ‘matching’ physical and virtual lighting isn’t always the best approach. Often you’ll need to use lighting and the rendered content together to solve challenges like ‘grounding’ the talent, shadows and reflections. (Read more on ‘grounding’ here)
When used with a pixel-mapping-enabled media server, Notch can drive DMX lighting directly, which is useful for handling dynamic lighting scenes. Additionally, when LED panels are used as light sources, Notch can drive them directly with video.
- What lighting is available to light the talent and stage?
- If you’re not responsible for physical lighting, have you spoken with the lighting team about the proposed scenes’ lighting setups?
- How will both physical and virtual lighting be driven? From a DMX lighting desk, pixel mapper or video panels as light?
- When using an LED stage, does the LED product allow shadows to be cast with physical lighting? Will it cause reflections?
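The pixel-mapping idea mentioned above is conceptually simple: each fixture is mapped to a pixel in the rendered frame, and that pixel’s colour becomes the fixture’s DMX channel values. In practice the media server does this for you; the sketch below is only illustrative, with a hypothetical fixture layout and a plain RGB channel ordering assumed:

```python
def frame_to_dmx(frame, fixtures):
    """Pack sampled frame pixels into a single 512-channel DMX universe.

    frame:    2D list of (r, g, b) tuples, values 0-255.
    fixtures: list of dicts with 'x', 'y' (mapped pixel coords) and
              'channel' (1-based DMX start address) -- hypothetical layout.
    """
    universe = [0] * 512
    for f in fixtures:
        r, g, b = frame[f["y"]][f["x"]]
        start = f["channel"] - 1            # DMX addresses are 1-based
        universe[start:start + 3] = [r, g, b]
    return universe

# Two hypothetical 3-channel RGB fixtures sampled from a 4x4 test frame.
frame = [[(x * 60, y * 60, 128) for x in range(4)] for y in range(4)]
fixtures = [{"x": 0, "y": 0, "channel": 1}, {"x": 3, "y": 2, "channel": 10}]
dmx = frame_to_dmx(frame, fixtures)
print(dmx[:3], dmx[9:12])  # [0, 0, 128] [180, 120, 128]
```

This is why driving lighting from the rendered content keeps dynamic scenes coherent: the same frame that feeds the LED wall also feeds the fixtures, so colour and movement stay in sync by construction.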
LED stage specifics
If you’re working with an LED stage, there are a few specific things you’ll want to understand before you start creating content.
- Physical dimensions of the stage: so you can build a pre-visualisation, understand shot framing, and plan where the stage edges hand over to set extension
- While the resolution you will primarily care about is the camera frustum/FOV resolution, it’s good to understand the resolution of the stage’s LED surfaces too, so you know the level of fidelity you can achieve in close-ups etc.
- Working with LED stages requires an ‘overscan’ render, where you render a FOV wider than the actual sensor will capture. This is to account for latency in the tracking & video processing by the media server and LED processor. You should be provided with either a scale (e.g. 1.2x HD) or a fixed resolution (e.g. 2304 × 1296) that the scene will need to be rendered at – this will impact the performance ceiling of your scene.