Okie dokie, time for the more advanced and scientific-sounding terms designed to make you feel teeny and boost the industry veteran's ego. But fear not, grasshopper, because here's that oh-so-special dictionary for you, designed to democratize the industry. Starting with our first term: lat-long, the short form for latitude and longitude, which is actually a visual effects term, believe it or not. Basically, latitude and longitude, or lat-long, is the unwrapped version of VR, similar to what you see when cartographers unwrap the Earth and flatten it out. That, my friends, is your lat-long. To get a clearer picture of what that means, check out www.latlong.net. Okay, next: equirectangular projection, a technique for unwrapping VR, the same thing as our previous term.
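To make the lat-long idea concrete, here is a minimal sketch of how a 3D view direction maps onto an equirectangular (lat-long) image. The function name and the 4096x2048 resolution are just illustrative assumptions, not from the lesson:

```python
import math

def dir_to_latlong_pixel(x, y, z, width, height):
    """Map a 3D view direction onto an equirectangular (lat-long) image.

    Longitude (yaw) spreads across the image width, latitude (pitch)
    across the height -- the same way a cartographer unwraps a globe.
    """
    lon = math.atan2(x, z)                            # -pi..pi, left to right
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))   # -pi/2..pi/2, bottom to top
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead (+z) lands dead center in a 4096x2048 lat-long:
print(dir_to_latlong_pixel(0, 0, 1, 4096, 2048))  # -> (2048.0, 1024.0)
```

Notice that straight up maps to the very top row of the image, which is exactly the zenith "pinching" we will get to later in this lesson.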
This is basically what most unwrapped cinematic VR looks like. Fisheye, as in fisheye lens or fisheye projection for live-action VR. Now, there are two different ways the definition of fisheye could apply: either as a fisheye camera projection on an actual fisheye camera lens, which is a wide-angle lens with a field of view covering up to 180 degrees, the scale of the captured footage being reduced toward the edges; or little planet, another projection technique or effect in photography, usually formed by some fisheye lenses that capture a wide-angle view. But generally, as a projection technique, little planet projection can be remapped back and forth between other projection types if you have the right image remapping software or tool, such as Autopano or PTGui, for example. We will cover those tools in a later lesson. VR compositing: self-explanatory, basically compositing for cinematic VR.
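As a side note on why fisheye footage compresses toward the edges: one common fisheye model (the equidistant one, which real lenses only approximate) makes the image radius grow linearly with the angle off the optical axis. A minimal sketch, with an assumed focal length in pixels:

```python
import math

def equidistant_fisheye_radius(theta_deg, focal_length_px):
    """Equidistant fisheye model: image radius grows linearly with the
    angle from the optical axis (r = f * theta, theta in radians).
    A full hemisphere (90 degrees off-axis) lands at the edge of the
    image circle, which is why scale is squeezed toward the edges."""
    return focal_length_px * math.radians(theta_deg)

f = 500.0  # illustrative focal length in pixels, not from a real lens spec
print(equidistant_fisheye_radius(90, f))  # edge of the image circle
print(equidistant_fisheye_radius(45, f))  # exactly halfway out
```

Remapping tools like the ones mentioned above essentially invert formulas like this one to convert between fisheye, lat-long, and other projections.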
A little tip-off: I am working on a course for this, so hang tight. It will be announced and you will know about it, if it is not already out by the time you're watching this lesson. After all, I have worked on VR footage before some of the tools we have now for VR compositing existed, and I actually alpha-tested one of The Foundry's VR tools at a startup before they launched it publicly.
So stay tuned for that. Okay, stitching: combining footage from different cameras to form a panoramic stitch of cinematic VR, basically taking all the footage from all the cameras in a VR rig and arranging them into a single image. Rough stitch: a quick, first test stitch of VR live-action footage, sort of like a rough cut in post-production. Fine stitch: the finished and polished stitch, done in the same stitching tool or software, sort of like your final cut. No, not the software, your final cut in post-production. Clean plate: basically a plate with no actors or moving elements other than what is in the background, used to clean up unwanted objects in a shot or scene.
Gaze point: usually a circle or point typically found and seen in most VR experiences, usually the center of your POV in a VR experience. Speaking of POV, here is FOV, the field of view of the VR device and of the user wearing the HMD. FOV allows for coverage of an area rather than a single focus point. In simpler terms, the larger the field of view, the more immersive and complete the VR experience is, as it behaves more like the human eye, where you are not limited to seeing only a specific range of visual scope and can see and view in all directions of your vision without any limitations. Parallax: now, there is a camera definition that also applies to VR. Parallax is the relative movement of objects as a result of a change in point of view. For example, when objects move relative to each other, the object in the distance appears to move more slowly than the nearest object, as you can see in this image.
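The parallax idea can be sketched with a bit of trigonometry: when the camera slides sideways, the apparent angular shift of an object shrinks as its distance grows. The function name and distances here are just illustrative assumptions:

```python
import math

def parallax_shift_deg(distance_m, camera_move_m):
    """Angular shift of an object when the viewpoint moves sideways.

    The farther the object, the smaller the shift -- which is exactly
    why distant buildings seem to crawl while nearby trees fly past.
    """
    return math.degrees(math.atan2(camera_move_m, distance_m))

move = 1.0  # the camera slides one meter to the side
print(parallax_shift_deg(2.0, move))    # nearby tree: a large angular shift
print(parallax_shift_deg(200.0, move))  # distant building: a tiny shift
```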
This one might be a bit tricky to understand, but again, it is like when you are driving in a car: the buildings in the distance outside your window don't really move at the same speed as your car as they move past your view, whereas the trees by the road tend to move more quickly past your view when driving, for example. Nodal point: commonly mistaken as the no-parallax point of a lens, but this is incorrect. That is actually known as the entrance pupil of the lens. The nodal point is actually more so where the lens swings or pans, for example, the center of a camera where it swings from a pivot point. But just so you know, people mistake this for the no-parallax point all the time in the industry. All right, next: experiential design, or experience design, the practice of designing products, processes, services, events, and environments with focus placed on the quality of the user experience and culturally relevant solutions.
In VR, this is the designing and optimization of the interaction and gaming experience for the end user, which leads us to user experience design, or UX. Very similar, and commonly abbreviated as UX, UXD, UED, or XD, it is the process of enhancing user satisfaction with a product by improving the usability, accessibility, and pleasure provided in the interaction with the product. This is essentially the ease of flow in your VR experience. Basically, you can see this term as a close duplicate of experiential design; in a way, since they are both so similar, it would be hard for people to differentiate between the two terms either way.
So next up, we have gamification: the application of typical elements of game playing, such as point scoring, competition with others, and rules of play, to other areas of activity, typically as an online marketing technique to enhance engagement with a product or service, like adding gamified elements to a VR experience to make the experience retain the user, for example. Frame rate: the frames used to run any VR experience, but more traditionally and formally, the definition of frame rate is the frequency at which frames in a television picture, film, or video sequence are displayed. Now, the next one can be a bit tricky, but here goes: buzzing slash flickering slash juddering slash stuttering. These are basically terms used to describe artifacts formed due to low frame rates or rendering issues with the VR experience. The terms are usually used interchangeably but can visibly appear very different compared to one another, as you can see in the images on the slide here. So, conveniently, next we have latency: the delay between action and reaction.
Quite simply, low latency means low delay, which is a good thing. High latency means high delay, which is a bad thing. In fact, latency has to be low enough to be undetectable by the brain, and this term is usually used to describe how much lag the user experiences in a VR game. Dome projection: this is an immersive dome-based video projection environment, as in a planetarium dome projection, basically another camera projection type, typically a hemisphere, delivering a fully immersive experience via an external projection technique rather than one that is worn by the user.
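Frame rate and latency are two sides of the same coin: the frame rate sets the time budget the renderer has for each frame, and blowing that budget is what shows up as judder, stutter, and added lag. A minimal sketch (the specific target frame rates below are common headset numbers, used here only as illustration):

```python
def frame_budget_ms(fps):
    """Time available to render each frame at a given frame rate."""
    return 1000.0 / fps

# Missing these budgets is what surfaces as judder/stutter and pushes
# the overall latency the user feels upward.
for fps in (60, 72, 90, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```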
Speaking of which: projection techniques slash projection mapping, aka video mapping or spatial mapping. In the real world, these projection mapping techniques are ways to turn objects into a display surface for video projection, projecting images or visuals onto the external world to be perceived by the end user. But in the digital world, projection techniques are basically projection algorithms used to unwrap digital images into different types of folding formats, to be used to wrap around other digital objects or to also be projected onto physical objects in the end. Then we have camera projection, aka image projection, usually in the 3D world: using a camera's point of view, or POV, and viewing through the eyes of a camera to project an image into the scene or environment, whereby the camera's position, orientation, and FOV control the behavior of the projection transformation. To put it simply, this is a 3D term used to describe the projection of digital images from virtual cameras onto virtual objects.
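The camera projection idea above can be sketched as a simple pinhole-style projection, where the camera's FOV determines the focal length and thus where a 3D point lands on the 2D frame. The function name and the 1920x1080 / 90-degree numbers are assumptions for illustration only:

```python
import math

def project_point(px, py, pz, fov_deg, width, height):
    """Pinhole-style camera projection: a 3D point in camera space
    (z pointing forward) is projected onto a 2D image plane. The
    horizontal FOV sets the focal length, which controls how the
    projection spreads across the frame."""
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length in pixels
    u = width / 2 + f * px / pz
    v = height / 2 - f * py / pz
    return u, v

# A point directly ahead of the camera projects to the image center.
print(project_point(0.0, 0.0, 5.0, 90.0, 1920, 1080))  # -> (960.0, 540.0)
```

Compositing packages run this same transformation in reverse when they "stick" a projected image onto 3D geometry from a camera's point of view.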
Next up, we have stereoscopic VR, or 3D VR in simple terms. Basically, very straightforward: this is VR with depth in it. Then monoscopic VR, or non-3D VR; again, flat VR, aka VR with no depth, just like watching a 2D movie. Okay, coming close to the end now, so hang on tight. We have zenith: a point in the sky, or on the 360 panoramic sphere, that is directly above the user in VR, usually the pinching point above the user when wrapped around the user's POV.
And lastly, we have the nadir: a point on the ground, or on the 360 panoramic sphere, that is directly below the user in VR, usually the pinching point below the user when wrapped around the user's POV. And there you have it: all the possible advanced terms that I know of, now used by visual effects veterans today. Now, I'll actually cover some of these in a course that is currently being prepared for you, or if you are watching this later, you will know which course I'm talking about. So no worries if you do not get it the first time round.
The beauty of online learning is that you can always pause, rewind, and watch again. Now, still confused? Want to learn more terms? Post your questions on the Q&A board. If not, let's dive deeper into the different types of VR that exist out there in the various industries we've touched upon earlier in this course.