Audio file format. §Format types It is important to distinguish between the audio coding format, the container that holds the audio data, and the audio codec.
A codec performs the encoding and decoding of the raw audio data, while the encoded data is (usually) stored in a container file. Although most audio file formats support only one type of audio coding data (created with an audio encoder), a multimedia container format (such as Matroska or AVI) may support multiple types of audio and video data. There are three major groups of audio file formats: uncompressed formats (such as WAV and AIFF), formats with lossless compression (such as FLAC), and formats with lossy compression (such as MP3).
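A minimal sketch of the container-versus-coding-format distinction, using only Python's standard library: the `wave` module writes a RIFF/WAVE *container* around raw PCM samples, and the header fields that identify the *coding format* can then be read back by hand. The offsets used are the standard RIFF layout, not anything specific to one file.

```python
import io
import struct
import wave

# Build a tiny WAV file in memory: the RIFF/WAVE *container*
# wraps metadata plus the raw PCM samples (the *coding format*).
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)        # mono
    w.setsampwidth(2)        # 16-bit samples
    w.setframerate(8000)     # 8 kHz
    w.writeframes(struct.pack("<4h", 0, 1000, 0, -1000))

data = buf.getvalue()

# Bytes 0-11: the container signature ("RIFF", chunk size, "WAVE").
riff, size, wave_id = struct.unpack_from("<4sI4s", data, 0)

# Bytes 12-27: the "fmt " chunk, which names the coding format inside
# the container (format tag 1 means uncompressed PCM).
fmt_id, fmt_size, codec, channels, rate = struct.unpack_from("<4sIHHI", data, 12)
print(riff, wave_id, codec, channels, rate)
```

The same RIFF container could instead carry a compressed coding format, which is exactly why the two concepts are kept separate.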
CinemaScope. CinemaScope is an anamorphic lens series used from 1953 to 1967 for shooting widescreen movies.
Its creation in 1953 by Spyros P. Skouras, the president of 20th Century-Fox, marked the beginning of the modern anamorphic format in both principal photography and movie projection. §Origins A French inventor, Professor Henri Chrétien, developed and patented a new film process that he called Anamorphoscope in 1926; this process would later form the basis for CinemaScope. The optical company Bausch & Lomb was asked to produce a prototype "anamorphoser" (later shortened to "anamorphic") lens, while Earl Sponable, Fox's head of research, tracked down Professor Chrétien.
§Early implementations The original expectation was that CinemaScope would use a separate film for sound (see Audio below), thus making the full "silent" 1.33:1 aperture available for the picture; with a 2:1 anamorphic squeeze applied, this would allow an aspect ratio of 2.66:1. Blocking (stage). In contemporary theatre, the director usually determines blocking during rehearsal, telling actors where they should move to achieve the proper dramatic effect, ensure sight lines for the audience, and work with the lighting design of the scene.
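The CinemaScope arithmetic above (a 2:1 squeeze applied to the full 1.33:1 "silent" aperture) is simple enough to check directly; the variable names below are illustrative, not from any source.

```python
# A 2:1 anamorphic lens compresses a wide field of view horizontally
# onto standard film; projection through a matching lens expands it back.
silent_aperture = 1.33   # the full "silent" 1.33:1 frame
squeeze = 2.0            # CinemaScope's horizontal squeeze factor

projected_ratio = silent_aperture * squeeze
print(round(projected_ratio, 2))  # -> 2.66
```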
It is especially important for the stage manager to note the actors' positions, as a director is not usually present for each performance of a play and it becomes the stage manager's job to ensure that actors follow the assigned blocking from night to night. By extension, the term is sometimes used in the context of cinema to speak of the arrangement of actors in the frame.
In this context, there is also a need to consider the movement of the camera as part of the blocking process (see Cinematography). The stage itself has been given named areas to facilitate blocking; house left and house right are described from the audience's perspective. In France, stage left is referred to as côté cour (court side). Video jockey.
Example of a VJing performance. A video jockey, abbreviated VJ, is a person who creates a projected visual animation, regardless of the techniques used or the graphic choices made. VJing is a broad term for real-time visual performance. The characteristics of VJing are the creation or manipulation of imagery in real time, through technological mediation and for an audience, in synchronization with music. VJing often takes place at events such as concerts, nightclubs, and music festivals, and is generally associated with another artistic performance. In English-speaking countries, the term was popularized by MTV, which used "VJ" for the person who hosted and presented music video broadcasts, but its origins date back to the New York clubs of the 1970s. §History
Cinematic techniques. Storyboard. A storyboard is a graphic organizer in the form of illustrations or images displayed in sequence for the purpose of pre-visualizing a motion picture, animation, motion graphic or interactive media sequence.
The storyboarding process, in the form it is known today, was developed at Walt Disney Productions during the early 1930s, after several years of similar processes being in use at Walt Disney and other animation studios. §Origins The storyboarding process can be very time-consuming and intricate. Image. Two images of the same scene: the top image is a captured photograph, while the bottom image is a simplified artistic rendering.
Images are produced by capturing or rendering. A SAR (synthetic-aperture radar) image acquired by the SIR-C/X-SAR radar on board the Space Shuttle Endeavour shows the Teide volcano. Compositing. Four images of the same subject, with original backgrounds removed and placed over a new background. Compositing is the combining of visual elements from separate sources into single images, often to create the illusion that all those elements are parts of the same scene.
Live-action shooting for compositing is variously called "chroma key", "blue screen", "green screen" and other names. Today, most, though not all, compositing is achieved through digital image manipulation. Motion graphic design. Motion Graphic Design is a subset of graphic design in that it uses graphic design principles in a filmmaking or video production context (or other temporally evolving visual medium) through the use of animation or filmic techniques.
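The chroma-key compositing described above can be sketched as naive per-pixel replacement: wherever the foreground matches the key colour, the background shows through. The tiny 2×2 "images", the pure-green key, and the `composite` helper are all hypothetical illustrations, not a real imaging API.

```python
# Images here are simply lists of rows of (R, G, B) tuples.
KEY = (0, 255, 0)  # pure green marks the area to be replaced

def composite(foreground, background, key=KEY):
    """Replace key-coloured foreground pixels with the background pixel."""
    return [
        [bg if fg == key else fg for fg, bg in zip(f_row, b_row)]
        for f_row, b_row in zip(foreground, background)
    ]

fg = [[(200, 10, 10), KEY],
      [KEY, (10, 10, 200)]]
bg = [[(1, 1, 1), (2, 2, 2)],
      [(3, 3, 3), (4, 4, 4)]]

print(composite(fg, bg))
# -> [[(200, 10, 10), (2, 2, 2)], [(3, 3, 3), (10, 10, 200)]]
```

Production compositing additionally handles near-key colours, soft edges, and spill suppression, which is why exact-match keying like this is only a starting point.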
Examples include the kinetic typography and graphics used as the titles for a film, opening sequences for television, spinning web-based animations, or the three-dimensional station-identification logo of a television channel. Although this art form has been around for decades, it has taken quantum leaps forward in recent years in terms of technical sophistication. §Technology Motion graphics design now relies on a range of tools and practices. Packages such as Maxon Cinema 4D include integrated motion-graphics tools, such as the native MoGraph module, and Softimage's ICE can be used for similar purposes. Special effect. A methane bubble bursting. The illusions or tricks of the eye used in the film, television, theatre, video game, and simulator industries to simulate the imagined events in a story or virtual world are traditionally called special effects (often abbreviated as SFX, SPFX, or simply FX).
Special effects are traditionally divided into the categories of optical effects and mechanical effects. With the emergence of digital film-making tools, a greater distinction between special effects and visual effects has been recognized, with "visual effects" referring to digital post-production and "special effects" referring to on-set mechanical effects and in-camera optical effects. Mechanical effects (also called practical or physical effects) are usually accomplished during the live-action shooting. This includes the use of mechanized props, scenery, scale models, animatronics, pyrotechnics and atmospheric effects: creating physical wind, rain, fog, snow, clouds, etc. Filmmaking. §Parts Film production consists of five major stages: development, pre-production, production, post-production, and distribution.
Master shot. Historically, the master shot was arguably the most important shot of any given scene. All shots in a given scene were somehow related to what was happening in the master shot. This is one reason some of the films from the 1930s and 1940s are considered "stagey" by today's standards.