Thursday, December 16, 2021

Movie camera

From Wikipedia, the free encyclopedia

A movie camera (also film camera or cine-camera) is a type of photographic camera that rapidly takes a sequence of photographs, either on an image sensor or on film stock, in order to produce a moving image to project onto a movie screen. In contrast to the still camera, which captures a single image at a time, the movie camera uses an intermittent mechanism to take a series of images, each of which is a frame of film. The strip of frames is projected through a movie projector at a specific frame rate (number of frames per second) to show a moving picture. When the frames are projected at a sufficient rate, the persistence of vision allows the viewer's eyes and brain to merge the separate frames into a continuous moving picture.
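The frame and footage arithmetic implied here is straightforward. As a sketch (assuming the standard sound speed of 24 frames per second and the usual figure of 16 frames per foot for 35 mm film):

```python
# Rough frame/footage arithmetic for motion picture film.
# Assumes the standard sound speed (24 fps) and 16 frames
# per foot of 35 mm film.

def frames_needed(duration_s: float, fps: float = 24.0) -> int:
    """Number of frames captured for a given duration."""
    return round(duration_s * fps)

def film_length_feet(n_frames: int, frames_per_foot: int = 16) -> float:
    """Approximate length of 35 mm film stock consumed."""
    return n_frames / frames_per_foot

frames = frames_needed(60)          # one minute of footage
print(frames)                       # 1440 frames
print(film_length_feet(frames))     # 90.0 feet of 35 mm film
```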

History

An interesting forerunner to the movie camera was the machine invented by Francis Ronalds at the Kew Observatory in 1845. A photosensitive surface was drawn slowly past the aperture diaphragm of the camera by a clockwork mechanism to enable continuous recording over a 12- or 24-hour period. Ronalds applied his cameras to trace the ongoing variations of scientific instruments and they were used in observatories around the world for over a century.

The chronophotographic gun invented by Étienne-Jules Marey.

The chronophotographic gun was invented in 1882 by Étienne-Jules Marey, a French scientist and chronophotographer. It could shoot 12 images per second, and it was the first invention to capture moving images on the same chronophotographic plate using a metal shutter.

Charles Kayser of the Edison lab seated behind the Kinetograph. Portability was not among the camera's virtues.

In 1876, Wordsworth Donisthorpe proposed a camera to take a series of pictures on glass plates, to be printed on a roll of paper film. In 1889, he patented a moving picture camera in which the film moved continuously. Another film camera was designed in England by the Frenchman Louis Le Prince in 1888. He had built a 16-lens camera in 1887 at his workshop in Leeds. The first eight lenses were triggered in rapid succession by an electromagnetic shutter on the sensitive film; the film was then moved forward, allowing the other eight lenses to operate on it. After much trial and error, he was finally able to develop a single-lens camera in 1888, which he used to shoot sequences of moving pictures on paper film, including the Roundhay Garden Scene and Leeds Bridge.

Another early pioneer was the British inventor William Friese-Greene. In 1887, he began to experiment with the use of paper film, made transparent through oiling, to record motion pictures. He also said he had experimented with celluloid, made with the help of Alexander Parkes. In 1889, Friese-Greene took out a patent for a moving picture camera capable of taking up to ten photographs per second. Another model, built in 1890, used rolls of the new Eastman celluloid film, which he had perforated. A full report on the patented camera was published in the British Photographic News on February 28, 1890. He showed his cameras and film shot with them on many occasions, but never projected his films in public. He also sent details of his invention to Edison in February 1890, which were also seen by Dickson (see below).

Film-gun at the Institut Lumière, France

William Kennedy Laurie Dickson, a Scottish inventor and employee of Thomas Edison, designed the Kinetograph camera in 1891. The camera was powered by an electric motor and was capable of shooting with the new sprocketed film. To govern the intermittent movement of the film, the sprocket wheel that engaged the strip was driven by an escapement disc mechanism, allowing the strip to stop long enough for each frame to be fully exposed and then advancing it quickly (in about 1/460 of a second) to the next frame. This was the first practical system for the high-speed stop-and-go film movement that would be the foundation for the next century of cinematography.

The Lumière Domitor camera, owned by the brothers Auguste and Louis Lumière, was created by Charles Moisson, the chief mechanic at the Lumière works in Lyon, in 1894. The camera used paper film 35 millimeters wide, but in 1895 the Lumière brothers shifted to celluloid film, which they bought from New York's Celluloid Manufacturing Co. They coated it with their own Etiquette-bleue emulsion, cut it into strips, and perforated it.

In 1894, the Polish inventor Kazimierz Prószyński constructed a projector and camera in one, an invention he called the Pleograph.

Mass-market

The Aeroscope (1909) was the first hand-held movie camera.

Due to the work of Le Prince, Friese-Greene, Edison, and the Lumière brothers, the movie camera had become a practical reality by the mid-1890s. The first firms for the manufacture of movie cameras were soon established, including Birt Acres, Eugene Augustin Lauste, Dickson, Pathé frères, Prestwich, Newman & Guardia, de Bedts, Gaumont-Démény, Schneider, Schimpf, Akeley, Debrie, Bell & Howell, Leonard-Mitchell, Ertel, Ernemann, Eclair, Stachow, Universal, Institute, Wall, Lytax, and many others.

The Aeroscope was built and patented in England in the period 1909–1911 by the Polish inventor Kazimierz Prószyński. The Aeroscope was the first successful hand-held film camera. The cameraman did not have to turn a crank to advance the film, as in all other cameras of the time, so he could operate the camera with both hands, holding it steady and controlling the focus. This made it possible to film with the Aeroscope in difficult circumstances, including from the air and for military purposes.

The first all-metal cine camera was the Bell & Howell Standard of 1911–12. One of the most complicated models was the Mitchell-Technicolor Beam Splitting Three-Strip Camera of 1932. With it, three colour separation originals are obtained behind a purple, a green, and a red light filter, the red filter being part of one of the three different raw film stocks in use.

In 1923, Eastman Kodak introduced a 16 mm film stock, principally as a lower-cost alternative to 35 mm, and several camera makers launched models to take advantage of the new market of amateur movie-makers. Though initially thought to be of inferior quality to 35 mm, 16 mm cameras continued to be manufactured until the 2000s by the likes of Bolex, Arri, and Aaton.

Digital movie cameras

The Red EPIC camera has been used to shoot numerous feature films, including The Amazing Spider-Man and The Hobbit.

Digital movie cameras do not use analog film stock to capture images, as had been the standard since the 1890s. Rather, an electronic image sensor is employed and the images are typically recorded on hard drives or flash memory—using a variety of acquisition formats. Digital SLR cameras (DSLR) designed for consumer use have also been used for some low-budget independent productions.

Since the 2010s, digital movie cameras have become the dominant type of camera in the motion picture industry, being employed in film and television productions and even, to a lesser extent, video games. In response, movie director Martin Scorsese's non-profit organisation The Film Foundation has advocated the continued use of film in movie-making, as many filmmakers feel that digital cameras do not convey the depth or emotion that motion-picture film does. Other major directors involved in the organisation include Quentin Tarantino and Christopher Nolan.

Technical details

Basic operation: When the shutter inside the camera is open, the film is illuminated. When the shutter completely covers the film gate, the film strip is moved one frame further by one or two claws, which advance the film by engaging and pulling it through the perforations.
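This shutter cycle also fixes how long each frame is exposed. A sketch, assuming a rotary shutter whose opening is expressed as an angle out of 360° (180° is a common value; neither figure comes from the text above):

```python
# Exposure time per frame for a rotary-shutter movie camera:
# the gate is exposed for (shutter_angle / 360) of each frame period.

def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Seconds the film gate is exposed per frame."""
    frame_period = 1.0 / fps
    return frame_period * (shutter_angle / 360.0)

print(exposure_time(24))  # 1/48 s, about 0.0208 s, at a 180-degree shutter
```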

Most of the optical and mechanical elements of a movie camera are also present in the movie projector. The requirements for film tensioning, take-up, intermittent motion, loops, and rack positioning are almost identical. The camera, however, has no illumination source and keeps its film stock in a light-tight enclosure. A camera also has exposure control via an iris aperture located on the lens. The right-hand side of the camera is often referred to by camera assistants as "the dumb side" because it usually lacks indicators or readouts, access to the film threading, and, on many lens models, lens markings. Later equipment did much to minimize these shortcomings, although access to the film movement block from both sides is precluded by basic motor and electronic design necessities. The advent of digital cameras reduced these mechanisms to a minimum, removing many of these shortcomings.

A spring-wound Bolex 16 mm camera

The standardized frame rate for commercial sound film is 24 frames per second. The standard commercial (i.e., movie-theater) film width is 35 millimeters, while many other film formats exist. The standard aspect ratios are 1.66, 1.85, and 2.39 (anamorphic). NTSC video (common in North America and Japan) plays at 29.97 frames per second; PAL (common in most other countries) plays at 25 frames per second. These two television and video systems also have different resolutions and color encodings. Many of the technical difficulties involving film and video concern translation between the different formats. Video aspect ratios are 4:3 (1.33) for full screen and 16:9 (1.78) for widescreen.
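These differing rates are the root of many of the translation difficulties just mentioned. A sketch of the runtime arithmetic, assuming the common transfer conventions (a straight 24-to-25 fps speedup for PAL, and 2:3 pulldown to 29.97 fps for NTSC):

```python
# Runtime change when 24 fps film is transferred to video.
# PAL transfers commonly play the film at 25 fps ("PAL speedup");
# NTSC uses 2:3 pulldown, mapping 4 film frames onto 5 video fields
# at 29.97 fps, so the film effectively runs at 24 * 1000/1001 fps.

def pal_runtime(film_seconds: float) -> float:
    """Runtime after a 24-to-25 fps PAL speedup."""
    return film_seconds * 24.0 / 25.0

def ntsc_runtime(film_seconds: float) -> float:
    """Runtime after 2:3 pulldown to 29.97 fps NTSC."""
    return film_seconds * 24.0 / (24.0 * 1000.0 / 1001.0)

print(pal_runtime(3600))   # 3456.0 s: a one-hour film loses 144 s on PAL
print(ntsc_runtime(3600))  # 3603.6 s: slightly longer on NTSC
```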

Multiple cameras

Multiple cameras used to take surround images (1900 Cinéorama system; for a modern version, see Circle-Vision 360°)

Multiple cameras may be placed side by side, each recording a different portion of a scene for its full runtime. The films are then projected simultaneously, either on a single three-image screen (Cinerama) or on multiple screens forming a complete circle, with gaps between the screens through which the projectors illuminate the opposite screens (see Circle-Vision 360°).
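The geometry of such a camera ring is easy to sketch: for full circular coverage, each of the n cameras must cover 360/n degrees horizontally (Cinéorama used ten cameras; the counts below are illustrative):

```python
# Horizontal field of view each camera in a full-circle ring must cover.

def per_camera_fov(n_cameras: int) -> float:
    """Degrees of horizontal coverage per camera for a 360-degree ring."""
    return 360.0 / n_cameras

print(per_camera_fov(10))  # 36.0 degrees for a ten-camera ring (Cineorama)
print(per_camera_fov(9))   # 40.0 degrees for a nine-camera ring
```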

Sound synchronization

One of the problems in film is synchronizing a sound recording with the film. Most film cameras do not record sound internally; instead, the sound is captured separately by a precision audio device (see double-system recording). The exceptions are single-system news film cameras, which had either an optical or, later, a magnetic recording head inside the camera. For optical recording, the film had only a single row of perforations, and the area where the other row would have been was exposed to a controlled bright light that burned in a waveform image; on playback, this image regulated the passage of light and reproduced the sound. For magnetic recording, that same area of the single-perf 16 mm film was pre-striped with a magnetic stripe. A smaller balance stripe between the perforations and the edge compensated for the thickness of the recording stripe, keeping the film wound evenly.

Double-system cameras are generally categorized as either "sync" or "non-sync." Sync cameras use crystal-controlled motors that ensure that film is advanced through the camera at a precise speed. In addition, they're designed to be quiet enough to not hamper sound recording of the scene being shot. Non-sync or "MOS" cameras do not offer these features; any attempt to match location sound to these cameras' footage will eventually result in "sync drift", and the noise they emit typically renders location sound recording useless.
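The cost of skipping a crystal-controlled motor is easy to quantify: a small speed error accumulates steadily into sync drift. A sketch of the arithmetic (the error figure below is illustrative, not a real motor specification):

```python
# Why "sync" cameras need crystal-controlled motors: a fractional
# speed error between camera and sound recorder accumulates into
# visible sync drift over the length of a take.

def sync_drift_frames(take_seconds: float, fps: float,
                      speed_error: float) -> float:
    """Frames of drift after a take, given a fractional speed error
    (e.g. 0.001 means the camera runs 0.1% fast vs. the recorder)."""
    return take_seconds * fps * speed_error

# A 0.1% error on a 24 fps camera drifts by more than a full frame,
# enough to notice lips out of sync, in under a minute:
print(sync_drift_frames(60, 24, 0.001))  # 1.44 frames per minute
```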

To synchronize double-system footage, the clapper board that typically starts a take is used as a reference point for the editor to match the picture to the sound (provided the scene and take are also called out, so that the editor knows which picture take goes with any given sound take). It also permits scene and take numbers and other essential information to be seen on the film itself. Aaton cameras have a system called AatonCode that can "jam sync" with a timecode-based audio recorder and prints a digital timecode directly on the edge of the film. However, the most commonly used system at the moment is unique identifier numbers exposed on the edge of the film by the film stock manufacturer (KeyKode is the name of Kodak's system). These are then logged (usually by a computer editing system, but sometimes by hand) and recorded along with audio timecode during editing. When there is no better alternative, a handclap can work if done clearly and properly, but a quick tap on the microphone, held in frame so the editor can see it, is often preferred.

One of the most common uses of non-sync cameras is the spring-wound cameras used in hazardous special effects, known as "crash cams". Scenes shot with these have to be kept short or resynchronized manually with the sound. MOS cameras are also often used for second unit work or anything involving slow or fast-motion filming.

With the advent of digital cameras, synchronization became largely redundant, as picture and audio are captured simultaneously and electronically.

Home movie cameras

Various German Agfa Movex Standard 8 home movie cameras

Movie cameras were available before World War II, often using the 9.5 mm or 16 mm film format. The use of movie cameras surged in popularity in the immediate post-war period, giving rise to the home movie. Compared to pre-war models, these cameras were small, light, fairly sophisticated, and affordable.

An extremely compact 35 mm movie camera, the Kinamo, was designed by Emanuel Goldberg for amateur and semi-professional movies in 1921. A spring motor attachment was added in 1923 to allow flexible handheld filming. The Kinamo was used by Joris Ivens and other avant-garde and documentary filmmakers in the late 1920s and early 1930s.

While a basic model might have a single fixed-aperture, fixed-focus lens, a better version might have three or four lenses of differing apertures and focal lengths on a rotating turret. A good-quality camera might come with a variety of interchangeable, focusable lenses or possibly a single zoom lens. The viewfinder was normally a parallel sight within or on top of the camera body. In the 1950s and for much of the 1960s these cameras were powered by clockwork motors, again with variations of quality. A simple mechanism might only power the camera for some 30 seconds, while a geared-drive camera might work for as long as 75–90 seconds (at standard speeds).

The common film used for these cameras was termed Standard 8: a strip of 16-millimetre-wide film that was exposed down only one half during shooting. The film had twice the number of perforations as film for 16 mm cameras, so the frames were half as high and half as wide as 16 mm frames. Once the first half had been exposed, the film was removed and placed back in the camera to expose the frames on the other side. Once the film was developed, it was slit down the middle and the ends joined, giving 50 feet (15 m) of Standard 8 film from a 25-foot (7.6 m) spool of 16 mm film. 16 mm cameras, mechanically similar to the smaller-format models, were also used in home movie making but were more usually the tools of semi-professional film and news film makers.
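The footage arithmetic described above can be checked directly (the 80 frames per foot for 8 mm film and the 16 fps silent running speed are the standard figures, not stated in the text; treat this as a sketch):

```python
# Standard 8 arithmetic: a 25 ft spool of 16 mm film, run through
# the camera twice and slit down the middle, yields 50 ft of 8 mm
# film. At 80 frames per foot and the silent speed of 16 fps,
# that gives roughly four minutes of footage.

SPOOL_FEET_16MM = 25
FRAMES_PER_FOOT_8MM = 80
SILENT_FPS = 16

final_feet = SPOOL_FEET_16MM * 2               # both halves, joined end to end
total_frames = final_feet * FRAMES_PER_FOOT_8MM
runtime_s = total_frames / SILENT_FPS

print(final_feet)    # 50 feet of Standard 8
print(total_frames)  # 4000 frames
print(runtime_s)     # 250.0 s, about 4 min 10 s
```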

In the 1960s a new film format, Super 8, coincided with the advent of battery-powered electric movie cameras. The new film, with a larger frame printed on the same width of film stock, came in a cassette that simplified loading and developing. Another advantage of the new system was the capacity to record sound, albeit of indifferent quality. Camera bodies, and sometimes lenses, were increasingly made of plastic rather than the metals of the earlier types. As the costs of mass production came down, so did prices, and these cameras became very popular.

This type of format and camera was more quickly superseded for amateurs by the advent of digital video cameras in the 2000s. Since the 2010s, amateurs have increasingly preferred smartphone cameras.

Special effect

A special effect of a miniature person from the 1952 film The Seven Deadly Sins

Special effects (often abbreviated as SFX, SPFX, F/X or simply FX) are illusions or visual tricks used in the theatre, film, television, video game, and simulator industries to simulate the imagined events in a story or virtual world.

Special effects are traditionally divided into the categories of mechanical effects and optical effects. With the emergence of digital film-making a distinction between special effects and visual effects has grown, with the latter referring to digital post-production and optical effects, while "special effects" refers to mechanical effects.

Mechanical effects (also called practical or physical effects) are usually accomplished during the live-action shooting. This includes the use of mechanized props, scenery, scale models, animatronics, pyrotechnics and atmospheric effects: creating physical wind, rain, fog, snow, clouds, making a car appear to drive by itself and blowing up a building, etc. Mechanical effects are also often incorporated into set design and makeup. For example, prosthetic makeup can be used to make an actor look like a non-human creature.

Optical effects (also called photographic effects) are techniques in which images or film frames are created photographically, either "in-camera" using multiple exposure, mattes or the Schüfftan process or in post-production using an optical printer. An optical effect might be used to place actors or sets against a different background.

Since the 1990s, computer-generated imagery (CGI) has come to the forefront of special effects technologies. It gives filmmakers greater control, and allows many effects to be accomplished more safely and convincingly and—as technology improves—at lower costs. As a result, many optical and mechanical effects techniques have been superseded by CGI.

Developmental history

Early development

The Execution of Mary Stuart (1895)

In 1857, Oscar Rejlander created the world's first "special effects" image by combining different sections of 32 negatives into a single image, making a montaged combination print. In 1895, Alfred Clark created what is commonly accepted as the first-ever motion picture special effect. While filming a reenactment of the beheading of Mary, Queen of Scots, Clark instructed an actor to step up to the block in Mary's costume. As the executioner brought the axe above his head, Clark stopped the camera, had all of the actors freeze, and had the person playing Mary step off the set. He placed a Mary dummy in the actor's place, restarted filming, and allowed the executioner to bring the axe down, severing the dummy's head. Techniques like these would dominate the production of special effects for a century.

It was not only the first use of trickery in cinema; it was also the first type of photographic trickery that was only possible in a motion picture, referred to as the "stop trick". Georges Méliès, an early motion picture pioneer, accidentally discovered the same stop trick. According to Méliès, his camera jammed while filming a street scene in Paris. When he screened the film, he found that the "stop trick" had caused a truck to turn into a hearse, pedestrians to change direction, and men to turn into women. Méliès, the stage manager at the Théâtre Robert-Houdin, was inspired to develop a series of more than 500 short films between 1896 and 1913, in the process developing or inventing such techniques as multiple exposures, time-lapse photography, dissolves, and hand-painted color. Because of his ability to seemingly manipulate and transform reality with the cinematograph, the prolific Méliès is sometimes referred to as the "Cinemagician". His most famous film, Le Voyage dans la lune (1902), a whimsical parody of Jules Verne's From the Earth to the Moon, featured a combination of live action and animation, and also incorporated extensive miniature and matte-painting work.

From 1910 to 1920, the main innovations in special effects were the improvements on the matte shot by Norman Dawn. With the original matte shot, pieces of cardboard were placed to block the exposure of the film, which would be exposed later. Dawn combined this technique with the "glass shot." Rather than using cardboard to block certain areas of the film exposure, Dawn simply painted certain areas black to prevent any light from exposing the film. From the partially exposed film, a single frame is then projected onto an easel, where the matte is then drawn. By creating the matte from an image directly from the film, it became incredibly easy to paint an image with proper respect to scale and perspective (the main flaw of the glass shot). Dawn's technique became the textbook for matte shots due to the natural images it created.

During the 1920s and 1930s, special effects techniques were improved and refined by the motion picture industry. Many techniques, such as the Schüfftan process, were modifications of illusions from the theater (such as Pepper's ghost) and still photography (such as double exposure and matte compositing). Rear projection was a refinement of the use of painted backgrounds in the theater, substituting moving pictures to create moving backgrounds. Lifecasting of faces was imported from traditional maskmaking. Along with makeup advances, fantastic masks could be created which fit the actor perfectly. As material science advanced, horror film maskmaking followed closely.

Publicity still for the 1933 film King Kong, which used stop-motion model special effects

Many studios established in-house "special effects" departments, which were responsible for nearly all optical and mechanical aspects of motion-picture trickery. Also, the challenge of simulating spectacle in motion encouraged the development of the use of miniatures. Animation, creating the illusion of motion, was accomplished with drawings (most notably by Winsor McCay in Gertie the Dinosaur) and with three-dimensional models (most notably by Willis O'Brien in The Lost World and King Kong). Naval battles could be depicted with models in the studio. Tanks and airplanes could be flown (and crashed) without risk of life and limb. Most impressively, miniatures and matte paintings could be used to depict worlds that never existed. Fritz Lang's film Metropolis was an early special effects spectacular, with innovative use of miniatures, matte paintings, the Schüfftan process, and complex compositing.

An important innovation in special-effects photography was the development of the optical printer. Essentially, an optical printer is a projector aimed into a camera lens, and it was developed to make copies of films for distribution. Until Linwood G. Dunn refined the design and use of the optical printer, effects shots were accomplished as in-camera effects. Dunn demonstrated that it could be used to combine images in novel ways and create new illusions. One early showcase for Dunn was Orson Welles' Citizen Kane, where such locations as Xanadu (and some of Gregg Toland's famous 'deep focus' shots) were essentially created by Dunn's optical printer.

Color era

A period drama set in Vienna uses a green screen as a backdrop, to allow a background to be added during post-production.
 
Bluescreens are commonly used in chroma key special effects.

The development of color photography required greater refinement of effects techniques. Color enabled the development of such travelling matte techniques as bluescreen and the sodium vapour process. Many films became landmarks in special-effects accomplishments: Forbidden Planet used matte paintings, animation, and miniature work to create spectacular alien environments. In The Ten Commandments, Paramount's John P. Fulton, A.S.C., multiplied the crowds of extras in the Exodus scenes with careful compositing, depicted the massive constructions of Rameses with models, and split the Red Sea in a still-impressive combination of travelling mattes and water tanks. Ray Harryhausen extended the art of stop-motion animation with his special techniques of compositing to create spectacular fantasy adventures such as Jason and the Argonauts (whose climax, a sword battle with seven animated skeletons, is considered a landmark in special effects).
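The travelling-matte idea survives today as digital chroma keying. As an illustrative sketch only (real keyers generate soft-edged mattes and suppress colour spill; the hard threshold below is an arbitrary assumption):

```python
# A minimal chroma-key ("greenscreen") sketch: pixels where the key
# colour strongly dominates are replaced by the background plate.
# Illustrative hard key only; production keyers are far subtler.

def chroma_key(fg, bg, threshold=60):
    """fg, bg: same-size lists of rows of (r, g, b) pixels. Keys out green."""
    out = []
    for fg_row, bg_row in zip(fg, bg):
        row = []
        for (r, g, b), bg_px in zip(fg_row, bg_row):
            if g - max(r, b) > threshold:   # green clearly dominates
                row.append(bg_px)           # show the background instead
            else:
                row.append((r, g, b))       # keep the foreground pixel
        out.append(row)
    return out

fg = [[(20, 240, 20), (200, 180, 170)]]   # green screen, then a skin tone
bg = [[(0, 0, 80),    (0, 0, 80)]]        # deep blue background plate
print(chroma_key(fg, bg))  # [[(0, 0, 80), (200, 180, 170)]]
```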

The science fiction boom

During the 1950s and 1960s numerous new special effects were developed which would dramatically increase the level of realism achievable in science fiction films. Sci-fi special effects milestones in the 1950s included the Godzilla films, The Day the Earth Stood Still (featuring Klaatu), and 3-D films.

The tokusatsu genre of Japanese science fiction film and television, which includes the kaiju sub-genre of monster films, rose to prominence in the 1950s. The special-effects artist Eiji Tsuburaya and the director Ishirō Honda became the driving forces behind the original Godzilla (1954). Taking inspiration from King Kong (1933), Tsuburaya formulated many of the techniques that would become staples of the tokusatsu genre, such as so-called suitmation (the use of a human actor in a costume to play a giant monster) combined with miniatures and scaled-down city sets. Godzilla changed the landscape of Japanese cinema, science fiction, and fantasy, and kickstarted the kaiju boom in Japan known as the "Monster Boom", which remained extremely popular for several decades, with characters such as Godzilla, Gamera, and King Ghidorah leading the market. Tokusatsu films, notably Warning from Space (1956), sparked Stanley Kubrick's interest in science fiction films; according to his biographer John Baxter, despite their "clumsy model sequences, the films were often well-photographed in colour ... and their dismal dialogue was delivered in well-designed and well-lit sets."

If one film could be said to have established a new benchmark for special effects, it would be 1968's 2001: A Space Odyssey, directed by Stanley Kubrick, who assembled his own effects team (Douglas Trumbull, Tom Howard, Con Pederson and Wally Veevers) rather than use an in-house effects unit. In this film, the spaceship miniatures were highly detailed and carefully photographed for a realistic depth of field. The shots of spaceships were combined through hand-drawn rotoscoping and careful motion-control work, ensuring that the elements were precisely combined in the camera—a surprising throwback to the silent era, but with spectacular results. Backgrounds of the African vistas in the "Dawn of Man" sequence were combined with soundstage photography via the then-new front projection technique. Scenes set in zero-gravity environments were staged with hidden wires, mirror shots, and large-scale rotating sets. The finale, a voyage through hallucinogenic scenery, was created by Douglas Trumbull using a new technique termed slit-scan.

The 1970s provided two profound changes in the special effects trade. The first was economic: during the industry's recession in the late 1960s and early 1970s, many studios closed down their in-house effects houses. Technicians became freelancers or founded their own effects companies, sometimes specializing in particular techniques (opticals, animation, etc.).

The second was precipitated by the blockbuster success of two science-fiction and fantasy films in 1977. George Lucas's Star Wars ushered in an era of science-fiction films with expensive and impressive special effects. Effects supervisor John Dykstra, A.S.C. and crew developed many improvements in existing effects technology. They created a computer-controlled camera rig called the "Dykstraflex" that allowed precise repetition of camera motion, greatly facilitating travelling-matte compositing. Degradation of film images during compositing was minimized by other innovations: the Dykstraflex used VistaVision cameras that photographed widescreen images horizontally along stock, using far more of the film per frame, and thinner-emulsion filmstocks were used in the compositing process. The effects crew assembled by Lucas and Dykstra was dubbed Industrial Light & Magic, and since 1977 has spearheaded many effects innovations.

That same year, Steven Spielberg's film Close Encounters of the Third Kind boasted a finale with impressive special effects by 2001 veteran Douglas Trumbull. In addition to developing his own motion-control system, Trumbull also developed techniques for creating intentional "lens flare" (the shapes created by light reflecting in camera lenses) to provide the film's undefinable shapes of flying saucers.

The success of these films, and others since, has prompted massive studio investment in effects-heavy science-fiction films. This has fueled the establishment of many independent effects houses, a tremendous degree of refinement of existing techniques, and the development of new techniques such as computer-generated imagery (CGI). It has also encouraged within the industry a greater distinction between special effects and visual effects; the latter is used to characterize post-production and optical work, while "special effects" refers more often to on-set and mechanical effects.

Introduction of computer generated imagery (CGI)

The use of computer animation in film dates back to the early 1980s, with the films Tron (1982) and Golgo 13: The Professional (1983). Since the 1990s, a profound innovation in special effects has been the development of computer generated imagery (CGI), which has changed nearly every aspect of motion picture special effects. Digital compositing allows far more control and creative freedom than optical compositing, and does not degrade the image as with analog (optical) processes. Digital imagery has enabled technicians to create detailed models, matte "paintings," and even fully realized characters with the malleability of computer software.
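The layering step at the heart of digital compositing can be sketched with the standard "over" operator (a simplified single-pixel version with straight alpha; production compositors work on whole premultiplied images):

```python
# The Porter-Duff "over" operator: the basic operation behind layering
# a digital matte or CGI element over a background plate.
# Simplified to one pixel with straight (non-premultiplied) alpha.

def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a foreground pixel with coverage fg_alpha over a background."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent white element over a black plate gives mid grey:
print(over((1.0, 1.0, 1.0), 0.5, (0.0, 0.0, 0.0)))  # (0.5, 0.5, 0.5)
```

Unlike optical compositing, this arithmetic is exact and repeatable, which is why digital work does not degrade the image across generations.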

Arguably the biggest and most "spectacular" use of CGI is in the creation of photo-realistic images of science-fiction and fantasy characters, settings, and objects. Images can be created in a computer using the techniques of animated cartoons and model animation. The Last Starfighter (1984) used computer-generated spaceships instead of physical scale models. In 1993, stop-motion animators working on the realistic dinosaurs of Steven Spielberg's Jurassic Park were retrained in the use of computer input devices. By 1995, films such as Toy Story underscored the fact that the distinction between live-action films and animated films was no longer clear. Other landmark examples include a character made up of broken pieces of a stained-glass window in Young Sherlock Holmes, a shape-shifting character in Willow, a tentacle formed from water in The Abyss, the T-1000 Terminator in Terminator 2: Judgment Day, hordes and armies of robots and fantastic creatures in the Star Wars prequel and The Lord of the Rings trilogies, and the planet Pandora in Avatar.

Planning and use

Although most visual effects work is completed during post-production, it must be carefully planned and choreographed in pre-production and production. A visual effects supervisor is usually involved with the production from an early stage to work closely with the Director and all related personnel to achieve the desired effects.

Practical effects also require significant pre-planning and co-ordination with performers and production teams. The live nature of the effects can result in situations where resetting due to an error, mistake, or safety concern incurs significant expense, or is impossible due to the destructive nature of the effect.

Live special effects

Spinning fiery steel wool at night

Live special effects are effects that are used in front of a live audience, such as in theatre, sporting events, concerts and corporate shows. Types of effects that are commonly used include: flying effects, laser lighting, theatrical smoke and fog, CO2 effects, and pyrotechnics. Other atmospheric effects can include flame, confetti, bubbles, and snow.

Mechanical effects

Mechanical effects rely more heavily on mechanical engineering. Cars being flipped or hauled over buildings, as in films like Unknown, are usually effects built on specialized rigs and gimbals. A team of engineers, or a freelance effects company, typically provides these services to movie producers. As the action is recorded against a green screen, camera operators, stunt performers or doubles, the director, and the engineers who conceived and built the rigs all collaborate to capture the angle and shot that delivers the entertainment audiences enjoy. The footage is then edited and reviewed before final release.

Digital cinema

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Digital_cinema

Digital cinema refers to adoption of digital technology within the film industry to distribute or project motion pictures as opposed to the historical use of reels of motion picture film, such as 35 mm film. Whereas film reels have to be shipped to movie theaters, a digital movie can be distributed to cinemas in a number of ways: over the Internet or dedicated satellite links, or by sending hard drives or optical discs such as Blu-ray discs.

Digital movies are shot using digital movie cameras, edited using a non-linear editing system (NLE), and projected using a digital video projector instead of a film projector. The NLE is often a video editing application installed on one or more computers that may be networked to access the original footage from a remote server, to share or gain access to computing resources for rendering the final video, and to allow several editors to work on the same timeline or project.

Alternatively, a digital movie could be a film reel that has been digitized using a motion picture film scanner and then restored, or a digital movie could be recorded using a film recorder onto film stock for projection with a traditional film projector.

Digital cinema is distinct from high-definition television and does not necessarily use traditional television or other traditional high-definition video standards, aspect ratios, or frame rates. In digital cinema, resolutions are represented by the horizontal pixel count, usually 2K (2048×1080 or 2.2 megapixels) or 4K (4096×2160 or 8.8 megapixels). The 2K and 4K resolutions used in digital cinema projection are often referred to as DCI 2K and DCI 4K. DCI stands for Digital Cinema Initiatives.
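The megapixel figures quoted above follow directly from the pixel dimensions. A quick sketch (purely illustrative):

```python
# Derive the megapixel figures quoted for the DCI 2K and 4K containers.
def megapixels(width: int, height: int) -> float:
    """Resolution in megapixels, rounded to one decimal place."""
    return round(width * height / 1_000_000, 1)

print(megapixels(2048, 1080))  # DCI 2K -> 2.2
print(megapixels(4096, 2160))  # DCI 4K -> 8.8
```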

As digital-cinema technology improved in the early 2010s, most theaters across the world converted to digital video projection.

History

The transition from film to digital video was preceded by cinema's transition from analog to digital audio, with the release of the Dolby Digital (AC-3) audio coding standard in 1991. Its main basis is the modified discrete cosine transform (MDCT), a transform used in lossy audio compression. It is a modification of the discrete cosine transform (DCT), which was first proposed by Nasir Ahmed in 1972 and was originally intended for image compression. The DCT was adapted into the MDCT by J.P. Princen, A.W. Johnson and Alan B. Bradley at the University of Surrey in 1987, and Dolby Laboratories then combined the MDCT with perceptual coding principles to develop the AC-3 audio format for cinema needs. Cinema in the 1990s typically combined analog video with digital audio.

Digital media playback of high-resolution 2K files has at least a 20-year history. Early video data storage units (RAIDs) fed custom frame buffer systems with large memories. In early digital video units, content was usually restricted to several minutes of material. Transfer of content between remote locations was slow and had limited capacity. It was not until the late 1990s that feature-length films could be sent over the "wire" (Internet or dedicated fiber links). On October 23, 1998, Digital Light Processing (DLP) projector technology was publicly demonstrated with the release of The Last Broadcast, the first feature-length movie shot, edited and distributed digitally. In conjunction with Texas Instruments, the movie was screened in five theaters across the United States (Philadelphia; Portland, Oregon; Minneapolis; Providence; and Orlando).

Foundations

Texas Instruments, DLP Cinema Prototype Projector, Mark V, 2000

In the United States, on June 18, 1999, Texas Instruments' DLP Cinema projector technology was publicly demonstrated on two screens in Los Angeles and New York for the release of Lucasfilm's Star Wars Episode I: The Phantom Menace. In Europe, on February 2, 2000, Texas Instruments' DLP Cinema projector technology was publicly demonstrated, by Philippe Binant, on one screen in Paris for the release of Toy Story 2.

From 1997 to 2000, the JPEG 2000 image compression standard was developed by a Joint Photographic Experts Group (JPEG) committee chaired by Touradj Ebrahimi (later the JPEG president). In contrast to the original 1992 JPEG standard, which is a DCT-based lossy compression format for static digital images, JPEG 2000 is a discrete wavelet transform (DWT) based compression standard that could be adapted for motion imaging video compression with the Motion JPEG 2000 extension. JPEG 2000 technology was later selected as the video coding standard for digital cinema in 2004.

Initiatives

On January 19, 2000, the Society of Motion Picture and Television Engineers, in the United States, initiated the first standards group dedicated to developing digital cinema. By December 2000, there were 15 digital cinema screens in the United States and Canada, 11 in Western Europe, 4 in Asia, and 1 in South America. Digital Cinema Initiatives (DCI) was formed in March 2002 as a joint project of many motion picture studios (Disney, Fox, MGM, Paramount, Sony Pictures, Universal and Warner Bros.) to develop a system specification for digital cinema.

In April 2004, in cooperation with the American Society of Cinematographers, DCI created standard evaluation material (the ASC/DCI StEM material) for testing of 2K and 4K playback and compression technologies. DCI selected JPEG 2000 as the basis for the compression in the system the same year. Initial tests with JPEG 2000 produced bit rates of around 75–125 Mbit/s for 2K resolution and 100–200 Mbit/s for 4K resolution.

Worldwide deployment

In China, in June 2005, an e-cinema system called "dMs" was established and was used in over 15,000 screens spread across China's 30 provinces. dMs estimated that the system would expand to 40,000 screens in 2009. In 2005, the UK Film Council's Digital Screen Network was launched in the UK by Arts Alliance Media, creating a chain of 250 2K digital cinema systems. The roll-out was completed in 2006 and was the first mass roll-out in Europe. AccessIT/Christie Digital also started a roll-out in the United States and Canada. By mid-2006, about 400 theaters were equipped with 2K digital projectors, with the number increasing every month. In August 2006, the Malayalam digital movie Moonnamathoral, produced by Benzy Martin, was distributed via satellite to cinemas, becoming the first Indian digital cinema release. This was done by Emil and Eric Digital Films, a company based in Thrissur, using the end-to-end digital cinema system developed by Singapore-based DG2L Technologies.

In January 2007, Guru became the first Indian film mastered in the DCI-compliant JPEG 2000 Interop format and also the first Indian film to be previewed digitally, internationally, at the Elgin Winter Garden in Toronto. The film was digitally mastered at Real Image Media Technologies in India. In 2007, the UK became home to Europe's first DCI-compliant fully digital multiplex cinemas: Odeon Hatfield and Odeon Surrey Quays (in London), with a total of 18 digital screens, were launched on 9 February 2007. By March 2007, with the release of Disney's Meet the Robinsons, about 600 screens had been equipped with digital projectors. In June 2007, Arts Alliance Media announced the first European commercial digital cinema Virtual Print Fee (VPF) agreements (with 20th Century Fox and Universal Pictures). In March 2009, AMC Theatres announced that it had closed a $315 million deal with Sony to replace all of its movie projectors with 4K digital projectors starting in the second quarter of 2009; it was anticipated that this replacement would be finished by 2012.

AMC Theatres former corporate headquarters in Kansas City, prior to their 2013 move to Leawood, Kansas.

In January 2011, the total number of digital screens worldwide was 36,242, up from 16,339 at the end of 2009, a growth rate of 121.8 percent during the year. There were 10,083 digital screens in Europe as a whole (28.2 percent of the global figure), 16,522 in the United States and Canada (46.2 percent) and 7,703 in Asia (21.6 percent). Progress was slower in some territories, particularly Latin America and Africa. As of 31 March 2015, 38,719 screens (out of a total of 39,789) in the United States had been converted to digital, as had 3,007 screens in Canada and 93,147 screens internationally. At the end of 2017, virtually all of the world's cinema screens (98%) were digital.

Although virtually all of the world's movie theaters have converted their screens to digital cinema, some major motion pictures were still being shot on film as of 2019. For example, Quentin Tarantino released Once Upon a Time in Hollywood in 70 mm and 35 mm in selected theaters across the United States and Canada.

Elements

In addition to the equipment already found in a film-based movie theatre (e.g., a sound reinforcement system, screen, etc.), a DCI-compliant digital cinema requires a digital projector and a powerful computer known as a server. Movies are supplied to the theatre as a digital file called a Digital Cinema Package (DCP). For a typical feature film, this file will be anywhere between 90 GB and 300 GB of data (roughly two to six times the information of a Blu-ray disc) and may arrive as a physical delivery on a conventional computer hard drive or via satellite or fibre-optic broadband Internet. As of 2013, physical deliveries of hard drives were most common in the industry. Promotional trailers arrive on a separate hard drive and range between 200 MB and 400 MB in size. The contents of the hard drive(s) may be encrypted.
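The quoted file sizes are consistent with the 250 Mbit/s data-rate ceiling mentioned later in the article. A back-of-the-envelope sketch, assuming (illustratively) that the worst-case rate is sustained for the whole runtime:

```python
# Worst-case size of a feature's picture essence, assuming the DCI maximum
# data rate of 250 Mbit/s is sustained for the entire runtime (illustrative).
def dcp_size_gb(runtime_minutes: float, mbit_per_s: float = 250.0) -> float:
    """Estimated size in decimal gigabytes (1 GB = 10^9 bytes)."""
    bits = mbit_per_s * 1_000_000 * runtime_minutes * 60
    return bits / 8 / 1_000_000_000

print(dcp_size_gb(120))  # a 2-hour feature -> 225.0 GB, inside the 90-300 GB range
```

Real DCPs compress well below the ceiling for most scenes, which is why typical features land nearer the lower end of the quoted range.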

Regardless of how the DCP arrives, it first needs to be copied onto the internal hard drives of the server, usually via a USB port, a process known as "ingesting". DCPs can be, and in the case of feature films almost always are, encrypted, to prevent illegal copying and piracy. The necessary decryption keys are supplied separately, usually as email attachments and then "ingested" via USB. Keys are time-limited and will expire after the end of the period for which the title has been booked. They are also locked to the hardware (server and projector) that is to screen the film, so if the theatre wishes to move the title to another screen or extend the run, a new key must be obtained from the distributor. Several versions of the same feature can be sent together. The original version (OV) is used as the basis of all the other playback options. Version files (VF) may have a different sound format (e.g. 7.1 as opposed to 5.1 surround sound) or subtitles. 2D and 3D versions are often distributed on the same hard drive.
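The key-handling rules described above can be sketched as follows. This is a hypothetical model, not a real KDM parser: actual KDMs are XML documents standardized by SMPTE, and the class and field names here are invented for illustration.

```python
# Hypothetical sketch of KDM validity checks: a key works only inside its
# booked time window and only on the hardware it was issued for.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class KDM:
    title: str
    not_valid_before: datetime  # start of the booked period
    not_valid_after: datetime   # key expires after the booking ends
    server_serial: str          # key is locked to this playback server

def kdm_usable(kdm: KDM, now: datetime, server_serial: str) -> bool:
    in_window = kdm.not_valid_before <= now <= kdm.not_valid_after
    right_hardware = kdm.server_serial == server_serial
    return in_window and right_hardware

kdm = KDM("Example Feature", datetime(2024, 5, 1), datetime(2024, 5, 15), "SRV-001")
print(kdm_usable(kdm, datetime(2024, 5, 7), "SRV-001"))   # True: within the booked run
print(kdm_usable(kdm, datetime(2024, 5, 20), "SRV-001"))  # False: key has expired
print(kdm_usable(kdm, datetime(2024, 5, 7), "SRV-002"))   # False: moved to another screen
```

The last two cases correspond to the situations in the text where a theatre must request a new key from the distributor: an extended run, or moving the title to a different screen.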

The playback of the content is controlled by the server using a "playlist". As the name implies, this is a list of all the content that is to be played as part of the performance. The playlist will be created by a member of the theatre's staff using proprietary software that runs on the server. In addition to listing the content to be played the playlist also includes automation cues that allow the playlist to control the projector, the sound system, auditorium lighting, tab curtains and screen masking (if present), etc. The playlist can be started manually, by clicking the "play" button on the server's monitor screen, or automatically at pre-set times.

Technology and standards

Digital Cinema Initiatives

Digital Cinema Initiatives (DCI), a joint venture of the six major studios, published the first version (V1.0) of a system specification for digital cinema in July 2005. The main declared objectives of the specification were to define a digital cinema system that would "present a theatrical experience that is better than what one could achieve now with a traditional 35mm Answer Print", to provide global standards for interoperability such that any DCI-compliant content could play on any DCI-compliant hardware anywhere in the world and to provide robust protection for the intellectual property of the content providers.

The DCI specification calls for picture encoding using the ISO/IEC 15444-1 "JPEG2000" (.j2c) standard and use of the CIE XYZ color space at 12 bits per component encoded with a 2.6 gamma applied at projection. Two levels of resolution for both content and projectors are supported: 2K (2048×1080) or 2.2 MP at 24 or 48 frames per second, and 4K (4096×2160) or 8.85 MP at 24 frames per second. The specification ensures that 2K content can play on 4K projectors and vice versa. Smaller resolutions in one direction are also supported (the image gets automatically centered). Later versions of the standard added additional playback rates (like 25 fps in SMPTE mode). For the sound component of the content the specification provides for up to 16 channels of uncompressed audio using the "Broadcast Wave" (.wav) format at 24 bits and 48 kHz or 96 kHz sampling.
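The 12-bit, 2.6-gamma encoding can be sketched as below. This is a simplified illustration: the real specification also normalizes the X'Y'Z' components against a reference luminance before quantization, which is omitted here.

```python
# Simplified sketch of DCI-style 12-bit, gamma-2.6 encoding of one component.
# Assumes input already normalized to [0, 1]; the real spec also applies a
# reference-white scaling before this step.
def encode_12bit(linear: float, gamma: float = 2.6, bits: int = 12) -> int:
    """Map a linear value in [0, 1] to a gamma-encoded 12-bit code value."""
    code_max = (1 << bits) - 1          # 4095 for 12 bits
    return round(linear ** (1 / gamma) * code_max)

def decode_12bit(code: int, gamma: float = 2.6, bits: int = 12) -> float:
    """Inverse mapping: the projector applies the 2.6 gamma at display time."""
    code_max = (1 << bits) - 1
    return (code / code_max) ** gamma

print(encode_12bit(0.5))  # mid-grey linear value -> code 3137
```

Gamma encoding spends more of the 4096 code values on dark tones, where the eye is most sensitive to banding, which is why the spec applies it before quantization.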

Playback is controlled by an XML-format Composition Playlist; the picture and sound essence are wrapped into MXF-compliant track files with a maximum data rate of 250 Mbit/s. Details about encryption, key management, and logging are all discussed in the specification, as are the minimum specifications for the projectors employed, including the color gamut, the contrast ratio and the brightness of the image. While much of the specification codifies work that had already been ongoing in the Society of Motion Picture and Television Engineers (SMPTE), the specification is important in establishing a content-owner framework for the distribution and security of first-release motion-picture content.

National Association of Theatre Owners

In addition to DCI's work, the National Association of Theatre Owners (NATO) released its Digital Cinema System Requirements. The document addresses the requirements of digital cinema systems from the operational needs of the exhibitor, focusing on areas not addressed by DCI, including access for the visually impaired and hearing impaired, workflow inside the cinema, and equipment interoperability. In particular, NATO's document details requirements for the Theatre Management System (TMS), the governing software for digital cinema systems within a theatre complex, and provides direction for the development of security key management systems. As with DCI's document, NATO's document is also important to the SMPTE standards effort.

E-Cinema

The Society of Motion Picture and Television Engineers (SMPTE) began work on standards for digital cinema in 2000. It was clear by that point that HDTV did not provide a sufficient technological basis for digital cinema playback. In Europe, India and Japan, however, there is still a significant presence of HDTV for theatrical presentations. Agreements within the ISO standards body have led to these non-compliant systems being referred to as Electronic Cinema Systems (E-Cinema).

Projectors for digital cinema

Only three manufacturers make DCI-approved digital cinema projectors: Barco, Christie and NEC. All use the Digital Light Processing (DLP) technology developed by Texas Instruments (TI); Sony, which formerly made DCI-approved projectors using its own SXRD technology, has left the market. D-Cinema projectors are similar in principle to digital projectors used in industry, education, and domestic home cinemas, but differ in two important respects. First, projectors must conform to the strict performance requirements of the DCI specification. Second, projectors must incorporate anti-piracy devices intended to enforce copyright compliance, such as licensing limits. For these reasons, all projectors intended to be sold to theaters for screening current-release movies must be approved by DCI before being put on sale; they pass through a process called the CTP (compliance test plan). Because feature films in digital form are encrypted and the decryption keys (KDMs) are locked to the serial number of the server used (linking to both the projector serial number and server is planned in the future), a system will play back a protected feature only with the required KDM.

DLP Cinema

Three manufacturers have licensed the DLP Cinema technology developed by Texas Instruments (TI): Christie Digital Systems, Barco, and NEC. While NEC is a relative newcomer to digital cinema, Christie is the main player in the U.S. and Barco takes the lead in Europe and Asia. Initially, DCI-compliant DLP projectors were available in 2K only, but from early 2012, when TI's 4K DLP chip went into full production, DLP projectors have been available in both 2K and 4K versions, and manufacturers of DLP-based cinema projectors can now also offer 4K upgrades to some of the more recent 2K models. Early DLP Cinema projectors, which were deployed primarily in the United States, used a limited 1280×1024 resolution, the equivalent of 1.3 megapixels (MP). Digital Projection Incorporated (DPI) designed and sold a few DLP Cinema units (is8-2K) when TI's 2K technology debuted, but then abandoned the D-Cinema market while continuing to offer DLP-based projectors for non-cinema purposes. Although based on the same 2K TI "light engine" as those of the major players, these units are so rare as to be virtually unknown in the industry; they are still widely used for pre-show advertising but not usually for feature presentations.

TI's technology is based on the use of digital micromirror devices (DMDs). These are MEMS devices manufactured from silicon using technology similar to that of computer chips. The surface of each device is covered by a very large number of microscopic mirrors, one for each pixel, so a 2K device has about 2.2 million mirrors and a 4K device about 8.8 million. Each mirror vibrates several thousand times a second between two positions: in one, light from the projector's lamp is reflected towards the screen; in the other, away from it. The proportion of the time the mirror spends in each position varies according to the required brightness of each pixel. Three DMD devices are used, one for each of the primary colors. Light from the lamp, usually a xenon arc lamp similar to those used in film projectors with a power between 1 kW and 7 kW, is split by colored filters into red, green and blue beams, each directed at the appropriate DMD. The "forward" reflected beams from the three DMDs are then re-combined and focused by the lens onto the cinema screen.
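The mirror timing described above amounts to pulse-width modulation. The sketch below is deliberately idealized: it uses a handful of equal time slots per frame, whereas real DMDs switch far faster and use binary-weighted time slices.

```python
# Idealized model of DMD grey-level rendering: within one frame, a mirror is
# "on" (light toward the screen) for a fraction of equal time slots
# proportional to the pixel's brightness. Real DMDs use binary-weighted
# bit-plane timing rather than equal slots; this is a teaching simplification.
def mirror_schedule(brightness: float, slots_per_frame: int = 8) -> list[bool]:
    """Return an on/off schedule for one frame; True = light toward screen."""
    on_slots = round(brightness * slots_per_frame)
    return [i < on_slots for i in range(slots_per_frame)]

def perceived_level(schedule: list[bool]) -> float:
    """The eye integrates over the frame, so brightness is the on-fraction."""
    return sum(schedule) / len(schedule)

sched = mirror_schedule(0.75)  # a 75% grey pixel
print(sched)                   # six "on" slots, two "off" slots
print(perceived_level(sched))  # 0.75
```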

Sony SXRD

Alone amongst the manufacturers of DCI-compliant cinema projectors, Sony decided to develop its own technology rather than use TI's DLP. SXRD (Silicon X-tal (Crystal) Reflective Display) projectors have only ever been manufactured in 4K form and, until the launch of TI's 4K DLP chip, were the only 4K DCI-compatible projectors on the market. Unlike DLP projectors, however, SXRD projectors do not present the left- and right-eye images of stereoscopic movies sequentially; instead they use half the available area of the SXRD chip for each eye's image. Thus, during stereoscopic presentations, the SXRD projector functions as a sub-2K projector; the same applies to HFR 3D content.

However, in late April 2020 Sony announced that it would no longer manufacture digital cinema projectors.

Stereo 3D images

In late 2005, interest in digital 3-D stereoscopic projection led to a new willingness on the part of theaters to co-operate in installing 2K stereo installations to show Disney's Chicken Little in 3-D. Six more digital 3-D movies were released in 2006 and 2007 (including Beowulf, Monster House and Meet the Robinsons). The technology combines a single digital projector fitted with either a polarizing filter (for use with polarized glasses and silver screens), a filter wheel, or an emitter for LCD glasses. RealD uses a "ZScreen" for polarisation, while MasterImage uses a filter wheel that changes the polarity of the projector's light output several times per second to alternate the left- and right-eye views quickly. Another system that uses a filter wheel is Dolby 3D: the wheel shifts the wavelengths of the colours being displayed, and tinted glasses filter these shifts so that the wrong wavelengths cannot enter the wrong eye. XpanD uses an external emitter that signals the 3D glasses to block the wrong image from each eye.

Laser

RGB laser projection produces the widest color gamut (approaching BT.2020) and the brightest images of current digital cinema light sources.

LED screen for digital cinema

In Asia, on July 13, 2017, an LED screen for digital cinema developed by Samsung Electronics was publicly demonstrated on one screen at Lotte Cinema World Tower in Seoul. The first installation in Europe was at the Arena Sihlcity Cinema in Zürich. These displays do not use a projector; instead they use a MicroLED video wall, and can offer higher contrast ratios, higher resolutions, and overall improvements in image quality. MicroLED allows for the elimination of display bezels, creating the illusion of a single large screen; this is possible because the spacing between pixels in MicroLED displays is large enough to conceal the seams between tiled panels. Sony already sells MicroLED displays as a replacement for conventional cinema screens.

Effect on distribution

Digital distribution of movies has the potential to save money for film distributors. Printing an 80-minute feature film can cost US$1,500 to $2,500, so making thousands of prints for a wide-release movie can cost millions of dollars. In contrast, at the maximum 250 Mbit/s data rate defined by DCI for digital cinema, a feature-length movie can be stored on an off-the-shelf 300 GB hard drive for $50, and a broad release of 4,000 "digital prints" might cost $200,000. In addition, hard drives can be returned to distributors for reuse. With several hundred movies distributed every year, the industry could save billions of dollars. The digital-cinema roll-out was nonetheless stalled by the slow pace at which exhibitors acquired digital projectors, since the savings would accrue not to exhibitors but to distribution companies. The Virtual Print Fee model was created to address this by passing some of the savings on to the cinemas. As a consequence of the rapid conversion to digital projection, the number of theatrical releases exhibited on film is dwindling. As of 4 May 2014, 37,711 screens (out of a total of 40,048) in the United States had been converted to digital, as had 3,013 screens in Canada and 79,043 screens internationally.
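The savings claim can be worked through with the per-unit figures quoted in this paragraph (illustrative arithmetic only):

```python
# Compare release costs using the figures quoted in the text:
# $1,500-2,500 per 35 mm print versus ~$50 per reusable hard drive.
def release_cost(copies: int, cost_per_copy: float) -> float:
    """Total cost of duplicating one release across many screens."""
    return copies * cost_per_copy

film_low = release_cost(4000, 1500)    # wide release on 35 mm, low estimate
film_high = release_cost(4000, 2500)   # wide release on 35 mm, high estimate
digital = release_cost(4000, 50)       # the same release as "digital prints"

print(f"film:    ${film_low:,.0f} to ${film_high:,.0f}")  # $6,000,000 to $10,000,000
print(f"digital: ${digital:,.0f}")                        # $200,000
```

Multiplied across several hundred wide releases a year, the gap between these two totals is the source of the "billions of dollars" figure above.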

Telecommunication

On October 29, 2001, Bernard Pauchon, Alain Lorentz, Raymond Melwig and Philippe Binant realized and demonstrated the first digital cinema transmission of a feature film by satellite in Europe.

Live broadcasting to cinemas

Broadcasting antenna in Stuttgart

Digital cinemas can deliver live broadcasts from performances or events. This began with the New York Metropolitan Opera delivering regular live broadcasts into cinemas and has been widely imitated ever since. The leading territories providing such content are the UK, the US, France and Germany. The Royal Opera House, Sydney Opera House, English National Opera and others have found new and returning audiences captivated by the detail offered by a live digital broadcast, featuring handheld and crane-mounted cameras positioned throughout the venue to capture emotion that might be missed from a seat in the house. These providers also offer additional value during the intervals, e.g. interviews with choreographers and cast members or a backstage tour, which would not be on offer at the live event itself. Other live events in this field include live theatre from NT Live, Branagh Live, the Royal Shakespeare Company, Shakespeare's Globe, the Royal Ballet, the Mariinsky Ballet, the Bolshoi Ballet and the Berlin Philharmoniker.

In the last ten years this initial offering of the arts has expanded to include live and recorded music events such as Take That Live, One Direction Live and Andre Rieu; live musicals such as Miss Saigon and the record-breaking Billy Elliot Live In Cinemas; live sport; documentaries with a live question-and-answer element, such as the recent Oasis documentary; lectures; faith broadcasts; stand-up comedy; museum and gallery exhibitions; and TV specials such as the record-breaking Doctor Who fiftieth-anniversary special The Day of the Doctor. All have contributed a valuable revenue stream for cinemas large and small all over the world. Live broadcasting, formerly known as Alternative Content, has consequently become known as Event Cinema, and a trade association now exists to that end. Ten years on, the sector has become a sizeable revenue stream in its own right, earning a loyal following among fans of the arts, with content limited only, it would seem, by the imagination of producers. Theatre, ballet, sport, exhibitions, TV specials and documentaries are now established forms of Event Cinema. Worldwide estimates put the likely value of the Event Cinema industry at $1bn by 2019.

Event Cinema currently accounts for between 1% and 3% of overall box office for cinemas worldwide on average, though anecdotally some cinemas attribute as much as 25%, 48% or even 51% (the Rio Bio cinema in Stockholm) of their overall box office to it. It is envisaged that Event Cinema will ultimately account for around 5% of overall box office globally. The sector saw six worldwide records set and broken from 2013 to 2015, with notable successes including Doctor Who ($10.2m in three days at the box office, for an event also broadcast simultaneously on terrestrial TV), Pompeii Live by the British Museum, Billy Elliot, Andre Rieu, One Direction and Richard III by the Royal Shakespeare Company.

Event Cinema is defined more by the frequency of events than by the content itself. Events typically appear in cinemas during traditionally quieter times of the cinema week, such as the Monday-Thursday daytime and evening slots, and are characterised by a one-night-only release, followed by one or more "encore" screenings a few days or weeks later if the event is successful and sold out. On occasion, more successful events have returned to cinemas months or even years later, as in the case of NT Live, where audience loyalty and company branding are so strong that the content owner can be assured of a good showing at the box office.

Pros and cons

Pros

One benefit of the digital formation of sets and locations, especially in the era of growing film series and sequels, is that virtual sets, once computer-generated and stored, can easily be revived for future films. Because digital film images are stored as data files on hard disk or flash memory, varying edits can be executed by altering a few settings on the editing console, with the structure being composed virtually in the computer's memory. A broad choice of effects can be sampled simply and rapidly, without the physical constraints of traditional cut-and-stick editing. Digital cinema also allows national cinemas to make films specific to their cultures in ways that the more constricting configurations and economics of customary film-making prevented. Low-cost cameras and computer-based editing software have gradually enabled films to be produced for minimal cost, and the ability of digital cameras to shoot limitless footage without wasting pricey celluloid has transformed film production in some Third World countries. From the consumer's perspective, digital prints do not deteriorate with the number of showings: unlike celluloid film, there is no projection mechanism or manual handling to add scratches or other physically generated artefacts, so provincial cinemas that would once have received old prints can give consumers the same cinematographic experience (all other things being equal) as those attending the premiere.

The use of NLEs in movies allows for edits and cuts to be made non-destructively, without actually discarding any footage.

Cons

A number of high-profile film directors, including Christopher Nolan, Paul Thomas Anderson, David O. Russell and Quentin Tarantino, have publicly criticized digital cinema and advocated the use of film and film prints. Most famously, Tarantino has suggested he may retire because, though he can still shoot on film, the rapid conversion to digital means he cannot project from 35 mm prints in the majority of American cinemas. Steven Spielberg has stated that though digital projection produces a much better image than film if originally shot in digital, it is "inferior" when the material has been converted from film to digital. He attempted at one stage to release Indiana Jones and the Kingdom of the Crystal Skull solely on film. Paul Thomas Anderson was recently able to create 70 mm film prints for his film The Master.

Film critic Roger Ebert criticized the use of DCPs after a New York Film Festival screening of Brian De Palma's film Passion was cancelled as a result of a lockup caused by the coding system.

The theoretical resolution of 35 mm film is greater than that of 2K digital cinema, and 2K resolution (2048×1080) is only slightly greater than that of consumer 1080p HD (1920×1080). However, since digital post-production techniques became the standard in the early 2000s, the majority of movies, whether photographed digitally or on 35 mm film, have been mastered and edited at 2K resolution, and 4K post-production was becoming more common as of 2013. As projectors are replaced with 4K models, the difference in resolution between digital and 35 mm film is somewhat reduced. Digital cinema servers also utilize far greater bandwidth than domestic "HD", allowing for a difference in quality (e.g., Blu-ray colour encoding is 4:2:0 at a maximum data rate of 48 Mbit/s, versus DCI D-Cinema's 4:4:4 at 250 Mbit/s for 2D/3D and 500 Mbit/s for HFR 3D), so each frame carries greater detail.

Owing to the smaller dynamic range of digital cameras, correcting poor digital exposures is more difficult than correcting poor film exposures during post-production. A partial solution is to add complex video-assist technology during the shooting process, but such technologies are typically available only to high-budget productions. The efficiency with which digital cinema stores images also has a downside: the speed and ease of modern digital editing threatens to give editors and their directors, if not an embarrassment of choice, then at least a confusion of options, and this "try it and see" philosophy can make the editing process lengthier rather than shorter. Because the equipment needed to produce digital feature films is easier to obtain than celluloid, producers could inundate the market with cheap productions that crowd out the efforts of serious directors; because of the speed at which they are filmed, these stories sometimes lack essential narrative structure.

The projectors used for celluloid film employed largely the same technology as when films were invented over 100 years ago; the additions of sound and widescreen could largely be accommodated by bolting on sound decoders and changing lenses. This well-proven and well-understood technology had several advantages: (1) a mechanical projector's service life of around 35 years; (2) a mean time between failures (MTBF) of 15 years; and (3) an average repair time of 15 minutes, often carried out by the projectionist. Digital projectors, on the other hand, are around 10 times more expensive and have a much shorter life expectancy: the technology is still developing (it has already moved from 2K to 4K), so the pace of obsolescence is higher. Their MTBF has not yet been established, and the projectionist's ability to effect a quick repair is gone.

Costs

Pros

The electronic transfer of digital films, from central servers to servers in cinema projection booths, is an inexpensive way to supply copies of the newest releases to the vast number of cinema screens demanded by prevailing saturation-release strategies. The saving on print expenses is significant: at a minimum cost of $1,200–$2,000 per print, celluloid print production costs between $5 million and $8 million per film. With several thousand releases a year, the probable savings offered by digital distribution and projection exceed $1 billion. These savings, together with the ability to store films rather than ship a print on to the next cinema, allow a wider range of films to be screened and watched by the public, including minority and small-budget films that would not otherwise get such a chance.
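The per-film print cost quoted above follows from simple arithmetic. A back-of-envelope check, assuming a saturation release of roughly 4,000 prints (the print-run figure is an assumption for illustration; only the per-print cost comes from the text):

```python
# Back-of-envelope check of the print-cost figures in the text:
# at $1,200-$2,000 per 35 mm print, a release on roughly 4,000
# screens lands near the quoted $5-8 million range per film.

cost_per_print = (1200, 2000)   # low/high cost per celluloid print, USD
prints_per_release = 4000       # assumed saturation-release print run

low = cost_per_print[0] * prints_per_release    # $4.8M
high = cost_per_print[1] * prints_per_release   # $8.0M
print(f"Per-film print cost: ${low/1e6:.1f}M-${high/1e6:.1f}M")
```

Digital distribution replaces this entire per-copy expense with a comparatively small transfer and storage cost.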

Cons

The initial costs of converting theaters to digital are high: $100,000 per screen, on average. Theaters have been reluctant to switch without a cost-sharing arrangement with film distributors. One solution is a temporary Virtual Print Fee system, in which the distributor (who saves the cost of producing and transporting a film print) pays a fee per copy to help finance the theaters' digital systems. A theater can purchase a film projector for as little as $10,000, though projectors intended for commercial cinemas cost two to three times that, and a long-play system adds around $10,000 more, bringing the total to roughly $30,000–$40,000, with an expected service life of 30–40 years. By contrast, a digital cinema playback system, including server, media block, and projector, can cost two to three times as much and carries a greater risk of component failure and obsolescence. (In Britain, an entry-level projector including server, installation, etc. costs around £31,000 [$50,000].)
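The equipment-cost gap widens further when amortized over service life. A rough sketch using the midpoints of the figures above; the 10-year digital lifespan is an assumption (the text says only that it is much shorter than film's 30–40 years):

```python
# Rough per-year cost comparison of the projector figures above.
# Film: $30k-$40k total over 30-40 years (midpoints used below).
# Digital: assumed 2.5x the film price and an assumed 10-year life.

film_cost, film_life = 35_000, 35          # USD, years (midpoints)
digital_cost, digital_life = 87_500, 10    # assumed price and lifespan

print(f"Film:    ${film_cost / film_life:,.0f} per year")
print(f"Digital: ${digital_cost / digital_life:,.0f} per year")
```

Under these assumptions the digital system costs several times more per year of service, which is why cost-sharing schemes such as the Virtual Print Fee were needed.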

Archiving digital masters has also turned out to be both tricky and costly. In a 2007 study, the Academy of Motion Picture Arts and Sciences found the cost of long-term storage of 4K digital masters to be "enormously higher—up to 11 times that of the cost of storing film masters." This is because of the limited or uncertain lifespan of digital storage: no current digital medium, be it optical disc, magnetic hard drive, or digital tape, can reliably store a motion picture for a hundred years or more, something that film, properly stored and handled, does very well. The short history of digital storage media has been one of innovation and, therefore, of obsolescence: archived digital content must be periodically migrated from obsolete physical media to current media. Nor is digital image capture necessarily cheaper than capturing images on film; it is sometimes more expensive.
