
Friday, August 5, 2022

Film preservation

From Wikipedia, the free encyclopedia
 
Stacked film cans containing rolls of film.

Decayed nitrate film. EYE Film Institute Netherlands.

Film preservation, or film restoration, describes a series of ongoing efforts among film historians, archivists, museums, cinematheques, and non-profit organizations to rescue decaying film stock and preserve the images they contain. In the widest sense, preservation assures that a movie will continue to exist in as close to its original form as possible.

For many years the term "preservation" was synonymous with "duplication" of film. The goal of a preservationist was to create a durable copy without any significant loss of quality. In more modern terms, film preservation includes the concepts of handling, duplication, storage, and access. The archivist seeks to protect the film and share its content with the public.

Film preservation is not to be confused with film revisionism, in which long-completed films are modified: outtakes are inserted, new musical scores or sound effects are added, black-and-white footage is colorized, older soundtracks are remixed into Dolby stereo, or other minor edits and cosmetic changes are made.

By the 1980s, it was becoming apparent that the collections of motion picture heritage were at risk of becoming lost. Not only was the preservation of nitrate film an ongoing problem, but it was then discovered that safety film, used as a replacement for the more volatile nitrate stock, was beginning to be affected by a unique form of decay known as "vinegar syndrome", and color film manufactured, in particular, by Eastman Kodak, was found to be at risk of fading. At that time, the best-known solution was to duplicate the original film onto a more secure medium.

A common estimate is that 90 percent of all American silent films made before 1920 and 50 percent of American sound films made before 1950 are lost films.

Although institutional practices of film preservation date back to the 1930s, the field received an official status only in 1980, when UNESCO recognized "moving images" as an integral part of the world's cultural heritage.

The problem of film decay

Simulation of a film archive: the shelves have been left almost empty to help visitors better visualize the gap between the number of surviving films and the number of films actually made. Museum of Film and Television Berlin, Deutsche Kinemathek

The great majority of films made in the silent era are now considered to be lost forever. Movies of the first half of the 20th century were filmed on an unstable, highly flammable cellulose nitrate film base, which required careful storage to slow its inevitable process of decomposition over time. Most films made on nitrate stock were not preserved; over the years, their negatives and prints crumbled into powder or dust. Many of them were recycled for their silver content, or destroyed in studio or vault fires. The largest cause, however, was intentional destruction. As film preservationist Robert A. Harris explains, "Most of the early films did not survive because of wholesale junking by the studios. There was no thought of ever saving these films. They simply needed vault space and the materials were expensive to house." Silent films had little or no commercial value after the advent of sound films in the 1930s, and as such, they were not kept. As a result, preserving the now-rare silent films has been proposed as a high priority amongst film historians.

Because of the fragility of film stock, proper preservation of film usually involves storing the original negatives (if they have survived) and prints in climate-controlled facilities. The vast majority of films were not stored in this manner, which resulted in the widespread decay of film stocks.

The problem of film decay is not limited to films made on cellulose nitrate. Film industry researchers and specialists have found that color films (made using processes for Technicolor and its successors) are also decaying at an increasingly rapid rate. A number of well-known films only exist as copies of original film productions or exhibition elements because the originals have decomposed beyond use. Cellulose acetate film, which was the initial replacement for nitrate, has been found to suffer from "vinegar syndrome". Polyester film base, which replaced acetate, is far more chemically stable, but the color dyes in films printed on it can still fade.

Storage at carefully controlled low temperatures and low humidity can inhibit both color fading and the onset of vinegar syndrome. However, once degradation begins to occur, the chemical reactions involved will promote further deterioration. "There is no indication that we will ever find a way to arrest decomposition once it has started. All we can do is inhibit it," says Leo Enticknap, a director on the board of the Association of Moving Image Archivists (AMIA).

Film decay as an art form

In 2002, filmmaker Bill Morrison produced Decasia, a film solely based on fragments of old unrestored nitrate-based films in various states of decay and disrepair, providing a somewhat eerie aesthetic to the film. The film was created to accompany a symphony of the same name, composed by Michael Gordon and performed by his orchestra. The footage used was from old newsreel and archive film and was obtained by Morrison from several sources, such as the George Eastman House, the archives of the Museum of Modern Art, and the Fox Movietone News film archives at the University of South Carolina.

Preservation through careful storage

Nitrate film vault, Packard Humanities Institute, Santa Clarita.

The preservation of film usually refers to physical storage of the film in a climate-controlled vault, and sometimes to the actual repair and copying of the film element. Preservation is different from restoration, as restoration is the act of returning the film to a version most faithful to its initial release to the public and often involves combining various fragments of film elements.

Film is best preserved by protecting it from external forces while in storage and by keeping it at controlled temperatures. For most film materials, the Image Permanence Institute finds that storing film at freezing temperatures, with relative humidity (RH) between 30% and 50%, greatly extends its useful life. These measures inhibit deterioration better than any other method and are a cheaper solution than duplicating deteriorating films.

Preparing a film for preservation and restoration

In most cases, when a film is chosen for preservation or restoration work, new prints are created from the original camera negative or from a composite restoration negative, which can be made from a combination of elements for general screening. It is therefore particularly important to keep camera negatives or digital masters under safe storage conditions.

The original camera negative is the surviving, edited film negative that passed through the camera on the set. It may or may not remain in original release form, depending upon the number of re-releases after the initial theatrical run. Restorers sometimes create a composite negative (or composite dupe) by recombining duplicated sections of the best remaining material, sometimes on "a shot-to-shot, frame-by-frame basis", to approximate the configuration of the original camera negative at some point in the film's release cycle.

In traditional photochemical restorations, image polarity considerations must be observed when recombining surviving materials and the final, lowest generation restoration master may be either a duplicate negative or a fine grain master positive. Preservation elements, such as fine-grain master positives and duplicate printing negatives, are generated from this restoration master element to make both duplication masters and access projection prints available for future generations.

Choosing an archival medium

Film as an archival medium

Film preservationists would prefer that film images, whether restored through photochemical or digital processes, eventually be transferred to other film stock: because of rapidly evolving and shifting data formats, no digital medium has yet proven truly archival, whereas a well-developed and properly stored modern film print can last upwards of 100 years.

While some in the archival community feel that converting film to a digital image results in a loss of quality that can make it more difficult to create a high-quality print from the digital image, digital imaging technology has advanced to the point where 8K scanners can capture the full resolution of images filmed on formats as large as 65mm. 70mm IMAX film has a theoretical resolution of 18K, the highest possible resolution given the sensor.

Of course, having an intermediate digital stage, followed by forming a new film master by lasering the digital results onto new film stock does represent an extra generation. So would an intermediate film master that was restored frame-by-frame by hand. The choice of film vs. digital restoration will be driven by the amount, if any, of restoration required, the taste and skill set of the restorer, and the economics of film restoration vs. digital restoration.

Digital as an archival medium

As of 2014, digital scanners can capture images as large as 65mm in full resolution. That is the typical image size on traditional 70mm film (as opposed to the IMAX process), which used a portion of the film surface for its multitrack magnetic sound stripe. The cost of a 70mm print of a two-and-a-half-hour film as of 2012 ran upwards of $170,000, while a hard disk capable of storing such a movie typically cost a few hundred dollars, with an archival optical disc even less. The problem of having to transfer the data as new generations of equipment come along will continue, however, until true archival standards are put in place.
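
As a rough illustration of the arithmetic behind that comparison, the sketch below estimates the uncompressed size of a digitized feature; the scan resolution, bit depth, and running time are illustrative assumptions, not figures from this article.

    # Rough estimate of uncompressed storage for a scanned feature film.
    # All parameters are illustrative assumptions, not values from the article.

    FRAME_RATE = 24             # frames per second
    RUNTIME_MIN = 150           # a two-and-a-half-hour feature
    WIDTH, HEIGHT = 4096, 3112  # assumed 4K full-aperture scan
    BYTES_PER_PIXEL = 6         # 3 channels x 16 bits

    frames = RUNTIME_MIN * 60 * FRAME_RATE
    bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
    total_tb = frames * bytes_per_frame / 1e12

    print(f"{frames} frames, {bytes_per_frame / 1e6:.1f} MB per frame, "
          f"about {total_tb:.1f} TB uncompressed")
    # At these assumptions the raw scan is on the order of 15-20 TB;
    # lossless compression and lower scan resolutions bring this down to a
    # size that fits on one or a few commodity hard disks.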

Digital film preservation

In the context of film preservation, the term "digital preservation" highlights the use of digital technology to transfer films, from 8mm to 70mm in size, to digital carriers, as well as all practices for ensuring the longevity of, and access to, digitized or digitally born film materials. In purely technical and practical terms, digital film preservation is a domain-specific subset of digital curation practices.

The aesthetic and ethical implications of the use of digital technology for film preservation are major subjects of debate. For instance, the senior curator of George Eastman House, Paolo Cherchi Usai, has decried the shift from analogue to digital preservation of film as ethically unacceptable, arguing on philosophical grounds that the medium of film is an essential ontological precondition for the existence of cinema. In 2009, the senior curator of EYE Film Institute Netherlands, Giovanna Fossati, discussed the use of digital technologies for the restoration and preservation of film more optimistically, as a form of remediation of the cinematic medium, and reflected positively on digital technologies' ability to broaden restoration possibilities, improve quality, and reduce costs. According to the cinema scholar Leo Enticknap, the views held by Usai and Fossati could be seen as representative of the two poles of the digital debate in film preservation. It should be kept in mind, however, that both Usai's and Fossati's arguments are highly complex and nuanced, and likewise, the debate about the utility of digital technologies in film preservation is complex and continually evolving.

Advancements

In 1935, New York's Museum of Modern Art began one of the earliest institutional attempts to collect and preserve motion pictures, obtaining original negatives of the Biograph and Edison companies and the world's largest collection of D. W. Griffith films. The following year, Henri Langlois founded the Cinémathèque Française in Paris, which would become the world's largest international film collection.

For thousands of early silent films stored in the Library of Congress, mostly between 1894 and 1912, the only existing copies were printed on rolls of paper submitted as copyright registrations. For these, an optical printer was used to copy these images onto safety film stock, a project that began in 1947 and continues today. The Library hosts the National Film Preservation Board, whose National Film Registry annually selects 25 U.S. films "showcasing the range and diversity of American film heritage". The George Eastman House International Museum of Photography and Film was chartered in 1947 to collect, preserve and present the history of photography and film, and in 1996 opened the Louis B. Mayer Conservation Center, one of only four film conservation centers in the United States. The American Film Institute was founded in 1967 to train the next generation of filmmakers and preserve the American film heritage. Its collection now includes over 27,500 titles.

In 1978, in Dawson City, Yukon, Canada, a construction excavation inadvertently uncovered a forgotten collection of more than 500 discarded films from the early 20th century that had been buried in, and preserved by, the permafrost. The find was divided between the United States' Library of Congress and Library and Archives Canada for transfer to safety stock and archiving; moving such highly flammable material such a distance ultimately required the assistance of the Canadian Armed Forces to make the delivery to Ottawa. The story of this discovery, as well as excerpts of these films, can be seen in the 2016 documentary film Dawson City: Frozen Time.

Another high-profile restoration by staff at the British Film Institute's National Film and Television Archive is the Mitchell and Kenyon collection, which consists almost entirely of actuality films commissioned by traveling fairground operators for showing at local fairgrounds or other venues across the UK in the early part of the twentieth century. The collection was stored for many decades in two large barrels following the winding-up of the firm, and was discovered in Blackburn in the early 1990s. The restored films now offer a unique social record of early 20th-century British life.

Individual preservationists who have contributed to the cause include Robert A. Harris and James Katz (Lawrence of Arabia, My Fair Lady, and several Alfred Hitchcock films), Michael Thau (Superman), and Kevin Brownlow (Intolerance and Napoleon). Other organizations, such as the UCLA Film and Television Archive, have also preserved and restored films; a major part of UCLA's work includes such projects as Becky Sharp and select Paramount/Famous Studios and Warner Bros. cartoons whose credits were once altered due to rights taken over by different entities.

Studio efforts

In 1926 Will Hays asked film studios to preserve their films by storing them, in an Eastman Kodak process, at 40 degrees and low humidity, so that "schoolboys in the year 3,000 and 4,000 A.D. may learn about us".

Beginning in the 1970s, Metro-Goldwyn-Mayer, aware that the original negatives to many of its Golden Age films had been destroyed in a fire, began a preservation program to restore and preserve all of its films by using whatever negatives survived, or, in many cases, the next best available elements (whether a fine-grain master positive or a mint archival print). From the outset, it was determined that if some films had to be preserved, then it would have to be all of them. In 1986, when Ted Turner acquired MGM's library (which by then had included Warner Bros.' pre-1950, MGM's pre-May 1986, and a majority of the RKO Radio Pictures catalogs), he vowed to continue the preservation work MGM had started. Time Warner, the current owner of Turner Entertainment, continues this work today.

The cause for film preservation came to the forefront in the 1980s and early 1990s when such famous and influential film directors as Steven Spielberg and Martin Scorsese contributed to the cause. Spielberg became interested in film preservation when he went to view the master of his film Jaws, only to find that it had badly decomposed and deteriorated, a mere fifteen years after it had been filmed. Scorsese drew attention to the film industry's use of color-fading film stock through his use of black-and-white film stock in his 1980 film Raging Bull. His film Hugo included a key scene in which many of film pioneer Georges Méliès' silent films are melted down and the raw material recycled as shoes; this was seen by many movie critics as "a passionate brief for film preservation wrapped in a fanciful tale of childhood intrigue and adventure".

Scorsese's concern about the need to save motion pictures of the past led him to create The Film Foundation, a non-profit organization dedicated to film preservation, in 1990. He was joined in this effort by fellow film makers who served on the foundation's board of directors—Woody Allen, Robert Altman, Francis Ford Coppola, Clint Eastwood, Stanley Kubrick, George Lucas, Sydney Pollack, Robert Redford, and Steven Spielberg. In 2006, Paul Thomas Anderson, Wes Anderson, Curtis Hanson, Peter Jackson, Ang Lee, and Alexander Payne were added to the board of directors of The Film Foundation, which is aligned with the Directors Guild of America.

By working in partnership with the leading film archives and studios, The Film Foundation has saved nearly 600 films, often restoring them to pristine condition. In many cases, original footage that had been excised—or censored by the Production Code in the U.S.—from the original negative, has been reinstated. In addition to the preservation, restoration, and presentation of classic cinema, the foundation teaches young people about film language and history through The Story of Movies, an educational program claimed to be "used by over 100,000 educators nationwide".

In the age of digital television, high-definition television and DVD, film preservation and restoration has taken on commercial as well as historical importance, since audiences demand the highest possible picture quality from digital formats. Meanwhile, the dominance of home video and ever-present need for television broadcasting content, especially on specialty channels, has meant that films have proven a source of long-term revenue to a degree that the original artists and studio management before the rise of these media never imagined. Thus media companies have a strong financial incentive to carefully archive and preserve their complete library of films.

Video Aids to Film Preservation

The group Video Aids to Film Preservation (VAFP) became active on the Internet in 2005.

The VAFP site was funded as part of a 2005 Institute of Museum and Library Services (IMLS) grant to the Folkstreams project. The purpose of the site is to supplement the existing film preservation guides provided by the National Film Preservation Foundation with video demonstrations. The preservation guides provided by that organization, while thoroughly depicting accurate methods of preservation, are mostly text-based. The films and clips are released under a Creative Commons license, which allows anyone to use them with attribution, in this case to the VAFP site and to the author of the clip and their company.

Obstacles in restoration

Regardless of the age of the print itself, damage may occur if it is stored improperly. Damage to the film (caused by tears in the print, curling of the film base due to intense light exposure, temperature, humidity, etc.) can significantly raise the difficulty and cost of preservation. Many films simply do not have enough information left on them to piece together a new master, although careful digital restoration can produce striking results by gathering bits and pieces of buildings from adjacent frames to repair a damaged frame, predicting entire frames based on the characters' movements in prior and subsequent frames, and so on. As time goes on, this digital capability will only improve, but it will always require sufficient information from the original film to make proper restorations and predictions.
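
The "bits and pieces from adjacent frames" idea can be sketched very simply: pixels flagged as damaged in one frame are filled from the same positions in neighbouring frames. The snippet below is a minimal, hypothetical illustration using a temporal median; real restoration tools add motion compensation and far more sophisticated damage detection.

    import numpy as np

    def repair_with_neighbors(prev_frame, frame, next_frame, damage_mask):
        """Fill damaged pixels of `frame` using the temporal median of the
        previous, current, and next frames.

        All frames are numpy arrays of the same shape; `damage_mask` is a
        boolean array marking dirt, dust, or scratches.  Illustrative sketch
        only: production tools align (motion-compensate) the neighbouring
        frames before borrowing pixels from them.
        """
        stack = np.stack([prev_frame, frame, next_frame], axis=0)
        temporal_median = np.median(stack, axis=0)
        repaired = frame.copy()
        repaired[damage_mask] = temporal_median[damage_mask]
        return repaired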

Cost is another obstacle. As of 2020, Martin Scorsese's non-profit The Film Foundation, dedicated to film preservation, estimates the average cost of photochemical restoration of a color feature with sound at $80,000 to $450,000, with digital 2K or 4K restoration running to "several hundred thousand dollars". The degree of physical and chemical damage also influences the incentive to preserve: from a business perspective, once a film is no longer commercially viable it stops generating profit and becomes a financial liability. While few films would not benefit from digital restoration, the high cost still prevents the method from being as broadly applied as it might be.

Demands from new media, digital cinema, and constantly evolving consumer digital formats continue to change, and film restoration facilities must keep pace to maintain audience acceptance. Classic films today must be in near-mint condition if they are to be reshown or resold, with the demand for perfection only rising as theaters move from 2K to 4K projection and consumer media continues its shift from SD to HD to UltraHD and beyond.

Digital restoration steps

Once a film is inspected and cleaned, it is transferred via telecine or a motion picture film scanner to a digital tape or disk, and the audio is synced to create a new master.

Common defects needing restoration include:

  • Dirt/dust
  • Scratches, tears, burned frames
  • Color fade, color change
  • Excessive film grain (a copy of an existing film has all of the film grain from the original as well as the film grain in the copy)
  • Missing scenes and sound (censored or edited out for re-release or television broadcast)
  • Shrinkage

Modern, digital film restoration takes the following steps:

  1. Expertly clean the film of dirt and dust.
  2. Repair all film tears with clear polyester tape or splicing cement.
  3. Scan each frame into a digital file.
  4. Restore the film frame by frame by comparing each frame to adjacent frames. Much of this can be done by computer algorithms, with human checking of the result.
    1. Fix frame alignment ("jitter" and "weave"), the misalignment of adjacent frames caused by movement of the film within the sprockets. Over time the sprocket holes on each side of a frame become distorted, leaving frames slightly off center.
    2. Fix color and lighting changes. This corrects flickering and slight color shifts from one frame to another caused by aging of the film (a minimal deflicker sketch follows this list).
    3. Restore areas blocked by dirt and dust by using parts of images in other frames.
    4. Restore scratches by using parts of images in other frames.
    5. Enhance frames by reducing film grain noise. Foreground and background detail about the same size as the film grain or smaller is blurred or lost when the film is made. Comparing a frame with adjacent frames allows that detail to be reconstructed, since a given small detail falls on different film grains from one frame to the next.
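
As a minimal sketch of the flicker correction described above, the function below rescales each frame so that its mean brightness follows a moving average of its neighbours; it is an illustrative example, not the algorithm of any particular restoration package.

    import numpy as np

    def deflicker(frames, window=5):
        """Reduce global brightness flicker across a sequence of frames.

        `frames` is a list of 2-D grayscale numpy arrays with values in
        [0, 1].  Each frame is rescaled so its mean luminance matches the
        moving average of mean luminances over a small temporal window.
        Illustrative sketch only: real deflicker tools also handle local
        (spatially varying) flicker.
        """
        means = np.array([f.mean() for f in frames])
        corrected = []
        for i, frame in enumerate(frames):
            lo = max(0, i - window // 2)
            hi = min(len(frames), i + window // 2 + 1)
            target = means[lo:hi].mean()
            gain = target / means[i] if means[i] > 0 else 1.0
            corrected.append(np.clip(frame * gain, 0.0, 1.0))
        return corrected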

Photochemical restoration steps

Modern, photochemical restoration follows roughly the same path that digital restoration does:

  1. Extensive research is done to determine what version of the film can be restored from the existing material. Often, extensive efforts are taken to search out alternative material in film archives located around the world.
  2. A comprehensive restoration plan is mapped out that lets preservationists designate "key" elements upon which to base the polarity map for the ensuing photochemical work. Because many alternative elements are salvaged from release prints and duplication masters (foreign and domestic), care must be taken to plot a course by which negative, master positive, and release print elements all arrive back at a common polarity (i.e., negative or positive) for assembly and subsequent printing (see the polarity-tracking sketch after this list).
  3. Test prints are struck from existing elements to evaluate contrast, resolution, color (if color) and sound quality (if audio element exists).
  4. Elements are duplicated using the shortest possible duplication path to minimize analog duplication artifacts, such as the build-up of contrast, grain and loss of resolution.
  5. All sources are assembled into a single master restoration element (most often a duplicate negative).
  6. From this master restoration element, duplication masters, such as composite fine grain masters, are generated to be used to generate additional printing negatives from which actual release prints can be struck for festival screenings and DVD mastering.
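
Because each photochemical printing step inverts polarity, plotting the duplication path is largely a matter of parity bookkeeping. The sketch below models that bookkeeping under the simplifying assumption that every printing step flips polarity once; the element names and polarities are hypothetical examples.

    # Each photochemical printing step inverts polarity (negative <-> positive),
    # so the number of duplication generations needed to bring an element to the
    # target polarity must have the right parity.  Illustrative sketch only.

    def duplications_to_match(start_polarity, target_polarity, min_generations=1):
        """Return the smallest number of printing steps (>= min_generations)
        that takes an element from start_polarity to target_polarity."""
        flips_needed = 0 if start_polarity == target_polarity else 1
        n = max(min_generations, flips_needed)
        if n % 2 != flips_needed:
            n += 1  # add one more generation if the parity is wrong
        return n

    # Hypothetical surviving elements for a restoration targeting a duplicate negative:
    elements = {
        "original camera negative (reel 3)": "negative",
        "fine-grain master positive": "positive",
        "foreign release print": "positive",
    }
    for name, polarity in elements.items():
        steps = duplications_to_match(polarity, "negative")
        print(f"{name}: {steps} duplication step(s) to reach negative polarity")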

Education

The practice of film preservation is more craft than science. Until the early 1990s there were no dedicated academic programs in film preservation. Practitioners typically entered the field through related education (e.g. library or archival science), related technical experience (e.g. film lab work), or sheer passion for working with film.

Over the last two decades, universities around the world have begun offering graduate degrees in film preservation and film archiving, which are often taught conjointly (the latter focusing more on skills related to the description, cataloguing, indexing and, broadly speaking, management of film and media collections).

In recent years, the rapid incursion of digital technologies into the field has somewhat redefined the vocational scope of film preservation. In response, the majority of graduate programs in film preservation have begun offering courses on digital film preservation and digital film and media collection management.

Several universities have established graduate programs dedicated to the field.

Dark Energy Survey

From Wikipedia, the free encyclopedia
 

The Dark Energy Survey (DES) is an astronomical survey designed to constrain the properties of dark energy. It uses images taken in the near-ultraviolet, visible, and near-infrared to measure the expansion of the Universe using Type Ia supernovae, baryon acoustic oscillations, the number of galaxy clusters, and weak gravitational lensing. The collaboration is composed of research institutions and universities from the United States, Australia, Brazil, the United Kingdom, Germany, Spain, and Switzerland. The collaboration is divided into several scientific working groups. The director of DES is Josh Frieman.

The DES began by developing and building the Dark Energy Camera (DECam), an instrument designed specifically for the survey. This camera has a wide field of view and high sensitivity, particularly in the red part of the visible spectrum and in the near infrared. Observations were performed with DECam mounted on the 4-meter Victor M. Blanco Telescope, located at the Cerro Tololo Inter-American Observatory (CTIO) in Chile. Observing sessions ran from 2013 to 2019; as of 2021 the DES collaboration has published results from the first three years of the survey.

DECam

A Sky Full of Galaxies.

DECam, short for the Dark Energy Camera, is a large camera built to replace the previous prime focus camera on the Victor M. Blanco Telescope. The camera consists of three major components: mechanics, optics, and CCDs.

Mechanics

The mechanics of the camera consist of a filter changer with an 8-filter capacity and a shutter. There is also an optical barrel that supports 5 corrector lenses, the largest of which is 98 cm in diameter. These components are attached to the CCD focal plane, which is cooled to −100 °C with liquid nitrogen in order to reduce thermal noise in the CCDs. The focal plane is also kept in an extremely low vacuum of 10⁻⁶ Torr to prevent the formation of condensation on the sensors. The entire camera with lenses, filters, and CCDs weighs approximately 4 tons. When mounted at the prime focus it was supported by a hexapod system allowing for real-time focal adjustment.

Optics

The camera is outfitted with u, g, r, i, z, and Y filters spanning roughly 340–1070 nm, similar to those used in the Sloan Digital Sky Survey (SDSS). This allows DES to obtain photometric redshift measurements to z≈1. DECam also contains five lenses acting as corrector optics to extend the telescope's field of view to a diameter of 2.2°, one of the widest fields of view available for ground-based optical and infrared imaging. One significant difference between previous charge-coupled devices (CCDs) at the Victor M. Blanco Telescope and DECam is the improved quantum efficiency in the red and near-infrared wavelengths.

CCDs

Simulated image of the DECam CCD array at focal plane. Each large rectangle is a single CCD. The green rectangle circled in red in the upper left corner shows the size of the iPhone 4 camera CCD at the same scale.
 
The Dark Energy Camera's 1 millionth exposure. The 1 millionth exposure has been combined with 127 earlier exposures to make this view of the field. 

The scientific sensor array on DECam is an array of 62 2048×4096 pixel back-illuminated CCDs totaling 520 megapixels; an additional 12 2048×2048 pixel CCDs (50 Mpx) are used for guiding the telescope, monitoring focus, and alignment. The full DECam focal plane contains 570 megapixels. The CCDs for DECam use high resistivity silicon manufactured by Dalsa and LBNL with 15×15 micron pixels. By comparison, the OmniVision Technologies back-illuminated CCD that was used in the iPhone 4 has a 1.75×1.75 micron pixel with 5 megapixels. The larger pixels allow DECam to collect more light per pixel, improving low light sensitivity which is desirable for an astronomical instrument. DECam's CCDs also have a 250-micron crystal depth; this is significantly larger than most consumer CCDs. The additional crystal depth increases the path length travelled by entering photons. This, in turn, increases the probability of interaction and allows the CCDs to have an increased sensitivity to lower energy photons, extending the wavelength range to 1050 nm. Scientifically this is important because it allows one to look for objects at a higher redshift, increasing statistical power in the studies mentioned above. When placed in the telescope's focal plane each pixel has a width of 0.263″ on the sky, resulting in a total field of view of 3 square degrees.
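
The pixel counts and field of view quoted above follow from simple arithmetic, as the short sketch below illustrates (its inputs are only the figures already given in this section):

    # Back-of-the-envelope check of the DECam numbers quoted above.
    science_ccds, guide_ccds = 62, 12
    science_px = science_ccds * 2048 * 4096   # ~520 million pixels
    guide_px = guide_ccds * 2048 * 2048       # ~50 million pixels
    total_px = science_px + guide_px          # ~570 million pixels

    pixel_scale = 0.263                       # arcseconds per pixel on the sky
    arcsec2_per_deg2 = 3600.0 ** 2
    fov_deg2 = total_px * pixel_scale ** 2 / arcsec2_per_deg2

    print(f"science: {science_px/1e6:.0f} Mpx, guide: {guide_px/1e6:.0f} Mpx, "
          f"total: {total_px/1e6:.0f} Mpx, field of view ~{fov_deg2:.1f} deg^2")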

Survey

DES imaged 5,000 square degrees of the southern sky in a footprint that overlaps with the South Pole Telescope and Stripe 82 (in large part avoiding the Milky Way). The survey took 758 observing nights spread over six annual sessions between August and February to complete, covering the survey footprint ten times in five photometric bands (g, r, i, z, and Y). The survey reached a depth of 24th magnitude in the i band over the entire survey area. Longer exposure times and a faster observing cadence were used in five smaller patches totaling 30 square degrees to search for supernovae.

First light was achieved on 12 September 2012; after a verification and testing period, scientific survey observations started in August 2013. The last observing session was completed on 9 January 2019.

Observing

The footprint of the wide-area survey on the sky (colored region) in celestial coordinates; the dashed curve shows the approximate location of the Milky Way disk in these coordinates.

Each year from August through February, observers stay in dormitories on the mountain. During a weeklong period of work, observers sleep during the day and use the telescope and camera at night. Some DES members work at the telescope console to monitor operations while others monitor camera operations and data processing.

For the wide-area footprint observations, DES takes a new image roughly every two minutes: exposures are typically 90 seconds long, with another 30 seconds needed to read out the camera and slew the telescope to its next target. In planning each exposure, the team also needs to account for changing sky conditions, such as moonlight and cloud cover.

In order to get better images, the DES team uses a computer algorithm called the "Observing Tactician" (ObsTac) to help sequence observations. It optimizes among different factors, such as the date and time, weather conditions, and the position of the moon. ObsTac automatically points the telescope in the best direction and selects the exposure, using the best light filter. It also decides whether to take a wide-area or time-domain survey image, depending on whether or not the exposure will also be used for supernova searches.
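
ObsTac itself is a DES-specific tool, but the kind of trade-off it automates can be sketched as a scoring function over candidate pointings. The attributes, weights, and thresholds below are illustrative assumptions for a toy scheduler, not ObsTac's actual logic.

    import math

    def score_field(airmass, moon_separation_deg, moon_brightness,
                    visits_remaining, cloud_extinction_mag):
        """Toy quality score for a candidate pointing (higher is better).

        Penalizes high airmass, proximity to a bright moon, and cloud
        extinction; rewards fields that still need visits.  Purely an
        illustrative sketch of sequencing trade-offs, not the real ObsTac.
        """
        if airmass > 2.0 or cloud_extinction_mag > 1.5:
            return -math.inf                      # unusable right now
        score = visits_remaining                  # prefer under-observed fields
        score -= 2.0 * (airmass - 1.0)            # prefer fields near zenith
        score -= moon_brightness * max(0.0, 60.0 - moon_separation_deg) / 60.0
        score -= cloud_extinction_mag
        return score

    # Pick the best of several hypothetical fields.
    fields = {
        "wide-A": dict(airmass=1.2, moon_separation_deg=80, moon_brightness=0.3,
                       visits_remaining=4, cloud_extinction_mag=0.1),
        "SN-C3":  dict(airmass=1.6, moon_separation_deg=35, moon_brightness=0.3,
                       visits_remaining=7, cloud_extinction_mag=0.1),
    }
    best = max(fields, key=lambda name: score_field(**fields[name]))
    print("observe:", best)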

Results

Cosmology

Constraints on a measure of the clumpiness of the matter distribution (S8) and the fractional density of the Universe in matter (Ωm) from the combined 3 DES Y1 measurements (blue), Planck CMB measurements (green), and their combination (red).

The Dark Energy Survey collaboration has published several papers presenting its cosmology results, most of them based on the first-year and third-year data. The results were obtained with a multi-probe methodology that combines galaxy-galaxy lensing, cosmic shear, galaxy clustering, and the photometric data set.

From the first-year data, the DES collaboration presented cosmological constraints from the combination of galaxy clustering and weak lensing, quoting 68% confidence limits on S8 and Ωm in both the ΛCDM and wCDM models. Using what was then the most significant measurement of cosmic shear in a galaxy survey, the collaboration reported corresponding 68% confidence limits for ΛCDM. Other cosmological analyses of the first-year data included a derivation and validation of redshift distribution estimates, and their uncertainties, for the galaxies used as weak-lensing sources. The team also published a paper summarizing the photometric data set used for the first-year cosmology analyses.
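
For reference, the "clumpiness" parameter S8 quoted in these constraints (and shown in the figure above) is conventionally defined from the amplitude of matter fluctuations σ8 and the matter density Ωm as:

    S_8 \equiv \sigma_8 \sqrt{\Omega_m / 0.3}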

With the third-year data, the collaboration updated these constraints using new cosmic shear measurements, and the combination of galaxy clustering and weak lensing yielded tighter 68% confidence limits on S8 and Ωm in both the ΛCDM and wCDM models. The team also published the third-year photometric data set for cosmology, comprising nearly 5,000 deg² of grizY imaging in the south Galactic cap and nearly 390 million objects, with depth reaching S/N ∼ 10 for extended objects up to magnitude ∼ 23.0 and top-of-the-atmosphere photometric uniformity better than 3 mmag.

Weak lensing

DES's 2021 dark matter map, made from a weak gravitational lensing data set; the mapped matter lies in the foreground of the observed galaxies.

Weak lensing was measured statistically by measuring the shear-shear correlation function, a two-point function, or its Fourier transform, the shear power spectrum. In April 2015, the Dark Energy Survey released mass maps using cosmic shear measurements of about 2 million galaxies from the science verification data taken between August 2012 and February 2013. In 2021 weak lensing was used to map the dark matter in a region of the southern hemisphere sky, and in 2022 it was combined with galaxy clustering data to give new cosmological constraints.
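
In standard weak-lensing notation (not specific to DES), the shear-shear correlation functions are built from the tangential and cross components of the shear for galaxy pairs separated by an angle θ:

    \xi_{\pm}(\theta) = \langle \gamma_t \gamma_t \rangle(\theta) \pm \langle \gamma_{\times} \gamma_{\times} \rangle(\theta)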

Another important part of the weak lensing work is calibrating the redshifts of the source galaxies. In December 2020 and June 2021, the DES team published two papers describing how the redshifts of the source galaxies were calibrated in order to map the matter density field with gravitational lensing.

Gravitational Waves

After LIGO detected GW170817, the first gravitational-wave signal from a binary neutron star merger, DES made follow-up observations using DECam. Having independently discovered the optical source with DECam, the DES team established its association with GW170817 by showing that none of the 1,500 other sources found within the event localization region could plausibly be associated with the event. The team monitored the source for over two weeks and provided the light curve data as a machine-readable file. From the observation data set, DES concluded that the optical counterpart identified near NGC 4993 is associated with GW170817. This discovery ushered in the era of multi-messenger astronomy with gravitational waves and demonstrated the power of DECam to identify the optical counterparts of gravitational-wave sources.

Dwarf galaxies

Spiral Galaxy NGC 0895 imaged by DES

In March 2015, two teams released their discoveries of several new potential dwarf galaxy candidates found in Year 1 DES data. In August 2015, the Dark Energy Survey team announced the discovery of eight additional candidates in Year 2 DES data. The team has since found further dwarf galaxy candidates. With more of these results, the team has been able to examine the properties of the detected dwarf galaxies in greater depth, including their chemical abundances, stellar population structure, stellar kinematics, and metallicities. In February 2019, the team also discovered a sixth star cluster in the Fornax dwarf spheroidal galaxy and a tidally disrupted ultra-faint dwarf galaxy.

Baryon Acoustic Oscillations

The signature of baryon acoustic oscillations (BAO) can be observed in the distribution of tracers of the matter density field and used to measure the expansion history of the Universe. BAO can also be measured using purely photometric data, though at lower significance. The DES observation sample consists of 7 million galaxies distributed over a footprint of 4,100 deg² with photometric redshifts 0.6 < z < 1.1 and a typical redshift uncertainty of 0.03(1+z). The team combines the likelihoods derived from angular correlations and spherical harmonics to constrain the ratio of the comoving angular diameter distance at the effective redshift of the sample to the sound horizon scale at the drag epoch.
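
In standard notation, and assuming a spatially flat universe, the quantity constrained by such a photometric BAO analysis is the ratio of the comoving angular diameter distance at the sample's effective redshift to the sound horizon at the drag epoch:

    \frac{D_M(z_{\mathrm{eff}})}{r_d}, \qquad D_M(z) = \int_0^z \frac{c \, \mathrm{d}z'}{H(z')}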

Type Ia supernovae

A Type Ia supernova.

In May 2019, the Dark Energy Survey team published its first cosmology results using Type Ia supernovae, based on the DES-SN3YR sample. The team found Ωm = 0.331 ± 0.038 with a flat ΛCDM model and Ωm = 0.321 ± 0.018, w = −0.978 ± 0.059 with a flat wCDM model. Analyzing the same DES-SN3YR data, they also derived a new measurement of the Hubble constant, in excellent agreement with the Hubble constant measured by the Planck satellite collaboration in 2018. In June 2019, a follow-up paper by the DES team discussed the systematic uncertainties and the validation of using Type Ia supernovae to measure these cosmology results. The team also published its photometric pipeline and light-curve data in another paper the same month.
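
The cosmological content of a Type Ia supernova sample comes from comparing measured distance moduli with the prediction of a given cosmological model; in standard notation (and for a spatially flat universe), with the luminosity distance depending on Ωm, w, and H0:

    \mu = m - M = 5\log_{10}\!\left(\frac{d_L}{10\,\mathrm{pc}}\right), \qquad d_L(z) = (1+z)\, D_M(z)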

Minor planets

Several minor planets were discovered by DECam in the course of the Dark Energy Survey, including high-inclination trans-Neptunian objects (TNOs).

List of minor planets discovered by DES

Numbered MP designation    Discovery date
(451657) 2012 WD36         19 November 2012
(471954) 2013 RM98         8 September 2013
(472262) 2014 QN441        18 August 2014
(483002) 2014 QS441        19 August 2014
(491767) 2012 VU113        15 November 2012
(491768) 2012 VV113        15 November 2012
(495189) 2012 VR113        28 September 2012
(495190) 2012 VS113        12 November 2012
(495297) 2013 TJ159        13 October 2013

Discoveries are credited either to "DECam" or "Dark Energy Survey".

The MPC has assigned the IAU code W84 for DECam's observations of small Solar System bodies. As of October 2019, the MPC inconsistently credits the discovery of 9 numbered minor planets, all of them trans-Neptunian objects, to either "DECam" or "Dark Energy Survey". The list does not contain any unnumbered minor planets potentially discovered by DECam, as discovery credits are only given upon a body's numbering, which in turn depends on a sufficiently secure orbit determination.

