
Sunday, August 6, 2023

Compiler-compiler

From Wikipedia, the free encyclopedia

In computer science, a compiler-compiler or compiler generator is a programming tool that creates a parser, interpreter, or compiler from some form of formal description of a programming language and machine.

The most common type of compiler-compiler is more precisely called a parser generator. It only handles syntactic analysis.

The input of a parser generator is a grammar file, typically written in Backus–Naur form (BNF) or extended Backus–Naur form (EBNF), that defines the syntax of a target programming language.

The output is the source code of a parser for the programming language. Compiling this source code produces the parser itself, which may be either standalone or embedded. The parser takes as input the source code of a program in the target programming language and performs some action or outputs an abstract syntax tree (AST).

Parser generators do not handle the semantics of the AST, or the generation of machine code for the target machine.
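To make the pipeline concrete, here is a minimal Python sketch (not the output of any real parser generator; all names are illustrative) of a tokenizer and parser for a tiny expression grammar that builds an AST from nested tuples:

  # Grammar (EBNF):  expr = number , { ("+" | "-") , number } ;
  import re

  def tokenize(src):
      # Yield (kind, text) pairs; this is the lexer half of the pipeline.
      for m in re.finditer(r"\s*(\d+|[+-])", src):
          text = m.group(1)
          yield ("NUMBER" if text.isdigit() else "OP", text)

  def parse_expr(tokens):
      # Return an abstract syntax tree built from nested tuples.
      tokens = list(tokens)
      pos = 0
      def number():
          nonlocal pos
          kind, text = tokens[pos]
          assert kind == "NUMBER", "expected a number"
          pos += 1
          return ("NUM", int(text))
      tree = number()
      while pos < len(tokens) and tokens[pos][0] == "OP":
          op = tokens[pos][1]
          pos += 1
          tree = ("ADD" if op == "+" else "SUB", tree, number())
      return tree

  print(parse_expr(tokenize("1 + 2 - 3")))
  # ('SUB', ('ADD', ('NUM', 1), ('NUM', 2)), ('NUM', 3))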

A metacompiler is a software development tool used mainly in the construction of compilers, translators, and interpreters for other programming languages. The input to a metacompiler is a computer program written in a specialized programming metalanguage designed mainly for the purpose of constructing compilers. The language of the compiler produced is called the object language. The minimal input producing a compiler is a metaprogram specifying the object language grammar and semantic transformations into an object program.

Variants

A typical parser generator associates executable code with each rule of the grammar; this code is executed when the rule is applied by the parser. These pieces of code are sometimes referred to as semantic action routines, since they define the semantics of the syntactic structure that is analyzed by the parser. Depending upon the type of parser that should be generated, these routines may construct a parse tree (or abstract syntax tree), or generate executable code directly.
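As an illustration, the following Python sketch shows the idea of semantic action routines: each hypothetical action function corresponds to one grammar rule and here builds an AST node; swapping the bodies for print statements would emit code directly instead. The rule syntax in the comments and all names are invented for this example:

  def act_add(left, right):          # action for  expr : expr "+" term
      return ("ADD", left, right)

  def act_num(text):                 # action for  term : NUMBER
      return ("NUM", int(text))

  # A hand-written driver that applies the actions; a real parser
  # generator would call them automatically as rules are recognized.
  tree = act_add(act_num("1"), act_num("2"))
  print(tree)                        # ('ADD', ('NUM', 1), ('NUM', 2))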

One of the earliest (1964) and surprisingly powerful compiler-compilers is META II, which accepted an analytical grammar with output facilities that produce stack machine code, and was able to compile its own source code and other languages.

Among the earliest programs of the original Unix versions built at Bell Labs was the two-part lex and yacc system, which was normally used to output C programming language code but had a flexible output system that could be used for everything from programming languages to text file conversion. Their modern GNU versions are flex and bison.

Some experimental compiler-compilers take as input a formal description of programming language semantics, typically using denotational semantics. This approach is often called 'semantics-based compiling', and was pioneered by Peter Mosses' Semantic Implementation System (SIS) in 1978. However, both the generated compiler and the code it produced were inefficient in time and space. No production compilers are currently built in this way, but research continues.

The Production Quality Compiler-Compiler (PQCC) project at Carnegie Mellon University does not formalize semantics, but does have a semi-formal framework for machine description.

Compiler-compilers exist in many flavors, including bottom-up rewrite machine generators (see JBurg) used to tile syntax trees according to a rewrite grammar for code generation, and attribute grammar parser generators (e.g. ANTLR can be used for simultaneous type checking, constant propagation, and more during the parsing stage).

Metacompilers

Metacompilers reduce the task of writing compilers by automating the aspects that are the same regardless of the object language. This makes possible the design of domain-specific languages which are appropriate to the specification of a particular problem. A metacompiler reduces the cost of producing translators for such domain-specific object languages to the point where it becomes economically feasible to include the design of a domain-specific language in the solution of a problem.

As a metacompiler's metalanguage will usually be a powerful string- and symbol-processing language, metacompilers often have strong general-purpose applications, including generating a wide range of other software engineering and analysis tools.

Besides being useful for domain-specific language development, a metacompiler is a prime example of a domain-specific language, designed for the domain of compiler writing.

A metacompiler is a metaprogram usually written in its own metalanguage or in an existing computer programming language. The process of a metacompiler, written in its own metalanguage, compiling itself is equivalent to a self-hosting compiler. Most compilers written today are self-hosting compilers. Self-hosting is a powerful tool of many metacompilers, allowing the easy extension of their own metaprogramming metalanguage. The feature that sets a metacompiler apart from other compiler-compilers is that it takes as input a specialized metaprogramming language that describes all aspects of the compiler's operation. A metaprogram produced by a metacompiler is as complete a program as a program written in C++, BASIC, or any other general programming language. The metaprogramming metalanguage is a powerful attribute allowing easier development of computer programming languages and other computer tools. Command-line processors and text-string transformation and analysis tools are easily coded using the metaprogramming metalanguages of metacompilers.

A full-featured development package includes a linker and a run-time support library. Usually, a machine-oriented system programming language, such as C or C++, is needed to write the support library. A library consisting of support functions needed for the compiling process usually completes the full metacompiler package.

The meaning of metacompiler

In computer science, the prefix meta is commonly used to mean about (its own category). For example, metadata are data that describe other data. A language that is used to describe other languages is a metalanguage. Meta may also mean on a higher level of abstraction. A metalanguage operates on a higher level of abstraction in order to describe properties of a language. Backus–Naur form (BNF) is a formal metalanguage originally used to define ALGOL 60. BNF is a weak metalanguage, for it describes only the syntax and says nothing about the semantics or meaning. Metaprogramming is the writing of computer programs with the ability to treat programs as their data. A metacompiler takes as input a metaprogram written in a specialized metalanguage (a higher-level abstraction) specifically designed for the purpose of metaprogramming. The output is an executable object program.

An analogy can be drawn: just as a C++ compiler takes as input a C++ programming language program, a metacompiler takes as input a metaprogramming metalanguage program.

Forth metacompiler

Many advocates of the language Forth call the process of creating a new implementation of Forth meta-compilation and hold that it constitutes a metacompiler. The Forth definition of metacompiler is:

"A metacompiler is a compiler which processes its own source code, resulting in an executable version of itself."

This Forth use of the term metacompiler is disputed in mainstream computer science. See Forth (programming language) and History of compiler construction. The actual Forth process of compiling itself is a combination of Forth being a self-hosting extensible programming language and, sometimes, cross compilation, long-established terminology in computer science. Metacompilers are a general compiler-writing system. Moreover, the Forth metacompiler concept is indistinguishable from a self-hosting, extensible language. The actual process acts at a lower level, defining a minimum subset of Forth words that can be used to define additional Forth words; a full Forth implementation can then be defined from the base set. This sounds like a bootstrap process. The problem is that almost every general-purpose language compiler also fits the Forth metacompiler description.

When (self-hosting compiler) X processes its own source code, resulting in an executable version of itself, X is a metacompiler.

Just replace X with any common language: C, C++, Pascal, COBOL, Fortran, Ada, Modula-2, and so on, and X would be a metacompiler according to the Forth usage of metacompiler. A metacompiler operates at an abstraction level above the compiler it compiles. It only operates at the same (self-hosting compiler) level when compiling itself. One has to see the problem with this definition of metacompiler: it can be applied to most any language.

However, on examining the concept of programming in Forth, adding new words to the dictionary, extending the language in this way, is metaprogramming. It is this metaprogramming in Forth that makes it a metacompiler.

Programming in Forth is adding new words to the language. Changing the language in this way is metaprogramming. Forth is a metacompiler because Forth is a language specifically designed for metaprogramming: programming in Forth extends Forth, and adding words to the Forth vocabulary creates a new Forth dialect. Forth is a specialized metacompiler for Forth language dialects.

History

Design of the original Compiler Compiler was started by Tony Brooker and Derrick Morris in 1959, with initial testing beginning in March 1962. Brooker's Compiler Compiler was used to create compilers for the new Atlas computer at the University of Manchester, for several languages: Mercury Autocode, Extended Mercury Autocode, Atlas Autocode, ALGOL 60 and ASA Fortran. At roughly the same time, related work was being done by E. T. (Ned) Irons at Princeton, and by Alick Glennie at the Atomic Weapons Research Establishment at Aldermaston, whose "Syntax Machine" paper (declassified in 1977) inspired the META series of translator writing systems mentioned below.

The early history of metacompilers is closely tied with the history of SIG/PLAN Working group 1 on Syntax Driven Compilers. The group was started primarily through the effort of Howard Metcalfe in the Los Angeles area. In the fall of 1962 Howard Metcalfe designed two compiler-writing interpreters. One used a bottom-to-top analysis technique based on a method described by Ledley and Wilson. The other used a top-to-bottom approach based on work by Glennie to generate random English sentences from a context-free grammar.

At the same time, Val Schorre described two "meta machines", one generative and one analytic. The generative machine was implemented and produced random algebraic expressions. Meta I, the first metacompiler, was implemented by Schorre on an IBM 1401 at UCLA in January 1963. His original interpreters and metamachines were written directly in a pseudo-machine language. META II, however, was written in a higher-level metalanguage able to describe its own compilation into the pseudo-machine language.

Lee Schmidt at Bolt, Beranek, and Newman wrote a metacompiler in March 1963 that utilized a CRT display on the time-sharing PDP-1. This compiler produced actual machine code rather than interpretive code and was partially bootstrapped from Meta I.

Schorre bootstrapped Meta II from Meta I during the Spring of 1963. The paper on the refined metacompiler system presented at the 1964 Philadelphia ACM conference is the first paper on a metacompiler available as a general reference. The syntax and implementation technique of Schorre's system laid the foundation for most of the systems that followed. The system was implemented on a small 1401, and was used to implement a small ALGOL-like language.

Many similar systems immediately followed.

Roger Rutman of AC Delco developed and implemented LOGIK, a language for logical design simulation, on the IBM 7090 in January 1964. This compiler used an algorithm that produced efficient code for Boolean expressions.

Another paper in the 1964 ACM proceedings describes Meta III, developed by Schneider and Johnson at UCLA for the IBM 7090. Meta III represents an attempt to produce efficient machine code for a large class of languages. Meta III was implemented completely in assembly language. Two compilers were written in Meta III: CODOL, a compiler-writing demonstration compiler, and PUREGOL, a dialect of ALGOL 60. (It was pure gall to call it ALGOL.)

Late in 1964, Lee Schmidt bootstrapped the metacompiler EQGEN from the PDP-1 to the Beckman 420. EQGEN was a logic equation generating language.

In 1964, System Development Corporation began a major effort in the development of metacompilers. This effort included the powerful metacompilers Book1 and Book2, written in Lisp, which had extensive tree-searching and backup ability. An outgrowth of one of the Q-32 systems at SDC is Meta 5. The Meta 5 system incorporates backup of the input stream and enough other facilities to parse any context-sensitive language. This system was successfully released to a wide number of users and had many string-manipulation applications other than compiling. It has many elaborate push-down stacks, attribute setting and testing facilities, and output mechanisms. That Meta 5 successfully translates JOVIAL programs to PL/I programs demonstrates its power and flexibility.

Robert McClure at Texas Instruments invented a compiler-compiler called TMG (presented in 1965). TMG was used to create early compilers for programming languages like B, PL/I, and ALTRAN. Together with the metacompiler of Val Schorre, it was an early inspiration for the last chapter of Donald Knuth's The Art of Computer Programming.

The LOT system was developed during 1966 at Stanford Research Institute and was modeled very closely after Meta II. It had new special-purpose constructs allowing it to generate a compiler which could, in turn, compile a subset of PL/I. This system had extensive statistic-gathering facilities and was used to study the characteristics of top-down analysis.

SIMPLE is a specialized translator system designed to aid the writing of pre-processors for PL/I. SIMPLE, written in PL/I, is composed of three components: an executive, a syntax analyzer, and a semantic constructor.

The TREE-META compiler was developed at Stanford Research Institute in Menlo Park, California, in April 1968. The early metacompiler history is well documented in the TREE-META manual. TREE-META paralleled some of the SDC developments. Unlike earlier metacompilers, it separated the semantics processing from the syntax processing. The syntax rules contained tree-building operations that combined recognized language elements with tree nodes. The tree structure representation of the input was then processed by a simple form of unparse rules. The unparse rules used node recognition and attribute testing that, when matched, resulted in the associated action being performed. In addition, like tree elements could also be tested in an unparse rule. Unparse rules were also a recursive language, able to call unparse rules, passing elements of the tree, before the action of the unparse rule was performed.

The concept of the metamachine originally put forth by Glennie is so simple that three hardware versions have been designed and one actually implemented, the latter at Washington University in St. Louis. This machine was built from macro-modular components and has for instructions the codes described by Schorre.

CWIC (Compiler for Writing and Implementing Compilers) is the last known Schorre metacompiler. It was developed at System Development Corporation by Erwin Book, Dewey Val Schorre, and Steven J. Sherman. With the full power of LISP 2, a list processing language, optimizing algorithms could operate on syntax-generated lists and trees before code generation. CWIC also had a symbol table built into the language.

With the resurgence of domain-specific languages and the need for parser generators which are easy to use, easy to understand, and easy to maintain, metacompilers are becoming a valuable tool for advanced software engineering projects.

Other examples of parser generators in the yacc vein are ANTLR, Coco/R, CUP, GNU Bison, Eli, FSL, SableCC, SID (Syntax Improving Device), and JavaCC. While useful, pure parser generators only address the parsing part of the problem of building a compiler. Tools with broader scope, such as PQCC, Coco/R and DMS Software Reengineering Toolkit provide considerable support for more difficult post-parsing activities such as semantic analysis, code optimization and generation.

Schorre metalanguages

The earliest Schorre metacompilers, META I and META II, were developed by D. Val Schorre at UCLA. Other Schorre-based metacompilers followed, each adding improvements to language analysis and/or code generation.

In programming, it is common to use the programming language name to refer to both the compiler and the programming language, the context distinguishing the meaning. A C++ program is compiled using a C++ compiler. The same applies in the following: for example, META II is both the compiler and the language.

The metalanguages in the Schorre line of metacompilers are functional programming languages that use top-down grammar analysis expressed as syntax equations with embedded output transformation constructs.

A syntax equation:

<name> = <body>;

is a compiled test function returning success or failure. <name> is the function name. <body> is a form of logical expression consisting of tests that may be grouped, have alternates, and output productions. A test is like a boolean in other languages: success is true and failure is false.

Defining a programming language analytically top down is natural. For example, a program could be defined as:

 program = $declaration;

This defines a program as a sequence of zero or more declarations.

In the Schorre META X languages there is a driving rule. The program rule above is an example of a driving rule. The program rule is a test function that calls declaration, a test rule that returns success or failure. The $ loop operator repeatedly calls declaration until failure is returned. The $ operator is always successful, even when there are zero declarations, so the program rule above always returns success. (In CWIC, a long fail can bypass declaration; a long-fail is part of the backtracking system of CWIC.)
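A rough Python sketch of how such a driving rule behaves (the rule names and the one-character "declaration" are purely illustrative):

  # Each rule is a test function returning success (True) or failure (False).
  def declaration(src):
      # Illustrative stand-in: "recognize" one 'd' character and consume it.
      if src and src[0] == "d":
          del src[0]
          return True
      return False

  def program(src):
      # $declaration: call declaration until it fails; always succeed.
      while declaration(src):
          pass
      return True

  source = list("ddd")
  print(program(source), source)   # True []  (three declarations consumed)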

The character sets of these early compilers were limited. The character / was used for the alternant (or) operator. "A or B" is written as A / B. Parentheses ( ) are used for grouping.

A (B / C)

This describes a construct of A followed by B or C. As a boolean expression it would be

A and (B or C)

A sequence X Y has an implied "X and Y" meaning. Parentheses ( ) are grouping and / is the or operator. The order of evaluation is always left to right, as an input character sequence is specified by the ordering of the tests.

Special operator words whose first character is a "." are used for clarity. .EMPTY is used as the last alternate when no previous alternant need be present.

X (A / B / .EMPTY)

This indicates that X is optionally followed by A or B. This is a specific characteristic of these metalanguages being programming languages: backtracking is avoided by the above. Other compiler-constructor systems may have declared the three possible sequences and left it up to the parser to figure it out.
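A hypothetical Python rendering of X (A / B / .EMPTY) as ordered tests, where each test consumes input only on success and .EMPTY is simply an alternative that always succeeds (all names are invented for the example):

  def match(ch):
      def test(src):
          if src and src[0] == ch:
              del src[0]
              return True
          return False
      return test

  X, A, B = match("x"), match("a"), match("b")

  def rule(src):
      # X followed by (A / B / .EMPTY): alternatives are tried left to right;
      # .EMPTY as the last alternative makes A or B optional.
      if not X(src):
          return False
      return A(src) or B(src) or True     # True plays the role of .EMPTY

  for text in ("xa", "xb", "x"):
      s = list(text)
      print(text, rule(s), s)    # all three succeed with the input fully consumed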

The characteristics of the metaprogramming metalanguages above are common to all Schorre metacompilers and those derived from them.

META I

META I was a hand-compiled metacompiler used to compile META II. Little else is known of META I except that the initial compilation of META II produced nearly identical code to that of the hand-coded META I compiler.

META II

Each rule consists optionally of tests, operators, and output productions. A rule attempts to match some part of the input program source character stream returning success or failure. On success the input is advanced over matched characters. On failure the input is not advanced.

Output productions produced a form of assembly code directly from a syntax rule.
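For illustration, a small Python sketch of a META II-style rule: it advances over matched characters on success, leaves the input untouched on failure, and fires an output production as a side effect. The emitted instruction text is invented, not META II's actual output:

  def number(src, out):
      digits = ""
      while src and src[0].isdigit():
          digits += src.pop(0)
      if digits:
          out.append("LIT " + digits)   # output production fires on success
          return True
      return False                       # on failure the input is unchanged

  code = []
  stream = list("42")
  print(number(stream, code), code)      # True ['LIT 42']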

TREE-META

TREE-META introduced tree-building operators :<node_name> and [<number>], moving the output production transforms to unparse rules. The tree-building operators were used in the grammar rules, directly transforming the input into an abstract syntax tree. Unparse rules are also test functions that match tree patterns. Unparse rules are called from a grammar rule when an abstract syntax tree is to be transformed into output code. The building of an abstract syntax tree and unparse rules allowed local optimizations to be performed by analyzing the parse tree.

Moving the output productions to the unparse rules made a clear separation of grammar analysis and code production. This made the programs easier to read and understand.
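The separation can be sketched in Python as follows (the node names and the stack-machine output are assumptions for the example, not TREE-META's real notation): syntax rules build a tree, and unparse rules pattern-match the tree to emit code.

  def build_add(x, y):
      return ("ADD", x, y)            # what a tree-building operator produces

  def unparse(tree, out):
      # Unparse rule: test the node name, recurse on subtrees, emit code.
      if isinstance(tree, tuple) and tree[0] == "ADD":
          unparse(tree[1], out)
          unparse(tree[2], out)
          out.append("ADD")
          return True
      if isinstance(tree, int):
          out.append("PUSH %d" % tree)
          return True
      return False                    # no pattern matched: failure

  code = []
  unparse(build_add(1, build_add(2, 3)), code)
  print(code)    # ['PUSH 1', 'PUSH 2', 'PUSH 3', 'ADD', 'ADD']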

CWIC

In 1968–1970, Erwin Book, Dewey Val Schorre, and Steven J. Sherman developed CWIC (Compiler for Writing and Implementing Compilers) at System Development Corporation (Charles Babbage Institute Center for the History of Information Technology, Box 12, folder 21).

CWIC is a compiler development system composed of three special-purpose, domain-specific languages, each intended to permit the description of certain aspects of translation in a straightforward manner. The syntax language is used to describe the recognition of source text and the construction from it of an intermediate tree structure. The generator language is used to describe the transformation of the tree into the appropriate object language.

The syntax language follows Dewey Val Schorre's previous line of metacompilers. It most resembles TREE-META, having tree-building operators in the syntax language. The unparse rules of TREE-META are extended to work with the object-based generator language based on LISP 2.

CWIC includes three languages:

  • Syntax: Transforms the source program input into list structures using grammar transformation formulas. A parsed expression structure is passed to a generator by placing a generator call in a rule. A tree is represented by a list whose first element is a node object. The language has operators, < and >, specifically for making lists. The colon : operator is used to create node objects; :ADD creates an ADD node. The exclamation ! operator combines a number of parsed entries with a node to make a tree. Trees created by syntax rules are passed to generator functions, returning success or failure. The syntax language is very close to TREE-META; both use a colon to create a node. CWIC's tree-building exclamation !<number> functions the same as TREE-META's [<number>]. (See the sketch after this list.)
  • Generator: a named series of transforming rules, each consisting of an unparse (pattern matching) rule and an output production written in a LISP 2-like language. The translation was to IBM 360 binary machine code. Other facilities of the generator language generalized output.
  • MOL-360: an independent mid level implementation language for the IBM System/360 family of computers developed in 1968 and used for writing the underlying support library.
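As a rough Python analogue of the syntax language's tree building described in the list above (the node representation and function names are invented), :ADD followed by !2 can be pictured as making a node object and combining it with the top two parsed entries:

  def node(name):
      return {"node": name}                 # stand-in for a CWIC node object

  def build_tree(node_obj, parse_stack, arity):
      # '!arity' pops 'arity' parsed entries and builds [node, e1, ..., en].
      args = parse_stack[-arity:]
      del parse_stack[-arity:]
      parse_stack.append([node_obj] + args)

  stack = ["a", "b"]                        # two parsed entries
  build_tree(node("ADD"), stack, 2)         # corresponds to  :ADD !2
  print(stack)                              # [[{'node': 'ADD'}, 'a', 'b']]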

Generators language

Generators Language had semantics similar to Lisp. The parse tree was thought of as a recursive list. The general form of a Generator Language function is:

  function-name(first-unparse_rule) => first-production_code_generator
           (second-unparse_rule) => second-production_code_generator
           (third-unparse_rule) => third-production_code_generator
               ...

The code to process a given tree included the features of a general-purpose programming language, plus a form: <stuff>, which would emit (stuff) onto the output file. A generator call may be used in the unparse_rule. The generator is passed the element of the unparse_rule pattern in which it is placed, and its return values are listed in (). For example:

  expr_gen(ADD[expr_gen(x),expr_gen(y)]) =>
               <AR + (x*16)+y;>
               releasereg(y);
               return x;
          (SUB[expr_gen(x),expr_gen(y)]) =>
               <SR + (x*16)+y;>
               releasereg(y);
               return x;
          (MUL[expr_gen(x),expr_gen(y)]) =>
               .
               .
               .
          (x) =>
               r1 = getreg();
               load(r1, x);
               return r1;
  ...

That is, if the parse tree looks like (ADD[<something1>,<something2>]), expr_gen(x) would be called with <something1> and return x. A variable in the unparse rule is a local variable that can be used in the production_code_generator. expr_gen(y) is called with <something2> and returns y. A generator call in an unparse rule is passed the element in the position it occupies. Hopefully, in the above, x and y will be registers on return. The last transform is intended to load an atom into a register and return the register. The first production would be used to generate the 360 "AR" (Add Register) instruction with the appropriate values in general registers. The above example is only a part of a generator. Every generator expression evaluates to a value that can then be further processed. The last transform could just as well have been written as:

               (x)=>     return load(getreg(), x);

In this case, load returns its first parameter, the register returned by getreg(). The functions load and getreg are other CWIC generators.
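A hedged Python analogue of the expr_gen generator above may help: it pattern-matches on the node name, recurses on the subtrees, emits an add or subtract instruction, and frees a register, mirroring the getreg/releasereg calls. The register pool and instruction text are illustrative only, not CWIC's actual output:

  free_regs = [3, 2, 1, 0]

  def getreg():
      return free_regs.pop()

  def releasereg(r):
      free_regs.append(r)

  def expr_gen(tree, out):
      if isinstance(tree, tuple) and tree[0] in ("ADD", "SUB"):
          x = expr_gen(tree[1], out)
          y = expr_gen(tree[2], out)
          out.append(("AR" if tree[0] == "ADD" else "SR") + " %d,%d" % (x, y))
          releasereg(y)
          return x                       # the result stays in register x
      r = getreg()                       # atom: load it into a fresh register
      out.append("L %d,%s" % (r, tree))
      return r

  code = []
  expr_gen(("ADD", "a", ("SUB", "b", "c")), code)
  print(code)   # ['L 0,a', 'L 1,b', 'L 2,c', 'SR 1,2', 'AR 0,1']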

CWIC addressed domain-specific languages before the term domain-specific language existed.

From the authors of CWIC:

"A metacompiler assists the task of compiler-building by automating its non creative aspects, those aspects that are the same regardless of the language which the produced compiler is to translate. This makes possible the design of languages which are appropriate to the specification of a particular problem. It reduces the cost of producing processors for such languages to a point where it becomes economically feasible to begin the solution of a problem with language design."

Examples

Tunnel boring machine

From Wikipedia, the free encyclopedia
[Images: one of the boring machines used for the Channel Tunnel between France and the United Kingdom; a tunnel boring machine used to excavate the Gotthard Base Tunnel, Switzerland, the world's longest rail tunnel; a tunnel boring machine that was used at the Yucca Mountain nuclear waste repository]

A tunnel boring machine (TBM), also known as a "mole", is a machine used to excavate tunnels. Tunnels are excavated through hard rock, wet or dry soil, or sand. TBMs have been customized to work with each material. TBM-bored tunnel cross-sections range from one metre (3.3 ft) (micro-TBMs) to 17.6 metres (58 ft) to date. Narrower tunnels are typically bored using trenchless construction methods or horizontal directional drilling rather than TBMs. TBM tunnels are typically circular in cross-section, although they can be customized for U-shaped, horseshoe, square or rectangular tunnels.

Tunnel boring machines are an alternative to drilling and blasting (D&B) methods and "hand mining". TBMs may be used for microtunneling.

TBMs limit the disturbance to the surrounding ground and produce a smooth tunnel wall. This reduces the cost of lining the tunnel, and is suitable for use in urban areas. TBMs are expensive to construct, and larger ones are difficult to transport. These up-front costs become less significant for longer tunnels. Tunneling with TBMs is much more efficient and shortens completion times. TBMs are not used for tunnels in heavily fractured and sheared rock.

History

[Images: cutting shield used for the New Elbe Tunnel; top view of a model of the TBM used on the Gotthard Base Tunnel; looking towards the cutting shield at the hydraulic jacks; a tunnel boring machine cutter head being lowered underground for the construction of Sydney Metro]

The first successful tunnelling shield was developed by Sir Marc Isambard Brunel to excavate the Thames Tunnel in 1825. However, this was only the invention of the shield concept and did not involve the construction of a complete tunnel boring machine, the digging still having to be accomplished by the then standard excavation methods.

The first boring machine reported to have been built was Henri Maus's Mountain Slicer. Commissioned by the King of Sardinia in 1845 to dig the Fréjus Rail Tunnel between France and Italy through the Alps, Maus had it built in 1846 in an arms factory near Turin. It consisted of more than 100 percussion drills mounted in the front of a locomotive-sized machine, mechanically power-driven from the entrance of the tunnel. The Revolutions of 1848 affected the funding, and the tunnel was not completed until 10 years later, by using less innovative and less expensive methods such as pneumatic drills.

In the United States, the first boring machine to have been built was used in 1853 during the construction of the Hoosac Tunnel in northwest Massachusetts. Made of cast iron, it was known as Wilson's Patented Stone-Cutting Machine, after inventor Charles Wilson. It drilled 10 feet (3 m) into the rock before breaking down. (The tunnel was eventually completed more than 20 years later, and as with the Fréjus Rail Tunnel, by using less ambitious methods.) Wilson's machine anticipated modern TBMs in the sense that it employed cutting discs, like those of a disc harrow, which were attached to the rotating head of the machine. In contrast to traditional chiseling or drilling and blasting, this innovative method of removing rock relied on simple metal wheels to apply a transient high pressure that fractured the rock.

In 1853, the American Ebenezer Talbot also patented a TBM that employed Wilson's cutting discs, although they were mounted on rotating arms, which in turn were mounted on a rotating plate. In the 1870s, John D. Brunton of England built a machine employing cutting discs that were mounted eccentrically on rotating plates, which in turn were mounted eccentrically on a rotating plate, so that the cutting discs would travel over almost all of the rock face that was to be removed.

The first TBM that tunneled a substantial distance was invented in 1863 and improved in 1875 by British Army officer Major Frederick Edward Blackett Beaumont (1833–1895); Beaumont's machine was further improved in 1880 by British Army officer Major Thomas English (1843–1935). In 1875, the French National Assembly approved the construction of a tunnel under the English Channel and the British Parliament supported a trial run using English's TBM. Its cutting head consisted of a conical drill bit behind which were a pair of opposing arms on which were mounted cutting discs. From June 1882 to March 1883, the machine tunneled, through chalk, a total of 1,840 m (6,036 ft). A French engineer, Alexandre Lavalley, who was also a Suez Canal contractor, used a similar machine to drill 1,669 m (5,476 ft) from Sangatte on the French side. However, despite this success, the cross-Channel tunnel project was abandoned in 1883 after the British military raised fears that the tunnel might be used as an invasion route. Nevertheless, in 1883, this TBM was used to bore a railway ventilation tunnel — 7 ft (2 m) in diameter and 6,750 feet (2 km) long — between Birkenhead and Liverpool, England, through sandstone under the Mersey River.

During the late 19th and early 20th century, inventors continued to design, build, and test TBMs in response to the need for tunnels for railroads, subways, sewers, water supplies, etc. TBMs employing rotating arrays of drills or hammers were patented. TBMs that resembled giant hole saws were proposed. Other TBMs consisted of a rotating drum with metal tines on its outer surface, or a rotating circular plate covered with teeth, or revolving belts covered with metal teeth. However, these TBMs proved expensive, cumbersome, and unable to excavate hard rock; interest in TBMs therefore declined. Nevertheless, TBM development continued in potash and coal mines, where the rock was softer.

A TBM with a bore diameter of 14.4 m (47 ft 3 in) was manufactured by The Robbins Company for Canada's Niagara Tunnel Project. The machine was used to bore a hydroelectric tunnel beneath Niagara Falls. The machine was named "Big Becky" in reference to the Sir Adam Beck hydroelectric dams to which it tunnelled to provide an additional hydroelectric tunnel.

An earth pressure balance TBM known as Bertha with a bore diameter of 17.45 metres (57 ft 3 in) was produced by Hitachi Zosen Corporation in 2013. It was delivered to Seattle, Washington, for its Highway 99 tunnel project. The machine began operating in July 2013, but stalled in December 2013 and required substantial repairs that halted the machine until January 2016. Bertha completed boring the tunnel on April 4, 2017.

Two TBMs supplied by CREG excavated two tunnels for Kuala Lumpur's rapid transit with a boring diameter of 6.67 m. The medium was water-saturated sandy mudstone, schistose mudstone, and highly weathered mudstone, as well as alluvium. It achieved a maximum advance rate of more than 345 m/month.

The world's largest hard rock TBM, known as Martina, was built by Herrenknecht AG. Its excavation diameter was 15.62 m (51 ft 3 in), total length 130 m (430 ft); excavation area of 192 m2 (2,070 sq ft), thrust value 39,485 t, total weight 4,500 tons, total installed capacity 18 MW. Its yearly energy consumption was about 62 GWh. It is owned and operated by the Italian construction company Toto S.p.A. Costruzioni Generali (Toto Group) for the Sparvo gallery of the Italian Motorway Pass A1 ("Variante di Valico A1"), near Florence. The same company built the world's largest-diameter slurry TBM, excavation diameter of 17.6 metres (57 ft 9 in), owned and operated by the French construction company Dragages Hong Kong (Bouygues' subsidiary) for the Tuen Mun Chek Lap Kok link in Hong Kong.

Types

TBMs typically consist of a rotating cutting wheel in front, called a cutter head, followed by a main bearing, a thrust system, and support mechanisms. Machines vary according to the geology of the site, the amount of ground water present and other factors.

Shielded

[Images: the support structures at the rear of a TBM, used to excavate the main tunnel of the Yucca Mountain nuclear waste repository in Nevada; hydraulic jacks holding a TBM in place]

Shielded TBMs erect concrete segments behind the machine to support the tunnel walls. They can bore tunnels through hard rock.

They bore with rotating discs mounted in the cutter head. The disc cutters create compressive stress fractures in the tunnel face. The excavated material (muck) is pushed through openings in the cutter head to a conveyor that carries it through the machine. Once through the machine, conveyors or muck cars carry it out of the tunnel.

Double Shield TBMs have two modes; in stable ground they grip the tunnel walls to advance. In unstable, fractured ground, thrust cylinders push against the tunnel segments behind the machine. This keeps the thrust forces from impacting unreinforced tunnel walls.

Single Shield TBMs operate in the same way, but are used only in fractured ground, as they can only push against concrete segments.

Open/Main Beam

Open-type TBMs use their grippers to press against unshielded tunnel walls to maintain pressure on the tunnel face and to advance. They can also be used to bore hard rock. They use similar rotating discs to bore. Machines such as a Wirth machine can be moved only while ungripped. Other machines can move continuously. At the end of a Wirth boring cycle, legs drop to the ground, the grippers are retracted, and the machine advances. The grippers then reengage and the rear legs lift for the next cycle.

Main Beam machines do not install concrete segments behind the cutter head. Instead, the rock is held up using ground support methods such as ring beams, rock bolts, shotcrete, steel straps, ring steel and wire mesh.

Earth Pressure Balance machine

[Images: tunnel boring machine at the site of the Weinberg Tunnel Altstetten-Zürich-Oerlikon near Zürich Oerlikon railway station; urban installation for an 84-inch (2.1 m) sewer in Chicago, IL, USA]

Earth Pressure Balance Machines (EPB) are used in soft ground with less than 7 bar of pressure. They use thrust cylinders to advance by pushing against concrete segments. The cutter head uses a combination of tungsten carbide cutting bits, carbide disc cutters, drag picks and/or hard rock disc cutters. It uses muck to maintain pressure at the tunnel face. The rate of extraction of spoil (using an Archimedes screw) and the advance rate can be adjusted to achieve the desired pressure. Additives such as bentonite, polymers and foam can be injected ahead of the face to stabilize the ground. They can separately be injected in the cutterhead/extraction screw to ensure that the spoil is sufficiently cohesive to maintain pressure and restrict water flow.

Slurry Shield

Slurry Shield machines can be used in soft ground with high water pressure, or where granular ground conditions (sands and gravels) do not allow a plug to form in the screw. The cutter head is filled with pressurised slurry, typically made of bentonite clay, that applies hydrostatic pressure to the face. The slurry mixes with the muck before it is pumped to a slurry separation plant, usually outside the tunnel.

Slurry separation plants use multi-stage filtration systems that separate spoil from slurry to allow reuse. The degree to which slurry can be 'cleaned' depends on the relative particle sizes of the muck. Slurry TBMs are not suitable for silts and clays as the particle sizes of the spoil are less than that of the bentonite. In this case, water is removed from the slurry leaving a clay cake, which may be polluted.

A caisson system is sometimes placed at the cutting head to allow workers to operate the machine, although air pressure may reach elevated levels in the caisson, requiring workers to be medically cleared as "fit to dive" and able to operate pressure locks.

Open face soft ground

Open face soft ground TBMs rely on the excavated ground to stand without support for a short interval. They are suitable for use in ground with a strength of up to ~10 MPa with low water inflows. They can bore tunnels with cross-sections in excess of 10 metres. A backactor arm or cutter head bores to within 150 mm of the edge of the shield. After a boring cycle, the shield is jacked forward to begin a new cycle. Ground support is provided by precast concrete, or occasionally spheroidal graphite iron (SGI), segments that are bolted or supported until a support ring has been added. The final segment, called the key, is wedge-shaped, and expands the ring until it is tight against the ground.

Micro-tunnel shield

Micro tunnel shield TBMs are used to construct small tunnels. They are a smaller equivalent of a general tunnelling shield and generally bore tunnels of 1 to 1.5 m (3.3 to 4.9 ft), too small for operators to walk in.

Backup systems

Behind all types of tunnel boring machines, in the finished part of the tunnel, are trailing support decks known as the backup system, whose mechanisms can include conveyors or other systems for muck removal; slurry pipelines (if applicable); control rooms; electrical, dust-removal and ventilation systems; and mechanisms for transport of pre-cast segments.

Urban tunnelling and near-surface tunnelling

Urban tunnelling has the special requirement that the surface remain undisturbed, and that ground subsidence be avoided. The normal method of doing this in soft ground is to maintain soil pressures during and after construction.

TBMs with positive face control, such as earth pressure balance (EPB) and slurry shield (SS), are used in such situations. Both types (EPB and SS) are capable of reducing the risk of surface subsidence and voids if ground conditions are well documented. When tunnelling in urban environments, other tunnels, existing utility lines and deep foundations must be considered, and the project must accommodate measures to mitigate any detrimental effects to other infrastructure.

Yucca Mountain nuclear waste repository

[Images: Yucca Mountain; the proposed design]
The Yucca Mountain Nuclear Waste Repository, as designated by the Nuclear Waste Policy Act amendments of 1987, is a proposed deep geological repository storage facility within Yucca Mountain for spent nuclear fuel and other high-level radioactive waste in the United States. The site is on federal land adjacent to the Nevada Test Site in Nye County, Nevada, about 80 mi (130 km) northwest of the Las Vegas Valley.

The project was approved in 2002 by the 107th United States Congress, but the 112th Congress ended federal funding for the site via amendment to the Department of Defense and Full-Year Continuing Appropriations Act, passed on April 14, 2011, during the Obama administration. The project has encountered many difficulties and was highly contested by the public, the Western Shoshone peoples, and many politicians. The project also faces strong state and regional opposition. The Government Accountability Office stated that the closure was for political, not technical or safety reasons.

This leaves the United States government (which disposes of its transuranic waste from nuclear weapons production 2,150 feet (660 m) below the surface at the Waste Isolation Pilot Plant in New Mexico) and American nuclear power plants without any designated long-term storage for their high-level radioactive waste (spent fuel) stored on-site in steel and concrete casks (dry cask storage) at 76 reactor sites in 34 states.

Under President Barack Obama, the Department of Energy (DOE) reviewed options other than Yucca Mountain for a high-level waste repository. The Blue Ribbon Commission on America's Nuclear Future, established by the Secretary of Energy, released its final report in January 2012. It detailed an urgent need to find a site suitable for constructing a consolidated geological repository, stating that any future facility should be developed by a new independent organization with direct access to the Nuclear Waste Fund, which is not subject to political and financial control as the Cabinet-level DOE is. But the site met with strong opposition in Nevada, including from then-Senate leader Harry Reid.

Under President Donald Trump, the DOE ceased deep borehole and other non–Yucca Mountain waste disposition research activities. For FY18, the DOE requested $120 million and the NRC $30 million from Congress to continue licensing activities for the Yucca Mountain Repository. For FY19, the DOE again requested $120 million, but the NRC increased its request to $47.7 million. Congress decided to provide no funding for the remainder of FY18. In May 2019, Representative John Shimkus reintroduced a bill in the House for the site, but the Appropriations Committee killed an amendment by Representative Mike Simpson to add $74 million in Yucca Mountain funding to a DOE appropriations bill. On May 20, 2020, Under Secretary of Energy Mark W. Menezes testified in front of the Senate Energy and Natural Resources Committee that Trump strongly opposed proceeding with the Yucca Mountain Repository.

In May 2021, Energy Secretary Jennifer Granholm said that Yucca Mountain would not be part of the Biden administration's plans for nuclear-waste disposal. She anticipated announcing the department's next steps "in the coming months".

Introduction

An infographic about the Yucca Mountain nuclear waste repository

Spent nuclear fuel is the radioactive by-product of electricity generation at commercial nuclear power plants, and high-level radioactive waste is the by-product of reprocessing spent fuel to produce fissile material for nuclear weapons. In 1982, Congress established a national policy to solve the problem of nuclear waste disposal. This policy is a federal law called the Nuclear Waste Policy Act, which made the U.S. Department of Energy (DOE) responsible for finding a site, building, and operating an underground disposal facility called a geologic repository. The recommendation to use a geologic repository dates to 1957, when the National Academy of Sciences recommended that the best way to protect the environment and public health and safety is to dispose of the waste in rock deep underground.

The DOE began studying Yucca Mountain in 1978 to determine whether it would be suitable for the nation's first long-term geologic repository for over 70,000 metric tons (69,000 long tons; 77,000 short tons; 150 million pounds) of spent nuclear fuel and high-level radioactive waste that, as of 2015, was stored at 121 sites around the nation. An estimated 10,000 metric tons (9,800 long tons; 11,000 short tons) of the waste would be from America's military nuclear programs.

On December 19, 1984, the DOE selected ten locations in six states for consideration as potential repository sites, based on data collected for nearly ten years. The ten sites were studied and results of these preliminary studies were reported in 1985. Based on these reports, President Ronald Reagan approved three sites for intensive scientific study called site characterization. The three sites were Hanford, Washington; Deaf Smith County, Texas; and Yucca Mountain.

In 1987, Congress amended the Nuclear Waste Policy Act and directed DOE to study only Yucca Mountain, which is adjacent to the former nuclear test site. The Act provided that if during site characterization Yucca Mountain was found unsuitable, studies would stop immediately. This option expired when the site was formally recommended in 2002. On July 23, 2002, President George W. Bush signed House Joint Resolution 87 (Pub. L. 107–200), allowing the DOE to take the next step in establishing a safe repository in which to store nuclear waste. The DOE was to begin accepting spent fuel at the Yucca Mountain Repository by January 31, 1998, but did not do so because of a series of delays due to legal challenges, concerns over how to transport nuclear waste to the facility, and political pressure resulting in underfunding of the construction.

On July 18, 2006, the DOE proposed March 31, 2017, as the date to open the facility and begin accepting waste based on full funding. On September 8, 2006, Bush nominated Ward (Edward) Sproat, a nuclear industry executive formerly of PECO energy in Pennsylvania, to lead the Yucca Mountain Project. Following the 2006 midterm congressional elections, Harry Reid, a longtime opponent of the repository, became the Senate Majority Leader, putting him in a position to greatly affect the future of the project. Reid said he would continue to work to block completion of the project, and is quoted as having said, "Yucca Mountain is dead. It'll never happen."

In the 2008 Omnibus Spending Bill, the Yucca Mountain Project's budget was reduced to $390 million. The project was able to reallocate resources and delay transportation expenditures to complete the License Application for submission on June 3, 2008. During his 2008 presidential campaign, Barack Obama promised to abandon the project. After his election, the Nuclear Regulatory Commission told Obama he did not have the ability to do so. On April 23, 2009, Lindsey Graham and eight other senators introduced legislation to provide "rebates" from a $30 billion federally managed fund into which nuclear power plants had been paying, so as to refund all collected funds if Congress canceled the project.

In November 2013, in response to a lawsuit filed by the National Association of Regulatory Utility Commissioners and the Nuclear Energy Institute, the US Court of Appeals ruled that nuclear utilities may stop paying into the nuclear waste recovery fund until either the DOE follows the Nuclear Waste Policy Act, which designates Yucca Mountain as the repository, or Congress changes the law. The fee ended May 16, 2014.

Lacking an operating repository, the federal government initially paid utility companies somewhere between $300 and $500 million per year in compensation for failing to comply with the contract it signed to take the spent nuclear fuel by 1998. For the ten years after 2015, it is estimated to cost taxpayers $24 billion in payments from the Judgment Fund. The Judgment Fund is not subject to budget rules and allows Congress to ignore the nuclear waste issue since payments therefrom do not have any impact on yearly spending for other programs.

Facility

A tour group entering the North Portal of Yucca Mountain

The purpose of the Yucca Mountain project is to comply with the Nuclear Waste Policy Act of 1982 and develop a national site for spent nuclear fuel and high-level radioactive waste storage. The management and operating contractor as of April 1, 2009 for the project is USA Repository Services (USA-RS), a wholly owned subsidiary of URS Corporation (now part of AECOM) with supporting principal subcontractors Shaw Corporation (now part of McDermott International Inc.) and Areva Federal Services LLC (now Orano federal services business).

After the layoff of 800 employees on March 31, 2009, about 100 employees remained on the project until all technical staff were laid off by the end of FY 2010 due to zero funding in the 2011 budget for the Office of Civilian Radioactive Waste Management. Sandia National Laboratories was responsible for post-closure analysis and ensuring compliance with the NWPA. The main tunnel of the Exploratory Studies Facility is U-shaped, 5 mi (8.0 km) long and 25 ft (7.6 m) wide.

There are also several cathedral-like alcoves that branch from the main tunnel. Most of the scientific experiments were conducted in these alcoves. The emplacement drifts (smaller-diameter tunnels branching off the main tunnel) where waste would have been stored were not constructed since they required authorization from the Nuclear Regulatory Commission. The repository has a statutory limit of 77,000 metric tons (85,000 short tons).

To store that much waste would have required 40 miles (64 km) of tunnels. The Nuclear Waste Policy Act further limits the capacity of the repository to 63,000 metric tons (62,000 long tons; 69,000 short tons) of initial heavy metal in commercial spent fuel. The 104 U.S. commercial reactors currently operating will produce this quantity of spent fuel by 2014, assuming that the spent fuel rods are not reprocessed. Currently, the US has no civil reprocessing plant.

The tunnel boring machine on display at the exit of the tunnel

By 2008, Yucca Mountain was one of the most studied pieces of geology in the world; between geologic studies and materials science, the United States had invested US$9 billion on the project. This site studied by the Nevada Bureau of Mines and Geology (NBMG) differs substantially from other potential repositories due to the finding of natural analogues of nuclear material that are currently being studied. The DOE estimates that it has over 100 million U.S. gallons of highly radioactive waste and 2,500 metric tons (2,800 short tons) of spent fuel from the production of nuclear weapons and from research activities in temporary storage.

The cost of the facility is being paid for by a combination of a tax on each kilowatt hour of nuclear power and by the taxpayers for disposal of weapons and naval nuclear waste. Based on the 2001 cost estimate, about 73% is funded by consumers of nuclear-powered electricity and 27% by the taxpayers.

The Total System Life Cycle Cost Director Sproat presented to Congress on July 15, 2008, was $90 billion. This cost could not be compared to previous estimates since it included a repository capacity about twice as large as previously estimated over a much longer period of time (100 years vs. 30 years). Additionally, the cost of the project continued to escalate because of insufficient funding to most efficiently move forward and complete the project. By 2007, the DOE announced it was seeking to double the size of the Yucca Mountain repository to a capacity of 135,000 metric tons (149,000 short tons), or 300 million pounds.

The tunnel boring machine (TBM) that excavated the main tunnel cost $13 million and was 400 ft (120 m) long when in operation. It now sits at its exit point at the South Portal (south entrance) of the facility. The short side tunnel alcoves were excavated using explosives.

Opposition

A map showing Yucca Mountain in southern Nevada, west of the Nevada Test Site

The DOE was scheduled to begin accepting spent fuel at the Yucca Mountain repository by January 31, 1998. By 2010, years after this deadline, the future status of the repository at Yucca Mountain was still unknown due to ongoing litigation and opposition by Harry Reid.

Because of construction delays, a number of nuclear power plants in the United States have resorted to dry cask storage of waste on-site indefinitely in steel and concrete casks.

The project is widely opposed in Nevada and is a hotly debated national topic. A two-thirds majority of Nevadans feel it is unfair for their state to have to store nuclear waste when there are no nuclear power plants in Nevada. Many Nevadans' opposition stemmed from the so-called "Screw Nevada Bill," the 1987 legislation halting study of Hanford and Texas as potential sites for the waste before conclusions could be made. The county containing the proposed facility, Nye County, supports the repository's development, as do six adjoining counties. A 2015 survey of Nevadans found 55% agreeing that the state should be open to discussion of what benefits could be received.

One point of concern has been the standard of radiation emission in 10,000 to 1,000,000 years. On August 9, 2005, the United States Environmental Protection Agency (EPA) proposed a limit of 350 millirem per year for that period. In October 2007, the DOE issued a draft of the Supplemental Environmental Impact Statement showing that for the first 10,000 years mean public dose would be 0.24 mrem/year and that thereafter the median public dose would be 0.98 mrem/year, both of which are substantially below the proposed EPA limit. For comparison, a hip X-ray results in a dose around 83 mrem and a CT head or chest scan results in around 1,110 mrem. Annually, in the United States, an individual's dose from background radiation is about 350 mrem, though some places get more than twice that.

On February 12, 2002, Secretary of Energy Spencer Abraham decided that this site was suitable to be the nation's nuclear repository. The governor of Nevada had 90 days to object and did so, but Congress overrode the objection. If the governor's objection had stood, the project would have been abandoned and a new site chosen. In August 2004, the repository became an election issue when Senator John Kerry said he would abandon the plans if elected.

In March 2005, the Energy and Interior departments revealed that several U.S. Geological Survey hydrologists had exchanged emails discussing possible falsification of quality assurance documents on water infiltration research. On February 17, 2006, the DOE's Office of Civilian Radioactive Waste Management (OCRWM) released a report confirming the technical soundness of infiltration modeling work performed by U.S. Geological Survey (USGS) employees. In March 2006, the U.S. Senate Committee on Environment and Public Works Majority Staff issued a 25-page white paper, "Yucca Mountain: The Most Studied Real Estate on the Planet." The conclusions were:

  • Extensive studies consistently show Yucca Mountain to be a sound site for nuclear waste disposal;
  • The cost of not moving forward is extremely high;
  • Nuclear waste disposal capability is an environmental imperative;
  • Nuclear waste disposal capability supports national security;
  • Demand for new nuclear plants also demands disposal capability.

On January 18, 2006, DOE OCRWM announced that it would designate Sandia National Laboratories as its lead laboratory to integrate repository science work for the Yucca Mountain Project. "We believe that establishing Sandia as our lead laboratory is an important step in our new path forward. The independent, expert review that the scientists at Sandia will perform will help ensure that the technical and scientific basis for the Yucca Mountain repository is without question," OCRWM's Acting Director Paul Golan said. "Sandia has unique experience in managing scientific investigations in support of a federally licensed geologic disposal facility, having served in that role as the scientific advisor to the Waste Isolation Pilot Plant in Carlsbad, New Mexico." Sandia began acting as the lead laboratory on October 1, 2006.

Because of questions raised by the State of Nevada and congressional members about the quality of the science behind Yucca Mountain, the DOE announced on March 31, 2006, the selection of Oak Ridge Associated Universities/Oak Ridge Institute for Science and Education (a not-for-profit consortium that includes 96 doctoral degree-granting institutions and 11 associate member universities) to provide expert reviews of scientific and technical work on the Yucca Mountain Project. DOE stated that the project "will be based on sound science. By bringing in Oak Ridge for review of technical work, DOE will seek to present a high level of expertise and credibility as they move the project forward ... This award gives DOE access to academic and research institutions to help DOE meet their mission and legal obligation to license, construct, and open Yucca Mountain as the nation's repository for spent nuclear fuel."

There was significant public and political opposition to the project in Nevada. Project proponents attempted to push ahead and override that opposition, but for large projects that take decades to complete, sustained local opposition has every chance of prevailing, as it did with the Yucca Mountain project. Successful nuclear waste storage siting efforts in Scandinavia have involved local communities in the decision-making process and given them a veto at each stage; this did not happen with Yucca Mountain. Local communities at potential storage and repository sites "should have early and continued involvement in the process, including funding that would allow them to retain technical experts".

On March 5, 2009, Energy Secretary Steven Chu reiterated in a Senate hearing that the Yucca Mountain site was no longer considered an option for storing reactor waste.

On March 3, 2010, the DOE filed a motion with the NRC to withdraw its license application, but multiple lawsuits to stop this action were filed by states, counties, and individuals across the country, arguing that the withdrawal is not authorized by the Nuclear Waste Policy Act (NWPA).

The costly 2014 nuclear accident at New Mexico's Waste Isolation Pilot Plant, in which a nuclear waste container exploded, has cast doubt on whether that facility could serve as an alternative to Yucca Mountain.

In January 2019, Governor Steve Sisolak vowed that "not one ounce" of nuclear waste would be allowed at Yucca Mountain, and a May funding bill did not include funding for the site. In May 2019, the Reno Gazette-Journal published a long-form essay cataloging opposition to the Yucca Mountain project. According to a tribal elder, the Western Shoshone view Yucca Mountain as sacred, and a nuclear storage facility "will poison everything. It's people's life, our Mother Earth's life, all the living things here, all the creatures; whatever's crawling around, it's their life too." The tribes say they lack funds to discredit federal safety claims, but will be directly affected by a potential disaster.

Radiation standards

A tunnel inside the Exploratory Studies Facility.

Original standard

The EPA established its Yucca Mountain standards in June 2001. The storage standard set a dose limit of 15 millirem per year for the public outside the Yucca Mountain site. The disposal standards consisted of three components: an individual dose standard, a standard evaluating the impacts of human intrusion into the repository, and a groundwater protection standard. The individual-protection and human intrusion standards set a limit of 15 millirem per year to a reasonably maximally exposed individual, who would be among the most highly exposed members of the public.

The groundwater protection standard is consistent with EPA's Safe Drinking Water Act standards, which the Agency applies in many situations as a pollution prevention measure. The disposal standards were to apply for 10,000 years after the facility is closed. Dose assessments were to continue beyond 10,000 years and be placed in DOE's Environmental Impact Statement, but were not subject to a compliance standard.

The 10,000-year period for compliance assessment is consistent with EPA's generally applicable standards developed under the Nuclear Waste Policy Act. It also reflects international guidance regarding the level of confidence that can be placed in numerical projections over very long periods of time.

Inconsistent standards

Shortly after the EPA first established these standards in 2001, the nuclear industry, several environmental and public interest groups, and the State of Nevada challenged the standards in court. In July 2004, the Court of Appeals for the District of Columbia Circuit found in favor of the Agency on all counts except one: the 10,000-year regulatory time frame. The court ruled that EPA's 10,000-year compliance period for isolation of radioactive waste was not consistent with National Academy of Sciences (NAS) recommendations and was too short.

The NAS report had recommended standards be set for the time of peak risk, which might approach a period of one million years. By limiting the compliance time to 10,000 years, EPA did not respect a statutory requirement that it develop standards consistent with NAS recommendations.

EPA's rule

In 2009, the EPA published a final rule in the Federal Register. The new rule limits radiation doses from Yucca Mountain for up to 1,000,000 years after it closes. Within that regulatory time frame, the EPA set two dose standards, applied according to the number of years elapsed since the facility's closure.

For the first 10,000 years, the EPA would retain the 2001 final rule's dose limit of 15 millirem per year. This is protection at the level of the most stringent radiation regulations in the U.S. today. From 10,000 to one million years, EPA established a dose limit of 100 millirem per year. EPA's rule requires DOE to show that Yucca Mountain can safely contain wastes, considering the effects of earthquakes, volcanic activity, climate change, and container corrosion, over one million years. The current analysis indicates that the repository will cause less than 1 mrem/year public dose for 1,000,000 years.
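To make the two-tier standard concrete, the following short Python sketch (an illustration only, not part of the EPA rule; the function names are invented here and the example dose values are the projections quoted above) returns the limit that applies at a given time after closure and checks a set of projected annual doses against it.

    # Illustrative sketch of the two-tier dose standard described above:
    # 15 mrem/yr for the first 10,000 years, 100 mrem/yr out to 1,000,000 years.

    def applicable_limit_mrem(years_after_closure):
        """Dose limit (mrem/year) that applies at a given time after closure."""
        if years_after_closure <= 10_000:
            return 15.0
        if years_after_closure <= 1_000_000:
            return 100.0
        raise ValueError("beyond the rule's 1,000,000-year regulatory time frame")

    def complies(projected_doses):
        """projected_doses maps years-after-closure to projected dose in mrem/year."""
        return all(dose <= applicable_limit_mrem(t) for t, dose in projected_doses.items())

    # DOE's quoted projections: 0.24 mrem/yr mean for the first 10,000 years,
    # 0.98 mrem/yr median thereafter -- both well under the applicable limits.
    print(complies({5_000: 0.24, 500_000: 0.98}))  # True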

Geology

Looking west atop Yucca Mountain

The formation that makes up Yucca Mountain was created by several large eruptions from a caldera volcano and is composed of alternating layers of ignimbrite (welded tuff), non-welded tuff, and semi-welded tuff. The tuff surrounding the burial sites is expected to protect human health by providing a natural barrier to the radiation. Yucca Mountain lies along the transition between the Mojave and the Great Basin deserts.

The volcanic tuff at Yucca Mountain is appreciably fractured and movement of water through an aquifer below the waste repository is primarily through fractures. While the fractures are usually confined to individual layers of tuff, the faults extend from the planned storage area all the way to the water table 600 to 1,500 ft (180 to 460 m) below the surface. Future water transport from the surface to waste containers is likely to be dominated by fractures. There is evidence that surface water has been transported down through the 700 ft (210 m) of overburden to the exploratory tunnel at Yucca Mountain in less than 50 years.

The aquifer of Yucca Mountain drains to Amargosa Valley, home to over 1,400 people and a number of endangered species.

Some site opponents assert that, after the predicted containment failure of the waste containers, these cracks may provide a route for movement of radioactive waste that dissolves in the water flowing downward from the desert surface. Officials state that the waste containers will be stored in such a way as to minimize or even nearly eliminate this possibility.

The area around Yucca Mountain received much more rain in the geologic past and the water table was consequently much higher than it is today, though well below the level of the repository.

Earthquakes

DOE has stated that seismic and tectonic effects on the natural systems at Yucca Mountain will not significantly affect repository performance. Yucca Mountain lies in a region of ongoing tectonic deformation, but the deformation rates are too slow to significantly affect the mountain during the 10,000-year regulatory compliance period. Rises in the water table caused by seismic activity would be, at most, a few tens of meters and would not reach the repository. The fractured and faulted volcanic tuff that makes up Yucca Mountain reflects the occurrence of many earthquake-faulting and strong ground motion events during the last several million years, and the hydrological characteristics of the rock would not be changed significantly by seismic events that may occur in the next 10,000 years. The engineered barrier system components will reportedly provide substantial protection of the waste from seepage water, even under severe seismic loading.

In September 2007, it was discovered that the Bow Ridge fault line ran underneath the facility, hundreds of feet east of where it was originally thought to be located, beneath a storage pad where spent radioactive fuel canisters would be cooled before being sealed in a maze of tunnels. The discovery required several structures to be moved several hundred feet farther east, and drew criticism from Robert R. Loux, then head of the Nevada Agency for Nuclear Projects, who argued that Yucca administrators should have known about the fault line's location years earlier and who called the movement of the structures "just-in-time engineering."

In June 2008, a major nuclear equipment supplier, Holtec International, criticized DOE's safety plan for handling containers of radioactive waste before they are buried at the proposed Yucca Mountain dump. The concern is that, in an earthquake, the unanchored casks of nuclear waste material awaiting burial at Yucca Mountain could be sent into a "chaotic melee of bouncing and rolling juggernauts".

Transportation of waste

The nuclear waste was planned to be shipped to the site by rail and/or truck in robust containers known as spent nuclear fuel shipping casks, approved by the Nuclear Regulatory Commission. While the routes in Nevada would have been public, in the other states the planned routes, dates and times of transport would have been secret for security reasons. State and tribal representatives would have been notified before shipments of spent nuclear fuel entered their jurisdictions.

Nevada routes

The proposed Transportation Route of spent nuclear fuel through Nevada

Within Nevada, the planned primary mode of transportation was via rail through the Caliente Corridor. This corridor starts in Caliente, Nevada, traveling along the northern and western borders of the Nevada Test Site for approximately 200 miles (320 km). At this point, it turns south.

Other options under consideration included a rail route along the Mina corridor. This route would have originated at the Fort Churchill Siding rail line, near Wabuska. The proposed corridor would have proceeded southeast through Hawthorne, Blair Junction, Lida Junction, and Oasis Valley. At Oasis Valley, the rail line would have turned north-northeast towards Yucca Mountain. Use of this rail corridor by DOE would have required permission from the Walker River Paiute Tribe to cross their land. Because the first 54 miles (87 km) of the proposed corridor was owned by the Department of Defense (DoD), additional permission from the DoD would also have been required.

Both the Nevada Center for Biological Diversity and the Nevada Attorney General have expressed concern that the transportation routes would pass "through any number of sensitive habitats."

Impacts

Since the early 1960s, the U.S. has safely conducted more than 3,000 shipments of spent nuclear fuel without any harmful release of radioactive material. This safety record is comparable to the worldwide experience where more than 70,000 metric tons of spent nuclear fuel have been transported since 1970 – an amount approximately equal to the total amount of spent nuclear fuel that would have been shipped to Yucca Mountain.

But cities were still concerned about the transport of radioactive waste on highways and railroads that would have passed through heavily populated areas. Dr. Robert Halstead, a transportation adviser to the state of Nevada since 1988, said of the transportation of the high-level waste, "They would heavily affect cities like Buffalo, Cleveland, Pittsburgh, in the Chicago metropolitan area, in Omaha." "Coming out of the south, the heaviest impacts would be in Atlanta, in Nashville, St. Louis, Kansas City, moving across through Salt Lake City, through downtown Las Vegas, up to Yucca Mountain. And the same cities would be affected by rail shipments as well." Spencer Abraham (DOE) said, "I think there's a general understanding that we move hazardous materials in this country, an understanding that the federal government knows how to do it safely."

In October 2018, a state senator from Utah argued that transferring nuclear waste from other states to Yucca Mountain on state highways and railways could be a health hazard.

Cultural impact

Archaeological surveys have found evidence that Native Americans used the immediate vicinity of Yucca Mountain on a temporary or seasonal basis. Some Native Americans disagree with the conclusions of archaeological investigators that their ancestors were highly mobile groups of hunter-gatherers who occupied the Yucca Mountain area before Euroamericans began using the area for prospecting, surveying, and ranching. They believe that these conclusions overlook traditional accounts of farming that occurred before European contact.

Yucca Mountain and surrounding lands were central in the lives of the Southern Paiute, Western Shoshone, and Owens Valley Paiute and Shoshone peoples, who shared them for religious ceremonies, resource uses, and social events.

Delays since 2009

Starting in 2009, the Obama administration attempted to close the Yucca Mountain repository, despite the US law designating Yucca Mountain as the nation's nuclear waste repository. The DOE began implementing the president's plan in May 2009, and the Nuclear Regulatory Commission went along with the administration's closure plan. Various state and congressional entities attempted to challenge the administration's closure plans, both by statute and in court.

Most recently, in August 2013, a US Court of Appeals decision told the NRC and the Obama administration that they must either "approve or reject [DOE's] application for [the] never-completed waste storage site at Nevada's Yucca Mountain." They cannot simply make plans for its closure in violation of US law.

In May 2009, then United States Secretary of Energy Steven Chu stated:

Yucca Mountain as a repository is off the table. What we're going to be doing is saying, let's step back. We realize that we know a lot more today than we did 25 or 30 years ago. The NRC is saying that the dry cask storage at current sites would be safe for many decades, so that gives us time to figure out what we should do for a long-term strategy. We will be assembling a blue-ribbon panel to look at the issue. We're looking at reactors that have a high-energy neutron spectrum that can actually allow you to burn down the long-lived actinide waste. These are fast-neutron reactors. There's others: a resurgence of hybrid solutions of fusion fission where the fusion would impart not only energy, but again creates high-energy neutrons that can burn down the long-lived actinides. ...

Some of the waste is already vitrified. There is, in my mind, no economical reason why you would ever think of pulling it back into a potential fuel cycle. So one could well imagine—again, it depends on what the blue-ribbon panel says—one could well imagine that for a certain classification for a certain type of waste, you don't want to have access to it anymore, so that means you could use different sites than Yucca Mountain, such as salt domes. Once you put it in there, the salt oozes around it. These are geologically stable for a 50 to 100 million year time scale. The trouble with those type of places for repositories is you don't have access to it anymore. But say for certain types of waste you don't want to have access to it anymore—that's good. It's a very natural containment. ... whereas there would be other waste where you say it has some inherent value, let's keep it around for a hundred years, two hundred years, because there's a high likelihood we'll come back to it and want to recover that.

So the real thing is, let's get some really wise heads together and figure out how you want to deal with the interim and long-term storage. Yucca was supposed to be everything to everybody, and I think, knowing what we know today, there's going to have to be several regional areas.

In 2008, the U.S. Senate Committee on Environment and Public Works found that failure to perform to contractual requirements could cost taxpayers up to $11 billion by 2020. In 2013, this estimate of taxpayer liability was raised to $21 billion. In July 2009, the House of Representatives voted 388 to 30 on amendments to H.R. 3183 (Roll call vote 591, via Clerk.House.gov) not to defund the Yucca Mountain repository in the FY2010 budget. In 2013, the House of Representatives voted twice during the 2014 Energy and Water Appropriations debate, by more than 80% majorities, to reject elimination of Yucca Mountain as the nation's only nuclear waste solution.

On April 13, 2010, the state of Washington filed suit to prevent the closing of Yucca Mountain, since this would slow efforts to clean up Hanford Nuclear Reservation. South Carolina, Aiken County (the location of Savannah River site) and others joined Washington. The United States Court of Appeals for the District of Columbia Circuit dismissed the suit in July 2011, saying the Nuclear Regulatory Commission had not ruled on the withdrawal of the license application. Washington and South Carolina filed another lawsuit on July 29.

With $32 billion received from power companies to fund the project, and $12 billion spent to study and build it, the federal government had $27 billion left, including interest. In March 2012, Senator Lindsey Graham introduced a bill requiring three-fourths of that money to be given back to customers, and the remainder to the companies for storage improvements.

In August 2013, the U.S. Court of Appeals for the District of Columbia ordered the Nuclear Regulatory Commission to either "approve or reject [DOE's] application for [the] never-completed waste storage site at Nevada's Yucca Mountain." The court opinion said that the NRC was "simply flouting the law" in its previous action to allow the Obama administration to continue plans to close the proposed waste site since a federal law designating Yucca Mountain as the nation's nuclear waste repository remains in effect. The court opinion stated that "The president may not decline to follow a statutory mandate or prohibition simply because of policy objections."

In response, NRC issued the final volumes of the Yucca Mountain Safety Evaluation Report (SER), which included the NRC staff's statement that the site would meet all applicable standards. At the same time, the staff also stated that NRC should not authorize construction of the repository until the requirements for land and water rights were met and a supplement to DOE's environmental impact statement (EIS) was finished. On March 3, 2015, NRC ordered the staff to complete the supplemental EIS and make the Yucca Mountain licensing document database publicly available, using all the remaining previously appropriated licensing funds.

In March 2015, Senator Lamar Alexander introduced the Nuclear Waste Administration Act of 2015 (S854) in the U.S. Senate. It was intended to establish a fully independent Nuclear Waste Administration (NWA) that would develop nuclear waste storage and disposal facilities. Construction of such facilities would require the consent of the state, local, and tribal governments that may be affected. The NWA would be required to complete a mission plan to open a pilot storage facility by 2021 for nuclear waste from non-operating reactors and other "emergency" deliveries (called "priority waste").

The goal would be to have a storage facility for waste from operating reactors or other "non-priority waste" available by 2025, and an actual permanent repository by the end of 2048. The current disposal limit of 70,000 metric tons for the nation's initial permanent repository would be repealed. Any nuclear waste fees collected after S854 was enacted would be held in a newly established Working Capital Fund. The Nuclear Waste Administration would be allowed to draw from that fund any amounts needed to carry out S854, unless limited by annual appropriations or authorizations. S854 died in committee.

As of September 30, 2021, the Nuclear Waste Fund had an investment fair value of $52.4 billion.

Related legislation (2017–2019)

On March 15, 2017, the Trump administration announced it would request congressional approval for $120 million to restart licensing activity at the Yucca Mountain repository, with funding also to be used to create an interim storage program. The project would consolidate at Yucca Mountain nuclear waste from across the United States that had been stockpiled at local sites since 2010. The federal budget proposal was rejected by the Senate. Although his administration had allocated money to the project, in October 2018 President Donald Trump stated he opposed the use of Yucca Mountain for waste disposal, saying he agreed "with the people of Nevada."

On May 11, 2018, the bill H.R. 3053 was approved in a 340–72 vote in the House of Representatives. The bill directed the DOE to resume the licensing process for Yucca Mountain, with licensing for a permanent site at the mountain to "take up to five years." The Nuclear Waste Policy Amendments Act was sponsored by John Shimkus. The Hill clarified that the bill would "set a path forward for the DOE to resume the process of planning for and building the southern Nevada site, transfer land to the DOE for it, ease the federal funding mechanism and allow DOE to build or license a temporary site to store waste while the Yucca project is being planned and built."

The bill would "direct [DOE] to revive the licensing process for Yucca Mountain to be designated as the country’s permanent site for nuclear waste." The bill would bring together waste from 121 locations in 39 states. All Nevada representatives opposed the bill. The measure was scheduled to go to the Senate next, and if passed there, would require the Nuclear Regulatory Commission to decide on the matter within 30 months.

The Hill noted that the bill received widespread support from lawmakers arguing that nuclear waste was best transferred out of their districts to Yucca, a concept opposed by Nevada representatives, with politicians such as Dina Titus dubbing it the "Screw Nevada 2.0" bill. Titus proposed an amendment that would have required long-term storage to be kept in locales that consented, which the House rejected, 332–80. In their opposition to the use of Yucca Mountain as a nuclear repository, Nevada representatives were supported by Dianne Feinstein of California and other politicians.

In June 2018, the Trump administration and some members of Congress again began proposing using Yucca Mountain, with Nevada Senators raising opposition. By early 2019, use of Yucca Mountain was in "political limbo" as opposition to the site led to an impasse. In January 2019, a panel of scientists introduced to Congress a 126-page report, Reset of America’s Nuclear Waste Management, which proposed including Yucca Mountain as a potential repository with "development of a consensus-based siting process, but one that would still include Yucca Mountain as a candidate."

Nevada National Security Site officials in April 2019 assured the public that the Device Assembly Facility on the Nevada security site was safe from earthquake threats. In contrast, Nevada officials claimed seismic activity in the region made it unsafe for the storage of nuclear waste. On April 1, 2019, the Las Vegas Review-Journal noted that "Nevada Democrats in the House" were seeking to block transfers of plutonium from DOE into the state by the use of the appropriations process.

Representation of a Lie group

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Representation_of_a_Lie_group...