
Sunday, January 5, 2020

Search algorithm

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Search_algorithm

Visual representation of a hash table, a data structure that allows for fast retrieval of information.
 
In computer science, a search algorithm is any algorithm that solves a search problem, namely, to retrieve information stored within some data structure, or calculated in the search space of a problem domain, with either discrete or continuous values. Search algorithms have many specific applications.

The classic search problems described above and web search are both problems in information retrieval, but they are generally studied as separate subfields and are solved and evaluated differently. Web search engines are generally focused on filtering and on ranking the documents most relevant to human queries. Classic search algorithms are typically evaluated on how fast they can find a solution, and whether that solution is guaranteed to be optimal. Though information retrieval algorithms must also be fast, the quality of ranking is more important, as is whether good results have been left out and bad results included.

The appropriate search algorithm often depends on the data structure being searched, and may also include prior knowledge about the data. Some database structures are specially constructed to make search algorithms faster or more efficient, such as a search tree, hash map, or a database index.

Search algorithms can be classified based on their mechanism of searching. Linear search algorithms check every record for the one associated with a target key in a linear fashion. Binary, or half-interval, searches repeatedly target the center of the search structure and divide the search space in half. Comparison search algorithms improve on linear searching by successively eliminating records based on comparisons of the keys until the target record is found, and can be applied on data structures with a defined order. Digital search algorithms work based on the properties of digits in data structures that use numerical keys. Finally, hashing directly maps keys to records based on a hash function. Searches other than a linear search generally require that the data be sorted or organized in some way.
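The contrast between the first and last mechanisms above can be sketched in a few lines: a linear scan compares the target against every record in turn, while a hash table (a Python dict here) maps the key straight to its record. The record data is made up for illustration.

```python
records = {"alice": 30, "bob": 25, "carol": 41}  # a dict is a hash table

def linear_search(pairs, target_key):
    """Check each (key, value) pair in order until the key matches."""
    for key, value in pairs:
        if key == target_key:
            return value
    return None

print(linear_search(list(records.items()), "carol"))  # scans up to n pairs
print(records["carol"])                               # one hash computation
```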

Algorithms are often evaluated by their computational complexity, or maximum theoretical run time. Binary search functions, for example, have a maximum complexity of O(log n), or logarithmic time. This means that the maximum number of operations needed to find the search target is a logarithmic function of the size of the search space. 
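A minimal binary search illustrates the logarithmic bound: every iteration halves the remaining interval, so a sorted list of n items needs at most about log2(n) comparisons.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.
    Each iteration halves the search space: at most O(log n) steps."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = [2, 3, 5, 7, 11, 13, 17, 19]
print(binary_search(data, 13))  # → 5
print(binary_search(data, 4))   # → -1 (not present)
```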

Classes


For virtual search spaces

Algorithms for searching virtual spaces are used in the constraint satisfaction problem, where the goal is to find a set of value assignments to certain variables that will satisfy specific mathematical equations and inequalities. They are also used when the goal is to find a variable assignment that will maximize or minimize a certain function of those variables. Algorithms for these problems include the basic brute-force search (also called "naïve" or "uninformed" search), and a variety of heuristics that try to exploit partial knowledge about the structure of this space, such as linear relaxation, constraint generation, and constraint propagation.

An important subclass comprises the local search methods, which view the elements of the search space as the vertices of a graph, with edges defined by a set of heuristics applicable to the case, and which scan the space by moving from item to item along the edges, for example according to the steepest descent or best-first criterion, or in a stochastic search. This category includes a great variety of general metaheuristic methods, such as simulated annealing, tabu search, A-teams, and genetic programming, that combine arbitrary heuristics in specific ways.
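A minimal hill-climbing sketch shows the local-search idea: treat each point's neighbors as graph edges and greedily move to the best-scoring neighbor until none improves. The toy search space (integers, with a single peak) is an assumption for illustration; real problems supply their own neighbor and scoring functions.

```python
def hill_climb(start, neighbors, score, max_steps=1000):
    """Greedy local search: repeatedly move to the best-scoring
    neighbor; stop at a point no neighbor improves on (a local optimum)."""
    current = start
    for _ in range(max_steps):
        best = max(neighbors(current), key=score)
        if score(best) <= score(current):
            return current  # local optimum reached
        current = best
    return current

# Toy space: integers, maximizing -(x - 7)**2, which peaks at x = 7.
result = hill_climb(0,
                    neighbors=lambda x: [x - 1, x + 1],
                    score=lambda x: -(x - 7) ** 2)
print(result)  # → 7
```

On a space with several peaks, plain hill climbing can stall at a local optimum; metaheuristics such as simulated annealing add randomness precisely to escape such traps.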

This class also includes various tree search algorithms, that view the elements as vertices of a tree, and traverse that tree in some special order. Examples of the latter include the exhaustive methods such as depth-first search and breadth-first search, as well as various heuristic-based search tree pruning methods such as backtracking and branch and bound. Unlike general metaheuristics, which at best work only in a probabilistic sense, many of these tree-search methods are guaranteed to find the exact or optimal solution, if given enough time. This is called "completeness".
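The two exhaustive traversal orders mentioned above can be sketched side by side on a small tree (the tree data is illustrative). Breadth-first search visits all nodes at one depth before the next; depth-first search follows one branch to the bottom before backtracking.

```python
from collections import deque

# A small tree as an adjacency mapping (illustrative data).
tree = {"A": ["B", "C"], "B": ["D", "E"], "C": ["F"],
        "D": [], "E": [], "F": []}

def bfs(root):
    """Breadth-first: visit all nodes at depth d before depth d + 1."""
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node)
        queue.extend(tree[node])
    return order

def dfs(root):
    """Depth-first: follow one branch to the bottom, then backtrack."""
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(reversed(tree[node]))  # keep left-to-right order
    return order

print(bfs("A"))  # → ['A', 'B', 'C', 'D', 'E', 'F']
print(dfs("A"))  # → ['A', 'B', 'D', 'E', 'C', 'F']
```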

Another important sub-class consists of algorithms for exploring the game tree of multiple-player games, such as chess or backgammon, whose nodes consist of all possible game situations that could result from the current situation. The goal in these problems is to find the move that provides the best chance of a win, taking into account all possible moves of the opponent(s). Similar problems occur when humans or machines have to make successive decisions whose outcomes are not entirely under one's control, such as in robot guidance or in marketing, financial, or military strategy planning. This kind of problem — combinatorial search — has been extensively studied in the context of artificial intelligence. Examples of algorithms for this class are the minimax algorithm, alpha–beta pruning, and the A* algorithm.
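A compact sketch of minimax with alpha–beta pruning, the first two algorithms named above, on a toy two-ply game. The game tree is given as nested lists with numeric leaf payoffs for the maximizing player; this encoding is an assumption for illustration, not a general game API.

```python
def minimax(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    """Minimax with alpha-beta pruning over a game tree given as
    nested lists; leaves are payoffs for the maximizing player."""
    if isinstance(node, (int, float)):  # leaf: game over
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, minimax(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # opponent will never allow this branch
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, minimax(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break
        return value

# Maximizer picks a branch, then the minimizer picks a leaf within it.
game = [[3, 5], [2, 9], [0, 1]]
print(minimax(game, maximizing=True))  # → 3
```

The pruning never changes the result; it only skips branches that a rational opponent has already ruled out.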

For sub-structures of a given structure

The name "combinatorial search" is generally used for algorithms that look for a specific sub-structure of a given discrete structure, such as a graph, a string, a finite group, and so on. The term combinatorial optimization is typically used when the goal is to find a sub-structure with a maximum (or minimum) value of some parameter. (Since the sub-structure is usually represented in the computer by a set of integer variables with constraints, these problems can be viewed as special cases of constraint satisfaction or discrete optimization; but they are usually formulated and solved in a more abstract setting where the internal representation is not explicitly mentioned.)

An important and extensively studied subclass are the graph algorithms, in particular graph traversal algorithms, for finding specific sub-structures in a given graph — such as subgraphs, paths, circuits, and so on. Examples include Dijkstra's algorithm, Kruskal's algorithm, the nearest neighbour algorithm, and Prim's algorithm.
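Dijkstra's algorithm, the first example above, can be sketched with a priority queue: repeatedly settle the unvisited node with the smallest tentative distance and relax its outgoing edges. The graph data is illustrative.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a weighted graph given
    as an adjacency dict: node -> list of (neighbor, edge_weight),
    with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry, already settled cheaper
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

g = {"A": [("B", 1), ("C", 4)],
     "B": [("C", 2), ("D", 5)],
     "C": [("D", 1)],
     "D": []}
print(dijkstra(g, "A"))  # → {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```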

Another important subclass of this category are the string searching algorithms, that search for patterns within strings. Two famous examples are the Boyer–Moore and Knuth–Morris–Pratt algorithms, and several algorithms based on the suffix tree data structure.
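The Knuth–Morris–Pratt algorithm mentioned above can be sketched as follows: a precomputed failure table tells the scan how far the pattern can safely shift after a mismatch, so no character of the text is ever re-read.

```python
def kmp_search(text, pattern):
    """Knuth-Morris-Pratt: return all indices where pattern occurs
    in text, in O(len(text) + len(pattern)) time."""
    # failure[i] = length of the longest proper prefix of
    # pattern[:i+1] that is also a suffix of it.
    failure = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = failure[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        failure[i] = k

    matches, k = [], 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = failure[k - 1]  # shift pattern without rescanning text
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            matches.append(i - k + 1)
            k = failure[k - 1]  # allow overlapping matches
    return matches

print(kmp_search("abababca", "abab"))  # → [0, 2]
```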

Search for the maximum of a function

In 1953, American statistician Jack Kiefer devised Fibonacci search which can be used to find the maximum of a unimodal function and has many other applications in computer science.
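Fibonacci search brackets the maximum of a unimodal function and shrinks the bracket using split points at Fibonacci ratios. The sketch below uses the closely related (and simpler) ternary search to show the same bracketing idea; it is an illustration of the technique, not Kiefer's exact procedure.

```python
def ternary_max(f, lo, hi, tol=1e-9):
    """Locate the maximum of a unimodal f on [lo, hi] by repeatedly
    discarding a third of the bracketing interval (Fibonacci search
    refines this idea with Fibonacci-ratio split points)."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            lo = m1  # the maximum cannot lie in [lo, m1]
        else:
            hi = m2  # the maximum cannot lie in [m2, hi]
    return (lo + hi) / 2

# Unimodal test function with its peak at x = 2.5 (illustrative).
peak = ternary_max(lambda x: -(x - 2.5) ** 2, 0.0, 10.0)
print(round(peak, 6))  # → 2.5
```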

For quantum computers

There are also search methods designed for quantum computers, like Grover's algorithm, that are theoretically faster than linear or brute-force search even without the help of data structures or heuristics.

Worst-case scenario

From Wikipedia, the free encyclopedia

A worst-case scenario is a concept in risk management wherein the planner, in planning for potential disasters, considers the most severe possible outcome that can reasonably be projected to occur in a given situation. Conceiving of worst-case scenarios is a common form of strategic planning, specifically scenario planning, to prepare for and minimize contingencies that could result in accidents, quality problems, or other issues.

Development and use

The worst-case scenario is "[o]ne of the most commonly used alternative scenarios". A risk manager may request "a conservative risk estimate representing a worst-case scenario" in order to determine the latitude they may exercise in planning steps to reduce risks. Generally, a worst-case scenario "is settled upon by agreeing that a given worst case is bad enough. However, it is important to recognize that no worst-case scenario is truly without potential nasty surprises". In other words, ‘[a] “worst-case scenario” is never the worst case’, both because situations may arise that no planner could reasonably foresee, and because a given worst-case scenario is likely to consider only contingencies expected to arise in connection with a particular disaster. The worst-case scenario devised by a seismologist might be a particularly bad earthquake, and the worst-case scenario devised by a meteorologist might be a particularly bad hurricane, but it is unlikely that either of them will devise a scenario where a particularly bad storm occurs at the same time as a particularly bad earthquake. 

The definition of a worst-case scenario varies by the field to which it is being applied. For example, in environmental engineering, "[a] worst-case scenario is defined as the release of the largest quantity of a regulated substance from a single vessel or process line failure that results in the greatest distance to an endpoint". In this field, "[a]s in other fields, the worst-case scenario is a useful device when low probability events may result in a catastrophe that must be avoided even at great cost, but in most health risk assessments, a worst-case scenario is essentially a type of bounding estimate". In computer science, the best, worst, and average case of a given algorithm express what the resource usage is at least, at most, and on average, respectively. For many individuals, a worst-case scenario is one that would result in their own death.
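The computer-science sense of best, worst, and average case can be made concrete with a linear search over n items: the best case is 1 comparison (target first), the worst case is n (target last or absent), and the average is about n/2 when the target is equally likely to be anywhere. A small sketch:

```python
def comparisons(items, target):
    """Count the key comparisons a linear search makes."""
    for count, item in enumerate(items, start=1):
        if item == target:
            return count
    return len(items)  # unsuccessful search checks everything

data = list(range(100))
print(comparisons(data, 0))   # best case: 1 comparison
print(comparisons(data, 99))  # worst case: 100 comparisons
avg = sum(comparisons(data, t) for t in data) / len(data)
print(avg)                    # average case: 50.5 comparisons
```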

Criticisms

A number of criticisms have been leveled against the use of worst-case scenarios. In some cases, a conceivable worst-case scenario within a field may be so far beyond the capacity of participants to deal with that it is not worth the effort to develop or explore such a scenario; where this is possible, it is "important to evaluate whether the development of a worst-case scenario is reasonable and desirable". Entities that rely on such scenarios in planning may be led to plan too conservatively to take advantage of the usual absence of such scenarios, and may waste resources preparing for highly unlikely contingencies. At the extreme, it has been argued that the use of worst-case scenarios in disaster preparedness and training causes people to become conditioned to set aside ethical concerns and to over-react to lesser disasters.

Murphy's law

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Murphy%27s_law

Murphy's law is an adage or epigram that is typically stated as: "Anything that can go wrong will go wrong".

History

The perceived perversity of the universe has long been a subject of comment, and precursors to the modern version of Murphy's law are not hard to find. The concept may be as old as humanity. Recent significant research in this area has been conducted by members of the American Dialect Society. Society member Stephen Goranson has found a version of the law, not yet generalized or bearing that name, in a report by Alfred Holt at an 1877 meeting of an engineering society.
It is found that anything that can go wrong at sea generally does go wrong sooner or later, so it is not to be wondered that owners prefer the safe to the scientific .... Sufficient stress can hardly be laid on the advantages of simplicity. The human factor cannot be safely neglected in planning machinery. If attention is to be obtained, the engine must be such that the engineer will be disposed to attend to it.
Mathematician Augustus De Morgan wrote on June 23, 1866: "The first experiment already illustrates a truth of the theory, well confirmed by practice, whatever can happen will happen if we make trials enough." In later publications "whatever can happen will happen" occasionally is termed "Murphy's law," which raises the possibility—if something went wrong—that "Murphy" is "De Morgan" misremembered (an option, among others, raised by Goranson on the American Dialect Society list).

American Dialect Society member Bill Mullins has found a slightly broader version of the aphorism in reference to stage magic. The British stage magician Nevil Maskelyne wrote in 1908:
It is an experience common to all men to find that, on any special occasion, such as the production of a magical effect for the first time in public, everything that can go wrong will go wrong. Whether we must attribute this to the malignity of matter or to the total depravity of inanimate things, whether the exciting cause is hurry, worry, or what not, the fact remains.
In 1948, humorist Paul Jennings coined the term resistentialism, a jocular play on resistance and existentialism, to describe "seemingly spiteful behavior manifested by inanimate objects",[6] where objects that cause problems (like lost keys or a runaway bouncy ball) are said to exhibit a high degree of malice toward humans.

The contemporary form of Murphy's law goes back as far as 1952, as an epigraph to a mountaineering book by John Sack, who described it as an "ancient mountaineering adage":
Anything that can possibly go wrong, does.

Association with Murphy

Cover of A History of Murphy's Law
 
According to the book A History of Murphy's Law by author Nick T. Spark, differing recollections years later by various participants make it impossible to pinpoint who first coined the saying "Murphy's law". The law's name supposedly stems from an attempt to use new measurement devices developed by Edward Murphy. The phrase was coined in adverse reaction to something Murphy said when his devices failed to perform, and was eventually cast into its present form prior to a press conference some months later — the first ever (of many) given by Dr. John Stapp, a U.S. Air Force colonel and flight surgeon, in the 1950s. These conflicts (a long-running interpersonal feud) were unreported until Spark researched the matter. His book expands upon and documents an original four-part article published in 2003 in the Annals of Improbable Research (AIR) on the controversy, "Why Everything You Know About Murphy's Law is Wrong".

From 1948 to 1949, Stapp headed research project MX981 at Muroc Army Air Field (later renamed Edwards Air Force Base) for the purpose of testing the human tolerance for g-forces during rapid deceleration. The tests used a rocket sled mounted on a railroad track with a series of hydraulic brakes at the end. Initial tests used a humanoid crash test dummy strapped to a seat on the sled, but subsequent tests were performed by Stapp, at that time an Air Force captain. During the tests, questions were raised about the accuracy of the instrumentation used to measure the g-forces Captain Stapp was experiencing. Edward Murphy proposed using electronic strain gauges attached to the restraining clamps of Stapp's harness to measure the force exerted on them by his rapid deceleration. Murphy was engaged in supporting similar research using high speed centrifuges to generate g-forces. Murphy's assistant wired the harness, and a trial was run using a chimpanzee.

The sensors provided a zero reading; however, it became apparent that they had been installed incorrectly, with each sensor wired backwards. It was at this point that a disgusted Murphy made his pronouncement, despite being offered the time and chance to calibrate and test the sensor installation prior to the test proper, which he declined somewhat irritably, getting off on the wrong foot with the MX981 team. In an interview conducted by Nick Spark, George Nichols, another engineer who was present, stated that Murphy blamed the failure on his assistant after the failed test, saying, "If that guy has any way of making a mistake, he will." Nichols' account is that "Murphy's law" came about through conversation among the other members of the team; it was condensed to "If it can happen, it will happen," and named for Murphy in mockery of what Nichols perceived as arrogance on Murphy's part. Others, including Edward Murphy's surviving son Robert Murphy, deny Nichols' account (interviewed by Spark), and claim that the phrase did originate with Edward Murphy. According to Robert Murphy's account, his father's statement was along the lines of "If there's more than one way to do a job, and one of those ways will result in disaster, then he will do it that way."

The phrase first received public attention during a press conference in which Stapp was asked how it was that nobody had been severely injured during the rocket sled tests. Stapp replied that it was because they always took Murphy's law under consideration; he then summarized the law and said that in general, it meant that it was important to consider all the possibilities (possible things that could go wrong) before doing a test and act to counter them. Thus Stapp's usage and Murphy's alleged usage are very different in outlook and attitude. One is sour, the other an affirmation of the predictable being surmountable, usually by sufficient planning and redundancy. Nichols believes Murphy was unwilling to take the responsibility for the device's initial failure (by itself a blip of no large significance) and is to be doubly damned for not allowing the MX981 team time to validate the sensor's operability and for trying to blame an underling when doing so in the embarrassing aftermath. 

The association with the 1948 incident is by no means secure. Despite extensive research, no trace of documentation of the saying as Murphy's law has been found before 1951 (see above). The next citations are not found until 1955, when the May–June issue of Aviation Mechanics Bulletin included the line "Murphy's law: If an aircraft part can be installed incorrectly, someone will install it that way," and Lloyd Mallan's book, Men, Rockets and Space Rats, referred to: "Colonel Stapp's favorite takeoff on sober scientific laws—Murphy's law, Stapp calls it—'Everything that can possibly go wrong will go wrong'." The Mercury astronauts in 1962 attributed Murphy's law to U.S. Navy training films.

Fred R. Shapiro, the editor of the Yale Book of Quotations, has shown that in 1952 the adage was called "Murphy's law" in a book by Anne Roe, quoting an unnamed physicist:
he described [it] as "Murphy's law or the fourth law of thermodynamics" (actually there were only three last I heard) which states: "If anything can go wrong, it will."
In May 1951, Anne Roe gives a transcript of an interview (part of a Thematic Apperception Test, asking impressions on a drawing) with Theoretical Physicist number 3: "...As for himself he realized that this was the inexorable working of the second law of the thermodynamics which stated Murphy's law ‘If anything can go wrong it will’. I always liked 'Murphy's law.' I was told that by an architect" Anne Roe's papers are in the American Philosophical Society archives in Philadelphia; those records (as noted by Stephen Goranson on the American Dialect Society list 12/31/2008) identify the interviewed physicist as Howard Percy "Bob" Robertson (1903–1961). Robertson's papers are at the Caltech archives; there, in a letter Robertson offers Roe an interview within the first three months of 1949 (as noted by Goranson on American Dialect Society list 5/9/2009). The Robertson interview apparently predated the Muroc scenario said by Nick Spark (American Aviation Historical Society Journal 48 (2003) p. 169) to have occurred in or after June, 1949.

The name "Murphy's law" was not immediately secure. A story by Lee Correy in the February 1955 issue of Astounding Science Fiction referred to "Reilly's law," which "states that in any scientific or engineering endeavor, anything that can go wrong will go wrong". Atomic Energy Commission Chairman Lewis Strauss was quoted in the Chicago Daily Tribune on February 12, 1955, saying "I hope it will be known as Strauss' law. It could be stated about like this: If anything bad can happen, it probably will."

Arthur Bloch, in the first volume (1977) of his Murphy's Law, and Other Reasons Why Things Go WRONG series, prints a letter that he received from George E. Nichols, a quality assurance manager with the Jet Propulsion Laboratory. Nichols recalled an event that occurred in 1949 at Edwards Air Force Base, Muroc, California that, according to him, is the origination of Murphy's law, and first publicly recounted by USAF Col. John Paul Stapp. An excerpt from the letter reads:
The law's namesake was Capt. Ed Murphy, a development engineer from Wright Field Aircraft Lab. Frustration with a strap transducer which was malfunctioning due to an error in wiring the strain gage bridges caused him to remark – "If there is any way to do it wrong, he will" – referring to the technician who had wired the bridges at the Lab. I assigned Murphy's law to the statement and the associated variations.

Academic and scientific views

According to Richard Dawkins, so-called laws like Murphy's law and Sod's law are nonsense because they require inanimate objects to have desires of their own, or else to react according to one's own desires. Dawkins points out that a certain class of events may occur all the time, but are only noticed when they become a nuisance. He gives as an example aircraft noise interfering with filming. Aircraft are in the sky all the time, but are only taken note of when they cause a problem. This is a form of confirmation bias whereby the investigator seeks out evidence to confirm his already formed ideas, but does not look for evidence that contradicts them.

Similarly, David Hand, emeritus professor of mathematics and senior research investigator at Imperial College London, points out that the law of truly large numbers should lead one to expect the kind of events predicted by Murphy's law to occur occasionally. Selection bias will ensure that those ones are remembered and the many times Murphy's law was not true are forgotten.

There have been persistent references to Murphy's law associating it with the laws of thermodynamics from early on (see the quotation from Anne Roe's book above). In particular, Murphy's law is often cited as a form of the second law of thermodynamics (the law of entropy) because both are predicting a tendency to a more disorganised state. Atanu Chatterjee investigated this idea by formally stating Murphy's law in mathematical terms. Chatterjee found that Murphy's law so stated could be disproved using the principle of least action.

Variations (corollaries) of the law

From its initial public announcement, Murphy's law quickly spread to various technical cultures connected to aerospace engineering. Before long, variants had passed into the popular imagination, changing as they went. 

Author Arthur Bloch has compiled a number of books full of corollaries to Murphy's law and variations thereof. The first of these was Murphy's Law and Other Reasons Why Things Go Wrong!
Yhprum's law, where the name is spelled backwards, is "anything that can go right, will go right" — the optimistic application of Murphy's law in reverse.

Peter Drucker, the management consultant, with a nod to Murphy, formulated "Drucker's Law" in dealing with complexity of management: "If one thing goes wrong, everything else will, and at the same time."

Mrs. Murphy's Law is a corollary of Murphy's Law. It states that things will go wrong when Mr. Murphy is away.

Catch-22 (logic)

From Wikipedia, the free encyclopedia

A catch-22 is a paradoxical situation from which an individual cannot escape because of contradictory rules or limitations. The term was coined by Joseph Heller, who used it in his 1961 novel Catch-22.

A classic example is needing experience to get a job: "How can I get any experience until I get a job that gives me experience?" – Brantley Foster in The Secret of My Success.
Catch-22s often result from rules, regulations, or procedures that an individual is subject to, but has no control over, because to fight the rule is to accept it. Another example is a situation in which someone is in need of something that can only be had by not being in need of it (e.g., the only way to qualify for a loan is to prove to the bank that you don't need a loan). One connotation of the term is that the creators of the "catch-22" situation have created arbitrary rules in order to justify and conceal their own abuse of power.

Origin and meaning

Joseph Heller coined the term in his 1961 novel Catch-22, which describes absurd bureaucratic constraints on soldiers in World War II. The term is introduced by the character Doc Daneeka, an army psychiatrist who invokes "Catch-22" to explain why any pilot requesting mental evaluation for insanity—hoping to be found not sane enough to fly and thereby escape dangerous missions—demonstrates his own sanity in creating the request and thus cannot be declared insane. This phrase also means a dilemma or difficult circumstance from which there is no escape because of mutually conflicting or dependent conditions.
"You mean there's a catch?"
"Sure there's a catch," Doc Daneeka replied. "Catch-22. Anyone who wants to get out of combat duty isn't really crazy."
There was only one catch and that was Catch-22, which specified that a concern for one's own safety in the face of dangers that were real and immediate was the process of a rational mind. Orr was crazy and could be grounded. All he had to do was ask; and as soon as he did, he would no longer be crazy and would have to fly more missions. Orr would be crazy to fly more missions and sane if he didn't, but if he was sane, he had to fly them. If he flew them, he was crazy and didn't have to; but if he didn't want to, he was sane and had to. Yossarian was moved very deeply by the absolute simplicity of this clause of Catch-22 and let out a respectful whistle.
Different formulations of "Catch-22" appear throughout the novel. The term is applied to various loopholes and quirks of the military system, always with the implication that rules are inaccessible to and slanted against those lower in the hierarchy. In chapter 6, Yossarian (the protagonist) is told that Catch-22 requires him to do anything his commanding officer tells him to do, regardless of whether these orders contradict orders from the officer's superiors.

In a final episode, Catch-22 is described to Yossarian by an old woman recounting an act of violence by soldiers:
"Catch-22 says they have a right to do anything we can't stop them from doing."
"What the hell are you talking about?" Yossarian shouted at her in bewildered, furious protest. "How did you know it was Catch-22? Who the hell told you it was Catch-22?"
"The soldiers with the hard white hats and clubs. The girls were crying. 'Did we do anything wrong?' they said. The men said no and pushed them away out the door with the ends of their clubs. 'Then why are you chasing us out?' the girls said. 'Catch-22,' the men said. All they kept saying was 'Catch-22, Catch-22.' What does it mean, Catch-22? What is Catch-22?"
"Didn't they show it to you?" Yossarian demanded, stamping about in anger and distress. "Didn't you even make them read it?"
"They don't have to show us Catch-22," the old woman answered. "The law says they don't have to."
"What law says they don't have to?"
"Catch-22."
According to literature professor Ian Gregson, the old woman's narrative defines "Catch-22" more directly as the "brutal operation of power", stripping away the "bogus sophistication" of the earlier scenarios.

Other appearances in the novel

Besides referring to an unsolvable logical dilemma, Catch-22 is invoked to explain or justify the military bureaucracy. For example, in the first chapter, it requires Yossarian to sign his name to letters that he censors while he is confined to a hospital bed. One clause mentioned in chapter 10 closes a loophole in promotions, which one private had been exploiting to reattain the attractive rank of Private First Class after any promotion. Through courts-martial for going AWOL, he would be busted in rank back to private, but Catch-22 limited the number of times he could do this before being sent to the stockade.

At another point in the book, a prostitute explains to Yossarian that she cannot marry him because he is crazy, and she will never marry a crazy man. She considers any man crazy who would marry a woman who is not a virgin. This closed logic loop illustrates Catch-22: by her logic, all men who refuse to marry her are sane and thus she would consider marriage; but as soon as a man agrees to marry her, he becomes crazy for wanting to marry a non-virgin, and is instantly rejected.

At one point, Captain Black attempts to press Milo into depriving Major Major of food as a consequence of not signing a loyalty oath that Major Major was never given an opportunity to sign in the first place. Captain Black asks Milo, "You're not against Catch-22, are you?" 

In chapter 40, Catch-22 forces Colonels Korn and Cathcart to promote Yossarian to Major and ground him rather than simply sending him home. They fear that if they do not, others will refuse to fly, just as Yossarian did. 

Significance of the number 22

Heller originally wanted to call the phrase (and hence, the book) by other numbers, but he and his publishers eventually settled on 22. The number has no particular significance; it was chosen more or less for euphony. The title was originally Catch-18, but Heller changed it after the popular Mila 18 was published a short time beforehand.

Usage

The term "catch-22" has filtered into common usage in the English language. In a 1975 interview, Heller said the term would not translate well into other languages.

James E. Combs and Dan D. Nimmo suggest that the idea of a "catch-22" has gained popular currency because so many people in modern society are exposed to frustrating bureaucratic logic. They write:
Everyone, then, who deals with organizations understands the bureaucratic logic of Catch-22. In high school or college, for example, students can participate in student government, a form of self-government and democracy that allows them to decide whatever they want, just so long as the principal or dean of students approves. This bogus democracy that can be overruled by arbitrary fiat is perhaps a citizen's first encounter with organizations that may profess 'open' and libertarian values, but in fact are closed and hierarchical systems. Catch-22 is an organizational assumption, an unwritten law of informal power that exempts the organization from responsibility and accountability, and puts the individual in the absurd position of being excepted for the convenience or unknown purposes of the organization.
Along with George Orwell's "doublethink", "Catch-22" has become one of the best-recognized ways to describe the predicament of being trapped by contradictory rules.

A significant type of definition of alternative medicine has been termed a catch-22. A 1998 editorial co-authored by Marcia Angell, a former editor of the New England Journal of Medicine, argued:
"It is time for the scientific community to stop giving alternative medicine a free ride. There cannot be two kinds of medicine – conventional and alternative. There is only medicine that has been adequately tested and medicine that has not, medicine that works and medicine that may or may not work. Once a treatment has been tested rigorously, it no longer matters whether it was considered alternative at the outset. If it is found to be reasonably safe and effective, it will be accepted. But assertions, speculation, and testimonials do not substitute for evidence. Alternative treatments should be subjected to scientific testing no less rigorous than that required for conventional treatments."
This definition has been described by Robert L. Park as a logical catch-22 which ensures that any complementary and alternative medicine (CAM) method which is proven to work "would no longer be CAM, it would simply be medicine."

Usage in scientific research

In research, Catch-22 reflects scientists' frustration with known unknowns, of which quantum computing is a prime example. Two electrons can be entangled such that if a measurement finds the first electron at one position around a circle, the other must occupy the position directly across the circle from it, a relationship that holds whether they are beside each other or light-years apart. The Catch-22 of quantum computing is that quantum features only work when they are not being observed, so observing a quantum computer to check whether it is exploiting quantum behaviour will destroy the very behaviour being checked. Similarly, Heisenberg's uncertainty principle prevents us from knowing a particle's position and momentum simultaneously: measuring one property destroys information about the other.

EU General Data Protection Regulation: The EU's expansive privacy regulation places limitations on artificial-intelligence development, which relies heavily on (big) data. Beyond its restrictions on the collection of user data, the GDPR ensures that even if a company does collect personal data, its use for automated decision-making, a standard AI application, is limited. Article 22 mandates that a user can opt out of automated processing, in which case the company must provide a human-reviewed alternative that obeys the user's wishes. When automation is used, it must be clearly explained to the user, and its application can still be punished for ambiguity or for violating other regulations, making the use of AI a Catch-22 for GDPR-compliant bodies.

Artificial intelligence: As indicated above, AI depends on vast quantities of verified data, most of which is rightly considered private for personal or commercial reasons. This leads to a catch-22 arising from the inadvertent entry of seemingly innocuous or protected data into otherwise secure websites. Using dozens of "right of access" requests, Oxford-based researcher James Pavur found that he could obtain personal information, ranging from purchase histories to credit-card digits to past and present home addresses, from several UK- and US-based companies without even verifying his identity. In commercial fields, ploys to accumulate data useful for AI are ubiquitous. Access to high-quality training data is critical for startups that use machine learning as the core technology of their business. According to Moritz Mueller-Freitag, "While many algorithms and software tools are open sourced and shared across the research community, good datasets are usually proprietary and hard to build. Owning a large, domain-specific dataset can therefore become a significant source of competitive advantage." User input is even harvested through innocuous interfaces that encourage users to correct errors, such as Mapillary and reCAPTCHA. The web user is thus groomed progressively to cooperate in the construction of AI in exchange for access to unverifiable information, while their rights are extinguished by agreeing to unfathomable terms and conditions.

The problem of unknown unknowns: This is a kind of inverse Catch-22 situation (perhaps Catch-0.0455) in which Joseph Heller's Yossarian does not yet know that the bomber he was afraid to crew this evening was shot down last night. A similar deficiency explains why scientists have not come up with a cure for Alzheimer's disease: they do not know exactly what it is. They can see what happens to patients and predict what will happen, but they do not understand its ultimate causes, why it affects the people it does, or why the symptoms grow worse over time.

Assessing novel interpretations submitted to scientific journals: If new knowledge from new studies is presented in the context of existing knowledge, that process allows the credibility of the resulting conclusions to be established. As knowledge is evolutionary in nature, earlier knowledge usually forms a foundation for later increments. However, academic constraints usually incline researchers to avoid thinking outside the box. This creates a Catch-22 for researchers who seek to reinterpret earlier studies, drawing deductions from existing data that dissent from widely approved interpretations. For example, the current understanding of Pleistocene ice sheets in North America and Europe rests ultimately on Agassiz's 1842 interpretation of a thick mer de glace that covered much of the northern parts of the continents. For 50 years after its publication, several experienced geologists drew attention to its serious shortcomings. Nevertheless, Agassiz's interpretation underlies the 'canonical' version now taught to students everywhere, without any caveats. Researchers who today undertake a critical review of the existing evidence and reach conclusions that differ substantially from the 'thick Pleistocene ice' interpretation will have difficulty getting them past review teams and into the various Quaternary journals. Their Catch-22 frustration is amplified when they attempt to publish a summary of their review as an article in Wikipedia, where they may find that skeptical editors revert their submission on the grounds that it appears to be 'original research'.

Logic

The archetypal catch-22, as formulated by Heller, involves the case of John Yossarian, a U.S. Army Air Forces bombardier, who wishes to be grounded from combat flight. This will only happen if he is evaluated by the squadron's flight surgeon and found "unfit to fly". "Unfit" would be any pilot who is willing to fly such dangerous missions, as one would have to be mad to volunteer for possible death. However, to be evaluated, he must request the evaluation, an act that is considered sufficient proof for being declared sane. These conditions make it impossible to be declared "unfit".

The "Catch-22" is that "anyone who wants to get out of combat duty isn't really crazy". Hence, pilots who request a mental fitness evaluation are sane, and therefore must fly in combat. At the same time, if an evaluation is not requested by the pilot, he will never receive one and thus can never be found insane, meaning he must also fly in combat.

Therefore, Catch-22 ensures that no pilot can ever be grounded for being insane even if he is.

A logical formulation of this situation is:

1. For a person to be excused from flying (E) on the grounds of insanity, he must both be insane (I) and have requested an evaluation (R). (premise)
2. An insane person (I) does not request an evaluation (¬R) because he does not realize he is insane. (premise)
3. Either a person is not insane (¬I) or does not request an evaluation (¬R). (2. and material implication)
4. No person can be both insane (I) and request an evaluation (R). (3. and De Morgan's laws)
5. Therefore, no person can be excused from flying (¬E) because no person can be both insane and have requested an evaluation. (4., 1. and modus tollens)
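The five-step derivation above can be checked mechanically. The following sketch (variable names mirror the propositions I, R, and E used above) enumerates every truth assignment and confirms that no assignment satisfies both premises while leaving grounds for an excuse:

```python
from itertools import product

def can_be_excused():
    """Truth-table check of the Catch-22 argument.

    Premise 1: being excused (E) requires being insane (I) AND
    having requested an evaluation (R).
    Premise 2: an insane person never requests an evaluation,
    i.e. I implies not-R.
    """
    for insane, requested in product([False, True], repeat=2):
        premise2_holds = (not insane) or (not requested)  # I -> ¬R
        grounds_for_excuse = insane and requested          # I ∧ R
        if premise2_holds and grounds_for_excuse:
            return True   # a consistent case would permit an excuse
    return False          # no such case exists

print(can_be_excused())  # prints False: the premises leave no way out
```

The loop never finds an assignment where `insane and requested` holds alongside premise 2, which is exactly the De Morgan step (4.) above: ¬(I ∧ R) follows from ¬I ∨ ¬R.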

The philosopher Laurence Goldstein argues that the "airman's dilemma" is logically not even a condition that is true under no circumstances; it is a "vacuous biconditional" that is ultimately meaningless. Goldstein writes:
The catch is this: what looks like a statement of the conditions under which an airman can be excused flying dangerous missions reduces not to the statement
(i) 'An airman can be excused flying dangerous missions if and only if Cont' (where 'Cont' is a contradiction)
(which could be a mean way of disguising an unpleasant truth), but to the worthlessly empty announcement
(ii) 'An airman can be excused flying dangerous missions if and only if it is not the case that an airman can be excused flying dangerous missions'
If the catch were (i), that would not be so bad—an airman would at least be able to discover that under no circumstances could he avoid combat duty. But Catch-22 is worse—a welter of words that amounts to nothing; it is without content, it conveys no information at all.

Innocent prisoner's dilemma

The innocent prisoner's dilemma, or parole deal, is a detrimental effect of a legal system in which admission of guilt can result in reduced sentences or early parole. When an innocent person is wrongly convicted of a crime, legal systems which require the individual to admit guilt, for example as a prerequisite step leading to parole, punish an innocent person for their integrity and reward a person lacking in integrity. There have been many cases where innocent prisoners were given the choice between freedom, in exchange for claiming guilt, and remaining imprisoned and telling the truth. Individuals have died in prison rather than admit to crimes which they did not commit.

It has been demonstrated in Britain that prisoners who freely admit their guilt are more likely to re-offend than prisoners who maintain their innocence. Other research, however, has found no clear link between denial of guilt and recidivism.

United States law professor Daniel Medwed says convicts who go before a parole board maintaining their innocence are caught in a Catch-22 which he calls "the innocent prisoner’s dilemma". A false admission of guilt and remorse by an innocent person at a parole hearing may prevent a later investigation proving their innocence.

Detriment to individuals


In the United Kingdom

Michael Naughton, founder of the Innocence Network UK (INUK), says work carried out by the INUK includes research and public awareness on wrongful convictions, which can effect policy reforms. Most important is the development of a system to assess prisoners maintaining innocence, to distinguish potentially innocent prisoners from the prisoners who claim innocence for other reasons like "ignorance, misunderstanding or disagreement with criminal law; to protect another person or group from criminal conviction; or on 'abuse of process' or technical grounds in the hope of achieving an appeal." The system, he says, is being adopted by the prison parole board and prison service, for prisoners serving "indeterminate sentences (where the prisoner has no release date and does not get out until a parole board decides he or she is no longer a risk to the public). Previously, such prisoners were treated as 'deniers' with no account taken of the various reasons for maintaining innocence, nor the fact that some may actually be innocent." Those prisoners are unable to achieve parole unless they undertake offence-behaviour courses that require the admission of guilt as a prerequisite. This was represented in the Porridge episode Pardon Me. However, in recent years, this has diminished in significance; at the time Simon Hall ended his denials to murder in 2012, the Ministry of Justice denied that this would have any impact on his tariff, and his last online posting had concerned being released from prison in spite of his denials.

The murder of Linda Cook was committed in Portsmouth on 9 December 1986. The subsequent trial led to a miscarriage of justice when Michael Shirley, an 18-year-old Royal Navy sailor, was wrongly convicted of the crime and sentenced to life imprisonment. After serving the minimum 15 years, Shirley would have been released from prison had he confessed to the killing before the parole board, but he refused to do so, saying: "I would have died in prison rather than admit something I didn't do. I was prepared to stay in forever if necessary to prove my innocence." (Shirley's conviction was eventually quashed by the Court of Appeal in 2003, on the basis of exculpatory DNA evidence.)

The Stephen Downing case involved the conviction and imprisonment in 1974 of a 17-year-old council worker, Stephen Downing, for the murder of a 32-year-old legal secretary, Wendy Sewell. His conviction was overturned in 2002, after Downing had served 27 years in prison. The case is thought to be the longest miscarriage of justice in British legal history, and attracted worldwide media attention. The case was featured in the 2004 BBC drama In Denial of Murder. Downing claimed that had he falsely confessed he would have been released over a decade earlier. Because he did not admit to the crime, he was classified as "IDOM" (In Denial of Murder) and was ineligible for parole under English law.

In the United States

In the United States the reality of a person being innocent, called "actual innocence", is not sufficient reason for the justice system to release a prisoner. Once a verdict has been made, it is rare for a court to reconsider evidence of innocence which could have been presented at the time of the original trial. Decisions by the State Board of Pardons and Paroles regarding its treatment of prisoners who may be actually innocent have been criticized by the international community.

Herbert Murray, who was convicted of murder in 1979, said, "When the judge asked me did I have anything to say, I couldn't say, because tears were coming down and I couldn't communicate. I couldn't turn around and tell the family that they got the wrong man." The judge said he believed the defense's alibi witnesses; however, the judge was required by law to respect the jury's decision. After Murray had been locked up for 19 years, his parole officer said, "Nineteen years is a long time. [....] But you're no closer to the rehabilitative process than when you first walked into prison. The first step in that process is the internalization of guilt. You need to do some serious introspection, Mr. Murray, and come to grips with your behavior." Murray agreed with the parole officer, but maintained his innocence: "I agree! But again, I just didn't do it."

In a news interview, Murray says he went before a parole board four times, maintaining his innocence until the fifth: "I said what the hell, let me tell these people what they want to hear." He admitted to the parole board that he had committed the crime and was taking responsibility. "I felt like I sold my soul to the devil. Because before, I had that strength, because I stood on the truth. [...] I became so desperate to get out, I had to say something. I had to say something because what I said before didn't work." His parole was denied. After 29 years in prison, Medwed's Second Look clinic, a group dedicated to the release of innocent prisoners, assisted lawyers at his eighth parole board hearing, which was successful and released him onto indefinite parole. Overturning the original conviction would, however, be hampered by his admissions of guilt at earlier parole hearings.

Timothy Brian Cole (1960–99) was an African American military veteran and a student wrongly convicted of raping a fellow student in 1985. Cole was convicted by a jury of rape, primarily based on the testimony of the victim, Michele Mallin. He was sentenced to 25 years in prison. While incarcerated, Cole was offered parole if he would admit guilt, but he refused. "His greatest wish was to be exonerated and completely vindicated", his mother stated in a press interview. Cole died after serving 14 years in prison.

Another man, Jerry Wayne Johnson, confessed to the rape in 1995. Further, Mallin later admitted that she was mistaken as to the identity of her attacker. She stated that investigators botched the gathering of evidence and withheld information from her, causing her to believe that Cole was the perpetrator. Mallin told police that the rapist smoked during the rape. However, Cole never smoked because of his severe asthma. DNA evidence later showed him to be innocent. Cole died in prison on December 2, 1999; ten years later, a district court judge announced "to a 100 percent moral, factual and legal certainty" that Timothy Cole did not commit the rape. He was posthumously pardoned.

The dilemma can occur even before conviction. Kalief Browder was arrested in May 2010 for allegedly stealing a backpack. He spent the next three years on Rikers Island awaiting trial, much of it in solitary. During court appearances, prosecutors routinely asked for a short delay which would turn into a much lengthier wait. At times, Browder was offered plea bargains, and at one point, he was encouraged to plead guilty to misdemeanors, for which he would be sentenced to time already served and released. When he refused the plea deal, insisting on his innocence, the judge noted "If you go to trial and lose, you could get up to fifteen [years]." Eventually, in May 2013, the case was dismissed because prosecutors had lost contact with the only witness they had to the alleged crime.

Detriment to society

Gabe Tan reported that a 2011 British conference, "The Dilemma of Maintaining Innocence", concluded: "Denial is not a valid measure of risk. In fact, research has shown that prisoners who openly admit to their crimes have the highest risk of re-offending."

In 2011, Michael Naughton suggested the focus on new evidence by the Criminal Cases Review Commission, rather than an examination of serious problems with evidence at original trials, meant in many cases “that the dangerous criminals who committed these crimes remain at liberty with the potential to commit further serious crimes.”

Robert A. Forde cited two studies at the conference. One, a ten-year study of 180 sex offenders by Harkins, Beech and Goodwill, found that prisoners who claimed to be innocent were the least likely to be re-convicted, and that those who "admitted everything", claiming full guilt, were the most likely to re-offend. He also told the conference that in research by Hanson et al. in 2002, denial by prisoners of their offences had no bearing on their likelihood of re-offending.

Innocence Project

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Innocence_Project
 
  • Formation: 1992
  • Type: Nonprofit organization
  • Legal status: 501(c)(3)
  • EIN: 32-0077563
  • Purpose: Exoneration; justice reform
  • Mission: "The Innocence Project's mission is to free the staggering number of innocent people who remain incarcerated, and to bring reform to the system responsible for their unjust imprisonment."
  • Headquarters: 40 Worth Street, Suite 701, New York, NY 10013
  • Region: United States
  • Executive Director: Maddy deLone
  • Vered Rabia
  • Affiliations: The Innocence Network
  • Revenue (2018): $13,426,018
  • Expenses (2018): $13,608,849
  • Endowment (2018): $21,620,304
  • Employees (2017): 88
  • Volunteers (2017): 17
  • Website: www.innocenceproject.org

The Innocence Project is a 501(c)(3) nonprofit legal organization that is committed to exonerating wrongly convicted people through the use of DNA testing and to reforming the criminal justice system to prevent future injustice. The group cites various studies estimating that in the United States, between 2.3% and 5% of all prisoners are innocent. The Innocence Project was founded in 1992 by Barry Scheck and Peter Neufeld.

As of November 17, 2019, the Innocence Project has worked on 189 successful DNA-based exonerations.

Founding

The Innocence Project was established in the wake of a study by the United States Department of Justice and United States Senate, in conjunction with the Benjamin N. Cardozo School of Law, which found that incorrect identification by eyewitnesses was a factor in over 70% of wrongful convictions. The original Innocence Project was founded in 1992 by Scheck and Neufeld as part of the Cardozo School of Law of Yeshiva University in New York City. It became an independent 501(c)(3) nonprofit organization on January 28, 2003, but it maintains institutional connections with Cardozo. As of September 5, 2018, the executive director of the Innocence Project is Madeline deLone.

The Innocence Project model has spread internationally as other countries use scientific evidence to overturn wrongful convictions and free those wrongly convicted. One such example is in the Republic of Ireland, where a project was set up at Griffith College Dublin in 2009.

Mission

The Innocence Project focuses on cases in which DNA evidence is available to be tested or retested. DNA testing is possible in 5–10% of criminal cases. Other members of the Innocence Network also help to exonerate those in whose cases DNA testing is not possible.

In addition to working on behalf of those who may have been wrongfully convicted of crimes throughout the United States, those working for the Innocence Project perform research and advocacy related to the causes of wrongful convictions.

Some of the Innocence Project's successes have resulted in releasing people from death row. The successes of the project have fueled American opposition to the death penalty and have likely been a factor in the decision by some American states to institute moratoria on criminal executions.

In District Attorney's Office v. Osborne (2009), US Supreme Court Chief Justice Roberts wrote that post-conviction challenge "poses questions to our criminal justice systems and our traditional notions of finality better left to elected officials than federal judges." In the opinion, another justice wrote that forensic science has "serious deficiencies". Roberts also said that post-conviction DNA testing risks "unnecessarily overthrowing the established system of criminal justice." Law professor Kevin Jon Heller wrote: "It might lead to a reasonably accurate one."

Overturned convictions

As of November 2019, 367 people previously convicted of serious crimes in the United States had been exonerated by DNA testing since 1989, 21 of whom had been sentenced to death. Almost all (99%) of the wrongful convictions were of males, with minority groups constituting approximately 70% (61% African American and 8% Latino). The National Registry of Exonerations lists 1,579 convicted defendants who were exonerated through DNA and non-DNA evidence from January 1, 1989 through April 12, 2015. According to a study published in 2014, more than 4% of persons overall sentenced to death from 1973 to 2004 are probably innocent. The following are examples of notable exonerations:
  • In 2003, Steven Avery was exonerated after serving 18 years in prison for a sexual assault charge. After his release, he was convicted of murder.
  • In 2004, Darryl Hunt was exonerated after serving 19 1/2 years in prison of a life sentence for the rape and murder of a newspaper copy editor, Deborah Sykes.
  • In 2007, after an investigation begun by The Innocence Project, James Calvin Tillman was exonerated after serving 16 1/2 years in prison for a rape he did not commit. His sentence was 45 years.
  • In 2014, Glenn Ford was exonerated in the murder of Isadore Newman. Ford, an African American, had been convicted by an all-white jury without any physical evidence linking him to the crime, and with testimony withheld. He served 30 years on death row in Angola Prison before his release.

Work

The Innocence Project originated in New York City but accepts cases from any part of the United States. The majority of clients helped are of low socio-economic status and have used all possible legal options for justice. Many clients hope that DNA evidence will prove their innocence, as the emergence of DNA testing allows those who have been wrongly convicted of crimes to challenge their cases. The Innocence Project also works with the local, state and federal levels of law enforcement, legislators, and other programs to prevent further wrongful convictions.

About 3,000 prisoners write to the Innocence Project annually, and at any given time the Innocence Project is evaluating 6,000 to 8,000 potential cases.

All potential clients go through an extensive screening process to determine whether or not they are likely to be innocent. If they pass the process, the Innocence Project takes up their case. In roughly half of the cases that the Innocence Project takes on, the clients' guilt is reconfirmed by DNA testing. Of all the cases taken on by the Innocence Project, about 43% of clients were proven innocent, 42% were confirmed guilty, and evidence was inconclusive and not probative in 15% of cases. In about 40% of all DNA exoneration cases, law enforcement officials identified the actual perpetrator based on the same DNA test results that led to an exoneration.

Funding

The Innocence Project receives 45% of its funding from individual contributions, 30% from foundations, 15% from an annual benefit dinner, 7% from the Cardozo School of Law, and the rest from corporations.

Innocence Network

The Innocence Project is a founder of the Innocence Network, an organization of law and journalism schools, and public defense offices that collaborate to help convicted felons prove their innocence. 46 American states along with several other countries are a part of the network. In 2010, 29 people were exonerated worldwide from the work of the members of this organization.

The Innocence Network brings together a growing number of innocence organizations from across the United States as well as includes members from other English-speaking common law countries: Australia, Canada, Ireland, New Zealand, and the United Kingdom.

In South Africa, the Wits Justice Project investigates South African incarcerations. In partnership with the Wits Law Clinic, the Julia Mashele Trust, the Legal Resource Centre (LRC), the Open Democracy Advice Centre (ODAC), and the US Innocence Project, the Justice Project investigates individual cases of prisoners wrongly convicted or awaiting trial.

Causes of wrongful conviction

There are many reasons why wrongful convictions occur. The most common is false eyewitness identification, which played a role in more than 75% of the wrongful convictions overturned by the Innocence Project. Though often assumed to be incontrovertible, eyewitness identifications are, according to a growing body of evidence, unreliable. Another cause of misidentification is the "show-up" procedure, in which a suspect is shown to a witness at the scene of a crime, in a poorly lit lot or in a police car. Witnesses may also misidentify a suspect after learning more about him or her, which can cause them to change their description.

Unreliable or improper forensic science played a role in some 50% of Innocence Project cases. Scientific techniques such as bite-mark comparison, once widely used, are now known to be subjective. Many forensic science techniques also lack uniform scientific standards.

In about 25% of DNA exoneration cases, innocent people were coerced into making false confessions. Many of these false confessors went on to plead guilty to crimes they did not commit (usually to avoid a harsher sentence, or even the death penalty). 

Government misconduct, inadequate legal counsel, and the improper use of informants also contributed to many of the wrongful convictions since overturned by the Innocence Project. 

In popular culture


Podcasts

  • Serial Season 1 referenced the Innocence Project in episode 7 where Deirdre Enright, director of investigation for the Innocence Project at the University of Virginia School of Law, and a team of law students analyzed the case against Adnan Syed.

Stage productions

  • The Exonerated (2002) is a play by Erik Jensen and Jessica Blank about six people who had been wrongly convicted and sentenced to death, but were exonerated.

Television

  • In Justice is an American television series with a similar premise.
  • Castle is an American television series; in the episode "Like Father, Like Daughter" (season 6, episode 7), the Innocence Project was mentioned, as well as Frank Henson who was wrongfully convicted in 1998 of the death of Kimberly Tolbert.
  • The Innocence Project, a BBC One drama series that aired from 2006 to 2007, is based on a UK version of the Innocence Project.
  • The Innocence Project was discussed in season 2, episode 9 of The Good Wife, "Nine Hours" (December 14, 2010). Innocence Project co-founder Barry Scheck played himself in the episode, which was largely based on the actual Innocence Project case of Cameron Todd Willingham. Cary Agos, a recurring character on The Good Wife, is said to have worked for the Innocence Project after law school (and is a family friend of Scheck's).
  • In season six of Suits, a US legal dramedy, law student and paralegal Rachel Zane takes on an Innocence Project for a man wrongfully accused of murder.
  • In season three of Riverdale, a dark reimagining of the Archie Comics universe, Veronica Lodge mentions starting a chapter of the Innocence Project to help free her boyfriend Archie Andrews from prison after he is falsely convicted of murder.

Social privilege

From Wikipedia, the free encyclopedia https://en.wikipedi...