
Saturday, December 2, 2023

Glutamate hypothesis of schizophrenia

The glutamate hypothesis of schizophrenia models the subset of pathologic mechanisms of schizophrenia linked to glutamatergic signaling. The hypothesis was initially based on a set of clinical, neuropathological, and, later, genetic findings pointing to a hypofunction of glutamatergic signaling via NMDA receptors. While thought to be more proximal to the root causes of schizophrenia, it does not negate the dopamine hypothesis, and the two may ultimately be brought together by circuit-based models. The development of the hypothesis allowed the GABAergic and oscillatory abnormalities to be integrated into a converging disease model and made it possible to discover the causes of some of these disruptions.

Like the dopamine hypothesis, the glutamate hypothesis developed from the observed effects of mind-altering drugs. However, where dopamine agonists can mimic positive symptoms with significant risks to brain structures during and after use, NMDA antagonists mimic some positive and negative symptoms with less harm to the brain when combined with a GABA-A-activating drug. Likely, both dopaminergic and glutamatergic abnormalities are implicated in schizophrenia, stemming from a profound alteration in the function of chemical synapses as well as from electrical synaptic irregularities. These form a portion of the complex constellation of neurochemical, psychological, psychosocial, and structural factors which result in schizophrenia.

The role of heteromer formation

Alterations in the expression, distribution, autoregulation, and prevalence of specific glutamate receptor heterodimers change the relative levels of the G proteins paired with the heterodimer-forming glutamate receptor in question.

Namely: 5-HT2A and mGlu2 form a dimer which mediates the psychotomimetic and entheogenic effects of psychedelics; as such this receptor complex is of interest in schizophrenia. Agonists at either constituent receptor may modulate the other allosterically; e.g. glutamate-dependent signaling via mGlu2 may modulate 5-HT2A-mediated activity. The equilibrium between mGlu2 and 5-HT2A is shifted away from a tendency towards psychosis by neuroleptic-pattern 5-HT2A antagonists and by mGlu2 agonists; both display antipsychotic activity. AMPA, the most widely distributed receptor in the brain, is a tetrameric ionotropic receptor; alterations in the equilibrium between its constituent subunits are seen with antipsychotic administration: GluR2 is upregulated in the PFC while GluR1 is downregulated in response to antipsychotic administration.

Reelin abnormalities may also be involved in the pathogenesis of schizophrenia via a glutamate-dependent mechanism. Reelin expression deficits are seen in schizophrenia, and reelin enhances expression of AMPA and NMDA receptors alike. As such, deficits in these two ionotropic glutamate receptors may be partially explained by altered reelin cascades. Neuregulin 1 deficits may also be involved in glutamatergic hypofunction, as NRG1 hypofunction leads to schizophrenia-pattern behavior in mice, likely due in part to reduced NMDA signaling via Src suppression.

The role of synaptic pruning

Various neurotrophic factors are dysregulated in schizophrenia and other mental illnesses, notably BDNF, expression of which is lowered in schizophrenia as well as in major depression and bipolar disorder. BDNF is regulated in an AMPA-dependent manner; AMPA and BDNF alike are critical mediators of growth cone survival. NGF, another neurotrophin involved in the maintenance of synaptic plasticity, is similarly seen in deficit.

Dopaminergic excess, classically understood to result in schizophrenia, puts oxidative load on neurons, leading to an inflammatory response and microglial activation. Similarly, toxoplasmosis infection in the CNS (positively correlated with schizophrenia) activates inflammatory cascades, also leading to microglial activation. The lipoxygenase-5 inhibitor minocycline has been seen to be marginally effective in halting schizophrenia progression. NF-κB, a downstream transcriptional target of such inflammatory cascades, shows altered expression in schizophrenia.

In addition, CB2 is one of the most widely distributed glial cell-expressed receptors; downregulation of this inhibitory receptor may increase global synaptic pruning activity. While differences in expression or distribution are observed, when the CB2 receptor is knocked out in mice, schizophreniform behaviors manifest. This may deregulate synaptic pruning processes through a tachyphylaxis mechanism wherein immediate excess CB2 activity leads to phosphorylation of the receptor via GRK, resulting in β-arrestin-dependent internalization and subsequent trafficking to the proteasome for degradation.

The role of endogenous antagonists

Alterations in production of endogenous NMDA antagonists such as agmatine and kynurenic acid have been shown in schizophrenia. Deficit in NMDA activity produces psychotomimetic effects, though it remains to be seen if the blockade of NMDA via these agents is causative or actually mimetic of patterns resultant from monoaminergic disruption.

AMPA, the most widely distributed receptor in the brain, mediates long-term potentiation via activity-dependent modulation of AMPA receptor density. GluR1 subunit-containing AMPA receptors are Ca2+-permeable, while GluR2/3 subunit-positive receptors are nearly impermeable to calcium ions. In the regulated pathway, GluR1 dimers populate the synapse at a rate proportional to NMDA-mediated Ca2+ influx. In the constitutive pathway, GluR2/3 dimers populate the synapse at a steady state.

This forms a positive feedback loop: a small trigger impulse relieving the Mg2+ pore blockade of NMDA receptors results in calcium influx, and this calcium influx then triggers trafficking of GluR1-containing (Ca2+-permeable) AMPA receptors to the PSD. Such trafficking of GluR1-positive AMPA receptors to the postsynaptic neuron allows upmodulation of the postsynaptic neuron's calcium influx in response to presynaptic calcium influx. Robust negative feedback at NMDA receptors from kynurenic acid, magnesium, zinc, and agmatine prevents runaway feedback.

Misregulation of this pathway would sympathetically dysregulate LTP via disruption of NMDA signaling. Such alteration in LTP may play a role, specifically in negative symptoms of schizophrenia, in the creation of broader disruptions such as loss of brain volume; an effect of the disease which antidopaminergics actually worsen, rather than treat.

The role of α7 nicotinic receptors

Anandamide, an endocannabinoid, is an α7 nicotinic antagonist. Cigarettes, consumed far out of proportion by schizophrenics, contain nornitrosonicotine, a potent α7 antagonist. This may indicate α7 receptor excess as a causative factor, or possibly self-medication to combat antipsychotic side effects. Cannabidiol, a FAAH inhibitor, increases levels of anandamide and may have an antipsychotic effect, though results are mixed here, as anandamide is also a cannabinoid and as such displays some psychotomimetic effect. However, α7 nicotinic agonists have been indicated as potential treatments for schizophrenia; though the evidence is somewhat contradictory, there is indication that the α7 nAChR is somehow involved in the pathogenesis of schizophrenia.

The role of 5-HT

This deficit in activation also decreases the activity of 5-HT1A receptors in the raphe nucleus. Because 5-HT1A serves as an autoreceptor, this increases global serotonin levels. The 5-HT1B receptor, also acting as an autoreceptor, specifically within the striatum but also in parts of the basal ganglia, then inhibits serotonin release. This disinhibits frontal dopamine release. The local deficit of 5-HT within the striatum, basal ganglia, and prefrontal cortex causes a deficit of excitatory 5-HT6 signalling. This could be one reason antipsychotics are sometimes reported to aggravate negative symptoms, as antipsychotics are 5-HT6 antagonists. The 5-HT6 receptor is expressed primarily on GABAergic neurons; as such, its blockade causes an excess of glutamatergic, noradrenergic, dopaminergic, and cholinergic activity within the prefrontal cortex and the striatum. An excess of 5-HT7 signalling within the thalamus also creates too much excitatory transmission to the prefrontal cortex. Combined with another critical abnormality observed in those with schizophrenia, 5-HT2A dysfunction, this altered signalling cascade creates cortical, and thus cognitive, abnormalities. 5-HT2A provides a link between the cortical (conscious) and basal ganglia (unconscious) levels of processing. Axons from 5-HT2A-expressing neurons in layer V of the cerebral cortex reach the basal ganglia, forming a feedback loop. Signalling from layer V of the cerebral cortex to the basal ganglia alters 5-HT2C signalling. This 5-HT2A/5-HT2C feedback loop is how the outer cortical layers can exert some control over our neuropeptides, specifically opioid peptides, oxytocin, and vasopressin. Alteration of this limbic-layer V axis may create the profound change in social cognition (and sometimes cognition as a whole) that is observed in schizophrenia. However, the genesis of the actual alterations is a much more complex phenomenon.

The role of inhibitory transmission

The cortico-basal ganglia-thalamo-cortical loop is the source of the ordered input necessary for a higher-level upper cortical loop. Feedback is controlled by the inhibitory potential of the cortices via the striatum. Through 5-HT2A efferents from layer V of the cortex, transmission proceeds through the striatum into the globus pallidus internus and substantia nigra pars compacta. This core input to the basal ganglia is combined with input from the subthalamic nucleus. The only primarily dopaminergic pathway in this loop is a reciprocal connection from the substantia nigra pars compacta to the striatum.

Dopaminergic drugs such as dopamine-releasing agents and direct dopamine receptor agonists create alterations in this primarily GABAergic pathway via increased dopaminergic feedback from the substantia nigra pars compacta to the striatum. However, dopamine also modulates other regions, namely the VTA, with efferents to the amygdala and locus coeruleus, likely modulating the anxious and paranoid aspects of psychotic experience. As such, the glutamate hypothesis is probably not an explanation of primary causative factors in positive psychosis, but might instead be an explanation for negative symptoms.

The dopamine hypothesis of schizophrenia elaborates upon the nature of the abnormal lateral structures found in people at high risk for psychosis.

Altered signalling cascades

Again, thalamic input from layer V is a crucial factor in the functionality of the human brain. It allows the two sides to receive similar inputs and thus to perceive the same world. In psychosis, thalamic input loses much of its integrated character: hyperactive core feedback loops overwhelm the ordered output. This is due to excessive D2 and 5-HT2A activity, which alters input to the top and bottom of the cortex. The altered 5-HT signalling cascade enhances the strength of excitatory thalamic input from layer V. This abnormality, enhancing the thalamic-cortical transmission cascade over corticostriatal control, creates a feedback loop, resulting in abnormally strong basal ganglia output.

The root of psychosis (experiences that cannot be explained, even within one's own mind) is when basal ganglia input to layer V overwhelms the inhibitory potential of the higher cortices resulting from striatal transmission. When combined with the excess prefrontal, specifically orbitofrontal, transmission from the hippocampus, this creates a brain prone to falling into self-reinforcing belief.

However, given a specific environment, a person with this kind of brain can create a self-reinforcing pattern of maladaptive behavior, from the altered layer II/III and III/I axes and the disinhibited thalamic output. Rationality is impaired, primarily as a response to the deficit of oxytocin and excess of vasopressin arising from the abnormal 5-HT2C activity.

Frontal cortex activity will be impaired when combined with excess DA activity: this is the basis for the advancement of schizophrenia, but it is also the neurologic mechanism behind many other psychotic diseases. Inheritance of schizophrenia may even be a result of conspecific "refrigerator parenting" techniques passed on through generations. However, the genetic component is the primary source of the neurological abnormalities which leave one prone to psychological disorders. Specifically, there is much overlap between bipolar disorder, schizophrenia, and other psychotic disorders.

Psychotic disorder is linked to excessive drug use, specifically dissociatives, psychedelics, stimulants, and marijuana.

Treatment

Alterations in serine racemase indicate that the endogenous NMDA receptor co-agonist D-serine may be produced abnormally in schizophrenia and that D-serine may be an effective treatment for schizophrenia.

Schizophrenia is now treated by medications known as antipsychotics (or neuroleptics) that typically reduce dopaminergic activity, because excess dopaminergic activity has been most strongly linked to positive symptoms, specifically persecutory delusions. Dopaminergic drugs do not induce the characteristic auditory hallucinations of schizophrenia. Dopaminergic drug abuse, such as abuse of methamphetamine, may result in a short-lasting psychosis or provoke a longer psychotic episode that may include auditory hallucinations. The typical antipsychotics are known to have significant risks of side effects that can increase over time, and they only show clinical effectiveness in reducing positive symptoms. Additionally, although newer atypical antipsychotics can have less affinity for dopamine receptors and still reduce positive symptoms, they do not significantly reduce negative symptoms. A 2006 systematic review investigated the efficacy of glutamatergic drugs as add-on therapy:

Outcome: Relapse (global outcome; add-on glycine)
Findings in words: At present it is not possible to be confident about the effect of adding the glutamatergic drug to standard antipsychotic treatment. Data supporting this finding are very limited.
Findings in numbers: RR 0.39 (0.02 to 8.73)
Quality of evidence: Very low

Outcome: Hospital admission (service outcome; add-on glycine)
Findings in words: There is no clarity about the benefits or otherwise of adding a glutamatergic drug to antipsychotics for outcomes about how much hospital/community care is used. Data supporting this finding are based on low quality evidence.
Findings in numbers: RR 2.63 (0.12 to 59.40)
Quality of evidence: Low

Outcome: No clinically significant improvement (mental state; add-on glycine)
Findings in words: There is no evidence of clear advantage of adding a glutamatergic drug to standard antipsychotic medication. These findings are based on data of low quality.
Findings in numbers: RR 0.92 (0.79 to 1.08)
Quality of evidence: Low

Outcome: Constipation (adverse effect; add-on glycine or D-serine)
Findings in words: There is no clarity from very limited data. Additional glutamatergic drugs could cause constipation or help avoid it.
Findings in numbers: RR 0.61 (0.06 to 6.02)
Quality of evidence: Very low

Outcome: Insomnia (adverse effect; add-on glycine or D-serine)
Findings in words: Additional glutamatergic drugs may help or cause insomnia; it is not clear from the very limited data.
Findings in numbers: RR 0.61 (0.13 to 2.84)
Quality of evidence: Very low

Missing outcomes: Quality of life was not reported in any studies.

Psychotomimetic glutamate antagonists

Ketamine and PCP were observed to produce symptoms with significant similarities to schizophrenia. Ketamine produces more similar symptoms (hallucinations, withdrawal) without observed permanent effects (other than ketamine tolerance). Both arylcyclohexylamines have some (μM) affinity for D2 receptors and act as triple reuptake inhibitors. PCP is representative symptomatically, but does appear to cause the brain structure changes seen in schizophrenia. Although unconfirmed, dizocilpine, discovered by a team at Merck, seems to model both the positive and negative effects in a manner very similar to schizophreniform disorders.

Possible glutamate based treatment

An early clinical trial by Eli Lilly of the drug LY2140023 showed potential for treating schizophrenia without the weight gain and other side effects associated with conventional antipsychotics. A trial in 2009 failed to prove superiority over placebo or olanzapine, but Lilly explained this as being due to an exceptionally high placebo response. However, Eli Lilly terminated further development of the compound in 2012 after it failed in phase III clinical trials. This drug acts as a selective agonist at the metabotropic mGluR2 and mGluR3 glutamate receptors (the mGluR3 gene has previously been associated with schizophrenia).

Studies of glycine (and related co-agonists at the NMDA receptor) added to conventional anti-psychotics have also found some evidence that these may improve symptoms in schizophrenia.

Animal models

Research done on mice in early 2009 showed that when the neuregulin-1/ErbB post-synaptic receptor genes are deleted, the dendritic spines of glutamate neurons initially grow but break down during later development. This led to symptoms (such as disturbed social function and an inability to adapt to predictable future stressors) that overlap with schizophrenia. This parallels the delayed onset of symptoms in humans with schizophrenia, who usually appear to show normal development until early adulthood.

Disrupted in schizophrenia 1 (DISC1) is a gene whose disruption has been linked to schizophrenia.

Execution (computing)

From Wikipedia, the free encyclopedia
 
Execution in computer and software engineering is the process by which a computer or virtual machine reads and acts on the instructions of a computer program. Each instruction of a program is a description of a particular action which must be carried out in order for a specific problem to be solved. Execution involves the control unit repeatedly following a 'fetch–decode–execute' cycle for each instruction. As the executing machine follows the instructions, specific effects are produced in accordance with the semantics of those instructions.
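The cycle above can be sketched in a few lines. The following is a minimal, hypothetical accumulator machine whose instruction set (LOAD, ADD, HALT) is invented purely for illustration:

```python
# Minimal sketch of a fetch-decode-execute loop for a hypothetical
# accumulator machine; the opcodes are invented for illustration.
def run(program):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = program[pc]   # fetch: read the instruction at the PC
        pc += 1
        if op == "LOAD":        # decode and execute each opcode
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            return acc          # leave the cycle
        else:
            raise ValueError(f"unknown opcode {op!r}")

result = run([("LOAD", 2), ("ADD", 3), ("HALT", None)])   # 5
```

Real control units implement the same loop in hardware, with the decode step selecting datapath operations rather than Python branches.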

Programs for a computer may be executed in a batch process without human interaction or a user may type commands in an interactive session of an interpreter. In this case, the "commands" are simply program instructions, whose execution is chained together.

The term run is used almost synonymously. A related meaning of both "to run" and "to execute" refers to the specific action of a user starting (or launching or invoking) a program, as in "Please run the application."

Process

Prior to execution, a program must first be written. This is generally done in source code, which is then compiled at compile time (and statically linked at link time) to produce an executable. This executable is then invoked, most often by an operating system, which loads the program into memory (load time), possibly performs dynamic linking, and then begins execution by moving control to the entry point of the program; all these steps depend on the Application Binary Interface of the operating system. At this point execution begins and the program enters run time. The program then runs until it ends, either through normal termination or a crash.
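CPython makes these phases visible in miniature: the built-in compile() turns source text into a code object (a "compile time" step) and exec() then runs it (the "run time" step). A minimal sketch:

```python
# Source text is compiled to a code object ("compile time"), then
# executed in a fresh namespace ("run time").
source = "answer = 2 + 2\n"

code_obj = compile(source, "<example>", "exec")   # compile time
namespace = {}
exec(code_obj, namespace)                         # run time; answer == 4
```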

Executable

Executable code, an executable file, or an executable program, sometimes simply referred to as an executable or binary, is a list of instructions and data to cause a computer "to perform indicated tasks according to encoded instructions", as opposed to a data file that must be interpreted (parsed) by a program to be meaningful.

The exact interpretation depends upon the use. "Instructions" is traditionally taken to mean machine code instructions for a physical CPU. In some contexts, a file containing scripting instructions (such as bytecode) may also be considered executable.
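In Python, for example, a function's bytecode can be inspected with the standard dis module, showing the "instructions" that the CPython virtual machine, rather than a physical CPU, executes. A small illustration (the exact opcode names vary between Python versions):

```python
import dis

def add_one(x):
    return x + 1

# Each entry is a bytecode instruction for the CPython virtual
# machine rather than for a physical CPU.
ops = [ins.opname for ins in dis.get_instructions(add_one)]
```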

Context of execution

The context in which execution takes place is crucial. Very few programs execute on a bare machine. Programs usually contain implicit and explicit assumptions about resources available at the time of execution. Most programs execute within a multitasking operating system and run-time libraries specific to the source language that provide crucial services not supplied directly by the computer itself. This supportive environment, for instance, usually decouples a program from direct manipulation of the computer peripherals, providing more general, abstract services instead.

Context switching

In order for programs and interrupt handlers to work without interference and share the same hardware memory and access to the I/O system, in a multitasking operating system running on a digital system with a single CPU/MCU there must be software and hardware facilities to keep track of an executing process's data (memory page addresses, registers, etc.) and to save and restore that data to the state it was in before the process was suspended. This is achieved by context switching. Running programs are often assigned Process Context IDentifiers (PCIDs).

In Linux-based operating systems, a set of data stored in registers is usually saved into a process descriptor in memory to implement switching of context. PCIDs are also used.
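The save-and-restore behavior of a context switch can be loosely mimicked in Python with generators, whose local state persists between resumptions. The round-robin scheduler below is a toy sketch of the idea, not how an OS kernel actually implements switching:

```python
# Each "process" is a generator; suspending at yield saves its local
# state (its context), and next() restores it for another time slice.
def process(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"   # suspend here; locals are preserved

def round_robin(procs):
    trace = []
    while procs:
        p = procs.pop(0)
        try:
            trace.append(next(p))   # restore context, run one slice
            procs.append(p)         # re-queue with its saved context
        except StopIteration:
            pass                    # the process has terminated
    return trace

trace = round_robin([process("A", 2), process("B", 2)])
# trace == ["A:0", "B:0", "A:1", "B:1"]
```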

Runtime

Runtime, run time, or execution time is the final phase of a computer program's life cycle, in which the code is being executed on the computer's central processing unit (CPU) as machine code. In other words, "runtime" is the running phase of a program.

A runtime error is detected during or after the execution (running state) of a program, whereas a compile-time error is detected by the compiler before the program is ever executed. Type checking, register allocation, code generation, and code optimization are typically done at compile time, but may be done at runtime depending on the particular language and compiler. Many other runtime errors exist and are handled differently by different programming languages: division by zero, domain errors, array subscript out of bounds, arithmetic underflow, and several types of overflow, among others. These are generally considered software bugs which may or may not be caught and handled by any particular language.
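The distinction can be demonstrated concretely in Python, where compile() surfaces a syntax error before any code runs, while a division by zero only appears once the code actually executes. The classify helper below is a hypothetical illustration:

```python
# A syntax error surfaces when the source is compiled, before any code
# runs; a division by zero only surfaces during execution.
def classify(source):
    try:
        code = compile(source, "<example>", "exec")
    except SyntaxError:
        return "compile-time error"
    try:
        exec(code, {})
    except ZeroDivisionError:
        return "runtime error"
    return "ok"

a = classify("def broken(:")   # malformed: caught at compile time
b = classify("x = 1 / 0")      # valid syntax: fails while running
```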

Implementation details

When a program is to be executed, a loader first performs the necessary memory setup and links the program with any dynamically linked libraries it needs, and then the execution begins starting from the program's entry point. In some cases, a language or implementation will have these tasks done by the language runtime instead, though this is unusual in mainstream languages on common consumer operating systems.

Some program debugging can only be performed (or is more efficient or accurate when performed) at runtime. Logic errors and array bounds checking are examples. For this reason, some programming bugs are not discovered until the program is tested in a production environment with real data, despite sophisticated compile-time checking and pre-release testing. In this case, the end-user may encounter a "runtime error" message.
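Array bounds checking is an example of a purely runtime check: in Python an out-of-bounds index compiles without complaint and is only detected, as a catchable IndexError, when the indexing executes. safe_get is a hypothetical helper written for this sketch:

```python
# An out-of-bounds index passes compilation; Python detects it only
# when the indexing executes, raising a catchable IndexError instead
# of corrupting memory.
def safe_get(items, i, default=None):
    try:
        return items[i]
    except IndexError:   # raised at run time, not compile time
        return default

v1 = safe_get([10, 20, 30], 1)                 # 20
v2 = safe_get([10, 20, 30], 99, default=-1)    # -1
```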

Application errors (exceptions)

Exception handling is one language feature designed to handle runtime errors, providing a structured way to catch completely unexpected situations as well as predictable errors or unusual results without the amount of inline error checking required of languages without it. More recent advancements in runtime engines enable automated exception handling which provides "root-cause" debug information for every exception of interest and is implemented independent of the source code, by attaching a special software product to the runtime engine.
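A minimal Python sketch of structured exception handling: one handler covers the predictable failure modes of a made-up "a/b" record format, with no inline checks needed at call sites. The parse_ratio function and its input format are invented for illustration:

```python
# One structured handler covers both malformed input and a zero
# denominator, instead of inline error checks before every operation.
def parse_ratio(text):
    try:
        num, den = text.split("/")
        return int(num) / int(den)
    except (ValueError, ZeroDivisionError):
        # anticipated errors: malformed text or a zero denominator
        return None

r1 = parse_ratio("3/4")    # 0.75
r2 = parse_ratio("oops")   # None (malformed)
r3 = parse_ratio("1/0")    # None (zero denominator)
```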

Runtime system

A runtime system, also called a runtime environment, primarily implements portions of an execution model. This is not to be confused with the runtime lifecycle phase of a program, during which the runtime system is in operation. When treating the runtime system as distinct from the runtime environment (RTE), the former may be defined as a specific part of the application software used for programming, a piece of software that provides the programmer with a more convenient environment for running programs during their production (testing and similar), while the latter (RTE) is the instance of an execution model applied to the developed program, which is itself then run in the aforementioned runtime system.

Most programming languages have some form of runtime system that provides an environment in which programs run. This environment may address a number of issues including the management of application memory, how the program accesses variables, mechanisms for passing parameters between procedures, interfacing with the operating system, and otherwise. The compiler makes assumptions depending on the specific runtime system to generate correct code. Typically the runtime system will have some responsibility for setting up and managing the stack and heap, and may include features such as garbage collection, threads or other dynamic features built into the language.
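Garbage collection is one such runtime-system service. In CPython, for example, the cycle collector reclaims reference cycles that reference counting alone cannot free; a small sketch using the standard gc module:

```python
import gc

# Two objects that reference each other form a cycle that reference
# counting alone cannot reclaim; the runtime's cycle collector can.
class Node:
    def __init__(self):
        self.partner = None

a, b = Node(), Node()
a.partner, b.partner = b, a   # mutual references: a cycle
del a, b                      # the names are gone, the cycle remains
collected = gc.collect()      # the collector finds and frees the cycle
```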

Instruction cycle

The instruction cycle (also known as the fetch–decode–execute cycle, or simply the fetch-execute cycle) is the cycle that the central processing unit (CPU) follows from boot-up until the computer has shut down in order to process instructions. It is composed of three main stages: the fetch stage, the decode stage, and the execute stage.

[Figure: a simple diagram illustrating the individual stages of the fetch-decode-execute cycle.]

In simpler CPUs, the instruction cycle is executed sequentially, each instruction being processed before the next one is started. In most modern CPUs, the instruction cycles are instead executed concurrently, and often in parallel, through an instruction pipeline: the next instruction starts being processed before the previous instruction has finished, which is possible because the cycle is broken up into separate steps.
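A back-of-envelope sketch of why pipelining helps: with S single-cycle stages, n instructions take S*n cycles sequentially but only S + (n - 1) cycles pipelined, since a new instruction enters the pipeline every cycle (assuming no stalls):

```python
# With `stages` single-cycle stages, a sequential CPU finishes one
# instruction before starting the next, while a pipelined CPU admits
# a new instruction every cycle once the pipeline is full.
def sequential_cycles(n, stages=3):
    return stages * n

def pipelined_cycles(n, stages=3):
    return stages + (n - 1)

seq = sequential_cycles(100)    # 300 cycles
pipe = pipelined_cycles(100)    # 102 cycles
```

Real pipelines fall short of this ideal because of hazards and stalls, but the asymptotic speedup toward one instruction per cycle is the motivation.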

Interpreter

A system that executes a program is called an interpreter of the program. Loosely speaking, an interpreter directly executes a program. This contrasts with a language translator that converts a program from one language to another before it is executed.
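An interpreter in this sense can be written in a few lines. The postfix stack language below (integer literals, "+" and "*") is made up for illustration; each token is carried out directly, with no translation to another language:

```python
# Direct execution of a made-up postfix stack language: integer
# literals are pushed; "+" and "*" pop two operands and push a result.
def interpret(tokens):
    stack = []
    for tok in tokens:
        if tok in ("+", "*"):
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if tok == "+" else a * b)
        else:
            stack.append(int(tok))   # literal operand
    return stack.pop()

value = interpret("2 3 + 4 *".split())   # (2 + 3) * 4 == 20
```

A translator, by contrast, would emit equivalent code in some target language and leave the execution to another system.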

Virtual machine

A virtual machine (VM) is the virtualization/emulation of a computer system. Virtual machines are based on computer architectures and provide functionality of a physical computer. Their implementations may involve specialized hardware, software, or a combination.

Virtual machines differ and are organized by their function.

Some virtual machine emulators, such as QEMU and video game console emulators, are designed to also emulate (or "virtually imitate") different system architectures thus allowing execution of software applications and operating systems written for another CPU or architecture. OS-level virtualization allows the resources of a computer to be partitioned via the kernel. The terms are not universally interchangeable.

Criticism of the Bible


Authorship

At the end of the 17th century, only a few Bible scholars, such as Thomas Hobbes, Isaac La Peyrère, and Baruch Spinoza, doubted that Moses wrote the Torah (also known as the Pentateuch, traditionally called the "Five Books of Moses"), but in the late 18th century some scholars such as Jean Astruc (1753) began to systematically question his authorship. By the end of the 19th century, some, such as Julius Wellhausen and Abraham Kuenen, went as far as to claim that the work as a whole was by many authors writing over many centuries, from 1000 BC (the time of David) to 500 BC (the time of Ezra), and that the history it contained was often more polemical than strictly factual. By the first half of the 20th century, Hermann Gunkel had drawn attention to its mythic aspects, and Albrecht Alt, Martin Noth, and the tradition history school argued that although its core traditions had genuinely ancient roots, the narratives were fictional framing devices and were not intended as history in the modern sense.

The modern consensus among Bible scholars is that the vast majority of the authors of the books of the Bible are unknown. Most of the books were written anonymously, and only some of the 27 books of the New Testament name an author, some of which are probably or known to be pseudepigrapha, meaning they were written by someone other than their claimed author. The anonymous books have traditionally been attributed to authors, though none of these attributions, such as the "Five Books of Moses" or the four canonical gospels "according to Matthew, Mark, Luke, and John", appear to stand up under scrutiny. Only the seven undisputed Pauline epistles appear most likely to have been written by Paul the Apostle, and the Book of Revelation by John of Patmos (not by John the Apostle, nor by the author(s) of the other 'Johannine literature'). Scholars disagree whether Paul wrote the "Deutero-Pauline epistles" and whether Simon Peter wrote the First Epistle of Peter; all other New Testament books that name an author are most likely forgeries. For the Pastorals, though, this judgment may result mainly from a passed-down tradition of "scholarly consensus" rather than being merited by the evidence.

In the 2nd century, the gnostics often claimed that their form of Christianity was the first, and they regarded Jesus as a teacher or an allegorical figure. Elaine Pagels has proposed that there are several examples of gnostic attitudes in the Pauline epistles. Bart D. Ehrman and Raymond E. Brown note that some of the Pauline epistles are widely regarded by scholars as pseudonymous, and it is the view of Timothy Freke, and others, that this involved a forgery in an attempt by the Church to bring in Paul's gnostic supporters and turn the arguments in the other epistles on their head.

Canonicity

Specific collections of biblical writings, such as the Hebrew Bible and Christian Bibles, are considered sacred and authoritative by their respective faith groups. The limits of the canon were effectively set by the proto-orthodox churches from the 1st throughout the 4th century; however, the status of the scriptures has been a topic of scholarly discussion in the later churches. Increasingly, the biblical works have been subjected to literary and historical criticism in an effort to interpret the biblical texts, independent of churches and dogmatic influences.

In the middle of the second century, Marcion of Sinope proposed rejecting the entire Jewish Bible. He considered the God portrayed therein to be a lesser deity, a demiurge, and that the law of Moses was contrived. A similar view is referred to as Jesuism, which does not affirm the scriptural authority of any biblical text other than the teachings of Jesus in the Gospels.

Judaism does not accept the New Testament or the Old Testament deuterocanonicals. Jews, along with most Christians, also reject the legitimacy of the New Testament apocrypha.

Ethics

Elizabeth Anderson, a professor of philosophy and women's studies at the University of Michigan, Ann Arbor, states that "the Bible contains both good and evil teachings", and it is "morally inconsistent".

Anderson criticizes commands God gave to men in the Old Testament, such as: kill adulterers, homosexuals, and "people who work on the Sabbath" (Leviticus 20:10; Leviticus 20:13; Exodus 35:2, respectively); to commit ethnic cleansing (Exodus 34:11–14, Leviticus 26:7–9); commit genocide (Numbers 21:2–3, Numbers 21:33–35, Deuteronomy 2:26–35, and Joshua 1–12); and other mass killings. Anderson considers the Bible to permit slavery, the beating of slaves, the rape of female captives in wartime, polygamy (for men), the killing of prisoners, and child sacrifice. She also provides several examples to illustrate what she considers "God's moral character": "Routinely punishes people for the sins of others ... punishes all mothers by condemning them to painful childbirth", punishes four generations of descendants of those who worship other gods, kills 24,000 Israelites because some of them sinned (Numbers 25:1–9), kills 70,000 Israelites for the sin of David in 2 Samuel 24:10–15, and "sends two bears out of the woods to tear forty-two children to pieces" because they called someone names in 2 Kings 2:23–24.

Anderson criticizes what she terms morally repugnant lessons of the New Testament. She claims that "Jesus tells us his mission is to make family members hate one another, so that they shall love him more than their kin" (Matt 10:35–37), that "Disciples must hate their parents, siblings, wives, and children (Luke 14:26)", and that Peter and Paul elevate men over their wives "who must obey their husbands as gods" (1 Corinthians 11:3, 1 Corinthians 14:34–35, Eph. 5:22–24, Col. 3:18, 1 Tim. 2:11–12, 1 Pet. 3:1). Anderson states that the Gospel of John implies that "infants and anyone who never had the opportunity to hear about Christ are damned [to hell], through no fault of their own".

Simon Blackburn states that the "Bible can be read as giving us a carte blanche for harsh attitudes to children, the mentally handicapped, animals, the environment, the divorced, unbelievers, people with various sexual habits, and elderly women".

Blackburn criticizes what he terms morally suspect themes of the New Testament. He notes some "moral quirks" of Jesus: that he could be "sectarian" (Matt 10:5–6), racist (Matt 15:26 and Mark 7:27), and placed no value on animal life (Luke 8:27–33).

Blackburn provides examples of Old Testament moral criticisms, such as the phrase in Exodus 22:18, ("Thou shalt not suffer a witch to live.") which he says has "helped to burn alive tens or hundreds of thousands of women in Europe and America". He states that the Old Testament God apparently has "no problems with a slave-owning society", considers birth control a crime punishable by death, and "is keen on child abuse". Additional examples that are questioned today are the prohibition on touching women during their "period of menstrual uncleanliness (Lev. 15:19–24)", the apparent approval of selling daughters into slavery (Exodus 21:7), and the obligation to put to death someone working on the Sabbath (Exodus 35:2).

Historicity

The historicity of the Bible is the question of the Bible's "acceptability as a history". This can be extended to the question of the Christian New Testament as an accurate record of the historical Jesus and the Apostolic Age.

Scholars examine the historical context of the Bible passages, the importance ascribed to events by the authors, and the contrast between the descriptions of these events and other historical evidence.

Archaeological discoveries since the 19th century are open to interpretation, but broadly speaking they lend support to few of the Old Testament's narratives as history and offer evidence to challenge others. However, some scholars still hold that the overall Old Testament narrative is historically reliable.

Biblical minimalism is a label applied to a loosely knit group of scholars who hold that the Bible's version of history is not supported by any archaeological evidence so far unearthed, and thus that the Bible cannot be trusted as a historical source. Author Richard I. Pervo details the non-historical sources of the Book of Acts.

Historicity of Jesus

The validity of the Gospels is challenged by writers such as Kersey Graves, who claimed that mythic stories with parallels to the life of Jesus were incorporated by the gospel writers into the story of Jesus, and Gerald Massey, who specifically claimed that the life story of the Egyptian god Horus was copied by Christian Gnostics.[36] Parallels have also been drawn between Greek myths and the life of Jesus. The comparative mythology of Jesus Christ examines the parallels that have been proposed between the biblical portrayal of Jesus and other religious or mythical domains. Some critics have alleged that Christianity is founded not on a historical figure but on a mythical creation. One of these views proposes that Jesus was the Jewish manifestation of a pan-Hellenic cult known as Osiris-Dionysus.

Christ myth theory proponents claim that the age, authorship, and authenticity of the Gospels cannot be verified, and thus that the Gospels cannot bear witness to the historicity of Jesus. This contrasts with writers such as David Strauss, who regarded only the supernatural elements of the gospels as myth: while those supernatural elements were a point of contention, Strauss did not dispute the gospels' authenticity as a witness to the historicity of Jesus.

Critics of the Gospels such as Richard Dawkins and Thomas Henry Huxley note that they were written long after the death of Jesus and that we have no real knowledge of the date of composition of the Gospels. Annie Besant and Thomas Paine note that the authors of the Gospels are not known.

Internal consistency

There are many places in the Bible in which inconsistencies—such as different numbers and names for the same feature, and different sequences for the same events—have been alleged and presented by critics as difficulties. Responses to these criticisms include the modern documentary hypothesis, the two-source hypothesis, and theories that the pastoral epistles are pseudonymous.

However, authors such as Raymond Brown have presented arguments that the Gospels contradict each other in various important respects and on various important details. W. D. Davies and E. P. Sanders state that: "on many points, especially about Jesus' early life, the evangelists were ignorant ... they simply did not know, and, guided by rumour, hope or supposition, did the best they could". Yet, E. P. Sanders has also opined, "The dominant view today seems to be that we can know pretty well what Jesus was out to accomplish, that we can know a lot about what he said, and that those two things make sense within the world of first-century Judaism." More critical scholars see the nativity stories either as completely fictional accounts, or at least constructed from traditions that predate the Gospels.

For example, many versions of the Bible specifically point out that the most reliable early manuscripts and other ancient witnesses did not include Mark 16:9–20, i.e., the Gospel of Mark originally ended at Mark 16:8, and additional verses were added a few hundred years later. This is known as the "Markan Appendix".

Translation issues

Translation of scripture into the vernacular (such as English and hundreds of other languages), though a common phenomenon, is also a subject of debate and criticism. For readability, clarity, or other reasons, translators may choose different wording or sentence structure, and some translations may choose to paraphrase passages. Because many words in the original languages have ambiguous or difficult-to-translate meanings, debates over correct interpretation occur. For instance, at creation (Gen 1:2), is רוח אלהים (ruach 'elohiym) the "wind of god", the "spirit of god" (i.e., the Holy Spirit in Christianity), or a "mighty wind" over the primordial deep? In Hebrew, רוח (ruach) can mean "wind", "breath", or "spirit". Both ancient and modern translators are divided over this and many other such ambiguities. Another example: the word used in the Masoretic Text to indicate the woman who would bear Immanuel is alleged to mean a young, unmarried woman in Hebrew, while Matthew 1:23 follows the Septuagint version of the passage, which uses the Greek word parthenos, translated "virgin", and is used to support the Christian idea of the virgin birth. Those who view the Masoretic Text, which forms the basis of most English translations of the Old Testament, as more accurate than the Septuagint, and trust its usual translation, may see this as an inconsistency, whereas those who take the Septuagint to be accurate may not.

More recently, several discoveries of ancient manuscripts, such as the Dead Sea Scrolls and Codex Sinaiticus, have led modern translations like the New International Version to differ somewhat from older ones such as the 17th-century King James Version. These newer translations remove verses not present in the earliest manuscripts (see List of omitted Bible verses), some of which are acknowledged as interpolations, such as the Comma Johanneum, while other passages have several highly variant versions in very important places, such as the resurrection scene in Mark 16. The King James Only movement rejects these changes and upholds the King James Version as the most accurate.

In a 1973 Journal of Biblical Literature article, Philip B. Harner, Professor Emeritus of Religion at Heidelberg College, claimed that the traditional translation of John 1:1c ("and the Word was God" and one of the most frequently cited verses to support the doctrine of the Trinity) is incorrect. He endorses the New English Bible translation of John 1:1c, "and what God was, the Word was."

The Bible and science

Common points of criticism against the Bible are targeted at the Genesis creation narrative, the Genesis flood myth, and the Tower of Babel. According to young Earth creationism, flat earth theory, and geocentrism, which all take a literal view of the book of Genesis, the universe and all forms of life on Earth were created directly by God roughly 6,000 years ago, a global flood killed almost all life on Earth, and the diversity of languages originated from God confusing his people, who were in the process of constructing a large tower. These assertions, however, are contradicted by contemporary research in disciplines such as archaeology, astronomy, biology, chemistry, geoscience, and physics. For instance, cosmological evidence suggests that the universe is approximately 13.8 billion years old. Analyses of the geological time scale date the Earth to be 4.5 billion years old. Developments in astronomy show the Solar System formed in a protoplanetary disk roughly 4.6 billion years ago. Physics and cosmology show that the Universe expanded, at a rapid rate, from quantum fluctuations in a process known as the Big Bang. Research within biology, chemistry, physics, astronomy, and geology has provided sufficient evidence to show life originated over 4 billion years ago through chemical processes. Countless fossils present throughout the fossil record, as well as research in molecular biology, genetics, anatomy, physiology, zoology, and other life sciences, show all living organisms evolved over billions of years and share a common ancestry. Archaeological excavations have expanded human history, with material evidence of ancient cultures older than 6,000 years. Moreover, 6,000 years is not enough time to account for the current amount of genetic variation in humans: if all humans were descended from two individuals who lived less than 10,000 years ago, it would require an impossibly high rate of mutation to reach humanity's current level of genetic diversity.

The argument that the literal story of Genesis can qualify as science collapses on three major grounds: the creationists' need to invoke miracles in order to compress the events of the earth's history into the biblical span of a few thousand years; their unwillingness to abandon claims clearly disproved, including the assertion that all fossils are products of Noah's flood; and their reliance upon distortion, misquote, half-quote, and citation out of context to characterize the ideas of their opponents.

Evolutionary creation, the religious belief that God created the world through the processes of evolution, seeks to reconcile some of these scientific challenges with the Christian faith.

According to one of the world's leading biblical archaeologists, William G. Dever,

Archaeology certainly doesn't prove literal readings of the Bible...It calls them into question, and that's what bothers some people. Most people really think that archaeology is out there to prove the Bible. No archaeologist thinks so. ... From the beginnings of what we call biblical archeology, perhaps 150 years ago, scholars, mostly western scholars, have attempted to use archeological data to prove the Bible. And for a long time it was thought to work. William Albright, the great father of our discipline, often spoke of the "archeological revolution." Well, the revolution has come but not in the way that Albright thought. The truth of the matter today is that archeology raises more questions about the historicity of the Hebrew Bible and even the New Testament than it provides answers, and that's very disturbing to some people.

Dever also wrote:

Archaeology as it is practiced today must be able to challenge, as well as confirm, the Bible stories. Some things described there really did happen, but others did not. The biblical narratives about Abraham, Moses, Joshua and Solomon probably reflect some historical memories of people and places, but the 'larger than life' portraits of the Bible are unrealistic and contradicted by the archaeological evidence....

I am not reading the Bible as Scripture… I am in fact not even a theist. My view all along—and especially in the recent books—is first that the biblical narratives are indeed 'stories', often fictional and almost always propagandistic, but that here and there they contain some valid historical information...

According to Dever, the scholarly consensus is that the figure of Moses is legendary, and not historical. However, he states that a "Moses-like figure" may have existed somewhere in the southern Transjordan in the mid-13th century BC.

Tel Aviv University archaeologist Ze'ev Herzog wrote in the Haaretz newspaper:

This is what archaeologists have learned from their excavations in the Land of Israel: the Israelites were never in Egypt, did not wander in the desert, did not conquer the land in a military campaign and did not pass it on to the 12 tribes of Israel. Perhaps even harder to swallow is that the united monarchy of David and Solomon, which is described by the Bible as a regional power, was at most a small tribal kingdom. And it will come as an unpleasant shock to many that the God of Israel, YHWH, had a female consort and that the early Israelite religion adopted monotheism only in the waning period of the monarchy and not at Mount Sinai.

Israel Finkelstein told The Jerusalem Post that Jewish archaeologists have found no historical or archaeological evidence to back the biblical narrative of the Exodus, the Jews' wandering in Sinai or Joshua's conquest of Canaan. On the alleged Temple of Solomon, Finkelstein said that there is no archaeological evidence to prove it really existed. Professor Yoni Mizrahi, an independent archaeologist who has worked with the International Atomic Energy Agency, agreed with Finkelstein.

Regarding the Exodus of Israelites from Egypt, Egyptian archaeologist Zahi Hawass said:

Really, it's a myth ... This is my career as an archaeologist. I should tell them the truth. If the people are upset, that is not my problem.

Notable critics
