Monday, November 20, 2023

Criminal law

From Wikipedia, the free encyclopedia
 
Criminal law is the body of law that relates to crime. It proscribes conduct perceived as threatening, harmful, or otherwise endangering to the property, health, safety, and welfare of people, including one's self. Most criminal law is established by statute, which is to say that the laws are enacted by a legislature. Criminal law includes the punishment and rehabilitation of people who violate such laws.

Criminal law varies according to jurisdiction, and differs from civil law, where emphasis is more on dispute resolution and victim compensation, rather than on punishment or rehabilitation.

Criminal procedure is a formalized official activity that authenticates the fact of commission of a crime and authorizes punitive or rehabilitative treatment of the offender.

History

The first civilizations generally did not distinguish between civil law and criminal law. The first written codes of law were designed by the Sumerians. Around 2100–2050 BC, Ur-Nammu, the Neo-Sumerian king of Ur, enacted a written legal code whose text has been discovered: the Code of Ur-Nammu, although an earlier code of Urukagina of Lagash (c. 2380–2360 BC) is also known to have existed. Another important early code was the Code of Hammurabi, which formed the core of Babylonian law. Only fragments of the early criminal laws of Ancient Greece have survived, e.g. those of Solon and Draco.

The Old Bailey in London (in 1808) was the venue for more than 100,000 criminal trials between 1674 and 1834, including all death penalty cases.

In Roman law, Gaius's Commentaries on the Twelve Tables also conflated the civil and criminal aspects, treating theft (furtum) as a tort. Assault and violent robbery were analogized to trespass as to property. Breach of such laws created a legal obligation, or vinculum juris, discharged by payment of monetary compensation or damages. The criminal law of imperial Rome is collected in Books 47–48 of the Digest. After the revival of Roman law in the 12th century, sixth-century Roman classifications and jurisprudence provided the foundations of the distinction between criminal and civil law in European law from then until the present time.

The first signs of the modern distinction between crimes and civil matters emerged during the Norman invasion of England. The special notion of criminal penalty, at least concerning Europe, arose in Spanish Late Scholasticism (see Alfonso de Castro), when the theological notion of God's penalty (poena aeterna), inflicted solely for a guilty mind, was transfused first into canon law and finally into secular criminal law. Codifiers and architects of Early Modern criminal law were the German jurist Benedikt Carpzov (1595–1666), professor of law in Leipzig, and two Italians, the Roman judge and lawyer Prospero Farinacci (1544–1618) and the Piedmontese lawyer and statesman Giulio Claro (1525–1575).

The development of the state dispensing justice in a court clearly emerged in the eighteenth century when European countries began maintaining police services. From this point, criminal law formalized the mechanisms for enforcement, which allowed for its development as a discernible entity.

Objectives of criminal law

Criminal law is distinctive for the uniquely serious potential consequences or sanctions for failure to abide by its rules. Every crime is composed of criminal elements. Capital punishment may be imposed in some jurisdictions for the most serious crimes. Physical or corporal punishment may be imposed, such as whipping or caning, although these punishments are prohibited in much of the world. Individuals may be incarcerated in prison or jail in a variety of conditions depending on the jurisdiction. Confinement may be solitary. Length of incarceration may vary from a day to life. Government supervision may be imposed, including house arrest, and convicts may be required to conform to particularized guidelines as part of a parole or probation regimen. Fines also may be imposed, seizing money or property from a person convicted of a crime.

Five objectives are widely accepted for enforcement of the criminal law by punishments: retribution, deterrence, incapacitation, rehabilitation and restoration. Jurisdictions differ on the value to be placed on each.

  • Retribution – Criminals ought to be punished in some way. This is the most widely seen goal. Criminals have taken improper advantage of, or inflicted unfair detriment upon, others, and consequently the criminal law will put criminals at some unpleasant disadvantage to "balance the scales." People submit to the law to receive the right not to be murdered, and if people contravene these laws, they surrender the rights granted to them by the law. Thus, one who murders may be executed himself. A related theory includes the idea of "righting the balance."
  • Deterrence – Individual deterrence is aimed toward the specific offender. The aim is to impose a sufficient penalty to discourage the offender from criminal behavior. General deterrence aims at society at large. By imposing a penalty on those who commit offenses, other individuals are discouraged from committing those offenses.
  • Incapacitation – Designed simply to keep criminals away from society so that the public is protected from their misconduct. This is often achieved through prison sentences today. The death penalty or banishment have served the same purpose.
  • Rehabilitation – Aims at transforming an offender into a valuable member of society. Its primary goal is to prevent further offense by convincing the offender that their conduct was wrong.
  • Restoration – This is a victim-oriented theory of punishment. The goal is to repair, through state authority, any injury inflicted upon the victim by the offender. For example, one who embezzles will be required to repay the amount improperly acquired. Restoration is commonly combined with other main goals of criminal justice and is closely related to concepts in the civil law, i.e., returning the victim to his or her original position before the injury.

Selected criminal laws

Many laws are enforced by threat of criminal punishment, and the range of the punishment varies with the jurisdiction. The scope of criminal law is too vast to catalog intelligently. Nevertheless, the following are some of the more typical aspects of criminal law.

Elements

The criminal law generally prohibits undesirable acts. Thus, proof of a crime requires proof of some act. Scholars label this the requirement of an actus reus, or guilty act. Some crimes – particularly modern regulatory offenses – require no more, and they are known as strict liability offenses (e.g., under the Road Traffic Act 1988 it is a strict liability offence to drive a vehicle with an alcohol concentration above the prescribed limit). Nevertheless, because of the potentially severe consequences of criminal conviction, judges at common law also sought proof of an intent to do some bad thing, the mens rea or guilty mind. As to crimes of which both actus reus and mens rea are requirements, judges have concluded that the elements must be present at precisely the same moment, and it is not enough that they occurred sequentially at different times.

Actus reus

An English court room in 1886, with Lord Chief Justice Coleridge presiding

Actus reus is Latin for "guilty act" and is the physical element of committing a crime. It may be accomplished by an action, by threat of action, or, exceptionally, by an omission to act where there is a legal duty to act. For example, the act of A striking B might suffice, as might a parent's failure to give food to a young child.

Where the actus reus is a failure to act, there must be a duty of care. A duty can arise through contract, a voluntary undertaking, a blood relation with whom one lives, and occasionally through one's official position. Duty also can arise from one's own creation of a dangerous situation. On the other hand, it was held in the U.K. that switching off the life support of someone in a persistent vegetative state is an omission to act and not criminal. Since discontinuation of power is not a voluntary act, not grossly negligent, and is in the patient's best interests, no crime takes place. In this case it was held that since a PVS patient could not give or withhold consent to medical treatment, it was for the doctors to decide whether treatment was in the patient's best interest. It was reasonable for them to conclude that treatment was not in the patient's best interest, and should therefore be stopped, when there was no prospect of improvement. It was never lawful to take active steps to cause or accelerate death, although in certain circumstances it was lawful to withhold life sustaining treatment, including feeding, without which the patient would die.

An actus reus may be nullified by an absence of causation. For example, if a crime involves harm to a person, the person's action must be the "but for" cause and the proximate cause of the harm. If more than one cause exists (e.g. harm comes at the hands of more than one culprit), the act must have "more than a slight or trifling link" to the harm.

Causation is not broken simply because a victim is particularly vulnerable. This is known as the thin skull rule. However, it may be broken by an intervening act (novus actus interveniens) of a third party, the victim's own conduct, or another unpredictable event. A mistake in medical treatment typically will not sever the chain, unless the mistakes are in themselves "so potent in causing death."

Mens rea

Mens rea is another Latin phrase, meaning "guilty mind". This is the mental element of the crime. A guilty mind means an intention to commit some wrongful act. Intention under criminal law is separate from a person's motive (although motive does not exist in Scots law).

A lower threshold of mens rea is satisfied when a defendant recognizes an act is dangerous but decides to commit it anyway. This is recklessness. It is the state of mind of the person at the time the actus reus was committed. For instance, if C tears a gas meter from a wall to get the money inside, and knows this will let flammable gas escape into a neighbour's house, he could be liable for poisoning. Courts often consider whether the actor did recognize the danger, or alternatively ought to have recognized a risk. Of course, a requirement only that one ought to have recognized a danger (though he did not) is tantamount to erasing intent as a requirement. In this way, the importance of mens rea has been reduced in some areas of the criminal law, but it remains an important part of the criminal system.

Wrongfulness of intent also may vary the seriousness of an offense and possibly reduce the punishment, but this is not always the case. A killing committed with specific intent to kill, or with conscious recognition that death or serious bodily harm will result, would be murder, whereas a killing effected by reckless acts lacking such a consciousness could be manslaughter. On the other hand, it matters not who is actually harmed through a defendant's actions. The doctrine of transferred malice means, for instance, that if a man intends to strike a person with his belt, but the belt bounces off and hits another, mens rea is transferred from the intended target to the person who actually was struck. (Note: the notion of transferred intent does not exist in Scots law. In Scotland, one would not be charged with assault due to transferred intent, but instead with assault due to recklessness.)

Strict liability

Strict liability can be described as criminal or civil liability notwithstanding the lack of mens rea or intent by the defendant. Not all crimes require specific intent, and the threshold of culpability required may be reduced. For example, it might be sufficient to show that a defendant acted negligently, rather than intentionally or recklessly. In offenses of absolute liability, other than the prohibited act, it may not be necessary to show the act was intentional. Generally, crimes must include an intentional act, and "intent" is an element that must be proved in order to find a crime occurred. The idea of a "strict liability crime" is an oxymoron. The few exceptions are not truly crimes at all – but are administrative regulations and civil penalties created by statute, such as offenses against the traffic or highway code.

Fatal offenses

A murder, defined broadly, is an unlawful killing. Unlawful killing is probably the act most frequently targeted by the criminal law. In many jurisdictions, the crime of murder is divided into various gradations of severity, e.g., murder in the first degree, based on intent. Malice is a required element of murder. Manslaughter (Culpable Homicide in Scotland) is a lesser variety of killing committed in the absence of malice, brought about by reasonable provocation, or diminished capacity. Involuntary manslaughter, where it is recognized, is a killing that lacks all but the most attenuated guilty intent, recklessness.

Settled insanity is a possible defense.

Personal offenses

Many criminal codes protect the physical integrity of the body. The crime of battery is traditionally understood as an unlawful touching, although this does not include everyday knocks and jolts to which people silently consent as the result of presence in a crowd. Creating a fear of imminent battery is an assault, and also may give rise to criminal liability. Non-consensual intercourse, or rape, is a particularly egregious form of battery.

Property offenses

Property often is protected by the criminal law. Trespassing is unlawful entry onto the real property of another. Many criminal codes provide penalties for conversion, embezzlement, and theft, all of which involve deprivations of the value of the property. Robbery is a theft by force. Fraud in the UK is a breach of the Fraud Act 2006 by false representation, by failure to disclose information or by abuse of position.

Participatory offenses

Some criminal codes criminalize association with a criminal venture or involvement in criminality that does not actually come to fruition. Some examples are aiding, abetting, conspiracy, and attempt. However, in Scotland, the English concept of Aiding and Abetting is known as Art and Part Liability. See Glanville Williams, Textbook of Criminal Law, (London: Stevens & Sons, 1983); Glanville Williams, Criminal Law the General Part (London: Stevens & Sons, 1961).

Mala in se v. mala prohibita

While crimes are typically broken into degrees or classes to punish appropriately, all offenses can be divided into 'mala in se' and 'mala prohibita' laws. Both are Latin legal terms: mala in se means crimes that are thought to be inherently evil or morally wrong, and thus will be widely regarded as crimes regardless of jurisdiction. Mala in se offenses include felonies, property crimes, immoral acts, and corrupt acts by public officials. Mala prohibita, on the other hand, refers to offenses that have no inherent wrongfulness associated with them. Parking in a restricted area, driving the wrong way down a one-way street, jaywalking, and unlicensed fishing are examples of acts that are prohibited by statute but are not otherwise considered wrong. Mala prohibita statutes are usually enforced strictly, as there need not be a mens rea component for punishment under those offenses, just the act itself. For this reason, it can be argued that offenses that are mala prohibita are not really crimes at all.

Criminal law jurisdictions

The exterior of the International Criminal Court's headquarters building in The Hague

Public international law deals extensively and increasingly with criminal conduct that is heinous and ghastly enough to affect entire societies and regions. The formative source of modern international criminal law was the Nuremberg trials following the Second World War, in which the leaders of Nazism were prosecuted for their part in genocide and atrocities across Europe. The Nuremberg trials marked the beginning of criminal fault for individuals, where individuals acting on behalf of a government can be tried for violations of international law without the benefit of sovereign immunity. In 1998, the International Criminal Court was established by the Rome Statute.

Crime against nature

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Crime_against_nature

The crime against nature or unnatural act has historically been a legal term in English-speaking states identifying forms of sexual behavior not considered natural or decent that are legally punishable offenses. Sexual practices that have historically been considered to be "crimes against nature" include masturbation, sodomy, and bestiality.

History and terminology

For much of modern history, a "crime against nature" was understood by courts to be synonymous with "buggery", and to include anal sex (copulation per anum) and bestiality. Early court decisions agreed that fellatio (copulation per os) was not included, mainly because that practice was virtually unknown when the common-law definition was established (it remained so rare that the first attempted fellatio prosecutions under the "crime against nature" statutes date to 1817 in England and 1893 in the United States). Likewise, sexual activities between two women were not covered. Over time, particularly starting in the early 20th century, some jurisdictions began enacting statutes or developing precedents that extended the scope of the crime to include fellatio and, sometimes, other sexual activities.

The term crime against nature is closely related to, and was often used interchangeably with, the term sodomy. (This varied from jurisdiction to jurisdiction. Sometimes the two terms were understood to be synonymous; sometimes sodomy was limited to sexual activities between two humans; and sometimes sodomy was taken to include anal sex or bestiality, whereas crime against nature also included fellatio.)

Until the early 19th century, courts were divided on whether the act needed to be completed (to result in ejaculation) in order to be a punishable offense. This question was deemed sufficiently important that, in 1828, English law was explicitly amended to specify that proof of ejaculation was not necessary for convictions for buggery and rape. The crime was not limited to same-sex activities, and, in the case of an act between two adults, both participants were guilty, regardless of consent. An attempted or completed act of sodomy, committed by a husband against his wife, was grounds for divorce at common law.

Historically, the offense was usually referred to by its longer name, the detestable and abominable (or abominable and detestable, or, sometimes, infamous) crime against nature, committed with mankind or beast. This phrase originates in the Buggery Act 1533, with the words "crime against nature" substituted for "vice of buggery" in the original, and it was present in one of these forms in the criminal codes of most U.S. states. Specific acts included under this heading were typically deemed too detestable to list explicitly, resulting in a number of vagueness-based legal challenges to the corresponding statutes. One of the most recent, and one of the rare successful challenges, is the 1971 Florida case of Franklin v. State. On the other hand, just seven years prior, a similar challenge (Perkins v. State) failed in North Carolina. (In Perkins, the Court wrote that, if this were a new statute, it would have been "obviously unconstitutional for vagueness", but, since this was a statute whose history was traceable back to the reign of Henry VIII, it had accumulated a number of judicial interpretations, and, backed with these interpretations, it was not unconstitutionally vague.)

Penalties for this offense varied greatly over time and between jurisdictions. Crime against nature remained punishable by death or life imprisonment both in the UK and in many U.S. states well into the 19th century. Liberalization of sexual morals led to reduction of penalties or decriminalization of the offense during the second half of the 20th century, so that, by 2003, it was no longer a punishable offense in 36 out of 50 U.S. states, and was only punishable by a fine in some of the remaining 14. (See Sodomy laws in the United States for details.)

Current use

Currently, the term crime against nature is still used in the statutes of several American states. However, these laws are unconstitutional to enforce for sexual conduct between consenting adults in light of Lawrence v. Texas (2003). The crime against nature statutes are still used, however, to criminalize sexual conduct involving minors, incest, public sex, prostitution, and bestiality.

Repeal and unconstitutionality

Except for those states, all other states in the United States have repealed their "crimes against nature" laws. Furthermore, in 2003, in Lawrence v. Texas, the US Supreme Court held that nonremunerative sex between consenting adults in private was protected by the Constitution and could not be criminalized under "crimes against nature" laws. Thus, fellatio, cunnilingus and anal sex can no longer fall within the scope of such laws.

Similar laws

See also Sodomy laws.

  • Article 377 of the Indian Penal Code (since 1860) prohibits all sexual acts against human nature. The portion criminalising consensual sex in private between adults was struck down by the Supreme Court of India in 2018, but the provision officially remains on the statute books.
  • Paragraph 175 of the imperial penal code of the German Empire, officially repealed in 1994.

Sunday, November 19, 2023

Polypharmacy

From Wikipedia, the free encyclopedia
Polypharmacy is often defined as taking 5 or more medicines.

Polypharmacy (polypragmasia) is an umbrella term describing the simultaneous use of multiple medicines by a patient for their conditions. The term polypharmacy is often defined as regularly taking five or more medicines, but there is no standard definition, and the term has also been used for a person prescribed two or more medications at the same time. Polypharmacy may be the consequence of having multiple long-term conditions, also known as multimorbidity, and is more common in people who are older. In some cases, an excessive number of medications at the same time is worrisome, especially for people who are older with many chronic health conditions, because this increases the risk of an adverse event in that population. In many cases, polypharmacy cannot be avoided, but 'appropriate polypharmacy' practices are encouraged to decrease the risk of adverse effects. Appropriate polypharmacy is defined as the practice of prescribing for a person who has multiple conditions or complex health needs by ensuring that medications prescribed are optimized and follow 'best evidence' practices.

The prevalence of polypharmacy is estimated to be between 10% and 90%, depending on the definition used, the age group studied, and the geographic location. Polypharmacy continues to grow in importance because of aging populations. Many countries are experiencing fast growth of the older population, 65 years and older. This growth is a result of the baby-boomer generation getting older and an increased life expectancy as a result of ongoing improvement in health care services worldwide. About 21% of adults with intellectual disability are also exposed to polypharmacy. The level of polypharmacy has been increasing in recent decades. Research in the USA shows that the percentage of patients older than 65 years using more than five medications increased from 24% to 39% between 1999 and 2012. Similarly, research in the UK found that the proportion of older people taking five or more medications had quadrupled from 12% to nearly 50% between 1994 and 2011.

Polypharmacy is not necessarily ill-advised, but in many instances can lead to negative outcomes or poor treatment effectiveness, often being more harmful than helpful or presenting too much risk for too little benefit. Therefore, health professionals consider it a situation that requires monitoring and review to validate whether all of the medications are still necessary. Concerns about polypharmacy include increased adverse drug reactions, drug interactions, prescribing cascade, and higher costs. A prescribing cascade occurs when a person is prescribed a drug and experiences an adverse drug effect that is misinterpreted as a new medical condition, so the patient is prescribed another drug. Polypharmacy also increases the burden of medication taking particularly in older people and is associated with medication non-adherence.

Polypharmacy is often associated with a decreased quality of life, including decreased mobility and cognition. Patient factors that influence the number of medications a patient is prescribed include a high number of chronic conditions requiring a complex drug regimen. Other systemic factors that impact the number of medications a patient is prescribed include a patient having multiple prescribers and multiple pharmacies that may not communicate.

Whether or not the advantages of polypharmacy (over taking single medications or monotherapy) outweigh the disadvantages or risks depends upon the particular combination and diagnosis involved in any given case. The use of multiple drugs, even in fairly straightforward illnesses, is not an indicator of poor treatment and is not necessarily overmedication. Moreover, it is well accepted in pharmacology that it is impossible to accurately predict the side effects or clinical effects of a combination of drugs without studying that particular combination of drugs in test subjects. Knowledge of the pharmacologic profiles of the individual drugs in question does not assure accurate prediction of the side effects of combinations of those drugs; and effects also vary among individuals because of genome-specific pharmacokinetics. Therefore, deciding whether and how to reduce a list of medications (deprescribe) is often not simple and requires the experience and judgment of a practicing clinician, as the clinician must weigh the pros and cons of keeping the patient on the medication. However, such thoughtful and wise review is an ideal that too often does not happen, owing to problems such as poorly handled care transitions (poor continuity of care, usually because of siloed information), overworked physicians and other clinical staff, and interventionism.

Appropriate medical uses

While polypharmacy is typically regarded as undesirable, prescription of multiple medications can be appropriate and therapeutically beneficial in some circumstances. "Appropriate polypharmacy" is described as prescribing for complex or multiple conditions in such a way that necessary medicines are used based on the best available evidence at the time to preserve safety and well-being. Polypharmacy is clinically indicated in some chronic conditions, for example in diabetes mellitus, but should be discontinued when evidence of benefit from the prescribed drugs no longer outweighs the potential for harm.

Often certain medications can interact with others in a positive way specifically intended when prescribed together, to achieve a greater effect than any of the single agents alone. This is particularly prominent in the field of anesthesia and pain management, where atypical agents such as antiepileptics, antidepressants, muscle relaxants, NMDA antagonists, and other medications are combined with more typical analgesics such as opioids, prostaglandin inhibitors, NSAIDs, and others. This practice of pain management drug synergy is known as an analgesia-sparing effect.

Special populations

People who are at greatest risk for negative polypharmacy consequences include elderly people, people with psychiatric conditions, patients with intellectual or developmental disabilities, people taking five or more drugs at the same time, those with multiple physicians and pharmacies, people who have been recently hospitalized, people who have concurrent comorbidities, people who live in rural communities, people with inadequate access to education, and those with impaired vision or dexterity. Marginalized populations may have a greater degree of polypharmacy, which can occur more frequently in younger age groups.

It is not uncommon for people who are dependent or addicted to substances to enter or remain in a state of polypharmacy misuse. About 84% of prescription drug misusers reported using multiple drugs. Note, however, that the term polypharmacy and its variants generally refer to legal drug use as-prescribed, even when used in a negative or critical context.

Measures can be taken to limit polypharmacy to its truly legitimate and appropriate needs. This is an emerging area of research, frequently called deprescribing. Reducing the number of medications, as part of a clinical review, can be an effective healthcare intervention. Clinical pharmacists can perform drug therapy reviews and teach physicians and their patients about drug safety and polypharmacy, as well as collaborating with physicians and patients to correct polypharmacy problems. Similar programs are likely to reduce the potentially deleterious consequences of polypharmacy such as adverse drug events, non-adherence, hospital admissions, drug-drug interactions, geriatric syndromes, and mortality. Such programs hinge upon patients and doctors informing pharmacists of other medications being prescribed, as well as herbal, over-the-counter substances and supplements that occasionally interfere with prescription-only medication. Staff at residential aged care facilities have a range of views and attitudes towards polypharmacy that, in some cases, may contribute to an increase in medication use.

Risks of polypharmacy

The risk of polypharmacy increases with age, although there is some evidence that it may decrease slightly after age 90 years. Poorer health is a strong predictor of polypharmacy at any age, although it is unclear whether the polypharmacy causes the poorer health or if polypharmacy is used because of the poorer health. It appears possible that the risk factors for polypharmacy may be different for younger and middle-aged people compared to older people.

The use of polypharmacy is correlated with the use of potentially inappropriate medications. Potentially inappropriate medications are generally taken to mean those that have been agreed upon by expert consensus, such as by the Beers Criteria. These medications are generally inappropriate for older adults because the risks outweigh the benefits. Examples include urinary anticholinergics used to treat incontinence; the risks associated with anticholinergics include constipation, blurred vision, dry mouth, impaired cognition, and falls. Many older people living in long-term care facilities experience polypharmacy, and under-prescribing of potentially indicated medicines and use of high-risk medicines can also occur.
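
As a rough illustration of how criteria like the Beers list are applied in software, here is a minimal Python sketch that screens a medication list against a tiny, hypothetical rule set; the drug names and rule text are placeholders for exposition, not the published criteria.

    from dataclasses import dataclass

    @dataclass
    class Rule:
        drug_class: str
        members: set[str]   # illustrative members of the flagged class
        concern: str        # why the class is flagged for older adults

    # Hypothetical, abbreviated rule set for exposition only.
    RULES = [
        Rule("urinary anticholinergic", {"oxybutynin", "tolterodine"},
             "constipation, blurred vision, dry mouth, impaired cognition, falls"),
    ]

    def screen(medications: list[str], age: int) -> list[str]:
        """Return one warning per medication that matches a rule (age 65+)."""
        if age < 65:
            return []
        return [f"{med}: {rule.drug_class} ({rule.concern})"
                for med in medications
                for rule in RULES
                if med.lower() in rule.members]

    print(screen(["Oxybutynin", "metformin"], age=78))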

Polypharmacy is associated with an increased risk of falls in elderly people. Certain medications are well known to be associated with the risk of falls, including cardiovascular and psychoactive medications. There is some evidence that the risk of falls increases cumulatively with the number of medications. Although often not practical to achieve, withdrawing all medicines associated with falls risk can halve an individual's risk of future falls.

Every medication has potential adverse side-effects. With every drug added, there is an additive risk of side-effects. Also, some medications have interactions with other substances, including foods, other medications, and herbal supplements. An estimated 15% of older adults are potentially at risk for a major drug-drug interaction. Older adults are at higher risk for a drug-drug interaction due to the increased number of medications prescribed and the metabolic changes that occur with aging. Each new drug multiplies the opportunities for interaction: a regimen of n drugs contains n(n-1)/2 possible pairs to check. Doctors and pharmacists aim to avoid prescribing medications that interact; often, adjustments in the dose of medications need to be made to avoid interactions. For example, warfarin interacts with many medications and supplements that can cause it to lose its effect.
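
A short Python sketch of that pair-counting, plus a naive screen against a two-entry interaction table; the table is hypothetical and stands in for the curated databases real screening software uses.

    from itertools import combinations

    def possible_pairs(n_drugs: int) -> int:
        """Distinct drug pairs in a regimen of n drugs: n * (n - 1) / 2."""
        return n_drugs * (n_drugs - 1) // 2

    # Hypothetical interaction table for illustration only.
    KNOWN_INTERACTIONS = {
        frozenset({"warfarin", "aspirin"}),
        frozenset({"warfarin", "fluconazole"}),
    }

    def screen(regimen: list[str]) -> list[set[str]]:
        """Return every pair in the regimen found in the table."""
        return [set(pair) for pair in combinations(regimen, 2)
                if frozenset(pair) in KNOWN_INTERACTIONS]

    print(possible_pairs(5), possible_pairs(10))         # 10 45
    print(screen(["warfarin", "aspirin", "metformin"]))  # one flagged pair: warfarin/aspirin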

Pill burden

Pill burden is the number of pills (tablets or capsules, the most common dosage forms) that a person takes on a regular basis, along with all the associated effort that increases with that number, such as storing, organizing, consuming, and understanding the various medications in one's regimen. The use of individual medications is growing faster than pill burden. A recent study found that older adults in long-term care take an average of 14 to 15 tablets every day.

Poor medical adherence is a common challenge among individuals who have increased pill burden and are subject to polypharmacy. It also increases the possibility of adverse medication reactions (side effects) and drug-drug interactions. High pill burden has also been associated with an increased risk of hospitalization, medication errors, and increased costs for both the pharmaceuticals themselves and for the treatment of adverse events. Finally, pill burden is a source of dissatisfaction for many patients and family carers.

High pill burden was commonly associated with antiretroviral drug regimens to control HIV, and is also seen in other patient populations. For instance, adults with multiple common chronic conditions such as diabetes, hypertension, lymphedema, hypercholesterolemia, osteoporosis, constipation, inflammatory bowel disease, and clinical depression may be prescribed more than a dozen different medications daily. The combination of multiple drugs has been associated with an increased risk of adverse drug events.

Reducing pill burden is recognized as a way to improve medication compliance, also referred to as adherence. This is done through "deprescribing", in which the risks and benefits are weighed when considering whether to continue a medication. This includes drugs such as bisphosphonates (for osteoporosis), which are often taken indefinitely although there is only evidence to use them for five to ten years. Patient educational programs, reminder messages, medication packaging, and the use of memory tricks have also been seen to improve adherence and reduce pill burden in several countries. These include associating medications with mealtimes, recording the dosage on the box, storing the medication in a special place, leaving it in plain sight in the living room, or putting the prescription sheet on the refrigerator. The development of applications has also shown some benefit in this regard. The use of a polypill regimen, such as a combination pill for HIV treatment, as opposed to a multi-pill regimen, also alleviates pill burden and increases adherence.

The selection of long-acting active ingredients over short-acting ones may also reduce pill burden. For instance, ACE inhibitors are used in the management of hypertension. Both captopril and lisinopril are examples of ACE inhibitors. However, lisinopril is dosed once a day, whereas captopril may be dosed 2–3 times a day. Assuming that there are no contraindications or potential for drug interactions, using lisinopril instead of captopril may be an appropriate way to limit pill burden.
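
A toy calculation of the point above, with dosing frequencies assumed from the typical schedules just mentioned:

    # Assumed typical schedules: captopril three times daily, lisinopril once daily.
    DOSES_PER_DAY = {"captopril": 3, "lisinopril": 1}

    def annual_pill_count(drug: str) -> int:
        """Tablets per year at one tablet per dose."""
        return DOSES_PER_DAY[drug] * 365

    print(annual_pill_count("captopril"))   # 1095
    print(annual_pill_count("lisinopril"))  # 365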

Interventions

The most common intervention to help people who are struggling with polypharmacy is deprescribing. Deprescribing can be confused with medication simplification, which does not attempt to reduce the number of medicines but rather the number of dose forms and administration times. Deprescribing refers to reducing the number of medications that a person is prescribed and includes the identification and discontinuance of medications when the benefit no longer outweighs the harm. In elderly patients, this can commonly be done as a patient becomes more frail and treatment focus needs to shift from preventative to palliative. Deprescribing is feasible and effective in many settings, including residential care, communities, and hospitals. This preventative measure should be considered when: (1) a new symptom or adverse event arises, (2) the person develops an end-stage disease, (3) the combination of drugs is risky, or (4) stopping the drug does not alter the disease trajectory.

Several tools exist to help physicians decide when to deprescribe and what medications can be added to a pharmaceutical regimen. The Beers Criteria and the STOPP/START criteria help identify medications that have the highest risk of adverse drug events (ADE) and drug-drug interactions. The Medication appropriateness tool for comorbid health conditions during dementia (MATCH-D) is the only tool available specifically for people with dementia, and also cautions against polypharmacy and complex medication regimens.

Barriers faced by both physicians and people taking the medications have made it challenging to apply deprescribing strategies in practice. For physicians, these include fear of consequences of deprescribing, the prescriber's own confidence in their skills and knowledge to deprescribe, reluctance to alter medications that are prescribed by specialists, the feasibility of deprescribing, lack of access to all of patients' clinical notes, and the complexity of having multiple providers. For patients who are prescribed or require the medication, barriers include attitudes or beliefs about the medications, inability to communicate with physicians, fears and uncertainties surrounding deprescribing, and influence of physicians, family, and the media. Barriers can include other health professionals or carers, such as in residential care, believing that the medicines are required.

In people with multiple long-term conditions (multimorbidity) and polypharmacy, deprescribing represents a complex challenge, as clinical guidelines are usually developed for single conditions. In these cases, tools and guidelines like the Beers Criteria and STOPP/START can be used safely by clinicians, but not all patients will benefit from stopping their medication. Clarity about how far clinicians can go beyond the guidelines, and about the responsibility they need to take, could help them prescribe and deprescribe for complex cases. Further factors that can help clinicians tailor their decisions to the individual are: access to detailed data on the people in their care (including their backgrounds and personal medical goals), discussing plans to stop a medicine as early as when it is first prescribed, and a good relationship that involves mutual trust and regular discussions of progress. Furthermore, longer appointments for prescribing and deprescribing would allow time to explain the process of deprescribing, explore related concerns, and support making the right decisions.

The effectiveness of specific interventions to improve the appropriate use of polypharmacy, such as pharmaceutical care and computerised decision support, is unclear, owing to the low quality of current evidence surrounding these interventions. High-quality evidence is needed to make any conclusions about the effects of such interventions in any environment, including in care homes. Deprescribing is not influenced by whether medicines are prescribed through a paper-based or an electronic system. Deprescribing rounds have been proposed as a potentially successful methodology for reducing polypharmacy. Sharing of positive outcomes from physicians who have implemented deprescribing, increased communication between all practitioners involved in patient care, higher compensation for time spent deprescribing, and clear deprescribing guidelines can help enable the practice of deprescribing. Despite the difficulties, a recent blinded study of deprescribing reported that participants used an average of two fewer medicines each after 12 months, showing again that deprescribing is feasible.

Adverse drug reaction

From Wikipedia, the free encyclopedia
A rash due to a drug reaction

An adverse drug reaction (ADR) is a harmful, unintended result caused by taking medication. ADRs may occur following a single dose or prolonged administration of a drug, or may result from the combination of two or more drugs. The meaning of this term differs from the term "side effect" because side effects can be beneficial as well as detrimental. The study of ADRs is the concern of the field known as pharmacovigilance. An adverse event (AE) refers to any unexpected and inappropriate occurrence at the time a drug is used, whether or not the event is associated with the administration of the drug. An ADR is a special type of AE in which a causative relationship can be shown. ADRs are only one type of medication-related harm; another is not taking prescribed medications, known as non-adherence, which can lead to death and other negative outcomes. By definition, an adverse drug reaction requires the use of a medication.

Classification

Traditional classification

  • Type A: augmented pharmacological effects, which are dose-dependent and predictable
Type A reactions, which constitute approximately 80% of adverse drug reactions, are usually a consequence of the drug's primary pharmacological effect (e.g., bleeding when using the anticoagulant warfarin) or a low therapeutic index of the drug (e.g., nausea from digoxin), and they are therefore predictable. They are dose-related and usually mild, although they may be serious or even fatal (e.g. intracranial bleeding from warfarin). Such reactions are usually due to inappropriate dosage, especially when drug elimination is impaired. The term side effects may be applied to minor type A reactions.
  • Type B: Type B reactions are not dose-dependent and are not predictable, and so may be called idiosyncratic. These reactions can be due to particular elements within the person or the environment.

Types A and B were proposed in the 1970s, and the other types were proposed subsequently when the first two proved insufficient to classify ADRs.

Other types of adverse drug reactions are Type C, Type D, Type E, and Type F: Type C covers chronic adverse drug reactions, Type D delayed reactions, Type E withdrawal reactions, and Type F failure of therapy. Adverse drug reactions can also be categorized by time-relatedness, dose-relatedness, and susceptibility, which collectively are called the DoTS classification.
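
Written out as a small data structure, the typology looks like the Python sketch below; the one-line glosses paraphrase this section, and the classifier covers only the original A/B split.

    from enum import Enum

    class ADRType(Enum):
        A = "augmented: dose-dependent and predictable"
        B = "bizarre/idiosyncratic: not dose-dependent, not predictable"
        C = "chronic adverse drug reactions"
        D = "delayed adverse drug reactions"
        E = "end-of-use (withdrawal) adverse drug reactions"
        F = "failure of therapy"

    def classify(dose_dependent: bool, predictable: bool) -> ADRType:
        """Crude first split between the two original 1970s categories."""
        return ADRType.A if dose_dependent and predictable else ADRType.B

    print(classify(dose_dependent=True, predictable=True))    # ADRType.A
    print(classify(dose_dependent=False, predictable=False))  # ADRType.B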

Seriousness

The U.S. Food and Drug Administration defines a serious adverse event as one in which the patient outcome is one of the following (a minimal coded check appears after the list):

  • Death
  • Life-threatening
  • Hospitalization (initial or prolonged)
  • Disability – significant, persistent, or permanent change, impairment, damage or disruption in the patient's body function/structure, physical activities or quality of life.
  • Congenital abnormality
  • Requires intervention to prevent permanent impairment or damage
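
A minimal Python sketch of that rule; the outcome labels are illustrative stand-ins for the FDA categories above.

    # Outcome labels chosen here for illustration.
    SERIOUS_OUTCOMES = {
        "death", "life-threatening", "hospitalization", "disability",
        "congenital anomaly", "intervention required to prevent impairment",
    }

    def is_serious(outcomes: set[str]) -> bool:
        """An adverse event is serious if any listed outcome applies."""
        return bool(outcomes & SERIOUS_OUTCOMES)

    print(is_serious({"headache", "hospitalization"}))  # True
    print(is_serious({"mild rash"}))                    # False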

Severity is a measure of the intensity of the adverse event in question. The terms "severe" and "serious", when applied to adverse events, are technically very different. They are easily confused but cannot be used interchangeably, requiring care in usage. Seriousness usually indicates patient outcome (such as negative outcomes including disability, long-term effects, and death).

A headache is severe if it causes intense pain. There are scales, like the visual analog scale, that help clinicians assess severity. On the other hand, a headache is not usually serious (though it may be in the case of a subarachnoid hemorrhage or subdural bleed, and even a migraine may at times fit the criteria), unless it also satisfies the criteria for seriousness listed above.

In adverse drug reactions, the seriousness of the reaction is important for reporting.

Location

Adverse effects may be local, i.e. limited to a certain location, or systemic, where medication has caused adverse effects throughout the systemic circulation.

For instance, some ocular antihypertensives cause systemic effects, although they are administered locally as eye drops, since a fraction escapes to the systemic circulation.

Mechanisms

Adverse drug reaction leading to hepatitis (drug-induced hepatitis) with granulomata. Other causes were excluded with extensive investigations. Liver biopsy. H&E stain.

As research better explains the biochemistry of drug use, fewer ADRs are Type B and more are Type A. Common mechanisms are:

  • Abnormal pharmacokinetics due to:
    • comorbid disease states
    • genetic factors
    • protein binding
    • drug interactions
  • Synergistic effects between either:
    • a drug and a disease
    • two drugs
  • Antagonism effects between either:
    • a drug and a disease
    • two drugs

Abnormal pharmacokinetics

Comorbid disease states

Various diseases, especially those that cause renal or hepatic insufficiency, may alter drug metabolism. Resources are available that report changes in a drug's metabolism due to disease states.

The Medication Appropriateness Tool for Comorbid Health Conditions in Dementia (MATCH-D) criteria warns that people with dementia are more likely to experience adverse effects, and that they are less likely to be able to reliably report symptoms.

Genetic factors

Pharmacogenomics includes how genes can predict potential adverse drug reactions. However, pharmacogenomics is not limited to adverse events (of any type), but also looks at how genes may impact other responses to medications, such as low/no effect or expected/normal responses (especially based on drug metabolism).

Abnormal drug metabolism may be due to inherited factors of either Phase I oxidation or Phase II conjugation.

Phase I reactions

Phase I reactions include metabolism by cytochrome P450. Patients may have abnormal metabolism by cytochrome P450 due to either inheriting abnormal alleles or drug interactions. Tables are available to check for drug interactions due to P450 interactions.

Inheriting abnormal butyrylcholinesterase (pseudocholinesterase) may affect metabolism of drugs such as succinylcholine.

Phase II reactions

Inheriting abnormal N-acetyltransferase, which conjugates some drugs to facilitate excretion, may affect the metabolism of drugs such as isoniazid, hydralazine, and procainamide.

Inheriting abnormal thiopurine S-methyltransferase may affect the metabolism of the thiopurine drugs mercaptopurine and azathioprine.

Protein binding

Protein binding interactions are usually transient and mild until a new steady state is achieved. These are mainly for drugs without much first-pass liver metabolism. The principal plasma proteins for drug binding are:

  1. albumin
  2. α1-acid glycoprotein
  3. lipoproteins

Some drug interactions with warfarin are due to changes in protein binding.

Drug interactions

The risk of drug interactions is increased with polypharmacy, especially in older adults.

Additive drug effects

Two or more drugs that contribute to the same mechanism in the body can have additive toxic or adverse effects. One example of this is multiple medications administered concurrently that prolong the QT interval, such as antiarrhythmics like sotalol and some macrolide antibiotics, such as systemic azithromycin. Another example of additive effects for adverse drug reactions is in serotonin toxicity (serotonin syndrome). If medications that cause increased serotonin levels are combined, they can cause serotonin toxicity (though therapeutic doses of one agent that increases serotonin levels can cause serotonin toxicity in certain cases and individuals). Some of the medications that can contribute to serotonin toxicity include MAO inhibitors, SSRIs, and tricyclic antidepressants.
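
The additive-effect idea can be sketched as flagging any risk mechanism shared by two or more drugs in a regimen; the drug-to-mechanism mapping below is illustrative only.

    # Illustrative mapping from drugs to shared risk mechanisms.
    RISK_MECHANISMS = {
        "sotalol": {"qt_prolongation"},
        "azithromycin": {"qt_prolongation"},
        "sertraline": {"serotonergic"},
        "tranylcypromine": {"serotonergic"},  # an MAO inhibitor
    }

    def additive_risks(regimen: list[str]) -> dict[str, list[str]]:
        """Map each mechanism to the regimen drugs sharing it, if two or more do."""
        shared: dict[str, list[str]] = {}
        for drug in regimen:
            for mechanism in RISK_MECHANISMS.get(drug, set()):
                shared.setdefault(mechanism, []).append(drug)
        return {m: drugs for m, drugs in shared.items() if len(drugs) > 1}

    print(additive_risks(["sotalol", "azithromycin", "sertraline"]))
    # {'qt_prolongation': ['sotalol', 'azithromycin']}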

Altered metabolism

Some medications can either inhibit or induce key drug-metabolizing enzymes or drug transporters, which, when combined with other medications that utilize the same proteins, can lead to either toxic or sub-therapeutic adverse effects. One example of this is a patient taking a cytochrome P450 3A4 (CYP3A4) inhibitor such as the antibiotic clarithromycin, as well as another medication metabolized by CYP3A4 such as the anticoagulant apixaban, which results in elevated blood concentrations of apixaban and greater risk of serious bleeds. Additionally, clarithromycin is a permeability glycoprotein (P-gp) efflux pump inhibitor, which when given with apixaban (a substrate for P-gp) will lead to increased absorption of apixaban, resulting in the same adverse effects as with CYP3A4 inhibition.
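
The same inhibitor/substrate overlap can be sketched as a lookup: if one drug inhibits a protein that another drug depends on for clearance, flag the pair. The protein assignments below mirror the clarithromycin–apixaban example but are not a reference table.

    # Illustrative protein assignments based on the example in the text.
    INHIBITS = {"clarithromycin": {"CYP3A4", "P-gp"}}
    CLEARED_BY = {"apixaban": {"CYP3A4", "P-gp"}, "metformin": set()}

    def metabolic_conflicts(regimen: list[str]) -> list[tuple[str, str, str]]:
        """Return (inhibitor, substrate, protein) triples found in a regimen."""
        return [(inh, sub, protein)
                for inh in regimen
                for sub in regimen if sub != inh
                for protein in INHIBITS.get(inh, set()) & CLEARED_BY.get(sub, set())]

    print(metabolic_conflicts(["clarithromycin", "apixaban", "metformin"]))
    # e.g. [('clarithromycin', 'apixaban', 'CYP3A4'), ('clarithromycin', 'apixaban', 'P-gp')]
    # (order of the flagged proteins may vary)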

Assessing causality

Causality assessment is used to determine the likelihood that a drug caused a suspected ADR. There are a number of different methods used to judge causation, including the Naranjo algorithm, the Venulet algorithm, and the WHO causality term assessment criteria. Each has pros and cons associated with its use, and most require some level of expert judgement to apply. An ADR should not be labeled as 'certain' unless the ADR abates with a challenge-dechallenge-rechallenge protocol (stopping and restarting the agent in question). The chronology of the onset of the suspected ADR is important, as another substance or factor may be implicated as a cause; co-prescribed medications and underlying psychiatric conditions may be factors in the ADR.
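
As one concrete example, the Naranjo algorithm totals points over a fixed questionnaire and maps the total to a causality category (roughly: 9 or more "definite", 5–8 "probable", 1–4 "possible", 0 or less "doubtful"). The Python sketch below shows the shape of such a score with two illustrative items, not the full ten-question instrument.

    # Two illustrative items: (points if yes, points if no); "unknown" scores 0.
    ITEMS = {
        "previous conclusive reports of this reaction": (1, 0),
        "reaction improved when the drug was stopped": (1, 0),
    }

    def score(answers: dict[str, str]) -> int:
        """Sum the item points for a set of yes/no/unknown answers."""
        total = 0
        for question, (yes_pts, no_pts) in ITEMS.items():
            answer = answers.get(question, "unknown")
            total += yes_pts if answer == "yes" else no_pts if answer == "no" else 0
        return total

    def category(total: int) -> str:
        if total >= 9: return "definite"
        if total >= 5: return "probable"
        if total >= 1: return "possible"
        return "doubtful"

    total = score({q: "yes" for q in ITEMS})
    print(total, category(total))  # 2 possible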

Assigning causality to a specific agent often proves difficult, unless the event is found during a clinical study or large databases are used. Both methods have difficulties and can be fraught with error. Even in clinical studies, some ADRs may be missed as large numbers of test individuals are required to find a specific adverse drug reaction, especially for rare ADRs. Psychiatric ADRs are often missed as they are grouped together in the questionnaires used to assess the population.

Monitoring bodies

Many countries have official bodies that monitor drug safety and reactions. On an international level, the WHO runs the Uppsala Monitoring Centre. The European Union runs the European Medicines Agency (EMA). In the United States, the Food and Drug Administration (FDA) is responsible for monitoring post-marketing studies. The FDA runs the FDA Adverse Event Reporting System, where individuals can report adverse drug events; healthcare professionals, consumers, and the pharmaceutical industry can all submit information to this system. For health products marketed in Canada, the Canada Vigilance Program, a branch of Health Canada, is responsible for surveillance; both healthcare professionals and consumers can report to this program. In Australia, the Therapeutic Goods Administration (TGA) conducts postmarket monitoring of therapeutic products. In the UK, a monitoring system called the Yellow Card Scheme was established in 1964 to surveil medications and other health products.

Epidemiology

A study by the Agency for Healthcare Research and Quality (AHRQ) found that in 2011, sedatives and hypnotics were a leading source for adverse drug events seen in the hospital setting. Approximately 2.8% of all ADEs present on admission and 4.4% of ADEs that originated during a hospital stay were caused by a sedative or hypnotic drug. A second study by AHRQ found that in 2011, the most common specifically identified causes of adverse drug events that originated during hospital stays in the U.S. were steroids, antibiotics, opiates/narcotics, and anticoagulants. Patients treated in urban teaching hospitals had higher rates of ADEs involving antibiotics and opiates/narcotics compared to those treated in urban nonteaching hospitals. Those treated in private, nonprofit hospitals had higher rates of most ADE causes compared to patients treated in public or private, for-profit hospitals.

Medication related harm (MRH) is common after hospital discharge in older adults, but methodological inconsistencies between studies and a paucity of data on risk factors limit clear understanding of the epidemiology. There was a wide range in incidence, from 0.4% to 51.2% of participants, and 35% to 59% of harm was preventable. Medication related harm incidence within 30 days after discharge ranged from 167 to 500 events per 1,000 individuals discharged (17–51% of individuals).

In the U.S., females had a higher rate of ADEs involving opiates and narcotics than males in 2011, while male patients had a higher rate of anticoagulant ADEs. Nearly 8 in 1,000 adults aged 65 years or older experienced one of the four most common ADEs (steroids, antibiotics, opiates/narcotics, and anticoagulants) during hospitalization. A study showed that 48% of patients had an adverse drug reaction to at least one drug, and pharmacist involvement helps to pick up adverse drug reactions.

In 2012, McKinsey & Company concluded that the cost of the 50–100 million preventable error-related adverse drug events would be between US$18 billion and US$115 billion.

An article published in The Journal of the American Medical Association (JAMA) in 2016 reported adverse drug event statistics from emergency departments around the United States in 2013–2014. From this article, the estimated prevalence of adverse drug events presented to the emergency department (ED) was 4 events per 1,000 people. The article reported that 57.1% of these adverse drug events were in females. As well, of all the adverse drug events presented to the emergency department documented in this article, 17.6% were from anticoagulants, 16.1% from antibiotics, and 13.3% from diabetic agents.

Death drive

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Death_drive

In classical Freudian psychoanalytic theory, the death drive (German: Todestrieb) is the drive toward death and destruction, often expressed through behaviors such as aggression, repetition compulsion, and self-destructiveness. It was originally proposed by Sabina Spielrein in her paper "Destruction as the Cause of Coming Into Being" (Die Destruktion als Ursache des Werdens) in 1912, which was then taken up by Sigmund Freud in 1920 in Beyond the Pleasure Principle. This concept has been translated as "opposition between the ego or death instincts and the sexual or life instincts". In Beyond the Pleasure Principle, Freud used the plural "death drives" (Todestriebe) much more frequently than the singular.

The death drive opposes Eros, the tendency toward survival, propagation, sex, and other creative, life-producing drives. The death drive is sometimes referred to as "Thanatos" in post-Freudian thought, complementing "Eros", although this term was not used in Freud's own work, being rather introduced by Wilhelm Stekel in 1909 and then by Paul Federn in the present context. Subsequent psychoanalysts such as Jacques Lacan and Melanie Klein have defended the concept.

Terminology

The standard edition of Freud's works in English confuses two terms that are different in German, Instinkt (instinct) and Trieb (drive), often translating both as instinct; for example, "the hypothesis of a death instinct, the task of which is to lead organic life back into the inanimate state". "This equating of Instinkt and Trieb has created serious misunderstandings". Freud does use the term Instinkt explicitly elsewhere, and so while the concept of "instinct" can loosely be referred to as a "drive", any essentialist or naturalist connotations of the term should be held in abeyance. In a sense, the death drive is a force that is not essential to the life of an organism (unlike an "instinct") and tends to denature it or make it behave in ways that are sometimes counter-intuitive. In other words, the term death "instinct" is simply a false representation of the death drive. The term is almost universally known in scholarly literature on Freud as the "death drive", and Lacanian psychoanalysts often shorten it to simply "drive" (although Freud posited the existence of other drives as well, and Lacan explicitly states in Seminar XI that all drives are partial to the death drive). The contemporary Penguin translations of Freud render Trieb and Instinkt as "drive" and "instinct" respectively.

Origin of the theory: Beyond the Pleasure Principle

It was a basic premise of Freud's that "the course taken by mental events is automatically regulated by the pleasure principle...[associated] with an avoidance of unpleasure or a production of pleasure". Three main types of conflictual evidence, difficult to explain satisfactorily in such terms, led Freud late in his career to look for another principle in mental life beyond the pleasure principle—a search that would ultimately lead him to the concept of the death drive.

The first problem Freud encountered was the phenomenon of repetition in (war) trauma. When Freud worked with people with trauma (particularly the trauma experienced by soldiers returning from World War I), he observed that subjects often tended to repeat or re-enact these traumatic experiences: "dreams occurring in traumatic neuroses have the characteristic of repeatedly bringing the patient back into the situation of his accident", contrary to the expectations of the pleasure principle.

A second problematic area was found by Freud in children's play (such as the Fort/Da ["gone/there"] game played by Freud's grandson, who would stage and re-stage the disappearance of his mother and even himself). "How then does his repetition of this distressing experience as a game fit in with the pleasure principle?"

The third problem came from clinical practice. Freud found his patients, dealing with painful experiences that had been repressed, regularly "obliged to repeat the repressed material as a contemporary experience instead of ... remembering it as something belonging to the past". Combined with what he called "the compulsion of destiny ... come across [in] people all of whose human relationships have the same outcome", such evidence led Freud "to justify the hypothesis of a compulsion to repeat—something that would seem more primitive, more elementary, more instinctual than the pleasure principle which it over-rides".

He then set out to find an explanation of such a compulsion, an explanation that some scholars have labeled as "metaphysical biology". In Freud's own words, "What follows is speculation, often far-fetched speculation, which the reader will consider or dismiss according to his individual predilection". Seeking a new instinctual paradigm for such problematic repetition, he found it ultimately in "an urge in organic life to restore an earlier state of things"—the inorganic state from which life originally emerged. From the conservative, restorative character of instinctual life, Freud derived his death drive, with its "pressure towards death", and the resulting "separation of the death instincts from the life instincts" seen in Eros. The death drive then manifested itself in the individual creature as a force "whose function is to assure that the organism shall follow its own path to death".

Seeking further potential clinical support for the existence of such a self-destructive force, Freud found it through a reconsideration of his views of masochism—previously "regarded as sadism that has been turned round upon the subject's own ego"—so as to allow that "there might be such a thing as primary masochism—a possibility which I had contested" before. Even with such support, however, he remained, to the book's close, very tentative about the provisional nature of his theoretical construct: what he called "the whole of our artificial structure of hypotheses".

Although Spielrein's paper was published in 1912, Freud initially resisted the concept as he considered it to be too Jungian. Nevertheless, Freud eventually adopted the concept, and in later years would build extensively upon the tentative foundations he had set out in Beyond the Pleasure Principle. In The Ego and the Id (1923) he would develop his argument to state that "the death instinct would thus seem to express itself—though probably only in part—as an instinct of destruction directed against the external world". The following year he would spell out more clearly that the "libido has the task of making the destroying instinct innocuous, and it fulfils the task by diverting that instinct to a great extent outwards .... The instinct is then called the destructive instinct, the instinct for mastery, or the will to power", a perhaps much more recognisable set of manifestations.

At the close of the decade, in Civilization and Its Discontents (1930), Freud acknowledged that "To begin with it was only tentatively that I put forward the views I have developed here, but in the course of time they have gained such a hold upon me that I can no longer think in any other way".

Philosophy

From a philosophical perspective, the death drive may be viewed in relation to the work of the German philosopher Arthur Schopenhauer. His philosophy, expounded in The World as Will and Representation (1818), postulates that all that exists is a manifestation of a metaphysical "will" (more precisely, a will to live), and that pleasure affirms this will. Schopenhauer's pessimism led him to regard affirmation of the "will" as negative and immoral, owing to his belief that life produces more suffering than happiness. The death drive would seem to manifest as a natural and psychological negation of the "will".

Freud was well aware of such possible linkages. In a letter of 1919, he wrote that regarding "the theme of death, [that I] have stumbled onto an odd idea via the drives and must now read all sorts of things that belong to it, for instance Schopenhauer". Ernest Jones (who like many analysts was not convinced of the need for the death drive, over and above an instinct of aggression) considered that "Freud seemed to have landed in the position of Schopenhauer, who taught that 'death is the goal of life'".

However, as Freud put it to the imagined auditors of his New Introductory Lectures (1932), "You may perhaps shrug your shoulders and say: 'That isn't natural science, it's Schopenhauer's philosophy!' But, ladies and gentlemen, why should not a bold thinker have guessed something that is afterwards confirmed by sober and painstaking detailed research?" He then went on to add that "what we are saying is not even genuine Schopenhauer ... we are not overlooking the fact that there is life as well as death. We recognise two basic instincts and give each of them its own aim".

Cultural application: Civilization and Its Discontents

Freud applied his new theoretical construct in Civilization and Its Discontents (1930) to the difficulties inherent in Western civilization—indeed, in civilization and in social life as a whole. In particular, given that "a portion of the [death] instinct is diverted towards the external world and comes to light as an instinct of aggressiveness", he saw "the inclination to aggression ... [as] the greatest impediment to civilization". The need to overcome such aggression entailed the formation of the [cultural] superego: "We have even been guilty of the heresy of attributing the origin of conscience to this diversion inwards of aggressiveness". The presence thereafter in the individual of the superego and a related sense of guilt—"Civilization, therefore, obtains mastery over the individual's dangerous desire for aggression by ... setting up an agency within him to watch over it"—leaves an abiding sense of uneasiness inherent in civilized life, thereby providing a structural explanation for "the suffering of civilized man".

Freud made a further connection between group life and innate aggression, whereby a group binds itself together more closely by directing aggression toward other groups, an idea later picked up by group analysts like Wilfred Bion.

Continuing development of Freud's views

In the closing decade of Freud's life, it has been suggested, his view of the death drive changed somewhat, with "the stress much more upon the death instinct's manifestations outwards". Given "the ubiquity of non-erotic aggressivity and destructiveness", he wrote in 1930, "I adopt the standpoint, therefore, that the inclination to aggression is an original, self-subsisting instinctual disposition in man".

In 1933, he conceded of his original formulation of the death drive "the improbability of our speculations. A queer instinct, indeed, directed to the destruction of its own organic home!" He wrote moreover that "Our hypothesis is that there are two essentially different classes of instincts: the sexual instincts, understood in the widest sense—Eros, if you prefer that name—and the aggressive instincts, whose aim is destruction". In 1937, he went so far as to suggest privately that "We should have a neat schematic picture if we supposed that originally, at the beginning of life, all libido was directed to the inside and all aggressiveness to the outside". In his last writings, it was the contrast of "two basic instincts, Eros and the destructive instinct ... our two primal instincts, Eros and destructiveness" on which he laid stress. Nevertheless, his belief in "the death instinct ... [as] a return to an earlier state ... into an inorganic state" continued to the end.

Mortido and Destrudo

The terms mortido and destrudo, formed analogously to libido, refer to the energy of the death instinct. Their use amongst Freudian psychoanalysts has been waning in the early 21st century, but they still designate destructive energy. The importance of integrating mortido into an individual's life, as opposed to splitting it off and disowning it, has been taken up by figures like Robert Bly in the men's movement.

Paul Federn used the term mortido for the new energy source, and has generally been followed in that by other analytic writers. His disciple and collaborator Weiss, however, chose destrudo, which was later taken up by Charles Brenner.

Mortido has also been applied in contemporary expositions of the Cabbala.

Whereas Freud himself never named the aggressive and destructive energy of the death drive (as he had done with the life drive, "libido"), the next generation of psychoanalysts vied to find suitable names for it.

Literary criticism has been, if anything, more prepared than psychoanalysis to make at least metaphorical use of the term destrudo. Artistic images were seen by Joseph Campbell in terms of "incestuous 'libido' and patricidal 'destrudo'", while literary descriptions of the conflict between destrudo and libido are still fairly widespread in the 21st century.

Destrudo as an evocative name also appears in rock music and video games.

Paul Federn

Mortido was introduced by Freud's pupil Paul Federn to cover the psychic energy of the death instinct, something left open by Freud himself. Offering what he saw as clinical proof of the reality of the death instinct in 1930, Federn reported on the self-destructive tendencies of severely melancholic patients as evidence of what he would later call inwardly directed mortido.

However, Freud himself favoured neither term, mortido nor destrudo, which worked against either of them gaining widespread acceptance in the psychoanalytic literature.

Edoardo Weiss

Destrudo is a term introduced by Italian psychoanalyst Edoardo Weiss in 1935 to denote the energy of the death instinct, on the analogy of libido—and thus to cover the energy of the destructive impulse in Freudian psychology.

Destrudo is the opposite of libido—the urge to create, an energy that arises from the Eros (or "life") drive—and is the urge to destroy arising from Thanatos (death), and thus an aspect of what Sigmund Freud termed "the aggressive instincts, whose aim is destruction".

Weiss related aggression/destrudo to secondary narcissism, something generally only described in terms of the libido turning towards the self.

Eric Berne

Eric Berne, who was a pupil of Federn's, made extensive use of the term mortido in his pre-transactional analysis study, The Mind in Action (1947). As he wrote in the foreword to the third edition of 1967, "the historical events of the last thirty years...become much clearer by introducing Paul Federn's concept of mortido".

Berne saw mortido as activating such forces as hate and cruelty, blinding anger and social hostilities; and considered that inwardly directed mortido underlay the phenomena of guilt and self-punishment, as well as their clinical exacerbations in the form of depression or melancholia.

Berne saw sexual acts as gratifying mortido at the same time as libido; and recognised that on occasion the former becomes more important sexually than the latter, as in sadomasochism and destructive emotional relationships.

Berne's concern with the role of mortido in individuals and groups, social formations and nations, arguably continued throughout all his later writings.

Jean Laplanche

Jean Laplanche has explored repeatedly the question of mortido, and of how far a distinctive instinct of destruction can be identified in parallel to the forces of libido.

Analytic reception

As Freud wryly commented in 1930, "The assumption of the existence of an instinct of death or destruction has met with resistance even in analytic circles". Indeed, Ernest Jones would comment of Beyond the Pleasure Principle that the book not only "displayed a boldness of speculation that was unique in all his writings" but was "further noteworthy in being the only one of Freud's which has received little acceptance on the part of his followers".

Otto Fenichel, in his compendious survey of the first Freudian half-century, concluded that "the facts on which Freud based his concept of a death instinct in no way necessitate the assumption ... of a genuine self-destructive instinct". Heinz Hartmann set the tone for ego psychology when he "chose to ... do without 'Freud's other, mainly biologically oriented set of hypotheses of the "life" and "death instincts"'". In object relations theory, among the independent group "the most common repudiation was the loathsome notion of the death instinct". Indeed, "for most analysts Freud's idea of a primitive urge towards death, of a primary masochism, was ... bedevilled by problems".

Nevertheless, the concept has been defended, extended, and carried forward by some analysts, generally those tangential to the psychoanalytic mainstream; while among the more orthodox, arguably of "those who, in contrast to most other analysts, take Freud's doctrine of the death drive seriously, K. R. Eissler has been the most persuasive—or least unpersuasive".

Melanie Klein and her immediate followers considered that "the infant is exposed from birth to the anxiety stirred up by the inborn polarity of instincts—the immediate conflict between the life instinct and the death instinct"; and Kleinians indeed built much of their theory of early childhood around the outward deflection of the latter. "This deflection of the death instinct, described by Freud, in Melanie Klein's view consists partly of a projection, partly of the conversion of the death instinct into aggression".

French psychoanalyst Jacques Lacan, for his part, castigated the "refusal to accept this culminating point of Freud's doctrine ... by those who conduct their analysis on the basis of a conception of the ego ... that death instinct whose enigma Freud propounded for us at the height of his experience". Characteristically, he stressed the linguistic aspects of the death drive: "the symbol is substituted for death in order to take possession of the first swelling of life .... There is therefore no further need to have recourse to the outworn notion of primordial masochism in order to understand the reason for the repetitive games in ... his Fort! and in his Da!."

Eric Berne too would proudly proclaim that he, "besides having repeated and confirmed the conventional observations of Freud, also believes right down the line with him concerning the death instinct, and the pervasiveness of the repetition compulsion".

For the twenty-first century, "the death drive today ... remains a highly controversial theory for many psychoanalysts ... [almost] as many opinions as there are psychoanalysts".

Freud's conceptual opposition of death and eros drives in the human psyche was applied by Walter A. Davis in Deracination: Historicity, Hiroshima, and the Tragic Imperative and Death's Dream Kingdom: The American Psyche since 9/11. Davis described social reactions to both Hiroshima and 9/11 from the Freudian viewpoint of the death force, and claims that unless Americans consciously take responsibility for the damage of those reactions, they will repeat them.

Operator (computer programming)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Operator_(computer_programmin...