Saturday, May 22, 2021

Anti-consumerism

From Wikipedia, the free encyclopedia

Anti-consumerism is a sociopolitical ideology that is opposed to consumerism, the continual buying and consuming of material possessions. Anti-consumerism is concerned with the private actions of business corporations in pursuit of financial and economic goals at the expense of the public welfare, especially in matters of environmental protection, social stratification, and ethics in the governing of a society. In politics, anti-consumerism overlaps with environmental activism, anti-globalization, and animal-rights activism; moreover, a conceptual variation of anti-consumerism is post-consumerism, living in a material way that transcends consumerism.

Anti-consumerism arose in response to the problems caused by the long-term mistreatment of human consumers and of the animals consumed, and from the incorporation of consumer education into school curricula; examples of anti-consumerism are the book No Logo (2000) by Naomi Klein, and documentary films such as The Corporation (2003), by Mark Achbar and Jennifer Abbott, and Surplus: Terrorized into Being Consumers (2003), by Erik Gandini; each made anti-corporate activism popular as an ideologically accessible form of civil and political action.

The criticism of economic materialism as a dehumanizing behaviour that is destructive of the Earth, as human habitat, comes from religion and social activism. The religious criticism asserts that materialist consumerism interferes with the connection between the individual and God, and so is an inherently immoral style of life; thus the German historian Oswald Spengler (1880–1936) said that, "Life in America is exclusively economic in structure, and lacks depth." From the Roman Catholic perspective, Thomas Aquinas said that, "Greed is a sin against God, just as all mortal sins, in as much as man condemns things eternal for the sake of temporal things"; in that vein, Francis of Assisi, Ammon Hennacy, and Mohandas Gandhi said that spiritual inspiration guided them towards simple living.

From the secular perspective, social activism holds that from consumerist materialism derive crime (which originates in the poverty of economic inequality), industrial pollution and the consequent environmental degradation, and war as a business.

About the societal discontent born of malaise and hedonism, Pope Benedict XVI said that the philosophy of materialism offers no raison d'ĂȘtre for human existence; likewise, the writer Georges Duhamel said that "American materialism [is] a beacon of mediocrity that threatened to eclipse French civilization".

Background

Anti-consumerism originated from criticism of consumption, starting with Thorstein Veblen, who, in the book The Theory of the Leisure Class: An Economic Study of Institutions (1899), indicated that consumerism dates from the cradle of civilization. The term consumerism also denotes economic policies associated with Keynesian economics, and the belief that the free choice of consumers should dictate the economic structure of a society (cf. producerism).

Politics and society

An anti-consumerist stencil graffiti saying "Consuming consumes you"

Many anti-corporate activists believe the rise of large business corporations poses a threat to the legitimate authority of nation states and the public sphere. They feel corporations are invading people's privacy, manipulating politics and governments, and creating false needs in consumers. They cite as evidence invasive advertising, adware, spam, telemarketing, child-targeted advertising, aggressive guerrilla marketing, massive corporate campaign contributions in political elections, interference in the policies of sovereign nation states (Ken Saro-Wiwa), and news stories about corporate corruption (Enron, for example).

Anti-consumerism protesters point out that the main responsibility of a corporation is to answer only to shareholders, giving human rights and other issues almost no consideration. The management does have a primary responsibility to their shareholders, since any philanthropic activities that do not directly serve the business could be deemed to be a breach of trust. This sort of financial responsibility means that multi-national corporations will pursue strategies to intensify labor and reduce costs. For example, they will attempt to find low wage economies with laws which are conveniently lenient on human rights, the natural environment, trade union organization and so on (see, for example, Nike).

An important contribution to the critique of consumerism has been made by the French philosopher Bernard Stiegler, who argues that modern capitalism is governed by consumption rather than production, and that the advertising techniques used to create consumer behaviour amount to the destruction of psychic and collective individuation. The diversion of libidinal energy toward the consumption of consumer products, he argues, results in an addictive cycle of consumption, leading to hyper-consumption, the exhaustion of desire, and the reign of symbolic misery.

In art, Banksy, an influential British graffiti master, painter, activist, filmmaker and all-purpose provocateur, has created satirical and provocative works about consumerist society (notable examples include "Napalm", also known as "Can't Beat The Feelin'", an attack on Walt Disney Pictures and McDonald's, and "Death By Swoosh", directed at Nike). Working undercover, the secretive street artist challenges social ideas and goads viewers into rethinking their surroundings and acknowledging the absurdities of closely held preconceptions. In his own words: "You owe the companies nothing. Less than nothing, you especially don't owe them any courtesy. They owe you. They have re-arranged the world to put themselves in front of you. They never asked for your permission, don't even start asking for theirs." After 2003, Banksy wrote to The New Yorker by e-mail: "I give away thousands of paintings for free. I don't think it's possible to make art about world poverty and trouser all the cash." Banksy believes that there is a consumerist shift in art, and that for the first time the bourgeois world of art belongs to the people. On his website, he provides high-resolution images of his work for free downloading.

Conspicuous consumption

It is preoccupation with possessions, more than anything else, that prevents us from living freely and nobly.
Bertrand Russell

Trying to reduce environmental pollution without reducing consumerism is like combating drug trafficking without reducing the drug addiction.
Jorge Majfud

In many critical contexts, the term consumerism describes the tendency of people to identify strongly with products or services they consume, especially those with commercial brand names and obvious status-enhancing appeal, such as expensive automobiles or jewelry. It is a pejorative label which most people deny, offering some more specific excuse or rationalization for consumption other than the idea that they are "compelled to consume". A culture with a high amount of consumerism is referred to as a consumer culture.

To those who embrace the idea of consumerism, these products are not seen as valuable in themselves, but rather as social signals that allow them to identify like-minded people through the consumption and display of similar products. Few would yet go so far, though, as to admit that their relationships with a product or brand name could be substitutes for the healthy human relationships that are sometimes lacking in a dysfunctional modern society.

The older term, conspicuous consumption, described the United States in the 1960s, but was soon linked to larger debates about media influence, culture jamming, and its corollary productivism.

Anti-consumerist stencil art

The term and concept of conspicuous consumption originated at the turn of the 20th century in the writing of economist Thorstein Veblen. The term describes an apparently irrational and confounding form of economic behaviour. Veblen's scathing proposal that this unnecessary consumption is a form of status display is made in darkly humorous observations like the following, from his 1899 book, The Theory of the Leisure Class:

It is true of dress in even a higher degree than of most other items of consumption, that people will undergo a very considerable degree of privation in the comforts or the necessaries of life in order to afford what is considered a decent amount of wasteful consumption; so that it is by no means an uncommon occurrence, in an inclement climate, for people to go ill clad in order to appear well dressed.

In 1955, economist Victor Lebow stated (as quoted by William Rees, 2009):

Our enormously productive economy demands that we make consumption our way of life, that we convert the buying and use of goods into rituals, that we seek our spiritual satisfaction and our ego satisfaction in consumption. We need things consumed, burned up, worn out, replaced and discarded at an ever-increasing rate.

According to archaeologists, evidence of conspicuous consumption dating back several millennia has been found, suggesting that such behavior is inherent to humans.

Consumerism and advertising

Anti-consumerists believe advertising plays a huge role in human life by informing values and assumptions of the cultural system, deeming what is acceptable and determining social standards. They declare that ads create a hyper-real world where commodities appear as the key to securing happiness. Anti-consumerists cite studies that find that individuals believe their quality of life improves in relation to social values that lie outside the capability of the market place. Therefore, advertising attempts to equate the social with the material by utilizing images and slogans to link commodities with the real sources of human happiness, such as meaningful relationships. Ads are then a detriment to society because they tell consumers that accumulating more and more possessions will bring them closer to self-actualization, or the concept of a complete and secure being. "The underlying message is that owning these products will enhance our image and ensure our popularity with others." And while advertising promises that a product will make the consumer happy, advertising simultaneously depends upon the consumer never being truly happy, as then the consumer would no longer feel the need to consume needless products.

Anti-consumerists claim that in a consumerist society, advertisement images disempower and objectify the consumer. By stressing individual power, choice and desire, advertising falsely implies that control lies with the consumer. Because anti-consumerists believe commodities supply only short-term gratification, they detract from a sustainably happy society. Further, advertisers have resorted to new techniques of capturing attention, such as the increased speed of ads and product placements. In this way, commercials infiltrate the consumerist society and become an inextricable part of culture. Anti-consumerists condemn advertising because it constructs a simulated world that offers fantastical escapism to consumers, rather than reflecting actual reality. They further argue that ads depict the interests and lifestyles of the elite as natural, cultivating a deep sense of inadequacy among viewers. They denounce the use of beautiful models because they glamorize the commodity beyond the reach of the average individual.

In an opinion segment of New Scientist magazine published in August 2009, reporter Andy Coghlan cited William Rees of the University of British Columbia and epidemiologist Warren Hern of the University of Colorado at Boulder, saying that human beings, despite considering themselves civilized thinkers, are "subconsciously still driven by an impulse for survival, domination and expansion... an impulse which now finds expression in the idea that inexorable economic growth is the answer to everything, and, given time, will redress all the world's existing inequalities." According to figures presented by Rees at the annual meeting of the Ecological Society of America, human society is in a "global overshoot", consuming 30% more material than is sustainable from the world's resources. Rees went on to state that at present, 85 countries are exceeding their domestic "bio-capacities", and compensate for their lack of local material by depleting the stocks of other countries.

Austrian economics

Advocates of Austrian economics focus on the entrepreneur, promoting a productive lifestyle rather than a materialistic one in which the individual is defined by things rather than the self.

Criticism

Critics of anti-consumerism have accused anti-consumerists of opposing modernity or utilitarianism, and have argued that anti-consumerism can lead to elitism. This criticism comes primarily from libertarians, who argue that every person should decide their level of consumption independent of outside influence. Right-wing critics see anti-consumerism as rooted in socialism. In 1999, the right-libertarian magazine Reason attacked anti-consumerism, claiming that Marxist academics were repackaging themselves as anti-consumerists. James B. Twitchell, a professor at the University of Florida and popular writer, referred to anti-consumerist arguments as "Marxism Lite".

There have also been socialist critics of anti-consumerism who see it as a form of anti-modern "reactionary socialism", and state that anti-consumerism has also been adopted by ultra-conservatives and fascists.

In popular media

In Fight Club, the protagonist finds himself participating in terroristic acts against corporate society and consumer culture.

In Mr. Robot, Elliot Alderson, a young cybersecurity engineer, joins a hacker group known as fsociety, which aims to crash the U.S. economy, eliminating all debt.

In the novel American Psycho by Bret Easton Ellis, the protagonist Patrick Bateman criticizes the consumerist society of 1980s America, of which he is a personification. He later goes on a killing spree without consequence, suggesting that the people around him are so self-absorbed and focused on consuming that they either do not see or do not care about his acts.

Internet culture

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Internet_culture

Internet culture, or cyberculture, is a culture describing the many manifestations of the use of computer networks for communication, entertainment, business, and recreation. Some features of Internet culture include online communities, gaming, and social media, as well as topics related to identity and privacy. Due to the internet's large-scale use and adoption, the impacts of internet culture on society and on non-digital cultures have been widespread. Additionally, because of the all-encompassing nature of the internet and internet culture, different facets of internet culture, such as social media, gaming, and specific communities, are often studied individually rather than holistically.

Overview

The internet is one gigantic well-stocked fridge ready for raiding; for some strange reason, people go up there and just give stuff away.
Mega 'Zines, Macworld (1995)

Since the boundaries of cyberculture are difficult to define, the term is used flexibly, and its application to specific circumstances can be controversial. It generally refers at least to the cultures of virtual communities, but extends to a wide range of cultural issues relating to "cyber-topics", e.g. cybernetics, and the perceived or predicted cyborgization of the human body and human society itself. It can also embrace associated intellectual and cultural movements, such as cyborg theory and cyberpunk. The term often incorporates an implicit anticipation of the future.

The Oxford English Dictionary lists the earliest usage of the term "cyberculture" in 1963, when Alice Mary Hilton wrote the following: "In the era of cyberculture, all the plows pull themselves and the fried chickens fly right onto our plates." This example, and all others up through 1995, are used to support the definition of cyberculture as "the social conditions brought about by automation and computerization." The American Heritage Dictionary broadens the sense in which "cyberculture" is used by defining it as "the culture arising from the use of computer networks, as for communication, entertainment, work, and business". However, both the OED and the American Heritage Dictionary fail to describe cyberculture as a culture within and among users of computer networks. This cyberculture may be purely an online culture or it may span both virtual and physical worlds. That is to say, cyberculture is a culture endemic to online communities; it is not just the culture that results from computer use, but culture that is directly mediated by the computer. Another way to envision cyberculture is as the electronically enabled linkage of like-minded, but potentially geographically disparate (or physically disabled and hence less mobile), persons.

Cyberculture is a wide social and cultural movement closely linked to advanced information science and information technology, and to their emergence, development, and rise to social and cultural prominence between the 1960s and the 1990s. Cyberculture was influenced at its genesis by those early users of the internet, frequently including the architects of the original project, who were often guided in their actions by the hacker ethic. While early cyberculture was based on a small cultural sample and its ideals, modern cyberculture comprises a much more diverse group of users and the ideals that they espouse.

Numerous specific concepts of cyberculture have been formulated by authors such as Lev Manovich, Arturo Escobar and Fred Forest. However, most of these concepts concentrate only on certain aspects, and do not cover these in great detail. Some authors aiming at a more comprehensive understanding distinguish between early and contemporary cyberculture (Jakub Macek), or between cyberculture as the cultural context of information technology and cyberculture (more specifically, cyberculture studies) as "a particular approach to the study of the 'culture + technology' complex" (David Lister et al.).

Manifestations

Manifestations of cyberculture include various human interactions mediated by computer networks. They can be activities, pursuits, games, places, and metaphors, and include a diverse base of applications. Some are supported by specialized software while others work on commonly accepted internet protocols. Examples include, but are not limited to, online communities, gaming, and social media.

Social impact

The Internet is one of the most popular forms of communication today, with billions of people using it every day. This is because the internet offers a wide variety of tools for information retrieval and communication, whether between individuals, within groups, or in mass contexts. It has created a culture in which many people participate, with countless positive and negative impacts.

Positive Impacts

The creation of the Internet has impacted society greatly, giving us the ability to communicate with others online, store information such as files and pictures, and help maintain our government. As the Internet progressed, digital audio and video files could be created and shared, it became one of the main sources of information, business, and entertainment, and it led to the creation of social media platforms such as Instagram, Twitter, Facebook and Snapchat. Communicating with others has never been easier, allowing people to connect and interact with each other. The Internet helps us maintain relationships by acting as a supplement to physical interactions with our friends and family. People are also able to create forums and discuss different topics with each other, which can help form and build relationships and gives people the ability to express their own views freely. Social groups created on the Internet have also been connected to improving and maintaining health in general. Interacting with social groups online can help prevent, and possibly treat, depression. In response to the rising prevalence of mental health disorders, including anxiety and depression, a 2019 study by Christo El Morr and others demonstrated that York University students in Toronto were extremely interested in participating in an online mental health support community. The study mentions that many students prefer an anonymous online mental health community to a traditional in-person service, due to the social stigmatization of mental health disorders. Overall, online communication gives people the sense that they are wanted and welcomed into social groups.

Negative Impacts

As access to the Internet has become easier, a substantial number of disadvantages have emerged. Addiction is becoming a significant problem because the Internet is increasingly relied on for tasks such as communication, commerce, and education, and a range of symptoms are connected to addiction, such as withdrawal, anxiety, and mood swings. Addiction to social media is especially prevalent among adolescents, and the interactions it involves can be detrimental to their health. Rude comments on posts can lower individuals' self-esteem, making them feel unworthy, and may lead to depression. For some people, social interaction online may substitute for face-to-face interaction instead of supplementing it, which can harm social skills and cause feelings of loneliness. People also risk being cyberbullied when using online applications. Cyberbullying may include harassment, video shaming, impersonation, and much more, and is particularly intense towards members of groups deemed "cringy" by internet culture at large. A concept called cyberbullying theory is now used to describe the finding that children who use social networking more frequently are more likely to become victims of cyberbullying. Additionally, some evidence shows that too much internet use can stunt memory and attention development in children: the ease of access to information which the internet provides discourages information retention, although the cognitive consequences are not yet fully known. Finally, the staggering amount of information available online can lead to feelings of information overload, whose effects include reduced comprehension, decision making, and behavior control.

Qualities

First and foremost, cyberculture derives from traditional notions of culture, as the roots of the word imply. In non-cyberculture, it would be odd to speak of a single, monolithic culture. In cyberculture, by extension, searching for a single thing that is cyberculture would likely be problematic. The notion that there is a single, definable cyberculture is likely the result of the early dominance of cyber territory by affluent North Americans. Writing by early proponents of cyberspace tends to reflect this assumption.

The ethnography of cyberspace is an important aspect of cyberculture that does not reflect a single unified culture. It "is not a monolithic or placeless 'cyberspace'; rather, it is numerous new technologies and capabilities, used by diverse people, in diverse real-world locations." It is malleable, perishable, and can be shaped by the vagaries of external forces on its users. For example, the laws of physical world governments, social norms, the architecture of cyberspace, and market forces shape the way cybercultures form and evolve. As with physical world cultures, cybercultures lend themselves to identification and study.

There are several qualities that cybercultures share that make them warrant the prefix "cyber-". Some of those qualities are that cyberculture:

  • Is a community mediated by ICTs.
  • Is culture "mediated by computer screens".
  • Relies heavily on the notion of information and knowledge exchange.
  • Depends on the ability to manipulate tools to a degree not present in other forms of culture (even artisan culture, e.g., a glass-blowing culture).
  • Allows vastly expanded weak ties and has been criticized for overly emphasizing the same (see Bowling Alone and other works).
  • Multiplies the number of eyeballs on a given problem, beyond that which would be possible using traditional means, given physical, geographic, and temporal constraints.
  • Is a "cognitive and social culture, not a geographic one".
  • Is "the product of like-minded people finding a common 'place' to interact."
  • Is inherently more "fragile" than traditional forms of community and culture (John C. Dvorak).

Thus, cyberculture can be generally defined as the set of technologies (material and intellectual), practices, attitudes, modes of thought, and values that developed with cyberspace.

Sharing has been argued to be an important quality of Internet culture.

Identity – "Architectures of credibility"

Cyberculture, like culture in general, relies on establishing identity and credibility. However, in the absence of direct physical interaction, it could be argued that the process for such establishment is more difficult.

One early study, conducted in 1998–1999, found that participants viewed information obtained online as slightly more credible than information from magazines, radio, and television. However, the same study found that participants viewed information obtained from newspapers as the most credible, on average. Finally, the study found that individuals' rates of verification of information obtained online were low, and perhaps over-reported, depending on the type of information.

How does cyberculture rely on and establish identity and credibility? The relationship is two-way: identity and credibility are both used to define the community in cyberspace and are themselves created within and by online communities.

In some senses, online credibility is established in much the same way that it is established in the offline world; however, since these are two separate worlds, it is not surprising that there are differences in their mechanisms and interactions of the markers found in each.

Following the model put forth by Lawrence Lessig in Code: Version 2.0, the architecture of a given online community may be the single most important factor regulating the establishment of credibility within online communities. Some factors, gathered into a small sketch after this list, may be:

  • Anonymous versus Known
  • Linked to Physical Identity versus Internet-based Identity Only
  • Unrated Commentary System versus Rated Commentary System
  • Positive Feedback-oriented versus Mixed Feedback (positive and negative) oriented
  • Moderated versus Unmoderated
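To make this design space concrete, the factors above can be collected into a small configuration object. The following Python sketch is purely illustrative; the class and field names are ours, not Lessig's:

    from dataclasses import dataclass
    from enum import Enum, auto

    class Identity(Enum):
        ANONYMOUS = auto()            # e.g. comments posted as "guest"
        INTERNET_ONLY = auto()        # a persistent online persona
        LINKED_TO_PHYSICAL = auto()   # tied to a real-world identity

    class Feedback(Enum):
        POSITIVE_ONLY = auto()
        MIXED = auto()                # both positive and negative ratings

    @dataclass
    class CommunityArchitecture:
        """One point in the design space listed above (names are illustrative)."""
        identity: Identity
        rated_commentary: bool        # can comments themselves be rated?
        feedback: Feedback
        moderated: bool
        proactive_moderation: bool = False  # editor reviews before publication

    # A Slashdot-like forum: pseudonymous, rated, mixed feedback,
    # moderated reactively rather than proactively.
    slashdot_like = CommunityArchitecture(
        identity=Identity.INTERNET_ONLY,
        rated_commentary=True,
        feedback=Feedback.MIXED,
        moderated=True,
    )

The subsections below examine each of these dimensions in turn.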

Anonymous versus known

Many sites allow anonymous commentary, where the user-id attached to the comment is something like "guest" or "anonymous user". In an architecture that allows anonymous posting about other works, the credibility being impacted is only that of the product for sale, the original opinion expressed, the code written, the video, or other entity about which comments are made (e.g., a Slashdot post). Sites that require "known" postings vary widely, from simply requiring some kind of name to be associated with the comment to requiring registration, wherein the identity of the registrant is visible to other readers of the comment. These "known" identities allow and even require commentators to be aware of their own credibility, based on the fact that other users will associate particular content and styles with their identity. By definition, then, all blog postings are "known" in that the blog exists in a consistently defined virtual location, which helps to establish an identity, around which credibility can gather. Conversely, anonymous postings carry no inherent credibility. Note that a "known" identity need have nothing to do with a given identity in the physical world.

Linked to physical identity versus internet-based identity only

Architectures can require that physical identity be associated with commentary, as in Lessig's example of Counsel Connect. However, to require linkage to physical identity, many more steps must be taken (collecting and storing sensitive information about a user), and safeguards for that collected information must be established; the users must place more trust in the sites collecting the information (yet another form of credibility). Irrespective of safeguards, as with Counsel Connect, using physical identities links credibility across the frames of the internet and real space, influencing the behaviors of those who contribute in those spaces. However, even purely internet-based identities have credibility. Just as Lessig describes linkage to a character in a particular online gaming environment, nothing inherently links a person or group to their internet-based persona, but credibility (similar to "characters") is "earned rather than bought, and because this takes time and (credibility is) not fungible, it becomes increasingly hard" to create a new persona.

Unrated commentary system versus rated commentary system

In some architectures, those who review or offer comments can, in turn, be rated by other users. This technique offers the ability to regulate the credibility of given authors by subjecting their comments to direct "quantifiable" approval ratings.
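As a toy illustration of how such "quantifiable" approval ratings might feed back into an author's credibility, the sketch below simply averages the ratings an author's comments have received. The scoring rule is invented for illustration; real rated-commentary systems typically weight raters by their own credibility and discount older ratings:

    from statistics import mean

    def author_credibility(ratings_per_comment: dict[str, list[int]]) -> float:
        """Average all 1-5 star ratings across an author's comments (toy rule)."""
        all_ratings = [r for rs in ratings_per_comment.values() for r in rs]
        return mean(all_ratings) if all_ratings else 0.0

    # Two comments by one author, each rated by other users.
    print(author_credibility({"comment-1": [5, 4, 4], "comment-2": [3, 5]}))  # 4.2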

Positive feedback-oriented versus mixed feedback (positive and negative) oriented

Architectures can be oriented around positive feedback or a mix of both positive and negative feedback. While a particular user may be able to equate fewer stars with a "negative" rating, the semantic difference is potentially important. The ability to actively rate an entity negatively may violate laws or norms that matter in the jurisdiction in which the internet property operates. The more public a site, the more important this concern may be, as noted by Goldsmith & Wu regarding eBay.

Moderated versus unmoderated

Architectures can also be oriented to give editorial control to a group or individual. Many email lists work in this fashion (e.g., Freecycle). In these situations, the architecture usually allows, but does not require, that contributions be moderated. Further, moderation may take two different forms: reactive or proactive. In the reactive mode, an editor removes posts, reviews, or content that is deemed offensive after it has been placed on the site or list. In the proactive mode, an editor must review all contributions before they are made public.
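The difference between the two modes comes down to where the editorial check sits relative to publication, which the following sketch makes explicit (function names are illustrative, not drawn from any particular mailing-list software):

    # Proactive mode: an editor must approve content before it goes public.
    def moderate_proactively(submission: str, approve) -> list[str]:
        return [submission] if approve(submission) else []

    # Reactive mode: content goes live at once and is removed if later flagged.
    def moderate_reactively(submission: str, is_offensive) -> list[str]:
        published = [submission]
        if is_offensive(submission):
            published.remove(submission)
        return published

    contains_spam = lambda text: "spam" in text
    print(moderate_proactively("hello list", approve=lambda t: not contains_spam(t)))  # ['hello list']
    print(moderate_reactively("buy spam now", is_offensive=contains_spam))             # []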

In a moderated setting, credibility is often given to the moderator. However, that credibility can be damaged by appearing to edit in a heavy-handed way, whether reactive or proactive (as experienced by digg.com). In an unmoderated setting, credibility lies with the contributors alone. The very existence of an architecture allowing moderation may lend credibility to the forum being used (as in Howard Rheingold's examples from the WELL), or it may take away credibility (as with corporate web sites that post feedback but edit it heavily).

Cyberculture studies

The field of cyberculture studies examines the topics explained above, including the communities emerging within the networked spaces sustained by the use of modern technology. Students of cyberculture engage with the political, philosophical, sociological, and psychological issues that arise from the networked interactions of human beings who act in various relations to information science and technology.

Donna Haraway, Sadie Plant, Manuel De Landa, Bruce Sterling, Kevin Kelly, Wolfgang Schirmacher, Pierre LĂ©vy, David Gunkel, Victor J. Vitanza, Gregory Ulmer, Charles D. Laughlin, and Jean Baudrillard are among the key theorists and critics who have produced relevant work that speaks to, or has influenced studies in, cyberculture. Following the lead of Rob Kitchin, in his work Cyberspace: The World in the Wires, cyberculture might be viewed from different critical perspectives. These perspectives include futurism or techno-utopianism, technological determinism, social constructionism, postmodernism, poststructuralism, and feminist theory.

Ischemia

From Wikipedia, the free encyclopedia
 
Ischemia
Other names: ischaemia, ischĂŠmia
Vascular ischemia of the toes with characteristic cyanosis
Specialty: Vascular surgery

Ischemia or ischaemia is a restriction in blood supply to tissues, causing a shortage of the oxygen needed for cellular metabolism (to keep tissue alive). Ischemia is generally caused by problems with blood vessels, with resultant damage to or dysfunction of tissue, i.e., hypoxia and microvascular dysfunction. It also means local anemia in a given part of the body, sometimes resulting from constriction (such as vasoconstriction, thrombosis or embolism). Ischemia comprises not only insufficiency of oxygen, but also reduced availability of nutrients and inadequate removal of metabolic wastes. Ischemia can be partial (poor perfusion) or total.

Signs and symptoms

Since oxygen is carried to tissues in the blood, insufficient blood supply causes tissue to become starved of oxygen. In the highly metabolically active tissues of the heart and brain, irreversible damage to tissues can occur in as little as 3–4 minutes at body temperature. The kidneys are also quickly damaged by loss of blood flow (renal ischemia). Tissues with slower metabolic rates may undergo irreversible damage after 20 minutes.

Clinical manifestations of acute limb ischemia (which can be summarized as the "six P's") include pain, pallor, pulselessness, paresthesia, paralysis, and poikilothermia.

Without immediate intervention, ischemia may progress quickly to tissue necrosis and gangrene within a few hours. Paralysis is a very late sign of acute arterial ischemia and signals the death of nerves supplying the extremity. Foot drop may occur as a result of nerve damage. Because nerves are extremely sensitive to hypoxia, limb paralysis or ischemic neuropathy may persist after revascularization and may be permanent.

Cardiac ischemia

Cardiac ischemia may be asymptomatic or may cause chest pain, known as angina pectoris. It occurs when the heart muscle, or myocardium, receives insufficient blood flow. This most frequently results from atherosclerosis, which is the long-term accumulation of cholesterol-rich plaques in the coronary arteries. Ischemic heart disease is the most common cause of death in most Western countries and a major cause of hospital admissions.

Bowel

Both large and small bowel can be affected by ischemia. Ischemia of the large intestine may result in an inflammatory process known as ischemic colitis. Ischemia of the small bowel is called mesenteric ischemia.

Brain

Brain ischemia is insufficient blood flow to the brain, and can be acute or chronic. Acute ischemic stroke is a neurologic emergency that may be reversible if treated rapidly. Chronic ischemia of the brain may result in a form of dementia called vascular dementia. A brief episode of ischemia affecting the brain is called a transient ischemic attack (TIA), often called a mini-stroke. 10% of TIAs will develop into a stroke within 90 days, half of which will occur in the first two days following the TIA.[10]

Limb

Lack of blood flow to a limb results in acute limb ischemia.

Cutaneous

Reduced blood flow to the skin layers may result in mottling or uneven, patchy discoloration of the skin.

Kidney ischemia

Kidney ischemia is a loss of blood flow to the kidney cells. Physical symptoms include shrinkage of one or both kidneys, renovascular hypertension, acute renal failure, progressive azotemia, and acute pulmonary edema. It is a disease with a high mortality rate and high morbidity. Left untreated, it can cause chronic kidney disease and the need for renal surgery.

Causes

Ischemia is a vascular disease involving an interruption in the arterial blood supply to a tissue, organ, or extremity that, if untreated, can lead to tissue death. It can be caused by embolism, thrombosis of an atherosclerotic artery, or trauma. Venous problems like venous outflow obstruction and low-flow states can cause acute arterial ischemia. An aneurysm is one of the most frequent causes of acute arterial ischemia. Other causes are heart conditions including myocardial infarction, mitral valve disease, chronic atrial fibrillation, cardiomyopathies, and prosthesis, in all of which thrombi are prone to develop.

Occlusion

The thrombi may dislodge and may travel anywhere in the circulatory system, where they may lead to pulmonary embolus, an acute arterial occlusion causing the oxygen and blood supply distal to the embolus to decrease suddenly. The degree and extent of symptoms depend on the size and location of the obstruction, the occurrence of clot fragmentation with embolism to smaller vessels, and the degree of peripheral arterial disease (PAD).

Trauma

Traumatic injury to an extremity may produce partial or total occlusion of a vessel from compression, shearing, or laceration. Acute arterial occlusion may develop as a result of arterial dissection in the carotid artery or aorta or as a result of iatrogenic arterial injury (e.g., after angiography).

Other

An inadequate flow of blood to a part of the body may be caused by any of the following:

Pathophysiology

Native records of contractile activity of the left ventricle of isolated rat heart perfused under Langendorff technique. Curve A - contractile function of the heart is greatly depressed after ischemia-reperfusion. Curve B - a set of short ischemic episodes (ischemic preconditioning) before prolonged ischemia provides functional recovery of contractile activity of the heart at reperfusion.

Ischemia results in tissue damage in a process known as ischemic cascade. The damage is the result of the build-up of metabolic waste products, inability to maintain cell membranes, mitochondrial damage, and eventual leakage of autolyzing proteolytic enzymes into the cell and surrounding tissues.

Restoration of blood supply to ischemic tissues can cause additional damage known as reperfusion injury, which can be more damaging than the initial ischemia. Reintroduction of blood flow brings oxygen back to the tissues, causing a greater production of free radicals and reactive oxygen species that damage cells. It also brings more calcium ions to the tissues, causing further calcium overloading, which can result in potentially fatal cardiac arrhythmias and also accelerates cellular self-destruction. The restored blood flow also exaggerates the inflammation response of damaged tissues, causing white blood cells to destroy damaged cells that might otherwise still be viable.

Treatment

Early treatment is essential to keep the affected limb viable. The treatment options include injection of an anticoagulant, thrombolysis, embolectomy, surgical revascularisation, or partial amputation. Anticoagulant therapy is initiated to prevent further enlargement of the thrombus. Continuous IV unfractionated heparin has been the traditional agent of choice.

If the condition of the ischemic limb is stabilized with anticoagulation, recently formed emboli may be treated with catheter-directed thrombolysis using intra-arterial infusion of a thrombolytic agent (e.g., recombinant tissue plasminogen activator (tPA), streptokinase, or urokinase). A percutaneous catheter inserted into the femoral artery and threaded to the site of the clot is used to infuse the drug. Unlike anticoagulants, thrombolytic agents work directly to resolve the clot over a period of 24 to 48 hours.

Direct arteriotomy may be necessary to remove the clot. Surgical revascularization may be used in the setting of trauma (e.g., laceration of the artery). Amputation is reserved for cases where limb salvage is not possible. If the patient continues to have a risk of further embolization from some persistent source, such as chronic atrial fibrillation, treatment includes long-term oral anticoagulation to prevent further acute arterial ischemic episodes.

Decrease in body temperature reduces the aerobic metabolic rate of the affected cells, reducing the immediate effects of hypoxia. Reduction of body temperature also reduces the inflammation response and reperfusion injury. For frostbite injuries, limiting thawing and warming of tissues until warmer temperatures can be sustained may reduce reperfusion injury.

Ischemic stroke is at times treated with various levels of statin therapy at hospital discharge, followed by home time, in an attempt to lower the risk of adverse events.

Society and culture

The Infarct Combat Project (ICP) is an international nonprofit organization founded in 1998 to fight ischemic heart diseases through education and research.

Etymology and pronunciation

The word ischemia (/ÉȘˈskiːmiə/) is from Greek ጎσχαÎčÎŒÎżÏ‚ iskhaimos, "staunching blood", from ጎσχω iskhƍ, "keep back, restrain", and αጷΌα haima, "blood".

Vitamin deficiency

From Wikipedia, the free encyclopedia
  
Vitamin deficiency
Other names: avitaminosis, hypovitaminosis
Specialty: Endocrinology

Vitamin deficiency is the condition of a long-term lack of a vitamin. When caused by not enough vitamin intake it is classified as a primary deficiency, whereas when due to an underlying disorder such as malabsorption it is called a secondary deficiency. An underlying disorder may be metabolic – as in a genetic defect for converting tryptophan to niacin – or the result of lifestyle choices that increase vitamin needs, such as smoking or drinking alcohol. Government guidelines on vitamin deficiencies advise certain intakes for healthy people, with specific values for women, men, babies, the elderly, and during pregnancy or breastfeeding. Many countries have mandated vitamin food fortification programs to prevent commonly occurring vitamin deficiencies.

Conversely, hypervitaminosis refers to symptoms caused by vitamin intakes in excess of needs, especially for fat-soluble vitamins that can accumulate in body tissues.

The history of the discovery of vitamin deficiencies progressed over centuries from observations that certain conditions – for example, scurvy – could be prevented or treated with certain foods having high content of a necessary vitamin, to the identification and description of specific molecules essential for life and health. During the 20th century, several scientists were awarded the Nobel Prize in Physiology or Medicine or the Nobel Prize in Chemistry for their roles in the discovery of vitamins.

Defining deficiency

A number of regions, including Japan, the European Union, the United States, and Canada, have published guidelines defining vitamin deficiencies and advising specific intakes for healthy people, with different recommendations for women, men, infants, the elderly, and during pregnancy and breastfeeding. These documents have been updated as research is published. In the US, Recommended Dietary Allowances (RDAs) were first set in 1941 by the Food and Nutrition Board of the National Academy of Sciences, with periodic updates culminating in the Dietary Reference Intakes. Updated in 2016, the US Food and Drug Administration published a set of tables that define Estimated Average Requirements (EARs) and Recommended Dietary Allowances (RDAs); RDAs are set higher to cover people with higher-than-average needs. Together, these are part of the Dietary Reference Intakes. For a few vitamins, there is not sufficient information to set EARs and RDAs; for these, an Adequate Intake is shown, based on the assumption that what healthy people consume is sufficient. Countries do not always agree on the amounts of vitamins needed to safeguard against deficiency. For example, for vitamin C, the RDAs for women in Japan, the European Union (where they are called Population Reference Intakes) and the US are 100, 95 and 75 mg/day, respectively, while India sets its recommendation at 40 mg/day.
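The spread in national recommendations is easy to see when the vitamin C figures quoted above are tabulated, as in this small sketch that uses only the values given in this paragraph:

    # Vitamin C recommendations for women (mg/day), from the figures above.
    VITAMIN_C_FOR_WOMEN_MG = {
        "Japan": 100,
        "European Union": 95,   # called Population Reference Intakes in the EU
        "United States": 75,
        "India": 40,
    }

    highest = max(VITAMIN_C_FOR_WOMEN_MG, key=VITAMIN_C_FOR_WOMEN_MG.get)
    lowest = min(VITAMIN_C_FOR_WOMEN_MG, key=VITAMIN_C_FOR_WOMEN_MG.get)
    ratio = VITAMIN_C_FOR_WOMEN_MG[highest] / VITAMIN_C_FOR_WOMEN_MG[lowest]
    print(f"{highest} recommends {ratio:.1f}x the {lowest} figure")
    # Japan recommends 2.5x the India figure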

Individual vitamin deficiencies

Water-soluble vitamins

  ‱ Thiamine (Vitamin B1) deficiency is especially common in countries that do not require fortification of wheat and maize flour and rice to replace the naturally occurring thiamine content lost to milling, bleaching and other processing. Severe deficiency causes beriberi, which became prevalent in Asia as more people adopted a diet primarily of white rice. Wernicke encephalopathy and Korsakoff syndrome are forms of beriberi. Alcoholism can also cause vitamin deficiency. Symptoms of deficiency include weight loss, emotional disturbances, impaired sensory perception, weakness and pain in the limbs, and periods of irregular heart beat. Long-term deficiencies can be life-threatening. Deficiency is assessed by red blood cell status and urinary output.
  ‱ Riboflavin (Vitamin B2) deficiency is especially common in countries that do not require fortification of wheat and maize flour and rice to replace the naturally occurring riboflavin lost during processing. Deficiency causes painful red tongue with sore throat, chapped and cracked lips, and inflammation at the corners of the mouth (angular cheilitis). Eyes can be itchy, watery, bloodshot and sensitive to light. Riboflavin deficiency also causes anemia with red blood cells that are normal in size and hemoglobin content, but reduced in number. This is distinct from anemia caused by deficiency of folic acid or vitamin B12.
  ‱ Niacin (Vitamin B3) deficiency causes pellagra, a reversible nutritional wasting disease characterized by four classic symptoms often referred to as the four Ds: diarrhea, dermatitis, dementia, and death. The dermatitis occurs on areas of skin exposed to sunlight, such as the backs of hands and neck. Niacin deficiency is a consequence of a diet low in both niacin and the amino acid tryptophan, a precursor for the vitamin. Chronic alcoholism is a contributing risk factor. Low plasma tryptophan is a non-specific indicator, meaning it can have other causes. The signs and symptoms of niacin deficiency start to revert within days of oral supplementation with large amounts of the vitamin.
  ‱ Pantothenic acid (Vitamin B5) deficiency is extremely rare. Symptoms include irritability, fatigue, and apathy.
  • Vitamin B6 deficiency is uncommon, although it may be observed in certain conditions, such as end-stage kidney diseases or malabsorption syndromes, such as celiac disease, Crohn disease or ulcerative colitis. Signs and symptoms include microcytic anemia, electroencephalographic abnormalities, dermatitis, depression and confusion.
  • Biotin (Vitamin B7) deficiency is rare, although biotin status can be compromised in alcoholics and during pregnancy and breastfeeding. Decreased urinary excretion of biotin and increased urinary excretion of 3-hydroxyisovaleric acid are better indicators of biotin deficiency than concentration in the blood. Deficiency affects hair growth and skin health.
  • Folate (Vitamin B9) deficiency is common, and associated with numerous health problems, but primarily with neural tube defects (NTDs) in infants when the mother's plasma concentrations were low during the first third of pregnancies. Government-mandated fortification of foods with folic acid has reduced the incidence of NTDs by 25% to 50% in more than 60 countries using such fortification. Deficiency can also result from rare genetic factors, such as mutations in the MTHFR gene that lead to compromised folate metabolism. Cerebral folate deficiency is a rare condition in which concentrations of folate are low in the brain despite being normal in the blood.
  ‱ Vitamin B12 deficiency can lead to pernicious anemia, megaloblastic anemia, subacute combined degeneration of the spinal cord, and methylmalonic acidemia, among other conditions. Supplementation with folate can mask vitamin B12 deficiency. Consuming a vegan diet increases the risk, since vitamin B12 is found only in foods and drinks made from animal products, including eggs and dairy products.
  ‱ Vitamin C deficiency is rare, and consequently no countries fortify foods as a means of preventing it. The historic importance of vitamin C deficiency relates to its occurrence on long sea-going voyages, when ship food supplies had no good source of the vitamin. Deficiency results in scurvy when plasma concentrations fall below 0.2 mg/dL, whereas the normal plasma concentration range is 0.4 to 1.5 mg/dL. Deficiency leads to weakness, weight loss and general aches and pains. Longer-term depletion affects connective tissues, causing severe gum disease and bleeding from the skin.

Fat-soluble vitamins

  • Vitamin A deficiency can cause nyctalopia (night blindness) and keratomalacia, the latter leading to permanent blindness if not treated. It is the leading cause of preventable childhood blindness, afflicting 250,000 to 500,000 malnourished children in the developing world each year, about half of whom die within a year of becoming blind, as vitamin A deficiency also weakens the immune system. The normal range is 30 to 65 ÎŒg/dL, but plasma concentrations within the range are not a good indicator of a pending deficiency because the normal range is sustained until liver storage is depleted. After that happens, plasma retinol concentration falls to lower than 20 ÎŒg/dL, signifying a state of vitamin A inadequacy.
  ‱ Vitamin D deficiency is common. Most foods do not contain vitamin D, meaning that deficiency will occur unless people get sunlight exposure or eat manufactured foods purposely fortified with vitamin D. It is typically diagnosed by measuring the concentration of 25-hydroxyvitamin D (25(OH)D) in plasma, which is the most accurate measure of stores of vitamin D in the body. Deficiency is defined as less than 10 ng/mL, and insufficiency as the range of 10–30 ng/mL. Serum 25(OH)D concentrations above 30 ng/mL are "not consistently associated with increased benefit", and concentrations above 50 ng/mL may be cause for concern (these cut-offs are turned into a short sketch after this list). Vitamin D deficiency is a known cause of rickets, and has been linked to numerous other health problems.
  • Vitamin E deficiency is rare, occurring as a consequence of abnormalities in dietary fat absorption or metabolism, such as a defect in the alpha-tocopherol transport protein, rather than from a diet low in vitamin E. The US Institute of Medicine defines deficiency as a blood concentration of less than 12 ÎŒmol/L. Deficiency causes poor conduction of electrical impulses along nerves due to changes in nerve membrane structure and function.
  ‱ Vitamin K deficiency as a consequence of low dietary intake is rare. A deficient state can be a result of fat malabsorption diseases. Signs and symptoms can include sensitivity to bruising, bleeding gums, nosebleeds, and heavy menstrual bleeding in women. Newborn infants are a special case. Plasma vitamin K is low at birth, even if the mother is supplemented during pregnancy, because the vitamin is not transported across the placenta. Vitamin K deficiency bleeding (VKDB) due to physiologically low vitamin K plasma concentrations is a serious risk for premature and term newborn and young infants. Untreated, it can cause brain damage or death. The prevalence of VKDB is reported at 0.25 to 1.7%, with higher risk in Asian populations. The recommended preventive treatment is an intramuscular injection of 1 mg of vitamin K at birth (called the Vitamin K shot). There are protocols for oral administration, but intramuscular injection is preferred.
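Because the vitamin D cut-offs quoted in the list above are simple threshold checks, they translate directly into code. A minimal sketch, using only the 25(OH)D ranges given there (the labels are illustrative, not diagnostic advice):

    def vitamin_d_status(serum_25ohd_ng_ml: float) -> str:
        # Thresholds as quoted in the list above (ng/mL of 25(OH)D in serum).
        if serum_25ohd_ng_ml < 10:
            return "deficient (below 10 ng/mL)"
        if serum_25ohd_ng_ml <= 30:
            return "insufficient (10-30 ng/mL)"
        if serum_25ohd_ng_ml <= 50:
            return "above 30 ng/mL: not consistently associated with increased benefit"
        return "above 50 ng/mL: may be cause for concern"

    print(vitamin_d_status(8))    # deficient (below 10 ng/mL)
    print(vitamin_d_status(25))   # insufficient (10-30 ng/mL)
    print(vitamin_d_status(60))   # above 50 ng/mL: may be cause for concern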

Prevention

Food fortification

Food fortification is the process of adding micronutrients (essential trace elements and vitamins) to food as a public health policy which aims to reduce the number of people with dietary deficiencies within a population. Staple foods of a region can lack particular nutrients due to the soil of the region or from inherent inadequacy of a normal diet. Addition of micronutrients to staples and condiments can prevent large-scale deficiency diseases in these cases.

As defined by the World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO), fortification refers to "the practice of deliberately increasing the content of an essential micronutrient, i.e., vitamins and minerals in a food irrespective of whether the nutrients were originally in the food before processing or not, so as to improve the nutritional quality of the food supply and to provide a public health benefit with minimal risk to health", whereas enrichment is defined as "synonymous with fortification and refers to the addition of micronutrients to a food which are lost during processing". The Food Fortification Initiative lists all countries in the world that conduct fortification programs, and within each country, what nutrients are added to which foods. Vitamin fortification programs exist in one or more countries for folate, niacin, riboflavin, thiamin, vitamin A, vitamin B6, vitamin B12, vitamin D and vitamin E. As of December 21, 2018, 81 countries required food fortification with one or more vitamins. The most commonly fortified vitamin – as used in 62 countries – is folate; the most commonly fortified food is wheat flour.

Genetic engineering

Starting in 2000, rice was experimentally genetically engineered to produce higher than normal beta-carotene content, giving it a yellow/orange color. The product is referred to as golden rice (Oryza sativa). Biofortified sweet potato, maize, and cassava were other crops introduced to enhance the content of beta-carotene and certain minerals.

When eaten, beta-carotene is a provitamin, converted to retinol (vitamin A). The concept is that in areas of the world where vitamin A deficiency is common, growing and eating this rice would reduce the rates of vitamin A deficiency, particularly its effect on childhood vision problems. As of 2018, fortified golden crops were still in the process of government approvals, and were being assessed for taste and education about their health benefits to improve acceptance and adoption by consumers in impoverished countries.

Hypervitaminosis

Some vitamins cause acute or chronic toxicity, a condition called hypervitaminosis, which occurs mainly for fat-soluble vitamins if over-consumed by excessive supplementation. Hypervitaminosis A and hypervitaminosis D are the most common examples. Vitamin D toxicity does not result from sun exposure or consuming foods rich in vitamin D, but rather from excessive intake of vitamin D supplements, possibly leading to hypercalcemia, nausea, weakness, and kidney stones.

The United States, European Union and Japan, among other countries, have established "tolerable upper intake levels" for those vitamins which have documented toxicity.

History

The discovery dates of vitamins and their sources
Year of discovery Vitamin
1913 Vitamin A (Retinol)
1910 Vitamin B1 (Thiamine)
1920 Vitamin C (Ascorbic acid)
1920 Vitamin D (Calciferol)
1920 Vitamin B2 (Riboflavin)
1922 Vitamin E (Tocopherol)
1929 Vitamin K1 (Phylloquinone)
1931 Vitamin B5 (Pantothenic acid)
1931 Vitamin B7 (Biotin)
1934 Vitamin B6 (Pyridoxine)
1936 Vitamin B3 (Niacin)
1941 Vitamin B9 (Folate)
1948 Vitamin B12 (Cobalamins)

In 1747, the Scottish surgeon James Lind discovered that citrus foods helped prevent scurvy, a particularly deadly disease in which collagen is not properly formed, causing poor wound healing, bleeding of the gums, severe pain, and death. In 1753, Lind published his Treatise on the Scurvy, which recommended using lemons and limes to avoid scurvy; the recommendation was adopted by the British Royal Navy and led to the nickname limey for British sailors. Lind's discovery, however, was not widely accepted in the Royal Navy's Arctic expeditions of the 19th century, where it was widely believed that scurvy could be prevented by practicing good hygiene, regular exercise, and maintaining the morale of the crew while on board, rather than by a diet of fresh food.

During the late 18th and early 19th centuries, the use of deprivation studies allowed scientists to isolate and identify a number of vitamins. Lipid from fish oil was used to cure rickets in rats, and the fat-soluble nutrient was called "antirachitic A". Thus, the first "vitamin" bioactivity ever isolated, which cured rickets, was initially called "vitamin A"; however, the bioactivity of this compound is now called vitamin D. In 1881, the Russian medical doctor Nikolai I. Lunin studied the effects of scurvy at the University of Tartu. He fed mice an artificial mixture of all the separate constituents of milk known at that time, namely the proteins, fats, carbohydrates, and salts. The mice that received only the individual constituents died, while the mice fed milk itself developed normally. He concluded that substances essential for life must be present in milk other than the known principal ingredients. However, his conclusions were rejected by his advisor, Gustav von Bunge.

In East Asia, where polished white rice was the common staple food of the middle class, beriberi resulting from lack of vitamin B1 was endemic. In 1884, Takaki Kanehiro, a British-trained medical doctor of the Imperial Japanese Navy, observed that beriberi was endemic among low-ranking crew who often ate nothing but rice, but not among officers who consumed a Western-style diet. With the support of the Japanese Navy, he experimented using crews of two battleships; one crew was fed only white rice, while the other was fed a diet of meat, fish, barley, rice, and beans. The group that ate only white rice documented 161 crew members with beriberi and 25 deaths, while the latter group had only 14 cases of beriberi and no deaths. This convinced Takaki and the Japanese Navy that diet was the cause of beriberi, but they mistakenly believed that sufficient amounts of protein prevented it. That diseases could result from some dietary deficiencies was further investigated by Christiaan Eijkman, who in 1897 discovered that feeding unpolished rice instead of the polished variety to chickens helped to prevent beriberi. The following year, Frederick Hopkins postulated that some foods contained "accessory factors" — in addition to proteins, carbohydrates, fats etc. — that are necessary for the functions of the human body. Hopkins and Eijkman were awarded the Nobel Prize for Physiology or Medicine in 1929 for their discoveries.

 

Jack Drummond's single-paragraph article of 1920, which provided the structure and nomenclature used today for vitamins

In 1910, the first vitamin complex was isolated by the Japanese scientist Umetaro Suzuki, who succeeded in extracting a water-soluble complex of micronutrients from rice bran and named it aberic acid (later Orizanin). He published this discovery in a Japanese scientific journal, but when the article was translated into German, the translation failed to state that it was a newly discovered nutrient, a claim made in the original Japanese article, and hence his discovery failed to gain publicity. In 1912, the Polish-born biochemist Casimir Funk, working in London, isolated the same complex of micronutrients and proposed that the complex be named "vitamine". It was later to be known as vitamin B3 (niacin), though he described it as "anti-beri-beri-factor" (which would today be called thiamine or vitamin B1). Max Nierenstein, a friend and Reader of Biochemistry at Bristol University, reportedly suggested the "vitamine" name (from "vital amine"). Funk proposed the hypothesis that other diseases, such as rickets, pellagra, coeliac disease, and scurvy, could also be cured by vitamins. The name soon became synonymous with Hopkins' "accessory factors", and by the time it was shown that not all vitamins are amines, the word was already ubiquitous. In 1920, Jack Cecil Drummond proposed that the final "e" be dropped to deemphasize the "amine" reference, after researchers began to suspect that not all "vitamines" (in particular, vitamin A) have an amine component.

In 1930, Paul Karrer elucidated the correct structure for beta-carotene, the main precursor of vitamin A, and identified other carotenoids. Karrer and Norman Haworth confirmed Albert Szent-Györgyi's discovery of ascorbic acid and made significant contributions to the chemistry of flavins, which led to the identification of lactoflavin. For their investigations on carotenoids, flavins and vitamins A and B2, Karrer and Haworth jointly received the Nobel Prize in Chemistry in 1937. In 1931, Albert Szent-Györgyi and a fellow researcher Joseph Svirbely suspected that "hexuronic acid" was actually vitamin C, and gave a sample to Charles Glen King, who proved its anti-scorbutic activity in his long-established guinea pig scorbutic assay. In 1937, Szent-Györgyi was awarded the Nobel Prize in Physiology or Medicine for this discovery. In 1938, Richard Kuhn was awarded the Nobel Prize in Chemistry for his work on carotenoids and vitamins, specifically B2 and B6. In 1943, Edward Adelbert Doisy and Henrik Dam were awarded the Nobel Prize in Physiology or Medicine for their discovery of vitamin K and its chemical structure. In 1967, George Wald was awarded the Nobel Prize in Physiology or Medicine (jointly with Ragnar Granit and Haldan Keffer Hartline) for the discovery that vitamin A could participate directly in a physiological process.

Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...