
Friday, January 14, 2022

Psychology of self

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Psychology_of_self

The psychology of self is the study of either the cognitive, conative or affective representation of one's identity, or the subject of experience. The earliest formulation of the self in modern psychology derived from the distinction between the self as I, the subjective knower, and the self as Me, the object that is known.

Current views of the self in psychology position the self as playing an integral part in human motivation, cognition, affect, and social identity. It may be the case that we can now usefully attempt to ground experience of self in a neural process with cognitive consequences, which will give us insight into the elements of which the complex multiply situated selves of modern identity are composed.

The self has many facets that help make up integral parts of it, such as self-awareness, self-esteem, self-knowledge, and self-perception. All parts of the self enable people to alter, change, add, and modify aspects of themselves in order to gain social acceptance in society.

A useful account of the factors contributing to what we call "selfhood" is that the self gradually emerges at the intersection of:

  • the habits in our biological-metabolic processes,
  • the sociocultural habits of local culture inculcated into us,
  • our role models, good and bad,
  • how much responsibility the individual takes for making healthy choices again and again, developing and strengthening their own chooser self.

Kohut's formulation

Heinz Kohut initially proposed a bipolar self comprising two systems of narcissistic perfection: 1) a system of ambitions and 2) a system of ideals. Kohut called the pole of ambitions the narcissistic self (later, the grandiose self), while the pole of ideals was designated the idealized parental imago. According to Kohut, these poles of the self represented natural progressions in the psychic life of infants and toddlers.

Kohut argued that when the child's ambitions and exhibitionistic strivings were chronically frustrated, arrests in the grandiose self led to the preservation of a false, expansive sense of self that could manifest outwardly in the visible grandiosity of the frank narcissist, or remain hidden from view, unless discovered in a narcissistic therapeutic transference (or selfobject transference) that would expose these primitive grandiose fantasies and strivings. Kohut termed this form of transference a mirror transference. In this transference, the strivings of the grandiose self are mobilized and the patient attempts to use the therapist to gratify these strivings.

Kohut proposed that arrests in the pole of ideals occurred when the child suffered chronic and excessive disappointment over the failings of early idealized figures. Deficits in the pole of ideals were associated with the development of an idealizing transference to the therapist who becomes associated with the patient's primitive fantasies of omnipotent parental perfection.

Kohut believed that narcissistic injuries were inevitable and, in any case, necessary to temper ambitions and ideals with realism through the experience of more manageable frustrations and disappointments. It was the chronicity and lack of recovery from these injuries (arising from a number of possible causes) that he regarded as central to the preservation of primitive self systems untempered by realism.

According to the 1984 book How Does Analysis Cure?, Kohut's observation of patients led him to propose two additional forms of transference associated with self deficits: 1) the twinship and 2) the merger transference. In his later years, Kohut believed that selfobject needs were both present and quite varied in normal individuals, as well as in narcissistic individuals. To be clear, selfobjects are not external persons. Kohut and Wolf (1978) explain:

"Self objects are objects which we experience as part of our self; the expected control over them is, therefore, closer to the concept of control which a grownup expects to have over his own body and mind than to the concept of control which he expects to have over others. (p.413)"

Kohut's notion of the self can be difficult to grasp because it is experience-distant, although it is posited based upon experience-near observation of the therapeutic transference. Kohut relied heavily on empathy as a method of observation. Specifically, the clinician's observations of their own feelings in the transference help the clinician see things from the subjective view of the patient—to experience the world in ways that are closer to the way the patient experiences it. (note: Kohut did not regard empathy as curative. Empathy is a method of observation).

Winnicott's selves

Donald Winnicott distinguished what he called the "true self" from the "false self" in the human personality, considering the true self as one based on the individual's sense of being, not doing, something which was rooted in the experiencing body. As he memorably put it to Harry Guntrip, 'You know about "being active", but not about "just growing, just breathing"': it was the latter qualities that went to form the true self.

Nevertheless, Winnicott did not undervalue the role of the false self in the human personality, regarding it in fact as a necessary form of defensive organization – a kind of caretaker, a survival suit behind the protection of which the true self was able to continue to exist. Five levels of false self organization were identified by Winnicott, running along a kind of continuum.

  1. In the most severe instance, the false self completely replaces and ousts the true self, leaving the latter a mere possibility.
  2. Less severely, the false self protects the true self, which remains unactualized - for Winnicott a clear example of a clinical condition organised for the positive goal of preserving the individual in spite of abnormal environmental conditions.
  3. Closer to health, the false self supports the individual's search for conditions that will allow the true self to recover its well-being - its own identity.
  4. Even closer to health, we find the false self "... established on the basis of identifications".
  5. Finally, in a healthy person, the false self is composed of that which facilitates social behavior, the manners and courtesy that allows for a smooth social life, with emotions expressed in socially acceptable forms.

As for the true self, Winnicott linked it both to playing and to a kind of "hide and seek" designed to protect creative ownership of one's real self against exploitation, without entirely forfeiting the ability to relate to others.

Berne's transactional analysis

In his theory of transactional analysis, Eric Berne distinguished the personality's ego states - Parent, Adult and Child - from what he called 'the real Self, the one that can move from one ego state to another'.

  • The parent ego consists of behaviors and feelings borrowed from previous caregivers. The parent ego can take the form of either the Nurturing or the Critical Parent. The Nurturing Parent has a more loving nature, whereas the Critical (or Prejudiced) Parent consists of preconceived ideas, thoughts, and behaviors learned from parents or previous caregivers. Some of this information can be beneficial, while some is not.
  • The adult ego is otherwise known as our data-processing center. This ego state is able to judge information based on facts, rather than emotions or preconceived beliefs.
  • The child ego is identified as the state that holds all of our memories, emotions, and feelings. People carry this ego state with them all of the time and can reflect back on it at any time. This state can also be divided into two segments: the Free (or Natural) Child and the Adapted (and/or Rebellious) Child. The Free Child represents spontaneity, creativity, and a direct way of perceiving the world. Intimate relationships are able to form through a person's contact with their own inner child; the less a person is in touch with their inner child, the less able they are to form intimate relationships with other people. The Adapted Child is the state in which people are able to comply with and respond to parental commands and messages. If a parental command is viewed as too strong and demanding, the child ego can rebel against it, which is why this state can also become the Rebellious Child.

Berne considered that 'the feeling of "Self" is a mobile one. It can reside in any of the three ego states at any given moment, and can jump from one to the other as occasion arises'.

A person's tone, gestures, choice of words, posture, and emotional state can portray which ego state they are currently in. By knowing about their own ego states, a person can use each one in particular situations in order to enhance their experience or make new social connections. For example, a person would most likely want to be in a Free Child state along with the Adult state while attending a party in order to maximize the fun they are having while also being able to make wise choices.

Transactions are another concept in transactional theory, relating to how people in a certain ego state interact with people in the same or a different ego state at a particular moment. Straight transactions are complementary and result in clear communication between people. By contrast, crossed transactions involve diverging ego states that make communication hard or frustrating; they provoke emotional stress and negative feedback. Nevertheless, Berne saw the Self as the most valuable part of the personality: 'when people get to know each other well, they penetrate into the depths where this real Self resides, and that is the part of the other person they respect and love'.

Jungian archetype of the self

In classical Jungian analysis, the Self is the central archetype among several archetypes, which are a priori predispositions to respond to the world in particular ways. The Self signifies the coherent whole, unifying both the conscious and unconscious mind of a person. The Self, according to Jung, is the most important and difficult archetype to understand. It is realized as the product of individuation, which is defined as the process of integrating one's personality. The Self can appear to the individual impersonally, in dreams and images (a circle, mandala, crystal, or stone), or personally (a royal couple, a divine child, or another divine symbol). Symbolic spiritual figures, such as Christ and Mohammed, are also seen as symbols of the self, because they represent unity and equilibrium. The Wise Old Woman/Man can also serve as 'a symbolic personification of the Self'.

What distinguishes classical Jungian psychology from earlier theories is the idea that there are two centers of the personality. The ego is the center of conscious identity, whereas the Self is the center of the total personality—including consciousness, the unconscious, and the ego. The Self is both the whole and the center. While the ego is a self-contained little circle off the center contained within the whole, the Self can be understood as the greater circle. People know of this Self, yet it is not known. Jung expresses it in this way: "If the Self could be wholly experienced, it would be a limited experience, whereas in reality its experience is unlimited and endless.... If I were one with the Self I would have knowledge of everything, I would speak Sanskrit, read cuneiform script, know the events that took place in pre-history be acquainted with the life of other planets, etc."

The Self, besides being the centre of the psyche, is also autonomous, meaning that it exists outside of time and space. Jung also called the Self an imago dei. The Self is the source of dreams and often appears as an authority figure in dreams with the ability to perceive the future or guide one in the present.

Critiques of the concept

'Selfhood' or complete autonomy is a common Western approach to psychology and models of self are employed constantly in areas such as psychotherapy and self-help. Edward E. Sampson (1989) argues that the preoccupation with independence is harmful in that it creates racial, sexual and national divides and does not allow for observation of the self-in-other and other-in-self.

The very notion of selfhood has been attacked on the grounds that it is seen as necessary for the mechanisms of advanced capitalism to function. In Inventing our selves: Psychology, power, and personhood, Nikolas Rose (1998) proposes that psychology is now employed as a technology that allows humans to buy into an invented and arguably false sense of self. In this way, 'Foucault's theories of self have been extensively developed by Rose to explore techniques of governance via self-formation...the self has to become an enterprising subject, acquiring cultural capital in order to gain employment', thus contributing to self-exploitation.

Kohut suggested that for an individual to talk about, explain, understand or judge oneself is linguistically impossible, since it requires the self to understand its self. This is seen as philosophically invalid, being self-referential (a form of reification), also known as a circular argument. Thus, if the self attempts self-explanation, confusion may well occur within linguistic mental pathways and processes.

As for the theorists of the self, Winnicott has his critics, suggesting that his theory of the way 'the False Self is invented to manage a prematurely important object...enacts a kind of dissociated regard or recognition of the object' is itself rooted in 'his own childhood experience of trying to "make my living" by keeping his mother alive'.

The self has long been considered the central element and support of any experience. Yet the self is not 'permanently stuck into the heart of consciousness'. "I am not always as intensively aware of me as an agent, as I am of my actions. That results from the fact that I perform only part of my actions, the other part being conducted by my thought, expression, practical operations, and so on."

Memory and the self

Memory and the self are interconnected; combined, they can be described as the Self-Memory System (SMS). The self is viewed as a combination of memories and self-images (the working self). Conway proposes that a person's long-term memory and working self are dependent on each other: our prior knowledge of our self puts constraints on what our working self is, and the working self modifies access to our long-term memory, as well as what it consists of.

One view of the Self, following from John Locke, sees it as a product of episodic memory. It has been suggested that transitory mental constructions within episodic memory form a self-memory system that grounds the goals of the working self, but research on people with amnesia finds that they have a coherent sense of self based upon preserved conceptual autobiographical knowledge and semantic facts, suggesting that the sense of self rests on conceptual knowledge rather than episodic memory.

Both episodic and semantic memory systems have been proposed to generate a sense of self-identity: personal episodic memory enables the phenomenological continuity of identity, while personal semantic memory generates the narrative continuity of identity. "The nature of personal narratives depends on highly conceptual and ‘story-like' information about one's life, which resides at the general event level of autobiographical memory and is thus unlikely to rely on more event-specific episodic systems."

Implicit theories and self-concepts

Ross proposed a two-step process for recalling past states:

  1. The current state of the attribute or belief is assessed.
  2. An implicit theory of stability or change is invoked.

The earlier state of the attribute or belief is then inferred by combining these two steps.

This theory suggests that recollection of past states would be biased if a person's state has changed but they expect no change to have occurred, or if the state has remained constant when a change was expected.

For example, an implicit theory of stability is often invoked when assessing political allegiance; therefore, if this allegiance actually changes, recollection of the past allegiance will be incorrect and assumed to be the same as the current political identification.

An implicit theory of change is invoked when a change in an attribute over time is expected. One example of this is a study by Conway and Ross, which demonstrates that if a change in skill is expected, but there is no actual improvement, people will believe that their past skill state was worse than it was.
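
A minimal toy model may help make the biased-recall claim concrete. This sketch is illustrative only and is not from the source; all numbers are hypothetical ratings on a 1-10 scale.

```python
# Illustrative toy model (not from the source): how an implicit theory can
# bias recall of a past state, following Ross's two-step account above.

def recall_past_state(current_state, implicit_theory, expected_improvement=0):
    """Step 1 supplies the current state; step 2 applies an implicit theory."""
    if implicit_theory == "stability":
        return current_state                          # assume nothing has changed
    if implicit_theory == "change":
        return current_state - expected_improvement   # assume things used to be worse
    raise ValueError("unknown implicit theory")

# Theory of stability: allegiance actually changed (4 -> 7), but is recalled as 7.
print(recall_past_state(7, "stability"))                        # 7, not the true 4

# Theory of change: skill did not improve (still 6), yet the past is recalled
# as worse than it was, as in the Conway and Ross study described above.
print(recall_past_state(6, "change", expected_improvement=2))   # 4, not the true 6
```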

Recalling pain

In general recollection of pain is fairly accurate, although differences are seen between recollection of acute and chronic pain. Research suggests that recall for acute pain is more accurate than recall for chronic pain.

An interesting phenomenon seen in recollection of pain in the self is the peak-end phenomenon. Research has shown that when enduring painful experiences, people will 'prefer' more drawn out experiences that end with lower levels of pain, over shorter experiences that end with higher levels of pain, even though the shorter experiences provide less pain overall.

Recalled ratings of pain are more closely related to a combination of the peak pain and the end pain of the experience, while the length of the experience factors in very little; this 'duration neglect' appears when recollecting painful experiences.
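
A small sketch, using hypothetical pain ratings rather than data from the source, of how a peak-end rule that neglects duration can make a longer, objectively more painful experience be remembered as less painful:

```python
# Hypothetical pain ratings (0-10) recorded once per minute during two trials.
short_trial = [6, 7, 8]         # ends at its peak
long_trial = [6, 7, 8, 5, 4]    # same peak, extra minutes, ends at lower pain

def peak_end_recall(ratings):
    """Approximate recalled pain as the average of the peak and the final rating."""
    return (max(ratings) + ratings[-1]) / 2

print(sum(short_trial), sum(long_trial))   # 21 vs 30: the long trial has more total pain
print(peak_end_recall(short_trial))        # 8.0
print(peak_end_recall(long_trial))         # 6.0 -> remembered as the less painful trial
```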

Social psychology

Symbolic interactionism stresses the 'social construction of an individual's sense of self' through two main methods: 'In part the self emerges through interaction with others....But the self is a product of social structure as well as of face-to-face interaction'. This aspect of social psychology emphasizes the theme of mutual constitution of the person and situation. Instead of focusing on the levels of class, race, and gender structure, this perspective seeks to understand the self in the way an individual lives their life on a moment-by-moment basis.

Social psychology acknowledges that "one of the most important life tasks each of us faces is understanding both who we are and how we feel about ourselves". This allows us to better understand ourselves, abilities, and preferences so that we are able to make choices and decisions that suit us best. However, rather than absolute knowledge, it would seem that 'a healthy sense of self calls for both accurate self-knowledge and protective self-enhancement, in just the right amounts at just the right times.'

Self as an emergent phenomenon

In dynamical social psychology as proposed by Nowak et al., the self is an emergent property, arising as an experiential phenomenon from the interaction of psychological perceptions and experience. In this orientation, which draws from physics and biology, the whole is treated as more than the sum of its parts, since new properties emerge at the level of the system. This is also hinted at in the dynamical evolutionary social psychology of Douglas Kenrick et al., where a set of simple decision rules generates complex behaviour.

Parts of the self

The self is an automatic part of every human being which enables people to relate to others. The self is made up of three main parts that, taken together, allow it to maintain its function. The parts of the self are self-knowledge, the interpersonal self, and the agent self.

Self-knowledge

Self-knowledge is sometimes referred to as self-concept. This feature allows for people to gather information and beliefs about themselves. A person's self-awareness, self-esteem, and self-deception all fall under the self-knowledge part of self. We learn about ourselves through our looking-glass selves, introspection, social comparisons, and self-perception.

The looking-glass self is a term describing the theory that people learn about themselves through other people. In the looking-glass-self proposal, a person visualizes how they appear to others, imagines how other people will judge them, and then develops a response to the judgment they receive from other people. That response will likely be experienced as pride or shame about themselves. The looking-glass self has proved to be only partially accurate: a person's self-concept does not depend solely on how others view them. A person can view themselves as friendly; however, they may appear quiet and uptight to another person who does not know them very well.

Introspection refers to the manner in which a person gathers information about themselves through mental functions and emotions. Although a person might not know why they are thinking or feeling a certain way, they are able to know what it is they are feeling. However, developmental stages in life might affect introspection. In a Rosenberg study, children up to a certain stage in development showed that they knew their parents actually knew them better than they knew themselves. Studies by Nisbett and Wilson also found that people might not actually know what they are thinking all of the time. In one particular study, they discovered that many people bought the first stockings that they saw and then explained their choice as being based on color or softness. In conclusion, introspection is a way of gaining knowledge about yourself through your inner emotions and thinking; however, it involves only the conscious part of the mind. The automatic part of the mind can drive many unconscious acts for which people have no reasons.

Social comparison is regarded as the way in which we compare ourselves to other people around us. By looking to other people, we can rate our work and behaviors as good, neutral, or bad. The most beneficial or useful comparisons are with people in the same category as ourselves. For example, a high school football player would be more appropriately compared to an all-star high school football player than to a Super Bowl-winning football player with over 10 years of experience. An upward social comparison refers to comparing oneself to a person perceived as better in a particular area, which can be either motivating or discouraging to the person making the comparison. A downward social comparison refers to comparing oneself to a person perceived as worse off, which can make the person making the comparison feel better about themselves.

Self-perception theory is another theory by which a person infers things about themselves from their behavior. Their behavior can give them insight into what their feelings and emotions truly are. If a person regards themselves as smart yet continually receives bad grades over the years, that person might revise their thinking and conclude that they are not as smart as they previously thought. This helps readjust a person's thoughts to better match their behavior.

Self-knowledge is something the majority of human beings desire. In knowing about ourselves, we are more capable of knowing how to be socially acceptable and desirable. We seek out self-knowledge due to the appraisal motive, the self-enhancement motive, and the consistency motive. The appraisal motive describes the desire to learn the truth about oneself in general. The self-enhancement motive is the desire to learn only about one's good qualities. The consistency motive is the desire to receive reinforcement of the preconceived notions a person has about themselves. This feedback verifies the thoughts and beliefs they already hold about themselves.

Self-awareness can be divided into two categories: private self-awareness and public self-awareness. Private self-awareness is defined as the self looking inward at oneself, including emotions, thoughts, beliefs, and feelings, none of which can be directly observed by anyone else. Public self-awareness is defined as gathering information about oneself through the perceptions of others. The actions and behaviors that others show towards a person help that person establish a sense of how others perceive them. For example, if a person likes to sing but many other people discourage their singing, that person can conclude that they might not be the best at singing. In this situation, they are gaining public self-awareness about an aspect of themselves.

Self-esteem describes how a person evaluates themselves, positively or negatively. Four factors that contribute to self-esteem are the reactions we get from other people, how we compare ourselves to others, our social roles, and our sense of identity. Some social roles are associated with high intelligence or ability, such as being an Olympic athlete or a biotechnologist. Other social roles might be stigmatized as negative, such as being a criminal or homeless. People with high self-esteem view themselves as having positive traits. They are more willing to take risks and aim for success. People with high self-esteem tend to be confident, accept themselves, worry less about what others think of them, and think more optimistically. In contrast, people with low self-esteem view themselves as having few or no positive traits, rather than as having negative traits. It is rare for a person to rate their overall self as terrible. People with low self-esteem typically:
  • do not wish to fail
  • are less confident in their success rate
  • have confused and diverged notions about their self (self-concept confusion)
  • focus on self-protection more so than self-enhancement
  • are more prone to emotional imbalances
  • are less confident about their success than high self-esteemed people
  • worry what others think about them consistently
  • have more pessimistic thinking
  • desire to resemble others more than high self-esteemed people

Our self-concept entails the thoughts, feelings, and beliefs that each of us uniquely fosters. However, many psychologists have questioned whether our self-concept is realistic or filled with illusions about ourselves and the world around us. Clinical psychologists have studied depressed people with perceived low self-esteem to observe whether their perceptions were fabricated. Contrary to their hypothesis, they found that depressed people have a more realistic view of the world, of the qualities they possess, and of the control they have over situations in their lives. Psychologists Shelley Taylor and Jonathon Brown proposed that the majority of people in normal-functioning mental states display positive illusions, including:
  • overestimation of their own good qualities
  • overestimation of their control over events in their lives
  • unrealistic optimism

Positive illusions remain constant for the majority of one's life due to self-deception. Self-deception strategies are mental tricks of a person's mind that hide the truth and sustain false beliefs. Through self-deception, people are able to maintain resilience in the face of negative events that might occur throughout life. It can also reinforce ideas or outcomes that the person wishes and hopes for. The self-serving bias is a strategy in which a person takes credit for success and rejects blame for failure. For example, a person who wins a track meet would glorify their ability as an athlete. However, if that person were to come in last in the meet, they would most likely blame contributing factors such as a muscle cramp or a previous injury preventing a good performance. Another strategy people use is to criticize bad feedback more harshly than good: a person judges a situation more critically when they did worse, while the opposite occurs for a situation involving good feedback.

Interpersonal self

The interpersonal self can also be referred to as the public self. This feature allows for social connection to others. With the interpersonal self, a person is able to display themselves to the others around them. The interpersonal self is apparent in situations of self-presentation, being a group member or a partner in a relationship, a person's social roles, and their reputation. For example, a person might show confidence and determination in their work atmosphere, whereas they show more of their emotional and nurturing side in their romantic relationship.

Social roles are defined as the parts that a person plays in different situations and with other people. Our roles change in order to fit the "expected" behaviors in various scenarios. For example, a person may be a mother, a doctor, a wife, and daughter. Their behavior would most likely change in their transition from being a doctor to coming home to their daughter.

Social norms constitute the "unwritten rules" that we have about how to act in certain scenarios and with various people in our lives. For example, when a person is in a classroom, they are more likely to be quiet and attentive, whereas at a party they are more likely to be socially engaged and standing. Norms act as guidelines that shape our behavior; without them, there would be no order and no shared understanding of social situations.

Agent self

The agent self is known as the executive function that allows for action. It is how we, as individuals, make choices and exercise control in situations and actions. The agent self presides over everything that involves decision making, self-control, taking charge in situations, and actively responding. A person might desire to eat unhealthy foods; however, it is their agent self that allows them to avoid those foods and make a healthier choice.

Dieting

From Wikipedia, the free encyclopedia
 

Dieting is the practice of eating food in a regulated way to decrease, maintain, or increase body weight, or to prevent and treat diseases such as diabetes and obesity. Dieting to lose weight is recommended for people with weight-related health problems, but not for otherwise healthy people. As weight loss depends on calorie intake, different kinds of calorie-reduced diets, such as those emphasising particular macronutrients (low-fat, low-carbohydrate, etc.), have been shown to be no more effective than one another. As weight regain is common, diet success is best predicted by long-term adherence. Regardless, the outcome of a diet can vary widely depending on the individual.

The first popular diet was "Banting", named after William Banting. In his 1863 pamphlet, Letter on Corpulence, Addressed to the Public, he outlined the details of a particular low-carbohydrate, low-calorie diet that led to his own dramatic weight loss.

One survey found that almost half of all American adults attempt to lose weight through dieting.

History

William Banting popularized one of the first weight loss diets in the 19th century.

According to Foxcroft, the word diet comes from the Greek diaita, which represents a notion of a whole way of life and a healthy lifestyle, including mental as well as physical health, rather than a narrow weight-loss regimen.

One of the first dietitians was the English doctor George Cheyne. He himself was tremendously overweight and would constantly eat large quantities of rich food and drink. He began a meatless diet, taking only milk and vegetables, and soon regained his health. He began publicly recommending his diet for everyone suffering from obesity. In 1724, he wrote An Essay of Health and Long Life, in which he advises exercise and fresh air and avoiding luxury foods.

The Scottish military surgeon, John Rollo, published Notes of a Diabetic Case in 1797. It described the benefits of a meat diet for those suffering from diabetes, basing this recommendation on Matthew Dobson's discovery of glycosuria in diabetes mellitus. By means of Dobson's testing procedure (for glucose in the urine) Rollo worked out a diet that had success for what is now called type 2 diabetes.

The first popular diet was "Banting", named after the English undertaker William Banting. In 1863, he wrote a booklet called Letter on Corpulence, Addressed to the Public, which contained the particular plan for the diet he had successfully followed. His own diet was four meals per day, consisting of meat, greens, fruits, and dry wine. The emphasis was on avoiding sugar, sweet foods, starch, beer, milk and butter. Banting's pamphlet was popular for years to come, and would be used as a model for modern diets. The pamphlet's popularity was such that the question "Do you bant?" referred to his method, and eventually to dieting in general. His booklet remains in print as of 2007.

The first weight-loss book to promote calorie counting, and the first weight-loss book to become a bestseller, was the 1918 Diet and Health: With Key to the Calories by American physician and columnist Lulu Hunt Peters.

It has been estimated that over 1,000 weight loss diets had been developed by 2014.

Types

A restricted diet is more often pursued by those who want to lose weight. Some people follow a diet to gain weight (usually in the form of muscle). Diets can also be used to maintain a stable body weight and improve health.

Low-fat

Low-fat diets involve the reduction of the percentage of fat in one's diet. Calorie consumption is reduced because less fat is consumed. Diets of this type include NCEP Step I and II. A meta-analysis of 16 trials of 2–12 months' duration found that low-fat diets (without intentional restriction of caloric intake) resulted in average weight loss of 3.2 kg (7.1 lb) over habitual eating.

A low-fat, plant-based diet has been found to improve control of weight, blood sugar levels, and cardiovascular health.

Low-carbohydrate

Low-carbohydrate diets are relatively high in protein and fats. Low-carbohydrate diets are sometimes ketogenic (i.e., they restrict carbohydrate intake sufficiently to cause ketosis).

"The glycemic index (GI) factor is a ranking of foods based on their overall effect on blood sugar levels. The diet based around this research is called the Low GI diet. Low glycemic index foods, such as lentils, provide a slower, more consistent source of glucose to the bloodstream, thereby stimulating less insulin release than high glycemic index foods, such as white bread."

A randomized controlled trial comparing four diets concluded that the high-carbohydrate, low-glycemic index diet was the most favorable as it led to both high weight loss and a decline in low density lipoprotein.

The "glycemic load" is the glycemic index multiplied by the amount of carbohydrate. A meta-analysis by the Cochrane Collaboration concluded that low glycemic index or low glycemic load diets led to more weight loss and better lipid profiles but did not separate the effects of the load versus the index.

Low-calorie

Low-calorie diets usually produce an energy deficit of 500–1,000 calories per day, which can result in a 0.5 to 1 kilogram (1.1 to 2.2 pounds) weight loss per week. One of the most commonly used low-calorie diets is Weight Watchers. The National Institutes of Health reviewed 34 randomized controlled trials to determine the effectiveness of low-calorie diets. They found that these diets lowered total body mass by 8% in the short term, over 3–12 months. Women doing low-calorie diets should have at least 1,000 calories per day and men should have approximately 1,200 calories per day. These caloric intake values vary depending on additional factors, such as age and weight.
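
A back-of-envelope check of the figures above, assuming the commonly cited approximation of roughly 7,700 kcal per kilogram of body fat (an assumption not stated in the text):

```python
KCAL_PER_KG_FAT = 7700  # common approximation, not from the source

def weekly_loss_kg(daily_deficit_kcal):
    # A sustained daily deficit, summed over a week, converted to fat mass.
    return daily_deficit_kcal * 7 / KCAL_PER_KG_FAT

print(round(weekly_loss_kg(500), 2))    # ~0.45 kg/week
print(round(weekly_loss_kg(1000), 2))   # ~0.91 kg/week, matching the 0.5-1 kg range
```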

Very low-calorie

Very low calorie diets provide 200–800 calories per day, maintaining protein intake but limiting calories from both fat and carbohydrates. They subject the body to starvation and produce an average loss of 1.5–2.5 kg (3.3–5.5 lb) per week. "2-4-6-8", a popular diet of this variety, follows a four-day cycle in which only 200 calories are consumed the first day, 400 the second day, 600 the third day, and 800 the fourth day, followed by total fasting, after which the cycle repeats. There is some evidence that these diets result in considerable weight loss. These diets are not recommended for general use and should be reserved for the management of obesity, as they are associated with adverse side effects such as loss of lean muscle mass, increased risk of gout, and electrolyte imbalances. People attempting these diets must be monitored closely by a physician to prevent complications.

The concept of crash dieting is to drastically reduce calories using a very-low-calorie diet. Crash dieting can be highly dangerous because it can cause various kinds of problems for the human body. Crash dieting can produce weight loss, but without ongoing professional supervision, the extreme reduction in calories and the potential imbalance in the diet's composition can lead to detrimental effects, including sudden death.

Fasting

Fasting is the practice of leaving a long interval between meals. In dieting, long-term (periodic) fasting is not recommended; instead, having small portions of food after short intervals is encouraged. Lengthy fasting can also be dangerous due to the risk of malnutrition and should be carried out only under medical supervision. During prolonged fasting or very low calorie diets, the reduction of blood glucose, the preferred energy source of the brain, causes the body to deplete its glycogen stores. Once glycogen is depleted, the body begins to fuel the brain using ketones, while also metabolizing body protein (including but not limited to skeletal muscle) to synthesize sugars for use as energy by the rest of the body. Most experts believe that a prolonged fast can lead to muscle wasting, although some dispute this. Short-term fasting, or various forms of intermittent fasting, has been used as a form of dieting to circumvent the issues of prolonged fasting.

Detox

Detox diets are promoted with unsubstantiated claims that they can eliminate "toxins" from the human body. Many of these diets use herbs or celery and other juicy low-calorie vegetables.

Environmentally sustainable

Another kind of diet focuses not on the dieter's health but on the diet's environmental impact. The British Dietetic Association's (BDA) One Blue Dot plan offers recommendations for reducing diets' environmental impacts, by:

  1. Reducing meat to 70g per person per day.
  2. Prioritising plant proteins.
  3. Promoting fish from sustainable sources.
  4. Moderating dairy consumption.
  5. Focusing on wholegrain starchy foods.
  6. Promoting seasonal locally sourced fruits and vegetables.
  7. Reducing overconsumption of foods high in fat, sugar, and salt.
  8. Promoting tap water and unsweetened tea/coffee as the de facto choice for healthy hydration.
  9. Reducing food waste.

Effectiveness

Several diets are effective for weight loss of obese individuals, with diet success most predicted by adherence and little effect resulting from the type or brand of diet. As weight maintenance depends on calorie intake, diets emphasising certain macronutrients (low-fat, low-carbohydrate, etc.) have been shown to be no more effective than one another and no more effective than diets that maintain a typical mix of foods with smaller portions and perhaps some substitutions (e.g. low-fat milk, or less salad dressing). A meta-analysis of six randomized controlled trials found no difference between low-calorie, low-carbohydrate, and low-fat diets, with a 2–4 kilogram weight loss over 12–18 months in all studies. Extreme diets may, in some cases, lead to malnutrition.

A major challenge regarding weight loss and dieting relates to compliance. While dieting can effectively promote weight loss in the short term, the intervention is hard to maintain over time and suppresses skeletal muscle thermogenesis. Suppressed thermogenesis accelerates weight regain once the diet stops, unless that phase is accompanied by a well-timed exercise intervention, as described by the Summermatter cycle.

On average, short-term dieting results in meaningful long-term weight loss, although the loss is more limited because of gradual weight regain of 1 to 2 kg/year. For each individual the results will differ, with some regaining more weight than they lost and a few others achieving a tremendous loss, so the "average weight loss" of a diet is not indicative of the results any particular dieter may achieve. A 2001 meta-analysis of 29 American studies found that participants in structured weight-loss programs maintained an average of 23% (3 kg) of their initial weight loss after five years, representing a sustained 3.2% reduction in body mass.
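
As a back-of-envelope reading of those meta-analysis figures (illustrative arithmetic only; the implied averages are not stated in the text):

```python
maintained_kg = 3.0                      # weight loss still maintained after five years
print(round(maintained_kg / 0.23, 1))    # ~13.0 kg implied average initial weight loss
print(round(maintained_kg / 0.032, 1))   # ~93.8 kg implied average initial body mass
```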

Dieting appears more effective than exercise for weight loss, but combining both provides even greater long-term results.

Adverse effects

Increased mortality rate

A number of studies have found that intentional weight loss is associated with an increase in mortality in people without weight-related health problems. A 2009 meta-analysis of 26 studies found that "intentional weight loss had a small benefit for individuals classified as unhealthy (with obesity-related risk factors), especially unhealthy obese, but appeared to be associated with slightly increased mortality for healthy individuals, and for those who were overweight but not obese."

Dietary supplements

Due to extreme or unbalanced diets, dietary supplements may be needed. They can provide the vitamins, minerals, herbs or other nutrients that may be missing from an unbalanced diet. While they can be very helpful for maintaining a healthy lifestyle alongside an unbalanced diet, supplements should not be overused. Overdosing on any dietary supplement can cause a range of side effects depending on which supplement was taken.

Eating disorders

In an editorial for Psychological Medicine, George Hsu concludes that dieting is likely to lead to the development of an eating disorder in the presence of certain risk factors. A 2006 study found that dieting and unhealthy weight-control behaviors were predictive of obesity and eating disorders five years later, with the authors recommending a "shift away from dieting and drastic weight-control measures toward the long-term implementation of healthful eating and physical activity".

Mechanism

When the body is expending more energy than it is consuming (e.g. when exercising), the body's cells rely on internally stored energy sources, such as complex carbohydrates and fats, for energy. The first source to which the body turns is glycogen (by glycogenolysis). Glycogen is a complex carbohydrate, 65% of which is stored in skeletal muscles and the remainder in the liver (totaling about 2,000 kcal in the whole body). It is created from the excess of ingested macronutrients, mainly carbohydrates. When glycogen is nearly depleted, the body begins lipolysis, the mobilization and catabolism of fat stores for energy. In this process fats, obtained from adipose tissue, or fat cells, are broken down into glycerol and fatty acids, which can be used to generate energy. The primary by-products of metabolism are carbon dioxide and water; carbon dioxide is expelled through the respiratory system.
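
A highly simplified sketch of that sequence, treating the deficit as if it drew down glycogen completely before any fat is mobilized. Real metabolism uses mixed fuels and also catabolizes protein, and the fat-energy constant is a common approximation rather than a figure from the text:

```python
GLYCOGEN_STORE_KCAL = 2000   # approximate whole-body glycogen energy (from the text)
KCAL_PER_KG_FAT = 7700       # approximate energy density of body fat (assumption)

def fat_loss_kg(total_deficit_kcal):
    """Fat notionally mobilized once the glycogen store is exhausted."""
    from_fat = max(0, total_deficit_kcal - GLYCOGEN_STORE_KCAL)
    return from_fat / KCAL_PER_KG_FAT

# A cumulative 500 kcal/day deficit sustained for 30 days:
print(round(fat_loss_kg(500 * 30), 2))   # ~1.69 kg of fat
```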

Set-Point Theory

The Set-Point Theory, first introduced in 1953, postulated that each body has a preprogrammed fixed weight, with regulatory mechanisms to compensate for deviations. This theory was quickly adopted and used to explain failures in developing effective and sustained weight loss procedures. A 2019 systematic review of multiple weight change procedures, including alternate day fasting and time-restricted feeding but also exercise and overeating, found systematic "energetic errors" for all of these procedures. This shows that the body cannot precisely compensate for errors in energy/calorie intake, countering the Set-Point Theory and potentially explaining both weight loss and weight gain such as obesity. The review covered only short-term studies; therefore, such a mechanism cannot be excluded in the long term, as evidence is currently lacking on this timeframe.

Methods

Meal timing

Meal timing is known to be an important factor in any diet. Recent evidence suggests that new scheduling strategies, such as intermittent fasting or skipping meals, together with strategically placed snacks before meals, may be recommendable to reduce cardiovascular risks as part of a broader lifestyle and dietary change.

Food diary

A 2008 study published in the American Journal of Preventive Medicine showed that dieters who kept a daily food diary (or diet journal), lost twice as much weight as those who did not keep a food log, suggesting that if a person records their eating, they are more aware of what they consume and therefore eat fewer calories.

Water

A 2009 review found limited evidence suggesting that encouraging water consumption and substituting energy-free beverages for energy-containing beverages (i.e., reducing caloric intake) may facilitate weight management. A 2009 article found that drinking 500 ml of water prior to meals for a 12-week period resulted in increased long-term weight reduction. (References given in main article.)

Society

It is estimated that about 1 out of 3 Americans is dieting at any given time. 85% of dieters are women. Approximately sixty billion dollars are spent every year in the USA on diet products, including "diet foods," such as light sodas, gym memberships or specific regimes. 80% of dieters start by themselves, whereas 20% see a professional or join a paid program. The typical dieter attempts 4 tries per year.

Weight loss groups

Some weight loss groups aim to make money, others work as charities. The former include Weight Watchers and Peertrainer. The latter include Overeaters Anonymous, TOPS Club and groups run by local organizations.

These organizations' customs and practices differ widely. Some groups are modelled on twelve-step programs, while others are quite informal. Some groups advocate certain prepared foods or special menus, while others train dieters to make healthy choices from restaurant menus and while grocery-shopping and cooking.

Orthorexia nervosa

From Wikipedia, the free encyclopedia

Orthorexia nervosa /ˌɔːrθəˈrɛksiə nɜːrˈvoʊsə/ (also known as orthorexia, ON) is a proposed eating disorder characterized by an excessive preoccupation with eating healthy food. The term was introduced in 1997 by American physician Steven Bratman, M.D. He suggested that some people's dietary restrictions intended to promote health may paradoxically lead to unhealthy consequences, such as social isolation, anxiety, loss of the ability to eat in a natural, intuitive manner, reduced interest in the full range of other healthy human activities, and, in rare cases, severe malnutrition or even death.

In 2009, Ursula Philpot, chair of the British Dietetic Association and senior lecturer at Leeds Metropolitan University, described people with orthorexia nervosa as being "solely concerned with the quality of the food they put in their bodies, refining and restricting their diets according to their personal understanding of which foods are truly 'pure'." This differs from other eating disorders, such as anorexia nervosa and bulimia nervosa, where those affected focus on the quantity of food eaten.

Orthorexia nervosa also differs from anorexia nervosa in that it does not disproportionately affect one gender: studies have found that orthorexia nervosa occurs equally in men and women, with no significant gender differences. Furthermore, research has found significant positive correlations between ON and both narcissism and perfectionism, but no significant correlation between ON and self-esteem. This suggests that high-ON individuals take pride in their healthy eating habits relative to others, and that this pride, rather than body image as in anorexia, is the driving force behind their orthorexia.

Orthorexia nervosa is not recognized as an eating disorder by the American Psychiatric Association, and so is not mentioned as an official diagnosis in the widely used Diagnostic and Statistical Manual of Mental Disorders (DSM).

Signs and symptoms

Symptoms of orthorexia nervosa include "obsessive focus on food choice, planning, purchase, preparation, and consumption; food regarded primarily as source of health rather than pleasure; distress or disgust when in proximity to prohibited foods; exaggerated faith that inclusion or elimination of particular kinds of food can prevent or cure disease or affect daily well-being; periodic shifts in dietary beliefs while other processes persist unchanged; moral judgment of others based on dietary choices; body image distortion around sense of physical "impurity" rather than weight; persistent belief that dietary practices are health-promoting despite evidence of malnutrition."

Cause

There has been no investigation into whether there may be a biological cause specific to orthorexia nervosa. It may be a food-centered manifestation of obsessive-compulsive disorder, which has a lot to do with control.

Diagnosis

In 2016, formal criteria for orthorexia were proposed in the peer-reviewed journal Eating Behaviors by Thom Dunn and Steven Bratman. These criteria are as follows:

Criterion A. Obsessive focus on "healthy" eating, as defined by a dietary theory or set of beliefs whose specific details may vary; marked by exaggerated emotional distress in relationship to food choices perceived as unhealthy; weight loss may ensue, but this is conceptualized as an aspect of ideal health rather than as the primary goal. As evidenced by the following:

  1. Compulsive behavior and/or mental preoccupation regarding affirmative and restrictive dietary practices believed by the individual to promote optimum health. (Footnotes to this criterion add: Dietary practices may include use of concentrated "food supplements." Exercise performance and/or fit body image may be regarded as an aspect or indicator of health.)
  2. Violation of self-imposed dietary rules causes exaggerated fear of disease, sense of personal impurity and/or negative physical sensations, accompanied by anxiety and shame.
  3. Dietary restrictions escalate over time, and may come to include elimination of entire food groups and involve progressively more frequent and/or severe "cleanses" (partial fasts) regarded as purifying or detoxifying. This escalation commonly leads to weight loss, but the desire to lose weight is absent, hidden or subordinated to ideation about healthy food.

Criterion B. The compulsive behavior and mental preoccupation becomes clinically impairing by any of the following:

  1. Malnutrition, severe weight loss or other medical complications from restricted diet
  2. Intrapersonal distress or impairment of social, academic or vocational functioning secondary to beliefs or behaviors about healthy diet
  3. Positive body image, self-worth, identity and/or satisfaction excessively dependent on compliance with self-defined "healthy" eating behavior.

A diagnostic questionnaire, the ORTO-15, has been developed for orthorexia sufferers, similar to questionnaires for other eating disorders. However, Dunn and Bratman critique this survey tool as lacking appropriate internal and external validation.

Epidemiology

Results across scientific findings have yet to reach a definitive conclusion on whether nutrition students and professionals are at higher risk than other population subgroups, due to differing results in the research literature. There are only a few notable scientific works that, in an attempt to explore the breadth and depth of this still vaguely understood illness, have tried to identify which groups in society are most vulnerable to its onset. These include a 2008 German study, which based its research on the widespread suspicion that the most nutritionally informed, such as university nutrition students, are a potential high-risk group for eating disorders, due to a substantial accumulation of knowledge about food and its relationship to health; the idea being that the more one knows about health, the more likely an unhealthy fixation on being healthy can develop. This study also inferred that orthorexic tendencies may even fuel a desire to study the science, indicating that many within this field might suffer from the disorder before commencing the course. However, the results showed that the students in the study, at the start of their degree, did not have higher orthorexic values than other non-nutrition university students, and the report concluded that further research is needed to clarify the relationship between food education and the onset of ON.

Similarly, in a Portuguese study on tertiary nutrition students, the participants' orthorexic scores (according to the ORTO-15 diagnostic questionnaire) actually decreased as they progressed through their course, and the overall risk of developing an eating disorder was an insignificant 4.2 percent. The participants also answered questionnaires to provide insight into their eating behaviours and attitudes, and although this study found that nutrition and health-science students tend to have more restrictive eating behaviours, it found no evidence that these students have "more disturbed or disordered eating patterns than other students". These two studies conclude that greater understanding of food is not necessarily a risk factor for ON, explaining that the data gathered suggest dietetics professionals are not at significant risk of it.

However, these epidemiologic studies have been critiqued as using a fundamentally flawed survey tool that inflates prevalence rates. Scholars have questioned both the reliability and validity of the ORTO-15.

Most scientific findings tend to agree, however, that young adults and adolescents are highly susceptible to developing eating disorders. One study found that there was no relationship between BOT score and college major, which may indicate the prevalence of mental health issues and eating disorders on college campuses and that health and science majors are no longer the only ones affected. More studies have also been conducted on the link between increased Instagram use and orthorexia nervosa. The social-media-based healthy eating community has recently grown in popularity, especially on platforms such as Instagram, and the hashtag #food is one of the top 25 most popular hashtags on Instagram. A study that investigated this relationship found that increased use of Instagram correlated with symptoms of ON, with no other social media platform having the same effect. With young adults and adolescents making up the majority of social media users, exposure to this type of content can lead to the development of unhealthy behavior.

History

In a 1997 article in the magazine Yoga Journal, the American physician Steven Bratman coined the term "orthorexia nervosa" from the Greek ορθο- (ortho, "right" or "correct"), and όρεξις (orexis, "appetite"), literally meaning 'correct appetite', but in practice meaning 'correct diet'. The term is modeled on anorexia, literally meaning "without appetite", as used in the definition of the condition anorexia nervosa. (In both terms, "nervosa" indicates an unhealthy psychological state.) Bratman described orthorexia as an unhealthy fixation with what the individual considers to be healthy eating. Beliefs about what constitutes healthy eating commonly originate in one or another dietary theory such as raw foods veganism or macrobiotics, but are then taken to extremes, leading to disordered eating patterns and psychological and/or physical impairment. Bratman based this proposed condition on his personal experiences in the 1970s, as well as behaviors he observed among his patients in the 1990s. In 2000, Bratman, with David Knight, authored the book Health Food Junkies, which further expanded on the subject.

Following the publication of the book, in 2004 a team of Italian researchers from La Sapienza University of Rome published the first empirical study attempting to develop a tool to measure the prevalence of orthorexia, known as the ORTO-15.

In 2015, responding to news articles in which the term orthorexia is applied to people who merely follow a non-mainstream theory of healthy eating, Bratman specified the following: "A theory may be conventional or unconventional, extreme or lax, sensible or totally wacky, but, regardless of the details, followers of the theory do not necessarily have orthorexia. They are simply adherents of a dietary theory. The term 'orthorexia' only applies when an eating disorder develops around that theory." Bratman elsewhere clarifies that with a few exceptions, most common theories of healthy eating are followed safely by the majority of their adherents; however, "for some people, going down the path of a restrictive diet in search of health may escalate into dietary perfectionism." Karin Kratina, PhD, writing for the National Eating Disorders Association, summarizes this process as follows: "Eventually food choices become so restrictive, in both variety and calories, that health suffers – an ironic twist for a person so completely dedicated to healthy eating."

Although orthorexia is not recognized as a mental disorder by the American Psychiatric Association, and it is not listed in the DSM-5, as of January 2016, four case reports and more than 40 other articles on the subject have been published in a variety of peer-reviewed journals internationally. According to a study published in 2011, two-thirds of a sample of 111 Dutch-speaking eating disorder specialists felt they had observed the syndrome in their clinical practice.

According to the Macmillan English Dictionary, the word is entering the English lexicon. The concept of orthorexia as a newly developing eating disorder has attracted significant media attention in the 21st century.

Orthorexia and other disorders

Orthorexia differs from anorexia and bulimia in its relationship to food. Instead of focusing on food intake in an attempt to lose weight and eat less, orthorexia is an "obsession about the quality of food intake" and is fueled by a feeling of achieving perfection and purity by only consuming "healthy" foods.

Orthorexic behaviors can often lead to malnutrition and weight loss, and the condition is often associated with anorexia nervosa. Studies have also shown that obsessive-compulsive tendencies are linked to the development of orthorexia, and some researchers suggest that orthorexia should be diagnosed as OCD because it is driven by an obsession with attaining a perfect diet.

Heuristic

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Heuristic

A heuristic (/hjʊˈrɪstɪk/; from Ancient Greek εὑρίσκω (heurískō) 'I find, discover'), or heuristic technique, is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Examples that employ heuristics include using trial and error, a rule of thumb or an educated guess.

Overview

Heuristics are the strategies derived from previous experiences with similar problems. These strategies depend on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines and abstract issues. When an individual applies a heuristic in practice, it generally performs as expected. However, it can alternatively create systematic errors.

The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems. In mathematics, some common heuristics involve the use of visual representations, additional assumptions, forward/backward reasoning and simplification. Here are a few commonly used heuristics from George Pólya's 1945 book, How to Solve It:

  • If you are having difficulty understanding a problem, try drawing a picture.
  • If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
  • If the problem is abstract, try examining a concrete example.
  • Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
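
As a minimal illustration of the trial-and-error heuristic mentioned above, the following sketch guesses values for a variable until one satisfies a simple equation; the equation, the candidate range, and the tolerance are assumptions chosen for the example, not anything prescribed by Pólya.

    # Minimal sketch of solving for a variable by trial and error
    # (illustrative only; the equation and candidate range are assumptions).
    def solve_by_trial_and_error(f, candidates, tolerance=1e-6):
        """Return the first candidate x for which f(x) is close enough to zero."""
        for x in candidates:
            if abs(f(x)) <= tolerance:  # good enough: stop searching
                return x
        return None                     # no satisfactory candidate found

    # Usage: find x such that x**2 - 9 == 0 by trying the integers -10..10.
    root = solve_by_trial_and_error(lambda x: x**2 - 9, range(-10, 11))
    print(root)  # -> -3 (the first candidate that satisfies the equation)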

In psychology, heuristics are simple, efficient rules, learned or inculcated by evolutionary processes, that have been proposed to explain how people make decisions, come to judgements, and solve problems, typically when facing complex problems or incomplete information. Researchers test whether people use those rules with various methods. These rules work well under most circumstances, but in certain cases they can lead to systematic errors or cognitive biases.

History

The study of heuristics in human decision-making was developed in the 1970s and 1980s by the psychologists Amos Tversky and Daniel Kahneman, although the concept had originally been introduced by the Nobel laureate Herbert A. Simon, whose primary object of research was problem solving and who showed that we operate within what he called bounded rationality. He coined the term satisficing, which denotes a situation in which people seek solutions, or accept choices or judgements, that are "good enough" for their purposes although they could be optimised.

Rudolf Groner analysed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence, proposing a cognitive style "heuristic versus algorithmic thinking", which can be assessed by means of a validated questionnaire.

Adaptive toolbox

Gerd Gigerenzer and his research group argued that models of heuristics need to be formal to allow for predictions of behavior that can be tested. They study the fast and frugal heuristics in the "adaptive toolbox" of individuals or institutions, and the ecological rationality of these heuristics; that is, the conditions under which a given heuristic is likely to be successful. The descriptive study of the "adaptive toolbox" is done by observation and experiment, while the prescriptive study of ecological rationality requires mathematical analysis and computer simulation. Heuristics – such as the recognition heuristic, the take-the-best heuristic and fast-and-frugal trees – have been shown to be effective in predictions, particularly in situations of uncertainty. It is often said that heuristics trade accuracy for effort, but this is only the case in situations of risk. Risk refers to situations where all possible actions, their outcomes and probabilities are known. In the absence of this information, that is, under uncertainty, heuristics can achieve higher accuracy with lower effort. This finding, known as a less-is-more effect, would not have been found without formal models. The valuable insight of this program is that heuristics are effective not despite their simplicity but because of it. Furthermore, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organisations rely on heuristics in an adaptive way.
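
To make the idea of a formal, testable heuristic concrete, here is a minimal sketch of a take-the-best style decision rule; the cue names, their ordering by validity, and the binary cue values are hypothetical and serve only to illustrate the stopping rule (decide on the first cue that discriminates, ignore the rest).

    # Minimal sketch of a take-the-best style decision rule (illustrative only).
    # Assumptions: binary cues ordered by hypothetical validity; the first cue
    # that discriminates between the two options decides; otherwise guess.
    import random

    # Hypothetical cues for judging which of two cities is larger,
    # ordered from most to least valid (names and values are invented).
    CUES = ["has_major_airport", "is_capital", "has_university"]

    def take_the_best(option_a, option_b):
        """Return 'A' or 'B' based on the first discriminating cue."""
        for cue in CUES:
            a, b = option_a.get(cue, 0), option_b.get(cue, 0)
            if a != b:                    # cue discriminates: decide and stop
                return "A" if a > b else "B"
        return random.choice(["A", "B"])  # no cue discriminates: guess

    # Usage with made-up cue profiles.
    city_a = {"has_major_airport": 1, "is_capital": 0, "has_university": 1}
    city_b = {"has_major_airport": 1, "is_capital": 1, "has_university": 1}
    print(take_the_best(city_a, city_b))  # -> 'B' (decided by 'is_capital')

Because such a model commits to a precise information-use and stopping rule, its predictions can be compared against behavioral data, which is the point of formalising heuristics.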

Cognitive-experiential self-theory

Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, cognitive-experiential self-theory (CEST) is also an adaptive view of heuristic processing. CEST distinguishes two systems that process information. At times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally, and emotionally. From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.

Attribute substitution

In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness. According to this theory, when somebody makes a judgement (of a "target attribute") that is computationally complex, a more easily calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without being aware of this happening. This theory explains cases where judgements fail to show regression toward the mean. Heuristics can be considered to reduce the complexity of clinical judgments in health care.

Psychology

Informal models of heuristics

  • Affect heuristic — Mental shortcut which uses emotion to influence the decision. Affect plays the lead role in making the decision or solving the problem quickly or efficiently. It is used while judging the risks and benefits of something, depending on the positive or negative feelings that people associate with a stimulus. It can also be considered a "gut decision": if the gut feeling toward something is positive, the benefits are judged to be high and the risks low.
  • Anchoring and adjustment — Describes the common human tendency to rely more heavily on the first piece of information offered (the "anchor") when making decisions. For example, in a study done with children, the children were told to estimate the number of jellybeans in a jar. Groups of children were given either a high or low "base" number (anchor). Children estimated the number of jellybeans to be closer to the anchor number that they were given.
  • Availability heuristic — A mental shortcut that occurs when people make judgements about the probability of events by the ease with which examples come to mind. For example, in a 1973 Tversky & Kahneman experiment, the majority of participants reported that there were more words in the English language that start with the letter K than for which K was the third letter. There are actually twice as many words in the English language that have K as the third letter as those that start with K, but words that start with K are much easier to recall and bring to mind.
  • Balance Heuristic — Applies when an individual balances the negative and positive effects of a decision, making the choice obvious.
  • Base Rate Heuristic — When a decision involves probability, this is a mental shortcut that uses relevant base-rate data to determine the probability of an outcome occurring. When using this heuristic, individuals commonly misjudge the likelihood of a situation. For example, if there is a test for a disease with an accuracy of 90%, people may think there is a 90% chance they have the disease, even though the disease only affects 1 in 500 people (see the worked sketch after this list).
  • Common Sense Heuristic — Used frequently by individuals when the potential outcomes of a decision appear obvious. For example, when your television remote's batteries go flat, you replace them.
  • Contagion heuristic — Follows the Law of Contagion or Similarity. It leads people to avoid others who are viewed as "contaminated", and to seek out objects that have been associated with what seems good. Some things viewed as harmful may not really be so, which can lead to irrational thinking on the part of the observer.
  • Default Heuristic — In real-world settings, consumers commonly apply this heuristic by selecting the default option, regardless of whether that option matches their preference.
  • Educated Guess Heuristic — When an individual responds to a decision using relevant information they have stored relating to the problem.
  • Effort heuristic — The worth of an object is determined by the amount of effort put into its production. Objects that took longer to produce are seen as more valuable, while objects that took less time are deemed less valuable. It also applies to how much effort is put into obtaining the object: the same item will seem less valuable if it was found on the side of the street than if it was earned through work.
  • Escalation of commitment — Describes the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the cost, starting today, of continuing the decision outweighs the expected benefit. This is related to the sunk cost fallacy.
  • Fairness Heuristic — Applies to the reaction of an individual to a decision from an authoritative figure. If the decision is enacted in a fair manner the likelihood of the individual to comply voluntarily is higher than if it is unfair.
  • Familiarity heuristic — A mental shortcut applied to various situations in which individuals assume that the circumstances underlying the past behavior still hold true for the present situation and that the past behavior thus can be correctly applied to the new situation. Especially prevalent when the individual experiences a high cognitive load.
  • Naïve diversification — When asked to make several choices at once, people tend to diversify more than when making the same type of decision sequentially.
  • Peak–end rule — a person's subjective perceptions during the most intense and final moments of an event are averaged together into a single judgment. For example, a person might judge the difficulty of a workout by taking into consideration only the most demanding part of the workout (e.g., Tabata sprints) and what happens at the very end (e.g., a cool-down). In this way, a difficult workout such as the one described here could be perceived as "easier" than a more relaxed workout that did not vary in intensity (e.g., 45 minutes of cycling in aerobic zone 3, without cool-down).
  • Representativeness heuristic — A mental shortcut used when making judgements about the probability of an event under uncertainty. Or, judging a situation based on how similar the prospects are to the prototypes the person holds in his or her mind. For example, in a 1982 Tversky and Kahneman experiment, participants were given a description of a woman named Linda. Based on the description, it was likely that Linda was a feminist. Eighty to ninety percent of participants, choosing from two options, chose that it was more likely for Linda to be a feminist and a bank teller than only a bank teller. The likelihood of two events cannot be greater than that of either of the two events individually. For this reason, the representativeness heuristic is exemplary of the conjunction fallacy.
  • Scarcity heuristic — As in economics, the scarcer an object or event is, the more value is attributed to the object or event. The lack of abundance is an indicator of value and provides a mental shortcut that influences the subjective valuation based on how easily the thing might be replaced or lost to competitors. The scarcity heuristic is a cognitive rule that the more difficult it is to acquire an item, the more value that item must have. In many situations we use an item’s availability, its perceived abundance, to quickly estimate quality and/or utility. This can lead to systematic judgement errors or cognitive bias.
  • Simulation heuristic — Simplified mental strategy in which people determine the likelihood of an event based on how easy it is to mentally picture the event happening. People regret events that are easier to imagine more than ones that are harder to imagine. People are also thought to use this heuristic to predict the likelihood of another person's behavior. This suggests that people constantly simulate their surroundings in order to predict the likelihood of events around them. It is believed that people do this by mentally undoing events they have experienced and then running mental simulations of the events with the corresponding input values of the altered model.
  • Social proof — also known as the informational social influence which was named by Robert Cialdini in his 1984 book Influence. It is where people copy the actions of others. It is more prominent when people are uncertain how to behave, especially in ambiguous social situations.
  • Working Backward Heuristic — When an individual assumes they have already solved a problem they work backwards in order to find how to achieve the solution they originally figured out.
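
As referenced in the Base Rate Heuristic item above, the following worked sketch shows why a 90%-accurate test for a disease that affects 1 in 500 people does not imply a 90% chance of having the disease; treating "90% accuracy" as both the sensitivity and the specificity is a simplifying assumption made for this example.

    # Worked base-rate sketch (illustrative; '90% accuracy' is treated as both
    # sensitivity and specificity, a simplifying assumption for the example).
    prevalence = 1 / 500   # the disease affects 1 in 500 people
    sensitivity = 0.9      # P(test positive | disease)
    specificity = 0.9      # P(test negative | no disease)

    # Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
    p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    p_disease_given_positive = sensitivity * prevalence / p_positive

    print(round(p_disease_given_positive, 3))  # -> 0.018, i.e. roughly 2%, not 90%

Neglecting the base rate (the 1-in-500 prevalence) is exactly the misjudgement the heuristic produces.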

Formal models of heuristics

Cognitive maps

Heuristics were also found to be used in the manipulation and creation of cognitive maps. Cognitive maps are internal representations of our physical environment, particularly associated with spatial relationships. These internal representations are used by our memory as a guide in our external environment. It was found that when questioned about map imagery, distances, etc., people commonly made distortions to images. These distortions took the form of regularising images (i.e., images are represented as more like pure abstract geometric shapes, even though they are irregular in shape).

There are several ways that humans form and use cognitive maps, with visual intake being an especially key part of mapping: the first is by using landmarks, whereby a person uses a mental image to estimate a relationship, usually distance, between two objects. The second is route-road knowledge, and is generally developed after a person has performed a task and is relaying the information of that task to another person. The third is a survey, whereby a person estimates a distance based on a mental image that, to them, might appear like an actual map. This image is generally created when a person's brain begins making image corrections. These are presented in five ways:

  1. Right-angle bias: when a person straightens out an image, like mapping an intersection, and begins to give everything 90-degree angles, when in reality it may not be that way.
  2. Symmetry heuristic: when people tend to think of shapes, or buildings, as being more symmetrical than they really are.
  3. Rotation heuristic: when a person takes a naturally (realistically) distorted image and straightens it out for their mental image.
  4. Alignment heuristic: similar to the previous, where people align objects mentally to make them straighter than they really are.
  5. Relative-position heuristic: people do not accurately distance landmarks in their mental image based on how well they remember them.

Another method of creating cognitive maps is by means of auditory intake based on verbal descriptions. Using a mapping based on one person's visual intake, another person can create a mental image, such as directions to a certain location.

Philosophy

A heuristic device is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y.

A good example is a model that, as it is never identical with what it models, is a heuristic device to enable understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in this sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. This means that the "ideal city" as depicted in The Republic is not given as something to be pursued, or to present an orientation-point for development. Rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one opted for certain principles and carried them through rigorously.

Heuristic is also often used as a noun to describe a rule-of-thumb, procedure, or method. Philosophers of science have emphasised the importance of heuristics in creative thought and the construction of scientific theories. (See The Logic of Scientific Discovery by Karl Popper; and philosophers such as Imre Lakatos, Lindley Darden, William C. Wimsatt and others.)

Law

In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.

The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects. For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary deadline is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.

The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the patent application was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking-age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.

Stereotyping

Stereotyping is a type of heuristic that people use to form opinions or make judgements about things they have never seen or experienced. Stereotypes work as a mental shortcut to assess everything from the social status of a person (based on their actions) to whether a plant is a tree, based on the assumption that it is tall, has a trunk and has leaves (even though the person making the evaluation might never have seen that particular type of tree before).

Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion (1922), are the pictures we have in our heads that are built around experiences as well as what we are told about the world.

Artificial intelligence

A heuristic can be used in artificial intelligence systems while searching a solution space. The heuristic is derived by using some function that is put into the system by the designer, or by adjusting the weight of branches based on how likely each branch is to lead to a goal node.
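
As a concrete sketch of the idea above, the snippet below uses a designer-supplied heuristic function to guide a greedy best-first search toward a goal node; the graph, the node names, and the heuristic estimates are invented for illustration and are not drawn from any particular system.

    # Minimal sketch of greedy best-first search guided by a heuristic
    # (illustrative only; the graph and heuristic values are invented).
    import heapq

    # Hypothetical graph: node -> list of neighbouring nodes.
    GRAPH = {
        "A": ["B", "C"],
        "B": ["D"],
        "C": ["D", "E"],
        "D": ["GOAL"],
        "E": ["GOAL"],
        "GOAL": [],
    }

    # Designer-supplied heuristic: estimated remaining "distance" to the goal.
    H = {"A": 4, "B": 3, "C": 2, "D": 1, "E": 1, "GOAL": 0}

    def greedy_best_first(start, goal):
        """Always expand the frontier node the heuristic scores closest to the goal."""
        frontier = [(H[start], start, [start])]   # (heuristic value, node, path)
        visited = set()
        while frontier:
            _, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in visited:
                continue
            visited.add(node)
            for neighbour in GRAPH[node]:
                heapq.heappush(frontier, (H[neighbour], neighbour, path + [neighbour]))
        return None

    print(greedy_best_first("A", "GOAL"))  # -> ['A', 'C', 'D', 'GOAL']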

Closure (psychology)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Closure_(psychology)

Closure or need for closure (NFC) (used interchangeably with need for cognitive closure (NFCC)) are social psychological terms that describe an individual's desire for a clear, firm answer to a question and an aversion toward ambiguity. 

The term "need" denotes a motivated tendency to seek out information. The need for closure is the motivation to find an answer to an ambiguous situation. This motivation is enhanced by the perceived benefits of obtaining closure, such as the increased ability to predict the world and a stronger basis for action. This motivation is also enhanced by the perceived costs of lacking closure, such as dealing with uncertainty. A sense of closure is not usually possible with ambiguous loss, such as a missing person, and the hoped-for benefits, such as a sense of relief after the death of a person who inflicted harm, are not necessarily obtained. Because of this mismatch between what individuals hope will happen if they achieve closure and what they actually experience, the idea of getting closure has been described as a myth.

The level of the need for cognitive closure is a fairly stable individual characteristic. It can affect what information individuals seek out and how they process it. This need can be affected by situational factors. For example, in the presence of circumstances that increase the need for closure, individuals are more likely to use simple cognitive structures to process information.

According to Kruglanski et al., need for closure exerts its effects via two general tendencies: the urgency tendency (the inclination to attain closure as quickly as possible) and the permanence tendency (the tendency to maintain it for as long as possible). Together, these tendencies may produce the inclinations to seize and then freeze on early judgmental cues, reducing the extent of information processing and hypothesis generation and introducing biases in thinking.

Need for Closure Scale

The need for closure in social psychology is thought to be a fairly stable dispositional characteristic that can, nonetheless, be affected by situational factors. The Need for Closure Scale (NFCS) was developed by Arie Kruglanski, Donna Webster, and Adena Klem in 1993 to operationalize this construct, and it is presented as a unidimensional instrument possessing strong discriminant and predictive validity.

People who score high on the need for closure scale are more likely to exhibit impression primacy effects and correspondence bias, make stereotypical judgments, assimilate new information to existing, active beliefs, and, in the presence of prior information, resist persuasion. Someone rating low on need for closure will express more ideational fluidity and creative acts. Items on the scale include statements such as "I think that having clear rules and order at work is essential to success," and "I do not like situations that are uncertain." Items such as "Even after I've made up my mind about something, I am always eager to consider a different opinion," and "I like to have friends who are unpredictable" are reverse scored.

Composed of 42 items, the scale has been used in numerous research studies and has been translated into multiple languages. Although Webster and Kruglanski (1994) treated the Need for Closure Scale as unidimensional (i.e., as measuring a single factor), the scale actually contains two orthogonal factors, decisiveness and need for structure. Thus, using a total scale score can overlook effects for each factor and complicate interpretations. In 2007, Roets and Van Hiel tried to resolve this issue by revising the scale so that it would measure a single underlying construct. They developed a set of new decisiveness items that provided a viable alternative to the old Decisiveness subscale of the NFCS, which was poorly related to the other NFCS facet scales and had questionable validity. The new items were developed with explicit reference to decisiveness but formulated in such a way that they relate to the need rather than to the ability to decide. In 2011, Roets and Van Hiel created an abridged and empirically validated NFC scale consisting of only 15 items from the original NFCS.
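
To illustrate how a questionnaire with reverse-scored items is typically totalled, here is a minimal scoring sketch; the item indices, the 6-point response format, and the sample responses are assumptions for the example, not the published NFCS scoring key.

    # Minimal sketch of scoring a questionnaire with reverse-scored items
    # (illustrative only; item indices and the 6-point format are assumptions,
    # not the published NFCS scoring key).
    def score_scale(responses, reverse_items, scale_max=6):
        """Sum responses on a 1..scale_max format, flipping reverse-scored items."""
        total = 0
        for i, answer in enumerate(responses):
            if i in reverse_items:
                answer = (scale_max + 1) - answer  # e.g. 6 -> 1, 1 -> 6
            total += answer
        return total

    # Usage with a made-up 5-item response vector; items 2 and 4 are reverse scored.
    responses = [5, 4, 2, 6, 1]
    print(score_scale(responses, reverse_items={2, 4}))  # -> 5 + 4 + 5 + 6 + 6 = 26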

NFCS items correlate positively with authoritarianism, intolerance of ambiguity, dogmatism, need for order and structure and negatively with cognitive complexity and impulsivity, among several other cognitive tools and personality traits.

High NFC scores consistently correlate with items on the C-Scale (conservatism) as well as other measures of political and social conservatism.

Need to avoid

Functionally opposite to the need for closure is the need to avoid closure. The need to avoid closure reflects the desire to suspend judgmental commitment. It contains the subcategories of specific and non-specific need to avoid closure. Avoidance of specific closure reflects the desire to avoid specific answers to one's questions. The non-specific need to avoid closure reflects a desire to avoid definite answers more generally, irrespective of whether the new knowledge points to a conclusion having positive or negative implications for the individual.

The need to avoid closure may stem from the perceived costs of possessing closure (e.g., envisioned penalties for an erroneous closure or perceived drawbacks of actions implied by closure) and the perceived benefits of lacking closure (e.g., immunity from possible criticism of any given closure). The need to avoid closure is controlled by the desire to avoid negative consequences of achieving closure of a situation or to continue the benefits of not closing but elongating a situation.

The need and avoidance of closure are conceptualized as ends of a continuum ranging from strong strivings for closure to strong resistance of closure. This is applied in the NFC Scale.

Lack of

The lack of closure leaves a situation in ambiguity. People high in need for closure seek to avoid this ambiguity at all costs, whereas people high in need to avoid closure strive to make situations more ambiguous. Some perceived benefits of cognitive closure may relate to predictability, the basis for action, or the social status accorded to the possessors of knowledge (i.e., "experts"). Similarly, some perceived costs of lacking closure may relate to the additional time and effort required to attain closure, or the unpleasantness of the process whereby closure must be reached. Occasionally, however, lack of closure may be perceived to offer various advantages, such as freedom from a constraining commitment, neutrality in an acrimonious dispute, the maintenance of a romantic mystery, and so on. Though lack of closure is generally thought of as negative, it is clear that closure and lack of closure have positive or negative implications depending on the person and the situation surrounding them.

Implications

A need for cognitive closure may occur while engaged in goal-driven or goal-motivated cognitive functions (e.g., attention control, memory recall, information selection and processing, cognitive inhibition, etc.). Ideally, people should attempt to acquire new knowledge to satisfy questions regarding particular issues (specific cognitive closure) irrespective of whether that knowledge points to a conclusion having positive or negative implications for them (non-specific cognitive closure). But because urgency and permanence are central to the motivational core of this overall process, individuals (or groups) may be compelled, consciously or unconsciously, to obtain information prematurely and irrespective of content.

A high need for cognitive closure might then invite bias in:

  1. selecting the most relevant information one should attend to for increasing chances of adaptation
  2. initiating and sustaining cognitive manipulations that are required to achieve particular outcomes
  3. making judgments and assessments of input information
  4. weighing information during the course of decision-making

For example, the level of NFCC can influence decision-making strategies used by an individual. In a study by Choi et al. that manipulated NFCC, the authors found that a higher NFCC was associated with a preference for the faster "attribute-based search", which involves examining all available alternatives on one attribute and then moving on to the next attribute. Individuals with a lower NFCC, in contrast, used the "alternative-based search": they examine all attributes of one alternative, then move on to the next alternative. Thus, studying NFCC has significant implications for consumer buying behavior.
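
As a concrete sketch of the two search strategies described above, the snippet below walks a small decision matrix attribute-by-attribute versus alternative-by-alternative; the product names, attributes, and ratings are hypothetical and only illustrate the order in which information is examined.

    # Illustrative sketch of attribute-based vs. alternative-based information
    # search over a small decision matrix (products and ratings are invented).
    OPTIONS = {
        "Phone A": {"price": 3, "battery": 5, "camera": 4},
        "Phone B": {"price": 5, "battery": 3, "camera": 4},
        "Phone C": {"price": 4, "battery": 4, "camera": 2},
    }
    ATTRIBUTES = ["price", "battery", "camera"]

    def attribute_based_order():
        """Examine every alternative on one attribute before moving to the next attribute."""
        return [(attribute, name) for attribute in ATTRIBUTES for name in OPTIONS]

    def alternative_based_order():
        """Examine every attribute of one alternative before moving to the next alternative."""
        return [(name, attribute) for name in OPTIONS for attribute in ATTRIBUTES]

    print(attribute_based_order()[:3])    # all three phones compared on 'price' first
    print(alternative_based_order()[:3])  # all attributes of 'Phone A' examined first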

Need for closure has also been found to have a role in race- and gender-based prejudice. Roets describes a conceptual fit between Allport's "motivated cognitive style" of individuals who exhibit prejudice and Kruglanski and Webster's concept of high-NFCC individuals, such that both display the urgency tendency (i.e., the desire for quick, definite answers) and the permanence tendency (i.e., the perseverance of the obtained answer in spite of contradictory information). Thus, NFC provides a strong empirical base for Allport's hypothesized underlying cognitive style of prejudiced individuals.

A high need also induces the tendency to form knowledge more quickly, tying into other concepts such as a preference for autocracy, i.e. "hard" forms of influence that motivate the targets to comply with the agents' demands quickly via the promise of positive consequences or the threat of negative consequences, rather than "soft" forms of influence that might use extended argumentation or persuasion.

Additionally, and especially in those with strong needs for certainty (as measured on the NFC Scale), the impulse to achieve cognitive closure may sometimes produce or evoke mood instability and/or truncated perceptions of one's available behavioral choices, should some newly acquired information challenge preconceptions that they had long considered certain, permanent and inviolate (e.g., certain religious or ethical views and values).

Thus it is apparent that the need for cognitive closure may have important implications for both personal and inter-personal thoughts and actions, including some related to educational processes and school learning.

In education

Formal education environments, such as elementary and secondary schools, present opportunities for learners to acquire new knowledge and skills, and to achieve deep, domain-specific conceptual mastery which, through well-designed pedagogical guidance and academic study, may enhance future career readiness, civic engagement, and general well-being. However, although it is understood that the basic principles of learning assert the importance of attending to students’ prior knowledge, fostering conceptual understanding, and cultivating metacognitive awareness, students must also become engaged and be willing to tolerate and cognitively work through the intellectual ambiguity often associated with exposure to novel information and tasks.

Yet for students who have a high need for cognitive closure, this phenomenon may inadvertently inhibit cognitive functions and processes essential to learning, so that they can maintain the prior certainty and/or perceived permanence of personally or socially important ideas, even if those ideas or that knowledge is distinctly unrelated to the specific content or information being presented in the classroom. In instances such as these, an individual's desire for cognitive closure in another area may outweigh his or her motivation to expend cognitive resources toward learning new information. As a result, the student may appear uninterested and susceptible to under-achieving, e.g. earning poor grades or not performing to expected levels.

Unfortunately, in the absence of understanding and consideration of how need for cognitive closure may influence academic and/or achievement motivation, educators may erroneously conclude that a student does not have a desire to learn or that she/he has a cognitive, psychological, intellectual, or behavioral deficiency that is impeding the learning process. This is not to suggest that need for cognitive closure is a suitable explanation for all learning problems; however, in working with students who appear to be experiencing learning challenges manifested through amotivation or low motivation, it would not be unreasonable to explore need for cognitive closure as a potential factor.

Research

Individuals scoring high on the NFCS are more likely to attempt to draw closure by relying on incipient cues, and the first-encountered apparent fit. The need for closure is also said to predispose a very narrow or shallow information search, along with a higher tendency to use cognitive heuristics, when seeking solutions. (Van Hiel and Mervielde, 2003)

In studies on creativity, individuals with high need-for-closure ratings had low creativity scores. Those low in need-for-closure more frequently produced novel solutions that motivated and inspired others in their groups, and the outcomes of the projects in which they participated were rated as correspondingly more productive.

Most research on the need for closure has investigated its relation to social stimuli. However, recent research suggests that it may also predict responses to non-social stimuli. In particular, the need for closure predicts an evaluative bias against deviant non-social stimuli (e.g., the letter "A" presented in a category of letter "B"s).

"Closure" has also been used more loosely to refer to the outcome of an experience which, by virtue of its completion, demonstrates a therapeutic value. Legal scholars have linked "closure" to "catharsis" and "satisfaction" and at times the legal system may be enlisted into an individual's desire for the cessation of uncertainty. In the case of the death penalty, for example, victims seeking "closure" may adopt effective strategies as diverse as retribution, on one hand, and forgiveness on the other.

Magnet school

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Magnet_sc...