
Tuesday, June 26, 2018

Treatment of cancer

From Wikipedia, the free encyclopedia
Cancer can be treated by surgery, chemotherapy, radiation therapy, hormonal therapy, targeted therapy (including immunotherapy such as monoclonal antibody therapy) and synthetic lethality. The choice of therapy depends upon the location and grade of the tumor and the stage of the disease, as well as the general state of the patient (performance status). A number of experimental cancer treatments are also under development. Under current estimates, two in five people will have cancer at some point in their lifetime.[1]

Complete removal of the cancer without damage to the rest of the body (that is, achieving cure with near-zero adverse effects) is the ideal goal of treatment and is often the goal in practice. Sometimes this can be accomplished by surgery, but the propensity of cancers to invade adjacent tissue or to spread to distant sites by microscopic metastasis often limits its effectiveness; and chemotherapy and radiotherapy can have a negative effect on normal cells.[2] Therefore, cure with nonnegligible adverse effects may be accepted as a practical goal in some cases; and besides curative intent, practical goals of therapy can also include (1) suppressing the cancer to a subclinical state and maintaining that state for years of good quality of life (that is, treating the cancer as a chronic disease), and (2) palliative care without curative intent (for advanced-stage metastatic cancers).

Because "cancer" refers to a class of diseases,[3][4] it is unlikely that there will ever be a single "cure for cancer" any more than there will be a single treatment for all infectious diseases.[5] Angiogenesis inhibitors were once thought to have potential as a "silver bullet" treatment applicable to many types of cancer, but this has not been the case in practice.[6]

Types of treatments

The treatment of cancer has undergone evolutionary changes as understanding of the underlying biological processes has increased. Tumor removal surgeries have been documented in ancient Egypt; hormone therapy and radiation therapy were developed in the late 19th century. Chemotherapy, immunotherapy, and newer targeted therapies are products of the 20th century. As new information about the biology of cancer emerges, treatments will be developed and modified to increase effectiveness, precision, survivability, and quality of life.

Surgery

In theory, non-hematological cancers can be cured if entirely removed by surgery, but this is not always possible. When the cancer has metastasized to other sites in the body prior to surgery, complete surgical excision is usually impossible. In the Halstedian model of cancer progression, tumors grow locally, then spread to the lymph nodes, then to the rest of the body. This has given rise to the popularity of local-only treatments such as surgery for small cancers. However, even small localized tumors are increasingly recognized as possessing metastatic potential.

Examples of surgical procedures for cancer include mastectomy for breast cancer, prostatectomy for prostate cancer, and lung cancer surgery for non-small cell lung cancer. The goal of the surgery can be either the removal of only the tumor, or the entire organ.[7] A single cancer cell is invisible to the naked eye but can regrow into a new tumor, a process called recurrence. For this reason, the pathologist will examine the surgical specimen to determine if a margin of healthy tissue is present, thus decreasing the chance that microscopic cancer cells are left in the patient.

In addition to removal of the primary tumor, surgery is often necessary for staging, e.g. determining the extent of the disease and whether it has metastasized to regional lymph nodes. Staging is a major determinant of prognosis and of the need for adjuvant therapy. Occasionally, surgery is necessary to control symptoms, such as spinal cord compression or bowel obstruction. This is referred to as palliative treatment.

Surgery may be performed before or after other forms of treatment. Treatment before surgery is often described as neoadjuvant. In breast cancer, the survival rate of patients who receive neoadjuvant chemotherapy is no different from that of patients treated after surgery.[8] Giving chemotherapy earlier allows oncologists to evaluate the effectiveness of the therapy, and may make removal of the tumor easier. However, the survival advantages of neoadjuvant treatment in lung cancer are less clear.[9]

Radiation therapy

Radiation therapy (also called radiotherapy, X-ray therapy, or irradiation) is the use of ionizing radiation to kill cancer cells and shrink tumors. Radiation therapy can be administered externally via external beam radiotherapy (EBRT) or internally via brachytherapy. The effects of radiation therapy are localised and confined to the region being treated. Radiation therapy injures or destroys cells in the area being treated (the "target tissue") by damaging their genetic material, making it impossible for these cells to continue to grow and divide. Although radiation damages both cancer cells and normal cells, most normal cells can recover from the effects of radiation and function properly. The goal of radiation therapy is to damage as many cancer cells as possible, while limiting harm to nearby healthy tissue. Hence, it is given in many fractions, allowing healthy tissue to recover between fractions.
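
As an illustrative aside (not part of the original article), the tissue-sparing effect of fractionation is commonly quantified with the linear-quadratic model of radiobiology. For n fractions of dose d, the biologically effective dose is

\[
\mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right)
\]

where the α/β ratio is low (roughly 3 Gy) for late-responding normal tissue and higher (roughly 10 Gy) for many tumors; these are conventional textbook values used here only for illustration. Delivering 60 Gy as 30 fractions of 2 Gy gives a tumor BED of 60(1 + 2/10) = 72 Gy against a normal-tissue BED of 60(1 + 2/3) = 100 Gy, whereas the same 60 Gy in 3 fractions of 20 Gy gives 180 Gy and 460 Gy respectively, so large fractions disproportionately increase the damage to healthy tissue.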

Radiation therapy may be used to treat almost every type of solid tumor, including cancers of the brain, breast, cervix, larynx, liver, lung, pancreas, prostate, skin, stomach and uterus, as well as soft tissue sarcomas; it is also used to treat leukemia and lymphoma. The radiation dose delivered to each site depends on a number of factors, including the radiosensitivity of the cancer type and whether there are tissues and organs nearby that may be damaged by radiation. Thus, as with every form of treatment, radiation therapy is not without its side effects.

Radiation therapy kills cancer cells by damaging their DNA, the molecules inside cells that carry genetic information and pass it from one generation to the next. It can damage DNA directly or create charged particles (free radicals) within the cells that in turn damage the DNA. Because most cancer cells grow and divide faster than the normal cells around them, these small breaks in the DNA affect the tumor disproportionately; normal cells can usually repair the damage and continue to function, whereas cancer cells often cannot. Radiation can nonetheless cause lasting harm to healthy tissue: exposure of the salivary glands, for example, can lead to dry mouth. The glands usually resume functioning after therapy, but rarely in the same fashion, and dry mouth caused by radiation can be a lifelong problem.[10]

For brain cancer, the specifics of a radiation therapy plan are based on several factors, including the type and size of the tumor and the extent of disease. External beam radiation is commonly used; the area irradiated typically includes the tumor and a surrounding margin, and for metastatic brain tumors radiation is sometimes given to the entire brain. Radiation may not be a treatment of choice if the tumor was diagnosed at a late stage or is located in a vulnerable place, since late-stage tumors require higher radiation exposure, which can be harmful to nearby organs.

Radiation also causes significant side effects when used in children aged 0–14. In adults, radiotherapy is an effective treatment, but its side effects can affect patients' daily living; in children it mostly causes long-term side effects such as hearing loss and blindness, and children who have received cranial radiotherapy are deemed at high risk for academic failure and cognitive delay. A study by Reddy A.T. found a significant decrease in IQ with higher radiation doses, specifically in children with brain tumours. For these reasons, radiation therapy is often not the best treatment for brain tumours, especially in young children, and alternative treatments such as surgical resection are available to decrease the occurrence of side effects.

Chemotherapy

Chemotherapy is the treatment of cancer with drugs ("anticancer drugs") that can destroy cancer cells. In current usage, the term "chemotherapy" usually refers to cytotoxic drugs which affect rapidly dividing cells in general, in contrast with targeted therapy (see below). Chemotherapy drugs interfere with cell division in various possible ways, e.g. with the duplication of DNA or the separation of newly formed chromosomes. Most forms of chemotherapy target all rapidly dividing cells and are not specific to cancer cells, although some degree of specificity may come from the inability of many cancer cells to repair DNA damage, while normal cells generally can. Hence, chemotherapy has the potential to harm healthy tissue, especially those tissues that have a high replacement rate (e.g. intestinal lining). These cells usually repair themselves after chemotherapy.

Because some drugs work better together than alone, two or more drugs are often given at the same time. This is called "combination chemotherapy"; most chemotherapy regimens are given in a combination.[11]

The treatment of some leukaemias and lymphomas requires the use of high-dose chemotherapy, and total body irradiation (TBI). This treatment ablates the bone marrow, and hence the body's ability to recover and repopulate the blood. For this reason, bone marrow, or peripheral blood stem cell harvesting is carried out before the ablative part of the therapy, to enable "rescue" after the treatment has been given. This is known as autologous stem cell transplantation.

Targeted therapies

Targeted therapy, which first became available in the late 1990s, has had a significant impact in the treatment of some types of cancer, and is currently a very active research area. This constitutes the use of agents specific for the deregulated proteins of cancer cells. Small molecule targeted therapy drugs are generally inhibitors of enzymatic domains on mutated, overexpressed, or otherwise critical proteins within the cancer cell. Prominent examples are the tyrosine kinase inhibitors imatinib (Gleevec/Glivec) and gefitinib (Iressa).

Monoclonal antibody therapy is another strategy in which the therapeutic agent is an antibody which specifically binds to a protein on the surface of the cancer cells. Examples include the anti-HER2/neu antibody trastuzumab (Herceptin) used in breast cancer, and the anti-CD20 antibody rituximab, used in a variety of B-cell malignancies.

Targeted therapy can also involve small peptides as "homing devices" which can bind to cell surface receptors or to the affected extracellular matrix surrounding the tumor. Radionuclides attached to these peptides (e.g. RGDs) eventually kill the cancer cell if the nuclide decays in its vicinity. Oligo- or multimers of these binding motifs are of particular interest, since they can lead to enhanced tumor specificity and avidity.

Photodynamic therapy (PDT) is a ternary treatment for cancer involving a photosensitizer, tissue oxygen, and light (often using lasers[12]). PDT can be used as treatment for basal cell carcinoma (BCC) or lung cancer; PDT can also be useful in removing traces of malignant tissue after surgical removal of large tumors.[13]

High-energy therapeutic ultrasound could increase the anti-cancer drug load and nanomedicines delivered to tumor sites to levels roughly 20-fold higher than traditional targeted cancer therapy.[14]

Immunotherapy


A renal cell carcinoma (lower left) in a kidney specimen.

Cancer immunotherapy refers to a diverse set of therapeutic strategies designed to induce the patient's own immune system to fight the tumor. Contemporary methods for generating an immune response against tumours include intravesical BCG immunotherapy for superficial bladder cancer, and use of interferons and other cytokines to induce an immune response in renal cell carcinoma and melanoma patients. Cancer vaccines to generate specific immune responses are the subject of intensive research for a number of tumours, notably malignant melanoma and renal cell carcinoma. Sipuleucel-T is a vaccine-like strategy in late clinical trials for prostate cancer in which dendritic cells from the patient are loaded with prostatic acid phosphatase peptides to induce a specific immune response against prostate-derived cells.

Allogeneic hematopoietic stem cell transplantation ("bone marrow transplantation" from a genetically non-identical donor) can be considered a form of immunotherapy, since the donor's immune cells will often attack the tumor in a phenomenon known as graft-versus-tumor effect. For this reason, allogeneic HSCT leads to a higher cure rate than autologous transplantation for several cancer types, although the side effects are also more severe.

Cell-based immunotherapy, in which the patient's own natural killer (NK) cells and cytotoxic T lymphocytes (CTLs) are used, has been in practice in Japan since 1990. NK cells and CTLs primarily kill cancer cells as they develop. This treatment is given together with other modes of treatment such as surgery, radiotherapy, or chemotherapy, and is called autologous immune enhancement therapy (AIET).[15][16]

Hormonal therapy

The growth of some cancers can be inhibited by providing or blocking certain hormones. Common examples of hormone-sensitive tumors include certain types of breast and prostate cancers. Removing or blocking estrogen or testosterone is often an important additional treatment. In certain cancers, administration of hormone agonists, such as progestogens, may be therapeutically beneficial.

Angiogenesis inhibitors

Angiogenesis inhibitors prevent the extensive growth of blood vessels (angiogenesis) that tumors require to survive. Some, such as bevacizumab, have been approved and are in clinical use. One of the main problems with anti-angiogenesis drugs is that many factors stimulate blood vessel growth, in normal as well as cancerous cells. Anti-angiogenesis drugs target only one factor, so the other factors continue to stimulate blood vessel growth. Other problems include the route of administration, maintenance of stability and activity, and targeting of the tumor vasculature.[17]

Synthetic lethality

Synthetic lethality arises when a combination of deficiencies in the expression of two or more genes leads to cell death, whereas a deficiency in only one of these genes does not. The deficiencies can arise through mutations, epigenetic alterations or inhibitors of one or both of the genes.
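
As a toy illustration (not taken from the article), the logic of synthetic lethality can be written as a two-input rule: a cell stays viable as long as at least one of the two genes is functional, and dies only when both are lost. The function and gene names below are hypothetical placeholders, not real gene symbols.

# Toy model: synthetic lethality between two hypothetical genes, A and B.
# A cell remains viable as long as at least one of the two gene functions is intact;
# losing both (through mutation, epigenetic silencing, or drug inhibition) is lethal.
def cell_survives(gene_a_functional: bool, gene_b_functional: bool) -> bool:
    return gene_a_functional or gene_b_functional

# Tumor cell: gene A (say, a DNA repair gene) is already lost.
# A drug that inhibits gene B then kills the tumor cell...
print(cell_survives(gene_a_functional=False, gene_b_functional=False))  # False -> cell death
# ...while a normal cell, with gene A intact, survives the same drug.
print(cell_survives(gene_a_functional=True, gene_b_functional=False))   # True -> survives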

Cancer cells are frequently deficient in a DNA repair gene.[18][19] (Also see DNA repair deficiency in cancer.) This DNA repair defect either may be due to mutation or, often, epigenetic silencing (see epigenetic silencing of DNA repair). If this DNA repair defect is in one of seven DNA repair pathways (see DNA repair pathways), and a compensating DNA repair pathway is inhibited, then the tumor cells may be killed by synthetic lethality. Non-tumorous cells, with the initial pathway intact, can survive.

Ovarian cancer

Mutations in DNA repair genes BRCA1 or BRCA2 (active in homologous recombinational repair) are synthetically lethal with inhibition of DNA repair gene PARP1 (active in the base excision repair and in the microhomology-mediated end joining pathways of DNA repair).[20][21]

Ovarian cancers have a mutational defect in BRCA1 in about 18% of patients (13% germline mutations and 5% somatic mutations) (see BRCA1). Olaparib, a PARP inhibitor, was approved in 2014 by the US FDA for use in BRCA-associated ovarian cancer that had previously been treated with chemotherapy.[22] The FDA, in 2016, also approved the PARP inhibitor rucaparib to treat women with advanced ovarian cancer who have already been treated with at least two chemotherapies and have a BRCA1 or BRCA2 gene mutation.[23]

Colon cancer

In colon cancer, epigenetic defects in the WRN gene appear to be synthetically lethal with inactivation of TOP1. In particular, irinotecan inactivation of TOP1 was synthetically lethal with deficient expression of the DNA repair WRN gene in patients with colon cancer.[24] In a 2006 study, 45 patients had colonic tumors with hypermethylated WRN gene promoters (silenced WRN expression), and 43 patients had tumors with unmethylated WRN gene promoters, so that WRN protein expression was high.[24] Irinotecan was more strongly beneficial for patients with hypermethylated WRN promoters (39.4 months survival) than for those with unmethylated WRN promoters (20.7 months survival). The WRN gene promoter is hypermethylated in about 38% of colorectal cancers.[24]

Colon cancer is divided into five stages, and each stage has treatment options (American Cancer Society[25]). At Stage 0, the patient undergoes surgery to remove the polyp. At Stage 1, the patient also undergoes surgery, with the extent depending on the location of the cancer in the colon and the lymph nodes. Stage 2 patients undergo surgery that removes nearby lymph nodes, and may also need chemotherapy after surgery if the cancer is at a higher risk of coming back. At Stage 3, the cancer has spread throughout the lymph nodes but not yet to other organs or body parts; surgery is performed on the colon and lymph nodes, followed by chemotherapy (FOLFOX or CapeOx) to treat the cancer where needed (American Cancer Society[25]). Stage 4 is the most advanced stage. Stage 4 patients undergo surgery only to prevent complications from the cancer and to relieve pain, and if the pain continues the doctor may recommend radiation therapy. The main treatment at this stage is chemotherapy, because of how aggressive the cancer has become, not only in the colon but also in the lymph nodes.
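
A minimal sketch (Python), restating the stage-by-stage options described above as a simple lookup table; the stage labels and treatment summaries are taken from that description and the American Cancer Society source it cites, and the dictionary is only an illustrative way to organize them, not a clinical guideline.

# Stage-by-stage summary of the colon cancer treatment options described above.
colon_cancer_treatment_by_stage = {
    "Stage 0": "surgery to remove the polyp",
    "Stage 1": "surgery, depending on the location of the cancer in the colon and lymph nodes",
    "Stage 2": "surgery removing nearby lymph nodes; chemotherapy afterwards if the risk of recurrence is high",
    "Stage 3": "surgery on the colon and lymph nodes, followed by chemotherapy (FOLFOX or CapeOx)",
    "Stage 4": "chemotherapy as the main treatment; surgery and radiation mainly for symptom relief",
}

for stage, treatment in colon_cancer_treatment_by_stage.items():
    print(f"{stage}: {treatment}")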

Symptom control and palliative care

Although the control of the symptoms of cancer is not typically thought of as a treatment directed at the cancer, it is an important determinant of the quality of life of cancer patients, and plays an important role in the decision whether the patient is able to undergo other treatments. Although doctors generally have the therapeutic skills to reduce pain, chemotherapy-induced nausea and vomiting, diarrhea, hemorrhage and other common problems in cancer patients, the multidisciplinary specialty of palliative care has arisen specifically in response to the symptom control needs of this group of patients.

Pain medication, such as morphine and oxycodone, and antiemetics, drugs to suppress nausea and vomiting, are very commonly used in patients with cancer-related symptoms. Improved antiemetics such as ondansetron and analogues, as well as aprepitant have made aggressive treatments much more feasible in cancer patients.

Cancer pain can be associated with continuing tissue damage due to the disease process or the treatment (i.e. surgery, radiation, chemotherapy). Although there is always a role for environmental factors and affective disturbances in the genesis of pain behaviors, these are not usually the predominant etiologic factors in patients with cancer pain. Some patients with severe pain associated with cancer are nearing the end of their lives, but in all cases palliative therapies should be used to control the pain. Issues such as the social stigma of using opioids, work and functional status, and health care consumption can be concerns and may need to be addressed in order for the person to feel comfortable taking the medications required to control his or her symptoms. The typical strategy for cancer pain management is to get the patient as comfortable as possible using the least amount of medication possible, but opioids, surgery, and physical measures are often required. In the past, doctors were reluctant to prescribe narcotics for pain in terminal cancer patients, for fear of contributing to addiction or suppressing respiratory function. The palliative care movement, a more recent offshoot of the hospice movement, has engendered more widespread support for preemptive pain treatment for cancer patients. The World Health Organization also noted uncontrolled cancer pain as a worldwide problem and established a "ladder" as a guideline for how practitioners should treat pain in patients who have cancer.[26]

Cancer-related fatigue is a very common problem for cancer patients, and has only recently become important enough for oncologists to suggest treatment, even though it plays a significant role in many patients' quality of life.

Hospice in cancer

Hospice is a group that provides care at the home of a person who has an advanced illness with a likely prognosis of less than 6 months. As most treatments for cancer involve significant unpleasant side effects, a patient with little realistic hope of a cure or prolonged life may choose to seek comfort care only, forgoing more radical therapies in exchange for a prolonged period of normal living. This is an especially important aspect of care for those patients whose disease is not a good candidate for other forms of treatment. In these patients, the risks related to the chemotherapy may actually be higher than the chance of responding to the treatment, making further attempts to cure the disease impossible. Of note, patients on hospice can sometimes still get treatments such as radiation therapy if it is being used to treat symptoms rather than as an attempt to cure the cancer.

Research

Clinical trials, also called research studies, test new treatments in people with cancer. The goal of this research is to find better ways to treat cancer and help cancer patients. Clinical trials test many types of treatment such as new drugs, new approaches to surgery or radiation therapy, new combinations of treatments, or new methods such as gene therapy.

A clinical trial is one of the final stages of a long and careful cancer research process. The search for new treatments begins in the laboratory, where scientists first develop and test new ideas. If an approach seems promising, the next step may be testing a treatment in animals to see how it affects cancer in a living being and whether it has harmful effects. Of course, treatments that work well in the lab or in animals do not always work well in people. Studies are done with cancer patients to find out whether promising treatments are safe and effective.

Patients who take part may be helped personally by the treatment they receive. They get up-to-date care from cancer experts, and they receive either a new treatment being tested or the best available standard treatment for their cancer. At the same time, new treatments also may have unknown risks, but if a new treatment proves effective or more effective than standard treatment, study patients who receive it may be among the first to benefit. There is no guarantee that a new treatment being tested or a standard treatment will produce good results. In children with cancer, a survey of trials found that those enrolled in trials were on average not more likely to do better or worse than those on standard treatment; this confirms that success or failure of an experimental treatment cannot be predicted.[27]

Exosome research

Exosomes are lipid-covered microvesicles shed by solid tumors into bodily fluids, such as blood and urine. Current research is attempting to use exosomes as a detection and monitoring method for a variety of cancers.[28][29] The hope is to be able to detect cancer with high sensitivity and specificity via detection of specific exosomes in the blood or urine. The same process can also be used to more accurately monitor a patient's treatment progress. The enzyme-linked lectin-specific assay, or ELLSA, has been proven to directly detect melanoma-derived exosomes from fluid samples.[30] Previously, exosomes had been measured by total protein content in purified samples and by indirect immunomodulatory effects. ELLSA directly measures exosome particles in complex solutions, and has already been found capable of detecting exosomes from other sources, including ovarian cancer and tuberculosis-infected macrophages.

Exosomes secreted by tumors are also believed to be responsible for triggering programmed cell death (apoptosis) of immune cells, interrupting the T-cell signaling required to mount an immune response, and inhibiting the production of anti-cancer cytokines, and they have implications in the spread of metastasis and in allowing angiogenesis.[31] Studies are currently being done with lectin affinity plasmapheresis (LAP),[30] a blood filtration method which selectively targets tumor-derived exosomes and removes them from the bloodstream. It is believed that decreasing the tumor-secreted exosomes in a patient's bloodstream will slow progression of the cancer while at the same time increasing the patient's own immune response.

Complementary and alternative

Complementary and alternative medicine (CAM) treatments are a diverse group of medical and health care systems, practices, and products that are not part of conventional medicine and have not been shown to be effective.[32] "Complementary medicine" refers to methods and substances used along with conventional medicine, while "alternative medicine" refers to compounds used instead of conventional medicine.[33] CAM use is common among people with cancer; a 2000 study found that 69% of cancer patients had used at least one CAM therapy as part of their cancer treatment.[34] Most complementary and alternative medicines for cancer have not been rigorously studied or tested. Some alternative treatments which have been investigated and shown to be ineffective continue to be marketed and promoted.[35]

Mindfulness-based interventions appear to facilitate physical and emotional adjustment to life with cancer through symptom reduction, positive psychological growth, and by bringing about favourable changes in biological outcomes.[36]

Special circumstances

In pregnancy

The incidence of concurrent cancer during pregnancy has risen due to the increasing age of pregnant mothers[37] and due to the incidental discovery of maternal tumors during prenatal ultrasound examinations.

Cancer treatment needs to be selected to do least harm to both the woman and her embryo/fetus. In some cases a therapeutic abortion may be recommended.

Radiation therapy is out of the question, and chemotherapy always poses the risk of miscarriage and congenital malformations.[37] Little is known about the effects of medications on the child.

Even if a drug has been tested as not crossing the placenta to reach the child, some cancer forms can harm the placenta and make the drug pass over it anyway.[37] Some forms of skin cancer may even metastasize to the child's body.[37]

Diagnosis is also made more difficult, since computed tomography is infeasible because of its high radiation dose. Still, magnetic resonance imaging works normally.[37] However, contrast media cannot be used, since they cross the placenta.[37]

As a consequence of the difficulty of properly diagnosing and treating cancer during pregnancy, the alternatives are either to perform a Cesarean section when the child is viable in order to begin a more aggressive cancer treatment, or, if the cancer is malignant enough that the mother is unlikely to be able to wait that long, to perform an abortion in order to treat the cancer.[37]

In utero

Fetal tumors are sometimes diagnosed while still in utero. Teratoma is the most common type of fetal tumor, and usually is benign. In some cases these are surgically treated while the fetus is still in the uterus.

Prefrontal cortex

From Wikipedia, the free encyclopedia
 
Prefrontal Cortex
Brodmann areas 9, 10, 11, 12, 13, 14, 24, 25, 32, 44, 45, 46, and 47 are all in the prefrontal cortex.[1]
Details
Part of: Frontal lobe
Parts: Superior frontal gyrus, Middle frontal gyrus, Inferior frontal gyrus
Artery: Anterior cerebral, Middle cerebral
Vein: Superior sagittal sinus
Identifiers
Latin: Cortex praefrontalis
MeSH: D017397
NeuroNames: 2429
NeuroLex ID: nlx_anat_090801
FMA: 224850

In mammalian brain anatomy, the prefrontal cortex (PFC) is the cerebral cortex which covers the front part of the frontal lobe. The PFC contains Brodmann areas 8, 9, 10, 11, 12, 13, 14, 24, 25, 32, 44, 45, 46, and 47.[1]

Many authors have indicated an integral link between a person's will to live, personality, and the functions of the prefrontal cortex.[2] This brain region has been implicated in planning complex cognitive behavior, personality expression, decision making, and moderating social behavior.[3] The basic activity of this brain region is considered to be orchestration of thoughts and actions in accordance with internal goals.[4]

The most typical psychological term for functions carried out by the prefrontal cortex area is executive function. Executive function relates to abilities to differentiate among conflicting thoughts, determine good and bad, better and best, same and different, future consequences of current activities, working toward a defined goal, prediction of outcomes, expectation based on actions, and social "control" (the ability to suppress urges that, if not suppressed, could lead to socially unacceptable outcomes).

The frontal cortex supports concrete rule learning. More anterior regions along the rostro-caudal axis of frontal cortex support rule learning at higher levels of abstraction.[5]

Structure

Definition

There are three possible ways to define the prefrontal cortex:
  • as the granular frontal cortex
  • as the projection zone of the medial dorsal nucleus of the thalamus
  • as that part of the frontal cortex whose electrical stimulation does not evoke movements
The prefrontal cortex has been defined based on cytoarchitectonics by the presence of a cortical granular layer IV. It is not entirely clear who first used this criterion. Many of the early cytoarchitectonic researchers restricted the use of the term prefrontal to a much smaller region of cortex including the gyrus rectus and the gyrus rostralis (Campbell, 1905; G. E. Smith, 1907; Brodmann, 1909; von Economo and Koskinas, 1925). In 1935, however, Jacobsen used the term prefrontal to distinguish granular prefrontal areas from agranular motor and premotor areas.[6] In terms of Brodmann areas, the prefrontal cortex traditionally includes areas 8, 9, 10, 11, 12, 13, 14, 24, 25, 32, 44, 45, 46, and 47;[1] however, not all of these areas are strictly granular – 44 is dysgranular, caudal 11 and orbital 47 are agranular.[7] The main problem with this definition is that it works well only in primates but not in nonprimates, as the latter lack a granular layer IV.[8]

Defining the prefrontal cortex as the projection zone of the mediodorsal nucleus of the thalamus builds on the work of Rose and Woolsey,[9] who showed that this nucleus projects to anterior and ventral parts of the brain in nonprimates; Rose and Woolsey, however, termed this projection zone "orbitofrontal." It seems to have been Akert who, in 1964, first explicitly suggested that this criterion could be used to define homologues of the prefrontal cortex in primates and nonprimates.[10] This allowed the establishment of homologies despite the lack of a granular frontal cortex in nonprimates.

The projection zone definition is still widely accepted today (e.g. Fuster[11]), although its usefulness has been questioned.[7][12] Modern tract tracing studies have shown that projections of the mediodorsal nucleus of the thalamus are not restricted to the granular frontal cortex in primates. As a result, it was suggested to define the prefrontal cortex as the region of cortex that has stronger reciprocal connections with the mediodorsal nucleus than with any other thalamic nucleus.[8] Uylings et al.[8] acknowledge, however, that even with the application of this criterion, it might be rather difficult to define the prefrontal cortex unequivocally.

A third definition of the prefrontal cortex is the area of frontal cortex whose electrical stimulation does not lead to observable movements. For example, in 1890 David Ferrier[13] used the term in this sense. One complication with this definition is that the electrically "silent" frontal cortex includes both granular and non-granular areas.[7]

Subdivisions

Medial and lateral view of the prefrontal cortex

The table below shows different ways to subdivide parts of the prefrontal cortex based upon Brodmann areas.[1]

caudal: 8
lateral (dorsolateral): lateral 9, 46
lateral (ventrolateral): 12, 44, 45, 47
medial: medial 9, medial 10, 24, 25, 32
orbitofrontal: 11, 13, 14

Interconnections

The prefrontal cortex is highly interconnected with much of the brain, including extensive connections with other cortical, subcortical and brain stem sites.[14] The dorsal prefrontal cortex is especially interconnected with brain regions involved with attention, cognition and action,[15] while the ventral prefrontal cortex interconnects with brain regions involved with emotion.[16] The prefrontal cortex also receives inputs from the brainstem arousal systems, and its function is particularly dependent on its neurochemical environment.[17] Thus, there is coordination between our state of arousal and our mental state.[18] The interplay between the prefrontal cortex and socioemotional system of the brain is relevant for adolescent development, as proposed by the Dual Systems Model.

The medial prefrontal cortex has been implicated in the generation of slow-wave sleep (SWS), and prefrontal atrophy has been linked to decreases in SWS.[19] Prefrontal atrophy occurs naturally as individuals age, and it has been demonstrated that older adults experience impairments in memory consolidation as their medial prefrontal cortices degrade.[19] In monkeys, significant atrophy has been found as a result of neuroleptic or antipsychotic psychiatric medication.[20] In older adults, instead of being transferred and stored in the neocortex during SWS, memories start to remain in the hippocampus where they were encoded, as evidenced by increased hippocampal activation compared to younger adults during recall tasks, when subjects learned word associations, slept, and then were asked to recall the learned words.[19]

Function

Executive function

The original studies of Fuster and of Goldman-Rakic emphasized the fundamental ability of the prefrontal cortex to represent information not currently in the environment, and the central role of this function in creating the "mental sketch pad". Goldman-Rakic spoke of how this representational knowledge was used to intelligently guide thought, action, and emotion, including the inhibition of inappropriate thoughts, distractions, actions, and feelings.[21] In this way, working memory can be seen as fundamental to attention and behavioral inhibition. Fuster speaks of how this prefrontal ability allows the wedding of past to future, allowing both cross-temporal and cross-modal associations in the creation of goal-directed, perception-action cycles.[22] This ability to represent underlies all other higher executive functions.

Shimamura proposed Dynamic Filtering Theory to describe the role of the prefrontal cortex in executive functions. The prefrontal cortex is presumed to act as a high-level gating or filtering mechanism that enhances goal-directed activations and inhibits irrelevant activations. This filtering mechanism enables executive control at various levels of processing, including selecting, maintaining, updating, and rerouting activations. It has also been used to explain emotional regulation.[23]

Miller and Cohen proposed an Integrative Theory of Prefrontal Cortex Function, that arises from the original work of Goldman-Rakic and Fuster. The two theorize that “cognitive control stems from the active maintenance of patterns of activity in the prefrontal cortex that represents goals and means to achieve them. They provide bias signals to other brain structures whose net effect is to guide the flow of activity along neural pathways that establish the proper mappings between inputs, internal states, and outputs needed to perform a given task”.[24] In essence, the two theorize that the prefrontal cortex guides the inputs and connections, which allows for cognitive control of our actions.

The prefrontal cortex is of significant importance when top-down processing is needed. Top-down processing by definition is when behavior is guided by internal states or intentions. According to the two, “The PFC is critical in situations when the mappings between sensory inputs, thoughts, and actions either are weakly established relative to other existing ones or are rapidly changing”.[24] An example of this can be portrayed in the Wisconsin Card Sorting Test (WCST). Subjects engaging in this task are instructed to sort cards according to the shape, color, or number of symbols appearing on them. The thought is that any given card can be associated with a number of actions and no single stimulus-response mapping will work. Human subjects with PFC damage are able to sort the card in the initial simple tasks, but unable to do so as the rules of classification change.

Miller and Cohen conclude that the implications of their theory can explain how much of a role the PFC has in guiding control of cognitive actions. In the researchers' own words, they claim that, “depending on their target of influence, representations in the PFC can function variously as attentional templates, rules, or goals by providing top-down bias signals to other parts of the brain that guide the flow of activity along the pathways needed to perform a task”.[24]

Experimental data indicate a role for the prefrontal cortex in mediating normal sleep physiology, dreaming and sleep-deprivation phenomena.[25]

When analyzing and thinking about attributes of other individuals, the medial prefrontal cortex is activated; it is not activated, however, when contemplating the characteristics of inanimate objects.[26]

Studies using fMRI have shown that the medial prefrontal cortex (mPFC), specifically the anterior medial prefrontal cortex (amPFC), may modulate mimicry behavior. Neuroscientists are suggesting that social priming influences activity and processing in the amPFC, and that this area of the prefrontal cortex modulates mimicry responses and behavior.[27]

Recently, researchers have used neuroimaging techniques to find that, along with the basal ganglia, the prefrontal cortex is involved with learning exemplars, which is part of exemplar theory, one of the three main ways our mind categorizes things. Exemplar theory states that we categorize judgements by comparing them to similar past experiences within our stored memories.[28]

A 2014 meta-analysis by Professor Nicole P. Yuan from the University of Arizona found that larger prefrontal cortex volume and greater PFC cortical thickness were associated with better executive performance.[29]

Attention and memory

Lebedev et al. experiment that dissociated representation of spatial attention from representation of spatial memory in prefrontal cortex.[30]

A widely accepted theory regarding the function of the brain's prefrontal cortex is that it serves as a store of short-term memory. This idea was first formulated by Jacobsen, who reported in 1936 that damage to the primate prefrontal cortex caused short-term memory deficits.[31] Karl Pribram and colleagues (1952) identified the part of the prefrontal cortex responsible for this deficit as area 46, also known as the dorsolateral prefrontal cortex (dlPFC).[32] More recently, Goldman-Rakic and colleagues (1993) evoked short-term memory loss in localized regions of space by temporary inactivation of portions of the dlPFC.[33] Once the concept of working memory (see also Baddeley's model of working memory) was established in contemporary neuroscience by Baddeley (1986), these neuropsychological findings contributed to the theory that the prefrontal cortex implements working memory and, in some extreme formulations, only working memory.[34] In the 1990s this theory developed a wide following, and it became the predominant theory of PF function, especially for nonhuman primates. The concept of working memory used by proponents of this theory focused mostly on the short-term maintenance of information, and rather less on the manipulation or monitoring of such information or on the use of that information for decisions. Consistent with the idea that the prefrontal cortex functions predominantly in maintenance memory, delay-period activity in the PF has often been interpreted as a memory trace. (The phrase "delay-period activity" applies to neuronal activity that follows the transient presentation of an instruction cue and persists until a subsequent "go" or "trigger" signal.)

To explore alternative interpretations of delay-period activity in the prefrontal cortex, Lebedev et al. (2004) investigated the discharge rates of single prefrontal neurons as monkeys attended to a stimulus marking one location while remembering a different, unmarked location.[30] Both locations served as potential targets of a saccadic eye movement. Although the task made intensive demands on short-term memory, the largest proportion of prefrontal neurons represented attended locations, not remembered ones. These findings showed that short-term memory functions cannot account for all, or even most, delay-period activity in the part of the prefrontal cortex explored. The authors suggested that prefrontal activity during the delay-period contributes more to the process of attentional selection (and selective attention) than to memory storage.

Clinical significance

In the last few decades, brain imaging systems have been used to determine brain region volumes and nerve linkages. Several studies have indicated that reduced volume and interconnections of the frontal lobes with other brain regions are observed in patients diagnosed with mental disorders and prescribed potent antipsychotics; those subjected to repeated stressors;[35] suicides;[36] those incarcerated; criminals; sociopaths; those affected by lead poisoning;[37] and daily male cannabis users (only 13 people were tested).[38] It is believed that at least some of the human abilities to feel guilt or remorse, and to interpret reality, are dependent on a well-functioning prefrontal cortex.[39] It is also widely believed[by whom?] that the size and number of connections in the prefrontal cortex relate directly to sentience, as the prefrontal cortex in humans occupies a far larger percentage of the brain than in any other animal. And it is theorized that, as the brain has tripled in size over five million years of human evolution,[40] the prefrontal cortex has increased in size sixfold.[41]
A review on executive functions in healthy exercising individuals noted that the left and right halves of the prefrontal cortex, which is divided by the medial longitudinal fissure, appear to become more interconnected in response to consistent aerobic exercise.[42] Two reviews of structural neuroimaging research indicate that marked improvements in prefrontal and hippocampal gray matter volume occur in healthy adults that engage in medium intensity exercise for several months.[43][44]

A functional neuroimaging review of meditation-based practices suggested that practicing mindfulness enhances prefrontal activation, which was noted to be correlated with increased well-being and reduced anxiety;[45] however, the review noted the need for cohort studies in future research to better establish this.[45]

History

Perhaps the seminal case in prefrontal cortex function is that of Phineas Gage, whose left frontal lobe was destroyed when a large iron rod was driven through his head in an 1848 accident. The standard presentation (e.g.[46]) is that, although Gage retained normal memory, speech and motor skills, his personality changed radically: He became irritable, quick-tempered, and impatient—characteristics he did not previously display — so that friends described him as "no longer Gage"; and, whereas he had previously been a capable and efficient worker, afterward he was unable to complete tasks. However, careful analysis of primary evidence shows that descriptions of Gage's psychological changes are usually exaggerated when held against the description given by Gage's doctor, the most striking feature being that changes described years after Gage's death are far more dramatic than anything reported while he was alive.[47][48]

Subsequent studies on patients with prefrontal injuries have shown that the patients verbalized what the most appropriate social responses would be under certain circumstances. Yet, when actually performing, they instead pursued behavior aimed at immediate gratification, despite knowing the longer-term results would be self-defeating.

The interpretation of these data indicates that not only are the skills of comparison and understanding of eventual outcomes harbored in the prefrontal cortex, but the prefrontal cortex (when functioning correctly) also controls the ability to delay immediate gratification for a better or more rewarding longer-term result. This ability to wait for a reward is one of the key pieces that define optimal executive function of the human brain.

There is much current research devoted to understanding the role of the prefrontal cortex in neurological disorders. Many disorders, such as schizophrenia, bipolar disorder, and ADHD, have been related to dysfunction of the prefrontal cortex, and thus this area of the brain offers the potential for new treatments of these conditions.[citation needed] Clinical trials have begun on certain drugs that have been shown to improve prefrontal cortex function, including guanfacine, which acts through the alpha-2A adrenergic receptor. A downstream target of this drug, the HCN channel, is one of the most recent areas of exploration in prefrontal cortex pharmacology.[49]

Etymology

The term "prefrontal" as describing a part of the brain appears to have been introduced by Richard Owen in 1868.[6] For him, the prefrontal area was restricted to the anterior-most part of the frontal lobe (approximately corresponding to the frontal pole). It has been hypothesized that his choice of the term was based on the prefrontal bone present in most amphibians and reptiles.[6]


Operator (computer programming)

From Wikipedia, the free encyclopedia