
Wednesday, June 23, 2021

Antibiotic

From Wikipedia, the free encyclopedia

Testing the susceptibility of Staphylococcus aureus to antibiotics by the Kirby-Bauer disk diffusion method – antibiotics diffuse from antibiotic-containing disks and inhibit growth of S. aureus, resulting in a zone of inhibition.

An antibiotic is a type of antimicrobial substance active against bacteria. It is the most important type of antibacterial agent for fighting bacterial infections, and antibiotic medications are widely used in the treatment and prevention of such infections. They may either kill or inhibit the growth of bacteria. A limited number of antibiotics also possess antiprotozoal activity. Antibiotics are not effective against viruses such as those that cause the common cold or influenza; drugs which inhibit viruses are termed antiviral drugs or antivirals rather than antibiotics.

Sometimes, the term antibiotic—literally "opposing life", from the Greek roots ἀντι anti, "against" and βίος bios, "life"—is broadly used to refer to any substance used against microbes, but in the usual medical usage, antibiotics (such as penicillin) are those produced naturally (by one microorganism fighting another), whereas nonantibiotic antibacterials (such as sulfonamides and antiseptics) are fully synthetic. However, both classes have the same goal of killing or preventing the growth of microorganisms, and both are included in antimicrobial chemotherapy. "Antibacterials" include antiseptic drugs, antibacterial soaps, and chemical disinfectants, whereas antibiotics are an important class of antibacterials used more specifically in medicine and sometimes in livestock feed.

Antibiotics have been used since ancient times. Many civilizations used topical application of mouldy bread, with many references to its beneficial effects arising from ancient Egypt, Nubia, China, Serbia, Greece, and Rome. The first person to directly document the use of molds to treat infections was John Parkinson (1567–1650). Antibiotics revolutionized medicine in the 20th century. Alexander Fleming (1881–1955) discovered modern-day penicillin in 1928, the widespread use of which proved significantly beneficial during wartime. However, the effectiveness of, and easy access to, antibiotics have also led to their overuse, and some bacteria have evolved resistance to them. The World Health Organization has classified antimicrobial resistance as a widespread "serious threat [that] is no longer a prediction for the future, it is happening right now in every region of the world and has the potential to affect anyone, of any age, in any country".

Medical uses

Antibiotics are used to treat or prevent bacterial infections, and sometimes protozoan infections. (Metronidazole is effective against a number of parasitic diseases). When an infection is suspected of being responsible for an illness but the responsible pathogen has not been identified, an empiric therapy is adopted. This involves the administration of a broad-spectrum antibiotic based on the signs and symptoms presented and is initiated pending laboratory results that can take several days.

When the responsible pathogenic microorganism is already known or has been identified, definitive therapy can be started. This will usually involve the use of a narrow-spectrum antibiotic. The choice of antibiotic given will also be based on its cost. Identification is critically important as it can reduce the cost and toxicity of the antibiotic therapy and also reduce the possibility of the emergence of antimicrobial resistance. To avoid surgery, antibiotics may be given for non-complicated acute appendicitis.

Antibiotics may be given as a preventive measure, and this is usually limited to at-risk populations such as those with a weakened immune system (particularly in HIV cases to prevent pneumonia), those taking immunosuppressive drugs, cancer patients, and those having surgery. Their use in surgical procedures is to help prevent infection of incisions. They have an important role in dental antibiotic prophylaxis, where their use may prevent bacteremia and consequent infective endocarditis. Antibiotics are also used to prevent infection in cases of neutropenia, particularly when cancer-related.

Administration

There are many different routes of administration for antibiotic treatment. Antibiotics are usually taken by mouth. In more severe cases, particularly deep-seated systemic infections, antibiotics can be given intravenously or by injection. Where the site of infection is easily accessed, antibiotics may be given topically in the form of eye drops onto the conjunctiva for conjunctivitis or ear drops for ear infections and acute cases of swimmer's ear. Topical use is also one of the treatment options for some skin conditions, including acne and cellulitis. Advantages of topical application include achieving a high and sustained concentration of antibiotic at the site of infection, reducing the potential for systemic absorption and toxicity, and reducing the total volume of antibiotic required, thereby also lowering the risk of antibiotic misuse. Topical antibiotics applied over certain types of surgical wounds have been reported to reduce the risk of surgical site infections. However, there are certain general causes for concern with topical administration of antibiotics: some systemic absorption of the antibiotic may occur, the quantity of antibiotic applied is difficult to dose accurately, and there is the possibility of local hypersensitivity reactions or contact dermatitis. It is recommended to administer antibiotics as soon as possible, especially in life-threatening infections. Many emergency departments stock antibiotics for this purpose.

Prevalence

Antibiotic consumption varies widely between countries. The WHO report on surveillance of antibiotic consumption, published in 2018, analysed 2015 data from 65 countries. As measured in defined daily doses (DDD) per 1,000 inhabitants per day, Mongolia had the highest consumption, at 64.4, and Burundi the lowest, at 4.4. Amoxicillin and amoxicillin/clavulanic acid were the most frequently consumed antibiotics.
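The WHO metric above is a simple normalisation. A minimal sketch of the calculation, using made-up figures (not data from the WHO report):

```python
# Sketch of the WHO metric "defined daily doses (DDD) per 1,000 inhabitants
# per day". All numbers below are illustrative, not from the WHO report.

def ddd_per_1000_per_day(total_ddds, population, days=365):
    """Total DDDs consumed over a period, normalised per 1,000 people per day."""
    return total_ddds / population / days * 1000

# Hypothetical country: 3 million people consuming 70.5 million DDDs in a year.
rate = ddd_per_1000_per_day(70_500_000, 3_000_000)
print(round(rate, 1))  # prints 64.4
```

The normalisation makes countries of very different population sizes directly comparable, which is why the report can rank Mongolia and Burundi on the same scale.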

Side effects

Health advocacy messages such as this one encourage patients to talk with their doctor about safety in using antibiotics.

Antibiotics are screened for any negative effects before their approval for clinical use, and are usually considered safe and well tolerated. However, some antibiotics have been associated with a wide range of adverse side effects, from mild to very severe, depending on the type of antibiotic used, the microbes targeted, and the individual patient. Side effects may reflect the pharmacological or toxicological properties of the antibiotic or may involve hypersensitivity or allergic reactions. Adverse effects range from fever and nausea to major allergic reactions, including photodermatitis and anaphylaxis. Safety profiles of newer drugs are often not as well established as for those that have a long history of use.

Common side effects include diarrhea, resulting from disruption of the species composition of the intestinal flora, which can lead, for example, to overgrowth of pathogenic bacteria such as Clostridium difficile. Taking probiotics during the course of antibiotic treatment can help prevent antibiotic-associated diarrhea. Antibacterials can also affect the vaginal flora and may lead to overgrowth of yeast species of the genus Candida in the vulvo-vaginal area. Additional side effects can result from interaction with other drugs, such as the possibility of tendon damage from the administration of a quinolone antibiotic with a systemic corticosteroid.

Some antibiotics may also damage the mitochondrion, a bacteria-derived organelle found in eukaryotic, including human, cells. Mitochondrial damage causes oxidative stress in cells and has been suggested as a mechanism for side effects from fluoroquinolones. Antibiotics are also known to affect chloroplasts.

Correlation with obesity

Exposure to antibiotics early in life is associated with increased body mass in humans and mouse models. Early life is a critical period for the establishment of the intestinal microbiota and for metabolic development. Mice exposed to subtherapeutic antibiotic treatment – with either penicillin, vancomycin, or chlortetracycline – had altered composition of the gut microbiota as well as altered metabolic capabilities. One study has reported that mice given low-dose penicillin (1 μg/g body weight) around birth and throughout the weaning process had increased body mass and fat mass, accelerated growth, and increased hepatic expression of genes involved in adipogenesis, compared to control mice. In addition, penicillin in combination with a high-fat diet increased fasting insulin levels in mice. However, it is unclear whether or not antibiotics cause obesity in humans. Studies have found a correlation between early exposure to antibiotics (<6 months) and increased body mass (at 10 and 20 months). Another study found that the type of antibiotic exposure was also significant, with the highest risk of being overweight in those given macrolides compared to penicillin and cephalosporin. Therefore, there is a correlation between antibiotic exposure in early life and obesity in humans, but whether the relationship is causal remains unclear. In any case, the possible effect of antibiotics on obesity must be weighed against the beneficial effects of clinically indicated antibiotic treatment in infancy.

Interactions

Birth control pills

There are few well-controlled studies on whether antibiotic use increases the risk of oral contraceptive failure. The majority of studies indicate antibiotics do not interfere with birth control pills; clinical studies suggest the failure rate of contraceptive pills caused by antibiotics is very low (about 1%). Situations that may increase the risk of oral contraceptive failure include non-compliance (missing a pill), vomiting, and diarrhea; gastrointestinal disorders or interpatient variability in oral contraceptive absorption may also affect ethinylestradiol serum levels. Women with menstrual irregularities may be at higher risk of failure and should be advised to use backup contraception during antibiotic treatment and for one week after its completion. If patient-specific risk factors for reduced oral contraceptive efficacy are suspected, backup contraception is recommended.

In cases where antibiotics have been suggested to affect the effectiveness of birth control pills, such as for the broad-spectrum antibiotic rifampicin, the effect may be due to an increase in the activities of hepatic enzymes causing increased breakdown of the pill's active ingredients. Effects on the intestinal flora, which might result in reduced absorption of estrogens in the colon, have also been suggested, but such suggestions have been inconclusive and controversial. Clinicians have recommended that extra contraceptive measures be applied during therapies using antibiotics that are suspected to interact with oral contraceptives. More studies on the possible interactions between antibiotics and birth control pills (oral contraceptives) are required, as well as careful assessment of patient-specific risk factors for potential oral contraceptive pill failure, prior to dismissing the need for backup contraception.

Alcohol

Interactions between alcohol and certain antibiotics may occur and may cause side effects and decreased effectiveness of antibiotic therapy. While moderate alcohol consumption is unlikely to interfere with many common antibiotics, there are specific types of antibiotics with which alcohol consumption may cause serious side effects. Therefore, potential risks of side effects and effectiveness depend on the type of antibiotic administered.

Antibiotics such as metronidazole, tinidazole, cephamandole, latamoxef, cefoperazone, cefmenoxime, and furazolidone cause a disulfiram-like chemical reaction with alcohol by inhibiting its breakdown by acetaldehyde dehydrogenase, which may result in vomiting, nausea, and shortness of breath. In addition, the efficacy of doxycycline and erythromycin succinate may be reduced by alcohol consumption. Other effects of alcohol on antibiotic activity include altered activity of the liver enzymes that break down the antibiotic compound.

Pharmacodynamics

The successful outcome of antimicrobial therapy with antibacterial compounds depends on several factors. These include host defense mechanisms, the location of infection, and the pharmacokinetic and pharmacodynamic properties of the antibacterial. The bactericidal activity of antibacterials may depend on the bacterial growth phase, and it often requires ongoing metabolic activity and division of bacterial cells. These findings are based on laboratory studies, and antibacterials have also been shown to eliminate bacterial infections in clinical settings. Since the activity of antibacterials frequently depends on their concentration, in vitro characterization of antibacterial activity commonly includes determination of the minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC) of an antibacterial. To predict clinical outcome, the antimicrobial activity of an antibacterial is usually combined with its pharmacokinetic profile, and several pharmacological parameters are used as markers of drug efficacy.
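The logic of an MIC determination from a dilution series can be sketched briefly. This is an illustrative toy, not laboratory software, and the data are invented:

```python
# Illustrative sketch of reading a minimum inhibitory concentration (MIC)
# from a two-fold broth-dilution series. Results are hypothetical.

def mic(dilution_results):
    """dilution_results maps concentration (ug/mL) -> visible growth (bool).
    Returns the lowest concentration with no visible growth, or None if
    growth occurred at every tested concentration."""
    inhibitory = [c for c, growth in dilution_results.items() if not growth]
    return min(inhibitory) if inhibitory else None

# Hypothetical two-fold series for one antibiotic against one isolate:
results = {0.25: True, 0.5: True, 1.0: True, 2.0: False, 4.0: False}
print(mic(results))  # prints 2.0
```

The MBC is determined analogously, but by subculturing the clear tubes and finding the lowest concentration that killed (rather than merely inhibited) the inoculum.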

Combination therapy

In important infectious diseases, including tuberculosis, combination therapy (i.e., the concurrent application of two or more antibiotics) has been used to delay or prevent the emergence of resistance. In acute bacterial infections, antibiotics as part of combination therapy are prescribed for their synergistic effects to improve treatment outcome as the combined effect of both antibiotics is better than their individual effect. Methicillin-resistant Staphylococcus aureus infections may be treated with a combination therapy of fusidic acid and rifampicin. Antibiotics used in combination may also be antagonistic and the combined effects of the two antibiotics may be less than if one of the antibiotics was given as a monotherapy. For example, chloramphenicol and tetracyclines are antagonists to penicillins. However, this can vary depending on the species of bacteria. In general, combinations of a bacteriostatic antibiotic and bactericidal antibiotic are antagonistic.

In addition to combining one antibiotic with another, antibiotics are sometimes co-administered with resistance-modifying agents. For example, β-lactam antibiotics may be used in combination with β-lactamase inhibitors, such as clavulanic acid or sulbactam, when a patient is infected with a β-lactamase-producing strain of bacteria.

Classes

Molecular targets of antibiotics on the bacterial cell

Protein synthesis inhibitors (antibiotics)

Antibiotics are commonly classified based on their mechanism of action, chemical structure, or spectrum of activity. Most target bacterial functions or growth processes. Those that target the bacterial cell wall (penicillins and cephalosporins) or the cell membrane (polymyxins), or interfere with essential bacterial enzymes (rifamycins, lipiarmycins, quinolones, and sulfonamides) have bactericidal activities. Protein synthesis inhibitors (macrolides, lincosamides, and tetracyclines) are usually bacteriostatic (with the exception of bactericidal aminoglycosides). Further categorization is based on their target specificity. "Narrow-spectrum" antibiotics target specific types of bacteria, such as gram-negative or gram-positive, whereas broad-spectrum antibiotics affect a wide range of bacteria. Following a 40-year break in discovering classes of antibacterial compounds, four new classes of antibiotics were introduced to clinical use in the late 2000s and early 2010s: cyclic lipopeptides (such as daptomycin), glycylcyclines (such as tigecycline), oxazolidinones (such as linezolid), and lipiarmycins (such as fidaxomicin).

Production

With advances in medicinal chemistry, most modern antibacterials are semisynthetic modifications of various natural compounds. These include, for example, the beta-lactam antibiotics, which include the penicillins (produced by fungi in the genus Penicillium), the cephalosporins, and the carbapenems. Compounds that are still isolated from living organisms are the aminoglycosides, whereas other antibacterials—for example, the sulfonamides, the quinolones, and the oxazolidinones—are produced solely by chemical synthesis. Many antibacterial compounds are relatively small molecules with a molecular weight of less than 1000 daltons.

Since the first pioneering efforts of Howard Florey and Ernst Chain in 1939, the importance of antibiotics, including antibacterials, to medicine has led to intense research into producing antibacterials at large scales. Following screening of antibacterials against a wide range of bacteria, production of the active compounds is carried out using fermentation, usually in strongly aerobic conditions.

Resistance

The emergence of resistance of bacteria to antibiotics is a common phenomenon. Emergence of resistance often reflects evolutionary processes that take place during antibiotic therapy. The antibiotic treatment may select for bacterial strains with physiologically or genetically enhanced capacity to survive high doses of antibiotics. Under certain conditions, it may result in preferential growth of resistant bacteria, while growth of susceptible bacteria is inhibited by the drug. For example, antibacterial selection for strains having previously acquired antibacterial-resistance genes was demonstrated in 1943 by the Luria–Delbrück experiment. Antibiotics such as penicillin and erythromycin, which used to have a high efficacy against many bacterial species and strains, have become less effective, due to the increased resistance of many bacterial strains.

Resistance may take the form of biodegradation of pharmaceuticals, such as sulfamethazine-degrading soil bacteria introduced to sulfamethazine through medicated pig feces. The survival of bacteria often results from an inheritable resistance, but the growth of resistance to antibacterials also occurs through horizontal gene transfer. Horizontal transfer is more likely to happen in locations of frequent antibiotic use.

Antibacterial resistance may impose a biological cost, thereby reducing fitness of resistant strains, which can limit the spread of antibacterial-resistant bacteria, for example, in the absence of antibacterial compounds. Additional mutations, however, may compensate for this fitness cost and can aid the survival of these bacteria.

Paleontological data show that both antibiotics and antibiotic-resistance mechanisms are ancient. Useful antibiotic targets are those for which mutations negatively impact bacterial reproduction or viability.

Several molecular mechanisms of antibacterial resistance exist. Intrinsic antibacterial resistance may be part of the genetic makeup of bacterial strains. For example, an antibiotic target may be absent from the bacterial genome. Acquired resistance results from a mutation in the bacterial chromosome or the acquisition of extra-chromosomal DNA. Antibacterial-producing bacteria have evolved resistance mechanisms that have been shown to be similar to, and may have been transferred to, antibacterial-resistant strains. The spread of antibacterial resistance often occurs through vertical transmission of mutations during growth and by genetic recombination of DNA by horizontal genetic exchange. For instance, antibacterial resistance genes can be exchanged between different bacterial strains or species via plasmids that carry these resistance genes. Plasmids that carry several different resistance genes can confer resistance to multiple antibacterials. Cross-resistance to several antibacterials may also occur when a resistance mechanism encoded by a single gene conveys resistance to more than one antibacterial compound.

Antibacterial-resistant strains and species, sometimes referred to as "superbugs", now contribute to the emergence of diseases that were for a while well controlled. For example, emergent bacterial strains causing tuberculosis that are resistant to previously effective antibacterial treatments pose many therapeutic challenges. Every year, nearly half a million new cases of multidrug-resistant tuberculosis (MDR-TB) are estimated to occur worldwide. Similarly, NDM-1 is a newly identified enzyme conveying bacterial resistance to a broad range of beta-lactam antibacterials. The United Kingdom's Health Protection Agency has stated that "most isolates with NDM-1 enzyme are resistant to all standard intravenous antibiotics for treatment of severe infections." On 26 May 2016, an E. coli "superbug" was identified in the United States resistant to colistin, "the last line of defence" antibiotic.

Misuse

This poster from the US Centers for Disease Control and Prevention "Get Smart" campaign, intended for use in doctors' offices and other healthcare facilities, warns that antibiotics do not work for viral illnesses such as the common cold.

Per The ICU Book, "The first rule of antibiotics is to try not to use them, and the second rule is try not to use too many of them." Inappropriate antibiotic treatment and overuse of antibiotics have contributed to the emergence of antibiotic-resistant bacteria. Self-prescribing of antibiotics is an example of misuse. Many antibiotics are frequently prescribed to treat symptoms or diseases that do not respond to antibiotics or that are likely to resolve without treatment. Also, incorrect or suboptimal antibiotics are prescribed for certain bacterial infections. The overuse of antibiotics, like penicillin and erythromycin, has been associated with emerging antibiotic resistance since the 1950s. Widespread usage of antibiotics in hospitals has also been associated with increases in bacterial strains and species that no longer respond to treatment with the most common antibiotics.

Common forms of antibiotic misuse include excessive use of prophylactic antibiotics in travelers and failure of medical professionals to prescribe the correct dosage of antibiotics on the basis of the patient's weight and history of prior use. Other forms of misuse include failure to take the entire prescribed course of the antibiotic, incorrect dosage and administration, or failure to rest for sufficient recovery. An example of inappropriate antibiotic treatment is the prescription of antibiotics to treat viral infections such as the common cold. One study on respiratory tract infections found that "physicians were more likely to prescribe antibiotics to patients who appeared to expect them". Multifactorial interventions aimed at both physicians and patients can reduce inappropriate prescription of antibiotics. The lack of rapid point-of-care diagnostic tests, particularly in resource-limited settings, is considered one of the drivers of antibiotic misuse.

Several organizations concerned with antimicrobial resistance are lobbying to eliminate the unnecessary use of antibiotics. The issues of misuse and overuse of antibiotics have been addressed by the formation of the US Interagency Task Force on Antimicrobial Resistance. This task force aims to actively address antimicrobial resistance, and is coordinated by the US Centers for Disease Control and Prevention, the Food and Drug Administration (FDA), and the National Institutes of Health, as well as other US agencies. A non-governmental organization campaign group is Keep Antibiotics Working. In France, an "Antibiotics are not automatic" government campaign started in 2002 and led to a marked reduction of unnecessary antibiotic prescriptions, especially in children.

The emergence of antibiotic resistance prompted restrictions on antibiotic use in the UK in 1970 (Swann report 1969), and the European Union has banned the use of antibiotics as growth-promotion agents since 2003. Moreover, several organizations (including the World Health Organization, the National Academy of Sciences, and the U.S. Food and Drug Administration) have advocated restricting the amount of antibiotic use in food animal production. However, there are commonly delays in regulatory and legislative action to limit the use of antibiotics, attributable partly to resistance against such regulation by industries using or selling antibiotics, and to the time required for research to test causal links between antibiotic use and resistance. Two federal bills (S.742 and H.R. 2562) aimed at phasing out nontherapeutic use of antibiotics in US food animals were proposed, but have not passed. These bills were endorsed by public health and medical organizations, including the American Holistic Nurses' Association, the American Medical Association, and the American Public Health Association.

Despite pledges by food companies and restaurants to reduce or eliminate meat that comes from animals treated with antibiotics, the purchase of antibiotics for use on farm animals has been increasing every year.

There has been extensive use of antibiotics in animal husbandry. In the United States, the question of emergence of antibiotic-resistant bacterial strains due to use of antibiotics in livestock was raised by the US Food and Drug Administration (FDA) in 1977. In March 2012, the United States District Court for the Southern District of New York, ruling in an action brought by the Natural Resources Defense Council and others, ordered the FDA to revoke approvals for uses of antibiotics in livestock that violated FDA regulations.

Studies have shown that common misconceptions about the effectiveness and necessity of antibiotics to treat common mild illnesses contribute to their overuse.

History

Before the early 20th century, treatments for infections were based primarily on medicinal folklore. Mixtures with antimicrobial properties that were used in treatments of infections were described over 2,000 years ago. Many ancient cultures, including the ancient Egyptians and ancient Greeks, used specially selected mold and plant materials to treat infections. Nubian mummies studied in the 1990s were found to contain significant levels of tetracycline. The beer brewed at that time was conjectured to have been the source.

The use of antibiotics in modern medicine began with the discovery of synthetic antibiotics derived from dyes.

Synthetic antibiotics derived from dyes

Arsphenamine, also known as Salvarsan, discovered in 1907 by Paul Ehrlich.

Synthetic antibiotic chemotherapy as a science and development of antibacterials began in Germany with Paul Ehrlich in the late 1880s. Ehrlich noted certain dyes would color human, animal, or bacterial cells, whereas others did not. He then proposed the idea that it might be possible to create chemicals that would act as a selective drug that would bind to and kill bacteria without harming the human host. After screening hundreds of dyes against various organisms, in 1907, he discovered a medicinally useful drug, the first synthetic antibacterial organoarsenic compound salvarsan, now called arsphenamine.

This heralded the era of antibacterial treatment that was begun with the discovery of a series of arsenic-derived synthetic antibiotics by both Alfred Bertheim and Ehrlich in 1907. Ehrlich and Bertheim had experimented with various chemicals derived from dyes to treat trypanosomiasis in mice and spirochaeta infection in rabbits. While their early compounds were too toxic, Ehrlich and Sahachiro Hata, a Japanese bacteriologist working with Ehrlich in the quest for a drug to treat syphilis, achieved success with the 606th compound in their series of experiments. In 1910 Ehrlich and Hata announced their discovery, which they called drug "606", at the Congress for Internal Medicine at Wiesbaden. The Hoechst company began to market the compound toward the end of 1910 under the name Salvarsan, now known as arsphenamine. The drug was used to treat syphilis in the first half of the 20th century. In 1908, Ehrlich received the Nobel Prize in Physiology or Medicine for his contributions to immunology. Hata was nominated for the Nobel Prize in Chemistry in 1911 and for the Nobel Prize in Physiology or Medicine in 1912 and 1913.

The first sulfonamide and the first systemically active antibacterial drug, Prontosil, was developed by a research team led by Gerhard Domagk in 1932 or 1933 at the Bayer Laboratories of the IG Farben conglomerate in Germany, for which Domagk received the 1939 Nobel Prize in Physiology or Medicine. Sulfanilamide, the active drug of Prontosil, was not patentable as it had already been in use in the dye industry for some years. Prontosil had a relatively broad effect against Gram-positive cocci, but not against enterobacteria. Research was stimulated apace by its success. The discovery and development of this sulfonamide drug opened the era of antibacterials.

Penicillin and other natural antibiotics

Penicillin, discovered by Alexander Fleming in 1928

Observations about the growth of some microorganisms inhibiting the growth of other microorganisms have been reported since the late 19th century. These observations of antibiosis between microorganisms led to the discovery of natural antibacterials. Louis Pasteur observed, "if we could intervene in the antagonism observed between some bacteria, it would offer perhaps the greatest hopes for therapeutics".

In 1874, physician Sir William Roberts noted that cultures of the mold Penicillium glaucum that is used in the making of some types of blue cheese did not display bacterial contamination. In 1876, physicist John Tyndall also contributed to this field.

In 1895 Vincenzo Tiberio, Italian physician, published a paper on the antibacterial power of some extracts of mold.

In 1897, doctoral student Ernest Duchesne submitted a dissertation, "Contribution à l'étude de la concurrence vitale chez les micro-organismes: antagonisme entre les moisissures et les microbes" (Contribution to the study of vital competition in micro-organisms: antagonism between molds and microbes), the first known scholarly work to consider the therapeutic capabilities of molds resulting from their anti-microbial activity. In his thesis, Duchesne proposed that bacteria and molds engage in a perpetual battle for survival. Duchesne observed that E. coli was eliminated by Penicillium glaucum when they were both grown in the same culture. He also observed that when he inoculated laboratory animals with lethal doses of typhoid bacilli together with Penicillium glaucum, the animals did not contract typhoid. Unfortunately Duchesne's army service after getting his degree prevented him from doing any further research. Duchesne died of tuberculosis, a disease now treated by antibiotics.

Alexander Fleming was awarded a Nobel prize for his role in the discovery of penicillin

In 1928, Sir Alexander Fleming postulated the existence of penicillin, a molecule produced by certain molds that kills or stops the growth of certain kinds of bacteria. Fleming was working on a culture of disease-causing bacteria when he noticed the spores of a green mold, Penicillium chrysogenum, in one of his culture plates. He observed that the presence of the mold killed or prevented the growth of the bacteria. Fleming postulated that the mold must secrete an antibacterial substance, which he named penicillin in 1928. Fleming believed that its antibacterial properties could be exploited for chemotherapy. He initially characterized some of its biological properties, and attempted to use a crude preparation to treat some infections, but he was unable to pursue its further development without the aid of trained chemists.

Ernst Chain, Howard Florey and Edward Abraham succeeded in purifying the first penicillin, penicillin G, in 1942, but it did not become widely available outside the Allied military before 1945. Later, Norman Heatley developed the back extraction technique for efficiently purifying penicillin in bulk. The chemical structure of penicillin was first proposed by Abraham in 1942 and then later confirmed by Dorothy Crowfoot Hodgkin in 1945. Purified penicillin displayed potent antibacterial activity against a wide range of bacteria and had low toxicity in humans. Furthermore, its activity was not inhibited by biological constituents such as pus, unlike the synthetic sulfonamides (see below). The development of penicillin led to renewed interest in the search for antibiotic compounds with similar efficacy and safety. For their successful development of penicillin, which Fleming had accidentally discovered but could not develop himself, as a therapeutic drug, Chain and Florey shared the 1945 Nobel Prize in Physiology or Medicine with Fleming.

Florey credited Rene Dubos with pioneering the approach of deliberately and systematically searching for antibacterial compounds, which had led to the discovery of gramicidin and had revived Florey's research in penicillin. In 1939, coinciding with the start of World War II, Dubos had reported the discovery of the first naturally derived antibiotic, tyrothricin, a compound of 20% gramicidin and 80% tyrocidine, from Bacillus brevis. It was one of the first commercially manufactured antibiotics and was very effective in treating wounds and ulcers during World War II. Gramicidin, however, could not be used systemically because of toxicity. Tyrocidine also proved too toxic for systemic usage. Research results obtained during that period were not shared between the Axis and the Allied powers during World War II, and access remained limited during the Cold War.

Late 20th century

During the mid-20th century, the number of new antibiotic substances introduced for medical use increased significantly. From 1935 to 1968, 12 new classes were launched. However, after this, the number of new classes dropped markedly, with only two new classes introduced between 1969 and 2003.

Etymology of the words 'antibiotic' and 'antibacterial'

The term 'antibiosis', meaning "against life", was introduced by the French bacteriologist Jean Paul Vuillemin as a descriptive name of the phenomenon exhibited by these early antibacterial drugs. Antibiosis was first described in bacteria in 1877, when Louis Pasteur and Robert Koch observed that an airborne bacillus could inhibit the growth of Bacillus anthracis. These drugs were later renamed antibiotics by Selman Waksman, an American microbiologist, in 1947.

The term antibiotic was first used in 1942 by Selman Waksman and his collaborators in journal articles to describe any substance produced by a microorganism that is antagonistic to the growth of other microorganisms in high dilution. This definition excluded substances that kill bacteria but that are not produced by microorganisms (such as gastric juices and hydrogen peroxide). It also excluded synthetic antibacterial compounds such as the sulfonamides. In current usage, the term "antibiotic" is applied to any medication that kills bacteria or inhibits their growth, regardless of whether that medication is produced by a microorganism or not.

The term "antibiotic" derives from anti + βιωτικός (biōtikos), "fit for life, lively", which comes from βίωσις (biōsis), "way of life", and that from βίος (bios), "life". The term "antibacterial" derives from Greek ἀντί (anti), "against" + βακτήριον (baktērion), diminutive of βακτηρία (baktēria), "staff, cane", because the first bacteria to be discovered were rod-shaped.

Antibiotic pipeline

Both the WHO and the Infectious Diseases Society of America report that the weak antibiotic pipeline does not match bacteria's increasing ability to develop resistance. The Infectious Diseases Society of America report noted that the number of new antibiotics approved for marketing per year had been declining and identified seven antibiotics against Gram-negative bacilli then in phase 2 or phase 3 clinical trials. However, these drugs did not address the entire spectrum of resistance of Gram-negative bacilli. According to the WHO, fifty-one new therapeutic entities (antibiotics, including combinations) were in phase 1–3 clinical trials as of May 2017. Antibiotics targeting multidrug-resistant Gram-positive pathogens remain a high priority.

A few antibiotics have received marketing authorization in the last seven years. The cephalosporin ceftaroline and the lipoglycopeptides oritavancin and telavancin have been approved for the treatment of acute bacterial skin and skin structure infection and community-acquired bacterial pneumonia. The lipoglycopeptide dalbavancin and the oxazolidinone tedizolid have also been approved for the treatment of acute bacterial skin and skin structure infection. The first in a new class of narrow-spectrum macrocyclic antibiotics, fidaxomicin, has been approved for the treatment of C. difficile colitis. Newly approved cephalosporin–β-lactamase inhibitor combinations include ceftazidime-avibactam and ceftolozane-tazobactam for complicated urinary tract infection and intra-abdominal infection.

Possible improvements include clarification of clinical trial regulations by the FDA. Furthermore, appropriate economic incentives could persuade pharmaceutical companies to invest in this endeavor. In the US, the Antibiotic Development to Advance Patient Treatment (ADAPT) Act was introduced with the aim of fast-tracking the development of antibiotics to combat the growing threat of 'superbugs'. Under this Act, the FDA can approve antibiotics and antifungals for treating life-threatening infections based on smaller clinical trials. The CDC will monitor the use of antibiotics and the emerging resistance, and publish the data. The FDA antibiotics labeling process, 'Susceptibility Test Interpretive Criteria for Microbial Organisms' or 'breakpoints', will provide accurate data to healthcare professionals. According to Allan Coukell, senior director for health programs at The Pew Charitable Trusts, "By allowing drug developers to rely on smaller datasets, and clarifying FDA's authority to tolerate a higher level of uncertainty for these drugs when making a risk/benefit calculation, ADAPT would make the clinical trials more feasible."

Replenishing the antibiotic pipeline and developing other new therapies

Because antibiotic-resistant bacterial strains continue to emerge and spread, there is a constant need to develop new antibacterial treatments. Current strategies include traditional chemistry-based approaches such as natural product-based drug discovery, newer chemistry-based approaches such as drug design, traditional biology-based approaches such as immunoglobulin therapy, and experimental biology-based approaches such as phage therapy, fecal microbiota transplants, antisense RNA-based treatments, and CRISPR-Cas9-based treatments.

Natural product-based antibiotic discovery

Bacteria, fungi, plants, animals and other organisms are being screened in the search for new antibiotics.

Most of the antibiotics in current use are natural products or natural product derivatives, and bacterial, fungal, plant and animal extracts are being screened in the search for new antibiotics. Organisms may be selected for testing based on ecological, ethnomedical, genomic or historical rationales. Medicinal plants, for example, are screened on the basis that they are used by traditional healers to prevent or cure infection and may therefore contain antibacterial compounds. Also, soil bacteria are screened on the basis that, historically, they have been a very rich source of antibiotics (with 70 to 80% of antibiotics in current use derived from the actinomycetes).

In addition to screening natural products for direct antibacterial activity, they are sometimes screened for the ability to suppress antibiotic resistance and antibiotic tolerance. For example, some secondary metabolites inhibit drug efflux pumps, thereby increasing the concentration of antibiotic able to reach its cellular target and decreasing bacterial resistance to the antibiotic. Natural products known to inhibit bacterial efflux pumps include the alkaloid lysergol, the carotenoids capsanthin and capsorubin, and the flavonoids rotenone and chrysin. Other natural products, this time primary metabolites rather than secondary metabolites, have been shown to eradicate antibiotic tolerance. For example, glucose, mannitol, and fructose reduce antibiotic tolerance in Escherichia coli and Staphylococcus aureus, rendering them more susceptible to killing by aminoglycoside antibiotics.

Natural products may be screened for the ability to suppress bacterial virulence factors too. Virulence factors are molecules, cellular structures and regulatory systems that enable bacteria to evade the body's immune defenses (e.g. urease, staphyloxanthin), move towards, attach to, and/or invade human cells (e.g. type IV pili, adhesins, internalins), coordinate the activation of virulence genes (e.g. quorum sensing), and cause disease (e.g. exotoxins). Examples of natural products with antivirulence activity include the flavonoid epigallocatechin gallate (which inhibits listeriolysin O), the quinone tetrangomycin (which inhibits staphyloxanthin), and the sesquiterpene zerumbone (which inhibits Acinetobacter baumannii motility).

Immunoglobulin therapy

Antibodies (anti-tetanus immunoglobulin) have been used in the treatment and prevention of tetanus since the 1910s, and this approach continues to be a useful way of controlling bacterial disease. The monoclonal antibody bezlotoxumab, for example, has been approved by the US FDA and EMA for recurrent Clostridium difficile infection, and other monoclonal antibodies are in development (e.g. AR-301 for the adjunctive treatment of S. aureus ventilator-associated pneumonia). Antibody treatments act by binding to and neutralizing bacterial exotoxins and other virulence factors.

Phage therapy

Phage injecting its genome into a bacterium. Viral replication and bacterial cell lysis will ensue.

Phage therapy is under investigation as a method of treating antibiotic-resistant strains of bacteria. Phage therapy involves infecting bacterial pathogens with viruses. Bacteriophages have extremely narrow host ranges, each being specific to certain bacteria; thus, unlike antibiotics, they do not disturb the host organism's intestinal microbiota. Bacteriophages, also known simply as phages, infect and kill bacteria primarily during lytic cycles. Phages insert their DNA into the bacterium, where it is transcribed and used to make new phages, after which the cell lyses, releasing new phage that are able to infect and destroy further bacteria of the same strain. The high specificity of phages protects "good" bacteria from destruction.

Some disadvantages to the use of bacteriophages also exist, however. Bacteriophages may harbour virulence factors or toxic genes in their genomes and, prior to use, it may be prudent to identify genes with similarity to known virulence factors or toxins by genomic sequencing. In addition, the oral and IV administration of phages for the eradication of bacterial infections poses a much higher safety risk than topical application. Also, there is the additional concern of uncertain immune responses to these large antigenic cocktails.

There are considerable regulatory hurdles that must be cleared for such therapies. Despite numerous challenges, the use of bacteriophages as a replacement for antimicrobial agents against MDR pathogens that no longer respond to conventional antibiotics remains an attractive option.

Fecal microbiota transplants

Fecal microbiota transplants are an experimental treatment for C. difficile infection.

Fecal microbiota transplants involve transferring the full intestinal microbiota from a healthy human donor (in the form of stool) to patients with C. difficile infection. Although this procedure has not been officially approved by the US FDA, its use is permitted under some conditions in patients with antibiotic-resistant C. difficile infection. Cure rates are around 90%, and work is underway to develop stool banks, standardized products, and methods of oral delivery.

Antisense RNA-based treatments

Antisense RNA-based treatment (also known as gene silencing therapy) involves (a) identifying bacterial genes that encode essential proteins (e.g. the Pseudomonas aeruginosa genes acpP, lpxC, and rpsJ), (b) synthesizing single stranded RNA that is complementary to the mRNA encoding these essential proteins, and (c) delivering the single stranded RNA to the infection site using cell-penetrating peptides or liposomes. The antisense RNA then hybridizes with the bacterial mRNA and blocks its translation into the essential protein. Antisense RNA-based treatment has been shown to be effective in in vivo models of P. aeruginosa pneumonia.
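The complementarity step in (b) is straightforward to illustrate: the antisense strand is the reverse complement of the target mRNA, with A pairing U and G pairing C. A minimal sketch follows; the sequence used is invented for illustration only and is not a real fragment of acpP, lpxC, or rpsJ.

```python
# Sketch: deriving an antisense RNA sequence for a hypothetical mRNA fragment.
# Watson-Crick pairing in RNA: A-U and G-C.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def antisense(mrna: str) -> str:
    """Return the antisense (reverse-complement) RNA for an mRNA sequence,
    read 5'-to-3'."""
    return "".join(COMPLEMENT[base] for base in reversed(mrna.upper()))

# Made-up fragment, for illustration only:
fragment = "AUGGCUAAAGGU"
print(antisense(fragment))  # prints ACCUUUAGCCAU
```

The reversal matters because nucleic acid strands hybridize antiparallel: the antisense strand's 5' end pairs with the mRNA's 3' end.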

In addition to silencing essential bacterial genes, antisense RNA can be used to silence bacterial genes responsible for antibiotic resistance. For example, antisense RNA has been developed that silences the S. aureus mecA gene (the gene that encodes modified penicillin-binding protein 2a and renders S. aureus strains methicillin-resistant). Antisense RNA targeting mecA mRNA has been shown to restore the susceptibility of methicillin-resistant staphylococci to oxacillin in both in vitro and in vivo studies.

CRISPR-Cas9-based treatments

In the early 2000s, a system was discovered that enables bacteria to defend themselves against invading viruses. The system, known as CRISPR-Cas9, consists of (a) an enzyme that destroys DNA (the nuclease Cas9) and (b) the DNA sequences of previously encountered viral invaders (CRISPR). These viral DNA sequences enable the nuclease to target foreign (viral) rather than self (bacterial) DNA.

Although the function of CRISPR-Cas9 in nature is to protect bacteria, the DNA sequences in the CRISPR component of the system can be modified so that the Cas9 nuclease targets bacterial resistance genes or bacterial virulence genes instead of viral genes. The modified CRISPR-Cas9 system can then be administered to bacterial pathogens using plasmids or bacteriophages. This approach has successfully been used to silence antibiotic resistance and reduce the virulence of enterohemorrhagic E. coli in an in vivo model of infection.

Reducing the selection pressure for antibiotic resistance

Share of population using safely managed sanitation facilities in 2015.

In addition to developing new antibacterial treatments, it is important to reduce the selection pressure for the emergence and spread of antibiotic resistance. Strategies to accomplish this include well-established infection control measures such as infrastructure improvement (e.g. less crowded housing), better sanitation (e.g. safe drinking water and food) and vaccine development, other approaches such as antibiotic stewardship, and experimental approaches such as the use of prebiotics and probiotics to prevent infection.

Vaccines

Vaccines rely on immune modulation or augmentation. Vaccination either excites or reinforces the immune competence of a host to ward off infection, leading to the activation of macrophages, the production of antibodies, inflammation, and other classic immune reactions. Antibacterial vaccines have been responsible for a drastic reduction in global bacterial diseases. Vaccines made from attenuated whole cells or lysates have been replaced largely by less reactogenic, cell-free vaccines consisting of purified components, including capsular polysaccharides and their conjugates to protein carriers, as well as inactivated toxins (toxoids) and proteins.

 

Endurance running hypothesis

From Wikipedia, the free encyclopedia

The endurance running hypothesis is the hypothesis that the evolution of certain human characteristics can be explained as adaptations to long-distance running. The hypothesis suggests that endurance running played an important role for early hominins in obtaining food. Researchers have proposed that endurance running began as an adaptation for scavenging and later for persistence hunting.

Anatomical and physiological adaptations

Running vs. walking

Much research has been geared towards the mechanics of how bipedal walking has evolved in the genus Homo. However, little research has been conducted to examine how the specific adaptations for running emerged, and how they influenced human evolution.

The limited research that has focused on human running provides considerable evidence for bodily functions and structures that improve running only and are not used in walking. This suggests that running was an adaptation in its own right, not merely a byproduct of walking.

Running and walking incorporate different biomechanics. Walking requires an "inverted pendulum", where the body's center of mass is shifted over the extended leg to exchange potential and kinetic energy with each step. Running involves a "mass spring" mechanism to exchange potential and kinetic energy, with the use of tendons and ligaments. Tendons and ligaments are elastic tissues that store energy. They are stretched and then release energy as they recoil. This mass spring mechanism becomes less energetically costly at faster speeds and is therefore more efficient than the inverted pendulum of walking mechanics when traveling at greater speeds. Tendons and ligaments, however, do not provide these benefits in walking.

Although the mass spring mechanism can be more energetically favorable at higher speeds, it also results in an increase in ground reaction forces and is less stable because there is more movement and pitching of the limbs and core of the body. Ground forces and body pitching movement is less of an issue in the walking gait, where the position of the body's center of mass varies less, making walking an inherently more stable gait. In response to the destabilization of the running gait, the human body appears to have evolved adaptations to increase stabilization, as well as for the mass-spring mechanism in general. These adaptations, described below, are all evidence for selection for endurance running.

Skeletal evidence

Many researchers compare the skeletal structures of early hominins such as Australopithecus to those of Homo in order to identify structural differences that may be significant to endurance running.

Nuchal ligament: Because the head is decoupled from the shoulders, early Homo needed a way to stabilize the head. The nuchal ligament is an important evolved feature in head stabilization. It starts at the midline of the occiput and connects to the upper trapezius. This ligament is also important in terms of archaeological findings, because it leaves a small indentation and ridge in the skull, allowing researchers to determine whether various species had a nuchal ligament. The ability to see traces of ligaments in archaeological findings is rare because they degrade quickly and often leave no trace; in the case of the nuchal ligament, its existence is recorded by the presence of the skull ridge. Because neither Australopithecus nor Pan had the skull ridge, it has been concluded that this feature is unique to Homo. Because the nuchal ligament is only engaged while running, the amount of running can be inferred from the rugosity of the muscle insertions. In Homo erectus and Neanderthals, very strong nuchal ligament markings are present, but these are less marked in modern humans, indicating a decrease in running behavior.

Nuchal ligament of Homo sapiens

Shoulder and head stabilization: The human skeleton differs from that of early hominins in having less of a connection between the pectoral girdle (the shoulders and upper back) and the head. A tight connection would be advantageous for climbing, but it would hinder the upper-body movements needed to counter leg movement and thereby stabilize the body and head when running. This stabilization is unnecessary in walking.

Limb length and mass: Homo has longer legs relative to body mass, which helps to decrease the energetic costs of running, as time in contact with the ground increases. There is also a decrease in mass of distal parts of limbs of humans, which is known to decrease metabolic costs in endurance running, but has little effect on walking. Additionally, the mass of the upper body limbs in Homo has decreased considerably, relative to total body mass, which is important to reduce the effort of stabilizing the arms in running.

Joint surface: Humans have evolved to absorb great shock and force on the skeletal structure while running. The impact force on the body can reach up to 3–4 times body weight in endurance running, putting the skeletal structure under great stress. To reduce this stress humans have increased joint surfaces relative to body mass to spread force over larger surface areas, particularly in the lower body. This adaptation, which allows humans to absorb great shock and force applied to the skeleton, is not seen in australopithecine skeletal structures.

Plantar arch: The plantar arch in the human foot has an elastic spring function that generates energy for running but not walking. Fossils of the australopithecine foot show only a partial arch, suggesting less spring capacity. For the plantar arch spring mechanism to function fully, there must also be restricted rotation in the hind and front parts of the foot. This restriction comes from the projected toe bone and compacted mid-foot joint structures in humans, which do not appear until Homo habilis.

Calcaneal tuber and Achilles tendon: Studies have explored the calcaneal tuber, the posterior half of the calcaneus bone, as a correlate for Achilles tendon length and have found correlation between calcaneal tuber length and Achilles tendon length. Because shorter calcaneal tuber length leads to greater Achilles stretch, more kinetic energy is converted to elastic energy, translating into better overall running economy. Comparisons between Neanderthals and modern humans reveal that this adaptation was absent in Neanderthals, leading researchers to conclude that endurance running capabilities may have been enhanced in anatomically modern humans.

Shorter toes: Human toes are straight and extremely short relative to body size compared to those of other animals. In running, the toes support 50 to 75% of body mass in humans. Impulse and mechanical work increase in humans as toe length increases, showing that it is energetically favorable to have shorter toes. The costs of shorter toes are decreased gripping capability and power output. However, the efficiency benefits seem to outweigh these costs: the toes of A. afarensis remains were shorter than those of great apes but 40% longer than those of modern humans, suggesting a trend toward shorter toes as primate species moved away from tree-dwelling. Toes 40% longer than a modern human's would theoretically induce a flexor impulse 2.5 times that of modern humans, which would require twice as much mechanical work to stabilize.

Stabilization

Semicircular canal: The semicircular canal, a series of three interconnected tubes within each ear, is important for sensing angular rotations of the head and thus plays a crucial role in maintaining balance and in sensing and coordinating movement. Comparative studies have shown that animals with larger semicircular canals are able to sense a greater range of head movements and therefore have greater speed and agility. Evolutionarily, semicircular canal diameters are greatly reduced in Neanderthals but expanded in modern humans, suggesting that this adaptation was selected for in response to increased endurance running.

Vestibulo-ocular reflexes (VORs): VORs are reflexive eye movements, driven by the sensing of angular accelerations of the head, that adjust gaze to stabilize the retinal image. This was an important adaptation for running because it allowed Homo to see clearly despite the rough pitching motion that occurs during running.

Gluteals: The gluteus maximus in Homo erectus is significantly larger than that of Australopithecus. It is suited to absorb and return force, much like a spring, as the body oscillates vertically with each step. Gluteals of that size and strength are not necessary for walking.

Iliac spine: Homo has expanded areas on the sacrum and posterior iliac spine for greater muscle attachment. These areas are used to stabilize the trunk and reduce the body's forward pitch caused by running strides.

Increased efficiency

Thermoregulation

In addition to advances in skeletal structure and stabilization, adaptations that led to increased efficiency in dissipation of heat were instrumental in the evolution of endurance running in Homo. The duration for which an animal can run is determined by its capacity to release more heat than is produced to avoid lethal temperatures.

The majority of mammals, including humans, rely on evaporative cooling to maintain body temperature. Most medium-to-large mammals rely on panting, while humans rely on sweating, to dissipate heat. Advantages of panting include cooler skin surface, little salt loss, and heat loss by forced convection instead of reliance on wind or other means of convection. On the other hand, sweating is advantageous in that evaporation occurs over a much larger surface area (the skin), and it is independent of respiration, thus is a much more flexible mode of cooling during intense activity such as running. Because human sweat glands are under a higher level of neuronal control than those of other species, they allow for the excretion of more sweat per unit surface area than any other species. Heat dissipation of later hominins was also enhanced by the reduction in body hair. By ridding themselves of an insulating fur coat, running humans are better able to dissipate the heat generated by exercise.

In addition to improved thermoregulation, hominins have evolved an enhanced method of respiration consistent with the demands of running. Due to their orientation, respiration in quadrupedal mammals is affected by skeletal and muscular stresses generated through the motion of running. The bones and muscles of the chest cavity are not only responsible for shock absorption, but are also subjected to continuous compression and expansion during the running cycle. Because of this movement, quadrupeds are restricted to one breath per locomotor cycle, and thus must coordinate their running gait and respiration rate. This tight coordination then translates into another restriction: a specific running speed that is most energetically favorable. The upright orientation of bipedal hominins, however, frees them from this respiration-gait restriction. Because their chest cavities are not directly compressed or involved in the motion of running, hominins are able to vary their breathing patterns with gait. This flexibility in respiration rate and running gait contributes to hominins having a broader range of energetically favorable running speeds.

Storage and utilization of energy

During periods of prolonged exercise, animals are dependent on a combination of two sources of fuel: glycogen stored in the muscles and liver, and fat. Because glycogen is more easily oxidized than fat, it is depleted first. However, over longer periods of time, energy demands require that fat stores be utilized as fuel. This is true for all mammals, but hominins, and later modern humans, have an advantage of being able to alter their diet to meet these prolonged energy demands.

In addition to flexibility in the utilization of energy, hominins have evolved larger thyroid and adrenal glands, which enable them to utilize the energy in carbohydrates and fatty acids more readily and efficiently. These organs are responsible for releasing hormones including epinephrine, norepinephrine, adrenocorticotropic hormone (ACTH), glucagon, and thyroxine. Larger glands allow for greater production of these key hormones and, ultimately, maximized utilization of stored fuel.

Taken together, the flexibility in diet and the enhanced usage of fuel reinforce the previously mentioned finding that, unlike quadrupeds, hominins do not have a single energetically optimal running speed. For quadrupeds, increasing running speed means increasing the demand for oxygen and fuel. Due to their skeletal structure and bipedalism, hominins are free to run energetically over a broader range of speeds and gaits, while maintaining a constant energy consumption rate of approximately 4.1 MJ per 15 km. Thus their utilization of energy is greatly enhanced.
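For a sense of scale, the 4.1 MJ per 15 km figure can be converted to a per-kilometre cost (a back-of-the-envelope conversion, not stated in the sources):

```latex
\frac{4.1\,\text{MJ}}{15\,\text{km}} \approx 0.27\,\text{MJ/km} \approx 65\,\text{kcal/km}
```

This is broadly consistent with the common rule of thumb that human running costs on the order of 1 kcal per kilogram of body mass per kilometre.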

Endurance running and scavenging

All of the aforementioned adaptations enabled Homo to scavenge for food more effectively. Endurance running could have been used as a means of gaining access to distant carcasses or food stores faster than other scavengers and/or carnivores. Scavenging may have taken one or both of two forms: opportunistic scavenging and strategic scavenging.

Early Homo almost certainly scavenged opportunistically. Scavenging is considered opportunistic when one "come[s] across carcasses in the course of [their] daily foraging activities".

Strategic scavenging involves a planned search for carcasses. This style of scavenging would have benefitted from endurance running much more than opportunistic scavenging. Strategic scavenging would have involved the use of long-range cues, such as birds circling overhead. Endurance running would have been advantageous in this setting because it allowed hominins to reach a carcass more quickly. Selection pressure for strategic scavenging would have been very high, because hominins were diurnal while their major competitors (hyenas, lions, etc.) were not, so hominins had to capitalize on daytime carcasses. Selection pressure also came from the relative physical weakness of Homo: unlikely to drive off large competitors at a carcass, hominins had an even greater need to reach it before those competitors arrived.

Endurance running and persistence hunting

Persistence hunting is "a form of pursuit hunting in which [the hunter uses] endurance running during the midday heat to drive [prey] into hyperthermia and exhaustion so they can easily be killed". Many question persistence hunting's plausibility when bow-and-arrow and other technologies were so much more efficient. However, in the Early Stone Age (ESA), spears were only sharpened wood, and hominins had not yet developed projectile weapons. The lack of spearheads or bows meant they could only hunt from very close range, between 6 and 10 meters. Hominins thus must have developed a way to stab prey from close range without causing serious bodily harm to themselves. Persistence hunting makes killing an animal easier by first bringing it to exhaustion, so that it can no longer retaliate violently.

Persistence hunters work by hunting in the middle of the day, when it is hottest. Hunters choose a single target prey and chase it at a speed between its trot and gallop, which is extremely inefficient for the animal. The hunter then continues pursuing over a period of hours, during which he may lose sight of the animal. In this case, the hunter must use tracks and an understanding of the animal to continue the chase. The prey eventually overheats and becomes unable to continue fleeing. Homo, which does not overheat as quickly because of its superior thermoregulation capabilities, is then able to stab the prey while it is incapacitated and cannot attack.

Tracking and running

Due to the complexity of following a fleeing animal, tracking methods must have been a prerequisite for the use of endurance running in persistence hunting. Scientists posit that early tracking methods were developed in open, sparsely vegetated terrain such as the Kalahari Desert in southern Africa. This "systemic tracking" involves simply following the footprints of animals and was most likely used for tracking grassland species on soft terrain. Skeletal remains suggest that during the Middle Stone Age, hominins used systemic tracking to scavenge for medium-sized animals in vegetation cover, but for hunting antelope in more open grasslands. From the Middle Stone Age into the Later Stone Age, tracking methods developed into what is termed "speculative tracking". When tracks could not easily be found and followed, Homo predicted where tracks were most likely to be found and interpreted other signs to locate prey. This advanced method of tracking allowed for the exploitation of prey in a variety of terrains, making endurance running for persistence hunting more plausible.

The process of tracking can last many hours and even days in the case of very large mammals. Often, the hunter(s) will have to run after the animal to keep up. Skeletal parameters of the tibia of early modern humans and Neanderthals have been compared with those of modern runners, and the comparison surprisingly shows that these individuals were running even more than cross-country runners do today. In particular, European Neanderthals, the Skhul and Qafzeh hominins, and Late Stone Age Khoisan score very high compared to runners. This is consistent with modern observations of the Khoisan, who routinely spend hours running after animals that have been shot with arrows.

Examples of persistence hunters

Although exact dates and methods of persistence hunting are difficult to study, several recent accounts of persistence hunting have been recorded. Tribes in the Kalahari Desert in Botswana have been known to employ endurance running to scavenge and hunt prey. In open country, the Xo and Gwi tribes run down slow-moving animals such as aardvarks and porcupines, while during the hotter part of the day they target animals such as eland, kudu, gemsbok, hartebeest, duiker, steenbok, cheetah, caracal, and African wildcats. In addition to these existing African tribes, it has been suggested that the Tarahumara people in Mexico and the Paiute and Navajo peoples in the American Southwest used persistence hunting to capture prey including deer and pronghorn. The Aborigines in Australia are known to have hunted kangaroo in similar ways. Given the increased availability of weapons, nutrition, tracking devices, and motor vehicles, one may argue that persistence hunting is no longer an effective method of hunting animals for food. However, there are examples of the practice occurring in modern times: the Xo and Gwi in the central Kalahari still practice persistence hunting and have developed advanced methods of doing so. Similarly, the Russian Lykov family, who lived in isolation for 40 years, also used persistence hunting due to a lack of weapons.

In culture and folklore

In the oral traditions of the Hadza, an isolated indigenous hunter-gatherer people living in Tanzania, the Tlaatlanebe of the second epoch of their folk history practiced persistence hunting.

In the first epoch, the world was inhabited by large hairy humanoids called Akakaanebe ("ancestors"), who did not yet possess tools or fire. They simply "stared" at game until it fell dead, a description that may refer to scavenging, to early persistence hunting without weapons, or to a combination of the two. They did not build houses but slept under trees.

The Tlaatlanebe of the second epoch, however, were large but without hair and lived in caves. As animals had grown more wary of humans due to earlier hunting, they now had to be chased and hunted with dogs.

Criticisms

While there is evidence supporting selection on human morphology to improve endurance running ability, there is some dispute over whether the ecological benefits of scavenging and persistence hunting foraging behaviors were the driving force behind this development.

The majority of the arguments opposing persistence hunting and scavenging behaviors are linked to the claim that the paleohabitat and paleoecology of early Homo were not conducive to these behaviors. It is thought that the earliest members of Homo lived in African savanna-woodlands. This environment consisted of open grassland as well as areas of dense vegetation, an intermediate between forest and open savanna. The presence of such tree cover would reduce visibility and so require tracking skills. This poses problems for the hypotheses that endurance running evolved to aid persistence hunting or scavenging.

Against persistence hunting

Ungulates are known from archaeological evidence to have been the main prey of early Homo, and given their great speed, they would easily have been able to outrun early hominins. Ungulate speed, coupled with the variable visibility of the savanna-woodland, meant that hunting by endurance running required the ability to track prey. Pickering and Bunn argue that tracking is part of a sophisticated cognitive skill set that early hominins would not have had, and that even if following a trail of blood left by an injured ungulate was within their cognitive capacity, the ability to craft penetrating projectile technology was absent in early hominins.

It has been suggested that modern hunters in Africa do not use persistence hunting as a foraging method, and most often give up a chase when the trail they are following ends in vegetation. The rare groups of hunters who do occasionally practice persistence hunting are able to do so because of the extremely hot and open environments in which they live. In these groups, a full day of rest and recovery is required after a hunt, indicating the great toll persistence hunts take on the body and making them rare undertakings.

Finally, a critique of Liebenberg's research on modern-day persistence hunting revealed that the majority of the hunts were staged for filming rather than spontaneous, and that few of these hunts were successful. The hunts that did succeed involved external factors, such as the hunters being able to stop and refill water bottles.

A response to these criticisms has been formulated by Lieberman et al., noting that it is unclear how humans could have grown to occupy a new niche as a diurnal social carnivore without persistence hunting, as the weapons preferred in modern hunter-gatherer tribes would not have been available at the time.

Against scavenging

The proposed benefit of endurance running in scavenging is that it would have allowed early hominins to outcompete other scavengers in reaching food sources. However, paleoanthropological studies suggest that the savanna-woodland habitat was a very low-competition environment: because of low visibility, carcasses were not easily located by mammalian carnivores, resulting in less competition.

Hunting hypothesis


In paleoanthropology, the hunting hypothesis is the hypothesis that human evolution was primarily influenced by the activity of hunting for relatively large and fast animals, and that the activity of hunting distinguished human ancestors from other hominins.

While it is undisputed that early humans were hunters, the "hunting hypothesis" emphasizes the importance of hunting in the final steps of the emergence of the genus Homo out of earlier australopithecines, with its bipedalism and production of stone tools (from about 2.5 million years ago), and eventually also control of fire (from about 1.5 million years ago). Competing scenarios instead de-emphasize hunting, stressing the omnivore status of humans as their recipe for success, and social interaction, including mating behaviour, as essential in the emergence of language and culture.

Advocates of the hunting hypothesis tend to believe that tool use and toolmaking essential to effective hunting were an extremely important part of human evolution, and trace the origin of language and religion to a hunting context.

As societal evidence, David Buss cites the fact that modern tribal populations deploy hunting as their primary way of acquiring food. The Aka pygmies in the Central African Republic spend 56% of their quest for nourishment hunting, 27% gathering, and 17% processing food. Additionally, the !Kung in Botswana obtain 40% of their calories from hunting, a percentage that varies from 20% to 90% depending on the season. For physical evidence, Buss first looks to the guts of humans and apes. The human gut consists mainly of the small intestines, which are responsible for the rapid breakdown of proteins and absorption of nutrients, whereas the ape gut is primarily colon, which indicates a vegetarian diet. This structural difference supports the hunting hypothesis as an evolutionary branching point between modern humans and modern primates. Buss also cites human teeth: fossilized human teeth have a thin enamel coating and show very little of the heavy wear and tear that would result from a plant diet, and the absence of thick enamel indicates that humans have historically maintained a meat-heavy diet. Finally, Buss notes that the bones of animals killed by human ancestors, found at Olduvai Gorge, bear cut marks at strategic points, indicating tool usage and providing evidence of ancestral butchery.

Applications

Sexual division of labor (evolutionary perspective)

According to the hunting hypothesis, women are preoccupied with pregnancy and dependent children and so do not hunt, because hunting is dangerous and less profitable for them. Gijsbert Stoet highlights the fact that men are more competent in throwing skills, focused attention, and spatial abilities. Another possible explanation for women gathering is their inherent prioritization of rearing offspring, which would be difficult to uphold if women were hunting.

Provisioning hypothesis

Parental investment

Buss argues that the hunting hypothesis explains the high level of human male parental investment in offspring compared with primates. Meat is an economical and condensed food resource in that it can be brought home to feed the young, whereas it is not efficient to carry low-calorie foods across great distances. Thus, the act of hunting and the required transportation of the kill in order to feed offspring is a reasonable explanation for human male provisioning.

Male coalitions

Buss suggests that the hunting hypothesis also explains the advent of strong male coalitions. Although chimpanzees form male-male coalitions, these tend to be temporary and opportunistic. By contrast, hunting large game requires consistent and coordinated cooperation to succeed. Thus, male coalitions were the result of working together to provide meat for the hunters themselves and their families. Kristen Hawkes further suggests that obtaining resources intended for community consumption increases a male's fitness by appealing to the male's society, placing him in the good favor of both males and females: relationships with males would improve hunting success and create alliances for future conflict, and relationships with females would improve direct reproductive success. Buss proposes alternative explanations for the emergence of strong male coalitions, suggesting that they may have been the result of group-on-group aggression, defense, and in-group political alliances; this explanation does not rely on a relationship between male coalitions and hunting.

Hawkes proposes that hunters pursue large game and divide the kill across the group. Hunters compete to divvy up the kill in order to signal courage, power, generosity, prosocial intent, and dedication, and by doing so receive reproductive benefits and respect; these benefits lead to greater reproductive success for more skilled hunters. Evidence of hunting goals that do not solely benefit the hunters' families can be found among Ache and Hadza men: Hawkes notes that their hunting techniques are less efficient and more energetically costly than alternative methods, but that the men place more importance on displaying their bravery, power, and prosocial intent than on hunting efficiency. This differs from other societies, in which hunters retain control of their kills and signal their intent to share. That alternative pattern aligns with the coalition support hypothesis, serving to create and preserve political associations.

Reciprocal altruism

The meat from a successful large game hunt is more than a single hunter can consume. Further, hunting success varies by week: one week a hunter may succeed in taking large game, and the next he may return with no meat. In this situation, Buss suggests, there are low costs to giving away meat that the individual hunter cannot eat on his own, and large benefits from the expectation of the returned favor in a week when his hunting is not successful. Hawkes calls this sharing “tolerated theft” and argues that the benefits of reciprocal altruism stem from families experiencing “lower daily variation and higher daily average” in their resources.

Provisioning may actually be a form of sexual competition between males for females. Hawkes suggests that male provisioning is a particularly human behavior, one that forges the nuclear family, and that the structure of familial provisioning determines a form of resource distribution. However, Hawkes acknowledges inconsistencies across societies and contexts: the time devoted to hunting and gathering fluctuates and is not directly correlated with return rates, nutritional value is often chosen over caloric count, and meat is a more widely shared resource than other resources.

The show-off hypothesis

The show-off hypothesis is the idea that more successful hunters have better mate options. It rests on the fact that meat, the product of hunting expeditions, is a distinct resource: it comes in quantities so large that the hunter's own family usually cannot consume it before it spoils. Hunting success is also unpredictable, whereas berries and fruits, barring a drought or a bad bush, are fairly consistent in their seasonality. Kristen Hawkes argues that women favor men who provide these advantageous, yet infrequent, meat feasts. Women may profit from alliance with such men and the resulting feasts, especially in times of shortage, so Hawkes suggests it would be beneficial for women to reward men who employ the “show-off strategy” by supporting them in disputes, caring for their offspring, or providing sexual favors. The benefits women may gain from this alignment include favored treatment by neighbors of the offspring they bear to the show-off. Buss echoes and cites Hawkes's account of the show-off's benefits: sexual access, increased likelihood of having children, and favorable treatment of his children by other members of the society. Hawkes also suggests that show-offs are more likely to live in large groups and thus be less susceptible to predators. Show-offs gain more than they would from sharing only with their family (classical fitness), in the form of potential favorable treatment from the community and reciprocal altruism from its other members.

Hawkes uses the Ache people of Paraguay as evidence for the show-off hypothesis. Food acquired by men was more widely distributed across the community, and inconsistent resources that came in large quantities when acquired were also more widely shared.

While Hawkes finds this pattern in the Ache, Buss notes that it is contradicted in the Hadza, who distribute meat evenly across all members of their population and whose hunters have very little control over the distribution. Among the Hadza, the show-off hypothesis is concerned not with the resources that result from hunting but with the prestige and risk involved in big game hunting, which may bring circuitous benefits such as protection and defense.


Autoimmunity



Autoimmunity is the system of immune responses of an organism against its own healthy cells, tissues and other normal body constituents. Any disease resulting from such an aberrant immune response is termed an "autoimmune disease". Prominent examples include celiac disease, post-infectious IBS, diabetes mellitus type 1, Henoch-Schönlein purpura (HSP), sarcoidosis, systemic lupus erythematosus (SLE), Sjögren syndrome, eosinophilic granulomatosis with polyangiitis, Hashimoto's thyroiditis, Graves' disease, idiopathic thrombocytopenic purpura, Addison's disease, rheumatoid arthritis (RA), ankylosing spondylitis, polymyositis (PM), dermatomyositis (DM) and multiple sclerosis (MS). Autoimmune diseases are very often treated with steroids.

Autoimmunity means the presence of antibodies or T cells that react with self-proteins; it is present in all individuals, even in a normal state of health. It causes autoimmune disease when such self-reactivity leads to tissue damage.

History

In the late 19th century it was believed that the immune system was unable to react against the body's own tissues. Paul Ehrlich, at the turn of the 20th century, proposed the concept of horror autotoxicus. Ehrlich later adjusted his theory to recognize the possibility of autoimmune tissue attacks, but believed certain innate protection mechanisms would prevent the autoimmune response from becoming pathological.

In 1904 this theory was challenged by the discovery of a substance in the serum of patients with paroxysmal cold hemoglobinuria that reacted with red blood cells. During the following decades, a number of conditions could be linked to autoimmune responses. However, the authoritative status of Ehrlich's postulate hampered the understanding of these findings. Immunology became a biochemical rather than a clinical discipline. By the 1950s the modern understanding of autoantibodies and autoimmune diseases started to spread.

More recently it has become accepted that autoimmune responses are an integral part of vertebrate immune systems (sometimes termed "natural autoimmunity"). Autoimmunity should not be confused with alloimmunity.

Low-level autoimmunity

While a high level of autoimmunity is unhealthy, a low level of autoimmunity may actually be beneficial. Taking the idea of beneficial autoimmunity further, one might hypothesize that autoimmunity is always a self-defense mechanism that the mammalian system uses to survive. The system does not randomly lose the ability to distinguish between self and non-self; the attack on cells may be the consequence of cycling metabolic processes necessary to keep the blood chemistry in homeostasis.

Second, autoimmunity may have a role in allowing a rapid immune response in the early stages of an infection when the availability of foreign antigens limits the response (i.e., when there are few pathogens present). In their study, Stefanova et al. (2002) injected an anti-MHC class II antibody into mice expressing a single type of MHC Class II molecule (H-2b) to temporarily prevent CD4+ T cell-MHC interaction. Naive CD4+ T cells (those that have not encountered non-self antigens before) recovered from these mice 36 hours post-anti-MHC administration showed decreased responsiveness to the antigen pigeon cytochrome c peptide, as determined by ZAP70 phosphorylation, proliferation, and interleukin 2 production. Thus Stefanova et al. (2002) demonstrated that self-MHC recognition (which, if too strong may contribute to autoimmune disease) maintains the responsiveness of CD4+ T cells when foreign antigens are absent.

Immunological tolerance

Pioneering work by Noel Rose and Ernst Witebsky in New York, and Roitt and Doniach at University College London provided clear evidence that, at least in terms of antibody-producing B cells (B lymphocytes), diseases such as rheumatoid arthritis and thyrotoxicosis are associated with loss of immunological tolerance, which is the ability of an individual to ignore "self", while reacting to "non-self". This breakage leads to the immune system's mounting an effective and specific immune response against self determinants. The exact genesis of immunological tolerance is still elusive, but several theories have been proposed since the mid-twentieth century to explain its origin.

Three hypotheses have gained widespread attention among immunologists:

  • Clonal deletion theory, proposed by Burnet, according to which self-reactive lymphoid cells are destroyed during the development of the immune system in an individual. For their work Frank M. Burnet and Peter B. Medawar were awarded the 1960 Nobel Prize in Physiology or Medicine "for discovery of acquired immunological tolerance".
  • Clonal anergy theory, proposed by Nossal, in which self-reactive T- or B-cells become inactivated in the normal individual and cannot amplify the immune response.
  • Idiotype network theory, proposed by Jerne, wherein a network of antibodies capable of neutralizing self-reactive antibodies exists naturally within the body.

In addition, two other theories are under intense investigation:

  • Clonal ignorance theory, according to which autoreactive T cells that are not represented in the thymus mature and migrate to the periphery, where they do not encounter the appropriate antigen because it resides in inaccessible tissues. Consequently, autoreactive B cells that escape deletion cannot find the antigen or the specific helper T cell.
  • Suppressor population or Regulatory T cell theory, wherein regulatory T-lymphocytes (commonly CD4+FoxP3+ cells, among others) function to prevent, downregulate, or limit autoaggressive immune responses in the immune system.

Tolerance can also be differentiated into "central" and "peripheral" tolerance, based on whether the above-stated checking mechanisms operate in the central lymphoid organs (thymus and bone marrow) or the peripheral lymphoid organs (lymph nodes, spleen, etc., where self-reactive B cells may be destroyed). It must be emphasised that these theories are not mutually exclusive, and evidence has been mounting that all of these mechanisms may actively contribute to vertebrate immunological tolerance.

A puzzling feature of the documented loss of tolerance seen in spontaneous human autoimmunity is that it is almost entirely restricted to the autoantibody responses produced by B lymphocytes. Loss of tolerance by T cells has been extremely hard to demonstrate, and where there is evidence for an abnormal T cell response it is usually not to the antigen recognised by autoantibodies. Thus, in rheumatoid arthritis there are autoantibodies to IgG Fc but apparently no corresponding T cell response. In systemic lupus there are autoantibodies to DNA, which cannot evoke a T cell response, and limited evidence for T cell responses implicates nucleoprotein antigens. In Celiac disease there are autoantibodies to tissue transglutaminase but the T cell response is to the foreign protein gliadin. This disparity has led to the idea that human autoimmune disease is in most cases (with probable exceptions including type I diabetes) based on a loss of B cell tolerance which makes use of normal T cell responses to foreign antigens in a variety of aberrant ways.

Immunodeficiency and autoimmunity

There are a large number of immunodeficiency syndromes that present clinical and laboratory characteristics of autoimmunity. The decreased ability of the immune system to clear infections in these patients may be responsible for causing autoimmunity through perpetual immune system activation.

One example is common variable immunodeficiency (CVID), in which multiple autoimmune diseases are seen, e.g. inflammatory bowel disease, autoimmune thrombocytopenia and autoimmune thyroid disease.

Familial hemophagocytic lymphohistiocytosis, an autosomal recessive primary immunodeficiency, is another example. Pancytopenia, rashes, swollen lymph nodes and enlargement of the liver and spleen are commonly seen in such individuals. The presence of multiple uncleared viral infections, due to lack of perforin, is thought to be responsible.

In addition to chronic and/or recurrent infections, many autoimmune diseases, including arthritis, autoimmune hemolytic anemia, scleroderma and type 1 diabetes mellitus, are also seen in X-linked agammaglobulinemia (XLA). Recurrent bacterial and fungal infections and chronic inflammation of the gut and lungs are seen in chronic granulomatous disease (CGD) as well. CGD is caused by decreased production of nicotinamide adenine dinucleotide phosphate (NADPH) oxidase by neutrophils. Hypomorphic RAG mutations are seen in patients with midline granulomatous disease, an autoimmune disorder that is commonly seen in patients with granulomatosis with polyangiitis and NK/T cell lymphomas.

Wiskott–Aldrich syndrome (WAS) patients also present with eczema, autoimmune manifestations, recurrent bacterial infections and lymphoma.

In autoimmune polyendocrinopathy-candidiasis-ectodermal dystrophy (APECED), autoimmunity and infections also coexist: organ-specific autoimmune manifestations (e.g. hypoparathyroidism and adrenocortical failure) occur alongside chronic mucocutaneous candidiasis.

Finally, IgA deficiency is also sometimes associated with the development of autoimmune and atopic phenomena.

Genetic factors

Certain individuals are genetically susceptible to developing autoimmune diseases. This susceptibility is associated with multiple genes plus other risk factors. Genetically predisposed individuals do not always develop autoimmune diseases.

Three main sets of genes are suspected in many autoimmune diseases. These genes are related to:

  • immunoglobulins,
  • T-cell receptors, and
  • the major histocompatibility complexes (MHC).

The first two, which are involved in the recognition of antigens, are inherently variable and susceptible to recombination. These variations enable the immune system to respond to a very wide variety of invaders, but may also give rise to lymphocytes capable of self-reactivity.

Fewer correlations exist with MHC class I molecules. The most notable and consistent is the association between HLA B27 and spondyloarthropathies like ankylosing spondylitis and reactive arthritis. Correlations may exist between polymorphisms within class II MHC promoters and autoimmune disease.

The contributions of genes outside the MHC complex remain the subject of research, both in animal models of disease (Linda Wicker's extensive genetic studies of diabetes in the NOD mouse) and in patients (Brian Kotzin's linkage analysis of susceptibility to SLE).

Recently, PTPN22 has been associated with multiple autoimmune diseases including type 1 diabetes, rheumatoid arthritis, systemic lupus erythematosus, Hashimoto's thyroiditis, Graves' disease, Addison's disease, myasthenia gravis, vitiligo, systemic sclerosis, juvenile idiopathic arthritis, and psoriatic arthritis.

Sex

There is some evidence that a person's sex may play a role in the development of autoimmunity; that is, most autoimmune diseases are sex-related. A few autoimmune diseases that men are just as likely as, or more likely than, women to develop include ankylosing spondylitis, type 1 diabetes mellitus, granulomatosis with polyangiitis, Crohn's disease, primary sclerosing cholangitis and psoriasis.

The reasons for the role of sex in autoimmunity vary. Women appear generally to mount larger inflammatory responses than men when their immune systems are triggered, increasing the risk of autoimmunity. Involvement of sex steroids is indicated by the fact that many autoimmune diseases tend to fluctuate in accordance with hormonal changes, for example during pregnancy, across the menstrual cycle, or when oral contraception is used. A history of pregnancy also appears to leave a persistently increased risk for autoimmune disease. It has been suggested that the slight, direct exchange of cells between mothers and their children during pregnancy may induce autoimmunity, which would tip the sex balance in the direction of the female.

Another theory suggests that the high female tendency to develop autoimmunity is due to imbalanced X-chromosome inactivation. The X-inactivation skew theory, proposed by Princeton University's Jeff Stewart, has recently been confirmed experimentally in scleroderma and autoimmune thyroiditis. Other complex X-linked genetic susceptibility mechanisms have been proposed and are under investigation.

Environmental factors

Infectious diseases and parasites

An interesting inverse relationship exists between infectious diseases and autoimmune diseases. In areas where multiple infectious diseases are endemic, autoimmune diseases are quite rarely seen, and the reverse, to some extent, seems to hold true. The hygiene hypothesis attributes these correlations to the immune-manipulating strategies of pathogens. While such observations have been variously termed spurious and ineffective, according to some studies parasite infection is associated with reduced activity of autoimmune disease.

The putative mechanism is that the parasite attenuates the host immune response in order to protect itself. This may provide a serendipitous benefit to a host that also suffers from autoimmune disease. The details of parasite immune modulation are not yet known, but may include secretion of anti-inflammatory agents or interference with the host immune signaling.

A paradoxical observation has been the strong association of certain microbial organisms with autoimmune diseases. For example, Klebsiella pneumoniae and coxsackievirus B have been strongly correlated with ankylosing spondylitis and diabetes mellitus type 1, respectively. This has been explained by the tendency of the infecting organism to produce super-antigens that are capable of polyclonal activation of B-lymphocytes, and production of large amounts of antibodies of varying specificities, some of which may be self-reactive (see below).

Chemical agents and drugs

Certain chemical agents and drugs can also be associated with the genesis of autoimmune conditions, or conditions that simulate autoimmune diseases. The most striking of these is the drug-induced lupus erythematosus. Usually, withdrawal of the offending drug cures the symptoms in a patient.

Cigarette smoking is now established as a major risk factor for both incidence and severity of rheumatoid arthritis. This may relate to abnormal citrullination of proteins, since the effects of smoking correlate with the presence of antibodies to citrullinated peptides.

Pathogenesis of autoimmunity

Several mechanisms are thought to be operative in the pathogenesis of autoimmune diseases, against a backdrop of genetic predisposition and environmental modulation. It is beyond the scope of this article to discuss each of these mechanisms exhaustively, but a summary of some of the important ones is given below:

  • T-cell bypass – A normal immune system requires the activation of B cells by T cells before the former can undergo differentiation into plasma B-cells and subsequently produce antibodies in large quantities. This requirement of a T cell can be bypassed in rare instances, such as infection by organisms producing super-antigens, which are capable of initiating polyclonal activation of B-cells, or even of T-cells, by directly binding to the β-subunit of T-cell receptors in a non-specific fashion.
  • T-cell–B-cell discordance – A normal immune response is assumed to involve B and T cell responses to the same antigen, even though we know that B cells and T cells recognise very different things: conformations on the surface of a molecule for B cells, and pre-processed peptide fragments of proteins for T cells. However, as far as we know, nothing requires this. All that is required is that a B cell recognising antigen X endocytoses and processes a protein Y (normally =X) and presents it to a T cell. Roosnek and Lanzavecchia showed that B cells recognising IgGFc could get help from any T cell responding to an antigen co-endocytosed with IgG by the B cell as part of an immune complex. In coeliac disease it seems likely that B cells recognising tissue transglutaminase are helped by T cells recognising gliadin.
  • Aberrant B cell receptor-mediated feedback – A feature of human autoimmune disease is that it is largely restricted to a small group of antigens, several of which have known signaling roles in the immune response (DNA, C1q, IgGFc, Ro, Con A receptor, peanut agglutinin receptor (PNAR)). This fact gave rise to the idea that spontaneous autoimmunity may result when the binding of antibody to certain antigens leads to aberrant signals being fed back to parent B cells through membrane-bound ligands. These ligands include the B cell receptor (for antigen), IgG Fc receptors, CD21 (which binds complement C3d), Toll-like receptors 9 and 7 (which can bind DNA and nucleoproteins) and PNAR. More indirect aberrant activation of B cells can also be envisaged with autoantibodies to the acetylcholine receptor (on thymic myoid cells) and to hormones and hormone-binding proteins. Together with the concept of T-cell–B-cell discordance, this idea forms the basis of the hypothesis of self-perpetuating autoreactive B cells. Autoreactive B cells in spontaneous autoimmunity are seen as surviving because of subversion both of the T cell help pathway and of the feedback signal through the B cell receptor, thereby overcoming the negative signals responsible for B cell self-tolerance without necessarily requiring loss of T cell self-tolerance.
  • Molecular mimicry – An exogenous antigen may share structural similarities with certain host antigens; thus, any antibody produced against this antigen (which mimics the self-antigens) can also, in theory, bind to the host antigens and amplify the immune response. The idea of molecular mimicry arose in the context of rheumatic fever, which follows infection with Group A beta-haemolytic streptococci. Although rheumatic fever has been attributed to molecular mimicry for half a century, no antigen has been formally identified (if anything, too many have been proposed). Moreover, the complex tissue distribution of the disease (heart, joint, skin, basal ganglia) argues against a cardiac-specific antigen. It remains entirely possible that the disease is due to, for example, an unusual interaction between immune complexes, complement components, and endothelium.
  • Idiotype cross-reaction – Idiotypes are antigenic epitopes found in the antigen-binding portion (Fab) of the immunoglobulin molecule. Plotz and Oldstone presented evidence that autoimmunity can arise as a result of a cross-reaction between the idiotype on an antiviral antibody and a host cell receptor for the virus in question. In this case, the host-cell receptor is envisioned as an internal image of the virus, and the anti-idiotype antibodies can react with the host cells.
  • Cytokine dysregulation – Cytokines have recently been divided into two groups according to the population of cells whose functions they promote: helper T cells type 1 or type 2. The second category of cytokines, which includes IL-4, IL-10, and TGF-β (to name a few), seems to play a role in preventing the exaggeration of pro-inflammatory immune responses.
  • Dendritic cell apoptosis – immune system cells called dendritic cells present antigens to lymphocytes, activating them. Dendritic cells that are defective in apoptosis can lead to inappropriate systemic lymphocyte activation and a consequent decline in self-tolerance.
  • Epitope spreading or epitope drift – when the immune reaction changes from targeting the primary epitope to also targeting other epitopes. In contrast to molecular mimicry, the other epitopes need not be structurally similar to the primary one.
  • Epitope modification or cryptic epitope exposure – this mechanism of autoimmune disease is unique in that it does not result from a defect in the hematopoietic system. Instead, disease results from the exposure of cryptic N-glycan (polysaccharide) linkages, common to lower eukaryotes and prokaryotes, on the glycoproteins of mammalian non-hematopoietic cells and organs. This exposure of phylogenetically primitive glycans activates one or more mammalian innate immune cell receptors to induce a chronic sterile inflammatory state. In the presence of chronic and inflammatory cell damage, the adaptive immune system is recruited and self-tolerance is lost, with increased autoantibody production. In this form of the disease, the absence of lymphocytes can accelerate organ damage, and intravenous IgG administration can be therapeutic. Although this route to autoimmune disease may underlie various degenerative disease states, no diagnostics for this disease mechanism exist at present, and thus its role in human autoimmunity is currently unknown.

The roles of specialized immunoregulatory cell types, such as regulatory T cells, NKT cells, and γδ T cells, in the pathogenesis of autoimmune disease are under investigation.

Classification

Autoimmune diseases can be broadly divided into systemic and organ-specific or localised autoimmune disorders, depending on the principal clinico-pathologic features of each disease.

Using the traditional "organ-specific" and "non-organ-specific" classification scheme, many diseases have been lumped together under the autoimmune disease umbrella. However, many chronic inflammatory human disorders lack the telltale hallmarks of B- and T-cell-driven immunopathology. In the last decade it has been firmly established that tissue "inflammation against self" does not necessarily rely on abnormal T and B cell responses.

This has led to the recent proposal that the spectrum of autoimmunity should be viewed along an "immunological disease continuum," with classical autoimmune diseases at one extreme and diseases driven by the innate immune system at the other. Within this scheme, the full spectrum of autoimmunity can be included, and many common human autoimmune diseases can be seen to involve substantial innate immune-mediated immunopathology. This new classification scheme has implications for understanding disease mechanisms and for therapy development.

Diagnosis

Diagnosis of autoimmune disorders largely rests on an accurate history and physical examination of the patient, and a high index of suspicion against a backdrop of certain abnormalities in routine laboratory tests (for example, elevated C-reactive protein).

In several systemic disorders, serological assays that can detect specific autoantibodies can be employed. Localised disorders are best diagnosed by immunofluorescence of biopsy specimens.

Autoantibodies are used to diagnose many autoimmune diseases. The levels of autoantibodies are measured to determine the progress of the disease.

Treatments

Treatments for autoimmune disease have traditionally been immunosuppressive, anti-inflammatory, or palliative, and managing inflammation is critical in autoimmune diseases. Non-immunological therapies, such as hormone replacement in Hashimoto's thyroiditis or Type 1 diabetes mellitus, treat outcomes of the autoaggressive response; these are therefore palliative treatments. Dietary manipulation limits the severity of celiac disease. Steroidal or NSAID treatment limits the inflammatory symptoms of many diseases. IVIG is used for CIDP and GBS. Specific immunomodulatory therapies, such as the TNFα antagonists (e.g. etanercept), the B cell depleting agent rituximab, the anti-IL-6 receptor antibody tocilizumab, and the costimulation blocker abatacept, have been shown to be useful in treating RA. Some of these immunotherapies may be associated with an increased risk of adverse effects, such as susceptibility to infection.

Helminthic therapy is an experimental approach that involves inoculation of the patient with specific parasitic intestinal nematodes (helminths). There are currently two closely related treatments available: inoculation with either Necator americanus, commonly known as hookworm, or Trichuris suis ova, commonly known as pig whipworm eggs.

T-cell vaccination is also being explored as a possible future therapy for autoimmune disorders.

Nutrition and autoimmunity

Vitamin D/Sunlight

  • Because most human cells and tissues have receptors for vitamin D, including T and B cells, adequate levels of vitamin D can aid in the regulation of the immune system. Vitamin D plays a role in immune function by acting on T cells and natural killer cells. Research has demonstrated an association between low serum vitamin D and autoimmune diseases, including multiple sclerosis, type 1 diabetes, and systemic lupus erythematosus (commonly referred to simply as lupus). However, since photosensitivity occurs in lupus, patients are advised to avoid sunlight, and this avoidance may be responsible for the vitamin D deficiency seen in this disease. Polymorphisms in the vitamin D receptor gene are commonly found in people with autoimmune diseases, suggesting one potential mechanism for vitamin D's role in autoimmunity. There is mixed evidence on the effect of vitamin D supplementation in type 1 diabetes, lupus, and multiple sclerosis.

Omega-3 Fatty Acids

  • Studies have shown that adequate consumption of omega-3 fatty acids counteracts the effects of arachidonic acid, which contributes to symptoms of autoimmune diseases. Human and animal trials suggest that omega-3 is an effective treatment modality for many cases of rheumatoid arthritis, inflammatory bowel disease, asthma, and psoriasis.
  • While major depression is not necessarily an autoimmune disease, some of its physiological symptoms are inflammatory and autoimmune in nature. Omega-3 may inhibit production of interferon gamma and other cytokines which cause the physiological symptoms of depression. This may be because an imbalance between omega-3 and omega-6 fatty acids, which have opposing effects, is instrumental in the etiology of major depression.

Probiotics/Microflora

  • Various types of bacteria and microflora present in fermented dairy products, especially Lactobacillus casei, have been shown both to stimulate the immune response to tumors in mice and to regulate immune function, delaying or preventing the onset of nonobese diabetes. This is particularly true of the Shirota strain of L. casei (LcS). The LcS strain is mainly found in yogurt and similar products in Europe and Japan, and rarely elsewhere.

Antioxidants

  • It has been theorized that free radicals contribute to the onset of type 1 diabetes in infants and young children, and therefore that the risk could be reduced by high intake of antioxidant substances during pregnancy. However, a study conducted in a hospital in Finland from 1997 to 2002 concluded that there was no statistically significant correlation between antioxidant intake and diabetes risk. This study monitored food intake through questionnaires and estimated antioxidant intake on that basis, rather than by exact measurements or use of supplements.
