Wednesday, April 16, 2025

Peer review

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Peer_review
A reviewer at the American National Institutes of Health evaluating a grant proposal

Peer review is the evaluation of work by one or more people with similar competencies as the producers of the work (peers). It functions as a form of self-regulation by qualified members of a profession within the relevant field. Peer review methods are used to maintain quality standards, improve performance, and provide credibility. In academia, scholarly peer review is often used to determine an academic paper's suitability for publication. Peer review can be categorized by the type of activity and by the field or profession in which the activity occurs, e.g., medical peer review. It can also be used as a teaching tool to help students improve writing assignments.

Henry Oldenburg (1619–1677) was a German-born British philosopher who is seen as the 'father' of modern scientific peer review. It developed over the following centuries with, for example, the journal Nature making it standard practice in 1973. The term "peer review" was first used in the early 1970s. A monument to peer review has been at the Higher School of Economics in Moscow since 2017.

Professional

Professional peer review focuses on the performance of professionals, with a view to improving quality, upholding standards, or providing certification. In academia, peer review is used to inform decisions related to faculty advancement and tenure.

A prototype professional peer review process was recommended in the Ethics of the Physician written by Ishāq ibn ʻAlī al-Ruhāwī (854–931). He stated that a visiting physician had to make duplicate notes of a patient's condition on every visit. When the patient was cured or had died, the notes of the physician were examined by a local medical council of other physicians, who would decide whether the treatment had met the required standards of medical care.

Professional peer review is common in the field of health care, where it is usually called clinical peer review. Further, since peer review activity is commonly segmented by clinical discipline, there is also physician peer review, nursing peer review, dentistry peer review, etc. Many other professional fields have some level of peer review process: accounting, law, engineering (e.g., software peer review, technical peer review), aviation, and even forest fire management.

Peer review is used in education to achieve certain learning objectives, particularly as a tool to reach higher order processes in the affective and cognitive domains as defined by Bloom's taxonomy. This may take a variety of forms, including closely mimicking the scholarly peer review processes used in science and medicine.

Scholarly

Scholarly peer review or academic peer review (also known as refereeing) is the process of having a draft version of a researcher's methods and findings reviewed (usually anonymously) by experts (or "peers") in the same field. Peer review is widely used for helping the academic publisher (that is, the editor-in-chief, the editorial board or the program committee) decide whether the work should be accepted, considered acceptable with revisions, or rejected for official publication in an academic journal, a monograph or in the proceedings of an academic conference. If the identities of authors are not revealed to each other, the procedure is called dual-anonymous peer review.

Academic peer review requires a community of experts in a given (and often narrowly defined) academic field, who are qualified and able to perform reasonably impartial review. Impartial review, especially of work in less narrowly defined or inter-disciplinary fields, may be difficult to accomplish, and the significance (good or bad) of an idea may never be widely appreciated among its contemporaries. Peer review is generally considered necessary to academic quality and is used in most major scholarly journals. However, peer review does not prevent publication of invalid research, and as experimentally controlled studies of this process are difficult to arrange, direct evidence that peer review improves the quality of published papers is scarce.

Medical

Medical peer review may be divided into four categories:

  1. Clinical peer review is the process of evaluating a patient's episodes of care. It is a component of ongoing professional practice evaluation and focused professional practice evaluation, both significant contributors to provider credentialing and privileging.
  2. Peer evaluation of clinical teaching skills for both physicians and nurses.
  3. Scientific peer review of journal articles.
  4. A secondary round of peer review for the clinical value of articles concurrently published in medical journals.

Additionally, "medical peer review" has been used by the American Medical Association to refer not only to the process of improving quality and safety in health care organizations, but also to the process of rating clinical behavior or compliance with professional society membership standards. The medical community regards it as the best available method of ensuring that published research is reliable and that any treatments it advocates are safe and effective. Even so, the terminology has poor standardization and specificity, particularly as a database search term.

Technical

In engineering, technical peer review is a type of engineering review: a well-defined process for finding and fixing defects, conducted by a team of peers with assigned roles. Reviewers are peers representing the areas of the life cycle affected by the material being reviewed (usually a team of six or fewer people). Technical peer reviews are held within development phases, between milestone reviews, on completed products or completed portions of products.

Government policy

The European Union has been using peer review in the "Open Method of Co-ordination" of policies in the field of active labour market policy since 1999. In 2004, a program of peer reviews started in social inclusion. Each program sponsors about eight peer review meetings each year, in which a "host country" lays a given policy or initiative open to examination by half a dozen other countries and the relevant European-level NGOs. These meetings usually take place over two days and include visits to local sites where the policy can be seen in operation. Each meeting is preceded by the compilation of an expert report on which participating "peer countries" submit comments. The results are published on the web.

The United Nations Economic Commission for Europe, through UNECE Environmental Performance Reviews, uses peer review, referred to as "peer learning", to evaluate progress made by its member countries in improving their environmental policies.

The State of California is the only U.S. state to mandate scientific peer review. In 1997, the Governor of California signed into law Senate Bill 1320 (Sher), Chapter 295, statutes of 1997, which mandates that, before any CalEPA Board, Department, or Office adopts a final version of a rule-making, the scientific findings, conclusions, and assumptions on which the proposed rule are based must be submitted for independent external scientific peer review. This requirement is incorporated into the California Health and Safety Code Section 57004.

Pedagogical

Peer review, or student peer assessment, is the method by which editors and writers work together in hopes of helping the author establish and further flesh out and develop their own writing. Peer review is widely used in secondary and post-secondary education as part of the writing process. This collaborative learning tool involves groups of students reviewing each other's work and providing feedback and suggestions for revision. Rather than a means of critiquing each other's work, peer review is often framed as a way to build connection between students and help develop writers' identity. While widely used in English and composition classrooms, peer review has gained popularity in other disciplines that require writing as part of the curriculum including the social and natural sciences.

Peer review in classrooms helps students become more invested in their work, and the classroom environment at large. Understanding how their work is read by a diverse readership before it is graded by the teacher may also help students clarify ideas and understand how to persuasively reach different audience members via their writing. It also gives students professional experience that they might draw on later when asked to review the work of a colleague prior to publication. The process can also bolster the confidence of students on both sides of the process. It has been found that students are more positive than negative when reviewing their classmates' writing. Peer review can help students not get discouraged but rather feel determined to improve their writing.

Critics of peer review in classrooms say that it can be ineffective because students lack practice giving constructive criticism, or lack expertise in the writing craft at large. Peer review can be problematic for developmental writers, particularly if they view their writing as inferior to that of others in the class: they may be unwilling to offer suggestions or to ask other writers for help. Because students often feel a personal connection to the work they have produced, they may also be reluctant to receive or offer criticism, which can affect how they see themselves and their peers. When peer review is assigned as a task, feedback may be rushed, with misplaced praise or criticism, so that neither the writer nor the reviewer gets much out of the activity. In response to these concerns, instructors may provide examples, model peer review with the class, or focus on specific areas of feedback during the process. They may also experiment with in-class peer review versus peer review as homework, or with peer review using technologies afforded by online learning management systems. Older students tend to give better feedback and to get more out of peer review, but the method is used in classrooms to help students of all ages learn how to revise. As technology evolves, peer review will develop as well, and new tools could alter the process.

Peer seminar

A peer seminar is a method in which speakers present ideas to an audience in a setting that also acts as a kind of contest. Multiple speakers are called up one at a time and given a set amount of time to present the topic they have researched. The speakers may or may not address the same topic, but each has something to gain or lose, which can foster a competitive atmosphere. This approach lets speakers present in a more personal tone while trying to appeal to the audience as they explain their topic.

Peer seminars are somewhat similar to conference talks; however, there is more time to present, and speakers can be interrupted by audience members with questions and feedback on the topic or on how well the speaker presented it.

Peer review in writing

Peer review in writing is a pivotal component among various peer review mechanisms, often spearheaded by educators and involving student participation, particularly in academic settings. It constitutes a fundamental process in academic and professional writing, serving as a systematic means to ensure the quality, effectiveness, and credibility of scholarly work. Despite its widespread use, however, it is one of the most scattered, inconsistent, and ambiguous practices associated with writing instruction, and many scholars question its effectiveness and specific methodologies. Critics of peer review in classrooms express concerns about its ineffectiveness due to students' lack of practice in giving constructive criticism or their limited expertise in the writing craft overall.

Critiques of peer review

Academic peer review has faced considerable criticism, with many studies highlighting inherent issues in the peer review process.

The editorial peer review process has been found to be strongly biased against 'negative studies,' i.e., studies finding that an intervention does not work. This in turn biases the information base of medicine. Journals become biased against negative studies when values come into play. "Who wants to read something that doesn't work?" asks Richard Smith in the Journal of the Royal Society of Medicine. "That's boring."

This bias is also particularly evident in university classrooms, where the most common source of writing feedback is the teacher, whose comments are highly valued. Because of the teacher's position of authority, students may be influenced to produce work in line with the professor's viewpoints; the effectiveness of the feedback stems largely from that authority. Benjamin Keating, in his article "A Good Development Thing: A Longitudinal Analysis of Peer Review and Authority in Undergraduate Writing," conducted a longitudinal study comparing two groups of students (one majoring in writing and one not) to explore students' perceptions of authority. This research, involving extensive analysis of student texts, concludes that students in non-writing majors tend to undervalue mandatory in-class peer review, while writing majors value classmates' comments more. This suggests that peer review feedback has a certain threshold: effective peer review requires a certain level of expertise, and for non-professional writers, peer review feedback may be overlooked, reducing its effectiveness.

Elizabeth Ellis Miller, Cameron Mozafari, Justin Lohr and Jessica Enoch state, "While peer review is an integral part of writing classrooms, students often struggle to effectively engage in it." The authors illustrate some reasons for the inefficiency of peer review based on research conducted during peer review sessions in university classrooms:

  1. Lack of Training: Students and even some faculty members may not have received sufficient training to provide constructive feedback. Without proper guidance on what to look for and how to provide helpful comments, peer reviewers may find it challenging to offer meaningful insights.
  2. Limited Engagement: Students may participate in peer review sessions with minimal enthusiasm or involvement, viewing them as obligatory tasks rather than valuable learning opportunities. This lack of investment can result in superficial feedback that fails to address underlying issues in the writing.
  3. Time Constraints: Instructors often allocate limited time for peer review activities during class sessions, which may not be adequate for thorough reviews of peers' work. Consequently, feedback may be rushed or superficial, lacking the depth required for meaningful improvement.

This research demonstrates that besides issues related to expertise, numerous objective factors contribute to students' poor performance in peer review sessions, resulting in feedback from peer reviewers that may not effectively assist authors. Additionally, this study highlights the influence of emotions in peer review sessions, suggesting that both peer reviewers and authors cannot completely eliminate emotions when providing and receiving feedback. This can lead to peer reviewers and authors approaching the feedback with either positive or negative attitudes towards the text, resulting in selective or biased feedback and review, further impacting their ability to objectively evaluate the article. It implies that subjective emotions may also affect the effectiveness of peer review feedback.

Pamela Bedore and Brian O'Sullivan also hold a skeptical view of peer review in most writing contexts. The authors conclude, based on comparing different forms of peer review after systematic training at two universities, that "the crux is that peer review is not just about improving writing but about helping authors achieve their writing vision." Feedback from the majority of non-professional writers during peer review sessions often tends to be superficial, such as simple grammar corrections and questions. This precisely reflects the implication in the conclusion that the focus is only on improving writing skills. Meaningful peer review involves understanding the author's writing intent, posing valuable questions and perspectives, and guiding the author to achieve their writing goals.

Alternatives

Various alternatives to peer review have been suggested (such as, in the context of science funding, funding-by-lottery).

Comparison and improvement

Magda Tigchelaar compares peer review with self-assessment through an experiment that divided students into three groups: self-assessment, peer review, and no review. Across four writing projects, she observed changes in each group, with surprising results showing significant improvement only in the self-assessment group. The author's analysis suggests that self-assessment allows individuals to clearly understand the revision goals at each stage, as the author is the most familiar with their writing. Thus, self-checking naturally follows a systematic and planned approach to revision. In contrast, the effectiveness of peer review is often limited due to the lack of structured feedback, characterized by scattered, meaningless summaries and evaluations that fail to meet the author's expectations for revising their work.

Stephanie Conner and Jennifer Gray question the value of much of the feedback students give during peer review. They argue that many peer review sessions fail to meet students' expectations: students, even as reviewers, feel uncertain about providing constructive feedback because they lack confidence in their own writing. The authors offer numerous improvement strategies. For instance, the peer review process can be segmented into groups, where students present the papers to be reviewed while other group members take notes and analyze them; the review scope can then be expanded to the entire class. This widens the review sources and further enhances the level of professionalism.

With evolving technology, peer review is also expected to evolve, and new tools have the potential to transform the process. Mimi Li discusses the effectiveness of, and student feedback on, online peer review software used in a freshman writing class. Unlike traditional classroom peer review, the software offers many tools for editing articles and comprehensive guidance: for instance, it lists numerous questions peer reviewers can ask and allows various comments to be attached to selected text. Over the course of a semester, students showed varying degrees of improvement in their writing skills and grades after using the software, and they spoke highly of online peer review technology.

Tuesday, April 15, 2025

Fermentation

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Fermentation
Phylogenetic tree of bacteria and archaea, highlighting those that carry out fermentation. Their end products are also highlighted. Figure modified from Hackmann (2024).

Fermentation is a type of redox metabolism carried out in the absence of oxygen. During fermentation, organic molecules (e.g., glucose) are catabolized and donate electrons to other organic molecules. In the process, ATP and organic end products (e.g., lactate) are formed.

Because oxygen is not required, it is an alternative to aerobic respiration. Over 25% of bacteria and archaea carry out fermentation. They live in the gut, sediments, food, and other environments. Eukaryotes, including humans and other animals, also carry out fermentation.

Fermentation is important in several areas of human society. Humans have used fermentation in the production of food for 13,000 years. Humans and their livestock have gut microbes that carry out fermentation, releasing products used by the host for energy. Fermentation is also used at an industrial level to produce commodity chemicals, such as ethanol and lactate. In total, fermentation forms more than 50 metabolic end products.

Definition

The definition of fermentation has evolved over the years. The most modern definition is catabolism, where organic compounds are both the electron donor and acceptor. A common electron donor is glucose, and pyruvate is a common electron acceptor. This definition distinguishes fermentation from aerobic respiration, where oxygen is the acceptor and types of anaerobic respiration, where an inorganic species is the acceptor.

Fermentation had been defined differently in the past. In 1876, Louis Pasteur described it as "la vie sans air" (life without air). This definition came before the discovery of anaerobic respiration. Later, fermentation was defined as catabolism that forms ATP through substrate-level phosphorylation only. However, several fermentation pathways have since been found to form ATP through an electron transport chain and ATP synthase as well.

Some sources define fermentation loosely as any large-scale biological manufacturing process. See Industrial fermentation. This definition focuses on the process of manufacturing rather than metabolic details.

Biological role and prevalence

Fermentation is used by organisms to generate ATP for metabolism. One advantage is that it requires no oxygen or other external electron acceptors, so it can be carried out when those acceptors are absent. A disadvantage is that it produces relatively little ATP: only 2 to 4.5 ATP per glucose, compared with about 32 for aerobic respiration.
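The energetic trade-off can be made concrete with a quick calculation (a back-of-the-envelope sketch using only the per-glucose figures quoted above):

```python
# Compare ATP yields of fermentation vs. aerobic respiration,
# using the per-glucose figures quoted in the text.
FERMENTATION_ATP = (2, 4.5)  # range of ATP per glucose for fermentation
RESPIRATION_ATP = 32         # approximate ATP per glucose for aerobic respiration

# How many glucose molecules must a fermenting cell consume
# to match the ATP from one glucose respired aerobically?
low = RESPIRATION_ATP / FERMENTATION_ATP[1]   # best-case fermentation
high = RESPIRATION_ATP / FERMENTATION_ATP[0]  # worst-case fermentation
print(f"{low:.1f} to {high:.1f} glucose molecules per respired glucose")
# → 7.1 to 16.0 glucose molecules per respired glucose
```

This is one reason fermenting organisms consume substrate so quickly relative to the biomass they produce.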

Over 25% of bacteria and archaea carry out fermentation. This type of metabolism is most common in the phylum Bacillota, and it is least common in Actinomycetota. Their most common habitat is host-associated ones, such as the gut.

Animals, including humans, also carry out fermentation. The product of fermentation in humans is lactate, which is formed during anaerobic exercise or in cancerous cells. No animal is known to survive on fermentation alone, though one parasitic animal (Henneguya zschokkei) is known to survive without oxygen.

Substrates and products of fermentation

The most common substrates and products of fermentation. Figure modified from Hackmann (2024).

Fermentation uses a range of substrates and forms a variety of metabolic end products. Of the 55 end products formed, the most common are acetate and lactate. Of the 46 chemically-defined substrates that have been reported, the most common are glucose and other sugars.

Biochemical overview

Overview of the biochemical pathways for fermentation of glucose. Figure modified from Hackmann (2024).

When an organic compound is fermented, it is broken down to a simpler molecule and releases electrons. The electrons are transferred to a redox cofactor, which in turn transfers them to an organic compound. ATP is generated in the process, and it can be formed by substrate-level phosphorylation or by ATP synthase.

When glucose is fermented, it enters glycolysis or the pentose phosphate pathway and is converted to pyruvate. From pyruvate, pathways branch out to form a number of end products (e.g. lactate). At several points, electrons are released and accepted by redox cofactors (NAD and ferredoxin). At later points, these cofactors donate electrons to their final acceptor and become oxidized. ATP is also formed at several points in the pathway.

The biochemical pathways of fermentation of glucose in poster format. Figure modified from Hackmann (2024).

While fermentation is simple in overview, its details are more complex. Across organisms, fermentation of glucose involves over 120 different biochemical reactions. Further, multiple pathways can be responsible for forming the same product. For forming acetate from its immediate precursor (pyruvate or acetyl-CoA), six separate pathways have been found.

Biochemistry of individual products

Ethanol

In ethanol fermentation, one glucose molecule is converted into two ethanol molecules and two carbon dioxide (CO2) molecules. It is used to make bread dough rise: the carbon dioxide forms bubbles, expanding the dough into a foam. The ethanol is the intoxicating agent in alcoholic beverages such as wine, beer and liquor. Fermentation of feedstocks, including sugarcane, maize, and sugar beets, produces ethanol that is added to gasoline. In some species of fish, including goldfish and carp, it provides energy when oxygen is scarce (along with lactic acid fermentation).

Before fermentation, a glucose molecule breaks down into two pyruvate molecules (glycolysis). The energy from this exothermic reaction is used to bind inorganic phosphates to ADP, which converts it to ATP, and convert NAD+ to NADH. The pyruvates break down into two acetaldehyde molecules and give off two carbon dioxide molecules as waste products. The acetaldehyde is reduced into ethanol using the energy and hydrogen from NADH, and the NADH is oxidized into NAD+ so that the cycle may repeat. The reaction is catalyzed by the enzymes pyruvate decarboxylase and alcohol dehydrogenase.

History of bioethanol fermentation

The history of ethanol as a fuel spans several centuries and is marked by a series of significant milestones. Samuel Morey, an American inventor, was the first to produce ethanol by fermenting corn in 1826. However, it was not until the California Gold Rush in the 1850s that ethanol was first used as a fuel in the United States. Rudolf Diesel demonstrated his engine, which could run on vegetable oils and ethanol, in 1895, but the widespread use of petroleum-based diesel engines made ethanol less popular as a fuel. In the 1970s, the oil crisis reignited interest in ethanol, and Brazil became a leader in ethanol production and use. The United States began producing ethanol on a large scale in the 1980s and 1990s as a fuel additive to gasoline, due to government regulations. Today, ethanol continues to be explored as a sustainable and renewable fuel source, with researchers developing new technologies and biomass sources for its production.

  • 1826: Samuel Morey, an American inventor, was the first to produce ethanol by fermenting corn. However, ethanol was not widely used as a fuel until many years later. (1)
  • 1850s: Ethanol was first used as a fuel in the United States during the California gold rush. Miners used ethanol as a fuel for lamps and stoves because it was cheaper than whale oil. (2)
  • 1895: German engineer Rudolf Diesel demonstrated his engine, which was designed to run on vegetable oils, including ethanol. However, the widespread use of diesel engines fueled by petroleum made ethanol less popular as a fuel. (3)
  • 1970s: The oil crisis of the 1970s led to renewed interest in ethanol as a fuel. Brazil became a leader in ethanol production and use, due in part to government policies that encouraged the use of biofuels. (4)
  • 1980s–1990s: The United States began to produce ethanol on a large scale as a fuel additive to gasoline. This was due to the passage of the Clean Air Act in 1990, which required the use of oxygenates, such as ethanol, to reduce emissions. (5)
  • 2000s–present: There has been continued interest in ethanol as a renewable and sustainable fuel. Researchers are exploring new sources of biomass for ethanol production, such as switchgrass and algae, and developing new technologies to improve the efficiency of the fermentation process. (6)

Lactic acid

Homolactic fermentation (producing only lactic acid) is the simplest type of fermentation. Pyruvate from glycolysis undergoes a simple redox reaction, forming lactic acid. Overall, one molecule of glucose (or any six-carbon sugar) is converted to two molecules of lactic acid:

C6H12O6 → 2 CH3CHOHCOOH

It occurs in the muscles of animals when they need energy faster than the blood can supply oxygen. It also occurs in some kinds of bacteria (such as lactobacilli) and some fungi. Bacteria of this kind convert lactose into lactic acid in yogurt, giving it its sour taste. These lactic acid bacteria can carry out either homolactic fermentation, where the end product is mostly lactic acid, or heterolactic fermentation, where some lactate is further metabolized to ethanol and carbon dioxide (via the phosphoketolase pathway), acetate, or other metabolic products, e.g.:

C6H12O6 → CH3CHOHCOOH + C2H5OH + CO2

If lactose is fermented (as in yogurts and cheeses), it is first converted into glucose and galactose (both six-carbon sugars with the same molecular formula):

C12H22O11 + H2O → 2 C6H12O6
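That atoms are conserved in these equations can be checked mechanically. The sketch below is an illustrative helper (not part of any standard library) that counts atoms on each side of the homolactic equation and of the lactose hydrolysis; lactic acid CH3CHOHCOOH is written by its molecular formula C3H6O3:

```python
import re
from collections import Counter

def count_atoms(formula: str) -> Counter:
    """Count atoms in a simple formula like 'C6H12O6' (no parentheses)."""
    atoms = Counter()
    for element, number in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        atoms[element] += int(number) if number else 1
    return atoms

def side_atoms(species: list) -> Counter:
    """Total atoms on one side: a list of (coefficient, formula) pairs."""
    total = Counter()
    for coeff, formula in species:
        for element, n in count_atoms(formula).items():
            total[element] += coeff * n
    return total

# Homolactic fermentation: C6H12O6 -> 2 C3H6O3 (lactic acid)
assert side_atoms([(1, "C6H12O6")]) == side_atoms([(2, "C3H6O3")])

# Lactose hydrolysis: C12H22O11 + H2O -> 2 C6H12O6
assert side_atoms([(1, "C12H22O11"), (1, "H2O")]) == side_atoms([(2, "C6H12O6")])

print("both equations balance")
```

The same helper can be reused on any of the other plain-formula equations in this article.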

Heterolactic fermentation is in a sense intermediate between lactic acid fermentation and other types, e.g. alcoholic fermentation. Reasons to go further and convert lactic acid into something else include:

  • The acidity of lactic acid impedes biological processes. This can be beneficial to the fermenting organism as it drives out competitors that are unadapted to the acidity. As a result, the food will have a longer shelf life (one reason foods are purposely fermented in the first place); however, beyond a certain point, the acidity starts affecting the organism that produces it.
  • The high concentration of lactic acid (the final product of fermentation) drives the equilibrium backwards (Le Chatelier's principle), decreasing the rate at which fermentation can occur and slowing down growth.
  • Ethanol, into which lactic acid can be easily converted, is volatile and will readily escape, allowing the reaction to proceed easily. CO2 is also produced, but it is only weakly acidic and even more volatile than ethanol.
  • Acetic acid (another conversion product) is acidic and not as volatile as ethanol; however, in the presence of limited oxygen, its creation from lactic acid releases additional energy. It is a lighter molecule than lactic acid, forming fewer hydrogen bonds with its surroundings (due to having fewer groups that can form such bonds), thus is more volatile and will also allow the reaction to proceed more quickly.
  • If propionic acid, butyric acid, and longer monocarboxylic acids are produced, the amount of acidity produced per glucose consumed will decrease, as with ethanol, allowing faster growth.

Hydrogen gas

Hydrogen gas is produced in many types of fermentation as a way to regenerate NAD+ from NADH. Electrons are transferred to ferredoxin, which in turn is oxidized by hydrogenase, producing H2. Hydrogen gas is a substrate for methanogens and sulfate reducers, which keep the concentration of hydrogen low and favor the production of such an energy-rich compound, but hydrogen gas at a fairly high concentration can nevertheless be formed, as in flatus.

For example, Clostridium pasteurianum ferments glucose to butyrate, acetate, carbon dioxide, and hydrogen gas. The reaction leading to acetate is:

C6H12O6 + 4 H2O → 2 CH3COO- + 2 HCO3- + 4 H+ + 4 H2
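Because this reaction involves charged species (acetate, bicarbonate, protons), it must balance for charge as well as for every element. A small verification sketch, with the atom counts written out by hand:

```python
# C6H12O6 + 4 H2O -> 2 CH3COO- + 2 HCO3- + 4 H+ + 4 H2
# Tally (element -> count) for each side, written out by hand.
reactants = {"C": 6, "H": 12 + 4 * 2, "O": 6 + 4 * 1, "charge": 0}
products = {
    "C": 2 * 2 + 2 * 1,              # acetate + bicarbonate carbons
    "H": 2 * 3 + 2 * 1 + 4 + 4 * 2,  # acetate, bicarbonate, protons, H2
    "O": 2 * 2 + 2 * 3,              # acetate + bicarbonate oxygens
    "charge": 2 * (-1) + 2 * (-1) + 4 * (+1),  # two anions each, four protons
}
assert reactants == products
print("mass and charge balance")
```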

Glyoxylate

Glyoxylate fermentation is a type of fermentation used by microbes that are able to utilize glyoxylate as a carbon source.

Other

Other types of fermentation include mixed acid fermentation, butanediol fermentation, butyrate fermentation, caproate fermentation, and acetone–butanol–ethanol fermentation.

In the broader sense

In food and industrial contexts, any chemical modification performed by a living being in a controlled container can be termed "fermentation". The following do not fall into the biochemical sense, but are called fermentation in the larger sense:

Alternative protein

Fermentation is used to produce the heme protein found in the Impossible Burger.

Fermentation can be used to make alternative protein sources. It is commonly used to modify existing protein foods, including plant-based ones such as soy, into more flavorful forms such as tempeh and fermented tofu.

More modern "fermentation" makes recombinant proteins to help produce meat analogues, milk substitutes, cheese analogues, and egg substitutes. Some examples are:

Heme proteins such as myoglobin and hemoglobin give meat its characteristic texture, flavor, color, and aroma. The myoglobin and leghemoglobin ingredients can be used to replicate this property, even though they come from a vat rather than from meat.

Enzymes

Industrial fermentation can be used for enzyme production, where proteins with catalytic activity are produced and secreted by microorganisms. The development of fermentation processes, microbial strain engineering and recombinant gene technologies has enabled the commercialization of a wide range of enzymes. Enzymes are used in all kinds of industrial segments, such as food (lactose removal, cheese flavor), beverage (juice treatment), baking (bread softness, dough conditioning), animal feed, detergents (protein, starch and lipid stain removal), textile, personal care and pulp and paper industries.

Modes of industrial operation

Most industrial fermentation uses batch or fed-batch procedures, although continuous fermentation can be more economical if various challenges, particularly the difficulty of maintaining sterility, can be met.

Batch

In a batch process, all the ingredients are combined and the reactions proceed without any further input. Batch fermentation has been used for millennia to make bread and alcoholic beverages, and it is still a common method, especially when the process is not well understood. However, it can be expensive because the fermentor must be sterilized using high pressure steam between batches. In practice, small quantities of chemicals are often added during the run to control the pH or suppress foaming.

Batch fermentation goes through a series of phases. There is a lag phase in which cells adjust to their environment; then a phase in which exponential growth occurs. Once many of the nutrients have been consumed, the growth slows and becomes non-exponential, but production of secondary metabolites (including commercially important antibiotics and enzymes) accelerates. This continues through a stationary phase after most of the nutrients have been consumed, and then the cells die.
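
The phase sequence described above can be illustrated with a toy simulation. The sketch below integrates a simple Monod-style growth model with a first-order death term; all parameter values (mu_max, Ks, the yield Y, and the death rate kd) are illustrative placeholders, not measured data.

```python
# Toy batch-fermentation model: lag/exponential growth while substrate
# lasts, then stationary/death phases once it is exhausted.

mu_max, Ks = 0.5, 0.2   # max growth rate (1/h) and half-saturation (g/L); assumed
Y = 0.5                 # biomass yield, g biomass per g substrate; assumed
kd = 0.02               # first-order death rate, 1/h; assumed

X, S = 0.01, 10.0       # initial biomass and substrate, g/L
dt, t = 0.1, 0.0
history = []
while t < 60.0:
    mu = mu_max * S / (Ks + S)      # growth slows as substrate is depleted
    dX = (mu - kd) * X * dt         # growth minus death
    dS = -(mu * X / Y) * dt         # substrate consumed by growth
    X, S = X + dX, max(S + dS, 0.0)
    t += dt
    history.append((t, X, S))

# Biomass peaks when the substrate runs out, then declines (death phase).
peak = max(history, key=lambda r: r[1])
print(f"peak biomass {peak[1]:.2f} g/L at t = {peak[0]:.1f} h")
```

Running this shows the curve the text describes: near-exponential growth, a slowdown as nutrients are consumed, and a slow decline once the substrate is gone.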

Fed-batch

Fed-batch fermentation is a variation of batch fermentation where some of the ingredients are added during the fermentation. This allows greater control over the stages of the process. In particular, production of secondary metabolites can be increased by adding a limited quantity of nutrients during the non-exponential growth phase. Fed-batch operations are often sandwiched between batch operations.

Open

The high cost of sterilizing the fermentor between batches can be avoided using various open fermentation approaches that are able to resist contamination. One is to use a naturally evolved mixed culture. This is particularly favored in wastewater treatment, since mixed populations can adapt to a wide variety of wastes. Thermophilic bacteria can produce lactic acid at temperatures of around 50 °C, sufficient to discourage microbial contamination; and ethanol has been produced at a temperature of 70 °C. This is just below its boiling point (78 °C), making it easy to extract. Halophilic bacteria can produce bioplastics in hypersaline conditions. Solid-state fermentation adds a small amount of water to a solid substrate; it is widely used in the food industry to produce flavors, enzymes and organic acids.

Continuous

In continuous fermentation, substrates are added and final products removed continuously. There are three varieties: chemostats, which hold nutrient levels constant; turbidostats, which keep cell mass constant; and plug flow reactors, in which the culture medium flows steadily through a tube while the cells are recycled from the outlet to the inlet. If the process works well, there is a steady flow of feed and effluent and the costs of repeatedly setting up a batch are avoided. It can also prolong the exponential growth phase and, by continuously removing byproducts that inhibit the reactions, prevent their accumulation. However, it is difficult to maintain a steady state and avoid contamination, and the design tends to be complex. Typically the fermentor must run for over 500 hours to be more economical than batch processing.
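
For the chemostat case, the steady state can be worked out in closed form: when the culture is stable, the specific growth rate equals the dilution rate D, so under Monod kinetics the residual substrate is S* = Ks·D/(mu_max − D). The sketch below assumes illustrative parameter values, not data for any particular organism.

```python
# Chemostat steady state under a Monod growth model.
# At steady state mu = D, so S* = Ks * D / (mu_max - D) and
# biomass X* = Y * (S_in - S*).  Parameters are illustrative.

mu_max, Ks = 0.5, 0.2   # 1/h, g/L (assumed)
Y = 0.5                 # biomass yield (assumed)
S_in = 10.0             # substrate concentration in the feed, g/L (assumed)

def steady_state(D):
    """Residual substrate and biomass for dilution rate D (requires D < mu_max)."""
    if D >= mu_max:
        return None  # washout: cells are diluted out faster than they can grow
    S = Ks * D / (mu_max - D)
    X = Y * (S_in - S)
    return S, X

for D in (0.1, 0.3, 0.45):
    print(D, steady_state(D))
```

The washout branch reflects the contamination/steady-state difficulty noted above: if D is pushed past mu_max, no steady culture is possible.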

History of the use of fermentation

The use of fermentation, particularly for beverages, has existed since the Neolithic and has been documented dating from 7000 to 6600 BCE in Jiahu, China; 5000 BCE in India (Ayurveda mentions many medicated wines); 6000 BCE in Georgia; 3150 BCE in ancient Egypt; 3000 BCE in Babylon; 2000 BCE in pre-Hispanic Mexico; and 1500 BCE in Sudan. Fermented foods have a religious significance in Judaism and Christianity. The Baltic god Rugutis was worshiped as the agent of fermentation. In alchemy, fermentation ("putrefaction") was symbolized by Capricorn ♑︎.

Louis Pasteur in his laboratory

In 1837, Charles Cagniard de la Tour, Theodor Schwann and Friedrich Traugott Kützing independently published papers concluding, as a result of microscopic investigations, that yeast is a living organism that reproduces by budding. Schwann boiled grape juice to kill the yeast and found that no fermentation would occur until new yeast was added. However, many chemists, including Antoine Lavoisier, continued to view fermentation as a simple chemical reaction and rejected the notion that living organisms could be involved. This was seen as a reversion to vitalism and was lampooned in an anonymous publication by Justus von Liebig and Friedrich Wöhler.

The turning point came when Louis Pasteur (1822–1895), during the 1850s and 1860s, repeated Schwann's experiments and, in a series of investigations, showed that fermentation is initiated by living organisms. In 1857, Pasteur showed that lactic acid fermentation is caused by living organisms. In 1860, he demonstrated how bacteria cause souring in milk, a process formerly thought to be merely a chemical change. His work in identifying the role of microorganisms in food spoilage led to the process of pasteurization.

In 1877, working to improve the French brewing industry, Pasteur published his famous paper on fermentation, "Études sur la bière", which was translated into English in 1879 as "Studies on Fermentation". He defined fermentation (incorrectly) as "life without air", yet he correctly showed how specific types of microorganisms cause specific types of fermentations and specific end-products.

Although showing fermentation resulted from the action of living microorganisms was a breakthrough, it did not explain the basic nature of fermentation; nor did it prove it is caused by microorganisms which appear to be always present. Many scientists, including Pasteur, had unsuccessfully attempted to extract the fermentation enzyme from yeast.

Success came in 1897 when the German chemist Eduard Buchner ground up yeast, extracted juice from it, and then found to his amazement that this "dead" liquid would ferment a sugar solution, forming carbon dioxide and alcohol much like living yeasts.

Buchner's results are considered to mark the birth of biochemistry. The "unorganized ferments" behaved just like the organized ones. From that time on, the term enzyme came to be applied to all ferments. It was then understood that fermentation is caused by enzymes produced by microorganisms. In 1907, Buchner won the Nobel Prize in Chemistry for his work.

Advances in microbiology and fermentation technology have continued steadily up until the present. For example, in the 1930s, it was discovered that microorganisms could be mutated with physical and chemical treatments to be higher-yielding, faster-growing, tolerant of less oxygen, and able to use a more concentrated medium. Strain selection and hybridization developed as well, affecting most modern food fermentations.

Post 1930s

The field of fermentation has been critical to producing a wide range of consumer goods, from food and drink to industrial chemicals and pharmaceuticals. Since its early beginnings in ancient civilizations, fermentation has continued to evolve and expand, with new techniques and technologies driving advances in product quality, yield, and efficiency. The period from the 1930s onward saw a number of significant advancements in fermentation technology, including the development of new processes for producing high-value products like antibiotics and enzymes, the increasing importance of fermentation in the production of bulk chemicals, and a growing interest in the use of fermentation for the production of functional foods and nutraceuticals.

The 1950s and 1960s saw the development of new fermentation technologies, such as immobilized cells and enzymes, which allowed for more precise control over fermentation processes and increased the production of high-value products like antibiotics and enzymes. In the 1970s and 1980s, fermentation became increasingly important in producing bulk chemicals like ethanol, lactic acid, and citric acid. This led to developing new fermentation techniques and genetically engineered microorganisms to improve yields and reduce production costs. In the 1990s and 2000s, there was a growing interest in fermentation to produce functional foods and nutraceuticals, which have potential health benefits beyond basic nutrition. This led to the development of new fermentation processes for probiotics and other functional ingredients.

Overall, the period from 1930 onward saw significant advancements in the use of fermentation for industrial purposes, leading to the production of a wide range of fermented products that are now consumed worldwide.

Frontal lobe

From Wikipedia, the free encyclopedia
Frontal lobe
Principal fissures and lobes of the cerebrum viewed laterally (frontal lobe shown in blue).
Details
Part of: Cerebrum
Artery: Anterior cerebral, middle cerebral
Identifiers
Latin: lobus frontalis
Acronym(s): FL
MeSH: D005625
NeuroNames: 56
NeuroLex ID: birnlex_928
TA98: A14.1.09.110
TA2: 5445
FMA: 61824
Anatomical terms of neuroanatomy

The frontal lobe is the largest of the four major lobes of the brain in mammals, and is located at the front of each cerebral hemisphere (in front of the parietal lobe and the temporal lobe). It is parted from the parietal lobe by a groove between tissues called the central sulcus and from the temporal lobe by a deeper groove called the lateral sulcus (Sylvian fissure). The most anterior rounded part of the frontal lobe (though not well-defined) is known as the frontal pole, one of the three poles of the cerebrum.

The frontal lobe is covered by the frontal cortex. The frontal cortex includes the premotor cortex and the primary motor cortex – parts of the motor cortex. The front part of the frontal lobe is covered by the prefrontal cortex. The nonprimary motor cortex is a functionally defined portion of the frontal lobe.

There are four principal gyri in the frontal lobe. The precentral gyrus is directly anterior to the central sulcus, running parallel to it and contains the primary motor cortex, which controls voluntary movements of specific body parts. Three horizontally arranged subsections of the frontal gyrus are the superior frontal gyrus, the middle frontal gyrus, and the inferior frontal gyrus. The inferior frontal gyrus is divided into three parts – the orbital part, the triangular part and the opercular part.

The frontal lobe contains most of the dopaminergic neurons in the cerebral cortex. The dopaminergic pathways are associated with reward, attention, short-term memory tasks, planning, and motivation. Dopamine tends to limit and select sensory information coming from the thalamus to the forebrain.

Structure

Frontal lobe (red) of left cerebral hemisphere

The frontal lobe is the largest lobe of the brain and makes up about a third of the surface area of each hemisphere. On the lateral surface of each hemisphere, the central sulcus separates the frontal lobe from the parietal lobe. The lateral sulcus separates the frontal lobe from the temporal lobe.

The frontal lobe can be divided into a lateral, polar, orbital (above the orbit; also called basal or ventral), and medial part. Each of these parts consists of a particular gyrus.

The gyri are separated by sulci. E.g., the precentral gyrus is in front of the central sulcus, and behind the precentral sulcus. The superior and middle frontal gyri are divided by the superior frontal sulcus. The middle and inferior frontal gyri are divided by the inferior frontal sulcus.

In humans the frontal lobe reaches full maturity only after the 20s; the prefrontal cortex, in particular, continues maturing into the second and third decades of life, marking the cognitive maturity associated with adulthood. A small amount of atrophy, however, is normal in the aging person's frontal lobe. Fjell, in 2009, studied atrophy of the brain in people aged 60–91 years. The 142 healthy participants were scanned using MRI. Their results were compared to those of 122 participants with Alzheimer's disease. A follow-up one year later showed there to have been a marked volumetric decline in those with Alzheimer's and a much smaller decline (averaging 0.5%) in the healthy group. These findings corroborate those of Coffey, who in 1992 indicated that the frontal lobe decreases in volume approximately 0.5–1% per year.

Function

The entirety of the frontal cortex can be considered the "action cortex", much as the posterior cortex is considered the "sensory cortex". It is devoted to action of one kind or another: skeletal movement, ocular movement, speech control, and the expression of emotions. In humans, the largest part of the frontal cortex, the prefrontal cortex (PFC), is responsible for internal, purposeful mental action, commonly called reasoning or prefrontal synthesis.

The function of the PFC involves the ability to project future consequences that result from current actions. PFC functions also include override and suppression of socially unacceptable responses as well as differentiation of tasks.

The PFC also plays an important part in integrating longer non-task based memories stored across the brain. These are often memories associated with emotions derived from input from the brain's limbic system. The frontal lobe modifies those emotions, generally to fit socially acceptable norms.

Psychological tests that measure frontal lobe function include finger tapping (as the frontal lobe controls voluntary movement), the Wisconsin Card Sorting Test, and measures of language, numeracy skills, and decision making, all of which are controlled by the frontal lobe.

Clinical significance

Damage

Damage to the frontal lobe can occur in a number of ways and result in many different consequences. Transient ischemic attacks (TIAs) also known as mini-strokes, and strokes are common causes of frontal lobe damage in older adults (65 and over). These strokes and mini-strokes can occur due to the blockage of blood flow to the brain or as a result of the rupturing of an aneurysm in a cerebral artery. Other ways in which injury can occur include traumatic brain injuries incurred following accidents, diagnoses such as Alzheimer's disease or Parkinson's disease (which cause dementia symptoms), and frontal lobe epilepsy (which can occur at any age). Very often, frontal lobe damage is recognized in those with prenatal alcohol exposure.

Symptoms

Common effects of damage to the frontal lobe are varied. Patients who have experienced frontal lobe trauma may know the appropriate response to a situation but display inappropriate responses to those same situations in real life. Similarly, emotions that are felt may not be expressed in the face or voice. For example, someone who is feeling happy would not smile, and the voice would be devoid of emotion. Along the same lines, though, the person may also exhibit excessive, unwarranted displays of emotion. Depression is common in stroke patients. Also common is a loss of or decrease in motivation. Someone might not want to carry out normal daily activities and would not feel "up to it". Those who are close to the person who has experienced the damage may notice changes in behavior. The case of Phineas Gage was long considered exemplary of these symptoms, though more recent research has suggested that accounts of his personality change have been poorly evidenced. The frontal lobe is the same part of the brain that is responsible for executive functions such as planning for the future, judgment, decision-making skills, attention span, and inhibition. These functions can decrease in someone whose frontal lobe is damaged.

Consequences that are seen less frequently are also varied. Confabulation may be the most frequently indicated "less common" effect. In the case of confabulation, someone gives false information while maintaining the belief that it is the truth. In a small number of patients, uncharacteristic cheerfulness can be noted. This effect is seen mostly in patients with lesions to the right frontal portion of the brain.

Another infrequent effect is that of reduplicative paramnesia, in which patients believe that the location in which they currently reside is a replica of one located somewhere else. Similarly, those who experience Capgras syndrome after frontal lobe damage believe that an identical "replacement" has taken the identity of a close friend, relative, or other person and is posing as that person. This last effect is seen mostly in schizophrenic patients who also have a neurological disorder in the frontal lobe.

DNA damage

In the human frontal cortex, a set of genes undergo reduced expression after age 40 and especially after age 70. This set includes genes that have key functions in synaptic plasticity important in learning and memory, vesicular transport and mitochondrial function. During aging, DNA damage is markedly increased in the promoters of the genes displaying reduced expression in the frontal cortex. In cultured human neurons, these promoters are selectively damaged by oxidative stress.

Individuals with HIV associated neurocognitive disorders accumulate nuclear and mitochondrial DNA damage in the frontal cortex.

Genetic

A report from the National Institute of Mental Health says a variant of the COMT gene that reduces dopamine activity in the prefrontal cortex is related to poorer performance and inefficient functioning of that brain region during working memory tasks, and to a slightly increased risk for schizophrenia.

History

Psychosurgery

In the early 20th century, a medical treatment for mental illness, first developed by Portuguese neurologist Egas Moniz, involved damaging the pathways connecting the frontal lobe to the limbic system. A frontal lobotomy (sometimes called frontal leucotomy) successfully reduced distress but at the cost of often blunting the subject's emotions, volition and personality. The indiscriminate use of this psychosurgical procedure, combined with its severe side effects and a mortality rate of 7.4 to 17 per cent, earned it a bad reputation. The frontal lobotomy has largely died out as a psychiatric treatment. More precise psychosurgical procedures are still used, although rarely. They may include anterior capsulotomy (bilateral thermal lesions of the anterior limbs of the internal capsule) or the bilateral cingulotomy (involving lesions of the anterior cingulate gyri) and might be used to treat otherwise untreatable obsessional disorders or clinical depression.

Theories of function

Theories of frontal lobe function can be separated into four categories:

  • Single-process theories, which propose that "damage to a single process or system is responsible for a number of different dysexecutive symptoms"
  • Multi-process theories, which propose "that the frontal lobe executive system consists of a number of components that typically work together in everyday actions (heterogeneity of function)"
  • Construct-led theories, which propose that "most if not all frontal functions can be explained by one construct (homogeneity of function) such as working memory or inhibition"
  • Single-symptom theories, which propose that a specific dysexecutive symptom (e.g., confabulation) is related to the processes and construct of the underlying structures.

Other theories include:

  • Stuss (1999) suggests a differentiation into two categories according to homogeneity and heterogeneity of function.
  • Grafman's managerial knowledge units (MKU) / structured event complex (SEC) approach (cf. Wood & Grafman, 2003)
  • Miller & Cohen's integrative theory of prefrontal functioning (e.g. Miller & Cohen, 2001)
  • Rolls's stimulus-reward approach and Stuss's anterior attentional functions (Burgess & Simons, 2005; Burgess, 2003; Burke, 2007).

It may be highlighted that the theories described above differ in their focus on certain processes/systems or constructs. Stuss (1999) remarks that the question of homogeneity (single construct) or heterogeneity (multiple processes/systems) of function "may represent a problem of semantics and/or incomplete functional analysis rather than an unresolvable dichotomy" (p. 348). However, further research will show whether a unified theory of frontal lobe function that fully accounts for the diversity of functions will become available.

Other primates

Many scientists had thought that the frontal lobe was disproportionately enlarged in humans compared to other primates. This was thought to be an important feature of human evolution and seen as the primary reason why human cognition differs from that of other primates. However, this view in relation to great apes has since been challenged by neuroimaging studies. Using magnetic resonance imaging to determine the volume of the frontal cortex in humans, all extant ape species, and several monkey species, it was found that the human frontal cortex was not relatively larger than the cortex of other great apes, but was relatively larger than the frontal cortex of lesser apes and monkeys. The higher cognition of humans is instead seen to relate to a greater connectedness given by neural tracts that do not affect the cortical volume. This is also evident in the pathways of the language network connecting the frontal and temporal lobes.

Human extinction

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Human_ext...