
Friday, January 26, 2024

E-learning (theory)

From Wikipedia, the free encyclopedia

E-learning theory describes the cognitive science principles of effective multimedia learning using electronic educational technology.

Multimedia instructional design principles

Beginning with cognitive load theory as their motivating scientific premise, researchers such as Richard E. Mayer, John Sweller, and Roxana Moreno established within the scientific literature a set of multimedia instructional design principles that promote effective learning. Many of these principles have been "field tested" in everyday learning settings and found to be effective there as well. The majority of this body of research has been performed using university students given relatively short lessons on technical concepts with which they held low prior knowledge. However, David Roberts has tested the method with students in nine social science disciplines including sociology, politics and business studies. His longitudinal research program over 3 years established a clear improvement in levels of student engagement and in the development of active learning principles among students exposed to a combination of images and text over students exposed only to text. A number of other studies have shown these principles to be effective with learners of other ages and with non-technical learning content.

Research using learners who have greater prior knowledge of the lesson material sometimes finds results that contradict these design principles. This has led some researchers to put forward the "expertise effect" as an instructional design principle unto itself.

The underlying theoretical premise, cognitive load theory, describes the amount of mental effort that is related to performing a task as falling into one of three categories: germane, intrinsic, and extraneous.

  • Germane cognitive load: the mental effort required to process the task's information, make sense of it, and access and/or store it in long-term memory (for example, seeing a math problem, identifying the values and operations involved, and understanding that your task is to solve the math problem).
  • Intrinsic cognitive load: the mental effort required to perform the task itself (for example, actually solving the math problem).
  • Extraneous cognitive load: the mental effort imposed by the way that the task is delivered, which may or may not be efficient (for example, finding the math problem you are supposed to solve on a page that also contains advertisements for math books).

The multimedia instructional design principles identified by Mayer, Sweller, Moreno, and their colleagues are largely focused on minimizing extraneous cognitive load and managing intrinsic and germane loads at levels that are appropriate for the learner. Examples of these principles in practice include the following (a brief illustrative code sketch follows the list):

  • Reducing extraneous load by eliminating visual and auditory effects and elements that are not central to the lesson, such as seductive details (the coherence principle)
  • Reducing germane load by delivering verbal information through audio presentation (narration) while delivering relevant visual information through static images or animations (the modality principle)
  • Controlling intrinsic load by breaking the lesson into smaller segments and giving learners control over the pace at which they move forward through the lesson material (the segmenting principle).
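To make these ideas concrete, the short Python sketch below is a purely illustrative example (the Segment class, its field names, and the file names are assumptions of this sketch, not anything prescribed by Mayer, Sweller, or Moreno) of how a lesson-authoring tool might represent a learner-paced segment and flag content that works against the coherence, modality/redundancy, and segmenting principles. Running it on the demo segment prints warnings about the background music and the duplicated on-screen text.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Segment:
        """One short, learner-paced chunk of a lesson (segmenting principle)."""
        title: str
        narration_audio: str                  # verbal content delivered as audio (modality principle)
        graphics: List[str]                   # static images or animations carrying the visual content
        on_screen_text: str = ""              # kept minimal; duplicating the narration risks redundancy
        decorative_media: List[str] = field(default_factory=list)  # candidates for removal (coherence)
        learner_advances: bool = True         # learner clicks "continue" to move on (learner control)

    def review_segment(seg: Segment) -> List[str]:
        """Return warnings where a segment may work against one of the design principles."""
        warnings = []
        if seg.decorative_media:
            warnings.append("Coherence: media not central to the lesson; consider removing it.")
        if seg.on_screen_text and seg.narration_audio:
            warnings.append("Modality/redundancy: on-screen text duplicates the narration; "
                            "prefer audio narration plus graphics.")
        if not seg.learner_advances:
            warnings.append("Learner control: consider letting the learner pace the segment.")
        return warnings

    # Hypothetical demo segment used only to exercise the checks above.
    demo = Segment(
        title="How a bicycle pump works",
        narration_audio="pump_narration.mp3",
        graphics=["pump_diagram.png"],
        on_screen_text="A bicycle pump moves air by...",
        decorative_media=["background_music.mp3"],
    )
    for warning in review_segment(demo):
        print(warning)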

Cognitive load theory (and by extension, many of the multimedia instructional design principles) is based in part on a model of working memory by Alan Baddeley and Graham Hitch, who proposed that working memory has two largely independent, limited capacity sub-components that tend to work in parallel – one visual and one verbal/acoustic. This model is closely related to dual-coding theory, first proposed by Allan Paivio and later applied to multimedia learning by Richard Mayer. According to Mayer, separate channels of working memory process auditory and visual information during any lesson. Consequently, a learner can use more cognitive processing capacities to study materials that combine auditory verbal information with visual graphical information than to process materials that combine printed (visual) text with visual graphical information. In other words, the multi-modal materials reduce the cognitive load imposed on working memory.

In a series of studies, Mayer and his colleagues tested Paivio's dual-coding theory with multimedia lesson materials. They repeatedly found that students given multimedia with animation and narration consistently did better on transfer questions than those who learned from animation and text-based materials. That is, they were significantly better when it came to applying what they had learned after receiving multimedia rather than mono-media (visual only) instruction. These results were then later confirmed by other groups of researchers.

The initial studies of multimedia learning were limited to logical scientific processes that centered on cause-and-effect systems like automobile braking systems, how a bicycle pump works, or cloud formation. However, subsequent investigations found that the modality effect extended to other areas of learning.

Empirically established principles

  • Multimedia principle: Deeper learning is observed when words and relevant graphics are both presented than when words are presented alone (also called the multimedia effect). Simply put, the three most common elements in multimedia presentations are relevant graphics, audio narration, and explanatory text. Combining any two of these three elements works better than using just one or all three.
  • Modality principle: Deeper learning occurs when graphics are explained by audio narration instead of on-screen text. Exceptions have been observed when learners are familiar with the content, are not native speakers of the narration language, or when only printed words appear on the screen. Generally speaking, audio narration leads to better learning than the same words presented as text on the screen. This is especially true for walking someone through graphics on the screen and when the material to be learned is complex, or the terminology being used is already understood by the student (otherwise, see "pre-training"). One exception to this is when the learner will be using the information as a reference and will need to look back to it again and again.
  • Coherence principle: Avoid including graphics, music, narration, and other content that does not support the learning. This helps focus the learner on the content they need to learn and minimizes cognitive load imposed on memory by irrelevant and possibly distracting content. The less learners know about the lesson content, the easier it is for them to get distracted by anything shown that is not directly relevant to the lesson. For learners with greater prior knowledge, however, some motivational imagery may increase their interest and learning effectiveness.
  • Contiguity principle: Keep related pieces of information together. Deeper learning occurs when relevant text (for example, a label) is placed close to graphics, when spoken words and graphics are presented at the same time, and when feedback is presented next to the answer given by the learner.
  • Segmenting principle: Deeper learning occurs when content is broken into small chunks. Break down long lessons into several shorter lessons. Break down long text passages into multiple shorter ones.
  • Signaling principle: The use of visual, auditory, or temporal cues to draw attention to critical elements of the lesson. Common techniques include arrows, circles, highlighting or bolding text, and pausing or vocal emphasis in narration. Ending lesson segments after the critical information has been given may also serve as a signaling cue.
  • Learner control principle: Deeper learning occurs when learners can control the rate at which they move forward through segmented content. Learners tend to do best when the narration stops after a short, meaningful segment of content is given and the learner has to click a "continue" button in order to start the next segment. Some research suggests not overwhelming the learner with too many control options, however. Giving just pause and play buttons may work better than giving pause, play, fast forward, and reverse buttons. Also, high prior-knowledge learners may learn better when the lesson moves forward automatically, but they have a pause button that allows them to stop when they choose to do so.
  • Personalization principle: Deeper learning in multimedia lessons occurs when learners experience a stronger social presence, as when a conversational script or learning agents are used. The effect is best seen when the tone of voice is casual, informal, and in a first-person ("I" or "we") or second-person ("you") voice. For example, of the following two sentences, the second version conveys more of a casual, informal, conversational tone:
A. The learner should have the sense that someone is talking directly to them when they hear the narration.
B. Your learner should feel like someone is talking directly to them when they hear your narration.
 
Also, research suggests that using a polite tone of voice ("You may want to try multiplying both sides of the equation by 10.") leads to deeper learning for low prior knowledge learners than does a less polite, more directive tone of voice ("Multiply both sides of the equation by 10."), but may impair deeper learning in high prior knowledge learners. Finally, adding pedagogical agents (computer characters) can help if used to reinforce important content. For example, have the character narrate the lesson, point out critical features in on-screen graphics, or visually demonstrate concepts to the learner.
  • Pre-training principle: Deeper learning occurs when lessons present key concepts or vocabulary before presenting the processes or procedures related to those concepts. According to Mayer, Mathias, and Wetzel, "Before presenting a multimedia explanation, make sure learners visually recognize each major component, can name each component and can describe the major state changes of each component. In short, make sure learners build component models before presenting a cause-and-effect explanation of how a system works." However, others have noted that including pre-training content appears to be more important for low prior knowledge learners than for high prior knowledge learners.
  • Redundancy principle: Deeper learning occurs when lesson graphics are explained by audio narration alone rather than audio narration and on-screen text. This effect is stronger when the lesson is fast-paced, and the words are familiar to the learners. Exceptions to this principle include: screens with no visuals, learners who are not native speakers of the course language, and placement of only a few keywords on the screen (i.e., labeling critical elements of the graphic image).
  • Expertise effect: Instructional methods, such as those described above, that are helpful to domain novices or low prior knowledge learners may have no effect or may even depress learning in high prior knowledge learners.

Such principles may not apply outside of laboratory conditions. For example, Muller found that adding approximately 50% additional extraneous but interesting material did not result in any significant difference in learner performance. There is ongoing debate concerning the mechanisms underlying these beneficial principles and the boundary conditions that may apply.

Learning theories

Good pedagogical practice has a theory of learning at its core. However, no single best-practice e-learning standard has emerged, and such a standard may be unlikely to emerge given the range of learning and teaching styles, the potential ways technology can be implemented, and the ways in which educational technology itself is changing. Various pedagogical approaches or learning theories may be considered in designing and interacting with e-learning programs.

Social-constructivist – this pedagogy is particularly well afforded by the use of discussion forums, blogs, wikis, and online collaborative activities. It is a collaborative approach that opens educational content creation to a wider group, including the students themselves. The One Laptop Per Child Foundation attempted to use a constructivist approach in its project.

Laurillard's conversational model is also particularly relevant to e-learning, and Gilly Salmon's Five-Stage Model is a pedagogical approach to the use of discussion boards.

The cognitive perspective focuses on the cognitive processes involved in learning as well as how the brain works.

The emotional perspective focuses on the emotional aspects of learning, such as motivation, engagement, and fun.

The behavioural perspective focuses on the skills and behavioural outcomes of the learning process, for example through role-playing and application to on-the-job settings.

The contextual perspective focuses on the environmental and social aspects that can stimulate learning, such as interaction with other people, collaborative discovery, and the importance of peer support as well as peer pressure.

Mode neutral – the convergence or promotion of 'transmodal' learning, where online and classroom learners can coexist within one learning environment, thus encouraging interconnectivity and the harnessing of collective intelligence.

For many theorists, it is the interaction between student and teacher, and between student and student, in the online environment that enhances learning (Mayes and de Freitas 2004). Pask's theory that learning occurs through conversations about a subject, which in turn help to make knowledge explicit, has an obvious application to learning within a VLE.

Salmon developed a five-stage model of e-learning and e-moderating that for some time has had a major influence where online courses and online discussion forums have been used. In her five-stage model, individual access and the ability of students to use the technology are the first steps to involvement and achievement. The second step involves students creating an identity online and finding others with whom to interact; online socialization is a critical element of the e-learning process in this model. In step 3, students give and share information relevant to the course with each other. Collaborative interaction amongst students is central to step 4. The fifth step in Salmon's model involves students looking for benefits from the system and using resources from outside of it to deepen their learning. Throughout all of this, the tutor/teacher/lecturer fulfills the role of moderator or e-moderator, acting as a facilitator of student learning.

Some criticism of Salmon's model is now beginning to emerge. It does not transfer easily to other contexts (she developed it from experience with an Open University distance learning course), and it ignores the variety of learning approaches that are possible within computer-mediated communication (CMC) and the range of available learning theories (Moule 2007).

Self-regulation

Self-regulated learning refers to several concepts that play major roles in learning and which have significant relevance in e-learning. Research in this area suggests that, in order to develop self-regulation, learning courses should offer opportunities for students to practice strategies and skills by themselves. Self-regulation is also strongly related to a student's social sources, such as parents and teachers. Moreover, Steinberg (1996) found that high-achieving students usually have high-expectation parents who monitor their children closely.

In the academic environment, self-regulated learners usually set their own academic goals and monitor and evaluate their progress as they work to achieve those goals. Schunk argues, "Students must regulate not only their actions but also their underlying achievement-related cognitions, beliefs, intentions and effects" (p. 359). Moreover, academic self-regulation also helps students develop confidence in their ability to perform well in e-learning courses.

Theoretical framework

The e-learning literature identifies an ecology of concepts. A bibliometric study identified the most commonly used concepts associated with the use of computers in learning contexts, e.g., computer-assisted instruction (CAI), computer-assisted learning (CAL), computer-based education (CBE), e-learning, learning management systems (LMS), self-directed learning (SDL), and massive open online courses (MOOC). All these concepts have two aspects in common: learning and computers, except the SDL concept, which derives from psychology and does not necessarily apply to computer usage. These concepts are yet to be studied in scientific research and stand in contrast to MOOCs. Nowadays, e-learning can also mean massive distribution of content and global classes for all Internet users. E-learning studies can be focused on three principal dimensions: users, technology, and services.

Application of learning theory (education) to e-learning (theory)

As alluded to at the beginning of this section, the discussion of whether to use virtual or physical learning environments is unlikely to yield a definitive answer in its current form. First, the efficacy of the learning environment may depend on the concept being taught. Additionally, comparisons tend to invoke differences in learning theories only after the fact, as post-hoc explanations for the differences between virtual and physical environments. When virtual and physical environments were designed so that the same learning theories were employed by the students (Physical Engagement, Cognitive Load, Embodied Encoding, Embodied Schemas, and Conceptual Salience), differences in post-test performance did not lie between physical and virtual conditions, but instead in how the environment was designed to support the particular learning theory.

These findings suggest that as long as virtual learning environments are well designed and able to emulate the most important aspects of the physical environment they are intended to replicate or enhance, research that has previously been applied to physical models or environments can also be applied to virtual ones. This means that it is possible to apply a wealth of research from physical learning theory to virtual environments. These virtual learning environments – once developed – can present cost-effective solutions to learning in terms of the time invested in setting them up, using them, and reusing them. Additionally, due to the relatively low cost, students are able to perform advanced analytical techniques without the cost of lab supplies. Many even believe that when considering the appropriate affordances of each (virtual or physical) representation, a blend that uses both can further enhance student learning.

Teacher use of technology

Computing technology was not created by teachers. There has been little consultation between those who promote its use in schools and those who teach with it. Decisions to purchase technology for education are very often political decisions. Most staff using these technologies did not grow up with them. Training teachers to use computer technology did improve their confidence in its use, but there was considerable dissatisfaction with training content and style of delivery. The communication element, in particular, was highlighted as the least satisfactory part of the training, by which many teachers meant the use of a VLE and discussion forums to deliver online training (Leask 2002). Technical support for online learning, lack of access to hardware, poor monitoring of teacher progress, and a lack of support by online tutors were just some of the issues raised by the asynchronous online delivery of training (Davies 2004).

Newer generation web 2.0 services provide customizable, inexpensive platforms for authoring and disseminating multimedia-rich e-learning courses and do not need specialized information technology (IT) support.

Pedagogical theory may have application in encouraging and assessing online participation. Assessment methods for online participation have been reviewed.

Methodology

From Wikipedia, the free encyclopedia

Methodologies are traditionally divided into quantitative and qualitative research. Quantitative research is the main methodology of the natural sciences. It uses precise numerical measurements. Its goal is usually to find universal laws used to make predictions about future events. The dominant methodology in the natural sciences is called the scientific method. It includes steps like observation and the formulation of a hypothesis. Further steps are to test the hypothesis using an experiment, to compare the measurements to the expected results, and to publish the findings.

Qualitative research is more characteristic of the social sciences and gives less prominence to exact numerical measurements. It aims more at an in-depth understanding of the meaning of the studied phenomena and less at universal and predictive laws. Common methods found in the social sciences are surveys, interviews, focus groups, and the nominal group technique. They differ from each other concerning their sample size, the types of questions asked, and the general setting. In recent decades, many social scientists have started using mixed-methods research, which combines quantitative and qualitative methodologies.

Many discussions in methodology concern the question of whether the quantitative approach is superior, especially whether it is adequate when applied to the social domain. A few theorists reject methodology as a discipline in general. For example, some argue that it is useless since methods should be used rather than studied. Others hold that it is harmful because it restricts the freedom and creativity of researchers. Methodologists often respond to these objections by claiming that a good methodology helps researchers arrive at reliable theories in an efficient way. The choice of method often matters since the same factual material can lead to different conclusions depending on one's method. Interest in methodology has risen in the 20th century due to the increased importance of interdisciplinary work and the obstacles hindering efficient cooperation.

Definitions

The term "methodology" is associated with a variety of meanings. In its most common usage, it refers either to a method, to the field of inquiry studying methods, or to philosophical discussions of background assumptions involved in these processes. Some researchers distinguish methods from methodologies by holding that methods are modes of data collection while methodologies are more general research strategies that determine how to conduct a research project. In this sense, methodologies include various theoretical commitments about the intended outcomes of the investigation.

As method

The term "methodology" is sometimes used as a synonym for the term "method". A method is a way of reaching some predefined goal. It is a planned and structured procedure for solving a theoretical or practical problem. In this regard, methods stand in contrast to free and unstructured approaches to problem-solving. For example, descriptive statistics is a method of data analysis, radiocarbon dating is a method of determining the age of organic objects, sautéing is a method of cooking, and project-based learning is an educational method. The term "technique" is often used as a synonym both in the academic and the everyday discourse. Methods usually involve a clearly defined series of decisions and actions to be used under certain circumstances. The goal of following the steps of a method is to bring about the result promised by it. In the context of inquiry, methods may be defined as systems of rules and procedures to discover regularities of nature, society, and thought. In this sense, methodology can refer to procedures used to arrive at new knowledge or to techniques of verifying and falsifying pre-existing knowledge claims. This encompasses various issues pertaining both to the collection of data and their analysis. Concerning the collection, it involves the problem of sampling and of how to go about the data collection itself, like surveys, interviews, or observation. There are also numerous methods of how the collected data can be analyzed using statistics or other ways of interpreting it to extract interesting conclusions.

As study of methods

However, many theorists emphasize the differences between the terms "method" and "methodology". In this regard, methodology may be defined as "the study or description of methods" or as "the analysis of the principles of methods, rules, and postulates employed by a discipline". This study or analysis involves uncovering assumptions and practices associated with the different methods and a detailed description of research designs and hypothesis testing. It also includes evaluative aspects: forms of data collection, measurement strategies, and ways to analyze data are compared and their advantages and disadvantages relative to different research goals and situations are assessed. In this regard, methodology provides the skills, knowledge, and practical guidance needed to conduct scientific research in an efficient manner. It acts as a guideline for various decisions researchers need to take in the scientific process.

Methodology can be understood as the middle ground between concrete particular methods and the abstract and general issues discussed by the philosophy of science. In this regard, methodology comes after formulating a research question and helps the researchers decide what methods to use in the process. For example, methodology should assist the researcher in deciding why one method of sampling is preferable to another in a particular case or which form of data analysis is likely to bring the best results. Methodology achieves this by explaining, evaluating and justifying methods. Just as there are different methods, there are also different methodologies. Different methodologies provide different approaches to how methods are evaluated and explained and may thus make different suggestions on what method to use in a particular case.

According to Aleksandr Georgievich Spirkin, "[a] methodology is a system of principles and general ways of organising and structuring theoretical and practical activity, and also the theory of this system". Helen Kara defines methodology as "a contextual framework for research, a coherent and logical scheme based on views, beliefs, and values, that guides the choices researchers make". Ginny E. Garcia and Dudley L. Poston understand methodology either as a complex body of rules and postulates guiding research or as the analysis of such rules and procedures. As a body of rules and postulates, a methodology defines the subject of analysis as well as the conceptual tools used by the analysis and the limits of the analysis. Research projects are usually governed by a structured procedure known as the research process. The goal of this process is given by a research question, which determines what kind of information one intends to acquire.

As discussion of background assumptions

Some theorists prefer an even wider understanding of methodology that involves not just the description, comparison, and evaluation of methods but additionally includes more general philosophical issues. One reason for this wider approach is that discussions of when to use which method often take various background assumptions for granted, for example, concerning the goal and nature of research. These assumptions can at times play an important role concerning which method to choose and how to follow it. For example, Thomas Kuhn argues in his The Structure of Scientific Revolutions that sciences operate within a framework or a paradigm that determines which questions are asked and what counts as good science. This concerns philosophical disagreements about how to conceptualize the phenomena studied, what constitutes evidence for and against them, and what the general goal of researching them is. So in this wider sense, methodology overlaps with philosophy by making these assumptions explicit and presenting arguments for and against them. According to C. S. Herrman, a good methodology clarifies the structure of the data to be analyzed and helps the researchers see the phenomena in a new light. In this regard, a methodology is similar to a paradigm. A similar view is defended by Spirkin, who holds that a central aspect of every methodology is the world view that comes with it.

The discussion of background assumptions can include metaphysical and ontological issues in cases where they have important implications for the proper research methodology. For example, a realist perspective considering the observed phenomena as an external and independent reality is often associated with an emphasis on empirical data collection and a more distanced and objective attitude. Idealists, on the other hand, hold that external reality is not fully independent of the mind and tend, therefore, to include more subjective tendencies in the research process as well.

For the quantitative approach, philosophical debates in methodology include the distinction between the inductive and the hypothetico-deductive interpretation of the scientific method. For qualitative research, many basic assumptions are tied to philosophical positions such as hermeneutics, pragmatism, Marxism, critical theory, and postmodernism. According to Kuhn, an important factor in such debates is that the different paradigms are incommensurable. This means that there is no overarching framework to assess the conflicting theoretical and methodological assumptions. This critique puts into question various presumptions of the quantitative approach associated with scientific progress based on the steady accumulation of data.

Other discussions of abstract theoretical issues in the philosophy of science are also sometimes included. This can involve questions like how and whether scientific research differs from fictional writing as well as whether research studies objective facts rather than constructing the phenomena it claims to study. In the latter sense, some methodologists have even claimed that the goal of science is less to represent a pre-existing reality and more to bring about some kind of social change in favor of repressed groups in society.

Related terms and issues

Viknesh Andiappan and Yoke Kin Wan use the field of process systems engineering to distinguish the term "methodology" from the closely related terms "approach", "method", "procedure", and "technique". On their view, "approach" is the most general term. It can be defined as "a way or direction used to address a problem based on a set of assumptions". An example is the difference between hierarchical approaches, which consider one task at a time in a hierarchical manner, and concurrent approaches, which consider them all simultaneously. Methodologies are a little more specific. They are general strategies needed to realize an approach and may be understood as guidelines for how to make choices. Often the term "framework" is used as a synonym. A method is a still more specific way of practically implementing the approach. Methodologies provide the guidelines that help researchers decide which method to follow. The method itself may be understood as a sequence of techniques. A technique is a step taken that can be observed and measured. Each technique has some immediate result. The whole sequence of steps is termed a "procedure". A similar but less complex characterization is sometimes found in the field of language teaching, where the teaching process may be described through a three-level conceptualization based on "approach", "method", and "technique".

One question concerning the definition of methodology is whether it should be understood as a descriptive or a normative discipline. The key difference in this regard is whether methodology merely provides a value-neutral description of the methods scientists actually use, or whether it prescribes which methods they ought to use. Many methodologists practice their craft in a normative sense, meaning that they express clear opinions about the advantages and disadvantages of different methods. In this regard, methodology is not just about what researchers actually do but about what they ought to do or how to perform good research.

Types

Theorists often distinguish various general types or approaches to methodology. The most influential classification contrasts quantitative and qualitative methodology.

Quantitative and qualitative

Quantitative research is closely associated with the natural sciences. It is based on precise numerical measurements, which are then used to arrive at exact general laws. This precision is also reflected in the goal of making predictions that can later be verified by other researchers. Examples of quantitative research include physicists at the Large Hadron Collider measuring the mass of newly created particles and positive psychologists conducting an online survey to determine the correlation between income and self-assessed well-being.
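To illustrate the quantitative style of analysis, the sketch below computes a Pearson correlation coefficient for an invented set of survey responses pairing income with self-assessed well-being; the data and variable names are hypothetical, and a real study would also involve sampling design and significance testing.

    from math import sqrt

    # Hypothetical survey responses: annual income (thousands) and self-assessed well-being (1-10).
    income     = [22, 35, 48, 54, 67, 73, 88, 95]
    well_being = [ 5,  6,  6,  7,  7,  8,  7,  8]

    def pearson_r(xs, ys):
        """Pearson correlation: covariance divided by the product of the standard deviations."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov   = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var_x = sum((x - mean_x) ** 2 for x in xs)
        var_y = sum((y - mean_y) ** 2 for y in ys)
        return cov / sqrt(var_x * var_y)

    print(f"r = {pearson_r(income, well_being):.2f}")  # a positive r suggests a positive association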

Qualitative research is characterized in various ways in the academic literature but there are very few precise definitions of the term. It is often used in contrast to quantitative research for forms of study that do not quantify their subject matter numerically. However, the distinction between these two types is not always obvious and various theorists have argued that it should be understood as a continuum and not as a dichotomy. A lot of qualitative research is concerned with some form of human experience or behavior, in which case it tends to focus on a few individuals and their in-depth understanding of the meaning of the studied phenomena. Examples of the qualitative method are a market researcher conducting a focus group in order to learn how people react to a new product or a medical researcher performing an unstructured in-depth interview with a participant from a new experimental therapy to assess its potential benefits and drawbacks. It is also used to improve quantitative research, such as informing data collection materials and questionnaire design. Qualitative research is frequently employed in fields where the pre-existing knowledge is inadequate. This way, it is possible to get a first impression of the field and potential theories, thus paving the way for investigating the issue in further studies.

Quantitative methods dominate in the natural sciences but both methodologies are used in the social sciences. Some social scientists focus mostly on one method while others try to investigate the same phenomenon using a variety of different methods. It is central to both approaches how the group of individuals used for the data collection is selected. This process is known as sampling. It involves the selection of a subset of individuals or phenomena to be measured. Important in this regard is that the selected samples are representative of the whole population, i.e. that no significant biases were involved when choosing. If this is not the case, the data collected does not reflect what the population as a whole is like. This affects generalizations and predictions drawn from the biased data. The number of individuals selected is called the sample size. For qualitative research, the sample size is usually rather small, while quantitative research tends to focus on big groups and collecting a lot of data. After the collection, the data needs to be analyzed and interpreted to arrive at interesting conclusions that pertain directly to the research question. This way, the wealth of information obtained is summarized and thus made more accessible to others. Especially in the case of quantitative research, this often involves the application of some form of statistics to make sense of the numerous individual measurements.
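The following sketch illustrates the sampling step with simulated data: a simple random sample tracks the population mean closely, while a deliberately biased sample distorts it. The population, sample sizes, and attribute are invented for illustration only.

    import random

    random.seed(42)

    # A simulated population of 10,000 individuals with some numeric attribute (e.g., hours studied).
    population = [random.gauss(20, 5) for _ in range(10_000)]

    def mean(xs):
        return sum(xs) / len(xs)

    # Simple random sampling: every individual has the same chance of being selected.
    simple_sample = random.sample(population, k=200)

    # A deliberately biased sample: only individuals with the highest values are selected.
    biased_sample = sorted(population)[-200:]

    print(f"population mean:    {mean(population):.2f}")
    print(f"random sample mean: {mean(simple_sample):.2f}")  # close to the population mean
    print(f"biased sample mean: {mean(biased_sample):.2f}")  # overestimates the population mean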

Many discussions in the history of methodology center around the quantitative methods used by the natural sciences. A central question in this regard is to what extent they can be applied to other fields, like the social sciences and history. The success of the natural sciences was often seen as an indication of the superiority of the quantitative methodology and used as an argument to apply this approach to other fields as well. However, this outlook has been put into question in the more recent methodological discourse. In this regard, it is often argued that the paradigm of the natural sciences is a one-sided development of reason, which is not equally well suited to all areas of inquiry. The divide between quantitative and qualitative methods in the social sciences is one consequence of this criticism.

Which method is more appropriate often depends on the goal of the research. For example, quantitative methods usually excel for evaluating preconceived hypotheses that can be clearly formulated and measured. Qualitative methods, on the other hand, can be used to study complex individual issues, often with the goal of formulating new hypotheses. This is especially relevant when the existing knowledge of the subject is inadequate. Important advantages of quantitative methods include precision and reliability. However, they often have difficulty studying very complex phenomena that are commonly of interest to the social sciences. Additional problems can arise when the data is misinterpreted to defend conclusions that are not directly supported by the measurements themselves. In recent decades, many researchers in the social sciences have started combining both methodologies. This is known as mixed-methods research. A central motivation for this is that the two approaches can complement each other in various ways: some issues are ignored or too difficult to study with one methodology and are better approached with the other. In other cases, both approaches are applied to the same issue to produce more comprehensive and well-rounded results.

Qualitative and quantitative research are often associated with different research paradigms and background assumptions. Qualitative researchers often use an interpretive or critical approach while quantitative researchers tend to prefer a positivistic approach. Important disagreements between these approaches concern the role of objectivity and hard empirical data as well as the research goal of predictive success rather than in-depth understanding or social change.

Others

Various other classifications have been proposed. One distinguishes between substantive and formal methodologies. Substantive methodologies tend to focus on one specific area of inquiry. The findings are initially restricted to this specific field but may be transferrable to other areas of inquiry. Formal methodologies, on the other hand, are based on a variety of studies and try to arrive at more general principles applying to different fields. They may also give particular prominence to the analysis of the language of science and the formal structure of scientific explanation. A closely related classification distinguishes between philosophical, general scientific, and special scientific methods.

One type of methodological outlook is called "proceduralism". According to it, the goal of methodology is to boil down the research process to a simple set of rules or a recipe that automatically leads to good research if followed precisely. However, it has been argued that, while this ideal may be acceptable for some forms of quantitative research, it fails for qualitative research. One argument for this position is based on the claim that research is not a technique but a craft that cannot be achieved by blindly following a method. In this regard, research depends on forms of creativity and improvisation to amount to good science.

Other types include inductive, deductive, and transcendental methods. Inductive methods are common in the empirical sciences and proceed through inductive reasoning from many particular observations to arrive at general conclusions, often in the form of universal laws. Deductive methods, also referred to as axiomatic methods, are often found in formal sciences, such as geometry. They start from a set of self-evident axioms or first principles and use deduction to infer interesting conclusions from these axioms. Transcendental methods are common in Kantian and post-Kantian philosophy. They start with certain particular observations. It is then argued that the observed phenomena can only exist if their conditions of possibility are fulfilled. This way, the researcher may draw general psychological or metaphysical conclusions based on the claim that the phenomenon would not be observable otherwise.

Importance

It has been argued that a proper understanding of methodology is important for various issues in the field of research. They include both the problem of conducting efficient and reliable research as well as being able to validate knowledge claims by others. Method is often seen as one of the main factors of scientific progress. This is especially true for the natural sciences where the developments of experimental methods in the 16th and 17th century are often seen as the driving force behind the success and prominence of the natural sciences. In some cases, the choice of methodology may have a severe impact on a research project. The reason is that very different and sometimes even opposite conclusions may follow from the same factual material based on the chosen methodology.

Aleksandr Georgievich Spirkin argues that methodology, when understood in a wide sense, is of great importance since the world presents us with innumerable entities and relations between them. Methods are needed to simplify this complexity and find a way of mastering it. On the theoretical side, this concerns ways of forming true beliefs and solving problems. On the practical side, this concerns skills of influencing nature and dealing with each other. These different methods are usually passed down from one generation to the next. Spirkin holds that the interest in methodology on a more abstract level arose in attempts to formalize these techniques to improve them as well as to make it easier to use them and pass them on. In the field of research, for example, the goal of this process is to find reliable means to acquire knowledge in contrast to mere opinions acquired by unreliable means. In this regard, "methodology is a way of obtaining and building up ... knowledge".

Various theorists have observed that the interest in methodology has risen significantly in the 20th century. This increased interest is reflected not just in academic publications on the subject but also in the institutionalized establishment of training programs focusing specifically on methodology. This phenomenon can be interpreted in different ways. Some see it as a positive indication of the topic's theoretical and practical importance. Others interpret this interest in methodology as an excessive preoccupation that draws time and energy away from doing research on concrete subjects by applying the methods instead of researching them. This ambiguous attitude towards methodology is sometimes even exemplified in the same person. Max Weber, for example, criticized the focus on methodology during his time while making significant contributions to it himself. Spirkin believes that one important reason for this development is that contemporary society faces many global problems. These problems cannot be solved by a single researcher or a single discipline but are in need of collaborative efforts from many fields. Such interdisciplinary undertakings profit a lot from methodological advances, both concerning the ability to understand the methods of the respective fields and in relation to developing more homogeneous methods equally used by all of them.

Criticism

Most criticism of methodology is directed at one specific form or understanding of it. In such cases, one particular methodological theory is rejected but not methodology at large when understood as a field of research comprising many different theories. In this regard, many objections to methodology focus on the quantitative approach, specifically when it is treated as the only viable approach. Nonetheless, there are also more fundamental criticisms of methodology in general. They are often based on the idea that there is little value to abstract discussions of methods and the reasons cited for and against them. In this regard, it may be argued that what matters is the correct employment of methods and not their meticulous study. Sigmund Freud, for example, compared methodologists to "people who clean their glasses so thoroughly that they never have time to look through them". According to C. Wright Mills, the practice of methodology often degenerates into a "fetishism of method and technique".

Some even hold that methodological reflection is not just a waste of time but actually has negative side effects. Such an argument may be defended by analogy to other skills that work best when the agent focuses only on employing them. In this regard, reflection may interfere with the process and lead to avoidable mistakes. According to an example by Gilbert Ryle, "[w]e run, as a rule, worse, not better, if we think a lot about our feet". A less severe version of this criticism does not reject methodology per se but denies its importance and rejects an intense focus on it. In this regard, methodology has still a limited and subordinate utility but becomes a diversion or even counterproductive by hindering practice when given too much emphasis.

Another line of criticism concerns more the general and abstract nature of methodology. It states that the discussion of methods is only useful in concrete and particular cases but not concerning abstract guidelines governing many or all cases. Some anti-methodologists reject methodology based on the claim that researchers need freedom to do their work effectively. But this freedom may be constrained and stifled by "inflexible and inappropriate guidelines". For example, according to Kerry Chamberlain, a good interpretation needs creativity to be provocative and insightful, which is prohibited by a strictly codified approach. Chamberlain uses the neologism "methodolatry" to refer to this alleged overemphasis on methodology. Similar arguments are given in Paul Feyerabend's book "Against Method".

However, these criticisms of methodology in general are not always accepted. Many methodologists defend their craft by pointing out how the efficiency and reliability of research can be improved through a proper understanding of methodology.

A criticism of more specific forms of methodology is found in the works of the sociologist Howard S. Becker. He is quite critical of methodologists based on the claim that they usually act as advocates of one particular method usually associated with quantitative research. An often-cited quotation in this regard is that "[m]ethodology is too important to be left to methodologists". Alan Bryman has rejected this negative outlook on methodology. He holds that Becker's criticism can be avoided by understanding methodology as an inclusive inquiry into all kinds of methods and not as a mere doctrine for converting non-believers to one's preferred method.

In different fields

Part of the importance of methodology is reflected in the number of fields to which it is relevant. They include the natural sciences and the social sciences as well as philosophy and mathematics.

Natural sciences

[Figure: The methodology underlying a type of DNA sequencing]

The dominant methodology in the natural sciences (like astronomy, biology, chemistry, geoscience, and physics) is called the scientific method. Its main cognitive aim is usually seen as the creation of knowledge, but various closely related aims have also been proposed, like understanding, explanation, or predictive success. Strictly speaking, there is no one single scientific method. In this regard, the expression "scientific method" refers not to one specific procedure but to different general or abstract methodological aspects characteristic of all the aforementioned fields. Important features are that the problem is formulated in a clear manner and that the evidence presented for or against a theory is public, reliable, and replicable. The last point is important so that other researchers are able to repeat the experiments to confirm or disconfirm the initial study. For this reason, various factors and variables of the situation often have to be controlled to avoid distorting influences and to ensure that subsequent measurements by other researchers yield the same results. The scientific method is a quantitative approach that aims at obtaining numerical data. This data is often described using mathematical formulas. The goal is usually to arrive at some universal generalizations that apply not just to the artificial situation of the experiment but to the world at large. Some data can only be acquired using advanced measurement instruments. In cases where the data is very complex, it is often necessary to employ sophisticated statistical techniques to draw conclusions from it.

The scientific method is often broken down into several steps. In a typical case, the procedure starts with regular observation and the collection of information. These findings then lead the scientist to formulate a hypothesis describing and explaining the observed phenomena. The next step consists in conducting an experiment designed for this specific hypothesis. The actual results of the experiment are then compared to the expected results based on one's hypothesis. The findings may then be interpreted and published, either as a confirmation or disconfirmation of the initial hypothesis.

Two central aspects of the scientific method are observation and experimentation. This distinction is based on the idea that experimentation involves some form of manipulation or intervention. This way, the studied phenomena are actively created or shaped. For example, a biologist inserting viral DNA into a bacterium is engaged in a form of experimentation. Pure observation, on the other hand, involves studying independent entities in a passive manner. This is the case, for example, when astronomers observe the orbits of astronomical objects far away. Observation played the main role in ancient science. The scientific revolution in the 16th and 17th centuries effected a paradigm change that gave a much more central role to experimentation in the scientific methodology. This is sometimes expressed by stating that modern science actively "puts questions to nature". While the distinction is usually clear in the paradigmatic cases, there are also many intermediate cases where it is not obvious whether they should be characterized as observation or as experimentation.

A central discussion in this field concerns the distinction between the inductive and the hypothetico-deductive methodology. The core disagreement between these two approaches concerns their understanding of the confirmation of scientific theories. The inductive approach holds that a theory is confirmed or supported by all its positive instances, i.e. by all the observations that exemplify it. For example, the observations of many white swans confirm the universal hypothesis that "all swans are white". The hypothetico-deductive approach, on the other hand, focuses not on positive instances but on deductive consequences of the theory. This way, the researcher uses deduction before conducting an experiment to infer what observations they expect. These expectations are then compared to the observations they actually make. This approach often takes a negative form based on falsification. In this regard, positive instances do not confirm a hypothesis but negative instances disconfirm it. Positive indications that the hypothesis is true are only given indirectly if many attempts to find counterexamples have failed. A cornerstone of this approach is the null hypothesis, which assumes that there is no connection (see causality) between the phenomena being observed. It is up to the researcher to do all they can to disprove the null hypothesis through relevant methods or techniques, documented in a clear and replicable process. If the evidence allows the null hypothesis to be rejected, this provides support for the researcher's own hypothesis about the relation between the observed phenomena.
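A minimal sketch of this null-hypothesis logic, using invented data, is a permutation test: assume the group labels make no difference (the null hypothesis), shuffle them many times, and count how often a difference at least as large as the observed one arises by chance.

    import random

    random.seed(0)

    # Hypothetical outcome scores for two groups (e.g., treatment vs. control).
    group_a = [14, 18, 17, 21, 16, 19, 20, 15]
    group_b = [12, 15, 13, 16, 14, 13, 17, 12]

    def mean(xs):
        return sum(xs) / len(xs)

    observed = mean(group_a) - mean(group_b)

    # Null hypothesis: group labels do not matter, so reshuffling them should produce
    # differences comparable to the observed one purely by chance.
    pooled = group_a + group_b
    n_a = len(group_a)
    n_permutations = 10_000
    count_extreme = 0
    for _ in range(n_permutations):
        random.shuffle(pooled)
        diff = mean(pooled[:n_a]) - mean(pooled[n_a:])
        if abs(diff) >= abs(observed):
            count_extreme += 1

    p_value = count_extreme / n_permutations
    print(f"observed difference: {observed:.2f}")
    print(f"p-value: {p_value:.4f}")  # a small p-value means the data are unlikely under the null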

Social sciences

Significantly more methodological variety is found in the social sciences, where both quantitative and qualitative approaches are used. They employ various forms of data collection, such as surveys, interviews, focus groups, and the nominal group technique. Surveys belong to quantitative research and usually involve some form of questionnaire given to a large group of individuals. It is paramount that the questions are easily understandable by the participants since the answers might not have much value otherwise. Surveys normally restrict themselves to closed questions in order to avoid various problems that come with the interpretation of answers to open questions. They contrast in this regard to interviews, which put more emphasis on the individual participant and often involve open questions. Structured interviews are planned in advance and have a fixed set of questions given to each individual. They contrast with unstructured interviews, which are closer to a free-flow conversation and require more improvisation on the side of the interviewer for finding interesting and relevant questions. Semi-structured interviews constitute a middle ground: they include both predetermined questions and questions not planned in advance. Structured interviews make it easier to compare the responses of the different participants and to draw general conclusions. However, they also limit what may be discovered and thus constrain the investigation in many ways. Depending on the type and depth of the interview, this method belongs either to quantitative or to qualitative research. The terms "research conversation" and "muddy interview" have been used to describe interviews conducted in informal settings which may not occur purely for the purposes of data collection.
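As a small illustration (the question, answer options, and responses below are invented), closed survey questions can be represented as a fixed set of options whose responses are simply tallied, which is what makes comparison across many participants straightforward, whereas open questions yield free text that must first be interpreted.

    from collections import Counter

    # A closed question: a fixed set of answer options that every participant chooses from.
    closed_question = {
        "text": "How often do you use online learning materials?",
        "options": ["never", "monthly", "weekly", "daily"],
    }

    # Hypothetical responses from participants.
    responses = ["weekly", "daily", "never", "weekly", "monthly", "weekly", "daily"]

    # Closed answers can be tallied directly, making large-scale comparison easy.
    tally = Counter(responses)
    for option in closed_question["options"]:
        print(f"{option}: {tally.get(option, 0)}")

    # An open question, by contrast, yields free text that must be interpreted before analysis.
    open_response = "I mostly watch recorded lectures when preparing for exams."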

Focus groups are a qualitative research method often used in market research. They constitute a form of group interview involving a small number of demographically similar people. Researchers can use this method to collect data based on the interactions and responses of the participants. The interview often starts by asking the participants about their opinions on the topic under investigation, which may, in turn, lead to a free exchange in which the group members express and discuss their personal views. An important advantage of focus groups is that they can provide insight into how ideas and understanding operate in a cultural context. However, it is usually difficult to use these insights to discern more general patterns true for a wider public. Another advantage is that focus groups can help the researcher identify a wide range of distinct perspectives on the issue in a short time. The group interaction may also help clarify and expand interesting contributions. One disadvantage is due to the moderator's personality and group effects, which may influence the opinions stated by the participants. When applied to cross-cultural settings, cultural and linguistic adaptations and group composition considerations are important to encourage greater participation in the group discussion.

The nominal group technique is similar to focus groups with a few important differences. The group often consists of experts in the field in question. The group size is similar but the interaction between the participants is more structured. The goal is to determine how much agreement there is among the experts on the different issues. The initial responses are often given in written form by each participant without a prior conversation between them. In this manner, group effects potentially influencing the expressed opinions are minimized. In later steps, the different responses and comments may be discussed and compared to each other by the group as a whole.
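A toy sketch of the aggregation step, with invented expert ratings, might measure agreement simply as the spread of the independent written ratings on each issue; real applications of the nominal group technique use more elaborate procedures.

    import statistics

    # Hypothetical independent written ratings (1 = strongly disagree ... 5 = strongly agree)
    # given by five experts on three issues, before any group discussion.
    ratings = {
        "issue A": [5, 4, 5, 5, 4],
        "issue B": [2, 5, 3, 1, 4],
        "issue C": [3, 3, 4, 3, 3],
    }

    for issue, scores in ratings.items():
        spread = statistics.stdev(scores)  # low spread = high agreement among the experts
        print(f"{issue}: median rating {statistics.median(scores)}, spread {spread:.2f}")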

Most of these forms of data collection involve some type of observation. Observation can take place either in a natural setting, i.e. the field, or in a controlled setting such as a laboratory. Controlled settings carry with them the risk of distorting the results due to their artificiality. Their advantage lies in precisely controlling the relevant factors, which can help make the observations more reliable and repeatable. Non-participatory observation involves a distanced or external approach. In this case, the researcher focuses on describing and recording the observed phenomena without causing or changing them, in contrast to participatory observation.

An important methodological debate in the social sciences concerns the question of whether they deal with hard, objective, and value-neutral facts, as the natural sciences do. Positivists agree with this characterization, in contrast to interpretive and critical perspectives on the social sciences. According to William Neuman, positivism can be defined as "an organized method for combining deductive logic with precise empirical observations of individual behavior in order to discover and confirm a set of probabilistic causal laws that can be used to predict general patterns of human activity". This view is rejected by interpretivists. Max Weber, for example, argues that the method of the natural sciences is inadequate for the social sciences. Instead, more importance is placed on meaning and how people create and maintain their social worlds. The critical methodology in social science is associated with Karl Marx and Sigmund Freud. It is based on the assumption that many of the phenomena studied using the other approaches are mere distortions or surface illusions. It seeks to uncover deeper structures of the material world hidden behind these distortions. This approach is often guided by the goal of helping people effect social changes and improvements.

Philosophy

Philosophical methodology is the metaphilosophical field of inquiry studying the methods used in philosophy. These methods structure how philosophers conduct their research, acquire knowledge, and select between competing theories. It concerns both descriptive issues of what methods have been used by philosophers in the past and normative issues of which methods should be used. Many philosophers emphasize that these methods differ significantly from the methods found in the natural sciences in that they usually do not rely on experimental data obtained through measuring equipment. Which method one follows can have wide implications for how philosophical theories are constructed, what theses are defended, and which arguments are cited for or against them. In this regard, many philosophical disagreements have their source in methodological disagreements. Historically, the discovery of new methods, like methodological skepticism and the phenomenological method, has had an important impact on philosophical discourse.

A great variety of methods has been employed throughout the history of philosophy. Methodological skepticism gives special importance to the role of systematic doubt. In this way, philosophers try to discover first principles that are absolutely certain. The geometric method starts from such first principles and employs deductive reasoning to construct a comprehensive philosophical system based on them. Phenomenology gives particular importance to how things appear to be. Its method consists in suspending one's judgments about whether these things actually exist in the external world. This technique is known as epoché and can be used to study appearances independent of assumptions about their causes. The method of conceptual analysis came to particular prominence with the advent of analytic philosophy. It studies concepts by breaking them down into their most fundamental constituents to clarify their meaning. Common sense philosophy uses widely shared and accepted beliefs as a philosophical tool, drawing interesting conclusions from them. This is often employed in a negative sense to discredit radical philosophical positions that go against common sense. Ordinary language philosophy has a very similar method: it approaches philosophical questions by looking at how the corresponding terms are used in ordinary language.

Many methods in philosophy rely on some form of intuition. They are used, for example, to evaluate thought experiments, which involve imagining situations to assess their possible consequences in order to confirm or refute philosophical theories. The method of reflective equilibrium tries to form a coherent perspective by examining and reevaluating all the relevant beliefs and intuitions. Pragmatists focus on the practical consequences of philosophical theories to assess whether they are true or false. Experimental philosophy is a recently developed approach that uses the methodology of social psychology and the cognitive sciences for gathering empirical evidence and justifying philosophical claims.

Mathematics

In the field of mathematics, various methods can be distinguished, such as synthetic, analytic, deductive, inductive, and heuristic methods. For example, the difference between synthetic and analytic methods is that the former start from the known and proceed to the unknown while the latter seek to find a path from the unknown to the known. Geometry textbooks often proceed using the synthetic method. They start by listing known definitions and axioms and proceed by taking inferential steps, one at a time, until the solution to the initial problem is found. An important advantage of the synthetic method is its clear and short logical exposition. One disadvantage is that it is usually not obvious in the beginning that the steps taken lead to the intended conclusion. This may then come as a surprise to the reader since it is not explained how the mathematician knew in the beginning which steps to take. The analytic method often reflects better how mathematicians actually make their discoveries. For this reason, it is often seen as the better method for teaching mathematics. It starts with the intended conclusion and tries to find another formula from which it can be deduced. It then goes on to apply the same process to this new formula until it has traced back all the way to already proven theorems. The difference between the two methods concerns primarily how mathematicians think and present their proofs. The two are equivalent in the sense that the same proof may be presented either way.
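To make the contrast concrete, the following sketch presents the same elementary inequality in both styles; the example is invented for illustration and is not drawn from any particular textbook.

```latex
% Analytic presentation: start from the intended conclusion and reduce it,
% step by step, to something already known.
\begin{align*}
\text{To show: } (a+b)^2 &\ge 4ab \\
\iff a^2 + 2ab + b^2 &\ge 4ab \\
\iff a^2 - 2ab + b^2 &\ge 0 \\
\iff (a-b)^2 &\ge 0, \quad \text{which holds for all real } a, b.
\end{align*}

% Synthetic presentation: start from the known fact and deduce the result.
\begin{align*}
(a-b)^2 &\ge 0 \\
\Longrightarrow \quad a^2 - 2ab + b^2 &\ge 0 \\
\Longrightarrow \quad a^2 + 2ab + b^2 &\ge 4ab \\
\Longrightarrow \quad (a+b)^2 &\ge 4ab.
\end{align*}
```

Read on its own, the synthetic version is short and logically transparent, but it gives no hint of why the proof begins with (a-b)^2 ≥ 0; the analytic version records how that starting point was found.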

Statistics

Statistics investigates the analysis, interpretation, and presentation of data. It plays a central role in many forms of quantitative research that have to deal with the data of many observations and measurements. In such cases, data analysis is used to cleanse, transform, and model the data to arrive at practically useful conclusions. There are numerous methods of data analysis. They are usually divided into descriptive statistics and inferential statistics. Descriptive statistics restricts itself to the data at hand. It tries to summarize the most salient features and present them in insightful ways. This can happen, for example, by visualizing the data's distribution or by calculating indices such as the mean or the standard deviation. Inferential statistics, on the other hand, uses the data from a sample to draw inferences about the population at large. That can take the form of making generalizations and predictions or of assessing the probability of a concrete hypothesis.
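As an illustration of this distinction, the following Python sketch computes a descriptive summary of a sample and then draws a simple inference about the underlying population; the sample values are invented, and the confidence interval uses a plain normal approximation chosen only for the example.

```python
# A minimal sketch of descriptive vs. inferential statistics using only
# Python's standard library. The data are invented for illustration.
from statistics import mean, stdev, NormalDist

sample = [4.1, 5.0, 4.6, 5.3, 4.8, 5.1, 4.4, 4.9, 5.2, 4.7]

# Descriptive statistics: summarize the data at hand.
m = mean(sample)   # arithmetic mean of the sample
s = stdev(sample)  # sample standard deviation
print(f"mean = {m:.2f}, standard deviation = {s:.2f}")

# Inferential statistics: use the sample to say something about the
# population it was drawn from, here via an approximate 95% confidence
# interval for the population mean (normal approximation).
n = len(sample)
z = NormalDist().inv_cdf(0.975)  # about 1.96 for a 95% interval
half_width = z * s / n ** 0.5
print(f"95% confidence interval for the population mean: "
      f"({m - half_width:.2f}, {m + half_width:.2f})")
```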

Pedagogy

Pedagogy can be defined as the study or science of teaching methods. In this regard, it is the methodology of education: it investigates the methods and practices that can be applied to fulfill the aims of education. These aims include the transmission of knowledge as well as fostering skills and character traits. Its main focus is on teaching methods in the context of regular schools. But in its widest sense, it encompasses all forms of education, both inside and outside schools. In this wide sense, pedagogy is concerned with "any conscious activity by one person designed to enhance learning in another". The teaching happening this way is a process taking place between two parties: teachers and learners. Pedagogy investigates how the teacher can help the learner undergo experiences that promote their understanding of the subject matter in question.

Various influential pedagogical theories have been proposed. Mental-discipline theories were already common in ancient Greece and state that the main goal of teaching is to train intellectual capacities. They are usually based on a certain ideal of the capacities, attitudes, and values possessed by educated people. According to naturalistic theories, there is an inborn natural tendency in children to develop in a certain way. For them, pedagogy is about how to help this process happen by ensuring that the required external conditions are set up. Herbartianism identifies five essential components of teaching: preparation, presentation, association, generalization, and application. They correspond to different phases of the educational process: getting ready for it, showing new ideas, bringing these ideas into relation with known ideas, understanding the general principle behind their instances, and putting what one has learned into practice. Learning theories focus primarily on how learning takes place and formulate the proper methods of teaching based on these insights. One of them is apperception or association theory, which understands the mind primarily in terms of associations between ideas and experiences. On this view, the mind is initially a blank slate. Learning is a form of developing the mind by helping it establish the right associations. Behaviorism is a more externally oriented learning theory. It identifies learning with classical conditioning, in which the learner's behavior is shaped by presenting them with a stimulus with the goal of evoking and solidifying the desired response pattern to this stimulus.

The choice of which specific method is best to use depends on various factors, such as the subject matter and the learner's age. Interest and curiosity on the side of the student are among the key factors of learning success. This means that one important aspect of the chosen teaching method is to ensure that these motivational forces are maintained, through intrinsic or extrinsic motivation. Many forms of education also include regular assessment of the learner's progress, for example, in the form of tests. This helps to ensure that the teaching process is successful and to make adjustments to the chosen method if necessary.

Related concepts

Methodology has several related concepts, such as paradigm and algorithm. In the context of science, a paradigm is a conceptual worldview. It consists of a number of basic concepts and general theories that determine how the studied phenomena are to be conceptualized and which scientific methods are considered reliable for studying them. Various theorists emphasize that methodologies play a similar role: they shape the general outlook on the studied phenomena and help the researcher see them in a new light.

In computer science, an algorithm is a procedure or methodology to reach the solution of a problem with a finite number of steps. Each step has to be precisely defined so it can be carried out in an unambiguous manner for each application. For example, the Euclidean algorithm is an algorithm that solves the problem of finding the greatest common divisor of two integers. It is based on simple steps like comparing the two numbers and subtracting one from the other.
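As a sketch, the Euclidean algorithm can be written in a few lines of Python, once in the subtraction form described above and once in the more common remainder form; the function names are chosen only for this example.

```python
def gcd_subtraction(a: int, b: int) -> int:
    """Greatest common divisor via the subtraction form of the Euclidean
    algorithm: repeatedly compare the two numbers and subtract the smaller
    from the larger until they are equal. Assumes positive integers."""
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
    return a


def gcd_remainder(a: int, b: int) -> int:
    """The usual modern formulation: replace repeated subtraction by the
    remainder operation, which performs several subtractions in one step."""
    while b != 0:
        a, b = b, a % b
    return a


print(gcd_subtraction(48, 18))  # 6
print(gcd_remainder(48, 18))    # 6
```

Both versions terminate after a finite number of precisely defined steps, which is what makes them algorithms in the sense described above.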

Deliberative democracy

From Wikipedia, the free encyclopedia

Deliberative democracy or discursive democracy is a form of democracy in which deliberation is central to decision-making. Deliberative democracy seeks quality over quantity by limiting decision-makers to a smaller but more representative sample of the population that is given the time and resources to focus on one issue.

It often adopts elements of both consensus decision-making and majority rule. Deliberative democracy differs from traditional democratic theory in that authentic deliberation, not mere voting, is the primary source of legitimacy for the law. Deliberative democracy is related to consultative democracy, in which public consultation with citizens is central to democratic processes. The distance between deliberative democracy and concepts like representative democracy or direct democracy is debated. While some practitioners and theorists use deliberative democracy to describe elected bodies whose members propose and enact legislation, Hélène Landemore and others increasingly use deliberative democracy to refer to decision-making by randomly-selected lay citizens with equal power.

Deliberative democracy has a long history of practice and theory traced back to ancient times, with an increase in academic attention in the 1990s, and growing implementations since 2010. Joseph M. Bessette has been credited with coining the term in his 1980 work Deliberative Democracy: The Majority Principle in Republican Government.

Overview

Deliberative democracy holds that, for a democratic decision to be legitimate, it must be preceded by authentic deliberation, not merely the aggregation of preferences that occurs in voting. Authentic deliberation is deliberation among decision-makers that is free from distortions of unequal political power, such as power a decision-maker obtains through economic wealth or the support of interest groups.

The roots of deliberative democracy can be traced back to Aristotle and his notion of politics; however, the German philosopher Jürgen Habermas' work on communicative rationality and the public sphere is often identified as a major work in this area.

Deliberative democracy can be practiced by decision-makers in both representative democracies and direct democracies. In elitist deliberative democracy, principles of deliberative democracy apply to elite societal decision-making bodies, such as legislatures and courts; in populist deliberative democracy, principles of deliberative democracy apply to groups of lay citizens who are empowered to make decisions. One purpose of populist deliberative democracy can be to use deliberation among a group of lay citizens to distill a more authentic public opinion about societal issues for other decision-makers to consider; devices such as the deliberative opinion poll have been designed to achieve this goal. Another purpose of populist deliberative democracy can be, like direct democracy, to produce binding law directly. If political decisions are made by deliberation but not by the people themselves or their elected representatives, then there is no democratic element; this deliberative process is called elite deliberation.

James Fearon and Portia Pedro believe deliberative processes most often generate ideal conditions of impartiality, rationality, and knowledge of the relevant facts, resulting in more morally correct outcomes. Former diplomat Carne Ross contends that the processes are more civil, collaborative, and evidence-based than the debates in traditional town hall meetings or in internet forums if citizens know their debates will impact society. Some fear the influence of a skilled orator.

Characteristics

Fishkin's model of deliberation

James Fishkin, who has designed practical implementations of deliberative democracy through deliberative polling for over 15 years in various countries, describes five characteristics essential for legitimate deliberation:

  • Information: The extent to which participants are given access to reasonably accurate information that they believe to be relevant to the issue
  • Substantive balance: The extent to which arguments offered by one side or from one perspective are answered by considerations offered by those who hold other perspectives
  • Diversity: The extent to which the major positions in the public are represented by participants in the discussion
  • Conscientiousness: The extent to which participants sincerely weigh the merits of the arguments
  • Equal consideration: The extent to which arguments offered by all participants are considered on the merits regardless of which participants offer them

Studies by James Fishkin and others have concluded that deliberative democracy tends to produce outcomes which are superior to those in other forms of democracy. Desirable outcomes in their research include less partisanship and more sympathy with opposing views; more respect for evidence-based reasoning rather than opinion; a greater commitment to the decisions taken by those involved; and a greater chance for widely shared consensus to emerge, thus promoting social cohesion between people from different backgrounds. Fishkin cites extensive empirical support for the increase in public spiritedness that is often caused by participation in deliberation, and says theoretical support can be traced back to foundational democratic thinkers such as John Stuart Mill and Alexis de Tocqueville.

Cohen's outline

Joshua Cohen, a student of John Rawls, argued that the five main features of deliberative democracy include:

  1. An ongoing independent association with expected continuation.
  2. The citizens in the democracy structure their institutions such that deliberation is the deciding factor in the creation of the institutions and the institutions allow deliberation to continue.
  3. A commitment to the respect of a pluralism of values and aims within the polity.
  4. The citizens consider deliberative procedure as the source of legitimacy, and prefer the causal history of legitimation for each law to be transparent and easily traceable to the deliberative process.
  5. Each member recognizes and respects other members' deliberative capacity.

Cohen presents deliberative democracy as more than a theory of legitimacy, and forms a body of substantive rights around it based on achieving "ideal deliberation":

  1. It is free in two ways:
    1. The participants consider themselves bound solely by the results and preconditions of the deliberation. They are free from any authority of prior norms or requirements.
    2. The participants suppose that they can act on the decision made; the deliberative process is a sufficient reason to comply with the decision reached.
  2. Parties to deliberation are required to state reasons for their proposals, and proposals are accepted or rejected based on the reasons given, as the content of the very deliberation taking place.
  3. Participants are equal in two ways:
    1. Formal: anyone can put forth proposals, criticize, and support measures. There is no substantive hierarchy.
    2. Substantive: The participants are not limited or bound by certain distributions of power, resources, or pre-existing norms. "The participants…do not regard themselves as bound by the existing system of rights, except insofar as that system establishes the framework of free deliberation among equals."
  4. Deliberation aims at a rationally motivated consensus: it aims to find reasons acceptable to all who are committed to such a system of decision-making. When consensus or something near enough is not possible, majoritarian decision making is used.

In Democracy and Liberty, an essay published in 1998, Cohen updated his idea of pluralism to "reasonable pluralism" – the acceptance of different, incompatible worldviews and the importance of good faith deliberative efforts to ensure that as far as possible the holders of these views can live together on terms acceptable to all.

Gutmann and Thompson's model

Amy Gutmann and Dennis F. Thompson's definition captures the elements that are found in most conceptions of deliberative democracy. They define it as "a form of government in which free and equal citizens and their representatives justify decisions in a process in which they give one another reasons that are mutually acceptable and generally accessible, with the aim of reaching decisions that are binding on all at present but open to challenge in the future".

They state that deliberative democracy has four requirements, which refer to the kind of reasons that citizens and their representatives are expected to give to one another:

  1. Reciprocal. The reasons should be acceptable to free and equal persons seeking fair terms of cooperation.
  2. Accessible. The reasons must be given in public and the content must be understandable to the relevant audience.
  3. Binding. The reason-giving process leads to a decision or law that is enforced for some period of time. The participants do not deliberate just for the sake of deliberation or for individual enlightenment.
  4. Dynamic or Provisional. The participants must keep open the possibility of changing their minds, and continuing a reason-giving dialogue that can challenge previous decisions and laws.

Standards of good deliberation - from first to second generation (Bächtiger et al., 2018)

For Bächtiger, Dryzek, Mansbridge and Warren, the ideal standards of "good deliberation" which deliberative democracy should strive towards have changed:

Standards for "good deliberation"
First generation Second generation
Respect Unchallenged, unrevised
Absence of power Unchallenged, unrevised
Equality Inclusion, mutual respect, equal communicative freedom, equal opportunity for influence
Reasons Relevant considerations
Aim at consensus Aim at both consensus and clarifying conflict
Common good orientation Orientation to both common good and self-interest constrained by fairness
Publicity Publicity in many conditions, but not all (e.g. in negotiations when representatives can be trusted)
Accountability Accountability to constituents when elected, to other participants and citizens when not elected
Sincerity Sincerity in matters of importance; allowable insincerity in greetings, compliments, and other communications intended to increase sociality

History

Early examples

Consensus-based decision making similar to deliberative democracy has been found in different degrees and variations throughout the world going back millennia. The most discussed early example of deliberative democracy arose in Greece as Athenian democracy during the sixth century BC. Athenian democracy was both deliberative and largely direct: some decisions were made by representatives but most were made by "the people" directly. Athenian democracy came to an end in 322 BC. Even some 18th century leaders advocating for representative democracy mention the importance of deliberation among elected representatives.

Recent scholarship

[Image: call for the establishment of deliberative democracy seen at the Rally to Restore Sanity and/or Fear]

The deliberative element of democracy was not widely studied by academics until the late 20th century. According to Professor Stephen Tierney, perhaps the earliest notable example of academic interest in the deliberative aspects of democracy occurred in John Rawls's 1971 work A Theory of Justice. Joseph M. Bessette has been credited with coining the term "deliberative democracy" in his 1980 work Deliberative Democracy: The Majority Principle in Republican Government, and went on to elaborate and defend the notion in "The Mild Voice of Reason" (1994). In the 1990s, deliberative democracy began to attract substantial attention from political scientists. According to Professor John Dryzek, early work on deliberative democracy was part of efforts to develop a theory of democratic legitimacy. Theorists such as Carne Ross advocate deliberative democracy as a complete alternative to representative democracy. The more common view, held by contributors such as James Fishkin, is that direct deliberative democracy can be complementary to traditional representative democracy. Others contributing to the notion of deliberative democracy include Carlos Nino, Jon Elster, Roberto Gargarella, John Gastil, Jürgen Habermas, David Held, Joshua Cohen, Amy Gutmann, Noëlle McAfee, Rense Bos, Jane Mansbridge, Jose Luis Marti, Dennis Thompson, Benny Hjern, Hal Koch, Seyla Benhabib, Ethan Leib, Charles Sabel, Jeffrey K. Tulis, David Estlund, Mariah Zeisberg, Jeffrey L. McNairn, Iris Marion Young, Robert B. Talisse, and Hélène Landemore.

Although political theorists took the lead in the study of deliberative democracy, political scientists have in recent years begun to investigate its processes. One of the main challenges currently is to discover more about the actual conditions under which the ideals of deliberative democracy are more or less likely to be realized.

Drawing on the work of Hannah Arendt, Shmuel Lederman laments the fact that "deliberation and agonism have become almost two different schools of thought" that are discussed as "mutually exclusive conceptions of politics" as seen in the works of Chantal Mouffe, Ernesto Laclau, and William E. Connolly. Giuseppe Ballacci argues that agonism and deliberation are not only compatible but mutually dependent: "a properly understood agonism requires the use of deliberative skills but also that even a strongly deliberative politics could not be completely exempt from some of the consequences of agonism".

Most recently, scholarship has focused on the emergence of a 'systemic approach' to the study of deliberation. This suggests that the deliberative capacity of a democratic system needs to be understood through the interconnection of the variety of sites of deliberation which exist, rather than through any single setting. Some studies have conducted experiments to examine how deliberative democracy addresses the problems of sustainability and underrepresentation of future generations. Although not always the case, participation in deliberation has been found to shift participants' opinions in favour of environmental positions.

Modern examples

The OECD has documented nearly 300 examples (1986-2019) and finds their use increasing since 2010. For example, a representative sample of 4,000 lay citizens used a 'Citizens' congress' to coalesce around a plan for rebuilding New Orleans after Hurricane Katrina.
