
Thursday, August 30, 2018

Information theory

From Wikipedia, the free encyclopedia

Information theory studies the quantification, storage, and communication of information. It was originally proposed by Claude E. Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication". Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for digital subscriber line (DSL)). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.

A key measure in information theory is "entropy". Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.

The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection. Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.

Overview

Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty. In the case of communication of information over a noisy channel, this abstract concept was made concrete in 1948 by Claude Shannon in his paper "A Mathematical Theory of Communication", in which "information" is thought of as a set of possible messages, where the goal is to send these messages over a noisy channel, and then to have the receiver reconstruct the message with low probability of error, in spite of the channel noise. Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent merely on the statistics of the channel over which the messages are sent.

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.

Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible. A third class of information theory codes are cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis. See the article ban (unit) for a historical application.
 
Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition.

Historical background

The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m (recalling Boltzmann's constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.

Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.

In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that
"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
With it came the ideas of:
  • the information entropy and redundancy of a source, and its relevance through the source coding theorem;
  • the mutual information and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
  • the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; and
  • the bit, a new way of seeing the most fundamental unit of information.

Quantities of information

Information theory is based on probability theory and statistics. Information theory often concerns itself with measures of information of the distributions associated with random variables. Important quantities of information are entropy, a measure of information in a single random variable, and mutual information, a measure of information in common between two random variables. The former quantity is a property of the probability distribution of a random variable and gives a limit on the rate at which data generated by independent samples with the given distribution can be reliably compressed. The latter is a property of the joint distribution of two random variables, and is the maximum rate of reliable communication across a noisy channel in the limit of long block lengths, when the channel statistics are determined by the joint distribution.

The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. A common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the decimal digit, which is based on the common logarithm.

In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because \lim_{p \to 0^+} p \log p = 0 for any logarithmic base.

Entropy of an information source

Based on the probability mass function of each source symbol to be communicated, the Shannon entropy H, in units of bits (per symbol), is given by

H = -\sum_i p_i \log_2 p_i
where p_i is the probability of occurrence of the i-th possible value of the source symbol. This equation gives the entropy in the units of "bits" (per symbol) because it uses a logarithm of base 2, and this base-2 measure of entropy has sometimes been called the "shannon" in his honor. Entropy is also commonly computed using the natural logarithm (base e, where e is Euler's number), which produces a measurement of entropy in "nats" per symbol and sometimes simplifies the analysis by avoiding the need to include extra constants in the formulas. Other bases are also possible, but less commonly used. For example, a logarithm of base 2^8 = 256 will produce a measurement in bytes per symbol, and a logarithm of base 10 will produce a measurement in decimal digits (or hartleys) per symbol.

Intuitively, the entropy H(X) of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X when only its distribution is known.

The entropy of a source that emits a sequence of N symbols that are independent and identically distributed (iid) is N·H bits (per message of N symbols). If the source data symbols are identically distributed but not independent, the entropy of a message of length N will be less than N·H.
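As a concrete illustration, here is a minimal Python sketch of this formula (the function name and the example distributions are ours, not from the article); the base argument selects the unit:

import math

def entropy(probs, base=2):
    # Shannon entropy of a probability mass function. Terms with p = 0
    # contribute nothing, by the convention that p*log(p) -> 0 as p -> 0+.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))          # fair coin: 1.0 bit per symbol
print(entropy([1/6] * 6))           # fair die: ~2.585 bits per symbol
print(entropy([1/6] * 6, math.e))   # the same die: ~1.792 nats per symbol
print(entropy([1/6] * 6, 10))       # the same die: ~0.778 hartleys per symbol
# An iid source emitting N such symbols carries N times these values.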

The entropy of a Bernoulli trial as a function of success probability, often called the binary entropy function, H_b(p). The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.

If one transmits 1000 bits (0s and 1s), and the value of each of these bits is known to the receiver (has a specific value with certainty) ahead of transmission, it is clear that no information is transmitted. If, however, each bit is independently equally likely to be 0 or 1, 1000 shannons of information (more often called bits) have been transmitted. Between these two extremes, information can be quantified as follows. If 𝕏 is the set of all messages {x_1, …, x_n} that X could be, and p(x) is the probability of some x ∈ 𝕏, then the entropy, H, of X is defined:

H(X) = \mathbb{E}_X[I(x)] = -\sum_{x \in \mathbb{X}} p(x) \log p(x).
(Here, I(x) is the self-information, which is the entropy contribution of an individual message, and 𝔼_X is the expected value.) A property of entropy is that it is maximized when all the messages in the message space are equiprobable, p(x) = 1/n (i.e., most unpredictable), in which case H(X) = log n.
The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2, thus having the shannon (Sh) as unit:
H_b(p) = -p \log_2 p - (1 - p) \log_2 (1 - p).
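A quick numeric check of this function (a sketch; the names are ours) confirms the behavior described above:

import math

def binary_entropy(p):
    # H_b(p) = -p*log2(p) - (1-p)*log2(1-p), with H_b(0) = H_b(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}: H_b = {binary_entropy(p):.3f} Sh")
# The maximum is 1 Sh per trial at p = 0.5, so 1000 independent fair bits
# carry 1000 shannons, while 1000 predetermined bits carry none.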

Joint entropy

The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing: (X, Y). This implies that if X and Y are independent, then their joint entropy is the sum of their individual entropies.

For example, if (X, Y) represents the position of a chess piece — X the row and Y the column, then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece.
H(X, Y) = \mathbb{E}_{X,Y}[-\log p(x, y)] = -\sum_{x, y} p(x, y) \log p(x, y)
Despite similar notation, joint entropy should not be confused with cross entropy.
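For instance, a short sketch with a made-up joint distribution (the numbers are purely illustrative) shows that the joint entropy never exceeds the sum of the individual entropies, with equality exactly when X and Y are independent:

import math

def H(probs):
    # Entropy in bits; zero-probability terms are skipped by convention.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint pmf p(x, y) over two binary variables.
p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
p_x = [0.75, 0.25]    # marginal distribution of X
p_y = [0.625, 0.375]  # marginal distribution of Y

print(H(p_xy.values()))  # H(X, Y) = 1.75 bits
print(H(p_x) + H(p_y))   # ~1.766 bits: >= H(X, Y) since X and Y are dependent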

Conditional entropy (equivocation)

The conditional entropy or conditional uncertainty of X given random variable Y (also called the equivocation of X about Y) is the average conditional entropy over Y:
H(X|Y) = \mathbb{E}_Y[H(X|y)] = -\sum_{y \in Y} p(y) \sum_{x \in X} p(x|y) \log p(x|y) = -\sum_{x, y} p(x, y) \log p(x|y).
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that:
H(X|Y) = H(X, Y) - H(Y).
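Continuing the illustrative joint distribution from the sketch above, this identity can be verified numerically:

import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
p_y = {0: 0.625, 1: 0.375}

# Direct definition: H(X|Y) = -sum over (x, y) of p(x, y) * log2 p(x|y).
h_cond = -sum(p * math.log2(p / p_y[y]) for (x, y), p in p_xy.items())
print(h_cond)                              # ~0.796 bits
print(H(p_xy.values()) - H(p_y.values()))  # the same, via H(X|Y) = H(X,Y) - H(Y)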

Mutual information (transinformation)

Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of X relative to Y is given by:
I(X; Y) = \mathbb{E}_{X,Y}[SI(x, y)] = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
where SI (Specific mutual Information) is the pointwise mutual information.

A basic property of the mutual information is that
I(X; Y) = H(X) - H(X|Y).
That is, knowing Y, we can save an average of I(X; Y) bits in encoding X compared to not knowing Y.

Mutual information is symmetric:
I(X; Y) = I(Y; X) = H(X) + H(Y) - H(X, Y).
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) between the posterior probability distribution of X given the value of Y and the prior distribution on X:
I(X; Y) = \mathbb{E}_{p(y)}[D_{\mathrm{KL}}(p(X|Y=y) \| p(X))].
In other words, this is a measure of how much, on the average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:
I(X; Y) = D_{\mathrm{KL}}(p(X, Y) \| p(X) p(Y)).
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ2 test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
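These identities can be checked on the same illustrative joint distribution used in the earlier sketches:

import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
p_x = {0: 0.75, 1: 0.25}
p_y = {0: 0.625, 1: 0.375}

# Definition: I(X;Y) = sum of p(x,y) * log2[ p(x,y) / (p(x) p(y)) ].
i_xy = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())
print(i_xy)  # ~0.0157 bits: these two variables share very little information
print(H(p_x.values()) + H(p_y.values()) - H(p_xy.values()))  # the same value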

Kullback–Leibler divergence (information gain)

The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the average number of additional bits per datum necessary for compression. It is thus defined
D_{\mathrm{KL}}(p(X) \| q(X)) = \sum_{x \in X} -p(x) \log q(x) - \sum_{x \in X} -p(x) \log p(x) = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}.
Although it is sometimes used as a 'distance metric', KL divergence is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).

Another interpretation of the KL divergence is the "unnecessary surprise" introduced by a prior from the truth: suppose a number X is about to be drawn randomly from a discrete set with probability distribution p(x). If Alice knows the true distribution p(x), while Bob believes (has a prior) that the distribution is q(x), then Bob will be more surprised than Alice, on average, upon seeing the value of X. The KL divergence is the (objective) expected value of Bob's (subjective) surprisal minus Alice's surprisal, measured in bits if the log is in base 2. In this way, the extent to which Bob's prior is "wrong" can be quantified in terms of how "unnecessarily surprised" it is expected to make him.
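A small sketch (the distributions p and q are invented for illustration) makes both the "extra bits" reading and the asymmetry concrete:

import math

def kl_divergence(p, q):
    # D_KL(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Alice's true distribution versus Bob's mistaken uniform prior.
p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]

print(kl_divergence(p, q))  # ~0.085 bits of "unnecessary surprise" per draw
print(kl_divergence(q, p))  # ~0.082 bits: not symmetric, hence not a metric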

Other quantities

Other important information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.

Coding theory

A picture showing scratches on the readable surface of a CD-R. Music and data CDs are coded using error-correcting codes and thus can still be read even if they have minor scratches, thanks to error detection and correction.

Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
  • Data compression (source coding): There are two formulations for the compression problem:
  • lossless data compression: the data must be reconstructed exactly;
  • lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called rate–distortion theory.
  • Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal. Network information theory refers to these multi-agent communication models.

Source theory

Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically distributed random variable, whereas the properties of ergodicity and stationarity impose less restrictive constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.

Rate

Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is
r = \lim_{n \to \infty} H(X_n | X_{n-1}, X_{n-2}, X_{n-3}, \ldots);
that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is
r = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n);
that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result.
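For a stationary Markov source, the first limit reduces to a weighted average of per-state conditional entropies, which the following sketch evaluates for a hypothetical two-state chain (the transition matrix is invented for illustration):

import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Row i of P gives p(next symbol | current symbol = i).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [5/6, 1/6]  # stationary distribution, solving pi P = pi

# Entropy rate of a stationary Markov source: r = sum_i pi_i * H(P[i]).
rate = sum(pi_i * H(row) for pi_i, row in zip(pi, P))
print(rate)   # ~0.558 bits per symbol
print(H(pi))  # ~0.650 bits per symbol: ignoring the memory overstates the rate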

It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.

Channel capacity

Communication over a channel, such as an ethernet cable, is the primary motivation of information theory. As anyone who's ever used a telephone (mobile or landline) knows, however, such channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality.

Consider the communications process over a discrete channel.


Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:
C = \max_f I(X; Y).
This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.
Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.
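For small channels the maximization can be done by brute force. The sketch below (the function names and grid resolution are ours; the standard method for this task is the Blahut–Arimoto algorithm) recovers the well-known capacity 1 − H_b(0.1) of a binary symmetric channel with crossover probability 0.1:

import math

def mutual_information(f, channel):
    # I(X;Y) in bits for input pmf f[x] and transition rows channel[x][y] = p(y|x).
    ny = len(channel[0])
    py = [sum(f[x] * channel[x][y] for x in range(len(f))) for y in range(ny)]
    return sum(f[x] * channel[x][y] * math.log2(channel[x][y] / py[y])
               for x in range(len(f)) for y in range(ny)
               if f[x] > 0 and channel[x][y] > 0)

# Binary symmetric channel: each input bit is flipped with probability 0.1.
bsc = [[0.9, 0.1],
       [0.1, 0.9]]

# Brute-force the maximization over input distributions f = (q, 1 - q).
capacity = max(mutual_information([q, 1 - q], bsc)
               for q in (i / 1000 for i in range(1, 1000)))
print(capacity)  # ~0.531 bits per channel use
print(1 - (-0.1 * math.log2(0.1) - 0.9 * math.log2(0.9)))  # closed form 1 - H_b(0.1)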

Capacity of particular channel models

  • A binary symmetric channel (BSC) with crossover probability p is a binary input, binary output channel that flips the input bit with probability p. The BSC has a capacity of 1 − H_b(p) bits per channel use, where H_b is the binary entropy function defined above.
  • A binary erasure channel (BEC) with erasure probability p is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is 1 − p bits per channel use.
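The 1 − p figure can be checked directly: for the BEC the uniform input is optimal, so evaluating the mutual information once suffices (a self-contained sketch repeating the helper from above):

import math

def mutual_information(f, channel):
    # I(X;Y) in bits for input pmf f[x] and transition rows channel[x][y] = p(y|x).
    ny = len(channel[0])
    py = [sum(f[x] * channel[x][y] for x in range(len(f))) for y in range(ny)]
    return sum(f[x] * channel[x][y] * math.log2(channel[x][y] / py[y])
               for x in range(len(f)) for y in range(ny)
               if f[x] > 0 and channel[x][y] > 0)

# Binary erasure channel with erasure probability p = 0.2; the output
# alphabet is {0, e, 1}, a binary-input, ternary-output channel.
p = 0.2
bec = [[1 - p, p, 0.0],
       [0.0, p, 1 - p]]

print(mutual_information([0.5, 0.5], bec))  # 0.8 = 1 - p bits per channel use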

Applications to other fields

Intelligence uses and secrecy applications

Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of World War II in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability.

Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods currently comes from the assumption that no known attack can break them in a practical amount of time.

Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.

Pseudorandom number generation

Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptography uses.
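The gap between these measures is easy to exhibit. In the sketch below (the skewed distribution is invented for illustration), min-entropy is −log2 of the single most probable outcome, so a source can look quite random on average while remaining easy to guess:

import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    # H_min = -log2(max_x p(x)): the pessimistic measure relevant to extractors.
    return -math.log2(max(probs))

# A skewed source: one heavy outcome plus 2**20 equally unlikely ones.
n = 2**20
probs = [0.5] + [0.5 / n] * n

print(shannon_entropy(probs))  # 11.0 bits: high average uncertainty
print(min_entropy(probs))      # 1.0 bit: an adversary guesses right half the time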

Seismic exploration

One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.

Semiotics

Concepts from information theory such as redundancy and code control have been used by semioticians such as Umberto Eco and Ferruccio Rossi-Landi to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy such that only one message is decoded among a selection of competing ones.

Miscellaneous applications

Information theory also has applications in gambling and investing, black holes, and bioinformatics.

Wednesday, August 29, 2018

21st century skills

From Wikipedia, the free encyclopedia
 
 
P21's Framework for 21st Century Learning

21st century skills comprise skills, abilities, and learning dispositions that have been identified as being required for success in 21st century society and workplaces by educators, business leaders, academics, and governmental agencies. This is part of a growing international movement focusing on the skills students must master in preparation for success in a rapidly changing, digital society. Many of these skills are also associated with deeper learning, which is based on mastering skills such as analytic reasoning, complex problem solving, and teamwork. These skills differ from traditional academic skills in that they are not primarily content knowledge-based.

During the latter decades of the 20th century and into the 21st century, society has undergone an accelerating pace of change in economy and technology. Its effects on the workplace, and thus on the demands on the educational system preparing students for the workforce, have been significant in several ways. Beginning in the 1980s, government, educators, and major employers issued a series of reports identifying key skills and implementation strategies to steer students and workers towards meeting the demands of the changing workplace and society.

The current workforce is significantly more likely to change career fields or jobs. Those in the Baby Boom generation entered the workforce with a goal of stability; subsequent generations are more concerned with finding happiness and fulfillment in their work lives. Young workers in North America are now likely to change jobs at a much higher rate than previously, as much as once every 4.4 years on average. With this employment mobility comes a demand for different skills, ones that enable people to be flexible and adaptable in different roles or in different career fields.

As western economies have transformed from industrial-based to service-based, trades and vocations have smaller roles. However, specific hard skills and mastery of particular skill sets, with a focus on digital literacy, are in increasingly high demand. People skills that involve interaction, collaboration, and managing others are increasingly important. Skills that enable people to be flexible and adaptable in different roles or in different fields, those that involve processing information and managing people more than manipulating equipment, in an office or a factory, are in greater demand. These are also referred to as "applied skills" or "soft skills", including personal, interpersonal, or learning-based skills, such as life skills (problem-solving behaviors), people skills, and social skills. The skills have been grouped into three main areas:
  • learning and innovation skills
  • digital literacy skills
  • career and life skills
Many of these skills are also identified as key qualities of progressive education, a pedagogical movement that began in the late nineteenth century and continues in various forms to the present.

Background

Since the early 1980s, a variety of governmental, academic, non-profit, and corporate entities have conducted considerable research to identify key personal and academic skills and competencies they determined were needed for the current and next generation. The identification and implementation of 21st century skills into education and workplaces began in the United States but has spread to Canada, the United Kingdom, New Zealand, and through national and international organizations such as APEC and the OECD.

In 1981, the US Secretary of Education created the National Commission on Excellence in Education to examine the quality of education in the United States. The commission issued its report A Nation at Risk: The Imperative for Educational Reform in 1983. A key finding was that "educational reform should focus on the goal of creating a Learning Society." The report's recommendations included instructional content and skills:

Five New Basics: English, Mathematics, Science, Social Studies, Computer Science
 
Other Curriculum Matters: Develop proficiency, rigor, and skills in Foreign Languages, Performing Arts, Fine Arts, Vocational Studies, and the pursuit of higher level education.
 
Skills and abilities (consolidated):
  • enthusiasm for learning
  • deep understanding
  • application of learning
  • examination, inquiry, critical thinking and reasoning
  • communication – write well, listen effectively, discuss intelligently, be proficient in a foreign language
  • cultural, social, and environmental – understanding and implications
  • technology – understand the computer as an information, computation, and communication device, and the world of computers, electronics, and related technologies
  • diverse learning across a broad range – fine arts, performing arts, and vocational
Until the dawn of the 21st century, education systems across the world focused on preparing their students to accumulate content and knowledge. As a result, schools focused on providing literacy and numeracy skills to their students, as these skills were perceived as necessary to gain content and knowledge. Recent developments in technology and telecommunication have made information and knowledge ubiquitous and easily accessible in the 21st century. Therefore, while skills such as literacy and numeracy are still relevant and necessary, they are no longer sufficient. In order to respond to technological, demographic, and socio-economic changes, education systems began to make the shift toward providing their students with a range of skills that relied not only on cognition but also on the interdependencies of cognitive, social, and emotional characteristics.

Notable efforts were conducted by the US Secretary of Labor's Commission on Achieving Necessary Skills (SCANS), a national coalition called the Partnership for 21st Century Skills (P21), the international Organization for Economic Cooperation and Development, the Association of American Colleges and Universities (AAC&U), researchers at MIT and other institutions of higher learning, and private organizations.

Additional research has found that the top skills demanded by U.S. Fortune 500 companies by the year 2000 had shifted from traditional reading, writing, and arithmetic to teamwork, problem solving, and interpersonal skills. A 2006 Conference Board survey of some 400 employers revealed that the most important skills for new workforce entrants included oral and written communications and critical thinking/problem solving, ahead of basic knowledge and skills such as reading comprehension and mathematics. While the ‘three Rs’ were still considered foundational to new workforce entrants’ abilities, employers emphasized that applied skills like collaboration/teamwork and critical thinking were ‘very important’ to success at work.

A 2006 report from MIT researchers countered the suggestion that students acquire critical skills and competencies independently by interacting with popular culture, noting three continuing trends that suggest the need for policy and pedagogical interventions:
  • The Participation Gap — the unequal access to the opportunities, experiences, skills, and knowledge that will prepare youth for full participation in the world of tomorrow.
  • The Transparency Problem — The challenges young people face in learning to see clearly the ways that media shape perceptions of the world.
  • The Ethics Challenge — The breakdown of traditional forms of professional training and socialization that might prepare young people for their increasingly public roles as media makers and community participants.
According to labor economists at MIT and Harvard's Graduate School of Education, the economic changes brought about over the past four decades by emerging technology and globalization have greatly increased employers' demands for people with competencies like complex thinking and communications skills. They argue that the success of the U.S. economy will rely on the nation's ability to give students the "foundational skills in problem-solving and communications that computers don't have."

In 2010, the Common Core State Standards Initiative, an effort sponsored by the National Governors Association (NGA) and the Council of Chief State School Officers (CCSSO), issued the Common Core Standards, calling for the integration of 21st century skills into K-12 curricula across the United States.

The skills

The skills and competencies that are generally considered "21st century skills" are varied but share some common themes. They are based on the premise that effective learning, or deeper learning, requires a set of student educational outcomes including acquisition of robust core academic content, higher-order thinking skills, and learning dispositions. This pedagogy involves creating, working with others, analyzing, and presenting and sharing both the learning experience and the learned knowledge or wisdom, including with peers and mentors as well as teachers. This contrasts with more traditional learning methodology that involves learning by rote and regurgitating information back to the teacher for a grade. The skills are geared towards students and workers to foster engagement; seeking, forging, and facilitating connections to knowledge, ideas, peers, instructors, and wider audiences; creating/producing; and presenting/publishing. The classification or grouping has been undertaken to encourage and promote pedagogies that facilitate deeper learning through both traditional instruction as well as active learning, project-based learning, problem-based learning, and others. A 2012 survey conducted by the American Management Association (AMA) identified three top skills necessary for their employees: critical thinking, communication, and collaboration. Below are some of the more readily identifiable lists of 21st century skills.

Common Core

The Common Core Standards issued in 2010 were intended to support the "application of knowledge through higher-order thinking skills." The initiative's stated goals are to promote the skills and concepts required for college and career readiness in multiple disciplines and life in the global economy. Skills identified for success in the areas of literacy and mathematics:
  • cogent reasoning
  • evidence collection
  • critical-thinking, problem-solving, analytical
  • communication

SCANS

Following the release of A Nation at Risk, the U.S. Secretary of Labor appointed the Secretary's Commission on Achieving Necessary Skills (SCANS) to determine the skills needed for young people to succeed in the workplace and to foster a high-performance economy. SCANS focused on what they called a "learning a living" system. In 1991, they issued their initial report, What Work Requires of Schools. The report concluded that a high-performance workplace requires workers who have three fundamental skills (basic skills and knowledge, thinking skills to apply that knowledge, and personal qualities to manage and perform) and five key workplace competencies.
Fundamental Skills
  • Basic Skills: reads, writes, performs arithmetic and mathematical operations, listens and speaks.
  • Thinking Skills: thinks creatively, makes decisions, solves problems, visualizes, knows how to learn, and reasons
  • Personal Qualities: displays responsibility, self-esteem, sociability, self-management, and integrity and honesty
Workplace Competencies
  • Resources: identifies, organizes, plans, and allocates resources
  • Interpersonal: works with others (participates as member of a team, teaches others new skills, serves clients/customers, exercises leadership, negotiates, works with diversity)
  • Information: acquires and uses information (acquires and evaluates, organizes and maintains, and interprets and communicates information; uses computers to process information)
  • Systems: understands complex inter-relationships (understands systems, monitors and corrects performance, improves or designs systems)
  • Technology: works with a variety of technologies (selects technology, applies technology to task, maintains and troubleshoots equipment)

Partnership for 21st Century Skills (P21)

In 2002 the Partnership for 21st Century Skills (now the Partnership for 21st Century Learning, or P21) was founded as a non-profit organization by a coalition that included members of the national business community, education leaders, and policymakers: the National Education Association (NEA), United States Department of Education, AOL Time Warner Foundation, Apple Computer, Inc., Cable in the Classroom, Cisco Systems, Inc., Dell Computer Corporation, Microsoft Corporation, SAP, Ken Kay (President and Co-Founder), and Dins Golder-Dardis. To foster a national conversation on "the importance of 21st century skills for all students" and "position 21st century readiness at the center of US K-12 education", P21 identified six key skills:
  • Core subjects.
  • 21st century content.
  • Learning and thinking skills.
  • Information and communication technologies (ICT) literacy.
  • Life skills.
  • 21st century assessments.
The 7C skills have been identified by P21 senior fellows Bernie Trilling and Charles Fadel:
  • Critical thinking and problem solving
  • Creativity and innovation
  • Collaboration, teamwork, and leadership
  • Cross-cultural understanding
  • Communications, information, and media literacy
  • Computing and ICT literacy
  • Career and learning self-reliance

The Four Cs

The P21 organization also conducted research that identified deeper learning competencies and skills they called the Four Cs of 21st century learning: critical thinking, communication, collaboration, and creativity. Separately, the University of Southern California's Project New Literacies website lists four different "C" skills:
  • Create
  • Circulate
  • Connect
  • Collaborate

7 Survival Skills

In 2008, author and Harvard Graduate School of Education researcher Tony Wagner identified what he termed the "7 Survival Skills" needed for the modern workplace:
  • Critical thinking and problem solving
  • Collaboration
  • Agility and adaptability
  • Initiative and entrepreneurialism
  • Effective oral and written communication
  • Accessing and analyzing information
  • Curiosity and imagination

Participatory culture & new media literacies

Researchers at MIT, led by Henry Jenkins, Director of the Comparative Media Studies Program, in 2006 issued a white paper ("Confronting the Challenges of a Participatory Culture: Media Education for the 21st Century"), that examined digital media and learning. To address this Digital Divide, they recommended an effort be made to develop the cultural competencies and social skills required to participate fully in modern society instead of merely advocating for installing computers in each classroom. What they term participatory culture shifts this literacy from the individual level to a broader connection and involvement, with the premise that networking and collaboration develop social skills that are vital to new literacies. These in turn build on traditional foundation skills and knowledge taught in school: traditional literacy, research, technical, and critical analysis skills. Participatory culture is defined by this study as having: low barriers to artistic expression and civic engagement, strong support for creating and sharing one’s creations, informal mentorship, belief that members' own contributions matter, and social connection (caring what other people think about their creations). Forms of participatory culture include:
  • Affiliations — memberships, formal and informal, in online communities centered around various forms of media, such as message boards, metagaming, game clans, and other social media.
  • Expressions — producing new creative forms, such as digital sampling, skinning and modding, fan videomaking, fan fiction writing, zines, mash-ups.
  • Collaborative Problem-solving — working together in teams, formal and informal, to complete tasks and develop new knowledge (such as through Wikipedia, alternative reality gaming, spoiling).
  • Circulations — shaping the flow of media (such as podcasting, blogging).
The skills identified were:
  • Play
  • Simulation
  • Appropriation
  • Multitasking
  • Distributed Cognition
  • Collective Intelligence
  • Judgment
  • Transmedia Navigation
  • Networking
  • Negotiation
A 2005 study (Lenhart & Madden) found that more than one-half of all teens have created media content, and roughly one third of teens who use the Internet have shared content they produced, indicating a high degree of involvement in participatory cultures. Such digital literacies emphasize the intellectual activities of a person working with sophisticated information communications technology, rather than proficiency with the tool itself.

EnGauge 21st century skills

In 2003 the North Central Regional Educational Laboratory and the Metiri Group issued a report entitled "enGauge® 21st Century Skills: Literacy in the Digital Age" based on two years of research. The report called for policymakers and educators to define 21st century skills, highlight the relationship of those skills to conventional academic standards, and recognize the need for multiple assessments to measure and evaluate these skills within the context of academic standards and the current technological and global society. To provide a common understanding of, and language for discussing, the needs of students, citizens, and workers in a modern digital society, the report identified four "skill clusters":
  • Digital-Age Literacy
  • Inventive Thinking
  • Effective Communication
  • High Productivity

OECD competencies

In 1997, member countries of the Organization for Economic Cooperation and Development launched the Programme for International Student Assessment (PISA) to monitor "the extent to which students near the end of compulsory schooling have acquired the knowledge and skills essential for full participation in society". In 2005 they identified three "Competency Categories":
  • Using Tools Interactively
  • Interacting in Heterogeneous Groups
  • Acting Autonomously

Association of American Colleges and Universities

The AAC&U conducted several studies and surveys of their members. In 2007 they recommended that graduates of higher education attain four skills, the Essential Learning Outcomes:
  • Knowledge of Human Cultures and the Physical and Natural World
  • Intellectual and Practical Skills
  • Personal and Social Responsibility
  • Integrative Learning
They found that the skills most widely addressed in college and university goals are:
  • writing
  • critical thinking
  • quantitative reasoning
  • oral communication
  • intercultural skills
  • information literacy
  • ethical reasoning
A 2015 survey of AAC&U member institutions added the following goals:
  • analytic reasoning
  • research skills and projects
  • integration of learning across disciplines
  • application of learning beyond the classroom
  • civic engagement and competence

ISTE / NETS performance standards

The ISTE Educational Technology Standards (formerly National Educational Technology Standards (NETS)) are a set of standards published by the International Society for Technology in Education (ISTE) to leverage the use of technology in K-12 education. These are sometimes intermixed with information and communication technologies (ICT) skills. In 2007 NETS issued a series of six performance indicators (only the first four are on their website as of 2016):
  • Creativity and Innovation
  • Communication and Collaboration
  • Research and Information Fluency
  • Critical Thinking, Problem Solving, and Decision Making
  • Digital Citizenship
  • Technology Operations and Concepts

ICT Literacy Panel digital literacy standards (2007)

In 2007 the Educational Testing Service (ETS) ICT Literacy Panel released its digital literacy standards:

Information and Communication Technologies (ICT) proficiencies:
  • Cognitive proficiency
  • Technical proficiency
  • ICT proficiency
A person possessing these skills would be expected to perform these tasks for a particular set of information: access, manage, integrate, evaluate, create/publish/present. The emphasis is on proficiency with digital tools.

Dede learning styles and categories

In 2005, Chris Dede of the Harvard Graduate School of Education developed a framework based on new digital literacies, entitled Neomillennial Learning Styles:
  • Fluency in multiple media
  • Active learning based on collectively seeking, sieving, and synthesizing experiences.
  • Expression through non-linear, associational webs of representations.
  • Co-design by teachers and students of personalized learning experiences.
Dede category system
 
With the exponential expansion of personal access to Internet resources, including social media, information and content on the Internet has evolved from being created by website providers to being created by individuals and communities of contributors. Where the early Internet centered on material created by a small number of people, Web 2.0 tools (e.g. Wikipedia) foster online communication, collaboration, and creation of content by large numbers of people (individually or in groups) in online communities.

In 2009, Dede created a category system for Web 2.0 tools:
  • Sharing (communal bookmarking, photo/video sharing, social networking, writers’ workshops/fanfiction)
  • Thinking (blogs, podcasts, online discussion fora)
  • Co-Creating (wikis/collaborative file creation, mashups/collective media creation, collaborative social change communities)

World Economic Forum

In 2015, the World Economic Forum published a report titled ‘New Vision for Education: Unlocking the Potential of Technology’ that focused on the pressing issue of the 21st-century skills gap and ways to address it through technology. In the report, they defined a set of 16 crucial proficiencies for education in the 21st century. Those skills include six “foundational literacies”, four “competencies”, and six “character qualities”, listed below.

Foundational Literacies
  • Literacy and numeracy
  • Scientific literacy
  • ICT literacy
  • Financial literacy
  • Cultural literacy
  • Civic literacy
Competencies
  • Critical thinking/problem solving
  • Creativity
  • Communication
  • Collaboration
Character Qualities
  • Initiative
  • Persistence/grit
  • Adaptability
  • Curiosity
  • Leadership
  • Social and cultural awareness

National Research Council

In a report titled ‘Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century’, the National Research Council of the National Academies defines 21st century skills, describes how the skills relate to each other, and summarizes the evidence regarding 21st century skills.

As a first step toward describing "21st century skills," the National Research Council identified three domains of competence: cognitive, interpersonal, and intrapersonal, while recognizing that the three domains, while different, are intertwined in human development and learning. These three domains represent distinct facets of human thinking and build on previous efforts to identify and organize dimensions of human behaviour. The committee produced the following clusters of 21st century skills in the three domains.

Cognitive Competencies
  • Cognitive processes and strategies: Critical thinking, problem solving, analysis, reasoning and argumentation, interpretation, decision-making, adaptive learning
  • Knowledge: Information literacy, ICT literacy, oral and written communication, and active listening
  • Creativity: Creativity and innovation
Intrapersonal Competencies
  • Intellectual openness: Flexibility, adaptability, artistic and cultural appreciation, personal and social responsibility, appreciation for diversity, continuous learning, intellectual interest and curiosity
  • Work ethic/conscientiousness: Initiative, self-direction, responsibility, perseverance, grit, career orientation, ethics, integrity, citizenship
  • Positive core self-evaluation: Self monitoring, self evaluation, self reinforcement, physical and psychological health
Interpersonal Competencies
  • Teamwork and collaboration: Communication, collaboration, cooperation, teamwork, coordination, interpersonal skills
  • Leadership: Responsibility, assertive communication, self presentation, social influence with others

Implementation

Multiple agencies and organizations have issued guides and recommendations for implementing 21st century skills in a variety of learning environments and learning spaces. These include five separate educational areas: standards, assessment, professional development, curriculum & instruction, and learning environments.

The designs of learning environments and curricula have been impacted by the initiatives and efforts to implement and support 21st century skills, with a move away from the factory model school and toward a variety of different organizational models. Hands-on learning and project-based learning have resulted in the development of programs and spaces such as STEM and makerspaces. Collaborative learning environments have fostered flexibility in furniture and classroom layout as well as differentiated spaces, such as small seminar rooms near classrooms. Literacy with, and access to, digital technology has impacted the design of furniture and fixed components as students and teachers use tablets, interactive whiteboards, and interactive projectors. Classroom sizes have grown to accommodate a variety of furniture arrangements and groupings, many of which are less space-efficient than traditional configurations of desks in rows.
