Friday, March 17, 2023

Network effect

From Wikipedia, the free encyclopedia
Diagram illustrating the network effect in a few simple phone networks. The lines represent potential calls between phones. As the number of phones connected to the network grows, the number of potential calls available to each phone grows and increases the utility of each phone, new and existing.

In economics, a network effect (also called network externality or demand-side economies of scale) is the phenomenon by which the value or utility a user derives from a good or service depends on the number of users of compatible products. Network effects are typically positive, resulting in a given user deriving more value from a product as more users join the same network. The adoption of a product by an additional user can be broken into two effects: an increase in the value to all other users ("total effect") and an enhancement of other non-users' motivation to use the product ("marginal effect").

Network effects can be direct or indirect. Direct network effects arise when a given user's utility increases with the number of other users of the same product or technology, meaning that adoption of a product by different users is complementary. This effect is separate from effects related to price, such as a benefit to existing users resulting from price decreases as more users join. Direct network effects can be seen with social networking services, including Twitter, Facebook, Airbnb, Uber, and LinkedIn; telecommunications devices like the telephone; and instant messaging services such as MSN, AIM or QQ. Indirect (or cross-group) network effects arise when there are "at least two different customer groups that are interdependent, and the utility of at least one group grows as the other group(s) grow". For example, hardware may become more valuable to consumers with the growth of compatible software.

Network effects are commonly mistaken for economies of scale, which describe decreasing average production costs in relation to the total volume of units produced. Economies of scale are a common phenomenon in traditional industries such as manufacturing, whereas network effects are most prevalent in new economy industries, particularly information and communication technologies. Network effects are the demand-side counterpart of economies of scale, as they function by increasing a customer's willingness to pay rather than by decreasing the supplier's average cost.

Upon reaching critical mass, a bandwagon effect can result. As the network continues to become more valuable with each new adopter, more people are incentivised to adopt, resulting in a positive feedback loop. Multiple equilibria and a market monopoly are two key potential outcomes in markets that exhibit network effects. Consumer expectations are key in determining which outcomes will result.

Origins

Network effects were a central theme in the arguments of Theodore Vail, the first post-patent president of Bell Telephone, in gaining a monopoly on US telephone services. In 1908, when he presented the concept in Bell's annual report, there were over 4,000 local and regional telephone exchanges, most of which were eventually merged into the Bell System.

Network effects were popularized by Robert Metcalfe, stated as Metcalfe's law. Metcalfe was one of the co-inventors of Ethernet and a co-founder of the company 3Com. In selling the product, Metcalfe argued that the number of installed Ethernet cards needed to grow above a certain critical mass if customers were to reap the benefits of their network. According to Metcalfe, the rationale behind the sale of networking cards was that the cost of the network was directly proportional to the number of cards installed, but the value of the network was proportional to the square of the number of users. This was expressed algebraically as having a cost of N and a value of N². While the actual numbers behind this proposition were never firm, the concept allowed customers to share access to expensive resources like disk drives and printers, send e-mail, and eventually access the Internet.
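
Metcalfe's reasoning can be sketched in a few lines of code. The figures below are illustrative assumptions only (as the text notes, the actual numbers were never firm): cost grows linearly with the number of installed cards, value grows with the number of possible pairwise connections, and the crossover point is the critical mass the argument relies on.

```python
# Illustrative sketch of Metcalfe's argument (made-up parameters, not his
# original figures): cost is proportional to n, value to n*(n-1)/2 ~ n^2.

def network_cost(n, cost_per_card=100.0):
    """Total cost of an n-node network, proportional to n."""
    return cost_per_card * n

def network_value(n, value_per_link=1.0):
    """Metcalfe-style value, proportional to the number of possible links."""
    return value_per_link * n * (n - 1) / 2

# Find the smallest network size at which value overtakes cost --
# the "critical mass" referred to in the text.
for n in range(1, 1000):
    if network_value(n) >= network_cost(n):
        print(f"With these illustrative parameters, value exceeds cost at n = {n}")
        break
```

With the parameters assumed above, the quadratic value term overtakes the linear cost term at 201 nodes; the exact number is irrelevant, only the shape of the argument.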

The economic theory of the network effect was advanced significantly between 1985 and 1995 by researchers Michael L. Katz, Carl Shapiro, Joseph Farrell, and Garth Saloner. Author and high-tech entrepreneur Rod Beckstrom presented a mathematical model for describing networks in a state of positive network effect at BlackHat and Defcon in 2009, and also presented the "inverse network effect" together with an economic model for defining it. Because of the positive feedback often associated with the network effect, system dynamics can be used as a modelling method to describe the phenomenon. Word of mouth and the Bass diffusion model are also potentially applicable. The next major advance occurred between 2000 and 2003, when researchers Geoffrey G. Parker, Marshall Van Alstyne, Jean-Charles Rochet and Jean Tirole independently developed the two-sided market literature, showing how network externalities that cross distinct groups can lead to free pricing for one of those groups.

Evidence and consequences

The dynamics of activity on online platforms, as indicated by posts on social media platforms, reveal the long-term economic consequences of network effects in both the offline and online economy.
Clues about the long-term results of network effects on the global economy are revealed in new research into online diversity.

While the diversity of sources is in decline, there is a countervailing force of continually increasing functionality with new services, products and applications, such as music streaming services (Spotify), file sharing programs (Dropbox) and messaging platforms (Messenger, Whatsapp and Snapchat). Another major finding was the dramatic increase in the "infant mortality" rate of websites, with the dominant players in each functional niche, once established, guarding their turf more staunchly than ever.

On the other hand, a growing network effect does not always bring a proportional increase in returns. Whether additional users bring more value depends on the commoditization of supply, the type of incremental user and the nature of substitutes. For example, social networks can hit an inflection point after which additional users do not bring more value. This can be attributed to the fact that as more people join the network, its users become less willing to share personal content and the site becomes more focused on news and public content.

Economics

Network economics refers to business economics that benefit from the network effect. This is when the value of a good or service increases as more people buy the same good or service. Examples are websites such as eBay or iVillage, where the community comes together and shares ideas that help the website become a better business.

In sustainability, network economics refers to multiple professionals (architects, designers, or related businesses) all working together to develop sustainable products and technologies. The more companies are involved in environmentally friendly production, the easier and cheaper it becomes to produce new sustainable products. For instance, if no one produces sustainable products, it is difficult and expensive to design a sustainable house with custom materials and technology. But due to network economics, the more industries are involved in creating such products, the easier it is to design an environmentally sustainable building.

Another benefit of network economics in a certain field is improvement that results from competition and networking within an industry.

Adoption and competition

Critical mass

In the early phases of a network technology, incentives to adopt the new technology are low. After a certain number of people have adopted the technology, network effects become significant enough that adoption becomes a dominant strategy. This point is called critical mass. At the critical mass point, the value obtained from the good or service is greater than or equal to the price paid for the good or service.

When a product reaches critical mass, network effects will drive subsequent growth until a stable balance is reached. Therefore, a key business concern must be how to attract users prior to reaching critical mass. Critical mass is closely related to consumer expectations, which are affected by the price and quality of products or services, the company's reputation and the growth path of the network. Thus, one approach is to rely on extrinsic motivation, such as a payment, a fee waiver, or a request for friends to sign up. A more natural strategy is to build a system that has enough value without network effects, at least to early adopters. Then, as the number of users increases, the system becomes even more valuable and is able to attract a wider user base.
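
The role of critical mass can be illustrated with a small, self-contained simulation. Every number below is an assumption made for illustration (1,000 potential users, a price of 0.2, willingness to pay spread evenly over the population), not a figure from the text: each consumer adopts only when her willingness to pay, scaled by the current adoption share, covers the price, so a network seeded below the critical level unravels while one seeded above it grows to a stable equilibrium.

```python
# Minimal sketch of critical mass with invented parameters.
N = 1000
price = 0.2
theta = [(i + 1) / N for i in range(N)]  # willingness to pay, spread over (0, 1]

def run(seed_share):
    """Iterate adoption until it settles, starting from a seeded share of users."""
    share = seed_share
    for _ in range(200):  # more than enough iterations to reach a fixed point
        # consumer i adopts when theta_i * current adoption share >= price
        adopters = sum(1 for t in theta if t * share >= price)
        share = adopters / N
    return share

print("seed 20% ->", run(0.20))   # below critical mass: adoption collapses toward 0
print("seed 35% ->", run(0.35))   # above critical mass: grows to the high equilibrium
```

Under these assumed parameters a 20% seed unwinds to zero while a 35% seed settles at roughly 72% adoption, which is the point of the fee waivers and sign-up campaigns mentioned above: they exist to push the installed base past the unstable critical level.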

Limits to growth

Network growth is generally not infinite, and tends to plateau when it reaches market saturation (all customers have already joined) or diminishing returns make acquisition of the last few customers too costly.

Networks can also stop growing or collapse if they do not have enough capacity to handle growth. For example, a phone network may take on so many customers that it becomes congested, leading to busy signals, the inability to get a dial tone, and poor customer support. This creates a risk that customers will defect to a rival network because of the inadequate capacity of the existing system. After this point, each additional user decreases the value obtained by every other user.

Peer-to-peer (P2P) systems are networks designed to distribute load among their user pool. This theoretically allows P2P networks to scale indefinitely. The P2P based telephony service Skype benefits from this effect and its growth is limited primarily by market saturation.

Market tipping

Network effects give rise to the potential outcome of market tipping, defined as "the tendency of one system to pull away from its rivals in popularity once it has gained an initial edge". Tipping results in a market in which only one good or service dominates and competition is stifled, and can result in a monopoly. This is because network effects tend to incentivise users to coordinate their adoption of a single product. Therefore, tipping can result in a natural form of market concentration in markets that display network effects. However, the presence of network effects does not necessarily imply that a market will tip; the following additional conditions must be met:

  1. The utility derived by users from network effects must exceed the utility they derive from differentiation
  2. Users must have high costs of multihoming (i.e. adopting more than one competing network)
  3. Users must have high switching costs

If any of these three conditions are not satisfied, the market may fail to tip and multiple products with significant market shares may coexist. One such example is the U.S. instant messaging market, which remained an oligopoly despite significant network effects. This can be attributed to the low multi-homing and switching costs faced by users.

Market tipping does not imply permanent success in a given market. Competition can be reintroduced into the market due to shocks such as the development of new technologies. Additionally, if the price is raised above customers' willingness to pay, this may reverse market tipping.

Multiple equilibria and expectations

Network effects often result in multiple potential market equilibrium outcomes. The key determinant of which equilibrium will manifest is the expectations of the market participants, which are self-fulfilling. Because users are incentivised to coordinate their adoption, users will tend to adopt the product that they expect to draw the largest number of users. These expectations may be shaped by path dependence, such as a perceived first-mover advantage, which can result in lock-in. The most commonly cited example of path dependence is the QWERTY keyboard, which owes its ubiquity to its establishment of an early lead in the keyboard layout industry and high switching costs, rather than any inherent advantage over competitors. Other key influences on adoption expectations can be reputational (e.g. a firm that has previously produced high quality products may be favoured over a new firm).

Markets with network effects may result in inefficient equilibrium outcomes. With simultaneous adoption, users may fail to coordinate towards a single agreed-upon product, resulting in splintering among different networks, or may coordinate to lock-in to a different product than the one that is best for them.

Technology lifecycle

If some existing technology or company whose benefits are largely based on network effects starts to lose market share against a challenger such as a disruptive technology or open standards based competition, the benefits of network effects will reduce for the incumbent, and increase for the challenger. In this model, a tipping point is eventually reached at which the network effects of the challenger dominate those of the former incumbent, and the incumbent is forced into an accelerating decline, whilst the challenger takes over the incumbent's former position.

Sony's Betamax and Victor Company of Japan (JVC)'s Video Home System (VHS) could both be used for video cassette recorders (VCRs), but the two technologies were not compatible: a cassette made for one system could not be played in the other. VHS gradually surpassed Betamax in the competition. In the end, Betamax lost its original market share and was replaced by VHS.

Negative network externalities

Negative network externalities, in the mathematical sense, are those that have a negative effect compared to normal (positive) network effects. Just as positive network externalities (network effects) cause positive feedback and exponential growth, negative network externalities create negative feedback and exponential decay. In nature, negative network externalities are the forces that pull towards equilibrium, are responsible for stability, and represent physical limitations keeping systems bounded.

Negative network externalities are characterized by effects such as more login retries, longer query times, longer download times and more download attempts. Congestion occurs when the efficiency of a network decreases as more people use it, and this reduces the value to people already using it. Traffic congestion that overloads a freeway and network congestion on connections with limited bandwidth both display negative network externalities.
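
A toy value function makes the congestion point concrete. The capacity and penalty figures below are arbitrary assumptions for illustration: each additional user adds value up to a capacity limit, after which congestion subtracts value for everyone already on the network.

```python
# Illustrative only: total network value rises with users up to a capacity
# limit, then congestion (retries, slow queries and downloads) erodes it.

def network_value(n, capacity=1000, benefit=1.0, congestion_penalty=3.0):
    """Total value of a network with n users under an assumed capacity limit."""
    congestion = max(0, n - capacity)
    return benefit * min(n, capacity) - congestion_penalty * congestion

for n in (500, 1000, 1200, 1500):
    print(f"{n:5d} users -> network value {network_value(n):8.1f}")
```

With these made-up numbers, value peaks at the capacity limit and then falls, which is the point at which each additional user starts to reduce the value obtained by every other user.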

Braess's paradox suggests that adding paths through a network can have a negative effect on performance of the network.
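
The standard textbook instance of Braess's paradox (the 4,000-driver road network commonly used to illustrate it, not an example taken from this article) can be checked with a few lines of arithmetic: adding a free shortcut raises every driver's travel time.

```python
# Classic illustrative instance of Braess's paradox (textbook numbers).
# 4000 drivers travel from A to B over two routes:
#   A -> C -> B : A->C takes T/100 minutes (T = drivers on that link), C->B takes 45
#   A -> D -> B : A->D takes 45 minutes, D->B takes T/100
# Without a shortcut, traffic splits evenly and each trip takes 65 minutes.
# Adding a zero-cost link C -> D makes A -> C -> D -> B dominant for every
# driver, and everyone's trip gets longer.

drivers = 4000

# Before the new link: equilibrium is an even split across the two routes.
half = drivers / 2
time_before = half / 100 + 45            # = 65 minutes per driver

# After the zero-cost link C -> D is added: T/100 <= 40 < 45 for T <= 4000,
# so every driver routes through both variable links.
time_after = drivers / 100 + 0 + drivers / 100   # = 80 minutes per driver

print(f"average travel time without the extra road: {time_before:.0f} min")
print(f"average travel time with the extra road:    {time_after:.0f} min")
```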

Interoperability

Interoperability has the effect of making the network bigger and thus increases the external value of the network to consumers. Interoperability achieves this primarily by increasing potential connections and secondarily by attracting new participants to the network. Other benefits of interoperability include reduced uncertainty, reduced lock-in, commoditization and competition based on price.

Interoperability can be achieved through standardization or other cooperation. Companies involved in fostering interoperability face a tension between cooperating with their competitors to grow the potential market for products and competing for market share.

Compatibility and incompatibility

Product compatibility is closely related to network externalities in competition between companies; compatibility refers to two systems that can operate together without modification. Compatible products match customers better, because customers can enjoy all the benefits of the network without having to purchase products from the same company. However, compatibility intensifies competition between companies and erodes the advantages of users who have already purchased products, while proprietary networks may raise barriers to entry in the industry. Compared to large companies with stronger reputations, weaker companies or small networks are more inclined to choose compatible products.

Besides, the compatibility of products is conducive to an increase in a company's market share. For example, the Windows system is famous for its operating compatibility, thereby satisfying consumers' demand for a diverse range of applications. As the supplier of Windows systems, Microsoft benefits from indirect network effects, which help the company's market share grow.

Incompatibility is the opposite of compatibility. Incompatibility of products aggravates market segmentation, reduces efficiency, harms consumer interests and intensifies competition. The outcome of competition between incompatible networks depends on the full sequence of adoption and the early preferences of adopters; history matters because the installed base directly brings more network profit and raises consumers' expectations, which supports the smooth operation of subsequent network effects.

Open versus closed standards

In communication and information technologies, open standards and interfaces are often developed through the participation of multiple companies and are usually perceived to provide mutual benefit. But, in cases in which the relevant communication protocols or interfaces are closed standards, the network effect can give the company controlling those standards monopoly power. The Microsoft corporation is widely seen by computer professionals as maintaining its monopoly through these means. One observed method Microsoft uses to put the network effect to its advantage is called Embrace, extend and extinguish.

Mirabilis was an Israeli start-up which pioneered instant messaging (IM) and was bought by America Online. By giving away their ICQ product for free and preventing interoperability between their client software and other products, they were able to temporarily dominate the market for instant messaging. IM technology spread from the home to the workplace because of its speed and simplicity. Because of the network effect, new IM users gained much more value by choosing to use the Mirabilis system (and joining its large network of users) than they would have by choosing a competing system. As was typical for that era, the company never made any attempt to generate profits from its dominant position before selling the company.

Network effect as a competitive advantage

Network effects can significantly influence the competitive landscape of an industry. According to Michael E. Porter, a strong network effect can decrease the threat of new entrants, which is one of the five major competitive forces that act on an industry. Persistent barriers to entry may help incumbent companies fend off competition and keep or increase their market share, while maintaining profitability and return on capital.

These attractive characteristics are one of the reasons that allowed platform companies like Amazon, Google or Facebook to grow rapidly and create shareholder value. On the other hand, network effect can result in high concentration of power in an industry, or even a monopoly. This often leads to increased scrutiny from regulators that try to restore healthy competition, as is often the case with large technology companies.

Examples

The Telephone

Network effects are the incremental benefit gained by each user for each new user that joins a network. An example of a direct network effect is the telephone. Originally, when only a small number of people owned a telephone, the value it provided was minimal. Not only did other people need to own a telephone for it to be useful, but it also had to be connected to the network through the user's home. As technology advanced, it became more affordable for people to own a telephone. This created more value and utility due to the increase in users. Eventually, increased usage through exponential growth led to the telephone being used by almost every household, adding more value to the network for all users. Without the network effect and technological advances, the telephone would have nowhere near the amount of value or utility it does today.

Financial exchanges

Stock exchanges and derivatives exchanges feature a network effect. Market liquidity is a major determinant of transaction cost in the sale or purchase of a security, as a bid–ask spread exists between the price at which a purchase can be made and the price at which the same security can be sold. As the number of buyers and sellers on the exchange with symmetric information increases, liquidity increases and transaction costs decrease. This then attracts a larger number of buyers and sellers to the exchange.

The network advantage of financial exchanges is apparent in the difficulty that startup exchanges have in dislodging a dominant exchange. For example, the Chicago Board of Trade has retained overwhelming dominance of trading in US Treasury bond futures despite the startup of Eurex US trading of identical futures contracts. Similarly, the Chicago Mercantile Exchange has maintained dominance in trading of Eurodollar interest rate futures despite a challenge from Euronext.Liffe.

Cryptocurrencies

Cryptocurrencies such as Bitcoin also feature network effects. Bitcoin's unique properties make it an attractive asset to users and investors. The more users that join the network, the more valuable and secure it becomes. This creates an incentive for users to join: as the network and community grow, a network effect occurs, making it more likely that new people will also join. Bitcoin provides its users with financial value through the network effect, which may attract more investors due to the appeal of financial gain. This is an example of an indirect network effect, as the value only increases because the initial network was created.

Software

Widely used computer software benefits from powerful network effects. Software purchases are easily influenced by the opinions of other users, so the customer base of the software is key to realizing a positive network effect. Although a customer's motivation for choosing software is related to the product itself, media interaction and word-of-mouth recommendations from existing customers can still increase the likelihood that the software will be adopted by customers who have not yet purchased it, thereby producing network effects.

In 2007 Apple released the iPhone, followed by the App Store. Most iPhone apps rely heavily on the existence of strong network effects. This enables the software to grow in popularity very quickly and spread to a large userbase with very limited marketing needed. The freemium business model has evolved to take advantage of these network effects by releasing a free version that does not limit adoption by any user, and then charging for premium features as the primary source of revenue. Furthermore, some software companies launch free trial versions to attract buyers and reduce their uncertainty. The duration of the free trial is related to the network effect: the more positive feedback the company receives, the shorter the free trial will be.

Software companies (for example Adobe or Autodesk) often give significant discounts to students. By doing so, they intentionally stimulate the network effect - as more students learn to use a particular piece of software, it becomes more viable for companies and employers to use it as well. And the more employers require a given skill, the higher the benefit that employees will receive from learning it. This creates a self-reinforcing cycle, further strengthening the network effect.

Web sites

Many web sites benefit from a network effect. One example is web marketplaces and exchanges. For example, eBay would not be a particularly useful site if auctions were not competitive. As the number of users grows on eBay, auctions grow more competitive, pushing up the prices of bids on items. This makes it more worthwhile to sell on eBay and brings more sellers onto eBay, which, in turn, drives prices down again due to increased supply. Increased supply brings even more buyers to eBay. Essentially, as the number of users of eBay grows, prices fall and supply increases, and more and more people find the site to be useful.

Network effects were used as justification for the business models of some dot-com companies in the late 1990s. These firms operated under the belief that when a new market comes into being which contains strong network effects, firms should care more about growing their market share than about becoming profitable. The justification was that market share would determine which firm could set technical and marketing standards, giving these companies a first-mover advantage.

Social networking websites are good examples. The more people register onto a social networking website, the more useful the website is to its registrants.

Google uses the network effect in its advertising business with its Google AdSense service. AdSense places ads on many small sites, such as blogs, using Google technology to determine which ads are relevant to which blogs. Thus, the service appears to aim to serve as an exchange (or ad network) for matching many advertisers with many small sites. In general, the more blogs AdSense can reach, the more advertisers it will attract, making it the most attractive option for more blogs.

By contrast, the value of a news site is primarily proportional to the quality of the articles, not to the number of other people using the site. Similarly, the first generation of search engines experienced little network effect, as the value of the site was based on the value of the search results. This allowed Google to win users away from Yahoo! without much trouble, once users believed that Google's search results were superior. Some commentators mistook the value of the Yahoo! brand (which does increase as more people know of it) for a network effect protecting its advertising business.

Rail gauge

The dominant rail gauge in each country shown

There are strong network effects in the initial choice of rail gauge, and in gauge conversion decisions. Even when placing isolated rails not connected to any other lines, track layers usually choose a standard rail gauge so they can use off-the-shelf rolling stock. Although a few manufacturers make rolling stock that can adjust to different rail gauges, most manufacturers make rolling stock that only works with one of the standard rail gauges. This even applies to urban rail systems: where historically tramways and, to a lesser extent, metros came in a wide array of different gauges, nowadays virtually all new networks are built to a handful of gauges, and overwhelmingly to standard gauge.

Credit cards

For credit cards, which are now widely used, large-scale adoption in the market is closely related to network effects. The credit card, one of the payment methods in the current economy, originated in 1949. Early research on the circulation of credit cards at the retail level found that credit card interest rates were not affected by macroeconomic interest rates and remained almost unchanged. Later, credit cards gradually entered the network level due to changes in policy priorities and became a popular means of payment in the 1980s. Different levels of the credit card system benefit from two types of network effects. The adoption of credit cards involves external network effects: once credit cards become an accepted payment method, each additional person using the same credit card increases its value to everyone else who uses it. Besides, the credit card system at the network level can be seen as a two-sided market. On the one hand, the number of cardholders attracts merchants to accept credit cards as a payment method. On the other hand, an increasing number of merchants can also attract more new cardholders. In other words, greater use of credit cards among merchants leads to increased value, which in turn increases the value of the card to cardholders and the number of users. Moreover, credit card services also display a network effect between merchant discounts and credit accessibility: when increased credit accessibility generates greater sales, merchants are willing to pay higher discount fees to credit card issuers.

Visa has become a leader in the electronic payment industry through the network effect of credit cards as its competitive advantage. By 2016, Visa's credit card market share had risen from a quarter to as much as half in four years. Visa benefits from the network effect: every additional Visa cardholder makes the card more attractive to merchants, and merchants in turn attract more new cardholders to the brand. In other words, the popularity and convenience of Visa in the electronic payment market leads more people and merchants to choose Visa, which greatly increases the value of Visa.

Technology adoption life cycle / Disruptive innovation

From Wikipedia, the free encyclopedia

An 1880 penny-farthing (left), and an 1886 Rover safety bicycle with gearing
Christensen's Types of Innovation

Sustaining

An innovation that does not significantly affect existing markets. It may be either:
Evolutionary
An innovation that improves a product in an existing market in ways that customers are expecting (e.g., fuel injection for gasoline engines, which displaced carburetors.)
Revolutionary (discontinuous but sustaining)
An innovation that is unexpected, but nevertheless does not affect existing markets (e.g., the first automobiles in the late 19th century, which were expensive luxury items, and as such very few were sold)

Disruptive

An innovation that creates a new market or enters at the bottom of an existing market by providing a different set of values, which ultimately (and unexpectedly) overtakes incumbents (e.g., the lower-priced, affordable Ford Model T, which displaced horse-drawn carriages)

In business theory, disruptive innovation is innovation that creates a new market and value network or enters at the bottom of an existing market and eventually displaces established market-leading firms, products, and alliances. The concept was developed by the American academic Clayton Christensen and his collaborators beginning in 1995, and has been called the most influential business idea of the early 21st century. Lingfei Wu, Dashun Wang, and James A. Evans generalized the term to identify disruptive science and technological advances from more than 65 million papers, patents and software products spanning the period 1954–2014. Their work was featured on the cover of the February 2019 issue of Nature and was included among the Altmetric 100 most-discussed works of 2019.

Not all innovations are disruptive, even if they are revolutionary. For example, the first automobiles in the late 19th century were not a disruptive innovation, because early automobiles were expensive luxury items that did not disrupt the market for horse-drawn vehicles. The market for transportation essentially remained intact until the debut of the lower-priced Ford Model T in 1908. The mass-produced automobile was a disruptive innovation, because it changed the transportation market, whereas the first thirty years of automobiles did not.

Disruptive innovations tend to be produced by outsiders and entrepreneurs in startups, rather than existing market-leading companies. The business environment of market leaders does not allow them to pursue disruptive innovations when they first arise, because they are not profitable enough at first and because their development can take scarce resources away from sustaining innovations (which are needed to compete against current competition). Small teams are more likely to create disruptive innovations than large teams. A disruptive process can take longer to develop than the conventional approach, and the risk associated with it is higher than for other, more incremental, architectural or evolutionary forms of innovation; but once it is deployed in the market, it achieves much faster penetration and a higher degree of impact on established markets.

Beyond business and economics, disruptive innovations can also be considered to disrupt complex systems, including economic and business-related aspects. Through identifying and analyzing systems for possible points of intervention, one can then design changes focused on disruptive interventions.

Usage history

The term disruptive technologies was coined by Clayton M. Christensen and introduced in his 1995 article Disruptive Technologies: Catching the Wave, which he cowrote with Joseph Bower. The article is aimed at both management executives who make the funding or purchasing decisions in companies, as well as the research community, which is largely responsible for introducing the disruptive vector to the consumer market. He describes the term further in his book The Innovator's Dilemma. Innovator's Dilemma explored the case of the disk drive industry (the disk drive and memory industry, with its rapid technological evolution, is to the study of technology what fruit flies are to the study of genetics, as Christensen was told in the 1990s) and the excavating and Earth-moving industry (where hydraulic actuation slowly, yet eventually, displaced cable-actuated machinery). In his sequel with Michael E. Raynor, The Innovator's Solution, Christensen replaced the term disruptive technology with disruptive innovation because he recognized that most technologies are not intrinsically disruptive or sustaining in character; rather, it is the business model that identifies the crucial idea that potentiates profound market success and subsequently serves as the disruptive vector. Comprehending Christensen's business model, which takes the disruptive vector from the idea borne from the mind of the innovator to a marketable product, is central to understanding how novel technology facilitates the rapid destruction of established technologies and markets by the disruptor. Christensen and Mark W. Johnson, who cofounded the management consulting firm Innosight, described the dynamics of "business model innovation" in the 2008 Harvard Business Review article "Reinventing Your Business Model". The concept of disruptive technology continues a long tradition of identifying radical technological change in the study of innovation by economists, and its implementation and execution by its management at a corporate or policy level.

According to Christensen, "the term 'disruptive innovation' is misleading when it is used to refer to the derivative, or 'instantaneous value', of the market behavior of the product or service, rather than the integral, or 'sum over histories', of the product's market behavior."

In the late 1990s, the automotive sector began to embrace a perspective of "constructive disruptive technology" by working with the consultant David E. O'Ryan, whereby the use of current off-the-shelf technology was integrated with newer innovation to create what he called "an unfair advantage". The process or technology change as a whole had to be "constructive" in improving the current method of manufacturing, yet disruptively impact the whole of the business case model, resulting in a significant reduction of waste, energy, materials, labor, or legacy costs to the user.

In keeping with the insight that a persuasive advertising campaign can be just as effective as technological sophistication at bringing a successful product to market, Christensen's theory explains why many disruptive innovations are not advanced or useful technologies, rather combinations of existing off-the-shelf components, applied shrewdly to a fledgling value network.

The online news site TechRepublic has proposed an end to using the term, and similar related terms, suggesting that, as of 2014, it is overused jargon.

Definition

  • Disruption is a process, not a product or service, that occurs from the nascent to the mainstream
  • Originates in low-end (less demanding customers) or new market (where none existed) footholds
  • New firms don't catch on with mainstream customers until quality catches up with their standards
  • Success is not a requirement and some businesses can be disruptive but fail
  • The new firm's business model differs significantly from the incumbent's

Christensen continues to develop and refine the theory and has accepted that not all examples of disruptive innovation perfectly fit into his theory. For example, he conceded that originating in the low end of the market is not always a cause of disruptive innovation, but rather it fosters competitive business models, using Uber as an example. In an interview with Forbes magazine he stated:

"Uber helped me realize that it isn’t that being at the bottom of the market is the causal mechanism, but that it’s correlated with a business model that is unattractive to its competitor".

Theory

The current theoretical understanding of disruptive innovation is different from what might be expected by default, an idea that Clayton M. Christensen called the "technology mudslide hypothesis". This is the simplistic idea that an established firm fails because it doesn't "keep up technologically" with other firms. In this hypothesis, firms are like climbers scrambling upward on crumbling footing, where it takes constant upward-climbing effort just to stay still, and any break from the effort (such as complacency born of profitability) causes a rapid downhill slide. Christensen and colleagues have shown that this simplistic hypothesis is wrong; it doesn't model reality. What they have shown is that good firms are usually aware of the innovations, but their business environment does not allow them to pursue them when they first arise, because they are not profitable enough at first and because their development can take scarce resources away from that of sustaining innovations (which are needed to compete against current competition). In Christensen's terms, a firm's existing value networks place insufficient value on the disruptive innovation to allow its pursuit by that firm. Meanwhile, start-up firms inhabit different value networks, at least until the day that their disruptive innovation is able to invade the older value network. At that time, the established firm in that network can at best only fend off the market share attack with a me-too entry, for which survival (not thriving) is the only reward.

In the technology mudslide hypothesis, Christensen differentiated disruptive innovation from sustaining innovation. He explained that the latter's goal is to improve existing product performance. On the other hand, he defines a disruptive innovation as a product or service designed for a new set of customers.

Generally, disruptive innovations were technologically straightforward, consisting of off-the-shelf components put together in a product architecture that was often simpler than prior approaches. They offered less of what customers in established markets wanted and so could rarely be initially employed there. They offered a different package of attributes valued only in emerging markets remote from, and unimportant to, the mainstream.

Christensen also noted that products considered as disruptive innovations tend to skip stages in the traditional product design and development process to quickly gain market traction and competitive advantage. He argued that disruptive innovations can hurt successful, well-managed companies that are responsive to their customers and have excellent research and development. These companies tend to ignore the markets most susceptible to disruptive innovations, because the markets have very tight profit margins and are too small to provide a good growth rate to an established (sizable) firm. Thus, disruptive technology provides an example of an instance when the common business-world advice to "focus on the customer" (or "stay close to the customer", or "listen to the customer") can be strategically counterproductive.

While Christensen argued that disruptive innovations can hurt successful, well-managed companies, O'Ryan countered that "constructive" integration of existing, new, and forward-thinking innovation could improve the economic benefits of these same well-managed companies, once decision-making management understood the systemic benefits as a whole.

How low-end disruption occurs over time

Christensen distinguishes between "low-end disruption", which targets customers who do not need the full performance valued by customers at the high end of the market, and "new-market disruption", which targets customers who have needs that were previously unserved by existing incumbents.

Low-end disruption

"Low-end disruption" occurs when the rate at which products improve exceeds the rate at which customers can adopt the new performance. Therefore, at some point the performance of the product overshoots the needs of certain customer segments. At this point, a disruptive technology may enter the market and provide a product that has lower performance than the incumbent but that exceeds the requirements of certain segments, thereby gaining a foothold in the market.

In low-end disruption, the disruptor is focused initially on serving the least profitable customer, who is happy with a good enough product. This type of customer is not willing to pay premium for enhancements in product functionality. Once the disruptor has gained a foothold in this customer segment, it seeks to improve its profit margin. To get higher profit margins, the disruptor needs to enter the segment where the customer is willing to pay a little more for higher quality. To ensure this quality in its product, the disruptor needs to innovate. The incumbent will not do much to retain its share in a not-so-profitable segment, and will move up-market and focus on its more attractive customers. After a number of such encounters, the incumbent is squeezed into smaller markets than it was previously serving. And then, finally, the disruptive technology meets the demands of the most profitable segment and drives the established company out of the market.

New market disruption

"New market disruption" occurs when a product fits a new or emerging market segment that is not being served by existing incumbents in the industry. Some scholars note that the creation of a new market is a defining feature of disruptive innovation, particularly in the way it tend to improve products or services differently in comparison to normal market drivers. It initially caters to a niche market and proceeds on defining the industry over time once it is able to penetrate the market or induce consumers to defect from the existing market into the new market it created.

The extrapolation of the theory to all aspects of life has been challenged, as has the methodology of relying on selected case studies as the principal form of evidence. Jill Lepore points out that some companies identified by the theory as victims of disruption a decade or more ago, rather than being defunct, remain dominant in their industries today (including Seagate Technology, U.S. Steel, and Bucyrus). Lepore questions whether the theory has been oversold and misapplied, as if it were able to explain everything in every sphere of life, including not just business but education and public institutions.

Disruptive technology

In 2009, Milan Zeleny described high technology as disruptive technology and raised the question of what is being disrupted. The answer, according to Zeleny, is the support network of high technology. For example, introducing electric cars disrupts the support network for gasoline cars (network of gas and service stations). Such disruption is fully expected and therefore effectively resisted by support net owners. In the long run, high (disruptive) technology bypasses, upgrades, or replaces the outdated support network.

Questioning the concept of a disruptive technology, Haxell (2012) questions how such technologies get named and framed, pointing out that this is a positioned and retrospective act.

Technology, being a form of social relationship, always evolves. No technology remains fixed. Technology starts, develops, persists, mutates, stagnates, and declines, just like living organisms. The evolutionary life cycle occurs in the use and development of any technology. A new high-technology core emerges and challenges existing technology support nets (TSNs), which are thus forced to coevolve with it. New versions of the core are designed and fitted into an increasingly appropriate TSN, with smaller and smaller high-technology effects. High technology becomes regular technology, with more efficient versions fitting the same support net. Finally, even the efficiency gains diminish, emphasis shifts to product tertiary attributes (appearance, style), and technology becomes TSN-preserving appropriate technology. This technological equilibrium state becomes established and fixated, resisting being interrupted by a technological mutation; then new high technology appears and the cycle is repeated.

Regarding this evolving process of technology, Christensen said:

The technological changes that damage established companies are usually not radically new or difficult from a technological point of view. They do, however, have two important characteristics: First, they typically present a different package of performance attributes—ones that, at least at the outset, are not valued by existing customers. Second, the performance attributes that existing customers do value improve at such a rapid rate that the new technology can later invade those established markets.

The World Bank's 2019 World Development Report on The Changing Nature of Work examines how technology shapes the relative demand for certain skills in labor markets and expands the reach of firms - robotics and digital technologies, for example, enable firms to automate, replacing labor with machines to become more efficient, and innovate, expanding the number of tasks and products. Joseph Bower explained the process of how disruptive technology, through its requisite support net, dramatically transforms a certain industry.

When a technology that has the potential to revolutionize an industry emerges, established companies typically see it as unattractive: it's not something their mainstream customers want, and its projected profit margins aren't sufficient to cover big-company cost structures. As a result, the new technology tends to get ignored in favor of what's currently popular with the best customers. But then another company steps in to bring the innovation to a new market. Once the disruptive technology becomes established there, smaller-scale innovation rapidly raises the technology's performance on attributes that mainstream customers value.

For example, the automobile was high technology with respect to the horse carriage. It evolved into technology and finally into appropriate technology with a stable, unchanging TSN. The main high-technology advance in the offing is some form of electric car, whether the energy source is the sun, hydrogen, water, air pressure, or a traditional charging outlet. Electric cars preceded the gasoline automobile by many decades and are now returning to replace the traditional gasoline automobile. The printing press was a development that changed the way information was stored, transmitted, and replicated. It empowered authors, but it also promoted censorship and information overload in writing technology.

Milan Zeleny described the above phenomenon. He also wrote that:

Implementing high technology is often resisted. This resistance is well understood on the part of active participants in the requisite TSN. The electric car will be resisted by gas-station operators in the same way automated teller machines (ATMs) were resisted by bank tellers and automobiles by horsewhip makers. Technology does not qualitatively restructure the TSN and therefore will not be resisted and never has been resisted. Middle management resists business process reengineering because BPR represents a direct assault on the support net (coordinative hierarchy) they thrive on. Teamwork and multi-functionality is resisted by those whose TSN provides the comfort of narrow specialization and command-driven work.

Social media could be considered a disruptive innovation within sports. More specifically, the way that news in sports circulates nowadays versus the pre-internet era where sports news was mainly on T.V., radio, and newspapers. Social media has created a new market for sports that was not around before in the sense that players and fans have instant access to information related to sports.

High-technology effects

High technology is a technology core that changes the very architecture (structure and organization) of the components of the technology support net. High technology therefore transforms the qualitative nature of the TSN's tasks and their relations, as well as their requisite physical, energy, and information flows. It also affects the skills required, the roles played, and the styles of management and coordination—the organizational culture itself.

This kind of technology core is different from regular technology core, which preserves the qualitative nature of flows and the structure of the support and only allows users to perform the same tasks in the same way, but faster, more reliably, in larger quantities, or more efficiently. It is also different from appropriate technology core, which preserves the TSN itself with the purpose of technology implementation and allows users to do the same thing in the same way at comparable levels of efficiency, instead of improving the efficiency of performance.

On differences between high and low technologies, Milan Zeleny wrote:

The effects of high technology always breaks the direct comparability by changing the system itself, therefore requiring new measures and new assessments of its productivity. High technology cannot be compared and evaluated with the existing technology purely on the basis of cost, net present value or return on investment. Only within an unchanging and relatively stable TSN would such direct financial comparability be meaningful. For example, you can directly compare a manual typewriter with an electric typewriter, but not a typewriter with a word processor. Therein lies the management challenge of high technology.

Not all modern technologies are high technologies, only those used and functioning as such, and embedded in their requisite TSNs. They have to empower the individual because only through the individual can they empower knowledge. Not all information technologies have integrative effects. Some information systems are still designed to improve the traditional hierarchy of command and thus preserve and entrench the existing TSN. The administrative model of management, for instance, further aggravates the division of task and labor, further specializes knowledge, separates management from workers, and concentrates information and knowledge in centers.

As knowledge surpasses capital, labor, and raw materials as the dominant economic resource, technologies are also starting to reflect this shift. Technologies are rapidly shifting from centralized hierarchies to distributed networks. Nowadays knowledge does not reside in a super-mind, super-book, or super-database, but in a complex relational pattern of networks brought forth to coordinate human action.

Internal auditor response

Internal audit plays a critical role in maintaining effective control and mitigating emerging risks. Businesses will increase risk or bypass opportunity if auditors do not address disruption-related risks. Michael G. Alles has argued that Big Data is a disruptive innovation that auditors must incorporate in practice. A 2019 study, Internal Auditors' Response to Disruptive Innovation, reports on the evolution of internal audit in reaction to such changes. Disruptions examined include data analytics, agile processes, cloud computing, robotic process automation, continuous auditing, regulatory change, and artificial intelligence.

Proactive approach

A proactive approach to addressing the challenge posed by disruptive innovations has been debated by scholars. Petzold criticized the lack of acknowledgment of the underlying process of change and called for studying disruptive innovation over time from a process view, complexifying the concept to support understanding of how it unfolds and to advance its manageability. Keeping in view the multidimensional nature of disruptive innovation, Guo developed a measurement framework to enable systematic assessment of the disruptive potential of innovations, providing insights for decisions on product/service launch and resource allocation. Middle managers play an important role in the long-term sustainability of any firm and have thus been studied as taking a proactive role in exploiting the disruptive innovation process.

Examples

In the practical world, the popularization of personal computers illustrates how knowledge contributes to the ongoing technology innovation. The original centralized concept (one computer, many persons) is a knowledge-defying idea of the prehistory of computing, and its inadequacies and failures have become clearly apparent. The era of personal computing brought powerful computers "on every desk" (one person, one computer). This short transitional period was necessary for getting used to the new computing environment, but was inadequate from the vantage point of producing knowledge. Adequate knowledge creation and management come mainly from networking and distributed computing (one person, many computers). Each person's computer must form an access point to the entire computing landscape or ecology through the Internet of other computers, databases, and mainframes, as well as production, distribution, and retailing facilities, and the like. For the first time, technology empowers individuals rather than external hierarchies. It transfers influence and power where it optimally belongs: at the loci of the useful knowledge. Even though hierarchies and bureaucracies do not innovate, free and empowered individuals do; knowledge, innovation, spontaneity, and self-reliance are becoming increasingly valued and promoted.

Uber is not an example of disruption because it did not originate in a low-end or new-market foothold. One of the conditions for a business to be considered disruptive, according to Clayton M. Christensen, is that it should originate in either a) a low-end or b) a new-market foothold. Instead, Uber was launched in San Francisco, a large city with an established taxi service, and it did not target low-end customers or create a new market (from the consumer perspective). In contrast, UberSELECT, an option that provides luxury cars such as limousines at a discounted price, is an example of disruptive innovation because it originates from the low-end customer segment: customers who would not otherwise have entered the traditional luxury market.

The technology adoption lifecycle is a sociological model that describes the adoption or acceptance of a new product or innovation, according to the demographic and psychological characteristics of defined adopter groups. The process of adoption over time is typically illustrated as a classical normal distribution or "bell curve". The model indicates that the first group of people to use a new product is called "innovators", followed by "early adopters". Next come the early majority and late majority, and the last group to eventually adopt a product are called "Laggards" or "phobics." For example, a phobic may only use a cloud service when it is the only remaining method of performing a required task, but the phobic may not have an in-depth technical knowledge of how to use the service.
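
The bell-curve framing can be made concrete by cutting the adoption-time distribution at whole standard deviations from the mean, which is the conventional way the category sizes are derived. The resulting percentages (roughly 2.5%, 13.5%, 34%, 34% and 16%) are not quoted in the text above and are shown here only as an illustration of the construction.

```python
# Sketch: derive the conventional adopter-category shares from a normal
# distribution of adoption times, cut at fixed standard-deviation boundaries.

from math import erf, sqrt

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

cutoffs = {
    "innovators":     (float("-inf"), -2.0),
    "early adopters": (-2.0, -1.0),
    "early majority": (-1.0,  0.0),
    "late majority":  ( 0.0,  1.0),
    "laggards":       ( 1.0, float("inf")),
}

for group, (lo, hi) in cutoffs.items():
    lo_cdf = 0.0 if lo == float("-inf") else norm_cdf(lo)
    hi_cdf = 1.0 if hi == float("inf") else norm_cdf(hi)
    print(f"{group:15s} ~{100 * (hi_cdf - lo_cdf):.1f}% of eventual adopters")
```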

The demographic and psychological (or "psychographic") profiles of each adoption group were originally specified in 1957 by agricultural researchers Beal and Bohlen for the North Central Rural Sociology Committee (Subcommittee for the Study of the Diffusion of Farm Practices).

The report summarized the categories as:

  • innovators – had larger farms, were more educated, more prosperous and more risk-oriented
  • early adopters – younger, more educated, tended to be community leaders, less prosperous
  • early majority – more conservative but open to new ideas, active in the community and an influence on their neighbors
  • late majority – older, less educated, fairly conservative and less socially active
  • laggards – very conservative, had small farms and capital, oldest and least educated

The model has subsequently been adapted for many areas of technology adoption in the late 20th century.

Adaptations of the model

The model has spawned a range of adaptations that extend the concept or apply it to specific domains of interest.

In his book Crossing the Chasm, Geoffrey Moore proposes a variation of the original lifecycle. He suggests that for discontinuous innovations, which may result in a Foster disruption based on an s-curve, there is a gap or chasm between the first two adopter groups (innovators and early adopters) and the vertical markets that follow.

Disruption as the term is used today is of the Clayton M. Christensen variety; these disruptions are not s-curve based.

In educational technology, Lindy McKeown has provided a similar model (a pencil metaphor) describing the Information and Communications Technology uptake in education.

In medical sociology, Carl May has proposed normalization process theory that shows how technologies become embedded and integrated in health care and other kinds of organization.

Wenger, White and Smith, in their book Digital habitats: Stewarding technology for communities, talk of technology stewards: people with sufficient understanding of the technology available and the technological needs of a community to steward the community through the technology adoption process.

Rayna and Striukova (2009) propose that the choice of the initial market segment is crucial for crossing the chasm, as adoption in this segment can lead to a cascade of adoption in the other segments. This initial market segment must, at the same time, contain a large proportion of visionaries, be small enough for adoption to be observable from within the segment and from other segments, and be sufficiently connected with other segments. If this is the case, adoption in the first segment progressively cascades into the adjacent segments, thereby triggering adoption by the mass market.

Stephen L. Parente (1995) implemented a Markov Chain to model economic growth across different countries given different technological barriers.
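
As a purely illustrative sketch (not Parente's actual model), a Markov chain of this kind can be simulated in a few lines of Python; the three technology states and the transition probabilities below are invented for the example.

import random

# Hypothetical three-state Markov chain over a country's technology level.
# The states and transition probabilities are invented for illustration and
# are not taken from Parente (1995).
states = ["low", "medium", "high"]
transition = {
    "low":    [0.70, 0.25, 0.05],   # P(next state | current = "low"), ordered as `states`
    "medium": [0.10, 0.60, 0.30],
    "high":   [0.02, 0.08, 0.90],
}

def simulate(start, steps, seed=0):
    """Draw one trajectory of technology states over `steps` periods."""
    rng = random.Random(seed)
    path, state = [start], start
    for _ in range(steps):
        state = rng.choices(states, weights=transition[state])[0]
        path.append(state)
    return path

print(simulate("low", 20))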

In product marketing, Warren Schirtzinger proposed an expansion of the original lifecycle (the Customer Alignment Lifecycle), which describes the configuration of five different business disciplines that follow the sequence of technology adoption.

Examples

One way to model product adoption is to understand that people's behaviors are influenced by their peers and how widespread they think a particular action is. For many format-dependent technologies, people have a non-zero payoff for adopting the same technology as their closest friends or colleagues. If two users both adopt product A, they might get a payoff a > 0; if they adopt product B, they get b > 0. But if one adopts A and the other adopts B, they both get a payoff of 0.

A threshold can be set for each user to adopt a product. Say that a node v in a graph has d neighbors: then v will adopt product A if the fraction p of its neighbors that have adopted A is greater than or equal to some threshold. For example, if v's threshold is 2/3 and only one of its two neighbors adopts product A, then v will not adopt A. Using this model, we can deterministically model product adoption on sample networks.
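
A minimal sketch of this deterministic threshold cascade is shown below; the toy graph, payoffs and seed adopters are invented for illustration, and the threshold q = b/(a + b) follows from comparing the payoff p·d·a of switching to A against (1 − p)·d·b of staying with B.

# Deterministic threshold cascade on a toy network; the graph, payoffs and seed
# set are invented for illustration. Each node adopts A once at least a fraction
# q = b / (a + b) of its neighbours have adopted A.
a, b = 3, 2            # payoffs for a matched A-A or B-B link
q = b / (a + b)        # adoption threshold (0.4 here)

graph = {              # undirected toy network as adjacency lists
    1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 5],
    4: [2, 5, 6], 5: [3, 4, 6], 6: [4, 5],
}
adopters = {1, 2}      # initial seed set already using product A

changed = True
while changed:         # repeat sweeps until no node flips
    changed = False
    for v, neighbours in graph.items():
        if v in adopters:
            continue
        frac = sum(n in adopters for n in neighbours) / len(neighbours)
        if frac >= q:
            adopters.add(v)
            changed = True

print(sorted(adopters))   # nodes that end up adopting product A -> [1, 2, 3]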

History

The technology adoption lifecycle is a sociological model that extends an earlier model called the diffusion process, first published in 1957 by Joe M. Bohlen, George M. Beal and Everett M. Rogers at Iowa State University, originally only for its application to agriculture and home economics, and building on earlier research conducted there by Neal C. Gross and Bryce Ryan. Their original purpose was to track the purchase patterns of hybrid seed corn by farmers.

Bohlen, Beal and Rogers together developed a model called the diffusion process and later, Rogers generalized the use of it in his widely acclaimed 1962 book Diffusion of Innovations (now in its fifth edition), describing how new ideas and technologies spread in different cultures. Others have since used the model to describe how innovations spread between states in the U.S.

Reporters Without Borders

From Wikipedia, the free encyclopedia

Reporters Without Borders (Reporters Sans Frontières)
Formation: 1985
Founders: Robert Ménard, Rémy Loury, Jacques Molénat and Émilien Jubineau
Type: Nonprofit organization, non-governmental organization with consultant status at the United Nations
Headquarters: Paris, France
Director General: Christophe Deloire (since July 2012)
Key people: Christophe Deloire, Secretary General; Pierre Haski, President RSF France; Mickael Rediske, President RSF Germany; Christian Mihr, CEO RSF Germany; Rubina Möhring, President RSF Austria; Alfonso Armada, President RSF Spain; Gérard Tschopp, President RSF Switzerland; Erik Halkjær, President RSF Sweden; Jarmo Mäkelä, President RSF Finland
Budget: €6 million (RSF France)
Staff: Approximately 100
Website: rsf.org/en
Protest action in Paris, April 2008, displaying a 'Reporters Without Borders (RSF)' flag depicting the Olympic rings in the form of handcuffs or padlocks, along with the legend 'Beijing 2008'.

Reporters Without Borders (RWB; French: Reporters sans frontières; RSF) is an international non-profit and non-governmental organization with the stated aim of safeguarding the right to freedom of information. It describes its advocacy as founded on the belief that everyone requires access to news and information, in line with Article 19 of the Universal Declaration of Human Rights, which recognizes the right to receive and share information regardless of frontiers, along with other international rights charters. RSF has consultative status at the United Nations, UNESCO, the Council of Europe, and the International Organisation of the Francophonie.

Activities

RSF works on the ground in defence of individual journalists at risk and also at the highest levels of government and international forums to defend the right to freedom of expression and information. It provides daily briefings and press releases on threats to media freedom in French, English, Spanish, Portuguese, Arabic, Persian and Chinese and publishes an annual press freedom round-up, the World Press Freedom Index, which measures the state of media freedom in 180 countries. The organization provides assistance to journalists at risk and training in digital and physical security, as well as campaigning to raise public awareness of abuse against journalists and to secure their safety and liberty. RSF lobbies governments and international bodies to adopt standards and legislation in support of media freedom and takes legal action in defence of journalists under threat. In addition, RSF keeps a yearly count of journalists killed on the job.

To mark World Day Against Cyber-Censorship on 12 March, Reporters Without Borders (RSF) unveiled a list of 20 Digital Predators of Press Freedom and announced that it was unblocking access to a total of 21 websites in the sixth year of its Operation Collateral Freedom.

History

Head office in Paris

RSF was founded in Montpellier, France, in 1985 by Robert Ménard, Rémy Loury, Jacques Molénat and Émilien Jubineau. It was registered as a non-profit organization in 1995. Ménard was RSF's first secretary-general and was succeeded by Jean-François Julliard. Christophe Deloire was appointed secretary-general in 2012.

Structure

RSF's head office is based in Paris. It has 13 regional and national offices, including Brussels, London, Washington, Berlin, Rio de Janeiro, Taipei and Dakar, and a network of 146 correspondents. It employs 57 salaried staff in Paris and internationally. A board of governors, elected from RSF's members, approves the organization's policies. An International Council has oversight of the organization's activities and approves the accounts and budget.

Advocacy

World Press Freedom Index

Map of the 2022 World Press Freedom Index, with countries classified as good, satisfactory, problematic, difficult, very serious, or not classified.

Information and Democracy Initiative

In 2018, RSF launched the Information and Democracy Commission to introduce new guarantees for freedom of opinion and expression in the global space of information and communication. In a joint mission statement, the commission's presidents, RSF secretary-general Christophe Deloire and Nobel laureate Shirin Ebadi, identified a range of factors currently threatening that freedom, including political control of the media, subjugation of news and information to private interests, the growing influence of corporate actors, online mass disinformation and the erosion of quality journalism.

This Commission published the International Declaration on Information and Democracy to state principles, define objectives and propose forms of governance for the global online space for information and communication. The Declaration emphasised that corporate entities with a structural function in the global space have duties, especially as regards political and ideological neutrality, pluralism and accountability. It called for recognition of the right to information that is diverse, independent and reliable in order to form opinions freely and participate fully in the democratic debate.

At the Paris Peace Forum in 2018, 12 countries launched a political process aimed at providing democratic guarantees for news and information and freedom of opinion, based on the principles set out in the Declaration.

Journalism Trust Initiative

RSF launched the Journalism Trust Initiative (JTI) in 2018 with its partners the European Broadcasting Union (EBU), Agence France Presse (AFP) and the Global Editors Network (GEN). JTI defines indicators for trustworthy journalism and rewards compliance, bringing tangible benefits for all media outlets and supporting them in creating a healthy space for information. JTI distinguishes itself from similar initiatives by focusing on the process of journalism rather than content alone. Media outlets will be expected to comply with standards that include transparency of ownership, sources of revenue and proof of a range of professional safeguards.

Actions

RSF's defence of journalistic freedom includes international missions, the publication of country reports, training of journalists and public protests. Recent global advocacy and practical interventions have included: opening a centre for women journalists in Afghanistan in 2017, a creative protest with street-artist C215 in Strasbourg for Turkish journalists in detention, turning off the Eiffel Tower lights in tribute to murdered Saudi journalist Jamal Khashoggi and providing training to journalists and bloggers in Syria. In July 2018, RSF sent a mission to Saudi Arabia to call for the release of 30 journalists. The organization publishes a gallery of Predators of Press Freedom, highlighting the most egregious international violators of press freedom. It also maintains an online Press Freedom Barometer, monitoring the number of journalists, media workers and citizen journalists killed or imprisoned. Its programme Operation Collateral Freedom, launched in 2014, provides alternative access to censored websites by creating mirror sites: 22 sites have been unblocked in 12 countries, including Iran, China, Saudi Arabia and Vietnam. RSF offers grants to journalists at risk and supports media workers in need of refuge and protection.

Cumhuriyet's former editor-in-chief Can Dündar receiving the 2015 RSF Prize. Shortly thereafter, he was arrested.

Prizes

RSF's annual Press Freedom Prize, created in 1992, honours courageous and independent journalists who have faced threats or imprisonment for their work and who have challenged the abuse of power. TV5-Monde is a partner in the prize.

A Netizen Prize was introduced in 2010, in partnership with Google, recognizing individuals, including bloggers and cyber-dissidents, who have advanced freedom of information online through investigative reporting or other initiatives.

In 2018, RSF launched new categories for the Press Freedom Prize: courage, independence and impact.

Every few years, RSF also distributes Press freedom predator anti-awards.

Press Freedom Prizewinners, 1992–2020

Netizen Prize

RWB 2011 Netizen Prize
  • 2010: Change for Equality website, www.we-change.org, women's rights activists, Iran
  • 2011: Nawaat.org, bloggers, Tunisia
  • 2012: Local Coordination Committees of Syria, media centre, citizen journalists and activists, Syria
  • 2013: Huynh Ngoc Chenh, blogger, Vietnam
  • 2014: Raif Badawi, blogger, Saudi Arabia
  • 2015: Zone9, blogger collective, Ethiopia
  • 2016: Lu Yuyu and Li Tingyu, citizen journalists, China

Annual reports

RSF issues a report annually.

RSF reported that 67 journalists were killed, 879 were arrested and 38 were abducted in 2012. The number of journalists killed worldwide in 2014 was 66, two-thirds of whom were killed in war zones. The deadliest areas for journalists in 2014 were Syria, Palestine, Ukraine, Iraq and Libya. The number of journalists convicted by their government rose to 178 in 2014, most of them in Egypt, Ukraine, China, Eritrea and Iran. RSF said that 110 journalists were killed in the course of their work in 2015. In 2016, RSF stated that there were 348 imprisoned journalists and 52 hostages. Nearly two-thirds of imprisoned journalists were in Turkey, China, Syria, Egypt and Iran. RSF's 2017 annual report stated that 65 journalists were killed, 326 journalists were imprisoned and 54 journalists were taken hostage during the year. RSF's 2018 report stated that over 80 journalists were killed, 348 were currently imprisoned, and another 60 were being held hostage.

Publications

In addition to its country, regional and thematic reports, RSF publishes a photography book, 100 Photos for Press Freedom, three times a year as an advocacy tool and fundraiser. It is a significant source of income for the organization, raising nearly a quarter of its funds in 2018.

Selected reports

  • 2016 Freedom of expression under state of emergency, Turkey (with ARTICLE 19 and others)
  • 2016 When oligarchs go shopping
  • 2017 Who owns the media?
  • 2017 Media Ownership Monitor, Ukraine (with Ukrainian Institute of Mass Information)
  • 2018 Women's Rights: forbidden subject
  • 2018 Journalists: the bête noire of organized crime
  • 2018 Cambodia: independent press in ruins
  • 2019 China's Pursuit of a New World Media Order
  • 2019 Media Ownership Monitor, Pakistan (with Freedom Network)

Statements

On 22 February 2020, RSF issued a statement condemning the IRGC's call for journalists to be detained in Iran. IRGC intelligence has summoned some journalists and banned any media activities. Reporters Without Borders described the IRGC's intelligence action as "arbitrary and illegal" and aimed at "preventing journalists from being informed on social media."

Following the coronavirus outbreak in Iran, RSF issued a statement on 6 March expressing concern over the health of imprisoned journalists.

On 16 April 2020, RSF wrote to two United Nations special rapporteurs on Freedom of Expression and Health, urging the United Nations to issue serious warnings to governments that restrict freedom of expression in the context of the coronavirus epidemic. The letter, signed by RSF Director Christian Mihr, stated: "Freedom of the press and access to information are more important than ever at the time of Corona's pandemic."

On 21 April 2020, the Paris-based RSF said that the pandemic had amplified and highlighted many crises and had overshadowed freedom of the press. The high representative of the EU, Josep Borrell, stated that the pandemic should not be used to justify limiting democratic and civil freedoms and that the rule of law and international commitments should be respected. He said that freedom of speech and access to information should not be limited and that measures taken against the pandemic should not be used to restrict human rights advocates, reporters, media staff or institutions of civil society.

On 25 June 2020, RSF issued a statement entitled "Enforced online repentance, Iran's new method of repression". According to the report, the Revolutionary Guards summoned a number of journalists, writers and human rights activists and threatened to detain them, forcing them to express regret or apologize for publishing their views online in order to silence them. The organization condemned the pressure and threats and the silencing of these social activists.

Funding

RSF's budget for 2018 totalled €6.1m. 51% of the organization's income came from public subsidies, 12% from private funds, 16% from commercial activities, 14% from sponsorships and 3% from public donations. Organizations supporting RSF's work through services include American Express, Société Générale, the Swedish International Development Cooperation Agency and the Ford Foundation.

RSF has been criticised for accepting funding from the National Endowment for Democracy in the US and the Center for a Free Cuba. In response, Secretary-general Robert Ménard stated that funding from NED totalled 0.92 per cent of RSF's budget and was used to support African journalists and their families. RSF stated that it ceased its relationship with the Center for a Free Cuba in 2008.

Recognitions

RSF has received multiple international awards honouring its achievements.

RSF was criticized for accepting the Dan David Prize, awarded by the Dan David Foundation in Israel, in light of the Palestinian journalists allegedly killed or arrested in Gaza.

Entropy (information theory)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Entropy_(information_theory) In info...