
BitTorrent

From Wikipedia, the free encyclopedia

BitTorrent, also referred to simply as torrent, is a communication protocol for peer-to-peer file sharing (P2P), which enables users to distribute data and electronic files over the Internet in a decentralized manner. The protocol is developed and maintained by Rainberry, Inc., and was first released in 2001.

To send or receive files, users run a BitTorrent client on their Internet-connected computer; clients are available for a variety of computing platforms and operating systems, including an official client. BitTorrent trackers provide a list of files available for transfer and allow the client to find peer users, known as "seeds", who may transfer the files. BitTorrent downloading is considered to be faster than HTTP ("direct downloading") and FTP due to the lack of a central server that could limit bandwidth.

BitTorrent is one of the most common protocols for transferring large files, such as digital video files containing TV shows and video clips, or digital audio files. BitTorrent accounted for a third of all internet traffic in 2004, according to a study by Cachelogic. As recently as 2019 BitTorrent remained a significant file sharing protocol according to Sandvine, generating a substantial amount of Internet traffic, with 2.46% of downstream, and 27.58% of upstream traffic, although this share has declined significantly since then.

History

The middle computer is acting as a "seed" to provide a file to the other computers which act as peers.

Programmer Bram Cohen, a University at Buffalo alumnus, designed the protocol in April 2001, and released the first available version on 2 July 2001. Cohen and Ashwin Navin founded BitTorrent, Inc. (later renamed Rainberry, Inc.) to further develop the technology in 2004.

The first release of the BitTorrent client had no search engine and no peer exchange. Until 2005, the only way to share files was to create a small text file called a "torrent" and upload it to a torrent index site. These files contain metadata about the files to be shared and about the trackers that keep track of the other seeds and peers. The first uploader acted as a seed, and downloaders would initially connect as peers. Those who wished to download the file would download the torrent, which their client would use to connect to a tracker holding a list of the IP addresses of other seeds and peers in the swarm. Once a peer had downloaded the complete file, it could in turn function as a seed.

In 2005, first Vuze and then the BitTorrent client introduced distributed tracking using distributed hash tables which allowed clients to exchange data on swarms directly without the need for a torrent file.

In 2006, peer exchange functionality was added allowing clients to add peers based on the data found on connected nodes.

In 2017, BitTorrent, Inc. released the BitTorrent v2 protocol specification. BitTorrent v2 is intended to work seamlessly with previous versions of the BitTorrent protocol. The main reason for the update was that the old cryptographic hash function, SHA-1, is no longer considered by the developers to be safe from malicious attacks, and as such, v2 uses SHA-256. To ensure backwards compatibility, the v2 .torrent file format supports a hybrid mode where torrents are hashed through both the new and the old method, with the intent that the files will be shared with peers on both v1 and v2 swarms. Another update to the specification adds a hash tree to speed up the time from adding a torrent to downloading files, and to allow more granular checks for file corruption. In addition, each file is now hashed individually, enabling files in the swarm to be deduplicated, so that if multiple torrents include the same files but seeders are only seeding some of them, downloaders of the other torrents can still download the file. Per-file hashes can also be published by trackers and torrent indexing services, allowing users to find swarms by searching for the hashes of the files they contain. These hashes are the roots of the per-file hash trees rather than plain SHA-256 hashes of the files, and can be obtained using tools. Magnet links for v2 also support a hybrid mode to ensure support for legacy clients.
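The v2 hash-tree idea can be sketched as follows. This is a simplified illustration rather than the exact BEP 52 algorithm: it hashes a payload in 16 KiB blocks with SHA-256, pads the leaf layer to a power of two with zero-hashes, and combines pairs upward to a single root (the real specification additionally builds per-piece layers and pads with hashes of zero-filled blocks).

```python
import hashlib

BLOCK_SIZE = 16 * 1024  # v2 hashes files in 16 KiB blocks

def merkle_root(data: bytes) -> bytes:
    """Simplified sketch of a v2-style hash tree: SHA-256 leaves over
    16 KiB blocks, padded to a power of two, combined pairwise to a root."""
    blocks = [data[i:i + BLOCK_SIZE]
              for i in range(0, len(data), BLOCK_SIZE)] or [b""]
    layer = [hashlib.sha256(b).digest() for b in blocks]
    # Pad the leaf layer to a power of two (simplification of BEP 52 padding).
    while len(layer) & (len(layer) - 1):
        layer.append(b"\x00" * 32)
    # Combine adjacent pairs until a single 32-byte root remains.
    while len(layer) > 1:
        layer = [hashlib.sha256(layer[i] + layer[i + 1]).digest()
                 for i in range(0, len(layer), 2)]
    return layer[0]
```

Because each file gets its own root, two torrents containing the same file produce the same per-file root, which is what makes the cross-torrent deduplication described above possible.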

Design

Animation of protocol use: The colored dots beneath each computer in the animation represent different parts of the file being shared. By the time a copy to a destination computer of each of those parts completes, a copy to another destination computer of that part (or other parts) is already taking place between users.

The BitTorrent protocol can be used to reduce the server and network impact of distributing large files. Rather than downloading a file from a single source server, the BitTorrent protocol allows users to join a "swarm" of hosts to upload and download from each other simultaneously. The protocol is an alternative to the older single source, multiple mirror sources technique for distributing data, and can work effectively over networks with lower bandwidth. Using the BitTorrent protocol, several basic computers, such as home computers, can replace large servers while efficiently distributing files to many recipients. This lower bandwidth usage also helps prevent large spikes in internet traffic in a given area, keeping internet speeds higher for all users in general, regardless of whether or not they use the BitTorrent protocol.

The file being distributed is divided into segments called pieces. As each peer receives a new piece of the file, it becomes a source (of that piece) for other peers, relieving the original seed from having to send that piece to every computer or user wishing to obtain a copy. With BitTorrent, the task of distributing the file is shared by those who want it; it is entirely possible for the seed to send only a single copy of the file itself, and for the file eventually to reach an unlimited number of peers. Each piece is protected by a cryptographic hash contained in the torrent descriptor. This ensures that any modification of the piece can be reliably detected, and thus prevents both accidental and malicious modifications of any of the pieces received at other nodes. If a node starts with an authentic copy of the torrent descriptor, it can verify the authenticity of the entire file it receives.
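Piece verification is straightforward: a client recomputes the hash of each received piece and compares it with the digest recorded in the torrent descriptor. A minimal sketch for a v1 (SHA-1) torrent:

```python
import hashlib

def verify_piece(piece_data: bytes, expected_sha1: bytes) -> bool:
    """Check a received piece against the 20-byte SHA-1 digest stored in
    the torrent descriptor; corrupt or tampered pieces fail and are
    re-requested from other peers."""
    return hashlib.sha1(piece_data).digest() == expected_sha1
```

A single flipped bit anywhere in the piece changes the digest, so the client never has to trust the peer that sent it, only the descriptor.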

Pieces are typically downloaded non-sequentially, and are rearranged into the correct order by the BitTorrent client, which monitors which pieces it needs, and which pieces it has and can upload to other peers. Pieces are of the same size throughout a single download (for example, a 10 MB file may be transmitted as ten 1 MB pieces or as forty 256 KB pieces). Due to the nature of this approach, the download of any file can be halted at any time and be resumed at a later date, without the loss of previously downloaded information, which in turn makes BitTorrent particularly useful in the transfer of larger files. This also enables the client to seek out readily available pieces and download them immediately, rather than halting the download and waiting for the next (and possibly unavailable) piece in line, which typically reduces the overall time of the download. This eventual transition from peers to seeders determines the overall "health" of the file (as determined by the number of times a file is available in its complete form).

The distributed nature of BitTorrent can lead to a flood-like spreading of a file throughout many peer computer nodes. As more peers join the swarm, the likelihood of a successful download by any particular node increases. Relative to traditional Internet distribution schemes, this permits a significant reduction in the original distributor's hardware and bandwidth resource costs. Distributed downloading protocols in general provide redundancy against system problems, reduce dependence on the original distributor, and provide sources for the file which are generally transient and therefore there is no single point of failure as in one way server-client transfers.

Though both ultimately transfer files over a network, a BitTorrent download differs from a one way server-client download (as is typical with an HTTP or FTP request, for example) in several fundamental ways:

  • BitTorrent makes many small data requests over different IP connections to different machines, while server-client downloading is typically made via a single TCP connection to a single machine.
  • BitTorrent downloads pieces in a random or "rarest-first" order, which ensures high availability, while classic downloads are sequential.
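The rarest-first heuristic above can be sketched in a few lines; the function name and data shapes here are illustrative, not taken from any particular client:

```python
from collections import Counter

def rarest_first(needed, peer_bitfields):
    """Pick the needed piece held by the fewest connected peers.
    `needed` is a set of piece indices we still lack; `peer_bitfields`
    maps a peer id to the set of piece indices that peer advertises."""
    availability = Counter()
    for pieces in peer_bitfields.values():
        availability.update(pieces & needed)
    if not availability:
        return None  # no connected peer has any piece we need
    # Fewest holders first; piece index as a deterministic tie-breaker.
    return min(availability, key=lambda p: (availability[p], p))
```

Preferring rare pieces keeps every piece well replicated across the swarm, so the loss of any single peer (including the original seed) is less likely to make the file incomplete.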

Taken together, these differences allow BitTorrent to achieve much lower cost to the content provider, much higher redundancy, and much greater resistance to abuse or to "flash crowds" than regular server software. However, this protection, theoretically, comes at a cost: downloads can take time to rise to full speed because it may take time for enough peer connections to be established, and it may take time for a node to receive sufficient data to become an effective uploader. This contrasts with regular downloads (such as from an HTTP server, for example) that, while more vulnerable to overload and abuse, rise to full speed very quickly, and maintain this speed throughout. In the beginning, BitTorrent's non-contiguous download methods made it harder to support "streaming playback". In 2014, the client Popcorn Time allowed for streaming of BitTorrent video files. Since then, more and more clients are offering streaming options.

Searching

The BitTorrent protocol provides no way to index torrent files. As a result, a comparatively small number of websites have hosted a large majority of torrents, many linking to copyrighted works without the authorization of copyright holders, rendering those sites especially vulnerable to lawsuits. A BitTorrent index is a "list of .torrent files, which typically includes descriptions" and information about the torrent's content. Several types of websites support the discovery and distribution of data on the BitTorrent network. Public torrent-hosting sites such as The Pirate Bay allow users to search and download from their collection of torrent files. Users can typically also upload torrent files for content they wish to distribute. Often, these sites also run BitTorrent trackers for their hosted torrent files, but these two functions are not mutually dependent: a torrent file could be hosted on one site and tracked by another unrelated site. Private host/tracker sites operate like public ones except that they may restrict access to registered users and may also keep track of the amount of data each user uploads and downloads, in an attempt to reduce "leeching".

Web search engines allow the discovery of torrent files that are hosted and tracked on other sites; examples include The Pirate Bay and BTDigg. These sites allow the user to ask for content meeting specific criteria (such as containing a given word or phrase) and retrieve a list of links to torrent files matching those criteria. This list can often be sorted with respect to several criteria, relevance (seeders to leechers ratio) being one of the most popular and useful (due to the way the protocol behaves, the download bandwidth achievable is very sensitive to this value). Metasearch engines allow one to search several BitTorrent indices and search engines at once.

The Tribler BitTorrent client was among the first to incorporate built-in search capabilities. With Tribler, users can find .torrent files held by random peers and taste buddies. It adds such an ability to the BitTorrent protocol using a gossip protocol, somewhat similar to the eXeem network which was shut down in 2005. The software includes the ability to recommend content as well. After a dozen downloads, the Tribler software can roughly estimate the download taste of the user, and recommend additional content.

In May 2007, researchers at Cornell University published a paper proposing a new approach to searching a peer-to-peer network for inexact strings, which could replace the functionality of a central indexing site. A year later, the same team implemented the system as a plugin for Vuze called Cubit and published a follow-up paper reporting its success.

A somewhat similar facility but with a slightly different approach is provided by the BitComet client through its "Torrent Exchange" feature. Whenever two peers using BitComet (with Torrent Exchange enabled) connect to each other they exchange lists of all the torrents (name and info-hash) they have in the Torrent Share storage (torrent files which were previously downloaded and for which the user chose to enable sharing by Torrent Exchange). Thus each client builds up a list of all the torrents shared by the peers it connected to in the current session (or it can even maintain the list between sessions if instructed).

At any time the user can search into that Torrent Collection list for a certain torrent and sort the list by categories. When the user chooses to download a torrent from that list, the .torrent file is automatically searched for (by info-hash value) in the DHT Network and when found it is downloaded by the querying client which can subsequently create and initiate a downloading task.

Downloading and sharing

Users find a torrent of interest on a torrent index site or by using a search engine built into the client, download it, and open it with a BitTorrent client. The client connects to the tracker(s) or seeds specified in the torrent file, from which it receives a list of seeds and peers currently transferring pieces of the file(s). The client connects to those peers to obtain the various pieces. If the swarm contains only the initial seeder, the client connects directly to it, and begins to request pieces. Clients incorporate mechanisms to optimize their download and upload rates.

The effectiveness of this data exchange depends largely on the policies that clients use to determine to whom to send data. Clients may prefer to send data to peers that send data back to them (a "tit for tat" exchange scheme), which encourages fair trading. But strict policies often result in suboptimal situations, such as when newly joined peers are unable to receive any data because they do not have any pieces yet to trade themselves or when two peers with a good connection between them do not exchange data simply because neither of them takes the initiative. To counter these effects, the official BitTorrent client program uses a mechanism called "optimistic unchoking", whereby the client reserves a portion of its available bandwidth for sending pieces to random peers (not necessarily known good partners, or "preferred peers") in hopes of discovering even better partners and to ensure that newcomers get a chance to join the swarm.
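A simplified sketch of this policy follows; the slot count and data shapes are illustrative assumptions, not the official client's exact algorithm:

```python
import random

def choose_unchoked(peers, upload_rate, regular_slots=3):
    """Tit-for-tat with one optimistic unchoke: keep upload slots for the
    peers that have recently sent us the most data, plus one random choked
    peer so that newcomers with nothing yet to trade can still bootstrap.
    `upload_rate` maps a peer id to bytes/s that peer has sent us."""
    ranked = sorted(peers, key=lambda p: upload_rate.get(p, 0), reverse=True)
    unchoked = ranked[:regular_slots]          # reward reciprocating peers
    choked = [p for p in peers if p not in unchoked]
    if choked:
        unchoked.append(random.choice(choked))  # optimistic unchoke
    return unchoked
```

The random slot is what lets a client discover a partner that would upload faster than its current set, and what gives a brand-new peer its first pieces.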

Although "swarming" scales well to tolerate "flash crowds" for popular content, it is less useful for unpopular or niche market content. Peers arriving after the initial rush might find the content unavailable and need to wait for the arrival of a "seed" in order to complete their downloads. The seed arrival, in turn, may take long to happen (this is termed the "seeder promotion problem"). Since maintaining seeds for unpopular content entails high bandwidth and administrative costs, this runs counter to the goals of publishers that value BitTorrent as a cheap alternative to a client-server approach. This occurs on a huge scale; measurements have shown that 38% of all new torrents become unavailable within the first month. A strategy adopted by many publishers which significantly increases availability of unpopular content consists of bundling multiple files in a single swarm. More sophisticated solutions have also been proposed; generally, these use cross-torrent mechanisms through which multiple torrents can cooperate to better share content.

Creating and publishing

The peer distributing a data file treats the file as a number of identically sized pieces, usually with byte sizes of a power of 2, and typically between 32 KB and 16 MB each. The peer creates a hash for each piece, using the SHA-1 hash function, and records it in the torrent file. Pieces with sizes greater than 512 KB will reduce the size of a torrent file for a very large payload, but are claimed to reduce the efficiency of the protocol. When another peer later receives a particular piece, the hash of the piece is compared to the recorded hash to test that the piece is error-free. Peers that provide a complete file are called seeders, and the peer providing the initial copy is called the initial seeder. The exact information contained in the torrent file depends on the version of the BitTorrent protocol.
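The per-piece hashing step can be sketched as follows; the 256 KB piece length is just one common choice:

```python
import hashlib

def piece_hashes(data: bytes, piece_length: int = 256 * 1024) -> bytes:
    """Split a payload into fixed-size pieces and concatenate the 20-byte
    SHA-1 digest of each, as stored in the 'pieces' field of a v1 torrent.
    The final piece may be shorter than piece_length."""
    out = b""
    for i in range(0, len(data), piece_length):
        out += hashlib.sha1(data[i:i + piece_length]).digest()
    return out
```

The trade-off described above falls out directly: doubling the piece length halves the number of 20-byte digests the torrent file must carry, at the cost of coarser-grained verification and sharing.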

By convention, the name of a torrent file has the suffix .torrent. Torrent files use the Bencode file format, and contain an "announce" section, which specifies the URL of the tracker, and an "info" section, containing (suggested) names for the files, their lengths, the piece length used, and a SHA-1 hash code for each piece, all of which are used by clients to verify the integrity of the data they receive. Though SHA-1 has shown signs of cryptographic weakness, Bram Cohen did not initially consider the risk big enough for a backward incompatible change to, for example, SHA-3. As of BitTorrent v2 the hash function has been updated to SHA-256.
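Bencode is simple enough that a decoder fits in a few lines. The following minimal sketch handles the four bencoded types found in .torrent files (integers, byte strings, lists, and dictionaries) and omits error handling:

```python
def bdecode(data: bytes, i: int = 0):
    """Minimal bencode decoder. Returns (value, next_index) so that
    nested structures can be parsed recursively."""
    c = data[i:i + 1]
    if c == b"i":                          # integer: i42e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                          # list: l...e
        items, i = [], i + 1
        while data[i:i + 1] != b"e":
            v, i = bdecode(data, i)
            items.append(v)
        return items, i + 1
    if c == b"d":                          # dictionary: d...e (byte-string keys)
        d, i = {}, i + 1
        while data[i:i + 1] != b"e":
            k, i = bdecode(data, i)
            v, i = bdecode(data, i)
            d[k] = v
        return d, i + 1
    colon = data.index(b":", i)            # byte string: 4:spam
    n = int(data[i:colon])
    return data[colon + 1:colon + 1 + n], colon + 1 + n
```

Parsing a real .torrent file would yield a dictionary with b"announce" and b"info" keys as described above; the info-hash that identifies a swarm is the SHA-1 digest of the re-encoded "info" dictionary.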

In the early days, torrent files were typically published to torrent index websites, and registered with at least one tracker. The tracker maintained lists of the clients currently connected to the swarm. Alternatively, in a trackerless system (decentralized tracking) every peer acts as a tracker. Azureus was the first BitTorrent client to implement such a system through the distributed hash table (DHT) method. An alternative and incompatible DHT system, known as Mainline DHT, was released in the Mainline BitTorrent client three weeks later (though it had been in development since 2002) and subsequently adopted by the μTorrent, Transmission, rTorrent, KTorrent, BitComet, and Deluge clients.

After the DHT was adopted, a "private" flag – analogous to the broadcast flag – was unofficially introduced, telling clients to restrict the use of decentralized tracking regardless of the user's desires. The flag is intentionally placed in the info section of the torrent so that it cannot be disabled or removed without changing the identity of the torrent. The purpose of the flag is to prevent torrents from being shared with clients that do not have access to the tracker. The flag was requested for inclusion in the official specification in August 2008, but has not been accepted yet. Clients that have ignored the private flag were banned by many trackers, discouraging the practice.

Anonymity

BitTorrent does not, on its own, offer its users anonymity. One can usually see the IP addresses of all peers in a swarm in one's own client or firewall program. This may expose users with insecure systems to attacks. In some countries, copyright organizations scrape lists of peers, and send takedown notices to the internet service provider of users participating in the swarms of files that are under copyright. In some jurisdictions, copyright holders may launch lawsuits against uploaders or downloaders for infringement, and police may arrest suspects in such cases.

Various means have been used to promote anonymity. For example, the BitTorrent client Tribler makes available a Tor-like onion network, optionally routing transfers through other peers to obscure which client has requested the data. The exit node would be visible to peers in a swarm, but the Tribler organization provides exit nodes. One advantage of Tribler is that clearnet torrents can be downloaded with only a small decrease in download speed from one "hop" of routing.

I2P provides a similar anonymity layer, although in that case one can only download torrents that have been uploaded to the I2P network. The BitTorrent client Vuze allows users who are not concerned about anonymity to take clearnet torrents and make them available on the I2P network.

Most BitTorrent clients are not designed to provide anonymity when used over Tor, and there is some debate as to whether torrenting over Tor acts as a drag on the network.

Private torrent trackers are usually invitation only, and require members to participate in uploading, but have the downside of a single centralized point of failure. Oink's Pink Palace and What.cd are examples of private trackers which have been shut down.

Seedbox services download the torrent files first to the company's servers, allowing the user to download the file directly from there. One's IP address would be visible to the seedbox provider, but not to third parties.

Virtual private networks encrypt transfers, and substitute a different IP address for the user's, so that anyone monitoring a torrent swarm will only see that address.

Associated technologies

Distributed trackers

On 2 May 2005, Azureus 2.3.0.0 (now known as Vuze) was released, utilizing a distributed database system. This system is a distributed hash table implementation which allows the client to use torrents that do not have a working BitTorrent tracker. A bootstrap server is instead utilized. The following month, BitTorrent, Inc. released version 4.2.0 of the Mainline BitTorrent client, which supported an alternative DHT implementation (popularly known as "Mainline DHT", outlined in a draft on their website) that is incompatible with that of Azureus. In 2014, measurements showed the number of concurrent users of Mainline DHT to be between 10 million and 25 million, with a daily churn of at least 10 million.

Current versions of the official BitTorrent client, μTorrent, BitComet, Transmission and BitSpirit all share compatibility with Mainline DHT. Both DHT implementations are based on Kademlia. As of version 3.0.5.0, Azureus also supports Mainline DHT in addition to its own distributed database through use of an optional application plugin. This potentially allows the Azureus/Vuze client to reach a bigger swarm.
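Kademlia's notion of "closeness" is the XOR of two node IDs interpreted as an unsigned integer; a DHT lookup repeatedly asks the closest known nodes for nodes even closer to the target info-hash. A minimal sketch of the distance metric and one lookup step:

```python
def xor_distance(id_a: bytes, id_b: bytes) -> int:
    """Kademlia distance between two equal-length node/info-hash IDs:
    the XOR of the IDs interpreted as a big-endian unsigned integer."""
    return int.from_bytes(id_a, "big") ^ int.from_bytes(id_b, "big")

def closest_nodes(target: bytes, node_ids, k: int = 8):
    """Return the k node IDs closest to the target, as one iterative
    step of a DHT lookup does before querying those nodes."""
    return sorted(node_ids, key=lambda n: xor_distance(n, target))[:k]
```

Because XOR distance is symmetric and obeys the triangle inequality, each step of the lookup provably gets closer to the nodes responsible for storing the swarm's peer list.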

Another idea that has surfaced in Vuze is that of virtual torrents. This idea is based on the distributed tracker approach and is used to describe some web resource. Currently, it is used for instant messaging. It is implemented using a special messaging protocol and requires an appropriate plugin. Anatomic P2P is another approach, which uses a decentralized network of nodes that route traffic to dynamic trackers. Most BitTorrent clients also use peer exchange (PEX) to gather peers in addition to trackers and DHT. Peer exchange checks with known peers to see if they know of any other peers. With the 3.0.5.0 release of Vuze, all major BitTorrent clients now have compatible peer exchange.

Web seeding

Web "seeding" was implemented in 2006 as the ability of BitTorrent clients to download torrent pieces from an HTTP source in addition to the "swarm". The advantage of this feature is that a website may distribute a torrent for a particular file or batch of files and make those files available for download from that same web server; this can simplify long-term seeding and load balancing through the use of existing, cheap, web hosting setups. In theory, this would make using BitTorrent almost as easy for a web publisher as creating a direct HTTP download. In addition, it would allow the "web seed" to be disabled if the swarm becomes too popular while still allowing the file to be readily available. This feature has two distinct specifications, both of which are supported by Libtorrent and the 26+ clients that use it.

The first was created by John "TheSHAD0W" Hoffman, who created BitTornado. This first specification requires running a web service that serves content by info-hash and piece number, rather than filename.

The other specification is created by GetRight authors and can rely on a basic HTTP download space (using byte serving).

In September 2010, a new service named Burnbit was launched which generates a torrent from any URL using web seeding. There are server-side solutions that provide initial seeding of the file from the web server via the standard BitTorrent protocol and, when the number of external seeders reaches a limit, stop serving the file from the original source.

RSS feeds

A technique called broadcatching combines RSS feeds with the BitTorrent protocol to create a content delivery system, further simplifying and automating content distribution. Steve Gillmor explained the concept in a column for Ziff-Davis in December 2003. The discussion spread quickly among bloggers (Ernest Miller, Chris Pirillo, etc.). In an article entitled Broadcatching with BitTorrent, Scott Raymond explained:

I want RSS feeds of BitTorrent files. A script would periodically check the feed for new items, and use them to start the download. Then, I could find a trusted publisher of an Alias RSS feed, and "subscribe" to all new episodes of the show, which would then start downloading automatically – like the "season pass" feature of the TiVo.

— Scott Raymond, scottraymond.net

The RSS feed will track the content, while BitTorrent ensures content integrity with cryptographic hashing of all data, so feed subscribers will receive uncorrupted content. One of the first popular software clients (free and open source) for broadcatching was Miro. Other free software clients such as PenguinTV and KatchTV also support broadcatching. The BitTorrent web-service MoveDigital added the ability to make torrents available to any web application capable of parsing XML through its standard REST-based interface in 2006, though this has since been discontinued. Additionally, Torrenthut is developing a similar torrent API that will provide the same features, and help bring the torrent community to Web 2.0 standards. Alongside this release is a first PHP application built using the API called PEP, which will parse any Really Simple Syndication (RSS 2.0) feed and automatically create and seed a torrent for each enclosure found in that feed.
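The script Raymond describes is easy to sketch: parse the feed, pick out torrent enclosures, and hand their URLs to a BitTorrent client. The enclosure MIME type checked here is an assumption about how such a feed is authored:

```python
import xml.etree.ElementTree as ET

def torrent_enclosures(rss_xml: str):
    """Extract torrent enclosure URLs from an RSS 2.0 feed, the core of a
    broadcatching setup: each matching <enclosure> would be handed to a
    BitTorrent client to start the download automatically."""
    urls = []
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        enc = item.find("enclosure")
        if enc is not None and enc.get("type") == "application/x-bittorrent":
            urls.append(enc.get("url"))
    return urls
```

Run periodically (e.g. from a scheduler), this is the "subscribe to a show" workflow: new feed items appear, their torrents are fetched, and BitTorrent's piece hashing guarantees the downloaded episodes arrive uncorrupted.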

Throttling and encryption

Since BitTorrent makes up a large proportion of total traffic, some ISPs have chosen to "throttle" (slow down) BitTorrent transfers. For this reason, methods have been developed to disguise BitTorrent traffic in an attempt to thwart these efforts. Protocol header encryption (PHE) and Message stream encryption/Protocol encryption (MSE/PE) are features of some BitTorrent clients that attempt to make BitTorrent hard to detect and throttle. As of November 2015, Vuze, BitComet, KTorrent, Transmission, Deluge, μTorrent, MooPolice, Halite, qBittorrent, rTorrent, and the latest official BitTorrent client (v6) support MSE/PE encryption.

In August 2007, Comcast was preventing BitTorrent seeding by monitoring and interfering with the communication between peers. Protection against these efforts is provided by proxying the client-tracker traffic via an encrypted tunnel to a point outside of the Comcast network. In 2008, Comcast called a "truce" with BitTorrent, Inc. with the intention of shaping traffic in a protocol-agnostic manner. Questions about the ethics and legality of Comcast's behavior have led to renewed debate about net neutrality in the United States. In general, although encryption can make it difficult to determine what is being shared, BitTorrent is vulnerable to traffic analysis. Thus, even with MSE/PE, it may be possible for an ISP to recognize BitTorrent and also to determine that a system is no longer downloading but only uploading data, and terminate its connection by injecting TCP RST (reset flag) packets.

Multitrackers

Another unofficial feature is an extension to the BitTorrent metadata format proposed by John Hoffman and implemented by several indexing websites. It allows the use of multiple trackers per file, so if one tracker fails, others can continue to support file transfer. It is implemented in several clients, such as BitComet, BitTornado, BitTorrent, KTorrent, Transmission, Deluge, μTorrent, rtorrent, Vuze, and Frostwire. Trackers are placed in groups, or tiers, with a tracker randomly chosen from the top tier and tried, moving to the next tier if all the trackers in the top tier fail.

Torrents with multiple trackers can decrease the time it takes to download a file, but also have a few consequences:

  • Poorly implemented clients may contact multiple trackers, leading to more overhead-traffic.
  • Torrents from closed trackers suddenly become downloadable by non-members, as they can connect to a seed via an open tracker.

Peer selection

As of December 2008, BitTorrent, Inc. was working with Oversi on new Policy Discover Protocols that query the ISP for capabilities and network architecture information. Oversi's ISP hosted NetEnhancer box is designed to "improve peer selection" by helping peers find local nodes, improving download speeds while reducing the loads into and out of the ISP's network.

Implementations

The BitTorrent specification is free to use and many clients are open source, so BitTorrent clients have been created for all common operating systems using a variety of programming languages. The official BitTorrent client, μTorrent, qBittorrent, Transmission, Vuze, and BitComet are some of the most popular clients.

Some BitTorrent implementations such as MLDonkey and Torrentflux are designed to run as servers. For example, this can be used to centralize file sharing on a single dedicated server which users share access to on the network. Server-oriented BitTorrent implementations can also be hosted by hosting providers at co-located facilities with high bandwidth Internet connectivity (e.g., a datacenter) which can provide dramatic speed benefits over using BitTorrent from a regular home broadband connection. Services such as ImageShack can download files on BitTorrent for the user, allowing them to download the entire file by HTTP once it is finished.

The Opera web browser supports BitTorrent natively. Brave web browser ships with an extension which supports WebTorrent, a BitTorrent-like protocol based on WebRTC instead of UDP and TCP. BitLet allowed users to download torrents directly from their browser using a Java applet (until browsers removed support for Java applets). An increasing number of hardware devices are being made to support BitTorrent. These include routers and NAS devices containing BitTorrent-capable firmware like OpenWrt. Proprietary versions of the protocol which implement DRM, encryption, and authentication are found within managed clients such as Pando.

Adoption

A growing number of individuals and organizations are using BitTorrent to distribute their own or licensed works (e.g. indie bands distributing digital files of their new songs). Independent adopters report that BitTorrent technology reduces demands on private networking hardware and bandwidth, which is essential for non-profit groups with large amounts of internet traffic.

Many major open source and free software projects encourage BitTorrent as well as conventional downloads of their products (via HTTP, FTP etc.) to increase availability and to reduce load on their own servers, especially when dealing with larger files. In addition, some video game installers, especially those whose large size makes them difficult to host due to bandwidth limits, extremely frequent downloads, and unpredictable changes in network traffic, will distribute instead a specialized, stripped down BitTorrent client with enough functionality to download the game from the other running clients and the primary server (which is maintained in case not enough peers are available).

Some uses of BitTorrent for file sharing may violate laws in some jurisdictions (see legislation section).

Popularity and traffic statistics

As of January 2012, BitTorrent was utilized by 150 million active users. Based on this figure, the total number of monthly users may be estimated at more than a quarter of a billion (≈ 250 million). As of February 2013, BitTorrent was responsible for 3.35% of all worldwide bandwidth—more than half of the 6% of total bandwidth dedicated to file sharing. As of 2013, BitTorrent had 15–27 million concurrent users at any time.

Film, video, and music

  • BitTorrent Inc. has obtained a number of licenses from Hollywood studios for distributing popular content from their websites.
  • Sub Pop Records releases tracks and videos via BitTorrent Inc. to distribute its 1000+ albums. Babyshambles and The Libertines (both bands associated with Pete Doherty) have extensively used torrents to distribute hundreds of demos and live videos. US industrial rock band Nine Inch Nails frequently distributes albums via BitTorrent.
  • Podcasting software has integrated BitTorrent to help podcasters deal with the download demands of their MP3 "radio" programs. Specifically, Juice and Miro (formerly known as Democracy Player) support automatic processing of .torrent files from RSS feeds. Similarly, some BitTorrent clients, such as μTorrent, are able to process web feeds and automatically download content found within them.
  • DGM Live purchases are provided via BitTorrent.
  • VODO, a service which distributes "free-to-share" movies and TV shows via BitTorrent.

Broadcasters

  • In 2008, the CBC became the first public broadcaster in North America to make a full show (Canada's Next Great Prime Minister) available for download using BitTorrent.
  • The Norwegian Broadcasting Corporation (NRK) has experimented with BitTorrent distribution since March 2008. Only selected works in which NRK owns all royalties are published. Responses have been very positive, and NRK is planning to offer more content.
  • The Dutch VPRO broadcasting organization released four documentaries in 2009 and 2010 under a Creative Commons license using the content distribution feature of the Mininova tracker.

Cloud Service Providers

  • Amazon Web Services' Simple Storage Service (S3) supported sharing of bucket objects via the BitTorrent protocol until April 29, 2021. As of June 13, 2020, the feature was unavailable in service regions launched after May 30, 2016. Support for existing customers was extended for an additional 12 months following the deprecation; after April 29, 2022, BitTorrent clients can no longer connect to Amazon S3.

Education

  • Florida State University uses BitTorrent to distribute large scientific data sets to its researchers.
  • Many universities that have BOINC distributed computing projects have used the BitTorrent functionality of the client-server system to reduce the bandwidth costs of distributing the client-side applications used to process the scientific data. If a BOINC distributed computing application needs to be updated (or merely sent to a user), it can do so with little impact on the BOINC server.
  • The developing Human Connectome Project uses BitTorrent to share their open dataset.
  • Academic Torrents is a BitTorrent tracker for use by researchers in fields that need to share large datasets.

Others

  • Facebook uses BitTorrent to distribute updates to Facebook servers.
  • Twitter uses BitTorrent to distribute updates to Twitter servers.
  • The Internet Archive added BitTorrent to its file download options for over 1.3 million existing files, and all newly uploaded files, in August 2012. This method is the fastest means of downloading media from the Archive.

By early 2015, AT&T estimated that BitTorrent accounted for 20% of all broadband traffic.

Routers that use network address translation (NAT) must maintain tables of source and destination IP addresses and ports. Because BitTorrent frequently contacts 20–30 servers per second, the NAT tables of some consumer-grade routers are rapidly filled. This is a known cause of some home routers ceasing to work correctly.
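The fill-up described above can be estimated with back-of-the-envelope arithmetic. The numbers below (table size, connection rate, UDP mapping timeout) are assumed, typical-looking values for illustration, not measurements of any particular router:

```python
def nat_table_fill_time(table_size, new_conns_per_sec, entry_timeout_sec):
    """Seconds until a NAT table saturates, or None if expiry keeps up.

    New entries arrive at new_conns_per_sec and expire after
    entry_timeout_sec, so steady-state occupancy is their product;
    if that stays below table_size, the table never fills.
    """
    if new_conns_per_sec * entry_timeout_sec < table_size:
        return None  # old mappings expire fast enough
    return table_size / new_conns_per_sec

# Assumed values: a 4,096-entry table, ~25 new peer contacts per second,
# and a 300-second UDP mapping timeout.
print(nat_table_fill_time(4096, 25, 300))  # table saturates in ~164 s
print(nat_table_fill_time(4096, 25, 60))   # None: a short timeout keeps up
```

With the assumed numbers the table saturates in under three minutes, which is consistent with the observed symptom of home routers ceasing to work correctly under heavy BitTorrent load.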

Legislation

Although the protocol itself is legal, problems stem from its use to distribute copyright-infringing works, since BitTorrent is often used to download otherwise paid content, such as movies and video games. There has been much controversy over the use of BitTorrent trackers. BitTorrent metafiles themselves do not store file contents. Whether the publishers of BitTorrent metafiles violate copyrights by linking to copyrighted works without the authorization of copyright holders is controversial. Various jurisdictions have pursued legal action against websites that host BitTorrent trackers.

As a result, the use of BitTorrent may sometimes be limited by Internet Service Providers (ISPs) on legal or copyright grounds. Users may choose to run seedboxes or virtual private networks (VPNs) to circumvent these restrictions.

High-profile examples include the closing of Suprnova.org, TorrentSpy, LokiTorrent, BTJunkie, Mininova, Oink's Pink Palace and What.cd. The Pirate Bay, a BitTorrent search engine formed by a Swedish group, is noted for the "legal" section of its website, in which letters and replies on the subject of alleged copyright infringements are publicly displayed. On 31 May 2006, The Pirate Bay's servers in Sweden were raided by Swedish police on allegations by the MPAA of copyright infringement; however, the tracker was up and running again three days later. In the study used to value NBC Universal in its merger with Comcast, Envisional examined the 10,000 torrent swarms managed by PublicBT which had the most active downloaders. After excluding pornographic and unidentifiable content, it was found that only one swarm offered legitimate content.

In the United States, more than 200,000 lawsuits have been filed for copyright infringement on BitTorrent since 2010. In the United Kingdom, on 30 April 2012, the High Court of Justice ordered five ISPs to block The Pirate Bay.

Security

One concern is the UDP flood attack. BitTorrent implementations often use μTP for their communication. To achieve high bandwidths, the underlying protocol used is UDP, which allows spoofing of the source addresses of internet traffic. It has been possible to carry out denial-of-service attacks in a P2P lab environment, where users running BitTorrent clients act as amplifiers for an attack on another service. However, this is not always an effective attack because ISPs can check whether the source address is correct.

Several studies on BitTorrent found files available for download containing malware. In particular, one small sample indicated that 18% of all executable programs available for download contained malware. Another study claims that as much as 14.5% of BitTorrent downloads contain zero-day malware, and that BitTorrent was used as the distribution mechanism for 47% of all zero-day malware they have found.

Cosmology

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Cosmology
The Hubble eXtreme Deep Field (XDF) was completed in September 2012 and shows the farthest galaxies ever photographed at that time. Except for the few stars in the foreground (which are bright and easily recognizable because only they have diffraction spikes), every speck of light in the photo is an individual galaxy, some of them as old as 13.2 billion years; the observable universe is estimated to contain more than 2 trillion galaxies.

Cosmology (from Ancient Greek κόσμος (cosmos) 'the universe, the world' and λογία (logia) 'study of') is a branch of physics and metaphysics dealing with the nature of the universe, the cosmos. The term cosmology was first used in English in 1656 in Thomas Blount's Glossographia, and in 1731 taken up in Latin by German philosopher Christian Wolff in Cosmologia Generalis. Religious or mythological cosmology is a body of beliefs based on mythological, religious, and esoteric literature and traditions of creation myths and eschatology. In the science of astronomy, cosmology is concerned with the study of the chronology of the universe.

Physical cosmology is the study of the observable universe's origin, its large-scale structures and dynamics, and the ultimate fate of the universe, including the laws of science that govern these areas. It is investigated by scientists, including astronomers and physicists, as well as philosophers, such as metaphysicians, philosophers of physics, and philosophers of space and time. Because of this shared scope with philosophy, theories in physical cosmology may include both scientific and non-scientific propositions and may depend upon assumptions that cannot be tested. Physical cosmology is a sub-branch of astronomy that is concerned with the universe as a whole. Modern physical cosmology is dominated by the Big Bang theory, which attempts to bring together observational astronomy and particle physics; more specifically, a standard parameterization of the Big Bang with dark matter and dark energy, known as the Lambda-CDM model.

Theoretical astrophysicist David N. Spergel has described cosmology as a "historical science" because "when we look out in space, we look back in time" due to the finite nature of the speed of light.

Disciplines

Physics and astrophysics have played central roles in shaping our understanding of the universe through scientific observation and experiment. Physical cosmology was shaped through both mathematics and observation in an analysis of the whole universe. The universe is generally understood to have begun with the Big Bang, followed almost instantaneously by cosmic inflation, an expansion of space from which the universe is thought to have emerged 13.799 ± 0.021 billion years ago. Cosmogony studies the origin of the universe, and cosmography maps the features of the universe.

In Diderot's Encyclopédie, cosmology is broken down into uranology (the science of the heavens), aerology (the science of the air), geology (the science of the continents), and hydrology (the science of waters).

Metaphysical cosmology has also been described as the placing of humans in the universe in relationship to all other entities. This is exemplified by Marcus Aurelius's observation on a man's place in that relationship: "He who does not know what the world is does not know where he is, and he who does not know for what purpose the world exists, does not know who he is, nor what the world is."

Discoveries

Physical cosmology

Physical cosmology is the branch of physics and astrophysics that deals with the study of the physical origins and evolution of the universe. It also includes the study of the nature of the universe on a large scale. In its earliest form, it was what is now known as "celestial mechanics," the study of the heavens. Greek philosophers Aristarchus of Samos, Aristotle, and Ptolemy proposed different cosmological theories. The geocentric Ptolemaic system was the prevailing theory until the 16th century when Nicolaus Copernicus, and subsequently Johannes Kepler and Galileo Galilei, proposed a heliocentric system. This is one of the most famous examples of epistemological rupture in physical cosmology.

Isaac Newton's Principia Mathematica, published in 1687, was the first description of the law of universal gravitation. It provided a physical mechanism for Kepler's laws and also allowed the anomalies in previous systems, caused by gravitational interaction between the planets, to be resolved. A fundamental difference between Newton's cosmology and those preceding it was the Copernican principle—that the bodies on Earth obey the same physical laws as all celestial bodies. This was a crucial philosophical advance in physical cosmology.

Modern scientific cosmology is widely considered to have begun in 1917 with Albert Einstein's publication of his final modification of general relativity in the paper "Cosmological Considerations of the General Theory of Relativity" (although this paper was not widely available outside of Germany until the end of World War I). General relativity prompted cosmogonists such as Willem de Sitter, Karl Schwarzschild, and Arthur Eddington to explore its astronomical ramifications, which enhanced the ability of astronomers to study very distant objects. Physicists began changing the assumption that the universe was static and unchanging. In 1922, Alexander Friedmann introduced the idea of an expanding universe that contained moving matter.

In parallel to this dynamic approach to cosmology, one long-standing debate about the structure of the cosmos was coming to a climax – the Great Debate (1917 to 1922) – with early cosmologists such as Heber Curtis and Ernst Öpik determining that some nebulae seen in telescopes were separate galaxies far distant from our own. While Heber Curtis argued for the idea that spiral nebulae were star systems in their own right as island universes, Mount Wilson astronomer Harlow Shapley championed the model of a cosmos made up of the Milky Way star system only. This difference of ideas came to a climax with the organization of the Great Debate on 26 April 1920 at the meeting of the U.S. National Academy of Sciences in Washington, D.C. The debate was resolved when Edwin Hubble detected Cepheid Variables in the Andromeda Galaxy in 1923 and 1924. Their distance established spiral nebulae well beyond the edge of the Milky Way.

Subsequent modelling of the universe explored the possibility that the cosmological constant, introduced by Einstein in his 1917 paper, may result in an expanding universe, depending on its value. Thus the Big Bang model was proposed by the Belgian priest Georges Lemaître in 1927 which was subsequently corroborated by Edwin Hubble's discovery of the redshift in 1929 and later by the discovery of the cosmic microwave background radiation by Arno Penzias and Robert Woodrow Wilson in 1964. These findings were a first step to rule out some of many alternative cosmologies.

Since around 1990, several dramatic advances in observational cosmology have transformed cosmology from a largely speculative science into a predictive science with precise agreement between theory and observation. These advances include observations of the microwave background from the COBE, WMAP and Planck satellites, large new galaxy redshift surveys including 2dFGRS and SDSS, and observations of distant supernovae and gravitational lensing. These observations matched the predictions of the cosmic inflation theory, a modified Big Bang theory, and the specific version known as the Lambda-CDM model. This has led many to refer to modern times as the "golden age of cosmology".

In 2014, the BICEP2 collaboration claimed that they had detected the imprint of gravitational waves in the cosmic microwave background. However, this result was later found to be spurious: the supposed evidence of gravitational waves was in fact due to interstellar dust.

On 1 December 2014, at the Planck 2014 meeting in Ferrara, Italy, astronomers reported that the universe is 13.8 billion years old and composed of 4.9% atomic matter, 26.6% dark matter and 68.5% dark energy.

Religious or mythological cosmology

Religious or mythological cosmology is a body of beliefs based on mythological, religious, and esoteric literature and traditions of creation and eschatology. Creation myths are found in most religions, and are typically split into five different classifications, based on a system created by Mircea Eliade and his colleague Charles Long.

  • Types of Creation Myths based on similar motifs:
    • Creation ex nihilo in which the creation is through the thought, word, dream or bodily secretions of a divine being.
    • Earth diver creation in which a diver, usually a bird or amphibian sent by a creator, plunges to the seabed through a primordial ocean to bring up sand or mud which develops into a terrestrial world.
    • Emergence myths in which progenitors pass through a series of worlds and metamorphoses until reaching the present world.
    • Creation by the dismemberment of a primordial being.
    • Creation by the splitting or ordering of a primordial unity such as the cracking of a cosmic egg or a bringing order from chaos.

Philosophy

Representation of the observable universe on a logarithmic scale. Distance from the Sun increases from center to edge. Planets and other celestial bodies were enlarged to appreciate their shapes.

Cosmology deals with the world as the totality of space, time and all phenomena. Historically, it has had quite a broad scope, and in many cases was found in religion. Charles Kahn, an important historian of philosophy, attributed the origins of ancient Greek cosmology to Anaximander. Some questions about the Universe are beyond the scope of scientific inquiry but may still be interrogated through appeals to other philosophical approaches like dialectics. Such extra-scientific questions include:

  • What is the origin of the universe? What is its first cause (if any)? Is its existence necessary? (see monism, pantheism, emanationism and creationism)
  • What are the ultimate material components of the universe? (see mechanism, dynamism, hylomorphism, atomism)
  • What is the ultimate reason (if any) for the existence of the universe? Does the cosmos have a purpose? (see teleology)
  • Does the existence of consciousness have a role in the existence of reality? How do we know what we know about the totality of the cosmos? Does cosmological reasoning reveal metaphysical truths? (see epistemology)

Historical cosmologies

Name Author and date Classification Remarks
Hindu cosmology Rigveda (c. 1700–1100 BCE) Cyclical or oscillating, Infinite in time Primal matter remains manifest for 311.04 trillion years and unmanifest for an equal length of time. The universe remains manifest for 4.32 billion years and unmanifest for an equal length of time. Innumerable universes exist simultaneously. These cycles have and will last forever, driven by desires.
Zoroastrian Cosmology Avesta (c. 1500–600 BCE) Dualistic Cosmology According to Zoroastrian cosmology, the universe is the manifestation of perpetual conflict between existence and non-existence, good and evil, and light and darkness. The universe will remain in this state for 12,000 years; at the time of resurrection, the two elements will be separated again.
Jain cosmology Jain Agamas (written around 500 CE as per the teachings of Mahavira 599–527 BCE) Cyclical or oscillating, eternal and finite Jain cosmology considers the loka, or universe, as an uncreated entity, existing since infinity, the shape of the universe as similar to a man standing with legs apart and arm resting on his waist. This Universe, according to Jainism, is broad at the top, narrow at the middle and once again becomes broad at the bottom.
Babylonian cosmology Babylonian literature (c. 2300–500 BCE) Flat Earth floating in infinite "waters of chaos" The Earth and the Heavens form a unit within infinite "waters of chaos"; the Earth is flat and circular, and a solid dome (the "firmament") keeps out the outer "chaos"-ocean.
Eleatic cosmology Parmenides (c. 515 BCE) Finite and spherical in extent The Universe is unchanging, uniform, perfect, necessary, timeless, and neither generated nor perishable. Void is impossible. Plurality and change are products of epistemic ignorance derived from sense experience. Temporal and spatial limits are arbitrary and relative to the Parmenidean whole.
Samkhya Cosmic Evolution Kapila (6th century BCE), pupil Asuri Prakriti (Matter) and Purusha (Consciousness) Relation Prakriti (Matter) is the source of the world of becoming. It is pure potentiality that evolves itself successively into twenty-four tattvas or principles. The evolution itself is possible because Prakriti is always in a state of tension among its constituent strands known as gunas (Sattva (lightness or purity), Rajas (passion or activity), and Tamas (inertia or heaviness)). The cause and effect theory of Samkhya is called Satkaarya-vaada (theory of existent causes), and holds that nothing can really be created from or destroyed into nothingness—all evolution is simply the transformation of primal Nature from one form to another.
Biblical cosmology Genesis creation narrative Earth floating in infinite "waters of chaos" The Earth and the Heavens form a unit within infinite "waters of chaos"; the "firmament" keeps out the outer "chaos"-ocean.
Anaximander's model Anaximander (c. 560 BCE) Geocentric, cylindrical Earth, infinite in extent, finite time; first purely mechanical model The Earth floats very still in the centre of the infinite, not supported by anything. At the origin, after the separation of hot and cold, a ball of flame appeared that surrounded Earth like bark on a tree. This ball broke apart to form the rest of the Universe. It resembled a system of hollow concentric wheels, filled with fire, with the rims pierced by holes like those of a flute; no heavenly bodies as such, only light through the holes. Three wheels, in order outwards from Earth: stars (including planets), Moon and a large Sun.
Atomist universe Anaxagoras (500–428 BCE) and later Epicurus Infinite in extent The universe contains only two things: an infinite number of tiny seeds (atoms) and the void of infinite extent. All atoms are made of the same substance, but differ in size and shape. Objects are formed from atom aggregations and decay back into atoms. Incorporates Leucippus' principle of causality: "nothing happens at random; everything happens out of reason and necessity". The universe was not ruled by gods.
Pythagorean universe Philolaus (d. 390 BCE) Existence of a "Central Fire" at the center of the Universe. At the center of the Universe is a central fire, around which the Earth, Sun, Moon and planets revolve uniformly. The Sun revolves around the central fire once a year, the stars are immobile. The Earth in its motion maintains the same hidden face towards the central fire, hence it is never seen. First known non-geocentric model of the Universe.
De Mundo Pseudo-Aristotle (d. 250 BCE or between 350 and 200 BCE) The Universe is a system made up of heaven and Earth and the elements which are contained in them. There are "five elements, situated in spheres in five regions, the less being in each case surrounded by the greater – namely, earth surrounded by water, water by air, air by fire, and fire by ether – make up the whole Universe."
Stoic universe Stoics (300 BCE – 200 CE) Island universe The cosmos is finite and surrounded by an infinite void. It is in a state of flux, and pulsates in size and undergoes periodic upheavals and conflagrations.
Platonic universe Plato (c. 360 BCE) Geocentric, complex cosmogony, finite extent, implied finite time, cyclical Static Earth at center, surrounded by heavenly bodies which move in perfect circles, arranged by the will of the Demiurge in order: Moon, Sun, planets and fixed stars. Complex motions repeat every 'perfect' year.
Eudoxus' model Eudoxus of Cnidus (c. 340 BCE) and later Callippus Geocentric, first geometric-mathematical model The heavenly bodies move as if attached to a number of Earth-centered concentrical, invisible spheres, each of them rotating around its own and different axis and at different paces. There are twenty-seven homocentric spheres with each sphere explaining a type of observable motion for each celestial object. Eudoxus emphasised that this is a purely mathematical construct of the model in the sense that the spheres of each celestial body do not exist, it just shows the possible positions of the bodies.
Aristotelian universe Aristotle (384–322 BCE) Geocentric (based on Eudoxus' model), static, steady state, finite extent, infinite time Static and spherical Earth is surrounded by 43 to 55 concentric celestial spheres, which are material and crystalline. Universe exists unchanged throughout eternity. Contains a fifth element, called aether, that was added to the four classical elements.
Aristarchean universe Aristarchus (c. 280 BCE) Heliocentric Earth rotates daily on its axis and revolves annually about the Sun in a circular orbit. Sphere of fixed stars is centered about the Sun.
Ptolemaic model Ptolemy (2nd century CE) Geocentric (based on Aristotelian universe) Universe orbits around a stationary Earth. Planets move in circular epicycles, each having a center that moved in a larger circular orbit (called an eccentric or a deferent) around a center-point near Earth. The use of equants added another level of complexity and allowed astronomers to predict the positions of the planets. The most successful universe model of all time, using the criterion of longevity. The Almagest (the Great System).
Capella's model Martianus Capella (c. 420) Geocentric and Heliocentric The Earth is at rest in the center of the universe and circled by the Moon, the Sun, three planets and the stars, while Mercury and Venus circle the Sun.
Aryabhatan model Aryabhata (499) Geocentric or Heliocentric The Earth rotates and the planets move in elliptical orbits around either the Earth or Sun; uncertain whether the model is geocentric or heliocentric due to planetary orbits given with respect to both the Earth and Sun.
Quranic cosmology Quran (610–632 CE) Flat-earth The universe consists of stacked flat layers, including seven levels of heaven and in some interpretations seven levels of earth (including hell)
Medieval universe Medieval philosophers (500–1200) Finite in time A universe that is finite in time and has a beginning is proposed by the Christian philosopher John Philoponus, who argues against the ancient Greek notion of an infinite past. Logical arguments supporting a finite universe are developed by the early Muslim philosopher Al-Kindi, the Jewish philosopher Saadia Gaon, and the Muslim theologian Al-Ghazali.
Non-Parallel Multiverse Bhagvata Puran (800–1000) Multiverse, Non Parallel The innumerable universes are comparable to the multiverse theory, except non-parallel: each universe is different, and individual jiva-atmas (embodied souls) exist in exactly one universe at a time. All universes manifest from the same matter, and so they all follow parallel time cycles, manifesting and unmanifesting at the same time.
Multiversal cosmology Fakhr al-Din al-Razi (1149–1209) Multiverse, multiple worlds and universes There exists an infinite outer space beyond the known world, and God has the power to fill the vacuum with an infinite number of universes.
Maragha models Maragha school (1259–1528) Geocentric Various modifications to Ptolemaic model and Aristotelian universe, including rejection of equant and eccentrics at Maragheh observatory, and introduction of Tusi-couple by Al-Tusi. Alternative models later proposed, including the first accurate lunar model by Ibn al-Shatir, a model rejecting stationary Earth in favour of Earth's rotation by Ali Kuşçu, and planetary model incorporating "circular inertia" by Al-Birjandi.
Nilakanthan model Nilakantha Somayaji (1444–1544) Geocentric and heliocentric A universe in which the planets orbit the Sun, which orbits the Earth; similar to the later Tychonic system.
Copernican universe Nicolaus Copernicus (1473–1543) Heliocentric with circular planetary orbits, finite extent First described in De revolutionibus orbium coelestium. The Sun is in the center of the universe, planets including Earth orbit the Sun, but the Moon orbits the Earth. The universe is limited by the sphere of the fixed stars.
Tychonic system Tycho Brahe (1546–1601) Geocentric and Heliocentric A universe in which the planets orbit the Sun and the Sun orbits the Earth, similar to the earlier Nilakanthan model.
Bruno's cosmology Giordano Bruno (1548–1600) Infinite extent, infinite time, homogeneous, isotropic, non-hierarchical Rejects the idea of a hierarchical universe. Earth and Sun have no special properties in comparison with the other heavenly bodies. The void between the stars is filled with aether, and matter is composed of the same four elements (water, earth, fire, and air), and is atomistic, animistic and intelligent.
De Magnete William Gilbert (1544–1603) Heliocentric, indefinitely extended Copernican heliocentrism, but he rejects the idea of a limiting sphere of the fixed stars for which no proof has been offered.
Keplerian Johannes Kepler (1571–1630) Heliocentric with elliptical planetary orbits Kepler's discoveries, marrying mathematics and physics, provided the foundation for the present conception of the Solar System, but distant stars were still seen as objects in a thin, fixed celestial sphere.
Static Newtonian Isaac Newton (1642–1727) Static (evolving), steady state, infinite Every particle in the universe attracts every other particle. Matter on the large scale is uniformly distributed. Gravitationally balanced but unstable.
Cartesian Vortex universe René Descartes 17th century Static (evolving), steady state, infinite System of huge swirling whirlpools of aethereal or fine matter produces gravitational effects. But his vacuum was not empty; all space was filled with matter.
Hierarchical universe Immanuel Kant, Johann Lambert 18th century Static (evolving), steady state, infinite Matter is clustered on ever larger scales of hierarchy. Matter is endlessly recycled.
Einstein Universe with a cosmological constant Albert Einstein 1917 Static (nominally). Bounded (finite) "Matter without motion". Contains uniformly distributed matter. Uniformly curved spherical space; based on Riemann's hypersphere. Curvature is set equal to Λ. In effect Λ is equivalent to a repulsive force which counteracts gravity. Unstable.
De Sitter universe Willem de Sitter 1917 Expanding flat space. Steady state. Λ > 0 "Motion without matter." Only apparently static. Based on Einstein's general relativity. Space expands with constant acceleration. Scale factor increases exponentially (constant inflation).
MacMillan universe William Duncan MacMillan 1920s Static and steady state New matter is created from radiation; starlight perpetually recycled into new matter particles.
Friedmann universe, spherical space Alexander Friedmann 1922 Spherical expanding space. k = +1 ; no Λ Positive curvature. Curvature constant k = +1. Expands then recollapses. Spatially closed (finite).
Friedmann universe, hyperbolic space Alexander Friedmann 1924 Hyperbolic expanding space. k = −1 ; no Λ Negative curvature. Said to be infinite (but ambiguous). Unbounded. Expands forever.
Dirac large numbers hypothesis Paul Dirac 1930s Expanding Demands a large variation in G, which decreases with time. Gravity weakens as universe evolves.
Friedmann zero-curvature Einstein and De Sitter 1932 Expanding flat space. k = 0 ; Λ = 0 Critical density Curvature constant k = 0. Said to be infinite (but ambiguous). "Unbounded cosmos of limited extent". Expands forever. "Simplest" of all known universes. Named after but not considered by Friedmann. Has a deceleration term q = 1/2, which means that its expansion rate slows down.
The original Big Bang (Friedmann-Lemaître) Georges Lemaître 1927–1929 Expansion. Λ > 0 ; Λ > |Gravity| Λ is positive and has a magnitude greater than gravity. Universe has initial high-density state ("primeval atom"). Followed by a two-stage expansion. Λ is used to destabilize the universe. (Lemaître is considered the father of the Big Bang model.)
Oscillating universe (Friedmann-Einstein) Favored by Friedmann 1920s Expanding and contracting in cycles Time is endless and beginningless; thus avoids the beginning-of-time paradox. Perpetual cycles of Big Bang followed by Big Crunch. (Einstein's first choice after he rejected his 1917 model.)
Eddington universe Arthur Eddington 1930 First static then expands Static Einstein 1917 universe with its instability disturbed into expansion mode; with relentless matter dilution becomes a De Sitter universe. Λ dominates gravity.
Milne universe of kinematic relativity Edward Milne 1933, 1935; William H. McCrea 1930s Kinematic expansion without space expansion Rejects general relativity and the expanding space paradigm. Gravity not included as initial assumption. Obeys cosmological principle and special relativity; consists of a finite spherical cloud of particles (or galaxies) that expands within an infinite and otherwise empty flat space. It has a center and a cosmic edge (surface of the particle cloud) that expands at light speed. Explanation of gravity was elaborate and unconvincing.
Friedmann–Lemaître–Robertson–Walker class of models Howard Robertson, Arthur Walker 1935 Uniformly expanding Class of universes that are homogeneous and isotropic. Spacetime separates into uniformly curved space and cosmic time common to all co-moving observers. The formulation system is now known as the FLRW or Robertson–Walker metrics of cosmic time and curved space.
Steady-state Hermann Bondi, Thomas Gold 1948 Expanding, steady state, infinite Matter creation rate maintains constant density. Continuous creation out of nothing from nowhere. Exponential expansion. Deceleration term q = −1.
Steady-state Fred Hoyle 1948 Expanding, steady state; but unstable Matter creation rate maintains constant density. But since matter creation rate must be exactly balanced with the space expansion rate the system is unstable.
Ambiplasma Hannes Alfvén 1965 Oskar Klein Cellular universe, expanding by means of matter–antimatter annihilation Based on the concept of plasma cosmology. The universe is viewed as "meta-galaxies" divided by double layers and thus a bubble-like nature. Other universes are formed from other bubbles. Ongoing cosmic matter-antimatter annihilations keep the bubbles separated and moving apart preventing them from interacting.
Brans–Dicke theory Carl H. Brans, Robert H. Dicke Expanding Based on Mach's principle. G varies with time as universe expands. "But nobody is quite sure what Mach's principle actually means."
Cosmic inflation Alan Guth 1980 Big Bang modified to solve horizon and flatness problems Based on the concept of hot inflation. The universe is viewed as a multiple quantum flux – hence its bubble-like nature. Other universes are formed from other bubbles. Ongoing cosmic expansion kept the bubbles separated and moving apart.
Eternal inflation (a multiple universe model) Andreï Linde 1983 Big Bang with cosmic inflation Multiverse based on the concept of cold inflation, in which inflationary events occur at random each with independent initial conditions; some expand into bubble universes supposedly like the entire cosmos. Bubbles nucleate in a spacetime foam.
Cyclic model Paul Steinhardt; Neil Turok 2002 Expanding and contracting in cycles; M-theory Two parallel orbifold planes or M-branes collide periodically in a higher-dimensional space. With quintessence or dark energy.
Cyclic model Lauris Baum; Paul Frampton 2007 Solution of Tolman's entropy problem Phantom dark energy fragments universe into large number of disconnected patches. The observable patch contracts containing only dark energy with zero entropy.

Table notes: the term "static" simply means not expanding and not contracting. Symbol G represents Newton's gravitational constant; Λ (Lambda) is the cosmological constant.
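The parameters recurring in the table (the curvature constant k, the cosmological constant Λ, the critical density, and the deceleration term q) are tied together by the standard Friedmann equations of general-relativistic cosmology. As a brief illustrative sketch (standard textbook relations, not part of the original table):

```latex
% First Friedmann equation for the scale factor a(t),
% with Hubble parameter H = \dot{a}/a and mass density \rho:
H^2 = \frac{8\pi G}{3}\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}

% Setting k = 0 and \Lambda = 0 (the Friedmann zero-curvature model
% of Einstein and De Sitter) singles out the critical density:
\rho_c = \frac{3H^2}{8\pi G}

% Deceleration parameter: q = 1/2 for the matter-dominated flat model,
% and q = -1 for exponential (De Sitter / steady-state) expansion:
q = -\frac{\ddot{a}\,a}{\dot{a}^2}
```

In this notation, the table's entries follow directly: a density above ρ_c corresponds to k = +1 (closed), below ρ_c to k = −1 (open, as in Friedmann's 1924 hyperbolic model), and exactly ρ_c to the flat k = 0 case with q = 1/2.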
