
Thursday, January 23, 2020

Oxford Internet Institute

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Oxford_Internet_Institute
 
Front door of the Oxford Internet Institute on St Giles, Oxford.

The Oxford Internet Institute (OII) is a multi-disciplinary department of social and computer science dedicated to the study of information, communication, and technology, and is part of the Social Sciences Division of the University of Oxford, England. It is housed over three sites on St Giles in Oxford, including a primary site at 1 St Giles, owned by Balliol College. The department undertakes research and teaching devoted to understanding life online, with the aim of shaping Internet research, policy, and practice.

Founded in 2001, the OII has tracked the Internet's development and use, aiming to shed light on individual, collective and institutional behaviour online. The department brings together academics from a wide range of disciplines including political science, sociology, geography, economics, philosophy, physics and psychology.

The current director is Professor Philip N. Howard.

Research

Research at the OII covers a wide variety of topics, with faculty publishing journal articles and books on issues including privacy and security, e-government and e-democracy, virtual economies, smart cities, digital exclusion, digital humanities, online gaming, big data and Internet geography. The OII currently has the following research clusters, reflecting the diverse expertise of its faculty:
  • Digital Politics and Government
  • Information Governance and Security
  • Social Data Science
  • Connectivity, Inclusion and Inequality
  • Internet Economies
  • Digital Knowledge and Culture
  • Education, Digital Life and Wellbeing
  • Ethics and Philosophy of Information
The faculty and students of the OII have made many advances in the study of technology and society. Director Helen Margetts' book Political Turbulence was the first book-length study of the impact of social media on public life. Viktor Mayer-Schönberger's book Big Data was the first to define and explain the importance of big data in contemporary economics, culture, and politics. His 2018 work Reinventing Capitalism in the Age of Big Data focuses on the role of platforms in allocating demand and supply beyond the price mechanism. Professor Philip N. Howard was the first political scientist to define and study astroturfing and has been responsible for significant research on fake news, bots and trolls in international affairs. Luciano Floridi is among the world's leading ethicists of AI and big data.

Studies of Wikipedia

OII has published several studies in Internet geography and Wikipedia. In November 2011, the Guardian Data Blog published maps of geotagged Wikipedia articles written in English, Arabic, Egyptian Arabic, French, Hebrew and Persian. OII Senior Research Fellow Mark Graham led the study and published the results on his blog, Zero Geography. Graham also leads an OII project focused on how new users are perceived, represented, and incorporated into the Wikipedia community. In 2013, OII researchers led by Taha Yasseri published a study of controversial topics in 10 different language versions of Wikipedia, using data related to "edit wars". More recently, the OII has, among other things, been involved in research on the effects of computational propaganda, the ethics of big data in different contexts, and the political implications of the Internet and social media. It collaborates closely with other institutions of the University of Oxford, such as the Reuters Institute for the Study of Journalism, the Department of Computer Science and the Oxford Martin School.

Teaching

Since 2006, the OII has offered a DPhil (doctoral) degree in "Information, Communication, and the Social Sciences." Since 2009, it has offered a one-year Master of Science (MSc) degree in "Social Science of the Internet". Since 2015, prospective students have been able to study the MSc part-time over two years. The department also runs an annual Summer Doctoral Programme, which brings outstanding PhD students to study at the OII for two weeks each July. Since 2018, prospective students have also had the option of a one-year Master of Science degree in Social Data Science, with the related DPhil in Social Data Science available from 2020 onward.

History

The Oxford Internet Institute was made possible by a major donation from the Shirley Foundation of over £10m, with public funding totalling over £5m from the Higher Education Funding Council for England.

The idea originated with Derek Wyatt MP and Andrew Graham, then Master-Elect of Balliol. Two Balliol alumni, who knew Dame Stephanie Shirley through the Worshipful Company of Information Technologists, approached her for support.

The Oxford Internet Institute is part of a small network of research centres that includes the Berkman Klein Center for Internet & Society and the Information Society Project at Yale Law School, but it is the only one that functions as a full, degree-granting department.

OII awards

For its 10th anniversary, the OII launched two award series: lifetime achievement awards, recognising long-standing contributions to the field of Internet research, and Internet & Society awards, recognising significant recent contributions to developing the Internet for the public good.


Wednesday, January 22, 2020

Berkman Klein Center for Internet & Society

 
Berkman Klein Center for Internet & Society
Motto: Exploring cyberspace, sharing in its study & pioneering its development.
Formation: 1998
Type: Technology research center
Website: cyber.harvard.edu

The Berkman Klein Center for Internet & Society is a research center at Harvard University that focuses on the study of cyberspace. Founded at Harvard Law School, the center traditionally focused on internet-related legal issues. On May 15, 2008, the Center was elevated to an interfaculty initiative of Harvard University as a whole. It is named after the Berkman family. On July 5, 2016, the Center added "Klein" to its name following a gift of $15 million from Michael R. Klein.

History and mission

The location at 23 Everett Street

The center was founded in 1996 as the "Center on Law and Technology" by Jonathan Zittrain and Professor Charles Nesson. This built on previous work including a 1994 seminar they held on legal issues involving the early Internet. Professor Arthur Miller and students David Marglin and Tom Smuts also worked on that seminar and related discussions. In 1997, the Berkman family underwrote the center, and Lawrence Lessig joined as the first Berkman professor. In 1998, the center changed its name to the "Berkman Center for Internet & Society at Harvard Law School". Since then, it has grown from a small project within Harvard Law School to a major interdisciplinary center at Harvard University.

The Berkman Klein Center seeks to understand how the development of Internet-related technologies is inspired by the social context in which they are embedded and how the use of those technologies affects society in turn. It seeks to use the lessons drawn from this research to inform the design of Internet-related law and pioneer the development of the Internet itself. The Berkman Klein Center sponsors Internet-related events and conferences, and hosts numerous visiting lecturers and research fellows.

Members of the center teach and write books, scientific articles, weblogs with RSS 2.0 feeds (the Center holds the RSS 2.0 specification), and podcasts (the first podcast series took place at the Berkman Klein Center). Its newsletter, The Buzz, is published on the Web and available by e-mail, and the center hosts a blog community of Harvard faculty, students, and Berkman Klein Center affiliates.

The Berkman Klein Center faculty and staff have also conducted major public policy reviews of pressing issues. In 2008, John Palfrey led a review of child safety online called the Internet Safety Technical Task Force. In 2009, Yochai Benkler led a review of United States broadband policy. In 2010, Urs Gasser, along with Palfrey and others, led a review of Internet governance body ICANN, focusing on transparency, accountability, and public participation.

Projects and initiatives

The Berkman Klein Center's main research topics are teens and media, monitoring, privacy, digital art, Internet governance, cloud computing, and Internet censorship. The center supports events, presentations, and conferences about the Internet and invites scientists to share their ideas.

Digital Media Law Project

The Digital Media Law Project (DMLP) was a project hosted by the Berkman Klein Center for Internet & Society at Harvard Law School. It had previously been known as the Citizen Media Law Project. The purposes of the DMLP were:
  1. To provide resources and other assistance, including legal assistance as of 2009, to individuals and groups involved in online and citizen media.
  2. To "ensur[e] that online journalists, media organizations, and their sources are allowed to examine and debate network security and data protection vulnerabilities without criminal punishment, in order to inform citizens and lawmakers about networked computer security."
  3. To facilitate the participation of citizens in online media.
  4. To protect the freedom of speech on the Internet.
In 2014, the Berkman Klein Center announced that the DMLP would "spin off its most effective initiatives and cease operation as a stand-alone project within the Berkman Klein Center."

Internet and Democracy Project

The Berkman Klein Center operated the now-completed Internet and Democracy Project, which described itself as an:
initiative that will examine how the Internet influences democratic norms and modes, including its impact on civil society, citizen media, government transparency, and the rule of law, with a focus on the Middle East. Through a grant of $1.5 million from the US Department of State's Middle East Partnership Initiative, the Berkman Center will undertake the study over the next two years in collaboration with its extended community and institutional partners. As with all its projects, the Berkman Center retains complete independence in its research and other efforts under this grant.
The goal of this work is to support the rights of citizens to access, develop and share independent sources of information, to advocate responsibly, to strengthen online networks, and to debate ideas freely with both civil society and government. These subjects will be examined through a series of case studies in which new technologies and online resources have influenced democracy and civic engagement. The project will include original research and the identification and development of innovative web-based tools that support the goals of the project. The team, led by Project Director Bruce Etling, will draw on communities from around the world, with a focus on the Middle East.

StopBadware

In 2006, the Center established the non-profit organization StopBadware, aiming to stop viruses, spyware, and other threats to the open Internet, in partnership with the Oxford Internet Institute, Google, Lenovo and Sun Microsystems. In 2010, StopBadware became an independent entity supported by Google, PayPal, and Mozilla.

Digital Public Library of America

The Digital Public Library of America is a project aimed at making a large-scale digital public library accessible to all.

Ethics and Governance of Artificial Intelligence

In 2017, the BKC, together with the MIT Media Lab, received a $27 million grant to "advance Artificial Intelligence research for the public good" and "to ensure automation and machine learning are researched, developed, and deployed in a way which vindicate social values of fairness, human autonomy, and justice".

Members



The center also has active groups of faculty associates, affiliates and alumni who host and participate in their projects each year.

Students for Free Culture

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Students_for_Free_Culture 
 
Students for Free Culture
Website: freeculture.org

Students for Free Culture, formerly known as FreeCulture.org, is an international student organization working to promote free culture ideals, such as cultural participation and access to information. It was inspired by the work of former Stanford, now Harvard, law professor Lawrence Lessig, who wrote the book Free Culture, and it frequently collaborates with other prominent free culture NGOs, including Creative Commons, the Electronic Frontier Foundation, and Public Knowledge. Students for Free Culture has over 30 chapters on college campuses around the world, and a history of grassroots activism.

Students for Free Culture is sometimes referred to as "FreeCulture", "the Free Culture Movement", and other variations on the "free culture" theme, but none of those is its official name. It is officially Students for Free Culture, as set forth in the new bylaws that were ratified by its chapters on October 1, 2007, which changed its name from FreeCulture.org to Students for Free Culture.

Goals

Students for Free Culture has stated its goals in a "manifesto":
The mission of the Free Culture movement is to build a bottom-up, participatory structure to society and culture, rather than a top-down, closed, proprietary structure. Through the democratizing power of digital technology and the Internet, we can place the tools of creation and distribution, communication and collaboration, teaching and learning into the hands of the common person -- and with a truly active, connected, informed citizenry, injustice and oppression will slowly but surely vanish from the earth.
It has yet to publish a more "official" mission statement, but some of its goals are:
  • decentralization of creativity—getting ordinary people and communities involved with art, science, journalism and other creative industries, especially through new technologies
  • reforming copyright, patent, and trademark law in the public interest, ensuring that new creators are not stifled by old creators
  • making important information available to the public

Purpose

According to its website, Students for Free Culture has four main functions within the free culture movement:
  1. Creating and providing resources for its chapters and for the general public
  2. Outreach to youth and students
  3. Networking with other people, companies and organizations in the free culture movement
  4. Issue advocacy on behalf of its members

History


Initial stirrings at Swarthmore College

Students for Free Culture had its origins in the Swarthmore Coalition for the Digital Commons (SCDC), a student group at Swarthmore College. The SCDC was founded in 2003 by students Luke Smith and Nelson Pavlosky, and was originally focused on issues related to free software, digital restrictions management, and treacherous computing, inspired largely by the Free Software Foundation. After watching Lawrence Lessig's OSCON 2002 speech entitled "free culture", however, they expanded the club's scope to cover cultural participation in general (rather than just the world of software and computers), and began tackling issues such as copyright reform. In September 2004, the SCDC was renamed Free Culture Swarthmore, laying the groundwork for Students for Free Culture and becoming its first chapter.

OPG v. Diebold case

Within a couple of months of founding the SCDC, Smith and Pavlosky became embroiled in the controversy surrounding Diebold Election Systems (now Premier Election Solutions), a voting machine manufacturer accused of making bug-ridden and insecure electronic voting machines. The SCDC had been concerned about electronic voting machines using proprietary software rather than open source software, and kept an eye on the situation. Their alarm grew when a copy of Diebold's internal e-mail archives leaked onto the Internet, revealing questionable practices at Diebold and possible flaws with Diebold's machines, and they were spurred into action when Diebold began sending legal threats to voting activists who posted the e-mails on their websites. Diebold was claiming that the e-mails were their copyrighted material, and that anyone who posted these e-mails online was infringing upon their intellectual property. The SCDC posted the e-mail archive on its website and prepared for the inevitable legal threats.

Diebold sent takedown notices under the DMCA to the SCDC's ISP, Swarthmore College. Swarthmore took down the SCDC website, and the SCDC co-founders sought legal representation. They contacted the Electronic Frontier Foundation for help, and discovered that they had an opportunity to sign on to an existing lawsuit against Diebold, OPG v. Diebold, with co-plaintiffs from a non-profit ISP called the Online Policy Group who had also received legal threats from Diebold. With pro bono legal representation from EFF and the Stanford Cyberlaw Clinic, they sued Diebold for abusing copyright law to suppress freedom of speech online. After a year of legal battles, the judge ruled that posting the e-mails online was a fair use, and that Diebold had violated the DMCA by misrepresenting their copyright claims over the e-mails. 

The network of contacts that Smith and Pavlosky built during the lawsuit, including dozens of students around the country who had also hosted the Diebold memos on their websites, gave them the momentum they needed to found an international student movement based on the same free culture principles as the SCDC. They purchased the domain name FreeCulture.org and began building a website, while contacting student activists at other schools who could help them start the organization.

FreeCulture.org launching at Swarthmore

On April 23, 2004, Smith and Pavlosky announced the official launch of FreeCulture.org at an event at Swarthmore College featuring Lawrence Lessig as the keynote speaker (Lessig had released his book Free Culture less than a month beforehand). The SCDC became the first FreeCulture.org chapter (beginning the process of changing its name to Free Culture Swarthmore), and students from other schools in the area who attended the launch went on to found chapters on their campuses, including at Bryn Mawr College and Franklin & Marshall.

Internet campaigns

FreeCulture.org began by launching a number of internet campaigns, in an attempt to raise its profile and bring itself to the attention of college students. These have covered issues ranging from defending artistic freedom (Barbie in a Blender) to fighting the Induce Act (Save The iPod), from celebrating Creative Commons licenses and the public domain (Undead Art) to opposing business method patents (Cereal Solidarity). While these one-shot websites succeeded in attracting attention from the press and encouraged students to get involved, they didn't directly help the local chapters, and the organization now concentrates less on web campaigns than it did in the past. However, their recent Down With DRM video contest was a successful "viral video" campaign against DRM, and internet campaigns remain an important tool in free culture activism.

Increased emphasis on local chapters

Currently the organization focuses on providing services to its local campus chapters, including web services such as mailing lists and wikis, pamphlets and materials for tabling, and organizing conferences where chapter members can meet up. Active chapters are located at schools such as New York University (NYU), Harvard, MIT, Fordham Law, Dartmouth, University of Florida, Swarthmore, USC, Emory, Reed, and Yale.

The NYU chapter made headlines when it began protesting outside of record stores against DRM on CDs during the Sony rootkit scandal, resulting in similar protests around New York and Philadelphia.
In 2008, the MIT chapter developed and released YouTomb, a website to track videos removed by DMCA takedown from YouTube.

Other activities at local chapters include:
  • art shows featuring Creative Commons-licensed art,
  • mix CD-exchanging flash mobs,
  • film-remixing contests,
  • iPod liberating parties, where the organizers help people replace the proprietary DRM-encumbered operating system on their iPods with a free software system like Rockbox,
  • Antenna Alliance, a project that provides free recording space to bands, releases their music online under Creative Commons licenses, and distributes the music to college radio stations,
  • a campaign to promote open access on university campuses.

Structure

Students for Free Culture began as a loose confederation of student groups on different campuses, but it has been moving towards becoming an official tax-exempt non-profit.

With the passage of official bylaws, Students for Free Culture now has a clear governance structure which makes it accountable to its chapters. The supreme decision-making body is the Board of Directors, which is elected once a year by the chapters using the Schulze voting method (sketched below). It is meant to make long-term, high-level decisions, and should not meddle excessively in lower-level decisions. Practical everyday decisions will be made by the Core Team, composed of any students who are members of chapters and meet the attendance requirements. Low-level decisions and minutiae will be handled by a coordinator, who ideally will be a paid employee of the organization, and by other volunteers and assistants. A new board of directors was elected in February 2008, and a new Core Team was assembled shortly thereafter. There is no coordinator yet.
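As a concrete illustration of the voting rule just mentioned, here is a minimal Python sketch of the Schulze (beatpath) method; the candidates, ballots, and function name are hypothetical, and this is not code used by the organization.

def schulze_winners(candidates, ballots):
    """Schulze method: ballots are rankings (best first); returns the
    candidates whose strongest path to every rival is at least as strong
    as the rival's strongest path back."""
    # d[(a, b)]: number of voters preferring a over b.
    d = {(a, b): 0 for a in candidates for b in candidates if a != b}
    for ballot in ballots:
        for i, a in enumerate(ballot):
            for b in ballot[i + 1:]:
                d[(a, b)] += 1
    # p[(a, b)]: strength of the strongest path from a to b, where a path's
    # strength is its weakest pairwise victory (Floyd-Warshall widest path).
    p = {(a, b): d[(a, b)] if d[(a, b)] > d[(b, a)] else 0
         for a in candidates for b in candidates if a != b}
    for i in candidates:
        for j in candidates:
            for k in candidates:
                if len({i, j, k}) == 3:
                    p[(j, k)] = max(p[(j, k)], min(p[(j, i)], p[(i, k)]))
    return [a for a in candidates
            if all(p[(a, b)] >= p[(b, a)] for b in candidates if b != a)]

# Hypothetical election: 11 ranked ballots over three candidates.
ballots = 5 * [["A", "B", "C"]] + 4 * [["B", "C", "A"]] + 2 * [["C", "A", "B"]]
print(schulze_winners(["A", "B", "C"], ballots))  # ['A']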

Petroleum exploration in the Arctic

 
Location of Arctic Basins assessed by the USGS.

The exploration of the Arctic for petroleum is considered to be extremely technically challenging. However, recent technological developments, as well as relatively high oil prices, have allowed for exploration. As a result, the region has received significant interest from the petroleum industry.

Since the onset of the 2010s oil glut in 2014, the commercial interest in Arctic exploration has declined.

Overview

There are 19 geological basins making up the Arctic region. Some of these basins have experienced oil and gas exploration, most notably the Alaska North Slope where oil was first produced in 1968 from Prudhoe Bay. However, only half the basins – such as the Beaufort Sea and the West Barents Sea – have been explored.

A 2008 United States Geological Survey (USGS) assessment estimated that areas north of the Arctic Circle hold 90 billion barrels of undiscovered, technically recoverable oil (and 44 billion barrels of natural gas liquids) in 25 geologically defined areas thought to have potential for petroleum. This represents 13% of the undiscovered oil in the world. Of the estimated totals, more than half of the undiscovered oil resources are estimated to occur in just three geologic provinces: Arctic Alaska, the Amerasian Basin, and the East Greenland Rift Basins.

More than 70% of the mean undiscovered oil resources are estimated to occur in five provinces: Arctic Alaska, the Amerasia Basin, the East Greenland Rift Basins, the East Barents Basins, and West Greenland–East Canada. It is further estimated that approximately 84% of the undiscovered oil and gas occurs offshore. The USGS did not consider economic factors, such as the effects of permanent sea ice or oceanic water depth, in its assessment of undiscovered oil and gas resources. This assessment is lower than a 2000 survey, which had included lands south of the Arctic Circle.
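As a quick arithmetic sanity check on the figures above (an illustration, not part of the USGS assessment): if the Arctic's 90 billion barrels are 13% of the world's undiscovered oil, the implied world total is roughly 690 billion barrels.

arctic_oil_bbl = 90e9                        # USGS 2008 Arctic oil estimate
implied_world_total = arctic_oil_bbl / 0.13  # 90 bn bbl is said to be 13%
print(f"{implied_world_total / 1e9:.0f} billion barrels")  # ~692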

A recent study carried out by Wood Mackenzie on the Arctic's potential suggests that the likely remaining reserves will be 75% natural gas and 25% oil. It highlights four basins that are likely to be the focus of the petroleum industry in the coming years: the Kronprins Christian Basin, which is likely to have large reserves; the southwest Greenland basin, due to its proximity to markets; and the more oil-prone basins of Laptev and Baffin Bay.

Timeline
Year | Region | Milestone
1964 | Cook Inlet | shallow water steel platform in moving ice
1969 | North West Passage | first commercial ship (SS Manhattan) to transit NW passage
1971 | Canadian Beaufort | shallow water sand island exploration
1974 | Arctic Islands | shallow and deep water ice islands exploration
1976 | Canadian Beaufort | 20–70 m water depth ice-strengthened drillship exploration (Canmar drillship)
1981 | Canadian Beaufort | shallow water caisson exploration (Tarsiut caissons)
1983 | Canadian Beaufort | 20–70 m ice-resistant floating exploration drilling
1984 | US & Canadian Beaufort | shallow water caisson & gravity based structure exploration (SDC drilling)
1987 | US & Canadian Beaufort | spray ice islands used to reduce cost
1998 | Sakhalin | extension of Molikpaq caisson for early production in ice
2007 | Barents Sea | subsea to shore LNG (Snøhvit field)
2007/08 | Sakhalin | shallow water ice-resistant GBS production
2008 | Varandey | 1st arctic offshore tanker loading terminal
2012 | West Greenland | deepwater floating exploration drilling in ice
2014 | Pechora Sea | 1st shallow water year-round manned GBS production in the Arctic (Prirazlomnaya platform)

 

Canada

Extensive drilling was done in the Canadian Arctic during the 1970s and 1980s by companies such as Panarctic Oils Ltd., Petro-Canada and Dome Petroleum. After 176 wells were drilled at a cost of billions of dollars, approximately 1.9 billion barrels (about 300 million m³) of oil and 19.8 trillion cubic feet (about 560 billion m³) of natural gas were found. These discoveries were insufficient to justify development, and all the wells drilled were plugged and abandoned.

Drilling in the Canadian Arctic proved expensive and dangerous. The geology of the Canadian Arctic turned out to be far more complex than that of oil-producing regions like the Gulf of Mexico. It was discovered to be gas-prone rather than oil-prone (i.e. most of the oil had been transformed into natural gas by geological processes), and most of the reservoirs had been fractured by tectonic activity, allowing most of the petroleum that might once have been present to leak out.

Russia

In June 2007, a group of Russian geologists returned from a six-week voyage aboard the nuclear icebreaker 50 Let Pobedy on an expedition called Arktika 2007. They had travelled to the Lomonosov Ridge, an underwater ridge running between Russia's remote, inhospitable eastern Arctic coast and Ellesmere Island in Canada, where the ridge lies 400 m under the ocean surface.

According to Russian media, the geologists returned with the "sensational news" that the Lomonosov Ridge was linked to Russian Federation territory, boosting Russia's claim over the oil-and-gas-rich triangle. The territory contained 10 billion tonnes of gas and oil deposits, the scientists said.

Greenland

Greenland is believed by some geologists to have some of the world's largest remaining oil resources. Prospecting is taking place under the auspices of NUNAOIL, a partnership between the Greenland Home Rule Government and the Danish state. The U.S. Geological Survey found in 2001 that the waters off north-eastern Greenland, in the Greenland Sea north and south of the Arctic Circle, could contain up to 110 billion barrels (about 17 billion m³) of oil.

Greenland has offered 8 licence blocks for tender along its west coast by Baffin Bay. Currently, 7 of those blocks have been bid for by a combination of multinational oil companies and the national oil company NUNAOIL. Companies that participated successfully in the previous licence rounds and have formed partnerships for the licences with NUNAOIL include DONG Energy, Chevron, ExxonMobil, Husky Energy and Cairn Energy. The area available, known as the West Disko licensing round, is of interest because of its relative accessibility compared to other Arctic basins, as the area remains largely free of ice. It also has a number of promising geological leads and prospects of Paleocene age.

United States

Prudhoe Bay Oil Field on Alaska's North Slope is the largest oil field in North America. The field was discovered on March 12, 1968, by Atlantic Richfield Company (ARCO) and is operated by BP; partners are ExxonMobil and ConocoPhillips Alaska.

In September 2012, Shell delayed oil drilling in the Chukchi Sea until the following summer due to heavier-than-normal ice and because the Arctic Challenger, an oil-spill response vessel, was not ready on time. However, on September 23, Shell began drilling a "top-hole" over its Burger prospect in the Chukchi, and on October 3 it began drilling a top-hole over its Sivulliq prospect in the Beaufort Sea, after being notified by the Alaska Eskimo Whaling Commission that drilling could begin.

In September 2012, Statoil chose to delay oil exploration at its Amundsen prospect in the Chukchi Sea, about 100 miles northwest of Wainwright, Alaska, by at least one year, to 2015 at the earliest.

In 2012, Conoco planned to drill at its Devil's Paw prospect (part of a 2008 lease buy in the Chukchi Sea, 120 miles west of Wainwright) in the summer of 2013. The project was shelved in 2013 after concerns over rig type and federal regulations related to runaway-well containment.

On October 11, 2012, Deputy Secretary of the Interior David Hayes stated that support for the permitting process for Arctic offshore petroleum drilling would continue if President Obama stayed in office.

Shell, however, announced in September 2015 that it was abandoning exploration "for the foreseeable future" in Alaska, after tests showed disappointing quantities of oil and gas in the area.

On October 4, 2016 Caelus Energy Alaska announced its discovery at Smith Bay could "provide 200,000 barrels per day of light, highly mobile oil".

Norway

Rosneft and Statoil made an Arctic exploration deal in May 2012. It was the third deal Rosneft had signed in the space of a month, after Arctic exploration agreements with Italy's Eni and the US giant ExxonMobil. Compared to other Arctic oil states, Norway is probably best equipped for oil-spill preparedness in the Arctic.

Environmental concerns

Greenpeace has launched the Save the Arctic campaign, arguing that the melting Arctic is under threat from oil drilling, industrial fishing and conflict.


Hydrocarbon exploration

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Hydrocarbon_exploration
 

Hydrocarbon exploration (or oil and gas exploration) is the search by petroleum geologists and geophysicists for deposits of hydrocarbons, particularly petroleum and natural gas, in the Earth using petroleum geology.

Exploration methods

Visible surface features such as oil seeps, natural gas seeps and pockmarks (underwater craters caused by escaping gas) provide basic evidence of hydrocarbon generation (be it shallow or deep in the Earth). However, most exploration depends on highly sophisticated technology to detect and determine the extent of these deposits using exploration geophysics. Areas thought to contain hydrocarbons are initially subjected to a gravity survey, magnetic survey, passive seismic or regional seismic reflection surveys to detect large-scale features of the sub-surface geology. Features of interest (known as leads) are subjected to more detailed seismic surveys, which work on the principle of the time it takes for reflected sound waves to travel through matter (rock) of varying densities, and which use the process of depth conversion to create a profile of the substructure. Finally, when a prospect has been identified and evaluated and passes the oil company's selection criteria, an exploration well is drilled in an attempt to conclusively determine the presence or absence of oil or gas. Offshore, the risk can be reduced by using electromagnetic methods.
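To make the depth-conversion step concrete, here is a minimal Python sketch that converts two-way reflection travel times into reflector depths, assuming a simple layered model with constant interval velocities; the function name and all values are illustrative, not a production workflow.

def depth_convert(two_way_times_s, interval_velocities_ms):
    """Convert cumulative two-way reflection times (seconds, increasing)
    to reflector depths, given each layer's P-wave velocity (m/s)."""
    depths = []
    depth = 0.0
    prev_t = 0.0
    for t, v in zip(two_way_times_s, interval_velocities_ms):
        # One-way time spent inside this layer is half the extra two-way time.
        dt_one_way = (t - prev_t) / 2.0
        depth += v * dt_one_way
        depths.append(depth)
        prev_t = t
    return depths

# Hypothetical example: three layers with velocity increasing with depth.
print(depth_convert([0.8, 1.5, 2.1], [2000.0, 3000.0, 4000.0]))
# -> [800.0, 1850.0, 3050.0] (metres)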

Oil exploration is an expensive, high-risk operation. Offshore and remote-area exploration is generally only undertaken by very large corporations or national governments. Typical shallow-shelf oil wells (e.g. in the North Sea) cost US$10–30 million, while deep-water wells can cost US$100 million or more. Hundreds of smaller companies search for onshore hydrocarbon deposits worldwide, with some wells costing as little as US$100,000.

Elements of a petroleum prospect

Mud log in process, a common way to study the rock types when drilling oil wells.

A prospect is a potential trap which geologists believe may contain hydrocarbons. A significant amount of geological, structural and seismic investigation must first be completed to progress a potential hydrocarbon drilling location from a lead to a prospect. Four geological factors have to be present for a prospect to work, and if any of them fails, neither oil nor gas will be present.
Source rock 
When organic-rich rock such as oil shale or coal is subjected to high pressure and temperature over an extended period of time, hydrocarbons form.
Migration 
The hydrocarbons are expelled from the source rock by three density-related mechanisms: the newly matured hydrocarbons are less dense than their precursors, which causes over-pressure; the hydrocarbons are lighter, and so migrate upwards due to buoyancy; and the fluids expand as further burial causes increased heating. Most hydrocarbons migrate to the surface as oil seeps, but some will get trapped.
Reservoir 
The hydrocarbons are contained in a reservoir rock. This is commonly a porous sandstone or limestone. The oil collects in the pores within the rock although open fractures within non-porous rocks (e.g. fractured granite) may also store hydrocarbons. The reservoir must also be permeable so that the hydrocarbons will flow to surface during production.
Trap 
The hydrocarbons are buoyant and have to be trapped within a structural (e.g. anticline, fault block) or stratigraphic trap. The hydrocarbon trap has to be covered by an impermeable rock, known as a seal or cap-rock, in order to prevent hydrocarbons escaping to the surface.

Exploration risk

Hydrocarbon exploration is a high-risk investment, and risk assessment is paramount for successful project portfolio management. Exploration risk is a difficult concept and is usually defined by assigning confidence to the presence of the imperative geological factors, as discussed above. This confidence is based on data and/or models and is usually mapped on Common Risk Segment Maps (CRS Maps). High confidence in the presence of imperative geological factors is usually coloured green, and low confidence coloured red; these maps are therefore also called Traffic Light Maps, while the full procedure is often referred to as Play Fairway Analysis. The aim of such procedures is to force the geologist to objectively assess all the different geological factors. Furthermore, it results in simple maps that can be understood by non-geologists and managers, on which exploration decisions can be based.
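To make this concrete, here is a minimal Python sketch of how a traffic-light map might be assembled, assuming cell-wise confidences for each geological factor are combined as a product (i.e. treating the factors as independent); the grids, cut-offs and function names are illustrative assumptions, not an industry standard.

from math import prod

def combine_factors(*grids):
    """Cell-wise product of per-factor confidence grids (values in 0..1),
    giving the chance that all factors work in each map cell."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [[prod(g[r][c] for g in grids) for c in range(cols)]
            for r in range(rows)]

def traffic_light(p, green=0.5, red=0.2):
    """Map a combined confidence to a CRS-style colour."""
    return "green" if p >= green else ("red" if p < red else "yellow")

# Hypothetical 2x2 confidence grids for source, reservoir and seal.
source    = [[0.9, 0.8], [0.4, 0.9]]
reservoir = [[0.8, 0.9], [0.9, 0.3]]
seal      = [[0.9, 0.5], [0.8, 0.9]]

combined = combine_factors(source, reservoir, seal)
print([[traffic_light(p) for p in row] for row in combined])
# -> [['green', 'yellow'], ['yellow', 'yellow']]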

Terms used in petroleum evaluation

Bright spot 
On a seismic section, coda that have high amplitudes due to a formation containing hydrocarbons.
Chance of success
An estimate of the chance of all the elements (see above) within a prospect working, described as a probability.
Dry hole 
A boring that does not contain commercial hydrocarbons.
Flat spot 
Possibly an oil-water, gas-water or gas-oil contact on a seismic section; flat due to gravity.
Full Waveform Inversion 
A supercomputer technique recently used in conjunction with seismic sensors to explore for petroleum deposits offshore.
Hydrocarbon in place 
Amount of hydrocarbon likely to be contained in the prospect. This is calculated using the volumetric equation GRV × N/G × Porosity × Sh / FVF, whose terms are defined below; a worked sketch follows this glossary.
Gross rock volume (GRV) 
Amount of rock in the trap above the hydrocarbon water contact
Net sand 
Part of GRV that has the lithological capacity to be a productive zone, i.e. excluding shale contamination.
Net reserve 
Part of net sand that has the minimum reservoir qualities; i.e. minimum porosity and permeability values.
Net/gross ratio (N/G) 
Proportion of the GRV formed by the reservoir rock (range is 0 to 1)
Porosity 
Percentage of the net reservoir rock occupied by pores (typically 5-35%)
Hydrocarbon saturation (Sh) 
Some of the pore space is filled with water - this must be discounted
Formation volume factor (FVF) 
Oil shrinks and gas expands when brought to the surface. The FVF converts volumes at reservoir conditions (high pressure and high temperature) to storage and sale conditions
Lead 
Potential accumulation is currently poorly defined and requires more data acquisition and/or evaluation in order to be classified as a prospect.
Play 
An area in which hydrocarbon accumulations or prospects of a given type occur. For example, the shale gas plays in North America include the Barnett, Eagle Ford, Fayetteville, Haynesville, Marcellus, and Woodford, among many others.
Prospect 
A lead which has been more fully evaluated.
Recoverable hydrocarbons 
Amount of hydrocarbon likely to be recovered during production. This is typically 10-50% in an oil field and 50-80% in a gas field.
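To tie the glossary together, here is a minimal Python sketch of the volumetric equation given above under "Hydrocarbon in place"; the prospect geometry, petrophysical values and function names are purely hypothetical.

def hydrocarbon_in_place(grv_m3, net_to_gross, porosity, sh, fvf):
    """Volume of hydrocarbon in place: GRV x N/G x Porosity x Sh / FVF."""
    return grv_m3 * net_to_gross * porosity * sh / fvf

def recoverable(in_place, recovery_factor):
    """Recoverable volume for a given recovery factor (e.g. 0.10-0.50 for oil)."""
    return in_place * recovery_factor

# Hypothetical prospect: 2 km x 5 km trap area with 50 m gross thickness.
grv = 2_000 * 5_000 * 50                      # gross rock volume, m^3
in_place = hydrocarbon_in_place(grv, net_to_gross=0.6, porosity=0.2,
                                sh=0.7, fvf=1.2)
print(f"In place: {in_place:,.0f} m^3; "
      f"recoverable at 30%: {recoverable(in_place, 0.3):,.0f} m^3")
# -> In place: 35,000,000 m^3; recoverable at 30%: 10,500,000 m^3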

Licensing

Petroleum resources are typically owned by the government of the host country. In the United States, most onshore (land) oil and gas rights (OGM) are owned by private individuals, in which case oil companies must negotiate terms for a lease of these rights with the individual who owns the OGM. Sometimes this is not the same person who owns the land surface. In most nations the government issues licences to explore, develop and produce its oil and gas resources, which are typically administered by the oil ministry. There are several different types of licence. Oil companies often operate in joint ventures to spread the risk; one of the companies in the partnership is designated the operator who actually supervises the work.
Tax and Royalty 
Companies would pay a royalty on any oil produced, together with a profits tax (which can have expenditure offset against it). In some cases there are also various bonuses and ground rents (license fees) payable to the government - for example a signature bonus payable at the start of the licence. Licences are awarded in competitive bid rounds on the basis of either the size of the work programme (number of wells, seismic etc.) or size of the signature bonus.
Production Sharing contract (PSA) 
A PSA is more complex than a tax/royalty system: the companies bid on the percentage of the production that the host government receives (this may be variable with the oil price). There is often also participation by the government-owned National Oil Company (NOC). There are also various bonuses to be paid. Development expenditure is offset against production revenue.
Service contract 
This is when an oil company acts as a contractor for the host government, being paid to produce the hydrocarbons.
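To make the contrast between the first two regimes concrete, here is a minimal Python sketch with purely illustrative rates; the function names and figures are hypothetical assumptions, and real licence terms are considerably more detailed.

def tax_royalty_take(revenue, costs, royalty_rate, tax_rate):
    """Government take under a tax-and-royalty regime: royalty is levied on
    gross revenue; profits tax applies after costs and royalty are deducted."""
    royalty = revenue * royalty_rate
    taxable_profit = max(revenue - royalty - costs, 0.0)
    return royalty + taxable_profit * tax_rate

def psa_take(revenue, costs, profit_oil_share):
    """Government take under a simple PSA: costs are recovered first
    ("cost oil"); the remaining "profit oil" is split with the government."""
    profit_oil = max(revenue - costs, 0.0)
    return profit_oil * profit_oil_share

revenue, costs = 100.0, 40.0   # hypothetical $ millions
print(tax_royalty_take(revenue, costs, royalty_rate=0.1, tax_rate=0.4))  # 30.0
print(psa_take(revenue, costs, profit_oil_share=0.5))                    # 30.0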

Reserves and resources

Resources are hydrocarbons which may or may not be produced in the future. A resource number may be assigned to an undrilled prospect or an unappraised discovery. Appraisal by drilling additional delineation wells or acquiring extra seismic data will confirm the size of the field and lead to project sanction. At this point the relevant government body gives the oil company a production licence which enables the field to be developed. This is also the point at which oil reserves and gas reserves can be formally booked.

Oil and gas reserves

Oil and gas reserves are defined as volumes that will be commercially recovered in the future. Reserves are separated into three categories: proved, probable, and possible. To be included in any reserves category, all commercial aspects must have been addressed, which includes government consent. Technical issues alone separate proved from unproved categories. All reserve estimates involve some degree of uncertainty.
  • Proved reserves are the highest valued category. Proved reserves have a "reasonable certainty" of being recovered, which means a high degree of confidence that the volumes will be recovered. Some industry specialists refer to this as P90, i.e., having a 90% certainty of being produced. The SEC provides a more detailed definition:
Proved oil and gas reserves are those quantities of oil and gas, which, by analysis of geoscience and engineering data, can be estimated with reasonable certainty to be economically producible—from a given date forward, from known reservoirs, and under existing economic conditions, operating methods, and government regulations—prior to the time at which contracts providing the right to operate expire, unless evidence indicates that renewal is reasonably certain, regardless of whether deterministic or probabilistic methods are used for the estimation. The project to extract the hydrocarbons must have commenced or the operator must be reasonably certain that it will commence the project within a reasonable time.
  • Probable reserves are volumes defined as "less likely to be recovered than proved, but more certain to be recovered than Possible Reserves". Some industry specialists refer to this as P50, i.e., having a 50% certainty of being produced.
  • Possible reserves are reserves which analysis of geological and engineering data suggests are less likely to be recoverable than probable reserves. Some industry specialists refer to this as P10, i.e., having a 10% certainty of being produced.
The term 1P is frequently used to denote proved reserves; 2P is the sum of proved and probable reserves; and 3P the sum of proved, probable, and possible reserves. The best estimate of recovery from committed projects is generally considered to be the 2P sum of proved and probable reserves. Note that these volumes only refer to currently justified projects or those projects already in development.
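To illustrate how P90/P50/P10 figures can be read off a distribution of outcomes, here is a minimal Monte Carlo sketch in Python; the lognormal in-place distribution, the uniform recovery factor and the function names are illustrative assumptions, not an industry-standard workflow.

import random

def simulate_recoverable(n=100_000):
    """Sample recoverable volumes (millions of barrels, hypothetical inputs)."""
    volumes = []
    for _ in range(n):
        in_place = random.lognormvariate(mu=4.0, sigma=0.5)  # in place, MMbbl
        recovery = random.uniform(0.10, 0.50)                # oil recovery factor
        volumes.append(in_place * recovery)
    return sorted(volumes)

def exceedance(sorted_vols, p):
    """Volume with probability p of being met or exceeded (p=0.9 -> P90)."""
    return sorted_vols[int(len(sorted_vols) * (1.0 - p))]

vols = simulate_recoverable()
for label, p in [("P90 (proved)", 0.90), ("P50 (probable)", 0.50),
                 ("P10 (possible)", 0.10)]:
    print(label, round(exceedance(vols, p), 1), "MMbbl")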

Reserve booking

Oil and gas reserves are the main asset of an oil company. Booking is the process by which they are added to the balance sheet. 

In the United States, booking is done according to a set of rules developed by the Society of Petroleum Engineers (SPE). The reserves of any company listed on the New York Stock Exchange have to be stated to the U.S. Securities and Exchange Commission. Reported reserves may be audited by outside geologists, although this is not a legal requirement. 

In Russia, companies report their reserves to the State Commission on Mineral Reserves (GKZ).

Space travel in science fiction

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Space_travel_in_science_fiction
Rock...