Sunday, December 1, 2024

Health 2.0

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Health_2.0

"Health 2.0" is a term introduced in the mid-2000s for the subset of health care technologies mirroring the wider Web 2.0 movement. It has been defined variously as including social media, user-generated content, and cloud-based and mobile technologies. Some Health 2.0 proponents see these technologies as empowering patients to have greater control over their own health care and diminishing medical paternalism. Critics of the technologies have expressed concerns about possible misinformation and violations of patient privacy.

History

Health 2.0 built on the possibilities for changing health care, which started with the introduction of eHealth in the mid-1990s following the emergence of the World Wide Web. In the mid-2000s, following the widespread adoption both of the Internet and of easy-to-use tools for communication, social networking, and self-publishing, there was a spate of media attention to, and increasing interest from, patients, clinicians, and medical librarians in using these tools for health care and medical purposes.

Early examples of Health 2.0 were the use of a specific set of Web tools (blogs, email list-servs, online communities, podcasts, search, tagging, Twitter, videos, wikis, and more) by actors in health care including doctors, patients, and scientists, using principles of open source and user-generated content, and the power of networks and social networks in order to personalize health care, to collaborate, and to promote health education. Possible explanations why health care has generated its own "2.0" term are the availability and proliferation of Health 2.0 applications across health care in general, and the potential for improving public health in particular.

Current use

While the "2.0" moniker was originally associated with concepts like collaboration, openness, participation, and social networking, in recent years the term "Health 2.0" has evolved to mean the role of SaaS and cloud-based technologies, and their associated applications on multiple devices. Health 2.0 describes the integration of these into much of general clinical and administrative workflow in health care. As of 2014, approximately 3,000 companies were offering products and services matching this definition, with venture capital funding in the sector exceeding $2.3 billion in 2013.

Public Health 2.0

Public Health 2.0 is a movement within public health that aims to make the field more accessible to the general public and more user-driven. The term is used in three senses. In the first sense, "Public Health 2.0" is similar to "Health 2.0" and describes the ways in which traditional public health practitioners and institutions are reaching out (or could reach out) to the public through social media and health blogs.

In the second sense, "Public Health 2.0" describes public health research that uses data gathered from social networking sites, search engine queries, cell phones, or other technologies. A recent example is the proposal of a statistical framework that uses online user-generated content (from social media or search engine queries) to estimate the impact of an influenza vaccination campaign in the UK.
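The core idea behind such frameworks is counterfactual estimation: calibrate a query-based illness signal against a comparable unvaccinated population, then measure how far observed rates fall below what the calibration predicts. The sketch below only conveys that idea in miniature; the published framework uses far more sophisticated regression over query data, and all numbers here are invented for illustration.

```python
# Weekly influenza-like-illness rates (per 1,000) inferred from search
# queries, for an area where a vaccination campaign ran and a comparable
# control area where it did not. Values are synthetic.
campaign_area = [2.1, 2.4, 2.2, 1.1, 0.9, 0.8]
control_area  = [2.0, 2.3, 2.1, 2.6, 2.4, 2.2]

weeks_pre = 3  # the campaign starts at week 4

# Calibrate on the pre-campaign weeks: ratio between the two areas.
ratio = sum(campaign_area[:weeks_pre]) / sum(control_area[:weeks_pre])

# Counterfactual: what the campaign area would have looked like without
# the campaign, projected from the control area.
counterfactual = [r * ratio for r in control_area[weeks_pre:]]
observed = campaign_area[weeks_pre:]

# Estimated impact: illness rate averted per week, on average.
averted = [c - o for c, o in zip(counterfactual, observed)]
print(round(sum(averted) / len(averted), 2))  # → 1.58
```

The same structure scales up directly: replace the hand-written lists with query-derived time series and the ratio with a fitted regression model.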

In the third sense, "Public Health 2.0" is used to describe public health activities that are completely user-driven. An example is the collection and sharing of information about environmental radiation levels after the March 2011 tsunami in Japan. In all cases, Public Health 2.0 draws on ideas from Web 2.0, such as crowdsourcing, information sharing, and user-centered design. While many individual healthcare providers have started making their own contributions to "Public Health 2.0" through personal blogs, social profiles, and websites, larger organizations, such as the American Heart Association (AHA) and United Medical Education (UME), maintain teams of employees centered on online health education, research, and training. These private organizations recognize the need for free and easily accessible health materials, often building libraries of educational articles.

Definitions

The "traditional" definition of "Health 2.0" focused on technology as an enabler for care collaboration: "The use of social software and light-weight tools to promote collaboration between patients, their caregivers, medical professionals, and other stakeholders in health."

In 2011, Indu Subaiya redefined Health 2.0 as the use in health care of new cloud, SaaS, mobile, and device technologies that are:

  1. Adaptable technologies which easily allow other tools and applications to link and integrate with them, primarily through use of accessible APIs
  2. Focused on the user experience, bringing in the principles of user-centered design
  3. Data driven, in that they both create data and present data to the user in order to help improve decision making

This wider definition allows recognition of what is or is not a Health 2.0 technology. Typically, enterprise-based, customized client-server systems are not, while more open, cloud-based systems fit the definition. However, this line was blurring by 2011–12 as more enterprise vendors started to introduce cloud-based systems and native applications for new devices like smartphones and tablets.
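Subaiya's three criteria can be made concrete with a toy component. The sketch below is a hypothetical illustration (all names are invented, and real Health 2.0 products are of course full services, not single classes): it is adaptable, exposing its data as JSON so other tools can integrate; user-centred, presenting readings in plain language; and data-driven, turning raw readings into a decision-support hint.

```python
import json

class GlucoseLog:
    """Toy glucose tracker illustrating the three Health 2.0 criteria."""

    def __init__(self):
        self.readings = []  # (day, mmol/L) pairs

    def record(self, day, mmol_l):
        self.readings.append((day, mmol_l))

    def to_json(self):
        # 1. Adaptable: a machine-readable interface other tools can consume.
        return json.dumps([{"day": d, "mmol_l": v} for d, v in self.readings])

    def summary(self):
        # 2. User-centred: a plain-language view for the patient.
        latest = self.readings[-1][1]
        return f"Latest reading: {latest} mmol/L ({self.trend()})"

    def trend(self):
        # 3. Data-driven: raw data is turned into a decision-support hint.
        if len(self.readings) < 2:
            return "not enough data"
        return "rising" if self.readings[-1][1] > self.readings[-2][1] else "stable or falling"

log = GlucoseLog()
log.record("Mon", 5.4)
log.record("Tue", 6.1)
print(log.summary())  # → Latest reading: 6.1 mmol/L (rising)
```

A customized client-server system would typically offer only the second of these views; it is the JSON-style integration surface and the data-driven feedback loop that mark a product as Health 2.0 under this definition.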

In addition, Health 2.0 has several competing terms, each with its own followers—if not exact definitions—including Connected Health, Digital Health, Medicine 2.0, and mHealth. All of these support a goal of wider change to the health care system, using technology-enabled system reform—usually changing the relationship between patient and professional. An earlier, feature-based definition described Health 2.0 as:

  1. Personalized search that looks into the long tail but cares about the user experience
  2. Communities that capture the accumulated knowledge of patients, caregivers, and clinicians, and explain it to the world
  3. Intelligent tools for content delivery—and transactions
  4. Better integration of data with content

Wider health system definitions

In the late 2000s, several commentators used Health 2.0 as a moniker for a wider concept of system reform, seeking a participatory process between patient and clinician: "New concept of health care wherein all the constituents (patients, physicians, providers, and payers) focus on health care value (outcomes/price) and use competition at the medical condition level over the full cycle of care as the catalyst for improving the safety, efficiency, and quality of health care".

Health 2.0 defines the combination of health data and health information with (patient) experience, through the use of ICT, enabling the citizen to become an active and responsible partner in his/her own health and care pathway.

Health 2.0 is participatory healthcare. Enabled by information, software, and communities that we collect or create, we the patients can be effective partners in our own healthcare, and we the people can participate in reshaping the health system itself.

Definitions of Medicine 2.0 appear to be very similar but typically include more scientific and research aspects: "Medicine 2.0 applications, services and tools are Web-based services for health care consumers, caregivers, patients, health professionals, and biomedical researchers, that use Web 2.0 technologies as well as semantic web and virtual reality tools, to enable and facilitate specifically social networking, participation, apomediation, collaboration, and openness within and between these user groups." A systematic review by Tom Van de Belt, Lucien Engelen and colleagues, published in JMIR, found 46 unique definitions of Health 2.0.

Overview

A model of Health 2.0

Health 2.0 refers to the use of a diverse set of technologies including Connected Health, electronic medical records, mHealth, telemedicine, and the use of the Internet by patients themselves such as through blogs, Internet forums, online communities, patient-to-physician communication systems, and other more advanced systems. A key concept is that patients themselves should have greater insight into, and control over, information generated about them. Additionally, Health 2.0 relies on the use of modern cloud and mobile-based technologies.

Much of the potential for change from Health 2.0 is facilitated by combining technology driven trends such as Personal Health Records with social networking —"[which] may lead to a powerful new generation of health applications, where people share parts of their electronic health records with other consumers and 'crowdsource' the collective wisdom of other patients and professionals." Traditional models of medicine had patient records (held on paper or a proprietary computer system) that could only be accessed by a physician or other medical professional. Physicians acted as gatekeepers to this information, telling patients test results when and if they deemed it necessary. Such a model operates relatively well in situations such as acute care, where information about specific blood results would be of little use to a lay person, or in general practice where results were generally benign. However, in the case of complex chronic diseases, psychiatric disorders, or diseases of unknown etiology patients were at risk of being left without well-coordinated care because data about them was stored in a variety of disparate places and in some cases might contain the opinions of healthcare professionals which were not to be shared with the patient. Increasingly, medical ethics deems such actions to be medical paternalism, and they are discouraged in modern medicine.

A hypothetical example demonstrates the increased engagement of a patient operating in a Health 2.0 setting: a patient goes to see their primary care physician with a presenting complaint, having first ensured their own medical record was up to date via the Internet. The treating physician might make a diagnosis or send for tests, the results of which could be transmitted directly to the patient's electronic medical record. If a second appointment is needed, the patient will have had time to research what the results might mean for them, what diagnoses may be likely, and may have communicated with other patients who have had a similar set of results in the past. On a second visit a referral might be made to a specialist. The patient might have the opportunity to search for the views of other patients on the best specialist to go to, and in combination with their primary care physician decides whom to see. The specialist gives a diagnosis along with a prognosis and potential options for treatment. The patient has the opportunity to research these treatment options and take a more proactive role in coming to a joint decision with their healthcare provider. They can also choose to submit more data about themselves, such as through a personalized genomics service to identify any risk factors that might improve or worsen their prognosis. As treatment commences, the patient can track their health outcomes through a data-sharing patient community to determine whether the treatment is having an effect for them, and they can stay up to date on research opportunities and clinical trials for their condition. They also have the social support of communicating with other patients diagnosed with the same condition throughout the world.

Level of use of Web 2.0 in health care

Partly due to weak definitions, the novelty of the endeavor and its nature as an entrepreneurial (rather than academic) movement, little empirical evidence exists to explain how much Web 2.0 is being used in general. While it has been estimated that nearly one-third of the 100 million Americans who have looked for health information online say that they or people they know have been significantly helped by what they found, this study considers only the broader use of the Internet for health management.

A study examining physician practices has suggested that a segment of 245,000 physicians in the U.S. are using Web 2.0 in their practice, indicating that use is beyond the early-adopter stage with regard to physicians and Web 2.0.

Types of Web 2.0 technology in health care

Web 2.0 is commonly associated with technologies such as podcasts, RSS feeds, social bookmarking, weblogs (health blogs), wikis, and other forms of many-to-many publishing; social software; and web application programming interfaces (APIs).
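RSS is the simplest of these technologies to demonstrate. The sketch below reads the feed of a hypothetical health podcast; the feed content is inlined here for self-containment, whereas in practice it would be fetched from the podcast's URL.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed for an invented health podcast.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Health Podcast</title>
    <item><title>Managing chronic pain</title><pubDate>Mon, 04 Nov 2024 00:00:00 GMT</pubDate></item>
    <item><title>Vaccine myths debunked</title><pubDate>Mon, 11 Nov 2024 00:00:00 GMT</pubDate></item>
  </channel>
</rss>"""

# Parse the XML and pull out the episode titles.
root = ET.fromstring(feed)
episodes = [item.findtext("title") for item in root.iter("item")]
print(episodes)  # → ['Managing chronic pain', 'Vaccine myths debunked']
```

This subscribe-and-aggregate pattern is what lets clinicians and patients stay informed across many sources without visiting each site individually.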

The following are examples of uses that have been documented in academic literature.

Purpose: Staying informed
Description: Used to stay informed of the latest developments in a particular field
Case example in academic literature: Podcasts, RSS, and search tools
Users: All (medical professionals and public)

Purpose: Medical education
Description: Used for professional development for doctors, and for public health promotion by public health professionals and the general public
Case example in academic literature: How podcasts can be used on the move to increase total available educational time, or the many applications of these tools to public health
Users: All (medical professionals and public)

Purpose: Collaboration and practice
Description: Web 2.0 tools used in daily practice by medical professionals to find information and make decisions
Case example in academic literature: Google searches revealed the correct diagnosis in 15 out of 26 cases (58%, 95% confidence interval 38% to 77%) in a 2005 study
Users: Doctors, nurses

Purpose: Managing a particular disease
Description: Patients who use search tools to find information about a particular condition
Case example in academic literature: Patients have been shown to have different patterns of usage depending on whether they are newly diagnosed or managing a severe long-term illness; long-term patients are more likely to connect to a community in Health 2.0
Users: Public

Purpose: Sharing data for research
Description: Completing patient-reported outcomes and aggregating the data for personal and scientific research
Case example in academic literature: Disease-specific communities for patients with rare conditions aggregate data on treatments, symptoms, and outcomes to improve their decision-making ability and carry out scientific research such as observational trials
Users: All (medical professionals and public)
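The figures quoted for the 2005 Google-diagnosis study (15 correct out of 26) can be checked directly. The study's exact interval method is not stated here; a standard normal-approximation confidence interval, sketched below, gives 39% to 77%, essentially matching the reported 38% to 77% (the one-point difference suggests rounding or a slightly different interval method).

```python
import math

successes, n = 15, 26
p = successes / n  # observed proportion of correct diagnoses

# 95% confidence interval via the normal approximation:
# p ± 1.96 * sqrt(p(1-p)/n)
half_width = 1.96 * math.sqrt(p * (1 - p) / n)
lo, hi = p - half_width, p + half_width

print(f"{p:.0%} (95% CI {lo:.0%} to {hi:.0%})")  # → 58% (95% CI 39% to 77%)
```

The width of the interval (almost 40 percentage points) is itself instructive: with only 26 cases, the study cannot pin down Google's diagnostic accuracy very precisely, which is consistent with the dispute over its value noted later in this section.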

Criticism of the use of Web 2.0 in health care

Hughes et al. (2009) argue there are four major tensions represented in the literature on Health/Medicine 2.0. These concern:

  1. the lack of clear definitions
  2. issues around the loss of control over information that doctors perceive
  3. safety and the dangers of inaccurate information
  4. issues of ownership and privacy

Several criticisms have been raised about the use of Web 2.0 in health care. Firstly, Google has limitations as a diagnostic tool for medical doctors, as it may be effective only for conditions with unique symptoms and signs that can easily be used as search terms; studies of its accuracy have returned varying results, and its value remains in dispute. Secondly, there are long-held concerns about the effects of patients obtaining information online, such as the possibility that patients may delay seeking medical advice or accidentally reveal private medical data. Finally, concerns exist about the quality of user-generated content leading to misinformation, such as perpetuating the discredited claim that the MMR vaccine causes autism. In contrast, a 2004 study of a British epilepsy online support group suggested that only 6% of the information it carried was factually wrong. In a 2007 Pew Research Center survey of Americans, only 3% reported that online advice had caused them serious harm, while nearly one-third reported that they or their acquaintances had been helped by online health advice.

Saturday, November 30, 2024

Popular education

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Popular_education

Popular education is a concept grounded in notions of class, political struggle, critical theory and social transformation. The term is a translation of the Spanish educación popular or the Portuguese educação popular. The term 'popular' in this case means 'of the people'. More specifically, 'popular' refers to the 'popular classes', which include peasants, the unemployed, the working class and sometimes the lower middle class. The designation 'popular' is meant most of all to exclude the upper class and upper middle class.

Popular education is used to classify a wide array of educational endeavours and has been a strong tradition in Latin America since the mid-20th century. These endeavours are either composed of or carried out in the interests of the popular classes. The diversity of projects claiming or receiving the label makes the term difficult to define precisely. Generally, one can say that popular education is class-based in nature and rejects the notion of education as transmission or 'banking education'. It stresses a dialectic or dialogical model between educator and educand. This model is explored in great detail in the works of one of the foremost popular educators, Paulo Freire.

Though sharing many similarities with other forms of alternative education, popular education is a distinct form in its own right. In the words of Liam Kane: "What distinguishes popular education from 'adult', 'non-formal', 'distance', or 'permanent education', for example, is that in the context of social injustice, education can never be politically neutral: if it does not side with the poorest and marginalised sectors – the 'oppressed' – in an attempt to transform society, then it necessarily sides with the 'oppressors' in maintaining the existing structures of oppression, even if by default."

Europe

Popular education began at the crossroads between politics and pedagogy, and relies strongly on the democratic ideal of the Enlightenment, which considered public education a main tool of individual and collective emancipation, and thus a necessary condition of autonomy, in accordance with Immanuel Kant's Was ist Aufklärung? (What is Enlightenment?), published five years before the 1789 French Revolution, during which the Condorcet report established public instruction in France.

Jean-Jacques Rousseau's Emile, or On Education (1762) was another obvious theoretical influence, as was the work of N. F. S. Grundtvig (1783–1872), at the origins of the Nordic movement of folk high schools. During the 19th century, popular education movements were involved, in particular in France, in the Republican and Socialist movements. A main component of the workers' movement, popular education was also strongly influenced by positivist, materialist and secularist (laïcité), if not anti-clerical, ideas.

Popular education may be defined as an educational technique designed to raise the consciousness of its participants and allow them to become more aware of how an individual's personal experiences are connected to larger societal problems. Participants are empowered to act to effect change on the problems that affect them.

19th century

One of the roots of popular education was the Condorcet report during the 1789 French Revolution. These ideas became an important component of the Republican and Socialist movement. Following the split of the First International at the 1872 Hague Congress between the "anti-authoritarian socialists" (anarchists) and the Marxists, popular education remained an important part of the workers' movement, in particular in the anarcho-syndicalist movement, strong in France, Spain and Italy. It was one of the important themes treated during the 1907 International Anarchist Congress of Amsterdam.

France

During the Second Empire, Jean Macé founded the Ligue de l'enseignement (Teaching League) in 1866; during the Lille Congress in 1885, Macé reaffirmed the masonic inspiration of this league devoted to popular instruction. Following the 1872 Hague Congress and the split between Marxists and anarchists, Fernand Pelloutier set up in France various Bourses du travail centres, where workers gathered and discussed politics and sciences.

The Jules Ferry laws in the 1880s, establishing free, laic (non-religious), mandatory and public education, were one of the founding stones of the Third Republic (1871–1940), set up in the aftermath of the 1870 Franco-Prussian War and the Paris Commune.

Furthermore, most teachers supported Alfred Dreyfus against the conservatives during the Dreyfus Affair. Teachers were throughout one of the main supports of the Third Republic, so much so that it has been called the République des instituteurs ("Republic of Teachers"), while the teachers themselves were called, because of their Republican anti-clericalism, the hussards noirs de la République. One consequence of the Affair was that they set up free educational lectures on humanist topics for adults, in order to fight the spread of antisemitism, which was not limited to the far right but also affected the workers' movement.

Paul Robin's work at the Prévost orphanage of Cempuis was the model for Francisco Ferrer's Escuela Moderna in Spain. Robin taught atheism and internationalism, and broke new ground with co-ed schooling, and teaching orphans with the same respect given to other children. He taught that the individual should develop in harmony with the world, on the physical, moral, and intellectual planes.

Scandinavia

In Denmark, the concept of folk high school was pioneered in 1844 by N. F. S. Grundtvig. By 1870, Denmark had 50 of these institutions. The first in Sweden, Folkhögskolan Hvilan, was established in 1868 outside of Lund.

In 1882, liberal and socialist students at Uppsala University in Sweden founded the association Verdandi for popular education. Between 1888 and 1954 it published 531 educational booklets on various topics (Verdandis småskrifter).

Some Swedish proponents of folkbildning have adopted the anglicisation "folkbuilding".

A Swedish bibliography on popular education with 25,000 references to books and articles between 1850 and 1950 is integrated in the Libris catalog of the Royal Library.

20th century

Popular education continued to be an important field of socialist politics, reemerging in particular during the Popular Front in 1936–38, while autogestion (self-management), a main tenet of the anarcho-syndicalist movement, became a popular slogan following the May '68 revolt.

Austria

During the Red Vienna period (1919–34) the Viennese Volkshochschule played an important role in providing popular education attracting significant levels of participation from both factory and office workers. They also attracted significant participation from prominent people associated with the Vienna Circle: Otto Neurath, Edgar Zilsel, Friedrich Waismann and Viktor Kraft.

The Escuela Moderna (1901–1907)

The Escuela Moderna (Modern School) was founded in 1901 in Barcelona by the free-thinker Francesc Ferrer i Guàrdia, and became a leading inspiration for many movements. Opposed to the dogmas of conventional education, Ferrer set up a system based on "reason, science, and observation". The school's stated goal was to "educate the working class in a rational, secular and non-coercive setting". In practice, high tuition fees restricted attendance to wealthier middle-class students. It was privately hoped that when the time was ripe for revolutionary action, these students would be motivated to lead the working classes. It closed in 1906. The Escuela Moderna, and Ferrer's ideas generally, formed the inspiration for a series of Modern Schools in the United States, Cuba, South America and London. The first of these was started in New York City in 1911. It also inspired the Italian newspaper Università popolare, founded in 1901.

France

List of lectures, Université populaire – town of Villeurbanne – 1936.

Following the 1981 presidential election that brought to power the Socialist Party (PS) candidate François Mitterrand, his Minister of Education, Alain Savary, supported Jean Lévi's initiative to create a public high school delivering the baccalauréat but organized on the principles of autogestion (self-management): this high school took the name of Lycée autogéré de Paris (LAP). The LAP explicitly modelled itself after the Oslo Experimental High School, opened in 1967 in Norway, as well as the Saint-Nazaire Experimental High School, opened six months before the LAP, and the secondary school Vitruve (opened in 1962 in the 20th arrondissement of Paris, still active). Theoretical references include Célestin Freinet and his comrades from the I.C.E.M., Raymond Fonvieille, Fernand Oury, and other theoreticians of "institutional pedagogy", as well as figures from the institutional analysis movement, in particular René Lourau, and members of the institutional psychotherapy movement, which was a main component in the 1970s of the anti-psychiatry movement (of which Félix Guattari was an important member). Since 2005, the LAP has maintained contact with self-managed firms in the REPAS network (Réseau d'échanges de pratiques alternatives et solidaires, "Network of Exchange of Solidarity and Alternative Practices").

A second generation of such folk high schools, meant to educate the people and the masses (mainly workers), spread through society just before the French Front populaire experience, as a reaction among teachers and intellectuals to the February 6, 1934 riots organized by far-right leagues. Free-thinking issues such as workers' self-management were thought through and taught during that time, since the majority of attendants were proletarians interested in politics. Hence, in some cities around the country, some received the name Université prolétarienne (Proletarian University) instead of Université populaire (Popular University). The reactionary Vichy regime put an end to such projects during World War II. The second generation continued in the post-war period, but topical lectures became more practical and focused on daily-life matters. Nowadays, the largest remnant is located in the Bas-Rhin and Haut-Rhin départements.

Following World War II, popular teaching attempts were initiated mainly by the anarchist movement. Already in 1943, Joffre Dumazedier, Benigno Cacérès, Paul Lengrand, Joseph Rovan and others founded the Peuple et Culture (People and Culture) network, aimed at democratization of culture. Joffre Dumazedier conceptualized, at the Liberation, the concept of "cultural development" to oppose the concept of "economic development", thus foreshadowing the current Human Development Index. Historian Jean Maitron, for example, was director of the Apremont school in Vendée from 1950 to 1955.

Such popular education was also a major feature of May '68 and of the following decade, leading in particular to the establishment of the University of Paris VIII: Vincennes—Saint-Denis in 1969. The Vincennes University (now located in Saint-Denis) was first an "Experimental University Center," with an interest in reshaping relations between students and teachers (the so-called "mandarins", in reference to the bureaucrats of Imperial China, for their authority and classic, Third Republic pedagogy) as well as between the university itself and society. Thus, Vincennes was largely open to those who did not have their baccalauréat diploma, as well as to foreigners. Its courses were focused on Freudo-Marxism, psychoanalysis, Marxist theory, cinema, theater, urbanism and artificial intelligence. Famous intellectuals such as Gilles Deleuze, Michel Foucault and Jacques Lacan held seminars there, in classrooms so full that no seats could be found. The audience was very heterogeneous: musicians such as Richard Pinhas attended Deleuze's courses, and after writing Anti-Oedipus (1972) with Félix Guattari, Deleuze used to say that non-specialists had best understood their work. Furthermore, Vincennes had no amphitheatres, which were seen as emblematic of the mandarin teacher facing and dominating several hundred students silently taking notes. It also enforced strict equality between professors and teaching assistants. The student revolt continued throughout the 1970s in both Vincennes and the University of Paris X: Nanterre, created in 1964. In 1980, the Minister of Education Alice Saunier-Seité imposed the transfer of Vincennes' university to Saint-Denis. Although education was normalized in the 1980s, during the Mitterrand era, these universities have retained a less traditional outlook than the classic Sorbonne, where courses tend to be more conservative and the sociological composition more upper-middle class.

Another attempt at popular education, specifically targeted at philosophy (France being one of the rare countries where this discipline is taught in terminale, the last year of high school, which culminates in the baccalauréat degree), was the creation in 1983 of the open university named Collège international de philosophie (International Philosophy College, or Ciph) by Jacques Derrida, François Châtelet, Jean-Pierre Faye and Dominique Lecourt, in an attempt to rethink the teaching of philosophy in France and to liberate it from any institutional authority (most of all from the university). Like the ancient Collège de France, created by Francis I, it is free and open to everyone. The Ciph was first directed by Derrida, then by Philippe Lacoue-Labarthe, and has had as teaching members Giorgio Agamben, Alain Badiou, Sidi Mohamed Barkat, Geoffrey Bennington, François Châtelet, José Gil, Olivier LeCour Grandmaison, Antonio Negri, and others. The Ciph is still active.

In 2002, the philosopher Michel Onfray initiated the Université populaire de Caen in his hometown, starting a long-running seminar dealing with hedonistic philosophy from ancient times to the events of May '68, planned to run for at least ten years. The seminar keeps a free-thinking spirit, since attendees are invited to rethink the history of ideas so as to set aside Christian influence. Despite sharing the name Université populaire, it is not linked to the European federation of associations inherited from the second generation. In 2004, Onfray expanded the experience to other cities such as Arras, Lyon, Narbonne, Avignon, and Mons (in Belgium), each with various lectures and teachers joining his idea. The Université populaire in Argentan is meant to deliver a culture of culinary taste to nonworking people, through lectures and demonstrations by famous chefs.

Latin America

Popular education is most commonly understood as an approach to education that emerged in Latin America during the 1930s, closely linked with Marxism and particularly liberation theology. Best known amongst popular educators is the Brazilian Paulo Freire. Freire, and consequently the popular education movement in Latin America, draws heavily upon the work of John Dewey and Antonio Gramsci. One of the features of popular education in Latin America has been participatory action research (PAR).

Africa

Anglophone colonies

Anne Hope and Sally Timmel were Christian development workers and educators who used popular education in their work in East Africa. They documented their work between 1973 and 1984 in four handbooks designed to aid practitioners titled "Training for Transformation."

North America

In the United States and Canada popular education influenced social justice education and critical pedagogy, though there are differences. At the same time, however, there are examples of popular education in the U.S. and Canada that grew up alongside and independently of popular education in Latin America.

United States

Scholar and community-worker Myles Horton, his Highlander Folk School (now Highlander Research and Education Center), and his work in Tennessee can be classified as popular education. Horton's studies at Union Theological Seminary in New York under Reinhold Niebuhr in the 1920s parallel the emergence of liberation theology in Latin America, and both are heavily influenced by socialism and a focus on the practical relationships between Christianity and everyday life. Niebuhr, however, was a staunch anti-communist, while liberation theology has a much closer relationship to the work of Karl Marx. Additionally, popular education has been linked to populism and to land-grant universities with their cooperative extension programs.

McCarthyism and the Red Scare were used to challenge, and in some cases close, labor schools and other institutions during the early part of the Cold War, as anticommunists attacked such schools for including communists. Nevertheless, the Highlander Folk School, for example, played a significant role in the civil rights movement, providing a space for leaders to consult and plan. The methods of popular education live on in radical education and community organizing circles, even though U.S. labor unions have largely abandoned the kind of labor education that directly tied workplace organizing and collective bargaining to class struggle.

Digital Nations

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Digital_Nations

Founded: December 2014
Type: International network
Membership:
  • Canada
  • Denmark
  • Estonia
  • Israel
  • Mexico
  • New Zealand
  • Portugal
  • Republic of Korea
  • United Kingdom
  • Uruguay
Website: https://leadingdigitalgovs.org/

The Digital Nations or DN (previously the Digital 5, Digital 7 and Digital 9) is a collaborative network of the world's leading digital governments with a common goal of harnessing digital technology to improve citizens' lives. Members share world-class digital practices, collaborate to solve common problems, identify improvements to digital services, and support and champion the group's growing digital economies. Through international cooperation, the Digital Nations aims to identify how digital government can provide the most benefit to citizens. The group embodies minilateral engagement, where small groups of states cooperate on specific topics with a global impact.

Members

Estonia, Israel, New Zealand, the Republic of Korea, and the United Kingdom are the founding members of the D5. In February 2018, Canada and Uruguay joined the group to form the D7. In November 2018, Mexico and Portugal joined to form the D9. Denmark joined as the tenth member of Digital Nations in November 2019.


Charter

The signing of the D7 Charter in Wellington, New Zealand in 2018

In 2014, the founding members signed a charter committing to share and improve upon the participant nations' practices in digital services and digital economies. Updated to reflect a growing membership, the DN Charter outlines a mutual commitment to digital development and leadership through nine core principles:

  1. User needs – the design of public services for the citizen
  2. Open standards – a commitment to credible royalty-free open standards to promote interoperability
  3. Open source – future government systems, tradecraft, standards and manuals are created as open source and are shareable between members
  4. Open markets – in government procurement, create true competition for companies regardless of size. Encourage and support a start-up culture and promote growth through open markets
  5. Open government (transparency) – be a member of the Open Government Partnership and use open licenses to produce and consume open data
  6. Connectivity – enable an online population through comprehensive and high-quality digital infrastructure
  7. Digital skills and confidence – support children, young people and adults in developing digital competencies and skills
  8. Assisted digital – a commitment to support all its citizens to access digital services
  9. Commitment to share and learn – all members commit to work together to help solve each other's issues wherever they can

The updated Charter was signed on 6 November 2019 in Montevideo, Uruguay.

Meetings

The Digital Nations meets twice per year to showcase accomplishments in countries’ digital landscapes and co-create the next best practices. Members participate in political-level Ministerial Summits, hosted by the rotating Chair nation, and working-level Officials Meetings.

D5 Ministerial Summit – London, United Kingdom, 9–10 December 2014
  Themes: teaching children to code; open markets; connectivity
  Outcomes: establishment of the Digital 5; adoption of the D5 Charter

D5 Ministerial Summit – Tallinn, Estonia, 19–20 November 2015
  Themes: e-procurement; digital trust; service design; IT talent

D5 Officials Meeting – Wellington, New Zealand, 7–8 June 2016

D5 Ministerial Summit – Busan, Korea, 10–11 November 2016
  Themes: innovation
  Outcomes: adoption of the Busan Declaration

D5 Officials Meeting – Jerusalem, Israel, 21–23 November 2017

D7 Ministerial Summit – Wellington, New Zealand, 21–22 February 2018
  Themes: digital rights
  Outcomes: Canada and Uruguay officially join; revision of the D7 Charter

D7 Officials Meeting – London, United Kingdom, 16–17 July 2018

D9 Ministerial Summit – Jerusalem, Israel, 21–22 November 2018
  Themes: AI and its public applications
  Outcomes: Mexico and Portugal officially join; revision of the D9 Charter

D9 Officials Meeting – Lisbon, Portugal, 3–4 July 2019

D9 Ministerial Summit – Montevideo, Uruguay, 4–7 November 2019
  Themes: holistic approach to data in government
  Outcomes: Denmark officially joins; support for the joint Data Declaration; revision of the DN Charter

DN Officials Meeting – Copenhagen, Denmark, June 2020 (virtual meeting)

DN Ministerial Summit – Ottawa, Canada, 2–4 November 2020 (virtual meeting)
  Themes: resilient and responsive service
  Outcomes: adoption of the updated DN Charter

D5 London 2014

The first event of the D5 was held in London on 9 and 10 December 2014, with delegates from the five founding nations attending and the United States present as an observer; the event was hosted by the UK's Cabinet Office minister Francis Maude. The UK's Culture, Communications and Creative Industries Minister Ed Vaizey and the government's chief technology officer Liam Maxwell were also present.

Themes

The 2014 summit had three themes: Teaching children to code, open markets, and connectivity.

Teaching children to code

By teaching children to code, the D5 intends to train the newest generation – the "technology generation" – to take an active role in creating IT rather than simply consuming it. Discussion points included whether changing the curriculum alone is enough to achieve this goal, how to give teachers the skills to teach and inspire children to code, how to connect industry and education so that such a change can be achieved, and how to ensure gender balance and encourage girls to take on tech roles.

The D5's participating countries have already made advances toward this goal. In the UK, England became the first country in the world to mandate that coding be taught to all pupils aged 5 to 16; in Estonia, primary schools have been teaching students to code since the 1990s; New Zealand has introduced a set of Digital Technology Guidelines that allow secondary schools to teach the subject coherently, and has also invested in new graduate ICT training schools to transition tertiary students into the workforce; Israel has "the most rigorous computer science high school programme in the world" thanks to a major review of school computing in the 1990s; South Korea teaches some computer science in school and also offers an optional online course for those who are interested.

Open markets

The focus of open markets is to open bidding on government IT contracts to small and medium-sized enterprises (SMEs) through the use of digital marketplaces such as the UK Government's G-cloud. The benefit of this to the government is to reduce costs by contracting out to the company that can provide the best value for money spent. The reduced barriers provided by an open market give SMEs who may not have been previously considered for a government contract, or who have never bid on one before, a fairer and more seamless opportunity to do so. Moving away from large outsourcers requires right sizing, which in this case can be achieved by buying parts of contracts from several smaller suppliers rather than buying one large contract from a single supplier, using agile delivery, buying cloud services, and building in-house engineering and operations capability.

Like the UK with G-Cloud, New Zealand is building a government cloud programme to ease the process of government buying from SMEs, and it is committed to using 'as a service' products to open up the market. South Korea has already built an e-procurement system that gives SMEs the opportunity to win government contracts. Since its inception, it has saved $8 billion annually and cut paper documents by 7.8 million pages per year; bidding time has been reduced from thirty hours to two.

Connectivity

With an increasing number of internet-connected devices in each household, the D5 intends to look at what type of infrastructure is needed to maintain and expand connectivity, as well as how the members can share each other's experiences and develop standards together. Citing a Cisco figure, the D5 expects over 50 billion internet-connected devices to be in use around the world by 2020, with potentially dozens of connected devices in every household.

To meet these needs, the UK government's focus will be on machine-to-machine technology, the Internet of Things and 5G mobile networks. In March 2014, it announced a £45 million investment in the Internet of Things. It has set up smart-city demonstrators in Glasgow, London, Bristol and Peterborough, and its 5G Innovation Centre is the world's first dedicated 5G testbed centre. In a speech during the summit, Cabinet Minister Francis Maude announced that the UK intends to have 97% of all citizen interactions with the state online by the end of the next parliament. Estonia has X-Road, a secure, platform-independent, Internet-based data exchange layer that provides transparent digital services at minimal cost. Through a public-private partnership, New Zealand is upgrading its internet infrastructure to fibre-optic cables. Korea, too, has invested significantly in the Internet of Things.

Events

There were a number of events and presentations held throughout the city. The Duke of York hosted an event for the delegates at Buckingham Palace, where 100 UK digital startups showcased their products to attendees. Presenters included Crowd Emotion, Code Kingdoms, Therapy Box, Yoyo, Skyscape, Kano, and Relative Insight. Another event highlighted the D5's intention of teaching programming to children of young ages by having the BBC lead a group of 11-year-olds through a coding session in which they utilised a Doctor Who themed game to gain a basic understanding of the practice of computer programming.

Future

The group expanded rapidly from five to ten members, with other countries signalling their interest in joining the Digital Nations. In the interest of strengthening capacity of the DN network, the Steering Committee agreed to establish a Secretariat to work on behalf of all DN countries in support of the group's key priorities. The DN Secretariat was introduced at the 5th Ministerial Summit in Israel in November 2018.

Open data

From Wikipedia, the free encyclopedia
Linked open data cloud in August 2014
Clear labelling of the licensing terms is a key component of open data, and icons like the one pictured here are being used for that purpose.

Open data is data that is openly accessible, exploitable, editable and shareable by anyone for any purpose. To qualify as open, the data must be released under an open license.

The goals of the open data movement are similar to those of other "open(-source)" movements such as open-source software, open-source hardware, open content, open specifications, open education, open educational resources, open government, open knowledge, open access, open science, and the open web. The growth of the open data movement is paralleled by a rise in intellectual property rights. The philosophy behind open data has been long established (for example in the Mertonian tradition of science), but the term "open data" itself is recent, gaining popularity with the rise of the Internet and World Wide Web and, especially, with the launch of open-data government initiatives Data.gov, Data.gov.uk and Data.gov.in.

Open data can also be linked data, in which case it is referred to as linked open data.
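Linked open data is typically expressed as subject–predicate–object triples, as in RDF, with URIs as identifiers so that datasets can reference one another. The following is a minimal sketch in plain Python, using tuples rather than a real RDF library; all URIs here are invented for illustration:

```python
# Linked open data models facts as (subject, predicate, object) triples.
# Identifiers are URIs, so one dataset can link to entities in another.
# The URIs below are illustrative, not real vocabulary terms.
triples = [
    ("http://example.org/city/Bologna", "http://example.org/prop/country",
     "http://example.org/country/Italy"),
    ("http://example.org/city/Bologna", "http://example.org/prop/population",
     "390000"),
    ("http://example.org/country/Italy", "http://example.org/prop/capital",
     "http://example.org/city/Rome"),
]

def objects(subject, predicate):
    """Return all objects linked from `subject` via `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Because subjects and objects share the same URI space, queries can be
# chained across triples -- the "linked" part of linked open data.
country = objects("http://example.org/city/Bologna",
                  "http://example.org/prop/country")[0]
print(objects(country, "http://example.org/prop/capital"))
```

In a real deployment the triples would be published in a standard serialization such as Turtle or JSON-LD and queried with SPARQL; the tuple list above only illustrates the underlying graph structure.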

One of the most important forms of open data is open government data (OGD), which is a form of open data created by ruling government institutions. Open government data's importance is born from it being a part of citizens' everyday lives, down to the most routine/mundane tasks that are seemingly far removed from government.

The abbreviation FAIR/O data is sometimes used to indicate that the dataset or database in question complies with the principles of FAIR data and carries an explicit data‑capable open license.

Overview

The concept of open data is not new, but a formalized definition is relatively new. Open data as a phenomenon denotes that governmental data should be available to anyone, with the possibility of redistribution in any form without any copyright restriction. A further definition is the Open Definition, which can be summarized as "a piece of data is open if anyone is free to use, reuse, and redistribute it – subject only, at most, to the requirement to attribute and/or share-alike." Other definitions, including the Open Data Institute's "open data is data that anyone can access, use or share," have an accessible short form but refer back to the formal definition. Open data may include non-textual material such as maps, genomes, connectomes, chemical compounds, mathematical and scientific formulae, medical data and practice, bioscience and biodiversity.

A major barrier to the open data movement is the commercial value of data. Access to, or re-use of, data is often controlled by public or private organizations. Control may be through access restrictions, licenses, copyright, patents and charges for access or re-use. Advocates of open data argue that these restrictions detract from the common good and that data should be available without restrictions or fees.

Creators of data often do not consider the need to state the conditions of ownership, licensing and re-use, instead presuming that not asserting copyright places the data in the public domain. For example, many scientists do not consider the data published with their work to be theirs to control, and consider the act of publication in a journal to be an implicit release of the data into the commons. However, the lack of a license makes it difficult to determine the status of a data set and may restrict the use of data offered in an "open" spirit. Because of this uncertainty it is possible for public or private organizations to aggregate said data, claim that it is protected by copyright, and then resell it.

Major sources

The State of Open Data, a 2019 book from African Minds

Open data can come from any source. This section lists some of the fields that publish (or at least discuss publishing) a large amount of open data.

In science

The concept of open access to scientific data was established with the formation of the World Data Center system, in preparation for the International Geophysical Year of 1957–1958. The International Council of Scientific Unions (now the International Council for Science) oversees several World Data Centres with the mission to minimize the risk of data loss and to maximize data accessibility.

While the open-science-data movement long predates the Internet, the availability of fast, readily available networking has significantly changed the context of Open science data, as publishing or obtaining data has become much less expensive and time-consuming.

The Human Genome Project was a major initiative that exemplified the power of open data. It was built upon the so-called Bermuda Principles, stipulating that: "All human genomic sequence information … should be freely available and in the public domain in order to encourage research and development and to maximize its benefit to society". More recent initiatives such as the Structural Genomics Consortium have illustrated that the open data approach can be used productively within the context of industrial R&D.

In 2004, the Science Ministers of all nations of the Organisation for Economic Co-operation and Development (OECD), which includes most developed countries of the world, signed a declaration which states that all publicly funded archive data should be made publicly available. Following a request and an intense discussion with data-producing institutions in member states, the OECD published in 2007 the OECD Principles and Guidelines for Access to Research Data from Public Funding as a soft-law recommendation.

Examples of open data in science:

  • data.uni-muenster.de – Open data about scientific artifacts from the University of Muenster, Germany. Launched in 2011.
  • Dataverse Network Project – archival repository software promoting data sharing, persistent data citation, and reproducible research.
  • linkedscience.org/data – Open scientific datasets encoded as Linked Data. Launched in 2011, ended 2018.
  • systemanaturae.org – Open scientific datasets related to wildlife classified by animal species. Launched in 2015.

In government

There are a range of different arguments for government open data. Some advocates say that making government information available to the public as machine readable open data can facilitate government transparency, accountability and public participation. "Open data can be a powerful force for public accountability—it can make existing information easier to analyze, process, and combine than ever before, allowing a new level of public scrutiny." Governments that enable public viewing of data can help citizens engage within the governmental sectors and "add value to that data." Open data experts have nuanced the impact that opening government data may have on government transparency and accountability. In a widely cited paper, scholars David Robinson and Harlan Yu contend that governments may project a veneer of transparency by publishing machine-readable data that does not actually make government more transparent or accountable. Drawing from earlier studies on transparency and anticorruption, World Bank political scientist Tiago C. Peixoto extended Yu and Robinson's argument by highlighting a minimal chain of events necessary for open data to lead to accountability:

  1. relevant data is disclosed;
  2. the data is widely disseminated and understood by the public;
  3. the public reacts to the content of the data; and
  4. public officials either respond to the public's reaction or are sanctioned by the public through institutional means.

Some make the case that opening up official information can support technological innovation and economic growth by enabling third parties to develop new kinds of digital applications and services.

Several national governments have created websites to distribute a portion of the data they collect. At the municipal level, open data is often pursued as a collaborative project to create and organize a culture of open government data.

Additionally, other levels of government have established open data websites, and many government entities in Canada pursue open data. Data.gov lists the sites of a total of 40 US states and 46 US cities and counties that provide open data, e.g., the states of Maryland and California and New York City.

At the international level, the United Nations has an open data website that publishes statistical data from member states and UN agencies, and the World Bank published a range of statistical data relating to developing countries. The European Commission has created two portals for the European Union: the EU Open Data Portal which gives access to open data from the EU institutions, agencies and other bodies and the European Data Portal that provides datasets from local, regional and national public bodies across Europe. The two portals were consolidated to data.europa.eu on April 21, 2021.
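Portals such as data.europa.eu typically publish datasets in machine-readable formats like CSV or JSON, which is what makes the data easy to analyze, process, and combine. A minimal sketch of consuming such a dataset with the Python standard library follows; the table is an inlined sample standing in for a downloaded file, and its column names and figures are invented for illustration:

```python
import csv
import io

# A small sample standing in for a CSV file downloaded from an open data
# portal (in practice it would be fetched with urllib.request.urlopen).
# Column names and figures are invented for illustration.
sample_csv = """country,year,broadband_households_pct
Estonia,2019,83.5
Portugal,2019,78.0
Denmark,2019,92.3
"""

# csv.DictReader turns each row into a dict keyed by the header line.
rows = list(csv.DictReader(io.StringIO(sample_csv)))

# Machine-readable open data can be re-analyzed freely: here, the mean
# broadband coverage across the listed countries.
avg = sum(float(r["broadband_households_pct"]) for r in rows) / len(rows)
print(f"{avg:.1f}")
```

Because the format is standardized, the same few lines work regardless of which portal published the file, which is the practical payoff of machine-readable open government data.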

Italy is the first country to release standard processes and guidelines under a Creative Commons license for widespread use in the public administration. The open model, called the Open Data Management Cycle, was adopted in several regions such as Veneto and Umbria. Major cities like Reggio Calabria and Genoa have also adopted this model.

In October 2015, the Open Government Partnership launched the International Open Data Charter, a set of principles and best practices for the release of governmental open data formally adopted by seventeen governments of countries, states and cities during the OGP Global Summit in Mexico.

In July 2024, the OECD adopted Creative Commons CC-BY-4.0 licensing for its published data and reports.

In non-profit organizations

Many non-profit organizations offer open access to their data, as long as doing so does not undermine the privacy rights of their users, members or third parties. In comparison to for-profit corporations, they do not seek to monetize their data. OpenNWT launched a website offering open election data. CIAT offers open data to anybody willing to conduct big data analytics in order to enhance the benefit of international agricultural research. DBLP, owned by the non-profit organization Dagstuhl, offers its database of scientific publications in computer science as open data.

Hospitality exchange services, including Bewelcome, Warm Showers, and CouchSurfing (before it became for-profit) have offered scientists access to their anonymized data for analysis, public research, and publication.

Policies and strategies

At a small level, a business or research organization's policies and strategies towards open data will vary, sometimes greatly. One common strategy employed is the use of a data commons. A data commons is an interoperable software and hardware platform that aggregates (or collocates) data, data infrastructure, and data-producing and data-managing applications in order to better allow a community of users to manage, analyze, and share their data with others over both short- and long-term timelines. Ideally, this interoperable cyberinfrastructure should be robust enough "to facilitate transitions between stages in the life cycle of a collection" of data and information resources while still being driven by common data models and workspace tools enabling and supporting robust data analysis. The policies and strategies underlying a data commons will ideally involve numerous stakeholders, including the data commons service provider, data contributors, and data users.

Grossman et al. suggest six major considerations for a data commons strategy that better enables open data in businesses and research organizations. Such a strategy should address the need for:

  • permanent, persistent digital IDs, which enable access controls for datasets;
  • permanent, discoverable metadata associated with each digital ID;
  • application programming interface (API)-based access, tied to an authentication and authorization service;
  • data portability;
  • data "peering," without access, egress, and ingress charges; and
  • a rationed approach to users computing data over the data commons.
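The first three considerations above (persistent digital IDs, discoverable metadata, and API access tied to authorization) can be sketched as a minimal in-memory registry. Everything here, including the class names and the sample DOI, is a hypothetical illustration rather than any particular data-commons platform's API:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """One entry in a hypothetical data commons."""
    digital_id: str    # permanent, persistent identifier
    metadata: dict     # permanent, discoverable metadata
    allowed_tokens: set = field(default_factory=set)  # access control

class DataCommons:
    def __init__(self):
        self._registry = {}

    def register(self, record):
        # Persistent IDs must never be reassigned.
        if record.digital_id in self._registry:
            raise ValueError(f"ID already registered: {record.digital_id}")
        self._registry[record.digital_id] = record

    def describe(self, digital_id):
        # Metadata is discoverable without authentication.
        return self._registry[digital_id].metadata

    def fetch(self, digital_id, token):
        # Data access goes through an authorization check.
        record = self._registry[digital_id]
        if token not in record.allowed_tokens:
            raise PermissionError("token not authorized for this dataset")
        return f"contents of {digital_id}"

commons = DataCommons()
commons.register(DatasetRecord(
    digital_id="doi:10.9999/example.1",  # hypothetical DOI
    metadata={"title": "Example survey", "license": "CC-BY-4.0"},
    allowed_tokens={"token-abc"},
))
print(commons.describe("doi:10.9999/example.1")["license"])
```

The design choice worth noting is the asymmetry: metadata lookup is open to everyone, while data retrieval requires an authorized token, mirroring the distinction the considerations draw between discoverable metadata and controlled access to the datasets themselves.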

Beyond individual businesses and research centers, and at a more macro level, countries like Germany have launched their own official nationwide open data strategies, detailing how data management systems and data commons should be developed, used, and maintained for the greater public good.

Arguments for and against

Opening government data is only a waypoint on the road to improving education, improving government, and building tools to solve other real-world problems. While many arguments have been made categorically, the following discussion of arguments for and against open data highlights that these arguments often depend highly on the type of data and its potential uses.

Arguments made on behalf of open data include the following:

  • "Data belongs to the human race". Typical examples are genomes, data on organisms, medical science, environmental data following the Aarhus Convention.
  • Public money was used to fund the work, and so it should be universally available.
  • It was created by or at a government institution (this is common in US National Laboratories and government agencies).
  • Facts cannot legally be copyrighted.
  • Sponsors of research do not get full value unless the resulting data are freely available.
  • Restrictions on data re-use create an anticommons.
  • Data are required for the smooth process of running communal human activities and are an important enabler of socio-economic development (health care, education, economic productivity, etc.).
  • In scientific research, the rate of discovery is accelerated by better access to data.
  • Making data open helps combat "data rot" and ensure that scientific research data are preserved over time.
  • Statistical literacy benefits from open data. Instructors can use locally relevant data sets to teach statistical concepts to their students.
  • Allowing open data in the scientific community is essential for increasing the rate of discoveries and recognizing significant patterns.

It is generally held that factual data cannot be copyrighted. Publishers frequently add copyright statements (often forbidding re-use) to scientific data accompanying publications. It may be unclear whether the factual data embedded in full text are part of the copyright.

While the human abstraction of facts from paper publications is normally accepted as legal there is often an implied restriction on the machine extraction by robots.

Unlike open access, where groups of publishers have stated their concerns, open data is normally challenged by individual institutions. Their arguments have been discussed less in public discourse and there are fewer quotes to rely on at this time.

Arguments against making all data available as open data include the following:

  • Government funding may not be used to duplicate or challenge the activities of the private sector (e.g. PubChem).
  • Governments have to be accountable for the efficient use of taxpayer's money: If public funds are used to aggregate the data and if the data will bring commercial (private) benefits to only a small number of users, the users should reimburse governments for the cost of providing the data.
  • Open data may lead to exploitation of, and rapid publication of results based on, data pertaining to developing countries by rich and well-equipped research institutes, without any further involvement and/or benefit to local communities (helicopter research); similarly, to the historical open access to tropical forests that has led to the misappropriation ("Global Pillage") of plant genetic resources from developing countries.
  • The revenue earned by publishing data can be used to cover the costs of generating and/or disseminating the data, so that the dissemination can continue indefinitely.
  • The revenue earned by publishing data permits non-profit organizations to fund other activities (e.g. learned society publishing supports the society).
  • The government gives specific legitimacy for certain organizations to recover costs (NIST in US, Ordnance Survey in UK).
  • Privacy concerns may require that access to data is limited to specific users or to sub-sets of the data.
  • Collecting, 'cleaning', managing and disseminating data are typically labour- and/or cost-intensive processes – whoever provides these services should receive fair remuneration for providing those services.
  • Sponsors do not get full value unless their data is used appropriately – sometimes this requires quality management, dissemination and branding efforts that can best be achieved by charging fees to users.
  • Often, targeted end-users cannot use the data without additional processing (analysis, apps etc.) – if anyone has access to the data, none may have an incentive to invest in the processing required to make data useful (typical examples include biological, medical, and environmental data).
  • There is no control over the secondary use (aggregation) of open data.

The paper entitled "Optimization of Soft Mobility Localization with Sustainable Policies and Open Data" argues that open data is a valuable tool for improving the sustainability and equity of soft mobility in cities. The author argues that open data can be used to identify the needs of different areas of a city, develop algorithms that are fair and equitable, and justify the installation of soft mobility resources.

Relation to other open activities

The goals of the Open Data movement are similar to those of other "Open" movements.

  • Open access is concerned with making scholarly publications freely available on the internet. In some cases, these articles include open datasets as well.
  • Open specifications are documents describing file types or protocols, where the documents are openly licensed. Such specifications are primarily meant to improve interoperability between different software handling the same file types or protocols, though monopolists forced by law into open specifications may still make interoperability difficult in practice.
  • Open content is concerned with making resources aimed at a human audience (such as prose, photos, or videos) freely available.
  • Open knowledge. Open Knowledge International argues for openness in a range of issues including, but not limited to, those of open data. Its definition covers (a) knowledge, whether scientific, historical, geographic or otherwise; (b) content such as music, films and books; and (c) government and other administrative information. Open data is included within the scope of the Open Knowledge Definition, which is alluded to in Science Commons' Protocol for Implementing Open Access Data.
  • Open notebook science refers to the application of the Open Data concept to as much of the scientific process as possible, including failed experiments and raw experimental data.
  • Open-source software is concerned with the open-source licenses under which computer programs can be distributed and is not normally concerned primarily with data.
  • Open educational resources are freely accessible, openly licensed documents and media that are useful for teaching, learning, and assessing as well as for research purposes.
  • Open research/open science/open science data (linked open science) means an approach to open and interconnect scientific assets like data, methods and tools with linked data techniques to enable transparent, reproducible and interdisciplinary research.
  • Open-GLAM (Galleries, Library, Archives, and Museums) is an initiative and network that supports exchange and collaboration between cultural institutions that support open access to their digitalized collections. The GLAM-Wiki Initiative helps cultural institutions share their openly licensed resources with the world through collaborative projects with experienced Wikipedia editors. Open Heritage Data is associated with Open GLAM, as openly licensed data in the heritage sector is now frequently used in research, publishing, and programming, particularly in the Digital Humanities.

Open Data as commons

Ideas and definitions

Formally, both the definition of open data and that of the commons revolve around the concept of shared resources with a low barrier to access. Substantively, the digital commons includes open data, in that it encompasses resources maintained online, such as data. Looking at the operational principles of open data, one can see the overlap between open data and the (digital) commons in practice. Principles of open data sometimes differ depending on the type of data under scrutiny, but they overlap considerably, and their key rationale is the absence of barriers to the re-use of data and datasets. Regardless of their origin, principles across types of open data point to the key elements of the definition of the commons: for instance, accessibility, re-use, findability, and non-proprietary status. Additionally, though to a lesser extent, the threats and opportunities associated with open data and the commons are similar: both revolve around the risks and benefits of the (uncontrolled) use of common resources by a large variety of actors.

The System

Both the commons and Open Data can be defined by the features of the resources that fall under these concepts, but they can also be defined by the characteristics of the systems their advocates push for. Governance is a focus for both Open Data and commons scholars. A key element that marks out the peculiarities of the commons and of Open Data is their difference from (and perhaps opposition to) the dominant market logics shaped by capitalism. It is perhaps this feature that surfaces in the recent resurgence of the concept of the commons in relation to a more social view of digital technologies, in the specific forms of digital and, especially, data commons.

Real-life case

The application of open data for societal good has been demonstrated in academic research. The paper "Optimization of Soft Mobility Localization with Sustainable Policies and Open Data" uses open data in two ways. First, it uses open data to identify the needs of different areas of a city: for example, data on population density, traffic congestion, and air quality can show where soft mobility resources, such as bike racks and charging stations for electric vehicles, are most needed. Second, it uses open data to develop algorithms that are fair and equitable: for example, demographic data can help ensure that soft mobility resources are distributed in a way that is accessible to everyone, regardless of age, disability, or gender. The paper also discusses the challenges of using open data for soft mobility optimization: open data is often incomplete or inaccurate, and it can be difficult to integrate open data from different sources. Despite these challenges, the paper argues that open data is a valuable tool for improving the sustainability and equity of soft mobility in cities.
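The first use of open data described above, identifying where soft mobility resources are most needed, can be sketched as a simple scoring exercise over open indicators. The district names, indicator values, and weights below are invented for illustration and are not taken from the paper.

```python
# Hedged sketch: ranking city districts for soft-mobility need from open
# indicators. All figures and weights here are illustrative assumptions.

districts = {
    # district: (population density per km^2, congestion index 0-1, PM2.5 ug/m^3)
    "centre":  (9500, 0.80, 28.0),
    "harbour": (4200, 0.55, 35.0),
    "suburbs": (1800, 0.30, 14.0),
}

def need_score(density: float, congestion: float, pm25: float) -> float:
    """Weighted sum after rough normalisation; a higher score suggests more
    need for bike racks / EV charging. The weights are assumptions."""
    return 0.4 * (density / 10_000) + 0.3 * congestion + 0.3 * (pm25 / 50)

ranked = sorted(districts, key=lambda d: need_score(*districts[d]), reverse=True)
print(ranked)  # districts ordered from most to least estimated need
```

In practice the indicators would come from open data portals (census, traffic sensors, air-quality monitors), and the fairness concern the paper raises would translate into additional demographic terms or constraints in the scoring.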

An example of how the relationship between Open Data and the commons, and their governance, can potentially disrupt the market logic that otherwise dominates big data is a project conducted by Human Ecosystem Relazioni in Bologna, Italy. See: https://www.he-r.it/wp-content/uploads/2017/01/HUB-report-impaginato_v1_small.pdf.

The project aimed to identify and map the online social relations surrounding "collaboration" in Bologna. Data were collected from social networks and from online platforms for citizen collaboration, and then analyzed for content, meaning, location, timeframe, and other variables; the online social relations for collaboration were analyzed using network theory. The resulting dataset has been made available online as Open Data (aggregated and anonymized); nonetheless, individuals can reclaim all their data. This was done with the idea of turning the data into a commons. The project exemplifies the relationship between Open Data and the commons, and how they can disrupt the market logic driving big data use, in two ways. First, it shows how such projects, following the rationale of Open Data, can trigger the creation of effective data commons; the project itself offered social network platform users different types of support for having content removed. Second, opening data about online social network interactions has the potential to significantly reduce the monopolistic power that social network platforms hold over those data.
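The "aggregated and anonymized" release step the Bologna project describes can be sketched as follows: raw interaction records are stripped of user identifiers and published only as counts, with small groups suppressed. The record layout, topics, and threshold are assumptions for illustration, not the project's actual pipeline.

```python
# Illustrative sketch: aggregating and anonymising interaction records before
# publishing them as Open Data. The record layout and threshold are assumptions.
from collections import Counter

raw_interactions = [
    # (user_id, topic, neighbourhood) -- user_id never leaves this script
    ("u1", "urban gardening", "Navile"),
    ("u2", "urban gardening", "Navile"),
    ("u3", "bike sharing",    "Porto"),
    ("u1", "bike sharing",    "Porto"),
]

def aggregate_anonymise(records, k=2):
    """Drop identifiers, count (topic, area) pairs, and suppress groups
    smaller than k (a simple k-anonymity-style threshold)."""
    counts = Counter((topic, area) for _uid, topic, area in records)
    return {key: n for key, n in counts.items() if n >= k}

open_dataset = aggregate_anonymise(raw_interactions)
print(open_dataset)  # only per-topic, per-area counts remain; no user IDs
```

Releasing only thresholded aggregates is one common way to reconcile the openness of a data commons with the individual right, which the project preserved, to reclaim or remove one's own data.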

Funders' mandates

Several funding bodies that mandate Open Access also mandate Open Data. A good expression of requirements (truncated in places) is given by the Canadian Institutes of Health Research (CIHR):

  • to deposit bioinformatics, atomic and molecular coordinate data, and experimental data into the appropriate public database immediately upon publication of research results.
  • to retain original data sets for at least five years after the end of the grant. This applies to all data, whether published or not.
Other bodies promoting the deposition of data and full text include the Wellcome Trust. An academic paper published in 2013 advocated that Horizon 2020 (the EU's science funding mechanism) should mandate that funded projects hand in their databases as "deliverables" at the end of the project, so that they can be checked for third-party usability and then shared.

Internet research

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Internet_...