
Monday, July 30, 2018

Ecocentrism

From Wikipedia, the free encyclopedia

Ecocentrism (/ˌɛkoʊˈsɛntrɪzəm/; from Greek: οἶκος oikos, "house" and κέντρον kentron, "center") is a term used in ecological political philosophy to denote a nature-centered, as opposed to human-centered (i.e. anthropocentric), system of values. The justification for ecocentrism usually consists in an ontological belief and a subsequent ethical claim. The ontological belief denies that there are any existential divisions between human and non-human nature sufficient to claim that humans are either (a) the sole bearers of intrinsic value or (b) possessors of greater intrinsic value than non-human nature. The subsequent ethical claim is therefore for an equality of intrinsic value across human and non-human nature, or 'biospherical egalitarianism'.

Origin of term

The ecocentric ethic was conceived by Aldo Leopold[5] and recognizes that all species, including humans, are the product of a long evolutionary process and are inter-related in their life processes.[6] The writings of Aldo Leopold and his idea of the land ethic and good environmental management are a key element of this philosophy. Ecocentrism focuses on the biotic community as a whole and strives to maintain ecosystem composition and ecological processes.[7] The term also finds expression in the first principle of the deep ecology movement, as formulated by Arne Næss and George Sessions in 1984.[8] Anthropocentrism, which considers humans the center of the universe and the pinnacle of all creation, remains a difficult opponent for ecocentrism.[9]

Background

Environmental thought and the various branches of the environmental movement are often classified into two intellectual camps: those that are considered anthropocentric, or "human-centred," in orientation and those considered biocentric, or "life-centred". This division has been described in other terminology as "shallow" ecology versus "deep" ecology and as "technocentrism" versus "ecocentrism". Ecocentrism can be seen as one stream of thought within environmentalism, the political and ethical movement that seeks to protect and improve the quality of the natural environment through changes to environmentally harmful human activities by adopting environmentally benign forms of political, economic, and social organization and through a reassessment of humanity's relationship with nature. In various ways, environmentalism claims that non-human organisms and the natural environment as a whole deserve consideration when appraising the morality of political, economic, and social policies.[10]

Relationship to other similar philosophies

Anthropocentrism

Ecocentrism is taken by its proponents to constitute a radical challenge to long-standing and deeply rooted anthropocentric attitudes in Western culture, science, and politics. Anthropocentrism is alleged to leave the case for the protection of non-human nature subject to the demands of human utility, and thus never more than contingent on the demands of human welfare. An ecocentric ethic, by contrast, is believed to be necessary in order to develop a non-contingent basis for protecting the natural world. Critics of ecocentrism have argued that it opens the doors to an anti-humanist morality that risks sacrificing human well-being for the sake of an ill-defined ‘greater good’.[11] Deep ecologist Arne Næss has identified anthropocentrism as a root cause of the ecological crisis, human overpopulation, and the extinctions of many non-human species.[12] Others point to the gradual historical realization that humans are not the centre of all things: "A few hundred years ago, with some reluctance, Western people admitted that the planets, Sun and stars did not circle around their abode. In short, our thoughts and concepts though irreducibly anthropomorphic need not be anthropocentric."[13]

Industrocentrism

Industrocentrism is an ideology that goes hand in hand with today's industrial neoliberal capitalist agenda. It sees all things on earth as resources to be utilized by humans or to be commodified. This view is the opposite of anthropocentrism and ecocentrism. Because it focuses only on short-term economic gratification, it harms humans, nonhumans, and the environment in the long run.[14]

Technocentrism

Ecocentrism is also contrasted with technocentrism (meaning values centred on technology) as two opposing perspectives on attitudes towards human technology and its ability to affect, control and even protect the environment. Ecocentrics, including "deep green" ecologists, see themselves as being subject to nature, rather than in control of it. They lack faith in modern technology and the bureaucracy attached to it. Ecocentrics argue that the natural world should be respected for its processes and products, and that low-impact technology and self-reliance are more desirable than technological control of nature.[15] Technocentrics, including imperialists, have absolute faith in technology and industry and firmly believe that humans have control over nature. Although technocentrics may accept that environmental problems do exist, they do not see them as problems to be solved by a reduction in industry. Rather, environmental problems are seen as problems to be solved using science. Indeed, technocentrics believe that the way forward for developed and developing countries, and the solutions to our environmental problems today, lies in scientific and technological advancement.[15]

Biocentrism

The distinction between biocentrism and ecocentrism is ill-defined. Ecocentrism recognizes Earth's interactive living and non-living systems rather than just the Earth's organisms (biocentrism) as central in importance.[16] The term has been used by those advocating "left biocentrism", combining deep ecology with an "anti-industrial and anti-capitalist" position (David Orton et al.).

3D-printing biocompatible living bacteria

Applications include skin transplants and nanofilters that break down toxic substances
December 8, 2017
Original link:  http://www.kurzweilai.net/3d-printing-biocompatible-living-bacteria
3D-printing with an ink containing living bacteria (credit: Bara Krautz/bara@scienceanimated.com)

Researchers at ETH Zurich have developed a technique for 3D-printing biocompatible living bacteria for the first time — making it possible to produce high-purity cellulose for biomedical applications, as well as nanofilters that can break down toxic substances (in drinking water, for example) or help clean up disastrous oil spills.

The technique, called “Flink” (“functional living ink”), allows for printing mini biochemical factories with properties that vary based on which species of bacteria are used. Up to four different inks containing different species of bacteria at different concentrations can be printed in a single pass.

Schematics of the Flink 3D bacteria-printing process for creating two types of functional living materials. (Left and center) Bacteria are embedded in a biocompatible hydrogel (which provides the supporting structure). (Right) The inclusion of P. putida* or A. xylinum* bacteria in the ink yields 3D-printed materials capable of degrading environmental pollutants (top) or forming bacterial cellulose in situ for biomedical applications (bottom), respectively. (credit: Manuel Schaffner et al./Science Advances)

The technique was described Dec. 1, 2017 in the open-access journal Science Advances.

(Left) A. xylinum bacteria were used in printing a cellulose nanofibril network (scanning electron microscope image), which was deposited (Right) on a doll face, forming a cellulose-reinforced hydrogel that, after removal of all biological residues, could serve as a skin transplant. (credit: Manuel Schaffner et al./Science Advances)

“The in situ formation of reinforcing cellulose fibers within the hydrogel is particularly attractive for regions under mechanical tension, such as the elbow and knee, or when administered as a pouch onto organs to prevent fibrosis after surgical implants and transplantations,” the researchers note in the paper. “Cellulose films grown in complex geometries precisely match the topography of the site of interest, preventing the formation of wrinkles and entrapments of contaminants that could impair the healing process. We envision that long-term medical applications will benefit from the presented multimaterial 3D printing process by locally deploying bacteria where needed.”

 * Pseudomonas putida breaks down the toxic chemical phenol, which is produced on a grand scale in the chemical industry; Acetobacter xylinum secretes high-purity nanocellulose, which relieves pain, retains moisture and is stable, opening up potential applications in the treatment of burns.


Abstract of 3D printing of bacteria into functional complex materials

Despite recent advances to control the spatial composition and dynamic functionalities of bacteria embedded in materials, bacterial localization into complex three-dimensional (3D) geometries remains a major challenge. We demonstrate a 3D printing approach to create bacteria-derived functional materials by combining the natural diverse metabolism of bacteria with the shape design freedom of additive manufacturing. To achieve this, we embedded bacteria in a biocompatible and functionalized 3D printing ink and printed two types of “living materials” capable of degrading pollutants and of producing medically relevant bacterial cellulose. With this versatile bacteria-printing platform, complex materials displaying spatially specific compositions, geometry, and properties not accessed by standard technologies can be assembled from bottom up for new biotechnological and biomedical applications.

eHealth

From Wikipedia, the free encyclopedia


eHealth (also written e-health) is a relatively recent healthcare practice supported by electronic processes and communication, dating back to at least 1999. Usage of the term varies: a study in 2005 found 51 unique definitions. Some argue that it is interchangeable with health informatics, with a broad definition covering electronic/digital processes in health, while others use it in the narrower sense of healthcare practice using the Internet. It can also include health applications and links on mobile phones, referred to as mHealth or m-Health. Since about 2011, increasing recognition of the need for better cyber-security and regulation has created demand for specialized resources to develop eHealth solutions that can withstand growing cyber threats.

Types

The term can encompass a range of services or systems that are at the edge of medicine/healthcare and information technology, including:
  • Electronic health record: enabling the communication of patient data between different healthcare professionals (GPs, specialists etc.);
  • Computerized physician order entry: a means of requesting diagnostic tests and treatments electronically and receiving the results;
  • ePrescribing: access to prescribing options, printing prescriptions for patients, and sometimes electronic transmission of prescriptions from doctors to pharmacists;
  • Clinical decision support system: providing information electronically about protocols and standards for healthcare professionals to use in diagnosing and treating patients;[8]
  • Telemedicine: physical and psychological diagnosis and treatments at a distance, including telemonitoring of patients' functions;
  • Consumer health informatics: use of electronic resources on medical topics by healthy individuals or patients;
  • Health knowledge management: e.g. an overview of the latest medical journals, best practice guidelines or epidemiological tracking (examples include physician resources such as Medscape and MDLinx);
  • Virtual healthcare teams: consisting of healthcare professionals who collaborate and share information on patients through digital equipment (for transmural care);
  • mHealth or m-Health: the use of mobile devices for collecting aggregate and patient-level health data, providing healthcare information to practitioners, researchers, and patients, real-time monitoring of patient vitals, and direct provision of care (via mobile telemedicine);
  • Medical research using grids: powerful computing and data management capabilities to handle large amounts of heterogeneous data;[9]
  • Health informatics / healthcare information systems: often also referring to software solutions for appointment scheduling, patient data management, work schedule management and other administrative tasks surrounding health.

Contested definition

Several authors have noted the variable usage of the term, from being specific to the use of the Internet in healthcare to covering generally any use of computers in healthcare.[10] Various authors have considered the evolution of the term and its usage and how this maps to changes in health informatics and healthcare generally.[1][11][12] Oh et al., in a 2005 systematic review of the term's usage, offered a definition of eHealth as a set of technological themes in health today, more specifically based on commerce, activities, stakeholders, outcomes, locations, or perspectives.[2] One thing that all sources seem to agree on is that e-Health initiatives do not originate with the patient, though the patient may be a member of a patient organization that seeks to drive them, as in the e-Patient movement.

eHealth literacy

eHealth literacy is defined as “the ability to seek, find, understand and appraise health information from electronic sources and apply knowledge gained to addressing or solving a health problem”.[13][14] According to this definition, eHealth literacy encompasses six types of literacy: traditional (literacy and numeracy), information, media, health, computer, and scientific. Of these, media and computer literacies are unique to the Internet context, with eHealth media literacy being the awareness of media bias or perspective, the ability to discern both explicit and implicit meaning from media messages, and the ability to derive meaning from media messages. The literature includes other definitions of perceived media capability or efficacy, but these were not specific to health information on the Internet.[15] Having the composite skills of eHealth literacy allows health consumers to achieve positive outcomes from using the Internet for health purposes. eHealth literacy has the potential both to protect consumers from harm and to empower them to fully participate in informed health-related decision making.[14] People with high levels of eHealth literacy are also more aware of the risk of encountering unreliable information on the Internet.[16] On the other hand, the extension of digital resources to the health domain in the form of eHealth literacy can also create new gaps between health consumers.[15] eHealth literacy hinges not on mere access to technology, but rather on the skill to apply the accessed knowledge.[13]

Data exchange

One of the factors blocking e-Health tools from widespread acceptance is concern about privacy issues regarding patient records, most specifically the EPR (electronic patient record). The main concern has to do with the confidentiality of the data. However, there is also concern about non-confidential data. Each medical practice has its own jargon and diagnostic tools. To standardize the exchange of information, various coding schemes may be used in combination with international medical standards. Systems that deal with these transfers are often referred to as Health Information Exchange (HIE). Of the forms of e-Health already mentioned, there are roughly two types: front-end data exchange and back-end exchange.

Front-end exchange typically involves the patient, while back-end exchange does not. A common example of a rather simple front-end exchange is a patient taking a photo of a healing wound with a mobile phone and emailing it to the family doctor for follow-up. Such an action may avoid the cost of an expensive visit to the hospital.
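As a rough illustration of such a front-end exchange, the sketch below emails a wound photo using Python's standard library. The addresses, SMTP host, credentials, and file name are all hypothetical; a real service would add consent handling and stronger security than plain email.

```python
# A minimal sketch of a "front-end" exchange: a patient emails a wound
# photo to the family doctor. All names below are hypothetical; a real
# deployment would need consent, TLS, and ideally a secure patient portal.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "patient@example.org"        # hypothetical sender
msg["To"] = "family.doctor@example.org"    # hypothetical recipient
msg["Subject"] = "Healing wound, day 5 follow-up"
msg.set_content("Photo of the wound attached, as requested at the last visit.")

with open("wound_day5.jpg", "rb") as f:    # hypothetical photo file
    msg.add_attachment(f.read(), maintype="image", subtype="jpeg",
                       filename="wound_day5.jpg")

with smtplib.SMTP_SSL("smtp.example.org") as server:  # hypothetical server
    server.login("patient@example.org", "app-password")
    server.send_message(msg)
```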

A common example of a back-end exchange is when a patient on vacation visits a doctor who then may request access to the patient's health records, such as medicine prescriptions, x-ray photographs, or blood test results. Such an action may reveal allergies or other prior conditions that are relevant to the visit.

Thesaurus

Successful e-Health initiatives such as e-Diabetes have shown that for data exchange to be facilitated either at the front-end or the back-end, a common thesaurus of terms of reference is needed.[7][17] Various medical practices in chronic patient care (such as for diabetic patients) already have a well-defined set of terms and actions, which makes standard communication exchange easier, whether the exchange is initiated by the patient or the caregiver.

In general, explanatory diagnostic information (such as the standard ICD-10) may be exchanged insecurely, and private information (such as personal information from the patient) must be secured. E-health manages both flows of information, while ensuring the quality of the data exchange.
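The separation of these two flows can be pictured with a short sketch. The record layout and field names below are hypothetical rather than taken from any real HIE standard; the point is only that diagnostic codes and personal identifiers travel through different channels.

```python
# A minimal sketch of the two flows described above: diagnostic codes
# (e.g. ICD-10) travel in the openly exchangeable part, while personal
# identifiers go in a part that must be secured. The record and field
# names are hypothetical, not from any real HIE system.
record = {
    "patient_name": "J. Doe",          # private: must be secured
    "date_of_birth": "1970-01-01",     # private: must be secured
    "icd10_codes": ["E11.9", "I10"],   # explanatory: diabetes, hypertension
}

PRIVATE_FIELDS = {"patient_name", "date_of_birth"}

def split_flows(rec):
    """Partition a record into the shareable and the to-be-secured flow."""
    public = {k: v for k, v in rec.items() if k not in PRIVATE_FIELDS}
    private = {k: v for k, v in rec.items() if k in PRIVATE_FIELDS}
    return public, private

public_part, private_part = split_flows(record)
print("exchange openly:", public_part)   # diagnostic codes only
print("secure first:   ", private_part)  # e.g. TLS plus at-rest encryption
```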

Early adopters

Patients living with long-term conditions (also called chronic conditions) often acquire, over time, a high level of knowledge about the processes involved in their own care, and often develop a routine in coping with their condition. For these routine patients, front-end e-Health solutions tend to be relatively easy to implement.

E-mental health

E-mental health is frequently used to refer to internet-based interventions and support for mental health conditions.[18] However, it can also refer to the use of information and communication technologies more broadly, including social media, landline and mobile phones.[19] E-mental health services can include information, peer support services, computer- and internet-based programs, virtual applications and games, as well as real-time interaction with trained clinicians.[20] Programs can also be delivered using telephones and interactive voice response (IVR).[21]
Mental disorders include a range of conditions such as alcohol and drug use disorders, mood disorders such as depression, dementia and Alzheimer's disease, delusional disorders such as schizophrenia, and anxiety disorders.[22][page needed] The majority of e-mental health interventions have focused on the treatment of depression and anxiety.[20] There are, however, also programs for problems as diverse as smoking cessation,[23][needs update] gambling,[24] and post-disaster mental health.[25]

Advantages and disadvantages

E-mental health has a number of advantages such as being low cost, easily accessible and providing anonymity to users.[26] However, there are also a number of disadvantages such as concerns regarding treatment credibility, user privacy and confidentiality.[27] Online security involves the implementation of appropriate safeguards to protect user privacy and confidentiality. This includes appropriate collection and handling of user data, the protection of data from unauthorized access and modification and the safe storage of data.[28]

E-mental health has been gaining momentum in academic research as well as in practical arenas[29] in a wide variety of disciplines such as psychology, clinical social work, family and marriage therapy, and mental health counseling. Testifying to this momentum, the e-mental health movement has its own international organization, the International Society for Mental Health Online.[30]

Programs

There are at least five programs currently available to treat anxiety and depression. Several programs have been identified by the UK National Institute for Health and Care Excellence as cost-effective for use in primary care.[21] These include Fearfighter,[31] a text-based cognitive behavioral therapy (CBT) program to treat people with phobias, and Beating the Blues,[32] an interactive text, cartoon and video CBT program for anxiety and depression. Two programs have been supported for use in primary care by the Australian Government.[citation needed] The first is Anxiety Online,[33] a text-based program for the anxiety, depressive and eating disorders, and the second is THIS WAY UP,[34] a set of interactive text, cartoon and video programs for the anxiety and depressive disorders. Another is iFightDepression®,[35] a multilingual, free-to-use, web-based tool for self-management of less severe forms of depression, for use under the guidance of a GP or psychotherapist.

There are a number of online programs relating to smoking cessation. QuitCoach[36] is a personalised quit plan based on the user's responses to questions about giving up smoking, tailored individually each time the user logs into the site. Freedom From Smoking[37] takes users through lessons grouped into modules that provide information and assignments to complete. The modules guide participants through steps such as preparing to quit smoking, stopping smoking and preventing relapse.

Other internet programs have been developed specifically as part of research into treatment for specific disorders. For example, an online self-directed therapy for problem gambling was developed specifically to test this method of treatment.[24] All participants were given access to a website. The treatment group was provided with behavioural and cognitive strategies to reduce or quit gambling, presented in the form of a workbook which encouraged participants to self-monitor their gambling by maintaining an online log of gambling and gambling urges. Participants could also use a smartphone application to collect self-monitoring information. Finally, participants could choose to receive motivational email or text reminders of their progress and goals.

An internet-based intervention was also developed for use after Hurricane Ike in 2009.[25] During this study, 1,249 disaster-affected adults were randomly recruited to take part. Participants were given a structured interview and then invited to access the web intervention using a unique password. Access to the website was provided for a four-month period. As participants accessed the site, they were randomly assigned either to the intervention or to a comparison condition. Those assigned to the intervention were provided with modules consisting of information on effective coping strategies for managing mental health and health risk behaviour.

Cybermedicine

Cybermedicine is the use of the Internet to deliver medical services, such as medical consultations and drug prescriptions. It is the successor to telemedicine, wherein doctors would consult and treat patients remotely via telephone or fax.

Cybermedicine is already being used in small projects where images are transmitted from a primary care setting to a medical specialist, who comments on the case and suggests which intervention might benefit the patient. A field that lends itself to this approach is dermatology, where images of an eruption are communicated to a hospital specialist who determines if referral is necessary.

The field has also expanded to include online "ask the doctor" services that allow patients direct, paid access to consultations (with varying degrees of depth) with medical professionals (examples include Bundoo.com, Doctor Spring, Teladoc, and Ask The Doctor).

A Cyber Doctor,[38] known in the UK as a Cyber Physician,[39] is a medical professional who consults via the internet, treating virtual patients whom they may never meet face to face. This is a new area of medicine that has been utilized by the armed forces and by teaching hospitals offering online consultations to patients before they decide to travel for unique medical treatment offered only at a particular medical facility.[38]

Self-monitoring healthcare devices

Self-monitoring is the use of sensors or tools that are readily available to the general public to track and record personal data. The sensors are usually wearable devices, and the tools are digitally available through mobile device applications. Self-monitoring devices were created to make personal data instantly available to the individual for analysis. Currently, fitness and health monitoring are the most popular applications for self-monitoring devices.[40] Their biggest benefit is reducing the need for expensive and lengthy tests run by third parties such as hospitals. These devices are an important advancement in the field of personal health management.

Self-monitoring healthcare devices exist in many forms. An example is the Nike+ FuelBand, a modified version of the original pedometer.[40] This device is worn on the wrist and allows one to set a personal goal for a daily energy burn. It records the calories burned and the number of steps taken each day while simultaneously functioning as a watch. To add to the ease of the user interface, it includes both numeric and visual indicators of whether or not the individual has achieved the daily goal. Finally, it syncs with an iPhone app that allows for tracking and sharing of personal records and achievements.

Other monitoring devices have more medical relevance. A well-known device of this type is the blood glucose monitor, used by diabetic patients to measure their blood glucose levels. It is highly quantitative, and the results are available instantaneously.[41] However, this device is not as self-contained a self-monitoring device as the Nike+ FuelBand, because it requires some patient education before use. Users need to be able to make connections between their glucose levels and the effects of diet and exercise, and must also understand how treatment should be adjusted based on the results. In other words, the results are not just static measurements.
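A sketch of the interpretive step that this patient education covers appears below. The fasting-glucose thresholds are illustrative only and are not medical advice; a device vendor would take the actual ranges, and any treatment logic, from clinical guidelines.

```python
# A sketch of turning a raw glucose reading into an interpretation a
# patient can act on. Thresholds are illustrative fasting-glucose ranges,
# not medical advice; real devices would source them from guidelines.
def interpret_fasting_glucose(mg_dl: float) -> str:
    if mg_dl < 70:
        return "low: possible hypoglycemia, consider fast-acting carbohydrate"
    if mg_dl < 100:
        return "normal fasting range"
    if mg_dl < 126:
        return "elevated: discuss diet and exercise with a clinician"
    return "high: follow the treatment plan agreed with a clinician"

for reading in (62, 95, 110, 180):  # sample readings in mg/dL
    print(reading, "->", interpret_fasting_glucose(reading))
```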

The demand for self-monitoring health devices is skyrocketing, as wireless health technologies have become especially popular in the last few years. In fact, it is expected that by 2016, self-monitoring health devices will account for 80% of wireless medical devices.[42] The key selling point for these devices is the mobility of information for consumers. The accessibility of mobile devices such as smartphones and tablets has increased significantly within the past decade. This has made it easier for users to access real-time information in a number of peripheral devices.

There are still many future improvements for self-monitoring healthcare devices. Although most of these wearable devices have been excellent at providing direct data to the individual user, the biggest task which remains at hand is how to effectively use this data. Although the blood glucose monitor allows the user to take action based on the results, measurements such as the pulse rate, EKG signals, and calories do not necessarily serve to actively guide an individual's personal healthcare management. Consumers are interested in qualitative feedback in addition to the quantitative measurements recorded by the devices.[43]

Evaluation

Knowledge of the socio-economic performance of eHealth is limited, and findings from evaluations are often challenging to transfer to other settings. Socio-economic evaluations of some narrow types of mHealth can rely on health economic methodologies, but larger scale eHealth may have too many variables, and tortuous, intangible cause and effect links may need a wider approach.[44]

In developing countries

eHealth in general, and telemedicine in particular, is a vital resource to remote regions of emerging and developing countries but is often difficult to establish because of the lack of communications infrastructure.[45] For example, in Benin, hospitals often can become inaccessible due to flooding during the rainy season[46] and across Africa, the low population density, along with severe weather conditions and the difficult financial situation in many African states, has meant that the majority of the African people are badly disadvantaged in medical care. In many regions there is not only a significant lack of facilities and trained health professionals, but also no access to eHealth because there is also no internet access in remote villages, or even a reliable electricity supply.[47]

Internet connectivity, and with it the benefits of eHealth, can be brought to these regions using satellite broadband technology. Satellite is often the only solution where terrestrial access is limited or of poor quality, and it can provide a fast connection over a vast coverage area.

Health information technology

From Wikipedia, the free encyclopedia

Health information technology (HIT) is information technology applied to health and health care. It supports health information management across computerized systems and the secure exchange of health information between consumers, providers, payers, and quality monitors. Based on an often-cited 2008 report on a small series of studies conducted at four sites that provide ambulatory care – three U.S. medical centers and one in the Netherlands – the use of electronic health records (EHRs) was viewed as the most promising tool for improving the overall quality, safety and efficiency of the health delivery system. According to a 2006 report by the Agency for Healthcare Research and Quality, broad and consistent utilization of HIT will:
  • Improve health care quality or effectiveness;
  • Increase health care productivity or efficiency;
  • Prevent medical errors and increase health care accuracy and procedural correctness;
  • Reduce health care costs;
  • Increase administrative efficiencies and healthcare work processes;
  • Decrease paperwork and unproductive or idle work time;
  • Extend real-time communications of health informatics among health care professionals; and
  • Expand access to affordable care.
Risk-based regulatory framework for health IT

On September 4, 2013, the Health IT Policy Committee (HITPC) accepted and approved recommendations from the Food and Drug Administration Safety and Innovation Act (FDASIA) working group for a risk-based regulatory framework for health information technology.[3] The Food and Drug Administration (FDA), the Office of the National Coordinator for Health IT (ONC), and the Federal Communications Commission (FCC) kicked off the FDASIA workgroup of the HITPC to provide stakeholder input into a report on a risk-based regulatory framework that promotes safety and innovation and reduces regulatory duplication, consistent with section 618 of FDASIA. This provision permitted the Secretary of Health and Human Services (HHS) to form a workgroup in order to obtain broad stakeholder input from across the health care, IT, patient and innovation spectrum. The FDA, ONC, and FCC actively participated in these discussions with stakeholders.

HIMSS Good Informatics Practices (GIP) is aligned with the FDA's risk-based regulatory framework for health information technology.[4] GIP development began in 2004 with risk-based IT technical guidance.[5] Today the peer-reviewed and published GIP modules are widely used as a tool for educating health IT professionals.

Interoperable HIT will improve individual patient care, but it will also bring many public health benefits including:
  • Early detection of infectious disease outbreaks around the country;
  • Improved tracking of chronic disease management;
  • Evaluation of health care based on value, enabled by the collection of comparable, de-identified price and quality information.
According to an article published in the International Journal of Medical Informatics, health information sharing between patients and providers helps to improve diagnosis, promotes self-care, and increases patients' knowledge of their own health. The use of electronic medical records (EMRs) is still scarce but is increasing in Canadian, American and British primary care. Healthcare information in EMRs is an important source for clinical, research, and policy questions. Health information privacy (HIP) and security have been major concerns for patients and providers. European studies have evaluated the threats that electronic health information exchange can pose to electronic medical records and the exchange of personal information.[6] Moreover, software traceability features allow hospitals to collect detailed information about the preparations dispensed, creating a database of every treatment that can be used for research purposes.[7]

Concepts and definitions

Health information technology (HIT) is "the application of information processing involving both computer hardware and software that deals with the storage, retrieval, sharing, and use of health care information, health data, and knowledge for communication and decision making".[8] Technology is a broad concept that deals with a species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. However, a strict definition is elusive; "technology" can refer to material objects of use to humanity, such as machines, hardware or utensils, but can also encompass broader themes, including systems, methods of organization, and techniques. For HIT, technology represents computers and communications attributes that can be networked to build systems for moving health information. Informatics is yet another integral aspect of HIT.
Informatics refers to the science of information, the practice of information processing, and the engineering of information systems. Informatics underlies the academic investigation and practitioner application of computing and communications technology to healthcare, health education, and biomedical research. Health informatics refers to the intersection of information science, computer science, and health care. Health informatics describes the use and sharing of information within the healthcare industry with contributions from computer science, mathematics, and psychology. It deals with the resources, devices, and methods required for optimizing the acquisition, storage, retrieval, and use of information in health and biomedicine. Health informatics tools include not only computers but also clinical guidelines, formal medical terminologies, and information and communication systems. Medical informatics, nursing informatics, public health informatics, pharmacy informatics, and translational bioinformatics are subdisciplines that inform health informatics from different disciplinary perspectives.[9] The processes and people of concern or study are the main variables.

Implementation

The Institute of Medicine's (2001) call for the use of electronic prescribing systems in all healthcare organizations by 2010 heightened the urgency to accelerate United States hospitals' adoption of CPOE systems. In 2004, President Bush signed an Executive Order titled the President's Health Information Technology Plan, which established a ten-year plan to develop and implement electronic medical record systems across the US to improve the efficiency and safety of care. According to a study by RAND Health, the US healthcare system could save more than $81 billion annually, reduce adverse healthcare events and improve the quality of care if it were to widely adopt health information technology.[10]

The American Recovery and Reinvestment Act, signed into law in 2009 under the Obama Administration, has provided approximately $19 billion in incentives for hospitals to shift from paper to electronic medical records. Meaningful Use, part of the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act, included over $20 billion in incentives for the implementation of HIT alone, and provided further indication of the growing consensus regarding the potential salutary effect of HIT. The American Recovery and Reinvestment Act set aside $2 billion for programs developed by the National Coordinator and Secretary to help healthcare providers implement HIT and to provide technical assistance through various regional centers. The other $17 billion in incentives comes from Medicare and Medicaid funding for those who adopt HIT before 2015. Healthcare providers who implement electronic records can receive up to $44,000 over four years in Medicare funding and $63,750 over six years in Medicaid funding. The sooner healthcare providers adopt the system, the more funding they receive. Those who do not adopt electronic health record systems before 2015 will not receive any federal funding.[11]

While electronic health records have potentially many advantages in terms of providing efficient and safe care, recent reports have brought to light some challenges with implementing electronic health records. The most immediate barriers for widespread adoption of this technology have been the high initial cost of implementing the new technology and the time required for doctors to train and adapt to the new system. There have also been suspected cases of fraudulent billing, where hospitals inflate their billings to Medicare. Given that healthcare providers have not reached the deadline (2015) for adopting electronic health records, it is unclear what effects this policy will have long term.[12]

One approach to reducing the costs and promoting wider use is to develop open standards related to EHRs. In 2014 there was widespread interest in a new HL7 draft standard, Fast Healthcare Interoperability Resources (FHIR), which is designed to be open, extensible, and easier to implement, benefiting from modern web technologies.[13]
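Because FHIR exposes resources as plain JSON over ordinary HTTP, a client can be very small. The sketch below reads a Patient resource from a hypothetical FHIR R4 endpoint; the base URL and patient ID are assumptions, but the GET [base]/Patient/[id] pattern is the standard FHIR read interaction.

```python
# A sketch of why FHIR is "easier to implement": resources are plain JSON
# exchanged over ordinary HTTP. The server URL below is hypothetical; any
# FHIR R4-style server would expose Patient resources this way.
import json
import urllib.request

BASE = "https://fhir.example.org/r4"   # hypothetical FHIR server

# Reading a resource is a plain RESTful GET: GET [base]/Patient/[id]
with urllib.request.urlopen(f"{BASE}/Patient/123") as resp:
    patient = json.load(resp)

print(patient["resourceType"])         # -> "Patient"
print(patient.get("name", [{}])[0].get("family", "<no name>"))
```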

Types of technology

In a 2008 study about the adoption of technology in the United States, Furukawa and colleagues classified applications for prescribing to include electronic medical records (EMR), clinical decision support (CDS), and computerized physician order entry (CPOE).[14] They further defined applications for dispensing to include bar-coding at medication dispensing (BarD), robots for medication dispensing (ROBOT), and automated dispensing machines (ADM). They defined applications for administration to include electronic medication administration records (eMAR) and bar-coding at medication administration (BarA or BCMA).

Electronic health record (EHR)

US medical groups' adoption of EHR (2005)

Although the electronic health record (EHR), previously known as the electronic medical record (EMR), is frequently cited in the literature, there is no consensus about the definition.[15] There is, however, consensus that EMRs can reduce several types of errors, including those related to prescription drugs, to preventive care, and to tests and procedures.[16] Recurring alerts remind clinicians of intervals for preventive care and track referrals and test results. Clinical guidelines for disease management have a demonstrated benefit when accessible within the electronic record during the process of treating the patient.[17] Advances in health informatics and widespread adoption of interoperable electronic health records promise access to a patient's records at any health care site. A 2005 report noted that medical practices in the United States were encountering barriers to adopting an EHR system, such as training, costs and complexity, but the adoption rate continues to rise (see adoption data above).[18] Since 2002, the National Health Service of the United Kingdom has placed emphasis on introducing computers into healthcare. As of 2005, one of the largest projects for a national EHR was that of the National Health Service (NHS) in the United Kingdom. The goal of the NHS was to have 60,000,000 patients with a centralized electronic health record by 2010. The plan involved a gradual roll-out commencing May 2006, providing general practices in England access to the National Programme for IT (NPfIT), the NHS component of which is known as the "Connecting for Health Programme".[19] However, recent surveys have shown physicians' deficiencies in understanding the patient safety features of the NPfIT-approved software.[20] A major barrier to HIT adoption is physician resistance; physicians are important stakeholders in the EHR process. Thorn et al. found that emergency physicians felt health information exchange disrupted workflow and was less desirable to use, even though the main goal of EHR is improving coordination of care: the exchanges did not address the needs of end users, e.g. simplicity, a user-friendly interface, and speed of systems.[21] Bhattacherjee et al. reported a similar finding in an earlier article focused on CPOE and physician resistance to its use.[22]
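The "recurring alerts" for preventive-care intervals mentioned above can be illustrated with a minimal rule check. The intervals and patient record below are hypothetical; a real EHR would draw both from clinical guidelines and the patient chart.

```python
# A minimal sketch of recurring preventive-care alerts: compare a patient's
# last screenings against recommended intervals. Intervals and records are
# hypothetical, for illustration only.
from datetime import date

INTERVALS_DAYS = {              # hypothetical guideline intervals
    "flu_vaccine": 365,
    "blood_pressure_check": 365,
    "colonoscopy": 10 * 365,
}

last_done = {                   # hypothetical patient record
    "flu_vaccine": date(2017, 10, 1),
    "blood_pressure_check": date(2018, 5, 2),
    "colonoscopy": date(2007, 3, 15),
}

def due_alerts(today):
    """Yield an alert for each preventive-care item past its interval."""
    for item, interval in INTERVALS_DAYS.items():
        last = last_done.get(item)
        if last is None or (today - last).days > interval:
            yield f"ALERT: {item} due (last done: {last})"

for alert in due_alerts(date(2018, 7, 30)):
    print(alert)
```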

Clinical point of care technology

Computerized provider (physician) order entry

Prescribing errors are the largest identified source of preventable errors in hospitals. A 2006 report by the Institute of Medicine estimated that a hospitalized patient is exposed to a medication error each day of his or her stay.[23] Computerized provider order entry (CPOE), also called computerized physician order entry, can reduce total medication error rates by 80%, and adverse (serious, with harm to the patient) errors by 55%.[24] A 2004 survey found that 16% of US clinics, hospitals and medical practices were expected to be utilizing CPOE within 2 years.[25] In addition to electronic prescribing, a standardized bar code system for dispensing drugs could prevent a quarter of drug errors.[23] Consumer information about the risks of drugs and improved drug packaging (clear labels, avoiding similar drug names, and dosage reminders) are other error-proofing measures. Despite ample evidence of the potential to reduce medication errors, competing systems of barcoding and electronic prescribing have slowed adoption of this technology by doctors and hospitals in the United States, due to concern with interoperability and compliance with future national standards.[26] Such concerns are not inconsequential; standards for electronic prescribing for Medicare Part D conflict with regulations in many US states.[23] And, aside from regulatory concerns, for the small-practice physician, utilizing CPOE requires a major change in practice work flow and an additional investment of time. Many physicians are not full-time hospital staff; entering orders for their hospitalized patients means taking time away from scheduled patients.[27]

Technological innovations, opportunities, and challenges

One of the rapidly growing areas of health care innovation lies in the advanced use of data science and machine learning. The key opportunities here are:
  • Health Monitoring and Diagnosis (see the sketch after this list);
  • Medical Treatment and Patient Care;
  • Pharmaceutical Research and Development;
  • Clinic Performance Optimization.[28]
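As a toy example of the first opportunity above, the sketch below flags anomalous heart-rate samples with a rolling z-score. The data and threshold are invented; production monitoring would rely on clinically validated models.

```python
# A toy health-monitoring sketch: flag heart-rate samples that deviate
# sharply from the recent window. Data and threshold are made up.
from statistics import mean, stdev

def flag_anomalies(samples, window=10, threshold=3.0):
    """Yield (index, value) for samples far outside the recent window."""
    for i in range(window, len(samples)):
        recent = samples[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            yield i, samples[i]

heart_rate = [72, 71, 73, 74, 72, 70, 71, 73, 72, 74, 75, 73, 140, 72, 71]
for idx, bpm in flag_anomalies(heart_rate):
    print(f"sample {idx}: {bpm} bpm looks anomalous")
```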
Handwritten reports or notes, manual order entry, non-standard abbreviations and poor legibility lead to substantial errors and injuries, according to the Institute of Medicine (2000) report. The follow-up IOM (2004) report, Crossing the Quality Chasm: A New Health System for the 21st Century, advised rapid adoption of electronic patient records and electronic medication ordering, with computer- and internet-based information systems to support clinical decisions.[29] However, many system implementations have experienced costly failures.[30] Furthermore, there is evidence that CPOE may actually contribute to some types of adverse events and other medical errors.[31] For example, the period immediately following CPOE implementation resulted in significant increases in reported adverse drug events in at least one study,[32] and evidence of other errors has been reported.[24][33][34] Collectively, these reported adverse events describe phenomena related to the disruption of the complex adaptive system resulting from poorly implemented or inadequately planned technological innovation.

Technological iatrogenesis

Technology may introduce new sources of error.[35][36] Technologically induced errors are significant and increasingly evident in care delivery systems. Terms to describe this new area of error production include the label technological iatrogenesis[37] for the process and e-iatrogenic[38] for the individual error. The sources of these errors include:
  • Prescriber and staff inexperience may lead to a false sense of security: a belief that when technology suggests a course of action, errors are avoided.
  • Shortcut or default selections can override non-standard medication regimens for elderly or underweight patients, resulting in toxic doses.
  • CPOE and automated drug dispensing were identified as a cause of error by 84% of over 500 health care facilities participating in a surveillance system by the United States Pharmacopoeia.[39]
  • Irrelevant or frequent warnings can interrupt work flow.
Healthcare information technology can also result in iatrogenesis if design and engineering are substandard, as illustrated in a 14-part detailed analysis done at the University of Sydney.[40]

Revenue Cycle HIT

The HIMSS Revenue Cycle Improvement Task Force was formed to prepare for the IT changes in the U.S. (e.g. the American Recovery and Reinvestment Act of 2009 (HITECH), the Affordable Care Act, 5010 (electronic exchanges), and ICD-10). An important change to the revenue cycle is the transition of the International Classification of Diseases (ICD) codes from version 9 to version 10. ICD-9 codes use three to five alphanumeric characters and represent about 4,000 different types of procedures, while ICD-10 codes use three to seven alphanumeric characters, increasing the number of procedural codes to about 70,000. ICD-9 was outdated because there were more procedures than available codes, so unspecified codes were used to document procedures without an ICD-9 code; these did not fully capture the procedures or the work involved, in turn affecting reimbursement. Hence, ICD-10 was introduced to reduce reliance on unspecified codes and bring the standards closer to world standards (ICD-11). One of the main parts of Revenue Cycle HIT is charge capture, which utilizes codes to capture costs for reimbursement from different payers, such as CMS.[41]
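The shape of the change can be seen in a small sketch. The regular expressions below only mimic the three-to-five and three-to-seven character formats described above; real billing systems validate against the official CMS code tables, not a regex.

```python
# A simplified sketch of the ICD-9 vs. ICD-10 code shapes described in
# the text. Illustrative only; real validation uses official code tables.
import re

ICD9 = re.compile(r"^[0-9A-Z]{3}(\.[0-9A-Z]{1,2})?$")          # 3-5 chars
ICD10 = re.compile(r"^[A-Z][0-9][0-9A-Z](\.[0-9A-Z]{1,4})?$")  # 3-7 chars

for code in ("250.00", "E11.9", "I10", "S72.001A"):
    shapes = []
    if ICD9.match(code):
        shapes.append("ICD-9-like")
    if ICD10.match(code):
        shapes.append("ICD-10-like")
    print(code, "->", ", ".join(shapes) or "neither")
```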

International comparisons through HIT

International health system performance comparisons are important for understanding health system complexities and finding opportunities for improvement, and health information technology makes them possible. It gives policy makers the chance to compare and contrast systems through established indicators derived from health information technology; this matters because inaccurate comparisons can lead to adverse policies.

AlphaZero’s ‘alien’ superhuman-level program masters chess in 24 hours with no domain knowledge

Like a robot building a Ferrari from thousands of metal bits and parts, but no knowledge of a combustion engine
December 11, 2017
Original link:  http://www.kurzweilai.net/alphazeros-alien-superhuman-level-program-masters-chess-in-24-hours-with-no-domain-knowledge
AlphaZero vs. Stockfish chess program | Round 1 (credit: Chess.com)

Demis Hassabis, the founder and CEO of DeepMind, announced at the Neural Information Processing Systems conference (NIPS 2017) last week that DeepMind’s new AlphaZero program achieved a superhuman level of play in chess within 24 hours.

The program started from random play, given no domain knowledge except the game rules, according to an arXiv paper by DeepMind researchers published Dec. 5.

“It doesn’t play like a human, and it doesn’t play like a program,” said Hassabis, an expert chess player himself. “It plays in a third, almost alien, way. It’s like chess from another dimension.”
AlphaZero also mastered both shogi (Japanese chess) and Go within 24 hours, defeating a world-champion program in all three cases. The original AlphaGo mastered Go by learning thousands of example games and then practicing against another version of itself.



“AlphaZero was not ‘taught’ the game in the traditional sense,” explains Chess.com. “That means no opening book, no endgame tables, and apparently no complicated algorithms dissecting minute differences between center pawns and side pawns. This would be akin to a robot being given access to thousands of metal bits and parts, but no knowledge of a combustion engine, then it experiments numerous times with every combination possible until it builds a Ferrari. … The program had four hours to play itself many, many times, thereby becoming its own teacher.”
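The "own teacher" idea can be illustrated, far more modestly, with tabular Q-learning on tic-tac-toe: an agent given only the rules improves purely by playing itself. This sketch is not AlphaZero's actual method (deep neural networks guiding Monte Carlo tree search), just the tabula rasa self-play principle in miniature.

```python
# A toy illustration of learning by self-play from the rules alone:
# tabular Q-learning on tic-tac-toe. Not AlphaZero's algorithm (deep
# networks plus Monte Carlo tree search), only the self-play idea.
import random
from collections import defaultdict

LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

Q = defaultdict(float)        # (board, move) -> value, starts at zero
ALPHA, EPSILON = 0.5, 0.1

def choose(board, moves):
    if random.random() < EPSILON:                       # explore
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(board, m)])      # exploit

for game in range(20_000):                              # self-play loop
    board, player, history = "." * 9, "X", []
    while True:
        moves = [i for i, c in enumerate(board) if c == "."]
        if not moves:
            result = {}                                 # draw: reward 0
            break
        m = choose(board, moves)
        history.append((board, m, player))
        board = board[:m] + player + board[m + 1:]
        if winner(board):
            result = {player: 1.0, "XO".replace(player, ""): -1.0}
            break
        player = "O" if player == "X" else "X"
    for b, m, p in history:                             # learn from outcome
        r = result.get(p, 0.0)
        Q[(b, m)] += ALPHA * (r - Q[(b, m)])

print("state-action pairs evaluated:", len(Q))
```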

“What’s also remarkable, though, Hassabis explained, is that it sometimes makes seemingly crazy sacrifices, like offering up a bishop and queen to exploit a positional advantage that led to victory,” MIT Technology Review notes. “Such sacrifices of high-value pieces are normally rare. In another case the program moved its queen to the corner of the board, a very bizarre trick with a surprising positional value.”



Abstract of Mastering Chess and Shogi by Self-Play with a General Reinforcement Learning Algorithm

The game of chess is the most widely-studied domain in the history of artificial intelligence. The strongest programs are based on a combination of sophisticated search techniques, domain-specific adaptations, and handcrafted evaluation functions that have been refined by human experts over several decades. In contrast, the AlphaGo Zero program recently achieved superhuman performance in the game of Go, by tabula rasa reinforcement learning from games of self-play. In this paper, we generalise this approach into a single AlphaZero algorithm that can achieve, tabula rasa, superhuman performance in many challenging domains. Starting from random play, and given no domain knowledge except the game rules, AlphaZero achieved within 24 hours a superhuman level of play in the games of chess and shogi (Japanese chess) as well as Go, and convincingly defeated a world-champion program in each case.

Human-centered computing

From Wikipedia, the free encyclopedia

Human-centered computing (HCC) studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines concerned both with understanding human beings and with the design of computational artifacts. Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction is more focused on ergonomics and the usability of computing artifacts, and information science is focused on practices surrounding the collection, manipulation, and use of information.

Human-centered computing researchers and practitioners usually come from one or more disciplines such as computer science, human factors, sociology, psychology, cognitive science, anthropology, communication studies, graphic design and industrial design. Some researchers focus on understanding humans, both as individuals and in social groups, by focusing on the ways that human beings adopt and organize their lives around computational technologies. Others focus on designing and developing new computational artifacts.

Overview

Scope

HCC aims at bridging the existing gaps between the various disciplines involved in the design and implementation of computing systems that support human activities.[1] It also comprises a set of methodologies that apply to any field in which people directly interact with devices or systems that use computer technologies.

HCC facilitates the design of effective computer systems that take into account personal, social, and cultural aspects and addresses issues such as information design, human information interaction, human-computer interaction, human-human interaction, and the relationships between computing technology and art, social, and cultural issues.[1]

HCC topics

The National Science Foundation (NSF) defines the trends of HCC research as "a three dimensional space comprising human, computer, and environment."[2] According to the NSF, the human dimension ranges from research that supports individual needs, through teams as goal-oriented groups, to society as an unstructured collection of connected people. The computer dimension ranges from fixed computing devices, through mobile devices, to computational systems of visual/audio devices that are embedded in the surrounding physical environment. The environment dimension ranges from discrete physical computational devices, through mixed reality systems, to immersive virtual environments.[2] Some examples of topics in the field are listed below.

List of topics in HCC field

  • Problem-solving in distributed environments, ranging across Internet-based information systems, grids, sensor-based information networks, and mobile and wearable information appliances.
  • Multimedia and multi-modal interfaces in which combinations of speech, text, graphics, gesture, movement, touch, sound, etc. are used by people and machines to communicate with one another.
  • Intelligent interfaces and user modeling, information visualization, and adaptation of content to accommodate different display capabilities, modalities, bandwidth and latency.
  • Multi-agent systems that control and coordinate actions and solve complex problems in distributed environments in a wide variety of domains, such as disaster response teams, e-commerce, education, and successful aging.
  • Models for effective computer-mediated human-human interaction under a variety of constraints (e.g., video conferencing, collaboration across high- vs. low-bandwidth networks, etc.).
  • Definition of semantic structures for multimedia information to support cross-modal input and output.
  • Specific solutions to address the special needs of particular communities.
  • Collaborative systems that enable knowledge-intensive and dynamic interactions for innovation and knowledge generation across organizational boundaries, national borders, and professional fields.
  • Novel methods to support and enhance social interaction, including innovative ideas like social orthotics, affective computing, and experience capture.
  • Studies of how social organizations, such as government agencies or corporations, respond to and shape the introduction of new information technologies, especially with the goal of improving scientific understanding and technical design.
  • Knowledge-driven human-computer interaction that uses ontologies to address the semantic ambiguities between human and computer understandings of mutual behaviors[3]
  • Human-centered semantic relatedness measure that employs human power to measure the semantic relatedness between two concepts[4]

Human-centered systems

Human-centered systems (HCS) are systems designed for human-centered computing. HCS focuses on the design of interactive systems as they relate to human activities.[5] According to Kling et al., the Committee on Computing, Information, and Communication of the National Science and Technology Council identified human-centered systems, or HCS, as one of five components of a High Performance Computing Program.[6] Human-centered systems can also be described in terms of human-centered automation. According to Kling et al., HCS refers to "systems that are:
  1. based on the analysis of the human tasks the system is aiding
  2. monitored for performance in terms of human benefits
  3. built to take account of human skills and
  4. adaptable easily to changing human needs."[6]
In addition, Kling et al. define four dimensions of human-centeredness that should be taken into account when classifying a system:
  1. systems that are human-centered must analyze the complexity of the targeted social organization and the varied social units that structure work and information;
  2. human-centeredness is not an attribute of systems but a process in which the stakeholder group of a particular system assists in evaluating the benefit of the system;
  3. the basic architecture of the system should reflect a realistic relationship between humans and machines;
  4. the purpose and audience the system is designed for should be an explicit part of the design, evaluation, and use of the system.[6]

Human-centered activities in multimedia

Wikimania human-centered design visualization, created by Myriapoda.

According to [7], the human-centered activities in multimedia, or HCM, can be considered as follows: media production, annotation, organization, archival, retrieval, sharing, analysis, and communication. These can be clustered into three areas: production, analysis, and interaction.

Multimedia production

Multimedia production is the human task of creating media,[8] for instance photographing, recording audio, remixing, etc. In HCM, it is important that all aspects of media production directly involve humans. There are two main considerations in multimedia production. The first is cultural and social factors: HCM production systems should consider cultural differences and be designed according to the culture in which they will be deployed. The second is human abilities: participants involved in HCM production should be able to complete the activities during the production process.

Multimedia analysis

Multimedia analysis can be considered a type of HCM application: the automatic analysis of human activities and social behavior in general. There is a broad area of potentially relevant uses, from facilitating and enhancing human communication to allowing improved information access and retrieval in the professional, entertainment, and personal domains.

Multimedia interaction

Multimedia interaction can be considered the interaction activity area of HCM. It is paramount to understand both how humans interact with each other and why, so that we can build systems to facilitate such communication and so that people can interact with computers in natural ways. To achieve natural interaction, cultural differences and social context are primary factors to consider, given participants' potentially different cultural backgrounds. Examples include: face-to-face communication, where the interaction is physically located and real-time; live computer-mediated communication, where the interaction is physically remote but remains real-time; and non-real-time computer-mediated communication such as SMS, email, etc.

Career

Academic programs

As human-centered computing has become increasingly popular, many universities have created special programs for HCC research and study for both graduate and undergraduate students.

User interface designer

A user interface designer is an individual who usually has a relevant degree or a high level of knowledge, not only of technology, cognitive science, human–computer interaction, and the learning sciences, but also of psychology and sociology. A user interface designer develops and applies user-centered design methodologies and agile development processes that include consideration of the overall usability of interactive software applications, emphasizing interaction design and front-end development.

Information architect (IA)

Information architects mainly work to understand user and business needs in order to organize information to best satisfy these needs. Specifically, information architects often act as a key bridge between technical and creative development in a project team. Areas of interest in IA include search schemas, metadata, and taxonomy.[9]

Projects

NASA/Ames Computational Sciences Division

NASA Mars Project

The Human-Centered Computing (HCC) group at NASA/Ames Computational Sciences Division is conducting research at Haughton as members of the Haughton-Mars Project (HMP) to determine, via an analog study, how we will live and work on Mars.[10]
  1. HMP/Carnegie Mellon University (CMU) Field Robotics Experiments—HCC is collaborating with researchers on the HMP/CMU field robotics research program at Haughton to specify opportunities for robots to assist scientists. Researchers in this project have carried out a parallel investigation that documents work during traverses. A simulation module has been built, using a tool that represents people, their tools, and their work environment, that will serve as a partial controller for a robot assisting scientists doing field work on Mars. Theory and techniques from the HCC field guide the effort to take humans, computing, and the environment into consideration together.
  2. Ethnography of Human Exploration of Space—The HCC lab is carrying out an ethnographic study of scientific field work, covering all aspects of a scientist's life in the field. This study involves observing as participants at Haughton and writing about the lab's experiences. The lab then looks for patterns in how people organize their time, space, and objects and how they relate to each other to accomplish their goals. In this study, the lab is focusing on learning and conceptual change.

Introduction to entropy

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Introduct...