
Tuesday, July 14, 2020

Artificial intelligence in healthcare

From Wikipedia, the free encyclopedia
 
X-ray of a hand, with automatic calculation of bone age by a computer software

Artificial intelligence in healthcare is the use of complex algorithms and software, in other words artificial intelligence (AI), to emulate human cognition in the analysis, interpretation, and comprehension of complicated medical and healthcare data. Specifically, AI is the ability of computer algorithms to approximate conclusions without direct human input.

What distinguishes AI technology from traditional technologies in health care is the ability to gather information, process it, and give a well-defined output to the end-user. AI does this through machine learning algorithms and deep learning. These algorithms can recognize patterns in behavior and create their own logic. To reduce the margin of error, AI algorithms need to be tested repeatedly. AI algorithms behave differently from humans in two ways: (1) algorithms are literal: once a goal is set, the algorithm cannot adjust itself and understands only what it has been told explicitly; and (2) some deep learning algorithms are black boxes: they can make extremely precise predictions but offer no explanation of the cause or reasoning behind them.

The primary aim of health-related AI applications is to analyze relationships between prevention or treatment techniques and patient outcomes. AI programs have been developed and applied to practices such as diagnosis processes, treatment protocol development, drug development, personalized medicine, and patient monitoring and care. Medical institutions such as the Mayo Clinic, Memorial Sloan Kettering Cancer Center, and the British National Health Service have developed AI algorithms for their departments. Large technology companies such as IBM and Google have also developed AI algorithms for healthcare. Additionally, hospitals are looking to AI software to support operational initiatives that increase cost savings, improve patient satisfaction, and satisfy their staffing and workforce needs. Companies are developing predictive analytics solutions that help healthcare managers improve business operations by increasing utilization, decreasing patient boarding, reducing length of stay, and optimizing staffing levels.

History

Research in the 1960s and 1970s produced the first problem-solving program, or expert system, known as Dendral. While it was designed for applications in organic chemistry, it provided the basis for a subsequent system MYCIN, considered one of the most significant early uses of artificial intelligence in medicine. MYCIN and other systems such as INTERNIST-1 and CASNET did not achieve routine use by practitioners, however.

The 1980s and 1990s brought the proliferation of the microcomputer and new levels of network connectivity. During this time, there was a recognition by researchers and developers that AI systems in healthcare must be designed to accommodate the absence of perfect data and build on the expertise of physicians. Approaches involving fuzzy set theory, Bayesian networks, and artificial neural networks, have been applied to intelligent computing systems in healthcare.

Medical and technological advancements occurring over this half-century period have enabled the growth of healthcare-related applications of AI.

Current research

Various specialties in medicine have shown an increase in research regarding AI.

Radiology

The ability to interpret imaging results with radiology may aid clinicians in detecting a minute change in an image that a clinician might accidentally miss. A study at Stanford created an algorithm that could detect pneumonia at that specific site, in those patients involved, with a better average F1 metric (a statistical metric based on precision and recall) than the radiologists involved in that trial. Several companies (icometrix, QUIBIM, Robovision, ...) have emerged that offer AI platforms for uploading images to. There are also vendor-neutral systems like UMC Utrecht's IMAGR AI. These platforms are trainable through deep learning to detect a wide range of specific diseases and disorders. The Radiological Society of North America has included presentations on AI in imaging during its annual meeting. The emergence of AI technology in radiology is perceived as a threat by some specialists, because the technology can outperform specialists on certain statistical metrics in isolated cases.
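
The F1 metric cited in the Stanford study combines precision (the fraction of flagged cases that are correct) and recall (the fraction of true cases that are found) into a single number. A minimal illustration in Python, using made-up counts rather than figures from the study:

```python
def f1_score(tp, fp, fn):
    """Compute the F1 metric from raw counts of true positives,
    false positives, and false negatives."""
    precision = tp / (tp + fp)  # fraction of flagged cases that are correct
    recall = tp / (tp + fn)     # fraction of actual cases that are found
    return 2 * precision * recall / (precision + recall)

# Example: a hypothetical model that finds 80 of 100 pneumonia cases
# (20 missed) while raising 10 false alarms.
print(round(f1_score(tp=80, fp=10, fn=20), 3))  # 0.842
```

Because it is a harmonic mean, F1 punishes a model that trades one quantity for the other: a detector that flags everything has perfect recall but poor precision, and its F1 stays low.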

Imaging

Recent advances have suggested the use of AI to describe and evaluate the outcome of maxillo-facial surgery or the assessment of cleft palate therapy in regard to facial attractiveness or age appearance.

In 2018, a paper published in the journal Annals of Oncology mentioned that skin cancer could be detected more accurately by an artificial intelligence system (which used a deep learning convolutional neural network) than by dermatologists. On average, the human dermatologists accurately detected 86.6% of skin cancers from the images, compared to 95% for the CNN machine.

Psychiatry

In psychiatry, AI applications are still in a phase of proof-of-concept. Areas where the evidence is widening quickly include chatbots, conversational agents that imitate human behaviour and which have been studied for anxiety and depression.

Challenges include the fact that many applications in the field are developed and proposed by private corporations, such as the screening for suicidal ideation implemented by Facebook in 2017. Such applications outside the healthcare system raise various professional, ethical and regulatory questions.

Disease Diagnosis

There are many diseases, and there are also many ways that AI has been used to efficiently and accurately diagnose them. Some of the most prevalent diseases, such as diabetes and cardiovascular disease (CVD), both of which are among the top ten causes of death worldwide, have been the basis for much of the research and testing aimed at producing an accurate diagnosis. Because of the high mortality rates associated with these diseases, there have been efforts to integrate various methods to help obtain accurate diagnoses.

An article by Jiang et al. (2017) demonstrated that several types of AI techniques have been used for a variety of different diseases, including support vector machines, neural networks, and decision trees. Each of these techniques is described as having a “training goal” so that “classifications agree with the outcomes as much as possible…”.

To give some specifics for disease diagnosis and classification, two techniques used in classifying these diseases are artificial neural networks (ANN) and Bayesian networks (BN). A review of multiple papers published between 2008 and 2017 examined which of the two techniques performed better. The conclusion drawn was that “the early classification of these diseases can be achieved developing machine learning models such as Artificial Neural Network and Bayesian Network.” Another conclusion Alic et al. (2017) drew was that, between ANN and BN, ANN performed better and could more accurately classify diabetes and CVD, with a mean accuracy in “both cases (87.29 for diabetes and 89.38 for CVD)”.
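
Neither review publishes code, but the shared “training goal” of these techniques can be sketched with a single-neuron network trained by gradient descent. The data below is synthetic and purely illustrative, not drawn from the diabetes or CVD studies:

```python
import math
import random

random.seed(0)

# Synthetic "patients": two features (e.g. scaled glucose, BMI) and a
# binary label (1 = disease). Illustrative only, not clinical data.
data = ([([random.gauss(1.0, 0.4), random.gauss(1.0, 0.4)], 1) for _ in range(50)] +
        [([random.gauss(-1.0, 0.4), random.gauss(-1.0, 0.4)], 0) for _ in range(50)])

w, b = [0.0, 0.0], 0.0  # weights and bias of a single logistic neuron

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))  # sigmoid activation, outputs a probability

# Training goal: adjust w and b so classifications agree with outcomes
# as much as possible (gradient descent on the log-loss).
for _ in range(200):
    for x, y in data:
        p = predict(x)
        for i in range(2):
            w[i] += 0.1 * (y - p) * x[i]
        b += 0.1 * (y - p)

accuracy = sum((predict(x) > 0.5) == (y == 1) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

Real diagnostic ANNs use many such neurons in stacked layers and far richer patient features, but the loop is the same: predict, compare against the known outcome, and nudge the weights.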

Telehealth

The growth of telemedicine has brought with it a rise in possible AI applications. The ability to monitor patients using AI may allow for the communication of information to physicians if possible disease activity has occurred. A wearable device may allow for constant monitoring of a patient and the ability to notice changes that may be less distinguishable by humans.

Electronic health records

Electronic health records (EHRs) are crucial to the digitalization and information spread of the healthcare industry. However, logging all of this data comes with its own problems, such as cognitive overload and burnout for users. EHR developers are now automating much of the process and even starting to use natural language processing (NLP) tools to improve it. One study conducted by the Centerstone Research Institute found that predictive modeling of EHR data achieved 70–72% accuracy in predicting individualized treatment response at baseline. In other words, an AI tool that scans EHR data can predict the course of disease in a person with reasonable accuracy.

Drug Interactions

Improvements in natural language processing led to the development of algorithms to identify drug-drug interactions in medical literature. Drug-drug interactions pose a threat to those taking multiple medications simultaneously, and the danger increases with the number of medications being taken. To address the difficulty of tracking all known or suspected drug-drug interactions, machine learning algorithms have been created to extract information on interacting drugs and their possible effects from medical literature. Efforts were consolidated in 2013 in the DDIExtraction Challenge, in which a team of researchers at Carlos III University assembled a corpus of literature on drug-drug interactions to form a standardized test for such algorithms. Competitors were tested on their ability to accurately determine, from the text, which drugs were shown to interact and what the characteristics of their interactions were. Researchers continue to use this corpus to standardize the measurement of the effectiveness of their algorithms.
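
The DDIExtraction systems themselves are trained statistical models; as a rough illustration of the task, the toy baseline below flags drug pairs that co-occur in a sentence containing an interaction cue word. The drug lexicon and cue list are hypothetical stand-ins for the curated vocabularies and sequence models real systems use:

```python
import re
from itertools import combinations

# Toy lexicon and cue words -- real systems use large curated drug
# vocabularies and trained models, not a hand-written co-occurrence rule.
DRUGS = {"warfarin", "aspirin", "ibuprofen", "metformin"}
INTERACTION_CUES = {"increases", "inhibits", "potentiates", "reduces"}

def candidate_interactions(text):
    """Return drug pairs that co-occur in a sentence containing an
    interaction cue word: a crude baseline for DDI extraction."""
    pairs = set()
    for sentence in re.split(r"[.!?]", text.lower()):
        words = set(re.findall(r"[a-z]+", sentence))
        if words & INTERACTION_CUES:
            found = sorted(words & DRUGS)
            pairs.update(combinations(found, 2))
    return pairs

abstract = ("Aspirin potentiates the anticoagulant effect of warfarin. "
            "Metformin was well tolerated.")
print(candidate_interactions(abstract))  # {('aspirin', 'warfarin')}
```

Benchmarks like the DDIExtraction corpus exist precisely because a rule this crude produces many false positives; competing systems are scored on how much better they do.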

Other algorithms identify drug-drug interactions from patterns in user-generated content, especially electronic health records and/or adverse event reports. Organizations such as the FDA Adverse Event Reporting System (FAERS) and the World Health Organization's VigiBase allow doctors to submit reports of possible negative reactions to medications. Deep learning algorithms have been developed to parse these reports and detect patterns that imply drug-drug interactions.

Creation of New Drugs

DSP-1181, a drug molecule for the treatment of obsessive-compulsive disorder (OCD), was invented by artificial intelligence through the joint efforts of Exscientia (a British start-up) and Sumitomo Dainippon Pharma (a Japanese pharmaceutical firm). The drug's development took a single year, while pharmaceutical companies usually spend about five years on similar projects. DSP-1181 was accepted for a human trial.

Industry

One motive behind large health companies merging with other health companies is greater health data accessibility. Greater health data may allow for more implementation of AI algorithms.

A large part of industry focus of implementation of AI in the healthcare sector is in the clinical decision support systems. As the amount of data increases, AI decision support systems become more efficient. Numerous companies are exploring the possibilities of the incorporation of big data in the health care industry.

The following are examples of large companies that have contributed to AI algorithms for use in healthcare.

IBM

IBM's Watson Oncology is in development at Memorial Sloan Kettering Cancer Center and Cleveland Clinic. IBM is also working with CVS Health on AI applications in chronic disease treatment and with Johnson & Johnson on analysis of scientific papers to find new connections for drug development. In May 2017, IBM and Rensselaer Polytechnic Institute began a joint project entitled Health Empowerment by Analytics, Learning and Semantics (HEALS), to explore using AI technology to enhance healthcare.

Microsoft

Microsoft's Hanover project, in partnership with Oregon Health & Science University's Knight Cancer Institute, analyzes medical research to predict the most effective cancer drug treatment options for patients. Other projects include medical image analysis of tumor progression and the development of programmable cells.

Google

Google's DeepMind platform is being used by the UK National Health Service to detect certain health risks through data collected via a mobile app. A second project with the NHS involves analysis of medical images collected from NHS patients to develop computer vision algorithms to detect cancerous tissues.

Tencent

Tencent is working on several medical systems and services. These include:
  • AI Medical Innovation System (AIMIS), an AI-powered diagnostic medical imaging service
  • WeChat Intelligent Healthcare
  • Tencent Doctorwork

Intel

Intel's venture capital arm Intel Capital recently invested in startup Lumiata which uses AI to identify at-risk patients and develop care options.

Startups

Kheiron Medical developed deep learning software to detect breast cancers in mammograms.

Fractal Analytics has incubated Qure.ai which focuses on using deep learning and AI to improve radiology and speed up the analysis of diagnostic x-rays.

Other

Digital consultant apps like Babylon Health's GP at Hand, Ada Health, AliHealth Doctor You, KareXpert and Your.MD use AI to give medical consultations based on personal medical history and common medical knowledge. Users report their symptoms to the app, which uses speech recognition to compare them against a database of illnesses. Babylon then offers a recommended action, taking into account the user's medical history. Entrepreneurs in healthcare have been effectively using seven business model archetypes to take AI solutions to the marketplace. These archetypes depend on the value generated for the target user (e.g. patient focus vs. healthcare provider and payer focus) and value-capturing mechanisms (e.g. providing information or connecting stakeholders).

IFlytek launched a service robot “Xiao Man”, which integrated artificial intelligence technology to identify the registered customer and provide personalized recommendations in medical areas. It also works in the field of medical imaging. Similar robots are also being made by companies such as UBTECH ("Cruzr") and Softbank Robotics ("Pepper").

Implications

The use of AI is predicted to decrease medical costs as there will be more accuracy in diagnosis and better predictions in the treatment plan as well as more prevention of disease.

Other future uses for AI include brain-computer interfaces (BCI), which are predicted to help those who have trouble moving or speaking or who have a spinal cord injury. The BCIs will use AI to help these patients move and communicate by decoding neural activity.

As technology evolves and is implemented in more workplaces, many fear that their jobs will be replaced by robots or machines. The U.S. News staff (2018) writes that in the near future, doctors who utilize AI will “win out” over doctors who don't. AI will not replace healthcare workers; instead, it will allow them more time for bedside care. AI may avert healthcare worker burnout and cognitive overload. Overall, as Quan-Haase (2018) says, technology “extends to the accomplishment of societal goals, including higher levels of security, better means of communication over time and space, improved health care, and increased autonomy” (p. 43). As we adapt AI into our practice, we can enhance the care we give to our patients, resulting in better outcomes for all.

Expanding care to developing nations

With an increase in the use of AI, more care may become available to those in developing nations. As AI's abilities continue to expand, including its ability to interpret radiology, it may be able to diagnose more people with fewer doctors, addressing the physician shortage in many of these nations. AI could also be used to teach practitioners around the world, leading to improved treatment and eventually greater global health. Using AI in developing nations that do not have the resources will diminish the need for outsourcing, and AI can be used directly to improve patient care. For example, natural language processing and machine learning are being used to guide cancer treatments in places such as Thailand, China, and India. Researchers trained an AI application to use NLP to mine patient records and propose treatment. The ultimate decision made by the AI application agreed with expert decisions 90% of the time.

Regulation

While research on the use of AI in healthcare aims to validate its efficacy in improving patient outcomes before its broader adoption, its use may nonetheless introduce several new types of risk to patients and healthcare providers, such as algorithmic bias, do-not-resuscitate implications, and other machine morality issues. These challenges of the clinical use of AI have brought about a potential need for regulation.

Currently no regulations exist specifically for the use of AI in healthcare. In May 2016, the White House announced its plan to host a series of workshops and formation of the National Science and Technology Council (NSTC) Subcommittee on Machine Learning and Artificial Intelligence. In October 2016, the group published The National Artificial Intelligence Research and Development Strategic Plan, outlining its proposed priorities for Federally-funded AI research and development (within government and academia). The report notes a strategic R&D plan for the subfield of health information technology is in development stages.

The only agency that has expressed concern is the FDA. Bakul Patel, the Associate Center Director for Digital Health at the FDA, said in May 2017:
“We're trying to get people who have hands-on development experience with a product's full life cycle. We already have some scientists who know artificial intelligence and machine learning, but we want complementary people who can look forward and see how this technology will evolve.”
The joint ITU-WHO Focus Group on Artificial Intelligence for Health has built a platform for the testing and benchmarking of AI applications in the health domain. As of November 2018, eight use cases were being benchmarked, including assessing breast cancer risk from histopathological imagery, guiding anti-venom selection from snake images, and diagnosing skin lesions.

Monday, July 13, 2020

Command center

From Wikipedia, the free encyclopedia
 
War Room at Stevns Fortress used in Denmark during the Cold War
 
A command center (often called a war room) is any place that is used to provide centralized command for some purpose.

While frequently considered to be a military facility, these can be used in many other cases by governments or businesses. The term "war room" is also often used in politics to refer to teams of communications people who monitor and listen to the media and the public, respond to inquiries, and synthesize opinions to determine the best course of action.

If all functions of a command center are located in a single room, this is often referred to as a control room.

A command center enables an organization to function as designed and to perform day-to-day operations regardless of what is happening around it, in a manner in which no one realizes it is there but everyone knows who is in charge when there is trouble.

Conceptually, a command center is a source of leadership and guidance to ensure that service and order is maintained, rather than an information center or help desk. Its tasks are achieved by monitoring the environment and reacting to events, from the relatively harmless to a major crisis, using predefined procedures.

Types of command centers

There are many types of command centers. They include:
Data center management
Oversees the central management and operating control of the computer systems that are essential to most businesses, usually housed in data centers and large computer rooms.
Business application management
Ensures applications that are critical to customers and businesses are always available and working as designed.
Civil management
Oversees the central management and control of civil operational functions. Staff members in those centers monitor the metropolitan environment to ensure the safety of people and the proper operation of critical government services, adjusting services as required and ensuring proper constant movement.
Emergency (crisis) management
Directs people, resources, and information, and controls events to avert a crisis/emergency and minimize/avoid impacts should an incident occur.
19th century War Room of the United States Navy

Types of command and control rooms and their responsibilities

  • Command Center (CC or ICC)
    • Data center, computer system, incident response
  • Network Operation Centers (NOC)
    • Network equipment and activity
  • Tactical Operation Centers (TOC)
    • Military operations
    • Police and intelligence
  • Security Operation Centers (SOC)
    • Security agencies
    • Government agencies
    • Traffic management
    • CCTV
  • Emergency Operation Centers (EOC)
    • Emergency services
  • Combined Operation Centers (COS)
    • Air traffic control
    • Oil and gas
    • Control rooms
    • Broadcast
  • Audio Visual (AV)
    • Simulation and training
    • Medical
  • Social Media Command Center
    • Monitoring, posting and responding on social media sites

Military and government

A command center is a central place for carrying out orders and for supervising tasks, also known as a headquarters, or HQ.

Common to every command center are three general activities: inputs, processes, and outputs. The inbound aspect is communications (usually intelligence and other field reports). Inbound elements are "sitreps" (situation reports of what is happening) and "progreps" (progress reports relative to a goal that has been set) from the field back to the command element.

The process aspect involves a command element that makes decisions about what should be done about the input data. In the US military, the command element consists of a field-grade (Major to Colonel) or flag-grade (General) commissioned officer with one or more advisers. The outbound communications then deliver command decisions (i.e., operating orders) to the field elements.

Command centers should not be confused with the high-level military formation of a Command. As with any formation, Commands may be controlled from a command center; however, not all formations controlled from a command center are Commands.

Examples

Canada

During the Cold War, the Government of Canada undertook the construction of "Emergency Government Headquarters", to be used in the event of nuclear warfare or other large-scale disaster. Canada was generally allied with the United States for the duration of the war, was a founding member of NATO, allowed American cruise missiles to be tested in the far north, and flew sovereignty missions in the Arctic.

For these reasons, the country was often seen as being a potential target of the Soviets at the height of nuclear tensions in the 1960s. Extensive post-attack plans were drawn up for use in emergencies, and fallout shelters were built all across the country for use as command centres for governments of all levels, the Canadian Forces, and rescue personnel, such as fire services. 

Different levels of command centres included:
  • CEGF, Central Emergency Government Facility, located in Carp, Ontario, near the National Capital Region. Designed for use by senior federal politicians and civil servants.
  • REGHQ, Regional Emergency Government Headquarters, of which there were seven, spread out across the country.
  • MEGHQ, Municipal Emergency Government Headquarters
  • ZEGHQ, Zone Emergency Government Headquarters, built within the basements of existing buildings, generally designed to hold around 70 staff.
  • RU, Relocation Unit, or CRU, Central Relocation Unit. Often bunkers built as redundant backups to REGHQs and MEGHQs were given the RU designation.

Serbia

Joint Operations Command (JOC) is the organizational unit of the Serbian Armed Forces directly subordinated to the General Staff of the Armed Forces. The main duty of the Command is to conduct operational command over the Armed Forces. The Operations Command has a flexible formation, which is expanded by the representatives of other organizational units of the General Staff, and, if there is a need, operational level commands. In peacetime, the commander of the Joint Operations Command is at the same time Deputy of Serbian Armed Forces General Staff.

United Kingdom

Constructed in 1938, the Cabinet War Rooms were used extensively by Sir Winston Churchill during the Second World War.

United States

NORAD Command Center at the Cheyenne Mountain Complex, Colorado

A Command and Control Center is a specialized type of command center operated by a government or municipal agency 24 hours a day, 7 days a week. Various branches of the U.S. military, such as the U.S. Coast Guard and the U.S. Navy, have command and control centers.

They are also common in many large correctional facilities. A Command and Control Center operates as the agency's dispatch center, surveillance monitoring center, coordination office, and alarm monitoring center all in one.

Command and control centers are not staffed by high-level officials but rather by highly skilled technical staff. When a serious incident occurs, the staff will notify the agency's higher-level officials.

In service businesses

A command center enables the real-time visibility and management of an entire service operation. Similar to an air traffic control center, a command center allows organizations to view the status of global service calls, service technicians, and service parts on a single screen. In addition, customer commitments or service level agreements (SLAs) that have been made can also be programmed into the command center and monitored to ensure all are met and customers are satisfied.

A command center is well suited for industries where coordinating field service (people, equipment, parts, and tools) is critical. Some examples:
  • Intel's security Command Center
  • Dell's Enterprise Command Center
  • NASA's Mission Control Houston Command Center for Space Shuttle and ISS
War rooms can also be used for defining strategies, or driving business intelligence efforts.

In popular culture

Model of the war room constructed for Stanley Kubrick's Dr. Strangelove.
 
The most famous war room in popular culture was the one depicted in the 1964 film Dr. Strangelove. War rooms were also seen in other films like Fail Safe and WarGames.

A command center is used as the headquarters of the Power Rangers in the Mighty Morphin Power Rangers television series.

Command centers are used in the game Command and Conquer: Generals to train Workers and Dozers, provide vision, and command special weapons.

Artificial intelligence in heavy industry

The goals of AI: a learning machine that can evolve and make its own decisions.

AI-driven systems can discover patterns and trends, discover inefficiencies, and predict future outcomes based on historical trends, which ultimately enables informed decision-making. As such, they are potentially beneficial for many industries, notably heavy industry.

While the application of artificial intelligence in heavy industry is still in its early stages, applications are likely to include optimization of asset management and operational performance, as well as identifying efficiencies and decreasing downtime.

Potential benefits

AI-driven machines ensure an easier manufacturing process, along with many other benefits, at each new stage of advancement. Technology creates new potential for task automation while increasing the intelligence of human and machine interaction. Some benefits of AI include directed automation, 24/7 production, safer operational environments, and reduced operating costs.

Directed automation

AI and robots can execute actions repeatedly without error and can design more capable production models by building automation solutions. They are also capable of eliminating human error and delivering superior levels of quality assurance on their own.

24/7 production

While humans must work in shifts to accommodate sleep and mealtimes, robots can keep a production line running continuously. Businesses can expand their production capabilities and meet higher demands for products from global customers due to boosted production from this round-the-clock work performance.

Safer operational environment

More AI means fewer human laborers performing dangerous and strenuous work. Logically speaking, with fewer humans and more robots performing activities associated with risk, the number of workplace accidents should dramatically decrease. It also offers a great opportunity for exploration because companies do not have to risk human life.

Condensed operating costs

With AI taking over day-to-day activities, a business will have considerably lower operating costs. Rather than employing humans to work in shifts, they could simply invest in AI. The only cost incurred would be from maintenance after the machinery is purchased and commissioned.

Environmental impacts

Self-driving cars are potentially beneficial to the environment. They can be programmed to navigate the most efficient route and reduce idle time, which could result in less fossil fuel consumption and greenhouse gas (GHG) emissions. The same could be said for heavy machinery used in heavy industry. AI can accurately follow a sequence of procedures repeatedly, whereas humans are prone to occasional errors.

Additional benefits of AI

AI and industrial automation have advanced considerably over the years. There has been an evolution of many new techniques and innovations, such as advances in sensors and the increase of computing capabilities. AI helps machines gather and extract data, identify patterns, adapt to new trends through machine intelligence, learning, and speech recognition. It also helps to make quick data-driven decisions, advance process effectiveness, minimize operational costs, facilitate product development, and enable extensive scalability.

Potential negatives

High cost

Though the cost has been decreasing in the past few years, individual development expenditures can still be as high as $300,000 for basic AI. Small businesses with a low capital investment may have difficulty generating the funds necessary to leverage AI. For larger companies, the price of AI may be higher, depending on how much AI is involved in the process. Because of these higher costs, the feasibility of leveraging AI becomes a challenge for many companies. Nevertheless, the cost of utilizing AI can be lower with the advent of open-source artificial intelligence software.

Reduced employment opportunities

Job opportunities will grow with the advent of AI; however, some jobs might be lost because AI would replace them. Any job that involves repetitive tasks is at risk of being replaced. In 2017, Gartner predicted that 500,000 jobs would be created because of AI, but also that up to 900,000 jobs could be lost because of it. These figures apply only to jobs within the United States.

AI decision-making

AI is only as intelligent as the individuals responsible for its initial programming. In 2014, an active shooter situation led to people calling Uber to escape the shooting and surrounding area. Instead of recognizing this as a dangerous situation, the algorithm Uber used saw a rise in demand and increased its prices. This type of situation can be dangerous in the heavy industry, where one mistake can cost lives or cause injury.

Environmental impacts

Only 20 percent of electronic waste was recycled in 2016, despite 67 nations having enacted e-waste legislation. Electronic waste is expected to reach 52.2 million tons in 2021. The manufacture of digital devices and other electronics, which goes hand-in-hand with AI development, is poised to damage the environment. In September 2015, the German car company Volkswagen was caught up in an international scandal: the software in its cars activated emission controls for nitrogen oxide gases (NOx gases) only when the cars were undergoing a sample test. Once the cars were on the road, the emission controls deactivated and NOx emissions increased up to 40 times. NOx gases are harmful because they cause significant health problems, including respiratory problems and asthma. Further studies have shown that the additional emissions could cause over 1,200 premature deaths in Europe and result in $2.4 million worth of lost productivity.

AI trained to act on environmental variables might have erroneous algorithms, which can lead to potentially negative effects on the environment. Algorithms trained on biased data will produce biased results; the COMPAS judicial decision-support system is one example of biased data producing unfair outcomes. When machines develop learning and decision-making abilities that are not explicitly coded by a programmer, mistakes can be hard to trace. As such, management and scrutiny of AI-based processes are essential.

Effects of AI in the manufacturing industry

Landing.ai, a startup founded by Andrew Ng, developed machine-vision tools that detect microscopic defects in products at resolutions well beyond human vision. These tools use machine-learning algorithms trained on small volumes of sample images. The computer not only 'sees' the errors but processes the information and learns from what it observes.

In 2014, China, Japan, the United States, the Republic of Korea and Germany together accounted for 70 percent of the total sales volume of robots. In the automotive industry, a sector with a particularly high degree of automation, Japan had the highest density of industrial robots in the world, at 1,414 per 10,000 employees.

Generative design is a new process born from artificial intelligence. Designers or engineers specify design goals (as well as material parameters, manufacturing methods, and cost constraints) in generative design software. The software explores the feasible permutations and generates design alternatives, using machine learning to learn from each iteration which designs work and which fail. The process has been described as effectively renting 50,000 computers in the cloud for an hour.

Artificial intelligence has gradually become widely adopted in the modern world. AI personal assistants such as Siri and Alexa trace their origins to military research projects dating back to 2003.

Fallout shelter

From Wikipedia, the free encyclopedia

A fallout shelter sign in the United States of America.
 
A fallout shelter is an enclosed space specially designed to protect occupants from radioactive debris or fallout resulting from a nuclear explosion. Many such shelters were constructed as civil defense measures during the Cold War.

During a nuclear explosion, matter vaporized in the resulting fireball is exposed to neutrons from the explosion, absorbs them, and becomes radioactive. When this material condenses in the rain, it forms dust and light sandy materials that resemble ground pumice. The fallout emits alpha and beta particles, as well as gamma rays.

Much of this highly radioactive material falls to earth, subjecting anything within the line of sight to radiation, becoming a significant hazard. A fallout shelter is designed to allow its occupants to minimize exposure to harmful fallout until radioactivity has decayed to a safer level.

History

Idealized American fallout shelter, around 1957.

North America

During the Cold War, many countries built fallout shelters for high-ranking government officials and crucial military facilities, such as Project Greek Island and the Cheyenne Mountain nuclear bunker in the United States and Canada's Emergency Government Headquarters. Plans were made, however, to use existing buildings with sturdy below-ground-level basements as makeshift fallout shelters. These buildings were placarded with the orange-yellow and black trefoil sign designed by United States Army Corps of Engineers director of administrative logistics support function Robert W. Blakeley in 1961.

The National Emergency Alarm Repeater (NEAR) program was developed in the United States in 1956 during the Cold War to supplement the existing siren warning systems and radio broadcasts in the event of a nuclear attack. The NEAR civilian alarm device was engineered and tested but the program was not viable and was terminated in 1967.

In the U.S. in September 1961, under the direction of Steuart L. Pittman, the federal government started the Community Fallout Shelter Program. A letter from President Kennedy advising the use of fallout shelters appeared in the September 1961 issue of Life magazine. Over the period 1961-1963, there was a growth in home fallout shelter sales, but eventually there was a public backlash against the fallout shelter as a consumer product.

In November 1961, Fortune magazine ran an article by Gilbert Burck outlining the plans of Nelson Rockefeller, Edward Teller, Herman Kahn, and Chet Holifield for an enormous network of concrete-lined underground fallout shelters throughout the United States, sufficient to shelter millions of people as a refuge in case of nuclear war.

The United States ended federal funding for the shelters in the 1970s. In 2017, New York City began removing the yellow signs since members of the public are unlikely to find viable food and medicine inside those rooms.

Europe

Similar projects have been undertaken in Finland, which requires all buildings with an area of over 600 m² to have an NBC (nuclear-biological-chemical) shelter, and Norway, which requires all buildings with an area of over 1000 m² to have a shelter.

The former Soviet Union and other Eastern Bloc countries often designed their underground mass-transit and subway tunnels to serve as bomb and fallout shelters in the event of an attack.

Germany has protected shelters for 3% of its population, Austria for 30%, Finland for 70%, Sweden for 81%, and Switzerland for 114%.

Switzerland

The Sonnenberg Tunnel, in Switzerland, was the world's largest civilian nuclear fallout shelter, designed to protect 20,000 civilians in the eventuality of war or disaster (civil defense function abandoned in 2006).
 
Switzerland built an extensive network of fallout shelters, not only through extra hardening of government buildings such as schools, but also through a building regulation requiring nuclear shelters in residential buildings since the 1960s (the first legal basis in this sense dates from 4 October 1963). Later, the law ensured that all residential buildings built after 1978 contained a nuclear shelter able to withstand a blast from a 12-megaton explosion at a distance of 700 metres. The Federal Law on the Protection of the Population and Civil Protection still requires that every inhabitant should have a place in a shelter close to where they live.

The Swiss authorities maintained large communal shelters (such as the Sonnenberg Tunnel until 2006) stocked with over four months of food and fuel. The reference Nuclear War Survival Skills declared that, as of 1986, "Switzerland has the best civil defense system, one that already includes blast shelters for over 85% of all its citizens." As of 2006, there were about 300,000 shelters built in private homes, institutions and hospitals, as well as 5,100 public shelters for a total of 8.6 million places, a level of coverage equal to 114% of the population.

In Switzerland, most residential shelters are no longer stocked with the food and water required for prolonged habitation and a large number have been converted by the owners to other uses (e.g., wine cellars, ski rooms, gyms). But the owner still has the obligation to ensure the maintenance of the shelter.

Details of shelter construction

Door of a public fallout shelter in Switzerland (2014).
 
Large fire door, sealing a fallout and air raid shelter inside the basement parking garage of a hotel in Germany.

Shielding

A basic fallout shelter consists of shielding that reduces gamma ray exposure by a factor of roughly 1000. The required shielding can be accomplished with 10 halving-thicknesses of any material, where a halving-thickness is the amount of that material that cuts gamma ray exposure in half. Thicknesses that reduce gamma ray intensity by 50% include 1 cm (0.4 inch) of lead, 6 cm (2.4 inches) of concrete, 9 cm (3.6 inches) of packed earth, and 150 m (500 ft) of air. When multiple halving-thicknesses are stacked, their effects multiply. Thus, a practical fallout shield is ten halving-thicknesses of packed earth (about 90 cm), reducing gamma rays by approximately a factor of 1024 (2^10).
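The halving-thickness arithmetic above can be sketched in a few lines. The material names and centimetre values below are taken from the figures in this section; treating them as exact constants is an illustrative simplification.

```python
# Approximate halving thicknesses (cm) from the text above.
HALVING_THICKNESS_CM = {
    "lead": 1.0,
    "concrete": 6.0,
    "packed_earth": 9.0,
}

def protection_factor(material: str, thickness_cm: float) -> float:
    """Factor by which gamma ray exposure is reduced by the given shield.

    Each halving-thickness cuts exposure in half, and stacked layers
    multiply, so the factor is 2 raised to the number of halvings.
    """
    halvings = thickness_cm / HALVING_THICKNESS_CM[material]
    return 2 ** halvings

# Ten halving-thicknesses of packed earth (90 cm) give 2**10 = 1024,
# i.e. roughly the factor-of-1000 protection a basic shelter needs.
print(protection_factor("packed_earth", 90))  # 1024.0
print(protection_factor("concrete", 60))      # 1024.0
```

The same function shows why lead is compact but not special: 10 cm of lead and 90 cm of packed earth yield the same protection factor.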

Usually, an expedient purpose-built fallout shelter is a trench with a strong roof buried under 1 m (3 ft) of earth. The two ends of the trench have ramps or entrances at right angles to the trench, so that gamma rays cannot enter (they can travel only in straight lines). To make the overburden waterproof in case of rain, a plastic sheet may be buried a few inches below the surface and held down with rocks or bricks.

Blast doors are designed to absorb the shock wave of a nuclear blast, bending and then returning to their original shape.

Climate control

Dry earth is a reasonably good thermal insulator, and over several weeks of habitation a shelter can become dangerously hot. The simplest effective fan for cooling a shelter is a wide, heavy frame with flaps, hung from hinges on the ceiling of the shelter's doorway. The flaps open in one direction and close in the other, pumping air. (This is a Kearny air pump, or KAP, named after its inventor, Cresson Kearny.)

Unfiltered air is safe, since the most dangerous fallout has the consistency of sand or finely ground pumice. Such large particles are not easily ingested into the soft tissues of the body, so extensive filters are not required. Any exposure to fine dust is far less hazardous than exposure to the fallout outside the shelter. Dust fine enough to pass the entrance will probably pass through the shelter. Some shelters, however, incorporate NBC-filters for additional protection.

Locations

Effective public shelters can be the middle floors of some tall buildings or parking structures, or below ground level in most buildings with more than 10 floors. The thickness of the upper floors must form an effective shield, and the windows of the sheltered area must not view fallout-covered ground that is closer than 1.5 km (1 mi). One of Switzerland's solutions is to use road tunnels passing through the mountains, with some of these shelters being able to protect tens of thousands.

Fallout shelters are not always underground. Above ground buildings with walls and roofs dense enough to afford a meaningful protection factor can be used as a fallout shelter.

Contents

A battery-powered radio may be helpful to get reports of fallout patterns and clearance. However, radio and other electronic equipment may be disabled by electromagnetic pulse. For example, even at the height of the Cold War, EMP protection had been completed for only 125 of the approximately 2,771 radio stations in the United States Emergency Broadcast System. Also, only 110 of 3,000 existing Emergency Operating Centers had been protected against EMP effects. The Emergency Broadcast System has since been supplanted in the United States by the Emergency Alert System.

The reference Nuclear War Survival Skills includes the following supplies in a list of "Minimum Pre-Crisis Preparations": one or more shovels, a pick, a bow-saw with an extra blade, a hammer, and 4-mil polyethylene film (also any necessary nails, wire, etc.); a homemade shelter-ventilating pump (a KAP); large containers for water; a plastic bottle of sodium hypochlorite bleach; one or two KFMs (Kearny fallout meters) and the knowledge to operate them; at least a 2-week supply of compact, nonperishable food; an efficient portable stove; wooden matches in a waterproof container; essential containers and utensils for storing, transporting, and cooking food; a hose-vented 5-gallon can, with heavy plastic bags for liners, for use as a toilet; tampons; insect screen and fly bait; any special medications needed by family members; pure potassium iodide, a 2-oz bottle, and a medicine dropper; a first-aid kit and a tube of antibiotic ointment; long-burning candles (with small wicks) sufficient for at least 14 nights; an oil lamp; a flashlight and extra batteries; and a transistor radio with extra batteries and a metal box to protect it from electromagnetic pulse.

Inhabitants should have water on hand, 1-2 gallons per person per day. Water stored in bulk containers requires less space than water stored in smaller bottles.

Kearny fallout meter

Commercially made Geiger counters are expensive and require frequent calibration. It is possible to construct an electrometer-type radiation meter called the Kearny fallout meter, which does not require batteries or professional calibration, from properly-scaled plans with just a coffee can or pail, gypsum board, monofilament fishing line, and aluminum foil. Plans are freely available in the public domain in the reference Nuclear War Survival Skills by Cresson Kearny.

Use

Inhabitants should plan to remain sheltered for at least two weeks (with an hour out at the end of the first week – see Swiss Civil Defense guidelines), then work outside for gradually increasing amounts of time, to four hours a day at three weeks. The normal work is to sweep or wash fallout into shallow trenches to decontaminate the area. They should sleep in a shelter for several months. Evacuation at three weeks is recommended by official authorities.

If available, inhabitants may take potassium iodide at the rate of 130 mg/day per adult (65 mg/day per child) as an additional measure to protect the thyroid gland from the uptake of dangerous radioactive iodine, a component of most fallout and reactor waste.

Relative abilities of three different types of ionizing radiation to penetrate solid matter.
The protection factors provided by 10 cm, 20 cm, and 30 cm of concrete shielding, where the source is the idealised Chernobyl fallout.
Calculated relative gamma dose rates from atomic bomb and Chernobyl fallout.

Different types of radiation emitted by fallout

Alpha (α)

In the vast majority of accidents, and in all atomic bomb blasts, the threat posed by the beta and gamma emitters in the fallout is greater than that posed by the alpha emitters. Alpha particles are identical to a helium-4 nucleus (two protons and two neutrons) and travel at speeds in excess of 5% of the speed of light. They have little penetrating power; most cannot penetrate human skin. Avoiding direct contact with fallout particles will prevent injury from alpha radiation.

Beta (β)

Beta radiation consists of high-speed electrons given off by some fallout. Most beta particles cannot penetrate more than about 10 feet (3 metres) of air, about 1/8 inch (3 millimetres) of water, wood, or human body tissue, or a sheet of aluminum foil. Avoiding direct contact with fallout particles will prevent most injuries from beta radiation.

The primary dangers associated with beta radiation are internal exposure from ingested fallout particles and beta burns from fallout particles no more than a few days old. Beta burns can result from contact with highly radioactive particles on bare skin; ordinary clothing separating fresh fallout particles from the skin can provide significant shielding.

Gamma (γ)

Gamma radiation penetrates further through matter than alpha or beta radiation. Most of the design of a typical fallout shelter is intended to protect against gamma rays. Gamma rays are better absorbed by materials with high atomic numbers and high density, although neither effect is important compared to the total mass per area in the path of the gamma ray. Thus, lead is only modestly better as a gamma shield than an equal mass of another shielding material such as aluminum, concrete, water or soil. 

Some gamma radiation from fallout will penetrate even the best shelters. However, the radiation dose received inside a shelter can be significantly reduced with proper shielding: ten halving-thicknesses of a given material reduce gamma exposure to less than 1/1000 of the unshielded exposure.

Weapons versus nuclear accident fallout

The bulk of the radioactivity in nuclear accident fallout is more long-lived than that in weapons fallout. A good table of the nuclides, such as that provided by the Korean Atomic Energy Research Institute, includes the fission yields of the different nuclides. From this data it is possible to calculate the isotopic mixture in the fallout (due to fission products in bomb fallout).
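The calculation described above can be sketched from fission yields and half-lives: the activity of each nuclide is its decay constant times the number of atoms remaining. The yields and half-lives below are approximate illustrative values for thermal fission of U-235, not authoritative nuclear data; a real calculation would use a full nuclide table such as the one referenced above.

```python
import math

# name: (approximate cumulative fission yield, half-life in days)
NUCLIDES = {
    "I-131":  (0.029, 8.0),
    "Sr-90":  (0.058, 28.8 * 365.25),
    "Cs-137": (0.062, 30.1 * 365.25),
}

def relative_activity(days_after: float) -> dict:
    """Fractional share of total activity per nuclide at a given time.

    Activity = decay constant x atoms remaining; atoms remaining decay
    exponentially from the initial fission yield.
    """
    activities = {}
    for name, (yield_frac, half_life_days) in NUCLIDES.items():
        lam = math.log(2) / half_life_days            # decay constant (1/day)
        remaining = yield_frac * math.exp(-lam * days_after)
        activities[name] = lam * remaining
    total = sum(activities.values())
    return {name: a / total for name, a in activities.items()}

# Shortly after fission, short-lived I-131 dominates the activity;
# a year later the long-lived Cs-137 and Sr-90 take over.
print(relative_activity(1))
print(relative_activity(365))
```

This illustrates the section's point: short-lived nuclides dominate early weapon fallout, while the long-lived fraction determines the hazard months and years later.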

Other matters and simple improvements

While a person's home may not be a purpose-made shelter, it could be thought of as one if measures are taken to improve the degree of fallout protection.

Measures to lower the beta dose

The main threat of beta radiation exposure comes from hot particles in contact with or close to the skin of a person. Also, swallowed or inhaled hot particles could cause beta burns. As it is important to avoid bringing hot particles into the shelter, one option is to remove one's outer clothing, or follow other decontamination procedures, on entry. Fallout particles will cease to be radioactive enough to cause beta burns within a few days following a nuclear explosion. The danger of gamma radiation will persist for far longer than the threat of beta burns in areas with heavy fallout exposure.

Measures to lower the gamma dose rate

The gamma dose rate due to the contamination brought into the shelter on the clothing of a person is likely to be small (by wartime standards) compared to gamma radiation that penetrates through the walls of the shelter. The following measures can be taken to reduce the amount of gamma radiation entering the shelter:
  • Roofs and gutters can be cleaned to lower the dose rate in the house.
  • The top inch of soil in the area near the house can be either removed or dug up and mixed with the subsoil. This reduces the dose rate as the gamma rays have to pass through the topsoil before they can irradiate anything above.
  • Nearby roads can be rinsed and washed down to remove dust and debris; the fallout would collect in the sewers and gutters for easier disposal. In Kiev after the Chernobyl accident a program of road washing was used to control the spread of radioactivity.
  • Windows can be bricked up, or the sill raised to reduce the hole in the shielding formed by the wall.
  • Gaps in the shielding can be blocked using containers of water. While water has a much lower density than that of lead, it is still able to shield some gamma rays.
  • Earth (or other dense material) can be heaped up against the exposed walls of the building; this forces the gamma rays to pass through a thicker layer of shielding before entering the house.
  • Nearby trees can be removed to reduce the dose due to fallout which is on the branches and leaves. It has been suggested by the US government that a fallout shelter should not be dug close to trees for this reason.

Fallout shelters in popular culture

Robert W. Blakeley-designed fallout shelter sign used in the United States.
 
The international distinctive sign of civil defense personnel and infrastructures.

Fallout shelters feature prominently in the Robert A. Heinlein novel Farnham's Freehold (Heinlein built a fairly extensive shelter near his home in Colorado Springs in 1963), Pulling Through by Dean Ing, A Canticle for Leibowitz by Walter M. Miller, and Earth by David Brin.

The 1961 Twilight Zone episode "The Shelter", from a Rod Serling script, deals with the consequences of actually using a shelter. Another episode of the series, "One More Pallbearer", featured a fallout shelter owned by a millionaire. The 1985 adaptation of the series had an episode, "Shelter Skelter", that featured a fallout shelter.

In the Only Fools and Horses episode "The Russians are Coming", Derek Trotter buys a lead fallout shelter and decides to erect it for fear of an impending nuclear war with the Soviet Union (which still existed when the episode was made).

The 1999 film Blast from the Past is a romantic comedy about a nuclear physicist, his wife, and their son, who enter a well-equipped, spacious fallout shelter during the 1962 Cuban Missile Crisis. They do not emerge until 35 years later, in 1997; the film shows their reaction to contemporary society.

The Fallout series of computer games depicts the remains of human civilization after an immensely destructive global nuclear war; the United States of America had built underground vaults that were advertised to protect the population against a nuclear attack, but almost all of them were in fact meant to lure subjects for long-term human experimentation.

Paranoia, a role-playing game, takes place in a form of fallout shelter, which has become ruled by an insane computer.

The Metro 2033 book series by Russian author Dmitry Glukhovsky depicts survivors' life in the subway systems below Moscow and Saint Petersburg after a nuclear exchange between the Russian Federation and the United States of America.

Fallout shelters are often featured on the reality television show Doomsday Preppers.

The Silo series of novellas by Hugh Howey feature extensive fallout-style shelters that protect the inhabitants from an initially unknown disaster.

The Tomorrow Man centers on a reclusive man whose main preoccupations are tending to his in-home fallout shelter and the conspiracy theories that could put it to use.
