
Friday, July 27, 2018

Computer science

From Wikipedia, the free encyclopedia

[Article images: a large capital lambda; a plot of a quicksort algorithm; the Utah teapot, representing computer graphics; a Microsoft Tastenmaus mouse, representing human–computer interaction]
Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations.

Computer science is the study of the theory, experimentation, and engineering that form the basis for the design and use of computers. It is the scientific and practical approach to computation and its applications and the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to, information. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems.

Its fields can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory (which explores the fundamental properties of computational problems, including intractable ones), are highly abstract, while fields such as computer graphics emphasize real-world visual applications. Still other fields focus on challenges in implementing computation. For example, programming language theory considers various approaches to the description of computation, while the study of computer programming itself investigates various aspects of the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers and computations useful, usable, and universally accessible to humans.

History

Charles Babbage, sometimes referred to as the "father of computing".
Ada Lovelace is credited with writing the first algorithm intended for processing on a computer.
The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment.

Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623.[4] In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner.[5] He may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry[note 1] when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment.

Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine.[6] He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".[7] "A crucial step was the adoption of a punched card system derived from the Jacquard loom",[7] making it infinitely programmable.[note 2] In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first computer program.[8] Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business,[9] to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".[10]

During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.[11] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[12][13] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.[14] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.

Although many initially believed it impossible that computers themselves could be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.[15][16] It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704[17] and later the IBM 709[18] computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating […] if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[15] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.[16]

Time has seen significant improvements in the usability and effectiveness of computing technology.[19] Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals, to a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

Contributions

The German military used the Enigma machine (shown here) during World War II for communications they wanted kept secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.[20]

Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. In fact, along with electronics, it is a founding science of the current epoch of human history, called the Information Age, and a driver of the information revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750–1850 CE) and the Agricultural Revolution (8000–5000 BCE).


Etymology

Although first proposed in 1956,[16] the term "computer science" appears in a 1959 article in Communications of the ACM,[28] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the creation of Harvard Business School in 1921,[29] justifying the name by arguing that, like management science, the subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[28] His efforts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to create such programs, starting with Purdue in 1962.[30]

Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[31] Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy,[32] to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. An alternative term, also proposed by Naur, is data science; this is now used for a distinct field of data analysis, including statistics and databases.

Also, in the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[33] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[34] The term computics has also been suggested.[35] In Europe, terms derived from contracted translations of the expression "automatic information" (e.g. "informazione automatica" in Italian) or "information and mathematics" are often used, e.g. informatique (French), Informatik (German), informatica (Italian, Dutch), informática (Spanish, Portuguese), informatika (Slavic languages and Hungarian) or pliroforiki (πληροφορική, which means informatics) in Greek. Similar words have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[36] "In the U.S., however, informatics is linked with applied computing, or computing in the context of another domain."[37]

A folkloric quotation, often attributed to—but almost certainly not first formulated by—Edsger Dijkstra, states that "computer science is no more about computers than astronomy is about telescopes."[note 3] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research also often intersects other disciplines, such as philosophy, cognitive science, linguistics, mathematics, physics, biology, statistics, and logic.

Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science.[12] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel, Alan Turing, Rózsa Péter and Alonzo Church and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.[16]

The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined.[38] David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.[39]

The academic, political, and funding aspects of computer science tend to depend on whether a department is formed with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and a numerical orientation consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally if not across all research.

Philosophy

A number of computer scientists have argued for the distinction of three separate paradigms in computer science. Peter Wegner argued that those paradigms are science, technology, and mathematics.[40] Peter Denning's working group argued that they are theory, abstraction (modeling), and design.[41] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences, identifiable in some branches of artificial intelligence).[42]

Areas of computer science

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software.[43][44] CSAB, formerly called the Computing Sciences Accreditation Board (made up of representatives of the Association for Computing Machinery (ACM) and the IEEE Computer Society (IEEE CS)),[45] identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, human–computer interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science.[43]

Theoretical computer science

Theoretical computer science is mathematical and abstract in spirit, but it derives its motivation from practical and everyday computation. Its aim is to understand the nature of computation and, as a consequence of this understanding, provide more efficient methodologies. All studies related to mathematical, logical, and formal concepts and methods could be considered theoretical computer science, provided that the motivation is clearly drawn from the field of computing.

Data structures and algorithms

Data structures and algorithms is the study of commonly used computational methods and their computational efficiency.
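
As a rough illustration (added here, not part of the original article), here is a classic example from this area: quicksort, the algorithm plotted in the figure at the top of the page, sketched in Python. The function name and test data are invented for the example.

    def quicksort(items):
        # Sort a list by recursively partitioning around a pivot.
        # Average case O(n log n) comparisons; worst case O(n^2)
        # when the pivot repeatedly lands on an extreme element.
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        left = [x for x in items if x < pivot]
        middle = [x for x in items if x == pivot]
        right = [x for x in items if x > pivot]
        return quicksort(left) + middle + quicksort(right)

    print(quicksort([3, 6, 1, 8, 2, 9, 4]))  # [1, 2, 3, 4, 6, 8, 9]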

Theory of computation

According to Peter Denning, the fundamental question underlying computer science is, "What can be (efficiently) automated?"[12] Theory of computation is focused on answering fundamental questions about what can be computed and what resources are required to perform those computations. In an effort to answer the first question, computability theory examines which computational problems are solvable on various theoretical models of computation. The second question is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a multitude of computational problems. The famous P = NP? problem, one of the Millennium Prize Problems,[46] is an open problem in the theory of computation.
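
To make the asymmetry behind P = NP? concrete, here is a minimal sketch (an invented example using subset sum, a standard NP-complete problem): verifying a proposed answer takes polynomial time, while the naive search for one takes exponential time.

    from itertools import combinations

    def verify(numbers, certificate, target):
        # Checking a proposed solution is fast: polynomial time.
        return sum(certificate) == target and all(c in numbers for c in certificate)

    def search(numbers, target):
        # Finding a solution by brute force takes exponential time:
        # there are 2^n subsets to try.
        for r in range(len(numbers) + 1):
            for subset in combinations(numbers, r):
                if sum(subset) == target:
                    return subset
        return None

    print(search([3, 34, 4, 12, 5, 2], 9))          # (4, 5)
    print(verify([3, 34, 4, 12, 5, 2], (4, 5), 9))  # True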

Information and coding theory

Information theory is concerned with the quantification of information. It was developed by Claude Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.[47] Coding theory is the study of the properties of codes (systems for converting information from one form to another) and their fitness for a specific application. Codes are used for data compression, cryptography, error detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing efficient and reliable data transmission methods.
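
As a small illustrative sketch (not from the source text), Shannon's entropy H = -Σ p·log2(p) can be computed directly from symbol frequencies; it gives the minimum average number of bits per symbol needed to encode a message.

    import math
    from collections import Counter

    def shannon_entropy(message):
        # Bits per symbol: H = -sum(p * log2(p)) over symbol frequencies.
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(shannon_entropy("aaaa"))  # 0.0 -- no uncertainty
    print(shannon_entropy("abab"))  # 1.0 -- one bit per symbol
    print(shannon_entropy("abcd"))  # 2.0 -- two bits per symbol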

Programming language theory

Programming language theory is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It falls within the discipline of computer science, both depending on and affecting mathematics, software engineering, and linguistics. It is an active research area, with numerous dedicated academic journals.
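
A tiny sketch, added for illustration, of the kind of object this field studies: an evaluator that assigns meanings to programs in a miniature invented language of numeric literals, additions, and multiplications.

    # Abstract syntax: nested tuples ('num', n) | ('add', e1, e2) | ('mul', e1, e2)
    def evaluate(expr):
        # A denotational-style evaluator mapping syntax to values.
        tag = expr[0]
        if tag == 'num':
            return expr[1]
        if tag == 'add':
            return evaluate(expr[1]) + evaluate(expr[2])
        if tag == 'mul':
            return evaluate(expr[1]) * evaluate(expr[2])
        raise ValueError(f"unknown construct: {tag}")

    # (2 + 3) * 4
    print(evaluate(('mul', ('add', ('num', 2), ('num', 3)), ('num', 4))))  # 20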

Formal methods

Formal methods are a particular kind of mathematically based technique for the specification, development and verification of software and hardware systems. The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design. They form an important theoretical underpinning for software engineering, especially where safety or security is involved. Formal methods are a useful adjunct to software testing since they help avoid errors and can also give a framework for testing. For industrial use, tool support is required. However, the high cost of using formal methods means that they are usually only used in the development of high-integrity and life-critical systems, where safety or security is of utmost importance.

Formal methods are best described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types, to problems in software and hardware specification and verification.
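
As a minimal, informal illustration (added here, not from the source), one flavor of formal reasoning is stating a loop invariant and a postcondition and then checking the postcondition exhaustively over a finite domain; industrial tools such as model checkers and proof assistants mechanize and generalize this idea.

    def integer_sqrt(n):
        # Compute floor(sqrt(n)) by linear search, annotated with its invariant.
        r = 0
        while (r + 1) * (r + 1) <= n:
            # Loop invariant: r*r <= n
            r += 1
        # Postcondition: r*r <= n < (r+1)*(r+1)
        return r

    # Exhaustively verify the postcondition over a finite domain,
    # a tiny model-checking flavour of formal verification.
    for n in range(10000):
        r = integer_sqrt(n)
        assert r * r <= n < (r + 1) * (r + 1), n
    print("postcondition verified for 0 <= n < 10000")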

Computer systems

Computer architecture and computer engineering

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on how the central processing unit performs internally and accesses addresses in memory.[48] The field often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals.
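
The fetch-decode-execute cycle at the heart of this description can be sketched with a toy accumulator machine (an invented example for illustration only):

    # A toy accumulator machine: each instruction is (opcode, operand).
    # The loop below mirrors the fetch-decode-execute cycle of a CPU.
    def run(program, memory):
        acc = 0      # accumulator register
        pc = 0       # program counter
        while pc < len(program):
            op, arg = program[pc]              # fetch and decode
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "JMPZ" and acc == 0:    # conditional branch
                pc = arg
        return memory

    mem = {0: 2, 1: 3, 2: 0}
    run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], mem)
    print(mem[2])  # 5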

Computer performance analysis

Computer performance analysis is the study of work flowing through computers with the general goals of improving throughput, controlling response time, using resources efficiently, eliminating bottlenecks, and predicting performance under anticipated peak loads.[49]
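
A worked example (added for illustration): Little's law, a basic identity of performance analysis, relates the average number of jobs in a stable system L to the arrival rate λ and the mean residence time W by L = λW. The workload numbers below are assumptions.

    # Little's law: L = lambda * W
    # (mean jobs in system = arrival rate * mean time in system).
    arrival_rate = 120.0    # requests per second (assumed workload)
    mean_residence = 0.25   # seconds a request spends in the system

    jobs_in_system = arrival_rate * mean_residence
    print(f"average concurrent requests: {jobs_in_system:.1f}")  # 30.0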

Concurrent, parallel and distributed systems

Concurrency is a property of systems in which several computations are executing simultaneously and potentially interacting with each other. A number of mathematical models have been developed for general concurrent computation, including Petri nets, process calculi, and the Parallel Random Access Machine model. A distributed system extends the idea of concurrency onto multiple computers connected through a network. Computers within the same distributed system have their own private memory and exchange information to achieve a common goal.
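
A minimal sketch of concurrent interaction (an invented example, not from the source): two Python threads cooperating through a queue in the classic producer/consumer pattern.

    import threading
    import queue

    # The Queue mediates all interaction, so the two threads
    # never touch shared state directly.
    tasks = queue.Queue()

    def producer():
        for i in range(5):
            tasks.put(i)
        tasks.put(None)  # sentinel: no more work

    def consumer():
        while True:
            item = tasks.get()
            if item is None:
                break
            print(f"processed {item * item}")

    threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()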

Computer networks

This branch of computer science concerns the design and management of the networks that connect computers worldwide.
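
As a small sketch of the field's basic building block (added for illustration), the following Python program opens a TCP connection on localhost and echoes a message back; the message and structure are invented for the example.

    import socket
    import threading

    # A minimal TCP echo exchange on localhost: one thread accepts a
    # connection and echoes bytes back, the main thread acts as the client.
    server = socket.socket()
    server.bind(("127.0.0.1", 0))    # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    def echo_once():
        conn, _ = server.accept()
        conn.sendall(conn.recv(1024))  # echo the request back
        conn.close()

    threading.Thread(target=echo_once).start()

    client = socket.socket()
    client.connect(("127.0.0.1", port))
    client.sendall(b"hello, network")
    print(client.recv(1024).decode())  # hello, network
    client.close()
    server.close()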

Computer security and cryptography

Computer security is a branch of computer technology whose objective includes the protection of information from unauthorized access, disruption, or modification while maintaining the accessibility and usability of the system for its intended users. Cryptography is the practice and study of concealing (encrypting) and recovering (decrypting) information. Modern cryptography is closely tied to computer science, because the security of many encryption and decryption algorithms rests on the computational complexity of the problems an attacker would have to solve.
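
A toy sketch of this complexity-based security (added for illustration): Diffie-Hellman key exchange, where modular exponentiation is cheap but inverting it (the discrete logarithm) is believed intractable for large primes. The tiny numbers below are illustrative assumptions only and offer no real security.

    # Public parameters: a prime modulus and a generator.
    p, g = 23, 5

    alice_secret, bob_secret = 6, 15
    alice_public = pow(g, alice_secret, p)  # g^a mod p
    bob_public = pow(g, bob_secret, p)      # g^b mod p

    # Each side combines its own secret with the other's public value;
    # both arrive at g^(a*b) mod p without ever transmitting a secret.
    alice_key = pow(bob_public, alice_secret, p)
    bob_key = pow(alice_public, bob_secret, p)
    assert alice_key == bob_key
    print("shared key:", alice_key)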

Databases

A database is intended to organize, store, and retrieve large amounts of data easily. Digital databases are managed using database management systems to store, create, maintain, and search data, through database models and query languages.
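
A minimal illustration (not from the source text) using SQLite, with an invented table and data: the database models the data as relations, and the query language (SQL) expresses what to retrieve.

    import sqlite3

    # An in-memory relational database: a table, two inserts, and a query.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE books (title TEXT, year INTEGER)")
    db.executemany("INSERT INTO books VALUES (?, ?)",
                   [("On Computable Numbers", 1936),
                    ("A Mathematical Theory of Communication", 1948)])

    for title, year in db.execute("SELECT title, year FROM books ORDER BY year"):
        print(year, title)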

Computer applications

Computer graphics and visualization

Computer graphics is the study of digital visual contents, and involves synthesis and manipulation of image data. The study is connected to many other fields in computer science, including computer vision, image processing, and computational geometry, and is heavily applied in the fields of special effects and video games.
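
At its most elementary, image synthesis means computing a color for every pixel. The sketch below (an invented example) writes a simple gradient as a plain-text PPM image.

    # Compute a colour per pixel and write the result as a PPM file.
    WIDTH, HEIGHT = 64, 64

    with open("gradient.ppm", "w") as f:
        f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")
        for y in range(HEIGHT):
            for x in range(WIDTH):
                r = x * 255 // (WIDTH - 1)    # red increases left to right
                g = y * 255 // (HEIGHT - 1)   # green increases top to bottom
                f.write(f"{r} {g} 128 ")
            f.write("\n")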

Human–computer interaction

Human–computer interaction research develops theories, principles, and guidelines for user interface designers, so they can create satisfactory user experiences with desktop, laptop, and mobile devices.

Scientific computing

Scientific computing (or computational science) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. In practical use, it is typically the application of computer simulation and other forms of computation to problems in various scientific disciplines.
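
A miniature example of the simulation loop at the core of scientific computing (added for illustration): integrating the decay equation dy/dt = -ky with Euler's method and comparing against the exact solution y(t) = e^(-kt). The constants are assumptions.

    import math

    # Integrate dy/dt = -k*y from t=0 to t=2 with Euler steps.
    k, y, t, dt = 0.5, 1.0, 0.0, 0.01
    while t < 2.0:
        y += dt * (-k * y)   # Euler step: y(t+dt) ~ y(t) + dt * f(t, y)
        t += dt

    print(f"numerical: {y:.4f}")                   # ~0.367
    print(f"exact:     {math.exp(-k * 2.0):.4f}")  # 0.3679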

Artificial intelligence

Artificial intelligence (AI) aims to synthesize goal-oriented processes such as problem-solving, decision-making, environmental adaptation, learning, and communication, as found in humans and animals. From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular mind with robotic development, but the main field of practical application has been as an embedded component in areas of software development that require computational understanding. The starting point in the late 1940s was Alan Turing's question "Can computers think?", and the question remains effectively unanswered, although the Turing test is still used to assess computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.

Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of high quality, affordable, maintainable, and fast to build. It is a systematic approach to software design, involving the application of engineering practices to software. Software engineering deals with the organizing and analyzing of software; it deals not only with the creation of new software but also with its internal maintenance and arrangement. Both computer applications software engineers and computer systems software engineers are projected to be among the fastest growing occupations from 2008 to 2018.

The great insights of computer science

The philosopher of computing Bill Rapaport noted three Great Insights of Computer Science:[50]
  • The representation insight: all the information about any computable problem can be represented using only 0 and 1 (or any other bistable pair that can flip-flop between two easily distinguishable states, such as "on/off", "magnetized/de-magnetized", or "high-voltage/low-voltage").
  • Alan Turing's insight: there are only five actions that a computer has to perform in order to do "anything".
Every algorithm can be expressed in a language for a computer consisting of only five basic instructions:
  • move left one location;
  • move right one location;
  • read symbol at current location;
  • print 0 at current location;
  • print 1 at current location.
  • Corrado Böhm and Giuseppe Jacopini's insight: there are only three ways of combining these actions (into more complex ones) that are needed in order for a computer to do "anything".
Only three rules are needed to combine any set of basic instructions into more complex ones:
  • sequence: first do this, then do that;
  • selection: IF such-and-such is the case, THEN do this, ELSE do that;
  • repetition: WHILE such-and-such is the case DO this.
Note that the three rules of Böhm and Jacopini's insight can be further simplified with the use of goto (which means that it is more elementary than structured programming). A minimal sketch combining these primitives appears below.
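
The following sketch (added for illustration, not Rapaport's own formulation) puts the insights into code: a tiny Turing-machine interpreter whose inner loop uses only the five primitive actions, combined by sequence, selection, and repetition. The rule table and input are invented.

    from collections import defaultdict

    def run_turing_machine(rules, tape, state="start"):
        # Rules map (state, symbol) -> (write, move, next_state). Only the
        # five primitive actions appear in the loop body below.
        cells = defaultdict(lambda: None, enumerate(tape))  # None = blank
        head = 0
        while state != "halt":                           # repetition
            symbol = cells[head]                         # read current location
            write, move, state = rules[(state, symbol)]  # selection
            cells[head] = write                          # print 0 / print 1
            head += 1 if move == "R" else -1             # move right / move left
        return [cells[i] for i in sorted(cells) if cells[i] is not None]

    # A machine that flips every bit of its input, then halts at the
    # first blank cell past the end.
    flip = {
        ("start", 0): (1, "R", "start"),
        ("start", 1): (0, "R", "start"),
        ("start", None): (None, "R", "halt"),
    }
    print(run_turing_machine(flip, [1, 0, 1, 1]))  # [0, 1, 0, 0]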

Academia

Conferences are important events for computer science research. During these conferences, researchers from the public and private sectors present their recent work and meet. Unlike in most other academic fields, in computer science, the prestige of conference papers is greater than that of journal publications.[51][52] One proposed explanation for this is the quick development of this relatively new field requires rapid review and distribution of results, a task better handled by conferences than by journals.[53]

Education

Since computer science is a relatively new field, it is not as widely taught in schools and universities as other academic subjects. For example, in 2014, Code.org estimated that only 10 percent of high schools in the United States offered computer science education.[54] A 2010 report by the Association for Computing Machinery (ACM) and the Computer Science Teachers Association (CSTA) revealed that only 14 out of 50 states had adopted significant education standards for high school computer science.[55] However, computer science education is growing.[56] Some countries, such as Israel, New Zealand, and South Korea, have already included computer science in their respective national secondary education curricula.[57][58] Several countries are following suit.[59]

In most countries, there is a significant gender gap in computer science education. For example, in the US about 20% of computer science degrees in 2012 were conferred to women.[60] This gender gap also exists in other Western countries.[61] However, in some parts of the world, the gap is small or nonexistent. In 2011, approximately half of all computer science degrees in Malaysia were conferred to women.[62] In 2001, women made up 54.5% of computer science graduates in Guyana.

Informatics

From Wikipedia, the free encyclopedia

Informatics is a branch of information engineering. It involves the practice of information processing and the engineering of information systems, and as an academic field it is an applied form of information science. The field considers the interaction between humans and information alongside the construction of interfaces, organisations, technologies and systems. As such, the field of informatics has great breadth and encompasses many subspecialties, including disciplines of computer science, information systems, information technology and statistics. Since the advent of computers, individuals and organizations increasingly process information digitally. This has led to the study of informatics with computational, mathematical, biological, cognitive and social aspects, including study of the social impact of information technologies.

Etymology

In 1956 the German computer scientist Karl Steinbuch coined the word Informatik by publishing a paper called Informatik: Automatische Informationsverarbeitung ("Informatics: Automatic Information Processing").[1] The English term Informatics is sometimes understood as meaning the same as computer science. The German word Informatik is usually translated to English as computer science.
The French term informatique was coined in 1962 by Philippe Dreyfus,[2] together with various translations: informatics (English), also proposed independently and simultaneously by Walter F. Bauer and associates, who co-founded Informatics Inc., and informatica (Italian, Spanish, Romanian, Portuguese, Dutch), all referring to the application of computers to store and process information.

The term was coined as a combination of "information" and "automatic" to describe the science of automating information interactions. The morphology—informat-ion + -ics—uses "the accepted form for names of sciences, as conics, linguistics, optics, or matters of practice, as economics, politics, tactics",[3] and so, linguistically, the meaning extends easily to encompass both the science of information and the practice of information processing.

History

The culture of library science, with its policies and procedures for managing information, fostered the relationship between library science and the development of information science, to the benefit of health informatics, whose development is traced to the 1950s and the beginning of computer use in healthcare (Nelson & Staggers, p. 4). Early practitioners interested in the field soon learned that there were no formal education programs to train them in the science of informatics until the late 1960s and early 1970s. Professional development then began to emerge, playing a significant role in the development of health informatics (Nelson & Staggers, p. 7). According to Imhoff et al. (2001), healthcare informatics is not only the application of computer technology to problems in healthcare but covers all aspects of the generation, handling, communication, storage, retrieval, management, analysis, discovery, and synthesis of data, information, and knowledge in the entire scope of healthcare. Furthermore, they stated that the primary goals of health informatics can be distinguished as follows: to provide solutions for problems related to data, information, and knowledge processing, and to study the general principles of processing data, information, and knowledge in medicine and healthcare.
This new term was adopted across Western Europe, and, except in English, developed a meaning roughly translated by the English ‘computer science’, or ‘computing science’. Mikhailov advocated the Russian term informatika (1966), and the English informatics (1967), as names for the theory of scientific information, and argued for a broader meaning, including study of the use of information technology in various communities (for example, scientific) and of the interaction of technology and human organizational structures.
Informatics is the discipline of science which investigates the structure and properties (not specific content) of scientific information, as well as the regularities of scientific information activity, its theory, history, methodology and organization.[4]
Usage has since modified this definition in three ways. First, the restriction to scientific information is removed, as in business informatics or legal informatics. Second, since most information is now digitally stored, computation is now central to informatics. Third, the representation, processing and communication of information are added as objects of investigation, since they have been recognized as fundamental to any scientific account of information. Taking information as the central focus of study distinguishes informatics from computer science. Informatics includes the study of biological and social mechanisms of information processing, whereas computer science focuses on digital computation. Similarly, in the study of representation and communication, informatics is indifferent to the substrate that carries information. For example, it encompasses the study of communication using gesture, speech and language, as well as digital communications and networking.

In the English-speaking world the term informatics was first widely used in the compound medical informatics, taken to include "the cognitive, information processing, and communication tasks of medical practice, education, and research, including information science and the technology to support these tasks".[5] Many such compounds are now in use; they can be viewed as different areas of "applied informatics". Indeed, "In the U.S., however, informatics is linked with applied computing, or computing in the context of another domain."[6]

Informatics encompasses the study of systems that represent, process, and communicate information. However, the theory of computation in the specific discipline of theoretical computer science, which evolved from the work of Alan Turing, studies the notion of a complex system regardless of whether or not information actually exists. Since both fields process information, there is some disagreement among scientists as to field hierarchy; for example, Arizona State University attempted to adopt a broader definition of informatics, even encompassing cognitive science, at the launch of its School of Computing and Informatics in September 2006.

A broad interpretation of informatics, as "the study of the structure, algorithms, behaviour, and interactions of natural and artificial computational systems," was introduced by the University of Edinburgh in 1994 when it formed the grouping that is now its School of Informatics. This meaning is now (2006) increasingly used in the United Kingdom.[7]

The 2008 Research Assessment Exercise, of the UK Funding Councils, includes a new, Computer Science and Informatics, unit of assessment (UoA),[8] whose scope is described as follows:
The UoA includes the study of methods for acquiring, storing, processing, communicating and reasoning about information, and the role of interactivity in natural and artificial systems, through the implementation, organisation and use of computer hardware, software and other resources. The subjects are characterised by the rigorous application of analysis, experimentation and design.

Academic schools and departments

Academic research in the informatics area can be found in a number of disciplines such as computer science, information technology, Information and Computer Science, information systems, business information management and health informatics.

In France, the first degree level qualifications in Informatics (computer science) appeared in the mid-1960s.[citation needed]

In English-speaking countries, the first example of a degree-level qualification in Informatics occurred in 1982, when Plymouth Polytechnic (now the University of Plymouth) offered a four-year BSc (Honours) degree in Computing and Informatics, with an initial intake of only 35 students. The course still runs today,[9] making it the longest available qualification in the subject.

At the Indiana University School of Informatics, Computing, and Engineering (Bloomington, Indianapolis and Southeast), informatics is defined as "the art, science and human dimensions of information technology" and "the study, application, and social consequences of technology." It is also defined in Informatics 101, Introduction to Informatics as "the application of information technology to the arts, sciences, and professions." These definitions are widely accepted in the United States, and differ from British usage in omitting the study of natural computation.

Texas Woman's University places its informatics degrees in its department of Mathematics and Computer Science within the College of Arts & Sciences, though it offers interdisciplinary Health Informatics degrees.[10] Informatics is presented in a generalist framework, as evidenced by their definition of informatics ("Using technology and data analytics to derive meaningful information from data for data and decision driven practice in user centered systems"), though TWU is also known for its nursing and health informatics programs.

At the University of California, Irvine Department of Informatics, informatics is defined as "the interdisciplinary study of the design, application, use and impact of information technology. The discipline of informatics is based on the recognition that the design of this technology is not solely a technical matter, but must focus on the relationship between the technology and its use in real-world settings. That is, informatics designs solutions in context, and takes into account the social, cultural and organizational settings in which computing and information technology will be used."

At the University of Michigan, Ann Arbor, in the interdisciplinary Informatics major, informatics is defined as "the study of information and the ways information is used by and affects human beings and social systems. The major involves coursework from the College of Literature, Science and the Arts, where the Informatics major is housed, as well as the School of Information and the College of Engineering. Key to this growing field is that it applies both technological and social perspectives to the study of information. Michigan's interdisciplinary approach to teaching Informatics gives a solid grounding in contemporary computer programming, mathematics, and statistics, combined with study of the ethical and social science aspects of complex information systems. Experts in the field help design new information technology tools for specific scientific, business, and cultural needs." Michigan offers four curricular tracks within the informatics degree to provide students with increased expertise. These four track topics include:[11]
  • Internet Informatics: An applied track in which students experiment with technologies behind Internet-based information systems and acquire skills to map problems to deployable Internet-based solutions. This track will replace Computational Informatics in Fall 2013.[12]
  • Data Mining & Information Analysis: Integrates the collection, analysis, and visualization of complex data and its critical role in research, business, and government to provide students with practical skills and a theoretical basis for approaching challenging data analysis problems.
  • Life Science Informatics: Examines artificial information systems, an approach that has helped scientists make great progress in identifying core components of organisms and ecosystems.
  • Social Computing: Advances in computing have created opportunities for studying patterns of social interaction and developing systems that act as introducers, recommenders, coordinators, and record-keepers. Students in this track craft, evaluate, and refine social software applications for engaging technology in unique social contexts. This track will be phased out in Fall 2013 in favor of the new Bachelor of Science in Information, the first undergraduate degree offered by the School of Information since its founding in 1996. The School of Information already offers a master's program, a doctorate program, and a professional master's program in conjunction with the School of Public Health. The BS in Information at the University of Michigan will be the first curriculum program of its kind in the United States, with the first graduating class to emerge in 2015. Students will be able to apply for this degree in 2013 for the Fall 2014 semester; the new degree will stem from the most popular track of the current interdisciplinary Informatics major in LSA, Social Computing. Applications will be open to juniors and seniors, with a variety of information classes available for first- and second-year students to gauge interest and value in this sector of study. The degree was approved by the University on June 11, 2012.[13] Along with the new degree in the School of Information, the first and only chapter of an informatics professional fraternity, Kappa Theta Pi, was chartered in Fall 2012.[14]
At the University of Washington, Seattle, Informatics is an undergraduate program offered by the Information School. The Bachelor of Science in Informatics is described as "[a] program that focuses on computer systems from a user-centered perspective and studies the structure, behavior and interactions of natural and artificial systems that store, process and communicate information. Includes instruction in information sciences, human computer interaction, information system analysis and design, telecommunications structure and information architecture and management." Washington offers four degree options as well as a custom track.[15]
  • Data Science Option: Data Science is an emerging interdisciplinary field that works to extract knowledge or insight from data. It combines fields such as information science, computer science, statistics, design, and social science.
  • Human-Computer Interaction: The iSchool's work in human-computer interaction (HCI) strives to make information and computing useful, usable, and accessible to all. The Informatics HCI option allows students to blend their technical skills and expertise with a broader perspective on how design and development work impacts users. Courses explore the design, construction, and evaluation of interactive technologies for use by individuals, groups, and organizations, and the social implications of these systems. This work encompasses user interfaces, accessibility concerns, new design techniques and methods for interactive systems and collaboration. Coursework also examines the values implicit in the design and development of technology.
  • Information Architecture: Information architecture (IA) is a crucial component in the development of successful Web sites, software, intranets, and online communities. Architects structure the underlying information and its presentation in a logical and intuitive way so that people can put information to use. As an Informatics major with an IA option, one will master the skills needed to organize and label information for improved navigation and search. One will build frameworks to effectively collect, store and deliver information. One will also learn to design the databases and XML storehouses that drive complex and interactive websites, including the navigation, content layout, personalization, and transactional features of the site.
  • Information Assurance and Cybersecurity: Information Assurance and Cybersecurity (IAC) is the practice of creating and managing safe and secure systems. It is crucial for organizations public and private, large and small. In the IAC option, one will be equipped with the knowledge to create, deploy, use, and manage systems that preserve individual and organizational privacy and security. This tri-campus concentration leverages the strengths of the Information School, the Computing and Software Systems program at UW Bothell, and the Institute of Technology at UW Tacoma. After a course in the technical, policy, and management foundations of IAC, one may take electives at any campus to learn such specialties as information assurance policy, secure coding, or networking and systems administration.
  • Custom (Student-Designed Concentration): Students may choose to develop their own concentration, with approval from the academic adviser. Student-designed concentrations are created out of a list of approved courses and also result in the Bachelor of Science degree.

Applied disciplines

Organizational informatics

One of the most significant areas of application of informatics is organizational informatics. Organizational informatics is fundamentally interested in the application of information, information systems, and ICT within organisations of various forms, including private sector, public sector, and voluntary sector organisations.[16][17] As such, organisational informatics can be seen as a sub-category of social informatics and a super-category of business informatics. Organizational informatics is also present in the computer science and information technology industries.

Discovering new drugs and materials by ‘touching’ molecules in virtual reality

Scientists can now visualize and experiment with the structure and dynamics of complex molecules (at atomic-level precision), with real-time multi-user collaboration via the cloud
July 6, 2018
Original link:  http://www.kurzweilai.net/discovering-new-drugs-and-materials-by-touching-molecules-in-virtual-reality
To figure out how to block a bacteria’s attempt to create multi-resistance to antibiotics, a researcher grabs a simulated ligand (binding molecule) — a type of penicillin called benzylpenicillin (red) — and interactively guides that molecule to dock within a larger enzyme molecule (blue-orange) called β-lactamase, which is produced by bacteria in an attempt to disable penicillin (making a patient resistant to a class of antibiotics called β-lactam). (credit: University of Bristol)

University of Bristol researchers have designed and tested a new virtual reality (VR) cloud-based system intended to allow researchers to reach out and “touch” molecules as they move: folding them, knotting them, plucking them, and changing their shape to test how the molecules interact. The system, which uses an HTC Vive virtual-reality device, could lead to new drugs and materials and could improve the teaching of chemistry.

More broadly, the goal is to accelerate progress in nanoscale molecular engineering areas that include conformational mapping, drug development, synthetic biology, and catalyst design.

Real-time collaboration via the cloud

Two users passing a fullerene (C60) molecule back and forth in real time over a cloud-based network. The researchers are each wearing a VR head-mounted display (HMD) and holding two small wireless controllers that function as atomic “tweezers” to manipulate the real-time molecular dynamic of the C60 molecule. Each user’s position is determined using a real-time optical tracking system composed of synchronized infrared light sources, running locally on a GPU-accelerated computer. (credit: University of Bristol)

The multi-user system, developed by a team led by University of Bristol chemists and computer scientists, uses an “interactive molecular dynamics virtual reality” (iMD VR) app that allows users to visualize and sample (with atomic-level precision) the structures and dynamics of complex molecular structures “on the fly” and to interact with other users in the same virtual environment.

Because each VR client has access to global position data of all other users, any user can see through his/her headset a co-located visual representation of all other users at the same time. So far, the system has allowed up to six users to be co-located in the same room within the same simulation.

Testing on challenging molecular tasks

The team designed a series of molecular tasks for testing, comparing traditional mouse, keyboard, and touchscreen interfaces with virtual reality. The tasks included threading a small molecule through a nanotube, changing the screw-sense of a small organic helix, and tying a small string-like protein into a simple knot, along with a variety of dynamic molecular problems, such as binding a drug to its target, protein folding, and chemical reactions. The researchers found that for complex 3D tasks, VR offers a significant advantage over current methods. For example, participants were ten times more likely to succeed in difficult tasks such as molecular knot tying.

Anyone can try out the tasks described in the open-access paper by downloading the software and launching their own cloud-hosted session.

David Glowacki | This video, made by University of Bristol PhD student Helen M. Deeks, shows the actions she took using a wireless set of “atomic tweezers” (using the HTC Vive) to interactively dock a single benzylpenicillin drug molecule into the active site of the β-lactamase enzyme. 

David Glowacki | The video shows the cloud-mounted virtual reality framework, with several different views overlaid to give a sense of how the interaction works. The video outlines the four different parts of the user studies: (1) manipulation of buckminsterfullerene, enabling users to familarize themselves with the interactive controls; (2) threading a methane molecule through a nanotube; (3) changing the screw-sense of a helicene molecule; and (4) tying a trefoil knot in 17-Alanine.

Ref: Science Advances (open-access). Source: University of Bristol.

How to predict the side effects of millions of drug combinations

Stanford Univ. computer scientists have figured it out, using artificial intelligence.
July 17, 2018
Original link:  http://www.kurzweilai.net/how-to-predict-the-side-effects-of-millions-of-drug-combinations
An example graph of polypharmacy side effects derived from genomic and patient population data, protein–protein interactions, drug–protein targets, and drug–drug interactions encoded by 964 different polypharmacy side effects. The graph representation is used to develop Decagon. (credit: Marinka Zitnik et al./Bioinformatics)

Millions of people take five or more medications a day, but doctors have no idea what side effects might arise from adding another drug.*

Now, Stanford University computer scientists have developed a deep-learning system (a kind of AI modeled after the brain) called Decagon** that could help doctors make better decisions about which drugs to prescribe. It could also help researchers find better combinations of drugs to treat complex diseases.

The problem is that with so many drugs currently on the U.S. pharmaceutical market, “it’s practically impossible to test a new drug in combination with all other drugs, because just for one drug, that would be five thousand new experiments,” said Marinka Zitnik, a postdoctoral fellow in computer science and lead author of a paper presented July 10 at the 2018 meeting of the International Society for Computational Biology.

With some new drug combinations (“polypharmacy”), she said, “truly we don’t know what will happen.”

How proteins interact and how different drugs affect these proteins

So Zitnik and associates created a network describing how the more than 19,000 proteins in our bodies interact with each other and how different drugs affect these proteins. Using more than 4 million known associations between drugs and side effects, the team then designed a method to identify patterns in how side effects arise, based on how drugs target different proteins, and also to infer patterns about drug-interaction side effects.***
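
To make the data structure concrete, here is a hypothetical miniature of such a multimodal graph using the networkx library (assumed installed); every node, edge, and side-effect label below is invented for illustration and is not taken from the Decagon data.

    import networkx as nx

    # A hypothetical miniature of the multimodal graph described above.
    # The real network covers ~19,000 proteins, drug-protein targets,
    # and millions of drug/side-effect associations.
    g = nx.MultiGraph()
    g.add_edge("protein:P1", "protein:P2", kind="protein-protein")
    g.add_edge("drug:D1", "protein:P1", kind="drug-target")
    g.add_edge("drug:D2", "protein:P2", kind="drug-target")
    # Drug-drug edges are labelled with an observed polypharmacy side effect.
    g.add_edge("drug:D1", "drug:D2", kind="drug-drug", effect="side effect 42")

    # The prediction task: given the rest of the graph, infer which
    # labelled drug-drug edges (side effects) are missing.
    for u, v, data in g.edges(data=True):
        print(u, "--", v, data)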

Based on that method, the system could predict the consequences of taking two drugs together.

To evaluate the system, the group looked to see if its predictions came true. In many cases, they did. For example, there was no indication in the original data that the combination of atorvastatin (marketed under the trade name Lipitor, among others), a cholesterol drug, and amlodipine (Norvasc), a blood-pressure medication, could lead to muscle inflammation. Yet Decagon predicted that it would, and it was right.

In the future, the team members hope to extend their results to include more multiple-drug interactions. They also hope to create a more user-friendly tool to give doctors guidance on whether it's a good idea to prescribe a particular drug to a particular patient, and to help researchers develop drug regimens for complex diseases with fewer side effects.

Ref.: Bioinformatics (open access). Source: Stanford University.

* More than 23 percent of Americans took three or more prescription drugs in the past 30 days, according to a 2017 CDC estimate. Furthermore, 39 percent over age 65 take five or more, a number that’s increased three-fold in the last several decades. There are about 1,000 known side effects and 5,000 drugs on the market, making for nearly 125 billion possible side effects between all possible pairs of drugs. Most of these have never been prescribed together, let alone systematically studied, according to the Stanford researchers.

** In geometry, a decagon is a ten-sided polygon.

*** The research was supported by the National Science Foundation, the National Institutes of Health, the Defense Advanced Research Projects Agency, the Stanford Data Science Initiative, and the Chan Zuckerberg Biohub.
