
Software

From Wikipedia, the free encyclopedia

A diagram showing how the user interacts with application software on a typical desktop computer. The application software layer interfaces with the operating system, which in turn communicates with the hardware. The arrows indicate information flow.

Computer software, or simply software, is a collection of data or computer instructions that tell the computer how to work. This is in contrast to physical hardware, from which the system is built and which actually performs the work. In computer science and software engineering, computer software is all information processed by computer systems, including programs and data. Computer software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. Computer hardware and software require each other, and neither can be realistically used on its own.

At the lowest programming level, executable code consists of machine language instructions supported by an individual processor—typically a central processing unit (CPU) or a graphics processing unit (GPU). A machine language consists of groups of binary values signifying processor instructions that change the state of the computer from its preceding state. For example, an instruction may change the value stored in a particular storage location in the computer—an effect that is not directly observable to the user. An instruction may also invoke one of many input or output operations, for example displaying some text on a computer screen, causing state changes that should be visible to the user. The processor executes the instructions in the order they are provided, unless it is instructed to "jump" to a different instruction, or is interrupted by the operating system. As of 2015, most personal computers, smartphone devices and servers have processors with multiple execution units or multiple processors performing computation together, and computing has become a much more concurrent activity than in the past.
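The ideas above—instructions that change hidden state, instructions whose effects are visible, and the "jump"—can be made concrete with a toy sketch. This is an illustration in Python with an invented four-instruction machine language, not a real instruction set:

```python
# A toy "machine language" interpreter: each instruction changes the
# machine's state (a register), redirects execution (a jump), or
# performs output visible to the user.
def run(program):
    regs = {"A": 0}          # machine state: a single register
    pc = 0                   # program counter: which instruction is next
    output = []
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":     # change hidden state: set register A
            regs["A"] = args[0]
        elif op == "ADD":    # change hidden state: add to register A
            regs["A"] += args[0]
        elif op == "JMP":    # "jump" to a different instruction
            pc = args[0]
            continue
        elif op == "OUT":    # I/O: a state change visible to the user
            output.append(regs["A"])
        pc += 1
    return output

# LOAD 2; ADD 3; OUT  -> the visible result is 5
print(run([("LOAD", 2), ("ADD", 3), ("OUT",)]))  # [5]
```

Until the `OUT` instruction runs, nothing the program does is observable—exactly the distinction the paragraph draws between internal state changes and visible ones.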

The majority of software is written in high-level programming languages. They are easier and more efficient for programmers because they are closer to natural languages than machine languages. High-level languages are translated into machine language using a compiler or an interpreter or a combination of the two. Software may also be written in a low-level assembly language, which has strong correspondence to the computer's machine language instructions and is translated into machine language using an assembler.
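As a rough illustration of this translation step, Python's own compiler turns high-level source into lower-level bytecode instructions, which the standard `dis` module can display. Bytecode is not machine language, but the analogy holds: readable source on one side, numbered low-level instructions on the other.

```python
import dis

# High-level source code...
src = "x = 2 + 3"

# ...is translated by Python's compiler into lower-level bytecode
# instructions, which a virtual machine then executes.
code = compile(src, "<example>", "exec")
dis.dis(code)  # prints instructions such as LOAD_CONST and STORE_NAME
```

A compiler for C or Rust performs the same kind of translation, but all the way down to the machine language of a specific processor.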

History

An outline (algorithm) for what would have been the first piece of software was written by Ada Lovelace in the 19th century, for the planned Analytical Engine. She created proofs to show how the engine would calculate Bernoulli numbers. Because of the proofs and the algorithm, she is considered the first computer programmer.

The first theory about software—prior to the creation of computers as we know them today—was proposed by Alan Turing in his 1935 essay On Computable Numbers, with an Application to the Entscheidungsproblem (decision problem).

This eventually led to the creation of the academic fields of computer science and software engineering; both fields study software and its creation. Computer science is the theoretical study of computers and software (Turing's essay is an example of computer science), whereas software engineering is the application of engineering principles to the development of software.

However, prior to 1946, software was not yet the programs stored in the memory of stored-program digital computers, as we now understand it. The first electronic computing devices were instead rewired in order to "reprogram" them.

In 2000, Fred Shapiro, a librarian at the Yale Law School, published a letter revealing that John Wilder Tukey's 1958 paper "The Teaching of Concrete Mathematics" contained the earliest known usage of the term "software" found in a search of JSTOR's electronic archives, predating the OED's citation by two years. This led many to credit Tukey with coining the term, particularly in obituaries published that same year, although Tukey never claimed credit for any such coinage. In 1995, Paul Niquette claimed he had originally coined the term in October 1953, although he could not find any documents supporting his claim. The earliest known publication of the term "software" in an engineering context was in August 1953 by Richard R. Carhart, in a Rand Corporation Research Memorandum.

Types


On virtually all computer platforms, software can be grouped into a few broad categories.

Purpose, or domain of use

Based on the goal, computer software can be divided into:
  • Application software
    which is software that uses the computer system to perform special functions or provide entertainment functions beyond the basic operation of the computer itself. There are many different types of application software, because the range of tasks that can be performed with a modern computer is so large.
  • System software
    which is software for managing computer hardware behaviour, so as to provide basic functionalities that are required by users, or for other software to run properly, if at all. System software is also designed to provide a platform for running application software, and it includes the following:
    • Operating systems
      which are essential collections of software that manage resources and provide common services for other software that runs "on top" of them. Supervisory programs, boot loaders, shells and window systems are core parts of operating systems. In practice, an operating system comes bundled with additional software (including application software) so that a user can potentially do some work with a computer that only has one operating system.
    • Device drivers
      which operate or control a particular type of device that is attached to a computer. Each device needs at least one corresponding device driver; because a computer typically has at least one input device and at least one output device, a computer typically needs more than one device driver.
    • Utilities
      which are computer programs designed to assist users in the maintenance and care of their computers.
  • Malicious software or malware
    which is software that is developed to harm and disrupt computers. As such, malware is undesirable. Malware is closely associated with computer-related crimes, though some malicious programs may have been designed as practical jokes.

Nature or domain of execution

  • Desktop applications such as web browsers and Microsoft Office, as well as smartphone and tablet applications (called "apps"). (There is a push in some parts of the software industry to merge desktop applications with mobile apps, to some extent. Windows 8, and later Ubuntu Touch, tried to allow the same style of application user interface to be used on desktops, laptops and mobiles.)
  • JavaScript scripts are pieces of software traditionally embedded in web pages that are run directly inside the web browser when a web page is loaded without the need for a web browser plugin. Software written in other programming languages can also be run within the web browser if the software is either translated into JavaScript, or if a web browser plugin that supports that language is installed; the most common example of the latter is ActionScript scripts, which are supported by the Adobe Flash plugin.
  • Server software;
  • Plugins and extensions are software that extends or modifies the functionality of another piece of software, and require that software be used in order to function;
  • Embedded software resides as firmware within embedded systems, devices dedicated to a single use or a few uses such as cars and televisions (although some embedded devices such as wireless chipsets can themselves be part of an ordinary, non-embedded computer system such as a PC or smartphone). In the embedded system context there is sometimes no clear distinction between the system software and the application software. However, some embedded systems run embedded operating systems, and these systems do retain the distinction between system software and application software (although typically there will only be one, fixed application which is always run).
  • Microcode is a special, relatively obscure type of embedded software which tells the processor itself how to execute machine code, so it is actually a lower level than machine code. It is typically proprietary to the processor manufacturer, and any necessary correctional microcode software updates are supplied by them to users (which is much cheaper than shipping replacement processor hardware). Thus an ordinary programmer would not expect to ever have to deal with it.
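The plugin idea listed above—software that extends another piece of software and requires that software in order to function—can be sketched in a few lines. All names here are hypothetical, invented for illustration:

```python
# Minimal sketch of a plugin mechanism: the host program defines an
# extension point, plugins register callables against it, and the host
# invokes whatever has been registered. A plugin is useless without
# the host, which is the defining property described above.
PLUGINS = {}

def register(name):
    """Decorator used by plugin authors to hook into the host."""
    def wrapper(func):
        PLUGINS[name] = func
        return func
    return wrapper

@register("uppercase")
def uppercase_filter(text):
    # An example plugin: modifies the host's output.
    return text.upper()

def host_process(text):
    # The host works without plugins, but applies any that are present.
    for plugin in PLUGINS.values():
        text = plugin(text)
    return text

print(host_process("hello"))  # HELLO
```

Real plugin systems (browser extensions, editor packages) add discovery, versioning and sandboxing on top of this registration-and-dispatch core.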

Programming tools

Programming tools are also software in the form of programs or applications that software developers (also known as programmers, coders, hackers or software engineers) use to create, debug, maintain (i.e. improve or fix), or otherwise support software.

Software is written in one or more programming languages; there are many programming languages in existence, and each has at least one implementation, each of which consists of its own set of programming tools. These tools may be relatively self-contained programs such as compilers, debuggers, interpreters, linkers, and text editors, that can be combined together to accomplish a task; or they may form an integrated development environment (IDE), which combines much or all of the functionality of such self-contained tools. IDEs may do this by either invoking the relevant individual tools or by re-implementing their functionality in a new way. An IDE can make it easier to do specific tasks, such as searching in files in a particular project. Many programming language implementations provide the option of using both individual tools or an IDE.

Topics

Architecture

Users often see things differently from programmers. People who use modern general purpose computers (as opposed to embedded systems, analog computers and supercomputers) usually see three layers of software performing a variety of tasks: platform, application, and user software.
  • Platform software
    The Platform includes the firmware, device drivers, an operating system, and typically a graphical user interface which, in total, allow a user to interact with the computer and its peripherals (associated equipment). Platform software often comes bundled with the computer. On a PC one will usually have the ability to change the platform software.
  • Application software
    Application software or Applications are what most people think of when they think of software. Typical examples include office suites and video games. Application software is often purchased separately from computer hardware. Sometimes applications are bundled with the computer, but that does not change the fact that they run as independent applications. Applications are usually independent programs from the operating system, though they are often tailored for specific platforms. Most users think of compilers, databases, and other "system software" as applications.
  • User-written software
    End-user development tailors systems to meet users' specific needs. User software includes spreadsheet templates and word processor templates. Even email filters are a kind of user software. Users create this software themselves and often overlook how important it is. Depending on how competently the user-written software has been integrated into default application packages, many users may not be aware of the distinction between the original packages, and what has been added by co-workers.

Execution

Computer software has to be "loaded" into the computer's storage (such as the hard drive or memory). Once the software has loaded, the computer is able to execute the software. This involves passing instructions from the application software, through the system software, to the hardware which ultimately receives the instruction as machine code. Each instruction causes the computer to carry out an operation—moving data, carrying out a computation, or altering the control flow of instructions.

Data movement is typically from one place in memory to another. Sometimes it involves moving data between memory and registers which enable high-speed data access in the CPU. Moving data, especially large amounts of it, can be costly. So, this is sometimes avoided by using "pointers" to data instead. Computations include simple operations such as incrementing the value of a variable data element. More complex computations may involve many operations and data elements together.
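A small Python sketch of these ideas: real data movement (copying) versus a reference that avoids it—the analogue of the "pointers" mentioned above—plus the simple increment computation:

```python
# Moving large amounts of data is costly; using a reference ("pointer")
# to the same data avoids the move entirely.
data = list(range(1_000_000))

copied = data[:]   # real data movement: a new million-element list
alias = data       # a reference only: no elements are moved

alias[0] = 99
print(data[0])     # 99 -- the alias and the original share storage
print(copied[0])   # 0  -- the copy is unaffected

# A simple computation: incrementing the value of a variable data element.
counter = 0
counter += 1
```

In lower-level languages the same trade-off appears as passing a pointer versus copying a buffer; the reference costs a few bytes regardless of how large the data is.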

Quality and reliability

Software quality is very important, especially for commercial and system software like Microsoft Office, Microsoft Windows and Linux. If software is faulty (buggy), it can delete a person's work, crash the computer and do other unexpected things. Faults and errors are called "bugs", which are often discovered during alpha and beta testing. Software is often also a victim of what is known as software aging, the progressive performance degradation resulting from a combination of unseen bugs.

Many bugs are discovered and eliminated (debugged) through software testing. However, software testing rarely—if ever—eliminates every bug; some programmers say that "every program has at least one more bug" (Lubarsky's Law). In the waterfall method of software development, separate testing teams are typically employed, but in newer approaches, collectively termed agile software development, developers often do all their own testing, and demonstrate the software to users/clients regularly to obtain feedback. Software can be tested through unit testing, regression testing and other methods, which are done manually, or most commonly, automatically, since the amount of code to be tested can be quite large. For instance, NASA has extremely rigorous software testing procedures for many operating systems and communication functions. Many NASA-based operations interact with and identify each other through command programs. This enables the many people who work at NASA to check and evaluate functional systems overall. Programs containing command software enable hardware engineering and system operations to function together much more easily.
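The unit and regression testing mentioned above can be illustrated with Python's standard `unittest` framework; the function under test here is a hypothetical example:

```python
import unittest

def add(a, b):
    # Hypothetical function under test.
    return a + b

class TestAdd(unittest.TestCase):
    def test_small_numbers(self):
        # A unit test: checks one small piece of behaviour in isolation.
        self.assertEqual(add(2, 3), 5)

    def test_negative_regression(self):
        # A regression test: pins down behaviour so a future change
        # cannot silently reintroduce an old bug.
        self.assertEqual(add(-1, 1), 0)

# Run the suite programmatically -- the automated testing described
# above, where the framework finds and executes every test method.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Because the framework discovers and runs every test automatically, suites like this scale to the very large codebases the paragraph describes.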

License

The software's license gives the user the right to use the software in the licensed environment, and in the case of free software licenses, also grants other rights such as the right to make copies. 

Proprietary software can be divided into two types:
  • freeware, which includes the category of "free trial" software or "freemium" software (in the past, the term shareware was often used for free trial/freemium software). As the name suggests, freeware can be used for free, although in the case of free trials or freemium software, this is sometimes only true for a limited period of time or with limited functionality.
  • software available for a fee, often inaccurately termed "commercial software", which can only be legally used on purchase of a license.
Open-source software, on the other hand, comes with a free software license, granting the recipient the rights to modify and redistribute the software.

Patents

Software patents, like other types of patents, are theoretically supposed to give an inventor an exclusive, time-limited license for a detailed idea (e.g. an algorithm) on how to implement a piece of software, or a component of a piece of software. Ideas for useful things that software could do, and user requirements, are not supposed to be patentable, and concrete implementations (i.e. the actual software packages implementing the patent) are not supposed to be patentable either—the latter are already covered by copyright, generally automatically. So software patents are supposed to cover the middle area, between requirements and concrete implementation. In some countries, a requirement for the claimed invention to have an effect on the physical world may also be part of the requirements for a software patent to be held valid—although since all useful software has effects on the physical world, this requirement may be open to debate. Meanwhile, American copyright law was applied to various aspects of the writing of the software code.

Software patents are controversial in the software industry, with many people holding different views about them. One source of controversy is that the aforementioned split between initial ideas and patents does not seem to be honored in practice by patent lawyers—for example, the patent for aspect-oriented programming (AOP) purported to claim rights over any programming tool implementing the idea of AOP, howsoever implemented. Another source of controversy is the effect on innovation, with many distinguished experts and companies arguing that software is such a fast-moving field that software patents merely create vast additional litigation costs and risks, and actually retard innovation. In debates about software patents outside the United States, the argument has been made that large American corporations and patent lawyers are likely to be the primary beneficiaries of allowing, or continuing to allow, software patents.

Design and implementation

Design and implementation of software varies depending on the complexity of the software. For instance, the design and creation of Microsoft Word took much more time than designing and developing Microsoft Notepad because the latter has much more basic functionality.

Software is usually designed and created (aka coded/written/programmed) in integrated development environments (IDEs) like Eclipse, IntelliJ and Microsoft Visual Studio that can simplify the process and compile the software (if applicable). As noted in a different section, software is usually created on top of existing software and the application programming interface (API) that the underlying software provides, like GTK+, JavaBeans or Swing. Libraries (APIs) can be categorized by their purpose. For instance, the Spring Framework is used for implementing enterprise applications, the Windows Forms library is used for designing graphical user interface (GUI) applications like Microsoft Word, and Windows Communication Foundation is used for designing web services. When a program is designed, it relies upon the API. For instance, a Microsoft Windows desktop application might call API functions in the .NET Windows Forms library, like Form1.Close() and Form1.Show(), to close or open the application. Without these APIs, the programmer would need to write these functionalities entirely themselves. Companies like Oracle and Microsoft provide their own APIs, so many applications are written using their software libraries, which usually contain numerous APIs.
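The `Form1.Close()` example is from the .NET Windows Forms API; as a generic illustration of the same principle, a Python program routinely calls a library API—here the standard `json` module—rather than writing that functionality itself:

```python
import json  # a library whose API spares us writing serialization by hand

settings = {"theme": "dark", "autosave": True}

text = json.dumps(settings)   # API call: encode the data as a JSON string
restored = json.loads(text)   # API call: decode it back

print(restored == settings)   # True
```

Two short API calls replace what would otherwise be a hand-written encoder and parser—the "write these functionalities entirely themselves" cost the paragraph describes.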

Data structures such as hash tables, arrays, and binary trees, and algorithms such as quicksort, can be useful for creating software. 
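For instance, quicksort and a hash-table lookup can each be sketched in a few lines (Python used purely as illustration):

```python
def quicksort(items):
    # Divide and conquer: partition around a pivot, then recurse on
    # the smaller and larger halves.
    if len(items) <= 1:
        return items
    pivot, *rest = items
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

index = {"apple": 1, "pear": 2}       # hash table: O(1) average lookup
print(index["pear"])                  # 2
print(quicksort([3, 1, 4, 1, 5]))     # [1, 1, 3, 4, 5]
```

Choosing the right structure (a hash table for lookups, a sorted array for range scans, a tree for ordered traversal) often matters more to a program's performance than any single algorithm.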

Computer software has special economic characteristics that make its design, creation, and distribution different from most other economic goods.

A person who creates software is called a programmer, software engineer or software developer, terms that all have a similar meaning. More informal terms for programmer also exist such as "coder" and "hacker" – although use of the latter word may cause confusion, because it is more often used to mean someone who illegally breaks into computer systems.

Industry and organizations

A great variety of software companies and programmers in the world make up the software industry. Software can be quite profitable: Bill Gates, the co-founder of Microsoft, was the richest person in the world in 2009, largely due to his ownership of a significant number of shares in Microsoft, the company responsible for the Microsoft Windows and Microsoft Office software products - both market leaders in their respective product categories.

Non-profit software organizations include the Free Software Foundation, GNU Project and the Mozilla Foundation. Software standards organizations like the W3C and IETF develop recommended software standards such as XML, HTTP and HTML, so that software can interoperate through these standards.

Other well-known large software companies include Google, IBM, TCS, Infosys, Wipro, HCL Technologies, Oracle, Novell, SAP, Symantec, Adobe Systems, Sidetrade and Corel, while small companies often provide innovation.

STEP-NC

STEP-NC interface on a CNC, showing product shape and color-coded tolerance state

STEP-NC is a machine tool control language that extends the ISO 10303 STEP standards with the machining model in ISO 14649, adding geometric dimension and tolerance data for inspection, and the STEP PDM model for integration into the wider enterprise. The combined result has been standardized as ISO 10303-238 (also known as AP238). 

STEP-NC was designed to replace ISO 6983/RS274D G-codes with a modern, associative communications protocol that connects computer numerical controlled (CNC) process data to a product description of the part being machined.

A STEP-NC program can use the full range of geometric constructs from the STEP standard to communicate device-independent toolpaths to the CNC. It can provide CAM operational descriptions and STEP CAD geometry to the CNC so workpieces, stock, fixtures and cutting tool shapes can be visualized and analyzed in the context of the toolpaths. STEP GD&T information can also be added to enable quality measurement on the control, and CAM-independent volume removal features may be added to facilitate regeneration and modification of the toolpaths before or during machining for closed loop manufacturing.

Motivation

Impeller machined using STEP-NC

Input to a CNC in the ISO 6983/RS274D G-code control language is often machine-specific and limited to axis motion commands. The machine tool is given little or no information about the desired result of the machining.
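To illustrate how little intent a G-code program carries, here is a toy, deliberately incomplete Python parser for a single RS274D-style line: all the control receives is a command word and a handful of numeric parameter words.

```python
# G-code (ISO 6983 / RS274D) input is little more than axis motion
# commands. This toy parser splits one line into its command word and
# its parameter words -- essentially everything the control is told.
def parse_gcode_line(line):
    words = line.split()
    cmd = words[0]                                    # e.g. G1 = linear move
    params = {w[0]: float(w[1:]) for w in words[1:]}  # letter -> value
    return cmd, params

cmd, params = parse_gcode_line("G1 X10.0 Y5.0 F200")
print(cmd, params)  # G1 {'X': 10.0, 'Y': 5.0, 'F': 200.0}
```

Nothing in the line says what part is being cut, what tolerance applies, or even what tool is loaded—the gap in information that STEP-NC was designed to close.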

STEP-NC allows more information about the machining process to be sent to the machine control and adds new information about the product being machined. This "Smart Data for Smart Machining" enables applications such as the following:
  • Toolpath descriptions that are portable and independent of machine geometry.
  • Visual process, to show toolpaths in context of the machine and workpiece, and eliminate drawings.
  • On-Machine Simulation, to check for gouges, machine interference and other undesired behavior.
  • Simplified Inspection, with linked tolerances, on-machine probes and inspection workplans tied to part tolerances.
  • Feed and Speed Optimization, using tolerances, cross section information, sensor data.
  • Associativity so feedback can be sent from manufacturing back to design.

Capabilities

Overview of STEP-NC process model
STEP-NC can communicate a complete machining process description to a machine tool control or between manufacturing software applications. The information handled by STEP-NC can be divided into the following general categories. The standard handles technology-specific parameters for milling and turning, and extensions for other technologies under development (see Future Work). 

STEP-NC can exchange the explicit toolpath descriptions in use today, and add part, stock, and fixture geometry, a description of the tools, geometric dimensions and tolerances, and PDM information. A STEP-NC file is difficult to edit by hand because it contains geometry descriptions, but for large programs the file size can be smaller because STEP-NC uses a compressed XML format instead of ASCII codes.

History

STEP-NC is not the first attempt at providing better quality information to a CNC. The EIA 494 Basic Control Language (BCL) defined a control language that was portable and had toolpaths independent of machine geometry, but did not contain any of the other product model information found in STEP-NC.

The core of STEP-NC is the ISO 14649 model for CNC control developed by the European ESPRIT and IMS STEP-NC projects begun in 1999. These were led by Siemens with contributions from RWTH Aachen University and the University of Stuttgart in Germany, Komatsu and FANUC in Japan, Heidenhain in Germany, and the Pohang University of Science and Technology in Korea. Models for the control of CNC milling and turning machines were published in 2005, and draft models exist for EDM and contour cutting. 

Integration of the CNC model into STEP to produce ISO 10303-238 was done in the United States, under the NIST ATP Model Driven Intelligent Control of Manufacturing project, led by STEP Tools, Inc. with an industrial review board (IRB) consisting of Fortune 500 companies, CAD and CAM software developers, machine tool manufacturers, job shops and industry experts. STEP-NC AP238 was published in 2007.

STEP-NC Crown Wheel

In 2005 the OMAC STEP-NC Working Group hosted an AP238 testing forum in Orlando to demonstrate 5-axis parts machined using AP238 CC1 machine-independent toolpaths. Four CAD/CAM systems produced AP238 machining programs for milling a 5-axis test part (an NAS 979 circle/diamond/square with an inverted NAS 979 cone test in the center). Each was run on a pair of CNCs configured for completely different machine geometries (AB tool tilt vs. BC table tilt). In addition, Boeing cut parts on a variety of machines at their Tulsa facility and on a machine at NIST in Gaithersburg.

In June 2006, a live 5-axis STEP-NC machining demonstration was hosted by Airbus at the Université Paul Sabatier Laboratoire de Génie mécanique in Toulouse. Further machining and measurement demonstrations were conducted in Ibusuki Japan in 2007.

On March 10–12, 2008, the STEP Manufacturing team (ISO TC184 SC4 WG3 T24) met in Sandviken and Stockholm, Sweden to demonstrate use of STEP-NC for feed and speed optimization, high-speed machining, tolerance-driven tool compensation and traceability. The participants in the demonstrations included Airbus/Univ. Bordeaux, Boeing, Eurostep, KTH Royal Institute of Technology, NIST, Sandvik Coromant, Scania, STEP Tools, and Univ. of Vigo.

On October 1–2, 2008, the STEP Manufacturing team met at the Connecticut Center for Advanced Technology, in Hartford, Connecticut to demonstrate closed-loop machining, feed optimization, and measurement using STEP-NC. The highlight of the meeting was the live 5-axis machining of a titanium impeller. Participants in the machining demonstration and other activities included Boeing, Connecticut Center for Advanced Technology, Concepts NRec, DMG, KTH Royal Institute of Technology, Mitutoyo, NIST, Sandvik Coromant, Scania, Siemens, and STEP Tools.

These participants and others continue to hold STEP-NC international implementation and testing events on a roughly six-month cycle. The demonstrations in 2009 focused on machining a Mold part at multiple sites from the same AP238 data including one part machined on a FANUC-developed STEP-NC control. At a meeting in Seattle the parts were then measured for accuracy using a CMM probe and a laser scanner.

STEP-NC machining on an Okuma CNC at IMTS 2014.

In the first half of 2010, the testing activity focused on tool wear management and machining a part in multiple setups with multiple alternate machining plans for 3, 4 and 5-axis machining. The new test part was a gear box that must be machined on all six sides. The tool wear and consequent machine loads were predicted from the STEP-NC data and verified using a dynamometer. In the second half of 2010, the testing forum applied STEP-NC to set up compensation with on-machine measurement of part and fixture datums using a FaroArm portable measurement device.

In 2012, the testing focused on machine tool accuracy calculations, culminating in a demonstration in June at the KTH production engineering labs in Stockholm. The test case milled a forged blank for a Crown Wheel Gear on an older Mazak VQC 20. Accuracy data from the machine was combined with tool engagement information from the STEP-NC to predict the deflections, which were tested against actual machining results.

In 2014, CAM data exchange using STEP-NC was shown at IMTS 2014 with daily machining demonstrations hosted by Okuma. A base machining process for a mold part was created by Boeing and then sent to Sandvik and ISCAR for optimization, producing a STEP-NC description containing all three process options. All machining was done in titanium and a range of CAM software was used, with all results captured as STEP-NC.

At IMTS 2018, a team consisting of Airbus, Boeing, DMG MORI, Hyundai WIA, Renishaw, and Mitutoyo demonstrated Digital Twin manufacturing by combining STEP-NC model and process data with MTConnect machine tool status and Quality Information Format (QIF) metrology results.

Future work

STEP-NC plasma cutting

Work continues within the ISO standard committees to extend STEP-NC to new technologies and to incorporate refinements discovered during use. Process models for new technologies are usually produced by the ISO TC184/SC1/WG7 committee. Models for Wire & Sink EDM and contour cutting of wood or stone are under investigation.

Work on extending and integrating STEP-NC with the manufacturing enterprise takes place in the ISO TC184/SC4/WG3/T24 STEP Manufacturing Team. This group also works on extensions and refinements discovered during testing. A series of traceability extensions have been proposed for linking STEP-NC machining programs with sensor feedback and machine state information during execution.

The National Shipbuilding Research Program (NSRP) has also hosted work to implement a prototype that connects a shipyard design system to a plate-cutting machine using STEP-NC. This work involved extending STEP-NC to steel plate cutting and marking using lasers and plasma torches.

A second edition of AP238 is being prepared for model-based integrated manufacturing, with geometry, tolerance, and kinematics improvements first introduced by AP242.

Computer-aided manufacturing

 
CAD model and CNC machined part

Computer-aided manufacturing (CAM), also known as computer-aided modeling or computer-aided machining, is the use of software to control machine tools and related machinery in the manufacturing of workpieces. This is not the only definition for CAM, but it is the most common; CAM may also refer to the use of a computer to assist in all operations of a manufacturing plant, including planning, management, transportation and storage. Its primary purpose is to create a faster production process and components and tooling with more precise dimensions and material consistency, which in some cases uses only the required amount of raw material (thus minimizing waste), while simultaneously reducing energy consumption. CAM is a subsequent computer-aided process after computer-aided design (CAD) and sometimes computer-aided engineering (CAE), as the model generated in CAD and verified in CAE can be input into CAM software, which then controls the machine tool. CAM is also used in many schools alongside CAD to create objects.

Overview

Chrome-cobalt disc with crowns for dental implants, manufactured using WorkNC CAM

Traditionally, CAM has been considered a numerical control (NC) programming tool, wherein two-dimensional (2-D) or three-dimensional (3-D) models of components are generated in CAD. As with other "computer-aided" technologies, CAM does not eliminate the need for skilled professionals such as manufacturing engineers, NC programmers, or machinists. CAM both leverages the value of the most skilled manufacturing professionals through advanced productivity tools and builds the skills of new professionals through visualization, simulation and optimization tools.

History

Early commercial applications of CAM were in large companies in the automotive and aerospace industries; for example, Pierre Bézier's work developing the CAD/CAM application UNISURF in the 1960s for car body design and tooling at Renault.

Historically, CAM software was seen to have several shortcomings that necessitated an overly high level of involvement by skilled CNC machinists. CAM software would output code for the least capable machine, as each machine tool control added its own extensions to the standard G-code set for increased flexibility. In some cases, such as improperly set-up CAM software or specific tools, the CNC machine required manual editing before the program would run properly. None of these issues were so insurmountable that a thoughtful engineer or skilled machine operator could not overcome them for prototyping or small production runs; G-code is a simple language. In high-production or high-precision shops, a different set of problems was encountered, where an experienced CNC machinist had to both hand-code programs and run CAM software.

The integration of CAD with other components of the CAD/CAM/CAE product lifecycle management (PLM) environment requires effective CAD data exchange. Usually it has been necessary to force the CAD operator to export the data in one of the common data formats, such as IGES, STL or Parasolid, that are supported by a wide variety of software. The output from the CAM software is usually a simple text file of G-codes/M-codes, sometimes many thousands of commands long, that is then transferred to a machine tool using a direct numerical control (DNC) program or, in modern controllers, a common USB storage device.
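For illustration, the kind of G-code text file a CAM post-processor emits can be sketched with a short generator. The facing strategy, coordinates and feed rate below are illustrative assumptions, not the output of any particular CAM package.

```python
# Minimal sketch of a post-processor emitting G-code for a simple
# back-and-forth facing pass. All values are illustrative.

def facing_pass(width, length, stepover, z, feed):
    """Return G-code lines for a zig-zag facing pass over a rectangle."""
    lines = ["G90 G21",                      # absolute positioning, millimetres
             "G0 Z5.0",                      # rapid to safe height
             "G0 X0 Y0",                     # rapid to start corner
             f"G1 Z{z:.3f} F{feed}"]         # plunge to cutting depth
    y, direction = 0.0, 1
    while y <= width:
        x_end = length if direction > 0 else 0.0
        lines.append(f"G1 X{x_end:.3f} F{feed}")   # cut across the part
        y += stepover
        if y <= width:
            lines.append(f"G1 Y{y:.3f}")           # step over for next pass
        direction = -direction
    lines += ["G0 Z5.0",                     # retract
              "M2"]                          # program end
    return lines

program = facing_pass(width=20.0, length=50.0, stepover=5.0, z=-0.5, feed=300)
```

A real post-processor would additionally handle tool changes, spindle and coolant M-codes, and the dialect quirks of the target controller.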

CAM packages could not, and still cannot, reason as a machinist can. They could not optimize toolpaths to the extent required of mass production. Users would select the type of tool, machining process and paths to be used. While an engineer may have a working knowledge of G-code programming, small optimization and wear issues compound over time. Mass-produced items that require machining are often initially created through casting or some other non-machine method. This enables hand-written, short, and highly optimized G-code that could not be produced in a CAM package. 

At least in the United States, there is a shortage of young, skilled machinists entering the workforce able to perform at the extremes of manufacturing: high precision and mass production. As CAM software and machines become more complicated, the skills required of a machinist or machine operator advance to approach those of a computer programmer and engineer, rather than the CNC machinist being eliminated from the workforce.

Typical areas of concern:
  • High-speed machining, including streamlining of tool paths
  • Multi-function machining
  • 5-axis machining
  • Feature recognition and machining
  • Automation of machining processes
  • Ease of use

Overcoming historical shortcomings

Over time, the historical shortcomings of CAM are being attenuated, both by providers of niche solutions and by providers of high-end solutions. This is occurring primarily in three arenas:
  1. Ease of use
  2. Manufacturing complexity
  3. Integration with PLM and the extended enterprise
Ease of use
For the user who is just getting started with CAM, out-of-the-box capabilities such as process wizards, templates, libraries, machine tool kits, automated feature-based machining and job-function-specific tailorable user interfaces build user confidence and speed the learning curve.
User confidence is further built on 3D visualization through a closer integration with the 3D CAD environment, including error-avoiding simulations and optimizations.
Manufacturing complexity
The manufacturing environment is increasingly complex. The need for CAM and PLM tools by the manufacturing engineer, NC programmer or machinist is similar to the need for computer assistance by the pilot of modern aircraft systems. Modern machinery cannot be properly used without this assistance.
Today's CAM systems support the full range of machine tools, including turning, 5-axis machining, waterjet, laser/plasma cutting, and wire EDM. Today's CAM user can easily generate streamlined tool paths, optimized tool axis tilt for higher feed rates, better tool life and surface finish, and ideal cutting depth. In addition to programming cutting operations, modern CAM software can also drive non-cutting operations such as machine tool probing.
Integration with PLM and the extended enterprise
Companies use PLM to integrate manufacturing with enterprise operations from concept through field support of the finished product.
To ensure ease of use appropriate to user objectives, modern CAM solutions are scalable from a stand-alone CAM system to a fully integrated multi-CAD 3D solution-set. These solutions are created to meet the full needs of manufacturing personnel, including part planning, shop documentation, resource management, and data management and exchange. To keep these solutions free of detailed tool-specific information, a dedicated tool management system is typically used.

Machining process

Most machining progresses through many stages, each of which is implemented by a variety of basic and sophisticated strategies, depending on the part design, material, and software available.
Roughing
This process usually begins with raw stock, known as billet, or a rough casting, which a CNC machine cuts roughly to the shape of the final model, ignoring the fine details. In milling, the result often gives the appearance of terraces or steps, because the strategy takes multiple "steps" down the part as it removes material, taking best advantage of the machine's ability to cut material horizontally. Common strategies are zig-zag clearing, offset clearing, plunge roughing, rest-roughing, and trochoidal milling (adaptive clearing). The goal at this stage is to remove the most material in the least time, without much concern for overall dimensional accuracy. When roughing a part, a small amount of extra material is purposely left behind to be removed in subsequent finishing operation(s).
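The "stepping" behaviour can be sketched by computing the Z levels a roughing cycle descends through while leaving a finishing allowance. The function and parameter names are illustrative, not from any specific CAM API.

```python
# Sketch: Z levels for a roughing cycle that steps down from the top of
# the stock, stopping `allowance` above the finished floor so the extra
# material can be removed by finishing. Names are illustrative.

def roughing_levels(stock_top, part_floor, stepdown, allowance):
    """Return descending Z levels (mm) for a stepped roughing cycle."""
    target = part_floor + allowance   # lowest level roughing may reach
    levels = []
    z = stock_top - stepdown
    while z > target:
        levels.append(round(z, 3))
        z -= stepdown
    levels.append(round(target, 3))   # final roughing level
    return levels

# e.g. 10 mm of stock above a floor at Z0, 2 mm stepdown, 0.5 mm left
levels = roughing_levels(stock_top=10.0, part_floor=0.0, stepdown=2.0, allowance=0.5)
```

Each level corresponds to one "terrace" on the roughed part; the smaller the stepdown, the finer the staircase but the longer the cycle.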
Semi-finishing
This process begins with a roughed part that unevenly approximates the model, and cuts to within a fixed offset distance from the model. The semi-finishing pass must leave a small amount of material (the scallop) so the tool can cut accurately, but not so little that the tool and material deflect away from the cutting surfaces. Common strategies are raster passes, waterline passes, constant step-over passes, and pencil milling.
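The amount of material left between passes is governed by a standard geometric relation between step-over and scallop height for a ball-nose cutter on a flat surface; the sketch below applies that formula with illustrative values.

```python
import math

# Standard geometry for a ball-nose cutter on a flat surface:
#   scallop height h = r - sqrt(r^2 - (s/2)^2)
# where r is the tool radius and s is the step-over distance.

def scallop_height(tool_radius, stepover):
    """Height of the cusp left between adjacent passes."""
    return tool_radius - math.sqrt(tool_radius**2 - (stepover / 2) ** 2)

def stepover_for(tool_radius, max_scallop):
    """Largest step-over that keeps the scallop under `max_scallop`."""
    return 2 * math.sqrt(tool_radius**2 - (tool_radius - max_scallop) ** 2)

# A 6 mm ball nose (r = 3 mm) with a 0.5 mm step-over leaves roughly a
# 0.01 mm scallop -- small enough for many semi-finishing passes.
h = scallop_height(tool_radius=3.0, stepover=0.5)
```

On curved surfaces the real scallop varies with local curvature, which is why CAM systems compute it along the toolpath rather than with a single closed-form value.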
Finishing
Finishing involves many light passes across the material in fine steps to produce the finished part. When finishing a part, the steps between passes are minimal to prevent tool deflection and material spring-back. To reduce the lateral tool load, tool engagement is reduced, while feed rates and spindle speeds are generally increased to maintain a target surface speed (SFM). A light chip load at high feed and RPM is often referred to as high-speed machining (HSM), and can provide quick machining times with high-quality results. The result of these lighter passes is a highly accurate part with a uniformly high surface finish. In addition to modifying speeds and feeds, machinists often use finishing-specific endmills, which are never used as roughing endmills. This protects the endmill from developing chips and flaws in the cutting surface, which would leave streaks and blemishes on the final part.
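The relation between target surface speed, spindle RPM and feed rate follows the standard shop formulas; the tool diameter, SFM and chip load below are illustrative numbers, not recommendations.

```python
import math

# Standard imperial-unit shop formulas:
#   SFM = (pi * D * RPM) / 12   =>   RPM = 12 * SFM / (pi * D)
#   feed (in/min) = RPM * flutes * chip load per flute (in)

def rpm_from_sfm(sfm, tool_diameter_in):
    """Spindle speed needed to hit a target surface speed (SFM)."""
    return 12 * sfm / (math.pi * tool_diameter_in)

def feed_rate(rpm, flutes, chip_load_in):
    """Table feed in inches per minute for a given chip load."""
    return rpm * flutes * chip_load_in

# Illustrative: 1/2" 4-flute endmill, 400 SFM, 0.002" chip load
rpm = rpm_from_sfm(sfm=400, tool_diameter_in=0.5)
ipm = feed_rate(rpm, flutes=4, chip_load_in=0.002)
```

Note how reducing the chip load while raising RPM (the HSM regime described above) keeps the surface speed constant but lightens the cut.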
Contour milling
In milling applications on hardware with four or more axes, a separate finishing process called contouring can be performed. Instead of stepping down in fine-grained increments to approximate a surface, the work piece is rotated to make the cutting surfaces of the tool tangent to the ideal part features. This produces an excellent surface finish with high dimensional accuracy. This process is commonly used to machine complex organic shapes such as turbine and impeller blades, which, due to their complex curves and overlapping geometry, are impossible to machine with only three-axis machines.

Business software

From Wikipedia, the free encyclopedia

Business software (or a business application) is any software or set of computer programs used by business users to perform various business functions. These business applications are used to increase productivity, to measure productivity and to perform other business functions accurately.

By and large, business software is likely to be developed to meet the needs of a specific business, and therefore is not easily transferable to a different business environment unless its nature and operation are identical. Due to the unique requirements of each business, off-the-shelf software is unlikely to completely address a company's needs. However, where an off-the-shelf solution is chosen, due to time or monetary considerations, some level of customization is likely to be required. Exceptions do exist, depending on the business in question, and thorough research is always required before committing to bespoke or off-the-shelf solutions.

Some business applications are interactive, i.e., they have a graphical user interface through which users can query, modify or input data and view results instantaneously, and can also run reports on demand. Other business applications run in batch mode: they are set up to run based on a predetermined event or time, and a business user does not need to initiate or monitor them.

Some business applications are built in-house and some are bought from vendors (off-the-shelf software products). These business applications are installed on either desktops or large servers. Prior to the introduction of COBOL (a universal compiler) in 1965, businesses developed their own unique machine languages. RCA's language consisted of a 12-position instruction. For example, to read a record into memory, the first two digits would be the instruction (action) code. The next four positions of the instruction (the 'A' address) would be the exact leftmost memory location where the record's first character was to be placed, and another four positions (the 'B' address) would note the exact rightmost memory location for the record's last character. A two-digit 'B' address modifier also allowed modification of any instruction. Instruction codes and memory designations excluded the use of 8s and 9s. The first RCA business application was implemented in 1962 on a 4k RCA 301. The RCA 301, mid-frame 501, and large-frame 601 began their marketing in early 1960.
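The 12-position layout described above can be sketched as a small decoder. The field widths follow the description (2-digit action code, 4-digit 'A' address, 4-digit 'B' address, 2-digit modifier); the field names and the sample instruction are illustrative, not actual RCA 301 code.

```python
# Hypothetical decoder for the 12-position RCA-style instruction layout
# described above. The sample instruction is made up for illustration.

def decode(instruction):
    """Split a 12-digit instruction into its fields."""
    assert len(instruction) == 12 and instruction.isdigit()
    # Per the description, codes and addresses excluded the digits 8 and 9.
    assert all(d not in "89" for d in instruction)
    return {
        "action": instruction[0:2],           # 2-digit action code
        "a_address": int(instruction[2:6]),   # leftmost memory location
        "b_address": int(instruction[6:10]),  # rightmost memory location
        "modifier": instruction[10:12],       # 2-digit modifier
    }

# Hypothetical "read record" into locations 100..175, no modifier
fields = decode("120100017500")
```

Programming at this level meant hand-tracking absolute memory locations for every record, which is exactly the burden compilers like COBOL removed.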

Many kinds of users are found within the business environment, and they can be categorized using a small, medium and large matrix.
Technologies that previously existed only in peer-to-peer software applications, like Kazaa and Napster, are starting to appear within business applications.

Types of business tools

  • Enterprise software application (Esa)
  • Resource Management
  • Enterprise Resource Planning (ERP)
  • Digital dashboards, also known as business intelligence dashboards, enterprise dashboards, or executive dashboards. These are visually based summaries of business data that provide an at-a-glance understanding of conditions through metrics and key performance indicators (KPIs). Dashboards are very popular tools that have arisen in the last few years.
  • Online analytical processing (OLAP, including HOLAP, ROLAP and MOLAP) is a capability of some management, decision support, and executive information systems that supports interactive examination of large amounts of data from many perspectives.
  • Reporting software generates aggregated views of data to keep the management informed about the state of their business.
  • Procurement software is business software that helps to automate the purchasing function of organizations.
  • Data mining is the extraction of consumer information from a database by utilizing software that can isolate and identify previously unknown patterns or trends in large amounts of data. There is a variety of data mining techniques that reveal different types of patterns. Some of the techniques that belong here are statistical methods (particularly business statistics) and neural networks, as very advanced means of analyzing data.
  • Business performance management (BPM)
  • Document management software is made for organizing and managing multiple documents of various types. Some of them have storage functions for security and back-up of valuable business information.
  • Employee scheduling software – used for creating and distributing employee schedules, as well as for tracking employee hours.

Brief history

The essential motivation for business software is to increase profits by cutting costs or speeding the productive cycle. In the earliest days of white-collar business automation, large mainframe computers were used to tackle the most tedious jobs, like bank cheque clearing and factory accounting. 

Factory accounting software was among the most popular of early business software tools, and included the automation of general ledgers, fixed assets inventory ledgers, cost accounting ledgers, accounts receivable ledgers, and accounts payable ledgers (including payroll, life insurance, health insurance, federal and state insurance and retirement).

The early use of software to replace manual white-collar labor was extremely profitable, and caused a radical shift in white-collar labor. One computer might easily replace 100 white-collar 'pencil pushers', and the computer would not require any health or retirement benefits.

Building on these early successes with IBM, Hewlett-Packard and other early suppliers of business software solutions, corporate consumers demanded business software to replace the old-fashioned drafting board. CAD-CAM software (or computer-aided drafting for computer-aided manufacturing) arrived in the early 1980s. Also, project management software was so valued in the early 1980s that it might cost as much as $500,000 per copy (although such software typically had far fewer capabilities than modern project management software such as Microsoft Project, which one might purchase today for under $500 per copy.)

In the early days, perhaps the most noticeable, widespread change in business software was the word processor. Because of its rapid rise, the ubiquitous IBM typewriter suddenly vanished in the 1980s as millions of companies worldwide shifted to WordPerfect business software and, later, Microsoft Word. Other vastly popular business programs were spreadsheet programs such as Lotus 1-2-3 and, later, Microsoft Excel.

In the 1990s business shifted massively towards globalism with the appearance of SAP software, which coordinates a supply chain of vendors, potentially worldwide, for the most efficient, streamlined operation of factory manufacturing.

Yet nothing in the history of business software has had the global impact of the Internet, with its email and websites that now serve commercial interests worldwide. Globalism in business fully arrived when the Internet became a household word.

The next phase in the evolution of business software is being led by the emergence of robotic process automation (RPA), which involves identifying and automating highly repetitive tasks and processes, with the aim of driving operational efficiency, reducing costs and limiting human error. Industries that have been at the forefront of RPA adoption include insurance, banking and financial services, the legal industry, and healthcare.

Application support

Business applications are built based on requirements from business users, and are built to handle certain kinds of business transactions or data items. These applications run flawlessly as long as there are no new business requirements, no changes in the underlying business transactions, and no issues with computer hardware, computer networks (Internet/intranet), disks, power supplies, or the various software components involved (middleware, database, computer programs, etc.).

Business applications can fail when an unexpected error occurs. This error could be a data error (unexpected or wrong data input), an environment error (an infrastructure-related error), a programming error, a human error or a workflow error. When a business application fails, the error needs to be fixed as soon as possible so that the business users can resume their work. This work of resolving business application errors is known as business application support.

Reporting errors

The business user calls the business application support team's phone number or sends them an e-mail. The support team gets all the details of the error from the business user on the phone or from the e-mail. These details are then entered into tracking software, which creates a request number that is given to the business user. This request number is used to track progress on the support issue, and the request is assigned to a support team member.
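The request-tracking flow just described can be sketched in a few lines; the class, field names and request-number format are illustrative assumptions, not any specific tracking product's API.

```python
import itertools

# Minimal sketch of an error-tracking workflow: each reported error gets
# a request number the business user can quote when following up.
# All names and the numbering scheme are illustrative.

class SupportTracker:
    def __init__(self):
        self._seq = itertools.count(1)   # sequential request numbers
        self.requests = {}

    def open_request(self, reporter, description):
        """Record a reported error and return its tracking number."""
        number = f"REQ-{next(self._seq):05d}"
        self.requests[number] = {
            "reporter": reporter,
            "description": description,
            "status": "assigned",   # assigned to a support team member
        }
        return number

tracker = SupportTracker()
req = tracker.open_request("b.user@example.com", "Report screen shows no data")
```

A real system would additionally time-stamp requests, attach severity, and feed the periodic reporting described later in this section.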

Notification of errors

For critical business application errors (such as an application not available or an application not working correctly), an e-mail is sent to the entire organization or impacted teams so that they are aware of the issue. They are also provided with an estimated time for application availability.

Investigation or analysis of application errors

The business application support team member collects all the necessary information about the business software error. This information is then recorded in the support request. All of the data used by the business user is also used in the investigation. The application program is reviewed for any possible programming errors.

Error resolution

If any similar business application errors occurred in the past then the issue resolution steps are retrieved from the support knowledge base and the error is resolved using those steps. If it is a new support error, then new issue resolution steps are created and the error is resolved. The new support error resolution steps are recorded in the knowledge base for future use. For major business application errors (critical infrastructure or application failures), a phone conference call is initiated and all required support persons/teams join the call and they all work together to resolve the error.

Code correction

If the business application error occurred due to programming errors, then a request is created for the application development team to correct programming errors. If the business user needs new features or functions in the business application, then the required analysis/design/programming/testing/release is planned and a new version of the business software is deployed.

Business process correction

If the business application error occurred due to a workflow issue or human errors during data input, then the business users are notified. Business users then review their workflow and revise it if necessary. They also modify the user guide or user instructions to avoid such an error in the future.

Infrastructure issue correction

If the business application error occurred due to infrastructure issues, then the specific infrastructure team is notified. The infrastructure team then implements permanent fixes for the issue and monitors the infrastructure to avoid the recurrence of the same error.

Support follow up and internal reporting

The business application error tracking system is used to review all issues periodically (daily, weekly and monthly) and reports are generated to monitor the resolved issues, repeating issues, and pending issues. Reports are also generated for the IT/IS management for improvement and management of business applications.

Impact of the COVID-19 pandemic on science and technology

From Wikipedia, the free encyclopedia
 
The COVID-19 pandemic has affected many science, space and technology institutions and government agencies worldwide, leading to reduced productivity in a number of fields and programs. It has also opened several new lines of research funding in governmental agencies around the world.

Science

The pandemic may have improved scientific communication or established new forms of it. For instance, a great deal of data is being released on preprint servers and dissected on social Internet platforms, and sometimes in the media, before entering formal peer review. Scientists are reviewing, editing, analyzing and publishing manuscripts and data at record speeds and in large numbers. This intense communication may have allowed an unusual level of collaboration and efficiency among scientists. Francis Collins notes that while he hasn't seen research move faster, the pace of research "can still feel slow" during a pandemic. The typical model for research has been considered too slow for the "urgency of the coronavirus threat". A number of factors shape how much and which scientific knowledge can be established in a timely manner.

World Health Organization

On 4 May 2020, the World Health Organization (WHO) organized a telethon to raise US$8 billion from forty countries to support rapid development of vaccines to prevent COVID-19 infections, also announcing deployment of an international "Solidarity trial" for simultaneous evaluation of several vaccine candidates reaching Phase II-III clinical trials. The "Solidarity trial for treatments" is a multinational Phase III-IV clinical trial organized by the WHO and partners to compare four untested treatments for hospitalized people with severe COVID-19 illness. The trial was announced 18 March 2020, and as of 21 April, over 100 countries were participating. The WHO is also coordinating a multiple-site, international randomized controlled trial  – the "Solidarity trial for vaccines" – to enable simultaneous evaluation of the benefits and risks of different vaccine candidates under clinical trials in countries where there are high rates of COVID-19 disease, ensuring fast interpretation and sharing of results around the world. The WHO vaccine coalition will prioritize which vaccines should go into Phase II and III clinical trials, and determine harmonized Phase III protocols for all vaccines achieving the pivotal trial stage.

The Coalition for Epidemic Preparedness Innovations (CEPI) – which is organizing a US$2 billion worldwide fund for rapid investment and development of vaccine candidates – indicated in April that a vaccine may be available under emergency use protocols in less than 12 months or by early 2021.

National and intergovernmental laboratories

United States Department of Energy federal scientific laboratories, such as the Oak Ridge National Laboratory, have closed their doors to all visitors and many employees, and non-essential staff and scientists are required to work from home if possible. Contractors are also strongly advised to isolate their facilities and staff unless necessary. The overall operation of the ORNL remains somewhat unaffected.

The Lawrence Livermore National Laboratory has been tasked by the White House Coronavirus Task Force with devoting most of its supercomputing capability to further research on the virus strain, possible mutations and other factors, while temporarily reducing other projects or delaying them indefinitely.

The European Molecular Biology Laboratory has closed all six sites across Europe (Barcelona, Grenoble, Hamburg, Heidelberg, Hinxton and Rome). All of EMBL's host governments have introduced strict controls in response to the coronavirus. EMBL staff have been instructed to follow local government advice. A small number of staff have been authorized to attend the sites to provide an essential service such as maintenance of animal facilities or data services. All other staff have been instructed to stay at home. EMBL has also cancelled all visits to sites by non-staff groups. This includes physical participation in the Courses and Conferences programme at Heidelberg, the EMBL-EBI Training courses, and all other seminars, courses and public visits at all sites. Meanwhile, the European Bioinformatics Institute is creating a European COVID-19 Data Platform for data/information exchange. The goal is to collect and share rapidly available research data to enable synergies, cross-fertilisation and use of diverse data sets with different degrees of aggregation, validation and/or completeness. The platform is envisaged to consist of two connected components, the SARS-CoV-2 Data Hubs organising the flow of SARS-CoV-2 outbreak sequence data and providing comprehensive open data sharing for the European and global research communities, and one broader COVID-19 Portal.

The World Meteorological Organization expressed concern about the observation system. Observations from the Aircraft Meteorological Data Relay programme, which uses in-flight measurements from the fleets of 43 airlines, were reduced by 50% to 80% depending on region. Data from other automated systems was largely unaffected, though the WMO expressed fears that repairs and maintenance may be affected. Manual observations, mostly from developing countries, also saw a significant decrease.

Open science

The need to accelerate open scientific research led several civil society organizations to create an Open COVID Pledge asking different industries to release their intellectual property rights during the pandemic to help find a cure for the disease. Several tech giants joined the pledge, which includes the release of an Open COVID license. Organizations that have long advocated for open access, such as Creative Commons, implemented a myriad of calls and actions to promote open access in science as a key element in combating the disease. These include a public call for more open access policies, and a call for scientists to adopt zero embargo periods for their publications, apply a CC BY license to their articles, and use a CC0 waiver for their research data. Other organizations questioned the current scientific culture, calling for more open, public science.

Computing research and citizen science

In March 2020, the United States Department of Energy, National Science Foundation, NASA, industry, and nine universities pooled resources to access supercomputers from IBM, combined with cloud computing resources from Hewlett Packard Enterprise, Amazon, Microsoft, and Google, for drug discovery. The COVID‑19 High Performance Computing Consortium is also being used to forecast disease spread, model possible vaccines, and screen thousands of chemical compounds to design a COVID‑19 vaccine or therapy.

The C3.ai Digital Transformation Institute, an additional consortium of Microsoft, six universities (including the Massachusetts Institute of Technology, a member of the first consortium), and the National Center for Supercomputing Applications in Illinois, working under the auspices of C3.ai, a company founded by Thomas Siebel, is pooling supercomputer resources toward drug discovery, medical protocol development and public health strategy improvement, as well as awarding large grants to researchers who propose to use AI to carry out similar tasks by May.

In March 2020, the distributed computing project Folding@home launched a program to assist medical researchers around the world. The initial wave of projects is meant to simulate potential protein targets from the SARS-CoV-2 virus and the related SARS-CoV virus, which has been studied previously.

Resources for computer science and scientific crowdsourcing projects concerning COVID-19 can be found on the internet or as apps. Examples of such projects are listed below:
  • The Eterna OpenVaccine project enables video game players to "design an mRNA encoding a potential vaccine against the novel coronavirus."
  • The EU-Citizen.Science project has "a selection of resources related to the current COVID19 pandemic. It contains links to citizen science and crowdsourcing projects"
  • The COVID-19 Citizen Science project is "a new initiative by University of California, San Francisco physician-scientists" that "will allow anyone in the world age 18 or over to become a citizen scientist advancing understanding of the disease."
  • The CoronaReport digital journalism project is "a citizen science project which democratizes the reporting on the Coronavirus, and makes these reports accessible to other citizens."
  • The COVID Symptom Tracker is a crowdsourced study of the symptoms of the virus. It had two million downloads by April 2020.
  • The Covid Near You epidemiology tool "uses crowdsourced data to visualize maps to help citizens and public health agencies identify current and potential hotspots for the recent pandemic coronavirus, COVID-19."

Space

NASA

The James Webb Space Telescope's launch has been delayed
 
Components of the Space Launch System

NASA announced the temporary closure of all its field center visitor complexes until further notice, as well as requiring all non-critical personnel to work from home if possible. Production and manufacture of the Space Launch System at the Michoud Assembly Facility was stopped, and further delays to the James Webb Space Telescope are expected.

The majority of personnel at the Johnson Space Center transitioned to teleworking, and International Space Station mission critical personnel were instructed to reside in the mission control room until further notice. Station operations are relatively unaffected, but new expedition astronauts face longer and stricter quarantines before flight.

NASA's emergency response framework has varied depending on local virus cases around its agency field centers. As of March 24, 2020, two facilities had been escalated to stage 4 after reporting new coronavirus cases: the Michoud Assembly Facility, reporting its first employee testing positive for COVID-19, and the Stennis Space Center, recording a second case of a member of the NASA community with the virus. The Kennedy Space Center was held at stage 3 after one member of the workforce tested positive; due to the mandatory telework policy already in effect, the individual had not been on site for over a week prior to symptoms. On May 18, the Michoud facility began to resume SLS work operations, but so far remains at stage 3.

At stage 4, mandatory telework is in effect for all personnel, with the exception of limited personnel required for mission-essential work and to care-take and maintain the safety and security of the facility.

ESA

The European Space Agency has ordered many of its science and technology facilities' workforce to also telework as much as possible.

Recent developments, including strengthened restrictions by national, regional and local authorities across Europe and the first positive test result for COVID-19 within the workforce at the European Space Operations Centre, have led the agency to restrict on-site personnel at its mission control centres even further.

ESA's Director of Operations, Rolf Densing, has strongly recommended that mission personnel reduce the activity of scientific missions, especially on interplanetary spacecraft.

The affected spacecraft are currently in stable orbits and have long mission durations, so turning off their science instruments and placing them into a largely unattended safe configuration for a certain period will have a negligible impact on their overall mission performance.

Among the affected missions are:
  • Cluster – A four-spacecraft mission launched in 2000, orbiting Earth to investigate our planet's magnetic environment and how it is forged by the solar wind, the stream of charged particles constantly released by the Sun;
  • ExoMars Trace Gas Orbiter – Launched in 2016, the spacecraft is in orbit around Mars, where it has been investigating the planet's atmosphere and providing data relay for landers on the surface;
  • Mars Express – Launched in 2003, the workhorse orbiter has been imaging the Martian surface and sampling the planet's atmosphere for over one and a half decades;
  • Solar Orbiter – ESA's newest science mission, launched in February 2020 and currently en route to its science operations orbit around the Sun.
ESA's Director of Science, Günther Hasinger, commented: "It was a difficult decision, but the right one to take. Our greatest responsibility is the safety of people, and I know all of us in the science community understand why this is necessary".

The temporary reduction in personnel on site will also allow the ESOC teams to concentrate on maintaining spacecraft safety for all other missions involved, in particular, the Mercury explorer BepiColombo, which is on its way to the innermost planet in the Solar System and will require some on-site support around its scheduled Earth flyby on 10 April.

The challenging manoeuvre, which will use Earth's gravity to adjust BepiColombo's trajectory as it cruises towards Mercury, will be performed by a very small number of engineers and in full respect of social distancing and other health and hygiene measures required by the current situation. Commissioning and first check-out operations of scientific instruments on the recently launched Solar Orbiter, which had begun last month, have been temporarily suspended.

ESA expects to resume these operations in the near future, in line with the development of the coronavirus situation. Meanwhile, Solar Orbiter will continue its journey towards the Sun, with the first Venus flyby to take place in December.

JAXA

The Japan Aerospace Exploration Agency (JAXA) has seen its space and science operations largely unaffected. However, all visits to its numerous field centers were suspended until April 30 to reduce the risk of contamination.

Commercial aerospace

Bigelow Aerospace announced on March 23, 2020, that it was laying off all 88 of its employees, saying it would hire workers back once restrictions imposed by the pandemic were lifted. World View, based in Tucson, Arizona, announced on April 17, 2020, that it had halted new business initiatives and furloughed an unstated number of employees in order to reduce cash outflow. The company had also received rent deferments from Pima County, Arizona.

OneWeb filed for bankruptcy on 27 March 2020, following a cash crunch amid difficulties raising the capital needed to build and deploy the remaining 90% of its network. The company had already laid off approximately 85% of its 531 employees, but said it would maintain satellite operational capabilities while the court restructures it and new owners for the constellation are sought.

Rocket Lab temporarily closed its New Zealand launch site, but operations continued at its Wallops Flight Facility launch complex.

Larger companies such as SpaceX and Boeing have remained largely unaffected economically, apart from adopting extra safety precautions and measures to limit the spread of the virus in their workplaces. As of April 16, Blue Origin stated that it was continuing to hire staff, growing by around 20 employees each week. ULA implemented an internal pandemic plan; while some aspects of launch-related outreach were scaled back, the company made clear its intention to maintain its launch schedule.

Telecommunications

The coronavirus placed a huge strain on internet traffic, with BT Group and Vodafone reporting increases in broadband usage of 60% and 50% respectively. Netflix, Disney+, Google, Amazon and YouTube considered reducing their video quality to prevent overload. Meanwhile, Sony began slowing down PlayStation game downloads in Europe and the United States to manage traffic levels.

Cellular service providers in mainland China have reported significant drops in subscriber numbers, partially due to migrant workers being unable to return to work as a result of quarantine lockdowns; China Mobile saw a reduction of 8 million subscribers, while China Unicom had 7.8 million fewer subscribers, and China Telecom lost 5.6 million users.

Teleconferencing has served as a replacement for cancelled events as well as daily business meetings and social contacts. Teleconference companies such as Zoom Video Communications have seen a sharp increase in usage, accompanied by attendant technical problems like bandwidth overcrowding and social problems like Zoombombing.

Virtual happy hours for "quarantinis" have been held using the technology, and even virtual dance parties.
