
Thursday, May 23, 2024

Open-source hardware

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Open-source_hardware
The "open source hardware" logo proposed by OSHWA, one of the main defining organizations
The RepRap Mendel general-purpose 3D printer with the ability to make copies of most of its own structural parts

Open-source hardware (OSH, OSHW) consists of physical artifacts of technology designed and offered by the open-design movement. Both free and open-source software (FOSS) and open-source hardware are created by this open-source culture movement, which applies a like concept to a variety of components; the combination is therefore sometimes referred to as FOSH (free and open-source hardware). The term usually means that information about the hardware is readily available so that others can make it, coupling it closely to the maker movement. The hardware design (i.e. mechanical drawings, schematics, bills of materials, PCB layout data, HDL source code and integrated circuit layout data), in addition to the software that drives the hardware, is released under free/libre terms. The original sharer gains feedback and potentially improvements on the design from the FOSH community. There is now significant evidence that such sharing can drive a high return on investment for the scientific community.

It is not enough to merely use an open-source license; an open source product or project will follow open source principles, such as modular design and community collaboration.

Since the rise of reconfigurable programmable logic devices, sharing of logic designs has been a form of open-source hardware. Instead of the schematics, hardware description language (HDL) code is shared. HDL descriptions are commonly used to set up system-on-a-chip systems either in field-programmable gate arrays (FPGA) or directly in application-specific integrated circuit (ASIC) designs. HDL modules, when distributed, are called semiconductor intellectual property cores, also known as IP cores.

Open-source hardware also helps alleviate the issue of proprietary device drivers for the free and open-source software community; however, it is not a prerequisite for it, and it should not be confused with open documentation for proprietary hardware, which already suffices for writing FLOSS device drivers and complete operating systems. The difference between the two concepts is that OSH includes both the instructions for replicating the hardware itself and the information on the communication protocols that the software (usually in the form of device drivers) must use to communicate with the hardware (often called register documentation, or open documentation for hardware), whereas open-source-friendly proprietary hardware includes only the latter.

History

[Image: openhardware.org logo (2013)]
[Image: OSHWA logo]

The first hardware-focused "open source" activities were started around 1997 by Bruce Perens, creator of the Open Source Definition, co-founder of the Open Source Initiative, and a ham radio operator. He launched the Open Hardware Certification Program, which had the goal of allowing hardware manufacturers to self-certify their products as open.

Shortly after the launch of the Open Hardware Certification Program, David Freeman announced the Open Hardware Specification Project (OHSpec), another attempt at licensing hardware components whose interfaces are available publicly and at creating an entirely new computing platform as an alternative to proprietary computing systems. In early 1999, Sepehr Kiani, Ryan Vallance and Samir Nayfeh joined efforts to apply the open-source philosophy to machine design applications. Together they established the Open Design Foundation (ODF) as a non-profit corporation and set out to develop an Open Design Definition. However, most of these activities faded out after a few years.

A "Free Hardware" organization, known as FreeIO, was started in the late 1990s by Diehl Martin, who also launched a FreeIO website in early 2000. In the early to mid 2000s, FreeIO was a focus of free/open hardware designs released under the GNU General Public License. The FreeIO project advocated the concept of Free Hardware and proposed four freedoms that such hardware provided to users, based on the similar freedoms provided by free software licenses. The designs gained some notoriety due to Martin's naming scheme in which each free hardware project was given the name of a breakfast food such as Donut, Flapjack, Toast, etc. Martin's projects attracted a variety of hardware and software developers as well as other volunteers. Development of new open hardware designs at FreeIO ended in 2007 when Martin died of pancreatic cancer but the existing designs remain available from the organization's website.

By the mid-2000s open-source hardware again became a hub of activity due to the emergence of several major open-source hardware projects and companies, such as OpenCores, RepRap (3D printing), Arduino, Adafruit, SparkFun, and Open Source Ecology. In 2007, Perens reactivated the openhardware.org website, but as of August 2023 it is inactive.

Following the Open Graphics Project, an effort to design, implement, and manufacture a free and open 3D graphics chip set and reference graphics card, Timothy Miller suggested the creation of an organization to safeguard the interests of the Open Graphics Project community. Thus, Patrick McNamara founded the Open Hardware Foundation (OHF) in 2007.

The Tucson Amateur Packet Radio Corporation (TAPR), founded in 1982 as a non-profit organization of amateur radio operators with the goal of supporting R&D efforts in the area of amateur digital communications, created the first open hardware license, the TAPR Open Hardware License, in 2007. The OSI president Eric S. Raymond expressed some concerns about certain aspects of the OHL and decided not to review the license.

Around 2010, in the context of the Freedom Defined project, the Open Hardware Definition was created as a collaborative work of many contributors; as of 2016 it is accepted by dozens of organizations and companies.

In July 2011, CERN (European Organization for Nuclear Research) released an open-source hardware license, CERN OHL. Javier Serrano, an engineer at CERN's Beams Department and the founder of the Open Hardware Repository, explained: "By sharing designs openly, CERN expects to improve the quality of designs through peer review and to guarantee their users – including commercial companies – the freedom to study, modify and manufacture them, leading to better hardware and less duplication of efforts". While initially drafted to address CERN-specific concerns, such as tracing the impact of the organization's research, in its current form it can be used by anyone developing open-source hardware.

Following the 2011 Open Hardware Summit, and after heated debates on licenses and what constitutes open-source hardware, Bruce Perens abandoned the OSHW Definition and the concerted efforts of those involved with it. Openhardware.org, led by Perens, promoted and identified practices that meet all the combined requirements of the Open Source Hardware Definition, the Open Source Definition, and the Four Freedoms of the Free Software Foundation. Since 2014, openhardware.org has not been online and appears to have ceased activity.

The Open Source Hardware Association (OSHWA) at oshwa.org acts as a hub of open-source hardware activity of all genres, while cooperating with other entities such as TAPR, CERN, and OSI. The OSHWA was established as an organization in June 2012 in Delaware and filed for tax-exempt status in July 2013. After some debates about trademark interference with the OSI, in 2012 the OSHWA and the OSI signed a co-existence agreement.

The Free Software Foundation has suggested an alternative "free hardware" definition derived from the Four Freedoms.

Forms of open-source hardware

The term hardware in open-source hardware has historically been used in opposition to the term software in open-source software, that is, to refer to the electronic hardware on which the software runs (see previous section). However, as more and more non-electronic hardware products are made open source (for example WikiHouse, OpenBeam or Hovalin), the term tends to revert to its broader sense of "physical product". The field of open-source hardware has been shown to go beyond electronic hardware and to cover a larger range of product categories such as machine tools, vehicles and medical equipment. In that sense, hardware refers to any form of tangible product, be it electronic hardware, mechanical hardware, textile or even construction hardware. The Open Source Hardware (OSHW) Definition 1.0 defines hardware as "tangible artifacts — machines, devices, or other physical things".

Electronics

Electronics is one of the most popular types of open-source hardware. Many companies provide large varieties of open-source electronics, such as SparkFun, Adafruit, and Seeed. In addition, there are NPOs and companies that provide a specific open-source electronic component, such as the Arduino electronics prototyping platform. There are many examples of specialty open-source electronics, such as a low-cost open-source voltage and current monitor for GMAW-based 3-D printing and a robotics-assisted mass spectrometry assay platform. Open-source electronics finds various uses, including the automation of chemical procedures.

Mecha(tro)nics

A large range of open-source mechatronic products have been developed, including mechanical components, machine tools, vehicles, musical instruments, and medical equipment.

Examples of open-source machine tools include 3D printers such as RepRap, Prusa, and Ultimaker; 3D printer filament extruders such as the polystruder XR PRO; and the laser cutter Lasersaur. Open-source vehicles have also been developed, including bicycles such as the XYZ Space Frame Vehicles and cars such as the Tabby OSVehicle. Examples of open-source medical equipment include open-source ventilators, the echostethoscope echOpen, and a wide range of prosthetic hands listed in the review study by Ten Kate et al. (e.g. OpenBionics' prosthetic hands).

Chip design

OSH chip designs are now common. RISC-V is an open instruction set architecture which has several OSH implementations. LowRISC is working on a complete OSH system on chip.

Other

Examples of open-source hardware products can also be found to a lesser extent in construction (Wikihouse), textile (Kit Zéro Kilomètres), and firearms (3D printed firearm, Defense Distributed).

Licenses

Rather than creating a new license, some open-source hardware projects use existing, free and open-source software licenses. These licenses may not accord well with patent law.

Later, several new licenses were proposed, designed to address issues specific to hardware design. In these licenses, many of the fundamental principles expressed in open-source software (OSS) licenses have been "ported" to their counterpart hardware projects. New hardware licenses are often explained as the "hardware equivalent" of a well-known OSS license, such as the GPL, LGPL, or BSD license.

Despite superficial similarities to software licenses, most hardware licenses are fundamentally different: by nature, they typically rely more heavily on patent law than on copyright law, as many hardware designs are not copyrightable. Whereas a copyright license may control the distribution of the source code or design documents, a patent license may control the use and manufacturing of the physical device built from the design documents. This distinction is explicitly mentioned in the preamble of the TAPR Open Hardware License:

"... those who benefit from an OHL design may not bring lawsuits claiming that design infringes their patents or other intellectual property."

— TAPR Open Hardware License

Noteworthy licenses include:

The Open Source Hardware Association recommends seven licenses that follow its open-source hardware definition: among general copyleft licenses, the GNU General Public License (GPL) and the Creative Commons Attribution-ShareAlike license; among hardware-specific copyleft licenses, the CERN Open Hardware License (OHL) and the TAPR Open Hardware License (OHL); and among permissive licenses, the FreeBSD license, the MIT license, and the Creative Commons Attribution license. In 2012, openhardware.org recommended the TAPR Open Hardware License, Creative Commons BY-SA 3.0, and the GPL 3.0 license.

Organizations tend to rally around a shared license. For example, OpenCores prefers the LGPL or a Modified BSD License, FreeCores insists on the GPL, Open Hardware Foundation promotes "copyleft or other permissive licenses", the Open Graphics Project uses a variety of licenses, including the MIT license, GPL, and a proprietary license, and the Balloon Project wrote their own license.

Development

[Image: The OSHW (Open Source Hardware) logo silkscreened on an unpopulated PCB]

The adjective "open-source" not only refers to a specific set of freedoms applying to a product, but also generally presupposes that the product is the object or the result of a "process that relies on the contributions of geographically dispersed developers via the Internet." In practice however, in both fields of open-source hardware and open-source software, products may either be the result of a development process performed by a closed team in a private setting or by a community in a public environment, the first case being more frequent than the second which is more challenging. Establishing a community-based product development process faces several challenges such as: to find appropriate product data management tools, document not only the product but also the development process itself, accepting losing ubiquitous control over the project, ensure continuity in a context of fickle participation of voluntary project members, among others.

[Image: The Arduino Diecimila, an early and popular open-source hardware design]

One of the major differences between developing open-source software and developing open-source hardware is that hardware results in tangible outputs, which cost money to prototype and manufacture. As a result, the phrase "free as in speech, not as in beer", more formally known as gratis versus libre, distinguishes between the idea of zero cost and the freedom to use and modify information. While open-source hardware faces challenges in minimizing cost and reducing financial risks for individual project developers, some community members have proposed models to address these needs. Given this, there are initiatives to develop sustainable community funding mechanisms, such as the Open Source Hardware Central Bank.

Extensive discussion has taken place on ways to make open-source hardware as accessible as open-source software. Providing clear and detailed product documentation is an essential factor facilitating product replication and collaboration in hardware development projects. Practical guides have been developed to help practitioners to do so. Another option is to design products so they are easy to replicate, as exemplified in the concept of open-source appropriate technology.

The process of developing open-source hardware in a community-based setting is alternatively called open design, open-source development, or open-source product development. All these terms are examples of the open-source model applied to the development of any product, including software, hardware, cultural, and educational works. Open questions remain as to whether the open design and open-source hardware design process involves new design practices or requires new tools, and whether openness is really the key issue in OSH.

A major contributor to the production of open-source hardware product designs is the scientific community. There has been considerable work to produce open-source hardware for scientific hardware using a combination of open-source electronics and 3-D printing. Other sources of open-source hardware production are vendors of chips and other electronic components sponsoring contests with the provision that the participants and winners must share their designs. Circuit Cellar magazine organizes some of these contests.

Open-source labs

A guide has been published (Open-Source Lab, a book by Joshua Pearce) on using open-source electronics and 3D printing to make open-source labs, and scientists are now creating many such labs.

Business models

Open hardware companies are experimenting with business models. For example, littleBits implements open-source business models by making available the circuit designs in each electronics module, in accordance with the CERN Open Hardware License Version 1.2. Another example is Arduino, which registered its name as a trademark; others may manufacture products from Arduino designs but cannot call the products Arduino products. There are many applicable business models for implementing some open-source hardware even in traditional firms. For example, to accelerate development and technical innovation, the photovoltaic industry has experimented with partnerships, franchises, secondary supplier and completely open-source models.

Recently, many open-source hardware projects have been funded via crowdfunding on platforms such as Indiegogo, Kickstarter, or Crowd Supply.

Reception and impact

Richard Stallman, the founder of the free software movement, was in 1999 skeptical of the idea and relevance of free hardware (his terminology for what is now known as open-source hardware). In a 2015 article in Wired magazine, he modified this attitude: he acknowledged the importance of free hardware, but still saw no ethical parallel with free software. Stallman also prefers the term free hardware design over open source hardware, a preference consistent with his earlier rejection of the term open source software (see also Alternative terms for free software). Other authors, such as Professor Joshua Pearce, have argued there is an ethical imperative for open-source hardware – specifically with respect to open-source appropriate technology for sustainable development. In 2014, he also wrote the book Open-Source Lab: How to Build Your Own Hardware and Reduce Research Costs, which details the development of free and open-source hardware primarily for scientists and university faculty. Pearce, in partnership with Elsevier, introduced the scientific journal HardwareX, which has featured many examples of applications of open-source hardware for scientific purposes.

Further, Vasilis Kostakis et al. have argued that open-source hardware may promote values of equity, diversity, and sustainability. Open-source hardware initiatives transcend traditional dichotomies of global-local, urban-rural, and developed-developing contexts. They may leverage cultural differences, environmental conditions, and local needs and resources, while embracing hyper-connectivity, to foster sustainability and collaboration rather than conflict. However, open-source hardware does face some challenges and contradictions. It must navigate tensions between inclusiveness, standardization, and functionality. Additionally, while open-source hardware may reduce pressure on natural resources and local populations, it still relies on energy- and material-intensive infrastructures, such as the Internet. Despite these complexities, Kostakis et al. argue, the open-source hardware framework can serve as a catalyst for connecting and unifying diverse local initiatives under radical narratives, thus inspiring genuine change.

OSH has grown as an academic field through two journals, the Journal of Open Hardware (JOH) and HardwareX. These journals compete to publish the best OSH designs, and each defines its own requirements for what constitutes acceptable quality of design documents, including specific requirements for build instructions, bills of materials, CAD files, and licenses. These requirements are often used by other OSH projects to define how to do an OSH release. The journals also publish papers contributing to the debate about how OSH should be defined and used.

Basic research

From Wikipedia, the free encyclopedia

Basic research, also called pure research, fundamental research, basic science, or pure science, is a type of scientific research with the aim of improving scientific theories for better understanding and prediction of natural or other phenomena. In contrast, applied research uses scientific theories to develop technology or techniques, which can be used to intervene and alter natural or other phenomena. Though often driven simply by curiosity, basic research often fuels the technological innovations of applied science. The two aims are often practiced simultaneously in coordinated research and development.

In addition to innovations, basic research also serves to provide insight into nature around us and allows us to respect its innate value. The development of this respect is what drives conservation efforts. Through learning about the environment, conservation efforts can be strengthened using research as a basis. Technological innovations can unintentionally be created through this as well, as seen with examples such as kingfishers' beaks affecting the design for high speed bullet trains in Japan.

Overview

Despite smart people working on this problem for 50 years, we're still discovering surprisingly basic things about the earliest history of our world. It's quite humbling. — Matija Ćuk, scientist at the SETI Institute and lead researcher, November 2016

Basic research advances fundamental knowledge about the world. It focuses on creating and refuting or supporting theories that explain observed phenomena. Pure research is the source of most new scientific ideas and ways of thinking about the world. It can be exploratory, descriptive, or explanatory; however, explanatory research is the most common.

Basic research generates new ideas, principles, and theories, which may not be immediately utilized but nonetheless form the basis of progress and development in different fields. Today's computers, for example, could not exist without research in pure mathematics conducted over a century ago, for which there was no known practical application at the time. Basic research rarely helps practitioners directly with their everyday concerns; nevertheless, it stimulates new ways of thinking that have the potential to revolutionize and dramatically improve how practitioners deal with a problem in the future.

History

By country

In the United States, basic research is funded mainly by the federal government and done mainly at universities and institutes. As government funding has diminished in the 2010s, however, private funding has become increasingly important.

Basic versus applied science

Applied science focuses on the development of technology and techniques. In contrast, basic science develops scientific knowledge and predictions, principally in natural sciences but also in other empirical sciences, which are used as the scientific foundation for applied science. Basic science develops and establishes information to predict phenomena and perhaps to understand nature, whereas applied science uses portions of basic science to develop interventions via technology or technique to alter events or outcomes. Applied and basic sciences can interface closely in research and development. The interface between basic research and applied research has been studied by the National Science Foundation.

A worker in basic scientific research is motivated by a driving curiosity about the unknown. When his explorations yield new knowledge, he experiences the satisfaction of those who first attain the summit of a mountain or the upper reaches of a river flowing through unmapped territory. Discovery of truth and understanding of nature are his objectives. His professional standing among his fellows depends upon the originality and soundness of his work. Creativeness in science is of a cloth with that of the poet or painter.

The National Science Foundation conducted a study in which it traced the relationship between basic scientific research efforts and the development of major innovations, such as oral contraceptives and videotape recorders. This study found that basic research played a key role in the development of all of the innovations. The amount of basic science research that assisted in the production of a given innovation peaked between 20 and 30 years before the innovation itself. While most innovation takes the form of applied science and most innovation occurs in the private sector, basic research is a necessary precursor to almost all applied science and associated instances of innovation. Roughly 76% of basic research is conducted by universities.

A distinction can be made between basic science and disciplines such as medicine and technology. They can be grouped as STM (science, technology, and medicine; not to be confused with STEM [science, technology, engineering, and mathematics]) or STS (science, technology, and society). These groups are interrelated and influence each other, although they may differ in the specifics such as methods and standards.

The Nobel Prize mixes basic with applied sciences for its award in Physiology or Medicine. In contrast, the Royal Society of London awards distinguish natural science from applied science.

Wednesday, May 22, 2024

Open-design movement

From Wikipedia, the free encyclopedia
[Image: RepRap general-purpose 3D printer, which can make structures and functional components for open-design projects and is itself an open-source project]
[Image: Uzebox, an open-design video game console]
[Image: Bug Labs open-source hardware]
[Image: Zoybar open-source guitar kit with 3-D printed body]

The open-design movement involves the development of physical products, machines and systems through the use of publicly shared design information. This includes the making of both free and open-source software (FOSS) and open-source hardware. The process is generally facilitated by the Internet and often performed without monetary compensation. The goals and philosophy of the movement are identical to those of the open-source movement, but are implemented for the development of physical products rather than software. Open design is a form of co-creation, in which the final product is designed by the users rather than an external stakeholder such as a private company.

Origin

Sharing of manufacturing information can be traced back to the 18th and 19th century. Aggressive patenting put an end to that period of extensive knowledge sharing. More recently, principles of open design have been related to the free and open-source software movements. In 1997 Eric S. Raymond, Tim O'Reilly and Larry Augustin established "open source" as an alternative expression to "free software", and in 1997 Bruce Perens published The Open Source Definition. In late 1998, Dr. Sepehr Kiani (a PhD in mechanical engineering from MIT) realized that designers could benefit from open source policies, and in early 1999 he convinced Dr. Ryan Vallance and Dr. Samir Nayfeh of the potential benefits of open design in machine design applications. Together they established the Open Design Foundation (ODF) as a non-profit corporation, and set out to develop an Open Design Definition.

The idea of open design was taken up, either simultaneously or subsequently, by several other groups and individuals. The principles of open design are closely similar to those of open-source hardware design, which emerged in March 1998 when Reinoud Lamberts of the Delft University of Technology proposed on his "Open Design Circuits" website the creation of a hardware design community in the spirit of free software.

Ronen Kadushin coined the title "Open Design" in his 2004 Master's thesis, and the term was later formalized in the 2010 Open Design Manifesto.

Current directions

[Image: Open Source Ecology – open-source farming and industrial machinery]

The open-design movement currently unites two trends. On one hand, people apply their skills and time to projects for the common good, perhaps where funding or commercial interest is lacking, for developing countries, or to help spread ecological or cheaper technologies. On the other hand, open design may provide a framework for developing advanced projects and technologies that might be beyond the resources of any single company or country and involve people who, without the copyleft mechanism, might not otherwise collaborate. There is now also a third trend, in which these two methods come together to use high-tech open-source tools (e.g. 3D printing) for customized local solutions for sustainable development. Open design holds great potential for driving future innovation, as recent research has shown that stakeholder users working together produce more innovative designs than designers consulting users through more traditional means. The open-design movement may arguably organize production by prioritising socio-ecological well-being over corporate profits, over-production and excess consumption.

Open machine design as compared to open-source software

The open-design movement is currently fairly nascent but holds great potential for the future. In some respects design and engineering are even more suited to open collaborative development than the increasingly common open-source software projects, because with 3D models and photographs the concept can often be understood visually. It is not even necessary that the project members speak the same languages to usefully collaborate.

However, there are certain barriers to overcome for open design when compared to software development where there are mature and widely used tools available and the duplication and distribution of code cost next to nothing. Creating, testing and modifying physical designs is not quite so straightforward because of the effort, time and cost required to create the physical artefact; although with access to emerging flexible computer-controlled manufacturing techniques the complexity and effort of construction can be significantly reduced (see tools mentioned in the fab lab article).

Organizations

[Image: VIA OpenBook reference design CAD visualisation]

Open design was considered in 2012 a fledgling movement consisting of several unrelated or loosely related initiatives. Many of these organizations are single, funded projects, while a few focus on an area needing development. In some cases (e.g. Thingiverse for 3D-printable designs or Appropedia for open-source appropriate technology), organizations make an effort to create a centralized open-source design repository, as this enables innovation.

Software rot

From Wikipedia, the free encyclopedia

Software rot (bit rot, code rot, software erosion, software decay, or software entropy) is the deterioration of software quality or performance over time that leads to it becoming faulty, unusable, or needing upgrade.

Since software cannot physically decay, the term is hyperbole. The process is due to either changes in the source code or to the environment in which the software operates.

The Jargon File, a compendium of hacker lore, defines "bit rot" as a jocular explanation for the degradation of a software program over time even if "nothing has changed"; the idea behind this is almost as if the bits that make up the program were subject to radioactive decay.

Causes

Several factors are responsible for software rot, including changes to the environment in which the software operates, degradation of compatibility between parts of the software itself, and the emergence of bugs in unused or rarely used code.

Environment change

When changes occur in the program's environment, particularly changes which the designer of the program did not anticipate, the software may no longer operate as originally intended. For example, many early computer game designers used the CPU clock speed as a timer in their games. However, newer CPU clocks were faster, so the gameplay speed increased accordingly, making the games less usable over time.
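As a minimal sketch only (the constant LOOPS_PER_TICK and the program structure are invented for illustration), the following C program shows the kind of calibrated delay loop such games relied on. Because the loop's duration depends only on how fast the CPU executes it, running the same binary on a faster processor shortens every "tick" and speeds up the game.

    /* Sketch of a CPU-speed-dependent game timer (illustrative only). */
    #include <stdio.h>

    #define LOOPS_PER_TICK 500000L   /* hand-calibrated for the original CPU */

    static void wait_one_tick(void)
    {
        volatile long i;                   /* volatile keeps the loop from being optimized away */
        for (i = 0; i < LOOPS_PER_TICK; i++)
            ;                              /* busy-wait: duration depends on CPU clock speed */
    }

    int main(void)
    {
        for (int frame = 0; frame < 3; frame++) {
            /* ... update and draw the game here ... */
            wait_one_tick();               /* frame rate is tied to the CPU clock */
            printf("frame %d\n", frame);
        }
        return 0;
    }

Timing against a real-time clock instead of an instruction-count loop is the usual fix, which is why the rot here comes from the environment (a faster CPU) rather than from any change to the code.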

Onceability

Some changes in the environment are caused not by the program's designer but by its users. Initially, a user could bring the system into working order and have it working flawlessly for a certain amount of time. But when the system stops working correctly, or the users want to access the configuration controls, they cannot repeat that initial step because of the different context and the unavailable information (a lost password, missing instructions, or simply a hard-to-manage user interface that was first configured by trial and error). Information architect Jonas Söderström has named this concept onceability, and defines it as "the quality in a technical system that prevents a user from restoring the system, once it has failed".

Unused code

Infrequently used portions of code, such as document filters or interfaces designed to be used by other programs, may contain bugs that go unnoticed. With changes in user requirements and other external factors, this code may be executed later, thereby exposing the bugs and making the software appear less functional.
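An invented C illustration of such a latent bug: an import filter written when every file used two-digit years. The assumption stays harmless as long as the filter is never exercised with newer data, and only surfaces when changed requirements finally route four-digit years through this path.

    /* Illustrative only: a rarely used filter with a hidden assumption. */
    #include <stdio.h>

    /* Assumes the year field is always "YY"; written when that was true. */
    static int parse_year(const char *s)
    {
        return 1900 + (s[0] - '0') * 10 + (s[1] - '0');
    }

    int main(void)
    {
        printf("%d\n", parse_year("87"));   /* 1987, as originally intended   */
        printf("%d\n", parse_year("2024")); /* 1920: the latent bug surfaces  */
        return 0;
    }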

Rarely updated code

Normal maintenance of software and systems may also cause software rot. In particular, when a program contains multiple parts which function at arm's length from one another, failing to consider how changes to one part affect the others may introduce bugs.

In some cases, this may take the form of libraries that the software uses being changed in a way which adversely affects the software. If the old version of a library that previously worked with the software can no longer be used due to conflicts with other software or security flaws that were found in the old version, there may no longer be a viable version of a needed library for the program to use.

Online connectivity

Modern commercial software often connects to an online server for license verification and accessing information. If the online service powering the software is shut down, it may stop working.

Since the late 2010s most websites use secure HTTPS connections. However, establishing such connections requires that the device trust root certificates, which have expiration dates. After the certificates expire, the device loses connectivity to most websites unless its certificate store is kept updated.

Another issue is that in March 2021 old encryption standards TLS 1.0 and TLS 1.1 were deprecated. This means that operating systems, browsers and other online software that do not support at least TLS 1.2 cannot connect to most websites, even to download patches or update the browser, if these are available. This is occasionally called the "TLS apocalypse".

Products that cannot connect to most websites include PowerMacs, old Unix boxes and Microsoft Windows versions older than Server 2008/Windows 7. The Internet Explorer 8 browser in Server 2008/Windows 7 does support TLS 1.2 but it is disabled by default.
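As a hedged sketch (assuming OpenSSL 1.1.0 or newer and linking with -lssl -lcrypto), the following C fragment shows the client-side setting at issue: a context that refuses anything older than TLS 1.2. Software whose TLS library cannot be configured this way, or cannot offer TLS 1.2 at all, is what the paragraphs above describe as losing connectivity.

    /* Minimal OpenSSL sketch: require TLS 1.2 or newer (illustrative only). */
    #include <openssl/ssl.h>
    #include <stdio.h>

    int main(void)
    {
        SSL_CTX *ctx = SSL_CTX_new(TLS_client_method());
        if (!ctx) { fprintf(stderr, "SSL_CTX_new failed\n"); return 1; }

        /* Connections offering only TLS 1.0/1.1 will now be refused. */
        if (!SSL_CTX_set_min_proto_version(ctx, TLS1_2_VERSION)) {
            fprintf(stderr, "could not set minimum TLS version\n");
            SSL_CTX_free(ctx);
            return 1;
        }

        printf("context requires TLS >= 1.2\n");
        SSL_CTX_free(ctx);
        return 0;
    }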

Classification

Software rot is usually classified as being either 'dormant rot' or 'active rot'.

Dormant rot

Software that is not currently being used gradually becomes unusable as the remainder of the application changes. Changes in user requirements and the software environment also contribute to the deterioration.

Active rot

Software that is being continuously modified may lose its integrity over time if proper mitigating processes are not consistently applied. However, much software requires continuous changes to meet new requirements and correct bugs, and re-engineering software each time a change is made is rarely practical. This creates what is essentially an evolution process for the program, causing it to depart from the original engineered design. As a consequence of this and a changing environment, assumptions made by the original designers may be invalidated, thereby introducing bugs.

In practice, adding new features may be prioritized over updating documentation; without documentation, however, it is possible for specific knowledge pertaining to parts of the program to be lost. To some extent, this can be mitigated by following best current practices for coding conventions.

Active software rot slows once an application is near the end of its commercial life and further development ceases. Users often learn to work around any remaining software bugs, and the behaviour of the software becomes consistent as nothing is changing.

Examples

AI program example

Many seminal programs from the early days of AI research have suffered from irreparable software rot. For example, the original SHRDLU program (an early natural language understanding program) cannot be run on any modern day computer or computer simulator, as it was developed during the days when LISP and PLANNER were still in development stage, and thus uses non-standard macros and software libraries which do not exist anymore.

Forked online forum example

Suppose an administrator creates a forum using open source forum software, and then heavily modifies it by adding new features and options. This process requires extensive modifications to existing code and deviation from the original functionality of that software.

From here, there are several ways software rot can affect the system:

  • The administrator can accidentally make changes which conflict with each other or the original software, causing the forum to behave unexpectedly or break down altogether. This leaves them in a very bad position: as they have deviated so greatly from the original code, technical support and assistance in reviving the forum will be difficult to obtain.
  • A security hole may be discovered in the original forum source code, requiring a security patch. However, because the administrator has modified the code so extensively, the patch may not be directly applicable to their code, requiring the administrator to effectively rewrite the update.
  • The administrator who made the modifications could vacate their position, leaving the new administrator with a convoluted and heavily modified forum that lacks full documentation. Without fully understanding the modifications, it is difficult for the new administrator to make changes without introducing conflicts and bugs. Furthermore, documentation of the original system may no longer be available, or worse yet, misleading due to subtle differences in functional requirements.

Wiki example

Suppose a webmaster installs the latest version of MediaWiki, the software that powers wikis such as Wikipedia, then never applies any updates. Over time, the web host is likely to update their versions of the programming language (such as PHP) and the database (such as MariaDB) without consulting the webmaster. After a long enough time, this will eventually break complex websites that have not been updated, because the latest versions of PHP and MariaDB will have breaking changes as they hard deprecate certain built-in functions, breaking backwards compatibility and causing fatal errors. Other problems that can arise with un-updated website software include security vulnerabilities and spam.

Refactoring

Refactoring is a means of addressing the problem of software rot. It is described as the process of rewriting existing code to improve its structure without affecting its external behaviour. This includes removing dead code and rewriting sections that have been modified extensively and no longer work efficiently. Care must be taken not to change the software's external behaviour, as this could introduce incompatibilities and thereby itself contribute to software rot. Design principles to consider when refactoring include maintaining the hierarchical structure of the code and implementing abstraction to simplify and generalize code structures.
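A small, invented C illustration of behaviour-preserving refactoring: the rewritten function returns exactly the same results as the original for every input, but the duplicated branches accumulated through successive patches are folded into one expression.

    /* Before/after refactoring example (names and logic are invented). */
    #include <stdio.h>

    /* Before: duplicated logic. */
    static int shipping_cost_old(int weight_kg, int express)
    {
        if (express) {
            if (weight_kg > 10) return weight_kg * 3 + 20;
            else                return weight_kg * 3 + 10;
        } else {
            if (weight_kg > 10) return weight_kg * 2 + 20;
            else                return weight_kg * 2 + 10;
        }
    }

    /* After: same external behaviour, clearer structure. */
    static int shipping_cost_new(int weight_kg, int express)
    {
        int rate      = express ? 3 : 2;
        int surcharge = (weight_kg > 10) ? 20 : 10;
        return weight_kg * rate + surcharge;
    }

    int main(void)
    {
        /* Cross-check the two versions over a range of inputs. */
        for (int w = 1; w <= 15; w++)
            for (int e = 0; e <= 1; e++)
                if (shipping_cost_old(w, e) != shipping_cost_new(w, e))
                    printf("mismatch at w=%d e=%d\n", w, e);
        printf("old and new versions agree\n");
        return 0;
    }

The cross-check in main is the kind of safety net (here a brute-force comparison, in practice a test suite) that guards against accidentally changing external behaviour while refactoring.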

Decompiler

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Decompiler

A decompiler is a computer program that translates an executable file into high-level source code. It therefore does the opposite of a typical compiler, which translates a high-level language into a low-level one. While disassemblers translate an executable into assembly language, decompilers go a step further and translate the code into a higher-level language such as C or Java, which requires more sophisticated techniques. Decompilers are usually unable to perfectly reconstruct the original source code and thus frequently produce obfuscated code. Nonetheless, they remain an important tool in the reverse engineering of computer software.

Introduction

The term decompiler is most commonly applied to a program which translates executable programs (the output from a compiler) into source code in a (relatively) high level language which, when compiled, will produce an executable whose behavior is the same as the original executable program. By comparison, a disassembler translates an executable program into assembly language (and an assembler could be used for assembling it back into an executable program).

Decompilation is the act of using a decompiler, although the term can also refer to the output of a decompiler. It can be used for the recovery of lost source code, and is also useful in some cases for computer security, interoperability and error correction. The success of decompilation depends on the amount of information present in the code being decompiled and the sophistication of the analysis performed on it. The bytecode formats used by many virtual machines (such as the Java Virtual Machine or the .NET Framework Common Language Runtime) often include extensive metadata and high-level features that make decompilation quite feasible. The presence of debug data, i.e. debug symbols, may make it possible to reproduce the original names of variables and structures and even the line numbers. Machine language without such metadata or debug data is much harder to decompile.

Some compilers and post-compilation tools produce obfuscated code (that is, they attempt to produce output that is very difficult to decompile, or that decompiles to confusing output). This is done to make it more difficult to reverse engineer the executable.

While decompilers are normally used to (re-)create source code from binary executables, there are also decompilers to turn specific binary data files into human-readable and editable sources.

The success level achieved by decompilers can be affected by various factors. These include the abstraction level of the source language: if the object code contains explicit class structure information, this aids the decompilation process. Descriptive information, especially naming details, also accelerates the decompiler's work. Moreover, less optimized code is quicker to decompile, since optimization can cause greater deviation from the original code.

Design

Decompilers can be thought of as composed of a series of phases each of which contributes specific aspects of the overall decompilation process.

Loader

The first decompilation phase loads and parses the input machine code or intermediate language program's binary file format. It should be able to discover basic facts about the input program, such as the architecture (Pentium, PowerPC, etc.) and the entry point. In many cases, it should be able to find the equivalent of the main function of a C program, which is the start of the user-written code. This excludes the runtime initialization code, which should not be decompiled if possible. If available, the symbol tables and debug data are also loaded. The front end may be able to identify the libraries used even if they are linked with the code; this will provide library interfaces. If it can determine the compiler or compilers used, it may provide useful information for identifying code idioms.
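As an illustration only (not part of the article), the following C sketch shows the kind of facts a loader extracts from a 64-bit ELF binary: the magic number, the target architecture, and the entry point. The field and constant names come from the standard <elf.h> header; a real decompiler front end would support many more formats (PE, Mach-O, 32-bit ELF, etc.).

    /* Minimal ELF loader sketch (64-bit ELF only, illustrative). */
    #include <elf.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: %s <elf-file>\n", argv[0]); return 1; }

        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        Elf64_Ehdr eh;
        if (fread(&eh, sizeof eh, 1, f) != 1 ||
            memcmp(eh.e_ident, ELFMAG, SELFMAG) != 0) {
            fprintf(stderr, "not an ELF file\n");
            fclose(f);
            return 1;
        }

        /* e_machine identifies the architecture, e_entry the entry point. */
        printf("machine: %d (%s)\n", (int)eh.e_machine,
               eh.e_machine == EM_X86_64  ? "x86-64"  :
               eh.e_machine == EM_AARCH64 ? "AArch64" : "other");
        printf("entry point: 0x%llx\n", (unsigned long long)eh.e_entry);

        fclose(f);
        return 0;
    }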

Disassembly

The next logical phase is the disassembly of machine code instructions into a machine independent intermediate representation (IR). For example, the Pentium machine instruction

mov    eax, [ebx+0x04]

might be translated to the IR

eax  := m[ebx+4];
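A minimal sketch, with invented C types (ir_insn and translate_mov_load are not from the article), of how a disassembler might represent that IR statement in memory before later phases rewrite it:

    /* Toy IR for a register load from memory (illustrative only). */
    #include <stdio.h>

    typedef enum { IR_LOAD } ir_op;      /* only the case needed here */

    typedef struct {
        ir_op op;
        const char *dst;     /* destination register        */
        const char *base;    /* base register of the address */
        int offset;          /* displacement                 */
    } ir_insn;

    /* Translate "mov eax, [ebx+0x04]" into "eax := m[ebx+4]". */
    static ir_insn translate_mov_load(const char *dst, const char *base, int off)
    {
        ir_insn i = { IR_LOAD, dst, base, off };
        return i;
    }

    int main(void)
    {
        ir_insn i = translate_mov_load("eax", "ebx", 4);
        printf("%s := m[%s+%d];\n", i.dst, i.base, i.offset);  /* eax := m[ebx+4]; */
        return 0;
    }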

Idioms

Idiomatic machine code sequences are sequences of code whose combined semantics are not immediately apparent from the instructions' individual semantics. Either as part of the disassembly phase, or as part of later analyses, these idiomatic sequences need to be translated into known equivalent IR. For example, the x86 assembly code:

    cdq    eax             ; edx is set to the sign-extension of eax
    xor    eax, edx
    sub    eax, edx

could be translated to

eax  := abs(eax);

Some idiomatic sequences are machine independent; some involve only one instruction. For example, xor eax, eax clears the eax register (sets it to zero). This can be implemented with a machine independent simplification rule, such as a = 0.
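A minimal sketch of such a machine-independent simplification rule, using an invented textual IR; the type and function names are assumptions for illustration only:

    /* Rewrite "xor r, r" to "r := 0" in a toy IR (illustrative only). */
    #include <stdio.h>
    #include <string.h>

    typedef struct {
        char op[8];     /* "xor", "mov", ... */
        char dst[8];
        char src[8];
    } insn;

    /* Returns 1 if the instruction was rewritten to a constant assignment. */
    static int simplify_xor_self(insn *i)
    {
        if (strcmp(i->op, "xor") == 0 && strcmp(i->dst, i->src) == 0) {
            strcpy(i->op, "mov");
            strcpy(i->src, "0");        /* dst := 0 */
            return 1;
        }
        return 0;
    }

    int main(void)
    {
        insn i = { "xor", "eax", "eax" };
        if (simplify_xor_self(&i))
            printf("%s := %s;\n", i.dst, i.src);   /* eax := 0; */
        return 0;
    }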

In general, it is best to delay detection of idiomatic sequences if possible, to later stages that are less affected by instruction ordering. For example, the instruction scheduling phase of a compiler may insert other instructions into an idiomatic sequence, or change the ordering of instructions in the sequence. A pattern matching process in the disassembly phase would probably not recognize the altered pattern. Later phases group instruction expressions into more complex expressions, and modify them into a canonical (standardized) form, making it more likely that even the altered idiom will match a higher level pattern later in the decompilation.

It is particularly important to recognize the compiler idioms for subroutine calls, exception handling, and switch statements. Some languages also have extensive support for strings or long integers.

Program analysis

Various program analyses can be applied to the IR. In particular, expression propagation combines the semantics of several instructions into more complex expressions. For example,

    mov   eax,[ebx+0x04]
    add   eax,[ebx+0x08]
    sub   [ebx+0x0C],eax

could result in the following IR after expression propagation:

m[ebx+12]  := m[ebx+12] - (m[ebx+4] + m[ebx+8]);

The resulting expression is more like high level language, and has also eliminated the use of the machine register eax. Later analyses may eliminate the ebx register.

Data flow analysis

The places where register contents are defined and used must be traced using data flow analysis. The same analysis can be applied to locations that are used for temporaries and local data. A different name can then be formed for each such connected set of value definitions and uses. It is possible that the same local variable location was used for more than one variable in different parts of the original program. Even worse, it is possible for the data flow analysis to identify a path along which a value may flow between two such uses even though it would never actually happen or matter in reality. This may, in bad cases, force a location to be defined as a union of types. The decompiler may allow the user to explicitly break such unnatural dependencies, which will lead to clearer code. This, of course, means a variable is then potentially used without being initialized, and so indicates a problem in the original program.
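For illustration (the variable names are invented), the following C source shows why one stack location can correspond to two unrelated variables: the lifetimes of n and ratio do not overlap, so a compiler may reuse the same stack slot for both. A decompiler that does not separate the live ranges may then have to describe that slot as a union of int and float.

    /* Two lifetime-disjoint locals that a compiler may place in one slot. */
    #include <stdio.h>

    void report(int items, float total)
    {
        {
            int n = items * 2;          /* first use of the slot: an int   */
            printf("doubled: %d\n", n);
        }
        {
            float ratio = total / 3.0f; /* disjoint lifetime: slot reused  */
            printf("ratio: %f\n", ratio);
        }
    }

    int main(void) { report(4, 9.0f); return 0; }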

Type analysis

A good machine code decompiler will perform type analysis. Here, the way registers or memory locations are used results in constraints on the possible type of the location. For example, an and instruction implies that its operand is an integer; programs do not use such an operation on floating point values (except in special library code) or on pointers. An add instruction results in three constraints, since the operands may be both integer, or one integer and one pointer (with integer and pointer results respectively; the third constraint comes from the ordering of the two operands when the types are different).

Various high level expressions can be recognized which trigger recognition of structures or arrays. However, it is difficult to distinguish many of the possibilities, because of the freedom that machine code or even some high level languages such as C allow with casts and pointer arithmetic.

The example from the previous section could result in the following high level code:

struct T1 *ebx;
struct T1 {
    int v0004;
    int v0008;
    int v000C;
};
ebx->v000C -= ebx->v0004 + ebx->v0008;

Structuring

The penultimate decompilation phase involves structuring of the IR into higher level constructs such as while loops and if/then/else conditional statements. For example, the machine code

    xor eax, eax
l0002:
    or  ebx, ebx
    jge l0003
    add eax,[ebx]
    mov ebx,[ebx+0x4]
    jmp l0002
l0003:
    mov [0x10040000],eax

could be translated into:

eax = 0;
while (ebx < 0) {
    eax += ebx->v0000;
    ebx = ebx->v0004;
}
v10040000 = eax;

Unstructured code is more difficult to translate into structured code than already structured code. Solutions include replicating some code, or adding boolean variables.
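A small, invented C example of those two approaches: the unstructured flow (shown in the comment) jumps into the middle of a loop, and it can be made structured either by replicating part of the loop body or by adding a boolean variable. Both versions print the same output as the original flow.

    /* Structuring unstructured control flow (illustrative only).
     *
     * Original, unstructured flow:
     *           goto middle;
     *   top:    printf("A\n");
     *   middle: printf("B\n");
     *           if (--n > 0) goto top;
     */
    #include <stdbool.h>
    #include <stdio.h>

    static void with_replication(int n)
    {
        printf("B\n");                 /* replicated copy of the loop's tail */
        while (--n > 0) {
            printf("A\n");
            printf("B\n");
        }
    }

    static void with_boolean(int n)
    {
        bool first = true;             /* boolean variable replaces the jump */
        do {
            if (!first)
                printf("A\n");         /* skipped only on the first pass */
            printf("B\n");
            first = false;
        } while (--n > 0);
    }

    int main(void)
    {
        with_replication(2);           /* prints B, A, B */
        with_boolean(2);               /* prints B, A, B */
        return 0;
    }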

Code generation

The final phase is the generation of the high level code in the back end of the decompiler. Just as a compiler may have several back ends for generating machine code for different architectures, a decompiler may have several back ends for generating high level code in different high level languages.

Just before code generation, it may be desirable to allow an interactive editing of the IR, perhaps using some form of graphical user interface. This would allow the user to enter comments, and non-generic variable and function names. However, these are almost as easily entered in a post decompilation edit. The user may want to change structural aspects, such as converting a while loop to a for loop. These are less readily modified with a simple text editor, although source code refactoring tools may assist with this process. The user may need to enter information that failed to be identified during the type analysis phase, e.g. modifying a memory expression to an array or structure expression. Finally, incorrect IR may need to be corrected, or changes made to cause the output code to be more readable.

Other techniques

Decompilers using neural networks have been developed. Such a decompiler may be trained by machine learning to improve its accuracy over time.

Legality

The majority of computer programs are covered by copyright laws. Although the precise scope of what is covered by copyright differs from region to region, copyright law generally provides the author (the programmer(s) or employer) with a collection of exclusive rights to the program. These rights include the right to make copies, including copies made into the computer’s RAM (unless creating such a copy is essential for using the program). Since the decompilation process involves making multiple such copies, it is generally prohibited without the authorization of the copyright holder. However, because decompilation is often a necessary step in achieving software interoperability, copyright laws in both the United States and Europe permit decompilation to a limited extent.

In the United States, the copyright fair use defence has been successfully invoked in decompilation cases. For example, in Sega v. Accolade, the court held that Accolade could lawfully engage in decompilation in order to circumvent the software locking mechanism used by Sega's game consoles. Additionally, the Digital Millennium Copyright Act (PUBLIC LAW 105–304) has proper exemptions for both Security Testing and Evaluation in §1201(i), and Reverse Engineering in §1201(f).

In Europe, the 1991 Software Directive explicitly provides for a right to decompile in order to achieve interoperability. The result of a heated debate between, on the one side, software protectionists, and, on the other, academics as well as independent software developers, Article 6 permits decompilation only if a number of conditions are met:

  • First, a person or entity must have a licence to use the program to be decompiled.
  • Second, decompilation must be necessary to achieve interoperability with the target program or other programs. The interoperability information must therefore not already be readily available, for example through manuals or API documentation. This is an important limitation. The necessity must be proven by the party performing the decompilation. The purpose of this limitation is primarily to provide an incentive for developers to document and disclose their products' interoperability information.
  • Third, the decompilation process must, if possible, be confined to the parts of the target program relevant to interoperability. Since one of the purposes of decompilation is to gain an understanding of the program structure, this third limitation may be difficult to meet. Again, the burden of proof is on the decompiler.

In addition, Article 6 prescribes that the information obtained through decompilation may not be used for other purposes and that it may not be given to others.

Overall, the decompilation right provided by Article 6 codifies what is claimed to be common practice in the software industry. Few European lawsuits are known to have emerged from the decompilation right. This could be interpreted as meaning one of three things:

  1. the decompilation right is not used frequently and the decompilation right may therefore have been unnecessary,
  2. the decompilation right functions well and provides sufficient legal certainty not to give rise to legal disputes, or
  3. illegal decompilation goes largely undetected.

In a report of 2000 regarding implementation of the Software Directive by the European member states, the European Commission seemed to support the second interpretation.

Inequality (mathematics)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Inequality...