
Wednesday, September 30, 2020

Supramolecular chemistry

From Wikipedia, the free encyclopedia

Supramolecular chemistry refers to the area of chemistry concerning chemical systems composed of a discrete number of molecules. The strength of the forces responsible for spatial organization of the system ranges from weak intermolecular forces, electrostatic charge, or hydrogen bonding to strong covalent bonding, provided that the electronic coupling strength remains small relative to the energy parameters of the component. Whereas traditional chemistry concentrates on the covalent bond, supramolecular chemistry examines the weaker and reversible non-covalent interactions between molecules. These forces include hydrogen bonding, metal coordination, hydrophobic forces, van der Waals forces, pi–pi interactions and electrostatic effects.

Important concepts advanced by supramolecular chemistry include molecular self-assembly, molecular folding, molecular recognition, host–guest chemistry, mechanically interlocked molecular architectures, and dynamic covalent chemistry. The study of non-covalent interactions is crucial to understanding many biological processes that rely on these forces for structure and function. Biological systems are often the inspiration for supramolecular research.


History

The existence of intermolecular forces was first postulated by Johannes Diderik van der Waals in 1873. However, Nobel laureate Hermann Emil Fischer developed supramolecular chemistry's philosophical roots. In 1894, Fischer suggested that enzyme–substrate interactions take the form of a "lock and key", an idea that underlies the fundamental principles of molecular recognition and host–guest chemistry. In the early twentieth century non-covalent bonds came to be understood in gradually greater detail, with the hydrogen bond being described by Latimer and Rodebush in 1920.

The use of these principles led to an increasing understanding of protein structure and other biological processes. For instance, the important breakthrough that allowed the elucidation of the double helical structure of DNA occurred when it was realized that there are two separate strands of nucleotides connected through hydrogen bonds. The use of non-covalent bonds is essential to replication because they allow the strands to be separated and used to template new double-stranded DNA. Concomitantly, chemists began to recognize and study synthetic structures based on non-covalent interactions, such as micelles and microemulsions.

Eventually, chemists were able to take these concepts and apply them to synthetic systems. The breakthrough came in the 1960s with the synthesis of the crown ethers by Charles J. Pedersen. Following this work, other researchers such as Donald J. Cram, Jean-Marie Lehn and Fritz Vögtle became active in synthesizing shape- and ion-selective receptors, and throughout the 1980s research in the area gathered pace rapidly, with concepts such as mechanically interlocked molecular architectures emerging.

The importance of supramolecular chemistry was established by the 1987 Nobel Prize for Chemistry, which was awarded to Donald J. Cram, Jean-Marie Lehn, and Charles J. Pedersen in recognition of their work in this area. The development of selective "host–guest" complexes in particular, in which a host molecule recognizes and selectively binds a certain guest, was cited as an important contribution.

In the 1990s, supramolecular chemistry became even more sophisticated, with researchers such as James Fraser Stoddart developing molecular machinery and highly complex self-assembled structures, and Itamar Willner developing sensors and methods of electronic and biological interfacing. During this period, electrochemical and photochemical motifs became integrated into supramolecular systems in order to increase functionality, research into synthetic self-replicating systems began, and work started on molecular information processing devices. The emerging science of nanotechnology also had a strong influence on the subject, with building blocks such as fullerenes, nanoparticles, and dendrimers becoming involved in synthetic systems.

Control

Thermodynamics

Supramolecular chemistry deals with subtle interactions, and consequently control over the processes involved can require great precision. In particular, non-covalent bonds have low energies and often no activation energy for formation. As demonstrated by the Arrhenius equation, this means that, unlike in covalent bond-forming chemistry, the rate of bond formation is not increased at higher temperatures. In fact, chemical equilibrium equations show that the low bond energy results in a shift towards the breaking of supramolecular complexes at higher temperatures.
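The temperature behaviour described above can be made concrete with two standard relations; the following is a textbook illustration rather than material from the original article. The Arrhenius equation shows why a near-zero activation energy makes the formation rate almost insensitive to temperature, and the van 't Hoff equation shows why a weakly exothermic complex dissociates on heating:

    k = A \, e^{-E_a/(RT)}

    \frac{d \ln K}{dT} = \frac{\Delta H^\circ}{R T^2}

With E_a close to zero, k is approximately the pre-exponential factor A at any accessible temperature. For a typical non-covalent association, ΔH° is negative, so the equilibrium constant K falls as T rises and the equilibrium shifts toward the dissociated components.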

However, low temperatures can also be problematic to supramolecular processes. Supramolecular chemistry can require molecules to distort into thermodynamically disfavored conformations (e.g. during the "slipping" synthesis of rotaxanes), and may include some covalent chemistry that goes along with the supramolecular. In addition, the dynamic nature of supramolecular chemistry is utilized in many systems (e.g. molecular machines), and cooling the system would slow these processes.

Thus, thermodynamics is an important tool to design, control, and study supramolecular chemistry. Perhaps the most striking example is that of warm-blooded biological systems, which entirely cease to operate outside a very narrow temperature range.

Environment

The molecular environment around a supramolecular system is also of prime importance to its operation and stability. Many solvents have strong hydrogen bonding, electrostatic, and charge-transfer capabilities, and are therefore able to become involved in complex equilibria with the system, even breaking complexes completely. For this reason, the choice of solvent can be critical.

Concepts

Molecular self-assembly

Molecular self-assembly is the construction of systems without guidance or management from an outside source (other than to provide a suitable environment). The molecules are directed to assemble through non-covalent interactions. Self-assembly may be subdivided into intermolecular self-assembly (to form a supramolecular assembly), and intramolecular self-assembly (or folding as demonstrated by foldamers and polypeptides). Molecular self-assembly also allows the construction of larger structures such as micelles, membranes, vesicles, and liquid crystals, and is important to crystal engineering.

Molecular recognition and complexation

Molecular recognition is the specific binding of a guest molecule to a complementary host molecule to form a host–guest complex. Often, the definition of which species is the "host" and which is the "guest" is arbitrary. The molecules are able to identify each other using non-covalent interactions. Key applications of this field are the construction of molecular sensors and catalysis.
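The strength of such host–guest binding is commonly quantified by an association constant; this is a standard formulation, not taken from the original article. For a 1:1 complex between a host H and a guest G,

    H + G \rightleftharpoons HG, \qquad K_a = \frac{[HG]}{[H][G]}, \qquad \Delta G^\circ = -RT \ln K_a

A larger K_a (a more negative ΔG°) corresponds to stronger and usually more selective recognition, which is precisely what a well-designed molecular sensor or supramolecular catalyst exploits.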

Template-directed synthesis

Molecular recognition and self-assembly may be used with reactive species in order to pre-organize a system for a chemical reaction (to form one or more covalent bonds). It may be considered a special case of supramolecular catalysis. Non-covalent bonds between the reactants and a "template" hold the reactive sites of the reactants close together, facilitating the desired chemistry. This technique is particularly useful for situations where the desired reaction conformation is thermodynamically or kinetically unlikely, such as in the preparation of large macrocycles. This pre-organization also serves purposes such as minimizing side reactions, lowering the activation energy of the reaction, and producing desired stereochemistry. After the reaction has taken place, the template may remain in place, be forcibly removed, or may be "automatically" decomplexed on account of the different recognition properties of the reaction product. The template may be as simple as a single metal ion or may be extremely complex.

Mechanically interlocked molecular architectures

Mechanically interlocked molecular architectures consist of molecules that are linked only as a consequence of their topology. Some non-covalent interactions may exist between the different components (often those that were utilized in the construction of the system), but covalent bonds do not. Supramolecular chemistry, and template-directed synthesis in particular, is key to the efficient synthesis of the compounds. Examples of mechanically interlocked molecular architectures include catenanes, rotaxanes, molecular knots, molecular Borromean rings and ravels.

Dynamic covalent chemistry

In dynamic covalent chemistry covalent bonds are broken and formed in a reversible reaction under thermodynamic control. While covalent bonds are key to the process, the system is directed by non-covalent forces to form the lowest energy structures.

Biomimetics

Many synthetic supramolecular systems are designed to copy functions of biological systems. These biomimetic architectures can be used to learn about both the biological model and the synthetic implementation. Examples include photoelectrochemical systems, catalytic systems, protein design and self-replication.

Imprinting

Molecular imprinting describes a process by which a host is constructed from small molecules using a suitable molecular species as a template. After construction, the template is removed leaving only the host. The template for host construction may be subtly different from the guest that the finished host binds to. In its simplest form, imprinting utilizes only steric interactions, but more complex systems also incorporate hydrogen bonding and other interactions to improve binding strength and specificity.

Molecular machinery

Molecular machines are molecules or molecular assemblies that can perform functions such as linear or rotational movement, switching, and entrapment. These devices exist at the boundary between supramolecular chemistry and nanotechnology, and prototypes have been demonstrated using supramolecular concepts. Jean-Pierre Sauvage, Sir J. Fraser Stoddart and Bernard L. Feringa shared the 2016 Nobel Prize in Chemistry for the 'design and synthesis of molecular machines'.

Building blocks

Supramolecular systems are rarely designed from first principles. Rather, chemists have a range of well-studied structural and functional building blocks that they are able to use to build up larger functional architectures. Many of these exist as whole families of similar units, from which the analog with the exact desired properties can be chosen.

Synthetic recognition motifs

Macrocycles

Macrocycles are very useful in supramolecular chemistry, as they provide whole cavities that can completely surround guest molecules and may be chemically modified to fine-tune their properties.

  • Cyclodextrins, calixarenes, cucurbiturils and crown ethers are readily synthesized in large quantities, and are therefore convenient for use in supramolecular systems.
  • More complex cyclophanes and cryptands can be synthesised to provide more tailored recognition properties.
  • Supramolecular metallocycles are macrocyclic aggregates with metal ions in the ring, often formed from angular and linear modules. Common metallocycle shapes in these types of applications include triangles, squares, and pentagons, each bearing functional groups that connect the pieces via "self-assembly."
  • Metallacrowns are metallomacrocycles generated via a similar self-assembly approach from fused chelate-rings.

Structural units

Many supramolecular systems require their components to have suitable spacing and conformations relative to each other, and therefore easily employed structural units are required.

  • Commonly used spacers and connecting groups include polyether chains, biphenyls and triphenyls, and simple alkyl chains. The chemistry for creating and connecting these units is very well understood.
  • Nanoparticles, nanorods, fullerenes and dendrimers offer nanometer-sized structure and encapsulation units.
  • Surfaces can be used as scaffolds for the construction of complex systems and also for interfacing electrochemical systems with electrodes. Regular surfaces can be used for the construction of self-assembled monolayers and multilayers.
  • The understanding of intermolecular interactions in solids has undergone a major renaissance in the last decade via inputs from different experimental and computational methods. These include high-pressure studies in solids and in situ crystallization of compounds that are liquid at room temperature, along with the utilization of electron density analysis, crystal structure prediction, and solid-state DFT calculations, to enable a quantitative understanding of the nature, energetics, and topological properties associated with such interactions in crystals.

Photo-chemically and electro-chemically active units
Typical examples include porphyrins, which are photochemically and electrochemically active and capable of forming complexes, and redox-active units such as tetrathiafulvalene (TTF) and quinones, which have more than one stable oxidation state.

Biologically-derived units

  • The extremely strong complexation between avidin and biotin, one of the strongest known non-covalent interactions, has been used as the recognition motif to construct synthetic systems.
  • The binding of enzymes with their cofactors has been used as a route to produce modified enzymes, electrically contacted enzymes, and even photoswitchable enzymes.
  • DNA has been used both as a structural and as a functional unit in synthetic supramolecular systems.

Applications

Materials technology

Supramolecular chemistry has found many applications; in particular, molecular self-assembly processes have been applied to the development of new materials. Large structures can be readily accessed using bottom-up synthesis as they are composed of small molecules requiring fewer steps to synthesize. Thus most of the bottom-up approaches to nanotechnology are based on supramolecular chemistry. Many smart materials are based on molecular recognition.

Catalysis

A major application of supramolecular chemistry is the design and understanding of catalysts and catalysis. Non-covalent interactions are extremely important in catalysis, binding reactants into conformations suitable for reaction and lowering the transition state energy of reaction. Template-directed synthesis is a special case of supramolecular catalysis. Encapsulation systems such as micelles, dendrimers, and cavitands are also used in catalysis to create microenvironments suitable for reactions (or steps in reactions) that are not possible on a macroscopic scale.
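The rate effect of lowering the transition-state energy can be illustrated with the Eyring equation from transition-state theory (a standard result, not drawn from the original article):

    k = \frac{k_B T}{h} \, e^{-\Delta G^{\ddagger}/(RT)}

Because the barrier ΔG‡ appears in an exponential, even a modest stabilization of the transition state by non-covalent binding produces a large rate acceleration; at room temperature, roughly every 5.7 kJ/mol of stabilization speeds the reaction by a factor of ten.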

Medicine

Design based on supramolecular chemistry has led to numerous applications in the creation of functional biomaterials and therapeutics. Supramolecular biomaterials afford a number of modular and generalizable platforms with tunable mechanical, chemical and biological properties. These include systems based on supramolecular assembly of peptides, host–guest macrocycles, high-affinity hydrogen bonding, and metal–ligand interactions.

A supramolecular approach has been used extensively to create artificial ion channels for the transport of sodium and potassium ions into and out of cells.

Supramolecular chemistry is also important to the development of new pharmaceutical therapies by understanding the interactions at a drug binding site. The area of drug delivery has also made critical advances as a result of supramolecular chemistry providing encapsulation and targeted release mechanisms. In addition, supramolecular systems have been designed to disrupt protein–protein interactions that are important to cellular function.

Data storage and processing

Supramolecular chemistry has been used to demonstrate computational functions on a molecular scale. In many cases, photonic or chemical signals have been used in these components, but electrical interfacing of these units has also been shown by supramolecular signal transduction devices. Data storage has been accomplished by the use of molecular switches with photochromic and photoisomerizable units, by electrochromic and redox-switchable units, and even by molecular motion. Synthetic molecular logic gates have been demonstrated on a conceptual level. Even full-scale computations have been achieved by semi-synthetic DNA computers.

Methodology

From Wikipedia, the free encyclopedia
 
The methodology underlying a type of DNA sequencing.

Methodology is the systematic, theoretical analysis of the methods applied to a field of study. It comprises the theoretical analysis of the body of methods and principles associated with a branch of knowledge. Typically, it encompasses concepts such as paradigm, theoretical model, phases and quantitative or qualitative techniques.

A methodology does not set out to provide solutions; it is therefore not the same as a method. Instead, a methodology offers the theoretical underpinning for understanding which method, set of methods, or best practices can be applied to a specific case, for example, to calculate a specific result.

It has been defined also as follows:

  1. "the analysis of the principles of methods, rules, and postulates employed by a discipline";
  2. "the systematic study of methods that are, can be, or have been applied within a discipline";
  3. "the study or description of methods."

Related concepts

Methodology has several related concepts: paradigm, algorithm, and method.

Methodology is the general research strategy that outlines the way in which research is to be undertaken and, among other things, identifies the methods to be used in it. These methods, described in the methodology, define the means or modes of data collection or, sometimes, how a specific result is to be calculated. Methodology does not define specific methods, even though much attention is given to the nature and kinds of processes to be followed in a particular procedure or to attain an objective.

When proper to a study of methodology, such processes constitute a constructive generic framework, and may therefore be broken down into sub-processes, combined, or their sequence changed.

A paradigm is similar to a methodology in that it is also a constructive framework. In theoretical work, the development of paradigms satisfies most or all of the criteria for methodology.

An algorithm, like a paradigm, is also a type of constructive framework, meaning that the construction is a logical, rather than a physical, array of connected elements.

Any description of a means of calculation of a specific result is always a description of a method and never a description of a methodology. It is thus important to avoid using methodology as a synonym for method or body of methods. Doing this shifts it away from its true epistemological meaning and reduces it to being the procedure itself, or the set of tools, or the instruments that should have been its outcome. A methodology is the design process for carrying out research or the development of a procedure and is not in itself an instrument, or method, or procedure for doing things.

The economist George M. Frankfurter has argued that the word method is not interchangeable with methodology, and that in contemporary scientific discourse methodology is a "pretentious substitute for the word method". He argues that using methodology as a synonym for method or set of methods leads to confusion and misinterpretation and undermines the proper analysis that should go into designing research.

Top-down and bottom-up design

From Wikipedia, the free encyclopedia

Top-down and bottom-up are both strategies of information processing and knowledge ordering, used in a variety of fields including software, humanistic and scientific theories (see systemics), and management and organization. In practice, they can be seen as a style of thinking, teaching, or leadership.

A top-down approach (also known as stepwise design and stepwise refinement and in some cases used as a synonym of decomposition) is essentially the breaking down of a system to gain insight into its compositional sub-systems in a reverse engineering fashion. In a top-down approach an overview of the system is formulated, specifying, but not detailing, any first-level subsystems. Each subsystem is then refined in yet greater detail, sometimes in many additional subsystem levels, until the entire specification is reduced to base elements. A top-down model is often specified with the assistance of "black boxes", which makes it easier to manipulate. However, black boxes may fail to clarify elementary mechanisms or be detailed enough to realistically validate the model. A top-down approach starts with the big picture and breaks it down into smaller segments.

A bottom-up approach is the piecing together of systems to give rise to more complex systems, thus making the original systems sub-systems of the emergent system. Bottom-up processing is a type of information processing based on incoming data from the environment to form a perception. From a cognitive psychology perspective, information enters the eyes in one direction (sensory input, or the "bottom"), and is then turned into an image by the brain that can be interpreted and recognized as a perception (output that is "built up" from processing to final cognition). In a bottom-up approach the individual base elements of the system are first specified in great detail. These elements are then linked together to form larger subsystems, which then in turn are linked, sometimes in many levels, until a complete top-level system is formed. This strategy often resembles a "seed" model, by which the beginnings are small but eventually grow in complexity and completeness. However, "organic strategies" may result in a tangle of elements and subsystems, developed in isolation and subject to local optimization as opposed to meeting a global purpose.

Product design and development

During the design and development of new products, designers and engineers rely on both a bottom-up and a top-down approach. The bottom-up approach is utilized when off-the-shelf or existing components are selected and integrated into the product. An example would be selecting a particular fastener, such as a bolt, and designing the receiving components such that the fastener will fit properly. In a top-down approach, a custom fastener would be designed such that it would fit properly in the receiving components. For products with more restrictive requirements (such as weight, geometry, safety, or environment), such as a space suit, a more top-down approach is taken and almost everything is custom designed.

Computer science

Software development

In the software development process, the top-down and bottom-up approaches play a key role.

Top-down approaches emphasize planning and a complete understanding of the system. It is inherent that no coding can begin until a sufficient level of detail has been reached in the design of at least some part of the system. Top-down approaches are implemented by attaching stubs in place of modules that have not yet been written. This, however, delays testing of the ultimate functional units of a system until significant design is complete.
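A minimal sketch of stub-based top-down development in Python (the module and function names are invented for illustration, not taken from the text):

    # The high-level flow is designed and coded first; the modules it
    # depends on are temporarily replaced by stubs so the top-level
    # logic can be exercised before the system is complete.

    def fetch_orders(customer_id):
        # Stub: the real data-access module is not written yet.
        # Returns fixed, representative data so callers can be tested.
        return [{"id": 1, "total": 99.95}]

    def format_report(orders):
        # Stub: real formatting will be designed in a later refinement.
        return f"{len(orders)} order(s) found"

    def main():
        # Complete, testable top-level logic sitting on stubbed modules.
        orders = fetch_orders(customer_id=42)
        print(format_report(orders))

    if __name__ == "__main__":
        main()

Each stub is later replaced by a real implementation without changing the top-level procedure, which is the essence of stepwise refinement.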

Bottom-up emphasizes coding and early testing, which can begin as soon as the first module has been specified. This approach, however, runs the risk that modules may be coded without having a clear idea of how they link to other parts of the system, and that such linking may not be as easy as first thought. Re-usability of code is one of the main benefits of the bottom-up approach.

Top-down design was promoted in the 1970s by IBM researcher Harlan Mills and by Niklaus Wirth. Mills developed structured programming concepts for practical use and tested them in a 1969 project to automate the New York Times morgue index. The engineering and management success of this project led to the spread of the top-down approach through IBM and the rest of the computer industry. Among other achievements, Niklaus Wirth, the developer of the Pascal programming language, wrote the influential paper Program Development by Stepwise Refinement. Since Niklaus Wirth went on to develop languages such as Modula and Oberon (where one could define a module before knowing about the entire program specification), one can infer that top-down programming was not strictly what he promoted. Top-down methods were favored in software engineering until the late 1980s, and object-oriented programming assisted in demonstrating the idea that both aspects of top-down and bottom-up programming could be utilized.

Modern software design approaches usually combine both top-down and bottom-up approaches. Although an understanding of the complete system is usually considered necessary for good design, leading theoretically to a top-down approach, most software projects attempt to make use of existing code to some degree. Pre-existing modules give designs a bottom-up flavor. Some design approaches go further: a partially functional system is designed and coded to completion, and this system is then expanded to fulfill all the requirements for the project.

Programming

Building blocks are an example of bottom-up design because the parts are first created and then assembled without regard to how the parts will work in the assembly.

Top-down is a programming style, the mainstay of traditional procedural languages, in which design begins by specifying complex pieces and then dividing them into successively smaller pieces. The technique for writing a program using top–down methods is to write a main procedure that names all the major functions it will need. Later, the programming team looks at the requirements of each of those functions and the process is repeated. These compartmentalized sub-routines eventually will perform actions so simple they can be easily and concisely coded. When all the various sub-routines have been coded the program is ready for testing. By defining how the application comes together at a high level, lower level work can be self-contained. By defining how the lower level abstractions are expected to integrate into higher level ones, interfaces become clearly defined.

Object-oriented programming (OOP) is a paradigm that uses "objects" to design applications and computer programs, and it fits naturally with the bottom-up style described above: individual base elements are specified first and then linked into progressively larger subsystems, a "seed" model in which small beginnings grow in complexity and completeness. In mechanical engineering, with software programs such as Pro/ENGINEER, Solidworks, and Autodesk Inventor, users can design products as pieces not part of the whole and later add those pieces together to form assemblies, like building with Lego. Engineers call this "piece part design".

In a bottom-up approach, good intuition is necessary to decide the functionality that is to be provided by the module. If a system is to be built from an existing system, this approach is more suitable as it starts from some existing modules.
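A complementary bottom-up sketch in Python (names invented for illustration): small, independently testable pieces are written first and then composed into a larger unit.

    # Bottom-up development: base elements are fully specified and
    # tested in isolation, then linked into a larger subsystem.

    def order_total(order):
        # Base element: complete and testable on its own.
        return sum(item["price"] * item["qty"] for item in order)

    def apply_discount(amount, rate):
        # Another independent base element.
        return amount * (1 - rate)

    def invoice(order, discount_rate=0.0):
        # The existing pieces are composed into a higher-level unit.
        return apply_discount(order_total(order), discount_rate)

    order = [{"price": 10.0, "qty": 2}, {"price": 5.0, "qty": 1}]
    assert order_total(order) == 25.0
    print(invoice(order, discount_rate=0.1))  # prints 22.5

Because each piece already works in isolation, the remaining risk is in how the pieces fit together, which matches the trade-off described above.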

Parsing

Parsing is the process of analyzing an input sequence (such as that read from a file or a keyboard) in order to determine its grammatical structure. This method is used in the analysis of both natural languages and computer languages, as in a compiler.

Bottom-up parsing is a strategy for analyzing unknown data relationships that attempts to identify the most fundamental units first, and then to infer higher-order structures from them. Top-down parsers, on the other hand, hypothesize general parse tree structures and then consider whether the known fundamental structures are compatible with the hypothesis. See Top-down parsing and Bottom-up parsing.
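The contrast between the two strategies can be made concrete with a small top-down (recursive-descent) parser, sketched here in Python for an illustrative arithmetic grammar (the grammar and names are not from the original text):

    # Grammar, parsed top-down starting from the most general rule:
    #   expr   -> term (('+'|'-') term)*
    #   term   -> factor (('*'|'/') factor)*
    #   factor -> NUMBER | '(' expr ')'
    import re

    def tokenize(text):
        return re.findall(r"\d+|[()+\-*/]", text)

    def parse(tokens):
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def eat():
            nonlocal pos
            pos += 1
            return tokens[pos - 1]

        def expr():
            # Hypothesize the general structure first (top-down) ...
            value = term()
            while peek() in ("+", "-"):
                value = value + term() if eat() == "+" else value - term()
            return value

        def term():
            value = factor()
            while peek() in ("*", "/"):
                value = value * factor() if eat() == "*" else value / factor()
            return value

        def factor():
            # ... and only at the bottom consume the fundamental tokens.
            if peek() == "(":
                eat()              # consume '('
                value = expr()
                eat()              # consume ')'
                return value
            return int(eat())      # NUMBER

        return expr()

    print(parse(tokenize("2*(3+4)-5")))  # prints 9

A bottom-up parser would instead start from the tokens themselves and combine them, shift-reduce style, into ever larger grammatical units.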

Nanotechnology

Top-down and bottom-up are two approaches for the manufacture of products. These terms were first applied to the field of nanotechnology by the Foresight Institute in 1989 in order to distinguish between molecular manufacturing (to mass-produce large atomically precise objects) and conventional manufacturing (which can mass-produce large objects that are not atomically precise). Bottom-up approaches seek to have smaller (usually molecular) components built up into more complex assemblies, while top-down approaches seek to create nanoscale devices by using larger, externally controlled ones to direct their assembly. Certain valuable nanostructures, such as silicon nanowires, can be fabricated using either approach, with processing methods selected on the basis of targeted applications.

The top-down approach often uses the traditional workshop or microfabrication methods where externally controlled tools are used to cut, mill, and shape materials into the desired shape and order. Micropatterning techniques, such as photolithography and inkjet printing, belong to this category. Vapor treatment can be regarded as a new top-down secondary approach to engineering nanostructures.

Bottom-up approaches, in contrast, use the chemical properties of single molecules to cause single-molecule components to (a) self-organize or self-assemble into some useful conformation, or (b) rely on positional assembly. These approaches utilize the concepts of molecular self-assembly and/or molecular recognition. See also Supramolecular chemistry. Such bottom-up approaches should, broadly speaking, be able to produce devices in parallel and much more cheaply than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases.

Neuroscience and psychology

An example of top-down processing: Even though the second letter in each word is ambiguous, top-down processing allows for easy disambiguation based on the context.

These terms are also employed in neuroscience, cognitive neuroscience and cognitive psychology to discuss the flow of information in processing. Typically sensory input is considered "bottom-up", and higher cognitive processes, which have more information from other sources, are considered "top-down". A bottom-up process is characterized by an absence of higher level direction in sensory processing, whereas a top-down process is characterized by a high level of direction of sensory processing by more cognition, such as goals or targets (Beiderman, 19).

According to college teaching notes written by Charles Ramskov, Rock, Neisser, and Gregory claim that the top-down approach involves perception as an active and constructive process. On this view, perception is not directly given by stimulus input, but is the result of interactions among the stimulus, internal hypotheses, and expectations. According to Theoretical Synthesis, "when a stimulus is presented short and clarity is uncertain that gives a vague stimulus, perception becomes a top-down approach."

Conversely, psychology defines bottom-up processing as an approach wherein there is a progression from the individual elements to the whole. According to Ramskov, one proponent of the bottom-up approach, Gibson, claims that it is a process in which visual perception depends on information available in the proximal stimulus, which is produced by the distal stimulus. Theoretical Synthesis also claims that bottom-up processing occurs "when a stimulus is presented long and clearly enough."

Cognitively speaking, certain cognitive processes, such as fast reactions or quick visual identification, are considered bottom-up processes because they rely primarily on sensory information, whereas processes such as motor control and directed attention are considered top-down because they are goal directed. Neurologically speaking, some areas of the brain, such as area V1, mostly have bottom-up connections. Other areas, such as the fusiform gyrus, have inputs from higher brain areas and are considered to have top-down influence.

The study of visual attention provides an example. If your attention is drawn to a flower in a field, it may be because the color or shape of the flower is visually salient. The information that caused you to attend to the flower came to you in a bottom-up fashion—your attention was not contingent upon knowledge of the flower; the outside stimulus was sufficient on its own. Contrast this situation with one in which you are looking for a flower. You have a representation of what you are looking for. When you see the object you are looking for, it is salient. This is an example of the use of top-down information.

In cognitive terms, two thinking approaches are distinguished. "Top-down" (or "big chunk") is stereotypically the visionary, or the person who sees the larger picture and overview. Such people focus on the big picture and from that derive the details to support it. "Bottom-up" (or "small chunk") cognition is akin to focusing on the detail primarily, rather than the landscape. The expression "seeing the wood for the trees" references the two styles of cognition.

Management and organization

In the fields of management and organization, the terms "top-down" and "bottom-up" are used to describe how decisions are made and/or how change is implemented.

A "top-down" approach is where an executive decision maker or other top person makes the decisions of how something should be done. This approach is disseminated under their authority to lower levels in the hierarchy, who are, to a greater or lesser extent, bound by them. For example, when wanting to make an improvement in a hospital, a hospital administrator might decide that a major change (such as implementing a new program) is needed, and then the leader uses a planned approach to drive the changes down to the frontline staff (Stewart, Manges, Ward, 2015).

A "bottom-up" approach to changes is one that works from the grassroots—from a large number of people working together, causing a decision to arise from their joint involvement. A decision by a number of activists, students, or victims of some incident to take action is a "bottom-up" decision. A bottom-up approach can be thought of as "an incremental change approach that represents an emergent process cultivated and upheld primarily by frontline workers" (Stewart, Manges, Ward, 2015, p. 241).

Positive aspects of top-down approaches include their efficiency and superb overview of higher levels. Also, external effects can be internalized. On the negative side, if reforms are perceived to be imposed 'from above', it can be difficult for lower levels to accept them (e.g. Bresser-Pereira, Maravall, and Przeworski 1993). Evidence suggests this to be true regardless of the content of reforms (e.g. Dubois 2002). A bottom-up approach allows for more experimentation and a better feeling for what is needed at the bottom. Other evidence suggests that there is a third combination approach to change (see Stewart, Manges, Ward, 2015).

Public health

Both top-down and bottom-up approaches exist in public health. There are many examples of top-down programs, often run by governments or large inter-governmental organizations (IGOs); many of these are disease-specific or issue-specific, such as HIV control or Smallpox Eradication. Examples of bottom-up programs include many small NGOs set up to improve local access to healthcare. However, a lot of programs seek to combine both approaches; for instance, guinea worm eradication, a single-disease international program currently run by the Carter Center has involved the training of many local volunteers, boosting bottom-up capacity, as have international programs for hygiene, sanitation, and access to primary health-care.

Architecture

Often, the École des Beaux-Arts school of design is said to have primarily promoted top-down design because it taught that an architectural design should begin with a parti, a basic plan drawing of the overall project.

By contrast, the Bauhaus focused on bottom-up design. This method manifested itself in the study of translating small-scale organizational systems to a larger, more architectural scale (as with wood-panel carving and furniture design).

Ecology

In ecology, top-down control refers to when a top predator controls the structure or population dynamics of the ecosystem. The interactions between these top predators and their prey are what influence lower trophic levels. Changes at the top trophic level have an inverse effect on the lower trophic levels. Top-down control can have negative effects on the surrounding ecosystem if there is a drastic change in the number of predators. The classic example is that of kelp forest ecosystems. In such ecosystems, sea otters are a keystone predator. They prey on urchins, which in turn eat kelp. When otters are removed, urchin populations grow and reduce the kelp forest, creating urchin barrens. This reduces the diversity of the ecosystem as a whole and can have detrimental effects on all of the other organisms. In other words, such ecosystems are not controlled by the productivity of the kelp but rather by a top predator.

One can see the inverse effect that top-down control has in this example; when the population of otters decreased, the population of the urchins increased.


Bottom-up control in ecosystems refers to ecosystems in which the nutrient supply, productivity, and type of primary producers (plants and phytoplankton) control the ecosystem structure. If there are not enough resources or producers in the ecosystem, there is not enough energy left for the rest of the animals in the food chain because of biomagnification and ecological efficiency. An example would be how plankton populations are controlled by the availability of nutrients. Plankton populations tend to be higher and more complex in areas where upwelling brings nutrients to the surface.


There are many different examples of these concepts. It is common for populations to be influenced by both types of control, and there are still debates as to which type of control affects food webs in certain ecosystems.

Philosophy and ethics

Top-down reasoning in ethics is when the reasoner starts from abstract universalisable principles and then reasons down from them to particular situations. Bottom-up reasoning occurs when the reasoner starts from intuitive particular situational judgements and then reasons up to principles. Reflective equilibrium occurs when there is interaction between top-down and bottom-up reasoning until both are in harmony. That is to say, when universalisable abstract principles are reflectively found to be in equilibrium with particular intuitive judgements. The process occurs as reasoners experience cognitive dissonance between top-down and bottom-up reasoning and adjust one or the other until they are satisfied they have found the best combination of principles and situational judgements.

 

Software development process

From Wikipedia, the free encyclopedia
 
In software engineering, a software development process is the process of dividing software development work into distinct phases to improve design, product management, and project management. It is also known as a software development life cycle (SDLC). The methodology may include the pre-definition of specific deliverables and artifacts that are created and completed by a project team to develop or maintain an application.

Most modern development processes can be vaguely described as agile. Other methodologies include waterfall, prototyping, iterative and incremental development, spiral development, rapid application development, and extreme programming.

A life-cycle "model" is sometimes considered a more general term for a category of methodologies and a software development "process" a more specific term to refer to a specific process chosen by a specific organization. For example, there are many specific software development processes that fit the spiral life-cycle model. The field is often considered a subset of the systems development life cycle.

History

The software development methodology (also known as SDM) framework did not emerge until the 1960s. According to Elliott (2004) the systems development life cycle (SDLC) can be considered to be the oldest formalized methodology framework for building information systems. The main idea of the SDLC has been "to pursue the development of information systems in a very deliberate, structured and methodical way, requiring each stage of the life cycle – from inception of the idea to delivery of the final system – to be carried out rigidly and sequentially" within the context of the framework being applied. The main target of this methodology framework in the 1960s was "to develop large scale functional business systems in an age of large scale business conglomerates. Information systems activities revolved around heavy data processing and number crunching routines".

Methodologies, processes, and frameworks range from specific prescriptive steps that can be used directly by an organization in day-to-day work, to flexible frameworks that an organization uses to generate a custom set of steps tailored to the needs of a specific project or group. In some cases a "sponsor" or "maintenance" organization distributes an official set of documents that describe the process. Specific examples have appeared in every decade since the 1970s, from structured methods and waterfall through to the agile methodologies of the 2000s and 2010s.

It is notable that since DSDM in 1994, almost all newly introduced methodologies (with the notable exception of RUP) have been agile - yet many organisations, especially governments, still use pre-agile processes (often waterfall or similar). Software process and software quality are closely interrelated; some unexpected facets and effects have been observed in practice.

Alongside these, another software development process has been established in open source. The adoption of these best practices, known and established processes, within the confines of a company is called inner source.

Prototyping

Software prototyping is about creating prototypes, i.e. incomplete versions of the software program being developed.

The basic principles are:

  • Prototyping is not a standalone, complete development methodology, but rather an approach to try out particular features in the context of a full methodology (such as incremental, spiral, or rapid application development (RAD)).
  • Attempts to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.
  • The client is involved throughout the development process, which increases the likelihood of client acceptance of the final implementation.
  • While some prototypes are developed with the expectation that they will be discarded, it is possible in some cases to evolve from prototype to working system.

A basic understanding of the fundamental business problem is necessary to avoid solving the wrong problems, but this is true for all software methodologies.

Methodologies

Agile development

"Agile software development" refers to a group of software development methodologies based on iterative development, where requirements and solutions evolve via collaboration between self-organizing cross-functional teams. The term was coined in the year 2001 when the Agile Manifesto was formulated.

Agile software development uses iterative development as a basis but advocates a lighter and more people-centric viewpoint than traditional approaches. Agile processes fundamentally incorporate iteration and the continuous feedback that it provides to successively refine and deliver a software system.

There are many agile methodologies, including Scrum, Kanban, extreme programming (XP), and the dynamic systems development method (DSDM).

Continuous integration

Continuous integration is the practice of merging all developer working copies to a shared mainline several times a day. Grady Booch first named and proposed CI in his 1991 method, although he did not advocate integrating several times a day. Extreme programming (XP) adopted the concept of CI and did advocate integrating more than once per day – perhaps as many as tens of times per day.

Incremental development

Various methods are acceptable for combining linear and iterative systems development methodologies, with the primary objective of each being to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.

There are three main variants of incremental development:

  1. A series of mini-Waterfalls are performed, where all phases of the Waterfall are completed for a small part of a system, before proceeding to the next increment, or
  2. Overall requirements are defined before proceeding to evolutionary, mini-Waterfall development of individual increments of a system, or
  3. The initial software concept, requirements analysis, and design of architecture and system core are defined via Waterfall, followed by incremental implementation, which culminates in installing the final version, a working system.

Rapid application development

Rapid Application Development (RAD) Model

Rapid application development (RAD) is a software development methodology, which favors iterative development and the rapid construction of prototypes instead of large amounts of up-front planning. The "planning" of software developed using RAD is interleaved with writing the software itself. The lack of extensive pre-planning generally allows software to be written much faster, and makes it easier to change requirements.

The rapid development process starts with the development of preliminary data models and business process models using structured techniques. In the next stage, requirements are verified using prototyping, eventually to refine the data and process models. These stages are repeated iteratively; further development results in "a combined business requirements and technical design statement to be used for constructing new systems".

The term was first used to describe a software development process introduced by James Martin in 1991. According to Whitten (2003), it is a merger of various structured techniques, especially data-driven information technology engineering, with prototyping techniques to accelerate software systems development.

The basic principles of rapid application development are:

  • Key objective is for fast development and delivery of a high quality system at a relatively low investment cost.
  • Attempts to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.
  • Aims to produce high quality systems quickly, primarily via iterative Prototyping (at any stage of development), active user involvement, and computerized development tools. These tools may include Graphical User Interface (GUI) builders, Computer Aided Software Engineering (CASE) tools, Database Management Systems (DBMS), fourth-generation programming languages, code generators, and object-oriented techniques.
  • Key emphasis is on fulfilling the business need, while technological or engineering excellence is of lesser importance.
  • Project control involves prioritizing development and defining delivery deadlines or “timeboxes”. If the project starts to slip, emphasis is on reducing requirements to fit the timebox, not on increasing the deadline.
  • Generally includes joint application design (JAD), where users are intensely involved in system design, via consensus building in either structured workshops, or electronically facilitated interaction.
  • Active user involvement is imperative.
  • Iteratively produces production software, as opposed to a throwaway prototype.
  • Produces documentation necessary to facilitate future development and maintenance.
  • Standard systems analysis and design methods can be fitted into this framework.

Spiral development

Spiral model (Boehm, 1988)

In 1988, Barry Boehm published a formal software system development "spiral model," which combines some key aspects of the waterfall model and rapid prototyping methodologies, in an effort to combine advantages of top-down and bottom-up concepts. It provided emphasis in a key area many felt had been neglected by other methodologies: deliberate iterative risk analysis, particularly suited to large-scale complex systems.

The basic principles are:

  • Focus is on risk assessment and on minimizing project risk by breaking a project into smaller segments and providing more ease-of-change during the development process, as well as providing the opportunity to evaluate risks and weigh consideration of project continuation throughout the life cycle.
  • "Each cycle involves a progression through the same sequence of steps, for each part of the product and for each of its levels of elaboration, from an overall concept-of-operation document down to the coding of each individual program."
  • Each trip around the spiral traverses four basic quadrants: (1) determine objectives, alternatives, and constraints of the iteration; (2) evaluate alternatives and identify and resolve risks; (3) develop and verify deliverables from the iteration; and (4) plan the next iteration (see the sketch after this list).
  • Begin each cycle with an identification of stakeholders and their "win conditions", and end each cycle with review and commitment.
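The four-quadrant cycle can be sketched as a simple loop. This is an illustrative abstraction in Python, with invented placeholder names, not a literal algorithm from Boehm's model:

    # Each pass visits the same four quadrants at a greater level of
    # elaboration, from concept of operation down to code.

    def spiral(levels):
        for level in levels:
            print(f"[{level}] 1. determine objectives, alternatives, constraints")
            print(f"[{level}] 2. evaluate alternatives; identify and resolve risks")
            risk_acceptable = True   # placeholder for a real risk analysis
            if not risk_acceptable:
                break                # the model allows stopping if risk is too high
            print(f"[{level}] 3. develop and verify this level's deliverables")
            print(f"[{level}] 4. plan the next iteration; review and commit")

    spiral(["concept of operation", "requirements", "design", "code"])

The point of the abstraction is that risk assessment is a first-class step on every cycle, not a one-time activity at project start.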

Waterfall development

The activities of the software development process represented in the waterfall model. There are several other models to represent this process.

The waterfall model is a sequential development approach, in which development is seen as flowing steadily downwards (like a waterfall) through several phases, typically: requirements specification, design, implementation, testing, deployment, and maintenance.

The first formal description of the method is often cited as an article published by Winston W. Royce in 1970, although Royce did not use the term "waterfall" in this article. Royce presented this model as an example of a flawed, non-working model.

The basic principles are:

  • Project is divided into sequential phases, with some overlap and splashback acceptable between phases.
  • Emphasis is on planning, time schedules, target dates, budgets and implementation of an entire system at one time.
  • Tight control is maintained over the life of the project via extensive written documentation, formal reviews, and approval/signoff by the user and information technology management occurring at the end of most phases before beginning the next phase. Written documentation is an explicit deliverable of each phase.

The waterfall model is a traditional engineering approach applied to software engineering. A strict waterfall approach discourages revisiting and revising any prior phase once it is complete. This "inflexibility" in a pure waterfall model has been a source of criticism by supporters of other more "flexible" models. It has been widely blamed for several large-scale government projects running over budget, over time and sometimes failing to deliver on requirements due to the Big Design Up Front approach. Except when contractually required, the waterfall model has been largely superseded by more flexible and versatile methodologies developed specifically for software development. See Criticism of Waterfall model.

Other

A number of other high-level software project methodologies exist beyond those described above.

Process meta-models

Some "process models" are abstract descriptions for evaluating, comparing, and improving the specific process adopted by an organization.

  • ISO/IEC 12207 is the international standard describing the method to select, implement, and monitor the life cycle for software.
  • The Capability Maturity Model Integration (CMMI) is one of the leading models and based on best practice. Independent assessments grade organizations on how well they follow their defined processes, not on the quality of those processes or the software produced. CMMI has replaced CMM.
  • ISO 9000 describes standards for a formally organized process to manufacture a product and the methods of managing and monitoring progress. Although the standard was originally created for the manufacturing sector, ISO 9000 standards have been applied to software development as well. Like CMMI, certification with ISO 9000 does not guarantee the quality of the end result, only that formalized business processes have been followed.
  • ISO/IEC 15504 Information technology — Process assessment also known as Software Process Improvement Capability Determination (SPICE), is a "framework for the assessment of software processes". This standard is aimed at setting out a clear model for process comparison. SPICE is used much like CMMI. It models processes to manage, control, guide and monitor software development. This model is then used to measure what a development organization or project team actually does during software development. This information is analyzed to identify weaknesses and drive improvement. It also identifies strengths that can be continued or integrated into common practice for that organization or team.
  • ISO/IEC 24744 Software Engineering — Metamodel for Development Methodologies, is a powertype-based metamodel for software development methodologies.
  • SPEM 2.0 by the Object Management Group
  • Soft systems methodology - a general method for improving management processes
  • Method engineering - a general method for improving information system processes

In practice

The three basic approaches applied to software development methodology frameworks.

A variety of such frameworks have evolved over the years, each with its own recognized strengths and weaknesses. One software development methodology framework is not necessarily suitable for use by all projects. Each of the available methodology frameworks is best suited to specific kinds of projects, based on various technical, organizational, project and team considerations.

Software development organizations implement process methodologies to ease the process of development. Sometimes, contractors may require that specific methodologies be employed; an example is the U.S. defense industry, which requires a rating based on process models to obtain contracts. The international standard for describing the method of selecting, implementing and monitoring the life cycle for software is ISO/IEC 12207.

A decades-long goal has been to find repeatable, predictable processes that improve productivity and quality. Some try to systematize or formalize the seemingly unruly task of designing software. Others apply project management techniques to designing software. Large numbers of software projects do not meet their expectations in terms of functionality, cost, or delivery schedule - see List of failed and overbudget custom software projects for some notable examples.

Organizations may create a Software Engineering Process Group (SEPG), which is the focal point for process improvement. Composed of line practitioners who have varied skills, the group is at the center of the collaborative effort of everyone in the organization who is involved with software engineering process improvement.

A particular development team may also agree to programming environment details, such as which integrated development environment is used, and one or more dominant programming paradigms, programming style rules, or choice of specific software libraries or software frameworks. These details are generally not dictated by the choice of model or general methodology.

Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...