
Wednesday, September 30, 2020

Top-down and bottom-up design

From Wikipedia, the free encyclopedia

Top-down and bottom-up are both strategies of information processing and knowledge ordering, used in a variety of fields including software, humanistic and scientific theories (see systemics), and management and organization. In practice, they can be seen as a style of thinking, teaching, or leadership.

A top-down approach (also known as stepwise design and stepwise refinement, and in some cases used as a synonym of decomposition) is essentially the breaking down of a system to gain insight into its compositional sub-systems in a reverse engineering fashion. In a top-down approach an overview of the system is formulated, specifying, but not detailing, any first-level subsystems. Each subsystem is then refined in yet greater detail, sometimes across many additional subsystem levels, until the entire specification is reduced to base elements. A top-down model is often specified with the assistance of "black boxes", which make it easier to manipulate. However, black boxes may fail to clarify elementary mechanisms or to be detailed enough to realistically validate the model. A top-down approach starts with the big picture and breaks it down into smaller segments.

A bottom-up approach is the piecing together of systems to give rise to more complex systems, thus making the original systems sub-systems of the emergent system. Bottom-up processing is a type of information processing based on incoming data from the environment to form a perception. From a cognitive psychology perspective, information enters the eyes in one direction (sensory input, or the "bottom"), and is then turned into an image by the brain that can be interpreted and recognized as a perception (output that is "built up" from processing to final cognition). In a bottom-up approach the individual base elements of the system are first specified in great detail. These elements are then linked together to form larger subsystems, which then in turn are linked, sometimes in many levels, until a complete top-level system is formed. This strategy often resembles a "seed" model, by which the beginnings are small but eventually grow in complexity and completeness. However, "organic strategies" may result in a tangle of elements and subsystems, developed in isolation and subject to local optimization as opposed to meeting a global purpose.

Product design and development

During the design and development of new products, designers and engineers rely on both bottom-up and top-down approaches. The bottom-up approach is used when off-the-shelf or existing components are selected and integrated into the product. An example would be selecting a particular fastener, such as a bolt, and designing the receiving components such that the fastener will fit properly. In a top-down approach, a custom fastener would instead be designed to fit properly in the receiving components. For a product with more restrictive requirements (such as weight, geometry, safety, or environment), such as a space suit, a more top-down approach is taken and almost everything is custom designed.

Computer science

Software development

In the software development process, the top-down and bottom-up approaches play a key role.

Top-down approaches emphasize planning and a complete understanding of the system. It is inherent that no coding can begin until a sufficient level of detail has been reached in the design of at least some part of the system. Top-down approaches are implemented by attaching stubs in place of the modules they call. This, however, delays testing of the ultimate functional units of a system until significant design is complete.
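
As a minimal sketch of this stub technique (all names are invented for illustration, not taken from any real project), the top-level flow might be written first in Python, with placeholder stubs standing in for modules whose designs are not yet complete:

    # Hypothetical top-down sketch: the top-level procedure is written first;
    # stubs stand in for subsystems that have not been designed yet.

    def load_orders(path):
        # Stub: the real file-parsing module is designed in a later step.
        return [{"id": 1, "amount": 0.0}]

    def compute_total(orders):
        # Stub: placeholder so the top level can be exercised early.
        return sum(order["amount"] for order in orders)

    def process_orders(path):
        """Top-level flow, specified before its subsystems are detailed."""
        orders = load_orders(path)
        return compute_total(orders)

    print(process_orders("orders.csv"))  # prints 0.0 until stubs are replaced

The overall structure can be run and reviewed immediately, but, as noted above, the real functional units cannot be meaningfully tested until the stubs are replaced.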

Bottom-up emphasizes coding and early testing, which can begin as soon as the first module has been specified. This approach, however, runs the risk that modules may be coded without having a clear idea of how they link to other parts of the system, and that such linking may not be as easy as first thought. Re-usability of code is one of the main benefits of the bottom-up approach.
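
A complementary sketch of the bottom-up style, again with hypothetical names: a small, dependency-free module is coded and tested before anything that will eventually call it exists:

    # Hypothetical bottom-up sketch: a base module is written and tested
    # first, before the larger system that will use it has been designed.

    def normalize_name(raw: str) -> str:
        """Trim outer whitespace and collapse internal runs of spaces."""
        return " ".join(raw.split())

    # Early testing can begin at once, since the module stands alone.
    assert normalize_name("  Ada   Lovelace ") == "Ada Lovelace"
    assert normalize_name("Grace Hopper") == "Grace Hopper"

Because the module stands alone it is easy to reuse elsewhere; the corresponding risk, as described above, is that its interface may not match what the eventual callers turn out to need.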

Top-down design was promoted in the 1970s by IBM researcher Harlan Mills and by computer scientist Niklaus Wirth. Mills developed structured programming concepts for practical use and tested them in a 1969 project to automate the New York Times morgue index. The engineering and management success of this project led to the spread of the top-down approach through IBM and the rest of the computer industry. Among other achievements, Niklaus Wirth, the developer of the Pascal programming language, wrote the influential paper Program Development by Stepwise Refinement. Since Wirth went on to develop languages such as Modula and Oberon (where one could define a module before knowing the entire program specification), one can infer that top-down programming was not strictly what he promoted. Top-down methods were favored in software engineering until the late 1980s, and object-oriented programming helped demonstrate that both top-down and bottom-up programming could be used together.

Modern software design approaches usually combine both top-down and bottom-up approaches. Although an understanding of the complete system is usually considered necessary for good design, leading theoretically to a top-down approach, most software projects attempt to make use of existing code to some degree. Pre-existing modules give designs a bottom-up flavor. Some design approaches also use an approach where a partially functional system is designed and coded to completion, and this system is then expanded to fulfill all the requirements for the project.

Programming

Building blocks are an example of bottom-up design because the parts are first created and then assembled without regard to how the parts will work in the assembly.

Top-down is a programming style, the mainstay of traditional procedural languages, in which design begins by specifying complex pieces and then dividing them into successively smaller pieces. The technique for writing a program using top-down methods is to write a main procedure that names all the major functions it will need. Later, the programming team looks at the requirements of each of those functions and the process is repeated. These compartmentalized sub-routines will eventually perform actions so simple they can be easily and concisely coded. When all the various sub-routines have been coded, the program is ready for testing. By defining how the application comes together at a high level, lower-level work can be self-contained. By defining how the lower-level abstractions are expected to integrate into higher-level ones, interfaces become clearly defined.
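
To make that refinement loop concrete, here is a hedged sketch in Python (function names are illustrative only): the main procedure names the major functions it needs, and a later pass refines each one in turn:

    # Illustrative stepwise refinement (all names hypothetical).

    def main():
        # First pass: main() merely names the major functions it will
        # need, which fixes the interfaces between the pieces.
        data = read_input()
        results = analyze(data)
        write_report(results)

    # Later passes: each named function is refined into simpler pieces.
    def read_input():
        return [1, 2, 3]  # placeholder data source

    def analyze(data):
        return [x * x for x in data]  # itself refined further if needed

    def write_report(results):
        print("report:", results)

    if __name__ == "__main__":
        main()

Each refinement pass repeats the same move one level down, until every piece is simple enough to code directly.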

In a bottom-up approach, the individual base elements of the system are first specified in great detail. These elements are then linked together to form larger subsystems, which then in turn are linked, sometimes in many levels, until a complete top-level system is formed. This strategy often resembles a "seed" model, by which the beginnings are small, but eventually grow in complexity and completeness. Object-oriented programming (OOP) is a paradigm that uses "objects" to design applications and computer programs. In mechanical engineering, with software programs such as Pro/ENGINEER, SolidWorks, and Autodesk Inventor, users can design products as pieces that are not part of the whole and later add those pieces together to form assemblies, like building with Lego. Engineers call this piece part design.

In a bottom-up approach, good intuition is necessary to decide the functionality that is to be provided by the module. If a system is to be built from an existing system, this approach is more suitable as it starts from some existing modules.

Parsing

Parsing is the process of analyzing an input sequence (such as that read from a file or a keyboard) in order to determine its grammatical structure. This method is used in the analysis of both natural languages and computer languages, as in a compiler.

Bottom-up parsing is a strategy for analyzing unknown data relationships that attempts to identify the most fundamental units first, and then to infer higher-order structures from them. Top-down parsers, on the other hand, hypothesize general parse tree structures and then consider whether the known fundamental structures are compatible with the hypothesis. See Top-down parsing and Bottom-up parsing.
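
As a small, hedged illustration of the top-down strategy (the grammar and token format are invented for this example), a recursive-descent parser in Python hypothesizes the overall structure, expr := term ('+' term)*, and checks that the tokens fit it:

    # Minimal recursive-descent (top-down) parser for a toy grammar:
    #   expr := term ('+' term)*
    #   term := NUMBER

    def parse_expr(tokens, pos=0):
        # Hypothesize the top-level structure, then work down to tokens.
        value, pos = parse_term(tokens, pos)
        while pos < len(tokens) and tokens[pos] == "+":
            rhs, pos = parse_term(tokens, pos + 1)
            value += rhs
        return value, pos

    def parse_term(tokens, pos):
        token = tokens[pos]
        if not token.isdigit():
            raise SyntaxError(f"expected number at {pos}, got {token!r}")
        return int(token), pos + 1

    result, _ = parse_expr(["2", "+", "3", "+", "4"])
    assert result == 9

A bottom-up parser, such as a shift-reduce parser, would instead start from the tokens themselves and reduce runs of them into progressively larger structures.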

Nanotechnology

Top-down and bottom-up are two approaches for the manufacture of products. These terms were first applied to the field of nanotechnology by the Foresight Institute in 1989 in order to distinguish between molecular manufacturing (to mass-produce large atomically precise objects) and conventional manufacturing (which can mass-produce large objects that are not atomically precise). Bottom-up approaches seek to have smaller (usually molecular) components built up into more complex assemblies, while top-down approaches seek to create nanoscale devices by using larger, externally controlled ones to direct their assembly. Certain valuable nanostructures, such as silicon nanowires, can be fabricated using either approach, with processing methods selected on the basis of targeted applications.

The top-down approach often uses the traditional workshop or microfabrication methods, where externally controlled tools are used to cut, mill, and shape materials into the desired shape and order. Micropatterning techniques, such as photolithography and inkjet printing, belong to this category. Vapor treatment can be regarded as a new top-down secondary approach to engineering nanostructures.

Bottom-up approaches, in contrast, use the chemical properties of single molecules to cause single-molecule components to (a) self-organize or self-assemble into some useful conformation, or (b) rely on positional assembly. These approaches utilize the concepts of molecular self-assembly and/or molecular recognition. See also Supramolecular chemistry. Such bottom-up approaches should, broadly speaking, be able to produce devices in parallel and much more cheaply than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases.

Neuroscience and psychology

An example of top-down processing: in the classic "THE CAT" demonstration, the same ambiguous letter shape serves as the "H" in "THE" and the "A" in "CAT". Even though the second letter in each word is ambiguous, top-down processing allows for easy disambiguation based on the context.

These terms are also employed in neuroscience, cognitive neuroscience and cognitive psychology to discuss the flow of information in processing. Typically, sensory input is considered "bottom-up", and higher cognitive processes, which have more information from other sources, are considered "top-down". A bottom-up process is characterized by an absence of higher-level direction in sensory processing, whereas a top-down process is characterized by a high level of direction of sensory processing by more cognition, such as goals or targets (Biederman).

According to college teaching notes written by Charles Ramskov, Rock, Neisser, and Gregory claim that the top-down approach involves perception as an active and constructive process. Additionally, on this view perception is not directly given by stimulus input, but is the result of interactions among the stimulus, internal hypotheses, and expectations. According to Theoretical Synthesis, "when a stimulus is presented short and clarity is uncertain that gives a vague stimulus, perception becomes a top-down approach."

Conversely, psychology defines bottom-up processing as an approach wherein there is a progression from the individual elements to the whole. According to Ramskov, Gibson, a proponent of the bottom-up approach, claims that visual perception is a process that relies on information available from the proximal stimulus, which is produced by the distal stimulus. Theoretical Synthesis also claims that bottom-up processing occurs "when a stimulus is presented long and clearly enough."

Cognitively speaking, certain cognitive processes, such as fast reactions or quick visual identification, are considered bottom-up processes because they rely primarily on sensory information, whereas processes such as motor control and directed attention are considered top-down because they are goal directed. Neurologically speaking, some areas of the brain, such as area V1, mostly have bottom-up connections. Other areas, such as the fusiform gyrus, have inputs from higher brain areas and are considered to have top-down influence.

The study of visual attention provides an example. If your attention is drawn to a flower in a field, it may be because the color or shape of the flower is visually salient. The information that caused you to attend to the flower came to you in a bottom-up fashion—your attention was not contingent upon knowledge of the flower; the outside stimulus was sufficient on its own. Contrast this situation with one in which you are looking for a flower. You have a representation of what you are looking for. When you see the object you are looking for, it is salient. This is an example of the use of top-down information.

In cognitive terms, two thinking approaches are distinguished. "Top-down" (or "big chunk") is stereotypically the visionary, or the person who sees the larger picture and overview. Such people focus on the big picture and from that derive the details to support it. "Bottom-up" (or "small chunk") cognition is akin to focusing on the detail primarily, rather than the landscape. The expression "seeing the wood for the trees" references the two styles of cognition.

Management and organization

In the fields of management and organization, the terms "top-down" and "bottom-up" are used to describe how decisions are made and/or how change is implemented.

A "top-down" approach is where an executive decision maker or other top person makes the decisions of how something should be done. This approach is disseminated under their authority to lower levels in the hierarchy, who are, to a greater or lesser extent, bound by them. For example, when wanting to make an improvement in a hospital, a hospital administrator might decide that a major change (such as implementing a new program) is needed, and then the leader uses a planned approach to drive the changes down to the frontline staff (Stewart, Manges, Ward, 2015).

A "bottom-up" approach to changes is one that works from the grassroots—from a large number of people working together, causing a decision to arise from their joint involvement. A decision by a number of activists, students, or victims of some incident to take action is a "bottom-up" decision. A bottom-up approach can be thought of as "an incremental change approach that represents an emergent process cultivated and upheld primarily by frontline workers" (Stewart, Manges, Ward, 2015, p. 241).

Positive aspects of top-down approaches include their efficiency and superb overview of higher levels. Also, external effects can be internalized. On the negative side, if reforms are perceived to be imposed 'from above', it can be difficult for lower levels to accept them (e.g. Bresser-Pereira, Maravall, and Przeworski 1993). Evidence suggests this to be true regardless of the content of reforms (e.g. Dubois 2002). A bottom-up approach allows for more experimentation and a better feeling for what is needed at the bottom. Other evidence suggests that there is a third combination approach to change (see Stewart, Manges, Ward, 2015).

Public health

Both top-down and bottom-up approaches exist in public health. There are many examples of top-down programs, often run by governments or large inter-governmental organizations (IGOs); many of these are disease-specific or issue-specific, such as HIV control or smallpox eradication. Examples of bottom-up programs include many small NGOs set up to improve local access to healthcare. However, many programs seek to combine both approaches; for instance, guinea worm eradication, a single-disease international program currently run by the Carter Center, has involved the training of many local volunteers, boosting bottom-up capacity, as have international programs for hygiene, sanitation, and access to primary health-care.

Architecture

Often, the École des Beaux-Arts school of design is said to have primarily promoted top-down design because it taught that an architectural design should begin with a parti, a basic plan drawing of the overall project.

By contrast, the Bauhaus focused on bottom-up design. This method manifested itself in the study of translating small-scale organizational systems to a larger, more architectural scale (as with wood panel carving and furniture design).

Ecology

In ecology, top-down control refers to when a top predator controls the structure or population dynamics of the ecosystem. The interactions between these top predators and their prey are what influence lower trophic levels. Changes in the top trophic level have an inverse effect on the lower trophic levels. Top-down control can have negative effects on the surrounding ecosystem if there is a drastic change in the number of predators. The classic example is that of kelp forest ecosystems. In such ecosystems, sea otters are a keystone predator. They prey on urchins, which in turn eat kelp. When otters are removed, urchin populations grow and reduce the kelp forest, creating urchin barrens. This reduces the diversity of the ecosystem as a whole and can have detrimental effects on all of the other organisms. In other words, such ecosystems are not controlled by the productivity of the kelp but rather by a top predator.

One can see the inverse effect that top-down control has in this example; when the population of otters decreased, the population of the urchins increased.


Bottom-up control in ecosystems refers to ecosystems in which the nutrient supply, productivity, and type of primary producers (plants and phytoplankton) control the ecosystem structure. If there are not enough resources or producers in the ecosystem, there is not enough energy left for the rest of the animals in the food chain, because of biomagnification and ecological efficiency. An example would be how plankton populations are controlled by the availability of nutrients. Plankton populations tend to be higher and more complex in areas where upwelling brings nutrients to the surface.


There are many different examples of these concepts. It is common for populations to be influenced by both types of control, and there are still debates as to which type of control affects food webs in certain ecosystems.

Philosophy and ethics

Top-down reasoning in ethics occurs when the reasoner starts from abstract, universalisable principles and then reasons down from them to particular situations. Bottom-up reasoning occurs when the reasoner starts from intuitive judgements about particular situations and then reasons up to principles. Reflective equilibrium occurs when there is interaction between top-down and bottom-up reasoning until both are in harmony; that is to say, when universalisable abstract principles are reflectively found to be in equilibrium with particular intuitive judgements. The process occurs when reasoners, experiencing cognitive dissonance, try to resolve top-down with bottom-up reasoning and adjust one or the other until they are satisfied that they have found the best combination of principles and situational judgements.

 

Software development process

From Wikipedia, the free encyclopedia
 
In software engineering, a software development process is the process of dividing software development work into distinct phases to improve design, product management, and project management. It is also known as a software development life cycle (SDLC). The methodology may include the pre-definition of specific deliverables and artifacts that are created and completed by a project team to develop or maintain an application.

Most modern development processes can be vaguely described as agile. Other methodologies include waterfall, prototyping, iterative and incremental development, spiral development, rapid application development, and extreme programming.

A life-cycle "model" is sometimes considered a more general term for a category of methodologies and a software development "process" a more specific term to refer to a specific process chosen by a specific organization. For example, there are many specific software development processes that fit the spiral life-cycle model. The field is often considered a subset of the systems development life cycle.

History

The software development methodology (also known as SDM) framework didn't emerge until the 1960s. According to Elliott (2004) the systems development life cycle (SDLC) can be considered to be the oldest formalized methodology framework for building information systems. The main idea of the SDLC has been "to pursue the development of information systems in a very deliberate, structured and methodical way, requiring each stage of the life cycle––from inception of the idea to delivery of the final system––to be carried out rigidly and sequentially" within the context of the framework being applied. The main target of this methodology framework in the 1960s was "to develop large scale functional business systems in an age of large scale business conglomerates. Information systems activities revolved around heavy data processing and number crunching routines".

Methodologies, processes, and frameworks range from specific prescriptive steps that can be used directly by an organization in day-to-day work, to flexible frameworks that an organization uses to generate a custom set of steps tailored to the needs of a specific project or group. In some cases a "sponsor" or "maintenance" organization distributes an official set of documents that describe the process. Specific examples include:

1970s
1980s
1990s
2000s

2010s

It is notable that since DSDM in 1994, all of the methodologies on the above list except RUP have been agile methodologies - yet many organisations, especially governments, still use pre-agile processes (often waterfall or similar). Software process and software quality are closely interrelated; some unexpected facets and effects have been observed in practice.

Among these, another software development process has been established in open source. The adoption of these best practices, that is, known and established processes, within the confines of a company is called inner source.

Prototyping

Software prototyping is about creating prototypes, i.e. incomplete versions of the software program being developed.

The basic principles are:

  • Prototyping is not a standalone, complete development methodology, but rather an approach to try out particular features in the context of a full methodology (such as incremental, spiral, or rapid application development (RAD)).
  • Attempts to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.
  • The client is involved throughout the development process, which increases the likelihood of client acceptance of the final implementation.
  • While some prototypes are developed with the expectation that they will be discarded, it is possible in some cases to evolve from prototype to working system.

A basic understanding of the fundamental business problem is necessary to avoid solving the wrong problems, but this is true for all software methodologies.

Methodologies

Agile development

"Agile software development" refers to a group of software development methodologies based on iterative development, where requirements and solutions evolve via collaboration between self-organizing cross-functional teams. The term was coined in the year 2001 when the Agile Manifesto was formulated.

Agile software development uses iterative development as a basis but advocates a lighter and more people-centric viewpoint than traditional approaches. Agile processes fundamentally incorporate iteration and the continuous feedback that it provides to successively refine and deliver a software system.

There are many agile methodologies, including:

Continuous integration

Continuous integration is the practice of merging all developer working copies to a shared mainline several times a day. Grady Booch first named and proposed CI in his 1991 method, although he did not advocate integrating several times a day. Extreme programming (XP) adopted the concept of CI and did advocate integrating more than once per day – perhaps as many as tens of times per day.

Incremental development

Various methods are acceptable for combining linear and iterative systems development methodologies, with the primary objective of each being to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.

There are three main variants of incremental development:

  1. A series of mini-Waterfalls are performed, where all phases of the Waterfall are completed for a small part of a system, before proceeding to the next increment, or
  2. Overall requirements are defined before proceeding to evolutionary, mini-Waterfall development of individual increments of a system, or
  3. The initial software concept, requirements analysis, and design of architecture and system core are defined via Waterfall, followed by incremental implementation, which culminates in installing the final version, a working system.

Rapid application development

Rapid Application Development (RAD) Model

Rapid application development (RAD) is a software development methodology, which favors iterative development and the rapid construction of prototypes instead of large amounts of up-front planning. The "planning" of software developed using RAD is interleaved with writing the software itself. The lack of extensive pre-planning generally allows software to be written much faster, and makes it easier to change requirements.

The rapid development process starts with the development of preliminary data models and business process models using structured techniques. In the next stage, requirements are verified using prototyping, eventually refining the data and process models. These stages are repeated iteratively; further development results in "a combined business requirements and technical design statement to be used for constructing new systems".

The term was first used to describe a software development process introduced by James Martin in 1991. According to Whitten (2003), it is a merger of various structured techniques, especially data-driven information technology engineering, with prototyping techniques to accelerate software systems development.

The basic principles of rapid application development are:

  • Key objective is for fast development and delivery of a high quality system at a relatively low investment cost.
  • Attempts to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.
  • Aims to produce high quality systems quickly, primarily via iterative Prototyping (at any stage of development), active user involvement, and computerized development tools. These tools may include Graphical User Interface (GUI) builders, Computer Aided Software Engineering (CASE) tools, Database Management Systems (DBMS), fourth-generation programming languages, code generators, and object-oriented techniques.
  • Key emphasis is on fulfilling the business need, while technological or engineering excellence is of lesser importance.
  • Project control involves prioritizing development and defining delivery deadlines or “timeboxes”. If the project starts to slip, emphasis is on reducing requirements to fit the timebox, not on increasing the deadline.
  • Generally includes joint application design (JAD), where users are intensely involved in system design, via consensus building in either structured workshops, or electronically facilitated interaction.
  • Active user involvement is imperative.
  • Iteratively produces production software, as opposed to a throwaway prototype.
  • Produces documentation necessary to facilitate future development and maintenance.
  • Standard systems analysis and design methods can be fitted into this framework.

Spiral development

Spiral model (Boehm, 1988)

In 1988, Barry Boehm published a formal software system development "spiral model," which combines some key aspects of the waterfall model and rapid prototyping methodologies, in an effort to combine advantages of top-down and bottom-up concepts. It emphasized a key area many felt had been neglected by other methodologies: deliberate iterative risk analysis, particularly suited to large-scale complex systems.

The basic principles are:

  • Focus is on risk assessment and on minimizing project risk by breaking a project into smaller segments and providing more ease-of-change during the development process, as well as providing the opportunity to evaluate risks and weigh consideration of project continuation throughout the life cycle.
  • "Each cycle involves a progression through the same sequence of steps, for each part of the product and for each of its levels of elaboration, from an overall concept-of-operation document down to the coding of each individual program."
  • Each trip around the spiral traverses four basic quadrants: (1) determine objectives, alternatives, and constraints of the iteration; (2) evaluate alternatives and identify and resolve risks; (3) develop and verify deliverables from the iteration; and (4) plan the next iteration.
  • Begin each cycle with an identification of stakeholders and their "win conditions", and end each cycle with review and commitment.

Waterfall development

The activities of the software development process represented in the waterfall model. There are several other models to represent this process.

The waterfall model is a sequential development approach, in which development is seen as flowing steadily downwards (like a waterfall) through several phases, typically: requirements analysis, software design, implementation, testing, integration, deployment, and maintenance.

The first formal description of the method is often cited as an article published by Winston W. Royce in 1970, although Royce did not use the term "waterfall" in this article. Royce presented this model as an example of a flawed, non-working model.

The basic principles are:

  • Project is divided into sequential phases, with some overlap and splashback acceptable between phases.
  • Emphasis is on planning, time schedules, target dates, budgets and implementation of an entire system at one time.
  • Tight control is maintained over the life of the project via extensive written documentation, formal reviews, and approval/signoff by the user and information technology management occurring at the end of most phases before beginning the next phase. Written documentation is an explicit deliverable of each phase.

The waterfall model is a traditional engineering approach applied to software engineering. A strict waterfall approach discourages revisiting and revising any prior phase once it is complete. This "inflexibility" in a pure waterfall model has been a source of criticism by supporters of other more "flexible" models. It has been widely blamed for several large-scale government projects running over budget, over time and sometimes failing to deliver on requirements due to the Big Design Up Front approach. Except when contractually required, the waterfall model has been largely superseded by more flexible and versatile methodologies developed specifically for software development. See Criticism of Waterfall model.

Other

Other high-level software project methodologies include:

Process meta-models

Some "process models" are abstract descriptions for evaluating, comparing, and improving the specific process adopted by an organization.

  • ISO/IEC 12207 is the international standard describing the method to select, implement, and monitor the life cycle for software.
  • The Capability Maturity Model Integration (CMMI) is one of the leading models and is based on best practice. Independent assessments grade organizations on how well they follow their defined processes, not on the quality of those processes or the software produced. CMMI has replaced CMM.
  • ISO 9000 describes standards for a formally organized process to manufacture a product and the methods of managing and monitoring progress. Although the standard was originally created for the manufacturing sector, ISO 9000 standards have been applied to software development as well. Like CMMI, certification with ISO 9000 does not guarantee the quality of the end result, only that formalized business processes have been followed.
  • ISO/IEC 15504 Information technology — Process assessment also known as Software Process Improvement Capability Determination (SPICE), is a "framework for the assessment of software processes". This standard is aimed at setting out a clear model for process comparison. SPICE is used much like CMMI. It models processes to manage, control, guide and monitor software development. This model is then used to measure what a development organization or project team actually does during software development. This information is analyzed to identify weaknesses and drive improvement. It also identifies strengths that can be continued or integrated into common practice for that organization or team.
  • ISO/IEC 24744 Software Engineering — Metamodel for Development Methodologies, is a powertype-based metamodel for software development methodologies.
  • SPEM 2.0 by the Object Management Group
  • Soft systems methodology - a general method for improving management processes
  • Method engineering - a general method for improving information system processes

In practice

The three basic approaches applied to software development methodology frameworks.

A variety of such frameworks have evolved over the years, each with its own recognized strengths and weaknesses. One software development methodology framework is not necessarily suitable for use by all projects. Each of the available methodology frameworks is best suited to specific kinds of projects, based on various technical, organizational, project, and team considerations.

Software development organizations implement process methodologies to ease the process of development. Sometimes, contractors may require that specific methodologies be employed; an example is the U.S. defense industry, which requires a rating based on process models to obtain contracts. The international standard for describing the method of selecting, implementing, and monitoring the life cycle for software is ISO/IEC 12207.

A decades-long goal has been to find repeatable, predictable processes that improve productivity and quality. Some try to systematize or formalize the seemingly unruly task of designing software. Others apply project management techniques to designing software. Large numbers of software projects do not meet their expectations in terms of functionality, cost, or delivery schedule - see List of failed and overbudget custom software projects for some notable examples.

Organizations may create a Software Engineering Process Group (SEPG), which is the focal point for process improvement. Composed of line practitioners who have varied skills, the group is at the center of the collaborative effort of everyone in the organization who is involved with software engineering process improvement.

A particular development team may also agree to programming environment details, such as which integrated development environment is used, and one or more dominant programming paradigms, programming style rules, or choice of specific software libraries or software frameworks. These details are generally not dictated by the choice of model or general methodology.

Matthew 10

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Matthew_10

Gospel of Matthew 9:23–10:17 on Codex Sinaiticus, made about AD 330–360.
Book: Gospel of Matthew
Category: Gospel
Christian Bible part: New Testament
Order in the Christian part: 1

Matthew 10 is the tenth chapter in the Gospel of Matthew in the New Testament section of the Christian Bible. Matthew 10 comes after Jesus had called some of his disciples and before the meeting with the disciples of John the Baptist. This section is also known as the Mission Discourse or the Little Commission, in contrast to the Great Commission (Matthew 28:18–20). The Little Commission is directed specifically to the Jewish believers of the early church, while the Great Commission is to all nationalities. The Pulpit Commentary suggests that Jesus' message in this discourse "was hardly likely to have been remembered outside Jewish Christian circles".

Matthew names the twelve apostles, or "twelve disciples", in verses 1 to 4 and the remainder of the chapter consists almost entirely of sayings attributed to Jesus. In this chapter, Jesus sends out the apostles to heal and preach throughout the region and gives them careful instruction. Many of the sayings found in Matthew 10 are also found in Luke 10 and the Gospel of Thomas, which is not part of the accepted canon of the New Testament.

Text

Matthew 10:13–15 on Papyrus 110 (3rd/4th century), recto side.
 
Matthew 10:25–27 on Papyrus 110 (3rd/4th century), verso side.

The original text was written in Koine Greek. This chapter is divided into 42 verses.

Textual witnesses

Some early manuscripts containing the text of this chapter are:

Matthew 10:10–17 on Codex Petropolitanus Purpureus (6th century).
 
Codex Sinaiticus (AD 330–360), Matthew 10:17–11:15

The twelve

The text in verse 1 refers to "his twelve disciples" (Greek: τους δωδεκα μαθητας αυτου, tous dodeka mathetas autou). Verse 2 calls them "the twelve apostles":

²Now the names of the twelve apostles are these: The first, Simon, who is called Peter, and Andrew his brother; James the son of Zebedee, and John his brother; ³Philip, and Bartholomew; Thomas, and Matthew the publican; James the son of Alphaeus, and Lebbaeus, whose surname was Thaddaeus; ⁴Simon the Canaanite, and Judas Iscariot, who also betrayed him.

Verse 5 refers to them simply as "the twelve".

Verse 10

nor bag for your journey, nor two tunics, nor sandals, nor staffs; for a worker is worthy of his food.

Cross reference: Mark 6:8–9; Luke 9:3

  • "Bag" (KJV: "Scrip"): Called "tarmil" in Hebrew as found in one Jewish commentary, it is a large leather bag, which shepherds and travelers carried their food, and other things, hanging it around their necks. The disciples were neither to carry money with them, nor any provisions for their journey.
  • "Two tunics" (KJV: "two coats", NABRE: "a second tunic"): supposedly one to wear during travel, and another to put on, when they came to their quarters. Theologian John Gill suggests that "the disciples were not allowed change of raiment, either because superfluous, or too magnificent to appear in, or too troublesome to carry".
  • "Shoes": only sandals are allowed, according to the Gospel of Mark. There seems to be a difference between shoes and sandals, as appears from the case of the plucking off the shoe, when a man refused his brother's wife: if the "shoe" was plucked off it was regarded; but if the "sandal", it was not minded: this was the old tradition, though custom went against it. Sandals were made of harder leather than shoes, and sometimes of wood covered with leather, and stuck with nails, to make them more durable; though sometimes of bulrushes, and bark of palm trees, and of cork, which were light to walk with. Of what sort these were, the disciples were allowed to travel with, is not certain.
  • "Staffs" (KJV: "staves"): that is, more than one staff, which was sufficient to assist and lean upon during the journey. According to Mark, one staff was allowed, as though they might take a traveling staff, but not staffs for defense or to fight with (Matthew 26:55). Now these several things were forbidden them, partly because they would be burdensome to them in traveling; and partly because they were not to be out any long time, but were quickly to return again; and mainly to teach them to live and depend upon divine providence. Since they were to take neither money, nor provisions with them, and were also to preach the Gospel freely, they might reasonably ask how they should be provided for, and supported, so Jesus said, that they should not be anxiously concerned about that, as he would take care that they had a suitable supply and would so influence and dispose the minds of such, to whom they should minister, as that they should have all necessary provisions made for them, without any care or expense of theirs.
  • "For a worker is worthy of his food" (KJV: "For the workman is worthy of his meat"): Jesus uses this proverbial expression to remark that the disciples are workmen, or laborers in his vineyard, and for doing their duty, they were entitled to all the necessaries of life. This is their due and justified to give it to them, and on which they might depend. So that this whole context is so far from militating against a minister's maintenance by the people, that it most strongly establishes it; for if the apostles were not to take any money or provisions with them, to support themselves with, it clearly follows, that it was the will of Christ, that they should live by the Gospel, upon those to whom they preached and though they were not to make gain of the Gospel, or preach it for filthy lucre's sake, yet they might expect a comfortable subsistence, at the charge of the people, to whom they ministered, and which was their duty to provide for them.

Verse 13

If the household [where you stay] is worthy, let your peace come upon it. But if it is not worthy, let your peace return to you.

Commentator Dale Allison suggests that "your peace" refers to the peace promised "for the eschatological age" (e.g. Isaiah 52:7): How beautiful upon the mountains are the feet of him who brings good news, who proclaims peace. "The gift of peace is not just a social convention: the apostolic greeting should be understood as a sign of the inbreaking of the kingdom."

Verse 18

You will be brought before governors and kings for My sake, as a testimony to them and to the Gentiles.

Cross references: Matthew 27:2

Verse 21

Now brother will deliver up brother to death, and a father his child; and children will rise up against parents and cause them to be put to death.

This prophecy of family strife is based upon Micah 7:6, which was thought to describe the discord of the latter days. The conviction that the great tribulation would turn those of the same household against one another was widespread.

Verse 34

"Think not that I am come to send peace on earth: I came not to send [or bring] peace, but a sword."

This is a much-discussed passage, often explained in terms of the "apocalyptic-eschatological" context of the 1st century.

R. T. France explains the verse, in context with the subsequent verse 35: "The sword Jesus brings is not here military conflict, but, as vv. 35–36 show, a sharp social division which even severs the closest family ties. … Jesus speaks here, as in the preceding and following verses, more of a division in men’s personal response to him."

The text of Matthew's Gospel in the Book of Kells alters gladium, the Vulgate translation of makhairan "sword", to gaudium "joy", resulting in a reading of "I came not [only] to bring peace, but [also] joy".

Verse 38

And he who does not take his cross and follow after Me is not worthy of Me.
  • "Take his cross": is in the sense of "willingly to undergo the severe trials that fall to his lot" (2 Corinthians 1:5; Philippians 3:10); a figurative expression taken from the practice that "condemned criminals were compelled to take up their own cross and carry it to the place of execution" (Matthew 27:32; Luke 23:26; John 19:16)

Parallels in the Gospel of Thomas

Matthew 10 contains many parallels found in the Gospel of Thomas.

  • Matthew 10:16 parallels saying 39 in the Gospel of Thomas.
  • Matthew 10:37 parallels sayings 55 and 101
  • Matthew 10:27b parallels saying 33a.
  • Matthew 10:34–36 parallels saying 16.
  • Matthew 10:26 parallels saying 5b.

Natural theology

From Wikipedia, the free encyclopedia

Natural theology, once also termed physico-theology, is a type of theology that provides arguments for the existence of God based on reason and ordinary experience of nature.

This distinguishes it from revealed theology, which is based on scripture and/or religious experiences, and from transcendental theology, which is based on a priori reasoning. It is thus a type of philosophy, with the aim of explaining the nature of the gods, or of one supreme God. For monotheistic religions, this principally involves arguments about the attributes or non-attributes of God, and especially the existence of God, using arguments that do not involve recourse to supernatural revelation.

Marcus Terentius Varro (116–27 BCE) established a distinction between political theology (the social functions of religion), natural theology and mythical theology. His terminology became part of the Stoic tradition and then Christianity through Augustine of Hippo and Thomas Aquinas.

Ancient Greece

Besides Hesiod's Works and Days and Zarathushtra's Gathas, Plato gives the earliest surviving account of a natural theology. In the Timaeus, written c. 360 BCE, we read: "We must first investigate concerning [the whole Cosmos] that primary question which has to be investigated at the outset in every case, — namely, whether it has always existed, having no beginning or generation, or whether it has come into existence, having begun from some beginning." In the Laws, in answer to the question as to what arguments justify faith in the gods, Plato affirms: "One is our dogma about the soul...the other is our dogma concerning the ordering of the motion of the stars".

Ancient Rome

Varro (Marcus Terentius Varro) in his (lost) Antiquitates rerum humanarum et divinarum (Antiquities of Human and Divine Things, 1st century BCE) established a distinction between three kinds of theology: civil (political) (theologia civilis), natural (physical) (theologia naturalis) and mythical (theologia mythica). The theologians of civil theology are "the people", asking how the gods relate to daily life and the state (imperial cult). The theologians of natural theology are the philosophers, asking about the nature of the gods, and the theologians of mythical theology are the poets, crafting mythology.

Middle Ages

From the 8th century CE, the Mutazilite school of Islam, compelled to defend their principles against the orthodox Islam of their day, used philosophy for support, and were among the first to pursue a rational Islamic theology, termed Ilm-al-Kalam (scholastic theology). The teleological argument was later presented by the early Islamic philosophers Alkindus and Averroes, while Avicenna presented both the cosmological argument and the ontological argument in The Book of Healing (1027).

Thomas Aquinas (c. 1225 – 1274) presented several versions of the cosmological argument in his Summa Theologica, and of the teleological argument in his Summa contra Gentiles. He presented the ontological argument, but rejected it in favor of proofs that invoke cause and effect alone. His quinque viae ("five ways") in those books attempted to demonstrate the existence of God in different ways, including (as way No. 5) the goal-directed actions seen in nature.

Early modern onward

Raymond of Sabunde's (c. 1385–1436) Theologia Naturalis sive Liber Creaturarum, written 1434–1436, but published posthumously (1484), marks an important stage in the history of natural theology.

John Ray (1627–1705), also known as John Wray, was an English naturalist, sometimes referred to as the father of English natural history. He published important works on plants, animals, and natural theology, with the objective "to illustrate the glory of God in the knowledge of the works of nature or creation".

William Derham (1657–1735) continued Ray's tradition of natural theology in two of his own works, Physico-Theology, published during 1713, and Astro-Theology, 1714. These later influenced the work of William Paley.

In An Essay on the Principle of Population, published during 1798, Thomas Malthus ended with two chapters on natural theology and population. Malthus—a devout Christian—argued that revelation would "damp the soaring wings of intellect", and thus never let "the difficulties and doubts of parts of the scripture" interfere with his work.

William Paley, an important influence on Charles Darwin, who studied theology at Christ's College in Cambridge, gave a well-known rendition of the teleological argument for God. During 1802 he published Natural Theology, or Evidences of the Existence and Attributes of the Deity collected from the Appearances of Nature. In this he described the Watchmaker analogy, for which he is probably best known. However, his book, which was one of the most published books of the 19th and 20th centuries, presents a number of teleological and cosmological arguments for the existence of God. The book served as a template for many subsequent natural theologies during the 19th century.

Professor of chemistry and natural history, Edward Hitchcock also studied and wrote on natural theology. He attempted to unify and reconcile science and religion, emphasizing geology. His major work of this type was The Religion of Geology and its Connected Sciences (1851).

The Gifford Lectures were established by the will of Adam Lord Gifford to "promote and diffuse the study of Natural Theology in the widest sense of the term—in other words, the knowledge of God." The term natural theology as used by Gifford means theology supported by science and not dependent on the miraculous.

Bridgewater Treatises

Debates over the applicability of teleology to scientific questions continued during the nineteenth century, as Paley's argument about design conflicted with radical new theories on the transmutation of species. In order to support the scientific ideas of the time, which explored the natural world within Paley's framework of a divine designer, Francis Henry Egerton, 8th Earl of Bridgewater, a gentleman naturalist, commissioned eight Bridgewater Treatises upon his deathbed to explore "the Power, Wisdom, and Goodness of God, as manifested in the Creation." They were published first during the years 1833 to 1840, and afterwards in Bohn's Scientific Library. The treatises are:

  1. The Adaptation of External Nature to the Moral and Intellectual Condition of Man, by Thomas Chalmers, D. D.
  2. On The Adaptation of External Nature to the Physical Condition of Man, by John Kidd, M. D.
  3. Astronomy and General Physics considered with reference to Natural Theology, by William Whewell, D. D.
  4. The Hand, its Mechanism and Vital Endowments as evincing Design, by Sir Charles Bell.
  5. Animal and Vegetable Physiology considered with reference to Natural Theology, by Peter Mark Roget.
  6. Geology and Mineralogy considered with reference to Natural Theology, by William Buckland, D.D.
  7. On the History, Habits and Instincts of Animals, by William Kirby.
  8. Chemistry, Meteorology, and the Function of Digestion, considered with reference to Natural Theology, by William Prout, M.D.

In response to the claim in Whewell's treatise that "We may thus, with the greatest propriety, deny to the mechanical philosophers and mathematicians of recent times any authority with regard to their views of the administration of the universe", Charles Babbage published what he termed The Ninth Bridgewater Treatise, A Fragment. As his preface states, this volume was not part of that series, but rather his own considerations of the subject. He draws on his own work on calculating engines to consider God as a divine programmer setting complex laws as the basis of what we think of as miracles, rather than miraculously producing new species by creative whim. There was also a fragmentary supplement to this, published posthumously by Thomas Hill.

The theology of the Bridgewater Treatises was often disputed, given that it assumed humans could have knowledge of God acquired by observation and reasoning without the aid of revealed knowledge.

The works are of unequal merit; several of them were esteemed as apologetic literature, but they attracted considerable criticism. One notable critic of the Bridgewater Treatises was Edgar Allan Poe, who wrote Criticism. Robert Knox, an Edinburgh surgeon and major advocate of radical morphology, referred to them as the "Bilgewater Treatises", to mock the "ultra-teleological school". Though memorable, this phrase overemphasises the influence of teleology in the series, at the expense of the idealism of the likes of Kirby and Roget.
