
Friday, July 1, 2022

Entropy (order and disorder)

From Wikipedia, the free encyclopedia
 
Boltzmann's molecules (1896) shown at a "rest position" in a solid

In thermodynamics, entropy is often associated with the amount of order or disorder in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced [reduction] to the alteration in some way or another of the arrangement of the constituent parts of the working body" and that internal work associated with these alterations is quantified energetically by a measure of "entropy" change, according to the following differential expression:

dS = δQ/T

where Q = motional energy (“heat”) that is transferred reversibly to the system from the surroundings and T = the absolute temperature at which the transfer occurs
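
As a simple worked illustration (an added example, not part of the original article): if 300 J of heat is transferred reversibly into a system held at 300 K, the entropy change is ΔS = Q/T = 300 J / 300 K = 1 J/K.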

In the years to follow, Ludwig Boltzmann translated these 'alterations of arrangement' into a probabilistic view of order and disorder in gas-phase molecular systems. In the context of entropy, "perfect internal disorder" has often been regarded as describing thermodynamic equilibrium, but since the thermodynamic concept is so far from everyday thinking, the use of the term in physics and chemistry has caused much confusion and misunderstanding.

In recent years, to interpret the concept of entropy by further describing these 'alterations of arrangement', usage has shifted away from the words 'order' and 'disorder' toward words such as 'spread' and 'dispersal'.

History

This "molecular ordering" entropy perspective traces its origins to molecular movement interpretations developed by Rudolf Clausius in the 1850s, particularly with his 1862 visual conception of molecular disgregation. Similarly, in 1859, after reading a paper on the diffusion of molecules by Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics.

In 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and was so inspired by it that he spent much of his long and distinguished life developing the subject further. Later, Boltzmann, in efforts to develop a kinetic theory for the behavior of a gas, applied the laws of probability to Maxwell's and Clausius' molecular interpretation of entropy so as to begin to interpret entropy in terms of order and disorder. Similarly, in 1882 Hermann von Helmholtz used the word "Unordnung" (disorder) to describe entropy.

Overview

To highlight the fact that order and disorder are commonly understood to be measured in terms of entropy, below are current science encyclopedia and science dictionary definitions of entropy:

  • A measure of the unavailability of a system's energy to do work; also a measure of disorder; the higher the entropy the greater the disorder.
  • A measure of disorder; the higher the entropy the greater the disorder.
  • In thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder the higher the entropy.
  • A measure of disorder in the universe or of the unavailability of the energy in a system to do work.

Entropy and disorder also have associations with equilibrium. Technically, entropy, from this perspective, is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium—that is, to perfect internal disorder. Likewise, the value of the entropy of a distribution of atoms and molecules in a thermodynamic system is a measure of the disorder in the arrangements of its particles. In a stretched out piece of rubber, for example, the arrangement of the molecules of its structure has an “ordered” distribution and has zero entropy, while the “disordered” kinky distribution of the atoms and molecules in the rubber in the non-stretched state has positive entropy. Similarly, in a gas, the order is perfect and the measure of entropy of the system has its lowest value when all the molecules are in one place, whereas when more points are occupied the gas is all the more disorderly and the measure of the entropy of the system has its largest value.

In systems ecology, as another example, the entropy of a collection of items comprising a system is defined as a measure of their disorder or equivalently the relative likelihood of the instantaneous configuration of the items. Moreover, according to theoretical ecologist and chemical engineer Robert Ulanowicz, “that entropy might provide a quantification of the heretofore subjective notion of disorder has spawned innumerable scientific and philosophical narratives.” In particular, many biologists have taken to speaking in terms of the entropy of an organism, or about its antonym negentropy, as a measure of the structural order within an organism.

The mathematical basis for the association of entropy with order and disorder begins, essentially, with the famous Boltzmann formula, S = k ln W, which relates entropy S to the number of possible states W in which a system can be found. As an example, consider a box that is divided into two sections. What is the probability that a certain number, or all, of the particles will be found in one section versus the other when the particles are randomly allocated to different places within the box? If you only have one particle, then that system of one particle can subsist in two states, one side of the box versus the other. If you have more than one particle, or define states as being further locational subdivisions of the box, the entropy is larger because the number of states is greater (a short counting sketch follows the quotation below). The relationship between entropy, order, and disorder in the Boltzmann equation is so clear among physicists that according to the views of thermodynamic ecologists Sven Jorgensen and Yuri Svirezhev, “it is obvious that entropy is a measure of order or, most likely, disorder in the system.” In this direction, the second law of thermodynamics, as famously enunciated by Rudolf Clausius in 1865, states that:

The entropy of the universe tends to a maximum.
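
Returning to the divided-box example above, the following is a minimal Python sketch of how counting states connects to entropy via Boltzmann's S = k ln W; the particle numbers and the two-cell division are illustrative choices, not values from the article.

  import math

  k_B = 1.380649e-23  # Boltzmann constant, J/K

  def box_entropy(n_particles, n_cells):
      # W = n_cells ** n_particles microstates for distinguishable, non-interacting
      # particles, so S = k ln W = k * n_particles * ln(n_cells)
      return k_B * n_particles * math.log(n_cells)

  print(box_entropy(1, 2))    # one particle, two halves of the box: S = k ln 2
  print(box_entropy(20, 2))   # more particles -> more possible states -> larger S

  # Probability that all N randomly placed particles end up in one chosen half:
  N = 20
  print(0.5 ** N)             # roughly 1e-6: the highly "ordered" arrangement is rare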

Thus, if entropy is associated with disorder and if the entropy of the universe is headed towards maximal entropy, many are often puzzled as to the nature of the "ordering" process and operation of evolution in relation to Clausius' most famous version of the second law, which states that the universe is headed towards maximal “disorder”. In the 2003 book SYNC – the Emerging Science of Spontaneous Order by Steven Strogatz, for example, we find “Scientists have often been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy. Yet all around us we see magnificent structures—galaxies, cells, ecosystems, human beings—that have all somehow managed to assemble themselves.”

The common argument used to explain this is that, locally, entropy can be lowered by external action, e.g. solar heating action, and that this applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, to growing crystals, and to living organisms. This local increase in order is, however, only possible at the expense of an entropy increase in the surroundings; here more disorder must be created. The qualifier to this statement is that living systems are open systems, in which heat, mass, and/or work may transfer into or out of the system. Unlike temperature, the putative entropy of a living system would drastically change if the organism were thermodynamically isolated. If an organism were in this type of “isolated” situation, its entropy would increase markedly as the once-living components of the organism decayed to an unrecognizable mass.

Phase change

Owing to these early developments, the typical example of entropy change ΔS is that associated with phase change. Solids, for example, which are typically ordered on the molecular scale, usually have smaller entropy than liquids, liquids have smaller entropy than gases, and colder gases have smaller entropy than hotter gases. Moreover, according to the third law of thermodynamics, at absolute zero temperature, crystalline structures are approximated to have perfect "order" and zero entropy. This correlation occurs because the number of different microscopic quantum energy states available to an ordered system is usually much smaller than the number of states available to a system that appears to be disordered.

From his famous 1896 Lectures on Gas Theory, Boltzmann diagrams the structure of a solid body, as shown above, by postulating that each molecule in the body has a "rest position". According to Boltzmann, if it approaches a neighboring molecule it is repelled by it, but if it moves farther away there is an attraction. This, of course, was a revolutionary perspective in its time; many, during these years, did not believe in the existence of either atoms or molecules (see: history of the molecule). According to these early views, and others such as those developed by William Thomson, if energy in the form of heat is added to a solid, so as to turn it into a liquid or a gas, a common depiction is that the ordering of the atoms and molecules becomes more random and chaotic with an increase in temperature:

Illustration of the transition from solid to liquid to gas

Thus, according to Boltzmann, owing to increases in thermal motion, whenever heat is added to a working substance, the rest position of molecules will be pushed apart, the body will expand, and this will create more molar-disordered distributions and arrangements of molecules. These disordered arrangements, subsequently, correlate, via probability arguments, to an increase in the measure of entropy.

Entropy-driven order

Entropy has been historically, e.g. by Clausius and Helmholtz, associated with disorder. However, in common speech, order is used to describe organization, structural regularity, or form, like that found in a crystal compared with a gas. This commonplace notion of order is described quantitatively by Landau theory. In Landau theory, the development of order in the everyday sense coincides with the change in the value of a mathematical quantity, a so-called order parameter. An example of an order parameter for crystallization is "bond orientational order" describing the development of preferred directions (the crystallographic axes) in space. For many systems, phases with more structural (e.g. crystalline) order exhibit less entropy than fluid phases under the same thermodynamic conditions. In these cases, labeling phases as ordered or disordered according to the relative amount of entropy (per the Clausius/Helmholtz notion of order/disorder) or via the existence of structural regularity (per the Landau notion of order/disorder) produces matching labels.

However, there is a broad class of systems that manifest entropy-driven order, in which phases with organization or structural regularity, e.g. crystals, have higher entropy than structurally disordered (e.g. fluid) phases under the same thermodynamic conditions. In these systems phases that would be labeled as disordered by virtue of their higher entropy (in the sense of Clausius or Helmholtz) are ordered in both the everyday sense and in Landau theory.

Under suitable thermodynamic conditions, entropy has been predicted or discovered to induce systems to form ordered liquid-crystals, crystals, and quasicrystals. In many systems, directional entropic forces drive this behavior. More recently, it has been shown that it is possible to precisely engineer particles for target ordered structures.

Adiabatic demagnetization

In the quest for ultra-cold temperatures, a temperature-lowering technique called adiabatic demagnetization is used, in which atomic entropy considerations can be described in order-disorder terms. In this process, a sample of solid, such as chrome-alum salt, whose molecules are equivalent to tiny magnets, is placed inside an insulated enclosure and cooled to a low temperature, typically 2 or 4 kelvins, while a strong magnetic field is applied to the container using a powerful external magnet, so that the tiny molecular magnets are aligned, forming a well-ordered "initial" state at that low temperature. This magnetic alignment means that the magnetic energy of each molecule is minimal. The external magnetic field is then reduced, a removal that is considered to be closely reversible. Following this reduction, the atomic magnets assume random, less-ordered orientations, owing to thermal agitations, in the "final" state:

Entropy "order"/"disorder" considerations in the process of adiabatic demagnetization

The "disorder" and hence the entropy associated with the change in the atomic alignments has clearly increased. In terms of energy flow, the movement from a magnetically aligned state requires energy from the thermal motion of the molecules, converting thermal energy into magnetic energy. Yet, according to the second law of thermodynamics, because no heat can enter or leave the container, due to its adiabatic insulation, the system should exhibit no change in entropy, i.e. ΔS = 0. The increase in disorder, however, associated with the randomizing directions of the atomic magnets represents an entropy increase? To compensate for this, the disorder (entropy) associated with the temperature of the specimen must decrease by the same amount. The temperature thus falls as a result of this process of thermal energy being converted into magnetic energy. If the magnetic field is then increased, the temperature rises and the magnetic salt has to be cooled again using a cold material such as liquid helium.

Difficulties with the term "disorder"

In recent years the long-standing use of the term "disorder" to discuss entropy has met with some criticism. Critics of the terminology state that entropy is not a measure of 'disorder' or 'chaos', but rather a measure of energy's diffusion or dispersal to more microstates. Shannon's use of the term 'entropy' in information theory refers to the most compressed, or least dispersed, amount of code needed to encompass the content of a signal.
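
To make the information-theoretic usage concrete, here is a minimal Python sketch (added for illustration; the example distributions are arbitrary) of Shannon entropy as a measure of how "spread out" a probability distribution is:

  import math

  def shannon_entropy(probabilities):
      # H = -sum(p * log2 p), in bits; zero-probability outcomes contribute nothing
      return -sum(p * math.log2(p) for p in probabilities if p > 0)

  print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: two equally likely outcomes
  print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits: nearly certain, little "spread"
  print(shannon_entropy([0.25] * 4))    # 2.0 bits: four equally likely outcomes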

Visualization (graphics)

From Wikipedia, the free encyclopedia

Visualization of how a car deforms in an asymmetrical crash using finite element analysis

Visualization or visualisation (see spelling differences) is any technique for creating images, diagrams, or animations to communicate a message. Visualization through visual imagery has been an effective way to communicate both abstract and concrete ideas since the dawn of humanity. Examples from history include cave paintings, Egyptian hieroglyphs, Greek geometry, and Leonardo da Vinci's revolutionary methods of technical drawing for engineering and scientific purposes.

Visualization today has ever-expanding applications in science, education, engineering (e.g., product visualization), interactive multimedia, medicine, etc. A typical application area of visualization is the field of computer graphics. The invention of computer graphics (and 3D computer graphics) may be the most important development in visualization since the invention of central perspective in the Renaissance period. The development of animation also helped advance visualization.

Overview

The Ptolemy world map, reconstituted from Ptolemy's Geographia (circa 150), indicating the countries of "Serica" and "Sinae" (China) at the extreme right, beyond the island of "Taprobane" (Sri Lanka, oversized) and the "Aurea Chersonesus" (Southeast Asian peninsula)
 
Charles Minard's information graphic of Napoleon's march

The use of visualization to present information is not a new phenomenon. It has been used in maps, scientific drawings, and data plots for over a thousand years. Examples from cartography include Ptolemy's Geographia (2nd century AD), a map of China (1137 AD), and Minard's map (1861) of Napoleon's invasion of Russia. Most of the concepts learned in devising these images carry over in a straightforward manner to computer visualization. Edward Tufte has written three critically acclaimed books that explain many of these principles.

Computer graphics has from its beginning been used to study scientific problems. However, in its early days the lack of graphics power often limited its usefulness. The recent emphasis on visualization started in 1987 with the publication of Visualization in Scientific Computing, a special issue of Computer Graphics. Since then, there have been several conferences and workshops, co-sponsored by the IEEE Computer Society and ACM SIGGRAPH, devoted to the general topic, and special areas in the field, for example volume visualization.

Most people are familiar with the digital animations produced to present meteorological data during weather reports on television, though few can distinguish between those models of reality and the satellite photos that are also shown on such programs. TV also offers scientific visualizations when it shows computer drawn and animated reconstructions of road or airplane accidents. Some of the most popular examples of scientific visualizations are computer-generated images that show real spacecraft in action, out in the void far beyond Earth, or on other planets. Dynamic forms of visualization, such as educational animation or timelines, have the potential to enhance learning about systems that change over time.

Apart from the distinction between interactive visualizations and animation, the most useful categorization is probably between abstract and model-based scientific visualizations. The abstract visualizations show completely conceptual constructs in 2D or 3D. These generated shapes are completely arbitrary. The model-based visualizations either place overlays of data on real or digitally constructed images of reality or make a digital construction of a real object directly from the scientific data.

Scientific visualization is usually done with specialized software, though there are a few exceptions, noted below. Some of these specialized programs have been released as open source software, very often having their origins in universities, within an academic environment where sharing software tools and giving access to the source code is common. There are also many proprietary software packages of scientific visualization tools.

Models and frameworks for building visualizations include the data flow models popularized by systems such as AVS, IRIS Explorer, and VTK toolkit, and data state models in spreadsheet systems such as the Spreadsheet for Visualization and Spreadsheet for Images.
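
As an illustration of the data-flow style popularized by such systems, the following minimal sketch builds a source-to-renderer pipeline with VTK's Python bindings; it assumes the vtk package is installed, and the particular source and resolution settings are arbitrary choices for demonstration.

  import vtk

  # Data-flow pipeline: source -> mapper -> actor -> renderer -> render window
  source = vtk.vtkSphereSource()            # produces polygonal data
  source.SetThetaResolution(32)
  source.SetPhiResolution(32)

  mapper = vtk.vtkPolyDataMapper()          # turns the data into graphics primitives
  mapper.SetInputConnection(source.GetOutputPort())

  actor = vtk.vtkActor()                    # places the mapped data in the scene
  actor.SetMapper(mapper)

  renderer = vtk.vtkRenderer()
  renderer.AddActor(actor)

  window = vtk.vtkRenderWindow()
  window.AddRenderer(renderer)

  interactor = vtk.vtkRenderWindowInteractor()
  interactor.SetRenderWindow(window)

  window.Render()
  interactor.Start()                        # hand control to the interactive loop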

Applications

Scientific visualization

Simulation of a Rayleigh–Taylor instability caused by two mixing fluids
 

As a subject in computer science, scientific visualization is the use of interactive, sensory representations, typically visual, of abstract data to reinforce cognition, hypothesis building, and reasoning. Scientific visualization is the transformation, selection, or representation of data from simulations or experiments, with an implicit or explicit geometric structure, to allow the exploration, analysis, and understanding of the data. Scientific visualization focuses on and emphasizes the representation of higher-order data using primarily graphics and animation techniques. It is a very important part of visualization and perhaps the earliest one, as the visualization of experiments and phenomena is as old as science itself. Traditional areas of scientific visualization are flow visualization, medical visualization, astrophysical visualization, and chemical visualization. There are several different techniques to visualize scientific data, with isosurface reconstruction and direct volume rendering being the more common.
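
As a small sketch of isosurface reconstruction (one of the common techniques mentioned above), the following Python snippet extracts a triangle mesh from a synthetic scalar field using the marching cubes routine in scikit-image; the field, grid size and iso-level are illustrative assumptions.

  import numpy as np
  from skimage import measure  # scikit-image

  # Synthetic scalar field on a 3-D grid: distance from the centre of the volume
  x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
  field = np.sqrt(x**2 + y**2 + z**2)

  # Isosurface reconstruction: extract the surface where field == 0.5
  # (for this field, a sphere of radius 0.5) as a triangle mesh
  verts, faces, normals, values = measure.marching_cubes(field, level=0.5)

  print(verts.shape, faces.shape)  # mesh vertices and triangles, ready for rendering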

Data visualization

Data visualization is a related subcategory of visualization dealing with statistical graphics and geospatial data (as in thematic cartography) that is abstracted in schematic form.

Information visualization

Relative average utilization of IPv4
 

Information visualization concentrates on the use of computer-supported tools to explore large amounts of abstract data. The term "information visualization" was originally coined by the User Interface Research Group at Xerox PARC, which included Jock Mackinlay. Practical application of information visualization in computer programs involves selecting, transforming, and representing abstract data in a form that facilitates human interaction for exploration and understanding. Important aspects of information visualization are the dynamics of the visual representation and the interactivity. Strong techniques enable the user to modify the visualization in real time, thus affording unparalleled perception of patterns and structural relations in the abstract data in question.

Educational visualization

Educational visualization uses a simulation to create an image of something so that it can be taught. This is very useful when teaching a topic that is difficult to see otherwise, for example, atomic structure, because atoms are far too small to be studied easily without expensive and difficult-to-use scientific equipment.

Knowledge visualization

Knowledge visualization uses visual representations to transfer knowledge between at least two persons, aiming to improve the transfer of knowledge by using computer- and non-computer-based visualization methods complementarily. Thus properly designed visualization is an important part not only of data analysis but of the knowledge transfer process, too. Knowledge transfer may be significantly improved using hybrid designs, as they enhance information density but may decrease clarity as well. For example, visualization of a 3D scalar field may be implemented using iso-surfaces for field distribution and textures for the gradient of the field. Examples of such visual formats are sketches, diagrams, images, objects, interactive visualizations, information visualization applications, and imaginary visualizations as in stories. While information visualization concentrates on the use of computer-supported tools to derive new insights, knowledge visualization focuses on transferring insights and creating new knowledge in groups. Beyond the mere transfer of facts, knowledge visualization aims to further transfer insights, experiences, attitudes, values, expectations, perspectives, opinions, and predictions by using various complementary visualizations. See also: picture dictionary, visual dictionary

Product visualization

Product visualization involves visualization software technology for the viewing and manipulation of 3D models, technical drawing and other related documentation of manufactured components and large assemblies of products. It is a key part of product lifecycle management. Product visualization software typically provides high levels of photorealism so that a product can be viewed before it is actually manufactured. This supports functions ranging from design and styling to sales and marketing. Technical visualization is an important aspect of product development. Originally technical drawings were made by hand, but with the rise of advanced computer graphics the drawing board has been replaced by computer-aided design (CAD). CAD-drawings and models have several advantages over hand-made drawings such as the possibility of 3-D modeling, rapid prototyping, and simulation. 3D product visualization promises more interactive experiences for online shoppers, but also challenges retailers to overcome hurdles in the production of 3D content, as large-scale 3D content production can be extremely costly and time-consuming.

Visual communication

Visual communication is the communication of ideas through the visual display of information. Primarily associated with two dimensional images, it includes: alphanumerics, art, signs, and electronic resources. Recent research in the field has focused on web design and graphically oriented usability.

Visual analytics

Visual analytics focuses on human interaction with visualization systems as part of a larger process of data analysis. Visual analytics has been defined as "the science of analytical reasoning supported by the interactive visual interface".

Its focus is on human information discourse (interaction) within massive, dynamically changing information spaces. Visual analytics research concentrates on support for perceptual and cognitive operations that enable users to detect the expected and discover the unexpected in complex information spaces.

Technologies resulting from visual analytics find their application in almost all fields, but are being driven by critical needs (and funding) in biology and national security.

Interactivity

Interactive visualization or interactive visualisation is a branch of graphic visualization in computer science that involves studying how humans interact with computers to create graphic illustrations of information and how this process can be made more efficient.

For a visualization to be considered interactive it must satisfy two criteria:

  • Human input: control of some aspect of the visual representation of information, or of the information being represented, must be available to a human, and
  • Response time: changes made by the human must be incorporated into the visualization in a timely manner. In general, interactive visualization is considered a soft real-time task.

One particular type of interactive visualization is virtual reality (VR), where the visual representation of information is presented using an immersive display device such as a stereo projector (see stereoscopy). VR is also characterized by the use of a spatial metaphor, where some aspect of the information is represented in three dimensions so that humans can explore the information as if it were present (when in fact it is remote), appropriately sized (when in fact it is at a much smaller or larger scale than humans can sense directly), or as if it had shape (when in fact it might be completely abstract).

Another type of interactive visualization is collaborative visualization, in which multiple people interact with the same computer visualization to communicate their ideas to each other or to explore information cooperatively. Frequently, collaborative visualization is used when people are physically separated. Using several networked computers, the same visualization can be presented to each person simultaneously. The people then make annotations to the visualization as well as communicate via audio (e.g., telephone), video (e.g., a video-conference), or text (e.g., IRC) messages.

Human control of visualization

The Programmer's Hierarchical Interactive Graphics System (PHIGS) was one of the first programmatic efforts at interactive visualization and provided an enumeration of the types of input humans provide. People can:

  1. Pick some part of an existing visual representation;
  2. Locate a point of interest (which may not have an existing representation);
  3. Stroke a path;
  4. Choose an option from a list of options;
  5. Valuate by inputting a number; and
  6. Write by inputting text.

All of these actions require a physical device. Input devices range from the common – keyboards, mice, graphics tablets, trackballs, and touchpads – to the esoteric – wired gloves, boom arms, and even omnidirectional treadmills.

These input actions can be used to control both the information being represented and the way that the information is presented. When the information being presented is altered, the visualization is usually part of a feedback loop. For example, consider an aircraft avionics system where the pilot inputs roll, pitch, and yaw and the visualization system provides a rendering of the aircraft's new attitude. Another example would be a scientist who changes a simulation while it is running in response to a visualization of its current progress. This is called computational steering.

More frequently, the representation of the information is changed rather than the information itself.

Rapid response to human input

Experiments have shown that a delay of more than 20 ms between when input is provided and a visual representation is updated is noticeable to most people. Thus it is desirable for an interactive visualization to provide a rendering based on human input within this time frame. However, when large amounts of data must be processed to create a visualization, this becomes hard or even impossible with current technology. Thus the term "interactive visualization" is usually applied to systems that provide feedback to users within several seconds of input. The term interactive framerate is often used to measure how interactive a visualization is. Framerates measure the frequency with which an image (a frame) can be generated by a visualization system. A framerate of 50 frames per second (frame/s) is considered good while 0.1 frame/s would be considered poor. The use of framerates to characterize interactivity is slightly misleading, however, since framerate is a measure of bandwidth while humans are more sensitive to latency. Specifically, it is possible to achieve a good framerate of 50 frame/s, but if the images generated refer to changes to the visualization that a person made more than 1 second ago, it will not feel interactive to a person.
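
The bandwidth-versus-latency distinction can be sketched with a toy render loop in Python; the 20 ms frame budget and the 50-frame pipeline delay are invented numbers chosen only to reproduce the "50 frame/s but one-second-old images" situation described above.

  import time

  FRAME_BUDGET = 0.02     # 20 ms of work per frame -> about 50 frame/s
  PIPELINE_DEPTH = 50     # frames drawn before a given input's effect appears

  def render_frame():
      time.sleep(FRAME_BUDGET)   # stand-in for the actual rendering work

  input_time = time.perf_counter()    # a user interaction arrives now...
  for _ in range(PIPELINE_DEPTH):     # ...but stale frames are shown until its
      render_frame()                  # effect works through the pipeline
  display_time = time.perf_counter()

  framerate = 1.0 / FRAME_BUDGET
  latency = display_time - input_time
  print(f"~{framerate:.0f} frame/s, but ~{latency:.1f} s input-to-display latency")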

The rapid response time required for interactive visualization is a difficult constraint to meet and there are several approaches that have been explored to provide people with rapid visual feedback based on their input. Some include

  1. Parallel rendering – where more than one computer or video card is used simultaneously to render an image. Multiple frames can be rendered at the same time by different computers and the results transferred over the network for display on a single monitor. This requires each computer to hold a copy of all the information to be rendered and increases bandwidth, but also increases latency. Also, each computer can render a different region of a single frame and send the results over a network for display. This again requires each computer to hold all of the data and can lead to a load imbalance when one computer is responsible for rendering a region of the screen with more information than other computers. Finally, each computer can render an entire frame containing a subset of the information. The resulting images plus the associated depth buffer can then be sent across the network and merged with the images from other computers. The result is a single frame containing all the information to be rendered, even though no single computer's memory held all of the information. This is called parallel depth compositing and is used when large amounts of information must be rendered interactively.
  2. Progressive rendering – where a framerate is guaranteed by rendering some subset of the information to be presented and providing incremental (progressive) improvements to the rendering once the visualization is no longer changing.
  3. Level-of-detail (LOD) rendering – where simplified representations of information are rendered to achieve a desired framerate while a person is providing input and then the full representation is used to generate a still image once the person is through manipulating the visualization. One common variant of LOD rendering is subsampling (a brief sketch follows this list). When the information being represented is stored in a topologically rectangular array (as is common with digital photos, MRI scans, and finite difference simulations), a lower resolution version can easily be generated by skipping n points for each 1 point rendered. Subsampling can also be used to accelerate rendering techniques such as volume visualization that require more than twice the computations for an image twice the size. By rendering a smaller image and then scaling the image to fill the requested screen space, much less time is required to render the same data.
  4. Frameless rendering – where the visualization is no longer presented as a time series of images, but as a single image where different regions are updated over time.
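
As an illustration of the subsampling variant of LOD rendering described in item 3, the following minimal Python sketch thins a gridded dataset while the user is interacting; the array size and subsampling factor are arbitrary.

  import numpy as np

  full = np.random.rand(2048, 2048)  # stand-in for a large, regularly gridded dataset

  def subsample(data, n):
      # keep every n-th point along each axis; the rest are skipped during interaction
      return data[::n, ::n]

  preview = subsample(full, 8)       # low-resolution version rendered while interacting
  print(full.shape, preview.shape)   # (2048, 2048) -> (256, 256): ~64x less data

  # Once the user stops manipulating the view, the full-resolution array is rendered
  # to produce the final still image.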

Urban resilience

From Wikipedia, the free encyclopedia
 
Tuned mass damper in Taipei 101, the world's third tallest skyscraper

Urban resilience has conventionally been defined as the "measurable ability of any urban system, with its inhabitants, to maintain continuity through all shocks and stresses, while positively adapting and transforming towards sustainability". Therefore, a resilient city is one that assesses, plans and acts to prepare for and respond to hazards - natural and human-made, sudden and slow-onset, expected and unexpected. Resilient cities are better positioned to protect and enhance people's lives, secure development gains, foster an investible environment, and drive positive change. Academic discussion of urban resilience has focused primarily on three distinct threats: climate change, natural disasters, and terrorism. Resilience to these threats has been discussed in the context of non-physical as well as physical aspects of urban planning and design. Accordingly, resilience strategies have tended to be conceived of in terms of counter-terrorism, other disasters (earthquakes, wildfires, tsunamis, coastal flooding, solar flares, etc.), and infrastructure adoption of sustainable energy.

More recently, there has been increasing attention to genealogies of urban resilience and the capability of urban systems to adapt to changing conditions. This branch of resilience theory builds on a notion of cities as highly complex adaptive systems. The implication of this insight is to move urban planning away from conventional approaches based in geometric plans to an approach informed by network science that involves less interference in the functioning of cities. Network science provides a way of linking city size to the forms of networks that are likely to enable cities to function in different ways. It can further provide insights into the potential effectiveness of various urban policies. This requires a better understanding of the types of practices and tools that contribute to building urban resilience. Genealogical approaches explore the evolution of these practices over time, including the values and power relations underpinning them.

Building resilience in cities relies on investment decisions that prioritize spending on activities that offer alternatives, which perform well in different scenarios. Such decisions need to take into account future risks and uncertainties. Because risk can never be fully eliminated, emergency and disaster planning is crucial. Disaster risk management frameworks, for example, offer practical opportunities for enhancing resilience.

More than half of the world's human population has lived in cities since 2007, and urbanization is projected to rise to 80% by 2050. This means that the major resilience challenges of our era, such as poverty reduction, natural hazards and climate change, environmental sustainability, and social inclusion, will be won or lost in cities. The mass density of people in cities makes them especially vulnerable both to the impacts of acute disasters and to the slow, creeping effects of the changing climate, making resilience planning critically important. At the same time, growing urbanization over the past century has been associated with a considerable increase in urban sprawl. Resilience efforts address how individuals, communities and businesses not only cope in the face of multiple shocks and stresses, but also exploit opportunities for transformational development.

As one way of addressing disaster risk in urban areas, national and local governments, often supported by international funding agencies, engage in resettlement. This can be preventative, or occur after a disaster. While this reduces people's exposure to hazards, it can also lead to other problems, which can leave people more vulnerable or worse off than they were before. Resettlement needs to be understood as part of long-term sustainable development, not just as a means for disaster risk reduction.

Sustainable Development Goal 11

In September 2015, world leaders adopted the 17 Sustainable Development Goals (SDGs) as part of the 2030 Agenda for Sustainable Development. The goals, which build on and replace the Millennium Development Goals, officially came into force on 1 January 2016 and are expected to be achieved within the next 15 years. While the SDGs are not legally binding, governments are expected to take ownership and establish national frameworks for their achievement. Countries also have the primary responsibility for follow-up and review of progress based on quality, accessible and timely data collection. National reviews will feed into regional reviews, which in turn will inform a review at the global level.

UN-Habitat's City Resilience Profiling Tool (CRPT)

As the UN Agency for Human Settlements, UN-Habitat works to support local governments and their stakeholders in building urban resilience through the City Resilience Profiling Tool (CRPT). When applied, UN-Habitat's holistic approach to increasing resilience results in local governments that are better able to ensure the wellbeing of citizens, protect development gains and maintain functionality in the face of hazards. The Tool follows various stages, and UN-Habitat supports cities in maximizing the impact of CRPT implementation.

Getting started
Local governments and UN-Habitat connect to evaluate the needs, opportunities and context of the city and to assess the possibility of implementing the tool there. With its local government partners, UN-Habitat considers the stakeholders that need to be involved in implementation, including civil society organizations, national governments and the private sector, among others.

Engagement
By signing an agreement with a UN agency, the local government is better able to work with the necessary stakeholders to plan out risk and build in resilience across the city.

Diagnosis
The CRPT provides a framework for cities to collect the right data about the city, enabling them to evaluate their resilience and identify potential vulnerabilities in the urban system. Diagnosis through data covers all elements of the urban system and considers all potential hazards and stakeholders.

Resilience Actions
Understanding of the entire urban system fuels effective action. The main output of the CRPT is a unique Resilience Action Plan (RAP) for each engaged city. The RAP sets out short-, medium- and long-term strategies based on the diagnosis; actions are prioritised, assigned interdepartmentally, and integrated into existing government policies and plans. The process is iterative: once resilience actions have been implemented, local governments monitor impact through the tool, which recalibrates to identify next steps.

Taking it further
Resilience actions require the buy-in of all stakeholders and, in many cases, additional funding. With a detailed diagnostic, local governments can leverage the support of national governments, donors and other international organizations to work towards sustainable urban development.

This approach is currently being adopted in Barcelona (Spain), Asuncion (Paraguay), Maputo (Mozambique), Port Vila (Vanuatu), Bristol (United Kingdom), Lisbon (Portugal), Yakutsk (Russia), and Dakar (Senegal). The biennial publication Trends in Urban Resilience, also produced by UN-Habitat, tracks the most recent efforts to build urban resilience, the actors behind these actions, and a number of case studies.

Medellin Collaboration for Urban Resilience

The Medellin Collaboration for Urban Resilience (MCUR) was launched at the 7th session of the World Urban Forum in Medellín, Colombia in 2014. As a pioneering partnership platform, it gathers the most prominent actors committed to building resilience globally, including UNISDR, the World Bank Group, the Global Facility for Disaster Reduction and Recovery, the Inter-American Development Bank, the Rockefeller Foundation, 100 Resilient Cities, C40, ICLEI and Cities Alliance, and it is chaired by UN-Habitat.

MCUR aims to strengthen the resilience of cities and human settlements around the world by supporting local, regional and national governments. It does so by providing knowledge and research, facilitating access to local-level finance and raising global awareness of urban resilience through policy advocacy and adaptation diplomacy efforts. Its work is devoted to the main international development agendas, including the mandates set out in the Sustainable Development Goals, the New Urban Agenda, the Paris Agreement on Climate Change and the Sendai Framework for Disaster Risk Reduction.

The Medellin Collaboration conceived a platform to help local governments and other municipal professionals understand the primary utility of the vast array of tools and diagnostics designed to assess, measure, monitor and improve city-level resilience. For example, some tools are intended as rapid assessments to establish a general understanding and baseline of a city's resilience and can be self-deployed, while others are intended as a means to identify and prioritise areas for investment. The Collaboration has produced a guidebook to illustrate how cities are responding to current and future challenges by thinking strategically about design, planning, and management for building resilience. Currently, it is working in a collaborative model in six pilot cities: Accra, Bogotá, Jakarta, Maputo, Mexico City and New York City.

100 Resilient Cities and the City Resilience Index (CRI)

"Urban Resilience is the capacity of individuals, communities, institutions, businesses, and systems within a city to survive, adapt, and grow no matter what kinds of chronic stresses and acute shocks they experience." Rockefeller Foundation, 100 Resilient Cities.

A central program contributing to the achievement of SDG 11 is the Rockefeller Foundation's 100 Resilient Cities. In December 2013, The Rockefeller Foundation launched the 100 Resilient Cities initiative, which is dedicated to promoting urban resilience, defined as "the capacity of individuals, communities, institutions, businesses, and systems within a city to survive, adapt, and grow no matter what kinds of chronic stresses and acute shocks they experience". The related resilience framework is multidimensional in nature, incorporating the four core dimensions of leadership and strategy, health and well-being, economy and society and infrastructure and environment. Each dimension is defined by three individual "drivers" which reflect the actions cities can take to improve their resilience.

While the vagueness of the term "resilience" has enabled innovative multi-disciplinary collaboration, it has also made it difficult to operationalize or to develop generalizable metrics. To overcome this challenge, the professional services firm Arup has helped the Rockefeller Foundation develop the City Resilience Index based on extensive stakeholder consultation across a range of cities globally. The index is intended to serve as a planning and decision-making tool to help guide urban investments toward results that facilitate sustainable urban growth and the well-being of citizens. The hope is that city officials will utilize the tool to identify areas of improvement, systemic weaknesses and opportunities for mitigating risk. Its generalizable format also allows cities to learn from each other.

The index is a holistic articulation of urban resilience premised on the finding that there are 12 universal factors or drivers that contribute to city resilience. What varies is their relative importance. The factors are organized into the four core dimensions of the urban resilience framework:

Leadership and strategy

  • Effective leadership and management
  • Empowered stakeholders
  • Integrated development planning

Health and well-being

  • Minimal human vulnerability
  • Diverse livelihoods and employment
  • Effective safeguards to human health and life

Economy and society

  • Sustainable economy
  • Comprehensive security and rule of law
  • Collective identity and community support

Infrastructure and environment

  • Reduced exposure and fragility
  • Effective provision of critical services
  • Reliable mobility and communications

A total of 100 cities across six continents have signed up for the Rockefeller Foundation's urban resilience challenge. All 100 cities have developed individual City Resilience Strategies with technical support from a Chief Resilience Officer (CRO). The CRO ideally reports directly to the city's chief executive and helps coordinate all the resilience efforts in a single city.

Medellin in Colombia qualified for the urban resilience challenge in 2013. In 2016, it won the Lee Kuan Yew World City Prize.

Digital technology, open data and governance for urban resilience

A core factor enabling progress on all other dimensions of urban resilience is urban governance. Sustainable, resilient and inclusive cities are often the outcome of good governance that encompasses effective leadership, inclusive citizen participation and efficient financing, among other things. To this end, public officials increasingly have access to public data, enabling evidence-based decision making. Open data is also increasingly transforming the way local governments share information with citizens, deliver services and monitor performance. It simultaneously enables increased public access to information and more direct citizen involvement in decision-making.

As part of their resilience strategies, city governments are increasingly relying on digital technology as part of a city's infrastructure and service delivery systems. On the one hand, reliance on technologies and electronic service delivery has made cities more vulnerable to hacking and cyberattacks. At the same time, information technologies have often had a positive transformative impact by supporting innovation and promoting efficiencies in urban infrastructure, thus leading to lower-cost city services. The deployment of new technologies in the initial construction of infrastructure have in some cases even allowed urban economies to leapfrog stages of development. An unintended outcome of the growing digitalization of cities is the emergence of a digital divide, which can exacerbate inequality between well-connected affluent neighborhoods and business districts, on the one hand, and under-serviced and under-connected low-income neighborhoods, on the other. In response, a number of cities have introduced digital inclusion programs to ensure that all citizens have the necessary tools to thrive in an increasingly digitalized world.

Climate change and urban resilience

The urban impacts of climate change vary widely across geographical and developmental scales. A recent study of 616 cities (home to 1.7 billion people, with a combined GDP of US$35 trillion, half of the world's total economic output) found that floods endanger more city residents than any other natural peril, followed by earthquakes and storms. Below, the challenges of heat waves, droughts and flooding are defined and discussed, and resilience-boosting strategies are introduced and outlined.

Heat waves and droughts

Heat waves are becoming increasingly prevalent as the global climate changes. The 1980 United States heat wave and drought killed 10,000 people. In 1988 a similar heat wave and drought killed 17,000 American citizens. In August 2003 the UK saw record-breaking summer temperatures with average temperatures persistently rising above 32 °C. Nearly 3,000 deaths were attributed to the heat wave in the UK during this period, with an increase of 42% in London alone. This heat wave claimed more than 40,000 lives across Europe. Research indicates that by 2040 over 50% of summers will be warmer than 2003 and by 2100 those same summer temperatures will be considered cool. The 2010 northern hemisphere summer heat wave was also disastrous, with nearly 5,000 deaths occurring in Moscow. In addition to deaths, these heat waves also cause other significant problems. Extended periods of heat and droughts also cause widespread crop losses, spikes in electricity demand, forest fires, air pollution and reduced biodiversity in vital land and marine ecosystems. Agricultural losses from heat and drought might not occur directly within the urban area, but they certainly affect the lives of urban dwellers. Crop supply shortages can lead to spikes in food prices, food scarcity, civic unrest and even starvation in extreme cases. In terms of the direct fatalities from these heat waves and droughts, they are statistically concentrated in urban areas, and this is not just in line with increased population densities, but is due to social factors and the urban heat island effect.

Urban heat islands

Urban heat island (UHI) refers to the presence of an inner-city microclimate in which temperatures are comparatively higher than in the rural surroundings. Recent studies have shown that summer daytime temperatures can be up to 10 °C higher in a city centre than in rural areas, and 5–6 °C warmer at night. The causes of UHI are no mystery, and are mostly based on simple energy balances and geometry. The materials commonly found in urban areas (concrete and asphalt) absorb and store heat energy much more effectively than the surrounding natural environment. The black colouring of asphalt surfaces (roads, parking lots and highways) absorbs significantly more electromagnetic radiation, further encouraging the rapid and effective capture and storage of heat throughout the day. Geometry comes into play as well, as tall buildings provide large surfaces that both absorb and reflect sunlight and its heat energy onto other absorbent surfaces. These tall buildings also block the wind, which limits convective cooling. The sheer size of the buildings also blocks surface heat from naturally radiating back into the cool sky at night. These factors, combined with the heat generated from vehicles, air conditioners and industry, ensure that cities create, absorb and hold heat very effectively.

Social factors for heat vulnerability

The physical causes of heat waves and droughts and the exacerbation of the UHI effect are only part of the equation in terms of fatalities; social factors play a role as well. Statistically, senior citizens represent the majority of heat (and cold) related deaths within urban areas and this is often due to social isolation. In rural areas, seniors are more likely to live with family or in care homes, whereas in cities they are often concentrated in subsidised apartment buildings and in many cases have little to no contact with the outside world. Like other urban dwellers with little or no income, most urban seniors are unlikely to own an air conditioner. This combination of factors leads to thousands of tragic deaths every season, and incidences are increasing each year.

Adapting for heat and drought resilience

Greening, reflecting and whitening urban spaces

Greening urban spaces is among the most frequently mentioned strategies to address heat effects. The idea is to increase the amount of natural cover within the city. This cover can be made up of grasses, bushes, trees, vines, water, rock gardens; any natural material. Covering as much surface as possible with green space reduces the total quantity of thermally absorbent artificial material, while the shading effect reduces the amount of light and heat that reaches the concrete and asphalt that cannot be replaced by greenery. Trees are among the most effective greening tools within urban environments because of their coverage/footprint ratio. Trees require a very small physical area for planting, but when mature they provide a much larger coverage area. This absorbs solar energy for photosynthesis (improving air quality and mitigating global warming), reducing the amount of energy trapped and held within artificial surfaces, and also casts much-needed shade on the city and its inhabitants. Shade itself does not lower the ambient air temperature, but it greatly reduces the perceived temperature and the comfort of those seeking its refuge. A popular method of reducing UHI is simply increasing the albedo (light reflectiveness) of urban surfaces that cannot be ‘greened’. This is done by using reflective paints or materials where appropriate, or white and light-coloured options where reflections would be distracting or dangerous. Glazing can also be added to windows to reduce the amount of heat entering buildings. Green roofs are also a resilience-boosting option, and have synergies with flood resilience strategies as well. However, depaving of excess pavement has been found to be a more effective and cost-efficient approach to greening and flood control.
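
The effect of raising albedo can be sketched with some back-of-the-envelope arithmetic in Python; the irradiance and albedo figures below are rough, assumed values, and real surface temperatures also depend on convection, evaporation and conduction.

  SOLAR = 700.0   # assumed midday solar irradiance reaching the surface, W per m^2

  def absorbed_flux(albedo):
      # fraction of incoming sunlight absorbed rather than reflected
      return (1.0 - albedo) * SOLAR

  for surface, albedo in [("fresh asphalt", 0.05),
                          ("aged concrete", 0.25),
                          ("white reflective coating", 0.60)]:
      print(f"{surface}: ~{absorbed_flux(albedo):.0f} W/m^2 absorbed")

  # Repainting dark asphalt with a high-albedo coating cuts the absorbed solar
  # load by more than half in this rough estimate.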

Social strategies

There are various strategies to increase the resilience of those most vulnerable to urban heat waves. As established, these vulnerable citizens are primarily socially isolated seniors. Other vulnerable groups include young children (especially those facing abject poverty or living in informal housing), people with underlying health problems, the infirm or disabled and the homeless. Accurate and early prediction of heat waves is of fundamental importance, as it gives time for the government to issue extreme heat alerts. Urban areas must prepare and be ready to implement heat-wave emergency response initiatives. Seasonal campaigns aimed at educating the public on the risks associated with heat waves will help prepare the broad community, but in response to impending heat events more direct action is required. Local government must quickly communicate with the groups and institutions that work with heat-vulnerable populations. Cooling centres should be opened in libraries, community centres and government buildings. These centres ensure free access to air conditioning and water. In partnership with government and non-government social services, paramedics, police, firefighters, nurses and volunteers, the above-mentioned groups working with vulnerable populations should carry out regular door-to-door visits during these extreme heat scenarios. These visits should provide risk assessment, advice, bottled water (for areas without potable tap water) and the offer of free transportation to local cooling centres.

Food and water supplies

Heat waves and droughts can wreak massive damage on agricultural areas vital to providing food staples to urban populations. Reservoirs and aquifers quickly dry up due to increased demand on water for drinking, industrial and agricultural purposes. The result can be shortages and price spikes for food and, with increasing frequency, shortages of drinking water, as observed with growing severity seasonally in China and throughout most of the developing world. From an agricultural standpoint, farmers may be required to plant more heat- and drought-resistant crops. Agricultural practices can also be streamlined to higher levels of hydrological efficiency. Reservoirs should be expanded, and new reservoirs and water towers should be constructed in areas facing critical shortages. Grander schemes of damming and redirecting rivers should also be considered where possible. For saltwater coastal cities, desalination plants provide a possible solution to water shortages. Infrastructure also plays a role in resilience, as in many areas aging pipelines result in leakage and possible contamination of drinking water. In Kenya’s major cities, Nairobi and Mombasa, between 40 and 50% of drinking water is lost through leakage. In these types of cases, replacements and repairs are clearly needed.

Flooding

Flooding, whether from weather events, rising sea levels or infrastructure failures, is a major cause of death, disease and economic losses throughout the world. Climate change and rapidly expanding urban settlements are two factors that are leading to the increasing occurrence and severity of urban flood events, especially in the developing world. Storm surges can affect coastal cities and are caused by low pressure weather systems, like cyclones and hurricanes. Flash floods and river floods can affect any city within a floodplain or with inadequate drainage infrastructure. These can be caused by large quantities of rain or heavy rapid snow melt. With all forms of flooding, cities are increasingly vulnerable because of the large quantity of paved and concrete surfaces. These impermeable surfaces cause massive amounts of runoff and can quickly overwhelm the limited infrastructure of storm drains, flood canals and intentional floodplains. Many cities in the developing world simply have no infrastructure to redirect floodwaters whatsoever. Around the world, floods kill thousands of people every year and are responsible for billions of dollars in damages and economic losses. Flooding, much like heat waves and droughts, can also wreak havoc on agricultural areas, quickly destroying large amounts of crops. In cities with poor or absent drainage infrastructure, flooding can also lead to the contamination of drinking water sources (aquifers, wells, inland waterways) with salt water, chemical pollution, and most frequently, viral and bacterial contaminants.

Flood flow in urban environment

Flood flow in urbanised areas constitutes a hazard to both the population and infrastructure. Recent catastrophes include the inundations of Nîmes (France) in 1988 and Vaison-la-Romaine (France) in 1992, the flooding of New Orleans (USA) in 2005, and the flooding of Rockhampton, Bundaberg and Brisbane during the 2010–2011 summer in Queensland (Australia). Despite many centuries of flood events, flood flows in urban environments have only recently been studied systematically. Some researchers have highlighted the storage effect of urban areas, and several studies have examined flow patterns and redistribution in streets during storm events and their implications for flood modelling.

Some research has considered criteria for the safe evacuation of individuals in flooded areas, but field measurements during the 2010–2011 Queensland floods showed that any criterion based solely on flow velocity, water depth or specific momentum cannot account for the hazards caused by fluctuations in velocity and water depth. Such criteria also ignore the risks associated with large debris entrained by the flow.
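
As an illustration only (not the method of the studies cited above), the depth–velocity product is one commonly used single-number flood-hazard indicator. The sketch below evaluates it for hypothetical readings against an assumed threshold; the field results mentioned above suggest exactly this kind of single criterion is insufficient, since it ignores fluctuations and debris.

```python
# Illustrative sketch only: a simple depth-velocity flood-hazard indicator.
# The threshold (0.6 m^2/s) and the sample readings are assumptions for demonstration;
# as noted in the text, such a single criterion cannot capture depth/velocity
# fluctuations or debris hazards.

def depth_velocity_hazard(depth_m: float, velocity_ms: float, threshold: float = 0.6) -> bool:
    """Return True if the depth-velocity product exceeds the assumed hazard threshold."""
    return depth_m * velocity_ms > threshold

# Hypothetical instantaneous readings (depth in metres, velocity in m/s)
readings = [(0.3, 1.0), (0.5, 2.0), (0.8, 0.5)]

for depth, velocity in readings:
    hazardous = depth_velocity_hazard(depth, velocity)
    print(f"depth={depth} m, velocity={velocity} m/s -> hazardous: {hazardous}")
```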

Adapting for flood resilience

Urban greening

Replacing as many impermeable surfaces as possible with green space creates more areas for natural, ground- and plant-based absorption of excess water. Green roofs of varying intensity are gaining popularity: they range from thin layers of soil or rockwool supporting low- or no-maintenance mosses or sedum species to large, deep, intensive roof gardens capable of supporting large plants and trees but requiring regular maintenance and more structural support. The deeper the soil, the more rainwater it can absorb and therefore the more potential floodwater it can keep from reaching the ground (see the sketch below). One of the best strategies, where possible, is simply to create enough space for the excess water by planning or expanding parkland in or adjacent to the zone where flooding is most likely to occur; excess water is diverted into these areas when necessary, as in Cardiff around the new Millennium Stadium. Floodplain clearance is another greening strategy: it removes structures and pavement built on floodplains and returns the land to its natural habitat, which can absorb massive quantities of water that would otherwise have flooded the built urban area.
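
As a rough, hedged illustration of the point that deeper substrates retain more rainwater, the sketch below estimates the retention capacity of a hypothetical green roof from an assumed area, substrate depth and water-holding fraction. All figures are illustrative, not engineering values.

```python
# Illustrative sketch: rough retention capacity of a green roof.
# All parameter values are assumptions for demonstration, not design figures.

def green_roof_retention_litres(area_m2: float, substrate_depth_m: float,
                                water_holding_fraction: float = 0.35) -> float:
    """Estimate how many litres of rainwater the substrate can hold.

    water_holding_fraction is the assumed share of substrate volume
    that can store water before runoff begins.
    """
    volume_m3 = area_m2 * substrate_depth_m * water_holding_fraction
    return volume_m3 * 1000.0  # 1 m^3 = 1000 litres

# Compare a thin extensive roof with a deeper intensive roof garden (hypothetical)
for depth in (0.05, 0.30):
    litres = green_roof_retention_litres(area_m2=100.0, substrate_depth_m=depth)
    print(f"substrate depth {depth} m over 100 m^2 -> roughly {litres:.0f} L retained")
```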

Flood-water control

Levees and other flood barriers are indispensable for cities on floodplains or along rivers and coasts. In areas with less financial and engineering capital, there are cheaper and simpler options for flood barriers. UK engineers are currently conducting field tests of a new technology called SELOC (Self-Erecting Low-Cost Barrier). The barrier lies flat on the ground and floats up as the water rises, its top edge rising with the water level, while a restraint holds it in the vertical position. This simple, inexpensive flood barrier has great potential for increasing urban resilience to flood events, and its low cost and simple, foolproof design show particular promise for developing nations. The creation or expansion of flood canals and drainage basins can help direct excess water away from critical areas, and the use of porous paving materials on city streets and car parks allows excess water to be absorbed and filtered.

During the January 2011 flood of the Brisbane River (Australia), unique field measurements taken around the flood peak showed very substantial sediment fluxes in the Brisbane River floodplain, consistent with the murky appearance of the floodwaters. A field deployment in an inundated street of the central business district also revealed some unusual features of flood flow in an urban environment linked to local topographic effects.

Structural resilience

In most developed nations, all new developments are assessed for flood risk. The aim is to ensure that flood risk is taken into account at every stage of the planning process so that inappropriate development in high-risk areas is avoided. When development is required in high-risk areas, structures should be built to flood-resistant standards, and living or working areas should be raised well above worst-case flood levels. For existing structures in high-risk areas, funding should be allocated to, for example, raise electrical wiring and sockets so that any water entering the home cannot reach the electrics. Other solutions are to raise these structures to appropriate heights, to make them floating, or to relocate or rebuild them on higher ground. A house in Mexico Beach, Florida, that survived Hurricane Michael is an example of a house built to withstand tidal surge.
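
As a hedged, illustrative sketch only (not a building-code calculation), the example below sets a minimum finished-floor elevation by adding an assumed freeboard margin to a worst-case design flood level, in the spirit of raising living or working areas well above expected flood heights. The flood level, ground level and freeboard are all hypothetical.

```python
# Illustrative sketch: minimum finished-floor elevation for a flood-prone site.
# The design flood level, ground level and freeboard margin are assumed values,
# not requirements from any actual code or standard.

def minimum_floor_elevation(design_flood_level_m: float, freeboard_m: float = 0.5) -> float:
    """Return the lowest acceptable floor elevation above datum (assumed rule of thumb)."""
    return design_flood_level_m + freeboard_m

site_ground_level_m = 2.1    # hypothetical ground elevation above datum
design_flood_level_m = 2.8   # hypothetical worst-case flood level above datum

floor_level = minimum_floor_elevation(design_flood_level_m)
raise_height = max(0.0, floor_level - site_ground_level_m)
print(f"Raise living areas to {floor_level:.1f} m above datum "
      f"(about {raise_height:.1f} m above existing ground).")
```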

The pre-Incan Uru people of Lake Titicaca in Peru have lived on floating islands made of reeds for hundreds of years. The practice began as an innovative form of protection from competition for land among various groups, and it continues to support the Uru homeland. Homes are built by hand on hand-made islands, all from simple reeds of the totora plant. Similarly, in the southern wetlands of Iraq, the Marsh Arabs (Arab al-Ahwār) have lived for centuries on floating islands and in arched buildings constructed entirely from the local qasab reeds. Without any nails, wood or glass, the buildings are assembled by hand, sometimes within a day. Another feature of these villages, called Al Tahla, is that the homes can also be disassembled in a day, transported and reassembled.

Emergency response

As with all disasters, flooding requires a specific set of disaster-response plans. Various levels of contingency planning should be established, from basic medical and selective evacuation provisions involving local emergency responders up to full military disaster-relief plans involving air evacuations, search-and-rescue teams and relocation provisions for entire urban populations. Clear lines of responsibility and chains of command must be laid out, and tiered priority-response levels should be established to address the immediate needs of the most vulnerable citizens first. Sufficient emergency funding should be set aside proactively for post-flood repair and reconstruction.

Educational programs related to urban resilience

The emergence of urban resilience as an educational topic has seen unprecedented growth, driven in large part by a series of natural disasters including the 2004 Indian Ocean earthquake and tsunami, Hurricane Katrina in 2005, the 2011 Tohoku earthquake and tsunami, and Hurricane Sandy in 2012. Two of the better-known programs are the Harvard Graduate School of Design's master's program in Risk and Resilience and Tulane University's Disaster Resilience Leadership Academy. Several workshops related to the U.S. Federal Emergency Management Agency and the Department of Homeland Security are also available. A list of more than 50 current graduate and undergraduate programs focusing on urban resilience has been compiled by The Resilience Shift.
