
Saturday, October 12, 2019

Fire ecology

From Wikipedia, the free encyclopedia
 
The Old Fire burning in the San Bernardino Mountains (image taken from the International Space Station)
 
Fire ecology is a scientific discipline concerned with the natural processes involving fire in an ecosystem, the ecological effects of fire, the interactions between fire and the abiotic and biotic components of an ecosystem, and the role of fire as an ecosystem process. Many ecosystems, particularly prairie, savanna, chaparral and coniferous forests, have evolved with fire as an essential contributor to habitat vitality and renewal. Many plant species in fire-affected environments require fire to germinate, establish, or reproduce. Wildfire suppression not only eliminates these species, but also the animals that depend upon them.

Campaigns in the United States have historically molded public opinion to believe that wildfires are always harmful to nature. This view is based on the outdated belief that ecosystems progress toward an equilibrium and that any disturbance, such as fire, disrupts the harmony of nature. More recent ecological research has shown, however, that fire is an integral component in the function and biodiversity of many natural habitats, and that the organisms within these communities have adapted to withstand, and even to exploit, natural wildfire. More generally, fire is now regarded as a 'natural disturbance', similar to flooding, wind-storms, and landslides, that has driven the evolution of species and controls the characteristics of ecosystems.

Fire suppression, in combination with other human-caused environmental changes, may have resulted in unforeseen consequences for natural ecosystems. Some large wildfires in the United States have been blamed on years of fire suppression and the continuing expansion of people into fire-adapted ecosystems, but climate change is more likely responsible. Land managers are faced with tough questions regarding how to restore a natural fire regime, but allowing wildfires to burn is the least expensive and likely most effective method.

Panoramic photo series of succession in Florida pine woodland
A combination of photos taken at a photo point at Florida Panther NWR. The photos are panoramic and cover a 360-degree view from a monitoring point, ranging from pre-burn to two years post-burn.

Fire components

A fire regime describes the characteristics of fire and how it interacts with a particular ecosystem. "Severity" is the term ecologists use for the impact a fire has on an ecosystem; it can be defined in many ways, but one common way is through an estimate of plant mortality. Fire can burn at three levels: ground fires burn through soil that is rich in organic matter; surface fires burn through dead plant material lying on the ground; crown fires burn in the tops of shrubs and trees. Ecosystems generally experience a mix of all three.

Fires often break out during a dry season, but in some areas wildfires also commonly occur during the time of year when lightning is most prevalent. Fire frequency, how often fire occurs at a particular location over a span of years, is a measure of how common wildfires are in a given ecosystem. It is defined either as the average interval between fires at a given site, or as the average interval between fires in an equivalent specified area.
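The site-based definition can be made concrete with a little arithmetic: the mean fire return interval is simply the average gap between successive fire years. A minimal sketch, using an entirely hypothetical fire-scar record:

```python
# Hypothetical fire-scar record for a single site; the dates are
# illustrative only, not real data.
fire_years = [1885, 1902, 1915, 1931, 1950]

# Intervals between successive fires, and their mean
# (the mean fire return interval for this site).
intervals = [later - earlier for earlier, later in zip(fire_years, fire_years[1:])]
mean_interval = sum(intervals) / len(intervals)  # (17 + 13 + 16 + 19) / 4 = 16.25 years
```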

Defined as the energy released per unit length of fireline (kW m−1), wildfire intensity can be estimated either as
  • the product of
    • the linear spread rate (m s−1),
    • the low heat of combustion (kJ kg−1),
    • and the combusted fuel mass per unit area,
  • or it can be estimated from the flame length.
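The first estimate above is the classic product formulation (often attributed to Byram): intensity equals heat yield times fuel consumed per unit area times spread rate. A minimal sketch in Python, with purely illustrative numbers rather than field measurements:

```python
def fireline_intensity(heat_kj_per_kg: float,
                       fuel_kg_per_m2: float,
                       spread_m_per_s: float) -> float:
    """Fireline intensity: energy released per unit length of fireline.
    Units: kJ/kg * kg/m^2 * m/s = kJ/(m*s) = kW/m."""
    return heat_kj_per_kg * fuel_kg_per_m2 * spread_m_per_s

# Example: fuel with a low heat of combustion of 18,000 kJ/kg,
# 0.5 kg/m^2 of fuel consumed, spreading at 0.05 m/s.
intensity = fireline_intensity(18_000, 0.5, 0.05)  # 450 kW/m
```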
Radiata pine plantation burnt during the 2003 Eastern Victorian alpine bushfires, Australia

Abiotic responses

Fires can affect soils through heating and combustion processes. Depending on the temperatures reached, different effects occur: from evaporation of water at the lower temperature ranges, to combustion of soil organic matter and the formation of pyrogenic organic matter, otherwise known as charcoal.

Fires can cause changes in soil nutrients through a variety of mechanisms, including oxidation, volatilization, erosion, and leaching by water, but the fire must usually reach high temperatures for a significant loss of nutrients to occur. However, the quantity of nutrients available in the soil usually increases because of the ash that is generated, and these nutrients become available quickly, as opposed to the slow release of nutrients by decomposition. Rock spalling (or thermal exfoliation) accelerates the weathering of rock and potentially the release of some nutrients.

An increase in soil pH following a fire is commonly observed, most likely due to the formation of calcium carbonate and its subsequent decomposition to calcium oxide when temperatures rise even higher. It could also be due to the increased cation content of the ash, which temporarily raises soil pH. Microbial activity in the soil might also increase because of soil heating and the increased nutrient content, though studies have also found complete loss of microbes in the top layer of soil after a fire. Overall, soils become more basic (higher pH) following fires because of acid combustion. By driving novel chemical reactions at high temperatures, fire can even alter the texture and structure of soils by affecting their clay content and porosity.

Removal of vegetation following a fire can have several effects on the soil, such as raising soil temperatures during the day through increased solar radiation on the soil surface, and greater cooling at night through loss of radiative heat. Fewer leaves to intercept rain also means more rain reaches the soil surface, and with fewer plants to absorb the water, soil water content might increase. However, ash can be water-repellent when dry, so water content and availability might not actually increase.

Biotic responses and adaptations

Plants

Lodgepole pine cones
 
Plants have evolved many adaptations to cope with fire. Of these adaptations, one of the best-known is likely pyriscence, in which maturation and release of seeds is triggered, in whole or in part, by fire or smoke; this behaviour is often erroneously called serotiny, although that term truly denotes the much broader category of seed release activated by any stimulus. All pyriscent plants are serotinous, but not all serotinous plants are pyriscent (some are necriscent, hygriscent, xeriscent, soliscent, or some combination thereof). On the other hand, seed germination activated by an environmental trigger is not to be confused with pyriscence; it is known as physiological dormancy.

In chaparral communities in Southern California, for example, some plants have leaves coated in flammable oils that encourage an intense fire. This heat causes their fire-activated seeds to germinate (an example of dormancy) and the young plants can then capitalize on the lack of competition in a burnt landscape. Other plants have smoke-activated seeds, or fire-activated buds. The cones of the Lodgepole pine (Pinus contorta) are, conversely, pyriscent: they are sealed with a resin that a fire melts away, releasing the seeds. Many plant species, including the shade-intolerant giant sequoia (Sequoiadendron giganteum), require fire to make gaps in the vegetation canopy that will let in light, allowing their seedlings to compete with the more shade-tolerant seedlings of other species, and so establish themselves. Because their stationary nature precludes any fire avoidance, plant species may only be fire-intolerant, fire-tolerant or fire-resistant.

Fire intolerance

Fire-intolerant plant species tend to be highly flammable and are destroyed completely by fire. Some of these plants and their seeds may simply fade from the community after a fire and not return; others have adapted to ensure that their offspring survive into the next generation. "Obligate seeders" are plants with large, fire-activated seed banks that germinate, grow, and mature rapidly following a fire, in order to reproduce and renew the seed bank before the next fire. Seeds may contain the receptor protein KAI2, which is activated by karrikins, growth regulators released by the fire.

Typical regrowth after an Australian bushfire

Fire tolerance

Fire-tolerant species are able to withstand a degree of burning and continue growing despite damage from fire. These plants are sometimes referred to as "resprouters." Ecologists have shown that some species of resprouters store extra energy in their roots to aid recovery and re-growth following a fire. For example, after an Australian bushfire, the Mountain Grey Gum tree (Eucalyptus cypellocarpa) starts producing a mass of shoots of leaves from the base of the tree all the way up the trunk towards the top, making it look like a black stick completely covered with young, green leaves.

Fire resistance

Fire-resistant plants suffer little damage during a characteristic fire regime. These include large trees whose flammable parts are high above surface fires. Mature ponderosa pine (Pinus ponderosa) is an example of a tree species that suffers virtually no crown damage under a naturally mild fire regime, because it sheds its lower, vulnerable branches as it matures.

Animals, birds and microbes

A mixed flock of hawks hunting in and around a bushfire
 
Like plants, animals display a range of abilities to cope with fire, but they differ from most plants in that they must avoid the actual fire to survive. Although birds are vulnerable when nesting, they are generally able to escape a fire; indeed they often profit from being able to take prey fleeing from a fire and to recolonize burned areas quickly afterwards. Some anthropological and ethno-ornithological evidence suggests that certain species of fire-foraging raptors may engage in intentional fire propagation to flush out prey. Mammals are often capable of fleeing a fire, or seeking cover if they can burrow. Amphibians and reptiles may avoid flames by burrowing into the ground or using the burrows of other animals. Amphibians in particular are able to take refuge in water or very wet mud.

Some arthropods also take shelter during a fire, although the heat and smoke may actually attract some of them, to their peril. Microbial organisms in the soil vary in their heat tolerance, but are more likely to survive a fire the deeper they are in the soil. Low fire intensity, quick passage of the flames and dry soil also help. An increase in available nutrients after the fire has passed may result in larger microbial communities than before the fire. The generally greater heat tolerance of bacteria relative to fungi means that soil microbial diversity can change following a fire, depending on the severity of the fire, the depth of the microbes in the soil, and the presence of plant cover. Certain species of fungi, such as Cylindrocarpon destructans, appear to be unaffected by combustion contaminants that can inhibit re-population of burnt soil by other microorganisms, and therefore have a higher chance of surviving fire disturbance and then recolonizing and out-competing other fungal species afterwards.

Fire and ecological succession

Fire behavior differs in every ecosystem, and the organisms in those ecosystems have adapted accordingly. One sweeping generality is that in all ecosystems, fire creates a mosaic of different habitat patches, with areas ranging from those just burned to those untouched by fire for many years. This is a form of ecological succession, in which a freshly burned site progresses through continuous and directional phases of colonization following the destruction caused by the fire. Ecologists usually characterize succession through the changes in vegetation that successively arise. After a fire, the first species to re-colonize will be those whose seeds are already present in the soil, or whose seeds can travel into the burned area quickly. These are generally fast-growing herbaceous plants that require light and are intolerant of shading. As time passes, more slowly growing, shade-tolerant woody species will suppress some of the herbaceous plants. Conifers are often early successional species, while broadleaf trees frequently replace them in the absence of fire. Hence, many conifer forests are themselves dependent upon recurring fire.

Different species of plants, animals, and microbes specialize in exploiting different stages in this process of succession, and by creating these different types of patches, fire allows a greater number of species to exist within a landscape. Soil characteristics will be a factor in determining the specific nature of a fire-adapted ecosystem, as will climate and topography.

Some examples of fire in different ecosystems

Forests

Mild to moderate fires burn in the forest understory, removing small trees and herbaceous groundcover. High-severity fires will burn into the crowns of the trees and kill most of the dominant vegetation. Crown fires may require support from ground fuels to maintain the fire in the forest canopy (passive crown fires), or the fire may burn in the canopy independently of any ground fuel support (an active crown fire). High-severity fire creates complex early seral forest habitat, or snag forest with high levels of biodiversity. When a forest burns frequently and thus has less plant litter build-up, below-ground soil temperatures rise only slightly and will not be lethal to roots that lie deep in the soil. Although other characteristics of a forest will influence the impact of fire upon it, factors such as climate and topography play an important role in determining fire severity and fire extent. Fires spread most widely during drought years, are most severe on upper slopes and are influenced by the type of vegetation that is growing.

Forests in British Columbia

In Canada, forests cover about 10% of the land area and yet harbor 70% of the country’s bird and terrestrial mammal species. Natural fire regimes are important in maintaining a diverse assemblage of vertebrate species in up to twelve different forest types in British Columbia. Different species have adapted to exploit the different stages of succession, regrowth and habitat change that occurs following an episode of burning, such as downed trees and debris. The characteristics of the initial fire, such as its size and intensity, cause the habitat to evolve differentially afterwards and influence how vertebrate species are able to use the burned areas.

Shrublands

Lightning-sparked wildfires are frequent occurrences on shrublands and grasslands in Nevada.
 
Shrub fires typically concentrate in the canopy and spread continuously if the shrubs are close enough together. Shrublands are typically dry and are prone to accumulations of highly volatile fuels, especially on hillsides. Fires will follow the path of least moisture and the greatest amount of dead fuel material. Surface and below-ground soil temperatures during a burn are generally higher than those of forest fires because the centers of combustion lie closer to the ground, although this can vary greatly. Common plants in shrubland or chaparral include manzanita, chamise and Coyote Brush.

California shrublands

California shrubland, commonly known as chaparral, is a widespread plant community of low growing species, typically on arid sloping areas of the California Coast Ranges or western foothills of the Sierra Nevada. There are a number of common shrubs and tree shrub forms in this association, including salal, toyon, coffeeberry and Western poison oak. Regeneration following a fire is usually a major factor in the association of these species.

South African Fynbos shrublands

Fynbos shrublands occur in a small belt across South Africa. The plant species in this ecosystem are highly diverse, yet the majority of these species are obligate seeders, that is, a fire will cause germination of the seeds and the plants will begin a new life-cycle because of it. These plants may have coevolved into obligate seeders as a response to fire and nutrient-poor soils. Because fire is common in this ecosystem and the soil has limited nutrients, it is most efficient for plants to produce many seeds and then die in the next fire. Investing a lot of energy in roots to survive the next fire when those roots will be able to extract little extra benefit from the nutrient-poor soil would be less efficient. It is possible that the rapid generation time that these obligate seeders display has led to more rapid evolution and speciation in this ecosystem, resulting in its highly diverse plant community.

Grasslands

Grasslands burn more readily than forest and shrub ecosystems, with the fire moving through the stems and leaves of herbaceous plants and only lightly heating the underlying soil, even in cases of high intensity. In most grassland ecosystems, fire is the primary mode of decomposition, making it crucial in the recycling of nutrients. In some grassland systems, fire only became the primary mode of decomposition after the disappearance of large migratory herds of browsing or grazing megafauna driven by predator pressure. In the absence of functional communities of large migratory herds of herbivorous megafauna and attendant predators, overuse of fire to maintain grassland ecosystems may lead to excessive oxidation, loss of carbon, and desertification in susceptible climates. Some grassland ecosystems respond poorly to fire.

North American grasslands

In North America, fire-adapted invasive grasses such as Bromus tectorum contribute to increased fire frequency, which exerts selective pressure against native species. This is a concern for grasslands in the Western United States.

In less arid grasslands, presettlement fires worked in concert with grazing to create a healthy grassland ecosystem, as indicated by the accumulation of soil organic matter, which is significantly altered by fire. The tallgrass prairie ecosystem in the Flint Hills of eastern Kansas and Oklahoma is responding positively to the current use of fire in combination with grazing.

South African savanna

In the savanna of South Africa, recently burned areas have new growth that provides palatable and nutritious forage compared to older, tougher grasses. This new forage attracts large herbivores from areas of unburned and grazed grassland that has been kept short by constant grazing. On these unburned "lawns", only those plant species adapted to heavy grazing are able to persist; but the distraction provided by the newly burned areas allows grazing-intolerant grasses to grow back into the lawns that have been temporarily abandoned, so allowing these species to persist within that ecosystem.

Longleaf pine savannas

Yellow pitcher plant is dependent upon recurring fire in coastal plain savannas and flatwoods.
 
Much of the southeastern United States was once open longleaf pine forest with a rich understory of grasses, sedges, carnivorous plants and orchids. These ecosystems had the highest fire frequency of any habitat, with fires recurring once per decade or more often. Without fire, deciduous forest trees invade, and their shade eliminates both the pines and the understory. Some of the typical plants associated with fire include the yellow pitcher plant and rose pogonia. The abundance and diversity of such plants is closely related to fire frequency. Rare animals such as gopher tortoises and indigo snakes also depend upon these open grasslands and flatwoods. Hence, the restoration of fire is a priority to maintain species composition and biological diversity.

Fire in wetlands

Although it may seem strange, many kinds of wetlands are also influenced by fire. This usually occurs during periods of drought. In landscapes with peat soils, such as bogs, the peat substrate itself may burn, leaving holes that refill with water as new ponds. Fires that are less intense will remove accumulated litter and allow other wetland plants to regenerate from buried seeds, or from rhizomes. Wetlands that are influenced by fire include coastal marshes, wet prairies, peat bogs, floodplains, prairie marshes and flatwoods. Since wetlands can store large amounts of carbon in peat, the fire frequency of vast northern peatlands is linked to processes controlling the carbon dioxide levels of the atmosphere, and to the phenomenon of global warming. Dissolved organic carbon (DOC) is abundant in wetlands and plays a critical role in their ecology. In the Florida Everglades, a significant portion of the DOC is "dissolved charcoal", indicating that fire can play a critical role in wetland ecosystems.

Fire suppression

Fire serves many important functions within fire-adapted ecosystems. Fire plays an important role in nutrient cycling, diversity maintenance and habitat structure. The suppression of fire can lead to unforeseen changes in ecosystems that often adversely affect the plants, animals and humans that depend upon that habitat. Wildfires that deviate from a historical fire regime because of fire suppression are called "uncharacteristic fires".

Chaparral communities

In 2003, southern California witnessed powerful chaparral wildfires. Hundreds of homes and hundreds of thousands of acres of land went up in flames. Extreme fire weather (low humidity, low fuel moisture and high winds) and the accumulation of dead plant material from eight years of drought contributed to a catastrophic outcome. Although some have maintained that fire suppression contributed to an unnatural buildup of fuel loads, a detailed analysis of historical fire data has shown that this may not have been the case. Fire suppression activities had failed to exclude fire from the southern California chaparral. Research showing differences in fire size and frequency between southern California and Baja has been used to imply that the larger fires north of the border are the result of fire suppression, but this opinion has been challenged by numerous investigators and is no longer supported by the majority of fire ecologists.

One consequence of the fires in 2003 has been the increased density of invasive and non-native plant species that have quickly colonized burned areas, especially those that had already been burned in the previous 15 years. Because shrubs in these communities are adapted to a particular historical fire regime, altered fire regimes may change the selective pressures on plants and favor invasive and non-native species that are better able to exploit the novel post-fire conditions.

Fish impacts

The Boise National Forest is a US national forest located north and east of the city of Boise, Idaho. Following several uncharacteristically large wildfires, an immediately negative impact on fish populations was observed, posing particular danger to small and isolated fish populations. In the long term, however, fire appears to rejuvenate fish habitats by causing hydraulic changes that increase flooding and lead to silt removal and the deposition of a favorable habitat substrate. This leads to larger post-fire populations of the fish that are able to recolonize these improved areas. But although fire generally appears favorable for fish populations in these ecosystems, the more intense effects of uncharacteristic wildfires, in combination with the fragmentation of populations by human barriers to dispersal such as weirs and dams, will pose a threat to fish populations.

Fire as a management tool

Restoration ecology is the name given to an attempt to reverse or mitigate some of the changes that humans have caused to an ecosystem. Controlled burning is one tool that is currently receiving considerable attention as a means of restoration and management. Applying fire to an ecosystem may create habitats for species that have been negatively impacted by fire suppression, or fire may be used as a way of controlling invasive species without resorting to herbicides or pesticides. However, there is debate as to what state managers should aim to restore their ecosystems to, especially as to whether "natural" means pre-human or pre-European. Native American use of fire, not natural fires, historically maintained the diversity of the savannas of North America. When, how, and where managers should use fire as a management tool is a subject of debate.

The Great Plains shortgrass prairie

A combination of heavy livestock grazing and fire-suppression has drastically altered the structure, composition, and diversity of the shortgrass prairie ecosystem on the Great Plains, allowing woody species to dominate many areas and promoting fire-intolerant invasive species. In semi-arid ecosystems where the decomposition of woody material is slow, fire is crucial for returning nutrients to the soil and allowing the grasslands to maintain their high productivity. 

Although fire can occur during the growing or the dormant seasons, managed fire during the dormant season is most effective at increasing the grass and forb cover, biodiversity and plant nutrient uptake in shortgrass prairies. Managers must also take into account, however, how invasive and non-native species respond to fire if they want to restore the integrity of a native ecosystem. For example, fire can only control the invasive spotted knapweed (Centaurea maculosa) on the Michigan tallgrass prairie in the summer, because this is the time in the knapweed's life cycle that is most important to its reproductive growth.

Mixed conifer forests in the US Sierra Nevada

Mixed conifer forests in the United States Sierra Nevada used to have fire return intervals that ranged from 5 years up to 300 years, depending on the local climate. Lower elevations had more frequent fire return intervals, whilst higher and wetter elevations saw much longer intervals between fires. Native Americans tended to set fires during fall and winter, and land at a higher elevation was generally occupied by Native Americans only during the summer.

Finnish boreal forests

The decline of habitat area and quality has caused many species populations to be red-listed by the International Union for Conservation of Nature. According to a study on the management of Finnish boreal forests, improving the habitat quality of areas outside reserves can aid the conservation of endangered deadwood-dependent beetles. These beetles and various types of fungi need dead trees in order to survive. Old-growth forests can provide this particular habitat; however, most Fennoscandian boreal forests are used for timber and are therefore unprotected. One study examined the effect of controlled burning, combined with tree retention, on these endangered beetles in a forested area with deadwood. It found that after the first year of management the beetles increased in abundance and species richness compared with the pre-fire treatment, and that abundance continued to increase the following year in sites where tree retention was high and deadwood was abundant. The correlation between forest fire management and increased beetle populations suggests a key to conserving these red-listed species.

Australian eucalypt forests

Much of the old-growth eucalypt forest in Australia is designated for conservation. Management of these forests is important because species like Eucalyptus grandis rely on fire to survive. A lignotuber, a root swelling that contains buds from which new shoots can sprout, helps a plant re-establish after a fire, but a few eucalypt species lack one. For these species, forest fire management can help by creating rich soil, killing competitors, and allowing seeds to be released.

Management policies

United States

Fire policy in the United States involves the federal government, individual state governments, tribal governments, interest groups, and the general public. The new federal outlook on fire policy parallels advances in ecology and is moving towards the view that many ecosystems depend on disturbance for their diversity and for the proper maintenance of their natural processes. Although human safety is still the number one priority in fire management, new US government objectives include a long-term view of ecosystems. The newest policy allows managers to gauge the relative values of private property and resources in particular situations and to set their priorities accordingly.

One of the primary goals in fire management is to improve public education in order to suppress the "Smokey Bear" fire-suppression mentality and introduce the public to the benefits of regular natural fires.

Academic journal

From Wikipedia, the free encyclopedia
 
Different types of peer-reviewed research journals; these specific publications are about economics
 
An academic or scholarly journal is a periodical publication in which scholarship relating to a particular academic discipline is published. Academic journals serve as permanent and transparent forums for the presentation, scrutiny, and discussion of research. They are usually peer-reviewed or refereed. Content typically takes the form of articles presenting original research, review articles, and book reviews. The purpose of an academic journal, according to Henry Oldenburg (the first editor of Philosophical Transactions of the Royal Society), is to give researchers a venue to "impart their knowledge to one another, and contribute what they can to the Grand design of improving natural knowledge, and perfecting all Philosophical Arts, and Sciences."

The term academic journal applies to scholarly publications in all fields; this article discusses the aspects common to all academic field journals. Scientific journals and journals of the quantitative social sciences vary in form and function from journals of the humanities and qualitative social sciences; their specific aspects are separately discussed. 

The first academic journal was Journal des sçavans (January 1665), followed soon after by Philosophical Transactions of the Royal Society (March 1665), and Mémoires de l'Académie des Sciences (1666). The first fully peer-reviewed journal was Medical Essays and Observations (1733).

History

Adrien Auzout's "A TABLE of the Apertures of Object-Glasses" from a 1665 article in Philosophical Transactions, showing a table
 
The idea of a published journal with the purpose of "[letting] people know what is happening in the Republic of Letters" was first conceived by Eudes de Mazerai in 1663. A publication titled Journal littéraire général was supposed to be published to fulfill that goal, but never was. Humanist scholar Denis de Sallo (under the pseudonym "Sieur de Hédouville") and printer Jean Cusson took Mazerai's idea, and obtained a royal privilege from King Louis XIV on 8 August 1664 to establish the Journal des sçavans. The journal's first issue was published on 5 January 1665. It was aimed at people of letters, and had four main objectives:
  1. review newly published major European books,
  2. publish the obituaries of famous people,
  3. report on discoveries in arts and science, and
  4. report on the proceedings and censures of both secular and ecclesiastical courts, as well as those of Universities both in France and outside.
Soon after, the Royal Society established Philosophical Transactions of the Royal Society in March 1665, and the Académie des Sciences established the Mémoires de l'Académie des Sciences in 1666, which focused more strongly on scientific communications. By the end of the 18th century, nearly 500 such periodicals had been published, the vast majority coming from Germany (304 periodicals), France (53), and England (34). Several of those publications, however, and in particular the German journals, tended to be short-lived (under 5 years). A. J. Meadows has estimated that the proliferation of journals reached 10,000 in 1950 and 71,000 in 1987. However, Michael Mabe warns that such estimates vary depending on the definition of what exactly counts as a scholarly publication, but that the growth rate has been "remarkably consistent over time", with an average rate of 3.46% per year from 1800 to 2003.
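The cited 3.46% average annual growth rate compounds quickly; a one-line check (a sketch, not part of Mabe's analysis) shows that at a constant rate it implies the number of journals doubles roughly every twenty years:

```python
import math

# Doubling time implied by a constant 3.46% annual growth rate:
# solve (1 + rate)^t = 2 for t.
rate = 0.0346
doubling_time = math.log(2) / math.log(1 + rate)  # ~20.4 years
```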

In 1733, Medical Essays and Observations was established by the Medical Society of Edinburgh as the first fully peer-reviewed journal. Peer review was introduced as an attempt to increase the quality and pertinence of submissions. Other important events in the history of academic journals include the establishment of Nature (1869) and Science (1880), the establishment of Postmodern Culture in 1990 as the first online-only journal, the foundation of arXiv in 1991 for the dissemination of preprints to be discussed prior to publication in a journal, and the establishment of PLOS One in 2006 as the first megajournal.

Scholarly articles

There are two kinds of article or paper submissions in academia: solicited, where an individual has been invited to submit work either through direct contact or through a general submissions call, and unsolicited, where an individual submits a work for potential publication without directly being asked to do so. Upon receipt of a submitted article, editors at the journal determine whether to reject the submission outright or begin the process of peer review. In the latter case, the submission becomes subject to review by outside scholars of the editor's choosing who typically remain anonymous. The number of these peer reviewers (or "referees") varies according to each journal's editorial practice – typically, no fewer than two, though sometimes three or more, experts in the subject matter of the article produce reports on the content, style, and other factors, which inform the editors' publication decisions. Though these reports are generally confidential, some journals and publishers also practice public peer review. The editors either choose to reject the article, ask for a revision and resubmission, or accept the article for publication. Even accepted articles are often subjected to further (sometimes considerable) editing by journal editorial staff before they appear in print. Peer review can take from several weeks to several months.

Reviewing

Review articles

Review articles, also called "reviews of progress," are checks on the research published in journals. Some journals are devoted entirely to review articles, some contain a few in each issue, and others do not publish review articles. Such reviews often cover the research from the preceding year, though some cover longer or shorter periods; some are devoted to specific topics, others to general surveys. Some reviews are enumerative, listing all significant articles in a given subject; others are selective, including only what their authors deem worthwhile. Yet others are evaluative, judging the state of progress in the subject field. Some journals are published in series, each covering a complete subject field for a year, or covering specific fields across several years. Unlike original research articles, review articles tend to be solicited submissions, sometimes planned years in advance. They are typically relied upon by students beginning a study in a given field, or by those already in the field seeking current awareness.

Book reviews

Reviews of scholarly books are checks upon the research books published by scholars; unlike articles, book reviews tend to be solicited. Journals typically have a separate book review editor determining which new books to review and by whom. If an outside scholar accepts the book review editor's request for a book review, he or she generally receives a free copy of the book from the journal in exchange for a timely review. Publishers send books to book review editors in the hope that their books will be reviewed. The length and depth of research book reviews varies considerably from journal to journal, as does the extent of textbook and trade book reviewing.

Prestige and ranking

An academic journal's prestige is established over time, and can reflect many factors, some but not all of which are expressible quantitatively. In each academic discipline, there are dominant journals that receive the largest number of submissions and can therefore be selective in choosing their content. Yet the largest journals are not the only ones of excellent quality.

In the natural sciences and in the social sciences, the impact factor is an established proxy, measuring the number of later articles citing articles already published in the journal. There are other quantitative measures of prestige, such as the overall number of citations, how quickly articles are cited, and the average "half-life" of articles. Clarivate Analytics' Journal Citation Reports, which, among other features, computes an impact factor for academic journals, draws data for computation from the Science Citation Index Expanded (for natural science journals) and from the Social Sciences Citation Index (for social science journals). Several other metrics are also used, including the SCImago Journal Rank, CiteScore, Eigenfactor, and Altmetrics.
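As a concrete illustration, the standard two-year impact factor for year Y divides the citations received in Y (to items the journal published in Y-1 and Y-2) by the number of citable items the journal published in those two years. A minimal sketch, with made-up figures for a hypothetical journal:

```python
def impact_factor(citations, citable_items):
    """Two-year impact factor: citations received this year to the
    journal's items from the previous two years, divided by the
    number of citable items published in those two years."""
    return citations / citable_items

# Hypothetical journal: 240 citations in 2019 to its 2017-2018 items,
# of which there were 150.
print(impact_factor(240, 150))  # 1.6
```

In practice the numerator and denominator come from citation indexes such as those named above, and what counts as a "citable item" is itself a point of contention.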

In the Anglo-American humanities, there is no tradition (as there is in the sciences) of giving impact factors that could be used in establishing a journal's prestige. Recent moves have been made by the European Science Foundation (ESF) to change the situation, resulting in the publication of preliminary lists for the ranking of academic journals in the humanities. These rankings have been severely criticized, notably by British history and sociology of science journals, which published a joint editorial entitled "Journals under Threat." Though this did not prevent the ESF and some national organizations from proposing journal rankings, it largely prevented their use as evaluation tools.

In some disciplines such as knowledge management/intellectual capital, the lack of a well-established journal ranking system is perceived by academics as "a major obstacle on the way to tenure, promotion and achievement recognition". Conversely, a significant number of scientists and organizations consider the pursuit of impact factor calculations as inimical to the goals of science, and have signed the San Francisco Declaration on Research Assessment to limit its use. 

The categorization of journal prestige in some subjects has been attempted, typically using letters to rank their academic world importance.

Three categories of techniques have developed to assess journal quality and create journal rankings:
  • stated preference;
  • revealed preference; and
  • publication power approaches.

Costs

Many academic journals are subsidized by universities or professional organizations, and do not exist to make a profit. However, they often accept advertising, as well as page and image charges from authors, to cover production costs. On the other hand, some journals are produced by commercial publishers who do make a profit by charging subscriptions to individuals and libraries. They may also sell all of their journals in discipline-specific collections or a variety of other packages.

Journal editors tend to have other professional responsibilities, most often as teaching professors. In the case of the largest journals, there are paid staff assisting in the editing. The production of the journals is almost always done by publisher-paid staff. Humanities and social science academic journals are usually subsidized by universities or professional organizations.

New developments

The Internet has revolutionized the production of, and access to, academic journals, with their contents available online via services subscribed to by academic libraries. Individual articles are subject-indexed in databases such as Google Scholar. Some of the smallest, most specialized journals are prepared in-house by an academic department and published only online – this form of publication has sometimes taken the blog format, though some, like the open access journal Internet Archaeology, use the medium to embed searchable datasets, 3D models, and interactive mapping. Currently, there is a movement in higher education encouraging open access, either via self-archiving, whereby the author deposits a paper in a disciplinary or institutional repository where it can be searched for and read, or via publishing it in a free open access journal, which does not charge for subscriptions, being either subsidized or financed by a publication fee. Given the goal of sharing scientific research to speed advances, open access has affected science journals more than humanities journals. Commercial publishers are experimenting with open access models, but are trying to protect their subscription revenues.

The much lower entry cost of online publishing has also raised concerns about an increase in the publication of "junk" journals with lower publishing standards. These journals, often with names deliberately similar to those of well-established publications, solicit articles via e-mail and then charge the author to publish an article, often with no sign of actual review. Jeffrey Beall, a research librarian at the University of Colorado, has compiled a list of what he considers to be "potential, possible, or probable predatory scholarly open-access publishers"; the list numbered over 300 journals as of April 2013, but he estimates that there may be thousands. The OMICS Publishing Group, which publishes a number of the journals on this list, has threatened to sue Beall.

Some academic journals use the registered report format, which aims to counteract issues such as data dredging and hypothesizing after the results are known. For example, Nature Human Behaviour has adopted the registered report format, as it "shift[s] the emphasis from the results of research to the questions that guide the research and the methods used to answer them". The European Journal of Personality defines this format: "In a registered report, authors create a study proposal that includes theoretical and empirical background, research questions/hypotheses, and pilot data (if available). Upon submission, this proposal will then be reviewed prior to data collection, and if accepted, the paper resulting from this peer-reviewed procedure will be published, regardless of the study outcomes."

Lists of academic journals

Wikipedia has many lists of academic journals by discipline, such as List of African Studies Journals and List of Forestry Journals. The largest database providing detailed information about journals is Ulrich's Global Serials Directory. Other databases providing detailed information about journals are the Modern Language Association Directory of Periodicals and Genamics JournalSeek. Journal hosting websites like Project MUSE, JSTOR, PubMed, Ingenta, Web of Science, and Informaworld also provide journal lists. Some sites evaluate journals, providing information such as how long a journal takes to review articles and what types of articles it publishes.

Academic journal publishing reform

From Wikipedia, the free encyclopedia
 
Academic journal publishing reform is the advocacy for changes in the way academic journals are created and distributed in the age of the Internet and the advent of electronic publishing. Since the rise of the Internet, people have organized campaigns to change the relationships among academic authors, their traditional distributors, and their readership. Most of the discussion has centered on taking advantage of benefits offered by the Internet's capacity for widespread distribution of reading material.

History

Before the advent of the Internet, it was difficult for scholars to distribute articles containing their research results. Historically, publishers performed services including proofreading, typesetting, copy editing, printing, and worldwide distribution. Researchers are now expected to give publishers digital copies of their work which need no further processing. For digital distribution, printing is unnecessary, copying is essentially free, and worldwide distribution happens online instantly. In science journal publishing, Internet technology enabled the four major scientific publishers—Elsevier, Springer, Wiley, and Informa—to cut their expenditures such that they could consistently generate profits exceeding a third of their revenue.

The Internet made it easier for researchers to do work which had previously been done by publishers, and some people began to feel that they did not need to pay for the services of publishers. This perception was a problem for publishers, who stated that their services were still necessary at the rates they asked. Critics began to describe publishers' practices with terms such as "corporate scam" and "racket". Scholars sometimes obtain articles from fellow scholars through unofficial channels, such as posting requests on Twitter using the hashtag "#icanhazpdf" (a play on the I Can Has Cheezburger? meme), to avoid paying publishers' access charges.

Motivations for reform

Although it has some historical precedent, open access became desired in response to the advent of electronic publishing, as part of a broader desire for academic journal publishing reform. Electronic publishing created new benefits compared with paper publishing, but it also contributed to problems in traditional publishing models.

The premises behind open access are that there are viable funding models to maintain traditional academic publishing standards of quality while also making the following changes to the field:
  1. Rather than making journals available through a subscription business model, all academic publications should be free to read and published with some other funding model. Publications should be gratis or "free to read".
  2. Rather than applying traditional notions of copyright to academic publications, readers should be free to build upon the research of others. Publications should be libre or "free to build upon".
  3. Everyone should have greater awareness of the serious social problems caused by restricting access to academic research.
  4. Everyone should recognize that there are serious economic challenges for the future of academic publishing. Even if open access models are themselves problematic, traditional publishing models are not sustainable, and something radical needs to change.
Open access also has ambitions beyond merely granting access to academic publications, as access to research is only a tool for helping people achieve other goals. Open access advances scholarly pursuits in the fields of open data, open government, open educational resources, free and open-source software, and open science, among others.

Problems addressed by academic publishing reform

The motivations for academic journal publishing reform include the ability of computers to store large amounts of information, the advantages of giving more researchers access to preprints, and the potential for interactivity between researchers.

Various studies have shown that the demand for open access research was such that freely available articles were consistently cited more often than articles published under restricted access.

Some universities reported that modern "package deal" subscriptions were too costly for them to maintain, and that they would prefer to subscribe to journals individually to save money.

The problems which led to discussion about academic publishing reform have been considered in the context of what provision of open access might provide. Here are some of the problems in academic publishing which open access advocates purport that open access would address:
  1. A pricing crisis known as the serials crisis grew in the decades before open access and persists today. The academic publishing industry has increased the prices of academic journals faster than inflation and beyond library budgets.
  2. The pricing crisis means not only strained budgets, but also that people are actually losing access to journals.
  3. Not even the wealthiest libraries in the world are able to afford all the journals that their users are demanding, and less wealthy libraries are severely harmed by lack of access to journals.
  4. Publishers are using "bundling" strategies to sell journals, and this marketing strategy is criticized by many libraries as forcing them to pay for unpopular journals which their users are not demanding.
  5. Libraries are cutting their book budgets to pay for academic journals.
  6. Libraries do not own electronic journals in permanent archival form as they do paper copies, so if they have to cancel a subscription then they lose all subscribed journals. This did not happen with paper journals, and yet costs historically have been higher for electronic versions.
  7. Academic publishers get essential assets from their subscribers in a way that other publishers do not. Authors donate the texts of academic journal articles to the publishers and grant rights to publish them, and editors and referees donate peer review to validate the articles. The people writing the journals now question the increased pressure put upon them to pay higher prices for the journals produced by their own community.
  8. Conventional publishers use a business model which requires access barriers and creates artificial scarcity. All publishers need revenue, but open access promises models in which scarcity is not fundamental to raising revenue.
  9. Scholarly publishing depends heavily on government policy, public subsidies, gift economy, and anti-competitive practices, yet all of these things are in conflict with the conventional academic publishing model of restricting access to works.
  10. Toll access journals compete more for authors to donate content to them than they compete for subscribers to pay for the work, because every scholarly journal has a natural monopoly over the information of its field. As a result, the journal pricing market lacks the feedback of traditional market forces, and prices face no pressure driving them to serve the needs of the market.
  11. Besides the natural monopoly, there is supporting evidence that prices are artificially inflated to benefit publishers while harming the market. Evidence includes the trend of large publishers having accelerating price increases greater than those of small publishers, when in traditional markets high volume and high sales enable cost savings and lower prices.
  12. Conventional publishers fund "content protection" actions which restrict and police content sharing.
  13. For-profit publishers have economic incentives to decrease rates of rejected articles so that they publish more content to sell. No such market force exists if selling content for money is not a motivating factor.
  14. Many researchers are unaware that it might be possible for them to have all the research articles they need, and just accept it as fate that they will always be without some of the articles they would like to read.
  15. Access to toll-access journals is not scaling with increases in research and publishing; academic publishers face market pressure to restrict increases in publishing, and thereby indirectly restrict the growth of research.

Motivations against reform

Publishers state that if profit were not a consideration in the pricing of journals, the cost of accessing those journals would not substantially change. Publishers also state that they add value to publications in many ways, and that without academic publishing as an institution the readership would lose these services and fewer people would have access to articles.

Critics of open access have suggested that by itself, it is not a solution to scientific publishing's most serious problem – it simply changes the paths through which ever-increasing sums of money flow. There is evidence for this: for example, Yale University ended its financial support of BioMed Central's Open Access Membership program effective July 27, 2007. In their announcement, they stated,
The libraries’ BioMedCentral membership represented an opportunity to test the technical feasibility and the business model of this open access publisher. While the technology proved acceptable, the business model failed to provide a viable long-term revenue base built upon logical and scalable options. Instead, BioMedCentral has asked libraries for larger and larger contributions to subsidize their activities. Starting with 2005, BioMed Central article charges cost the libraries $4,658, comparable to single biomedicine journal subscription. The cost of article charges for 2006 then jumped to $31,625. The article charges have continued to soar in 2007 with the libraries charged $29,635 through June 2007, with $34,965 in potential additional article charges in submission.
A similar situation is reported from the University of Maryland, and Phil Davis commented that,
The assumptions that open access publishing is both cheaper and more sustainable than the traditional subscription model are featured in many of these mandates. But they remain just that — assumptions. In reality, the data from Cornell show just the opposite. Institutions like the University of Maryland would pay much more under an author-pays model, as would most research-intensive universities, and the rise in author processing charges (APCs) rivals the inflation felt at any time under the subscription model.
Opponents of the open access model see publishers as a part of the scholarly information chain and view a pay-for-access model as being necessary in ensuring that publishers are adequately compensated for their work. "In fact, most STM [Scientific, Technical and Medical] publishers are not profit-seeking corporations from outside the scholarly community, but rather learned societies and other non-profit entities, many of which rely on income from journal subscriptions to support their conferences, member services, and scholarly endeavors". Scholarly journal publishers that support pay-for-access claim that the "gatekeeper" role they play, maintaining a scholarly reputation, arranging for peer review, and editing and indexing articles, requires economic resources that are not supplied under an open access model. Conventional journal publishers may also lose customers to open access publishers who compete with them. The Partnership for Research Integrity in Science and Medicine (PRISM), a lobbying organization formed by the Association of American Publishers (AAP), is opposed to the open access movement. PRISM and AAP have lobbied against the increasing trend amongst funding organizations to require open publication, describing it as "government interference" and a threat to peer review.

For researchers, publishing an article in a reputable scientific journal is perceived as beneficial to one's reputation among scientific peers and in advancing one's academic career. There is a concern that open access journals are perceived as lacking the same reputation, which would lead to less publishing in them. Park and Qin discuss the perceptions that academics have with regard to open access journals, noting that among academics' concerns "are growing concerns about how to promote [Open Access] publishing." Park and Qin also state, "The general perception is that [Open Access] journals are new, and therefore many uncertainties, such as quality and sustainability, exist."

Journal article authors are generally not directly financially compensated for their work beyond their institutional salaries and the indirect benefits that an enhanced reputation provides in terms of institutional funding, job offers, and peer collaboration.

There are those, for example PRISM, who think that open access is unnecessary or even harmful. David Goodman argued that there is no need for those outside major academic institutions to have access to primary publications, at least in some fields.

The argument that publicly funded research should be made openly available has been countered with the assertion that "taxes are generally not paid so that taxpayers can access research results, but rather so that society can benefit from the results of that research; in the form of new medical treatments, for example. Publishers claim that 90% of potential readers can access 90% of all available content through national or research libraries, and while this may not be as easy as accessing an article online directly it is certainly possible." The taxpayer-funding argument also applies only in certain countries. In Australia, for instance, 80% of research funding comes from taxes, whereas in Japan and Switzerland only approximately 10% comes from public coffers.

For various reasons open access journals have been established by predatory publishers who seek to use the model to make money without regard to producing a quality journal. The causes of predatory open access publishing include the low barrier to creating the appearance of a legitimate digital journal and funding models which may include author publishing costs rather than subscription sales. Research reviewer Jeffrey Beall publishes a "List of Predatory Publishers" and an accompanying methodology for identifying publishers who have editorial and financial practices which are contrary to the ideal of good research publishing practices.

Reform initiatives

Public Library of Science

The Public Library of Science is a nonprofit open-access scientific publishing project aimed at creating a library of open access journals and other scientific literature under an open content license. The founding of the organization had its origins in a 2001 online petition calling for all scientists to pledge that from September 2001 they would discontinue submission of papers to journals which did not make the full text of their papers available to all, free and unfettered, either immediately or after a delay of several months. The petition collected 34,000 signatures, but the publishers offered no substantive response to its demands. Shortly thereafter, the Public Library of Science was founded as an alternative to traditional publishing.

HINARI

HINARI is a 2002 project of the World Health Organization and major publishers to enable developing countries to access collections of biomedical and health literature online at reduced subscription costs.

Research Works Act

The Research Works Act was a bill in the United States Congress which would have prohibited any law imposing an open access mandate on work published by US-government-funded researchers. The proposers of the law stated that it would "ensure the continued publication and integrity of peer-reviewed research works by the private sector". Critics of the law stated that it was the moment that "academic publishers gave up all pretence of being on the side of scientists." In February 2012, Elsevier withdrew its support for the bill. Following this statement, the sponsors of the bill announced that they would also withdraw their support.

The Cost of Knowledge

The Cost of Knowledge is a campaign begun in 2012 specifically targeting the scientific publishing company Elsevier. It was begun by a group of prominent mathematicians who each committed not to publish in Elsevier's journals.

Access2Research

Access2Research is a 2012 United States-based campaign in which open access advocates appealed to the United States government to require that taxpayer-funded research be made available to the public under open licensing.

PeerJ

PeerJ is an open-access journal launched in 2012 that charges publication fees per researcher, not per article, resulting in what has been called "a flat fee for 'all you can publish'".

Public Knowledge Project

Since 1998, the Public Knowledge Project (PKP) has been developing free open source software platforms for managing and publishing peer-reviewed open access journals and monographs, with Open Journal Systems used by more than 7,000 active journals in 2013.

Schekman boycott

2013 Nobel Prize winner Randy Schekman called for a boycott of traditional academic journals including Nature, Cell, and Science. Instead he promoted the open access journal eLife.

Initiative for Open Citations

The Initiative for Open Citations is a Crossref initiative for improved citation analysis. It has been supported by a majority of the participating publishers since April 2017.

Thursday, October 10, 2019

Information management

From Wikipedia, the free encyclopedia
 
Information management (IM) concerns a cycle of organizational activity: the acquisition of information from one or more sources, the custodianship and the distribution of that information to those who need it, and its ultimate disposition through archiving or deletion.

This cycle of organisational involvement with information involves a variety of stakeholders, including those who are responsible for assuring the quality, accessibility and utility of acquired information; those who are responsible for its safe storage and disposal; and those who need it for decision making. Stakeholders might have rights to originate, change, distribute or delete information according to organisational information management policies.

Information management embraces all the generic concepts of management, including the planning, organizing, structuring, processing, controlling, evaluation and reporting of information activities, all of which are needed in order to meet the needs of those with organisational roles or functions that depend on information. These generic concepts allow information to be presented to the correct audience; once individuals are able to put that information to use, it gains further value.

Information management is closely related to, and overlaps with, the management of data, systems, technology, processes and – where the availability of information is critical to organisational success – strategy. This broad view of the realm of information management contrasts with the earlier, more traditional view that the life cycle of managing information is an operational matter requiring specific procedures, organisational capabilities and standards that deal with information as a product or a service.

History

Emergent ideas out of data management

In the 1970s, the management of information largely concerned matters closer to what would now be called data management: punched cards, magnetic tapes and other record-keeping media, involving a life cycle of such formats requiring origination, distribution, backup, maintenance and disposal. At this time the huge potential of information technology began to be recognised: for example a single chip storing a whole book, or electronic mail moving messages instantly around the world – remarkable ideas at the time.

With the proliferation of information technology and the extending reach of information systems in the 1980s and 1990s, information management took on a new form. Progressive businesses such as British Petroleum transformed the vocabulary of what was then "IT management": "systems analysts" became "business analysts", "monopoly supply" became a mixture of "insourcing" and "outsourcing", and the large IT function was transformed into "lean teams" that began to allow some agility in the processes that harness information for business benefit. The scope of senior management interest in information at British Petroleum extended from the creation of value through improved business processes, based upon the effective management of information, permitting the implementation of appropriate information systems (or "applications") that were operated on outsourced IT infrastructure.

In this way, information management was no longer a simple job that could be performed by anyone who had nothing else to do; it became highly strategic and a matter for senior management attention. An understanding of the technologies involved, an ability to manage information systems projects and business change well, and a willingness to align technology and business strategies all became necessary.

Positioning information management in the bigger picture

In the transitional period leading up to the strategic view of information management, Venkatraman, a strong advocate of this transition and transformation, proffered a simple arrangement of ideas that succinctly brought together the management of data, information and knowledge (see the figure), arguing that:
  • Data that is maintained in IT infrastructure has to be interpreted in order to render information.
  • The information in our information systems has to be understood in order to emerge as knowledge.
  • Knowledge allows managers to take effective decisions.
  • Effective decisions have to lead to appropriate actions.
  • Appropriate actions are expected to deliver meaningful results.
This simple model summarises a presentation by Venkatraman in 1996, as reported by Ward and Peppard (2002, page 207).
 
This is often referred to as the DIKAR model: Data, Information, Knowledge, Action and Result. It gives a strong clue as to the layers involved in aligning technology and organisational strategies, and it can be seen as a pivotal moment in changing attitudes to information management. The recognition that information management is an investment that must deliver meaningful results is important to all modern organisations that depend on information and good decision-making for their success.
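Purely as an illustration (the function names and figures below are invented, not part of Venkatraman's presentation), the DIKAR chain can be sketched as a pipeline in which each layer adds interpretation to the one below it:

```python
# Illustrative sketch of the DIKAR chain; stage names and data are hypothetical.
def to_information(data):          # Data -> Information: interpret raw figures
    return {"avg_daily_sales": sum(data) / len(data)}

def to_knowledge(information):     # Information -> Knowledge: understand in context
    return information["avg_daily_sales"] < 100   # "sales are below target"

def to_action(knowledge):          # Knowledge -> Action: decide and act
    return "launch promotion" if knowledge else "hold course"

daily_sales = [80, 95, 70]         # Data held in IT infrastructure
action = to_action(to_knowledge(to_information(daily_sales)))
# action == "launch promotion"; the Result is then measured against that action
```

The point of the sketch is only that each transition is a distinct act of interpretation: raw data alone does not decide anything.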

Theoretical background

Behavioural and organisational theories

It is commonly believed that good information management is crucial to the smooth working of organisations, and although there is no commonly accepted theory of information management per se, behavioural and organisational theories help. Following the behavioural science theory of management, mainly developed at Carnegie Mellon University and prominently supported by March and Simon, most of what goes on in modern organizations is actually information handling and decision making. One crucial factor in information handling and decision making is an individual's ability to process information and to make decisions under limitations that might derive from the context: a person's age, the situational complexity, or a lack of requisite quality in the information at hand – all of which are exacerbated by the rapid advance of technology and the new kinds of system it enables, especially as the social web emerges as a phenomenon that business cannot ignore. And yet, well before there was any general recognition of the importance of information management in organisations, March and Simon argued that organizations have to be considered as cooperative systems, with a high level of information processing and a vast need for decision making at various levels. Instead of the model of the "economic man", as advocated in classical theory, they proposed "administrative man" as an alternative, based on their argument about the cognitive limits of rationality. Additionally, they proposed the notion of satisficing, which entails searching through the available alternatives until an acceptability threshold is met – another idea that still has currency.
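Satisficing is easy to sketch in code (a hypothetical illustration, not from March and Simon): rather than scoring and ranking every alternative, the search stops at the first acceptable option.

```python
def satisfice(alternatives, score, threshold):
    """Return the first alternative whose score meets the threshold.

    Unlike optimisation, satisficing stops searching as soon as an
    acceptable option is found, reflecting the cognitive limits of
    Simon's 'administrative man'.
    """
    for option in alternatives:
        if score(option) >= threshold:
            return option
    return None  # no acceptable option found

# Hypothetical example: accept the first supplier quote at or under 100.
quotes = [("A", 120), ("B", 95), ("C", 80)]
choice = satisfice(quotes, score=lambda q: -q[1], threshold=-100)
# choice == ("B", 95): "B" is acceptable, so the cheaper "C" is never examined
```

The design choice is the trade-off itself: the decision maker saves search effort at the cost of possibly missing a better option later in the list.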

Economic theory

In addition to the organisational factors mentioned by March and Simon, there are other issues that stem from economic and environmental dynamics. There is the cost of collecting and evaluating the information needed to take a decision, including the time and effort required. The transaction cost associated with information processes can be high. In particular, established organizational rules and procedures can prevent the taking of the most appropriate decision, leading to sub-optimum outcomes. This is an issue that has been presented as a major problem with bureaucratic organizations that lose the economies of strategic change because of entrenched attitudes.

Strategic information management

Background

According to the Carnegie Mellon School, an organization's ability to process information is at the core of organizational and managerial competency, and an organization's strategies must be designed to improve information processing capability; as the information systems that provide that capability became formalised and automated, competencies were severely tested at many levels. It was recognised that organisations needed to be able to learn and adapt in ways that were never so evident before, and academics began to organise and publish definitive works concerning the strategic management of information and information systems. Concurrently, the ideas of business process management and knowledge management emerged, although much of the optimistic early thinking about business process redesign has since been discredited in the information management literature. In the strategic studies field, understanding the information environment is considered of the highest priority; the information environment is conceived as the aggregate of individuals, organizations, and systems that collect, process, disseminate, or act on information. It consists of three interrelated dimensions that continuously interact with individuals, organizations, and systems: the physical, the informational, and the cognitive.

Aligning technology and business strategy with information management

Venkatraman has provided a simple view of the requisite capabilities of an organisation that wants to manage information well – the DIKAR model (see above). He also worked with others to understand how technology and business strategies could be appropriately aligned in order to identify specific capabilities that are needed. This work was paralleled by other writers in the world of consulting, practice and academia.

A contemporary portfolio model for information

Bytheway has collected and organised basic tools and techniques for information management in a single volume. At the heart of his view of information management is a portfolio model that takes account of the surging interest in external sources of information and the need to organise unstructured external information so as to make it useful (see the figure).

This portfolio model organizes issues of internal and external sourcing and management of information, that may be either structured or unstructured.

Such an information portfolio as this shows how information can be gathered and usefully organised, in four stages:

Stage 1: Taking advantage of public information: recognise and adopt well-structured external schemes of reference data, such as post codes, weather data, GPS positioning data and travel timetables, exemplified in the personal computing press.
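For example (the names, codes and the `regions` lookup below are hypothetical), adopting a public reference scheme such as post codes lets internal records be joined to externally maintained data with no bespoke structure of one's own:

```python
# Illustrative only: keying internal records by a public reference
# scheme (here, UK-style post codes) so external data can be joined in.
customers = {
    "C001": {"name": "Acme Ltd", "postcode": "SW1A 1AA"},
    "C002": {"name": "Bloggs & Co", "postcode": "EH1 2NG"},
}

# Hypothetical external dataset published against the same scheme.
regions = {"SW1A 1AA": "London", "EH1 2NG": "Edinburgh"}

for cid, rec in customers.items():
    rec["region"] = regions.get(rec["postcode"], "unknown")
# customers["C001"]["region"] == "London"
```

The benefit is that the structure (the post code scheme) is maintained externally; the organisation only has to record the key.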

Stage 2: Tagging the noise on the world wide web: use existing schemes such as post codes and GPS data or, more typically, add “tags”; alternatively, construct a formal ontology that provides structure. Shirky provides an overview of these two approaches.

Stage 3: Sifting and analysing: in the wider world the generalised ontologies that are under development extend to hundreds of entities and hundreds of relations between them and provide the means to elicit meaning from large volumes of data. Structured data in databases works best when that structure reflects a higher-level information model – an ontology, or an entity-relationship model.

Stage 4: Structuring and archiving: the large volume of data available from sources such as the social web and from the miniature telemetry systems used in personal health management demands new ways to archive and then trawl data for meaningful information. Map-reduce methods, originating in functional programming, are a more recent way of eliciting information from large archival datasets; they are becoming interesting to regular businesses that have very large data resources to work with, but they require advanced multi-processor resources.
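A minimal single-machine sketch of the map-reduce idea (word counting is the conventional teaching example, not one drawn from the text): records are mapped independently, which is what makes the map phase trivially parallelisable across processors, and the partial results are then merged pairwise.

```python
from functools import reduce
from collections import Counter

# Map phase: each record is processed independently (parallelisable).
def map_words(line):
    return Counter(line.split())

# Reduce phase: partial counts are merged pairwise into a running total.
def merge(total, partial):
    total.update(partial)   # Counter.update adds counts rather than replacing
    return total

lines = ["to be or not to be", "to do is to be"]
counts = reduce(merge, map(map_words, lines), Counter())
# counts["to"] == 4, counts["be"] == 3
```

In a real deployment the `map` and `reduce` steps would be distributed across many machines by a framework such as Hadoop; the functional structure shown here is what makes that distribution possible.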

Competencies to manage information well

The Information Management Body of Knowledge was made available on the world wide web in 2004 and sets out to show that the management competencies required to derive real benefits from an investment in information are complex and multi-layered. The framework model that is the basis for understanding competencies comprises six “knowledge” areas and four “process” areas:

This framework is the basis for organising the "Information Management Body of Knowledge", first made available in 2004. The version shown here was adapted by the addition of "Business information" in 2014.
The information management knowledge areas
The IMBOK is based on the argument that there are six areas of required management competency, two of which (“business process management” and “business information management”) are very closely related.
  • Information technology: The pace of change of technology and the pressure to constantly acquire the newest technological products can undermine the stability of the infrastructure that supports systems, optimises business processes and delivers benefits. It is necessary to manage the “supply side” and recognise that technology is increasingly becoming a commodity.
  • Information system: While historically information systems were developed in-house, over the years it has become possible to acquire most of the software systems that an organisation needs from the software package industry. However, there is still the potential for competitive advantage from the implementation of new systems ideas that deliver to the strategic intentions of organisations.
  • Business processes and Business information: Information systems are applied to business processes in order to improve them, and they bring data to the business that becomes useful as business information. Business process management is still seen as a relatively new idea because it is not universally adopted, and it has been difficult in many cases; business information management is even more of a challenge.
  • Business benefit: What are the benefits that we are seeking? It is necessary not only to be brutally honest about what can be achieved, but also to ensure the active management and assessment of benefit delivery. Since the emergence and popularisation of the Balanced scorecard there has been huge interest in business performance management, but little serious effort was made to relate business performance management to the benefits of information technology investments and the introduction of new information systems until the turn of the millennium.
  • Business strategy: Although a long way from the workaday issues of managing information in organisations, strategy in most organisations simply has to be informed by information technology and information systems opportunities, whether to address poor performance or to improve differentiation and competitiveness. Strategic analysis tools such as the value chain and critical success factor analysis are directly dependent on proper attention to the information that is (or could be) managed.
The information management processes
Even with full capability and competency within the six knowledge areas, it is argued that things can still go wrong. The problem lies in the migration of ideas and information management value from one area of competency to another. Summarising what Bytheway explains in some detail (and supported by selected secondary references):
  • Projects: Information technology is without value until it is engineered into information systems that meet the needs of the business by means of good project management.
  • Business change: The best information systems succeed in delivering benefits through the achievement of change within the business systems, but people do not appreciate change that makes new demands upon their skills in the ways that new information systems often do. Contrary to common expectations, there is some evidence that the public sector has succeeded with information technology induced business change.
  • Business operations: With new systems in place, with business processes and business information improved, and with staff finally ready and able to work with new processes, then the business can get to work, even when new systems extend far beyond the boundaries of a single business.
  • Performance management: Investments are no longer judged solely on financial results; financial success must be balanced with internal efficiency, customer satisfaction, and organisational learning and development.

Summary

There are always many ways to see a business, and the information management viewpoint is only one way. It is important to remember that other areas of business activity will also contribute to strategy – it is not only good information management that moves a business forwards. Corporate governance, human resource management, product development and marketing will all have an important role to play in strategic ways, and we must not see one domain of activity alone as the sole source of strategic success. On the other hand, corporate governance, human resource management, product development and marketing are all dependent on effective information management, and so in the final analysis our competency to manage information well, on the broad basis that is offered here, can be said to be predominant.

Operationalising information management

Managing requisite change

Organizations are often confronted with many information management challenges and issues at the operational level, especially when organisational change is engendered. The novelty of new systems architectures and a lack of experience with new styles of information management require a level of organisational change management that is notoriously difficult to deliver. As a result of a general organisational reluctance to change in order to enable new forms of information management, there might be (for example): a shortfall in the requisite resources, a failure to acknowledge new classes of information and the new procedures that use them, a lack of support from senior management leading to a loss of strategic vision, and even political manoeuvring that undermines the operation of the whole organisation. However, the implementation of new forms of information management should normally lead to operational benefits.

The early work of Galbraith

In early work, taking an information processing view of organisation design, Jay Galbraith identified five tactical areas in which to increase information processing capacity and reduce the need for information processing:
  • Developing, implementing, and monitoring all aspects of the “environment” of an organization.
  • Creation of slack resources so as to decrease the load on the overall hierarchy of resources and to reduce information processing relating to overload.
  • Creation of self-contained tasks with defined boundaries and that can achieve proper closure, and with all the resources at hand required to perform the task.
  • Recognition of lateral relations that cut across functional units, so as to move decision power to the process instead of fragmenting it within the hierarchy.
  • Investment in vertical information systems that route information flows for a specific task (or set of tasks) in accordance with the applied business logic.

The matrix organisation

The lateral relations concept leads to an organizational form that is different from the simple hierarchy: the “matrix organization”. This brings together the vertical (hierarchical) view of an organisation and the horizontal (product or project) view of the work it does that is visible to the outside world. The creation of a matrix organization is one management response to a persistent fluidity of external demand, avoiding multifarious and spurious responses to episodic demands that tend to be dealt with individually.
