Wednesday, July 28, 2021

Information and communications technology

A Concept Map on the Use of Information and Communication Technology (ICT) in Education, as per the International Federation of ICT (IFGICT) Assessment

Information and communications technology (ICT) is an extensional term for information technology (IT) that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals) and computers, as well as the necessary enterprise software, middleware, storage and audiovisual systems, that enable users to access, store, transmit, understand and manipulate information, as described by the International Federation of ICT (IFGICT).

The term ICT is also used to refer to the convergence of audiovisual and telephone networks with computer networks through a single cabling or link system. There are large economic incentives to merge the telephone network with the computer network system using a single unified system of cabling, signal distribution, and management. ICT is an umbrella term that includes any communication device, encompassing radio, television, cell phones, computer and network hardware, satellite systems and so on, as well as the various services and appliances with them such as video conferencing and distance learning.

ICT is a broad subject and the concepts are evolving. It covers any product that will store, retrieve, manipulate, transmit, or receive information electronically in a digital form (e.g., personal computers including smartphones, digital television, email, or robots). Theoretical differences between interpersonal-communication technologies and mass-communication technologies have been identified by the philosopher Piyush Mathur. Skills Framework for the Information Age is one of many models for describing and managing competencies for ICT professionals for the 21st century.

Etymology

The phrase "information and communication technologies" has been used by academic researchers since the 1980s. The abbreviation "ICT" became popular after it was used in a report to the UK government by Dennis Stevenson in 1997, and then in the revised National Curriculum for England, Wales and Northern Ireland in 2000. However, in 2012, the Royal Society recommended that the use of the term "ICT" should be discontinued in British schools "as it has attracted too many negative connotations". From 2014, the National Curriculum has used the word computing, which reflects the addition of computer programming into the curriculum.

Variations of the phrase have spread worldwide. The United Nations has created a "United Nations Information and Communication Technologies Task Force" and an internal "Office of Information and Communications Technology".

Monetisation

Worldwide IT spending was estimated at US$3.8 trillion in 2017 and has been growing at less than 5% per year since 2009. The estimated 2018 growth of the entire ICT sector was 5%. The biggest growth, 16%, is expected in the area of new technologies (IoT, robotics, AR/VR, and AI).

The 2014 IT budget of the US federal government was nearly $82 billion. IT costs, as a percentage of corporate revenue, have grown 50% since 2002, putting a strain on IT budgets. When looking at current companies' IT budgets, 75% are recurrent costs, used to "keep the lights on" in the IT department, and 25% are the cost of new initiatives for technology development.

The average IT budget has the following breakdown:

  • 31% personnel costs (internal)
  • 29% software costs (external/purchasing category)
  • 26% hardware costs (external/purchasing category)
  • 14% costs of external service providers (external/services).
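
As a rough illustration of this breakdown in absolute terms, the sketch below applies the percentages to a hypothetical total budget; the US$10 million figure is an assumption made only for the example.

# Hypothetical illustration: apply the average IT budget breakdown
# (31% personnel, 29% software, 26% hardware, 14% external services)
# to an assumed total budget of US$10 million.
total_budget = 10_000_000  # assumed figure, for illustration only

breakdown = {
    "personnel (internal)": 0.31,
    "software (external/purchasing)": 0.29,
    "hardware (external/purchasing)": 0.26,
    "external service providers (external/services)": 0.14,
}

for category, share in breakdown.items():
    print(f"{category:<48} ${share * total_budget:,.0f}")

# The shares should account for the whole budget.
assert abs(sum(breakdown.values()) - 1.0) < 1e-9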

The estimate of money to be spent in 2022 is just over US$6 trillion.

Technological capacity

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 to 15.8 in 1993, over 54.5 in 2000, 295 (optimally compressed) exabytes in 2007, and some 5 zettabytes in 2014. This is the informational equivalent of 1.25 stacks of CD-ROMs from the earth to the moon in 2007, and the equivalent of 4,500 stacks of printed books from the earth to the sun in 2014. The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986, 715 (optimally compressed) exabytes in 1993, 1.2 (optimally compressed) zettabytes in 2000, and 1.9 zettabytes in 2007. The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986, 471 petabytes in 1993, 2.2 (optimally compressed) exabytes in 2000, 65 (optimally compressed) exabytes in 2007, and some 100 exabytes in 2014. The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986 to 6.4 × 10^12 MIPS in 2007.
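
To make these growth figures concrete, the short sketch below computes the compound annual growth rate and doubling time implied by the storage figures quoted above (2.6 exabytes in 1986 to 295 exabytes in 2007); the arithmetic is standard compound growth, not a calculation taken from the cited studies.

import math

# Optimally compressed storage capacity, from the figures quoted above.
start_eb, end_eb, years = 2.6, 295.0, 2007 - 1986

annual_growth = (end_eb / start_eb) ** (1 / years) - 1       # compound rate
doubling_years = math.log(2) / math.log(1 + annual_growth)   # time to double

print(f"Annual growth: {annual_growth:.1%}")                 # roughly 25% per year
print(f"Doubling time: {doubling_years * 12:.0f} months")    # on the order of three years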

ICT sector in the OECD

The following is a list of OECD countries by share of ICT sector in total value added in 2013.

Rank  Country         ICT sector (% of total value added)
1     South Korea     10.7
2     Japan           7.02
3     Ireland         6.99
4     Sweden          6.82
5     Hungary         6.09
6     United States   5.89
7     India           5.87
8     Czech Republic  5.74
9     Finland         5.60
10    United Kingdom  5.53
11    Estonia         5.33
12    Slovakia        4.87
13    Germany         4.84
14    Switzerland     4.63
15    Luxembourg      4.54
16    France          4.33
17    Slovenia        4.26
18    Denmark         4.06
19    Spain           4.00
20    Canada          3.86
21    Italy           3.72
22    Belgium         3.72
23    Austria         3.56
24    Portugal        3.43
25    Poland          3.33
26    Norway          3.32
27    Greece          3.31
28    Iceland         2.87
29    Mexico          2.77

ICT Development Index

The ICT Development Index (IDI) ranks and compares the level of ICT use and access across countries around the world. In 2014, the ITU (International Telecommunication Union) released the latest rankings of the IDI, with Denmark attaining the top spot, followed by South Korea. The top 30 countries in the rankings include most high-income countries where the quality of life is higher than average, including countries from Europe and other regions such as "Australia, Bahrain, Canada, Japan, Macao (China), New Zealand, Singapore, and the United States; almost all countries surveyed improved their IDI ranking this year."

The WSIS process and ICT development goals

On 21 December 2001, the United Nations General Assembly approved Resolution 56/183, endorsing the holding of the World Summit on the Information Society (WSIS) to discuss the opportunities and challenges facing today's information society. According to this resolution, the General Assembly related the Summit to the United Nations Millennium Declaration's goal of implementing ICT to achieve Millennium Development Goals. It also emphasized a multi-stakeholder approach to achieve these goals, using all stakeholders including civil society and the private sector, in addition to governments.

To help anchor and expand ICT to every habitable part of the world, "2015 is the deadline for achievements of the UN Millennium Development Goals (MDGs), which global leaders agreed upon in the year 2000."

In education

Today's society reflects an ever more computer-centric lifestyle, which includes the rapid influx of computers into the modern classroom.

There is evidence that, to be effective in education, ICT must be fully integrated into the pedagogy. Specifically, when teaching literacy and math, using ICT in combination with Writing to Learn produces better results than traditional methods alone or ICT alone. The United Nations Educational, Scientific and Cultural Organisation (UNESCO), an agency of the United Nations, has made integrating ICT into education part of its efforts to ensure equity and access to education. The following, taken directly from a UNESCO publication on educational ICT, explains the organization's position on the initiative.

Information and Communication Technology can contribute to universal access to education, equity in education, the delivery of quality learning and teaching, teachers' professional development and more efficient education management, governance, and administration. UNESCO takes a holistic and comprehensive approach to promote ICT in education. Access, inclusion, and quality are among the main challenges they can address. The Organization's Intersectoral Platform for ICT in education focuses on these issues through the joint work of three of its sectors: Communication & Information, Education and Science.

OLPC laptops at a school in Rwanda

Despite the power of computers to enhance and reform teaching and learning practices, improper implementation is a widespread issue beyond the reach of increased funding and technological advances, and there is little evidence that teachers and tutors are properly integrating ICT into everyday learning. Intrinsic barriers, such as a belief in more traditional teaching practices, individual attitudes towards computers in education, and teachers' own comfort with computers and their ability to use them, all result in varying effectiveness in the integration of ICT in the classroom.

Mobile learning for refugees

School environments play an important role in facilitating language learning. However, language and literacy barriers are obstacles preventing refugees from accessing and attending school, especially outside camp settings.

Mobile-assisted language learning apps are key tools for language learning. Mobile solutions can provide support for refugees’ language and literacy challenges in three main areas: literacy development, foreign language learning and translations. Mobile technology is relevant because communicative practice is a key asset for refugees and immigrants as they immerse themselves in a new language and a new society. Well-designed mobile language learning activities connect refugees with mainstream cultures, helping them learn in authentic contexts.

Developing countries

Africa

Representatives meet for a policy forum on M-Learning at UNESCO's Mobile Learning Week in March 2017

ICT has been employed as an educational enhancement in Sub-Saharan Africa since the 1960s. Beginning with television and radio, it extended the reach of education from the classroom to the living room, and to geographical areas that had been beyond the reach of the traditional classroom. As the technology evolved and became more widely used, efforts in Sub-Saharan Africa were also expanded. In the 1990s a massive effort to push computer hardware and software into schools was undertaken, with the goal of familiarizing both students and teachers with computers in the classroom. Since then, multiple projects have endeavoured to continue the expansion of ICT's reach in the region, including the One Laptop Per Child (OLPC) project, which by 2015 had distributed over 2.4 million laptops to nearly 2 million students and teachers.

The inclusion of ICT in the classroom, often referred to as M-Learning, has expanded the reach of educators and improved their ability to track student progress in Sub-Saharan Africa. In particular, the mobile phone has been most important in this effort. Mobile phone use is widespread, and mobile networks cover a wider area than internet networks in the region. The devices are familiar to students, teachers, and parents, and allow increased communication and access to educational materials. In addition to benefits for students, M-learning also offers the opportunity for better teacher training, which leads to a more consistent curriculum across the educational service area. In 2011, UNESCO started a yearly symposium called Mobile Learning Week with the purpose of gathering stakeholders to discuss the M-learning initiative.

Implementation is not without its challenges. While mobile phone and internet use are increasing much more rapidly in Sub-Saharan Africa than in other developing countries, progress is still slow compared to the developed world, with smartphone penetration only expected to reach 20% by 2017. Additionally, there are gender, social, and geo-political barriers to educational access, and the severity of these barriers varies greatly by country. Overall, 29.6 million children in Sub-Saharan Africa were not in school in the year 2012, owing not just to the geographical divide, but also to political instability, the importance of social origins, social structure, and gender inequality. Once in school, students also face barriers to quality education, such as teacher competency, training and preparedness, access to educational materials, and lack of information management.

Modern ICT

In modern society, ICT is ever-present, with over three billion people having access to the Internet. With approximately 8 out of 10 Internet users owning a smartphone, information and data are increasing by leaps and bounds. This rapid growth, especially in developing countries, has led ICT to become a keystone of everyday life, in which life without some facet of technology renders most clerical, work, and routine tasks dysfunctional.

The most recent authoritative data, released in 2014, shows "that Internet use continues to grow steadily, at 6.6% globally in 2014 (3.3% in developed countries, 8.7% in the developing world); the number of Internet users in developing countries has doubled in five years (2009-2014), with two-thirds of all people online now living in the developing world."

However, hurdles are still large. "Of the 4.3 billion people not yet using the Internet, 90% live in developing countries. In the world's 42 Least Connected Countries (LCCs), which are home to 2.5 billion people, access to ICTs remains largely out of reach, particularly for these countries' large rural populations." ICT has yet to penetrate the remote areas of some countries, with many developing countries lacking any type of Internet access. This also includes the availability of telephone lines, particularly the availability of cellular coverage, and other forms of electronic transmission of data. The latest "Measuring the Information Society Report" cautiously stated that the increase in the aforementioned cellular data coverage is ostensible, as "many users have multiple subscriptions, with global growth figures sometimes translating into little real improvement in the level of connectivity of those at the very bottom of the pyramid; an estimated 450 million people worldwide live in places which are still out of reach of mobile cellular service."

Favourably, the gap between Internet access and mobile coverage has decreased substantially in the last fifteen years, in which "2015 [was] the deadline for achievements of the UN Millennium Development Goals (MDGs), which global leaders agreed upon in the year 2000, and the new data show ICT progress and highlight remaining gaps." ICT continues to take on new forms, with nanotechnology set to usher in a new wave of ICT electronics and gadgets. ICT's newest additions to the modern electronic world include smartwatches, such as the Apple Watch, smart wristbands such as the Nike+ FuelBand, and smart TVs such as Google TV. With desktops soon becoming part of a bygone era, and laptops becoming the preferred method of computing, ICT continues to insinuate itself into, and alter, the ever-changing globe.

Information communication technologies play a role in facilitating accelerated pluralism in new social movements today. The internet, according to Bruce Bimber, is "accelerating the process of issue group formation and action", and he coined the term accelerated pluralism to explain this new phenomenon. ICTs are tools for "enabling social movement leaders and empowering dictators", in effect promoting societal change. ICTs can be used to garner grassroots support for a cause, because the internet allows for political discourse and direct interventions with state policy, as well as changing the way complaints from the populace are handled by governments. Furthermore, ICTs in a household are associated with women rejecting justifications for intimate partner violence. According to a study published in 2017, this is likely because “[a]ccess to ICTs exposes women to different ways of life and different notions about women’s role in society and the household, especially in culturally conservative regions where traditional gender expectations contrast observed alternatives."

Models of access to ICT

Scholar Mark Warschauer defines a “models of access” framework for analyzing ICT accessibility. In the second chapter of his book, Technology and Social Inclusion: Rethinking the Digital Divide, he describes three models of access to ICTs: devices, conduits, and literacy. Devices and conduits are the most common descriptors for access to ICTs, but they are insufficient for meaningful access without the third model, literacy. Combined, these three models roughly incorporate all twelve of the criteria of “Real Access” to ICT use, conceptualized by the non-profit organization Bridges.org in 2005:

  1. Physical access to technology
  2. Appropriateness of technology
  3. Affordability of technology and technology use
  4. Human capacity and training
  5. Locally relevant content, applications, and services
  6. Integration into daily routines
  7. Socio-cultural factors
  8. Trust in technology
  9. Local economic environment
  10. Macro-economic environment
  11. Legal and regulatory framework
  12. Political will and public support

Devices

The most straightforward model of access for ICT in Warschauer’s theory is devices. In this model, access is defined most simply as the ownership of a device such as a phone or computer. Warschauer identifies many flaws with this model, including its inability to account for additional costs of ownership such as software, access to telecommunications, knowledge gaps surrounding computer use, and the role of government regulation in some countries. Therefore, Warschauer argues that considering only devices understates the magnitude of digital inequality. For example, the Pew Research Center notes that 96% of Americans own a smartphone, although most scholars in this field would contend that comprehensive access to ICT in the United States is likely much lower than that.

Conduits

A conduit requires a connection to a supply line, which for ICT could be a telephone line or Internet line. Accessing the supply requires investment in the proper infrastructure from a commercial company or local government and recurring payments from the user once the line is set up. For this reason, conduits usually divide people based on their geographic locations. As a Pew Research Center poll reports, rural Americans are 12% less likely to have broadband access than other Americans, thereby making them less likely to own the devices. Additionally, these costs can be prohibitive to lower-income families accessing ICTs. These difficulties have led to a shift toward mobile technology; fewer people are purchasing broadband connection and are instead relying on their smartphones for Internet access, which can be found for free at public places such as libraries. Indeed, smartphones are on the rise, with 37% of Americans using smartphones as their primary medium for internet access and 96% of Americans owning a smartphone.

Literacy

Youth and adults with ICT skills, 2017

In 1981, Sylvia Scribner and Michael Cole studied a tribe in Liberia, the Vai people, that has its own local language. Since about half of those literate in Vai have never had formal schooling, Scribner and Cole were able to test more than 1,000 subjects to measure the mental capabilities of literates over non-literates. This research, which they laid out in their book The Psychology of Literacy, allowed them to study whether the literacy divide exists at the individual level. Warschauer applied their literacy research to ICT literacy as part of his model of ICT access.

Scribner and Cole found no generalizable cognitive benefits from Vai literacy; instead, individual differences on cognitive tasks were due to other factors, like schooling or living environment. The results suggested that there is “no single construct of literacy that divides people into two cognitive camps; [...] rather, there are gradations and types of literacies, with a range of benefits closely related to the specific functions of literacy practices.” Furthermore, literacy and social development are intertwined, and the literacy divide does not exist on the individual level.

Warschauer draws on Scribner and Cole’s research to argue that ICT literacy functions similarly to literacy acquisition, as they both require resources rather than a narrow cognitive skill. Conclusions about literacy serve as the basis for a theory of the digital divide and ICT access, as detailed below:

  • There is not just one type of ICT access, but many types.
  • The meaning and value of access varies in particular social contexts.
  • Access exists in gradations rather than in a bipolar opposition.
  • Computer and Internet use brings no automatic benefit outside of its particular functions.
  • ICT use is a social practice, involving access to physical artifacts, content, skills, and social support.
  • Acquisition of ICT access is a matter not only of education but also of power.

Therefore, Warschauer concludes that access to ICT cannot rest on devices or conduits alone; it must also engage physical, digital, human, and social resources. Each of these categories of resources has an iterative relation with ICT use. If ICT is used well, it can promote these resources, but if it is used poorly, it can contribute to a cycle of underdevelopment and exclusion.

Tuesday, July 27, 2021

Information technology

From Wikipedia, the free encyclopedia
 

Information technology (IT) is the use of computers to store or retrieve data and information. IT is typically used within the context of business operations as opposed to personal or entertainment technologies. IT is considered to be a subset of information and communications technology (ICT). An information technology system (IT system) is generally an information system, a communications system, or, more specifically speaking, a computer system – including all hardware, software, and peripheral equipment – operated by a limited group of IT users.

Humans have been storing, retrieving, manipulating, and communicating information since the Sumerians in Mesopotamia developed writing in about 3000 BC. However, the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology (IT)." Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.

The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several products or services within an economy are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, and e-commerce.

Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940), and electronic (1940–present). This article focuses on the most recent period (electronic).

History of computer technology

Zuse Z3 replica on display at Deutsches Museum in Munich. The Zuse Z3 is the first programmable computer.
 
The Antikythera mechanism, considered the earliest known mechanical analog computer, dates back to the first century BC.

Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick. The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered to be the earliest known mechanical analog computer, and the earliest known geared mechanism. Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed.

Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. During the Second World War, Colossus, developed to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. The first recognizably modern electronic digital stored-program computer was the Manchester Baby, which ran its first program on 21 June 1948.

The development of transistors in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison, the first transistorized computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.

Several other breakthroughs in semiconductor technology include the integrated circuit (IC) invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor in 1959, the metal-oxide-semiconductor field-effect transistor (MOSFET) invented by Mohamed Atalla and Dawon Kahng at Bell Laboratories in 1959, and the microprocessor invented by Ted Hoff, Federico Faggin, Masatoshi Shima and Stanley Mazor at Intel in 1971. These important inventions led to the development of the personal computer (PC) in the 1970s, and the emergence of information and communications technology (ICT).

Electronic data processing

A Ferranti Mark I computer logic board.

Data storage

Punched tapes were used in early computers to represent data.

Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete. Electronic data storage, which is used in modern computers, dates from World War II, when a form of delay line memory was developed to remove the clutter from radar signals, the first practical application of which was the mercury delay line. The first random-access digital storage device was the Williams tube, based on a standard cathode ray tube, but the information stored in it and delay line memory was volatile in that it had to be continuously refreshed, and thus was lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the world's first commercially available general-purpose electronic computer.

IBM introduced the first hard disk drive in 1956, as a component of their 305 RAMAC computer system. Most digital data today is still stored magnetically on hard disks, or optically on media such as CD-ROMs. Until 2002 most information was stored on analog devices, but that year digital storage capacity exceeded analog for the first time. As of 2007, almost 94% of the data stored worldwide was held digitally: 52% on hard disks, 28% on optical devices, and 11% on digital magnetic tape. It has been estimated that the worldwide capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007, doubling roughly every 3 years.

Databases

Database Management Systems (DMS) emerged in the 1960s to address the problem of storing and retrieving large amounts of data accurately and quickly. An early such system was IBM's Information Management System (IMS), which is still widely deployed more than 50 years later. IMS stores data hierarchically, but in the 1970s Ted Codd proposed an alternative relational storage model based on set theory and predicate logic and the familiar concepts of tables, rows, and columns. In 1981, the first commercially available relational database management system (RDBMS) was released by Oracle.

All DMS consist of components that allow the data they store to be accessed simultaneously by many users while maintaining its integrity. A point common to all databases is that the structure of the data they contain is defined and stored separately from the data itself, in a database schema.
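
The separation of schema from data described above can be sketched with Python's built-in sqlite3 module; the table and rows below are invented for illustration.

import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database

# The schema is defined and stored separately from the data itself...
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

# ...while the data is inserted and modified independently of that structure.
conn.executemany(
    "INSERT INTO employees (name, dept) VALUES (?, ?)",
    [("Ada", "Engineering"), ("Grace", "Research")],
)
conn.commit()

# The stored schema can itself be inspected like any other data.
print(conn.execute("SELECT sql FROM sqlite_master WHERE name = 'employees'").fetchone()[0])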

In recent years, the extensible markup language (XML) has become a popular format for data representation. Although XML data can be stored in normal file systems, it is commonly held in relational databases to take advantage of their "robust implementation verified by years of both theoretical and practical effort". As an evolution of the Standard Generalized Markup Language (SGML), XML's text-based structure offers the advantage of being both machine and human-readable.
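
A minimal sketch of XML's dual readability, using the standard-library xml.etree.ElementTree parser; the document shown is made up for the example.

import xml.etree.ElementTree as ET

# A small, human-readable XML document (invented for illustration).
document = """<catalog>
  <book id="1"><title>Relational Databases</title></book>
  <book id="2"><title>Markup Languages</title></book>
</catalog>"""

# The same text is machine-readable: it parses into a tree of elements.
root = ET.fromstring(document)
for book in root.findall("book"):
    print(book.get("id"), book.findtext("title"))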

Data retrieval

The relational database model introduced a programming-language independent Structured Query Language (SQL), based on relational algebra.

The terms "data" and "information" are not synonymous. Anything stored is data, but it only becomes information when it is organized and presented meaningfully. Most of the world's digital data is unstructured, and stored in a variety of different physical formats even within a single organization. Data warehouses began to be developed in the 1980s to integrate these disparate stores. They typically contain data extracted from various sources, including external sources such as the Internet, organized in such a way as to facilitate decision support systems (DSS).

Data transmission

This is what a IBM card storage warehouse located in Alexandria, Virginia in 1959. This is where the government kept storage of punched cards.

Data transmission has three aspects: transmission, propagation, and reception. It can be broadly categorized as broadcasting, in which information is transmitted unidirectionally downstream, or telecommunications, with bidirectional upstream and downstream channels.

XML has been increasingly employed as a means of data interchange since the early 2000s, particularly for machine-oriented interactions such as those involved in web-oriented protocols such as SOAP, describing "data-in-transit rather than ... data-at-rest".
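
As a sketch of XML describing data-in-transit, the example below serializes a small record into an XML message of the kind a web-oriented protocol might carry; the element names are invented and no real SOAP library is involved.

import xml.etree.ElementTree as ET

# Invented message structure, loosely in the spirit of XML-based
# interchange formats such as SOAP; not a real protocol implementation.
message = ET.Element("message")
order = ET.SubElement(message, "order", attrib={"id": "42"})
ET.SubElement(order, "item").text = "keyboard"
ET.SubElement(order, "quantity").text = "3"

# The wire format is plain text, so it crosses machine boundaries easily.
print(ET.tostring(message, encoding="unicode"))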

Data manipulation

Hilbert and Lopez identify the exponential pace of technological change (a kind of Moore's law): machines' application-specific capacity to compute information per capita roughly doubled every 14 months between 1986 and 2007; the per capita capacity of the world's general-purpose computers doubled every 18 months during the same two decades; the global telecommunication capacity per capita doubled every 34 months; the world's storage capacity per capita required roughly 40 months to double (every 3 years); and per capita broadcast information has doubled every 12.3 years.
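
The doubling periods above can be restated as approximate annual growth rates using standard compound-growth arithmetic; the conversion below uses only the intervals quoted in the paragraph.

# Convert the doubling periods reported by Hilbert and Lopez into
# approximate annual growth rates: rate = 2 ** (12 / months_to_double) - 1.
doubling_months = {
    "application-specific computation (per capita)": 14,
    "general-purpose computation (per capita)": 18,
    "telecommunication capacity (per capita)": 34,
    "storage capacity (per capita)": 40,
    "broadcast information (per capita)": 12.3 * 12,
}

for quantity, months in doubling_months.items():
    annual_rate = 2 ** (12 / months) - 1
    print(f"{quantity:<48} ~{annual_rate:.0%} per year")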

Massive amounts of data are stored worldwide every day, but unless it can be analyzed and presented effectively it essentially resides in what have been called data tombs: "data archives that are seldom visited". To address that issue, the field of data mining – "the process of discovering interesting patterns and knowledge from large amounts of data" – emerged in the late 1980s.
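
As a toy illustration of pattern discovery, the sketch below counts which pairs of items co-occur most often in a set of invented transactions; real data mining systems use far more sophisticated algorithms and far larger data.

from collections import Counter
from itertools import combinations

# Invented transaction data; each row is the set of items in one purchase.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

# Count how often each pair of items co-occurs -- a very small-scale
# version of frequent-pattern discovery.
pair_counts = Counter(
    pair for basket in transactions for pair in combinations(sorted(basket), 2)
)
print(pair_counts.most_common(3))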

Perspectives

Academic perspective

In an academic context, the Association for Computing Machinery defines IT as "undergraduate degree programs that prepare students to meet the computer technology needs of business, government, healthcare, schools, and other kinds of organizations .... IT specialists assume responsibility for selecting hardware and software products appropriate for an organization, integrating those products with organizational needs and infrastructure, and installing, customizing, and maintaining those applications for the organization’s computer users."

Undergraduate degrees in IT (B.S., A.S.) are similar to other computer science degrees. In fact, they often have the same foundational-level courses. Computer science (CS) programs tend to focus more on theory and design, whereas information technology programs are structured to equip the graduate with expertise in the practical application of technology solutions to support modern business and user needs.

Commercial and employment perspective

Companies in the information technology field are often discussed as a group as the "tech sector" or the "tech industry". These titles can be misleading at times and should not be mistaken for “tech companies”, which are generally large-scale, for-profit corporations that sell consumer technology and software. It is also worth noting that, from a business perspective, information technology departments are a “cost center” the majority of the time. A cost center is a department or staff which incurs expenses, or “costs”, within a company rather than generating profits or revenue streams. Modern businesses rely heavily on technology for their day-to-day operations, so the expenses delegated to cover technology that facilitates business in a more efficient manner are usually seen as “just the cost of doing business”. IT departments are allocated funds by senior leadership and must attempt to achieve the desired deliverables while staying within that budget. Government and the private sector might have different funding mechanisms, but the principles are more or less the same. This is an often overlooked reason for the rapid interest in automation and artificial intelligence, as the constant pressure to do more with less is opening the door for automation to take control of at least some minor operations in large companies.

Many companies now have IT departments for managing the computers, networks, and other technical areas of their businesses. Companies have also sought to integrate IT with business outcomes and decision-making through a BizOps or business operations department.

In a business context, the Information Technology Association of America has defined information technology as "the study, design, development, application, implementation, support or management of computer-based information systems". The responsibilities of those working in the field include network administration, software development and installation, and the planning and management of an organization's technology life cycle, by which hardware and software are maintained, upgraded and replaced.

Information services

Information services is a term somewhat loosely applied to a variety of IT-related services offered by commercial companies, as well as data brokers.

Ethical perspectives

The field of information ethics was established by mathematician Norbert Wiener in the 1940s. Some of the ethical issues associated with the use of information technology include:

  • Breaches of copyright by those downloading files stored without the permission of the copyright holders
  • Employers monitoring their employees' emails and other Internet usage
  • Unsolicited emails
  • Hackers accessing online databases
  • Web sites installing cookies or spyware to monitor a user's online activities, which may be used by data brokers

 

High-level programming language

From Wikipedia, the free encyclopedia

In computer science, a high-level programming language is a programming language with strong abstraction from the details of the computer. In contrast to low-level programming languages, it may use natural language elements, be easier to use, or may automate (or even hide entirely) significant areas of computing systems (e.g. memory management), making the process of developing a program simpler and more understandable than when using a lower-level language. The amount of abstraction provided defines how "high-level" a programming language is.

In the 1960s, high-level programming languages using a compiler were commonly called autocodes. Examples of autocodes are COBOL and Fortran.

The first high-level programming language designed for computers was Plankalkül, created by Konrad Zuse. However, it was not implemented in his time, and his original contributions were largely isolated from other developments due to World War II, aside from the language's influence on the "Superplan" language by Heinz Rutishauser and also to some degree Algol. The first significantly widespread high-level language was Fortran, a machine-independent development of IBM's earlier Autocode systems. Algol, defined in 1958 and 1960 by committees of European and American computer scientists, introduced recursion as well as nested functions under lexical scope. It was also the first language with a clear distinction between value and name-parameters and their corresponding semantics. Algol also introduced several structured programming concepts, such as the while-do and if-then-else constructs and its syntax was the first to be described in formal notation – "Backus–Naur form" (BNF). During roughly the same period, Cobol introduced records (also called structs) and Lisp introduced a fully general lambda abstraction in a programming language for the first time.

Features

"High-level language" refers to the higher level of abstraction from machine language. Rather than dealing with registers, memory addresses, and call stacks, high-level languages deal with variables, arrays, objects, complex arithmetic or boolean expressions, subroutines and functions, loops, threads, locks, and other abstract computer science concepts, with a focus on usability over optimal program efficiency. Unlike low-level assembly languages, high-level languages have few, if any, language elements that translate directly into a machine's native opcodes. Other features, such as string handling routines, object-oriented language features, and file input/output, may also be present. One thing to note about high-level programming languages is that these languages allow the programmer to be detached and separated from the machine. That is, unlike low-level languages like assembly or machine language, high-level programming can amplify the programmer's instructions and trigger a lot of data movements in the background without their knowledge. The responsibility and power of executing instructions have been handed over to the machine from the programmer.

Abstraction penalty

High-level languages intend to provide features which standardize common tasks, permit rich debugging, and maintain architectural agnosticism, while low-level languages often produce more efficient code through optimization for a specific system architecture. Abstraction penalty is the cost that high-level programming techniques pay for being unable to optimize performance or use certain hardware because they don't take advantage of certain low-level architectural resources. High-level programming exhibits features like more generic data structures and operations, run-time interpretation, and intermediate code files, which often result in execution of far more operations than necessary, higher memory consumption, and larger binary program size. For this reason, code which needs to run particularly quickly and efficiently may require the use of a lower-level language, even if a higher-level language would make the coding easier. In many cases, critical portions of a program mostly written in a high-level language can be hand-coded in assembly language, leading to a much faster, more efficient, or simply more reliably functioning optimised program.
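
One rough way to observe an abstraction penalty in an interpreted high-level language is to time a generic interpreted loop against an equivalent operation that runs in optimized native code. The Python comparison below is only a sketch; exact numbers vary by machine and interpreter.

import timeit

numbers = list(range(100_000))

def interpreted_sum(values):
    # Generic, interpreted loop: each iteration pays interpreter overhead.
    total = 0
    for value in values:
        total += value
    return total

loop_time = timeit.timeit(lambda: interpreted_sum(numbers), number=100)
builtin_time = timeit.timeit(lambda: sum(numbers), number=100)  # runs in native code

print(f"interpreted loop: {loop_time:.3f}s, built-in sum: {builtin_time:.3f}s")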

However, with the growing complexity of modern microprocessor architectures, well-designed compilers for high-level languages frequently produce code comparable in efficiency to what most low-level programmers can produce by hand, and the higher abstraction may allow for more powerful techniques providing better overall results than their low-level counterparts in particular settings. High-level languages are designed independent of a specific computing system architecture. This facilitates executing a program written in such a language on any computing system with compatible support for the interpreted or JIT program. High-level languages can be improved as their designers develop improvements. In other cases, new high-level languages evolve from one or more others with the goal of aggregating the most popular constructs with new or improved features. An example of this is Scala, which maintains backward compatibility with Java, meaning that programs and libraries written in Java will continue to be usable even if a programming shop switches to Scala; this makes the transition easier and the lifespan of such high-level coding indefinite. In contrast, low-level programs rarely survive beyond the system architecture for which they were written without major revision. This is the engineering 'trade-off' for the 'abstraction penalty'.

Relative meaning

Examples of high-level programming languages in active use today include Python, Visual Basic, Delphi, Perl, PHP, ECMAScript, Ruby, C#, Java and many others.

The terms high-level and low-level are inherently relative. Some decades ago, the C language and similar languages were most often considered "high-level", as they supported concepts such as expression evaluation, parameterised recursive functions, and data types and structures, while assembly language was considered "low-level". Today, many programmers might refer to C as low-level, as it lacks a large runtime system (no garbage collection, etc.), basically supports only scalar operations, and provides direct memory addressing. It, therefore, readily blends with assembly language and the machine level of CPUs and microcontrollers.

Assembly language may itself be regarded as a higher level (but often still one-to-one if used without macros) representation of machine code, as it supports concepts such as constants and (limited) expressions, sometimes even variables, procedures, and data structures. Machine code, in its turn, is inherently at a slightly higher level than the microcode or micro-operations used internally in many processors.

Execution modes

There are three general modes of execution for modern high-level languages:

Interpreted
When code written in a language is interpreted, its syntax is read and then executed directly, with no compilation stage. A program called an interpreter reads each program statement, following the program flow, then decides what to do, and does it. A hybrid of an interpreter and a compiler will compile the statement into machine code and execute that; the machine code is then discarded, to be interpreted anew if the line is executed again. Interpreters are commonly the simplest implementations of the behavior of a language, compared to the other two variants listed here.
Compiled
When code written in a language is compiled, its syntax is transformed into an executable form before running. There are two types of compilation:
Machine code generation
Some compilers compile source code directly into machine code. This is the original mode of compilation, and languages that are directly and completely transformed to machine-native code in this way may be called truly compiled languages. See assembly language.
Intermediate representations
When code written in a language is compiled to an intermediate representation, that representation can be optimized or saved for later execution without the need to re-read the source file. When the intermediate representation is saved, it may be in a form such as bytecode. The intermediate representation must then be interpreted or further compiled to execute it. Virtual machines that execute bytecode directly or transform it further into machine code have blurred the once clear distinction between intermediate representations and truly compiled languages.
Source-to-source translated or transcompiled
Code written in a language may be translated into terms of a lower-level language for which native code compilers are already common. JavaScript and the language C are common targets for such translators. See CoffeeScript, Chicken Scheme, and Eiffel as examples. Specifically, the generated C and C++ code can be seen (as generated from the Eiffel language when using the EiffelStudio IDE) in the EIFGENs directory of any compiled Eiffel project. In Eiffel, the translation process is referred to as transcompiling or transcompiled, and the Eiffel compiler as a transcompiler or source-to-source compiler.

Note that languages are not strictly interpreted languages or compiled languages. Rather, implementations of language behavior use interpreting or compiling. For example, ALGOL 60 and Fortran have both been interpreted (even though they were more typically compiled). Similarly, Java shows the difficulty of trying to apply these labels to languages, rather than to implementations; Java is compiled to bytecode which is then executed by either interpreting (in a Java virtual machine (JVM)) or compiling (typically with a just-in-time compiler such as HotSpot, again in a JVM). Moreover, compiling, transcompiling, and interpreting are not strictly limited to only a description of the compiler artifact (binary executable or IL assembly).
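
Python itself illustrates the intermediate-representation mode described above: source text is compiled to bytecode, which a virtual machine then executes. The sketch below compiles a small snippet and disassembles the result with the standard dis module.

import dis

source = "total = sum(range(10))"

# Compile the source text to a code object containing bytecode -- an
# intermediate representation, not native machine code.
code_object = compile(source, "<example>", "exec")

# The bytecode is then executed by the Python virtual machine.
exec(code_object)

# Inspect the intermediate representation.
dis.dis(code_object)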

High-level language computer architecture

Alternatively, it is possible for a high-level language to be directly implemented by a computer – the computer directly executes the HLL code. This is known as a high-level language computer architecture – the computer architecture itself is designed to be targeted by a specific high-level language. The Burroughs large systems were target machines for ALGOL 60, for example.

Lie point symmetry

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Lie_point_symmetry     ...