
Sunday, September 27, 2020

Smart city

From Wikipedia, the free encyclopedia

A smart city is an urban area that uses different types of electronic methods and sensors to collect data. Insights gained from that data are used to manage assets, resources and services efficiently; in return, that data is used to improve the operations across the city. This includes data collected from citizens, devices, buildings and assets that is then processed and analyzed to monitor and manage traffic and transportation systems, power plants, utilities, water supply networks, waste, crime detection, information systems, schools, libraries, hospitals, and other community services.

The smart city concept integrates information and communication technology (ICT), and various physical devices connected to the IoT network to optimize the efficiency of city operations and services and connect to citizens. Smart city technology allows city officials to interact directly with both community and city infrastructure and to monitor what is happening in the city and how the city is evolving. ICT is used to enhance quality, performance and interactivity of urban services, to reduce costs and resource consumption and to increase contact between citizens and government. Smart city applications are developed to manage urban flows and allow for real-time responses. A smart city may therefore be more prepared to respond to challenges than one with a simple "transactional" relationship with its citizens. Yet, the term itself remains unclear in its specifics and is therefore open to many interpretations.

Terminology

Due to the breadth of technologies that have been implemented under the smart city label, it is difficult to distill a precise definition of a smart city. Deakin and Al Waer list four factors that contribute to the definition of a smart city:

  1. The application of a wide range of electronic and digital technologies to communities and cities.
  2. The use of ICT to transform life and working environments within the region.
  3. The embedding of such Information and Communications Technologies (ICTs) in government systems.
  4. The territorialisation of practices that brings ICTs and people together to enhance the innovation and knowledge that they offer.

Deakin defines the smart city as one that utilises ICT to meet the demands of the market (the citizens of the city), and notes that community involvement in the process is necessary for a smart city. A smart city would thus be a city that not only possesses ICT in particular areas, but has also implemented this technology in a manner that positively impacts the local community.

Alternative definitions include:

  • Giffinger et al. 2007: "Regional competitiveness, transport and Information and Communication Technologies economics, natural resources, human and social capital, quality of life, and participation of citizens in the governance of cities."
  • Smart Cities Council: "A smart city is one that has digital technology embedded across all city functions."
  • Caragliu and Nijkamp 2009: "A city can be defined as 'smart' when investments in human and social capital and traditional (transport) and modern (ICT) communication infrastructure fuel sustainable economic development and a high quality of life, with a wise management of natural resources, through participatory action and engagement."
  • Frost & Sullivan 2014: "We identified eight key aspects that define a smart city: smart governance, smart energy, smart building, smart mobility, smart infrastructure, smart technology, smart healthcare and smart citizen."
  • Institute of Electrical and Electronics Engineers (IEEE) Smart Cities: "A smart city brings together technology, government and society to enable the following characteristics: smart cities, a smart economy, smart mobility, a smart environment, smart people, smart living, smart governance."
  • Business Dictionary: "A developed urban area that creates sustainable economic development and high quality of life by excelling in multiple key areas; economy, mobility, environment, people, living, and government. Excelling in these key areas can be done so through strong human capital, social capital, and/or ICT infrastructure."
  • Indian Government 2014: "Smart city offers sustainability in terms of economic activities and employment opportunities to a wide section of its residents, regardless of their level of education, skills or income levels."
  • Department for Business, Innovation and Skills, UK 2013: "The concept is not static, there is no absolute definition of a smart city, no end point, but rather a process, or series of steps, by which cities become more 'liveable' and resilient and, hence, able to respond more quickly to new challenges."

Characteristics

It has been suggested that a smart city (also community, business cluster, urban agglomeration or region) uses information technologies to:

  1. Make more efficient use of physical infrastructure (roads, built environment and other physical assets) through artificial intelligence and data analytics in order to support strong and healthy economic, social and cultural development.
  2. Engage effectively with local people in local governance and decision-making through open innovation processes and e-participation, improving the collective intelligence of the city's institutions through e-governance, with emphasis placed on citizen participation and co-design.
  3. Learn, adapt and innovate and thereby respond more effectively and promptly to changing circumstances by improving the intelligence of the city.

They evolve towards a strong integration of all dimensions of human intelligence, collective intelligence, and also artificial intelligence within the city. The intelligence of cities "resides in the increasingly effective combination of digital telecommunication networks (the nerves), ubiquitously embedded intelligence (the brains), sensors and tags (the sensory organs), and software (the knowledge and cognitive competence)".

These forms of intelligence in smart cities have been demonstrated in three ways:

  1. Orchestration intelligence: Where cities establish institutions and community-based problem solving and collaborations, such as in Bletchley Park, where the Nazi Enigma cypher was decoded by a team led by Alan Turing. This has been referred to as the first example of a smart city or an intelligent community.
  2. Empowerment intelligence: Cities provide open platforms, experimental facilities and smart city infrastructure in order to cluster innovation in certain districts. These are seen in the Kista Science City in Stockholm and the Cyberport Zone in Hong Kong (the Cyberport 1 and Cyberport 2 buildings). Similar facilities have also been established in Melbourne, and a hub which develops public projects has been created in Kyiv.
  3. Instrumentation intelligence: Where city infrastructure is made smart through real-time data collection, with analysis and predictive modelling across city districts. There is much controversy surrounding this, particularly with regard to surveillance issues in smart cities. Examples of instrumentation intelligence have been implemented in Amsterdam (a minimal sketch follows this list). This is implemented through:
    1. A common IP infrastructure that is open to researchers to develop applications.
    2. Wireless meters and devices that transmit information in real time.
    3. A number of homes provided with smart energy meters to make residents aware of their energy consumption and encourage them to reduce it.
    4. Solar-powered garbage compactors, car recharging stations and energy-saving lamps.
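
The data flow behind such instrumentation can be illustrated with a short sketch. The following Python example aggregates smart-meter readings and flags households whose daily consumption runs well above their recent average; the meter IDs, window length and alert threshold are invented for illustration, not taken from Amsterdam's actual system:

from collections import defaultdict, deque
from statistics import mean

WINDOW_DAYS = 7      # length of the rolling baseline
ALERT_RATIO = 1.5    # flag days more than 50% above the baseline

history = defaultdict(lambda: deque(maxlen=WINDOW_DAYS))

def record_daily_usage(meter_id, kwh):
    """Store one day's consumption and return an alert string if anomalous."""
    baseline = history[meter_id]
    alert = None
    if len(baseline) == WINDOW_DAYS and kwh > ALERT_RATIO * mean(baseline):
        alert = f"Meter {meter_id}: {kwh:.1f} kWh exceeds rolling average {mean(baseline):.1f} kWh"
    baseline.append(kwh)
    return alert

# Hypothetical feed of (meter_id, kWh) readings from wireless meters.
for meter_id, kwh in [("A1", 8.2), ("A1", 7.9), ("A1", 8.4), ("A1", 8.0),
                      ("A1", 7.7), ("A1", 8.1), ("A1", 8.3), ("A1", 14.9)]:
    msg = record_daily_usage(meter_id, kwh)
    if msg:
        print(msg)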

Some major fields of intelligent city activation are:

Innovation economy | Urban infrastructure | Governance
Innovation in industries, clusters, districts of a city | Transport | Administration services to the citizen
Knowledge workforce: Education and employment | Energy / Utilities | Participatory and direct democracy
Creation of knowledge-intensive companies | Protection of the environment / Safety | Services to the citizen: Quality of life

According to David K. Owens, the former executive vice president of the Edison Electric Institute, two key elements that a smart city must have are an integrated communications platform and a "dynamic resilient grid." Both are large investments.

Frameworks

The creation, integration, and adoption of smart city capabilities requires a unique set of frameworks to realize the focus areas of opportunity and innovation central to smart city projects. The frameworks can be divided into 5 main dimensions which include numerous related categories of smart city development:

Technology framework

A smart city relies heavily on the deployment of technology. Different combinations of technological infrastructure interact to form the array of smart city technologies with varying levels of interaction between human and technological systems.

  • Digital: A service oriented infrastructure is required to connect individuals and devices in a smart city. These include innovation services and communication infrastructure. Yovanof, G. S. & Hazapis, G. N. define a digital city as "a connected community that combines broadband communications infrastructure; a flexible, service-oriented computing infrastructure based on open industry standards; and, innovative services to meet the needs of governments and their employees, citizens and businesses."
  • Intelligent: Cognitive technologies, such as artificial intelligence and machine learning, can be trained on the data generated by connected city devices to identify patterns. The efficacy and impact of particular policy decisions can be quantified by cognitive systems studying the continuous interactions of humans with their urban surroundings; a minimal sketch of such pattern-finding follows this list.
  • Ubiquitous: A ubiquitous city provides access to public services through any connected device. The U-city extends the digital city concept by making its infrastructure accessible from everywhere.
  • Wired: The physical components of IT systems are crucial to early-stage smart city development. Wired infrastructure is required to support the IoT and wireless technologies central to more interconnected living. A wired city environment provides general access to continually updated digital and physical infrastructure. The latest in telecommunications, robotics, IoT, and various connected technologies can then be deployed to support human capital and productivity.
  • Hybrid: A hybrid city is the combination of a physical conurbation and a virtual city related to the physical space. This relationship can be one of virtual design or the presence of a critical mass of virtual community participants in a physical urban space. Hybrid spaces can serve to actualize future-state projects for smart city services and integration.
  • Information city: The multiplicity of interactive devices in a smart city generates a large quantity of data. How that information is interpreted and stored is critical to Smart city growth and security.
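
As a concrete illustration of the "Intelligent" layer above, the following sketch clusters hourly traffic counts from two road-side sensors into off-peak and peak regimes. The counts, the sensor layout and the choice of k-means are illustrative assumptions, not a reference implementation of any city's system:

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical hourly vehicle counts; each row is one hour,
# columns are two road-side sensors.
hourly_counts = np.array([
    [120, 90], [135, 110], [640, 580], [700, 620],
    [150, 95], [660, 600], [130, 100], [690, 640],
])

# Cluster the hours into two traffic regimes.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(hourly_counts)

# Decide which cluster represents congestion by its mean count.
peak = int(hourly_counts[labels == 0].mean() < hourly_counts[labels == 1].mean())
for hour, label in enumerate(labels):
    print(f"hour {hour}: {'peak' if label == peak else 'off-peak'} traffic")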

Human framework

Smart city initiatives have measurable positive impacts on the quality of life of a city's citizens and visitors. The human framework of a smart city - its economy, knowledge networks, and human support systems - is an important indicator of its success.

  • Creativity: Arts and culture initiatives are common focus areas in smart city planning. Innovation is associated with intellectual curiosity and creativeness, and various projects have demonstrated that knowledge workers participate in a diverse mix of cultural and artistic activities.
  • Learning: Since mobility is a key area of Smart city development, building a capable workforce through education initiatives is necessary. A city's learning capacity includes its education system, including available workforce training and support, and its cultural development and exchange.
  • Humanity: Numerous Smart city programs focus on soft infrastructure development, like increasing access to voluntary organizations and designated safe zones. This focus on social and relational capital means diversity, inclusion, and ubiquitous access to public services are worked into city planning.
  • Knowledge: The development of a knowledge economy is central to Smart city projects. Smart cities seeking to be hubs of economic activity in emerging tech and service sectors stress the value of innovation in city development.

Institutional framework

According to Moser, M. A., the smart communities movement took shape in the 1990s as a strategy to broaden the base of users involved in IT. Members of these communities are people who share an interest and work in partnership with government and other institutional organizations to push the use of IT to improve daily life in the face of worsening everyday problems. Eger, J. M. said that a smart community makes a conscious and agreed-upon decision to deploy technology as a catalyst for solving its social and business needs. This use of IT, and the improvements it brings, would be far more difficult without institutional help; indeed, institutional involvement is essential to the success of smart community initiatives. Moser, M. A. further explained that "building and planning a smart community seeks for smart growth"; smart growth is essential for the partnership between citizens and institutional organizations to react to worsening trends in daily issues like traffic congestion, school overcrowding and air pollution. However, technological propagation is not an end in itself, but only a means to reinventing cities for a new economy and society. To sum up, smart city initiatives require government support to succeed.

The importance of the technology, human and institutional dimensions is that only a link among them makes the development of a real smart city concept possible. According to the definition of smart city given by Caragliu, A., Del Bo, C., & Nijkamp, P., a city is smart when investments in human/social capital and IT infrastructure fuel sustainable growth and enhance quality of life, through participatory governance.

Energy framework

Smart cities use data and technology to create efficiencies, improve sustainability, create economic development, and enhance quality of life factors for people living and working in the city. It also means that the city has a smarter energy infrastructure. More formally, a smart city is: "... an urban area that has securely integrated technology across the information ... and Internet of Things (IoT) sectors to better manage a city's assets."

A smart city is powered by "smart connections" for various items such as street lighting, smart buildings, distributed energy resources (DER), data analytics, and smart transportation. Amongst these things, energy is paramount; this is why utility companies play a key role in smart cities. Electric companies, working in partnership with city officials, technology companies and a number of other institutions, are among the major players that have helped accelerate the growth of America's smart cities.

Data management framework

Smart cities employ a combination of data collection, processing, and dissemination technologies, in conjunction with networking and computing technologies and data security and privacy measures, to encourage innovation and promote the overall quality of life for citizens across dimensions that include utilities, health, transportation, entertainment and government services.
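
A toy version of such a collect-process-disseminate pipeline, with the privacy step reduced to stripping a citizen identifier, might look like the following (stage names and record fields are illustrative only):

import queue

raw = queue.Queue()

def collect(reading):
    """Collection stage: accept a raw reading from a sensor or service."""
    raw.put(reading)

def process_and_disseminate():
    """Processing stage: drop direct identifiers, then publish."""
    while not raw.empty():
        reading = raw.get()
        reading.pop("citizen_id", None)  # crude privacy measure for the sketch
        publish(reading)

def publish(reading):
    """Dissemination stage: print here; a real system would push to an open-data API."""
    print("published:", reading)

collect({"citizen_id": "u123", "domain": "transport", "bus_delay_min": 4})
collect({"citizen_id": "u456", "domain": "utilities", "water_use_l": 180})
process_and_disseminate()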

Roadmap

A smart city roadmap consists of four major components, the first being a preliminary check:

  1. Define the community precisely: this definition can condition what is done in the subsequent steps. It relates to geography, the links between city and countryside, and the flows of people between them; in some countries, the officially stated definition of a city or community may not correspond to what actually happens in real life.
  2. Study the community: Before deciding to build a smart city, we first need to know why, which can be done by determining the benefits of such an initiative. Study the community to know the citizens' and businesses' needs and the community's unique attributes, such as the age of the citizens, their education, hobbies, and the attractions of the city.
  3. Develop a smart city policy: Develop a policy to drive the initiatives, in which roles, responsibilities, objectives, and goals can be defined, and create plans and strategies on how the goals will be achieved.
  4. Engage The Citizens: This can be done by engaging the citizens through the use of e-government initiatives, open data, sport events, etc.

In short, People, Processes, and Technology (PPT) are the three principles of the success of a smart city initiative. Cities must study their citizens and communities, understand the processes and business drivers, and create policies and objectives to meet the citizens' needs. Then, technology can be implemented to meet those needs, improving quality of life and creating real economic opportunities. This requires a holistic, customized approach that accounts for city cultures, long-term city planning, and local regulations.

"Whether to improve security, resiliency, sustainability, traffic congestion, public safety, or city services, each community may have different reasons for wanting to be smart. But all smart communities share common attributes—and they all are powered by smart connections and by our industry's smarter energy infrastructure. A smart grid is the foundational piece in building a smart community." – Pat Vincent-Collawn, chairman of the Edison Electric Institute and president and CEO of PNM Resources.

Policies

ASEAN Smart Cities Network (ASCN) is a collaborative platform which aims to synergise Smart city development efforts across ASEAN by facilitating cooperation on smart city development, catalysing bankable projects with the private sector, and securing funding and support from ASEAN's external partners.

The European Union (EU) has devoted constant efforts to devising a strategy for achieving 'smart' urban growth for its metropolitan city-regions. The EU has developed a range of programmes under "Europe's Digital Agenda". In 2010, it highlighted its focus on strengthening innovation and investment in ICT services for the purpose of improving public services and quality of life. Arup estimates that the global market for smart urban services will be $400 billion per annum by 2020.

The Smart Cities Mission is a retrofitting and urban renewal program being spearheaded by the Ministry of Urban Development, Government of India. The Government of India has the ambitious vision of developing 100 cities by modernizing existing mid-sized cities.

Technologies

Smart grids are an important technology in smart cities. The improved flexibility of the smart grid permits greater penetration of highly variable renewable energy sources such as solar power and wind power.

Mobile devices (such as smartphones and tablets) are another key technology allowing citizens to connect to smart city services.

Smart cities also rely on smart homes and specifically, the technology used in them.

Bicycle-sharing systems are an important element in smart cities.

Smart mobility is also important to smart cities.

Intelligent transportation systems and CCTV systems are also being developed.

Some smart cities also have digital libraries.

Online collaborative sensor data management platforms are online database services that allow sensor owners to register and connect their devices to feed data into an online database for storage, and allow developers to connect to the database and build their own applications based on that data.
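
Such platforms typically expose an HTTP API for device owners. The sketch below posts one reading to a hypothetical endpoint; the URL, token and JSON fields are invented for illustration, and a real platform's documentation defines the actual API:

import json
import urllib.request

# Placeholder endpoint and device token -- not a real service.
ENDPOINT = "https://sensors.example.org/api/v1/readings"
DEVICE_TOKEN = "replace-with-your-device-token"

def post_reading(sensor_id, value, unit):
    """Send one sensor reading to the (hypothetical) platform."""
    body = json.dumps({"sensor_id": sensor_id, "value": value, "unit": unit}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {DEVICE_TOKEN}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# post_reading("no2-station-7", 41.2, "ug/m3")  # uncomment against a real endpoint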

Additional supporting technologies include telecommuting, telehealth, blockchain, fintech, and online banking.

Electronic cards (known as smart cards) are another common component in smart city contexts. These cards possess a unique encrypted identifier that allows the owner to log into a range of government provided services (or e-services) without setting up multiple accounts. The single identifier allows governments to aggregate data about citizens and their preferences to improve the provision of services and to determine common interests of groups. This technology has been implemented in Southampton.
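
One simple way to realize such a unique, tamper-evident identifier is to sign the card's serial number so that any participating e-service can verify it against a shared secret without maintaining separate accounts. The HMAC-based sketch below illustrates the idea only; it is not the scheme Southampton or any other city actually uses:

import hmac
import hashlib

SECRET = b"city-registry-secret"  # held by the issuing authority (illustrative)

def issue_card_id(serial: str) -> str:
    """Derive the identifier encoded in the card."""
    tag = hmac.new(SECRET, serial.encode(), hashlib.sha256).hexdigest()[:16]
    return f"{serial}.{tag}"

def verify_card_id(card_id: str) -> bool:
    """Any e-service sharing the secret can check a card offline."""
    serial, _, tag = card_id.partition(".")
    expected = hmac.new(SECRET, serial.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(tag, expected)

cid = issue_card_id("CARD-000042")
print(cid, verify_card_id(cid))                         # ... True
print(verify_card_id("CARD-000042.0000000000000000"))   # False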

Retractable bollards allow cities to restrict access to city centres (e.g. to delivery trucks resupplying outlet stores). Opening and closing of such barriers is traditionally done manually or through an electronic pass, but it can even be done by means of ANPR cameras connected to the bollard system.
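
The control logic coupling an ANPR camera to a bollard can be as simple as a whitelist lookup with delivery-time windows, as in this sketch (plates, hours and function names are invented for illustration):

from datetime import datetime, time

# Hypothetical whitelist: plate -> (earliest, latest) access window.
DELIVERY_WINDOW = {"AB12CDE": (time(6, 0), time(10, 30)),
                   "XY99ZZZ": (time(6, 0), time(10, 30))}

def should_lower_bollard(plate: str, now: datetime) -> bool:
    """Lower the bollard only for whitelisted plates inside their window."""
    window = DELIVERY_WINDOW.get(plate)
    return window is not None and window[0] <= now.time() <= window[1]

print(should_lower_bollard("AB12CDE", datetime(2020, 9, 27, 8, 15)))  # True
print(should_lower_bollard("AB12CDE", datetime(2020, 9, 27, 14, 0)))  # False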

Cost-benefit analysis of smart city technologies

Cost-benefit analysis has been done into smart cities and the individual technologies. These can help to assess whether it is economically and ecologically beneficial to implement certain technologies at all, and to compare the cost-effectiveness of the technologies with one another.

Commercialisation

Large IT, telecommunication and energy management companies such as Baidu, Alibaba, Tencent, Huawei, Google, Microsoft, Cisco, IBM, and Schneider Electric launched market initiatives for intelligent cities.

  • Baidu is working on Apollo, a self-driving technology
  • Alibaba has created the City Brain
  • Tencent is working on medical technology, such as WeChat Intelligent Healthcare, Tencent Doctorwork, and AI Medical Innovation System (AIMIS)
  • Huawei has its Safe City Compact Solution which focuses on improving safety in cities
  • Google's subsidiary Sidewalk Labs is focusing on smart cities
  • Microsoft has CityNext
  • Cisco launched the global "Intelligent Urbanization" initiative to help cities use the network as the fourth utility for integrated city management, better quality of life for citizens, and economic development.
  • IBM announced its Smarter Cities Challenge to stimulate economic growth and quality of life in cities and metropolitan areas with the activation of new approaches of thinking and acting in the urban ecosystem.
  • Schneider Electric is working on EcoStruxure
  • Sensor developers and startup companies are also continually developing new smart city applications.

Adoption

Examples of smart city technologies and programs have been implemented in Singapore, India, Dubai, Milton Keynes, Southampton, Amsterdam, Barcelona, Madrid, Stockholm, Copenhagen, China, and New York.

Major strategies and achievements related to the spatial intelligence of cities are listed in the Intelligent Community Forum awards from 1999 to 2010, in the cities of Songdo and Suwon (South Korea), Stockholm (Sweden), Gangnam District of Seoul (South Korea), Waterloo, Ontario (Canada), Taipei (Taiwan), Mitaka (Japan), Glasgow (Scotland, UK), Calgary (Alberta, Canada), Seoul (South Korea), New York City (US), LaGrange, Georgia (US), and Singapore, which were recognized for their efforts in developing broadband networks and e-services sustaining innovation ecosystems, growth, and inclusion. There are a number of cities actively pursuing a smart city strategy:

Amsterdam

Street lamps in Amsterdam have been upgraded to allow municipal councils to dim the lights based on pedestrian usage.

The Amsterdam smart city initiative, which began in 2009, currently includes 170+ projects collaboratively developed by local residents, government and businesses. These projects run on an interconnected platform through wireless devices to enhance the city's real-time decision making abilities. The City of Amsterdam (City) claims the purpose of the projects is to reduce traffic, save energy and improve public safety. To promote efforts from local residents, the City runs the Amsterdam Smart City Challenge annually, accepting proposals for applications and developments that fit within the City's framework. An example of a resident developed app is Mobypark, which allows owners of parking spaces to rent them out to people for a fee. The data generated from this app can then be used by the City to determine parking demand and traffic flows in Amsterdam. A number of homes have also been provided with smart energy meters, with incentives provided to those that actively reduce energy consumption. Other initiatives include flexible street lighting (smart lighting) which allows municipalities to control the brightness of street lights, and smart traffic management where traffic is monitored in real time by the City and information about current travel time on certain roads is broadcast to allow motorists to determine the best routes to take.

Barcelona

Barcelona has established a number of projects that can be considered 'smart city' applications within its "CityOS" strategy. For example, sensor technology has been implemented in the irrigation system in Parc del Centre de Poblenou, where real time data is transmitted to gardening crews about the level of water required for the plants. Barcelona has also designed a new bus network based on data analysis of the most common traffic flows in Barcelona, utilising primarily vertical, horizontal and diagonal routes with a number of interchanges. Integration of multiple smart city technologies can be seen through the implementation of smart traffic lights as buses run on routes designed to optimise the number of green lights. In addition, where an emergency is reported in Barcelona, the approximate route of the emergency vehicle is entered into the traffic light system, setting all the lights to green as the vehicle approaches through a mix of GPS and traffic management software, allowing emergency services to reach the incident without delay. Much of this data is managed by the Sentilo Platform.

Columbus, Ohio

In the summer of 2017, the City of Columbus, Ohio began its pursuit of a smart city initiative. The city partnered with American Electric Power Ohio to create a group of new electric vehicle charging stations. Many smart cities such as Columbus are using agreements such as this one to prepare for climate change, expand electric infrastructure, convert existing public vehicle fleets to electric cars, and create incentives for people to share rides when commuting. To support this, the U.S. Department of Transportation gave the City of Columbus a $40 million grant. The city also received $10 million from Vulcan Inc.

One key reason why the utility was involved in picking locations for new electric vehicle charging stations was to gather data. According to Daily Energy Insider, AEP's Infrastructure and Business Continuity group said, "You don't want to put infrastructure where it won't be used or maintained. The data we collect will help us build a much bigger market in the future."

Because autonomous vehicles are currently seeing "an increased industrial research and legislative push globally", building routes and connections for them is another important part of the Columbus smart city initiative.

Copenhagen

In 2014, Copenhagen claimed the prestigious World Smart Cities Award for its “Connecting Copenhagen” smart city development strategy. Positioned in the Technical and Environmental Administration of Copenhagen, the smart city initiatives are coordinated by Copenhagen Solutions Lab, the city's administrative unit for smart city development. There are other notable actors in Greater Copenhagen that coordinate and initiate smart city initiatives, including State of Green and Gate21, the latter of which has initiated the innovation hub Smart City Cluster Denmark.

In an article with The Economist, a current major smart city project is explained: “In Copenhagen, as in many cities around the world, air quality is high on the agenda when it comes to liveability, with 68 percent of citizens citing it as of high importance when it comes to what makes their city attractive. To monitor pollution levels, Copenhagen Solutions Lab is currently working with Google and has installed monitoring equipment in their streetview car in order to produce a heatmap of air quality around the city. The information will help cyclists and joggers plan routes with the best air quality. The project also gives a glimpse of the future, when this kind of information could be collected in real time by sensors all over the city and collated with traffic flow data.”

In another article with The World Economic Forum, Marius Sylvestersen, Program Director at Copenhagen Solutions Lab, explains that public-private collaborations must be built on transparency, the willingness to share data and must be driven by the same set of values. This requires a particularly open mindset from the organisations that wish to get involved. To facilitate open collaboration and knowledge-sharing, Copenhagen Solutions Lab launched the Copenhagen Street Lab in 2016. Here, organisations such as TDC, Citelum and Cisco work in collaboration with Copenhagen Solutions Lab to identify new solutions to city and citizen problems.

Dubai

In 2013, the Smart Dubai project was initiated by Shaikh Mohammad bin Rashid Al Maktoum, Vice President of the UAE, which contained more than 100 initiatives to make Dubai a smart city by 2030. The project aimed to integrate private and public sectors, enabling citizens to access these sectors through their smartphones. Some initiatives include the Dubai Autonomous Transportation Strategy to create driverless transits, fully digitizing government, business and customer information and transactions, and providing citizens 5000 hotspots to access government applications by 2021. Two mobile applications, mPay and DubaiNow, facilitate various payment services for citizens ranging from utilities or traffic fines to educational, health, transport, and business services. In addition, the Smart Nol Card is a unified rechargeable card enabling citizens to pay for all transportation services such as metro, buses, water bus, and taxis. There is also the Dubai Municipality's Digital City initiative which assigns each building a unique QR code that citizens can scan containing information about the building, plot, and location.

Dublin

Dublin finds itself as an unexpected capital for smart cities. The smart city programme for the city is run by Smart Dublin, an initiative of the four Dublin local authorities to engage with smart technology providers, researchers and citizens to solve city challenges and improve city life. It includes Dublinked, Dublin's open data platform that hosts open-source data for smart city applications.

Kyiv

Kyiv has a transport dispatch system. It contains GPS trackers installed on public transportation, as well as 6,000 video surveillance cameras which monitor traffic. The accrued data is used by the local Traffic Management Service and transport application developers.

London

In London, a traffic management system known as SCOOT optimises green light time at traffic intersections by feeding back magnetometer and inductive loop data to a supercomputer, which can co-ordinate traffic lights across the city to improve traffic flow.
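
The core idea, giving longer green time to the approach with more detected demand, can be caricatured in a few lines. Real SCOOT models queues and coordinates whole regions of intersections; this standalone sketch merely apportions a fixed cycle between two approaches by their loop-detector counts:

def split_green_time(counts_ns, counts_ew, cycle_s=90, min_green_s=15):
    """Apportion one signal cycle between north-south and east-west demand."""
    usable = cycle_s - 2 * min_green_s    # time left after minimum greens
    total = (counts_ns + counts_ew) or 1  # avoid division by zero
    green_ns = min_green_s + usable * counts_ns / total
    green_ew = cycle_s - green_ns
    return round(green_ns), round(green_ew)

# Hypothetical 5-minute vehicle counts from inductive loops.
print(split_green_time(counts_ns=84, counts_ew=31))  # (59, 31)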

Madrid

Madrid, Spain's pioneering smart city, has adopted the MiNT Madrid Inteligente/Smarter Madrid platform to integrate the management of local services. These include the sustainable and computerized management of infrastructure, garbage collection and recycling, and public spaces and green areas, among others. The programme is run in partnership with IBM's INSA, making use of the latter's Big Data and analytics capabilities and experience. Madrid is considered to have taken a bottom-up approach to smart cities, whereby social issues are first identified and individual technologies or networks are then identified to address them. This approach includes support and recognition for start-ups through the Madrid Digital Start Up programme.

Malta

A document written in 2011 refers to 18th century Żejtun as the earliest "smart city" in Malta, but not in the modern context of a smart city. By the 21st century, SmartCity Malta, a planned technology park, became partially operational while the rest is under construction, as a Foreign Direct Investment.

Manchester

In December 2015, Manchester's CityVerve project was chosen as the winner of a government-led technology competition and awarded £10m to develop an Internet of Things (IoT) smart cities demonstrator.

Established in July 2016, the project is being carried out by a consortium of 22 public and private organisations, including Manchester City Council, and is aligned with the city's on-going devolution commitment.

The project has a two-year remit to demonstrate the capability of IoT applications and address barriers to deploying smart cities, such as city governance, network security, user trust and adoption, interoperability, scalability and justifying investment.

CityVerve is based on an open data principle that incorporates a "platform of platforms" which ties together applications for its four key themes: transport and travel; health and social care; energy and the environment; culture and the public realm. This will also ensure that the project is scalable and able to be redeployed to other locations worldwide.

Milan

Milan, Italy was prompted to begin its smart city strategies and initiatives by the European Union's Smart Cities and Communities initiative. However, unlike many European cities, Milan's smart city strategies focus more on social sustainability than environmental sustainability. This focus is almost exclusive to Milan and has a major influence on the content of its strategies and the way they are implemented, as shown in the case study of the Bicocca District in Milan.

Milton Keynes

Milton Keynes has a commitment to making itself a smart city. Currently the mechanism through which this is approached is the MK:Smart initiative, a collaboration of local government, businesses, academia and 3rd sector organisations. The focus of the initiative is on making energy use, water use and transport more sustainable whilst promoting economic growth in the city. Central to the project is the creation of a state-of-the-art 'MK Data Hub' which will support the acquisition and management of vast amounts of data relevant to city systems from a variety of data sources. These will include data about energy and water consumption, transport data, data acquired through satellite technology, social and economic datasets, and crowdsourced data from social media or specialised apps.

The MK:Smart initiative has two aspects which extend our understanding of how smart cities should operate. The first, Our MK, is a scheme for promoting citizen-led sustainability initiatives in the city, providing funding and support to engage with citizens and help turn their ideas around sustainability into reality. The second aspect is providing citizens with the skills to operate effectively in a smart city. The Urban Data School is an online platform that teaches school students data skills, and the project has also produced a MOOC to inform citizens about what a smart city is.

Moscow

Moscow has been implementing smart solutions since 2011, starting with the main infrastructure and local networks. Over the past few years the Moscow Government has implemented a number of programmes contributing to its IT development. The Information City programme was launched and implemented from 2012 to 2018; its initial purpose was to make daily life for citizens safe and comfortable through the large-scale introduction of information and communication technologies. In the summer of 2018, Moscow Mayor Sergey Sobyanin announced the city's smart city project, aimed at applying modern technologies in all areas of city life, and in June 2018 the global management consultancy McKinsey named Moscow one of the world's top 50 cities for smart technologies. Smart city technologies have been deployed in healthcare, education, transport and municipal services. The initiative aims to improve quality of life, make urban government more efficient and develop an information society. There are more than 300 digital initiatives within the smart city project, with electronic services now widely provided online and through multifunctional centres. Moscow's citywide Wi-Fi project was launched in 2012 and now provides more than 16,000 Wi-Fi internet access points. Moscow is actively developing eco-friendly transport using electric buses, and autonomous cars will soon be tested on the city's streets. Other initiatives include Moscow's Electronic School programme, its blockchain-based Active Citizen project and smart traffic management.

New York

New York is developing a number of smart city initiatives. An example is the series of city service kiosks in the LinkNYC network. These provide services including free WiFi, phone calls, device charging stations, local wayfinding, and more, funded by advertising that plays on the kiosk's screens.

San Leandro

The city of San Leandro, California is in the midst of transforming from an industrial center to a tech hub of the Internet of things (IoT) (technology that lets devices communicate with each other over the Internet). California's utility company PG&E is working with the city in this endeavor and on a smart energy pilot program that would develop a distributed energy network across the city that would be monitored by IoT sensors. The goal would be to give the city an energy system that has enough capacity to receive and redistribute electricity to and from multiple energy sources.

Santa Cruz

An alternative use of smart city technology can be found in Santa Cruz, California, where local authorities analyse historical crime data in order to predict police requirements and maximise police presence where it is required. The analytical tools generate a list of 10 places each day where property crimes are most likely to occur, and police efforts are then focused on these regions when officers are not responding to emergencies. This use of ICT is different from the manner in which European cities utilise smart city technology, possibly highlighting the breadth of the smart city concept in different parts of the world.
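
Stripped of the statistical modelling, the daily output described above is a ranking of areas by predicted risk. The sketch below ranks grid cells by a recency-weighted count of past incidents; the half-life weighting and the data are illustrative, not Santa Cruz's actual model:

from collections import Counter

def top_hotspots(incidents, today, n=10, half_life_days=30.0):
    """Rank grid cells by exponentially decayed incident counts."""
    score = Counter()
    for cell, day in incidents:  # (grid cell id, day number) records
        age = today - day
        score[cell] += 0.5 ** (age / half_life_days)
    return [cell for cell, _ in score.most_common(n)]

# Hypothetical (cell, day) records of past property crimes.
history = [("C3", 350), ("C3", 358), ("C7", 340), ("C3", 359), ("C7", 200)]
print(top_hotspots(history, today=360, n=2))  # ['C3', 'C7']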

Santander

The city of Santander in Cantabria, northern Spain, has 20,000 sensors connecting buildings, infrastructure, transport, networks and utilities, and offers a physical space for experimentation and validation of IoT functions, such as interaction and management protocols, device technologies, and support services such as discovery, identity management and security. In Santander, the sensors monitor the levels of pollution, noise, traffic and parking.

Shanghai

Shanghai's development of the IoT and internet connection speeds has allowed third-party companies to revolutionize the productivity of the city. As mobile ride-share giant DiDi Chuxing continuously adds user protection features such as ride recording and a new quick-response safety center, Shanghai is furthering its smart city agenda. During the first China International Import Expo, Shanghai focused on smart mobility and implemented sensors that accept smartphone traffic cards in all metro stations and on buses to increase efficiency in the city.

Singapore

Singapore, a city-state, has embarked on transforming towards a "Smart Nation", and endeavours to harness the power of networks, data and info-comm technologies to improve living, create economic opportunities and build closer communities.

Stockholm


Stockholm's smart city technology is underpinned by the Stokab dark fibre system which was developed in 1994 to provide a universal fibre optic network across Stockholm. Private companies are able to lease fibre as service providers on equal terms. The company is owned by the City of Stockholm itself. Within this framework, Stockholm has created a Green IT strategy. The Green IT program seeks to reduce the environmental impact of Stockholm through IT functions such as energy efficient buildings (minimising heating costs), traffic monitoring (minimising the time spent on the road) and development of e-services (minimising paper usage). The e-Stockholm platform is centred on the provision of e-services, including political announcements, parking space booking and snow clearance. This is further being developed through GPS analytics, allowing residents to plan their route through the city. An example of district-specific smart city technology can be found in the Kista Science City region. This region is based on the triple helix concept of smart cities, where university, industry and government work together to develop ICT applications for implementation in a smart city strategy.

Taipei


Taipei started the "smarttaipei" project in 2016, whose central concept is to change the culture of city hall government so that it can adopt new ideas and new concepts, a bottom-up mechanism. The Taipei City government established the “Taipei Smart City Project Management Office”, also known as the "PMO", to implement and govern the development of the smart city, and thereafter built an innovation matchmaking platform to combine industry and government resources to develop smart solutions that satisfy public demands.

The PMO accepts proposals from industry and helps negotiate with the relevant departments of Taipei City to initiate new proof-of-concept (PoC) projects, with the help of a matchmaking platform which allows citizens to access the necessary innovative technologies. More than 150 PoC projects have been established, of which only 34% have finished.

Research

University research labs developed prototypes for intelligent cities.

Criticism

The criticisms of smart cities revolve around:

  • A bias in strategic interest may lead to ignoring alternative avenues of promising urban development.
  • A smart city, as a scientifically planned city, would defy the fact that real development in cities is often haphazard. In that line of criticism, the smart city is seen as unattractive for citizens, as it "can deaden and stupefy the people who live in its all-efficient embrace". Instead, people would prefer cities they can participate in shaping.
  • The focus of the concept of smart city may lead to an underestimation of the possible negative effects of the development of the new technological and networked infrastructures needed for a city to be smart.
  • As a globalized business model is based on capital mobility, following a business-oriented model may result in a losing long-term strategy: "The 'spatial fix' inevitably means that mobile capital can often 'write its own deals' to come to town, only to move on when it receives a better deal elsewhere. This is no less true for the smart city than it was for the industrial, [or] manufacturing city."
  • The high level of big data collection and analytics has raised questions regarding surveillance in smart cities, particularly as it relates to predictive policing.
  • As of August 2018, the discussion on smart cities centres around the usage and implementation of technology rather than on the inhabitants of the cities and how they can be involved in the process.
  • Especially in low-income countries, smart cities are irrelevant to the majority of the urban population, which lives in poverty with limited access to basic services. A focus on smart cities may worsen inequality and marginalization.
  • If a Smart city strategy is not planned taking into account people with accessibility problems, such as persons with disabilities affecting mobility, vision, hearing, and cognitive function, the implementation of new technologies could create new barriers.

 

Cloud computing

From Wikipedia, the free encyclopedia
 
Cloud computing metaphor: the group of networked elements providing services need not be individually addressed or managed by users; instead, the entire provider-managed suite of hardware and software can be thought of as an amorphous cloud.

Cloud computing is the on-demand availability of computer system resources, especially data storage (cloud storage) and computing power, without direct active management by the user. The term is generally used to describe data centers available to many users over the Internet. Large clouds, predominant today, often have functions distributed over multiple locations, each of which is a data center. A server relatively close to the user may be designated an edge server.

Clouds may be limited to a single organization (enterprise clouds), or be available to many organizations (public cloud).

Cloud computing relies on sharing of resources to achieve coherence and economies of scale.

Advocates of public and hybrid clouds note that cloud computing allows companies to avoid or minimize up-front IT infrastructure costs. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and that it enables IT teams to more rapidly adjust resources to meet fluctuating and unpredictable demand, providing the burst computing capability: high computing power at certain periods of peak demand.

Cloud providers typically use a "pay-as-you-go" model, which can lead to unexpected operating expenses if administrators are not familiarized with cloud-pricing models.
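
A back-of-the-envelope model shows how pay-as-you-go charges accumulate. All rates below are made-up placeholders; real providers publish per-service rate cards:

# Hypothetical rates -- substitute your provider's actual prices.
VM_HOUR = 0.10           # $ per VM-hour
GB_STORED_MONTH = 0.023  # $ per GB-month of object storage
GB_EGRESS = 0.09         # $ per GB transferred out

def monthly_cost(vms, hours_per_vm, storage_gb, egress_gb):
    """Sum the usage-based charges for one month."""
    return (vms * hours_per_vm * VM_HOUR
            + storage_gb * GB_STORED_MONTH
            + egress_gb * GB_EGRESS)

# 4 VMs running all month (~730 h), 500 GB stored, 200 GB egress:
print(f"${monthly_cost(4, 730, 500, 200):,.2f}")  # $321.50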

The availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture and autonomic and utility computing, has led to growth in cloud computing. By 2019, Linux was the most widely used operating system in clouds, including in Microsoft's offerings, and is thus described as dominant. The cloud service provider (CSP) screens, maintains and gathers data about the firewalls, intrusion detection and/or prevention systems, and data flows within the network.

History

Cloud computing was popularized with Amazon.com releasing its Elastic Compute Cloud product in 2006.

References to the phrase "cloud computing" appeared as early as 1996, with the first known mention in a Compaq internal document.

The cloud symbol was used to represent networks of computing equipment in the original ARPANET as early as 1977, and in the CSNET by 1981—both predecessors to the Internet itself. The word cloud was used as a metaphor for the Internet, and a standardized cloud-like shape was used to denote a network on telephony schematics. With this simplification, the implication is that the specifics of how the endpoints of a network are connected are not relevant to understanding the diagram.

The term cloud was used to refer to platforms for distributed computing as early as 1993, when Apple spin-off General Magic and AT&T used it in describing their (paired) Telescript and PersonaLink technologies. In Wired's April 1994 feature "Bill and Andy's Excellent Adventure II", Andy Hertzfeld commented on Telescript, General Magic's distributed programming language:

"The beauty of Telescript ... is that now, instead of just having a device to program, we now have the entire Cloud out there, where a single program can go and travel to many different sources of information and create a sort of a virtual service. No one had conceived that before. The example Jim White [the designer of Telescript, X.400 and ASN.1] uses now is a date-arranging service where a software agent goes to the flower store and orders flowers and then goes to the ticket shop and gets the tickets for the show, and everything is communicated to both parties."

Early history

During the 1960s, the initial concepts of time-sharing became popularized via RJE (Remote Job Entry); this terminology was mostly associated with large vendors such as IBM and DEC. Full-time-sharing solutions were available by the early 1970s on such platforms as Multics (on GE hardware), Cambridge CTSS, and the earliest UNIX ports (on DEC hardware). Yet, the "data center" model where users submitted jobs to operators to run on IBM's mainframes was overwhelmingly predominant.

In the 1990s, telecommunications companies, who previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service, but at a lower cost. By switching traffic as they saw fit to balance server use, they could use overall network bandwidth more effectively. They began to use the cloud symbol to denote the demarcation point between what the provider was responsible for and what users were responsible for. Cloud computing extended this boundary to cover all servers as well as the network infrastructure. As computers became more diffused, scientists and technologists explored ways to make large-scale computing power available to more users through time-sharing. They experimented with algorithms to optimize the infrastructure, platform, and applications to prioritize CPUs and increase efficiency for end users.

The use of the cloud metaphor for virtualized services dates at least to General Magic in 1994, where it was used to describe the universe of "places" that mobile agents in the Telescript environment could go, as Andy Hertzfeld describes in the quotation above. The metaphor is credited to General Magic communications employee David Hoffman, based on long-standing use in networking and telecom. In addition to its use by General Magic itself, it was also used in promoting AT&T's associated PersonaLink Services.

2000s

In August 2006, Amazon created subsidiary Amazon Web Services and introduced its Elastic Compute Cloud (EC2).

In April 2008, Google released the beta version of Google App Engine.

In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds.

By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas."

In 2008, the U.S. National Science Foundation began the Cluster Exploratory program to fund academic research using Google-IBM cluster technology to analyze massive amounts of data.

2010s

In February 2010, Microsoft released Microsoft Azure, which was announced in October 2008.

In July 2010, Rackspace Hosting and NASA jointly launched an open-source cloud-software initiative known as OpenStack. The OpenStack project intended to help organizations offer cloud-computing services running on standard hardware. The early code came from NASA's Nebula platform as well as from Rackspace's Cloud Files platform. As an open-source offering, along with other open-source solutions such as CloudStack, Ganeti, and OpenNebula, it has attracted attention from several key communities. Several studies aim at comparing these open-source offerings based on a set of criteria.

On March 1, 2011, IBM announced the IBM SmartCloud framework to support Smarter Planet. Among the various components of the Smarter Computing foundation, cloud computing is a critical part. On June 7, 2012, Oracle announced the Oracle Cloud. This cloud offering is poised to be the first to provide users with access to an integrated set of IT solutions, including the Applications (SaaS), Platform (PaaS), and Infrastructure (IaaS) layers.

In May 2012, Google Compute Engine was released in preview, before being rolled out into General Availability in December 2013.

In 2019, it was revealed that Linux is the most used operating system on Microsoft Azure.

Similar concepts

The goal of cloud computing is to allow users to benefit from all of these technologies, without the need for deep knowledge about or expertise with each one of them. The cloud aims to cut costs and help users focus on their core business instead of being impeded by IT obstacles. The main enabling technology for cloud computing is virtualization. Virtualization software separates a physical computing device into one or more "virtual" devices, each of which can be easily used and managed to perform computing tasks. With operating system–level virtualization essentially creating a scalable system of multiple independent computing devices, idle computing resources can be allocated and used more efficiently. Virtualization provides the agility required to speed up IT operations and reduces cost by increasing infrastructure utilization. Autonomic computing automates the process through which the user can provision resources on-demand. By minimizing user involvement, automation speeds up the process, reduces labor costs and reduces the possibility of human errors.

Cloud computing uses concepts from utility computing to provide metrics for the services used. Cloud computing attempts to address QoS (quality of service) and reliability problems of other grid computing models.

Cloud computing shares characteristics with:

  • Client–server model—Client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requestors (clients).
  • Computer bureau—A service bureau providing computer services, particularly from the 1960s to 1980s.
  • Grid computing—A form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks.
  • Fog computing—Distributed computing paradigm that provides data, compute, storage and application services closer to the client or near-user edge devices, such as network routers. Furthermore, fog computing handles data at the network level, on smart devices and on the end-user client-side (e.g. mobile devices), instead of sending data to a remote location for processing.
  • Mainframe computer—Powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census; industry and consumer statistics; police and secret intelligence services; enterprise resource planning; and financial transaction processing.
  • Utility computing—The "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."
  • Peer-to-peer—A distributed architecture without the need for central coordination. Participants are both suppliers and consumers of resources (in contrast to the traditional client-server model).
  • Green computing
  • Cloud sandbox—A live, isolated computer environment in which a program, code or file can run without affecting the application in which it runs.

Characteristics

Cloud computing exhibits the following key characteristics:

  • Agility for organizations may be improved, as cloud computing may increase users' flexibility with re-provisioning, adding, or expanding technological infrastructure resources.
  • Cost reductions are claimed by cloud providers. A public-cloud delivery model converts capital expenditures (e.g., buying servers) to operational expenditure. This purportedly lowers barriers to entry, as infrastructure is typically provided by a third party and need not be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is "fine-grained", with usage-based billing options. As well, less in-house IT skills are required for implementation of projects that use cloud computing. The e-FISCAL project's state-of-the-art repository contains several articles looking into cost aspects in more detail, most of them concluding that costs savings depend on the type of activities supported and the type of infrastructure available in-house.
  • Device and location independence enable users to access systems using a web browser regardless of their location or what device they use (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect to it from anywhere.
  • Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer and can be accessed from different places (e.g., different work locations, while travelling, etc.).
  • Multitenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • peak-load capacity increases (users need not engineer and pay for the resources and equipment to meet their highest possible load-levels)
    • utilization and efficiency improvements for systems that are often only 10–20% utilized.
  • Performance is monitored by IT experts from the service provider, and consistent and loosely coupled architectures are constructed using web services as the system interface.
  • Productivity may be increased when multiple users can work on the same data simultaneously, rather than waiting for it to be saved and emailed. Time may be saved as information does not need to be re-entered when fields are matched, nor do users need to install application software upgrades to their computer.
  • Availability improves with the use of multiple redundant sites, which makes well-designed cloud computing suitable for business continuity and disaster recovery.
  • Scalability and elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time (note that VM startup time varies by VM type, location, operating system, and provider), without users having to engineer for peak loads. This gives the ability to scale up when usage increases, and down when resources are no longer needed; see the sketch after this list. Emerging approaches for managing elasticity include the use of machine-learning techniques to propose efficient elasticity models.
  • Security can improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than other traditional systems, in part because service providers are able to devote resources to solving security issues that many customers cannot afford to tackle or which they lack the technical skills to address. However, the complexity of security is greatly increased when data is distributed over a wider area or over a greater number of devices, as well as in multi-tenant systems shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
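
The scalability-and-elasticity point above can be made concrete with a minimal sketch of a threshold-based autoscaler. The pool object and its methods are hypothetical stand-ins for a provider's provisioning API; production autoscalers add cooldown periods, health checks, and sometimes the machine-learning models mentioned above.

    # Minimal sketch of threshold-based elasticity (hypothetical pool API).
    def autoscale(pool, get_avg_cpu, low=0.2, high=0.8):
        """Grow the pool under load, shrink it when mostly idle."""
        utilization = get_avg_cpu()          # e.g., average CPU across the pool
        if utilization > high:
            pool.add_instances(1)            # scale out: provision on demand
        elif utilization < low and pool.size() > 1:
            pool.remove_instances(1)         # scale in: release idle capacity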

The National Institute of Standards and Technology's definition of cloud computing identifies "five essential characteristics":

On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.

Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).

Resource pooling. The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. 

Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear unlimited and can be appropriated in any quantity at any time.

Measured service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.

— National Institute of Standards and Technology

Service models

Cloud computing service models arranged as layers in a stack

Though service-oriented architecture advocates "Everything as a service" (with the acronyms EaaS or XaaS, or simply aas), cloud-computing providers offer their "services" according to different models, of which the three standard models per NIST are Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). These models offer increasing abstraction; they are thus often portrayed as layers in a stack: infrastructure-, platform- and software-as-a-service, but these need not be related. For example, one can provide SaaS implemented on physical machines (bare metal), without using underlying PaaS or IaaS layers, and conversely one can run a program on IaaS and access it directly, without wrapping it as SaaS.

Infrastructure as a service (IaaS)

"Infrastructure as a service" (IaaS) refers to online services that provide high-level APIs used to abstract various low-level details of underlying network infrastructure like physical computing resources, location, data partitioning, scaling, security, backup, etc. A hypervisor runs the virtual machines as guests. Pools of hypervisors within the cloud operational system can support large numbers of virtual machines and the ability to scale services up and down according to customers' varying requirements. Linux containers run in isolated partitions of a single Linux kernel running directly on the physical hardware. Linux cgroups and namespaces are the underlying Linux kernel technologies used to isolate, secure and manage the containers. Containerisation offers higher performance than virtualization because there is no hypervisor overhead. Also, container capacity auto-scales dynamically with computing load, which eliminates the problem of over-provisioning and enables usage-based billing. IaaS clouds often offer additional resources such as a virtual-machine disk-image library, raw block storage, file or object storage, firewalls, load balancers, IP addresses, virtual local area networks (VLANs), and software bundles.

The NIST's definition of cloud computing describes IaaS as "where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, and deployed applications; and possibly limited control of select networking components (e.g., host firewalls)."

IaaS-cloud providers supply these resources on-demand from their large pools of equipment installed in data centers. For wide-area connectivity, customers can use either the Internet or carrier clouds (dedicated virtual private networks). To deploy their applications, cloud users install operating-system images and their application software on the cloud infrastructure. In this model, the cloud user patches and maintains the operating systems and the application software. Cloud providers typically bill IaaS services on a utility computing basis: cost reflects the amount of resources allocated and consumed.
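
As one example of on-demand IaaS provisioning, the sketch below launches a virtual machine with boto3, the AWS SDK for Python. The image ID is a hypothetical placeholder, and the call assumes AWS credentials are already configured; other IaaS providers expose analogous APIs.

    import boto3  # AWS SDK for Python; assumes credentials are configured

    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine-image ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    # The provider bills for this instance on a utility basis from here on.
    print(response["Instances"][0]["InstanceId"])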

Platform as a service (PaaS)

The NIST's definition of cloud computing defines Platform as a Service as:

The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages, libraries, services, and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly configuration settings for the application-hosting environment.

PaaS vendors offer a development environment to application developers. The provider typically develops a toolkit and standards for development, and channels for distribution and payment. In the PaaS model, cloud providers deliver a computing platform, typically including an operating system, a programming-language execution environment, a database, and a web server. Application developers develop and run their software on a cloud platform instead of directly buying and managing the underlying hardware and software layers. With some PaaS offerings, the underlying compute and storage resources scale automatically to match application demand, so that the cloud user does not have to allocate resources manually.
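
To make the division of responsibility concrete: under PaaS, the developer supplies only application code, such as the minimal Flask app below, while the platform supplies the operating system, language runtime, and web server. Reading the listening port from an environment variable is a common convention on Heroku-style platforms; the file name is illustrative.

    # app.py -- a minimal web application of the kind a PaaS runs for you.
    import os
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "Hello from a PaaS-hosted app"

    if __name__ == "__main__":
        # Many platforms inject the port to listen on via the environment.
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8000)))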

Some integration and data management providers also use specialized applications of PaaS as delivery models for data. Examples include iPaaS (Integration Platform as a Service) and dPaaS (Data Platform as a Service). iPaaS enables customers to develop, execute and govern integration flows; under the iPaaS integration model, customers drive the development and deployment of integrations without installing or managing any hardware or middleware. dPaaS delivers integration and data-management products as a fully managed service; under the dPaaS model, the PaaS provider, not the customer, manages the development and execution of programs by building data applications for the customer. dPaaS users access data through data-visualization tools.

Software as a service (SaaS)

The NIST's definition of cloud computing defines Software as a Service as:

The capability provided to the consumer is to use the provider's applications running on a cloud infrastructure. The applications are accessible from various client devices through either a thin client interface, such as a web browser (e.g., web-based email), or a program interface. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.

In the software as a service (SaaS) model, users gain access to application software and databases. Cloud providers manage the infrastructure and platforms that run the applications. SaaS is sometimes referred to as "on-demand software" and is usually priced on a pay-per-use basis or using a subscription fee. In the SaaS model, cloud providers install and operate application software in the cloud and cloud users access the software from cloud clients. Cloud users do not manage the cloud infrastructure and platform where the application runs. This eliminates the need to install and run the application on the cloud user's own computers, which simplifies maintenance and support. Cloud applications differ from other applications in their scalability—which can be achieved by cloning tasks onto multiple virtual machines at run-time to meet changing work demand. Load balancers distribute the work over the set of virtual machines. This process is transparent to the cloud user, who sees only a single access-point. To accommodate a large number of cloud users, cloud applications can be multitenant, meaning that any machine may serve more than one cloud-user organization.

The pricing model for SaaS applications is typically a monthly or yearly flat fee per user, so prices scale and adjust as users are added or removed at any point; it may also be free. Proponents claim that SaaS gives a business the potential to reduce IT operational costs by outsourcing hardware and software maintenance and support to the cloud provider, enabling the business to reallocate IT operations costs away from hardware/software spending and personnel expenses, towards meeting other goals. In addition, with applications hosted centrally, updates can be released without requiring users to install new software. One drawback of SaaS comes with storing the users' data on the cloud provider's server; as a result, there could be unauthorized access to the data. Examples of applications offered as SaaS are games and productivity software like Google Docs and Word Online. SaaS applications may be integrated with cloud storage or file hosting services, as with Google Docs's integration with Google Drive and Word Online's integration with OneDrive.

Mobile "backend" as a service (MBaaS)

In the mobile "backend" as a service (MBaaS) model, also known as backend as a service (BaaS), web app and mobile app developers are provided with a way to link their applications to cloud storage and cloud computing services, with application programming interfaces (APIs) exposed to their applications and custom software development kits (SDKs). Services include user management, push notifications, integration with social networking services, and more. This is a relatively recent model in cloud computing, with most BaaS startups dating from 2011 or later, but trends indicate that these services are gaining significant mainstream traction with enterprise consumers.
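
In practice, a BaaS API call is usually an authenticated HTTPS request made by the provider's SDK on the app's behalf. The sketch below sends a push notification through a hypothetical REST endpoint using Python's requests library; the URL, path, and payload fields are invented for illustration.

    import requests  # third-party HTTP client (pip install requests)

    # Hypothetical BaaS push-notification endpoint, for illustration only.
    response = requests.post(
        "https://api.example-baas.com/v1/push",
        headers={"Authorization": "Bearer <api-key>"},
        json={"user_id": "u123", "message": "Your order has shipped"},
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors instead of ignoring them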

Serverless computing

Serverless computing is a cloud computing code execution model in which the cloud provider fully manages starting and stopping virtual machines as necessary to serve requests, and requests are billed by an abstract measure of the resources required to satisfy the request, rather than per virtual machine, per hour. Despite the name, it does not actually involve running code without servers. Serverless computing is so named because the business or person that owns the system does not have to purchase, rent or provision servers or virtual machines for the back-end code to run on.

Function as a service (FaaS)

Function as a service (FaaS) is a service-hosted remote procedure call that leverages serverless computing to enable the deployment of individual functions in the cloud that run in response to events. FaaS is included under the broader term serverless computing, but the terms may also be used interchangeably.
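
A FaaS function is typically just a handler that the platform invokes once per event. The sketch below follows the handler signature of AWS Lambda's Python runtime, shown here as one widely used example; other providers use similar conventions.

    # A function deployed to a FaaS platform (AWS Lambda-style signature).
    # The provider starts, scales, and bills it per invocation.
    def handler(event, context):
        name = event.get("name", "world")
        return {"statusCode": 200, "body": f"Hello, {name}"}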

Deployment models

Cloud computing types

Private cloud

Private cloud is cloud infrastructure operated solely for a single organization, whether managed internally or by a third party, and hosted either internally or externally. Undertaking a private cloud project requires significant engagement to virtualize the business environment, and requires the organization to reevaluate decisions about existing resources. It can improve business, but every step in the project raises security issues that must be addressed to prevent serious vulnerabilities. Self-run data centers are generally capital intensive. They have a significant physical footprint, requiring allocations of space, hardware, and environmental controls. These assets have to be refreshed periodically, resulting in additional capital expenditures. They have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from less hands-on management, essentially "[lacking] the economic model that makes cloud computing such an intriguing concept".

Public cloud

Cloud services are considered "public" when they are delivered over the public Internet, and they may be offered as a paid subscription, or free of charge. Architecturally, there are few differences between public- and private-cloud services, but security concerns increase substantially when services (applications, storage, and other resources) are shared by multiple customers. Most public-cloud providers offer direct-connection services that allow customers to securely link their legacy data centers to their cloud-resident applications.

Hybrid cloud

Hybrid cloud is a composition of a public cloud and a private environment, such as a private cloud or on-premises resources, that remain distinct entities but are bound together, offering the benefits of multiple deployment models. Hybrid cloud can also mean the ability to connect collocation, managed and/or dedicated services with cloud resources. Gartner defines a hybrid cloud service as a cloud computing service composed of some combination of private, public and community cloud services, from different service providers. A hybrid cloud service crosses isolation and provider boundaries, so it cannot simply be placed in one category of private, public, or community cloud service. It allows one to extend either the capacity or the capability of a cloud service, by aggregation, integration or customization with another cloud service.

Varied use cases for hybrid cloud composition exist. For example, an organization may store sensitive client data in house on a private cloud application, but interconnect that application to a business intelligence application provided on a public cloud as a software service. This example of hybrid cloud extends the capabilities of the enterprise to deliver a specific business service through the addition of externally available public cloud services. Hybrid cloud adoption depends on a number of factors such as data security and compliance requirements, level of control needed over data, and the applications an organization uses.

Another example of hybrid cloud is one where IT organizations use public cloud computing resources to meet temporary capacity needs that cannot be met by the private cloud. This capability enables hybrid clouds to employ cloud bursting for scaling across clouds. Cloud bursting is an application deployment model in which an application runs in a private cloud or data center and "bursts" to a public cloud when the demand for computing capacity increases. A primary advantage of cloud bursting and a hybrid cloud model is that an organization pays for extra compute resources only when they are needed. Cloud bursting enables data centers to create an in-house IT infrastructure that supports average workloads, and to use cloud resources from public or private clouds during spikes in processing demand. A specialized model of hybrid cloud built atop heterogeneous hardware is called a "cross-platform hybrid cloud". A cross-platform hybrid cloud is usually powered by different CPU architectures underneath, for example x86-64 and ARM. Users can transparently deploy and scale applications without knowledge of the cloud's hardware diversity. This kind of cloud emerged with the rise of ARM-based systems-on-chip for server-class computing.
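
The bursting decision itself can be reduced to a simple placement rule, sketched below with hypothetical private_pool and public_pool objects standing in for the two environments.

    # Minimal sketch of cloud bursting (hypothetical pool objects).
    def place_job(job, private_pool, public_pool):
        if private_pool.free_capacity() >= job.required_capacity:
            return private_pool.submit(job)   # normal case: run on-premises
        # Demand spike: overflow to pay-per-use public capacity.
        return public_pool.submit(job)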

In short, hybrid cloud infrastructure serves to relax the capacity and capability limits inherent in a purely private cloud: sensitive or steady workloads stay on private resources, while public resources supply flexibility and overflow capacity as demand requires.

Others

Community cloud

Community cloud shares infrastructure between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third-party, and either hosted internally or externally. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud computing are realized.

Distributed cloud

A cloud computing platform can be assembled from a distributed set of machines in different locations, connected to a single network or hub service. It is possible to distinguish between two types of distributed clouds: public-resource computing and volunteer cloud.

  • Public-resource computing—This type of distributed cloud results from an expansive definition of cloud computing, because such systems are more akin to distributed computing than to cloud computing proper. Nonetheless, it is considered a sub-class of cloud computing.
  • Volunteer cloud—Volunteer cloud computing is characterized as the intersection of public-resource computing and cloud computing, where a cloud computing infrastructure is built from volunteered resources. Many challenges arise from this type of infrastructure because of the volatility of the resources used to build it and the dynamic environment it operates in. Such systems are also called peer-to-peer clouds or ad hoc clouds. One effort in this direction is Cloud@Home, which aims to implement a cloud computing infrastructure using volunteered resources, with a business model that incentivizes contributions through financial restitution.

Multicloud

Multicloud is the use of multiple cloud computing services in a single heterogeneous architecture to reduce reliance on single vendors, increase flexibility through choice, guard against disasters, and so on. It differs from hybrid cloud in that it refers to multiple cloud services rather than multiple deployment modes (public, private, legacy).

Poly cloud

Poly cloud refers to the use of multiple public clouds for the purpose of leveraging specific services that each provider offers. It differs from multicloud in that it is not designed to increase flexibility or mitigate failures, but is rather used to allow an organization to achieve more than could be done with a single provider.

Big Data cloud

The issues of transferring large amounts of data to the cloud, and of data security once the data is there, initially hampered the adoption of cloud for big data. Now that much data originates in the cloud, and with the advent of bare-metal servers, the cloud has become a solution for use cases including business analytics and geospatial analysis.

HPC cloud

HPC cloud refers to the use of cloud computing services and infrastructure to execute high-performance computing (HPC) applications. These applications consume a considerable amount of computing power and memory and are traditionally executed on clusters of computers. In 2016 a handful of companies, including R-HPC, Amazon Web Services, Univa, Silicon Graphics International, Sabalcore, Gomput, and Penguin Computing, offered high-performance computing clouds. The Penguin On Demand (POD) cloud was one of the first non-virtualized remote HPC services offered on a pay-as-you-go basis. Penguin Computing launched its HPC cloud in 2016 as an alternative to Amazon's EC2 Elastic Compute Cloud, which uses virtualized computing nodes.

Architecture

Cloud computing sample architecture

Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue. Elastic provisioning implies intelligently choosing between tight and loose coupling for these and similar mechanisms.

Cloud engineering

Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to the high-level concerns of commercialization, standardization and governance in conceiving, developing, operating and maintaining cloud computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such as systems, software, web, performance, information technology engineering, security, platform, risk, and quality engineering.

Security and privacy

Cloud computing poses privacy concerns because the service provider can access the data in the cloud at any time, and could accidentally or deliberately alter or delete information. Many cloud providers can share information with third parties if necessary for purposes of law and order, even without a warrant; this is permitted in their privacy policies, which users must agree to before they start using cloud services. Solutions to privacy include policy and legislation as well as end users' choices for how data is stored. Users can encrypt data that is processed or stored within the cloud to prevent unauthorized access. Identity management systems can also provide practical solutions to privacy concerns in cloud computing: these systems distinguish between authorized and unauthorized users and determine the amount of data that is accessible to each entity. The systems work by creating and describing identities, recording activities, and removing unused identities.
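
Client-side encryption, mentioned above, keeps the provider from reading stored data because the key never leaves the user. A minimal sketch using the widely used Python cryptography package:

    from cryptography.fernet import Fernet  # pip install cryptography

    key = Fernet.generate_key()      # kept by the user, never by the provider
    f = Fernet(key)
    ciphertext = f.encrypt(b"sensitive record")   # safe to store in the cloud
    assert f.decrypt(ciphertext) == b"sensitive record"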

According to the Cloud Security Alliance, the top three threats in the cloud are insecure interfaces and APIs, data loss and leakage, and hardware failure, which accounted for 29%, 25% and 10% of all cloud security outages respectively; together, these form shared-technology vulnerabilities. On a cloud provider platform shared by different users, information belonging to different customers may reside on the same data server. Additionally, Eugene Schultz, chief technology officer at Emagined Security, said that hackers are spending substantial time and effort looking for ways to penetrate the cloud: "There are some real Achilles' heels in the cloud infrastructure that are making big holes for the bad guys to get into". Because data from hundreds or thousands of companies can be stored on large cloud servers, hackers can theoretically gain control of huge stores of information through a single attack—a process he called "hyperjacking". Examples include the Dropbox security breach and the 2014 iCloud leak. Dropbox was breached in October 2014, when over 7 million of its users' passwords were stolen by hackers seeking to extract monetary value from them in Bitcoin (BTC). With these passwords, attackers can read private data and have it indexed by search engines (making the information public).

There is also the problem of legal ownership of the data (if a user stores some data in the cloud, can the cloud provider profit from it?). Many terms-of-service agreements are silent on the question of ownership. Physical control of the computer equipment (private cloud) is more secure than having the equipment off-site and under someone else's control (public cloud). This gives public cloud service providers a strong incentive to prioritize building and maintaining secure management of their services. Some small businesses that lack expertise in IT security may find that a public cloud is actually more secure for them. There is also the risk that end users do not understand the issues involved when signing on to a cloud service (people sometimes do not read the many pages of the terms-of-service agreement and just click "Accept"). This matters now that cloud computing is becoming popular and is required for some services to work, for example for intelligent personal assistants (Apple's Siri or Google Now). Fundamentally, private cloud is seen as more secure, with higher levels of control for the owner, whereas public cloud is seen as more flexible and as requiring less investment of time and money from the user.

Limitations and disadvantages

According to Bruce Schneier, "The downside is that you will have limited customization options. Cloud computing is cheaper because of economics of scale, and—like any outsourced task—you tend to get what you get. A restaurant with a limited menu is cheaper than a personal chef who can cook anything you want. Fewer options at a much cheaper price: it's a feature, not a bug." He also suggests that "the cloud provider might not meet your legal needs" and that businesses need to weigh the benefits of cloud computing against the risks. In cloud computing, control of the back-end infrastructure rests with the cloud vendor alone. Cloud providers often decide on the management policies, which moderate what cloud users can do with their deployments. Cloud users are likewise limited to the control and management of their own applications, data, and services. This includes data caps: the cloud vendor allocates a certain amount of bandwidth to each customer, often shared among other cloud users.

Privacy and confidentiality are big concerns in some activities. For instance, sworn translators working under the stipulations of an NDA might face problems regarding sensitive data that is not encrypted.

Cloud computing is beneficial to many enterprises; it lowers costs and allows them to focus on their core competencies instead of on matters of IT and infrastructure. Nevertheless, cloud computing has proven to have some limitations and disadvantages, especially for smaller business operations, particularly regarding security and downtime. Technical outages are inevitable and sometimes occur when cloud service providers (CSPs) become overwhelmed in the process of serving their clients; this may result in temporary business suspension. Since these systems rely on the Internet, an individual cannot access their applications, server or data from the cloud during an outage. However, many large enterprises maintain at least two internet providers, using different entry points into their workplaces, and some even use 4G as a third fallback.

Emerging trends

Cloud computing is still a subject of research. A driving factor in the evolution of cloud computing has been chief technology officers seeking to minimize the risk of internal outages and mitigate the complexity of housing network and computing hardware in-house. Major cloud technology companies invest billions of dollars per year in cloud research and development; for example, in 2011 Microsoft committed 90 percent of its $9.6 billion R&D budget to its cloud. Research by investment bank Centaur Partners in late 2015 forecast that SaaS revenue would grow from $13.5 billion in 2011 to $32.8 billion in 2016.

Digital forensics in the cloud

The issue of carrying out investigations where the cloud storage devices cannot be physically accessed has generated a number of changes to the way that digital evidence is located and collected. New process models have been developed to formalize collection.

In some scenarios existing digital forensics tools can be employed to access cloud storage as networked drives (although this is a slow process generating a large amount of internet traffic).

An alternative approach is to deploy a tool that processes in the cloud itself.

For organizations using Office 365 with an 'E5' subscription, there is the option to use Microsoft's built-in eDiscovery resources, although these do not provide all the functionality that is typically required for a forensic process.

Smartphone zombie

People using phones while walking

"Smartphone zombie" (or "smombie") is a popular-culture term for pedestrians who walk slowly and without attention to their surroundings because they are focused on their smartphones. Safety hazards have been noted due to such distracted pedestrians, and cities such as Chongqing and Antwerp have introduced special lanes for smartphone users to help direct and manage them. Though used in some countries more than others, the term has been called pejorative by Tom Chatfield of the BBC, who noted it is "more sinister", "suggests disapproval of (the) subject(s)" and that "terms of technological disapproval, conjured from hyperbolic hopes and fears, are as old as electronic communications." A 2017 review considered the popular-culture term in relation to the medical diagnoses of internet addiction disorder and other forms of digital media overuse.

Prevalence

In Chongqing, China, the government constructed a dedicated smartphone sidewalk, separating phone users from non-phone users. In Hong Kong, such pedestrians are called dai tau juk (the head-down tribe).

Mitigation

A warning sign in Osaka

Texting pedestrians may trip over curbs, walk out in front of cars, and bump into other walkers. The field of vision of a smartphone user is estimated to be just 5% of a normal pedestrian's. An app that uses the phone's camera to make it appear transparent can provide some warning of hazards. In Augsburg, Bodegraven and Cologne, ground-level traffic lights embedded in the pavement have been introduced so that they are more visible to preoccupied pedestrians. In Seoul, warning signs have been placed on the pavement at dangerous intersections after more than a thousand road accidents caused by smartphone use in South Korea in 2014. In October 2017, the City of Honolulu, Hawaii introduced a measure to fine pedestrians looking at smartphones while crossing the road. In 2019, China introduced penalties for "activities affecting other vehicles or pedestrians", and a woman was fined 10 yuan in Wenzhou.

History

Science fiction author Ray Bradbury predicted the phenomenon in the 1950s in stories such as "The Pedestrian" and Fahrenheit 451. He wrote at the time about observing it actually beginning to happen in Beverly Hills:

"Elsewhere in the narrative I described my Fire Man arriving home after midnight to find his wife in bed afflicted with two varieties of stupor. She is in a trance, a condition so withdrawn as to resemble catatonia, compounded of equal parts liquor and a small Seashell thimble-radio tucked in her ear. The Seashell croons and murmurs its music and commercials and private little melodramas for her alone. The room is silent. The husband cannot even try to guess the communion between Seashell and wife. Awakening her is not unlike applying shock to a cataleptic.

I thought I was writing a story of prediction, describing a world that might evolve in four or five decades. But only a month ago, in Beverly Hills one night, a husband and wife passed me, walking their dog. I stood staring after them, absolutely stunned. The woman held in one hand a small cigarette-package-sized radio, its antenna quivering. From this sprang tiny copper wires which ended in a dainty cone plugged into her right ear. There she was, oblivious to man and dog, listening to far winds and whispers and soap-opera cries, sleepwalking, helped up and down curbs by a husband who might just as well not have been there. This was not science fiction. This was a new fact in our changing society."

— Ray Bradbury, The Nation
