I’m skeptical of ‘futurists’. Work closely enough with the development of technology solutions and you’ll know that the only certain thing about the future is that it’s constantly changing. For example, few ‘futurists’ predicted the Covid-19 outbreak that brought the world to a standstill in 2020. Many, however, had spent hours waxing lyrical about how 5G technology would change the trajectory of human evolution, telling tales of what would be possible with ultra-high speed, ultra-low latency connectivity. Me included.
Of course, 5G will enable many of these promised use cases, and many others we haven’t even dreamed of yet, but have the prophets been proven true? Has 5G changed the world?
The answer, of course, is not yet. We simply haven’t yet achieved the levels of scale required for 5G to realize its potential, but some aspects of the transition to 5G are going well. Despite a global pandemic, deployment has continued to move at a decent pace with 5G now available in almost 2,000 cities across more than 70 countries. This healthy and continued expansion is made possible by a solid, and constantly evolving, 5G standard.
However, other aspects have moved more slowly. The cybersecurity provisions of 5G standards have lagged behind in maturity and fitness for purpose, and gaps remain to be filled. This is not entirely surprising. Both private and public players face a significant challenge in securing 5G networks, especially given the increased complexity introduced by new developments like Open RAN.
As a measure of this challenge, the European Telecommunications Standards Institute (ETSI) only released its first Open RAN standard in September of this year. Even more tellingly, it included no cybersecurity requirements. Open RAN functions are governed by the existing cybersecurity specifications in the 5G standard, but nothing more.
This is a major concern. Open RAN is likely to become a major part of 5G development in the future and ensuring its security needs to become a priority.
Why the noise about Open RAN?
The Radio Access Network (RAN) is a critical component of any broader mobile network setup. It includes base station equipment, cell towers and radios, which work in unison to convert wireless signals into the various data formats that end users ultimately engage with. The RAN is what connects your devices to other parts of the network, and ensures the wireless signals travelling invisibly through the aether arrive on your device in the form of text, voice or video.
A conventional RAN configuration, as used in 3G and 4G networks, is built on proprietary hardware and software developed by a single vendor. These components are not interoperable – that is, they cannot work with equipment built by alternative suppliers. This ‘vendor-locked’ arrangement means mobile network operators (MNOs) are limited to the supply schedules and component offerings of their contracted vendor.
Commercially, this arrangement has long favoured the supplier, with operators seeking cost efficiencies and technological agility complaining of their limited options. Security has also been positioned as a major drawback of traditional proprietary infrastructure. This reasoning gained significant traction during the Trump era and drives the Federal Communications Commission’s (FCC’s) ‘rip and replace’ program to do away with network gear from firms like Huawei and ZTE. The security argument against vendor lock-in points to the risks of being tied to suppliers, such as Chinese firms, whose products are suspected of security flaws.
The commercial argument is driving industry change. The O-RAN Alliance, whose specifications underpin ETSI’s standard released in September, is the most influential of a number of bodies campaigning for an “open” network architecture that disaggregates RAN functions, relies on interoperability of network components, and paves the way for MNOs to lower equipment costs and improve network performance through increased competition among network suppliers.
Ostensibly, this diversity of supplier base should encourage greater network security too. A more open RAN architecture should increase transparency across the network, granting operators more freedom and responsiveness in addressing vulnerabilities or incursions in real time. And, where a particular vendor’s products are shown to be compromised, the operator can quickly and easily swap them out for alternatives.
Theoretically, then, market economics should also favour suppliers who are able to deliver superior security. As declared by the US Department of Defense (DoD), “…this market-based approach represents a sustainable model for accelerating critical 5G innovation while spurring the growth of domestic supply chains based on trusted and secure vendors.”
But in most cases cybersecurity’s relevance to the bottom line is not immediately obvious and commercial motivations stand to win out against security considerations. This friction is not easily apparent when, as in the case of increasing interoperability and supplier diversity, both causes appear to be served by the same course of action.
But, the final test of this union is in the actual selection of those supplier products and services and, beneath that, the reliability and security competence of different vendors. When it comes down to it, can we confidently assume that network operators – which include many smaller local outfits lacking the capital and operational budgets of larger national players – will always choose the most secure option over the cheaper one with more favourable terms?
A key theme here is virtualization. One of the defining characteristics of the move to 5G is the virtualization of network functions previously assigned to hardware. This is not a minor technological development; it is an evolutionary leap. By unmooring network functions from physical hardware we liberate the full potential of integrated technologies like cloudification, edge computing, and AI/ML automation. But this move relies on a shift to a software-driven ecosystem, which is inherently more hackable than a hardware-based system that merely includes software services.
Virtualization also increases the attack surface of the network and invites greater complexity in supplier management. The more suppliers there are, the more difficult, time-consuming and expensive it becomes to vet them and their products, while many supply chains cross borders and originate in countries beyond the network operator’s own territory.
Also, the more disaggregated a network is, the more component interfaces there are to act as entry points for malicious attacks. And, when most of these products are either software or software-driven, the challenge increases exponentially, because we need to consider the DNA of the software itself. In a development environment in which so much software is based on open-source code, finding vulnerabilities and attack opportunities becomes considerably more difficult.
While Open RAN only accounts for a portion of the network, it represents a major share of capital investment. This alone should be enough to focus MNO efforts on ensuring Open RAN is a secure system. The increased agility and flexibility promised by Open RAN won’t be realised if safety and reliability cannot be achieved. But the challenge is significant. Cybersecurity experts across the world are working to ensure that features like cloudification, virtualization and software supremacy do not open 5G networks to attack, but these are novel technologies and security methods are still evolving. Not only do Open RAN configurations need to contend with these same challenges, which apply to 5G networks more generally, they also have the added pressure of keeping open interfaces, which are unique to Open RAN, safe for the network and, ultimately, the end user.
Most large-scale 5G deployments globally are still likely to implement ‘traditional’ RAN architecture, with very few operators moving assertively towards Open RAN in the short term. In the United States, for example, only DISH is deploying Open RAN across its entire network, and even that rollout has been repeatedly constrained by teething problems. So, it appears we still have some time to work out the best way to approach Open RAN security, but this is a new and quickly evolving concept. Development is happening at speed, and at the moment security considerations are not receiving the same attention as commercial promises. The risk that security is left behind is growing, and the potential fallout could be severe.
The first issue is one of maturity. Open RAN network design and its concomitant security standards are simply not mature enough yet and rushing into deployment could invite disaster. With the increasing complexity of multiple suppliers and innumerable software sources and combinations, the potential for inadequate security provisions increases.
Furthermore, opening standards for interfaces in the RAN invites a broad diversity of new vendors, thereby increasing competition, which is one of the key commercial incentives for MNOs. But having more vendors increases supply chain risks, while the quality and security rigour of the components created by these new suppliers is currently unknown. If Open RAN cybersecurity is not more explicitly spelled out in 5G standards, there’s nothing to ensure that new network components will be safe.
In addition to these amplified risks, there are several potential risks unique to Open RAN, the first being a significantly expanded threat surface. One area of focus is the Open Fronthaul, a crucial aspect of Open RAN architecture, which, as defined by the O-RAN Alliance, sees the disaggregation of the distributed unit (located in the base station) and the radio unit. Communication between these components needs to happen in real time via interoperable connections, but these real-time interfaces add an extra dimension of potential vulnerability. As the Cybersecurity and Infrastructure Security Agency (CISA) suggests, the Open Fronthaul is specifically vulnerable to DDoS attacks, and the first line of defence is network access control. Cryptographic security mechanisms for these real-time interfaces therefore become crucial to the integrity of Open Fronthaul networks, but at the same time these networks “push the boundaries of high-speed performance and the ability of cryptographic security mechanisms to keep up, all while keeping unit deployment and operational costs down.” As a result, “These cryptographic security mechanisms require further industry study and consideration.”
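The specific cryptographic mechanisms CISA alludes to are still under industry study, but the basic idea of authenticating traffic on an interface can be illustrated in a few lines. The toy Python sketch below is not any real fronthaul protocol (actual deployments would rely on standardised mechanisms such as MACsec or IPsec, and the key names and payloads here are invented); it simply tags each frame with an HMAC so a forged or tampered frame is rejected:

```python
import hashlib
import hmac
import os

# Hypothetical shared key provisioned between a distributed unit (DU)
# and a radio unit (RU). Key distribution is itself a hard problem,
# hand-waved here for the sake of illustration.
KEY = os.urandom(32)
TAG_LEN = 32  # SHA-256 digest size

def sign(payload: bytes, key: bytes = KEY) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can check integrity and origin."""
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def verify(frame: bytes, key: bytes = KEY) -> bytes:
    """Return the payload if the tag is valid; raise ValueError otherwise."""
    payload, tag = frame[:-TAG_LEN], frame[-TAG_LEN:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    # compare_digest avoids timing side channels in the comparison.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("frame failed authentication")
    return payload

frame = sign(b"IQ-sample-block-0001")
print(verify(frame))  # authentic frame passes

forged = frame[:-TAG_LEN] + b"\x00" * TAG_LEN  # attacker cannot compute the tag
try:
    verify(forged)
except ValueError as e:
    print("rejected:", e)
```

The catch, as the CISA language quoted above makes clear, is performance: fronthaul traffic is latency-critical, so any per-frame cryptography has to keep up with line rate without inflating unit costs, which is precisely why the mechanisms remain an open question.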
It is in light of concerns around Open Fronthaul that ETSI’s recent release of its first Open RAN standard is especially disappointing. That standard is specifically formulated for Open Fronthaul and would have been an ideal opportunity to set the bar for the security of fronthaul networks, yet it contained no such specifications. The concern, of course, is that this instead sets the tone for a mode of release in which commercial expediency (getting standards to market to allow for development of hardware and software components) outpaces security considerations.
Another development specific to Open RAN is in the form of network automation applications known as rApps and xApps, which further expand access by allowing different vendors to contribute to the RAN app ecosystem. The EU Open RAN security report correctly points out that these new functions will “require additional security controls and measures to be put in place between each and every function to avoid new security threats being introduced.”
These applications will initially be used to manage AI/ML operations in the network, though these AI/ML functions will themselves be new potential attack vectors. AI/ML algorithms are also susceptible to “data poisoning attacks” in which corrupted or misleading data is fed into the system, causing the algorithm to make false assumptions and drift into chaotic or unpredictable behaviour. However, securing such algorithms against data poisoning is still a fairly new area of study.
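To make “data poisoning” concrete, here is a minimal, hypothetical Python sketch. It uses no real network data or production algorithm – just a 1-nearest-neighbour toy classifier on an invented one-dimensional traffic feature – to show how an adversary who can mislabel training samples degrades detection:

```python
import random

def nn_predict(train_x, train_y, x):
    """1-nearest-neighbour prediction on 1-D data."""
    i = min(range(len(train_x)), key=lambda j: abs(train_x[j] - x))
    return train_y[i]

def accuracy(train_x, train_y, test_x, test_y):
    hits = sum(nn_predict(train_x, train_y, x) == y
               for x, y in zip(test_x, test_y))
    return hits / len(test_x)

random.seed(1)
# Hypothetical traffic feature: "normal" volumes cluster near 10,
# "attack" volumes near 20.
train_x = [random.gauss(10, 1) for _ in range(100)] + \
          [random.gauss(20, 1) for _ in range(100)]
train_y = ["normal"] * 100 + ["attack"] * 100
test_x = [random.gauss(10, 1) for _ in range(100)] + \
         [random.gauss(20, 1) for _ in range(100)]
test_y = ["normal"] * 100 + ["attack"] * 100

clean_acc = accuracy(train_x, train_y, test_x, test_y)

# Poisoning: an attacker who can corrupt the training feed relabels
# 40% of the attack samples as "normal".
poisoned_y = train_y[:]
for i in range(100, 140):
    poisoned_y[i] = "normal"
poisoned_acc = accuracy(train_x, poisoned_y, test_x, test_y)

print(f"clean model accuracy:    {clean_acc:.2f}")
print(f"poisoned model accuracy: {poisoned_acc:.2f}")
```

The clean model separates the two clusters almost perfectly, while the poisoned one waves through a large share of attack traffic – and note that the attacker never touched the model itself, only its training data, which is exactly why open, multi-vendor data pipelines widen this particular door.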
With its disaggregated structure that allows for multi-vendor engagement in a more competitive landscape, Open RAN is an extremely promising area of development in 5G technology. However, by raising the number of suppliers providing an increased number of products and services in a larger number of categories, the complexity of an Open RAN network will far exceed that of its predecessors.
Will MNOs be equipped to implement these new infrastructures in a way that keeps networks and their users safe? With the support of standards bodies committed to delivering robust and secure guidelines, there’s no reason this shouldn’t be possible. So far, industry associations and authorities have been clear and confident about the need to employ best practices in making sure Open RAN networks are secure.
But we have seen little in practice.
Tremendous amounts of energy and resource are being invested in building out 5G standards for global network deployment. The time to include cybersecurity provisions in those standards is now.
Self-help authors and politicians seem to agree on at least one thing: mindset matters. The shelves of bookstores worldwide are awash with motivational books by evangelical writers hoping to convert readers to their gospel of optimism. The central thesis is simple: success depends on approaching life, especially its challenges, with a positive outlook.
Politicians and leaders have always appreciated the power of mindset, though less of the positive kind, as attested to by a history of propaganda that dates back to at least 500 BC. More recently, as Covid-19 spread across the globe, language in public discourse showed itself to be carefully selected to shape national mindsets. War metaphors, specifically, became the favoured way of talking about the virus that was, despite its many negative consequences, still just a virus, not a sentient foe.
Donald Trump declared himself a “wartime president”, leading the US in a campaign against an “invisible enemy.” UK Prime Minister, Boris Johnson, also named the virus as an “enemy” that was being fought by his “wartime government.” His chancellor, Rishi Sunak, and his Health Secretary, Matt Hancock, kept to the same script, naming the “war against this virus” as the greatest fight “in peacetime” the country had ever faced. Italian Prime Minister, Giuseppe Conte, referenced the speeches of Winston Churchill, describing the pandemic as “our darkest hour.” Even António Guterres, Secretary-General of the United Nations, felt compelled to announce, “We are at war with a virus.”
There is nothing particularly new in all of this – the language of war has long been used as a tool to reliably “express an urgent, negatively valenced emotional tone that captures attention and motivates action.” Yet, in the case of the Covid-19 pandemic, it is useful to contrast popular war-themed rhetoric with a more embracing perspective.
Dr. Yuko Harayama is the Executive Director for international affairs, communication, and diversity at RIKEN, Japan’s largest research organization for basic and applied science. When asked about the impact of the coronavirus emergency, Harayama replied,
“After the pandemic, we have to foresee a future where humans are not dominating everything; humans are just one part of nature. We should not be arrogant to say we’ll dominate coronavirus.”
Harayama’s measured response says a lot about the way she sees the interaction between humans and the world around them. It also says a lot about Society 5.0, behind which Harayama was the driving force during her time in the Cabinet Office of Japan, where she was an executive member of the Council for Science, Technology and Innovation.
Society 5.0 is the vision of a future in which humans and machines “co-create” the solutions to societal problems by integrating cyberspace and physical space. First proposed in Japan’s 5th Science and Technology Basic Plan as a future that the country should aspire to, Society 5.0 represents the next step towards a more successful human collective.
Rather than simply using technology to improve our means of production, this plan is intended to create a new social contract and economic model by fully integrating cutting-edge technological innovations into our social fabric. In the words of Shinzo Abe, the Japanese Prime Minister under whose administration the vision was launched, Society 5.0 sets “a new definition for machines.” Leveraging advanced robotics, AI, cloud computing, next generation connectivity, and big data, Society 5.0 is intended to liberate machines from their narrow functions in industry and society. Instead, they will become active problem solvers and evolutionary enablers.
According to researchers at Hitachi-UTokyo Laboratory, a partnership between The University of Tokyo and Hitachi, the realisation of this view will also require us to reframe two kinds of relationships: the relationship between technology and society and the technology-mediated relationship between individuals and society. This is a crucial point because it highlights the human-centric nature of Society 5.0, a quality that neatly distinguishes it from the fourth industrial revolution.
We are in the emergent stages of the fourth industrial revolution (4IR)–a reimagining of production through the digitization of manufacturing.
The first industrial revolution employed steam and water power to improve output. The second used electricity to do the same. The third industrial revolution used computers and automation to accelerate production.
4IR, or Industry 4.0, builds on the power of computerized automation by introducing machine and systems autonomy. Through wireless networks of sensors, receivers, and processors, vast amounts of manufacturing data are collected and processed by artificial intelligence, currently of the “narrow” or “weak” kind.
These autonomous arrangements of physical and virtual computing elements are effectively capable of learning in real-time. They continuously improve production processes, making decisions based on super-fast analysis of live and historical data collected from the production environment.
The first industrial revolution reduced the need for human labour. The second increased efficiency by mechanizing large production lines. The third used computers to automate these processes even further, but still required humans to manage production.
The fourth industrial revolution goes further to make human intervention in production applications almost redundant. Smart factories, for example, are independent cyber-physical systems in which people are necessary only for specialized jobs, machine maintenance, high-level network management, and strategic guidance.
However, Society 5.0 is something different from Industry 4.0. In the words of the Japanese Cabinet Office, “This is a society centered on each and every person and not a future controlled and monitored by AI and robots.” It is a human-centred proposition that seeks to use the same relationships between cyberspace and physical space to solve social problems.
The fourth industrial revolution relates specifically to commerce and manufacturing through better use of machines, but the non-commercial consequences of 4IR are often overlooked.
What, for example, will be the societal effects of 4IR?
As AI and automation make many human jobs redundant, what will be the impact on the nature of work, communities and social structures?
What will happen to economies as medical improvements lead to an aging population?
What will happen to the environment as human production and consumption continue to grow?
These are wicked problems, even though they are the result of largely positive trends towards more widespread human wellbeing. And they would not be vexing us were it not for technology.
Of course, this does not make technology bad, or even good–it is agnostic–but it does raise the question: if we used technology to get ourselves into these dilemmas, can we use it to get ourselves out?
The notion of Society 5.0 is an emphatic ‘yes’ to that question. It is a proposal for humanity’s next evolutionary step. According to the Japanese government, this would be the 5th stage of human society. Initially (Society 1.0), we organized ourselves in small groups or tribes of hunter-gatherers, living off the natural output of the land. Then, through horticulture and agriculture (Society 2.0), we used tools to harness the growing potential of the earth, giving us more control over our food production. Society 3.0 saw us move into the industrial era, and Society 4.0 represents the information age we are living through now.
Society 5.0 assumes that, through a high degree of convergence between cyberspace and physical space, we can achieve a forward-looking society in which each and every person can lead an active and enjoyable life.
New solutions for new problems
The pace and extent of globalization have meant that new challenges have emerged that were either not anticipated, or at least not expected for some time to come. And, having a more integrated world means having more integrated problems that are more difficult to solve.
Sustaining economic growth while reducing income inequality and environmental degradation; improving the welfare of an aging population while ensuring opportunities for the youth; providing for more people using limited resources; slowing down, stopping, and then reversing the effects of climate change: these are wicked problems, but Society 5.0 represents an integrated approach to tackling them through multiple domains.
An increase in global life expectancy has been one of the great achievements of medical progress over the last two hundred years. According to World Bank data, average life expectancy at birth across all countries increased by more than 30 years between 1960 and 2019 alone. In the late nineteenth and early twentieth centuries, this trend was largely due to improvements in living conditions (especially sanitation), education, and advances in medical treatments like vaccines and antibiotics. These factors helped reduce early to mid-life mortality, but since the later parts of the twentieth century, rising life expectancy has been principally attributed to lower mortality later in life. Quite simply, the average human has been living longer. However, this has not necessarily been positive.
The WHO reports that, in the first two decades of this century, life expectancy around the world rose by an average of 6.6 years. Healthy life expectancy (HALE) also rose by 8% during the same period, though this was not due to reduced years lived with disability. Instead, increases in HALE were attributed to declining mortality rates. People have been living longer, but not living well for longer.
This discrepancy places a burden on societies and economies, an effect that is particularly pronounced in richer nations. In the 24 countries classified as high income by the World Bank, people aged 25 to 59 earn more than they consume, while the elderly do the opposite. Inevitably, as a population’s life expectancy increases, the cost of social support rests more heavily on working-age citizens, while pressure on government budgets grows. Quite simply, an ageing population puts economic strain on society as a whole.
It’s not surprising, then, that the original idea for Society 5.0 originated in a country like Japan, where life expectancy is higher than anywhere else on earth and a third of the population is 60 years or older.
However, as medical technology and social support structures across the world improve in quality and affordability, more and more nations will face the challenges of having an aging population. These include increasing medical and social security expenses, and the demands of caring for the elderly.
In Society 5.0, wearable medical devices will allow health and physiological data to be captured, uploaded and analyzed remotely, permitting early (AI-driven) detection and diagnosis of illness. Medication and healthcare services will be delivered by drone and autonomous vehicles, giving elderly people in rural areas equal access to quality healthcare. Robots and AI will assist in giving elderly citizens living support, even offering them the conversation and companionship that is critical to greater mental health.
In combination, these results will lessen the burden on public healthcare systems, lowering the need for hospital visits and improving the accuracy and efficacy of diagnoses and medical prescriptions.
Smart cities and mobility
In 2009, for the first time, the number of people living in urban centres globally surpassed the number living in rural areas. Though a significant moment, this was simply a milestone in a steady rise in urbanisation that saw the world’s urban population increase six-fold from 751 million in 1950 to 4.2 billion in 2018. By 2030, that number is expected to top 5 billion. By 2050, an estimated 68% of the world’s population will live in cities.
The pressure that this urban growth is placing, and will continue to place, on infrastructure and resources is immense. Smart technologies will be critical to the success of these cities in managing the complex challenges created by having so many people in limited spaces: problems like waste management, energy management, water and power management, connectivity, public safety and security, transport and logistics.
In Society 5.0, the urban cohort of the world’s citizenry will be defined by an open embrace of cyber-physical technologies – by necessity and for pleasure. Cities will no longer be the traditional bricks and mortar environments we have been accustomed to. They will become cyber-physical realms built on data as much as concrete and glass.
This data will be collected and distributed by vast networks of sensors and processors, feeding AI-driven decision-making on the back of next-generation connectivity. In the cities of Society 5.0, no area of human activity will be left untouched by smart technologies.
While other aspects of the Society 5.0 vision still reside in the future, the evolution of smart cities is already well underway. Early adopters in Europe included Barcelona and Amsterdam, with Copenhagen, Dubai, Hamburg, Nice and Singapore quickly following suit. In North America, New York, Chicago, Miami, San Francisco, Kansas City and Montreal are also examples of cities implementing smart city initiatives.
While living in Singapore, Marin was fortunate to work on many of the city’s “Smart Nation” programs, learning first-hand how such projects incorporate technology across transport, health, home, and business to create a network of interconnected digital experiences that enhance citizens’ lives and optimize their work and play.
China has aggressively developed smart cities, which monitor and seek to address common urban challenges like pollution, traffic congestion and widespread energy consumption through connected technologies. The government’s 12th Five-Year Plan announced in 2013 included the development of 103 smart cities, districts and towns.
Less radical and more pragmatic is India’s “Smart City Mission.” Initially investing in 90 cities to develop smart capabilities, this evolving, layered system solves specific issues such as clean water while organically developing smart integrations over time.
In the US, the “Smart City Challenge” saw more than 78 cities across the country enter the inaugural challenge focused on tackling 21st-century transport issues. Through shared innovation and intelligence the program nurtured ideas for an “integrated, first-of-its-kind smart transportation system that would use data, applications, and technology to help people and goods move more quickly, cheaply, and efficiently.”
On the opposite end of the scale, depopulated rural areas of the future will have fewer public transport options or none at all. In these regions Society 5.0 will see the provision of autonomous public services, including driverless taxis and buses for public transport, drone-based distribution and delivery services, and digital support for mental and physical wellbeing.
As with individuals’ health, social care for public infrastructure and services will become proactive in Society 5.0. This move will be the backbone of civil management in smart cities.
Installations like roads, buildings, tunnels and dams will be monitored by sensors supplying a continuous feed of data. This information will allow preemptive maintenance and efficient deployment of technicians with specialized skills.
As a result, accidents will be minimized, time and resources spent in construction and repair work will be reduced. Safety and productivity will increase.
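As a rough illustration of what preemptive maintenance looks like in practice, the sketch below monitors a hypothetical strain-gauge feed from a piece of infrastructure (the readings, baseline and tolerance are all invented for this example) and flags the moment the rolling average drifts away from its healthy baseline – before any hard failure occurs:

```python
from collections import deque

def drift_alert(readings, window=10, baseline=0.50, tolerance=0.10):
    """Return the index at which the rolling mean of sensor readings
    drifts beyond `tolerance` of `baseline` -- a cue to schedule
    maintenance before failure, not after it. None if no drift."""
    buf = deque(maxlen=window)
    for i, r in enumerate(readings):
        buf.append(r)
        if len(buf) == window and abs(sum(buf) / window - baseline) > tolerance:
            return i
    return None

# Hypothetical strain-gauge data from a bridge joint: stable around 0.5,
# then creeping upward as the joint slowly degrades.
stable = [0.5 + 0.01 * ((-1) ** i) for i in range(50)]
degrading = [0.5 + 0.004 * j for j in range(100)]
alert_at = drift_alert(stable + degrading)
print("maintenance flagged at sample", alert_at)
```

The rolling window smooths out sensor noise so a single spiky reading doesn’t trigger a callout, while a sustained trend does; real civil-monitoring systems would layer far more sophisticated models on top, but the proactive principle is the same.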
A declining rural population worldwide is leading to a labor shortage in agriculture. This, in a sector that is under increasing pressure to raise production while contending with more extreme climate patterns.
In Society 5.0, AI analysis of big data, such as meteorological data, crop-growth data, market conditions, and food trends and needs, will lead to hyper-efficient agricultural management.
These “intelligent” data-based decisions will be carried out by autonomous farming vehicles and machinery. From soil preparation to seed planting to crop collection, robots, drones and driverless farm equipment will take over many traditional farm labor roles.
The world population is expected to reach 9 billion by 2050. Only through AI and machine-optimized agricultural management will we be able to feed so many people.
Disaster prevention and response
As we see more examples of extreme weather around the globe, the future value of predictive climatological and geological information is becoming clearer and clearer.
As Society 5.0 unfolds, data acquired from terrestrial weather radar, satellites, geological sensors, drones and public observation systems will become invaluable. Processed in real-time using AI, this information will deliver those precious minutes or hours’ warning of impending disaster that can save lives.
Widespread access to mobile networks will allow safety and prevention broadcasts to be disseminated directly to end users, while devices can be used to geolocate individuals in trouble.
To those trapped by environmental disasters, relief and rescue materials can be delivered by drones, which will also be able to feed back video footage of victims’ state of wellbeing.
In a world of 9 billion people, much of the competition for resources will effectively be a competition for energy. Optimal energy creation and management will be crucial to a harmonious society.
As energy production moves more towards green alternatives like wind and solar, weather plays a more important role. Analysis of weather data and accurate prediction of weather patterns will be a key aspect of reliable electricity generation.
Big data processing by AI will also optimize electricity flows across the grid to meet fluctuations in demand and supply. This will be particularly important in smart cities, where responsive systems in buildings and public locations will manage energy down to the minute, and most forms of transport will become electric.
Though the Japanese Government was the first to formally use the term “Society 5.0,” we envisage a broader reach than that originally defined. We have borrowed it for this book because it speaks to the inclusivity we anticipate for a world in which the cyber and physical are fully integrated.
To that end, any catalogue of potential domains to be influenced by the advances of Society 5.0 must include an “Other,” simply because there will be no aspect of human endeavour that will go untouched. We could speak here of shipping, international travel, space travel, environmental management, genetics, arts and entertainment, sports – the list is endless.
Convergence with caution
The true power of Society 5.0 will lie in its degree of integration. As Shinzo Abe said, in Society 5.0 “we must cherish connectedness, above all else.” The more the cyber and physical worlds are combined, the greater the benefits we will experience.
However, the same is true of cyber threats. The more technology is incorporated into every corner of our social being, even our physical being, the greater the risk to our personal and collective safety.
Society 5.0 is built on an intricate network of sensors, devices, machines and systems – a vast internet of everything. Each of these components broadens the cyber attack surface, and each also raises the stakes when something goes wrong.
When technology is woven into the tapestry of all we do, it is not hard to see the potential dangers. Autonomous vehicles, AI-operated public transport systems, fleets of drones, critical disaster prevention processes – these can all be hacked.
That is true today, but, as we will explore in detail in Part 5 of this book, the difference in Society 5.0 is that all relationships are cyber-kinetic. Virtual events have physical results. People get hurt. Or worse.
A term first coined by the Japanese government, “Society 5.0” describes “A human-centered society that balances economic advancement with the resolution of social problems by a system that highly integrates cyberspace and physical space.” The fifth evolution of society, enabled by the fifth generation of cellular networking and cyber-physical systems, imagines technology, things and humans converging to address some of the biggest societal challenges. The concept encompasses Industry 4.0, the Fourth Industrial Revolution, the Smart-Everything World and other buzzwords of the moment.
The pandemic has accelerated our progress towards Society 5.0, albeit without corresponding advancements in cybersecurity and privacy. In the second book my son and I are writing we are highlighting the blind spots that might drag us down on our way to humanity’s next evolutionary step and offering potential ways to reconsider cybersecurity and privacy in Society 5.0. From the introduction to Securing Society 5.0 (Upcoming):
As we move into the third decade of the 21st century, humanity faces challenges of previously unimagined scale and complexity. The world grows smaller every day; all problems are to some extent shared global problems. We have been dramatically reminded of this fact by the recent Covid-19 pandemic, which started as a health emergency but soon evolved into a social and economic one, leaving no nation on earth untouched.
Many parts of the world economy ground to a halt. Despite record sums in fiscal stimulus and monetary interventions aimed at keeping companies open and citizens employed, the damage was sharp and extreme. Global unemployment rose by 33 million in 2020, a number which would have been far higher were it not for the job retention schemes that allowed companies to reduce working hours without closing jobs. Even these measures, though, could not stem the fallout in productivity, with working hours lost in 2020 equivalent to 255 million full-time jobs.
These overall figures do not, however, reveal how uneven the effect of Covid-19 has been across industries. Sectors like aviation, food and hospitality, arts and culture, and construction have been hit hardest, suffering far greater losses than higher-skilled service sectors, like information and communication, finance, and insurance, many of which have actually seen jobs growth.
The fundamental reason for this disparity in impact is quite a simple one: physical proximity. Those industries reliant on human contact, or at least humans working near each other, have been largely paralysed by regulations prohibiting physical interaction. Those industries in which companies and their workforces are able to operate remotely have typically incurred less damage. This does not, however, mean that organizations in those industries were prepared for remote working on the scale we have seen.
According to the World Economic Forum (WEF), until recently, working from home was a luxury for the relatively affluent. Only around 7% of U.S. workers had the option to regularly work from home, most of them “knowledge workers” such as executives, IT managers, financial analysts and accountants. The UK Office for National Statistics estimated the WFH contingent in the United Kingdom in 2019 was approximately 5%.
Though no authoritative figures have been compiled yet, the number of people currently teleworking around the world has multiplied dramatically. Whole organizations have moved online and connect via video conferencing.
In March 2020, Zoom was downloaded 2.13m times around the world in one day, up from 56,000 times a day two months earlier. The company’s share price doubled in the same time period.
Predictions of a future in which people communicate primarily online turned out to be off by decades. In one giant leap, we have landed in a virtual reality facilitated almost exclusively by digital applications. Yet communication is only one facet of a broader shift catalysed by the recent pandemic. A 2020 McKinsey Global Survey of 900 C-level executives reveals that companies have accelerated the digitization of their customer and supply-chain interactions and of their internal operations by three to four years. The proportion of digital or digitally-enabled products they offer has leaped forward seven years in a few months.
Much has been said elsewhere about the way in which Covid-19 has accelerated the digital transformation of commercial, industrial, and civic enterprises, but in our consulting work during 2020-21, one of the most striking aspects of this change has been its extension beyond digital. Companies have not only been upgrading their internal processes to digital or developing digital-centric products, they have been assertively integrating cyber and physical technologies to increase competitiveness and remove human dependencies from their value chains.
Recognizing the increased risk of future operations being slowed, interrupted or halted altogether by the outbreak of new viruses, businesses in sectors like manufacturing have brought forward plans for automation and cyber-management of their factories. For these clients, the cloud-based confluence of 5G, AI and Big Data is making autonomous operations a reality: production installations overseen by humans, but driven by digitally-enabled machines.
These examples are the realization of buzzwords like Fourth Industrial Revolution, Industry 4.0, and smart environments, visions of the future that all have one thing in common: cyber-physical systems. Cyber-physical systems represent the convergence of physical, digital, and biological spheres, and they will soon be ubiquitous in all areas of life.
Even before Covid-19 began its spread across the globe, the number of devices expected to be connected to the Internet of Things (IoT) by 2023 was 43 billion. Now, with businesses pivoting to create more cyber-physical products and digital services for an online-resident population, those numbers are probably gross underestimations.
It is because of this exponentially expanding Internet of Things that 5G has been such a controversial topic over the last few years, feeding geopolitical conflict, trade wars and relentless debates in the telecoms industry and beyond. 5G has not even been rolled out in most of the world, and nations have already come to regard it as critical infrastructure. Why? Because it will enable a massive Internet of Things (mIoT); because it is the structure that will unlock the unimaginable potential of cyber-physical reality.
“5G will transform lives of many in the UK and across the world by facilitating the Internet of Things,” says the UK Government.
The Government of Canada agrees: “The 5G networks are expected to play a much broader role in our lives by enabling wireless connectivity of an unprecedented variety of devices for an unimaginable number of services and applications.”
Australia states, “5G provides responsive digital technology required to support innovations such as robotics and the Internet of Things (IoT),” while the Government of the United States declares:
“5G is a fundamental shift in wireless infrastructure. More like the invention of the Gutenberg press than the move from 3G to 4G, it will move the world into the information age. Everything from automated cars and aircraft to advanced logistics and manufacturing to true AI enhanced network combat. Most communication on the network will move from mobile devices to machine to machine (M2M) traffic.”
These statements are less about 5G itself, and more about what it enables, and they echo the sentiments of almost all sectors of commerce and industry: the future is cyber-physical.
Despite distinct but parallel paths of evolution, humankind and technology have reached a time of unprecedented assimilation in which we and our tools are less and less distinguishable from each other. As with all unions, there are tremendous gains to be won, but there are also challenges.
The benefits of cyber-physical systems (CPS) extend far beyond sexy consumer products like self-parking cars and homes that change lighting according to your tastes. Humankind faces numerous existential threats, and a global network of cyber-physical devices may hold some of the clues to overcoming these obstacles.
However, technology is no panacea. Humankind’s latest technological revolution has been breathtaking in its pace and impact, but it has not been matched by concomitant progress in society, ethics and neurobiology.
Humankind’s survival and future success do not rely on technology alone, but on its conscious, balanced, and secure incorporation into social, industrial and economic systems.
A term first coined in the Japanese government’s Fifth Science and Technology Basic Plan, “Society 5.0” envisages an amalgamation of cyber and physical spheres to deliver exponential synergy in society’s operations. It describes a time of greater prosperity for all, achieved through the liberation of cyber-physical intelligence to create a “super-smart” society. Social contracts will be rewritten, economic models will be redefined, new solutions to nagging societal problems will be achieved through strategic assimilation of robotics, AI, big data, 5G (and beyond), and as-yet-unseen emergent technologies.
For the purposes of this book, we have expanded the term “Society 5.0,” while retaining its essential spirit. We mean it to include the Fourth Industrial Revolution, Industry 4.0, the Internet of Things (IoT), the Internet of Everything (IoE), and the many alternative concepts that are regularly used to describe the connectivity-driven integration of technology and human daily life. These labels all describe specific systems or trends, while Society 5.0 is a broader term describing an integrated cyber-physical ecosystem, a “system of systems.”
There are already many interpretations of what such a future may look like, ranging from the cynical to the utopian. Our job in this book is not to contribute to that collective pool of imagination, but rather to illuminate some of the practical considerations often overlooked by “futurists” and “tech prophets.”
“Securing Society 5.0” addresses the largely unexamined threats of cyber-physical ubiquity. It begins with an exploration of the context and history of Society 5.0, including its anthropological roots and the multi-systems challenges that have called this vision into being. The book then examines the defining characteristics and assumptions of Society 5.0, as well as the major technologies that will influence its success. Part Four investigates the shifts we can expect to see in the nature of society itself as the boundaries between cyber and physical become increasingly blurred, before Part Five considers the hidden threats of a cyber-physical world, and what can be done to ameliorate them. The book concludes with an affirmation of the need for systemic evolution, as envisaged in Society 5.0, with directions for its safe delivery.
In writing this book, we hope to offer a sober view of technology’s potential to help humanity evolve in a healthy way, while drawing attention to the blind spots that may drag us down. We stand on the threshold of a golden age, but one that will only be realized if we understand that what worked for us in the past may not work in the future. What kept us safe yesterday will not keep us safe tomorrow.
Only by appreciating this fact will we be able to access the full potential of humanity’s next evolutionary step.
5G Network Functions: As Ericsson explains, the 5G core “is built using IT network principles and cloud native technology. In this new architecture each Network Function (NF) offers one or more services to other NFs via Application Programming Interfaces (API). Each Network Function (NF) is formed by a combination of small pieces of software code called as microservices.” All of these Network Functions are virtual.
Cloud Native Functionality: The 5G core is cloud native, i.e., it leverages microservices, containers, orchestration, CI/CD pipelines, APIs, service meshes, etc.
Control & User Plane Separation (CUPS): This functionality is critical for 5G as it allows operators to separate the control plane, which can sit in a centralized location, from the user plane, which can be placed closer to the application it supports.
Edge Computing: With CUPS, the user plane can be moved closer to the edge of the network to meet low-latency requirements, enabling edge computing.
Network Slicing: This functionality leverages virtualization to logically partition shared physical resources so that each slice supports a specific service for a specific business need.
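To make the service-based idea behind these Network Functions more concrete, the toy Python sketch below models NFs registering and discovering services through a registry, the role the NRF plays in a real 5G core. All class, service and function names here are invented for illustration and are not 3GPP-defined APIs.

```python
# Toy model of the 5G core's Service-Based Architecture (SBA): each
# Network Function (NF) registers the services it offers, and consumer
# NFs discover producers through a registry (the role the NRF plays in
# a real 5G core). All names are illustrative, not 3GPP-defined APIs.

class NFRegistry:
    """Stands in for the NRF: maps service names to producer NFs."""

    def __init__(self):
        self._services = {}

    def register(self, nf_name, service_name, handler):
        # A producer NF advertises one of its services.
        self._services[service_name] = (nf_name, handler)

    def discover(self, service_name):
        # A consumer NF looks up who provides a service.
        return self._services[service_name]


def amf_authenticate(subscriber_id):
    # Hypothetical stand-in for an authentication service.
    return f"auth-token-for-{subscriber_id}"


registry = NFRegistry()
registry.register("AMF", "namf-auth", amf_authenticate)

# A consumer NF discovers the service and invokes it:
producer, handler = registry.discover("namf-auth")
token = handler("imsi-001010123456789")
print(producer, token)  # → AMF auth-token-for-imsi-001010123456789
```

In a real deployment each NF would be a containerized microservice and the calls would go over HTTP APIs, but the register/discover/invoke pattern is the same.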
This paper reviews the 5G Core Network architecture capabilities and current deployments of the major 5G vendors.
According to Dell’Oro, as of 2019, Ericsson and Huawei share the top two spots in MCN (Mobile Core Network) followed by ZTE and Nokia and then Cisco. This includes market share for both the legacy core as well as the 5G Core triggered by 5G Standalone (SA) launches.
In 2020 Huawei and ZTE increased their dominance over Nokia and Ericsson as China aggressively launched 5G Standalone ahead of other countries. Overall Huawei is the leading Telecom Equipment vendor followed by Nokia and Ericsson.
The 5G core market is projected to grow at a whopping 72% CAGR to $9.5B by 2025. As per Dell’Oro, this market is dominated by Ericsson, Huawei and Nokia, with Samsung and ZTE as the challengers. We review the 5G Core Network architectures as proposed by different players, divided into two categories:
Traditional Equipment Vendors: Nokia, Ericsson, Cisco, Huawei, ZTE
Disruptive Players: Samsung, Mavenir, Casa systems, Affirmed Networks.
Core Network Architecture Evolution from 4G to 5G
The Core Network architecture evolution depends on the mobile operator's choices. 3GPP has specified different options, shown in the following figure from GSMA. The options are grouped as SA (Standalone) vs NSA (Non-Standalone): SA refers to deployments that use only one radio access technology, while NSA refers to deployments in which both LTE and 5G radio access technologies are used simultaneously:
Option 1: The traditional eNodeB (eNB) connected to an EPC
Option 2: 5G Standalone (SA): a 5G NR gNodeB (gNB) connected to the 5GC
Option 3: A Non-Standalone (NSA) deployment in which the eNB is connected to a 4G core and the gNB is connected via the eNB. It comes in three variants, Options 3, 3a and 3x, with different connectivity options between the gNB and eNB
Option 4: A Non-Standalone (NSA) deployment in which both LTE and 5G NR radio access technologies are deployed and controlled through the 5GC alone. The eNB is routed to the 5GC via the gNB. It comes in variants 4 and 4a
Option 5: A standalone (SA) deployment; an evolved eNB connected to 5GC
Option 7: A Non-Standalone (NSA) deployment in which an evolved eNB acts as the master node connected to the 5GC, with the gNB as the secondary node. It comes in variants 7, 7a and 7x
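The options above can be summarised in a simple lookup table. This is an illustrative sketch under my own naming, not an official 3GPP data model, and the `master` field is recorded only for the dual-connectivity options where the master node is uncontested.

```python
# Illustrative summary of the 3GPP 4G-to-5G migration options discussed
# above. This is a toy lookup table, not an official 3GPP data model.

DEPLOYMENT_OPTIONS = {
    1: {"mode": "SA",  "radio": ["eNB"],        "core": "EPC"},
    2: {"mode": "SA",  "radio": ["gNB"],        "core": "5GC"},
    3: {"mode": "NSA", "radio": ["eNB", "gNB"], "core": "EPC", "master": "eNB"},
    4: {"mode": "NSA", "radio": ["eNB", "gNB"], "core": "5GC", "master": "gNB"},
    5: {"mode": "SA",  "radio": ["eNB"],        "core": "5GC"},
    7: {"mode": "NSA", "radio": ["eNB", "gNB"], "core": "5GC"},
}

def standalone_options():
    """Return the option numbers that use a single radio access technology."""
    return sorted(n for n, opt in DEPLOYMENT_OPTIONS.items() if opt["mode"] == "SA")

print(standalone_options())  # → [1, 2, 5]
```

A table like this makes the key split obvious at a glance: the SA options (1, 2, 5) each use one radio access technology, while the NSA options (3, 4, 7) combine LTE and 5G NR under a single core.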
The question arises: why migrate to a 5G core? The following reasons provide the rationale for the migration:
5G Core is Cloud Native
The 5G Core is being built with a cloud-native architecture based on microservices that can be reused to support other Network Functions. The cloud-native architecture will be built on CI/CD pipelines. Such an architecture speeds up development and improves operational efficiency through a DevOps approach.
5G Core enables Network Slicing
Network slicing is enabled by the cloud-native architecture. Multiple logical networks can be defined on the same physical infrastructure, enabling a mobile operator to support new services and business models for a variety of services like Massive IoT, Industrial IoT and Enhanced Mobile Broadband. The 5G use cases enabled by a 5GC include augmented reality, factory automation and mission-critical communications.
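A minimal sketch of the idea: slices expressed as logical profiles over shared infrastructure. The slice/service type (SST) numbering follows 3GPP conventions (1 = eMBB, 2 = URLLC, 3 = mIoT), but the profile fields, slice names and the selection helper are invented for illustration.

```python
# Toy network-slice catalogue: logical profiles defined over the same
# physical infrastructure. SST values follow 3GPP conventions
# (1 = eMBB, 2 = URLLC, 3 = mIoT); the rest is illustrative.

SLICES = {
    "embb-consumer": {"sst": 1, "max_latency_ms": 50,  "min_bandwidth_mbps": 100},
    "urllc-factory": {"sst": 2, "max_latency_ms": 5,   "min_bandwidth_mbps": 10},
    "miot-metering": {"sst": 3, "max_latency_ms": 500, "min_bandwidth_mbps": 1},
}

def pick_slice(required_latency_ms):
    """Choose the least demanding slice that still meets a latency bound."""
    candidates = {name: s for name, s in SLICES.items()
                  if s["max_latency_ms"] <= required_latency_ms}
    # Among qualifying slices, prefer the one with the loosest guarantee.
    return max(candidates, key=lambda name: candidates[name]["max_latency_ms"])

print(pick_slice(10))  # → urllc-factory
```

A factory-automation workload with a 10 ms latency bound lands on the URLLC-style slice, while a consumer broadband request would be served by the looser eMBB profile; the physical network underneath is the same in both cases.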
5G Core supports Edge Computing
The scenarios for edge computing include local breakout of traffic. As explained in the reference, “the reduction in latency, increase in service reliability and traffic and services isolation will contribute to an overall enhancement in the end-user experience. The list of capabilities goes on, but here are a few others:
Service exposure and traffic steering functionalities provide additional tools for service differentiation
Enhanced QoS model; more flexible than in 4G will allow multiple services (QoS flows) per PDU Session
Security is improved with enhanced key handling and a unified authentication model
Service differentiation per geographical e.g., control access to FWA services or other localized services”
5G Core Comparison
As part of this paper, we compare the 5G Core support of the major vendors.
5G Core Network by Ericsson
Ericsson claims the world’s first 5G Core and NR SA launch and, as per GlobalData, is a leader in 5G Core. “The solution has gained significant market momentum, which currently includes 64+ 5G contracts, 33+ live Non-Standalone (NSA) deployments, and 100+ Standalone (SA) trials in the planning or execution stages.”
5G Core Network by Huawei
Huawei has highlighted the importance of 5G Deterministic Network to provide a differentiated and deterministic experience to customers. “Deterministic Networking” builds on Network Slicing and Mobile Edge Computing.
Huawei talks about 5G deterministic networking (5GDN) that enables 5G use cases including 5GDN+smart devices, 5GDN+machine vision, 5GDN+AR man-machine collaboration, and 5GDN+AI+smart transportation/energy.
These use cases are possible because 5GDN SLAs guarantee reliability, service availability, etc. In the Industrial Internet, with its stricter requirements, the IEEE and IETF have defined standards (TSN and DetNet) to advance deterministic communication in industrial automation, vehicle management, and other fields.
5G Core by Nokia
Nokia describes its 5G core offering as follows:
“Done right: it is cloud-native and infrastructure-agnostic by design. Deploy it on any cloud – private or public, centralized or distributed, with an optimized performance footprint for any deployment model.
Done now: it simplifies the complexity with the latest technology to boost the top line and lower costs. Open and programmable, it creates an innovation engine for a strategic business advantage – today.
Made real: it meets stringent reliability & quality requirements, because it is created and delivered by Nokia with its broad portfolio and global experience, including hundreds of core deployments (Cloud Packet Core, VoLTE, SDM, Policy, Charging, Signaling, etc.)”
5G Core from ZTE
ZTE has been aggressively testing 5G SA with Orange and has launched 5G SA with MTN Uganda. ZTE has a 5G E2E slicing architecture and has been pursuing industrial automation opportunities with 5G technology.
ZTE offers a vision of what a successful 5G Core deployment looks like in a graphic spanning the connected house, connected things, connected city, connected people, connected health and connected transportation.
At Mobile World Congress in 2019, ZTE presented the “Enhanced 5G Core, Enabling 2B New Business”.
Cisco 5G Core
Cisco gained a presence in the 4G packet core with its acquisition of Starent in 2009, and it continues to build on that acquisition for a piece of the 5G Core business. Cisco has a very strong IP networking and security portfolio that it adds to its existing offering to position itself as a key 5G security player. The following figure shows the cloud-native 5G Core with network slicing and mobile edge computing, and the importance of an end-to-end security layer for a 5G network. The security aspects covered include:
As per GlobalData [Link updated Feb 2022] the disruptors in the mobile core space are Affirmed Unity Cloud (acquired by Microsoft), Samsung 5G Core, Casa Systems Axyom 5G Core and Mavenir 5G Core. A quick overview as per GlobalData:
Affirmed Unity Cloud
Affirmed Unity Cloud is being deployed by Inventec, CHT, AT&T, DNA, Millicom and Netmore, showing early customer momentum. Microsoft’s acquisition can be positive from a funding perspective but could dilute its laser focus on mobile core solutions. Affirmed, as part of Microsoft’s Cloud business unit, may be challenged to maintain a cloud-neutral stance regarding third-party clouds.
5G Core by Samsung
Samsung has demonstrated market momentum and operational experience via penetration in Korean telco operators deploying early 5G standalone (SA) networks. Samsung is well positioned in O-RAN to deliver end-to-end solutions based on open RAN standards. Samsung’s open-source PaaS plus its Samsung Cloud Orchestrator (SCO) provides an effective automation platform. Samsung’s limitations include a limited marketing presence outside of the Korean telco market, and it may take some time to transition from trials to significant deployments. Samsung has published a whitepaper on its cloud-native 5G core and 5G migration strategy.
The white paper also reviews the 4G to 5G migration options.
Samsung’s whitepaper lays out the evolution path towards 5G NSA + SA + WiFi in the years to come.
Mavenir 5G Core
Mavenir’s strengths include integration with ONAP and ETSI-based MANO solutions, which appeals to operators for management and orchestration. Mavenir is highly visible in the Open RAN Policy Coalition, working to bring open and interoperable solutions to the RAN, and has established engagements with operators such as Dish Network and Vodafone Idea, providing it with a basis to deploy its 5GC and Open RAN solutions. Mavenir utilizes cloud-native technologies to interwork with legacy protocols. Mavenir’s limitation is that it has not named operators who are using its 5G core in trials or commercial deployments.
From a Light Reading article: “Mavenir has enjoyed significant mobile core network wins in Europe, India and Japan. Top operators, including Turkcell, Telefónica and Deutsche Telekom, alongside upstarts like Rakuten Mobile in Japan and Dish Network in the US, have purchased the company’s offerings.”
Casa 5G Core
Casa has not publicly announced 5G engagements with service providers; it notes engagements in PoCs and trials.
Gartner’s Magic Quadrant for 5G Infrastructure Providers
Gartner recently updated its Magic Quadrant for 5G vendors, which shows the end-to-end competitive landscape. The capabilities of 5G infrastructure include:
Radio access network equipment, radio units (RU) and baseband units (BBU) for 5G New Radio and 4G LTE:
Passive antennas, RU, AAU, vBBU, BBU, DU, CU, vDU, vCU, small cell
Core network equipment, including 5G next-generation core and evolved packet core (EPC):
MME, S-GW, P-GW, IMS, HSS, PCRF, EPC/vEPC for 4G LTE
This paper compares the major 5G Core Network vendors, their features and their customers. The paper describes the leaders, Huawei and Ericsson, followed by Nokia, ZTE and Cisco. Then we compare some of the disruptors in the 5G Core Network space, including Affirmed, Mavenir and others.
In 2021 there will be a major move towards 5G SA to realize the distinctive 5G use cases that are not possible with a legacy 4G core, which will drive an increase in investment. These vendors are poised to be the winners in this race.
The telecoms and digital technologies sectors are notoriously jargonised. Eavesdrop on any conversation at an industry conference (remember those?) and you’d be treated to a parade of acronyms, initialisms and technical terms that would sound like ancient Greek to an outsider.
However, with new technologies being developed and deployed at an accelerating rate, staying on top of terminology can be challenging for even seasoned professionals. Nowhere is this more apparent than in the evolving debate around Open RAN and its applications. Open RAN and its variations have become the next “big thing” in wireless, so I thought I’d try to help clarify some of the related terms that often get confused.
People often mean different things when they talk about Open RAN. Some are referring to network specifications, others are describing a philosophy. Add to this mix the different industry Open RAN groups, a multiplicity of spellings (often within the same article) and creative hashtags that populate social media, and it’s easy to see how conversations on this topic quickly become confused.
What follows is a lexicon of Open RAN-related terms and definitions that will hopefully help you cut through the noise. But first, some background:
The radio access network (RAN) is a critical part of network infrastructure. It is also one of the most expensive. Traditional RAN setups are hardware heavy and require major CapEx to build the foundation of a wireless network.
But the costs are not only significant in capital investment. RAN operating expenses are also high, unnecessarily so according to a growing number of network operators and suppliers. Theoretically, a radio access network built by a particular vendor to 3GPP standards should be interoperable with devices or components produced by any other vendor satisfying the same specifications.
In practice, though, vendors usually construct RAN setups with proprietary software and interfaces built on top of hardware developed by the same vendor. For the telecoms operator, the long-term cost of this inflexibility can be punitive. Many find themselves locked-in to unfavourable vendor contracts with little to no control over the upgrades and security of their RAN components.
The movement towards Open RAN has grown primarily in response to the gated nature of legacy RAN deployment and management. The cause is driven by operators hungry for the cost benefits of greater competition and prospective suppliers currently unable to break into a market dominated by a handful of monolithic vendors.
Open RAN refers to a disaggregated approach to deploying and managing radio access network functions, by using open interfaces between network elements. This aims to increase interoperability through vendor-neutral hardware and software-driven technology developed according to community-agreed standards.
Perhaps because Open RAN encourages a less restrictive and more accessible approach it is often conflated with open source, but they are not the same thing.
Some people also use Open RAN more generally, as an umbrella term to describe a collection of technologies, including vRAN and C-RAN (see below), that support the disaggregation of RAN elements.
OpenRAN (one word) is seen regularly online and is employed in one of three ways:
Used interchangeably with the term Open RAN
Used in social media posts, often with a hashtag: #OpenRAN to refer to any Open RAN-related technology
Used to describe the OpenRAN Telecom Infra Project tasked with defining and building 2G, 3G, 4G and 5G RAN solutions based on general-purpose, vendor-neutral hardware and software-defined technology
O-RAN
Refers to the O-RAN Alliance, an industry group working to develop new standards for open and intelligent RAN, provide open software development for the RAN, and support member organisations in testing and integrating O-RAN implementations.
The global O-RAN Alliance was born from a merger of the C-RAN Alliance and xRAN, and brings together more than 160 mobile operators, vendors, and research and academic institutions.
According to the group, O-RAN focuses on technical aspects of the RAN and stays neutral in any political, governmental or other areas of any country or region. O-RAN does not get involved in any policy-related topics.
In addition to the O-RAN Alliance and the above-mentioned Facebook-backed Telecom Infra Project (TIP), other Open RAN-related industry groups have started forming and influencing the development and deployment of open, disaggregated, and standards-based RAN approaches. One worth mentioning is the newly-formed Open RAN Policy Coalition, which promotes policies to advance the adoption of open and interoperable RAN solutions.
oRAN or ORAN
Often used as shorthand for the Open RAN movement in general.
On social media, however, #oRAN or #ORAN may refer to either the Open RAN movement or the O-RAN Alliance.
vRAN
In Virtual RAN (vRAN) the RAN functions of the baseband unit (BBU) are virtualised on a commercial-off-the-shelf (COTS) server. Theoretically, this allows different components of the baseband and radio software and hardware to be supplied by different vendors.
However, in practice, the interfaces between the BBU at the bottom of the cell tower and the remote radio unit (RRU) at the top of the tower often remain proprietary, meaning that an RRU from one vendor can require software from the same vendor to run on the COTS-based BBU. As a result, vRAN can still lead to vendor lock-in.
So, even though vRAN is a more open and flexible architecture and the virtualisation of network functions is a key principle of Open RAN, virtual RAN does not equal Open RAN.
In Open RAN, the proprietary interfaces between the baseband unit and the remote radio unit are replaced with open interfaces. So, any vendor’s software can work on any open RRU.
C-RAN
Cloud RAN or Centralised RAN. Starting about 10 years ago, this was an important first step towards disaggregating the radio access network. C-RAN sees the baseband unit relocated from the radio site to a data center, where it is combined with other BBUs to form a pool of centralized resources that function as a cloud.
C-RAN relies on a fibre-based fronthaul – the connection layer between a BBU and RRU (or multiple RRUs) – and, as a result, has traditionally been limited to high-density or urban areas.
C-RAN has many benefits over legacy RAN and, like vRAN, has contributions to make to Open RAN, but it is not open in the Open RAN sense and is still susceptible to vendor lock-in.
It’s been a year of contradictions for the telecommunications industry.
Like most sectors, it has been heavily impacted by the consequences of the Covid-19 pandemic, with a slowdown in global 5G roll-outs being a notable result. Geopolitical conflicts have continued to muddy the market, with governments playing a more active role than ever in setting telecoms-centred policy.
At the same time, however, the air is thick with promise and opportunity. Over the last nine months, entire organizations have transitioned to remote working and high-bandwidth video communication. Corporations have accelerated digital transformation initiatives. Online shopping has soared. The appetite for autonomous manufacturing and other aspects of Industry 4.0 has grown.
The telecoms industry has proven resilient and cemented its status as critical national infrastructure. This recognition has raised the already-elevated stakes for governments who see mobile networks as key to national security. And it has made even more urgent the debate about the fastest and safest way to evolve the telecoms ecosystem.
Over the last few years, much of the focus has been on the emergence of 5G, but as this process has evolved a fierce debate has developed around the best way for network operators to implement 5G while remaining sustainable and relevant in a rapidly shifting landscape. Open RAN has been central to these disputes, receiving massive attention from vendors, operators, and policymakers.
This is understandable: the potential benefits of open radio access networks certainly are alluring. But, as with 5G itself, the security of open RAN configurations will need to be considered carefully if we are to harness the technology's full potential.
Virtual or open?
Virtualized radio access networks (vRAN) and open RAN are both hot topics in the mobile industry for different, but complementary, reasons. Though the two approaches often work in unison, they serve different purposes.
vRAN has its origins in network functions virtualization (NFV) which shifts network architecture from hardware-based to software-based. Similarly, in virtualized radio access networks software is decoupled from hardware and radio access network functions are run on commercial off-the-shelf (COTS) servers.
In both cases, service providers are principally trying to save time and money, firstly by speeding up the deployment of new network services, and secondly by reducing operating costs and capital expenses.
Though vRAN offers rewards in greater efficiency and lower costs, it does not necessarily alter the current infrastructure supply chain. Almost all existing vendors are working on virtualizing their existing products.
Open RAN, on the other hand, represents a dramatic departure from a restricted vendor base. Advocates maintain that it offers telcos a cost-cutting alternative to traditional management of the radio access network, one of the costliest parts of the infrastructure.
Conventional network interfaces do not support interoperability between different suppliers, with the result that operators are locked into closed arrangements with single vendors. Proprietary hardware and software are tightly-coupled and closed to adaptation.
Open RAN sees a relaxation of these restrictions. Though such networks generally include virtualization, they are defined by their goal of opening up interfaces within and between the different elements in the radio access network: radio unit (RU), distributed unit (DU), and centralized unit (CU).
An example of the challenges that open RAN is trying to solve can be found in the interface linking radios and signal-processing equipment. Open RAN proponents regard this fronthaul interface, known as CPRI (common public radio interface), as incomplete. Currently, the only way for an operator to deal with this issue is to buy radios and signal-processing technology from the same vendor, usually one of the monolithic RAN suppliers.
In a more open system built on interoperability, that service provider would not be restricted in their choice of hardware or software supplier. They would have greater freedom in how they resolved technical concerns, being able to use one supplier’s radios with another’s processors.
The O-RAN Alliance, a specification group defining next-generation RAN infrastructures, has defined 11 interfaces for open RAN, covering the fronthaul (RU to DU), the midhaul (DU to CU), and the backhaul (connecting the RAN to the core).
These give operators the freedom to mix and match components from a growing number of suppliers, thereby inviting more diversity, competition and innovation into the supply chain.
Why open RAN?
In short, open RAN should offer telcos a more cost-effective and adaptable solution than traditional radio access networks. That’s the commercial reason. However, as trade wars have dragged on and the US-led campaign against Huawei and ZTE has gained momentum, open RAN has taken on political importance.
In the US, for example, it has been identified as a way to circumvent the need for Chinese network hardware, ostensibly eliminating much-publicized backdoor threats. However, a move to open RAN would also buy freedom from reliance on other international suppliers, notably Finnish Nokia and Swedish Ericsson. It is here that political and commercial motives meet.
Conventional radio access network arrangements see operators locked into agreements with a few big vendors who maintain ownership of RAN processes through proprietary equipment and services. It’s what’s been called an “oligopolistic vendor landscape” in which operators have little control, limited insight into RAN security and operations, and a paucity of choice.
While this has always been frustrating for telcos, the crises of the last year have laid bare the risks of persisting with a closed supply chain. Disruptions caused by Covid-19 have exposed a clear need to build supply chain resilience and security through greater supplier diversity. And, in trying times, it has become especially clear how much innovation is limited by restrictions on the telecoms supply chain.
These points underlie a growing wave of open RAN advocacy that envisages a brave new world of greater RAN efficiency, intelligence and versatility. According to the O-RAN Alliance, the radio access networks industry is moving towards “open, intelligent, virtualized and fully interoperable RAN.”
Industry groups like the O-RAN Alliance, the Facebook-initiated Telecom Infra Project (TIP), and the recently-formed Open RAN Policy Coalition are supported by a broad spectrum of stakeholders, including major vendors like Nokia and Ericsson. This appears to spell an acceleration in open RAN adoption.
ABI Research estimates that open RAN will outstrip traditional RAN within the decade, reaching a total market of approximately $30 billion in 2030, compared to $20 billion in the traditional RAN market.
These are not wild projections. Instead, they seem predicated on an already high level of open RAN activity. In Japan, Rakuten Mobile has launched open RAN-based 4G commercial services in urban areas and is currently building its 5G network to O-RAN specifications. Another greenfield operator, DISH, is preparing a significant open RAN network build in the US.
Meanwhile, suggestions that open RAN is only fit for new developments are being put to the test by a growing number of established operators. Telefónica, Deutsche Telekom, Vodafone, Orange, and Turkcell are all working on open RAN deployments.
There are a number of anticipated benefits driving these shifts.
The most obvious is that open RAN widens the supply chain, which is music to free market ears. More suppliers mean more competition at different layers in the hardware and software supply chains, translating into lower OpEx and CapEx for telecom operators.
Interoperability means telcos can ensure that they are making use of best-of-breed components with a reduced chance of vendor lock-in. This flexibility also ensures more progressive network updates and faster ecosystem evolution.
Virtualization and disaggregating hardware from software create a more agile network with lower deployment times and a better ability to scale at pace. New features can be added more quickly for specific use cases, while operators can provide enterprise-level services to support Industry 4.0.
Open vRAN also permits edge-centric network architecture. The only site installation setup required is a radio plus power which, when coupled with mini data centres built closer to subscribers, translates into a flexible and scalable footprint that can support low latency applications – just one example of the potency of connection between open RAN and 5G.
Open RAN and 5G
Virtualized RAN may prove critical if 5G networks are to realize their projected performance standards. Cloud-based network functions will not be sufficient – all aspects of the 5G architecture will need to be virtualized in order to fully access 5G’s potential.
In vRAN, functions of the baseband unit (BBU) are enabled virtually through virtual machines (VMs) on centralized servers, while controller functions can be moved closer to the edge of the network. With these expanded options, operators can exercise greater (and more cost-efficient) control over their radio resources.
By separating network functions from the underlying hardware, vRAN enables an agile and dynamic RAN ecosystem characterized by streamlined resource utilization and more responsive deployment of new network services. This will be crucial to the operation of a smooth 5G network.
The O-RAN Alliance paints the picture of a RAN ecosystem based on interoperability and intelligence. It is the second principle, intelligence, that is especially pertinent to 5G networks, which will traffic massive amounts of data created by the internet of things (IoT), high-definition video, AR and VR.
5G will also see the deployment and management of countless virtual applications and their relationships. This will be beyond human capacities; the network will need to be intelligent. Though this kind of intelligence may be satisfied through virtualized networks, it is possibly through the accelerated innovation of open radio access networks that the 5G network will be able to evolve fastest.
For operators deploying 5G on legacy networks, open RAN is a burning question. As shown by Rakuten and DISH, the decision is relatively simple for greenfield deployments – open RAN supports a future-ready, scalable and upgradable software-driven network.
But for brownfield deployments, the considerations are more complex. Should open RAN only be initiated for 5G networks, or across all legacy Gs? What will be the long term impacts on CapEx and OpEx, as well as the operator’s total cost of operation? For many service providers the commitment to 5G may provide an incentive to consider open RAN across all network generations.
Though cost is the primary driver of the open RAN proposition, the debate has also rested on security. And rightly so. With such a strong potential for alignment between 5G and open RAN, the security of more open radio access networks is critical to national security.
This argument has featured strongly in political rhetoric and statements from pro-open RAN organisations. In the US, filings to the National Telecommunications and Information Administration (NTIA) by the Open RAN Policy Coalition and open RAN vendor, Mavenir, have suggested that open RAN is imperative to securing 5G.
Skeptics say such players are motivated by the enormous commercial opportunities that would become available in the widespread adoption of open radio access networks. But the security argument is a strong one.
In closed RAN, operators rely on vendors to maintain security and manage threats like back doors. The ability to respond to these threats is also determined by the efficacy of the vendor’s proprietary technology. The inflexibility of the supply chain limits telcos’ level of RAN insight and responsiveness.
Theoretically, the vendor diversity of a more open RAN could create the conditions for more responsive and dynamic network security. If threats or vulnerabilities are identified, the operator can move quickly to swap out the offending component without having to undertake a costly and extensive rip and replace, of the kind currently being applied to Huawei hardware in the US.
Ericsson has publicly questioned this line of thinking, arguing that “The introduction of new and additional touch points in O-RAN architecture, along with the decoupling of hardware and software, has the potential to expand the threat and attack surface of the network in numerous ways.” The vendor also maintains that the virtualization of network services could contribute to security challenges.
These claims have been rejected by CTOs for Rakuten and Telefónica, for example, who’ve reiterated the belief that open, non-proprietary networks will provide greater network security options. From the operators’ point of view, having 100% end-to-end visibility of the network is advantageous in monitoring security and pre-empting breaches.
Ironically, the greater freedom that defines open RAN could offer a route to tighter operator control, improved accountability and stronger security. The success of this approach will rest on strong standards supported by rigorous 3rd party testing – having the option to swap or upgrade components from multiple vendors is useless if those components aren’t safe.
However, as with 5G, the reality of an expanded attack surface in open RAN is a real concern. Operators appear confident that they will be able to take this challenge on, but their bullish mood remains to be tested.
The spirit of the open RAN movement is a positive one that should lead to a more democratized, innovative, lower cost and, hopefully, safer 5G-driven ecosystem. But the stakes are high and the path is new. We need to balance pioneering zeal with healthy caution if we are to create the secure networks that will usher in a new age of global connectivity.
In the recent report by IHS Markit – “The 5G Economy – How 5G will contribute to the global economy” – researchers claimed that manufacturing will garner almost $4.7 trillion in sales enablement by 2035, or 36% of the $13.2 trillion total opportunity of 5G by that date. Manufacturing will be by far the largest industry beyond mobile to be impacted by 5G.
In the manufacturing sector, adoption will benefit in the short-to-medium term from enhanced indoor wireless broadband coverage. Other early use cases include asset tracking, such as visibility over incoming and outgoing components and goods in the supply chain; remote access solutions that enable machine maintenance over the internet; and industrial automation, such as the continued automation of robots and connectivity for moving assets such as AGVs.
5G provides many of the network characteristics essential for manufacturing, such as low latency, high reliability and high connection density. These are requirements for which manufacturers currently rely on fixed-line networks. 5G will allow for higher flexibility, lower cost and shorter lead times for factory floor production reconfiguration, layout changes and alterations.
The most impactful and most critical use cases, however, will also require network protocols that, more than “traditional” IT protocols, include elaborate mechanisms to provide fault-tolerant network paths and precise time synchronization. As modern industrial networks shift from strongly compartmentalized communication architectures towards communication from the sensor to the cloud, time synchronization becomes even more important. These use cases require precise time synchronization all the way down to the device level.
In industrial automation, information about physical events captured by wirelessly connected, spatially distributed sensors has to be synchronized. Individual devices must be carefully synchronized to reconstruct the relative chronological order of occurrences and extract correlation patterns from the event data. Time-synchronized coordination among machines and robots is crucial, particularly in closed-loop motion control (such as packaging, printing or symmetrical welding), in which machines execute meticulously sequenced real-time tasks isochronously.
Such time synchronization is also required for energy efficient radio scheduling and distributed coordination within the network.
Therefore, to realize all of the intended benefits of 5G-enabled industrial manufacturing, we need to add to the mix a deterministic, real-time communication with very accurate time synchronization.
In this article I will try to explain the importance of Time-Sensitive Networking in the context of Industry 4.0. The time sensitivity requirements of various industrial applications (including periodic/deterministic, aperiodic/deterministic and non-deterministic) are examined. The building blocks of the TSN standards (IEEE 802), including Traffic Shaping, Resource Management, Time Synchronization and Reliability, are explained. Finally, the enablement of Time-Sensitive Networking for industrial automation with URLLC in 5G is discussed.
The world has seen industry evolve over time. At certain stages there has been a “revolution” rather than an evolution:
1st Revolution: “Steam power” introduced to mechanize industrial production ~ Late 18th Century
2nd Revolution: “Mass production” pioneered by automotive manufacturers ~ Early 20th century
3rd Revolution: “Digital” or programmable electronic systems, robots automate production lines ~ Early 1970s
4th Revolution: “Internet of Things” with time sensitive connectivity to the cloud to maximize automation
The 4th industrial revolution is based on:
Internet of Things (IoT) devices that increase automation, improve communication and enable self-monitoring
Smart machines that analyze and diagnose issues without the need for human intervention
Increased sharing of data across multiple systems and participants in the manufacturing process
A critical component to enable Industry 4.0 is industrial IoT.
Connectivity Evolution for Industrial IoT
According to Omdia, “Connectivity is one of the fundamental pillars upon which the industrial IoT (IIoT) is built. And over the last few decades, industrial connectivity, in particular, has evolved considerably, especially in response to the ever-changing requirements of the manufacturing industry.” The article continues to illustrate how industrial connectivity has evolved over the past 4 decades:
1980s: “Discrete wires” to communicate with field devices
1990s: “Fieldbus” industrial networking technology and a controller to communicate with field devices
Today: “Ethernet” is deployed with the fieldbus and wireless technologies
2020+: Time-Sensitive Networking (TSN) will be deployed
The time sensitive nature of industrial automation is explained by an example from high speed packaging:
Machines that fill jars or bottles with food products need millisecond-level precision in the timing of the signals that control the process of placing, filling, removing and sealing the containers.
A warning from a machine tool that it has, for some reason, failed to fully offload the component it has just manufactured must reach the robot seeking to load the next component before it makes the attempt.
Missed connections and millisecond delays in communication between robotic systems can cause products and perhaps production machines to become unsynchronized, often leading to damaged products or even damaged machines.
The same article mentions another example of a failure with catastrophic consequences: a system failing to respond immediately to a warning of over-pressure in a boiler.
Time Sensitive Requirements in Industrial Automation
Industrial systems therefore need to guarantee that an event will occur precisely when expected, with no scope for variability. Standard Ethernet, due to its Carrier Sense Multiple Access (CSMA) nature, cannot provide this level of determinism for industrial networking. In an IEEE paper, the authors describe multiple industrial IoT use cases that require low latency. The industrial automation use cases referenced have the following traffic requirements:
Deterministic, periodic with stringent latency requirements
Motion control responsible for controlling moving and/or rotating parts of machines (e.g. printing machines, machine tools or packaging machines)
Control-to-control communication between industrial controllers e.g. an assembly line
Mobile robot able to fulfil a large variety of tasks usually following programmed paths
Deterministic, aperiodic with less stringent latency requirements, or non-deterministic requirements
Mobile control panels with safety functions (safety panels) used for configuring, monitoring, and controlling machines, robots, or production lines. Safety control panels are also typically equipped with an emergency stop button. They require transmission of non-critical data (non-deterministic traffic) for the configuration, monitoring, and maintenance of the machines. They also require the transmission of highly critical and unpredictable safety data with stringent latency requirements (deterministic aperiodic traffic) when pressing the emergency stop button.
Process automation (P.A.) – closed-loop control, for example when several sensors are installed in a plant and each sensor makes continuous measurements. The latency and determinism in this use case are crucial. Closed-loop control produces periodic and aperiodic traffic with strict latency requirements (i.e. deterministic traffic). The traffic is aperiodic if for example the sensor only transmits data when a certain threshold is exceeded. It is periodic if the sensed data must be periodically transmitted to maintain the industrial process active.
Process automation (P.A.) – plant asset management: In this use case, sensors collect data about assets. This data must be transmitted for storage and processing within a defined time interval (deterministic aperiodic traffic). The data is used to continuously diagnose assets and components and to detect (or even predict) any possible degradation.
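These traffic classes can be made concrete with a small model. The sketch below (with illustrative stream names and numbers of my own, not values taken from any standard) separates the streams that need TSN-style bounded-latency treatment from those that can ride along as best effort:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Stream:
    name: str
    deterministic: bool         # bounded-latency guarantee required?
    period_ms: Optional[float]  # None -> aperiodic / event-driven
    deadline_ms: float

# Illustrative streams loosely mirroring the use cases above
streams = [
    Stream("motion control",    True,  0.5,  0.5),
    Stream("emergency stop",    True,  None, 10.0),    # aperiodic but critical
    Stream("asset diagnostics", True,  None, 100.0),
    Stream("config/monitoring", False, None, 1000.0),  # best effort
]

# The deterministic streams are the ones TSN must schedule or police;
# the rest can share the same converged network as best-effort traffic.
tsn_streams = [s.name for s in streams if s.deterministic]
print(tsn_streams)  # ['motion control', 'emergency stop', 'asset diagnostics']
```

Note how the emergency stop is aperiodic yet still deterministic: it has no schedule, but when it fires, its deadline is hard.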
Time-Sensitive Networking (TSN)
Time-Sensitive Networking (TSN) is a key technology to realize this fundamental change. TSN is a set of IEEE 802 Ethernet sub-standards, defined by the IEEE TSN Task Group that enables deterministic real-time communication.
In an article on Time-Sensitive Networking and 5G, Ericsson explains: “TSN provides guaranteed data delivery in a guaranteed time window; that is, bounded low latency, low-delay variation and extremely low data loss. TSN supports various kinds of applications having different QoS requirements: from time- and/or mission-critical data traffic, for example, closed-loop control, to best-effort traffic over a single standard Ethernet network infrastructure; in other words, through a converged network. As a result, TSN is an enabler of Industry 4.0 by providing flexible data access and full connectivity for a smart factory.”
The main goal of a Time Sensitive Network is to provide deterministic services over IEEE standard 802.3 Ethernet wired networks. This means guaranteed packet transport with low and bounded latency, low packet delay variation, and low packet loss. TSN features can be enabled for specific data streams in a network that also handles best effort type of traffic.
TSN enables deterministic data transfer by splitting time into repeating cycles by means of the TDMA (Time Division Multiple Access) method. Within these cycles, time slots are reserved for high-priority data streams, which need to be protected from other network transmissions. This creates virtual channels from one terminal device connected to the network to another. These channels are closely linked to the internal clocks of the participating network members. In order to achieve high precision for time synchronization, TSN usually uses the Precision Time Protocol (PTP) in accordance with IEEE 1588.
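The protected-slot idea can be illustrated with a deliberately simplified toy model of an 802.1Qbv-style gate (the cycle and window lengths below are made up for illustration; real gate control lists are per-port, per-queue schedules configured by the network controller):

```python
def gate_open_for(queue: str, t_us: float,
                  cycle_us: float = 250.0, protected_us: float = 50.0) -> bool:
    """Toy 802.1Qbv-style gate: the first protected_us of every cycle is
    reserved for the time-critical queue; the rest of the cycle is open
    to best-effort traffic."""
    phase = t_us % cycle_us
    return phase < protected_us if queue == "critical" else phase >= protected_us

assert gate_open_for("critical", 10.0)          # inside the protected window
assert not gate_open_for("best_effort", 10.0)   # best effort must wait
assert gate_open_for("best_effort", 300.0)      # phase 50 us: window has closed
```

Because the gate states repeat every cycle and all bridges share a common clock, a critical frame injected at the right phase never contends with best-effort traffic.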
TSN has typically been targeted at wired networks because it requires very low latency, and it operates at the link layer of the network; the 3GPP 5G standards and the 802.11 Wi-Fi standards, by contrast, focus on the wireless communications layers.
However, the latest 5G and 802.11ax Wi-Fi (Wi-Fi 6) standards, which support ultra-reliable low latency communications (URLLC), make TSN over wireless networks an exciting possibility. These standards introduce different scheduling mechanisms than previous wireless standards, allowing for more efficient scheduling of simultaneous transmissions from multiple devices. This can eliminate delays and make it possible to provide bounded latency and high reliability in wireless communications, something that was practically impossible previously.
TSN standards can be seen as a toolbox that includes several valuable tools categorized into four groups – Traffic Shaping, Resource Management, Time Synchronization and Reliability.
Traffic shaping guarantees the worst-case latency for critical data by various queuing and shaping techniques and by reserving resources for critical traffic:
The Scheduled Traffic standard (802.1Qbv) provides time-based traffic shaping.
Ethernet frame preemption (802.3br and 802.1Qbu) can suspend the transmission of a non-critical Ethernet frame, which helps decrease the latency and latency variation of critical traffic.
Resource management is defined by the TSN configuration models (802.1Qcc):
Centralized Network Configuration (CNC) can be applied to the network devices (bridges).
Centralized User Configuration (CUC) can be applied to user devices (end stations).
Time synchronization is based on the generalized Precision Time Protocol (gPTP) (802.1AS):
It is a profile of the Precision Time Protocol standard (IEEE 1588)
It provides reliable time synchronization and can be used by Scheduled Traffic (802.1Qbv).
Reliability is provided by Frame Replication and Elimination for Reliability (FRER) (802.1CB), which protects data flows through a per-packet-level reliability mechanism.
It provides reliability by transmitting multiple copies of the same data packets over disjoint paths in the network.
Per-Stream Filtering and Policing (802.1Qci) improves reliability by protecting against bandwidth violation, malfunctioning and malicious behavior.
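The FRER behaviour is easy to picture with a minimal sketch: the sender replicates each sequence-numbered frame over disjoint paths, and the receiver keeps only the first copy of each. (This is an illustration only; a real 802.1CB recovery function uses a bounded sequence-history window rather than an unbounded set.)

```python
class FrerEliminator:
    """Toy sketch of 802.1CB duplicate elimination at the receiver."""
    def __init__(self):
        self.seen = set()

    def accept(self, seq: int) -> bool:
        if seq in self.seen:
            return False      # duplicate already arrived via the other path
        self.seen.add(seq)
        return True

rx = FrerEliminator()
path_a = [(1, "cmd"), (2, "cmd")]
path_b = [(1, "cmd"), (3, "cmd")]   # frame 2 was lost on path B
delivered = [seq for seq, _ in path_a + path_b if rx.accept(seq)]
print(delivered)  # [1, 2, 3] - every frame arrives exactly once
```

The point of the mechanism is visible in the example: frame 2 is lost on one path, yet the application still receives every frame exactly once, with zero retransmission delay.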
The Time-Sensitive Networking Profile for Industrial Automation, IEC/IEEE 60802, specifies the application of TSN for industrial automation, and also gives guidance on what 5G needs to support.
5G Specifications and TSN Requirements
The 3GPP 5G NR Release 16 specification is focused on enabling Industrial Internet of Things (IIoT) communications. Release 16 includes latency and reliability enhancements that build on the already very low air-interface latency and high reliability provided by Release 15. The Release 16 approach is to integrate TSN over the top: TSN time domain information is distributed between the TSN translator functions in the network and the device using the 802.1AS standard protocol. More work is expected in 3GPP Release 17.
5G specification includes several functionalities especially around the 5G New Radio (NR) that can be mapped to the TSN requirements:
Low Latency in 5G NR is enabled by shorter slots in a radio subframe, which benefits low-latency applications. NR also introduces mini slots, where prioritized transmissions can be started without waiting for slot boundaries, further reducing latency.
Resource Management 5G NR introduces preemption – where URLLC data transmission can preempt ongoing non-URLLC transmissions. Additionally, NR applies very fast processing, enabling retransmissions even within short latency bounds.
Reliability – 5G defines extra-robust transmission modes for increased reliability for both data and control radio channels. Reliability is further improved by various techniques, such as multi-antenna transmission, the use of multiple carriers and packet duplication over independent radio links.
Time synchronization is embedded into the 5G radio systems as the radio network components themselves are also time synchronized, for instance, through the precision time protocol telecom profile. This is a good basis to provide synchronization for time-critical applications.
In the Ericsson white paper 5G evolution: 3GPP Releases 16 & 17 overview, the authors describe how Industrial IoT integrates with the TSN spec via 5G.
5G support for TSN is still a work in progress. If you want to check in more detail the current specifications, the most relevant clauses are:
TS 23.501 clauses 4.4.8, 5.27 and 5.28, plus Annex H and Annex I, on support for TSN, as well as its clauses on Ethernet forwarding.
5G NR defines multiple numerologies to support Enhanced Mobile Broadband (eMBB), Massive Machine-Type Communications (mMTC) and Ultra-Reliable Low Latency Communications (URLLC), each with different QoS requirements. 4G (LTE, Long Term Evolution) defines a fixed slot duration. 5G NR, on the other hand, defines different slot durations and can simultaneously support different numerologies to serve a variety of applications.
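The relationship between numerology and slot duration is simple to write down: per 3GPP TS 38.211, subcarrier spacing is 15 kHz x 2^mu, and a 1 ms subframe holds 2^mu slots, so the slot shrinks as the numerology grows:

```python
def nr_slot_duration_ms(mu: int) -> float:
    """Slot duration for 5G NR numerology mu (3GPP TS 38.211):
    each 1 ms subframe holds 2^mu slots."""
    if not 0 <= mu <= 4:
        raise ValueError("Releases 15/16 define numerologies 0..4")
    return 1.0 / (2 ** mu)

for mu in range(5):
    scs_khz = 15 * 2 ** mu
    print(f"mu={mu}: SCS={scs_khz} kHz, slot={nr_slot_duration_ms(mu)} ms")
```

At numerology 3 the slot is 0.125 ms, which (together with mini-slots) is what brings air-interface latency into URLLC territory.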
5G and Network Slicing
5G network slicing can support multiple applications with different QoS requirements thanks to the flexibility introduced by 5G NR and the 5G virtualized core network. The slices share computing, storage and network resources at the RAN, but configure their radio resources differently to support eMBB, URLLC and mMTC applications. For example:
Slice 1 is configured with shorter time slot durations for URLLC applications such as industrial IoT.
Slice 2 uses a low numerology to support a large number of devices with low bandwidth demands and without strict latency requirements.
Slice 3 is configured to support eMBB applications with large bandwidth demands.
In summary, Ultra-Reliable Low Latency Communications works in conjunction with network slicing to achieve the Time-Sensitive Networking requirements of industrial applications.
Deterministic execution of the production cycle requires timely coordination among devices, which is possible only if the devices and the E2E communication are synchronized to a common time reference with a clock disparity of less than 1 microsecond.
TSN time synchronization is based on the generalized Precision Time Protocol (gPTP) (802.1AS) as a profile of the Precision Time Protocol standard (IEEE 1588).
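For reference, the basic IEEE 1588 offset-and-delay computation that gPTP builds on looks like this (under the standard's simplifying assumption of a symmetric network path):

```python
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Classic IEEE 1588 two-step exchange:
    t1 = master sends Sync, t2 = slave receives it,
    t3 = slave sends Delay_Req, t4 = master receives it.
    Returns (slave clock offset from master, one-way path delay),
    assuming the path delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Slave clock runs 300 ns ahead; true one-way delay is 500 ns.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=800e-9, t3=1000e-9, t4=1200e-9)
print(round(offset * 1e9), round(delay * 1e9))  # 300 500
```

The sub-microsecond accuracy quoted above depends on hardware timestamping and on every bridge in the path participating (which is what gPTP's peer-to-peer profile adds to plain PTP).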
For quite a while, 5G networks will coexist with traditional networks and might require transparent integration to transport industrial Ethernet or TSN. In such scenarios, the collaborative actions of devices belonging to different domains need to be coordinated in time, and 5G systems will need to interwork with the gPTP of the connected TSN network, as gPTP is the default time synchronization solution for TSN-based industrial automation. Initial capability for such bridging between 5G and TSN networks is part of Release 16.
In 5G networks time synchronization is an essential part of the 5G radio system. Radio network components are themselves time synchronized for advanced radio transmission, such as synchronized Time Division Duplexing (TDD) operation, cooperative multipoint transmission (CoMP) and carrier aggregation.
There are two independent time synchronization processes running in parallel in an integrated 5G-TSN system: a 5G system synchronization process and a TSN domain synchronization process.
5G URLLC – Time Sensitive Networking Industry Showcases
Multiple industry consortia have formed to leverage 5G capabilities for industrial applications, including:
5G Alliance for Connected Industry and Automation (5G-ACIA)
The overall goal of 5G-ACIA is to apply industrial 5G in the best possible way. Members jointly strive to make sure that the particular interests of the industrial domain are adequately considered in 5G standardization and regulation. Together, they discuss and evaluate technical, regulatory, and business aspects with respect to 5G for the industrial domain.
In its white paper, 5G-ACIA provides an overview of 5G’s basic potential for the manufacturing industry and outlines relevant use cases and requirements. Though not exhaustive, the example use cases demonstrate that QoS requirements can be very divergent, ranging from process control with a cycle time of >50ms and availability of >99.99% to motion control demanding availability of more than six nines and cycle times as low as <0.5ms. It is worth noting that 5G must also meet operational and functional requirements of the industry, such as dependability, functional safety, security, cost efficiency and process flexibility.
Industrial Internet Consortium (IIC)
The Industrial Internet Consortium (IIC) was founded in March 2014 to bring together the organizations and technologies necessary to accelerate the growth of the industrial internet by identifying, assembling, testing and promoting best practices. Members work collaboratively to speed the commercial use of advanced technologies. Membership includes small and large technology innovators, vertical market leaders, researchers, universities and government organizations.
Alliance of Industrial Internet (AII)
The Alliance of Industrial Internet (AII) was jointly initiated by manufacturing, communications, Internet and other enterprises, aiming to study and promote industrial Internet standards, the results of industrial Internet testing and demonstration, as well as product and application innovation. Although most of its results are published in Chinese, its work is worthy of reference.
International Electrotechnical Commission (IEC)
The International Electrotechnical Commission (IEC) is a leading global organization that prepares and publishes International Standards for all electrical, electronic and related technologies. The IEC's work spans an extensive range of areas, including smart cities, smart grids, cybersecurity and smart electrification. To promote international co-operation in the electrical and electronic fields, the IEC publishes many standards, reports and specifications. Some of these publications, such as the 61850, 61907 and 62657 series of standards, present the requirements of industrial automation on wireless communication.
The Avnu Alliance is a group of silicon suppliers and networking vendors creating an interoperable ecosystem of low-latency, time-synchronized, highly reliable networked devices built on open standards, through certification.
Members of the Avnu Alliance include some familiar names like Intel, Keysight Technologies, General Electric and Extreme Networks.
The Alliance is focused on applications of these technologies in the Automotive, Professional A/V, Industrial and Consumer Electronics markets.
TSN-Specific Cybersecurity Challenges
It wouldn’t be one of my articles if I didn’t mention cybersecurity.
In addition to all the other 5G security and privacy challenges, when talking about TSN over 5G we have to consider time synchronization as a new attack surface that has to be protected.
By attacking the time synchronization protocol, a potential attacker could effectively cause a denial-of-service. Since TSN is based on the availability of time data, operational impact could be caused by simply deliberately overloading a single time slot.
The time synchronization protocols mentioned above have no security mechanisms built in by themselves and rely completely on the security controls present in the network.
This paper explores the TSN (Time Sensitive Networking) toolbox defined by IEEE 802, which is a critical component in enabling Industry 4.0.
The TSN requirements are mostly fulfilled by the 5G specification in Release 16, with its flexible 5G frame structure and the 5G Network Slicing feature that optimizes network resources to enable URLLC.
Neil Harbisson calls himself a cyborg. Without the antenna implanted in his skull, he would not be able to see colour of any kind. Born with achromatopsia, a condition of total colourblindness that affects 1 in every 30 000 people, Harbisson’s physical faculties are augmented by cyber technology to grant him access to a life of greater meaning and satisfaction.
As technological evolution leads to concomitant advances in medical science, we are seeing more and more examples of humans who are integrating devices and sensors into their biological makeup. For some, like those part of the growing “transhumanist” movement, this is a means of artistic expression or exploration of human potential. For others, it is a solution to a medical problem. Either way, it represents the most vivid and personal example of what may be called a cyber-physical system (CPS).
Harbisson campaigns for greater debate around the identity and rights of people with tech-adapted bodies. As in any discussion of CPSs, however, a more urgent part of the conversation should be security.
In March 2019, an alert from the US Department of Homeland Security and the FDA warned medical professionals and patients that a broad range of implanted devices, such as defibrillators and heart monitors, were vulnerable to hacking that could cause product malfunction.
Of course, these dangers are not only seen at the level of the private individual. Greater, more widespread risk is found in the cyber-physical systems that will soon be ubiquitous, crucial to the successful operation of industry and society. Adoption of these networks is being driven by access to the internet of things (IoT) or, more accurately in cases of biological integration, the internet of everything (IoE), and is about to be accelerated with the rollout of 5G. Unfortunately, however, so are the risks.
What Is A Cyber-Physical System (CPS)?
CPS is a broad, umbrella term for technologies that connect our physical world with the cyber world. It describes situations in which we find a fundamental intersection of computation, communications and physical processes without suggesting any particular implementation or application.
In addition to IoT, the cyber-physical systems term also includes Industrial Control Systems (ICS) – those setups that manage large-scale civil and industrial operations such as smart factories, water supply and power production and distribution, as well as technologies such as the Industrial Internet of Things (IIoT), robotics, drones, connected and autonomous transportation, building management systems, connected environmental controls and a myriad of other things. In essence, these are software-enabled collections of sensors, processors, and control components that automate entire, or large parts of, human operations. And they are already all around us.
Definitions of CPSs vary and many are excellent, but one that is particularly relevant to the topic is a definition I coined for the Cyber-Physical Systems Security Institute (CPSSI) in 1998: “Cyber-physical systems are physical or biological systems with an embedded computational core in which a cyber attack could adversely affect physical space, potentially impacting well-being, lives or the environment.” This definition goes beyond a technical assessment of a system’s makeup to recognize its potential impact on the world around that system. It identifies the inherent threat of cyber attacks and the dangers they inevitably pose to human life.
What Could Go Wrong?
The common appreciation of threats innate to cyber-physical systems is evolving more slowly than the technology within those systems, and more slowly than the thinking of those who wish to use this technology to cause harm.
The use of these devices in our personal lives – everything from smart phones to smart appliances in smart homes – is already taken for granted in developed nations. Though private individuals are becoming savvier about their exposure to uninvited surveillance through these devices, most concerns are still centered around privacy and data security. Few people consider the possibility of technological tools and their components being captured for employment against them in tactile ways.
The case of vulnerable heart equipment shared earlier offers one example of how a cyber-physical attack could be lethal to individuals. Hackers have already proved that it is possible to hijack a moving vehicle remotely, raising obvious safety concerns for the driver, but also fellow drivers on the road. Now, imagine that same concern extrapolated across a network of self-driving vehicles all travelling at high speed – a scenario which, as we’ll see shortly, becomes a reality with the introduction of 5G.
This growing number of devices and their management applications connected to the IoT represents an exponentially expanding “attack surface” available to hackers and cyber-terrorists.
Unfortunately, regulations governing security of these devices and applications are underdeveloped, non-uniform and difficult to enforce across borders, an especially pertinent issue when equipment components are produced in one region, assembled in another and then sold in a third or more. The absence of these regulatory protocols leaves a huge gap as the vast majority of IoT devices are delivered without baked-in security. Even when companies do aim to make their products secure, these endeavours are usually hampered by a lack of expertise and constant pressure to be first-to-market.
This all translates into a perfect storm of cyber-physical threats in the private and social spheres, but greater dangers extend to a national, even international, level where the scale of impact is highest.
Nation-state attacks against cyber-physical systems are becoming routine. The Stuxnet malware incursion used to disrupt uranium enrichment in the Iranian plant at Natanz in 2010 saw the birth of cyber-kinetic weaponry. Since then, similar attacks have been numerous, with targets including military, civil and industrial operations.
In 2013, hackers thought to be working for a nation-state gained control of a small dam in the US, giving them the power to release water onto the communities below (had the sluice gates not been manually disabled).
The Dragonfly/Crouching Yeti espionage campaigns, thought to have taken place from 2011 to 2014, were attacks on targets in the aviation and defense industries in the US and Canada, as well as various energy industry targets in the US, Spain, France, Italy, Germany, Turkey and Poland. Similar tactics could be seen in Ukraine in 2015, with the BlackEnergy malware causing significant power outages.
In 2017, the US electricity grid was attacked, emphasizing what experts have known for decades: critical systems such as national energy are constantly vulnerable to breach, with potentially devastating consequences for hospitals and clinics, industry, transport and civil supply services.
The Center for Strategic and International Studies (CSIS) regularly updates its list of significant cyber incidents, with a focus on cyber attacks on government agencies, defense and high-tech companies, or economic crimes with losses of more than a million dollars. More than 20 such major events have been recorded in the last two months alone, with most of those attacks having an impact on cyber-physical systems. Until 2017, I used to track cyber-kinetic incidents – those that have caused impacts in the real, physical world. I stopped because the number of such attacks increased beyond my capacity to track.
The attack surface is growing. We are already seeing a post-COVID-19 drive toward greater automation of manufacturing operations and supply chains as businesses try to mitigate the risk of reliance on human labour. These developments rely on the creation of CPSs that require increasingly sophisticated cybersecurity.
Most of these CPSs are built with 5G in mind. This budding technology is set to revolutionize industry and society, facilitating the establishment of highly integrated and largely autonomous production and distribution systems. But 5G is a two-sided coin. With its tremendous potential comes tremendous risk.
When Cyber Becomes Physical: Securing the 5G Bridge
5G has been discussed extensively in almost every industry. It is in its infancy, already showing impressive results, but yet to see widespread availability. It is set to redefine the possibilities of CPSs as well as the security requirements of those systems. But the questions linking 5G and CPSs go back some time.
On one side of the coin, the concerns. As far back as 2012, US Defense Secretary Leon Panetta warned the Business Executives for National Security of the dangers of attacks on national systems: “The most destructive scenarios involve cyber actors launching several attacks on our critical infrastructure at one time, in combination with a physical attack on our country. Attackers could also seek to disable or degrade critical military systems and communication networks. The collective result of these kinds of attacks could be a cyber Pearl Harbor; an attack that would cause physical destruction and the loss of life. In fact, it would paralyze and shock the nation and create a new, profound sense of vulnerability.”
In 1968, Stanley Kubrick invited writer Arthur C. Clarke to collaborate in the creation of the groundbreaking epic, 2001: A Space Odyssey. The film launched Clarke to the status of pop culture icon and indisputable futurist, though his uniquely prescient abilities were already well-established by then.
Of the many predictions Clarke made in his career, perhaps the most well-known was this one from 1964, in which the author declares: “I’m perfectly serious when I suggest that one day we may have brain surgeons in Edinburgh operating on patients in New Zealand.”
As it turned out, the surgeon and the patient were both in China. The world’s first remote brain surgery was performed last year by Dr. Ling Zhipei, who conducted the operation by manipulating instruments in Beijing from his location in Sanya City, 3000 kilometers away.
Though this historic event had been expected for some time, the fact that Clarke predicted it more than 55 years ago is astonishing. What’s even more impressive, though, is the detail of the writer’s foresight. Not only did he see remote surgery coming, but he also saw the complications that would hamper its success. In his 1975 novel, Imperial Earth, Clarke addresses the problem of an even slightly laggy network: “‘Hawaii’s almost exactly on the other side of the world–which means you have to work through two comsats in series. During tele-surgery, that extra time delay can be critical.’ So even on Earth, thought Duncan, the slowness of radio waves can be a problem. A half-second lag would not matter in conversation; but between a surgeon’s hand and eye, it might be fatal.”
Though he wouldn’t have known it then, what the writer had identified was one of the key distinctions between 4G and 5G networks.
To take nothing away from Dr Zhipei’s skills as a surgeon, his pioneering achievement would not have been possible without the computer, powered by a 5G network. 5G eliminates the lag and remote-control delay typical on 4G networks. But, that is not simply because it is an upgraded version of 4G. 5G is something entirely new. It is a momentous leap in potential. It has made science fiction science fact.
Though many rub their hands in glee at the prospect of super-fast movie downloads and instantly responsive gaming, the most notable impact of this technology will be through the CPSs it facilitates. It is there that we will see technology finally having the enduring societal impact it has promised for so long. But the halcyon image of humans living carefree in a hyper-connected world is also misguided.
In cyber-kinetic systems, high speed, high efficiency, AI-driven decision-making and system autonomy are great when things are running well. But when they aren’t, people can get hurt. Or worse.
A New Age of Cyber…
Another soothsayer of sci-fi is William Gibson, venerated author of Neuromancer, coiner of the term, “cyberspace” and regarded by many as a prophet of the digital age. In a recent Financial Times interview, Gibson states, “The online/offline distinction is going to be fully generational soon. Only old people will think of being on or off.”
The digital mystic is expressing a recurring theme that underpins the evangelical spirit of Neil Harbisson and other proponents of the Singularity–human and machine are moving closer and closer to becoming one. Though we are not yet at the stage of full cyber-bio assimilation, the functional integration of technology into our daily lives is already widely apparent through the IoT.
Thanks to consistently cheaper computer chips and the ubiquity of wireless networks, the IoT is expanding unabated. In a 5G world, the IoT will grow exponentially to a massive internet of things (mIoT) that includes sub-domains such as the industrial internet of things (IIoT) and critical internet of things (cIoT). The connection capacity of 5G networks will be breathtaking. For the first time, smart cities will be genuinely possible: all aspects of our lives – personal, professional, social – connected in a continuous stream of data creation and interpretation.
Our homes will be “intuitively” responsive to our every whim and taste, our offices will maximize energy efficiency and convenience, our social services will be preemptive and evolutionary.
Fleets of autonomous vehicles directed by self-managed and self-optimizing traffic control systems, public surveillance interfaces capable of refined facial recognition, civil management operations ensuring that water, energy and waste processes run increasingly smoothly – these are the anticipated fruits of a 5G world.
There are a couple of reasons for this. First, 5G is fast. Lightning fast. Its theoretical top speed (20 Gbps) is up to 200 times faster than 4G. 5G’s speed is what makes it possible to download Ultra HD movies in a matter of seconds.
Second, 5G operates with unbelievably low latency (the time it takes for a system to receive a response to a request). The average human reaction time to a stimulus is 250 milliseconds (ms). Most humans perceive 100ms as instantaneous. 5G’s reaction time is between 1 and 2ms. 5G’s super-low latency is what makes real-time instant gaming, remote surgery and driverless cars a reality.
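These claims are easy to sanity-check with back-of-the-envelope arithmetic. The figures below are assumptions chosen for illustration: a 10 GB Ultra HD movie, a typical 4G rate of ~100 Mbps, and 5G at its 20 Gbps theoretical peak:

```python
# Rough comparison of 4G vs 5G download times (all figures are
# illustrative assumptions, not measurements).
movie_bits = 10 * 8 * 10**9          # 10 GB movie expressed in bits
rate_4g = 100 * 10**6                # assumed typical 4G rate: 100 Mbps
rate_5g = 20 * 10**9                 # 5G theoretical peak: 20 Gbps

t_4g = movie_bits / rate_4g          # seconds on 4G (several minutes)
t_5g = movie_bits / rate_5g          # seconds on 5G (a few seconds)
speedup = t_4g / t_5g                # the "up to 200x faster" figure
```

Under these assumptions the movie takes over 13 minutes on 4G and about 4 seconds on 5G, which is where the "seconds, not minutes" framing comes from.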
5G is able to produce these sensational results because it is not like anything that has come before. Though the term “5G” is an abbreviation of “5th Generation,” this nomenclature is deceiving. It suggests that 5G is simply an advanced form of 4G, just as 4G was a step up from 3G.
This is not the case.
Unlike previous generations, 5G is not a physical network. It is an all-software, cloud-based configuration operated through distributed digital routers. It is a decentralized system that optimizes processing speed and power by relocating operations to the edge of the network.
Resting in the digital ether, built on software and managed largely by AI, 5G represents the first widespread transcendence of physical computing and communication. Perhaps ironically, then, it is in the physical realm where 5G’s greatest dangers lie. Though the technology itself is agnostic, it does invite us to marry our physical lives with the cyber realm, and for all the promises in that union, there are many threats too.
…Needs A New Age of Security
There is little doubt that cyber-physical technologies are encroaching into every aspect of our lives and are evolving toward higher degrees of autonomy and adaptability.
With the explosion of CPSs connected through the upcoming 5G with its distributed structure, incredible speed and negligible latency, the reality is starting not only to match, but to exceed the expectations of science fiction writers and futurists of past generations.
But there is an inherent trade-off in this equation. In return for greater convenience we are increasingly losing the control over the related cyber risks.
Unlike 3G and 4G networks, which are more centralized, 5G’s edge computing decentralizes processing, moving it away from the “core” of the network to the data source. This is partly what makes 5G’s sub-second latency possible, but it also restricts cyber hygiene and makes the network harder to police. With thousands, or millions of devices on the “edge” of any organization’s network, all making decisions at different levels of the network, all potentially serving as attack vectors for the whole organization, cybersecurity approaches of the past are becoming obsolete.
With cyber risks transcending the traditional concerns of financial and reputational impact and becoming the risks to lives, well-being or the environment, traditional cybersecurity and cyber-risk management approaches and organizational structures must be rethought.
Consumers have already proven their appetite for IoT devices. 5G will enable them to access more at lower cost. Manufacturers will continue to meet this expansive need, until we have exponential demand curves meeting exponential supply curves. Billions of devices with multiple application types – the attack vectors become limitless.
As discussed, the security of these devices is unregulated, inconsistent and unreliable. Products developed with short-term profit focus are being designed as iterative models, always released as a minimum commercially viable product. They have no defense against cyber attacks. Protection is almost impossible.
Hackers will always find a way, and with billions of entry points into the 5G network, that could spell catastrophe. We simply can’t learn fast enough. As William Gibson suggests, there will be a never-ending process of adoption and adaptation as the “street finds its own uses for things.”
The outcomes are frightening enough when one thinks of cyber attackers infiltrating our private networks, but what about the broader implications spelled out in Panetta’s speech?
When hackers or cyber terrorists manage to compromise the systems that keep a smart city, or smart factory, or smart port, or a country functioning, the consequences are large scale and a threat to physical life. When water supply, power supply, traffic management, waste removal or connectivity are disrupted, humans suffer.
Defending ourselves against these possibilities is not a negative stance, nor is it a dampener on human progress, as some idealists would have us believe. The security of cyber-physical systems and the 5G that connects them is possibly one of the most urgent responsibilities we face in the coming decade.
A failure to enlist governments, regulators, private enterprises and consumers in a coordinated approach to the cyber-secure implementation of the smart-everything world could be devastating. Not even Arthur C. Clarke could predict the results.
(This article was originally published in ThinkTwenty20 magazine)
The interest in 5G and mIoT is exploding. It’s exciting to see so many IT and cybersecurity professionals in my network trying to learn more about 5G and related technologies.
In addition to my usual articles about the societal impacts of these innovations, I’ll start a series of articles introducing key 5G and mIoT technology concepts before we move on to the technical aspects of 5G security.
Let’s get started with reviewing the 5G core service-based architecture and learning the first few dozen acronyms, out of approximately a gazillion. The cellular industry loves acronyms. Even more than the cybersecurity industry.
The 5G architecture is an evolution of the current 4G architecture, but it is based on a Service-Based Architecture (SBA). The 3GPP defines the SBA for the 5G core network as a set of interconnected Network Functions (NFs), with authorization to access each other’s services.
Some of the key differences / focus areas:
In contrast to the fixed-function, hard-wired, appliance-based architecture of the 4G LTE Core (the Evolved Packet Core, or EPC), fully realizing the potential of 5G means moving to software-based, cloud-based open platforms.
EPC (4G Core) elements were architected to be implemented on physical nodes that were later virtualized, rather than being designed for virtualization from the outset.
Network elements in 5G core are cloud native; referred to as “functions” vs. “nodes.”
Automation and programmability are an important part of the target 5G architecture.
With this flexibility, virtualization and programmability, the new architecture better supports the possibility of diverging architectures for new services.
In summary – 5G core is designed for three enhancements:
Control and User Plane Split – a mapping of 4G Core elements to the 5G Core’s Access and Mobility Management Function (AMF), Session Management Function (SMF) and User Plane Function (UPF).
Native support for Network Slicing for the 5G Use Cases including enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) & critical MTC, and Ultra-Reliable and Low Latency Communications (URLLC).
Service Based Architecture – A service-based architecture delivers services as a set of “Network Functions”
4G Control and User Plane Separation (CUPS) EPC
The separation of the Control and User Plane in the 4G architecture was introduced with 3GPP Release 14. It split the packet gateways into control and user planes, allowing for more flexible deployment and independent scaling, achieving benefits in both CapEx and OpEx.
The next step in the evolution to 5G was to rename the core network entities and either split or merge them depending on whether their functions fall within the user or control plane in the 5G architecture. For those of you with a 4G background, some 4G CUPS Core elements can be easily mapped to renamed 5G Core elements. Here are a few key ones:
Next generation NB (gNB)
The new radio access technology is called New Radio (NR) and replaces LTE. The new base radio station is called next generation NB (gNB) (or gNodeB). It replaces the eNB (or eNodeB or Evolved Node B) in 4G-LTE, or NodeB in 3G-UMTS.
The gNB handles radio communications with 5G-capable User Equipment (UE) using the 5G New Radio (NR) air interface. However, some types of gNB may connect to the 4G EPC instead of the 5G Core.
The Control Plane – AMF and SMF
The Mobility Management Entity (MME) in LTE is the signaling node for UE access and mobility, establishing the bearer path for UEs and handling mobility between LTE and 2G/3G access networks. The MME’s functionality is now split into:
Access & Mobility Management Function (AMF) – oversees authentication, connection, mobility management between network and device. It receives connection and session related information from the UE.
Session Management Function (SMF) – handles session management, IP address allocation, and control of policy enforcement.
The Data Plane – User Plane Function (UPF)
CUPS decouples the Packet Gateway (PGW) control and user plane functions, enabling the data forwarding component (PGW-U) to be decentralized; in the 5G Core this maps to the UPF.
The user plane consists of a single entity, the User Plane Function (UPF).
It combines functionality from previous EPC Serving-Gateway (S-GW) and PDN-Gateway (P-GW).
UPF is responsible for packet routing and forwarding and Quality of Service (QoS).
Network Slicing in 5G
A 5G network is geared towards supporting multiple use cases / applications. Examples of these use cases include:
enhanced Mobile Broadband (eMBB), which entails supporting user throughputs in the Gbps range
Industrial Internet of Things, which requires Ultra-Reliable and Low Latency Communications (URLLC) capabilities (~1ms latency)
massive Machine Type Communications (mMTC) – a network that can support millions of IoT devices
5G supports this multitude of services by leveraging the SBA to run multiple virtual networks on the same physical hardware. The slices that occupy a single physical network are separated, meaning traffic and security breaches in one slice cannot interfere with another slice.
In short, a Network Slice is a logical network including the Radio Access and Core Network.
It provides services and network capabilities, which vary (or not) from slice to slice.
It lets service providers partition their networks into discrete horizontal slices for specific use cases, services, individual customers or even vertical segments, such as energy, healthcare and manufacturing.
A dedicated set of physical and virtualized network resources are allocated– from end devices, over the radio access, transport and packet core to application, content delivery and edge cloud domains.
In summary, a network slice is a logical network that provides specific network capabilities and characteristics. A key component of a Network Slice is the Network Slice Instance (NSI): a set of Network Function instances and the required resources (e.g. compute, storage and networking) that together form a deployed Network Slice.
In 5G, a Network Slice includes the Core Network Control Plane and User Plane Network Functions as well as the 5G Access Network (AN). The 5G Access Network may be:
A Next Generation (NG) Radio Access Network (gNB)
A non-3GPP Access Network where the terminal may use any non-3GPP access to reach the 5G core network via a secured IPSec/IKE tunnel terminated on a Non-3GPP Interworking Function (N3IWF).
For those of you from an IT or cloud background, you can think of the 5G SBA as a hybrid of a Service-Oriented Architecture (SOA) and microservices.
In short, it is an architectural approach that enables 5G network functionality to become more granular and decoupled. This allows individual services to be updated independently with minimal impact to other services and deployed on demand allowing for vendor independence, automation and agile operational processes, reduction in delivery and deployment time, and enhanced operational efficiencies.
Basic principles are:
A Control Plane Network Function can provide one or more NF Services
An NF Service consists of operations based on either a request-response or a subscribe-notify model
A common control protocol using an HTTP-based API replaces protocols such as Diameter
Service-based interface (request-reply and subscribe-notify) (Credit: ITU)
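The two interaction models can be sketched with a toy in-process example. The class and operation names below are invented for illustration; the real 3GPP service APIs are HTTP/REST-based, not Python objects:

```python
# Toy sketch of the SBA's two interaction models:
# request-response and subscribe-notify. All names are illustrative.
class NetworkFunction:
    def __init__(self, name):
        self.name = name
        self._subscribers = []

    # Request-response: a consumer NF invokes a service operation
    # and gets an immediate reply.
    def handle_request(self, operation, params):
        return {"nf": self.name, "operation": operation,
                "status": "ok", "params": params}

    # Subscribe-notify: a consumer registers a callback once...
    def subscribe(self, callback):
        self._subscribers.append(callback)

    # ...and is pushed events whenever they occur.
    def notify(self, event):
        for cb in self._subscribers:
            cb({"nf": self.name, "event": event})

amf = NetworkFunction("AMF")
response = amf.handle_request("get-ue-context", {"ue": "imsi-001"})

events = []
amf.subscribe(events.append)
amf.notify("UE_REACHABLE")
```

The decoupling is the point: a producer NF does not need to know who its consumers are, which is what allows functions to be added, scaled and upgraded independently.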
The major building blocks of the 5G Core Service-Based Architecture are simplified as follows:
Network and Resource Management
Application Function and Network Exposure Function
The IMS Core Functionality is the same as for 4G.
Network and Resource Management
Network and Resource Management consists of three parts:
Network Repository Function (NRF)
Allows every network function to discover the services offered by other network functions.
It serves as a repository of the services;
supports discovery mechanisms that allow 5G elements to discover each other; and
enables status updates of the 5G elements.
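The repository/discovery role can be sketched with a toy registry. This is illustrative only; in the real core the NRF exposes HTTP-based service APIs rather than Python methods:

```python
# Toy stand-in for the NRF's register/discover behaviour (illustrative).
class ToyNRF:
    def __init__(self):
        self._registry = {}  # nf_type -> list of NF profiles

    def register(self, nf_type, profile):
        # A network function registers its profile so others can find it.
        self._registry.setdefault(nf_type, []).append(profile)

    def discover(self, nf_type):
        # A consumer NF asks for all producers of a given type.
        return self._registry.get(nf_type, [])

nrf = ToyNRF()
nrf.register("UPF", {"instance": "upf-1", "addr": "10.0.0.5"})
nrf.register("UPF", {"instance": "upf-2", "addr": "10.0.1.5"})
upfs = nrf.discover("UPF")
```

Because every function finds its peers through the NRF rather than through static configuration, new instances can be spun up and discovered at runtime.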
Network Slice Selection Function (NSSF)
Selects the Network Slice Instance (NSI) based on information provided during UE attach.
Redirects traffic to a network slice.
A set of Access and Mobility Management Functions (AMFs) is provided to the UE based on which slices the UE has access to.
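Conceptually, the selection step looks something like the following toy sketch. The S-NSSAI values and the mapping to slice instances are invented for illustration:

```python
# Toy NSSF-style slice selection: map the slice identifiers (S-NSSAIs)
# a UE requests at attach to deployed Network Slice Instances (NSIs).
# All identifiers and mappings are illustrative.
DEPLOYED_NSIS = {
    "embb-1": {"s_nssai": ("eMBB", 1)},
    "urllc-1": {"s_nssai": ("URLLC", 1)},
}

def select_nsis(requested_s_nssais):
    """Return the NSIs serving the slices the UE requested."""
    return [
        nsi for nsi, info in DEPLOYED_NSIS.items()
        if info["s_nssai"] in requested_s_nssais
    ]

selected = select_nsis([("URLLC", 1)])
```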
Network Data Analytics Function (NWDA)
Responsible for providing network analysis information upon request from network functions.
Security Edge Protection Proxy (SEPP)
Protects control plane traffic that is exchanged between different 5G operator networks.
Service Communication Proxy (SCP)
The SCP is a decentralized solution composed of a control plane and a data plane.
The SCP is deployed alongside 5G Network Functions (NFs) to provide routing control, resiliency and observability to the core network.
Binding Support Function(BSF)
BSF is used for binding an application-function request to a specific Policy Control Function (PCF) instance.
It is comparable to Policy and Charging Rules Function (PCRF) binding function provided by a 4G Diameter Routing Agent (DRA), for VoLTE and VoWiFi.
Data storage consists of the UDR and UDSF:
Unified Data Repository (UDR)
A converged repository of subscriber information that can be used to service a number of network functions.
Stores structured data that can be exposed to an NF.
Unstructured Data Storage Function (UDSF)
Repository for storage and retrieval of unstructured data by a suitable network function.
Network Functions (NFs) can store/retrieve “unstructured” data from UDSF.
Application Function and Network Exposure Function
Application Function (AF)
Supports application influence on traffic routing, accesses NEF, interacts with policy framework for policy control.
Network Exposure Function (NEF)
Provides a means to securely expose the services and capabilities provided by 3GPP network functions.
It exposes APIs from/to external systems.
Policy Control Function (PCF)
Governs the network behavior by supporting a unified policy framework.
Accesses subscription information relevant for policy decisions in the UDR.
Supports the new 5G QoS policy and charging control functions.
Charging Function (CHF)
Allows charging services to be offered to authorized network functions.
Authentication Server Function (AUSF)
Resides in the home network and performs authentication with the UE.
Relies on backend service authenticating data and keying materials when 5G-AKA or EAP-AKA is used.
Performs the authentication function of 4G Home Subscriber Server (HSS) – a database that contains user-related and subscriber-related information.
Unified Data Management (UDM)
Is a converged repository of subscriber information; used to service a number of network functions.
The 5G UDM can use the UDR to store and retrieve subscription data.
Equipment Identity Register (5G-EIR)
Enables authentication of devices in the network.
Protects networks and revenues against the use of stolen and unauthorized devices.
Home Subscriber Server (HSS)
In 4G networks, it fills a similar role to the UDM in 5G.
It stores customer profile data and authentication information along with encryption keys.
5G Location Services
Location Management Function (LMF)
Supports the following functionality:
Location determination for a UE.
Obtain downlink location measurements or a location estimate from the UE.
Obtain uplink location measurements from the 5G RAN.
Obtain non-UE associated assistance data from the 5G RAN.
The major benefit of the Service-Based Architecture is that the 5G core components are defined as Network Functions (NFs), each with an API that can be used to invoke its services. In addition, the 5G core decouples the user plane (or data plane) from the control plane (Control and User Plane Separation, CUPS).
A key benefit of this capability is that the control plane can be centralized while the User Plane Function (UPF) can be distributed to various parts of the network to achieve low latency or to offload traffic closer to the actual users.
A key application of the CUPS capability is to allow mobile IP traffic to be broken out at different parts of the network enabling distribution of content delivery depending on the use case.
Ultra-Reliable Low Latency Communication (URLLC) traffic is terminated within the aggregation network resulting in lower end-to-end latency.
eMBB traffic is terminated on eMBB caches at the network edge so that this traffic does not need to be carried further into the core.
Non-critical IoT traffic is terminated at a core location.
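The termination policy above can be sketched as a minimal lookup. The class names and site labels here are illustrative, not taken from any 3GPP specification:

```python
# Illustrative mapping of traffic class to the network site where the
# User Plane Function (UPF) terminates it, mirroring the list above.
TERMINATION_SITE = {
    "URLLC": "aggregation-network",  # lowest end-to-end latency
    "eMBB": "edge-cache",            # offload bulk content near the user
    "IoT": "central-core",           # non-critical traffic goes deep
}

def termination_site(traffic_class: str) -> str:
    """Return the (hypothetical) UPF location for a traffic class."""
    return TERMINATION_SITE.get(traffic_class, "central-core")
```

In a real deployment this decision is far richer (slice identifiers, QoS flows, operator policy), but the principle is the same: the centralized control plane selects a distributed UPF per traffic type.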
The 5G Core network builds on Control and User Plane Separation introduced in 3GPP Release 14. The 5G network architecture is based on the Service Based Architecture. It specifies Network Functions that support a multitude of applications that are knit together as Network Slices. The 10 building blocks of the 5G Core network presented include: Network and Resource Management, Signaling, Subscriber Data, Application Function and Network Exposure Function, Location Services, Subscriber Management, Policy, Control Plane, User Plane and the Access Network.
Due to the number and types of use cases supported by 5G, traffic patterns in a 5G network will be a lot more dynamic. The underlying transport network will need to allow programmatic control to allow it to react in near-real-time to the changing traffic demands of the mobile network.
Appendix: 4G Core Revisited
The following figure compares the 5G and 4G architectures. The main components of the 4G architecture include:
eNodeB (E-UTRAN) connected
via S1-U (U=User plane) to the Serving Gateway
via S1-C (C = Control plane) to the MME
The Serving Gateway connects to the PDN Gateway via S5
The Serving Gateway connects to the MME node via S11
The PDN Gateway (Packet Gateway) connects to the internet via SGi
The MME is the Mobility Management Entity
HSS – the Home Subscriber Server – is a database that:
contains user-related and subscriber-related information;
provides support functions in mobility management;
supports call and session setup, user authentication and access authorization.
The Serving GW is part of the user plane (together with the PDN GW). It
transports IP data traffic between the User Equipment (UE) and the external networks;
is the point of interconnect between the radio-side and the EPC;
serves the UE by routing the incoming and outgoing IP packets;
is the anchor point for the intra-LTE mobility i.e. handover between eNodeBs;
is also the anchor point for mobility between LTE and other 3GPP accesses;
is logically connected to the other gateway, the PDN GW.
The PDN GW (Packet Data Network Gateway) connects to external IP networks, also known as Packet Data Networks (PDNs). It
is the point of interconnect between the EPC and the external IP networks;
routes packets to and from the PDNs;
performs functions such as IP address / IP prefix allocation and policy control and charging.
As per 3GPP, the PDN GW and Serving GW are assigned independently but may be combined in a single "box".
MME – the Mobility Management Entity. It
deals with the control plane;
handles the signaling related to mobility;
handles security for E-UTRAN access;
is responsible for the tracking and the paging of UEs in idle mode;
is the termination point of the Non-Access Stratum (NAS).
In a recent session on smart building cybersecurity, a student cheekily asked me “How did we ever connect anything before 5G?” At that moment I realized I might have been overdoing my 5G cheerleading recently. To atone, here are the key performance and cybersecurity attributes of the most commonly used connectivity technologies in smart home / smart building use cases… And 5G.
If you thought that the “traditional” home life is under heavy attack from digitization of media and constant communication, wait until you learn about the Internet of Things (IoT) and Smart Homes.
Our most personal spaces – our homes – are rapidly getting digitized and connected. Hundreds of IoT devices – sensors, actuators, smart speakers, smart toothbrushes, and smart everything are being implemented in every home. All trying to create an environment that caters to our every whim, predicts our needs, personalizes our physical space, monitors our health, conserves energy, etc. In doing so all constantly communicating with each other, with our mobile phones, and with a myriad of solutions located somewhere in “clouds”. All creating new cybersecurity and privacy risks.
Indeed, the most representative indicator of technology's impact on daily life is the development of wireless communications as the enabler of all these transformations. After the emergence of radio and TV, it was the appearance of the 1st generation of cellular technology in the 1980s that introduced analog mobile voice communication and accelerated the transformation. In the next decade, 2G offered digital communications and paved the way to 3G – the cellular technology from the beginning of the 21st century that provided IP support and wireless broadband transmissions. Today the most widespread cellular technology is 4G (the first all-IP cellular technology), and the world is briskly preparing for the deployment of its successor, 5G.
Simultaneously with the development of cellular technologies, other wireless technologies shaped the market and enabled simple, ubiquitous, device-to-device communication at short ranges. In this article we’ll explore wireless connectivity options for smart homes / smart buildings and introduce main cybersecurity attributes of each.
Wireless technologies and smart home – smart building products
The smart home is currently the most popular IoT use case. Its popularity is probably driven by the fastest-growing age group of new homeowners – the Millennials. Having grown up with technology, they often find it more important than other traditional new-home features. Millennials are more attracted to smart home and smart building solutions, having confidence in technology, supporting its further innovation and development, and having the knowledge to readily adopt new services.
Amazon, Google, Apple and Samsung are the most dominant companies in the smart home market, offering all kinds of products, from smart thermostats to smart lighting devices. Rising energy and utility production and distribution costs, the decreasing cost of mass-produced technology, government policies and campaigns for energy savings, and increasing awareness of the environmental consequences of our carbon footprint all motivate the growing popularity of smart home devices.
Smart lighting systems, like Hue from Philips, can detect the presence of people and adjust lighting as needed. Smart light bulbs also support auto-regulation based on sunlight intensity.
Nest from Nest Labs Inc. is a representative example of a smart thermostat. It comes with embedded Wi-Fi, allowing users to schedule, monitor and remotely control home temperature. Smart thermostats can also report about energy consumption or remind users about maintenance issues, filters changes, etc.
Smart locks are a perfect way for users to allow or deny access to their premises. With smart security cameras, real-time home monitoring becomes available 24 hours a day. Smart motion sensors, supported with many features and setting options, can also distinguish between residents, visitors, pets and unauthorized intruders. They can notify authorities about suspicious activities and activate day or night cameras for recording, or even provide monitoring that can help seniors remain at home comfortably. These safety features extend even to pet care.
Smart homes also include use cases such as smart TVs, smart washing machines and dryers in the laundry rooms or different kitchen appliances like smart coffee makers, smart toasters, smart refrigerators that monitor expiration dates, make shopping lists and even create recipes based on currently available ingredients.
One of the most important devices in a smart home is the smart home hub. It is the central point of the smart home system, capable of wireless communication and data processing, and it combines all separate applications into a single comprehensive application that controls the smart home. Some available smart home hub solutions are Amazon Echo, Google Home, Insteon Hub Pro, Samsung SmartThings, Wink Hub, etc. Artificial intelligence (AI) technology is implemented in smart homes as well, such as in voice-activated systems like Amazon Echo or Google Home, illustrated in Figure 4.
They have embedded virtual assistants capable of learning users’ behavior and personalizing the smart home patterns and context.
Generally, IoT solutions apply to smart buildings as the next logical step. The majority of technologies applicable to smart homes are implemented in smart buildings, such as lighting systems, security and access systems, identity management, and heating and air-conditioning systems. Smart buildings generally increase the quality of everyday life by enhancing the digital experience, tenants' satisfaction and staff business efficiency, enabling real-time information, and improving life organization and work productivity.
Review of wireless technologies and their applicability for smart home / smart building use cases
The primary task of wireless communication technologies is to provide connectivity for automation. Wireless communication technologies differ in specific capabilities which make them more or less suitable for particular use cases.
One of the first wireless communication protocols developed to support home automation and communication among electronic devices was X10, released in 1975. It provided communication via 120 kHz digital bursts between programmable outlets or switches. This precursor of modern wireless technologies initially had drawbacks and disadvantages compared to present-day solutions. It was simplex, one-direction communication, because home devices lacked the capability to generate a backlink response. Communication in both directions was later enabled via the X10 protocol, but it was not a cost-effective solution. Moreover, there was a serious problem with communication reliability, due to signal loss caused by circuits wired on different polarities.
In the meantime, thanks to the continuous development of different wireless communications and their convergence with cellular communications driven by the adoption of IoT technologies, home automation continued growing.
Today’s wide availability of wireless technologies (like Bluetooth, ZigBee, RFID or NFC) at a reasonable price is a catalyst for rapid development and implementation of a myriad of smart home IoT use cases.
Wireless communication technologies work on different frequencies, use different modulations, differ in ranges, have different resistance to obstacles and interference, they have different power consumption and different power supply solutions, support different mechanisms for security and communication reliability, etc. All these features influence suitability for particular use cases.
Let’s briefly describe some representative wireless communication technologies like Bluetooth, Zigbee, Wi-Fi, RFID and NFC with their strengths, challenges and applicability for smart homes and smart buildings.
Bluetooth is a short-range wireless communications technology based on the IEEE 802.15.1 protocol. It works in a crowded license-free 2.4 GHz frequency band and shares this resource with many other technologies.
The newest version of Bluetooth is known as Bluetooth Low Energy (BLE) or Bluetooth Smart.
It is important to note that Bluetooth and BLE are not compatible technologies. For example, the channel bandwidth in classic Bluetooth is 1 MHz while in BLE it is 2 MHz, and classic Bluetooth uses 79 channels while BLE supports 40. They also differ in waveforms, transmission power, network organization, etc. Bluetooth versions 4.1/4.2/5.0 support both the BLE and classic Bluetooth standards, but if the master device is a BLE device, the slave must also be a BLE device.
In the most recent Bluetooth version 5.0, new waveforms and coding techniques are implemented to achieve longer ranges of 50 m or more, lower power consumption, lower latency, better robustness, and support for a higher number of subscribers in a single Bluetooth network.
At its inception the Bluetooth technology was used for data streaming or file exchange between mobile phones, PCs, printers, headsets, joysticks, mice, keyboards, stereo audio or in the automotive industry.
These days BLE has become an indispensable protocol used in mobile phones, PCs and other types of devices in gaming, sports, wellness, industrial, medical, home and automation electronics. It is an important wireless technology for smart homes and smart buildings because of its range, throughput (2 Mbps), reliability, security performance, low-power transmission and low power consumption. BLE provides wireless connectivity that enables home automation via the control of lights – smart bulbs and outlets – smoke detectors, cameras and other security systems, thermostats, video doorbells, smart digital locks, hubs and controllers, assistant devices, universal remotes, gaming consoles, TVs, etc.
In smart buildings, this wireless technology enables automation of some complex systems, as presented in Figure 2, such as: Heating, Ventilation and Air Conditioning (HVAC), lighting, security and indoor positioning. BLE technology deployed in smart buildings enables optimal space utilization, lowers operating and maintenance costs by condition monitoring via different sensors, contributes to energy savings, enhances the tenants, staff or visitor experiences, etc.
BLE is important for both residential and business buildings. It changes the look of offices by creating smart meeting spaces and enabling sensor-based occupancy mapping, improves workflow efficiency, reduces expenditures, and increases revenues and employee satisfaction. In specific smart building types – smart healthcare facilities or smart hospitals – BLE is crucial for improving patient care and operational efficiency.
In the retail industry, coupled with beacon technology, it supports enhanced customer services like in-building or in-store navigation, personalized promotions and customer-oriented content delivery. Some BLE limitations for smart home and smart building use cases are: suitability for short-range controls only; interference with other wireless technologies (Wi-Fi, Zigbee, etc.) that use the license-free 2.4 GHz frequency range; optimization for short-burst communication; lower throughput compared to some other wireless technologies; lack of generic IP connectivity; etc.
Zigbee is a wireless PAN (Personal Area Network) technology developed from the IEEE 802.15.4 wireless standard and supported by the Zigbee Alliance. The IEEE 802.15.4 standard defines the physical and data link layers, with all details of the robust radio communication and medium access control. The Zigbee Alliance standardizes the content of transmitted messages from the network layer up to the application layer. It is a non-profit association responsible for the development of open, global Zigbee standards. Companies like Google, Amazon, Qualcomm, Samsung, Silicon Labs, Philips, Huawei and Toshiba are members of the Zigbee Alliance.
The Zigbee wireless communications technology operates in unlicensed frequency bands, including 2.4 GHz, 900 MHz and 868 MHz, within a 100 m range. It enables up to 250 Kbps throughput in the 2.4 GHz band and 40 Kbps / 20 Kbps in the 900 / 868 MHz bands. In the 2.4 GHz band, Zigbee is organized into 16 channels spaced in 5 MHz steps. The technology theoretically supports up to 65,000 nodes in a single wireless network. There are three types of nodes – logical devices – in a Zigbee network:
Zigbee Coordinator – is a device responsible for establishing, executing, administering and managing the overall Zigbee network, its security, subscribers list, etc. There is only one coordinator in Zigbee network.
Zigbee Router – is an intermediate node responsible for routing packets between end devices or between end devices and the coordinator. In one Zigbee network there could be several routers.
Zigbee End Device – represents a sensor or a node that monitors and collects required data. Unlike routers or coordinators, these nodes are usually battery operated. Hence, they could be put to sleep for a certain period to minimize battery draining and conserve energy when there is no activity to be monitored. End devices can neither route traffic nor permit other nodes to join the network.
In a star network, one hub – the coordinator – is the central point of all communications, limiting network coverage to its own range and processing power. As the most important node in a Zigbee star topology, the coordinator represents a single point of failure.
In the mesh network all end nodes are router nodes at the same time, including the coordinator after the network initialization, making this topology robust and without a single point of failure (presented in Figure 3).
Hybrid mesh topology combines the first two types—in this topology there can be several star networks and their routers can communicate as described in a mesh network.
We must consider the choice of topology in the network planning phase, taking into account its purpose, available power supply solutions, range and throughput requirements, schedule for end nodes (sensors) activity, costs and other factors important for specific use cases.
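The 2.4 GHz channel plan mentioned earlier (16 channels in 5 MHz steps) follows the IEEE 802.15.4 numbering, in which channels 11–26 occupy the 2.4 GHz band. A small sketch:

```python
def zigbee_channel_mhz(channel: int) -> int:
    """Center frequency in MHz for an IEEE 802.15.4 channel in the
    2.4 GHz band (channels 11-26, 5 MHz spacing, starting at 2405 MHz)."""
    if not 11 <= channel <= 26:
        raise ValueError("the 2.4 GHz band uses channels 11-26")
    return 2405 + 5 * (channel - 11)

# The full 16-channel plan: 2405, 2410, ..., 2480 MHz
plan = [zigbee_channel_mhz(ch) for ch in range(11, 27)]
```

This is one reason Zigbee and Wi-Fi interfere: the Zigbee grid sits inside the same 2.4 GHz band used by 802.11 channels, so channel planning for a smart building needs to consider both.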
In the endless process of technology evolution, the Zigbee Alliance continues improving the Zigbee standard. The latest version has enabled interoperability among a wide range of smart devices from different manufacturers and gives end users access to innovative products and services that work together seamlessly.
Today Zigbee 3.0 is one of the most common wireless standards implemented in IoT devices. It significantly affects smart homes and smart buildings development because of the low power consumption, long battery life, built–in support for mesh networking and IP, provided communication security and reliability, cross-band communication across 2.4GHz and sub-GHz frequency bands, etc. Zigbee became one of the most crucial technologies and a global standard for home automation. It helps creation of smart home, by enabling appliances control, improvements in everyday comfort, security and energy management.
As Zigbee 3.0 devices support energy harvesting and long battery life, the technology is sometimes described as "low-power Wi-Fi".
Its applicability in smart homes and smart buildings enabled remote control of different equipment like smart plugs or motion sensors, light switches, thermostats, door locks and systems like security, HVAC and energy or water consumption.
Worldwide compatibility between Zigbee 3.0 devices addresses the interoperability challenges inherited from earlier versions. At the same time, operating and maintenance costs decrease, making it a win-win solution for both end users (staff, tenants or visitors) and providers.
Zigbee is a standard wireless technology of choice for smart home and smart building applications, but some of its disadvantages are recognized as well, such as short-range communication, throughput optimized for bursts of sensor transmissions rather than streaming, a lack of advanced error-correction mechanisms, sometimes complex troubleshooting, and the single point of failure in star topologies.
Wi-Fi is a wireless technology comprising the IEEE 802.11 family of standards (IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, IEEE 802.11ac, etc.). It operates in the 2.4 GHz and 5 GHz frequency bands within a range of about 50 m.
This technology was developed for wireless networking of computer devices and is commonly called WLAN (Wireless Local Area Network), where the communication is realized between wireless routers typically connected to the Internet and other wireless nodes within its range.
Depending on the specific IEEE 802.11 standard, different data rates are enabled, with theoretical throughputs of 11 Mbps (IEEE 802.11b), 54 Mbps (IEEE 802.11a and IEEE 802.11g), 100 Mbps or more (IEEE 802.11n) and 300 Mbps or more (IEEE 802.11ac).
In the overcrowded 2.4 GHz band there are 14 channels defined for Wi-Fi. In the 5 GHz band, the channel distribution for Wi-Fi depends on national legislation and RF band allocation plans.
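The 14-channel 2.4 GHz plan can be sketched numerically: channels 1–13 sit on a 5 MHz grid starting at 2412 MHz, while channel 14 (permitted only in some regulatory domains) sits apart at 2484 MHz:

```python
def wifi_24ghz_channel_mhz(channel: int) -> int:
    """Center frequency in MHz for a 2.4 GHz Wi-Fi channel (1-14)."""
    if channel == 14:
        return 2484  # off the 5 MHz grid; region-restricted
    if not 1 <= channel <= 13:
        raise ValueError("2.4 GHz Wi-Fi uses channels 1-14")
    return 2412 + 5 * (channel - 1)

# The classic non-overlapping trio for 20 MHz-wide channels: 1, 6, 11
non_overlapping = [wifi_24ghz_channel_mhz(ch) for ch in (1, 6, 11)]
```

Because each Wi-Fi channel is roughly 20 MHz wide on a 5 MHz grid, adjacent channel numbers overlap heavily; that is why deployments typically stick to channels 1, 6 and 11.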
The newer Wi-Fi HaLow (IEEE 802.11ah) standard extends the IEEE 802.11 family toward IoT use cases. It works in the 900 MHz frequency band in the USA and significantly improves wireless coverage and energy efficiency – among the most important features for smart homes, smart buildings and other IoT use cases.
Among other available choices, this technology is used in smart homes and smart buildings for use cases with high throughput audio/video streaming requests, centralized management applications, video monitoring and security systems, etc. Networking of multiple devices such as cameras, lights and switches, monitors, sensors and many others is enabled with this technology.
One of the major Wi-Fi benefits is its prevalence in almost all digital devices today and capability to provide high-capacity wireless links. From a security perspective, activation and implementation of supported encryption mechanisms provide acceptable protection, like Wi-Fi Protected Access (WPA) or WPA2. Wi-Fi technology enables generic IP compatibility, easy installation and operation procedures, possibility to add or remove the devices to or from a network with no particular management efforts and impacts to network functionality, efficient troubleshooting, etc.
Some Wi-Fi drawbacks for smart home and smart building use cases are power consumption, higher infrastructure price, susceptibility to obstacles that limits the range, susceptibility to RF jamming—important for smart home or smart building security systems, available throughput is shared between connected devices, susceptibility to interference from the many devices that operate at the same frequency—including Wi-Fi and other wireless technologies devices like cordless phones, microwaves, etc.
Radio-Frequency Identification (RFID)
Radio-Frequency Identification (RFID) is a technology commonly used for the identification, status administration and management of different objects. It is also important for identifying people, as it is commonly deployed in the latest biometric passports.
It operates in several frequency bands: the Low Frequency band from 125 kHz to 134 kHz, the High Frequency band at 13.56 MHz, and the Ultra High Frequency band at 433 MHz and in the 860–960 MHz sub-band.
In Ultra-high frequency bands there are two types of RFID systems—Active and Passive.
An Active RFID system operates at 433 MHz and at 2.4 GHz, and supports ranges from 30 to over 100 meters.
A Passive RFID system operates in the 860–960 MHz band and supports a range of up to 25 m.
RFID tags can be active (with microchip, antenna, sensors and power supply) or passive (without power supply).
An RFID reader is another hardware component that identifies an RFID tag and transmits its status to the RFID software application.
RFID software applications (often mobile applications) monitor and administer RFID tags. They usually exchange information with RFID readers via different beacon technologies or Bluetooth.
RFID technology is very important for different IoT applications including smart homes and smart building. According to the applied frequency ranges, some advantages and limitations of RFID systems are given in Table 1.
Table 1. Advantages, limitations and typical applications of RFID system types:

Low frequency band
Advantages: unique applicability compared to other RFID systems; global standardization support.
Limitations: very short range; limited memory of RFID devices; high production costs.
Applications: animal tracking, access control, applications with high volumes of liquids and metals.

High frequency band
Advantages: support for NFC global protocols and standards; higher memory capacity.
Limitations: short range (about 30 cm).
Applications: DVD kiosks, library books, personal ID cards, gaming chips, etc.

Active RFID systems (ultra-high frequency)
Advantages: lower infrastructure costs compared to passive RFID; high memory capacity.
Limitations: high tag cost; restrictions due to battery power supply; complex software solutions; susceptibility to interference from metal and liquids; lack of global standardization support.
Applications: vehicle tracking, auto manufacturing, mining, construction, asset tracking.

Passive RFID systems (ultra-high frequency)
Advantages: long read range; low tag cost; variety of tag sizes and shapes; global standards support.
Limitations: high infrastructure cost; moderate memory capacity; susceptibility to interference from metal and liquids.
RFID tags are implemented as an interface between the IoT ecosystem and the subscribers. This technology potential is significant because of its low cost and low power features.
Smart clothes are a representative example of RFID technology deployment in a smart home. Garments with embedded RFID tags could share information with smart home appliances, to help us improve life quality. Smart bins could help to sort clothing items into logical groups, while balancing the load size. Smart washing machines in smart homes or buildings could read the embedded RFID tags on smart clothes and set the optimal wash cycle in compliance with provided instructions. Smart cleaning/laundry services provided in smart buildings can establish real-time communication with the building tenants, keeping them informed about the status of requested service.
RFID is also important for the development of indoor location applications and Angle of Arrival (AOA) technology. AOA technology estimates the arrival angle of a mobile tag's signal at two or more adjacent receivers, establishing a real-time location system with centimeter accuracy. In the context of localization systems and indoor applicability, this is a significant improvement.
RFID technology enables new consumer applications and services for smart homes and buildings like smart shelves, smart mirrors, self check-in or check-out, restricted area access control, etc.
Some important RFID advantages for smart home and building applications are low cost, low power consumption, great implementation potential, and prospects for the development of user-friendly (often mobile) software applications. RFID limitations include susceptibility to interference caused by different objects, eavesdropping and DoS attacks, lack of standardization support, signal collision, etc.
Near field communications (NFC)
NFC is a short-range, two-way wireless communication technology that enables simple and secure communication between electronic devices embedded with an NFC microchip. NFC operates at 13.56 MHz and supports 106, 212 or 424 Kbps throughput. There are three available modes of NFC communication:
Read/write (e.g. for reading tags in NFC posters)
Card emulation (e.g. for making payments)
Peer-to-peer (e.g. for file transfers)
There is no need for a pairing code between devices: once in range, they instantly start communicating and prompt the user. NFC is power-efficient – much more so than other wireless technologies. The communication range of NFC is approximately 10 centimeters, and it can be doubled with specific antennas. The short range makes this technology secure; allowing only near-field communication makes it optimal for secure transactions such as contactless payments. Some examples of NFC applicability include:
Ticket confirmation for sports events, concerts, at theaters, cinemas;
Wellness improvements – syncing workout data from a fitness machine with a personal user device;
Personalized content sharing – viewing special offers on your phone in museums, shopping malls and stores;
Loading translated content in different services, like restaurant menus;
Check-in and check-out in hotels, airports, etc.;
Security systems – unlocking NFC-enabled door locks, etc.
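To make the tag examples above concrete, here is a minimal sketch of building a single NDEF text record, the payload format NFC tags commonly carry. The byte layout follows the NFC Forum short-record form; a production encoder would also handle multi-record messages and chunking:

```python
def ndef_text_record(text: str, lang: str = "en") -> bytes:
    """Build a single short NDEF text record:
    [header][type length][payload length][type 'T'][status + lang + text]."""
    # Payload: status byte (language-code length, UTF-8), language code, text
    payload = bytes([len(lang)]) + lang.encode("ascii") + text.encode("utf-8")
    if len(payload) > 255:
        raise ValueError("short-record form carries at most 255 payload bytes")
    # 0xD1 = MB | ME | SR flags set, TNF = 0x01 (NFC Forum well-known type)
    return bytes([0xD1, 0x01, len(payload), ord("T")]) + payload

record = ndef_text_record("hello")
```

When a phone taps a tag programmed with such a record, the OS parses these bytes and dispatches the text (or a URI, in the analogous 'U' record type) to the matching application.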
NFC technology provides further support for smart home and smart building evolution. In the bedroom, an NFC tag can be used to control the TV, wireless system, alarm, lighting or other devices via a smartphone. In the kitchen, NFC tags could be placed on the refrigerator and oven, making them smart as presented in Figure 3, or they could be used to modify the ambience to your needs (turning some lights on and off, music, etc.).
NFC tags can transform a smartphone or other personal digital device embedded with NFC chipset, into a universal remote capable of performing any action. Compared to RFID technology, every NFC device has embedded NFC reader and NFC tag capabilities. The potential for NFC technology applicability in smart homes and buildings is endless.
NFC advantages for smart homes and buildings applicability are simplicity, security, capability to connect unconnected devices via NFC tags or bridge other incompatible wireless technologies, low power consumption, widespread technology in almost all electronic devices, etc.
The main limitations to consider for NFC applicability in smart home and smart building use cases are its very short range, lower throughput compared to other wireless technologies, and the fact that it is not a completely risk-free technology, given that mobile-based hacking tools are evolving and have become common.
Built-in cybersecurity features of wireless technologies (Bluetooth, Zigbee, Wi-Fi, RFID and NFC)
Traditionally, wireless networks are self-contained and homogeneous and do not provide interoperability between different wireless technologies. There is no single wireless technology optimal for all use cases, capable of meeting all coverage, throughput, mobility and other requirements. Because these technologies are wireless and thus exposed, security protection is one of the top priorities and most challenging aspects of wireless networking. The rapid development and increasing importance of wireless technologies has become crucial for the fourth industrial revolution (IoT). Communications infrastructure is more complex than ever before, and this trend will continue.
Hence, the general conclusion is that the principal task of wireless communication technologies should be to provide secure connectivity. In this chapter I’ll present some representative security features and challenges of above-mentioned wireless technologies.
BLE (Bluetooth) Cybersecurity
Bluetooth technology defines several security modes, and each version of the Bluetooth standard supports a subset of them. The modes differ in the point at which security procedures are initiated. A Bluetooth device must operate in one of four available modes:
Bluetooth security mode 1 – an insecure mode. Establishing wireless connectivity in this mode is easy, but security is absent. Security mode 1 is intended for short-range devices and is only supported up to Bluetooth v2.0 + EDR (Enhanced Data Rate).
Bluetooth security mode 2 – in this mode a centralized security manager controls access to specific services and devices by enforcing an authorization procedure. All Bluetooth devices can support this security mode, although v2.1 + EDR devices support it only for backward compatibility.
Bluetooth security mode 3 – in this link-level-enforced security mode, the Bluetooth device initiates security procedures before the physical link is established, using authentication and encryption for all connections to and from the device. Security mode 3 is only supported by devices with Bluetooth v2.0 + EDR or earlier.
Bluetooth security mode 4 – in this mode, security procedures are initiated after link setup. Secure Simple Pairing uses Elliptic Curve Diffie-Hellman (ECDH) techniques for key exchange and link key generation. This mode was introduced in Bluetooth v2.1 + EDR.
The highest level of BLE security is authenticated LE Secure Connections pairing with encryption: each time pairing is initiated, the Elliptic Curve Diffie-Hellman key agreement protocol is used for key exchange.
BLE security mode 2 is supported with 2 layers:
Unauthenticated pairing with data signing.
Authenticated pairing with data signing.
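The ECDH key agreement used in LE Secure Connections pairing lets two devices derive a shared key without ever transmitting it. Bluetooth uses ECDH on the NIST P-256 curve; since Python's standard library has no elliptic-curve support, the sketch below illustrates only the Diffie-Hellman principle with toy finite-field parameters (not secure, and not the actual Bluetooth key-derivation function):

```python
import hashlib
import secrets

# Toy Diffie-Hellman over a small prime field. Bluetooth LE Secure
# Connections actually uses ECDH on the P-256 curve; these parameters
# only illustrate the key-agreement principle and are NOT secure.
P = 4294967291          # small prime (2**32 - 5)
G = 2                   # generator

def keypair():
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public

def shared_secret(own_private, peer_public):
    # Both sides compute the same value: G**(a*b) mod P
    dh = pow(peer_public, own_private, P)
    # Hash the shared value into key material (stand-in for link-key derivation)
    return hashlib.sha256(dh.to_bytes(8, "big")).digest()

a_priv, a_pub = keypair()        # device A
b_priv, b_pub = keypair()        # device B
# Each side combines its own private key with the peer's public key
assert shared_secret(a_priv, b_pub) == shared_secret(b_priv, a_pub)
```

An eavesdropper who sees only the two public values cannot feasibly recover the shared secret, which is what protects the pairing against passive sniffing.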
The Security Manager Protocol, which sits at the session layer of the OSI reference model, is responsible for pairing, signing between nodes, encryption, key administration and management, security services management, and all other security features in a BLE network. Like any other wireless technology, Bluetooth has security vulnerabilities, and its deployment has to be planned with possible security threats in mind. Some representative security challenges of BLE (Bluetooth) technology are:
Passive eavesdropping, man-in-the-middle (MITM) attacks and identity tracking all apply to Bluetooth technology. The radio traffic between a smartphone and a smart lock can be intercepted with various kinds of sniffers, which today cost in the range of 50-100 USD.
Bluejacking involves sending a vCard message via Bluetooth to other Bluetooth users within short range – typically 10 meters. The risk is that the recipient will not realize what the message is, and messages may be opened automatically on the assumption that they come from someone in the contact list.
Bluebugging is a Bluetooth security issue that allows unauthorized remote access to a phone and use of its features, which may include placing calls and sending text messages, while the owner does not realize the phone has been taken over. Depending on the attacker's creativity, denial-of-service (DoS) attacks and resource misappropriation can also result.
Car whispering is a hacking technique that uses specific software to send and receive audio and other files to and from a Bluetooth-enabled car stereo system, in order to invade privacy or listen in on conversations. It could be applied in the same manner to Bluetooth-enabled systems in smart homes or smart buildings.
Bluesniping is a hacking technique that extends the range of unauthorized Bluetooth monitoring, providing malicious coverage at distances of up to a mile. It is realized with specific hardware – a "bluesniping gun" assembled from a few components such as a folding stock, a Yagi antenna and a device running a Linux terminal.
These vulnerabilities can lead to unauthorized access to sensitive information and unauthorized use of Bluetooth devices, as well as of other systems or networks to which those devices are connected. To protect a network, be careful with third-party applications and install applications only from trusted sources. It is also recommended to deploy a home network firewall that protects and encrypts incoming and outgoing data.
The Zigbee Alliance and its members continuously improve the security performance of Zigbee technology in order to achieve an optimal balance between deployment, operation and security requirements in wireless machine-to-machine communication.
Zigbee is considered a relatively secure wireless communication protocol, with a security architecture built in accordance with the IEEE 802.15.4 standard. To meet these security needs, Zigbee provides a standardized set of security specifications based on the 128-bit AES algorithm and compatible with the 802.15.4 wireless standard.
Its security mechanisms include authentication, authorized access to network devices, integrity protection, and encryption with key establishment and transport. Device authentication is the procedure of confirming that a new device joining the network is authentic: the new device must be able to receive a network key and set the proper attributes within a given timeframe to be considered authenticated. Device authentication is performed by the Trust Center. Integrity protection is realized at the frame level using message integrity checks (MIC) to ensure transmitted frames are not accessed or manipulated. Zigbee's security architecture implements 128-bit symmetric-key cryptography and supports three different types of keys for different purposes:
The Master key must be obtained by pre-installation, secure key transport, or user-entered data such as a PIN or password. It is used for the derivation and establishment of Link keys.
The Network key is used for network establishment and broadcast communication. This key provides network-level security.
The Link key is used to encrypt point-to-point communication at the application level. It is different for each pair of devices in the network working in point-to-point mode, which minimizes the security risk of Master key distribution. The Link key provides APL-level security, and messages between devices are protected with both the Network key and the Link key.
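The frame-level integrity protection described above can be sketched in a few lines. Zigbee actually protects frames with AES-CCM* (encryption plus a MIC of 4 to 16 bytes); Python's standard library has no AES, so this illustration substitutes a truncated HMAC-SHA256 as the MIC to show how a receiver verifies, with the shared network key, that a frame was not manipulated in transit:

```python
import hashlib
import hmac

# Placeholder 128-bit key; in a real network, keys come from the Trust Center.
NETWORK_KEY = bytes(16)

def protect(frame: bytes, key: bytes = NETWORK_KEY) -> bytes:
    # Append a 32-bit MIC (stand-in for the AES-CCM* MIC used by Zigbee).
    mic = hmac.new(key, frame, hashlib.sha256).digest()[:4]
    return frame + mic

def verify(protected: bytes, key: bytes = NETWORK_KEY) -> bool:
    # Recompute the MIC over the received frame and compare in constant time.
    frame, mic = protected[:-4], protected[-4:]
    expected = hmac.new(key, frame, hashlib.sha256).digest()[:4]
    return hmac.compare_digest(mic, expected)

msg = protect(b"\x61\x88\x01on")                # toy frame payload
assert verify(msg)                              # intact frame passes
tampered = msg[:-1] + bytes([msg[-1] ^ 0x01])   # flip one bit of the MIC
assert not verify(tampered)                     # manipulation is detected
```

A device without the key can neither forge a valid MIC nor alter a frame undetected, which is exactly the guarantee the Zigbee MIC provides at the frame level.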
There are two security models in Zigbee networks – the centralized security network and the distributed security network. They differ mainly in how new devices are admitted into the network and how messages in the network are protected.
In the centralized security model, only a Zigbee Coordinator with Trust Center credentials can establish the network. Nodes join the network, receive the network key and establish a unique link key with the Trust Center.
In the distributed security model, there is neither a Zigbee Coordinator with Trust Center credentials nor Master keys. All nodes are pre-configured with the Link key before entering the network and use the same network key for message encryption.
Zigbee supports several key management and transport mechanisms: pre-installation (performed by the manufacturer), key establishment (a method of generating Link keys based on the Master key), and key transport (where a network device requests a key from the Trust Center).
Like other wireless technologies, Zigbee faces a number of security challenges. Representative examples include:
Replay and injection attacks. In the first phase, dedicated Zigbee network-discovery tools transmit beacon request frames and analyze the responses to enumerate available nodes, locating Zigbee devices on their working channels. The next phase is capturing packets, analyzing them, and then replaying the same packets so they appear to come from the originating node, causing a change in device behavior determined by the replayed packets. The network treats the malicious traffic as regular traffic.
Wormhole attacks exploit the route-discovery mechanisms of on-demand routing protocols and apply to Zigbee networks. A malicious user receives packets at one point in the network and replays them in another area to interfere with overall network functionality. The attacker can control the data flowing through the malicious tunnel and launch further attacks, especially if the network nodes are far apart.
Physical misplacement of low-cost Zigbee devices with limited protection capabilities – for example, devices without tamper resistance, such as temperature sensors and light switches – makes them vulnerable to unauthorized access to privileged information like keys, network identification and working channels.
Zigbee uses the same security level in all network devices for the purpose of achieving and maintaining the device interoperability. This could lead to some security risks.
Eavesdropping applies to Zigbee networks, especially those that support OTA firmware upgrades. This kind of attack is very hard to detect.
DDoS attacks at the MAC layer are a realistic scenario. Because Zigbee uses the CSMA/CA mechanism, in which devices operating in non-beacon mode always check whether a channel is busy before transmitting, an attacker who floods a radio channel with frames forces the network to deny all communication between devices.
Without the integrity protection provided by the MIC, a rogue device could modify a transmitted frame and the modification might not be detected by the recipient.
A denial-of-service (DoS) attack causes a node to reject all received messages. In a Zigbee network, a DoS attack can be mounted by altering routing tables to redirect all or some of the network traffic to a malicious device (a sinkhole attack). This is achieved by deliberately sending messages that build artificial routing paths or introduce loops into the routing process of legitimate nodes, hampering the transmission of packets among devices.
DoS attacks can also be realized with jamming techniques that trick the user into initiating a factory reset and prevent devices from communicating, or by sending a "reset to factory default" command to a device and waiting for it to look for another Zigbee network to join.
Upon leaving the network, a node can still access communications, since it still possesses the master and link keys. In a smart home or smart building where Zigbee devices are used to open doors or improve energy efficiency, it is quite possible for one or more devices to be lost or stolen. If the keys stored on those devices are not properly revoked, someone could exploit this weakness. This type of attack, together with the physical security of the network, should not be underestimated.
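Against the replay attacks listed above, Zigbee relies on a 32-bit frame counter carried in each secured frame: a receiver rejects any frame whose counter is not strictly greater than the last counter accepted from that source, so captured-and-replayed packets fail the freshness check. A minimal sketch of that receiver-side logic:

```python
class ReplayFilter:
    """Per-source freshness check modeled on Zigbee's frame counter."""

    def __init__(self):
        self.last_counter = {}  # source address -> highest counter accepted

    def accept(self, source: int, counter: int) -> bool:
        if counter <= self.last_counter.get(source, -1):
            return False        # replayed or stale frame: reject
        self.last_counter[source] = counter
        return True

f = ReplayFilter()
assert f.accept(0x1A2B, 10)       # first frame from this node is accepted
assert not f.accept(0x1A2B, 10)   # an identical replayed frame is rejected
assert not f.accept(0x1A2B, 7)    # an older captured frame is rejected
assert f.accept(0x1A2B, 11)       # a fresh frame is accepted
```

Note that this protection only works if counters are never reset while the same key is in use, which is one reason proper key revocation for lost or stolen devices matters.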
In order to meet the increased security requirements of smart home and smart building use cases, the Zigbee Alliance is continuously engaged in security improvements – research and development of new algorithms and functions, security protocols, hardware and software support requirements, network and system organization and settings, and regulatory and standards work.
The Wi-Fi Alliance enables the implementation of different security solutions across Wi-Fi networks through the Wi-Fi Protected Access (WPA) family of technologies. As Wi-Fi technology evolves for both personal and enterprise networks, its security capabilities evolve with it.
Today, several levels of security are available for Wi-Fi networks through the WPA protocols: WPA3-Personal, WPA3-Enterprise, WPA2, open Wi-Fi and Wi-Fi Enhanced Open.
WPA3 security protocol
WPA3 is the latest generation of Wi-Fi security protocol and the successor to the successful and widespread WPA2.
WPA3 adds new security features to deliver more robust authentication, enable increased cryptographic strength for highly sensitive information exchange and support resiliency of mission critical networks.
Once implemented, the WPA3 protocol represents current best security practice in Wi-Fi networks: it disables obsolete security protocols and requires the use of Protected Management Frames (PMF). It includes additional features specific to Personal and Enterprise networks and maintains interoperability with WPA2.
WPA3 is currently an optional certification for Wi-Fi Certified devices and is expected to become mandatory as the market matures.
The WPA3-Personal protocol gives individual users better protection through more robust password-based authentication. This is enabled by Simultaneous Authentication of Equals (SAE), which replaces the Pre-Shared Key (PSK) of WPA2-Personal. Its advantages include natural password selection (easy-to-remember passwords), ease of use, and protection of previously transmitted traffic even if a password is later compromised.
The WPA3-Enterprise protocol is aimed at enterprises, governments and financial institutions, and offers an optional mode with 192-bit minimum-strength security protocols and cryptographic tools for better protection of sensitive data. This mode uses authenticated encryption (256-bit Galois/Counter Mode Protocol, GCMP-256); key derivation and confirmation (384-bit Hashed Message Authentication Mode with the Secure Hash Algorithm, HMAC-SHA384); key establishment and authentication (Elliptic Curve Diffie-Hellman exchange and the Elliptic Curve Digital Signature Algorithm, ECDSA, on a 384-bit elliptic curve); and robust management frame protection (256-bit Broadcast/Multicast Integrity Protocol with Galois Message Authentication Code, BIP-GMAC-256).
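The HMAC-SHA384 primitive named in the 192-bit suite is directly available in Python's standard library. The sketch below shows only the key-confirmation idea: both peers compute a MAC over the handshake transcript with the derived key and compare the results; matching tags prove both sides derived the same key. The names and transcript layout here are illustrative, not the actual 802.11 frame format:

```python
import hashlib
import hmac

def key_confirmation(kck: bytes, transcript: bytes) -> bytes:
    # HMAC-SHA384 over the handshake transcript, keyed with the
    # key-confirmation key (KCK) each peer derived independently.
    return hmac.new(kck, transcript, hashlib.sha384).digest()

# Toy 192-bit key standing in for the derived KCK (illustrative only).
kck = hashlib.sha384(b"derived-key-material").digest()[:24]
transcript = b"anonce|snonce|identities"   # made-up transcript fields

tag_ap = key_confirmation(kck, transcript)       # computed by the AP
tag_station = key_confirmation(kck, transcript)  # computed by the client
assert hmac.compare_digest(tag_ap, tag_station)  # matching keys confirm
assert len(tag_ap) == 48                         # SHA-384 output: 384 bits
```

If either side derived a different key, the tags would not match and the handshake would fail, so a password- or certificate-mismatch is caught before any data is exchanged.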
WPA2 security protocol
The WPA2 protocol has provided security and privacy to Wi-Fi networks since 2006. It is the well-known successor to the now-obsolete WPA protocol, its major improvement being the deployment of the stronger AES encryption algorithm.
In 2018, to meet the security requirements of evolving networking environments, the Wi-Fi Alliance augmented the WPA2 protocol with configuration, authentication and encryption enhancements. These enhancements reduce susceptibility to network misconfiguration and improve the security of managed networks with centralized authentication services.
Open Wi-Fi networks
In some use cases, open Wi-Fi networks are the only available option, and it is important to be aware of the risks they present. To address these risks, the Wi-Fi Alliance has developed Wi-Fi Enhanced Open as a solution for users of open Wi-Fi networks.
Compared to traditional open networks with no protection, Wi-Fi Enhanced Open certification provides unauthenticated data encryption to subscribers. It is based on Opportunistic Wireless Encryption (OWE) method defined in the Internet Engineering Task Force (IETF) RFC8110 specification and the Wi‑Fi Alliance Opportunistic Wireless Encryption Specification. Wi-Fi Enhanced Open enables data encryption that maintains the open networks ease of use and benefits network providers because of simple network maintenance and management.
The intensive evolution of its security features makes Wi-Fi technology well suited to the IoT domain, and specifically to smart home and smart building use cases. Like other wireless technologies, however, it has security challenges of its own – all the more important given the sheer number of devices shipping with Wi-Fi chips. Some representative Wi-Fi security challenges are:
Jamming susceptibility – a Wi-Fi signal can easily be jammed today, which deserves particular attention in smart homes and smart buildings: if a home security system is based on Wi-Fi, intruders can block the Wi-Fi signal and disable the alarm.
Because the wireless router or access point is a single point of failure, DoS attacks are a real risk for smart home and smart building Wi-Fi networks. If the access point goes down, the entire wireless network is unavailable.
Eavesdropping is performed simply by getting within range of a target Wi-Fi network, then listening and capturing data. The captured information can be used for a number of unauthorized activities, including attempts to break existing security settings and analysis of unsecured traffic. The nature of wireless networking makes this category of attack almost impossible to prevent reliably, so it is important to configure security mechanisms with strong parameters.
Evil twins, or rogue Wi-Fi hotspots, are one of the most common ways of obtaining sensitive information from Wi-Fi networks. An evil twin is a fake Wi-Fi access point that imitates a legitimate one: its SSID is set to resemble that of the original access point, and any information disclosed while connected to it can be misused.
Packet sniffers make it possible to identify, intercept and monitor web traffic over unsecured Wi-Fi networks and to capture personal information such as login credentials for bank accounts and corporate email accounts.
File sharing, if enabled on devices, can be used for unauthorized access to a device connected to the access point or Wi-Fi hotspot, and as a vector for dropping malware.
Public Wi-Fi hotspots, which may be part of a smart building, are susceptible to malware and ransomware: without the protection of antivirus software and web filters, malware can be silently downloaded.
The generic IP nature of Wi-Fi networks also makes them an ideal environment for testing new hacking tools and refining existing ones.
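One simple heuristic against the evil-twin scenario described above is to keep a list of trusted BSSIDs (radio MAC addresses) for each known SSID and flag any access point that broadcasts a known network name from an unknown radio. The scan data and names below are made up for illustration; a real tool would feed in live scan results:

```python
# Trusted BSSIDs per SSID (illustrative values).
TRUSTED = {
    "HomeOffice": {"aa:bb:cc:dd:ee:01"},
}

def suspicious_aps(scan):
    """Flag access points using a known SSID from an unknown BSSID."""
    flagged = []
    for ssid, bssid in scan:
        if ssid in TRUSTED and bssid not in TRUSTED[ssid]:
            flagged.append((ssid, bssid))
    return flagged

scan = [
    ("HomeOffice", "aa:bb:cc:dd:ee:01"),   # the legitimate AP
    ("HomeOffice", "66:77:88:99:aa:bb"),   # same SSID, unknown radio: evil twin?
    ("CoffeeShop", "11:22:33:44:55:66"),   # not a network we track; ignored
]
assert suspicious_aps(scan) == [("HomeOffice", "66:77:88:99:aa:bb")]
```

BSSIDs can themselves be spoofed, so this check is a first-line warning rather than a guarantee; mutual authentication (WPA3-Enterprise with server certificates) is the robust defense.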
To maintain worldwide interoperability and secure communication between devices from different manufacturers, the Wi-Fi Alliance continuously improves the security solutions implemented in Wi-Fi technology and provides product certification with forward and backward compatibility. This approach is essential for Wi-Fi support of IoT use cases, particularly the evolution of smart homes and smart buildings.
RFID technology is becoming increasingly popular for smart homes, smart buildings and other IoT use cases. RFID is considered to be the successor of the barcode technology.
If any of the security mechanisms in RFID is improperly implemented or not operational, the overall security is broken. In smart home and smart building use cases in particular, this may result in unauthorized access to personal data, or even in personal tracking.
Like other wireless technologies, RFID is exposed to security threats and the most typical RFID security challenges are:
Interference susceptibility is caused by environmental factors such as radio noise and by reflection or absorption from metal and liquids. Interference affects RF propagation and eventually leads to errors in localization services, reduced range and degraded service availability.
Tag isolation is technically the simplest attack, and the most common. It involves jamming tag communications and blocking the data that should be transferred to the reader.
Tag cloning involves eavesdropping, extraction of the unique identifier (UID) and/or the RFID content, and their insertion into another tag. Tag cloning is commonly used for unauthorized access to restricted areas, or even to lower the price of certain goods in supermarkets.
Relay/amplification attacks consist of unauthorized amplification of the RFID signal using a relay, extending the reach of the RFID tag beyond the borders of its coverage zone.
Denial-of-service (DoS) attacks include the scenario in which a reader is flooded with such a large amount of information from a malicious source that it cannot process the operational signals sent by real tags. Other techniques are based on jamming – emitting radio noise at the RFID system's operating frequency.
Remote tag destruction is carried out with RFID "zappers" able to transmit energy remotely; the resulting electromagnetic field can be strong enough to burn out components of the tag. Remote tag destruction is also possible when the kill password supported by some tags is misused – first recovered by passive eavesdropping and then applied intentionally to disable the tags.
Man-in-the-middle (MitM) attacks, SQL injection, virus/malware and command injection are possible by placing an active malicious device between a tag and the reader to intercept or alter the communication between them and endanger the reader's operation.
RFID skimming involves the deployment of unauthorized portable terminals to make fraudulent charges to payment cards.
To build a secure wireless network, these security challenges have to be taken into account when designing smart home or smart building systems based on RFID technology.
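The tag-cloning threat above is why higher-end RFID tags go beyond a static UID and add a challenge-response step: the reader sends a fresh random challenge and the tag answers with a MAC computed from a secret key that never leaves the chip, so a clone that copied only the UID cannot answer correctly. A toy sketch of the idea, using HMAC as the MAC (real tags use their own, often AES-based, schemes; all names and keys here are illustrative):

```python
import hashlib
import hmac
import secrets

def tag_response(tag_key: bytes, challenge: bytes) -> bytes:
    # Computed inside the tag; the key itself is never transmitted.
    return hmac.new(tag_key, challenge, hashlib.sha256).digest()

def reader_accepts(expected_key: bytes, challenge: bytes, response: bytes) -> bool:
    # The reader (or backend) knows the key it expects for this tag.
    return hmac.compare_digest(tag_response(expected_key, challenge), response)

genuine_key = b"sixteen-byte-key"          # toy 128-bit tag secret
challenge = secrets.token_bytes(8)         # fresh challenge per read

# A genuine tag answers correctly...
assert reader_accepts(genuine_key, challenge, tag_response(genuine_key, challenge))
# ...while a clone that copied the UID but not the key fails.
cloned_key = bytes(16)
assert not reader_accepts(genuine_key, challenge, tag_response(cloned_key, challenge))
```

Because the challenge is random on every read, replaying a previously recorded response also fails, which addresses eavesdrop-and-replay cloning as well.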
NFC wireless technology enables everyday objects to connect to the Internet. Its applicability in the IoT domain, and to smart home and smart building use cases in particular, is significant, especially considering that virtually all modern personal devices (cellphones, tablets and notebooks) carry NFC chips and are mutually compatible.
One of the security mechanisms implemented in NFC is the digital signature (defined in the NFC Forum Signature RTD 2.0) with asymmetric key exchange. The digital signature is part of the NFC Data Exchange Format (NDEF) message, which also includes a certificate chain and a root certificate; each NFC device has a private and a public key. Another security mechanism is the Trusted Tag, developed by the NFC tag manufacturer HID. It fully complies with NFC Forum Tag Type 4 and works with any NFC Forum compatible device. The Trusted Tag is protected from cloning and embeds a cryptographic code, generated with every "tap" or click of the NFC button, that protects the content of the transmitted information.
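The NDEF message mentioned above frames each record with a flags byte (message-begin/end, short-record and TNF bits), a type length, a payload length, and then the type and payload. A deliberately simplified parser for a single short NDEF Text record illustrates the layout (real NDEF also supports long records, ID fields, chunking and, per Signature RTD 2.0, signature records, none of which are handled here):

```python
def parse_short_ndef_record(data: bytes):
    flags = data[0]
    assert flags & 0x10, "only short records (SR=1) handled in this sketch"
    tnf = flags & 0x07          # Type Name Format (0x01 = NFC Forum well-known)
    type_len = data[1]
    payload_len = data[2]       # one byte because SR=1
    type_start = 3
    payload_start = type_start + type_len
    rec_type = data[type_start:payload_start]
    payload = data[payload_start:payload_start + payload_len]
    return {"tnf": tnf, "type": rec_type, "payload": payload}

def decode_text_payload(payload: bytes) -> str:
    # Text RTD: status byte (low bits = language-code length), language code, text.
    lang_len = payload[0] & 0x3F
    return payload[1 + lang_len:].decode("utf-8")

# 0xD1 = MB|ME|SR set, TNF=0x01; type "T" marks a Text record.
record = bytes([0xD1, 0x01, 0x05]) + b"T" + b"\x02enHi"
parsed = parse_short_ndef_record(record)
assert parsed["type"] == b"T"
assert decode_text_payload(parsed["payload"]) == "Hi"
```

Seeing the structure makes clear why the digital signature matters: without a signature record, nothing in this framing proves who wrote the tag, so an attacker can rewrite an unprotected tag with a malicious URI or payload.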
NFC technology operates over a very limited range and includes additional protections, such as PIN or biometric locks, that enable secure data exchange.
General theft or loss of a device is hard to avoid. The best defense against this threat is to secure phones, tablets and other personal devices against unauthorized login and use.
Eavesdropping and interception attacks apply to NFC technology.
A man-in-the-middle attack is possible if a malicious device is positioned between two NFC devices, or within their short range, where it receives and alters the exchanged information. Such attacks can be countered by staying alert to unusual devices attached or positioned near transaction points, and by ensuring that NFC transactions are carried out only in official, authorized places.
5G for smart home / smart buildings
The deployment of 5G cellular communication technology establishes a new ecosystem with great potential, based on the creation of the most advanced and most critical communications infrastructure ever built – one capable of supporting new services, including efficient information exchange in an IoT context. The outlines of this new ecosystem are shaped by 5G technology features such as:
High-speed data throughput (1-20 Gbps) capable of effectively supporting augmented and virtual reality (AR/VR) systems, 3D video streaming to 4K/8K screens, online gaming and other services.
Ultra-low latency (<1 ms), which is of crucial importance for real-time services such as telemedicine and healthcare, AR/VR, intelligent transportation, smart homes and industrial automation.
Millimeter-wave radio communications with new waveforms and massive MIMO (Multiple Input, Multiple Output) with beamforming and beam management, made practical by the frequency range: short wavelengths allow small antenna elements and tight spacing.
Massive connectivity and dense coverage for vehicles, mobile subscribers, enterprises, IoT etc.
Very low energy consumption with extremely long battery life (up to 10 years), necessary for IoT M2M (Machine to Machine) communications.
To enable these capabilities, 5G networks provide a completely new air interface capable of supporting heterogeneous access networks in different frequency bands and with variable bandwidths. Supported by a small-cell network structure, it ensures ultra-low latency, strong indoor and outdoor coverage, localization and service availability. The Cloud Radio Access Network (CRAN) model implemented in 5G enables a split access architecture and the deployment of network virtualization: a central edge-cloud location is responsible for some access network functions, while others are realized at remote locations, enabling the separation of fronthaul and backhaul in the transport network.
5G networks also enable adaptable software-based architecture technologies, applicable especially to the first three layers of the OSI reference model – Software Defined Radio (SDR), Software Defined Access (SDA) and Software Defined Networking (SDN) – together with packet core network upgrades. These technologies enable network slicing, a distinctive 5G capability. Network slicing manages the creation of multiple virtual networks within a shared physical infrastructure and is expected to be a crucial feature empowering the deployment of different 5G use cases.
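The slicing idea – many logical networks, each with its own service profile, carved out of one physical infrastructure – can be made concrete with a toy model. The slice names and numbers below are illustrative, not 3GPP-defined parameters:

```python
from dataclasses import dataclass

@dataclass
class Slice:
    name: str
    max_latency_ms: float        # latency bound the slice guarantees
    min_throughput_mbps: float   # throughput floor the slice guarantees

# Hypothetical slices an operator might provision for a smart building.
SLICES = [
    Slice("smart-home-sensors", max_latency_ms=100.0, min_throughput_mbps=0.1),
    Slice("building-video-surveillance", max_latency_ms=50.0, min_throughput_mbps=25.0),
    Slice("alarm-and-safety", max_latency_ms=1.0, min_throughput_mbps=1.0),
]

def pick_slice(required_latency_ms: float):
    # Route a service onto the least-demanding slice that still meets its
    # latency requirement, keeping stricter slices free for critical traffic.
    candidates = [s for s in SLICES if s.max_latency_ms <= required_latency_ms]
    return max(candidates, key=lambda s: s.max_latency_ms) if candidates else None

assert pick_slice(60.0).name == "building-video-surveillance"
assert pick_slice(0.5) is None   # no slice meets a 0.5 ms requirement
```

Real slice selection involves 3GPP identifiers and orchestration across the RAN, transport and core, but the principle is the same: each service class is mapped to an isolated virtual network with guarantees matched to its needs.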
Expectations are that 5G will expand boundaries in all domains of modern life such as travelling, driving, production efficiency improvements, smart systems deployment such as smart cities with smart homes, buildings, hospitals, factories, public safety and services management etc. – all areas of human activity.
AI, IoT and 5G technology are intertwined. 5G is responsible for providing a network environment capable of supporting the widespread use of AI and IoT applications and services, while AI significantly improves network management and service availability. Through the integration and advancement of these technologies, 5G carriers are in a position to improve network planning, capacity-expansion forecasting, coverage auto-optimization, network slicing, CRAN and dynamic cloud network resource scheduling.
AI is recognized as a game changer that will lead the transformation from carriers' current management model, based on human capabilities, to a self-driven, automated network operation and maintenance model. At the same time, the availability of IoT and AI applications and services is directly correlated with the construction of new 5G infrastructure and network deployment.
When considering smart buildings and smart homes use cases, automation saves time and costs. Automation processing is moved to a higher level with 5G technology.
By supporting massive connectivity, 5G enables the deployment of smart home devices that work automatically, with no additional configuration. For instance, by connecting utility meters to a central network, suppliers of energy and other utilities can monitor, detect and respond to any unusual changes in consumption across smart buildings, homes and cities.
Improvements in the performance of security systems are expected thanks to the lower latency, high throughput and network reliability that 5G technology provides.
Taking into account predictions that over 80% of traffic will originate from indoor subscribers, indoor coverage becomes extremely important. The small-cell structure of 5G networks improves indoor coverage compared to other cellular technologies and, at the same time, drives the evolution of HD enterprise services, home VR, holographic communication, telemedicine and other new services applicable to smart homes and buildings.
5G networks are designed not only to enable information exchange between people, but also to connect machines. Security and privacy are major concerns that extend far beyond those of earlier generations, since 5G has the most complex infrastructure of any communications technology and will support millions of low-cost sensors that also affect security. 5G security and privacy considerations encompass new trust models, new service delivery models, an evolved threat environment, and privacy concerns.
To support all new relations between distinct entities in the 5G ecosystem, new trust models have to be established.
Security requirements are expected to increase in areas such as authentication between the distinct elements of a complex system, accountability and non-repudiation.
New categories of devices will shape the trust models and extend the wide range of different security requirements applicable to many use cases such as industry automation control devices, smart home devices with associated services, next-generation of personal devices like tablets and smartphones, etc.
New identity management solutions play an important role in defining new trust models, too.
The applicability of cloud technology, AI and network virtualization in 5G networks shapes the new service delivery models. The decoupling of software and hardware, the separation of fronthaul and backhaul in the transport network, and the deployment of third-party applications in the cloud alongside native telecommunications services all raise the demands on virtualization with strong isolation properties, forcing new approaches to security organization and deployment.
New capabilities bring new threats and challenges with them, and new privacy concerns as well.
In general, the level of 5G security is not defined by the number of specified security mechanisms. A multi-stakeholder approach involving operators, vendors, regulators, policy makers and representatives of 5G subscribers (from the different ecosystem segments) is fundamental to the security baseline of trustworthy, cost-efficient and manageable 5G networks. In such a complex landscape, standardization is of crucial importance for everyone – enterprises, public safety, industrial automation, smart homes and buildings, and more. Standards defined by bodies such as the ISO (International Organization for Standardization), the IEC (International Electrotechnical Commission) and the CSA (Cloud Security Alliance) will also shape the technology's evolution, applicability and service availability. To provide safe and secure wireless connectivity worldwide, comprehensive new security policies have to be created and implemented in 5G technology.
5G security challenges
Like other wireless communications technologies, 5G is susceptible to security challenges – in fact, even more so.
Several specific factors determine 5G's unique susceptibility in the security context:
Network components that are virtualized and potentially deployed on the NFVI (Network Function Virtualization Infrastructure) and cloud components provide dynamic configurations of 5G architecture and need more dynamic and flexible security solutions.
Complex control of Network slicing – as a completely virtual type of networking deployed through all entities of 5G network.
Radio access network is vulnerable to all common wireless network security threats such as rogue nodes, modification, altering or injecting user plane traffic, MEC server vulnerability and DoS attacks.
AI applicability to 5G networks generates the new cybersecurity challenges, such as AI “black boxes”, the inability to test AI for intentional backdoors, or adversarial learning, which is remote reprogramming of the neural network algorithms.
Since the 5G network is managed by many different pieces of software, protecting against API (Application Programming Interface) and other software vulnerabilities within the network becomes a priority, together with external roaming threats.
5G implements edge computing, which introduces potential new security threats. It moves processing from the core and places it at the edges of the network, spatially distributed closer to high-density data sources.
The expansion of bandwidth in 5G creates a more complex air interface and security challenges such as eavesdropping, RF jamming, MitM attacks, and complex resource administration and monitoring.
Vulnerability is increased by attaching billions of smart but often low-cost and hackable devices to IoT networks, alongside other subscriber devices that can suffer from malware, MitM attacks, DDoS (Distributed Denial of Service) and other botnet-type attacks, lack of device-tampering protection, snooping and sniffing attacks, etc.
Protection of subscribers' personal privacy is a very important component of 5G security. It includes access to location information (location-based services) as well as data and personal information privacy (personal health information, identity management, or employee personal information held by enterprises).
Quantum technology is expected to break almost all encryption solutions available today. This issue has to be resolved in time by upgrading encryption models to quantum-resilient ones. For example, SK Telecom, South Korea's largest mobile operator, has already developed Quantum Key Distribution (QKD) technology for its 5G network.
Without further standardization, regulation and strong proactive measures, 5G networks will offer the widest and most attractive attack surface and remain vulnerable to cyber-attacks.
Conclusion – wireless technologies and IoT perspective
It is impossible to deploy a functional IoT ecosystem without the support of wireless technologies. They provide the communication between billions of devices, network and application servers, cloud infrastructure, machines and sensors, subscribers, new applications and services, etc.
The latest cellular technology, 5G, is recognized as a game changer that will support heterogeneous wireless technologies, open new perspectives for AI and augmented reality applications, and provide the infrastructure needed for secure and safe deployment of smart homes, smart buildings, smart cities and any other IoT use case.
Enabled by 5G, the potential of smart homes, smart buildings and smart cities will explode. With its wide applicability and ubiquity, the arrival of 5G will further expand the demand for smart home devices, shape their development, lead to more competitive pricing and make them more accessible in everyday life.
It is essential to pay close attention to the integration and configuration of wireless devices, in compliance with system needs, to achieve secure communication in different IoT use cases. Regardless of the security protocols applied in wireless cellular technologies and the security solutions implemented across the different layers of the OSI reference model, we must also keep the focus on user awareness, one of the most important factors contributing to overall system safety and security, especially in smart home and smart building use cases.
In a major milestone for 5G, 3GPP finalized Release 16 in July, its second set of specifications for 5G New Radio (NR) technology. As the second article in my series of 5G 101 articles, this is a good opportunity to review the 3GPP process and the major 5G-related technical specification releases, as well as to clarify some misconceptions about the 5G development process.
This article provides an overview of what 3GPP is and why it matters. It also explains the 5G-related Releases 15, 16 and 17, with the focus on Releases 16 and 17.
Release 16 was completed on July 3, 2020, with a slight delay due to the COVID-19 pandemic. Looking ahead, 3GPP Release 17 is expected in 2021, although 3GPP announced that the Release is "at high risk of being delayed," citing the switch from physical to virtual meetings.
A common misunderstanding is that 3GPP (Third Generation Partnership Project) is a "standardization body". In fact, 3GPP only develops and maintains global technical specifications, with the objective of ensuring that network equipment and handset manufacturers can develop products that are interoperable all over the world. 3GPP is a collaborative activity between well-established regional standards organizations. The 7 telecommunications standards development organizations (3GPP's Organizational Partners) use these specifications to create standards.
3GPP Organizational Partners and their Role
The 3GPP organizational partners, or standards development organizations, include:
ARIB – The Association of Radio Industries and Businesses, Japan
ATIS – The Alliance for Telecommunications Industry Solutions, USA
CCSA – The China Communications Standards Association, China
ETSI – The European Telecommunications Standards Institute, Europe
TSDSI – The Telecommunications Standards Development Society, India
TTA – The Telecommunications Technology Association, Korea
TTC – The Telecommunication Technology Committee, Japan
Among other things, the Organizational Partners:
Allocate human and financial resources to the Project Co-ordination Group
Act as a body of appeal on procedural matters referred to them
Cellular Technology Evolution from 2G to 5G
Figure 1 shows the evolution of technology from GSM/CDMA (2G) to EDGE/WCDMA/1xEVDO (3G).
While ETSI specified GSM/EDGE/WCDMA and HSPA, TIA specified the CDMA evolution. There was a multitude of standardization bodies, but by Rel. 8 they all converged on one global standard called "LTE". The critical driver for the convergence was Verizon's choice of LTE over other radio specifications (e.g. 1xEVDO Rev. B and WiMax). Going forward, 3GPP defined the specifications that led to the definition of 5G in 2017.
The 3GPP Release Schedule
The 3GPP release schedule since 2000 is shown in the following table, along with brief release details.
The 3GPP Specification
The 3GPP specification covers the "GSM" family of cellular telecommunications technologies, including:
radio access,
core network and
service capabilities.
The 3GPP specifications provide a complete system description for mobile telecommunications. They also provide connectivity for non-radio access to the core network and interworking with non-3GPP networks. The three Technical Specification Groups (TSG) in 3GPP are:
Radio Access Networks (RAN)
Core Network & Terminals (CT)
Services & Systems Aspects (SA)
The Global Reach of 3GPP
The power of 3GPP is that it defines the specifications used by more than 6 billion mobile subscribers worldwide in 2020. This subscriber base will continue to grow at a dramatic pace, as illustrated in the Ericsson Mobility Report:
One reason for the spectacular adoption of the 3GPP technical specifications is the collaborative approach: it is a global system engineering project. Participants, including vendors and operators, take part in the creation of the specification from initial R&D to the final product built according to the specification. Ideas are taken to the body for approval, translate into a study item, and are followed by work items that result in the technical specification.
What Are the Requirements on the 5G Spec?
5G is defined by a set of requirements that enable a set of usage scenarios. These use cases are depicted in the following figure from the International Telecommunications Union (ITU), defined as IMT-2020:
Release 15 Recap
Release 15's focus is Enhanced Mobile Broadband, and therefore all its features are geared towards enabling eMBB use cases.
New items introduced in Release 17
NR up to 71 GHz (extend the current NR waveform up to 71 GHz) and explore new and more efficient waveforms for the 52.6 – 71 GHz band
NR NB-IoT/eMTC: The objective is to develop cost-effective devices with capabilities between full-featured NR and Low Power Wireless Access (e.g., NB-IoT) – for example, devices that support tens or hundreds of Mbps rather than multi-gigabit speeds. Typical use cases are wearables, Industrial IoT (IIoT), and others.
XR – The objective is to evaluate and adopt improvements that make 5G even better suited for AR, VR, and MR. This includes evaluating distributed architectures that harness the power of edge-cloud and device capabilities to optimize latency, processing, and power.
Non-Terrestrial Network (NTN) support for NR & NB-IoT/eMTC – A typical NTN is a satellite network. The objective is to address verticals such as mining and agriculture, which mostly operate in remote areas, as well as to enable global asset management, transcending continents and oceans.
Perfecting items introduced in Release 16
Rel. 16 was a short release with an aggressive schedule. It improved upon Rel. 15 and introduced some new concepts. Rel. 17 aims to make those new concepts well rounded.
Integrated Access & Backhaul – Enable cost-effective and efficient deployment of 5G by using wireless for both access and backhaul, for example, using relatively low-cost and readily available millimeter wave (mmWave) spectrum in IAB mode for rapid 5G deployment. Such an approach is especially useful in regions where fiber is not feasible (hilly areas, emerging markets).
Positioning – Achieve centimeter-level accuracy, based only on cellular connectivity, especially indoors. This is a key feature for wearables, IIoT, and Industry 4.0 applications. Lead – CATT (NYU)
Sidelink – Expand use cases from V2X-only to public safety, emergency services, and other handset-based applications by reducing power consumption and latency and improving reliability. Lead – LG
IIoT and URLLC – Evaluate and adopt any changes that might be needed to use the unlicensed spectrum for these applications and use cases.
Fine tuning features introduced in Release 15
Rel. 15 introduced 5G. Its primary focus was enabling enhanced Mobile Broadband (eMBB). Rel. 16 enhanced many of the eMBB features, and Rel. 17 is now trying to optimize them even further, especially based on learnings from the early 5G deployments.
Further enhanced MIMO – FeMIMO – This improves the management of beamforming and beamsteering and reduces associated overheads.
Multi-Radio Dual Connectivity – MRDC – Mechanism to quickly deactivate unneeded radio when user traffic goes down, to save power.
Dynamic Spectrum Sharing – DSS– DSS had a major upgrade in Rel 16. Rel 17 is looking to facilitate better cross-carrier scheduling of 5G devices to provide enough capacity when their penetration increases.
Coverage Extension– Since many of the spectrum bands used for 5G will be higher than 4G (even in Sub 6 GHz), this will look into the possibility of extending the coverage of 5G to balance the difference between the two.
Along with these, many other SIs and WIs, including Multi-SIM, RAN Slicing, Self-Organizing Networks, QoE Enhancements, NR-Multicast/Broadcast, UE power saving, etc., were adopted into Rel. 17.
No, no it doesn't. Huawei's code might well be extremely secure. Their code is certainly among the most scrutinized. But the recent UDG source code review is not evidence of security.
ERNW, an independent IT security service provider in Germany, recently performed a technical review / audit of Huawei’s Unified Distributed Gateway (UDG) source code. Huawei made the summary report available here [PDF].
The review focused on the quality of the source code and the source code management practices. The report is overall positive and showed that Huawei has significantly improved its software engineering processes. At least for the UDG product.
It's a welcome improvement, and Huawei deserves some recognition for improving their software engineering practices. A year ago, the UK government's Huawei Cyber Security Evaluation Centre (HCSEC) issued a report that "revealed serious and systematic defects in Huawei's software engineering and cyber security competence".
Based on the positive report by ERNW, Huawei has mounted a PR campaign implying that the report is proof of their 5G core network being secure and reliable.
Dozens contacted me over the last few days asking what this report really means. Here’s my $0.02.
This discussion is purely from the software engineering and cybersecurity perspectives. And based only on the publicly available ERNW summary report. I am not offering any opinions on the whole Huawei controversy here or on the 5G and COVID-19 conspiracy theory.
The ERNW’s technical review scope was the code base of UDG 20.2.0 5G.
UDG is just one of many 5G-related products within Huawei’s Cloud Core Network product line. And only one version was tested.
Huawei’s Cloud Core Network product line comprises several product groups. 5G Core product group consists of 5G Core, UNC, UDM, UPCF and UDG. Other product groups within the Cloud Core Network include CS&IMS, Mobile Packet Core, SDM, SmartPCC, Signalling, SEQ Analyst, SingleOSS-CN and others—each with several individual products. To make things more complex, there are also products within other product lines that a telco would consider if they implement 5G SA with Huawei products.
Extrapolating the positive UDG source code quality finding to security and reliability of the whole of 5G core gear is like saying that one cashier drawer in a bank has a placeholder for an advanced lock, therefore the entire bank is secure.
The source code in scope of this audit contained approximately 30 million lines of code (LOC). While LOC is a silly metric, to give you an indication – it is similar in size to the Microsoft Office for Mac code base, i.e. it’s massive.
Consider the size of the code base of just this one product, and the fact that each of these dozens of products is constantly being developed, and you'll understand why I've been arguing for years that Huawei source code reviews, as undertaken by some governments, are not a feasible approach to evaluating the security of 5G gear.
McCabe cyclomatic complexity
ERNW analyzed the source code complexity using an industry-accepted approach: measuring McCabe's cyclomatic complexity. It is a quantitative measure of the number of linearly independent paths through a program and can indicate the program's complexity. A higher value makes the program more difficult to understand and maintain.
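To make the metric concrete, here is a small sketch of my own (not ERNW's tooling) that approximates McCabe complexity for Python code by counting branch points with the standard-library ast module; for a single function, M = decision points + 1:

```python
import ast

# Node types that each open one extra linearly independent path.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity of a snippet: decision points + 1."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(tree))
    return decisions + 1

snippet = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10:
            return "large"
    return "small"
"""
print(cyclomatic_complexity(snippet))  # 3 decision points -> 4
```

Real tools count a few more constructs (switch arms, short-circuit operands), but the principle is the same: every branch adds a path that has to be understood and tested.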
ERNW set the threshold value for cyclomatic complexity at 100 based on "industry best practices". For the UDG 5G components, the average complexity per file was calculated at 90.284, demonstrating that, on average, the source code complexity is below the threshold.
The Software Engineering Institute (SEI) at Carnegie Mellon University defines four ranges for cyclomatic complexity based on an industry-accepted way of calculating the complexity. Cyclomatic complexity values below 10 represent low inherent execution and maintainability risk. Values of over 50 represent an untestable program with a very high risk.
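The commonly cited SEI bands (roughly 1-10, 11-20, 21-50, and above 50) can be written down as a trivial classifier; the band labels here are paraphrased:

```python
def sei_risk(cc: int) -> str:
    """Map a cyclomatic complexity value to the SEI's four risk bands."""
    if cc <= 10:
        return "low risk - simple program"
    if cc <= 20:
        return "moderate risk - more complex program"
    if cc <= 50:
        return "high risk - complex program"
    return "very high risk - untestable program"

# A per-file average of ~90 would land in the top SEI band,
# even though it sits below ERNW's threshold of 100.
print(sei_risk(90))
```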
Without understanding the non-SEI ERNW’s cyclomatic complexity metric, how they calculated the total cyclomatic complexity, and how they defined the threshold, I can’t comment on the result other than saying:
5G is a mission-critical and safety-critical system. Its cyclomatic complexity should be much closer to 0 than barely below an "industry best-practice" threshold. (Unless ERNW took the "industry" to mean the avionics, aerospace, medical, weapons systems, nuclear, or automotive domains.)
Cyclomatic complexity is correlated with security, but only indirectly. A complex system would be harder to understand and maintain, which would make it more time consuming to find and fix vulnerabilities. It also might make it easier to introduce vulnerabilities. But that’s all.
ERNW found code duplication to be 2.2% on average, which is a relatively good result for average business software but might still be too much for critical software.
Some code duplication is normal. I would even argue that excessive DRYing ("Don't Repeat Yourself") of code can hurt readability, thereby defeating the purpose of DRY.
However, in this case 2.2% of the 30 million LOC means that there are 660,000 LOC with higher risk. If a fix to a vulnerability touches any of those 660,000 LOC, the developer has to track down the duplicates and repeat the edit.
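The arithmetic behind that figure, using the report's numbers:

```python
total_loc = 30_000_000     # approximate size of the audited UDG code base
duplication_ratio = 0.022  # 2.2% average duplication reported by ERNW

duplicated_loc = round(total_loc * duplication_ratio)
print(f"{duplicated_loc:,} LOC")  # 660,000 LOC
```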
While code duplication, like cyclomatic complexity, degrades the maintainability of the code base, it is also only indirectly correlated with security.
Code duplication also should not be reduced to a single number with no other comments. There is a significant difference in risks depending on what kind of code is being duplicated. The summary report says nothing about it.
Auditors found that Huawei is putting the right processes in place to reduce the use of unsafe functions and is avoiding them extensively. They did, however, find some usage of unsafe functions, which they recommended be reduced. Without knowing more details, I can only say: the tested code still uses some unsafe functions. Use of unsafe functions is one of the highest cybersecurity-related risks in software engineering.
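A toy checker in the spirit of such an audit (the function list and the sample are my own illustration, not ERNW's actual tooling) might simply scan C sources for classically unbounded calls:

```python
import re

# Classic C functions with no bounds checking; safer variants exist
# (e.g., strcpy -> strncpy/strlcpy, sprintf -> snprintf, gets -> fgets).
UNSAFE_CALLS = re.compile(r"\b(strcpy|strcat|sprintf|gets|scanf)\s*\(")

def find_unsafe_calls(c_source: str):
    """Return (line_number, function_name) pairs for unsafe call sites."""
    hits = []
    for lineno, line in enumerate(c_source.splitlines(), start=1):
        for match in UNSAFE_CALLS.finditer(line):
            hits.append((lineno, match.group(1)))
    return hits

sample = 'void f(char *dst, const char *src) {\n    strcpy(dst, src);\n}\n'
print(find_unsafe_calls(sample))  # [(2, 'strcpy')]
```

Real audits use proper static analyzers rather than regexes, but the underlying idea is the same: known-dangerous call sites are enumerated and then triaged.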
Auditors found that representative unit test cases were technically suitable and recommended that test coverage be increased to 75%. Here, ERNW again seems to set target thresholds based on average business systems, not critical systems.
My take: Below 75% unit testing coverage for components with high value functionality is not adequate even for an average business system, let alone for a mission-critical or a safety-critical system.
Variant analysis of the source code
Variant analysis means that ERNW took bad patterns they found in the source code and used them as seeds to look for other, similar problems in the code.
From the report: “The variant analysis identified additional bad patterns.”
In the dynamic analysis phase, the auditors used a fuzzing approach to subject the running code to a variety of inputs, boundary conditions, etc. The auditors found positive test cases with either crashes or undefined states. ERNW summarized the findings as: "The results are common for projects with similar characteristics in terms of complexity and size of the code base".
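A minimal sketch of that dynamic phase, with a hypothetical parse_message function standing in for the code under test: throw boundary and random inputs at the target and record anything that fails in an undocumented way:

```python
import random

def parse_message(data: bytes) -> int:
    """Hypothetical parser under test: returns a declared length field."""
    if len(data) < 2:
        raise ValueError("truncated header")
    return data[0] << 8 | data[1]

def fuzz(target, rounds: int = 1000, seed: int = 42):
    """Feed boundary cases and random bytes to target; collect crashes."""
    rng = random.Random(seed)
    boundary_cases = [b"", b"\x00", b"\xff" * 65535]
    crashes = []
    for i in range(rounds):
        data = (boundary_cases[i] if i < len(boundary_cases)
                else bytes(rng.randrange(256)
                           for _ in range(rng.randrange(64))))
        try:
            target(data)
        except ValueError:
            pass                      # expected, documented failure mode
        except Exception as exc:      # crash / undefined state: a finding
            crashes.append((data, exc))
    return crashes

print(len(fuzz(parse_message)))  # 0 findings for this well-behaved parser
```

Production fuzzers (AFL, libFuzzer and the like) add coverage feedback and input mutation, but the contract is the same: unexpected exceptions or crashes are findings, documented error paths are not.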
My take is that 5G core component’s source code quality shouldn’t be compared with the quality of an average software product, but with those of mission-critical and safety-critical systems.
Build process and open-source lifecycle management
The last part of the summary report looked at the build process and open-source lifecycle management. Conclusion summaries from the report:
Secure Compilation Options: The binaries are compiled comprehensively with secure compilation options.
Binary Equivalence: All binaries are built with binary equivalence. Overall, an acceptable amount of binary equivalence is achieved.
Only minor improvements for the open-source lifecycle management were recommended. From the auditor’s perspective the separation of the code, the handling, documentation and patch management are reasonable and meet all requirements of a state-of-the-art open-source lifecycle management.
The test covered only a small subset of 5G core.
A few positive source code quality findings only indirectly correlate with security.
The approach and the aim of the audit did not cover source code analysis from the secure code perspective nor the review of secure software engineering practices.
This report can’t tell us conclusively anything about the UDG security, let alone about Huawei’s 5G core security.
NFC data rates: 106 kbps, 212 kbps, 424 kbps
IEEE 802.11ah (Wi-Fi HaLow)
IEEE 802.16 (WiMax): 40 Mbit/s (mobile), 1 Gbit/s (fixed)
IEEE 802.15.4 data rates: 250 kbps (2.4 GHz), 40 kbps (915 MHz), 20 kbps (868 MHz); range ≈ 30 m (outdoor 300 m)
Weightless (W, N, P): range ≈ 2 km (P), ≈ 5 km (W, N)
RFID: 4 kbps - 640 kbps (depending on the active or passive type of device and frequency range)
Low-band 5G 600 - 700 MHz /Mid-band 5G 2.5-3.7 GHz / High-band 5G 25 - 39 GHz and higher frequencies up to 80GHz
Range is correlated with frequency bands - low band 5G has similar range to 4G (tens of kilometers), Mid-band 5G has several km range. High-band 5G has hundreds of meters up to 1.5 km range.
Low-band 5G (600 - 700 MHz) is giving download speeds a little higher than 4G at the moment: 30-250 Mbps. Mid-band 5G (2.5-3.7 GHz) currently allows speeds of 100-900 Mbps. High-band 5G (25 - 39 GHz and higher frequencies up to 80 GHz) currently achieves multi-gigabit peak speeds.
Personal, Single building, Campus, LAN, Software defined WAN (SD WAN)
Encryption has evolved from 4G. It is more complex and based on a multi-layer and multi-criteria approach. Generally, the level of 5G security is not defined by the number of specified security mechanisms, but by a multi-stakeholder approach involving operators, vendors, regulators, policy makers and subscriber representatives.
Expectations are that 5G will expand boundaries in all domains of modern life, such as travelling, driving, production efficiency improvements, and smart systems deployment: smart cities with smart homes, buildings, hospitals, factories, and public safety.
The first phase of 5G specifications is defined in 3GPP Release 15. 5G is equipped with a new air interface that supports heterogeneous access networks and handles variable bandwidths. Packet core network upgrades are also implemented, where traditional and virtualized network functions coexist.
12.8 Kbit/s - 60kbit/s
Peer to peer, Star, Mesh, broadcast, ANT - FS, shared cluster.
ANT supports an 8-byte (64-bit) network key and 128-bit AES encryption for ANT master and slave channels. If further security is required, authentication and encryption can be implemented at the application level.
ANT devices may use the public network key.
ANT is used in residential, commercial and industrial sensing and control applications.
ANT+ is used predominantly in health and wellness: blood pressure monitoring, fitness, cycling, running, continuous glucose monitoring, emergency response alerts, audio control, heart rate monitoring, etc.
ANT is a purpose-built ultra-low-power wireless networking protocol operating at 2.4GHz. ANT+ is an implementation of ANT and is an ecosystem of interoperable products built on device profiles.
ANT devices may use any RF frequency from 2400MHz to 2524MHz
Several security modes are recognized in Bluetooth technology, and generally each version of the Bluetooth standard supports some of them. These modes differ in the point at which security is initiated in Bluetooth devices.
Bluetooth technology is used for data streaming or file exchange between mobile phones, PCs, printers, headsets, joysticks, mice, keyboards, stereo audio or in the automotive industry.
Bluetooth is a wireless communications technology based on the IEEE 802.15.1 protocol. A Bluetooth network is supported by 1 master and up to 7 slave nodes, while the number of slave nodes is not limited by the specification in BLE networks.
BLE (Bluetooth Low Energy) or Bluetooth Smart (Bluetooth 5, 4.2)
In compliance with Bluetooth Specification Version 5.0, two security modes are implemented in BLE: Security Mode 1 and Security Mode 2.
BLE Security Mode 1 has 4 levels:
1) No security (no authentication and no encryption);
2) Unauthenticated pairing with encryption;
3) Authenticated pairing with encryption;
4) Authenticated LE Secure Connections pairing with encryption.
BLE technology is nowadays an indispensable part of mobile phones, PCs and other types of devices used in gaming, sports, wellness, industrial, medical, home and automation electronics. BLE provides wireless connectivity that enables home automation.
It is important to note that Bluetooth and BLE are not compatible technologies. For example, the channel bandwidth in Bluetooth technology is 1 MHz while in BLE it is 2 MHz, and the number of channels in Bluetooth is 79, while BLE is supported by 40 channels.
Authentication algorithms were not very strong in 2G networks and were based on a master security key. In the 3G wireless standard (3GPP based), the authentication mechanism was enhanced to become a two-way process. In addition, 128-bit encryption and integrity protection were introduced.
2G offered digital communications. 3G was a generic data cellular mobile technology that provided broadband transmissions. 4G is the first all-IP cellular data communication technology, with dominant data transfer services and IoT support capabilities.
Expectations are that the IoT ecosystem and its evolution support will be the most important criteria for further development of cellular mobile technologies.
≈ 20 kbps and 40 kbps (BPSK ), ≈ 250 kbps (O-QPSK with DSSS)
Star, Mesh, peer-to-peer
The IEEE 802.15.4 standard protects information at the Medium Access Control (MAC) sublayer of the OSI reference model. The cryptographic mechanism implemented in this standard is based on symmetric-key cryptography and uses keys that are provided by the higher layers.
Typical use cases are smart homes and buildings i.e. home security, lighting control, air conditioning and heating systems; industrial automation; automotive sensing; education; consumer electronic devices and personal computer accessories.
The IEEE 802.15.4 standard defines the interconnection protocol for low-rate wireless personal area networks (LR-WPANs). This standard provides short-range wireless communications between battery-powered nodes. The power consumption in IEEE 802.15.4 networks is very low.
The ISA 100.11a standard is embedded with integrity checks and optional encryption at the data link layer of the OSI reference model. Moreover, security mechanisms are provided in the transport layer, too. 128-bit keys are used in both the transport and data link layers.
The most important use cases are reliable monitoring and alerting, asset management, predictive maintenance and condition monitoring, and open-loop and closed-loop industrial control applications.
ISA 100.11a low data rate connectivity is supported with increased security and system management levels. In compliance with best practices, the optimal number of nodes in the network is 50-100.
≈ 250 kbps
6LoWPAN implements AES-128 link layer security, as defined in the IEEE 802.15.4 protocol. This security mechanism provides link authentication and encryption. Additional security features are enabled by transport layer security mechanisms.
There are many applications where 6LoWPAN is being used: automation, industrial monitoring, smart grids (enable smart meters and other devices to build a micro mesh network), smart homes and smart buildings.
6LoWPAN (IPv6 over Low-Power Wireless Personal Area Networks) is a low power wireless mesh networking technology. It is specified in IETF standard RFC 8138. Every node in the 6LoWPAN network has its own IPv6 address, which allows the node (typically a sensor) to communicate directly with the Internet.
433 MHz, 868 MHz (Europe), 915 MHz (Australia and North America) and 923 MHz (Asia)
≈ 0.3-50 kbps
Star of Stars
The fundamental properties supported by LoRaWAN security are mutual authentication, integrity protection and confidentiality. Mutual authentication is established between a LoRaWAN end-device and the LoRaWAN network as part of the network join procedure.
Some representative LoRaWAN use cases are smart homes and buildings, smart city applications, utility companies, smart metering, agriculture, civil infrastructure and industrial facilities.
LoRaWAN is a Low Power Wide Area Network (LPWAN) technology. It provides wireless, low-cost and secure bi-directional communication for Internet of Things (IoT) applications. LoRaWAN is optimized for long range communication and low power consumption.
Proprietary – The EC-GSM-IoT Group is open to GSMA Members and Associate Members; however, all members must positively contribute to the Group's high-level objectives, including promoting EC-GSM-IoT technology and encouraging new service and application development.
EC-GSM-IoT has improved security compared to existing GSM/GPRS networks: it offers integrity protection, mutual authentication, and stronger ciphering algorithms.
Battery life of up to 10 years can be supported for a wide range of use cases. Coverage extension beyond GSM enables coverage of challenging indoor and outdoor locations or remote areas in which sensors are deployed for agriculture or infrastructure monitoring.
Extended Coverage GSM IoT (EC-GSM-IoT) is a standards-based Low Power Wide Area technology specified by 3GPP Rel. 13. It is based on eGPRS and designed as a high capacity, long range, low energy and low complexity cellular system for IoT communications.
LTE-MTC Cat 0 uses LTE frequency bands (700 MHz, 800 MHz, 900 MHz, 1700 MHz, 1800 MHz, 1900 MHz, 2300 MHz, 2400 MHz, 2500 MHz, 2700 MHz).
Range is variable and depends on frequency bands, propagation conditions etc. typically it is ≈ 10km
≈ 1 Mbps
System and security management is more complex in LTE-MTC than in LTE, as there are massive numbers of devices in an LTE-MTC network. At the same time, the requirement defined in 3GPP TS 22.368 is that LTE MTC optimizations shall not degrade security compared to LTE.
LTE MTC is applicable to various use cases, including industrial automation and control, intelligent transportation, automatic meter reading, smart electricity distribution and management, smart homes/offices/shops, smart lighting, and smart industrial plants.
LTE-MTC Cat 0 (LTE machine type communications) is determined in 3GPP Rel. 12 specification.
LTE-M technology offers SIM-based security features requiring device authentication to connect to the network. Although it shares the LTE security standards, security system management is more complex in LTE-M (eMTC) due to the massive connectivity it must support.
LTE M (eMTC) technology supports many use cases, like smart cities, smart agriculture, logistics and transportation, industry and manufacturing automation.
LTE-M Cat M1 is specified by 3GPP Rel. 13 and LTE-M Cat M2 by 3GPP Rel. 14. Voice over LTE (VoLTE) is usable over LTE eMTC communications. Two new features are enabled in eMTC: extended Discontinuous Reception (eDRX) and Power Saving Mode (PSM).
NB-IoT - Narrowband-IoT (LTE Cat NB1 and LTE Cat NB2)
In-band LTE carrier, or within LTE guard bands, or standalone in re-farmed GSM spectrum - 700, 800 or 900 MHz.
3GPP Release 14 introduced five new FDD frequency bands for NB-IoT, including band 11 (central frequencies: UL 1437.9 MHz, DL 1485.9 MHz).
Range is variable and depends on frequency bands, propagation conditions etc. typically it is better than LTE-M coverage.
Multilayer security is applied in NB-IoT- network level and application level security, including support for user identity confidentiality, entity authentication, data integrity, and mobile device identification.
Some NB-IoT use cases are smart metering (electricity, gas and water), facility management services, security systems, connected personal appliances measuring health parameters, tracking of persons, animals or objects, and smart city and industrial appliances.
NB-IoT is specified in 3GPP Rel. 13 (LTE Cat NB1) and 3GPP Rel. 14 (LTE Cat NB2).
NB-IoT has good indoor coverage and supports a massive number of low-throughput end devices (sensors). It has low delay sensitivity, low device cost and low power consumption.
The wireless communications links between the gateway (base station) and the network nodes are encrypted.
The Neul communications technology is a wide-area wireless networking technology designed for the IoT and created to compete against existing cellular communications solutions. It is applicable to smart metering, facility management services, security systems, etc.
Neul leverages very small slices of the TV White Space spectrum to deliver high scalability, high coverage, low power and low-cost wireless networks. Systems are based on the Iceni chip, which communicates using the white space radio to access the high-qu
One of the security mechanisms implemented in NFC is the Digital Signature (defined in the NFC Forum Signature RTD 2.0) with asymmetric key exchange. The Digital Signature is part of the NFC Data Exchange Format (NDEF) message, which also includes a Certificate.
Some representative NFC use cases are ticket confirmation for sports events, concerts, theaters and cinemas; wellness performance improvement, such as syncing workout data from fitness machines with a personal user device; and personalized content sharing.
NFC is a short-range, two-way wireless communication technology that enables simple and secure communication between electronic devices embedded with an NFC microchip. There are three available modes of NFC communication:
- Read/write (e.g. for reading tags)
- Peer-to-peer (exchanging data between two NFC devices)
- Card emulation (the device acts as a contactless smart card)
Security in RPMA wireless technology is built on 128-bit AES. It offers security features such as: mutual authentication, message integrity and replay protection, message confidentiality, device anonymity, authentic firmware upgrades and secure multicasts.
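Message integrity and replay protection usually go together: each frame carries a monotonically increasing counter, and an authentication tag covers both the counter and the payload. RPMA builds this on 128-bit AES; the sketch below substitutes stdlib HMAC-SHA256 purely to illustrate the structure (the key and payload are hypothetical).

```python
import hmac, hashlib

KEY = b"shared-device-key"  # hypothetical pre-shared key

def protect(counter, payload):
    # Tag covers counter + payload, so neither can be altered.
    tag = hmac.new(KEY, counter.to_bytes(4, "big") + payload,
                   hashlib.sha256).digest()
    return counter, payload, tag

def verify(frame, last_seen):
    counter, payload, tag = frame
    expected = hmac.new(KEY, counter.to_bytes(4, "big") + payload,
                        hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return False  # integrity failure
    if counter <= last_seen:
        return False  # replayed or stale frame
    return True

frame = protect(7, b"meter-reading:42")
assert verify(frame, last_seen=6)      # fresh frame accepted
assert not verify(frame, last_seen=7)  # replay rejected
```

The receiver only has to remember the highest counter it has accepted; any re-sent frame fails the freshness check even though its tag is valid.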
RPMA is applicable for many use cases such as asset tracking, agriculture, oil fields automation, environmental monitoring, smart city, fleet management and logistics, industrial automation, connected cars, etc.
Before IoT, Ingenu (previously OnRamp) was selling metering infrastructure that collected low power information from electricity meters.
Ingenu has created random phase multiple access (RPMA), which uses Direct Sequence Spread Spectrum (DSSS) and is similar to CDMA.
Different data rates are enabled in the IEEE 802.11 family of standards, with theoretical throughput of 11 Mbps (IEEE 802.11b), 54 Mbps (IEEE 802.11a and IEEE 802.11g), up to 600 Mbps (IEEE 802.11n) or over 1 Gbps (IEEE 802.11ac).
The Wi-Fi Alliance enables the implementation of different security solutions across Wi-Fi networks through the Wi-Fi Protected Access (WPA) family of technologies. Alongside Wi-Fi technology itself, security solutions deployable for personal and enterprise networks have evolved from WPA through WPA2 to WPA3.
Typical Wi-Fi use cases are audio/video streaming applications, centralized management applications, video monitoring and security systems, etc. It enables networking of multiple devices such as cameras, lights and switches, monitors, sensors and many others.
Wi-Fi is a wireless technology that includes the IEEE 802.11 family of standards (IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, IEEE 802.11ac, etc.). It operates in the 2.4 GHz and 5 GHz frequency bands, with a typical range of up to 50 m.
Security is typically based on WPA 3 protocol with available personal and enterprise solutions.
Some representative IEEE 802.11ah use cases are health care, outdoor activities, smart metering, environmental sensing, home security, smart homes and buildings, power management, industrial automation, etc.
The Wi-Fi HaLow (IEEE 802.11ah) standard works in the 900 MHz frequency band in the USA and significantly improves wireless coverage and energy efficiency, two of the most important features for IoT use cases. Wi-Fi HaLow devices have instant internet access.
IEEE 802.16 (WiMax)
Frequency bands: 2.3 GHz, 3.5 GHz, 5.8 GHz
Data rates: 40 Mbit/s (mobile), 1 Gbit/s (fixed)
Different security solutions are enabled in WiMax networks, like the Advanced Encryption Standard (AES) with a 128-bit key, Rivest-Shamir-Adleman (RSA) with a 1024-bit key, and the Triple Data Encryption Standard (3-DES).
WiMax applicability is recognized in wireless MAN deployment, provisioning of Internet connectivity and generic user applications, environmental monitoring, smart cities, telemedicine, etc.
IEEE 802.16 technology has been put forward to overcome the drawbacks of WLANs and mobile networks. It provides different QoS scheduling for supporting heterogeneous traffic, including legacy voice traffic, VoIP (Voice over IP), voice and video streams, and data.
Communications are always encrypted in a HART network, which uses a 128-bit AES encryption system. The security manager in the WirelessHART gateway administers three parameters: the Network ID, the Join key and the Session key. In addition, individual session keys are maintained for each device.
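The key hierarchy above implies a join handshake: a device proves knowledge of the provisioned Join key, and the security manager then issues it a fresh session key. WirelessHART actually does this with AES-128-CCM; the sketch below is a hypothetical illustration of that shape using stdlib HMAC-SHA256, with made-up identifiers throughout.

```python
import hmac, hashlib, os

NETWORK_ID = 0x1A2B                  # hypothetical network identifier
JOIN_KEY = b"provisioned-join-key"   # shared at commissioning time

def join_request(device_id, nonce):
    # Device authenticates its join request with the Join key.
    msg = NETWORK_ID.to_bytes(2, "big") + device_id.encode() + nonce
    return device_id, nonce, hmac.new(JOIN_KEY, msg, hashlib.sha256).digest()

def grant_session(request):
    # Security manager checks the tag before handing out a session key.
    device_id, nonce, tag = request
    msg = NETWORK_ID.to_bytes(2, "big") + device_id.encode() + nonce
    if not hmac.compare_digest(tag, hmac.new(JOIN_KEY, msg, hashlib.sha256).digest()):
        return None                  # reject unauthenticated device
    return os.urandom(16)            # fresh 128-bit session key

session_key = grant_session(join_request("sensor-17", os.urandom(8)))
assert session_key is not None and len(session_key) == 16
```

The point of the two-tier design is containment: compromising one session key exposes one device's traffic, not the join credentials of the whole network.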
Typical HART use cases are process industry monitoring (automotive production processes, chemical segments, food and beverage, power generation), process optimization, safety enhancements, environment and health monitoring, maintenance optimization, etc.
“HART” is an acronym for Highway Addressable Remote Transducer. The HART Protocol uses the Frequency Shift Keying (FSK) standard to superimpose digital communication signals at a low level on top of the 4-20 mA analog signal. This enables two-way field communication to take place.
Frequency bands: 915 MHz (USA), 868 MHz (EU)
Data rates: 40 kbps (915 MHz) and 20 kbps (868 MHz)
Z-Wave provides packet encryption, integrity protection and device authentication services. End-to-end security is provided at the application level (communication using command classes). It has in-band network key exchange and the AES symmetric block cipher algorithm.
It is a wireless communications protocol used primarily for home automation. Important Z-Wave use cases are smart homes and buildings, smart offices, smart sensors, smart wall switches, smart bulbs, thermostats, windows, locks and security systems, and swimming pool controls.
The Z-Wave protocol was developed by Sigma Designs, Inc. and is specified by ITU-T Recommendation G.9959. Like other protocols and systems developed for home and office automation, a Z-Wave system can be controlled via the Internet from a smart phone, tablet or computer.
ZigBee is considered to be a secure wireless communication protocol, with a security architecture built in accordance with the IEEE 802.15.4 standard. Security mechanisms include authentication (authorized access to network devices), integrity protection and encryption.
Some representative ZigBee use cases are correlated with smart home and smart building applications, like different smart home gateways, and sensors and alarms that monitor almost everything, from temperature, humidity and lighting to movement.
ZigBee is a wireless PAN (Personal Area Network) technology evolved from the IEEE 802.15.4 wireless standard and supported by the ZigBee Alliance. The IEEE 802.15.4 standard defines the physical and data link layers with all details about robust radio communication.
Thread utilizes a network-wide key at the Media Access Control (MAC) layer for encryption. This key is used for standard IEEE 802.15.4 authentication and encryption, which protects the Thread network from over-the-air attacks originating outside the network.
Thread provides wireless connectivity for home automation via the control of lights (smart bulbs and outlets), smoke detectors, cameras and other security systems, thermostats, utilities measurements, smart digital locks, hubs and controllers, and other connected devices.
Thread was designed with the Internet’s proven, open standards to create an Internet Protocol version 6 (IPv6) based mesh network, with 6LoWPAN as its foundation.
Thread can securely connect up to 250 devices in a wireless mesh network.
DigiMesh security features include 128-bit AES encryption, with 256-bit AES available on some products such as XBee3 and XTend. One command (KZ) sets a password that prevents intruders from sending or receiving unsecured remote AT commands.
Some representative DigiMesh use cases are monitoring in food safety, facility and pharmacy domains, supply chains applicability, transportation and logistics, environmental monitoring etc.
DigiMesh is a proprietary peer-to-peer wireless networking topology developed by Digi International. The protocol allows for time synchronized sleeping nodes/routers and battery powered operations with low-power consumption.
The MiWi protocol follows the MAC security definition specified in IEEE 802.15.4 and is based on the 128-bit AES model. MiWi security mechanisms can be categorized into three groups:
• AES-CTR mode encrypts the MiWi protocol payload.
• AES-CBC-MAC mode ensures the integrity of the MiWi protocol payload.
• AES-CCM mode combines the two, providing both encryption and integrity protection.
MiWi is designed for low-power, cost-constrained networks, such as industrial monitoring and control, home and building automation, remote control, wireless sensors, lighting control, HVAC systems and automated meter reading.
MiWi uses small, low-power digital radios based on the IEEE 802.15.4 standard. Although the MiWi software can be downloaded for free from its official website, it is a proprietary solution that may be used only with Microchip microcontrollers. It was developed by Microchip Technology.
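The CTR mode named above has a simple structure worth seeing once: a keystream is generated block by block from the key and a per-frame counter, and the payload is XOR-ed with it, so the same operation both encrypts and decrypts. MiWi uses real 128-bit AES as the block function; this stdlib sketch substitutes SHA-256 as a stand-in purely to show the construction.

```python
import hashlib

def keystream(key, frame_counter, length):
    # Generate keystream blocks from (key, frame counter, block index).
    out = b""
    block = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + frame_counter.to_bytes(4, "big") + block.to_bytes(4, "big")
        ).digest()
        block += 1
    return out[:length]

def ctr_crypt(key, frame_counter, data):
    # XOR is its own inverse, so the same call encrypts and decrypts.
    ks = keystream(key, frame_counter, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"network-key"  # hypothetical key
ciphertext = ctr_crypt(key, 1, b"sensor payload")
assert ctr_crypt(key, 1, ciphertext) == b"sensor payload"
```

The frame counter doubles as the nonce: reusing it with the same key would reuse the keystream, which is why counter-mode protocols treat counter management as a security requirement, not a bookkeeping detail.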
Frequency bands: 902 MHz / 928.35 MHz / 868.3 MHz / 315 MHz
Range: ≈ 30 m indoors (300 m outdoors)
The unique 32-bit identification number (ID) of standard EnOcean modules cannot be changed or copied, which protects against duplication. This authentication method already offers field-proven, secure and reliable communication in building automation.
The EnOcean wireless standard (ISO/IEC 14543-3-1X) in the sub-1 GHz band is optimized for use in buildings, where a radio range of 30 m indoors is possible. Representative EnOcean use cases are smart lighting, temperature and air quality monitoring, and positioning and status monitoring.
The EnOcean wireless standard is geared to wireless sensors and wireless sensor networks with ultra-low power consumption. It also includes sensor networks that utilize energy harvesting technology to draw energy from their surroundings, for example from motion, light or temperature differences.
In the Weightless standard, AES-128/256 encryption and authentication of both the terminal and the network guarantee integrity, while temporary device identifiers offer anonymity for maximum security and privacy. Over-the-air security key negotiation or replacement is supported.
Typical Weightless use cases are smart metering, vehicle tracking, asset tracking, smart cars (vehicle diagnostics and upgrades), health monitoring, traffic sensors, smart appliances, rural broadband, smart ePayment infrastructure, and industrial machine monitoring.
The Weightless Special Interest Group (SIG) offers three different protocols— Weightless-N, Weightless-W, and Weightless-P.
The Weightless-W open standard is designed to operate in the TV white space (TVWS) spectrum and represents the model on which the Neul technology is based.
mcThings technology is embedded with a 128-bit AES encryption algorithm.
Some representative mcThings use cases are asset tracking, industrial automation, maintenance optimization, location monitoring, security systems (theft and loss prevention), status monitoring, agriculture and food industry automation, and environmental monitoring.
mcThings is a good solution for use cases that have sets of sensors in urban areas (neighboring buildings). The technology is power-efficient and requires minimal maintenance effort. The network is expandable with bridges, and sensors have long-life batteries.
License-free sub-gigahertz radio frequency bands like 433 MHz, 868 MHz, 915 MHz and 923 MHz.
Based on security for IEEE 802.15.4 wireless networks, AES encryption with key exchange is implemented in LoRa. In the higher OSI layers built over the LoRa PHY layer, two layers of security are utilized: one for the network and one for the application.
Typical LoRa use cases are power metering, water flow, gas or similar quantitative monitoring; logistics and transportation monitoring; smart home, office and smart city appliances; and environmental sensing such as air pollution, flooding, avalanche and forest fire detection.
LoRa provides wireless, low-cost and secure bi-directional communication for Internet of Things (IoT) applications. LoRa is optimized for long range communication, low power consumption and is designed to support large networks deployment.
LoRa is built on chirp spread spectrum (CSS) modulation.
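The two-layer security mentioned above is easiest to see as two independent keys: in LoRaWAN terms, a network session key authenticates the frame for the network server, while an application session key encrypts the payload end-to-end, so the network can verify integrity without ever reading the data. The sketch below illustrates that separation with stdlib stand-ins (HMAC-SHA256 for the AES-based MIC, a hash-derived XOR keystream for encryption; key values and the 32-byte payload limit are assumptions of the sketch).

```python
import hmac, hashlib

NWK_SKEY = b"network-session-key"      # known to device and network server
APP_SKEY = b"application-session-key"  # known only to device and application

def app_encrypt(payload, counter):
    # Payloads up to 32 bytes for this sketch; XOR with a counter-derived
    # keystream, so the same call decrypts.
    ks = hashlib.sha256(APP_SKEY + counter.to_bytes(4, "big")).digest()
    return bytes(p ^ k for p, k in zip(payload, ks))

def build_frame(payload, counter):
    body = counter.to_bytes(4, "big") + app_encrypt(payload, counter)
    # 4-byte MIC computed with the network key over the whole body.
    mic = hmac.new(NWK_SKEY, body, hashlib.sha256).digest()[:4]
    return body + mic

frame = build_frame(b"temp=21.5", 3)
body, mic = frame[:-4], frame[-4:]
# Network layer: integrity check, no access to the plaintext.
assert hmac.compare_digest(mic, hmac.new(NWK_SKEY, body, hashlib.sha256).digest()[:4])
# Application layer: decrypt with the application key.
assert app_encrypt(body[4:], 3) == b"temp=21.5"
```

Splitting the keys this way means a compromised network operator can forge nothing end-to-end and read nothing: it holds only the key needed to route and verify frames.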
The Sigfox technology operates globally within the range from 862 to 928 MHz.
Security first comes within devices: during the manufacturing process, each Sigfox Ready device is provisioned with a symmetrical authentication key. Security is also supported by the radio technology itself, and the Sigfox encryption scheme was designed in collaboration with security specialists.
Sigfox applicability potential is great. Some representative use cases are supply chain and logistics automation, manufacturing automation, smart cities, smart buildings, smart utilities and energy management and monitoring, smart agriculture, etc.
The Sigfox protocol is a patented and closed technology. While its hardware is open, the network is not, and customers must subscribe to it. Note that while there are strict limitations on Sigfox in terms of throughput and utilization, it is intended for low-throughput, infrequent transmissions.
DECT ULE devices use a combination of general DECT security procedures and ULE-specific security procedures. General DECT security procedures are device registration (subscription), device and base authentication, and key generation, including keys for ULE use.
DECT ULE is a new technology developed for different IoT use cases like home, office and industrial automation, control and monitoring systems, medical care and security systems.
DECT Ultra Low Energy (ULE) is based on DECT and intended for machine-to-machine communications such as home and industrial automation. The DECT ULE standard has the advantages of long range, no interference and a large data rate/bandwidth.
Frequency bands: 915 MHz (USA), 869.85 MHz (EU), 921.00 MHz (Australia)
Data rates: 38,400 bps via RF; 13,165 bps via powerlines
Power consumption: low, or battery-free (plug-in)
Insteon network security is maintained via linking control, to ensure that users cannot create links that would allow them to control a neighbor's Insteon devices, and via encryption within extended Insteon messages for applications such as door locks and security devices.
INSTEON is optimized for home and office automation and allows networking of different devices like light switches, thermostats, home audio, remote controls, leak sensors, pumps, motion sensors, alarms, HVAC systems, security sensors and other remote controllers.
INSTEON allows home automation devices to communicate through power lines, radio frequencies or a combination of both.
All INSTEON devices are known as peers, because any device can transmit, receive, or repeat messages from other devices.
A number of organizations have set standards for RFID, including the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC) and ASTM International. RFID standards include ISO 11784/11785, ISO 14223, ISO/IEC 14443, ISO/IEC 15693 and ISO/IEC 18000.
Data rates: 4 kbps - 640 kbps (depending on the active or passive type of device and frequency range)
Topology: point-to-point
The implementation of security mechanisms in RFID technology is based on confidentiality, integrity and availability. Confidentiality is the protection of information from unauthorized access; integrity is the protection of data from modification and deletion; and availability ensures the information is accessible to authorized users when needed.
Radio-Frequency Identification (RFID) is a technology commonly used for identification, status administration and management of different objects. Moreover, this technology is very important for people identification, as it is deployed in the latest biometric identity documents.
Commonly, an RFID system has three main components: the RFID tag, the RFID reader and the RFID application software. RFID tags can be active (with microchip, antenna, sensors and power supply) or passive (without power supply). The RFID reader is the hardware component that communicates with the tags and passes their data to the application software.
All WAVIoT data is encrypted bidirectionally from the device to the server using an XTEA block cipher with a 256-bit key.
Typical WAVIoT use cases are smart cities, smart buildings, smart metering, and utilities monitoring and metering (water, electricity, gas, heat, etc.).
NB-Fi (Narrowband Fidelity) is a narrowband protocol that communicates in the sub-1 GHz ISM bands. DBPSK is used as the modulation scheme in the physical layer. WAVIoT gateways can provide -154 dBm of receiver sensitivity and cover over 1 million nodes.
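DBPSK (differential binary phase-shift keying) encodes each bit as a phase change relative to the previous symbol rather than an absolute phase, so the receiver never needs an absolute phase reference. A minimal sketch of the idea (using one common convention, where a 1 bit flips the phase by pi):

```python
import cmath

def dbpsk_modulate(bits):
    # A '1' bit flips the carrier phase by pi; a '0' bit keeps it.
    phase = 0.0
    symbols = []
    for b in bits:
        if b:
            phase += cmath.pi
        symbols.append(cmath.exp(1j * phase))
    return symbols

def dbpsk_demodulate(symbols, ref=1 + 0j):
    # Compare each symbol's phase to the previous one: a negative real
    # part of s * conj(prev) indicates a ~pi phase jump, i.e. a '1'.
    bits = []
    prev = ref
    for s in symbols:
        bits.append(1 if (s * prev.conjugate()).real < 0 else 0)
        prev = s
    return bits

bits = [1, 0, 1, 1, 0]
assert dbpsk_demodulate(dbpsk_modulate(bits)) == bits
```

Because only phase differences carry information, a constant phase offset introduced by the channel cancels out in demodulation, which is part of why differential schemes suit cheap, low-power narrowband receivers.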
In DASH7, similarly to IEEE 802.15.4, AES-CBC is used for authentication and AES-CCM for authentication and encryption.
Some representative DASH7 use cases are access control, smart energy, location based services, mobile advertising, industry automation, logistics, building access, mobile payments, ticketing, etc.
D7A complies with the ISO/IEC 18000-7 standard, an open standard for the license-free 433 MHz ISM band air interface for wireless communications. The 433 MHz frequency provides D7A with long propagation distance and better penetration.
The Wi-SUN security is specified by implementation of the x.509 certificate-based, public-key infrastructure to authenticate devices, as well as Advanced Encryption Standard (AES) encryption and message integrity check.
Devices protect their digital cr
Some typical Wi-SUN use cases are smart metering, smart cities, smart buildings, industrial automation, environmental sensing, etc.
The term Wi-SUN is short for Wireless Smart Utility Network. Wi-SUN technology is derived from the IEEE 802.15.4 wireless standard and supports the IPv6 protocol.
Wavenis technology is supported by 128-bit AES encryption.
Some Wavenis use cases are different metering solutions (gas, electricity, water, etc.) applicable to environmental monitoring, smart cities, utility companies, etc.
Wavenis is a wireless technology created by Coronis in 2000. It is developed for ultra-low-power, long-range Wireless Sensor Networks (WSNs). It has become popular thanks to promotion by the Wavenis Open Standard Alliance.
MIOTY is managed by the Fraunhofer Institute for Integrated Circuits and BTI Ltd., Toronto.
Some MIOTY use cases are optimized maintenance models, inventory optimization for parts, asset management and tracking, condition and environmental monitoring, smart metering, innovative augmented reality applications, product R&D, and improved customer support.
MIOTY is a low-power wide-area network (LPWAN) protocol purpose-built for massive industrial and commercial IoT deployments. Fraunhofer's patented Telegram Splitting, the core of the MIOTY protocol, is designed to provide the scalability and robustness such deployments require.
For many end-users of today’s communications technology, the cloud is a somewhat mystical concept, a digital equivalent of aether. Most think of it as a formless abstraction “up there” when, in fact, the cloud is rooted in the ground. Or the seabed.
Despite rapid advances in satellite connection, almost all intercontinental data transfer that takes place every second of the day occurs via hundreds of thousands of miles of underwater cables. Reading a map of these submarine cables is like viewing a tapestry of international telecommunication.
It is perhaps strange to think that the email you just sent to your colleague in Africa traveled below the Atlantic Ocean. It’s an oddly analogue image in an online world. Yet, the digital capability that those cables unleash is increasingly the stuff of science fiction.
Submarine cables have historically been laid by telecoms companies, mostly in the form of consortiums that share out the extreme costs, often running into the hundreds of millions of dollars. Perhaps frustratingly for these investors, the cables are famously attractive to sharks who, for reasons not entirely clear to scientists, can’t resist having a nibble.
Large fish, however, are not the only things eating into telcos’ profits. Over the past few years, many undersea cables have been laid by internet giants like Amazon, Google, Facebook and Microsoft, businesses that, in 2018, owned or leased more than half of the submarine cable bandwidth.
This expanding aquatic infrastructure grants these firms growing independence from telecommunications companies, allowing them to launch, even more aggressively, offerings that directly compete with telco products and services.
There is nothing new about this scenario.
A worrying habit
In stories that almost read like case studies of Kodak or Blackberry, telcos lost their core revenue streams to OTT providers in the early-to-mid 2010s, seemingly from nowhere. Between 2010 and 2014, telecom businesses saw annual revenue growth decline from 4.5 percent to 4 percent, EBITDA margins drop from 25 percent to 17 percent, and cash-flow margins decrease from 15.6 percent to 8 percent.
Then, when it became clear that cloud computing and networking was going to be the future of digital operations, telecoms companies again lost the advantage. On paper, telcos seemed like a solid bet to compete with the likes of Amazon, Google and Microsoft: network experts against booksellers, online advertising engines and packaged software suppliers.
But that hasn’t been the case. AWS, Microsoft, and Google have dominated the rapid evolution of cloud computing, leaving telcos to play catch up. Though their network infrastructure capabilities keep them relevant, it is possibly these very networks that account for telecoms companies’ sluggish response to accelerating change.
Some argue that “thinking digital” is deeply embedded in telcos’ business model because, in addition to offering their own digital products and services, they provide the infrastructure and connectivity that allow other players and sectors to operate in the digital economy.
However, it is not just technical capacity that counts. Thriving in the digital economy requires a different mindset. It’s the type of mindset that might have helped telcos see earlier on that the future of cloud wasn’t going to just be in networks, but rather the software and services that rested on those networks.
GSMA predicts that over the next five years telcos will spend USD 1.1 trillion on their networks, 80 percent of which will go on 5G. As we stand on the threshold of global 5G and perhaps the greatest leap into communicative novelty the world has seen, telcos are faced with another opportunity to take charge in the form of edge computing. How will they react this time? Will they capture the opportunity that relies on software, services and the security for the smart-everything world? Or will the cloud incumbents again capture larger returns on top of telecom infrastructure?
IoT in a post-COVID world
As we have learned, and continue to learn, through the COVID-19 pandemic, the world as we know it is fundamentally reliant on digital connection. The recent crisis has also made it abundantly clear that the nature of work is likely to evolve much faster than anyone previously expected.
After witnessing almost instantaneous shifts to remote working and the accelerated digitization of organizational operations, it is easy to see how the adoption of mass automation and AI-driven cyber-physical systems (CPS) might be sooner than many first thought.
Such networks would incorporate billions of devices, sensors and machines in factories, cities and commercial environments managed by businesses and different tiers of government. The prospects for consumer-facing IoT products and services are tremendous, and opportunities abound across all areas of IoT growth, but it is in facilitating enterprise and civic development that telcos could play a major role. The question is, can they capture the possibilities?
McKinsey estimates that IoT’s potential economic impact on factories will rise to as much as $3.7 trillion a year by 2025. Though these figures were pre-COVID and don’t account for the huge downturn in the manufacturing sector, it is precisely this financial pressure that may force manufacturers to disinvest in human labor and prioritize the implementation of IIoT systems managed by AI.
However, with less capital available than before the coronavirus outbreak, companies – industrial and commercial – pursuing digital transformation are likely to be more risk-averse in their investments. Reliability and security will become more crucial than ever when choosing service providers, which should be to telcos’ advantage.
Finding the winning edge
The much-anticipated emergence of the massive internet of things (mIoT) is unlikely to be restrained by the economic impacts of the current pandemic. No-one knows for sure, of course, but more urgent development of new use cases in markets like healthcare, transportation, logistics, resource management or public surveillance will possibly stimulate the growth of the mIoT, not hamper it.
This expansion will continue to provide complex challenges in privacy, security, and even safety, an important concern when enterprises begin to incorporate more CPSs into their operations. With their vast experience in network management, telecom operators who are able to move beyond the mindset of selling connectivity or data have a number of advantages that can be leveraged to succeed in IoT.
The mIoT will be built on 5G, which is not just a physical network infrastructure. However, telcos pivoting into the development and management of virtual networks and multiple cloud-based applications should capitalize on their ownership of 5G network hardware to build an IoT ecosystem that delivers greater returns overall.
Security will be an important component of telcos’ proposition, not just as a market differentiator, but as a revenue stream. COVID-19 has highlighted the value of a humanity-serving IoT, but it has also laid bare the risks of sliding into a Big Brother society. Consumers and enterprises will demand greater IoT security, and telcos have a strong track record in keeping networks secure.
But success for operators will depend on their ability to capture a significant share of edge computing. This technology will facilitate most of the real-time cyber-physical use cases that are often used as adverts for 5G-based IoT: autonomous vehicles and factories, remote surgery, massive drone networks, and so on.
These operations are extremely time-sensitive. There’s no tolerance for network lag when a driverless car traveling at 90 miles an hour suddenly needs to avoid a pedestrian. There’s no room for data delays at a critical stage of brain surgery.
5G’s near-zero latency is critical to the realization of these levels of operation, but even with 5G, we may not be able to rely exclusively on centralized cloud networking to deliver these use cases. IoT devices are also, by design and necessity, too small and too weak to take charge of the required data processing.
Edge computing, also known as fog computing, is a decentralized architectural pattern that moves computing resources and application services away from the cloud server, which is often hundreds of miles away, and closer to the point of data generation and action.
This improves speed as well as security and compliance. Edge computing removes the need for large packets of data to be transferred over great distances, and bypasses the potentially complex regulatory questions of data being generated in one jurisdiction but analyzed in another.
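The distance argument can be made concrete with a back-of-envelope latency model: light in fiber travels at roughly two-thirds of c, about 200 km per millisecond, so round-trip propagation delay scales directly with how far away the compute sits. The distances below are illustrative assumptions, not measurements of any real deployment.

```python
# Round-trip propagation delay over fiber for a distant cloud region
# versus a nearby edge node. Queuing, routing and processing delays
# come on top of this floor.

C_FIBER_KM_PER_MS = 200  # ~200,000 km/s, i.e. ~2/3 the speed of light

def rtt_ms(distance_km):
    return 2 * distance_km / C_FIBER_KM_PER_MS

cloud_rtt = rtt_ms(2000)  # hypothetical hyperscale region 2,000 km away
edge_rtt = rtt_ms(20)     # hypothetical metro edge node 20 km away

assert cloud_rtt == 20.0  # ~20 ms round trip before any processing
assert edge_rtt == 0.2    # ~0.2 ms round trip at the edge
```

Even in this idealized model, the distant path alone consumes the entire latency budget of a millisecond-scale 5G use case, which is the physical case for moving compute toward the point of data generation.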
Telecom operators have a natural advantage in edge computing. Their physical networks with multiple data processing nodes mirror the distributed approach of fog processing. For edge computing to work, smaller data centers need to be located near the end device – telcos already have a lot of this hardware in place. This should give them the upper hand over AWS, Azure, and GCP.
These major cloud providers have largely relied on enormous data centers to deliver their services and will need to build out the components of an edge solution. Telcos, however, can reassign their existing assets in the service of computing at the edge.
Through their scale, infrastructure and networking experience, telecom companies should be able to leverage the move to the edge and reclaim a leading role in large scale and enterprise IT expansion.
The challenge, however, is not just technical, it is also mental. Can telecommunications companies make the mindset shift that will be required to re-imagine themselves as collaborative and forward-thinking players in a smart future?
In politics, the principle of subsidiarity dictates that a central authority should only perform those functions that cannot be performed at a local level. This is a useful analogy for the future of cloud computing because, though almost all processes are currently run through centralized network hubs, what has been called “Cloud 2.0” will see more and more functions decentralized and moved to the edge.
Cisco predicts that the number of devices connected to IP networks will be more than three times the global population by 2023. Edge or fog computing will be critical to meeting the increased load of these devices, many of which will be built to run on the super low latency that we expect from 5G.
5G will also see network virtualization, which will allow telecom operators to redistribute the core network to be closer to the user or device. At the same time, Cloud RAN, which improves agility and flexibility by virtualizing non-real-time functions, effectively allows the edge to be brought closer to the core. These forms of optimized architecture will radically improve the user experience and are a major opportunity for telcos to take the lead in distributed cloud services.
The first obstacle telecoms firms face is a legacy of slow-moving, risk-averse operations. In a heavily-regulated industry, often with unnaturally low market competition, telcos never needed to be agile and responsive.
Of course, the landscape has changed quickly, but reputations and memory have not faded as fast. Numerous surveys show that telcos are less trusted than other types of technology providers in supporting enterprises with their digital transformation.
Businesses know that, in order to be as effective as the cloud giants, telcos need to alter their DNA. They need to shift from being a utility and a dumb-pipe provider to acting as a business advisor and systems integrator. This is a major mindset shift and an organizational change that would affect every function within a telco. Not many have made that shift.
The inertia is understandable; there are some steep hills to climb. For one, telcos are geographically segmented and territorial. Yet, they are competing with worldwide cloud providers who are able to offer developers and software providers almost globally-consistent UI and capabilities.
Building the uniformity and reliability that enterprise clients need will require standardization and time. Time to build trust, and common standards to establish unity between telecoms operators. Such a collaborative approach may be difficult for telcos to implement but will be necessary if they are to claim the edge.
Another major obstacle is software itself. Virtualization and edge computing rely on software capabilities that telcos don’t traditionally have. By comparison, these skills are the lifeblood of cloud companies. Telecoms operators need to rethink their workforce and processes if they are to achieve the software efficiency and refinement that cloud competitors display.
One way telcos are dealing with this challenge is by working with cloud providers in potentially symbiotic arrangements. Recent announcements by Amazon Web Services and Microsoft see these behemoths using the networks and infrastructure of large telco carriers to deliver their cloud services to the edge. For many, this is a classic win-win, even if the telecoms operators do not get the ownership of the edge that they may have hoped for. Perhaps afraid of losing once again to cloud providers, telcos are willing to strike a deal that sees them capture at least some of the pie.
Were this the case, it would be another example of risk-averse thinking that will end with telecoms operators losing their play for the edge. Though cloud providers may be relying on carriers for now, they have already signaled their intention for the future.
Only last month, Microsoft acquired Affirmed Networks, a specialist in fully virtualized, cloud-native networking solutions for telecom operators. This will release Microsoft from reliance on telecoms' data centers as it builds edge computing capability. In a recent quarterly conference call with analysts, Microsoft's CEO boasted, "We are the only cloud that extends to the edge, with consistency across operating models, development environments and infrastructure stack," revealing more of the company's edge computing strategy.
Google Cloud has revealed similar aspirations with the launch of Anthos and Global Mobile Edge Cloud (GMEC). More than that, Alphabet also has Google Fi, a mobile virtual network operator (MVNO), as well as fiber and cloud offerings. Taken together, this represents a disruptive portfolio in telecommunications.
Similarly, Facebook is launching its Terragraph network in San Jose using 60 GHz spectrum, which could compete with 5G. Despite current agreements and collaborations, telcos' competitors are making bold moves to take away more of telecoms' business.
We have known for some time now that telecommunications firms are in need of new business models and approaches to market development. It is not novel to observe that telcos were too slow to react in the past, and as a result were beaten by OTT.
But reactive strategies, no matter how fast, are now altogether insufficient. Only proactive movement will do if telcos are to avoid losing the battle for edge computing. Cloud providers are already closing the gaps in their own capabilities, and telecom companies need to do the same, but faster.
This will require a new way of doing business, but also a new way of thinking about telecom business. It won’t be easy, but that doesn’t matter – right now, there’s no time to think about pain.