
Evolution of standards

Wireless communication standards have seen a rapid and multidirectional evolution since the
start of the cellular era in the 1980s with the launch of the analogue cellular systems. Soon
after, digital wireless communication systems emerged in a quest to satisfy mobility, quality
of service and the ever-growing demand for data.
Despite the large variety of existing communication systems, each development has been
motivated by the same goal: to provide universal service facilities to users, while maintaining
or increasing profitability. While both aspects of this goal depend strongly on novel
and smart technologies, the profitability requirement has also been a key factor in impeding
the rapid regulatory agreements that could speed up the adoption of interoperability at various
infrastructural levels. This is a pity, because such agreements would make it possible to exploit
dynamic access technologies to their full extent.
Backward compatibility, technology- and site-sharing, and convergence are key technological
elements that, together with adequate regulatory agreements, will enable ubiquitous
communications at a highly personalized level. The vision of a 5G wireless communication
system is one of universally deployable, converging technologies that enable wireless
services and applications at data rates of more than one terabit per second (Tbit/s), with
coverage extending from a city to a country, to continents and to the world, supporting
user-centric mega-communications.
Myriad services
The challenges faced by standardization in relation to the next-generation wireless
communication system (that is, 5G) are multifold. They are determined by the complexity of
the emerging user and usage scenarios for which 5G must provide myriad high-quality
services. Unlike single-purpose wireless systems, 5G will have the hard task of operating an
ever-growing number of heterogeneous networked devices that can communicate with each
other or with people or robots to satisfy dynamic and high-level user expectations.
The efficient wireless communication system that is needed must be able to follow the user
regardless of location and to adapt its traffic capabilities on demand in order to
satisfy user and service requirements. Standardization work faces the tough challenge of
responding to the high public demand for universal, dynamic, user-centric and data-rich
wireless applications. The user-centric concept here also includes protection of privacy and
maintenance of trust.
Technological requirements
Both standardization and technology developers are facing the challenge of diverse 5G
technological requirements carrying equal weight in the provision of 5G services and
applications.
Technological solutions for 5G should make it possible to eradicate or, at least, control the
potentially dangerous aspects of ubiquitous communication, in particular those related to
security, trust and the protection of personal data. Technological solutions should also offer
reliability and dependability.
Researchers focused for years on finding the killer application for emerging wireless
systems, but today the danger comes from the application business model itself. In order to
boost profits, service providers enable access to personal data from one application to
another, without giving users any visible control over what happens to the information afterwards.
Beyond the technological challenges, this entails moral and ethical considerations, especially
in relation to services and applications for critical infrastructure.
Thus, 5G standardization must identify uncertainties relating, for example, to new threats to
cybersecurity, trust or privacy; trends in economic growth around the world; public
acceptance of wireless and applied-field technologies; and legislative restrictions. These
uncertainties then have to be taken into account in regard to long-term trends in technological
innovation, such as the increase in distributed computing, the new forms of ultra-fast wireless
connectivity, miniaturization and automation, and an increasing focus on cost containment.
Communication, navigation, sensing and services
Convergence of technologies, ultra-high capacity, universal coverage and maximal energy
and cost-efficiency are key characteristics of the 5G wireless system concept.
The enabling technologies converging into the 5G wireless system concept are
communication, navigation, sensing and services. A determining factor for the first three is
the availability of radio spectrum, through which information can be transmitted in relation to
the service requested. Cognitive radio relies on sensing for better exploitation of the available
spectrum, while high-frequency millimetre-wave bands used in terrestrial and satellite
communications are able to satisfy the 5G capacity requirements and represent a solution to
the limited availability of radio-frequency spectrum.
Small cell deployment within the coverage areas of cellular networks requires minimal
pre-planning and can boost capacity, increase coverage and improve energy and cost-efficiency
for the wireless provider, individual user and third parties that may be providing the
communication interface. These benefits, however, may be partially lost because of increased
interference and the inability of the network operator to manually configure each small cell
to be properly detected and used by mobile devices, or simply because of an inability to
adapt to user needs. Proper self-optimizing procedures and protocols for fast network
deployment and dynamic reconfiguration of small cells must solve the problem of how to
deploy, where to deploy, and how to deal with the increased number of small cell sites. Such
procedures and protocols thus carry the value of economically viable technological solutions.
5G services will rely on strong computational power to process the huge volume of data
collected from various large-scale distributed sources. More specifically, 5G mobile devices
will consume and produce data at the same time. Already today, most mobile devices are
equipped with navigation capabilities (such as the Global Positioning System, GPS) and
are able to report their location. The transfer of such an enormous load of information
requires communication channels with the maximum possible capacity.
Novel antenna technologies and their hardware implementation are crucial to maximizing
throughput over the 5G communication channel. Beam forming with distributed elements is
an interesting emerging technology, where the array elements are parts of different systems
(that is, physically on different chips). This technology shows a potential for increasing the
data throughput of distributed sources such as sensors or smart dust. On-chip integrated
antennas can be used for distributed beam forming to maximize the data throughput of
miniature sensor systems and other similar applications.
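
As a rough illustration of why distributed beam forming pays off, the sketch below (Python, with synthetic channel gains and the idealized assumption that every element knows its phase to the receiver) shows how phase-aligned contributions from N separate elements add coherently, so received power grows roughly with the number of elements for a fixed total transmit power.

```python
import numpy as np

# Minimal sketch of distributed transmit beam forming: N separate radios
# (sensor nodes, say) pre-rotate the phase of a shared narrowband symbol
# so their contributions add coherently at the receiver.
rng = np.random.default_rng(0)
N = 8                                                # number of distributed elements
h = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)  # per-element channel gains

symbol = 1.0 + 0.0j                                  # unit-power symbol to transmit

# Each element applies the conjugate phase of its own channel (this assumes
# every node knows its channel to the receiver), splitting the total power N ways.
weights = np.conj(h) / np.abs(h) / np.sqrt(N)
coherent_power = np.abs(np.sum(h * weights * symbol)) ** 2

# Baseline: one element transmitting alone with the same total power.
single_power = np.abs(h[0] * symbol) ** 2

print(f"coherent combining gain over a single element: {coherent_power / single_power:.1f}x")
# With ideal phase alignment the received power grows roughly linearly with N
# for a fixed total transmit power (the classic array gain).
```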
Using cloud computing capacities to provide and support ubiquitous 5G connectivity and
real-time applications and services is a powerful way to automatically manage, analyse and
control data procured from highly distributed and heterogeneous devices (sensors, actuators,
smart devices). The cloud will be able to provide large-scale and long-lived storage and
processing resources, as well as important backend resources, for the user-centric 5G
ubiquitous applications delivered over the 5G wireless communication and network
infrastructure.
5G business case
The 5G wireless communication system should seamlessly bridge the virtual and physical
worlds, offering the same level of all-senses, context-based, rich communication experience
over fixed and wireless networks. Because 5G will be a plethora of interworking technologies
governed by separate specifications, it is important to find technological solutions and
standardize interconnectivity in order to enable end-to-end telecommunication service
provision across technologies and operators.
The successful 5G business case must adopt an active integration strategy that merges the
different realms of the enabling technologies with new business opportunities.
Standardization then becomes an enabler for both a successful technological and business
concept.


The first challenge for 5G standardization and regulation is to adopt technological concepts
and regulatory decisions that remove the limit on data rates. Each user should have
ubiquitous personalized 5G wireless access at very high sustainable data rates approaching
the current Ethernet state-of-the-art of 10+ gigabit per second (Gbit/s). A ubiquitous and
pervasive wireless network offering a sustainable 10 Gbit/s (reaching a rate of up to 1 Tbit/s
in burst mode) can be used as an alternative to Ethernet and access to Tbit/s fibre networks.
Thus, standardization should evolve 5G as a Wireless Innovative System for Dynamically
Operating Mega-Communications (WISDOM).
Academic role
5G standardization faces the task of bundling multi-radio, multi-band air interfaces to support
portability and nomadic mobility in a dynamic ultra-high data rate communication
environment using novel concepts and cognitive technologies. Here, academic research and
participation in standardization can play a crucial role. Standardization work should also
recognize the specifics of the scenarios in various world regions (for example, developing
countries) in order to stimulate profitable deployment and higher penetration worldwide.











Experiments in so-called 5G mobile communications have begun, but early indications
suggest that it's going to be a fundamentally different kind of cellular network that emerges.
The cellular industry's 'generation game' could now be coming to an end. At each point in its
evolution cellular radio has found itself leaning increasingly on previous generations. When
the first digital systems were introduced they effectively replaced the older, less-efficient
analogue cellular systems.
When telecom operators spent big on spectrum licences that would allow them to deploy 3G
networks, 2G systems - such as GSM - were pushed into the background, but even today remain
important parts of the network because they have such wide coverage. The 3G parts of
the network, meanwhile, remain focused on population centres.
For years to come, the 4G networks now being installed based on the Long-Term Evolution
(LTE) protocol will lean on the older 2G and 3G networks to support voice calls. LTE will be
reserved for high-bandwidth data and video. It will almost certainly take a long time for 4G
to extend beyond major conurbations, but it is a system that is meant to cover both urban and
rural environments, so could, in principle, eventually push 2G and 3G out of the mobile
communications picture altogether.
So now operators are thinking about the next step forward - but this will be a different kind of
network evolution. Rather than being an enhanced replacement for 4G, operators see it as a
merger of many different technologies. A new radio standard is only part of the picture. And
yet some of the changes being proposed could reshape the way cellular networks operate.
Although Samsung claimed to have demonstrated the first 5G-capable systems in Q2/2013,
any network that legitimately lays claim to that name is some way away. Nonetheless,
Professor Rahim Tafazolli, who heads the Centre for Communication Systems Research at
the University of Surrey, says it will evolve: "It will be at least seven or eight years before we
have a complete specification," he believes.
"We are looking at new systems coming in around 2020," predicts Lauri Oksanen, head of
research and technology at equipment maker Nokia Siemens Networks, "but we do not talk
about 5G as the overall network evolution. We don't want to label everything '5G'. Our view
is that 5G will really be about better local-area performance, with lower latency and higher
bandwidth in high-density hetnets."
Oksanen refers here to the industry's contraction of the term 'heterogeneous networks', in
which different types of radio technology co-operate and interoperate. "Further development
to LTE is the most likely way to go for macrocell, wide-area coverage," he adds. "A new 5G
radio would be more of a complement to LTE evolution."
Prof Tafazolli also maintains that 4G is a very good technology: "It provides good speed, and
when it offers national coverage people will have a much better experience of Internet usage
on the move... The problem is that the way that we use the available radio spectrum and the
way we have developed the standards is not efficient," Prof Tafazolli adds. "We are running
out of radio spectrum. Before 2020, with all the spectrum that we have, most Western European
capitals and cities such as New York, Los Angeles, and Tokyo will run out of capacity. In
short, we have to come up with revolutionary ways of using the spectrum."
Shannon canon
In 1948 the American mathematician and electronic engineer Claude Shannon (1916-2001)
developed a key piece of communications theory asserting that, for a given level of noise,
there is a limit to how much data a channel can send. Nick Johnson, CTO of basestation
maker Ip.access, says that existing radio technologies are "operating as close as makes no
difference to their Shannon limits".
The University of Surrey's Prof Tafazolli agrees with this assertion, but adds: "It's wrong to
compare everything to the Shannon capacity limit - because that is defined for point-to-point
connections. The metric that really applies to cellular communications is capacity per metre
squared... and that's what we are going to be doing with 5G."
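
For reference, Shannon's result bounds a single point-to-point link, while the area metric Tafazolli describes also rewards packing more links into the same space. The expressions below give the textbook capacity formula and one common decomposition of capacity per unit area; the decomposition is an illustrative framing, not quoted from either interviewee.

```latex
% Shannon's point-to-point capacity for a channel of bandwidth B (Hz)
% and signal-to-noise ratio S/N:
\[
  C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bit/s}
\]
% One common way to write the per-area metric: cell density times spectrum
% per cell times spectral efficiency.
\[
  \text{capacity per km}^2 \approx
    \underbrace{\text{cells per km}^2}_{\text{densification}}
    \times \underbrace{B}_{\text{spectrum per cell}}
    \times \underbrace{\eta}_{\text{spectral efficiency, bit/s/Hz}}
\]
```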
Increasing the data-communications density will mean finding new spectrum and being as
smart as possible about using existing frequency bands. The mobile industry and its
confederates have already started down that road with 4G, by introducing small-cell
basestations or 'femtocells' that work alongside 'macrocells', which cover much wider areas
['Feeding time', E&T, April 2013]. These very short-range basestations are designed to be
packed into city streets in dense meshes, possibly hanging from street lamps or even
deployed in users' homes where they double up as Wi-Fi access points.
The FON network, now owned by BT, provides an indication of how private access points
can be used to provide a high degree of public wireless coverage - the service was rolled into
the BTOpenzone service several years ago. Standard Wi-Fi itself provides the possibility to
offload traffic from the 3G and 4G services.
One option is to move these basestations into hitherto unused parts of the radio spectrum. "As
part of the 5G innovation work, we will look at new frequency bands," confirms Prof
Tafazolli.
Frequencies above 20GHz - ten times higher than those used for 3G and Wi-Fi
communications - offer massive potential data-rates, because the bands themselves are much
wider. "If you go up a couple of orders in frequency you can go up a couple of orders of
magnitude in bandwidth," explains Professor Ted Rappaport, director of the NYU Wireless
research centre at the Polytechnic Institute of New York University.
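
The arithmetic behind Rappaport's point is simple. As a hedged illustration, suppose a radio system is allotted a roughly fixed fraction of its carrier frequency as channel bandwidth; the 2 per cent figure below is only an assumption for the example.

```latex
% Illustrative arithmetic only: a fixed 2 per cent of the carrier frequency
% f_c allotted as channel bandwidth B.
\[
  B \approx 0.02\, f_c:\qquad
  f_c = 2\,\text{GHz} \Rightarrow B \approx 40\,\text{MHz},\qquad
  f_c = 60\,\text{GHz} \Rightarrow B \approx 1.2\,\text{GHz}
\]
% Raising the carrier frequency by an order of magnitude or two widens the
% available channel by roughly the same factor.
```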
Attenuation issues
The millimetre-wave bands - they range from around 3mm to 30mm in wavelength - are also
practically unused for commercial wireless communication. There is a reason for that.
Absorption by rainfall climbs rapidly from 2GHz to 100GHz, making this region of the
spectrum unattractive for long-distance radio communication. It is also a difficult region of
spectrum to serve. Only recently have low-cost silicon processes reached the level of
development where they can be used in handsets that support such high frequencies.
Prof Rappaport contends it is a matter of distance. If you restrict the use of 20GHz-plus
signals to relatively short distances, some of the problems go away. "It is a common myth
that rainfall and oxygen absorption will attenuate these frequencies too much," says Prof
Rappaport. "We've performed measurements to show it in one of the toughest radio
environments we have: New York City."
Over distances of a few hundred metres, there is some loss - but far from enough to wreck the
technology's chances. Says Oksanen: "In densely populated areas, that is already a long
distance. Even macrocells are less than 400m apart in urban environments." There are also
sweet spots in the spectrum, such as 28GHz and 38GHz, where Prof Rappaport's group has
conducted experiments. "We will be measuring 72GHz this summer," he says.
Although there is a steady rise in absorption towards 100GHz, a number of the candidate
frequencies lie in troughs between very strong peaks. One area that is badly affected is
around 60GHz, a frequency now earmarked for automotive radar and indoor wireless
networks. But either side of that range are frequencies that are far less affected by air
absorption.
One issue with higher-frequency transmissions is that they are highly directional and work
best where the handset has a clear line of sight to the basestation; but Prof Rappaport's group
found the waves bounce off buildings, providing multiple paths to a user even if they cannot
'see' the transmitter. "We've done research that shows you can get range extension to 400m by
combining antenna paths," he reports.
To steer radio transmissions towards a receiver, Prof Rappaport envisages the use of beam-
forming with multiple antennas, which are already being introduced on handsets at much
lower frequencies to improve reception quality.
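
A minimal sketch of that idea, assuming an idealized uniform linear array with half-wavelength spacing (the element count and angles below are arbitrary), shows how per-element phase shifts concentrate energy towards one direction:

```python
import numpy as np

# Minimal sketch of beam steering with a uniform linear array (ULA):
# phase-shift each element so the radiated contributions add in phase
# towards a chosen direction.
def steering_vector(n_elements, angle_deg, spacing_wl=0.5):
    """Per-element phase weights that point the beam at angle_deg (from broadside)."""
    n = np.arange(n_elements)
    phase = 2 * np.pi * spacing_wl * n * np.sin(np.radians(angle_deg))
    return np.exp(1j * phase) / np.sqrt(n_elements)

def array_gain(weights, angle_deg, spacing_wl=0.5):
    """Relative power this weight set delivers towards angle_deg."""
    v = steering_vector(len(weights), angle_deg, spacing_wl) * np.sqrt(len(weights))
    return np.abs(np.vdot(weights, v)) ** 2

w = steering_vector(16, 30.0)      # 16-element array aimed 30 degrees off broadside
print(array_gain(w, 30.0))         # ~16: full array gain towards the intended user
print(array_gain(w, -10.0))        # much smaller: energy is not sprayed everywhere
```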
More spectrum required
As wavelength is inversely proportional to frequency, higher frequencies will make it easier
to pack more antennas into the handset. Designers are struggling to squeeze multiple
antennas for sub-2GHz bands into existing designs because the antenna structures those bands
require are physically large, but the wavelengths at frequencies above 20GHz are at least ten
times smaller. So-called 'massive MIMO' antennas, such as the 64-element structure used by
Samsung in its experiment transmitting 1Gbit/s over 2km at 28GHz, have become realistic. As
it introduces higher frequencies for small cells, the industry will have the opportunity to
reallocate spectrum to make best use of existing bands. In general, the lower the frequency,
the further it tends to propagate.
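
The scaling is simply λ = c/f; the figures below illustrate why a 64-element array that would be impractical at 2GHz becomes plausible at 28GHz (half-wavelength element spacing is the usual rule of thumb, used here as an assumption).

```latex
% Wavelength scales inversely with frequency.
\[
  \lambda = \frac{c}{f}:\qquad
  f = 2\,\text{GHz} \Rightarrow \lambda \approx 15\,\text{cm},\qquad
  f = 28\,\text{GHz} \Rightarrow \lambda \approx 1.07\,\text{cm}
\]
% With half-wavelength spacing, elements sit roughly 7.5 cm apart at 2 GHz but
% only about 5 mm apart at 28 GHz, so tens of elements fit in a handset-sized package.
```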
"It's clear that the industry's direction is to have macrocells at as low frequencies as possible,"
says Oksanen at Nokia Siemens Networks. The lower end, however, is the most precious area
of the radio spectrum and reallocation will not completely fix the problem.
"We are working on the spectrum front with customers and regulators and other industry
stakeholders to find new spectrum in the low bands," Oksanen adds. "That is one of the
important things about the future. It's not just about higher spectrum. We need new low-band
spectrum."
Prof Tafazolli says: "We are really short of spectrum. We should not be limited to licensed,
we could also use unlicensed spectrum." One way for cellular operators to use unlicensed
spectrum, which allows anyone who keeps within power limits access to a band, is to use
cognitive-radio techniques, in which transmitters constantly monitor other active radios and
attempt to use the spaces between them, hence the term 'white space' radio.
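
As a rough sketch of the sense-before-transmit behaviour described above (the signal model, noise floor and 6dB margin below are illustrative assumptions, not taken from any white-space standard):

```python
import numpy as np

# Minimal sketch of the 'white space' idea: measure energy in a candidate
# channel and only transmit when no incumbent signal appears to be present.
rng = np.random.default_rng(1)

def channel_energy(samples):
    """Average power of complex baseband samples for one channel."""
    return np.mean(np.abs(samples) ** 2)

def channel_is_free(samples, noise_floor=1.0, margin_db=6.0):
    """True if measured energy stays close to the assumed noise floor."""
    threshold = noise_floor * 10 ** (margin_db / 10)
    return channel_energy(samples) < threshold

noise = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) / np.sqrt(2)
incumbent = noise + 3.0 * np.exp(2j * np.pi * 0.1 * np.arange(4096))  # strong carrier present

print(channel_is_free(noise))       # True  -> channel looks free, secondary user may transmit
print(channel_is_free(incumbent))   # False -> incumbent detected, stay off the band
```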
The Weightless Special Interest Group is promoting this use of unlicensed spectrum for
machine-to-machine communications, offering long-distance communications at low data
rates ('Standard's net gains', E&T, June 2013). Such spectrum may not suit cellular operators,
Oksanen points out: "It's difficult to invest in and use a band where you cannot guarantee
quality of service to the end user."
Bandwidth re-allocation?
There may be a middle way between dedicated and unlicensed spectrum. Oksanen says that
he is "already working with industry stakeholders on how we can maximise low-band
spectrum," and adds that "there are current users who have spectrum who don't use it all the
time".
There are bands allocated to radar and wireless microphones, as well as other bands reserved
by governments, that are not in use for 90 per cent of the time, according to Oksanen. "We
are working on a regulatory regime and developing a mechanism whereby operators have
guarantees that when they use it they can use it in the same way they use licensed spectrum,
but the primary user can claim it back when they need it."
For sub-10GHz urban radio, as well as adding extra bands there could be changes to the way
the data is transmitted. 4G uses orthogonal frequency division multiplexing (OFDM) -
already used in Wi-Fi and wired broadband - to spread data bits across many narrow
subcarriers within a single band.
One way to improve data rates is to have multiple basestations communicate with a single
handset on the same frequency band - but synchronising them is tricky.
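
For readers unfamiliar with OFDM, the sketch below (Python/NumPy, with illustrative sizes and an ideal channel) shows the core mechanism: data symbols are placed on parallel subcarriers, an inverse FFT turns them into one time-domain symbol, and a cyclic prefix is prepended so the receiver can recover them with a plain FFT.

```python
import numpy as np

# Minimal sketch of one OFDM symbol: map bits onto many narrow subcarriers,
# convert to the time domain with an IFFT, and prepend a cyclic prefix.
rng = np.random.default_rng(2)
n_subcarriers, cp_len = 64, 16

bits = rng.integers(0, 2, size=2 * n_subcarriers)
# QPSK: two bits per subcarrier
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

time_signal = np.fft.ifft(symbols)                          # one OFDM symbol
tx = np.concatenate([time_signal[-cp_len:], time_signal])   # add cyclic prefix

# Receiver: drop the cyclic prefix and undo the IFFT with an FFT.
rx_symbols = np.fft.fft(tx[cp_len:])
assert np.allclose(rx_symbols, symbols)                     # perfect recovery on an ideal channel
```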
"OFDM requires a lot of management - but there are other potential solutions," says
University of Surrey's Prof Tafazolli. "We don't have that technology yet, but I'm more in
favour of other types of waveform that do not require strict timing and frequency
synchronisation, because they would reduce the management load."
Oksanen says: "There are some proposals for new coding methods... But when we look at
whether we can do better than OFDM, there doesn't seem to be any significant improvement
with these new methods. You find you can improve power efficiency, for example, but the
spectrum efficiency goes down. We believe that OFDM is the best way to go forward - and it
looks to be the most promising way for local-area 5G radio."
The industry has a while before it has to make a decision on what 5G means, but individual
radio standards are only going to be part of the picture. "Cellular architecture needs to
change. The legacy structures that we defined in 2G need to be revised," Prof Tafazolli
concludes. "We need to have a better way of structuring communications between the
basestation and devices."
