
From Wikipedia, the free encyclopedia


Huawei Technologies Co. Ltd.

华为技术公司

Type: Private company
Founded: 1987
Founder: Ren Zhengfei
Headquarters: Shenzhen, China
Key people: Sun Yafang (chair of the board of directors)[1]
Industry: Telecommunications
Products: Telecommunications equipment
Revenue: ▲ $75.103 billion (2016)[2]
Operating income: ▲ $6.842 billion (2016)
Net income: ▲ $5.335 billion (2016)
Assets: ▲ ¥443,634,000,000 (31 December 2016)[4]
Employees: more than 180,000 (2019)[3]
Subsidiaries: Honor
Website: www.huawei.com/ru/


Huawei Technologies Co. Ltd. (Chinese traditional 華為技術有限公司, simplified 华为技术有限公
司, pinyin Huáwèi Jìshù Yǒuxiàn Gōngsī[5]) is one of the world's largest
telecommunications companies[6]. It was founded in 1987 by Ren Zhengfei, a former
engineer of the People's Liberation Army of China[7].

Contents

 1 Products

o 1.1 Kirin processors

o 1.2 Mobile segment

 2 Key performance indicators

 3 Operations in the CIS

o 3.1 In Russia

o 3.2 In Belarus

o 3.3 In Kazakhstan

 4 Huawei and the United States

o 4.1 Arrest of the "Huawei princess"

o 4.2 The "war" with Huawei

 5 Criticism

 6 Market access restrictions

 7 See also

 8 Notes
 9 External links

Products
Huawei's solutions and products include:
Consumer electronics

 Mobile devices: smartphones (including under the Honor brand) and tablets; an
operating system (Hongmeng OS) is also being developed for them.
 Network services: the AppGallery app store, whose monthly users reached
266 million in 2018; 560,000 developers are registered in AppGallery[8].
Telecommunications equipment

 wireless network equipment (LTE/HSDPA/W-CDMA/EDGE/GPRS/GSM,
CDMA2000 1xEV-DO/CDMA2000 1X, TD-SCDMA)
 core network equipment (IMS, Mobile Softswitch, NGN)
 network devices (FTTx, xDSL, optical devices, routers, network switches)
 applications and hardware (IN, mobile data services, BOSS)
 terminals (UMTS/CDMA)
On 21 September 2010 in Hong Kong, Huawei demonstrated a prototype of 700 Mbit/s
DSL technology, better suited for providing "ultra-broadband" services; such a solution
may allow telecom operators to build cost-effective networks with high bandwidth[9].
5G technologies: on 16 March 2019 it became known that the company, with the help of
5G, would soon develop an online version of the world's largest palace complex, the
Forbidden City in Beijing[10].
Kirin processors
At the IFA 2017 consumer electronics show in August, Huawei unveiled the Kirin 970
system-on-chip (10 nm process)[11], featuring a dedicated "artificial intelligence" (AI)
unit[12]. In November 2017 the publication GizChina called it the most powerful processor
in terms of data transfer speed[13].
On 19 July 2018 Huawei introduced the Kirin 710 mobile processor, manufactured on a
12-nanometer process. The chip contains eight compute cores: four ARM Cortex-A53
cores clocked at up to 1.7 GHz and four ARM Cortex-A73 cores at up to 2.2 GHz; graphics
are handled by an integrated ARM Mali-G51 MP4 controller[14].
Kirin A1 is a solution aimed at wearable electronics: wireless earbuds and
smartwatches[15].
Kirin 985, with an integrated 5G modem (2020), targets affordable flagship
smartphones.[2]
Mobile segment
In 2015 Huawei became the world's third-largest smartphone maker.
In the first quarter of 2017 the company again held third place on the market, shipping
34.2 million smartphones for a 6.1% share[16]. In the second quarter of 2018 Huawei
overtook Apple to become the world's second-largest smartphone maker[17].
In June 2018 Huawei became the leader in online smartphone sales in Russia,
overtaking Samsung and Apple[18].
In March 2019 the company presented its first smart glasses, created in partnership with
the Korean fashion brand Gentle Monster. Speakers, a microphone, an antenna and a
battery are built into the frame[19].
On 30 April 2019 the international analytics firm IDC published worldwide smartphone
sales statistics (units sold) for the first quarter of 2019. Huawei displaced Apple and rose
to second place in this segment, taking 19.1% of the global smartphone market. Samsung
remained first with 23.1%; Apple was third with 11.7%. In absolute terms, Huawei's sales
in the first quarter of 2019 grew 50.3% year-on-year, from 39.3 million to 59.1 million
units. Over the same period the leader's (Samsung's) sales fell 8.1% (from 78.2 to
71.9 million units), while Apple's sales dropped by almost a third, 30.2% (from 52.2 to
36.4 million units)[20][21].
In July 2019 the company presented its first smartphone with 5G support, the
HUAWEI Mate 20 X[22].
On 9 August 2019 Huawei unveiled its own operating system, HarmonyOS[23].

Key performance indicators

At the beginning of 2012 the company employed more than 110,000 people; by 2017 the
workforce had grown to 180,000.
In 2010 the company had 8 regional offices and about 100 branches worldwide. Huawei
has 20 research and development centers in various countries, including China, the
United States, Germany, Turkey, India (Bangalore), Sweden (Stockholm) and Russia
(Moscow)[24]. The company has set up joint innovation centers with major global telecom
operators such as Vodafone Group, BT Group, Telecom Italia, France Telecom, Telefonica
and Deutsche Telekom.
In 2006 the company showed growth in the next-generation network segment, including
3G networks[25].
Every year Huawei invests at least 10% of its contracted sales ($1.26 billion) in research
and development, remaining one of the world's leading companies (13th place[26]) by the
number of patent applications filed: 26,800.

 Revenue in 2007: $12.5 billion (up 48% on 2006). Most of Huawei's 2007
contracts, about 72%, were signed on foreign or international markets. In the mobile
segment, 45% of Huawei's new contracts covered UMTS/HSDPA infrastructure and
44.8% CDMA networks. Huawei's customers include 35 of the 50 largest telecom
operators in the world. In 2007 Huawei signed 34 managed-services contracts,
including contracts with Vodafone, China Mobile, Etisalat and MTN Group[27].
 Revenue in 2008: $23.3 billion (up 46%), net income $1.15 billion. In 2008
Huawei's operating margin rose from 3% to 13%[28].
 Revenue in 2009: $21.5 billion (down 7.7%)[29]. Net income grew 2.3-fold
year-on-year to $2.7 billion. Sales income was $21.8 billion (up 19%). Net margin:
12.2%[30].
 In 2016 revenue reached $75.103 billion, operating income $6.842 billion and
net income $5.335 billion[2]. 139 million smartphones were sold.
 In 2017 revenue reached $92 billion, turnover grew 15%, and 153 million
smartphones were sold[31].
 In 2018 the company's revenue reached $107 billion, profit 
Digital electronics, digital technology or digital (electronic) circuits are electronics that
operate on digital signals. In contrast, analog circuits manipulate analog signals whose
performance is more subject to manufacturing tolerance, signal attenuation and noise.
Digital techniques are helpful because it is much easier to get an electronic device to switch
into one of a number of known states than to accurately reproduce a continuous range of
values.
Digital electronic circuits are usually made from large assemblies of logic gates (often
printed on integrated circuits), simple electronic representations of Boolean logic
functions.[1]

Contents

 1History
o 1.1Digital revolution and digital age
 2Properties
 3Construction
 4Design
o 4.1Representation
o 4.2Synchronous systems
o 4.3Asynchronous systems
o 4.4Register transfer systems
o 4.5Computer design
o 4.6Computer architecture
o 4.7Design issues in digital circuits
o 4.8Automated design tools
o 4.9Design for testability
o 4.10Trade-offs
 4.10.1Cost
 4.10.2Reliability
 4.10.3Fanout
 4.10.4Speed
 5Logic families
 6Recent developments
 7See also
 8Notes
 9References
 10Further reading
 11External links

History
The binary number system was refined by Gottfried Wilhelm Leibniz (published in 1705)
and he also established that by using the binary system, the principles of arithmetic and
logic could be joined. Digital logic as we know it was the brain-child of George Boole in the
mid 19th century. In an 1886 letter, Charles Sanders Peirce described how logical
operations could be carried out by electrical switching circuits.[2] Eventually, vacuum
tubes replaced relays for logic operations. Lee De Forest's modification, in 1907, of
the Fleming valve can be used as an AND gate. Ludwig Wittgenstein introduced a version
of the 16-row truth table as proposition 5.101 of Tractatus Logico-
Philosophicus (1921). Walther Bothe, inventor of the coincidence circuit, shared the
1954 Nobel Prize in Physics for the first modern electronic AND gate in 1924.
Mechanical analog computers started appearing in the first century and were later used in
the medieval era for astronomical calculations. In World War II, mechanical analog
computers were used for specialized military applications such as calculating torpedo
aiming. During this time the first electronic digital computers were developed. Originally
they were the size of a large room, consuming as much power as several hundred
modern personal computers (PCs).[3]
The Z3 was an electromechanical computer designed by Konrad Zuse. Finished in 1941, it
was the world's first working programmable, fully automatic digital computer.[4] Purely
electronic digital computation was made practical by the vacuum tube, whose earliest
form, the thermionic diode, was invented by John Ambrose Fleming in 1904.
At the same time that digital calculation replaced analog, purely electronic circuit elements
soon replaced their mechanical and electromechanical equivalents. John
Bardeen and Walter Brattain invented the point-contact transistor at Bell Labs in 1947,
followed by William Shockley inventing the bipolar junction transistor at Bell Labs in 1948.[5]
[6]

At the University of Manchester, a team under the leadership of Tom Kilburn designed and
built a machine using the newly developed transistors instead of vacuum tubes.[7] Their
transistorised computer, the first in the world, was operational by 1953, and a second
version was completed there in April 1955. From 1955 onwards, transistors replaced
vacuum tubes in computer designs, giving rise to the "second generation" of computers.
Compared to vacuum tubes, transistors were smaller, more reliable, had effectively
indefinite lifespans, and required less power, thereby giving off less heat and allowing
much denser concentrations of circuits, up to tens of thousands in a relatively compact
space.
While working at Texas Instruments in July 1958, Jack Kilby recorded his initial ideas
concerning the integrated circuit (IC), then successfully demonstrated the first working
integrated circuit on 12 September 1958.[8] Kilby's chip was made of germanium. The following
year, Robert Noyce at Fairchild Semiconductor invented the silicon integrated circuit. The
basis for Noyce's silicon IC was the planar process, developed in early 1959 by Jean
Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method
developed in 1957.[9] This new technique, the integrated circuit, allowed for quick, low-cost
fabrication of complex circuits by having a set of electronic circuits on one small plate
("chip") of semiconductor material, normally silicon.

Digital revolution and digital age


Further information: Digital Revolution and Digital Age
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS
transistor, was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in
1959.[10][11][12] The MOSFET's advantages include high scalability,[13] affordability,[14]
low power consumption, and high transistor density.[15] Its rapid on–off electronic
switching speed also makes it ideal for generating pulse trains,[16] the basis for
electronic digital signals,[17][18] in contrast to BJTs, which more slowly generate analog
signals resembling sine waves.[16] Along with MOS large-scale integration (LSI), these
factors make the MOSFET an important switching device for digital circuits.[19] The
MOSFET revolutionized the electronics industry,[20][21] and is the most common
semiconductor device.[11][22] MOSFETs are the fundamental building blocks of digital
electronics during the Digital Revolution of the late 20th to early 21st centuries.[12][23][24]
This paved the way for the Digital Age of the early 21st century.[12]
In the early days of integrated circuits, each chip was limited to only a few transistors, and
the low degree of integration meant the design process was relatively simple.
Manufacturing yields were also quite low by today's standards. The wide adoption of the
MOSFET transistor by the early 1970s led to the first large-scale integration (LSI) chips
with more than 10,000 transistors on a single chip.[25] Following the wide adoption of
CMOS, a type of MOSFET logic, by the 1980s, millions and then billions of MOSFETs could
be placed on one chip as the technology progressed,[26] and good designs required
thorough planning, giving rise to new design methods. As of 2013, billions of MOSFETs
are manufactured every day.[11]
The wireless revolution, the introduction and proliferation of wireless networks, began in the
1990s and was enabled by the wide adoption of MOSFET-based RF power
amplifiers (power MOSFET and LDMOS) and RF circuits (RF CMOS).[27][28][29] Wireless
networks allowed for public digital transmission without the need for cables, leading
to digital television (digital TV), GPS, satellite radio, wireless Internet and mobile
phones through the 1990s–2000s.
Discrete cosine transform (DCT) coding, a data compression technique first proposed
by Nasir Ahmed in 1972,[30] enabled practical digital media transmission,[31][32][33] with image
compression formats such as JPEG (1992), video coding formats such as H.26x (1988
onwards) and MPEG (1993 onwards),[34] audio coding standards such as Dolby
Digital (1991)[35][36] and MP3 (1994),[34] and digital TV standards such as video-on-
demand (VOD)[31] and high-definition television (HDTV).[37] Internet video was popularized
by YouTube, an online video platform founded by Chad Hurley, Jawed Karim and Steve
Chen in 2005, which enabled the video streaming of MPEG-4 AVC (H.264) user-generated
content from anywhere on the World Wide Web.[38]

Properties
An advantage of digital circuits when compared to analog circuits is that signals
represented digitally can be transmitted without degradation caused by noise.[39] For
example, a continuous audio signal transmitted as a sequence of 1s and 0s, can be
reconstructed without error, provided the noise picked up in transmission is not enough to
prevent identification of the 1s and 0s.
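The regeneration idea above can be sketched in a few lines of Python. The voltage levels and noise values here are illustrative assumptions, not taken from any real transmission standard:

```python
# Sketch: bits sent as 0 V / 5 V levels pick up noise in transit, but
# thresholding at the midpoint recovers every bit as long as the noise
# never exceeds half the signal swing.

def transmit(bits, noise):
    """Encode bits as 0 V / 5 V levels and add per-sample noise."""
    return [5.0 * b + n for b, n in zip(bits, noise)]

def regenerate(levels, threshold=2.5):
    """Recover bits by comparing each received level to the threshold."""
    return [1 if v > threshold else 0 for v in levels]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
noise = [0.8, 1.2, -1.1, 0.4, 2.0, -0.7, -2.3, 1.9]  # all within ±2.5 V

received = transmit(bits, noise)
assert regenerate(received) == bits  # perfect reconstruction
```

If any noise sample exceeded ±2.5 V, the corresponding bit would flip, which is exactly the failure mode the paragraph describes.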
In a digital system, a more precise representation of a signal can be obtained by using
more binary digits to represent it. While this requires more digital circuits to process the
signals, each digit is handled by the same kind of hardware, resulting in an
easily scalable system. In an analog system, additional resolution requires fundamental
improvements in the linearity and noise characteristics of each step of the signal chain.
With computer-controlled digital systems, new functions can be added through software
revision, with no hardware changes. Often this can be done outside of the factory by
updating the product's software. In this way, design errors can be corrected even after
the product is in a customer's hands.
Information storage can be easier in digital systems than in analog ones. The noise
immunity of digital systems permits data to be stored and retrieved without degradation. In
an analog system, noise from aging and wear degrades the information stored. In a digital
system, as long as the total noise is below a certain level, the information can be recovered
perfectly. Even when more significant noise is present, the use of redundancy permits the
recovery of the original data provided too many errors do not occur.
In some cases, digital circuits use more energy than analog circuits to accomplish the
same tasks, and thus produce more heat, which adds complexity to the circuits, such as
the inclusion of heat sinks. In portable or battery-powered systems this can limit the use
of digital systems. For example, battery-powered cellular telephones often use a low-power analog
front-end to amplify and tune in the radio signals from the base station. However, a base
station has grid power and can use power-hungry, but very flexible software radios. Such
base stations can be easily reprogrammed to process the signals used in new cellular
standards.
Many useful digital systems must translate from continuous analog signals to discrete
digital signals. This causes quantization errors. Quantization error can be reduced if the
system stores enough digital data to represent the signal to the desired degree of fidelity.
The Nyquist–Shannon sampling theorem provides an important guideline as to how much
digital data is needed to accurately portray a given analog signal.
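As a rough illustration of quantization error (not of the sampling theorem itself), the following Python sketch quantizes a sampled sine wave and checks that the worst-case error is bounded by half a quantization step, shrinking as more bits are used:

```python
import math

def quantize(x, bits):
    """Uniformly quantize x in [-1, 1] to 2**bits levels (cell midpoints)."""
    step = 2.0 / 2 ** bits
    return step * math.floor(x / step) + step / 2

# one period of a sine wave, sampled 64 times
samples = [math.sin(2 * math.pi * k / 64) for k in range(64)]

for bits in (4, 8, 12):
    err = max(abs(s - quantize(s, bits)) for s in samples)
    # worst-case error is bounded by half a step, i.e. halves per extra bit
    assert err <= 1.0 / 2 ** bits
```

Each additional bit doubles the number of levels, so the fidelity of the stored signal can be increased arbitrarily at the cost of more digital data, as the paragraph states.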
In some systems, if a single piece of digital data is lost or misinterpreted, the meaning of
large blocks of related data can completely change. For example, a single-bit error in audio
data stored directly as linear pulse-code modulation causes, at worst, a single click.
In contrast, many people use audio compression to save storage space and download time,
even though a single-bit error may then cause a much larger disruption.
Because of the cliff effect, it can be difficult for users to tell if a particular system is right on
the edge of failure, or if it can tolerate much more noise before failing. Digital fragility can
be reduced by designing a digital system for robustness. For example, a parity bit or
other error management method can be inserted into the signal path. These schemes help
the system detect errors, and then either correct the errors, or request retransmission of the
data.
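A minimal sketch of the parity-bit scheme mentioned above, in Python. Real links use stronger codes (CRCs, Hamming codes); a single even-parity bit only detects an odd number of flipped bits and cannot locate or correct them:

```python
# Even parity: the appended bit makes the total number of 1s even.

def add_parity(bits):
    """Append a parity bit so the word has an even number of 1s."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """True if the word still has an even number of 1s."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
assert check_parity(word)          # intact word passes

word[2] ^= 1                       # single-bit error in transit
assert not check_parity(word)      # error detected
```

On detection, a system can either request retransmission or, with a more capable code, correct the error in place.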

Construction

A binary clock, hand-wired on breadboards

A digital circuit is typically constructed from small electronic circuits called logic gates that
can be used to create combinational logic. Each logic gate is designed to perform a
function of Boolean logic when acting on logic signals. A logic gate is generally created
from one or more electrically controlled switches, usually transistors, though thermionic
valves have seen historic use. The output of a logic gate can, in turn, control or feed into
more logic gates.
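The composition of gates into combinational logic can be sketched in Python. The half adder below is a standard textbook circuit used here for illustration, not one taken from this article:

```python
# Elementary gates as functions on bits (0 or 1).
def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Feed two gate outputs forward: return (sum, carry) for two bits."""
    return XOR(a, b), AND(a, b)

# Exhaustively check the circuit against ordinary addition.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        assert 2 * c + s == a + b
```

The outputs of `XOR` and `AND` here play the role of gate outputs feeding into further logic, just as the paragraph describes.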
Another form of digital circuit is constructed from lookup tables (many sold as
"programmable logic devices", though other kinds of PLDs exist). Lookup tables can
perform the same functions as machines based on logic gates, but can be easily
reprogrammed without changing the wiring. This means that a designer can often repair
design errors without changing the arrangement of wires. Therefore, in small volume
products, programmable logic devices are often the preferred solution. They are usually
designed by engineers using electronic design automation software.
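The lookup table's role as reprogrammable logic can be sketched as follows. This is a software analogy, hypothetical rather than a model of any particular PLD:

```python
# A lookup table "programmed" with a truth table: the stored bits define
# the function, and changing them reprograms it with no rewiring.

def make_lut(truth_table):
    """Return a 2-input gate backed by a list indexed by the inputs."""
    def gate(a, b):
        return truth_table[(a << 1) | b]
    return gate

xor_lut = make_lut([0, 1, 1, 0])   # rows for inputs 00, 01, 10, 11
nand_lut = make_lut([1, 1, 1, 0])  # same mechanism, different program

assert [xor_lut(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]
assert nand_lut(1, 1) == 0 and nand_lut(0, 0) == 1
```

Repairing a design error amounts to storing a corrected truth table, which is why PLDs suit small-volume products.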
Integrated circuits consist of multiple transistors on one silicon chip, and are the least
expensive way to make large numbers of interconnected logic gates. Integrated circuits
are usually interconnected on a printed circuit board, which holds the electrical
components and connects them together with copper traces.

Design
Engineers use many methods to minimize logic redundancy in order to reduce the circuit
complexity. Reduced complexity reduces component count and potential errors and
therefore typically reduces cost. Logic redundancy can be removed by several well-known
techniques, such as binary decision diagrams, Boolean algebra, Karnaugh maps,
the Quine–McCluskey algorithm, and the heuristic computer method. These operations are
typically performed within a computer-aided design system.
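A full minimizer such as Quine–McCluskey is too long to sketch here, but the correctness criterion behind all of these techniques is easy to demonstrate: the reduced circuit must agree with the original on every input combination. The two functions below are illustrative examples, not drawn from the article:

```python
from itertools import product

def original(a, b, c):
    # (a AND b) OR (a AND NOT b) OR (b AND c) -- contains redundancy
    return (a and b) or (a and not b) or (b and c)

def minimized(a, b, c):
    # a OR (b AND c) computes the same function with fewer gates
    return a or (b and c)

# Exhaustive check over all 8 input combinations.
for a, b, c in product([False, True], repeat=3):
    assert original(a, b, c) == minimized(a, b, c)
```

Tools in a computer-aided design system automate both the reduction and this kind of equivalence check.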
Embedded systems with microcontrollers and programmable logic controllers are often
used to implement digital logic for complex systems that don't require optimal performance.
These systems are usually programmed by software engineers or by electricians,
using ladder logic.

Representation
Representations are crucial to an engineer's design of digital circuits. To choose
representations, engineers consider types of digital systems.
The classical way to represent a digital circuit is with an equivalent set of logic gates. Each
logic symbol is represented by a different shape. The actual set of shapes was introduced
in 1984 under IEEE/ANSI standard 91-1984 and is now in common use by integrated circuit
manufacturers.[40] Another way is to construct an equivalent system of electronic switches
(usually transistors). This can be represented as a truth table.
Most digital systems divide into combinational and sequential systems. A combinational
system always presents the same output when given the same inputs. A sequential system
is a combinational system with some of the outputs fed back as inputs. This makes the
digital machine perform a sequence of operations. The simplest sequential system is
probably a flip flop, a mechanism that represents a binary digit or "bit". Sequential systems
are often designed as state machines. In this way, engineers can design a system's gross
behavior, and even test it in a simulation, without considering all the details of the logic
functions.
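The flip-flop's behavior as a one-bit sequential element can be sketched in Python. This is an idealized model, ignoring setup/hold times and other electrical details:

```python
# A D flip-flop: captures its input only on the rising edge of a clock,
# otherwise holding one bit of state -- output depends on history, not
# just on the current inputs, which is what makes it sequential.

class DFlipFlop:
    def __init__(self):
        self.q = 0            # stored bit, the flip-flop's output
        self._last_clk = 0

    def tick(self, clk, d):
        """Update on a 0 -> 1 clock transition; otherwise hold."""
        if clk == 1 and self._last_clk == 0:
            self.q = d
        self._last_clk = clk
        return self.q

ff = DFlipFlop()
assert ff.tick(1, 1) == 1   # rising edge: captures d=1
assert ff.tick(1, 0) == 1   # clock held high: output unchanged
assert ff.tick(0, 0) == 1   # falling edge: still holds
assert ff.tick(1, 0) == 0   # next rising edge: captures d=0
```

Feeding such outputs back through combinational logic gives the state machines described above.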
Sequential systems divide into two further subcategories. "Synchronous" sequential
systems change state all at once, when a "clock" signal changes state. "Asynchronous"
sequential systems propagate changes whenever inputs change. Synchronous sequential
systems are made of well-characterized asynchronous circuits such as flip-flops, that
change only when the clock changes, and which have carefully designed timing margins.
For logic simulation, digital circuit representations have digital file formats that can be
processed by computer programs.

Synchronous systems

A 4-bit ring counter using D-type flip-flops is an example of synchronous logic. Each device is
connected to the clock signal, and they all update together.

Main article: synchronous logic


The usual way to implement a synchronous sequential state machine is to divide it into a
piece of combinational logic and a set of flip flops called a "state register." Each time a
clock signal ticks, the state register captures the feedback generated from the previous
state of the combinational logic, and feeds it back as an unchanging input to the
combinational part of the state machine. The fastest rate of the clock is set by the most
time-consuming logic calculation in the combinational logic.
The state register is just a representation of a binary number. If the states in the state
machine are numbered (easy to arrange), the logic function is some combinational logic
that produces the number of the next state.
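The split described above, a purely combinational next-state function plus a state register updated on each clock tick, can be sketched with a 2-bit (mod-4) counter. The counter is an illustrative choice, not an example from the article:

```python
def next_state(state):
    """Combinational logic: the next state number from the current one."""
    return (state + 1) % 4

state_register = 0          # the flip-flops holding the current state
trace = []
for _ in range(6):          # six clock ticks
    trace.append(state_register)
    # on each clock edge, the register captures the combinational output
    state_register = next_state(state_register)

assert trace == [0, 1, 2, 3, 0, 1]
```

Because `next_state` is pure combinational logic, the maximum clock rate of the real circuit is set by how long that logic takes to settle, as the text notes.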

Asynchronous systems
As of 2014, most digital logic is synchronous because it is easier to create and verify a
synchronous design. However, asynchronous logic has the advantage of its speed not
being constrained by an arbitrary clock; instead, it runs at the maximum speed of its logic
gates.[a] Building an asynchronous system using faster parts makes the circuit faster.
Nevertheless, most systems need circuits that allow external unsynchronized signals to
enter synchronous logic circuits. These are inherently asynchronous in their design and
must be analyzed as such. Examples of widely used asynchronous circuits include
synchronizer flip-flops, switch debouncers and arbiters.
Asynchronous logic components can be hard to design because all possible states, in all
possible timings must be considered. The usual method is to construct a table of the
minimum and maximum time that each such state can exist, and then adjust the circuit to
minimize the number of such states. Then the designer must force the circuit to periodically
wait for all of its parts to enter a compatible state (this is called "self-resynchronization").
Without such careful design, it is easy to accidentally produce asynchronous logic that is
"unstable," that is, real electronics will have unpredictable results because of the
cumulative delays caused by small variations in the values of the electronic components.

Register transfer systems

Example of a simple circuit with a toggling output. The inverter forms the combinational logic in this
circuit, and the register holds the state.

Many digital systems are data flow machines. These are usually designed using
synchronous register transfer logic, using hardware description languages such
as VHDL or Verilog.
In register transfer logic, binary numbers are stored in groups of flip flops called registers.
The outputs of each register are a bundle of wires called a "bus" that carries that number to
other calculations. A calculation is simply a piece of combinational logic. Each calculation
also has an output bus, and these may be connected to the inputs of several registers.
Sometimes a register will have a multiplexer on its input, so that it can store a number from
any one of several buses. Alternatively, the outputs of several items may be connected to a
bus through buffers that can turn off the output of all of the devices except one. A
sequential state machine controls when each register accepts new data from its input.
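One register-transfer step, registers driving buses into combinational logic with a multiplexer selecting a destination register's input, might be sketched like this. The register names and the `select` encoding are made up for illustration:

```python
def clock_tick(regs, select):
    """One synchronous step of a tiny register-transfer machine."""
    bus_a, bus_b = regs["A"], regs["B"]      # register outputs onto buses
    sum_bus = bus_a + bus_b                  # a combinational calculation
    new = dict(regs)                         # registers update on the edge
    # multiplexer on C's input: load either the sum bus or bus A
    new["C"] = sum_bus if select == "sum" else bus_a
    return new

regs = {"A": 5, "B": 7, "C": 0}
regs = clock_tick(regs, "sum")
assert regs["C"] == 12
regs = clock_tick(regs, "passthrough")
assert regs["C"] == 5
```

The `select` argument stands in for the sequential state machine that, in hardware, decides which register accepts new data on each clock.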
Asynchronous register-transfer systems (such as computers) have a general solution. In
the 1980s, some researchers discovered that almost all synchronous register-transfer
machines could be converted to asynchronous designs by using first-in-first-out
synchronization logic. In this scheme, the digital machine is characterized as a set of data
flows. In each step of the flow, an asynchronous "synchronization circuit" determines when
the outputs of that step are valid, and presents a signal that says, "grab the data" to the
stages that use that stage's inputs. It turns out that just a few relatively simple
synchronization circuits are needed.

Computer design

Intel 80486DX2 microprocessor

The most general-purpose register-transfer logic machine is a computer. This is basically
an automatic binary abacus. The control unit of a computer is usually designed as
a microprogram run by a microsequencer. A microprogram is much like a player-piano roll.
Each table entry or "word" of the microprogram commands the state of every bit that
controls the computer. The sequencer then counts, and the count addresses the memory
or combinational logic machine that contains the microprogram. The bits from the
microprogram control the arithmetic logic unit, memory and other parts of the computer,
including the microsequencer itself. A "specialized computer" is usually a conventional
computer with special-purpose control logic or microprogram.
In this way, the complex task of designing the controls of a computer is reduced to a
simpler task of programming a collection of much simpler logic machines.
Almost all computers are synchronous. However, true asynchronous computers have also
been designed. One example is the Aspida DLX core.[42] Another was offered by ARM
Holdings. Speed advantages have not materialized, because modern computer designs
already run at the speed of their slowest component, usually memory. These do use
somewhat less power because a clock distribution network is not needed. An unexpected
advantage is that asynchronous computers do not produce spectrally-pure radio noise, so
they