

Outline of an integrated English and physics lesson,
taught by English teacher Пин О.Л. and
physics teacher Саркисян А. В.
in class 11 A on 18 November 2009

TOPIC: "Landmarks of Science in the XXth Century"

to develop students' monologue speech within the topic "Landmarks of Science in the XXth Century"

educational: 1. to enrich and activate students' vocabulary
on the topic "Science and Discoveries" 2. to promote the development of monologue and
dialogue speech 3. to improve listening skills 4. to enrich and
activate knowledge of topics such as "the theory of relativity",

upbringing: 1. to create conditions for developing students' independence

2. to strengthen the skills of working in a group 3. to create conditions for the
formation of a communicative culture 4. to improve general learning skills, including
reflective skills 5. to promote the formation of an objective

developmental: 1. to help maintain a favourable

psychological climate in the class 2. to help students reveal their personal
abilities 3. to promote a sense of self-respect and
respect for others 4. to promote the realisation of interdisciplinary connections
5. to promote the formation of metasubject skills, such as the ability
to hold an audience
Students have:
1. "Opportunities Intermediate" textbooks 2. notebooks 3. monolingual
dictionaries 4. presentations and other supporting materials for their
talks 5. the "Opportunities Intermediate" workbook

The teacher has:
1. "Opportunities Intermediate" textbooks 2. a PowerPoint presentation 3. the
"Opportunities Intermediate" workbook


 group work
 individual presentations
 whole-class work

1. Organisational start of the lesson 1. Greeting 0.5 min

2. Announcing the topic and aim of the lesson 0.5 min

2. Preparatory work 1. activating existing knowledge 1 min

2. preparatory work in groups 2 min

3. Work on the topic 1. individual presentations on the subtopics 25 min

2. discussion of the presentations 5 min
3. group discussion of a quotation 6 min

4. Summing up 1. Summary of the lesson 2 min

2. Summary of the class's work 1 min
3. Analysis of the homework 1.5 min
4. Organisational end of the lesson 0.5 min

Total: 45 min



T: Good morning, everybody! I’m glad to see you.

У: Good morning. I'm glad to see you. Sit down, please. Let's begin the lesson.


T: We have already started studying some of the most important scientific discoveries
of the XXth century and today we’re here to have a closer look at them with the help of
a professional physicist.
У: And this first acquaintance with these discoveries will help us when we turn to a
detailed study of the physics behind some of them in the near future; for now, let us
get acquainted with the general information.
T: So, first of all, please, look at the photos and name these discoveries
S1: the equation E=mc², the theory of relativity
S2: the DNA molecule
S3: laser
S4: the first computer
S5: penicillin
T: Well, thank you. So, today we’re going to discuss these discoveries.


T: And first of all, just to warm up, let's match these discoveries to the scientists. Any
S: (on the screen)
T: Does everyone agree?


T: And now I'd like you to work in groups and talk about these discoveries. The
groups have found some information and prepared speeches. Now they've got just a
couple of minutes to prepare for their presentations.
Бахарев А. Тышковец Н. Эдельштейн Дюбо Д.
Звягин Д. Тышковец Т. Колесников И. Зиборов Н.
Архипов А. Бурцев В. Лагутина Е. Волков П.
Сухачева К. Блехан Е. Грушко Е. Галичина А.
Лиходеева Г. Боровков Е. Ляпина О. Кожеко Ю.



T: So, are you ready, the first group?

S: Yes, we are.
T: Are you ready to listen actively? Please, while you’re listening, prepare at least one
question for the group. So, let's start.
S1: In 1905 Albert Einstein published his famous equation E=mc². This equation
made it possible to harness huge amounts of energy and opened new ways to explore the universe.
The formula is based on Einstein's research on the behaviour of a body
moving at a speed close to the speed of light.
The general theory of relativity was created as a theory of gravity. However, it soon
became clear that it could also be used for modelling the universe.
The theory of relativity describes motion, the laws of mechanics and
the spatial-temporal relations that define them at speeds close to the speed of light.
Gravitational effects in it are caused not by the force interaction of bodies and fields
located in space-time, but by the deformation of space-time itself.
The principle of relativity: all physical processes in inertial frames of reference proceed
in the same way, whether the system is at rest or in a state of uniform rectilinear motion.
S2: Einstein's postulates:
1. The speed of light does not depend on the velocity of its source and is the same in all
inertial reference frames.
2. Space and time are homogeneous, and space is isotropic.
The theory of relativity is one of the most successful theories, confirmed by observation.
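The two quantitative claims above, mass-energy equivalence and the dependence of relativistic effects on speed, can be checked numerically. The following sketch is an editor's illustration, not part of the original lesson materials; the only input is the standard value of the speed of light.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s (defined SI value)

def rest_energy(mass_kg: float) -> float:
    """E = mc^2: rest energy of a body of the given mass, in joules."""
    return mass_kg * C ** 2

def lorentz_factor(v: float) -> float:
    """gamma = 1 / sqrt(1 - v^2/c^2): the factor governing time dilation
    and length contraction for a body moving at speed v."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

if __name__ == "__main__":
    # One gram of matter converted entirely to energy, roughly 9e13 J:
    print(f"E for 1 g: {rest_energy(0.001):.3e} J")
    # Relativistic effects stay tiny until v approaches c:
    for frac in (0.1, 0.9, 0.99):
        print(f"gamma at {frac:.2f}c = {lorentz_factor(frac * C):.3f}")
```

At 0.1c the Lorentz factor is still about 1.005, which is why these effects went unnoticed before the 20th century.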
S3: The theory of relativity in fact opened the way to the invention of nuclear weapons.
The first steps in the national nuclear projects were made in the late 1930s in Great Britain,
France, the USSR and Germany. American physicists drew the president's attention to the
problem of nuclear weapons, and the White House launched a nuclear programme,
committing all the necessary means and resources. L. Groves was appointed Project
Manager. Enrico Fermi headed the construction of a uranium-graphite reactor.
Oppenheimer led research and development on the construction of the nuclear bomb.
The Manhattan Project got under way on September 17, 1942. Many
prominent physicists, many of whom were refugees from Europe, were drawn into it.
S4: By the summer of 1945 the Americans had managed to build three nuclear bombs, two of
which were dropped on Hiroshima and Nagasaki, while the third had been detonated in a test shortly before.
On the morning of August 6, 1945 the American B-29 bomber «Enola Gay» (crew
commander Colonel Paul Tibbets) dropped the uranium bomb «Little Boy»
on the Japanese city of Hiroshima. The power of the explosion was variously estimated at 13
to 18 kilotons. Three days later, on August 9, 1945, the plutonium bomb «Fat Man»
was dropped on Nagasaki by the pilot Charles Sweeney.
T: So, your questions?
T: Anything else?
T: Any more questions?
T: O.K. Now let's sum up the scientific content.
У: …
T: And now – the second group.
S1: In 1953, Charles H. Townes and graduate students James P. Gordon and Herbert J.
Zeiger produced the first microwave amplifier, a device operating on similar principles
to the laser, but amplifying microwave rather than infrared or visible radiation.
Townes's maser was incapable of continuous output. Nikolay Basov and Aleksandr
Prokhorov of the Soviet Union worked independently on the quantum oscillator and
solved the problem of continuous output systems by using more than two energy levels
and produced the first maser. These systems could release stimulated emission without
falling to the ground state, thus maintaining a population inversion. In 1955 Prokhorov
and Basov suggested an optical pumping of multilevel system as a method for
obtaining the population inversion, which later became one of the main methods of
laser pumping. Townes reports that he encountered opposition from a number of
eminent colleagues who thought the maser was theoretically impossible—including
Niels Bohr, John von Neumann, Isidor Rabi, Polykarp Kusch, and Llewellyn H.
Thomas. Townes, Basov, and Prokhorov shared the Nobel Prize in Physics in 1964 "for
fundamental work in the field of quantum electronics, which has led to the
construction of oscillators and amplifiers based on the maser-laser principle." The laser
was later developed on the basis of the maser. Laser, which stands for Light Amplification
by Stimulated Emission of Radiation, is a device that emits light (electromagnetic
radiation) through a process called stimulated emission. Laser light is usually spatially
coherent, which means that the light either is emitted in a narrow, low-divergence
beam, or can be converted into one with the help of optical components such as lenses.
More generally, coherent light typically means the source produces light waves that
are in step. They have the same frequencies and identical phase. The coherence of
typical laser emission is a distinctive characteristic of lasers. Most other light sources
emit incoherent light, which has a phase that varies randomly with time and position.
Typically, lasers are thought of as emitting light with a narrow wavelength spectrum
("monochromatic" light). This is not true of all lasers, however: some emit light with a
broad spectrum, while others emit light at multiple distinct wavelengths.
S2: The word laser was originally spelled LASER and is an acronym for light
amplification by stimulated emission of radiation. The word light in this phrase is used
in the broader sense, referring to electromagnetic radiation of any frequency, not just
that in the visible spectrum. Hence there are infrared lasers, ultraviolet lasers, X-ray
lasers, etc. Because the microwave equivalent of the laser, the maser, was developed
first, devices that emit microwave and radio frequencies are usually called masers. In
early literature, particularly from researchers at Bell Telephone Laboratories, the laser
was often called the optical maser. This usage has since become uncommon, and as of
1998 even Bell Labs uses the term laser. The word "laser" is sometimes used to
describe other non-light technologies. For example, a source of atoms in a coherent
state is called an "atom laser." A laser consists of a gain medium inside a highly
reflective optical cavity, as well as a means to supply energy to the gain medium. The
gain medium is a material with properties that allow it to amplify light by stimulated
emission. In its simplest form, a cavity consists of two mirrors arranged such that light
bounces back and forth, each time passing through the gain medium. Typically one of
the two mirrors, the output coupler, is partially transparent. The output laser beam is
emitted through this mirror. Light of a specific wavelength that passes through the gain
medium is amplified (increases in power); the surrounding mirrors ensure that most of
the light makes many passes through the gain medium, being amplified repeatedly.
Part of the light that is between the mirrors (that is, within the cavity) passes through
the partially transparent mirror and escapes as a beam of light. The process of
supplying the energy required for the amplification is called pumping. The energy is
typically supplied as an electrical current or as light at a different wavelength. Such
light may be provided by a flash lamp or perhaps another laser. Most practical lasers
contain additional elements that affect properties such as the wavelength of the emitted
light and the shape of the beam.
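The cavity picture S2 describes, light bouncing between mirrors, amplified on each pass through the gain medium and leaking out through the partially transparent coupler, can be captured in a toy numerical model. The gain, reflectivity, and saturation figures below are illustrative made-up values, not parameters of any real laser:

```python
def steady_state_intensity(g0: float, r: float, i_sat: float,
                           i0: float = 1e-6, max_trips: int = 10_000) -> float:
    """Iterate round trips of the cavity: each pass multiplies the intensity
    by the saturated gain g0 / (1 + I/I_sat), then by the output-coupler
    reflectivity r. Returns the intensity once it stops changing."""
    i = i0
    for _ in range(max_trips):
        gain = g0 / (1.0 + i / i_sat)   # gain saturates as intensity grows
        i_new = i * gain * r
        if abs(i_new - i) < 1e-12:      # steady state reached
            break
        i = i_new
    return i

# Illustrative numbers: small-signal gain 2.0 per pass, 95% reflective
# output coupler, saturation intensity 1.0 (arbitrary units).
i_cav = steady_state_intensity(g0=2.0, r=0.95, i_sat=1.0)
i_out = i_cav * 0.05   # the 5% transmitted through the coupler is the beam
```

The steady state lands exactly where saturated gain balances mirror loss (here at g0·r − 1 = 0.9 in units of the saturation intensity); if g0·r falls below 1, the seed light simply dies out, which is the lasing threshold in miniature.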
S3: The output of a laser may be a continuous constant-amplitude output (known as
CW or continuous wave); or pulsed, by using the techniques of Q-switching,
modelocking, or gain-switching. In pulsed operation, much higher peak powers can be
achieved. Some types of lasers, such as dye lasers and vibronic solid-state lasers, can
produce light over a broad range of wavelengths; this property makes them suitable for
generating extremely short pulses of light, on the order of a few femtoseconds (10⁻¹⁵ s).
S4: In the pulsed mode of operation, the output of a laser varies with respect to time,
typically taking the form of alternating 'on' and 'off' periods. In many applications one
aims to deposit as much energy as possible at a given place in as short time as
possible. In laser ablation for example, a small volume of material at the surface of a
work piece might evaporate if it gets the energy required to heat it up far enough in
very short time. If, however, the same energy is spread over a longer time, the heat
may have time to disperse into the bulk of the piece, and less material evaporates.
There are a number of methods to achieve this. In a Q-switched laser, the population
inversion (usually produced in the same way as CW operation) is allowed to build up
by making the cavity conditions (the 'Q') unfavorable for lasing. Then, when the pump
energy stored in the laser medium is at the desired level, the 'Q' is adjusted (electro- or
acousto-optically) to favourable conditions, releasing the pulse. This results in high
peak powers as the average power of the laser (were it running in CW mode) is packed
into a shorter time frame. A modelocked laser emits extremely short pulses on the
order of tens of picoseconds down to less than 10 femtoseconds. These pulses are
typically separated by the time that a pulse takes to complete one round trip in the
resonator cavity. Due to the Fourier limit (also known as energy-time uncertainty), a
pulse of such short temporal length has a spectrum which contains a wide range of
wavelengths. Because of this, the laser medium must have a broad enough gain profile
to amplify them all. An example of a suitable material is titanium-doped, artificially
grown sapphire (Ti:sapphire). The modelocked laser is a most versatile tool for
researching processes happening at extremely fast time scales also known as
femtosecond physics, femtosecond chemistry and ultrafast science, for maximizing the
effect of nonlinearity in optical materials (e.g. in second-harmonic generation,
parametric down-conversion, optical parametric oscillators and the like), and in
ablation applications. Again, because of the short timescales involved, these lasers can
achieve extremely high powers.
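The two effects described above reduce to simple arithmetic: peak power is pulse energy divided by pulse duration, and the Fourier limit ties pulse duration to spectral width. A sketch for illustration (the pulse energy and durations are invented example values; 0.441 is the time-bandwidth product of a Gaussian pulse):

```python
def peak_power(pulse_energy_j: float, pulse_duration_s: float) -> float:
    """Peak power: energy per pulse divided by pulse duration."""
    return pulse_energy_j / pulse_duration_s

def fourier_limited_bandwidth(pulse_duration_s: float) -> float:
    """Minimum spectral width (Hz) of a Gaussian pulse of the given duration,
    from the time-bandwidth product delta_nu * delta_t >= 0.441."""
    return 0.441 / pulse_duration_s

# A modest 100 mJ pulse squeezed into 10 ns already gives 10 MW of peak power:
p_qswitch = peak_power(100e-3, 10e-9)

# A 10 fs modelocked pulse needs ~44 THz of bandwidth, hence the need for
# broad-gain media such as Ti:sapphire mentioned above:
bw = fourier_limited_bandwidth(10e-15)
```

This is why a modelocked femtosecond laser must use a gain medium with a very broad gain profile: the shorter the pulse, the wider the range of wavelengths it has to contain.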
S5: Another method of achieving pulsed laser operation is to pump the laser material
with a source that is itself pulsed, either through electronic charging in the case of
flashlamps, or another laser which is already pulsed. Pulsed pumping was historically
used with dye lasers where the inverted population lifetime of a dye molecule was so
short that a high energy, fast pump was needed. The way to overcome this problem
was to charge up large capacitors which are then switched to discharge through
flashlamps, producing a broad spectrum pump flash. Pulsed pumping is also required
for lasers which disrupt the gain medium so much during the laser process that lasing
has to cease for a short period. These lasers, such as the excimer laser and the copper
vapour laser, can never be operated in CW mode. In 1917 Albert Einstein, in his paper
Zur Quantentheorie der Strahlung (On the Quantum Theory of Radiation), laid the
foundation for the invention of the laser and its predecessor, the maser, in a ground-
breaking rederivation of Max Planck's law of radiation based on the concepts of
probability coefficients (later to be termed 'Einstein coefficients') for the absorption,
spontaneous emission, and stimulated emission of electromagnetic radiation. In 1928,
Rudolf W. Ladenburg confirmed the existence of stimulated emission and negative
absorption. In 1939, Valentin A. Fabrikant predicted the use of stimulated emission to
amplify "short" waves. In 1947, Willis E. Lamb and R. C. Retherford found apparent
stimulated emission in hydrogen spectra and made the first demonstration of
stimulated emission. In 1950, Alfred Kastler (Nobel Prize for Physics 1966) proposed
the method of optical pumping, which was experimentally confirmed by Brossel,
Kastler and Winter two years later. The first working laser was demonstrated on 16
May 1960 by Theodore Maiman at Hughes Research Laboratories. Since then, lasers
have become a multi-billion dollar industry. By far the largest single application of
lasers is in optical storage devices such as compact disc and DVD players in which a
semiconductor laser less than a millimeter wide scans the surface of the disc. The
second-largest application is fiber-optic communication. Other common applications
of lasers are bar code readers, laser printers and laser pointers. Dye lasers use an
organic dye as the gain medium. The wide gain spectrum of available dyes allows
these lasers to be highly tunable, or to produce very short-duration pulses (on the order
of a few femtoseconds). The continuous or average power required for some uses:
 less than 1 mW – laser pointers
 5 mW – CD-ROM drive
 5–10 mW – DVD player or DVD-ROM drive
 100 mW – High-speed CD-RW burner
 250 mW – Consumer DVD-R burner
 1 W – green laser in current Holographic Versatile Disc prototype development
 1–20 W – output of the majority of commercially available solid-state lasers
used for micro machining
 30–100 W – typical sealed CO2 surgical lasers
 100–3000 W (peak output 1.5 kW) – typical sealed CO2 lasers used in industrial
laser cutting
 1 kW – Output power expected to be achieved by a prototype 1 cm diode laser
Examples of pulsed systems with high peak power:
 700 TW (700×10¹² W) – National Ignition Facility, a 192-beam, 1.8-megajoule
laser system adjoining a 10-meter-diameter target chamber.
 1.3 PW (1.3×10¹⁵ W) – world's most powerful laser as of 1998, located at the
Lawrence Livermore Laboratory
Lasers are used in medicine; there is even a special field of medicine, cold laser surgery.
Lasers are especially useful in surgery because they stop bleeding as they cut.
T: So, your questions?
T: Anything else?
T: Any more questions?
T: O.K. Now let's sum up the scientific content.
У: …
T: And now – the third group.
S1: A personal computer (PC) is any general-purpose computer whose size,
capabilities, and original sales price make it useful for individuals, and which is
intended to be operated directly by an end user, with no intervening computer operator.
This is in contrast to the batch processing or time-sharing models which allowed large
expensive mainframe systems to be used by many people, usually at the same time, or
large data processing systems which required a full-time staff to operate efficiently. A
personal computer may be a desktop computer, a laptop, tablet PC or a handheld PC
(also called palmtop). The most common microprocessors are x86-compatible CPUs,
ARM architecture CPUs and PowerPC CPUs. Software applications for personal
computers include word processing, spreadsheets, databases, Web browsers and e-mail
clients, games, and myriad personal productivity and special-purpose software.
Modern personal computers often have high-speed or dial-up connections to the
Internet, allowing access to the World Wide Web and a wide range of other resources.
A PC may be used at home, or may be found in an office. Personal computers can be
connected to a local area network (LAN) either by a cable or wirelessly.
S2: While early PC owners usually had to write their own programs to do anything
useful with the machines, today's users have access to a wide range of commercial and
non-commercial software which is provided in ready-to-run form. Since the 1980s,
Microsoft and Intel have dominated much of the personal computer market with the
Wintel platform. The capabilities of the personal computer have changed greatly since
the introduction of electronic computers. By the early 1970s, people in academic or
research institutions had the opportunity for single-person use of a computer system in
interactive mode for extended durations, although these systems would still have been
too expensive to be owned by a single person. The introduction of the microprocessor,
a single chip with all the circuitry that formerly occupied large cabinets, led to the
proliferation of personal computers after about 1975. In what was later to be called
The Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview
of what would become the staples of daily working life in the 21st century: e-mail,
hypertext, word processing, video conferencing, and the mouse. Early personal
computers, generally called microcomputers, were often sold in electronic kit form
and in limited volumes, and were of interest mostly to hobbyists and technicians.
Minimal programming was done by toggle switches, and output was provided by front
panel indicators. Practical use required peripherals such as keyboards, computer
terminals, disk drives, and printers. Micral N was the earliest commercial, non-kit
"personal" computer based on a microprocessor, the Intel 8008. It was built starting in
1972 and about 90,000 units were sold. Unlike other hobbyist computers of its day,
which were sold as electronics kits, in 1976 Steve Jobs and Steve Wozniak sold the
Apple I computer circuit board, which was fully prepared and contained about 30
chips. The first complete personal computer was the Commodore PET introduced in
January 1977. It was soon followed by the popular Apple II. Mass-market pre-
assembled computers allowed a wider range of people to use computers, focusing
more on software applications and less on development of the processor hardware.
Throughout the late 1970s and into the 1980s, computers were developed for
household use, offering personal productivity, programming and games. Somewhat
larger and more expensive systems (although still low-cost compared with
minicomputers and mainframes) were aimed for office and small business use.
Workstations are characterized by high-performance processors and graphics displays,
with large local disk storage, networking capability, and running under a multitasking
operating system. Workstations are still used for tasks such as computer-aided design,
drafting and modelling, computation-intensive scientific and engineering calculations,
image processing, architectural modelling, and computer graphics for animation and
motion picture visual effects. Eventually, owing to the IBM PC's influence on the
personal computer market, personal computers and home computers lost any
technical distinction. Business computers acquired color graphics capability and
sound, and home computers and game systems users used the same processors and
operating systems as office workers. Mass-market computers had graphics capabilities
and memory comparable to dedicated workstations of a few years before. Even local
area networking, originally a way to allow business computers to share expensive mass
storage and peripherals, became a standard feature of the personal computers used at home.
S3: In 2001 125 million personal computers were shipped in comparison to 48
thousand in 1977. More than 500 million personal computers were in use in 2002 and
one billion personal computers had been sold worldwide since mid-1970s until this
time. Of the latter figure, 75 percent were professional or work related, while the rest
sold for personal or home use. About 81.5 percent of personal computers shipped had
been desktop computers, 16.4 percent laptops and 2.1 percent servers. The United States
had received 38.8 percent (394 million) of the computers shipped; Europe 25 percent;
and 11.7 percent had gone to the Asia-Pacific region, the fastest-growing market as of
2002. The second billion was expected to be sold by 2008. Almost half of all the
households in Western Europe had a personal computer and a computer could be
found in 40 percent of homes in United Kingdom, compared with only 13 percent in
1985. Global personal computer shipments were 264 million units in
2007, according to iSuppli, up 11.2 percent from 239 million in 2006. In 2004,
global shipments were 183 million units, an 11.6 percent increase over 2003. In 2003,
152.6 million computers were shipped, at an estimated value of $175 billion. In 2002,
136.7 million PCs were shipped, at an estimated value of $175 billion. In 2000, 140.2
million personal computers were shipped, at an estimated value of $226 billion.
Worldwide shipments of personal computers surpassed the 100-million mark in 1999,
growing to 113.5 million units from 93.3 million units in 1998. In 1999, Asia had 14.1
million units shipped. As of June 2008, the number of personal computers in use
worldwide hit one billion, while another billion is expected to be reached by 2014.
Mature markets like the United States, Western Europe and Japan accounted for 58
percent of the worldwide installed PCs. The emerging markets were expected to
double their installed PCs by 2013 and to take 70 percent of the second billion PCs.
About 180 million computers (16 percent of the existing installed base) were expected
to be replaced and 35 million to be dumped into landfill in 2008. The whole installed
base grew 12 percent annually. In the developed world, there has been a vendor
tradition to keep adding functions to maintain high prices of personal computers.
However, since the introduction of One Laptop per Child foundation and its low-cost
XO-1 laptop, the computing industry started to pursue the price too. Although
introduced only one year earlier, there were 14 million netbooks sold in 2008. Besides
the regular computer manufacturers, companies making especially rugged versions of
computers have sprung up, offering alternatives for people operating their machines in
extreme weather or environments.
S4: A workstation is a high-end personal computer designed for technical or scientific
applications. Intended primarily to be used by one person at a time, they are commonly
connected to a local area network and run multi-user operating systems.
Desktop computer. Prior to the widespread use of PCs, a computer that could fit on a
desk was considered remarkably small. Today the phrase usually indicates a particular
style of computer case. Desktop computers come in a variety of styles ranging from
large vertical tower cases to small form factor models that can be tucked behind an
LCD monitor. In this sense, the term 'desktop' refers specifically to a horizontally-
oriented case, usually intended to have the display screen placed on top to save space
on the desk top. Most modern desktop computers have separate screens and keyboards.
Single unit PCs (also known as all-in-one PCs) are a subtype of desktop computers,
which combine the monitor and case of the computer within a single unit. The monitor
almost always utilizes a touchscreen as an optional method of user input; however,
detached keyboards and mice are normally still included. The inner components of the
PC are often located directly behind the monitor.
A subtype of desktops, called nettops, was introduced by Intel in February 2008 to
describe low-cost, lean-function, desktop computers. A similar subtype of laptops (or
notebooks) are the netbooks. These feature the Intel Atom processor, which
enables them to consume less power and to be built into small enclosures.
A laptop computer or simply laptop, also called a notebook computer or sometimes a
notebook, is a small personal computer designed for portability. Usually all of the
interface hardware needed to operate the laptop, such as parallel and serial ports,
graphics card, sound channel, etc., are built in to a single unit. Laptops contain high
capacity batteries that can power the device for extensive periods of time, enhancing
portability. Once the battery charge is depleted, it will have to be recharged through a
power outlet. In the interest of saving power, weight and space, they usually share
RAM with the video channel, slowing their performance compared to an equivalent
desktop machine. One main drawback of the laptop is that, due to the size and
configuration of components, relatively little can be done to upgrade the overall
computer from its original design. Some devices can be attached externally through
ports (including via USB), however internal upgrades are not recommended or in some
cases impossible, making the desktop PC more modular.
A subtype of notebooks, called subnotebooks, are computers with most of the features
of a standard laptop computer but smaller. They are larger than hand-held computers,
and usually run full versions of desktop/laptop operating systems. Ultra-Mobile PCs
(UMPC) are usually considered subnotebooks, or more specifically, subnotebook
Tablet PCs (see below). Netbooks are sometimes considered in this category, though
they are sometimes separated in a category of their own (see below).
Desktop replacements, meanwhile, are large laptops meant to replace a desktop
computer while keeping the mobility of a laptop.
S5: Netbooks are small portable computers in a "clamshell" design, that are designed
specifically for wireless communication and access to the Internet. They are generally
much lighter and cheaper than subnotebooks, and have a smaller display, between 7"
and 9", with a screen resolution between 800x600 and 1024x768 but newer models
feature higher resolution at up to 1280x768 like the Gigabyte M912X netbook. The
operating systems and applications on them are usually specially modified so they can
be used comfortably on a smaller screen. At first the OS was typically Linux,
although most netbooks now run an NT-based version of Windows, such as Windows XP
or Windows Vista (for example, Sony's Tablet Netbook). Netbooks have built-in wireless
connectivity (Wi-Fi). Some even have smaller but faster solid-state storage systems instead of
mechanical hard disks. Storage capacities were usually in the 4 to 16 GB range for
solid-state flash drives, but have increased greatly in models with mechanical drives,
reaching up to 160 GB in units such as the Gigabyte M912X and the MSI Wind U100, which
have mechanical hard drives instead of solid-state flash drives. One of the first
examples of such a system was the original Eee PC.
The emergence of new market segment of small, energy-efficient and low-cost devices
designed for access to the Internet (netbooks and nettops) could threaten established
companies like Microsoft, Intel, HP or Dell, analysts said in July 2008. A market
research firm International Data Corporation predicted that the category could grow
from fewer than 500,000 in 2007 to 9 million in 2012 as the market for low cost and
secondhand computers expands in developed economies. Also, after Microsoft ceased
selling consumer versions of Windows XP, it made an exception and continued to
offer the operating system for netbook and nettop makers. A tablet PC is a notebook or
slate-shaped mobile computer, first introduced by Pen computing in the early 90s with
their PenGo Tablet Computer and popularized by Microsoft. Its touchscreen or
graphics tablet/screen hybrid technology allows the user to operate the computer with
a stylus or digital pen, or a fingertip, instead of a keyboard or mouse. The form factor
offers a more mobile way to interact with a computer. Tablet PCs are often used where
normal notebooks are impractical or unwieldy, or do not provide the needed
functionality. As technology and functionality continue to progress, prototype tablet
PC's will continue to emerge. The Microsoft Courier, a personal business device, has
two 7" monitors that support multi-touch gestures, Wi-Fi capabilities and has a built-in
camera. The device looks to be a replacement for traditional planners while offering
what most digital planners cannot: two pages and large writing spaces.
The ultra-mobile PC (UMPC) is a specification for a small form factor of tablet PCs. It
was developed as a joint development exercise by Microsoft, Intel, and Samsung,
among others. Current UMPCs typically feature the Windows XP, Windows Vista,
Windows 7, or Linux operating system and low-voltage Intel Atom or VIA C7-M processors.
A home theater PC (HTPC) is a convergence device that combines the functions of a
personal computer and a digital video recorder. It is connected to a television or a
television-sized computer display and is often used as a digital photo, music, video
player, TV receiver and digital video recorder. Home theater PCs are also referred to
as media center systems or media servers. The general goal in a HTPC is usually to
combine many or all components of a home theater setup into one box. They can be
purchased pre-configured with the required hardware and software needed to add
television programming to the PC, or can be cobbled together out of discrete
components as is commonly done with MythTV, Windows Media Center, GB-PVR,
SageTV, Famulent or LinuxMCE.
A pocket PC is a hardware specification for a handheld-sized computer (personal
digital assistant) that runs the Microsoft Windows Mobile operating system. It may
have the capability to run an alternative operating system like NetBSD or Linux. It has
many of the capabilities of modern desktop PCs.
Currently there are tens of thousands of applications for handhelds adhering to the
Microsoft Pocket PC specification, many of which are freeware. Some of these devices
also include mobile phone features. Microsoft-compliant Pocket PCs can also be used
with many other add-ons like GPS receivers, barcode readers, RFID readers, and
cameras. In 2007, with the release of Windows Mobile 6, Microsoft dropped the name
Pocket PC in favor of a new naming scheme. Devices without an integrated phone are
called Windows Mobile Classic instead of Pocket PC. Devices with an integrated
phone and a touch screen are called Windows Mobile Professional.
T: So, your questions?
T: Anything else?
T: Any more questions?
T: O.K. Now let's sum up the scientific content.
S: …
T: And now, the fourth group.
S1: Deoxyribonuc`leic acid (DNA) is one of the two types of nuc`leic acids, providing
storage, transmission from one generation to another, and implementation of the
genetic program of development and functioning of living organisms. From a chemical
point of view, DNA is a long `polymer molecule consisting of a series of repeating
units called nucleotides. The nitrogenous bases of one chain are connected with the
bases of the other chain by hydrogen bonds according to the principle of
complementarity: adenine is connected only with thymine, and guanine is connected
only with cytosine.
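(The pairing rule the speaker states, adenine with thymine and guanine with cytosine, can be sketched as a small illustration. This Python snippet is not part of the lesson materials; it simply shows how the rule determines the complementary chain of any given strand.)

```python
# The complementarity rule from the presentation: A pairs with T, G pairs with C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the complementary chain of a DNA strand.

    The strand is read in reverse because the two chains of the
    double helix run in opposite (antiparallel) directions.
    """
    return "".join(PAIR[base] for base in reversed(strand))

print(complement("ATGC"))  # GCAT
```

Applying the function twice returns the original strand, which mirrors the point made later in the lesson: the information in one chain fully determines the other.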
S2: Deoxyribonucleic acid (DNA) is a bio`polymer (a polyanion) whose `monomer is the
`nucleotide. Each `nucleotide consists of a phosphoric acid `residue attached to the
sugar `deoxy`ribose, which in turn carries one of the four `nitrogenous bases. The
presence of this specific type of sugar is one of the major differences between DNA
and RNA, recorded in the names of these nucleic acids (the R in RNA stands for the
sugar called ribose). The DNA polymer has a rather complex structure. Nucleotides are
joined together covalently into long polynucleotide chains. In most cases (except for
some viruses with single-stranded DNA genomes) these chains are, in turn, combined in
pairs through hydrogen bonds into the structure known as the double helix.
S3: Each base on one of the chains is paired with one definite base of the second
chain. Such specific binding is called complementary. The complementarity of the
double helix means that the information contained in one of the chains is contained in
the other chain as well. The reversibility and specificity of interactions between
complementary base pairs are important for the replication of DNA and all its other
functions in living organisms. Since hydrogen bonds are non-covalent, they are broken
and restored easily. The chains of the double helix can separate like a zipper when
they are acted on by `enzymes or high temperature.
S4: DNA can be damaged by various mutagens, which include oxidizing and alkylating
substances, as well as high-energy electromagnetic radiation such as ultraviolet and
X-rays. The type of DNA damage depends on the type of mutagen. For example, UV light
damages DNA by producing thymine dimers in it, which arise when covalent bonds form
between neighbouring bases. Oxidants, such as free radicals or hydrogen peroxide,
lead to several types of DNA damage, including modified bases, particularly
guanosine, and double-strand breaks in the DNA. According to some estimates, about
500 bases in each human cell are damaged daily by different oxidizing compounds. The
most dangerous among the different types of damage are double-strand breaks, because
they are difficult to repair and may lead to a loss of parts of chromosomes
(deletions) and to translocations.
S5: Recombination allows chromosomes to exchange genetic information, which leads to
the formation of new combinations of genes; this increases the efficiency of natural
selection and is important for the rapid evolution of new proteins. Genetic
recombination also plays a role in repair, particularly in the cell's response to
breaks in both strands of the DNA.
The most common form of crossing over is homologous recombination, when the
chromosomes participating in the recombination have very similar sequences.
Sometimes transposons act as the sites of homology. DNA contains the genetic
information that makes the life, growth, development and reproduction of all modern
organisms possible. But it is unknown for how much of the four billion years of the
history of life on Earth DNA has been the main carrier of genetic information.
T: So, your questions?
T: Anything else?
T: Any more questions?
T: Well, thanks for everyone. Is there anyone who would like to add something?
S: …
T: Thanks. Are there any questions you would like to ask us or your classmates?
T: And now, please, look at the quotation on the screen:
Creativity in science could be described as the act of putting two and two together to
make five. How would you translate that?
S: (translates into Russian) Creativity in science can be described as the act of
adding two and two together and getting five as the result.
T: (in Russian) How do YOU understand it?
S: … (discussion)
T: So, what did we study today?
T: Well. Did you like the lesson?
S: Yes
T: Why?
S: …
T: Which speech did you like most of all/ least of all? Why?
S: …
T: (in Russian) Which information, in your opinion, was the most interesting? The most useful?
S: …
T: I liked the lesson too. And I liked the way you worked today. I'm pleased with
(names) - thank you very much for your active participation and hard work; but a bit
disappointed with (names) – I hope to see you working more actively and creatively
next time.
T: (in Russian) Please look at this chain of the luminaries of science: Galileo,
whose investigations allowed Newton, by generalising the data obtained, to create
fundamental physics; and Einstein, who radically changed that fundamental physics.
At home, please answer one of the following three questions of your choice:
1. What is the contribution of each of these scientists to world science?
2. Where is my place in this chain?
3. What will the science of the future be like?
T: Stand up, please. The lesson is over. Goodbye.