The yield enables a function to come back to where it left off when it is called
again. This is the critical difference from a regular function, which cannot come
back to where it left off. The yield keyword helps a function to remember its
state.
Let's look at the following sample code, which has three yields and is iterated
over three times; each time, execution resumes at the line after the previous
yield rather than at the beginning of the function body:
def foo_with_yield():
    yield 1
    yield 2
    yield 3

# iterative calls
for yield_value in foo_with_yield():
    print(yield_value, end=' ')
Output
1 2 3
Simply put, the yield enables a function to suspend and resume, while handing
back a value at the point where execution is suspended.
In the previous example, what the call actually returns is a generator object. We
can see it from a modified version of the code:
def foo_with_yield():
    yield 1
    yield 2
    yield 3

x = foo_with_yield()
print(x)
print(next(x))
print(x)
print(next(x))
print(x)
print(next(x))
Output:
<generator object foo_with_yield at 0x7f6e4f0f1e60>
1
<generator object foo_with_yield at 0x7f6e4f0f1e60>
2
<generator object foo_with_yield at 0x7f6e4f0f1e60>
3
The next() function takes a generator object and returns its next value.
Repeatedly calling next() with the same generator object resumes exactly where
it left off and continues until it hits the next yield statement. All variables and local
state are saved on yield and restored on next().
Generators are closely tied to the iteration protocol. Iterator objects define
a __next__() method which either returns the next item in the iteration or raises
the special StopIteration exception to end the iteration. An object's iterator is
fetched with the iter built-in function.
For loops use this iteration protocol to step through a sequence or value
generator if the protocol is supported. Otherwise, iteration falls back on
repeatedly indexing sequences.
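We can drive this protocol by hand to see it in action; a minimal sketch (the list and its values here are just for illustration):

```python
# Fetch the list's iterator with the iter() built-in...
nums = [10, 20]
it = iter(nums)

# ...then step through it: next(it) calls it.__next__() under the hood.
print(next(it))       # 10
print(it.__next__())  # 20

# One more call raises StopIteration to signal the end of the iteration.
try:
    next(it)
except StopIteration:
    print('iteration ended')
```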
To support this protocol, functions containing a yield statement are compiled
specially as generators: they return a generator object when they are called. The
returned object supports the iteration interface with an automatically
created __next__() method to resume execution. In a generator function, a
return statement simply terminates the generation of values by raising
a StopIteration exception, as does any normal function exit.
The net effect is that generator functions, coded as def statements
containing yield statements, are automatically made to support the iteration
protocol and thus may be used in any iteration context to produce results over
time and on demand.
In short, a generator looks like a function but behaves like an iterator.
Note
For more information on generators, please visit Python generators.
Q: We have the following code with an unknown function f(). In f(), we do not
want to use return; instead, we may want to use a generator.
for x in f(5):
    print(x, end=' ')
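One possible generator-based f() — assuming the intended output is the numbers 0 through 4, since the expected output is not specified here:

```python
def f(n):
    # Yield each value in turn instead of returning a whole list.
    i = 0
    while i < n:
        yield i
        i += 1

for x in f(5):
    print(x, end=' ')  # 0 1 2 3 4
```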
http://www.bogotobogo.com/python/python_generators.php
Generators
In computer science, a generator is a special routine that can be used to control
the iteration behavior of a loop.
A generator is very similar to a function that returns an array, in that a generator
has parameters, can be called, and generates a sequence of values. However,
instead of building an array containing all the values and returning them all at
once, a generator yields the values one at a time, which requires less memory
and allows the caller to get started processing the first few values immediately. In
short, a generator looks like a function but behaves like an iterator.
The state that generator functions retain when they are suspended includes
their local scope, so their local variables retain information and make it available
when the functions are resumed.
The primary difference between generator and normal functions is that a
generator yields a value rather than returns one. The yield suspends the
function and sends a value back to the caller, while retaining enough state to
enable the function to resume immediately after the last yield. This allows the
generator function to produce a series of values over time rather than computing
them all at once and sending them back in a list.
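A small side-by-side sketch may make the difference concrete (the function names squares_all() and squares_lazy() are illustrative, not from the original text):

```python
# return builds and hands back the whole list at once...
def squares_all(n):
    result = []
    for i in range(n):
        result.append(i * i)
    return result

# ...whereas yield hands back one value per resumption.
def squares_lazy(n):
    for i in range(n):
        yield i * i

print(squares_all(4))         # [0, 1, 4, 9]
print(list(squares_lazy(4)))  # [0, 1, 4, 9]
```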
Let's look at the interactive example below:
>>> def create_counter(n):
        print('create_counter()')
        while True:
            yield n
            print('increment n')
            n += 1
>>> c = create_counter(2)
>>> c
<generator object create_counter at 0x03004B48>
>>> next(c)
create_counter()
2
>>> next(c)
increment n
3
>>> next(c)
increment n
4
>>>
1. The next() function takes a generator object and returns its next value. The
first time we call next() with the counter generator, it executes the code
in create_counter() up to the first yield statement, then returns the value
that was yielded. In this case, that will be 2, because we originally created
the generator by calling create_counter(2).
2. Repeatedly calling next() with the same generator object resumes exactly
where it left off and continues until it hits the next yield statement. All
variables, local state, etc. are saved on yield and restored on next(). The
next line of code waiting to be executed calls print(), which
prints increment n. After that, the statement n += 1 executes. Then it loops
through the while loop again, and the first thing it hits is the statement yield n,
which saves the state of everything and returns the current value
of n (now 3).
3. The third time we call next(c), we do all the same things again, but this
time n is 4.
4. Since create_counter() sets up an infinite loop, we could theoretically do
this forever, and it would just keep incrementing n and spitting out values.
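Because create_counter() never terminates on its own, taking a bounded number of values is one safe way to consume such a generator; a sketch using itertools.islice with a simplified counter (the prints are omitted here):

```python
from itertools import islice

def counter(n):
    # An infinite generator: yields n, n + 1, n + 2, ... forever.
    while True:
        yield n
        n += 1

# islice() stops after five values, so the infinite loop is never a problem.
print(list(islice(counter(2), 5)))  # [2, 3, 4, 5, 6]
```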
The generator function in the following example generates the cubes of numbers
over time:
>>> def cubic_generator(n):
        for i in range(n):
            yield i ** 3
>>>
The function yields a value, and so returns to its caller, each time through the
loop. When it is resumed, its prior state is restored and control picks up again
after the yield statement. When it's used in a for loop, control returns to the
function after its yield statement each time through the loop:
>>> for i in cubic_generator(5):
        print(i, end=' : ')    # Python 3.x (Python 2.x: print i,)
0 : 1 : 8 : 27 : 64 :
>>>
>>> def fibonacci(Limit=10):
        a, b = 0, 1
        count = 0
        while True:
            yield a
            a, b = b, a + b
            if (count == Limit):
                break
            count += 1
>>>
>>> for n in fibonacci():
        print(n, end=' ')
0 1 1 2 3 5 8 13 21 34 55
>>>
As we can see from the output, we can use a generator like fibonacci() in a for
loop directly. The for loop automatically calls the next() function to get values
from the fibonacci() generator and assigns them to the for loop index variable (n).
Each time through the for loop, n gets a new value from the yield statement
in fibonacci(), and all we have to do is print it out. Once fibonacci() runs out of
numbers (its count reaches the limit), the for loop exits gracefully.
This is a useful idiom: pass a generator to the list() function, and it will iterate
through the entire generator (just like the for loop in the previous example) and
return a list of all the values.
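For example, with a hypothetical countdown() generator:

```python
def countdown(n):
    # Yields n, n - 1, ..., 1 one value at a time.
    while n > 0:
        yield n
        n -= 1

# list() drains the generator, collecting every yielded value.
print(list(countdown(5)))  # [5, 4, 3, 2, 1]
```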
To end the generation of values, functions use either a return with no value or
simply allow control to fall off the end of the function body.
To see what's happening inside the for loop, we can call the generator function
directly:
>>> x = cubic_generator(5)
>>> x
<generator object cubic_generator at 0x000000000315F678>
>>>
>>> for i in x:
        print(i, end=' : ')
0 : 1 : 8 : 27 : 64 :
>>>
Or:
>>>
>>> for x in [n ** 3 for n in range(5)]:
        print(x, end=' : ')
0 : 1 : 8 : 27 : 64 :
>>>
>>> for x in map((lambda n: n ** 3), range(5)):
        print(x, end=' : ')
0 : 1 : 8 : 27 : 64 :
>>>
As we've seen, we could have had the same result using other approaches.
However, generators can be better in terms of memory usage and performance.
They allow functions to avoid doing all the work up front, which is especially
useful when the resulting lists are huge or when it takes a lot of computation to
produce each value. Generators distribute the time required to produce the
series of values among loop iterations.
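One way to see the memory difference is to compare the size of a fully built list with that of a generator expression (exact sizes vary across Python versions and platforms):

```python
import sys

# The list comprehension materializes one million values up front...
cubes_list = [n ** 3 for n in range(1_000_000)]

# ...while the generator expression stores only its current state.
cubes_gen = (n ** 3 for n in range(1_000_000))

print(sys.getsizeof(cubes_list))  # several megabytes
print(sys.getsizeof(cubes_gen))   # a couple of hundred bytes
```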
As a more advanced usage example, generators can provide a simpler
alternative to manually saving the state between iterations in class objects. With
generators, variables accessible in the function's scope are saved and restored
automatically.
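A sketch of the two styles side by side (the Cubes class and cubes() function here are illustrative):

```python
# A class-based iterator must save its position (self.i) by hand...
class Cubes:
    def __init__(self, n):
        self.n = n
        self.i = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self.i >= self.n:
            raise StopIteration
        value = self.i ** 3
        self.i += 1
        return value

# ...while the equivalent generator saves and restores its locals automatically.
def cubes(n):
    for i in range(n):
        yield i ** 3

print(list(Cubes(5)))  # [0, 1, 8, 27, 64]
print(list(cubes(5)))  # [0, 1, 8, 27, 64]
```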
Typically, we don't see the next iterator machinery under the hood of a generator
expression like this, because for loops trigger it for us automatically:
>>> for n in (x ** 3 for x in range(5)):
        print('%s, %s' % (n, n * n))
0, 0
1, 1
8, 64
27, 729
64, 4096
>>>
In the above example, the parentheses are not required around the generator
expression when it is the sole item enclosed in other parentheses. However,
there are cases where extra parentheses are required, as in the examples below:
>>>
>>> sum(x ** 3 for x in range(5))
100
>>>
>>> sorted(x ** 3 for x in range(5))
[0, 1, 8, 27, 64]
>>>
>>> sorted((x ** 3 for x in range(5)), reverse=True)
[64, 27, 8, 1, 0]
>>>
>>> import math
>>> list( map(math.sqrt, (x ** 3 for x in range(5))) )
[0.0, 1.0, 2.8284271247461903, 5.196152422706632, 8.0]
>>>
The equivalent generator function requires a little bit more code, but as a
multi-statement function it can contain more logic and use more state
information if needed:
>>> def repeat5times(x):
        for c in x:
            yield c * 5
>>> G = repeat5times('Python')
>>> list(G)
['PPPPP', 'yyyyy', 'ttttt', 'hhhhh', 'ooooo', 'nnnnn']
>>>
Note that we must make a new generator here to iterate again. Generators are one-shot iterators.
Both generator functions and generator expressions are their own iterators, so
they support just one active iteration: we can't have multiple independent
iterators over the same results. As the following generator expression shows, a
generator's iterator is the generator itself:
>>> G = (c * 5 for c in 'Python')
>>> # My iterator is myself: G has __next__() method
>>> iter(G) is G
True
>>>
If we iterate over the results stream manually with multiple iterators, they will all
point to the same position:
>>> G = (c * 5 for c in 'Python')
>>> # Iterate manually
>>> I1 = iter(G)
>>> next(I1)
'PPPPP'
>>> next(I1)
'yyyyy'
>>> I2 = iter(G)
>>> next(I2)
'ttttt'
>>>
Once any iteration runs to completion, all are exhausted. We have to make a
new generator to start again:
>>> # Collect the rest of I1's items
>>> list(I1)
['hhhhh', 'ooooo', 'nnnnn']
>>>
>>> # Other iterators are exhausted too
>>> next(I2)
Traceback (most recent call last):
  ...
StopIteration
>>>
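If we really need multiple independent iterators over one generator's output, itertools.tee offers one way to split the stream (note that tee buffers the values one iterator has seen but the other has not):

```python
from itertools import tee

G = (c * 5 for c in 'Python')
I1, I2 = tee(G)  # two independent iterators over the same stream

print(next(I1))  # 'PPPPP'
print(next(I1))  # 'yyyyy'
print(next(I2))  # 'PPPPP' -- I2 starts from the beginning, unlike iter(G)
```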
This is different from the behavior of some built-in types, which support multiple
iterators and passes, and reflect their in-place changes in active iterators:
>>>
>>> L = [1, 2, 3, 4]
>>> I1, I2 = iter(L), iter(L)
>>> next(I1)
1
>>> next(I1)
2
>>> # Lists support multiple iterators
>>> next(I2)
1
>>> # Changes reflected in iterators
>>> del L[2:]
>>> next(I1)
Traceback (most recent call last):
File "<pyshell#21>", line 1, in <module>
next(I1)
StopIteration
>>>