fluentpython / example-code-2e
Example code for Fluent Python, 2nd edition (O'Reilly 2022)
Home Page: https://amzn.to/3J48u2J
License: MIT License
I'm trying to run Example A-2 (page 689) from the first edition.
I'm not sure whether I'm doing something wrong, but I've read the code ten times and I'm sure it is the same as in the book.
I'm running it as:
$ python example_a_2__generate_array.py
My Python version is 3.8.5. Then I see this error:
initial sample: 10500000 elements
Traceback (most recent call last):
  File "example_a_2__generate_array.py", line 19, in <module>
    with len(sample) < SAMPLE_LEN:
AttributeError: __enter__
Some reasoning:
I'm not reassigning the open function (as suggested here for one possible cause).
Possible solution:
Maybe line 19 should be replaced by:
if len(sample) < SAMPLE_LEN:
I tried that and got good-looking output:
initial sample: 10500000 elements
complete sample: 10500000 elements
not selected: 500000 samples
writing not_selected.arr
selected: 10000000 samples
writing selected.arr
And they seem to have been created correctly:
$ ls -la *selected*
-rw-rw-r-- 1 lucas lucas 4000000 mai 13 14:57 not_selected.arr
-rw-rw-r-- 1 lucas lucas 80000000 mai 13 14:57 selected.arr
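The AttributeError in the report above can be reproduced in isolation: `with` requires a context manager and immediately looks up `__enter__` on its target, which the bool result of a comparison does not have (newer Python versions raise TypeError here instead of AttributeError):

```python
# Why line 19 blew up: `with` calls __enter__ on its target,
# and a bool comparison result has no __enter__.
# (Python 3.11+ raises TypeError instead of AttributeError.)
sample = []
SAMPLE_LEN = 10
try:
    with len(sample) < SAMPLE_LEN:  # bool is not a context manager
        pass
except (AttributeError, TypeError) as e:
    caught = type(e).__name__
print(caught)
```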
P.S.: Thank you for the book and for sharing your knowledge!
How long should I wait for the 2nd edition? It's been more than a year already.
Hi,
You probably already know this, but if there is going to be a 3rd edition, Examples 19-12 and 19-13 (and other examples in Chapter 19) need a rework.
The examples mentioned actually fail to prove the point that multiprocessing is good for CPU-bound work, since the prime-number example finishes in less than a second: perf_counter() shows a total of 0.00, and 19-13 actually takes longer due to multiprocessing overhead.
Below is the result of running them with perf_counter() on Python 3.12.2, on a not-very-new Xeon CPU:
python sequential.py
Checking 20 numbers sequentially:
2 P 0.000001s
142702110479723 P 0.000002s
299593572317531 P 0.000001s
3333333333333301 P 0.000001s
3333333333333333 P 0.000001s
3333335652092209 P 0.000001s
4444444444444423 P 0.000001s
4444444444444444 0.000001s
4444444488888889 P 0.000001s
5555553133149889 P 0.000001s
5555555555555503 P 0.000001s
5555555555555555 P 0.000001s
6666666666666666 0.000001s
6666666666666719 P 0.000001s
6666667141414921 P 0.000001s
7777777536340681 P 0.000001s
7777777777777753 P 0.000001s
7777777777777777 P 0.000001s
9999999999999917 P 0.000001s
9999999999999999 P 0.000001s
Total time: 0.00s
python procs.py
Checking 20 numbers with 16 processes:
2 P 0.000035s
142702110479723 P 0.000003s
299593572317531 P 0.000002s
3333333333333301 P 0.000001s
3333333333333333 P 0.000001s
3333335652092209 P 0.000001s
4444444444444423 P 0.000001s
4444444444444444 0.000001s
4444444488888889 P 0.000001s
5555555555555503 P 0.000001s
5555555555555555 P 0.000001s
6666666666666666 0.000001s
6666666666666719 P 0.000001s
6666667141414921 P 0.000001s
7777777536340681 P 0.000001s
7777777777777753 P 0.000001s
7777777777777777 P 0.000001s
9999999999999917 P 0.000001s
9999999999999999 P 0.000001s
5555553133149889 P 0.000033s
20 checks in 0.02s
And this is the result using perf_counter_ns():
python sequential.py
Checking 20 numbers sequentially:
2 P 487.000000ns
142702110479723 P 1047.000000ns
299593572317531 P 500.000000ns
3333333333333301 P 320.000000ns
3333333333333333 P 295.000000ns
3333335652092209 P 279.000000ns
4444444444444423 P 214.000000ns
4444444444444444 296.000000ns
4444444488888889 P 215.000000ns
5555553133149889 P 200.000000ns
5555555555555503 P 222.000000ns
5555555555555555 P 247.000000ns
6666666666666666 290.000000ns
6666666666666719 P 196.000000ns
6666667141414921 P 227.000000ns
7777777536340681 P 202.000000ns
7777777777777753 P 202.000000ns
7777777777777777 P 202.000000ns
9999999999999917 P 202.000000ns
9999999999999999 P 202.000000ns
Total time: 118353.00ns
python procs.py
Checking 20 numbers with 16 processes:
2 P 13804.000000ns
142702110479723 P 2860.000000ns
299593572317531 P 1762.000000ns
3333333333333301 P 1324.000000ns
3333333333333333 P 1270.000000ns
3333335652092209 P 1073.000000ns
4444444444444423 P 794.000000ns
4444444444444444 1200.000000ns
4444444488888889 P 810.000000ns
5555553133149889 P 890.000000ns
5555555555555503 P 753.000000ns
5555555555555555 P 995.000000ns
6666666666666666 925.000000ns
6666666666666719 P 813.000000ns
6666667141414921 P 678.000000ns
7777777536340681 P 763.000000ns
7777777777777753 P 703.000000ns
9999999999999917 P 941.000000ns
9999999999999999 P 740.000000ns
7777777777777777 P 26330.000000ns
20 checks in 18213477.00ns
Hi Luciano,
I am playing a bit with the spinner_async.py example, and I am wondering why the execution finishes normally when I comment out the spinner.cancel() line in the supervisor function. I think the execution should stay in an infinite loop, because I am not cancelling the coroutine and therefore the asyncio.CancelledError exception is never raised.
async def supervisor() -> int:  # <3>
    spinner = asyncio.create_task(spin('thinking!'))  # <4>
    print(f'spinner object: {spinner}')  # <5>
    result = await slow()  # <6>
    # spinner.cancel()  # <------ this is the only change  # <7>
    return result
Am I forgetting something? Any comments to help me understand this scenario are welcome.
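One likely explanation, sketched here rather than asserted: asyncio.run() cancels tasks that are still pending when the main coroutine returns, so the spinner gets cancelled at shutdown even without an explicit spinner.cancel(). A minimal demonstration with a hypothetical `pending` task (not the book's spinner):

```python
import asyncio

events = []

async def pending():
    # loops forever unless cancelled, like the spinner
    try:
        while True:
            await asyncio.sleep(0.1)
    except asyncio.CancelledError:
        events.append('cancelled at shutdown')
        raise

async def main():
    asyncio.create_task(pending())
    await asyncio.sleep(0.3)
    # main() returns here with the task still running

asyncio.run(main())  # cancels the leftover task on the way out
print(events)        # ['cancelled at shutdown']
```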
In the 1st edition you mention that you use XOR to mix hashes for Vector2d, since the Python documentation recommended that approach at the time. This is no longer the case; the __hash__ documentation now states:
"It is advised to mix together the hash values of the components of the object that also play a part in comparison of objects by packing them into a tuple and hashing the tuple."
So the following code snippet should be changed from:
def __hash__(self):
    return hash(self.x) ^ hash(self.y)

To:

def __hash__(self):
    return hash((self.x, self.y))
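A quick sketch of why the tuple form is preferred: XOR is symmetric, so vectors with swapped components collide, while the tuple hash is order-sensitive (values shown hold for these inputs; tuple hashes vary across runs and versions):

```python
# XOR mixing collapses (x, y) and (y, x) into the same hash;
# hashing a tuple keeps them distinct.
x, y = 3.0, 4.0
print((hash(x) ^ hash(y)) == (hash(y) ^ hash(x)))  # True: guaranteed collision
print(hash((x, y)) == hash((y, x)))                # False for these values
```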
In the annotations, <6> reads: "Each iteration in this loop creates a new instance of averager; each is a generator object operating as a coroutine." And <7>: "Whenever grouper is sent a value, it's piped ..., the value it returns is bound to results[key]. The while loop then proceeds to create another averager instance to consume more values."
Could I replace while True with a bare yield? That seems easier to understand.
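For context, the classic averager can be sketched as follows (a minimal reconstruction, not the book's exact listing):

```python
# A minimal averager coroutine: `while True` keeps it alive
# to consume any number of values.
def averager():
    total = 0.0
    count = 0
    average = None
    while True:
        term = yield average
        total += term
        count += 1
        average = total / count

avg = averager()
next(avg)            # prime the coroutine
print(avg.send(10))  # 10.0
print(avg.send(30))  # 20.0
```

Replacing `while True` with a single bare `yield` would end the generator after one value: the next send() would raise StopIteration instead of updating the average.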
First, let me thank you very much for this great book! Fluent Python is very pedagogical, informative and quite a pleasure to read.
So, I have been quite surprised to encounter a bug when running flags2_asyncio.py, in the error accounting in the supervisor function. After a first error happens, all downloads are counted as erroneous, even when they are in fact successful. This is because the "error" variable is never reset after entering the for loop.
This can be fixed simply by adding "else: error = None" at the end of the try/except (parallel to what you do in flags2_threadpool.py and flags2_asyncio_executor.py).
This bug is also present in flags3_asyncio.py.
(I am using Python 3.10.)
A big thank you again!
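The accounting bug described in the report above can be reproduced with a synchronous analogue (`count_ok` is a hypothetical helper; the real fix touches the supervisor coroutine in flags2_asyncio.py):

```python
# Without resetting `error` in an `else` clause, the value left
# over from one failed iteration taints every later success.
def count_ok(results, reset_error=True):
    ok = 0
    error = None
    for r in results:
        try:
            if r == 'bad':
                raise ValueError(r)
        except ValueError as exc:
            error = exc
        else:
            if reset_error:
                error = None  # the fix: clear any previous error
        if error is None:
            ok += 1
    return ok

results = ['good', 'bad', 'good', 'good']
print(count_ok(results, reset_error=False))  # buggy: 1
print(count_ok(results, reset_error=True))   # fixed: 3
```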
Hello,
Can someone please help me understand what I am doing wrong?
Example 8-17 shows this expected output:
But this is what I got instead:
I also tried it in a console session (using IPython v7.29.0):
Could this be related to different Python versions (mine is 3.8.10)?
Sorry if this is outdated.
Hope to hear from you,
Kind regards
Hi @ramalho,
I wanted to know what stage the book is at. Is it finished and revised? I ask because the early-access badge is gone from the book's page on O'Reilly's website.
Hello,
I ran the code in https://github.com/fluentpython/example-code-2e/tree/master/24-class-metaprog/checked/metaclass/checked_demo.py
#!/usr/bin/env python3

# tag::MOVIE_DEMO[]
from checkedlib import Checked

class Movie(Checked):
    title: str
    year: int
    box_office: float

if __name__ == '__main__':
    movie = Movie(title='The Godfather', year=1972, box_office=137)
    print(movie)
    print(movie.title)
    # end::MOVIE_DEMO[]

    try:
        # remove the "type: ignore" comment to see Mypy error
        movie.year = 'MCMLXXII'  # type: ignore
    except TypeError as e:
        print(e)

    try:
        blockbuster = Movie(title='Avatar', year=2009, box_office='billions')
    except TypeError as e:
        print(e)
There is a TypeError:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Input In [2], in <cell line: 6>()
1 #!/usr/bin/env python3
2
3 # tag::MOVIE_DEMO[]
4 from checkedlib import Checked
----> 6 class Movie(Checked):
7 title: str
8 year: int
File D:\books/python/0. Fluent Python, 2nd Edition/example-code-2e/24-class-metaprog/checked/metaclass\checkedlib.py:107, in CheckedMeta.__new__(meta_cls, cls_name, bases, cls_dict)
105 type_hints = cls_dict.get('__annotations__', {}) # <3>
106 for name, constructor in type_hints.items(): # <4>
--> 107 field = Field(name, constructor) # <5>
108 cls_dict[name] = field # <6>
109 slots.append(field.storage_name) # <7>
File D:\books/python/0. Fluent Python, 2nd Edition/example-code-2e/24-class-metaprog/checked/metaclass\checkedlib.py:76, in Field.__init__(self, name, constructor)
74 def __init__(self, name: str, constructor: Callable) -> None:
75 if not callable(constructor) or constructor is type(None):
---> 76 raise TypeError(f'{name!r} type hint must be callable')
77 self.name = name
78 self.storage_name = '_' + name # <1>
TypeError: 'title' type hint must be callable
My python version is Python 3.8.13.
Thanks
I deployed the local server according to "Setting Up Test Servers" and found two small suspected errors, both in slow_server.py.
The first one concerns the parser in the main function (line 59). The socket module requires the bind parameter to be str, bytes, or bytearray, so I added a default value as in the other parts. (This problem may be linked to the Python environment.)
The second one is in the do_GET method of SlowHTTPRequestHandler (line 41). I got an AttributeError the first time I ran the ERROR option, telling me that HTTPStatus has no attribute IM_A_TEAPOT. Maybe we could just use the number 418 (per the book) instead?
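If I recall correctly, HTTPStatus.IM_A_TEAPOT was only added to the standard library in Python 3.9, which would explain the AttributeError on older interpreters. A hedged sketch of a version-tolerant fallback:

```python
# Fall back to the bare integer 418 when the enum member is missing
# (IM_A_TEAPOT appears to be a Python 3.9+ addition to http.HTTPStatus).
from http import HTTPStatus

status = getattr(HTTPStatus, 'IM_A_TEAPOT', 418)
print(int(status))  # 418 either way
```

Because HTTPStatus is an IntEnum, the integer and the enum member behave the same when passed to send_error().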
Consider the example in 24-class-metaprog/factories.py (corresponding to Example 24-2, page 912 in my copy):
Dog = record_factory('Dog', 'name weight owner')
rex = Dog('Rex', 30, 'Bob')
We do not get static type checking on, e.g., rex.name
Is there an easy fix, or is this because types in metaclasses / metaprogramming are hard?
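One workaround, offered tentatively: static checkers cannot see attributes that record_factory synthesizes at runtime, but a typing.NamedTuple with the same fields declares them statically, so Mypy can check rex.name:

```python
# A statically declared equivalent of record_factory('Dog', ...):
# the fields exist at class-definition time, so type checkers see them.
from typing import NamedTuple

class Dog(NamedTuple):
    name: str
    weight: int
    owner: str

rex = Dog('Rex', 30, 'Bob')
print(rex.name)  # Mypy knows this is a str
```

The trade-off is losing the runtime flexibility that makes record_factory interesting in the first place.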
When downloading the flags in
async def get_flag(client: AsyncClient, cc: str) -> bytes:  # <4>
    url = f'{BASE_URL}/{cc}/{cc}.gif'.lower()
    resp = await client.get(url, timeout=6.1,
                            follow_redirects=True)  # <5>
    return resp.read()  # <6>
it seems that we read the response using .read(), which is synchronous. Why don't we use
return await resp.aread()
instead?
The explanation seems to apply to client.get(), where the network I/O is done asynchronously by httpx, but not to read(). Callout <6> says:
"Network I/O operations are implemented as coroutine methods, so they are driven asynchronously by the asyncio event loop."