From: snorble on
My question is, why do the modules bar and foo show up in mypack's
dir()? I intend for Foo (the class foo.Foo) and Bar (the class
bar.Bar) to be there, but was not sure about the modules foo and bar.

My big picture intention is to create smaller modules, but more of
them (like I am used to doing with C++), and then use a package to
organize the namespace so I'm not typing out excessively long names
and making the code less readable. Is that a reasonable approach to
developing Python programs?

$ ls mypack/*.py
bar.py
foo.py
__init__.py

$ cat mypack/__init__.py
from foo import Foo
from bar import Bar

$ python
Python 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit
(Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import mypack
>>> dir(mypack)
['Bar', 'Foo', '__builtins__', '__doc__', '__file__', '__name__',
'__package__', '__path__', 'bar', 'foo']
>>>
From: Peter Otten on
snorble wrote:

> My question is, why do the modules bar and foo show up in mypack's
> dir()? I intend for Foo (the class foo.Foo) and Bar (the class
> bar.Bar) to be there, but was not sure about the modules foo and bar.

> $ ls mypack/*.py
> bar.py
> foo.py
> __init__.py
>
> $ cat mypack/__init__.py
> from foo import Foo
> from bar import Bar
>
> $ python
> Python 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit
> (Intel)] on win32
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import mypack
> >>> dir(mypack)
> ['Bar', 'Foo', '__builtins__', '__doc__', '__file__', '__name__',
> '__package__', '__path__', 'bar', 'foo']


How is Python to know that you won't perform an

import mypack.foo

afterwards? After that statement foo must be an attribute of mypack. But
when mypack.foo has already been imported, this just performs

mypack = sys.modules["mypack"]

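To see that lookup in action, here is a small sketch against the package from
the question (assuming the original __init__.py, without any del):

import sys

import mypack.foo                                # runs mypack/__init__.py and mypack/foo.py
assert mypack.foo is sys.modules["mypack.foo"]   # the submodule is cached and attached to the package
import mypack.foo                                # now only the cached lookup described above happens
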
If the foo attribute weren't added by

from foo import Foo

the caching mechanism would not work. While

import mypack.foo

might also have been implemented as

mypack = sys.modules["mypack"]
if not hasattr(mypack, "foo"):
    mypack.foo = sys.modules["mypack.foo"]

I think that would have been a solution to a non-problem. If you're sure you
don't need to access mypack.foo directly, you can add

del foo

to mypack/__init__.py, but don't complain when you get bitten by

>>> import mypack.foo
>>> mypack.foo
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'foo'
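
For completeness, a cleaned-up mypack/__init__.py along those lines could look
like this (the __all__ line and deleting bar as well are additions for this
sketch, not something from the original post):

$ cat mypack/__init__.py
from foo import Foo       # importing the submodule also sets mypack.foo (see above)
from bar import Bar       # likewise sets mypack.bar

__all__ = ["Foo", "Bar"]  # optional: advertise only the classes
del foo, bar              # remove the submodule attributes; sys.modules still caches them

dir(mypack) then lists only Foo, Bar and the double-underscore names, at the
price of the AttributeError shown above.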

> My big picture intention is to create smaller modules, but more of
> them (like I am used to doing with C++), and then use a package to
> organize the namespace so I'm not typing out excessively long names
> and making the code less readable. Is that a reasonable approach to
> developing Python programs?

I like to put related classes and functions into a single file. As a rule of
thumb, when a class needs a file of its own, the class is too big...
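
To make that concrete with the example from your post: if Foo and Bar are
small, they could live in a single module, say mypack/shapes.py (the file name
and the empty class bodies are just placeholders for this sketch):

$ cat mypack/shapes.py
class Foo(object):
    pass

class Bar(object):
    pass

$ cat mypack/__init__.py
from shapes import Foo, Bar   # one implicit relative import instead of two

dir(mypack) then shows Foo, Bar and a single shapes entry instead of the two
submodule names.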

Peter