Say I have a Python project that looks like this:
```
project/
├── package/
│   ├── __init__.py
│   └── module1.py
└── main.py
```
I would like `package/` to be self-contained, even though it is called from `main.py`.
Here are the contents of `package/module1.py`:

```python
def some_fn():
    print("hi")
```
Now say I want to call `some_fn` inside `__init__.py`. I have the following options:
1:

```python
import module1
```

- Benefit: imports when executing inside `package/`
- Drawback: fails to import when executing outside `package/`
2:

```python
import package.module1
```

- Benefit: imports when executing outside `package/`
- Drawback: fails to import when executing inside `package/`
3:

```python
import os
import sys

sys.path.append(os.path.dirname(os.path.abspath(__file__)))
import module1
```

- Benefit: covers both cases
- Drawback: ugly
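The trade-offs of options 1 and 2 can be reproduced with a small script. This is only a sketch: it rebuilds the project layout in a temporary directory and tries each import statement from inside and outside `package/`.

```python
import os
import subprocess
import sys
import tempfile

# Recreate the project layout in a temp directory (demonstration only).
root = tempfile.mkdtemp()
pkg = os.path.join(root, "package")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "module1.py"), "w") as f:
    f.write("def some_fn():\n    print('hi')\n")

def import_ok(stmt, cwd):
    """Run `python -c stmt` with the given working directory."""
    result = subprocess.run([sys.executable, "-c", stmt], cwd=cwd,
                            capture_output=True, text=True)
    return result.returncode == 0

print(import_ok("import module1", cwd=pkg))           # option 1, inside package/
print(import_ok("import module1", cwd=root))          # option 1, outside
print(import_ok("import package.module1", cwd=root))  # option 2, outside
print(import_ok("import package.module1", cwd=pkg))   # option 2, inside
```

Because `python -c` puts the current working directory on `sys.path`, option 1 only resolves when run from inside `package/`, and option 2 only from the project root.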
Is there a preferred or idiomatic way of doing this?
What about the `__main__` module?

---

Avoid `sys.path` manipulation; it gets really difficult to debug. Instead, define a shared/global module namespace hierarchy. You can define multiple command-line entrypoints, and you can re-export objects under different names. You can also use relative imports like `from . import module1`, so that a module doesn't have to know its place in the module hierarchy, which makes it easier to move a package around. The drawback of scripts/entrypoints is that your project must be installed, ideally in a venv. Modern tools like `poetry` or `uv` make this a lot easier than `pip`.

Regarding `__main__.py` and how it enables you to run `python -m my_module`: does it help with relative imports?

The problem with `from . import module1` is that I cannot use it independently; I would get the error: `ImportError: attempted relative import with no known parent package`
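One way to check the `python -m` question is to run `__main__.py` both ways. The sketch below rebuilds the package in a temporary directory (a stand-in for a real installed project): `python -m package` executes `__main__.py` *as part of the package*, so the relative import resolves, while running the same file directly has no parent package and fails with the `ImportError` quoted above.

```python
import os
import subprocess
import sys
import tempfile

# Recreate the project with a __main__.py that uses a relative import.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "package")
os.makedirs(pkg)
with open(os.path.join(pkg, "module1.py"), "w") as f:
    f.write("def some_fn():\n    print('hi')\n")
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import module1\n")
with open(os.path.join(pkg, "__main__.py"), "w") as f:
    f.write("from . import module1\nmodule1.some_fn()\n")

# Run as a module: the package context is set up, relative imports work.
ok = subprocess.run([sys.executable, "-m", "package"], cwd=root,
                    capture_output=True, text=True)
print(ok.returncode, ok.stdout.strip())

# Run as a plain script: no parent package, so the relative import fails.
bad = subprocess.run([sys.executable, os.path.join(pkg, "__main__.py")],
                     capture_output=True, text=True)
print(bad.returncode)
```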