ModuleNotFoundError: No module named 'funciones' (Python)

I'm trying to create a dictionary with some functions for repetitive tasks at work, and now when I try to use any function in the dict I get this error.
I'm also learning Python; I've only been programming in this language for about two weeks, so if anybody can help me approach the problem in a more specific way, I'll be very grateful.
This is the folder structure:
(screenshot of the folder structure in the original post)
Here is how I'm importing the functions, storing them in the dict, and invoking one of them:
# Cred Now functions ------------------
from funciones.credNowFunciones.CredNowLoanCargar import credNowLoanCargar
from funciones.credNowFunciones.CredNowLoanCargar import credNowLoanCargarYComparar
from funciones.credNowFunciones.CredNowLoanAcumulable import credNowLoanAcumulable

credNowDict = {
    "cargarLoan": credNowLoanCargar,
    "cargarYCompararLoan": credNowLoanCargarYComparar,
    "loanAcumulable": credNowLoanAcumulable
}

# Invoke a function
credNowDict['cargarLoan']('archivos/excels/credNow/1-100-CXXXNXw_1102.xlsx',
                          'archivos/generadosPorEmpresa/credNowGenerados/crxxNxxnxCargar.xlsx')

The from package.subpackage.module import object structure only works if package and subpackage are valid packages - that is, each is a directory containing a file called __init__.py, and the top-level one lives in a directory on the PYTHONPATH or in the same directory as the __main__ file.
Because neither funciones nor credNowFunciones contains a file called __init__.py, they do not count as packages and cannot be imported properly. To solve this, simply add an empty __init__.py file to each of the folders you want to be able to import from, so the folder structure of funciones becomes:
funciones/
    __init__.py
    credNowFunciones/
        __init__.py
        CredNowLoanAcumulable.py
        CredNowLoanCargar.py
    edemsaFunciones/
        __init__.py
        ...
Also note that if the Python script you are importing the objects into is not in the same directory as funciones, you need to make sure funciones is in a directory that is on your PYTHONPATH (see the Python documentation on the module search path for more info).
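A quick way to check that (a sketch, not part of the original answer; the appended path is hypothetical):

# Print where Python currently looks for importable packages; the directory that
# *contains* funciones/ must appear in this list for the import to work.
import sys

print(sys.path)

# One-off workaround if it does not: append that directory manually.
# (Setting PYTHONPATH or restructuring the project is the cleaner fix.)
sys.path.append("/ruta/a/mi/proyecto")  # hypothetical path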
For more information on Python packages, see the Packages section of the official Python tutorial.

Related

Is specific way of importing submodules at all possible?

I am working in the following directory tree:
src/
    __init__.py
    train.py
    modules/
        __init__.py
        encoders/
            __init__.py
            rnn_encoder.py
My pwd is the top-level directory and my __init__.py files are all empty. I am executing train.py, which contains the following code snippet.
import modules
# RNNEncoder is a class in rnn_encoder.py
encoder = modules.encoders.rnn_encoder.RNNEncoder(**params)
When I execute train.py, I get an error saying that
AttributeError: module 'modules' has no attribute 'encoders'
I am wondering if there is any clean way to make this work. Note that I am not looking for alternative methods of importing, I am well-aware that this can be done in other ways. What I'd like to know is whether it is possible to keep the code in train.py as is while maintaining the given directory structure.
Putting an __init__.py file in a folder allows that folder to act as an import target, even when it's empty. The way you currently have things set up, the following should work:
from modules.encoders import rnn_encoder
encoder = rnn_encoder.RNNEncoder(**params)
Here, python treats modules.encoders as a filepath, essentially, and then tries to actually import the code inside rnn_encoder.
However, this won't work:
import modules
encoder = modules.encoders.rnn_encoder.RNNEncoder(**params)
The reason is that, when you do import modules, what python is doing behind the scenes is importing __init__.py from the modules folder, and nothing else. It doesn't run into an error, since __init__.py exists, but since it's empty it doesn't actually do much of anything.
You can put code in __init__.py to fill out your module's namespace and allow people to access that namespace from outside your module. To solve your problem, make the following changes:
modules/encoders/__init__.py
from . import rnn_encoder
modules/__init__.py
from . import encoders
This imports rnn_encoder and assigns it to the namespace of encoders, allowing you to import encoders and then access encoders.rnn_encoder. Same with modules.encoders, except a step upwards.
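Putting it together, a consolidated sketch of the three files involved (RNNEncoder and params come from the question and are not defined here):

# modules/__init__.py
from . import encoders

# modules/encoders/__init__.py
from . import rnn_encoder

# train.py -- unchanged from the question; it now works because each __init__.py
# binds its child package/module into the parent's namespace at import time
import modules
encoder = modules.encoders.rnn_encoder.RNNEncoder(**params)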

How to import all modules in a package as a list?

Okay, so I'm trying to find a way to get a list of all modules in a package, into a variable.
My file structure is as such:
main/
    app.py
    __init__.py
    modules/
        __init__.py
        mod_1.py
        mod_2.py
All of the code is running from app.py. The modules folder will have many different modules in it, made by several people with a common interface. The goal is to have app.py import a list of all the modules in main.modules, and then iterate through them, doing calls such as:
for mod in module_list:
    mod.add(5, 2)
I had a working version where I got a list of filenames, and then used module_list = map(__import__, file_list) to make a list of modules that I was able to loop through and call. The problem was, I was using the horrible method of sys.path.append('path/to/module/folder/') and filtering by .py extension to get my list of files, and that wouldn't work well on other computers. Playing with the __init__.py files, I am now able to correctly import my main/ package from anywhere by simply doing:
import main
(with a better name, obviously), and I am able to do things such as:
main.modules.mod_1.add(5,2)
from main.modules import *
mod_1.add(5,2)
mod_2.add(6,2)
But I can't find a way to make a list of all of the modules in the main.modules 'package':
mods = main.modules.*
mods = from main.modules import *
import main.modules.* as mods
None of these work, and they were the closest I could come. Any help would be great, thanks!
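One common approach (a sketch, not from the original thread) is to combine pkgutil.iter_modules with importlib; it assumes each module in main/modules exposes the shared add() interface described above:

import importlib
import pkgutil

import main.modules

# Collect every module that lives directly inside main/modules/.
module_list = [
    importlib.import_module(f"main.modules.{name}")
    for _, name, _ in pkgutil.iter_modules(main.modules.__path__)
]

for mod in module_list:
    mod.add(5, 2)  # relies on the common interface each module implements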

How to organize code with __init__.py?

I am just starting off using Google App Engine, and have been looking around for good practices and code organization. Most of my problems stem from confusion about __init__.py.
My current test structure looks like
/website
    main.py
    /pages
        __init__.py   #1
        blog.py
        hello2.py
        hello.py
        /sub
            __init__.py   #2
            base.py
I am trying to use main.py as a file that simply points to everything in /pages and /pages/sub. Most modules in /pages share almost all the same imports (e.g. import urllib). Is there a way to make everything in /pages import what I want, rather than adding the imports to every individual module?
Currently in __init__.py #1 I have
from sub.base import *
Yet my module blog.py says BaseHandler (a function in base.py) is not defined.
My end goal is to have something like ...
main.py
from pages import *
#be able to call any function in /pages without having to do blog.func1() or hello.func2()
#rather just func1() and func2()
I would also like to be able to share common imports for the modules in /pages via __init__.py, so that they all share, for example, urllib and all the functions from base.py. Thank you for taking the time to read this post; I look forward to your insight.
Sounds like you think __init__.py is an initializer for the other modules in the package. It is not. It turns pages into a package (allowing its files and subdirectories to be modules), and it is executed, like a normal module would be, when your program calls import pages. Imagine that it's named pages.py instead.
So if you really want to dump everything into the same namespace, init #2 can contain from base import * (which will import everything in base to the namespace of sub), and blog.py can contain from sub import *. Got it?
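Spelled out with the explicit relative imports that Python 3 requires (the bare from base import * form above only works under Python 2's implicit relative imports), a sketch of the two __init__.py files would be:

# pages/sub/__init__.py
from .base import *      # re-export BaseHandler and everything else base.py defines

# pages/__init__.py
from .sub import *       # pull those names up into the pages namespace

# main.py can then do:
from pages import *      # func1(), func2(), BaseHandler, ... available directly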

How to reference to the top-level module in Python inside a package?

In the hierarchy below, is there a convenient and universal way to refer to top_package using a generic term in every .py file below it? I would like to have a consistent way to import other modules, so that even when "top_package" changes its name nothing breaks.
I am not in favour of using relative imports like "..level_one_a", as the relative path will be different for each python file below. I am looking for a way that satisfies the following:
Each python file can have the same import statement for the same module in the package.
A decoupled reference to "top_package" in every .py file inside the package, so that whatever name "top_package" changes to, nothing breaks.
top_package/
    __init__.py
    level_one_a/
        __init__.py
        my_lib.py
        level_two/
            __init__.py
            hello_world.py
    level_one_b/
        __init__.py
        my_lib.py
    main.py
This should do the job:
top_package = __import__(__name__.split('.')[0])
The trick here is that for every module the __name__ variable contains the full path to the module separated by dots such as, for example, top_package.level_one_a.my_lib. Hence, if you want to get the top package name, you just need to get the first component of the path and import it using __import__.
Although the variable used to access the package is still called top_package, you can rename the package itself and it will still work.
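A minimal sketch of the same trick in practice (file placement assumed from the tree above; importlib.import_module is used here instead of a raw __import__ call):

# top_package/level_one_a/level_two/hello_world.py
import importlib

# __name__ is "top_package.level_one_a.level_two.hello_world" here, so the first
# dotted component is the top-level package name, whatever it happens to be.
_top = __name__.split('.')[0]

# Import a sibling module without ever hard-coding "top_package":
my_lib = importlib.import_module(_top + ".level_one_b.my_lib")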
Put your package and the main script into an outer container directory, like this:
container/
    main.py
    top_package/
        __init__.py
        level_one_a/
            __init__.py
            my_lib.py
            level_two/
                __init__.py
                hello_world.py
        level_one_b/
            __init__.py
            my_lib.py
When main.py is run, its parent directory (container) will be automatically added to the start of sys.path. And since top_package is now in the same directory, it can be imported from anywhere within the package tree.
So hello_world.py could import level_one_b/my_lib.py like this:
from top_package.level_one_b import my_lib
No matter what the name of the container directory is, or where it is located, the imports will always work with this arrangement.
But note that, in your original example, top_package could easily function as the container directory itself. All you would have to do is remove top_package/__init__.py, and you would be left with effectively the same arrangement.
The previous import statement would then change to:
from level_one_b import my_lib
and you would be free to rename top_package however you wished.
You could use a combination of the __import__() function and the __path__ attribute of a package.
For example, suppose you wish to import <whatever>.level_one_a.level_two.hello_world from somewhere else in the package. You could do something like this:
import os
_temp = __import__(__path__[0].split(os.sep)[0] + ".level_one_a.level_two.hello_world")
my_hello_world = _temp.level_one_a.level_two.hello_world
This code is independent of the name of the top level package and can be used anywhere in the package. It's also pretty ugly.
This works from within a library module:
import __main__ as main_package
TOP_PACKAGE = main_package.__package__.split('.')[0]
I believe #2 is impossible without using relative imports or the package name. You have to specify which module to import, either by explicitly using its name or by using a relative import; otherwise how would the interpreter know what you want?
If you make your application launcher one level above top_level/ and have it import top_level you can then reference top_level.* from anywhere inside the top_level package.
(I can show you an example from software I'm working on: http://github.com/toddself/beerlog/)

Adding code to __init__.py

I'm taking a look at how the model system in django works and I noticed something that I don't understand.
I know that you create an empty __init__.py file to specify that the current directory is a package. And that you can set some variable in __init__.py so that import * works properly.
But django adds a bunch of from ... import ... statements and defines a bunch of classes in __init__.py. Why? Doesn't this just make things look messy? Is there a reason that requires this code in __init__.py?
All imports in __init__.py are made available when you import the package (directory) that contains it.
Example:
./dir/__init__.py:
import something
./test.py:
import dir
# can now use dir.something
EDIT: forgot to mention, the code in __init__.py runs the first time you import any module from that directory. So it's normally a good place to put any package-level initialisation code.
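For example, a small piece of package-level initialisation (hypothetical, not from the answer) could live there:

# dir/__init__.py -- runs exactly once, the first time dir or anything under it is imported
import logging

logger = logging.getLogger(__name__)      # package-wide logger
logger.addHandler(logging.NullHandler())  # stay quiet unless the application configures logging

PACKAGE_VERSION = "0.1.0"                 # made-up package-level constant, for illustration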
EDIT2: dgrant pointed out a possible confusion in my example. In __init__.py, import something can import any module, not necessarily one from the package. For example, we can replace it with import datetime; then, in our top-level test.py, both of these snippets will work:
import dir
print(dir.datetime.datetime.now())
and
import dir.some_module_in_dir
print(dir.datetime.datetime.now())
The bottom line is: all names assigned in __init__.py, be it imported modules, functions or classes, are automatically available in the package namespace whenever you import the package or a module in the package.
It's just personal preference really, and has to do with the layout of your python modules.
Let's say you have a module called erikutils. There are two ways that it can be a module, either you have a file called erikutils.py on your sys.path or you have a directory called erikutils on your sys.path with an empty __init__.py file inside it. Then let's say you have a bunch of modules called fileutils, procutils, parseutils and you want those to be sub-modules under erikutils. So you make some .py files called fileutils.py, procutils.py, and parseutils.py:
erikutils/
    __init__.py
    fileutils.py
    procutils.py
    parseutils.py
Maybe you have a few functions that just don't belong in the fileutils, procutils, or parseutils modules. And let's say you don't feel like creating a new module called miscutils. AND, you'd like to be able to call the function like so:
erikutils.foo()
erikutils.bar()
rather than doing
erikutils.miscutils.foo()
erikutils.miscutils.bar()
So, because the erikutils module is a directory, not a file, we have to define its functions inside the __init__.py file.
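In other words, a sketch of erikutils/__init__.py for that situation (the foo and bar bodies are made up; only their location matters):

# erikutils/__init__.py
from . import fileutils, procutils, parseutils   # expose the real submodules

def foo():
    return "foo"   # hypothetical body

def bar():
    return "bar"   # hypothetical body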
In django, the best example I can think of is django.db.models.fields. ALL the django *Field classes are defined in the __init__.py file in the django/db/models/fields directory. I guess they did this because they didn't want to cram everything into a hypothetical django/db/models/fields.py module, so they split it out into a few submodules (related.py and files.py, for example) and stuck the *Field definitions in the fields module itself (hence, __init__.py).
Using the __init__.py file allows you to make the internal package structure invisible from the outside. If the internal structure changes (e.g. because you split one fat module into two) you only have to adjust the __init__.py file, but not the code that depends on the package. You can also make parts of your package invisible, e.g. if they are not ready for general usage.
Note that you can use the del command, so a typical __init__.py may look like this:
from .somemodule import some_function1, some_function2, SomeObject
del somemodule
Now if you decide to split somemodule the new __init__.py might be:
from .somemodule1 import some_function1, some_function2
from .somemodule2 import SomeObject
del somemodule1
del somemodule2
From the outside the package still looks exactly as before.
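A quick illustration of that point (the package name somepackage and the call sites are hypothetical):

# client code -- identical before and after somemodule is split in two, because
# __init__.py re-exports the same names either way
from somepackage import some_function1, SomeObject

some_function1()
obj = SomeObject()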
"We recommend not putting much code in an __init__.py file, though. Programmers do not expect actual logic to happen in this file, and much like with from x import *, it can trip them up if they are looking for the declaration of a particular piece of code and can't find it until they check __init__.py. "
-- Steven F. Lott and Dusty Phillips, Python Object-Oriented Programming, Fourth Edition
