I have a small annoyance with PyCharm. For example, if I write assert_true without having imported it, I can press Alt-Enter on it, pick the suggestion "Import from... nose.tools.assert_true", and the IDE automatically adds the line:
from nose.tools import assert_true
That's all good. But if I later write assert_false and press Alt-Enter, I don't seem to get the option to add it to the existing import automatically. That seems odd to me; am I missing something?
So currently I add the first function that way, and then I have to append any further functions to the existing import line by hand.
I know you can use * or import the full module, but I am still interested in how to add further functions to existing import lines in some non-manual way.
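For the record, the end result I'm after is just the combined import line, which at the moment I edit by hand:
from nose.tools import assert_true, assert_false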
Update: as noted in the comments below, this seems to be specific to nose.tools; when importing from other modules it works.
I'm building a large application, a game engine if you will. I've tried both relative and absolute imports, but nothing seems to work.
I'm inside Sacra/Screen/main.py and want to import something located in Sacra/Audio/PlayAudio.py; the part I need is a "simple" class.
To get to the "problem": how does one import that class? I've looked through the bulk of the Python docs, Real Python and many more websites, and they all say the same thing, namely that it should already be working.
Note: I've also tried from ..Audio.PlayAudio import PlaySound, where PlaySound is the class in PlayAudio.
Another note: it is possible to import from a test.py file located directly in Sacra with the code import Audio, because the __init__.py file is configured.
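For clarity, these are the two forms I'd expect to work, assuming every folder has an __init__.py and the directory containing Sacra/ is on sys.path:
# absolute import, usable from anywhere in the project
from Sacra.Audio.PlayAudio import PlaySound
# relative import, only valid when main.py is executed as part of the package
# (e.g. python -m Sacra.Screen.main), not as a standalone script
from ..Audio.PlayAudio import PlaySound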
I've split one large Python file containing a bunch of methods into several smaller ones. However, there's one problem: I'd like all of these small files to import almost the same modules.
I've tried creating a header.py file where I just pasted the common header, and then added from header import * to the other files, but it seems to me that this only imports the methods listed in header.py rather than the actual modules.
I know one solution is to figure out which libraries each small file depends on, but isn't there a faster way to do that?
It is very important and useful to have all of a .py file's dependencies listed in its import statements. That way we can easily trace every module used back to its source.
Say you are using module1.method somewhere and want to check where this method came from; you'll always find it in the import statements at the top. This is why from module1 import * is highly discouraged.
We can do what you need in a neat way.
In your header.py
import module1
import module2
In your other files,
import header
header.module1.method()  # make all the function calls via the header module
If you think this makes writing your code tedious because of the longer function names, then maybe try this:
import header as h
h.module1.method()
But please make sure you don't have untraceable modules in your Python files.
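For example, a minimal sketch of this pattern (the module names here are just placeholders):
# header.py -- the shared imports live in one place
import os
import json
# one of the smaller files
import header as h
def load_config(path):
    # os and json are reachable as attributes of header, and every call
    # is still traceable back to the import statements in header.py
    if h.os.path.exists(path):
        with open(path) as f:
            return h.json.load(f)
    return {}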
Is there any conceivable point to reloading these modules immediately after importing them? This is the code that I was reviewing which made me wonder:
import time
import sys
import os
import string
import pp
import numpy
import nrrd
reload(nrrd)
import smooth as sm
reload(sm)
import TensorEval2C as tensPP
reload(tensPP)
import TrackFiber4C as trackPP
reload(trackPP)
import cmpV
reload(cmpV)
import vectors as vects
reload(vects)
Edit: I suggested that this might make the creation of .pyc files more likely, but several people pointed out that this happens on the first import, every time.
I note that the standard modules are just imported: it's the other modules that are reloaded. I expect whoever wrote this code wanted to be able to easily reload the whole package (so as to get their latest edits). After putting in all these redundant reload calls, the programmer only had to write
>>> reload(package)
to bring things up to date in the interpreter, instead of having to type
>>> reload(package.nrrd)
>>> reload(package.sm)
>>> reload(package.tensPP)
etc. So please ignore the suggestion that you commit violence against the programmer who wrote this: they are far from the only programmer who's had trouble with reloading of dependencies. Just encourage them to move the reloads to a convenience function.
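For instance, a sketch of what such a convenience function might look like inside the package, assuming the module-level imports above stay in place (Python 2 style, matching the builtin reload() used in the snippet):
def reload_all():
    # reload only the in-development modules; the standard-library
    # imports at the top don't need it
    for mod in (nrrd, sm, tensPP, trackPP, cmpV, vects):
        reload(mod)
The interactive session then only needs a single call such as package.reload_all() to pick up fresh edits.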
It is possible that this does cause something to happen; the obvious example is side-effects that happen on import. For instance, a module could log to a file the time and date of every time it is imported.
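A hypothetical module along those lines, whose body appends to a log file every time it runs, so a reload() right after the import repeats the side effect:
# sidefx.py -- hypothetical module with an import-time side effect
import datetime
with open("import_log.txt", "a") as f:
    f.write("imported at %s\n" % datetime.datetime.now())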
There is probably no good reason for this, however.
The .pyc files would be created on the first import, so even that's not a very good reason for this.
What's the execution environment for this code? There is at least one Python web framework that makes different reload decisions than standard Python does, which leads to frustration and confusion when you make a change that doesn't 'take'.
I've noticed that importing a module will import its functions and methods, and the functions and methods of those as well. Is there a set rule for how many levels down Python will import when you import an upper-level module?
edit
Sorry, I think I've been misunderstood by the answers so far, which respond about multiple imports of the same dependencies. I'm thinking of nested folders, e.g. in Django: if you import django, you can access django.contrib.auth, but you can't access django.contrib.auth.views unless you import that specifically. I was just wondering whether it's always two levels down in such a case.
second edit
To clarify again: in the Django example, the layout is /django/contrib/auth/views.py, where each of the subfolders has an __init__.py making it a package, and none of them defines an __all__ attribute. Is my example bad, since maybe you can't use the dot syntax to navigate to a file within a folder designated as a package?
No, Python will import whatever it needs to import. However, each module is only loaded once. For example, if one module does import sys and another module also does import sys, it will not physically do it twice.
Not really. A module imports things from other modules because it needs to use them in that module; otherwise it would break.
There is no pre-defined import depth level. Import statements are executed, just like any other python statement.
But, you may wonder, how are cycles avoided? Modules are added to sys.modules (i.e., cached) when they get imported for the first time, and that is the first location examined when an import statement is executed. So each module is loaded just once, although it may appear in many import statements.
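A quick way to see that cache in action, using the standard json module as a stand-in example:
import sys
import json
first = sys.modules["json"]
import json  # second import is a cache hit; the module body is not executed again
assert sys.modules["json"] is first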
So I just ran into a strange so-called bug: this works in my other .py files, but in this one file it suddenly stopped working.
from tuttobelo.management.models import *
The above used to work, but it stopped working all of a sudden and I had to replace it with the lines below.
from tuttobelo.management.models import Preferences, ProductVariant, UserSeller, ProductOwner, ProductModel, ProductVariant
from tuttobelo.management.models import ProductMeta, ShippingMethods
I know the following is the better way of coding; however, ALL of the models mentioned in models are used, so my question is: for what possible reasons can a wildcard import stop working?
The error I got was that the model I was trying to import does not exist; only when I removed the wildcard and imported the model by name could I get it imported properly.
Thanks!
Maybe the models module has an __all__ which does not include what you're looking for. Anyway, from ... import * is never a good idea in production code -- we always meant the import * feature for interactive exploratory use, not production use. Specifically import the module you need -- use that name to qualify names that belong there -- and you'll be vastly happier in the long run!-)
There are some cases in Python where importing with * will not yield anything. In your example, if tuttobelo.management.models is a package (i.e. a directory with an __init__.py) containing the files Preferences.py, ProductVariant.py, etc., importing with star will not work unless you have already imported them explicitly somewhere else.
This can be solved by putting in the __init__.py:
__all__ = ['Preferences', 'ProductVariant', 'UserSeller', <etc...> ]
This will make it possible to do import * again, but as noted, that's a horrible coding style for several reasons. First, tools like pyflakes and pylint, and code introspection in your editor, stop working. Second, you end up putting a lot of names into the local namespace without your code showing where they came from, and third, you can get name clashes that way.
A better way is to do
from tuttobelo.management import models
And then refer to the other things as models.Preferences, models.ProductVariant, etc. This, however, will not work via the __all__ variable; instead you need to import the modules in the __init__.py:
import Preferences, ProductVariant, UserSeller, ProductOwner, <etc...>
The drawback of this is that all modules get imported even if you don't use them, which means it will take more memory.
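For example, a minimal sketch of the qualified style recommended above (the class names are taken from the question, and I'm assuming they are importable attributes of models):
from tuttobelo.management import models
def list_models():
    # every name is explicitly qualified, so its origin stays obvious and
    # tools like pyflakes can still check it
    return [models.Preferences, models.ProductVariant, models.ShippingMethods]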