import statement fails for one module - python

OK, I found the problem: it was an environment issue. I had the same modules (minus options.py) elsewhere on sys.path, and Python was importing from there instead. Thanks everyone for your help.
I have a series of import statements, the last of which will not work. Any idea why? options.py is sitting in the same directory as everything else.
from snipplets.main import MainHandler
from snipplets.createnew import CreateNewHandler
from snipplets.db import DbSnipplet
from snipplets.highlight import HighLighter
from snipplets.options import Options
ImportError: No module named options
my __init__.py file in the snipplets directory is blank.

I suspect that one of your other imports redefined snipplets with an assignment statement. Or one of your other modules changed sys.path.
Edit
"so the flow goes like this: add snipplets packages to path import..."
No.
Do not modify sys.path -- that way lies problems. Modifying sys.path leads to ambiguity about what is -- or is not -- on the path, and in what order.
The simplest, most reliable, most obvious, most controllable things to do are the following. Pick exactly one.
Define PYTHONPATH (once, external to your program). A single, simple environment variable that works almost exactly like installing into site-packages.
Install your package in site-packages.
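Either way, a quick sanity check -- just a minimal sketch, nothing project-specific -- is to ask Python where it actually found the package and what the search order was:
import sys
import snipplets

# Which copy of the package won the import?
print(snipplets.__file__)

# And in what order were the directories searched?
for entry in sys.path:
    print(entry)
If __file__ points at a second copy of snipplets elsewhere on sys.path -- which is what turned out to be the problem here -- that stale copy simply doesn't contain options.py.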

Your master branch doesn't have options.py. Could it be that your dev and master branches are conflicting?
If this is your actual code, then you have an options variable at line 21.

Does the following work?
import snipplets.options
If so, one of your other snipplets files probably sets a global variable named options.
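Purely as an illustration of that idea (these contents are hypothetical), an assignment like the following would shadow the name:
# hypothetical snipplets/__init__.py
options = {"debug": True}   # a plain variable that happens to be named "options"

# elsewhere, this now binds the dict rather than the options module:
from snipplets import options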

Are you on Windows? You might want to try defining an __all__ list in your __init__.py file, as noted here. It shouldn't make a difference unless you're importing *, but I've seen modules fail to import unless they were listed there.
Secondly, you might try setting up a virtualenv. Relying on a lot of site-wide Python packages can lead to these kinds of things.
Lastly, make sure the permissions of options.py are set correctly. I've spent hours trying to figure these things out only to find it was a matter of me not having permission to import it.

Related

"No Module named..."-error in editor despite code appearing to work

I've been struggling for quite some time trying to import a module from a folder in a separate directory on my computer for a Python project. Currently the code seems to work, but PyCharm is still giving me errors that the module cannot be found. Despite this, if I run the code it seems to do what is intended.
What I have is essentially this:
import sys
sys.path.append(r'D:\Progam\bin')
import foo
Where foo is a module found in D:\Progam\bin, and PyCharm is warning me that there is no module named foo. Considering how much trouble I've had getting this working, I'm hesitant to just ignore the warning in case there's some underlying problem.
Anyone have any idea what's happening here?
Because the directory isn't on your path globally, your IDE doesn't recognize that the import becomes valid during execution. It would probably be a security issue if the IDE added entries to its path based on potentially unknown code.
You could either add that directory to your path via CMD like so:
set PATH=%PATH%;C:\your\path\here\
Or just ignore the error.
EDIT: Ignore that, I'm being a sleep deprived dumbass. Take a look at:
how to manage sys.path globally in pycharm
(Thought this edit would be slightly more useful than me just deleting my answer)

Including xlrd/xlwt/xlutils with modules outside of python installation

I'm self-taught in the Python world, so some of the structural conventions are still a little hazy to me. However, I've been getting very close to what I want to accomplish, but just ran into a larger problem.
Basically, I have a directory structure like this, which will sit outside of the normal python installation (this is to be distributed to people who should not have to know what a python installation is, but will have the one that comes standard with ArcGIS):
top_directory/
ArcToolbox.tbx
scripts/
ArcGIStool.py (script for the tool in the .tbx)
pythonmod/
__init__.py
general.py
xlrd/ (copied from my own python installation)
xlwt/ (copied from my own python installation)
xlutils/ (copied from my own python installation)
So, I like this directory structure, because all of the ArcGIStool.py scripts call functions within the pythonmod package (like those within general.py), and all of the general.py functions can call xlrd and xlwt functions with simple "import xlrd" statements. This means that if the user desired, he/she could just move the pythonmod folder to the python site-packages folder, and everything would run fine, even if xlrd/xlwt/xlutils are already installed.
THE PROBLEM:
Everything is great, until I try to use xlutils in general.py. Specifically, I need to "from xlutils.copy import copy". However, this sets off a cascade of import errors. One is that xlutils/copy.py uses "from xlutils.filter import process,XLRDReader,XLWTWriter". I solved this by modifying xlutils/copy.py like this:
try:
    from xlutils.filter import process, XLRDReader, XLWTWriter
except ImportError:
    from filter import process, XLRDReader, XLWTWriter
I thought this would work fine for other situations, but there are modules in the xlutils package that need to import xlrd. I tried following this advice, but when I use
try:
    import xlrd
except ImportError:
    import os, sys, imp
    path = os.path.dirname(os.path.dirname(sys.argv[0]))
    xlrd = imp.load_source("pythonmod.xlrd", os.path.join(path, "xlrd", "__init__.py"))
I get a new import error: in xlrd/__init__.py, the info module is imported (from xlrd/info.py), BUT when I use the above code, I get an error saying that the name "info" is not defined.
This leads me to believe that I don't really know what is going on, because I thought that when the __init__.py file was imported it would run just like normal and look within its containing folder for info.py. This does not seem to be the case, unfortunately.
Thanks for your interest, and any help would be greatly appreciated.
p.s. I don't want to have to modify the path variables, as I have no idea who will be using this toolset, and permissions are likely to be an issue, etc.
I realized I was using imp.load_source incorrectly. The correct syntax for what I wanted to do should have been:
imp.load_source("xlrd",os.path.join(path,"xlrd","__init__.py"))
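A hedged sketch of the full fallback with that fix (same path computation as in the question) would be:
try:
    import xlrd
except ImportError:
    import os, sys, imp
    path = os.path.dirname(os.path.dirname(sys.argv[0]))
    # Registering the module under the plain name "xlrd" puts it in
    # sys.modules as "xlrd", so later "import xlrd" statements inside
    # xlutils resolve to this copy.
    xlrd = imp.load_source("xlrd", os.path.join(path, "xlrd", "__init__.py"))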
In the end though, I ended up rewriting my code to not need xlutils at all, because I continued to have import errors that were causing many more problems than were worth dealing with.

Calling modules in python

So I'm a beginner to Python/programming and came upon this code in a tutorial, which I'm having trouble understanding.
from pythonds.basic.stack import Stack
What I did was go to the site-packages folder in my Python directory (which holds all modules). There I found the directory structure to be:
pythonds/basic/stack.py
The file stack.py has a "class Stack" inside it.
So am I correct in interpreting/relating the import command to this directory structure?
Also, whenever such a long chain of modules appears in Python, can it always be understood in this manner?
On the command line, you can do this:
C:\Python27\Lib>pip install pythonds
Then the module will work.
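Once it has installed, you can check where the package was picked up from, for example:
import pythonds
print(pythonds.__file__)   # should point into site-packages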
Not all the time.
It's probably better to not try and compare the directory structure with the module path, unless you have to debug modules or install them manually.
Sometimes, your PYTHONPATH will be extended to include subdirectories in site-packages, and then there'll be an extra subdirectory.
Other times, there will be an __init__.py file in the pythonds/basic/ directory (there likely is) that contains
from .stack import Stack
in which case the import path could be
from pythonds.basic import Stack
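A small sketch of that situation, assuming the __init__.py really does re-export Stack as suspected:
from pythonds.basic.stack import Stack as StackA   # direct module path
from pythonds.basic import Stack as StackB         # re-exported by __init__.py

assert StackA is StackB   # both names refer to the same class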
Your understanding is right.
import pythonds.basic.stack
This makes everything in the module accessible to your script under the full name pythonds.basic.stack. Whereas,
from pythonds.basic.stack import Stack
makes only the Stack class directly accessible to your script.
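In usage terms the difference looks like this:
import pythonds.basic.stack
s1 = pythonds.basic.stack.Stack()   # the full dotted name is needed

from pythonds.basic.stack import Stack
s2 = Stack()                        # only Stack is bound in your namespace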

Python - Where to paste files to import

First of all, let me tell you that I'm a new user and I'm just starting to learn Python in college, so my apologies if this question is answered in another topic, but I searched and can't seem to find it.
I received a file work.pyc from my teacher, and he says I have to import it in my Wing IDE using the command from work import *. The question is, I don't know where to put the file so it can be imported.
It just says ImportError: No module named work.
Thank you
There are several options for this.
The most straightforward is to just place it in the same folder as the py file that wants to import it.
You may also want to have a look at this
If you're using the interactive Python interpreter (the one that lets you type Python code directly and executes it), you'll have to do this:
import sys
sys.path.append('newpath')
from work import *
where newpath is the path on your filesystem containing your work.pyc file
If you're working on a script called main.py in the folder project, one option is to place it at project/work.pyc
This will make the module importable because it's in the same directory as your code.
The way Python resolves import statements works like this (simplified):
The Python interpreter you're using (/usr/bin/python2.6 for example, there can be several on your system) has a list of search paths where it looks for importable code. This list is in sys.path and you can look at it by firing up your interpreter and printing it out like this:
>>> import sys
>>> from pprint import pprint
>>> pprint(sys.path)
sys.path usually contains the path to modules from the standard library, additional installed packages (usually in site-packages) and possibly other 3rd party modules.
When you do something like import foo, Python will first look for a module called foo.py in the directory your script lives in. If it doesn't find one, it will search sys.path and try to import it from there.
As I said, this explanation is a bit simplified. The details are explained in the section about the module search path.
Note 1:
The *.pyc you got handed is compiled Python bytecode. That means its contents are binary: it contains instructions to be executed by the Python virtual machine, as opposed to the source code in *.py files that you will normally deal with.
Note 2:
The advice your teacher gave you to do from work import * is rather bad advice. It might be OK to do this for testing purposes in the interactive interpreter, but you should never do it in actual code. Instead you should do something like from work import chop, hack
Main reasons:
Namespace pollution. You're likely to import things you don't need but still pollute your global namespace.
Readability. If you ever read someone else's code and wonder where foo came from, just scroll up and look at the imports, and you'll see exactly where it's being imported from. If that person used import *, you can't do that.
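As a small illustration, using the same hypothetical names as above:
# Explicit: a reader can see at a glance where chop and hack come from.
from work import chop, hack

# With "from work import *", every public name in work lands in this
# namespace, and tracing any one of them back means opening work itself.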

Properly importing modules in Python

How do I set up module imports so that each module can access the objects of all the others?
I have a medium-size Python application with module files in various subdirectories. I have created modules that append these subdirectories to sys.path and import a group of modules, using import thisModule as tm. Module objects are referred to with that qualification. I then import that module into the others with from moduleImports import *. The code is sloppy right now and has several of these, which are often duplicative.
First, the application is failing because some module references aren't assigned. This same code does run when unit tested.
Second, I'm worried that I'm causing a problem with recursive module imports. Importing moduleImports imports thisModule, which imports moduleImports . . . .
What is the right way to do this?
"I have a medium size Python application with modules files in various subdirectories."
Good. Make absolutely sure that each directory includes an __init__.py file, so that it's a package.
"I have created modules that append these subdirectories to sys.path"
Bad. Use PYTHONPATH or install the whole structure in Lib/site-packages. Don't update sys.path dynamically. It's a bad thing. Hard to manage and maintain.
"imports a group of modules, using import thisModule as tm."
Doesn't make sense. Perhaps you have one import thisModule as tm for each module in your structure. This is typical, standard practice: import just the modules you need, no others.
"I then import that module into the others with from moduleImports import *"
Bad. Don't blanket import a bunch of random stuff.
Each module should have a longish list of the specific things it needs.
import this
import that
import package.module
Explicit list. No magic. No dynamic change to sys.path.
My current project has 100's of modules, a dozen or so packages. Each module imports just what it needs. No magic.
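As a hedged sketch of what that looks like in practice (all names here are made up):
# Assumed layout, each directory a package:
#
#   myapp/
#       __init__.py
#       db.py
#       reports/
#           __init__.py
#           monthly.py
#
# myapp/reports/monthly.py imports exactly what it needs, fully qualified:
from myapp.db import open_connection   # hypothetical helper

def build_report():
    conn = open_connection()
    return conn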
A few pointers:
You may have already split functionality into various modules. If this is done correctly, most of the time you will not run into circular import problems (e.g. if module a depends on b and b on a, you can make a third module c to remove the circular dependency). As a last resort, import b in a at module level, but import a in b only at the point where it is needed, e.g. inside a function (see the sketch after this list).
Once functionality is properly organized into modules, group them into packages under a subdirectory and add an __init__.py file to it so that you can import the package. Keep such packages in a folder, e.g. lib, and then either add that folder to sys.path or set the PYTHONPATH environment variable.
from module import * may not be a good idea. Instead, import whatever is needed, fully qualified if necessary. It doesn't hurt to be verbose, e.g. from packageA.moduleB import CoolClass.
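A minimal sketch of that last-resort trick for breaking a circular import (module and function names are made up):
# a.py -- imports b at module level
import b

def greet():
    return "hello from a"

def run():
    return b.helper()

# b.py -- defers importing a until it is actually needed
def helper():
    import a   # imported only when helper() runs, so loading b never needs a
    return a.greet()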
The way to do this is to avoid magic. In other words, if your module requires something from another module, it should import it explicitly. You shouldn't rely on things being imported automatically.
As the Zen of Python (import this) has it, explicit is better than implicit.
You won't get recursion on imports because Python caches each module and won't reload one it already has.
