Importing functions from an external ipynb file works as intended if it is in the same directory. However, if I have a project with multiple directories, I would like to be able to import functions from other directories. I understand that it may be easier to use another IDE for this purpose, however, I am documenting my work and would like to continue leveraging Jupyter Notebook's seamless Markdown integration.
I am not seeing anything in the ipynb.fs documentation here, and the answers to "ipynb import another ipynb file" and "import a function from another .ipynb file" all assume that the notebooks are in the same directory.
This is normally what I would do to import a function from an external ipynb file in the same directory:
from ipynb.fs.full.my_functions import split_into_sentences
I've moved my_functions into a functions/ subdirectory and tried this:
from ipynb.fs.full."functions/my_functions" import split_into_sentences
and
from ipynb.fs.full.functions/my_functions import split_into_sentences
Neither works.
Is there any workaround to this?
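One workaround that might be worth trying (a sketch only; it assumes the ipynb loader resolves notebook names through sys.path like a regular import, which I would verify first) is to put the subdirectory on the search path before importing:

import sys
sys.path.append('functions')  # assumption: ipynb.fs looks up notebooks via sys.path

from ipynb.fs.full.my_functions import split_into_sentences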
I have created two new python notebooks for databricks in /workspace/Shared/Notebooks.
I would like to share some functions between both notebooks, so I have created a Python file containing a few generic functions.
Is there a way to import my functions into both notebooks?
Thanks for your help.
It is really easy to share Python code between .ipynb files.
Say you have the following files:
source.py
dest_1.ipynb
dest_2.ipynb
With the following contents:
source.py
a = "hello from source"
dest_1.ipynb and dest_2.ipynb
import source
print(source.a)
And you can simply run cells in your notebooks.
The main point is that your source file has to be located in the same folder as the notebooks.
There are two different methods to achieve this:
Use the %run <another_notebook> magic to include the content of another notebook in the current one (doc)
If you use Databricks Repos, it has support for so-called "Files in Repos" - in this case you can use Python or R files (not notebooks!) as Python or R modules, so for Python you can just do import some_file.
Unit tests in this demo repository show both approaches; a rough sketch is also given below.
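As a minimal sketch (the notebook and module names here are made up for illustration), the two approaches look like this from the calling notebook:

# Approach 1: %run pulls another notebook's definitions into the current one;
# in Databricks it goes in a cell of its own:
# %run ./shared_notebook

# Approach 2: with "Files in Repos", a plain shared_utils.py stored next to
# the notebooks can be imported as a regular Python module:
import shared_utils
shared_utils.some_helper()  # hypothetical function defined in shared_utils.py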
I'm building a pyd file via pybind11. There's no Python-only code in the project (not even in the final wheel). I'm packing this prebuilt pyd into a wheel package. It's all working well so far, but I've got one small issue. Every time I want to use the functions from the pyd file (which is actually the primary "source" of code) I have to type the following: from python_lib_name.pyd_name import pyd_exported_class, for example from crusher.crusher import Shard. This looks pretty lame; is there any way to prevent this and let the developer simply use the from crusher import Shard syntax?
Also, I'm not sure what information to provide; obviously, I wouldn't like to throw the whole project at you unnecessarily. I'm happy to provide any details by editing.
How about this: put your pyd in a folder and include an __init__.py that imports the stuff from the pyd:
my_project
|---my_pyd_lib
| |---my_pyd_lib.pyd
| |---__init__.py
|---#the other stuff
and in that init just put
#__init__.py
from .my_pyd_lib import *
The dot in the import makes it a relative import, which ensures you import the module that lives in the same folder as the __init__.py. It also guarantees you import your own module even if it happens to share a name with some other package or module you have installed.
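Applied to the names from the question, the installed package would look roughly like this (a sketch of the layout, not the exact wheel contents):

crusher
|---__init__.py      #contains: from .crusher import *
|---crusher.pyd      #the pybind11-built extension

and the developer can then write

from crusher import Shard

instead of from crusher.crusher import Shard.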
A pyd file is importable as long as the directory containing it is on Python's module search path. For example, if your file is called foo.pyd and is located in the current directory, you can load it with a plain import:
import foo
The interpreter will load the module from foo.pyd just as it would a normal .py file. You can also import pyd files that live in other directories by adding those directories to sys.path or to the PYTHONPATH environment variable.
Another way to get a shorter import is to re-export the pyd's contents from a plain .py module or a package __init__.py, as shown in the other answer.
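A minimal sketch of the sys.path route (the directory path is a hypothetical placeholder):

import sys

# hypothetical folder that contains foo.pyd
sys.path.insert(0, r"C:\libs\folder_with_pyd")

import foo  # now resolves from that folder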
In my project, no matter where I am or where the file I'm trying to import from is, I have to specify the full path, which is fine.
For example, I have to write import protos.example as example even when I'm already in the protos directory. So when the files are generated, they naturally just say import example_pb2 as example__pb2, but because I'm forced to use the full path, that natural way of generating them doesn't work; it has to be import protos.generated.example_pb2.
How can I change this to where it automatically searches the current directory before needing to specify the location?
Found the fix. I'm using PyCharm; I'm not sure how this works in other IDEs. Right-click the folder that holds the generated files and mark it as a sources root. That solves the generated files not being able to see each other. Then, in any file where you reference the generated files, put
import sys
sys.path.append(r'generated')
and then your imports should work as expected.
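If the relative 'generated' path ends up depending on the working directory, a slightly more robust variant (a sketch, assuming the generated folder sits next to the importing file) is:

import os
import sys

# resolve generated/ relative to this file rather than the current working directory
sys.path.append(os.path.join(os.path.dirname(__file__), 'generated'))

import example_pb2 as example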
Looking at this behave tutorial, I find that in the file features/steps/step_tutorial06.py, if I use from company_model import CompanyModel as in the example, I get Unresolved reference 'company_model', but if I use from features.steps.company_model import CompanyModel it works. Why is this, and is there any way around it?
This is in PyCharm.
The fix is a relative import. The error appears because PyCharm launches Python from the project directory and not from the directory you are working in.
However, to get rid of the long from features.steps.company_model import CompanyModel, you can use from .company_model import CompanyModel, since both files are in the same directory.
This is because the project structure in PyCharm starts from the folder that contains features; hence the import appears in that format.
I can't figure out how to import a custom class in Python with a Jupyter notebook in Anaconda.
In my work folder I have a file user.ipynb that contains a class named User. In another file in the same folder, I try to import this class with: from user import User.
I get this error: ImportError: No module named user.
I tried to create an __init__.py file and an __init__.ipynb file in this folder, but it doesn't work.
Do you know how I could do this?
Thank you in advance
Python modules are either files named *.py or directories with an __init__.py in them. user.ipynb is neither; it's an IPython (Jupyter) notebook, so it is not a module and you can't import it.
The easiest way to get this to work is to convert the notebook to a .py file (this can be done using the export function) and then you'll be able to import it.
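For example, from a terminal in the work folder (using Jupyter's standard script export; the notebook name is taken from the question):

jupyter nbconvert --to script user.ipynb   # writes user.py next to the notebook

Then, in the other notebook:

from user import User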
An alternative is to provide code that lets you import notebooks directly; such code is available here. Looks like a lot of bother, though. I'm not sure why they didn't just bake this in; it seems like it would be useful.