I am making a web project. For it I have summary scripts and crawling scripts, and now I want to link these two scripts together so that both run automatically as a single project.
For context, I am using PyCharm.
I have one file, Summery.py, and a second file, crawler.py; both are separate modules. I want to link them into one program.
You could use import to make the functions in one script callable from the other.
E.g. within crawler.py, add import Summery and call its functions as Summery.function_name().
This is handled through the import functionality of Python. You can do something like:
crawler.py
from Summery import * # runs Summery.py once and imports all its public top-level variables and functions
Or use the more targeted form:
from Summery import myFunction # get just one function
It works the other way around too:
Summery.py
from crawler import * # you don't include the '.py'
(Note, though, that importing each file from the other creates a circular import, which can raise errors; one-directional imports are safer.)
If they are in different directories, you do
from MainPyCharmFolder.SubFolder.DeepFolder.Summery import myFunction
which lets you navigate your folder structure (note that each folder along the way generally needs an __init__.py file so Python treats it as a package).
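As a minimal sketch of the idea (summarize and the sample text are made-up names for illustration; in a real project the two pieces would live in the separate files indicated by the comments):

```python
# --- contents of Summery.py (hypothetical) ---
def summarize(text):
    """Return the first sentence as a crude summary."""
    return text.split(".")[0] + "."

# --- contents of crawler.py ---
# from Summery import summarize   # note: no '.py' suffix in the import
page_text = "Python links scripts via imports. More details follow."
result = summarize(page_text)
print(result)
```

Once the import is in place, the function from Summery.py behaves exactly like one defined locally in crawler.py.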
Related
I have the following file. It contains several functions that I use frequently. Instead of writing the same functions over and over in every project, I think it would be more efficient to include the file in each project.
I want to include functions inside this file:
However, I don't know what this technique is called, and I am not sure whether it is possible to make it work.
tools is not found:
Here's the folder that I want to include:
I have created __init__.py as explained in https://stackoverflow.com/a/4116384/17869806
If you are using a Jupyter notebook, create a new one in the folder ivm_github_python with this content:
from tools import ivm_string_print
ivm_string_print.myfunction()
In this case, myfunction is defined in ivm_string_print.py.
So in the main file of each project you can do import filename at the start to load it as a module.
Note: the file you want to import should be in the same folder as your main file. Suppose the file you want to import is named "functions.py": when you import it, write import functions, not import functions.py.
I want to save some classes and functions that I have written for neural networks, so that I can reuse them in the future whenever I need them. Is there a way to save functions and classes in some library?
So more precisely, I'm looking for a library (let's call it tools), so that I can do:
save my_function in tool
...
from tool import my_function
The way to do that in Python is simply to save your functions in a separate Python file (also called a module; see the official docs).
In your case the custom function code could be saved as the file tool.py.
You can then use the syntax you mentioned:
from tool import my_function
to import that specific function, but only if the file tool.py is actually in the same directory as the session you are importing it into (this is an easy way to have the module on your module search path; see the official documentation).
If you want to use the module from another directory, you can append the path where you saved tool.py to sys.path:
import sys
sys.path.append('/usr/dir/customcode/')
Then you can from tool import my_function in the same session, if you have tool.py saved in the directory /usr/dir/customcode/.
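A short sketch of the sys.path approach (the directory below is just a placeholder for wherever you saved tool.py):

```python
import sys

# Placeholder: the directory that contains tool.py
custom_dir = '/usr/dir/customcode/'

# Appending it makes Python also search this directory on import:
if custom_dir not in sys.path:
    sys.path.append(custom_dir)

# After this, `from tool import my_function` would work in the same
# session, provided tool.py really exists at that location.
print(custom_dir in sys.path)  # -> True
```

Note that this change only lasts for the current session; for a permanent solution, installing the module or setting PYTHONPATH is more common.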
I have a file structure like the following:
config.py
main.py
some_other_file.py
where config.py contains easily accessible parameters but not much code otherwise. These should be accessible to all other code files. Normally import config would do, but in this case the python script is called externally from another program, and therefore the root calling directory is not the same as the one the files are located at (so just an import results in an exception since it does not find the files).
Right now, the solution I have is to include into my main.py file (the one that is directly called by the third program) the following:
code_path = "Path\\To\\My\\Project\\"
import sys
sys.path.insert(0, code_path)
import config
import some_other_file
...
However, this means having to modify main.py every time the code is moved around. I could live with that, but I would certainly like having one single, simple file with all necessary configuration, not needing to dig through the others (especially since the code may later be passed onto others who just want it to work as effortlessly as possible).
I tried having the sys.path.insert inside the config file, and having that be the file directly called by the external script (with all other files imported from there). However, I run into the problem that when the other files do import config, it results in an import loop, since they are also being imported from config.py.
Typically, I believe, this is solved by running the imports in config.py only once, guarded by something like if __name__ == "__main__": (see below). This does not work in my case: the script never enters the if statement, possibly because it is called as a sub-routine by a third program and is not the main program itself. As a result, I have no way of enforcing that a portion of the code in config.py is executed only once.
This is what I meant above for config.py (which does not work for my case):
... # Some parameter definitions
if __name__ == "__main__":
code_path = "Path\\To\\My\\Project\\"
import sys
sys.path.insert(0, code_path)
import main # Everything is then executed in main.py (where config.py is also cross-imported)
Is there any way to enforce that the portion of code inside the if above is executed only once, even if cross-imported, but without relying on __name__ == "__main__"? Or any other way to handle this at all, while keeping all parameters and configuration data in one single, simple file?
By the way, I am using IronPython for this (not exactly a choice), but since I am sticking to hopefully very simple stuff, I believe it is common to all python versions.
tl;dr: I want a config.py file with a bunch of parameters, including the directory where the program is located, to be accessible to multiple .py files. I want to avoid needing the project directory written in the "main" code file, since that should be a parameter and therefore only in config.py. The whole thing is passed to and executed by a third external program, so the directory where these files are located is not the same as where they are called from (therefore the project directory has to be included to system path at some point to import the different files).
A possible design alternative that is fairly common would be to rely on environment variables configured in a single file. Your multi-program system would then be started with a run script, and your Python application would use something along the lines of os.environ[…] to get/set/check the needed variables. Your directory would then look something along the lines of:
.
.
.
.env (environment variables - doesn't have to be called .env)
main.py
run.sh (starts system of programs - doesn't have to be called run.sh)
.
.
.
For the run script, you could then "activate" the environment variables and, after, start the relevant programs. If using bash as your terminal:
source .env # "activate" your environment variables (the file should contain lines like: export MY_VAR=value)
# Then the command to start whatever you need to; for example
#
# python main.py
# or
# ./myprogram
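On the Python side, the application could then read the variables via os.environ. PROJECT_DIR below is a hypothetical variable name that your .env file would export:

```python
import os
import sys

# The run script would export something like:  export PROJECT_DIR=/path/to/project
# We set a default here only so this sketch is self-contained:
os.environ.setdefault("PROJECT_DIR", "/path/to/project")

# Read the configured directory and put it on the module search path,
# so `import config` etc. work regardless of the caller's directory:
project_dir = os.environ["PROJECT_DIR"]
if project_dir not in sys.path:
    sys.path.insert(0, project_dir)

print(project_dir)
```

This keeps the project path out of main.py entirely: only the run script and the environment file need to know where the code lives.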
I am doing an online Python course and am currently learning about modules and packages, but I can't seem to get my head around the logic. I have used modules (or maybe they were packages, who knows?) before in my current job, but without really knowing what was happening when I would do import pandas, for example.
By following the course I have created a folder called "MyMainPackage" and in this folder there is another folder called MySubPackage.
In "MyMainPackage":
__init__.py and my_main_script.py; in my_main_script.py there is a function called main_func()
in "MySubPackage":
__init__.py and my_sub_script.py; in my_sub_script.py there is a function called sub_func()
In Sublime Text editor I wrote a script called myprogram2.py:
import MyMainPackage
MyMainPackage.my_main_script.main_func()
However, this hasn't worked; when I run python myprogram2.py in the Windows Command Prompt, I get the following message:
AttributeError: module 'MyMainPackage' has no attribute 'my_main_script'
However, as per the online lecture, what does work is this:
from MyMainPackage import my_main_script
from MyMainPackage.MySubPackage import my_sub_script
my_main_script.main_func()
my_sub_script.sub_func()
Why can't I just import the entire package and access the modules the way I tried above, instead of the way the online lecture does it? I thought it would be the same thing; I am just struggling to understand the logic.
I'm a Java developer new to Python. In Java, you can access all classes in the same directory without having to import them.
I am trying to achieve the same behavior in python. Is this possible?
I've tried various solutions, for example importing everything in one file which I then import everywhere. That works, but I have to type myClass = rootFolder.folder2.folder3.MyClass() every time I want to access a foreign class.
Could you show me an example for how a python architecture over several directories works? Do you really have to import all the classes you need in each file?
Imagine that I'm writing a web framework. Will the users of the framework have to import everything they need in their files?
Put everything into a folder (the name doesn't matter), and make sure that folder has a file named __init__.py (the file can be empty).
Then you can add the following line to the top of your code:
from myfolder import *
That should give you access to everything defined in that folder without needing to give the prefix each time.
You can also have multiple depths of folders like this:
from folder1.folder2 import *
Let me know if this is what you were looking for.
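Here is a self-contained sketch of that layout (the package is built in a temporary directory only so the example runs as-is; myfolder, helpers, and greet are made-up names):

```python
import os
import sys
import tempfile

# Build the folder structure described above: myfolder/ with an __init__.py.
pkg_root = tempfile.mkdtemp()
pkg_dir = os.path.join(pkg_root, "myfolder")
os.mkdir(pkg_dir)

# Re-exporting names in __init__.py is what lets a wildcard import
# expose them without a prefix:
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("from .helpers import greet\n")
with open(os.path.join(pkg_dir, "helpers.py"), "w") as f:
    f.write("def greet():\n    return 'hello'\n")

sys.path.insert(0, pkg_root)
from myfolder import *  # greet is now available without any prefix

print(greet())  # -> hello
```

In a real project you would of course keep the files on disk permanently rather than generating them; the key pieces are the __init__.py re-exports and the wildcard import.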