I have a bash script with embedded Python code that imports the modules pyVim and pyVmomi.
I usually run bash scripts from a remote server like this:
bash <(curl http://remote_server/script)
But if I run my script as above, I get:
"ImportError: No module named pyVim"
I know that if I copy the modules locally it will work, but I wonder:
can I run the script remotely while the modules exist only on the remote server?
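For what it's worth, `bash <(curl …)` only fetches the *text* of the script; the embedded Python still starts on the local machine, so imports are resolved against the local interpreter and its site-packages. A minimal sketch of that behaviour (using `cat` on a temp file in place of `curl`, and `python3`, both stand-ins):

```shell
# Write a stand-in for the remote script (cat replaces curl here).
cat > /tmp/demo_script.sh <<'EOF'
python3 - <<'PY'
import sys
print(sys.executable)   # the LOCAL interpreter, not the server's
PY
EOF
# Run it exactly like the remote invocation: the Python that executes
# (and therefore the module search path) is local.
bash <(cat /tmp/demo_script.sh)
```

So the modules have to be importable locally; for these particular ones, `pip install pyvmomi` should provide both pyVim and pyVmomi.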
Related
On a Windows machine where I have no admin rights, I've created a Python script which loads a module that is only available inside the Spyder development environment. To run it, I have to open Spyder and select the script and the working directory before actually running the program.
I'd like to automate those steps. The ideal solution would be to run the script from a console (in the correct directory) while loading the Python modules available in Spyder.
Something like:
spyder.exe myPythonScript.py
Any idea about how to do it?
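One thing Spyder does for you is put its extra module directories on the Python path, and you can often replicate that yourself by setting PYTHONPATH before invoking the script; on Windows that would be a `.bat` launcher containing `set PYTHONPATH=...`. A POSIX sketch of the idea (the directory and module name below are made-up stand-ins for whatever Spyder actually exposes):

```shell
# Stand-in for "a module only visible inside Spyder": put a dummy module
# in a directory that plain `python` does not search by default.
MOD_DIR=/tmp/spyder_only_modules          # assumption: the real path differs
mkdir -p "$MOD_DIR"
printf 'VALUE = 42\n' > "$MOD_DIR/spyder_only_module.py"

# Prepend that directory to PYTHONPATH for this one invocation, just as a
# launcher .bat would do with `set PYTHONPATH=...` on Windows.
PYTHONPATH="$MOD_DIR" python3 -c 'import spyder_only_module; print(spyder_only_module.VALUE)'
```

Finding the real directory is the only Spyder-specific step; once you know it, the launcher needs no admin rights.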
I'm using VS Code to debug and write C++ on a remote compute cluster. If I have the Anaconda module loaded, then when I run GDB I get the following error:
ImportError: No module named site
If I run GDB with the following command instead, it works fine:
PYTHON_LIBRARY=/usr/lib64/python2.7 PYTHONPATH=/usr/lib64/python2.7:/usr/lib64/python2.7/lib-dynload:/usr/lib64/python2.7/site-packages gdb
How do I add these paths to the launch.json config? I've already tried just adding those paths to PYTHONPATH or PYTHON_LIBRARY globally, but if I do that I break the IPython I use regularly.
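The cppdbg debug configuration accepts per-configuration environment variables via its "environment" array, which keeps them out of your interactive shell entirely. A sketch of such a launch.json entry (the paths are copied from the working command above; "program" and the configuration name are placeholders):

```json
{
    "name": "gdb on cluster",
    "type": "cppdbg",
    "request": "launch",
    "program": "${workspaceFolder}/a.out",
    "cwd": "${workspaceFolder}",
    "MIMode": "gdb",
    "environment": [
        { "name": "PYTHON_LIBRARY", "value": "/usr/lib64/python2.7" },
        { "name": "PYTHONPATH",
          "value": "/usr/lib64/python2.7:/usr/lib64/python2.7/lib-dynload:/usr/lib64/python2.7/site-packages" }
    ]
}
```

Because the variables live only inside this debug configuration, the IPython you run from a normal terminal is unaffected.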
I have created a Bamboo task which runs Python code from a Bitbucket repo.
Bamboo config:
I am running the script as a file.
I selected Shell as the interpreter and put this in the Script Body to execute the script: python create_issue.py -c conf.yml
After I click 'Run Plan', the build fails with ImportError: No module named pandas. The rest of the imports work fine (requests, itertools, etc.).
It sounds like you are running this Bamboo plan on the agent host and not in a Docker container. As such you will need to:
Remote/log into the Bamboo server
Use pip or another package tool to install pandas and any other missing third-party modules (itertools is part of the standard library, which is why it already works)
Alternatively, you could set up an isolated Docker image that has all these dependencies and build within that.
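Before installing anything on the agent, it can help to confirm *which* interpreter the Script Body actually invokes, since pip may have installed pandas for a different one. A small diagnostic you could drop into the Script Body (a sketch; `python3` stands in for whatever `python` resolves to on the agent):

```shell
python3 - <<'PY'
import importlib, sys
print(sys.executable)               # the interpreter the build really uses
try:
    importlib.import_module("pandas")
    print("pandas OK")
except ImportError:
    print("pandas missing")         # then install for THIS interpreter,
                                    # e.g. <that path> -m pip install pandas
PY
```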
I have a GitHub Python project with two scripts. I can run one script in Jenkins using the shell command python script_1_name.py (this script doesn't import any external packages). The other script imports the external packages jenkinsapi and jenkins. When I try to run it in Jenkins with the shell command python script_2_name.py, I get the error below.
22:29:36 File "createJenkinsJobs.py", line 1, in <module>
22:29:36 from jenkinsapi.jenkins import Jenkins
22:29:36 ImportError: No module named jenkinsapi.jenkins
22:29:36 Build step 'Execute shell' marked build as failure
22:29:36 Finished: FAILURE
Since I have these packages installed in a virtual environment, the script runs successfully on my local machine. How do I execute it from a Jenkins job?
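The 'Execute shell' step runs with the system interpreter, which knows nothing about the virtual environment used locally. A sketch of a build step that creates and activates a venv inside the workspace (`--without-pip` is only so this demo runs offline; in the real step you would keep pip and install jenkinsapi before running the script):

```shell
# Create a virtualenv in the Jenkins workspace and activate it; after
# activation, `python` resolves inside the venv, where pip-installed
# packages such as jenkinsapi would live.
python3 -m venv --without-pip demo_venv
. demo_venv/bin/activate
python -c 'import sys; print(sys.prefix)'   # a path inside demo_venv
# In the real job:  pip install jenkinsapi && python script_2_name.py
```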
I have used set PYTHONPATH=%PYTHONPATH%;directory_containing_modules to make some modules globally visible.
I have a script that imports the modules with import my_module.
When I run the script from the IPython shell (run my_script.py), I get no errors and the script runs as intended, but when I run it from the Windows command prompt with python my_script.py I get the error:
ImportError: No module named my_module
I checked with pwd that both use the same working directory.
Remember that you can dynamically change the module search path from within your script, using sys.path or the site module, so you may want to add the directory there.
Or you could write a BAT or Python launcher script that sets PYTHONPATH before starting your script.
Or you can edit the Windows environment variables directly (under System Properties, reachable with Win+Break).
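A sketch of the first option; the directory path and module contents below are made-up stand-ins for directory_containing_modules and the real my_module:

```shell
# Create a module in a directory Python does not search by default.
mkdir -p /tmp/extra_modules
printf 'GREETING = "hello"\n' > /tmp/extra_modules/my_module.py

python3 - <<'PY'
import sys
sys.path.append("/tmp/extra_modules")   # extend the search path at runtime
import my_module                        # now resolvable from any shell
print(my_module.GREETING)
PY
```

Because the path is appended inside the script itself, it works the same from IPython, cmd, or a scheduler, with no environment-variable setup at all.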