I'm using Enthought Canopy to run my Python code. On one computer I want to run three different Python scripts, all completely independently (for three different instruments), simultaneously. Canopy seems to let me run only one script at a time, on a single kernel. How do I run more at the same time?
I have read in a thread about using the IPython QtConsole, but I don't know what this is, and I can't even find the file on my PC. Can anyone step me through the process of getting these scripts all running on multiple kernels?
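One option that sidesteps the single-kernel limit entirely: each script can run in its own interpreter process, with or without an IDE. A minimal launcher sketch, assuming three hypothetical instrument scripts (replace the file names with your own):

```python
import subprocess
import sys

# Hypothetical file names -- replace with your three instrument scripts.
scripts = ["instrument_a.py", "instrument_b.py", "instrument_c.py"]

# Start all three at once; each gets its own independent Python interpreter,
# so the scripts run fully in parallel with no shared kernel.
procs = [subprocess.Popen([sys.executable, path]) for path in scripts]

# Optionally wait for all of them to finish (they run concurrently meanwhile).
for proc in procs:
    proc.wait()
```

Opening three separate terminals and running `python script.py` in each achieves the same thing by hand.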
Related
I want to run multiple programs in parallel in Spyder. How do I do it? I tried deactivating "Use a single instance" under Preferences > Applications, but this didn't help.
I created a Python script to control the RGB lighting of my keyboard and my mouse, and I want it to start when I turn my computer on. I have two possible solutions, but I don't know which is best.
The first solution is to build my script into an executable and run that executable on startup.
The second solution is to run the Python script directly with Python.
The first solution is a little annoying because if I want to change the code, I need to rebuild the script. Also, when I tried to run the .exe (created with PyInstaller), it didn't work (because of the DLLs, I think).
The second solution is better if I want to change the code later, but I don't know how it would react if I ran another Python script (I don't even know whether it's possible to run two Python instances at the same time).
So if you have any idea which solution to choose, how to build a script that uses DLLs with PyInstaller, or whether I can run multiple Python instances (maybe multiple venvs), feel free to help me.
PS: I tried to be clear, but since I don't speak English very well, I'm not sure whether my issue comes across.
Using windows : https://www.geeksforgeeks.org/autorun-a-python-script-on-windows-startup/
Using Linux : follow this https://stackoverflow.com/questions/24518522/run-python-script-at-startup-in-ubuntu#:~:text=Put%20the%20command%20python%20startuptest,and%20put%20the%20command%20there
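Following the Windows link above, the usual approach is a small batch file in the Startup folder. A minimal sketch (the paths are hypothetical, adjust to your install and script; and yes, each interpreter invocation is an independent OS process, so running two Python scripts at once is fine):

```bat
@echo off
REM Place this file in the Startup folder:
REM   %APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup
REM "pythonw" runs the script without keeping a console window open.
REM The path below is a placeholder -- point it at your RGB script.
start "" pythonw "C:\Users\you\scripts\rgb_control.py"
```

This keeps the script editable in place, unlike the PyInstaller build.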
I run a complex python program that is computationally demanding.
While it is complex in terms of number of lines of code, the code itself is simple: it is single-threaded, single-process, and does not use any "external" library except colorama, installed via pip.
The program does not require much memory.
When I run it and monitor via "htop", one of the eight CPUs is used 100% by the script, and around 1.16 GB (out of 62.8 GB) of memory is used (this number remains more or less steady).
After a while (10 to 20 minutes) of running the script, my Dell desktop running Ubuntu 16.04 systematically freezes. I can move the mouse, but clicks do not work, the keyboard is unresponsive, and running programs (e.g. htop) freeze. I can only (hard) reboot. Note that the last frame displayed by htop does not show anything unexpected (e.g. no higher memory usage).
I never experience such freezes when not running the python program.
I do nothing special in parallel with the script, aside from browsing with Firefox or dealing with mail in Thunderbird (i.e. nothing that would use CPU or RAM in a significant fashion).
I have printed traces in my Python code: it never freezes at the same point.
I also watch the kernel logs in another terminal: nothing special is printed at the time of the freeze.
I do not use any IDE, and run the script directly from a terminal.
Searching for similar issues, they seem to usually be related to memory overuse, which does not appear to be my case.
I have no idea how to investigate this issue.
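One way to get data past a hard freeze is to append system stats to a file every few seconds and fsync each write, so the last lines survive the reboot. A sketch assuming Linux (uses `/proc/meminfo` and `os.getloadavg`; the log file name is arbitrary):

```python
import os
import time

def snapshot():
    """Return a one-line summary of load average and available memory."""
    load1, _load5, _load15 = os.getloadavg()
    mem_avail_kb = None
    # /proc/meminfo is Linux-specific.
    try:
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemAvailable:"):
                    mem_avail_kb = int(line.split()[1])
                    break
    except OSError:
        pass
    return f"{time.time():.0f} load={load1:.2f} mem_avail_kb={mem_avail_kb}"

# Append and fsync so the last line survives a hard reboot.
with open("freeze_log.txt", "a") as log:
    log.write(snapshot() + "\n")
    log.flush()
    os.fsync(log.fileno())
```

In real use this would sit in a `while True: ...; time.sleep(5)` loop in a second terminal; after the reboot, the last logged lines show the system state just before the freeze.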
I have written a Python 3 script that runs 24/7. The way I run it on my Windows machine is the following: I right-click the .py file, then click "Edit with IDLE" and then "Run". The script itself has no issues but, due to the many instructions printed in the Python shell (I use a logger), after a couple of days the shell gets very heavy. My newbie question is the following: is there a way to limit the number of rows temporarily kept in the Python shell to a specific number? Or perhaps somebody has a better suggestion for running this constantly running script, which prints a lot of its steps to the shell? Please note that I'm not asking how to run a script 24/7; it's my understanding the best way to do that is through a VPS. My problem is that the data shown in the Python shell gets bigger and bigger every day, so I only wonder how to limit the data temporarily displayed/kept in it. Thanks
I have a very large script which I developed using Spyder. I have friends who are trying to run it using the standard Anaconda console / command prompt. What is interesting is that the script works between computers if run in Spyder, but completely freaks out if run in the Anaconda console, command prompt, or even IPython outside of Spyder. When I say "freak out", I mean it randomly starts again at the beginning of the script, reloads packages, spams the user interface text a few times, then randomly crashes. There is never an error code. There is no while loop, for loop, or anything else which would cause it to restart randomly like this. I have had a number of Python programmers look at the code directly, and they cannot understand either why it would even attempt to restart.
I have isolated the issue to the following line:
results = pd.DataFrame(MOR.predict(dat), columns = scores)
In this case, MOR is an imported pickled sklearn multi-output xgboost model. dat is a DataFrame with a couple of rows and the same required columns as the xgboost model. scores is simply a list of names whose length equals the number of output variables from the sklearn model.
Any ideas as to why this script would run perfectly in Spyder but not the console? Note, it's clear that this likely will not have a definitive answer, but any speculation as to why Spyder would behave differently from these other consoles is welcome.
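One plausible (hedged) explanation for "restarts from the top on Windows, but not in Spyder": process spawning. Multiprocessing on Windows starts fresh interpreters that re-import the main script, and joblib-backed parallelism inside sklearn/xgboost code paths can trigger this depending on `n_jobs`; Spyder's IPython kernel can mask the symptom. The standard fix is a main guard. A minimal sketch with a hypothetical `main()` (the real script's model loading and `MOR.predict` call would go inside it):

```python
import multiprocessing as mp

def square(x):
    # Stand-in for parallel work (e.g. what joblib workers would do).
    return x * x

def main():
    # Code that must run only once goes here: in the original script,
    # loading the pickled model, building `dat`, calling MOR.predict, etc.
    with mp.Pool(2) as pool:
        results = pool.map(square, [1, 2, 3])
    print(results)

if __name__ == "__main__":
    # Without this guard, each spawned worker on Windows re-executes the
    # whole file from the top -- matching the "restarts randomly" symptom.
    main()
```

If the large script has no `if __name__ == "__main__":` guard around its top-level code, that would be the first thing to try.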