Linux/Python: Monitor /proc/acpi files without polling?

Is there any way to monitor /proc files, such as
/proc/acpi/battery/BAT0/state
/proc/acpi/ac_adapter/ADP0/state
in a non-polling fashion, similar to inotify on a normal filesystem?
I want to do this in a PyGTK app, so I tried using PyGObject's gio.FileMonitor, but no dice. A Python solution that plays well with gtk.main() would be ideal.

You can probably get the information you want by listening to ACPI events. Preferably not directly (/proc/acpi/event), but via acpid or another higher-level interface.
Update: the other, higher-level interface is the D-Bus interface provided by DeviceKit-power / UPower.
Files in /proc are not regular files; they are a thin interface to kernel state, so many facilities for regular files (inotify included) won't work on them.
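For the D-Bus route, a minimal sketch with dbus-python might look like the following. It assumes a UPower version that emits the standard org.freedesktop.DBus.Properties.PropertiesChanged signal (older DeviceKit-power/UPower releases emit a Changed signal on org.freedesktop.UPower.Device instead), and the battery object path shown is only a typical example; enumerate the devices to find the real one on a given machine.

    import dbus
    import gobject
    from dbus.mainloop.glib import DBusGMainLoop

    # Hook dbus-python into the GLib main loop; in a PyGTK app, gtk.main()
    # drives this same loop, so no extra thread or polling is needed.
    DBusGMainLoop(set_as_default=True)
    bus = dbus.SystemBus()

    def on_properties_changed(interface, changed, invalidated):
        # 'Percentage' and 'State' are standard UPower device properties.
        if 'Percentage' in changed:
            print('battery now at %s%%' % changed['Percentage'])

    bus.add_signal_receiver(
        on_properties_changed,
        signal_name='PropertiesChanged',
        dbus_interface='org.freedesktop.DBus.Properties',
        # Example path only; call org.freedesktop.UPower.EnumerateDevices
        # to discover the actual battery object path.
        path='/org/freedesktop/UPower/devices/battery_BAT0',
    )

    # Standalone test loop; in the PyGTK app this is simply gtk.main().
    gobject.MainLoop().run()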

Related

How to port/embed python for arm based devices?

TL;DR: how hard is it to port Python to a new OS?
I want to use Python to write applications for Verifone's VX 680. These are 32-bit ARM-based devices with 128+ MB of RAM. http://www.verifone.com/media/4300697/vx680_ds_ltr.pdf
My idea is to write a C application which calls the Python interpreter; my application itself would be a bunch of Python modules. The app needs to show a graphics-rich UI, send HTTPS messages, and access peripherals (e.g. WiFi radio, PinPad, thermal printer). Despite my investigation, I'm still completely lost.
What is the list of things I need to address in order to be able to write python applications in this device?
I have personally ported CPython to my own operating system; the real problem was the lack of cross-compiling support. I found patches for 2.5.1 to make it cross-compile cleanly.
Once it was compiling cleanly, I just needed to provide a fairly minimal set of system calls to get it working. For anything serious, at least a read-only filesystem is a must. In any case, if your libc is POSIXish, you shouldn't have too many problems getting started.
The set of system calls I had in the beginning was exit, open, close, read (for the console and files), write (to file descriptors 1 and 2 only), stat, fstat and sbrk (for changing the heap size). I used the newlib C library with libgloss; everything that wasn't mapped to these just returned error values or defaults.

auto detect file in a folder

Sorry wasn't sure how to best word this question.
My scenario is that I have some Python code (on a Linux machine) that takes its arguments from an XML file to perform a task; on completion of the task it disposes of the XML file and waits for another XML file to arrive, then does it all over again.
I'm trying to find the best way to be alerted when an XML file has arrived in a specified folder.
One way would be to continually poll the folder from the Python code, but that would waste resources while waiting for something to turn up (which may be as little as a few times a day). Another way would be to set up a cron job, but its efficiency wouldn't be any better than polling from within the code. The option I was hoping for is some sort of interrupt that would alert the code when an XML file appears.
Any thoughts?
Thanks.
If you're looking for something "easy" to just run a specific script when new files arrive, the incron daemon provides a very handy combination of inotify(7) and cron(8)-like support for executing programs on demand.
If you want something a little better integrated into your application, or if you can't afford the constant fork(2) and execve(2) of the incron approach, then you should probably use the inotify(7) interface directly in your script. The pyinotify module can integrate with the underlying inotify(7) interfaces.
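For reference, a bare-bones pyinotify sketch for this case could look like the one below; the watch directory and the handle_xml callback are placeholders standing in for the existing task code.

    import pyinotify

    WATCH_DIR = '/path/to/incoming'   # placeholder: folder the XML files arrive in

    def handle_xml(path):
        # placeholder for the existing "read arguments, do the task" code
        print('new job file: %s' % path)

    class XmlHandler(pyinotify.ProcessEvent):
        def process_IN_CLOSE_WRITE(self, event):
            # fires when a writer finishes and closes the file,
            # which avoids picking up a half-written XML document
            self._maybe_handle(event)

        def process_IN_MOVED_TO(self, event):
            # fires when a finished file is moved into the folder
            self._maybe_handle(event)

        def _maybe_handle(self, event):
            if event.pathname.endswith('.xml'):
                handle_xml(event.pathname)

    wm = pyinotify.WatchManager()
    wm.add_watch(WATCH_DIR, pyinotify.IN_CLOSE_WRITE | pyinotify.IN_MOVED_TO)
    notifier = pyinotify.Notifier(wm, XmlHandler())
    notifier.loop()   # blocks without polling until the kernel reports an event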

Make new software for Windows 95 on 486 machines, what to use?

I have a VB6 application running on a number of old 486 Windows 95 machines and sometimes the application is upgraded. The only way to accomplish this today is to use Hyperterminal and send the file over a null modem cable. Replacing the machines is not an option at this point.
I want to write an application that can take care of transferring the updated app over the null modem cable without rewriting the VB6 app. This means I'm free to use anything I see fit. What alternatives are there?
These are the ones I can think of but I'd like to know if I'm wrong and any pros/cons. Also, I'd prefer to avoid C/C++ if at all possible.
Python with py2exe
Another VB6 app
C/C++
Edit: Some clarifications after reading the comments:
I want to make the process as easy as possible; today we have to remove and dismantle the computer, connect a keyboard and then fire up Hyperterminal to get going. That's why I want something more automatic. I'm open to suggestions of existing solutions, but given the specific needs I didn't think there were any.
Some of the computers have no Ethernet either, so the solution needs to work over RS-232.
And again: Replacing the machines is not an option at this point. Just trust me on this.
If you must use a null modem, how about the built-in serial-line networking support?
Then you can just use normal network methods (psexec, file shares, etc.) to do the update.
Network Method
I would first get network cards installed in everything. If you want something you can just plug in and go, look for any card compatible with the NE2000. That card will work out of the box on Windows 3.11 and 95, no problem. In particular, you can find the 3Com EtherLink II or the 3C509B very cheap online. It's an excellent card. (The Google Shopping results list several for under $20.)
From there, just enable the Windows File/Print Sharing service over TCP/IP, and you're good to go! After you've done this, you can remotely manage and upgrade these machines, saving you a lot of headache later on.
Serial-Port Method
Deanna's suggestion of using the serial port as a network device and transferring files normally will work as well. However, there is a bit of setup involved, and it can be a hassle if you've never done it. There are several other software options; I recommend LapLink. It's fairly painless.
You could even go all out and pick up a multi-port serial interface fairly cheaply these days, and manage these computers centrally. RS-232 is very robust and can go a long distance over the proper cabling.
Networking over Ethernet is the way to go though. If at all possible, choose that option.
We've got a few testing laboratories in a similar situation: the labs make money for the company, so no touching the ancient computers that run the tests, under pain of death. :-)
Anyway, pySerial seems like it'd work for this application; maybe take a look at their wxPython examples for some ideas on a GUI.
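As a rough illustration of the sending side with pySerial (port name, speed and file name below are placeholders, and any framing or checksumming the receiving end expects still has to be added):

    import serial

    PORT = '/dev/ttyS0'          # placeholder; e.g. 'COM1' on a Windows sender
    UPDATE_FILE = 'update.exe'   # placeholder file to push over the null modem cable

    # Conservative settings for old hardware: 9600 8N1 with hardware flow control.
    link = serial.Serial(PORT, baudrate=9600, rtscts=True, timeout=5)
    try:
        with open(UPDATE_FILE, 'rb') as f:
            while True:
                chunk = f.read(1024)
                if not chunk:
                    break
                link.write(chunk)
    finally:
        link.close()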
I guess the answer is pretty simple: if you are happy using VB6 and the other app is already VB6, then use it.
It will do whatever serial comms you require quite adequately. Remember, though, that you may later want to update the application you write to do the updating, in which case you are back to using Hyperterminal!

Complete log management (python)

Similar questions have been asked, but I have not come across an easy way to do it.
We have application logs of various kinds which fill up the disk, and we run into other unwanted issues because of them. How do I write a maintenance script (zipping files over a particular size, moving them, watching them, etc.) for this? I am looking for a simple solution (as in, what should I use?), if possible in Python, or maybe just a shell script.
Thanks.
The "standard" way of doing this (atleast on most Gnu/Linux distros) is to use logrotate. I see a /etc/logrotate.conf on my Debian machine which has details on which files to rotate and at what frequency. It's triggered by a daily cron entry. This is what I'd recommend.
If you want your application itself to do this (which is a pain really since it's not it's job), you could consider writing a custom log handler. A RotatingFileHandler (or TimedRotatingFileHandler) might work but you can write a custom one.
Most systems are by default set up to automatically rotate log files which are emitted by syslog. You might want to consider using the SysLogHandler and logging to syslog (from all your apps regardless of language) so that the system infrastructure automatically takes care of things for you.
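If you do decide to handle rotation inside the application, a RotatingFileHandler setup looks roughly like this (logger name, log path and size limits below are just placeholders):

    import logging
    from logging.handlers import RotatingFileHandler

    logger = logging.getLogger('myapp')          # placeholder logger name

    # Rotate at about 10 MB, keeping 5 old files: myapp.log.1 ... myapp.log.5
    handler = RotatingFileHandler('/var/log/myapp/myapp.log',
                                  maxBytes=10 * 1024 * 1024,
                                  backupCount=5)
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    logger.info('application started')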
Use logrotate to do the work for you.
Remember that there are a few cases where it may not work properly, for example if the logging application keeps the log file open at all times and cannot reopen it after the file is removed and recreated.
Over the years I have encountered a few applications like that, but even for them you can configure logrotate to restart them when it rotates the logs.

python IPC (Inter Process Communication) for Vista UAC (User Access Control)

I am writing a file manager in (wx)Python; a lot already works. When copying files there is already a progress dialog, overwrite handling, etc.
Now, on Vista, when the user wants to copy a file to certain directories (e.g. %Program Files%), the application/script needs elevation, which cannot be asked for at runtime. So I have to start another app/script elevated, which does the work but needs to communicate with the main app, so the latter can update the progress etc.
I searched and found a lot of articles saying shared memory and pipes are the easiest way. So what I am looking for is a 'high-level', platform-independent IPC library with Python bindings that uses shared memory or pipes.
I already found omniORB, Fnorb, etc. They look very interesting but use TCP/IP; is there an equivalent library using shared memory or pipes? Since the IPC client is always on the same machine, sockets don't seem necessary here, and I am also afraid the user would have to allow IPC socket communication in his/her personal firewall.
EDIT: I really do mean high level: it would be great to be able to just call some functions, as when using omniORB, instead of sending strings to stdin/stdout.
How about just communicating with the second process using stdin/stdout?
There are some caveats due to input and output buffering, but take a look at this Python Cookbook recipe, and also Pexpect, for ideas on how to do this.
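To make the idea concrete, here is a rough sketch of the parent side only, assuming a hypothetical worker script copy_worker.py that prints one "done total" progress line per copied chunk; how that worker actually acquires elevation is a separate problem and is not shown here.

    import subprocess
    import sys

    # Launch the (hypothetical) worker and read its progress from stdout.
    proc = subprocess.Popen(
        [sys.executable, 'copy_worker.py', 'source.dat', 'C:/target/source.dat'],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        universal_newlines=True,
        bufsize=1,                      # line buffered so progress lines arrive promptly
    )

    for line in proc.stdout:
        done, total = map(int, line.split())   # worker prints e.g. "4096 65536"
        print('copied %d of %d bytes' % (done, total))
        # in the wxPython app this would update the progress dialog instead

    proc.wait()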
