I have a pyinotify instance watching a mounted network drive (mounted with CIFS) for IN_CLOSE_WRITE events, and it picks up the locally generated events (sudo cp, sudo mv, etc.) flawlessly if the server itself puts files in this directory.
However, I would like to pick up the events for files that are created on this network drive by a different server and program.
For some reason my inotify instance is not seeing these events... Is this normal behavior, or could there be something wrong with my code? If this IS normal behavior, is there a way around it, or do I have to find a monitoring tool other than inotify to gather these events?
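For reference, a minimal sketch of that kind of watcher (the mount point path and handler are illustrative, not my actual code):

import pyinotify

class Handler(pyinotify.ProcessEvent):
    def process_IN_CLOSE_WRITE(self, event):
        # fires when a file opened for writing is closed
        print('closed after write:', event.pathname)

wm = pyinotify.WatchManager()
notifier = pyinotify.Notifier(wm, Handler())
wm.add_watch('/mnt/network_share', pyinotify.IN_CLOSE_WRITE, rec=True)
notifier.loop()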
UPDATE
Per @Cedric's answer I changed my mount to NFS, but I am still not getting any events.
According to this thread on the Linux CIFS client, this wasn't implemented (nor really scheduled) as of 2009.
As for NFS, inotify does work on an NFS mount, but only if the listener is on the same machine as the creator/modifier/deleter of the file... :( (source here)
Lastly, I ended up finding a hook (here): a Python script that you run on the remote server and that sends the events to you (the script is made to talk to a MediaTomb server over HTTP, but you can implement your own sender).
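If you roll your own sender, a rough sketch could look like the following: it watches a directory on the remote machine with pyinotify and POSTs each event to a listener URL of your choosing (the URL and endpoint are assumptions, not part of the linked script):

import json
import urllib.request
import pyinotify

LISTENER_URL = 'http://192.168.1.10:8080/events'  # assumed endpoint on the listening machine

class Forwarder(pyinotify.ProcessEvent):
    def process_IN_CLOSE_WRITE(self, event):
        # forward the event to the listener as a small JSON payload
        data = json.dumps({'event': 'IN_CLOSE_WRITE', 'path': event.pathname}).encode()
        req = urllib.request.Request(LISTENER_URL, data=data,
                                     headers={'Content-Type': 'application/json'})
        urllib.request.urlopen(req)

wm = pyinotify.WatchManager()
wm.add_watch('/srv/shared', pyinotify.IN_CLOSE_WRITE, rec=True)
pyinotify.Notifier(wm, Forwarder()).loop()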
I have a node.js server running on a Raspberry Pi 3 B+. (I'm using Node because I need a Bluetooth library that works well.)
Once the node server picks up a message from a bluetooth device, I want it to fire off an event/command/call to a different python script running on the same device.
What is the best way to do this? I've looked into spawning child processes and running the script in them, but that seems messy... Additionally, should I set up a socket between them and stream data through it? I imagine this is done often, what is the consensus solution?
Running a child process is how you would run a python script. That's how you do it from nodejs or any other program (besides a python program).
There are dozens of options for communicating between the python script and the nodejs program. The simplest would be stdin/stdout which are automatically set up for you when you create the child process, but you could also give the nodejs app a local http server that the python script could communicate with or vice versa.
Or, set up a regular socket between the two.
If, as you now indicate in a comment, your python script is already running, then you may want to use a local http server in the nodejs app and the python script can just send an http request to that local http server whenever it has some data it wants to pass to the nodejs app. Or, if you primarily want data to flow the opposite direction, you can put the http server in the python app and have the nodejs server send data to the python app.
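On the Python side that can be as small as the sketch below; the port 3000 and the /from-python route are assumptions that have to match whatever HTTP server you start in the node app:

import json
import urllib.request

def send_to_node(payload):
    # POST a JSON payload to the HTTP server the node app is assumed to expose locally
    data = json.dumps(payload).encode()
    req = urllib.request.Request('http://127.0.0.1:3000/from-python', data=data,
                                 headers={'Content-Type': 'application/json'})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

send_to_node({'reading': 42})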
If you want good bidirectional capabilities, then you could also set up a socket.io connection between the two and then you can easily send messages either way at any time.
I want to set up access from a remote Ubuntu server to my local machine, because I have multiple files on this machine that I want to transfer periodically (every minute) to the server. How can I do that using Python?
Depending on your local machine OS and network setup, I would recommend the following:
File transfers
Based on the file size, if it's a small copy, I would use scp (secure copy). This is because of the simplicity of the command.
In most use cases however I would use rsync because of its great capabilities, most importantly the ability to handle failed partial transfers. It works by analysing the differences between the source and destination. It has pretty much every preference under the sun available (overwriting, deltas, etc.)
Note that when using these commands in an automation script over a longer period of time, you'll probably want to set up a static IP or DDNS for your remote machine.
Python
To run interactive shell commands from a Python script, use pexpect. It's modeled on the original Expect tool and it's fantastic. I used it the other day to transfer a folder from a dev computer to a number of different Raspberry Pis remotely at the same time. Check out the documentation here: https://pexpect.readthedocs.io/en/stable/
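For example, a minimal pexpect sketch for a password-prompted scp transfer (hosts, paths and credentials are placeholders):

import pexpect

# drive an interactive scp session
child = pexpect.spawn('scp /home/me/data.log user@192.168.1.20:/srv/logs/')
child.expect('password:')       # wait for the password prompt
child.sendline('my-password')   # in practice, prefer SSH keys to a hardcoded password
child.expect(pexpect.EOF)       # block until the transfer finishes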
Automation
As for automation, it really depends on how you want it set up. If you want the Python script responsible for transferring data to be invoked on a schedule, you could look into crontab. It's very well known by admins, so it's easy to Google.
Alternatively, if this is part of a Python app, you could have the app running in the background and sleeping between transfers (using time.sleep() or by checking elapsed time). If you need to do other things in the same Python app, you could put the whole transfer-and-sleep part into a thread (also easily implemented in Python).
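A sketch of that second option, sleeping between transfers and shelling out to rsync (paths, host and interval are placeholders):

import subprocess
import time

# transfer once a minute from a background loop
while True:
    subprocess.run(['rsync', '-az', '/home/me/files/', 'user@server:/srv/backup/'])
    time.sleep(60)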
I hope this helps, let me know if you want details elaborated.
You can easily transfer files between local and remote machines, or between two remote servers. If both servers are Linux-based and you need to transfer multiple files and folders with a single command, follow these steps:
The user on one remote server needs access on the other remote server to the directory you want to transfer files to.
You might need to create a policy or group, assign the servers you want to access to that group, and add the user to that group so the two remote servers can talk to each other.
Run the following scp command:
scp [options] username1@source_host:directory1/filename1 username2@destination_host:directory2/filename2
I am able to watch local directories using solutions based on the inotify kernel subsystem. There are also some Python projects that work on top of inotify, like pyinotify, PyInotify, fsmonitor and watchdog.
I have mounted a remote FTP server in a local directory using curlftpfs, so all syncing is easy now. But inotify is not able to watch network mount points the way it watches local directories.
I want to track when new files are added to the FTP server. How can I achieve this, like I do for a local directory, using an inotify-based solution?
This can hardly work. The FTP protocol has no API to notify a client about changes. curlftpfs would have to continually poll the remote folder to provide notifications for inotify or a similar tool, and it does not do that.
You have to poll the FTP folder yourself.
See for example Monitor remote FTP directory.
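A crude polling sketch using the standard library's ftplib, diffing directory listings between runs (host, credentials, path and interval are placeholders):

import time
from ftplib import FTP

def list_names():
    # return the current set of file names in the watched FTP directory
    ftp = FTP('ftp.example.com')
    ftp.login('user', 'password')
    ftp.cwd('/incoming')
    names = set(ftp.nlst())
    ftp.quit()
    return names

seen = list_names()
while True:
    time.sleep(60)
    current = list_names()
    for name in sorted(current - seen):
        print('new file:', name)
    seen = current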
I'm building a backup program which involves detecting when media available for backup is inserted. I've looked into detecting the insertion of backup media, and I'm going to use the file system watch service inotify on the /media/username directory.
The problem is that I've looked into this directory and there are folders that don't represent any currently available medium. How can I detect the list of currently available media (USB drives, HDDs) and watch for any future ones? More technically, what are the characteristics of an actively available USB/HDD folder in the /media/username directory?
On Linux, you could use uevents from the kernel and start sniffing for "ACTION"="add" (a sketch follows after the links below).
Please check the following links:
http://lwn.net/Articles/242046/
and Netlink socket:
http://www.kernel.org/doc/man-pages/online/pages/man7/netlink.7.html
Or use DBUS/HAL API bindings for Python http://ubuntuforums.org/archive/index.php/t-904706.html
Check dmesg messages to see exactly what was attached and where the partition was mounted.
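A bare-bones sketch of the netlink uevent approach (Linux only, usually needs root; the constant is taken from <linux/netlink.h>):

import os
import socket

NETLINK_KOBJECT_UEVENT = 15  # from <linux/netlink.h>

# listen for kernel uevents; multicast group 1 carries kernel events
sock = socket.socket(socket.AF_NETLINK, socket.SOCK_DGRAM, NETLINK_KOBJECT_UEVENT)
sock.bind((os.getpid(), 1))

while True:
    data = sock.recv(4096)
    # messages are null-separated KEY=VALUE pairs preceded by "action@devpath"
    fields = dict(item.split('=', 1)
                  for item in data.decode(errors='replace').split('\0')
                  if '=' in item)
    if fields.get('ACTION') == 'add':
        print('added:', fields.get('DEVPATH'))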
P.S.: Here is an example (on SO) of how to do it in Python using DBUS binding:
How can I listen for 'usb device inserted' events in Linux, in Python?
UPDATE
How to check if a path is mounted:
https://serverfault.com/questions/143084/how-can-i-check-whether-a-volume-is-mounted-where-it-is-supposed-to-be-using-pyt
I can detect it rather easily by monitoring the /dev/disk/by-label/ directory.
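With pyinotify that could look roughly like this, reacting as labelled volumes appear and disappear (handler names are illustrative):

import pyinotify

class DiskHandler(pyinotify.ProcessEvent):
    def process_IN_CREATE(self, event):
        print('volume available:', event.name)

    def process_IN_DELETE(self, event):
        print('volume removed:', event.name)

wm = pyinotify.WatchManager()
wm.add_watch('/dev/disk/by-label', pyinotify.IN_CREATE | pyinotify.IN_DELETE)
pyinotify.Notifier(wm, DiskHandler()).loop()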
I'm writing a Python script to access all computers on the network, log in to them and read some log files. I don't want to use something as low-level as socket, but I can if I must. I realize that my problem is similar to this question, but not the same.
Are there any modules for accessing external Windows machines?
Has anyone done anything like this before?
I'm specifically looking to log into Windows 7 machines, not unix.
Let's also assume that each computer I want to log into has Remote Desktop installed and enabled. I'm also not worried about network security or encryption because these files are not confidential. Windows machines don't have SSH installed by default, do they?
There has to be something on the other side for you to talk to. This limits you to either setting up a "server" on each machine, installing a real server (e.g. sshd), building a "server" yourself and installing it, or using a built-in and active feature of the OS.
Based upon this, what kind of system do you want to set up on these machines? What does it need to do? Just read the contents of a prespecified file list? Will that list change?
One solution is to turn on Telnet and use telnetlib or Twisted to talk across it. This isn't very secure, of course.
Next up, set up a Samba share and access the folder remotely. This is also insecure, though less so than Telnet.
You could find a port of an SSH daemon for Windows and run that, if you are so inclined.
PsExec from Sysinternals might work.
Use Twisted to build a server app with the features you need.
Use ncat to listen on a port and spawn a cmd prompt.
Be aware that most of the solutions for accessing Windows remotely are... poor. The best solution is probably to roll your own, but that is hard work and you will probably make mistakes.
Also, Windows 7 is not exactly multi-user friendly. Individual processes can run as separate users, but the OS does not support having multiple users logged in at the same time. Someone is going to be the "user" and everyone else is just a process with a different credential set.
This is more an artificial limitation on M$'s part than anything technical. To see this in action, try to log in with RDP while a user is logged in locally. Fun times.
Per your edit, the easiest thing to do is just set up a samba share on the box.
After this share is set up:
with open(r'\\myCompNameOrIP\C\windows\logs\logfile.txt', 'rb') as logfile:
    # read the remote log directly over the UNC path of the share set up above
    loglines = logfile.readlines()
Or you can use the gencat sample found here. Just give it r'\\myCompNameOrIP\C\windows\logs\*.txt' as the search path and watch the magic.
From Ubuntu I use samba:
In Bash:
gvfs-mount smb://them/folder
Here I give the username, domain and password.
Then in python:
folder = '/home/me/.gvfs/folder on them'
Using the os module I read the folders and files inside.
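A minimal sketch of that, reusing the example mount path from above:

import os

# treat the gvfs mount point as a local directory and walk it
folder = '/home/me/.gvfs/folder on them'
for dirpath, dirnames, filenames in os.walk(folder):
    for filename in filenames:
        print(os.path.join(dirpath, filename))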
I am working in a small business environment.
Why not have each of the computers send the log file to the central computer?