My python environment and the script I want to run are located at:
/home/pi/webscrap/bin
/home/pi/webscrap/news/news.py
In the news directory I created a newscrap.sh file containing this:
/home/pi/webscrap/bin/python /home/pi/webscrap/news/news.py
I created this file with plain nano newscrap.sh.
Then I created a crontab and added the line:
* * * * * /home/pi/webscrap/news/newscrap.sh >> /home/mypc/logs/cronlogs.log 2>&1
I checked this by looking at the size of some files that should change when I run the script. This does not happen, so what am I missing here?
Edit: I also did
sudo chmod a+x news.py
Is the destination /home/mypc/logs/cronlogs.log on the raspberry pi or is it on a different system? In other words, if you type ls /home/mypc from the command line on the raspberry pi, does that folder exist?
I'm going to answer as though your intent is to copy files from one system (the raspberry pi) to another (mypc), but you may be confusing filesystem paths with network locations. If that's the case, then there are several good options available to you. I'll give you a quick summary of two.
Solution 1: Use scp
Write to a file on the raspberry pi.
Use scp to copy the file to a location on mypc.
You will want to set up SSH keys so that it does not prompt you for a password each time.
This won't append to a file on mypc; it copies the whole file each time, so you will have to work around that. This solution feels a little hacky to me, and I don't love it.
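Something along these lines on the pi, as a rough sketch (the user, host, and paths are placeholders, and it assumes SSH keys are already in place):

import subprocess

# Placeholder paths and host -- adjust to your setup.
LOCAL_LOG = "/home/pi/logs/cronlogs.log"
REMOTE_DEST = "user@mypc:/home/user/logs/"

# scp copies the whole file each time; it cannot append.
# SSH keys must be set up so no password prompt blocks the cron job.
subprocess.check_call(["scp", LOCAL_LOG, REMOTE_DEST])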
Solution 2: Use a queueing system
This is a lot more involved, but much cleaner than scp, because it was designed for exactly this. You'll have to learn about networking and sockets, but you like that sort of thing or you wouldn't be here. I like ZMQ because it is easy to get started and is well supported. You will have to write a server on the raspberry pi and a client on mypc. This can be a bit tricky if there are firewalls, but there is help for that. I would start with client/server on one machine, and then figure out how to get it across the network.
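To make that concrete, here is a minimal PUSH/PULL sketch with pyzmq. The port number, hostnames, and log path are made up; treat it as a starting point, not a hardened design:

# server.py -- run on the raspberry pi
import zmq

ctx = zmq.Context()
sock = ctx.socket(zmq.PUSH)
sock.bind("tcp://*:5555")                # 5555 is an arbitrary port
sock.send_string("news.py finished OK")  # send whatever your script produces

# client.py -- run on mypc, appends whatever arrives to the log
import zmq

ctx = zmq.Context()
sock = ctx.socket(zmq.PULL)
sock.connect("tcp://raspberrypi.local:5555")  # replace with the pi's real address
with open("/home/mypc/logs/cronlogs.log", "a") as log:
    while True:
        log.write(sock.recv_string() + "\n")
        log.flush()

As suggested above, run both pieces on one machine first (connect to tcp://localhost:5555) before taking it across the network.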
Related
I am looking to create a USB drive that, when inserted into a laptop, will automatically run a Python file (without any user interaction) which grabs the machine's IP address and hostname and emails them to us in a text file. I know this seems sketchy, but it is for a pen test (I'm sure all of you are familiar with these kinds of tests).
I have the script already written; I just need to find a way to automatically run it (regardless of what operating system they're running) when somebody is dumb enough to put a USB drive into their computer... I have scoured the web and found the "autorun.inf" ideas, but I know this does not work on all operating systems...
My idea is to have it run command-line commands when plugged into the laptop (I can figure out later which commands will work), but I am unsure if this is possible. Any ideas?
If there is another way you guys think could work to achieve what I am looking for, I am all ears!
Thanks all
I need to make a python script that will do these steps in order, but I'm not sure how to go about setting this up.
SSH into a server
Copy a folder from point A to point B (cp /foo/bar/folder1 /foo/folder2)
mysql -u root -pfoobar (This database is accessible from localhost only)
Create a database, do some other MySQL stuff in the mysql console
Replace instances of Foo with Bar in file foobar
Copy and edit a file
Restart a service
The fact that I have to SSH into a server, and THEN do all of this, is really confusing me. I looked into the Fabric library, but that seems to only do one command at a time and doesn't keep context from previous commands.
Look into Fabric more. It is still probably what you want.
This page has a lot of good examples.
By "context" I'm assuming you want to be able to cd into another directory and run commands from there. That's what fabric.context_managers.cd is for -- search for it on that page.
Sounds like you are doing some sort of remote deployment/configuring. There's a whole world of tools out there to professionally set this up; look into Chef and Puppet.
Alternatively if you're just looking for a quick and easy way of scripting some remote commands, maybe pexpect can do what you need.
Pexpect is a pure Python module for spawning child applications; controlling them; and responding to expected patterns in their output.
I haven't used it myself but a quick glance at its manual suggests it can work with an SSH session fine: https://pexpect.readthedocs.org/en/latest/api/pxssh.html
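Based on that manual, a login-and-run sketch would look something like this (host and credentials are placeholders):

from pexpect import pxssh

s = pxssh.pxssh()
s.login("yourserver", "username", "password")  # placeholder credentials
s.sendline("cp -r /foo/bar/folder1 /foo/folder2")
s.prompt()        # wait for the shell prompt to come back
print(s.before)   # everything the command printed
s.logout()

Because this is one persistent shell session, a cd sent here would carry over to the next sendline, which is the "context" the question asks about.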
I have never used Fabric.
My way of solving those kinds of issues (before I started using SaltStack) was to use pexpect to run the SSH connection and all the commands that were needed.
Maybe using a series of SQL scripts to work with the database (just to make it easier) would help.
Another way, since you need to access the remote server over SSH, would be to use paramiko to connect and execute commands remotely. It's a bit more complicated when you want to see what's happening on stdout (whereas with pexpect you see exactly what's going on).
But it all depends on what you really need.
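For reference, a bare-bones paramiko version might look like this (host and credentials are placeholders):

import paramiko

client = paramiko.SSHClient()
# convenient for a first test, but skips host key verification
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("yourserver", username="user", password="secret")

stdin, stdout, stderr = client.exec_command("cp -r /foo/bar/folder1 /foo/folder2")
print(stdout.read())  # you only see the output once the command has finished
client.close()

Note that each exec_command runs in a fresh shell, so state like cd does not carry over between calls, unlike with pexpect.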
I'm familiar with python within the 3D application I use (OSX platform), but am struggling with its usage in a client/server relationship. I've written a simple distributed rendering script which breaks my 3D render script into smaller OSX bash shell scripts and saves them to a directory on my machine. The remote machines in the room then look at my local folder with these smaller bash shell scripts and execute them one by one until they are all gone. It is a rudimentary solution to distributed rendering, but it works. What I would like to do is have the remote machines listen for a command from my local machine (the local machine would need to send an OSX bash command to the remote machines). I have been looking at this site: http://www.tutorialspoint.com/python/python_networking.htm
This seems to be what I'm looking for, but not knowing much about how python works with the network, I'm not sure whether it's secure, and I am not sure how to send a Bash command rather than a message.
If anyone has any suggestions it'd be much appreciated.
I wrote a simple script that parses some stuff off the web and emails it to me. Very simple. But I'm starting to realize that implementing this is going to be far more difficult than it really should be.
All I really want to do is run this script once a day.
I have explored using Google App Engine, but it doesn't like smtplib using ssl to login to my gmail to send an email.
I am considering using Heroku, but that just seems like a lot of work for something so simple.
I tried using my raspberry pi, but I'm not sure the script is still running when I exit ssh. I looked into running the script as a cron job, but I'm not sure that's an elegant solution.
I looked into running an applescript from my calendar, but I'm not sure what happens if my computer is closed and/or offline.
My question is: Is there a simple, elegant, easy solution here?
If you start the script from your session (./script.py or python script.py), it stops running when you disconnect. If you want to run the script this way for whatever reason, I would recommend using tmux.
If you're using Raspian or another Debian based distro for your Pi:
$ apt-get install tmux
$ tmux
# detach from your tmux session by pressing CTRL+B and then D
# to reattach to your session later, use
$ tmux attach
I would recommend using cron. If you want to run it at a specific time (e.g. every day at 1am), just add a file like this in /etc/cron.d/. Note that, unlike a personal crontab, entries in /etc/cron.d need a username field:
$ echo "0 1 * * * root python /path/to/your/script.py > /dev/null 2>&1" > /etc/cron.d/script-runner
Files in /etc/cron.d are crontab fragments, so they do not need to be executable.
Wikipedia has a nice explanation of the format (and also of shortcuts like @hourly and @daily).
If you don't care when exactly it runs, you can just put your script into /etc/cron.daily/. Don't forget a chmod +x to make it executable.
If you don't want to run it on one of your own machines, you can also get a shell on one of Uberspace's servers. You can pay whatever you want (minimum 1 EUR/month) and you get a shell on a Linux box with 10GB of storage (the first month is free for testing; cancellation happens automatically if you don't pay, no strings attached). I'm sure there are a lot of other services like that; I just mention it because it's a cheap one with nice support. Also, you get a domain (..uberspace.de) and can send mail from the server (e.g. with mail), so there is no need to use a gmail account.
Edit: I overlooked the "python" part. Changed everything to .py. Either use #!/usr/bin/env python3 (or 2.7) in your script, or start the script via python scriptname.py.
I'm writing a Python script to access all computers on the network, log in to them and read some log files. I don't want to use something as low-level as socket, but I can if I must. I realize that my problem is similar to this question, but not the same.
Are there any modules for accessing external Windows machines?
Has anyone done anything like this before?
I'm specifically looking to log into Windows 7 machines, not unix.
Let's also assume that each computer I want to log into has Remote Desktop installed and enabled. I'm also not worried about network security or encryption, because these files are not confidential. Windows machines don't have SSH installed on them by default, do they?
There has to be something on the other side for you to talk to. This limits you to setting up a "server" on each machine: installing a real server (i.e. sshd), building a "server" yourself and installing it, or using a built-in and active feature of the OS.
Based upon this, what kind of system do you want to set up on these machines? What does it need to do? Just read the contents of a prespecified file list? Will that list change?
One solution is to turn on telnet and use paramiko or twisted to talk across it. This isn't very secure, of course.
Next up, set up a samba share and access the folder remotely. This is also insecure, though less so than telnet.
You could find a port of an SSH daemon for Windows and run that, if you are so inclined.
PsExec from Sysinternals might work (sketched below).
Use twisted to build a server app with the features you need.
Use ncat to listen on a port and spawn a cmd prompt.
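For the PsExec option, something like the following might do; the machine name, credentials, and log path are all made up, and it assumes psexec.exe is on your PATH:

import subprocess

# Placeholder machine, credentials, and file path.
cmd = [
    "psexec", r"\\targetmachine",
    "-u", r"domain\user", "-p", "password",
    "cmd", "/c", r"type C:\windows\logs\logfile.txt",
]
print(subprocess.check_output(cmd))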
Be aware that most of the solutions for accessing Windows remotely are... poor. The best solution is probably to roll your own, but that is hard work and you will probably make mistakes.
Also, Windows 7 is not exactly multi-user friendly. Individual processes can run as separate users, but the OS does not support having multiple users logged in at the same time. Someone is going to be the "user" and everyone else is just a process with a different credential set.
This is more an artificial limitation on M$'s part than anything technical. To see this in action, try to log in with RDP while a user is logged in locally. Fun times.
Per your edit, the easiest thing to do is just set up a samba share on the box.
After this share is set up:
with open(r'\\myCompNameOrIP\C\windows\logs\logfile.txt', 'rb') as logfile:
    loglines = logfile.readlines()
Or you can use the gencat sample found here. Just give it r'\\myCompNameOrIP\C\windows\logs\*.txt' as the search path and watch the magic.
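If you would rather stay with the standard library, glob can walk the same hypothetical share:

import glob

# Same placeholder share path as above.
for path in glob.glob(r'\\myCompNameOrIP\C\windows\logs\*.txt'):
    with open(path, 'rb') as logfile:
        loglines = logfile.readlines()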
From Ubuntu I use samba:
In Bash:
gvfs-mount smb://them/folder
Here I give my username, domain, and password.
Then, in Python:
folder = '/home/me/.gvfs/folder on them'
Using the os module, I read the folders and files inside.
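For instance, something like this (the folder path is the gvfs mount point from above):

import os

folder = '/home/me/.gvfs/folder on them'
for name in os.listdir(folder):
    path = os.path.join(folder, name)
    if os.path.isfile(path):
        print(name, os.path.getsize(path))  # or open(path) and read it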
I am working in a small business environment.
Why not have each of the computers send the log file to the central computer?
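One way to sketch that: each machine runs a small script on a schedule that copies its log to a share on the central computer. The share and log paths here are placeholders:

import shutil
import socket

# Hypothetical central share; run this on each machine via Task Scheduler.
dest = r'\\centralserver\logs'
src = r'C:\windows\logs\logfile.txt'
# name each copy after the machine it came from
shutil.copy(src, r'%s\%s.txt' % (dest, socket.gethostname()))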