I want to automate the remote deployment that I am currently doing manually.
The process includes:
Make the tarball from certain folders
SFTP to the remote server
Rename the old folders
Untar the new tar file
Restart Apache
The remote system is on the intranet and has no access to the outside internet.
I want to know how I can transfer the file from my Python script and then, when the transfer is complete, log in over SSH and run the remaining steps. I am confused about how to achieve that. On localhost I can do all of this, but how can I do it on a remote host?
For quick-and-dirty work you can use Fabric (this by no means says that you cannot use Fabric to build a serious product).
For heavy configuration routines, you'd be better off picking a configuration management system (e.g., Ansible).
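As a rough illustration of the Fabric route, here is a minimal Fabric 2 sketch of the five steps from the question; the host, user, paths, and service name are placeholder assumptions, not a definitive implementation.

    # Minimal Fabric 2 sketch of the deployment steps above.
    # Host, user, paths, and service name are placeholders.
    import tarfile
    from fabric import Connection

    def deploy():
        # 1. Make the tarball from the folders to deploy
        with tarfile.open("release.tar.gz", "w:gz") as tar:
            tar.add("app")  # folder(s) to include

        conn = Connection(host="10.0.0.5", user="deploy")  # intranet host

        # 2. SFTP the tarball to the remote server
        conn.put("release.tar.gz", remote="/srv/www/release.tar.gz")

        # 3. Rename the old folder and 4. untar the new release
        conn.run("mv /srv/www/app /srv/www/app.old && "
                 "tar -xzf /srv/www/release.tar.gz -C /srv/www")

        # 5. Restart Apache (usually needs sudo)
        conn.sudo("systemctl restart apache2")

    if __name__ == "__main__":
        deploy()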
I'm trying to automate the process of transferring files from my remote server to my local directory using Python. I know of libraries like pysftp which are popular, but as far as I can tell these libraries require connections to be initiated from the local side (i.e. from a script that lives on the local machine). Is there a way to transfer files from a script that lives on the remote machine instead?
Thanks in advance.
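No answer is recorded here, but one common workaround, sketched below under loud assumptions: run the script on the remote machine and treat the local machine as the SFTP target, which only works if the local machine runs an SSH server and is reachable (VPN, port forwarding, etc.). All addresses, credentials, and paths are hypothetical.

    # Hypothetical sketch: runs *on the remote server* and pushes a file
    # to the local machine, which must run an SSH server and be reachable.
    # Address, credentials, and paths are placeholders.
    import paramiko

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("203.0.113.10", username="me", password="secret")  # local box

    sftp = ssh.open_sftp()
    sftp.put("/srv/data/report.csv", "/home/me/downloads/report.csv")
    sftp.close()
    ssh.close()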
I would like to create a Python script to ship files from a virtual machine and send them to a Kafka broker.
To make it simple:
I have log files on this IP address: VM1 (10.10.10.1)
I would like to create a Kafka producer (using Python) to ship files from VM1
Kafka is installed on another virtual machine: VM2 (10.10.10.2)
Limitations:
I can't install any tools on VM1
I can only use a username/password to get access to VM1
I think that I need to create a connector? Can we reuse something like connect-file-sink.properties and specify the folder + IP where a file is stored on an external machine (VM1)?
Are there other tools to do that?
There are plenty of existing tools that will do this, e.g. Filebeat (which supports output to Kafka), Fluentd, etc.
These are all simple to set up with simple configuration files. The trickiest bit will probably be your networking: just make sure you've configured Kafka's listeners correctly.
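If you do want the pure-Python route the question asks about, here is a hedged sketch (not one of the tools named above): since nothing can be installed on VM1, run it on VM2 or any machine that reaches both, pull the log over SFTP with paramiko using the VM1 username/password, and publish each line with kafka-python. The topic name and log path are assumptions.

    # Hedged sketch: run on VM2 (or any host that reaches both VMs).
    # Pulls the log from VM1 over SFTP and publishes each line to Kafka.
    # The topic name and log path are placeholders.
    import paramiko
    from kafka import KafkaProducer  # pip install kafka-python

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("10.10.10.1", username="user", password="pwd")  # VM1
    sftp = ssh.open_sftp()

    producer = KafkaProducer(bootstrap_servers="10.10.10.2:9092")  # VM2 broker

    with sftp.open("/var/log/myapp.log", "rb") as log_file:
        for line in log_file.read().splitlines():
            producer.send("vm1-logs", line)  # each line is bytes

    producer.flush()
    sftp.close()
    ssh.close()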
I want to make access from a remote Ubuntu server to my local machine, because I have multiple files on this machine and I want to transfer them periodically (every minute) to the server. How can I do that using Python?
Depending on your local machine OS and network setup, I would recommend the following:
File transfers
Based on the file size, if it's a small copy, I would use scp (secure copy), because of the simplicity of the command.
In most use cases, however, I would use rsync because of its great capabilities, most importantly its ability to handle failed or partial transfers. It works by analysing the differences between the source and the destination, and it has pretty much every preference under the sun available (overwriting, deltas, etc.).
Note that when using these commands in an automation script over a longer period of time, you'll probably want to set up a static IP or DDNS for your remote machine.
Python
To run shell commands in a Python script, use pexpect. It's modeled on the original Expect and it's fantastic. I used it the other day to transfer a folder from a dev computer to a number of different Raspberry Pis remotely at the same time. Check out the documentation here: https://pexpect.readthedocs.io/en/stable/
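A minimal pexpect sketch of the idea (host, user, password, and paths are placeholders; for unattended jobs, SSH keys are usually the better choice):

    # Hedged sketch: drive scp with pexpect, answering the password prompt.
    # Host, user, password, and paths are placeholders.
    import pexpect

    child = pexpect.spawn("scp -r /home/me/outbox user@203.0.113.7:/srv/incoming/")
    child.expect("password:")   # wait for the password prompt
    child.sendline("secret")    # send the password
    child.expect(pexpect.EOF)   # wait for scp to finish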
Automation
As for automation, it really depends on how you want it set up. If you want the Python script responsible for transferring data to be called whenever you want to transfer data, you could look into crontab. It's very well known by admins, so it's easy to Google.
Alternatively, if this is part of a Python app, you could have the app running in the background and sleeping (time.sleep(), or a time-elapsed check) between transfers. If you need to do other things in the same Python app, you can stick the whole transfer-and-sleep part into a thread (also easily implemented in Python).
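As a sketch of that second approach, the transfer-and-sleep part in a background thread might look like this; transfer() is a hypothetical stand-in for your actual copy logic:

    # Hedged sketch: repeat a transfer every minute in a background thread.
    # transfer() is a placeholder for your real copy logic (scp, rsync, ...).
    import threading
    import time

    def transfer():
        pass  # placeholder: e.g., the pexpect/scp call sketched above

    def transfer_loop():
        while True:
            transfer()
            time.sleep(60)  # wait one minute between transfers

    worker = threading.Thread(target=transfer_loop, daemon=True)
    worker.start()
    # ... the rest of the app keeps running here ...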
I hope this helps, let me know if you want details elaborated.
You can easily transfer files between local and remote machines, or between two remote servers. If both servers are Linux-based and you need to transfer multiple files and folders with a single command, follow the steps below:
The user on one remote server should have access on the other remote server to the corresponding directory you want to transfer the files to.
You might need to create a policy or group, assign the servers you want to access to that group, and assign the user to that group so the two remote servers can talk to each other.
Run the following scp command:
scp [options] username1@source_host:directory1/filename1 username2@destination_host:directory2/filename2
I may be being ignorant here, but I have been researching for the past 30 minutes and have not found how to do this. I was uploading a bunch of files to my server, and then, prior to them all finishing, I edited one of those files. How can I update the file on the server to the file on my local computer?
Bonus points if you tell me how I can link the file on my local computer to auto-update on the server when I connect (if that's possible, of course).
Just use scp and copy the file back.
scp [user]@[address]:/[remotepath]/[remotefile] [localfolder] if you want to copy the file on the server back to your local machine, or
scp [localfolder]/[filename] [user]@[address]:/[remotepath] in case you want to copy the file to the server again. The elements in [] have to be replaced with actual paths and/or file names. On the remote side it has to be an absolute path; on the local machine it can be absolute or relative. More information on scp
This of course means that the destination file will be overwritten.
Maybe rsync would be an option. It is capable of synchronizing different folders, can sync them over the network, and can be combined with ssh.
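If you would rather drive rsync from Python than from a shell, a minimal sketch, assuming key-based SSH authentication is already configured and the paths are placeholders:

    # Hedged sketch: call rsync over SSH from Python.
    # Assumes key-based SSH auth; paths are placeholders.
    import subprocess

    subprocess.run(
        ["rsync", "-avz",                   # archive, verbose, compress
         "/home/me/project/",               # local source (trailing slash = contents)
         "user@server:/srv/www/project/"],  # remote destination over SSH
        check=True,                         # raise CalledProcessError on failure
    )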
Have you considered Dropbox or SVN?
I don't know your local computer's OS, but if it is Linux or OS X, you can consider LFTP. This is an FTP client which supports SFTP://. This client has "mirror" functionality: with a single command you can mirror your local files to a server.
Note: what you need is a reverse mirror; in LFTP this is mirror -R.
I understand nearly nothing about how EC2 works. I created an Amazon Web Services (AWS) account, then launched an EC2 instance.
And now I would like to execute some Python code in this instance, and I don't know how to proceed. Do I need to load the code somewhere in the instance? Or into Amazon's S3 and link it to the instance?
Is there a guide that explains the possible ways of using an instance? I feel like a man in front of a flying saucer's dashboard without a user's guide.
Here's a very simple procedure to move your Python script from your local machine to an EC2 instance and run it:
1. scp -i <filepath to PEM> <filepath to .py file> ec2-user@<Public DNS>.compute-1.amazonaws.com:<filepath in the EC2 instance where you want your file to be>
2. cd to the directory in the EC2 instance containing the file and run python <Filename.py>. There, it executes.
Here's a concrete example for those who like things shown step-by-step:
In your local directory, create a Python script with the following code: print("Hello AWS")
Assuming you already have AWS set up and want to run this script on EC2, you need to SCP (Secure Copy Protocol) your file to a directory on the EC2 instance. So here's an example:
- My filepath to the PEM is ~/Desktop/random.pem.
- My filepath to the .py file is ~/Desktop/hello_aws.py
- My public DNS is ec22-34-12-888
- The EC2 directory where I want my script to be is /home/ec2-user
- So the full command I run in my local terminal is:
scp -i ~/Desktop/random.pem ~/Desktop/hello_aws.py ec2-user@ec2-34-201-49-170.compute-1.amazonaws.com:/home/ec2-user
Now SSH into your EC2 instance, cd to /home/ec2-user (or wherever you put your file), and run python hello_aws.py.
You have a variety of options. You can browse through a large library of AMIs here.
You can import a VM; instructions are here.
This is a general article about AWS and python.
And in this article, the author takes you through a more advanced system with a combination of datastores in Python, using the highly recommended Django framework.
Launch your instance through Amazon's Management Console -> Instance Actions -> Connect
(More details in the getting started guide)
Launch the Java-based SSH client
Plugins -> SFTP File Transfer
Upload your files
Run your files in the background (append '&' to the command, or use nohup)
Be sure to select an AMI with Python included; you can check by typing 'python' in the shell.
If your app requires any unorthodox packages, you'll have to install them.
Running scripts on Linux ec2 instances
I had to run a script on Amazon EC2 and learned how to do it. Even though the question was asked years ago, I thought I would share how easy it is today.
Setting up EC2 and ssh-ing to ec2 host
Sign up and launch an EC2 instance with default settings (do not forget to save the certificate file that is generated while launching the instance).
Once the instance is up and running, give the certificate file the required permissions: chmod 400 /path/my-key-pair.pem (or .cer file)
Run the command ssh -i /path/my-key-pair.pem(.cer) USER@Public-DNS (the USER value changes based on the operating system you have launched; refer to the list below for more details. The Public DNS can be obtained on the EC2 instance page.)
Use the ssh command to connect to the instance. You specify the private key (.pem) file and user_name@public_dns_name:
- For Amazon Linux, the user name is ec2-user.
- For RHEL, the user name is ec2-user or root.
- For Ubuntu, the user name is ubuntu or root.
- For CentOS, the user name is centos.
- For Fedora, the user name is ec2-user.
- For SUSE, the user name is ec2-user or root.
- Otherwise, if ec2-user and root don't work, check with your AMI provider.
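If you prefer to make the same connection from Python instead of a terminal, a hedged paramiko sketch (key path, user name, DNS name, and command are placeholders):

    # Hedged sketch: the same SSH connection made from Python with paramiko.
    # Key path, user name, DNS name, and command are placeholders.
    import paramiko

    key = paramiko.RSAKey.from_private_key_file("/path/my-key-pair.pem")
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("ec2-xx-xx-xx-xx.compute-1.amazonaws.com",
                username="ec2-user", pkey=key)

    stdin, stdout, stderr = ssh.exec_command("python hello_aws.py")
    print(stdout.read().decode())
    ssh.close()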
Clone the script to EC2
In order to run scripts on EC2, I prefer storing the code on GitHub as a repo, or as a gist (if you need to keep the code private), and cloning it into the EC2 instance.
The approach mentioned above is very easy and not error-prone.
Running the python script
I worked with a RHEL Linux instance and Python was already installed, so I could run Python scripts directly after SSH-ing to the host. It depends on the operating system you choose; refer to the AWS manuals if Python is not already installed.
Reference: AWS Doc