Is there any way to save an image from a Python script straight to a server, without saving it on the local machine first? For example, what would the last line be in this code:
path = 'path/to/folder/on/server'
imfile = frame.copy()  # screenshot from my webcam
magic_function(imfile, path)
Maybe I can do this with the paramiko library and ssh.exec_command? What should I do in this case?
I use Ubuntu 14.04 on both the local and the server machine, and my code is in Python 2.7.
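A minimal sketch of the paramiko approach, assuming the frame comes from OpenCV and that you already have SSH/SFTP access to the server (the host, username, password and remote path below are placeholders):

import cv2
import paramiko

def save_frame_to_server(frame, remote_path, host, user, password):
    # Encode the frame to PNG in memory so nothing is written to the local disk
    ok, buf = cv2.imencode('.png', frame)
    if not ok:
        raise RuntimeError('could not encode frame')
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=user, password=password)
    sftp = ssh.open_sftp()
    try:
        # Write the encoded bytes straight into a file on the server
        remote_file = sftp.open(remote_path, 'wb')
        remote_file.write(buf.tobytes())
        remote_file.close()
    finally:
        sftp.close()
        ssh.close()

# e.g. save_frame_to_server(frame, '/path/to/folder/on/server/shot.png',
#                           'myserver', 'me', 'secret')

ssh.exec_command is not needed here; SFTP alone can create the remote file.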
Related
My Python program runs in an Azure virtual machine and needs to read a CSV file from my local machine.
How can Python code running in the Azure VM read data from my local disk/local machine?
The code executes in the Azure VM, and the data needs to be imported from the local machine.
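The VM cannot see your local disk directly, so the file has to pass through something both machines can reach. One possible pattern is to upload the CSV to Azure Blob Storage from the local machine and read it from the VM; this is only a sketch, and it assumes the azure-storage-blob and pandas packages plus a hypothetical connection string, container and blob name:

import io
import pandas as pd
from azure.storage.blob import BlobServiceClient

# Placeholders: use your own connection string, container and blob names
service = BlobServiceClient.from_connection_string('<your-connection-string>')
blob = service.get_blob_client(container='incoming', blob='data.csv')

# On the local machine: upload the CSV
with open('data.csv', 'rb') as f:
    blob.upload_blob(f, overwrite=True)

# On the Azure VM: download the blob and load it with pandas
csv_bytes = blob.download_blob().readall()
df = pd.read_csv(io.BytesIO(csv_bytes))
print(df.head())

Alternatively, copying the file to the VM with scp/SFTP first and then reading it locally on the VM works just as well.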
I'm trying to use a python script to automate downloading a file from AWS s3 to my local machine. The python script itself is hosted on ubuntu (AWS ec2 instance), so it's not recognizing a directory on my local machine.
Here's my code:
import os
import boto3
from boto3.session import Session
print("this script downloads the file from s3 to local machine")
s3 = boto3.resource('s3')
BUCKET_NAME = 'sfbucket.myBucket'
KEY = 'sf_events.json'
s3.Bucket(BUCKET_NAME).download_file(KEY, '/Users/Documents/my_file.json')
print('end')
However, this gives me the following error:
FileNotFoundError: [Errno 2] No such file or directory: '/Users/Documents/my_file.json.CDC5FEf4'
Can anyone tell me what I'm doing wrong? If I replace the output directory with /home/ubuntu/ it works fine, but I want the file on my local machine. Thanks in advance.
The script has to run on your local machine, not on the EC2 instance.
Alternatively, you can simply use aws cli
aws s3 cp s3://sfbucket.myBucket/sf_events.json /Users/Documents/my_file.json
Since you are trying to download the file from the EC2 instance, that machine doesn't know anything about your local path /Users/Documents/my_file.json.
You have two options:
Run this script directly on your local machine.
In this case, run the script on your local machine and make sure it has credentials with access to the bucket.
Run this script on the EC2 instance and then copy the file.
In this case, download the file to somewhere like /home/ubuntu/ and afterwards copy it from the EC2 instance to your local machine.
You can use SCP for that copy (see the example below).
You could also push the file from EC2 by running an SSH/SCP server on your local machine, but your local machine probably doesn't have a static public IP, so every time your IP changes you would have to update the script. Maybe you should think about doing whatever you need to do with this file directly on the EC2 instance, or, if it's really a one-off manual thing, sending it via email.
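For example, pulling the file from the EC2 instance onto your local machine with scp could look like this (the key file, hostname and paths are placeholders; adjust them to your instance):
scp -i my-key.pem ubuntu@<ec2-public-dns>:/home/ubuntu/sf_events.json /Users/Documents/my_file.json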
How can a webcam on my local PC be accessed from a remote server in order to do remote development?
My remote server configuration is:
Ubuntu 16.04
OpenCV 4
Python 3.6
I'm using PyCharm 2019.1.3 on my local PC for remote development. All the other configuration, code execution, debugging, etc. works fine; only the camera cannot be accessed.
All the files (code and data) are on my remote server.
Development (adding/editing files) happens on the remote server.
The code executes on the remote server, triggered from the local PC.
I searched the internet and found some related questions, but one is for Docker (Access webcam using OpenCV (Python) in Docker?) and another is https://answers.opencv.org/question/199105/videoio-error-v4l-cant-open-camera-by-index-0/
Neither solved my problem.
if video_file is not None:
    # Read frames from a video file on disk
    video = cv2.VideoCapture(video_file)
    assert video.isOpened()
else:
    if disable_vidgear:
        # Read frames from the camera directly with OpenCV
        video = cv2.VideoCapture(camera_id)
        assert video.isOpened()
    else:
        # Read frames from the camera through vidgear's CamGear
        video = CamGear(camera_id).start()
I'm passing video_file (file name and path) and camera_id (which is 0) as arguments. It works fine when the code runs on the local PC.
When I run the code on the server, I get this error: "VIDEOIO ERROR: V4L: can't open camera by index 0".
I tried different indices (-1, 0, 1, 2, etc.) and got the same error.
I hope to get some valuable suggestions from experts.
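The remote server has no /dev/video* device, so cv2.VideoCapture(0) cannot work there; the frames have to travel over the network. One possible workaround, only as a rough sketch: capture on the local PC, JPEG-encode each frame, and push it over a plain TCP socket to the server, which decodes it with cv2.imdecode. The port below is arbitrary and the host and processing parts are placeholders.

import socket
import struct
import cv2
import numpy as np

def _recv_exactly(conn, n):
    # Read exactly n bytes, or return None if the connection closed
    buf = b''
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf

def send_frames(server_host, server_port=5001, camera_id=0):
    # Runs on the local PC: grab webcam frames and ship them to the server
    cap = cv2.VideoCapture(camera_id)
    sock = socket.create_connection((server_host, server_port))
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpg = cv2.imencode('.jpg', frame)
            if not ok:
                continue
            data = jpg.tobytes()
            # Length-prefix each frame so the receiver knows where it ends
            sock.sendall(struct.pack('>I', len(data)) + data)
    finally:
        cap.release()
        sock.close()

def recv_frames(listen_port=5001):
    # Runs on the remote server: accept one sender and decode its frames
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(('', listen_port))
    srv.listen(1)
    conn, _ = srv.accept()
    try:
        while True:
            header = _recv_exactly(conn, 4)
            if header is None:
                break
            (length,) = struct.unpack('>I', header)
            payload = _recv_exactly(conn, length)
            if payload is None:
                break
            frame = cv2.imdecode(np.frombuffer(payload, dtype=np.uint8), cv2.IMREAD_COLOR)
            # ... process `frame` here as if it came from cv2.VideoCapture ...
    finally:
        conn.close()
        srv.close()

vidgear's NetGear offers a similar send/receive pattern out of the box, which may be simpler than hand-rolling sockets.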
I have been having trouble executing my Python script on a remote server. When I build the Docker image on my local machine (macOS Mojave), the image runs my script fine. The same Docker image built on the remote server has issues running the Python script. I was under the impression that a Docker image would produce the same results regardless of its host OS. Correct me if I'm wrong: what can I do to produce the same results on the remote server VM? What exactly is the underlying problem causing this issue? Should I request a new remote VM with certain specifications?
I'm trying to run a Python script on a remote Linux machine accessed by SSH (PuTTY). I want to change into the Windows directory and run a program which converts the files to CSV and saves them to the server.
Is it possible to run the program without moving the files from remote to local, running the conversion, and then moving them back from local to remote?
I am not the root user and can't install anything on the Linux machine.
My Windows machine is 64-bit and the Linux machine is 64-bit Ubuntu. Any suggestions?
I found a way to do what I wanted. Doing what I initially wanted would require transferring the files from the local machine to the remote machine, running the script, and transferring the results back, so ultimately it's a function of how fast my internet connection is. Since my local connection isn't that strong, I realized my initial approach was flawed. Eventually I just uploaded my data to the remote machine and ran the script there. It was the fastest solution.