Subprocess changes folder permission on PiCloud (an Amazon EC2 server)? - python

In Ubuntu, I use subprocess.Popen to call an executable file that saves some output files to the hard drive of the server (PiCloud). The code has been tested successfully on a local Ubuntu machine, but it does not work on the server. My approach is listed below:
#Create a new folder.
#The permission is drwxrwx--- both locally and on the server end.
os.makedirs(folder)
#Copy the executable file to this folder.
#The permission is drwxrwx--- both locally and on the server end after the copy.
shutil.copy("a.exe", folder)
#Call this exe file.
#The permission is drwxrwx--- locally but changes to drwxr-x--- on the server end.
#Since I no longer have write permission, my code fails.
subprocess.Popen("a.exe")
>>> OSError: [Errno 13] Permission denied
I am not sure why subprocess changes my folder permission on the server end, so I tried sudo (note that Popen takes the command and its arguments as a single list):
subprocess.Popen(["sudo", "a.exe"])
As I expected, this code works locally only.
So can anyone give me some help on why subprocess removes my write permission?
Thanks!

In Windows, you can run it with the command:
subprocess.Popen("runas /user:Admin a.exe")

Related

Jenkins to Linux Issues - Python File execution

I am quite new to Linux setup, and I am finding it really tough to make my Jenkins machine talk to a Python program residing on a Linux server.
Here are the details:
I have a python file which resides in a path '/filenet/EFBI/Scripts/Test_Scripts/add.py'
I have created one Jenkins job to run the add.py file; the Jenkins home path is set on the same Linux machine: '/jenkins/jenkins_fmlvlm0000/jenkins2_176/Jenkins_Home'
I do not have access to the workspace folder inside Jenkins_Home (I log in to the Linux box with an ordinary user).
In the Jenkins Job configuration I have written the below code:
#! /usr/local/bin python3
python3 /filenet/EFBI/Scripts/Test_Scripts/add.py
When this gets executed, I receive an error:
started by user Jenkins Admin User
Running as System
Building in workspace /jenkins/jenkins_fmlvlm0000/jenkins2_176/Jenkins_Home/Workspace/PythonTest
[PythonTest] $ /usr/local/bin python3 /jenkins/jenkins_fmlvlm0000/jenkins2_176/apache-tomcat-9.0.20/temp/jenkins123456789.sh
FATAL : command execution failed
java.io.IOException: error=13, Permission Denied
....
....
....
....
caused: java.io.IOException: cannot run program "/usr/local/bin" (in directory "/jenkins/jenkins_fmlvlm0000/jenkins2_176/Jenkins_Home/Workspace/PythonTest"): error=13, Permission Denied
Now I have the following questions:
Where am I going wrong?
How do I find out which user Jenkins uses to run the Python program on the Linux machine?
Is this a permission issue, and if so, what permissions are required?
The issue is resolved now.
We had to create a slave node and then, on the Linux machine, create a worker folder for the agent to work in.
Then, in the node setup, we used the same user ID and password with which we log in to the Linux machine and granted the required SSH permissions to make it work.
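Independently of the agent setup, the error log points at a second problem: the step's first line #! /usr/local/bin python3 makes Jenkins treat the directory /usr/local/bin as the interpreter (note cannot run program "/usr/local/bin" in the trace), which fails with error=13. A corrected build step, assuming python3 really lives at /usr/local/bin/python3 on the agent:

#!/bin/sh
/usr/local/bin/python3 /filenet/EFBI/Scripts/Test_Scripts/add.py

A plain python3 invocation also works if the interpreter is on the agent user's PATH.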

Adding read/write permission to access external HD under Catalina

I'm working on a Mac application that contains a command line tool.
The GUI (Electron) uses a sudo prompt to run the command line tool, which is written in Python.
Under Catalina, the app gets all the permissions it needs to read and write protected directories like Desktop with no issue.
The needed permission requests were added to the plist, and the app is signed with a list of entitlements.
All works well except accessing the external hard drive. Any attempt to create a folder or modify its properties, using the chmod command for example, returns permission denied.
Just to note, calling the same command from the terminal works with no issue.
Any ideas how I can get around this?
How can the app request Full Disk Access?
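One avenue worth checking (an assumption on my part, since this question went unanswered here): since Catalina, files on removable volumes sit behind their own consent prompt, and macOS only shows that prompt if the app declares a usage description in its Info.plist. A sketch of the hypothetical entry:

<key>NSRemovableVolumesUsageDescription</key>
<string>This app needs access to external drives to create and modify folders.</string>

Full Disk Access itself cannot be requested programmatically; the user has to grant it manually under System Preferences > Security & Privacy > Privacy > Full Disk Access.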

Permission denied when django writes to external device

I have a Django file server. The server works perfectly on my laptop and localhost (with the external hard drive), but when I transferred it to my Raspberry Pi running Raspbian, it started acting up.
I did a lot of googling and tried every possible solution, but nothing works.
Here is my problem:
I have connected an external hard drive to my raspberry pi. I believe it has write permissions because I can easily write to it with mkdir. I have also set this directory which is /media/pi/SAMSUNG/media as my MEDIA_ROOT.
Now I have set up Apache2, WSGI and Django, along with all the permissions, and everything works; but still, when Django tries to access the hard drive, whether to read or write, I get an error: [Errno 13] Permission denied: '/media/pi/SAMSUNG'.
I have fixed this in the past with chmod -R 777, but it does not work this time.
Unfortunately, I have no idea what I am doing when it comes to servers and file permissions, so I have no idea what code to attach. Can someone please help me?
I will attach all the necessary code on request.
Thank you
On Raspbian you have to be careful when using external drives. Have you edited the file /etc/fstab? This is necessary for the drive to be mounted correctly after restarts, and also when it is temporarily disconnected and then reconnected.
If you did this correctly, the drive will have the permissions you gave it when mounting, and Django should be able to access and write to it.
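As an illustration (the UUID, filesystem type, and numeric IDs are assumptions; check yours with blkid and id www-data), an /etc/fstab entry for an NTFS-formatted drive that makes it writable by Apache's www-data user could look like:

UUID=XXXX-XXXX  /media/pi/SAMSUNG  ntfs-3g  defaults,uid=33,gid=33,umask=002,nofail  0  0

Note that the uid/gid/umask mount options only apply to filesystems without native Unix permissions (FAT/NTFS); for an ext4 drive you would chown the mount point for www-data instead.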

Python application permission denied

My Python program uses terminal (system) commands to perform tasks on files and scripts. I am going to convert this Python program into a macOS application and a Linux application using PyInstaller, and pass the application installer on to my friends. However, I have the following questions.
If a script or file my program tries to access doesn't have the proper permissions, will Python get an error?
Running some scripts or opening some files requires root permission. Is there an option that will prompt the user for the root (admin) password, or run my application with root privileges?
Thanks
Try chmod 777 filename.py; this gives the file full rights for execution and editing. There are also other chmod modes, like 755, which will also work for your case.
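To address both questions directly: yes, Python raises PermissionError (a subclass of OSError) when it lacks access, and a common pattern is to relaunch the program under sudo so the user is prompted for their password. A minimal sketch, assuming a plain script run from a terminal (a frozen PyInstaller binary would re-exec sys.executable with sys.argv[1:] instead):

import os
import sys

# If we are not root, restart the whole program under sudo,
# which prompts the user for their password on the terminal.
if os.geteuid() != 0:
    os.execvp("sudo", ["sudo", sys.executable] + sys.argv)

# From here on, the program runs with root privileges.
try:
    os.remove("/etc/some-protected-file")  # hypothetical privileged action
except PermissionError as err:
    sys.exit("Access denied: {}".format(err))

For a GUI bundle with no terminal attached, sudo cannot prompt; on Linux a graphical alternative such as pkexec is the usual substitute.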

Environment PATH with PyCharm remote connection

I have a remote server, and I want to use PyCharm to execute my scripts on this remote server using SFTP and SSH. Everything works properly, except that, when running the scripts on the server through the PyCharm "Run" command, the PATH is wrong. I am trying to understand which file I should edit to properly set the PATH environment variable. I am using a Mac and the server runs Ubuntu. After having installed several libraries on the server, I edited the .bashrc file in my home directory on the server to properly set up the PATH. Then the following happens:
If I log in to the server using ssh and the Mac terminal, the PATH is loaded correctly.
If I open an ssh session inside PyCharm and connect to the server, the PATH is loaded correctly.
If I run a simple script (that only contains calls to the standard library) on the remote server through the PyCharm "Run" command, it works.
If I run the full script on the remote server through the PyCharm "Run" command, it crashes: os.environ['PATH'] shows that the path is not correct.
If I run the script by directly calling the Python interpreter on the server from one of the terminal ssh sessions, the PATH is set up properly and it works.
So apparently the PATH is set up properly if I connect using ssh and the terminal, but when PyCharm runs the script directly, the PATH is not updated and it crashes.
So I am guessing PyCharm doesn't read .bashrc before running the program. Which file should I modify then? Note that the only "configuration" files present in my home directory on the remote server are .bashrc and .profile.
Note I do not have sudo privileges on this server.
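This is expected: PyCharm's remote "Run" launches the Python interpreter over SSH directly, without an interactive login shell, so startup files like .bashrc and .profile are typically not sourced. Two workarounds (both assumptions about your setup rather than PyCharm guarantees): add the missing directories under Run/Debug Configurations > Environment variables, or patch PATH at the top of the script before anything depends on it:

import os

# Hypothetical fix-up: prepend the directories your .bashrc would have
# added, since no shell startup file runs under PyCharm's remote "Run".
extra_dirs = ["/home/myuser/.local/bin", "/opt/mytools/bin"]  # assumed paths
os.environ["PATH"] = os.pathsep.join(extra_dirs + [os.environ.get("PATH", "")])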
