I'm working at an architecture firm where we are trying to use the Python API for ArchiCAD.
It would give us some much-needed automation for certain workflows, but we haven't adopted it yet, because we don't know how to limit users' rights to execute Python.
We are planning to put some Python scripts on an internal server so that our co-workers can use the "ArchiCAD Palette" to execute the scripts from the server.
But we are worried that someone could write their own script and harm our projects, delete data, etc.
So we were thinking about only allowing the execution of Python scripts from a certain folder.
-> The folder would be read-only for users so they can't drop in their own scripts.
Is it possible to do that?
If not, can anyone recommend a way in which we can limit our co-workers to using only the scripts we have written, and prevent them from running anything harmful?
Thanks for any Help
Kind Regards
Dayiz
So far, for web scraping projects I've used Google Apps Script, which means I can easily trigger a script to run once a day.
Is there an equivalent service for Python scripts? I have a Raspberry Pi, so I guess I could keep it on 24/7 and use cron jobs to trigger the script daily. But that seems rather wasteful, since I'm talking about a few small scripts that take only a few seconds to run.
Is there any service that allows me to trigger a python script once a day? (without a need to keep a local machine on 24/7) The simpler the solution the better, wouldn't want to overengineer such a basic use case if a ready-made system already exists.
The only service I've found so far that does this is WayScript; here's a Python example running in the cloud. The free tier should be enough for most simple/hobby-tier use cases.
I have a web server on which I have developed an application using PHP and SQL; I mainly picked PHP because I am more comfortable with it.
In short, the application automates some of our network tasks.
As part of this I have automated some SolarWinds tasks, and since the orionsdk library doesn't have a PHP equivalent, I have used Python.
It's all working fine, but I really need to run these Python scripts from my browser.
I have considered using PHP's shell_exec and making my Python scripts accept args so I can run them and parse the output.
I know I could also use Flask or Django, but I worry I would then have a Flask app to maintain as well as a PHP app.
I think the question is: what is the best way to achieve this, or is there an approach I haven't mentioned?
Any help would be very much appreciated
So you want PHP to communicate with Python, and you've already mentioned using shell commands and HTTP traffic.
I imagine you could also achieve something similar by connecting both PHP and Python to the same database. In that case PHP could write a record to a table, and Python could pick that up and do something with the data in it. Python could either be a long-running process or be fired off by a cron job. If a database seems like overkill, you could also write a file to some place on disk, which Python can pick up.
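If you go the shared-database route, the Python side can be a small poller. Below is a rough sketch that uses SQLite purely to keep it self-contained (the same pattern works with MySQL/Postgres); the tasks table, column names, and handle_task logic are placeholders, not anything from your setup.

    import json
    import sqlite3

    def handle_task(payload):
        # placeholder for the real work (e.g. calling your network tooling)
        print("processing", payload)

    def poll_tasks(db_path="shared.db"):
        conn = sqlite3.connect(db_path)
        conn.row_factory = sqlite3.Row
        cur = conn.execute("SELECT id, payload FROM tasks WHERE status = 'pending'")
        for row in cur.fetchall():
            handle_task(json.loads(row["payload"]))
            conn.execute("UPDATE tasks SET status = 'done' WHERE id = ?", (row["id"],))
        conn.commit()
        conn.close()

    if __name__ == "__main__":
        poll_tasks()  # run from cron, or wrap in a sleep loop for a long-running process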
Personally, I'd go for the shell_exec approach if you want to keep it lightweight, and for an API connection if you want a more robust solution that will need to be expanded later on.
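For the shell_exec route, the main thing on the Python side is to accept arguments and emit something that is easy to parse. A minimal sketch (the argument names and the stubbed result are illustrative, not the real orionsdk calls): the script prints JSON, which the PHP side can call via shell_exec()/escapeshellarg() and read with json_decode().

    #!/usr/bin/env python3
    import argparse
    import json

    def main():
        parser = argparse.ArgumentParser(description="Run one SolarWinds task")
        parser.add_argument("--node", required=True, help="node name or IP to act on")
        parser.add_argument("--action", required=True, choices=["unmanage", "remanage"])
        args = parser.parse_args()

        # the real orionsdk call would go here; this stub just echoes the request
        result = {"node": args.node, "action": args.action, "status": "ok"}
        print(json.dumps(result))

    if __name__ == "__main__":
        main()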
I have a python script which executes a string of code with the exec function. I need a way to restrict the read/write access of the script to the current directory. How can I achieve this?
Or, is there a way to restrict the python script's environment directly through the command line so that when I run the interpreter, it does not allow writes out of the directory? Can I do that using a virtualenv? How?
So basically, my app is a web portal where people can write and execute Python apps and get a response, and I've hosted this on Heroku. There might be multiple users with multiple folders, and no user should have access to other users' folders or to system files and folders. The permissions should be determined by the user of the Node.js web app, not by a local user. How do I achieve that?
Execute the code as a user that only owns that specific directory and has no permissions anywhere else?
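On Linux/macOS that idea can look roughly like the sketch below: launch the snippet as a dedicated unprivileged account whose home directory is the only place it may write. The account name is an assumption (you would create it yourself), the parent process has to start with enough privilege to switch users, and, per the caveats below, this is not a real sandbox for hostile code.

    import os
    import pwd
    import subprocess

    def run_restricted(code, account="sandboxuser", timeout=10):
        user = pwd.getpwnam(account)          # hypothetical locked-down account

        def drop_privileges():
            os.setgid(user.pw_gid)            # drop group first, then user
            os.setuid(user.pw_uid)

        return subprocess.run(
            ["python3", "-c", code],
            cwd=user.pw_dir,                  # start inside the only writable directory
            preexec_fn=drop_privileges,
            capture_output=True,
            text=True,
            timeout=timeout,                  # keep runaway snippets from hanging the server
        )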
However, if you do not completely trust the source of the code, you should simply not be using exec under any circumstances. Remember, even if you came up with a Python-level solution, the exec'd code could literally undo whatever restrictions you put on it before doing its nefarious deeds. If you tell us the problem you're trying to solve, we can probably come up with a better idea.
The question boils down to: how can I safely execute code I don't trust?
You can't.
Either you know what the code does or you don't execute it.
You can give your process an isolated environment, for example with Docker, but its intended use cases are a long way from executing unsafe code.
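For completeness, a sketch of the Docker idea: shell out to `docker run` with a read-only root filesystem, no network, and the user's directory as the only writable mount. The image name, resource limits, and paths are assumptions; this narrows the blast radius but, as said above, it is not a guarantee against truly hostile code.

    import subprocess

    def run_in_container(code, workdir, timeout=30):
        return subprocess.run(
            [
                "docker", "run", "--rm",
                "--network", "none",                  # no network access
                "--read-only",                        # container root filesystem is read-only
                "--memory", "128m", "--cpus", "0.5",  # basic resource limits
                "-v", f"{workdir}:/work",             # the only writable location
                "-w", "/work",
                "python:3.11-slim",
                "python", "-c", code,
            ],
            capture_output=True,
            text=True,
            timeout=timeout,
        )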
I'm about to start working on a project where a Python script remotes into a Windows server and reads a bunch of text files in a certain directory. I was planning on using a module called WMI, as that is the only way I have been able to successfully access a Windows server remotely using Python, but upon further research I'm not sure I am going to use this module.
The only problem is that these text files are updated roughly every 2 seconds, and I'm afraid the script will crash with a mutex/sharing error if it tries to open a file while it is being rewritten. The only thing I can think of is creating a new directory, copying all the files (via the script) into it in the state they are in, reading them from there, and constantly overwriting those copies with fresh ones once it finishes checking the old ones. Unfortunately, I don't know how to do this correctly or efficiently.
How can I go about doing this? Which Python module would be best suited for it?
There is Windows support in Ansible these days. It uses winrm. There are plenty of Python libraries that utilize winrm, just google it, but Ansible is very versatile.
http://docs.ansible.com/intro_windows.html
https://msdn.microsoft.com/en-us/library/aa384426%28v=vs.85%29.aspx
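As an illustration of the "Python libraries that utilize winrm" part, here is a small sketch using the pywinrm package to read a remote text file via PowerShell. The host name, credentials, and file path are placeholders, and WinRM has to be enabled and reachable on the server.

    import winrm  # pip install pywinrm

    session = winrm.Session("fileserver01", auth=(r"DOMAIN\svc_reader", "secret"))
    result = session.run_ps(r"Get-Content -Raw 'D:\logs\status.txt'")

    if result.status_code == 0:
        print(result.std_out.decode("utf-8"))
    else:
        print("remote read failed:", result.std_err.decode("utf-8"))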
I've done some work with WMI before (though not from Python) and I would not try to use it for a project like this. As you said WMI tends to be obscure and my experience says such things are hard to support long-term.
I would either work at the Windows API level, or possibly design a service that performs the desired actions and access this service as needed. Of course, you will need to install this service on each machine you need to control. Both approaches have merit. The WinAPI approach pretty much guarantees you don't invent any new security holes and is simpler initially. The service approach should make the application faster and require less network traffic. I am sure you can think of other trade-offs easily.
You still have to have the necessary permissions, network ports, etc. regardless of the approach. E.g., WMI is usually blocked by firewalls and you still run as some NT process.
Sorry, not really an answer as such -- meant as a long comment.
ADDED
Re: API programming: though you have no Windows API experience, I expect you will find it familiar for tasks such as you describe; reading and writing files and scanning directories are nothing unique to Windows. You only need to learn about the parts of the API that interest you.
Once you create the appropriate security contexts and start your client process, there is nothing service-oriented about the code; you can simply open and close files, etc., ignoring the fact that the files are remote, other than the server name being included in the UNC path of the file/folder location.
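To make that concrete: if the script runs under an account with access to the share, reading the files is ordinary file I/O against a UNC path. A small sketch (server and share names are placeholders), with a simple retry to ride out the "file is rewritten every 2 seconds" situation from the question:

    import time
    from pathlib import Path

    def read_with_retry(path, attempts=5, delay=0.5):
        for attempt in range(attempts):
            try:
                return path.read_text()
            except OSError:
                if attempt == attempts - 1:
                    raise
                time.sleep(delay)  # the writer likely has the file open; try again shortly

    for f in Path(r"\\fileserver01\logs").glob("*.txt"):
        print(f.name, len(read_with_retry(f)))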
I have a Django site that needs to be rebuilt every night. I would like to check out the code from the Git repo and then do things like setting up the virtual environment, downloading the packages, etc. There would be no manual intervention, as this would be run from cron.
I'm really confused as to what to use for this. Should I write a Python script or a shell script? Are there any tools that assist with this?
Thanks.
So what I'm looking for is CI, and from what I've seen I'll probably end up using Jenkins or Buildbot for it. I've found the docs rather cryptic for someone who's never attempted anything like this before.
Do all CI tools like Buildbot/Jenkins simply run tests and send you reports, or do they actually set up a working Django environment that you can access through your browser?
You'll need to create some sort of build script that does everything but the Git checkout. I've never used any Python build tools, but perhaps something like SCons would work: http://www.scons.org/.
Once you've created a script, you can use Jenkins to schedule a nightly build and report success/failure: http://jenkins-ci.org/. Jenkins will know how to check out your code, and then you can have it run your script.
There are literally hundreds of different tools to do this. You can write Python scripts to be run from cron, you can write shell scripts, or you can use one of the hundreds of different build tools.
Most Python/Django shops would likely recommend Fabric. This really is a matter of running through everything that needs to be done and understanding how to script it. Do you need to run a test suite before you deploy to make sure nothing breaks? Do you need to run South database migrations? You really need to think about what needs to be done, and then you just write a Fabric script to do those things.
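For reference, a nightly-rebuild fabfile might look roughly like this (classic Fabric 1.x style, since that is the tool mentioned; the paths and commands are placeholders for your own project):

    # fabfile.py -- run with `fab rebuild`, e.g. from cron or Jenkins
    from fabric.api import lcd, local

    def rebuild():
        with lcd("/srv/mysite"):                         # hypothetical checkout location
            local("git pull")
            local("virtualenv .venv")                    # recreate the virtual environment
            local(".venv/bin/pip install -r requirements.txt")
            local(".venv/bin/python manage.py test")     # fail the build if tests fail
            local(".venv/bin/python manage.py migrate")  # South migrations on older projects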
None of this even touches on the fact that what you're asking for is, overall, continuous integration, which itself has a whole slew of tools to help manage it.
What you are asking for is Continuous Integration.
There are many CI tools out there, but in the end it boils down to your personal preferences (like always, hopefully) and which one just works for you.
The Django project itself uses buildbot.
If you ask me, I would recommend continuous.io, which works out of the box with Django applications.
You can manually set how often you would like to build your Django project, which is great.
You can, of course, write a shell script that rebuilds your Django project via cron, but you deserve better than that.