I was trying to figure out whether it is possible to run a Python script (using watchdog) to monitor when a file is inserted into a folder hosted on the OneDrive platform. I work at a company and we want to monitor a folder that is shared with everyone. I think a program that monitors the OneDrive side is better than installing the same program for every client of this OneDrive cloud application: I can't guarantee that local changes will be uploaded and shared with everyone, and some users could stop the monitoring program by mistake.
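For context, monitoring from the service side rather than each client could look roughly like the sketch below, which polls Microsoft Graph's drive delta endpoint from one central machine. The folder path `Shared`, the helper names, and the token handling are all assumptions; you would need to register an app in Azure AD to obtain an access token.

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def new_files(delta_json):
    """Pick out added/changed files from a Graph delta response,
    skipping folders and tombstones for deleted items."""
    return [item["name"]
            for item in delta_json.get("value", [])
            if "file" in item and "deleted" not in item]

def poll_folder(token, delta_link=None):
    """One polling round. Pass the returned delta link back in on the
    next call so Graph only reports changes since the last round."""
    # 'Shared' is a hypothetical folder path under the drive root
    url = delta_link or f"{GRAPH}/me/drive/root:/Shared:/delta"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return new_files(data), data.get("@odata.deltaLink")
```

Run `poll_folder` in a loop on one server and nothing needs to be installed on users' machines, so nobody can accidentally stop the monitor.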
I've been tasked with solving an ML/DL problem.
But due to the processing limitations of my system, further development of the program has come to a halt. My program is required to have a GUI for the layman user, which I've already designed and implemented.
Given these system limitations, I was wondering whether I could somehow connect this Python GUI instance on my local machine to Google Colab. The user would use the GUI to upload files; upon pressing the submit button, Google Colab would import these files, do the ML/DL processing on them, and return some output to the GUI for the user.
Any ideas of how I would achieve this?
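One pattern to consider is exposing an HTTP endpoint from the Colab notebook (e.g. a small Flask app behind a tunnelling tool) and having the local GUI post files to it. A minimal client-side sketch, where the URL, the `X-Filename` header, and the JSON response shape are all hypothetical:

```python
import json
import urllib.request

def build_request(url, filename, data: bytes):
    # send the raw file bytes; the filename travels in a header so the
    # notebook side knows what it received
    return urllib.request.Request(
        url,
        data=data,
        headers={"X-Filename": filename,
                 "Content-Type": "application/octet-stream"},
        method="POST",
    )

def submit(url, path):
    """Upload one file and return the notebook's JSON result."""
    with open(path, "rb") as f:
        req = build_request(url, path, f.read())
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The GUI's submit button would call `submit(...)` in a background thread so the interface stays responsive while Colab processes the file.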
I have a Windows application built with Progress OpenEdge technology.
I have created a Python script to generate an Excel file, but I need to deploy it to the client, and I'm afraid it will require special permissions on the client side if I compile it to an .exe and attempt to run it.
Can someone suggest a method to integrate Python with my project smoothly, without breaking anything?
You could compile it on your own machine and then try to run it while logged in as a guest user. If a guest account can run it without complaints, it will probably run fine on the client machine.
This is crude, because you still haven't tested every possible client platform (unless you're talking about one specific client), and we don't know what's inside your script.
Use icacls to set appropriate permissions on your compiled script before shipping.
I'm not sure about the special-permissions issue, but is it possible for you to turn your script into a CGI program and put it on your web server, or wrap it with WebSpeed? Then your app could call a web service to get the .xls file.
As the title suggests, I'm looking for a way to run local Python scripts from the Google Apps Script environment. Right now my method is to write a JSON file from Apps Script and save it to a Google Drive directory. The local machine polls the directory via Google Drive File Stream and runs the necessary code when the file appears. Not pretty, but it gets the job done. The main concern is latency: it can take up to 15 minutes after a file is put in the directory before File Stream catches it and syncs the local machine.
Is anybody aware of other options? I could try rewriting the Python code in Apps Script, though it relies on a few third-party libraries that aren't available in JS. Another idea is setting up a Flask server, but that seems like a good bit of work for the return.
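For what it's worth, the server route can be less work than it sounds: the standard library alone is enough for a single-endpoint receiver that Apps Script can call with UrlFetchApp. This is a sketch, assuming the machine is reachable from Google's servers (e.g. via a tunnel or port forwarding); `run_job` is a placeholder for the real local script.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_job(payload):
    # placeholder for the script the Apps Script side wants to trigger
    return {"ok": True, "received": payload}

class JobHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # read the JSON body Apps Script POSTs, run the job,
        # and reply with the result as JSON
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(run_job(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# to start the server:
# HTTPServer(("0.0.0.0", 8000), JobHandler).serve_forever()
```

Because the request goes straight to the machine, there is no File Stream sync delay at all.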
I'm trying to save an image after processing it with OpenCV on a Python Bluemix server, but an error occurs. I'm reading a local image that I upload when I push the app, but I can't write to the same location.
import cv2

image = cv2.imread('static/images/image.jpg')  # works
# do some processing on the image...
processed_image = image  # placeholder for the real processing
cv2.imwrite('static/images/processed.jpg', processed_image)  # works locally but not on the server
Do I need some permissions? A special folder? Is the server read-only?
Thanks in advance
I would probably look at using object storage for the image. This will give you more options if you later want to split your code into microservices. Writing to the local file system is also not advised:
Avoid Writing to the Local File System

Applications running on Cloud Foundry should not write files to the local file system for the following reasons:

Local file system storage is short-lived. When an application instance crashes or stops, the resources assigned to that instance are reclaimed by the platform, including any local disk changes made since the app started. When the instance is restarted, the application will start with a new disk image. Although your application can write local files while it is running, the files will disappear after the application restarts.

Instances of the same application do not share a local file system. Each application instance runs in its own isolated container. Thus a file written by one instance is not visible to other instances of the same application. If the files are temporary, this should not be a problem. However, if your application needs the data in the files to persist across application restarts, or the data needs to be shared across all running instances of the application, the local file system should not be used. We recommend using a shared data service like a database or blobstore for this purpose.
Source: https://docs.cloudfoundry.org/devguide/deploy-apps/prepare-to-deploy.html#filesystem
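As a rough illustration of the object-storage route, the sketch below uploads the processed image bytes to IBM Cloud Object Storage through its boto3-compatible SDK (ibm-cos-sdk). The bucket name, environment variables, and helper names are assumptions; check your service credentials for the real values.

```python
import os

def object_key(prefix, filename):
    """Build a predictable object key, e.g. 'processed/image.jpg'."""
    return f"{prefix}/{os.path.basename(filename)}"

def upload_image(bucket, key, image_bytes):
    # hypothetical credentials taken from environment variables;
    # import locally so the key helper above has no SDK dependency
    import ibm_boto3
    cos = ibm_boto3.client(
        "s3",
        ibm_api_key_id=os.environ["COS_API_KEY"],
        ibm_service_instance_id=os.environ["COS_INSTANCE_ID"],
        endpoint_url=os.environ["COS_ENDPOINT"],
    )
    cos.put_object(Bucket=bucket, Key=key, Body=image_bytes)
```

On the OpenCV side, `cv2.imencode('.jpg', processed_image)[1].tobytes()` gives you the bytes to upload, so nothing ever needs to touch the local disk.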
I'd like to check the status of a Dropbox account using the API in Python, e.g. how many files are left to upload. Is this possible?
You can't do it directly, but you can approximate it. What I do is place a dummy file in my Dropbox and let it sync to the server. At the very beginning of my script, I delete the file locally and then wait for it to reappear. Once it has reappeared, I have a good indication that Dropbox is connected and synced. It's not perfect, because more files might still be waiting to sync, but it's the best I could come up with.
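A minimal sketch of that dummy-file trick (the path and timeout values are assumptions):

```python
import os
import time

def wait_for_resync(path, timeout=60, poll=0.5):
    """Delete a dummy file and wait for the Dropbox client to restore it.

    Returns True once the file reappears (Dropbox is connected and has
    synced at least this file), or False on timeout.
    """
    if os.path.exists(path):
        os.remove(path)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll)
    return False
```

Call it with something like `wait_for_resync(os.path.expanduser('~/Dropbox/.sync_check'))` before the rest of the script runs.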
The Dropbox API is a way to add Dropbox integration to your application (that is, to allow your application to sync its files via Dropbox, or to explicitly upload/download files outside of the sync process), not a way to control or monitor the Dropbox desktop application (or the way other applications sync, or anything else).
So no, it's not possible.
The only way to do this is to write code for each platform to control the Dropbox app, e.g., via UI scripting on the Mac or intercepting WM messages on Windows. (Or, alternatively, it might be possible to write your own replacement sync tool and use that instead of the standard desktop app, in which case you can obviously monitor what you're doing.)