We have implemented a simple DynamoDB database that is updated by a remote IoT device which does not have a user (i.e. root) constantly logged in. We have experienced issues logging data: the database is not updated unless a user (i.e. root) is logged into the device (we log in via an SSH session). We are confident the process is running in the background, as we use a Linux service that runs on boot to execute a script. We have verified that the script runs on boot and successfully pushes data to DynamoDB once a user logs in (via SSH). We have also tried detaching a screen session so the device could publish data to DynamoDB without a login, but this did not fix the issue. Has anyone else experienced this? Does AWS require a user (i.e. root) to be logged into the device at all times in order for data to be published to AWS?
No, it does not. I have a similar setup and it works fine. Are you sure your IoT device does not go into some kind of sleep mode after a while?
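One common cause of this symptom: if the uploader is launched from a login script (e.g. .bashrc or a profile script) rather than a real system service, it lives and dies with the SSH session. A session-independent systemd unit avoids that; a minimal sketch, where the unit name and script path are placeholders for your own:

```ini
# /etc/systemd/system/dynamo-push.service  (name and paths are hypothetical)
[Unit]
Description=Push IoT data to DynamoDB
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/bin/python3 /home/pi/push_to_dynamo.py
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Enable it with sudo systemctl enable --now dynamo-push; it then starts at boot and keeps running whether or not anyone is logged in.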
I successfully hosted a Django website on my domain, but I want it to stay published without running the PuTTY session again and again.
I am using an OpenLiteSpeed server; please tell me what settings I need to configure to keep it alive all the time.
I tried PuTTY's connection keep-alive setting, but once I close PuTTY or power off the system, the site stops running.
I have a script that I run locally on my laptop, which polls a server and if certain criteria are met, it sends an email. I guess like a heartbeat app of sorts.
The server is a Raspberry Pi, it has a public IP address which is all working fine.
I'd like to host the Python script on Heroku, so that it can run without my laptop having to be on the local network or always running.
Does anyone have experience of Heroku to show me how I can have the script hosted and running constantly?
My other query is that free-tier Heroku dynos go to sleep after 30 minutes of inactivity, which would essentially stop the script until an HTTP request spins the instance up again.
Trying to find some form of elegant solution.
Many thanks for any advice you can give,
All the best,
Simon
If you have ssh access, use screen to keep your script running.
Install screen:
sudo apt install screen
Open a new session:
screen -S session_name
Now, you can run whatever you want, then detach the session by pressing Ctrl+A then D
To go back to the session at any time, first list the active sessions:
screen -ls
Then resume the session using:
screen -r session_name
Different approaches would be:
1- Cloud Functions
Create a lambda function at a cloud provider with your Python code (free tier eligible)
Trigger that function every once in a while (e.g. on a schedule)
2- Get a VM in the cloud
Go to AWS, GCP, etc.
Get a free-tier VM
Run the script from there
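As a sketch of the cloud-function route: the polling logic can be packaged as a handler that a scheduler (e.g. a cron-style rule) invokes every few minutes. The endpoint URL, threshold, and handler shape below are hypothetical placeholders for your own logic:

```python
import urllib.request

SERVER_URL = "http://example.com/status"  # hypothetical endpoint on your Pi
THRESHOLD = 100                           # hypothetical alert criterion

def criteria_met(raw_value):
    """Decide whether the polled value should trigger the email."""
    return int(raw_value) > THRESHOLD

def handler(event, context):
    """Entry point for a scheduled cloud function (invoked every few
    minutes by the provider's scheduler instead of running constantly)."""
    with urllib.request.urlopen(SERVER_URL, timeout=10) as resp:
        raw = resp.read().decode().strip()
    if criteria_met(raw):
        pass  # call your existing email-sending code here
    return {"checked": True}
```

Because the scheduler re-invokes the function each time, nothing needs to stay awake between polls, which sidesteps the free-tier sleep problem.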
I want to use Raspberry Pis to send data collected via Python scripts (Modbus TCP and RTU) to a database. These scripts run constantly on the Pi, which is connected to the products the data comes from.
Consequently, we have to ship the already set-up Raspberry Pi to the customer. The problem is that the database credentials are stored in the Python scripts running on the Raspberry Pi.
Is there a way to overcome this problem?
Naive solution: store the database credentials on your server (or somewhere on the internet), so every time the Raspberry Pi runs the script, it connects to the server to fetch the credentials first.
My recommended solution: create an API (perhaps a web API) that talks to the database, and have the Raspberry Pi talk only to this API. That way the client side never learns the database credentials, or anything else private you want to hide.
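A minimal sketch of the recommended API approach, assuming a hypothetical endpoint URL and a per-device token: the Pi authenticates to your API with a revocable token, and only the server ever holds the database credentials.

```python
import json
import urllib.request

API_URL = "https://api.example.com/readings"  # hypothetical endpoint
DEVICE_TOKEN = "per-device-token-abc123"      # hypothetical, issued per Pi

def build_request(reading):
    """Build an authenticated POST for one Modbus reading. The Pi only
    holds a revocable device token, never the database credentials."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps(reading).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + DEVICE_TOKEN,
        },
        method="POST",
    )

def push_reading(reading):
    """Send the reading; the server-side API writes it to the database."""
    with urllib.request.urlopen(build_request(reading), timeout=10) as resp:
        return 200 <= resp.status < 300
```

If a shipped Pi is ever compromised, you revoke that one token on the server instead of rotating the database password everywhere.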
I have created a web-app using Python Flask framework on Raspberry Pi running Raspbian. I want to control the hardware and trigger some sudo tasks on the Pi through web.
The Flask-based server runs in non-sudo mode, listening on port 8080. When a web client sends an HTTP request, I want to start a subprocess with sudo privileges (for example, to toggle GPIO pins, turn on the camera, etc.). What is the best practice for implementing this kind of behavior?
The webserver could ask the client for the sudo password, which could then be used to raise privileges. I want some pointers on how to achieve this.
Best practice is to never do this kind of thing. If you give sudo access to your Pi from the internet and then execute user input, you are giving everyone on the internet the ability to execute arbitrary commands on your system. I understand this is probably a pet project, but imagine someone getting access to your computer and turning on the camera when you don't expect it.
Create a separate process that does the actual work for you. You can run this process in the background with elevated privileges.
Your web client will just deposit tasks for this process in a queue. This can be as simple as a database to which your Flask app writes a "task request"; the other, privileged process reads this request, performs the action, and updates the database.
This concept isn't new, so there are multiple brokers and task queues that you can use. Celery is popular with Python developers and is easily integrated into your application.
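A minimal sketch of the queue idea using SQLite as the shared "database" (table and action names are made up): the Flask app calls enqueue() from its request handler, while a separate privileged process calls drain() in a loop.

```python
import sqlite3

DB = "tasks.db"  # shared by the Flask app (non-root) and the worker (root)

def init(db=DB):
    """Open the shared database and make sure the task table exists."""
    con = sqlite3.connect(db)
    con.execute("""CREATE TABLE IF NOT EXISTS tasks
                   (id INTEGER PRIMARY KEY, action TEXT,
                    done INTEGER DEFAULT 0)""")
    con.commit()
    return con

def enqueue(con, action):
    """Called from the Flask request handler: only record what should happen."""
    con.execute("INSERT INTO tasks (action) VALUES (?)", (action,))
    con.commit()

def drain(con, perform):
    """Run in the separate privileged process: execute pending tasks."""
    for task_id, action in con.execute(
            "SELECT id, action FROM tasks WHERE done = 0").fetchall():
        perform(action)  # e.g. toggle a GPIO pin, start the camera
        con.execute("UPDATE tasks SET done = 1 WHERE id = ?", (task_id,))
    con.commit()
```

The key property is that the privileged worker only ever executes actions it recognizes, never raw input from the web, so the Flask process needs no sudo at all.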
I'm going to be using Python to build a web-based asset management system to manage the production of a short CG film. The app will be intranet-based, running on a CentOS machine on the local network. I'm hoping you'll be able to browse through all the assets and shots and then open any of them in the appropriate program on the client machine (also running CentOS). I'm guessing there will have to be some sort of setup on the client side to allow the app to run commands, which is fine because I have access to all of the clients that will use it (although I don't have root access). Is this sort of thing possible?
As you already guessed, you will need to have a service running on the client PC listening on a predetermined port.
When the client requests to open an asset, your webapp will send a request to this running service to download the asset and run it. As long as your port number is above 1024 and you are not running anything that requires root access, you can run this service without root.
But this is a very bad idea, as it exposes the clients to malicious attacks. You will have to ensure every request to the client service is properly signed and that the client verifies each request as valid before executing it. There may be many other security factors to consider, depending on your implementation of the client service. But in general, a service that executes arbitrary requests from a remote machine is a very dangerous thing to have.
You may also not be allowed to run such a service on a client PC, depending on your company's IT policies.
You are better off having the client download the resource normally and then having the user execute it manually.
PS: You can have the client service run on a port below 1024, but it will have to start as root and after binding to the port drop all root privileges and change the running user to a different user using setuid (or the equivalent in your language of choice)
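If you do go this route, here is a sketch of such a client service with the request signing mentioned above, using only the standard library; the shared key, port, and tool whitelist are hypothetical:

```python
import hashlib
import hmac
from http.server import BaseHTTPRequestHandler, HTTPServer

SHARED_KEY = b"replace-with-a-real-secret"  # deployed to webapp and clients
PORT = 8765                                 # unprivileged port (> 1024)
ALLOWED = {"maya", "blender"}               # hypothetical launchable tools

def sign(body):
    """What the webapp does before sending: HMAC the request body."""
    return hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()

def verify(body, signature):
    """What the client service does on receipt: constant-time compare."""
    return hmac.compare_digest(sign(body), signature)

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        tool = body.decode().strip()
        # Reject anything unsigned or not on the whitelist.
        if not verify(body, self.headers.get("X-Signature", "")) \
                or tool not in ALLOWED:
            self.send_response(403)
            self.end_headers()
            return
        # Launch the whitelisted tool here, e.g.
        # subprocess.Popen([tool, asset_path])
        self.send_response(200)
        self.end_headers()

# To run the service (blocks forever):
# HTTPServer(("", PORT), Handler).serve_forever()
```

Note the service never executes the request body as a command; it only matches it against a fixed whitelist, which keeps "arbitrary requests" off the table even if the key leaks.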
Note this is not a standard way. Imagine if websites out there had the ability to open Notepad or Minesweeper at will when you visit or click something.
The way it is done: you need a service running on the client machine which exposes certain APIs and trusts requests from the web app. This service needs to be running on the client machines all the time, and from your web app you send a request to this service to launch the application you want.
If you have a specific subset of applications that will be run on the client systems (i.e. you are distributing jobs), then you might want to consider Salt, a Python-based distributed RPC which uses a secure protocol and authentication to distribute jobs and deliver results:
http://docs.saltstack.org/en/latest/topics/index.html
If you are looking at automating content generation based on specific updates then you might want to consider Jenkins, which has plugins for various revision control systems and build systems:
https://wiki.jenkins-ci.org/display/JENKINS/Meet+Jenkins
It may not have integration with the particular tools you are using, but if it does then it could be a quicker setup and administration than generic salt automation.
--David