I successfully hosted a Django website on my domain, but I want to keep it published without running the PuTTY session again and again.
I am using an OpenLiteSpeed server. Please tell me which settings I need to configure to keep the site alive all the time.
I tried PuTTY's connection keep-alive setting, but once I close PuTTY or power off the system, the site stops running.
Related
I have a script that I run locally on my laptop, which polls a server and, if certain criteria are met, sends an email. I guess it's like a heartbeat app of sorts.
The server is a Raspberry Pi, it has a public IP address which is all working fine.
I'd like to host the Python script on Heroku, so that it can run without my laptop having to be on the local network or always running.
Does anyone have experience of Heroku to show me how I can have the script hosted and running constantly?
My other query is that the free tier of Heroku goes to sleep after 30 minutes of inactivity, so it would essentially stop the script until an HTTP request spins the instance up again.
I'm trying to find some form of elegant solution.
Many thanks for any advice you can give,
All the best,
Simon
If you have ssh access, use screen to keep your script running.
Install screen:
sudo apt install screen
Open a new session:
screen -S session_name
Now you can run whatever you want, then detach the session by pressing Ctrl+A then D.
To go back to the session at any time, first list the active sessions:
screen -ls
Then resume the session using:
screen -r session_name
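For example, assuming the polling script from the question is saved as poll_server.py (the file name and path here are just placeholders), you can also start it directly in a detached session with a single command:
screen -dmS heartbeat python3 /home/pi/poll_server.py
and later reattach with screen -r heartbeat to check its output.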
Different approaches would be:
1- Cloud functions
Create a Lambda-style function in a cloud provider with your Python code (free-tier eligible)
Trigger that function on a schedule, every once in a while
2- Get a VM in the cloud
Go to AWS, GCP, etc.
Get a free-tier VM
Run the script from there (see the cron sketch below)
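For the free-tier VM route, a minimal sketch (the script path and interval below are placeholders) is to let cron run the polling script on a schedule instead of keeping a terminal open. Edit the crontab with crontab -e and add a line such as:
*/10 * * * * /usr/bin/python3 /home/ubuntu/heartbeat.py >> /home/ubuntu/heartbeat.log 2>&1
This runs the script every 10 minutes and appends its output to a log file, with no SSH session required.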
I manually log in to one of my application GUI consoles (https://localhost:port), hosted on a server on the enterprise internal network, through a jump server. These are the exact steps:
1. Open PuTTY
2. SSH (port 22) to jumpserverhost.domain.com (this is a Linux server)
3. Enter the user and passcode (RSA token)
4. Ensure the PuTTY session window is alive/open
5. Now open a browser and go to the application console - https://localhost:port
NOTE: "We don't use the direct server IP; we use the localhost IP, 127.0.0.1, and the port on which the application GUI is running"
6. Enter the application GUI user/password and do your work in the application
My requirement is to automate the above steps for logging in to my application GUI. This is what I have done:
a. Use a pywinauto script to automate the exact human steps 1-4 above
Please check my answer to "how to pass variable in pywinauto.Application().start(cmd_line='')" for the script I am using for the task.
Even though my pywinauto script is working just fine and logging in to the jump server via PuTTY, I am not able to connect to the application GUI; see image below.
But every time I log in to the jump server manually using steps 1-4 above, the application GUI login page opens on https://127.0.0.1:. See image below.
Why does the behavior change when the steps are run via the automated script?
I have also checked many articles which suggest automating SSH connections through Paramiko. Will it work in this case (considering the SSH connection to the jump server needs to stay alive for as long as I'm connected to the application GUI)?
My knowledge of how jump servers work is very shallow; please help.
So I suspect the problem is that you use PuTTY to create an SSH tunnel connection to the server in question, while your pywinauto command might just open a normal SSH connection. You can check whether there is an SSH tunnel configured under Connection > SSH > Tunnels in the PuTTY configuration for that server.
From what I read in your other question regarding pywinauto, it seems this is indeed the case, as the command you issue there:
putty.exe -ssh uname@host.domain.com
doesn't create a tunnel.
If you want to create an SSH tunnel with the PuTTY command line from pywinauto, you should do something like:
putty.exe -ssh uname@host.domain.com -L <local_port>:<application_host>:<application_port>
where <local_port> is the port you open in your browser on 127.0.0.1, and <application_host>:<application_port> is the application GUI as reachable from the jump server.
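If you end up scripting this, PuTTY's command-line companion plink.exe may be easier to drive than the GUI window; a rough sketch (user, host, and ports are placeholders) would be:
plink.exe -ssh uname@host.domain.com -L <local_port>:<application_host>:<application_port> -N
where -N keeps the connection open for port forwarding without starting an interactive shell.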
I have created a Django application and uploaded it to AWS EC2. I can access the site using the public IP address only while I am running python manage.py runserver from the command line over SSH.
If I close the PuTTY window, I am no longer able to access the site. How can I make sure the site is always available, even if I close the command line / PuTTY?
I tried the WSGI option but it's not working at all. I would appreciate your help with a solution for keeping the Python application running on AWS.
It happens because you are running the app from within the SSH session: ending the session sends a SIGHUP, which kills your application.
There are several ways to keep the app running after you disconnect from SSH. The simplest would be to run it inside a screen session and keep that session running while you disconnect. The advantage of this method is that when you reconnect to the machine you can still control the app, check its state, and potentially see the logs.
Although that might be pretty cool, it's considered a patch. The more stable and solid way would be to create a service that runs the app and allows you to start it, stop it, and look at its logs using the nifty wrappers of systemd.
Keep the process running with screen:
First you'll have to make sure screen is installed (via apt-get or yum, whatever suits your distro).
Run screen.
Run the app just like you did outside screen.
Detach from the screen session by pressing Ctrl+A and then d.
Disconnect from the SSH session and see how the app is still running.
Creating a systemd service is a bit more involved, so try reading through a systemd service manual.
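As a rough sketch of such a service (the user, paths, and unit name below are placeholders, and in production you would normally run Django behind a proper WSGI server rather than manage.py runserver), a unit file saved as /etc/systemd/system/mysite.service could look like this:
# /etc/systemd/system/mysite.service (example name)
[Unit]
Description=My Django site
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/mysite
ExecStart=/usr/bin/python3 manage.py runserver 0.0.0.0:8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
After creating the file, run sudo systemctl daemon-reload, then sudo systemctl enable --now mysite.service, and check its logs with journalctl -u mysite.service.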
I am running a Python script on a Bluehost remote machine, and my script opens a browser window to ask the user for permission to access their personal calendar. This is part of the OAuth2 protocol; in fact, this is for me to give this script access to my own Google Calendar. I am running the script through SSH from my local machine, and I need the browser window to display on my local machine. Is there a way to do this? Has someone with a Bluehost account done this?
thanks
We have implemented a simple DynamoDB database that is updated by a remote IoT device that does not have a user (i.e. root) constantly logged in to the device. We have experienced issues in logging data: the database is not updated if a user (i.e. root) is not logged in to the device (we log in via an SSH session). We are confident that the process is running in the background, as we are using a Linux service that runs on bootup to execute a script. We have verified that the script runs on bootup and successfully pushes data to Dynamo upon user login (via SSH). We have also tried detaching a screen session to allow the device to publish data to Dynamo, but this did not seem to fix the issue. Has anyone else experienced this issue? Does Amazon AWS require a user (i.e. root) to be logged in to the device at all times in order for data to be published to AWS?
No, it does not. I have done a similar setup and it is working fine. Are you sure that your IoT device does not go into some kind of sleep mode after a while?
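One quick thing worth checking (the unit name below is hypothetical) is whether the boot-time service is actually still alive later on, rather than having exited after its first push:
systemctl status iot-publisher.service
journalctl -u iot-publisher.service -f
If the unit shows as inactive or keeps restarting, its logs should reveal why it only publishes while someone is logged in.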