I have created a database with Python, and the results are kept in a text file.
I would like to create a simple HTML page with a search button that looks through my database.
My database would be accessible only on my computer.
My database looks something like this:
Reference | Name | Dimension
XXX AAA 45
So, for example, if I search for a specific reference I would like to get back its name and dimension.
Is this even possible with Python, and where do I start?
I should mention I have basic skills in HTML and Python.
There are multiple ways to solve this:
Install a webserver on your local machine and write a full-fledged Python web application
Write a simple webserver and application in Python using the standard library's http.server module (called BaseHTTPServer in Python 2)
Write an HTML/JavaScript application that doesn't use Python at all and parses the file for itself. Note that due to recently tightened restrictions, this may still require being served by a webserver.
Write a Python application that writes the database to a JavaScript file. While this is inflexible (you need to run it every time you want to update the database), it's also very simple - just encode your data with JSON and write it into a JavaScript file or element, like this:
import json

with open('database.html', 'w') as htmlfile:
    htmlfile.write('<html>...')
    htmlfile.write('<script>var db = ' + json.dumps(database) + ';</script>')
Either way, you're going to have to write HTML and JavaScript.
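As a concrete sketch of the last option (and of the search itself), assuming the sample layout from the question - a "Reference | Name | Dimension" header followed by space-separated data rows - something like this parses the file, answers a lookup, and produces the JSON for the script element. The function names are made up for illustration:

```python
import json

def load_db(lines):
    # First line is the header: "Reference | Name | Dimension".
    header = [h.strip() for h in lines[0].split('|')]
    rows = []
    for line in lines[1:]:
        fields = line.split()  # data rows like "XXX AAA 45" are space-separated
        if fields:
            rows.append(dict(zip(header, fields)))
    return rows

def search(rows, reference):
    # Return every row whose Reference column matches.
    return [r for r in rows if r['Reference'] == reference]

if __name__ == '__main__':
    sample = ['Reference | Name | Dimension', 'XXX AAA 45']
    db = load_db(sample)
    print(search(db, 'XXX'))  # [{'Reference': 'XXX', 'Name': 'AAA', 'Dimension': '45'}]
    # For the JavaScript-file option, embed the whole DB as JSON:
    print('<script>var db = ' + json.dumps(db) + ';</script>')
```

In a real script you would read the lines from your text file with open(...) instead of the inline sample.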
You have a very simple use case, and if you already have a web server installed, then probably the simplest and quickest way is to use a CGI script written in Python. Check out the cgi module: http://docs.python.org/library/cgi.html (note that this module is deprecated since Python 3.11 and removed in 3.13).
Basically you have to handle the GET request to show an HTML form that asks for the query string, and the POST request to process the query and show the results.
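A minimal sketch of that flow, reading the query string the way a CGI script receives it (via the QUERY_STRING environment variable) and parsing it with urllib.parse rather than the deprecated cgi.FieldStorage. The field name "ref" and the lookup are made-up placeholders, and for brevity both the empty and the filled-in query are handled over GET:

```python
import os
from urllib.parse import parse_qs

FORM = ('<form method="GET">'
        '<input name="ref">'
        '<input type="submit" value="Search">'
        '</form>')

def handle_query(query_string):
    params = parse_qs(query_string)
    # No "ref" parameter yet: show the search form.
    if 'ref' not in params:
        return FORM
    # Otherwise run the (placeholder) lookup and show the result.
    ref = params['ref'][0]
    return '<p>Results for reference %s go here.</p>' % ref

if __name__ == '__main__':
    # CGI contract: print headers, a blank line, then the body.
    print('Content-Type: text/html')
    print()
    print(handle_query(os.environ.get('QUERY_STRING', '')))
```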
Related
I need to convert an Excel sheet that is shared between a couple of users, who insert/update rows on it, to a more DB/web-based application. What would be the easiest route to do that?
If I understand properly, what you want to achieve is to switch from an Excel file to a web-based application.
You have to create a SQL (PostgreSQL, MySQL...) database that represents the data structure of the Excel file.
Create a REST API for the web application.
Create your web interfaces for data input using HTML, CSS and JavaScript.
To achieve this, Python has a lot of web frameworks that simplify your work. The following are some of the most popular:
Django https://www.djangoproject.com/start/
Flask https://flask.palletsprojects.com/en/2.0.x/
Pyramid https://trypyramid.com/
Or maybe take a look at this tool, Streamlit https://streamlit.io/
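As a sketch of the first step - mapping the spreadsheet's structure to a SQL database - here is the idea with the stdlib sqlite3 module standing in for PostgreSQL/MySQL. The table and column names are invented for illustration:

```python
import sqlite3

# In-memory DB for illustration; use a file path (or PostgreSQL/MySQL) in practice.
conn = sqlite3.connect(':memory:')
conn.execute("""
    CREATE TABLE items (
        id        INTEGER PRIMARY KEY,
        reference TEXT NOT NULL UNIQUE,  -- one column per spreadsheet column
        name      TEXT,
        dimension INTEGER
    )
""")

# Each spreadsheet row becomes a table row.
conn.execute("INSERT INTO items (reference, name, dimension) VALUES (?, ?, ?)",
             ('XXX', 'AAA', 45))
conn.commit()

row = conn.execute("SELECT name, dimension FROM items WHERE reference = ?",
                   ('XXX',)).fetchone()
print(row)  # ('AAA', 45)
```

Each framework listed above can then expose REST endpoints that read from and write to this table.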
I'm giving myself a project to better learn these languages, which I already know a lot of; it's syncing them together that I need to get better at. This project is a pretty basic "SIM" game: generate some animals into your profile, with login/logout. So far I've got the website aspect done with HTML/CSS and functioning, with all the pages I currently need, all of which is localhost on my desktop. Now I'm moving on to working Python and possibly some PHP aspects into this, to get the login/logout and to generate a new animal into your account.
Everything I've done with Python so far has been done in IDLE. I'm wondering how to link my Python document to my HTML document, like you would CSS? Or is that not possible? If not, how do I connect the two to have Python interact with the HTML/CSS that has been created? I'm guessing I'll need MySQL for a database setup, but I want to see how much I can get as a simple localhost without hosting online.
If you want to set up a localhost with PHP and MySQL I can recommend XAMPP (https://www.apachefriends.org/). In order for your webapp to talk to your Python scripts, you will either need to use Flask or Django to create a Python webserver, or use PHP to run Python scripts. Either way, you will need to make AJAX requests to an API to get this done.
Edit: Forgot to mention this, but you will need JavaScript in order to do this
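On the Python side, the endpoint your AJAX call talks to could look like this minimal Flask sketch (assuming Flask is installed; the route name and response shape are invented for illustration):

```python
import random
from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/api/generate-animal', methods=['POST'])
def generate_animal():
    # In the real app this would also insert a row into the logged-in
    # user's account table (e.g. in MySQL).
    animal = random.choice(['cat', 'fox', 'owl'])
    return jsonify({'animal': animal})

# app.run()  # uncomment to serve at http://127.0.0.1:5000
```

Your page's JavaScript then makes an AJAX POST to /api/generate-animal and reads the JSON response.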
I am in the middle of my personal website development, and I am using Python to create a "comment section" where my visitors can leave comments in public (everybody can see them, so don't worry about user-name registration). I already set up the SQL database to store that data, but the one thing I haven't figured out yet is how to get the user input (their comments) from the browser. So, are there any modules in Python that can do that? (Like the "CharField" things in Django, but unfortunately I don't use Django.)
For that you would need a web framework like Bottle or Flask. Bottle is a simple WSGI-based web framework for Python.
Using either of these you may write simple REST-based APIs, one for "set" and another for "get". The "set" one accepts data from your client side and stores it in your database, whereas your "get" API returns the data by reading it from your DB.
Hope it helps.
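A minimal sketch of those two APIs with Flask (assuming it is installed), with SQLite standing in for the poster's SQL database; the route and field names are invented:

```python
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
db = sqlite3.connect(':memory:', check_same_thread=False)
db.execute('CREATE TABLE comments (body TEXT)')

@app.route('/comments', methods=['POST'])
def set_comment():
    # The "set" API: store the comment the browser sent as form field "body".
    db.execute('INSERT INTO comments (body) VALUES (?)', (request.form['body'],))
    db.commit()
    return '', 201

@app.route('/comments', methods=['GET'])
def get_comments():
    # The "get" API: return every stored comment as JSON.
    rows = db.execute('SELECT body FROM comments').fetchall()
    return jsonify([r[0] for r in rows])

# app.run()  # uncomment to serve at http://127.0.0.1:5000/comments
```

The HTML side just needs a form (or an AJAX call) that POSTs the "body" field to /comments.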
I have a script which scans an email inbox for specific emails. That part's working well and I'm able to acquire the data I'm interested in. I'd now like to take that data and add it to a Django app which will be used to display the information.
I can run the script on a CRON job to periodically grab new information, but how do I then get that data into the Django app?
The Django server is running on a Linux box under Apache / FastCGI if that makes a difference.
[Edit] - in response to Srikar's question "When you are saying 'get that data into the Django app', what exactly do you mean?":
The Django app will be responsible for storing the data in a convenient form so that it can then be displayed via a series of views. So the app will include a model with suitable members to store the incoming data. I'm just unsure how you hook into Django to make new instances of those model objects and tell Django to store them.
I think Celery is what you are looking for.
You can write a custom management command to load data according to your need and run that command through a cron job. You can refer to Writing custom commands.
You can also try the existing loaddata command, but it loads data from fixtures added in your app directory.
I have done the same thing.
Firstly, my script was already parsing the emails and storing them in a DB, so I set the DB up in settings.py and used python manage.py inspectdb to create a model based on that DB.
Then it's just a matter of building a view to display the information from your db.
If your script doesn't already use a db it would be simple to create a model with what information you want stored, and then force your script to write to the tables described by the model.
Forget about this being a Django app for a second. It is just a load of Python code.
What this means is, your Python script is absolutely free to import the database models you have in your Django app and use them as you would in a standard module in your project.
The only difference here is that you may need to take care to set up everything Django needs to work with those models, whereas when a request enters through the normal web interface, that is taken care of for you.
Just import Django and the required models.py/any other modules you need from your app. It is your code, not a black box. You can import it from wherever the hell you want.
EDIT: The link from Rohan's answer to the Django docs for custom management commands is definitely the least painful way to do what I said above.
When you are saying "get that data into the Django app", what exactly do you mean?
I am guessing that you are using some sort of database (like MySQL). Insert whatever data your cron job has collected into the same tables that your Django app is accessing. That way your changes are immediately reflected to users of the app, since they will be reading from the same tables.
Best way?
Make a view on the Django side to handle receiving the data, and have your script do an HTTP POST to a URL registered to that view.
You could also import the model and such from inside your script, but I don't think that's a very good idea.
Have your script send an HTTP POST request like so, using the Requests library:
>>> import requests
>>> files = {'report.xls': open('report.xls', 'rb')}
>>> r = requests.post(url, files=files)
>>> r.text
Then on the receiving side of the POST request you can use web.py to process the info: import web and write a class with a method that handles the POST, for example
class Receiver:
    def POST(self):
        x = web.input()
        # then do whatever you want with x
If you don't want to use HTTP to send messages back and forth, you could just have the script write the email info to a .txt file and then have your Django app open and read the file.
EDIT:
You could set your cron job to read the e-mails at, say, 8AM, then write them to a text file info.txt. Then in your code write something like
import time

if time.strftime("%H") == "09":  # %H is zero-padded, so 9AM is "09", not "9"
    with open("info.txt") as f:
        info = f.read()
That will check the file any time from 9AM until 10AM. If you want it to only check once, add the minutes to the if statement as well.
I am really new to Python; I've just played around with the Scrapy framework, which is used to crawl websites and extract data.
My question is: how do I pass parameters to a Python script that is hosted somewhere online?
E.g. I make the following request: mysite.net/rest/index.py
Now I want to pass some parameters, similar to PHP's *.php?id=...
Yes, that would work, although you would need to write handlers for extracting the URL parameters in index.py. Try the cgi module for this in Python (note that it is deprecated since Python 3.11; urllib.parse can do the same job).
Please note that there are several robust Python-based web frameworks available (e.g. Django, Pylons) which automatically parse the URL and form a dictionary of all its parameters; they also do much more, like session management and user authentication. I would highly recommend you use one of them for faster code turnaround and fewer maintenance hassles.
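As a sketch of what such a handler has to do, here is a minimal stdlib-only WSGI app that pulls id out of the query string with urllib.parse.parse_qs (the cgi module's FieldStorage offers the same, but parse_qs is the non-deprecated route):

```python
from urllib.parse import parse_qs
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # The part after "?" arrives in QUERY_STRING, e.g. "id=42&x=1".
    params = parse_qs(environ.get('QUERY_STRING', ''))
    body = 'id = %s' % params.get('id', ['<missing>'])[0]
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [body.encode()]

# make_server('', 8000, app).serve_forever()  # uncomment to serve ...?id=42
```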