How to build an interactive search engine web interface using Python

I have built a static web interface for searching data from some tables in my PostgreSQL database. The query page consists of a simple text field for entering the search term, and the results page presents the results as a simple HTML table. The server-side code for searching the PostgreSQL database and returning the results is written in Python using psycopg2.
Now I would like to add some interactive "Ajax features" to my search engine. While entering the search term, I would like to see a list of suggested search terms, as Google does. On the results page, I would like to be able to sort the table showing the results.
What would be the easiest/recommended way to implement these features for my search engine web site?

I have not had to build search outside of Django, but Haystack (http://haystacksearch.org/) makes things very easy.
If you don't want to get into Django, you could look at Whoosh: http://bitbucket.org/mchaput/whoosh/wiki/Home

what you call "Ajax features" are technically known as auto-suggest. Unless you want to reinvent the wheel. I would highly recommend indexing your db tables using Apache Solr. It comes with autosuggest, faceted filtering (like on most ecommerce sites) and spell-check. and since it is HTTP based you can hook into Python very easily using its RESTful API.

Related

Full text MySQL database search in Django

We have been using a MySQL database for our project and Django as the backend framework. We want to support full-text search on a particular table and return a QuerySet in Django. We know that Django supports full-text search on a PostgreSQL database, but we can't move to another database now.
From what we have gathered so far:
Using the built-in search functionality - here we check each field for the value and then take an OR to combine the results, similar to the linked question (Django Search query within multiple fields in same data table); a minimal sketch of this approach appears below.
This approach, however straightforward, may be inefficient for us because we have huge amounts of data.
Using a library or package - from what we have read, Django Haystack is something a lot of people recommend for full-text search.
Django Haystack - https://django-haystack.readthedocs.io/en/master/tutorial.html#installation
We haven't evaluated the library completely yet because we are trying to avoid using any library for this purpose. Let us know if you have worked with it and have any views.
Any help is appreciated. Thanks.
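For illustration, here is a minimal sketch of the first approach (ORing Q objects across fields). The Article model and its field names are hypothetical placeholders, not part of the question.

# Minimal sketch of the "check every field and OR the results" approach.
from functools import reduce
from operator import or_

from django.db.models import Q

from myapp.models import Article  # hypothetical model


def search_articles(term):
    """Return a QuerySet of Articles matching the term in any searched field."""
    fields = ["title", "body", "author_name"]  # placeholder field names
    queries = [Q(**{f"{field}__icontains": term}) for field in fields]
    return Article.objects.filter(reduce(or_, queries))

Note that icontains translates to a LIKE '%term%' query, which MySQL cannot index efficiently; that is exactly the scaling concern raised above.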

How to structure a web scraper project?

I have a project to collect posts from several second-hand vehicle websites using BeautifulSoup and then store them in a database. My client also requested that this functionality be built on top of a content management system he is familiar or semi-familiar with, such as WordPress.
Can this be done with WordPress without making a big mess of it? If not, how would you suggest structuring the project, and which CMS would you use?
WordPress seems to support only MySQL and MariaDB, according to their site: https://codex.wordpress.org/Using_Alternative_Databases. Those appear to be your only database options if you want to maintain WordPress support.
From there, it comes down to whatever is easiest for your Python code to access, to be honest.
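As a rough sketch of the Python side, assuming the scraper writes posts straight into the same MySQL server that WordPress uses: the table name, column names, and CSS selectors below are hypothetical placeholders that you would adapt to the real sites and schema.

# Minimal sketch: scrape listings with BeautifulSoup and store them in MySQL.
import requests
from bs4 import BeautifulSoup
import pymysql


def scrape_listings(url):
    """Yield (title, price, link) tuples scraped from one listings page."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for item in soup.select("div.listing"):  # selector is site-specific
        title = item.select_one("h2").get_text(strip=True)
        price = item.select_one(".price").get_text(strip=True)
        link = item.select_one("a")["href"]
        yield title, price, link


def store_listings(rows):
    """Insert scraped rows into a MySQL table (names are placeholders)."""
    conn = pymysql.connect(host="localhost", user="scraper",
                           password="secret", database="vehicles")
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO vehicle_posts (title, price, link) VALUES (%s, %s, %s)",
                list(rows),
            )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    store_listings(scrape_listings("https://example.com/used-cars"))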

Creating a sortable table (with React.js) that was produced using Flask and Bootstrap

I've got a table that I've created using Flask by getting data from an API and rendering it with Bootstrap on the front end. I want to make the headers clickable in order to sort the table. I've heard that React.js might be a good option for this; is there any way for me to use React directly with my table without rewriting the entire app in JavaScript?
Possibly! If you can install React, then you should be able to use a library such as react-collapsing-table. You would need to install it with npm, then you should be able to import/require it on your page and do something like
<ReactCollapsingTable rows={data} columns={columns} />
Hope that helps :)
I implemented a react-bootstrap-table2 frontend with a Flask backend here: http://thomaxxl.pythonanywhere.com/ja/index.html#/books (no sorting is implemented there, but it is possible using the appropriate react-bootstrap-table syntax: https://react-bootstrap-table.github.io/react-bootstrap-table2/docs/basic-sort.html).
You can implement sorting in the backend or the frontend.
To implement it in the backend, your API must support sorting parameters (e.g. in the query string: ?sort=title,id).
To implement sorting in the frontend, you must fetch all of your data, which is not feasible for large tables.
Other things you may want to consider are pagination, filtering and search.
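For what it's worth, here is a minimal sketch of backend sorting in Flask under the ?sort= convention mentioned above. The route, example data, and field names are made up for illustration.

# Minimal sketch: /api/books?sort=title,-id returns JSON sorted accordingly.
from flask import Flask, jsonify, request

app = Flask(__name__)

BOOKS = [  # stand-in for rows fetched from your real API or database
    {"id": 1, "title": "Flask Web Development"},
    {"id": 2, "title": "Automate the Boring Stuff"},
    {"id": 3, "title": "Fluent Python"},
]


@app.route("/api/books")
def books():
    rows = list(BOOKS)
    sort = request.args.get("sort", "")
    # Apply sort keys right-to-left so the first key has the highest priority
    # (Python's sort is stable, so earlier keys win ties).
    for key in reversed([k for k in sort.split(",") if k]):
        reverse = key.startswith("-")
        field = key.lstrip("-")
        if rows and field in rows[0]:  # silently ignore unknown columns
            rows.sort(key=lambda r: r[field], reverse=reverse)
    return jsonify(rows)


if __name__ == "__main__":
    app.run(debug=True)

A frontend table component can then pass the clicked column name as the sort parameter when it re-fetches the data.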

What is the best approach for managing static information for a site, while implementing the Search API across it?

Recently, Google created a new Search API that you can integrate into your Google App Engine application for searching documents and information within your site. Cool!
I have a site that has quite a few Django resources that contain a significant amount of static information. I would like to integrate this information into a site-wide search engine using the new Search API.
For someone with an existing site and numerous text resources used for content, what is the best way of integrating the static information (from flat HTML files) into the site's Search API index? Bonus question: what is the best way to manage this content so that, as I add additional pages to the site, they are integrated into the search index as well?
The Search API requires you to add documents to the search backend in order for them to be searchable. For your static resources, this means you have to crawl them and add them to the search backend using the Search API.
You probably want to do this after every upload. Maybe the easiest way is to have a cron job that traverses your files and checks their timestamps: if a file is newer than the last traversal (or has never been traversed), add it to or update it in the search backend.
Instead of a cron job, you could also define a handler that triggers the traversal, which you hit after you deploy a new app version.
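A minimal sketch of the indexing step, using the App Engine Search API from Python; the index name "site-pages" and the field layout are assumptions, not anything prescribed above.

# Minimal sketch: add one static HTML file to an App Engine search index.
from google.appengine.api import search

INDEX = search.Index(name="site-pages")  # index name is a placeholder


def index_html_file(path):
    """Create or update the search document for one static HTML file."""
    with open(path) as f:
        content = f.read()
    document = search.Document(
        doc_id=path,  # reusing the same doc_id updates the existing document
        fields=[
            search.TextField(name="path", value=path),
            search.HTMLField(name="content", value=content),
        ],
    )
    INDEX.put(document)


def search_site(query):
    """Return matching documents for a user query string."""
    return INDEX.search(query)

The cron job or post-deploy handler described above would simply call index_html_file() for each file whose timestamp has changed.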

Need help designing a phone book application in Python running on Google App Engine

Hi, I want some help building a phone book application in Python and putting it on Google App Engine. I have a huge database of 2 million users and their phone book contacts. I want to upload all that data from my servers directly onto the Google servers and then use a UI to retrieve each user's phone book contacts based on their name.
I am using MS SQL Server 2005 as my DB.
Please help in putting together this application.
Your inputs are much appreciated.
For building your UI, App Engine has its own web framework called webapp that is pretty easy to get working. I've also had a good experience using the Jinja2 templating engine, which you can include in your source or package as a zip file (the example shows Django, but you can do the same kind of thing for Jinja).
As for loading all of your data into the DataStore, you should take a look at the bulk uploader documentation.
I think you're going to need to be more specific as to what problem you're having. As far as bulk loading goes, there's lots of bulkloader documentation around; or are you asking about model design? If so, we need to know more about how you plan to search for users. Do you need partial string matches? Sorting? Fuzzy matching?
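To make the model-design question concrete, here is one minimal sketch using ndb. The property names and the exact-match lookup by owner name are assumptions about the data, not a recommendation from the answer above.

# Minimal sketch of a possible datastore model for the phone book.
from google.appengine.ext import ndb


class Contact(ndb.Model):
    owner_name = ndb.StringProperty(required=True)    # the user this entry belongs to
    contact_name = ndb.StringProperty(required=True)
    phone_number = ndb.StringProperty()


def contacts_for_user(owner_name):
    """Return all phone book entries for a user, ordered by contact name."""
    # An equality filter combined with an order on another property needs
    # a composite index entry in index.yaml.
    query = Contact.query(Contact.owner_name == owner_name).order(Contact.contact_name)
    return query.fetch()

Whether an exact-match lookup like this is enough depends on the answers to the questions above (partial matches, sorting, fuzzy matching).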
