I got a school project to write a Python program for a hotel management system. One part of it requires printing a cancellation slip in the following format.
Refer to format 3
Please tell me how to produce a format like this. I tried to make a table using the prettytable module but didn't get the desired results. I am new to Python, so please suggest an easy way; specifically, I want to know how to build a table by hand in a short way, with columns and rows merged as in the image (for example, merging two cells of a table). The table doesn't need to be fancy; I just want the strings aligned properly so it looks neat and clean.
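In case it helps, here is a minimal sketch of a hand-built slip using nothing but string padding (ljust/center); the widths and field names are made up, not from the assignment. A row printed at full width acts like two merged cells:

    # A minimal sketch with made-up slip fields; adjust WIDTH to taste.
    WIDTH = 40
    border = "+" + "-" * (WIDTH - 2) + "+"

    def full_row(text):
        # one full-width cell -- this gives the "merged" look
        return "|" + text.ljust(WIDTH - 2) + "|"

    def split_row(left, right):
        # two cells side by side
        half = (WIDTH - 3) // 2
        return "|" + left.ljust(half) + "|" + right.ljust(WIDTH - 3 - half) + "|"

    print(border)
    print(full_row("CANCELLATION SLIP".center(WIDTH - 2)))
    print(border)
    print(split_row(" Name", " A. Guest"))
    print(split_row(" Room no.", " 101"))
    print(border)
    print(full_row(" Refund due: 1500"))
    print(border)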
I plan to make an educational web game. I have thousands of trivia questions that I need to write down in a way that can easily be transferred out and automatically organized by column at a later date.
I was advised to use Google Sheets so I could later export a .csv, which should be easy for a developer to work with. But when I exported a .csv and opened it with pandas in Python, a column was cut off and one row was used as a 'header' rather than treated as a normal entry: https://imgur.com/a/olcpVO8. This obviously won't work and seems to be an issue.
Should I just leave the first row and column empty and work around the issue? I don't want to write thousands of sets only to find out I did it the wrong way. Can anyone give any insight into whether this is my best option and how I should best format the sheet?
I have to write Questions (1), Answers (4), and Explanations (1) per entry.
I hope this makes sense, thanks for your time.
I tried this and had no issue at all using the exported CSV from Google Sheets, with the same data as in your example.
In my opinion, whatever software you're using in your second screenshot is the issue; it seems like it's removing numbers from the first row because it assumes that row should be your header row. Check around in your software for options like "First column contains headers" or "Use row 1 as Header" and make sure these aren't enabled.
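For what it's worth, a short hedged sketch of the pandas side, assuming a hypothetical trivia.csv whose first line is data rather than headers (the column names are illustrative):

    import pandas as pd

    # By default pandas promotes the first CSV row to the header,
    # which makes it look like a row has been "removed":
    df = pd.read_csv("trivia.csv")

    # Tell pandas there is no header row and supply names yourself:
    cols = ["question", "answer1", "answer2", "answer3", "answer4", "explanation"]
    df = pd.read_csv("trivia.csv", header=None, names=cols)
    print(df.head())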
How do I get 4 million rows and 28 columns from Python to Tableau in a table form?
I assume (based on searching) that I should use a JSON format. This format can handle a lot of data and is fast enough.
I have made a subset of 12 rows of the data and tried to get it working. The good news: it works. The bad news: not the way I want it to.
My issue is that when I import it into Tableau it doesn't look like a table. I have tried the variants which are displayed here.
This is the statement in Python (pandas):
    jsonfile = pbg.to_json("//vsv1f40/Pricing_Management$/Z-DataScience/01_Requests/Marketing/Campaign_Dashboard/Bronbestanden/pbg.json", orient='values')
Maybe I select too many schemas in Tableau (I select them all), but I think my problem is on the Python side. Do I need to use another library instead of pandas, or do I need to change the arguments?
Other ways are also welcome. I have no preference for JSON, but I thought that was the best way, based on the search results.
Note: I am new to Python and Tableau. :) I use Python 3.5.2 and work in Jupyter. For Tableau I only have the free trial desktop version.
JSON is good for certain types of data, but if your DataFrame is purely tabular (no MultiIndexes, complex objects, etc.) and contains simple data types (strings, digits, floats), then a comma-separated value (CSV) text file is probably the best format to use, as it would take up the least space. A DataFrame can easily be saved as a CSV using the to_csv() method, and there are a number of customization options available. I'm not terribly familiar with Tableau, but according to their website CSV files are a supported input format.
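A minimal sketch, using a tiny stand-in DataFrame in place of the 4-million-row one from the question:

    import pandas as pd

    # Illustrative stand-in for the real DataFrame (called pbg in the question).
    pbg = pd.DataFrame({"campaign": ["A", "B"], "clicks": [10, 20]})

    # index=False keeps pandas' row numbers out of the file.
    pbg.to_csv("pbg.csv", index=False)
    # In Tableau, connect via "Text file" and point it at pbg.csv.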
I am trying to understand how enthought.traits and enthought.traitsui work, and so far I have found them very easy to work with. I have also looked at the example https://svn.enthought.com/enthought/browser/Traits/trunk/enthought/traits/ui/demo/Advanced/Tabular_editor_demo.py?rev=17881
That example shows how to put data into a TabularEditor, but the column names have to be spelled out in the Adapter class. If I have to load all my data, with loads of columns and rows, from a spreadsheet into the table, how would I go about it? Is there a demo file that I have missed?
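One possibility (a hedged sketch, not from an official demo, and using the modern traits/traitsui package names) is to build the adapter's columns list at runtime from the spreadsheet's header row instead of hard-coding it in an Adapter subclass; the data here is made up:

    from traits.api import HasTraits, List
    from traitsui.api import Item, TabularEditor, View
    from traitsui.tabular_adapter import TabularAdapter

    # Pretend these came from the first row / remaining rows of a spreadsheet.
    header = ["Name", "Age", "City"]
    rows = [("Alice", 30, "Oslo"), ("Bob", 25, "Bergen")]

    # Build (label, column-index) pairs dynamically instead of hard-coding them.
    adapter = TabularAdapter(columns=[(name, i) for i, name in enumerate(header)])

    class SpreadsheetView(HasTraits):
        data = List
        traits_view = View(
            Item("data", editor=TabularEditor(adapter=adapter), show_label=False),
            width=400, height=300, resizable=True,
        )

    SpreadsheetView(data=rows).configure_traits()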
I am attempting to add an Excel-like filtering option to a program that processes 3 XML files and, optionally, an XLS file. One of the problems I am running into is finding good examples of applying multiple filters at once. I understand that Filter.Chain allows multiple "pre-built" filters to be applied at once; the main trouble is dynamically creating the filters and then applying them. For example, one of the things the program looks at is vehicle makes, such as Honda, Ford, etc., and I would like to be able to select which makes I want to see in the ObjectListView.
Due to the amount of code in use, it is hosted on Pastebin:
Main Gui
Worker Script
Input Panel
Primary Output
Secondary Output
What I am hoping to find are examples of how to add Excel-like filtering. Currently I have a menu option that opens a MultiChoiceDialog window providing a list of options, but I am unable to find a good Pythonic way of taking the selections and applying them as a filter or series of filters.
Thanks in advance.
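(For reference, a hedged sketch of how the dialog selections could be turned into a dynamic filter with ObjectListView's own Filter module; the vehicle attribute name is illustrative:)

    from ObjectListView import Filter

    def apply_make_filter(olv, selections):
        # olv        -- the ObjectListView showing the vehicles
        # selections -- the strings returned by the MultiChoiceDialog
        wanted = set(selections)
        make_filter = Filter.Predicate(lambda vehicle: vehicle.make in wanted)
        # Filter.Chain lets several dynamically built filters stack here.
        olv.SetFilter(Filter.Chain(make_filter))
        olv.RepopulateList()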
It appears that you are reading an Excel file into your ObjectListView widget. I think it would be easier to load the data into a SQLite database and then use SQL commands to do your filtering. I prefer using SQLAlchemy; that way I can create a class that represents the data and use it both for SQLAlchemy and for my ObjectListView widget.
You can read about some of this sort of thing in the following articles:
http://www.blog.pythonlibrary.org/2012/06/04/wxpython-and-sqlalchemy-loading-random-sqlite-databases-for-viewing/
http://www.blog.pythonlibrary.org/2011/11/10/wxpython-and-sqlalchemy-an-intro-to-mvc-and-crud/
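A hedged sketch of that approach; the Vehicle model and its columns are made up for illustration:

    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker

    Base = declarative_base()

    class Vehicle(Base):
        __tablename__ = "vehicles"
        id = Column(Integer, primary_key=True)
        make = Column(String)
        model = Column(String)

    engine = create_engine("sqlite:///vehicles.db")
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    # The MultiChoiceDialog selections become a simple IN clause:
    selected_makes = ["Honda", "Ford"]
    rows = session.query(Vehicle).filter(Vehicle.make.in_(selected_makes)).all()
    # 'rows' can then be handed to the ObjectListView via SetObjects().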
I have this problem: I need to scrape lots of different HTML data sources, and each data source contains a table with lots of rows, for example country name, phone number, and price per minute.
I would like to build a semi-automatic scraper which will try to:

1. find the right table in the HTML page automatically -- probably by searching the text for some sample data and trying to find the common HTML element that contains both;
2. extract the rows -- by looking at the two elements above and selecting the same pattern;
3. identify which column contains what -- by using some fuzzy algorithm to best-guess which column is which;
4. export it to some Python / other list -- cleaning everything.
Does this look like a good design? What tools would you choose to do it with if you program in Python?
Does this look like a good design?

No.

What tools would you choose to do it with if you program in Python?

Beautiful Soup.
find the right table in the HTML page automatically -- probably by searching the text for some sample data and trying to find the common HTML element that contains both

Bad idea. A better idea is to write a short script that finds all tables and dumps each table along with the XPath to it. A person then looks at the tables and copies the right XPath into a script.
extract the rows -- by looking at the two elements above and selecting the same pattern

Bad idea. A better idea is to write a short script that finds all tables and dumps each table with its headings. A person then looks at the table and configures a short block of Python code that maps the table columns to data elements in a namedtuple.
identify which column contains what -- by using some fuzzy algorithm to best-guess which column is which
A person can do this trivially.
export it to some Python / other list -- cleaning everything
Almost a good idea.
A person picks the right XPath to the table. A person writes a short snippet of code to map the column names to a namedtuple. Given these parameters, a Python script can then fetch the table, map the data, and produce useful output.
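A hedged sketch of that configured step, using lxml; the XPath, file name, and field names are placeholders a person would fill in:

    from collections import namedtuple
    import lxml.html

    Rate = namedtuple("Rate", ["country", "phone_code", "price_per_minute"])

    TABLE_XPATH = "//div[@id='rates']/table"   # copied by a person from the profile
    COLUMN_MAP = (0, 1, 3)                     # table column -> namedtuple field

    tree = lxml.html.parse("page.html")
    table = tree.xpath(TABLE_XPATH)[0]

    rows = []
    for tr in table.xpath(".//tr")[1:]:        # skip the heading row
        cells = [td.text_content().strip() for td in tr.xpath(".//td")]
        rows.append(Rate(*(cells[i] for i in COLUMN_MAP)))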
Why include a person?
Because web pages are filled with notoriously bad errors.
After having spent the last three years doing this, I'm pretty sure that fuzzy logic and magical "trying to find" and "selecting the same pattern" aren't a good idea and don't work.
It's easier to write a simple script to create a "data profile" of the page.
It's easier to write a simple script that reads a configuration file and does the processing.
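A minimal sketch of such a profile script; it uses lxml rather than Beautiful Soup because lxml can report each element's XPath directly:

    import lxml.html

    tree = lxml.html.parse("page.html")        # the page being profiled
    for table in tree.iter("table"):
        print(tree.getpath(table))             # XPath a person can copy into the config
        headings = [th.text_content().strip() for th in table.iter("th")]
        print("  headings:", headings)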
I cannot see a better solution.
It is convenient to use XPath to find the right table.