I use Kodi with add-ons such as Exodus, which let me load episodes of my favourite TV programmes, and I also have a subscription to Horse & Country to watch equine-based shows, but there is no Horse & Country add-on that would let me watch these on Kodi.
Is it possible for me to code an add-on for Kodi (presumably in Python?) that would obtain all of the available video links from www.horseandcountry.tv (catch-up and on-demand) using my log-in details and list them for me to watch?
I have a fair bit of experience coding (mostly Java, and a little Python), but have never written a Kodi add-on or done anything that scrapes video links from websites. I'm a first-year computer science student, so I have a bit of understanding but not much experience!
First off, is what I am looking to do realistic and possible? Secondly, if so, would someone be able to give a very brief overview of how I would go about it and the necessary principles involved?
You're going to write a Python add-on; in this case a plugin is probably the best fit.
Have a look at https://github.com/Razzeee/generator-kodi for a good quickstart/boilerplate. (Let me know if you spot something wrong)
You might also want to use BeautifulSoup to grab the links in Python.
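To give a rough feel for the plugin side, here is a minimal listing sketch (the example title, stream URL, and the separate scraping step are placeholders, not the real Horse & Country endpoints):

    # addon.py -- minimal Kodi video plugin sketch
    import sys

    import xbmcgui
    import xbmcplugin

    ADDON_HANDLE = int(sys.argv[1])  # handle Kodi passes to the plugin

    xbmcplugin.setContent(ADDON_HANDLE, 'episodes')


    def list_videos(videos):
        """Add one playable list item per (title, stream_url) pair."""
        for title, stream_url in videos:
            item = xbmcgui.ListItem(label=title)
            item.setProperty('IsPlayable', 'true')
            xbmcplugin.addDirectoryItem(handle=ADDON_HANDLE, url=stream_url,
                                        listitem=item, isFolder=False)
        xbmcplugin.endOfDirectory(ADDON_HANDLE)


    # In a real add-on these pairs would come from your scraping code,
    # e.g. a requests session that logs in with your credentials plus BeautifulSoup.
    list_videos([('Example episode', 'http://example.com/stream.m3u8')])

The scraping part (logging in and finding the stream URLs) lives entirely outside the Kodi API, so you can prototype it as a plain Python script first and only wrap it in the plugin once it works.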
First of all, I want to apologize if my question is too broad or generic, but an answer pointing me in the right direction would really save me a lot of needlessly wasted time. With that out of the way, here goes.
I am trying to retrieve some publicly available data from a website, to create a dataset to work with for a Data Science project. My big issue is that the website does not offer a friendly way to download the data and, from what I gathered, it also has no API. So getting the data requires skills that I do not possess. I would love to learn how to scrape the website (the languages I am most comfortable with are Python and R), and it would add some value to my project if I did it, but I am also somewhat pressured by time constraints, and I don't know whether it is even possible to scrape the website, much less learn how to do it in a few days.
The website is this one: https://www.rnec.pt/pt_PT/pesquisa-de-estudos-clinicos. It has a search box, and the only option I configure is to click the banner that says "Pesquisa Avançada" and then tick the box that says "Menor de 18 anos". I then click the "Pesquisar" button in the lower right, and the results that show up are the ones I want to extract (either that or, if it's simpler, all the results, without ticking the "Menor de 18 anos" box). On my computer, 2 results show up per page, and there are 38 pages in total. Each result shows some of its details on the results page but, to get the full data for each entry, one has to click "Detalhes" in the lower right of each result, which opens a display with all the data for that result. If possible, I would love to download all the data from that "Detalhes" view for each result (the data there already contains the fields that show up on the search results page).
Honestly, I am ready to spend a whole day manually transcribing all the data, but it would be much better to do it computationally, even if it takes me two or three days to learn how.
I think that, for someone with experience in web scraping, it is probably very simple to check whether it is possible to download the data I described, and what the best way to go about it is (in general terms; I would research and learn the details). But I really am lost when it comes to this, and I just want to kindly ask for some help in showing me the way to go about it (even if the answer is "it is too complicated/impossible, just do it manually"). I know that there are some Python packages for web scraping, like BeautifulSoup and Selenium, but I don't really know whether either of them would be appropriate.
I am sorry if my request is not exactly a short and simple coding question, but I have to try to gather any help or guidance I can get. Thank you in advance to everyone who reads my question and a special thank you if you are able to give me some pointers.
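From the little I have read, the clicking-and-paging workflow I described above seems to translate into something like the Selenium sketch below. I am only guessing at the element locators and the "next page" label; they are illustrative, not taken from the real page. Is this the right general direction?

    # Rough Selenium sketch of the workflow described above.
    # All locators ("Pesquisa Avançada", "Detalhes", "Seguinte", the XPaths)
    # are guesses for illustration, not taken from the actual page.
    import time

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    driver.get("https://www.rnec.pt/pt_PT/pesquisa-de-estudos-clinicos")
    time.sleep(3)

    driver.find_element(By.LINK_TEXT, "Pesquisa Avançada").click()
    driver.find_element(By.XPATH, "//label[contains(., 'Menor de 18 anos')]").click()
    driver.find_element(By.XPATH, "//button[contains(., 'Pesquisar')]").click()
    time.sleep(3)

    pages = []
    for page in range(38):                       # 38 result pages, 2 results each
        details = driver.find_elements(By.LINK_TEXT, "Detalhes")
        for i in range(len(details)):
            # Re-locate on each pass: navigating away invalidates old references.
            driver.find_elements(By.LINK_TEXT, "Detalhes")[i].click()
            time.sleep(1)
            pages.append(driver.page_source)     # parse each detail view later
            driver.back()                        # if "Detalhes" opens an overlay, close it instead
            time.sleep(1)
        driver.find_element(By.LINK_TEXT, "Seguinte").click()   # guessed "next page" label
        time.sleep(2)

    driver.quit()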
I am using a simple smart card reader and want to retrieve information that is stored on an EMV chip using that same reader. I found a library to do so, called pyscard, which uses Python.
At the moment, I have managed to use the documentation and some command codes to get the ATR of my card, including the applet (AID) codes, but have not been able to retrieve any of the data that actually matters, such as the cardholder name, PAN, expiration date, card type (Visa, MasterCard, ...), etc.
Is the ATR or AID useful here at all? Do I have to process or analyse the ATR and AID to derive the command codes?
Is it even possible to get this data? I cannot find the command codes anywhere, only a lot of Stack Overflow posts that lead mostly nowhere, with thorough explanations of things that are already covered by Wikipedia and the general documentation.
The pyscard documentation also did not seem to provide such information.
Also, is it possible to access a history of purchases the user has made, for instance as a list (4.5 euro, 22 euro, 5 euro, ...), or is such data not even logged on the chip?
The closest source I found was this link: https://iso8583.info/lib/EMV/TLVs, with a similar question here: Retrieve smart card's PAN with Python and pyscard, and some documentation here: https://www.openscdp.org/scripts/tutorial/emv/reademv.html
Here is also the documentation for pyscard: https://pyscard.sourceforge.io/index.html
Thanks for answering rather than downvoting; many similar questions I have found have received downvotes.
SCSH (Smart Card Shell), provided by CardContact, can be a useful tool for you. It has some predefined scripts to read EMV cards.
The download link is EMV Credit Card Application.
RFIDIOt is a great library to use for this:
https://github.com/AdamLaurie/RFIDIOt
It comes with many examples, including ChAP.py, which has the code you need to talk to EMV cards.
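If you want to stay with pyscard, the general flow ChAP.py follows is: SELECT a payment application by its AID, then issue GET PROCESSING OPTIONS and READ RECORD commands and parse the returned TLV data for the PAN, name and expiry. A bare-bones pyscard sketch of the first step (the AID below is the well-known Visa one; other schemes use different AIDs, and the TLV parsing is left out):

    # Minimal pyscard sketch: SELECT a payment application by AID.
    from smartcard.System import readers
    from smartcard.util import toHexString

    VISA_AID = [0xA0, 0x00, 0x00, 0x00, 0x03, 0x10, 0x10]

    reader = readers()[0]
    connection = reader.createConnection()
    connection.connect()

    # SELECT: CLA=00, INS=A4, P1=04 ("select by name"), P2=00, Lc, AID, Le
    select_apdu = [0x00, 0xA4, 0x04, 0x00, len(VISA_AID)] + VISA_AID + [0x00]
    response, sw1, sw2 = connection.transmit(select_apdu)

    print("SW: %02X %02X" % (sw1, sw2))              # 90 00 means success
    print("FCI template: " + toHexString(response))

As for the purchase history: some issuers store a small transaction log on the chip and some do not, so whether you can read one at all is card-specific.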
Question
I am interested in web scraping and data analysis, and I would like to develop my skills by writing a program in Python 2.7 that monitors changes in stock prices. My goal is to compare two stocks (for the time being) at certain points throughout the day and save that info in a format easily handled by pandas (which I will learn how to use after I get this front end working). In the end I would like to map relationship trends between the chosen stocks (when one goes up by x, what effect does that have on the other?). This is just a hobby project, so it doesn't matter if the code is production quality.
My Experience
I am a brand-new Python programmer (I have a very basic understanding of Python and no real experience with modules outside the standard library), but I do have a technical background, so if the answer to my question requires reading and understanding documentation intended for intermediate-level programmers, that should be OK.
For the basics, I am working my way through Learning Python: Powerful Object-Oriented Programming by Mark Lutz, if that helps any.
What I'm Looking For
I recognize this is a very broad subject and I am not asking for anyone to write any actual code examples or anything. I just want some direction as to where to go to get information more specific to my interests and goals.
This is actually my first post on this forum so please forgive me if this doesn't follow best practices for posting. I did search for other questions like mine and read the posting tips docs prior to writing this.
So, you want to web-scrape? If you're using Python 2.7, you'll want to look into the urllib2, requests, and BeautifulSoup libraries. If you're using Python 3.x, look at urllib.request (the Python 3 replacement for urllib2), requests, and, again, BeautifulSoup. Together, these libraries should accomplish everything you're looking to do in terms of web scraping.
If you're interested in scraping stock data, might I suggest the yahoo_finance package? It is a Python wrapper for the Yahoo Finance API. Whenever I've done things with stock data in the past, this module was invaluable. There's also googlefinance. It's much easier to use these already-developed wrappers to extract stock info than to scrape hundreds (if not thousands) of web pages to get the data you want.
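To give a feel for how little code the wrapper needs, here is a minimal sketch using yahoo_finance's Share class (the tickers are just examples; comparing more stocks is only a matter of adding more Share objects):

    # Quick sketch with the yahoo_finance wrapper (runs on Python 2.7).
    from yahoo_finance import Share

    aapl = Share('AAPL')   # ticker symbols here are just examples
    msft = Share('MSFT')

    print("AAPL %s | MSFT %s" % (aapl.get_price(), msft.get_price()))

    # Later in the day, refresh the cached quotes and compare again.
    aapl.refresh()
    msft.refresh()
    print("AAPL %s at %s" % (aapl.get_price(), aapl.get_trade_datetime()))

From there it is a short step to appending each snapshot to a CSV or a pandas DataFrame and looking at the relationships over time.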
I'm attempting to pull physical property information (dimensions and resistance values, in particular) from an architectural model (Autodesk Revit) and organize that information to be exported as specific variables.
To expand slightly: for an independent study I want to perform energy balances on Revit models, starting simple and building from there. The goal is to write code that collects information from a Revit model and then organizes it into variables such as "Total Wall Area", "Insulation Resistance", "Drywall Depth", "Total Window Area", etc., which could then be sent to a model (or simply a spreadsheet) and stored as such.
I hope that makes some sense.
Given that I am a novice coder and would prefer to write in Python, does anyone have any advice or resources concerning an efficient (simple) path to go about importing and organizing specific parameters from a Revit model?
Is it necessary (or realistically necessary, given the humble extent of my knowledge) to use the API for this program (Revit) to accomplish this task?
I imagine this task is similar to web scraping, yet I have no HTML to request and search through, so I am happily winging my way along, asking folks far more knowledgeable than I am whether they have any insight.
A brief background: I have next to no knowledge of Revit or APIs in general, basic knowledge of coding in Python, and I really want to learn more!
Any help you are able to give is absolutely appreciated! I'm also happy to answer any questions that come up.
Thank you for reading and have a terrific day!
Great question - my +1 is definitely for Revit Python Shell (RPS).
Likewise, I had a basic understanding of Python and none of the Revit API, but with RPS I've coded multiple add-ins for our office (including rich user interfaces using WinForms) and have hit no limitations so far from coding in Python. It's true that there is some translating of C# API samples into Python, but the reward is in seeing a few paragraphs of code become a few lines...
The maker of RPS (Daren) is also really helpful, so no questions go unanswered.
The disclaimer is that, like you, I'm a novice programmer who simply wanted to use the API to extend Revit. RPS for the win.
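To make that concrete, here is a small RPS-style sketch that totals wall areas. It assumes the __revit__ variable that RPS's default startup script exposes (pyRevit does the same) and uses the built-in computed area parameter; treat it as a starting point rather than polished code:

    # RevitPythonShell / IronPython sketch: total wall area in the active model.
    import clr
    clr.AddReference('RevitAPI')
    from Autodesk.Revit.DB import (FilteredElementCollector, BuiltInCategory,
                                   BuiltInParameter)

    # __revit__ is pre-defined by the RPS startup script.
    doc = __revit__.ActiveUIDocument.Document

    walls = FilteredElementCollector(doc)\
        .OfCategory(BuiltInCategory.OST_Walls)\
        .WhereElementIsNotElementType()\
        .ToElements()

    total_area = 0.0
    for wall in walls:
        param = wall.get_Parameter(BuiltInParameter.HOST_AREA_COMPUTED)
        if param:
            total_area += param.AsDouble()   # Revit internal units: square feet

    print('Total wall area: %.1f sq ft across %d walls' % (total_area, len(walls)))

The same collector pattern works for windows and floors; insulation and drywall values come from each wall type's compound structure layers.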
Indeed, the most-used programming language for Revit is C# (.NET). If you decide to go with IronPython it should work, but there is less material available...
Using C#, check the My First Revit Plugin training. For your specific scenario, download the SDK and check the "Fire Rating" sample.
I'm creating a video-sharing site using Django; it currently runs on PHPMotion, but I decided to rewrite the script.
Users come to my site and upload spam and adult videos, which I hate. Is it possible to censor videos automatically, using Python? (I will soon remove PHPMotion.)
I'm pretty sure this is impossible, but maybe you know of some way to do it automatically. YouTube censors Barclays Premier League football games and occasionally some music videos; I don't know whether it does that automatically, but I guess not.
I don't know whether you understand German or not, but this article might be useful:
http://www.linux-magazin.de/Heft-Abo/Ausgaben/2011/07/Objekterkennung
They have done some amazing things with OpenCV and show how to detect naked bodies, if it's the adult content you're worried about.
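The article's approach is more involved, but even a naive skin-tone heuristic gives a feel for the idea: convert frames to HSV, threshold on a skin-coloured range, and flag uploads where many frames are mostly skin. The HSV bounds and the 0.4 cutoff below are rough assumptions, and this will happily flag beach videos too:

    # Naive skin-fraction heuristic with OpenCV.
    import cv2
    import numpy as np

    LOWER_SKIN = np.array([0, 48, 80], dtype=np.uint8)
    UPPER_SKIN = np.array([20, 255, 255], dtype=np.uint8)

    def skin_fraction(frame):
        """Return the fraction of pixels whose HSV value falls in the skin range."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
        return cv2.countNonZero(mask) / float(mask.size)

    cap = cv2.VideoCapture('upload.mp4')
    flagged = total = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        total += 1
        if skin_fraction(frame) > 0.4:
            flagged += 1
    cap.release()

    print('%d of %d frames exceed the skin threshold' % (flagged, total))

Something like this could run as a background task when a video is uploaded and queue suspicious uploads for manual review rather than rejecting them outright.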