I'm creating a website using Django, and since I'm not that strong in HTML/CSS (I can code and create layouts, but not ones that look professional enough), I thought I might use Bootstrap, the open source framework originally created by Twitter.
Bootstrap uses the Apache License 2.0 as far as I can see, and reading it there don't seem to be any problems with using it for commercial purposes. But I still wanted to ask, since reading terms and conditions isn't something I do every day and I'm not a lawyer: can I use Bootstrap (http://getbootstrap.com/2.3.2/) for commercial websites?
Tons of sites use it every day, many of them for commercial purposes. Feel free to use Bootstrap for any purpose you please.
You might want to check the comparison of various licenses at http://choosealicense.com/licenses/. Using Twitter Bootstrap is absolutely fine for a commercial website. You can see a few websites built with Bootstrap in the expo: http://expo.getbootstrap.com/
I am quite new to coding, and have limited experience with HTML and Python. I would like to develop a web page for personal use, and I thought it might be easier to start with a template and customize it from there. Can anyone recommend a place to find some free templates, or know a good place to start looking?
I learned how with Flask using this tutorial. It's for creating a blog, but you can apply the concepts to other types of websites.
It is time-consuming and will feel cumbersome at first, but it teaches you some extra things you need to know if you want the web page to be interactive, store data, etc.
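For a feel of what the starting point looks like, here's a minimal Flask sketch (the route and inline template are made up for illustration, not taken from the tutorial):

# Minimal Flask sketch; names and template are illustrative only
from flask import Flask, render_template_string

app = Flask(__name__)

# A tiny inline template just for the example; a real project would keep this in templates/
PAGE = """
<!doctype html>
<title>My site</title>
<h1>Hello, {{ name }}!</h1>
"""

@app.route("/")
def home():
    # Render the template with a variable to show the basic request/response cycle
    return render_template_string(PAGE, name="visitor")

if __name__ == "__main__":
    app.run(debug=True)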
Start Bootstrap is a great starting place! https://startbootstrap.com/ It has free, open source templates that you can use and that will help you become more comfortable with Bootstrap.
That said, I do recommend going through the W3Schools tutorials at https://www.w3schools.com/html/ and really familiarizing yourself with HTML and CSS if web development is of interest to you.
I realize that versions of this question have been asked and I spent several hours the other day trying a number of strategies.
What I would like to do is use Python to scrape all of the URLs from a Google search, which I can then use in a separate script to do text analysis of a large corpus (mainly news sites). This seems relatively straightforward, but none of the attempts I've tried have worked properly.
This is as close as I got:
# Uses the "google" package, which scrapes Google's result pages
from google import search

# Print every result URL for the site-restricted query, stopping after 100 results
for url in search('site:cbc.ca "kinder morgan" and "trans mountain" and protest*', stop=100):
    print(url)
This returned about 300 URLs before I got kicked. An actual search using these parameters provides about 1000 results and I'd like all of them.
First: is this possible? Second: does anyone have any suggestions to do this? I basically just want a txt file of all the URLs that I can use in another script.
It seems that this package uses screen scraping to retrieve search results from Google, so it doesn't play well with Google's Terms of Service, which could be why you've been blocked.
The relevant clause in Google's Terms of Service:
Don’t misuse our Services. For example, don’t interfere with our Services or try to access them using a method other than the interface and the instructions that we provide. You may use our Services only as permitted by law, including applicable export and re-export control laws and regulations. We may suspend or stop providing our Services to you if you do not comply with our terms or policies or if we are investigating suspected misconduct.
I haven't been able to find a definite number, but their limit on search queries per day seems rather strict too: 100 search queries per day, according to their JSON Custom Search API documentation.
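If you want to stay within the ToS, one option is the Custom Search JSON API itself. Here is a rough sketch using the requests library (the API key and search engine ID are placeholders you would have to create yourself, and the daily quota still limits how much you can pull):

# Rough sketch: query the Custom Search JSON API and save result URLs to a text file.
# YOUR_API_KEY and YOUR_CX are placeholders for your own API key and search engine ID.
import requests

API_KEY = "YOUR_API_KEY"
CX = "YOUR_CX"
query = 'site:cbc.ca "kinder morgan" "trans mountain" protest'

urls = []
for start in range(1, 101, 10):  # the API returns at most 10 results per request
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": query, "start": start},
    )
    items = resp.json().get("items", [])
    if not items:
        break
    urls.extend(item["link"] for item in items)

with open("urls.txt", "w") as f:
    f.write("\n".join(urls))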
Nonetheless, there's no harm trying out other alternatives to see if they work better:
BeautifulSoup (see the sketch after this list)
Scrapy
ParseHub - this one is not code, but it's a useful piece of software with good documentation, including a tutorial on how to scrape a list of URLs.
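As a rough idea of what the BeautifulSoup route looks like, here is a minimal sketch that fetches one page and collects its links (the URL is just a placeholder, and pointing this at Google's own result pages would run into the same ToS problem described above):

# Minimal BeautifulSoup sketch: fetch one page and collect its outgoing links.
# The URL is a placeholder; adapt the request and filtering to your target site.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://www.cbc.ca/news")
soup = BeautifulSoup(resp.text, "html.parser")

links = set()
for a in soup.find_all("a", href=True):
    links.add(a["href"])

with open("links.txt", "w") as f:
    f.write("\n".join(sorted(links)))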
I've been working on a small project which requires me to create a proxy with access to only a few sites.
I'm using the code from here: https://github.com/labnol/google-proxy
Now, I'm basically a PHP guy, but I haven't found anything better than the above for setting up a web proxy server.
What I need here is:
A way to filter URLs, so that people can access only the sites I allow.
On the allowed sites, I'd like to block certain scripts; e.g. on wikipedia.org, the user shouldn't be allowed to log in.
I'm a complete noob in Python. Can anyone suggest something here, or provide a code snippet I can use?
Thanks! :)
P.S.: For a PHP version, I've tried Glype and miniProxy, but they're not as good as the one I mentioned; they break the CSS/JS of the websites.
You may install Squid and write your own filtering rules.
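As a sketch of what those rules could look like in squid.conf (the domains and the regex are illustrative, not a tested configuration; note that for HTTPS traffic Squid only sees the destination domain unless you also set up SSL bumping, so URL-path rules like the login block below apply only to plain HTTP or bumped connections):

# Illustrative squid.conf fragment: allow only whitelisted sites, block Wikipedia's login page
acl allowed_sites dstdomain .wikipedia.org .example.org
acl wiki_login url_regex -i Special:UserLogin
http_access deny wiki_login
http_access allow allowed_sites
http_access deny all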
I need to create a simple web service, and am leaning towards the Python package soaplib.
I would also like to create a simple web page that can consume the web service. This will serve two purposes:
Allow for easier testing by multiple people including non-programmers.
Allow knowledgeable / power users to call the service directly in Production on a few occasions.
Any suggestions for creating this web page? Ideally, I want to generate it automatically rather than craft it by hand. I have used this type of feature with Visual Studio .NET, which auto-creates a basic web page as part of the process of creating a web service.
Any ideas are appreciated. I'd prefer an automated solution based on soaplib but am open to any non-soaplib or non-Python solution as well.
Have you considered using REST (here or here)? I was building a small app on GAE and wanted it to have a programmable interface too. So, using exactly the same URLs, I created a REST web service which, based on the MIME type the client accepts, answers either with JSON (for software) or HTML (for users). You have a wide range of possibilities, and it's a lot cleaner than a SOAP-based web service.
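To make the idea concrete, here is a rough Flask sketch of one URL answering with JSON or HTML depending on the client's Accept header (this only illustrates the pattern; it is not the original GAE code, and the route and data are made up):

# Rough sketch of content negotiation on one URL: JSON for programs, HTML for browsers.
from flask import Flask, jsonify, request, render_template_string

app = Flask(__name__)

ITEMS = [{"id": 1, "name": "example"}]  # stand-in data

@app.route("/items")
def items():
    # If the client prefers JSON (e.g. a script), return JSON; otherwise return HTML
    best = request.accept_mimetypes.best_match(["application/json", "text/html"])
    if best == "application/json":
        return jsonify(items=ITEMS)
    return render_template_string(
        "<ul>{% for i in items %}<li>{{ i.name }}</li>{% endfor %}</ul>", items=ITEMS
    )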
This week, I want to start a web mapping and data visualization site for my work.
Unfortunately, I just found out my workplace will be switching to Drupal a few months down the road. (Most of my web development experience is with App Engine.)
My problem is that I need to make sure my web application embeds nicely into the larger Drupal site that outside consultants plan to make.
I am most comfortable with Python, and I was expecting to use the Python-Django combo instead. There are important Python libraries and modules I must have that can't be found or rewritten in PHP.
I was thinking I would avoid any Django on the web pages themselves so things don't get confusing when the Drupal switch is made.
I will have the JavaScript on the web page make calls to Python on the server, which then returns JSON data, and I think this part will stay the same even after the Drupal switch.
Does this make sense?
Any general or specific suggestions that may guide me are greatly appreciated!
If you write API calls and use the Drupal Services module, you can hook into just about anything and send/receive JSON/XML data.
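On the Python side, the endpoint that the JavaScript (or a Drupal front end) calls can be as small as a Django view that returns JSON; here is a minimal sketch with made-up names and data:

# Illustrative Django view: the front end calls this URL and gets JSON back.
# View name, URL, and data are made up for the example.
from django.http import JsonResponse

def map_data(request):
    # Replace this with the real Python/GIS work; the front end only ever sees JSON
    data = {"points": [{"lat": 49.28, "lon": -123.12, "label": "example"}]}
    return JsonResponse(data)

# urls.py (sketch)
# from django.urls import path
# urlpatterns = [path("api/map-data/", map_data)]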