I'm working on a stock prediction project. This is what I want:
To show all the stocks available in Nifty50, Nifty100 and so on, and then let the user select a stock to predict its high and low price for the next day only.
I'm using Django.
What I have done so far:
I'm able to display a list of stocks.
import io
import requests
import pandas as pd
from django.shortcuts import render

def index(request):
    api_key = 'myAPI_Key'
    url50 = 'https://archives.nseindia.com/content/indices/ind_nifty50list.csv'
    url100 = 'https://archives.nseindia.com/content/indices/ind_nifty100list.csv'
    url200 = 'https://archives.nseindia.com/content/indices/ind_nifty200list.csv'
    # Download the index constituent lists from NSE
    sfifty = requests.get(url50).content
    shundred = requests.get(url100).content
    stwohundred = requests.get(url200).content
    nifty50 = pd.read_csv(io.StringIO(sfifty.decode('utf-8')))
    nifty100 = pd.read_csv(io.StringIO(shundred.decode('utf-8')))
    nifty200 = pd.read_csv(io.StringIO(stwohundred.decode('utf-8')))
    # Keep only the ticker symbols for display
    nifty50 = nifty50['Symbol']
    nifty100 = nifty100['Symbol']
    nifty200 = nifty200['Symbol']
    context = {
        'fifty': nifty50,
        'hundred': nifty100,
        'twohundred': nifty200
    }
    return render(request, 'StockPrediction/index.html', context)
What I want:
I want to get live data for all stocks: Open, High, LTP, Change and Volume. By live data I mean that the displayed values should change as the stock values change.
Please Help!
You can combine AJAX/jQuery code like the snippet below to periodically fetch data and update the values in the DOM:
(function getStocks() {
    $.ajax({
        type: "GET",
        url: "url to your view",
        success: function (data) {
            // here you get the data from the backend and apply changes,
            // e.g. update the displayed values or change colors based on the response
        }
    }).then(function () { // on completion, restart
        setTimeout(getStocks, 30000); // the function refers to itself
    });
})();
But be careful about making too many requests; you must choose a proper polling interval on this line: setTimeout(getStocks, <interval in milliseconds>);
And in your view you should return the query results as JSON, something like this:
return JsonResponse({'stocks': stocks})
Here stocks must be JSON-serializable (for example, a list of dicts).
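As a concrete illustration, here is a minimal view sketch; get_live_quote() is a hypothetical helper that stands in for whatever quote source you end up using (a broker API, an NSE data library, etc.):

from django.http import JsonResponse

def live_quotes(request):
    # Symbols requested by the page, e.g. ?symbols=RELIANCE&symbols=TCS
    symbols = request.GET.getlist('symbols')
    stocks = []
    for symbol in symbols:
        # get_live_quote() is a placeholder: it should return a dict with
        # the latest open/high/ltp/change/volume for the symbol
        quote = get_live_quote(symbol)
        stocks.append({
            'symbol': symbol,
            'open': quote['open'],
            'high': quote['high'],
            'ltp': quote['ltp'],
            'change': quote['change'],
            'volume': quote['volume'],
        })
    return JsonResponse({'stocks': stocks})

The AJAX success callback can then loop over data.stocks and update the corresponding table cells.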
Related
I am working with the TD Ameritrade API and am building something that returns the price history when requested. Currently, I am unable to get the 4:00 PM bar when 'needExtendedHoursData' is "false". I can get 4:00 PM when 'needExtendedHoursData' is true, but I don't want the extra data that comes with it.
import requests

def get_priceHistory(start_date, freq, symbol):
    payload = {
        'apikey': client_id,
        'periodType': 'day',
        'frequencyType': 'minute',
        'frequency': 60,
        'startDate': start_date,
        'needExtendedHoursData': 'false'
    }
    endpoint = r'https://api.tdameritrade.com/v1/marketdata/{}/pricehistory'.format(symbol)
    content = requests.get(url=endpoint, params=payload)
    data = content.json()
    return data
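For reference, a small sketch of how the returned candles can be inspected, assuming the usual pricehistory response shape of a 'candles' list with millisecond 'datetime' values (the start date below is just an example):

from datetime import datetime

data = get_priceHistory(start_date=1546300800000, freq=60, symbol='AAPL')
# Print the last few candles to see which timestamps come back
for candle in data.get('candles', [])[-3:]:
    ts = datetime.fromtimestamp(candle['datetime'] / 1000.0)
    print(ts, candle['open'], candle['high'], candle['low'], candle['close'], candle['volume'])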
I'm trying to perform a custom query in Python via an AJAX call.
The frontend sends the start time and end time in Unix time (milliseconds), e.g. 1548417600000.
I then convert that to ISO time (I think?) in Python, as that is what MongoDB prefers AFAIK.
Document example:
{
    "_id" : ObjectId("5c125a185dea1b0252c5352"),
    "time" : ISODate("2018-12-13T15:09:42.536Z"),
}
PyMongo doesn't return anything, however, even though I know there should be thousands of results.
from datetime import datetime
from django.http import JsonResponse

@login_required(login_url='/login')
def querytimerange(request):
    print("Expecto Patronum!!")
    if request.method == 'POST':
        querydata = lambda x: request.POST.get(x)
        colname = querydata('colname')
        startdate = querydata('start')
        enddate = querydata('end')
        startint = int(startdate)
        endint = int(enddate)
        # Convert the millisecond Unix timestamps to ISO-formatted strings
        dtstart = datetime.utcfromtimestamp(startint / 1000.0)
        iso_start = str(dtstart.isoformat())
        print(iso_start)
        dtend = datetime.utcfromtimestamp(endint / 1000.0)
        iso_end = str(dtend.isoformat())
        print(iso_end)
        collection = db[colname]
        data = collection.find({"time": {"$gt": iso_start, "$lt": iso_end}})
        for a in data:
            print(a)
        return JsonResponse({"ok": "ok"})
    else:
        return JsonResponse({"ok": "no"})
So yeah, I think I'm struggling to get the format of the dates right.
After converting from Unix time, the date is in a str like this:
2019-01-20T04:00:00 &
2019-01-25T12:00:00.
Not sure if that's correct, but that should be isoformat AFAIK?
My main goal is to use it in an aggregation pipeline.
{
    "$match": {
        "time": {
            "date": {
                "$gt": startdate,
                "$lt": enddate
            }
        }
    }
},
I'm using PyMongo Driver on my Django app.
Thanks!
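For what it's worth, a minimal sketch of the datetime-based approach, assuming PyMongo's standard behaviour of mapping Python datetime objects to BSON dates (the database and collection names here are hypothetical):

from datetime import datetime
from pymongo import MongoClient

client = MongoClient()                 # assumes a local MongoDB instance
collection = client['mydb']['mycol']   # hypothetical database/collection names

# Millisecond Unix timestamps as sent by the frontend (example values)
startint, endint = 1548417600000, 1548504000000
dtstart = datetime.utcfromtimestamp(startint / 1000.0)
dtend = datetime.utcfromtimestamp(endint / 1000.0)

# PyMongo compares datetime objects against ISODate fields directly,
# so no string conversion is needed
data = collection.find({"time": {"$gt": dtstart, "$lt": dtend}})

# The same objects work inside an aggregation $match stage
pipeline = [{"$match": {"time": {"$gt": dtstart, "$lt": dtend}}}]
results = collection.aggregate(pipeline)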
I am a beginner in Python and am trying to use the webhose.io API to collect data from the web. The problem is that this crawler retrieves 100 objects per JSON response, i.e., to retrieve 500 results it is necessary to make 5 requests. When I use the API, I am not able to collect all the data at once. I was able to collect the first 100 results, but when moving on to the next request something goes wrong: the first post is repeated. Here is the code:
import webhoseio

webhoseio.config(token="Xxxxx")
query_params = {
    "q": "trump:english",
    "ts": "1498538579353",
    "sort": "crawled"
}
output = webhoseio.query("filterWebContent", query_params)

x = 0
for var in output['posts']:
    print(output['posts'][x]['text'])
    print(output['posts'][x]['published'])
    if output['posts'] is None:
        output = webhoseio.get_next()
        x = 0
Thanks.
Use the following:
while output['posts']:
    for var in output['posts']:
        print(var['text'])
        print(var['published'])
    output = webhoseio.get_next()
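If the goal is simply to collect a fixed number of posts, a fuller sketch of the same pagination loop (assuming webhoseio.query() and webhoseio.get_next() behave as described above) could look like this:

import webhoseio

webhoseio.config(token="Xxxxx")
output = webhoseio.query("filterWebContent", {
    "q": "trump:english",
    "ts": "1498538579353",
    "sort": "crawled"
})

all_posts = []
while output['posts'] and len(all_posts) < 500:  # stop after roughly 500 posts
    all_posts.extend(output['posts'])
    output = webhoseio.get_next()

for post in all_posts:
    print(post['published'], post['text'][:80])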
I would like to get Dividend and Split data for some companies in the US through the Python module for the Bloomberg API (blpapi); I am using a screening to extract these companies. I am using the Python module blpapi:
import blpapi

# Connect to the Bloomberg platform
sessionOptions = blpapi.SessionOptions()
sessionOptions.setServerHost(bloomberg_host)
sessionOptions.setServerPort(bloomberg_port)
session = blpapi.Session(sessionOptions)
session.start()
session.openService("//blp/refdata")

# Request the dividend and split history
refDataService = session.getService("//blp/refdata")
request = refDataService.createRequest("HistoricalDataRequest")
request.getElement("securities").appendValue("AAPL US Equity")
request.getElement("fields").appendValue("DVD_HIST_ALL")
request.set("periodicityAdjustment", "ACTUAL")
request.set("periodicitySelection", "DAILY")
request.set("startDate", "20140101")
request.set("endDate", "20141231")
request.set("maxDataPoints", 1)
session.sendRequest(request)
But I get the following answer:
HistoricalDataResponse = {
    securityData = {
        security = "AAPL US Equity"
        eidData[] = {
        }
        sequenceNumber = 0
        fieldExceptions[] = {
            fieldExceptions = {
                fieldId = "DVD_HIST_ALL"
                errorInfo = {
                    source = "951::bbdbh5"
                    code = 1
                    category = "BAD_FLD"
                    message = "Not valid historical field"
                    subcategory = "NOT_APPLICABLE_TO_HIST_DATA"
                }
            }
        }
        fieldData[] = {
        }
    }
}
Looking at the documentation (blpapi-developers-guide) I see multiple request possibilities (Reference Data Service, Market Data Service, API Field Information Service), but none of them explains how to get the dividend/split data. I don't know which service and which request to use.
In the terminal, these Dividend and Split values are registered under the tag CACT if you use a screening, and under DVD if you look at the dividend/split of a currently loaded stock (in the worst case I can loop over the companies I want in my code).
If someone knows how to do it, you will illuminate my day!
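One possibility worth trying (an assumption on my part, not something confirmed here): the error subcategory NOT_APPLICABLE_TO_HIST_DATA suggests DVD_HIST_ALL is a bulk reference field rather than a historical field, so a ReferenceDataRequest on the same //blp/refdata service might return it. A minimal sketch:

import blpapi

# Sketch: request DVD_HIST_ALL via a ReferenceDataRequest instead of a
# HistoricalDataRequest, assuming it is a bulk reference field.
sessionOptions = blpapi.SessionOptions()
sessionOptions.setServerHost(bloomberg_host)   # same host/port variables as above
sessionOptions.setServerPort(bloomberg_port)
session = blpapi.Session(sessionOptions)
session.start()
session.openService("//blp/refdata")

refDataService = session.getService("//blp/refdata")
request = refDataService.createRequest("ReferenceDataRequest")
request.getElement("securities").appendValue("AAPL US Equity")
request.getElement("fields").appendValue("DVD_HIST_ALL")
session.sendRequest(request)

# Drain the event queue and print whatever comes back
while True:
    event = session.nextEvent(500)
    for msg in event:
        print(msg)
    if event.eventType() == blpapi.Event.RESPONSE:
        break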
I am trying to return a Python dictionary to the view with AJAX, reading from a JSON file, but so far I am only getting [object Object],[object Object]...
And if I inspect the network traffic, I can indeed see the correct data.
So here is what my code looks like. I have a class with a method which, based on the selected ID (a request argument), returns specific data. It gets the data from a Python dictionary. The problem is not there; I have already tested it. But I will include it just in case.
# method to create the dictionary - just in case #
def getCourselist_byClass(self, classid):
    """
    Get the course list by the class id, joining the two tables.
    Will only get data if both of them exist in their main tables.
    Returns a list.
    """
    connection = db.session.connection()
    querylist = []
    raw_sql = text("""
        SELECT
            course.course_id,
            course.course_name
        FROM
            course
        WHERE
            EXISTS(
                SELECT 1
                FROM
                    class_course_identifier
                WHERE
                    course.course_id = class_course_identifier.course_id
                    AND EXISTS(
                        SELECT 1
                        FROM
                            class
                        WHERE
                            class_course_identifier.class_id = class.class_id
                            AND class.class_id = :classid
                    )
            )""")
    query = connection.engine.execute(raw_sql, {'classid': classid})
    for column in query:
        entry = {
            'course_id': column['course_id'],
            'course_name': column['course_name']
        }
        querylist.append(entry)
    return querylist
My jsonify route method:
@main.route('/task/create_test')
def get_courselist():
    # objects
    course = CourseClass()
    class_id = request.args.get('a', type=int)
    # methods
    results = course.getCourselist_byClass(class_id)
    return jsonify(result=results)
HTML
And here is what the input field and the element that should receive the data look like:
<input type="text" size="5" name="a">
<span id="result">?</span>
<p><a href="#" id="link">click me</a></p>
And then I am calling it like this:
<script type="text/javascript">
    $(function() {
        $('a#link').bind('click', function() {
            $.getJSON("{{ url_for('main.get_courselist') }}", {
                a: $('input[name="a"]').val()
            }, function(data) {
                $("#result").text(data.result);
            });
            return false;
        });
    });
</script>
But every time I enter an ID number in the field, I do get the correct data; it is just not formatted correctly. It is instead printed as [object Object].
Source: I followed this guide as inspiration: flask ajax example.
The data returned by your server looks like {result: [{course_id: 'xxx', course_name: 'xxx'}]}, in which data.result is a JS array.
When you pass it to $("#result").text(), JS converts the array to a string, so the result is [object Object].
You should iterate over the array to construct a string, then set that string in the DOM, like:
var courseStr = data.result.map(function(course) { return course.course_id + '-' + course.course_name; }).join(',');
$("#result").text(courseStr);
The API description for flask.json.jsonify indicates it expects keyword parameters. What you actually want to do seems to be to serialize a list of dictionaries; have you tried flask.json.dumps instead? Assuming you have the dumps symbol imported, instead of your jsonify call you can try:
return dumps(results)
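For context, a minimal sketch of that variant; note that the client-side callback would then receive the bare array as data, so the jQuery code would iterate over data itself rather than data.result:

from flask import Response
from flask.json import dumps

@main.route('/task/create_test')
def get_courselist():
    course = CourseClass()
    class_id = request.args.get('a', type=int)
    results = course.getCourselist_byClass(class_id)
    # dumps() serializes the list of dicts to a JSON array; wrapping it in a
    # Response keeps the application/json content type for the client
    return Response(dumps(results), mimetype='application/json')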