Zerodha API and Python: InputException: Invalid `api_key` or `access_token`

It looks like the error is related to an invalid API key or access token. As far as I can tell, everything I did is correct; the steps I took are listed below:
I've created an app at https://developers.kite.trade/ (from the Zerodha Kite Connect dashboard).
Here is an image of the created Zerodha API key.
To get data from Zerodha in Python, I am trying the Zerodha Kite Connect API. Kite Connect is a set of REST-like APIs that expose many capabilities required to build a complete investment and trading platform. To use the API, I first needed to create a Zerodha account and then apply for API access. After receiving my API key, I can use it to make requests to the Kite Connect API using a Python library such as kiteconnect or kiteconnect-python.
Here is an example of how you could use the kiteconnect library to get historical data for a stock:
from kiteconnect import KiteConnect
import datetime

kite = KiteConnect(api_key='0cv9cnax7bmgjclh')

# Get historical data for a stock
today = datetime.datetime.now().date()
historical_data = kite.historical_data(
    instrument_token=6048,                           # Instrument token of a stock
    from_date=today - datetime.timedelta(days=365),  # From date
    to_date=today,                                   # To date
    interval="daily"                                 # Interval (minute, hourly, daily, weekly, monthly, yearly)
)
print(historical_data)
Error:
---------------------------------------------------------------------------
InputException Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_18768\3958108260.py in <module>
6 # Get historical data for a stock
7 today = datetime.datetime.now().date()
----> 8 historical_data = kite.historical_data(
9 instrument_token=6048, # Instrument token of a stock
10 from_date=today - datetime.timedelta(days=365), # From date
~\anaconda3\lib\site-packages\kiteconnect\connect.py in historical_data(self, instrument_token, from_date, to_date, interval, continuous, oi)
629 to_date_string = to_date.strftime(date_string_format) if type(to_date) == datetime.datetime else to_date
630
--> 631 data = self._get("market.historical",
632 url_args={"instrument_token": instrument_token, "interval": interval},
633 params={
~\anaconda3\lib\site-packages\kiteconnect\connect.py in _get(self, route, url_args, params, is_json)
849 def _get(self, route, url_args=None, params=None, is_json=False):
850 """Alias for sending a GET request."""
--> 851 return self._request(route, "GET", url_args=url_args, params=params, is_json=is_json)
852
853 def _post(self, route, url_args=None, params=None, is_json=False, query_params=None):
~\anaconda3\lib\site-packages\kiteconnect\connect.py in _request(self, route, method, url_args, params, is_json, query_params)
925 # native Kite errors
926 exp = getattr(ex, data.get("error_type"), ex.GeneralException)
--> 927 raise exp(data["message"], code=r.status_code)
928
929 return data["data"]
InputException: Invalid `api_key` or `access_token`.
I am trying to get historical data via API from Zerodha.
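For reference: every Kite Connect data endpoint needs an access token in addition to the API key, and the token is only issued through Kite's login handshake. A minimal sketch of that handshake, with placeholder credentials, might look like this:

from kiteconnect import KiteConnect

kite = KiteConnect(api_key='your_api_key')

# Open this URL in a browser, log in, and copy the request_token
# from the redirect URL.
print(kite.login_url())

# Exchange the request_token for an access token and attach it to the
# client before calling endpoints such as historical_data().
data = kite.generate_session('request_token_here', api_secret='your_api_secret')
kite.set_access_token(data['access_token'])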

Related

Binance Python Spot API returns wrong data for certain timestamps

I came across a weird problem with the Binance API. There are 438 ETH and 434 BTC timestamps for which the API returns the wrong data. For ETH, some of them are:
2017-09-06 16:01:00
2017-12-04 06:01:00
2017-12-18 10:01:00
2017-12-18 12:30:00
When I request the data for one of these defunct timestamps, the Binance response returns a different timestamp, while for all others, the response timestamp is the same as the request timestamp.
I put together the simplest reproducible example to illustrate the difference in Binance's response for normal and defunct timestamps.
Code:
import json
from binance.spot import Spot
from datetime import timezone
from datetime import datetime
import numpy as np

f = open('../data.json')
cred = json.load(f)
key = cred['key']
secret = cred['secret']
client = Spot(key=key, secret=secret)

def to_milliseconds(time):
    return int(np.datetime64(time, 'ms').astype('int64'))

def to_timestamp(time):
    return np.datetime64(datetime.fromtimestamp(int(time) / 1000, tz=timezone.utc), 'm')

normal_timestamp = "2018-05-02T10:00"
defunct_timestamp = "2017-09-06T16:01"

for id, timestamp in enumerate([normal_timestamp, defunct_timestamp]):
    start_time = np.datetime64(timestamp, 'm')
    start_time_ms = to_milliseconds(start_time)
    ticker = client.klines(symbol='ETHUSDT', interval='1m', limit=1, startTime=start_time_ms)
    return_time_ms = ticker[0][0]
    print('Case {id}. Request time: {request_time} (in ms: {request_time_in_ms}). Response time: {response_time} (in ms: {response_time_in_ms})'.format(
        id=id,
        request_time=start_time,
        request_time_in_ms=start_time_ms,
        response_time=to_timestamp(return_time_ms),
        response_time_in_ms=return_time_ms
    ))
Output:
Case 0. Request time: 2018-05-02T10:00 (in ms: 1525255200000). Response time: 2018-05-02T10:00 (in ms: 1525255200000)
Case 1. Request time: 2017-09-06T16:01 (in ms: 1504713660000). Response time: 2017-09-06T23:00 (in ms: 1504738800000)
Did someone encounter that too? What would be the reason for that?
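One possible explanation, offered as an assumption rather than anything confirmed here: klines returns candles whose open time is at or after startTime, so a minute with no candle (e.g., exchange downtime) makes the response jump forward to the next available open time. Reusing client, np, and the helper functions from the snippet above, the gap can be made visible by requesting a few consecutive candles around the defunct timestamp:

# Assumes `client`, `np`, `to_milliseconds`, and `to_timestamp` from above.
start_ms = to_milliseconds(np.datetime64('2017-09-06T16:00', 'm'))
candles = client.klines(symbol='ETHUSDT', interval='1m', limit=5, startTime=start_ms)
for candle in candles:
    # Consecutive open times; a jump of more than one minute means missing candles.
    print(to_timestamp(candle[0]))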

Tiingo Data Reader

I was developing a price prediction model that requires Tiingo, but there seems to be a problem with the API authentication. I used os.environ to access the Tiingo API key.
import os
import pandas as pd
import pandas_datareader as pdr

api_key = os.environ.get('TIINGO_API_KEY')
df = pdr.get_data_tiingo('AAPL', api_key)
df = pd.read_csv('AAPL.csv')
print(df.tail())
The error I got looks like:
~\AppData\Local\Temp/ipykernel_9920/1017009006.py in <module>
1 api_key =os.environ.get('TIINGO_API_KEY')
----> 2 df=pdr.get_data_tiingo('AAPL',api_key)
3 df=pd.read_csv('AAPL.csv')
4 print(df.tail())
~\anaconda3\lib\site-packages\pandas_datareader\data.py in get_data_tiingo(*args, **kwargs)
118
119 def get_data_tiingo(*args, **kwargs):
--> 120 return TiingoDailyReader(*args, **kwargs).read()
121
122
~\anaconda3\lib\site-packages\pandas_datareader\tiingo.py in __init__(self, symbols, start, end, retry_count, pause, timeout, session, freq, api_key)
181 api_key = os.getenv("TIINGO_API_KEY")
182 if not api_key or not isinstance(api_key, str):
--> 183 raise ValueError(
184 "The tiingo API key must be provided either "
185 "through the api_key variable or through the "
ValueError: The tiingo API key must be provided either through the api_key variable or through the environmental variable TIINGO_API_KEY.
Any assistance is highly appreciated.
It seems api_key is coming back as None. You should check that.
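A minimal sketch of that check, assuming the key is meant to come from the environment (note that api_key is passed by keyword here, because the second positional parameter of get_data_tiingo is the start date, not the key):

import os
import pandas_datareader as pdr

api_key = os.environ.get('TIINGO_API_KEY')
if api_key is None:
    raise RuntimeError('TIINGO_API_KEY is not set in the environment')

# Pass the key by keyword; passed positionally it would land in `start`.
df = pdr.get_data_tiingo('AAPL', api_key=api_key)
print(df.tail())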

Count all restaurants from a city by maps api

I'm trying to count all restaurants in my city using the python-google-places API,
but it is not working; I'm getting "failed with response code: INVALID_REQUEST".
What could be causing this?
My code looks like this:
from googleplaces import GooglePlaces, types, lang
from time import sleep

YOUR_API_KEY = '<<MYKEY>>'
google_places = GooglePlaces(YOUR_API_KEY)

# You may prefer to use the text_search API, instead.
query_result = google_places.nearby_search(
    lat_lng={'lat': -16.6824083, 'lng': -49.2556573},
    location='Goiania',
    radius=50000,
    types=[types.TYPE_RESTAURANT])

counter = 0
while query_result.has_next_page_token:
    counter = counter + len(query_result.places)
    query_result = google_places.nearby_search(
        lat_lng={'lat': -16.6824083, 'lng': -49.2556573},
        location='Goiania',
        radius=50000,
        types=[types.TYPE_RESTAURANT],
        pagetoken=query_result.next_page_token)

print(counter)
I'm getting this:
---------------------------------------------------------------------------
GooglePlacesError Traceback (most recent call last)
<ipython-input-42-9cc6675b31bc> in <module>()
21 location='Goiania',
22 radius=50000,types=[types.TYPE_RESTAURANT],
---> 23 pagetoken=query_result.next_page_token)
24
25 print(counter)
C:\ProgramData\Anaconda3\lib\site-packages\googleplaces\__init__.py in nearby_search(self, language, keyword, location, lat_lng, name, radius, rankby, sensor, type, types, pagetoken)
303 url, places_response = _fetch_remote_json(
304 GooglePlaces.NEARBY_SEARCH_API_URL, self._request_params)
--> 305 _validate_response(url, places_response)
306 return GooglePlacesSearchResult(self, places_response)
307
C:\ProgramData\Anaconda3\lib\site-packages\googleplaces\__init__.py in _validate_response(url, response)
173 error_detail = ('Request to URL %s failed with response code: %s' %
174 (url, response['status']))
--> 175 raise GooglePlacesError(error_detail)
176
177
GooglePlacesError: Request to URL https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-16.6824083%2C-49.2556573&radius=50000&type=restaurant&pagetoken=CqQCGAEAAAzE3wT0DnczXFlzyjvAaka8vRLZMlsAjF2aqezA8dtGcLIV7ePoqXAUOm0MyxgroXBcKydzt3U3rB2RFvqLijFCbJ3-ucQ-nijN1E7d4aEcC2UlKUR2gNnHfmKYmFVmfQ70lbW-UmCm79WOl2s5oQ8VYoE9bRnr01IphBbVeiS_IDBsCwmsALU4ti5z-7RSYT9ACTCgFs8bVwU9lQ2x_F3v2FtkdqP7UWl5MmNLteox4dSCwa_k3gKD9yd8mCzzos0CvS248uqn_24wLaVubPmxAUrDbSFDhoSx5c8O7S-XrHl4aZ2dx4QUznYXVcEcD_9c-AHKnPoqK-zwh2MVRiHLHNscTnxr4_iCJwsrrOcqlyQrN192HCq9BMADG1tLVxIQ16yZSa5g10FKIcHzFwQqrxoUxS_m8v1Lbr0IbujvfXRi74p71ws&language=en&key=AIzaSyD8YxHJjYdGMO-k7MbOdF807uzEYT-QGYo&sensor=false failed with response code: INVALID_REQUEST
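One documented quirk of the Places API that matches this error: a next_page_token takes a short time to become valid after it is issued, and using it immediately returns INVALID_REQUEST. A sketch of the paging loop with a pause before each follow-up request (putting the already-imported sleep to use, and also counting the final page, which the original loop skips):

counter = len(query_result.places)
while query_result.has_next_page_token:
    sleep(2)  # the next_page_token needs a moment before it becomes valid
    query_result = google_places.nearby_search(
        lat_lng={'lat': -16.6824083, 'lng': -49.2556573},
        location='Goiania',
        radius=50000,
        types=[types.TYPE_RESTAURANT],
        pagetoken=query_result.next_page_token)
    counter += len(query_result.places)
print(counter)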

Tweepy returns only 76 tweets

I am trying to gather movie reviews from Twitter. However, I get only 76 tweets. I tried to catch TweepError, but that doesn't help. Here is my code:
import tweepy
import time
import cPickle as pickle

auth = tweepy.OAuthHandler(**hidden**)
auth.set_access_token(**hidden**)
api = tweepy.API(auth)

def limit_handled(cursor):
    while True:
        try:
            yield cursor.next()
            "I am awake..."
        except tweepy.error:
            print "going to sleep..."
            time.sleep(15 * 60)
        except StopIteration:
            break

query = '#moviereview -filter:links'
max_tweets = 1000000
searched_tweets = [status.text for status in limit_handled(tweepy.Cursor(api.search, q=query).items(max_tweets))]

with open("twitter_reviews.pkl", "wb") as f:
    pickle.dump(searched_tweets, f, -1)

print len(searched_tweets)
Try modifying your query parameters; as per your code, it is the query itself, not the code, that is filtering out further results.
Query for:
'#moviereview -filter:links'
provides 78 results (and counting)
Query for:
'#moviereview'
provides 1713 results (and counting)
Query for:
'#moviereview Filter:links'
provides 4534 results (and counting)
and, as @Ethan mentioned and per Twitter's API documentation (https://dev.twitter.com/rest/public/search):
The Twitter Search API searches against a sampling of recent Tweets
published in the past 7 days.
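A side note on the rate-limit handling in the question: except tweepy.error names a module rather than an exception class, so it will never match. With the tweepy 3.x API, a simpler approach is to let the library wait out rate limits itself, something like:

# Let tweepy sleep through rate limits instead of a hand-rolled handler.
api = tweepy.API(auth, wait_on_rate_limit=True, wait_on_rate_limit_notify=True)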

Pandas: noauth_local_webserver

I haven't used io within pandas to access the Google Analytics API for a few weeks, but it had been working fine historically without hiccups. I ran it again today and it looks as though the tools.run syntax is deprecated, so I made a pull and replaced tools.py with this update, and I've changed auth.py within pandas to be:
def authenticate(flow, storage=None):
    """
    Try to retrieve a valid set of credentials from the token store if possible
    Otherwise use the given authentication flow to obtain new credentials
    and return an authenticated http object

    Parameters
    ----------
    flow : authentication workflow
    storage: token storage, default None
    """
    http = httplib2.Http()
    # Prepare credentials, and authorize HTTP object with them.
    credentials = storage.get()
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, FLAGS)
    http = credentials.authorize(http)
    return http
I have a feeling my usage of FLAGS there is incorrect.
Any help? Thanks!
Here's my code and the error:
df = ga.read_ga(
    account_id=id,
    profile_id=profile,
    property_id=property,
    metrics=['transactionRevenue', 'transactions'],
    dimensions=['transactionId', 'city', 'region', 'date', 'hour', 'minute', 'cityId'],
    start_date="2015-07-11",
    end_date="2015-07-16",
    index_col=0,
    parse_dates={'new_date': [3, 4, 5]})
The error thrown:
C:\Users\mburke\AppData\Local\Continuum\Anaconda64\lib\site-packages\pandas\io\auth.py in authenticate(flow, storage)
106 credentials = storage.get()
107 if credentials is None or credentials.invalid:
--> 108 credentials = tools.run_flow(flow, storage, FLAGS)
109
110 http = credentials.authorize(http)
C:\Users\mburke\AppData\Local\Continuum\Anaconda64\lib\site-packages\oauth2client\util.pyc in positional_wrapper(*args, **kwargs)
140 else: # IGNORE
141 pass
--> 142 return wrapped(*args, **kwargs)
143 return positional_wrapper
144
C:\Users\mburke\AppData\Local\Continuum\Anaconda64\lib\site-packages\oauth2client\tools.pyc in run_flow(flow, storage, flags, http)
148 logging.getLogger().setLevel(getattr(logging, flags.logging_level))
--> 149 if not flags.noauth_local_webserver:
150 success = False
151 port_number = 0
C:\Users\mburke\AppData\Local\Continuum\Anaconda64\lib\site-packages\python_gflags-2.0-py2.7.egg\gflags.pyc in __getattr__(self, name)
1057 fl = self.FlagDict()
1058 if name not in fl:
-> 1059 raise AttributeError(name)
1060 return fl[name].value
1061
AttributeError: noauth_local_webserver
I did a little digging and you are correct in your assumption that the usage of FLAGS is incorrect. The docstring for tools.run_flow() states:
flags: ``argparse.Namespace``, The command-line flags. This is the
object returned from calling ``parse_args()`` on
``argparse.ArgumentParser`` as described above.
The quick-n-dirty fix would be something like this:
credentials = tools.run_flow(flow, storage, tools.argparser.parse_args([]))
I believe a more robust solution would be for the maintainers of pandas.io to update it to the new workflow if tools.run is really deprecated.
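Applied to the authenticate function above, the patched version might look like this (a sketch; tools.argparser ships with oauth2client):

import httplib2
from oauth2client import tools

def authenticate(flow, storage=None):
    """Retrieve valid credentials from storage, or run the auth flow."""
    http = httplib2.Http()
    credentials = storage.get()
    if credentials is None or credentials.invalid:
        # Parse an empty argv so run_flow receives a real argparse.Namespace
        # (with the noauth_local_webserver default) instead of gflags' FLAGS.
        flags = tools.argparser.parse_args([])
        credentials = tools.run_flow(flow, storage, flags)
    http = credentials.authorize(http)
    return http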
