Here's my cache setting:
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        }
    }
}
And a very basic view with the cache_page decorator:
@api_view(['POST'])
@cache_page(60 * 1)
def sawan_ko_jhari(request):
    data = request.data.get('name')
    return JsonResponse({"success": True, "data": data}, safe=False)
I've been checking the cache keys after every request I send, and I get an empty array.
Is there something I'm missing here?
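As a side note, this is how I have been inspecting the keys: a minimal sketch, assuming the settings above, a running Redis instance, and the django-redis helper for grabbing the raw client.

```python
from django_redis import get_redis_connection

# Raw redis-py client behind the 'default' cache alias
con = get_redis_connection("default")
# Lists every key django-redis has written to this Redis database
print(con.keys("*"))
```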
I have multiple indexes in Elasticsearch using Haystack, and I am trying to auto-update them with RealtimeSignalProcessor. Is that supported by Haystack?
Here is the link I followed.
The same thing worked very well for a single index.
I suspect something is wrong with the HAYSTACK_CONNECTIONS in my settings; please suggest the correct syntax.
I don't have any specific need to write custom SignalProcessors. Is there a way to use the off-the-shelf Haystack RealtimeSignalProcessor?
I referred to this question, but it was not helpful.
HAYSTACK_CONNECTIONS = {
    'default': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'haystack',
        'INCLUDE_SPELLING': True,
    },
    'Hello': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'helloindex',
        'INCLUDE_SPELLING': True,
    },
    'Note': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'noteindex',
        'INCLUDE_SPELLING': True,
    },
}
Thank you in advance.
Yes, it's possible.
I was able to solve this issue by using django-haystack's routers.
In settings.py I did this:
HAYSTACK_CONNECTIONS = {
    'My_Testing': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'my_testing',
        'INCLUDE_SPELLING': True,
        'EXCLUDED_INDEXES': ['talks.search_indexes.NoteIndex'],
    },
    'Note': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'note',
        'INCLUDE_SPELLING': True,
        'EXCLUDED_INDEXES': ['talks.search_indexes.My_TestingIndex'],
    },
    'default': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'haystack',
        # 'INCLUDE_SPELLING': True,
    },
}

HAYSTACK_ROUTERS = ['talks.routers.My_TestingRouter',
                    'talks.routers.NoteRouter']

HAYSTACK_SIGNAL_PROCESSOR = 'haystack.signals.RealtimeSignalProcessor'
and in the routers.py file, which sits at the same level as search_indexes.py, add this:
from haystack import routers


class My_TestingRouter(routers.BaseRouter):
    def for_write(self, **hints):
        return 'My_Testing'

    def for_read(self, **hints):
        return 'My_Testing'


class NoteRouter(routers.BaseRouter):
    def for_write(self, **hints):
        return 'Note'

    def for_read(self, **hints):
        return 'Note'
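With the per-connection EXCLUDED_INDEXES in place, each connection can then be (re)built on its own via Haystack's management commands, assuming the connection aliases above:

```shell
python manage.py rebuild_index --using=My_Testing
python manage.py rebuild_index --using=Note
```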
Hope this helps somebody someday.
peace.
I'm trying to adapt the asynch_query.py script found at https://github.com/GoogleCloudPlatform/bigquery-samples-python/tree/master/python/samples to execute a query and send the output to a BigQuery table. The JSON section of the script as I've created it for setting the parameters is as follows:
job_data = {
    'jobReference': {
        'projectId': project_id,
        'job_id': str(uuid.uuid4())
    },
    'configuration': {
        'query': {
            'query': queryString,
            'priority': 'BATCH' if batch else 'INTERACTIVE',
            'createDisposition': 'CREATE_IF_NEEDED',
            'defaultDataset': {
                'datasetId': 'myDataset'
            },
            'destinationTable': {
                'datasetID': 'myDataset',
                'projectId': project_id,
                'tableId': 'testTable'
            },
            'tableDefinitions': {
                '(key)': {
                    'schema': {
                        'fields': [
                            {
                                'description': 'eventLabel',
                                'fields': [],
                                'mode': 'NULLABLE',
                                'name': 'eventLabel',
                                'type': 'STRING'
                            }
                        ]
                    }
                }
            }
        }
    }
}
When I run my script I get an error message that a "Required parameter is missing". I've been through the documentation at https://cloud.google.com/bigquery/docs/reference/v2/jobs#configuration.query trying to figure out what is missing, but attempts at various configurations have failed. Can anyone identify what is missing and how I would fix this error?
Not sure what's going on. To insert the results of a query into another table I use this code:
def create_table_from_query(connector, project_id, dataset_id, query, dest_table):
    # Insert a query job whose results are written to dest_table
    body = {
        'configuration': {
            'query': {
                'destinationTable': {
                    'projectId': project_id,
                    'datasetId': dataset_id,
                    'tableId': dest_table,
                },
                'writeDisposition': 'WRITE_TRUNCATE',
                'query': query,
            },
        }
    }
    response = connector.jobs().insert(projectId=project_id,
                                       body=body).execute()
    wait_job_completion(connector, project_id,
                        response['jobReference']['jobId'])


def wait_job_completion(connector, project_id, job_id):
    # Poll the job until BigQuery reports it as DONE
    while True:
        response = connector.jobs().get(projectId=project_id,
                                        jobId=job_id).execute()
        if response['status']['state'] == 'DONE':
            return
where connector is build('bigquery', 'v2', http=authorization)
Maybe you could start from there and keep adding new fields as you wish (notice that you don't have to define the schema of the table as it's already contained in the results of the query).
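For comparison with the question's job_data, here is a minimal sketch of a request-body builder along the lines of the answer above (project/dataset/table names are made up). One detail worth checking: the API spells the destination-table key 'datasetId', with a lowercase 'd' in 'Id'.

```python
def query_job_body(project_id, dataset_id, table_id, query, batch=False):
    """Build a minimal jobs.insert body that writes query results to a table."""
    return {
        'configuration': {
            'query': {
                'query': query,
                'priority': 'BATCH' if batch else 'INTERACTIVE',
                'destinationTable': {
                    'projectId': project_id,
                    'datasetId': dataset_id,  # lowercase 'd' in 'Id'
                    'tableId': table_id,
                },
                'writeDisposition': 'WRITE_TRUNCATE',
            }
        }
    }

body = query_job_body('my-project', 'myDataset', 'testTable', 'SELECT 1')
```

This only builds the dictionary; it would still be passed to connector.jobs().insert(...) as in the answer's code.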
I can get the stats for a Memcached server in Python like this:
import memcache

host = memcache._Host('127.0.0.1:11211')
host.connect()
host.send_cmd('stats')
stats = {}
while True:
    # Each line looks like "STAT <key> <value>"
    line = host.readline().split(None, 2)
    if line[0] == "END":
        break
    stat, key, value = line
    try:
        value = int(value)
    except ValueError:
        pass
    stats[key] = value
host.close_socket()
print(stats)
Using several caches in Django, how do I get the stats for a specific one, e.g. the "store" cache in this configuration:
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'OPTIONS': {'MAX_ENTRIES': 4000},
    },
    'store': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'OPTIONS': {'MAX_ENTRIES': 1000},
    },
}
I'd like to find out whether MAX_ENTRIES is large enough for our purposes, so I need to know how many items are currently in the "default" cache and in the "store" cache.
UPDATE: As far as I can see, Memcached itself does not support MAX_ENTRIES, and the different cache names are only used as key prefixes by Django. There are likely no separate cache units inside Memcached.
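For item counts, one sketch is to go through Django's cache wrapper; this relies on the private _cache attribute of the memcached backend (an internal, undocumented detail that wraps a python-memcached Client), so treat it as an assumption:

```python
from django.core.cache import caches

# _cache is the underlying memcache.Client; get_stats() returns a list
# of (server, stats_dict) pairs, one per memcached server
for server, stats in caches['store']._cache.get_stats():
    print(server, stats.get('curr_items'))
```

Since both aliases here point at the same memcached instance, curr_items would count items from the "default" and "store" prefixes together.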
I'm using Django + Celery to asynchronously process data.
Here is my settings.py:
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'unique-snowflake'
    }
}
And here is my Celery task:
from celery import shared_task
from django.core.cache import cache

@shared_task
def process():
    my_data = cache.get('hello')
    if my_data is None:
        my_data = 'something'
        cache.set('hello', my_data)
It's very simple. However, every time I call the task, cache.get('hello') always returns None. I have no clue why. Could someone help me?
I also tried with Memcached and these settings:
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'TIMEOUT': 60 * 60 * 60 * 24,
        'OPTIONS': {
            'MAX_ENTRIES': 5000,
        }
    }
}
Of course, memcached is running as a daemon. But the code is still not working...
I'm using Django and having issues exceeding my max number of redis connections. The library I'm using is:
https://github.com/sebleier/django-redis-cache
Here is my settings.py file:
CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': 'pub-redis-11905.us-east-1-3.1.ec2.garantiadata.com:11905',
        'OPTIONS': {
            'DB': 0,
            'PASSWORD': '*****',
            'PARSER_CLASS': 'redis.connection.HiredisParser'
        },
    },
}
Then, in another file, I do some direct cache access like so:
from django.core.cache import cache

def getResults(self, key):
    return cache.get(key)
Looks like this is an outstanding issue with django-redis-cache - perhaps you should consider a different Redis cache backend for Django that does support connection pooling.
Here's django-redis-cache using a connection pool with max_connections set:
CACHES = {
    'default': {
        'OPTIONS': {
            'CONNECTION_POOL_CLASS': 'redis.BlockingConnectionPool',
            'CONNECTION_POOL_CLASS_KWARGS': {
                'max_connections': 50,
                'timeout': 20,
                ...
            },
            ...
        },
        ...
    }
}