Django - How to save data into 'cp1251' database - python

I need to save data into an external cp1251 MSSQL database:
ee = Claim.objects.using('mssql').get(pk=1)
test = u'test'
ee.comment = test
ee.save(using='mssql')
But I get an error:
DjangoUnicodeDecodeError: 'utf8' codec can't decode byte 0xbb in position 0: invalid start byte. You passed in bytearray(b'\xbbu\xc09P\xf2\x00F\x8a\xd5T\x9a\xf9K\xff!') (<type 'bytearray'>)
I tried to save data in this way:
tt = u'test'.decode('cp1251')
ee.comment = tt
But I got the same error. I would be thankful for any help with this.
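For reference, a hedged sketch of the encode/decode distinction that is usually behind this error (the sample string below is illustrative, not from the post): text is *encoded* to bytes and bytes are *decoded* back to text, so u'test'.decode('cp1251') goes in the wrong direction. In Python 2, calling .decode() on a unicode object first encodes it with the ASCII default codec, which is what raises the UnicodeDecodeError for non-ASCII data.

```python
# Sketch: text -> bytes is encode(), bytes -> text is decode().
text = u'тест'
cp1251_bytes = text.encode('cp1251')       # text -> cp1251 bytes
roundtrip = cp1251_bytes.decode('cp1251')  # cp1251 bytes -> text
print(roundtrip == text)  # → True
```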
P.S.
this is the relevant part of my settings.py file:
'mssql': {
    'ENGINE': 'sql_server.pyodbc',
    'NAME': 'test',
    'USER': 'textures',
    'PASSWORD': 'testpassword',
    'HOST': 'MSSQL-PYTHON',
    'PORT': '1234',
    'OPTIONS': {
        'host_is_server': False,
        'dsn': 'MSSQL-PYTHON',
        'charset': 'cp1251',
        'use_unicode': True,
    },
}


How to differentiate access to data at the database server level?

In the DB I have three roles: guest, client and admin.
In my Django project, there are three connections under these roles:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'test',
        'USER': 'guest',
        'PASSWORD': 'guest',
        'HOST': 'localhost',
        'PORT': 5432,
    },
    'admin': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'test',
        'USER': 'admin',
        'PASSWORD': 'admin',
        'HOST': 'localhost',
        'PORT': 5432,
    },
    'customer': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'test',
        'USER': 'customer',
        'PASSWORD': 'customer',
        'HOST': 'localhost',
        'PORT': 5432,
    }
}
How and where can I change the connection to the database depending on whether the user is authenticated or not?
I am presuming that you are using psycopg2 to connect to the PostgreSQL RDBMS. What I would do is specify which PostgreSQL user you want to use before you execute your query.
For example:
import psycopg2

def func1():
    conn = psycopg2.connect(database="exampledb", user="user1", password="user1password", host="127.0.0.1", port="5432")
    cur = conn.cursor()
    cur.execute("SELECT * FROM schema_name.table_name;")
    rows = cur.fetchall()
    for row in rows:
        print(row)

def func2():
    conn = psycopg2.connect(database="exampledb", user="user2", password="user2password", host="127.0.0.1", port="5432")
    cur = conn.cursor()
    cur.execute("INSERT INTO schema_name.table_name (col1, col2) VALUES (1, 2);")
    conn.commit()  # an INSERT returns no rows, so commit instead of fetching
I would also be very careful with the admin user: from a security standpoint, I would not allow this account to be used for server-side scripting, because if an SQL injection were executed a lot of harm could be done. For preventing SQL injection in Python I would recommend this: https://realpython.com/prevent-python-sql-injection/
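To illustrate that recommendation, here is a hedged sketch of a parameterized query using the stdlib sqlite3 module (psycopg2 follows the same DB-API pattern but uses %s placeholders instead of ?); the table and the hostile string are made up for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE demo (col1 INTEGER, col2 TEXT)")

hostile = "1); DROP TABLE demo; --"  # dangerous if concatenated into the SQL string
# Parameters are passed separately, so the driver escapes them:
cur.execute("INSERT INTO demo (col1, col2) VALUES (?, ?)", (1, hostile))
conn.commit()

cur.execute("SELECT col2 FROM demo")
stored = cur.fetchone()[0]
print(stored == hostile)  # → True: the hostile input was stored as inert data
```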

Using quicksight boto3 update_data_source

Summary
What is the right syntax for the boto3 QuickSight update_data_source call to change credentials?
Context
I am trying to use the update_data_source method for QuickSight in boto3 to update my Redshift credentials in QuickSight.
My issue is that I am passing a dictionary as a key to another dictionary. How can I unpack that to get to the username/password for Redshift?
Code
My code looks like this:
def main():
    qs = boto3.client('quicksight', region_name=region_name)
    response = qs.update_data_source(
        AwsAccountId='awsaccountid',
        DataSourceId='datasourceid',
        Name='qs_test',
        Credentials={
            {
                'CredentialPair': {
                    'Username': 'test_user',
                    'Password': 'my_pass'
                }
            }
        }
    )
    print(response)

main()
I also tried the below:
response = qs.update_data_source(
    AwsAccountId='awsaccountid',
    DataSourceId='datasourceid',
    Name='qs_test',
    Credentials={CredentialPair
        {
            RedshiftParameters=[
                {
                    'Database': 'dbname',
                    'ClusterId': 'clusterid'
                }
            ]
        }
    },
    Credentials={
        'CredentialPair': {
            'Username': 'test_user',
            'Password': 'my_pass'
        }
    }
)
print(response)
The below syntax works:
def main():
    qs = boto3.client('quicksight', region_name=region_name)
    response = qs.update_data_source(
        AwsAccountId='awsaccountid',
        DataSourceId='datasourceid',
        Name='qs_test',
        DataSourceParameters={
            'RedshiftParameters': {
                'Database': 'dbname',
                'ClusterId': 'clusterid'
            }
        },
        Credentials={
            'CredentialPair': {
                'Username': 'test_user',
                'Password': 'my_pass'
            }
        }
    )
    print(response)

main()

Get index name of a list made from dictionaries

I want to begin by saying that I am by no means a Python expert, so I am sorry if I express myself incorrectly.
I am building a script that goes something like this:
from netmiko import ConnectHandler

visw0102 = {
    'device_type': 'hp_comware',
    'ip': '192.168.0.241',
    'username': 'admin',
    'password': 'password'
}
visw0103 = {
    'device_type': 'hp_comware',
    'ip': '192.168.0.242',
    'username': 'admin',
    'password': 'password'
}

site1_switches = [visw0102, visw0103]

for switch in site1_switches:
    ... (rest of the script)
I am trying to get the current index name in the for loop by using the enumerate() function on the site1_switches list, but since that list is made of dictionaries, the dictionary keys are returned:
>>> for index, w in enumerate(switch):
...     print(w)
...
device_type
ip
username
password
Is there a way to get the actual index name (VISW010X) instead of the keys that are in the dictionaries?
Thank you
Edit: Nested dictionary was the answer here, thanks Life is complex
So I was able to get further. Here's the code now.
from netmiko import ConnectHandler

site1_switches = {
    'visw0102': {
        'device_type': 'hp_comware',
        'ip': '192.168.0.241',
        'username': 'admin',
        'password': 'password'
    },
    'visw0103': {
        'device_type': 'hp_comware',
        'ip': '192.168.0.242',
        'username': 'admin',
        'password': 'password'
    }
}

for key, values in site1_switches.items():
    device_type = values.get('device_type', {})
    ip_address = values.get('ip', {})
    username = values.get('username', {})
    password = values.get('password', {})

for key in site1_switches.items():
    net_connect = ConnectHandler(**dict(key))  # <- ConnectHandler needs a dictionary
Now the problem is that the dictionary key seems to be converted to a tuple, but ConnectHandler needs a dictionary to proceed.
Here's what I get:
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
ValueError: dictionary update sequence element #0 has length 8; 2 is required
I would need to find a way to convert the tuple to a dictionary, but dict(key) doesn't seem to work, as it puts the tuple into the first dictionary key (or so it seems).
Is there any way I can achieve that?
Thanks!
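The tuple problem above can be reproduced and resolved without network access; in this hedged sketch, ConnectHandler is replaced by a plain stand-in function so it runs offline (the switch names and IPs are taken from the question). .items() yields (name, settings) tuples, so the fix is to unpack the pair and splat only the settings dict:

```python
def connect(**kwargs):  # stand-in for netmiko.ConnectHandler, for demo only
    return kwargs['ip']

site1_switches = {
    'visw0102': {'device_type': 'hp_comware', 'ip': '192.168.0.241',
                 'username': 'admin', 'password': 'password'},
    'visw0103': {'device_type': 'hp_comware', 'ip': '192.168.0.242',
                 'username': 'admin', 'password': 'password'},
}

# Unpack each (name, settings) pair instead of calling dict() on the tuple:
for name, settings in site1_switches.items():
    print(name, connect(**settings))
```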
Have you considered using a nested dictionary?
site1_switches = {
    'visw0102': {
        'device_type': 'hp_comware',
        'ip': '192.168.0.241',
        'username': 'admin',
        'password': 'password'
    },
    'visw0103': {
        'device_type': 'hp_comware',
        'ip': '192.168.0.242',
        'username': 'admin',
        'password': 'password'
    }
}
for key, value in site1_switches.items():
    print(key)

# output
visw0102
visw0103
Here's another way to accomplish this.
for index, (key, value) in enumerate(site1_switches.items()):
    print(index, key, value)

# output
0 visw0102 {'device_type': 'hp_comware', 'ip': '192.168.0.241', 'username': 'admin', 'password': 'password'}
1 visw0103 {'device_type': 'hp_comware', 'ip': '192.168.0.242', 'username': 'admin', 'password': 'password'}
A more complete solution
from netmiko import ConnectHandler

# nested dictionary
site1_switches = {
    'visw0102': {
        'device_type': 'hp_comware',
        'ip': '192.168.0.241',
        'username': 'admin',
        'password': 'password'
    },
    'visw0103': {
        'device_type': 'hp_comware',
        'ip': '192.168.0.242',
        'username': 'admin',
        'password': 'password'
    }
}

for key, values in site1_switches.items():
    device_type = values.get('device_type', {})
    ip_address = values.get('ip', {})
    username = values.get('username', {})
    password = values.get('password', {})

    print(f'{key}', {device_type}, {ip_address}, {username}, {password})
    # output
    # visw0102 {'hp_comware'} {'192.168.0.241'} {'admin'} {'password'}
    # visw0103 {'hp_comware'} {'192.168.0.242'} {'admin'} {'password'}

    print(f'Establishing a connection to {key}')
    # output
    # Establishing a connection to visw0102

    # pseudo code based on ConnectHandler parameters
    switch_connect = ConnectHandler(device_type=device_type, host=ip_address, username=username, password=password)

    # checking that the connection has a prompt
    switch_connect.find_prompt()

    # What you want to do goes here...
    # Example
    command_output = switch_connect.send_command('display current-configuration')
Unfortunately, there doesn't seem to be a nice, succinct way of accessing the dictionary's name, but "Get name of dictionary" provides some possible workarounds:
Nesting your switch dictionaries within an overarching dictionary that maps names to dictionaries is one method.
site1_switches = {
    "visw0102": visw0102,
    "visw0103": visw0103
}
Another would be to add a "name" key to each dictionary, so that you can access the name of each switch in site1_switches via switch['name']:
visw0102 = {
    'name': 'visw0102',
    'device_type': 'hp_comware',
    'ip': '192.168.0.241',
    'username': 'admin',
    'password': 'password'
}
visw0103 = {
    'name': 'visw0103',
    'device_type': 'hp_comware',
    'ip': '192.168.0.242',
    'username': 'admin',
    'password': 'password'
}

Is it possible to have multiple indexes in Haystack Elasticsearch using the real-time signal for auto-update?

I have multiple indexes in Elasticsearch via Haystack and am trying to auto-update them with RealtimeSignalProcessor. Is this supported by Haystack?
Here is the link I followed.
The same thing worked very well for a single index.
I suspect something is wrong with the HAYSTACK_CONNECTIONS in my settings; please suggest the correct syntax.
I don't have any specific need to write any custom SignalProcessors. Is there a way to use the off-the-shelf Haystack RealtimeSignalProcessor?
I referred to this question but it was not helpful.
HAYSTACK_CONNECTIONS = {
    'default': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'haystack',
        'INCLUDE_SPELLING': True,
    },
    'Hello': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'helloindex',
        'INCLUDE_SPELLING': True,
    },
    'Note': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'noteindex',
        'INCLUDE_SPELLING': True,
    },
}
Thank-you in advance.
Yes, it's possible.
I was able to solve this issue by using Django-Haystack's routers.
In settings.py I did this:
HAYSTACK_CONNECTIONS = {
    'My_Testing': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'my_testing',
        'INCLUDE_SPELLING': True,
        'EXCLUDED_INDEXES': ['talks.search_indexes.NoteIndex'],
    },
    'Note': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'note',
        'INCLUDE_SPELLING': True,
        'EXCLUDED_INDEXES': ['talks.search_indexes.My_TestingIndex'],
    },
    'default': {
        'ENGINE': 'haystack.backends.elasticsearch_backend.ElasticsearchSearchEngine',
        'URL': 'http://127.0.0.1:9200/',
        'INDEX_NAME': 'haystack',
        # 'INCLUDE_SPELLING': True,
    },
}

HAYSTACK_ROUTERS = ['talks.routers.My_TestingRouter',
                    'talks.routers.NoteRouter']

HAYSTACK_SIGNAL_PROCESSOR = 'haystack.signals.RealtimeSignalProcessor'
and in the routers.py file, which is at the same level as search_indexes.py, add this:
from haystack import routers

class My_TestingRouter(routers.BaseRouter):
    def for_write(self, **hints):
        return 'My_Testing'

    def for_read(self, **hints):
        return 'My_Testing'

class NoteRouter(routers.BaseRouter):
    def for_write(self, **hints):
        return 'Note'

    def for_read(self, **hints):
        return 'Note'
Hope this helps somebody someday.
peace.
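For intuition, the dispatch that HAYSTACK_ROUTERS drives can be sketched in plain Python (this is a hedged toy model of the routing idea, not Haystack's implementation): each router in order is asked which connection alias to use, and the first non-None answer wins, falling back to 'default'.

```python
# Toy routers mirroring the for_read/for_write shape used above.
class My_TestingRouter:
    def for_read(self, **hints):
        return 'My_Testing'

class NoteRouter:
    def for_read(self, **hints):
        return 'Note'

HAYSTACK_ROUTERS = [My_TestingRouter(), NoteRouter()]

def connection_for_read(**hints):
    # Ask each router in order; first alias returned wins.
    for router in HAYSTACK_ROUTERS:
        alias = router.for_read(**hints)
        if alias:
            return alias
    return 'default'

print(connection_for_read())  # → My_Testing
```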

Django Redis set max connections

I'm using Django and I'm having issues with exceeding my maximum number of Redis connections. The library I'm using is:
https://github.com/sebleier/django-redis-cache
Here is my settings.py file:
CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': "pub-redis-11905.us-east-1-3.1.ec2.garantiadata.com:11905",
        'OPTIONS': {
            'DB': 0,
            'PASSWORD': "*****",
            'PARSER_CLASS': 'redis.connection.HiredisParser'
        },
    },
}
Then in another file, I do some direct cache access like so:
from django.core.cache import cache

def getResults(self, key):
    return cache.get(key)
It looks like this is an outstanding issue with django-redis-cache; perhaps you should consider a different Redis cache backend for Django that does support connection pooling.
Here's django-redis-cache configured with a connection pool that sets max_connections:
CACHES = {
    'default': {
        'OPTIONS': {
            'CONNECTION_POOL_CLASS': 'redis.BlockingConnectionPool',
            'CONNECTION_POOL_CLASS_KWARGS': {
                'max_connections': 50,
                'timeout': 20,
                ...
            },
            ...
        },
        ...
    }
}
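For context, here is a hedged stdlib sketch of what a blocking connection pool such as redis.BlockingConnectionPool does (a toy model, not the redis-py implementation): it caps the number of live connections at max_connections and makes callers wait up to timeout seconds for a free one instead of opening more.

```python
import queue

class BlockingPool:
    """Toy blocking pool: at most max_connections objects exist; get()
    blocks up to `timeout` seconds for a free one and raises queue.Empty
    instead of growing the pool."""
    def __init__(self, factory, max_connections, timeout):
        self._free = queue.Queue(maxsize=max_connections)
        for _ in range(max_connections):
            self._free.put(factory())
        self._timeout = timeout

    def get(self):
        return self._free.get(timeout=self._timeout)

    def release(self, conn):
        self._free.put(conn)

pool = BlockingPool(factory=object, max_connections=2, timeout=0.1)
a, b = pool.get(), pool.get()   # pool is now exhausted
try:
    pool.get()                  # a third get() times out rather than connecting
    exhausted = False
except queue.Empty:
    exhausted = True
pool.release(a)                 # returning a connection frees a slot
print(exhausted)  # → True
```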
