I'm creating an iOS client for App.net and I'm attempting to set up a push notification server. Currently my app can add a user's App.net account id (a string of numbers) and an APNS device token to a MySQL database on my server. It can also remove this data. I've adapted code from these two tutorials:
How To Write A Simple PHP/MySQL Web Service for an iOS App - raywenderlich.com
Apple Push Notification Services in iOS 6 Tutorial: Part 1/2 - raywenderlich.com
In addition, I've adapted this awesome python script to listen in to App.net's App Stream API.
My Python is horrendous, as is my MySQL knowledge. What I'm trying to do is access the APNS device token for the accounts I need to notify. My database table has two fields/columns for each entry: one for user_id and one for device_token. I'm not sure of the terminology, please let me know if I can clarify this.
I've been trying to use peewee to read from the database but I'm in way over my head. This is a test script with placeholder user_id:
import logging
from pprint import pprint

import peewee

db = peewee.MySQLDatabase("...", host="localhost", user="...", passwd="...")

class MySQLModel(peewee.Model):
    class Meta:
        database = db

class Active_Users(MySQLModel):
    user_id = peewee.CharField(primary_key=True)
    device_token = peewee.CharField()

db.connect()

# This is the placeholder user_id
userID = '1234'

token = Active_Users.select().where(Active_Users.user_id == userID)
pprint(token)
This then prints out:
<class '__main__.User'> SELECT t1.`id`, t1.`user_id`, t1.`device_token` FROM `user` AS t1 WHERE (t1.`user_id` = %s) [u'1234']
If the code didn't make it clear, I'm trying to query the database for the row with the user_id of '1234' and I want to store the device_token of the same row (again, probably the wrong terminology) into a variable that I can use when I send the push notification later on in the script.
How do I correctly return the device_token? Also, would it be easier to forgo peewee and simply query the database using python-mysqldb? If that is the case, how would I go about doing that?
The call User.select().where(User.user_id == userID) returns a query of matching User objects, not the device_token itself, but you are assigning it to a variable called token as if it were just the device_token.
Your assignment should be this:
matching_users = Active_Users.select().where(Active_Users.user_id == userID)  # a query of matching users, even if there's just one

if matching_users.exists():  # the query object is never None, so check .exists() instead
    token = matching_users[0].device_token
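As for forgoing peewee: the plain DB-API pattern the MySQLdb driver uses is shown in the sketch below. It is written against an in-memory SQLite database (with made-up table and token values) so it runs standalone; MySQLdb.connect(...) gives you the same cursor/execute/fetchone interface, except that its parameter placeholder is %s rather than ?.

```python
import sqlite3  # stand-in for MySQLdb; the DB-API call shapes are the same

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE active_users (user_id TEXT PRIMARY KEY, device_token TEXT)")
cur.execute("INSERT INTO active_users VALUES (?, ?)", ("1234", "abcdef0123456789"))

# With MySQLdb the placeholder would be %s instead of ?
cur.execute("SELECT device_token FROM active_users WHERE user_id = ?", ("1234",))
row = cur.fetchone()
device_token = row[0] if row else None
print(device_token)
```

Always pass the user id as a bound parameter as above rather than interpolating it into the SQL string, so the query is safe from injection.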
I have an SQL database that stores all my data. I have a column there called “client_name” where I have client1, client2, etc.
I have created a basic dash authentication where as soon as my dash application loads, it asks the user for a username and password.
How can I make it work that if the user inputs “client1” as username, the app will automatically filter my SQL database to only read rows that belong to client1 and thus all visuals will display client1’s data?
Sample code
# User dictionary
USERNAME_PASSWORD_PAIRS = {
    'client1': 'client1',
    'client2': 'client2',
}

# Basic authentication
auth = dash_auth.BasicAuth(
    app,
    USERNAME_PASSWORD_PAIRS
)
# Connect and read data from SQL server
odbc_params = f'DRIVER={driver};SERVER=tcp:{server};PORT=1433;DATABASE={database};UID={username};PWD={password}'
connection_string = f'mssql+pyodbc:///?odbc_connect={odbc_params}'
engine = create_engine(connection_string)
query = "SELECT * FROM [dbo].[test]" # database with all client data
df = pd.read_sql(query, engine)
engine.dispose()
So I want “df” to be the filtered dataframe based on whether “client1” or “client2” was used to log in.
Thanks for the help
You have a column in df which is named "client" or similar. You need to get the name of the logged-in user, then filter the df for it:
username = auth.get_username()
df_filtered_for_client = df[df["client"].isin([username])]
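A minimal runnable sketch of that filtering step, using the "client_name" column from the question (the data and username are made up; in the real app df comes from your SQL query and username from the auth layer):

```python
import pandas as pd

# Stand-in for the frame read from SQL
df = pd.DataFrame({
    "client_name": ["client1", "client2", "client1"],
    "value": [10, 20, 30],
})

username = "client1"  # in the app: the logged-in user from dash_auth
df_filtered = df[df["client_name"] == username]
print(df_filtered["value"].tolist())
```

Doing the filter inside the callback that builds your visuals (rather than once at module load) ensures each logged-in client sees only their own rows.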
I'm using the azure python api (https://github.com/microsoft/azure-devops-python-api) and I need to be able to query & find a specific work item based on a custom field value.
The closest thing I can find is the function create_query, but Im hoping to be able to run a query such as
queryRsp = wit_5_1_client.run_query(
posted_query='',
project=project.id,
query='Custom.RTCID=282739'
)
I just need to find my azure devops work item where the custom field RTCID has a certain specific unique value.
Do i need to create a query with the api, run it, get results, then delete the query? Or is there any way I can run this simple query and get the results using the azure devops api?
Your requirement can be achieved.
For example, on my side there are two work items that have the custom field 'RTCID':
Below is how to implement this in Python (on my side, both the organization and the project are named 'BowmanCP'):
#query workitems from azure devops
from azure.devops.connection import Connection
from msrest.authentication import BasicAuthentication
from azure.devops.v5_1.work_item_tracking.models import Wiql
import pprint
# Fill in with your personal access token and org URL
personal_access_token = '<Your Personal Access Token>'
organization_url = 'https://dev.azure.com/BowmanCP'
# Create a connection to the org
credentials = BasicAuthentication('', personal_access_token)
connection = Connection(base_url=organization_url, creds=credentials)
# Get a client (the "core" client provides access to projects, teams, etc)
core_client = connection.clients.get_core_client()
#query workitems, custom field 'RTCID' has a certain specific unique value
work_item_tracking_client = connection.clients.get_work_item_tracking_client()
query = "SELECT [System.Id], [System.WorkItemType], [System.Title], [System.AssignedTo], [System.State], [System.Tags] FROM workitems WHERE [System.TeamProject] = 'BowmanCP' AND [Custom.RTCID] = 'xxx'"
#convert query str to wiql
wiql = Wiql(query=query)
query_results = work_item_tracking_client.query_by_wiql(wiql).work_items
#get the results via title
for item in query_results:
    work_item = work_item_tracking_client.get_work_item(item.id)
    pprint.pprint(work_item.fields['System.Title'])
Successfully got them on my side:
SDK source code is here:
https://github.com/microsoft/azure-devops-python-api/blob/451cade4c475482792cbe9e522c1fee32393139e/azure-devops/azure/devops/released/work_item_tracking/work_item_tracking_client.py#L704
You can refer to above source code.
We have data in a Snowflake cloud database that we would like to move into an Oracle database. As we would like to work toward refreshing the Oracle database regularly, I am trying to use SQLAlchemy to automate this.
I would like to do this using Core because my team is all experienced with SQL, but I am the only one with Python experience. I think it would be easier to tweak the data pulls if we just pass SQL strings. Plus the Snowflake db has some columns with JSON that seems easier to parse using direct SQL since I do not see JSON in the SnowflakeDialect.
I have established connections to both databases and am able to do select queries from both. I have also manually created the tables in our Oracle db so that the keys and datatypes match what I am pulling from Snowflake. When I try to insert, though, my Jupyter notebook just continuously says "Executing Cell" and hangs. Any thoughts on how to proceed or how to get the notebook to tell me where the hangup is?
from sqlalchemy import create_engine,pool,MetaData,text
from snowflake.sqlalchemy import URL
import pandas as pd
eng_sf = create_engine(URL(  # engine for snowflake
    account='account',
    user='user',
    password='password',
    database='database',
    schema='schema',
    warehouse='warehouse',
    role='role',
    timezone='timezone',
))
eng_o = create_engine("oracle+cx_oracle://{}[{}]:{}@{}".format('user', 'proxy', 'password', 'database'), poolclass=pool.NullPool)  # engine for oracle
meta_o = MetaData()
meta_o.reflect(bind=eng_o)
person_o = meta_o.tables['bb_lms_person']  # other oracle tables follow this example
meta_sf = MetaData()
meta_sf.reflect(bind=eng_sf,only=['person']) # other snowflake tables as well, but for simplicity, let's look at one
person_sf = meta_sf.tables['person']
person_query = """
SELECT ID
,EMAIL
,STAGE:student_id::STRING as STUDENT_ID
,ROW_INSERTED_TIME
,ROW_UPDATED_TIME
,ROW_DELETED_TIME
FROM cdm_lms.PERSON
"""
with eng_sf.begin() as connection:
result = connection.execute(text(person_query)).fetchall() # this snippet runs and returns result as expected
with eng_o.begin() as connection:
    connection.execute(person_o.insert(), result)  # this is a coin flip: sometimes it runs, sometimes it just hangs forever
eng_sf.dispose()
eng_o.dispose()
I've checked the typical offenders. The keys for both person_o and the result are all lowercase and match. Any guidance would be appreciated.
Use the table's metadata. Build the update on fTable_Stage with fluent calls and pass the new values as keyword arguments to .values(); this is safe because only columns defined in the table metadata can be referenced. I am updating three fields: LateProbabilityDNN, Sentiment_Polarity, Sentiment_Subjectivity:
from sqlalchemy import create_engine, MetaData, Table

engine = create_engine("mssql+pyodbc:///?odbc_connect=%s" % params)
connection = engine.connect()
metadata = MetaData()

fTable_Stage = Table('fTable_Stage', metadata, autoload=True, autoload_with=engine)
stmt = fTable_Stage.update().where(fTable_Stage.c.KeyID == keyID).values(
    LateProbabilityDNN=round(float(late_proba), 2),
    Sentiment_Polarity=round(my_valance.sentiment.polarity, 2),
    Sentiment_Subjectivity=round(my_valance.sentiment.subjectivity, 2),
)
connection.execute(stmt)
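On the original hang: one thing worth trying is inserting in smaller committed batches instead of passing the whole result set to a single execute. Below is a minimal sketch of the batching idea, shown with stdlib sqlite3 and made-up rows so it runs standalone; with SQLAlchemy you would instead call connection.execute(person_o.insert(), batch) once per slice inside an eng_o.begin() block.

```python
import sqlite3

rows = [(i, f"user{i}@example.com") for i in range(10)]  # stand-in for the Snowflake result set

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER, email TEXT)")

BATCH = 4
for start in range(0, len(rows), BATCH):
    batch = rows[start:start + BATCH]
    conn.executemany("INSERT INTO person VALUES (?, ?)", batch)
    conn.commit()  # commit each batch so progress is observable and locks are released

count = conn.execute("SELECT COUNT(*) FROM person").fetchone()[0]
print(count)
```

Batching also makes it easier to see where a load stalls: if the third batch hangs, you know roughly which rows (or which lock on the Oracle side) to investigate.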
I'm running a flask-rest-jsonapi application on top of Flask, sqlalchemy, and cx_Oracle. One requirement of this project is that the connection property client_identifier, made available via cx_Oracle (relevant documentation), be modifiable based on a value sent in a JWT with each client request. We need to be able to write to this property because our internal auditing tables make use of it to track changes made by individual users.
In PHP, setting this value is straightforward using oci8, and has worked great for us in the past.
However, I have been unable to figure out how to set the same property using this new application structure. In cx_Oracle, the client_identifier property is a 'write-only' property, so it's difficult to verify that the value is set correctly without going to the backend and examining the db session properties. You access this property via the sqlalchemy raw_connection object.
Beyond being difficult to read, setting the value has no effect. We get the desired client identifier value from the JWT passed in with each request and attempt to set it on the raw connection object. While the action of setting the value throws no error, the value does not show up on the backend for the relevant session, i.e. the client_identifier property is null when viewing sessions on the db side.
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy
app = Flask(__name__)
db = SQLAlchemy(app)
# the client_identifier property is available at db.engine.raw_connection()
# attempt to set the client_identifier property
raw_conn = db.engine.raw_connection()
raw_conn.client_identifier = 'USER_210'
# execute some sql using raw_conn.cursor()...
# the client identifier value on the db side for this request is null
Is the approach shown above the correct way to set the client_identifier? If so, why isn't USER_210 listed in the client_identifier column when querying the backend session table using the v$session view?
In pure cx_Oracle, this works for me:
import cx_Oracle
db = cx_Oracle.connect("system", "oracle", "localhost/orclpdb")
db.client_identifier = 'this-is-me'
cursor = db.cursor()
cursor.execute("select username, client_identifier from v$session where username = 'SYSTEM'")
v = cursor.fetchone()
print(v)
The result is:
$ python3 q1.py
('SYSTEM', 'this-is-me')
I don't have the setup to test your exact scenario.
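One possible explanation for the null on the Flask side is that the connection you set the attribute on is not the pooled connection your later SQL actually runs on. A way around that is to set the identifier every time a connection is checked out of SQLAlchemy's pool, via a "checkout" event listener. Below is a runnable sketch against SQLite (the cx_Oracle-specific assignment is shown as a comment, and user_id_from_jwt is a hypothetical helper for wherever you read the JWT claim):

```python
from sqlalchemy import create_engine, event, text

engine = create_engine("sqlite://")  # stand-in; use your oracle+cx_oracle URL in practice
applied = []

@event.listens_for(engine, "checkout")
def set_client_identifier(dbapi_connection, connection_record, connection_proxy):
    # With cx_Oracle the real call would be:
    #     dbapi_connection.client_identifier = user_id_from_jwt()
    applied.append("USER_210")  # record that the hook ran for this checkout

with engine.connect() as conn:
    conn.execute(text("select 1"))

print(applied)
```

Because the listener fires on every checkout, each request's SQL runs on a connection whose identifier was just set, regardless of which pooled connection it happens to receive.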
Is it possible to make SQLAlchemy do cross server joins?
If I try to run something like
engine = create_engine('mssql+pyodbc://SERVER/Database')
query = sql.text('SELECT TOP 10 * FROM [dbo].[Table]')
with engine.begin() as connection:
data = connection.execute(query).fetchall()
It works as I'd expect. If I change the query to select from [OtherServer].[OtherDatabase].[dbo].[Table] I get the error message "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'".
Looks like there's an issue with how you authenticate to SQL server.
I believe you can connect using the current Windows user, the URI syntax is then mssql+pyodbc://SERVER/Database?trusted_connection=yes (I have never tested this, but give it a try).
Another option is to create a SQL server login (ie. a username/password that is defined within SQL server, NOT a Windows user) and use the SQL server login when you connect.
The database URI then becomes: mssql+pyodbc://username:password@SERVER/Database.
mssql+pyodbc://SERVER/Database?trusted_connection=yes threw an error when I tried it. It did point me in the right direction though.
from sqlalchemy import create_engine, sql
import urllib.parse

string = "DRIVER={SQL SERVER};SERVER=server;DATABASE=db;TRUSTED_CONNECTION=YES"
params = urllib.parse.quote_plus(string)  # urllib.quote_plus on Python 2
engine = create_engine('mssql+pyodbc:///?odbc_connect={0}'.format(params))
query = sql.text('SELECT TOP 10 * FROM [CrossServer].[database].[dbo].[Table]')
with engine.begin() as connection:
data = connection.execute(query).fetchall()
It's quite complicated if you want to address different servers through one connection.
But if you need to run a query against a different server under different credentials, you should first add a linked server with sp_addlinkedserver, then add credentials for it with sp_addlinkedsrvlogin. Have you tried this?