Given a Solana wallet address, I would like to retrieve every confirmed transaction so I can inspect details such as the receiver (or sender) and the amount sent (or received). So, as usual, I searched for APIs and found the following:
Solana py
PySolana
After that, I looked at which methods they offer. The one that seems closest to what I want is solana_client.get_confirmed_signature_for_address2 (available in 1); however, my results do not match what its documentation shows. Here is my code:
from solana.rpc.api import Client
solana_client = Client("https://api.devnet.solana.com")
solana_client.get_signatures_for_address("2AQdpHJ2JpcEgPiATUXjQxA8QmafFegfQwSLWSprPicm", limit=1)
I get this:
{'jsonrpc': '2.0', 'result': [], 'id': 1}
However, I should get its last signature, which seems to be this:
4SNQ4h1vL9GkmSnojQsf8SZyFvQsaq62RCgops2UXFYag1Jc4MoWrjTg2ELwMqM1tQbn9qUcNc4tqX19EGHBqC5u
Anyway, we can check it on SolanaBeach. Further, if I code exactly as the documentation explains:
from solana.rpc.api import Client
solana_client = Client("https://api.devnet.solana.com")
solana_client.get_signatures_for_address("Vote111111111111111111111111111111111111111", limit=1)
I get this:
{'jsonrpc': '2.0', 'result': [{'blockTime': 1637328065, 'confirmationStatus': 'finalized', 'err': {'InstructionError': [0, {'Custom': 0}]}, 'memo': None, 'signature': '5yaeqDRCHWCGQMqNWhq3g6zqw63MBkri9i86hjK954YFFvnG2VCQJfszXsozDVUJbePagJieAzwsSY5H7Xd1jJhC', 'slot': 95301596}], 'id': 1}
The weird thing is that "Vote111...11" does not seem to be a wallet address. Nevertheless, I get the expected result, i.e. a signature, even though that signature cannot be found by Solana Explorer...
Please tell me what to fix; I have no idea what to do. I even checked whether the various Solana explorers have their own APIs, but they do not. Probably because Solana already provides one, right?
EDIT
Well, it seems I need to enter the "account address as base-58 encoded string", thus the address becomes: HLiBGYYxaQqQx8UTPHEahqcd7aZjkDgN3bihc3hYM3SDUBGU9LFrQSnx9eje.
I also did that and I get:
{'jsonrpc': '2.0', 'error': {'code': -32602, 'message': 'Invalid param: WrongSize'}, 'id': 1}
I have implemented a function in JavaScript to get all the transactions of a given address; this might help you out.
async getTransactionsOfUser(address, options, connection) {
  console.log({ address, options });
  try {
    const publicKey = new PublicKey(address);
    const transSignatures =
      await connection.getConfirmedSignaturesForAddress2(publicKey, options);
    console.log({ transSignatures });

    const transactions = [];
    for (let i = 0; i < transSignatures.length; i++) {
      const signature = transSignatures[i].signature;
      const confirmedTransaction = await connection.getConfirmedTransaction(
        signature,
      );
      if (confirmedTransaction) {
        const { meta } = confirmedTransaction;
        if (meta) {
          const oldBalance = meta.preBalances;
          const newBalance = meta.postBalances;
          const amount = oldBalance[0] - newBalance[0];
          const transWithSignature = {
            signature,
            ...confirmedTransaction,
            fees: meta?.fee,
            amount,
          };
          transactions.push(transWithSignature);
        }
      }
    }
    return transactions;
  } catch (err) {
    throw err;
  }
}
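The balance arithmetic in the JavaScript above can be isolated into a small helper. A minimal Python sketch of the same computation (assuming `meta` is the transaction's meta dict, with `preBalances`/`postBalances` lists in lamports, as returned by the RPC):

```python
def transfer_amount(meta):
    """Difference of the fee payer's pre/post balances, in lamports.

    Mirrors the JavaScript snippet: the first entry of preBalances/postBalances
    belongs to the fee payer, so the difference includes the transaction fee.
    """
    pre = meta["preBalances"]
    post = meta["postBalances"]
    return pre[0] - post[0]


# hypothetical meta from a confirmed transaction
example_meta = {
    "preBalances": [1_000_000, 0],
    "postBalances": [594_995, 400_000],
    "fee": 5_005,
}
print(transfer_amount(example_meta))  # 405005 (400000 transferred + 5005 fee)
```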
The problem is neither the module nor the function, but the endpoint.
In the Solana docs I found the endpoints for mainnet:
https://api.mainnet-beta.solana.com
https://solana-api.projectserum.com
and they return all the values.
On another page you can see that:
devnet is only a playground for tests, and its tokens are not real
testnet is only for stress tests, and its tokens are not real
#Devnet#
- Devnet serves as a playground for anyone who wants to take Solana for a test drive, as a user, token holder, app developer, or validator.
- Application developers should target Devnet.
- Potential validators should first target Devnet.
- Key differences between Devnet and Mainnet Beta:
- Devnet tokens are not real
- Devnet includes a token faucet for airdrops for application testing
- Devnet may be subject to ledger resets
- Devnet typically runs a newer software version than Mainnet Beta
#Testnet#
- Testnet is where we stress test recent release features on a live cluster, particularly focused on network performance, stability and validator behavior.
- Testnet tokens are not real
- Testnet may be subject to ledger resets.
- Testnet includes a token faucet for airdrops for application testing
- Testnet typically runs a newer software release than both Devnet and Mainnet Beta
Minimal working example for tests:
from solana.rpc.api import Client

all_addresses = [
    '2AQdpHJ2JpcEgPiATUXjQxA8QmafFegfQwSLWSprPicm',
    'Vote111111111111111111111111111111111111111',
    'fake address',
]

#endpoint = 'https://api.devnet.solana.com'   # probably for `developing`
#endpoint = 'https://api.testnet.solana.com'  # probably for `testing`
endpoint = 'https://api.mainnet-beta.solana.com'
#endpoint = 'https://solana-api.projectserum.com'

solana_client = Client(endpoint)

for address in all_addresses:
    print('address:', address)

    #result = solana_client.get_confirmed_signature_for_address2(address, limit=1)
    result = solana_client.get_signatures_for_address(address)#, before='89Tv9s2uMGaoxB8ZF1LV9nGa72GQ9RbkeyCDvfPviWesZ6ajZBFeHsTPfgwjGEnH7mpZa7jQBXAqjAfMrPirHt2')

    if 'result' in result:
        print('len:', len(result['result']))

        # I use `[:5]` to display only the first 5 values
        for number, item in enumerate(result['result'][:5], 1):
            print(number, 'signature:', item['signature'])

        # check if there is `4SNQ4h1vL9GkmSnojQsf8SZyFvQsaq62RCgops2UXFYag1Jc4MoWrjTg2ELwMqM1tQbn9qUcNc4tqX19EGHBqC5u`
        for number, item in enumerate(result['result'], 1):
            if item['signature'].startswith('4SN'):
                print('found at', number, '>>>', item['signature'])
    else:
        # error message
        print(result)

    print('---')

    #solana_client.get_account_info(address)
Result:
address: 2AQdpHJ2JpcEgPiATUXjQxA8QmafFegfQwSLWSprPicm
len: 1000
1 signature: 89Tv9s2uMGaoxB8ZF1LV9nGa72GQ9RbkeyCDvfPviWesZ6ajZBFeHsTPfgwjGEnH7mpZa7jQBXAqjAfMrPirHt2
2 signature: 3Ku2rDnAVo5Mj3r9CVSGHJjvn4H9rxzDvc5Cg5uyeCC9oa6p7enAG88pSfRfxcqhBh2JiWSo7ZFEAD3mP8teS1Yg
3 signature: 3wiYCmfXb9n6pT3mgBag7jx6jBjeKZowkYmeakMibw4GtERFyyitrmmoPU6t28HpJJgWkArymWEGWQj8eiojswoD
4 signature: 5vjV1wKU3ZEgyzqXCKrJcJx5jGC8LPqRiJBwhPcu62HQU64mkrvkK8LKYaTzX4x4p26UXSufWM57zKSxRrMgjWn3
5 signature: 3aLk4xZPcWRogtvsFe8geYC177PK8s47mgqUErteRc9NJ4EF2iHi3GPsaj5guTwyiabhwivFhrrEk4YQgiE2hZs8
found at 970 >>> 4SNQ4h1vL9GkmSnojQsf8SZyFvQsaq62RCgops2UXFYag1Jc4MoWrjTg2ELwMqM1tQbn9qUcNc4tqX19EGHBqC5u
---
address: Vote111111111111111111111111111111111111111
len: 1000
1 signature: 67RRbUWGCrwmJ3hhLL7aB2K8gc6MewxwgAdfG7FeXQBaSstacqvuo9QUPZ6nhqXjJwYpKHihNJwFfcaAZHuyFmMc
2 signature: 67PsyRRw8bXgtsB49htxcW2FE9cyyBrocUKacnrxJpqaBpFT6QDLrCkyovWnM8XyGKxTv3kqzmW72SH7gj3N8YJr
3 signature: 675FWqrAjE5Bt6rf3KD2H2PCKUmEtrcD8BRRypdS7m2V22zXhrGn3SktP6JYW4ws6xEqDj52MZMH8RwNjoqgW4mt
4 signature: 671K7N9FwaMAyBC4MEYbYb1ACYAendBbRMqKPvr3h63dt5ybAPHyppjHwxq1yPDjqaRUwCBVU9o5dVqgsdVabint
5 signature: 666jBXXLwmB5tuvufhNn8Q7A3eCzGo6CBFD5BYJkuGfBf1bRoAGz4DeEpUAKsUrRk4NdRBhYkwfrhyZjgFmo3Dp2
---
address: fake address
{'jsonrpc': '2.0', 'error': {'code': -32602, 'message': 'Invalid param: Invalid'}, 'id': 3}
---
BTW:
Because it returns only 1000 values, you may not see 4SNQ..., which is at position ~1200 at this moment, but if you use before=
get_signatures_for_address(address, before='89Tv9s2uMGaoxB8ZF1LV9nGa72GQ9RbkeyCDvfPviWesZ6ajZBFeHsTPfgwjGEnH7mpZa7jQBXAqjAfMrPirHt2')
then it should be at position ~970.
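Since each call returns at most 1000 signatures, you can page through the full history by feeding the last signature of one page into `before=` for the next call. A sketch of that loop, with the network call factored out (the `fetch_page` callable is a hypothetical thin wrapper around `solana_client.get_signatures_for_address`):

```python
def iter_signatures(fetch_page):
    """Yield signature entries page by page using the `before` cursor.

    `fetch_page(before)` is expected to return a list of dicts with a
    'signature' key (i.e. the 'result' list of get_signatures_for_address);
    an empty list ends the iteration.
    """
    before = None
    while True:
        page = fetch_page(before)
        if not page:
            return
        yield from page
        before = page[-1]['signature']  # oldest signature on this page

# usage sketch (network call, not run here):
# fetch_page = lambda before: solana_client.get_signatures_for_address(address, before=before)['result']
# all_sigs = list(iter_signatures(fetch_page))
```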
BTW:
On Solana Explorer there is a big button to switch from Mainnet to Devnet, and when you use Devnet,
2AQdpHJ2JpcEgPiATUXjQxA8QmafFegfQwSLWSprPicm also gives 0 items.
The same on Solana Beach: there is also a big button to switch from Mainnet to Devnet, and on Devnet,
2AQdpHJ2JpcEgPiATUXjQxA8QmafFegfQwSLWSprPicm gives 0 items.
Funny, I was working on this exact same issue this morning. Just like furas pointed out, it's the endpoint; you need to use the mainnet endpoint:
https://api.mainnet-beta.solana.com
And I found it a bit confusing: even though the docs say you need to input a base-58 address, I tried the same as you did and got the same error. It turns out I just needed to copy and paste my address directly.
I keep getting errors when accessing objects that are nested three objects deep.
The following is an example of one of the entities returned when accessing the API:
entity {
  id: "000069R"
  trip_update {
    trip {
      trip_id: "064650_R..S"
      route_id: "R"
      start_time: "10:46:30"
      start_date: "20220902"
      nyct_trip_descriptor {
        train_id: "1R 1046+ CTL/95S"
        direction: SOUTH
      }
    }
    stop_time_update {
      stop_id: "G08S"
      arrival {
        time: 1662129990
      }
      departure {
        time: 1662129990
      }
      nyct_stop_time_update {
        scheduled_track: "D1"
        actual_track: "D1"
      }
    }
  }
}
This is my code:
import gtfs_realtime_pb2
import urllib.request

trainsets = [
    ['a', 'c', 'e'],
    ['b', 'd', 'f', 'm'],
    ['g'],
    ['j', 'z'],
    ['n', 'q', 'r', 'w'],
    ['l'],
    ['1', '2', '3', '4', '5', '6', '7']
]
set_number = 0

# get the train from the user and find the corresponding trainset
char = input("What train are you looking for? ")
char = char.lower()
for sets in trainsets:
    if char in sets:
        print('Found it!')
        set_number = trainsets.index(sets)

# form a string from the values in the specified list
desired_trainset = ''.join(trainsets[set_number])

# communicate with the API
feed = gtfs_realtime_pb2.FeedMessage()
response = urllib.request.Request(f'https://api-endpoint.mta.info/Dataservice/mtagtfsfeeds/nyct%2Fgtfs-{desired_trainset}')
response.add_header("x-api-key", '<API KEY HERE>')
feed.ParseFromString(urllib.request.urlopen(response).read())

# access specific info from the API
for entity in feed.entity:
    if entity.HasField('trip_update'):
        #! ERROR: AttributeError: nyct_trip_descriptor
        print(entity.trip_update.trip.nyct_trip_descriptor.direction)
        #! ERROR: AttributeError: 'google._upb._message.RepeatedCompositeContainer' object has no attribute 'arrival'
        print(entity.trip_update.stop_time_update.arrival.time)
The last two print statements are where the errors occur. I can access start_time, start_date, or stop_id from the entity just fine, but I keep getting an error when trying to access objects nested inside other objects.
Using:
Python: 3.10.6
Windows: 11
I've asked a couple of my friends and they don't see the error.
Thank you for helping.
EDIT:
Someone asked me to print the entity before the error occurs. This is one of the responses I get back from the data when I run the following code:
print(entity.trip_update.trip)
#response I get back
trip_id: "006600_R..N"
route_id: "R"
start_time: "01:06:00"
start_date: "20220903"
nyct_trip_descriptor {
  train_id: "1R 0106 95S/WHL"
}
Someone commented that nyct_trip_descriptor is an optional field, and after rereading the documentation I confirmed that it is optional, but I'm not sure how to proceed or whether that is the reason I'm getting this error. Thank you for helping.
Edit:
I've done the following to return a default value when 'nyct_trip_descriptor' is missing:
nyct_trip_descriptor = getattr(entity.trip_update.trip, 'nyct_trip_descriptor', 'not found')
print(nyct_trip_descriptor)
This didn't work: every entity I got from the API just returned 'not found', even though 'nyct_trip_descriptor' is present when printing the entity itself, as seen in the first code snippet.
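For what it's worth, the second AttributeError is characteristic of treating a repeated protobuf field as a single message: stop_time_update is a repeated field in GTFS-realtime, so the container must be indexed or iterated rather than accessed directly. A minimal sketch of the access pattern (the function name is illustrative, and the test stand-ins below are plain Python objects, not the real generated classes):

```python
def first_arrival_time(trip_update):
    """Return the arrival time of the first stop_time_update, or None.

    stop_time_update is a repeated field, so the container itself has no
    'arrival' attribute; index into it (or iterate over it) instead.
    """
    updates = trip_update.stop_time_update
    if len(updates) == 0:
        return None
    return updates[0].arrival.time
```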
I am working on a fall-detection system. I wrote the Arduino code and connected it to Firebase. Now I have two variables that receive a 1 or 0 status, and I created a mobile application that receives an automatic push notification whenever the system detects a fall, via Firebase + Pusher. I wrote the Python code below in PyCharm; it uses the stream function to read live data from Firebase and send automatic notifications. The code worked for the variable "Fall_Detection_Status", and I received push notifications on every fall detection. Then I tried to modify the code to also read data from another variable, "Fall_Detection_Status1", so that the notification is sent only when both variables are 1. With the code I came up with, the last if statement seems not to work: I receive no notifications, and print(response['publishId']) at the end of the if block shows no result.
So what is wrong?
import pyrebase
from pusher_push_notifications import PushNotifications

config = {
    'apiKey': "***********************************",
    'authDomain': "arfduinopushnotification.firebaseapp.com",
    'databaseURL': "https://arduinopushnotification.firebaseio.com",
    'projectId': "arduinopushnotification",
    'storageBucket': "arduinopushnotification.appspot.com",
    'messagingSenderId': "************"
}

firebase = pyrebase.initialize_app(config)
db = firebase.database()

pn_client = PushNotifications(
    instance_id='*****************************',
    secret_key='**************************',
)

value = 0
value1 = 0

def stream_handler(message):
    global value
    print(message)
    if message['data'] is 1:
        value = message['data']
        return value

def stream_handler1(message):
    global value1
    print(message)
    if message['data'] is 1:
        value1 = message['data']
        return value1

if value == 1 & value1 == 1:
    response = pn_client.publish(
        interests=['hello'],
        publish_body={
            'apns': {
                'aps': {
                    'alert': 'Hello!',
                },
            },
            'fcm': {
                'notification': {
                    'title': 'Notification',
                    'body': 'Fall Detected !!',
                },
            },
        },
    )
    print(response['publishId'])

my_stream = db.child("Fall_Detection_Status").stream(stream_handler)
my_stream1 = db.child("Fall_Detection_Status1").stream(stream_handler1)
You are using the wrong operator, '&', to combine the results of the two tests. In Python, '&' is the bitwise AND operator! You want the logical version, which is 'and'.
Secondly, assuming the stream_handler/stream_handler1 callbacks are set up by your last two statements, those two statements come AFTER the place where you test the values in the if statement. Move those lines above the if block.
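A quick demonstration of why the operator matters here: besides being bitwise, '&' also binds more tightly than '==', so the original condition is parsed as value == (1 & value1) == 1, which can be True even when value1 is not 1:

```python
value, value1 = 1, 3

# bitwise: parsed as value == (1 & value1) == 1  ->  1 == 1 == 1  ->  True
print(value == 1 & value1 == 1)    # True, even though value1 != 1

# logical: each comparison is evaluated separately, then combined
print(value == 1 and value1 == 1)  # False, as intended
```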
Reading the Developer's Guide, I found how to delete a single contact:
def delete_contact(gd_client, contact_url):
    # Retrieving the contact is required in order to get the Etag.
    contact = gd_client.GetContact(contact_url)
    try:
        gd_client.Delete(contact)
    except gdata.client.RequestError as e:
        if e.status == 412:
            # Etags mismatch: handle the exception.
            pass
Is there a way to delete all contacts?
I could not find a way to do so.
Iterating over each contact takes a few minutes for a large batch.
If you are performing a lot of operations, use the batch requests. You can have the server perform multiple operations with a single HTTP request. Batch requests are limited to 100 operations at a time. You can find more information about batch operations in the Google Data APIs Batch Processing documentation.
To delete all contacts, use the contactsrequest.Batch operation. For this operation, create a List&lt;Contact&gt;, set the BatchData for each contact item, and then pass the list to the contactsrequest.Batch operation.
private void DeleteAllContacts()
{
    RequestSettings rs = new RequestSettings(this.ApplicationName, this.userName, this.passWord);
    rs.AutoPaging = true;  // this will result in automatic paging when listing and deleting all contacts
    ContactsRequest cr = new ContactsRequest(rs);

    Feed<Contact> f = cr.GetContacts();
    List<Contact> list = new List<Contact>();
    int i = 0;
    foreach (Contact c in f.Entries)
    {
        c.BatchData = new GDataBatchEntryData();
        c.BatchData.Id = i.ToString();
        c.BatchData.Type = GDataBatchOperationType.delete;
        i++;
        list.Add(c);
    }

    cr.Batch(list, new Uri(f.AtomFeed.Batch), GDataBatchOperationType.insert);

    f = cr.GetContacts();
    Assert.IsTrue(f.TotalResults == 0, "Feed should be empty now");
}
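Since batch requests are capped at 100 operations each, a feed with more contacts has to be split into chunks before batching. The splitting itself is independent of the gdata types; a small pure-Python helper could look like this (the usage comment mirrors the C# above and is only a sketch):

```python
def chunks(items, size=100):
    """Split a list into consecutive chunks of at most `size` items,
    matching the 100-operation limit on Contacts batch requests."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


# usage sketch (hypothetical, one Batch request per chunk):
# for batch in chunks(all_contacts):
#     execute_batch_delete(batch)  # i.e. the equivalent of cr.Batch(...)
print([len(c) for c in chunks(list(range(250)))])  # [100, 100, 50]
```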
I'm trying to add a payment to Xero using the pyxero library for Python 3.
I'm able to add invoices and contacts, but adding a payment always returns a validation exception.
Here is the data I'm submitting:
payments.put([{'Amount': '20.00',
'Date': datetime.date(2016, 5, 25),
'AccountCode': 'abc123',
'Reference': '8831_5213',
'InvoiceID': '09ff0465-d1b0-4fb3-9e2e-3db4e83bb240'}])
And the xero response:
xero.exceptions.XeroBadRequest: ValidationException: A validation exception occurred
Please note: this solution ended up being a hack inside pyxero to get the result I needed. It may not be the best solution for you.
The XML that pyxero generates for "payments.put" does not match the "PUT Payments" XML structure found in the xero documentation.
I first changed the structure of your dictionary so that the XML generated in basemanager.py was similar to the documentation's.
data = {
    'Invoice': {'InvoiceID': "09ff0465-d1b0-4fb3-9e2e-3db4e83bb240"},
    'Account': {"AccountID": "58F8AD72-1F2E-AFA2-416C-8F660DDD661B"},
    'Date': datetime.datetime.now(),
    'Amount': 30.00,
}

xero.payments.put(data)
The error still persisted though, so I was forced to start changing code inside pyxero's basemanager.py.
In basemanager.py on line 133, change the formatting of the date:
val = sub_data.strftime('%Y-%m-%dT%H:%M:%S')
to:
val = sub_data.strftime('%Y-%m-%d')
pyxero originally includes the time as well, but this is supposed to be a date-only value; the docs stipulate the formatting.
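For illustration, here is the difference between the two format strings on a sample datetime:

```python
import datetime

d = datetime.datetime(2016, 5, 25, 14, 30, 0)
print(d.strftime('%Y-%m-%dT%H:%M:%S'))  # 2016-05-25T14:30:00  (what pyxero sent)
print(d.strftime('%Y-%m-%d'))           # 2016-05-25           (date-only, per the docs)
```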
Then, again in basemanager.py, on line 257, change the following:
body = {'xml': self._prepare_data_for_save(data)}
to:
if self.name == "Payments":
    body = {'xml': "<Payments>%s</Payments>" % self._prepare_data_for_save(data)}
else:
    body = {'xml': self._prepare_data_for_save(data)}
Please note that in order for you to be able to create a payment in the first place, the Invoice's "Status" must be set to "AUTHORISED".
Also, make sure the Payment's "Amount" is no greater than Invoice's "AmountDue" value.
I hope someone has stumbled over the same issue and can guide me towards a simple solution to my problem.
I want to retrieve regularly some data regarding my Ads on Facebook. Basically, I just want to store some metadata in one of my databases for further reporting purposes. Thus, I want to get AD-ID, AD-name and corresponding ADSET-ID for all my Ads.
I have written this small function in Python:
def get_ad_stats(ad_account):
    """ Pull basic stats for all ads
    Args: 'ad_account' is the Facebook AdAccount object
    Returns: 'fb_ads', a list with basic values
    """
    fb_ads = []
    fb_fields = [
        Ad.Field.id,
        Ad.Field.name,
        Ad.Field.adset_id,
        Ad.Field.created_time,
    ]
    fb_params = {
        'date_preset': 'last_14_days',
    }
    for ad in ad_account.get_ads(fields=fb_fields, params=fb_params):
        fb_ads.append({
            'id': ad[Ad.Field.id],
            'name': ad[Ad.Field.name],
            'adset_id': ad[Ad.Field.adset_id],
            'created_time': datetime.datetime.strptime(ad[Ad.Field.created_time], "%Y-%m-%dT%H:%M:%S+0000"),
        })
    return fb_ads
Similar functions for Campaign and AdSet data work fine. But for Ads I always hit a user request limit: "(#17) User request limit reached".
I have an API access level of "BASIC", and we're talking about 12,000 ads here.
And, unfortunately, async calls seem to work only for the Insights edge.
Is there a way to avoid the user request limit, e.g. by limiting the API-request to only those Ads which have been changed/newly created after a specific date or so?
OK, sacrificing the 'created_time' field, I realized I could use the Insights edge for this.
Here is a revised version of the same function, now using async calls and a delay between polls:
def get_ad_stats(ad_account):
    """ Pull basic stats for all ads
    Args: 'ad_account' is the Facebook AdAccount object
    Returns: 'fb_ads', a list with basic values
    """
    fb_ads = []
    fb_params = {
        'date_preset': 'last_14_days',
        'level': 'ad',
    }
    fb_fields = [
        'ad_id',
        'ad_name',
        'adset_id',
    ]
    async_job = ad_account.get_insights(fields=fb_fields, params=fb_params, async=True)
    async_job.remote_read()
    while async_job['async_percent_completion'] < 100:
        time.sleep(1)
        async_job.remote_read()
    for ad in async_job.get_result():
        fb_ads.append({
            'id': ad['ad_id'],
            'name': ad['ad_name'],
            'adset_id': ad['adset_id'],
        })
    return fb_ads
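The polling loop above can be wrapped with a timeout so a stuck job doesn't spin forever. A hedged sketch (it assumes, as in the code above, that the job object supports remote_read() and exposes 'async_percent_completion'):

```python
import time

def wait_for_async_job(job, poll_seconds=1, timeout=300):
    """Poll an async job until it reports 100% completion.

    Raises TimeoutError if the job does not finish within `timeout` seconds.
    """
    deadline = time.monotonic() + timeout
    job.remote_read()
    while job['async_percent_completion'] < 100:
        if time.monotonic() > deadline:
            raise TimeoutError('async insights job did not finish in time')
        time.sleep(poll_seconds)
        job.remote_read()
    return job
```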