I'm using web3.py and I want to get the transaction history of a specific contract. Here's my code sample:
eventSignatureHash = web3.keccak(text='Transfer(address,uint256)').hex()
filter = web3.eth.filter({
    'address': '0x828402Ee788375340A3e36e2Af46CBA11ec2C25e',
    'topics': [eventSignatureHash]
})
I expected to get the ERC20 token transactions related to this contract, as found here, but it does not display anything. How do I go about this?
Finally, is there a way to watch these transactions in real time?
What I did is create a contract instance (the ABI is needed so the Transfer event can be decoded):
contract = web3.eth.contract(address=contract_address, abi=contract_abi)
Then create a filter on its Transfer event, which takes optional parameters such as fromBlock, toBlock, and argument_filters (e.g. {"to": users_address} filters for transfers only to that address):
transfer_filter = contract.events.Transfer.createFilter(fromBlock=..., toBlock=..., argument_filters={"to": users_address})
So you can play around with it.
https://web3py.readthedocs.io/en/latest/contracts.html#web3.contract.ContractEvents
found in the event log object section.
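Putting that together, a minimal runnable sketch (the node URL and the contract_abi/users_address variables are placeholders you'd supply; note that createFilter is the web3.py v5 spelling, renamed create_filter in newer releases):
from web3 import Web3

web3 = Web3(Web3.HTTPProvider("https://your-node-url"))  # placeholder endpoint

# The contract ABI is required so web3.py can decode Transfer events
contract = web3.eth.contract(address=contract_address, abi=contract_abi)

transfer_filter = contract.events.Transfer.createFilter(
    fromBlock=0,
    toBlock="latest",
    argument_filters={"to": users_address},  # only transfers to this address
)

for event in transfer_filter.get_all_entries():
    print(event["args"])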
As the other answer notes, contract.events provides lots of useful methods. If nothing is returned, specifying fromBlock and toBlock might help.
Besides that, a complete solution is already provided here -> Advanced example: Fetching all token transfer events.
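One thing worth checking with the raw filter from the question: the canonical ERC20 Transfer event has three arguments, so the two-argument signature in the question hashes to a topic that no ERC20 contract ever emits. A corrected sketch (also setting fromBlock, since a filter otherwise only watches new blocks):
# ERC20 Transfer is Transfer(address,address,uint256)
event_signature_hash = web3.keccak(text='Transfer(address,address,uint256)').hex()

transfer_filter = web3.eth.filter({
    'fromBlock': 0,  # search history, not just newly mined blocks
    'address': '0x828402Ee788375340A3e36e2Af46CBA11ec2C25e',
    'topics': [event_signature_hash]
})
print(transfer_filter.get_all_entries())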
Finally, is there a way to watch these transactions in real time?
Actually, lots of nodes support subscription RPCs. However, this feature is not yet supported by web3.py (#1402). You can try an SDK in another language, or temporarily adopt the method described here.
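Until subscriptions land, the usual workaround in web3.py is to poll a filter for new entries; a rough sketch, reusing the transfer_filter from above:
import time

while True:
    for event in transfer_filter.get_new_entries():
        print("new transfer:", event)
    time.sleep(2)  # polling interval; tune to your node's rate limits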
From the official docs I found this:
import bybit
client = bybit.bybit(test=True, api_key="api_key", api_secret="api_secret")
print(client.LinearOrder.LinearOrder_new(
    side="Sell",
    symbol="BTCUSDT",
    order_type="Limit",
    qty=0.22,
    price=10000,
    time_in_force="GoodTillCancel",
    reduce_only=False,
    close_on_trigger=False
).result())
Additional parameters take_profit & stop_loss may also be sent. The TP&SL then get placed along with the order.
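Presumably that looks something like this (a sketch of the order above with the extra parameters; the levels are placeholders for a short entered at 10000):
print(client.LinearOrder.LinearOrder_new(
    side="Sell",
    symbol="BTCUSDT",
    order_type="Limit",
    qty=0.22,
    price=10000,
    take_profit=9000,   # placeholder: below entry for a short
    stop_loss=11000,    # placeholder: above entry for a short
    time_in_force="GoodTillCancel",
    reduce_only=False,
    close_on_trigger=False
).result())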
I'm wondering if there is a way to place TP & SL orders after an order has been placed. There are no examples in the official docs, and I don't understand the instructions written there for those orders either.
Thank you in advance
Yes, it is possible to attach these orders after entering a position. In the docs they reference Set Stop, and this is also included in the test.py file that ships with the Bybit Python install.
Here is the link to the docs:
Bybit Set Stop
Here is what a stop and TP would look like for a LONG position. Please note that for a long, we pass our current position side (Buy) as the side argument.
# Stop Loss
print(client.LinearPositions.LinearPositions_tradingStop(
    symbol="BTCUSDT",
    side="Buy",
    stop_loss=41000).result())

# Take profit
print(client.LinearPositions.LinearPositions_tradingStop(
    symbol="BTCUSDT",
    side="Buy",
    take_profit=49000).result())
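As far as I can tell from the Set Stop docs, both levels can also be combined into a single call to the same endpoint (treat this as an assumption and verify against the docs):
# Stop loss and take profit in one request (assumed to be supported)
print(client.LinearPositions.LinearPositions_tradingStop(
    symbol="BTCUSDT",
    side="Buy",
    stop_loss=41000,
    take_profit=49000).result())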
An additional note: TP orders are conditional orders, meaning they are sent to the order book only once triggered, which results in a market order. If you already know your target level, a limit order may be more suitable; it will sit in your active orders, and you will have to cancel it if it goes unused. We use the Sell side for this one:
# Limit order
print(client.LinearOrder.LinearOrder_new(
    side="Sell",
    symbol="BTCUSDT",
    order_type="Limit",
    qty=0.001,
    price=49000,
    time_in_force="GoodTillCancel",
    reduce_only=True,
    close_on_trigger=False).result())
Cheers my friend and good luck with your coding and trading!
Another way is to listen to websocket data. What I do is subscribe to the "execution" topic. This way, every time your order gets executed you receive an event with all the info about the trade. Then you can have a callback function that places a trade for you.
Here's the link to the api: https://bybit-exchange.github.io/docs/inverse/#t-websocketexecution
Here's how to subscribe (the linked docs also show the sample response format):
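A rough sketch using the websocket-client package; the auth handshake for the private "execution" topic follows the pattern in the linked docs (the API keys here are placeholders), so verify the details there:
import hashlib
import hmac
import json
import time

import websocket  # pip install websocket-client

api_key = "api_key"        # placeholder
api_secret = "api_secret"  # placeholder

def on_message(ws, message):
    data = json.loads(message)
    if data.get("topic") == "execution":
        # Each execution carries symbol, side, price, qty, and so on;
        # a callback that places the follow-up trade would go here.
        print(data["data"])

def on_open(ws):
    # Private topics require authenticating before subscribing
    expires = int((time.time() + 10) * 1000)
    signature = hmac.new(
        api_secret.encode(),
        f"GET/realtime{expires}".encode(),
        hashlib.sha256,
    ).hexdigest()
    ws.send(json.dumps({"op": "auth", "args": [api_key, expires, signature]}))
    ws.send(json.dumps({"op": "subscribe", "args": ["execution"]}))

ws = websocket.WebSocketApp(
    "wss://stream.bybit.com/realtime",
    on_open=on_open,
    on_message=on_message,
)
ws.run_forever()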
I was wondering if it's possible to use the Instagram API with Python to gather info on my followers' account activity and look for trends for my platform. Basically, I want to see which brands users engage with, by using the API to see where the accounts in my network go, what they like, where they leave comments, and what kind of feedback they give across brands. The accounts will consent to this, of course, but is this even possible with the API? I have seen services offer this for a fee, so I assumed it's possible somehow.
I assume that when a user leaves a comment, it is stored in some database that the API can then be used to match against some ID or such. If not, maybe there is a way to do this indirectly, i.e., some kind of background service that can match a comment/ID to a username without using the API itself. Basically, I'm interested in whether this is feasible/simple; I'm not too savvy!
This is meant for both business and non-business/personal accounts, and also just for the fun of it.
I have peeked at the API, but it does not mention this, and no amount of searching narrows it down.
I know Facebook made changes to their Graph API which basically make this a dead end on their platform, without some possible hackaround, if that is even theoretically possible.
I am working on an application that sends logs to GCP StackDriver. I want to put custom "tags" (or summary fields) natively on my log entry. I am looking for a solution that doesn't rely on defining custom summary fields in the console, as those are not permanent, and not project-wide.
I realized that some loggers have tags displayed. For example, GCF logs will show their execution_id. Using the following snippet, I can verify that the tags displayed depend on the name of the logger:
from google.cloud import logging
client = logging.Client()
client.logger(name="custom").log_text("foobar", labels={"execution_id": "foo"})
client.logger(name="cloudfunctions.googleapis.com%2Fcloud-functions").log_text("foobar", labels={"execution_id": "foo"})
If you filter your logs on "foobar", you will see that only the second entry has "foo" as a tag.
That tag matches the execution_id label specified in the code. The problem is that I cannot add custom labels: if I add another label that is not execution_id, it is not displayed as a tag (though it is still found in the log body).
It looks like each monitored resource has its own set of tags, e.g. BigQuery resources use protoPayload.authenticationInfo.principalEmail as a tag. But I cannot find a way to specify my own resources.
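For what it's worth, log_text does accept an explicit resource argument, so an entry can at least be written as if it came from an existing resource type; a sketch (the Resource import path may vary by library version, and this mimics a resource type rather than defining a new one):
from google.cloud import logging
from google.cloud.logging import Resource  # older releases: google.cloud.logging.resource

client = logging.Client()
# Write the entry as if it came from a Cloud Function, so the console
# applies that resource type's summary tag
resource = Resource(
    type="cloud_function",
    labels={"function_name": "foo", "region": "us-central1"},
)
client.logger(name="cloudfunctions.googleapis.com%2Fcloud-functions").log_text(
    "foobar", resource=resource, labels={"execution_id": "foo"}
)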
Does anybody have experience with this kind of issue?
Thanks in advance
The closest solution I found: in an expanded log entry, click on a field within the JSON representation, and in the resulting panel select Add field to summary line.
To get more information about this topic, please refer to this link.
Additionally, I found a feature request opened with the product team where the user, in that case, wants to filter in Stackdriver by Dataflow jobs' custom labels; the reference might be useful for your use case. No ETA was shared, nor any guarantee of its implementation.
I've filed a Feature Request on your behalf with the product team; they'll evaluate the possibility of implementing the functionality that fits your use case. You can follow up on this PIT [1], where you will be able to receive further updates from the team as well.
Keep in mind that there is no ETA, nor a guarantee that this will be implemented. However, please feel free to ask for updates directly on the PIT. I would appreciate it if you accepted my answer, if it was helpful to you.
[1]https://issuetracker.google.com/172667238
I am trying to use an API, which I have used previously for various jobs, to query and fetch relevant data. But lately I am unable to do that, because of an unusual exception being returned which I honestly have no idea about.
The CODE:
import SIEMAuth  # local helper module holding the ESM base URL and session headers
import requests

alert_id = '144116287822364672|12101929'
query_params = {"id": {"value": alert_id}, "format": {"format": 0}}
print(requests.post(SIEMAuth.url + 'ipsGetAlertPacket', json=query_params, headers=SIEMAuth.session_headers, verify=False).text)
The following exception/traceback response is returned on querying this:
Can not construct instance of com.mcafee.siem.api.data.alert.EsmPacketFormat: no suitable constructor found, can not deserialize from Object value (missing default constructor or creator, or perhaps need to add/enable type information?)
at [Source: java.io.StringReader@1a15fbf; line: 1, column: 2]
Process finished with exit code 0
When I tried searching the internet to learn more about the exception, most of the results related to the Jackson JSON parser in the Java programming environment, which is not something I am working with or am familiar with.
If anybody could help, I'd be extremely grateful.
Unfortunately it's as I suggested; basically one way or another it's broken. The response from their support is as follows.
I have reached out to my development team with this question. I got the below response.
That particular get is not meant to be used in the external API. It should only be used from the interface, and it has been removed since the version of the ESM you are on. If you want to use it externally, you need to submit it as a PER.
I hope this clears up your questions.
Edit: This has actually been expanded on in a thread on their support forums. You need a login to see the original thread.
Name notwithstanding, this API does not return the actual data packet associated with an event. In fact, when aggregation is enabled, not all of the packets associated with a given event are available on the ESM. Raw packet data can be retrieved from the ELM through the UI, but unfortunately there currently is not a way to do that programmatically.
Assembla provides a simple way to fetch all commits of an organisation using api.assembla.com/v1/activity.json, and it takes to and from parameters, allowing you to get commits for a selected date range from all the spaces (repos) the user is participating in.
Is there any similar way to do this in GitHub?
I found these for Github:
/repos/:owner/:repo/commits
Accepts since and until parameters for getting commits in a selected date range. But since I want commits from all repos, I have to loop over all those repos and fetch commits for each one (see the sketch after this list).
/users/:user/events
This shows the commits of a user. I don't have any problem looping over all the users in the org, but how can I fetch them for a particular date?
/orgs/:org/events
This shows commits of all users across all repos, but I don't know how to fetch them for a particular date either.
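To illustrate the first option, a rough sketch of that repo loop (your-org is a placeholder; only the first page of each response is handled, so follow the Link headers for completeness):
import requests

ORG = "your-org"  # placeholder
params = {"since": "2013-01-01T00:00:00Z", "until": "2013-01-02T00:00:00Z"}

repos = requests.get(f"https://api.github.com/orgs/{ORG}/repos").json()
for repo in repos:
    commits = requests.get(
        f"https://api.github.com/repos/{ORG}/{repo['name']}/commits",
        params=params,
    ).json()
    if not isinstance(commits, list):
        continue  # e.g. empty repos return an error object instead
    for commit in commits:
        print(repo["name"], commit["sha"], commit["commit"]["author"]["date"])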
The problem with using the /users/:user/events endpoint is that you don't get just PushEvents, so you would have to skip over non-commit events and make more calls to the API. Assuming you're authenticated, you should be safe (rate-limit-wise) so long as your users aren't hyperactive.
For /orgs/:org/events I don't think they accept parameters for anything, but I can check with the API designers.
And just in case you aren't familiar: these are all paginated results, so you can go back to the beginning with the Link headers. My library (github3.py) provides iterators to do this for you automatically. You can also tell it how many events you'd like (same with commits, etc.). But yeah, I'll come back and edit after talking to the API guys at GitHub.
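For reference, the Link-header walk only takes a few lines with plain requests, if you'd rather not take a dependency (org name and token are placeholders):
import requests

url = "https://api.github.com/orgs/your-org/events"  # placeholder org
session = requests.Session()
session.headers["Authorization"] = "token YOUR_TOKEN"  # placeholder

while url:
    response = session.get(url)
    for event in response.json():
        if event["type"] == "PushEvent":
            for commit in event["payload"]["commits"]:
                print(commit["sha"], commit["message"])
    # requests parses the Link header into response.links for us
    url = response.links.get("next", {}).get("url")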
Edit: Conversation
You might want to check out the GitHub Archive project -- http://www.githubarchive.org/, and the ability to query the archive using Google's BigQuery. Sounds like it would be a perfect tool for the job -- I'm pretty sure you could get exactly what you want with a single query.
The other option is to call the GitHub API: iterate over all events for the organization and filter out the ones that don't satisfy your date-range and event-type (commit) criteria. But since you can't specify date ranges in the API call, you will probably make a lot of calls to get the events that interest you. Note that you don't have to iterate over every page starting from 0 to find the page that contains the first result in the date range: just do a (variation of) binary search over page numbers to find any page that contains a commit in the date range, and then iterate in both directions until you fall outside the date range. That should reduce the number of API calls you make.
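For concreteness, a sketch of that binary-search idea against the org events endpoint (helper names are hypothetical, and pages that straddle a range boundary are glossed over):
import requests
from datetime import datetime, timezone

def org_events_page(org, page):
    # One page of org events, newest first
    url = f"https://api.github.com/orgs/{org}/events"
    return requests.get(url, params={"page": page}).json()

def newest_timestamp(events):
    return datetime.strptime(
        events[0]["created_at"], "%Y-%m-%dT%H:%M:%SZ"
    ).replace(tzinfo=timezone.utc)

def find_page_in_range(org, start, end, max_page=100):
    # Binary search for any page whose newest event falls inside
    # [start, end]; from there, walk in both directions until the
    # events leave the range.
    lo, hi = 1, max_page
    while lo <= hi:
        mid = (lo + hi) // 2
        events = org_events_page(org, mid)
        if not events:
            hi = mid - 1  # past the last page
            continue
        newest = newest_timestamp(events)
        if newest > end:      # page too recent: look at older (higher) pages
            lo = mid + 1
        elif newest < start:  # page too old: look at newer (lower) pages
            hi = mid - 1
        else:
            return mid
    return None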