What I want is for my Python script to log into my PayPal account and check how many active profiles there are in my recurring payments section.
I can't find any API that would help me. The TransactionSearch API isn't quite what I want: I don't want transactions, only active profiles.
The only idea I have is to somehow detect every subscription and cancellation "on the fly" and update a counter in the database. Although it seems... impure :)
What you can do (or would need to do) is record the profile ID of every profile. As a user signs up, you keep track of the profile ID. Then you can use the GetRecurringPaymentsProfileDetails API to figure out whether it's active or not.
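If it helps, here's a rough Python sketch of that check against the classic NVP endpoint. The credentials, API version, and profile IDs are placeholders, and you should verify the exact STATUS values PayPal returns for your account:

```python
# A minimal sketch (not production code): ask PayPal for the status of each
# stored recurring payments profile via the classic NVP API.
from urllib.parse import parse_qs

import requests

NVP_ENDPOINT = "https://api-3t.sandbox.paypal.com/nvp"  # live: api-3t.paypal.com/nvp
API_CREDENTIALS = {
    "USER": "your_api_username",        # placeholder API credentials
    "PWD": "your_api_password",
    "SIGNATURE": "your_api_signature",
    "VERSION": "124.0",
}

def get_profile_status(profile_id):
    """Return the STATUS field for one recurring payments profile."""
    params = dict(API_CREDENTIALS,
                  METHOD="GetRecurringPaymentsProfileDetails",
                  PROFILEID=profile_id)
    response = requests.post(NVP_ENDPOINT, data=params, timeout=30)
    response.raise_for_status()
    fields = parse_qs(response.text)  # NVP responses are URL-encoded key/value pairs
    return fields.get("STATUS", ["Unknown"])[0]

# Profile IDs you recorded at signup time (e.g. from your database).
stored_profile_ids = ["I-ABC123DEF456", "I-XYZ789GHI012"]
active = sum(1 for pid in stored_profile_ids
             if get_profile_status(pid).lower().startswith("active"))
print(f"{active} active profiles out of {len(stored_profile_ids)}")
```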
I was wondering if it's possible to use the Instagram API with Python to gather info on follower account activity and spot trends for my platform. Basically, I want to see which brands, etc. users engage with by using the API to see where the accounts that are part of my network go, what they like, where they leave a comment, what kind of feedback they give, and how they interact across brands. The accounts will consent to this, of course, but is this even possible with the API? I have seen services offer this for a fee, so I assumed it's possible somehow.
I assume that when a user leaves a comment it is stored in some database, and that the API can then be used to see whether it matches some ID or such. If not, maybe there is a way to do this indirectly, i.e. some kind of background service that can check whether a comment/ID matches a username without using the API itself. Basically, I'm interested in whether this is feasible/simple; I'm not too savvy!
This is meant for business and non-business/personal accounts, and also just for the fun of it.
I have peeked at the API docs, but they do not mention this, and no amount of searching narrows it down.
I know Facebook made some changes to their Graph API which basically make this a dead end on their platform, short of some hackaround, if that is even theoretically possible.
I'm trying to analyze (for business intelligence purposes) some Google Analytics data in Python.
All I get after many tutorials is "aggregated" data, like the number of views in a day. What I need instead is something capable of tracking the behavior of a single user: which pages of the website they visited, their bounce rate, whether they used the e-commerce features, and so on.
I have seen many CSVs already prepared for this kind of analysis, but I'm starting from scratch with my website.
You can use the User-ID feature: when you send Analytics an ID and related data from multiple sessions, your reports tell a more unified, holistic story about a user's relationship with your business:
https://support.google.com/analytics/answer/3123662?hl=en
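For illustration only, here is a minimal Python sketch of sending hits with a User ID through the Universal Analytics Measurement Protocol; the tracking ID, client ID, and user ID are placeholders, and with gtag.js you would normally set user_id on the client side instead:

```python
# A hedged sketch: attach your own user ID (uid) to a pageview hit so it can
# later be tied to that user's sessions in Analytics.
import requests

def send_pageview(tracking_id, client_id, user_id, page):
    payload = {
        "v": "1",            # Measurement Protocol version
        "tid": tracking_id,  # property ID, e.g. "UA-XXXXXXX-1"
        "cid": client_id,    # anonymous client ID (usually the _ga cookie value)
        "uid": user_id,      # your own ID for the signed-in user
        "t": "pageview",
        "dp": page,          # document path
    }
    requests.post("https://www.google-analytics.com/collect", data=payload, timeout=10)

send_pageview("UA-XXXXXXX-1", "555.1234567890", "user-42", "/checkout")
```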
Otherwise, you can examine individual-user behavior at the session level in the User Explorer report. The User Explorer report lets you isolate and examine individual rather than aggregate user behavior. Individual user behavior is associated with either the Client ID or the User ID.
https://support.google.com/analytics/answer/6339208?hl=en
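If you later need that per-user drill-down from Python rather than from the UI, the Analytics Reporting API v4 has a User Activity endpoint that appears to be the programmatic counterpart of User Explorer. A hedged sketch, where the view ID, client ID, and credentials file are placeholders and the response fields should be checked against the current docs:

```python
# Pull one user's recorded activity (sessions and pageviews) by Client ID.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder service-account key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.userActivity().search(body={
    "viewId": "123456789",                                       # your GA view ID
    "user": {"type": "CLIENT_ID", "userId": "555.1234567890"},   # or type "USER_ID"
    "dateRange": {"startDate": "7daysAgo", "endDate": "today"},
}).execute()

for session in response.get("sessions", []):
    for activity in session.get("activities", []):
        print(session.get("sessionId"),
              activity.get("activityType"),
              activity.get("pageview", {}).get("pagePath"))
```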
Is there any way to have a login in my Python program that connects to my Squarespace site? For example, I have a game that costs 4.99, and I want to make sure that no one can “copy-paste” my program, so I add a login. This login would connect to Squarespace and check whether the account used to log in bought the game.
I'm going to assume that you have a Squarespace website through which you are selling the game. That being the case, customer information could be accessed from a different service/server via a couple methods:
Squarespace Commerce API: Use of the Squarespace Commerce API currently requires an "Advanced Online Store" plan from Squarespace, costing $40-$46/month at time of writing. If that is an option, you can obtain order information from your Squarespace site and use it to verify customer purchases.
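To make that concrete, here is a rough sketch of checking a buyer's email against your orders via the Commerce API. The API key, User-Agent, and email are placeholders, and the response field names (result, pagination, customerEmail) are from my reading of the docs, so verify them against the current reference:

```python
# Hedged sketch: page through Squarespace orders and look for one placed with
# the given email address.
import requests

SQUARESPACE_API_KEY = "your-api-key"  # generated in your site's settings

def customer_has_order(email):
    url = "https://api.squarespace.com/1.0/commerce/orders"
    headers = {
        "Authorization": f"Bearer {SQUARESPACE_API_KEY}",
        "User-Agent": "game-license-check",  # Squarespace expects a User-Agent
    }
    while url:
        data = requests.get(url, headers=headers, timeout=30).json()
        for order in data.get("result", []):
            if order.get("customerEmail", "").lower() == email.lower():
                return True
        pagination = data.get("pagination", {})
        url = pagination.get("nextPageUrl") if pagination.get("hasNextPage") else None
    return False

print(customer_has_order("player@example.com"))
```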
Stripe API: Because customer payments through your Squarespace site will be processed via Stripe, it should be possible to use the Stripe API to access order/payment information as well. However, it may be more difficult depending on how Squarespace is submitting the information to Stripe.
It appears possible to filter customers by email address, though I've not explored this option myself. If this method works, it would avoid having to pay for the Commerce Advanced plan through Squarespace, which is a significant cost for access to such a basic set of API features.
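A minimal sketch of the Stripe route, assuming you have your secret key and that each purchase shows up as a paid charge on the customer; the key and email below are placeholders:

```python
# Look up Stripe customers by email and check for at least one successful,
# non-refunded charge.
import stripe

stripe.api_key = "sk_test_your_secret_key"  # placeholder secret key

def purchased_game(email):
    customers = stripe.Customer.list(email=email, limit=10)
    for customer in customers.auto_paging_iter():
        charges = stripe.Charge.list(customer=customer.id, limit=100)
        if any(ch.paid and not ch.refunded for ch in charges.auto_paging_iter()):
            return True
    return False

print(purchased_game("player@example.com"))
```

Note that this only confirms the customer paid for something; if you sell more than one product, you would also need to match the charge against the specific game.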
I have an issue with getting a live list of users with two-step verification enabled from Google Admin using the Admin SDK.
The Reports API can be used to gather reports on who is currently enrolled, but these are not live; they are three days old. The Directory API (users & groups) does not have this information.
What I am playing with now is scraping the Google Admin user page for enrollment status. My question is: what is the source of the ID on the Google Admin page? It is not returned by the Directory API. Is it an encoded version of the user ID?
This is the ID I refer to
I have found that you can get an up-to-date list of users with 2FA enabled by downloading a list of users from the Google Admin "Users" page.
Click the three-dot symbol at the top right of the page and then select "Download users". Select the scope of the download, and you will have an up-to-date document showing who is enrolled.
The reports feature of Google Admin is three days behind, and that includes the Reports API. The Users API has no way to pull 2FA enrollment, but the "Download users" button lets you spot any stragglers.
The reason I asked the question in the first place is that when you force two-step verification via the policy in the Security section, users who do not already have it enabled are locked out: they are prompted for a two-step verification code which they are unable to acquire. You can create a 2FA exception group which allows them to log in and set it up, but then, just for housekeeping purposes, you must remove users from the group once they have it enabled.
I cannot see a way to do this programmatically (in an automated fashion) using the Admin SDK without using the Reports API, which is three days behind and hence outdated. I'd love to hear from anyone who has made it happen, so feel free to chime in!
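For reference, the Reports API route I mean looks roughly like the sketch below (with its roughly three-day lag); the credentials file, admin address, and date are placeholders:

```python
# Query the Admin SDK Reports API for 2-Step Verification enrollment.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder service-account key file
    scopes=["https://www.googleapis.com/auth/admin.reports.usage.readonly"],
).with_subject("admin@yourdomain.com")  # domain-wide delegation to an admin user

reports = build("admin", "reports_v1", credentials=creds)

result = reports.userUsageReport().get(
    userKey="all",
    date="2023-01-01",                      # most recent date with data (lags ~3 days)
    parameters="accounts:is_2sv_enrolled",
).execute()

enrolled = [
    entry["entity"]["userEmail"]
    for entry in result.get("usageReports", [])
    for param in entry.get("parameters", [])
    if param.get("name") == "accounts:is_2sv_enrolled" and param.get("boolValue")
]
print(f"{len(enrolled)} users enrolled in 2SV as of the report date")
```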
I'm doing some research where I have a Facebook app that asks for some permissions on the user's Facebook account to get some basic information.
I can see that my app has about 600 users, and I'd like to query them to see some patterns, e.g. how many friends they have, how long the messages they post in updates are, etc.
My question is: do I have to copy the data when the user first visits my app and grants access to their information, or can I query it as long as the user hasn't "removed" my app?
I hope the second option is true, since I have a lot of reservations about "copying" user data and storing it in a database: primarily ethical, but also related to security, compliance, resources, and so on.
The programming language is not important, but if anyone needs an example, let's say it's Python.
NO. You don't have to copy the data.
You can query Facebook as long as you have a valid access token regardless of whether the user is online or offline.
However, the one thing you need to take care of is handling expired access tokens, because in that case the user will need to re-authorize your application for you to get a new access_token.
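To make that concrete, here is a minimal sketch of querying the Graph API with stored user tokens and spotting expired ones; the Graph API version, fields, and tokens are placeholders, and the fields you can actually request depend on the permissions each user granted:

```python
# Fetch basic profile data for each stored user token; treat error code 190
# (invalid/expired token) as "this user needs to re-authorize the app".
import requests

GRAPH = "https://graph.facebook.com/v2.8"  # version is illustrative

def fetch_user(access_token):
    resp = requests.get(
        f"{GRAPH}/me",
        params={"fields": "id,name", "access_token": access_token},
        timeout=30,
    )
    data = resp.json()
    if "error" in data:
        if data["error"].get("code") == 190:   # OAuthException: token expired/invalid
            return None
        raise RuntimeError(data["error"].get("message"))
    return data

stored_tokens = ["EAAB...user1", "EAAB...user2"]  # tokens saved when users authorized
profiles = [p for p in (fetch_user(t) for t in stored_tokens) if p]
print(f"Fetched {len(profiles)} of {len(stored_tokens)} users; the rest must re-authorize")
```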