At the moment I am trying to retrieve the QuickStats of an ESXi host itself.
This is the way I connect:
context = ssl.SSLContext(ssl.PROTOCOL_TLSv1)
context.verify_mode = ssl.CERT_NONE
si = SmartConnect(host=args.host,
                  user=args.user,
                  pwd=password,
                  port=int(args.port),
                  sslContext=context)
How do I retrieve the following stats with this library?
https://github.com/vmware/pyvmomi/blob/master/docs/vim/host/Summary/QuickStats.rst
First you need to locate your host, then access the properties like so:
content = si.RetrieveContent()
host = content.searchIndex.FindByDnsName(dnsName="DC0_C0_H0", vmSearch=False)
print(host.summary.quickStats.uptime)
The method I'm using to locate the host doesn't have to be used; it's just one of many ways of finding a host. This example assumes you only need to do this for one or two hosts. If you have lots of hosts and don't want the call to be slow, you would really want to build a property collector to fetch that data, or else the call will take ages. If you look at the community samples, there are one or two examples that use views and property collectors, which will make your code fast and scalable.
I am using geopy to get lat/long coordinates for a list of addresses. All the documentation points to limiting server queries by caching (many questions here, in fact), but few actually give practical solutions.
What is the best way to accomplish this?
This is for a self-contained data processing job I'm working on ... no app platform involved. Just trying to cut down on server queries as I run through data that I will have seen before (very likely, in my case).
My code looks like this:
from geopy import geocoders

def geocode(address):
    # address ~= "175 5th Avenue NYC"
    g = geocoders.GoogleV3()

    cache = addressCached(address)
    if cache != False:
        # We have seen this exact address before,
        # return the saved location
        return cache

    # Otherwise, get a new location from geocoder
    location = g.geocode(address)
    saveToCache(address, location)
    return location

def addressCached(address):
    # What does this look like?

def saveToCache(address, location):
    # What does this look like?
How exactly you want to implement your cache is really dependent on what platform your Python code will be running on.
You want a pretty persistent "cache" since addresses' locations are not going to change often:-), so a database (in a key-value mode) seems best.
So in many cases I'd pick sqlite3, an excellent, very lightweight SQL engine that's part of the Python standard library. Unless perhaps I preferred, e.g., a MySQL instance that I need to have running anyway; one advantage might be that this would allow multiple applications running on different nodes to share the "cache" -- other DBs, both SQL and non-SQL, would be good for the latter, depending on your constraints and preferences.
But if I was e.g. running on Google App Engine, then I'd be using the datastore it includes instead. Unless I had specific reasons to want to share the "cache" among multiple disparate applications, in which case I might consider alternatives such as Google Cloud SQL and Google Storage, as well as yet another alternative consisting of a dedicated "cache server" GAE app of my own serving RESTful results (maybe with endpoints?). The choice is, again!, very, very dependent on your constraints and preferences (latency, queries-per-second sizing, etc, etc).
So please clarify what platform you are in, and what other constraints and preferences you have for your databasey "cache", and then the very simple code to implement that can easily be shown. But showing half a dozen different possibilities before you clarify would not be very productive.
Added: since the comments suggest sqlite3 may be acceptable, and there are a few important details best shown in code (such as, how to serialize and deserialize an instance of geopy.location.Location into/from a sqlite3 blob -- similar issues may well arise with other underlying databases, and the solutions are similar), I decided a solution example may be best shown in code. So, as the "geo cache" is clearly best implemented as its own module, I wrote the following simple geocache.py...:
import geopy
import pickle
import sqlite3

class Cache(object):
    def __init__(self, fn='cache.db'):
        self.conn = conn = sqlite3.connect(fn)
        cur = conn.cursor()
        cur.execute('CREATE TABLE IF NOT EXISTS '
                    'Geo ( '
                    'address STRING PRIMARY KEY, '
                    'location BLOB '
                    ')')
        conn.commit()

    def address_cached(self, address):
        cur = self.conn.cursor()
        cur.execute('SELECT location FROM Geo WHERE address=?', (address,))
        res = cur.fetchone()
        if res is None:
            return False
        return pickle.loads(res[0])

    def save_to_cache(self, address, location):
        cur = self.conn.cursor()
        cur.execute('INSERT INTO Geo(address, location) VALUES(?, ?)',
                    (address, sqlite3.Binary(pickle.dumps(location, -1))))
        self.conn.commit()

if __name__ == '__main__':
    # run a small test in this case
    import pprint

    cache = Cache('test.db')
    address = '1 Murphy St, Sunnyvale, CA'
    location = cache.address_cached(address)
    if location:
        print('was cached: {}\n{}'.format(location, pprint.pformat(location.raw)))
    else:
        print('was not cached, looking up and caching now')
        g = geopy.geocoders.GoogleV3()
        location = g.geocode(address)
        print('found as: {}\n{}'.format(location, pprint.pformat(location.raw)))
        cache.save_to_cache(address, location)
        print('... and now cached.')
I hope the ideas illustrated here are clear enough -- there are alternatives on each design choice, but I've tried to keep things simple (in particular, I'm using a simple example-cum-mini-test when this module is run directly, in lieu of a proper suite of unit-tests...).
For the bit about serializing to/from blobs, I've chosen pickle with the "highest protocol" (-1) -- cPickle of course would be just as good in Python 2 (and faster:-) but these days I try to write code that's equally good as Python 2 or 3, unless I have specific reasons to do otherwise:-). And of course I'm using a different filename, test.db, for the sqlite database used in the test, so you can wipe it out with no qualms to test some variation, while the default filename meant to be used in "production" code stays intact (it is quite a dubious design choice to use a filename that's relative -- meaning "in the current directory" -- but the appropriate way to decide where to place such a file is quite platform-dependent, and I didn't want to get into such esoterica here:-).
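The blob round-trip itself is just pickle.dumps into the BLOB column and pickle.loads on the way out; here is a minimal stdlib-only sketch of that mechanism, using a plain dict as a stand-in for a geopy Location (so it runs without geopy installed):

```python
import pickle
import sqlite3

# Store any picklable object as a BLOB and get it back intact.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE Geo (address TEXT PRIMARY KEY, location BLOB)')

fake_location = {'lat': 40.741, 'lng': -73.989}  # stand-in for a geopy Location
blob = sqlite3.Binary(pickle.dumps(fake_location, -1))  # -1 = highest protocol
conn.execute('INSERT INTO Geo VALUES (?, ?)', ('175 5th Avenue NYC', blob))

row = conn.execute('SELECT location FROM Geo WHERE address=?',
                   ('175 5th Avenue NYC',)).fetchone()
restored = pickle.loads(row[0])
print(restored == fake_location)  # True
```

The same dumps/loads pair works unchanged for a real Location instance, since geopy locations are picklable.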
If any other question is left, please ask (perhaps best on a separate new question since this answer has already grown so big!-).
How about creating a list or dict in which all geocoded addresses are stored? Then you could simply check:
if address in cached:
    # skip
This cache will live from the moment the module is loaded and won't be saved after you finish using this module. You'll probably want to save it into a file with pickle or into a database and load it next time you load the module.
from geopy import geocoders

cache = {}

def geocode(address):
    # address ~= "175 5th Avenue NYC"
    g = geocoders.GoogleV3()

    location = addressCached(address)
    if location is not None:
        # We have seen this exact address before,
        # return the saved location
        return location

    # Otherwise, get a new location from the geocoder
    location = g.geocode(address)
    saveToCache(address, location)
    return location

def addressCached(address):
    global cache
    if address in cache:
        return cache[address]
    return None

def saveToCache(address, location):
    global cache
    cache[address] = location
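Persisting the dict between runs, as suggested above, can be done with pickle; a minimal sketch (the geocache.pkl filename and the load_cache/save_cache names are mine):

```python
import os
import pickle

CACHE_FILE = 'geocache.pkl'  # hypothetical filename

def load_cache(path=CACHE_FILE):
    # Return the previously saved dict, or an empty one on first run.
    if os.path.exists(path):
        with open(path, 'rb') as f:
            return pickle.load(f)
    return {}

def save_cache(cache, path=CACHE_FILE):
    # Overwrite the file with the current cache contents.
    with open(path, 'wb') as f:
        pickle.dump(cache, f, -1)
```

You would call load_cache() once when the module loads and save_cache() before exit (or after each new lookup, if crashes are a concern).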
Here is a simple implementation that uses the Python shelve module for transparent and persistent caching:
import geopy
import shelve
import time

class CachedGeocoder:
    def __init__(self, source="Nominatim", geocache="geocache.db"):
        self.geocoder = getattr(geopy.geocoders, source)()
        self.db = shelve.open(geocache, writeback=True)
        self.ts = time.time() + 1.1

    def geocode(self, address):
        if address not in self.db:
            time.sleep(max(1 - (time.time() - self.ts), 0))
            self.ts = time.time()
            self.db[address] = self.geocoder.geocode(address)
        return self.db[address]

geocoder = CachedGeocoder()
print(geocoder.geocode("San Francisco, USA"))
It stores a timestamp to ensure that requests are not issued more frequently than once per second (which is a requirement for Nominatim). One weakness is that it doesn't deal with timed-out responses from Nominatim.
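The sleep arithmetic in geocode() can be factored out into a small pure helper, which also makes it easy to test; a sketch (the throttle name and the now/sleeper parameters are mine, added purely for injectability):

```python
import time

def throttle(last_ts, min_interval=1.0, now=None, sleeper=time.sleep):
    """Sleep just long enough so that consecutive calls are at least
    min_interval seconds apart; return the timestamp to remember."""
    if now is None:
        now = time.time()
    wait = max(min_interval - (now - last_ts), 0)
    if wait > 0:
        sleeper(wait)
    return now + wait
```

Inside the class above this would read `self.ts = throttle(self.ts)` before each Nominatim request.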
The easiest way to cache geocoding requests is probably to use requests-cache:
import geocoder
import requests_cache
requests_cache.install_cache('geocoder_cache')
g = geocoder.osm('Mountain View, CA') # <-- This request will go to OpenStreetMap server.
print(g.latlng)
# [37.3893889, -122.0832101]
g = geocoder.osm('Mountain View, CA') # <-- This request should be cached, and return immediately
print(g.latlng)
# [37.3893889, -122.0832101]
requests_cache.uninstall_cache()
Just for debugging purposes, you could check if requests are indeed cached:
import geocoder
import requests_cache
def debug(response):
print(type(response))
return True
requests_cache.install_cache('geocoder_cache2', filter_fn=debug)
g = geocoder.osm('Mountain View, CA')
# <class 'requests.models.Response'>
# <class 'requests.models.Response'>
g = geocoder.osm('Mountain View, CA')
# <class 'requests_cache.models.response.CachedResponse'>
requests_cache.uninstall_cache()
I'm trying to write a program to control VMs on a Hyper-V server using Python. I start by connecting to the server the Hyper-V server runs on:
connection = wmi.connect_server(server="servername", namespace=r"root\virtualization", user=r"username", password=r"password")
wmiServerConnection = wmi.WMI(wmi=connection)
This gives me a wmi object for this connection.
For stopping and starting a VM I can simply use:
#get the wmi object representing the VM
vmSystem = wmiServerConnection.Msvm_ComputerSystem(ElementName="VmName")
#send change request to vm
vmSystem[0].RequestStateChange(3)
But before starting a VM I want to apply a certain snapshot.
The class Msvm_VirtualSystemManagementService provides a method - ApplyVirtualSystemSnapshot/ApplyVirtualSystemSnapshotEx - for this. It needs the SnapshotSettingData as a parameter and I thought I could get that one using the GetSummaryInformation method of the same class. MSDN says this method returns a Msvm_SummaryInformation class.
I call this function like this:
#get the wmi class object
vmManagement = wmiServerConnection.Msvm_VirtualSystemManagementService()
snapshotInfo = vmManagement[0].GetSummaryInformation([1,107])
This should give me the name and the snapshot information for all VMs on the HyperV server. But all I get is a list of COM objects.
When I try to give a certain VM as parameter gotten from
vmSettings = wmiServerConnection.Msvm_VirtualSystemSettingData(ElementName="VmName")
like this
snapshotInfo = vmManagement[0].GetSummaryInformation([1,107], [vmSettings[0]])
it crashes.
My questions:
Why don't I get a WMI object?
The second parameter is obviously wrong. MSDN says it needs CIM_VirtualSystemSettingData REF SettingData[] as parameter. Is the WMI object the wrong one? How do I get the correct parameter?
How can I retrieve the information I need from the COM object?
Or am I totally on the wrong track?
Thanks, Stefanie
So, I finally found the solution. It was much easier than I thought, but whatever:
1. Connect to your server and get the WMI object:
connection = wmi.connect_server(server=serverName, namespace=r"root\virtualization", user=username, password=password)
wmiServerConnection = wmi.WMI(wmi=connection)
2. Get the system object and the management service object:
#get object representing VM
vmSystem = wmiServerConnection.Msvm_ComputerSystem(ElementName=VmName)
#get object responsible for VM
vmManagement = wmiServerConnection.Msvm_VirtualSystemManagementService()
3. Get the objects associated with the VM:
#get objects the VM contains
vmObjects = vmSystem[0].associators(wmi_result_class="Msvm_VirtualSystemSettingData")
4. Apply the snapshot you want:
for singleVmObject in vmObjects:
    if singleVmObject.SettingType == 5 and singleVmObject.ElementName == snapshotName:
        retVal = vmManagement[0].ApplyVirtualSystemSnapshotEx(vmSystem[0].path(), singleVmObject.path())
Further documentation can be found here:
http://timgolden.me.uk/python/wmi/wmi.html
http://msdn.microsoft.com/en-us/library/cc136986(v=vs.85).aspx
Has anyone ever used wildcard subdomains in their application? I need to come up with a way to 'localise' my application. When I say localise, I mean anyone who goes to ny.foo.com/items/new/ will be sent to a view which looks through a database and searches for new items in NY. Obviously we could replace NY with any state.
Any tips would be great
Thanks!
I would do it using a middleware, eg.:
class StateCodeMiddleware(object):
    def process_request(self, request):
        bits = request.META['HTTP_HOST'].split('.')
        if len(bits) == 3 and len(bits[0]) == 2:
            request.state_code = bits[0]
        else:
            request.state_code = None
            # Or a redirect to the default state.
And then in any of your views, you can just check request.state_code and fetch new items only for that state.
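The host-splitting logic is easy to pull out and unit-test on its own; a sketch mirroring the middleware above (the function name is mine):

```python
def state_code_from_host(host):
    """Return the two-letter state subdomain of e.g. 'ny.foo.com', else None."""
    bits = host.split('.')
    if len(bits) == 3 and len(bits[0]) == 2:
        return bits[0]
    return None
```

Note that a plain 'www.foo.com' falls through to None because 'www' is three letters, which is exactly the behavior the middleware relies on.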
Edit: For development, the best method is to set up a local DNS server. E.g. dnsmasq is very easy to configure:
address=/.dev/127.0.0.1 # in dnsmasq.conf
This makes *.dev point to localhost. You'll also have to configure your system to use the local DNS server (on UNIX systems you do this by placing nameserver 127.0.0.1 into /etc/resolv.conf).
Alternatively, you can list all the domain names in your /etc/hosts if it is a finite set:
127.0.0.1 ny.localhost az.localhost # and so on
I am using web2py (v1.63) and Flex 3. web2py v1.61 introduced the #service decorators, which allow you to tag a controller function with #service.amfrpc. You can then call that function remotely using http://..../app/default/call/amfrpc/[function]. See http://www.web2py.com/examples/default/tools#services. Does anybody have an example of how you would set up a Flex 3 to call a function like this? Here is what I have tried so far:
<mx:RemoteObject id="myRemote" destination="amfrpc" source="amfrpc"
endpoint="http://{mysite}/{myapp}/default/call/amfrpc/">
<mx:method name="getContacts"
result="show_results(event)"
fault="on_fault(event)" />
</mx:RemoteObject>
In my scenario, what should be the value of the destination and source attributes? I have read a couple of articles on non-web2py implementations, such as http://corlan.org/2008/10/10/flex-and-php-remoting-with-amfphp/, but they use a .../gateway.php file instead of having a URI that maps directly to the function.
Alternatively, I have been able to use flash.net.NetConnection to successfully call my remote function, but most of the documentation I have found considers this to be the old, pre-Flex 3 way of doing AMF. See http://pyamf.org/wiki/HelloWorld/Flex. Here is the NetConnection code:
gateway = new NetConnection();
gateway.connect("http://{mysite}/{myapp}/default/call/amfrpc/");
resp = new Responder(show_results, on_fault);
gateway.call("getContacts", resp);
-Rob
I have not found a way to use a RemoteObject with the #service.amfrpc decorator. However, I can use the older ActionScript code using a NetConnection (similar to what I posted originally) and pair that with a #service.amfrpc function on the web2py side. This seems to work fine. The one thing that you would want to change in the NetConnection code I shared originally is adding an event listener for connection status. You can add more listeners if you feel the need, but I found that NetStatusEvent was a must. This status will be fired if the server is not responding. Your connection setup would look like:
gateway = new NetConnection();
gateway.addEventListener(NetStatusEvent.NET_STATUS, gateway_status);
gateway.connect("http://127.0.0.1:8000/robs_amf/default/call/amfrpc/");
resp = new Responder(show_results, on_fault);
gateway.call("getContacts", resp);
-Rob
I'm trying to get a web service up and running that actually requires checking whois databases. What I'm doing right now is ugly and I'd like to avoid it as much as I can: I call the gwhois command and parse its output. Ugly.
I did some searching to try to find a pythonic way to do this task. Generally I got quite much nothing - this old discussion list link has a way to check if a domain exists. Quite not what I was looking for... But still, it was the best answer Google gave me - everything else is just a bunch of unanswered questions.
Have any of you succeeded in getting some method up and running? I'd very much appreciate some tips, or should I just do it the open-source way, sit down and code something by myself? :)
Found this question in the process of my own search for a python whois library.
I don't know that I agree with cdleary's answer that using a library that wraps a command is always the best way to go - but I can see his reasons why he said this.
Pro: the command-line whois handles all the hard work (socket calls, parsing, etc.)
Con: not portable; the module may not work depending on the underlying whois command. Slower, since it runs a command (and most likely a shell) in addition to the whois command. Affected if you're not on UNIX (i.e. Windows), on a different or older UNIX, or have an older whois command.
I am looking for a whois module that can handle whois IP lookups and I am not interested in coding my own whois client.
Here are the modules that I (lightly) tried out and more information about it:
UPDATE from 2022: I would search PyPI for newer whois libraries. NOTE: some APIs are specific to online paid services.
pywhoisapi:
Home: http://code.google.com/p/pywhoisapi/
Last Updated: 2011
Design: REST client accessing ARIN whois REST service
Pros: Able to handle IP address lookups
Cons: Able to pull information from whois servers of other RIRs?
BulkWhois
Home: http://pypi.python.org/pypi/BulkWhois/0.2.1
Last Updated: fork https://github.com/deontpearson/BulkWhois in 2020
Design: telnet client accessing whois telnet query interface from RIR(?)
Pros: Able to handle IP address lookups
Cons: Able to pull information from whois servers of other RIRs?
pywhois:
Home: http://code.google.com/p/pywhois/
Last Updated: 2010 (no forks found)
Design: REST client accessing RRID whois services
Pros: Accesses many RRIDs; has a Python 3.x branch
Cons: does not seem to handle IP address lookups
python-whois (wraps whois command):
Home: http://code.google.com/p/python-whois/
Last Updated: 2022-11 https://github.com/DannyCork/python-whois
Design: wraps "whois" command
Cons: does not seem to handle IP address lookups
whois:
Home: https://github.com/richardpenman/whois (https://pypi.org/project/python-whois/)
Last Updated: 2022-12
Design: port of whois.c from Apple
Cons: TODO
whoisclient - fork of python-whois
Home: http://gitorious.org/python-whois
Last Updated: (home website no longer valid)
Design: wraps "whois" command
Depends on: IPy.py
Cons: does not seem to handle IP address lookups
Update: I ended up using pywhoisapi for the reverse IP lookups that I was doing
Look at this:
http://code.google.com/p/pywhois/
pywhois - Python module for retrieving WHOIS information of domains
Goal:
- Create a simple importable Python module which will produce parsed WHOIS data for a given domain.
- Able to extract data for all the popular TLDs (com, org, net, ...)
- Query a WHOIS server directly instead of going through an intermediate web service like many others do.
- Works with Python 2.4+ and no external dependencies
Example:
>>> import pywhois
>>> w = pywhois.whois('google.com')
>>> w.expiration_date
['14-sep-2011']
>>> w.emails
['contact-admin@google.com',
 'dns-admin@google.com',
 'dns-admin@google.com',
 'dns-admin@google.com']
>>> print w
...
There's nothing wrong with using a command line utility to do what you want. If you put a nice wrapper around the service, you can implement the internals however you want! For example:
class Whois(object):
    _whois_by_query_cache = {}

    def __init__(self, query):
        """Initializes the instance variables to defaults. See :meth:`lookup`
        for details on how to submit the query."""
        self.query = query
        self.domain = None
        # ... other fields.

    def lookup(self):
        """Submits the `whois` query and stores results internally."""
        # ... implementation
Now, whether or not you roll your own using urllib, wrap around a command line utility (like you're doing), or import a third party library and use that (like you're saying), this interface stays the same.
This approach is generally not considered ugly at all -- sometimes command utilities do what you want and you should be able to leverage them. If speed ends up being a bottleneck, your abstraction makes the process of switching to a native Python implementation transparent to your client code.
Practicality beats purity -- that's what's Pythonic. :)
Here is the whois client re-implemented in Python:
http://code.activestate.com/recipes/577364-whois-client/
I don't know if gwhois does something special with the server output; however, you can plainly connect to the whois server on port whois (43), send your query, read all the data in the reply and parse them. To make life a little easier, you could use the telnetlib.Telnet class (even if the whois protocol is much simpler than the telnet protocol) instead of plain sockets.
The tricky parts:
which whois server will you ask? RIPE, ARIN, APNIC, LACNIC, AFRINIC, JPNIC, VERIO etc LACNIC could be a useful fallback, since they tend to reply with useful data to requests outside of their domain.
what are the exact options and arguments for each whois server? some offer help, others don't. In general, plain domain names work without any special options.
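For reference, the protocol described above really is just one query line over TCP port 43 (RFC 3912); a plain-socket sketch (whois.iana.org is my guess at a reasonable default server, and build_query is a helper name of my own):

```python
import socket

def build_query(domain):
    # A WHOIS query is the bare name followed by CRLF (RFC 3912).
    return (domain + '\r\n').encode('ascii')

def whois_query(domain, server='whois.iana.org', timeout=10):
    """Send one WHOIS query and return the server's full reply as text."""
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(build_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b''.join(chunks).decode('utf-8', errors='replace')
```

The server closes the connection when it has sent everything, which is why reading until recv() returns empty bytes is the correct termination condition.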
Another way to do it is to use the urllib2 module to parse some other page's whois service (many sites like that exist). But that seems like even more of a hack than what you do now, and would give you a dependency on whatever whois site you chose, which is bad.
I hate to say it, but unless you want to re-implement whois in your program (which would be re-inventing the wheel), running whois on the OS and parsing the output (i.e. what you are doing now) seems like the right way to do it.
Parsing another webpage wouldn't be as bad (assuming their HTML wouldn't be very bad), but it would actually tie me to them - if they're down, I'm down :)
Actually I found some old project on sourceforge: rwhois.py. What scares me a bit is that their last update is from 2003. But it might be a good place to start reimplementing what I do right now... Well, I felt obliged to post the link to this project anyway, just for further reference.
Here is a ready-to-use solution that works for me; written for Python 3.1 (when backporting to Py2.x, take special care of the bytes/Unicode text distinctions). Your single point of access is the method DRWHO.whois(), which expects a domain name to be passed in; it will then try to resolve the name using the provider configured as DRWHO.whois_providers['*'] (a more complete solution could differentiate providers according to the top-level domain). DRWHO.whois() will return a dictionary with a single entry text, which contains the response text sent back by the WHOIS server. Again, a more complete solution would then try to parse the text (which must be done separately for each provider, as there is no standard format) and return a more structured format (e.g., set a flag available which specifies whether or not the domain looks available). Have fun!
##########################################################################
import asyncore as _sys_asyncore
from asyncore import loop as _sys_asyncore_loop
import socket as _sys_socket

##########################################################################
class _Whois_request( _sys_asyncore.dispatcher_with_send, object ):
    # simple whois requester
    # original code by Frederik Lundh
    #-----------------------------------------------------------------------
    whoisPort = 43
    #-----------------------------------------------------------------------
    def __init__( self, consumer, host, provider ):
        _sys_asyncore.dispatcher_with_send.__init__( self )
        self.consumer = consumer
        self.query = host
        self.create_socket( _sys_socket.AF_INET, _sys_socket.SOCK_STREAM )
        self.connect( ( provider, self.whoisPort, ) )
    #-----------------------------------------------------------------------
    def handle_connect( self ):
        self.send( bytes( '%s\r\n' % ( self.query, ), 'utf-8' ) )
    #-----------------------------------------------------------------------
    def handle_expt( self ):
        self.close() # connection failed, shutdown
        self.consumer.abort()
    #-----------------------------------------------------------------------
    def handle_read( self ):
        # get data from server
        self.consumer.feed( self.recv( 2048 ) )
    #-----------------------------------------------------------------------
    def handle_close( self ):
        self.close()
        self.consumer.close()

##########################################################################
class _Whois_consumer( object ):
    # original code by Frederik Lundh
    #-----------------------------------------------------------------------
    def __init__( self, host, provider, result ):
        self.texts_as_bytes = []
        self.host = host
        self.provider = provider
        self.result = result
    #-----------------------------------------------------------------------
    def feed( self, text ):
        self.texts_as_bytes.append( text.strip() )
    #-----------------------------------------------------------------------
    def abort( self ):
        del self.texts_as_bytes[:]
        self.finalize()
    #-----------------------------------------------------------------------
    def close( self ):
        self.finalize()
    #-----------------------------------------------------------------------
    def finalize( self ):
        # join bytestrings and decode them (with a guessed encoding):
        text_as_bytes = b'\n'.join( self.texts_as_bytes )
        self.result[ 'text' ] = text_as_bytes.decode( 'utf-8' )

##########################################################################
class DRWHO:
    #-----------------------------------------------------------------------
    whois_providers = {
        '~isa': 'DRWHO/whois-providers',
        '*': 'whois.opensrs.net', }
    #-----------------------------------------------------------------------
    def whois( self, domain ):
        R = {}
        provider = self._get_whois_provider( '*' )
        self._fetch_whois( provider, domain, R )
        return R
    #-----------------------------------------------------------------------
    def _get_whois_provider( self, top_level_domain ):
        providers = self.whois_providers
        R = providers.get( top_level_domain, None )
        if R is None:
            R = providers[ '*' ]
        return R
    #-----------------------------------------------------------------------
    def _fetch_whois( self, provider, domain, pod ):
        #.....................................................................
        consumer = _Whois_consumer( domain, provider, pod )
        request = _Whois_request( consumer, domain, provider )
        #.....................................................................
        _sys_asyncore_loop() # loops until requests have been processed

#=========================================================================
DRWHO = DRWHO()

domain = 'example.com'
whois = DRWHO.whois( domain )
print( whois[ 'text' ] )
import socket
socket.gethostbyname_ex('url.com')
If it raises socket.gaierror, you know it's not registered with any DNS.
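Wrapped up as a tiny helper (a sketch; the is_resolvable name is mine):

```python
import socket

def is_resolvable(name):
    """True if the name resolves (via hosts file or DNS), False on lookup failure."""
    try:
        socket.gethostbyname_ex(name)
        return True
    except socket.gaierror:
        return False
```

Note that this only tells you the name currently resolves; a registered domain with no DNS records would still report False, so it is a heuristic rather than a true registration check.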