Is there a way, using the new format syntax, to format a string from a function call? For example:
"my request url was {0.get_full_path()}".format(request)
so that it calls the get_full_path function inside the string and not as a parameter in the format call.
EDIT:
Here is another example that will probably show my frustration better. This is what I would like:
"{0.full_name()} {0.full_last_name()} and my nick name is {0.full_nick_name()}".format(user)
this is what I want to avoid:
"{0} and {1} and my nick name is {2}".format(user.full_name(), user.full_last_name(), user.full_nick_name())
Not sure if you can modify the object, but you could modify or wrap the object to make the functions properties. Then they would look like attributes, and you could do it as
class WrapperClass(originalRequest):
    @property
    def full_name(self):
        return super(WrapperClass, self).full_name()
"{0.full_name} {0.full_last_name} and my nick name is {0.full_nick_name}".format(user)
which IS legal.
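Here is a minimal, self-contained sketch of that wrapping idea, with a hypothetical User class standing in for the original object (the class and method names are made up for illustration):
class User:
    def full_name(self):
        return "Joe"

    def full_last_name(self):
        return "Bloggs"

class UserWrapper(User):
    # expose the methods as properties so str.format can reach them
    # via plain attribute access
    @property
    def full_name(self):
        return super().full_name()

    @property
    def full_last_name(self):
        return super().full_last_name()

print("{0.full_name} {0.full_last_name}".format(UserWrapper()))
# Joe Bloggs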
Python 3.6 adds literal string interpolation, which is written with an f prefix. See PEP 0498 -- Literal String Interpolation.
This allows one to write
>>> x = 'hello'
>>> s = f'{x}'
>>> print(s)
hello
It should be noted that these are not actual strings, but represent code that evaluates to a string each time. In the above example, s will be of type str, with value 'hello'. You can't pass an f-string around, since it will be evaluated to the result str before being used (unlike str.format, but like every other string literal modifier, such as r'hello', b'hello', '''hello'''). (PEP 501 -- General purpose string interpolation (currently deferred) suggests a string literal that will evaluate to an object which can take substitutions later.)
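Because an f-string is evaluated in place, you can call a method directly inside the braces, which is exactly what the question asks for. A small sketch with a stand-in Request class (the real object would be e.g. a Django HttpRequest); requires Python 3.6+:
class Request:                      # stand-in for the real request object
    def get_full_path(self):
        return "/some/path?x=1"

request = Request()
print(f"my request url was {request.get_full_path()}")
# my request url was /some/path?x=1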
Python does not directly support variable interpolation. This means that it lacks certain functionality (namely, function calling in strings) which other languages support.
So, there isn't really anything to say here other than no, you can't do that. That's just not how Python's formatting syntax works.
The best you have is this:
"my request url was {0}".format(request.get_full_path())
What about this very weird thing?
"my request url was %s and my post was %s"\
% (lambda r: (r.get_full_path(), r.POST))(request)
Explanation:
Classic way of formatting
Lambda function which takes a request and returns a tuple with what you want
Call the lambda inline as arguments for your string.
I still prefer the way you're doing it.
If you want readability you can do this:
path, post = request.get_full_path(), request.POST
"my request url was {} and my post was {}".format(path, post)
So a summary of methods would be:
(base) [1]~ $ cat r.py
# user is dict:
user = {'full_name': 'dict joe'}
print('{0[full_name]}'.format(user))

# user is obj:
class user:
    @property
    def full_name(self):
        return 'attr joe'

print('{0.full_name}'.format(user()))

# Wrapper for arbitrary values - as dict or by attr
class Getter:
    def __init__(self, src):
        self.src = src

    def __getitem__(self, k):
        return getattr(self.src, k, 'not found: %s' % k)

    __getattr__ = __getitem__

print('{0[foo]} - {0.full_name}'.format(Getter(user())))
(base) [1]~ $ python r.py
dict joe
attr joe
not found: foo - attr joe
Related
I am using Python 3.9 and trying to write the function
def greet(request, name):
    return HttpResponse(f'Hello, {name.capitalize}!')
Using f to format the string but it is not working. Any ideas on why?
capitalize is a method of the str object.
Therefore you need to add parenthesis for it to be called:
def greet(request, name):
    return HttpResponse(f'Hello, {name.capitalize()}!')
Furthermore, name.capitalize is really just the reference to the function.
Try running the following inside a python interpreter:
print(str.capitalize)
You could even return this function:
def cap_str(string):
    return string.capitalize

s = "programming in python"
capitalize_s = cap_str(s)
s_cap = capitalize_s()
print(s_cap)
I don't know how this would be particularly useful, but returning a function in general is pretty useful.
Can you print the error message? I suspect your error is that you want name.capitalize() rather than name.capitalize
Ah - this has already been added!
How can I find the number of arguments of a Python function? I need to know how many normal arguments it has and how many named arguments.
Example:
def someMethod(self, arg1, kwarg1=None):
    pass
This method has 2 arguments and 1 named argument.
The previously accepted answer has been deprecated as of Python 3.0. Instead of using inspect.getargspec you should now opt for the Signature class which superseded it.
Creating a Signature for the function is easy via the signature function:
from inspect import signature
def someMethod(self, arg1, kwarg1=None):
    pass
sig = signature(someMethod)
Now, you can either view its parameters quickly by converting it to a string:
str(sig) # returns: '(self, arg1, kwarg1=None)'
or you can also get a mapping of attribute names to parameter objects via sig.parameters.
params = sig.parameters
print(params['kwarg1']) # prints: kwarg1=None
Additionally, you can call len on sig.parameters to also see the number of arguments this function requires:
print(len(params)) # 3
Each entry in the params mapping is actually a Parameter object that has further attributes making your life easier. For example, grabbing a parameter and viewing its default value is now easily performed with:
kwarg1 = params['kwarg1']
kwarg1.default # returns: None
similarly for the rest of the objects contained in parameters.
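Since the question distinguishes between normal and named (defaulted) arguments, here is a small sketch of how one might count each kind from the Parameter objects (the counts in the comments are for the someMethod above):
from inspect import signature, Parameter

def someMethod(self, arg1, kwarg1=None):
    pass

params = signature(someMethod).parameters.values()
required = sum(1 for p in params if p.default is Parameter.empty)
named = len(params) - required

print(required, named)   # 2 1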
As for Python 2.x users, while inspect.getargspec isn't deprecated, the language will soon be :-). The Signature class isn't available in the 2.x series and won't be. So you still need to work with inspect.getargspec.
As for transitioning between Python 2 and 3, if you have code that relies on the interface of getargspec in Python 2 and switching to signature in 3 is too difficult, you do have the valuable option of using inspect.getfullargspec. It offers a similar interface to getargspec (a single callable argument) in order to grab the arguments of a function while also handling some additional cases that getargspec doesn't:
from inspect import getfullargspec
def someMethod(self, arg1, kwarg1=None):
    pass
args = getfullargspec(someMethod)
As with getargspec, getfullargspec returns a NamedTuple which contains the arguments.
print(args)
FullArgSpec(args=['self', 'arg1', 'kwarg1'], varargs=None, varkw=None, defaults=(None,), kwonlyargs=[], kwonlydefaults=None, annotations={})
import inspect
inspect.getargspec(someMethod)
see the inspect module
func.__code__.co_argcount gives you the number of arguments before *args
func.__kwdefaults__ gives you a dict of the keyword-only arguments that have defaults (those after *args)
func.__code__.co_kwonlyargcount equals len(func.__kwdefaults__) when every keyword-only argument has a default
func.__defaults__ gives you the values of optional arguments that appear before *args
Here is a simple illustration:
>>> def a(b, c, d, e, f=1, g=3, h=None, *i, j=2, k=3, **L):
...     pass
>>> a.__code__.co_argcount
7
>>> a.__defaults__
(1, 3, None)
>>> len(a.__defaults__)
3
>>>
>>>
>>> a.__kwdefaults__
{'j': 2, 'k': 3}
>>> len(a.__kwdefaults__)
2
>>> a.__code__.co_kwonlyargcount
2
someMethod.func_code.co_argcount
or, if the current function name is undetermined:
import sys
sys._getframe().f_code.co_argcount
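For reference, the first snippet uses the Python 2 attribute name func_code; a minimal sketch with the Python 3 names (__code__ on the function object, f_code on a frame):
import sys

def some_method(a, b, c=None):
    # from inside the function, via the current frame
    print(sys._getframe().f_code.co_argcount)   # 3

some_method(1, 2)
print(some_method.__code__.co_argcount)          # 3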
inspect.getargspec()
Get the names and default values of a function’s arguments. A tuple of four things is returned: (args, varargs, varkw, defaults). args is a list of the argument names (it may contain nested lists). varargs and varkw are the names of the * and ** arguments or None. defaults is a tuple of default argument values or None if there are no default arguments; if this tuple has n elements, they correspond to the last n elements listed in args.
Changed in version 2.6: Returns a named tuple ArgSpec(args, varargs, keywords, defaults).
See can-you-list-the-keyword-arguments-a-python-function-receives.
Adding to the above, I've also seen that most of the time the help() function really helps.
For example, it gives all the details about the arguments a method takes.
help(<method>)
gives the below
method(self, **kwargs) method of apiclient.discovery.Resource instance
Retrieves a report which is a collection of properties / statistics for a specific customer.
Args:
date: string, Represents the date in yyyy-mm-dd format for which the data is to be fetched. (required)
pageToken: string, Token to specify next page.
parameters: string, Represents the application name, parameter name pairs to fetch in csv as app_name1:param_name1, app_name2:param_name2.
Returns:
An object of the form:
{ # JSON template for a collection of usage reports.
"nextPageToken": "A String", # Token for retrieving the next page
"kind": "admin#reports#usageReports", # Th
Good news for folks who want to write single-source code for Python 2 and Python 3.6+: in Python 3, use the inspect.getfullargspec() method. It mirrors the interface of Python 2's getargspec() and is no longer deprecated as of 3.6.
As Jim Fasarakis Hilliard and others have pointed out, it used to be like this:
1. In Python 2.x: use inspect.getargspec()
2. In Python 3.x: use signature, as getargspec() and getfullargspec() were deprecated.
However, starting with Python 3.6 (by popular demand?), things have changed for the better:
From the Python 3 documentation page:
inspect.getfullargspec(func)
Changed in version 3.6: This method was previously documented as deprecated in favour of signature() in Python 3.5, but that decision has been reversed in order to restore a clearly supported standard interface for single-source Python 2/3 code migrating away from the legacy getargspec() API.
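A quick sketch of getfullargspec applied to the method from the question (counting total arguments and those with defaults):
from inspect import getfullargspec

def someMethod(self, arg1, kwarg1=None):
    pass

spec = getfullargspec(someMethod)
print(len(spec.args))              # 3 -- total named positional arguments
print(len(spec.defaults or ()))    # 1 -- arguments that have a default value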
You get the number of arguments by (replace "function" by the name of your function):
function.__code__.co_argcount ## 2
And the names for the arguments by:
function.__code__.co_varnames ## ('a', 'b')
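One caveat worth knowing: co_varnames lists the argument names first and then every local variable, so slice it by co_argcount when the function has locals. A small sketch:
def function(a, b):
    total = a + b          # a local variable also shows up in co_varnames
    return total

code = function.__code__
print(code.co_argcount)                      # 2
print(code.co_varnames)                      # ('a', 'b', 'total')
print(code.co_varnames[:code.co_argcount])   # ('a', 'b')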
As other answers suggest, getargspec works well as long as the thing being queried is actually a function. It does not work for built-in functions such as open, len, etc, and will throw an exception in such cases:
TypeError: <built-in function open> is not a Python function
The below function (inspired by this answer) demonstrates a workaround. It returns the number of args expected by f:
from inspect import isfunction, getargspec
def num_args(f):
    if isfunction(f):
        return len(getargspec(f).args)
    else:
        spec = f.__doc__.split('\n')[0]
        args = spec[spec.find('(')+1:spec.find(')')]
        return args.count(',') + 1 if args else 0
The idea is to parse the function spec out of the __doc__ string. Obviously this relies on the format of said string so is hardly robust!
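On Python 3, a less fragile alternative for many built-ins is inspect.signature, since recent CPython exposes text signatures for most of them (some C functions can still raise ValueError, so treat this as a sketch rather than a universal solution):
from inspect import signature

# Works on recent CPython where the built-in carries a text signature;
# a few C functions still raise ValueError("no signature found").
print(len(signature(len).parameters))    # 1
print(len(signature(open).parameters))   # 8 (file, mode, buffering, ...)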
You can use inspect.getargspec() to meet your needs (Python 2):
from inspect import getargspec

def func(a, b):
    pass

print len(getargspec(func).args)
The accepted answer by Dimitris Fasarakis Hilliard suggests getting the parameters in string format, but parsing that string is error-prone, so I instead build a list of the parameter names directly using the inspect module:
import inspect

def my_function(a, b, c):
    # some code
    pass

result = list(inspect.signature(my_function).parameters.keys())
print(result)
['a', 'b', 'c']
Assuming you may be dealing with class based methods or simply functions, you could do something like the following.
This will automatically subtract one input if the input is a class method (and therefore includes self).
import types

def get_arg_count(fn):
    extra_method_input_count = 1 if isinstance(fn, types.MethodType) else 0
    return fn.__code__.co_argcount - extra_method_input_count
Then you can apply as you need to functions or methods:
def fn1(a, b, c):
    return None

class cl1:
    def fn2(self, a, b, c):
        return None

print(get_arg_count(fn1))        # => 3
print(get_arg_count(cl1().fn2))  # => 3
In:
import inspect

class X:
    def xyz(self, a, b, c):
        return

print(len(inspect.getfullargspec(X.xyz).args))
Out:
4
Note: If xyz wasn't inside class X and had no "self" and just "a, b, c", then it would have printed 3.
For Python 2, you may want to replace inspect.getfullargspec with inspect.getargspec in the code above.
This is a solution to getting the number of mandatory arguments of a function (*)
Many of the solutions proposed here do not work for this purpose if some more uncommon parameter specifications are used (positional-only parameters with defaults, keyword-only parameters without defaults, etc.)
from typing import Callable, Any
import inspect

def get_mandatory_argcount(f: Callable[..., Any]) -> int:
    """Get the number of mandatory arguments of a function."""
    sig = inspect.signature(f)

    def parameter_is_mandatory(p: inspect.Parameter) -> bool:
        return p.default is inspect.Parameter.empty and p.kind not in (
            inspect.Parameter.VAR_POSITIONAL,
            inspect.Parameter.VAR_KEYWORD,
        )

    return sum(parameter_is_mandatory(p) for p in sig.parameters.values())

# mandatory keyword-only
def f1(b=2, *args, c, d=1, **kwds): pass
print(get_mandatory_argcount(f1))

# positional only with default
def f2(a=1, /, b=3, *args, **kwargs): pass
print(get_mandatory_argcount(f2))
(*) I would have liked to put this as an answer to Programmatically determining amount of parameters a function requires - Python instead, but for some reason this question is marked as duplicate to this one despite it asking specifically about the number of required arguments whereas this question only asks about the general number of arguments.
Does Python 3.5 provide functions that allow one to test whether a given
argument would fit the type hints given in the function declaration?
If I have e.g. this function:
def f(name: List[str]):
    pass
Is there a Python method that can check whether
name = ['a', 'b']
name = [0, 1]
name = []
name = None
...
fit the type hints?
I know that 'no type checking happens at runtime', but can I still check the
validity of these arguments by hand in Python?
Or, if Python does not provide that functionality itself: what is the tool I'd
need to use?
Python itself doesn't provide such functions; you can read more about it here:
I wrote a decorator for that. This is the code of my decorator:
from typing import get_type_hints

def strict_types(function):
    def type_checker(*args, **kwargs):
        hints = get_type_hints(function)
        all_args = kwargs.copy()
        all_args.update(dict(zip(function.__code__.co_varnames, args)))
        for argument, argument_type in ((i, type(j)) for i, j in all_args.items()):
            if argument in hints:
                if not issubclass(argument_type, hints[argument]):
                    raise TypeError('Type of {} is {} and not {}'.format(argument, argument_type, hints[argument]))
        result = function(*args, **kwargs)
        if 'return' in hints:
            if type(result) != hints['return']:
                raise TypeError('Type of result is {} and not {}'.format(type(result), hints['return']))
        return result
    return type_checker
You can use it like that:
@strict_types
def repeat_str(mystr: str, times: int):
    return mystr * times
It's not very Pythonic to restrict your function to accept only one type, though. You can use ABCs (abstract base classes) like Number (or a custom ABC) as type hints and allow your functions to accept not just one type but whatever combination of types you want.
Added a github repo for it, if anybody wants to use it.
This is an old question, but there is a tool I've written for doing run time type checking based on type hints: https://pypi.org/project/typeguard/
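For completeness, a hedged sketch of how such a tool is typically used. The decorator below is typeguard's @typechecked; the exact exception type raised on a mismatch differs between typeguard versions:
from typing import List
from typeguard import typechecked

@typechecked
def f(name: List[str]) -> None:
    pass

f(['a', 'b'])   # passes the check
f([0, 1])       # raises a type-check error at call time
                # (TypeError or TypeCheckError, depending on the typeguard version)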
Problem:
I am trying to figure out how to convert a buildbot Property into a string value. I really don't have much experience with buildbot other than what I have read in the docs and someone else's code.
The issue is I have a Property that contains a path. I need to get the path as a string so that I can use some python functions such as 'split' and 'basename' to retrieve specific elements of the path.
What I Have Tried:
There is a property mapped like so
"artifact.output":"S3://dev/artifacts/out/package1.tar.gz"
When I call os.path.basename(util.Property("artifact.output")) it complains that Property has no 'rfind' method. I also tried using util.Interpolate but again, it has the same issue. Finally, I tried str(util.Property("artifact.output")) but it just outputs Property("artifact.output").
Question:
Is it possible to retrieve a buildbot Property as a string value?
Note: I was only able to find one other post from someone back in 2014 asking the same thing, but with no answer.
A Property is not a string by itself; it's a class that implements the IRenderable interface. This interface defines something that can be "rendered" into a string when needed. To render a Property or any renderable (e.g. the util.Interpolate object), you need an IProperties provider.
The question is where to get such a provider and how to render with it. When implementing your own step, you can use the Build instance that you can access from self.build as such a provider and use it to render the property.
class ExampleBuildStep(BuildStep):
    def __init__(self, arg, **kwargs):
        """
        Args:
            arg - any string, Property or any renderable that will be rendered in run
        """
        self.arg = arg
        super().__init__(**kwargs)

    @defer.inlineCallbacks
    def run(self):
        # renderedcommand will be the string representation of self.arg
        renderedcommand = yield self.build.render(self.arg)
In the example above, ExampleBuildStep takes an argument arg that will be rendered inside the run() function. Note that arg does not have to be a Property; it can be a plain string as well. You can now create the build step with renderables:
step = ExampleBuildStep(util.Property("artifact.output"))
step = ExampleBuildStep(util.Interpolate('%(prop:artifact.output)s'))
step = ExampleBuildStep("string argument")
You can use Interpolate for that purpose:
util.Interpolate('string before ' + '%(prop:artifact.output)s' + ' string after')
If you have access to the BuildStep object, you can grab properties already formatted as a string via the getProperty() method.
If you wanted to grab the "workername" as a string and print it, you could call:
workerName = step.getProperties().getProperty('workername','wname')
print("workerName: %s" % workerName)
Note: workername is one of the Common Build Properties you should always expect to find, alongside the user-defined ones.
getProperty() is defined in properties.py:
def getProperty(self, name, default=None):
    return self.properties.get(name, (default,))[0]
Remember to switch branches from 'master' to whatever your version of buildbot is.
I've recently noticed something interesting when looking at Python 3.3 grammar specification:
funcdef: 'def' NAME parameters ['->' test] ':' suite
The optional 'arrow' block was absent in Python 2 and I couldn't find any information regarding its meaning in Python 3. It turns out this is correct Python and it's accepted by the interpreter:
def f(x) -> 123:
    return x
I thought that this might be some kind of a precondition syntax, but:
I cannot test x here, as it is still undefined,
No matter what I put after the arrow (e.g. 2 < 1), it doesn't affect the function behavior.
Could anyone familiar with this syntax style explain it?
It's a function annotation.
In more detail, Python 2.x has docstrings, which allow you to attach a metadata string to various types of object. This is amazingly handy, so Python 3 extends the feature by allowing you to attach metadata to functions describing their parameters and return values.
There's no preconceived use case, but the PEP suggests several. One very handy one is to allow you to annotate parameters with their expected types; it would then be easy to write a decorator that verifies the annotations or coerces the arguments to the right type. Another is to allow parameter-specific documentation instead of encoding it into the docstring.
These are function annotations covered in PEP 3107. Specifically, the -> marks the return function annotation.
Examples:
def kinetic_energy(m: 'in KG', v: 'in M/S') -> 'Joules':
    return 1/2*m*v**2
>>> kinetic_energy.__annotations__
{'return': 'Joules', 'v': 'in M/S', 'm': 'in KG'}
Annotations are dictionaries, so you can do this:
>>> '{:,} {}'.format(kinetic_energy(12,30),
kinetic_energy.__annotations__['return'])
'5,400.0 Joules'
You can also have a python data structure rather than just a string:
rd = {'type': float, 'units': 'Joules',
      'docstring': 'Given mass and velocity returns kinetic energy in Joules'}
def f() -> rd:
    pass
>>> f.__annotations__['return']['type']
<class 'float'>
>>> f.__annotations__['return']['units']
'Joules'
>>> f.__annotations__['return']['docstring']
'Given mass and velocity returns kinetic energy in Joules'
Or, you can use function attributes to validate called values:
def validate(func, locals):
    for var, test in func.__annotations__.items():
        value = locals[var]
        try:
            pr = test.__name__ + ': ' + test.__docstring__
        except AttributeError:
            pr = test.__name__
        msg = '{}=={}; Test: {}'.format(var, value, pr)
        assert test(value), msg

def between(lo, hi):
    def _between(x):
        return lo <= x <= hi
    _between.__docstring__ = 'must be between {} and {}'.format(lo, hi)
    return _between

def f(x: between(3, 10), y: lambda _y: isinstance(_y, int)):
    validate(f, locals())
    print(x, y)
Prints
>>> f(2,2)
AssertionError: x==2; Test: _between: must be between 3 and 10
>>> f(3,2.1)
AssertionError: y==2.1; Test: <lambda>
In the following code:
def f(x) -> int:
    return int(x)
the -> int just tells us that f() returns an integer (but it doesn't force the function to return an integer). It is called a return annotation, and can be accessed as f.__annotations__['return'].
Python also supports parameter annotations:
def f(x: float) -> int:
    return int(x)
: float tells people who read the program (and some third-party libraries/programs, e.g. pylint) that x should be a float. It is accessed as f.__annotations__['x'], and doesn't have any meaning by itself. See the documentation for more information:
https://docs.python.org/3/reference/compound_stmts.html#function-definitions
https://www.python.org/dev/peps/pep-3107/
As other answers have stated, the -> symbol is used as part of function annotations. In more recent versions of Python >= 3.5, though, it has a defined meaning.
PEP 3107 -- Function Annotations described the specification, defining the grammar changes, the existence of func.__annotations__ in which they are stored, and the fact that its use case is still open.
In Python 3.5 though, PEP 484 -- Type Hints attaches a single meaning to this: -> is used to indicate the type that the function returns. It also seems like this will be enforced in future versions as described in What about existing uses of annotations:
The fastest conceivable scheme would introduce silent deprecation of non-type-hint annotations in 3.6, full deprecation in 3.7, and declare type hints as the only allowed use of annotations in Python 3.8.
(Emphasis mine)
This hasn't been actually implemented as of 3.6 as far as I can tell so it might get bumped to future versions.
According to this, the example you've supplied:
def f(x) -> 123:
return x
will be forbidden in the future (and in current versions will be confusing), it would need to be changed to:
def f(x) -> int:
return x
for it to effectively describe that function f returns an object of type int.
The annotations are not used in any way by Python itself, it pretty much populates and ignores them. It's up to 3rd party libraries to work with them.
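For example, a third-party tool (or your own code) can simply read the populated annotations; typing.get_type_hints additionally resolves string annotations (the function below is just illustrative):
from typing import get_type_hints

def f(x: int) -> str:
    return str(x)

print(f.__annotations__)   # {'x': <class 'int'>, 'return': <class 'str'>}
print(get_type_hints(f))   # {'x': <class 'int'>, 'return': <class 'str'>}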
This means the type of the result the function returns, but it can be None.
It is widespread in modern libraries targeting Python 3.x.
For example, it appears in many places in the code of the pandas-profiling library:
def get_description(self) -> dict:
def get_rejected_variables(self, threshold: float = 0.9) -> list:
def to_file(self, output_file: Path or str, silent: bool = True) -> None:
"""Write the report to a file.
def f(x) -> 123:
    return x
My summary:
Simply -> is introduced to get developers to optionally specify the return type of the function. See Python Enhancement Proposal 3107
This is an indication of how things may develop in future as Python is adopted extensively - an indication towards strong typing - this is my personal observation.
You can specify types for arguments as well. Specifying return type of the functions and arguments will help in reducing logical errors and improving code enhancements.
You can have expressions as annotations (both at the function and parameter level), and the result of the expression can be accessed via the __annotations__ object's 'return' key. __annotations__ will be empty for lambda inline functions (see the sketch below).
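A small sketch of that last point, accessing annotations via __annotations__ and seeing that lambdas carry none:
def scale(x: int, factor: float = 2.0) -> float:
    return x * factor

print(scale.__annotations__)
# {'x': <class 'int'>, 'factor': <class 'float'>, 'return': <class 'float'>}

double = lambda y: y * 2
print(double.__annotations__)   # {} -- lambdas cannot carry annotations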
def function(arg) -> 123:
It's simply a return annotation, an integer in this case; it doesn't matter which number you write.
like Java:
public int function(int args){...}
But in Python (as Jim Fasarakis Hilliard said) the return type is just a hint, so it suggests the return type but still allows the function to return another type, such as a string:
def f(x) -> str:
    return x+4

print(f(45))
will give the result: 49.
Or in other words '-> str' has NO effect on return type:
print(f(45).__class__)
<class 'int'>
-> was introduced in Python 3.
In simpler words, the content after the -> denotes the return type of the function.
The return type is optional.
It's just telling the user what the function expects or returns.
funcname.__annotations__ will print the details, like:
def function(name: str, age: int) -> "printing the personal details ":
    print(f"name is {name} age is {age}")

function("test", 20)
print(function.__annotations__)
The Output
name is test age is 20
{'name': <class 'str'>, 'age': <class 'int'>, 'return': 'printing the personal details '}
Even when the function returns a value, the annotation itself isn't displayed and doesn't affect that value.
Please refer to the PEP 3107 specification. These are function annotations. Python 2.x has docstrings; similarly, Python 3 introduced the use of -> for function annotations. Python uses these while generating documentation.