I have a class defined in Python 2.7 like this:
from future.builtins import object

class Point(object):
    def __init__(self, x, y):
        self.x = x
        self.y = y
In PyCharm, this gives a warning in the __init__ line:
Signature is not compatible to __new__.
I don't understand what this warning is telling me. Can someone give an example where this warning would rightfully catch an error or can this warning be turned off?
There's a PyCharm thread for this, but it doesn't help me: https://intellij-support.jetbrains.com/hc/en-us/community/posts/115000254530-PyCharm-init-Signature-is-not-compatible-to-new-
I suffered from the same problem and found the solution here.
As discussed in the link above, a direct solution is to remove your PyCharm config directory ("/home/username/.PyCharm2018.3" for me). However, that would remove all your other configs too. In the end, I fixed the problem by disabling the rule in the inspections. You can find the rule in the PyCharm Settings window (see the figure below).
This is a PyCharm bug; you can disable the warning via Xiong-Hui's post. Basically __new__ is a method that is called before __init__ with the same arguments when a class is constructed, so their signatures must be compatible (both functions must be able to be invoked with the same arguments) or the class cannot be instantiated. For example, here is a code snippet for which PyCharm correctly applies the warning:
class Test(object):
    def __new__(cls, arg1):
        return super().__new__(cls)

    def __init__(self):
        pass
Attempting to create the class with Test() will throw a TypeError since __new__ expects an argument, but creating the class with Test('something') will also throw a TypeError since __init__ doesn't expect any arguments, making it impossible to construct the class. Normally this is never an issue because the default implementation of __new__ in object accepts an arbitrary number of arguments, but if you define __new__ yourself you will need to be careful that the signatures remain compatible so the class can actually be constructed, which is the purpose of the warning.
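For contrast, here is a minimal sketch (class name hypothetical) of a compatible pair that PyCharm should accept: both methods take the same extra argument, so the class can actually be constructed.

```python
class Tagged(object):
    # Both __new__ and __init__ accept (cls/self, tag), so
    # Tagged('example') can invoke each with the same arguments.
    def __new__(cls, tag):
        return super(Tagged, cls).__new__(cls)

    def __init__(self, tag):
        self.tag = tag

t = Tagged('example')
print(t.tag)  # example
```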
Related
I've been looking into type hinting my code, but I've noticed that Python programmers typically do not type hint self in their programs.
Even when I look at the docs, they do not seem to type hint self; see
here. This is from version 3.10, after forward references were introduced:
def __init__(self, value: T, name: str, logger: Logger) -> None:
I can understand why this was an issue before forward references were introduced in 3.7 (with from __future__ import annotations).
More info
here and here
The reason this seems useful to me is that mypy seems able to catch bugs with this approach. For example:
from __future__ import annotations

class Simple(object):
    def __init__(self: Simple):
        print(self.x)
would return this from mypy
mypy test.py
test.py:5: error: "Simple" has no attribute "x"
Found 1 error in 1 file (checked 1 source file)
Which, if you remove the type annotation from self, becomes
Success: no issues found in 1 source file
Is there a reason that self is not annotated or is this only convention?
Are there trade offs I'm missing or is my annotation of self wrong for some reason?
mypy usually handles the type of self without needing an explicit annotation. You're running into a different problem - a method with no argument or return type annotations is not type-checked at all. For a method with no non-self arguments, you can avoid this by annotating self, but you can also avoid it by annotating the return type.
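For instance, adding only -> None to the example from the question is enough to make mypy check the method body and report the missing attribute; no annotation on self is needed (a sketch, reusing the Simple class from the question):

```python
class Simple(object):
    # The return annotation alone makes this method "annotated",
    # so mypy type-checks its body and flags the missing attribute x.
    def __init__(self) -> None:
        print(self.x)
```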
Hello there, thanks in advance for helping me.
Please see below code:
import types

_MSG = ("Failed importing {name}. Please install {name}."
        " Using pip install {name}")

class Empty():  # pylint: disable=too-few-public-methods
    """Empty class for beam API."""

    def __call__(self, *args, **kwargs):
        return None

class DummyBeam(types.ModuleType):  # pylint: disable=too-few-public-methods
    DoFn = Empty
    Pipeline = Empty

    def __init__(self, name="apache_beam"):
        super(DummyBeam, self).__init__(name)

    def __getattribute__(self, _):
        if getattr(DummyBeam, _, None) is Empty:
            err_msg = _MSG.format(name=self.__name__)
            raise ImportError(err_msg)
What I want: if apache_beam is not installed, the code should still successfully load all beam classes like DoFn and Pipeline, but calling some function should raise an error. Please see the code below for the above code in use.
try:
    import apache_beam as beam
except ImportError:
    beam = DummyBeam()

class SomeFn(beam.DoFn):
    pass

class SomeOtherFn(beam.Pipeline):
    pass

SomeFn()
In the code above, accessing beam.DoFn currently raises an error, but what I want is for accessing beam.DoFn not to raise, while calling SomeFn() does raise. I also tried replacing __getattribute__ with __getattr__, but that doesn't give the results I expected: it doesn't raise an error when calling SomeFn(), although all the code runs fine.
Thanks for looking into this.
As shown in the traceback (which you should have posted, FWIW), your error is not in calling SomeFn() but in accessing beam.DoFn in the SomeFn class definition. And the reason is quite obvious: you very explicitly instructed Python to do so by plainly overriding DummyBeam.__getattribute__.
Note that object.__getattribute__ is the official default implementation of attribute lookup (it's invoked each time Python sees either obj.name or getattr(obj, "name")), and that it's a method that is better left alone unless you fully understand the implications of overriding it AND have no better solution.
In this case the obvious solution is to instead implement __getattr__, which is only invoked by __getattribute__ as a last resort, when the attribute couldn't be resolved any other way. You say that:
Also replace getattribute with getattr was not working
but I just tried it on your code snippet and it (of course) yielded the result I expected. Whether this is what you expected is another question, but since you neither posted this version of your code nor explained how it "was not working", you can't expect any answer on this point (hint: "is not working" is a totally useless description of an issue).
As a last note:
it will successfully load all beam methods like DoFn and pipeline
...
In above code for now calling beam.DoFn
It seems you're a bit confused about terminology. DoFn and Pipeline are classes, not methods, and (as already mentioned) your error is raised when accessing beam.DoFn, not when calling it.
EDIT:
by was not working I meant it not gives me error either when i am trying to access beam.DoFn or SomeFn() when use getattr instead getattribute
(...)
what i want is to raise error when calling someFn no accessing beam.DoFn
Ok, it looks that you don't quite get the execution order of a method call expression. When you do
obj.method()
this is actually a shortcut for
method = obj.__getattribute__("method")
method.__call__()
So overriding __getattribute__ isn't the proper solution (cf above), and defining __getattr__ is useless here - your DummyBeam class HAS DoFn and Pipeline attributes so __getattr__ will just not be invoked for those names.
Now, the reason you don't get any exception when calling beam.DoFn or beam.Pipeline is that those names are bound to your Empty class itself, not to instances of that class, so you never actually invoke Empty.__call__. The __call__ method defined in a class is only used when an instance of that class is called, not when you instantiate the class (in which case it's the metaclass's __call__ method which is invoked):
>>> class MyCallable:
... def __init__(self):
... print("in MyCallable.__init__")
... def __call__(self):
... print("in MyCallable.__call__")
...
>>> c = MyCallable()
in MyCallable.__init__
>>> c()
in MyCallable.__call__
>>>
So, if what you want is to raise your exception when someone tries to instantiate DoFn or Pipeline, you either have to make them instances of Empty, or keep them as they are and rename Empty.__call__ to Empty.__new__, which is the first method called by type.__call__ (type being the default metaclass for all classes).
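Here is a runnable sketch of the second option, reusing the names from the question: with the raise moved into Empty.__new__ (and the __getattribute__ override removed), merely accessing or subclassing beam.DoFn is fine, while instantiating SomeFn raises.

```python
import types

_MSG = ("Failed importing {name}. Please install {name}."
        " Using pip install {name}")

class Empty(object):
    """Placeholder whose instantiation (not access) fails."""
    def __new__(cls, *args, **kwargs):
        # type.__call__ runs __new__ first, so SomeFn() raises here,
        # while merely accessing or subclassing beam.DoFn stays silent.
        raise ImportError(_MSG.format(name="apache_beam"))

class DummyBeam(types.ModuleType):
    DoFn = Empty
    Pipeline = Empty

    def __init__(self, name="apache_beam"):
        super(DummyBeam, self).__init__(name)

beam = DummyBeam()

class SomeFn(beam.DoFn):  # attribute access and subclassing: no error
    pass

try:
    SomeFn()              # instantiation: ImportError
except ImportError as exc:
    print(exc)
```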
I am using PyCharm IDE and below is a fragment of code where I am using a decorator. The decorator is basically checking whether the argument extract is an integer >= 0.
This code is working as far as I can tell; however, I noticed some warnings in PyCharm.
import inspect

class MyClass(object):
    def _argument_test_extract(func):
        def _helper(*args, **kwargs):
            kwargs = inspect.getcallargs(func, *args, **kwargs)
            if 'rule' in kwargs:
                extract = kwargs['rule']['extract']
                if type(extract) == int and extract >= 0:
                    return func(**kwargs)
                else:
                    raise Exception("Argument `extract` is not an integer")
        return _helper

    @_argument_test_extract
    def _perform_split_model_string(self, rule):
        # do some stuff
        pass
PyCharm indicates the following message on the line where I decorate the function with @_argument_test_extract:
Function '_argument_test_extract' lacks a positional argument
The line where the decorator function is defined def _argument_test_extract(func): indicates the following message
Usually first parameter of a function is 'self'
Finally the line return func(**kwargs) indicates the following message:
'MyClass' is not callable
Obviously, if I decorate the function _argument_test_extract with @staticmethod, all the PyCharm warnings disappear, but the code no longer works because of this error: TypeError: 'staticmethod' object is not callable.
Is there something wrong with my syntax? Thanks
Is there something wrong with my syntax
Well... if it works, apparently not :-). There's probably something wrong with PyCharm's static analysis.
With that said, there is something weird about what you're doing. PyCharm notices that (after class creation) _argument_test_extract is going to become a method of the class. As a method of the class, _argument_test_extract will require an argument. However, during class creation, _argument_test_extract isn't yet a method, and that is when the decoration occurs -- so it works. (As you've noted, @staticmethod doesn't solve the problem because the staticmethod descriptor itself isn't callable, only the function that it returns from __get__.)
So, where does that leave us? You can continue with what you have and just ignore PyCharm, or you can move the decorator out of the class and make it module-level. It really doesn't need to be in the class in the first place :-). I would argue that moving it out of the class is the better solution, because lots of people reading your code are going to wonder where self is and how this whole thing doesn't blow up every time you try to execute it.
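A sketch of the module-level version (same check as in the question, with isinstance swapped in for the type comparison, and slightly tightened so a missing rule argument also raises):

```python
import inspect

def _argument_test_extract(func):
    """Plain module-level function: PyCharm no longer treats it as a method."""
    def _helper(*args, **kwargs):
        # Resolve positional and keyword arguments into a single mapping.
        kwargs = inspect.getcallargs(func, *args, **kwargs)
        if 'rule' in kwargs:
            extract = kwargs['rule']['extract']
            if isinstance(extract, int) and extract >= 0:
                return func(**kwargs)
        raise Exception("Argument `extract` is not an integer")
    return _helper

class MyClass(object):
    @_argument_test_extract
    def _perform_split_model_string(self, rule):
        return rule['extract']

print(MyClass()._perform_split_model_string({'extract': 2}))  # 2
```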
So I have a simple custom error class in Python that I created based on the Python 2.7 documentation:
class InvalidTeamError(Exception):
    def __init__(self, message='This user belongs to a different team'):
        self.message = message
This gives me warning W0231: __init__ method from base class %r is not called in PyLint, so I go look it up and am given the very helpful description of "explanation needed." I'd normally just ignore this warning, but I have noticed that a ton of code online includes a call to super at the beginning of the __init__ method of custom error classes. So my question is: does doing this actually serve a purpose, or is it just people trying to appease a bogus pylint warning?
This was a valid pylint warning: by not using the superclass __init__ you can miss out on implementation changes in the parent class. And, indeed, you have - because BaseException.message has been deprecated as of Python 2.6.
Here would be an implementation which will avoid your warning W0231 and will also avoid python's deprecation warning about the message attribute.
class InvalidTeamError(Exception):
    def __init__(self, message='This user belongs to a different team'):
        super(InvalidTeamError, self).__init__(message)
This is a better way to do it, because the implementation for BaseException.__str__ only considers the 'args' tuple, it doesn't look at message at all. With your old implementation, print InvalidTeamError() would have only printed an empty string, which is probably not what you wanted!
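A quick demonstration of the difference (Python 3 print syntax; the class names here are hypothetical, showing the two __init__ variants side by side):

```python
class OldError(Exception):
    def __init__(self, message='This user belongs to a different team'):
        self.message = message        # args stays empty: ()

class NewError(Exception):
    def __init__(self, message='This user belongs to a different team'):
        super(NewError, self).__init__(message)  # message lands in args

# BaseException.__str__ only looks at args, never at .message:
print(repr(str(OldError())))   # ''
print(repr(str(NewError())))   # 'This user belongs to a different team'
```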
Looking at the CPython 2.7 source code, there should be no problem avoiding that call to the super __init__, and yes, it's done just because it's generally good practice to call the base class __init__ in your __init__.
https://github.com/python/cpython/blob/master/Objects/exceptions.c (see line 60 for the BaseException init and line 456 for how Exception derives from BaseException).
I am working on creating a child class that inherits from PySndfile that would contain extra functions related to the PySndfile class.
However, when initialising the child class with a keyword argument not intended for the PySndfile class, it appears that arguments are sent straight to the parent class, bypassing the child's __init__ altogether (i.e. the print statement isn't called, the argument isn't popped from kwargs, and the traceback doesn't refer to the child class at all).
Any help would be much appreciated.
ClassProblem.py:
from pysndfile import PySndfile

class AudioFile(PySndfile):
    def __init__(self, filename, mode, rms="123"):
        print 'AudioFile init called'
        self.rms = rms
        super(AudioFile, self).__cinit__(
            self,
            filename,
            mode=mode,
            format=None,
            channels=None,
            samplerate=None
        )

aa = AudioFile(
    "/path/to/audio/file.wav",
    'r',
    rms="456",
)
Error produced:
Traceback (most recent call last):
File "ClassProblem.py", line 18, in <module>
rms="456",
File "_pysndfile.pyx", line 564, in _pysndfile.PySndfile.__cinit__ (_pysndfile.cpp:3308)
TypeError: __cinit__() got an unexpected keyword argument 'rms'
It seems you'll have to override __new__, according to the cython documentation:
If you anticipate subclassing your extension type in Python, you may find it useful to give the __cinit__() method * and ** arguments so that it can accept and ignore extra arguments. Otherwise, any Python subclass which has an __init__() with a different signature will have to override __new__() [1] as well as __init__(), which the writer of a Python class wouldn't expect to have to do.
PySndfile appears to be a Cython library. I don't know anything at all about Cython, but it appears from the traceback that it uses __cinit__ rather than __init__. You should probably override that method instead.
You are effectively passing **kwargs up to the PySndfile constructor but it doesn't know what to do with the rms argument which is specific to your class. I don't have PySndfile locally but after a quick look online, I see the pyx constructor looks like this:
def __cinit__(self, filename, mode='r', int format=0,
              int channels=0, int samplerate=0):
This is not an *args, **kwargs function, able to take any arguments, rather it has a fixed signature. I would suggest updating your class with a similar constructor to which you can add the arguments specific to your class:
def __init__(self, filename, mode='r', rms=0, format=0,
             channels=0, samplerate=0):
Then pass all the arguments to the super constructor (except rms, which it does not understand).
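The pattern, sketched with a stand-in base class since PySndfile may not be available (and a real Cython extension type with a fixed __cinit__ signature may additionally require overriding __new__, as the other answers note):

```python
class FakeSndfile(object):
    """Stand-in for PySndfile, just to make the sketch runnable."""
    def __init__(self, filename, mode='r', format=0, channels=0, samplerate=0):
        self.filename = filename
        self.mode = mode

class AudioFile(FakeSndfile):
    # Mirror the parent's fixed signature and add rms on top; only the
    # arguments the parent understands are forwarded to it.
    def __init__(self, filename, mode='r', rms=0, format=0,
                 channels=0, samplerate=0):
        self.rms = rms
        super(AudioFile, self).__init__(filename, mode=mode, format=format,
                                        channels=channels, samplerate=samplerate)

af = AudioFile('file.wav', 'r', rms=456)
print(af.rms, af.mode)  # 456 r
```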
I found the answer by posting this question to the /r/learnpython subreddit.
The user yes_or_gnome provides a great explanation of why I needed to override the __new__ method as well as __init__, and has provided example code.