Rectangle Enclosure (Python3)

I am trying to create a solution to this problem, with my code focusing on looping (for and while):
Let parameter Q be a list of "rectangles", specified by width, height,
and (x,y)-location of the lower left corner. Enclosure(Q) returns the
smallest rectangle that contains each rectangle in Q. A rectangle is
represented by a tuple (x, y, width, height).
Here is my code so far:
def Enclosure(Q):
    c = []
    d = []
    for (x, y, width, height) in sorted(Q):
        c.append(x)
        d.append(y)
    print(c)
    print(min(c))
    e = tuple([min(c)] + [min(d)])
    print(e)
Ignore the print statements; those are only there for debugging purposes. My code to this point creates a tuple of the (x,y) coordinates of the lower left corner of the enclosing rectangle that I am trying to have the program form. But after this point, I have absolutely no idea how to find the (width, height) of the enclosing rectangle. How can I do that?
Also, here is an example of what the program should do when it runs:
R1 = (10,20,5,100) #R1,R2,R3 are each rectangles
R2 = (15,5,30,150) #layout of the tuple:(x,y,width,height)
R3 = (-4,30,20,17)
Enclosure([R1,R2,R3])
(-4, 5, 49, 150) #rectangle that encloses R1,R2,R3

Try this:
def Enclosure(Q):
    c = []
    d = []
    x2 = []
    y2 = []
    for (x, y, width, height) in sorted(Q):
        c.append(x)
        d.append(y)
        x2.append(width + x)
        y2.append(height + y)
    e = tuple([min(c)] + [min(d)] + [max(x2) - min(c)] + [max(y2) - min(d)])
    print(e)
You could replace the e = tuple(...) line with:
e=(min(c), min(d), max(x2) - min(c), max(y2) - min(d)) # This makes a tuple directly
to avoid first making a list and then converting it to a tuple, as you did.
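As a quick check, here is a minimal sketch of the same idea written to return the tuple instead of printing it, run against the rectangles from the question (the list comprehensions are just a compact rewrite of the loops above):
def Enclosure(Q):
    # Collect the extreme coordinates of all rectangles.
    xs = [x for (x, y, w, h) in Q]
    ys = [y for (x, y, w, h) in Q]
    rights = [x + w for (x, y, w, h) in Q]
    tops = [y + h for (x, y, w, h) in Q]
    # Lower-left corner is the minimum x/y; width/height span to the farthest opposite corner.
    return (min(xs), min(ys), max(rights) - min(xs), max(tops) - min(ys))

R1 = (10, 20, 5, 100)
R2 = (15, 5, 30, 150)
R3 = (-4, 30, 20, 17)
print(Enclosure([R1, R2, R3]))  # (-4, 5, 49, 150)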

Make sure you can explain why you used inheritance or your teacher will know you are cheating:
class Geometry(object):
    def __init__(self, x, y):
        self.x = x
        self.y = y

class Point(Geometry):
    def __repr__(self):
        return "<Point at {s.x},{s.y}>".format(s=self)

class Rectangle(Geometry):
    def __init__(self, x, y, width, height):
        super(Rectangle, self).__init__(x, y)
        self.width = width
        self.height = height

    @property
    def oposite_corner(self):
        return Point(self.x + self.width, self.y + self.height)

    def __repr__(self):
        return "<Rectangle of {s.width}x{s.height} at {s.x},{s.y}>".format(s=self)

def enclosing_rectangle(rect_list):
    x1, y1, x2, y2 = (
        min([r.x for r in rect_list]),
        min([r.y for r in rect_list]),
        max([r.oposite_corner.x for r in rect_list]),
        max([r.oposite_corner.y for r in rect_list]),
    )
    return Rectangle(x1, y1, x2 - x1, y2 - y1)
Please test and fix any bug by yourself (hint: super changed):
>>> l = [Rectangle(1, 2, 3, 4), Rectangle(-1, -1, 1, 1), Rectangle(3, 4, 1, 1)]
>>> enclosing_rectangle(l)
<Rectangle of 5x7 at -1,-1>
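Side note: the @property decorator on oposite_corner is what lets enclosing_rectangle write r.oposite_corner.x without calling it as a method; if it is pasted as #property the code breaks. A one-line check (a sketch, not part of the original answer):
>>> Rectangle(1, 2, 3, 4).oposite_corner
<Point at 4,6>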


How can I use an element of a python array that is an instance of an object in a function by using the index into the array?

I have a task wherein I am to determine if a Point(x,y) is closer than some amount to any of the Points that are stored in a Python array. Here is the test code:
from point import *

collection = []
p1 = Point(3,4)
collection.append(p1)
print(collection)
p2 = Point(3,0)
collection.append(p2)
print(collection)
p3 = Point(3,1)
radius = 1

print( collection[1] )  # This works, BTW
p = collection[1]
print( p )              # These two work also!

for i in collection:
    p = collection[i]   # THIS FAILS
    if distance(p3,p) < 2*radius:
        print("Point "+collection[i]+" is too close to "+p3)
The file point.py contains:
import math

class Point:
    '''Creates a point on a coordinate plane with values x and y.'''
    COUNT = 0

    def __init__(self, x, y):
        '''Defines x and y variables'''
        self.X = x
        self.Y = y

    def move(self, dx, dy):
        '''Determines where x and y move'''
        self.X = self.X + dx
        self.Y = self.Y + dy

    def __str__(self):
        return "Point(%s,%s)"%(self.X, self.Y)

    def __str__(self):
        return "(%s,%s)"%(self.X,self.Y)

def testPoint(x=0, y=0):
    '''Returns a point and distance'''
    p1 = Point(3, 4)
    print(p1)
    p2 = Point(3,0)
    print(p2)
    return math.hypot(p1, p2)

def distance(self, other):
    dx = self.X - other.X
    dy = self.Y - other.Y
    return math.sqrt(dx**2 + dy**2)

#p1 = Point(3,4)
#p2 = Point(3,0)
#print ("p1 = %s"%p1)
#print ("distance = %s"%(distance(p1, p2)))
Now, I have a couple of questions here to help me understand.
1. In the test case, why doesn't printing the list use the __str__ function to print each Point out as '(x,y)'?
2. In if distance(p3, collection[i]), why isn't collection[i] recognized as a Point, which the distance function is expecting?
3. In the p = collection[i] statement, why does Python complain that list indices must be integers or slices, not Point?
It appears that the collection list is not recognized as a list of Point instances. I'm confused, as in other OO languages like Objective-C or Java these are simple things to do.
1. Take a look at this question. __repr__() is used when rendering things in lists.
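A minimal sketch of what that means for your class (the trimmed Point below is a hypothetical reduction of your point.py, not the full file): lists render their elements with repr(), so defining __repr__ (or aliasing it to __str__) makes print(collection) readable.
class Point:
    def __init__(self, x, y):
        self.X = x
        self.Y = y
    def __str__(self):
        return "(%s,%s)" % (self.X, self.Y)
    __repr__ = __str__   # lists call repr() on their elements, not str()

print([Point(3, 4), Point(3, 0)])   # -> [(3,4), (3,0)] instead of <point.Point object at 0x...>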
2. (and 3.) I'm not sure if I follow your questions, but the problem you have in your code is that Python hands you the object itself, not the index. So:
for i in collection:
    p = collection[i]   # THIS FAILS
    if distance(p3,p) < 2*radius:
        print("Point "+collection[i]+" is too close to "+p3)
should be:
for p in collection:
    if distance(p3, p) < 2*radius:
        print(f"Point {p} is too close to {p3}")

Can't seem to run this code. Trying to use tuples in class objects. I'm new to Python. Can anyone point me in the right direction?

class Line:
    def __init__(self, coor1, coor2):
        self.coor1 = coor1
        self.coor2 = coor2

    def distance(self):
        x1, y1 = self.coor1
        x2, y2 = self.coor2
        return ((x2-x1)**2 + (y2-y1)**2)**0.5

    def slope(self):
        x1, y1 = self.coor1
        x2, y2 = self.coor2
        return ((y2-y1)/(x2-x1))

coor1 = (3, 2)
coor2 = (8, 10)
li = Line(coor1, coor2)
li.distance()
Try this one in a Jupyter notebook:
print("result: {}".format(li.distance()))

finding distance between two points in python, passing inputs via two different objects

I have to write code to find the distance between two points, passing values via two objects as below.
But I am getting TypeError: __init__() missing 3 required positional arguments: 'x', 'y', and 'z'
class Point:
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z

    def __str__(self):
        return '(point: {},{},{})'.format(self.x, self.y, self.z)

    def distance(self, other):
        return sqrt( (self.x-other.x)**2 + (self.y-other.y)**2 + (self.z-other.z)**2 )

p = Point()
p1 = Point(12, 3, 4)
p2 = Point(4, 5, 6)
p3 = Point(-2, -1, 4)
print(p.distance(p1,p3))
The problem comes from this line:
p = Point()
When you defined your class, you specified that it has to be passed 3 parameters to be initialised (def __init__(self, x, y, z)).
If you still want to be able to create this Point object without passing those 3 parameters, you can make them optional like this:
def __init__(self, x=0, y=0, z=0):
    self.x = x
    self.y = y
    self.z = z
This way, if you do not specify these parameters (as in your p = Point() call), it will create a point with coordinates (0, 0, 0) by default.
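For example, with those defaults both calls below work (a small sketch; it assumes from math import sqrt so that distance runs):
p = Point()           # no arguments -> the default point (0, 0, 0)
p1 = Point(12, 3, 4)  # explicit coordinates still work
print(p)              # (point: 0,0,0)
print(p.distance(p1)) # 13.0, the distance from the origin to (12, 3, 4)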
You are not passing the required 3 arguments for p = Point(). Here is your code with the errors fixed:
from math import sqrt

class Point:
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z

    def __str__(self):
        return '(point: {},{},{})'.format(self.x, self.y, self.z)

    def distance(self, other):
        return sqrt( (self.x-other.x)**2 + (self.y-other.y)**2 + (self.z-other.z)**2 )

# p = Point()  # not required
p1 = Point(12, 3, 4)
p2 = Point(4, 5, 6)
p3 = Point(-2, -1, 4)
print(p1.distance(p3))        # use this to find the distance between p1 and any other point
# or use this
print(Point.distance(p1, p3))
import math

class Point:
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z

    def __str__(self):
        return '(point: {},{},{})'.format(self.x, self.y, self.z)

    def distance(self, other):
        return math.sqrt( (self.x-other.x)**2 + (self.y-other.y)**2 + (self.z-other.z)**2 )

p1 = Point(12, 3, 4)
p2 = Point(4, 5, 6)
p3 = Point(-2, -1, 4)
print(Point.distance(p1, p3))
It works like this. You should not define a point p separate from the other three points; every point is a separate instance. When you want to use the function on two existing points, just call it through the class.

How to determine collision in Python the right way

Here I have made code to create random-sized bubbles which can be destroyed by collision with another object:
import tkinter

window = tkinter.Tk()
window.title("...")
c = tkinter.Canvas(width=800, height=500, bg="...")
ojct_id1 = c.create_polygon(...)
ojct_id2 = c.create_oval(...)  # A polygon in an oval should constitute the object

def move_ojct(event):
    ...

from random import randint
bubbles = list()
bubbles_r = list()  # radius
bubbles_speed = list()

def create_bub():
    ...

def move_bubbles():
    ...

from time import sleep
while True:
    if randint(1, 10) == 1:
        create_bub()
    move_bubbles()
    window.update()
    sleep(0.01)
The following code determines the position of any bubble; that helps to find out collisions:
def hole_coord(id_num):
    pos = c.coords(id_num)
    x = (pos[0] + pos[2])/2
    y = (pos[1] + pos[3])/2
    return x, y
Now I have to make a function for deleting bubbles:
def del_bubbles(i):
    del bubbles_r[i]
    del bubbles_speed[i]
    c.delete(bubbles[i])
    del bubbles[i]
The following code determines if the two objects are colliding:
from math import sqrt

def distance(id1, id2):
    x1, y1 = hole_coord(id1)
    x2, y2 = hole_coord(id2)
    return sqrt((x2 - x1)/2 + (y2 - y1)/2)

def collision():
    for bub in range(len(bubbles)-1, -1, -1):
        if distance(ojct_id2, bubbles[bub]) < (15 + bubbles_r[bub]):
            del_bubbles(bub)
Something is wrong here: bubbles get deleted without a hit, but when they are hit they often don't get deleted. Can anybody help me? Thanks!
You are not computing the Euclidean distance correctly.
def distance(id1, id2):
    x1, y1 = hole_coord(id1)
    x2, y2 = hole_coord(id2)
    return sqrt((x2 - x1)/2 + (y2 - y1)/2)    # <----- this is wrong
should be
def distance(id1, id2):
    x1, y1 = hole_coord(id1)
    x2, y2 = hole_coord(id2)
    return sqrt((x2 - x1)**2 + (y2 - y1)**2)  # <----- square instead of halve
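Equivalently, math.hypot computes the same Euclidean distance without writing the squares out by hand (a minor stylistic alternative, not required for the fix):
from math import hypot

def distance(id1, id2):
    x1, y1 = hole_coord(id1)
    x2, y2 = hole_coord(id2)
    return hypot(x2 - x1, y2 - y1)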

Crop a collection of lines using another collection of lines

First of all, here is a quick graphical description of what I intend to do. I will use Python, but feel free to use pseudo code in your answers.
I have 2 collections of 2D segments, stored as the following: [ [start_point, end_point], [...] ].
For the first step, I have to detect each segment of the blue collection which is colliding with the black collection. For this I use LeMothe's line/line intersection algorithm.
Then comes my first problem: having a segment AB which intersects my line at C, I don't know how to determine in code whether I have to trim my segment to AC or to CB, i.e. I don't know which part of my segment I need to keep and which part I need to remove.
Then for the second step, I really have no idea how to achieve this.
Any help would be greatly appreciated, so thank you in advance!
The second step is trivial once you figure what to keep and what not, you just need to keep track of the segments you clipped and see where they were originally joined (e.g. assume that the segments are in order and form a connected line).
On the other hand, given that your black line is in fact a line and not a polygon, in your first step, choosing what is "outside" and what is "inside" seems completely arbitrary; is it possible to close that into a polygon? Otherwise, you may need to artificially create two polygons (one for each side of the line) and then do clipping inside those polygons. You could use something like the Cyrus and Beck line clipping algorithm (see this tutorial for an overview: https://www.tutorialspoint.com/computer_graphics/viewing_and_clipping.htm)
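One concrete way to make that inside/outside decision, once you have picked which side of the black line counts as "inside", is the sign of a 2D cross product; this is the same test the is_inside helper in the code below relies on. A small sketch (the side helper is hypothetical, not taken from the answers):
def side(a, b, q):
    # > 0 if q lies to the left of the directed line a->b, < 0 if to the right, 0 if on it
    return (b[0] - a[0]) * (q[1] - a[1]) - (b[1] - a[1]) * (q[0] - a[0])

# For a blue segment AB cut by the black line at C: keep the endpoint whose side()
# sign matches the side you chose as "inside", and replace the other endpoint with C.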
Feel free to use any of the code below as a starting point (you have an intersect function and some useful classes). It implements Sutherland-Hodgman polygon clipping and Cohen-Sutherland line clipping.
class Point2(object):
    """Structure for a 2D point"""
    def __init__(self, x=0, y=0):
        self.x = x
        self.y = y
    def __copy__(self):
        return self.__class__(self.x, self.y)
    copy = __copy__
    def __repr__(self):
        return 'Point2(%d, %d)' % (self.x, self.y)
    def __getitem__(self, key):
        return (self.x, self.y)[key]
    def __setitem__(self, key, value):
        l = [self.x, self.y]
        l[key] = value
        self.x, self.y = l
    def __eq__(self, other):
        if isinstance(other, Point2):
            return self.x == other.x and \
                   self.y == other.y
        else:
            assert hasattr(other, '__len__') and len(other) == 2
            return self.x == other[0] and \
                   self.y == other[1]
    def __ne__(self, other):
        return not self.__eq__(other)
    def __nonzero__(self):
        return self.x != 0 or self.y != 0
    def __len__(self):
        return 2

class Line2(object):
    """Structure for a 2D line"""
    def __init__(self, pt1, pt2):
        self.pt1, self.pt2 = pt1, pt2
    def __repr__(self):
        return 'Line2(%s, %s)' % (self.pt1, self.pt2)

class Polygon2(object):
    def __init__(self, points):
        self.points = points
    def __repr__(self):
        return '[\n  %s\n]' % '\n  '.join([str(i) for i in self.points])
    def lines(self):
        lines = []
        e = self.points[-1].copy()
        for p in self.points:
            lines.append(Line2(e, p))
            e = p.copy()
        return lines
        #return [Line2(a,b) for a,b in zip(self.points, self.points[1:] + [self.points[0]])]
    def __copy__(self):
        return self.__class__(list(self.points))
    copy = __copy__

class Renderer(object):
    """Rendering algorithm implementations"""
    def __init__(self, world, img, color=1):
        self.world, self.img, self.color = world, img, color
    def transform(self, s, r, m, n):
        """Homogeneous transformation operations"""
        for i in self.world.points():
            j = Matrix3.new_translate(m, n) * Matrix3.new_rotate(r) * Matrix3.new_scale(s) * i
            i.x, i.y = j.x, j.y
    def clip(self, a, b, c, d):
        """Clipping for the world window defined by a,b,c,d"""
        self.clip_lines(a, b, c, d)
        self.clip_polygons(a, b, c, d)
    def shift(self, a, b, c, d):
        """Shift the world window"""
        for i in self.world.points():
            i.x -= a
            i.y -= b
    def clip_lines(self, a, b, c, d):
        """Clipping for lines (i.e. open polygons)"""
        clipped = []
        for i in self.world.lines:
            clipped += [self.clip_lines_cohen_sutherland(i.pt1, i.pt2, a, b, c, d)]
        self.world.lines = [i for i in clipped if i]
    def clip_polygons(self, a, b, c, d):
        """Clipping for polygons"""
        polygons = []
        for polygon in self.world.polygons:
            new_polygon = self.clip_polygon_sutherland_hodgman(polygon, a, b, c, d)
            polygons.append(new_polygon)
        self.world.polygons = polygons
    def clip_polygon_sutherland_hodgman(self, polygon, xmin, ymin, xmax, ymax):
        edges = [Line2(Point2(xmax, ymax), Point2(xmin, ymax)),  # top
                 Line2(Point2(xmin, ymax), Point2(xmin, ymin)),  # left
                 Line2(Point2(xmin, ymin), Point2(xmax, ymin)),  # bottom
                 Line2(Point2(xmax, ymin), Point2(xmax, ymax)),  # right
                 ]
        def is_inside(pt, line):
            # uses the determinant of the vectors (AB, AQ), Q(X,Y) is the query
            # left is inside
            det = (line.pt2.x - line.pt1.x) * (pt.y - line.pt1.y) - (line.pt2.y - line.pt1.y) * (pt.x - line.pt1.x)
            return det >= 0
        def intersect(pt0, pt1, line):
            x1, x2, x3, x4 = pt0.x, pt1.x, line.pt1.x, line.pt2.x
            y1, y2, y3, y4 = pt0.y, pt1.y, line.pt1.y, line.pt2.y
            x = ((x1*y2-y1*x2)*(x3-x4)-(x1-x2)*(x3*y4-y3*x4)) / ((x1-x2)*(y3-y4)-(y1-y2)*(x3-x4))
            y = ((x1*y2-y1*x2)*(y3-y4)-(y1-y2)*(x3*y4-y3*x4)) / ((x1-x2)*(y3-y4)-(y1-y2)*(x3-x4))
            return Point2(int(x), int(y))
        polygon_new = polygon.copy()
        for edge in edges:
            polygon_copy = polygon_new.copy()
            polygon_new = Polygon2([])
            s = polygon_copy.points[-1]
            for p in polygon_copy.points:
                if is_inside(s, edge) and is_inside(p, edge):
                    polygon_new.points.append(p)
                elif is_inside(s, edge) and not is_inside(p, edge):
                    polygon_new.points.append(intersect(s, p, edge))
                elif not is_inside(s, edge) and not is_inside(p, edge):
                    pass
                else:
                    polygon_new.points.append(intersect(s, p, edge))
                    polygon_new.points.append(p)
                s = p
        return polygon_new
    def clip_lines_cohen_sutherland(self, pt0, pt1, xmin, ymin, xmax, ymax):
        """Cohen-Sutherland clipping algorithm for line pt0 to pt1 and clip rectangle with diagonal from (xmin,ymin) to (xmax,ymax)."""
        TOP = 1
        BOTTOM = 2
        RIGHT = 4
        LEFT = 8
        def ComputeOutCode(pt):
            code = 0
            if pt.y > ymax: code += TOP
            elif pt.y < ymin: code += BOTTOM
            if pt.x > xmax: code += RIGHT
            elif pt.x < xmin: code += LEFT
            return code
        accept = False
        outcode0, outcode1 = ComputeOutCode(pt0), ComputeOutCode(pt1)
        while True:
            if outcode0 == outcode1 == 0:
                accept = True
                break
            elif outcode0 & outcode1:
                accept = False
                break
            else:
                # Failed both tests, so calculate the line segment to clip from an outside point to an intersection with the clip edge.
                outcodeOut = outcode0 if not outcode0 == 0 else outcode1
                if TOP & outcodeOut:
                    x = pt0.x + (pt1.x - pt0.x) * (ymax - pt0.y) / (pt1.y - pt0.y)
                    y = ymax
                elif BOTTOM & outcodeOut:
                    x = pt0.x + (pt1.x - pt0.x) * (ymin - pt0.y) / (pt1.y - pt0.y)
                    y = ymin
                elif RIGHT & outcodeOut:
                    y = pt0.y + (pt1.y - pt0.y) * (xmax - pt0.x) / (pt1.x - pt0.x)
                    x = xmax
                elif LEFT & outcodeOut:
                    y = pt0.y + (pt1.y - pt0.y) * (xmin - pt0.x) / (pt1.x - pt0.x)
                    x = xmin
                if outcodeOut == outcode0:
                    pt0 = Point2(x, y)
                    outcode0 = ComputeOutCode(pt0)
                else:
                    pt1 = Point2(x, y)
                    outcode1 = ComputeOutCode(pt1)
        if accept:
            return Line2(pt0, pt1)
        else:
            return False
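A minimal, hypothetical smoke test of just the Sutherland-Hodgman part (the Renderer constructor wants a world and an image, which this particular method never touches, so None placeholders are passed):
r = Renderer(world=None, img=None)
square = Polygon2([Point2(0, 0), Point2(10, 0), Point2(10, 10), Point2(0, 10)])
clipped = r.clip_polygon_sutherland_hodgman(square, 2, 2, 8, 8)
print(clipped)   # the square clipped to the window 2 <= x,y <= 8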
I think what you'll need to do is find a line from the center of the blue object to the line segment in question. If that new line from the center to the segment AB or BC hits a black line on its way to the blue line segment, then that segment is outside and is trimmed. You would want to check this at a point between A and B or between B and C, so that you don't hit the intersection point.
As for the Python aspect, I would recommend defining a line class with some midpoint attributes, and a shape class that is made up of lines and has a center attribute. (Actually, come to think of it, a line would then count as a shape, so you could make Line a child class of the shape class and reuse code.) That way you can write methods that compare two lines as part of each object:
line_a = Line((4,2),(6,9))
line_b = Line((8,1),(2,10))
line_a.intersects(line_b)  # could return a Boolean, or the point of intersection
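A sketch of the kind of Line class described above (the class, its midpoint attribute, and the intersects method are hypothetical illustrations, not code from the question):
class Line:
    def __init__(self, start, end):
        self.start, self.end = start, end
        self.midpoint = ((start[0] + end[0]) / 2, (start[1] + end[1]) / 2)

    def intersects(self, other):
        """True if the two segments properly cross each other."""
        def side(a, b, q):
            return (b[0] - a[0]) * (q[1] - a[1]) - (b[1] - a[1]) * (q[0] - a[0])
        d1 = side(self.start, self.end, other.start)
        d2 = side(self.start, self.end, other.end)
        d3 = side(other.start, other.end, self.start)
        d4 = side(other.start, other.end, self.end)
        return d1 * d2 < 0 and d3 * d4 < 0

line_a = Line((4, 2), (6, 9))
line_b = Line((8, 1), (2, 10))
print(line_a.intersects(line_b))   # True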
In my mind that just feels like a really comfortable way to go about this problem since it lets you keep track of what everything's doing.
