Can I put a condition for y-index in numpy.where? - python

I have a 2D numpy array taken from a segmentation, so it's an image like the one linked below:
https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQwYeYOHk0xUJ6vBd_g8Xn1LxMON0g2qHpf_TPJx6h7IM5nG2OXeKtDuCcjgN9mqFtLB5c&usqp=CAU
The colours you see mean that each cell of my array can only take a value in a limited range (e.g., green is 5, orange is 7, ...). Now I would like to change all the cells that contain a 5 (green) and whose y-coordinate is below a value I choose (e.g., only apply that second condition up to row 400). What's the most efficient way to do this?
I guess that you can use something like:
np.where(myarray == 5, myarray, valueIwant)
but I would also need to apply the condition on the y-index...

Your current example does the opposite of what you want: it keeps the 2s and replaces everything else:
a = np.array([1, 1, 2, 2, 3, 3])
np.where(a==2, a, 7)
produces:
array([7, 7, 2, 2, 7, 7])
If you want to replace 2 with some other value:
array([1, 1, 7, 7, 3, 3])
you can do this:
np.where(a==2, 7, a)
or
a[a==2] = 7
To replace only up to a certain index (starting from the original a again), operate on a slice; the slice is a view, so assigning into it modifies a:
sub_array = a[:3]
sub_array[sub_array==2] = 7
a
array([1, 1, 7, 2, 3, 3])
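Applied to the 2D array from the original question, the same idea works row-wise: slice the rows you care about (a view, not a copy) and assign into it. A minimal sketch, assuming the first axis of myarray is the y-axis and valueIwant is the replacement value:
sub = myarray[:400]           # rows 0..399 only; this slice is a view
sub[sub == 5] = valueIwant    # replace the 5s in those rows; myarray is modified in place
If you prefer to stay with np.where, you can build an explicit row condition and broadcast it:
rows = np.arange(myarray.shape[0])[:, None]   # column vector of row indices
result = np.where((myarray == 5) & (rows < 400), valueIwant, myarray)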

Related

How to reverse a numpy array and then also switch each 'pair' of positions?

For example, how would you do this sequence of operations on a 1D numpy array, x:
[1,2,3,4,5,6,7,8]
[8,7,6,5,4,3,2,1]
[7,8,5,6,3,4,1,2]
The transition from state 1 to state 2 can be done with numpy.flip(x):
x = numpy.flip(x)
How can you go from this intermediate state to the final state, in which each 'pair' of positions swaps places?
Notes: this is a variable-length array, and it will always be 1D.
Assuming the length is always even, you only need to reshape, reverse and flatten:
>>> ar = np.arange(1, 9)
>>> ar.reshape(-1, 2)[::-1].ravel()
array([7, 8, 5, 6, 3, 4, 1, 2])
This always creates a copy, because the elements of the original array cannot remain contiguous after this transformation, and ndarray.ravel() returns a copy whenever it cannot produce a contiguous view.
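A quick way to confirm the copy, if you want, is np.shares_memory:
>>> out = ar.reshape(-1, 2)[::-1].ravel()
>>> np.shares_memory(ar, out)
False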
If you need to go from state 2 to state 3:
>>> ar = ar[::-1]
>>> ar # state 2
array([8, 7, 6, 5, 4, 3, 2, 1])
>>> ar.reshape(-1, 2)[:, ::-1].ravel()
array([7, 8, 5, 6, 3, 4, 1, 2])
This should work (assuming you have an even number of elements; otherwise you might want to check for that first):
x = x.reshape((len(x)//2, 2))            # split into two columns
x[:,0], x[:,1] = x[:,1], x[:,0].copy()   # swap the columns
x = x.reshape(-1)                        # reshape back into a 1D array
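If you would rather avoid the explicit .copy(), a fancy-indexed column swap does the same thing (a small sketch of the same idea):
>>> x = np.array([8, 7, 6, 5, 4, 3, 2, 1])
>>> x2 = x.reshape(-1, 2)          # view of x as pairs
>>> x2[:, [0, 1]] = x2[:, [1, 0]]  # the fancy-indexed right-hand side is a copy, so no aliasing
>>> x
array([7, 8, 5, 6, 3, 4, 1, 2])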
You can do:
import numpy as np
arr = np.array([8,7,6,5,4,3,2,1])
result = np.vstack((arr[1::2], arr[::2])).T.flatten()
output:
array([7, 8, 5, 6, 3, 4, 1, 2])

pythonic way to get the (2,2) for every (4,4) block / grid in nxn numpy array [duplicate]

I am a beginner with numpy, and I am trying to extract some data from a long numpy array. What I need to do is start from a defined position in my array, and then subsample every nth data point from that position, until the end of my array.
basically if I had
a = [1,2,3,4,1,2,3,4,1,2,3,4....]
I want to subsample this to start at a[1] and then sample every fourth point from there, to produce something like
b = [2,2,2.....]
You can use numpy's slicing: simply start:stop:step.
>>> xs
array([1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4])
>>> xs[1::4]
array([2, 2, 2])
This creates a view of the original data, so it's constant time. It'll also reflect changes to the original array and keep the whole original array in memory:
>>> a
array([1, 2, 3, 4, 5])
>>> b = a[::2] # O(1), constant time
>>> b[:] = 0 # modifying the view changes original array
>>> a # original array is modified
array([0, 2, 0, 4, 0])
so if either of the above things are a problem, you can make a copy explicitly:
>>> a
array([1, 2, 3, 4, 5])
>>> b = a[::2].copy() # explicit copy, O(n)
>>> b[:] = 0 # modifying the copy
>>> a # original is intact
array([1, 2, 3, 4, 5])
This isn't constant time, but the result isn't tied to the original array. The copy is also contiguous in memory, which can make some operations on it faster.
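You can check the contiguity through the array's flags, for instance:
>>> a[::2].flags['C_CONTIGUOUS']
False
>>> b.flags['C_CONTIGUOUS']
True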
Complementary to behzad.nouri's answer:
If you want to control the number of final elements and ensure it is always fixed to a predefined value (rather than controlling a fixed step between subsamples), you can use numpy's linspace function followed by integer rounding.
For example, with num_elements=4:
>>> a
array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
>>> choice = np.round(np.linspace(1, len(a)-1, num=4)).astype(int)
>>> a[choice]
array([ 2, 5, 7, 10])
Or, more generally, to subsample an array while keeping the start/end points:
>>> import numpy as np
>>> np.round(np.linspace(0, len(a)-1, num=4)).astype(int)
array([0, 3, 6, 9])
>>> np.round(np.linspace(0, len(a)-1, num=15)).astype(int)
array([0, 1, 1, 2, 3, 3, 4, 4, 5, 6, 6, 7, 8, 8, 9])

Numpy array getting only items and indexes in an ascending order

Suppose I have the following array:
arr = [1,2,4,5,6,5,4,3,2,3,4,5,6,7,8]
I want to get only the items which are in ascending order, and ignore the "reverse" in the middle.
So for this array I want to get:
res = [1,2,4,5,6,7,8]
at the indexes: [0, 1, 2, 3, 4, 13, 14]
Any idea?
I think you should approach this using the accumulated maximum value, i.e., the maximum value at each given step:
>>> arr
array([1, 2, 4, 5, 6, 5, 4, 3, 2, 3, 4, 5, 6, 7, 8])
>>> np.maximum.accumulate(arr)
array([1, 2, 4, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 8])
You could do something like:
>>> arr[arr == np.maximum.accumulate(arr)]
array([1, 2, 4, 5, 6, 6, 7, 8])
However, that doesn't deal with values that stay the same (you get that extra 6). To handle this, you could "roll" the accumulated-maximum array and add the condition that the value isn't equal to the rolled array (i.e., the value isn't also the running maximum at the next position):
>>> m = np.maximum.accumulate(arr)
>>> arr[(arr == m) & (arr != np.roll(m, -1))]
array([1, 2, 4, 5, 6, 7, 8])
But really, you want the unique values of the accumulated maximum, so you could also just use it with np.unique:
>>> np.unique(np.maximum.accumulate(arr))
array([1, 2, 4, 5, 6, 7, 8])
Not sure which would be faster, but coming up with good testing data isn't straightforward. If you have a sizeable array, I'd be interested in which approach is faster with your data.
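The question also asks for the indexes. One way to get them (a sketch along the same lines) is to ask np.unique for the index of the first occurrence of each running maximum, which by construction is a position where the array itself reached that value:
>>> vals, idx = np.unique(np.maximum.accumulate(arr), return_index=True)
>>> vals
array([1, 2, 4, 5, 6, 7, 8])
>>> idx
array([ 0,  1,  2,  3,  4, 13, 14])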

Averaging over n elements

I have a numpy array like this [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
Let's assume I want the average of 3 elements, my target array should then look like this:
[2, 2, 2, 5, 5, 5, 8, 8, 8, 10]
Notice that when no complete triplet is available, I want to calculate the average over the remaining elements.
Is there a neat way to do that with array operations?
You could reshape the array to use the mean function, for example:
a = np.arange(1,11)
b = a[:a.size//3*3]
b.shape = (-1,3)
c = np.mean(b, axis=1)
# c == array([2., 5., 8.])
Then reassign the results in the original array:
c.shape = (-1,1) # i.e. (len(b), 1)
b[:] = c
print(a)
# array([ 2, 2, 2, 5, 5, 5, 8, 8, 8, 10])
Note that this works because b is a view into a. Also, the last element is not the average as you asked (I left it untouched), but that is easy to fix with, e.g.:
a[9:] = np.mean(a[9:])
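Putting the two steps together, a sketch generalized to an arbitrary group size n (and handling the trailing remainder as in the question):
>>> a = np.arange(1, 11)
>>> n = 3
>>> full = a.size // n * n                                # length divisible by n
>>> a[:full] = np.repeat(a[:full].reshape(-1, n).mean(axis=1), n)
>>> if full < a.size:
...     a[full:] = a[full:].mean()
...
>>> a
array([ 2,  2,  2,  5,  5,  5,  8,  8,  8, 10])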
I have done most of it in a one-liner just for fun :D
Notice I'm using sum() to flatten the list of lists (a somewhat unusual Python trick).
def custom_avg(group: int, arr):
    # For each chunk, repeat the chunk's (integer) mean once per element in that chunk.
    out = [list(np.full(len(arr[i:i+group]), int(np.mean(arr[i:i+group]))))
           for i in range(0, len(arr), group)]
    return sum(out, [])  # sum() with an empty list flattens the list of lists
Enjoy! good luck.

numpy slicing select parts of array

I have a one-dimensional array from which I would like to create a new array containing only user-chosen portions of the beginning, the middle, and the end of the original.
import numpy
a = numpy.arange(10)
a
array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
I would like b to be equal to:
b
array([0, 1, 2, 5, 6, 7, 9])
Assuming that b is constructed from the concatenation of a[:3], a[5:8], and a[9].
I can of course use things such as np.concatenate, but is there a way to do that with slicing alone, or anything else, in one line?
One way is to create an array of the indices you want to index your array with:
import numpy
a = numpy.arange(10)
i = numpy.array([0, 1, 2, 5, 6, 7, 9]) # An array containing the indices you want to extract
print(a[i])  # Index the array based on the indices you selected
OUTPUT
[0 1 2 5 6 7 9]
I found a solution:
import numpy as np
a = np.arange(10)
b = np.hstack([a[:3], a[5:8], a[9]])
b
array([0, 1, 2, 5, 6, 7, 9])
but does slicing allow such a move?
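Not with a single basic slice, since the pieces are not evenly spaced, but np.r_ gets close: it concatenates slice ranges in one expression (a sketch, assuming a = np.arange(10) as above):
>>> b = np.r_[a[:3], a[5:8], a[9]]
>>> b
array([0, 1, 2, 5, 6, 7, 9])
or, indexing with the ranges themselves:
>>> a[np.r_[0:3, 5:8, 9]]
array([0, 1, 2, 5, 6, 7, 9])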
