Opening a file in Python turtle graphics

import turtle

t = turtle.Turtle()
wn = turtle.Screen()
wn.setworldcoordinates(-300, -300, 300, 300)

directions = {  # dictionary of directions given in file.
    'up': turtle.up,
    'down': turtle.down
}

with open('dino.txt', 'r') as dino:
    for line in dino:
        line, pixel = line.split()  # split line into two different directions.
        if line in directions:  # runs if within directions.
            directions[line](pixel)
        else:
            raise()  # raises error if not within directions.
I have this file titled "dino.txt" that contains directions that are supposed to trace out a dinosaur in Python turtle graphics. However, I am having trouble implementing code that reads the file and traces out the image with turtle graphics. The code I have written above opens the turtle graphics window but does not trace out anything. I was hoping someone here could help me out or point out how exactly to draw a turtle graphics design in Python by reading a text file. Thanks for any help/feedback.
here are the contents of the file "dino.txt":
UP
-218 185
DOWN
-240 189
-246 188
-248 183
-246 178
-244 175
-240 170
-235 166
-229 163
-220 158
-208 156
-203 153
-194 148
-187 141
-179 133
-171 119
-166 106
-163 87
-161 66
-162 52
-164 44
-167 28
-171 6
-172 -15
-171 -30
-165 -46
-156 -60
-152 -67
-152 -68
UP
-134 -61
DOWN
-145 -66
-152 -78
-152 -94
-157 -109
-157 -118
-151 -128
-146 -135
-146 -136
UP
-97 -134
DOWN
-98 -138
-97 -143
-96 -157
-96 -169
-98 -183
-104 -194
-110 -203
-114 -211
-117 -220
-120 -233
-122 -243
-123 -247
-157 -248
-157 -240
-154 -234
-154 -230
-153 -229
-149 -226
-146 -223
-145 -219
-143 -214
-142 -210
-141 -203
-139 -199
-136 -192
-132 -184
-130 -179
-132 -171
-133 -162
-134 -153
-138 -145
-143 -137
-143 -132
-142 -124
-138 -112
-134 -104
-132 -102
UP
-97 -155
DOWN
-92 -151
-91 -147
-89 -142
-89 -135
-90 -129
-90 -128
UP
-94 -170
DOWN
-83 -171
-68 -174
-47 -177
-30 -172
-15 -171
-11 -170
UP
12 -96
DOWN
9 -109
9 -127
7 -140
5 -157
9 -164
22 -176
37 -204
40 -209
49 -220
55 -229
57 -235
57 -238
50 -239
49 -241
51 -248
53 -249
63 -245
70 -243
57 -249
62 -250
71 -250
75 -250
81 -250
86 -248
86 -242
84 -232
85 -226
81 -221
77 -211
73 -205
67 -196
62 -187
58 -180
51 -171
47 -164
46 -153
50 -141
53 -130
54 -124
57 -112
56 -102
55 -98
UP
48 -164
DOWN
54 -158
60 -146
64 -136
64 -131
UP
5 -152
DOWN
1 -150
-4 -145
-8 -138
-14 -128
-19 -119
-17 -124
UP
21 -177
DOWN
14 -176
7 -174
-6 -174
-14 -170
-19 -166
-20 -164
UP
-8 -173
DOWN
-8 -180
-5 -189
-4 -201
-2 -211
-1 -220
-2 -231
-5 -238
-8 -241
-9 -244
-7 -249
6 -247
9 -248
16 -247
21 -246
24 -241
27 -234
27 -226
27 -219
27 -209
27 -202
28 -193
28 -188
28 -184
UP
-60 -177
DOWN
-59 -186
-57 -199
-56 -211
-59 -225
-61 -233
-65 -243
-66 -245
-73 -246
-81 -246
-84 -246
-91 -245
-91 -244
-88 -231
-87 -225
-85 -218
-85 -211
-85 -203
-85 -193
-88 -185
-89 -180
-91 -175
-92 -172
-93 -170
UP
-154 -93
DOWN
-157 -87
-162 -74
-168 -66
-172 -57
-175 -49
-178 -38
-178 -26
-178 -12
-177 4
-175 17
-172 27
-168 36
-161 48
-161 50
UP
-217 178
DOWN
-217 178
-217 177
-215 176
-214 175
-220 177
-223 178
-223 178
-222 178
UP
-248 185
DOWN
-245 184
-240 182
-237 181
-234 179
-231 177
-229 176
-228 175
-226 174
-224 173
-223 173
-220 172
-217 172
-216 171
-214 170
-214 169
UP
-218 186
DOWN
-195 173
-183 165
-175 159
-164 151
-158 145
-152 139
-145 128
-143 122
-139 112
-138 105
-134 95
-131 88
-129 78
-126 67
-125 62
-125 54
-124 44
-125 38
-126 30
-125 27
-125 8
-126 5
-125 -9
-122 -15
-115 -25
-109 -32
-103 -39
-95 -42
-84 -45
-72 -47
-56 -48
-41 -47
-31 -46
-18 -45
-1 -44
9 -43
34 -45
50 -52
67 -61
83 -68
95 -80
112 -97
142 -115
180 -132
200 -146
227 -159
259 -175
289 -185
317 -189
349 -190
375 -191
385 -192
382 -196
366 -199
352 -204
343 -204
330 -205
315 -209
296 -212
276 -214
252 -208
237 -202
218 -197
202 -193
184 -187
164 -179
147 -173
128 -168
116 -164
102 -160
88 -158
78 -159
69 -162
57 -164
56 -165
51 -165
UP
68 -144
DOWN
83 -143
96 -141
109 -139
119 -146
141 -150
161 -155
181 -163
195 -169
208 -179
223 -187
241 -191
247 -193
249 -194
UP
-6 -141
DOWN
-15 -146
-29 -150
-42 -154
-51 -153
-60 -152
-60 -152
UP
-90 -134
DOWN
-85 -131
-79 -128
-78 -123
-80 -115
-82 -106
-80 -101
-76 -101
UP
-81 -132
DOWN
-76 -130
-71 -126
-72 -124
UP
43 -118
DOWN
44 -125
47 -135
41 -156
37 -160
40 -166
47 -171
47 -171
UP
-106 -153
DOWN
-107 -167
-106 -178
-109 -192
-114 -198
-116 -201

This logic is wrong:
line, pixel = line.split()
If the data is as shown in the edited version of your question, consider:
data = line.rstrip().split()
You only have two cases: either the length of the list data is 1, meaning you have a direction like UP or DOWN in data[0], or the length is 2, meaning you have the arguments for a goto() (after you first convert the two strings to integers). That's it.
If the data is more free form, as shown in your original post, then you need to process the data one token at a time. Pull off one data item and convert it via int() under a try clause. If it succeeds, read the next data item as an int and do a goto() using both; otherwise, treat the current data item as a direction in the except clause, since it's clearly not an int.
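A rough sketch of that token-at-a-time variant, assuming the whole file fits in memory and that the pen is driven through a Turtle instance rather than the module-level functions:
import turtle

t = turtle.Turtle()
directions = {'UP': t.penup, 'DOWN': t.pendown}

with open('dino.txt') as dino:
    tokens = iter(dino.read().split())
    for token in tokens:
        try:
            x = int(token)          # a number: its partner is the y coordinate
            y = int(next(tokens))
            t.goto(x, y)
        except ValueError:          # not a number: treat it as a direction
            directions[token]()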
Errors to check for include: numbers don't correctly convert to integers; direction not found in directions dictionary.
Other things to consider: change your directions keys to uppercase to match the data in the file or explicitly control the case yourself; if you use setworldcoordinates() this way, you may skew the aspect ratio of the image -- make your window square to begin with (whatever size) using wn.setup(size, size); your maximum virtual coordinate of 300 falls short of your data: 385.
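Putting the line-based approach and those fixes together, a minimal sketch (the 500-pixel window and the 400-unit world bounds are choices based on the notes above, not requirements):
import turtle

t = turtle.Turtle()
wn = turtle.Screen()
wn.setup(500, 500)                              # square window keeps the aspect ratio
wn.setworldcoordinates(-400, -400, 400, 400)    # the data reaches x = 385, so 300 was too small

directions = {'UP': t.penup, 'DOWN': t.pendown}  # uppercase keys to match the file

with open('dino.txt') as dino:
    for line in dino:
        data = line.rstrip().split()
        if len(data) == 1:       # a direction such as UP or DOWN
            directions[data[0]]()
        elif len(data) == 2:     # an x y pair for goto()
            t.goto(int(data[0]), int(data[1]))

wn.mainloop()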


Time interval calculation for consecutive days in rows

I have a dataframe that looks like this:
Path_Version commitdates Year-Month API Age api_spec_id
168 NaN 2018-10-19 2018-10 39 521
169 NaN 2018-10-19 2018-10 39 521
170 NaN 2018-10-12 2018-10 39 521
171 NaN 2018-10-12 2018-10 39 521
172 NaN 2018-10-12 2018-10 39 521
173 NaN 2018-10-11 2018-10 39 521
174 NaN 2018-10-11 2018-10 39 521
175 NaN 2018-10-11 2018-10 39 521
176 NaN 2018-10-11 2018-10 39 521
177 NaN 2018-10-11 2018-10 39 521
178 NaN 2018-09-26 2018-09 39 521
179 NaN 2018-09-25 2018-09 39 521
I want to calculate the days elapsed from the first commitdate till the last, after sorting the commit dates first, so something like this:
Path_Version commitdates Year-Month API Age api_spec_id Days_difference
168 NaN 2018-10-19 2018-10 39 521 25
169 NaN 2018-10-19 2018-10 39 521 25
170 NaN 2018-10-12 2018-10 39 521 18
171 NaN 2018-10-12 2018-10 39 521 18
172 NaN 2018-10-12 2018-10 39 521 18
173 NaN 2018-10-11 2018-10 39 521 16
174 NaN 2018-10-11 2018-10 39 521 16
175 NaN 2018-10-11 2018-10 39 521 16
176 NaN 2018-10-11 2018-10 39 521 16
177 NaN 2018-10-11 2018-10 39 521 16
178 NaN 2018-09-26 2018-09 39 521 1
179 NaN 2018-09-25 2018-09 39 521 0
I tried first sorting the commitdates by api_spec_id since it is unique for every API, and then calculating the diff
final_api['commitdates'] = final_api.groupby('api_spec_id')['commitdate'].apply(lambda x: x.sort_values())
final_api['diff'] = final_api.groupby('api_spec_id')['commitdates'].diff() / np.timedelta64(1, 'D')
final_api['diff'] = final_api['diff'].fillna(0)
It just returns a zero for the entire column. I don't want to group them; I only want to calculate the difference based on the sorted commitdates, starting from the first commitdate till the last in the entire dataset, in days.
Any idea how I can achieve this?
Use pandas.to_datetime, sub, min and dt.days:
t = pd.to_datetime(df['commitdates'])
df['Days_difference'] = t.sub(t.min()).dt.days
If you need to group per API:
t = pd.to_datetime(df['commitdates'])
df['Days_difference'] = t.sub(t.groupby(df['api_spec_id']).transform('min')).dt.days
Output:
Path_Version commitdates Year-Month API Age api_spec_id Days_difference
168 NaN 2018-10-19 2018-10 39 521 24
169 NaN 2018-10-19 2018-10 39 521 24
170 NaN 2018-10-12 2018-10 39 521 17
171 NaN 2018-10-12 2018-10 39 521 17
172 NaN 2018-10-12 2018-10 39 521 17
173 NaN 2018-10-11 2018-10 39 521 16
174 NaN 2018-10-11 2018-10 39 521 16
175 NaN 2018-10-11 2018-10 39 521 16
176 NaN 2018-10-11 2018-10 39 521 16
177 NaN 2018-10-11 2018-10 39 521 16
178 NaN 2018-09-26 2018-09 39 521 1
179 NaN 2018-09-25 2018-09 39 521 0
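For completeness, a small self-contained version of the per-API variant, using a made-up subset of the rows above:
import pandas as pd

df = pd.DataFrame({
    'commitdates': ['2018-10-19', '2018-10-12', '2018-09-26', '2018-09-25'],
    'api_spec_id': [521, 521, 521, 521],
})

t = pd.to_datetime(df['commitdates'])
df['Days_difference'] = t.sub(t.groupby(df['api_spec_id']).transform('min')).dt.days
print(df)  # Days_difference: 24, 17, 1, 0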

Jupyter notebook showing CSV file output broken into lines

I am new to Jupyter Notebook. I came across it to execute great_expectations test cases.
I downloaded JupyterLab (3.4.4) and installed it on my Mac a little while ago.
I am trying to execute Python great_expectations code on a sample CSV file I got from the internet. I checked the file manually and it is in good shape.
import great_expectations as ge
my_df = ge.read_csv("/Users/someuser/Desktop//100SalesRecords.csv")
print(my_df)
When I try to print it, it shows me the result wrapped across lines as below:
Region Country \
0 Australia and Oceania Tuvalu
1 Central America and the Caribbean Grenada
2 Europe Russia
3 Sub-Saharan Africa Sao Tome and Principe
4 Sub-Saharan Africa Rwanda
5 Australia and Oceania Solomon Islands
6 Sub-Saharan Africa Angola
7 Sub-Saharan Africa Burkina Faso
8 Sub-Saharan Africa Republic of the Congo
9 Sub-Saharan Africa Senegal
10 Asia Kyrgyzstan
11 Sub-Saharan Africa Cape Verde
12 Asia Bangladesh
13 Central America and the Caribbean Honduras
14 Asia Mongolia
15 Europe Bulgaria
16 Asia Sri Lanka
17 Sub-Saharan Africa Cameroon
18 Asia Turkmenistan
19 Australia and Oceania East Timor
20 Europe Norway
21 Europe Portugal
22 Central America and the Caribbean Honduras
23 Australia and Oceania New Zealand
24 Europe Moldova
25 Europe France
26 Australia and Oceania Kiribati
27 Sub-Saharan Africa Mali
28 Europe Norway
29 Sub-Saharan Africa The Gambia
30 Europe Switzerland
31 Sub-Saharan Africa South Sudan
32 Australia and Oceania Australia
33 Asia Myanmar
34 Sub-Saharan Africa Djibouti
35 Central America and the Caribbean Costa Rica
36 Middle East and North Africa Syria
37 Sub-Saharan Africa The Gambia
38 Asia Brunei
39 Europe Bulgaria
40 Sub-Saharan Africa Niger
41 Middle East and North Africa Azerbaijan
42 Sub-Saharan Africa The Gambia
43 Europe Slovakia
44 Asia Myanmar
45 Sub-Saharan Africa Comoros
46 Europe Iceland
47 Europe Switzerland
48 Europe Macedonia
49 Sub-Saharan Africa Mauritania
50 Europe Albania
51 Sub-Saharan Africa Lesotho
52 Middle East and North Africa Saudi Arabia
53 Sub-Saharan Africa Sierra Leone
54 Sub-Saharan Africa Sao Tome and Principe
55 Sub-Saharan Africa Cote d'Ivoire
56 Australia and Oceania Fiji
57 Europe Austria
58 Europe United Kingdom
59 Sub-Saharan Africa Djibouti
60 Australia and Oceania Australia
61 Europe San Marino
62 Sub-Saharan Africa Cameroon
63 Middle East and North Africa Libya
64 Central America and the Caribbean Haiti
65 Sub-Saharan Africa Rwanda
66 Sub-Saharan Africa Gabon
67 Central America and the Caribbean Belize
68 Europe Lithuania
69 Sub-Saharan Africa Madagascar
70 Asia Turkmenistan
71 Middle East and North Africa Libya
72 Sub-Saharan Africa Democratic Republic of the Congo
73 Sub-Saharan Africa Djibouti
74 Middle East and North Africa Pakistan
75 North America Mexico
76 Australia and Oceania Federated States of Micronesia
77 Asia Laos
78 Europe Monaco
79 Australia and Oceania Samoa
80 Europe Spain
81 Middle East and North Africa Lebanon
82 Middle East and North Africa Iran
83 Sub-Saharan Africa Zambia
84 Sub-Saharan Africa Kenya
85 North America Mexico
86 Sub-Saharan Africa Sao Tome and Principe
87 Sub-Saharan Africa The Gambia
88 Middle East and North Africa Kuwait
89 Europe Slovenia
90 Sub-Saharan Africa Sierra Leone
91 Australia and Oceania Australia
92 Middle East and North Africa Azerbaijan
93 Europe Romania
94 Central America and the Caribbean Nicaragua
95 Sub-Saharan Africa Mali
96 Asia Malaysia
97 Sub-Saharan Africa Sierra Leone
98 North America Mexico
99 Sub-Saharan Africa Mozambique
ItemType SalesChannel OrderPriority OrderDate OrderID \
0 Baby Food Offline H 5/28/2010 669165933
1 Cereal Online C 8/22/2012 963881480
2 Office Supplies Offline L 5/2/2014 341417157
3 Fruits Online C 6/20/2014 514321792
4 Office Supplies Offline L 2/1/2013 115456712
5 Baby Food Online C 2/4/2015 547995746
6 Household Offline M 4/23/2011 135425221
7 Vegetables Online H 7/17/2012 871543967
8 Personal Care Offline M 7/14/2015 770463311
9 Cereal Online H 4/18/2014 616607081
10 Vegetables Online H 6/24/2011 814711606
11 Clothes Offline H 8/2/2014 939825713
12 Clothes Online L 1/13/2017 187310731
13 Household Offline H 2/8/2017 522840487
14 Personal Care Offline C 2/19/2014 832401311
15 Clothes Online M 4/23/2012 972292029
16 Cosmetics Offline M 11/19/2016 419123971
17 Beverages Offline C 4/1/2015 519820964
18 Household Offline L 12/30/2010 441619336
19 Meat Online L 7/31/2012 322067916
20 Baby Food Online L 5/14/2014 819028031
21 Baby Food Online H 7/31/2015 860673511
22 Snacks Online L 6/30/2016 795490682
23 Fruits Online H 9/8/2014 142278373
24 Personal Care Online L 5/7/2016 740147912
25 Cosmetics Online H 5/22/2017 898523128
26 Fruits Online M 10/13/2014 347140347
27 Fruits Online L 5/7/2010 686048400
28 Beverages Offline C 7/18/2014 435608613
29 Household Offline L 5/26/2012 886494815
30 Cosmetics Offline M 9/17/2012 249693334
31 Personal Care Offline C 12/29/2013 406502997
32 Office Supplies Online C 10/27/2015 158535134
33 Household Offline H 1/16/2015 177713572
34 Snacks Online M 2/25/2017 756274640
35 Personal Care Offline L 5/8/2017 456767165
36 Fruits Online L 11/22/2011 162052476
37 Meat Online M 1/14/2017 825304400
38 Office Supplies Online L 4/1/2012 320009267
39 Office Supplies Online M 2/16/2012 189965903
40 Personal Care Online H 3/11/2017 699285638
41 Cosmetics Online M 2/6/2010 382392299
42 Cereal Offline H 6/7/2012 994022214
43 Vegetables Online H 10/6/2012 759224212
44 Clothes Online H 11/14/2015 223359620
45 Cereal Offline H 3/29/2016 902102267
46 Cosmetics Online C 12/31/2016 331438481
47 Personal Care Online M 12/23/2010 617667090
48 Clothes Offline C 10/14/2014 787399423
49 Office Supplies Offline C 1/11/2012 837559306
50 Clothes Online C 2/2/2010 385383069
51 Fruits Online L 8/18/2013 918419539
52 Cereal Online M 3/25/2013 844530045
53 Office Supplies Offline M 11/26/2011 441888415
54 Fruits Offline H 9/17/2013 508980977
55 Clothes Online C 6/8/2012 114606559
56 Clothes Offline C 6/30/2010 647876489
57 Cosmetics Offline H 2/23/2015 868214595
58 Household Online L 1/5/2012 955357205
59 Cosmetics Offline H 4/7/2014 259353148
60 Cereal Offline H 6/9/2013 450563752
61 Baby Food Online L 6/26/2013 569662845
62 Office Supplies Online M 11/7/2011 177636754
63 Clothes Offline H 10/30/2010 705784308
64 Cosmetics Offline H 10/13/2013 505716836
65 Cosmetics Offline H 10/11/2013 699358165
66 Personal Care Offline L 7/8/2012 228944623
67 Clothes Offline M 7/25/2016 807025039
68 Office Supplies Offline H 10/24/2010 166460740
69 Clothes Offline L 4/25/2015 610425555
70 Office Supplies Online M 4/23/2013 462405812
71 Fruits Online L 8/14/2015 816200339
72 Beverages Online C 5/26/2011 585920464
73 Cereal Online H 5/20/2017 555990016
74 Cosmetics Offline L 7/5/2013 231145322
75 Household Offline C 11/6/2014 986435210
76 Beverages Online C 10/28/2014 217221009
77 Vegetables Offline C 9/15/2011 789176547
78 Baby Food Offline H 5/29/2012 688288152
79 Cosmetics Online H 7/20/2013 670854651
80 Household Offline L 10/21/2012 213487374
81 Clothes Online L 9/18/2012 663110148
82 Cosmetics Online H 11/15/2016 286959302
83 Snacks Online L 1/4/2011 122583663
84 Vegetables Online L 3/18/2012 827844560
85 Personal Care Offline L 2/17/2012 430915820
86 Beverages Offline C 1/16/2011 180283772
87 Baby Food Offline M 2/3/2014 494747245
88 Fruits Online M 4/30/2012 513417565
89 Beverages Offline C 10/23/2016 345718562
90 Office Supplies Offline H 12/6/2016 621386563
91 Beverages Offline H 7/7/2014 240470397
92 Office Supplies Online M 6/13/2012 423331391
93 Cosmetics Online H 11/26/2010 660643374
94 Beverages Offline C 2/8/2011 963392674
95 Clothes Online M 7/26/2011 512878119
96 Fruits Offline L 11/11/2011 810711038
97 Vegetables Offline C 6/1/2016 728815257
98 Personal Care Offline M 7/30/2015 559427106
99 Household Offline L 2/10/2012 665095412
ShipDate UnitsSold UnitPrice UnitCost TotalRevenue TotalCost \
0 6/27/2010 9925 255.28 159.42 2533654.00 1582243.50
1 9/15/2012 2804 205.70 117.11 576782.80 328376.44
2 5/8/2014 1779 651.21 524.96 1158502.59 933903.84
3 7/5/2014 8102 9.33 6.92 75591.66 56065.84
4 2/6/2013 5062 651.21 524.96 3296425.02 2657347.52
5 2/21/2015 2974 255.28 159.42 759202.72 474115.08
6 4/27/2011 4187 668.27 502.54 2798046.49 2104134.98
7 7/27/2012 8082 154.06 90.93 1245112.92 734896.26
8 8/25/2015 6070 81.73 56.67 496101.10 343986.90
9 5/30/2014 6593 205.70 117.11 1356180.10 772106.23
10 7/12/2011 124 154.06 90.93 19103.44 11275.32
11 8/19/2014 4168 109.28 35.84 455479.04 149381.12
12 3/1/2017 8263 109.28 35.84 902980.64 296145.92
13 2/13/2017 8974 668.27 502.54 5997054.98 4509793.96
14 2/23/2014 4901 81.73 56.67 400558.73 277739.67
15 6/3/2012 1673 109.28 35.84 182825.44 59960.32
16 12/18/2016 6952 437.20 263.33 3039414.40 1830670.16
17 4/18/2015 5430 47.45 31.79 257653.50 172619.70
18 1/20/2011 3830 668.27 502.54 2559474.10 1924728.20
19 9/11/2012 5908 421.89 364.69 2492526.12 2154588.52
20 6/28/2014 7450 255.28 159.42 1901836.00 1187679.00
21 9/3/2015 1273 255.28 159.42 324971.44 202941.66
22 7/26/2016 2225 152.58 97.44 339490.50 216804.00
23 10/4/2014 2187 9.33 6.92 20404.71 15134.04
24 5/10/2016 5070 81.73 56.67 414371.10 287316.90
25 6/5/2017 1815 437.20 263.33 793518.00 477943.95
26 11/10/2014 5398 9.33 6.92 50363.34 37354.16
27 5/10/2010 5822 9.33 6.92 54319.26 40288.24
28 7/30/2014 5124 47.45 31.79 243133.80 162891.96
29 6/9/2012 2370 668.27 502.54 1583799.90 1191019.80
30 10/20/2012 8661 437.20 263.33 3786589.20 2280701.13
31 1/28/2014 2125 81.73 56.67 173676.25 120423.75
32 11/25/2015 2924 651.21 524.96 1904138.04 1534983.04
33 3/1/2015 8250 668.27 502.54 5513227.50 4145955.00
34 2/25/2017 7327 152.58 97.44 1117953.66 713942.88
35 5/21/2017 6409 81.73 56.67 523807.57 363198.03
36 12/3/2011 3784 9.33 6.92 35304.72 26185.28
37 1/23/2017 4767 421.89 364.69 2011149.63 1738477.23
38 5/8/2012 6708 651.21 524.96 4368316.68 3521431.68
39 2/28/2012 3987 651.21 524.96 2596374.27 2093015.52
40 3/28/2017 3015 81.73 56.67 246415.95 170860.05
41 2/25/2010 7234 437.20 263.33 3162704.80 1904929.22
42 6/8/2012 2117 205.70 117.11 435466.90 247921.87
43 11/10/2012 171 154.06 90.93 26344.26 15549.03
44 11/18/2015 5930 109.28 35.84 648030.40 212531.20
45 4/29/2016 962 205.70 117.11 197883.40 112659.82
46 12/31/2016 8867 437.20 263.33 3876652.40 2334947.11
47 1/31/2011 273 81.73 56.67 22312.29 15470.91
48 11/14/2014 7842 109.28 35.84 856973.76 281057.28
49 1/13/2012 1266 651.21 524.96 824431.86 664599.36
50 3/18/2010 2269 109.28 35.84 247956.32 81320.96
51 9/18/2013 9606 9.33 6.92 89623.98 66473.52
52 3/28/2013 4063 205.70 117.11 835759.10 475817.93
53 1/7/2012 3457 651.21 524.96 2251232.97 1814786.72
54 10/24/2013 7637 9.33 6.92 71253.21 52848.04
55 6/27/2012 3482 109.28 35.84 380512.96 124794.88
56 8/1/2010 9905 109.28 35.84 1082418.40 354995.20
57 3/2/2015 2847 437.20 263.33 1244708.40 749700.51
58 2/14/2012 282 668.27 502.54 188452.14 141716.28
59 4/19/2014 7215 437.20 263.33 3154398.00 1899925.95
60 7/2/2013 682 205.70 117.11 140287.40 79869.02
61 7/1/2013 4750 255.28 159.42 1212580.00 757245.00
62 11/15/2011 5518 651.21 524.96 3593376.78 2896729.28
63 11/17/2010 6116 109.28 35.84 668356.48 219197.44
64 11/16/2013 1705 437.20 263.33 745426.00 448977.65
65 11/25/2013 4477 437.20 263.33 1957344.40 1178928.41
66 7/9/2012 8656 81.73 56.67 707454.88 490535.52
67 9/7/2016 5498 109.28 35.84 600821.44 197048.32
68 11/17/2010 8287 651.21 524.96 5396577.27 4350343.52
69 5/28/2015 7342 109.28 35.84 802333.76 263137.28
70 5/20/2013 5010 651.21 524.96 3262562.10 2630049.60
71 9/30/2015 673 9.33 6.92 6279.09 4657.16
72 7/15/2011 5741 47.45 31.79 272410.45 182506.39
73 6/17/2017 8656 205.70 117.11 1780539.20 1013704.16
74 8/16/2013 9892 437.20 263.33 4324782.40 2604860.36
75 12/12/2014 6954 668.27 502.54 4647149.58 3494663.16
76 11/15/2014 9379 47.45 31.79 445033.55 298158.41
77 10/23/2011 3732 154.06 90.93 574951.92 339350.76
78 6/2/2012 8614 255.28 159.42 2198981.92 1373243.88
79 8/7/2013 9654 437.20 263.33 4220728.80 2542187.82
80 11/30/2012 4513 668.27 502.54 3015902.51 2267963.02
81 10/8/2012 7884 109.28 35.84 861563.52 282562.56
82 12/8/2016 6489 437.20 263.33 2836990.80 1708748.37
83 1/5/2011 4085 152.58 97.44 623289.30 398042.40
84 4/7/2012 6457 154.06 90.93 994765.42 587135.01
85 3/20/2012 6422 81.73 56.67 524870.06 363934.74
86 1/21/2011 8829 47.45 31.79 418936.05 280673.91
87 3/20/2014 5559 255.28 159.42 1419101.52 886215.78
88 5/18/2012 522 9.33 6.92 4870.26 3612.24
89 11/25/2016 4660 47.45 31.79 221117.00 148141.40
90 12/14/2016 948 651.21 524.96 617347.08 497662.08
91 7/11/2014 9389 47.45 31.79 445508.05 298476.31
92 7/24/2012 2021 651.21 524.96 1316095.41 1060944.16
93 12/25/2010 7910 437.20 263.33 3458252.00 2082940.30
94 3/21/2011 8156 47.45 31.79 387002.20 259279.24
95 9/3/2011 888 109.28 35.84 97040.64 31825.92
96 12/28/2011 6267 9.33 6.92 58471.11 43367.64
97 6/29/2016 1485 154.06 90.93 228779.10 135031.05
98 8/8/2015 5767 81.73 56.67 471336.91 326815.89
99 2/15/2012 5367 668.27 502.54 3586605.09 2697132.18
TotalProfit
0 951410.50
1 248406.36
2 224598.75
3 19525.82
4 639077.50
5 285087.64
6 693911.51
7 510216.66
8 152114.20
9 584073.87
10 7828.12
11 306097.92
12 606834.72
13 1487261.02
14 122819.06
15 122865.12
16 1208744.24
17 85033.80
18 634745.90
19 337937.60
20 714157.00
21 122029.78
22 122686.50
23 5270.67
24 127054.20
25 315574.05
26 13009.18
27 14031.02
28 80241.84
29 392780.10
30 1505888.07
31 53252.50
32 369155.00
33 1367272.50
34 404010.78
35 160609.54
36 9119.44
37 272672.40
38 846885.00
39 503358.75
40 75555.90
41 1257775.58
42 187545.03
43 10795.23
44 435499.20
45 85223.58
46 1541705.29
47 6841.38
48 575916.48
49 159832.50
50 166635.36
51 23150.46
52 359941.17
53 436446.25
54 18405.17
55 255718.08
56 727423.20
57 495007.89
58 46735.86
59 1254472.05
60 60418.38
61 455335.00
62 696647.50
63 449159.04
64 296448.35
65 778415.99
66 216919.36
67 403773.12
68 1046233.75
69 539196.48
70 632512.50
71 1621.93
72 89904.06
73 766835.04
74 1719922.04
75 1152486.42
76 146875.14
77 235601.16
78 825738.04
79 1678540.98
80 747939.49
81 579000.96
82 1128242.43
83 225246.90
84 407630.41
85 160935.32
86 138262.14
87 532885.74
88 1258.02
89 72975.60
90 119685.00
91 147031.74
92 255151.25
93 1375311.70
94 127722.96
95 65214.72
96 15103.47
97 93748.05
98 144521.02
99 889472.91
You can see it is showing only two columns at a time.
Can someone please help me figure out how I can see all the columns? I mean all the columns together without line breaks (with horizontal scrolling).
I searched the internet but only found how to enable vertical scrolling. Can't I get horizontal scrolling?
Thanks in advance.
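For what it's worth, a hedged sketch of the usual way to stop this wrapping, assuming the great_expectations dataframe prints through pandas (these are standard pandas display options, not great_expectations settings):
import pandas as pd

# Widen pandas' printed output so all columns stay on one row.
pd.set_option('display.max_columns', None)          # do not truncate the column list
pd.set_option('display.width', None)                # do not wrap at a fixed console width
pd.set_option('display.expand_frame_repr', False)   # keep all columns on a single line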

Python: interpolate z values based on neighboring z values, where the coordinates are (x, y)

In the (x, y) coordinate system, each (x, y) point has a corresponding z value, but some of them are missing. Can someone help interpolate the missing z data?
x = [161 177 193 209 225 241 257 273 289 305 321 337 353 145 161 177 193 209 225 241 257 273 289 305 321 337 353 369 145 161 177 193 209 225 241 257 273 289 305 321 337 353 369 145 161 177 193 209 225 241 257 273 289 305 321 337 353 369 385 369 353 337 321 305 289 273 257 241 225 209 193 177 161 145 129 97 113 129 145]
y = [55 55 55 55 55 55 55 55 55 55 55 55 55 57 57 57 57 57 57 57 57 57 57 57 57 57 57 57 59 59 59 59 59 59 59 59 59 59 59 59 59 59 59 65 65 65 65 65 65 65 65 65 65 65 65 65 65 65 74 74 74 74 74 74 74 74 74 74 74 74 74 74 74 74 74 115 115 115 115]
z = [0.635 0.559 0.506 nan 0.597 nan 0.644 0.66 0.644 0.642 nan 0.545 nan nan nan 0.432 0.45 nan 0.517 0.521 nan 0.547 0.528 0.52 0.505 0.446 0.51 0.547 0.734 0.045 0.227 0 0.164 0.41 0.431 0.343 0.351 0.405 0.43 0.023 0.391 0.246 0.437 1.005 0.889 0.926 0.895 0.992 1.008 0.921 0.944 0.959 0.96 1.019 1.033 1.009 0.991 0.952 1.008 0.994 0.93 1.003 0.96 0.92 0.886 0.919 0.922 0.923 0.91 1.006 1.006 0.91 0.893 0.89 1 0.618 0.654 0.647 0.664]
The contour map can be found below; some z values are missing. The calculation of the missing z values could use a built-in MATLAB-style method, such as using the nearest 4 points of (x, y), or the nearest 9 points of (x, y). (contour map image) Thank you all for the help.
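A hedged sketch of one way to do this in Python with scipy.interpolate.griddata; the short arrays below are placeholders for the full x, y, z lists above, and method='nearest' stands in for the MATLAB-style nearest-points fill mentioned in the question:
import numpy as np
from scipy.interpolate import griddata

# Placeholders: substitute the full x, y, z arrays from the question (z uses np.nan for gaps).
x = np.array([161.0, 177.0, 193.0, 209.0])
y = np.array([55.0, 55.0, 57.0, 59.0])
z = np.array([0.635, np.nan, 0.506, 0.597])

known = ~np.isnan(z)
z_filled = z.copy()
z_filled[~known] = griddata(
    (x[known], y[known]),    # coordinates where z is known
    z[known],                # known z values
    (x[~known], y[~known]),  # coordinates whose z is missing
    method='nearest',        # 'linear' or 'cubic' give smoother estimates
)
print(z_filled)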

How do I convert gray to RGB in an nrrd file

I am trying to convert a grayscale 3D image to an RGB 3D image.
Now I can get each slice's array; the array values are grayscale pixel values.
But I don't know how to convert them to RGB values.
I tried to convert the color using an OpenCV function.
import numpy as np
import nrrd
import cv2
data = nrrd.read('C:\\Users\\admin\\Desktop\\sample data\\sample_nrrd.nrrd')
print(data[0][442][376])
cv2.cvtColor(data,cv2.COLOR_GRAY2BGR)
But it's not working...
This is my first time using an nrrd file.
How do I convert gray to RGB?
This array is an example of my data.
Thanks.
[-1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000
-1000 -1000 -1000 -1000 -1000 80 236 1830 1901 1852 1742 1430
1147 1088 1285 1240 989 969 787 791 1073 1098 1380 1320
1125 1075 1209 1433 1505 1349 1114 1261 1463 1454 1696 1435
1301 1448 1384 1146 1220 1054 829 1189 1245 1319 1293 986
695 672 594 709 583 503 601 562 440 418 764 967
1275 911 842 761 652 479 691 715 505 442 768 650
705 938 1079 1076 969 936 907 902 755 588 614 770
738 646 971 802 625 890 1020 929 941 824 800 803
920 843 793 834 937 877 737 494 621 605 763 825
642 548 527 427 552 529 572 345 442 455 603 614
712 521 603 687 770 665 744 604 642 791 971 980
1059 1020 842 781 793 845 860 982 916 1077 907 491
806 533 327 709 817 913 977 735 958 624 547 651
952 1171 1184 1033 1262 2015 2193 2444 2830 2678 2650 2473
2528 2766 2915 2991 2654 2403 2700 2646 2302 2276 2706 3003
2639 2499 2414 1948 1456 1908 1409 852 500 946 747 715
864 899 960 977 807 954 1348 1053 1242 1346 1732 1634
1600 1690 1730 1797 1833 1963 1795 1775 2016 2182 2260 2132
1912 1651 1380 1576 1768 2275 1934 1790 1740 1908 2061 2068
1879 1714 1801 1678 1588 1669 1717 1596 1573 2080 1869 1922
2080 1701 2003 1617 1917 1810 1437 1292 1110 813 1079 1166
1037 1111 1518 1417 1037 603 120 137 15 -30 -197 -409
-133 -72 80 7 10 -7 -28 29 -219 -12 3 18
144 120 -89 -4 101 143 66 -162 96 218 153 120
36 188 275 58 -64 28 9 -77 89 202 206 243
349 234 54 163 262 313 282 131 175 234 102 263
109 93 57 143 282 235 175 189 217 200 297 345
314 150 -24 105 111 202 -58 20 -67 -175 -39 271
292 -2 -153 -181 -41 200 67 104 128 91 -154 -171
-42 -125 67 -172 -101 -59 -130 -94 -146 -175 23 -51
230 104 91 -16 -75 -169 -246 -203 -90 45 -99 11
72 287 149 57 111 79 -12 -104 206 0 41 68
78 -65 -255 -136 -115 53 52 61 -30 119 -155 -229
-190 -36 -163 -240 98 84 85 -17 1 54 81 -173
-205 -172 -351 -19 -86 -172 -98 -90 -169 257 126 83
171 284 297 159 50 -150 -94 -45 -39 -12 230 201
215 328 144 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000
-1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000
-1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000
-1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000
-1000 -1000 -1000 -1000 -1000 -1000 -1000 -1000]
Your question is not very precise, and there can be multiple understandings of it.
First of all, there is no actual grey-to-color conversion here: COLOR_GRAY2BGR only converts a 1-channel grey image to a 3-channel grey image. This conversion does not add any color; it just changes the internal image representation (for example, so the image can be saved in a video). The color cannot be recovered. Since it is a 3D depth map, I think what you want is a color mapping (pseudocolor) function.
I'm not sure whether your data is the final disparity or the final depth, so you will have to figure that out yourself later. But the general idea is to hold it in a cv::Mat (a NumPy array in Python) and then use OpenCV's color mapping function to colorize it.
Assume data_image_1D is a NumPy array:
import numpy as np
import cv2
data_image_1D  # this is the NumPy array that you got from somewhere
bw_img = np.reshape(data_image_1D, (rows, cols))
# rows and cols are the size of the depth image that you have. Try to see if you can get this conversion working.
im_color = cv2.applyColorMap(bw_img, cv2.COLORMAP_JET)
cv2.imshow("im_color",im_color)
cv2.imshow("bw_img",bw_img)
cv2.waitKey(0)
cv2.destroyAllWindows()
But you still have to deal with the negative disparity, minimum disparity, and maximum disparity issues. That part I will leave to you.
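For instance, a rough way to handle that value-range issue before applyColorMap is a simple min-max scaling to 8-bit (a sketch assuming plain min-max normalization is acceptable for your data; the small array is a placeholder for your reshaped slice):
import numpy as np
import cv2

bw_img = np.array([[-1000, 80, 236], [1830, 1901, 1852]], dtype=np.int16)  # placeholder slice
bw_8u = cv2.normalize(bw_img, None, alpha=0, beta=255,
                      norm_type=cv2.NORM_MINMAX).astype(np.uint8)          # scale raw values to 0-255
im_color = cv2.applyColorMap(bw_8u, cv2.COLORMAP_JET)                      # 3-channel pseudocolor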
For more on data mapping or custom color mapping, you can follow this guide: https://www.learnopencv.com/applycolormap-for-pseudocoloring-in-opencv-c-python/
Edit:
I think what you want might be just a simple merge function. See this
import numpy as np
import nrrd
import cv2
###### your original code goes here ######
bw_img = np.reshape(data_image_1D,(rows,cols))
im_color = cv2.merge([bw_img, bw_img, bw_img])
nrrd.write(filename, im_color)# Write to a NRRD file

Pandas: Join dataframe with condition

So I have this dataframe (as below), and I am trying to join it with itself by copying it into another df. The join conditions are as below:
Join condition:
Same PERSONID and Badge_ID
But different SITE_ID1
Timedelta between the two rows should be less than 48 hrs.
Expected output:
PERSONID Badge_ID Reader_ID1_x SITE_ID1_x EVENT_TS1_x Reader_ID1_y SITE_ID1_y EVENT_TS1_y
2553-AMAGID 4229 141 99 2/1/2016 3:26 145 97 2/1/2016 3:29
2553-AMAGID 4229 248 99 2/1/2016 3:26 145 97 2/1/2016 3:29
2553-AMAGID 4229 145 97 2/1/2016 3:29 251 99 2/1/2016 3:29
2553-AMAGID 4229 145 97 2/1/2016 3:29 291 99 2/1/2016 3:29
Here is what I tried:
Make a copy of df, then filter each df with a condition like the one below, and then join them back again. But the condition below doesn't work :(
I tried these filters in SQL before reading into the df, but that's too slow for 600k+ rows, even with indexes.
df1 = df1[(df1['Badge_ID']==df2['Badge_ID']) and (df1['SITE_ID1']!=df2['SITE_ID1']) and ((df1['EVENT_TS1']-df2['EVENT_TS1'])<=datetime.timedelta(hours=event_time_diff))]
PERSONID Badge_ID Reader_ID1 SITE_ID1 EVENT_TS1
2553-AMAGID 4229 141 99 2/1/2016 3:26:10 AM
2553-AMAGID 4229 248 99 2/1/2016 3:26:10 AM
2553-AMAGID 4229 145 97 2/1/2016 3:29:56 AM
2553-AMAGID 4229 251 99 2/1/2016 3:29:56 AM
2553-AMAGID 4229 291 99 2/1/2016 3:29:56 AM
2557-AMAGID 4219 144 99 2/1/2016 2:36:30 AM
2557-AMAGID 4219 144 99 2/1/2016 2:40:00 AM
2557-AMAGID 4219 250 99 2/1/2016 2:40:00 AM
2557-AMAGID 4219 290 99 2/1/2016 2:40:00 AM
2557-AMAGID 4219 144 97 2/1/2016 4:02:06 AM
2557-AMAGID 4219 250 99 2/1/2016 4:02:06 AM
2557-AMAGID 4219 290 99 2/1/2016 4:02:06 AM
2557-AMAGID 4219 250 97 2/2/2016 1:36:30 AM
2557-AMAGID 4219 290 99 2/3/2016 2:38:30 AM
2559-AMAGID 4227 141 99 2/1/2016 4:33:24 AM
2559-AMAGID 4227 248 99 2/1/2016 4:33:24 AM
2560-AMAGID 4226 141 99 2/1/2016 4:10:56 AM
2560-AMAGID 4226 248 99 2/1/2016 4:10:56 AM
2560-AMAGID 4226 145 99 2/1/2016 4:33:52 AM
2560-AMAGID 4226 251 99 2/1/2016 4:33:52 AM
2560-AMAGID 4226 291 99 2/1/2016 4:33:52 AM
2570-AMAGID 4261 141 99 2/1/2016 4:27:02 AM
2570-AMAGID 4261 248 99 2/1/2016 4:27:02 AM
2986-AMAGID 4658 145 99 2/1/2016 3:14:54 AM
2986-AMAGID 4658 251 99 2/1/2016 3:14:54 AM
2986-AMAGID 4658 291 99 2/1/2016 3:14:54 AM
2986-AMAGID 4658 144 99 2/1/2016 3:26:30 AM
2986-AMAGID 4658 250 99 2/1/2016 3:26:30 AM
2986-AMAGID 4658 290 99 2/1/2016 3:26:30 AM
4133-AMAGID 6263 142 99 2/1/2016 2:44:08 AM
4133-AMAGID 6263 249 99 2/1/2016 2:44:08 AM
4133-AMAGID 6263 141 34 2/1/2016 2:44:20 AM
4133-AMAGID 6263 248 34 2/1/2016 2:44:20 AM
4414-AMAGID 6684 145 99 2/1/2016 3:08:06 AM
4414-AMAGID 6684 251 99 2/1/2016 3:08:06 AM
4414-AMAGID 6684 291 99 2/1/2016 3:08:06 AM
4414-AMAGID 6684 145 22 2/1/2016 3:19:12 AM
4414-AMAGID 6684 251 22 2/1/2016 3:19:12 AM
4414-AMAGID 6684 291 22 2/1/2016 3:19:12 AM
4414-AMAGID 6684 145 99 2/1/2016 4:14:28 AM
4414-AMAGID 6684 251 99 2/1/2016 4:14:28 AM
4414-AMAGID 6684 291 99 2/1/2016 4:14:28 AM
4484-AMAGID 6837 142 99 2/1/2016 2:51:14 AM
4484-AMAGID 6837 249 99 2/1/2016 2:51:14 AM
4484-AMAGID 6837 141 99 2/1/2016 2:51:26 AM
4484-AMAGID 6837 248 99 2/1/2016 2:51:26 AM
4484-AMAGID 6837 141 99 2/1/2016 3:05:12 AM
4484-AMAGID 6837 248 99 2/1/2016 3:05:12 AM
4484-AMAGID 6837 141 99 2/1/2016 3:08:58 AM
4484-AMAGID 6837 248 99 2/1/2016 3:08:58 AM
Try the following:
# Load the data into the first dataframe
df1 = pd.DataFrame(data)
# Save the same data in another dataframe
df2 = pd.DataFrame(data)
# Rename column names of second dataframe
df2.rename(index=str, columns={'Reader_ID1': 'Reader_ID1_x', 'SITE_ID1': 'SITE_ID1_x', 'EVENT_TS1': 'EVENT_TS1_x'}, inplace=True)
# Merge the dataframes into another dataframe based on PERSONID and Badge_ID
df3 = pd.merge(df1, df2, how='outer', on=['PERSONID', 'Badge_ID'])
# Use df.loc() to fetch the data you want
df3.loc[(df3.Reader_ID1 < df3.Reader_ID1_x) & (df3.SITE_ID1 != df3.SITE_ID1_x) & (pd.to_datetime(df3['EVENT_TS1']) - pd.to_datetime(df3['EVENT_TS1_x'])<=datetime.timedelta(hours=event_time_diff))]
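A hedged, more explicit variant of the same idea, spelling out the assumptions (48-hour window, timestamps parsed up front, merge suffixes instead of manual renames, and a tiny sample standing in for the full dataframe):
import pandas as pd

event_time_diff = 48  # hours, per the 48-hr condition in the question

# Tiny sample; replace with the full dataframe from the question.
df = pd.DataFrame({
    'PERSONID': ['2553-AMAGID', '2553-AMAGID', '2553-AMAGID'],
    'Badge_ID': [4229, 4229, 4229],
    'Reader_ID1': [141, 248, 145],
    'SITE_ID1': [99, 99, 97],
    'EVENT_TS1': ['2/1/2016 3:26:10 AM', '2/1/2016 3:26:10 AM', '2/1/2016 3:29:56 AM'],
})
df['EVENT_TS1'] = pd.to_datetime(df['EVENT_TS1'])

# Self-merge on person and badge; pandas adds _x/_y suffixes to the remaining columns.
pairs = df.merge(df, on=['PERSONID', 'Badge_ID'], suffixes=('_x', '_y'))

# Keep pairs from different sites whose timestamps fall within the window.
mask = (
    (pairs['SITE_ID1_x'] != pairs['SITE_ID1_y'])
    & ((pairs['EVENT_TS1_y'] - pairs['EVENT_TS1_x']).abs()
       <= pd.Timedelta(hours=event_time_diff))
)
result = pairs[mask]
print(result)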
