I am trying to run a function many times in parallel using multiprocessing.
When I run this simple code:
import time
from multiprocessing import Pool

def heavy_processing(number):
    time.sleep(0.05)  # simulate a long-running operation
    output = number + 1
    return output

with Pool(4) as p:
    numbers = list(range(0, 1000))
    results = p.map(heavy_processing, numbers)
I get the following error:
Process SpawnPoolWorker-1:
Traceback (most recent call last):
File "C:\ProgramData\Miniconda3\lib\multiprocessing\process.py", line 315, in _bootstrap
self.run()
File "C:\ProgramData\Miniconda3\lib\multiprocessing\process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "C:\ProgramData\Miniconda3\lib\multiprocessing\pool.py", line 114, in worker
task = get()
File "C:\ProgramData\Miniconda3\lib\multiprocessing\queues.py", line 367, in get
return _ForkingPickler.loads(res)
AttributeError: Can't get attribute 'heavy_processing' on <module '__main__' (built-in)>
I'm not sure why, since I'm pretty much pulling this example straight from other sources. Any idea what's going on?
You always have to run multiprocessing code under an if __name__ == '__main__': guard, or else it doesn't work. The last line of your traceback, AttributeError: Can't get attribute 'heavy_processing' on <module '__main__' (built-in)>, says that a worker process couldn't find 'heavy_processing' in '__main__'. On Windows, workers are started with the spawn method (note SpawnPoolWorker-1 in your traceback), which re-imports your main module; the guard makes sure the pool setup runs only in the parent process instead of being re-executed in every worker.
Full Code
import time
from multiprocessing import Pool

def heavy_processing(number):
    time.sleep(0.05)  # simulate a long-running operation
    output = number + 1
    print(output)
    return output

if __name__ == '__main__':
    with Pool(4) as p:
        numbers = list(range(0, 1000))
        results = p.map(heavy_processing, numbers)
Output
1
64
127
190
2
65
128
191
3
66
129
192
...
(the listing continues in this interleaved pattern until all 1000 results, 1 through 1000, have been printed; the order is scrambled because the four workers print concurrently)
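One thing worth noting (not shown in the output above): even though the printed lines come out scrambled, p.map itself returns the results in input order, so a quick check you could add at the end of the __main__ block would be:
print(results[:5])  # -> [1, 2, 3, 4, 5]; Pool.map preserves input order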
Hope this helps. Happy Coding :)
I'm learning Python and pandas, and I know how to do basic operations like groupby() and sum(), but I'm trying to do more complex operations, like categorizing using rows and columns, and I'm not sure how to begin the problem below.
Here's the dataset from GitHub:
https://github.com/KeithGalli/pandas/blob/master/pokemon_data.csv
Here's what I'm trying to produce:
Generation | Fire A-M | Fire N-Z | Water A-M | Water N-Z | Grass A-M | Grass N-Z
1          | #pokemon |          |           |           |           |
2          |          |          |           |           |           |
3          |          |          |           |           |           |
4          |          |          |           |           |           |
5          |          |          |           |           |           |
6          |          |          |           |           |           |
Here's my approach:
import pandas as pd

df = pd.read_csv('pokemon_data.csv', header=0)
fire = df.loc[df['Type 1'] == 'Fire']
water = df.loc[df['Type 1'] == 'Water']
grass = df.loc[df['Type 1'] == 'Grass']
# Trim down columns to only related data
fire = fire[['Name', 'Type 1', 'Generation']]
water = water[['Name', 'Type 1', 'Generation']]
grass = grass[['Name', 'Type 1', 'Generation']]
Next steps: Should I sort by Generation first, or by alphabetical range (A-M and N-Z)? I can't wrap my head around this.
An explanation of your work is much appreciated. Thank you!
Create a helper column for the new columns in the final DataFrame by comparing the first letter of the Name column, then use DataFrame.pivot_table; to aggregate the strings in Name, use join as the aggregation function:
import numpy as np

df['cat'] = df['Type 1'] + ' ' + np.where(df['Name'].str[0].gt('M'), 'N-Z', 'A-M')
print (df)
# Name Type 1 Type 2 HP Attack Defense \
0 1 Bulbasaur Grass Poison 45 49 49
1 2 Ivysaur Grass Poison 60 62 63
2 3 Venusaur Grass Poison 80 82 83
3 3 VenusaurMega Venusaur Grass Poison 80 100 123
4 4 Charmander Fire NaN 39 52 43
.. ... ... ... ... .. ... ...
795 719 Diancie Rock Fairy 50 100 150
796 719 DiancieMega Diancie Rock Fairy 50 160 110
797 720 HoopaHoopa Confined Psychic Ghost 80 110 60
798 720 HoopaHoopa Unbound Psychic Dark 80 160 60
799 721 Volcanion Fire Water 80 110 120
Sp. Atk Sp. Def Speed Generation Legendary cat
0 65 65 45 1 False Grass A-M
1 80 80 60 1 False Grass A-M
2 100 100 80 1 False Grass N-Z
3 122 120 80 1 False Grass N-Z
4 60 50 65 1 False Fire A-M
.. ... ... ... ... ... ...
795 100 150 50 6 True Rock A-M
796 160 110 110 6 True Rock A-M
797 150 130 70 6 True Psychic A-M
798 170 130 80 6 True Psychic A-M
799 130 90 70 6 True Fire N-Z
df = df.pivot_table(index='Generation', columns='cat', values='Name', aggfunc=','.join)
# print (df)
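To make the pattern concrete, here is a minimal, self-contained sketch of the same helper-column + pivot_table idea on a made-up frame (the rows here are invented purely for illustration):
import numpy as np
import pandas as pd

# Tiny invented dataset, just to show the mechanics.
df = pd.DataFrame({
    'Name': ['Bulbasaur', 'Venusaur', 'Charmander', 'Squirtle'],
    'Type 1': ['Grass', 'Grass', 'Fire', 'Water'],
    'Generation': [1, 1, 1, 1],
})

# Helper column: type plus alphabetical bucket of the first letter.
df['cat'] = df['Type 1'] + ' ' + np.where(df['Name'].str[0].gt('M'), 'N-Z', 'A-M')

# One column per bucket, names joined with commas.
out = df.pivot_table(index='Generation', columns='cat', values='Name', aggfunc=','.join)
print(out)
# cat           Fire A-M  Grass A-M Grass N-Z Water N-Z
# Generation
# 1           Charmander  Bulbasaur  Venusaur  Squirtle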
Create your column names first, then pivot your dataframe:
import numpy as np

df['Group'] = df['Type 1'] + ' ' + np.where(df['Name'].str[0].between('A', 'M'), 'A-M', 'N-Z')
out = df.astype({'#': str}).pivot_table('#', 'Generation', 'Group', aggfunc=' '.join)
Output
>>> out
Group Bug A-M Bug N-Z Dark A-M ... Steel N-Z Water A-M Water N-Z
Generation ...
1 10 11 12 14 15 15 13 46 47 48 49 123 127 127 NaN ... NaN 9 9 55 87 91 98 99 116 118 129 130 130 131 7 8 54 60 61 62 72 73 79 80 80 86 90 117 119 1...
2 165 166 168 205 214 214 167 193 204 212 212 213 198 228 229 229 ... 208 208 227 159 160 170 171 183 184 222 226 230 158 186 194 195 199 211 223 224 245
3 267 268 269 284 314 265 266 283 290 291 292 313 262 359 359 ... 379 258 259 270 271 272 318 339 341 342 349 350 36... 260 260 278 279 319 319 320 321 340 369
4 401 402 412 414 415 413 413 413 416 469 430 491 ... NaN 395 418 419 423 456 457 458 490 393 394 422 484 489
5 542 557 558 588 589 595 596 617 632 636 649 540 541 543 544 545 616 637 510 625 630 633 635 ... NaN 502 550 565 580 592 593 594 647 647 501 503 515 516 535 536 537 564 581
6 NaN 664 665 666 686 687 ... NaN 656 657 658 692 693 NaN
[6 rows x 35 columns]
Transposed view for readability:
>>> out.T
Generation 1 2 3 4 5 6
Group
Bug A-M 10 11 12 14 15 15 165 166 168 205 214 214 267 268 269 284 314 401 402 412 414 415 542 557 558 588 589 595 596 617 632 636 649 NaN
Bug N-Z 13 46 47 48 49 123 127 127 167 193 204 212 212 213 265 266 283 290 291 292 313 413 413 413 416 469 540 541 543 544 545 616 637 664 665 666
Dark A-M NaN 198 228 229 229 262 359 359 430 491 510 625 630 633 635 686 687
Dark N-Z NaN 197 215 261 302 302 461 509 559 560 570 571 624 629 634 717
Dragon A-M 147 148 149 NaN 334 334 371 380 380 381 381 443 444 445 445 610 611 612 621 646 646 646 704 706
Dragon N-Z NaN NaN 372 373 373 384 384 NaN 643 644 705 718
Electric A-M 81 82 101 125 135 179 180 181 181 239 309 310 310 312 404 405 462 466 522 587 603 604 694 695 702
Electric N-Z 25 26 100 145 172 243 311 403 417 479 479 479 479 479 479 523 602 642 642 NaN
Fairy A-M 35 36 173 210 NaN NaN NaN 669 670 671 683
Fairy N-Z NaN 175 176 209 NaN 468 NaN 682 684 685 700 716
Fighting A-M 56 66 67 68 106 107 237 296 297 307 308 308 448 448 533 534 619 620 701
Fighting N-Z 57 236 NaN 447 532 538 539 674 675
Fire A-M 4 5 6 6 6 58 59 126 136 146 155 219 240 244 250 256 257 257 323 323 390 391 392 467 485 500 554 555 555 631 653 654 655 662 667
Fire N-Z 37 38 77 78 156 157 218 255 322 324 NaN 498 499 513 514 663 668 721
Flying N-Z NaN NaN NaN NaN 641 641 714 715
Ghost A-M 92 93 94 94 200 354 354 355 356 425 426 429 477 487 487 563 607 608 609 711 711 711 711
Ghost N-Z NaN NaN 353 442 562 708 709 710 710 710 710
Grass A-M 1 2 44 69 102 103 152 153 154 182 187 189 253 286 331 332 388 406 420 421 455 460 460 470 546 549 556 590 591 597 598 650 652 673
Grass N-Z 3 3 43 45 70 71 114 188 191 192 252 254 254 273 274 275 285 315 357 387 389 407 459 465 492 492 495 496 497 511 512 547 548 640 651 672
Ground A-M 50 51 104 105 207 232 330 343 344 383 383 449 450 472 529 530 552 553 622 623 645 645 NaN
Ground N-Z 27 28 111 112 231 328 329 464 551 618 NaN
Ice A-M 124 144 225 362 362 471 473 478 613 614 615 712 713
Ice N-Z NaN 220 221 238 361 363 364 365 378 NaN 582 583 584 NaN
Normal A-M 22 39 52 83 84 85 108 113 115 115 132 133 162 163 174 190 203 206 241 242 264 294 295 298 301 351 352 399 400 424 427 428 428 431 440 441 446 463 493 506 507 531 531 572 573 585 626 628 648 648 659 660 661 676
Normal N-Z 16 17 18 18 19 20 21 40 53 128 137 143 161 164 216 217 233 234 235 263 276 277 287 288 289 293 300 327 333 335 396 397 398 432 474 486 504 505 508 519 520 521 586 627 NaN
Poison A-M 23 24 42 88 89 109 169 316 452 453 569 691
Poison N-Z 29 30 31 32 33 34 41 110 NaN 317 336 434 435 451 454 568 690
Psychic A-M 63 64 65 65 96 97 122 150 150 150 151 196 249 251 281 282 282 326 358 386 386 386 386 433 439 475 475 481 482 488 517 518 574 575 576 578 605 606 677 678 678 720 720
Psychic N-Z NaN 177 178 201 202 280 325 360 480 494 527 528 561 577 579 NaN
Rock A-M 74 75 76 140 141 142 142 246 337 345 346 347 348 408 411 438 525 526 566 567 688 689 698 699 703 719 719
Rock N-Z 95 138 139 185 247 248 248 299 338 377 409 410 476 524 639 696 697
Steel A-M NaN NaN 303 303 304 305 306 306 374 375 376 376 385 436 437 483 599 600 601 638 679 680 681 681 707
Steel N-Z NaN 208 208 227 379 NaN NaN NaN
Water A-M 9 9 55 87 91 98 99 116 118 129 130 130 131 159 160 170 171 183 184 222 226 230 258 259 270 271 272 318 339 341 342 349 350 36... 395 418 419 423 456 457 458 490 502 550 565 580 592 593 594 647 647 656 657 658 692 693
Water N-Z 7 8 54 60 61 62 72 73 79 80 80 86 90 117 119 1... 158 186 194 195 199 211 223 224 245 260 260 278 279 319 319 320 321 340 369 393 394 422 484 489 501 503 515 516 535 536 537 564 581 NaN
I have defined a pascal_label_map.pbtxt with 824 classes and used create_pascal_tf_record.py to create TFRecord files from my JPEG dataset with Pascal VOC style annotations.
The script seems to generate these TFRecords correctly (e.g. I checked that all classes from pascal_label_map.pbtxt occur in the annotations and that each JPEG comes with the correct annotation). But when I start object_detection/model_main.py I see the following:
WARNING:root:The following classes have no ground truth examples:
[  2   3   5   7   9  10  13  14  15  16  17  18  19  20  21  22  23  24
  25  26  27  30  35  36  37  38  40  42  43  44  47  48  49  51  52  53
  55  58  59  60  61  62  64  65  69  70  71  73  74  75  77  78  79  81
  82  84  85  86  87  88  90  91  92  93  94  95  96  97  98  99 100 101
 102 103 104 ... 822 823 824]
(from 102 onward the list is simply every class ID up through 824)
How can I fix this?
I have tried both Python 2.7 and Python 3.7 (installed with Anaconda, in bash on Ubuntu on Windows). Instead of model_main.py I also tried object_detection/legacy/train.py and object_detection/legacy/eval.py.
train.py seemed to run correctly.
When I opened TensorBoard after running train.py and eval.py, I noticed that the images had neither object detection boxes nor correct ground truth. Except for one or two cases, the incorrect ground-truth label was the one corresponding to 1 in pascal_label_map.pbtxt.
The bounding box coordinates are correct, though.
This is my model .config file (with the right paths, of course):
model {
  faster_rcnn {
    num_classes: 821
    image_resizer {
      keep_aspect_ratio_resizer {
        min_dimension: 600
        max_dimension: 1024
      }
    }
    feature_extractor {
      type: 'faster_rcnn_inception_v2'
      first_stage_features_stride: 16
    }
    first_stage_anchor_generator {
      grid_anchor_generator {
        scales: [0.25, 0.5, 1.0, 2.0]
        aspect_ratios: [0.5, 1.0, 2.0]
        height_stride: 16
        width_stride: 16
      }
    }
    first_stage_box_predictor_conv_hyperparams {
      op: CONV
      regularizer {
        l2_regularizer {
          weight: 0.0
        }
      }
      initializer {
        truncated_normal_initializer {
          stddev: 0.01
        }
      }
    }
    first_stage_nms_score_threshold: 0.0
    first_stage_nms_iou_threshold: 0.7
    first_stage_max_proposals: 300
    first_stage_localization_loss_weight: 2.0
    first_stage_objectness_loss_weight: 1.0
    initial_crop_size: 14
    maxpool_kernel_size: 2
    maxpool_stride: 2
    second_stage_box_predictor {
      mask_rcnn_box_predictor {
        use_dropout: false
        dropout_keep_probability: 1.0
        fc_hyperparams {
          op: FC
          regularizer {
            l2_regularizer {
              weight: 0.0
            }
          }
          initializer {
            variance_scaling_initializer {
              factor: 1.0
              uniform: true
              mode: FAN_AVG
            }
          }
        }
      }
    }
    second_stage_post_processing {
      batch_non_max_suppression {
        score_threshold: 0.0
        iou_threshold: 0.6
        max_detections_per_class: 100
        max_total_detections: 300
      }
      score_converter: SOFTMAX
    }
    second_stage_localization_loss_weight: 2.0
    second_stage_classification_loss_weight: 1.0
  }
}
train_config: {
  batch_size: 1
  optimizer {
    momentum_optimizer: {
      learning_rate: {
        manual_step_learning_rate {
          initial_learning_rate: 0.0002
          schedule {
            step: 900000
            learning_rate: .00002
          }
          schedule {
            step: 1200000
            learning_rate: .000002
          }
        }
      }
      momentum_optimizer_value: 0.9
    }
    use_moving_average: false
  }
  gradient_clipping_by_norm: 10.0
  #fine_tune_checkpoint: "PATH_TO/models/model/model.ckpt"
  #from_detection_checkpoint: true
  #load_all_detection_checkpoint_vars: true
  # Note: The below line limits the training process to 200K steps, which we
  # empirically found to be sufficient enough to train the COCO dataset. This
  # effectively bypasses the learning rate schedule (the learning rate will
  # never decay). Remove the below line to train indefinitely.
  num_steps: 5000
  data_augmentation_options {
    random_horizontal_flip {
    }
  }
  data_augmentation_options {
    random_vertical_flip {
    }
  }
  data_augmentation_options {
    random_rotation90 {
    }
  }
}
train_input_reader {
  label_map_path: "PATH_TO/data/pascal_label_map.pbtxt"
  tf_record_input_reader {
    input_path: "PATH_TO/data/pascal_train.record-?????-of-00010"
  }
}
eval_config {
  num_examples: 1886
  # Note: The below line limits the evaluation process to 100 evaluations.
  # Remove the below line to evaluate indefinitely.
  max_evals: 1886
  #use_moving_averages: false
  metrics_set: "pascal_voc_detection_metrics"
}
eval_input_reader {
  label_map_path: "PATH_TO/data/pascal_label_map.pbtxt"
  shuffle: false
  num_readers: 10
  tf_record_input_reader {
    input_path: "PATH_TO/data/pascal_val.record-?????-of-00010"
  }
}
The "pascal_voc_detection_metrics" don't seems to work either.
I had the same problem, but I managed to fix it. You say you have 824 classes, but in your model .config file num_classes is set to 821. This mismatch is the likely cause of the error. Make sure num_classes matches the actual number of classes (824), and that the label map file reflects the same count.
All the best.
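As a quick sanity check, you can count the entries in the label map with the Object Detection API's own utility (a sketch; label_map_util lives in object_detection/utils, and the path here is a placeholder):
from object_detection.utils import label_map_util

# Parse the label map and count its entries; this should print 824
# and agree with num_classes in the .config file.
label_map_dict = label_map_util.get_label_map_dict('PATH_TO/data/pascal_label_map.pbtxt')
print(len(label_map_dict))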
The first line gives n, the number of integers on the next line.
Then n integers are given.
My problem is how to accept the inputs. I tried using
ab = list(map(int,input().split()))
but it didn't work:
Traceback (most recent call last):
File "", line 1, in
TypeError: list() takes at most 1 argument (2 given)
The format of input is:
409 //this is n
1 4 6 7 9 11 12 13 16 18 19 21 23 24 32 35 39 41 43 44 46 48 50 52 54 56 60 61 63 64 69 72 73 74 82 85 86 91 94 97 99 100 104 106 110 112 115 117 120 121 123 126 130 131 134 137 138 142 143 144 145 150 151 152 156 157 158 162 165 170 171 172 173 180 181 183 188 191 192 194 196 199 201 202 205 208 212 214 216 219 220 225 227 235 240 243 244 245 246 247 251 252 253 257 258 261 269 270 271 274 276 277 278 285 286 288 291 293 297 301 302 303 304 310 311 316 318 319 321 322 323 327 329 331 332 346 347 349 350 353 356 357 358 362 363 373 376 379 381 384 386 388 390 391 392 394 398 402 403 404 407 412 413 414 416 417 418 421 422 425 428 429 431 433 436 437 438 442 443 444 451 453 459 461 466 473 478 481 483 484 486 487 504 508 513 514 515 520 521 524 527 531 535 537 538 540 541 544 546 549 550 551 554 555 556 557 560 561 562 566 572 574 575 577 583 587 589 592 593 595 596 597 598 600 603 604 606 611 612 616 626 627 629 631 637 638 639 641 644 645 646 647 648 650 652 654 659 660 661 664 665 666 668 669 672 673 677 679 681 683 685 686 688 693 699 701 705 706 707 708 709 710 711 715 717 719 724 725 727 729 730 733 735 736 737 738 739 740 746 747 755 759 761 764 766 767 770 772 773 775 776 780 782 783 788 790 792 793 796 797 798 799 808 814 821 822 825 828 838 843 855 856 861 862 869 871 872 877 884 885 887 890 891 893 894 895 897 901 902 903 908 915 916 918 919 921 922 923 924 928 929 932 934 935 937 938 950 958 959 961 962 967 969 971 972 973 976 978 979 980 982 983 985 988 989 990 991 992 995 998 999
Newlines matter: read each line separately.
n = int(input())                      # first line: the count
ab = list(map(int, input().split()))  # second line: the n integers
print(n)
print(ab)
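If the newline layout is unpredictable (or the numbers might wrap across several lines), a more robust sketch is to read all of stdin at once and slice out the tokens you need:
import sys

data = sys.stdin.read().split()     # all whitespace-separated tokens, any line layout
n = int(data[0])
ab = list(map(int, data[1:1 + n]))  # the n integers that follow the count
print(n)
print(ab)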
I have a file with a set of points like 652 653 655 ..., which I have plotted.
I want to detect the local minima (in this case at approximately 100, 250, 370, 500, etc., but not in the small aberrations in between). How can this be done using Python or Octave?
My data is something like this:
762 659 658 658 659 660 660 660 660 658 657 656 654 652 650 649 648 648 647 646 645 645 645 645 645 644 644 643 641 640 638 637 635 634 634 634 635 637 640 641 641 640 637 633 631 631 633 639 648 659 672 686 702 718 735 751 765 777 787 794 800 804 805 805 806 808 808 807 806 803 801 799 796 793 788 784 781 782 784 785 785 783 780 775 770 765 758 749 737 726 716 708 703 699 697 695 694 693 692 693 692 694 697 701 704 709 713 715 716 716 717 717 715 712 708 703 698 694 690 689 688 687 685 684 682 681 680 678 676 674 672 669 664 661 659 658 660 661 662 662 661 660 659 658 657 655 654 655 658 660 661 661 659 658 658 657 657 655 654 651 650 648 646 645 646 647 647 648 649 651 652 654 655 656 657 658 659 658 657 658 661 667 676 689 704 719 736 752 767 782 793 801 808 812 816 817 818 818 816 813 810 808 804 802 799 798 796 795 792 790 786 784 783 782 781 778 775 770 763 754 744 732 721 711 703 697 692 689 686 686 687 690 693 694 695 696 696 695 694 692 690 689 689 689 690 690 690 689 688 686 683 681 678 674 670 667 663 661 659 657 657 657 658 659 660 661 662 661 660 658 657 656 656 656 656 656 657 658 659 659 660 660 661 661 661 661 661 660 660 659 659 659 659 659 659 660 660 659 659 658 657 656 656 655 655 655 655 653 651 650 649 649 649 650 650 652 657 665 677 692 708 725 742 758 774 787 800 809 816 822 828 833 836 838 838 836 834 832 828 825 823 820 819 817 816 816 814 812 809 808 809 808 806 805 802 797 792 784 774 763 752 741 731 723 716 711 706 702 702 705 710 715 718 721 725 728 730 731 731 730 729 726 722 718 714 711 708 705 702 698 697 696 696 695 695 694 692 692 691 689 688 688 688 686 685 685 683 682 681 680 678 678 677 676 674 672 671 670 669 667 666 666 665 664 665 668 671 673 674 674 675 675 674 671 668 666 663 661 659 658 656 654 653 652 652 654 657 661 665 669 672 674 677 678 679 680 682 688 696 708 721 734 749 762 774 786 797 806 812 817 821 825 828 829 829 829 828 827 825 824 820 817 814 810 807 804 801 799 796 793 790 786 784 781 775 768 759 748 736 724 713 703 696 691 688 687 687 688 689 690 690 689 690 691 691 691 691 689 687 687 688 689 689 689 687 686 686 684 681 677 673 669 667 664 662 660 658 657 656 657 657 657 658 657 656 655 654 654 653 652 652 652 651 651 650 648 647 647 647 647 648 648 648 647 647 646 646 647 648 648 646 645 646 646 646 645 644 643 643 643 644 645 646 647 647 646 645 643 640 638 636 635 633 633 635 641 650 661 673 687 702 718 734 750 764 777 788 799 807 815 820 824 826 825 823 820 815 810 804 799 795 793 790 787 785 783 781 779 777 773 770 767 762 756 749 742 733 722 714 707 701 698 696 694 694 694 694 695 695 695 695 696 697 697 697 696 696 695 693 692 691 689 687 687 686 685 684 683 682 680 680 679 677 674 671 669 667 665 662 661 659 659 659 660 663 665 666 666 667 667 666 665 663 661 659 657 654 651 650 651 652 652 653 653 654 653 652 651 649 648 646 643 641 639 638 637 636 636 637 639 640 640 642 643 644 646 646 647 648 652 657 667 679 695 712 730 750 767 783 797 807 815 821 825 828 830 831 832 831 830 830 828 825 823 820 817 814 810 806 803 801 800 799 796 793 790 787 783 777 770 761 750 738 728 720 713 708 706 704 704 705 707 709 711 713 715 718 720 722 723 723 722 720 719 718 717 714 711 708 705 704 703 702 701 699 696 694 693 695 696 697 697 697 696 695 694 692 689 685 682 680 679 676 673 671 670 669 668 670 672 673 675 677 679 680 680 680 680 679 679 678 677 675 674 673 673 673 675 676 676 676 676 677 678 680 681 683 685 687 687 686 684 683 681 680 679 679 681 685 691 701 713 728 744 761 779 796 810 822 831 840 846 850 851 851 851 849 847 
844 841 837 833 831 829 826 822 820 819 820 821 823 823 823 823 822 819 814 806 799 789 779 767 758 749 740 734 728 725 724 726 727 730 734 737 741 743 745 747 747 746 744 741 738 736 733 729 725 723 721 719 717 715 711 709 707 705 703 699 694 690 688 687 686 685 686 686 687 687 687 686 686 684 683 681 681 680 680 681 683 686 690 692 693 693 692 691 691 690 688 687 685 684 683 681 680 679 679 680 681 682 682 684 686 687 688 688 688 688 688 686 685 685 684 683 682 681 679 678 678 679 680 683 686 691 699 712 728 746 762 778 793 807 819 830 840 850 857 863 866 870 872 873 874 873 873 871 869 866 862 857 852 848 844 840 837 836 836 836 835 833 828 822 814 803 791 778 767 756 748 744 742 743 746 748 751 754 756 757 759 761 763 764 764 764 764 765 764 763 761 758 755 751 747 742 738 736 735 735 735 734 734 734 733 733 733 732 731 728 725 721 716 710 708 708 709 710 711 711 712 713 712 711 710 709 707 706 705 706 708 710 711 711 710 711 711 713 714 714 712 709 707 705 704 703 702 699 696 693 692 692 693 693 692 692 691 690 690 689 688 688 690 695 704 717 732 749 767 784 801 816 828 839 847 855 860 863 866 869 870 870 869 867 864 861 858 855 852 849 847 845 844 844 843 843 842 839 835 829 823 814 804 792 781 770 761 754 750 747 747 747 747 749 750 750 749 748 749 749 749 750 749 747 746 744 743 742 740 738 736 732 729 725 722 719 717 716 716 715 715 714 713 713 711 710 710 710 709 708 708 708 708 708 710 710 711 712 711 710 709 708 707 707 707 707 705 705 706 706 708 709 709 709 709 708 708 709 709 711 711 713 713 714 715 714 714 713 712 711 711 712 713 714 716 717 717 717 716 717 719 723 726 730 736 746 758 774 790 806 821 836 849 860 871 881 888 894 898 900 900 899 895 890 886 881 875 870 866 865 864 863 862 861 861 859 859 857 855 851 847 842 836 828 819 808 796 782 770 759 752 747 745 743 744 745 747 749 751 753 753 753 753 754 754 753 750 748 746 742 738 734 731 727 724 722 721 721 721 724 726 727 727 726 725 723 720 717 713 710 708 705 704 703 702 700 700 700 700 700 699 697 696 697 696 695 693 692 690 688 686 685 684 683 683 681 679 677 677 677 676 676 674 674 672 671 669 668 667 666 665 663 663 665 667 669 671 674 675 676 676 675 676 680 690 701 709 714 722 734 751 769 785 801 816 830 843 853 860 865 868 871 871 870 870 868 866 863 860 857 854 852 850 849 848 846 843 840 837 835 834 833 831 827 821 814 805 794 781 769 759 751 745 742 739 737 735 735 737 740 742 746 748 749 750 752 753 754 755 756 756 756 756 754 752 749 746 742 739 736 732 729 726 724 723 724 725 727 729 729 729 729 730 730 730 728 726 726 725 724 722 720 718 717 717 718 718 719 720 721 722 722 721 720 718 718 718 717 717 718 719 719 720 719 718 717 716 714 712 710 707 704 703 703 702 701 701 700 700 702 705 709 717 728 742 758 776 793 811 828 843 855 865 873 879 883 885 885 883 880 876 872 869 866 865 864 863 862 862 862 861 859 858 858 856 852 847 840 831 821 810 798 787 778 769 763 759 757 757 757 758 760 761 763 764 765 767 767 767 766 764 763 761 759 758 757 756 756 755 753 752 750 748 746 744 741 738 735 733 732 731 730 729 728 726 723 721 719 717 717 717 717 718 718 718 717 715 713 710 706 703 700 697 696 695 694 694 694 695 696 696 697 697 697 698
Here is one solution using peakdet, which is based on http://billauer.co.il/peakdet.html (you should read that page):
set (0, "defaultlinelinewidth", 2)
d = [your data above];
delta = 100;
[~, MINTAB] = peakdet (d', delta);
plot (d, ";;", MINTAB, d(MINTAB), "o")
grid on
print out.png
The C++ peakdet.cc is also intended to be used in GNU Octave. You can compile it with "make" (which uses mkoctfile) into an OCT file and use it in Octave like a regular function. Keep this in mind if speed becomes a problem (peakdet.m is much slower than the compiled version).
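For the Python side, here is a minimal sketch using SciPy (the filename and the prominence threshold are assumptions you would adapt): local minima of d are peaks of -d, and the prominence requirement plays the same role as peakdet's delta, suppressing the small aberrations between real valleys.
import numpy as np
from scipy.signal import find_peaks

# Read all whitespace-separated samples, whatever the line layout.
with open('points.txt') as f:          # hypothetical filename for the data above
    d = np.array(f.read().split(), dtype=float)

# Local minima of d are peaks of -d; prominence filters shallow wiggles.
minima, _ = find_peaks(-d, prominence=100)
print(minima)     # sample indices of the local minima
print(d[minima])  # their values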
I have been trying to decode the raw code from an AC remote on a Raspberry Pi 2.
I am not able to decode it into a hex value.
Here are the raw codes for On and Off:
name bhutOn
8423 4226 566 544 576 1617
571 1622 576 537 573 1620
568 545 574 1618 571 549
571 1621 577 536 574 1619
569 1624 574 538 572 1629
559 1627 572 548 571 540
570 542 567 545 575 537
572 541 568 542 568 544
576 543 566 546 574 538
571 541 569 542 567 545
575 542 567 539 571 549
570 1622 577 1617 571 541
568 544 566 551 569 1619
569 543 566 553 567 544
576 563 546 566 543 568
542 576 544 562 547 564
545 575 545 566 543 569
541 571 548 564 546 538
571 542 568 543 576 543
577 535 574 538 572 539
570 542 567 545 575 536
574 545 564 549 571 540
569 543 577 535 574 537
573 539 570 542 567 545
575 545 575 536 574 537
572 540 569 543 577 534
575 537 573 539 570 549
571 541 568 544 575 536
574 538 571 541 569 543
577 534 575 545 575 536
573 539 571 541 568 544
576 535 574 538 571 541
569 550 569 543 567 544
575 544 566 539 571 541
568 560 560 535 574 545
574 538 572 540 569 543
567 572 547 563 547 565
544 568 541 578 542 1625
573 1620 569 546 564 545
574 538 572 1621 567 545
575 529 570
name bhutOff
8421 4223 566 543 566 1626
572 1622 577 536 574 1618
569 543 567 1626 573 547
572 1621 573 539 575 1618
570 1624 574 538 572 1621
567 1627 571 548 571 541
569 542 567 545 575 536
573 539 570 542 568 544
575 544 576 536 573 538
572 540 569 543 566 546
574 537 572 540 580 539
570 1623 576 1618 570 569
540 571 549 563 546 1620
568 571 549 570 550 562
547 565 545 567 542 569
541 571 548 563 547 1620
568 1633 576 563 546 565
544 568 542 570 549 562
547 565 545 568 541 550
570 542 577 535 575 537
572 540 569 542 568 544
575 537 573 546 573 538
572 540 569 543 577 535
574 537 572 540 570 569
540 552 568 571 548 563
547 565 544 568 541 571
549 562 547 565 545 574
545 567 543 569 540 545
575 537 572 539 571 541
568 544 576 543 576 536
573 539 571 540 569 543
567 545 574 537 572 540
570 550 569 542 568 544
575 537 573 539 570 541
569 543 566 546 574 545
574 538 572 539 570 542
567 545 575 537 573 538
571 541 568 551 569 1624
574 1619 570 1624 574 1619
570 543 566 1626 572 540
569 535 57
The Raspberry Pi is not able to decode the raw code and shows the following error:
pi@raspberrypi ~ $ sudo irrecord -a /home/pi/temp1.conf
Unknown encoding found.
irrecord: decoding of on failed
irrecord: decoding of off failed
#
# this config file was automatically generated
# using lirc-0.9.0-pre1(emulation) on Sun Mar 13 13:19:20 2016
#
# contributed by
#
# brand: lgac
# model no. of remote control:
# devices being controlled by this remote:
#
begin remote
name lgac
bits 0
flags RC5
eps 30
aeps 100
one 0 0
zero 0 0
gap 28205
toggle_bit_mask 0x0
begin codes
end codes
end remote
pi@raspberrypi ~ $
I am new to the Raspberry Pi. Any help would be great. Thanks in advance. :)
I know this is an old question, but maybe it helps someone else who (like me) stumbles onto this via a web search.
First of all, since this is an AC remote, those are not "the on and off button". An AC remote usually sends the full state of the remote (on/off, set temperature, mode, fan speed, etc) with each button press. This is done in order to keep the remote screen synchronized with the actual AC device (since there is no feedback from the AC to the remote).
So, for example, the first code might be "AC on, 20 degrees, automatic fan speed, mode cooling".
old_timer's question was referring to the fact that you listed the remote as RC5 in your config file ("flags RC5" - RC5 is one protocol used by some remote controls). But it is most likely not RC5...
Your best bet right now (and what I ended up doing) is to record the signals for your most frequently used settings, and create a config file with the actual raw codes. Like this:
begin remote
name MY_REMOTE
flags RAW_CODES
begin raw_codes
name SETTING1
8423 4226 566 544 576 1617
571 1622 576 537 573 1620
...
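Once that is in your lircd.conf and lircd has been restarted, you can replay a recorded setting with irsend (MY_REMOTE and SETTING1 being whatever names you chose above):
irsend SEND_ONCE MY_REMOTE SETTING1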
If you wish to decode this further, your code looks similar to the one on my AC (a Samsung one). The bits (after the 8400/4200 start delimiter) seem to be encoded as 550/550 for a zero, and 550/1600 for a one. (that is, 550us of LED on time followed by 550us of off time encode a zero bit; 550us/1600us encode a one).
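As a rough illustration (a sketch in Python, not LIRC code; the 1000 us threshold is an assumption, picked halfway between the 550/1600 timings above), turning the raw mark/space pairs into bits might look like this:
def decode_bits(timings):
    # timings: flat list of durations in microseconds, alternating LED-on
    # (mark) and LED-off (space), starting after the 8400/4200 header pair.
    bits = []
    for mark, space in zip(timings[0::2], timings[1::2]):
        # ~550 us space -> 0, ~1600 us space -> 1
        bits.append(1 if space > 1000 else 0)
    return bits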
Once you have the bits, you will need to try to make sense of them: change settings and see how the code changes. However, you will soon find that the rabbit hole quickly gets deeper:
- You still need to find out the "endianness" of each byte (my Samsung unit sends the bytes LSB-first - that is, the least significant bit of each byte is first "on the wire"); see the sketch after this list.
- Most AC units also have a checksum (to ensure that there were no errors in the transmission). If you want to generate your own codes, you will also need to compute and transmit the correct checksum.
As mentioned above, I ended up just using the raw codes in my LIRC config file :)