Learning to Segment Object Candidates
Pedro O. O. Pinheiro, Ronan Collobert, Piotr Dollar
Advances in Neural Information Processing Systems 28 (NIPS 2015), pp. 1990-1998
Editors: C. Cortes, N.D. Lawrence, D.D. Lee, M. Sugiyama, R. Garnett
Presentation: Spotlight

Abstract

Recent object detection systems rely on two critical steps: (1) a set of object proposals is predicted as efficiently as possible, and (2) this set of candidate proposals is then passed to an object classifier. Such approaches have been shown to be fast while achieving state-of-the-art detection performance. In this paper, we propose a new way to generate object proposals, introducing an approach based on a discriminative convolutional network. Our model is trained jointly with two objectives: given an image patch, the first part of the system outputs a class-agnostic segmentation mask, while the second part outputs the likelihood of the patch being centered on a full object. At test time, the model is efficiently applied to the whole test image and generates a set of segmentation masks, each assigned a corresponding object likelihood score. We show that our model yields significant improvements over state-of-the-art object proposal algorithms. In particular, compared to previous approaches, our model obtains substantially higher object recall using fewer proposals. We also show that our model generalizes to categories it has not seen during training. Unlike all previous approaches for generating object masks, we do not rely on edges, superpixels, or any other form of low-level segmentation.
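The joint training objective described in the abstract — a per-pixel segmentation term plus a single "centered full object" likelihood term — can be sketched as a weighted sum of binary logistic losses. This is an illustrative sketch only: the helper names and the weighting factor `lam` are assumptions, not the paper's exact formulation.

```python
import math

def logistic_loss(logit, label):
    """Binary logistic loss with labels in {-1, +1}."""
    return math.log(1.0 + math.exp(-label * logit))

def joint_loss(mask_logits, mask_labels, score_logit, score_label, lam=0.5):
    """Joint objective: mean per-pixel mask loss + weighted objectness loss.

    mask_logits/mask_labels: flat lists over the patch's pixels.
    score_logit/score_label: scalar likelihood that the patch is
    centered on a full object. `lam` balances the two terms
    (illustrative value, not from the paper).
    """
    mask_term = sum(
        logistic_loss(l, y) for l, y in zip(mask_logits, mask_labels)
    ) / len(mask_logits)
    score_term = logistic_loss(score_logit, score_label)
    return mask_term + lam * score_term
```

A correct prediction (logits agreeing in sign with the labels) drives both terms toward zero, while a confident wrong prediction incurs a loss that grows roughly linearly in the logit magnitude.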