Grouping-Based Low-Rank Trajectory Completion and 3D Reconstruction

Katerina Fragkiadaki, Marta Salas, Pablo Arbelaez, Jitendra Malik

Advances in Neural Information Processing Systems 27 (NIPS 2014), pp. 55-63.
Edited by Z. Ghahramani, M. Welling, C. Cortes, N.D. Lawrence, and K.Q. Weinberger. Curran Associates, Inc. http://nips.cc/
Presented as a poster at the Neural Information Processing Systems Conference.

Abstract

Extracting the 3D shape of deforming objects in monocular videos, a task known as non-rigid structure-from-motion (NRSfM), has so far been studied only on synthetic datasets and controlled environments. Typically, the objects to reconstruct are pre-segmented, they exhibit limited rotations and occlusions, or full-length trajectories are assumed. In order to integrate NRSfM into current video analysis pipelines, one needs to consider as input realistic, and thus incomplete, tracking, and perform spatio-temporal grouping to segment the objects from their surroundings. Furthermore, NRSfM needs to be robust to noise in both segmentation and tracking, e.g., drifting, segmentation "leaking", optical flow "bleeding", etc. In this paper, we make a first attempt towards this goal, and propose a method that combines dense optical flow tracking, motion trajectory clustering, and NRSfM for 3D reconstruction of objects in videos. For each trajectory cluster, we compute multiple reconstructions by minimizing the reprojection error and the rank of the 3D shape under different rank bounds of the trajectory matrix. We show that dense 3D shape is extracted and trajectories are completed across occlusions and low-textured regions, even under mild relative motion between the object and the camera. We achieve competitive results on a public NRSfM benchmark while using fixed parameters across all sequences and handling incomplete trajectories, in contrast to existing approaches. We further test our approach on popular video segmentation datasets. To the best of our knowledge, our method is the first to extract dense object models from realistic videos, such as those found on YouTube or in Hollywood movies, without object-specific priors.
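The abstract describes completing incomplete point trajectories under a rank bound on the trajectory matrix. As a generic illustration of that idea — not the authors' actual algorithm, which jointly minimizes reprojection error and 3D shape rank — here is a minimal "hard-impute" sketch that fills missing entries of a 2F x P trajectory matrix by alternating a truncated-SVD projection with restoration of the observed entries (the function name and interface are hypothetical):

```python
import numpy as np

def complete_low_rank(W, mask, rank, n_iters=500):
    """Fill missing entries of a trajectory matrix W (2F x P) under a rank bound.

    W    : observed trajectory matrix (missing entries may hold any value)
    mask : boolean array, True where an entry of W was actually observed
    rank : assumed rank bound on the completed matrix

    Iterative hard-impute: alternate between projecting onto the set of
    rank-`rank` matrices (truncated SVD) and resetting observed entries.
    """
    X = np.where(mask, W, 0.0)          # initialize missing entries with zeros
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-r projection
        X = np.where(mask, W, X_low)    # keep observed entries fixed
    return X
```

Running this with several candidate rank bounds, as the paper does per trajectory cluster, yields multiple completions from which a reconstruction can be selected.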