{"title": "On Single Source Robustness in Deep Fusion Models", "book": "Advances in Neural Information Processing Systems", "page_first": 4814, "page_last": 4825, "abstract": "Algorithms that fuse multiple input sources benefit from both complementary and shared information. Shared information may provide robustness against faulty or noisy inputs, which is indispensable for safety-critical applications like self-driving cars. We investigate learning fusion algorithms that are robust against noise added to a single source. We first demonstrate that robustness against single source noise is not guaranteed in a linear fusion model. Motivated by this discovery, two possible approaches are proposed to increase robustness: a carefully designed loss with corresponding training algorithms for deep fusion models, and a simple convolutional fusion layer that has a structural advantage in dealing with noise. Experimental results show that both training algorithms and our fusion layer make a deep fusion-based 3D object detector robust against noise applied to a single source, while preserving the original performance on clean data.", "full_text": "On Single Source Robustness in Deep Fusion Models\n\nThe University of Texas at Austin\n\nThe University of Texas at Austin\n\nJoydeep Ghosh\n\nAustin, TX\n\njghosh@utexas.edu\n\nTaewan Kim\u2217\n\nAustin, TX\n\ntwankim@utexas.edu\n\nAbstract\n\nAlgorithms that fuse multiple input sources bene\ufb01t from both complementary and\nshared information. Shared information may provide robustness against faulty or\nnoisy inputs, which is indispensable for safety-critical applications like self-driving\ncars. We investigate learning fusion algorithms that are robust against noise added\nto a single source. We \ufb01rst demonstrate that robustness against single source\nnoise is not guaranteed in a linear fusion model. Motivated by this discovery, two\npossible approaches are proposed to increase robustness: a carefully designed\nloss with corresponding training algorithms for deep fusion models, and a simple\nconvolutional fusion layer that has a structural advantage in dealing with noise.\nExperimental results show that both training algorithms and our fusion layer make\na deep fusion-based 3D object detector robust against noise applied to a single\nsource, while preserving the original performance on clean data.\n\n1\n\nIntroduction\n\nDeep learning models have accomplished superior performance in several machine learning problems\n[26] including object recognition [24, 40, 42, 15, 18], object detection [37, 16, 7, 36, 30, 35] and\nspeech recognition [17, 14, 38, 5, 2, 4], which use either visual or audio sources. One natural way\nof improving a model\u2019s performance is to make use of multiple input sources relevant to a given\ntask so that enough information can be extracted to build strong features. Therefore, deep fusion\nmodels have recently attracted considerable attention for autonomous driving [21, 3, 33, 25], medical\nimaging [23, 48, 39, 29], and audio-visual speech recognition [19, 32, 41, 6].\nTwo bene\ufb01ts are expected when fusion-based learning models are selected for a given problem.\nFirst, given adequate data, more information from multiple sources can enrich the model\u2019s feature\nspace to achieve higher prediction performance, especially, when different input sources provide\ncomplementary information to the model. 
This expectation coincides with a simple information-theoretic fact: if we have multiple input sources X1, · · · , Xns and a target variable Y, mutual information I(·;·) obeys I(Y; X1, · · · , Xns) ≥ I(Y; Xi) (∀i ∈ [ns]); this follows from the chain rule, I(Y; X1, · · · , Xns) = I(Y; Xi) + I(Y; X−i | Xi), together with the nonnegativity of conditional mutual information.

The second expected advantage is increased robustness against single source faults, which is the primary concern of our work. An underlying intuition comes from the fact that different sources may have shared information so one sensor can partially compensate for others. This type of robustness is critical in real-world fusion models, because each source may be exposed to different types of corruption, but not at the same time. For example, self-driving cars using an RGB camera and ranging sensors like LIDAR and radar are exposed to single source corruption: LIDARs and radars work fine at night whereas RGB cameras do not. Also, each source used in the model may have its own sensing device, and hence not necessarily be corrupted by some physical attack simultaneously with others. It would be ideal if the structure of machine learning based fusion models and shared information could compensate for the corruption and automatically guarantee robustness without additional steps.

∗This work was done when Taewan Kim was at the University of Texas at Austin, prior to joining Amazon.

33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada.

This paper shows that a fusion model needs a supplementary strategy and a specialized structure to avoid vulnerability to noise or corruption on a single source. Our contributions are as follows:

• We show that a fusion model learned with a standard robust training scheme is not guaranteed to provide robustness against noise on a single source. Inspired by the analysis, a novel loss is proposed to achieve the desired robustness (Section 3).
• Two efficient training algorithms for minimizing our loss in deep fusion models are devised to ensure robustness without impacting performance on clean data (Section 4.1).
• We introduce a simple but effective fusion layer which naturally reduces error by applying ensembling to latent convolutional features (Section 4.2).

We apply our loss and the fusion layer to a complex deep fusion-based 3D object detector used in autonomous driving for further investigation in practice. Note that our findings can be easily generalized to other applications exhibiting intermittent defects in a subset of input sources, e.g., robustness given k of ns corrupted sources, and single source robustness should be studied in depth prior to more general cases.

2 Related Works

Deep fusion models have been actively studied in object detection for autonomous vehicles. There exist two major streams classified according to their algorithmic structures: two-stage detectors with the R-CNN (Region-based Convolutional Neural Networks) technique [12, 11, 37, 7, 16], and single stage detectors for faster inference speed [36, 35, 30].

Earlier deep fusion models extended Fast R-CNN [11] to provide better quality of region proposals from multiple sources [21, 1].
With a high-resolution LIDAR, the point cloud was used as a major source for the region proposal stage before the fusion step [8], whereas F-PointNet [33] used it for validating 2D proposals from RGB images and predicting 3D shape and location within the visual frustum. MV3D [3] extended the idea of the region proposal network (RPN) [37] by generating proposals from the RGB image and LIDAR's front view and BEV (bird's eye view) maps. Recent works tried to remove region proposal stages for faster inference and directly fused LIDAR's front view depth image [22] or BEV image [47] with RGB images. ContFuse [27] utilizes both RGB and LIDAR's BEV images with a new continuous fusion scheme, which is further improved in MMF [28] by handling multiple tasks at once. Our experimental results are based on AVOD [25], a recent open-sourced 3D object detector that generates region proposals from an RPN using RGB and LIDAR's BEV images.

Compared to the active efforts in accomplishing higher performance on clean data, very few works have focused on robust learning methods in multi-source settings, to the best of our knowledge. Adaptive fusion methods using gating networks weight the importance of each source automatically [31, 46], but these works lack in-depth studies of the robustness against single source faults. A recent work proposed a gated fusion at the feature level and applied data augmentation techniques with randomly chosen corruption methods [20]. In contrast, our training algorithms are surrogate minimization schemes for the proposed loss function, which is grounded in the analyses of the underlying weakness of fusion methods. Also, the fusion layer proposed in this paper focuses more on how to mix convolutional feature maps channel-wise with simple trainable procedures. For extensive literature reviews, please refer to the recent survey papers about deep multi-modal learning methods in general [34] and for autonomous driving [9].

3 Single Source Robustness of Fusion Models

3.1 Regression on linear fusion data

To show the vulnerability of naive fusion models, we introduce a simple data model and a fusion algorithm. Suppose y is a linear function consisting of three different inherent (latent) components zi ∈ R^{di} (i ∈ {1, 2, 3}). There are two input sources, x1 and x2. Here the ψ's are unknown functions.

y = Σ_{i=1}^{3} βi^T zi,  where  z1 = ψ1(x1),  z2 = ψ2(x2),  z3 = ψ3,1(x1) = ψ3,2(x2)   (1)

Our simple data model simulates a target variable y relevant to two different sources, where each source has its own special information, z1 and z2, and a shared one, z3. For example, if two sources are obtained from an RGB camera and a LIDAR sensor, one can imagine that any features related to objectness are captured in z3 whereas colors and depth information may be located in z1 and z2, respectively. Our objective is to build a regression model by effectively incorporating information from the sources (x1, x2) to predict the target variable y.

Now, consider a fairly simple setting x1 = [z1; z3] ∈ R^{d1+d3} and x2 = [z2; z3] ∈ R^{d2+d3}, where (ψ1, ψ2, ψ3,1, ψ3,2) can be defined accordingly to satisfy (1). A straightforward fusion approach is to stack the sources, i.e., x = [x1; x2] ∈ R^{d1+d2+2d3}, and learn a linear model.
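As a concrete reference point, this data model and the stacked least-squares fit can be simulated in a few lines. The sketch below is illustrative only: it assumes Gaussian latent components, and the dimensions and sample size are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
d1, d2, d3, n = 3, 3, 4, 10_000

# Latent components and true parameters of the data model (1).
beta1, beta2, beta3 = rng.normal(size=d1), rng.normal(size=d2), rng.normal(size=d3)
z1, z2, z3 = rng.normal(size=(n, d1)), rng.normal(size=(n, d2)), rng.normal(size=(n, d3))
y = z1 @ beta1 + z2 @ beta2 + z3 @ beta3

# Direct fusion: stack x1 = [z1; z3] and x2 = [z2; z3], then solve least squares.
X = np.hstack([z1, z3, z2, z3])
h = np.linalg.lstsq(X, y, rcond=None)[0]
g1, g2 = h[d1:d1 + d3], h[d1 + d3 + d2:]

# The weights acting on the duplicated z3 columns must satisfy g1 + g2 = beta3;
# the minimum-norm pseudo-inverse solution splits them evenly: g1 = g2 = beta3 / 2.
print(np.allclose(g1 + g2, beta3), np.allclose(g1, beta3 / 2))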
It is then easy to show that there exists a feasible error-free model for noise-free data:

f_direct(x1, x2) = h1^T x1 + h2^T x2 = (β1^T z1 + g1^T z3) + (β2^T z2 + g2^T z3),  s.t.  g1 + g2 = β3   (2)

where h1 = [β1; g1] and h2 = [β2; g2]. The parameter vectors responsible for the shared information z3 are denoted by g1 and g2.²

Unbalanced robustness (Motivation)  Suppose the true parameters of the data are scalar values, i.e., βi = ci ∈ R, and the influence of the complementary information is relatively small, c1 ≈ c2 and c3 ≫ c1. Assume that the obtained error-free solution's parameters for z3 are unbalanced, i.e., g1 = Δ and g2 = c3 − Δ with some weight parameter Δ ≪ c3, so that g1 gives a negligible contribution. Then add single source corruption δ1 = [ε1; ε3] and δ2 = [ε2; ε3] and compute the absolute difference between the true value y and the prediction from the corrupted data:

|y − f_direct(x1 + δ1, x2)| = |c1 ε1 + Δ ε3|,   |y − f_direct(x1, x2 + δ2)| = |c2 ε2 + (c3 − Δ) ε3|

In this case, adding noise to the source x2 will significantly corrupt the prediction while x1 is relatively robust, because |(c3 − Δ) ε3| ≫ |Δ ε3| for any noise ε3 affecting z3. This simple example illustrates that additional training strategies or components are indispensable to achieve a fusion model that stays robust even if one of the sources is disturbed. The next section introduces a novel loss for balanced robustness against a fault in a single source.

²In practice, Y = [X1, X2][h1; h2] has to be solved for X1 ∈ R^{n×(d1+d3)}, X2 ∈ R^{n×(d2+d3)} and Y ∈ R^n with a large enough number n of data samples. Then a standard least squares solution using a pseudo-inverse gives h1 = [β1; β3/2], h2 = [β2; β3/2]. This is equivalent to the solution robust against random noise added to all the sources at once, which is vulnerable to single source faults (Section 3.2).
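To make the imbalance concrete, the following check evaluates both corrupted predictions for an unbalanced error-free solution (a minimal sketch; the constants c_i, Δ, and the noise draws are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(1)
c1, c2, c3 = 1.0, 1.0, 10.0        # c1 ~ c2 and c3 >> c1
delta = 0.1                        # unbalanced split: g1 = delta, g2 = c3 - delta
z1, z2, z3 = rng.normal(size=3)
y = c1 * z1 + c2 * z2 + c3 * z3    # scalar version of the data model (1)

def f_direct(x1, x2):
    # Error-free solution (2) with h1 = [c1; delta], h2 = [c2; c3 - delta].
    return c1 * x1[0] + delta * x1[1] + c2 * x2[0] + (c3 - delta) * x2[1]

eps1, eps2, eps3 = rng.normal(size=3)  # single source corruption
err1 = abs(y - f_direct([z1 + eps1, z3 + eps3], [z2, z3]))  # = |c1*eps1 + delta*eps3|
err2 = abs(y - f_direct([z1, z3], [z2 + eps2, z3 + eps3]))  # = |c2*eps2 + (c3-delta)*eps3|
print(err1, err2)  # corruption on x2 is amplified by the factor (c3 - delta)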
3.2 Robust learning for single source noise

Fusion methods are not guaranteed to provide robustness against faults in a single source without additional supervision. We also demonstrate later in this section that naive regularization or robust learning methods are not sufficient for this kind of robustness. Therefore, a supplementary constraint or strategy needs to be considered in training, one which can correctly guide the learned parameters toward the desired robustness.

One essential requirement of fusion models is showing balanced performance regardless of which source the corruption is added to. If the model is significantly vulnerable to corruption in one source, the model becomes untrustworthy, and we need to balance the degradation levels across faults in different input sources. For example, suppose there is a model robust against noise in RGB channels but showing huge degradation in performance for any fault of LIDAR. Then the overall system should be considered untrustworthy, because there exist certain corruptions or environments which can consistently fool the model. Our loss for such robustness, MAXSSN (Maximum Single Source Noise), is introduced to handle this issue, and further analyses are provided under the linear fusion data model explained in Section 3.1. This loss makes the model focus on corruption of a single source (SSN) rather than on noise added to all the sources at once (ASN).

Definition 1. For multiple sources x1, · · · , xns and a target variable y, denote a predefined loss function by L. If each source xi is perturbed with some additive noise εi for i ∈ [ns], the MAXSSN loss for a model f is defined as follows:

L_MAXSSN(f, ε) ≜ max_i { L(y, f(x1, · · · , xi−1, xi + εi, xi+1, · · · , xns)) }_{i=1}^{ns}

Another key principle in our robust training is to retain the model's performance on clean data. Although techniques like data augmentation generally help improve a model's generalization error, learning a model robust against certain types of perturbation, including adversarial attacks, may harm the model's accuracy on non-corrupt data [43]. Deterioration of the model's ability on normal data is an unwanted side effect, and hence our approach aims to avoid it.

Random noise  To investigate the importance of our MAXSSN loss, we revisit the linear fusion data model with the optimal direct fusion model f_direct of the regression problem introduced in Section 3.1. Suppose the objective is to find a model with robustness against single source noise, while preserving error-free performance, i.e., unchanged loss under clean data. For the noise model, consider ε = [δ1; δ2] where δ1 = [ε1; ε3] and δ2 = [ε2; ε4], which satisfy E[εi] = 0, Var(εi) = σ²I, and E[εi εj^T] = 0 for i ≠ j. Note that the noises added to the shared information, ε3 and ε4, are not identical, which resembles direct perturbation of the input sources in practice. For example, noise directly affecting a camera lens does not need to perturb other sources.

Optimal fusion model for MAXSSN  The robust linear fusion model f(x1, x2) = (w1^T z1 + g1^T z3) + (w2^T z2 + g2^T z3) is found by minimizing L_MAXSSN(f, ε) over the parameters w1, w2, g1 and g2. As shown in the previous section, any f_direct satisfying w1 = β1, w2 = β2 and g1 + g2 = β3 achieves zero error on clean data. Therefore, the overall optimization problem can be reduced to the following one:

min_{g1,g2} max{ L(y, f_direct(x1 + δ1, x2)), L(y, f_direct(x1, x2 + δ2)) }  s.t.  g1 + g2 = β3   (3)
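Before solving (3) in closed form, its objective can be probed numerically. The sketch below (NumPy; the dimensions, β's, and σ are arbitrary illustrative choices) estimates both single-source losses by Monte Carlo for a given split g1, with g2 = β3 − g1:

import numpy as np

rng = np.random.default_rng(2)
d1, d2, d3, sigma = 3, 3, 4, 0.5
beta1, beta2, beta3 = rng.normal(size=d1), rng.normal(size=d2), rng.normal(size=d3)

def maxssn_objective(g1, n=200_000):
    """Monte Carlo estimate of the objective in (3) for g2 = beta3 - g1."""
    g2 = beta3 - g1
    e1, e3 = sigma * rng.normal(size=(n, d1)), sigma * rng.normal(size=(n, d3))
    e2, e4 = sigma * rng.normal(size=(n, d2)), sigma * rng.normal(size=(n, d3))
    # y - f_direct(x1 + delta1, x2) = -(beta1.eps1 + g1.eps3), and similarly for x2,
    # so each expected squared loss equals sigma^2 (||beta_i||^2 + ||g_i||^2).
    loss1 = np.mean((e1 @ beta1 + e3 @ g1) ** 2)
    loss2 = np.mean((e2 @ beta2 + e4 @ g2) ** 2)
    return max(loss1, loss2)

print(maxssn_objective(beta3 / 2))  # the balanced split used by plain least squares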
If we use a standard expected squared loss L(y, f(x1, x2)) = E[(y − f(x1, x2))²] and solve the optimization problem, the following solution L*_MAXSSN with corresponding parameters g1*, g2* can be obtained, and there exist three cases based on the relative sizes of the ||βi||₂²:

(L*_MAXSSN, g1*, g2*) =
  ( σ²||β2||₂², β3, 0 )   if ||β1||₂² + ||β3||₂² ≤ ||β2||₂²
  ( σ²||β1||₂², 0, β3 )   if ||β2||₂² + ||β3||₂² ≤ ||β1||₂²
  ( σ²( (||β1||₂² + ||β2||₂²)/2 + ||β3||₂²/4 + (||β2||₂² − ||β1||₂²)²/(4||β3||₂²) ),
    (β3/2)(1 + (||β2||₂² − ||β1||₂²)/||β3||₂²), (β3/2)(1 − (||β2||₂² − ||β1||₂²)/||β3||₂²) )   otherwise   (4)

The three cases reflect the relative influence of each weight vector for zi. For instance, if z2 has larger importance compared to the rest in generating y, the optimal way of balancing the effect of noise over z3 is to remove all the influence of z2 in x2 by setting g2 = 0. When neither z1 nor z2 dominates in importance, i.e., |(||β2||₂² − ||β1||₂²)/||β3||₂²| < 1, the optimal solution tries to make L(y, f_direct(x1 + δ1, x2)) = L(y, f_direct(x1, x2 + δ2)).

Comparison with the standard robust fusion model  Minimizing a loss with noise added to a model's input is a standard process in robust learning. The same strategy can be applied to learn fusion models by considering all sources as a single combined source and then adding noise to all the sources at once. However, this simple strategy cannot achieve low error in terms of single source robustness. The optimal solution to min_{g1,g2} E[(y − f_direct(x1 + δ1, x2 + δ2))²], a least squares solution, is achieved when g1 = g2 = β3/2. The corresponding MAXSSN loss can be evaluated as L′_MAXSSN = σ² max{ ||β1||₂² + ||β3||₂²/4, ||β2||₂² + ||β3||₂²/4 }. A nontrivial gap exists between L′_MAXSSN and L*_MAXSSN, which is directly proportional to the data model's inherent characteristics:

L′_MAXSSN − L*_MAXSSN ≥
  (1/4) σ²||β3||₂²                   if |(||β2||₂² − ||β1||₂²)/||β3||₂²| ≥ 1
  (1/4) σ²| ||β2||₂² − ||β1||₂² |    otherwise   (5)

If either z1 or z2 has more influence on the target value y than the other components, the single source robustness of the model trained with the MAXSSN loss is better than that of the fusion model trained for general noise robustness, by an amount proportional to the influence of the shared feature z3. Otherwise, the gap's lower bound is proportional to the difference in complementary information, | ||β2||₂² − ||β1||₂² |/4.

Remark 1. In linear systems such as the one studied above, having redundant information in the feature space is similar to multicollinearity in statistics. In this case, feature selection methods usually try to remove such redundancy. However, this redundant or shared information helps prevent degradation of the fusion model when a subset of the input sources is corrupted.

Remark 2. Similar analyses and a loss definition against adversarial attacks [13] are provided in Appendix A.2.
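As a sanity check on (4) and (5), the closed-form optimum can be compared with the MAXSSN loss of the standard least squares solution. This self-contained sketch (arbitrary β's; the σ is again an illustrative choice) verifies that the lower bound in (5) holds:

import numpy as np

rng = np.random.default_rng(3)
sigma = 0.5
b1, b2, b3 = rng.normal(size=3), rng.normal(size=3), rng.normal(size=4)
n1, n2, n3 = np.sum(b1**2), np.sum(b2**2), np.sum(b3**2)

# Closed-form optimum L*_MAXSSN from (4).
if n1 + n3 <= n2:
    L_star = sigma**2 * n2
elif n2 + n3 <= n1:
    L_star = sigma**2 * n1
else:
    L_star = sigma**2 * ((n1 + n2) / 2 + n3 / 4 + (n2 - n1)**2 / (4 * n3))

# Standard robust fusion (g1 = g2 = beta3 / 2) evaluated under MAXSSN.
L_prime = sigma**2 * max(n1 + n3 / 4, n2 + n3 / 4)

# Lower bound (5) on the gap.
bound = 0.25 * sigma**2 * (n3 if abs(n2 - n1) >= n3 else abs(n2 - n1))
print(L_prime - L_star >= bound - 1e-12)  # True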
4 Robust Deep Fusion Models

In simple linear settings, our analyses illustrate that using the MAXSSN loss can effectively minimize the degradation of a fusion model's performance. This suggests a training strategy for equipping complex deep fusion models with robustness against single source faults. A principal factor considered in designing a common framework for our algorithms is the preservation of the model's performance on clean data while minimizing a loss that defends against corruption. Therefore, our training algorithms use data augmentation so that the model encounters both clean and corrupted data. The second way of achieving robustness is to take advantage of the fusion method's structure. A simple but effective method of mixing convolutional features coming from different input sources is introduced later in this section.

4.1 Robust training algorithms for single source noise

In the previous section, we solved problem (3) by optimizing over the flexible parameters g1 and g2. If the parts of the input sources contributing to z3 are known, then indeed we can achieve this goal. In practice, however, it is difficult to know which parts of an input source (or latent representation) are related to shared information and which parameters are flexible. Therefore, our common training framework alternately provides clean samples and corrupted samples per iteration to preserve the original performance of the model on uncontaminated data.³ On top of this strategy, one standard robust training scheme and two algorithms for minimizing the MAXSSN loss are introduced for handling robustness against noise in different sources.

³We also tried fine-tuning only a subset of the model's parameters, θ_fusion ⊂ f, to preserve the parts essential for extracting features from normal data. Although this strategy is similar to optimizing over only g1 and g2 in our linear fusion case, training the whole network from the beginning shows better performance in practice. See Appendix B for a detailed comparison.

Standard robust training method  A standard robust training algorithm can be developed by considering all ns sources as a single combined source. Given noise generating functions φi(·) (i ∈ [ns]), the algorithm generates and adds corruption to all the sensors at once. Then the corresponding loss can be computed to update parameters using back-propagation. This algorithm is denoted by TRAINASN and tested in the experiments to investigate whether the procedure is also able to cover robustness against single source noise.

Algorithm 1 TRAINSSN
  for i_iter = 1 to m do
    Sample (y, {xi}_{i=1}^{ns})
    if i_iter ≡ 1 (mod 2) then
      for j = 1 to ns do
        Generate noise εj = φj(xj)
        L̂_j^{(i_iter)} ← L(y, f({xj + εj, x−j}))
      end for
      L^{(i_iter)} ← max_j L̂_j^{(i_iter)}
    else
      L^{(i_iter)} ← L(y, f({xi}_{i=1}^{ns}))
    end if
    Update f using ∇L^{(i_iter)}
  end for

Algorithm 2 TRAINSSNALT
  for i_iter = 1 to m do
    Sample (y, {xi}_{i=1}^{ns})
    if i_iter ≡ 1 (mod 2) then
      j ← (⌊i_iter/2⌋ mod ns) + 1
      Generate noise εj = φj(xj)
      L^{(i_iter)} ← L(y, f({xj + εj, x−j}))
    else
      L^{(i_iter)} ← L(y, f({xi}_{i=1}^{ns}))
    end if
    Update f using ∇L^{(i_iter)}
  end for
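A minimal PyTorch-style sketch of TRAINSSN for a generic multi-source model is given below; the model interface (taking a list of source tensors), the data loader format, and the noise functions are illustrative assumptions rather than the exact setup of our detector:

import torch

def train_ssn(model, loader, criterion, noise_fns, optimizer):
    """TRAINSSN: alternate clean batches with worst-case single-source-noise batches."""
    for i_iter, (xs, y) in enumerate(loader):  # xs: list of per-source tensors
        if i_iter % 2 == 0:
            # One forward pass per source, perturbing only that source.
            losses = []
            for j, phi in enumerate(noise_fns):
                noisy = [x + phi(x) if k == j else x for k, x in enumerate(xs)]
                losses.append(criterion(model(noisy), y))
            loss = torch.stack(losses).max()    # gradient flows only to the max branch
        else:
            loss = criterion(model(xs), y)      # clean batch preserves clean performance
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

For readability this sketch keeps all ns computation graphs in memory and backpropagates through the max, which yields the same parameter update as recomputing the maximizing branch; the cheaper recompute strategy with ns + 1 forward passes and a single backward pass is discussed next.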
Minimization of the MAXSSN loss  Minimizing the MAXSSN loss requires ns (the number of input sources) forward propagations within one iteration. Each propagation needs a different set of corrupted samples, generated by adding single source noise to the fixed clean mini-batch of data. There are two possible approaches to computing gradients properly from these multiple passes. First, we can run back-propagation ns times to save the gradients temporarily without updating any parameters; the saved gradients corresponding to the maximum loss are then used to update the parameters. However, this process requires not only ns forward and backward passes but also memory usage proportional to ns for saving the gradients. Another reasonable approach is to run ns forward passes to find the maximum loss and compute gradients by going back to the corresponding set of corrupted samples. Algorithm 1 adopts this idea for its efficiency: ns + 1 forward passes and one back-propagation. A faster version of the algorithm, TRAINSSNALT, is also considered, since multiple forward passes may take longer as the number of sources increases. This algorithm ignores the maximum loss and alternately augments corrupted data. By a slight abuse of notation, the symbols used in our algorithms also represent the iteration steps with mini-batch sizes greater than one. Also, f(x1, · · · , xj−1, xj + εj, xj+1, · · · , xns) is shortened to f({xj + εj, x−j}) in the algorithms.

4.2 Feature fusion methods

Fusion of features extracted from multiple input sources can be done in various ways [3]. One of the popular methods is to fuse via an element-wise mean operation [25], but this assumes that each feature must have the same shape, i.e., width, height, and number of channels for a 3D feature. An element-wise mean can also be viewed as averaging channels from different 3D features, and it carries the underlying assumption that the channels of each feature share the same information regardless of the input source of origin. Therefore, the risk of becoming vulnerable to single source corruption may increase with this simple mean fusion method.

Figure 1: Latent ensemble layer (LEL): 1) concatenation of the source feature maps; 2) 1×1 convolution with sparse constraints (learnable parameters wj).

Our fusion method, the latent ensemble layer (LEL), is devised with three objectives: (i) maintaining the known advantage of ensemble methods, namely error reduction [45, 44], (ii) admitting source-specific features to survive even after the fusion procedure, and (iii) allowing each source to provide a different number of channels. The proposed layer learns parameters so that channels of the 3D features from the different sources can be selectively mixed. Sparse constraints are introduced to let the training procedure find good subsets of channels to be fused across the ns feature maps. For example, mixing the ith channel of the convolutional feature from an RGB image with the jth and kth channels of the LIDAR's latent feature is possible in our LEL, whereas in an element-wise mean layer the ith latent channel from RGB is only mixed with the other sources' ith channels.

In practice, this layer can be easily constructed by using 1 × 1 convolutions with the ReLU activation and ℓ1 constraints.
We also apply an activation function to supplement a semi-adaptive behavior to the fusion procedure. The depth of the output channel is set to d̂ = max_i{di}, and we set the hyper-parameter for the ℓ1 constraint to 0.01 in the experiments. Definition 2 explains the details of our LEL, and Figure 1 visualizes the overall process.

Definition 2 (Latent ensemble layer). Suppose we have ns convolutional features zi ∈ R^{a×b×di} from different input sources (i ∈ [ns]), which can be stacked as z = (z1, · · · , zns) ∈ R^{a×b×d_sum} (d_sum = Σ_{i=1}^{ns} di). The kth channel of the stacked feature is denoted by [z]k ∈ R^{a×b}. Let wj = (w1^{(j)}, · · · , w_{d_sum}^{(j)}) be a d_sum-dimensional weight vector to mix the zi channel-wise. Then the LEL outputs ẑ ∈ R^{a×b×d̂}, where each channel is computed as

[ẑ]j = φ(wj ⊙ z) ≜ φ( Σ_{k=1}^{d_sum} w_k^{(j)} [z]k ),

with some activation function φ and sparse constraints ||wj||₀ ≤ t for all j ∈ {1, · · · , d̂}.
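A minimal PyTorch sketch of this layer is shown below. Following the construction above, the ℓ0 constraint of Definition 2 is relaxed to an ℓ1 penalty on the 1 × 1 convolution weights (weight 0.01 as in the experiments); the class name and interface are illustrative:

import torch
import torch.nn as nn

class LatentEnsembleLayer(nn.Module):
    """Channel-wise mixing of stacked source features via a sparse 1x1 convolution."""

    def __init__(self, source_channels, l1_weight=0.01):
        super().__init__()
        d_sum, d_hat = sum(source_channels), max(source_channels)
        self.mix = nn.Conv2d(d_sum, d_hat, kernel_size=1, bias=False)
        self.act = nn.ReLU()
        self.l1_weight = l1_weight

    def forward(self, features):
        # features: list of (N, d_i, a, b) maps with matching spatial sizes.
        z = torch.cat(features, dim=1)   # stack along the channel axis
        return self.act(self.mix(z))     # [z_hat]_j = phi(w_j . z)

    def l1_penalty(self):
        # Added to the task loss to encourage sparse channel-mixing weights.
        return self.l1_weight * self.mix.weight.abs().sum()

# Usage: fuse an RGB feature map (32 channels) with a LIDAR BEV map (24 channels).
lel = LatentEnsembleLayer([32, 24])
fused = lel([torch.randn(2, 32, 8, 8), torch.randn(2, 24, 8, 8)])
print(fused.shape)  # torch.Size([2, 32, 8, 8])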
R4beCXL31gNu9Vn7dYLerfNgejJ+qbDo0IhiMRRvRWeltRYMzws6aGhByU9MCfulbhnNI9KarxJeFbSs9uh+QwRimJR3KU4KYKKhmiCE5wfX3Jc5Dyxr5OV4pSSlLB6dbeUVeeOUz9lzGDwdttR8ed3m7ud4gRas15Yr6225VjvrV3rk9W1+ha0/jSeN142XjWnze/NH82fN6UrjaLnmVW5mr/+AmRdKII=AAAF5XicbdTPbtMwGADwdKwwyr8NxAkJIiak7jI1CAkukzYxbRx2KN36Z2tC5Dhu6zWJM9uh6yw/AAduiCvPxVPwCjhbhpOYSFU+fb/vs524cZBGmPFO53dj5c5q8+69tfutBw8fPX6yvvF0wEhGIepDEhE6CgBDEU5Qn2MeoVFKEYiDCA2D+cfch18RZZgkJ3yZIi8G0wRPMARcpfz1b2N3BrhwY8BnwURcSen55ztuOsNtNyBRyJaxuomF9M9dEhKuC7dcTjFIphG6yMvdCE1422VZ7Iv5jiO/iNAXLkeXXKiclHKh8irbPt+SYz2K589diqczvuWvb3a2O9eXbQZOEWxaxdX1N1b33ZDALEYJhxFgbOx0Uu4JQDmGEZItN2MoBXAOpmiswgTEiHni+p1J+43KhPaEUPVLuH2dLXcIELP84VVlvlhWtzz5PxtnfPLBEzhJM44SeDPRJItsTux8A+wQUwR5tFQBgBSrtdpwBiiAXG1TZaQTxxP54vJhKtPHWcQxJQvZarkJWkASxyAJhRtcyn87eSllDZcalwYyjczAPY17BlKN1MBUY2rghcYLAxcKK3/AekFfd/cNHGgcGDjUODRwpHFk4KnGUwOPNR4beCXL31gNu9Vn7dYLerfNgejJ+qbDo0IhiMRRvRWeltRYMzws6aGhByU9MCfulbhnNI9KarxJeFbSs9uh+QwRimJR3KU4KYKKhmiCE5wfX3Jc5Dyxr5OV4pSSlLB6dbeUVeeOUz9lzGDwdttR8ed3m7ud4gRas15Yr6225VjvrV3rk9W1+ha0/jSeN142XjWnze/NH82fN6UrjaLnmVW5mr/+AmRdKII=(||wj||1\uf8fft)AAAFe3icbdRPb9MwFABwb7Qwyr8OjhyIqJAGQlODQHDcxLRx2KF0/bsmqhzHXc2cOLUdus7NkU/DFT4MHwaJpMvkJMZSlKf3e8923MpeRImQ7fafre07tfrdezv3Gw8ePnr8pLn7dCBYzBHuI0YZH3lQYEpC3JdEUjyKOIaBR/HQu/yc+fA75oKwsCdXEXYDeBGSGUFQpqlp88Xeeu14jPpiFaQvtUym39brqW05FC8s+XrabLX325thmYGdBy2Qj850t3bk+AzFAQ4lolCIid2OpKsglwRRnDScWOAIokt4gSdpGMIAC1dtviSxXqUZ35oxnj6htDbZYoeCgch2mlYGUM5F1bLk/2wSy9knV5EwiiUO0c1Cs5haklnZsVg+4RhJukoDiDhJ92qhOeQQyfTwSjP1bFdlm8umKS0fxFQSzpZJo+GEeIlYEMDQV453lSgn25E3U1dJUsGVxpWBQqMw8FDjoYFcIzcw0hgZuNC4MHCZYunfUi3o6+6+gQONAwOHGocGjjSODBxrHBt4pvHMwGuN1wZ2yt/aqRZ0b5s91U2qPzo6zRVBqk6rrWhcUGPP6KSgJ4YeF/TYXLhb4K7RPCqocZLovKDnt1PLOWYcByp/J6qXByX18YyEJLtUkkmec9WRTpaKI84iJqrVnUI2vXfs6i1jBoN3+3Yaf33fOmjnN9AOeA5egj1gg4/gAHwBHdAHCPwAP8Ev8Lv2t96qv6m/vSnd3sp7noHSqH/4ByDI/KE=AAAFe3icbdRPb9MwFABwb7Qwyr8OjhyIqJAGQlODQHDcxLRx2KF0/bsmqhzHXc2cOLUdus7NkU/DFT4MHwaJpMvkJMZSlKf3e8923MpeRImQ7fafre07tfrdezv3Gw8ePnr8pLn7dCBYzBHuI0YZH3lQYEpC3JdEUjyKOIaBR/HQu/yc+fA75oKwsCdXEXYDeBGSGUFQpqlp88Xeeu14jPpiFaQvtUym39brqW05FC8s+XrabLX325thmYGdBy2Qj850t3bk+AzFAQ4lolCIid2OpKsglwRRnDScWOAIokt4gSdpGMIAC1dtviSxXqUZ35oxnj6htDbZYoeCgch2mlYGUM5F1bLk/2wSy9knV5EwiiUO0c1Cs5haklnZsVg+4RhJukoDiDhJ92qhOeQQyfTwSjP1bFdlm8umKS0fxFQSzpZJo+GEeIlYEMDQV453lSgn25E3U1dJUsGVxpWBQqMw8FDjoYFcIzcw0hgZuNC4MHCZYunfUi3o6+6+gQONAwOHGocGjjSODBxrHBt4pvHMwGuN1wZ2yt/aqRZ0b5s91U2qPzo6zRVBqk6rrWhcUGPP6KSgJ4YeF/TYXLhb4K7RPCqocZLovKDnt1PLOWYcByp/J6qXByX18YyEJLtUkkmec9WRTpaKI84iJqrVnUI2vXfs6i1jBoN3+3Yaf33fOmjnN9AOeA5egj1gg4/gAHwBHdAHCPwAP8Ev8Lv2t96qv6m/vSnd3sp7noHSqH/4ByDI/KE=AAAFe3icbdRPb9MwFABwb7Qwyr8OjhyIqJAGQlODQHDcxLRx2KF0/bsmqhzHXc2cOLUdus7NkU/DFT4MHwaJpMvkJMZSlKf3e8923MpeRImQ7fafre07tfrdezv3Gw8ePnr8pLn7dCBYzBHuI0YZH3lQYEpC3JdEUjyKOIaBR/HQu/yc+fA75oKwsCdXEXYDeBGSGUFQpqlp88Xeeu14jPpiFaQvtUym39brqW05FC8s+XrabLX325thmYGdBy2Qj850t3bk+AzFAQ4lolCIid2OpKsglwRRnDScWOAIokt4gSdpGMIAC1dtviSxXqUZ35oxnj6htDbZYoeCgch2mlYGUM5F1bLk/2wSy9knV5EwiiUO0c1Cs5haklnZsVg+4RhJukoDiDhJ92qhOeQQyfTwSjP1bFdlm8umKS0fxFQSzpZJo+GEeIlYEMDQV453lSgn25E3U1dJUsGVxpWBQqMw8FDjoYFcIzcw0hgZuNC4MHCZYunfUi3o6+6+gQONAwOHGocGjjSODBxrHBt4pvHMwGuN1wZ2yt/aqRZ0b5s91U2qPzo6zRVBqk6rrWhcUGPP6KSgJ4YeF/TYXLhb4K7RPCqocZLovKDnt1PLOWYcByp/J6qXByX18YyEJLtUkkmec9WRTpaKI84iJqrVnUI2vXfs6i1jBoN3+3Yaf33fOmjnN9AOeA5egj1gg4/gAHwBHdAHCPwAP8Ev8Lv2t96qv6m/vSnd3sp7noHSqH/4ByDI/KE=AAAFe3icbdRPb9MwFABwb7Qwyr8OjhyIqJAGQlODQHDcxLRx2KF0/bsmqhzHXc2cOLUdus7NkU/DFT4MHwaJpMvkJMZSlKf3e8923MpeRImQ7fafre07tfrdezv3Gw8ePnr8pLn7dCBYzBHuI0YZH3lQYEpC3JdEUjyKOIaBR/HQu/yc+fA75oKwsCdXEXYDeBGS
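One way to read the reconstructed expression above is as a 1x1 convolution over the latent channels concatenated from all sources, with an L1 bound on each mixing vector w_j. The NumPy sketch below illustrates that reading only; the shapes, the rescaling used to enforce the constraint, and all names are illustrative assumptions, not the paper's implementation.

import numpy as np

rng = np.random.default_rng(0)

def fuse(z, W, t=1.0):
    """z: (d_sum, H, W) latent features concatenated over sources.
    W: (d_out, d_sum), one mixing vector w_j per fused output channel."""
    # Enforce ||w_j||_1 <= t by rescaling rows that exceed the bound
    # (one simple way to satisfy the constraint; an assumption here).
    norms = np.abs(W).sum(axis=1, keepdims=True)
    W = W * np.minimum(1.0, t / np.maximum(norms, 1e-12))
    # [z_hat]_j = sum_k w_k^(j) [z]_k, i.e., a 1x1 convolution over channels.
    return np.einsum('jk,khw->jhw', W, z)

z1 = rng.normal(size=(32, 8, 8))      # latent features, source 1
z2 = rng.normal(size=(32, 8, 8))      # latent features, source 2
z = np.concatenate([z1, z2], axis=0)  # d_sum = 64 concatenated channels
W = rng.normal(size=(32, 64)) / 64.0  # d_out = 32 fused channels
print(fuse(z, W).shape)               # (32, 8, 8)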
(a) Original   (b) Gaussian noise   (c) Downsampling

Figure 2: Visualization of corrupted samples: (left) RGB images, (right) LIDAR point clouds. The point clouds are projected onto the 2D image plane for easier visual comparison.

5 Experimental Results

We test our algorithms and the LEL fusion method on 3D and bird's eye view (BEV) object detection tasks using the car class of the KITTI dataset [10]. 3D detection is both an important problem in self-driving cars and one where multiple sensors can contribute fruitfully by providing both complementary and shared information. In contrast, models for 2D object detection rely heavily on RGB data, which typically dominates the other modalities. Because our experiments involve randomly generated corruption, each task is evaluated 5 times and average scores are reported with 95% confidence intervals; a validation set is used for ease of manipulating data and repeated evaluation. We follow the split of Ku et al. [25]: 3712 frames for training and 3769 for validation. Results are reported for the three difficulty levels defined by KITTI (easy, moderate, hard) using the standard Average Precision (AP) metric. AVOD [25], a recently open-sourced 3D object detector with a feature pyramid network, is selected as the baseline algorithm.

Four algorithms are compared: AVOD trained on (i) clean data, (ii) data augmented with ASN samples (TRAINASN), (iii) SSN-augmented data with direct MAXSSN loss minimization (TRAINSSN), and (iv) SSN-augmented data with an indirect, alternating variant (TRAINSSNALT). The AVOD architecture is varied to use either element-wise mean fusion layers or our LELs. We follow the original training setup of AVOD, e.g., 120k iterations of the ADAM optimizer with an initial learning rate of 0.0001.⁴

Corruption methods   Gaussian noise drawn i.i.d. from N(0, σ²_Gaussian) is added directly to the pixel values (r, g, b) of an image and to the coordinate values (x, y, z) of a LIDAR point. σ_Gaussian is set to 0.75τ experimentally, with τ_RGB = 255 and τ_LIDAR = 0.2. The second method, downsampling, keeps only 16 of the 64 lasers of the LIDAR data; to match this effect, 3 out of every 4 horizontal lines of an RGB image are deleted. The effect of corruption on each input source is visualized in Figure 2, where the color of a 2D LIDAR image encodes distance from the sensor; both corruption methods are sketched in code below. Although our analyses in Section 3.2 assume identical noise variances, it is nontrivial to set equal noise levels for different modalities in practice, e.g., RGB pixels vs. points in 3D space. Nevertheless, the underlying objective of our MAXSSN loss, balancing degradation rates across different input sources' faults, does not depend on the choice of noise type or level.

⁴ Our methods are implemented with TensorFlow on top of the official AVOD code. The computing machine has an Intel Xeon E5-1660v3 CPU with Nvidia Titan X Pascal GPUs. The source code is available at https://github.com/twankim/avod_ssn.
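The following is a minimal NumPy sketch of the two corruption methods described above. The image size, the per-point laser indices, and the clipping step are illustrative assumptions rather than the exact AVOD data pipeline.

import numpy as np

rng = np.random.default_rng(0)

SIGMA_FACTOR = 0.75              # sigma_Gaussian = 0.75 * tau (from the text)
TAU_RGB, TAU_LIDAR = 255.0, 0.2  # per-modality scales tau_RGB, tau_LIDAR

def gaussian_ssn(x, tau):
    """Add i.i.d. N(0, (0.75 * tau)^2) noise to a single source."""
    return x + rng.normal(scale=SIGMA_FACTOR * tau, size=x.shape)

def downsample_lidar(points, laser_ids, keep_every=4):
    """Keep 16 of 64 lasers; a per-point laser index is assumed available."""
    return points[laser_ids % keep_every == 0]

def downsample_image(img, keep_every=4):
    """Delete 3 of every 4 horizontal lines to mirror the LIDAR effect."""
    return img[::keep_every]

img = rng.integers(0, 256, size=(375, 1242, 3)).astype(float)  # (r, g, b)
pts = rng.normal(size=(1000, 3))                               # (x, y, z)
ids = rng.integers(0, 64, size=1000)                           # hypothetical laser index
noisy_img = np.clip(gaussian_ssn(img, TAU_RGB), 0, 255)        # clipping is our choice
noisy_pts = gaussian_ssn(pts, TAU_LIDAR)
sparse_pts, sparse_img = downsample_lidar(pts, ids), downsample_image(img)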
Evaluation metrics for single source robustness   To assess robustness against single source noise, we introduce a new metric, minAP: the AP score is evaluated on the dataset with exactly one input source corrupted, and after going over all ns sources, minAP reports the lowest of the resulting ns AP scores. Our second metric, maxDiffAP, computes the maximum absolute difference among those scores; it measures how balanced the single source robustness is across input sources, so a low maxDiffAP indicates well-balanced robustness. Both metrics are sketched in code below.
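Concretely, both metrics reduce to simple statistics over the per-source AP scores. A minimal Python sketch follows; the function and variable names are ours, for illustration only.

import numpy as np

def single_source_metrics(ap_per_source):
    """ap_per_source: AP evaluated ns times, each time with exactly one
    input source corrupted and the others left clean."""
    ap = np.asarray(ap_per_source, dtype=float)
    min_ap = ap.min()                  # worst case over single-source faults
    max_diff_ap = ap.max() - ap.min()  # imbalance across sources
    return min_ap, max_diff_ap

# Two sources (ns = 2): AP with noisy RGB vs. AP with noisy LIDAR.
print(single_source_metrics([62.1, 65.5]))  # approximately (62.1, 3.4)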
Table 1: Car detection (3D/BEV) performance of AVOD with element-wise mean fusion layers and latent ensemble layers (LEL) against Gaussian SSN on the KITTI validation set. Each cell lists Easy / Moderate / Hard.

Fusion method: Mean

(Clean Data)       AP3D(%)                                  APBEV(%)
AVOD [25]          76.41 / 72.74 / 66.86                    89.33 / 86.49 / 79.44
+TRAINASN          75.96 / 66.68 / 65.97                    88.63 / 79.45 / 78.79
+TRAINSSN          76.28 / 67.10 / 66.51                    88.86 / 79.60 / 79.11
+TRAINSSNALT       77.46 / 67.61 / 66.06                    89.68 / 86.71 / 79.41

(Gaussian SSN)     min AP3D(%)                              min APBEV(%)
AVOD [25]          47.41±0.28 / 41.84±0.17 / 36.47±0.16     65.63±0.28 / 58.02±0.23 / 50.43±0.14
+TRAINASN          61.53±0.57 / 52.72±0.08 / 47.25±0.13     87.71±0.14 / 78.37±0.06 / 77.85±0.08
+TRAINSSN          71.65±0.31 / 62.14±0.08 / 56.78±0.12     88.21±0.08 / 78.90±0.09 / 77.92±0.11
+TRAINSSNALT       71.66±0.48 / 57.61±0.12 / 55.90±0.11     89.42±0.04 / 79.56±0.06 / 77.92±0.05

(Gaussian SSN)     max DiffAP3D(%)                          max DiffAPBEV(%)
AVOD [25]          26.70±0.52 / 22.42±0.29 / 20.92±0.25     22.27±0.41 / 20.76±0.33 / 20.09±0.20
+TRAINASN          14.48±0.82 / 12.72±0.33 / 11.18±0.27     0.88±0.22 / 0.48±0.13 / 0.28±0.12
+TRAINSSN          3.71±0.46 / 3.42±0.25 / 7.50±0.25        0.36±0.17 / 0.04±0.15 / 0.71±0.17
+TRAINSSNALT       5.55±0.81 / 8.73±0.32 / 2.91±0.22        0.09±0.14 / 0.13±0.11 / 0.18±0.11

Fusion method: Latent Ensemble Layer

(Clean Data)       AP3D(%)                                  APBEV(%)
AVOD [25]          77.79 / 67.69 / 66.31                    88.90 / 85.64 / 78.86
+TRAINASN          75.00 / 64.75 / 58.28                    88.30 / 78.60 / 77.23
+TRAINSSN          74.25 / 65.00 / 63.83                    87.88 / 78.84 / 77.66
+TRAINSSNALT       76.04 / 66.42 / 64.41                    88.80 / 79.53 / 78.53

(Gaussian SSN)     min AP3D(%)                              min APBEV(%)
AVOD [25]          61.97±0.55 / 53.95±0.42 / 47.24±0.27     79.44±0.09 / 72.46±3.14 / 68.25±0.06
+TRAINASN          74.24±0.38 / 58.25±0.16 / 56.13±0.10     88.10±0.26 / 78.19±0.13 / 70.42±0.07
+TRAINSSN          68.16±0.88 / 60.39±0.38 / 56.04±0.28     88.12±0.16 / 78.17±0.06 / 70.21±0.05
+TRAINSSNALT       68.63±0.40 / 55.48±0.16 / 54.42±0.17     86.51±0.46 / 76.85±0.11 / 71.95±2.72

Table 2: Car detection (3D/BEV) performance of AVOD with latent ensemble layers (LEL) against downsampling SSN on the KITTI validation set. Each cell lists Easy / Moderate / Hard.

(Clean Data)       AP3D(%)                                  APBEV(%)
AVOD [25]          77.79 / 67.69 / 66.31                    88.90 / 85.64 / 78.86
+TRAINASN          71.74 / 61.78 / 60.26                    87.29 / 77.08 / 75.89
+TRAINSSN          75.54 / 66.26 / 63.72                    88.07 / 79.18 / 78.03
+TRAINSSNALT       76.22 / 66.05 / 63.87                    89.00 / 79.65 / 78.03

(Downsample SSN)   min AP3D(%)                              min APBEV(%)
AVOD [25]          61.70 / 51.66 / 46.17                    86.08 / 69.99 / 61.55
+TRAINASN          65.74 / 53.49 / 51.35                    82.27 / 67.88 / 65.79
+TRAINSSN          73.33 / 57.85 / 54.91                    86.61 / 76.07 / 68.59
+TRAINSSNALT       64.77 / 53.34 / 48.29                    85.27 / 69.87 / 67.77

Results   When the fusion model uses element-wise mean fusion (Table 1), the TRAINSSN algorithm shows the best single source robustness against Gaussian SSN while preserving the original performance on clean data (the only notable decrease is in moderate BEV detection).⁵ Also, the gap between the two input sources' performance under noise is dramatically decreased compared to the model trained without robust learning and to the naive TRAINASN method.

⁵ In practice, it is difficult to identify the flexible parameters related to shared information in advance, and the design goal becomes a soft rather than a hard constraint. There is therefore a minor degradation in clean performance, paid for the added robustness.

Encouragingly, the AVOD model built with our LEL method already achieves relatively high robustness compared to the mean fusion layers, without any robust learning strategy. For all tasks the minAP scores increase dramatically, e.g., 61.97 vs. 47.41 for the easy 3D detection task, and the maxDiffAP scores decrease (maxDiffAP scores for AVOD with LEL are reported in Appendix B). Robustness improves further when our MAXSSN loss is minimized. Because the LEL's structure inherently handles corruption of a single source well, even the TRAINASN algorithm can successfully guide the model to the desired robustness.

A corruption method of a different style, downsampling, is also tested with our LEL fusion method. Table 2 shows that the model trained with our TRAINSSN algorithm achieves the best robustness among the four algorithms under this complex and realistic perturbation.

Remark 3. The simple TRAINSSNALT achieves fairly robust models with both fusion methods against Gaussian noise; two reasons may explain this phenomenon. First, all parameters are updated, rather than fine-tuning only the fusion-related parts. Therefore, unlike in our analyses of the linear model, the latent representation can be transformed to meet the objective function. In fact, TRAINSSNALT performs poorly when we fine-tune a model with concatenation fusion layers, as shown in the supplement. Second, the loss function L inside our L_MAXSSN is usually non-convex, so an indirect approach may suffice for a small number of sources (ns = 2); an illustrative sketch of this objective follows Remark 4.

Remark 4. Even without fancier approaches that could increase computational cost, our LEL shows appealing effectiveness with a simple implementation.
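As context for Remarks 3 and 4, the MAXSSN objective that TRAINSSN minimizes directly can be read as the worst case over single-source corruptions. The Python below is a schematic sketch only: model, loss_fn, and corrupt are hypothetical stand-ins, not the released TensorFlow implementation, and a real trainer would backpropagate through the selected branch.

import numpy as np

def max_ssn_loss(loss_fn, model, sources, y, corrupt):
    """Corrupt one source at a time, keep the others clean, and return the
    worst (maximum) of the resulting task losses."""
    losses = []
    for i in range(len(sources)):
        noisy = [corrupt(x) if j == i else x for j, x in enumerate(sources)]
        losses.append(loss_fn(model(noisy), y))
    return max(losses)

# Toy stand-ins, for illustration only.
rng = np.random.default_rng(0)
model = lambda srcs: sum(s.mean() for s in srcs)       # dummy fusion model
loss_fn = lambda pred, target: (pred - target) ** 2    # dummy task loss
corrupt = lambda x: x + rng.normal(scale=0.1, size=x.shape)
print(max_ssn_loss(loss_fn, model, [np.ones((4, 4)), np.zeros((4, 4))], 0.5, corrupt))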
6 Conclusion

We study two strategies for improving the robustness of fusion models against single source corruption. Motivated by analyses of linear fusion models, a loss function is introduced that balances the performance degradation of deep fusion models under corruption of different sources. We also demonstrate the importance of a fusion method's structure by proposing a simple ensemble layer that achieves such robustness inherently. Our experimental results show that, trained with our loss and fusion layer, deep fusion models can effectively use the complementary and shared information of different input sources to obtain both robustness and high accuracy. We hope our results motivate further work on improving the single source robustness of more complex fusion models with either a large number of input sources or adaptive networks. Another interesting direction is to investigate single source robustness against adversarial attacks in deep fusion models, which can be compared with our analyses in the supplementary material.

References

[1] Markus Braun, Qing Rao, Yikang Wang, and Fabian Flohr. Pose-rcnn: Joint object detection and pose estimation using 3d object proposals. In IEEE 19th international conference on intelligent transportation systems (ITSC), pages 1546-1551, 2016.

[2] William Chan, Navdeep Jaitly, Quoc Le, and Oriol Vinyals. Listen, attend and spell: A neural network for large vocabulary conversational speech recognition. In IEEE international conference on acoustics, speech and signal processing (ICASSP), pages 4960-4964, 2016.

[3] Xiaozhi Chen, Huimin Ma, Ji Wan, Bo Li, and Tian Xia. Multi-view 3d object detection network for autonomous driving. In IEEE conference on computer vision and pattern recognition (CVPR), pages 1907-1915, 2017.

[4] Chung-Cheng Chiu, Tara N Sainath, Yonghui Wu, Rohit Prabhavalkar, Patrick Nguyen, Zhifeng Chen, Anjuli Kannan, Ron J Weiss, Kanishka Rao, Ekaterina Gonina, et al. State-of-the-art speech recognition with sequence-to-sequence models. In IEEE international conference on acoustics, speech and signal processing (ICASSP), pages 4774-4778, 2018.

[5] Jan K Chorowski, Dzmitry Bahdanau, Dmitriy Serdyuk, Kyunghyun Cho, and Yoshua Bengio. Attention-based models for speech recognition. In Advances in neural information processing systems (NeurIPS), pages 577-585, 2015.

[6] Joon Son Chung, Andrew Senior, Oriol Vinyals, and Andrew Zisserman. Lip reading sentences in the wild. In IEEE conference on computer vision and pattern recognition (CVPR), pages 3444-3453, 2017.

[7] Jifeng Dai, Yi Li, Kaiming He, and Jian Sun. R-fcn: Object detection via region-based fully convolutional networks. In Advances in neural information processing systems (NeurIPS), pages 379-387, 2016.

[8] Xinxin Du, Marcelo H Ang, and Daniela Rus. Car detection for autonomous vehicle: Lidar and vision fusion approach through deep learning framework. In IEEE/RSJ international conference on intelligent robots and systems (IROS), pages 749-754, 2017.

[9] Di Feng, Christian Haase-Schuetz, Lars Rosenbaum, Heinz Hertlein, Fabian Duffhauss, Claudius Glaeser, Werner Wiesbeck, and Klaus Dietmayer. Deep multi-modal object detection and semantic segmentation for autonomous driving: Datasets, methods, and challenges. arXiv preprint arXiv:1902.07830, 2019.

[10] Andreas Geiger, Philip Lenz, and Raquel Urtasun. Are we ready for autonomous driving? The KITTI vision benchmark suite. In IEEE conference on computer vision and pattern recognition (CVPR), pages 3354-3361, 2012.

[11] Ross Girshick. Fast r-cnn. In IEEE international conference on computer vision (ICCV), pages 1440-1448, 2015.

[12] Ross Girshick, Jeff Donahue, Trevor Darrell, and Jitendra Malik. Rich feature hierarchies for accurate object detection and semantic segmentation. In IEEE conference on computer vision and pattern recognition (CVPR), pages 580-587, 2014.

[13] Ian J Goodfellow, Jonathon Shlens, and Christian Szegedy. Explaining and harnessing adversarial examples. In International conference on learning representations (ICLR), 2015.
[14] Alex Graves, Abdel-rahman Mohamed, and Geoffrey Hinton. Speech recognition with deep recurrent neural networks. In IEEE international conference on acoustics, speech and signal processing (ICASSP), pages 6645-6649, 2013.

[15] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In IEEE conference on computer vision and pattern recognition (CVPR), pages 770-778, 2016.

[16] Kaiming He, Georgia Gkioxari, Piotr Dollár, and Ross Girshick. Mask r-cnn. In IEEE international conference on computer vision (ICCV), pages 2961-2969, 2017.

[17] Geoffrey Hinton, Li Deng, Dong Yu, George Dahl, Abdel-rahman Mohamed, Navdeep Jaitly, Andrew Senior, Vincent Vanhoucke, Patrick Nguyen, Brian Kingsbury, et al. Deep neural networks for acoustic modeling in speech recognition. IEEE signal processing magazine, 29, 2012.

[18] Gao Huang, Zhuang Liu, Laurens Van Der Maaten, and Kilian Q Weinberger. Densely connected convolutional networks. In IEEE conference on computer vision and pattern recognition (CVPR), pages 4700-4708, 2017.

[19] Jing Huang and Brian Kingsbury. Audio-visual deep learning for noise robust speech recognition. In IEEE international conference on acoustics, speech and signal processing (ICASSP), pages 7596-7599, 2013.

[20] Jaekyum Kim, Junho Koh, Yecheol Kim, Jaehyung Choi, Youngbae Hwang, and Jun Won Choi. Robust deep multi-modal learning based on gated information fusion network. In Asian conference on computer vision (ACCV), 2018.

[21] Taewan Kim and Joydeep Ghosh. Robust detection of non-motorized road users using deep learning on optical and lidar data. In IEEE 19th international conference on intelligent transportation systems (ITSC), pages 271-276, 2016.

[22] Taewan Kim, Michael Motro, Patrícia Lavieri, Saharsh Samir Oza, Joydeep Ghosh, and Chandra Bhat. Pedestrian detection with simplified depth prediction. In IEEE 21st international conference on intelligent transportation systems (ITSC), pages 2712-2717, 2018.

[23] Ryan Kiros, Karteek Popuri, Dana Cobzas, and Martin Jagersand. Stacked multiscale feature learning for domain independent medical image segmentation. In International workshop on machine learning in medical imaging, pages 25-32. Springer, 2014.

[24] Alex Krizhevsky, Ilya Sutskever, and Geoffrey E Hinton. Imagenet classification with deep convolutional neural networks. In Advances in neural information processing systems (NeurIPS), pages 1097-1105, 2012.

[25] Jason Ku, Melissa Mozifian, Jungwook Lee, Ali Harakeh, and Steven L Waslander. Joint 3d proposal generation and object detection from view aggregation. In IEEE/RSJ international conference on intelligent robots and systems (IROS), pages 1-8, 2018.

[26] Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436, 2015.

[27] Ming Liang, Bin Yang, Shenlong Wang, and Raquel Urtasun. Deep continuous fusion for multi-sensor 3d object detection. In European conference on computer vision (ECCV), pages 641-656, 2018.

[28] Ming Liang, Bin Yang, Yun Chen, Rui Hui, and Raquel Urtasun. Multi-task multi-sensor fusion for 3d object detection. In IEEE conference on computer vision and pattern recognition (CVPR), 2019.
[29] Siqi Liu, Sidong Liu, Weidong Cai, Hangyu Che, Sonia Pujol, Ron Kikinis, Dagan Feng, Michael J Fulham, et al. Multimodal neuroimaging feature learning for multiclass diagnosis of Alzheimer's disease. IEEE transactions on biomedical engineering, 62(4):1132-1140, 2015.

[30] Wei Liu, Dragomir Anguelov, Dumitru Erhan, Christian Szegedy, Scott Reed, Cheng-Yang Fu, and Alexander C Berg. Ssd: Single shot multibox detector. In European conference on computer vision (ECCV), pages 21-37. Springer, 2016.

[31] Oier Mees, Andreas Eitel, and Wolfram Burgard. Choosing smartly: Adaptive multimodal fusion for object detection in changing environments. In IEEE/RSJ international conference on intelligent robots and systems (IROS), pages 151-156, 2016.

[32] Youssef Mroueh, Etienne Marcheret, and Vaibhava Goel. Deep multimodal learning for audio-visual speech recognition. In IEEE international conference on acoustics, speech and signal processing (ICASSP), pages 2130-2134, 2015.

[33] Charles R Qi, Wei Liu, Chenxia Wu, Hao Su, and Leonidas J Guibas. Frustum pointnets for 3d object detection from rgb-d data. In IEEE conference on computer vision and pattern recognition (CVPR), pages 918-927, 2018.

[34] Dhanesh Ramachandram and Graham W Taylor. Deep multimodal learning: A survey on recent advances and trends. IEEE signal processing magazine, 34(6):96-108, 2017.

[35] Joseph Redmon and Ali Farhadi. Yolo9000: Better, faster, stronger. In IEEE conference on computer vision and pattern recognition (CVPR), pages 7263-7271, 2017.

[36] Joseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. You only look once: Unified, real-time object detection. In IEEE conference on computer vision and pattern recognition (CVPR), pages 779-788, 2016.

[37] Shaoqing Ren, Kaiming He, Ross Girshick, and Jian Sun. Faster r-cnn: Towards real-time object detection with region proposal networks. In Advances in neural information processing systems (NeurIPS), pages 91-99, 2015.

[38] Tara N Sainath, Abdel-rahman Mohamed, Brian Kingsbury, and Bhuvana Ramabhadran. Deep convolutional neural networks for LVCSR. In IEEE international conference on acoustics, speech and signal processing (ICASSP), pages 8614-8618, 2013.

[39] Martin Simonovsky, Benjamín Gutiérrez-Becker, Diana Mateus, Nassir Navab, and Nikos Komodakis. A deep metric for multimodal registration. In International conference on medical image computing and computer-assisted intervention, pages 10-18. Springer, 2016.

[40] Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. In International conference on learning representations (ICLR), 2015.

[41] Chao Sui, Mohammed Bennamoun, and Roberto Togneri. Listening with your eyes: Towards a practical visual speech recognition system using deep Boltzmann machines. In IEEE international conference on computer vision (ICCV), pages 154-162, 2015.

[42] Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed, Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, and Andrew Rabinovich. Going deeper with convolutions. In IEEE conference on computer vision and pattern recognition (CVPR), pages 1-9, 2015.

[43] Dimitris Tsipras, Shibani Santurkar, Logan Engstrom, Alexander Turner, and Aleksander Madry. Robustness may be at odds with accuracy. In International conference on learning representations (ICLR), 2019.
[44] Kagan Tumer and Joydeep Ghosh. Analysis of decision boundaries in linearly combined neural classifiers. Pattern Recognition, 29(2):341-348, 1996.

[45] Kagan Tumer and Joydeep Ghosh. Error correlation and error reduction in ensemble classifiers. Connection Science, 8(3-4):385-404, 1996.

[46] Abhinav Valada, Johan Vertens, Ankit Dhall, and Wolfram Burgard. Adapnet: Adaptive semantic segmentation in adverse environmental conditions. In IEEE international conference on robotics and automation (ICRA), pages 4644-4651, 2017.

[47] Zining Wang, Wei Zhan, and Masayoshi Tomizuka. Fusing bird's eye view lidar point cloud and front view camera image for 3d object detection. In IEEE intelligent vehicles symposium (IV), pages 1-6, 2018.

[48] Pengcheng Wu, Steven CH Hoi, Hao Xia, Peilin Zhao, Dayong Wang, and Chunyan Miao. Online multimodal deep similarity learning with application to image retrieval. In 21st ACM international conference on multimedia, pages 153-162. ACM, 2013.