Quentin Bertrand


  • Concomitant Lasso with Repetitions (CLaR): beyond averaging multiple realizations of heteroscedastic noise

    Sparsity-promoting norms are frequently used in high-dimensional regression. A limitation of Lasso-type estimators is that the regularization parameter depends on the noise level, which varies between datasets and experiments. Estimators such as the concomitant Lasso address this dependence by jointly estimating the noise level and the regression coefficients. As sample sizes are often limited in high-dimensional regimes, simplified heteroscedastic models are customary. However, in many experimental applications, data is obtained by averaging multiple measurements. This helps reduce the noise variance, yet it dramatically reduces sample sizes, preventing refined noise modeling. In this work, we propose an estimator that can cope with complex heteroscedastic noise structures by using non-averaged measurements and a concomitant formulation. The resulting optimization problem is convex and, thanks to smoothing theory, amenable to state-of-the-art proximal coordinate descent techniques that can leverage the expected sparsity of the solutions. Practical benefits are demonstrated on simulations and on neuroimaging applications. (A minimal sketch of a proximal coordinate descent Lasso solver appears after this item.)

    02/07/2019 ∙ by Quentin Bertrand, et al.

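    The solver class this abstract refers to can be illustrated on the plain Lasso. The following is a minimal sketch of proximal coordinate descent with soft-thresholding, not the CLaR estimator itself; the names (`lasso_cd`, `soft_threshold`), the fixed iteration count, and the 1/(2n) scaling of the data-fit term are illustrative assumptions.

    ```python
    import numpy as np

    def soft_threshold(x, tau):
        """Soft-thresholding: the proximal operator of tau * ||.||_1."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def lasso_cd(X, y, lam, n_iter=100):
        """Proximal coordinate descent for the plain Lasso (not CLaR):
        min_beta  1/(2n) * ||y - X @ beta||^2 + lam * ||beta||_1.
        """
        n, p = X.shape
        beta = np.zeros(p)
        residual = y.copy()  # residual = y - X @ beta, with beta = 0
        lipschitz = (X ** 2).sum(axis=0) / n  # per-coordinate Lipschitz constants
        for _ in range(n_iter):
            for j in range(p):
                if lipschitz[j] == 0.0:
                    continue
                old = beta[j]
                # Gradient step on coordinate j, then prox of the l1 penalty.
                grad_j = -X[:, j] @ residual / n
                beta[j] = soft_threshold(old - grad_j / lipschitz[j],
                                         lam / lipschitz[j])
                if beta[j] != old:
                    # Keep the residual in sync with the updated coordinate.
                    residual -= X[:, j] * (beta[j] - old)
        return beta
    ```

    Increasing `lam` drives more coordinates exactly to zero, which is the sparsity that lets coordinate descent spend little work on inactive features.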

  • Anytime Exact Belief Propagation

    Statistical Relational Models and, more recently, Probabilistic Programming, have been making strides towards an integration of logic and probabilistic reasoning. A natural expectation for this project is that a probabilistic logic reasoning algorithm reduces to a logic reasoning algorithm when provided a model that only involves 0-1 probabilities, exhibiting all the advantages of logic reasoning such as short-circuiting, intelligibility, and the ability to provide proof trees for a query answer. In fact, we can take this further and require that these characteristics be present even for probabilistic models with probabilities near 0 and 1, with graceful degradation as the model becomes more uncertain. We also seek inference with amortized constant time complexity in a model's size (even if still exponential in the induced width of a more directly relevant portion of it) so that it can be applied to huge knowledge bases of which only a relatively small portion is relevant to typical queries. We believe that, among the probabilistic reasoning algorithms, Belief Propagation is the most similar to logic reasoning: messages are propagated among neighboring variables, and the paths of message-passing are similar to proof trees. However, Belief Propagation is either only applicable to tree models, or approximate, without guarantees on precision or convergence. In this paper we present work in progress on an Anytime Exact Belief Propagation algorithm that is very similar to Belief Propagation but is exact even for graphical models with cycles, exhibits soft short-circuiting and amortized constant time complexity in the model size, and can provide probabilistic proof trees. (A minimal sketch of standard sum-product message passing appears after this item.)

    07/27/2017 ∙ by Gabriel Azevedo Ferreira, et al.

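    For contrast with the paper's anytime exact variant, here is a minimal sketch of standard exact sum-product message passing on a chain, the tree case in which the abstract notes Belief Propagation is already exact. The interface (`chain_marginals`, `unaries`, `pairwise`) is an illustrative assumption, not the paper's algorithm.

    ```python
    import numpy as np

    def chain_marginals(unaries, pairwise):
        """Exact sum-product message passing on a chain of discrete variables.

        unaries: list of n arrays, each of shape (k,), the node potentials.
        pairwise: list of n-1 arrays, each of shape (k, k); pairwise[i][a, b]
        scores state a at node i together with state b at node i+1.
        Returns the n exact marginal distributions.
        """
        n, k = len(unaries), unaries[0].shape[0]
        # Forward messages: fwd[i] is the message arriving at node i from the left.
        fwd = [np.ones(k) for _ in range(n)]
        for i in range(1, n):
            fwd[i] = pairwise[i - 1].T @ (fwd[i - 1] * unaries[i - 1])
            fwd[i] /= fwd[i].sum()  # normalize for numerical stability
        # Backward messages: bwd[i] is the message arriving at node i from the right.
        bwd = [np.ones(k) for _ in range(n)]
        for i in range(n - 2, -1, -1):
            bwd[i] = pairwise[i] @ (bwd[i + 1] * unaries[i + 1])
            bwd[i] /= bwd[i].sum()
        # Each node marginal is proportional to its potential times both messages.
        marginals = []
        for i in range(n):
            m = unaries[i] * fwd[i] * bwd[i]
            marginals.append(m / m.sum())
        return marginals
    ```

    On a tree (here, a chain) a single forward and backward sweep yields exact marginals; on cyclic graphs this scheme becomes loopy and approximate, which is the gap the abstract's algorithm targets.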