Randomly Aggregated Least Squares for Support Recovery

2021 
Abstract
We study the problem of exact support recovery: given an (unknown) vector θ* ∈ {−1, 0, 1}^D with known sparsity k = ‖θ*‖₀, we are given access to the noisy measurement y = Xθ* + ω, where X ∈ ℝ^{N×D} is a (known) Gaussian matrix and the noise ω ∈ ℝ^N is an (unknown) Gaussian vector. How small can N be for reliable recovery of the support of θ*? We present RAWLS (Randomly Aggregated unWeighted Least Squares Support Recovery): the main idea is to take random subsets of the N equations, perform least squares over each reduced system, and average the solutions over many random subsets. We show that the proposed procedure provably recovers an approximation of θ*, and we use numerical simulations to demonstrate its benefit for the task of support recovery. Finally, we observe that RAWLS is on par with several strong baselines in the low-information regime (i.e., when N is small or k is large).
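The aggregation scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration of the idea (random row subsets, minimum-norm least squares on each, averaging, then keeping the k largest-magnitude coordinates), not the authors' reference implementation; the subset size and number of subsets are assumed defaults chosen for illustration.

```python
import numpy as np

def rawls(X, y, k, n_subsets=200, subset_size=None, seed=None):
    """Sketch of RAWLS: average least-squares solutions over random
    subsets of the N equations, then report the k largest-magnitude
    coordinates of the average as the estimated support."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    if subset_size is None:
        subset_size = max(1, N // 2)  # assumed default, not from the paper
    acc = np.zeros(D)
    for _ in range(n_subsets):
        rows = rng.choice(N, size=subset_size, replace=False)
        # minimum-norm least-squares solution on the reduced system
        theta, *_ = np.linalg.lstsq(X[rows], y[rows], rcond=None)
        acc += theta
    avg = acc / n_subsets
    support = np.sort(np.argsort(-np.abs(avg))[:k])
    return support, avg
```

On a well-conditioned synthetic instance (Gaussian X, small Gaussian noise), the k largest entries of the averaged estimate line up with the true support, and their signs recover the ±1 entries of θ*.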