This thesis is concerned with recovery guarantees and sensitivity analysis of variational regularization for noisy linear inverse problems. Recovery is cast as a convex optimization problem combining a data fidelity term with a regularizing functional that promotes solutions conforming to some notion of low complexity, related to the non-smoothness points of the regularizer. Our approach, based on partial smoothness, handles a variety of regularizers, including analysis/structured sparsity, antisparsity and low-rank structure. We first analyze the noise robustness guarantees, both in terms of the distance of the recovered solutions to the original object and of the stability of the promoted model space. We then turn to the sensitivity analysis of these optimization problems with respect to observation perturbations. For random observations, we build an unbiased estimator of the risk, which yields a parameter selection scheme.