We will explain how, motivated by the study of the Generalized Belief Propagation algorithm, which is known to solve variational inference under constraints on the marginals of probability distributions (minimization of the Bethe free energy), one can define a theoretical framework for a whole new class of optimization problems built by gluing together local optimization problems. We will show that critical points of these optimization problems are fixed points of new message-passing algorithms. More generally, this setting answers the question of how to define an optimization problem when given a functor of constraints and local cost functions. One direct application of these results is a PCA for filtered data. Such an approach is particularly well suited to inference with multimodal integration and to inference on scenes with multiple views.
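As a concrete point of reference for the message-passing setting above, here is a minimal sum-product sketch on a two-variable factor graph (the potentials `phi1`, `phi2`, `psi` are made up for illustration and are not from the source). On a tree such as this single edge, the fixed point of the messages recovers the exact marginals; it is on loopy graphs that Generalized Belief Propagation and the Bethe free energy viewpoint become necessary.

```python
import numpy as np

# Tiny factor graph: two binary variables x1, x2 with unary potentials
# phi1, phi2 and one pairwise factor psi.  All values are illustrative.
phi1 = np.array([1.0, 2.0])
phi2 = np.array([3.0, 1.0])
psi = np.array([[2.0, 1.0],
                [1.0, 2.0]])   # psi[x1, x2]

# Sum-product messages from the pairwise factor to each variable.
# On a tree, belief propagation converges in one sweep and its fixed
# point yields the exact marginals.
m_f_to_1 = psi @ phi2          # sum over x2 of psi[x1, x2] * phi2[x2]
m_f_to_2 = psi.T @ phi1        # sum over x1 of psi[x1, x2] * phi1[x1]

# Beliefs: local potential times incoming message, then normalize.
b1 = phi1 * m_f_to_1
b1 /= b1.sum()
b2 = phi2 * m_f_to_2
b2 /= b2.sum()

# Sanity check against the exact marginals of the full joint.
joint = phi1[:, None] * phi2[None, :] * psi
joint /= joint.sum()
assert np.allclose(b1, joint.sum(axis=1))
assert np.allclose(b2, joint.sum(axis=0))
print(b1, b2)
```

The gluing construction described in the abstract replaces such a single edge by a diagram of local problems whose compatible critical points are again characterized by message fixed points.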