We present and evaluate a new CNF encoding scheme for reducing probabilistic inference in graphical models to weighted model counting. Our scheme builds on the ENC4 encoding scheme introduced by Chavira and Darwiche and improves it in two ways: it uses log encodings of the elementary variable/value assignments, and it encodes the most frequent probability value of each conditional probability table implicitly. On the theoretical side, we show that our encoding scheme is faithful and that, for every input network, it yields a CNF formula with fewer variables and fewer clauses than the formula produced by ENC4. On the practical side, we show that the C2D compiler equipped with our encoding scheme performs significantly better in many cases than it does with ENC4, and than the state-of-the-art ACE compiler.
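As a rough illustration of the log-encoding idea mentioned above (a sketch of the standard technique, not the paper's exact construction; the function name log_encode and its interface are ours), a variable with a domain of size d can be represented with ceil(log2 d) Boolean variables instead of d one-hot indicator variables:

    from math import ceil, log2

    def log_encode(domain_size: int, value: int) -> list[int]:
        # A variable with domain size d is represented by ceil(log2 d)
        # Boolean variables; each domain value corresponds to a
        # conjunction of literals over those bits. Returns the literal
        # polarities (+1/-1) for the bits encoding `value`.
        n_bits = max(1, ceil(log2(domain_size)))
        # Bit i of `value` fixes the polarity of the i-th Boolean variable.
        return [1 if (value >> i) & 1 else -1 for i in range(n_bits)]

    # Example: a domain of size 5 needs 3 bits; value 4 (binary 100)
    # is encoded as the conjunction (not b0) and (not b1) and b2.
    print(log_encode(5, 4))  # [-1, -1, 1]

With a one-hot encoding the same variable would require 5 propositional variables plus at-least-one and at-most-one clauses, which is one intuition behind the variable and clause savings claimed over ENC4.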