It was recently proposed that neural networks can be used to approximate high-dimensional probability distributions that appear, e.g., in lattice field theories or statistical mechanics. Such networks can then serve as variational approximators to assess extensive properties of statistical systems, such as the free energy, and as neural samplers in Monte Carlo simulations. In this talk I will discuss two algorithms suitable for these purposes, Variational Autoregressive Networks and Normalizing Flows, and present recent improvements.
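To illustrate the variational idea mentioned above, the sketch below estimates the free energy of a toy 1D Ising chain with a factorized trial distribution, the simplest member of the autoregressive family (a real Variational Autoregressive Network would condition each spin on the preceding ones). All names, the system size, and the couplings are illustrative choices, not taken from the talk; the point is only that the variational free energy upper-bounds the exact one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: N Ising spins on an open 1D chain with coupling J,
# energy E(s) = -J * sum_i s_i s_{i+1}. Parameters are illustrative.
N, J, beta = 8, 1.0, 0.5

def energy(s):
    # s has shape (n_samples, N); returns energy of each configuration.
    return -J * np.sum(s[:, :-1] * s[:, 1:], axis=1)

def sample_and_logq(p, n_samples):
    # Factorized trial distribution: each spin is +1 with probability p,
    # independently. Returns samples and their log-probabilities log q(s).
    u = rng.random((n_samples, N))
    s = np.where(u < p, 1.0, -1.0)
    logq = np.where(s > 0, np.log(p), np.log(1.0 - p)).sum(axis=1)
    return s, logq

def free_energy_variational(p, n_samples=20000):
    # Variational free energy F_q = (1/beta) E_q[log q(s) + beta E(s)],
    # which upper-bounds the exact free energy F = -log(Z)/beta.
    s, logq = sample_and_logq(p, n_samples)
    return np.mean(logq + beta * energy(s)) / beta

def free_energy_exact():
    # Brute-force enumeration of all 2^N states (feasible for N = 8).
    states = np.array(np.meshgrid(*[[-1.0, 1.0]] * N)).reshape(N, -1).T
    logZ = np.log(np.sum(np.exp(-beta * energy(states))))
    return -logZ / beta

F_var = free_energy_variational(p=0.5)
F_ex = free_energy_exact()
print(F_var, F_ex)  # the variational estimate lies above the exact value
```

In a full implementation the parameters of the trial distribution would be optimized by gradient descent on F_q, tightening the bound; the same samples and log-probabilities also supply the reweighting factors used when such a model serves as a neural sampler in Monte Carlo simulations.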