In this paper, we extend the setting of online prediction with expert advice to function-valued forecasts. At each step of the online game, several experts predict a function, and the learner has to efficiently aggregate these functional forecasts into a single forecast. We adapt basic mixable (and exponentially concave) loss functions to compare functional predictions and prove that these adaptations are also mixable (exp-concave). We call this phenomenon mixability (exp-concavity) of integral loss functions. As an application of our main result, we prove that various loss functions used for probabilistic forecasting are mixable (exp-concave). The considered losses include the Sliced Continuous Ranked Probability Score, the Energy-Based Distance, Optimal Transport Costs and the Sliced Wasserstein-2 Distance, the Beta-2 and Kullback-Leibler Divergences, and the Characteristic Function and Maximum Mean Discrepancies.
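For context, we briefly recall the standard notions that the integral-loss results extend; the notation $\lambda$, $\Gamma$, $\Omega$, $\eta$ used here is generic and may differ from the notation adopted later in the paper. A loss function $\lambda \colon \Gamma \times \Omega \to [0, +\infty]$ is called $\eta$-mixable for some $\eta > 0$ if for any finite set of predictions $\gamma_1, \dots, \gamma_N \in \Gamma$ and any weights $p_1, \dots, p_N \ge 0$ with $\sum_{i=1}^{N} p_i = 1$ there exists an aggregated prediction $\gamma \in \Gamma$ such that
\[
\lambda(\gamma, \omega) \;\le\; -\frac{1}{\eta} \ln \sum_{i=1}^{N} p_i \exp\bigl(-\eta\, \lambda(\gamma_i, \omega)\bigr)
\quad \text{for all } \omega \in \Omega,
\]
and $\eta$-exponentially concave (exp-concave) if $\gamma \mapsto \exp(-\eta\, \lambda(\gamma, \omega))$ is concave for every $\omega \in \Omega$; $\eta$-exp-concavity implies $\eta$-mixability. In the functional setting considered here, the predictions $\gamma, \gamma_1, \dots, \gamma_N$ are functions and $\lambda$ is one of the integral adaptations of a basic loss introduced in the paper.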