Hidden Markov Models, HMMs, are mathematical models of Markov processes whose state is hidden but from which information can leak via channels. They are typically represented as three-way joint probability distributions. We use HMMs as denotations of probabilistic hidden-state sequential programs, after recasting them as “abstract” HMMs, i.e. computations in the Giry monad D, and equipping them with a partial order of increasing security. However, to encode the monadic type with hiding over state X we use DX→D²X rather than the conventional X→DX.
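As a small illustration (a sketch of our own, not the prototype mentioned below), the type DX→D²X can be rendered in Haskell with finite discrete distributions standing in for the Giry monad D: an abstract HMM maps a prior on the state to a hyper-distribution, i.e. a distribution of posteriors, one per observation that may leak. The names Dist, fromChannel and the noisy-channel example are hypothetical, chosen only for exposition.

```haskell
import qualified Data.Map.Strict as M

-- A finite discrete distribution: outcomes paired with their probabilities.
newtype Dist a = Dist { runDist :: [(a, Double)] }
  deriving Show

-- An "abstract" HMM over state x: it maps a prior on x to a
-- hyper-distribution, i.e. a distribution of posterior distributions.
type AbstractHMM x = Dist x -> Dist (Dist x)

-- Build an abstract HMM from a leakage channel: for each state, a
-- distribution over the observations y that an adversary can see.
fromChannel :: Ord y => (x -> Dist y) -> AbstractHMM x
fromChannel chan (Dist prior) = Dist
  [ (normalise joint, pY)
  | (_, joint) <- M.toList byObs
  , let pY = sum (map snd joint)
  , pY > 0 ]
  where
    -- joint weights (state, probability), grouped by the observation y
    byObs = M.fromListWith (++)
      [ (y, [(x, px * pyx)])
      | (x, px)  <- prior
      , (y, pyx) <- runDist (chan x) ]
    normalise ws = Dist [ (x, w / total) | (x, w) <- ws ]
      where total = sum (map snd ws)

-- Example: a uniform one-bit prior and a channel that reports the true
-- state with probability 0.9; the result is a two-point hyper-distribution.
example :: Dist (Dist Bool)
example = fromChannel noisy (Dist [(True, 0.5), (False, 0.5)])
  where noisy b = Dist [(b, 0.9), (not b, 0.1)]
```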

We illustrate this construction with a very small Haskell prototype. We then present uncertainty measures as a generalisation of the extant diversity of probabilistic entropies, and we propose characteristic analytic properties for them. Based on that, we give a “backwards”, uncertainty-transformer semantics for HMMs, dual to the “forwards” abstract HMMs. Finally, we discuss the Dalenius desideratum for statistical databases as an issue in semantic compositionality, and propose a means for taking it into account.
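Continuing the sketch above (again with illustrative names, not the prototype's), Shannon entropy is one of the probabilistic entropies that uncertainty measures generalise; lifting it to hyper-distributions by expectation gives one concrete instance of the “backwards” reading of an abstract HMM.

```haskell
-- Shannon entropy as one concrete instance of an uncertainty measure
-- on distributions over the hidden state.
shannon :: Dist x -> Double
shannon (Dist ps) = negate (sum [ p * logBase 2 p | (_, p) <- ps, p > 0 ])

-- Lift an uncertainty measure to hyper-distributions by taking the
-- expected uncertainty of the posteriors; with shannon this is the
-- conditional Shannon entropy of the state given the observation.
expectedUncertainty :: (Dist x -> Double) -> Dist (Dist x) -> Double
expectedUncertainty u (Dist posteriors) =
  sum [ p * u d | (d, p) <- posteriors ]

-- e.g. expectedUncertainty shannon example  ≈ 0.47 bits of uncertainty
-- remaining about the one-bit state after the noisy observation.
```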