Homepage of Peter Harremoës (Entro-Peter)


E-mail: harremoes@ieee.org

Mobile: +45 27 82 41 71

Skype: peterharremoes

Rønne Allé 1, st.
2860 Søborg
+45 39 56 41 71

Research news

At ISIT I will give a talk entitled Information Theory on Spectral Sets. The results strongly indicate that information theory is only possible in mathematical structures that can be represented on Jordan algebras. There are five basic types of Jordan algebras, and the algebra of density matrices with complex entries is the most important one, in the sense that quantum information theory is normally formulated on this type of algebra. I expect that many results from quantum information theory can be generalized to Jordan algebras. This work follows up on a paper entitled Divergence and Sufficiency for Convex Optimization that I recently published in Entropy.

I published a long paper in Kybernetika with various bounds on tail probabilities in terms of the signed log-likelihood function. It has just been awarded best paper in Kybernetika 2016. Although the inequalities are quite sharp, I am sure that even sharper inequalities of these types can be obtained. I think the results can be generalized to cover all exponential families with simple variance functions. Even more general inequalities may exist, but at the moment I do not know how to attack the general problem.
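As a minimal illustration of the objects involved (my own sketch, not code from the paper), the snippet below defines the signed log-likelihood G for the binomial family and verifies numerically, for one choice of parameters, that the standard normal CDF evaluated at G(k) and G(k+1) brackets the exact lower-tail probability, in the spirit of the inequalities discussed above:

```python
import math

def kl(q, p):
    """Kullback-Leibler divergence between Bernoulli(q) and Bernoulli(p)."""
    def term(a, b):
        return 0.0 if a == 0 else a * math.log(a / b)
    return term(q, p) + term(1 - q, 1 - p)

def signed_loglik(k, n, p):
    """Signed log-likelihood G(k) = sgn(k/n - p) * sqrt(2 n D(k/n || p))."""
    q = k / n
    return math.copysign(1.0, q - p) * math.sqrt(2 * n * kl(q, p))

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binom_cdf(k, n, p):
    """Exact binomial lower-tail probability P(X <= k)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n, p, k = 100, 0.1, 4
exact = binom_cdf(k, n, p)
# Phi(G(k)) and Phi(G(k+1)) bracket the exact tail for these parameters.
print(phi(signed_loglik(k, n, p)), exact, phi(signed_loglik(k + 1, n, p)))
```

The point of the signed log-likelihood is that plugging it into the normal CDF tracks the exact tail far into the large-deviation regime, where the classical central limit approximation degrades.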

In the paper arXiv:1601.04255 we obtain a lower bound on the rate of convergence in the law of small numbers that seems to be asymptotically tight.
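The law of small numbers is the Poisson approximation of the binomial: Bin(n, λ/n) converges to Poisson(λ) as n grows. The sketch below (my own illustration, not the bound from the paper) computes the total variation distance for increasing n and checks it against Le Cam's classical upper bound λ²/n; a lower bound on the rate complements upper bounds of this kind:

```python
import math

def binom_pmf(k, n, p):
    """Probability mass function of Bin(n, p); math.comb gives 0 for k > n."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Probability mass function of Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def tv_binom_poisson(n, lam, kmax=60):
    """Total variation distance between Bin(n, lam/n) and Poisson(lam),
    truncated at kmax (the tails beyond are negligible for small lam)."""
    p = lam / n
    return 0.5 * sum(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam))
                     for k in range(kmax + 1))

lam = 2.0
for n in (10, 100, 1000):
    # Actual distance next to Le Cam's upper bound lam^2 / n.
    print(n, tv_binom_poisson(n, lam), lam**2 / n)
```

The printed distances shrink roughly like 1/n, which is the rate at which both the upper and lower bounds operate.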

Lattice theory of causation I am working on a larger project where I want to see to what extent concepts related to cause and effect can be studied using lattice theory. In 2015 I presented some derived results related to Lattices with non-Shannon Inequalities. In January and February 2016 I visited the University of Hawai'i, where several lattice experts are based, and I am now trying to put all the results together.

Link to my page on Research Gate

List of Publications

ORCID: 0000-0002-0441-6690



My research is centered on information theory. One of my interests is how to use ideas from information theory to derive results in probability theory. Many of the most important results in probability theory are convergence theorems, and many of these can be reformulated to state that the entropy of a system increases to a maximum or that a divergence converges to a minimum. These ideas are also relevant in the theory of statistical tests. Recently I have formalized a method for deriving Jeffreys prior as the optimal prior using the minimum description length principle.
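As a concrete reminder of what Jeffreys prior is (the snippet below is just the textbook definition, not the MDL derivation): for the Bernoulli model, Jeffreys prior is proportional to the square root of the Fisher information, and it normalizes to the Beta(1/2, 1/2) density with normalization constant π.

```python
import math

def fisher_info_bernoulli(p):
    """Fisher information of the Bernoulli model at parameter p."""
    return 1.0 / (p * (1.0 - p))

def jeffreys_unnormalized(p):
    """Jeffreys prior is proportional to the square root of the Fisher information."""
    return math.sqrt(fisher_info_bernoulli(p))

# Midpoint-rule normalization over (0, 1); the exact constant is pi,
# so the normalized prior is the Beta(1/2, 1/2) density.
N = 100_000
Z = sum(jeffreys_unnormalized((i + 0.5) / N) for i in range(N)) / N
print(Z)  # close to pi
```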
I am also interested in quantum information theory, and I think that information theory sheds new light on the problems of the foundation of quantum mechanics. In a sense the distinction between matter and information about matter disappears on the quantum level. Combining this idea with group representations should be a key to a better understanding of quantum theory.
I have also worked on the relation between Bayesian networks and irreversibility, and my ultimate goal is to build a bridge between these ideas and information theory. I am working on a new theory where methods from lattice theory are used. I think lattices of functional dependence will provide a more transparent framework for describing causation. Hopefully it will lead to better algorithms for detecting causal relationships, but the most important application might be in our description of quantum systems, where we know that our usual notion of causation breaks down.

Editorial work

I am associate editor of the journal IEEE Transactions on Information Theory, covering probability and statistics. I am an editorial board member of the journal Entropy. I was Editor-in-Chief of Entropy 2007-2011 and an editor 2011-2015.

I serve as a reviewer for numerous other journals, as can be seen from my newly created Publons profile.


I was visiting University of Hawai'i Jan.-Feb. 2016.
I gave a lecture series on Bregman divergences. The first lecture was held Thursday 21/1 in Holmes Hall. See the flyer for detailed information.
Quantum Lunch 21/12 2016.
Entropy Power Inequalities at AIM, San José, May 1st-5th, 2017.
ISIT 2017, June, in Aachen, Germany. I will give a presentation entitled Quantum Information on Spectral Sets.


I have made a BibTeX file with a lot of items related to information theory.

Information Theory Society I am a senior member of IEEE, and the Information Theory Society is part of this organization. The page has many useful links related to information theory.
Danish Climbing Federation I am an active climber and have served as president of this organization.
Entropy The journal for which I was Editor-in-Chief.
Ideal scientific equipment A less useful link.
IEEE Transactions on Information Theory I am associate editor of this journal, covering Information Theory and Statistics.
Minimum Description Length on the web This page is devoted to MDL and its applications. It contains links to articles and people working in the field.
Teaching portfolio Here I have collected various material relevant for my teaching activities.
The cross entropy method


Here are links to some of my favorite software.

Dropbox Very convenient for storing and sharing documents.
GeoGebra Software to create mathematical figures. It can output an image as pgf code.
JabRef Program that helps you manage your BibTeX database.
Online LaTeX diff tool You can send two files with LaTeX code to this site and get back one PDF file with color marking of all changes.
LyX User friendly interface to LaTeX.
PDF-XChange Great for annotating PDF documents. The free version is good, and the professional version is even better.
R Great for statistical programming. With Sweave, R code can be embedded directly into LaTeX documents.
TikzEdt Editor for TikZ/pgf code.


Here are links to some of the people I collaborate with.

Andrew Barron
Christophe Vignat
Flemming Topsøe
Frantisek Matús
Gábor Tusnády
Ioannis Kontoyiannis
László Györfi
Lukasz Debowski
Narayana Prasad Santhanam
Oliver Johnson
Peter Grunwald
Tim van Erven