Email me: Steve Gull
On this page I'm posting my MaxEnt 2016 talk with Anton Garrett and the late Bob Silver.
Abstract: irreversible_processes_abstract.pdf
Talk: irreversible_processes.pdf
I'm also posting the MaxEnt 2009 talk and quite a lot of other potentially interesting stuff.
This is a talk I gave about 2 years ago to advertise Ed Jaynes' paper `Gibbs vs. Boltzmann entropies' (1965). Tom Grandy published several books about Ed's approach to statistical mechanics, and I hope he will approve of my presentation. The written version of the talk was published in Brian Buck and Vincent Macaulay's little volume `Maximum Entropy in Action' (1991). Although there's one tiny mistake towards the end of that paper, I still feel passionately about this matter. I have added some remarks about phase-averaging and the Tsallis heresy: misconceptions.pdf
Quantum Acausality and Bell's Theorem
Many years ago (about 1984), I used to give a Mathematical Physics course to the Part II students. I illustrated the quantum paradox covered by Bell's theorem by showing that you can't program two independently running computers to mimic the results of spin measurements on two spin-1/2 particles in a singlet state. I believe this demonstration is actually better than Bell's original argument: bell.pdf
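To give a flavour of the point (this sketch is the standard CHSH version of the argument, not the one in bell.pdf, and all the code below is mine): each computer must implement a deterministic ±1 response for every detector setting, and no pair of such programs can match the singlet correlations E(a,b) = -cos(a-b).

```python
import itertools
import math

# Each computer returns +/-1 for each of its two settings, so there are
# 4 deterministic strategies per side and 16 in total. Shared randomness
# is a mixture of these strategies, so it cannot beat this bound.
best_local = max(
    abs(A0 * B0 + A0 * B1 + A1 * B0 - A1 * B1)
    for A0, A1, B0, B1 in itertools.product([-1, 1], repeat=4)
)
print("best local deterministic CHSH value:", best_local)   # 2

# Singlet correlations E(a,b) = -cos(a-b) at the standard optimal angles.
a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4
E = lambda x, y: -math.cos(x - y)
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print("singlet CHSH value:", abs(S))                        # 2*sqrt(2) ~ 2.83
```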
How it all started
I was greatly influenced by a talk on MaxEnt by John Ponsonby (1972) and a remark by Dave Rogstad (1974), made whilst I was CLEANing a map of Kepler's SNR. A few years later Geoff Daniell told me about Bayes' Theorem and introduced me to the papers of Ed Jaynes.
The Bayesian Radio Astronomer
A long time ago (the latest version I found was from 1991) I gave this talk at Jodrell (and probably at the VLA site too). We had all the Bayesian technology then, with MaxEnt (i.e. MemSys) and optimal gridding. My view of the fundamentals hasn't changed at all, so here it is: bayesradio.pdf
Bayesian Target Recognition
This is an essentially similar presentation from the early 1990s describing target recognition using data with a position-independent PSF. 15 years of MCMC program development have changed the conclusions, as we can now do multiple point source fitting and detection thresholding properly. Here it is: target_recognition.pdf
Programs: MaxEnt and MassInf
From 1978 onwards we developed Maximum Entropy algorithms for deconvolution, with John Skilling taking over the programming in 1981 and leading the development of the commercial MemSys code. In its fully developed form (circa 1993) it is a fully Bayesian implementation, using conjugate gradient approximations to evaluate, and provide error bounds on, the all-important "Evidence" (which can be used to compare different prior models). Details of the theory and the algorithm are in the MemSys Users' Manual here: MemSys5_manual.pdf
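As a toy illustration of what the "Evidence" buys you (a small exact calculation of my own, not the MemSys algorithm, with made-up names and numbers): for a linear model with Gaussian noise and a Gaussian prior the evidence is available in closed form, and it peaks near the prior model that actually generated the data.

```python
# Toy evidence calculation for d = R f + n, with noise n ~ N(0, s^2 I)
# and prior f ~ N(0, c^2 I). Marginally d ~ N(0, S), S = c^2 R R^T + s^2 I,
# so  log Z(c) = -0.5 * [ d^T S^{-1} d + log det(2 pi S) ].
# MemSys evaluates such quantities for large images with conjugate
# gradients; here the dimensions are small enough to do it exactly.
import numpy as np

rng = np.random.default_rng(0)
n_data, n_hidden = 50, 20
R = rng.normal(size=(n_data, n_hidden))      # response matrix
f_true = 2.0 * rng.normal(size=n_hidden)     # hidden image, prior width ~2
s = 1.0                                      # noise standard deviation
d = R @ f_true + s * rng.normal(size=n_data)

def log_evidence(c):
    """Exact log Z for prior width c (Gaussian linear model)."""
    S = c**2 * (R @ R.T) + s**2 * np.eye(n_data)
    _, logdet = np.linalg.slogdet(2 * np.pi * S)
    return -0.5 * (d @ np.linalg.solve(S, d) + logdet)

# The evidence picks out the prior width that actually generated f.
for c in [0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"prior width c = {c:4.1f}:  log Z = {log_evidence(c):9.2f}")
```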
Radioastronomical imaging from interferometer data is a slightly unusual application for MemSys. It's fine if you go back to the raw data via slow transforms or gridding/degridding, but even if you want to use the dirty map and beam as data, the algorithm still fits into our standard "Opus/Tropus" structure for the conjugate gradient loop. Simpler algorithms (such as the one in AIPS) are coded up directly, but our MemSys version, known as RAMem, was not written until about 1994. It brought the full power of conjugate gradients to DirtyMapMem, and I have it running using FITS maps and beams on a Windows platform. RAMem is also a very good trainer for Bayesian Neural Nets, where we call it NeuroSys.
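For the curious, here is a skeleton of the Opus/Tropus idea for the dirty-map case, heavily simplified by me (this is not RAMem): Opus applies the forward transform (convolution with the dirty beam), Tropus applies its transpose, and the conjugate gradient loop sees nothing but those two callbacks. A quadratic regularizer stands in for the entropy so a single linear solve suffices.

```python
# Skeleton of the Opus/Tropus callback structure for dirty-map
# deconvolution, solved here with a plain linear conjugate gradient.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

shape = (64, 64)
npix = shape[0] * shape[1]
rng = np.random.default_rng(1)

# Toy dirty beam: a centred Gaussian blob (circular convolution via FFT).
y, x = np.indices(shape)
r2 = (x - 32) ** 2 + (y - 32) ** 2
beam_ft = np.fft.rfft2(np.fft.ifftshift(np.exp(-r2 / 18.0)))

def opus(f):
    """Forward transform: trial sky -> dirty map (convolve with beam)."""
    return np.fft.irfft2(np.fft.rfft2(f.reshape(shape)) * beam_ft, s=shape).ravel()

def tropus(d):
    """Transpose of opus (correlate with the beam)."""
    return np.fft.irfft2(np.fft.rfft2(d.reshape(shape)) * beam_ft.conj(), s=shape).ravel()

# Fake data: two point sources convolved with the beam, plus noise.
sky = np.zeros(shape)
sky[20, 30] = 5.0
sky[45, 12] = 3.0
dirty = opus(sky.ravel()) + 0.01 * rng.normal(size=npix)

# Solve (Tropus Opus + lam I) f = Tropus d by conjugate gradients.
lam = 0.05
A = LinearOperator((npix, npix), matvec=lambda v: tropus(opus(v)) + lam * v)
f_hat, info = cg(A, tropus(dirty), maxiter=300)
print("CG converged:" if info == 0 else "CG stopped early:",
      "brightest pixel at", np.unravel_index(np.argmax(f_hat), shape))
```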
From about 1989 onwards the "Massive Inference" prior was developed and implemented through Markov Chain Monte Carlo methods. These "atomic" models for the sky and other images were desperately slow, and although some applications were very successful, the method has never been applied to large radio images. The usual MCMC methods use "thermodynamic integration" to approximate the evidence, but there was no good theory that allowed one to compute error bars on it. The MassInf prior is these days incorporated into our workhorse MCMC program BayeSys, which is publicly available. The BayeSys manual contains all the details of the theory and algorithms: BayeSys_manual.pdf
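To show what thermodynamic integration involves (a toy sketch of mine with a one-parameter Gaussian model, not the BayeSys implementation): tempering the likelihood with an exponent beta gives log Z as a one-dimensional integral of the mean log-likelihood over beta, estimated here with a short Metropolis run at each rung of a beta ladder.

```python
# log Z = Int_0^1 <log L>_beta d(beta),  where p_beta propto prior * L^beta.
# The toy model (Gaussian prior, Gaussian likelihood, one datum) has an
# exact evidence to compare against.
import numpy as np

rng = np.random.default_rng(2)
d, sigma = 1.5, 0.5                 # datum and noise standard deviation

log_prior = lambda t: -0.5 * t**2
log_like  = lambda t: -0.5 * ((d - t) / sigma)**2 - np.log(sigma * np.sqrt(2 * np.pi))

def mean_loglike(beta, n_steps=10000, step=1.0):
    """Metropolis estimate of <log L> under prior * L^beta."""
    t, samples = 0.0, []
    for _ in range(n_steps):
        t_new = t + step * rng.normal()
        dlogp = (log_prior(t_new) + beta * log_like(t_new)
                 - log_prior(t) - beta * log_like(t))
        if np.log(rng.uniform()) < dlogp:
            t = t_new
        samples.append(log_like(t))
    return np.mean(samples[n_steps // 5:])      # discard burn-in

betas = np.linspace(0.0, 1.0, 11)
vals = np.array([mean_loglike(b) for b in betas])
log_Z = float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(betas)))

# Exact evidence: d ~ N(0, 1 + sigma^2) after marginalizing the parameter.
exact = -0.5 * d**2 / (1 + sigma**2) - 0.5 * np.log(2 * np.pi * (1 + sigma**2))
print(f"thermodynamic integration: {log_Z:.3f}   exact: {exact:.3f}")
```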
Then it all changed....
Nested Sampling
John Skilling invented this in 2004, though he called it something else until David MacKay pointed out that the name could be misunderstood as a tribute to a thoroughly non-Bayesian statistician. The basic idea is to convert the integral required for the evidence into a one-dimensional graph of likelihood against prior range. I'd better let him tell you about it himself... nested.pdf
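A minimal sketch of the core loop (my toy code, not John's) may help: the worst of a set of "live" points is repeatedly discarded, the enclosed prior mass X shrinks geometrically, and the evidence accumulates as a sum of likelihood times prior-mass shell. The replacement step below is naive rejection from the prior, which is exactly the expensive part addressed further down this page.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_L(theta):                  # a Gaussian blob in the unit square
    return -0.5 * np.sum(((theta - 0.5) / 0.05) ** 2)

n_live, n_iter = 200, 1200
live = rng.uniform(size=(n_live, 2))
live_logL = np.array([log_L(t) for t in live])

log_terms = []                     # log(L_i * dX_i)
for i in range(n_iter):
    worst = np.argmin(live_logL)
    logL_star = live_logL[worst]
    # enclosed prior mass shrinks as X_i ~ exp(-i / n_live)
    log_dX = -i / n_live + np.log(1.0 - np.exp(-1.0 / n_live))
    log_terms.append(logL_star + log_dX)
    # replace the worst point: naive rejection from the prior with L > L*
    while True:
        cand = rng.uniform(size=2)
        if log_L(cand) > logL_star:
            live[worst], live_logL[worst] = cand, log_L(cand)
            break

# add the final live points, spread over the remaining prior mass
log_terms += list(live_logL - n_iter / n_live - np.log(n_live))

log_Z = np.logaddexp.reduce(np.array(log_terms))
print(f"nested sampling: log Z = {log_Z:.2f}  (exact ~ {np.log(2 * np.pi * 0.05**2):.2f})")
```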
Here is a poster he gave earlier this year comparing MCMC and nested sampling: nestposter.pdf
This is Farhan Feroz's recent PhD thesis (written under the supervision of Mike Hobson) describing the "MultiNest" implementation of nested sampling, together with recent applications from cosmology. This is the state-of-the-art Nested Sampling code -- I've used it and it works well. Farhan is currently coding John's conjugate gradient ideas into it: Feroz_thesis.pdf
Nested Sampling & Conjugate Gradient
Nested sampling tells you what to do once you have a new sample from the set with likelihood L > L_0, but it doesn't tell you how to find one. Here are John Skilling's latest thoughts on how you can apply all the conjugate gradient technology from MemSys to find that elusive new sample: nested_conjgrad.pdf
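To make the problem concrete, here is the standard workaround, roughly as in John's original description but paraphrased by me (it is not the conjugate gradient idea in the PDF): copy a surviving live point and random-walk it, accepting only moves that stay above the likelihood floor.

```python
import numpy as np

def replace_point(live, log_L, logL_star, rng, n_steps=40, step=0.02):
    """Random-walk a copy of a random live point, staying above L*.

    Drop-in replacement for the rejection step in the sketch above:
    approximately uniform sampling within the hard likelihood constraint.
    This is the step that struggles in high dimensions.
    """
    theta = live[rng.integers(len(live))].copy()
    for _ in range(n_steps):
        cand = theta + step * rng.normal(size=theta.shape)
        # stay inside the unit-cube prior and above the likelihood floor
        if np.all((cand >= 0) & (cand <= 1)) and log_L(cand) > logL_star:
            theta = cand
    return theta
```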
Bayesian Rotation Measure determinations
Here is a little study I did for Paul Alexander on Faraday rotation analysis for the SKA. The conclusion is that rotation measures of 200 or so are easily found using MCMC for sources with polarised flux above about 4 sigma: faraday.pdf
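A toy version of the measurement (my own sketch with made-up numbers, not the study in faraday.pdf): the polarisation angle rotates as chi = chi0 + RM * lambda^2, so noisy Q and U across frequency channels constrain RM directly; a grid search over RM stands in for the MCMC here.

```python
import numpy as np

rng = np.random.default_rng(4)
c = 2.998e8
freqs = np.linspace(1.0e9, 1.5e9, 64)            # 64 channels, 1-1.5 GHz
lam2 = (c / freqs) ** 2

# Made-up source: RM = 200, polarised SNR of 4 per channel.
RM_true, chi0, p, noise = 200.0, 0.3, 1.0, 0.25
chi = chi0 + RM_true * lam2
Q = p * np.cos(2 * chi) + noise * rng.normal(size=lam2.size)
U = p * np.sin(2 * chi) + noise * rng.normal(size=lam2.size)

def log_like(RM):
    """Gaussian log-likelihood, maximized analytically over chi0 and p
    via the complex cross-correlation of Q + iU with exp(-2i RM lambda^2)."""
    z = np.sum((Q + 1j * U) * np.exp(-2j * RM * lam2))
    return abs(z) ** 2 / (2 * noise**2 * lam2.size)

RM_grid = np.linspace(-500, 500, 2001)
ll = np.array([log_like(rm) for rm in RM_grid])
print(f"true RM = {RM_true}, recovered RM = {RM_grid[np.argmax(ll)]:.1f}")
```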
Background Papers
This is my Bayesian "coming out" paper from 1979. I was writing under the influence of Geoff Daniell at that stage. He used to drip-feed me Ed Jaynes papers, sometimes saying "Hmm.. I don't think you're ready for that one yet...". gulldan2.pdf
This is the original "second level of inference" paper given at the 1986 MaxEnt meeting, stressing the use of what we now call the evidence. inference.pdf
RANT: John Skilling on statisticians and why you should be a Bayesian
I suspect that I might have calmed down a bit after all these years, but John certainly hasn't... Enjoy: bayes_rant.pdf