We have shown that single-parameter estimation by likelihood analysis can be made efficient, in the sense that the original data set can be compressed to make parameter estimation tractable, and optimal, in the sense that no information is lost about the parameter we wish to estimate. Our eigenmodes are generalised versions of the signal-to-noise eigenmodes, and are optimal for parameters that enter the data covariance matrix in arbitrary ways.
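As an illustrative sketch (not code from this work), the single-parameter compression can be phrased numerically: for zero-mean Gaussian data with covariance C(θ), the generalised eigenmodes solve C,θ b = λ C b, and each retained mode contributes λ²/2 to the Fisher information, so keeping the modes with the largest |λ| loses no information when the remaining eigenvalues vanish. The function name and toy covariance below are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def eigenmode_compress(C, dC, n_keep):
    """Generalised signal-to-noise eigenmode compression (sketch).

    Solves dC b = lam C b; modes with the largest |lam| carry the most
    Fisher information about the parameter (lam_i**2 / 2 per mode for
    zero-mean Gaussian data).
    """
    lam, B = eigh(dC, C)                 # generalised symmetric eigenproblem
    order = np.argsort(-np.abs(lam))     # rank modes by information content
    return lam[order][:n_keep], B[:, order[:n_keep]]

# Toy model: the parameter scales a low-rank "signal" part of the covariance.
rng = np.random.default_rng(0)
n = 50
A = rng.normal(size=(n, 5))
signal = A @ A.T                         # rank-5 signal covariance
theta = 1.0
C = np.eye(n) + theta * signal           # noise + theta * signal
dC = signal                              # dC/dtheta

lam, B = eigenmode_compress(C, dC, n_keep=5)

# Fisher information of the full data vs. the 5 retained modes.
Cinv = np.linalg.inv(C)
F_full = 0.5 * np.trace(Cinv @ dC @ Cinv @ dC)
F_kept = 0.5 * np.sum(lam**2)
```

Because the signal here has rank 5, the five retained modes capture the full Fisher information; in realistic cases one keeps as many modes as needed to reach an acceptable fraction of it.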
As with all parameter estimation, the method is model-dependent: it requires only the covariance matrix of the data and the assumption of Gaussianity. However, we have not had to introduce anything beyond the standard assumptions of likelihood analysis. The dependence on the initial choice of parameter values is minimal, and can be reduced further by iteration.
For many-parameter estimation, we have demonstrated two optimisation algorithms. Optimising separately for each parameter by the single-parameter method, then trimming the resulting data set with a singular value decomposition (SVD) step, successfully recovers the conditional likelihood errors. For correlated parameter estimates, a promising technique is to diagonalise the Fisher matrix and optimise for a single parameter along the likelihood ridge.
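Both many-parameter steps can be sketched in a few lines, under assumed toy inputs (the mode matrices, Fisher matrix, and threshold below are all hypothetical): an SVD prunes near-degenerate directions from the concatenated per-parameter eigenmode sets, and an eigendecomposition of the Fisher matrix yields uncorrelated parameter combinations, the smallest eigenvalue picking out the poorly constrained direction along the likelihood ridge.

```python
import numpy as np

# --- SVD trimming of per-parameter eigenmode sets (toy shapes) ---
rng = np.random.default_rng(1)
n = 40
B1 = rng.normal(size=(n, 6))                # modes optimised for parameter 1
B2 = 0.5 * B1[:, :3] + 1e-3 * rng.normal(size=(n, 3))  # largely redundant set

B_all = np.hstack([B1, B2])
U, s, _ = np.linalg.svd(B_all, full_matrices=False)
keep = s > 1e-2 * s[0]                      # drop nearly degenerate directions
B_trimmed = U[:, keep]                      # orthonormal, non-redundant modes

# --- Diagonalising the Fisher matrix (toy 2-parameter case) ---
F = np.array([[4.0, 3.8],
              [3.8, 4.0]])                  # strongly correlated parameters
w, V = np.linalg.eigh(F)                    # eigenvalues in ascending order
ridge_direction = V[:, 0]                   # least-constrained combination
sigma_ridge = 1.0 / np.sqrt(w[0])           # error along the likelihood ridge
```

In this toy case the trimming discards the three redundant modes contributed by the second parameter, and the ridge direction is the (1, -1)/sqrt(2) combination, along which the error is largest.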