University of Oxford


Ard Louis

Professor of Theoretical Physics

Research theme

  • Biological physics

Sub department

  • Rudolf Peierls Centre for Theoretical Physics

Research groups

  • Condensed Matter Theory

Email: ard.louis@physics.ox.ac.uk

Visualising Feature Learning in Deep Neural Networks by Diagonalizing the Forward Feature Map

(2024)

Authors:

Yoonsoo Nam, Chris Mingard, Seok Hyeong Lee, Soufiane Hayou, Ard Louis

An exactly solvable model for emergence and scaling laws in the multitask sparse parity problem

Advances in Neural Information Processing Systems 37 (NeurIPS 2024), Curran Associates (2024)

Authors:

Yoonsoo Nam, Nayara Fonseca, Sh Lee, Christopher Mingard, Ard A Louis

Abstract:

Deep learning models can exhibit what appears to be a sudden ability to solve a new problem as training time, training data, or model size increases, a phenomenon known as emergence. In this paper, we present a framework where each new ability (a skill) is represented as a basis function. We solve a simple multi-linear model in this skill-basis, finding analytic expressions for the emergence of new skills, as well as for scaling laws of the loss with training time, data size, model size, and optimal compute. We compare our detailed calculations to direct simulations of a two-layer neural network trained on multitask sparse parity, where the tasks in the dataset are distributed according to a power-law. Our simple model captures, using a single fit parameter, the sigmoidal emergence of multiple new skills as training time, data size or model size increases in the neural network.
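The multitask sparse parity setup described above can be made concrete with a small data generator. This is an illustrative sketch only, not the paper's code: the function name, the one-hot task encoding, and the exact power-law exponent are assumptions.

```python
import numpy as np

def make_multitask_sparse_parity(n_tasks=8, n_bits=32, k=3, alpha=1.5,
                                 n_samples=1000, seed=0):
    """Toy generator for a multitask sparse parity dataset (illustrative).

    Each task owns a fixed random subset of k bit positions; the label is
    the parity (XOR) of those bits. Tasks are sampled with power-law
    frequencies p(t) ~ t^(-alpha), mimicking the power-law task
    distribution described in the abstract.
    """
    rng = np.random.default_rng(seed)
    # One fixed k-bit subset per task
    subsets = [rng.choice(n_bits, size=k, replace=False)
               for _ in range(n_tasks)]
    # Power-law distribution over tasks
    p = np.arange(1, n_tasks + 1, dtype=float) ** (-alpha)
    p /= p.sum()
    tasks = rng.choice(n_tasks, size=n_samples, p=p)
    bits = rng.integers(0, 2, size=(n_samples, n_bits))
    # Label = parity of the task's bit subset
    labels = np.array([bits[i, subsets[t]].sum() % 2
                       for i, t in enumerate(tasks)])
    # Input = one-hot task code concatenated with the data bits
    X = np.concatenate([np.eye(n_tasks, dtype=int)[tasks], bits], axis=1)
    return X, labels

X, y = make_multitask_sparse_parity()
```

Because rarer tasks contribute fewer examples, a network trained on such data acquires the corresponding parity "skills" at later times, which is the emergence phenomenon the model analyses.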

Exploiting the equivalence between quantum neural networks and perceptrons

(2024)

Authors:

Chris Mingard, Jessica Pointing, Charles London, Yoonsoo Nam, Ard A Louis

Exploring Simplicity Bias in 1D Dynamical Systems

Entropy MDPI 26:5 (2024) 426

Authors:

Kamal Dingle, Mohammad Alaskandarani, Boumediene Hamzi, Ard A Louis

Abstract:

Arguments inspired by algorithmic information theory predict an inverse relation between the probability and complexity of output patterns in a wide range of input-output maps. This phenomenon is known as simplicity bias. By viewing the parameters of dynamical systems as inputs, and the resulting (digitised) trajectories as outputs, we study simplicity bias in the logistic map, Gauss map, sine map, Bernoulli map, and tent map. We find that the logistic map, Gauss map, and sine map all exhibit simplicity bias upon sampling of map initial values and parameter values, but the Bernoulli map and tent map do not. The simplicity bias upper bound on the output pattern probability is used to make a priori predictions regarding the probability of output patterns. In some cases, the predictions are surprisingly accurate, given that almost no details of the underlying dynamical systems are assumed. More generally, we argue that studying probability-complexity relationships may be a useful tool when studying patterns in dynamical systems.
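The sampling procedure described above, treating map parameters as inputs and digitised trajectories as outputs, can be sketched as a toy experiment. This is illustrative only: the function names are hypothetical, and zlib compression is used as a crude stand-in for the Lempel-Ziv-style complexity measures typically used in simplicity-bias studies.

```python
import zlib
import numpy as np

def digitised_trajectory(r, x0, n=30):
    """Iterate the logistic map x -> r*x*(1-x) and digitise the orbit to a
    binary string (1 if x > 0.5 else 0), giving a discrete output pattern."""
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append('1' if x > 0.5 else '0')
    return ''.join(bits)

def compression_complexity(s):
    """Crude complexity proxy: length in bytes after zlib compression."""
    return len(zlib.compress(s.encode()))

# Sample (parameter, initial value) pairs and count how often each
# digitised pattern appears, to probe the probability-complexity relation.
rng = np.random.default_rng(0)
counts = {}
for _ in range(5000):
    s = digitised_trajectory(rng.uniform(3.5, 4.0), rng.uniform(0.0, 1.0))
    counts[s] = counts.get(s, 0) + 1
```

Plotting each pattern's empirical frequency against its complexity proxy is the kind of probability-complexity relationship the abstract refers to; simplicity bias predicts an upper bound on frequency that decays with complexity.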

Non-Poissonian Bursts in the Arrival of Phenotypic Variation Can Strongly Affect the Dynamics of Adaptation

Molecular Biology and Evolution, Oxford University Press, 41:6 (2024) msae085

Authors:

Nora S Martin, Steffen Schaper, Chico Q Camargo, Ard A Louis

Abstract:

Modeling the rate at which adaptive phenotypes appear in a population is a key to predicting evolutionary processes. Given random mutations, should this rate be modeled by a simple Poisson process, or is a more complex dynamics needed? Here we use analytic calculations and simulations of evolving populations on explicit genotype–phenotype maps to show that the introduction of novel phenotypes can be "bursty" or overdispersed. In other words, a novel phenotype either appears multiple times in quick succession or not at all for many generations. These bursts are fundamentally caused by statistical fluctuations and other structure in the map from genotypes to phenotypes. Their strength depends on population parameters, being highest for "monomorphic" populations with low mutation rates. They can also be enhanced by additional inhomogeneities in the mapping from genotypes to phenotypes. We mainly investigate the effect of bursts using the well-studied genotype–phenotype map for RNA secondary structure, but find similar behavior in a lattice protein model and in Richard Dawkins's biomorphs model of morphological development. Bursts can profoundly affect adaptive dynamics. Most notably, they imply that fitness differences play a smaller role in determining which phenotype fixes than would be the case for a Poisson process without bursts.
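Overdispersion of the kind described above is commonly quantified by the index of dispersion (variance over mean) of per-generation arrival counts, which is approximately 1 for a Poisson process and larger for bursty arrivals. A minimal sketch, with a gamma-fluctuating rate as a hypothetical stand-in for the map-induced inhomogeneity, not the paper's actual model:

```python
import numpy as np

def dispersion_index(counts):
    """Index of dispersion (variance / mean) of per-generation counts:
    ~1 for a Poisson process, >1 for overdispersed ('bursty') arrivals."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

rng = np.random.default_rng(1)
# Poisson baseline: novel phenotypes arrive at a constant rate (2/generation)
poisson = rng.poisson(2.0, size=10_000)
# Bursty alternative: the arrival rate itself fluctuates between generations
# (gamma-distributed with the same mean of 2) -- an illustrative stand-in
rates = rng.gamma(shape=0.5, scale=4.0, size=10_000)
bursty = rng.poisson(rates)
```

For the mixed process, the index of dispersion exceeds 1 by the ratio of the rate's variance to its mean, so the same average arrival rate can hide very different burst structure.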



FIND US

Clarendon Laboratory,
Parks Road,
Oxford,
OX1 3PU

CONTACT US

Tel: +44 (0)1865 272200

© University of Oxford - Department of Physics