We investigate the neural mechanisms underlying learning and memory using computational approaches. In particular, we study how a brain region called the hippocampus is involved in storing and retrieving episodic memories and in generating representations of space. There is overwhelming experimental evidence that the hippocampus contributes to both functions, but it remains unclear why the two go together and how they are implemented in hippocampal circuits. To address these questions, we employ a number of computational and theoretical approaches, including

  • biologically realistic, yet highly simplified, neural network models that capture the essence of the neural circuit mechanisms underlying learning and memory.
  • algorithmic models of the storage and retrieval of episodic memories.
  • theoretical models of the nature of episodic memory.
  • robotics simulations of spatial memory in rodents.

Below we describe a selection of projects that are currently ongoing in our group.


The shape of the hippocampal formation and its connections

Martin Pyka in our group has recently developed a method for modeling the anatomical layout of neurons and their projections. In this video, he uses his software to illustrate the peculiar gross anatomy of the hippocampal formation.





The interaction between semantic and episodic memory


We are developing a computational model of the encoding, storage, and retrieval of episodic memories that takes into account the interrelation between episodic memory and semantic representations. In the model, episodes are stored in terms of higher-order, semantic representations rather than the underlying sensory inputs. We investigate, for example, what role semantic representations play in episodic memory and how episodic memories can be used to infer semantic information.
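The core idea can be sketched in a few lines of toy code. The concept vocabulary, the overlap-based retrieval rule, and the co-occurrence statistic below are illustrative assumptions, not our actual model: episodes are stored as sets of semantic labels rather than raw sensory data, a partial cue retrieves the best-matching episode, and semantic regularities can be read back out of the episodic store.

```python
import numpy as np

# Hypothetical semantic vocabulary: each concept stands in for a learned
# high-level representation (not part of the actual model).
CONCEPTS = ["kitchen", "coffee", "morning", "office", "meeting", "evening"]

class EpisodicStore:
    """Toy store: an episode is a set of semantic concepts, not raw input."""

    def __init__(self):
        self.episodes = []

    def encode(self, concepts):
        self.episodes.append(frozenset(CONCEPTS.index(c) for c in concepts))

    def retrieve(self, cue):
        """Return the stored episode with the greatest overlap with the cue."""
        cue_idx = {CONCEPTS.index(c) for c in cue}
        best = max(self.episodes, key=lambda e: len(e & cue_idx))
        return sorted(CONCEPTS[i] for i in best)

    def cooccurrence(self):
        """Infer semantic statistics (concept co-occurrence) from episodes."""
        m = np.zeros((len(CONCEPTS), len(CONCEPTS)))
        for e in self.episodes:
            for i in e:
                for j in e:
                    if i != j:
                        m[i, j] += 1
        return m

store = EpisodicStore()
store.encode(["kitchen", "coffee", "morning"])
store.encode(["office", "meeting", "morning"])
print(store.retrieve(["coffee"]))  # → ['coffee', 'kitchen', 'morning']
```

The co-occurrence matrix illustrates the second research question: repeated episodes sharing concepts let the system infer semantic associations (here, that "morning" co-occurs with both "kitchen" and "office" contexts) without ever consulting sensory data.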


Robotics simulation of place-selective responses driven by visual inputs


The aim of the project is to understand how the rodent brain generates place-selective responses based on visual inputs alone. To model as closely as possible the conditions that a rodent faces, we let the small ePuck robot explore a real environment to collect images. These images are then processed with an algorithm called slow-feature analysis (SFA). In the future, we will study how to combine this visually-driven (allothetic) information with idiothetic spatial information to generate more robust location estimates.
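In its linear form, SFA amounts to whitening the input and then finding the directions along which the whitened signal changes most slowly over time. The sketch below is a minimal linear SFA on a synthetic two-dimensional signal, not our actual image-processing pipeline (which operates on camera images and typically includes nonlinear expansion); it only illustrates the slowness objective.

```python
import numpy as np

def linear_sfa(X, n_components=2):
    """Minimal linear slow feature analysis.
    X: (T, D) time series. Returns weights of the n slowest linear features."""
    X = X - X.mean(axis=0)                       # center the data
    # Whiten so that every direction has unit variance.
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    keep = eigval > 1e-10
    W_white = eigvec[:, keep] / np.sqrt(eigval[keep])
    Z = X @ W_white
    # Slowness objective: minimize the variance of the temporal derivative.
    dZ = np.diff(Z, axis=0)
    dval, dvec = np.linalg.eigh(np.cov(dZ, rowvar=False))  # ascending: slowest first
    return W_white @ dvec[:, :n_components]

# Toy input: a slow sine and a fast sine, linearly mixed into two channels.
t = np.linspace(0, 4 * np.pi, 500)
slow, fast = np.sin(t), np.sin(20 * t)
X = np.column_stack([slow + 0.5 * fast, slow - 0.5 * fast])
W = linear_sfa(X, n_components=1)
y = (X - X.mean(axis=0)) @ W    # extracted slowest feature, tracks the slow sine
```

Applied to a stream of images from the exploring robot, the slowest features turn out to vary with the robot's position rather than with fast-changing visual details, which is what makes SFA a candidate mechanism for place-selective responses.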


The generation and propagation of neural sequences in the hippocampus

(Visualization created with MuxVi)

Temporal sequences of neural activation can be observed in the hippocampus during the theta state and during sharp-wave ripple events. Since these temporal sequences are related to the ordering of the cells' place fields, it has been suggested that the sequences are important for spatial navigation, planning, or learning. We are trying to understand how neural networks generate these sequences and how they propagate to downstream regions.
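One classic way such sequences can arise is from feedforward chain connectivity, where a packet of activity hops from one group of neurons to the next. The sketch below is a generic rate-based chain model for illustration, not our actual network; group sizes, weights, and the tanh transfer function are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chain of neuron groups, each group projecting only to the next one,
# so a packet of activity travels along the chain as a temporal sequence.
n_groups, n_per, T = 8, 20, 12
N = n_groups * n_per
W = np.zeros((N, N))
for g in range(n_groups - 1):
    pre = slice(g * n_per, (g + 1) * n_per)
    post = slice((g + 1) * n_per, (g + 2) * n_per)
    W[post, pre] = rng.uniform(0.5, 1.0, (n_per, n_per)) / n_per

rates = np.zeros((T, N))
rates[0, :n_per] = 1.0                           # kick the first group
for t in range(1, T):
    rates[t] = np.tanh(5.0 * W @ rates[t - 1])   # threshold-like transfer

# Each group's mean rate peaks one time step after its predecessor's.
order = [int(np.argmax(rates[:, g * n_per:(g + 1) * n_per].mean(axis=1)))
         for g in range(n_groups)]
print(order)  # → [0, 1, 2, 3, 4, 5, 6, 7]
```

In such a chain the activity packet arrives at each group in order, reproducing the sequential activation seen experimentally; the open questions we study are how this connectivity could be learned and how the sequence is relayed to downstream regions.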