
Behavioral Social Neuroscience Seminar

Wednesday, October 11, 2017
4:00pm to 5:00pm
Baxter B125
Modeling Ignorance: Uncertainty or Complexity?
Peter Bossaerts, Redmond Barry Distinguished Professor, University of Melbourne & the Florey Institute of Neuroscience and Mental Health

Abstract: Classical decision theory uses the tools of probability theory when dealing with situations where an answer is unknown. It treats the unknown parameter (say, the distance between Sydney and Melbourne) as a random variable, endows it with a prior belief, samples, and updates beliefs, preferably using Bayes' law. As such, classical decision theory treats ignorance as uncertainty. Experimental and theoretical paradigms then generally build on situations where uncertainty can be resolved by sampling. Decision neuroscience has taken the same route, very successfully: it has identified multiple "prediction error" signals in the brain, as well as neural signals that reflect the expected size of these prediction errors and the corresponding updates, or "surprises."
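
As a concrete illustration of this Bayesian view, here is a minimal sketch (not taken from the talk; all numbers are invented): with a Gaussian prior over the unknown quantity and Gaussian sampling noise, each belief update is driven by a prediction error weighted by a gain, and the uncertainty shrinks with every sample.

```python
import numpy as np

# Minimal sketch (illustrative only): Bayesian estimation of an unknown
# quantity, e.g. the Sydney-Melbourne distance, from noisy samples.
# With a Gaussian prior and Gaussian observation noise, Bayes' law reduces
# to an update driven by a prediction error scaled by a gain.

rng = np.random.default_rng(0)

true_distance = 714.0        # hypothetical "unknown" value (km)
noise_var = 50.0 ** 2        # variance of each noisy observation

mu, var = 500.0, 200.0 ** 2  # prior belief: mean and variance

for t in range(10):
    sample = true_distance + rng.normal(0.0, np.sqrt(noise_var))
    prediction_error = sample - mu          # the "surprise" signal
    gain = var / (var + noise_var)          # how much to trust this sample
    mu = mu + gain * prediction_error       # posterior mean (Bayes' law)
    var = (1.0 - gain) * var                # posterior variance shrinks
    print(f"t={t}: estimate={mu:.1f} km, uncertainty (sd)={np.sqrt(var):.1f}")
```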
But there are situations, quite ubiquitous, where such an approach is known to be ineffective. These are situations that computer scientists categorize as "computationally complex." I will focus on one subset, namely, problems said to be "NP-hard." Here, it is not obvious how to define traditional concepts such as "prediction errors" and "surprise," and sampling is rather inefficient anyway. Moreover, I will show that human choice appears to follow, in many respects, the route that the theory of computation (and your computer) has taken. But unlike computers (at this time), humans are able to sense the complexity of single instances. The theory of computation applies more generally, because it is able to predict the performance of another type of computer: markets.
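
For contrast, here is a minimal sketch of the knapsack problem named in the supplemental paper (item values, weights, and capacity are invented for illustration): solving it exactly by enumeration requires examining up to 2^n candidate subsets, which is the sense in which such problems are computationally complex and poorly suited to sampling.

```python
from itertools import combinations

# Minimal sketch (illustrative only, not the paper's task set): the 0/1
# knapsack problem. Brute force considers up to 2^n subsets of items,
# which is why the problem scales badly with instance size.

values = [10, 13, 7, 8, 12, 4]   # hypothetical item values
weights = [3, 4, 2, 3, 5, 1]     # hypothetical item weights
capacity = 9

n = len(values)
best_value, best_subset = 0, ()
for r in range(n + 1):
    for subset in combinations(range(n), r):
        w = sum(weights[i] for i in subset)
        v = sum(values[i] for i in subset)
        if w <= capacity and v > best_value:
            best_value, best_subset = v, subset

print(f"best value {best_value} with items {best_subset} "
      f"after checking {2 ** n} candidate subsets")
```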

For more information, please contact Mary Martin by phone at 626-395-5884 or by email at [email protected], or see the supplemental paper "How Humans Solve Complex Problems: The Case of the Knapsack Problem."