Statistical Scalability Programme at the Isaac Newton Institute

Institute Co-director John Aston, together with Idris Eckley (Lancaster), Paul Fearnhead (Lancaster), Po-Ling Loh (Wisconsin-Madison), Rob Nowak (Wisconsin-Madison) and Richard Samworth (Cambridge), is hosting an Isaac Newton Institute programme on Statistical Scalability, running from 10th January 2018 to 29th June 2018.

We are living in the information age. Modern technology is transforming our ability to collect and store data on unprecedented scales. From the use of Oyster card data to improve London's transport network, to the Square Kilometre Array astrophysics project that has the potential to transform our understanding of the universe, 'Big Data' can inform and enrich many aspects of our lives. Given the prospect of transformational advances to standard practice across data-rich industries, government agencies, science and technology, it is unsurprising that Big Data is currently receiving such a high level of media publicity.

Of course, the important role of statistics within Big Data has been clear for some time. However, the current tendency has been to focus purely on algorithmic scalability, such as how to develop versions of existing statistical algorithms that scale better with the amount of data. Such an approach ignores the fact that fundamentally new issues often arise and that highly innovative solutions are required. In particular, the thesis of this programme is that it is only by simultaneous consideration of the methodological, theoretical and computational challenges involved that we can hope to provide the robust, scalable methods that are crucial to unlocking the potential of Big Data.

For more information, please visit www.newton.ac.uk/event/sts.