Rothschild Lecture: From Small Data to Big Data and Back: Statistics and Data Science

Duration: 1 hour 1 min

About this item
Description: Bickel, P (University of California, Berkeley)
Wednesday 17th August 2016 - 16:00 to 17:00
 
Created: 2016-08-23 16:44
Collection: Theoretical Foundations for Statistical Network Analysis
Publisher: Isaac Newton Institute
Copyright: Bickel, P
Language: eng (English)
 
Abstract: Modern statistics began with R. A. Fisher's seminal work in the early twentieth century, with important predecessors such as Karl Pearson, contemporaries such as J. Neyman, and successors such as A. Wald. The focus was on small data sets or summaries of larger ones. Analyses were based on simple models and given in terms of estimation, testing and confidence bounds. With the advent of big data, the size of datasets, their complexity and heterogeneity, and the lack of theory for building mechanistic probability models brought new issues to the fore. I will discuss some of these issues: 1. Computation 2. Prediction 3. Sparsity/dimension reduction 4. Stability/robustness 5. Reduction to small data.
Available Formats
Format         Quality    Bitrate            Size
MPEG-4 Video   640x360    1.93 Mbits/sec     884.84 MB
WebM           640x360    672.32 kbits/sec   300.38 MB
iPod Video *   480x270    522.11 kbits/sec   233.27 MB
MP3            44100 Hz   250.15 kbits/sec   111.76 MB
Auto           (allows the browser to choose a format it supports)