Research

Lookit: bringing developmental studies home


The Lookit website allows families to participate in developmental studies online using their own computer and webcam. Why put experiments online?

Learn more or participate with your kids at Lookit, view the source code on GitHub (lookit, experimenter, and exp-addons repos), or connect with us on Facebook or Instagram. To stay up to date on progress on the platform, you can join the lookit-research mailing list.

Current & planned projects

Automated gaze coding for developmental research

The next bottleneck for running infant studies at scale is coding the data: a human needs to watch the collected video, generally at 1/10 to 1/2 speed depending on the desired granularity, to record where the infant is looking in each frame. Algorithms for automated gaze coding from natural-light video (as opposed to measurements from specialized eyetracking hardware) are finally reaching the point where they could be productively adapted for developmental data collection. (Take a look at some examples of running OpenFace on webcam video of babies: 1, 2.) To get involved in efforts in this direction, you can join the baby-gaze-coding mailing list.
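As a rough illustration of what automated coding replaces, here is a minimal Python sketch of the post-processing step: turning per-frame horizontal gaze estimates (such as the gaze_angle_x column that OpenFace's FeatureExtraction tool writes per frame; the sign convention and threshold below are assumptions for illustration) into the left/right/center looks a human coder would otherwise record frame by frame.

```python
# Minimal sketch: classify per-frame gaze estimates into discrete looks.
# Column name (gaze_angle_x), sign convention (negative = infant's left),
# and the 0.15 rad threshold are illustrative assumptions, not fixed choices.

from typing import List


def classify_gaze(gaze_angles_x: List[float], threshold: float = 0.15) -> List[str]:
    """Label each frame by horizontal gaze direction.

    gaze_angles_x: per-frame horizontal gaze angle in radians.
    threshold: angles within +/- threshold count as 'center'.
    """
    labels = []
    for angle in gaze_angles_x:
        if angle < -threshold:
            labels.append("left")
        elif angle > threshold:
            labels.append("right")
        else:
            labels.append("center")
    return labels


def looking_time(labels: List[str], side: str, fps: float = 30.0) -> float:
    """Total looking time (seconds) toward one side, given frame labels."""
    return labels.count(side) / fps


# Synthetic example: 30 frames left, 60 frames right, 30 frames center.
angles = [-0.4] * 30 + [0.4] * 60 + [0.0] * 30
labels = classify_gaze(angles)
print(looking_time(labels, "left"))   # 1.0 second at 30 fps
print(looking_time(labels, "right"))  # 2.0 seconds
```

In practice the gaze estimates are noisy, so a real pipeline would also smooth over frames and validate against human-coded video; this sketch only shows the shape of the frame-to-looking-time computation.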

What’s it like to be a baby?


My more empirical work has focused on early conscious experience: for instance, how infants merge multiple representations of the same concept; understand the passage of time; and distinguish among imagery, memory, and perception.

Papers

Scott, K. M. Split-brain babies? Differences in representation of bilaterally and unilaterally presented visual concepts in infancy. Submitted (preprint).

Scott, K. M., & Kline, M. Enabling confirmatory secondary data analysis by logging data ‘checkout’. Under revision (preprint).

Chouinard, B., Scott, K., & Cusack, R. Using automatic face analysis to score infant behaviour from video collected online. Under revision (preprint).

Scott, K. M., & Schulz, L. E. (2017). Lookit (part 1): a new online platform for developmental research. Open Mind 1(1):4-14. (full text)

Scott, K. M., Chu, J., & Schulz, L. E. (2017). Lookit (part 2): Assessing the viability of online developmental research: results from three case studies. Open Mind 1(1):15-29. (full text)

Scott, K. M., & Schulz, L. E. (2014, July). Interhemispheric integration of visual concepts in infancy. Paper presented at the annual meeting of the Cognitive Science Society, Quebec City, Canada. (pdf)

Scott, K. M., Du, J., Lester, H. A., & Masmanidis, S. C. (2012). Variability of acute extracellular action potential measurements with multisite silicon probes. Journal of Neuroscience Methods 211(1), 22-30. (pdf)

Moss, F. J., Imoukhuede, P. I., Scott, K., Hu, J., Jankowsky, J. L., Quick, M. W., & Lester, H. A. (2009). GABA transporter function, oligomerization state, and anchoring: correlates with subcellularly resolved FRET. The Journal of General Physiology, 134(6), 489-521. (full text)

Student papers

PhD thesis: Online data collection for developmental research. (Thesis Commons)

Undergraduate SURF paper: Effects of nicotine on neuronal firing patterns in human subthalamic nucleus (pdf, slides)

Undergraduate SURF paper: Detecting multiple oligomerization states by multidimensional analysis of FRET images (pdf, slides)

Media coverage

Society for Science and the Public: Babies, parenthood, and science (June 2018)

MITili blog: Q&A with MIT’s Kim Scott in the Early Childhood Cognition Lab (December 2017)

MIT Campaign: Better data on how babies learn (November 2016)

Pacific Standard: The 30 top thinkers under 30 (March 2016)

Science News, Growth Curve: Your baby can watch movies for science (June 2014)

Parents Magazine: Parents Perspective (June 2014)

MIT News: MIT launches online lab to study early childhood learning (June 2014)

Boston Magazine: MIT launches a new online lab (June 2014)