  • Data Science

Highlights of the 2018 Joint Statistical Meetings

Two Sigma researchers provide an overview of some of the most interesting lectures and sessions at JSM 2018 and highlight key challenges that statisticians face going forward.

  • Data Science

Learning and Memorization

In the machine learning research community, it is generally believed that there is a tension between memorization and generalization. This paper examines the extent to which that tension actually exists by exploring whether it is possible to generalize through memorization alone.

  • Data Science

Rademacher Averages: Theory and Practice

An overview of Rademacher Averages, a fundamental concept from statistical learning theory that can be used to derive uniform, sample-dependent bounds on the deviation of sample averages from their expectations.
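
As a rough sketch of the central quantity (using standard textbook notation, which may differ from the article's): for a family of functions $\mathcal{F}$ taking values in $[0,1]$ and a sample $S = (s_1, \ldots, s_n)$, the empirical Rademacher average is

$$\hat{R}_S(\mathcal{F}) = \mathbb{E}_{\sigma}\left[\sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(s_i)\right],$$

where the $\sigma_i$ are independent random signs ($\pm 1$ with equal probability). Standard symmetrization arguments then bound $\sup_{f \in \mathcal{F}} \left| \mathbb{E}[f] - \frac{1}{n}\sum_{i=1}^{n} f(s_i) \right|$, with high probability, by a multiple of $\hat{R}_S(\mathcal{F})$ plus an $O\!\left(\sqrt{\log(1/\delta)/n}\right)$ term.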

  • Data Science

NIPS 2016: A Survey of Tutorials, Papers, and Workshops

Two Sigma researchers discuss notable advances in deep learning, optimization algorithms, Bayesian techniques, and time-series analysis presented at the 2016 Conference on Neural Information Processing Systems (NIPS).