This presentation surveys the principles needed for a successful AI programming competition and describes the architecture of the game environment, particularly the Google Cloud Platform (GCP) support for executing 12 million games written in over 20 programming languages.
In the machine learning research community, it is generally believed that there is a tension between memorization and generalization. This paper examines the extent to which this tension exists, by exploring whether it is possible to generalize by memorizing alone.
An overview of Rademacher Averages, a fundamental concept from statistical learning theory that can be used to derive uniform, sample-dependent bounds on the deviation of sample averages from their expectations.
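As a brief sketch of the concept this overview covers: for a function class F and a sample S = (x_1, ..., x_n), the empirical Rademacher average is typically defined as below, and it yields a uniform deviation bound of the following form (stated here for functions with values in [0, 1]; exact constants vary across formulations):

```latex
% Empirical Rademacher average of a class F on the sample S = (x_1, ..., x_n),
% where the sigma_i are independent Rademacher variables, uniform on {-1, +1}.
\hat{R}_S(F) = \mathbb{E}_{\sigma}\!\left[\,\sup_{f \in F}\ \frac{1}{n} \sum_{i=1}^{n} \sigma_i\, f(x_i)\right]

% A typical uniform, sample-dependent bound: for f : X -> [0, 1], with
% probability at least 1 - delta over the draw of the i.i.d. sample S,
\sup_{f \in F}\ \left|\, \mathbb{E}[f] - \frac{1}{n} \sum_{i=1}^{n} f(x_i) \,\right|
  \;\le\; 2\,\hat{R}_S(F) \;+\; 3\sqrt{\frac{\ln(2/\delta)}{2n}}
```

Because the right-hand side depends only on the observed sample, the bound can be evaluated from data, which is what makes Rademacher averages useful for sample-dependent guarantees.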
Two Sigma researchers discuss notable advances in deep learning, optimization algorithms, Bayesian techniques, and time-series analysis presented at 2016’s Conference on Neural Information Processing Systems (NIPS).