Tech Stew: Highlights from Strange Loop 2019 and PWLConf 2019
Two Sigma researchers and engineers review some of the most fascinating and offbeat papers and presentations from Strange Loop 2019 and Papers We Love 2019.
Two Sigma researchers highlight a few particularly insightful papers, talks, and presentations from ICLR 2019.
A Two Sigma AI engineer outlines several approaches for understanding how machine learning models arrive at the answers they do.
Speaking at the 2019 Milken Institute Global Conference, Two Sigma co-founder David Siegel discusses the challenges and opportunities AI offers for individuals, companies, and societies.
Speaking on a panel at the 2019 World Economic Forum, Two Sigma co-founder David Siegel discusses key challenges and opportunities as computers assume greater decision-making power globally.
Modern large-scale ML applications require stochastic optimization algorithms to be implemented on distributed computational architectures, where a key bottleneck is the communication overhead of exchanging information such as stochastic gradients among workers. To reduce this communication cost, the authors propose a convex optimization formulation that minimizes the coding length of stochastic gradients.
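The paper derives its coding scheme from a convex program, which the abstract does not spell out; purely as a loose illustration of why compressing gradients reduces communication, here is a minimal top-k sparsification sketch in NumPy. The top-k scheme and the helper names (`compress_gradient`, `decompress_gradient`) are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def compress_gradient(grad, k):
    # Keep only the k largest-magnitude entries; sending (indices, values)
    # makes the per-step communication cost O(k) instead of O(len(grad)).
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def decompress_gradient(idx, vals, dim):
    # Rebuild a dense gradient from the sparse (indices, values) message.
    dense = np.zeros(dim)
    dense[idx] = vals
    return dense

# Toy usage: four "workers" each send only k of dim gradient entries,
# and the "server" averages the reconstructed gradients.
rng = np.random.default_rng(0)
dim, k = 1000, 50
worker_grads = [rng.normal(size=dim) for _ in range(4)]
received = [decompress_gradient(*compress_gradient(g, k), dim) for g in worker_grads]
averaged = np.mean(received, axis=0)
```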
The authors suggest a general oracle-based framework that captures different parallel stochastic optimization settings described by a dependency graph, and derive generic lower bounds in terms of this graph, as well as lower bounds for several specific parallel optimization settings. They highlight gaps between lower and upper bounds on the oracle complexity, and cases where the “natural” algorithms are not known to be optimal.
Sparse Principal Component Analysis (SPCA) and Sparse Linear Regression (SLR) have a wide range of applications and have attracted attention as canonical examples of statistical problems in high dimension. A variety of algorithms have been proposed for both SPCA and SLR, but an explicit connection between the two had not been made. This paper shows how to efficiently transform a black-box solver for SLR into an algorithm for SPCA.
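The paper's contribution is a black-box reduction from SPCA to SLR; the sketch below is not that reduction, only a minimal, hypothetical illustration (in the spirit of the regression view of sparse PCA) of how a sparse-regression routine such as scikit-learn's Lasso can produce sparse principal-component loadings. The function name and toy data are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_pc_via_regression(X, alpha=0.1):
    # (1) Center the data, (2) compute ordinary first-PC scores via SVD,
    # (3) regress those scores on X with an L1 penalty so the loadings
    # become sparse, (4) renormalize. This mirrors the regression view of
    # sparse PCA and is only a sketch, not the paper's black-box reduction.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[0]
    loadings = Lasso(alpha=alpha, fit_intercept=False).fit(Xc, scores).coef_
    norm = np.linalg.norm(loadings)
    return loadings / norm if norm > 0 else loadings

# Toy usage: the true first PC is supported on the first 5 of 50 coordinates.
rng = np.random.default_rng(0)
n, d = 200, 50
signal = rng.normal(size=n)
X = np.outer(signal, np.r_[np.ones(5), np.zeros(d - 5)]) + 0.1 * rng.normal(size=(n, d))
v = sparse_pc_via_regression(X, alpha=0.05)
print(np.flatnonzero(np.abs(v) > 1e-6))  # indices of the recovered sparse support
```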
Two Sigma researchers highlight several papers from ICML 2018 that they found particularly novel, practical, or otherwise compelling.
Two Sigma researchers provide an overview of some of the most interesting lectures and sessions at JSM 2018, and highlight some of the most important challenges statisticians face going forward.