• GRACE accepted at ICDCS’21.
  • SIDCo, an efficient gradient compression technique for distributed deep learning, accepted at MLSys’21.
  • SwitchML accepted at NSDI’21.
  • Congratulations Arnaud and Ahmed! Two papers accepted at INFOCOM’21.
  • Congratulations Waleed! Assise accepted at OSDI’20.
  • Congratulations Bilal! Papers accepted at VLDB’20 and SoCC’20. The VLDB paper systematically investigates the problem of cloud configuration using black-box optimization methods and uncovers how different methods behave across more than 20 workloads. The SoCC paper proposes Vanir, a new method for configuring data analytics clusters composed of multiple distributed systems that must be jointly optimized.
  • We survey popular gradient compression techniques for distributed deep learning and perform a comprehensive comparative evaluation. Read our technical report.
  • Is there a discrepancy between the theory and practice of gradient compression for distributed deep learning? We argue so in our AAAI’20 paper.
  • Our paper on reasoning about and explaining the behavior of reinforcement learning agents in networking applications accepted at NetAI’19.
  • A paper on the DAIET project published at HotNets’17.
