[theory students] Fwd: Fall 2019 Quarterly Theory Workshop showcasing Junior Theorists


Date: Wed, 23 Oct 2019 06:51:26 -0500
From: Shuchi Chawla <shuchi@xxxxxxxxxxx>
Subject: [theory students] Fwd: Fall 2019 Quarterly Theory Workshop showcasing Junior Theorists
FYI...

---------- Forwarded message ---------
From: Aravindan Vijayaraghavan <aravindv@xxxxxxxxxxxxxxxx>
Date: Tue, Oct 22, 2019, 5:18 PM
Subject: Fall 2019 Quarterly Theory Workshop showcasing Junior Theorists
To: shuchi@xxxxxxxxxxx <shuchi@xxxxxxxxxxx>, Christos Tzamos <tzamos@xxxxxxxx>, yliang@xxxxxxxxxxx <yliang@xxxxxxxxxxx>
Cc: Samir Khuller <samir.khuller@xxxxxxxxxxxxxxxx>, Konstantin Makarychev <konstantin@xxxxxxxxxxxxxxxx>, Jason D Hartline <hartline@xxxxxxxxxxxxxxxx>


Dear Shuchi, Christos, Bruce,

Please save the November 14-15 dates for Northwestern Computer Science's Fall 2019 Quarterly Theory Workshop highlighting Junior Theorists. Full details and registration are available on the workshop webpage:

https://theory.cs.northwestern.edu/events/2019-junior-theorists-workshop/

It would be great if you could also forward this announcement to the rest of your group.

We hope to see you there!
Aravindan, Jason, Kostya, and Samir


ABOUT THE SERIES

The Quarterly Theory Workshop brings in three or four theoretical computer science experts to present their perspectives and research on a common theme. Chicago and Midwest area researchers with an interest in theoretical computer science are invited to attend. The technical program is in the morning and includes coffee and lunch. The afternoon of the workshop will allow for continued discussion between attendees and the speakers.


SYNOPSIS

The focus of this workshop will be on junior researchers in all areas of theoretical computer science. The talks will be on the afternoon of the first day and morning of the second day. There will be time for open discussion after lunch on the second day. The speakers for this workshop are:

  • Soheil Behnezad (UMD)
  • Greg Bodwin (Georgia Tech)
  • Sumegha Garg (Princeton)
  • Andrea Lincoln (MIT)
  • Thatchaphol Saranurak (TTIC)
  • Vatsal Sharan (Stanford)
  • Sahil Singla (Princeton and IAS)
  • Manolis Zampetakis (MIT)
  • Jiapeng Zhang (Harvard)


LOGISTICS

  • Date: Thursday-Friday, Nov 14-15, 2019.
  • Location: Seeley Mudd 3514 (map), Northwestern U, 2211 Campus Dr, Evanston, IL 60208.
  • Transit: Noyes St. Purple Line (map).
  • Parking: Validation for the North Campus Parking Garage (map) available at the workshop.
  • Recommended Hotel: Hilton Orrington.
  • Registration: Please register (registration is not compulsory). Please bring your own name badge from a past conference.


SCHEDULE

Thursday:

  • 12:30-12:35: Opening Remarks
  • 12:35-1:15: Sumegha Garg (Princeton)
      Extractor-based Time-Space Lower Bounds for Learning
  • 1:20-2:00: Jiapeng Zhang (Harvard)
      An improved sunflower bound
  • 2:05-2:35: Coffee Break
  • 2:35-3:15: Thatchaphol Saranurak (TTIC)
      Expander decomposition: applications and how to use it
  • 3:20-4:00: Andrea Lincoln (MIT)
      Tight Hardness for Shortest Cycles and Paths in Sparse Graphs
  • 4:05-4:45: Soheil Behnezad (UMD)
      Improved Massively Parallel Algorithms for Maximal Matching and Graph Connectivity
  • 5:00-6:30: Cocktail Reception

Friday:

  • 8:30-9:00: Continental Breakfast
  • 9:00-9:05: Opening Remarks
  • 9:05-9:45: Sahil Singla (Princeton and IAS)
      Improved Truthful Mechanisms for Combinatorial Auctions with Submodular Bidders
  • 9:50-10:30: Greg Bodwin (Georgia Tech)
      Regularity Decompositions for Sparse Pseudorandom Graphs
  • 10:35-11:05: Coffee Break
  • 11:05-11:45: Vatsal Sharan (Stanford)
      New Problems and Perspectives on Learning, Sampling, and Memory
  • 11:50-12:30: Manolis Zampetakis (MIT)
      Computationally and Statistically Efficient Truncated Statistics
  • 12:35-1:30: Lunch

TITLES AND ABSTRACTS

Speaker: Sumegha Garg (Princeton)
Title: Extractor-based Time-Space Lower Bounds for Learning

A recent line of works has focused on the following question: Can one prove strong unconditional lower bounds on the number of samples needed for learning under memory constraints? We study an extractor-based approach to proving such bounds for a large class of learning problems as follows.

A matrix M: A x X -> {-1,1} corresponds to the following learning problem: An unknown element x in X is chosen uniformly at random. A learner tries to learn x from a stream of samples, (a_1, b_1), (a_2, b_2), ..., where for every i, a_i in A is chosen uniformly at random and b_i = M(a_i, x).

Assume that k, l, r are such that any submatrix of M, with at least 2^{-k}|A| rows and at least 2^{-l}|X| columns, has a bias of at most 2^{-r}. We show that any learning algorithm for the learning problem corresponding to M requires either a memory of size at least \Omega(k l), or at least 2^{\Omega(r)} samples.
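
For illustration (an example added here, not part of the abstract, with parameters chosen loosely), a standard instance of this framework is parity learning, where A = X = \{0,1\}^n and

    M(a, x) = (-1)^{\langle a, x \rangle \bmod 2}, \qquad
    \mathrm{bias}(A' \times X') \;\le\; \sqrt{\tfrac{2^n}{|A'|\,|X'|}} \;\le\; 2^{(k+l-n)/2}

for any submatrix with |A'| \ge 2^{-k}|A| rows and |X'| \ge 2^{-l}|X| columns (Lindsey's lemma). Taking k = l = n/4 gives r = n/4, and the theorem then recovers the known bound that learning parities requires either \Omega(n^2) memory or 2^{\Omega(n)} samples.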

In particular, this shows that to learn a degree-d multilinear polynomial from its evaluations on random points of {0,1}^n, we need either \tilde{\Omega}(n^{d+1}) memory or 2^{\Omega(n)} samples (which is essentially tight). We also extend the lower bounds to a learner that is allowed two passes over the stream of samples.

Joint works with Ran Raz and Avishay Tal.


Speaker: Jiapeng Zhang (Harvard)
Title: An improved sunflower bound.

A sunflower with $r$ petals is a collection of $r$ sets so that the intersection of each pair is equal to the intersection of all. Erdős and Rado in 1960 proved the sunflower lemma: for any fixed $r$, any family of sets of size $w$, with at least about $w^w$ sets, must contain a sunflower. The famous sunflower conjecture is that the bound on the number of sets can be improved to $c^w$ for some constant $c$. Despite much research, the best bounds until recently were all of the order of $w^{cw}$ for some constant $c$. In this work, we improve the bound to about $(\log w)^w$.
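
The definition is easy to check directly; here is a minimal sketch in Python (added for illustration only, not taken from the paper):

    def is_sunflower(sets):
        """Return True iff every pairwise intersection equals the common core."""
        sets = [frozenset(s) for s in sets]
        core = frozenset.intersection(*sets)
        return all(sets[i] & sets[j] == core
                   for i in range(len(sets))
                   for j in range(i + 1, len(sets)))

    # Three petals sharing the core {1, 2}: a sunflower.
    print(is_sunflower([{1, 2, 3}, {1, 2, 4}, {1, 2, 5}]))  # True
    # Pairwise intersections differ, so not a sunflower.
    print(is_sunflower([{1, 2, 3}, {2, 3, 4}, {1, 3, 5}]))  # False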

Joint work with Ryan Alweiss, Shachar Lovett and Kewen Wu.


Speaker: Thatchaphol Saranurak (TTIC)
Title: Expander decomposition: applications and how to use it

Expander decomposition has been a central tool in designing graph algorithms in many fields (including fast centralized algorithms, approximation algorithms, and property testing) for decades. Recently, we found that it also gives many impressive applications in dynamic graph algorithms and distributed graph algorithms. In this talk, I will explain the key tools that enable this development, show how to apply these tools, and survey the recent results based on expander decomposition.
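
For context, the standard formulation of the tool (a textbook-style statement added here with the commonly cited parameters, not taken verbatim from the talk) is: given a graph G = (V, E) with m edges and a parameter \phi \in (0,1), one can partition V into clusters V_1, \dots, V_k such that

    \Phi(G[V_i]) \;\ge\; \phi \ \text{ for every } i, \qquad
    \big|\{\, e \in E : e \text{ crosses between clusters} \,\}\big| \;=\; O(\phi\, m \log n),

where \Phi denotes conductance. Such decompositions can moreover be computed in near-linear time (up to polylogarithmic losses in the parameters), which is what makes them usable inside fast, dynamic, and distributed algorithms.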


Speaker: Andrea Lincoln (MIT)
Title: Tight Hardness for Shortest Cycles and Paths in Sparse Graphs

Fine-grained reductions have established equivalences between many core problems with \tilde{O}(n^3)-time algorithms on n-node weighted graphs, such as Shortest Cycle, All-Pairs Shortest Paths (APSP), Radius, Replacement Paths, Second Shortest Paths, and so on. These problems also have \tilde{O}(mn)-time algorithms on m-edge n-node weighted graphs, and such algorithms have wider applicability. Are these mn bounds optimal when m << n^2? Starting from the hypothesis that the minimum weight (2\ell+1)-Clique problem in edge-weighted graphs requires n^{2\ell+1-o(1)} time, we prove that for all sparsities of the form m = \Theta(n^{1+1/\ell}), there is no O(n^2 + mn^{1-\sigma})-time algorithm with \sigma > 0 for any of the problems below:

  • Minimum Weight (2\ell+1)-Cycle in a directed weighted graph,
  • Shortest Cycle in a directed weighted graph,
  • APSP in a directed or undirected weighted graph,
  • Radius (or Eccentricities) in a directed or undirected weighted graph,
  • Wiener index of a directed or undirected weighted graph,
  • Replacement Paths in a directed weighted graph,
  • Second Shortest Path in a directed weighted graph,
  • Betweenness Centrality of a given node in a directed weighted graph.

That is, we prove hardness for a variety of sparse graph problems from the hardness of a dense graph problem. Our results also lead to new conditional lower bounds from several related hypotheses for unweighted sparse graph problems, including k-Cycle, Shortest Cycle, Radius, Wiener index, and APSP.
Joint work with Virginia Vassilevska Williams and Ryan Williams.


Speaker: Soheil Behnezad (UMD)
Title: Improved Massively Parallel Algorithms for Maximal Matching and Graph Connectivity

In this talk we will discuss the recent algorithmic progress made on the Massively Parallel Computations (MPC) model. The MPC model provides a clean theoretical abstraction of modern parallel computation frameworks such as MapReduce, Hadoop, Spark, etc., which have been extremely successful in processing large-scale data-sets.

Our main focus in the talk will be on the maximal matching problem. We provide a novel analysis of an extremely simple algorithm and show that it leads to exponentially faster MPC algorithms for maximal matching. Our analysis is based on a novel method of proving concentration bounds for algorithms satisfying a certain "locality" property.
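
As a point of reference, a sequential sketch of random-order greedy maximal matching, one natural candidate for such an extremely simple algorithm, is given below (an illustration added here; whether this is exactly the algorithm analyzed in the talk is an assumption, and the MPC implementation and its concentration analysis are not reproduced):

    import random

    def greedy_maximal_matching(edges, seed=0):
        """Scan edges in random order; keep an edge if both endpoints are free."""
        rng = random.Random(seed)
        order = list(edges)
        rng.shuffle(order)
        matched, matching = set(), []
        for u, v in order:
            if u not in matched and v not in matched:
                matching.append((u, v))
                matched.update((u, v))
        return matching

    # On a 4-cycle every maximal matching has exactly two edges.
    print(greedy_maximal_matching([(0, 1), (1, 2), (2, 3), (3, 0)]))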

We will also overview an algorithm for graph connectivity in the MPC model that takes O(log D) rounds for graphs with diameter D > log n and O(log log n) rounds for all other graphs. This algorithm is near-optimal since Ω(log D) is a (conditional) lower bound for the problem.

Based on joint works with Mohammad Hajiaghayi and David Harris (FOCS 2019), and Laxman Dhulipala, Hossein Esfandiari, Jakub Łącki, and Vahab Mirrokni (FOCS 2019).


Speaker: Sahil Singla (Princeton and IAS)
Title: Improved Truthful Mechanisms for Combinatorial Auctions with Submodular Bidders

A longstanding open problem in Algorithmic Mechanism Design is to design computationally efficient truthful mechanisms for (approximately) maximizing welfare in combinatorial auctions with submodular bidders. The first such mechanism was obtained by Dobzinski, Nisan, and Schapira [STOC'06], who gave an O(log^2 m)-approximation, where m is the number of items. This problem has been studied extensively since, culminating in an O(\sqrt{\log m})-approximation mechanism by Dobzinski [STOC'16]. We present a computationally efficient truthful mechanism with an approximation ratio that improves upon the state of the art by an exponential factor. In particular, our mechanism achieves an O((\log\log m)^3)-approximation in expectation, uses only O(n) demand queries, and is universally truthful. This settles, in the negative, an open question of Dobzinski on whether \Theta(\sqrt{\log m}) is the best approximation ratio in this setting.
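
For readers outside the area (standard definitions added here for context, not part of the abstract): a bidder's valuation v over the set M of m items is submodular if it has diminishing returns,

    v(S \cup \{j\}) - v(S) \;\ge\; v(T \cup \{j\}) - v(T)
    \qquad \text{for all } S \subseteq T \subseteq M,\ j \notin T,

and a demand query at item prices p asks the bidder for a set in \arg\max_{S \subseteq M} \big( v(S) - \sum_{j \in S} p_j \big).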

This is joint work with Sepehr Assadi and appears in FOCS 2019.


Speaker: Greg Bodwin (Georgia Tech)
Title: Regularity Decompositions for Sparse Pseudorandom Graphs

The "regularity method" is a powerful tool in graph theory in which one analyzes a graph G by first approximating it with a union of random graphs H, and then analyzing H using easy probabilistic processing. This method lets one solve otherwise hard or intractable problems on G, with some extra error depending on how well G is approximated by H. How big can this error possibly be? For dense graphs, the answer is well known. But for slightly sparser graphs, say with "only" n^{1.99} edges, it is suddenly less clear which types of graphs can be well approximated, how to measure approximation error, etc. We will survey this research area and discuss some new progress in classifying the types of graphs for which a "sparse regularity method" can be effectively applied.
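
As background (a standard definition added here for context, not part of the abstract): writing d(X, Y) = e(X, Y) / (|X||Y|) for the edge density between disjoint vertex sets, a pair (A, B) is \varepsilon-regular if

    |d(X, Y) - d(A, B)| \;\le\; \varepsilon
    \qquad \text{for all } X \subseteq A,\ Y \subseteq B \text{ with } |X| \ge \varepsilon |A|,\ |Y| \ge \varepsilon |B|.

The dense regularity lemma partitions any graph into boundedly many parts so that almost all pairs are \varepsilon-regular; the question in the sparse setting is what should replace this guarantee.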

Based on joint work with Santosh Vempala.


Speaker: Vatsal Sharan (Stanford)
Title: New Problems and Perspectives on Learning, Sampling, and Memory

What is the role of memory in continuous optimization and learning? Are there inherent trade-offs between the available memory and the data requirement? Is it possible to achieve the sample complexity of second-order optimization methods with significantly less memory? I will discuss these questions in this talk, and a recent result along these lines. For a basic continuous optimization and learning problem, linear regression, we show that there is a gap between the sample complexity of algorithms with quadratic memory and that of any algorithm with sub-quadratic memory. I will also discuss several promising future directions to show strong memory/sample tradeoffs for continuous optimization. This is based on joint work with Aaron Sidford and Greg Valiant.

If there is time, I will introduce the problem of data "amplification": given n independent draws from a distribution, D, to what extent is it possible to output a set of m > n datapoints that are indistinguishable from m i.i.d. draws from D? Curiously, we show that nontrivial amplification is often possible in the regime where n is too small to learn D to any nontrivial accuracy. We also discuss connections between this setting and the challenge of interpreting the behavior of GANs and other ML/AI systems. This is based on joint work with Brian Axelrod, Shivam Garg and Greg Valiant.


Speaker: Manolis Zampetakis (MIT)
Title: Computationally and Statistically Efficient Truncated Statistics

Censoring and truncation occur when data falling outside of a subset of the population are not observable. In practice, this often arises as a result of saturation of measurement devices, experimental design, or legal or privacy constraints preventing the use of some of the data. Such phenomena have been known to affect experimental results in counterintuitive ways, as per Berkson's paradox.

In our recent work, we provide the first provably computationally and statistically efficient methods accomplishing the fundamental task of statistical estimation for the entire population out of exclusively censored data. Our first result [w/ Daskalakis, Gouleakis, Tzamos FOCS'18] assumes that the population follows a multi-dimensional normal distribution and the survival set is known. In follow-up works, we have extended our result to the case of censored linear [w/ Daskalakis, Gouleakis, Tzamos COLT'19], logistic, and probit regression [w/ Daskalakis, Ilyas, Rao '19], and we have also explored the case of an unknown survival set [w/ Kontonis, Tzamos FOCS'19].
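
As a toy illustration of why truncation is problematic (added here; this is not one of the estimators from the papers above), the naive sample mean computed from truncated observations of a standard normal is visibly biased:

    import random
    import statistics

    random.seed(0)
    population = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # true mean 0

    def in_survival_set(x):
        # Only observations above -0.5 survive (a made-up truncation rule).
        return x > -0.5

    observed = [x for x in population if in_survival_set(x)]
    # Prints a value close to 0.5 rather than 0, illustrating why specialized
    # estimators for truncated data are needed.
    print("naive mean of truncated sample:", round(statistics.mean(observed), 3))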