Abstract
We consider the problem of estimating the support size of a distribution $D$. Our investigation is pursued through the lens of distribution testing and seeks to understand the power of conditional sampling (denoted COND), wherein one is allowed to query the given distribution conditioned on an arbitrary subset $S$ of the domain. The primary contribution of this work is a new approach to lower bounds for the COND model that relies on powerful tools from information theory and communication complexity.
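For intuition only (this sketch is not part of the paper), a single COND query can be pictured as in the following Python sketch; the distribution `probs`, the conditioning set `S`, and the helper `cond_query` are hypothetical names introduced here purely for illustration.

```python
import random

def cond_query(probs, S, rng=random):
    """Illustrative COND oracle: draw one element from the distribution
    `probs` (a dict mapping domain elements to probabilities) conditioned
    on the subset `S`. Hypothetical helper, not from the paper."""
    S = list(S)
    mass = sum(probs.get(x, 0.0) for x in S)
    if mass == 0.0:
        raise ValueError("conditioning set has zero probability mass")
    r = rng.random() * mass          # uniform point inside the conditional mass
    acc = 0.0
    for x in S:
        acc += probs.get(x, 0.0)
        if r <= acc:
            return x
    return S[-1]                     # guard against floating-point round-off

# Example: condition a uniform distribution over {1,...,6} on the even elements.
D = {i: 1.0 / 6 for i in range(1, 7)}
print(cond_query(D, {2, 4, 6}))
```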
Our approach allows us to obtain surprisingly strong lower bounds for the COND model and its extensions.
- We bridge the longstanding gap between the upper bound of $O(\log\log n + 1/\epsilon^2)$ and the lower bound of $\Omega(\sqrt{\log\log n})$ for the COND model by providing a nearly matching lower bound. Surprisingly, we show that even if the actual probabilities of the COND samples are revealed, $\Omega(\log\log n + 1/(\epsilon^2 \log(1/\epsilon)))$ queries are still necessary.
- We obtain the first non-trivial lower bound for COND equipped with an additional oracle that reveals the conditional probabilities of the samples (to the best of our knowledge, this subsumes all of the models previously studied): in particular, we demonstrate that $\Omega(\log\log\log n + 1/(\epsilon^2 \log(1/\epsilon)))$ queries are necessary.
Publication
In International Symposium on Mathematical Foundations of Computer Science (MFCS)