The next Foundations of Data Science virtual talk will take place on Friday, Nov 20th at 10:00 AM Pacific Time (13:00 Eastern Time, 19:00 Central European Time, 18:00 UTC, 23:30 Indian Time). Himanshu Tyagi from IISc will speak about “General lower bounds for estimation under information constraints”.
Abstract: We present very general lower bounds for parametric estimation when only limited information per sample is allowed. These limitations can arise, for example, in the form of communication constraints, privacy constraints, or linear measurements. Our lower bounds hold for discrete distributions over large alphabets as well as continuous distributions with high-dimensional parameters, apply under any information constraint, and are valid for any $\ell_p$ loss function. Our bounds recover both bounds based on strong data processing inequalities and Cramér–Rao-based bounds as special cases.
This talk is based on joint work with Jayadev Acharya and Clément Canonne.
Please register here to join the virtual talk.
The series is supported by the NSF HDR TRIPODS Grant 1934846.