This work proves nearly tight lower bounds on the length of relaxed Locally Decodable Codes (LDCs) with a non-adaptive decoder of one-sided error. While the claimed result applies only to such decoders, the benefit of adaptivity (and of two-sided error) in the current context is doubtful, and it certainly does not exist when the code is linear.
While I doubted the usefulness of the notion of a daisy in the prior work of Gur and Lachish (and its follow-up), it seems that this notion is important in the current work. As demonstrated here, for non-adaptive decoders, the prior lower bound could be obtained (and even slightly improved) by considering only the distributions of individual queries. In contrast, for the current lower bound, it seems necessary to consider the distribution over $q$-subsets of queries.
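To make this distinction concrete, here is a minimal formalization of the non-adaptive setting (the notation $\mu_i$ is mine, not the paper's). On input an index $i \in [k]$, a non-adaptive $q$-query decoder samples its entire query set before reading any bit of the received word,
\[
  Q = \{j_1,\dots,j_q\} \sim \mu_i, \qquad \text{where } \mu_i \text{ is a distribution over } \binom{[n]}{q},
\]
and then decides based on the $q$ answered bits. The prior argument uses only the marginal probabilities $\Pr_{Q \sim \mu_i}[\,j \in Q\,]$ of the individual locations $j \in [n]$, whereas the current argument works with the distribution $\mu_i$ over $q$-subsets itself.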
We show a nearly optimal lower bound on the length of linear relaxed locally decodable codes (RLDCs). Specifically, we prove that any $q$-query linear RLDC $C\colon \{0,1\}^k \to \{0,1\}^n$ must satisfy $n = k^{1+\Omega(1/q)}$. This bound closely matches the known upper bound of $n = k^{1+O(1/q)}$ by Ben-Sasson, Goldreich, Harsha, Sudan, and Vadhan (STOC 2004).
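Spelling out the asymptotic notation (this rephrasing is mine; $c$ and $C$ denote the unspecified constants hidden in the $\Omega(\cdot)$ and $O(\cdot)$):
\[
  n \;\ge\; k^{1+c/q} \ \ \text{(the present lower bound)}
  \qquad\text{versus}\qquad
  n \;\le\; k^{1+C/q} \ \ \text{(the known upper bound)},
\]
so for every fixed $q$ the exponent is pinned down to $1+\Theta(1/q)$, up to the gap between $c$ and $C$.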
Our proof introduces the notion of robust daisies, which are relaxed sunflowers with pseudorandom structure, and leverages a new spread lemma to extract dense robust daisies from arbitrary distributions.
See ECCC TR25-192.