Towards proving the lower bounds, the authors introduce two types of sources, which may be of independent interest.
The first type, called a "zero-fixing source", is a special case of a bit-fixing source in which all fixed bits are set to zero. That is, an $(n,k)$-zero-fixing source is a source in which $n-k$ of the bits are set to zero, and the other $k$ bits are perfectly random (i.e., uniformly distributed over $\{0,1\}^k$). Such sources may arise in some applications that are modeled by bit-fixing sources; that is, in applications in which the fixed bits model defective devices, where the defects cause a default value rather than an arbitrary one (modeled by a worst-case choice).
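To make the definition concrete, here is a minimal sampler for an $(n,k)$-zero-fixing source; the function name and the `free_positions` parameter are illustrative choices, not anything from the paper:

```python
import random

def sample_zero_fixing(n, k, free_positions=None, rng=random):
    """Sample from an (n, k)-zero-fixing source: n - k coordinates are
    fixed to zero, and the k coordinates in free_positions are
    independent uniform bits."""
    if free_positions is None:
        # In the model the adversary picks the free positions; here we
        # just pick them at random for illustration.
        free_positions = rng.sample(range(n), k)
    x = [0] * n                      # every fixed bit defaults to zero
    for i in free_positions:
        x[i] = rng.randrange(2)      # free bits are uniformly random
    return x
```

The only difference from a general bit-fixing source is the `[0] * n` initialization: in the general model the fixed coordinates could hold arbitrary constants.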
As for the second type, I am presenting a version of it that is obtained from their notion of an $(n,k,w)$-fixed-weight source by setting $k=n$ and letting $w$ be a random variable. Specifically, for a class of distributions $D$ over $[n]$, an $(n,D)$-weight source is obtained by selecting $w$ according to some distribution $W$ in $D$, and then letting the output be selected uniformly among all $n$-bit strings of Hamming weight $w$.
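A sketch of sampling from such a weight source, under the assumption that the distribution $W$ is supplied as a callable returning an integer in $\{0,\dots,n\}$ (the helper name is mine):

```python
import random

def sample_weight_source(n, W, rng=random):
    """Sample from an (n, D)-weight source: draw the weight w from the
    distribution W (a callable returning an integer in [0, n]), then
    output a uniformly random n-bit string of Hamming weight w."""
    w = W()                                  # draw w according to W in D
    ones = set(rng.sample(range(n), w))      # uniform choice of w positions
    return [1 if i in ones else 0 for i in range(n)]
```

Choosing a uniformly random $w$-subset of the $n$ positions is exactly what makes the output uniform over the strings of weight $w$.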
An $(n,k)$-bit-fixing source is a distribution over $n$-bit strings that is fixed on $n-k$ of the coordinates and jointly uniform on the remaining $k$ bits. Explicit constructions of bit-fixing extractors by Gabizon, Raz and Shaltiel [SICOMP 2006] and Rao [CCC 2009] extract $(1-o(1)) \cdot k$ bits for $k = polylog(n)$, almost matching the probabilistic argument. Intriguingly, unlike other well-studied sources of randomness, a result of Kamp and Zuckerman [SICOMP 2006] shows that, for any $k$, some small portion of the entropy in an $(n,k)$-bit-fixing source can be extracted. Although the extractor does not extract all the entropy, it does extract $(1/2 - o(1)) \cdot \log(k)$ bits.
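The Kamp-Zuckerman extractor itself is based on a random walk, but the weaker fact that at least one bit is always extractable (for $k \geq 1$) is already witnessed by the parity of all coordinates: the fixed bits contribute a constant, and the XOR of the $k$ uniform bits is itself uniform. A brute-force check of this one-bit example (the helper names are mine, and this is not the Kamp-Zuckerman construction):

```python
import itertools

def parity_extract(x):
    """One-bit deterministic extractor: XOR of all coordinates."""
    b = 0
    for bit in x:
        b ^= bit
    return b

def parity_is_uniform(n, fixed, free):
    """Enumerate all 2^k settings of the free bits and check that the
    parity output is exactly uniform.  fixed: dict position -> bit;
    free: list of the k free positions."""
    counts = [0, 0]
    for assignment in itertools.product([0, 1], repeat=len(free)):
        x = [0] * n
        for pos, bit in fixed.items():
            x[pos] = bit
        for pos, bit in zip(free, assignment):
            x[pos] = bit
        counts[parity_extract(x)] += 1
    return counts[0] == counts[1]
```

For any fixing of the other bits, exactly half of the $2^k$ free-bit assignments yield each output, so the extracted bit is uniform.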
In this paper we prove that when the entropy $k$ is small enough compared to $n$, this exponential entropy loss is unavoidable. More precisely, one cannot extract more than $\log(k)/2 + O(1)$ bits from $(n,k)$-bit-fixing sources. The remaining entropy is inaccessible, information-theoretically. By the Kamp-Zuckerman construction, this negative result is tight.
Our impossibility result also holds for what we call zero-fixing sources. These are bit-fixing sources in which the fixed bits are set to $0$. We complement our negative result by giving an explicit construction of an $(n,k)$-zero-fixing extractor that outputs $\Omega(k)$ bits, even for $k = polyloglog(n)$. Furthermore, we give a construction of an $(n,k)$-bit-fixing extractor that outputs $k-O(1)$ bits, for entropy $k = (1+o(1)) \cdot \log\log{n}$, with running time $n^{O((\log{\log{n}})^2)}$.
See ECCC TR14-160.