This paper is concerned with caching support at access points (APs) for fast handoff within IEEE 802.11 networks. A common approach in current schemes is to let a mobile station preauthenticate with, or proactively distribute its security context to, neighboring APs. Each target AP caches the received context beforehand and can thus avoid backend-network authentication if the station reassociates. We present an approach to improving cache effectiveness under the least recently used (LRU) replacement policy while accounting for nonuniform cache-miss penalties that reflect authentication delay. We leverage widely used LRU caching techniques to build a model in which high-penalty cache entries are prevented from being prematurely evicted under the conventional replacement policy, thereby avoiding frequent, expensive authentications with remote sites. This is accomplished by introducing software-generated reference requests that trigger the cache machinery in APs to refresh certain entries automatically. Performance is evaluated through simulation and analytical modeling. Results show that our approach, compared with the base LRU scheme, reduces authentication delay by more than 51 percent and the cache miss ratio by over 28 percent on average. Quantitative and qualitative discussions indicate that our approach is applicable in practical settings.
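The core idea of software-generated reference requests can be illustrated with a minimal sketch. The class below is a hypothetical simplification, not the paper's actual mechanism: it models an LRU cache of per-station security contexts in which entries whose miss penalty exceeds an illustrative threshold are periodically re-referenced, so the ordinary LRU machinery keeps them near the most-recently-used position.

```python
from collections import OrderedDict


class PenaltyAwareLRU:
    """Illustrative sketch: an LRU cache in which high-penalty entries
    are periodically re-referenced ("refreshed") by software so that
    the standard LRU replacement policy does not evict them prematurely.
    The class name, threshold, and method names are assumptions for
    illustration, not the paper's implementation."""

    def __init__(self, capacity, penalty_threshold):
        self.capacity = capacity
        self.threshold = penalty_threshold
        # key -> (security_context, miss_penalty); insertion order = LRU order
        self.cache = OrderedDict()

    def access(self, key, value=None, penalty=0):
        """Reference an entry; on a hit, promote it to the MRU position."""
        if key in self.cache:
            self.cache.move_to_end(key)  # conventional LRU promotion
            return self.cache[key][0]
        # Miss: evict the LRU entry if the cache is full, then insert.
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)
        self.cache[key] = (value, penalty)
        return value

    def refresh_high_penalty(self):
        """Software-generated reference requests: promote every entry
        whose miss penalty exceeds the threshold, reusing the same
        LRU machinery an ordinary access would trigger."""
        for key in list(self.cache):
            if self.cache[key][1] > self.threshold:
                self.cache.move_to_end(key)
```

For example, with a two-entry cache, a high-penalty context for a remote site survives a refresh-then-insert sequence, while a low-penalty local context is the one evicted, which is the behavior the abstract describes.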