This paper presents a novel algorithm for performing linearized confidence-weighted (LCW) learning on a fixed budget. LCW learning has been applied to online classification problems in recent years. To achieve better classification performance, it is common to combine LCW learning with kernel functions through the kernel trick. However, this trick makes LCW learning vulnerable to the curse of kernelization, which causes unbounded growth in memory usage and run-time. To address this issue, we first re-interpret LCW learning from a resource perspective, treating every instance as a potential resource to exploit. Based on this perspective, we then propose a budgeted algorithm that approximates LCW learning under a finite constraint on the number of available resources. The proposed algorithm has bounded time and space complexity and is thus able to break the curse. Experiments on several open datasets show that the proposed algorithm approximates LCW learning well and is competitive with state-of-the-art budgeted algorithms.