Online multiclass classification is an online learning problem in which the learner faces a sequence of multiclass classification tasks and may exploit knowledge gained from previous tasks. The goal is to make correct predictions over this sequence. It is generally considered more difficult than its binary counterpart, online binary classification. A popular algorithm, the passive-aggressive algorithm, was originally proposed for binary problems and later extended to the multiclass passive-aggressive (MPA) algorithm for multiclass problems. The structure of MPA lends itself to the kernel trick, which enables better predictions with a kernel-based model. However, this approach suffers from the curse of kernelization, which causes unbounded growth of the model in both memory usage and runtime. To address this growth problem, we first introduce a resource perspective that gives an alternative, equivalent interpretation of the kernel-based MPA algorithm. Based on this resource perspective, we propose the budgeted MPA (BMPA) algorithm, which approximates the kernel-based MPA algorithm. BMPA limits the maximum number of available resources by removal and fully exploits the remaining ones through constrained optimization. We study three removal strategies and give a relative mistake bound that provides a unified analysis. Simulation experiments on various datasets demonstrate that BMPA is effective and competitive with state-of-the-art budgeted online algorithms.
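
To make the setting concrete, the following is a minimal, hypothetical sketch of a kernelized multiclass passive-aggressive learner with a fixed budget of support examples, using an oldest-removal strategy. The class name, parameters (budget, RBF width `gamma`, aggressiveness `C`), and the PA-I-style step size are illustrative assumptions, not the paper's exact formulation, and the sketch omits the constrained re-optimization step of BMPA.

```python
import numpy as np

class BudgetedMultiClassPA:
    """Illustrative sketch (not the paper's exact BMPA): kernelized
    multiclass PA learning on a budget of stored support examples."""

    def __init__(self, n_classes, budget=50, C=1.0, gamma=1.0):
        self.n_classes = n_classes
        self.budget = budget      # maximum number of stored support examples
        self.C = C                # aggressiveness parameter (PA-I style)
        self.gamma = gamma        # RBF kernel width (assumed kernel choice)
        self.support = []         # list of (x, per-class coefficient vector)

    def _kernel(self, a, b):
        # RBF kernel; any positive-definite kernel could be substituted
        return np.exp(-self.gamma * np.sum((a - b) ** 2))

    def _scores(self, x):
        # class scores as a kernel expansion over the support set
        s = np.zeros(self.n_classes)
        for sx, alpha in self.support:
            s += alpha * self._kernel(sx, x)
        return s

    def predict(self, x):
        return int(np.argmax(self._scores(x)))

    def partial_fit(self, x, y):
        s = self._scores(x)
        # highest-scoring incorrect label
        r = int(np.argmax([s[k] if k != y else -np.inf
                           for k in range(self.n_classes)]))
        loss = max(0.0, 1.0 - (s[y] - s[r]))  # multiclass hinge loss
        if loss > 0.0:
            # PA-I style clipped step size; the factor 2 reflects that the
            # update touches the prototypes of both labels y and r
            tau = min(self.C, loss / (2.0 * self._kernel(x, x)))
            alpha = np.zeros(self.n_classes)
            alpha[y] += tau
            alpha[r] -= tau
            self.support.append((x, alpha))
            if len(self.support) > self.budget:
                self.support.pop(0)  # oldest-removal strategy
```

A short usage example: streaming two well-separated clusters through `partial_fit` keeps the support set within the budget while the model learns to classify both clusters correctly.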