Falls are a leading health risk for the elderly. Various wearable fall detection systems built on machine learning models have been developed to provide emergency alarms and services that improve safety and health-related quality of life. To support long-term healthcare services, the sampling rate, which plays a vital role in a fall detection system, must be investigated because it governs the trade-off between detection accuracy and energy efficiency. Intuitively, decreasing the sampling rate of sensor nodes saves computation and energy. However, little research has explored how the sampling rate affects the accuracy of fall detection systems, especially those based on machine learning models. In this paper, the effects of decreasing the sampling rate (from the original rates of 200 and 128 Hz down to 3 Hz) on wearable fall detection systems are investigated for four machine learning models: support vector machine (SVM), k-nearest neighbor, naïve Bayes, and decision tree. Two emulated fall data sets, the public SisFall data set and the data set proposed in this paper, are used to allow an objective investigation of sampling rates. The findings show that a fall detection system based on an SVM with a radial basis function (RBF) kernel can achieve at least 98% and 97% accuracy at sampling rates of 11.6 and 5.8 Hz, respectively. Overall, the experimental results demonstrate that a sampling rate of 22 Hz is sufficient for most machine learning models to support wearable fall detection systems (accuracy ≥ 97%).
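The pipeline the abstract describes, downsampling accelerometer windows and then classifying them with an RBF-kernel SVM, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic windows, the stride-based decimation (with no anti-aliasing filter), and the magnitude features are all assumptions made for a self-contained example; the paper's actual features, data (SisFall and the proposed set), and preprocessing may differ.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def decimate(window, orig_hz, target_hz):
    # Naive decimation: keep every k-th sample (no anti-aliasing filter).
    stride = max(1, int(round(orig_hz / target_hz)))
    return window[::stride]

def features(window):
    # Simple features on the acceleration magnitude of one window.
    mag = np.linalg.norm(window, axis=1)
    return [mag.mean(), mag.std(), mag.max(), mag.min()]

def make_window(is_fall, n=200):
    # Synthetic stand-in for a 1 s, 200 Hz, 3-axis accelerometer window
    # (NOT SisFall data): low noise around gravity, plus an impact
    # spike for fall windows.
    base = rng.normal(0.0, 0.05, size=(n, 3)) + np.array([0.0, 0.0, 1.0])
    if is_fall:
        i = rng.integers(0, n - 10)
        base[i:i + 10] += rng.normal(0.0, 2.0, size=(10, 3))
    return base

# Build a balanced data set, downsampled from 200 Hz to 22 Hz.
X, y = [], []
for label in (0, 1):
    for _ in range(100):
        w = decimate(make_window(bool(label)), orig_hz=200, target_hz=22)
        X.append(features(w))
        y.append(label)

Xtr, Xte, ytr, yte = train_test_split(
    np.array(X), y, random_state=0, stratify=y)
clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

On this toy data the impact spike remains visible even after decimation to 22 Hz, which is the intuition behind the abstract's finding that moderate sampling rates preserve accuracy; the specific accuracy figures (98% at 11.6 Hz, 97% at 5.8 Hz) come from the paper's experiments, not from this sketch.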