In this paper, we propose a novel local loss-assisted client selection approach for communication-efficient federated learning. Unlike many prior works that require all clients to upload compressed local gradients or model parameters in each round, the proposed approach intelligently selects a small subset of clients to send their local model parameters to the server in each round. Specifically, in each round, every client sends its latest local model loss to the server, and the server selects the clients with the smallest losses to upload their local model parameters. Although different clients are selected in different rounds, the number of selected clients per round is fixed, so the per-round communication cost is constant over time. We evaluate the proposed federated learning approach on collaborative image classification. Simulation results show that the proposed local loss-assisted client selection approach can significantly reduce the communication cost of federated learning at the cost of a slight reduction in image classification accuracy.
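The server-side selection rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the dictionary-based client state, and the coordinate-wise averaging of the uploaded parameters are all assumptions made for the example.

```python
def select_clients(client_losses, k):
    """Select the k clients that reported the smallest local losses.

    client_losses: dict mapping client id -> latest local loss (a scalar).
    Only these k clients will upload model parameters this round, so the
    per-round communication cost stays fixed.
    """
    ranked = sorted(client_losses, key=client_losses.get)
    return ranked[:k]


def federated_round(client_params, client_losses, k):
    """One communication round under loss-assisted selection.

    client_params: dict mapping client id -> local model parameters
                   (here, a flat list of floats for simplicity).
    Returns the selected client ids and the averaged global parameters.
    """
    selected = select_clients(client_losses, k)
    uploads = [client_params[c] for c in selected]
    # Server aggregates only the uploaded parameter vectors (plain average,
    # assumed here; the paper's aggregation rule may differ).
    dim = len(uploads[0])
    global_params = [sum(p[i] for p in uploads) / k for i in range(dim)]
    return selected, global_params
```

For example, with four clients reporting losses `{"A": 0.9, "B": 0.2, "C": 0.5, "D": 0.1}` and `k = 2`, only clients `D` and `B` upload their parameters, and the server averages just those two vectors.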