Body shape is about proportion, and fashion style is about dressing those proportions to their best advantage. Figuring out which styles suit a given body shape is a daunting task for many people, so a framework for learning the compatibility between body shapes and clothing styles is essential. Although fashion designers and stylists have long analyzed the correlation between human body shapes and fashion styles, the problem has received little attention in multimedia research. In this paper, we present a novel style recommender based on the user's body attributes, exploiting the rich fashion-styling knowledge available in social big data. We first construct a joint embedding of clothing styles and human body measurements with deep multimodal representation learning on a reference dataset curated to conform to fashion rules. We then discover the relevant semantic features by propagation and selection over clothing style and body shape graphs. Experiments demonstrate the effectiveness of the proposed framework compared with several baseline methods.
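The joint-embedding idea at the core of the recommender can be sketched as follows: body measurements and clothing-style descriptors are projected into a shared space, and candidate styles are ranked by similarity to the body embedding. The dimensions, randomly initialised projection weights, and cosine-similarity ranking below are illustrative assumptions, not the paper's actual learned encoders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 5 body measurements, 8-dim style descriptors,
# both projected into a shared 4-dim embedding space.
BODY_DIM, STYLE_DIM, EMBED_DIM = 5, 8, 4

# Random projection matrices stand in for the trained multimodal encoders.
W_body = rng.normal(size=(BODY_DIM, EMBED_DIM))
W_style = rng.normal(size=(STYLE_DIM, EMBED_DIM))

def embed(x, W):
    """Project a feature vector into the shared space and L2-normalise it."""
    z = x @ W
    return z / np.linalg.norm(z)

def recommend(body_vec, style_vecs):
    """Rank candidate styles by cosine similarity to the body embedding."""
    b = embed(body_vec, W_body)
    sims = np.array([embed(s, W_style) @ b for s in style_vecs])
    return np.argsort(sims)[::-1]  # style indices, best match first

body = rng.normal(size=BODY_DIM)          # one user's measurements
styles = rng.normal(size=(6, STYLE_DIM))  # six candidate clothing styles
ranking = recommend(body, styles)
```

In a trained system the projection matrices would be learned so that compatible body–style pairs from the reference dataset land close together in the shared space.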