The detection and recognition of human emotional states has attracted growing research interest for applications ranging from e-learning to the prevention of chronic health conditions. In this paper, we propose an emotion recognition system that uses electrocardiogram (ECG) and photoplethysmogram (PPG) signals as objective data input sources. Three emotional states (positive, neutral, negative) were defined as the classification outputs. The training and validation data were collected by Kaohsiung Medical University (KMU) from 47 participants aged 30 to 50 years who had been diagnosed with chronic cardiovascular conditions. A convolutional neural network (CNN) was built to efficiently map each subject's emotional state from the features extracted from both the ECG and PPG signals. The proposed system achieved an accuracy of 75.4% on the three-class task, comparable to or higher than models reported in related work.