The long-term goal of our research is to develop a tool for recognizing human emotions during e-learning processes. This could be accomplished by combining quantitative indexes extracted from non-invasive recordings of four physiological signals: skin conductance, blood volume pulse, electrocardiogram, and electroencephalogram. Wearable, non-invasive sensors communicating with a PC were applied to 30 students, and data were collected during exposure to three different computer-mediated content stimuli designed to evoke specific emotional states: stress, relaxation, and engagement. In this paper we both describe the general emotion evaluation algorithm and present preliminary results suggesting that some of the quantitative indexes may be successful in characterizing and distinguishing between the three emotional states.