Please note that you are about to download this torrent NOT from torhash.net.
torhash.net is just a torrent search engine; no torrents are hosted here.
STOIC Static facial expressions database
Infohash:
D8563D6A9ABF89C19A1B2EB3A5A5EEAD9E6EA049
Type:
Movies
Title:
STOIC Static facial expressions database
Category:
Video/Movie clips
Uploaded:
2010-08-12 (by roy_sylvain)
Description:
This is a scientific torrent. Access to a validated database of static facial expressions of emotion. This Database was created at the University of Montreal and is freely accessible to the scientific community. Please use only for clinical, teaching or scientific endeavours.
The STOIC team.
Tags:
Files count:
1
Size:
3.68 MB
Trackers:
udp://tracker.openbittorrent.com:80
udp://open.demonii.com:1337
udp://tracker.coppersurfer.tk:6969
udp://exodus.desync.com:6969
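For convenience, the infohash and tracker list above can be combined into a magnet link. The short Python sketch below assembles one in the standard urn:btih form; the helper name magnet_uri is hypothetical and the snippet is only an illustration, not part of the original upload.

# Minimal sketch: build a magnet URI from the infohash and trackers listed above.
# The display name ("dn") is simply the listing title.
from urllib.parse import quote

INFOHASH = "D8563D6A9ABF89C19A1B2EB3A5A5EEAD9E6EA049"
TITLE = "STOIC Static facial expressions database"
TRACKERS = [
    "udp://tracker.openbittorrent.com:80",
    "udp://open.demonii.com:1337",
    "udp://tracker.coppersurfer.tk:6969",
    "udp://exodus.desync.com:6969",
]

def magnet_uri(infohash: str, name: str, trackers: list) -> str:
    """Assemble a BitTorrent magnet link (urn:btih form)."""
    parts = [f"magnet:?xt=urn:btih:{infohash.lower()}", f"dn={quote(name)}"]
    parts += [f"tr={quote(t, safe='')}" for t in trackers]
    return "&".join(parts)

print(magnet_uri(INFOHASH, TITLE, TRACKERS))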
Comments:
roy_sylvain (2010-08-18)
The authors may be reached via email at [email protected]The article has been submitted for publication. If you decide to use our stimuli, please reference our vision science poster:
http://www.journalofvision.org/content/7/9/944.abstract
Abstract:
Facial expressions provide crucial information for adaptive behaviors, since they help us make inferences about what others are thinking and feeling. To date, most studies that have investigated the perception of facial expressions have used static displays (Ekman & Friesen, 1975). Such stimuli underestimate the importance of motion, or the dynamic changes that occur in a face, in emotion recognition (Ambadar, Schooler, & Cohn, 2005). The few studies of dynamic facial expressions have used stimuli with methodological limitations that we sought to remedy. In particular, most of the video databases currently in use have not been empirically validated. For our freely available database, we recruited a total of 34 actors to express various emotions. A total of 1,088 grayscale video clips (34 actors * 4 exemplars * 8 expressions) were created. Clips include all basic emotions (happiness, fear, surprise, disgust, sadness, anger) as well as pain and neutral expressions. These videos were spatially aligned frame by frame on the average coordinates of the eyes and nose (i.e., the clips contain only facial movements), and the luminance was calibrated to allow linear manipulation. All clips contain 15 frames (30 Hz), beginning on the last neutral frame. We empirically validated these stimuli by having participants rate the intensity of the emotions in all stimuli on continuous scales. The video database was adjusted to reflect a confusability matrix with satisfactory d' values. We will discuss the main characteristics of the selected clips.
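As a rough illustration of the clip inventory described in the abstract (34 actors * 4 exemplars * 8 expressions = 1,088 clips, each 15 frames at 30 Hz), the Python sketch below enumerates the expected combinations. The actor IDs, exemplar numbering, and clip-naming scheme are assumptions for illustration only; only the counts and clip length come from the abstract.

# Hypothetical enumeration of the STOIC clip inventory described above.
# Only the counts (34 * 4 * 8 = 1,088) and the clip length (15 frames at
# 30 Hz = 0.5 s) are taken from the abstract; the naming scheme is invented.
from itertools import product

EXPRESSIONS = ["happiness", "fear", "surprise", "disgust",
               "sadness", "anger", "pain", "neutral"]   # 8 expressions
N_ACTORS, N_EXEMPLARS = 34, 4
FRAMES, FPS = 15, 30

clips = [f"actor{a:02d}_{expr}_{e}"                     # hypothetical clip ID
         for a, expr, e in product(range(1, N_ACTORS + 1),
                                   EXPRESSIONS,
                                   range(1, N_EXEMPLARS + 1))]

assert len(clips) == 1088                               # 34 * 8 * 4
print(f"{len(clips)} clips, each {FRAMES / FPS:.1f} s long")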