Multisensory Research (2016) DOI: 10.1163/22134808-00002535
brill.com/msr
Abstract
Dogs respond to human emotional expressions. However, it is unknown whether dogs can match
emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52
domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender presented simultaneously, while they listened to a human voice expressing an emotion
that matched one of them. Consistent with most matching studies, neither dogs nor infants looked
longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of
looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus),
with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might
have an aversion to sad faces, or alternatively, heightened interest in angry and happy faces.
Keywords
Domestic dogs, infants, sadness, intermodal matching, preference looking
1. Introduction
Anecdotal reports of canine understanding of human emotions are a common
theme among dog owners (Howell et al., 2013). Although such reports are
notoriously difficult to interpret, they have formed the basis for several recent
studies. Dogs respond differently when a human expresses sadness (crying)
versus humming or talking (Custance and Mayer, 2012), and to happiness and
anger (Müller et al., 2015). In addition, like human adults, they also show
a unique combination of elevated cortisol and increased alertness and submissiveness when listening to infant crying versus infant babbling or white
noise (Yong and Ruffman, 2014). Moreover, there is some indication that dogs
show avoidance when a human expresses fear towards an object (Merola et al.,
2011; although see Yong and Ruffman, 2015a), or when a human expresses
disgust toward food as opposed to happiness (Buttelmann and Tomasello,
2013).
In the present study, we used an intermodal matching task [also known as
a Matching to Sample (MTS) task] to explore canine understanding of human
emotion. Researchers have often used MTS tasks in emotion recognition studies with non-human animals and preverbal human infants because they do not
require initial task training and require minimal habituation to the experimental setting (Izumi, 2013). In MTS tasks, two facial expressions are presented,
together with an auditory expression, with the interest in whether a participant looks more at the facial expression that matches the auditory expression
(Spelke, 1976). Using this paradigm, past studies have shown that rhesus monkeys match conspecific faces to their calls (Ghazanfar and Logothetis, 2003),
as well as conspecific faces and voices, and familiar human faces and voices
(Sliwa et al., 2011), as do capuchin monkeys (Evans et al., 2005).
In addition, dogs have been shown to match dog growls to images of dogs
(Farago et al., 2010; Taylor et al., 2010) as well as life-sized replicas of dogs
(Taylor et al., 2011). They also show a preference for looking at a familiar conspecific or human face compared to an unfamiliar one (Somppi et al., 2013),
although in another study they also preferred to look at novel images and human faces compared to dog faces and inanimate everyday objects (e.g., chair,
spoon) (Racca et al., 2010). Two studies have found evidence that dogs were
able to match the gender of a live human when listening to vocal playback
(Ratcliffe et al., 2014; Yong and Ruffman, 2015b). Nevertheless, dogs are
not always successful in MTS tasks. For instance, dogs looked longer at a
stranger's face when listening to their owner's voice (Adachi et al., 2007),
suggesting that looking is sometimes driven by idiosyncratic features of the
stimulus rather than matching. Dogs also show some form of preference when
selecting faces, as shown by Nagasawa et al. (2011), who
first trained nine dogs to discriminate between photographs of their
owner's smiling and blank (neutral) faces. Five dogs that successfully learned
to discriminate the photos were then tested with a new set of photos of the
owner and stranger while smiling or with neutral expressions. They found that
dogs preferred a smiling face compared to a neutral face, with the preference
limited to faces of the same gender as the owner. Nevertheless, although dogs
discriminated the faces, the sample was very small (five) and had been selected
on the basis of previous success and was therefore likely unrepresentative of
dogs in general.
In a study that used emotive stimuli, Racca et al. (2012) compared the
looking biases of 37 dogs and 25 four-year-old children when viewing different human and canine emotional expressions (threatening, friendly, neutral).
Dogs had a left-side gaze bias when viewing negative (threatening) and neutral human expressions, but no bias towards the friendly face. Like dogs, the
four-year-old children showed a left-side gaze bias to negative expressions,
although they differed in also showing a left-side bias to positive human expressions while showing no bias to neutral expressions. Muller et al. (2015)
found that dogs approached happy faces faster compared to angry faces, and
learned discriminations more quickly when rewarded with happy versus angry
faces. Albuquerque et al. (2016) examined emotion matching and found that
dogs could match visual and auditory expressions of anger and happiness in
both conspecific and human stimuli.
Albuquerque et al.'s (2016) findings seem to provide support for canine
matching of emotion faces and voices. However, since their study included
only 17 dogs, it is important to determine whether such findings can be replicated. Furthermore, Albuquerque et al. only used expressions of happiness
and anger, expressions that comprised two broad categories of positive versus
negative emotion. At a minimum, dogs might have broad conceptions of positive and negative emotion and thus might be able to match faces and voices
when exemplars of each category are contrasted, yet they might not be able
to match when different exemplars from within a category are included such
as two negative emotions (anger and sadness). Another possibility is that dogs
only possess an understanding of one category of emotion (positive: happiness) and are confused by the other (e.g., anger). If so, they might look to a
happy face when a happy voice was played because they understood these two
expressions went together. Such dogs might then understand that the angry
voice does not go with the happy face, and then by default, look to the angry
face because there is no other face to look at rather than through a genuine
understanding that the angry face goes with the angry voice. In essence, happiness is likely to be much more familiar to the dog and familiarity with an
expression does seem to play a role in deciding which object a dog will select
(Buttelmann and Tomasello, 2013; Merola et al., 2011). It would be helpful
to include other emotions in a matching task. While happiness is arguably the
only positive basic emotion (e.g., surprise can be positive or negative), there
are other negative basic emotions (e.g., sadness) that could be
included in addition to anger to examine whether dogs can match emotions
within a single category of negative emotions.
While MTS tasks have been widely used in animal studies, they have also
been used with human infants to show that they can reliably discriminate emotional expressions between different persons from 3.5 to 7 months of age (De
Haan and Nelson, 1998; Leppänen and Nelson, 2006), and perhaps as early as
ten weeks of age (Haviland and Lelwica, 1987). At this age, preverbal infants
also follow another's gaze direction (D'Entremont and Muir, 1997) and recognize various affective expressions (Kestenbaum and Nelson, 1990; LaBarbera
et al., 1976; Montague and Walker-Andrews, 2002; Soken and Pick, 1992).
Furthermore, by 7 months of age, infants recognize upright versus inverted facial orientation (Kestenbaum and Nelson, 1990) and are sensitive to changes
in intonation or prosody that typically convey emotion (Walker-Andrews and
Grolnick, 1983; Walker-Andrews and Lennon, 1991). From an evolutionary
perspective, there are benefits (e.g., survival) for infants to recognize and respond to emotional expressions. Emotional expressions are evolutionary adaptations to social environments to create and maintain social relationships and
interactions (Darwin, 1872; Keltner and Kring, 1998).
Nevertheless, like dogs, human infants do not always show a successful
matching outcome in the MTS paradigm. Studies have shown that while infants sometimes can match affective facial displays that correspond to vocal
displays (Montague and Walker-Andrews, 2002; Soken and Pick, 1992, 1999;
Walker, 1982), they more frequently fail to do so. Table 1 summarizes the results from infant intermodal matching studies and shows that matching has
occurred in less than half (41%) of published studies. Yet, even this success
rate should be viewed with caution given the possibility of a publication bias
that results in an exaggeration of the success rate of infant matching abilities. Table 1 also demonstrates that infants sometimes display preferences for
certain facial emotions over others when faces are presented simultaneously.
Of particular interest are the summaries at the bottom of the table that show
the percentage of studies indicating a preference for one emotion over another
when preferences arose (i.e., ignoring studies in which there was no difference). These results indicate a trend for human infants to look more at happy
faces when paired with either sad or angry faces, and for no preference when
sad and angry faces are paired (see Table 1).
There have been various explanations of infants' patterns of looking at faces.
They have sometimes been described as demonstrating a gaze aversion to sad
and angry faces because of the negative valence of such emotions (D'Entremont and Muir, 1997, 1999; Haviland and Lelwica, 1987; Izard, 1991;
Termine and Izard, 1988), as indicating a preference to view familiar emotions
over unfamiliar ones (Soken and Pick, 1999), or as demonstrating empathy
in the case of looking away from sad faces (Hoffman, 1981; Termine and
Izard, 1988). Yet, it has also been argued that increased attention to angry
faces might reflect the unique function of anger, one that includes vigilance
and high arousal to a potentially stressful event (Izard, 1993). However, Hunnius et al. (2011) showed that four- and seven-month-olds fixate less on the
inner features of faces (eyes, nose, mouth) showing anger and fear compared
to happy ones.

Table 1.
Summary of findings for intermodal matching tasks. For 17 infant studies, the table lists the emoter (e.g., mother or father), infants' age (3.5 to 7 months), whether infants matched the face to the voice (yes in 7 of 17 studies, 41%), and looking preferences for Happy-Sad, Happy-Angry and Sad-Angry face pairings. Where a preference arose, happy was preferred over sad and over angry in 100% of studies, while Sad-Angry pairings showed no consistent preference (one study favoured sad, one angry). ND = No difference.

Infants' tendency to shift attention away from a parent's sad face may reflect
avoidance of sad expressions because of the reduced reward value of interactions with sad individuals, and characteristic responding to such expressions
could have been selected for in human and canine evolution. Emotion matching paradigms have also been viewed as a good choice for examining potential
abilities selected through evolution (Preston and De Waal, 2002; Prguda and
Neumann, 2014). We argue that the capacity for learning about face-voice
contingencies could have been selected for in evolution (or canine breeding)
because it facilitates infant and canine social functioning in the world.
We used three expressions frequently used in previous research: human
expressions of anger, sadness and happiness. Thus, we included two negative emotions anger and sadness to more comprehensively test canine
matching ability than previous matching studies (Albuquerque et al., 2016).
Participants were simultaneously presented with pairs of facial expressions
depicting either happiness and sadness, happiness and anger, or sadness and
anger, while listening to a human voice expressing an emotion that matched
one of the facial expressions. All stimuli were adult human males or females.
We examined whether dogs and human infants looked more at the face that
matched the voice, and if not, whether they had a preference for looking at
certain emotional faces over others irrespective of the auditory expression. If
the latter, our expectation based on previous research was that infants would
look toward happy faces when paired with sad or angry faces (see Table 1).
2. Material and Methods
2.1. Participants
Fifty-two pet dogs (32 females, M = 5.47 years, SD = 4.01) of different
breeds participated in this study (Appendix A). Dogs were recruited from advertisements placed in the university newsletter, local canine clubs, and flyers
distributed to dog owners from the local city council. Dog owners were given
a petrol voucher as compensation for participating in the study and dogs received sausage pieces as a reward upon completion.
Twenty-four human infants (10 females) participated in the task (M = 7.16
months, SD = 0.85) (Appendix B). Twenty infants were European New
Zealanders while four infants were mixed-parentage (Maori, Chinese). All infants were born healthy with no known visual or auditory impairments, and
were carried full-term with the exception of one born pre-term. The infant
participants were volunteers or were referrals from previous participants, and
each received a petrol voucher and a toy for their participation. Twenty-one
infants (87.5%) came from middle-class families in which at least one parent
had a university education and was working full-time.
Trial    Left image      Right image     Sound           Emotion
1        happy male      sad male        happy male      happy
2        sad female      angry female    angry female    angry
3        angry male      happy male      angry male      angry
4        sad male        angry male      sad male        sad
5        angry female    happy female    happy female    happy
6        happy female    sad female      sad female      sad
Each trial started with a clicking tone for one second. The clicking tone
attracted participants' attention. One voice recording was then played for five
seconds (e.g., angry male voice) with the computer monitors remaining blank
(showing a black screen). The voice recording was then repeated for another
five seconds (angry male voice) and two different faces were shown simultaneously on each computer monitor (angry male, happy male). Then a different
tone was played for one second and the computer monitors went blank to indicate the end of that trial. This format was repeated for the remaining five trials.
To maintain interest, for the human infants, the tones at the beginning and end
of each trial were substituted with animal sounds (e.g., bird chirps, cow moos)
and the blank monitors with still cartoon images (e.g., Winnie the Pooh).
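The trial timing just described can be summarized as a short schedule. The sketch below is purely illustrative; the event labels and function names are ours, not part of the study's apparatus:

```python
# Sketch of one trial as described above. Dogs heard a clicking tone at
# the start and a different tone at the end; for infants these were
# replaced with animal sounds (and blank screens with cartoon images).

def trial_schedule(dog=True):
    """Ordered (event, seconds) steps of a single trial."""
    attention = "click_tone" if dog else "animal_sound"
    return [
        (attention, 1.0),           # attract the participant's attention
        ("voice_only", 5.0),        # emotional voice plays, monitors blank
        ("voice_plus_faces", 5.0),  # voice repeats while two faces appear
        ("end_tone", 1.0),          # signals the end of the trial
    ]

def session_duration(n_trials=6):
    """Total stimulus time over the six trials (ignores inter-trial gaps)."""
    return sum(seconds for _, seconds in trial_schedule()) * n_trials
```

Each trial thus involves 12 s of stimulus presentation, or 72 s across the six trials; the procedure above does not specify the inter-trial interval.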
2.6. Measured Variables
All behavioural coding was conducted by two coders. The primary coder was
blind to the conditions and to the hypothesis, and the secondary coder coded
33% of the videos for inter-rater reliability. Both coders were able to hear
the auditory stimuli because they included tones that identified coding start
times. However, they were unable to view the emotion stimuli presented on
the monitors and were therefore unaware of what constituted correct matching.
The primary coder measured canine and infant time spent looking at the left
and right screens, time spent looking away from the screens, and time spent
looking at the owner or parent. The participants' eyes were clearly visible
on the primary camera view to indicate looking direction, and the secondary
camera confirmed the participants' head turning. The inter-rater correlations
between the two coders were good; looking duration at the left screen: rs =
0.91, looking duration at the right screen: rs = 0.94, looking away: rs = 0.86,
and looking at parent/owner: rs = 0.95. The primary coder's ratings were used
in the analyses.
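The rs values above are Spearman rank correlations between the two coders. A minimal pure-Python sketch, with invented data and our own function names (not the study's analysis code):

```python
# Minimal Spearman rank correlation (r_s), as used above for
# inter-rater reliability. Ties receive average ranks.

def average_ranks(values):
    """Rank values from 1..n, giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation computed on the rank-transformed data."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sxx = sum((a - mx) ** 2 for a in rx)
    syy = sum((b - my) ** 2 for b in ry)
    return cov / (sxx * syy) ** 0.5

# Two coders' looking-duration codes (seconds) for the same five clips:
coder1 = [1.2, 0.0, 3.4, 2.2, 0.8]
coder2 = [1.0, 0.1, 3.0, 2.5, 0.7]
```

Here the coders rank the five clips identically, so spearman(coder1, coder2) returns 1.0; values of 0.86 to 0.95, as reported above, indicate strong but imperfect agreement.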
3. Results
Shapiro-Wilk tests for normality (all ps < 0.05) and histograms suggested
that the data for dogs and infants were not normally distributed, and for this
reason, non-parametric analyses were used.
3.1. Looking Duration at Matching (Concordant) Emotions
Two dogs did not look at any of the six pairs of faces over the six trials and
were excluded from analysis, leaving a final sample of 50 dogs for analysis. No
infants were excluded on this or any other basis. We examined the time spent
looking at the matching (concordant) image (e.g., looking time at angry male
face when listening to the angry male voice) and non-matching (discordant)
image (e.g., looking time at the happy male face when listening to the angry
male voice) using Wilcoxon signed-rank tests. Table 3 shows dogs' and infants' looking durations at the concordant and discordant face when listening to a particular vocal expression.

Table 3.
Mean looking duration (seconds) to a concordant or discordant face when listening to an emotional voice

Vocal expression          Dogs (n = 50)          Infants (n = 24)
                          M (SD)         p       M (SD)         p
Happy    concordant       1.40 (1.68)            4.23 (1.77)
         discordant       1.30 (1.66)    0.74    2.93 (1.52)    0.01
Anger    concordant       1.56 (1.78)            4.13 (2.38)
         discordant       1.00 (1.42)    0.10    3.06 (1.90)    0.10
Sad      concordant       0.88 (1.29)            2.40 (1.22)
         discordant       1.96 (2.42)    0.02    4.62 (2.04)    <0.01
Dogs looked longer at the discordant face when listening to a sad voice,
Z = 2.37, p = 0.02, r = 0.34 (Wilcoxon signed-rank test). When dogs were
presented with a happy or angry voice, there was no difference in their looking
at concordant or discordant faces, Z = 0.33, p = 0.74, r = 0.05 and Z = 1.67,
p = 0.10, r = 0.24 respectively.
Similar to dogs, infants looked longer at the discordant face when listening
to a sad voice, Z = 3.77, p < 0.01, r = 0.77 (Wilcoxon signed-rank test)
and there was no significant difference in infant looking at the concordant or
discordant face when listening to an angry voice, Z = 1.63, p = 0.10, r =
0.33. When infants were listening to a happy voice, they looked longer at the
concordant happy face, Z = 2.63, p = 0.01, r = 0.54 (see Table 3).
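The test statistics and effect sizes reported here (r = |Z|/sqrt(N)) can be sketched with the normal-approximation Wilcoxon signed-rank test. This is an illustrative reimplementation with invented looking times, not the study's analysis code:

```python
import math

# Illustrative normal-approximation Wilcoxon signed-rank test and the
# effect size r = |Z| / sqrt(N) used in the text. Zero differences are
# dropped and tied |differences| share average ranks; no tie or
# continuity correction is applied, so Z values are approximate.

def wilcoxon_z(x, y):
    """Approximate Z statistic for paired samples x and y."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (w_plus - mu) / sigma

def effect_size_r(z, n_pairs):
    """Effect size r = |Z| / sqrt(N)."""
    return abs(z) / math.sqrt(n_pairs)

# Invented looking times (s) at concordant vs discordant faces:
concordant = [1.0, 2.0, 3.0, 4.0, 5.0]
discordant = [2.0, 3.0, 4.0, 5.0, 6.0]
z = wilcoxon_z(concordant, discordant)  # negative: concordant looked at less
```

A negative Z here indicates shorter looking at the concordant face, the direction dogs and infants showed for sad voices.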
In general, there was no evidence of reliable matching for either dogs or
infants, a result consistent with the majority of infant matching studies summarized in Table 1. Nevertheless, canine and infant looking were not random.
Instead, both tended to look toward angry and happy faces, and away from
sad faces. For this reason, we examined total looking time at sad faces, happy
faces, and angry faces, irrespective of the accompanying voice (see Fig. 1).
Dogs and infants showed the same pattern of looking across the three pairings
of emotion faces, both ps < 0.01 (Friedman's two-way analysis of variance by
ranks). Even after correcting for multiple comparisons using Holm's correction, dogs looked less at the sad face compared to both the happy and angry
faces, Z = 2.66, p < 0.01, r = 0.38 and Z = 4.16, p < 0.01, r = 0.59, respectively (Wilcoxon signed-rank test), with similar findings for infants,
Z = 4.14, p < 0.01, r = 0.85 and Z = 3.66, p < 0.01, r = 0.75, respectively.
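Holm's step-down correction, applied above to the pairwise follow-up tests, can be sketched in a few lines; the p-values in the example are invented:

```python
# Holm's step-down correction: multiply the i-th smallest of m p-values
# by (m - i), enforce monotonicity, and cap at 1. This controls the
# family-wise error rate across the pairwise follow-up comparisons.

def holm_adjust(pvalues):
    """Return Holm-adjusted p-values in the original order."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for step, idx in enumerate(order):
        running_max = max(running_max, (m - step) * pvalues[idx])
        adjusted[idx] = min(1.0, running_max)
    return adjusted

# Three pairwise comparisons (e.g., sad-happy, sad-angry, happy-angry):
adjusted = holm_adjust([0.01, 0.04, 0.03])
# The smallest p-value is tripled (3 x 0.01 = 0.03); the others become 0.06.
```

A comparison remains significant at alpha = 0.05 only if its adjusted p-value stays below 0.05.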
There was no preference in looking when the happy and angry faces were
Figure 1. Box and whisker plot displaying canine and infant median interest for an emotion face (regardless of the accompanying voice). p < 0.01.

Figure 2. Box and whisker plot displaying intensity ratings for angry, happy and sad faces.
angry and happy faces, although our findings are consistent with Müller et al.
(2015). Also in contrast, in previous research infants tended to look equally
to angry and sad faces, although only two studies provided this comparison
(Schwartz et al., 1985; Soken and Pick, 1999), whereas in our study dogs and
infants looked more at angry than sad faces.
Our findings are also discrepant with those of Albuquerque et al. (2016), who found
that dogs could match human expressions of anger and happiness. We pointed
out that Albuquerque et al. compared just happiness and anger whereas we
included a comparison of two negative emotions (anger and sadness) as well
as happiness and sadness. We did not find that dogs could match human expressions for any of the three comparisons, leaving it unclear as to how easily
dogs can match human emotional expressions. Certainly, our failure to find
matching was not due to a lack of statistical power, because Albuquerque et al. tested
just 17 dogs whereas we tested 52.
It is possible that the differences between our results and previous results
were due to differences in stimuli. For instance, if our angry faces were more
intense than in previous research, this might have increased looking and might
have led to our findings of equal looking at angry and happy faces, and more
looking at angry than sad faces (because of the perceived threat value of angry
faces). In general, angry faces do tend to capture attention (Fox et al., 2000;
Ruffman et al., 2009). Yet, given canine and infant interest in happy facial
expressions, when they viewed happy and angry facial expressions together,
both expressions would tend to attract attention equally, which might explain
the lack of a difference in looking at either emotion. One might argue that
each face shown was distinctive or unique in some manner other than
the emotion expressed. However, as described in the Stimuli section above, we
selected stimuli that corresponded to a typical representation of the NZ population. We also have minimised the characteristic differences by specifically
choosing middle-aged Caucasian and Asian people without visible markings,
that is, faces that these dogs would likely be familiar with in their everyday life.
What is perhaps more striking than the differences between the present
study's results and those of previous research is the similarity between canine and infant looking preferences in our study, which, as stated above, can be
viewed either as a preference to look at happy and angry faces or as an aversion
to sad faces. One of the possible reasons for the lack of looking at sad faces
in both dogs and infants is that they try to reduce stressful visual information
from sad faces (Grossmann, 2010; Nesse, 1990). Shifting attention away from
a parent's sad face has been considered a way for infants to reduce negative
feelings generated by the sad face, and therefore, indicative of a rudimentary
form of empathy (Hoffman, 1981; Termine and Izard, 1988). This result is
also consistent with previous studies. For instance, when infants view a sad
expression on their mother's face, they play less, show greater gaze aversion,
less smiling, and increased grimacing (D'Entremont and Muir, 1997, 1999),
and when they witness an adult frowning or crying, infants become more
agitated and distressed (D'Entremont and Muir, 1999; Kahana-Kalman and
Walker-Andrews, 2001), experiences that would likely lead to gaze aversion.
In addition, dogs have an aversion to auditory expressions of sadness; like
adult humans, they experience an increase in cortisol (a stress hormone linked
to empathy) as well as behavioural signs of stress when listening to human infant crying but not other sounds (Yong and Ruffman, 2014). This finding that
dogs show aversion when listening to sad voices suggests that their tendency
to look away from sad faces in the present study might also be best interpreted
as aversion and a rudimentary form of empathy (i.e., a spontaneous response to
sad stimuli not requiring explicit insight that an individual is experiencing sadness). Such findings point to the possibility that sensitivity to certain emotional
expressions has been selected for in the evolution of human infants and/or the
breeding of dogs, and that the preferential looking behaviour is aligned with
an evolutionary response towards emotional expressions.
Our results are consistent with the majority of previous infant studies that
failed to find emotion matching. One possible limitation of the present study is
that participants were unfamiliar with the emoter. It could be argued that stimuli utilizing familiar individuals would more likely lead to matching. However,
there is little difference in success rates when infants have matched emotions
with familiar and unfamiliar individuals in previous research (see Table 1) so
that this seems an unlikely possibility.
Another possible limitation is that dogs could not clearly see the images.
However, the faces shown on the computer screen were life-sized and similar to those used in previous research (Huber et al., 2013; Nagasawa et al.,
2011). Visual acuity in dogs is known to be worse than that of humans (Miller and
Murphy, 1995) and wolves (Peichl, 1992), with brachycephalic (short-nosed)
dog breeds having better visual acuity than dolichocephalic (long-nosed) breeds because the ganglion cells in their retinas are in a
more central location (McGreevy et al., 2004). The images in the present study
were either 1.5 m or 1.8 m away from the dogs and at these distances should have
enabled them to distinguish facial features and expressions. For instance, these
distances have been successfully used in other studies with no known impairment
of canine performance (Huber et al., 2013; Nagasawa et al., 2011). Moreover,
dogs did respond differently to the different emotion faces, indicating that they
could see them.
Our findings demonstrate that dogs process emotional faces similarly to
human infants, detecting differences in facial emotions and showing the same viewing
preferences. Our findings certainly do not mean that
dogs (or infants) have deeper insight into the emotional experience accompanying facial expressions. Instead, they suggest that dogs can differentiate facial
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A. and Dutton, K. (2000). Facial expressions of emotion: are angry faces detected more efficiently? Cogn. Emot. 14, 61–92.
Ghazanfar, A. A. and Logothetis, N. K. (2003). Neuroperception: facial expressions linked to monkey calls, Nature 423, 937–938.
Grossmann, T. (2010). The development of emotion perception in face and voice during infancy, Restor. Neurol. Neurosci. 28, 219–236.
Hare, B. and Tomasello, M. (2005). Human-like social skills in dogs? Trends Cogn. Sci. 9, 439–444.
Haviland, J. M. and Lelwica, M. (1987). The induced affect response: 10-week-old infants' responses to three emotion expressions, Dev. Psychol. 23, 97–104.
Hoffman, M. L. (1981). Is altruism part of human nature? J. Pers. Soc. Psychol. 40, 121–137.
Howell, T. J., Toukhsati, S., Conduit, R. and Bennett, P. (2013). The perceptions of dog intelligence and cognitive skills (PoDIaCS) survey, J. Vet. Behav. Clin. Appl. Res. 8, 418–424.
Huber, L., Racca, A., Scaf, B., Virányi, Z. and Range, F. (2013). Discrimination of familiar human faces in dogs (Canis familiaris), Learn. Motiv. 44, 258–269.
Hunnius, S., de Wit, T. C. J., Vrins, S. and von Hofsten, C. (2011). Facing threat: infants' and adults' visual scanning of faces with neutral, happy, sad, angry, and fearful emotional expressions, Cogn. Emot. 25, 193–205.
Izard, C. E. (1991). The Psychology of Emotions. Springer, New York, NY, USA.
Izard, C. E. (1993). Organizational and motivational functions of discrete emotions, in: Handbook of Emotions, M. Lewis and J. M. Haviland (Eds), pp. 631–641. Guilford Press, New York, NY, USA.
Izumi, A. (2013). Cross-modal representation in humans and nonhuman animals: a comparative perspective, in: Integrating Face and Voice in Person Perception, P. Belin, S. Campanella and T. Ethofer (Eds), pp. 29–43. Springer, New York, NY, USA.
Kahana-Kalman, R. and Walker-Andrews, A. S. (2001). The role of person familiarity in young infants' perception of emotional expressions, Child Dev. 72, 352–369.
Keltner, D. and Kring, A. M. (1998). Emotion, social function, and psychopathology, Rev. Gen. Psychol. 2, 320–342.
Keltner, D., Ellsworth, P. C. and Edwards, K. (1993). Beyond simple pessimism: effects of sadness and anger on social perception, J. Pers. Soc. Psychol. 64, 740–752.
Kestenbaum, R. and Nelson, C. A. (1990). The recognition and categorization of upright and inverted emotional expressions by 7-month-old infants, Infant Behav. Dev. 13, 497–511.
LaBarbera, J. D., Izard, C. E., Vietze, P. and Parisi, S. A. (1976). Four- and six-month-old infants' visual responses to joy, anger, and neutral expressions, Child Dev. 47, 535–538.
Leppänen, J. M. and Nelson, C. A. (2006). The development and neural bases of facial emotion recognition, in: Advances in Child Development and Behavior, Vol. 34, R. V. Kail (Ed.), pp. 207–246. Elsevier, Amsterdam, The Netherlands.
McGreevy, P., Grassi, T. D. and Harman, A. M. (2004). A strong correlation exists between the distribution of retinal ganglion cells and nose length in the dog, Brain Behav. Evol. 63, 13–22.
Merola, I., Prato-Previde, E. and Marshall-Pescini, S. (2011). Social referencing in dog-owner dyads? Anim. Cogn. 15, 175–185.
Miller, P. E. and Murphy, C. J. (1995). Vision in dogs, J. Am. Vet. Med. Assoc. 207, 1623–1634.
Montague, D. P. F. and Walker-Andrews, A. S. (2002). Mothers, fathers, and infants: the role of person familiarity and parental involvement in infants' perception of emotion expressions, Child Dev. 73, 1339–1352.
Müller, C. A., Schmitt, K., Barber, A. L. A. and Huber, L. (2015). Dogs can discriminate emotional expressions of human faces, Curr. Biol. 25, 601–605.
Nagasawa, M., Murai, K., Mogi, K. and Kikusui, T. (2011). Dogs can discriminate human smiling faces from blank expressions, Anim. Cogn. 14, 525–533.
Nesse, R. M. (1990). Evolutionary explanations of emotions, Hum. Nat. 1, 261–289.
Peichl, L. (1992). Topography of ganglion cells in the dog and wolf retina, J. Comp. Neurol. 324, 603–620.
Preston, S. D. and De Waal, F. B. M. (2002). Empathy: its ultimate and proximate bases, Behav. Brain Sci. 25, 1–20.
Prguda, E. and Neumann, D. L. (2014). Inter-human and animal-directed empathy: a test for evolutionary biases in empathetic responding, Behav. Processes 108, 80–86.
Racca, A., Amadei, E., Ligout, S., Guo, K., Meints, K. and Mills, D. (2010). Discrimination of human and dog faces and inversion responses in domestic dogs (Canis familiaris), Anim. Cogn. 13, 525–533.
Racca, A., Guo, K., Meints, K. and Mills, D. S. (2012). Reading faces: differential lateral gaze bias in processing canine and human facial expressions in dogs and 4-year-old children, PLoS ONE 7, e36076. DOI:10.1371/journal.pone.0036076.
Ratcliffe, V., McComb, K. and Reby, D. (2014). Cross-modal discrimination of human gender by domestic dogs, Anim. Behav. 91, 126–134.
Rivers, S. E., Brackett, M. A., Katulak, N. A. and Salovey, P. (2006). Regulating anger and sadness: an exploration of discrete emotions in emotion regulation, J. Happiness Stud. 8, 393–427.
Ruffman, T., Ng, M. and Jenkin, T. (2009). Older adults respond quickly to angry faces despite labeling difficulty, J. Gerontol. B. Psychol. Sci. Soc. Sci. 64B, 171–179.
Schwartz, G. E., Weinberger, D. A. and Singer, J. A. (1981). Cardiovascular differentiation of happiness, sadness, anger, and fear following imagery and exercise, Psychosom. Med. 43, 343–364.
Schwartz, G. M., Izard, C. E. and Ansul, S. E. (1985). The 5-month-old's ability to discriminate facial expressions of emotion, Infant Behav. Dev. 8, 65–77.
Shields, S. A. (1984). Reports of bodily change in anxiety, sadness, and anger, Motiv. Emot. 8, 1–21.
Sliwa, J., Duhamel, J.-R., Pascalis, O. and Wirth, S. (2011). Spontaneous voice-face identity matching by rhesus monkeys for familiar conspecifics and humans, Proc. Natl Acad. Sci. USA 108, 1735–1740.
Sobin, C. and Alpert, M. (1999). Emotion in speech: the acoustic attributes of fear, anger, sadness, and joy, J. Psycholinguist. Res. 28, 347–365.
Soken, N. H. and Pick, A. D. (1992). Intermodal perception of happy and angry expressive behaviors by seven-month-old infants, Child Dev. 63, 787–795.
Soken, N. H. and Pick, A. D. (1999). Infants' perception of dynamic affective expressions: do infants distinguish specific expressions? Child Dev. 70, 1275–1282.
Somppi, S., Törnqvist, H., Hänninen, L., Krause, C. M. and Vainio, O. (2013). How dogs scan familiar and inverted faces: an eye movement study, Anim. Cogn. 17, 793–803.
Appendix A. Canine participants

No  Breed                                        Include  Sex     Neutered  Age (years)
1   Greyhound                                    yes      female  no        1.17
2   German Shepherd Greyhound                    yes      female  no        1.17
3   Golden Retriever                             yes      female  yes       1.20
4   Lab/Collie                                   yes      female  yes       1.64
5   Border Collie                                yes      female  yes       1.73
6   Golden Retriever                             yes      female  yes       1.78
7   Labrador/Greyhound/Collie/Bully              yes      female  yes       1.85
8   Husky Golden Retriever cross                 yes      female  yes       2.08
9   Border Retriever cross                       yes      female  no        2.18
10  Labrador/Border Collie                       yes      female  yes       2.37
11  German Short-Haired Pointer                  yes      female  yes       2.48
12  Labrador/Poodle                              yes      female  yes       2.48
13  Staffy                                       yes      female  no        2.55
14  Shihtzu/Lhasa Apso                           yes      female  yes       2.78
15  Miniature Poodle                             yes      female  yes       3.19
16  Labrador                                     yes      female  no        3.32
17  French Mastiff                               yes      female  yes       3.71
18  Foxy Jack Russell cross                      yes      female  yes       3.90
19  Lab cross                                    yes      female  yes       4.27
20  Cocker Spaniel/Labrador/Hungarian Visla      yes      female  yes       5.35
21  Heading                                      yes      female  yes       5.77
22  Collie/Husky/Heading                         yes      female  no        5.88
23  Schnauzer                                    yes      female  yes       8.56
24  Shetland Sheepdog                            yes      female  yes       8.93
25  Labrador Retriever                           yes      female  yes       9.92
26  Border Collie                                yes      female  yes       11.79
27  Border Collie                                yes      female  yes       11.82
28  Golden Retriever                             yes      female  yes       12.84
29  Labrador Retriever                           yes      female  yes       12.95
30  Bichon/Poodle/Chihuahua/Terrier              yes      female  yes       14.77
31  Terrier cross                                yes      female  yes       15.21
32  German Shepherd Greyhound                    yes      male    no        1.17
33  Great Dane cross                             yes      male    yes       1.42
34  Huntaway cross                               yes      male    yes       2.74
35  Belgian Shepherd                             yes      male    no        2.95
36  Standard Poodle                              yes      male    yes       3.93
37  Schnauzer, Staffy and Labrador               yes      male    no        4.12
38  Spoodle                                      yes      male    yes       4.79
39  Weimaraner                                   yes      male    yes       5.17
40  German Short-Haired Pointer                  yes      male    no        5.18
41  Labrador/Staffordshire Terrier               yes      male    yes       5.38
42  American Red-Nose Pitbull                    yes      male    no        5.56
43  Blue Heeler/Beardie                          yes      male    yes       6.08
44  Labrador/Huntaway                            yes      male    yes       -
45  Black Lab                                    yes      male    yes       -
46  German Shepherd Greyhound                    yes      male    no        -
47  Border Collie                                yes      male    yes       -
48  Golden Retriever                             yes      male    no        -
49  English Setter                               yes      male    yes       -
50  Boxer                                        yes      male    yes       -
51  Maltese/Cavalier King Charles Spaniel cross  no       female  yes       -
52  Whippet                                      no       male    yes       -
Appendix B. Infant participants

No  Sex     Task     Age (months)
1   female  emotion  7.00
2   female  emotion  7.43
3   female  emotion  7.20
4   female  emotion  7.67
5   female  emotion  7.67
6   female  emotion  6.73
7   female  emotion  6.83
8   female  emotion  6.53
9   female  emotion  7.30
10  female  emotion  8.50
11  male    emotion  7.13
12  male    emotion  7.17
13  male    emotion  5.90
14  male    emotion  6.93
15  male    emotion  7.77
16  male    emotion  7.23
17  male    emotion  8.20
18  male    emotion  9.07
19  male    emotion  7.07
20  male    emotion  5.73
21  male    emotion  7.30
22  male    emotion  5.13
23  male    emotion  7.77
24  male    emotion  6.57