
The Hiring And Employment Interview Decision-Making Process: A Case Study Of 37signals's Hiring Practices

Alex Sachon April 16, 2011

Introduction to the Hiring and Employment Interview Process

Of the numerous strategic decisions organizations of all types are faced with, hiring decisions are among the most important and lasting. The main tool for the hiring process in most organizations is the employment interview, which typically occurs after an initial round of screening and generally takes place as an unstructured, face-to-face meeting (Bazerman and Moore, 2009). This interview process is designed to assess the capability (or potential) of applicants to perform a given job (Huffcut, 2011, p.64). Peterson et al. (1999) define job performance as the knowledge, skills, and abilities related to a position. Campbell et al. (1993) alternately define it as pertaining to declarative knowledge, procedural knowledge, skills, abilities, and motivation to expend time and energy.

While the employment interview is widely used, its effectiveness in predicting future job performance is a subject of debate among business and organizational theorists. According to Bazerman and Moore (2009), the overall consensus of interview research is that job interviews "do not work well" (p.184). Supporting this conclusion is a 1998 study by Schmidt and Hunter, which concludes that interviews of this type predict only about 14% of the variability in employee performance.

A key driver behind the questionable effectiveness of the employment interview may be the fact that, fundamentally, interviews are an interpersonal exchange between people. The actions and behaviors of individuals are embedded within social contexts and institutional structures, and these factors can have an unintended yet significant influence on people's behavior (Posthuma and others, 2002). Thus, underlying every interview process is the potential for various social dynamics to influence interview outcomes.

Interviews conducted in a loosely structured conversational format, perhaps the most commonly used job interview template, are found to be particularly susceptible to these types of influences. Fundamentally, humans are limited information processors whose ability to discern and rate relevant performance characteristics is often lacking (Morgeson and Campion, 1997). Consequently, as Van Iddekinge, Raymark, and Roth (2005) conclude, there is little to no evidence that interviewer ratings of applicants in the most commonly conducted job interview formats are valid predictors of those applicants' future job performance.

Psychological Biases at Play in the Hiring and Employment Interview Process

Because humans are flawed information processors, our decision-making abilities are vulnerable to cognitive biases, flawed decision-making heuristics, and other mechanisms of indirect social influence (Posthuma and others, 2002). This paper will briefly overview, in turn, several of the systematic influences that academic literature has shown to adversely affect the job interview decision-making process.

Availability Heuristic: Because most hiring processes do not collect systematic data that would allow them to empirically test an applicant's fit with their culture or with the task demands of a specific position, interviewers are often left to their own intuition in rating an applicant's fit (Bazerman and Moore, 2009). This can be a hindrance to successful decision-making, however, because intuition is highly susceptible to the influence of emotion, memory, and other factors potentially unrelated to an applicant's qualities relevant to the position.

Anchoring: Anchoring is the tendency for decision-makers to (consciously or unconsciously) select an arbitrary standard as an anchor of expectation that consequently influences their perception and evaluation of forthcoming information. Kataoka and others (1997) studied the influence of anchoring in the interview process, finding that interviewers given an interviewee rating scale with a high anchor evaluated interviewees more favorably than those given a rating scale with a low anchor. Thus, the presence of arbitrary or subjective anchors can influence an interviewer's decision-making in unconscious and undesired ways.

Confirmation Bias: Confirmation bias is the tendency for individuals to seek out information that supports a pre-conceived belief or notion. Phillips and Dipboye (1989) found that interviewers' favorable pre-interview impressions of applicants increased their tendency to have positive post-interview impressions of applicants. In a similar study, Dougherty and others (1994) found that interviewers with favorable pre-interview impressions increased behaviors associated with extending a job offer, including an increased tendency to adopt a positive attitude and to sell the organization and job to the interviewee. A study by Kinicki and others (1990), meanwhile, found that a significant amount of the variance in post-interview hiring recommendations came from pre-interview impressions of interviewee applications.

Affect Heuristic: The affect heuristic occurs when people's decisions become heavily influenced by quick and often superficial evaluations, such as a person's attractiveness. These characteristics, which can also include race, gender, background, or attitude, are often not relevant to the qualities required by the position (Posthuma and others, 2002). For example, Forsythe (1990) found that female candidates wearing what interviewers perceived as masculine clothing were judged by interviewers to be more self-reliant, decisive, and aggressive, and, consequently, were more likely to be viewed favorably for hire. Pingitore and others (1994) found that interviewers were similarly affected by interviewee weight, with applicant obesity explaining 35% of the variance in hiring decisions. Relating to the perceived attitudinal similarity between interviewer and applicant, Graves (1993) found that its presence influenced interviewer question type and post-interview applicant ratings. Dougherty and others (1994) studied the lasting impact of this heuristic, finding that interviewers rarely revised their first impressions of candidates over the course of the job interview.

Representativeness Heuristic: Closely related to the affect heuristic is the representativeness heuristic, which explains people's tendency to make judgments about individuals based on their perceived similarity to a previously formed stereotype. Bazerman and Moore (2009) write that intuition leads managers to think that, "if a person can speak coherently about her goals, the organization, or the job, then she will perform well at the job. For most jobs, however, interview performance is weakly related to actual job performance. (For example), extroverted, sociable, ... and ingratiating people often make more positive interview impressions than others. However, these traits are often less critical to job performance than other, less immediately observable traits, such as conscientiousness and intelligence" (p.185). Huffcut (2011) cites a study by Barrick and others (2009), which concludes that the manner in which interviewees conduct themselves, including the use of impression management tactics (such as engaging in self-promotion), can influence interviewer ratings over and above the degree to which those skills are required for job performance (p.63).

Biases and Impacts From Individual Interviewer Characteristics: In their literature review, Posthuma and others (2002) conclude that an interview's goals or purposes (e.g., initial screening, final selection, recruiting, or realistic job preview) can have a marked influence on interviewer-applicant interaction and interview outcomes (p.32). They cite studies by Adkins and others (1994) and Rynes and Gerhart (1990), which find that interviewer ratings of applicants carried across from one organization to another, suggesting that interviewers had some conception of an ideal candidate unrelated to the specificities of the organization or position (p.35).

Overall, Posthuma and others (2002) conclude that there is an increased recognition that the interview is "a complex multifaceted process with underlying psychological determinants" (p.50).
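The "about 14%" figure attributed to Schmidt and Hunter earlier is worth unpacking with a line of arithmetic: a selection method whose ratings correlate r with later job performance explains r-squared of the variance in that performance. The sketch below illustrates this using the validity coefficients commonly attributed to Schmidt and Hunter (1998) for unstructured (.38) and structured (.51) interviews; the variable names are this reviewer's, not the study's.

```python
# Variance explained by a selection method equals the square of its
# validity coefficient (its correlation r with later job performance).
r_unstructured = 0.38  # validity commonly reported for unstructured interviews
r_structured = 0.51    # validity commonly reported for structured interviews

variance_unstructured = r_unstructured ** 2  # ~0.14, the "about 14%" figure
variance_structured = r_structured ** 2      # ~0.26

print(f"{variance_unstructured:.2f} {variance_structured:.2f}")
```

Read this way, the figure also previews the later discussion of structure: a structured interview with validity .51 explains nearly twice the performance variance of an unstructured one.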

A Case Study In Hiring and Employment Interview Decision-Making: 37signals

37signals is a small software design company that sells simple and accessible web-based workflow applications targeted toward small companies or small groups within big companies. The focus of the software is on helping small teams collaborate, share information, and make decisions. Examples of their products include Highrise, an application that lets customers track contacts, leads, and deals, and Basecamp, a project management application that enhances collaboration within a project team or between a team and its clients. According to the company's website, their customers include "designers, developers, freelancers, lawyers, accountants, architects, non-profits, charities, universities, PR firms, retailers, manufacturers, consultants, and more."

The company, which is very transparent about its organizational practices and strategies, hosts popular blogs on the company and its products and regularly publishes a podcast featuring interviews with its employees regarding various workplace and technology issues. Co-founder Jason Fried frequently publishes editorial pieces for Inc. Magazine and has published two books overviewing 37signals's organizational ideas and practices. The company currently totals 26 people (Fried, 2011).

In a recent episode of their company podcast, 37signals co-founders Fried and David Heinemeier Hansson discussed in detail their interviewing and hiring practices (Linderman, 2011). There they explained that, in addition to requiring the standard resume and cover letter for each application, 37signals also has applicants demonstrate their skills and abilities in the important tasks and functions of the job, with the method of demonstration varying depending on the job. For example, for programming positions, as Fried and Hansson explain, they ask applicants to submit a portfolio of their work so they can "look at people's code and their Open Source contributions." For designers, "it's not that hard to judge design... You can look at someone's design work or their personal blog."

For positions in fields like customer service, where it is more difficult to quantify skill and talent, Fried and Hansson require writing samples: "We created some scenarios and we wanted to see some writing samples, how would someone respond to those scenarios or those customers. And, we found that it was a fantastic way to whittle down the pool from 185 or so applicants that we got for the customer service job down to a short list of maybe 10 people who we thought were qualified at the level we wanted and then we could whittle that down even further.... (The responses to these scenarios) speak far more than years of experience or how proficient you are with this piece of software or where you have worked in the past" (Linderman, 2011).

In evaluating applications in this manner, 37signals relieves itself from having to rely solely on the interview format in evaluating an applicant's fit. When it finally comes to bringing someone in for an interview, state Fried and Hansson, "it is mostly just about getting to know them, feeling them out. Is this kind of person a cultural fit? What's their demeanor like? We also introduce them... to the people they'd be working with.... We've already sort of hammered out the basics before we brought them in. It is more about what are they like as a person now.... This in-person meeting is much more subjective. If somebody comes in to the room and I just don't like that person for whatever reason, we are not going to hire them. It is as simple as that. We are not going to hire people we just don't gel with or don't like and sometimes you just get a bad vibe and you couldn't pick up on that. You just need the in-person meeting to either pick up on that or not pick up on that" (Linderman, 2011).

Realizing that this well-considered hiring strategy may still lead to non-ideal hires, 37signals has also adopted a hedging strategy of, in effect, extending the evaluation process beyond the interview and into short-term projects or contracts. In their book, Rework (2010), Fried and Hansson discuss their approach to this: "Interviews are only worth so much. Some people sound like pros but don't work like pros. You need to evaluate the work they can do now, not the work they say they did in the past. The best way to do that is to actually see them work. Hire them for a mini-project, even if it's for just twenty or forty hours. You'll see how they make decisions. You'll see if you get along. You'll see what kind of questions they ask. You'll get to judge them by their actions instead of just their words" (p.104).

In an Inc. Magazine feature, Fried expands on these mini-projects and talks about this trial period, in the form of a pre-employment, short-term-contract-based tenure, for new employees: "We try to test-drive people before hiring them full time. We give designers a one-week design project to see how they approach the problem. We pay them $1,500 for their work. Sometimes, we'll hire someone on a contract basis for a month to see how we feel about the person and how the person feels about us. Sometimes that project is just a few hours a week because the candidate already has a day job. But that's often enough to check out the person's work, how the person communicates, and how the person works under pressure. These real-work tests have saved us a few mismatched hires and confirmed a bunch of great people" (Fried, 2010).

An Evaluation Of 37signals's Hiring and Employment Interview Practices

Several of the academic studies reviewed in this paper present best practices for how to structure decision-making during the hiring and employment interview process. The overarching goal of these best practices is to reduce vulnerability to bias in order to better assess applicant capability and potential for future job performance. Comparing 37signals's interviewing and hiring decision-making practices to the findings presented in these best practices can be helpful in assessing the quality of the 37signals hiring strategy.

From his review of the literature, Huffcut (2011) concludes that, in order to best assess applicant ability, interview questions should elicit information pertaining to the characteristics needed to perform in that position, and in turn "those constructs should be reflected in (interviewee) ratings" (p.64). He also recommends that interviewers be trained to make themselves immune to interview performance tactics such as self-promotion and ingratiation, as recent research indicates that these tactics are utilized to a much greater degree in interviews than on the job (p.75).

Bazerman and Moore (2009) recommend simple, inexpensive tools such as intelligence tests over the loose, unstructured interview practiced by many organizations. But if organizations insist on conducting interviews, they write, they ought to "use structured interviews in which all job candidates are reviewed by the same set of interviewers and in which each interviewer asks the same questions of each candidate" (p.185). Posthuma and others (2002) come to similar conclusions in their review of meta-analyses of the effectiveness of job interviews. They find that structure enhances both the reliability and validity of the interview and that structured interviews provide incremental validity over other selection instruments, such as the intelligence tests recommended above by Bazerman and Moore (p.50).

In terms of avoiding bias and other decision-making pitfalls, 37signals appears to be far ahead of the curve. As highlighted by Huffcut (2011), the overarching shortcoming of the interview process is that it is descriptive rather than demonstrative, in that candidates describe things rather than perform them. The unverifiable nature of most of these descriptions, as well as the reliance on subjective assessment by the interviewer, "opens the door for an enhanced possibility of distortion (and bias)" (p.64).

The 37signals process of evaluating skill and job fit before the in-person interview, via their method of critically assessing an applicant's skill through their programming or design portfolio or through their responses to required essay questions, allows the company a means of objectivity in evaluating and comparing applicants. It also frees these assessments of skill and fit from falling victim to any of the major biases explored above, all of which occur because of the fundamentally social nature of the in-person interview. To again rehash a conclusion of Huffcut (2011), which cites the work of Barrick, Shaffer, and DeGrassi (2009), the manner in which interviewees conduct themselves, including the use of impression management tactics, can influence interviewer ratings "over and above the degree to which those skills are required for job performance" (p.74). 37signals seems to have avoided, or at least drastically diminished, the effect of these unintended and unwanted influences on their evaluations of applicants' expected job performance.

When 37signals is at the point in its evaluation process of finally inviting applicants to an in-person interview, it has already screened them enough to be comfortable with a person's skill and ability to perform the necessary job tasks. At this point, then, the company not only recognizes but embraces the subjective nature of the interview. In fact, the recognition of the presence of potentially biasing decision-making factors such as the affect and representativeness heuristics seems to be the very purpose of this component of the 37signals in-person interview strategy. To rehash a quote from the company's owners, "If somebody comes into the room and I just don't like that person for whatever reason, we are not going to hire them. It is as simple as that. We are not going to hire people we just don't gel with or don't like and sometimes you just get a bad vibe." By allowing themselves a level of purely subjective assessment following a thoroughly objective assessment of skill and talent, 37signals has positioned itself well to hire employees who both produce high-quality work and fit in well within their organizational culture.


The final best-practice recommendation of Bazerman and Moore (2009) states that interviewers' quantitative assessments ought to be "just one component fed into a linear model, along with intelligence, years of relevant work experience, and so on" (p.185). 37signals appears to have found a successful formula for its interviewing and hiring decision-making process, one in which the in-person interview is but one incremental step in a linear model of assessment. Has the model worked well for them? As owner Jason Fried wrote in a recent Inc. Magazine piece, "In 11 years, we've lost just five people" (Fried, 2011).
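The linear-model recommendation above can be made concrete with a small sketch: several independently measured predictors are combined with fixed weights, so the interview impression is only one incremental input rather than the whole decision. All attribute names, weights, and candidate scores below are hypothetical illustrations, not 37signals's actual criteria or Bazerman and Moore's published coefficients.

```python
# A minimal sketch of the linear-model approach Bazerman and Moore (2009)
# recommend: combine independently measured, standardized predictors with
# fixed weights instead of a holistic interview impression.
# All names, weights, and scores here are hypothetical.

def linear_hiring_score(candidate, weights):
    """Weighted sum of a candidate's standardized predictor scores."""
    return sum(weights[attr] * candidate[attr] for attr in weights)

# Hypothetical weighting in the spirit of the 37signals process: the work
# sample dominates, and the in-person interview is one small input.
weights = {
    "work_sample": 0.5,        # e.g. code portfolio or writing scenarios
    "cognitive_ability": 0.3,  # e.g. an intelligence-test score
    "experience": 0.1,         # years of relevant work, standardized
    "interview": 0.1,          # structured-interview rating
}

candidate_a = {"work_sample": 0.9, "cognitive_ability": 0.7,
               "experience": 0.4, "interview": 0.6}
candidate_b = {"work_sample": 0.5, "cognitive_ability": 0.6,
               "experience": 0.9, "interview": 0.9}

score_a = linear_hiring_score(candidate_a, weights)  # 0.5*0.9 + 0.3*0.7 + ...
score_b = linear_hiring_score(candidate_b, weights)
print(score_a, score_b)
```

Under this hypothetical weighting, the candidate with the stronger work sample outscores the candidate with the stronger interview impression, which is exactly the ordering the paper argues a demonstration-first process should produce.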


Sources

Academic

Bazerman, M. and Moore, D. 2009. Judgment in Managerial Decision Making. Hoboken, NJ: John Wiley and Sons, Inc.
Campbell, J., McCloy, R., Oppler, S., and Sager, C. 1993. A theory of performance. In Personnel Selection in Organizations. San Francisco: Jossey-Bass, p.35-70.
Dougherty, T., Turban, D., and Callender, J. 1994. Confirming first impressions in the employment interview: A field study of interviewer behavior. Journal of Applied Psychology, 79, p.659-665.
Forsythe, S. 1990. Effect of applicants' clothing on interviewers' decisions to hire. Journal of Applied Social Psychology, 20, p.1579-1595.
Graves, L. 1993. Sources of individual differences in interviewer effectiveness: A model and implications for future research. Journal of Organizational Behavior, 14, p.349-370.
Huffcut, A. 2011. An empirical review of the employment interview construct literature. International Journal of Selection and Assessment, 19, p.62-81.
Kataoka, H., Latham, G., and Whyte, G. 1997. The relative resistance of the situational, patterned behavior, and conventional structured interviews to anchoring effects. Human Performance, 10, p.47-63.
Kinicki, A., Lockwood, C., Hom, P., and Griffeth, R. 1990. Interviewer predictions of applicant qualifications and interviewer validity: Aggregate and individual analyses. Journal of Applied Psychology, 75, p.477-486.
Morgeson, F., and Campion, M. 1997. Social and cognitive sources of potential inaccuracy in job analysis. Journal of Applied Psychology, 82, p.627-655.
Peterson, N., Mumford, M., Borman, W., Jeanneret, P., and Fleishman, E. (Eds.) 1999. An Occupational Information System for the 21st Century. Washington, DC: American Psychological Association.
Phillips, A. and Dipboye, R. 1989. Correlational tests of predictions from a process model of the interview. Journal of Applied Psychology, 74, p.41-52.
Pingitore, R., Dugoni, B., Tindale, R., and Spring, B. 1994. Bias against overweight job applicants in a simulated employment interview. Journal of Applied Psychology, 79, p.909-917.
Posthuma, R., Morgeson, F., and Campion, M. 2002. Beyond employment interview validity: A comprehensive narrative review of recent research and trends over time. Personnel Psychology, 55, p.1-81.
Schmidt, F. and Hunter, J. 1998. The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, p.262-274.
Van Iddekinge, C. H., Raymark, P. H., and Roth, P. L. 2005. Assessing personality with a structured employment interview: Construct-related validity and susceptibility to response inflation. Journal of Applied Psychology, 90, p.536-552.

Applied

37signals. 2011. 37signals, LLC. Retrieved April 15, 2011.
Fried, J. 2010, June. Never read another resume. Inc.
Fried, J. 2011, April. Why I run a flat company. Inc.
Fried, J. and Hansson, D. 2010. Rework. New York, NY: Crown Publishing.
Linderman, M. (interviewer) and Fried, J. and Hansson, D. (interviewees). 2011. Episode #25: Hiring. 37signals podcast.