
A Face-Scanning Algorithm Increasingly Decides Whether You Deserve the Job (washingtonpost.com) 128

Shmoodling shares a report from The Washington Post: Designed by the recruiting-technology firm HireVue, the system uses candidates' computer or cellphone cameras to analyze their facial movements, word choice and speaking voice before ranking them against other applicants based on an automatically generated "employability" score. HireVue's "AI-driven assessments" have become so pervasive in some industries, including hospitality and finance, that universities make special efforts to train students on how to look and speak for best results. More than 100 employers now use the system, including Hilton, Unilever and Goldman Sachs, and more than a million job seekers have been analyzed.

But some AI researchers argue the system is digital snake oil -- an unfounded blend of superficial measurements and arbitrary number-crunching, unrooted in scientific fact. Analyzing a human being like this, they argue, could end up penalizing nonnative speakers, visibly nervous interviewees or anyone else who doesn't fit the model for look and speech. The system, they argue, will assume a critical role in helping decide a person's career. But they doubt it even knows what it's looking for: Just what does the perfect employee look and sound like, anyway?
"It's a profoundly disturbing development that we have proprietary technology that claims to differentiate between a productive worker and a worker who isn't fit, based on their facial movements, their tone of voice, their mannerisms," said Meredith Whittaker, a co-founder of the AI Now Institute, a research center in New York. "It's pseudoscience. It's a license to discriminate," she added. "And the people whose lives and opportunities are literally being shaped by these systems don't have any chance to weigh in."
  • by Anonymous Coward

    Same idea as the polygraph, reward sociopaths and punish anyone who shows emotion.

    Great!

  • by Joe_Dragon ( 2206452 ) on Tuesday October 22, 2019 @08:05PM (#59337474)

    and the end-user company will get sued for discriminating.

    What happens when a blind person needs to use this? Someone in a wheelchair?

    • If people can be taught how to look and speak, in order to fool the system, the system doesn't work.
      • If they can look and speak well enough to fool the system, then for many jobs it proves they can do the job.

        • Exactly. If you want to work the reception desk you can't look like Quasimodo.

        • by rtb61 ( 674572 )

          So frowny "resting bitch face" makes for bad salespeople and, statistically, for employees who are more difficult to deal with. If they were smart they would simply conduct a proper psychological assessment of the final candidate to test for psychopathy and narcissism, both very damaging to staff morale. Not questions and answers, but images and measured emotional reactions and brain activity, with controls; psychopaths and narcissists absolutely cannot cheat those tests -- by their very nature it is impossible for them to cheat.

          Bi

          • Re: (Score:3, Informative)

            > If they were smart they would simply conduct proper psychological assessment of the final employee to test for psychopathy

            Except that psychopathy is helpful for some careers, listed at https://en.wikipedia.org/wiki/... [wikipedia.org] . Jobs with the highest percentages of psychopaths include clergy, CEO, surgeon, and chef.

            • by Anonymous Coward

              The Wikipedia article you linked doesn't make the claim that psychopathy is helpful for the careers it lists, it only points out that those careers show an over-representation of people with psychopathic tendencies.

        • It's like a Voight-Kampff test with a positive result getting you the job.
      • So how exactly do you teach the black guy to look white?

    • by antdude ( 79039 )

      And multiple disabilities like me. :(

    • My thoughts exactly. I just finished my required annual online "Preventing Workplace Harassment" training, and this "recruiting technology" clearly violates my company's policies and code of conduct. It's more like phrenology than snake oil.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Tuesday October 22, 2019 @08:25PM (#59337510)
    Comment removed based on user account deletion
    • by sphealey ( 2855 ) on Tuesday October 22, 2019 @09:13PM (#59337600)

      My thought as well, although they seem to have combined that fraud with Myers-Briggs, the MMPI, and other pseudo-scientific jargon. A lose-lose mixture of self-referential theories.

      • by cas2000 ( 148703 )

        Don't worry, they'll give it a cutesy name like iPhren (or, better yet, iPhrend) and everyone will love it.

    • Oh really?

      Is this [pinimg.com] what you want the customer to see sitting at the front desk?

      • Ah, so you posted your picture after all.

        What I want from the person on the front desk isn't good looks, buddy, it's competence. Virtually every pretty person who has served me at a front desk has fucked up simple things like calling a cab or scheduling a reception room in the "business centre". I avoid those types like the plague now and rarely have trouble.

        Looks don't mean shit when it comes to competence or ability. Let me try to come up with examples that even the zoomers here will get... See, looks do

      • Exactly. That's why we need this new technology. A human could never determine if someone looked like that or not!
  • by ensignyu ( 417022 ) on Tuesday October 22, 2019 @08:27PM (#59337514)

    This is going to make some lawyers rich when they demonstrate that it's biased for or against certain races or genders.

    • plus the EEOC lawsuits.

      Party City made a $155,000 payout for a retail-level job
      https://www1.eeoc.gov/eeoc/new... [eeoc.gov]

      Bradley University tried to post a job saying applicants must be able to work in a non-ADA-compliant building

      • There have been many instances of AI sussing out statistically valid correlations that turn out to be racism and so should not be applied. I can't imagine "computer says we shouldn't hire you because you're African American and white people are slightly disinclined to buy from you" will fare much better in a lawsuit.

        • I doubt anyone would admit that "the computer said we shouldn't hire you" for any reason. In fact, when does a company ever tell a candidate why they did not hire them?

          Which is exactly how this software, or bigoted hiring policies, can exist. It is hard to prove. And most people declined either decide "I didn't want to work for those assholes anyhow" or are concerned about repercussions on their career. Blacklists might be illegal, but managers of various companies do talk to each other, depending on the in

          • Sensible companies never tell candidates why they were not hired. It brings no benefit to the company, and may later be used against them in court.

          • And when someone with a disability requests help with this interview and they get told NO, GOODBYE?

    • by jrumney ( 197329 )

      I think the whole point of using systems like this is to defend against the inevitable lawsuits while still getting the results you wanted. Your Honor, how can it be racist or sexist when the decision is made by an algorithm that does not take gender or race into consideration?

    • by AmiMoJo ( 196126 )

      Also against people with disabilities.

  • Get a picture. Is the skin light? Can the applicant grow a beard? Then they deserve the job.
    • by Cederic ( 9623 )

      In my industry people with light coloured skin are very much under represented if you compare to society demographics.

      Similarly, entering 'female' on the application form boosts your chances of interview, and success at interview.

      But you may not live in the UK or work in an IT field.

  • That, or have a clue about people and actually do the HR thing, not pass all responsibility onto automatons.

  • by gijoel ( 628142 ) on Tuesday October 22, 2019 @09:12PM (#59337592)
    Same as the old phrenology.
  • before I'd submit to this.

  • It should be rather testable. The system could rate new employees while no one at the company sees the system's ratings -- the ratings would be "sealed away". Then, after a year, have the relevant humans rate the employees on the same metric, unseal, and compare.

    Sure sounds like snake oil, but it'd be nice to know. Not that the system should be used even if it had very high accuracy. (A rough sketch of such a back-test follows this thread.)
    • How could it not work? Simply by evaluating age and gender, it can select against older employees most likely to become ill or select against women young enough to become pregnant. I'm old enough that if my training and expertise were not overwhelming, I would be unemployable.

      • I'm old enough that if my training and expertise were not overwhelming, I would be unemployable.

        Same here.

        We have what they need though, so they swallow hard and overlook the unspoken "age disqualifier" factor.
         

        • by raymorris ( 2726007 ) on Tuesday October 22, 2019 @11:23PM (#59337874) Journal

          I'm over 40 and the last time I was on the job market I had three job offers to choose from. Hmm, we're starting to get a collection of older people who have no problem getting hired here.

          Actually a few years ago I was thinking about dying my hair. Looking older could be bad for my career, I'd heard on a particular web forum. At an "all hands" meeting at work I looked around and noticed something interesting. Everyone in the organization who made more money than me had gray hair. Every single one. I decided not to dye my hair.

          • by LatencyKills ( 1213908 ) on Wednesday October 23, 2019 @07:05AM (#59338498)
            Over 40? At 50 I had job offers out the wazoo. Try a job interview at 60 or better yet 70. You can see most of the people across the table trying to do the math in their head to figure out how old you are (though people today are so bad at math, I've seen people pull out the calculator app on their phone to figure it out). Back in my 40's, I interviewed a 62 year old for a developer job and he impressed the heck out of me. I went to my boss afterwards and said "We have to hire this guy." He replied "Are you kidding me? He's over 60." Ten terrible candidates later, I managed to convince my boss to hire him, but could only get him as a C2H, and at a pretty insulting hourly rate frankly. The man took the job - I suspect he had no other job offers despite his skill. In six months, not only did we hire him full time, but he became the development lead.
            • Over 40? At 50 I had job offers out the wazoo. Try a job interview at 60 or better yet 70.

              I'm 60-ish and I have no problem getting offers and interviews. It's my decades of experience that outweigh the but-he's-so-old age factor.

              I've told interviewers who pissed me off that "I have t-shirts older than you" and that I didn't see how they could be expected to evaluate my skills without some experience of their own. I mean, I had a decade of work under my belt before their parents had gotten to 2nd base.

          • Over 40 here. What I observe is that at my age and for the type of money I want, companies want to hire me as a manager rather than a developer. And I'd rather write code. But if I want to pay for my kids' college and maybe retire someday, I have to do it.
            • Here is the story I told hiring managers:

              --
              At my last long-term role, things were structured in a way that saved my employer money on developer payroll while getting high-quality code produced. We had several programmers with less experience. The more junior programmers would touch base with me when they weren't sure how to proceed with a project, or I would touch base with them if I saw a likely trap ahead. I would spend a few minutes giving them a suggested path to develop the thing they were going to

    • Don't know about this one. However, someone gifted me a book about hiring at Google with their tricky questions and all.
      In the preface of the book, the author stated that over the years HR has suffered plenty of fads in hiring techniques, and that not a single one has stood up to scrutiny years later, once you could evaluate performance -- including Google's strategy outlined in the book.
      But there's always a new fad.
      Would have been hilarious if actual people's fates weren't at stake....
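
    A rough sketch of the back-test proposed at the top of this thread, assuming the sealed AI scores and the year-one human ratings can both be exported; the file and column names here are made up for illustration:

        # Back-test sketch: correlate sealed AI "employability" scores with
        # manager ratings collected a year later (hypothetical file/column names).
        import pandas as pd
        from scipy.stats import spearmanr

        sealed = pd.read_csv("sealed_ai_scores.csv")    # employee_id, ai_score (never shown to managers)
        followup = pd.read_csv("year_one_ratings.csv")  # employee_id, manager_rating

        df = sealed.merge(followup, on="employee_id")
        rho, p = spearmanr(df["ai_score"], df["manager_rating"])
        print(f"Spearman rho={rho:.2f}, p={p:.3f}, n={len(df)}")
        # A rho near zero would mean the "employability" score predicted nothing.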

  • Just great (Score:4, Funny)

    by fahrbot-bot ( 874524 ) on Tuesday October 22, 2019 @09:50PM (#59337672)

    A Face-Scanning Algorithm Increasingly Decides Whether You Deserve the Job

    Now only people with faces will get jobs - geesh.

  • by Antique Geekmeister ( 740220 ) on Tuesday October 22, 2019 @09:56PM (#59337690)

    Oh, dear. What an obvious opportunity to conceal what are _illegal_ biases involving gender, disability, age, or race. All of these traits do correlate with "fitness", since workers with the right features are likely to be able to work longer hours, take less sick time, especially take less parental leave, and are less likely to retire. All of them are reasons for businesses of all sizes to want to get around the legislation that compels them to ignore these physical factors in new employees.

  • by BytePusher ( 209961 ) on Tuesday October 22, 2019 @09:58PM (#59337694) Homepage

    If technology like this becomes widespread, it will effectively create a caste system in America. Guaranteed they are simply classifying people by socioeconomic class with a few biasing factors to make it appear impartial to race.

    Having recently gone through a job search, I found arrogant managers at second-tier companies almost reveling in the power they have to reject applicants for trivial reasons. One had me take over an hour of tests that seemed like the Big Five, plus a test that had me rank three jobs, two of which were boring manual labor involving alphabetizing records and one a fun creative/social job. They came back and said I didn't understand software development without ever asking me a question about software development.

    https://m.phys.org/news/2019-10-class-bias-hiring-based-seconds.html

  • Grow a pair and refuse to have anything to do with companies that do this, and let them know WHY that is the case. Easy Peasy Simple Wimple. In short order they will stop what they are doing or face bankruptcy.

  • Oh lovely (Score:5, Interesting)

    by JustAnotherOldGuy ( 4145623 ) on Tuesday October 22, 2019 @10:30PM (#59337772) Journal

    This sounds like a splendid addition to the world if you're looking to create a dystopian society.

    With these kinds of weirdly invasive job 'evaluation' hurdles increasingly becoming the norm, I'm glad I'm near the end of my career rather than at the beginning. I'd hate to go through machine evaluations and possibly 'fail' because of the way my lip twitched.

    And speaking of dystopian, heavily-monitored societies, China is pretty much already there in many ways. The thing a lot of people don't realize is that we're not far behind them. The capability to do that kind of surveillance and monitoring is already here, it's just that the tying-it-all-together part hasn't happened yet.

  • But maybe they'll find the Roys they're looking for.

    Glad I'm 5+ years from early retirement rather than interviewing for my first soul sucking job.

  • When I first saw Gattaca, it was an amazing look into a possible (scary) future. What I didn't realise is that we are already there!
    *takes blood sample*
    "Congratulations, you got the job!"
    "What about the interview?"
    "uhh...That was it"
    • Check out Unnatural Selection on Netflix.

      Gattaca is right around the corner.

      It's a bit sensationalist (at first), but that's fine with me. The ability to choose eye color and gender is sensational. But the most incredible part is the kid who gets vision he never had.

  • "AI driven" usually means machine learning. How can they be sure that an application that measures faces hasn't picked up a gender, ethnicity, or age bias? There are minor regional language variations that may well be correlated with ethnicity, even though all are "correct" English. Shapes of faces and eyes of course vary a lot -- so is a person looking "happy" or "enthusiastic", or is the algorithm triggering on an ethnic feature? (A rough audit sketch below.)
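
    One way to probe that is the EEOC's "four-fifths" rule of thumb: compare the tool's pass rates across groups. A rough audit sketch, assuming the scores can be exported alongside self-reported demographics; the file name, column names, and the 0.5 cutoff are made up for illustration:

        # Adverse-impact sketch: pass rates by group and the four-fifths ratio
        # (hypothetical file/column names and cutoff).
        import pandas as pd

        df = pd.read_csv("interview_scores.csv")   # columns: group, ai_score
        df["passed"] = df["ai_score"] >= 0.5       # whatever cutoff the employer applies

        rates = df.groupby("group")["passed"].mean()
        ratio = rates.min() / rates.max()
        print(rates)
        print(f"Selection-rate ratio: {ratio:.2f} (below 0.80 suggests adverse impact)")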

  • for Automated Racism

  • 100 employers doesn't seem all that pervasive to me.

    And employers have used snake oil criteria to judge applicants since before the term "snake oil" was invented.

    This is no stupider than using personality tests, or voice stress analysis, or any other flavor of snake oil.

    Nothing to see here, move along.

    • At one point, I worked for a very large insurance company, and they had a very, very narrow profile of what their ideal Life Insurance Salesperson should be like. Skin colour wasn't part of the profile, but the rest (IIRC) was male, 20-30, recently married. And pretty much the whole industry shared this same profile ... and, ho boy, did they have tons and tons of bogus and scientifically bovine excremental reasons for that profile. Good luck getting hired if you didn't fit the profile.

      Here's the kicker:

  • Human interviewers are not particularly smart. They get an emotional feel for a person based on very superficial traits. Sure the program will be dumb. But maybe not dumber than human interviewers.

    A related trend is having AI systems mark English assignments. They can be fooled into giving high scores to well-crafted garbage based on vocabulary etc. But tests show that their assessments are actually better than those of human markers. The underpaid human only looks at each essay for a few seconds, checks a bit

  • by nicolaiplum ( 169077 ) on Wednesday October 23, 2019 @04:56AM (#59338262)

    The obvious next step is to deceive the AI by using the same technology as deep-fake videos. Take the image of yourself and modify it in real time to make yourself more conventionally attractive, your speech patterns smoother, and so on.

    Want the job? You need to buy an appearance smoother to use on your phone before you go to the interview. Better appearance smoothers will be more expensive, of course. I hope your family has money or you can afford a loan shark, because you'll need money to get the job. Of course you can't get a mainstream loan, since that will also judge you by your appearance.

    Having to buy your way into a job is not new - traditionally it simply means bribing the officials or leaders assigning the jobs, but in future it will mean buying the technology to deceive the employer's technology.

  • But some AI researchers argue the system is digital snake oil -- an unfounded blend of superficial measurements and arbitrary number-crunching, unrooted in scientific fact. Analyzing a human being like this, they argue, could end up penalizing nonnative speakers, visibly nervous interviewees or anyone else who doesn't fit the model for look and speech.

    So HR departments will snap this up.

  • The real measure in these cases is a Pass/Fail checkbox that tells a prospective employer that the candidate is willing to comply with dehumanizing procedures in order to get a job.

    Companies don't want candidates who have the kind of confidence and sense of self-worth that makes them able to refuse these types of hiring techniques. They want someone who will do what they're told, when they're told to do it, no matter how demeaning or unreasonable it is. The kind of person who refuses this test is much mor

    • by Cederic ( 9623 )

      > Companies don't want candidates who have the kind of confidence and sense of self-worth that makes them able to refuse these types of hiring techniques

      Pretty much every senior manager I've worked with greatly values those traits.

      Of course, confidence needs to be backed up by substance; simple arrogance is undesirable.

      > The important piece of information gleaned is "did the candidate take the test/sit for the interview/etc.?" If yes, you know that they'll accept that salary freeze or work three extra hours after clocking out or tear their back to shreds making that manufacturing quota.

      Nonsense. People accept all manner of bollocks at interview, but that doesn't change their expectations in how they'll be treated, doesn't impact whether they'll be loyal to the company, doesn't demonstrate their work ethic, doesn't even hint towards how much they value their home life.

      > Asking for an AI video interview, making you take a "personality assessment test," drug tests, and things like that are basically enormous red flags telling you to stay away from that company.

      Ok, I agree with you on this. I'd suggest 'proceed with c

  • by lamer01 ( 1097759 ) on Wednesday October 23, 2019 @08:40AM (#59338704)
    Did they find the 'perfect' employees and use them to build their models?
  • Recognition is growing that job interviews are pretty much useless when it comes to predicting candidate performance and suitability. Even Forbes is on that bandwagon [forbes.com]. What makes HireVue think they have an algorithm that can judge 'employability' better than an experienced and insightful human being? How was their "AI" trained? How were the algorithms developed? How was their product tested for effectiveness? Was it tested at all? HireVue's website is long on motherhood-and-apple-pie promotional bullshit, b

  • And just who is it that creates an impartial algorithm? Facebook? Google? China? A Democrat/Republican? Will the AI discriminate against organics and only hire robots? Eliminating the human aspect means eliminating humanity.
  • by WindBourne ( 631190 ) on Wednesday October 23, 2019 @09:41AM (#59338918) Journal
    I used to teach various Comp. Sci. classes at various companies at the end of the 90s/early 00s. The best student I had was a black woman at Bell Labs (wish I had worked with her when I worked there) when I was teaching Linux kernel internals. She was smart, fast, and her code was a joy to see. Yet she had nails that had to be a good 4-5". She was cute, but also on the far edge of what 90s society would accept. Basically, had she gone through something like this, she would likely have flunked, and Bell Labs (Lucent now) would have lost one of their brilliant people. In fact, thinking about it, many of us Unix/Linux geeks probably would not pass such BS.
  • I hear that measuring & mapping the size & shape of bumps on candidates' skulls is more accurate & reliable.

  • -or it's the soup line for you. Didn't a country in Europe 80-odd years ago attempt to create the 'perfect' human being, an entire empire filled with them? I heard it didn't turn out so well.
