Fear and liability in algorithmic hiring 

It would be a foolish U.S. business that tried to sell chlorine-washed chicken in Europe — a region where very different food standards apply. But in the high-tech world of algorithmically assisted hiring, it’s a different story.

A number of startups are selling data-driven tech tools designed to comply with U.S. equality laws into the European Union, where their specific flavor of anti-discrimination compliance may be as legally meaningless as the marketing glitter they’re sprinkling — with eye-catching (but unquantifiable) claims of “fairness metrics” and “bias-beating” AIs.

First up, if your business is trying to crystal-ball-gaze something as difficult to quantify (let alone predict) as “job fit” and workplace performance, where each individual hire will almost certainly be folded into (and have their performance shaped by) a dynamic mix of other individuals commonly referred to as “a team” — and you’re going about this job matchmaking “astrology” by working off of data sets that are absolutely not representative of our colorful, complex, messy human reality — then the most pressing question is probably, “what are you actually selling?”

Snake oil in software form? Automation of something math won’t ever be able to “fix”? An impossibly reductionist dream of friction-free recruitment?

Deep down in the small print, does your USP sum to claiming to do the least possible damage? And doesn’t that sound, well, kind of awkward?

Continue reading on TechCrunch.
