We hear a lot about how AI can be biased, but until it's staring us in the face, we may not be motivated to do much about it. So researchers at the University of Melbourne developed Biometric Mirror, an interactive app that at first glance looks like a cool new facial recognition tool that can discern things about you: it spits out an evaluation of your personality based on characteristics such as attractiveness, how responsible you are, emotional stability and "level of weirdness," among others. In reality, it's an intentionally bias-prone app, built with limited demographic categories and psychometric data analyzed by a model trained on a subjective, crowdsource-labeled facial image dataset. The goal of the project is to show just how flawed some AI can be, a point driven home when the app asks whether you're okay with your personality assessment being sent to a prospective employer or insurer.

Meanwhile, in straight-up, non-skeptical study news: researchers from the University of South Australia and the University of Stuttgart demonstrated how tracking eye movements can help predict personality traits.