
A Third of Young Canadians Would Prefer a Robot Boss

Bleep bloop you're fired.
Image: Flickr/Alex Dixon

(Human) bosses suck.

They're a necessary evil of society, and they exist for one reason: to make you work. But there are good bosses and bad bosses. At their best, bosses fulfill their function without frustrating you too much, but at their worst, they'll steal your wages and pass you over because they're racist as hell—and whether you work for a massive company or a mom-and-pop operation doesn't make much of a difference here.


It's no wonder that so many young Canadians would rather have a computer program do the job—31 percent, to be exact, according to a new survey of more than two thousand Canadian adults by Vancouver-based consulting firm Intensions.

"Cognitive bias, which is a very human condition, can make our jobs quite tough," said Nikolas Badminton, a Canadian futurist and biohacker who helped write the survey questions. "Some people, like HR people and managers, kind of need to get out of the way so a better job can get done."

According to the survey, 31 percent of Canadians aged 20 to 39 agreed that an "unbiased" computer program would be more ethical and trustworthy than a human boss. Thirty-four percent said they would rather be hired by an "unbiased" computer program, 33 percent would prefer to be assessed by one, and 26 percent said they would prefer to be managed by a robot.


The survey findings appear to lend weight to previous studies that have suggested people would be open to being bossed around by a computer, but if you pay close attention, you can see where the survey makes a massive leap: remember, Canadians were asked if they would trust an "unbiased" computer program, which, I mean, of course you would.

The better question is whether a computer program can ever be unbiased in the first place. After all, humans make computers, and humans are full of all sorts of ugly preconceptions about the world.

Badminton suggested that algorithms making hiring decisions could be "trained" by experts to ensure that the programs exhibit as little human bias as possible. Indeed, he believes HR departments will be replaced by "human ethicists working out the boundaries of how software can work in the business and what that means for humans, as well as artificial intelligence trainers, or humanoiders." I think he made up that last one.

It's absolutely true that (simple) bots can be corralled with good engineering to avoid slipping into flagrant white supremacist propaganda, as Microsoft's recent Twitter bot unfortunately did, mostly thanks to human Twitter users. But even if this kind of bias can be phased out of a program, we have to ask ourselves another question: if a robot is the perfect boss, is that even a good thing?

A "good" boss might be someone who lets you take a half day on the down-low because you're feeling ill. A "perfect" boss might not. Because in the modern workplace, even a supposedly "unbiased" bot will have at least one guaranteed predilection: maximizing value-creation among the workforce. And that means you, friend.