The Catch-22s of HR Analytics: Knowing When You Can’t Win
Recently, I was hired to deliver presentations to business analytics and HR teams. While providing examples of where analytics can benefit organizations, I also presented some of the concerns, ethical considerations and challenges associated with applying analytics to people.
When You Just Can't Win
In HR analytics, it is important to look ahead and think about unintended consequences. With some people analytics studies, you need to know when you just can't win; otherwise, you'll waste valuable budget and resources to produce no results. I'll provide examples.
In my former career as an engineer, we studied the parameters of manufacturing equipment using "design of experiments" (DOE). Once we ran our experiments, we could optimize the settings of the machine to maximize output and the quality of that output.
With people, you can't turn the dials and change people's behaviour overnight. Sure, you can put some metrics in place to try and steer behaviour, but the unintended consequences can be massive. In other situations, your experiment can be doomed from the beginning. Take the following newspaper company as an example.
This newspaper company recognized that its reporters spent most of their time in the field, i.e., not in the office. This company wanted to determine how often reporters used their desks to see if there was an opportunity to consolidate office space and reduce the cost of their physical office.
They placed sensors under the reporters' desks to detect when they were there. What happened next? The reporters found the sensors and removed them. The experiment was over quickly and a great deal of money was wasted.
This is why it's crucial to recognize that there can be a Catch-22 in people experiments. If the company had communicated its experimental intentions, the employees likely would have modified their behaviour and spent more time at their desks, producing inaccurate data. Because the company didn't communicate its intentions, the employees' discovery of the sensors destroyed the trust between employer and employee. This was an experiment that just couldn't be won.
The Best of Intentions
In another example, companies have considered providing devices such as wearables to their employees under the premise that employees who are aware of their own health data will take action to improve their health. As a result, the cost of healthcare will, in theory, go down.
Unfortunately, there's a Catch-22 here because the employees who would benefit the most from knowing their health data (those who are less healthy at the moment) are the same subset of employees who would most fear that their "bad data" would be used against them. Just as smokers pay higher health premiums, these employees wonder if they would eventually be penalized with higher healthcare premiums if their wearable data becomes known.
Many analytics teams are failing to ask the following questions:
What can happen that will make our experiment fail?
Will we destroy employer-employee trust by running this experiment?
Does the VALUE of the outcome justify the risk and cost of the experiment?
I hope you found this topic to be thought-provoking. If you'd like to comment on this topic, you can leave a comment on this page.
Feel free to subscribe to the Numerical Insights YouTube channel (NEW!) and/or follow me on Facebook, Twitter and LinkedIn. I am always interested in reading your comments and questions. Social links are located on our web site.
Tracey Smith is a recognized analytics expert, speaker and author. She is the President of Numerical Insights LLC, a boutique analytics firm addressing the needs of businesses large and small. If you would like to learn more about using data for better decision-making, please visit www.numericalinsights.com or contact Tracey Smith through LinkedIn.