I’m often asked to identify the single biggest factor that derails performance benchmarking initiatives. The list of potential culprits can be quite extensive, and I detailed a handful of mandatory building blocks in an earlier blog. That’s not to say one culprit’s star doesn’t shine brighter than the others. Benchmarking does have an archenemy, and its name is Data Denial.
A big part of the value proposition of benchmarking is monitoring and measuring performance. And when you identify variation, you create a form of data-driven accountability. Anytime you change the parameters of accountability, you’re going to find naysayers (across all levels of the organization) who will attempt to discredit all or specific portions of an initiative.
Staff resistance will come from a dislike of change, a wariness of ‘big brother,’ or the fear that their perception of performance doesn’t match reality. Here are some of the comments I hear the most:
- ‘I’m not sure where this data came from; it can’t be accurate’
- ‘Those aren’t apples-to-apples peer facilities’
- ‘Data’s black and white and doesn’t capture the shades of grey that exist in our department’
- ‘I’m not about to let data drive my decision for what’s best for my patients’
Problems with data accuracy typically occur during the very early stages of a benchmarking initiative when different data sets are being normalized and ‘mapped’ to provide the necessary links for comparison. Do you know the scene in the movie ‘A Christmas Story’ when Ralphie is decoding the secret message? The number 2 equaled ‘B’ and so forth. That’s pretty much how mapping between two data sets works. While Ralphie probably wished he’d made an error after learning the message was nothing more than an ad for Ovaltine, you don’t want errors littering your mapping process. We have found that a small degree of incorrect data can derail the entire cost improvement initiative.
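To make the decoder-ring analogy concrete, here is a minimal sketch of that kind of mapping: translating one facility’s internal codes into a shared benchmarking taxonomy. All of the codes, labels, and figures below are hypothetical illustrations, not actual client data. The key design point is that unmapped codes are flagged for review rather than silently dropped, since those stray rows are exactly the ‘small degree of incorrect data’ that erodes trust.

```python
# Hypothetical crosswalk from a facility's local department codes
# to standard benchmark categories (the '2 equals B' decoder ring).
crosswalk = {
    "NUR-01": "Nursing - Med/Surg",
    "NUR-02": "Nursing - ICU",
    "LAB-01": "Clinical Laboratory",
}

def map_records(records, crosswalk):
    """Translate (local_code, cost) records into benchmark categories.

    Unmapped codes are collected separately instead of being discarded,
    so data-quality problems surface before any peer comparison is run.
    """
    mapped, unmapped = [], []
    for code, cost in records:
        if code in crosswalk:
            mapped.append((crosswalk[code], cost))
        else:
            unmapped.append((code, cost))
    return mapped, unmapped

# Illustrative input: "XRAY-9" has no crosswalk entry and will be flagged.
records = [("NUR-01", 1200.0), ("LAB-01", 450.0), ("XRAY-9", 300.0)]
mapped, unmapped = map_records(records, crosswalk)
```

In this sketch, an empty `unmapped` list is the gate for publishing results; anything else goes back for review before the numbers reach a manager’s dashboard.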
In the face of data deniers, you can’t counter by saying ‘the data is correct, trust me.’ You have to be able to articulate why the data is accurate. Here are three key talking points I often provide to our clients for use in conversations with data deniers:
- We didn’t rely on managers or other hospital staff to input the data. In order to ensure that definitions and guidelines were interpreted correctly, our solution provider worked side by side with us to manage the data submission, coding, normalization and report production. And they applied tightly defined functional definitions to minimize data anomalies.
- Our peer group was mutually agreed upon. Who we are compared with isn’t a secret, and there are no unidentified data tables to interpret. Each savings-potential calculation is based on a real facility actually exhibiting that cost-per-unit performance, not a hypothetical mathematical average.
- There’s no secret black box interpreting the results. Everyone can see the results online for their areas of responsibility, and there’s no need to rely on an outside expert to interpret the results or weed out bad data points.
Accurate data is your ace in the hole, and is really the only thing capable of creating greater buy-in across the organization. Managers understand their data better, and senior leaders can stand tall knowing that the results are correct and that they can confidently move forward with cost management initiatives. And data deniers will – yes, will – come on board too.
Check out this case study for more perspective on overcoming data denial and creating a new, data-driven culture. And as always, reach out to me with any questions at firstname.lastname@example.org.