Why Culture Trumps Training in Achieving Technological Success

In a 2013 op-ed in the New York Times, Lt. General H.R. McMaster, now the National Security Advisor, cautioned against approaches to military strategy that rely on technological superiority but ignore the human dimensions of war. “Be skeptical,” he said, “of concepts that divorce war from its political nature, particularly those that promise fast, cheap victory through technology.” In a separate interview from the same year, McMaster elaborated on this point, saying, “We’re so enamored [with] technological advancements that we fail to think about how to best apply those technologies to what we’re trying to achieve.”


Although McMaster was speaking about the U.S. military, the same warnings could be directed at today’s corporations, which are turning to technology for “fast, cheap victory” while at times forgetting the people who have to use these solutions and the people those solutions are meant to benefit.


The result of this scotosis, this blindness, is the inability of organizations to generate a return on their technology investments. They spend millions on new digital products or services, only to find they’re at odds with customer needs. They pour resources into online marketing and advertising, only to find that their consumers don’t make purchase decisions online. They buy expensive IT platforms and spend months deploying them, only to find frustrated employees can’t or won’t use them. What organizations miss is that the inability or unwillingness of employees or customers to embrace technological change often emerges from the culture of their workplace or marketplace—not the quality or quantity of the technology being deployed.


Take college classrooms, for example. Professors are accustomed to being in charge, being right, being respected and being relatively autonomous. They are the masters of their domain and are accustomed to teaching their way. New technologies imposed from the top down, such as distance learning solutions, often require instructors to change their process or approach. Not only does this interrupt the professor’s routine (and we all hate it when something disrupts our routine!), but it implicitly suggests there’s something wrong or inadequate about the way they’ve always taught. Many of these scholars earned their doctorates to focus on innovating in their fields—not on innovating how their fields are taught. There’s a misalignment of motivations here that’s at the core of recalcitrant or slow technology adoption, even among constituencies that have the appropriate training. Simply teaching people how to use something doesn’t motivate them to use it.


A similar scenario is playing out in hospitals, many of which have spent millions of dollars deploying Electronic Health Record systems only to see a decline in productivity and patient care. The answer for many hospital administrators is to invest more resources in staff and training. But physicians, PAs, nurses and other caregivers are busy people, and just because they know how to use a piece of hardware or software doesn’t guarantee they’ll use it at all, much less practice using it to increase their proficiency. What does guarantee they’ll use it is the conviction that it’s critical to their success and the success of their patients. That’s not a skill you teach. It’s an attitude you cultivate. If people are truly motivated, they’ll train themselves.


Organizations, regardless of their size or industry, must always account for the human element before, during and after implementing large-scale IT changes. The psychology and sociology of your workforce should be studied and understood. Its culture should be analyzed and leveraged. Key internal influencers should be brought into the process early, and the needs of employees should be assessed prior to making any IT investment. Cultural transformation and employee engagement initiatives—along with training and incentive programs—should be implemented in parallel with IT deployments rather than waiting until employee frustration mounts and trust between your people and your technology has been lost.


Managing human-technology interactions is going to be increasingly important in the coming decades. Organizations that succeed at using technology to cut costs and accelerate growth will be those that foster a culture that understands technology and its importance, embraces change and is always trying to improve itself. This is a tall order, but one that will ultimately help companies stay competitive in the 21st century.


Technologies are tools. Processes are means. People, however, are the origin and end of every economic activity. Companies may deploy a new software system to make people’s jobs easier. They may develop new digital products or services to better satisfy consumers. They may even invest in automation to create value for investors or lower prices for consumers (even when companies use technology to make people obsolete, they’re still doing it for the benefit of other people). In all of these cases, decision makers need to ask how these stakeholder groups will react to technological change and make cultural transformation a core part of technology projects. After all, your technology is only as valuable as the people who use it.


About the Author
Remington Tonar is a Partner at Brandsinger, where he specializes in strategic innovation, change management and technology strategy.