Yes, users do know what they want from software. And developers must listen.
Benjamin Franklin said, "An ounce of prevention is worth a pound of cure." Why, then, is so much software reworked today, and why do so many software products fail to delight users?
In my view, it is because most companies neither understand nor apply the soft skills needed to develop great software.
In my career in the software business, I have experienced, and sometimes led, a number of instructive failed deployments. One effort involved software designed to help physicians enter orders. As homework, a technical staffer spoke to a physician and relayed the physician's insights to an analyst, who wrote them up. An engineer then built the solution, quality assurance tested it, and the implementation team installed it.
But the receiving clinicians at another hospital wanted nothing to do with it, so it was back to the drawing board, and the IT team set about fixing it. Who got egg on their face? Development -- and not for lack of talented people. I strongly believe the effort failed because the approach relied on old processes and an old mindset. We can do better.
Engineers, analysts, and others often believe they know what the user needs. A friend's teammate, who happened to be a trained physician but never practiced medicine, often said, "Users don't know what they want."
The fact is, he didn't know either. How could he? He had studied medicine years before, but he had never done the job and was no longer involved in the medical field. A user may not know how to express what they need, but they certainly know, as Clay Christensen teaches, the job to be done. We can do better.
The first change we must make is to observe. We must do the contextual inquiry needed to understand why a user will "hire" the software. It must be done with a "customer empathy" mindset. Success stories achieved in other industries can show us the critical nature of this approach.
A good example is Huggies Pull-Ups. Parent company Kimberly-Clark created a whole new product line after spending time with parents and understanding that diapers are about the growth and development of babies. Another favorite example is the Oxo Good Grips measuring cup: After watching people crouch to see the lines in their measuring cups, Oxo designed its cups to be legible from any angle.
Agile, scrum, rapid cycle ... these are terms we are becoming more familiar with. They are often touted as the balm that will heal the wounds of bad software. So why do companies that get these theories right still build less-than-thrilling software? Are we focusing too much on the middle of the software information chain and on developing better engineering skills and capabilities?
We need to teach contextual inquiry skills to observation teams. Software, as innovation, is a collaborative team sport, yet we don't seem to collaborate around the context of the job we're doing. We still fail to think of software development as a team sport. What information does a developer need to do a great job? What information does a quality assurance person need to do a great job? The same question can be applied right through to the implementation consultant.
But it all starts with the user. In his well-known article "Integrating Around the Job to Be Done," Clay Christensen asks: "Why would someone 'hire' a milkshake? What 'job' are they trying to get done?"
Many people talk about user-centered design, and I agree with that mindset. With a little creative license, I suggest that we should add "job-centered design" as well. We need to teach the mindset and skillset, and instantiate the toolset that enables great observation teams and outcomes.
Consider the concept of a "love metric." IT experts are familiar with the metrics of on time and on budget. However, these are entry-level expectations -- they are simply expected, but we are rarely rewarded in the long term for achieving them. Isn't our job really to help users get a job done?
If users want software to help them more easily record a patient note, or get paid faster, we must instrument our code and reviews to determine if the software delivers what users would love it to do. If they love it, we've succeeded.
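The idea of instrumenting for a "love metric" can be sketched in a few lines. This is a minimal illustration under assumptions of my own, not a prescription: the `LoveMetric` class, the job name `record_patient_note`, and the fields tracked are all hypothetical, and real instrumentation would feed an analytics pipeline rather than an in-memory list. The point is simply that "did the user finish the job, and how quickly" is measurable alongside "on time" and "on budget."

```python
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class LoveMetric:
    """Hypothetical tracker: did the software help users finish the job?"""

    events: list = field(default_factory=list)

    def record(self, job: str, completed: bool, seconds: float) -> None:
        """Log one attempt at a user job, e.g. recording a patient note."""
        self.events.append({"job": job, "completed": completed, "seconds": seconds})

    def completion_rate(self, job: str) -> float:
        """Share of attempts at this job that the user actually finished."""
        attempts = [e for e in self.events if e["job"] == job]
        if not attempts:
            return 0.0
        return sum(e["completed"] for e in attempts) / len(attempts)

    def mean_seconds(self, job: str) -> float:
        """Average time spent on successful attempts; lower suggests less friction."""
        done = [e["seconds"] for e in self.events if e["job"] == job and e["completed"]]
        return mean(done) if done else 0.0


# Example: two note-taking attempts succeed quickly, one is abandoned.
metrics = LoveMetric()
metrics.record("record_patient_note", completed=True, seconds=40.0)
metrics.record("record_patient_note", completed=True, seconds=60.0)
metrics.record("record_patient_note", completed=False, seconds=180.0)
```

A falling completion rate or a rising time-to-complete on a job users said they cared about is an early warning, long before a renewal conversation, that the software is not doing the job it was hired for.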