Do we really learn from our (tech) mistakes?
I was intrigued by the feature "Taking Lessons From What Went Wrong" in the NYTimes earlier this week. Author Broad makes a compelling case that disasters, when properly examined after the fact, lead to engineering advances.
Do we ever do this in education - admit our errors and learn from them?
A few years ago, I wrote a short article called The PLSA (Probability of Large Scale Adoption) Predictors for an Australian tech journal. It must have been so brilliant that I never got a single question about it (or any other reaction). But the premise of the piece always struck me as a pretty good one - an analysis, based on projects that bombed, of why some technologies are adopted and others not. I've always wondered why others never share their "mistakes and what I learned from them" stories.
In the article, I looked at:
- Digital Video Editing (1996) At this time we had a young and ambitious video tech on staff who wanted in the worst way to replace our analog editing equipment with the latest in digital editing software. We spent about $13K for a system that just never did work quite right. The tech became so frustrated (and probably tired of my questions) that she quit and the equipment sat unused. iMovie made an appearance not long after, and the rest is history.
- Interactive Television (1999) At the cost of about $20K (from a grant, not local dollars), we installed an interactive television hook-up in our district staff development room. Other than one university course, an after school advanced math class, and a few meetings, the equipment did not get used and we removed it after two years. We now use other ITV facilities in town for meetings when needed.
- Data-mining (2001) We contracted with a regional tech center to develop a data warehousing and data-mining solution. About six months into the project, the tech center closed. We found another developer. He bailed after deciding his company would rather focus on online testing. Total loss of funds was about $20K, plus countless staff hours of planning. (We have had a more successful data-mining project since then...)
Taking a serious look at mistakes is difficult to do for any number of reasons...
- It's not always clear when we've made a mistake, especially in education.
- Admitting error requires more self-confidence (and job security?) than many of us have.
- The optimist in us wants to give the project just a few more (weeks, months, years, etc.)
- Most of us are working with scarce, public dollars and feel pretty guilty when these funds don't have an impact.
- We are not accustomed to a transparent environment in which learning from failed experiments is lauded rather than condemned.
We need to change our attitudes about mistakes in education, both at a local and at a national level. It's tragic to lose the benefits of a mistake. After all, we do make enough of them.
Any boo-boos you're willing to admit - and the lessons you learned from them?
Reader Comments (6)
In design, the most important operational maxim is 'fail early' followed by 'fail often'.
Failing late is a very expensive way to do things, and failing once can be catastrophic. So designers model solutions using cheap gear (like pieces of paper showing web pages, or pieces of cardboard marking out a new room layout) and try to spot the obvious problems early on. Failing often is the means by which they then advance rapidly on a working solution, eradicating each error one at a time. It is absolutely the opposite of how things are done in school, where you get marked down for failing and the timetable requires that you get things right the first time. That, in turn, may help to explain why people end up spending $20k on something that doesn't work or isn't needed.
There's a very good example of a paper prototype of a web interface at: http://www.youtube.com/watch?v=I7SGIC7_zHA
Josh's paper prototype
It seems to me that, in my failed implementations, I have almost always underestimated the complexity of the issue. Second on my list of reasons for failure is an inability to gather the right ownership across the proper populations.
I've come to realize that, most of the time, people have a passive acceptance of big implementations or broad strategic plans. Passive acceptance is miles away from owning the success or failure. It takes a lot of work to move people from simply nodding their heads and agreeing to actually wanting the project to succeed so badly that they will help when it is sinking.
Hi Sean,
Thanks for an expert view of this topic. I don't remember my educational training ever really visiting it.
I've always wondered about the wisdom of every school being a "research" school. (University lab schools in the US seem to be a thing of the past.) We certainly don't expect every hospital to be a "research" hospital.
Anyway, thanks again for the post.
Doug
Hi Joel,
My experience is similar. But I have never figured out a really good way of building such enthusiasm, especially in admin types. I tend to settle for passive support as the result of good communication about a project. Enthusiasm grows as the project becomes popular/successful.
Let me know your secret!
Doug
Doug,
My biggest failures in tech have come from focusing too much on functionality and ignoring usability. Sometimes fixing a problem with a solution that is too complex or difficult to implement is just as bad as the original problem, and it usually comes at greater cost and a higher level of frustration.
Hank
Hi Hank,
Yup - I agree 100%. I've always thought we lose 10% of users for every "step" that needs to be followed to complete a tech-related task. My own tolerance for tech complexity is getting lower all the time as well!
Doug
The key is to examine ALL projects after completion: what went well, what didn't go well, and why. I think you'll find that you learn more from what didn't go well. The military is pretty good at conducting post-project analysis. Corporations aren't very good at it, mostly because of the fear of admitting mistakes. In the military, however, mistakes cost lives. In corporations, mistakes can actually lead to advancement and promotions, because a lack of accountability means they're so rarely identified.