You know much less than you think you know. You misunderstand many more things than you think you do. You’re also much more wrong much more often than you think.
(Don’t worry, it’s not just you; the same is true for everyone else.)
Even better, this is how science works. Being a scientist is all about actively trying to be wrong (and proving everyone else wrong), all the time. When you do science, you start out not knowing, and whatever you learn along the way you never know for sure.
The scientific method
Here are the basic steps in the scientific method:
- Based on your own past experience and that of others, try and make some sense of a problem
- Try to find a reasonable explanation for the problem
- If the explanation is correct, what else would you be able to see or measure?
- Try to disprove the explanation by doing the observations and measurements
Scientists do this all day, every day; they do it together on a world-wide scale, and they do it to each other.
Experimentation
In uni, studying applied physics, I was trained in a specific application of the scientific method to experimentation, which went something like:
- Define a question to answer.
- Define what you already know (or will assume) that is related.
- Form a hypothesis of what the answer may be.
- Figure out what you can measure.
- Define how those measurements could be interpreted to verify or disprove the hypothesis.
- Do the experiments and collect the measurements.
- Analyze the data.
- Assert the internal consistency of the experimental data by applying statistics.
- Draw conclusions from the analysis.
The course was called Introduction to Experimentation, and it included many more specifics than just that process. For example, it was also about teamwork basics, the use of lab journals, safe lab practices, how to think about accuracy and precision, and quite a lot of engineering discipline.
The course was nearly completely free of actually interesting math or physics content. For example, the first two 4-hour practicums of the course centered around the measurement of the resistance of a 10 ohm resistor. Some of the brightest 18- and 19-year-olds in the country would leave that practicum feeling properly stupid for the first time, very frustrated that they had “proven” the resistor to have a resistance of 11 ± 0.4 ohm (where in reality the resistor was “known” to be something like 10.000 ± 0.001 ohm).
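For the curious, here is roughly what the statistics step of that practicum looks like. This is a minimal sketch in Python with invented readings (the real lab data is long gone), and it illustrates exactly the painful lesson above: statistics can tell you how precise your measurements are, but says nothing about a systematic error shared by all of them.

```python
import math

# Hypothetical repeated readings of a "10 ohm" resistor, in ohms.
# A systematic error (say, uncompensated lead resistance or a
# miscalibrated meter) shifts every reading up by about 1 ohm;
# the random scatter is all that the statistics below can see.
readings = [11.5, 9.8, 12.3, 10.2, 12.0, 9.9, 11.6, 10.7]

n = len(readings)
mean = sum(readings) / n

# Sample variance and standard error of the mean: these quantify
# precision (scatter between readings), not accuracy (closeness
# to the true value).
variance = sum((r - mean) ** 2 for r in readings) / (n - 1)
std_error = math.sqrt(variance / n)

print(f"R = {mean:.1f} +/- {std_error:.1f} ohm")
# -> "R = 11.0 +/- 0.3 ohm": an internally consistent, perfectly
#    defensible result that is still wrong, because no amount of
#    statistics can detect a bias shared by every measurement.
```

That gap between the honest-looking error bar and the true value is what made those students feel stupid, and it is the whole point of the exercise.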
The art of being wrong
Teaching that same course (some 2 years later) has turned out to be one of the most valuable things I’ve ever done in my life. One of the key things that students learned in that course was that the teacher might not know either – after all, a lab is a strange and wonderful place, and voltmeters can in fact break! The teacher in turn learned that even when teaching something seemingly trivial it is possible to be utterly wrong. Powerful phrases that I learned to use included “I don’t know either”, “You are probably right, but I really don’t understand what’s going on”, “Are you sure?”, “I’m not sure”, “How can you be so sure?”, “How can we test that?”, and the uber-powerful “Ah yes, so I was wrong” (eclipsed in power only by “Ok, enough of this, let’s go drink beer”).
This way of inquisitive thinking, with its fundamental acceptance of uncertainty and of being wrong, was later amplified by studying things like quantum mechanics, with its horrible math and even more horrible concepts. “I don’t know” became my default mind-state. Today, it is one of the most important things I contribute to my work environment (whether it is software development, project management, or business analytics doesn’t matter) – the power to say “I don’t know” and/or “I was wrong”.
For the last week or two I’ve had lots of fun working closely with a similarly schooled engineer (he really doesn’t know anything either…) to try and debug and change a complex software system. It’s been useful staring at the same screen, arguing with each other that really we don’t know enough about X or Y or Z to even try and form a hypothesis. Communicating out to the wider group, I’ve found that almost everyone cringes at the phrase “we don’t know” or my recent favorite “we still have many unknown unknowns”. Not knowing seems to be a horrible state of mind, rather than the normal one.
Bits and bytes don’t lie?
I have a hypothesis about that aversion to the unknown: people see computers as doing simple boolean logic on bits and bytes, so it should be quite possible to just know everything about a software system. As systems grow bigger, all that changes is that there are more operations on more data, but you never really stop knowing. A sound and safe castle of logic!
In fact, I think that’s a lot of what computer science teaches (as far as I know; I never actually studied computer science in university, I just argued a lot with the computer so-called-scientists). You start with clean discrete math, and through state machines and automata and functional programming you can eventually find your way to the design of distributed systems and all the way to the nirvana of artificial intelligence. (AI being much better than the messy biological reality of forgetting things and the like.) Dealing with uncertainty and unknowns is not what computer science seems to be about.
The model of “clean logic all the way down” is completely useless when doing actual software development work. Do you really know which compiler was used on which version of the source code that led to the firmware that is now in your RAID controller, and that there are no relevant bugs in it or in that compiler? Are you sure the RAM is plugged in correctly in all your 200 boxes? Is your data centre shielded enough from magnetic disturbances? Is that code you wrote 6 months ago really bug-free? What about that open source library you’re using everywhere?
In fact, this computer-science focus on logic and algorithms, with its high appreciation of building systems, is worse than just useless. It creates real problems. It means the associated industry sees its output in terms of lines of code written, features delivered, etc. The most revered superstar engineers are those that crank out new software all the time. Web frameworks are popular because you can build an entire blog with them in 5 minutes.
Debugging and testing: that’s what people who make mistakes have to do. Software design is a group activity, but debugging is something you do on your own, without telling anyone that you can’t find your own mistake. If you are really good you will make fewer mistakes, will have to spend less time testing, and so produce more and better software more quickly. If you are really really good you might do test-driven development, and with your 100% test coverage you just know that you cannot be wrong…
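To make that last bit of sarcasm concrete, here is an invented, minimal sketch of why 100% test coverage proves much less than it feels like it does: every line of the function below is executed by its test, and the function is still wrong for half of its possible inputs.

```python
def absolute(x):
    """A deliberately buggy 'absolute value' function."""
    return x  # bug: should return -x when x < 0


def test_absolute():
    # This single test executes every line of absolute(), so a
    # coverage tool would report 100% line coverage for it.
    assert absolute(3) == 3


test_absolute()
print("tests pass, coverage is 100%, absolute(-3) is still", absolute(-3))
```

Coverage measures which lines ran, not whether your hypothesis about what they do is right; the test above never even asks the interesting question.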
The environment in which we develop software is not nearly as controlled as we tend to assume. Our brains are not nearly as powerful as we believe. By not looking at the environment, by not accepting that there is quite a lot we don’t know, we become very bad at forming a reasonable hypothesis, and worse at interpreting our test data.
Go measure a resistor
So here’s my advice to people who want to become better software developers: try and measure some resistors. Accept that you’re wrong, that you don’t know, and that you don’t understand.