2014-05-31

Talking to 6th graders about careers in technology

For the final week before summer vacation, Washington Middle School teacher Daniele Albrecht invited me and several others to Career Week to speak to 6th graders about what we do for a living. The chance excited me because I often reflect on my own childhood, back when I was beginning to think about technology as a job; having accessible people around who are willing to share their experiences, guidance, and knowledge can be a powerful motivator.

I did two sessions, and the students were top-notch in both (the teachers Bill Spradley and Bill Ethridge were excellent hosts as well). They seemed genuinely interested in what it takes to do what I do; I could tell because we ran over the allotted time with more questions than I could answer. We talked about things like:
  • "What's your typical day look like?"
  • "How much do you make?" - one of the first questions both times, but I didn't mind giving a ballpark figure; mainly because I remember hearing as a child how much doctors made; that left an impression on me and motivated me to get a good job actually
  • "What can I do to learn and start programming?"
  • "Can I code games at home?" - a good question actually; that's how I got started a Commodore 64!
I stuck to a simple narrative: there are companies who want to pay people to use technology to make more money, and I laced it with examples about making video games, building robots, creating mobile apps, etc. (things they could relate to). I think it worked out really well. I could see a little bit of myself in some of the students' reactions; you could tell I got through to a few of them.

The key things I wanted them to walk away with were:
  • Start tinkering and self-learning now
  • Pursue a degree
  • Most of all, apply yourself
Lastly, Jimmy Jacobson's (@jimmyjacobson) post today in /r/programming reminded me to blog about this. In hindsight, it would have been useful to have fun illustrations to share while I talked. Jimmy sets a creative example to follow; fork and modify it for yourself if you ever have the opportunity to do this!

Thanks to the teachers who organized this and to the students who listened and wrote me this nice thank-you card!




2014-05-23

What a radioactive leak can teach us about avoiding blame culture

The Verge published a notable article today titled "Radioactive kitty litter may have ruined our best hope to store nuclear waste". It's a well-reported story by Matt Stroud (@ssttrroouudd) about the New Mexico Waste Isolation Pilot Plant (WIPP) and how a seemingly banal procedural human error resulted in the shutdown of the site and jeopardized the facility's future in radioactive waste disposal.

More interesting is the teachable moment in all this about avoiding blame culture. It's easy to react quickly and suggest that the human who made the error should be punished, perhaps heavily fined, fired, or worse (some of the comments on the article suggest just that, and even Jim Conca, a PhD and ex-geologist at WIPP whom Matt interviewed, suggested the same but later backtracked). Matt jumped in to reply to a comment suggesting the offender be jailed, offering an insightful rebuttal from Per Peterson, a professor in UC Berkeley's Department of Nuclear Engineering with whom Matt exchanged emails for the article. In the comment, Matt relays what Peterson had to say:

"The natural tendency in events and accidents is to focus on assigning blame and punishing human errors. This approach is generally ineffective because human error happens. The critical issue for safety is to design systems which are tolerant of human error and which encourage reporting of problems and errors and effective corrective action."

He's absolutely right about this. It's applicable to so many other industries, like health, construction, and banking, but it's especially relevant to my line of work: IT. Major mistakes are not uncommon in software development: a small coding error brings down a system, or an incorrect infrastructure config causes downtime in the middle of the night. It's hard not to react with blame top of mind when these moments happen.

Peterson suggests we eschew the natural reaction and instead design processes that account for the possibility of human error AND promote feedback loops that allow for process improvement. In IT, the former can manifest as infrastructure automation (Chef, Puppet, etc.) and continuous integration with good test coverage, while the latter shows up as DevOps culture, blameless post-mortems, and the like. Making this the default mindset in a company really comes down to culture and how this type of behavior is rewarded and encouraged. I know I don't always practice building a blameless culture myself, but stories like this and advice like Peterson's remind me of the importance of doing it.
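
To make the "tolerant of human error" idea concrete, here's a minimal sketch of the kind of guardrail a CI pipeline might run before a deploy. Everything in it (the config keys, the limits, the validate_config function) is hypothetical, made up for illustration; the point is the pattern of catching an inevitable typo in the pipeline rather than blaming the person who made it:

    # Hypothetical pre-deployment guardrail: treat human error as expected
    # input and block the deploy, instead of letting a typo take down prod.

    def validate_config(config):
        """Return a list of human-readable problems; empty means safe to deploy."""
        problems = []
        pool = config.get("db_pool_size")
        if pool is None:
            problems.append("db_pool_size is missing")
        elif not 1 <= pool <= 100:
            problems.append("db_pool_size %s is outside the safe range 1-100" % pool)
        if config.get("environment") == "production" and config.get("debug"):
            problems.append("debug mode must be off in production")
        return problems

    if __name__ == "__main__":
        # A fat-fingered pool size of 1000 gets caught here in the pipeline,
        # not at 3 a.m. when the database runs out of connections.
        candidate = {"db_pool_size": 1000, "environment": "production", "debug": False}
        for problem in validate_config(candidate):
            print("BLOCKED: " + problem)

The check doesn't care who wrote the bad value; it just reports the problem and stops it from shipping, which is exactly the kind of error-tolerant design Peterson is describing.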