Monday, August 20, 2012
The Science of Development
Despite the study of computing being termed "Computer Science", most working software developers will agree that development is more art than science. In most application code, the hard decisions are not which algorithm to use or which data structure makes the most sense. Rather, they tend to focus on how to build out a big system in a way that will let lots of people contribute to it, how to build a system such that when you go back to it you can understand what's going on, and how to make things that will last longer than six months. While occasionally those decisions have a "right" answer, more often they are the type of judgment calls that fall into the art of computer programming.
There are, however, two areas in which science, and by science I mean careful observation and logical deduction, plays a very important role: debugging and performance analysis. And where developers are often good at the art of programming, many of us fall short on the science.
I've been meaning to write a blog post about what makes a person a truly great debugger. This will not be that post, but in thinking about the topic I polled some of the people I believe are great debuggers, including my longtime mentor. His response was interesting and boiled down the concepts very succinctly: good debuggers have a scientific bent. Some people, when faced with a bug, will attempt to fix it via trial and error; if changing the order of two statements makes the threading problem go away, then that change must have fixed the bug. Good debuggers, on the other hand, know that errors have a root cause that can be tracked down, and they are able to break down the problem in a methodical way, checking assumptions and observing behavior through repeated experiments and analysis.
It's a wise point, and one that I must agree with. If I observe myself debugging hard problems, a few things stand out. One, I always believe that with enough patience and help I can in fact find the root of the problem, and I pretty much always do. Two, when I dig into a tough problem, I start keeping notes that resemble a slightly disorganized lab notebook. In particular, when debugging concurrency problems in distributed systems, to get anywhere sensible you have to observe the behavior of the systems in the correct state and in the problem state, and use the side effects of those states (usually log files) to build hypotheses about the root cause of the problem.
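To make "repeated experiments" concrete, here is a minimal sketch, in Python, of what that methodical loop can look like for a suspected race condition. The shared-counter example and every name in it are hypothetical stand-ins, not code from any real system; the point is that you isolate the suspect behavior, run it many times with and without the proposed fix, and compare failure counts instead of declaring victory after one lucky run.

import threading
import time

def run_trial(num_threads, use_lock):
    """Run the suspect operation once under concurrency; True if the final state is consistent."""
    balance = {"value": 0}
    lock = threading.Lock()

    def deposit():
        if use_lock:
            with lock:
                balance["value"] += 1
        else:
            current = balance["value"]
            time.sleep(0)  # yield the thread, widening the read-modify-write race window
            balance["value"] = current + 1

    threads = [threading.Thread(target=deposit) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return balance["value"] == num_threads

def experiment(trials=200, num_threads=16):
    # Compare failure rates with and without the hypothesized fix (the lock).
    for use_lock in (False, True):
        failures = sum(1 for _ in range(trials) if not run_trial(num_threads, use_lock))
        print(f"lock={use_lock}: {failures}/{trials} trials ended in an inconsistent state")

if __name__ == "__main__":
    experiment()

Run enough trials and the numbers either support the hypothesis or they don't; either way, you've learned something worth writing in the lab notebook.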
I've also been thinking lately about this scientific requirement in performance analysis and tuning. We're going through a major performance improvement exercise right now, and one of the things that has caused us challenges is a lack of reproducible results. For example, one of the first changes we made was to improve our CSS by converting it to SCSS. We were relying on New Relic, our site metrics monitor, to give us an idea of how well we improved our performance. This caused us a few problems. One, our staging environment is so noisy due to network conditions and load that performance changes rarely show up there. Two, New Relic is not a great tool for evaluating certain kinds of client-side changes. Lacking any reliable evaluation of performance changes in dev/staging, we made a guess, that SCSS would be better, which took over a week to pan out and resulted in inconclusive measurements. We were looking for luck, but we needed science, and our lack of scientific approach hampered our progress. Now we have taken a step back and put reproducible measures in place, such as timing a set of well-known smoke tests that run very regularly, and using GTmetrix to get a true sense of client-side timings. Our second performance change, minified JavaScript, has solid numbers behind it, and we finally feel certain we're moving in the right direction.
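For the curious, a reproducible timing harness doesn't have to be elaborate. Here is a rough Python sketch, with hypothetical URLs and run counts rather than our actual smoke tests or tooling, that times a fixed set of requests and reports median and 90th-percentile latencies, so that before/after comparisons rest on distributions rather than a single noisy sample.

import statistics
import time
import urllib.request

SMOKE_URLS = [
    "http://localhost:8080/",  # hypothetical endpoints standing in for real smoke tests
    "http://localhost:8080/search?q=dresses",
]

def time_request(url):
    """Time one full request, including reading the body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # include transfer time, not just time to first byte
    return time.perf_counter() - start

def run_smoke_timings(runs=30):
    for url in SMOKE_URLS:
        samples = sorted(time_request(url) for _ in range(runs))
        p90 = samples[int(0.9 * len(samples)) - 1]
        print(f"{url}: median={statistics.median(samples):.3f}s p90={p90:.3f}s")

if __name__ == "__main__":
    run_smoke_timings()

Run the same harness before and after a change, under the same conditions, and the comparison means something; run it once against a noisy staging environment and you're back to looking for luck.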
Is there art in debugging and performance tuning? Well, there is the art of instinct that lets you cut quickly through the noise to the heart of a problem. Well-directed trial and error (and well-placed print statements) can do a lot for both debugging and performance analysis. But getting these instincts takes time, and good instincts start from science and cold, hard, repeatable, observable truths.
Sunday, August 12, 2012
Being Right
One of the stereotypes of the computer industry is the person who knows they're right, and doesn't understand when it doesn't matter. I have been this person. I once got into a knock-down-drag-out fight with another engineer over the decision to make an API I had written accept case-sensitive data. He absolutely insisted that it must, even though pretty much any use that would require case sensitivity would almost certainly mean abusing the system design for nefarious purposes. We argued over email. We argued over the phone. I would be damned if I would give in.
At some point my boss stepped in and told me I had to change it, because this person was one of the most important clients of the API and he didn't want strife between our teams. I was furious. I yelled at him. I told him we shouldn't negotiate with terrorists. It didn't make a difference; his decision was final. I made the change, and the other developer proceeded to abuse the system in exactly the ways I had predicted. But he also used it, which was all that mattered to my boss and, in retrospect, was more important for the success of the project than my sense of rightness.
As a manager, I find myself on the other side of the fray, as I negotiate with one of my own developers over doing something "right". The time it takes to do something "right" simply isn't worth the fight I would have to have with product, or analytics, or other members of the tech team. It's not worth the time we would spend debating correctness. (At this point, it's usually ALREADY not worth the time we've spent arguing with each other). It's not worth the testing overhead. It's not going to move the needle on the business.
No matter how much I say "you're right, but it doesn't matter," it doesn't sink in. They don't believe me when I say that I know they're right, that I agree with their technical analysis, but that it's not enough to change my mind. They tell me that they understand that technical decisions are sometimes a series of non-technical tradeoffs, but in this case, why can't I see that this is just the right way to do it?
Here's what I wish my boss had spelled out to me, and what I hope I can explain to my own developers when we run into such conflicts in the future: I know where you are coming from. I have nothing but immense sympathy for the frustration you feel. I know that it seems like a trivial thing, why does that other team feel the need to insist that it would take them too much time to integrate, that they don't want to test it, why does that idiot say that it MUST be case-sensitive? Someday, you will be in my shoes. You will be worn down from fighting for right, and you will be driven to produce the best results with the most consensus and not distract everyone with the overhead of debating every decision. And you'll probably smile and think, well, that dude abused my system but he also drove adoption across the company, and I still resent my boss for not fighting for me, but I understand where she was coming from.
Wednesday, August 1, 2012
Growing New Leaders: A Modest Proposal
I've often thought that the startup industry has a leadership problem. It has a leadership problem thanks to its virtues: small companies, agile processes, minimum viable products. The dirty, tiring work of people leadership is often at odds with the hustle of startup life. And yet we as an industry need to make space for that work, because we're starving for leaders and hurting our companies due to this lack of leadership.
Part of the problem is the myth of the natural leader. We'll promote from within, the thinking goes; just take the smartest engineer who can communicate adequately and make them team lead. But there is very little that is natural about team management. Management is more than just being articulate in meetings and giving a good presentation now and again. It's learning how to communicate with stakeholders in their language. It's figuring out how to resolve conflicts between personalities on your team, even when one of the personalities is your own. It's the detail work of project planning for two people over the next month and twenty people over the next year. Heck, people go to business school and spend a good chunk of time studying these issues. We often disparage the MBA, but we do very little to replicate that skill channel in our own industry.
How does one end up with these abilities? Well, some read books and try to muddle their way through, with varying degrees of success. An ambitious person might reach out to external parties for help, people they know have been there and can provide advice. But both of these channels pale in comparison to learning leadership in an apprenticeship model. I know because I have tried books and advisers myself, but in the end I've learned most of my leadership skills thanks to hours of one-on-one time with current and former bosses, walking through presentations and plans, discussing personnel issues, perfecting pitches. I'm still learning and growing these skills, having moved from mentoring one or two people, to managing a small team, to working as a director of engineering and technical architect for a growing company. And now it is time for me to grow my own leaders. I've written before about the importance of training your replacement; I wish that as an industry we would all take this step seriously.
Here's my proposal: let's stop promoting people to the role of senior engineer who have never had the experience of hands-on direction of people and projects. If someone has five to ten years of experience writing software in teams and has never had the responsibility of mentoring junior developers in a meaningful way, they are lacking something in their skill set. Let's give them the tools they need to be successful as leaders, taking time from the hustle to teach planning, managing, and communicating across teams. This experience might be just a short stopover between one individual contributor position and a more senior one, but it's a necessary part of that career path. If we could establish a standard across our industry that everyone called senior developer had this experience in mentoring and management, we would have something to build on. It would be very unlikely for a company of 20 people to have no one with prior leadership experience. We could successfully grow our leaders from within, knowing that we have a small core of people who have already learned the fundamentals of management.
This training isn't free. It will cost us time that we might think should be spent towards the hustle of delivering. But we know that software is humans, and we know that the payoff for delivering is bigger teams, bigger companies, and more responsibilities to manage. If the title of senior engineer recognizes that you can deliver, it should also recognize that you can handle what happens when you do.