
Posts Tagged ‘Jeff Atwood’

Math, No. Set Theory, Yes.

2009.04.04

Jeff Atwood couldn’t be more right when he says:

I have not found in practice that programmers need to be mathematically inclined to become great software developers. Quite the opposite, in fact. This does depend heavily on what kind of code you’re writing, but the vast bulk of code that I’ve seen consists mostly of the “balancing your checkbook” sort of math, nothing remotely like what you’d find in the average college calculus textbook, even.

Exactly.  Programming – especially GUI-based web-centric software development of the sort that most people are up to these days – is much more a “right-brained” than “left-brained” activity.

Question for the group: is logic – especially set theory – more right- or left-brained?  Modern software development may not be highly mathematical, but it often requires heavy database design and optimization, where a strong aptitude in set theory is a big plus.

Categories: Software & IT

Coding Horror-ibly

2008.11.25

I love Jeff Atwood’s blog.  But sometimes, I think he’s smoking crack.

His latest post, “That’s Not a Bug, It’s a Feature Request,” gets it way wrong.  Regarding Bugs versus Feature Requests, Jeff writes:

There’s no difference between a bug and a feature request from the user’s perspective.

I almost burst out laughing when I read that.

Jeff goes on:

Users never understand the difference anyway, and what’s worse, developers tend to use that division as a wedge against users. Nudge things you don’t want to do into that “feature request” bucket, and proceed to ignore them forever. Argue strongly and loudly enough that something reported as a “bug” clearly isn’t, and you may not have to do any work to fix it. Stop dividing the world into Bugs and Feature Requests, and both of these project pathologies go away.

I’m not sure what kind of “users” Jeff has to support, but in my universe, there is a very clear difference between “Bugs” and “Feature Requests” – one that users understand perfectly well, and that Jeff would do well to learn.

The difference is simple:

  1. Functionality that was part of the specification (i.e. something the user paid for) and which fails to work as specified is a Bug.
  2. Functionality that was not part of the specification (i.e. something the user has not paid for yet) is a Feature Request.

I confess that the vast majority of my experience comes from designing, creating, and supporting so-called “backoffice” applications for companies like VeryBigCo.  And believe me, at VeryBigCo, users know what they need to get their work done, and demand it in the application requirements.  If the application does not meet requirements during UAT, or stops performing correctly in use, then the application is defective.

But what about shrink-wrapped apps?  Do users understand the difference between a bug and a feature request?  In a recent discussion thread on the Sonar forum, a group of users argued that a refund was in order because the company focused on “New Features” instead of “Fixing Bugs” from the previous version.

So we can lay to rest the question of whether users think of Bugs and Feature Requests differently.  They do.  And users expect fixing bugs to take precedence over adding new features.  The next question is: should “bugs” and “feature requests” be handled differently from the development point of view?

Let me answer that question with another question: should all items on the application’s to-do list be handled with the same priority?  Should Issue #337 (Need “Right-Click to Copy”) be treated with the same urgency as Issue #791 (“Right-Click to Copy” Does Nothing)?

Apparently, in Jeff’s world, they should.  Bug?  Feature Request?  It’s all just “stuff that we need to change,” so let’s just get right on that, shall we?  According to Jeff, if we just drop the distinction, then it all goes away.

But how can it?  In the first example, the application doesn’t have a “Right-Click to Copy” feature; it’s a good idea, so we should add it.  In the second example, we committed to providing a “Right-Click to Copy” feature, and the user paid for it, but it isn’t there!  Does Jeff really think the user is indifferent to these two situations?  Not the users I know.

There is yet another point that Jeff misses: continuous improvement.  If we aim to develop defect-free software, we have to start by understanding our defect rate.  A Bug is a defect; a Feature Request is not.  Ignoring the difference means ignoring the defect rate, which means you can forget about continuous improvement.

Furthermore, if you’re doing a really good job at continuous improvement, then you care about the distinction on a finer level, and for an even better reason – it helps you understand and improve three distinct categories (see the sketch after this list):

  • defects caused by failure to transform stated requirements into product (“development defects”, a.k.a. “bugs”)
  • defects caused by failure to capture requirements up front (requirements and analysis defects that arrive as “feature requests” but should be reclassified as “requirements defects”)
  • non-defect feature requests – good ideas that just happened to emerge post-deployment (the real “feature requests”)

The only valid point to emerge from the cloud of hazy ideas Jeff presents is that it is unconstructive to use the Bug versus Feature Request distinction as a wedge against users.  It is true that bedraggled development teams will often try to conceal the magnitude of their defect count by reclassifying Bugs as Feature Requests.  But that is a symptom of defective project and product management, not an argument against the classification itself.

I’m reminded of an episode of The Wire, in which the department tries desperately to reclassify homicides as unintentional deaths in order to lower the city’s murder rate.  Following Jeff’s reasoning, the better solution would be to drop the terms “homicide” and “manslaughter” altogether.  “Stop dividing dead people into Homicides and Involuntary Manslaughters, and both of these human pathologies go away.”

Right, Jeff?

Categories: Software & IT

The Valid Lure of The Next Big Thing

2008.02.27

Jeff Atwood writes about “The Years of Experience Myth” in his usual dead-on style:

Imagine how many brilliant software engineers companies are missing out on because they are completely obsessed with finding people who match – exactly and to the letter – some highly specific laundry list of skills.

Somehow, they’ve forgotten that what software developers do best is learn. Employers should be looking for passionate, driven, flexible self-educators who have a proven ability to code in whatever language – and serving them up interesting projects they can engage with.

Jeff goes on to make the point that you can use job requirements like “3-5 years of experience in such-and-such” as a baseline for determining the quality of the hiring company.  If they hire based on irrelevant (or counterproductive) measures of skill, chances are good that “the rest of the team will be stooges picked for the wrong reasons.”

Let’s take this a few steps further.

Consider the impact of this situation on emerging technologies, and a case can be made for why new, “special” technologies have the lifecycle they do.

I’ve often wondered why new technologies get the hoopla that they do.  Is C# really such a huge improvement over C++?  Is Ruby on Rails really such an extraordinary step up from PHP?  Sure, these technologies usually represent improvements – sometimes significant ones.  But by now we all should know that the real cost – and opportunity – of technologies lies in the human capital involved.  I for one will take an A+ PHP developer over a B- Ruby developer any day.

So why do new technologies get the buzz?

If the best programmers are learners, then it stands to reason that when a new technology is introduced, it will be the curious learners – the best programmers, in other words – who adopt it early.  It therefore follows that the first wave of solutions built on a new technology is often outstanding – not because the technology is so great, but because, by virtue of its newness, only the very best programmers are using it.

Once the technology has matured somewhat (and the requisite training curricula, certifications, and dead-meat programmers take their places in the market), productivity understandably falls off.  The quality and novelty of the solutions drop.  The buzz and hype fade.  And the best programmers, having been a part of this technology shift for several years at this point, start looking for something new to learn, and move on.

This offers some interesting lessons.

  • If you’re a software badass, then it really pays to stay on the leading edge of new technologies, even ones that don’t seem to offer a real benefit.  By doing so, you not only differentiate yourself from your peers, but you also will probably end up in the company of other badasses doing badass work.
  • If you’re a company looking to start some new development, consider using a new technology as a way of discerning and attracting talent.  Your risk – that of implementing an untried technology – may be lower than the risk of implementing an established technology with tired, unproductive, uncreative developers.
  • If you’re a software toolset provider, your marketing strategy should be based on finding the software badasses and convincing them to early-adopt your technology.  This may even dictate that you do not offer training or certification, and keep your documentation sketchy and terse – just the way badasses like it.
Categories: Software & IT