I’m My Own Worst Enemy

I Can Code

I’m something of a freak, though certainly not unique, in that I started my career in software as a developer and am now far more on the Design/UX side of things. Not only that, but I was one of those untrained, uneducated ones with no CS degree who jumped on the dot-com-bubble wagon to break into the industry. For years, I worked my way up through the ranks, learning mostly on the job, with a smattering of self-study here, a conference or training there.

So it was only natural to me, when given the opportunity, to jump on the UX bandwagon several years ago. Again, I find myself a foreigner with no formal Design or HCI or library science or psychology education (but hey, I did take Psych 101 in college!). But I’ve done a good bit of self-study, here and there under the mentorship of “real” UX pros (ya know, the ones with the PhDs and MAs and such). I now have several years of experience under my belt, and I had something of a unique opportunity working on Indigo Studio, an interaction design and prototyping tool, to really research and study the discipline/practice/field of UX and Design. That opportunity has given me a lot of exposure and insights into UX/Design that I know I wouldn’t otherwise have.

UX for Devs Manifesto

Along the way, I’ve been (and still am) an advocate for developers spending time and effort learning about and practicing, when necessary, UX principles, techniques, and processes. This is because today, still, the vast majority of software being built does not involve UX professionals, or if it does, their role is often minimized and marginalized (they have to fight for their lives, or at least for good UX). In the end, devs, being the ones who build the stuff, have dramatically more influence in most cases over the actual UX of software. This is not going to change anytime soon. Maybe not ever. It’s reality. Deal with it.

And yet! And yet, despite my position advocating for devs “doing UX” and despite my bad example as a dev turned UX guy, I have learned enough in these many years to know that designers are indeed a different sort of animal. They think differently about problems than devs–significantly so. They employ different approaches when tackling issues. Heck, there’s a whole thing called “design thinking” that has sprung up around this notion. And not only that, just like every other professional, professional designers learn and hone their expertise over years as they practice. It really is a profession, a discipline, a field of expertise.

And that brings me to the image at the top of this post. (This is an actual shirt I made at one of the T-shirt sites.) It’s funny on a superficial level, and on the level that people can remember actually writing BASIC programs like that. But the underlying thing is–does being able to write lines of code really mean “I can code!”? Does that put me on the same level as an experienced and (possibly) formally educated developer?

That’s patently and obviously absurd–being able to perform one (or even some) of the basic functional activities involved in a profession does not make one a professional in that discipline. Do people who can give themselves shots claim to be doctors? Do people who represent themselves in court for a traffic citation claim to be lawyers? Does being able to shoot a firearm make me a policeman? Obviously not! I can install GFCI outlets like nobody’s business, but I hardly consider myself an electrician.

And yet this is precisely the attitude taken by those who dabble in Design. Being able to sketch UI ideas on a whiteboard doesn’t make someone a UX professional, but you’d never know that by the way people are eager to second guess and criticize professionally designed UIs or by the way they clearly think that their opinion on a UX design is as weighty as a seasoned UX design veteran.

This is particularly troublesome when dealing with what I’ll call “design smell” (lifting the concept of “code smell” from the dev world). Sometimes–often–an experienced designer can consider a design and immediately tell there is something off about it. Sure, there are heuristics, and principles, and testing, and metrics, and so on that can give definition and language to talk about the bad smell, but not always, certainly not always to everybody’s satisfaction.

Maybe they subconsciously recognize things about it that are just asking to go wrong, based on their distilled knowledge, skills, and experience. Or maybe they had more exposure to what went into a particular design–so what is being considered is something they already explored or something conflicting with their sense of propriety for the problem at hand in the context it is in. The bottom line is, maybe the designer can explain it in a way that resonates convincingly with others and maybe not. But sometimes you just gotta defer to their judgment and rely on their expertise. 

My Ivory Tower

Now don’t mistake me. I’m not suggesting the all-powerful designer dictating from on high, but judging from a very common theme amongst designers–complaining about how everybody thinks they’re a designer–I think it is safe to say we are squarely on the other end of that particular spectrum, especially in software. People–particularly those who consider themselves smart and talented (and maybe are)–naturally assume that if something doesn’t make sense or resonate with them, then it must be wrong.

And yet, when those same people think about it in terms of their own expertise (or other fields of expertise they know even less about), they obviously can see that people should respect their expertise and defer to them. Something is broken here. What I’m suggesting is that those who work with designers need to keep this in mind and move the needle further toward the respecting-designer-expertise side of the spectrum.

The fact that I have made this transition from dev to UX pro makes me my own worst enemy in this argument, though. What folks may not think about is that this multiyear, full-time professional journey has really warped my thinking. I may not yet (or ever) be in the same mindset as a formally educated, “untainted” designer, but I have had a lot of time and opportunities that have changed my thinking and given me a lot more Design knowledge and experience to draw on than I had before I started this journey.

And despite that, I maintain what I consider to be a healthy seed of doubt as to my own tendencies when it comes to tackling problems and designing solutions using, shall we say, a pure/disciplined Design approach. And if I still do that after years of practice, self-study, and mentoring, maybe folks who lack pretty much any background in UX should doubt their own design skills and defer to experienced designers?

LOL. Who am I kidding?!? Everyone’s a designer, right?

My Job is Bigger Than Your Job

Superhero Me!

I’ve been in software for a while now. I started out on the dev side of things, and within that are the folks who like to use the term “architect” to connote how they have the “big” view of things. There were those who advocated for the “architect” role to be more and more involved in business, move up the chain, etc. Because of course, they are uniquely suited to help the business achieve their goals.

Now I’ve been in “UX” for a while. And there are those who like to use the title “architect” there as well. And yes, there are those who advocate for UX/Design to move up the business chain. Because of course, designers are uniquely suited to help the business achieve their goals. UX is everything of course. (That is a truism as far as I’m concerned.)

I’ve also interacted with folks on the branding side. They may not use the term “architect,” but they do have this sort of “brand is everything” kind of mentality, and of course, they also are uniquely suited to help businesses achieve their goals.

So I had to chuckle today when I read in this article about the “website architect” role that “goes beyond — or rather encompasses — the user interface, user experience, and information architecture of the site” and “needs to have a solid understanding of usability, in-depth knowledge of web development tools, online marketing technologies, and everything else involved in the construction and maintenance of a website.” Okay, so now “website architecture” is everything, and doubtless this role is also uniquely suited to help businesses achieve their goals.

Anybody see a trend?

The funny thing is, to an extent, they’re all right. Considered from certain vantage points, they all are uniquely suited. Any good CEO would want to leverage that unique special sauce from each one.

On the other hand, they can’t all be “everything.” They can’t all “encompass” each other. That aspect of it seems to be simply power grabbing–my discipline, my job, my title is “bigger” than yours. It encompasses yours. It is more important than yours.  (And thus I should have more say in/control over what gets done.)

There was a time in my life when I was enchanted with the term “software architect.” I admit. I used it. But I have become increasingly disenchanted with it, as I see more and more people grabbing at the “architect” title and crafting a role with it, more or less as a way to say, “my job is bigger than yours, more important, more encompassing, and thus what I think/say is more valuable to the business.”  Now if someone tells me they’re an architect, I sorta cringe.

Everybody needs to just slow down and take a breather. How about we each acknowledge each other’s distinct special sauce and work together to make better stuff? We need mutual respect. We don’t need to imagine that our expertise supersedes and encompasses others’ expertise in order to be valuable and meaningfully contribute. Of course, the saving grace here is that in the end, it is the business itself that is in charge, and a good business leader will do just that–surround themselves with folks with expertise in all these areas, encourage cross-discipline teamwork, and help ensure that everybody is moving along towards the same shared vision to achieve their goals.

It Feels Good to Know and Do Things

He-Man says I have the Power!

Every so often another article appears somewhere advocating creating prototypes by coding. There are many drawbacks in doing that, not the least of which is simply wasted time–time spent dorking around with code that would be better spent evaluating, iterating, and synthesizing design ideas. In response to one such article, I penned “Yes, Ditch Traditional Wireframes, But Not for Code” that goes over the various drawbacks.

Prototyping is Hard
I suspect part of the reason people want to jump into code is a misunderstanding about what a prototype needs to be. When you say “prototype,” many people think of something like a near full-on app simulation–they worry about whether or not it is responsive–or at least carry some latent idea that prototyping is time consuming and involved. This does not have to be the case; in fact, I would suggest it is not good if that is the case for the most common prototyping needs–the ones that enable you to explore interaction designs and find the best one.

Prototyping Tools Are Hard
Another part of the problem, related to this weighty idea of what a prototype must be, is that most prototyping tools are themselves time consuming to learn and use, even if you don’t want to build a particularly deep, complex prototype. That is a core problem we have tried to address with Indigo Studio; we focused on the idea of sketching prototypes–that is, making the creation of a prototype as easy and simple as sketching out ideas on paper or a whiteboard (and even faster than that).

You’re Just Biased
Now, some have said, “Ambrose, you only advocate code-free prototyping because you have a vested interest in hawking Indigo Studio.” Well, leaving aside that this would be an ad hominem fallacy, I will first point out that Indigo Studio v1 is totally free of charge, and that you can keep it forever–you never have to upgrade. Everything I advocate for is essentially contained in the free version, so I have little to gain. I am also not saying Indigo Studio is your only code-free option; I just happen to think it is the best. 😉

Second, I invite anyone to spend the amount of time it takes to become effectively familiar with any code-based prototyping framework. Then spend the same amount of time familiarizing yourself with Indigo Studio. I kid! You needn’t spend anywhere near that much time to become effective with Indigo!

And once you are passingly capable with both tools, do a head-to-head challenge, starting from zero. I guarantee that in the time it takes you to just get a project environment set up with your favorite prototyping framework, you will already have created a working prototype in Indigo. It’s just that fast and easy.

Nope. It Really is More Efficient and Effective for Design Exploration
What I’m saying is that, essentially, by any objective measure, it will be faster to create prototypes that are good enough for evaluation in a tool like Indigo. Not only that, Indigo helps keep you from being unnecessarily distracted with unimportant details, while coding does the opposite. Indigo also helps you stay focused on users and their concerns, while coding does the opposite. 

Now granted, there are exceptional circumstances, but I’m talking about a general rule here. If nothing else, one doesn’t need to invest a lot to sketch prototypes with Indigo, so you don’t lose much if you find that, for whatever reason, Indigo does not suffice for your evaluation/design exploration. The inverse is absolutely not true of coding frameworks.

It Feels Good to Know and Do Things
Given all this, I have been thinking about why people would still cling to the idea that jumping right into a coded prototype is the best way to go, as a rule, for designing. I think at least part of it, if not a large part of it, has to do with simply feeling more knowledgeable and competent.

There is a certain satisfaction that comes with possessing arcane knowledge (like how to code)–one joins the ranks of the elite designers who can code. There is also a certain sense of accomplishment in using that knowledge, struggling with code, and coming out on top in the end (assuming you do come out on top and don’t walk away defeated). It’s like He-Man–by the power of code school, I have the powerrrr!

As someone who first learned to code and worked for years as a professional developer, and then learned to design as a professional interaction designer, I can relate. (I can also, thereby, speak from experience and not ignorance that coding prototypes is as a rule a less effective starting point for design exploration.) The challenge for those who can code is to ensure that we are making choices for what is best for the design problem at hand, and not what is best to stimulate our own sense of empowerment and accomplishment.

It can be fun to code–especially when you are new to it. It’s similar to making cookies from scratch, the way grandmama used to make them, instead of just buying the pre-made dough you break apart. That’s fine when it’s for our own entertainment and enrichment, but when we’re being paid as professionals to design the best thing we can, as effectively and efficiently as possible, we should probably think twice about taking the slow prototyping approach just because we enjoy it more.

There Is Satisfaction in a Job Well Done
And that’s not to say that there is no enjoyment in using code-free tools. It’s just a different kind of enjoyment and satisfaction, one that comes from feeling more efficient and effective in solving design problems rather than coding problems.

I am not saying definitively that one should never code a prototype–far from it. But I am concerned about this trend in the software design community of advocating, out of enthusiasm for one’s own coding skills, that coding is somehow a better or more effective way of doing design work. Most of the reasons given for doing so miss the mark on design/human concerns, all the while ignoring the many hidden drawbacks.

The rule should be to avoid coding except when you are fairly sure it is the only or most effective way to prototype your design ideas.

 

It’s Time to Grow Up

Evolution of Man to IT

As commentary on the latest brouhaha in IT over sexism has gotten the notice of the New Yorker, it strikes me that we’re at an interesting point in the IT industry. It has become axiomatic that (high) technology is everywhere these days. It is becoming increasingly ubiquitous and increasingly invisible (both of which are good things in my book). So it seems to me that the adolescent, socially incapable stereotype of the computer nerd can no longer be safely hidden away in the basement. As technology becomes more and more a part of the social fabric, so do those who make that fabric capable. We are no longer inconspicuous mice intruding on the reality of the greater world.

As such, it is becoming increasingly important for us to grow out of our adolescence. Many of us have been told that we are “smart” (because we understand and work with technology). We have been told in stories many times that all those popular, socially capable kids who picked on us during our school years will be sorry, not because we are a threat as such, but because we will, through our intellectual pursuits in practical and productive sciences, be the ones making the big bucks. We are Bill Gates. We are Steve Jobs. We are Steve “The Woz” Wozniak, to name a few of the geek-turned-billionaire heroes.

Part of that narrative is that we can be successful without needing to adhere to normal social rules. We don’t have to strive to be popular, friendly, or even socially conversant at all, because we can still be successful without that, and goshdarnit, we are smarter than all of those popular people anyways! “Idiots! Gah!”

So, many tech heads relish that narrative, because it reinforces what we value about ourselves (that we are so much smarter than everyone else), while freeing us from all that troublesome behaving like civilized people in a civilized society–stuff we were never good at anyways, as was pointed out to us painfully while growing up.

And embracing that narrative has mostly worked, for a time. Yes, some of us have to play nice in the highly regulated corporate bubbles we work in, but in as much as we can isolate ourselves from all that, we do. And often, the “business” wants to isolate us, too. After all, they don’t want us to embarrass them in front of customers. At least most of us in corporate environments have to learn some basics on how to play nice, or we won’t survive. It may gall us, but we swallow it for practical reasons.

On the other hand, as far as the stereotype goes, the open source community has more factors going against it, especially certain strains of ideology within it. It collects anti-establishment mindsets together and gives them a sense of camaraderie when otherwise most would be loners, socially speaking. The internet has made it possible to more or less live in something like a virtual world (more or less literally depending on the person). These mostly online communities are a major echo chamber where members are safely isolated from the rest of the world. They don’t even have to learn to pretend to play nice in a corporate environment, and so it is no surprise that most of the public snafus about sexism show up at conferences loaded with these types, and not in the more corporate/commercial conferences run by the big companies. Plus, it doesn’t help that they are mostly young, and mostly single.

But as I said before, this bullshit has to stop. Some of the comments I got in response to that post are telling. One person said he didn’t see what all the fuss is about–there are laws against sexism in the workplace–as if that settles it, completely ignoring all the evidence that these laws do not in fact settle it. This is a cultural issue. And more to the point, when there is a culture in which anti-establishment bias is a major undercurrent, such laws are largely irrelevant. And this is what we’re seeing here–the IT culture, the IT narrative, is perpetuating sexism in forms that were driven out of other sectors’ work cultures decades ago.

I can’t tell you the number of passing comments I’ve heard over the years–and this mostly amongst the corporate and indie consultant IT crowd, the ones who should know better–that insinuate or outright claim that women are just not suited to IT work, that they don’t have the brain for it, that they aren’t smart enough for it. Wow. Just wow. Talk about a throwback. I mean, nobody ever said that about women in other industries either, right? Somehow IT is special, right?

I tell you, guys, as a relatively successful person who has been in IT for a good long while: this is simpleminded, naive, utter hogwash. It is nonsense. It is a heinous lie. It is begging the question to the nth degree. The stupidity of the reasoning is obvious: 1) There are very few women in IT. 2) Therefore, women are not capable in IT. This is not a rational argument. It is bald-faced, bigoted prejudice. It’s made worse because the actual (usually unspoken) reasoning for guys who think along these lines runs in reverse: 1) Women are not as smart as men. 2) Therefore, women don’t belong in IT–and oh look, there aren’t many. (Thus confirming my prejudice.)

It never crosses such neanderthals’ minds that maybe there are other factors at play here. Because the other things at play are social factors, which are, as a rule, foreign and incomprehensible to them. When society portrays IT professionals as 1) mostly male and 2) mostly socially inept, how is that an attractive role for women to aspire to? Especially when the women who are in IT are also typically portrayed as unattractive and socially inept as well?

Further, guys in our society can and do have their mediocre to poor looks and lack of social skills fairly easily overlooked. This is far less so for women because of many factors, not the least of which is that, especially in IT, socially inept males are in positions of authority and positioned to do the hiring most of the time. The bar is de facto higher for women because not only do they have to show–more so than their male counterparts–that they are capable (in order to counteract the prejudice mentioned above), they also have to fight against these males’ judging them by their looks, which doesn’t even come into the equation when they are hiring men. Maybe their looks work in their favor, maybe not, but in a supposed meritocracy, it shouldn’t come into the picture at all, and in any case, it is simply unjust.

Of course, these men in IT never experience this, so they can’t relate, and due to their extraordinarily low aptitude for empathy, they outright deny such factors exist and instead blindly assert that women just don’t measure up. As I said–hogwash. And this is just the basic entry level for women. As a rule, they have to 1) go against society’s expectations to aspire to IT jobs, 2) break into it despite the prejudices against them, and 3) struggle harder to receive equal treatment and respect (and all that entails, pay being just one of the factors).  “Nah.. they’re not in IT just because they can’t hack it.”  Riiiight…

In the (figuratively) smoke-filled rooms of the men’s only club of IT, women are treated as something odd. If they are perceived to not be attractive, they are again perceived to be less intelligent by these men and openly mocked amongst their peers. If the women who break into IT are perceived as attractive, then you get all the adolescent foolishness we have seen perpetrated in the public spats, and more. The women who speak up tell us that this kind of brutish, rutting behavior is not unusual; it is rather the norm.

Maybe the behavior only openly happens away from the regulated corporate environments (i.e., away from where such men cower in fear from legal reprisals and/or the loss of a job), so that’s why we see it more at conferences, mailing lists, anonymous comments on blogs, etc. Yet it remains that there is tons of evidence like this to show that such behavior is not rare, isolated, or unusual. So there’s no reason to doubt it when women say they experience it. It is nothing less than a hostile professional environment–yet another reason women are dissuaded from entering IT and continuing on with it, much less putting themselves out there in public ways to become special targets of these puerile brutes.

Now don’t get me wrong. There are plenty of mature men in IT as well, many of whom I’ve had the pleasure to work with. And there are certainly degrees of this. I am not condemning all guys in IT by any means; just the ones whose behavior is reflected in what I am writing about, the ones who cause and perpetuate all this garbage.

Remarkably, even some otherwise enlightened guys I know still insist that it is somehow the women who are at fault, because they dress so provocatively, or whatever. I am loath to touch that with a ten-foot pole, but I’ll dip my toe in. Yes, we guys (especially in our habitually hypersexualized popular culture) are easily stimulated by the female form. It’s just a physiological fact. No two ways around it. On the other hand, we don’t have to act on that stimulation. We restrain all sorts of biological impulses for the good of civilization, our own good, and the good of those we care about, if for no other higher reasons. We can help it; we can be “the bigger person” in this matter. We need to be the bigger person, in fact.

As another example, many guys will often talk about the need to get more good-looking women into the workplace. And yet when the women show up on the scene, the guys readily make salacious comments behind their backs (again, in the regulated environments, this behavior must be hidden). Some of this is normal and natural, but it is plainly counterproductive to the goal–treating women as sexual objects is no way to make them feel welcome, equal, or respected, so why should they hang around? Yet another reason for them not to be prevalent in IT that has nothing to do with their intellectual capacity for the job…

Now I don’t think every flippant comment must be loudly condemned. Even for guys who try to be sensitive to this issue, things will slip out; it is hard to work against the culture, but that doesn’t mean the extreme opposite is therefore the way. It doesn’t mean that mindlessly perpetuating an adolescent professional atmosphere is okay; it doesn’t mean that we shouldn’t try to be better. Women don’t need to be treated with kid gloves–they are adults and should be treated that way. The problem is, though, if you don’t have much in the way of social skills, you won’t readily see the subtle cues that women give when they are uncomfortable about these things. And even if you do have social skills, it is a well-known fact that, for men, women are a mystery. So it’s better to be more sensitive on this point than to assume that a little backslapping and jovial winking will make sexual objectification all okay. And given our poor track record in IT, we have extra reason to try to make up ground on this point.

Lastly, as I said in my previous post on sexism in IT, this culture change has to come from the inside. It is ultimately about maturing past puberty. It’s about becoming men and treating women with respect. It is about encouraging others to do likewise. That’s a good first step.

We also have to recognize and admit the problem exists. The idea that ours is an industry of pure meritocracy where women get fair and equal treatment is a fairy tale. Certainly, we do appreciate good work on its merits (however arbitrary and subjective our criteria may be), but it is just sheer ignorance and wishful thinking to pretend that is all you need to succeed and thrive in our industry, even for men, much less for women who face all these other obstacles.

Society is changing. Tech is becoming more ubiquitous. Pretending we live in some kind of social bubble where it is safe (much less morally acceptable) to denigrate women on the basis that they are women is not going to fly. If nothing else, the industry needs more good people, regardless of their private bits. Keeping it nearly all male is just not tenable, nor is it good.

I am no advocate for some kind of formal, enforced affirmative action for women. I am certainly no advocate for more laws and regulations on the matter. On the contrary, I am an advocate for all of us to work towards changing this unacceptable culture in whatever way we can–each of us, as a mature, responsible adult, can make a positive impact. I am an advocate for us to not hide behind corporate regulations as if they suffice to address the problem. I am an advocate for doing what is right, regardless of whether or not there are consequences for you personally if you don’t.

It’s really quite simple: It’s time to grow up. It’s time for us to lose the backwards culture and the stereotype. If you’re truly good at what you do, you have nothing to worry about.

 

P.S. In commenting on this, one of my female friends who works in IT had this advice to offer: “Bottom line: If you wouldn’t talk that way around your wife, daughters, or own mother, DON’T talk that way with your chick coworkers.”  Good, wise, and practical advice. Thanks!

Nativist Nonsense and Idiotic Idealism

Hard to see when blinded by ideology

I very much appreciate, understand, and value design aesthetics and well-built technology. I’m also an amateur philosopher in my free time, so I can appreciate ideas, ideals, and ideologies in themselves. All of this is well and good, but what I don’t get is people who get so wrapped up in some design or technological ideology that they blind themselves to what is good apart from it. Let me give you some examples that I have heard and seen many times in my career, in one flavor or another:

  • Blindly preferring some piece of software or technology purely on the basis that it is “open” or even “standards based.”
  • Blindly preferring some piece of software or technology purely on the basis that it is made by your pet favorite company.
  • Refusing to install or use some piece of software or technology on the basis that it is made by some company you don’t like.
  • Refusing to install or use some piece of software or technology on the basis that it is “open” or “free.”
  • Irrationally assuming that because some company had a challenge with a bug, virus, security, privacy, free-ness, openness, whatever, then everything that company does thereafter is tainted and to be avoided.
  • Irrationally assuming that because something is “native” that it must be better than a non-native alternative.
  • Refusing to code in some language on the basis that you don’t like it/it’s not your preferred one.
  • Prejudging a piece of software because it is built on <insert name of technology stack you don’t like>.

And there are a host of other, even less defensible positions that otherwise quite intelligent people take in relation to design and technology. Especially for people who are supposed to be professionals in technology and/or design, this sort of blind prejudice and ideology-based thinking is inanity; it is out of place, unbecoming, and simply unacceptable.

Most of us in design and technology are not paid to promote ideologies; we are paid to produce things. At the end of the day, the things that make us more productive and solve each particular problem best are the things we should be using. There are good ideas everywhere, and if we blind ourselves to them, we are injuring our careers and doing an injustice to those who pay us with the understanding that we will make the best thing for them in the most productive way possible.

Sure, you can have your preferences. Sure, you can espouse best practices and design philosophies that make sense to you. Heck, you can even advocate for them. But just don’t let those loom so large in your mind’s eye that you cannot see the good in things that don’t align with them. Don’t get so stuck on a technology or a framework or a practice or a pattern or a principle that you choose it when there are better options available for the problem at hand. Everything is not a nail, no matter how superior you think your hammer is. Don’t let your ideals become prejudices that, instead of fostering awesomeness, become a roadblock for you and those you work with and for.

And this extends, importantly, to people as well. Don’t treat those who don’t share your ideals with disdain. Don’t imagine for a second that because you adhere to some ideology (“craftsmanship” or “big ‘D’ Design” or whatever) this makes you more professional or better than they are. I’ve even heard people judge other professionals by when they purportedly clock in and out, as if having a healthy work-life balance somehow makes you less professional or capable!

In our line of work, it is the output, the products of our efforts, that matter most, not how we get there, and there are most definitely many paths to good outcomes. The judges of these outcomes are our clients, our customers, our markets, our users–not us. And the primary criterion in judging a good outcome is most certainly not how well our work aligned with any given ideology, however well-intentioned it may be.

Confidence Builders?

I just read Phil Haack’s Test Better post, and I liked the dicing of terminology around testing/QA. It occurred to me that if the goal is more to build/generate confidence (and it is), then maybe the better job title would just be Confidence Builder. What do you think?

P.S. I totally agree that automated testing isn’t enough. I also agree that you don’t have to have others do the testing for ya. But there is validity in the concern about bias. To test well is a discipline, even an art, but as with many things in software, if you don’t have dedicated professionals doing it, you can still try to cover some of that ground yourself–you can make it suck less.

Like good Design, testing takes empathy with users. So testers (confidence builders?) can leverage practices from Design disciplines to build that empathy. Doing at least some lightweight user research, and spending as much time as possible with users, can help.

Testing also takes imagination; it takes a certain amount of compartmentalization–to try to set aside what you know of the software design and approach it with fresh eyes. Imagine that all you know about the software is what you see in front of you. Imagine that you have some goal/desire for why you’re looking at it (this is where the user research can help again). Now let yourself go.

Still, you will get better results by having people who are not so biased do the testing. You’ll get the best results (and the most confidence) by testing with and observing users. So even if you don’t have dedicated confidence builders, you’ll be better off doing that.

This isn’t to say you shouldn’t test your own stuff–you absolutely should, especially if you don’t have folks dedicated to it, but if you’re going for confidence, nothing beats both testing with users and testing with professional confidence builders.