Test It Before You Wreck It

I have an embarrassing programming secret. Please don’t judge me harshly.

I don’t use automated tests.



I’m not a complete moron. Whatever I’m currently working on is thoroughly tested manually before being committed to git. And I commit pretty frequently, with descriptive commit messages. I also branch and merge religiously.

I haven’t really felt the need to set up automated tests. Not til yesterday.


I’m currently building an API server for a project. An offline-capable smart spreadsheet browser app.

Made some updates to the client and wired it up to the API endpoint to test the feature and the server came crashing down. My changes were to the client, which was in a separate repo, so I didn’t change anything on the server. Restarted the API server and tested it again. Crash and burn.

What went wrong?

A quick investigation pointed to when I renamed a few variables to make things more readable. I didn’t change every instance of a particular variable. Two slipped by me. I didn’t even realize there were two instances til I fixed one, pushed the change up and the server crashed again.


It’s supremely easy to make mistakes, or not check as thoroughly as you think you have, which introduces bugs into your code.

Manual testing is okay, but you can’t count on it to catch every single problem. And every single problem matters in production. It’s the difference between a well-running app and a steaming pile.

No more relying on myself to make sure everything is working properly. I’m writing a complete test suite for the server before I build any further.

From now on, I’ll test it so I don’t wreck it.


How I Built My First Desktop App in 3 Days

“Twelve scanners! For that Sheraton job there were three scanners and it was too much work to review properly. The workload is going to be insane.”

Homeboy, the pessimist he is, was highlighting the downside of his company’s most recent contract. To be fair, he was one of the supervisors on the Sheraton gig and was literally drowning in the work, so the pessimism wasn’t unmerited.

I’d dropped by to return a flash drive and somehow ended up in a lengthy discussion about his challenges at work.

“What makes a supervisor’s role so hard?”

“We need to make sure the scanners are doing their job correctly. We were excited about almost completing the last job when we discovered a problem two scanners made weeks ago. We had to spend an extra month fixing it up.”

“What exactly do supervisors do?”

“We check to see if the scanned documents tally up with the report sheet the scanner produces. A document is a collection of PDF files. We make sure the PDFs are in the right document and have the right number of pages.”

“That sounds tedious.”

“In a week, a scanner can generate up to a thousand PDFs. It’s very easy to make a mistake and put a file in the wrong document or write the wrong page tally. It’s our job to catch and correct those mistakes.”

I stood in silence as I absorbed the gravity of the situation. 1000 PDFs per scanner is a frightening amount of data to manually verify. Pessimism must be contagious, because I felt myself coming down with it. Making software is tricky, but it certainly wasn’t a tiresome grind like this.

Maybe I could build something that could help him. He could select the folder containing the documents and it would determine which PDFs are in which documents and how many pages each PDF has, then generate a report.

“I’ll build something to help you this weekend. Don’t sweat it”.

“Dude I’ll never ask you for your sister’s phone number again in my life if you do”.


I’ve only built web and browser apps, but everything about this sounds like a desktop app.

It can’t be a browser app. There aren’t APIs in the browser that allow me to crawl a client’s system. It can’t be a web app either. Uploading that many PDFs is seriously time-consuming and data-intensive.

If I could get Node.js on his system, I could treat his computer like a server and build a web interface for him to interact with. Made for his computer, using web technology.

My search led me to Electron. Desktop applications built with Node.js? Everything I need to build something for a desktop with tools I use daily.

Getting “Hello World” up and running in Electron was a piece of cake. The documentation gave me enough information to do so on the first page. The code was easy to reason about and immediately dealt with the finer points of what differentiates Electron apps from regular Node.js apps.

There’s a main process and a render process. Think of main as the server and render as a browser. In a traditional web application, all the brains are on the server and the client browser talks to the server by sending text via HTTP. In Electron, the render process talks to the main process by sending text through a package called ipc. You coordinate both processes via ipc to accomplish your objective, just like the browser and server are coordinated via HTTP requests.
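A minimal sketch of that coordination, using the current API names ipcMain and ipcRenderer (in older Electron versions, like the one this post was written against, both sides simply required a module called ipc):

```javascript
// main.js — the "server" side of an Electron app
const { app, BrowserWindow, ipcMain } = require('electron');

app.whenReady().then(() => {
  const win = new BrowserWindow({ webPreferences: { nodeIntegration: true } });
  win.loadFile('index.html');
});

// listen for text sent from the render process, reply with a result
ipcMain.on('folder-dropped', (event, folderPath) => {
  // ...crawl folderPath with fs here...
  event.reply('crawl-result', { path: folderPath, pdfs: 0 });
});

// renderer.js — the "browser" side
// const { ipcRenderer } = require('electron');
// ipcRenderer.send('folder-dropped', '/some/folder');
// ipcRenderer.on('crawl-result', (event, result) => console.log(result));
```

The channel names here are invented for illustration; the point is the shape of the conversation, not the specific messages.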

Once I understood this much, the plan crystallized. Use Node’s fs package to do the crawling, use comma-separated values to generate the Excel report, and use the interface to collect input from Homeboy.

Time to start wiring this thing up!


Being able to drag and drop the folder containing the documents would be the most intuitive way to get the document location, so I started building from there.

Normal browsers have a security model that prevents you from getting a file’s full path with the File API, but it was there for the taking in Electron.

Used ipc to send the paths to the main process and started writing the script to crawl the paths.

The first version I wrote was procedural. Get a list of folders and retrieve their content.  If any are PDFs, report them. If they are folders, go into those and do the same thing.

No matter how deeply nested the folder structure was or how many files there were, it returned the information in less than a second. Amazingly fast!

Every virtue the first version had was made irrelevant by how unusable the user interface was. It had one line of unstyled text that said little more than “drop folders here”. When you do, it creates some barely-styled boxes with the folder path, number of files and number of PDFs. It worked though, so I was super excited to demo it to him.


Homeboy watched it run through the test data and display the results.

“Can it tell me how many pages are in the PDFs?”

“That’s the very next feature I’m adding.”

“Okay. It needs to be able to tell me how many pages are in the PDFs and create an Excel sheet with that information.”

Was it too much to expect a little gratitude at this stage? No matter. I’ll just use a package from npm to get the page count. Shouldn’t take me more than 15 minutes max.

Two npm packages and as many hours later, still no page counts.


My first attempt was using an aptly named package: pdf_page_count. The package kept returning “undefined” no matter what I did. Homeboy’s periodic “is it done yet?” didn’t help matters. After taking a look at how many people had downloaded it recently, I came to the conclusion that the package was broken. Needed to find an alternative.

pdf2json seemed to fit the bill. Way more uses and downloads than pdf_page_count, so I installed it and gave it a shot.

No such luck.

I was so close! All I needed was the page counts, and then I could output them on the interface and generate the Excel report.


It didn’t help that the instructions for pdf2json were woeful, so it felt like I was listening for the wrong event or setting the listener on the wrong object or something.

In my desperation I started stepping through the source code and adding log statements so I could follow the process through and see what was wrong.

I was listening to the wrong object. The right one was nested inside it. Quick change, restart app and … nothing.

Turns out the information isn’t passed to the nested object. You have to check the outer one again. Quick change, restart app and … still nothing?!?


Oh wow. Turns out I was only passing the name of the PDF, not where it was located. Quick change, restart app and …. PAGE NUMBERS!

Quickly exported the app to the other laptop for him to test and proceeded to watch it hang up his computer for five minutes before producing the report.

“Is it exporting to Excel yet?”


Though I was able to extract page numbers, I was completely unsatisfied with the performance. It took an unreasonably long time to get them.

To make matters worse, it completely hung up the system while getting the page counts. Forget multi-tasking. The mouse barely responded to movement and you certainly couldn’t click on anything.


pdf2json grabs a whole lot more than page counts. It tells you the height of each page, what is contained on every line (vertical AND horizontal), text and so much more. Useful information for someone else but totally irrelevant to Homeboy’s needs.

I took another look at pdf_page_count, armed with the knowledge that its failure was most likely due to me not passing the right file path.

A quick swap later it was running correctly. Way faster too! Understandable, since it’s a one-trick pony. Or a one-trick Veyron, speed improvement considered.

Processing the PDFs still hung up the system though. The app would navigate to the folders and go as deeply as it could, sorting between directories and files and trying to get the page numbers from the PDFs. My sample data has 2999 PDFs in total, and the current architecture processes them all at the same time. No wonder the system becomes completely unusable.

I created a queue where the PDFs to process are placed. I instructed the app to process a maximum of 8 at a time (get it? 8 looks like an upright Möbius strip). The end result is that far fewer PDFs are processed at once. It definitely wouldn’t be as fast as opening all 2999 at once, but the system doesn’t experience any performance degradation. He’d be able to drop a bunch of folders to be processed, muck around on Facebook, check back in 5 minutes and it’s all done.
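A sketch of that queue: a hand-rolled concurrency limiter (names are mine, not from the app) where `worker` stands in for whatever async work is done per PDF.

```javascript
// Run `worker` over enqueued items, at most `concurrency` at a time.
function makeQueue(worker, concurrency = 8) {
  const pending = [];
  let active = 0;

  function next() {
    while (active < concurrency && pending.length > 0) {
      const { item, resolve, reject } = pending.shift();
      active += 1;
      worker(item)
        .then(resolve, reject)
        .finally(() => { active -= 1; next(); }); // free a slot, pull the next item
    }
  }

  // enqueue returns a promise for that item's result
  return function enqueue(item) {
    return new Promise((resolve, reject) => {
      pending.push({ item, resolve, reject });
      next();
    });
  };
}
```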


I could have aimed for feature complete at this point and done the Excel exporting but the unusable user interface bothered me far too much. I knew when the app was processing and when it stopped simply because I had a console attached to the app. This shouldn’t be background information. This is data the app user needs to know, even if they don’t know they need it.

I wired up ipc to log, twice a second, how many PDFs were queued up, so the user knows what’s left to process and how fast the system is running through it. I put an indicator at the top left that’s green when there is no processing going on, red with a pulsing orange light when it’s busy. A quick glance should be enough to let you know exactly what’s going on.

Not quite the most beautiful swan, but this was a Sunday afternoon. Being feature complete at the end of the day was far more important than making it pretty. Enhancements could wait.


I left the Excel report til very late because I’d just worked on a Meteor app with Adim and a feature I worked on was exporting datasets to Excel. My understanding of the problem space and what it takes is still very fresh.

Generating the data wasn’t a problem, but the method of export is slightly different than it would be in a browser. Data URIs were not working in Electron, so I pushed the data back to the main process and wrote the file out with Node’s fs package.

Admittedly, the workaround didn’t come to me immediately. But I realized I wasn’t alone, searched, found a post online detailing the problem as well as a possible solution, and figured out exactly how to do it in my app. Problem solved, feature complete.



“Duuuuuuuude. This thing is awesome! You don’t know how much easier you just made my life. Thanks a lot man!”

Might be because I was expecting some pessimism but I was extremely pleased to hear how happy he was with it. He’s taken it to his office and is currently giving it out to his coworkers to use. I’m totally floored.

There’s only a handful of people using things I’ve made or libraries I’ve written, so every additional person enjoying what I’ve worked on brings a feeling of joy and motivates me to work harder to build more things.

Definitely going to be building more things and writing about them 🙂

Serially Iterating An Array, Asynchronously

Meet Derick Bailey. He runs Watch Me Code, has written a few programming books and blogs a fair bit, so he’s always on my radar when I need to learn a little bit more about programming.

A few months ago, he wrote a post, Serially Iterating An Array, Asynchronously, which was about a very specific programming issue he had to solve.

I recently found myself looking at an array of functions in JavaScript. I needed to iterate through this list, process each function in an asynchronous manner (providing a “done” callback method), and ensure that I did not move on to the next item in the list until the current one was completed.

Wait a minute … I faced that same problem a while ago!

My Node.js Express server routes have a list of steps they need to complete before sending out a response to the client.

The /signup route verifies the email address through an external service, bcrypts the password, stores the information in the database, retrieves the insert id and sends the information over to the client.

None of those operations are synchronous. I can’t run them in parallel either, since some operations depend on the result of a previous one. It’s pointless to continue if any step fails.

A serial list of functions that need to be processed asynchronously.
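The shape of the problem can be sketched in a few lines. This is a minimal hand-rolled runner, not Derick’s solution or cjs-task — just the pattern both are addressing: callback-style steps that run one after another and halt on the first error.

```javascript
// Run an array of callback-style steps serially, aborting on failure.
function runSerial(steps, done) {
  let i = 0;
  function next(err) {
    if (err) return done(err);            // a step failed — stop here
    if (i >= steps.length) return done(null); // all steps completed
    const step = steps[i++];
    step(next);                           // each step calls next(err?) when finished
  }
  next();
}
```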

Derick posted his solution to the problem and where he felt it could be improved. My solution addresses quite a few of those, so I thought I’d share.


available on github and npm


A few points to highlight here.

1. task.end can be used to end a task prematurely. If any of your steps fails, use it to bring the task to a halt instead of wasting more time and resources.

2. task.next is a step control mechanism.

3. task.set / task.get can be used to store and return data relevant to the task at any step. Use it to store the initial data set, keep api responses, set flags … anything really.

4. Under the hood, task.end  nulls the data store, the task list and the callback list after the final callback has been triggered, in an attempt to prevent memory leaks.

5. Derick uses a destructive process on the task steps list. I simply keep track of the current index and increment after each task.next .

6. Under the hood I’m backing cjs-task with a pubsub, so I can trigger events for updates to the data store as well as when steps are triggered. Currently not implemented and probably not necessary, but I feel it’d add tremendous value and make it easy to monitor or modify your task instance. Most likely just an excuse to justify using the pubsub instead of something dead simple like a hashmap.
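Pulling the points above into a hypothetical usage sketch of cjs-task (the step registration API — step()/start() here — is my assumption; check the package README on npm for the real surface):

```javascript
const createTask = require('cjs-task');

const task = createTask(function onComplete() {
  console.log('signup finished for', task.get('email'));
});

// task.set/task.get: store data relevant to the task at any step
task.set('email', 'user@example.com');

task.step('verify email', () => {
  // call the external verification service, then move on
  task.next();
});

task.step('hash password', () => {
  task.set('hash', 'bcrypt-output-would-go-here');
  task.next();
});

task.step('store user', () => {
  const ok = true;
  if (!ok) return task.end(); // halt prematurely instead of wasting resources
  task.next();
});

task.start();
```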

The longer I look at this, the more they look remarkably different, despite the similar API, job to be done and identical operation loop. Interesting.

Looks like someone just released queuer.js which looks like a hybrid approach. Combines the kind of event hooks I was looking to build into mine and Derick’s approach to handling data. 

Assembling a Practical JavaScript Toolkit

I was motivated by an opportunity to win N1,000,000 (around $5000) in a programming challenge this weekend.

The challenge was hosted on Codility’s platform. I’d never heard of their service before, let alone used it, so I really didn’t know what to expect going in.

On starting the timed test, the most notable constraint was my inability to import packages from github, npm or http. Maybe Codility lets you do it, but I didn’t have time to experiment and there was no clear way to do so in the interface.

No github. No npm. No libraries.

The thought of writing JavaScript without jQuery, Underscore or access to random libraries is very intimidating for many programmers.

It’s not every day you’ll experience such constraints, but you must be prepared to work under peculiar conditions. Your toolkit must be flexible enough that there’s hardly an environment you wouldn’t be productive in.

All the questions in the coding challenge dealt with manipulating collections of data and teasing out results from them. Interestingly enough, I’d written a blog post on Writing Reusable JavaScript a little while ago. The purpose of the reusable code I’d written? Iterating over collections of data and teasing out results from them.

Could I have done the same thing using jQuery, Underscore, Lodash or some other package out there? Yes.

However I wrote my own function to understand what’s going on behind the scenes much better. Because I did, I knew the solution a bit more intimately and could pare down my tool to the bare necessities or add needed bells and whistles on demand. I can perform these operations in JavaScript environments without those libraries or fancy new JavaScript language properties.

When building your code toolkit, don’t be so dependent on ideal environments. You may work in a code base that doesn’t use the latest ECMAScript features. You may not be allowed to use a transpiler. You may not even have access to packages.

You need to understand the principles of what your tools do and how to accomplish it with the barest of necessities: a text editor and your wits.

If you find yourself heavily dependent on something, try to spend a little time creating a dependency-free version of it: a version that you can copy, paste and use without relying on specific libraries or environment properties.

When I fell in love with the ability to decouple code with the publisher/subscriber pattern, I wrote my own pubsub. As my needs grew, I evolved my pubsub into cjs-noticeboard (a pubsub with a noticeboard pattern). Today, my server-side and client-side code use cjs-noticeboard extensively.
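For illustration, the core of the pattern fits in a few lines — this is a minimal sketch, not cjs-noticeboard itself: subscribers register callbacks under a topic, publishers fire them.

```javascript
// A dependency-free pubsub: decouples the code that announces an event
// from the code that reacts to it.
function makePubsub() {
  const topics = {};
  return {
    subscribe(topic, fn) {
      (topics[topic] = topics[topic] || []).push(fn);
    },
    publish(topic, payload) {
      (topics[topic] || []).forEach(fn => fn(payload));
    },
  };
}
```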

Maybe I’m biased towards my own tools because I built them. I think what’s more important is that I understand exactly how they work and that I can take them anywhere. I can put them in IE6, a Meteor.js app or in a Codility challenge and know they’ll work exactly as expected.

You don’t have to build your own tools, but you must understand how to carry them with you to all sorts of environments. Not every environment will be ideal, but you need to be prepared to kick ass no matter the conditions.

I’ll conclude with some of my Considerations for Assembling a Practical Javascript Toolkit.

1. Is this dependent on a specific version of JavaScript? (do I need a transpiler or shim to get this working everywhere?)

2. Is this dependent on a library? (Can this work without jQuery or Underscore or a specific framework?)

3. Is this single-purpose? (does it come with baggage I don’t need?)

4. Is it something I can make by myself? (do I understand the principles behind it or is it magic?)

Oh yeah … if I win the money I’ll be sure to write more about it 😀

Deal With It

It’s rational to expect a solution to be big if the problem is big, right?

Sound rationale, but inherently flawed. Big things are simply a collection of little things working together. If one little thing stops working as expected, it could bring down the whole thing with it.

You would expect that someone who makes a living assembling little things into big things would understand this intuitively, but you would be wrong. So very wrong.

I wrote about a problem I was facing with an app I built. I didn’t have much time to dedicate to investigating and debugging the issue, and since it was non-critical I decided to ignore it til I had time to dedicate to the matter.

When I finally got around to tackling the problem, I was amazed to find out the solution was literally one line of code.


Writing Reusable Javascript

It’s very easy to lose sight of the big picture when writing Javascript. Your attention is spent entirely on the little details of the current bit of code you’re writing, not why you started writing them in the first place. This isn’t necessarily a bad thing. What’s problematic is doing it repeatedly for the same type of problem.

I’ll walk you through how I wrote a reusable function, how it evolved and my thought process the entire journey, so you can recognize the patterns and know when it’s time to make something reusable.

I’m working on a webapp that collects information from different members of an organization and displays it on their public website. This means much of the work is data storage and processing. The majority of my code will be looping over data structures and doing something in the loop.

No big deal writing it the first or the next time. By the fourth or fifth time though, not quite as fun. All that code is trying to say is take every item in this array and do something with it. Decided to make it a function since even the most verbose function name is a lot shorter and more intuitively understood than the first two lines of that snippet.
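The original snippets were embedded and aren’t preserved in this copy; the first version amounted to something like this sketch:

```javascript
// Hand every item in the array to a process_item callback.
function array_each(array, process_item) {
  for (let i = 0; i < array.length; i += 1) {
    process_item(array[i]);
  }
}

// usage: the verbose loop boilerplate collapses into one readable call
// array_each(members, save_member);
```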

Every time I need to loop over an array, it now looks so much neater.

So far so good. My array_each function makes my code very readable and does everything I need. For now.

The next time I need to revisit my function is when I need to loop over a data structure and return the first match for a specific criterion. Normally I can use break to end the loop, but my process_item function isn’t actually inside the loop. I need a way to signal from my process_item function that the loop needs to trigger a break.

To stop a loop, all I need to do is return false from my process_item function.

Best part is that it’s still backwards-compatible! I don’t need to modify my previous process_item functions because the new functionality triggers only when process_item returns false.

This is pretty good as is, but it can be better. Sometimes I need to know an item’s position in the array. Need to make sure it’s available when I need it.

I’m passing the current item’s index to the process_item function as a second argument. My previous functions aren’t broken by this update. Javascript is cool enough to let me pass any number of arguments to a function, even if the function doesn’t use all (or any) of them. Most times I’d be dealing directly with the item, which is why I’m leaving it as the first argument. The index is passed as a secondary argument so it’s there when I need it, but can be safely ignored when constructing process_item functions that don’t need it.
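Putting the two refinements together — stopping when process_item returns false, and passing the index as a second argument — the function might look like this sketch:

```javascript
function array_each(array, process_item) {
  for (let i = 0; i < array.length; i += 1) {
    // returning false from process_item breaks the loop;
    // returning nothing (undefined) keeps old callers working
    if (process_item(array[i], i) === false) break;
  }
}
```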

At this point I’m thoroughly pleased. I’m comfortable processing an array, stopping when I want and using my current index in the array to do other things. I’m pretty confident I wouldn’t need to modify this.

Unfortunately I set myself up for failure right from the beginning by forgetting something very important: arrays are not the only type of data collection.

Objects are conceptually similar to arrays, but different enough to make all the difference in the world.

1. Object keys are not necessarily numbers.
2. Objects don’t come with a built-in length property that tells us how many attributes it has.

This means it’s not possible to retrofit array_each to handle objects because it needs to know how many properties it’s looping through and walks the array by increasing a numerical index. Bummer.

I need to make an object_each with a similar interface to array_each, so that my process_item functions are identical regardless of what type of data structure is being passed.

Instead of using a for loop that increments the index being accessed in the array, I’m using a for .. in loop that iterates over the properties of an object. object.hasOwnProperty checks if the property being accessed actually belongs to the object or is inherited. I only care about the data directly in the object, so this is how I filter out the other properties objects come loaded with.

I can safely pass the property name to object_each’s process_item because accessing an item in an object and accessing one in an array are interchangeable operations.

Here’s an example of what object_each can do.
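The original example isn’t preserved in this copy; here’s a sketch of object_each and a sample call, following the description above:

```javascript
function object_each(object, process_item) {
  for (const key in object) {
    // skip inherited properties — we only care about the object's own data
    if (!object.hasOwnProperty(key)) continue;
    if (process_item(object[key], key) === false) break;
  }
}

// sample usage (data invented for illustration)
const stock = { apples: 4, pears: 0, plums: 7 };
object_each(stock, (count, fruit) => {
  console.log(fruit + ': ' + count);
});
```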

Looking pretty concise! The code is neat and readable. I could leave it like this, but notice that object_each and array_each have the exact same function and interface. I can create a wrapper around both of them. The wrapper will determine which function is more appropriate and use it to process the data structure handed to it. Throw in a few checks to make sure everything is kosher and voila! One function to rule them all.

A few things going on here. First off, each makes sure it is being passed what it needs to work and confirms the item processor is a function. Next, it detects what type of data is being passed to it to determine which function is appropriate – array_each for arrays and object_each for objects. I set each to default to using object_each. In Javascript, pretty much everything is an object. I can safely assume that if you pass an array, you’re interested in the array’s content. If you pass anything else, you’re interested in the object’s properties. A notable exception is strings, which behave pretty much like arrays.

My previous example now looks like this:


Pulling it all together into a single reusable function results in:
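The combined function isn’t reproduced in this copy; reconstructed as a sketch from the description above (argument checks first, then dispatch on type):

```javascript
function each(collection, process_item) {
  // make sure we were handed something workable
  if (collection === null || typeof collection === 'undefined') {
    throw new Error('each: expected a collection');
  }
  if (typeof process_item !== 'function') {
    throw new Error('each: expected an item processor function');
  }

  function array_each(array, fn) {
    for (let i = 0; i < array.length; i += 1) {
      if (fn(array[i], i) === false) break;
    }
  }

  function object_each(object, fn) {
    for (const key in object) {
      if (!object.hasOwnProperty(key)) continue;
      if (fn(object[key], key) === false) break;
    }
  }

  // arrays (and array-like strings) walk by index;
  // everything else defaults to object_each
  if (Array.isArray(collection) || typeof collection === 'string') {
    array_each(collection, process_item);
  } else {
    object_each(collection, process_item);
  }
}
```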

I can do further optimizations and cleanup, but that’s for another day and another post. The current state of each scratches my itch, so there’s no immediate need for me to optimize further.


I really hope this glimpse into my thinking and coding process has been helpful 🙂

Yes and No are Two Sides of the Same Coin

I’m writing this to myself because it’s likely I might forget this lesson; it would be a crying shame to have to learn it twice. 

Historically I’m terrible at saying no. I love leaving myself open for new opportunities. I like the freedom of choosing a new direction at every intersection. This means I’m more likely to say yes than no to pretty much any opportunity. Here’s the thing though: saying yes to one thing automatically means there’s a lot I must say no to in order to make the yes happen.

Don’t Be So Negative
No isn’t a negative word, like I used to think. It’s reinforcement for a yes I’ve committed myself to. If I say yes to jogging to the gym in the morning for a workout, I can’t say yes to talking to my girlfriend at the same time. I can’t say yes to writing code or a new blog post at that time. I can’t say yes to lazily rolling around in bed til I’m ready to get up. Yes to the gym means I must say no to those other things. Not that they’re bad things that automatically deserve a no. The issue is I can’t go to the gym if I’m doing any of the other things. It’s simple, really.

You’ll Know When to Say Yes and When to Say No
Be grateful for the times it’s clear. Those are the easy ones. The things you really need to say no to don’t present themselves as an obvious no. They’re subtle. Deceptively subtle. In fact, you’ve probably assumed the correct answer to the question is yes long before you discover that you should have said no in the first place. If you’ve said yes to the gym, do you say yes to jogging to the gym with a friend? On one hand, having a buddy to go with is pretty helpful; your buddy could motivate you on lazy days, so pairing up is a good idea. On the other hand, you might spend lots of time waiting for, chatting with or hanging out with said friend. Enough that your time at the gym isn’t very productive. Do you say yes or no to your friend who wants to jog to the gym with you?

It’s A Lot More Clear When It Really Matters
Saying yes to your partner means saying no to every other future match, no matter how much more compatible they appear to be. If you’re having serious trouble in your current relationship, do you stay to work it out or do you go with a potentially better partner? Saying yes to a project means saying no to any other project that comes up while you’re currently engaged. Do you continue working on a project that feels like it’s going nowhere when you just got invited to work on an awesome new project with better pay and awesome team members?

It’s Not a Yes or No Question
However the question is phrased, it really boils down to this: you can’t pursue every opportunity. You need to be able to close some off in order to chase others appropriately.


It doesn’t mean something you said no to today will forever be a no. I never worked out for the majority of my life. I was proud to be a nerd focused on building my mind. I’d pick a good book over gym time any day. Today, I appreciate being able to jog a kilometer without stopping to catch my breath. Saying yes to working on a healthier body has meant saying no to some good books. I had to close off the possibility of reading every good book that came my way for that to happen.

It’s not always clear you’re saying no to many things when you say yes to one thing. It’s been lost on me in times past but I can’t afford to remain ignorant of the relationship between them. My future depends on it.

Not Alone

“Anything that can go wrong, will go wrong” – Murphy’s Law

Or as Chinua Achebe would put it: “Things Fall Apart”.

Isolation is my coping mechanism when things fall apart. I feel fully responsible for what’s gone wrong so it’s unlikely I’m ready to surface if I don’t have a solution in hand.

Elusive solutions reinforce how I’m not suited to be doing what I’m doing in the first place. Nagging guilt rips me up, tears down my confidence and sends me on a downward emotional spiral. Justification for such self-destructive thoughts is firmly rooted in the belief that it’s my fault everything is broken.

Two unrelated problems I’m dealing with are convincing me to reconsider this way of coping when things go wrong.


Transfer Files to Your Blog in Seconds

I had an interesting conversation with a friend of mine who wrote for a Nigerian entertainment blog. As much as he loved blogging, a lot of the tedious bits continually frustrated him.

1. He spent too much time downloading files just to rename them before uploading to the blog.

2. He paid a lot of money to his internet provider, due to the amount of downloading and uploading he needed to do.

I realized I could solve some of these problems for him, so I started to build something to help him out.

I set up a website where he posts a link to what he wanted to download and what he wanted to rename the file to. He pushes a button and the file is renamed and uploaded to his server and a link to the file pops up on his screen.

Here’s what it looks like in action.

This was a solid solution for #1 because he didn’t have to download and re-upload anymore. Just press a button and a link is provided when it’s done. It was also much quicker than he could do it himself. When I tested it with a 500MB video file, it got uploaded to the server in 19 seconds! Most files he was uploading are under 30MB, so I knew it would be good enough for the job to be done.

It also worked out to be a great solution for #2 because typically he would use up 10MB downloading the file and another 10MB uploading it. Now, he doesn’t waste a single MB. The download and upload happens elsewhere, so he got to keep his MB and get the job done faster too.

Adim and I are working to make it better so bloggers aren’t wasting time and money downloading a file just to reupload it again. We need a few people who want to test it out, so if you (or someone you know) can benefit from this, drop us a line below 🙂

Stay Awhile And Listen

I like building. There are very few activities that can monopolize my attention span for hours on end like assembling individual pieces to form something new. It doesn’t matter how trivial its purpose is. I obsess over it, sifting through all the components to find which combinations give me the best result. As useful as this has been for work, I didn’t get it from freelancing or employment. I got it from videogames.

Deckard Cain is a chatty old man in one of my favorite videogames of all times, Diablo II. He wants to tell you all about the world you’ve been dropped into. He wants to warn you about upcoming challenges. He wants to advise you on where you should be heading to make progress. All Deckard Cain wants to do is talk to you. His catchphrase is “stay awhile and listen”.

Conversations with Deckard Cain are useful because the world of Diablo 2 is gigantic. The maps you must explore are randomly generated every time you start the game, so you have to re-explore them every single time you play. The checkpoints scantily scattered across the world don't save your exploration progress. They save you time, but your objective is likely 3 or 4 (or as many as 8!) randomly generated maps away from the checkpoint anyway. Considering how long it takes to fully explore each map, it's really not a game you want to be running around blindly in.

Building for the web is very similar to the ever-morphing world of Diablo 2. Every browser is its own unique environment – no two are completely identical. Browsers change between versions, sometimes drastically. Internet Explorer is so different from version to version that accomplishing the same task requires special attention in each one. Web developers have leaned on collections of code to make working in browsers easier. Today there are so many different collections of code that choosing the right one for your project is a bigger problem than dealing with different browsers! If you're working with a team, you'd better pray they're willing to use the same collection of code you want to work with.

All of these “explorations” keep web developers from reaching the true objective: the thing we set out to build in the first place. To that, I’d like to channel Deckard Cain and give one piece of advice: “Stay awhile and listen”.

1. Listen to The Client
The desire to start building right away is so strong that it's entirely possible to miss what the client wants and build something they aren't willing to pay you for. Deckard Cain doesn't speak until you click on him, so don't expect to get nuggets of wisdom from your client without engaging them in a conversation. Whatever you do, get them to talk and be sure to listen. They may tell you in absolute terms what you should build. Sometimes they won't. What's most important is understanding why they need a solution in the first place. Nothing requires you to solve it in the same way the client expects it to be solved. In many cases, solving it exactly as they expect will result in something they're not willing to use! Listen to the client to understand why they need a solution. Let their problem guide you to the best approach.

2. Listen to Your Environment
What you're working with directly limits what you can build. There are things that are impossible to do in the browser. Of the things possible in the browser, some are impossible in your framework. Some features available in your framework are unfeasible due to your project structure. You really want to listen to all these constraints, because the solution you've envisioned might not be possible to execute due to circumstances beyond your control. Don't waste your time fighting them; your environment and what's available in it will tell you the limits of what you can accomplish.

The more you stay awhile and listen, the better the quality of things you build. This isn't limited to web apps or Diablo 2 characters. The quality of my relationships has improved as my ability to listen has increased. One habit I'm still working on is completing people's sentences. It's nice when I'm right, but the satisfaction is shallow and fleeting. It's downright embarrassing when I'm wrong. All things considered, I should be listening fully: focused on what they have said and never assuming what their next words will be. Finishing their sentences for them is not what Deckard Cain would advise me to do.

Building is hard. The challenges keep coming. Even when you’re “done”, you can expect more challenges to turn up. Don’t sweat it though. Repeat the following words and let them guide you to the right solution:

When the going gets tough, stay awhile and listen.