(Previously published on DZone) In four years, the Agile Manifesto will celebrate its silver jubilee. I want to devote another post to these sensible recommendations you find hanging in poster format on office walls the world over. Agile as a noun has become solidified in the established practice of software development to the point where many treat its values as if descended from Mount Sinai on stone tablets. They are not eternal truths and can make no such claims. They were written when Perl/CGI was the go-to stack for web apps. Agility encompasses adaptability, so the values themselves must be constantly re-evaluated and questioned.
The Agile Manifesto summarizes its values and principles on a single sheet of paper. The Scrum Guide needs more space, but it’s still concise enough to be called a summary. That needn’t be a problem. Many complex cooking recipes fit on a single page. But Agile, however you define it, can at best only be a set of ideas and values, never a recipe or how-to guide for building great software – Scrum doesn’t even talk about software. That’s why the agile method is a deceptive term. The Oxford dictionary definition of method resembles that of a recipe quite well: “A particular procedure for accomplishing or approaching something, especially a systematic or established one”. It’s funny that the entry should give a method for software maintenance as an example. The lexicographer probably thought refactoring is like plastering a ceiling: something hard that requires skill, but no originality or imagination.
My last Java certification is the SCJP 6 from 2010, so when the OCP 17 came out, I thought it high time for a refresher. I am setting aside an hour every day when I’m at my brightest, which is after coffee and before breakfast. I’ll share my personal study tips later, but it’s more credible if I first pass the exam. So, I found another topic to distract me from my studies that I want to dwell on. It’s the effect that modern IDEs and resources like Stack Overflow are having on our brains. Once you dig into the 900+ pages of schoolwork that is the OCP Study Guide, you’re painfully reminded that they have been making us lazy and maybe even a bit stupid.
Previously published on DZone. Date/time logic in code is where the messiness of the real world upsets the relatively straightforward rules of the digital realm. Blame the bewildering hodgepodge of edge cases on the movement of celestial bodies and Pope Gregory XIII (of the Gregorian calendar), but deal with it you must. I’m sure you all know that the 31st of December can fall in week 52 or 53, while the 1st of January can fall in week 0 or 1, and that you know these rules by heart for various countries (oh yes!). I’m also confident you can hand-code the logic to calculate the difference in seconds between two representations that span multiple time zones as well as a jump in daylight saving time.
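Of course, nobody should hand-code any of this. As a sketch of how java.time already covers both quirks (the dates and time zone here are just illustrative picks, not from the original):

```java
import java.time.Duration;
import java.time.LocalDate;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.temporal.WeekFields;
import java.util.Locale;

public class DateTimeQuirks {
    public static void main(String[] args) {
        // 1 January 2021 (a Friday) lands in different week numbers
        // depending on whose rules you follow.
        LocalDate newYear = LocalDate.of(2021, 1, 1);
        int isoWeek = newYear.get(WeekFields.ISO.weekOfWeekBasedYear());          // 53
        int usWeek = newYear.get(WeekFields.of(Locale.US).weekOfWeekBasedYear()); // 1
        System.out.println("ISO week: " + isoWeek + ", US week: " + usWeek);

        // Difference in seconds across a DST jump: Amsterdam clocks sprang
        // from 02:00 to 03:00 on 28 March 2021, so these two wall-clock
        // times are only one hour of real elapsed time apart.
        ZoneId ams = ZoneId.of("Europe/Amsterdam");
        ZonedDateTime before = ZonedDateTime.of(2021, 3, 28, 1, 30, 0, 0, ams);
        ZonedDateTime after = ZonedDateTime.of(2021, 3, 28, 3, 30, 0, 0, ams);
        System.out.println(Duration.between(before, after).toSeconds()); // 3600
    }
}
```

The point is that the library, not your own cleverness, should own these rules: `WeekFields` encodes the per-locale week conventions, and `Duration.between` works on instants, so the DST gap is handled for free.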
In a room full of architects, no two will agree about their exact job description, or so the joke goes. In a similar vein, someone on our team had a refreshing solution to another persistent bone of contention: how do you define an integration test? Don’t try to reconcile the differing opinions; just stop using the term altogether. I liked it very much. Rather than think in terms of unit and integration tests, it’s more helpful to distinguish between tests that validate code and tests that validate public interfaces, and to recognize that within each category the scope varies for each individual test.
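A minimal sketch of what that distinction can look like, with a hypothetical `Checkout` type of my own invention: a test that exercises only the public interface keeps passing however the code behind it is restructured, whatever the test's scope.

```java
public class InterfaceVsCodeTest {
    // Hypothetical public interface: callers only ever see this contract.
    interface Checkout {
        long totalCents(long unitCents, int quantity, double taxRate);
    }

    // One possible implementation; its internals are fair game for refactoring.
    static class SimpleCheckout implements Checkout {
        public long totalCents(long unitCents, int quantity, double taxRate) {
            return Math.round(unitCents * quantity * (1 + taxRate));
        }
    }

    public static void main(String[] args) {
        Checkout checkout = new SimpleCheckout();
        // This validates the public interface, not the code behind it.
        // Whether SimpleCheckout talks to a database or does pure arithmetic
        // is a statement about the test's scope, not its category.
        long total = checkout.totalCents(100, 3, 0.21);
        System.out.println(total); // 363
    }
}
```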
I started my programming career without a proper computer science qualification, which wasn’t exceptional in the Netherlands in the wild years preceding the dotcom boom. A sensible dose of impostor syndrome and a lucky sense of how best to fill the knowledge gaps have stood me in good stead. Starting out as a glorified amateur myself, I have always sympathized with the poor end user, perhaps out of a sense of my own bewilderment with all this needless complexity. I was an early fan of Jakob Nielsen’s Alertbox column; he has been writing about (web) usability since the nineties.
I don’t think there has ever been a time when computers and software had the same gentle learning curve as using a toaster. Certainly not in the 1950s, when programmer and user were the same person, i.e., an engineer. Dedicated systems remained far from idiot-proof long after that and took considerable (memorization) skills to master. Before barcode scanners became common, checkout operators at German ALDI supermarkets had to enter all prices from memory (items had no price tags). It must have been a steep learning curve, but the checkout sure was fast! Such mental feats were required less than a generation ago for a job we would now classify as unskilled labour.
In a recent article, I argued that it is becoming even harder to be a credible full stack developer, one who is sufficiently up to date to contribute to every stage of developing a modern web- and/or mobile-based application. You’re not expected to build it all by yourself, obviously, but given enough time and with a little help from Stack Overflow, you could.
True, a battle was raging between competing web “standards,” with proprietary goodies like Explorer’s marquee tag and Netscape’s layers. But otherwise, the stack was reassuringly stable and manageable. The necessary tools on your belt didn’t weigh you down. And we just built simpler things back then, before the internet became indispensable. Today’s demands for scaling and security are much higher.
Previously published on DZone. There are good reasons to give key stakeholders the opportunity to officially sign off on a new software release. We need some formal approval from the people who commissioned it, or at least their delegates. This last stage prior to release is commonly called the user-acceptance test and is executed in a UAT environment. It’s an indispensable stage, but treating it as the final step in testing is problematic for several reasons.
Let me start with a car example. Dealerships are generous with free test drives for the same reason that clothing stores let you try on three different shirts. It’s to let the endowment effect do its dirty work: wearing (or driving) something makes it feel more like you already own it. It gives you a taste of the look and feel and acts as a catalyst for closing the deal. It’s not about really testing the vehicle; they expect it back unscratched. Toyota doesn’t need its customers taking an active part in its QA process.
Many non-technical skills, qualities, and mindsets are part of software craftsmanship. Today I want to talk about two:
Resilience helps us cope with difficulties by not giving up too soon. Improvisation deals with compromise and creativity in the face of the unexpected. The intuition to distinguish negotiable best practices from unchangeable truths is what agility is about. In earlier posts, I drew analogies with people from art and fiction (Stanley Kubrick, Woody Allen, and Gomer Goof).
Here’s the story of an agile musician I find very inspiring.
Previously posted on DZone. Joel Spolsky’s once prolific blogging output dried up years ago, but Things You Should Never Do, Part I is still a classic after 22 years. He wrote it as an outsider’s postmortem following the first beta of version 6 of Netscape’s browser, three years after the previous major release, version 4. There never was a version 5. The team had decided on a full rewrite, and the resulting delay probably cost them their competitive advantage over Microsoft’s Internet Explorer.
“If Netscape actually had some adult supervision with software industry experience, they might not have shot themselves in the foot so badly”, he closes.
I like reading computing history and consider to what degree those articles are relevant today. Here’s the gist of the original argument in my own words.
By stalling development on their current product, Netscape appeared to have closed for business, at least from the end user’s perspective. That was strategically disastrous. It made no business sense to abandon their flagship product like that. From a technical standpoint, it was even worse to throw away all working code and start from scratch. Most code is harder to read than to write, but you should always prefer refactoring to rewriting. A full rewrite is no guarantee that you won’t reintroduce the very bugs that were so painstakingly discovered and fixed in the old code base.
Originally published on DZone. There’s not much buzz about design patterns these days. They appear to have joined the hall of fame of accepted wisdom, alongside the Silver Bullet, SOLID, and DRY. Lately, I had the opportunity to share some thoughts on the importance of good old design patterns with Koen Aerts, CTO of Team Rockstars IT. Here’s the gist of that talk in a more digestible format.
Before I start, let me set some boundaries, as people can get doctrinaire about definitions. I refer to the good old bridge, builder, decorator, and factory patterns. Architectural patterns like MVC do not fall into the same category, much less paradigms like serverless and microservices (aka SOA, the next generation).
Then again, the latter do constitute a legitimate grey area. They’re clearly about design, patterns, and best practices as well. They offer standard solutions to common challenges. But what makes them different is the scale at which they operate. Classic design patterns are recipes for manageable bits of code, solutions that often fit in a single screen. They explain how to stack the bricks for your new house, whereas microservices show you how to lay out the entire neighborhood.
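To make the scale difference concrete, here is a screen-sized sketch of one such classic, the decorator; the `Notifier` types are hypothetical examples of mine, not from the talk:

```java
public class DecoratorDemo {
    interface Notifier {
        String send(String message);
    }

    static class EmailNotifier implements Notifier {
        public String send(String message) {
            return "email sent: " + message;
        }
    }

    // The decorator: wraps any Notifier and adds behaviour
    // without modifying the wrapped class.
    static class UppercaseNotifier implements Notifier {
        private final Notifier inner;

        UppercaseNotifier(Notifier inner) {
            this.inner = inner;
        }

        public String send(String message) {
            return inner.send(message.toUpperCase());
        }
    }

    public static void main(String[] args) {
        Notifier notifier = new UppercaseNotifier(new EmailNotifier());
        System.out.println(notifier.send("server down")); // email sent: SERVER DOWN
    }
}
```

The whole solution fits on one screen, bricks rather than neighborhoods, which is exactly what separates it from an architectural paradigm.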