Every startup that develops software needs to think about quality. Certainly, not every startup creates mission-critical code or medical software where bugs can kill patients. But as a startup, you don’t want to scare away your lead customers and early adopters with failing software either. While many software quality aspects only pertain to a fully established company, you need to build quality into the company mindset from the get-go. Trying to add quality later requires a change of process and mindset, and you may even have to fire and hire to get it right.
So what about quality? Good software quality comes in many different forms:
- Good product managers who understand that quality is not free and will steal time from feature development.
- A strong architecture focused on maintainability and simplicity, free of so-called accidental complexity.
- Writing good specs and test plans to capture the high-level concept before you dive into the code and lose touch with the design.
- Consistent reviewing and/or pair programming.
- Defining and adhering to good coding standards, and enforcing the basics directly in your version management system, e.g., using spaces instead of tabs so that the nesting of conditional statements and loops always looks the same.
- Using static code checkers (linting) and enforcing that the number of violations at least goes down with every commit.
- Making sure the code fails early with continuous integration and regression tests on every platform you support.
- Test-driven design with an extensive test suite that includes unit tests, integration tests, packaging (install and post-install) tests, and random tests (“fuzzers”) where possible.
- Performance and memory tests in your CI to immediately track performance regressions before it is too late.
- Tracking your code quality in a bug tracker, allowing you to detect the weak spots where most bugs originate.
- Measuring code coverage to detect weak spots in your test suite.
- Running static and dynamic analysis tools as part of continuous integration to find non obvious bugs, duplicate code, and potential security vulnerabilities.
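To make the performance-test item in the list above concrete: the core of such a check is just timing a workload and comparing it against a stored baseline, with some slack for CI noise. A minimal sketch; the best-of-five timing and the 20% tolerance are assumptions, not a recommendation for every workload:

```python
import time

def measure(func, *args, repeats: int = 5) -> float:
    """Best-of-N wall-clock time in seconds; taking the best run filters out scheduler noise."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        func(*args)
        best = min(best, time.perf_counter() - start)
    return best

def check_regression(current: float, baseline: float, tolerance: float = 0.20) -> bool:
    """Pass only when runtime does not exceed the baseline by more than the tolerance."""
    return current <= baseline * (1.0 + tolerance)
```

Store one baseline number per benchmark in the repository and have CI fail the build when `check_regression` returns `False`; widen the tolerance if your build machines are noisy.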
That is a long list. Here is what usually happens in startup land: “Sure, I run agile development, so I don’t need to write specs” and “I don’t have time for all this because we need to get our first product to the customer!” So what happens if you don’t build in quality up front?
Fixing quality afterwards means closing your shop
So you’ve released the first product and written a million lines of code. And it starts failing at your customers’ sites. So you decide to start writing tests. For the million lines of code you have already developed? That means closing shop for months!
While writing tests you will probably uncover performance, power consumption, or security problems in code that was bolted together without a solid design. What could have been a simple refactoring now means starting from scratch to redesign the code. Netscape did a full redesign from scratch, and in the process their market share plummeted. Joel Spolsky on Netscape’s demise:
“They did it by making the single worst strategic mistake that any software company can make: they decided to rewrite the code from scratch.”
A lack of tests is common in code that is acquired from elsewhere, not least from open source. The only way a quality-minded company can work with such code is to spend significant time adding tests. One proven approach is to first set up continuous integration and code coverage measurement, and then to incrementally improve the code coverage by adding tests.
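The coverage ratchet described above can be as simple as a script that compares the current total coverage percentage (for example, the number reported by coverage.py or gcov) against a stored baseline and only ever lets it rise. A hedged sketch; the baseline file name and the 0.1-point tolerance are assumptions:

```python
"""Sketch of a CI coverage ratchet: total coverage may rise but never fall."""
from pathlib import Path

BASELINE_FILE = Path(".coverage_baseline")  # assumed location, persisted between CI runs

def check_coverage(current: float, baseline: float, tolerance: float = 0.1):
    """Return (ok, new_baseline); the tolerance absorbs measurement jitter."""
    if current + tolerance < baseline:
        return False, baseline           # coverage dropped: fail the build
    return True, max(current, baseline)  # ratchet the baseline upwards

def gate(current: float) -> bool:
    """Read the stored baseline, apply the check, persist the new baseline."""
    baseline = float(BASELINE_FILE.read_text()) if BASELINE_FILE.exists() else current
    ok, new_baseline = check_coverage(current, baseline)
    if ok:
        BASELINE_FILE.write_text(f"{new_baseline:.2f}")
    return ok
```

The first run seeds the baseline with whatever coverage you have today; from then on the only way forward is up.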
Similarly, if you run static analysis tools and coding standards such as MISRA on an existing code base, you will spend weeks getting to a clean report. Adding all the exceptions to suppress false positives and getting the reported violations back to zero is virtually impossible, especially if the original coder is no longer around to explain the intention of the code or the constructs used.
What if your customer demands that your code is certified? You’d have to show the specs, show how each requirement can be traced to a test plan and to a passing test. And vice versa, from test to plan to requirement. Again, creating such documentation after the fact is a huge effort. An important aspect is that if your developers never got used to writing such design documentation, you will first have to change their mindset. Imagine telling a free climber he all of a sudden must use a safety rope.
A culture of quality
Often, the culture of a startup is defined by its founders. That includes the coding/quality culture ingrained in the development team. If the founders code like cowboys, you bet the R&D culture will reflect that. One of the hardest things to change is culture. So make sure you think about what quality culture you want to lay down.
So where to start? Make a good assessment of what elements you need for your domain. Agile teams often devote the first iteration(s) to setting up a good development and quality environment. Such a first “infrastructure sprint” will always pay off in later sprints, saving time in building, testing, debugging and reverse engineering. The absolute minimum I recommend to get going:
- Version control, e.g., git with a central server to host all repositories (GitHub, Bitbucket, GitLab), and a document describing how you will use the versioning system: do you work on branches with pull requests, are there release branches, etc.?
- Continuous integration, e.g., Jenkins to make sure you build and test all software in a reproducible way and that you detect problems early.
- A (simple) bug tracker, e.g., Jira to make sure you have a way of capturing issues and feature requests.
- Some form of documentation system to write specs, e.g., a wiki like Confluence or MoinMoin or special-purpose tool.
- A wiki to, at the very least, document your development environment and the specifics of installing tools, etc.
- A (minimal) coding standard to make sure the code looks uniform and remains portable.
- A simple hook in your versioning system, e.g., a git hook, to enforce minimal coding standard requirements such as spaces instead of tabs, maximum line length, and a commit message that refers to the planning item or issue ID. You can find many examples of these online.
- A (basic) framework for writing unit tests and running them as part of continuous integration, e.g., Google Test or the xUnit set of tools. Without such a framework in place, a developer will quickly deem it too much effort to write automated tests and will resort to home-grown tests that live only on their own PC and are not maintained once the story is done.
- A brief description of the development process, to keep everybody honest and have a basis to improve on. One aspect to include are the rules you define for reviewing and/or pair programming. This is part of your definition of development culture.
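As an illustration of the git-hook item in the list above, here is a sketch of the checks such a hook could run. The issue-ID pattern (`PROJ-123` style) and the 100-character line limit are assumed conventions; adapt them to your own standard:

```python
"""Sketch of minimal git hook checks: indentation, line length, commit message."""
import re

MAX_LINE = 100  # assumed limit; pick whatever your standard says
ISSUE_ID = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")  # assumed convention, e.g. PROJ-123

def check_source(text: str):
    """Return a list of coding-standard violations found in a source file's contents."""
    problems = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if line.startswith("\t"):
            problems.append(f"line {lineno}: tab indentation (use spaces)")
        if len(line) > MAX_LINE:
            problems.append(f"line {lineno}: longer than {MAX_LINE} characters")
    return problems

def check_commit_message(message: str) -> bool:
    """A commit message must reference a planning item or issue ID."""
    return ISSUE_ID.search(message) is not None
```

Wire `check_source` into `.git/hooks/pre-commit` over the staged files and `check_commit_message` into `.git/hooks/commit-msg`, rejecting the commit when either reports a problem.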
Sounds expensive? Most tools are open source, or vendors have a startup-friendly license. Sounds like a lot of work? Not really: most of these can be set up in a few days, saving weeks to months later on. There are even specialist service companies that can set up such a “software street” for you. Typically that includes static and dynamic analysis tools (and a large invoice).
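To show how little a minimal test-framework setup requires, here is a sketch in the xUnit family mentioned in the list above, using Python’s built-in unittest (Google Test plays the same role for C++). The `parse_version` function is a made-up example target:

```python
"""A minimal xUnit-style test module, runnable as part of CI."""
import unittest

def parse_version(text: str):
    """Example function under test: '1.2.3' -> (1, 2, 3)."""
    return tuple(int(part) for part in text.strip().split("."))

class ParseVersionTest(unittest.TestCase):
    def test_simple(self):
        self.assertEqual(parse_version("1.2.3"), (1, 2, 3))

    def test_surrounding_whitespace(self):
        self.assertEqual(parse_version(" 2.0.1\n"), (2, 0, 1))

    def test_garbage_raises(self):
        with self.assertRaises(ValueError):
            parse_version("not-a-version")
```

A single CI step such as `python -m unittest discover` then runs every test module in the repository, so tests live next to the code instead of on one developer’s PC.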
What if you are already (partially) in the woods? How do you maintain good quality over time? In my experience, the key mantra is:
Leave code you touch behind in a better shape than how you encountered it.
A small refactoring to clean up a botched design, a few fixes to remove coding standard violations, an extra test for every bug triggered in the field, etc. Get this into the mindset and daily habits of your programmers, and your code quality will improve remarkably fast.
What really works to support this incremental improvement is tooling that tracks the quality. We developed a simple script at Vector Fabrics to run a coding style checker such as Pylint on Python code at every commit and push to git. The script enforced that newly written files had to adhere to a certain minimal quality, and that edits to an existing file were not allowed to make things worse. Over time, you can raise the bar and require every edit to reduce the number of coding style violations. At a larger scale, you can enforce this in continuous integration, keeping a trend of your overall quality from static analysis tools and the like. The Dutch company TIOBE does this nicely, creating an “energy label” for your software, e.g., denoting your software quality as class D. While this is based on some magic formula that combines trends in static analysis, code coverage, etc., the simple view does give a strong incentive to improve or stay on top.
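A sketch of such a per-file ratchet follows. This is not the original Vector Fabrics script; the pylint invocation, the issue-matching regex, and the JSON baseline file are all assumptions:

```python
"""Sketch of a per-file lint ratchet: new files must be clean,
edits to existing files may not make things worse."""
import json
import re
import subprocess
from pathlib import Path

BASELINE_FILE = Path(".pylint_baseline.json")        # assumed storage format
ISSUE = re.compile(r"^.+?:\d+:\d+: [CRWEF]\d{4}:")   # pylint-style message lines

def violations_in(path: str) -> int:
    """Count pylint violations in one file (pylint exits non-zero on issues)."""
    out = subprocess.run(["pylint", "--score=n", path],
                         capture_output=True, text=True, check=False).stdout
    return sum(1 for line in out.splitlines() if ISSUE.match(line))

def gate(path: str, current: int, baseline: dict) -> bool:
    """New files (absent from the baseline) must have zero violations;
    edited files may not exceed their recorded count."""
    return current <= baseline.get(path, 0)

def main(changed_files) -> int:
    baseline = json.loads(BASELINE_FILE.read_text()) if BASELINE_FILE.exists() else {}
    ok = True
    for path in changed_files:
        current = violations_in(path)
        if gate(path, current, baseline):
            baseline[path] = current  # ratchet: today's count is tomorrow's limit
        else:
            print(f"{path}: {current} violations (allowed {baseline.get(path, 0)})")
            ok = False
    BASELINE_FILE.write_text(json.dumps(baseline, indent=2))
    return 0 if ok else 1
```

Call `main()` from your pre-push hook or CI job with the list of changed files; once a file's count reaches zero, it can never regress.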
Bootstrap quality before it is too late
Summing up: changing a running software team is painful, so you had better make sure good quality is ingrained in your way of working right from the start. And with open-source tooling and the legacy of agile and extreme programming, that is not as hard as you may think.
What techniques and processes would you consider the minimum required?