Early in my sales career, I was introduced to quality management and learned that errors are cheaper and easier to fix the earlier they are found – whether “earlier” applies to time, the stage of a process or anything else. A nice, simple fact that was easy to get my head around.
When I moved into selling IT application development services, my nice, simple fact turned up as a principle of software engineering. I learned that software bugs are cheaper and easier to fix the earlier they are found. I also learned that software design and continuous testing are critical to producing error-free code.
Fast forward a few years, enter the Internet and, along with it, the threat of people hacking into organisational and personal computers. The new world of cybersecurity opened, and I began selling cybersecurity solutions. I came across a strange phenomenon – in those early days, the security of many systems and the information they contained was often forgotten or, at best, bolted on as an afterthought. This seemed crazy; surely it would be better to build security in from the get-go? I wasn’t wrong.
I was fortunate to work with some rather clever technical folk who understood something known as ‘Secure by Design’ – principles developed in the 1970s, which took on new importance as more and more systems got joined up across the tech universe.
Secure by Design means that security is thought about early in the software development process. Security requirements are gathered and analysed with the same importance as any other requirement; they inform design decisions, which in turn drive software development. In good software development, tests are designed in parallel with the requirements, so it is easy to check later that what you set out to achieve has actually been achieved.
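To make that idea of “tests designed in parallel with the requirements” concrete, here is a minimal sketch (my own illustration, not from any particular project) of a hypothetical security requirement – “passwords must never be stored in plain text” – written as a test alongside the code that satisfies it:

```python
# A minimal sketch of a security requirement expressed as a test.
# The requirement and the function names here are hypothetical examples.

import hashlib
import os
import unittest


def store_password(password: str) -> str:
    """Hash a password with a random salt; never keep the plain text."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt.hex() + ":" + digest.hex()


class TestPasswordStorageRequirement(unittest.TestCase):
    def test_plain_text_is_never_stored(self):
        stored = store_password("correct horse battery staple")
        # The requirement: the stored record must not contain the
        # original password in any readable form.
        self.assertNotIn("correct horse battery staple", stored)

    def test_same_password_gives_different_records(self):
        # Salting means two users with the same password do not share a record.
        self.assertNotEqual(store_password("secret"), store_password("secret"))


if __name__ == "__main__":
    unittest.main()
```

The point is not the code itself but the habit: the security requirement exists as a checkable test from the very start, rather than being bolted on at the end.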
So, what’s all this got to do with Social Value? Let me explain.