Overview

On the Understandability of Today's Technology

An important success factor for today’s complex technological systems is understandability.

There is a race between the increasing complexity of the systems we build and our ability to develop intellectual tools for understanding their complexity. If the race is won by our tools, then systems will eventually become easier to use and more reliable. If not, they will continue to become harder to use and less reliable for all but a relatively small set of common tasks. Given how hard thinking is, if those intellectual tools are to succeed, they will have to substitute calculation for thought. - Leslie Lamport

The quote is quite long, but Leslie Lamport, one of the pioneers of distributed systems and the original developer of the LaTeX typesetting system, hits the nail on the head.

Building the most complex, cutting-edge technological systems is not sufficient on its own to be groundbreaking. We as humans need to build up the skills to keep up with our own inventions. As we become more and more specialized in our fields, we grow ever more dependent on others. We need to focus on understandability to empower others to use those systems.

Technological growth is exponential

Given the exponential growth of modern technology and the predicted singularity, it becomes even more important for humankind to keep up with its own technology - before that technology becomes uncontrollable and we no longer understand anything that is going on around us.

As a product manager, one rule of thumb is to always prefer reducing complexity and increasing understandability over letting your product slide into feature creep.

This rule applies to software engineers and other technical roles, too. As a software engineer, you build things that others will continue to use, update and evolve. Thus, you should aim to make these things easy to use - by providing clean APIs - but also easy to understand on the inside, as the sketch below illustrates.
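As a purely hypothetical sketch (the User type, function names and parameters here are invented for illustration, not taken from any real library), compare an API whose call sites are cryptic with one whose call sites document themselves:

// Hypothetical sketch - all names (User, fetchUsers, ...) are invented.
interface User {
  id: number;
  name: string;
  active: boolean;
}

// Harder to understand: at the call site, fetchUsers(true, 3, false)
// gives the reader no hint which argument means what.
function fetchUsers(activeOnly: boolean, retries: number, cached: boolean): User[] {
  console.debug(`fetching: activeOnly=${activeOnly}, retries=${retries}, cached=${cached}`);
  return []; // real implementation omitted in this sketch
}

// Easier to understand: a named options object with defaults makes
// every call self-documenting.
interface FetchUsersOptions {
  activeOnly?: boolean; // default: false
  retries?: number;     // default: 0
  cached?: boolean;     // default: true
}

function fetchUsersReadable(options: FetchUsersOptions = {}): User[] {
  const { activeOnly = false, retries = 0, cached = true } = options;
  console.debug(`fetching: activeOnly=${activeOnly}, retries=${retries}, cached=${cached}`);
  return []; // real implementation omitted in this sketch
}

// The intent is visible without opening the function's documentation:
const admins = fetchUsersReadable({ activeOnly: true, retries: 3 });
console.log(admins.length);

A call like fetchUsersReadable({ activeOnly: true }) reads almost like prose, whereas fetchUsers(true, 3, false) forces every future reader back into the function's signature.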

Especially for machine learning engineers, complex multi-layered deep neural networks are something of a black box. Things work, yet what is actually going on becomes less and less understandable. By giving up on explaining why certain model tweaks work the way they do, and by stacking layer after layer of opaque complexity on top, we lose control up to a point of no return.

So, whenever you design a system, think of the others who will use it in whatever manner.


Written by
Christian Konrad
Product Manager, UI/UX Designer and Web Developer in Frankfurt am Main, Germany. T-shaped, focused on building experiences by forming habits.