The Future of Computing Part II – The Problem
Last week, in our first installment, we talked about the "purpose" of technology: to make the lives of humans easier. In this post, we'll talk about the problem with today's thinking about how technology is used.
Most software applications do not run directly on the computer hardware. Instead, a special software application called an Operating System provides a layer of abstraction between the hardware and typical software applications that people use.
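As a minimal sketch of what that layer of abstraction looks like in practice, consider reading and writing a file in Python. The application asks the operating system to do the work and never touches disk controllers, file system layouts, or device drivers:

```python
import os
import tempfile

# The application expresses intent ("save this text, read it back");
# the operating system handles every hardware-level detail.
path = os.path.join(tempfile.mkdtemp(), "note.txt")

with open(path, "w") as f:   # OS allocates storage and updates metadata
    f.write("hello")

with open(path) as f:        # OS locates the data and streams it back
    contents = f.read()

print(contents)  # -> hello
```

The same program runs unchanged on a spinning disk, a solid-state drive, or a network share, precisely because the operating system hides those differences.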
What is an Abstraction?
An abstraction is a way of thinking that enables us to ponder what is required without needing to understand the specific details of how to do it. For example, a computer is an abstraction composed of components like a central processing unit (CPU), memory, storage, input and output. Each of these components is in turn an abstraction that enables us to focus on higher-level concerns rather than get lost in the details of how it works. Abstractions enable us to break complex problems down into smaller, manageable problems that other people can work on.
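A toy sketch of this idea in code: callers of a function like the hypothetical `average` below think in terms of "the average of some numbers" and never see the loop, the running total, or how numbers are represented in hardware.

```python
def average(numbers):
    """Return the arithmetic mean; the 'how' is hidden from callers."""
    total = 0.0
    for n in numbers:        # detail the caller never needs to see
        total += n
    return total / len(numbers)

# Higher-level code composes abstractions without opening them up.
print(average([2, 4, 6]))  # -> 4.0
```

Because the interface is all that matters, the details inside `average` could change completely without affecting any code that uses it.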
Software applications are inherently limited by the abstractions and constraints imposed by the computer's hardware and the underlying operating system on which they execute. Input devices, processing power, memory, storage, output devices and network capabilities limit what can be produced.
Yet, even with these constraints, the sophistication of software that currently exists in our world is breathtaking.
These powerful abstractions have shaped, and continue to shape, the way that we think about, design, write and use software.
We now have applications to help us perform tasks like composing letters and emails, and creating complex spreadsheets that inform powerful business decisions. We also have ecommerce web applications that enable consumers to purchase just about anything, whenever they desire.
Applications have, most certainly, opened up new ways for us to be more effective. But we are near the limits of what we can do by continuing to think in terms of applications.
Software applications are inherently compartmentalized due to the way developers write software. They have to be in order for teams of people to collaborate effectively and efficiently. As a result, this modularization and decomposition has produced an extremely large number of disparate software applications.
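A hedged sketch of why that compartmentalization happens: each team owns a module with a narrow interface, so teams can work in parallel. The module names below (`spell_check`, `layout`, `render_document`) are hypothetical, chosen only for illustration.

```python
def spell_check(text):
    """Owned by one team; internals elided for this sketch."""
    return text

def layout(text, width=40):
    """Owned by another team; breaks text into fixed-width lines."""
    return [text[i:i + width] for i in range(0, len(text), width)]

# The "application" is just a composition of independently built modules.
def render_document(text):
    return layout(spell_check(text))

print(render_document("a" * 50))  # -> ['aaaa...a' (40 chars), 'aaaaaaaaaa']
```

Each boundary makes collaboration tractable, but multiplied across an industry, those same boundaries yield thousands of separate applications with separate interfaces.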
With all of these software applications, there is an underlying assumption that people want or need to know which applications are required to perform a specific task. This assumption is fundamentally flawed and places unnecessary constraints that limit the solutions we can design. It also increases the cost of learning. For instance, each new application that people use MAY require them to learn a new way of performing a task.
If a person needs to write a letter, an application-centric philosophy forces the person to launch the appropriate word processing application to do so. And if they don't have such an application installed, they are required to find it, purchase it (if necessary), and install it before they can do what they originally intended. While cloud computing applications eliminate the need to install each application, they do require a modern web browser to access the functionality.
In our next post, we will talk about a different philosophy that will help shape our thinking and orient us to invent a more powerful future.