If the internet is the nervous system for a new global culture, should it be artificially limited by human self-interest?
Cory Doctorow ended a harrowing editorial on artificially limited computing in WIRED this month with the observation that the internet isn’t simply an information medium but has, in fact, become the nervous system of the twenty-first century.
Doctorow begins by questioning why we shackle computers with controls that users can’t override, and in many cases don’t even know exist. He uses the example of the Sony rootkit, which installed hidden software on machines whenever a consumer played one of the company’s music CDs. The idea was to curb piracy; the result was a blind spot on millions of customers’ machines that hackers immediately exploited.
Whenever we build a computer that is subservient to anything other than the user, we create blind spots that hackers can exploit. Whenever our software or hardware is artificially limited to satisfy outside interests, whether governmental, commercial, or even educational, we are building a flawed machine.
There is a simple honesty to computing that I find very appealing. When students are building a circuit, working with a computer, or coding, they will often say that they didn’t change anything but got a different output, or that they did everything exactly right and it still doesn’t work. The subtext is always that the computer is up to something. Whatever the computer is up to, you put it up to it. Computers don’t make mistakes; humans do. This is why it’s vital that computers are not controlled by remote interests. When remote interests dictate what a machine does, you end up with confused users who start to blame the machine.
… because someone programmed HAL to kill. Machines don’t make mistakes, unless people tell them to.
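To make that concrete, here is the kind of “I didn’t change anything, but the output changed” moment students blame on the machine. This is only an illustrative sketch of my own (the function names are invented for the example, not from Doctorow’s piece): Python’s well-known mutable-default gotcha, where the machine is doing exactly what a human told it to.

```python
# A classic "the computer is up to something" moment.
# The computer is doing exactly what the code says: the default list is
# created once and reused on every call, so state quietly accumulates.

def add_score(score, scores=[]):   # the human mistake: a shared, mutable default
    scores.append(score)
    return scores

print(add_score(10))  # [10]      -- looks right the first time
print(add_score(20))  # [10, 20]  -- "but I didn't change anything!"

# The machine didn't change its mind; a human wrote the shared default.
# The fix is also human:
def add_score_fixed(score, scores=None):
    if scores is None:
        scores = []               # a fresh list on every call
    scores.append(score)
    return scores

print(add_score_fixed(10))  # [10]
print(add_score_fixed(20))  # [20]
```

The surprise traces back, every time, to something a person put there. You put the computer up to it.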
I’ve long said that computers are merely tools, but many people see them as intelligent entities with hidden agendas. If we allow institutions to hard-code their interests into our computers, then we are intentionally allowing our flaws to infect one of the most honest expressions of human ingenuity. We also reinforce the confusion that computers are entities with evil intent (when in fact we provide the intent).
What goes for our personal devices also goes for our networks. Unless we continually battle for net neutrality and efficiency over self-interest, we’re going to find ourselves with hobbled machines on near-sighted networks, seeing only what vested interests want us to see. In that environment, computers and the internet can very quickly move from democratizing force to Orwellian control. Keeping computers free of that kind of human influence is vital to human well-being.
I’ve been uneasy about the nature of the modern internet as a distraction engine, as well as the branding of edtech. Both examples reek of the infected human influence that Doctorow refers to in his editorial. Wouldn’t it be ironic if we, as a species, were on the verge of building a more perfect machine that could let us move beyond our short-sighted selves, but instead of building that wonder we infected it with our own shortcomings and ended up using it to create a kind of subservience never before imagined?
I see it every day in machines so locked down that they barely function as computers, with limitations on virtually everything they do. This is done for ease of management, to satisfy legal paranoia and, ultimately, to ease the burden of digitally illiterate educators. But this approach means I watch whole generations growing up in an increasingly technology-driven world with no idea what it is or how it works. As a computer technology teacher, this is difficult to swallow.
The only restrictions on a computer should be the laws of physics and the state of the art. Efficiency and user empowerment should be the machine’s, and our, only focus. Everything should be up to the user; otherwise these magical machines aren’t empowering us, they’re being used to create dangerous fictions. Is it difficult to teach students how to use computers like this? Perhaps, but at least we’d be giving them a genuine understanding of what digital technology is and how to wield that power responsibly. All we’re doing now in education is feeding the infection.