Machines That Challenge

Tim’s Motorcycle Diaries

I’ve been taking a break from writing about work over the holiday and have instead been writing about my new love: motorbikes.  In the last post I wrote about mechanical empathy and how a machine that challenges you can also encourage growth; this resonates with technology and how we use and teach it.

Using technology without understanding it is what we aim for in school because we have so many other important things to get to.  I think this is a fundamental mistake.  Using a tool in ignorance means you’re never really using the tool effectively.  I’m not suggesting that everyone needs to be a computer engineer in order to use computers, but there is a fundamental level of familiarity you need in order to use any machine, including a computer, effectively.

A machine that does too much for you, even to the point of making decisions for you, is a dangerous machine indeed.  An education system that caters to this kind of thinking is equally dangerous.  Our use of technology should never be founded on ignorance.

“That a machine should place demands on us isn’t a bad thing, especially if it leads to a nuanced awareness of our own limitations.  The machine that can overextend you, challenge you, stress you, is a machine that can teach you something.  We fool ourselves into stagnation when we design machines that do more and ask less from us.”

There is a consumerist drive to produce machines that appear to be our servants, that will do what we want, sometimes without us even knowing that we want it.  This kind of magical thinking might sell units but it doesn’t offer any room for growth.  That educators are willing to cater to this approach isn’t very flattering.

I’d originally written on this from the point of view of motorcycling, which makes extreme demands on the rider.  Compared to driving a car, especially a modern car that shifts, brakes and even parks for you, riding a motorcycle is a physical and mental challenge.  In that challenge lies a great deal of risk and reward.  The opportunity to amplify your thoughts and actions through a complex, nuanced, challenging machine is a growth medium.  Growth in our students is what we should always be aiming at, even in using the tools we hand them.

In extreme cases machines take over decision making for us, reducing us to irrelevance.  Teachers need to be especially vigilant about how students use technology.  It’s very easy for the tech to take over (it only wants to help!)  and the human being it’s supposed to be assisting becomes a passenger.

When we use a machine to amplify ourselves it not only magnifies our achievements, it also subtly changes how we create.  Any teacher who has observed the digitization of student work in the past ten years has noticed how cookie-cutter the material has become.  Plagiarism is just one aspect of the cut and paste nature of modern student work.

Even in a scenario where the machine is a responsive tool, it will colour how you create.  Some technology is even predicated on this thinking.  Your degree of technical understanding minimizes this influence and allows you to side-step homogenized technological presentation.  If you don’t care that what you are producing has been cookie-cuttered into a template that looks like everyone else’s, then what does that say about what you’re learning?  If you’re using technology to do something else you need to understand the technology in order to realize how it’s colouring your learning.

It’s a shame that so many of us prefer machines that will do it all for us rather than taking up the slack ourselves.  There are two ways we can integrate with machines; I’ll always go for the road less traveled and ask for a machine that offers me more opportunity, even if it also demands more expertise.

Scripted Lives

I’ve been mulling this over on the motorcycle side of things, but the idea runs throughout modern digital life, so I’m going to open it up further here.

Being a computer technology teacher I have a passing acquaintance with software.  I’d even say I’m pretty handy with it, but I don’t really like where it’s going since it has become an integrated part of modern life.

Turing tests have computers imitating people in order to demonstrate intelligence.  Ironically, it’s pretty much the opposite nowadays: instead of bringing machines up to human intelligence, we’re watching human intelligence lower itself to a simpler standard.

Since we started carrying networked computers around with us we have become scripted creatures.  Our devices wake us up, tell us what we’re doing, and how to get where we’re going.  They remove doubts and make memory redundant.  We no longer guess at unknown information, or watch media by accident.  We live in a walled garden of playlists and information at our fingertips; surprises seldom happen.  Technology gives us access to information and media, as well as allowing us to communicate, but it changes how we do it; the medium is indeed the message.
When we connect to The Network we are operating within a script, quite literally, all the time.  Software scripts dictate what we see, how we see it, and how we express ourselves. Complex human relationships are being reduced to scripted simplicity dictated by technological limitations rather than the full range of human ability.  This restriction has begun to redefine what people are capable of doing.

I struggle to find non-scripted moments when software isn’t dictating my responses.  You’d think this only happens when you make a choice to connect on a device, but it happens constantly in the world of action.  I can’t stop my car in heavy snow as quickly because a computer steps in to keep the wheels turning, even when I’m making a conscious choice to lock them.  Scripts are written for the largest possible population.  We’re all being held to the outcomes of average thinking.

As Kenneth Clark states the obvious in Civilisation (at 35:36): “…our increasing reliance on machines. They have really ceased to be tools and have begun to give us directions…”

… and that was his angle on things in 1969.  Things have come a long way since.  Our brave new world of technology is levelling everyone off.  Individual ability doesn’t matter when we are all just variables in an equation.

Students experience education, entertainment and interpersonal relationships through a digital lens whose singular intent is that of continued engagement.  When your world is housed within a simplistic digital process designed to constantly get your attention you have a lot of trouble dealing with your irrelevance in the real world.

When prompted into unscripted situations where I am asking them to critically analyze a piece of media, students long for a Google search to tell them what to think.  When given an opportunity to express themselves, many students will leap into the same template to organize other people’s material copied off the internet.  When given a stochastic engineering problem with no clear, linear resolution they freeze up and long to return to scripted experience.

Technology is such an enabler, but it’s also limited by its capabilities.  If friendship is now understood through the lens of social media then it isn’t what it once was, it’s less with more people.  More isn’t necessarily better even though we’re told that it is more efficient.  If communication with a student is primarily through screens then teaching isn’t what it once was, it’s more information with less learning.  Both friendship and teaching pre-date digital communication and have deep, nuanced social histories, but we are happy to simplify them into oblivion for convenience and the illusion of efficiency.

If you ever find yourself struggling against invisible limitations, fighting to express yourself but finding it increasingly difficult, you’re up against this reductive technology.  That freedom of choice you feel when you put aside the digital and reclaim your full range of sense and capability is intoxicating.  It supercharges your mind and allows you to retain your humanity.  That I see so few people having those moments is a real cause for concern.

My son and I searching the tidal pools at Pacific Rim National Park on the edge of the world.  Carefully selected technology (a motorbike – so no digital distractions and out in the world) got us there, and then we put it all down and got lost in the world with no scripts telling us how to interact with it.  When was the last time you were unplugged?

 

This was such a complicated idea it spawned a number of others, including these thoughts on gamification.
The wise Skillen of the internet also shared this article on distraction prevention by a new media professor, which led to thoughts on distraction.

We’re Not Ready For This: A.I.

I saw this the other day:

Howard goes over deep learning, self-directed computer intelligence, for the first fifteen minutes or so and summarizes at about the 17-minute mark.  The social implications of deep machine learning are quite profound.

Here are some other artificial intelligence related media that you might want to peruse:

Ray Kurzweil’s The Singularity is Near (a long and tedious mathematical read with some wonderful implications mixed in.)

Her, Spike Jonze’s deep ode to A.I.:


A lot of Hollywood A.I. talk falls short, settling into HAL-type horror, but this one doesn’t; it goes all the way.  By the end you’ll be questioning our shortcomings rather than fearing what a superior intelligence might do.  I wonder what Kurzweil thought of the A.I. in this film and what it ends up doing.

Better education doesn’t help?  Work is irrelevant?  What do we do in a world of human pets that serve no real function in terms of survival?  This could be an age of unprecedented creativity, or the beginning of the end.

The TED talk has an interesting moment in its final two minutes where Howard talks about the social implications of an imminent (the next five years!) machine intelligence revolution.  He talks about computers taking over jobs that we consider to be human-only and doing them better than people ever could.  This isn’t about coding a better piece of software, it’s about computers coding themselves in a never-ending cycle of improvement.  It’s also about people no longer having to be responsible for their own survival decisions.


What happens to insurance companies when automotive accidents are a thing of the past?  Accidents don’t happen when the A.I. managing traffic can not only control the car in question, but also move the entire traffic jam up ten feet to avoid a collision.  This is often misunderstood: people say that A.I.-driven vehicles could have bad code that causes a massive pile-up.  These aren’t machines running code, these are machines that create code as they need it, kind of like people do, but much faster, and with absolute precision.  And however well they do it now, they’ll do it better tomorrow.

What happens to human beings when they are no longer responsible for their own survival?

The busy truck driver still needs to sleep; what replaces him won’t.  It’ll never drive tired or hungry or angry or distracted either.  It’ll only ever use the least amount of gas to get where it’s going.  One of the tricky things about trying to grasp human-superior A.I. is in trying to envisage all the ways that it would be superior.  That superior A.I. would never stop improving; it would take over any concept of efficiency in business.

As Howard says, machines that are able to build machines in a continuously improving manner are going to make the social change caused by the industrial revolution look like a blip on the radar.

Perhaps the hardest implication of a machine intelligence revolution is what it does to the idea that your income is tied to your usefulness.  Our entire society is predicated on the idea that your income somehow reflects your usefulness.  If human usefulness is no longer tied to social status, what would society look like?

During the big market bailouts in 2008 someone online described business as the cockroaches that feed off the work of human society.  He suggested that you don’t feed them steak, you just let them thrive on the waste.  The implication was that capitalism is a necessary evil that serves human beings, not the other way around as it’s often stated (people are a necessary evil in capitalism).

The idea that people could be free to pursue their own excellence in the future without having to work for the cockroaches is quite thrilling, though it would require a huge jump in social maturity for human beings.  We’d have to begin identifying our own self worth through our own actions rather than our education and employment.  I suspect most people aren’t close to that.  We’d also have to recognize that everyone has a unique and valuable place in society, which sounds like socialism!

Education is as guilty as any social construction in aiming children towards the idea of success being employability and income.  We stream students according to their intellectual capital and then tell them to work hard in order to achieve financial success in the future.  The very idea of effort is tied to financial success – something we’d have to change in a machine intelligent future.  Can humans value themselves and seek excellence without the yoke of survival hung around their necks?

Universal income is an idea being floated in Switzerland and elsewhere.  If the future is one where people are no longer integral to their own survival, we better find something other than a survival instinct to base our self value on, or we’re going to quickly run out of reasons for being.

Deep Learning AI & the Future of Work

Originally published on Dusty World in April, 2016:  https://temkblog.blogspot.com/2016/04/deep-learning-ai-future-of-work.html

 

Most jobs have tedium as a prerequisite.  No one does tedium better than a machine, but we still demand that kind of work for humans… to give them self worth?

This isn’t the first time our compulsive urge to assign monetary value to survival has struck me as strange.  This time it was prompted by an article on deep learning AI and how machines are close to taking over many jobs that are currently reserved for human beings (so that they can feel relevant).  We like to think employment is what makes us worthwhile, but it really isn’t, and hasn’t been for a long time.

The graph above is from that article and it highlights how repetitive jobs are in decline as machines more effectively take over those roles.  As educators this leaves us in a tricky situation, because we oversee an education system modelled on factory routines that is designed to fit students into repetitive labour (cognitive for the ‘smart’ office-bound kids, manual for the other ones).

 
 

How can an education system modelled on Taylorist principles produce students able to succeed in the Twenty-First Century?  It can’t, because it can’t even imagine the world those students are going to live in.  There is a lot of push back in educational theory around the systemic nature of school administration, but I see little movement from management other than lip service.  Educational stakeholders from unions to ministries and even parents like our conservative education system just the way it is.

Between neuroscience and freeing ourselves of academic prejudices (i.e. creativity happens in art class), we could be amplifying what human beings are best at instead of stifling it. (from Newsweek)

In the meantime, people who are taught to sit in rows, do what they’re told and hit clearly defined goals are becoming increasingly irrelevant.  We have machines that do those very things better than any human can, and they’ll only be doing more of it in the future.

Ironically, just as human beings might have technically developed a way out of having to justify their survival all the time, they are also crippling their ability to do what humans do best.  In recent years creativity, as critically assessed in children, has been diminishing.  The one thing we are able to do better than machines is being systemically beaten out of us by outmoded education systems and machines that cognitively infect us with their own shortcomings!


Machines offer us powerful tools for a wide variety of tasks.  I use digital technology to express my interest in the natural world, publish, and learn, but for the vast majority of people digital technology is an amplifier of bad habits and ignorance.  Many people use the personalization possible in digital technology to amplify their own prejudices and juice their brains like Pavlovian dogs in empty games, all while living in a cocoon of smug self-justification.

Just when we’re able to leverage machines to free human beings from the tedium of working for a living, those same machines are shaping people to be as lazy, directionless and self-assured as they wish.

In the meantime the education system keeps churning out widget people designed for a century ago and the digital attention economy turns their mental acuity into a commodity.


Rise of the machines indeed.


 

 

A nice bit of alternate future, but the description at the end is chilling – it’s how I see most people using the internet: “At its best, edited for the savviest readers, EPIC is a summary of the world, deeper and broader and more nuanced than anything available before. But at its worst, and for too many, EPIC is merely a collection of trivia, much of it untrue, all of it narrow, shallow and sensational, but EPIC is what we wanted…”