Best advice on blogging

Fourteen months into Dusty World and I’m approaching 5000 views, which is exciting.

When I started blogging after ECOO 2010, I did some reading about just how to do it.  A couple of suggestions have borne up well after fourteen months of blogging, but they aren’t your typical how-to-drive-page-views blog advice.



/WARNING
This isn’t a how-to-make-a-successful-business-blog type post.  There are piles of how-to-become-a-pro-blogger advice columns out there; this isn’t that.  I didn’t start blogging to make money or win fame.  If you’re looking for a how-to-monetize-your-blog-and-push-traffic type thing, then I’d argue that you’re a shallow git who doesn’t get what this really is.  Blogging isn’t supposed to replace traditional media with a new way for a few people to target, market and sell to many; blogging is democratized publication.  Old, industrialized media is dying.  This isn’t a bad thing; stop trying to be like them.
/WARNING OFF

To that end, I found a piece of advice early that has allowed me to keep blogging regardless; always write about what you want, the way you want.  This works for me in a couple of different ways.  The main reason I’m still at it over a year later is that I find writing about problems, solutions and challenges in technology and teaching to be cathartic.  Blogging gives me perspective, sometimes it prompts responses that help me see a way through, and ultimately allows me to be better at what I do.  These are all ends in themselves; reason enough to blog right there.

I also never feel like this is work, more like therapy.  I know I’ll probably feel better after blogging, so I tend to want to do it.  I’ve found that blogging tends to result in a very direct style of writing that clarifies my thoughts and feelings on issues (you can’t go on and on in a blog, it’s the wrong format).

I don’t shy away from a philosophical approach because I’m representing my peeps.  I am what I am and write what I write, I wouldn’t expect everyone to read it, I wouldn’t want them to.  I’m aiming at a specific reader (thoughtful, educationally and digitally curious), and I’m fine with that.  If I wanted page views I’d write college humor blog entries (but that would be work).

Another suggestion I read was write often (and don’t forget to label your entries so people can find them).  Once you’ve built up a library of entries, the views will find you.  This has been the case.  Looking at the stats, there is always a bump on a new post, but the archive gets more hits now than even the new-entry bump.  Keep writing, eventually you’ll build up enough content on enough subjects that people will find you.  Your entries get shared, and shared again, you start to see views from all around the world (which makes me wonder just how Canadian-centric my thinking may appear), and before you know it, there are many avenues to your blog, not just you pushing it out on your social networks.

Publishing your writing puts it out there.  It makes you want to put your best ideas forward.  I proof my entries, and try and make them technically correct because I never know who might end up reading them. Diary or journal writing for yourself doesn’t drive you in that direction, that’s a different sort of therapy.

As an English teacher, I feel like blogging makes me a better writer and gives students a chance to see how I do what I’m expecting them to do.  So many English teachers don’t write, or don’t offer students access to their writing… it strikes me as a bit hypocritical.

In the meantime, I find myself, a digitally skilled student of philosophy and history, living at a pivotal moment in human history: the birth of a digital revolution that will rock the world in much the same way as (and much more quickly than) the industrial revolution did over the past two hundred years.  The world is changing in ways we can’t even begin to foresee, and I’m teaching in it!

It’s an exciting time to pick a timely new medium and enjoy using it.  Blog well, my friends.

Deep Learning AI & the Future of Work

Originally published on Dusty World in April, 2016:  https://temkblog.blogspot.com/2016/04/deep-learning-ai-future-of-work.html


Most jobs have tedium as a prerequisite.  No one does tedium better than a machine, but we still demand that kind of work for humans… to give them self worth?

This isn’t the first time our compulsive urge to assign monetary value to survival has struck me as strange.  This time it was prompted by an article on deep learning AI and how machines are close to taking over many jobs that are currently reserved for human beings (so that they can feel relevant).  We like to think employment is what makes us worthwhile, but it really isn’t, and hasn’t been for a long time.

The graph above is from that article and it highlights how repetitive jobs are in decline as machines more effectively take over those roles.  As educators this leaves us in a tricky situation, because we oversee an education system modelled on factory routines that is designed to fit students into repetitive labour (cognitive for the ‘smart’ office bound kids, manual for the other ones).

How can an education system modelled on Taylorist principles produce students able to succeed in the Twenty-First Century?  It can’t, because it can’t even imagine the world those students are going to live in.  There is a lot of pushback in educational theory around the systemic nature of school administration, but I see little movement from management other than lip service.  Educational stakeholders from unions to ministries and even parents like our conservative education system just the way it is.

Between neuroscience and freeing ourselves of academic prejudices (ie: creativity happens in art class), we could be amplifying what human beings are best at instead of stifling it. (from Newsweek)

In the meantime, people who are taught to sit in rows, do what they’re told and hit clearly defined goals are becoming increasingly irrelevant.  We have machines that do those very things better than any human can, and they’ll only be doing more of it in the future.

Ironically, just at the time when human beings might have technically developed a way out of having to justify their survival all the time, they are also crippling their ability to do what humans do best.  In recent years creativity, as critically assessed in children, has been diminishing.  The one thing we are able to do better than machines is being systemically beaten out of us by outmoded education systems and machines that cognitively infect us with their own shortcomings!


Machines offer us powerful tools for a wide variety of tasks.  I use digital technology to express my interest in the natural world, publish, and learn, but for the vast majority of people digital technology is an amplifier of bad habits and ignorance.  Many people use the personalization possible in digital technology to amplify their own prejudices and juice their brains like Pavlovian dogs in empty games, all while living in a cocoon of smug self justification.

Just when we’re able to leverage machines to free human beings from the tedium of working for a living, those same machines are shaping people to be as lazy, directionless and self assured as they wish.

In the meantime the education system keeps churning out widget people designed for a century ago and the digital attention economy turns their mental acuity into a commodity.


Rise of the machines indeed.



A nice bit of alternate future, but the description at the end is chilling – it’s how I see most people using the internet: “At its best, edited for the savviest readers, EPIC is a summary of the world, deeper and broader and more nuanced than anything available before. But at its worst, and for too many, EPIC is merely a collection of trivia, much of it untrue, all of it narrow, shallow and sensational, but EPIC is what we wanted…”

Cyber Dissonance: The Struggle for Access, Privacy & Control in our Networked World

Back in the day when I was doing IT full time (pre-2004), we were doing a lot of local area networking builds for big companies.  There was web access, but never for enterprise software.  All that mission-critical data was locked down tight locally on servers in a back room.  When I returned from Japan in 2000, one of my jobs as IT Coordinator at a small company was to do full tape backups off our server at the end of each day and drop off the tapes in our offsite data storage centre.  Network technology has leapt ahead in the fifteen years since, and as bandwidth has improved the idea of locally stored data and our responsibility for it has become antiquated.


We were beginning to run into security headaches from networked threats in the early zeroes when our sales force would come in off the road to the main office and plug their laptops into the network.  That’s how we got Code Red’d and Fizzer’d, and it helped me convince our manager to install a wireless network with different permissions so ethernet-plugged laptops wouldn’t cronk our otherwise pristine and secure network where all our locally stored, critical business data lived.  We had internet access on our desktops, but with everyone sipping through the same straw, it was easy to manage and moderate that data flow.  Three years later I was helping the library at my first teaching job install the first wireless router in Peel Board so students could BYOD – that was in 2005.


Back around Y2K, IT hygiene and maintenance were becoming more important as data started to get very slippery and ubiquitous.  In a networked world you’re taking real risks by not keeping up with software updates.  This is still an issue in 2019, at least in education.  We’re currently running into all sorts of headaches at school because our Windows 7 image is no longer covered by Microsoft.  Last year one of our math teachers got infected by a virus sent from a parent that would be unable to survive in a modern operating system, but thanks to old software still infesting the internet, even old trojans get a second and third chance.  Our networked world demands a degree of keep-up if everyone is going to share the same online data; you can’t be ten paces behind and expect to survive in an online environment like that, you’re begging to be attacked.


Last summer I took Cisco’s Cyber Operations Instructor’s Program, which was a crash course in just how fluidly connected the modern world is, and how dangerous that can be.  After logging live data on networks and seeing just how much traffic is happening out there from such a wide range of old and new technology, it’s a wonder that it works as well as it does.  Many cybersecurity professionals feel the same way; our networks aren’t nearly as ‘always on’ as you think.


This past week I attended Cisco’s Connect event which once again underlined how much IT has changed since I was building LANs in the 90s and early 00s.  The drive to cloud computing where we save everything into data centres connected to the internet comes from a desire for convenience, dependability and the huge leap in bandwidth on our networks – and you ain’t seen nothing yet.  There was a time when you had to go out and buy some floppy disks and then organize and store them yourself when you wanted to save data.  Now that Google and the rest are doing it for you, you can find your stuff and it’s always there because you’ve handed off that local responsibility to professionally managed multi-nationals who have made a lot of money from the process, but there is no doubt it’s faster and more efficient than what we did before with our ‘sneaker-nets‘.

You probably spend most of your day with a browser open.  Ever bothered to understand how they work?  Google’s Chrome Intro Comic is a great place to start.

If you ever look behind the curtain, you’ll be staggered by how many processes and how much memory a modern browser like Google Chrome uses.  Modern browsers are essentially another operating system working on top of your local operating system, but that duplication will soon fade as local operating systems atrophy and evolve into the cloud.
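
You can peek behind that curtain yourself in a few lines.  Here’s a minimal sketch, assuming the third-party psutil package is installed (pip install psutil); the process-name match is a rough heuristic, nothing more:

```python
# Tally how many processes a Chromium-based browser spawns and roughly
# how much memory they hold between them.
import psutil

total_mem = 0
count = 0
for proc in psutil.process_iter(['name', 'memory_info']):
    info = proc.info
    name = (info['name'] or '').lower()
    mem = info['memory_info']
    if 'chrome' in name and mem:  # matches chrome / chrome.exe / Google Chrome
        count += 1
        total_mem += mem.rss  # resident memory for this process

print(f"{count} browser processes using ~{total_mem / 1024**2:.0f} MB")
```

Run it with a dozen tabs open and the numbers look a lot more like an operating system than an application.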


At Cisco Connect there was a lot of talk around how to secure a mission critical, cloud based business network full of proprietary IP when the network isn’t physically local, has no real border and really only exists virtually.

Cisco Umbrella and other full service cloud computing security suites do this by logging you into their always on, cloud based network through specific software.  Your entire internet experience happens through the lens of their software management portal.  When you look up a website, you’re directed to an Umbrella DNS server that checks to make sure you’re not up to no good and are doing what you’re supposed to be doing.  Systems like this are called IaaS – infrastructure as a service – and they not only provide secure software, but also integrate with physical networking hardware so that the IaaS provider can control everything from what you see to how the hardware delivers it.
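
The DNS-layer part of that idea is easy to poke at from a script.  A rough sketch, assuming the third-party dnspython package and using what were, at the time of writing, OpenDNS/Umbrella’s public resolver addresses and test domain (treat all of those as assumptions; this is not Umbrella’s enterprise setup):

```python
import dns.resolver

resolver = dns.resolver.Resolver()
# OpenDNS/Umbrella public resolvers (an assumption -- check current docs)
resolver.nameservers = ['208.67.222.222', '208.67.220.220']

for site in ['example.com', 'internetbadguys.com']:
    try:
        answer = resolver.resolve(site, 'A')
        # A filtering resolver typically answers a flagged domain with a
        # block-page address instead of the real one.
        print(site, '->', [rr.address for rr in answer])
    except dns.resolver.NXDOMAIN:
        print(site, '-> no such domain')
```

Every lookup your machine makes passes through a choke point like this, which is exactly what makes it such a powerful management (and control) layer.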



In 2019 the expectation is for your business data to be available everywhere, all the time.  It’s this push towards access and connectedness, built on the back of our much faster networks, that has prompted the explosion of cloud based IT infrastructure.  In such an environment, you don’t need big, clunky, physically local computer operating systems like Windows and macOS.  Since everything happens inside one of the browser OSes, like Chrome, all you need is a thin client with fast network access.

The irony in Chromebooked classrooms is that the fast network and software designed to work on it aren’t necessarily there, especially for heavy duty software like Office or Autocad, so education systems have migrated to thin clients and found that they can’t do what they need them to do.  If you’ve ever spent too much time each day waiting for something to load in your classroom, you know what I’m talking about.  A cloud based, networked environment isn’t necessarily cheaper, because you should be building network bandwidth and redundancy out of the savings from moving to thin clients.  What happened in education was a cash grab: moving to thin clients without the subsequent network and software upgrades.  This lack of understanding or foresight has produced a lot of dead-ended classrooms where choked networks mean slow, minimalist digital skills development.  Ask any business department how useful it is teaching students spreadsheets on Google Sheets when every business expects macros in Excel.

Seeing how business is doing things before diving back into my classroom is never wasted time.  The stable, redundant wireless networks in any modern office put our bandwidth and connectivity at school to shame.  On those high speed networks employees can expect flawless connectivity and collaboration regardless of location, even running complex, media heavy tasks like 3D modelling and video editing in the cloud – something that is simply impossible on the data that drips into too many classrooms onto emaciated thin clients.  Data starvation for the less fortunate is the new normal – as William Gibson said, the future is already here, it’s just not evenly distributed.


Seeing the state of the art in AI driven cybersecurity systems is staggering when returning to static, easily compromised education networks still struggling to get by with out of date software and philosophies.  The heaps of students on VPNs bypassing locks and the teachers swimming through malware emails will tell you the truth of this.  The technicians in education IT departments are more than capable of running with current business practices, but administration in educational IT has neither the budget nor the vision to make it happen.  I have nothing but sympathy for IT professionals working in education.  Business admin makes the argument that poor IT infrastructure hurts their bottom line, but relevant, quality digital learning for our students doesn’t carry the same weight for educational IT budgets.


In addition to the state-of-the-art ICT display put on at Cisco’s conference, I’m also thinking about the University of Waterloo’s Cybersecurity & Privacy Conference from last month.  The academic research at that conference talked at length about our expectations of privacy in 2019.  Even a nuanced understanding of privacy would probably find some discomfort with the IaaS systems that cloud computing is making commonplace.  The business perspective was very clear: you’re here to work for us and should be doing that 24/7 now that we’ve got you hooked up to a data drip (smartphone) in your pocket.  Now that we can quantify every moment of your day, you’re expected to be producing. All. The. Time.  I imagine education technology will be quick to pick up on this trend in the next few years.  Most current IaaS systems, increasingly built on machine learning in order to manage big data that no person could grasp, offer increasingly detailed analysis (and control) of all user interaction.  Expect future report cards to show detailed ‘time wasted by your child’ data, especially if it can reduce the number of humans on the payroll.


These blanket IaaS systems are a handy way of managing the chaos that is an edgeless network, and from an IT technician and cybersec operator point of view I totally get their value.  But if the system gives you that much control over your users, what happens when it’s put in the hands of someone who doesn’t have their best interests at heart?

The latest edition of WIRED had an article on how technology is both enabling and disabling Hong Kong protestors.  While protestors are using networked technology to organize themselves, an authoritarian government is able to co-opt the network and use it against its own citizens.  I wonder if they’re using business IaaS software that they purchased.  I wonder if many of the monitoring systems my students and I are becoming familiar with in our cybersecurity research are being purchased by people trying to hurt other people.




As usual, after an interesting week of exploring digital technology I’m split on where things are going.  We’ve seen enough nonsense in cybersecurity by criminals and government supported bad actors on the international stage that there is real concern around whether the internet can survive as an open information sharing medium.  Between that and business pushing for greater data access on increasingly AI controlled internets of their own that could (and probably are) used by authoritarian governments to subjugate people, I’m left wondering how much longer it’ll be before we’re all online through the lens of big brother.  If you’re thinking this sounds a bit panicky, listen to the guy who invented the world wide web.


The internet might feel like the wild west, but I’d rather that than blanket, authoritarian control.  Inevitably, the moneyed interests that maintain that control will carve up the internet, reserving clean, usable data for those they think deserve it and leaving polluted information for everyone else.  I get frustrated at the cybercriminals and state run bad actors that poison the internet, but I get even more frustrated at the apathy of the billions who use it every day.  If we were all more engaged internet citizens, the bad actors would be diminished and we wouldn’t keep looking for easy answers from self-serving multinationals looking to cash in on our laziness.  I’ve said it before and I’ll say it again: if I could help make a SkyNet that would protect the highest ideals of the internet as its only function, I’d press START immediately.


The internet could be one of the most powerful tools we’ve ever invented for resolving historical equity issues and allowing us to thrive as a species, but between criminality, user apathy and a relentless focus on cloud computing and the control creep it demands, we’re in real danger of turning this invention for collaboration and equity into a weapon for short term gain and authoritarian rule.



“It’s astonishing to think the internet is already half a century old. But its birthday is not altogether a happy one. The internet — and the World Wide Web it enabled — have changed our lives for the better and have the power to transform millions more in the future. But increasingly we’re seeing that power for good being subverted, whether by scammers, people spreading hatred or vested interests threatening democracy.”

– Tim Berners-Lee



“The internet could be our greatest collaborative tool for overcoming historical inequity and building a fair future, or it could be the most despotic tool for tyranny in human history.  What we do now will decide which way this sword will fall.  Freely available information for all will maximize our population’s potential and lead to a brighter future.  The internet should always be in service of that, and we should all be fighting for that outcome in order to fill in the digital divide and give everyone access to accurate information.  Fecundity for everyone should be an embedded function of the internet – not voracious capitalism for short term gain, not cyber criminality and not nation state weaponization.  Only an engaged internet citizenship will make that happen.”

– my comment upon signing a contract for the web.


The Tyranny of Collaboration

I was talking to a digital native the other day in English class about Shakespeare.  This particular Millennial is a top 5%er who will go on to do great things.  She was wondering who the people who wrote Shakespeare were.  I was surprised at the question as I’ve always thought one person wrote Shakespeare.  I even have trouble with the classist conspiracy types who think an actor couldn’t be that smart so a noble must have done it.  Having read a lot of Shakespeare (all of it actually) over decades, I know his voice, and it isn’t a voice by committee; that kind of brilliance doesn’t happen around a meeting table.

I thought it interesting that the Millennial mind assumes collaboration, projecting her own generation’s constant interaction across history.  The internet has turned the digital natives who live in it into a hive mind.  They can’t form an opinion without socializing or turning to the internet for information.  Their waking lives are awash in constant communication.  They describe moments ‘trapped’ in their own mind when they are unplugged as boring.

The modern mind is open in a way that someone from 20 years ago, let alone 400 years ago, would find alarming. Our marvellous information revolution has not only made our data public, it is also changing what we think we are individually capable of.  Needless to say, if we start thinking that individual genius can’t happen in the quiet of our own minds, it won’t.

A smart, capable digital native can’t conceive of a single mind being capable of producing great works; they must be the result of never ending communication and collaboration.  A couple of centuries from now, people who have been immersed in digital communications for generations will wander around the Van Gogh Museum or read Macbeth and think that people from back then must have been mental giants to do these things alone – that, or they’ll reinvent history as each age does, in its own image, seeing collaboration and minds peeled open under a barrage of constant communication where none were.

Education has hopped on the back of this communication revolution (flood?) and integrated collaboration into just about every aspect of learning.  Leveraging technology to find new and exciting ways of collaborating is one of the pillars of early Twenty-First Century education.  Students have lost the idea of personal mind-space thanks to current communications habits.  The classroom, one of the last places where a student might find privacy in their own heads, has been crushed under the weight of expectations from this social shift.  Much of this is shrouded in talk of engagement and preparing students for the modern world.  I just hope that preparation has real advantages for the student in terms of personal development.  I’m starting to doubt that.

Brainstorming about the advantages of deep thinking in your own head – from an ENG3u class two years ago…



Lessons From Skills Canada

Originally published April, 2012 on Dusty World (and the precursor to many more Skills Ontario posts)…

Friday I chaired the video creation Skills Canada regional competition in Guelph.  Ours was a competitive division with five teams who had to film, edit and post-produce a pre-planned thirty second ad in four hours.  Only three teams could place and only the top team could move on to the provincial competition.

Some observations stood out:

  • The hard deadlines came as a shock to many of the students, who aren’t used to them any more (we don’t really require hard deadlines in class any more)
  • The competitive nature of the competition concerned a number of the teams, who couldn’t comprehend being allowed to lose in school (we don’t really integrate competitive winning and losing in class any more)
  • The sense of satisfaction that resulted from getting a quality piece of work done in the time given surprised many of the students (we don’t really allow students to develop a sense of satisfaction from completing work on time – on the contrary, a number of students recently told me at parent teacher interviews that they are sick and tired of knocking themselves out to complete work by deadlines only to see slack and idle students hand in the same thing whenever they get around to it).
  • At the rewards ceremony many of the students were at a loss as to how to act when they’d won (stony faced and blankly indifferent were the norm, broken up by the odd grin).  They were also unable to recognize what losing gracefully looked like.
  • In the automotive technology section the announcer said, “congratulations gentlemen” only to realize that one of the gold medallists was female (from our school!) and backpedal.   If we’re going to break the gender assumptions around skilled trades, it starts here (and it is starting).
  • Skills Canada has reinforced for me (yet again) that media arts isn’t an arts course so much as it’s a technical skills course that includes artistic input (like carpentry).  We just got rather brutally cut for new students, I think in great part because what we’re teaching is being administered by a fine arts department that doesn’t know how to present us or what to do with us.

Skills Canada is a wonderful program that empowers students to embrace their passions in the skilled trades.  Often looked down upon by academically prejudiced teachers (all university grads deeply ingrained in academia), many of these students with smart hands and kinesthetically focused minds look like failures to the pen & paper classroom teacher.

Our school is fortunate to have a busy and wide ranging technology department with many course options.  Those hands-smart, kinesthetic thinkers must suffer in smaller schools full of classrooms and little else.

Having participated in Skills Canada for two years now, I’m a fan.  I plan to encourage our computer engineering students to put their names in for the IT competition, and our media arts students to jump into the crucible; they come out tempered by the experience.

As one of the grade 12s said at the end of the day, “I was put off by the competition and now I’m sorry I never tried this before.  It was a great experience, and a great challenge.  I wish I had a chance to do it again, now that I’ve tried it, I want to do it again better.”  That is the greatest lesson of competition: it clarifies how you can improve in no uncertain terms, and then offers you another chance to show what you know.  Of course, as a senior he won’t be here next year.

I’ve got to find ways to get younger students involved in taking this risk.  The rewards are great, and by grade 12 they’ll be weathered veterans who can take a competitive run at the medal stand.  Nothing they do in class better prepares them for the world they are about to walk out into.

ECOO 2016 Reflections: maker spaces and iteration

The maker movement isn’t a fad to engage students.  The people who believe in it live it.

Back from the 2016 ECOO Conference, I’ve mulled things over for a couple of days before reflecting:

On maker spaces…

Last year’s conference was very excited about maker spaces, and that focus seems to have died down.  Developing meaningful maker spaces means believing in and adopting the thinking behind them.  The people behind the maker movement believe in it passionately; they live it.  Education’s ADD means that making was never going to go that far in the classroom.  The moment I heard teachers complaining about the extra work makerspaces created, I knew it was doomed.  Most teachers aren’t curious about how things work and don’t want to play with reality; they’re concerned about delivering curriculum.

I suspect many maker spaces in classrooms have become either dusty corners or play areas.  It was nice to see the monolithic educational system flirt with something as energetic and anarchistic as the maker movement though, even if it was only for a short while.

On Iteration…

This came up several times in the conference.  A couple of years ago Jaime Casap gave an impassioned keynote on the value of iteration.  His argument, based on the software industry’s approach to building code, was to fail early and fail often.

This time around Jesse Brown brought it up again, citing Edison’s “I didn’t fail a thousand times, I found a thousand ways that didn’t work” quote.  He then (strangely) went on to compare being let go as a radio broadcaster and lucking into a tech startup as an example of iteration, which it isn’t.  Doing one thing and then stumbling into something completely unrelated when it ends isn’t iteration.

In education this misunderstanding is rampant.  Good students learn to do what they’re told as efficiently as possible in order to succeed in the classroom (‘lower level’ students are much more willing to take risks – they’re not as invested in the system).  A misunderstanding of iteration is what we use to justify and even encourage failure.   It has become another way to let digital natives’ video-game driven process of learning have its way, but it isn’t very efficient.

There is iteration in the engineering process, but it’s never a fail early, fail often approach.  If you don’t know why you failed then you shouldn’t be rushing off to fail again.

The other week I gave my grade 12 computer engineers detailed explanations of how to build a network cable, a video showing it being done, and posted wiring diagrams showing the proper order.  The most capable students followed the engineering process (a directed iterative process, rather than a random one) and produced working network cables more and more quickly.  The end result was no real cost for me (all my ends and wires were made into functional cables).
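
For context, I’m assuming the ‘proper order’ here is the common T568B pinout (T568A works too, as long as both ends match).  Directed iteration means checking your work against the standard before you crimp, which is trivial to express; this toy sketch is just an illustration of that habit:

```python
# Check a proposed wire order against the T568B standard BEFORE crimping,
# instead of crimping first and testing after.
T568B = ['white-orange', 'orange', 'white-green', 'blue',
         'white-blue', 'green', 'white-brown', 'brown']

def check_order(proposed):
    """Return the pin positions (1-8) that don't match T568B."""
    return [i + 1 for i, (a, b) in enumerate(zip(proposed, T568B)) if a != b]

attempt = ['white-orange', 'orange', 'white-green', 'white-blue',
           'blue', 'green', 'white-brown', 'brown']  # pairs 4 & 5 swapped
print(check_order(attempt))  # -> [4, 5]: fix these before wasting an end
```

A thirty-second check like that is the difference between a pile of working cables and a bin of wasted RJ45 ends.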

The majority of the students, perhaps because they live in our brave new Google world of fail often and fail early, or because people keep misquoting Edison at them, didn’t read the instructions (who does any more, right?) and just started throwing ends on cables, crimping them badly and producing failure after failure.  This is great though because they’re engaged, right?

When I got angry at them they were belligerent in return.  How dare I stifle their creativity!  Unfortunately, I’m not assessing their creativity.  They are trying and that’s all I should be asking for!  I’m not grading them on engagement either.  I have been brandishing the engineering process throughout their careers in computer technology, but these video-game driven iterators think their die early, die often approach in games is perfectly transferable to the real world.  Bafflingly, many educators are gee-whizzing themselves into this mindset as well.  You’ll quickly find that you run out of budget if you do.


Architect of the Future

I just read @banana29‘s “Emergence of Web3.0” blog on the immediate future of the web.  Web3.0, if Alanna is on her game (and I know she is), looks like the next step in managing our data meltdown.

Last year ended with me in a dark and questioning place about the effects of digital media on how people think.  I’ve done my due diligence, and read The Shallows by Nick Carr.  Carr puts forward a compelling, well researched and accurate account of just what the internet is doing to people in the early 21st Century.  I see it in school every day with the digital zombies.  What is to become of the poor human too stupid to pass the are-you-human CAPTCHA?  The Shallows points us to our failure to manage the digital revolution we’ve begun.

I’ve decided to start off the new year by going to the opposite side of the digital Armageddon/digital paradise debate; I’ve just started Ray Kurzweil’s The Singularity Is Near on the advice of a Quora member who describes The Singularity as the opposite of The Shallows.  Kurzweil begins the book with some math and an explanation of how exponential growth works.  In the process he suggests a different growth pattern than the one most people would intuitively follow.
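
Kurzweil’s intuition gap is easy to make concrete.  A minimal sketch comparing thirty steps of linear growth to thirty doublings:

```python
# Linear vs. exponential growth over the same thirty steps.
linear, exponential = 1, 1
for step in range(30):
    linear += 1        # add one each step
    exponential *= 2   # double each step

print(linear)       # 31
print(exponential)  # 1073741824 (2**30)
```

Thirty steps of adding one gets you to 31; thirty doublings gets you past a billion.  Intuition built on linear change keeps underestimating what’s coming, which is Kurzweil’s whole point.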

If Kurzweil is right, and I suspect he is closer than many futurist speculators, then we are about to hit a period of accelerated growth similar to that of the industrial revolution.  Our floundering in data is much the same as the mid-nineteenth century’s floundering in early industrialization.  Like Dickens, Carr points to the perils of new technology and how it’s making us worse, and there is no doubt that, for the vast majority, it is making them worse at this early stage of digitization.

Just as children were pressed into dangerous factory work and pollution killed millions in early industrialization, so our first steps into digitization have zombified much of the populace, making them less than what they were before.  Our heavy-handed, pre-digital habits have been hugely amplified by networked efficiencies and have hurt many digital natives in the process.  What used to be slow moving, linear marketing in the pre-digital age has become an unending avalanche of brain numbing, tedious attention grabbing on the nascent world wide web.

Sharing music on a mixed tape used to be a benign bit of theft between friends, of no real damage.  Take that idea of sharing music and digitize it, and suddenly you’ve crippled a major industry that only existed in the first place because live music was industrialized into sell-able media.  Digitization creates efficiencies that would seem completely foreign and unbelievable in previous contexts.

Having friends over to watch a movie, or going out to one together as we did before home video, suddenly turns into video sharing online, and stuns another media empire.  They struggled against VCRs, then got knocked flat by torrents, but at no point did they think it wasn’t OK to charge me $6 to see Star Wars in the theatre each of nine times, then $40 for the VHS, then another $40 for the DVD, then another $40 for the Blu-ray (it’s not done yet, they’re going to resell it to me in 3D next).

Suddenly police states (like Egypt, Libya or San Francisco) can’t create silence and obedience out of fear, and dictators around the world are faced with a slippery new medium for communication that is not centrally administrated and controlled.  Dictators around the world (from media companies to Gaddafi) fear their loss of control over the signal.

We’ve always shared media; we’re a social species and love to share art that represents our stories and culture.  Digitization brought that back after a century of industrialized centralization of culture that trivialized and often eradicated memes that weren’t attractive to enough people.  This subtle and persistent destruction of variation culturally bankrupted us by the end of the 20th Century.  To many, watching that monster die doesn’t bring on any waves of despair; its death will usher in a renaissance of creativity.

Web2.0 pushed social media, allowing common interests and individual ideas to flourish regardless of geography.  No matter how trivial or insignificant your interest, you are always able to find a critical mass of people online who you can share your fascination with.  This has corrosively weakened the century of industrialized, forced shared interests we’ve all been required to live with.

Digitization is re-animating the idea of a more unique sense of the self.  You no longer have to be a brand name junkie based on massive, global industrial interests telling you what you should like.  Advertising is agonizing over this now, as are those massive, global interests.

Into this maelstrom of early digitization comes Carr, accurately describing how the early internet is a new medium, infected by the old industrial interests whose heavy handed marketing has created whole generations of attention deficit zombies.  When you combine the heavy handed tactics of pre-digital business with the near frictionless and always on nature of digital media, you get a recipe for Ritalin.

Like the soot covered, pollution infected children of the industrial revolution, the screen caged digital child is being treated roughly, but to expect that the early days of a revolution will be like the later days is not historically reasonable; though that shouldn’t stop us from fighting against the dehumanization of children caused by our current mistakes.

Those soot covered child-laborers prompted society to develop public education systems that eventually produced stunning breakthroughs in all areas of human endeavor.  In fact, that initial failure of industrialization eventually produced a more educated and capable population thanks to the public education it caused.  We won’t see soot covered digital children forever.

The digital world we will eventually develop will have as much in common with 2012 as 1970 did with 1870.  And if you believe Kurzweil, the exponential growth curve will develop information technology and artificial intelligence so advanced that it begins self-recursion, drastically increasing capabilities.  No longer limited to biological evolution, Kurzweil foresees a rate of growth that makes the industrial revolution look positively anemic.  It won’t take one hundred years for us to see as much change as industrialization did in a century.

This will happen less soon but more quickly than people suspect; such is the nature of exponential growth.  In the process we will be abused by old habits on new technology less and less as more of us become more capable.  Web2.0 and social media are a huge step in this direction.  We’ll beat back the manipulators and make the technology serve us rather than having economic interests overpowering us with their own heavy handedness.

If this seems like a lost cause, it isn’t; you can’t let something like The Shallows scare you off inevitable change.  You’re living in a transformative time, and these are the moments when the people who can see the truth of things to come become architects of the future.

Once more into the breach, dear friends!

Originally posted on Dusty World in February, 2014…

From thirteen years old in Air Cadets onward I’ve taken leadership courses.  I think I have a pretty good grasp of the mechanics, though it’s often hard to see my own shortcomings in the process.  One of those shortcomings is that I tend to leap into the breach rather than direct the battle.  I’d rather be hands-on and leading by example, but this creates its own problems.

This past couple of years I’ve been working as Head of Computer Studies.  I inherited that job and the rather unique responsibilities that came with it, but rather than moan about it I stepped up and did everything I could to make it work.  While I was running one of the only remaining integrated computer studies departments in the board I was also managing an increasingly complicated IT budget (which I had suggested in the first place).

Ten years ago there was one kind of printer in our school and it was tightly integrated into a closed, wired board network.  In the past three years especially, our board (in a very forward thinking move) began to diversify technology, beginning with wifi a couple of years ago.  This has peaked with the introduction of a Bring Your Own Device (BYOD) initiative that has caused a diaspora of technology in our school.  Where once we had a single kind of printer, now we have dozens.  Where once everyone was on the same kind of desktop on the same operating system with access to the same applications, now we have hundreds if not thousands of combinations of hardware and software in the school.  I think this is a good thing, but it asks a lot of questions of teachers when they are expecting students, who aren’t as digitally native as you might think, to get work done.  Many of those teachers aren’t interested in being their own technology support either.

You need to be wearing this shirt yourself

While all this has been happening, due to politics beyond their control, our IT budget has been slashed and the amount of support we get has dried up.  Where once we could expect our centralized board IT department to support a monolithic technology environment, we now have a diverse technology wilderness.

Into that wilderness I tried to maintain the level of support our staff and students had become accustomed to.  Being ‘mixed’ into a headship, our key computer teacher position was at best vague, and as the undercurrents in technology trends and support became clear, the job became heavier and heavier, to the point where I was taking days off from teaching to move labs around because IT couldn’t manage it.

One of the reasons I’m good at this sort of thing is because I throw myself into it, body and soul.  With that emotional energy I get a lot done, and it stings when it isn’t recognized or appreciated.  As the headship restructuring occurred it was hard not to take the dismissal of any role I had at the table personally.  That is one of the shortcomings of my approach to work: lots gets done, but I take it personally.

My main concern is successfully engaging staff and students with vital 21st Century digital fluencies that our graduates will need outside the walls of our school.  Perhaps plugging in network cables for people isn’t the best way to achieve that goal.  One of the problems with being a go-get-em type problem solver is I tend to have a myopic view of the bigger picture, especially when circumstances conspire to bury me in tech support.

When I came into teaching in 2004 I was shocked at how far behind education was compared to the business environment I’d just been an IT coordinator in.  In 2003 we’d already moved most staff to one to one technology (laptops) and our ordering system was accessible online.  In 2004 teachers were still filling in bubble sheets for attendance and having a secretary run them through a card reader (like it was 1980).  What few labs there were had old desktops running six year old versions of Windows that barely had any network functionality.

I started a computer club at my first school in Brampton and we put a wireless router into the library – the first one in the board as far as I know.  Students immediately began using it and our librarian was overjoyed, he could suddenly supply internet to all sorts of students.  That would be BYOD and wifi, in 2004 in an Ontario public high school.

I’ve pushed and pushed to connect education to more current information technologies, and there has been constant if slow improvement.  We’ve now caught up with 2004; we’re probably well into 2007 by now.  Of course, when students graduate they aren’t going to be expected to have a firm knowledge of 2007 digital workflow, so I’ll keep pushing.

One of the reasons young people look so out of touch with business need is due to our outdated handling of technology in their education; it’s tough keeping up with a revolution in a system as conservative as education.

This matter of technology support is something I’ve got to reconsider, especially if we aren’t going to make a space for it locally.  The goal was never to do everything for everyone; the goal was to teach people how to perform basic troubleshooting themselves in order to make digital tools available when they need them.  I’m not sure how that will happen in the future.  I don’t think a strong central support role is something that will return.  We need to find a way to integrate digital fluencies, including a basic understanding of how to get computers working, across the curriculum so that all teachers and students feel responsible for their own tech-use.  The idea is to close the gap between current educational technology and what happens outside the walls of a school.  This disparity causes tensions in graduates and students alike, who strain at the differences between school-tech expectations and how they experience technology in the rest of their lives.

I’d make the argument that if you’re going to drive a car you should know how to change a tire and take care of basic maintenance, but many people can’t be bothered (though they are quick to complain about how much it costs to have other people do these things for them).  The same thing happens with computers.  Not everyone needs to be able to rebuild a computer from the ground up, but if you want to use one you should be able to do basic troubleshooting in order to have the technology work when you need it to.  How to create that self sufficiency is the question.

I’m not sure how that’s going to happen in the future, but I’m still determined to create an educational experience that produces digitally relevant graduates.  Rather than leaping into the breach and doing onsite technology support I have to find another way of getting more people technologically self sufficient.

Cybersecurity and the AI Arms Race

We had a very productive field trip to the University of Waterloo for their Cybersecurity and Privacy Conference last week.  From a teacher point of view, I had to do a mad dance trying to work out how to be absent from the classroom, since our school’s needs days got cut and suddenly any enrichment I’m looking for seemingly isn’t possible.  I managed to find some board support from our Specialist High Skills Major program and pathways, and was able not only to arrange getting thirty students and two teachers out to this event, but also to do it without touching the school’s diminished cache of teacher out-of-the-classroom days.

We arrived at the conference after the opening keynote had started.  The only tables were the ones up front (adults are the same as students when it comes to where you sit in a room).  Sarah Tatsis, the VP, Advanced Technology Development Labs at BlackBerry, kindly stopped things and got the students seated.  The students were nervous about being there, but the academic and industry professionals were nothing but approachable and interested in their presence.


What followed was an insightful keynote on BlackBerry’s work in developing secure systems in an industry famous for failing fast and early.  Companies that take a more measured approach to digital technology can sometimes seem out of step with the rock-star Silicon Valley crowd, but after a day of listening to software engineers from various companies lamenting ‘some companies’ (no one said the G-word) who tend to throw unfinished software out and then iterate (and consider that a virtue), the hard work of securing a sustainable digital ecosystem seems further and further out of reach.  The frustration in the air was palpable and many expressed a wish for more stringent engineering in online applications.

From Sarah Tatsis I learned about Cylance, BlackBerry’s AI driven cybersecurity system.  This reminded me of an article I read in WIRED recently about Mike Beck, a (very) experienced cybersec analyst who has been working on Darktrace, a system that uses artificial intelligence to mimic his skills and experience as a cybersecurity analyst in tracking down incursions.

I spent a good chunk of this past summer becoming the first high school teacher in Canada qualified to teach Cisco’s CCNA Cyber Operations course which, as you can gather from the name, is focused on the operational nature of cybersecurity.  After spending that time learning about the cyber-threatscape, I became more and more conscious of how attackers have automated the attack process.  Did you know criminals with little or no skill or experience can buy an exploit kit that gives them a software dashboard?  From that easy to use dashboard, complex attacks on networks are a button push away.

So, bad actors can perform automated attacks on networks with little or no skill or experience.  On the other side of the fence you’ve got people in a SOC (so much of this is the acronyms – that’s a Security Operations Centre) picking through anomalies in the system and then analyzing them as potential threats.  That threat analysis is based on intuition, itself developed from years of experience.  Automating the response to automated attacks only makes sense.
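
To make the idea concrete: the simplest version of anomaly flagging is just statistics over a baseline.  This toy sketch illustrates the principle only; real SOC tooling (and anything a vendor like Darktrace or Cylance ships) is vastly more sophisticated:

```python
# A toy anomaly detector in the spirit of SOC triage: flag traffic
# volumes that sit far outside the recent baseline.
from statistics import mean, stdev

baseline = [1020, 980, 1100, 950, 1060, 1010, 990, 1040]  # bytes/sec samples
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(sample, threshold=3.0):
    """Flag a sample more than `threshold` standard deviations from baseline."""
    return abs(sample - mu) / sigma > threshold

print(is_anomalous(1050))  # False: within normal variation
print(is_anomalous(9500))  # True: worth an analyst's attention
```

The machine does the tireless sifting; the analyst’s intuition is still what decides whether a flagged blip is noise or an incursion.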

In the WIRED article they make a lot of hay about how AI driven systems like Darktrace or Cylance could reduce the massive shortage of cybersecurity professionals (because education seems singularly uninterested in helping), but I don’t think that will happen.  In an inflationary technology race like this, when everyone ups their technology it amplifies the complexity and importance of jobs, but doesn’t make them go away.  I think a better way to look at this might be with an analogy to one of my other favourite things.

Automating our tech doesn’t reduce our effort.  If anything it amplifies it.  The genius of Marc Marquez can only be really understood in slow motion as he drifts a 280hp bike at over 100mph.  That’s what an AI arms race in cybersec will look like too – you’ll only be able to watch it played back in slow motion to understand what is happening.

What’s been happening to date is that bad actors have automated much of their work, sort of like how a bicycle automated the pedaling by turning into a motorcycle.  If you’re trying to race a bicycle (human based cyber-defence) against a motorcycle (bad actors using automated systems) you’re going to quickly find yourself dropping behind – much like cybersecurity has.  As the defensive side of things automates, it will amplify the importance of an experienced cybersec operator, not make it irrelevant.  The engines will take on the engines, but the humans at the controls become even more important and have to be even more skilled since the crashes are worse.

Ironically, charging cyber defence with artificial intelligence will mean fewer clueless script kiddies running automated attack software and more crafty cybercriminals who can ride around the AI.  I’ve also been spending a bit of time working with AI in my classroom and can appreciate the value of machine learning, but it’s a data driven thing, and when it’s working with something it has never seen before you quickly come to see its limitations.  AI is going to struggle, especially with things like zero day threats.  There’s another vocab piece for you – zero day threats are attacks that have never been seen before, so there is no established defence!

Once a vulnerability is found in software it’s often held back and sold to the highest bidder.  If you discovered a backdoor into banking software, imagine what that might sell for.  Did you know that there is a huge market for zero day threats online?  Between zero day attacks, nation-state cyberwar on a level never seen before, and increasingly complex cybercriminals (some of whom were trained in those nation-state cyberwar operations), the digital space we spend so much of our time in, and that more and more of our critical infrastructure relies on, is only going to get more fraught.  If you feel like our networked world and all this cybersecurity stuff is coming out of nowhere, you ain’t seen nothing yet.  AI may very well help shore up the weakest parts of our cyber-defence, but the need for people going into this underserved field isn’t going away any time soon.


***


Where did the Cybersecurity & Privacy Conference turn next?  To privacy!  Which is (like most things) more complicated than you think.  The experts on stage ranged from legal experts to sociologists and tackled the concept from many sides, with an eye on trying to expose how our digitally networked world is eroding expectations of private information.


I found the discussion fascinating, as did my business colleague, but many of the students were finding this lecture-style information delivery exhausting.  When I asked who wanted to stick around in the afternoon for the industry panel on ‘can we fix the internet’, only a handful had the will and interest.  We had an interesting discussion afterwards about whether or not university is a good fit for most students.  Based on our time at the conference, I’d say it isn’t – or they just haven’t grown into the brain they need to manage it yet.  What’s worrying is that in our increasingly student centred, digital classrooms we’re not graduating students who can handle this kind of information delivery.  That kind of metacognitive awareness is gold if you can find it in high school, and field trips like this one are a great way to highlight it.


The conference (for us anyway) wrapped up with an industry panel asking the question, “Can the Internet be saved?”  In the course of the discussion big ideas, like public, secure internet for all (ie: treating our critical ICT infrastructure with the same level of intent as we do our water, electrical and gas systems) were bandied about.  One of my students pointed out that people don’t pirate software or media for fun, they do it because they can’t afford it, which leads to potential hazards.  There was no immediate answer for this, but many of the people up there were frustrated at the digital divide.  As William Gibson so eloquently said, “the future is already here – it’s just not evenly distributed.”  That lack of equity in entering our shared digital space and the system insecurity this desperation causes was a recurring theme.  One speaker pointed out that a company only fixated on number of users has a dangerously single minded obsession that is undermining the digital infrastructure that increasingly manages our critical systems.  If society is going to embrace digital, then that future better reach everyone, or there are always going to be people upsetting the boat if they aren’t afforded a seat on it.  That’s also assuming the people building the boats are more interested in including everyone rather than chasing next quarter earnings.


This conversation wandered in many directions, yet it always came back to something that should be self-evident to everyone.  If we had better users, most of our problems would disappear.  I’ve been trying to drive this ‘education is the answer‘ approach for a while now, but interest in picking up this responsibility seems to slip off everyone from students and teachers to administration at all levels.  We’re all happy to use digital tools to save money and increase efficiencies, but want to take no individual responsibility for them.


I’ve been banging this drum to half empty rooms for over a year now.  You say the c-word (cybersecurity) and people run away, and then get on their networked devices and keep doing the same silly things they’ve always done.  Our ubiquitous use of digital technology is like everyone getting a new car that’s half finished and full of safety hazards and then driving it on roads where no one can be bothered to learn the rules.  We could do so much better.  How digital skills isn’t a mandatory course in Ontario high schools is a mystery, especially when every class uses the technology.



I was surprised to bump into Diana Barbosa, ICTC’s Director of Education and Standards at the conference.  She was thrilled to see a troop of CyberTitans walk in and interrupt the opening keynote.  The students themselves, including a number of Terabytches from last year’s national finalist team who met Diana in Ottawa, were excited to have a chat and catch up.  This kind of networking is yet another advantage of getting out of the classroom on field trips like this.  If our pathways lead at the board hadn’t helped us out, all of that would have been lost.


We left the conference early to get everyone back in time for the end of the school day.  When I told them on the bus ride home that we’d been invited back, they all let out a cheer.  Being told you belong in a foreign environment like an industry and academic conference full of expert adults is going to change even more student trajectories.  If our goal is to open up new possibilities for students, this opportunity hit the mark.


From a professional point of view, I’m frustrated with the lack of cohesion and will in government and industry to repair the fractured digital infrastructure they’ve made.  Lots of people have made a lot of money driving our society onto the internet.  The least they could do is ensure that the technology we’re using is as safe as it can be, but there seems to be no short-term gain in it.


The US hacked a drone out of the sky this summer.

Some governments have militarized their cyber-capabilities and are building weapons-grade hacks that will trickle down into civilian and criminal organizations.  In this inflationary threat-scape, cybersecurity is gearing up with AI and operational improvements to better face these threats, but it’s still a very asymmetrical situation.  The bad actors have a lot more going for them than the too-few defenders trying to protect our critical digital infrastructure.


Western governments have stood by and let this happen with little oversight, and the result has been a wild west of fake news, election tampering, destabilizing hacks and hackneyed software.  There are organizations in this space that are playing a long game.  If this digital revolution is to become a permanent part of our social structure, a part that runs our critical infrastructure, then we all need to start treating networked infrastructure as something more than an entertaining diversion.


One of the most poignant moments for me was when one of the speakers asked the audience full of cybersecurity experts who, police-wise, they should call if their company has been hacked.  There was silence.  In a room full of experts no one could answer, because there is no answer.  That tells you something about just how asymmetrical the threat-scape is these days.  Criminals and foreign powers can hack at will, knowing there will be no repercussions, because there are none.


Feel safer now?  Reading this?  Online?  I didn’t even tell you about how many exploit kits drop hidden iframe links into web pages without their owners even knowing, silently infecting any machine that views the page.  Or about the explosion of tracking cookies designed to sell your browsing habits to any interested party.
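
To make that first threat concrete: a hidden iframe is just an ordinary iframe styled to be invisible, injected into a compromised page so it can quietly load attacker content.  Here’s a rough sketch of how you might scan a page you own for that pattern, using nothing but Python’s standard library.  The URL is a placeholder, and real exploit kits obfuscate far better than this simple check can catch, so treat it as an illustration rather than a security tool:

    # A minimal sketch, not a production scanner: fetch a page you control
    # and flag iframes styled to be invisible, the classic drive-by injection.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class HiddenIframeFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hits = []

        def handle_starttag(self, tag, attrs):
            if tag != "iframe":
                return
            a = dict(attrs)
            style = (a.get("style") or "").replace(" ", "").lower()
            # Zero-sized or display:none iframes are a common injection tell.
            if ("display:none" in style or "visibility:hidden" in style
                    or a.get("width") == "0" or a.get("height") == "0"):
                self.hits.append(a.get("src") or "(no src)")

    # Placeholder URL; point this at your own site.
    html = urlopen("https://example.com").read().decode("utf-8", "replace")
    finder = HiddenIframeFinder()
    finder.feed(html)
    for src in finder.hits:
        print("suspicious hidden iframe ->", src)

Even a toy like this makes the point: the injection is trivial to author, invisible to the casual visitor, and only found if someone bothers to look.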

***

I’m beating this drum again at ECOO’s #BIT19 #edtech Conference in Niagara Falls on November 6, 7 and 8…


What Is Learning?

What is Learning?

The question was thrown out casually during a teacher conference and then immediately forgotten, but it lingered with me.

I heard the initial “transmission of information” definitions around me and shook my head. Saying that learning is simply information transmission is like saying killing is a physical effort that ends a life: a simplistic definition designed to make a complex idea manageable.

I caught a National Geographic special a few years ago in which a team studying the differences between great apes and humans made the sweeping statement that teaching and learning are the key things that set us apart. There is little else to distinguish us from our close cousins.

If it is so pivotal to the definition of our species, it deserves a better definition than “the transmission of knowledge.”

Learning (def’n): the enrichment of our mental faculties that ultimately gives us power over the physical world. We are able to know truth in a broader and deeper way because we can experience the world indirectly and abstract the world in order to understand it beyond our own senses. Learning allows us to preserve and enhance this discipline independent of our individual existences. We are the only species that does not have to relearn how to master its physical environment in every generation; more than that, we are able to amplify previous learning and build on it at an astonishingly prolific rate. We are dangerous animals indeed.

This definition has a couple of challenges:

Firstly, the idea that knowledge, and the learning of it, is very powerful makes people uncomfortable. If you’re teaching and you just want to transmit information, you can simplify your practice to that simple goal. Accepting that learning and knowledge are powerful and potentially dangerous (giving the learner power over the physical world) means a teacher would also have to accept some moral responsibility for imparting information, and many teachers don’t want to take that on.

Secondly, since our brains (hardware) became sophisticated enough to develop this viral learning (software), we have developed well beyond the constraints of our immediate physical environment. We have mostly deferred the costs of overcoming our immediate physical space to a macro/planetary level that we haven’t had to deal with directly yet. When I look at all the teachers who drive into my school alone in large SUVs in the morning, I get the sense that most teachers aren’t any more aware of these challenges than the general public; they are either unwilling or unable to consider the larger picture. The viral nature of our learning means that neither the people teaching nor the people learning are learning hard truths with any real discipline. Learning how to overcome nature taught the first learners some hard truths, truths we forget when we are the billionth person to learn a hard-won truth as a fact in a textbook.

Calling learning the dissemination of information is a very dangerous thing indeed. This is the viral core of learning: when learning becomes knowledge transmission with no real context, the dangers appear thick and fast. Teaching becomes indoctrination and learning devolves into belief generation rather than a coherent, candid body of knowledge. Standardized learning does this in spades. Standardized tests force it, curriculum defines it, cutting knowledge into independent disciplines clouds it and grading validates it. Instead of developing a student’s body of knowledge in a coherent, interconnected, meaningful manner, the industrialized education system creates information-overloaded human beings with limited (or no) understanding of what their knowledge is capable of.

This is disastrous for us as a society and a species, especially if you want human beings to live in democratic circumstances with relative economic and civic freedom. Our unwillingness to appreciate complexity will result in simple solutions, like simplified education, dictatorial government and poor economic choices. In those circumstances the urge to control the herds of the ignorant becomes overwhelming for those in power.

Making learning easy is a disaster. Learning should be challenging, not pointlessly so, but contextually it has to be; ignorance is preferable to passing on knowledge that empowers a human being beyond the confines of their natural world.

If learning devolves into knowledge transmission, we populate the world with dangerous fools.