Sunday, 10 November 2019

Life Long Learning is The New Degree

Last March Break I attended an industry-focused, future-of-the-workplace conference in Toronto.  That event aggressively underlined the importance of micro-credentials in the modern workplace.  The idea of years-long programs, especially in technology where changes happen constantly, suddenly feels like a lumbering has-been rather than a vital foundation for workplace success.  The same conference caused me to examine the purpose of public education (there is much more to it than simply preparing students for work), but the gulf between school and the world beyond our classrooms continues to expand.

Since then I've worked with ICTC on a badging system for Focus on IT students that would allow them to micro-credential their progress through the program.  Anyone involved with scouting knows analogue badges were around well before there were any digital ones; badging has a long history of marking progress and expertise.  The military has always used badging to denote rank and expertise.  More recently badging has become popular in gaming culture to show skills and achievements, and this has crossed over into education through the gamification of learning.  Badging as a form of micro-credentialing is a cultural phenomenon familiar to everyone, so micro-credentialing is nothing new.


We spent the afternoon yesterday attending the 4th annual CAN-CWiC Conference in Mississauga.  For someone who has been struggling against genderized pathways in his rural high school, attending a conference with hundreds of women in digital technology was like stepping into a future we may never reach where I teach, one that isn't the case in the vast majority of Ontario digital technology classrooms.

A couple of conversations prompted by the indomitable Alanna about how some of the women at the conference got into tech were very telling.  We're both on the pathways committee at our school, and the divide between high school career planning and what's happening in the real world was shocking.  While we're busy running a system that divides students by some pretty arbitrary standards and then builds up a marks history that defines student pathways into traditional post-secondary learning, the rest of the world is struggling to find life long learners, something we only pay lip service to in our schools (don't believe me? Find out how much PD time was spent on EQAO and how much was spent on life long learning).  What we view as a static, established learning schedule (one the vast majority of teachers work in very successfully) is pretty much meaningless in 2019 beyond the walls of our ivory towers.

We just did a staff survey on the last PD day and the data aligned with my anecdotal experience in secondary education.  When you fill a school with university graduates, many of whom have never worked in anything other than the academic education system as either a successful student or teacher, you end up with a very blinkered view of where the majority of our graduates go.  Academics tend to overly value their own experience and encourage students to do the same.  Students are directed to follow that long academic trajectory over developing lifelong learning skills valued elsewhere.  The students that do follow it are considered 'the best' ones.

What is happening in the workplace?  Digital disruption is rippling across all industry and is doing what it does: upturning traditional standards of practice and demanding agility before allegiance to tradition.  For everyone we talked to at CAN-CWiC, traditional credentials were nice to have, but by no means were they the standard requirement they used to be.  Industry people said that, sure, they have some post-secondary graduates in specific fields, but even in their case there was something that trumped any other credential: the willingness to adapt and learn more, even if you have a Ph.D.

Danielle at IBM had a background typical of what many of our strongest female students experience.  She did well in high school, especially in English, but took no tech because she wasn't encouraged to take it - it isn't what academic girls do.  She went to the University of Guelph, ran the student newspaper, got a degree in English and then worked in radio as a writer for a couple of years until this shrinking traditional medium laid her off.  She then found a ten-week, full-time boot camp on full-stack development and is now a web developer with IBM Canada.  She said that she greatly values her degree and her time at Guelph and wouldn't change any of it, but she wishes she'd had access to technology training in high school and university so she wasn't getting into it with no experience in her twenties.  Our traditional education system plays to traditional stereotypes.

I had a colleague I consider feminist/woke tell me about how her daughter is now taking bio-technology.  I never saw her daughter once in my computer engineering classes, but if it's an academic girl aiming for university, you'd be hard pressed to find anyone in high school telling her to take any applied technology course, even when that's what she's aiming at in post-secondary.  It's much more important that all your classes end in a U and happen in an academic setting (rows of desks) that prepares you for university.  She's now coding and is glad I put her on to Codecademy.  That's like being handed water wings when there's an Olympic swim team you could have trained with in the building.

Whether talking to post-secondary educators, skills training organizations or companies, the idea that we need to be able to quickly adapt in a rapidly evolving workplace probably sounds like it's from another planet to an Ontario educator inured to our factory-shift driven system.  We aren't skills focused, we're shift focused.  You might be miles ahead of what's happening in your 3U maths class, but that's your shift and you're going to sit through it, for months on end.  You might be miles behind in your 4U English class, but you'll get passed along with the rest of your cohort with a mark that is pretty much meaningless.  What does a 60% in 4U English do for you?  What does a 100% in 3U math mean?  It keeps you with your cohort and does very little in terms of actual learning.  We're all held prisoner by our 19th Century education production line schedule that churns out grades.  Every time the bell goes off to signal a shift change I wonder what year I'm in.  But considering how difficult it is to timetable a grossly simplistic, generalized curriculum, I shudder to think what would happen if the system actually did need to schedule itself around individual student need.

Does this mean the end of traditional, years-long learning programs?  No, specialists still need that depth of training, but for many these years-long, financially crippling programs aren't leading to a job, so we have to change that expectation.  I had a student last year who struggled in traditional classrooms but had good hands.  He went to college because that's what everyone expected him to do but dropped out in the first semester due to a lack of maths fundamentals (he probably got passed through everything with a 60% - gotta keep 'em with their cohorts!).  My suggestion before and after all that was to start knocking out industry ICT qualifications and gaining experience in the workplace.  Demonstrating your willingness to learn and showing evidence of a good work ethic will take you where a college diploma won't.  ICT is still a pretty new industry, so it doesn't have the embedded, historically recognized apprenticeship pathways that other technology pathways do, but it should.  Apprenticeship training, with its mentored, skills-focused, individualized learning, is what the majority of applied training should be modelled around, but that system is foreign to all the Bachelor-carrying people doing the teaching.


Nice eh? One of only a handful of people in Canada with this qualification, but it doesn't count in our academics-only education system.
After my degree I went to work in ICT and ended up getting my qualifications as a technician.  Those were micro-credential, bootcamp-style courses I was taking way back in 2000.  I AQ'd frequently when I started teaching (AQs are micro-credentials) and recently did two more ICT qualifications just to stay current and give my students access to material.  OCT is very stingy about what it shows in teachers' qualifications - mine shows only academic qualifications, but none of the technical qualifications, including my apprenticeship, because they are "less than" in our academically focused education system.  Teacher training only matters if a university had a hand in it.  Ironically, my board paid me nothing for my technical upgrading, even though it directly serves my students (thankfully my union did help me cover it).

Micro-credentialing is the new normal in the world beyond our school walls.  A big degree or diploma also shows your willingness to learn, but if it's all you've got after 15 years on the job, then most companies will ignore you.  If you think it's your passport to a good-paying job, you'll find yourself stuck in customs.  Micro-credentialing shows an employer that you're always willing to upgrade your learning and stay relevant in a changeable workplace.  What they're looking for is life long learners, not a one-trick pony with a single degree or diploma from years ago, no matter what your grades were.  Aiming for an outcome like that (earn my degree and I'm set) is aiming for failure in 2019, no matter what grades you're getting and how excited guidance counsellors are about your opportunities.  If we were focusing our students on developing the confidence needed to always be open to learning something new, and the hunger and resiliency needed to leap into learning opportunities, they'd be in the right mindset to survive in the 21st Century workplace.  Dragging unwilling kids through months of instruction isn't doing that.

What this has done for me is underline all the extracurricular training and competition work we do in our program.  All of those awards and the effort that goes into them highlight the go-the-extra-mile, lifelong-learning skills that are so in demand in the world.  That these efforts aren't integrated into our curriculum is yet another failure of our marks-based, traditional model.

Maybe in future Ontario classrooms we'll begin to break down our schedules into micro-credentials.  Students aiming at current and emerging technologies could take quickly updated, personalized micro-credentials that focus them on the specific skills they need without months-long classes.  Traditional subjects like English could be broken down into their fundamental components.  While everyone would need to take the literacy strand, not everyone needs to take the historical literature piece.  What would our maths and sciences classes look like if students were working on particular, skills-based micro-credentials rather than grinding through months-long, generalized curriculum aimed at a mythical average student?  Digital disruption has produced differentiated production lines focused on higher-value, bespoke products.  Education could follow the same evolution and begin using ICT to differentiate student scheduling and specify learning so that it isn't locked into a pedantic and ineffective 19th Century model.

In 2011 I imagined a fictional account of what a system designed around student differentiation rather than enabling our traditional model would look like.  The divide between what's happening in our classrooms and the digitally disrupted workplace our students are graduating into has never been wider.  If the various stakeholders in the education system can rejig the system while maintaining the highest standards (this isn't about cheaper, it's about greater flexibility in service of our students), then it needs to happen yesterday - we're falling further and further out of relevance for too many of our students.

Saturday, 2 November 2019

Cyber Dissonance: The Struggle for Access, Privacy & Control in our Networked World

Back in the day when I was doing IT full time (pre-2004), we were doing a lot of local area network builds for big companies.  There was web access, but never for enterprise software.  All that mission-critical data was locked down tight locally on servers in a back room.  When I returned from Japan in 2000, one of my jobs as IT Coordinator at a small company was to do full tape backups off our server at the end of each day and drop the tapes off at our offsite data storage centre.  Network technology has leapt ahead in the fifteen years since, and as bandwidth has improved, the idea of locally stored data and our responsibility for it has become antiquated.

We were beginning to run into security headaches from networked threats in the early zeroes when our sales force would come in off the road to the main office and plug their laptops into the network.  That's how we got Code Redded and Fissered, and it helped me convince our manager to install a wireless network with different permissions so ethernet-plugged laptops wouldn't cronk our otherwise pristine and secure network where all our locally stored, critical business data lived.  We had internet access on our desktops, but with everyone sipping through the same straw, it was easy to manage and moderate that data flow.  Three years later I was helping the library at my first teaching job install the first wireless router in Peel Board so students could BYOD - that was in 2005.

Back around Y2K, IT hygiene and maintenance were becoming more important as data started to get very slippery and ubiquitous.  In a networked world you're taking real risks by not keeping up with software updates. This is still an issue in 2019, at least in education.  We're currently running into all sorts of headaches at school because our Windows 7 image is no longer covered by Microsoft.  Last year one of our math teachers got infected by a virus sent from a parent - one that would be unable to survive in a modern operating system, but thanks to old software still infesting the internet, even old trojans get a second and third chance.  Our networked world demands a degree of keep-up if everyone is going to share the same online data; you can't be ten paces behind and expect to survive in an online environment like that - you're begging to be attacked.
The hard sell on cybersecurity perils only lasted a minute. The possibility of nuanced control of users was much of the rest of the presentation. When you work through an IaaS lens, you're not on the public internet any more.

Last summer I took Cisco's Cyber Operations Instructor's Program, which was a crash course in just how fluidly connected the modern world is, and how dangerous that can be.  After logging live data on networks and seeing just how much traffic is happening out there from such a wide range of old and new technology, it's a wonder that it works as well as it does.  Many cybersecurity professionals feel the same way: our networks aren't nearly as always-on as you think.

This past week I attended Cisco's Connect event, which once again underlined how much IT has changed since I was building LANs in the 90s and early 00s.  The drive to cloud computing, where we save everything into data centres connected to the internet, comes from a desire for convenience and dependability, and from the huge leap in bandwidth on our networks - and you ain't seen nothing yet.  There was a time when you had to go out and buy some floppy disks and then organize and store them yourself when you wanted to save data.  Now that Google and the rest are doing it for you, you can always find your stuff because you've handed off that local responsibility to professionally managed multi-nationals.  They have made a lot of money from the process, but there is no doubt it's faster and more efficient than what we did before with our 'sneaker-nets'.


You probably spend most of your day with a browser open.  Ever bothered to understand how they work?  Google's Chrome Intro Comic is a great place to start.
If you ever look behind the curtain, you'll be staggered by how many processes and how much memory web-based applications like Google Chrome use.  Modern browsers are essentially another operating system working on top of your local operating system, but that duplication will soon fade as local operating systems atrophy and evolve into the cloud.  Those local operating systems allowed us a great deal of individual control over our computing, but we give that away when we hand off management of our data to someone else in the cloud.
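If you want to peek behind that curtain yourself, here's a minimal sketch (assuming Python with the third-party psutil library installed, and a Chromium-based browser running) that counts the browser's processes and totals their memory:

import psutil  # third-party: pip install psutil

total_mem = 0
process_count = 0
for proc in psutil.process_iter(['name', 'memory_info']):
    name = (proc.info['name'] or '').lower()
    # every tab, extension and GPU helper gets its own process
    if 'chrome' in name:
        mem = proc.info.get('memory_info')
        if mem:
            process_count += 1
            total_mem += mem.rss

print(process_count, 'browser processes using',
      round(total_mem / 1024**2), 'MB of RAM')

Run that with a dozen tabs open and the numbers start to look like a small operating system of their own.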

At Cisco Connect there was a lot of talk around how to secure a mission critical, cloud based business network full of proprietary IP when the network isn't physically local, has no real border and really only exists virtually.

Cisco Umbrella and other full-service cloud computing security suites do this by logging you into their always-on, cloud-based network through specific software.  Your entire internet experience happens through the lens of their software management portal.  When you look up a website, you're directed to an Umbrella DNS server that checks to make sure you're not up to no good and are doing what you're supposed to be doing.  Systems like this are called IaaS - infrastructure as a service - and they not only provide secure software, but also integrate with physical networking hardware so that the IaaS provider can control everything from what you see to how the hardware delivers it.
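As a rough illustration of the DNS piece of that, here's a sketch (assuming Python with the third-party dnspython library; the resolver address is OpenDNS/Umbrella's public one, not a managed enterprise deployment) showing how a lookup can be pointed at a filtering DNS server instead of your default one:

import dns.resolver  # third-party: pip install dnspython (2.0+ for resolve())

def lookup(hostname, nameserver=None):
    resolver = dns.resolver.Resolver()
    if nameserver:
        # send this query to the filtering resolver instead of the system default
        resolver.nameservers = [nameserver]
    return [rr.address for rr in resolver.resolve(hostname, 'A')]

print('default DNS :', lookup('example.com'))
print('filtered DNS:', lookup('example.com', '208.67.222.222'))

On a managed network, a domain the policy doesn't like typically resolves to a block page instead of its real address, which is how the provider decides what you see before a single packet reaches the site.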


In 2019 the expectation is for your business data to be available everywhere, all the time.  It's this push towards access and connectedness, built on the back of our much faster networks, that has prompted the explosion of cloud-based IT infrastructure.  In such an environment, you don't need big, clunky, physically local computer operating systems like Windows and OS X.  Since everything happens inside one of the browser OSes, like Chrome, all you need is a thin client with fast network access.


The irony in Chromebooked classrooms is that the fast network and the software designed to work on it aren't necessarily there, especially for heavy-duty software like Office or AutoCAD, so education systems have migrated to thin clients and found that they can't do what they need them to do.  If you've ever spent too much time each day waiting for something to load in your classroom, you know what I'm talking about.  A cloud-based, networked environment isn't necessarily cheaper, because you should be building network bandwidth and redundancy out of the savings from moving to thin clients.  What happened in education was a cash grab: moving to thin clients without the subsequent network and software upgrades.  This lack of understanding or foresight has produced a lot of dead-ended classrooms where choked networks mean slow, minimalist digital skills development.  Ask any business department how useful it is teaching students spreadsheets on Google Sheets when every business expectation starts with macros in Excel.

Seeing how business is doing things before diving back into my classroom is never wasted time.  The stable, redundant wireless networks in any modern office put our bandwidth and connectivity at school to shame.  On those high-speed networks employees can expect flawless connectivity and collaboration regardless of location with high-gain software, even doing complex, media-heavy tasks like 3D modelling and video editing in the cloud - something that is simply impossible with the data that drips into too many classrooms onto emaciated thin clients.  Data starvation for the less fortunate is the new normal - as William Gibson said, the future is already here, it's just not evenly distributed.

Seeing the state of the art in AI-driven cybersecurity systems is staggering when you return to static, easily compromised education networks still struggling to get by with out-of-date software and philosophies.  The heaps of students on VPNs bypassing locks and the teachers swimming through malware emails will tell you the truth of this.  The technicians in education IT departments are more than capable of running with current business practices, but administration in educational IT has neither the budget nor the vision to make it happen.  I have nothing but sympathy for IT professionals working in education.  Business admin can make the argument that poor IT infrastructure hurts the bottom line, but relevant, quality digital learning for our students doesn't carry the same weight in educational IT budgets.

In addition to the state-of-the-ICT-art display put on at Cisco's conference, I'm also thinking about the University of Waterloo's Cybersecurity & Privacy Conference from last month.  The academic researchers at that conference talked at length about our expectations of privacy in 2019.  Even a nuanced understanding of privacy would probably find some discomfort with the IaaS systems that cloud computing is making commonplace.  The business perspective was very clear: you're here to work for us and should be doing that 24/7 now that we've got you hooked up to a data drip (smartphone) in your pocket.  Now that we can quantify every moment of your day, you're expected to be producing. All. The. Time.  I imagine education technology will be quick to pick up on this trend in the next few years.  Most current IaaS systems, increasingly built on machine learning in order to manage big data that no person could grasp, offer increasingly detailed analysis (and control) of all user interaction.  Expect future report cards to show detailed 'time wasted by your child' data, especially if it can reduce the number of humans on the payroll.

These blanket IaaS systems are a handy way of managing the chaos that is an edgeless network, and from an IT Technician and Cybersec Operator point of view I totally get the value of them, but if the system gives you that much control over your users, what happens when it is put in the hands of someone that doesn't have their best interests at heart?


The latest edition of WIRED had an article on how technology is both enabling and disabling Hong Kong protestors.  While protestors are using networked technology to organize themselves, an authoritarian government is able to co-opt the network and use it against its own citizens.  I wonder if they're using business IaaS software that they purchased.  I wonder if many of the monitoring systems my students and I are becoming familiar with in our cybersecurity research are being purchased by people trying to hurt other people.


As usual, after an interesting week of exploring digital technology I'm split on where things are going.  We've seen enough nonsense in cybersecurity from criminals and government-supported bad actors on the international stage that there is real concern around whether the internet can survive as an open information-sharing medium.  Between that and business pushing for greater data access on increasingly AI-controlled internets of their own that could be (and probably are) used by authoritarian governments to subjugate people, I'm left wondering how much longer it'll be before we're all online through the lens of big brother.  If you're thinking this sounds a bit panicky, listen to the guy who invented the world wide web.

The internet might feel like the wild west, but I'd rather that than blanket, authoritarian control.  Inevitably, the moneyed interests that maintain that control will carve up the internet, reserving clean, usable data for those they think deserve it and withholding it from, or leaving polluted information for, everyone else.  I get frustrated at the cybercriminals and state-run bad actors that poison the internet, but I get even more frustrated at the apathy of the billions who use it every day.  If we were all more engaged internet citizens, the bad actors would be diminished and we wouldn't keep looking for easy answers from self-serving multinationals looking to cash in on our laziness.  I've said it before and I'll say it again: if I could help make a SkyNet whose only function was to protect the highest ideals of the internet, I'd press START immediately.

The internet could be one of the most powerful tools we've ever invented for resolving historical equity issues and allowing us to thrive as a species, but between criminality, user apathy and a relentless focus on cloud computing and the control creep it demands, we're in real danger of turning this invention for collaboration and equity into a weapon for short term gain and authoritarian rule.



“It’s astonishing to think the internet is already half a century old. But its birthday is not altogether a happy one. The internet — and the World Wide Web it enabled — have changed our lives for the better and have the power to transform millions more in the future. But increasingly we’re seeing that power for good being subverted, whether by scammers, people spreading hatred or vested interests threatening democracy."
- Tim Berners-Lee

"The internet could be our greatest collaborative tool for overcoming historical inequity and building a fair future, or it could be the most despotic tool for tyranny in human history.  What we do now will decide which way this sword will fall.  Freely available information for all will maximize our population's potential and lead to a brighter future.  The internet should always be in service of that, and we should all be fighting for that outcome in order to fill in the digital divide and give everyone access to accurate information.  Fecundity for everyone should be an embedded function of the internet - not voracious capitalism for short term gain, not cyber criminality and not nation state weaponization.  Only an engaged internet citizenship will make that happen."
- my comment upon signing the Contract for the Web.

Thursday, 10 October 2019

Cybersecurity and the AI Arms Race

We had a very productive field trip to the University of Waterloo for their Cybersecurity and Privacy Conference last week. From a teacher's point of view, I had to do a mad dance trying to work out how to be absent from the classroom, since our school needs days got cut and suddenly any enrichment I'm looking for seemingly isn't possible.  I managed to find some board support from our Specialist High Skills Major program and pathways and was able not only to arrange getting thirty students and two teachers out to this event, but also to do it without touching the school's diminished cache of teacher-out-of-the-classroom days.

We arrived at the conference after the opening keynote had started.  The only open tables were the ones up front (adults are the same as students when it comes to where you sit in a room).  Sarah Tatsis, VP of Advanced Technology Development Labs at BlackBerry, kindly stopped things and got the students seated.  The students were nervous about being there, but the academic and industry professionals were nothing but approachable and interested in their presence.


What followed was an insightful keynote on BlackBerry's work developing secure systems in an industry famous for failing fast and early.  Companies that take a more measured approach to digital technology can sometimes seem out of step with the rock-star Silicon Valley crowd, but after a day of listening to software engineers from various companies lament 'some companies' (no one said the G-word) that tend to throw unfinished software out and then iterate (and consider that a virtue), the hard work of securing a sustainable digital ecosystem seems further and further out of reach.  The frustration in the air was palpable, and many expressed a wish for more stringent engineering in online applications.

From Sarah Tatsis I learned about Cylance, BlackBerry's AI-driven cybersecurity system.  This reminded me of an article I read in WIRED recently about Mike Beck, a (very) experienced cybersec analyst who has been working on Darktrace, a system that uses artificial intelligence to mimic his skills and experience as a cybersecurity analyst in tracking down incursions.

I spent a good chunk of this past summer becoming the first high school teacher in Canada qualified to teach Cisco's CCNA Cyber Operations course which, as you can gather from the name, is focused on the operational nature of cybersecurity.  After spending that time learning about the cyber-threatscape, I became more and more conscious of how attackers have automated the attack process.  Did you know criminals with little or no skill or experience can buy an exploit kit that gives them a software dashboard?  From that easy-to-use dashboard, complex attacks on networks are a button push away.

So, bad actors with little or no experience can perform automated attacks on networks.  On the other side of the fence you've got people in a SOC (so much of this is the acronyms - that's a Security Operations Centre) picking through anomalies in the system and then analyzing them as potential threats. That threat analysis is based on intuition, itself developed from years of experience.  Automating the response to automated attacks only makes sense.
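To make that concrete, here's a toy sketch (in Python, with made-up traffic numbers) of the kind of anomaly flagging a SOC automates: baseline what normal looks like, then flag anything that drifts too far from it.

from statistics import mean, stdev

# hypothetical baseline: requests per minute over a normal stretch
baseline = [120, 131, 119, 127, 125, 122, 130, 118]
mu, sigma = mean(baseline), stdev(baseline)

def is_anomaly(observed, threshold=3.0):
    # flag anything more than `threshold` standard deviations from normal
    return abs(observed - mu) / sigma > threshold

for sample in [124, 129, 480]:  # 480 could be an automated attack ramping up
    print(sample, '-> anomalous?', is_anomaly(sample))

The real systems are vastly more sophisticated than a z-score, but the human part doesn't change: someone still has to decide whether the flagged anomaly is actually a threat.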

In the WIRED article they make a lot of hay about how AI-driven systems like Darktrace or Cylance could reduce the massive shortage of cybersecurity professionals (because education seems singularly uninterested in helping), but I don't think that will happen.  In an inflationary technology race like this, when everyone ups their technology it amplifies the complexity and importance of jobs, but doesn't make them go away.  I think a better way to look at this might be with an analogy to one of my other favourite things.

Automating our tech doesn't reduce our effort.  If anything it amplifies it.  The genius of Marc Marquez can only be really understood in slow motion as he drifts a 280hp bike at over 100mph.  That's what an AI arms race in cybersec will look like too - you'll only be able to watch it played back in slow motion to understand what is happening.
What's been happening to date is that bad actors have automated much of their work, sort of like how the motorcycle automated the bicycle's pedalling.  If you're trying to race a bicycle (human-based cyber-defence) against a motorcycle (bad actors using automated systems), you're going to quickly find yourself dropping behind - much like cybersecurity has.  As the defensive side of things automates, it will amplify the importance of an experienced cybersec operator, not make them irrelevant.  The engines will take on the engines, but the humans at the controls become even more important and have to be even more skilled, because the crashes are worse.  Ironically, charging cyber defence with artificial intelligence will mean fewer clueless script kiddies running automated attack software and more crafty cybercriminals who can ride around the AI.  I've also been spending a bit of time working with AI in my classroom and can appreciate the value of machine learning, but it's a data-driven thing, and when it's working with something it has never seen before you quickly come to see its limitations.  AI is going to struggle, especially with things like zero-day threats.  There's another vocab piece for you - zero-day threats are attacks that have never been seen before, so there is no established defence!
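A toy example of why zero-days are so dangerous: signature-based detection can only recognize payloads that match something already in its database (the signatures below are hypothetical and heavily simplified).

# hypothetical, simplified signature database
KNOWN_SIGNATURES = {
    'code-red': 'GET /default.ida?NNNN',
    'bad-iframe': '<iframe style="display:none"',
}

def match_signatures(payload):
    return [name for name, sig in KNOWN_SIGNATURES.items() if sig in payload]

print(match_signatures('GET /default.ida?NNNNNN...'))  # ['code-red'] - seen before
print(match_signatures('GET /never-seen-exploit'))     # [] - the zero-day sails through

An anomaly-based AI can sometimes catch what the signature database misses, but only if the attack looks abnormal against the data it has learned from.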

Once a vulnerability is found in software, it's often held back and sold to the highest bidder.  If you discovered a backdoor into banking software, imagine what that might sell for.  Did you know that there is a huge market for zero-day threats online?  Between zero-day attacks, nation-state cyberwar on a level never seen before and increasingly sophisticated cybercriminals (some of whom were trained in those nation-state cyberwar operations), the digital space we spend so much of our time in - and that more and more of our critical infrastructure relies on - is only going to get more fraught.  If you feel like our networked world and all this cybersecurity stuff is coming out of nowhere, you ain't seen nothing yet.  AI may very well help shore up the weakest parts of our cyber-defence, but the need for people going into this underserved field isn't going away any time soon.

***


Where did the Cybersecurity & Privacy Conference turn next?  To privacy!  Which is (like most things) more complicated than you think.  The experts on stage ranged from legal experts to sociologists and tackled the concept from many sides, with an eye on trying to expose how our digitally networked world is eroding expectations of private information.

I found the discussion fascinating, as did my business colleague, but many of the students found this lecture-style information delivery exhausting.  When I asked who wanted to stick around in the afternoon for the industry panel on 'can we fix the internet', only a handful had the will and interest.  We had an interesting discussion afterwards about whether or not university is a good fit for most students.  Based on our time at the conference, I'd say it isn't - or they just haven't grown into the brain they need to manage it yet.  What's worrying is that in our increasingly student-centred, digital classrooms we're not graduating students who can handle this kind of information delivery.  That kind of metacognitive awareness is gold if you can find it in high school, and field trips like this one are a great way to highlight it.

The conference (for us anyway) wrapped up with an industry panel asking the question, "Can the Internet be saved?"  In the course of the discussion big ideas were bandied about, like public, secure internet for all (ie: treating our critical ICT infrastructure with the same level of intent as we do our water, electrical and gas systems).  One of my students pointed out that people don't pirate software or media for fun; they do it because they can't afford it, which leads to potential hazards.  There was no immediate answer for this, but many of the people up there were frustrated at the digital divide.  As William Gibson so eloquently said, "the future is already here - it's just not evenly distributed."  That lack of equity in entering our shared digital space, and the system insecurity this desperation causes, was a recurring theme.  One speaker pointed out that a company fixated only on its number of users has a dangerously single-minded obsession that is undermining the digital infrastructure that increasingly manages our critical systems.  If society is going to embrace digital, then that future had better reach everyone; people will always upset the boat if they aren't afforded a seat on it.  That's also assuming the people building the boats are more interested in including everyone than in chasing next quarter's earnings.

This conversation wandered in many directions, yet it always came back to something that should be self-evident to everyone.  If we had better users, most of our problems would disappear.  I've been trying to drive this 'education is the answer' approach for a while now, but interest in picking up this responsibility seems to slip off everyone from students and teachers to administration at all levels.  We're all happy to use digital tools to save money and increase efficiencies, but want to take no individual responsibility for them.


I've been banging this drum to half-empty rooms for over a year now.  You say the c-word (cybersecurity) and people run away, then get on their networked devices and keep doing the same silly things they've always done.  Our ubiquitous use of digital technology is like everyone getting a new car that's half finished and full of safety hazards, then driving it on roads where no one can be bothered to learn the rules.  We could do so much better.  How digital skills isn't a mandatory course in Ontario high schools is a mystery, especially when every class uses the technology.

I was surprised to bump into Diana Barbosa, ICTC's Director of Education and Standards at the conference.  She was thrilled to see a troop of CyberTitans walk in and interrupt the opening keynote.  The students themselves, including a number of Terabytches from last year's national finalist team who met Diana in Ottawa, were excited to have a chat and catch up.  This kind of networking is yet another advantage of getting out of the classroom on field trips like this.  If our pathways lead at the board hadn't helped us out, all of that would have been lost.

We left the conference early to get everyone back in time for the end of the school day.  When I told them on the bus ride home that we'd been invited back, they all gave out a cheer.  Being told you belong in a foreign environment like an industry and academic conference full of expert adults is going to change even more student trajectories.  If our goal is to open up new possibilities to students, this opportunity hit the mark.

From a professional point of view, I'm frustrated with the lack of cohesion and will in government and industry to repair the fractured digital infrastructure they've made.  Lots of people have made a lot of money driving our society onto the internet.  The least they could do is ensure that the technology we're using is as safe as it can be, but there seems to be no short term gain in it.


The US hacked a drone out of the sky this summer.
Some governments have militarized their cyber-capabilities and are building weapons-grade hacks that will trickle down into civilian and criminal organizations.  In this inflationary threat-scape, cybersecurity is gearing up with AI and operational improvements to better face these threats, but it's still a very asymmetrical situation.  The bad actors have a lot more going for them than the too few who stand trying to protect our critical digital infrastructure.

Western governments have stood by and let this happen with little oversight, and the result has been a wild west of fake news, election tampering, destabilizing hacks and hackneyed software.  There are organizations in this that are playing a long game.  If this digital revolution is to become a permanent part of our social structure, a part that runs our critical infrastructure, then we all need to start treating networked infrastructure as something more than an entertaining diversion.

One of the most poignant moments for me was when one of the speakers asked the audience full of cybersecurity experts who they should call, police-wise, if their company has been hacked.  There was silence.  In a room full of experts no one could answer, because there is no answer.  That tells you something about just how asymmetrical the threat-scape is these days.  Criminals and foreign powers can hack at will and know there will be no repercussions, because there never are.

Feel safer now?  Reading this?  Online?  I didn't even tell you about how many exploit kits drop hidden iframe links into web pages without their owners even knowing, then anonymously infect any machine that views the page.  Or about the explosion of tracking cookies designed to sell your browsing habits to any interested party.
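For the curious, here's a sketch (assuming Python with the third-party requests and beautifulsoup4 libraries) of how you might scan a page for the kind of hidden iframes exploit kits inject - frames styled to be invisible or sized to nothing:

import requests                 # third-party: pip install requests
from bs4 import BeautifulSoup   # third-party: pip install beautifulsoup4

def find_hidden_iframes(url):
    html = requests.get(url, timeout=10).text
    suspicious = []
    for frame in BeautifulSoup(html, 'html.parser').find_all('iframe'):
        style = (frame.get('style') or '').replace(' ', '').lower()
        # invisible or zero-sized frames are a classic injection tell
        if ('display:none' in style or 'visibility:hidden' in style
                or frame.get('width') == '0' or frame.get('height') == '0'):
            suspicious.append(frame.get('src'))
    return suspicious

print(find_hidden_iframes('http://example.com'))  # [] on a clean page

A real scanner would also chase scripts that write iframes at runtime, but even this much would surprise a lot of site owners.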


***

AI isn't just helping the defenders:  https://www.globalsign.com/en/blog/new-tool-for-hackers-ai-cybersecurity/


I'm beating this drum again at ECOO's #BIT19 #edtech Conference in Niagara Falls on November 6, 7 and 8...

Sunday, 15 September 2019

There is no STEM

There has recently been a fair bit of push-back against STEM as a focus in schools, but as a classroom technology and engineering teacher I have to tell you: there is no STEM.  By sticking science, technology, engineering and mathematics in an acronym, many people, especially people who aren't in classrooms, think that this is some kind of coherent strategy, but I can assure you it isn't, at least not in Ontario.

Maths and sciences are mandatory courses throughout a student's career.  Technology and engineering are not, ever.  Maths gets additional attention because of EQAO standardized testing, so numeracy is an expectation for all teachers throughout the school.  Science is mandatory throughout the elementary grades, and high school students are required to take two science credits to graduate.  Maths and science are baked into a student's school experience.

Want to feel the sting of irrelevance?  The University of Waterloo (and many others) do a fine job of underlining how little technology and engineering programs matter in high schools.  If you're signing up for their software engineering program you need lots of maths... and lots of science.  Engineering for an engineering program?  Well, there's no point in making it a requirement because it's an optional course that is barely taught anywhere in Ontario.  At one point I heard that fewer than 15% of Ontario schools run any kind of coherent computer engineering program.  The technology prejudice is a bit different - that's more of a blue-collar/white-collar thing - but engineering, as an academic focus, has been swallowed whole by science and maths.

SM has always been a foundational piece of public education, and remains so, but the entire 'STEM push' is really an SM push; engineering and technology remain barely taught, entirely optional and peripheral in Ontario classrooms, assuming they exist at all.  Tactile, hands-on technology programs, with their lower class sizes, expensive tools and safety concerns, are the first to get canned when the money tightens up.  It's cheaper to stuff 30+ kids into an 'academic' (aka text-based/theory) course where you can sit them in efficient rows and have them learn linearly until everyone gets the same right answer.  It doesn't do much for them in the real world, but it's cheaper.  Math and science make sense in a school system focused on those kinds of academic economics.

Governments get voted in by creating panic about student mathematics skills, and how science is taught is another political hot-spot that gives politicians lots of traction.  I have no doubt that these two subjects enjoy the attention they do because of this political fecundity.  Engineering and technology?  The skills that build the critical infrastructure that allows us to feed, connect and house people?  Not much political mucking to be done there; it just needs to work.

Last year I had a student graduate and go on to college for computer technology.  He had some trouble in school, but was on track to be a successful computer technician.  In his first post-secondary computer technology courses he was feeling well ahead of his classmates and was confident of success, but not all his classes went so well.  He ended up failing his maths course and eventually dropped out of the whole program.  Talking to his mother after this happened, she implied that I'd failed to teach him the mathematics he needed to succeed.  I didn't argue the point (I don't teach mathematics other than in conjunction with what we're doing in computer technology).  There is an entire mathematics department with ten times the personnel and resources and infinitely more presence in the school than me and my oft-forgotten program, but with STEM ringing in her ears we're all lumped into that failure.

This year I'm rocking a budget (which I've already exceeded in the second week of September) that is 25% of what it was a few years ago.  Everyone is seeing cuts, but the mandatory departments are protected in a way that our optional courses are not.  Where they might see a 10% cut, I'm seeing 75%, because what I teach is not a priority.  That cut is happening while I'm actually up in sections due to the success we've had in various competitions and the media attention we've received (but not in our own yearbook).

You can rail against STEM all you like, but there is no such thing.  If there has been any STEM funding with this focus, it hasn't found my technology and engineering courses, because not all STEMs are considered worthy of political attention.  The best I've seen out of this are a few more manipulatives in maths classes based on corporate tech-in-a-box, but building a kit isn't engineering.  When you're engineering there are no instructions and the end goal may not even be possible; you certainly don't end up with everyone looking at the same finished product.  That kind of stochastic process is another reason why eng/tech is frowned upon in academic settings; they like everyone to arrive at the same correct answer.  It makes for a clear sense of progress, but learning to deal with potential failure in reality isn't wasted time in school.


In the article that kicked this off, you get a very articulate and scholarly take on the value of a liberal arts education and how it can free you from economic bondage in our overpopulated and automated world.  The down-your-nose 'yeomanship'/servitude argument pasted on STEM and CTE as preparation for the workplace ignores the many soft skills that hands-on technical training can provide, in favour of the argument that students of technology are dimensionless corporate shills whose only interest is to find work in a system that doesn't really need them.  But aren't we all yoked to our broken economic system?  A degree doesn't somehow free you from that commitment, but it will bury you in debt and the attendant servitude.  A technical education costs less and teaches you valuable soft skills that will help you in any vocation, while also offering you a shot at something other than general labour.  The engineering design process that technology training is predicated on would help anyone in any aspect of life where they must self-organize and tackle a problem that may not have a solution.

I have a liberal arts education (English and philosophy majors) and I greatly value the discipline it has brought to both my thinking and writing, but that doesn't mean I don't value hands-on mastery and the attendant good habits that accompany it.  It took me a long time to value my technical, hands-on skills against the constant noise of academic/white-collar prejudice and privilege.  Since moving from English teaching to technology, I face that pressure daily, as do my tech-teaching colleagues.  In speaking to many people I still get the sense that technical, hands-on skills are seen as inferior to academic skills, but I find them complementary, not less than.  It would be quite a thing if we could value a student's technical, hands-on mastery as much as we value their academic grades... or even their sports abilities.

I get the sense that Professor Zaloom believes the future will be full of highly educated academics elucidating on the state of humanity while they float above economic necessities with their intellectual freedom.  I'd argue that learning hands-on technical skills gives you a variety of soft-skills (persistence, self-organization, resilience, humility to name a few) that will help students deal with that overpopulated, automated future every bit as much as a degree might.

If you follow that article through, it's less about STEM and more about what we're going to do in an increasingly automated world populated by more and more people with less and less to do.  In that no-win situation, the value of being able to repair your own technology and understand the hidden systems that regulate your life is another kind of literacy that I think all students should have, especially if they are going to depend on those systems and let them direct their lives.


A good read on the fecundity of hands-on mastery.
Technology education offers that insight along with a plethora of tough soft skills that are wanting in many academic programs, where established reality is whatever the teacher thinks it should be.  There is a hard, real-world edge to technology training that is difficult to find in the mentally constructed world of academic achievement.  Matt Crawford describes management thinking in Shop Class as Soulcraft as having a 'peculiarly chancy and fluid character' because its success criteria change with the whims of the people in charge.  That was my experience in too many academic situations.  You know where you stand in technology because reality isn't fickle.

It's a shame that this pointless acronym has thrown a blanket over the grossly neglected curriculums of technology and engineering while giving even more attention to two of the Disney princesses of academia.  To be honest, I think technology and engineering would be just where they are now had this STEM focus never happened, which tells you something about how this ed-fad has gone down.


Additional Reading:

The rich intersection of a liberal arts background and technology expertise:  Zen and the Art of Motorcycle Maintenance.

Shop Class as Soulcraft is a must read, but so is Matt's follow-up, The World Beyond Your Head.  A philosophical look at the power of tactile skills to free us from consumerism and the mental world of the digital attention economy.