Tuesday, 24 May 2011

stupid is as stupid does

Digital technology has gotten this reputation for curing all our ills as far as student engagement goes. The logic you hear often sounds like, "give the digital natives the technology and sit back! Prepare to be amazed!"

I happened to see "Blackboard Jungle" on the weekend before my first year teaching in 2004. In the 1955 film, a new English teacher desperately tries to engage his angry, disenfranchised inner-city students. He eventually finds that the 'new' film projectors catch their attention. An older teacher asks him hopefully whether the new technology will cure their students' lack of interest. The young teacher shrugs, but he's not about to put down the one thing he's found that takes the heat off.

The frantic grasping at technology hasn't changed. This week someone kindly tweeted this:


... and she's right, it's not the technology. It won't make the teaching work, it won't make people wiser or smarter than they are. Our unwillingness to adapt to change is certainly causing chaos, but what might be worse is our belief that technology will somehow make better people.

In 2006 one of my students brought an Orwellian piece of media futurism into our media studies class:



The part that stuck with me was:

“At its best, edited for the most savvy readers, EPIC is a summary of the world, deeper, broader, and more nuanced than anything ever available before. But at its worst, and for too many, EPIC is merely a collection of trivia, much of it untrue, all of it narrow, shallow, and sensational.”

Pretty good description of modern media use, eh? This piece of speculation was originally written in 2004, and whether or not we end up in a Googlezon meganopoly, the simple fact remains: the internet and digital technology empower users with access to information. Given great access to information, ignorant people will do ignorant things. Stupid people will enable their stupidity in new, interesting and more encompassing ways; digital media gives you what you ask for.

Believe it or not, technology will not magically cure idiocy, or make all students eager, insightful or, in some cases, even vaguely useful. Technology, be it cell phones, computers or even just internet access, has no inherent ability to improve character or intelligence. It's morally neutral, and it tends to hand over information with minimal effort, undercutting attempts to build self-discipline and mental stamina around task completion. In the process, academic skills, especially complex skills that require long developmental times (literacy, logic, etc.), become a foreign concept to a mind that has trained itself on short-term, narcissistic social media navel gazing.

The brain is a very flexible organ. If we train it with asinine navel gazing, it ends up in a feedback loop that develops a wildly inaccurate sense of our abilities and self-worth; social media and technology focused on our own wants and needs will kill humility stone dead.

The idea that teaching needs to change into mere facilitation seems to feed this vacuum. The act of teaching involves a great deal more than making sure students know how to get to information and handing them the technology to do it. Teachers don't just model learning; they also model civil behavior, intelligence in action, and many other traits you hope students will notice and eventually emulate. Left to their own devices (pun intended), many digital natives develop habits that make the digital tools we're building look more like lobotomy instruments than tools to maximize human awareness and learning.

Burying your head in the sand at the onslaught of change doesn't help; ignoring this will just make you irrelevant very quickly.

Adopting a pie-in-the-sky belief in some kind of intuitive magic power children have over technology is ludicrous, and really just another way of burying your head in the sand (you're transferring responsibility to the magic children).

As mentioned in Davidson's article, we need to start recognizing that people are the prime movers and the technology just amplifies the activity, whatever it is. Until we start rationally looking at what is happening in our rapidly evolving mediascape and assigning responsibility to people's choices, we are going to find ourselves creating fictions and blaming gadgets. In rough seas like these, we need to appreciate some hard truths.

Sunday, 22 May 2011

The Perfect Interface

Thinking about tablets recently, I've been trying to imagine the perfect online interface. Since getting a smartphone and doing the Web2.0 thing, I find I don't go to the internet like I used to; getting online is now a micro event, not the main event. Web2.0 wants you to pop in and out of social media, producing content and grabbing information relevant to what you're doing in reality, and that doesn't fit well with a desktop.

If I'm not going to the internet as the main event, but rather as an enhancement to my reality, what's the best way to access it? You'd want something with you all the time: the legendary wearable computer.

I'm not feeling the desktop like I used to. I still use desktop horsepower to game (which is still an event in and of itself) and to move big Photoshop files, but not much else; using the internet as augmented reality doesn't require a monster processor or graphics power. Instead I'm out and about, wanting to catch a moment and push it online quickly and easily without interrupting what's going on. Facebook encourages this somewhat; Twitter relies on it. What you can share online easily is what makes your digital self. You're mute and half invisible online if you can't interact as your virtual self.

I sometimes find entering text on the smartphone frustrating (I have the same problem with tablets), but the fact that they're easy to take everywhere is their ace in the hole. My Xperia has an awesome camera, does good video and has a big enough screen to easily share information on; it comes close to being an ideal tether between meat me and virtual me.

My future ideal device includes a stylish pair of glasses, plus shoes and clothes that recharge from bodily motion or solar power, like the awesome Casio I recently got. Self-powered devices are where all mobile tech should be heading. Having a watch/compass/weather station on my wrist that is essentially self-sufficient makes me aware of all the umbilicalage that connects us to our digital selves.

The perfect device asserts itself only as much as it has to in order to complete online interactions. Photos are a quick motion away; interfaces respond to bodily motions (eye blinks, hand gestures, etc.). Typing by following eye motion? Typing by looking at any surface with a keyboard imposed on it through the glasses? Speech to text, direct speech, and let's drop the textiness altogether?

I guess, somewhere into this, you could be playing an interactive real-world/virtually-enhanced game with people in which how accurately you make spell gestures dictates how well the spell works. You'd see people playing in the park, pointing fingers at each other and dodging virtual paintballs. Gym classes would take on a whole new historical context. You could run 100m against Donovan Bailey and actually see him on the course (way) ahead of you.

William Gibson has a fantastic scene in Spook Country, where the main character is looking (through glasses with a digital screen) at the body of a virtual dead River Phoenix lying on the sidewalk where he actually died. Past and present colliding virtually... imagine that field trip to Quebec City, where you're walking across the Plains of Abraham and seeing the battle unfold around you... or you can spend a day at the reconstructed Globe Theatre watching the King's Men preparing to stage Romeo & Juliet for the first time (complete with cast from Shakespeare in Love).

Virtual Reality doesn't offer nearly the nuance and ease of use that augmented reality does. Here's hoping Moore's Law gets us there sooner than later. I want to actually work up a sweat next time I'm doing a dungeon crawl with my party of adventurers.

Saturday, 21 May 2011

Tablets are like high heels


I've had an opportunity to use a Motorola Xoom tablet this week and report to my board about how it might be used in classrooms. I've been crushing on the idea of getting a tablet for a while now. After using netbooks in class last semester, I love the idea of a rotatable screen that lets you read without over-scrolling, the super battery life, the instant-on functionality and the small form factor.

Last year at ECOO I got to use an ipad for a day, but the wireless was so dodgy (not the ipad's fault) that I barely got any real sense of how it could work. This time round the tablet was with me at work, at home and everywhere in between.

The Xoom has a higher resolution screen, a wider aspect ratio and a faster processor than the ipad2, and runs the Android Honeycomb OS (it's basically a google device). It gets along natively with any google apps and gives you access to the MASSIVE Android marketplace, so your six-year-old can play a lot of Angry Birds. It also plays Flash, so you don't get the internet-lite ipad experience.

One of the amazing things about touch screens is how quickly and intuitively people take to them. Said six-year-old was tossing birds at towers in moments, and skipping through the OS to watch youtube or find new software. As a tool for children, or people new to the world of digital content, tablets make a great opening. They offer a great feeling of immediacy: you're actually touching the content. Keyboards start to look like bars on the door to the digital wonderland. Considering how poor most people's typing is, this might be a tablet's greatest strength.

The Android Honeycomb OS works well enough, though I occasionally experienced bog-downs when trying to type (an agonizing process on a touch screen, which I thought would be better than on my touch screen Android phone, but wasn't). Its biggest drawback was the lack of a FirstClass (school email) Android app, so I couldn't see board email, which makes it somewhat useless as a communication device for me at work (the FirstClass web interface stinks). If our board moves to Google, as it looks like it will, Honeycomb will suddenly look like a smart choice though.

Any kind of data entry is where I fall down on this tablet thing. I've seen certain (Barkerish) people touch typing on ipads (curious to know what her wpm are), but this seems like a painful transition. My typing on the Xoom alternated between thumb typing in landscape mode (where I couldn't reach the middle of the keyboard, and I don't have small hands), thumb typing in portrait mode (where the weight of the tablet made it uncomfortable), and trying to actually type from the home keys with it on my lap or on a table (when it wasn't trying to re-orient itself). If you're a touch typist, the lack of tactile feedback means relearning how to assess accuracy (made more difficult when it pauses on you before barfing out a pile of letters). That lack of response had me deleting half a line of painfully entered text just to go back and make corrections; touching the screen to jump to the specific error was pretty hit and miss, so I often resorted to the 'screw it, I'll start over again' approach.

I like to make content, especially writing, and I can't imagine using a tablet for that. It was uncomfortable even for tweets and social networking; I just didn't like trying to enter data into it. I could work at improving my typing on the screen, but I don't think I'll ever come close to how fast I can type on a good, tactile, nicely spaced keyboard with responsive keys, so why bother?

The other contenty side of things for me is graphics. If I'm working in Photoshop, I need processing horsepower to move big files (not a tablet forte) and very fine control (a super high dpi mouse at minimum, or a very accurate drawing slate). A fingerprint-covered screen that only senses gross motor commands sets off my OCD (I HATE dirty screens, I even clean my car windshield often), and does very little for me in creating graphic content where I want fine control of the environment.

I get the whole tablet thing, I mean, who wouldn't want to look this cool? And tablets aren't without their perks. The battery life is incredible: I ran it all day at school, then it came home and got beaten up on by @banana29 and the mighty Max, often doing very processor-heavy tasks. Even in that consumptive environment, it took 13+ hours of constant on-again, off-again use before it cried for a recharge.

The instant-on functionality is another aspect of that immediacy that must appeal to the old or very young; it removes another barrier to access. All computers should be instant on, with no boot time at all, otherwise the web isn't immediate and becomes a secondary mental realm instead of enhancing our reality. You don't get enhanced reality after a 30-second bootup. Win7 does quite well with this on new laptops: open the lid and it's on. Everything should be that instant, or it's just too far away.

As a web browser, the tablet seems untouchable. I wish someone would design a laptop screen that rotates to vertical for reading and writing, then drops into horizontal mode the odd time you need it; auto-rotation rocks. I think I'd keep it in portrait mode most of the time. I don't watch high-def movies on a laptop, and I'm not sure why wide screens are now the norm; I'd prefer a tall one.

The size of this tablet is pretty sweet too. The Xoom would disappear into any kind of bag with ease, and is very light and so thin as to be invisible.

What I've got here is a device that is only good in a few specific situations. It fits in a very thin place between my smartphone and my laptop, a space that I suspect is actually too small for me to care about now that I've tried it.

I don't care for super small phones, and I'd be just as happy with a big 5-inch smartphone with tablety qualities as I would with a book-sized tablet that works well as a reader but that I can't seem to find another use for. If convergence is what we're aiming for, tablets are an offshoot that will eventually be subsumed by smartphone evolution (I'd bet on built-in, interactive projectors in phones that make bigger screens moot).

The Xoom and ipad look fantastic, but the touch screen makes me nuts when it gets finger-printy, and it's sometimes unresponsive (though I must admit to fewer problems there with the ipad, so maybe that's an Android issue, or just what you get for not having to run any gadgets or Flash). You wouldn't type anything meaningful on a tablet, and you can't take decent photos or video with it (you'd do far better with a dedicated camera), but it looks fantastic and futuristic, and makes the user look very chic.

Like those awesome Tron-inspired stilettos, the Xoom is great to think about using, but after 10 minutes you wouldn't be getting much done and it would just hurt, though you'd still look fabulous!

Tablets are like high heels PART DEUX! (complete with awesome geeky high heels!)

Saturday, 14 May 2011

Expectations

In a very hands-on grade 12 computer technology class, we've built our own network from scratch and students have been working through the CompTIA A+ technician's course. The final goal of the course is to get students actually certified as PC technicians. If they go on to college, they'll already have the first certification they need. If they go to work, they'll be able to work at Future Shop, Staples or wherever (all those computer support people must have A+ certification).

The goal was a relevant, purpose driven class with real world value and as much technology as I could possibly provide.

I've spent a lot of time and energy getting my hands on equipment and making space for the students to develop technology from the ground up. I hadn't spent as much time walking students through some very information-heavy review; my hope was that the hands-on technology would offer in-class opportunities to review the material.

Some students, once they got the network to a functional level, got very distracted by the fact that it could play networked games. This conversation happened recently when I suggested they needed to be ready to review the entire course because we were running out of time. One student felt that he hadn't been handed the learning on a silver enough platter:

Grade 12 student: "but you're the teacher, shouldn't you be making us learn this?" (instead of letting us play games)

Me: "I've done back flips to get you guys access to multiple A+ courses, material and testing practice. I've also drilled you on the material on a daily basis. When we get done with that, you are given time to read ahead on future material, review what you missed, or apply your theory hands on. At that point I want to help people on a one on one basis. If you choose to play games with that time, it's my job to force you to learn?"

student: "..."

me: "I'm not here to force you to learn, no one can do that. You're senior students on the verge of graduating. If this were a junior class, we'd have more regimented lessons, but it isn't. I expect you to be able to address yourself to what's going on. I'm not about to force your head into the learning water here, if you don't want to drink, that is your choice. It puts you in a bad place in post secondary though, they don't spoon feed at all."

student: "but we're not college students, you shouldn't run the class like that."

me: "you're about to be, at what point would you like to transition into post secondary if not in grade 12? Do you think they are going to spoon feed you next year?"

Not ten minutes later we wrapped up the chapter review and students were let loose on the network. Guess what he did...

me: "After our recent conversation, what you're doing there is quite provocative. Are you trying to aggravate me?"

He didn't stop, he just minimized the window. I feel sorry for the guy.

This class has a truly awesome amount of technology at its disposal; I'm jealous. When I took my certifications, I had to take apart and reassemble the only PC we had in the house, and look at pictures of other ones because I had nothing else on hand. I didn't have a certified technician there, enthusiastic about experimenting and throwing everything from imacs to netbooks to laptops to multiple desktop formats into the mix. I also had to pay four times what these students are getting their certs for. This guy is spoiled for choice, and all he wants to do is play (fairly lame) old games and whine about not being treated like he's ten years old. I'm not saying they shouldn't take a break and blow off some steam, but they seldom put in the effort to deserve the break.

I've got some good students in that class, but they're all bitten to a greater or lesser degree by their wealth; it makes them complacent and lazy. When I think about what the students in my computer club at my old school in the suburbs would have done with all of this equipment, it makes me sad. Even when you make the learning meaningful and individualized and pack it with technology, you can't force a spoiled, lazy horse to drink it up.

Friday, 13 May 2011

Setting the Stage

I somehow managed to finagle my way into an Edtech symposium this week on the sustainable development of digital technology in education. Amidst former deputy ministers of education, board CIOs and other provincial education types, I got to see the other side of the equation.

This year as head of Computers/IT has been good for this, actually: getting my head out of the classroom context and seeing the bigger picture. I've been able to attend imaging committee meetings at the board level and gained an understanding of why everyone can't have whatever they want. At this past meeting I tweeted that I felt like a sergeant from the trenches who suddenly found himself in a five-star strategic planning meeting; it was engrossing.

From Hamilton-Wentworth's awesome curriculum push into 21st Century Fluencies to what New Brunswick has been doing to get ahead of the game, I found the board and provincial interest in pushing ahead with our use of technology in the class to be... a relief!

During any battle to use digital technology in the classroom (getting access, getting it to work, getting students over their jitters), I often feel like I'm losing ground. I'll take one step forward in implementing a new piece of technology in a lesson or on a school-wide basis, and get knocked back two steps by angry senior teachers who feel out of step with what's going on, or lack of access to equipment, or failure of the tech, or OCT/board restrictions that seem panicky and unfounded, or the union telling a horror story that seems to justify panicky and unfounded restrictions...

One of my preliminary thoughts before I went was to ask how to beat the malaise of that feeling; how not to give up. I've heard from colleagues about how they burn out trying to push that envelope, and ultimately just disappear back into their classrooms and do their own thing. John Kershaw had an honest and helpful response to the question:

During his talk he spoke of a big setback: the winning party in an election used his one-laptop-per-student policy as an example of government waste, and won on it, after telling him that they supported the program. This is exactly the kind of thing that brings idealists to their knees. His solution was pragmatic: work on your environment. Set the stage so that what you're doing becomes a certainty, if not now, then eventually.

In the case of the laptop plan, he'd done groundwork with business groups (who were onside for more digitally literate graduates), the general public (who wanted their children more literate with technology), and the school system (which wanted to better prepare students for their futures). That groundwork meant that even though the politics turned on him in the moment, the plan eventually went through, and he got what he thought was important: a New Brunswick education system that actually mattered in a 21st Century context.

I've been thinking this over for a few days now. If you're on the right side of history, if you know you're fighting a good fight, you've got to shrug off the knock-backs. If you keep working to create the environment you're aiming for, and you know you're part of a wave of change, have some faith that the truth of what you're trying to do will eventually win out.

Tuesday, 10 May 2011

Spotty Internet & Spoiling The Argument

I'm feeling bad about bad-mouthing the board internet now. The last few events I've been to point to consistently crappy wifi execution at the enterprise level. Does good high-usage wifi exist anywhere?

This week at the Mississauga Better Living Centre, the wifi was so slow as to be useless. Signal strength was fine; the throughput was nonexistent. When it takes more than 10 seconds to load Google, something isn't right.

I've been to the Sheraton Parkway North in Richmond Hill twice this year. Better than Mississauga's attempt, but still boggy and slow at times, and again, regardless of signal strength.

One of the sure-fire killers of tech use in classrooms is boggy internet. Teachers are on tenterhooks every time they try something online. If it fails to load, they're stressed and tend to face a lot of blowback from students looking for an out. If you're going to pitch the cloud, online collaborative tools and an alternative to the desktop, you're not going to do it with patchy internet.

Our school wifi system is a monster. It cost a fortune and, in theory, works very well... until all our traffic gets funneled into the queue we share with 88 other schools for a single internet connection through the board office; then, not so good. I constantly hear students railing against the 'crap computers in this school'. It's not the computers; you'd think the digital natives would know that.

Back in the day when I was learning networking (the computer kind, not the people kind), we were told again and again to design out any SPoFs. Single Points of Failure will kill a network stone dead. They'll also kill the use of technology in the classroom for any but the most hard-core digital evangelists. Nobody needs the wasted time, stress and headache of setting up a lesson only to have it fail because the internet wasn't there when you needed it.

People always get hyped about technology; I do too. But things like chalkboards and chalk? I've never had that technology fail on me of its own accord. Can you imagine if, 20% of the time you went to write on the board, nothing came out? Its certainty is what makes it good technology. The same can be said for paper and pen...

I hope that we are not just looking for faster network speeds, but also for resiliency in our networks. I'd love to see my next-gen wifi receiver use whichever band offers the best throughput (N, G, B, I don't care; they never get near their theoretical bandwidth limits anyway). I'd love to see a school network that never hits its bandwidth limits because it shapes and prioritizes traffic to ensure smooth operation (Facebook packets low priority, please), and I'd love to see wifi networks intelligently and resiliently handling traffic crush, sharing and shaping to push data not necessarily as quickly, but as efficiently as they can.
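To make the shaping idea concrete, here's a toy sketch of the kind of scheduler I mean (Python, with made-up traffic classes and priorities; no vendor does it exactly this way): the most important queued packet always goes first, but everything left waiting gets aged upward so the low-priority stuff trickles through instead of starving.

```python
import heapq
from itertools import count

# Made-up traffic classes: lower number = higher priority.
PRIORITY = {"lesson-stream": 0, "board-email": 1, "facebook": 2}

class Shaper:
    """Toy priority shaper: send the most important packet first, but age
    whatever is left waiting so low-priority traffic never starves."""

    def __init__(self, aging=0.1):
        self.queue = []      # heap of (effective_priority, seq, packet)
        self.seq = count()   # tie-breaker keeps FIFO order within a class
        self.aging = aging   # how fast waiting packets gain priority

    def enqueue(self, traffic_class, payload):
        prio = PRIORITY.get(traffic_class, max(PRIORITY.values()))
        heapq.heappush(self.queue, (prio, next(self.seq), (traffic_class, payload)))

    def dequeue(self):
        if not self.queue:
            return None
        _, _, packet = heapq.heappop(self.queue)
        # Age everything left behind so facebook packets eventually get through.
        self.queue = [(p - self.aging, s, pkt) for p, s, pkt in self.queue]
        heapq.heapify(self.queue)
        return packet

shaper = Shaper()
shaper.enqueue("facebook", "status update")
shaper.enqueue("lesson-stream", "video chunk")
shaper.enqueue("board-email", "memo")
for _ in range(3):
    print(shaper.dequeue())  # lesson-stream first, facebook last
```

The point isn't this particular implementation; it's that resiliency comes from deciding what matters when the pipe is full, instead of letting everything fight for the same queue.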

I fear that in the headlong rush for faster transfer speeds we're forgetting to build any kind of resiliency into our networks, which will make things like Chromebooks look like little more than curiosities. No one is interested in a computer that fails as often as the poor wifi I've seen implemented.

Sunday, 8 May 2011

Demonizing Public Employment

An article by a conservative think tank, disseminated by a conservative media outlet:

http://m.torontosun.com/News/1304708716881

"Teachers have also seen very decent raises — 12.55% between 2008 and 2012 (10.4% for public elementary teachers) — while the rest of us have lost jobs or are just treading water."

Facts by the government:

http://www.statcan.gc.ca/subjects-sujets/cpi-ipc/cpi-ipc-eng.htm

"The largest increase occurred in the transportation component, where prices rose 6.6% in the 12 months to March."

Here's where the opinion starts:

So, according to StatsCan, we are in an inflationary spiral (a boom/bust cycle predicted by Jeff Rubin in Why Your World Is About to Get a Whole Lot Smaller, caused by increasing limitations on oil production and economies designed to run on nothing else).

If we're averaging 2-3+% inflation every year since 2008, that ENORMOUS 12.55% teacher salary increase actually looks more like a net loss in standard of living (2% in 2008, 2% in 2009, 3% in 2010, 3% in 2011 and 3% in 2012: roughly 13% cumulative). But we shouldn't even try to keep up with the standard of living, should we?
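For what it's worth, that back-of-the-envelope sum slightly understates things, because inflation compounds rather than adds. A quick sanity check (a Python sketch using the same assumed rates):

```python
# Assumed annual inflation rates from the argument above (2008-2012).
rates = {2008: 0.02, 2009: 0.02, 2010: 0.03, 2011: 0.03, 2012: 0.03}

cumulative = 1.0
for year in sorted(rates):
    cumulative *= 1 + rates[year]    # inflation compounds, it doesn't just add

raise_factor = 1.1255                # the Sun's "very decent" 12.55% raise

print(f"Simple sum of rates:  {sum(rates.values()):.1%}")    # 13.0%
print(f"Compounded inflation: {cumulative - 1:.1%}")         # ~13.7%
print(f"Real change in pay:   {raise_factor / cumulative - 1:+.1%}")  # ~-1.0%
```

Either way you slice it, a 12.55% raise against roughly 13% inflation over the same stretch is a cut, not a windfall.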

Why is the economy in such a mess? Because the free market has swallowed itself with its own greed. Public employees didn't crash the economy; private business did.

I first heard this a couple of years ago in the middle of the financial meltdown, when an investment banker had the nerve (after his industry made a mockery of capitalism) to suggest that local waste removal workers should take pay cuts to help pay for something they had nothing to do with. The people who orchestrated this have somehow convinced the dull, cow-eyed public that they should enjoy a less restricted marketplace and continue to serve themselves bailouts with taxpayers' money.

In an unrestricted marketplace, private employees lose their jobs, take pay cuts and can do nothing; with no oversight they are indentured servants to the wealthy. They are then incited to riot against the public sector employees who work for the social collective (government), performing duties considered vital to the public good. In the process there's an odd flip, where private wage earners come to feel that what they do (putting money in rich people's pockets) is somehow more inherently valuable than what a public employee does (earning a living while serving the public good).

I'm choking on this nonsense. Evidently business and the economy are vital to us, but we shouldn't oversee them and ensure their smooth operation. We should eviscerate government services and oversight, and put all that money back into the pockets of a self-serving marketplace that would destroy itself for short-term gains benefiting a tiny percentage of people. Then they Jedi mind trick a weak-willed public, whom they employ as minimally as possible, into accepting the lie that private sector salaries are somehow more honestly earned than public sector ones.

Don't pay taxes and slash government oversight now so you can pay enormous bailouts later. It's not a great deal, you idiots, and in the meantime you're fired and rehired for less over and over again. Left to its own devices, an unrestricted marketplace will place as low a value on human work as it can. There are more and more people in the world; where do you think that puts your value as a worker?

Democracy isn't going to work when special interest groups make claims regardless of the truth, and are allowed to manipulate media to indoctrinate a dim, accepting public.

Don't feel bad about working for the public good; it's one hell of a lot better than working as disposable labour to make the rich a bit richer.

And if you work for a private company? That's not a bad thing, unless you give them the reins; they'll sell you for a short-term gain in a second (if it hasn't happened to you already, it will). Only intelligent public oversight will ensure a reasonable, sustainable, fair private sector. Left to itself, it would cannibalize itself.

Wednesday, 4 May 2011

Post Election Rants (best of facebook)

10:12am, May 6: 61.4% voter turnout. Tax refunds should be automatically applied to the national debt if voters can't be bothered to do this simple thing.

11:15am, May 6: Wow, the conservatives even won our student vote. The future's so bright, I've got to start building a post apocalyptic shelter!

Responses:

... the mob that elects politicians is only interested in its own affairs; it is incapable of looking at the greater good. We're at the pinnacle of mob-run society (call it democracy if you want to). In a thousand years, assuming there is anyone around to write about it, democratic capitalism will be described as the engine that (hopefully only almost) destroyed human civilization.

Any society based on self-interest and greed is doomed; it's just a matter of time.

Nature never rewards mindless voracity, it seeks balance.

***

In response to "it's better than if the NDP or Libs won it"

Go for a walking tour of Northern Alberta; you might think differently. Ask anyone internationally connected how Canada's reputation has dropped, especially over environmental misdirection.

Somehow, in the past 24 hours, 24% of Canadians have implicitly endorsed Parliamentary contempt. Perhaps we should just chuck the system entirely, if the ruling party ignores it, and the opposition parties are worse... Canadian democracy's a sham!

I've got to stop reading factual, science-based books on climate change (http://www.tvo.org/TVOsites/WebObjects/TvoMicrosite.woa?b%3F9078791267844400000). No one else will care until it's too late; they all want cheap gas and business as usual. Our business as usual is making slaves of our grandchildren. We're not even intent on trying to find a way out; the majority just want things to stay the same.

Any aliens monitoring facebook? I'm ready to go back to the mothership!

***

May 3rd, 10:53pm: After another epic failure of the first-past-the-post system to represent actual voter interest, think we have any chance of seeing a fair and representative system under King Stephen? I wouldn't hold my breath. Can't wait to see the voter turnout next time around. We've got to have one of the only democratic systems that actually encourages voter apathy.

More Responses (to "it's pretty much business as usual with a stable government, so why worry"):

One in four people just voted for parliamentary contempt. In one riding, a complete turd burger who openly lied to everyone got re-elected (nice one, Oda). One in four Canadians have open contempt for our governmental system and support a party that does too (and now has a majority). If our democracy's based on parliament, then it's a sham.

Canada won't become a dictatorship, but it will continue to be a shifty, lying international presence that says one thing, does another and makes slaves of future generations in the process.

Oh, and more than one in three Canadians couldn't be bothered to vote at all. It's not really a democracy, is it? It's more like gangs of roving political interest groups in-fighting and self-aggrandizing (I say that about all of the parties).

Monday, 2 May 2011

Mobilizing Technology Access in Schools

I've long been a fan of mobile technology. My first 486 (and colour screen) was an Acer laptop, and I've owned a steady stream of laptops and even one of those LCD word-processor-only writing machines. The idea of mobile computing has always felt like the future of technology: if computing is ultimately an extension of ourselves and our abilities, then it obviously shouldn't be chained to a desk. A human/machine future of cyborg coolness isn't going to happen if we have to orient ourselves to a desk.

In education, we are still very much in a 20th Century mindset about technology access: expensive, breakable desktops in shared labs with little oversight and high breakage rates. In a way, we're training students to be office workers by sitting them in areas modeled on cubicle land. In addition, these labs use a lot of electricity (more when most teachers walk out without requiring students to turn the machines off, often over a weekend or a March break) and generate a significant amount of heat that we deal with by turning up the air conditioning.

Mobile tech offers us low-energy, agile access that can be grafted to specific teachers and departments (giving us that needed oversight of the equipment). Mobile tech also tends to be tougher by nature, having been designed for movement and use in multiple environments; it's not nearly as fragile as its desktop alternative.

My future school would keep full desktop labs only where they're actually needed (a CAD design lab, a media arts lab; that's pretty much it). The other labs get remade into general-purpose learning spaces, and the massive budget that went into creating them goes toward department-managed mobile labs and improving poor school network bandwidth. The charge carts sit under the eye of specific people and can be lent out within departments as needed. The end result is tougher tech with better oversight.

This isn't all about tablets either. In some instances (research, light text work on the web, media viewing and generation) something like the ipad excels; as a long-form text entry device it does not. These mobile labs would consist of ipad class sets and netbook class sets. At 6-to-1 (ipad) or 7-to-1 (netbook) device-cost ratios against full desktop systems, the effective ratio works out to roughly three to one once you count in charge carts, wireless printers and the rest of the new infrastructure needed to get away from the holes in the wall and the world of desks (I run the per-seat math after the fact sheet below).

Come to think of it, I'd love desks on rollers, completely mobile spaces that encourage changes in formation and function. If the technology can do it, why not the furniture?


A quick fact sheet to end it:

- ipads cost about $250 apiece; 60 ipads (almost 3 class sets?) cost about $34,000 including charge carts etc.
- desktop PCs cost about $1800 a seat; a typical lab of 24 PCs costs about $45,000. We average about $300 a week in repairs to these shared labs.
- each of those desktops uses 15x more electricity than an ipad, and the ipads can charge at off-peak times, further lowering electrical overhead and stress on the grid.
- because of the lower power draw, heat generation is much less of a problem, so you don't need extra air conditioning to deal with it.
- at end of life, an ipad results in about 600 grams of waste, and Apple goes to great lengths to reduce toxic materials in its products. A typical desktop results in 1-3 kilograms of electronic waste (roughly 2-5 times as much).
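Running the fact sheet's own rough numbers through the per-seat math (the figures below are just the estimates above, nothing more precise):

```python
# Rough figures from the fact sheet above, nothing more precise than that.
ipad_lab_cost, ipad_seats = 34000, 60   # 60 ipads incl. charge carts etc.
pc_lab_cost, pc_seats = 45000, 24       # a typical 24-seat desktop lab

ipad_per_seat = ipad_lab_cost / ipad_seats   # ~$567
pc_per_seat = pc_lab_cost / pc_seats         # ~$1,875

print(f"ipad seat:    ${ipad_per_seat:,.0f}")
print(f"desktop seat: ${pc_per_seat:,.0f}")
print(f"ratio:        {pc_per_seat / ipad_per_seat:.1f} to 1")  # ~3.3 to 1
```

That's where the 'roughly three to one' comes from: the raw device prices are 6 or 7 to 1, but carts, printers and the rest of the new infrastructure eat into the difference.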

Sunday, 1 May 2011

Simulation In Education: DM as teacher

Simulation in education goes under many names nowadays. Gamification is a gross simplification of the application of game play to learning. You can't gamify lessons and expect students to have a genuine experience, yet gamification has been the catchword educators have picked up on in trying to access gaming culture.

You can't throw badges on completion tasks and call it a game. Game play requires a coherent internal system of interaction that rewards contextual, interactive play. Even meta-game play (hacking, working the system, etc) should be integrated by the game creator. The more complete the contextualization of a game, the more effective it is as a game and the more immersive (and genuine) it is as a learning experience.

Treating game play or in-class simulation as an add-on to existing, simplistic lessons is certain to fail. You might have success the first time, based on student response to a new, novel approach to learning, but repeating a simplistic gaming pattern will quickly cause students to drop out of the simulation. The game has to have enough complexity and contextual development, or it will be too easy for students to step out of it; it needs to be encompassing.

Some thoughts on sims in education:

- Teacher as referee rather than resource puts the focus on the student to figure out the material. Sports do this well, creating an apparently certain context (it's all made up, but you couldn't convince a hockey player of the arbitrary nature of the rules they play within)

- “The point of economic policy in a game isn't to simulate reality; it's to make the synthetic scarcity so entertaining that the truly scarce good (the players’ time) goes toward solving problems in the game, not in the outer world.” (Geekonomics). Simulation should be designed around maximizing the player’s experience within the game context.

- Immersion is a powerful thing! The key is to honour a student’s immersion in a game by rewarding their efforts within the simulation.

- What is better, intrinsic or extrinsic motivation? (Intrinsic lasts; extrinsic is transient.)

- Must develop intrinsic motivation! It's better to have a 'good in itself', a summum bonum, or some fecundity (both much more motivating); otherwise players are just jumping through hoops, the teacher won't get their best work, and students won't get their best learning.

- Is curriculum motivating in itself? No, it's just a set of arbitrary government rules. At best it offers a Foursquare-like badging system (grades), or a sudden, harsh result (post-secondary options). Grade leveling is eased all the way along until they hit the wall of trying to access post-secondary (something that seems far away in the teenage mind). Wouldn't it be interesting if we could gamify the student experience? "You're a level 11 writer and a level 6 hockey player? Cool." (A rough sketch of that kind of leveling follows this list.)

- What makes a motivated student? Relevance of material? Control of the situation? Social interaction? Non-confrontational relationship with the teacher? Strong interpersonal relationship with the teacher? Sense of self-direction? Self-confidence?

- Immersive simulation adapts to each student's experience, (must) offers contextual, supporting material, develops confidence because the student's own experience prompts the learning, and develops a supportive, non-confrontational relationship with the teacher.
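As promised above, a rough sketch of what per-skill leveling might look like; the XP thresholds, skill names and numbers are all invented purely for illustration:

```python
# Toy per-skill leveling: thresholds grow so early levels come fast
# (quick wins for beginners) and later ones demand sustained work.
def level_for(xp, base=100, growth=1.5):
    level, needed = 0, base
    while xp >= needed:
        xp -= needed
        level += 1
        needed = int(needed * growth)
    return level

class Student:
    def __init__(self, name):
        self.name, self.xp = name, {}

    def earn(self, skill, amount):
        self.xp[skill] = self.xp.get(skill, 0) + amount

    def levels(self):
        return {skill: level_for(points) for skill, points in self.xp.items()}

s = Student("Alanna")
s.earn("writing", 5200)   # years of essays and blog posts
s.earn("hockey", 800)     # a couple of seasons
print(s.levels())         # {'writing': 8, 'hockey': 3}
```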

Simulation development has to go well beyond the Khan Academy approach. It has to offer an immersive, meaningful, personalized experience, and you can't do that by adapting existing lessons; you need to begin with big ideas and work the lessons into that coherent whole.

Types of Genius

I just re-read a fantastic article in WIRED about types of genius.

After examining art history, an economics professor noticed two distinct expressions of genius. There is the Conceptualist, who usually goes right after her goals with a preconceived notion of how to get there. Conceptualists usually peak early and loudly; they are the 'typical' kind of child genius people think of, like Mozart. The less well known creative genius is the Experimentalist, who develops slowly across a lifetime and whose greatest work usually comes later in life.

Someone like Jackson Pollock didn't really start producing until his thirties and didn't hit his stride until well into his forties. His early work is terrible. He developed his style through years of trial and error; hence, an experimentalist.

Picasso’s greatest works came early and created an incredible shock wave. He had a preconceived notion of what he wanted to do and did it. As a conceptualist, his work presented a radical change in how things were done. While he produced many great works across his long life, it is generally understood that his early work is his strongest.

I've always liked Robert Frost, and now that I know his history, I see he's an experimentalist, just like me. It's nice to be in such good company. As a late bloomer myself, I remember the painful efforts of my teachers to educate me when I simply wasn't ready for it. I was always a good reader and writer, but even my English teachers (I now have an honours degree in English) couldn't reach me (“a disruptive influence in class”). I finally had the sense to drop out (something kids aren't allowed to do any more) and work for a few years before I went back and graduated at the age of 22.

It makes you wonder just what a FAILURE in a course really means. I had my fair share of them, and they weren't exactly great for my slow-motion approach to development.

The recent round of 'your son is not up to STANDARDS' from his elementary teachers had me very worried, but when I dug up this article again I felt a bit better. Even geniuses can arrive last. Being off-average in school is by no means an indicator of your actual abilities; it's simply a system based on averages. Exceptionality lives outside of those averages, and I'd rather be there than in the NORMAL range.

Archive: 2007: Artist Training With Historical Context

Summary of: http://atking.ca/timothy/arttraining.htm

Art education has evolved to meet the needs of the human society in which it exists. In an earlier, less complex society, the apprenticeship system mirrored the human family structure, allowing practitioners to work in an intensive, personalized environment. As the population grew, this one-on-one instruction was no longer possible. Educational expectations made it financially impossible for a teacher to have only a handful of students over the course of a career. Apprenticeships became guild-affiliated, and finally the training of visual artists became the purview of specialized institutions of higher education. During this advancement, the personal, mentoring aspect of the apprenticeship system has been lost.

The modern view of visual arts is complex. Once a straightforward trade based entirely on quantifiable, observable skills development around the recreation of natural forms, the visual artist has become something of a hybrid, straddling the line between the experiential, materials-handling, hand-eye skills associated with a skilled trade and the mental disciplines associated with aesthetics, philosophy, art history and the development of a personalized, unique artistic sensibility. The requirement of both of these rigorous mental and physical aspects within one field is rare; few other disciplines demand the mental athleticism and hand-eye skills that mastery in visual arts does. Teaching to this requirement is an ongoing struggle.

The benefits of this research in terms of presenting art history are fairly straightforward. What is perhaps more valuable to me is an awareness of just how difficult it is to balance the widely differing needs of visual arts in one course of study. My own background suggested that high school visual arts attempts to focus too much on the mental aspects of the discipline and leaves the challenging (and often repetitive) hand-eye skills development to college. My initial drive in reviewing the history of art and art training was to resurrect an interest in improving the technical proficiency of the high school visual arts student by recreating something of the intensity I experienced while apprenticing.

In retrospect, I think this will not work. As an apprentice, I was financially and professionally obliged to work through some very difficult material. Dropping out would have cost me a great deal of money, not to mention lost me my job. High school students do not have this motivation, especially in visual arts which is not even a mandatory course. In order to serve as wide a public audience as possible, it makes sense to design visual arts curriculum around Socrates’ view of visual arts, as a course designed to create an interest in the visual arts as part of a liberal arts education. This would, of course, require students to become aware of the means of production of visual arts (so studio work is still an important portion of the curriculum), but it would not require the students themselves to be artists with the associated intensity of expression. I find this very similar to the current atmosphere in English, where literacy is stressed, but the teacher isn’t looking to cater to student writers. It is assumed that these students will display competence in the basic skills and find ways to express their writing skills in specialized courses or outside of the curriculum.

I find it unfortunate that curriculum cannot cater to mastery-focused students in this way. For high school visual artists, the course would simply be an empty survey of the subject matter while they wait for an opportunity to really exercise their creativity in a post-secondary setting more suited to their need for specialization. This situation makes me wish for a means of bypassing years of unproductive basics, especially if a student wants to specialize intensively in a particular subject. Early graduation for these students might move them into more effective learning. If an exceptional fine arts student demonstrated sufficient technical ability and the wish to pursue their discipline more aggressively, the opportunity to apply to post-secondary institutions at the age of 16 or 17 might make public education more than simply waiting to turn eighteen.

Note: Interestingly, the high skills arts major became an option only two years after this was written.

Note: Interesting tie-in with the Mastery blog entry from last week.

Archive: 1999: Bloodsport, the gore of experience (points)

Piles of corpses and rivers of blood…

I'm currently swinging my way through Neverwinter Nights, and last night, after clearing out a room of guards, I paused for a moment. Bodies lay scattered around me and the blood was thick on the floor. Into my character's head came the thought: "I just murdered eight men."

The bodies just fade away in NWN; it's all very antiseptic and clean (and I imagine it makes life easier for the graphics card). But bodies don't really fade away, do they? In a more realistic world, guards and investigators would be swarming around that house shortly after the shift change found their slaughtered companions. People who saw me enter and leave with heavy pockets would have been questioned, the bodies would not have disappeared, and my life would have been forever changed by that action.

I think about the mountains of corpses I've made in this game (which I'm otherwise enjoying; it is quite beautifully rendered), and I'm only on Chapter 2! This isn't slagging NWN specifically; all computer-based role playing games do this. I think they do it because the people who design and make the games aren't role-players; they're programmers and marketing types, people who think linearly and modularly. I know it's easy for game makers to make experience = killing because it's mechanical and simple and it satisfies an innate human need for violence, but if graphics are getting as good as they are (almost movie quality at times), then perhaps this lazy approach to game design should finally be put aside. I don't think it does anyone any good to control a mass murderer, especially when this usually happens 'for the greater good' in the context of the game.

Why can't my opponents see that I could easily kill them, and surrender? Why couldn't I earn experience by taking it away from people I subdue (that even makes sense in a balance-of-nature sort of way)? Imagine a young fighter who gains experience, and loses it too when he is subdued by a powerful foe. If he ever got knocked back down to zero experience, I'm sure he'd be rethinking his career choice. It would also help in a game situation, where developers wouldn't have to worry so much about linear design. With lethality a rare occurrence, but being subdued having an immediate effect on experience, I imagine most characters would be more careful, especially if this system also took away or greatly minimized the 'save game' crutch. I take many more risks knowing that I'm 10 seconds of hard drive access away from trying it again. Continuity would help players develop a real connection to their characters instead of using them as tools to attack a linear plot. (There's a rough sketch of this system a couple of paragraphs down.)

Why does it have to be about gallons of blood and piles of corpses? ... and why does violence have to be mechanical?

Don't get me wrong: I'm a hockey player, a kendo practitioner, and I've had a go at half a dozen martial arts. Violence isn't a stranger to me, but maybe that's why I've got respect for it, because I'm familiar with it.

I enjoy a good fight more than most people, but what happens in NWN (and every other computer RPG I've played) is not a good fight; it's a dumbed-down fight against dimensionless opponents. Do you know how hard it is to find an opponent who won't cut and run at the first injury? 99% of opponents are not committed to the fight; they are committed to their own well-being (as they should be). I think it's safe to say that the vast majority of people you meet will do anything to avoid a physical confrontation, and the most dangerous opponents are those who willingly consider one but avoid it if circumstances aren't to their liking. In a more lawless society, that might mean they'll try to get you later when you're busy, asleep or otherwise indisposed. That would only enrich the experience more. Having repeat encounters with a character you first think is a coward and later learn is a vendetta-ridden lunatic bent on revenge at all costs might make you reconsider being a jackass in the first place. People aren't always what they appear at first blush; it's part of their charm.

Have you ever been in a fist fight? Can you remember the adrenaline? That was only a fist fight! Can you imagine what it would feel like with a real sword in your hand and an opponent facing you with a lethal weapon? Wouldn't you think twice if the person (or monster) you were facing had a hungry gleam in their eye? If you submit early, perhaps you escape intact, without losing any equipment and with minimal experience point loss. If you mouth off and get in over your head, your 'teacher' will certainly take more of your valuables as well as skim off more experience. You'd have to gamble to rise quickly: if you're third level and you face off against a fifth level character, you will probably lose, but if you win by luck or skill you take more experience suddenly and find yourself levelling up. Wouldn't they think twice if they saw that same look in your eye?
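Pulling those rules together, here's a bare-bones sketch of that subdual economy (the numbers, names and formulas are all mine, just to show the shape of it): no experience is conjured out of corpses; a subdued fighter transfers some of theirs to the victor, and punching up pays better.

```python
import random

# Toy subdual economy: no XP is created from corpses; a subdued fighter
# transfers some of theirs to the victor. All thresholds are invented.
class Fighter:
    def __init__(self, name, xp):
        self.name, self.xp = name, xp

    @property
    def level(self):
        return 1 + self.xp // 1000   # made-up level thresholds

def subdue(attacker, defender, skim=0.10, upset_bonus=2.0):
    """One bout: the winner skims XP from the loser; upsets pay double."""
    p_win = attacker.level / (attacker.level + defender.level)
    attacker_won = random.random() < p_win
    winner, loser = (attacker, defender) if attacker_won else (defender, attacker)
    taken = int(loser.xp * skim)
    if winner.level < loser.level:   # the gamble: beating a stronger foe
        taken = int(taken * upset_bonus)
    loser.xp -= taken
    winner.xp += taken
    return winner, taken

rogue = Fighter("third-level rogue", 2500)
guard = Fighter("fifth-level guard", 4800)
winner, taken = subdue(rogue, guard)
print(f"{winner.name} wins the bout and skims {taken} xp")
```

Lose the save-game crutch and run bouts like this over a whole career, and a character's history starts to mean something.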

I'd like my role-playing battles to approach the intensity (and rarity) of the real thing. It should never be mechanical, it should never be done without thought, and it should almost never end in a mortal wound. Having to submit and then being sold into slavery would greatly enrich a character's background and provide a solid source of motivation to get better with that damn sword.

There are so many ways a role playing world could become encompassing, but the game makers don't seem to want to take that step. If it sells as it is, why tamper with it, I guess. Well, here's another angle: build it and they will come. If a designer out there can come up with a role playing game that incorporates a respect for violence and concentrates on developing a stronger tie between player and character, I'll be the first to sign up.

Just some thoughts while standing ankle deep in the blood of guards who were just doing their jobs.

Archive: 1998: FPS: A Gamer's Reply

First Person Shooter: a gamer's reply

I'll come straight out and tell you that I'm an avid video game player, and have been since I got hooked on Donkey Kong Jr. when I was ten years old. From dotty eight-bit graphics on my first Vic 20 to the Pentium 4 powerhouses and monster video cards on my home network today, I'm a technology junkie of the highest order. A simple decision by my parents set me down the path of intelligent adoption early on: I begged for an Intellivision; they got me a Vic 20. Suddenly I was programming instead of mindlessly pushing buttons; I was a creator, not just a user. Twenty years later I'm working as a systems trainer and technician.

From that brief biography, here is my reaction to the documentary "First Person Shooter" I saw on CTV last Sunday (http://www.firstpersonshooter.tv/index.html), created by Robin Benger, a TV producer and film maker. Rather than simply trying to scare you while keeping a semblance of veracity and professional indifference, I'll try to unpack the assumptions and the real intent behind this lightly veiled propaganda. In its desperate attempts to stay on top, I find the current popular media (and in this case the medium of television itself) taking poorly researched shots at the latest distractions. In "First Person Shooter", the father of a child deeply addicted to a game called "Counter-Strike" uses his own medium to analyze and ultimately criticize his child's dependency on a competing one.

The general issue of addiction can be dealt with in fairly specific terms. Game playing, even in its most chronic form, certainly can't be classified as a physical addiction; at best it can be described as a reinforced behaviour. What reinforces the behaviour of a chronic player: a need for control, expression, respect? Online play is no longer just the wave of the future; it is here today. Community, interaction and team building are a huge part of the modern online gaming experience. A child addicted to this is a child addicted to a need to belong; hardly a damning statement, and one that prompts the question: why are these things so lacking in his non-virtual existence?

What is especially laughable about Mr. Benger's documentary is that he uses his medium of television to debunk a new and competing medium. I wonder if he is more upset that his child is having trouble prioritizing his life, or that his child isn't supporting Mr. Benger's own media infatuation. The question of what television gains by attacking a competing medium must be an integral part of this examination.

There is a small step between an addictive personality and an obsessive one, and in either case they can lead to amazing expression or discovery. The price people pay for this kind of infatuation can also lead to depression, ultimately leaving them unable to support their need. In short, if you're shooting for a small target like genius, you will often miss, and the results aren't pretty. If a child becomes so infatuated with something that it consumes their life, it seems to me the best way to push through it is to assist them in swallowing too much. They'll eventually force themselves away from it, and in doing so their rejection of the infatuation will surely be more meaningful.

In the meantime we've got something like video games, which many older people simply don't accept. They find it threatening and difficult to understand, and so place a low value on it. As a gamer (with a fine arts background and an honours degree in English and Philosophy), I'd argue that gaming has been churning out exceptional pieces of art for many years now. As the technology continues to improve, the media presented on it will only become more immersive and meaningful. Where printing allowed for the widespread, sedentary activity of reading for the masses, and movies and television furthered the trend toward sedentary, cerebral entertainment, video gaming has reintroduced the entertainee as an active participant in the process. In doing so it promises to further enhance our ability to express and understand ourselves and the reality around us: the goal of any media.

From: http://atking.ca/timothy/index.htm