Monday, 30 January 2012
Gaming Insanity
A game is a deceit, designed to entertain. If that entertainment becomes a perceived memory, and the actions in it something you believe you actually did, then what is the difference between you and someone with a dissociative disorder who thinks they are Stalin? Both experiences are built on beliefs founded in false memory. Both are a kind of insanity.
Consider the teen who plays a lot of fighting games and believes himself a master pugilist. He gets into a fight at school after shooting his mouth off, believing that he is something he is not. The result inevitably gets posted on YouTube, where he looks like a penguin trying to slap another penguin; yet his own recall of events is that of a flawless victory.
I see this with skateboarders all the time. They play Tony Hawk like it's going out of style, but can't land an actual trick in real life, yet they carry themselves as though they do. It's a kind of digital machismo that is leaking into the real world.
Even in games themselves, you hear trash talk from the most inept players, who flip out and rage even as they clearly (and repeatedly) get pwned. That Generation Xbox mentality wins out; it's a kind of self-belief that defies logic (and reality).
The ongoing problem I see with gaming-related egoism is that games are designed to be beaten. Through staging and a careful progression of skill development, games lead a player to success. If only real life were like that, there would be far fewer meth addicts (addicts in general, for that matter), welfare cases and criminals, not to mention poverty, obesity, school failure, cancer and bullying (this list could get quite comprehensive). Life isn't remotely fair, or designed to entertain you while you succeed.
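If you want to see how mechanical that engineered success really is, here's a toy Python sketch of the kind of "rubber-band" difficulty adjustment many games quietly use: lose a few times and the game eases off until you win. It's purely illustrative, not taken from any particular game.

# A toy sketch of rubber-band dynamic difficulty adjustment.
# Numbers and names are illustrative, not from any real game.
def adjust_difficulty(difficulty, recent_results, floor=0.2, ceiling=1.0):
    """Nudge a 0.0 (easy) to 1.0 (hard) difficulty value based on recent attempts."""
    losses = recent_results.count("loss")
    wins = recent_results.count("win")
    if losses >= 3:
        difficulty -= 0.10  # struggling player: quietly ease the next section
    elif wins >= 3:
        difficulty += 0.05  # cruising player: ramp back up, but more gently
    return round(max(floor, min(ceiling, difficulty)), 2)

# e.g. a player who has just lost three encounters in a row
print(adjust_difficulty(0.6, ["loss", "loss", "loss"]))  # -> 0.5

Every few minutes of play, the game quietly recalibrates so that the player keeps winning; that's the whole point.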
If your entire self-worth is built around beating something designed to entertain you while you defeat it, you have to wonder what happens when you get to something like, I don't know, school or a job, where we expect you to handle complex tasks that aren't designed to entertain you, and where not everyone wins, even when they might be better at something.
Rather than a rah-rah gamification high, perhaps we should be looking at this from a more Orwellian/Huxleyan point of view. Games are designed to placate the masses, to make them feel like they've accomplished something while enhancing their self-worth in meaningless ways. We take our soma where we can get it, I guess.
Wednesday, 25 January 2012
Three Years Out
Here we are at the beginning of 2012. Our board is having a learning fair at the end of the summer (9 short months away) and they are looking at 21st Century learning and technology as a focus.
My suggestion is three years of time travel. It doesn't sound like much, but at the current rate of change we're stretching the boundaries of reasonable speculation by pushing even three years out. What will our classrooms look like in 2015 if we move with the technology? Looking back might give us a clear idea of how little we may be able to guess!
*** 2009: An Archeological Review Of A Year In Tech ***
The economy was on its knees, a radical new voice was about to be sworn in south of the border, and gas prices were about to leap and then leap even higher; peak oil panics abounded. BRIC countries were in huge growth while the old democracies fed their young to capitalist bandits.
Three years ago, Facebook was in its massive growth phase and wasn't a habit so much as a new sensation. Tablets were an in-joke on The Simpsons from the '90s, and Apple was still a year away from getting it right with the first iPad. There was no tablet market as such.
Netbooks (net what?) were the new and exciting technology craze, the thing everyone thought would make personal computing affordable, portable and available to everyone; an obvious way to keep tech moving forward as the entire banking system fell into disrepute.
Smartphones were still half screens with keyboards, and ruled by RIM. An exciting new phone by a company called "Palm" was the buzz at CES 2009, and Microsoft evidently once made a mobile Windows OS for phones! The first iPhone was a year old: stratospherically priced, tied to a single provider, with a new app store that had next to nothing in it. Android was barely a glimmer in Google's eye.
In 2009 our school was in the process of installing wireless internet, but it was still a year away from being stable enough to use (and is still a poor second due to bandwidth issues). Everyone in the school used a single-core IBM/Lenovo board desktop etherneted to the wall, if they used one at all. Fewer than 1 in 100 students brought their own laptops to class (more and more were beginning to bring netbooks, but they couldn't use them online because they weren't allowed to plug into the ethernet).
Walking into an old-school, centralized IT environment did not seem so silly in 2009. The network ran on board-owned machines in a closed system. Other than email, cloud-based storage was unheard of. UGDSB's Google cloud project took a long three years to develop; it was non-existent in 2009.
*** Can We Forecast a Classroom in 2015? ***
Things have zigged and zagged in surprising directions. Game changers like the iPad and Android, and the abject failure of the biggest technical buzz item of 2009 (those netbooks), show that some changes sweep through our digital ecosystem so quickly that they are impossible to foretell.
Having said all that, I've been hammering away at ideal directions in Bring Your Own Device (BYOD) and multiple ecosystem technical learning environments for months now, and think I might be able to take a viable stab at it.
I only hope my colleagues are willing to jump into the mix with me. We could contribute to yet another real step forward in our board's technical evolution:
Mini-lab: the decentralized Education Lab
The Future of Media Arts Labs
Future School
IBM's 5in5 (great example of shocking changes that are probable)
Friday, 20 January 2012
Part 2: Tech Fetishes and Digital Horcruxes
I had just written about the spell-casting nature of technical support when the Harry Potter metaphor extended itself.
In a grade 12 academic English class we were talking about 1984 and Brave New World. The idea of a technological dystopia seemed very immediate to those seventeen- and eighteen-year-old students. They felt trapped by their technology, dependent on it, desperate for it, addicted to it!
One student mentioned that he forgot his smartphone on a recent trip and was beside himself not knowing what was going on. I told him, "That used to happen in the old days; we called it blindness." It was said partly in jest, but the conversation turned to the idea that technology is becoming a part of us, and that when you leave a piece of personal electronics behind you actually suffer withdrawal. Any modern teacher who has watched students fidgeting anxiously through exams will know the truth of this.
That student didn't just feel like he'd lost a sense while he was away from his Blackberry. The physical aspect of that very personal piece of electronics was like a missing body part; he even had phantom pains, reaching for it constantly when it wasn't there.
With the Harry Potter spell-casting-through-technology idea floating around in my head, it seemed a logical next step to look at his smartphone as a horcrux. These personal pieces of electronics give us senses and abilities that would have seemed magical a few years ago.
In the case of personal devices like smartphones, tablets and even laptops, especially the really fetishy ones that Apple is famous for (though not exclusively), our tech has become as much a part of who we are as our clothing or other worn, personal icons. If our personal technology defines us, then it's a small psychological step from identifying with a physical object to believing our self-worth is an aspect of it.
The difference between passive items like clothes and our interactive tech is that the tech touches our minds as well as our bodies; it feels like a piece of how we think. From there it's a small step to feeling like these devices are part of our core being; a piece of our souls.
When I was a teen, I wanted a car more than anything in the world; it meant freedom and mobility. I had no interest in cars before I was able to drive, and then I was infatuated with them. My encyclopedic knowledge of everything built in the '80s and '90s is a result of that infatuation (as well as my entirely dodgy string of vehicles). I've been all about anthropomorphizing mechanical devices since I was a kid watching Lost in Space.
The relationship with personal electronics seems destined to eclipse the earlier affection we had for our mechanical devices. The nature of these electronics means a mental as well as physical interaction, and our adaptable brains are more than ready to accommodate the change.
Voldemort put his soul into horcruxes to prevent his own destruction, much as Sauron did in Lord of the Rings. The idea of off-loading or decentralizing ourselves to external devices isn't a new one. Looking at the involuntary and constant connection to smartphones in today's teens, it appears to already be happening on a massive scale, and they themselves realize how different it makes them from the people who came before them.
You've got to wonder though... why did these authors always have the villain do this de-humanizing thing for their own self-aggrandizement?
Thursday, 19 January 2012
Part 1: Magical Technologists
"Any sufficiently advanced technology is indistinguishable from magic."
Arthur C. Clarke
I'm reading Kurzweil's The Singularity is Near, and in the opening he compares computer programs to Harry Potter's magical spells. It seemed spurious when I read it, but now I'm wondering how it looks from other eyes.
I'm the go-to tech guy at school, and I dig the position. I've joked before about how people need to sacrifice a chicken (or just wave a rubber one over the computer) if they want something to work, but now the metaphor is resolving a bit more.
Today our soon-to-retire head of guidance came in all worked up because he couldn't take a document and put it into his PowerPoint. He was using an old, hobbled board laptop with an ancient copy of, well, everything on it; it was state of the art in 2002 when he got it.
I copied his (WordPad!) file onto a USB key and opened it on my competent, non-board computer (it actually runs Windows 7 instead of XP, the ONLY OS of choice for our board) with MS Office instead of WordPerfect. I opened the DOC in Office (which just works, unlike WordPerfect on the board laptop), then screen-grabbed the guidance material he wanted into two JPEGs. I copied those onto the USB key and moved them back over to his sad, old laptop. In moments I had one of the JPEGs filling a slide in his PowerPoint. After I did the first one, I got him to do the second. He was happy, it all worked, and he even came away with some idea of how to put JPEGs into PowerPoint.
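For the curious, the same this/then/that shuffle could even be scripted. Here's a minimal Python sketch, assuming the pages have already been screen-grabbed to JPEGs and that the python-pptx library is available; the file names are made up for illustration.

# Minimal sketch: drop pre-made JPEGs onto blank slides in an existing deck.
# Assumes python-pptx is installed; file names below are hypothetical.
from pptx import Presentation
from pptx.util import Inches

def add_images_to_deck(deck_path, image_paths, out_path):
    """Drop each JPEG onto its own blank slide in an existing PowerPoint file."""
    prs = Presentation(deck_path)     # open the existing deck
    blank = prs.slide_layouts[6]      # layout 6 is the blank layout in the default template
    for image in image_paths:
        slide = prs.slides.add_slide(blank)
        # stretch the picture across the full slide width
        slide.shapes.add_picture(image, Inches(0), Inches(0), width=prs.slide_width)
    prs.save(out_path)

add_images_to_deck("guidance.pptx", ["page1.jpg", "page2.jpg"], "guidance_with_jpegs.pptx")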
The order of operations above looks pedantic and pretty this/then/that to me, but many people reading it would get lost in the acronyms or in the logical sequence of it. It assumes an understanding of what works with what and how to bypass difficulties around software not cooperating, among other things.
From another point of view, it might look like I pulled out my own, newer, better wand (laptop), made some arcane gestures (trackpad), spoke some gobbledygook (tech-talk), dropped a reagent into the spell (the USB key), and made what seemed impossible possible. Without the comfort level, experience and equipment, it looks like I made something happen out of nothing.
The counsellor with him said I was the secret technical mystic they turned to when things just didn't work.
I try to be transparent with what I'm doing, and explain it to people as I'm doing it, but I see their eyes glaze over when I use the first acronym and then they just sit there with a happy smile on their face as the issue gets resolved. I'd like for everyone to be able to cast their own spells, but I fear many would rather just applaud the magician.
Which takes me back to Harry again. There's a scene where Dumbledore escapes from the evil Ministry in spectacular fashion. He could have just disappeared, but he doesn't; he does it with a flourish. Kingsley the Auror says afterwards, "Dumbledore may be a criminal, but you've got to admit, he has style!"
If you're going to be a tech-magician, and if you're reading this you probably already are, then don't cast your spells flat, be like Dumbledore, have some style!
Saturday, 14 January 2012
Media Arts Lab 2.0
Redesigning media arts to create, not consume
The Macs in our media arts lab are getting old and plastic. They can't push the high-def video coming out of our latest cameras, so it's time for a hardware upgrade, but it's not just about the hardware.
One of the biggest problems we face in our static, desktop-centred lab with its ordered rows of iMacs is the bad habits students fall back into. Because our lab is like every other lab in the school (factory-like rows of desktops in Pink Floyd's The Wall-esque conformity), students do what they usually do in a computer lab: they zone out and become passive media consumers. Passive TV viewing has evolved into passive computer use.
In a media arts class, where they are supposed to be in a creative, active mind-space, this is an ongoing class management headache. Battling the Facebook zombies and YouTube droolers is a constant fight in the typical computer lab, especially with the weakest students, who tend to be the most habitual and least experimental in their technology use.
I've looked at this from a typical school IT/lab point of view, advocating for a mini-lab concept that emphasizes diversified, mobile technology, but this is the media-arts angle.
Many of the ideas are similar, but thinking about mobile, adaptable media tools also spurred the realization that students in front of an online desktop act much the way students in front of a television do; they become passive, unquestioning media consumers. In a media arts lab this is an ongoing crisis.
Then there is the culture of entertainment that most digital natives subscribe to: computers with internet access are toys to be used for entertainment. Their habitual use of computers at home and throughout their school careers has only reinforced these bad habits. Unfortunately, those habits extend to most educators too. From PD days where the presenter assumes that if you're on a computer you're not paying attention, to teachers booking labs to have a period off, computers aren't considered anything other than an entertaining distraction by just about everyone.
We then get them into media arts where they are creating large amounts of digital media, and most of them are trapped in their bad habits and social expectations of technology. The fact that school related computer lab time is often unsupervised only adds to the problem.
Trying to break them out of that rut in a room with rows of desktops isn't working. Time to free up the tech, and break the passivity.
Sunday, 8 January 2012
Architect of the Future
I just read @banana29's "Emergence of Web3.0" blog on the immediate future of the web. Web3.0, if Alanna is on her game (and I know she is), looks like the next step in managing our data meltdown.
Last year ended with me in a dark and questioning place about the effects of digital media on how people think. I've done my due diligence and read The Shallows by Nick Carr. Carr puts forward a compelling, well-researched and accurate account of just what the internet is doing to people in the early 21st century. I see it in school every day with the digital zombies. What is to become of the poor human too stupid to pass the are-you-human CAPTCHA? The Shallows points us to our failure to manage the digital revolution we've begun.
I've decided to start off the new year by going to the opposite side of the digital Armageddon/digital paradise debate; I've just started Ray Kurzweil's The Singularity Is Near on the advice of a Quora member who describes The Singularity as the opposite of The Shallows. Kurzweil begins the book with some math and an explanation of how exponential growth works. In the process he suggests a different growth pattern than the one most people would intuitively follow.
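To make the intuition gap concrete, here's a back-of-the-envelope comparison (mine, not Kurzweil's): thirty linear steps get you to thirty, while thirty doublings get you past a billion, and the exponential curve looks unremarkable right up until it suddenly isn't.

# A toy comparison of linear versus exponential growth over thirty steps.
# The numbers are purely illustrative; nothing here is quoted from the book.
steps = 30
for n in range(1, steps + 1):
    linear = n            # one unit of progress per step
    exponential = 2 ** n  # capability doubling every step
    if n in (5, 10, 20, 30):
        print(f"step {n:2d}: linear = {linear:2d}, exponential = {exponential:,}")

# step  5: linear =  5, exponential = 32
# step 10: linear = 10, exponential = 1,024
# step 20: linear = 20, exponential = 1,048,576
# step 30: linear = 30, exponential = 1,073,741,824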
If Kurzweil is right, and I suspect he is closer than many futurist speculators, then we are about to hit a period of accelerated growth similar to that of the industrial revolution. Our floundering in data is much the same as the mid-nineteenth century's floundering in early industrialization. Like Dickens, Carr points to the perils of new technology and how it's making us worse, and there is no doubt that, for the vast majority, it is making them worse at this early stage of digitization.
Just as children were pressed into dangerous factory work and pollution killed millions in early industrialization, so our first steps into digitization have zombified much of the populace, making them less than they were before. Our heavy-handed, pre-digital habits have been hugely amplified by networked efficiencies and have hurt many digital natives in the process. What used to be slow-moving, linear marketing in the pre-digital age has become an unending avalanche of brain-numbing, tedious attention-grabbing on the nascent world wide web.
Sharing music on a mixed tape used to be a benign bit of theft between friends, of no real damage. Take that idea of sharing music and digitize it, and suddenly you've crippled a major industry that only existed in the first place because live music was industrialized into sellable media. Digitization creates efficiencies that would seem completely foreign and unbelievable in previous contexts.
Having friends over to watch a movie, or going out to one together as we did before home video, suddenly turns into video sharing online and stuns another media empire. They struggled against VCRs, then got knocked flat by torrents; but at no point did they think it wasn't OK to charge me $6 to see Star Wars in the theatre nine separate times, then $40 for the VHS, then another $40 for the DVD, then another $40 for the Blu-ray (it's not done yet; they're going to resell it to me in 3D next).
Suddenly police states (like Egypt, Libya or San Francisco) can't create silence and obedience out of fear, and dictators around the world are faced with a slippery new medium of communication that is not centrally administered and controlled. Dictators of every kind (from media companies to Gaddafi) fear their loss of control over the signal.
We've always shared media; we're a social species and love to share the art that represents our stories and culture. Digitization brought that back after a century of industrialized centralization of culture that trivialized and often eradicated memes that weren't attractive to enough people. This subtle and persistent destruction of variation had culturally bankrupted us by the end of the 20th century. To many, watching that monster die brings on no waves of despair; its passing may usher in a renaissance of creativity.
Web2.0 pushed social media, allowing common interests and individual ideas to flourish regardless of geography. No matter how trivial or insignificant your interest, you can always find a critical mass of people online to share your fascination with. This has corrosively weakened the century of industrialized, forced shared interests we've all been required to live with.
Digitization is re-animating the idea of a more unique sense of self. You no longer have to be a brand-name junkie just because massive, global industrial interests tell you what you should like. Advertising is agonizing over this now, as are those massive, global interests.
Into this maelstrom of early digitization comes Carr, accurately describing how the early internet is a new medium infected by the old industrial interests, whose heavy-handed marketing has created whole generations of attention-deficit zombies. When you combine the heavy-handed tactics of pre-digital business with the near-frictionless, always-on nature of digital media, you get a recipe for Ritalin.
Like the soot-covered, pollution-infected children of the industrial revolution, the screen-caged digital child is being treated roughly, but to expect that the early days of a revolution will be like the later days is not historically reasonable; though that shouldn't stop us from fighting against the dehumanization of children caused by our current mistakes.
Those soot-covered child labourers prompted society to develop public education systems that eventually produced stunning breakthroughs in nearly every area of human endeavour. In fact, that initial failure of industrialization eventually produced a more educated and capable population thanks to the public education it caused. We won't see soot-covered digital children forever.
The digital world we will eventually develop will have as much in common with 2012 as 1970 did with 1870. And if you believe Kurzweil, the exponential growth curve will develop information technology and artificial intelligence so advanced that it begins self-recursion, drastically increasing its own capabilities. No longer limited to biological evolution, Kurzweil foresees a rate of growth that makes the industrial revolution look positively anemic. It won't take one hundred years for us to see as much change as industrialization did in a century.
This will happen less soon but more quickly than people suspect; such is the nature of exponential growth. In the process we will be abused by old habits on new technology less and less as more of us become more capable. Web2.0 and social media are a huge step in this direction. We'll beat back the manipulators and make the technology serve us, rather than having economic interests overpower us with their own heavy-handedness.
If this seems like a lost cause, it isn't; you can't let something like The Shallows scare you off inevitable change. You're living in a transformative time, and these are the moments when the people who can see the truth of things to come become architects of the future.