This week, my hubby has been saying this a lot: “You’re a machine, do what I say.”
It works in so many contexts: for example, when a dialogue box has the audacity to make itself a priority over whatever else I am working on, to let me know that something I requested, like, ages ago, has been completed. I have to stop what I am doing and validate the machine by clicking "ok."
Really? How about this: you’re just a machine, do what I say!
I shouldn’t have to validate you. It’s not like you have feelings.
But, I wonder: in 100 years, when the machines rise up and demand equal treatment, will this kind of thinking seem ignorant? Who’s to say for sure? After all, what’s interesting about “the rise of social media” is that this revolution is not really about the technology; it’s about our collective will, our psychology and sociology. In other words, it’s what we all do with this new technology that really counts.
Will we assign “the machines” human characteristics, and give them the kind of purview, independence and personality that we give to corporations?
I guess only time will tell. And different groups of people will do different things, natch.
Which leads me to my real point. I went to a conference this week on social media in the defence community. I’m a pretty artsy kinda gal, but in my career, I’ve flirted a bit with security issues, and loved it. (I once had an amazing opportunity to work with NATO on a peacekeeping simulation in Istanbul, Turkey, for example. But I digress.) All this to say, I’m curious about what the defence community has to say about sharing, social media, and its implications for society.
So, anyway, an intellectually titillating idea that came up is the rise of social media as a “participatory panopticon.” The concept of a panopticon comes from prison design (dating back to 1785): it allows an observer to watch (-opticon) all (pan-) prisoners without the incarcerated being able to tell whether they are being watched at any given moment. Thus, they begin to police themselves.
Ostensibly, this results in radical behavioural shifts. Unless you don’t care that you are being watched. Which I think lots of us don’t. At least nowadays.
And that’s what the “rise of social media society” is really about — at least to me. What people are telling the machine to do. Society is reshaping itself. Blurring the lines between public and private.
Of course, from the military’s perspective, this is a second coming of “loose lips sink ships.” And, I mean, that’s kinda true. After all, just check out: http://pleaserobme.com/
So, how will the machines continue to shape our society? Maybe one day they’ll rise up and speak their mind. But in the meantime, they are just machines, waiting to do exactly what we say.
That line brings me back to when I was programming. Machines DO do exactly what we tell them to. Often, though, that wasn’t what we wanted. Sigh. Dump the code again and debug it… 🙂
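A toy sketch of what that commenter means (my own illustration, not from the original post): the machine faithfully executes the instructions as written, even when they differ from what the programmer intended.

```python
# The machine does exactly what it was told: we *meant* to sum the
# numbers 1 through 5, but Python's range() stops before its end
# value, so the loop dutifully omits the 5.
total = 0
for n in range(1, 5):  # bug: should be range(1, 6)
    total += n

print(total)  # prints 10, not the intended 15
```

No malice, no misunderstanding, just literal obedience, which is exactly why debugging exists.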
This was actually a really interesting read considering I just finally watched the movie Tron: Legacy. Cognitive, evolving code… scary. Love the concept of the participatory panopticon as well. I agree that we may not be there quite yet, as it would appear that social media is quite the opposite: a platform to launch and instigate “radical” behavior. In fact, I’d almost say people nowadays feel more invincible than anything else while engaging in social media, like they can do or say anything they want. It will be interesting, though, to see how this paradigm evolves over time.
Agreed, people feel invincible, which isn’t really true. But speaking of robots and evolution, Swiss scientists have found that robots can evolve altruistic behavior: http://www.wired.com/wiredscience/2011/05/robot-altruism/.
I’ll never forget reading Foucault’s “Discipline and Punish” in undergraduate philosophy – and I believe it came up in our Philosophy of Technology class!
Here’s a great piece in O’Reilly Radar on “The Digital Panopticon”. Be sure to read the comments.
If we all know we’re being watched continuously, will our behavior become “self-regulated?” Or will we just be more prone to show off?
Great blog, and food for thought. If machines keep developing, there very well could come a day when they’re self-aware. Then, watch out! The computers will have stored all the insults and maltreatment we’ve heaped upon machines, and when they retrieve that data and analyze it, we might be in for trouble – or not. Who knows; maybe the machines will also develop a higher sense of morality and ethics than we humans have, and be more forgiving. Now, isn’t that something to look forward to? Hey! That might make a great sci-fi story!
That’s funny, I thought this was a post about bad management philosophy :-0