The Internet of Things: humans and digital devices
Our family has two MacBook Pros, an iPad, an iPod Touch, and two iPhones. How did we ever get to a point where digital devices have so much to do with our lives and the way we work?
When I was at Middleton High School in southwest Idaho in 1989, not a single student or teacher had a cellphone or laptop, and very few had basic computers - more like complex calculators no one could use. Our school had a computer room, and the drives for the computers were cassette players. No, not 8-track tapes, smart asses, but the kind of tapes that still exist here and there today.
The loudspeaker intercom the office used for announcements was high tech to us, and if you needed to call someone, you used the payphone down the hall, a modern luxury. If you had said the word "iPhone" at that time, people would have thought you were imitating Spielberg's E.T.
Now, about 78% of kids have access to at least a laptop and a smartphone. The world would shut down without these devices, and just imagine if the whole Internet were destroyed - could we survive? As the human species, we evolve, but our cultures evolve along with us. Technology represents a major feature of who we are and how we operate, and it is evolving. We can no longer separate ourselves from the technology we use.
Cultural and technological evolution
Remember, culture evolves, and since technology is a major part of society, it is the fastest evolving system ever. Nothing else even comes close, and the evolution is exponential. In other words, not only is it the fastest, but as time goes on, it evolves quicker, each new technology making it easier to develop more advanced devices.
H2M2M
The outlook is shocking even by the standards of science fiction. People in IT are referring to the rapid shift in communication technology as the "Internet of Things." Brian S. Hall reports:
A new survey of IT decision makers ... [concluded that] the rise of machine to machine (M2M) communications - more commonly referred to as the "Internet of Things" - is on the cusp of transforming our homes, our cities and how business is conducted.
We have moved beyond human-to-machine interaction (H2M) to M2M. The unfortunate phrasing of this quote leaves out humans altogether. I suggest we adopt new language here that puts humans in a functional role, even if just in name. How does human to machine to machine sound? H2M2M.
Spread the word on that, will you? Maybe it will catch on. It's downright spooky to leave humans completely out of the equation. If we are on the cusp of a transformation, please let us be at the center. Perhaps that is just human arrogance.
Electronics making writers obsolete?
For me, the answer is obvious.
Travel with me on a little thought experiment. We'll pack up just two electronic devices and head back to New York in 1843. We can take my MacBook Pro and iPhone.
New York is a humming bed of industry and commerce, but supposedly nearing a plateau of invention; Patent Office Commissioner Henry Ellsworth told Congress that the nation was approaching "that period when human improvement must end." So, New York is at a peak of technology.
We draw a group of startled ladies and gents around us and engage them. We begin a demonstration of the capabilities of the two devices. We show a movie, like Star Wars, play a little Minecraft, chat a bit on FaceTime, each of us on separate sides of the street, and let them examine the machines closely.
What would the reaction of the public be? We would be gods, my friend. We would possess the impossible and wield magical powers. Few would ever believe our story that humans could develop such things in the future. Their minds would not be able to grasp such a leap.
Back from the journey now, it becomes fairly easy to answer the title question, especially if you are familiar at all with AI advancements in recent years and months. History has proven that, given time, provisions, and imagination, there are few things humans cannot create.
Already, AI projects are coming close to mastering language, and on TechRadar, Jamie Carter writes interestingly of Narrative Science's program Quill. The abilities of Quill are thrilling, but she questions, pejoratively, whether AI will eventually take over journalism. An interesting question, but the approach to the quandary troubles me.
At the core of Carter's question, humans are pitted against technology, as if we were somehow entirely separate. History has always placed us much closer than adversaries. Take this passage from Carter:
Journalism isn't complicated. The popularity of online news stories can be tracked – and therefore the importance of news easily ranked – while almost everything is written using the inverted pyramid structure. Since automated writing software can already do most of that, are we looking at the last generation of human journalists?
She reduces the historic art of journalism to simplicity because software might approximate the function. However, the complexity of the task should not be judged by who or what completes it.
Think of the task and the technology both in terms of human evolution. We are talking about the most complex skill in history - the ability to use language to communicate. Journalism is complicated, but Carter contends here that if a computer can do it, the task must be easy.
This follows in line with thinkers who belittle the task if the actor is not human. Similar thoughts are expressed when remarkable abilities of animals are discovered. "Well, if a chimp can do it, the task was not very difficult in the first place."
Carter also implies that if computers can do the work, people will be out of a writing job, which is a dubious presumption.
Your brain is a genius on its own.
I was a lame student at Middleton High School in Idaho, class of 1989, showing little promise, but my lowest grade was only a C, in typing. I worked so hard at that damned keyboard, and look at me now. A typing master. That's actually the skill I'm most proud of, and it isn't because of the grade. Imagine that.
I conceived of myself as a B person - not just a B student, but a B person.
I applied the label to myself and my classmates. Jimmy was a definite A and Travis a definite F. I held the narrow belief that people were born with a certain level of intelligence and that little could be done to change it. So I didn't try.
Much to my surprise, in my first college course, History 101, I discovered that people received grades largely corresponding to the amount of study and effort they put into the course. I worked so hard on a paper about the legend of King Arthur and received a shiny A. I was stunned by my success.
Unshackling myself from my preconceived limitations, I performed well in college and shucked the B label. Brilliant people such as Albert Einstein recognized this lesson earlier in life.
I contend that most humans are brilliant learners by nature, with abilities enhanced by advances in technology. The unadulterated human brain is the most remarkable machine on earth.
Are you the smiley face or a blue fellow?
The end goal
As one of those few unhappy people who inhabit this planet, I wondered today whether any research has been done on whether happy people are faking. Happiness is entirely subjective and therefore difficult to quantify, but surely scientists have studied what most people say is their primary goal in life. In the U.S., we even have it in our Declaration of Independence as a divine gift, "the pursuit of happiness."
Among my Twitter crowd, happiness abounds in positive aphorisms, but most sound hollow and empty. "Life is short. Live it up," "Happiness is a state of mind," and the like make me nauseated. The happy clichés are abundant enough to convince me that a lot of people are concerned about contentment. But is this high prize so easily attainable?
Yes, I am asking the question: just how many of these happy people, with their shallow quotes, are fakes? I found some fascinating facts along my own pursuit of the truth behind happiness.
Trying to be happy makes you less happy
Simply glance at any social media, and you will see how prominent happy, positive sentiments and quotes are. These people could actually be expressing heartfelt happiness, revealing a desire for it, or faking it altogether.
Steven Pinker, cognitive and evolutionary scientist.
An encounter with Pinker
In my postgraduate studies, I became bored with the same old interpretations of literature. In three-hour seminars, I would sit, my shoulders slack, eyes blinking slowly with sleep. Literature seemed to be going nowhere new, no undiscovered country.
Can you believe that literary scholars in this century would still be using Freud to interpret great works? Sure, the man was a drug-addicted genius, but literary scholars act as if psychology equals Freud.
At about 9:00 PM one evening, I was semi-slumped in my linguistics seminar when the instructor mentioned, in passing, the connection between language, mind, and cognitive science. She said the name Pinker. I was startled to hear that there existed loads of relatively new fields, including cognitive linguistics, cognitive science, and neuroscience.
Of course, being a geek, I ran out and bought the book, and it mesmerized me. When I picked up Steven Pinker's How the Mind Works, my life and studies exploded with new possibilities. Ever since, I have loved the man, despite his flaws, along with cognitive science and evolutionary psychology.
Pinker has written a slew of important books on the mind, and he recently participated in the Reddit "Ask Me Anything," where readers can throw questions at a famous person and get them answered. He shared many insights into his work and the human mind, and I recommend a full read. I engage one of his responses here.
Does a scientific worldview make you less happy?
Pinker contends that a "naturalistic" conception of human nature and evolution is essential to becoming a knowledgeable adult, and that understanding evolution and the neuroscience of the brain is exhilarating. We are opening the human mind, and in doing so we better understand the human brain, culture, and ourselves.
In a brilliant description of an intelligent, informed life, he defines wisdom and happiness:
Wisdom consists in appreciating the preciousness and finiteness of our own existence, and therefore not squandering it; of being cognizant of what makes people everywhere tick, and therefore enhancing happiness and minimizing suffering; of being alert to limitations and flaws in our own judgments and decisions and passions, and thereby doing our best to circumvent them.
Wisdom involves an understanding of our fragile and fleeting existence, which causes us to value it more intensely. Great thinkers and gurus have struggled to define wisdom for millennia, but Pinker's definition is exquisite, as it is grounded in a concrete understanding of who we are as evolved human beings, with an abnormally large frontal lobe.
Prometheus tortured for giving humans fire.
Haters of technology date back to the origins of society.
The god Prometheus placed fire in the hands of mortals, and Zeus punished him by binding him to a boulder, impaling him, and allowing the birds and the elements to tear at him for eternity. Since he was immortal, not even the narcotic of death could save him.
Greek mythology represents perhaps the earliest attack on technology. The power of fire's primitive technology produced bad mortals, according to Zeus. Human beings characterized in this manner lack the intelligence and ethics to handle technology; fire would make life too easy for mortals and therefore was labeled evil. In this worldview, humans needed a hard life in order to learn, because of their moral and intellectual weakness.
The idea is absurdly conservative and reactionary, but so tempting to buy into that it just will not die. Here, I will refute the idea regurgitated by The Wall Street Journal that technology is bad, and point out specific benefits to individuals and to entire societies.
Bad technology may sound nonsensical to you, but the idea remains embedded in the collective unconscious, and anti-technology spokespeople are everywhere. Far from ignorant, detractors are often hyper-intelligent, but in the end misguided in adhering to an idea grounded in mythology from millennia past. Programs such as PBS's "Digital Nation" examine the benefits and drawbacks of living in a technological world with objectivity, but resistance remains.
The Wall Street Journal picked up the ancient idea recently, but it is far from the first in modern times. You can trace this most recent thread of the argument to Nicholas Carr, who published "Is Google Making Us Stupid?" in The Atlantic Monthly in July of 2008, answering the title question in the affirmative. His story raised the paranoid heads of people in the U.S., fearing that our minds were wasting away, the result of ominous technology.
Carr contended then that the internet in general was dumbing down the population, physically changing our brains so that we think with less complexity. The country was outraged that technology was stifling intelligent thinking. Carr revived the ancient issue and made it mainstream news. Imagine the scandal: Google alters neural pathways, restricting our ability to think! The article wasn't even about Google specifically, but about internet technologies.
In an unfortunate pseudo-scientific follow-up, a 2011 study in Science claimed that Carr's "Google Effect" was a reality. As if the internet were horrifically negative, the authors stated that the internet, with its "sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger." So "the Google Effect," in fact, reduced our intelligence.
The study was already outdated: the Pew Research Center had released a report in 2010 citing top scholars in cognitive science who stated that the small negative effects were trumped by positives such as massive, accessible, and free information.
Technology serves the greater good of society, despite minor drawbacks.
's "Is Smart Making Us Dumb?"
by Evgeny Morozov on February 23, 2013 (a title swiped from Carr, merely using synonyms) rehashes the tired idea with reference to "smart technologies:"
These objects are no longer just dumb, passive matter. With some help from crowdsourcing or artificial intelligence, they can be taught to distinguish between responsible and irresponsible behavior (between recycling and throwing stuff away, for example) and then punish or reward us accordingly—in real time.
The iconic Einstein at the board, thinking.
Einstein the father
Imagine a 25-year-old Albert Einstein wheeling his infant son in a pram through a park in Switzerland. It was 1904, a cute image of the scientific icon lost in his own thoughts, imagination, and universe, while embracing his role as a father.
Nestled beside the baby in the pram was a dirty old notebook, Einstein pausing intermittently to scrawl notes to himself. Of course, he would not wander through the park without thinking. His mind never shut off. The New York Times revealed this tender story today, and I am always delighted by little-known stories of my hero.
In that baby carriage with his infant son was Mr. Einstein’s universe-in-the-making, a vast, finite-infinite four-dimensional universe, in which the conventional universe – existing in absolute three-dimensional space and in absolute three-dimensional time of past, present and future – vanished into a mere subjective shadow.
The New York Times
In those moments of inspiration, he mentally entered that "finite-infinite four-dimensional universe," a man so shaped by the reality of his imagination that his feet rarely touched this earth. Foremost in Einstein's genius was his ability to imagine complex experiments and events, some of which still cannot be duplicated in real life but which led to his major breakthroughs in the theory of relativity.
Einstein the hero
Few things are more powerful in my own imagination than Einstein tripping through the universe at the speed of light, wild hair flying behind him, a transcendent look on his face, his visions and theories becoming his reality. I don't know when I first conceived this, but the image is powerful. Einstein had a nearly perfect mind that merged the creative with mathematics and science.
My favorite anecdote of Albert, however, is of a time when he doubted himself and all of quantum theory.
Elearning and MOOCs represent a fundamental disruption in the continuum of traditional higher education, initiating an unprecedented revolution in education that will induce mayhem before resolving itself into a unique and evolved system.
The new universe of higher education will empower more people across the globe and increase the general level of human intelligence, a democratic system of education that benefits human evolution globally.
Darin L. Hammond
D. L. Hammond calls for the education revolution.
In the midst of a turbulent Continental Congress in Philadelphia in 1776, Thomas Jefferson rallied the unconvinced delegates with revolutionary language in the Declaration of Independence. Jefferson took the role of author reluctantly, feeling that more capable authors could state the claims more effectively.
However, nothing matches his brilliance in separating the American people from England and oppression. He declares:
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. ...That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.
Thomas Jefferson, The Declaration of Independence
Higher Education Revolution
Higher education needs a revolutionary voice capable of declaring the causes which impel the people to alter or abolish the traditional regime. This voice will emerge soon, and we will recognize and be compelled by the call for revolt.
We do not yet know that voice, but just as the revolution had already begun when Jefferson put it in writing, the battle has already begun in education. New voices will inspire and unite, while at the same time dividing and initiating a bloodletting. David McCullough, author of 1776, describes the chaos created by the Declaration:
The year 1776, celebrated as the birth year of the nation and for the signing of the Declaration of Independence, was for those who carried the fight for independence forward a year of all-too-few victories, of sustained suffering, disease, hunger, desertion, cowardice, disillusionment, defeat, terrible discouragement, and fear, as they would never forget, but also of phenomenal courage and bedrock devotion to country, and that, too they would never forget.
David McCullough, 1776
McCullough reminds us that independence and revolution never come easy, and the people must transcend the chaos to be victorious: the battles, defeats, fears, and uncertainty. The education revolution requires the valor of individuals willing to pit themselves against an entrenched tradition of education.
The brief film I present here introduces a series I am creating regarding this revolution: "Higher Education Explosion Series, Part 1: An Introduction to Massive Open Online Courses (MOOCs)." I begin with MOOCs because they are at the front of the battle, disrupting education more than any other movement in history.
Please click "Read More" if the video is not immediately below.
Redefining writer's block
Ralph Ellison: Blocked and broken writer.
If you are a blocked writer, you may be comforted, or tormented, by Ralph Ellison's story. His block scorched an enchanted literary career, a conflagration fueled by his famous 1952 novel Invisible Man.
The book elevated Ellison to the heights of the literary elite and the Civil Rights Movement, and his compulsion to surpass his previous success with a new novel crushed him, his death in 1994 the final defeat. Ellison wrote only one novel despite more than forty years of effort. As The New York Times put it:
His predicament was worsened by the feeling that he had failed not only himself but the broader black society whose aesthetic he had hoped to champion in a great book that would rival “Moby-Dick.”
Ellison's story is the extreme of a serious psychological condition. Some writers and critics characterize the block as a euphemism for laziness or procrastination, but they more than likely have not experienced a real one. However, some scientific substance is required to validate a writer's crisis as a psychological condition.
The Oxford Dictionary falls short in defining the writer's plague as "The condition of being unable to think of what to write or how to proceed with writing." The term "condition" is imprecise, but notice the difference in the following definition:
A psychological inhibition preventing a writer from proceeding with a piece.
The second definition validates the condition by ascribing it to the mind, a psychological impediment. Also, notice that the first suggests that a writer cannot think of what to write, but Merriam-Webster accurately states that a mental obstacle prevents moving further, not an absence of material.
Good writers always have material. They live and breathe it. Ellison did not lack ideas, and we should reconceive writer's block as:
- A cognitive interference creating a disconnect between ideas in the executive, creative brain regions and the linguistic regions that physically communicate in writing.
Holmes at work being mindful.
A modern brain at work
My focus right now is the neuroscience and cognitive psychology of mindfulness, but I am doing other tasks as I think through the subject.
I multitask and think myself proficient. I'm sitting in my comfy, black leather office chair, watching "Everybody Loves Raymond" on Netflix via my iPad, hands typing at the MacBook Pro, ears listening to a TED Talk playing into earphones from my iPhone. The kids are readying for school in the hall outside my office, and Mom is yelling ... I mean raising her voice.
Moments when I sit in quiet thought have become quite rare, and with four cute children, a lovely wife, teaching, reading, and writing, I see less mindfulness in my future. Mindfulness refers to extended periods of calm reflection on a single item, the present moment, and multitasking is the exact opposite.
Myth of multitasking
Recent psychology and neuroscience reinforce the human need for mindfulness throughout the day, and many scholars suggest that working on one task easily trumps multitasking. We accomplish far less when managing multiple jobs, but hold tight to the belief that we are doing more.