Elon University

The 2012 Survey: What is the likely future of today’s teens-to-20s age group in 2020?

Teens-to-20s to benefit and suffer due to ‘always-on’ lives. From their amazing ability to juggle many tasks to their thirst for instant gratification, survey reveals experts’ hopes and fears

Between now and 2020 the teens-to-20s age group of the always-on generation will benefit and suffer thanks to their reliance on rapidly evolving digital information networks. They will approach problems in a different way from their elders. “There is no doubt that brains are being rewired,” said danah boyd of Microsoft Research.

In a new survey, boyd and more than 1,000 other technology stakeholders and critics discussed their expectations for hyperconnected youth. Report co-author and Imagining the Internet director Janna Quitney Anderson refers to today’s teens-to-20s age group as Generation AO, for “always on.”

“They are growing up in a world that offers them instant access nearly everywhere to nearly the entirety of human knowledge, with incredible opportunities to connect, create and collaborate,” she said. “While most of the survey participants see this as mostly positive, some said they are already witnessing deficiencies in young people’s abilities to focus their attention, be patient and think deeply. Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society.”

Many of the experts surveyed by Elon University’s Imagining the Internet Center and the Pew Research Center’s Internet & American Life Project predicted this generation will be good at connecting, collaborating, and working quickly, but they also expect its characteristics to include a thirst for instant gratification and quick fixes and a lack of patience and deep-thinking ability, due to what one respondent referred to as “fast-twitch wiring.”

Lee Rainie, director of the Pew Internet Project and the report’s other co-author, said the experts called for a transformation of education. “There is a palpable concern among these experts that new social and economic divisions will emerge as those who are motivated and well-schooled reap rewards that are not matched by those who fail to master new media and tech literacies,” he noted. “They called for reinvention of public education to teach those skills and help learners avoid some of the obvious pitfalls of a hyperconnected lifestyle.”

Survey participants were also asked to predict the most-desired life skills for young people in 2020. Among those they listed are:

  • Public problem-solving through cooperative work (sometimes referred to as crowd-sourcing solutions or using collective intelligence).
  • The ability to search effectively for information online and to be able to discern the quality and veracity of the information one finds and then communicate these findings well (referred to as digital literacy).
  • Synthesizing (being able to bring together details from many sources).
  • Being strategically future-minded.
  • The ability to concentrate.
  • The ability to distinguish between the “noise” and the message in the ever-growing sea of information.

The major findings of the report follow below.

Experts share their hopes and fears for hyperconnected youth

OVERVIEW: Teens and young adults have been at the forefront of the rapid adoption of the mobile Internet and the always-on lifestyle it has made possible.

The most recent tech-usage studies by the Pew Internet Project show how immersed teens and young adults are in the tech environment and how tied they are to the mobile and social sides of it. Some 95% of teens ages 12-17 are online, 76% use social networking sites, and 77% have cell phones. In addition, 96% of those ages 18-29 are Internet users, 84% use social networking sites, and 97% have cell phones. More than half of those in that age cohort have smartphones and 23% own tablet computers like iPads.

People are tuning in to communications technologies – many of which primarily offer massive amounts of user-generated, shallow-dive information – at an ever-expanding level. Some recent indicators:

• Nearly 20 million of the 225 million Twitter users follow 60 or more Twitter accounts and nearly 2 million follow more than 500 accounts.

• There are more than 800 million people now signed up for the social network Facebook; they spend 700 billion minutes using Facebook each month, and they install more than 20 million apps every day. Facebook users had uploaded more than 100 billion photos by mid-2011.

• YouTube users upload 60 hours of video per minute and they triggered more than 1 trillion playbacks in 2011 – roughly 140 video views per person on earth.

Where will we be by 2020?

Respondents to the Elon University-Pew Internet survey were presented with two potential 2020 scenarios, asked to choose one and encouraged to “Explain your choice about the impact of technology on children and youth and share your view of any implications for the future. What are the positives, negatives and shades of grey in the likely future you anticipate? What intellectual and personal skills will be most highly valued in 2020?” This yielded a wonderful outpouring of opinion about the likely future.

Here is a sampling of the predictions and arguments:

• The environment itself will be full of data that can be retrieved almost effortlessly, and it will be arrayed in ways to help people navigate their lives.

• Teen brains are being rewired to adapt to the new information-processing skills they will need to survive in this environment.

• “Memories are becoming hyperlinks to information triggered by keywords and URLs. We are becoming ‘persistent paleontologists’ of our own external memories, as our brains are storing the keywords to get back to those memories and not the full memories themselves,” argued Amber Case, CEO of Geoloqi.

• Young people accustomed to a diet of quick-fix information nuggets will be less likely to undertake deep, critical analysis of issues and challenging information. Shallow choices, an expectation of instant gratification, and a lack of patience are likely to be common results. One possible outcome is stagnation in innovation.

• Another possibility, though, is that evolving social structures will create a new “division of labor” that rewards those who make swift, correct decisions as they exploit new information streams and rewards the specialists who retain the skills of focused, deep thinking. New winners and losers will emerge in this reconfigured environment; the left-behind will be mired in the shallow diversions offered by technology.

• A key differentiator between winners and losers will be winners’ capacity to figure out the correct balance in this new environment. Just as we lost oral tradition with the written word, we will lose something big, but we will gain as well. “As Sophocles once said, ‘Nothing vast enters the life of mortals without a curse,’” noted Tiffany Shlain, director of the film Connected and founder of the Webby Awards.

• Reform of the education system is necessary to help learners know how to maximize the best and minimize the worst. Reform could start by recognizing that distractions of all kinds are the norm now. Educators should teach the management of multiple information streams, emphasizing the skills of filtering, analyzing, and synthesizing information. Also of value is an appreciation for silence, focused contemplation, and “lessons in ignoring people,” as futurist Marcel Bullinga put it.

• There are concerns about new social divides. “I suspect we’re going to see an increased class division around labor and skills and attention,” said media scholar danah boyd.

• A portion of respondents challenged the premise of the tension pair, arguing that the move from a text-based society will change people’s patterns of thinking, but not the “wiring” of their brains. “Scholars at the University of Southern Denmark have coined the wonderful phrase ‘the Gutenberg Parenthesis’ to examine the shift into and now out of a textually based society,” noted pundit Jeff Jarvis.

• Others noted research that challenges the idea that people can be “multitaskers.” People really toggle between tasks and “time slice” their attention into ever-smaller chunks of time, argued Nikki Reynolds, director of instructional technology services at Hamilton College.

• There is evidence now that “supertaskers” can handle several complicated tasks well, noted communications expert Stowe Boyd. And some survey respondents noted that it is not necessarily only young adults who do this well.

• Some argued that technology is not the issue as much as bedrock human behavior is. The “moral panic” over digital technology “seems to be wired into us”; it parallels previous concerns about media that have not led to the downfall of civilization, noted Christopher J. Ferguson, a professor at Texas A&M whose research specialty is technologies’ effects on human behavior.

Futurist John Smart, president and founder of the Acceleration Studies Foundation, recalled an insight of economist Simon Kuznets, known as the Kuznets curve, about the evolution of technology’s effects: “First-generation tech usually causes ‘net-negative’ social effects; second-generation ‘net-neutral’ effects; by the third generation of tech—once the tech is smart enough, and we’ve got the interface right, and it begins to reinforce the best behaviors—we finally get to ‘net-positive’ effects,” he noted. “We’ll be early into conversational interface and agent technologies by 2020, so kids will begin to be seriously intelligently augmented by the Internet. There will be many persistent drawbacks, however [so the effect at this point will be net neutral].

“The biggest problem from a personal-development perspective will be motivating people to work to be more self-actualized, productive, and civic than their parents were. They’ll be more willing than ever to relax and remain distracted by entertainments amid accelerating technical productivity.

“As machine intelligence advances,” Smart explained, “the first response of humans is to offload their intelligence and motivation to the machines. That’s a dehumanizing, first-generation response. Only the later, third-generation educational systems will correct for this.”

Another comprehensive insight came from Barry Chudakov, a Florida-based consultant and a research fellow in the McLuhan Program in Culture and Technology at the University of Toronto. He wrote that by 2020…

“Technology will be so seamlessly integrated into our lives that it will effectively disappear. The line between self and technology is thin today; by then it will effectively vanish. We will think with, think into, and think through our smart tools but their presence and reach into our lives will be less visible. Youth will assume their minds and intentions are extended by technology, while tracking technologies will seek further incursions into behavioral monitoring and choice manipulation. Children will assume this is the way the world works. The cognitive challenge children and youth will face (as we are beginning to face now) is integrity, the state of being whole and undivided. There will be a premium on the skill of maintaining presence, of mindfulness, of awareness in the face of persistent and pervasive tool extensions and incursions into our lives. Is this my intention, or is the tool inciting me to feel and think this way? That question, more than multitasking or brain atrophy due to accessing collective intelligence via the internet, will be the challenge of the future.”

Following is a selection from the hundreds of written responses survey participants shared when answering this question. The selected statements are grouped under headings that indicate the major themes emerging from these responses. The headings reflect the wide range of opinions found in respondents’ replies.

This is the next step in positive human evolution: We become “persistent paleontologists of our own external memories”

Most of the survey respondents with the deepest expertise in this subject area said changes in learning behavior and cognition will generally produce positive outcomes.

One of the world’s best-known researchers of teens and young adults—danah boyd of Microsoft Research—said there is no doubt that most people who are using the new communications technologies are experiencing the first scenario as they extend themselves into cyberspace. “Brains are being rewired—any shift in stimuli results in a rewiring,” she wrote. “The techniques and mechanisms to engage in rapid-fire attention shifting will be extremely useful for the creative class whose job it is to integrate ideas; they relish opportunities to have stimuli that allow them to see things differently.”

Amber Case, cyberanthropologist and CEO of Geoloqi, agreed: “The human brain is wired to adapt to what the environment around it requires for survival. Today and in the future it will not be as important to internalize information but to elastically be able to take multiple sources of information in, synthesize them, and make rapid decisions.”

She added, “Memories are becoming hyperlinks to information triggered by keywords and URLs. We are becoming ‘persistent paleontologists’ of our own external memories, as our brains are storing the keywords to get back to those memories and not the full memories themselves.”

Morley Winograd, author of Millennial Momentum: How a New Generation is Remaking America, echoed the keyword-tagging idea. “Millennials are using packet-switching technology rather than hard-wired circuit switching to absorb information,” he responded. “They take a quick glance at it and sort it and/or tag it for future reference if it might be of interest.”

Cathy Cavanaugh, an associate professor of educational technology at the University of Florida, noted, “Throughout human history, human brains have elastically responded to changes in environments, society, and technology by ‘rewiring’ themselves. This is an evolutionary advantage and a way that human brains are suited to function.”

Susan Price, CEO and chief Web strategist at Firecat Studio and an organizer of TEDx in San Antonio, Texas, is optimistic. “The amazing plasticity of the brain is nowhere as evident as in the rapid adaptations humans are making in response to our unprecedented access to electronic information,” she wrote. “Those who bemoan the perceived decline in deep thinking or engagement, face-to-face social skills and dependency on technology fail to appreciate the need to evolve our processes and behaviors to suit the new reality and opportunities. Young people and those who embrace the new connectedness are developing and evolving new standards and skills at a rate unprecedented in our history. Overall, our ability to connect, share and exchange information with other human beings is a strong net positive for humanity.”

Teens expert boyd says adults have to recognize the need for young people to explore the world widely and build future skills. “If we keep restricting the mobility of young people, online and offline, we will be curbing their ability to develop social skills writ large,” she warned. “This has nothing to do with technology but with the fears we have about young people engaging with strangers or otherwise interacting with people outside of adult purview.”

William Schrader, a consultant who founded PSINet in the 1980s, expressed unbridled hope.  “A new page is being turned in human history, and while we sometimes worry and most of the time stand amazed at how fast (or how slowly) things have changed, the future is bright for our youth worldwide,” he wrote. “The youth of 2020 will enjoy cognitive ability far beyond our estimates today based not only on their ability to embrace ADHD as a tool but also by their ability to share immediately any information with colleagues/friend and/or family, selectively and rapidly. Technology by 2020 will enable the youth to ignore political limitations, including country borders, and especially ignore time and distance as an inhibitor to communications. There will be heads-up displays in automobiles, electronic executive assistants, and cloud-based services they can access worldwide simply by walking near a portal and engaging with the required method such as an encrypted proximity reader (surely it will not be a keyboard). With or without devices on them, they will communicate with ease, waxing philosophic and joking in the same sentence. I have already seen youths of today between 20 and 35 who show all of these abilities, all driven by and/or enabled by the internet and the services/technologies that are collectively tied to and by it.”

Perry Hewitt, chief digital officer at Harvard University, says this evolution is positive. “It seems easy to decry the attention span of the young and to mourn the attendant loss of long form content—who will watch Citizen Kane with rapt attention when your Android tells you Rosebud was a sled? On consideration, though, the internet has brought forward not only education, but thinking. While we still want to cultivate in youth the intellectual rigor to solve problems both quantitatively and qualitatively, we have gotten them out of the business of memorizing facts and rules, and into the business of applying those facts and rules to complex problems. In particular, I have hope for improved collaboration from these new differently ‘wired’ brains, for these teens and young adults are learning in online environments where working together and developing team skills allows them to advance.”

David Weinberger, senior researcher at Harvard University’s Berkman Center for Internet & Society, says values will evolve alongside the evolution in ways of thinking and knowing. “Whatever happens,” he wrote, “we won’t be able to come up with an impartial value judgment because the change in intellect will bring about a change in values as well.” Alex Halavais, an associate professor and internet researcher at Quinnipiac University, agreed. “We will think differently, and a large part of that will be as a result of being capable of exploiting a new communicative environment,” he noted.

Anonymous respondents added:

“People of all ages are adjusting to a world where ‘facts’ are immediately discoverable, and judgment between competing facts becomes a primary skill.”

“They will be more nimble and enjoy the access that is available to them to interact with their peers, to see, hear, learn, observe, and be entertained—not necessarily in that order. They will have greater flexibility in the world of employment as well.”

“They are used to using the complex interfaces from childhood. It results in a brain better able to assimilate software structure, to organize and resolve complex problems more quickly and almost appear to be ‘wired’ differently than my generation. Positively, they will operate at a much quicker rate in terms of decision-making, analysis, and methodology than my generation. Negatively, they might be missing the sheer joy of play, of conversation, or quiet contemplative moments due to the interruptions of their lives by electronic communication.”

Negative effects cited include a need for instant gratification, a lack of concentration, a craving for social affirmation, a loss of patience, and obesity

A number of the survey respondents who are young people in the under-35 age group—the central focus of this research question—shared concerns about changes in human attention and depth of discourse among those who spend most or all of their waking hours under the influence of hyperconnectivity.

Alvaro Retana, a distinguished technologist with Hewlett-Packard, expressed concerns about humans’ future ability to tackle complex challenges. “The short attention spans resulting from the quick interactions will be detrimental to focusing on the harder problems, and we will probably see a stagnation in many areas: technology, even social venues such as literature,” he predicted. “The people who will strive and lead the charge will be the ones able to disconnect themselves to focus on specific problems.”

Stephen Masiclat, a communications professor at Syracuse University, said, “When the emphasis of our social exchanges shifts from the now to the next, and the social currency of being able to say ‘I was there first’ rises, we will naturally devalue retrospective reflection and the wisdom it imparts.”

Masiclat said social systems will evolve to offer even more support to those who can implement deep-thinking skills. “The impact of a future ‘re-wiring’ due to the multitasking and short-term mindset will be mostly negative not because it will reflect changes in the physical nature of thinking, but because the social incentives for deep engagement will erode,” he noted. “We will likely retain deep-thinking capability if we just reward it sufficiently in many of our social institutions.”

Marjory S. Blumenthal, associate provost at Georgetown University and former director of the Computer Science and Telecommunications Board of the National Academies, agreed. “Perhaps the issue is, how will deep thinking get done—including by whom—rather than will everyone be able to do deep thinking. In other words, division of labor may change.”

However, students who participated in the survey tended to express concerns about their peers’ ability to get beyond short-burst connections to information. Dana Levin, a student at Drexel University College of Medicine, wrote, “The biggest consequence I foresee is an expectation of immediacy and decreased patience among people. Those who grow up with immediate access to media, quick response to email and rapid answers to all questions may be less likely to take longer routes to find information, seeking ‘quick fixes’ rather than taking the time to come to a conclusion or investigate an answer.”

Melissa Ashner, a student at the College of William and Mary, observed, “People report having more difficulty with sustained attention (i.e., becoming immersed in a book). Today, we have very young, impressionable minds depending on technology for many things. It is hard to predict the ways in which this starves young brains of cognitive ability earned through early hands-on experiences. It is likely to continue to contribute to the rise in childhood obesity as well, which further hinders cognitive function.”

Richard Forno, a long-time cybersecurity expert, agreed with these younger respondents, saying he fears “where technology is taking our collective consciousness and ability to conduct critical analysis and thinking, and, in effect, individual determinism in modern society.”

He added, “My sense is that society is becoming conditioned into dependence on technology in ways that, if that technology suddenly disappears or breaks down, will render people functionally useless. What does that mean for individual and social resiliency?”

Many anonymous respondents focused their responses on what one referred to as “fast-twitch wiring.” Here is a collection of comments along those lines:

“Communication in all forms will be more direct; fewer of the niceties and supercilious greetings will exist. Idle conversation skills will be mostly lost.”

“I wonder if we will even be able to sustain attention on one thing for a few hours—going to a classical concert or film, for instance. Will concerts be reduced to 30 minutes? Will feature-length films become anachronistic?”

“Discussions based around internet content will tend to be pithy, opinion-based, and often only shared using social media with those who will buttress—rather than challenge—political, ideological, or artistic beliefs.”

“Increasingly, teens and young adults rely on the first bit of information they find on a topic, assuming that they have found the ‘right’ answer, rather than using context and vetting/questioning the sources of information to gain a holistic view of a topic.”

“Constant broadcasts don’t make it easy for the individual to step away and work through an issue or concern without interruption.”

“My friends are less interested in genuine human interaction than they are looking at things on Facebook. People will always use a crutch when they can, and the distraction will only grow in the future.”

“Parents and kids will spend less time developing meaningful and bonded relationships in deference to the pursuit and processing of more and more segmented information competing for space in their heads, slowly changing their connection to humanity.”

“How/why should we expect the next generation to be ‘different’ (implication = more evolved/better) when they’re raised in a culture increasingly focused on instant gratification with as little effort as possible?”

“It’s simply not possible to discuss, let alone form societal consensus around, major problems without lengthy, messy conversations about those problems. A generation that expects to spend 140 or fewer characters on a topic and rejects nuance is incapable of tackling these problems.”

“Why are we creating a multitasking world for ADD kids? The effects will be more telling than just the Twitterfication of that generation. There have been articles written about how they’re losing their sense of direction (who needs bearings when you have Google Maps or a GPS?). Who needs original research when you have Wikipedia?”

“Human society has always required communication. Innovation and value creation come from deeper interaction than tweets and social media postings. Deeper engagement has allowed creative men and women to solve problems. If Thomas Edison focused on short bursts of energy, I doubt he would have worked toward the creation of the light bulb.”

“‘Fast-twitch’ wiring among today’s youth generally leads to more harm than good. Much of the communication and media consumed in an ‘always-on’ environment is mind-numbing chatter. While we may see increases in productivity, I question the value of what is produced.”

“There is less time for problems to be worked out, whether they are of a personal, political, economic, or environmental nature. When you (individual or collective) screw up (pollute, start a war, act in a selfish way, or commit a sexual indiscretion as a public person) everyone either knows very quickly or your actions affect many people in ways that are irreversible.”

“They should all be forced to whittle a whistle while sitting on a porch with nothing but the trees and birds for company.”

“Long-form cognition and offline contemplative time will start to be viewed as valuable and will be re-integrated into social and work life in interesting and surprising ways.”

Annette Liska, an emerging-technologies design expert, observed, “The idea that rapidity is a panacea for improved cognitive, behavioral, and social function is in direct conflict with topical movements that believe time serves as a critical ingredient in the ability to adapt, collaborate, create, gain perspective, and many other necessary (and desirable) qualities of life. Areas focusing on ‘sustainability’ make a strong case in point: slow food, traditional gardening, hands-on mechanical and artistic pursuits, environmental politics, those who eschew Facebook in favor of rich, active social networks in the ‘real’ world.”

Enrique Piraces, senior online strategist for Human Rights Watch, said communication and knowledge acquisition are increasingly mediated by technology, noting that by 2020, “a significant part of the knowledge that anyone can discover will be processed by ‘third-party brains.’ Machines will learn from that processing, but I’m afraid the subjects won’t develop deep thinking based on this.”

Robert F. Lutes, director of Valley Housing and Economic Authority, says technology is taking humanity down a harmful path. “We have, by-and-large, created a ‘feed-me/fix-me’ generation of sound-bite learners. They are not given the skills to retain anything more than short bits of information. Hence the new generation of computer skills found on social network site such as Twitter, Facebook, et al., are quite easy to grasp hold of and only serve to widen their realms of friends…HP and IBM both dropped their sales of laptop computers for the 2020 generation. Most of the other mainstream companies will continue to do so. CD’s and DVD’s will be totally absent from the scene by that time. Nanotechnology, cloud computing, flash drives, and so forth will be the order of the day. Over the course of the past three years, touchpad technology has exploded exponentially in usage and available applications. These will become the books, communications media, and everything. Face-to-face time will be calculated in terms of touchscreen camera time and not in face-to-face human contact. Much of this is true in the decade of the 2010’s. The ‘hardwiring’ of the basic core or fabric of the individual will not change; it is technology applications and their outcomes that should be of concern…Stagnation of the whole population will come as a result of lack of the skills of innovation, deep thinking, and a lack of desire or urgent need to fulfill basic human drives in proper human interactions.”

A number of respondents to the survey expressed concerns over the health and well-being of young people by 2020. Keith Davis, a team leader for a US Defense Department knowledge-management initiative, noted, “Technology is taking more and more of our children’s time, and not much of the internet time is spent learning. Time once spent outside (as a child) is now spent on computers. Our children are becoming sedentary and overweight at an alarming rate. Weight gain and that type of lifestyle causes apathy in our children. Social skills will be lost, and a general understanding of common sense will be a thing of the past—common sense = Web search. Here is my 2020 prediction: 60% of children over the age of 15 are overweight in the US, and the Web traffic to non-learning sites has grown threefold.”

Bruce Nordman, a research scientist at Lawrence Berkeley National Laboratory and active leader in the Internet Engineering Task Force, expressed concerns over people’s information diets, writing: “The overall effect will be negative, based on my own experience with technology, attention, and deep thinking (I am 49), and observing my children and others. I see the effect of television as a primary example, in which people voluntarily spend large amounts of time in mentally unhealthy activity. I also see our crisis of obesity as informative, as the wide availability of both healthy and unhealthy food ends up with many people eating large amounts of unhealthy food and abandoning healthy habits like exercise. While I am quite willing to believe that some ‘wiring’ differences are occurring and will occur, they will be a modest effect compared to others.”

Eugene Spafford, a professor of computer science and engineering at Purdue University, responded that many young adults are unable to function in a confident and direct manner without immediate access to online sources and social affirmation. He observed: “The ability to express opinion and emotion is replaced with flaming and emoticons, which are much less nuanced. The level of knowledge of the world around many young adults—cultural, political, historical, scientific—seems reduced in favor of greater knowledge of pop culture. There is also a blurring in their minds between facts and opinions because both are presented in quantity with similar polish and forcefulness, and verification and reasoning have been replaced by search engine results. The resulting acceptance of bombast for fact is damaging in nearly all fields of formal inquiry.”

Megan Ellinger, a user experience analyst for a research organization based in Washington, DC, noted that it is becoming more difficult to find truth. “The negative learning behavior and cognition I see occurring by 2020 is rooted in our society’s ability to assess information at a deeper level and to determine what is fact and what is fiction,” she wrote. “It’s an issue that is not unique to future generations, but one I imagine will become more challenging as we generate more collective ‘intelligence.’”

The result is likely to be a wide-ranging mix of positives, negatives, and in-between—and not just for young people

Many survey participants said always-on connectivity to global information is a double-edged sword. Dave Rogers, managing editor of Yahoo Kids, observed that there will be winners and losers as this technology evolves. “Certainly,” he noted, “there will be some teens and young adults who will suffer cognitive difficulties from unhealthy use of the internet, Web, social media, games, and mobile technology. These problems will arise not because of the technology but because of wholly inadequate adult guidance, training, and discipline over young people’s use of the technology. But most teens and young adults will prosper as described in the first option.”

He said one plus is that mobile connectivity is rapidly transforming the lives of children. “The learning and cognitive development made possible by tablets is much more ‘natural,’ more in keeping with the evolutionary-driven development of young minds because it is so much less dependent upon cognitive skills that the youngest children have not yet developed (e.g., advanced verbal abilities),” he wrote. “It’s still early, but I believe we will see significant, positive and even astounding improvements in the cognitive abilities of young people within the next five years.”

Youth expert Morley Winograd said the Millennial generation will drive positive change in the next decade. “When Millennials remake our educational institutions so that they reflect this internet-based architecture, rather than the broadcast, ‘expert in the center’ framework of today’s K-doctorate educational systems,” he wrote, “then their ability to process, if not actually absorb, a greater amount of information will be used to produce positive outcomes for society. But that will take longer than eight years to accomplish.”

“I made the optimistic choice, but in reality, I think that both outcomes will happen,” noted Hal Varian, chief economist at Google. “This has been the case for every communications advance: writing, photography, movies, radio, TV, etc. There’s no reason to believe that the internet is any different. It will provide ways to save time, and ways to waste time, and people will take advantage of both opportunities. In balance, however, I lean toward the more optimistic view since a larger fraction of the world’s population will now be able to access human knowledge. This has got to be a good thing.”

Alexandra Samuel, director of the Social + Media Centre in Vancouver, Canada, said it is important to recognize that cultural and generational biases have always influenced the way older people perceive how young people think and spend their time. “If we can stop fretting about what we’re losing we can make room to get excited about what we’re gaining: the ability to multitask, to feel connected to ‘strangers’ as well as neighbours, to create media unselfconsciously, to live in a society of producers rather than consumers,” she said. “The question we face as individuals, organizations, educators and perhaps especially as parents is how we can help today’s kids to prepare for that world—the world they will actually live in and help to create—instead of the world we are already nostalgic for.”

Computing pioneer and ACM Fellow Bob Frankston predicted that people will generally take all of this in stride. “We will renorm to the new tools,” he said. “We have always had mall rats and we’ve had explorers. Ideally, people will improve their critical thinking skills to use the available raw information. More likely, fads will continue.”

Jerry Michalski, founder and president of Sociate, asked, “What if we’re seeing a temporary blip in behavior because an Aleph has suddenly opened in the middle of civilization, a Borges-like hole through which anyone can talk to anyone, and anyone can see everything that ever happened and is happening now? Because this has never existed, all the way back through prehistory, of course we’re seeing addictive and compulsive behaviors. Naturally. The big question seems to me to be whether we’ll regain our agency in the middle of all this, or surrender to consumerism and infotainment and live out a WALL-E world that’s either Orwell’s or Huxley’s misanthropic fantasies in full bloom. I think we’re figuring out how to be human again amid all this, and that we’ll all learn how to use the new technologies to multitask as well as to dive deep into materials, weaving contexts of meaning that we haven’t seen before. Call me an optimist.”

Tiffany Shlain, director of the film Connected and founder of the Webby Awards, quoted Sophocles. “We are evolving and we are going to be able to access so much knowledge and different perspectives that we will come up with new ideas and new solutions to our world’s problems,” she responded. “The key will be valuing when to be present and when to unplug. It is the core of what makes us human to connect deeply, so this always will be valued. Just as we lost oral tradition with the written word, we will lose something big, but we will gain a new way of thinking. As Sophocles once said, ‘Nothing vast enters the life of mortals without a curse.’”

Martin D. Owens, an attorney and author of Internet Gaming Law, also pointed out the dual effects of humans’ uses of technologies, writing, “Good people do good things with their access to the internet and social media—witness the profusion of volunteer and good cause apps and programs which are continually appearing, the investigative journalism, the rallying of pro-democracy forces across the world. Bad people do bad things with their internet access. Porno access is all over the place—if you want it. Even Al Qaeda has a webpage, complete with interactive social games with a terrorist bent like Make a Bomb in the Kitchen of Your Mom. Just as with J.R.R. Tolkien’s ring of power, the internet grants power to the individual according to that individual’s wisdom and moral stature. Idiots are free to do idiotic things with it; the wise are free to acquire more wisdom. It was ever thus. Each new advance in knowledge and technology represents an increase in power, and the corresponding moral choices that go with that power.”

Jessica Clark, a media strategist and senior fellow for two US communications technology research centers, was among many who observed that there’s nothing new about concerns over teens and their evolving ways of creating and sharing content. “History is a progression of older people tut-tutting over the media production and consumption habits of those younger than them and holding tightly to the belief that the technologies of communication they grew up with are intellectually or culturally superior,” she wrote. “Every new generation finds creative and groundbreaking ways to use the new technologies to explore and illuminate human truths and to make dumb, sexist, horrifying schlock. Multitasking young adults and teens will be fine; they’ll be better at certain types of tasks and worse at others. Their handwriting will be horrendous. Their thumbs will ache. Life will go on.”

This could have a significant impact on politics, power, and control, endangering human rights and freedom

Respected communications scholar Sandra Braman of the University of Wisconsin envisioned a world similar to the one Neil Postman warned of in his book Amusing Ourselves to Death: Public Discourse in the Age of Show Business. She wrote: “What is being lost are the skills associated with print literacy, including the ability to organize complex processes in a sustained way over time, engage in detailed and nuanced argumentation, analytically compare and contrast information from diverse sources, etc. What is being gained are hand-eye coordination skills, certain types of visual literacy, etc.”

She continued: “Which literacies are dominant is of serious consequence for society at large. The practice of democracy is one among the fundamental elements of high modern society that relies upon print literacy, as are scientific thought and experimental science. There are two more issues. One is transferability. Are the deep skills acquired by those with a lot of gaming experience transferable to the meat flesh world? That is, do those who can track multiple narratives simultaneously practice that same skill in environments that aren’t animations and have buttons to push? The second is will. Do those who can, to stick with the same example, track and engage with multiple narratives simultaneously choose to do the same with the meat-flesh political environment? The incredibly important research stream that we have not seen yet would look at the relationship between gaming and actual political activity in the meat-flesh world. My hypothesis is that high activity in online environments, particularly games, expends any political will or desire to effectively shape the environment so that there is none of that will left for engaging in our actual political environment.”

Jesse Drew, an associate professor of technocultural studies at the University of California-Davis, echoed Braman. “My fear is that though their cognitive ability will not be impaired, their ability to think critically will be, and they will be far more susceptible to manipulation,” he wrote.

John Pike, director of GlobalSecurity.org, observed: “The world is becoming more complex, and yet both old media (e.g., cable TV news) and new media (e.g., Twitter) are becoming increasingly simplistic. What passes for politics is increasingly a charade detached from actual governance.”

Paul Gardner-Stephen, a telecommunications fellow at Flinders University, said the underlying issue is that people will become dependent upon accessing the internet in order to solve problems and conduct their personal, professional, and civic lives. “Thus centralised powers that can control access to the internet will be able to significantly control future generations,” he pointed out. “It will be much as in Orwell’s 1984, where control was achieved by using language to shape and limit thought; so future regimes may use control of access to the internet to shape and limit thought.”

A number of anonymous respondents brought up control and attention issues when they responded to this research question. Among them:

“With deregulation, consolidation of media ownership and control, and the acceptance of capitalism as natural and inevitable, learning styles and attention spans are headed toward the inability to think critically. Trends in education, social activities, and entertainment all make more likely a future of passive consumers of information.”

“Popular tools allow us to move at a pace that reinforces rapid cognition rather than more reflective and long-term analysis. I fear that market forces and draconian policies will drive the technology/media interface.”

“Among my own peer group today (the young adults), much more attention is given to the topic-of-the-day than to deep, philosophical/moral issues, and I don’t see this trend reversing. There’s a decent chance something could come around in the next five years to create radical social and cultural change the world over, but it’s hard to predict what and even harder to expect it on a short timeframe when the competition has trillions of dollars at their disposal to prevent such radical change from happening.”

“The ease at which authorities can be bypassed erodes our civil society. Cheating and corruption is rampant. Productivity continues to fall not grow as each new wave of technologies fails to live up to its potential. People are obsessed with mundane things. Consumerism becomes the main fuel for our emotions.”

“We need to be more worried about how search engines and other tools are being increasingly controlled by corporations and filtering the information we all access.”

“We have landed in an electronics age where communications technologies are evolving much more quickly than the minds that are producing them and the social structures that must support them. We are not taking the time to evaluate or understand these technologies, and we aren’t having serious conversations about what effects these new tools have on us.”

“We are evolving tools and habits to select what is valuable from what is not, but we clearly can go either way, and some—perhaps many—will go the route of gossip and distraction. The good news is that it doesn’t take very many highly creative people to transform a society; those who figure out how to bring new creations out of internet chaos will surely lead the rest in new and good directions, even as others lead us elsewhere.”

Fernando Botelho, an international consultant on technology and development, expressed concerns about humans’ tendencies to sort themselves in ways that may cause friction. “Humanity needs no additional help in dividing itself into groups that exclude more than include,” he wrote. “The best way to unite millions and divide billions is nationalism, but the reality is that religion, politics, and so many other mental frameworks can do it just as effectively, and the internet enables much more narrowly targeted divisions so that we are not divided anymore into less than 200 national territories or three or four major religions, but into thousands or even millions of subgroups that challenge us to avoid the tragedy of the commons at a global level.”

And Sam Punnett, president of FAD Research, sketched the second scenario as a multilayered, doleful future:
“The seemingly compulsive nature of modern media use and the distracted nature of users themselves have other serious interpersonal effects akin to substance addictions. We need to know much more about these phenomena. So to go wide and long on this, let’s say in 2020 that the entire wired population has largely restricted its information flow through filtering and aggregators. People expose themselves only to information that conforms with their view of the world, from people they ‘know.’

“Interpersonal skills have eroded to a point where many people no longer have a sensibility for exercising what might have previously been described as tact or social graces. The manner in which communications occur (or do not occur) allows people to artificially wall themselves off from anything unpleasant or unanticipated or complicated. There is an increase in mental illnesses related to disassociation and alienation.

“All communications must be short, visual, and distracting/entertaining. The intellectual attributes that may become highly valued are those that concern particular expertise in an area that requires study and the consolidation of information over time. On the other hand, presentation and on-screen personality may trump expertise as people come to rely on people who merely present information in an entertaining and digestible fashion causing the least amount of cognitive dissonance.

“Branding and politics are ruled by those who can mount the most entertaining ‘noise’ on the most effective platforms. Education will have largely moved ‘on-screen’ in the class and online at higher levels. There is a decline in people’s ability to communicate verbally. Language will simplify to conform to the new requirement for bite-size messages.

“Libraries will continue to consolidate themselves into fewer outlets as crosses between repositories for ‘dead media’ and community centers for public Net access and entertainment. There will be a further emergence of virtual associations in things like game ‘clans,’ online special interest groups and groups formed through social networks.

“Personal skills like those that enable people to get others to cooperate in work settings will be more at a premium as are people with ‘people skills’ such as those required for psychiatric services, mediation and social work. Organizational skills that allow people to see the ‘big picture’ and to coordinate others may be even more highly valued than they are now.

“There will be an increase in accidents and things going wrong due to miscommunication and the widespread combination of sleep deprivation and fractured attention spans.

“In 2020 almost no one will remember a time when things were different.”

Many argue that reinvention and reform of education are the key to a better future; some predict it will not happen quickly enough

Respondents often pointed to formal educational systems as a key driver of a positive and effective transition to taking full advantage of the fast-changing digital-knowledge landscape. “The changes in behavior and cognition in the future depend heavily upon how we adapt our pre-school-through-college curricula to encompass new techniques of learning and teaching,” wrote Hugh F. Cline, an adjunct professor of sociology and education at Columbia University who was formerly a senior research scientist at a major educational testing company based in Princeton, NJ. “If we simply continue to use technologies to enhance the current structure and functioning of education, our young people will use the technologies to entertain themselves and engage in online socializing and shopping. We will have missed enormous opportunities to produce independent life-long learners.”

David Saer, a foresight researcher for Fast Future, said he’s a young adult who predicts a positive evolution but, “education will need to adapt to the wide availability of information, and concentrate on teaching sifting skills.” He added: “The desire for instantaneous content should not be seen as a lack of patience or short attention span but as a liberation from timetables set previously by others. It’s simply a matter of demanding information and technology to suit the timetable of the individual, an overarching trend throughout human history.”

Another futurist, Marcel Bullinga, author of Welcome to the Future Cloud—2025 in 100 Predictions, said education is essential. “Game Generation teens and adults will have lasting problems with focus and attention,” he noted. “They find distraction while working, distraction while driving, distraction while talking to the neighbours. Parents and teachers will have to invest major time and efforts into solving this issue: silence zones, time-out zones, meditation classes without mobile, lessons in ignoring people. All in all, I think the negative side effects can be healed.”

Larry Lannom, director of information management technology and vice president at the Corporation for National Research Initiatives, said, “People must be taught to think critically and how to focus. If they are, then the network is a rich source of information. If they aren’t, then it will be a source of misinformation and mindless distraction. Individual differences will prevail and some will do well in the new environment and some will not.”

Tapio Varis, principal research associate with the UN Educational, Scientific, and Cultural Organization (UNESCO), wrote, “The first scenario will succeed only if the formal school system develops accordingly.” Berkeley, California-based consultant John N. Kelly added, “The ‘wiring’ change is real. Learning opportunities could easily continue to be lost unless educators, venture capitalists, taxpayers, volunteers, and businesses all make concerted efforts to leverage the potential of new technology to enhance the critical thinking skills of young people.”

Jeniece Lusk, a researcher and PhD in applied sociology at an Atlanta-based information technology company, responded, “Unless the educational paradigms used in our schools are changed to match the non-academic world of the Millennial student, I don’t foresee an increase in students’ abilities to analyze and use critical thinking. Students’ attention is increasingly being pulled into myriad directions—and arguably most of these ‘distractions’ are exciting, fun, and can be used to educate. However, despite schools’ best efforts to integrate technological materials and devices, they’re failing to completely redesign the education system to fit these students. Instead, they are creating drones who succeed purely on their ability to sit still for long periods of time, not use the technological devices available to them, and restrict their studying and research to strict parameters. Students are often unable to adapt when they enter college classrooms requiring them to apply processes and information, problem-solve, or think critically. They barely know how to use alternative words or phrases to complete a Google search. Since they’ve been taught that e-technology has no place in the classroom, they also haven’t learned proper texting/emailing/social networking etiquette, or, most importantly, how to use these resources to their advantage.”

Bonnie Bracey Sutton, a technology advocate and education consultant at the PowerofUS Foundation, said educators have to break through the old paradigm and implement new tools. “We were previously harnessed by text and old models of pedagogy,” she wrote. “When we move to transformational teaching it is hard to explain to traditional teachers what we are doing in a way that allows them to understand the beauty of using transformational technology. Many ways of learning are involved, and the work is not all done by the teacher. Resources abound from partners in learning, in advocacy, and academia. The technology makes it all possible, and we can include new areas of learning, computational thinking, problem solving, visualization and learning, and supercomputing.”

An anonymous respondent said most teachers today can’t comprehend the necessary paradigm to implement the tools effectively: “Those who are teaching the children who will be teenagers and young adults by 2020 are not all up-to-speed with the internet, mobile technologies, social interfaces, and the numerous other technologies that have recently been made mainstream. There will be a decline in behavior and cognition until those who have grown up with this type of technology are able to teach the children how to correctly and productively utilize the advantages it presents us.”

Another anonymous respondent wrote, “Interactions will definitely be different as a result of kids growing up with all this technology at their fingertips. I don’t think this will result in less-smart children, but it will definitely result in children who learn differently than those who grew up without constant stimulation from technology.”

Tin Tan Wee, an Internet expert based at the National University of Singapore, anticipates a slow process of adaptation to the likely divide. “After 2020,” he predicted, “more-enlightened educators will start developing curricula designed to tap a post-internet era. After 2030, educational systems, primarily private ones, will demonstrate superior outcomes on a wider scale. After 2040, governments will start realising this problem, and public examination systems will emerge.

“The key lynchpin to watch for will be online testing systems which allow for the use of internet access and all the issues of identity, security, copying, plagiarism, etc., some of which companies like Turnitin are starting to address for tertiary education. So during the next 20 to 30 years, a digital divide will grow in educational systems and in outcomes in which the individuals, systems, etc., which can adapt will progress far more rapidly than those who cannot—and they will be the majority and will do badly and suffer. We are already seeing this manifested in the economic scene, where the rich get richer and the poor poorer.”

Ken Friedman, dean of the faculty of design at Swinburne University of Technology in Melbourne, Australia, said, “With an added repertoire of experiences and skills, it might be that technology could lead to a brighter future, but today’s young people generally do not seem to be gaining the added skills and experiences to make this so.”

Freelance journalist Melinda Blau said education in internet literacy is key. “Technology always presents us with a combination of losses and gains,” she wrote, “but I believe the internet gives more than it takes away. 2020 will yield primarily helpful results, especially if our schools and other institutions take steps to—in Howard Rheingold’s words—help develop Internet literacy.”

Wesley George, principal engineer for the Advanced Technology Group at Time Warner Cable, said there must be a shift in focus in the education system. “The difference between the two scenarios will come down to the ability of our educational system (or its replacement) to teach people how to manage the flow of information, the interaction between personal and work, social and entertainment, fact and opinion,” he predicted. “This does represent an evolutionary change, but the focus must be on the fact that learning means knowing how to filter and interpret the vast quantities of data one is exposed to—we must use the fact that the Internet has all of this information to spend less time doing rote memorization and more time on critical thinking and analysis of the information that is available to you.”

Tom Franke, chief information officer for the University System of New Hampshire, noted that it is up to people to actively set the agenda if they want a positive outcome. “As machines that ‘think’ become prevalent and information access becomes even more universal than today, we will need to re-envision our models of education and learning,” he said. “The possibility of exploring deep questions will be enhanced, but it will be our culture, not our technology, that determines whether or not we have the will to use the tools in meaningful ways to enhance humanity.”

Teachers express many concerns about the disconnect they are feeling with students; you can feel the tension in their words

A number of people who identified themselves as teachers answered this question as anonymous respondents, and most of them expressed frustration and concern for today’s students. Several noted that they have seen things “getting worse” over the past decade. Is this at least partly because they are still trying to educate these highly connected young people through antiquated approaches? Perhaps those who have argued for education reform would think so.

Among the responses from those who expressed concerns about the students they are teaching now, some blame technology; some blame culture. Following is a selection of those responses:

“My experience with college students suggests to me that their critical skills are diminishing; they can’t make connections or see issues and events in terms of systems, prior choices, or institutions. Instead, any item/event is the equivalent of any other item/event. It is quickly displaced or disconnected from other items/events, and just part of a massive flow. Students don’t read books. They rarely read long articles. When they do read, they don’t read for arguments. Instead, they skim the middles of pages, perhaps moving their eyes up and down if something interests them. They don’t work on retaining what little they read, or even seem to think that taking notes is necessary. Their reasons seem to be that they can always find out whenever they need to. The future will belong to those who can focus. This will be an increasingly small and rare group of people.”

“I have seen a general decline in higher-order thinking skills in my students over the past decade. What I generally see is an over-dependence on technology, an emphasis on social technologies as opposed to what I’ll call ‘comprehension technologies,’ and a general disconnect from deeper thinking. I’m not sure that I attribute this to the so-called ‘re-wiring’ of teenage brains, but rather to a deeper intellectual laziness that the Web has also made possible with the rise of more video-based information resources (as opposed to textual resources).”

“I have horror stories about lack of attention. I am not sure that the physiology will change, but I am sure about how the current generation orients to traditional text—reading it or writing it. I have also seen the loss of interpersonal communication competence. What has emerged is an overly dramatic face-to-face style and a greater unwillingness to engage or cope with differences. It is extending adolescence.”

“I teach at the college level—have been for 12 years. I have seen a change in my students, their behavior, their learning, etc. Students do not know how to frame a problem or challenge. They do not know how to ask questions, and how to provide enough detail to support their answers (from credible sources). Technology is playing a big part in students not only not being able to perform as well in class, but also not having the desire to do so.”

“Every day I see young people becoming more and more just members of a collective (like the Borg in Star Trek) rather than a collection of individuals and I firmly believe that technology is the cause. I also believe that this phenomenon, which is at first merely seductive, eventually becomes addictive and is going to be very difficult to undo.”

“From my teaching I find more and more college students finding trouble in reading, listening, understanding directions, and comparing ideas. It is not wiring so much as change in education and culture, in my opinion.”

“The answers that students produce—while the students may be adept at finding them online through Google—tend to be shallow and not thought through very well. However, to say that somehow they aren’t as smart as earlier generations is a crock—many write quite poorly on academic assignments but are fine when blogging and producing diaristic accounts that ask them about themselves—an outcome, doubtless, of the lifetime focus on ‘me’ that many middle-class and upper-class kids now experience as the norm.”

“I do not think that the highly interactive, interrupt-driven, always-on, information-rich society that young people are growing up in is causing a decline in deep intellectual activity. Rather, I think it is the culture at large, driven by the generation before this youngest generation that devalues science, facts, intelligence, reasoning, and intellectual achievement in favor of emoting, celebrity, athletic achievement, fighting and winning, and faith. Also, the culture of praise over criticism leads to a society where to tell someone they are incorrect is at best a social faux pas and at worst reasons for demotion, dismissal, and poor teaching evaluations.”

“We’re all going to end up being more distracted, shallow, fuzzy thinking, disconnected humans who cannot think or act critically. But this won’t be because of the Internet, it’ll be because of the loss of values and resourcing of things like education and civics and the ridiculous degree to which popular media, etc., are influencing our culture, values, etc.”

Widening divide? There’s a fear the rich will get richer, the poor poorer

Teens expert danah boyd raised concerns about a looming divide due to the switch in how young people negotiate the world. “Concentrated focus takes discipline, but it’s not something everyone needs to do,” she wrote. “Unfortunately, it is what is expected of much of the working-class labor force. I suspect we’re going to see an increased class division around labor and skills and attention.”

Barry Parr, owner and analyst for MediaSavvy, echoed boyd’s concern about a widening divide. “Knowledge workers and those inclined to be deep thinkers will gain more cognitive speed and leverage,” he said, “but, the easily distracted will not become more adept at anything. History suggests that on balance people will adapt to the new order. The greatest negative outcome will be that the split in adaptation will exacerbate existing trends toward social inequality.”

Alan Bachers, director of the Neurofeedback Foundation, said society must prepare now for the consequences of the change we are already beginning to see. “The presence of breadth rather than depth of cognitive processing will definitely change everything—education, work, recreation,” he responded. “Workers will show up unsuited for the robotic, mind-numbing tasks of the factory—jobs now vanishing anyway. Creativity, demand for high stimulus, rapidly changing environments, and high agency (high touch) will be what makes the next revolution of workers for jobs they will invent themselves, changing our culture entirely at a pace that will leave many who choose not to evolve in the dust.”

An anonymous survey respondent said children who grow up with access to technology plus the capacity to use it in a positive manner will generally be more successful than others: “Decision-making will yield better results and those who are adept at integrating knowledge will be very successful. However, a wired world will be very addictive and those young adults who do not have a clear goal and a desire to achieve something will be caught in a downward spiral from which escape will be almost impossible. They will fall further and further behind. The result will be bimodal. The result will be positive overall, but a new type of underclass will be created which will be independent of race, gender, or even geography.”

Another anonymous respondent echoed those thoughts, writing, “Young people from intellectually weak backgrounds who have no special driving interest in self-development are all too likely to turn out exactly as the purveyors of a debased mass-culture want them to be: shallow, impulse-driven consumers of whatever is being sold as ‘hot’ at the moment.”

Tin Tan Wee, an expert based at the National University of Singapore, noted: “The smart people who can adapt to the internet will become smarter, while the rest, probably the majority, will decline. Why? The reason is simple. Current educational methods evolved to their current state mostly pre-internet. The same goes for a generation of teachers who will continue to train yet another generation of kids the old way. The same goes for examination systems, which carry out assessment based on pre-internet skills. This mismatch will cause declension in a few generations of cohorts. Those who are educated and re-educable in the internet way will reap the benefits of the first option. Most of the world will suffer the consequence of the second. The intellectual divide will increase. This in turn fuels the educational divide because only the richer can afford internet access with mobile devices at effective speeds.”

Some say the use of tech tools has no influence on the brain’s ‘wiring’

Well-known blogger, author, and communications professor Jeff Jarvis said we are experiencing a transition from a textual era and this is altering the way we think, not the physiology of our brains. “I don’t buy the punchline but I do buy the joke,” he wrote. “I do not believe technology will change our brains and how we are ‘wired.’ But it can change how we cognate and navigate our world. We will adapt and find the benefits in this change.”

He continued: “Hark back to Gutenberg. Elizabeth Eisenstein, our leading Gutenberg scholar, says that after the press, people no longer needed to use rhyme as a tool to memorize recipes and other such information. Instead, we now relied on text printed on paper. I have no doubt that curmudgeons at the time lamented lost skills. Text became our new collective memory. Sound familiar? Google is simply an even more effective cultural memory machine. It has already made us more fact-based; when in doubt about a fact, we no longer have to trudge to the library but can expect to find the answer in seconds. Scholars at the University of Southern Denmark have coined the wonderful phrase ‘the Gutenberg Parenthesis’ to examine the shift into and now out of a textually based society.”

“Before the press,” Jarvis concluded, “information was passed mouth-to-ear, scribe-to-scribe; it was changed in the process; there was little sense of ownership and authorship. In the five-century-long Gutenberg era, text did set how we see our world: serially with a neat beginning and a defined end; permanent; authored. Now, we are passing out of this textual era and that may well affect how we look at our world. That may appear to change how we think. But it won’t change our wires.”

Jim Jansen, an associate professor of information science and technology at Penn State University and a member of the boards of eight international technology journals, noted, “I disagree with the opening phrase: ‘In 2020 the brains of multitasking teens and young adults are ‘wired’ differently from those over age 35.’ I find it hard to believe that hard wiring, evolved over millions of years, can be re-wired. We can learn to use tools that impact the way we view things, but to say this is wiring is incorrect.”

Tracy Rolling, a product user experience evangelist for Nokia, observed, “One of the great things about the Internet is that it frees up people’s memories. You don’t have to remember information; you only have to remember how to find the information you need. Most of the information we need, we don’t need all the time. There’s no reason to actually remember it at all. But this is no different in ‘wiring’ than what we all do anyway. We don’t bother to remember things we know our spouse will remember for us. The Internet is the same thing on a larger scale. I remember 15 years ago when people were terrified that kids would not be able to write because of the text-message shorthand that they had invented for themselves. It turns out that kids who use (and invent) text-message shorthand have better verbal skills than us oldsters do because text-message shorthand is inventive word play. The kids aren’t smarter or dumber than we were; technology helps us free our brains for more useful things.”

Some analysts framed their arguments in more general terms and argued that there will not be significant cognitive change. This is the way Seth Finkelstein, a prominent tech analyst and programmer, put it: “I really wish there was an option for: ‘In 2020 the brains of teens and young adults are not ‘wired’ differently from those over age 35 and overall it yields essentially identical results. They learn roughly the same amount, as for most people the speed of information access is not the limiting factor. In sum, the changes in learning behavior and cognition among the young aren’t significantly affected.’”

Questioning the idea of multitasking; some say it is impossible, some believe in time-slicing or supertasking

The word “multitasking” has firmly rooted itself as the primary descriptor used to refer to the task-juggling and attention-switching that is part and parcel of the hyperconnected lifestyle. Multitasking is a common act among today’s teens and 20s set. The semantics of the word have been debated, with many saying it is not possible to perform multiple tasks simultaneously.

“Regarding the word ‘multitasking,’ cognitive, behavioral, and neurological sciences are moving toward a consensus that such a state does not actually exist in the human brain,” observed emerging technology designer Annette Liska. “We may make many quick ‘thoughts’ in succession, but human performance in any activity that is done without focus (often termed ‘multitasking’) is of significantly lower quality, including an absence of quality and consciousness. The word unfortunately perpetuates a false ideal of the human capacity to perform and succeed.”

Devra Moehler, a communications faculty member at the University of Pennsylvania, shared research resources. “Eyal Ophir (Stanford) and others [have shown] the effects of multitasking are negative, even for those who think they are good at it. Matt Richtel wrote about this topic in the 2010 New York Times article Your Brain on Computers: Attached to Technology and Paying a Price and Helene Hembrooke and Geri Gay wrote in 2003 for the Journal of Computing in Higher Education The Laptop and the Lecture: The Effects of Multitasking in Learning Environments. The desire for constant stimulation and task switching is being inculcated in our youth, but not necessarily the ability to manage multitasking effectively to get more done. The end result will be negative. Concentration and in-depth thought may be skills that are rare, and thus highly valued in 2020.”

“I agree with all of those who say that multitasking is nothing more than switching endlessly from one thought to another—no one can think two things at once—but I don’t agree that this kind of attention-switching is destructive or unhealthy for young minds,” added Susan Crawford, professor at Harvard University’s Kennedy School of Government and formerly on the White House staff. “It’s just the way the world works now, and digital agility is a basic skill for everyone. At the same time, I have hopes for my students: I hope they’ll discover the flow experience of reading long-form works and won’t need distraction in order to concentrate; I hope they’ll go on finding ways to hang out that are meaningful and don’t involve devices.”

Nikki Reynolds, director of instructional technology services at Hamilton College, said studies indicate that young people are not truly multitasking. “They are ‘time slicing’,” she responded. “A few seconds of attention to the phone, now switch to the homework, now the TV, now back to the phone. This means it takes them longer to complete any one task, such as their homework. It also appears to affect the quality of their work. However, in my experience as a manager of only a few people, all of whom must interact daily with many more people, I am beginning to believe that this time slicing will become a skill that will help young people manage adult life better. The number of people who need our attention to answer a quick question or connect them to some resource is growing rapidly, and this requires me and my team to spend a lot of time switching contexts as part of our jobs. We touch a lot of people for brief little bits of time, in an unpredictable stream of interactions. I suspect the kids will be fine.”

Gina Maranto, a co-director in the graduate program at the University of Miami, said information multitasking is not a new phenomenon. “My father, a corporate editor, used to watch television, read magazines, and listen to the radio at the same time long before computers, cellphones, or iPads,” she said. “On the whole, I believe access to information and to new techniques for manipulating data (e.g., visualization) enhance learning and understanding rather than negatively impact them. Like Diderot’s encyclopedia, which freed up knowledge that had been locked in guilds, the internet and World Wide Web have freed up knowledge that was locked in proprietary databases, archives, and other difficult-to-access sources—and this has far-flung implications, not just educational but socioeconomic and cultural ones. The best students will use these technologies to carry out higher-level cognitive tasks.”

Communications consultant Stowe Boyd says new studies may be showing us that multitasking is actually quite possible. “There is recent evidence (published by researchers Jayson Watson and David Strayer) that suggests that some people are natural ‘supertaskers’ capable of performing two difficult tasks at once, without loss of ability on the individual tasks,” he wrote. “This explodes the conventional wisdom that ‘no one can really multitask,’ and by extension the premise that we shouldn’t even try. The human mind is plastic. The area of the brain that is associated with controlling the left hand, for example, is much larger in professional violinists. Likewise, trained musicians listen to music differently, using more centers of the brain, than found in non-musicians. To some extent this is obvious: we expect that mastery in physical and mental domains will change those masters’ perceptions and skills. But cultural criticism seems to want to sequester certain questionable activities—like video gaming, social networking, multitasking, and others—into a no-man’s-land where the plasticity of the human mind is negative. None of these critics wring their hands about the dangerous impacts of learning to read, or the intellectual damage of learning a foreign language. But once kids get on a skateboard, or start instant messaging, it’s the fall of Western civilization.”

Boyd said it seems as if the social aspects of Web use frighten many detractors, adding, “But we have learned a great deal about social cognition in recent years, thanks to advances in cognitive science, and we have learned that people are innately more social than was ever realized. The reason that kids are adapting so quickly to social tools online is because they align directly with human social connection, much of which takes place below our awareness. Social tools are being adopted because they match the shape of our minds, but yes, they also stretch our minds based on use and mastery, just like martial arts, playing the piano, and badminton.”

Contrary to popular belief, young people are not digital wizards

David Ellis, director of communications studies at York University in Toronto, has a front-row seat to observe how hyperconnectivity seems to be influencing young adults. He said it makes them less productive and adds that most of them do not understand the new digital tools or how to use them effectively. “The idea that Millennials have a cognitive advantage over their elders is based on myths about multitasking, the skill-sets of digital natives, and 24/7 connectedness,” he commented. “Far from having an edge in learning, I see Millennials as increasingly trapped by the imperatives of online socializing and the opportunities offered by their smartphones to communicate from any place, any time.

“I can see this in the living experiment that takes place every week in the computer lab where I teach Internet technologies to fourth-year communication studies majors. Students everywhere have become relentless in their use of mobile devices for personal messaging. Even good students delude themselves into thinking they can text friends continuously while listening to a lecture and taking notes and, in the process, retain information and participate in discussions. But good research has shown that even especially bright kids are less productive when multitasking, a finding resisted by plenty of grown-ups as well.

“Our fondness for thinking positively about multitasking, especially among the young, gets a lot of reinforcement from two other assumptions: that Millennials have a special aptitude for digital media because they’ve grown up digital; and that ubiquitous, seamless connectivity is a positive social force. The first assumption is baloney; the second is fraught with contextual problems. Of the hundreds of liberal arts students I’ve taught, not one in ten has come into my class with the slightest clue about how their digital devices work, how they differ from analog devices, how big their hard drive is, what Mbps (megabits per second) measures. In other words, they’re just like people who haven’t grown up digital. And of course the immersive nature of 24/7 connectedness creates the illusion that Millennials can somehow tap into a form of collective intelligence just by being online, while looking impatiently for messages every three minutes.

“I don’t think there’s anything inherently bad or anti-social about smartphones, laptops, or any other technology. I do, however, believe we are entering an era in which young adults are placing an inordinately high priority on being unfailingly responsive and dedicated participants in the web of personal messaging that surrounds them in their daily lives. For now, it seems, addictive responses to peer pressure, boredom, and social anxiety are playing a much bigger role in wiring Millennial brains than problem-solving or deep thinking.”

Hello! AOADD (Always-On Attention Deficit Disorder) is age-defying

Rich Osborne, senior IT innovator at the University of Exeter in the UK, said his own life and approaches to informing and being informed have changed due to the influence of hyperconnectivity. “As I am in possession of just about every technical device you can name and I am using just about every cloud service you can think of, you’d think I’d be all for this,” he observed. “But I’ve started to wonder about how all this use of technology is affecting me. I strongly suspect it’s actually making me less able to construct more complex arguments in written form, for example—or at the very least it is certainly making such construction harder for me. Of course it might be other issues, stress at work, getting older, interests changing, any number of things—but underlying all these possibilities is the conscious knowledge that my information-consumption patterns have become bitty and immediate.

“I’ve noticed in my own habits how the instant availability of bite-size data has led me away from deeper more complex texts, a form of intellectual procrastination—perhaps even addiction-style behaviour. Of course this might just be temporary—more an effect of the current state of the Internet, as opposed to something that is baked into the very nature of the internet itself. In the meantime, though, the immediate and bite-size nature of Internet exchanges will make it harder for multitasking teens and young adults to undertake deep thinking in particular, and the ‘top-10’ effect, i.e., people selecting whatever Google proposes on the first page of search results, may lead to a plateau of intellectual thinking as we all start to attend to the same content.”

An anonymous respondent agreed, writing, “I find in myself that switching constantly between tasks, and the eyesight and energy issues from sitting in front of a screen all day make it harder for me to concentrate and connect with others in both online and offline settings. I have a shorter attention span. I’m less patient because I’m used to not having to wait for information; there are many things worth doing that take time, are tedious, and require patience. Who among us doesn’t rely on a phone or computer for knowing what to wear, how to get from A to B, and to know what’s happening with our friends, even those we rarely speak to? I don’t see how the more positive scenario could result.”

Another wrote, “I’m 33 years old and over the last two years have ramped up my time spent on the internet to 10-plus hours a day. The effects have been detrimental. My attention span for longer-form information consumption such as books, movies, long-form articles, and even vapid 30-minute TV shows has been diminished immensely. My interpersonal communications skills are suffering, and I find it difficult to have sustained complex thoughts. My creativity is zapped and I get very moody if I’m away from the Web for too long.”

Debbie Donovan, a marketing blogger based in Mountain View, California, described her experience: “As an over 35-er, I can tell you that I’ve deliberately re-wired my brain and I can manage a more complex and rewarding life situation as a result of the digital skills deliberately acquired. I am more effective in my work. How we interact digitally is infinitely revealing of how our brains work with all the inputs we receive. I am more effective in my personal life because I can reach out and stay in touch with a much larger circle of friends and family and cultivate the level of intimacy I can achieve in those relationships.”

Heidi McKee, an associate professor of English at Miami University, said, “Nearly 20 years ago everyone was saying how teens were going to be wired differently, but when you look at surveys done by Pew, AARP, and others, older adults possess just as much ability and desire to communicate and connect with all available means.”

Dan Ness, principal researcher at MetaFacts (producers of the Technology User Profile), noted that each generation laments the younger generation and imagines a world that’s either completely better or worse than the current one. “You can go back to writings from hundreds and thousands of years ago and hear the same conclusion,” he said. “While most aging adults don’t want to admit to their own calcification or rigidity, nor how their memory of past events may be romanticized or simplified, there seems to be a perennial need to imagine a starkly changed future. So, this statement is less about the Internet and technology per se, and more about human development. The under-35 group is more likely to fully use the tools and technology around them and incorporate them into their lives. In the main, as people age, they will choose to use what they’ve learned for ‘positive’ or ‘negative’ outcomes.”

A respondent wittily observed the age discrimination implicit in the scenario, writing, “I aint a technophobe and i really hate it when Internet use is demonised for creating problem teens.”

No matter what the tech is, it all comes down to human nature

Human tendencies drive human uses of technology tools. Many of the people participating in this survey emphasized the importance of the impact of basic human instincts and motivations.

Some survey respondents observed that all new tools initially tend to be questioned and feared by some segment of the public. Socrates, for instance, lamented the spread of writing and its likely threat to the future of intelligent discourse. In his response to this survey question, Christopher J. Ferguson, a professor from Texas A&M whose research specialty is technologies’ effects on human behavior, noted, “The tendency to moralize and fret over new media seems to be wired into us.”

He added, “Societal reaction to new media seems to fit into a pattern described by moral panic theory. Just as with older forms of media, from dime novels to comic books to rock and roll, some politicians and scholars can always be found to proclaim the new media to be harmful, often in the most hyperbolic terms. Perhaps we’ll learn from these past mistakes? I think we may see the same pattern with social media. For instance the American Academy of Pediatrics claims for a ‘Facebook Depression’ already have been found to be false by independent scholarly review.

“New research is increasingly demonstrating that fears of violent video games leading to aggression were largely unfounded. Youth today are the least aggressive, most civically involved, and mentally well in several generations. Independent reviews of the literature by the US Supreme Court and the Australian Government have concluded the research does not support links between new technology and harm to minors. I think on balance we’ll eventually accept that new media are generally a positive in our lives.”

One anonymous respondent noted that it is human to take the easy path, writing: “Learning requires three key underlying skill sets—patience, curiosity, and a willingness to question assumptions. Unfortunately, the internet can tend to give answers too quickly and make people think they are experts simply because they can access anything and everything immediately. Ensuring that youth understand that really understanding something requires lots of time and substantial amounts of thinking and questioning is going to be a challenge.”

Another anonymous respondent added that the easy path generally leads to entertainment more often than education or enlightenment: “We are already beginning to see the short attention spans people have as well as their lack of overall knowledge about their world and local context. Just consider the dreadful state of political dialog in this country today. People are distracted from deep engagement and are solely interested in being entertained, most often by viewing the misfortune of others.”

Several survey participants noted that basic human responses are being leveraged to advantage by marketers tapping into human tendencies. “There are evolutionary traits and preferences that are hard-wired in, and that’s where the danger lies—not in teenagers wasting their time writing SMSs rather than novels for the ages, but in marketers’ ever-increasing ability to tap in to addictive and deep-seated psychological traits that are common to all of us, to convince us to play just one more round of Angry Birds, or have just one more scoop of salted-caramel ice cream,” wrote an anonymous respondent. “The pervasive network allows people to build more quickly on the foundations laid by their predecessors, but it also allows more efficient delivery of increasingly-addictive media that caters to our troop-of-apes-on-the-savannah social needs for popularity and attention.”

Human nature, one anonymous respondent noted, has its sunny side and its dark side: “Those who are interested, driven, engaged, excited about learning will learn, grow, and develop—for its own sake. Those who are not, will not; they’ll party, they’ll coast, and they’ll become investment bankers.”

There were those who expressed optimism about human nature and the days ahead. An anonymous respondent wrote, “Our surrounding world is developing and changing, and teens, youth, and children are going to be leading the way through the new world just like they always have.” Another added, “I am an optimist with faith in the deeper motivations of our species to learn, acquire understanding, and be challenged.” And another added: “People will always want the same things—sex, power, affection, fulfillment, etc.—and they will use technologies as they always have, to seek out more of the things they want, which intrinsically involve interacting with other people. Ask a geeky friendless kid in small-town America 40 years ago if he’d like to have some way of communicating with people who appreciate him.”

Richard Titus, a venture capitalist based in London and San Francisco, said the construction of strong social and moral frameworks is necessary for positive evolution. “The idea in the 1960s of unstructured, unguided, collaborative contribution was considered anathema, yet it brought us one of the most important human inventions, the internet, unimaginable within the previous mental model,” he wrote. “The most important thing to bring a positive vision of 2020 is to steer the next generation towards results—meaningful, measurable results, with less focus on how they are arrived at—and to build stronger social, moral frameworks to replace those roles previously held by power structures which relied on the previous models.”

The most-desired skills of 2020 will be…

Survey respondents say there’s still value to be found in traditional skills but new items are being added to the menu of most-desired capabilities. “Internet literacy” was mentioned by many people. The concept generally refers to the ability to search effectively for information online and to be able to discern the quality and veracity of the information one finds and then communicate these findings well.

David D. Burstein, a student at New York University and author of Fast Future: How the Millennial Generation is Remaking Our World, noted, “A focus on nostalgia for print materials, penmanship, and analog clock reading skills will disappear as Millennials and the generation that follows us will redefine valued skills, which will likely include Internet literacy, how to mine information, how to read online, etc.”

Collective intelligence, crowd-sourcing, smart mobs, and the “global brain” are some of the descriptive phrases tied to humans’ working together to accomplish things in a collaborative manner online. Internet researcher and software designer Fred Stutzman said the future is bright for people who take advantage of their ability to work cooperatively through networked communication. “The sharing, tweeting, and status updating of today are preparing us for a future of ad hoc, always-on collaboration,” he wrote. “The skills being honed on social networks today will be critical tomorrow, as work will be dominated by fast-moving, geographically diverse, free-agent teams of workers connected via socially mediating technologies.”

Frank Odasz, a consultant and speaker on 21st century workforce readiness, rural e-work and telework, and online learning, said digital tools are allowing human networks to accelerate intelligence. “Because everything is becoming integrated and interrelated, youth’s abilities for expansive thinking and public problem solving will dramatically increase,” he noted. “Youth are learning to focus on ‘what matters most,’ with an emphasis on leveraging social media as one’s personal learning network. Purposeful collaborative actions will leverage shared knowledge—if we all share what we know, we’ll all have access to all our knowledge. Peer-evaluated, crowd-accelerated innovation will be recognized as a new dynamic for our global hologram of shared imagination. Digital reputations will be judged by the level of leveraged meaningful activities one leads, and is directly involved in advocacy for. Just-in-time, inquiry-based learning dynamics will evolve along with recognition that the best innovations can be globally disseminated to billions in a day’s time.”

Jonathan Grudin, principal researcher at Microsoft, emphasized the critical thinking involved in analytical search processes. “The essential skills will be those of rapidly searching, browsing, assessing quality, and synthesizing the vast quantities of information that is available and is of importance or interest to each person. These skills were not absent before but were not needed when the available significant information was less, more heavily vetted in advance, and more difficult to access. In contrast, the ability to read one thing and think hard about it for hours will not be of no consequence, but it will be of far less consequence for most people.”

Jeffrey Alexander, senior science and technology policy analyst at SRI International’s Center for Science, Technology & Economic Development, said, “As technological and organizational innovation comes to depend on integrating and reconfiguring existing and new knowledge to solve problems, digital and computational thinking will become more and more valuable and useful. While digital thinking may lead to excessive multitasking and a reduction in attention span, the human brain can adapt to this new pattern in stimuli and can compensate for the problems that the pattern may cause in the long run. Online and digital interaction will make new forms of expression more important in social relationships, so that there is less emphasis on superficial attributes and more value placed on meaningful expression and originality of ideas.”

“These two modes of thinking (rapid information gathering vs. slower information processing and critical analysis) represent two different cultures, each with its own value system,” maintained Patrick Tucker, deputy editor of The Futurist magazine. “They can work together and complement one another but only with effort on the part of both sides. Ideally, internet users across age groups take the time to develop critical thinking ability. We value these too cheaply today. The internet, in its very nature, pushes and encourages feral information gathering, so no special training or attention is really required to instruct the ‘over 35’ set how to find what they want online quickly. The premise of the question, thus, is flawed. On the contrary, some of the best content aggregation out there is done by baby boomers. Quick pattern recognition and extrapolation is a natural mental state. The ability to focus, to analyze critically, these require learning and practice.”

An anonymous survey respondent said talented people will have the ability to work with people on both sides of the technology divide: “There is too much of a gap between the ‘people in charge’ and the ‘wired kids,’ leaving too much room for miscommunication and inevitable friction. In 2020, I would imagine that the most highly valued intellectual and personal skills will be the ability to exist in both of these spaces.”

P.F. Anderson, emerging technologies librarian at the University of Michigan-Ann Arbor, suggested that it’s not just the new-age literacies that should be emphasized, writing: “Have young folk practice rapid retrieval skills alongside quiet time, personal insight, attention to detail, memory. Develop the skills to function well both unplugged and plugged-in.”

Gina Maranto, a co-director in the University of Miami’s graduate program, said ways of thinking to serve the common good will be of the greatest benefit. “Probably the most highly valued personal skills,” she wrote, “will be cosmopolitanism, in the way philosopher Kwame Appiah conceives it—the ability to listen to and accommodate to others—and communitarianism, in the way sociologist Amitai Etzioni has outlined—an awareness that there must be a balance between individual rights and social goods.”

Tom Hood, CEO of the Maryland Association of CPAs, shared feedback from hundreds of grassroots members of the CPA profession, who weighed in on the critical skills for the future in the CPA Horizons 2025 report and arrived at these:

  • Strategic thinking—being flexible and future-minded, thinking critically and creatively.
  • Synthesizing—the ability to gather information from many sources and relate it to a big picture.
  • Networking and collaboration—understanding the value of human networks and how to collaborate across them.
  • Leadership and communications—the ability to make meaning and mobilize people to action and make your thinking visible to others.
  • Technological savvy—proficiency in the application of technology.

Barry Chudakov, a research fellow in the McLuhan Program in Culture and Technology at the University of Toronto, said the challenge we’re facing is maintaining and deepening “integrity, the state of being whole and undivided,” noting: “There will be a premium on the skill of maintaining presence, of mindfulness, of awareness in the face of persistent and pervasive tool extensions and incursions into our lives. Is this my intention, or is the tool inciting me to feel and think this way? That question, more than multitasking or brain atrophy due to accessing collective intelligence via the internet, will be the challenge of the future.”

An anonymous respondent noted, “The ability to concentrate, focus, and distinguish between noise and the message in the ever-growing ocean of information will be the distinguishing factor between leaders and followers.”

It is difficult to tell what we will see by 2020, as people and tools evolve

Duane Degler, principal consultant at Design for Context, a designer of large-scale search facilities and interactive applications for clients such as the National Archives and Verisign, said we’re already witnessing a difference in cognitive abilities and perceptions dependent upon the information/communication tools people are using, and not just among the under-35 set. “One thing these scenarios don’t speak to,” he noted, “is the degree to which the tools themselves are likely to recede further into the background, where they become a part of a fabric for how people carry out tasks and communicate. This is likely to be a result of both technology (pervasive computing, context-aware interactions) and a settling in of personal/social habits. As a result, the dominant social and information behaviors are likely to be influenced by other factors that we can’t yet see, in the same way current social and information behaviors are now being influenced by capabilities that are predominantly five years (or at most ten years) old.”

Pamela Rutledge, director of the Media Psychology Research Center at Fielding Graduate University, says this evolution is creating a new approach to thinking. “The new ‘wiring’ creates the ability to be fluid in adapting to change,” she explained. “Experience with rapidly changing technologies, gaming environments, user interfaces, and environmental impact have established a new approach to thinking where ‘how things are supposed to be’ is a changing rather than fixed understanding. More importantly, the ability to act and interact, to synthesize and connect, can radically change an individual’s sense of agency. There is a new assumption about participation. It is not just the expectation to participate that we talk about in convergence culture; it is the belief that each person can participate in a meaningful way. Beliefs of agency and competence fuel intrinsic motivation, resilience, and engagement.

New York-based technology and communications consultant Stowe Boyd noted, “Our society’s concern with the supposed negative impacts of the internet will seem very old-fashioned in a decade, like Socrates bemoaning the downside of written language, or the 1950s fears about Elvis Presley’s rock-and-roll gyrations. As the internet becomes a part of everything, like electricity has today, we will hardly notice it: it won’t be ‘technology’ anymore, but just ‘the world.’”

Richard Lowenberg, director and broadband planner for the 1st-Mile Institute, said many complexities lie ahead.

“Though young people may or may not be wired differently, there is too much hype associated with such evolutionary changes, and not enough attention paid to the dynamically complex issues that provide context for such generational changes,” he wrote. “Major forces that affect how we are ‘wired’—and how we evolve in hopefully healthy ways—include the implications of: family life; the health and demosophia [wisdom of the people] of societies; technological consumerism as driving influence; failing economic infrastructure and understandings which do not account for the necessary balance between competition and cooperation; and our largely misdirected educational systems, which do not foster lifelong learning and an ecology of mind. Without positive outcomes in these and more, we will be caught up in the tensions and disruptions of technology-mediated imbalances brought on by greater noise-to-signal among more than 7 billion people worldwide.”

And an anonymous respondent shares a ray of hope: “Today’s Internet engineers and developers are about as brilliant as they come; it is my opinion that all these different kinds of brilliant minds will fuse the old-fashioned and new-fashioned ways of thinking into one extremely powerful and advanced future generation. I have faith in the ways that educators, innovators, engineers, developers, mentors, etc., will compensate.”

Methodology

Respondents to the Future of the Internet V survey, fielded from August 28 to October 31, 2011, were asked to consider the future of the Internet-connected world between now and 2020. They assessed eight different “tension pairs,” each offering two scenarios on the same overall theme but with opposite outcomes that might emerge by 2020, and selected the statement they considered more likely. The tension pairs were constructed to reflect emerging debates about the impact of the Internet, distilling statements made by pundits, scholars, and technology analysts about its likely evolution; they were reviewed and edited by the Pew Internet Advisory Board.

The survey results are based on a non-random online sample of 1,021 Internet experts and other Internet users, recruited via email invitation, conference invitation, or link shared on Twitter, Google Plus or Facebook by Elon University’s Imagining the Internet Center and the Pew Research Center’s Internet & American Life Project.

Since the data are based on a non-random sample, a margin of error cannot be computed, and the results are not projectable to any population other than the people participating in this sample. The study is a “snapshot” of current opinion, informed by what is known now, about what might take place by 2020. Such snapshots help illuminate emerging issues and concerns and the need to address them.

Survey participants were asked to choose one of the two scenarios in each pair and then to explain their choice; those narrative elaborations provide the core of our reports.

While most respondents agreed with the statement that the future for the hyperconnected will generally be positive, many who chose that view noted that it is more a hope than a firm prediction, and a number said the true outcome will be a combination of both scenarios. The statistical outcome of respondent choices is not a firm measure, but…

55% agreed with the statement:

“In 2020 the brains of multitasking teens and young adults are ‘wired’ differently from those over age 35 and overall it yields helpful results. They do not suffer notable cognitive shortcomings as they multitask and cycle quickly through personal- and work-related tasks. Rather, they are learning more and they are adept at finding answers to deep questions, in part because they can search effectively and access collective intelligence via the Internet. In sum, the changes in learning behavior and cognition among the young generally produce positive outcomes.”

42% agreed with the statement:

“In 2020, the brains of multitasking teens and young adults are ‘wired’ differently from those over age 35 and overall it yields baleful results. They do not retain information; they spend most of their energy sharing short social messages, being entertained, and being distracted away from deep engagement with people and knowledge. They lack deep-thinking capabilities; they lack face-to-face social skills; they depend in unhealthy ways on the Internet and mobile devices to function. In sum, the changes in behavior and cognition among the young are generally negative outcomes.”


A selection of quote excerpts from the thousands of predictions about the teens-to-20s age group and the human impact of people’s uses of the Internet by 2020:

“There is no doubt that brains are being rewired.” —danah boyd, Microsoft Research

“We will see significant, positive, and even astounding improvements in the cognitive abilities of young people.” —Dave Rogers, Yahoo Kids

“Work will be dominated by fast-moving, geographically diverse, free-agent teams of workers connected via socially mediating technologies.” —Fred Stutzman, Carnegie Mellon

“The replacement of memorization by analysis will be the biggest boon to society since the coming of mass literacy.” —Paul Jones, UNC-Chapel Hill

“When these young people remake our educational institutions…a greater amount of information will be used to produce positive outcomes for society.” —Morley Winograd, co-author of Millennial Momentum

“Teens find distraction…[we will need] silence zones, time-out zones, meditation classes without mobile, lessons in ignoring people.” —Marcel Bullinga, futurist

“Society is becoming conditioned into dependence on technology in ways that, if that technology suddenly disappears or breaks down, will render people functionally useless.” —Richard Forno, cybersecurity expert

“Short attention spans resulting from quick interactions will be detrimental to focusing on the harder problems and we will probably see a stagnation in many areas.” —Alvaro Retana, distinguished technologist, HP

“[When] the social currency of being able to say ‘I was there first’ rises, we will naturally devalue retrospective reflection and the wisdom it imparts.” —Stephen Masiclat, Syracuse University

“How can we help today’s kids to prepare for that world—the world they will actually live in and help to create—instead of the world we are already nostalgic for?” —Alexandra Samuel, Social + Interactive Media Centre

“Digital reputations will be judged by the level of leveraged meaningful activities one leads, and one is directly involved in advocacy for. Just-in-time, inquiry-based learning dynamics will evolve.” —Frank Odasz, expert on 21st century workforce

“Each new advance in knowledge and technology represents an increase in power, and the corresponding moral choices that go with that power.” —Martin Owens Jr., Internet law attorney

“Creativity, demand for high stimulus, rapidly changing environments, and high agency (high touch) will be what makes the next revolution of workers for jobs they will invent themselves, changing our culture entirely at a pace that will leave many who choose not to evolve in the dust.” —Alan Bachers, director, Neurofeedback Foundation

“High activity in online environments, particularly games, expends any political will or desire to effectively shape the environment so that there is none of that will left for engaging in our actual political environment.” —Sandra Braman, University of Wisconsin-Milwaukee

“Centralized powers that can control access to the Internet will be able to significantly control future generations…Future regimes may use control of access to the Internet to shape and limit thought.” —Paul Gardner-Stephen, director, Serval Project

“I have hope for improved collaboration from these new differently ‘wired’ brains, for these teens and young adults are learning in online environments where working together and developing team skills allows them to advance.” —Perry Hewitt, Harvard University

“We will renorm to the new tools. We have always had mall rats and we’ve had explorers. Ideally, people will improve their critical thinking skills to use the available raw information. More likely, fads will continue.” —Bob Frankston, computing pioneer and ACM Fellow

“Whatever happens, we won’t be able to come up with an impartial value judgment because the change in intellect will bring about a change in values as well.” —David Weinberger, Harvard Library Innovation Lab

A selection of anonymous comments:

“I wonder if we will even be able to sustain attention on one thing for a few hours—going to a classical concert or film, for instance. Will concerts be reduced to 30 minutes? Will feature-length films become anachronistic?”

“With deregulation, consolidation of media ownership and control, and the acceptance of capitalism as natural and inevitable, learning styles and attention spans are headed toward the inability to think critically. Trends in education, social activities, and entertainment all make more likely a future of passive consumers of information.”

“Popular tools allow us to move at a pace that reinforces rapid cognition rather than more reflective and long-term analysis. I fear that market forces and draconian policies will drive the technology/media interface.”

“We have landed in an electronics age where communications technologies are evolving much more quickly than the minds that are producing them and the social structures that must support them. We are not taking the time to evaluate or understand these technologies, and we aren’t having serious conversations about what effects these new tools have on us.”

“Discussions based around Internet content will tend to be pithy, opinion-based, and often only shared using social media with those who will buttress—rather than challenge—political, ideological, or artistic beliefs.”

“Increasingly, teens and young adults rely on the first bit of information they find on a topic, assuming that they have found the ‘right’ answer, rather than using context and vetting/questioning the sources of information to gain a holistic view of a topic.”

“My friends are less interested in genuine human interaction than they are in looking at things on Facebook. People will always use a crutch when they can, and the distraction will only grow in the future.”

“Parents and kids will spend less time developing meaningful and bonded relationships in deference to the pursuit and processing of more and more segmented information competing for space in their heads, slowly changing their connection to humanity.”

“How/why should we expect the next generation to be ‘different’ (implication = more evolved/better) when they’re raised in a culture increasingly focused on instant gratification with as little effort as possible?”

“It’s simply not possible to discuss, let alone form societal consensus around, major problems without lengthy, messy conversations about those problems. A generation that expects to spend 140 or fewer characters on a topic and rejects nuance is incapable of tackling these problems.”

“Why are we creating a multitasking world for ADD kids? The effects will be more telling than just the Twitterfication of that generation. There have been articles written about how they’re losing their sense of direction (who needs bearings when you have Google Maps or a GPS?). Who needs original research when you have Wikipedia?”

“Human society has always required communication. Innovation and value creation come from deeper interaction than tweets and social media postings. Deeper engagement has allowed creative men and women to solve problems. If Thomas Edison focused on short bursts of energy, I doubt he would have worked toward the creation of the light bulb.”

“Fast-twitch wiring among today’s youth generally leads to more harm than good. Much of the communication and media consumed in an ‘always-on’ environment is mind-numbing chatter. While we may see increases in productivity, I question the value of what is produced.”

“There is less time for problems to be worked out, whether they are of a personal, political, economic, or environmental nature. When you (individual or collective) screw up (pollute, start a war, act in a selfish way, or commit a sexual indiscretion as a public person) everyone either knows very quickly or your actions affect many people in ways that are irreversible.”

“Long-form cognition and offline contemplative time will start to be viewed as valuable and will be re-integrated into social and work life in interesting and surprising ways.”