TOO MUCH TECHNOLOGY?
Do you think that the more information managers receive, the better their decisions will be?
Well, think again. Most of us can no longer imagine the world without the
Internet and without our favorite gadgets, whether they’re iPads, smartphones,
laptops, or cell phones. However, although these devices have brought about a
new era of collaboration and communication, they also have introduced new
concerns about our relationship with technology.
Some researchers suggest that the Internet and other digital technologies are fundamentally changing the way we think, and not for the better. Is the Internet actually making us “dumber,” and have we reached a point where we have too much technology? Or does the Internet offer so many new opportunities to discover information that it’s actually making us “smarter”? And, by the way, how do we define “dumber” and “smarter” in an Internet age? Wait a second, you’re saying. How could this be?
The
Internet is an unprecedented source for acquiring and sharing all types of
information.
Creating
and disseminating media has never been easier.
Resources like Wikipedia and Google have helped to organize knowledge and make
that knowledge accessible to the world, and they would not have been possible
without the Internet. And other digital media technologies have become
indispensable parts of our lives. At first glance, it’s not clear how such
advancements could do anything but make us smarter.
In response to this argument, several authorities claim that making it possible for millions of people to create media (written blogs, photos, videos) has understandably lowered the quality of media.
Bloggers very rarely do original reporting or research, instead recycling material from professional sources. YouTube videos contributed by newcomers to the medium come nowhere near the quality of professional videos. Newspapers struggle to stay in business while bloggers provide free content of inconsistent quality.
But
similar warnings were issued in response to the
development of the printing press. As Gutenberg’s invention spread throughout
Europe, contemporary literature exploded in
popularity, and much of it was considered mediocre by intellectuals of the era.
But rather than being destroyed, it was simply in the early stages of
fundamental change. As people came to grips with the new technology and the new
norms governing it, literature, newspapers, scientific journals, fiction, and
non-fiction all began to contribute to the intellectual climate instead of
detracting from it. Today, we can’t imagine a world without print media.
Advocates
of digital media argue that history is bound to repeat itself as we gain
familiarity with the Internet and other newer technologies. The scientific
revolution was galvanized by peer review and collaboration enabled by the
printing press.
According
to many digital media supporters, the Internet
will usher in a similar revolution in publishing capability and collaboration,
and it will be a resounding success for society as a whole.
This may
all be true, but from a cognitive standpoint, the effects of the Internet and
other digital devices might not be so positive. New studies suggest that
digital technologies are damaging our ability to think clearly and focus.
Digital technology users develop an inevitable desire to multitask, doing
several things at once while using their devices.
Although
TV, the Internet, and video games are effective at developing our visual
processing ability, research suggests that they detract from our ability to
think deeply and retain information. It’s true that the Internet grants users
easy access to the world’s information, but the medium through which that
information is delivered is hurting our ability to think deeply and critically
about what we read and hear. You’d be “smarter” (in the sense of being able to
give an account of the content) by reading a book rather than viewing a video
on the same topic while texting with your friends.
Using
the Internet lends itself to multitasking. Pages are littered with hyperlinks
to other sites; tabbed browsing allows us to switch rapidly between two
windows; and we can surf the Web while watching TV, instant messaging friends,
or talking on the phone. But the constant distractions and disruptions that are
central to online experiences prevent our brains from creating the neural
connections that constitute full
understanding of a topic. Traditional print media, by contrast, make it easier to concentrate fully on the content, with fewer interruptions.
A recent study conducted by a team of researchers at Stanford found that multitaskers are not only more easily distracted, but are also surprisingly poor at multitasking compared with people who rarely multitask. The team also found that multitaskers receive a jolt of excitement when confronted with a new piece of information or a new call, message, or e-mail.
The
cellular structure of the brain is highly adaptable and adjusts to the tools we
use, so multitaskers quickly become dependent on the
excitement
they experience when confronted with something
new. This means that multitaskers continue to be easily distracted, even if
they’re totally unplugged from the devices they most often use.
Eyal
Ophir, a cognitive scientist on the research team at Stanford, devised a test
to measure this phenomenon. Subjects self-identifying as multitaskers were asked to keep track of red rectangles in a series of images. When blue rectangles were introduced, the multitaskers struggled to recognize whether the red rectangles had changed position from image to image.
Non-multitaskers significantly outperformed the multitaskers. Fewer than three percent of multitaskers (called “supertaskers”) are able to manage multiple information streams at once; for the vast majority of us, multitasking does not result in greater productivity.
Neuroscientist Michael Merzenich argues that our brains are being “massively remodeled” by our constant and ever-growing use of the Web. And it’s not just the Web that’s contributing to this trend. Our ability to focus is also being undermined by the constant distractions of smartphones and other digital technologies. Television and video games are no exception. Another study showed that when viewers were presented with two identical TV shows, one of which had a news crawl at the bottom, they retained much more information about the show without the news crawl. The impact of these technologies on children may be even greater than the impact on adults, because children’s brains are still developing and they already struggle to set proper priorities and resist impulses.
The
implications of recent research on the impact of
Web 2.0 “social” technologies for management decision
making are significant. As it turns out, the “always-connected”
harried executive scurrying through
airports and train stations, holding multiple voice
and text conversations with clients and co-workers
on sometimes several mobile devices, might
not be a very good decision maker. In fact, the quality
of decision making most likely falls as the quantity
of digital information increases through multiple
channels, and managers lose their critical
thinking capabilities. Likewise, in terms
of management productivity, studies of Internet use in the workplace
suggest that Web 2.0 social technologies offer
managers new opportunities to waste time rather
than focus on their responsibilities. Checked your
Facebook page today? Clearly we need to find out
more about the impacts of mobile and social technologies
on management work.
Sources: Randall Stross, “Computers at Home: Educational Hope vs. Teenage Reality,” The New York Times, July 9, 2010; Matt Richtel, “Hooked on Gadgets, and Paying a Mental Price,” The New York Times, June 6, 2010; Clay Shirky, “Does the Internet Make You Smarter?” The Wall Street Journal, June 4, 2010; Nicholas Carr, “Does the Internet Make You Dumber?” The Wall Street Journal, June 5, 2010; Ofer Malamud and Cristian Pop-Eleches, “Home Computer Use and the Development of Human Capital,” January 2010; and “Is Technology Producing a Decline in Critical Thinking and Analysis?” Science Daily, January 29, 2009.
CASE STUDY QUESTIONS
1. What are
some of the arguments for and against the use of digital media?
2. How might the brain be affected by constant digital media usage?
3. Do you
think these arguments outweigh the positives of digital media usage? Why or why
not?
4. What
additional concerns are there for children using digital media? Should children
under 8 use computers and cellphones? Why or why not?
MIS CASE 2
1.
· If we use digital media thoughtfully, we can become more aware of media developments and their usefulness, and can judge whether a given digital medium is appropriate for our needs.
· If we deny the existence of the digital media around us, I think we will have difficulty with any current activity that relates to digital media.
2. The cellular structure of the brain is highly adaptable and adjusts to the tools we use, so our brains are being “massively remodeled” by our constant and ever-growing use of the Web. And it’s not just the Web that’s contributing to this trend: our ability to focus is also being undermined by the constant distractions of smartphones and other digital technologies.
3. I think these arguments outweigh the positives of digital media usage, because they set the positive uses of digital media against the negative ones (weakened concentration and poorer decision making) and address how digital media should be used and at what age one should start using them.
4.
· The additional concern for children using digital media is that the negative effects on children may be greater than the impact on adults, because their brains are still developing and they already struggle to set proper priorities and resist impulses.
· We recommend that children under 8 years old be taught about computers and cellphones but not be allowed to use them freely, for the good of their development.