Sunday, 15 September 2013

The Multitasking Myth: Technology Use and Instructional Outcomes

By Brent Reed, Pharm.D., BCPS, Assistant Professor, University of Maryland School of Pharmacy

As you read this blog, how many technologies are competing for your attention? Perhaps your phone is sitting nearby, buzzing intermittently with the arrival of a new text message. Or a popup has just alerted you to the 17 new emails anxiously awaiting you in your inbox. Indeed, the never-ending competition for our attention has become almost ubiquitous. You can set "push" notifications for everything from up-to-the-minute scores of your favorite football team to the dessert photos your friend just posted to Instagram. The ability to manage these interruptions, often termed media multitasking, would seem to be the only way to survive in an increasingly technology-driven society. Or is it? A growing body of evidence suggests that multitasking is detrimental in many ways. Some researchers contend that humans are simply incapable of performing multiple cognitive tasks at once; what we perceive as multitasking is essentially rapid "task switching."1
For many young adults, especially those in the millennial generation, media multitasking is a way of life. In a 2010 survey of undergraduate students, Smith et al. found that 4 out of every 5 owned a laptop computer and nearly two-thirds owned a mobile device capable of accessing the Internet.2 The overwhelming majority of young adults consider themselves excellent multitaskers, yet studies indicate that the individuals who proclaim themselves the most capable are actually the worst at multitasking;3 so too are those who multitask most frequently.4 Nevertheless, the growing prevalence of technologies that enable media multitasking has had a significant impact on many areas of our lives, and the classroom and other learning environments are no exception.
