Kutcher Punk’d Wozniak and Brogrammed “Jobs” to Hide the Women

Katherine’s analysis of the movie Jobs is published in the online film journal Bright Lights. Taxonomy is about seeing the relationships between different elements of the material being organized. That’s what I did for this article: I paid attention to the details and found a very different movie. Here’s the article’s first paragraph.

“Ashton Kutcher, producer and star of MTV’s celebrity prankster series Punk’d, played a prank on Steve Wozniak in the film Jobs, where he stars in the title role. The prank symbolically places Woz in a female role and gives primary engineering credit to early Apple employee Rod Holt. Kutcher’s motive may have been Wozniak’s agreement to help Aaron Sorkin with another Jobs biopic and his refusal to work with Kutcher and director Joshua Michael Stern on this one. With Woz as the feminine lead, Kutcher and Stern hide Apple’s female innovators and trivialize the women in Jobs’ life using brogramming techniques that discourage women from entering the technology field.”


14 Apr 2014

Expanding the Metaphor

The metaphor of digital natives and digital immigrants was coined by educator Marc Prensky (2001) to differentiate between students who were born into and grew up with the Internet and their teachers, who encountered the Internet as adults. The metaphor evokes immigrants who move to a new country and try to assimilate but still speak with an accent. Their children, born in the new country, assimilate naturally and speak the language as their native tongue. Prensky provides a few examples of this pre-Internet accent, such as printing emails before reading them (p. 2). Most, if not all, of his accent examples read more like unfamiliarity with new technology in 2001 than like a lingering accent. I don’t know of any digital immigrants who print emails, at least not since 1995.

There are other objections to this metaphor. For one, it doesn’t address access to technology, which adds socio-economic status into the mix (Bennett, Maton, & Kervin, 2008, p. 778). You can’t be a digital native if your family can’t afford to buy the digits. David Weinberger (2008) offered another objection in a KMWorld column. He doesn’t consider himself a digital immigrant, given his lengthy computing history and superior computing skills. So he changed the metaphor from Ellis Island to post-Revolution America and defined himself as a digital settler – not born in the country, but an early and skilled resident.

I’d like to expand that metaphor a little further, with settlers preceded by explorers and pioneers. Like Lewis and Clark, digital explorers forged their way into new territory, blazing trails of hardware and software. Pioneers followed, finding new ways to use the technology. Settlers liked what they saw and joined in. They, in turn, were followed by immigrants and their children, the digital natives.

This metaphor doesn’t take into account Native Americans who were already on the land when the European explorers arrived. But in the digital frontier, the land was not already in existence, waiting to be stolen. It was constructed by digital explorers and pioneers who sold their ideas to eager settlers, immigrants and eventual natives.

References

Bennett, S., Maton, K., & Kervin, L. (2008). The ‘digital natives’ debate: A critical review of the evidence. British Journal of Educational Technology, 39(5), 775-786. Retrieved January 15, 2009, from First Search WilsonSelectPlus database.

Prensky, M.  (2001, October).  Digital natives, digital immigrants.  On the Horizon, 9(5), 1-6.  Retrieved January 5, 2009, from http://www.marcprensky.com/writing

Weinberger, D. (2008, January). Digital natives, immigrants and others. KMWorld, 17(1). Retrieved January 15, 2009, from http://www.kmworld.com


06 Jan 2009
Posted in Technology

Wikipedia as a Research Tool

Frank Luntz, the conservative pollster who coined the phrase “death tax” to replace “estate tax,” cites Wikipedia frequently in his book Words That Work: It’s Not What You Say, It’s What People Hear.  An odd reference choice, since one rarely sees citations for encyclopedia entries.  Wikipedia has the additional drawbacks of being entirely written by users, with unsigned articles that can be changed by anyone at any time. 

He cites Wikipedia as the source for a quote from Lyndon Johnson’s 1964 anti-Goldwater “Daisy” commercial (pp. 123, 300). Wikipedia currently provides a link to the original 30-second ad. Of course these entries change all the time, but according to the Internet Archive’s Wayback Machine, the link first appeared three years before Luntz published his book. For an investment of 30 seconds, he could have cited the primary source. But he didn’t bother with that. Instead he relied on an encyclopedia that can be modified by anyone on the Internet.
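The Wayback Machine check Luntz skipped takes only a few lines. As a rough sketch (the Archive’s CDX query endpoint is a real, documented service; the function names here are mine, and the URL you pass in would be whatever page you want to date), a Python snippet can ask when a page was first captured:

```python
# Sketch: asking the Internet Archive's CDX API for the earliest
# capture of a URL. Timestamps come back as 'YYYYMMDDhhmmss' strings.
import json
from urllib.request import urlopen

def cdx_year(timestamp: str) -> int:
    """Extract the year from a CDX timestamp like '20031015123456'."""
    return int(timestamp[:4])

def first_capture_year(url: str) -> int:
    """Year of the earliest Wayback Machine capture of `url`."""
    query = ("http://web.archive.org/cdx/search/cdx"
             f"?url={url}&output=json&limit=1")
    with urlopen(query) as response:
        rows = json.load(response)
    # rows[0] is the header row; rows[1] is the earliest capture,
    # with the timestamp in column 1.
    return cdx_year(rows[1][1])
```

Running `first_capture_year` on the Wikipedia page in question would return the year of its earliest archived snapshot, which is how you confirm whether a link predates a book.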

I’m a big fan of Wikipedia. It’s always my first step when embarking on a new research project. With a few caveats, I have convinced friends and colleagues that it is reliable. Because author groups tend to form around one knowledge area, the entries are usually accurate and usually meet the Wikipedia standard of NPOV (neutral point of view).

Anyone can write a Wikipedia entry, so it has lots of information about obscure topics. I recently looked up a major rock band, the Doobie Brothers, which led me to their producer Ted Templeman and then to his 1960s band Harpers Bizarre, who had covered a song by Cole Porter. In a print encyclopedia, I would have to look up each item in its respective volume. In Wikipedia, the links do the looking up for me.

Of course, in a traditional encyclopedia, and in the online Encyclopedia Britannica, which requires a subscription, only Cole Porter has his own article. The Doobie Brothers and Ted Templeman are both mentioned in an entry about Warner/Reprise Records. A search for Harpers Bizarre only returns an article about Diana Vreeland, fashion editor for Harper’s Bazaar, the magazine whose name the band parodied. In Wikipedia, even “Anything Goes,” the title song from Porter’s 1930s musical, has its own separate entry. Harpers Bizarre covered it in 1967. “In olden days a glimpse of stocking was looked on as something shocking, now heaven knows, anything goes.” It fit right in with the Haight-Ashbury scene.

This depth of information is achieved because someone out there knows a lot about Harpers Bizarre.  Wikipedia gives that person, and anyone else with a knowledge niche, a forum to anonymously write about their favorite topic.  If a group forms around that knowledge area, refining and evolving the article, accuracy is achieved, along with the Wikipedia gold standard of NPOV, because the group moderates itself. 

For this reason, the accuracy level tends to be about the same as traditional encyclopedias. This was tested in 2005, when Nature magazine found that the average Wikipedia science article contained four errors, while the average Britannica science article contained three (“Internet Encyclopedias Go Head to Head,” 12/15/05, pp. 900-901). Wikipedia has the advantage here because it can correct errors immediately, while the hard-copy Britannica has to wait for the next printing.

But Wikipedia’s strength is also its weakness. It remains an intellectual graffiti wall where errors linger until someone bothers to paint them over. Accuracy and NPOV are achieved through the group. Highly visible topics more readily meet the standard than less popular topics. If someone writes an inaccurate article and no one reads it or bothers to change it, then the errors stand. In 2005, prior to the Nature study, a jokester modified journalist John Seigenthaler’s Wikipedia biography to imply that he participated in the assassination of Robert Kennedy. Seigenthaler, a pallbearer at Kennedy’s funeral, discovered the error four months after it appeared.

I found an error myself this year while researching the French Revolution.  One Wikipedia article stated that Marie Antoinette’s brother was a pope.  Her brother Leopold II was the Holy Roman Emperor, ruler of a lot of Germanic territories and a major player in the wars of the French Revolution, but, despite the title, not a pope.  I assume the error was quickly corrected, but it was there when I happened to be reading.  If I had been new to the topic, perhaps a 7th grader, I might have believed it. (Sorry, no links.  This is a memory I hadn’t expected to use in an article.) 

As a reader, you do not know if a Wikipedia fact has been modified by a confused researcher or by someone just having a little fun.  Of course you can look at the editorial conversations that are generally available to all, but it would be impossible to vet every single fact.  So I don’t rely on Wikipedia for accuracy and I have not yet used it as a reference, although I do frequently link to it in these blog postings to provide more information. 

And now here’s the real reason I like Wikipedia – lots and lots of footnotes, references and links to more information – much more than a print encyclopedia because Wikipedia is not constrained by space.  Also it needs to prove its accuracy, so it places a high premium on documentation.  I knew a link to the original “Daisy” ad would be on that Wikipedia page, because that’s how Wikipedia operates.  If it can point you to the primary source, it will do so. 

I use Wikipedia the way you’re supposed to use any encyclopedia: as a starting point. It gives me an overview that is more likely than not to be accurate, and it gives me lots of resources for more information. Those resources are signed, they have references of their own, and it was Wikipedia that set me off on a new knowledge hunt.


02 Dec 2008
Posted in Technology

The Quest for Knowledge

Wired has declared the end of science, at least that’s what the cover of the July issue says. The graphic is a quaint still life with symbols of the old knowledge – analog tools, ledgers, books – although no print magazines appear among these images of the olden days. On hard-copy versions, the ubiquitous card catalog is front and center. It always seems to appear when millenarians want to display the horrors of a pre-digital era.

Of course, Wired contradicts itself even on the cover.  The subtitle to “The End of Science” is “The quest for knowledge used to begin with grand theories.  Now it begins with massive amounts of data.”  In other words, massive data is not the end of science.  It is a new method for conducting science.  The result is still a quest for knowledge.   

Inside the magazine, the article title is a little calmer – “The End of Theory:  Scientists have always relied on hypotheses and experimentation.  Now, in the era of massive data, there’s a better way.”  Again, massive data is not the end of science, but “a better way” for scientists to do their work. 

Throughout the article, author and Wired editor-in-chief Chris Anderson pulls back from his “End of Science” headline.  “The scientific method is built around testable hypotheses . . . . This is the way science has worked for hundreds of years.”  Only hundreds?  We’ve been questing for knowledge for thousands of years.  Now, he tells us, “faced with massive data, this approach to science – hypothesize, model, test – is becoming obsolete.”  So Anderson is forecasting the end of a method that has been used for a few hundred years, not the end of science itself.

And here we get back to the millenarian viewpoint.  Anderson uses his cover to tell us the sky is falling.  Wow, things are changing so much, we won’t even have science any more.  He’s a magazine huckster, drawing us in with a dramatic statement he doesn’t even attempt to prove, with those big letters sitting just below the card catalog, saying loud and clear: The End of Science.


22 Jul 2008
Posted in Technology

Big Brother at Wired

Chris Anderson, editor in chief of Wired, has an article in the July issue, “The End of Theory,” about massive computing and changes he foresees in the scientific method. It’s part of a group of articles titled, on the cover, “The End of Science.” His third paragraph begins, “Sixty years ago, digital computers made information readable.” I wasn’t around then, but I’m pretty sure my parents were reading information before 1948. Actually, humans have been reading information since the invention of alphabets about 5000 years ago. Perhaps he meant machine readable, a true statement. But he didn’t say that, and by eliminating the crucial adjective, he indulged in a little millenarian hyperbole.

Millenarians believe their time on earth is the most important time in all of history. Everything before was dark ignorance; everything after will be brilliant. They show up during major social upheavals. Millenarians in the 18th century believed the American and French Revolutions would lead to a Golden Age of global freedom, world peace, and for some, Armageddon. Current Web millenarians believe Internet connectivity will significantly change every aspect of our lives and thus lead to a Golden Age of global freedom, world peace, and for some, the Singularity, a non-religious word for Armageddon.

One technique for promoting a Golden Age is to imply that nothing much happened before the current era. That’s what Anderson is doing when he assigns the invention of reading to computers. That’s what Big Brother did in George Orwell’s Nineteen Eighty-Four:

It was always difficult to determine the age of a London building. Anything large and impressive, if it was reasonably new in appearance, was automatically claimed as having been built since the Revolution, while anything that was obviously of earlier date was ascribed to some dim period called the Middle Ages. The centuries of capitalism were held to have produced nothing of any value. One could not learn history from architecture any more than one could learn it from books. Statues, inscriptions, memorial stones, the names of streets – anything that might throw light upon the past had been systematically altered. (Part I, Chapter VIII)

Nineteen Eighty-Four, of course, is fiction, written in 1949, just one year after the invention of reading, according to Anderson’s theory. That was also the year China became a People’s Republic. Two decades later, Mao Zedong’s Cultural Revolution replicated Orwell’s fictional end of historic truth with the systematic destruction of the four olds: old culture, old customs, old habits and old ideas. Instead of love, peace and happiness, their ’60s generation rampaged through the country, burning art and artifacts, and humiliating or murdering anyone who objected.

The technique worked. In a recent All Things Considered series on China, NPR reporters described Narrow Alley, an area of historic homes that was torn down and is now being rebuilt in the historic style as a tourist shopping center. These homes were not renovated. They were torn down and rebuilt. Anthony Kuhn, NPR’s China-based reporter, commented that the Chinese have a “preoccupation with newness. A feeling that old things are just not worth saving.”

The middle-aged Red Guards are now the parents and leaders of China. They spent their formative years burning history and destroying anyone who honored the past. It’s survival of the fittest, I suppose. If we let Chris Anderson tell us now that reading didn’t exist before computers, sometime in the future we may actually believe it.


05 Jul 2008
Posted in Technology