Just because something appears on a computer screen doesn’t mean it’s true. Well, sure, who can disagree with that? But from my earliest introduction to the Internet — years ago when the real inventor of Trinity’s first email system, Jim Heynderixx (his actual job was as the first director of the Writing Center, but Jim’s gifts went well beyond paragraphs), tried to explain bulletin boards and the alt.newsgroup concept to me — I pressed the question: how do we teach students to differentiate between truth and fiction when the medium itself can seem to take on the authority of an oracle?
We live on the frontier of the Information Age, a vast terrain called cyberspace where the rules of engagement are loose and vague, morphing to suit the gatherings of posses and pioneers camping out on MySpace or Craigslist or YouTube or BoingBoing or among the denizens of Wikipedia and the blogosphere, all tracing maps through Google or Yahoo to sort through the endless stream of information and chatter, sound and light and boundless imagery — some good and true, some simply wasteful and vapid, some invented and fictitious, some outright lies. Centuries from now people will probably laugh at our quaint customs in what is really still the first full decade of pervasive online life.
How do we teach students to distinguish between truth and fiction? Of course, this is the central question of all education through the ages. When the medium was simply speech (think: Socrates asking rhetorical questions under that tree), students still had to learn to sort out truth and fiction — hence the “Socratic Method” with its rigorous internal logic exercises. Then the printed word emerged to supplement the lecture, and from the Middle Ages onward academics invented elaborate systems and rules to ensure rigor in the search for Truth. For most collegiate faculty today, as in the past, popular print media (newspapers, magazines) are rarely acceptable as primary sources, and an encyclopedia almost never is. Original scholarly texts, presented in appropriate citation format, are the first, and sometimes only, acceptable sources for coursework.
Don’t get me wrong, I’m not an anti-Internet Luddite — heck, I have a blog!
But two related opinion columns in today’s Outlook section of the Washington Post really caught my attention. One of the pieces, titled “Cut and Paste is a Skill, Too,” presented a stunningly irresponsible case for tolerating plagiarism and eliminating the writing of term papers as a means of assessing student performance. I spent too much time wondering what got into the Post editors who allowed this ridiculous piece into the paper. The writer (did Jason Johnson actually write the column, or did he just cut and paste it from some sophomore’s dream of a world without plagiarism consequences, a world without the need to demonstrate any ability to write something longer than an instant message?) manifested a remarkable disrespect for the ability to conceptualize and write a lucid text presenting the student’s own analysis of facts and opinions. The writer argues that we academics should simply give in to the prevalence of plagiarism and find methods other than term papers to assess student knowledge, reasoning and writing abilities (well, he pretty much dismissed writing abilities as relevant).
I wonder who will be able to hire his successors if we ditch writing as a primary academic skill. What will happen to the valid authority of any piece of writing if we ignore the fundamental ethics of citation, if we give plagiarism a pass because it has become so hard to regulate?
Writing is essential, and plagiarism is unacceptable. We must not relent on those simple points. Trinity does not tolerate plagiarism; our Academic Honesty Policy is clear.
The second column, “Wikiality in My Classroom,” is worth reading and debating. A teacher at St. Albans, Jacqueline Hicks Grazette, comments on the challenges posed by new information sources on the Internet, notably, Wikipedia. Should Wikipedia be an acceptable source for student papers? Should Googling be an acceptable primary research method? These are movable feasts for academic argumentation — research without the Internet is now unthinkable, but how do we teach students to differentiate between the slick, easy answers (many of which are also quite wrong) and the hard, deep research into original sources that also requires considerable critical analysis? I applaud this teacher’s example of a good use of a fine Internet resource — she expects her students to read Supreme Court opinions as part of her American History classes, but she also has them listen to oral arguments available on the Supreme Court’s OYEZ Project website.
The Wiki-Age does not require new rules, but rather, a reaffirmation of the essential rules of academic ethics. Academic research is a journey of exploration, and our job as teachers requires us to help students learn to see the Truth and recognize the fraud. The Internet is a great tool, now indispensable, but the fact that some information comes across a computer screen makes it no different from information that arrives in a book or newspaper, or over television or radio. Students must learn the art of critical analysis, one of the primary skills taught in higher education. That skill can only be demonstrated in written and oral communication — even science and mathematics require facility with words to analyze the problems, and even art and music rely on words for interpretation. Knowing how to use words well, how to communicate honestly and persuasively, how to produce the artifacts of human intelligence in text and symbol and image and sound — this is the true manifestation of higher learning.
What do you think about Wikipedia and other Internet research tools? What about Jason Johnson’s proposal that the only way to stop plagiarism is to stop assigning term papers? Send me your comments by clicking on the envelope below or send an email to email@example.com