Monthly Archives: December 2011

Why I Travel (A historian’s perspective)

On the way down to the Smithsonian National Zoo’s holiday ZooLights festival, I had a conversation with my sister-in-law about why she dislikes zoos and aquariums (and, for similar reasons, the study of anthropology, but I disagree with her premises, so we’ll just skip that this time).  The gist of it was that she would rather see a giraffe in its natural environment than on display in a zoo.  In fact, she would simply rather experience the world than see it on display.

Roman road built in Ephesus

I understand her point.  I learned about the Roman Empire, watched documentaries about it, and visited Roman sculptures and pieces of edifices in museums, but touching a Roman wall in Regensburg, Germany, walking through the ruined Roman streets and porticos of Ephesus, and descending to the Roman foundations of Barcelona was an experience above and beyond anything I’d done previously.  (Ironically, I have yet to visit Rome.)  My interest and previous study in history, though, helped me better appreciate and understand the incredible sights I witnessed on my travels.  It was enriching and inspiring on so many levels.

As a historian, I want to travel to the places where “things happened” to see the lay of the land for myself–to observe the growth and the evolutions as much as the foundations.  Travel gives me new insights, inspires new understandings, and stimulates new questions.  Visiting Barcelona gave me whole new insights into Emperor Augustus’s plan for Spain, into the presence of the Carolingian Franks there, into the pilgrimage trails in northern Spain tied to Santiago de Compostela, and into the post-exploration Spanish world.  No way could I have gotten two thousand years of history in a week without visiting the city.

Lonely Planet used to have a bumper sticker that read, “DO SOMETHING GREAT FOR YOUR COUNTRY.  LEAVE.”  There are many good arguments for traveling–especially in our ever-shrinking world–but I recognize that not everyone has the opportunity or the inclination.  As for opportunity, I am grateful for the libraries and museums, zoos and aquariums that expose folks to the world beyond–many free of charge or for a small fee.  But a lack of inclination is our fault, collectively, as parents, educators, and society at large.  Even when we cannot afford to travel with our classes or families, we can challenge our kids to think about what lies beyond their small world, foment curiosity, and dare them to dream and plan to explore grand things and distant places.

Our world is enriched when our citizens are global, or can at least think globally, a mindset that is also necessary for a country of disparate immigrant populations living together.  Networks exist all over the world to bind us together and facilitate travel and experience.  We should be explorers, conquerors of challenging terrain, and eager lifelong learners.  Good education, and history education in particular, I believe, should facilitate the growth of such citizens.  My daughter should be excited to see a Roman ruin in person because she understands something of the vastness and greatness of the Roman Empire.  Her prior learning should enrich her personal experience, and her travel experiences should inspire more learning and curiosity.  That is the beauty of travel for me: it is touching history, on top of those old standby perks of cuisine, culture, music, architecture, art, new people, geography, and exotic animals.  I love travel because I love wonder and curiosity.


Filed under Travel

Writing Fiction as an Exercise in History Education

The literary world has much to offer the study of history.  While I do not mean to suggest that novels should replace academic history texts in higher education (though I’d be less concerned if they replaced many of the textbooks I’ve seen), good historical fiction, or fiction written historically, can augment our developing understanding of historical eras.  Washington Irving’s “Rip Van Winkle” or Charles Dickens’s “Oliver Twist” are stories that inform us about our eras of study.

The reverse is also true.  Research and the study of primary documents provide a bounty of fruitful forays for one’s imagination.  Without imagination, being a historian is almost impossible, since one is reconstructing a past era from bits and pieces of information that have been handed down–much has been lost, naturally.  Historians with an inclination towards writing do the world a service; whether or not they choose to write fiction, others will recreate the past at will, but not necessarily with any accuracy; I submit Dan Brown as Exhibit A.

In other words, historians have done the research and have the imagination to produce fiction that enlightens the world on multiple levels.  They also have a number of other responsibilities that make writing full-length novels a challenge given the time available to them.  Many may also doubt their own abilities, having a healthy respect for the demands of writing.  Still, where time can be found, the effort would be rewarding both for other educators and for readers in the general populace.

By the same token, the assignment of fiction writing as part of a larger research project is also a fruitful exercise for inexperienced history students.  As a multi-disciplinary project, it is incredibly valuable: not only do English teachers have the opportunity to teach literature and creative writing, but the history teacher has the opportunity both to teach historical research and to test the cultural assumptions students might make.  A character has to behave as she might in the studied era, not in the 21st century; he has to communicate as he would in his era, not in this post-modern information age; she has to exhibit an education commensurate with what her era would teach her, not what she would learn in today’s democracies.

This is such a valuable mental exercise not only for budding historians, or at least young history students, but also for young people who are learning how to find their way in a world that supports many different cultures and mores.  It is an exercise in understanding and in imaginative reconstruction based on available evidence.


Filed under Experiencing History - Project Based Learning, Fiction

The end goal of education–what do we need, Part 2 of 2

Do the skill-sets of our degrees apply to the real world?

This post is a follow-up to the previous post regarding education and its goals.  A liberal arts education has long been praised for its development of the ethical adult human being and, especially more recently, criticized for its lack of emphasis on career training.  It remains an essential ingredient in debates on education policies and approaches.  What follows is a response to a report recently posted at The Chronicle of Higher Education regarding a poll of hiring decision-makers.  The poll reveals that employers are increasingly skeptical of the value of broad college educations.  To read the article for yourself, follow the link by clicking on the title below.

Employers Say College Graduates Lack Job Skills, The Chronicle of Higher Education


In a recent study of over 1,000 industry hiring decision-makers conducted by the American Council for Independent Colleges and Schools, employers revealed a lack of confidence in prospective employees.  The study posed its questions to a variety of industries, and the results suggested that graduates seeking employment were lacking in areas such as interpersonal skills, teamwork, problem-solving, job-specific knowledge, written communication skills, work experience, technical ability, education, business savvy, professional references, and math skills.  At the same time, employers found the same pool of applicants to be under-skilled in areas such as using new media formats, filtering information according to needs, connecting to others in a deep and direct way, thinking of novel solutions outside the box, computational thinking, operating in different cultural settings, and other skills projected to be of increasing necessity in the future.

My first thought, considering these lists, is that many of these skills should be addressed and managed in a typical liberal arts education.  The fact that these skills are not developed, or are not perceived to have been developed, raises some interesting questions.  One thing that is not clear from the study is how the hiring personnel are evaluating these skills in prospective employees.  In my own experiences with career counselors, I have been greatly underwhelmed by the support they give to students who are constructing their first resumes.  Students often have more skills than they realize and frequently do not know how to present these in written form for would-be employers.  (Many adults are equally incompetent.)  They furthermore have difficulty expressing themselves well in interviews–which may reflect a lack of practice in their classes.  So the question remains: at what point do HR departments make these evaluations?  Is it after some time on the job or after an interview?  Does the fault lie with the education or with career services?

Let’s allow for the possibility that the broad liberal arts educational approach is at fault.  Is it simply the case that these skills are not included or emphasized in curricula?  Do students too often complete a course without having to give an independent oral presentation, present a written case or project, complete group work as a team, use new media for their assignments, filter an excess of information, engage in unfamiliar cultural settings, draw conclusions from data, create novel solutions in problem-solving, or develop their literacy skills across multiple disciplines?  It is possible, of course, that this is precisely the problem, and yet I can think back to my own education, in which I was required to do activities covering each of these areas.  But I should add a caveat to my own experience: I was not always aware that I was developing those skills during those activities.  In fact, it was only later that I realized the dual effect of many of my assignments–learning content whilst developing these skills–and much of that realization came through training in education, or coursework specifically addressing the subject of teaching.  My teachers and professors could have told me about such features in their curricula, and some did, but it is the sad habit of too many students not to reflect much about why an assignment is constructed the way that it is–especially if they are complaining about it.

Is it possible that the fault could be remedied simply by holding a dialogue at the end of a course reflecting on the skills developed?  This was a practice that Close Up developed for its week-long programs in Washington, DC, one which I attempted to incorporate in different ways at the community college level.  I think the practice bears reviving, or at least considering, especially if it is set up in such a way that students are prompted to identify the value of the work the professor assigned.

The answers remain mysterious, but further inquiry is healthy.  ACICS, the organization that conducted the study, thinks a greater emphasis on job skills needs to be implemented in post-secondary education.  Interestingly, the study’s respondents are split on that point.  For my own part, I wish I had taken advantage of more internships during my education.  Internships develop skills, build professional contacts and references, and occasionally provide stipends during the college years.  They may also teach students what they do not want to do professionally.  Internships should probably be encouraged and rewarded more often than they are in educational institutions, but students should also know that internships serve as their own reward–and they do not have to wait until college to start interning.

One of the most telling slides from the .pdf of a PowerPoint provided by ACICS summarizing the key points of the study was the last one.  After 68% answered that they expected some post-high-school education (ranging from trade schools to graduate work), the presentation’s final question asked:  “And, what, in your opinion, is more important… The type of degree(s) that job applicants have completed.  [or]  The specific skills and ability that job applicants possess.”  10% answered that the degree was most important, and 90% selected skills and ability.  The answer seems to be that a high level of education is expected, but the content is left unspecified beyond skills and ability.  How can content be taught and learned without skills and abilities being developed?  Or is content really unnecessary so long as students have good written, oral, and interpersonal communication skills, good computer and media skills, good problem-solving skills, and the like?  Can these things be taught without a liberal arts education?  Doesn’t the field of study suggest certain skills and abilities?

Another question I have to ask is whether businesses across the industries have cut entry-level positions and internal job-training over the years.  Is it fair for accounting firms, pharmaceutical companies, tech corporations, property management firms, etc., to expect vocational and career competence in all of these desired areas plus professional-level interpersonal skills, new media competency, cross-cultural experience, and more?  Again, I think more graduates have these attributes than they themselves realize, but I also think there is a desire to hire someone proven, with multiple years of experience, while paying entry-level salaries.  This factor may be reinforced by a lack of in-house, company-specific, industry-specialized training.

Some of the onus may lie with students who are increasingly addicted to technological means of interaction that may retard development in other areas, or who are uninterested in and apathetic about education and their futures.  Additional responsibility lies with schools at all levels: high schools fail to prepare students for the next level of education; universities over-stuff classrooms, limiting professors’ creativity; professors take the easy way out; or scholars fail to take the teaching part of their posts seriously.  Having said that, with all the truly excellent scholars and high school educators out there who take a true student-first approach, it is hard to weigh them down with the greatest culpability, even with all the duds who “teach” alongside them.

There is one final consideration I want to share.  The bulk of the poll-takers have been at their jobs for more than ten years: 9% for 1-10 years, 28% for 11-20 years, 34% for 21-30 years, and 29% for 31+ years.  A simple question must be asked about the culture of the generation sifting prospective employees: are they too far removed from the younger generations to properly vet them and accurately assess their compatibility with the job requirements?  Is this older generation of hiring managers able to identify the connections between these newer needs assessments and the abilities of the younger generation emerging from academic institutions today?  I think it is hard to answer that, or to design a study that could assess the necessary qualities, but I will leave it with this thought: many of the cutting-edge companies forging exciting new ground in different industries, whose employees and leadership embody the skill-sets covered in this study, have been in business for fewer years than many of the individuals who took part in the study have been at their jobs.

The questions seem to multiply as one considers the study.  Assessing job-readiness is more difficult than might be expected, and polls are tricky things to use in evaluating the abilities of both employers and prospective employees–although they may accurately take the temperature of HR personnel.  Regardless, the results, or supposed results, of such studies will be used to debate and determine education policies and government involvement in education at multiple levels.  So, by all means, consider it carefully, whether you are a voter, educator, or administrator.

Leave a comment

Filed under Editorials on education

The end goal of education–what do we need, Part 1 of 2

If we’re not going to use it later, do we really need to learn it?

This post is one of two reflecting on education, stimulated by two articles, one from The Washington Post and one from The Chronicle of Higher Education.  What is the purpose of education?  Why do we need to learn certain things–do we need to learn certain things at all?  These are age-old debates, closely related to debates about how we teach as well.  Below, I will consider the points of the Washington Post article and then conclude with some reflections; a follow-up post will similarly evaluate the claims in the Chronicle article.  To read the original articles for yourself, simply click on the title in each post and the article will open in a new window.

When an adult took standardized tests forced on kids – Washington Post

This post from Monday on The Washington Post website discussed the challenges faced by an accomplished adult on the school board who took the standardized tests assigned to 10th graders in his state.  Guest author Marion Brady described the adult in the following way:

By any reasonable measure, my friend is a success. His now-grown kids are well-educated. He has a big house in a good part of town. Paid-for condo in the Caribbean. Influential friends. Lots of frequent flyer miles. Enough time of his own to give serious attention to his school board responsibilities. The margins of his electoral wins and his good relationships with administrators and teachers testify to his openness to dialogue and willingness to listen.

On the morning that he took the test, Brady spoke with his friend and listened as he explained that he was certain he did not do well.  Subsequently, the results came back and the school board member had this to say:

“I won’t beat around the bush,” he wrote in an email. “The math section had 60 questions. I knew the answers to none of them, but managed to guess ten out of the 60 correctly. On the reading test, I got 62%. In our system, that’s a ‘D’, and would get me a mandatory assignment to a double block of reading instruction.”

He continued, “It seems to me something is seriously wrong. I have a bachelor of science degree, two masters degrees, and 15 credit hours toward a doctorate.

“I help oversee an organization with 22,000 employees and a $3 billion operations and capital budget, and am able to make sense of complex data related to those responsibilities.”

In other words, this is a bright and accomplished guy who got his butt kicked by a tenth grade standardized exam.  Is it because he is so many years removed from his tenth grade studies?  He considered that point, but argued that it isn’t really relevant in the end, saying:

“It might be argued that I’ve been out of school too long, that if I’d actually been in the 10th grade prior to taking the test, the material would have been fresh. But doesn’t that miss the point? A test that can determine a student’s future life chances should surely relate in some practical way to the requirements of life. I can’t see how that could possibly be true of the test I took.”

This is the same question students ask, of course, and many struggle to see the point.  Does anyone recall asking, or hearing a fellow student ask, “Why do we have to memorize multiplication tables when we have calculators?”  I recall that question as late as the eighth grade, long after the tables were supposed to be memorized.  I also recall the answers–or at least some of the answers to that question.  A: You won’t always have a calculator at hand.  (To which I snarkily raised my left hand to brandish my calculator watch–yes, I had a calculator watch in junior high; are you really surprised?)  A: It’s important for your brain’s functioning to be able to do this.  (This is not the answer I got then, but it is what they meant, as far as I understood it.)  A: You aren’t allowed to use a calculator on the test.  OK, there were probably others I don’t actually remember, but those were the ones that stuck.

Math classes and curricula do, in fact, try to answer these questions for students through word problems.  They seek to provide students with a real-life context and ask the student to determine which tools from math will allow them to solve the problem.  Still, our school board member would retort with the following comment:

“I have a wide circle of friends in various professions. Since taking the test, I’ve detailed its contents as best I can to many of them, particularly the math section, which does more than its share of shoving students in our system out of school and on to the street. Not a single one of them said that the math I described was necessary in their profession.”

The article’s takeaway here is that the math tests are damaging.  They are, it is argued, doing more damage to students than providing the state with adequate assessments of teacher performance.  The school board member is further quoted as concluding:

“If I’d been required to take those two tests when I was a 10th grader, my life would almost certainly have been very different. I’d have been told I wasn’t ‘college material,’ would probably have believed it, and looked for work appropriate for the level of ability that the test said I had.

“It makes no sense to me that a test with the potential for shaping a student’s entire future has so little apparent relevance to adult, real-world functioning. Who decided the kind of questions and their level of difficulty? Using what criteria? To whom did they have to defend their decisions? As subject-matter specialists, how qualified were they to make general judgments about the needs of this state’s children in a future they can’t possibly predict? Who set the pass-fail “cut score”? How?”

“I can’t escape the conclusion that decisions about the [state test] in particular and standardized tests in general are being made by individuals who lack perspective and aren’t really accountable.”

It is difficult to know what impact these test results have on students without seeing hard data investigating those questions–which the article does not provide–but the daunting supposition of the school board member hangs over Brady’s understanding of the tests’ impact on students.  For Brady, this is proof-positive that the tests are a sign of a larger failure that has little to do with teaching or teacher accountability:

There you have it. A concise summary of what’s wrong with present corporately driven education change: Decisions are being made by individuals who lack perspective and aren’t really accountable.

Those decisions are shaped not by knowledge or understanding of educating, but by ideology, politics, hubris, greed, ignorance, the conventional wisdom, and various combinations thereof. And then they’re sold to the public by the rich and powerful.

All that without so much as a pilot program to see if their simplistic, worn-out ideas work, and without a single procedure in place that imposes on them what they demand of teachers: accountability.

Now, let me first say that I am not clear what Brady means by “corporately driven education change.”  Still, aside from that, it is clear that he is fed up with standardized testing, and he takes a great deal of hope from the fact that schools and principals are too, referencing a New York Times piece (linked in his article).

I have to confess that I think there is an unspoken, or perhaps unintended, implication–which I alluded to above.  If the testing of certain math skills reveals those math skills to be unnecessary for later careers, then is it really an indictment of the testing, or of the math skills included on the test that no one needed?  Is the issue that these tests are unreasonable, or that the tested skills are unnecessary?

My bigger issue with standardized tests has always been with the way they evaluate skills and knowledge, and less with the fact that they try to do so.  One of the essential features of math is that it trains your brain to operate in a certain way.  I recently learned from a fellow Rotarian that many of his fellow math students in college took it as a pre-law track to work on problem-solving skills.  The point of the test is to demonstrate that the student has learned the skills, even if only temporarily (alas!) for the student’s developing brain.  While the argument in this case has questioned the value of the tested skills, I always wonder about the value of being tested with multiple-choice questions.

For a historian, multiple-choice selection is not a particularly useful skill, but evaluating a primary source is–a skill exercised in written or oral papers, not multiple-choice tests.  When I took one of my AP tests, I did better on the essay portion than on the multiple-choice portion; I think that is the ideal, as I was grappling with complex concepts instead of trivia.  I seldom required 101 or 102 students to do much memorization, precisely because I couldn’t see the value of a future pilot or engineer or nurse knowing the Roman emperors in order.  I did, however, see something extremely useful in their learning about the incredible influence of Rome on our culture; and I also put value on their learning the skills necessary to critically read an account of an emperor’s reign from a source that might be biased against or in favor of the emperor, the end being the development of critical reading and the recognition of bias in a text.  This can be tested with multiple choice, of course, but not as well as with an essay.  I confess I am not certain what the corollary for math would be, but I am certain a handful of math professors or engineers or statisticians know exactly what it would be.

The point is this, however: the curriculum is called into question by Brady’s evaluation as much as the testing is, if not more so.  Is the curriculum given false relevance by testing and, perhaps, college admissions boards?  Is there a reason for learning extraneous knowledge with its attendant skills?  Assuming we answer the latter question in the affirmative, is there a need for testing it, and do tests actually accurately evaluate what students have learned?

To follow up on this, I will consider the post from the Chronicle of Higher Education’s website, which takes up concerns from the workplace about graduates and their lack of skills for the jobs they seek.


Filed under Editorials on education

Word of the Week, 12/5 – 12/10/11 – papist

Papist (pe·pist) An adherent of the pope; esp. an advocate of papal supremacy; also, more generally, a member of the Roman Catholic Church; a Roman Catholic or Romanist.  (Usually hostile or opprobrious.) [1521 FISHER Serm. agst. Luther Wks. 344  The popes holynes & his fauoureres, whom he [Luther] calleth often in derisyon papistas, papastros, & papanos, & papenses.]  1534 (title) A Litel Treatise ageynst the Mutterynge of some Papistis in Corners.  1657 J. SERGEANT Schism Dispach’t 656 ’Tis clear that al Roman-Catholikes, that is all Communicants with the Church of Rome or Papists (as they call them) hold the substance of the Pope’s Authority.

~ The Compact Edition of the Oxford English Dictionary

The word "papist" is derived from Lutheran works during the period of the Reformation.

When Dr. Owen Stanwood was researching the demise of an early, isolated colonial town, he assumed that the colonists would blame Indians for the massacre, but he was surprised: the English colonists blamed papists.  The term comes to English from similar derogatory terms originating with the German Lutherans.  The association is a negative commentary on those loyal to the pope’s authority and on their support for the power he claimed over souls, theology, and secular issues.

In England, the term quickly became associated not just with religion but with nascent national identity and, subsequently, with the very ability to be loyal to king and country.  It became accepted in England that one could not be Roman Catholic and remain loyal to the English crown: adherence to the church and the Pope in Rome put one in direct opposition to the king of England; it supplanted one’s loyalty to one’s country; it was treasonous.  This was reinforced by the political powers that defended the Roman Catholic Church, namely France, clearly an enemy of England.

English "papist" Sir Thomas More

This suspicion was challenged by a handful of English officials: Sir Thomas More, Edmund Campion, and George Calvert, Lord Baltimore.  Of these three, only George Calvert escaped a death sentence, though he did resign his government post when he converted; Campion and other Jesuits were technically guilty of treason under the law of the land, which held that any Roman Catholic priest who set foot on English soil was subject to death.  The law in England made Catholics second-class citizens, and suspicion would remain part of the English Protestant tradition, even in the New World.

English colonists in America continued to be suspicious of Catholic colonists.  Even as they were establishing a new government based on democratic ideals, John Adams and Thomas Jefferson speculated in their written correspondence about whether Catholics had a place in a country whose government rejected kings and, presumably by extension, pontiffs or bishops who sought to behave as such.  Could Catholics in America be trusted to remain loyal to the American government and not respond slavishly to a Roman pontiff?  Calvert believed that loyalty to his country and king was of the highest importance, and that it remained unimpeded by his Catholicism.  In fact, he was the first to seek to establish religious freedom in America, putting religious pluralism into law for his colony of Maryland; but, rather backwardly, he and his progeny also resisted representative modes of government, to their undoing.

First Lord of Baltimore, George Calvert, founder of the "papist" colony of Maryland

The challenge was heightened for Anglicans in America, who were cut off from their church hierarchy following the Revolutionary War, or in truth the Declaration of Independence, which rejected the king’s authority.  (Awkward for a colonial church whose head was the king.  Interestingly, they found a simple way around this by seeking out the bishops of Scotland, who had, for political reasons, been granted a reprieve from swearing allegiance to the English crown.  The Episcopalian Church thus grew from a bishop consecrated in Scotland, who returned to the United States to shepherd his flock and consecrate new bishops and priests.)  Here, the members of the Church of England had ousted not only their king but their religious leader, and as they broke away from their old country they formed a new religion, admittedly with sacred ties to the old one.  This reinforced how backwards, and even threatening, the Roman Catholics could be to the young country if the old suspicions about them were true.

In England, suspicion of papists was maintained through declining relations with the Irish, heightened in Northern Ireland, where Protestants and Catholics came to associate religion strongly with national identity.  In the US, this suspicion was fostered by a largely elitist response to poor Catholic immigrants, especially those arriving from Ireland and Italy.  The American response was not motivated exclusively by the immigrants’ lack of wealth, but by associations of certain behaviors or temperaments with populations that were often visibly ill-mannered (even drunk), under-educated, and poor.

The term continues to be a derogatory one that assumes a lack of intellectual inquiry or individual thought.  “Papist” still invokes images of Roman Catholics who slavishly follow church hierarchy and the pope’s word, coupled with beliefs or practices that are considered unenlightened.  This stigma, particularly concerning Church authority, was raised as a concern even with the Kennedys as they entered politics as recently as the 1950s-60s.  Today, it is a slur frequently associated with deliberately misleading tracts and individuals who repeat inaccurate representations of the Catholic religion.  For whatever reason, the use of this cultural, religious insult does not carry the same stigma that other derogatory labels of a similar kind carry in our society.


Filed under Historian's Journal, Word of the Week


When I was teaching at the community college, a student informed me, during a discussion about a paper assignment, that he had never been told he had to put a quotation from a source in quotation marks.  Now, of course, he could have been lying–maybe he was–but he was a special needs student, a rather competent one I found, and I have to wonder whether he simply fell through the cracks because teachers did not want to deal with the combination of plagiarism and a special needs student.

It raises some interesting questions about plagiarism.  How is it taught?  Is it taught?  Do students actually learn what they are supposed to do in this regard, and how often do we assume they know when they are clueless?

Of course, there are clearly accidental infractions of plagiarism, such as the one described in the article below–an instance where a student understood what plagiarism was, but was not sufficiently experienced in the field to recognize that the descriptive expression he used was unique to the author and not common parlance in the discipline.

I was a plagiarist.  (I was not; that is the article’s title.)

There has been some depressing stuff coming out of campuses in recent years when it comes to cheating and dishonesty, but let’s consider the difficulty of the concept for a moment.  Inexperienced students may struggle to find the balance between referencing someone else’s work and avoiding plagiarism.  Many early “research papers” coming out of middle school or junior high are little more than book reports, where the student has used one or two sources total.  However, since they are not billed or assigned as book reports, students think they are doing research papers when they are in fact essentially plagiarizing.  I have noticed this to be the case even when multiple sources are required.

How well do teachers correct this?  The student is clearly learning, yet is not handing in original work.  More to the point, how does the student’s next teacher deal with this type of work after a precedent has been set?  Sometimes educators are so gratified to see that students have learned something that this sort of mistake goes uncorrected.  This is especially common for students who do not engage in or grasp the detective work involved in history–they regurgitate a scholar’s argument with no idea that this is not a history research paper.  The repercussions for plagiarism are often harsh, particularly in colleges and universities, where it is policy to blackball the student as a plagiarist–a label attached to their transcript–but even in high school they often result in automatic failure of at least that assignment.

Notice that this is not the same as a student going to the publisher’s website, copying and pasting the reviews into a word document, and handing it in; this is not lifting a Wikipedia article; this is not copying a page from an author without giving credit and then removing the page from the book–all of which have happened in clear cases of dishonest, deliberate plagiarism.

So, I am interested in what people think.  Take the polls below:


Filed under Editorials on education