Every University has a contract-cheating problem

By Dr Malcolm Murray

Head of Digital Learning

That was the stark take-away from last week’s UK Turnitin Users’ Summit (#TurnitinSummit19), held at the Baltic, Gateshead. Whether you call it “ghost-writing”, “contract cheating”, “using essay mills” or just “buying essays”, this cancerous habit is gnawing away at academic integrity and poses a threat to society in general. The world may not stop spinning if one student buys an essay for ANTH1011, but would you be happy sitting on a plane flown by, or undergoing surgery at the hands of, someone who had bought their way through their exams?

Similarity Detection

Turnitin are probably best known for their similarity detection software, designed to help staff identify cases of plagiarism (unacknowledged use of the work of others). Turnitin and other tools such as Urkund have become very good at detecting potential cases. Combining their use with structured educational approaches (helping students understand what plagiarism is and how to avoid it) can significantly reduce the number of cases over time.
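
Turnitin’s matching algorithms are proprietary, so purely as an illustration of the general idea, here is a toy similarity score based on overlapping word n-grams. This is my own sketch, not how Turnitin (or Urkund) actually works:

```python
# Toy illustration only: a crude similarity score based on overlapping word
# 5-grams. Real similarity-detection services match against vast databases of
# sources and student work; this just shows the general idea.
import re


def ngrams(text, n=5):
    """Return the set of word n-grams in a piece of text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}


def similarity(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub, src = ngrams(submission, n), ngrams(source, n)
    return len(sub & src) / len(sub) if sub else 0.0


# Hypothetical usage, comparing a submission against a single source text:
# print(f"{similarity(open('essay.txt').read(), open('source.txt').read()):.0%}")
```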

Turnitin provided a service update for the tools, demonstrating better uptime in 2019 (99.3% non-degraded service, if you like numbers). It’s just a shame that the remaining 0.7% (the unplanned outages) still tends to coincide with peak submission periods in the UK.

Their developers showed us conceptual mock-ups of add-ins for Microsoft Word and Google Docs that they are working on. These are designed to help students identify accidental plagiarism and missing references as they write. The add-in would take the form of a panel on the right-hand side of the page; on request it would calculate a similarity score and identify any missing citations. I liked this, but the praise was not universal. Some staff worried that it would simply encourage students to focus on paraphrasing in order to “game” the similarity score; others questioned whether managing students’ citations for them was doing too much of the work on their behalf.

Ironically, the effectiveness of these approaches may be driving people towards other ways to cheat: using essay mills. That was the main focus of this year’s conference.

Contract Cheating

At present it is not possible to definitively calculate the level of contract cheating at an institution (which is not to say institutions aren’t trying to detect it). Philip Newton’s review of past studies from 2014 to 2018, based on students’ self-reporting, estimates that on average 15.7% of students admitted to paying someone else to undertake their work. A recent Guardian investigation, based on Freedom of Information requests to Russell Group universities, shows that the number of students being sanctioned for academic misconduct (the charge they would face for using contract cheating services) is on the rise. At Durham, there is specific reference to contract cheating in the Learning & Teaching Handbook. In September 2018 the Vice-Chancellor joined other UK university heads in a call to make essay mills illegal. A private member’s bill was subsequently proposed by Lord Storey to make it illegal to advertise or provide contract cheating services in England and Wales. It had its first reading in the House of Lords on the 10th of July but will make no further progress because of the general election.

There was a lot of discussion at the Turnitin event about methods to detect contract cheating. These tend to fall into four categories:

  1. Analysis of the text – looking for things like reading age, sentence length, use of different writing conventions such as the Oxford comma, and comparing these with other examples of student work submitted previously.
  2. Reading it – purchased essays are often very poor quality and only tangentially related to the set question.
  3. Checking the references – often bibliographies are fake or irrelevant.
  4. Parsing the document metadata – looking at how many revisions were made, who first created the file, when, on which machine, etc., and checking for links to known essay mill sites (a minimal sketch of reading these properties follows this list).
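
For the metadata method, here is a minimal sketch of pulling the core properties out of a .docx submission using only the Python standard library (a .docx file is just a zip archive containing docProps/core.xml). The file name is invented, and deciding whether any of the values are suspicious remains a human judgement:

```python
# Minimal sketch: read the core document properties of a .docx submission.
# A .docx file is a zip archive; the author, creation date and revision count
# live in docProps/core.xml. This only surfaces the raw values; interpreting
# them (e.g. a creator name that doesn't match the student) is a human job.
import zipfile
import xml.etree.ElementTree as ET

NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
    "dcterms": "http://purl.org/dc/terms/",
}


def core_properties(path):
    """Return a dict of the core properties stored inside a .docx file."""
    with zipfile.ZipFile(path) as z:
        root = ET.fromstring(z.read("docProps/core.xml"))
    get = lambda tag: getattr(root.find(tag, NS), "text", None)
    return {
        "creator": get("dc:creator"),
        "last_modified_by": get("cp:lastModifiedBy"),
        "created": get("dcterms:created"),
        "modified": get("dcterms:modified"),
        "revision": get("cp:revision"),
    }


# Hypothetical usage:
# print(core_properties("submission.docx"))
```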

Delegates added two other methods:

  1. Direct emails from the essay mills themselves, giving details of students who commissioned essays but didn’t pay, or whose payment didn’t go through.
  2. Distressed students who contact the University for advice, because they are being blackmailed by the essay mill even though they paid for the essay.

There was general agreement that, despite some very impressive attempts by individual academics flexing their CSI muscles, this is not a problem that individual institutions are equipped to resolve. Instead it needs consensus about appropriate methods and procedures (including processes to support students during an investigation). When the MyMaster scandal hit leading Australian universities in 2014, most were not ready. Are UK institutions any better placed a few years on? This is still an evolving space. Some delegates argued that we should focus on educating students rather than catching them, whilst others felt there was sufficient cheating going on that we can’t afford to ignore it. Durham University’s Teaching & Learning Committee is currently working with Turnitin to explore their Authorship Investigation tool.

Staff from Coventry and Loughborough talked about their ‘whistle-blowing policies’ for calling out staff and students suspected of being involved in contract cheating (whether as customers or suppliers). Students didn’t like the idea of their peers cheating and getting away with it, so staff worked with the students’ unions to develop these policies. When I got back, I looked up Durham’s Whistle Blowing Policy.

Also controversial was Cath Ellis’ assertion at the conference that it is not possible to design out contract cheating: if you can ask a student to do it, they can pay someone else to do it for them. As well as commissioning essays, you can pay someone to analyse data for you, fake a personal reflective journal, or even sit an exam for you. Whilst some assessment designs can make cheating harder, can we, hand on heart, say that any are fool-proof? There was not universal agreement with this point from the audience, but it was striking that those who did agree included the people who seem to have spent the most time investigating contract cheating at their institutions. When I got back to the office, I came across another study looking at the relationship between assessment types and the likelihood that students would attempt to use contract cheating services; sadly, it seems that the assessment types least likely to be outsourced are also those least likely to be used by academics!

Learning Analytics

There was a presentation about a new “data exhaust” from Turnitin’s similarity detection tools, developed with help from Jisc. A beta version is available to subscribers to the Jisc Analytics Service. The trick is being able to match the reports back to individual students and courses (which may require linking them to other data from the VLE). Durham is not using this service at present.
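
I haven’t seen the schema of the Jisc service, but the kind of linking involved might look something like the following sketch, which joins a hypothetical export of similarity reports to VLE enrolment records on a shared student identifier. Every file and column name here is invented for illustration:

```python
# Hypothetical sketch of linking similarity reports to VLE enrolment data.
# File and column names are invented; the real Jisc/Turnitin "data exhaust"
# will have its own schema.
import pandas as pd

reports = pd.read_csv("similarity_reports.csv")  # e.g. submission_id, student_id, similarity_score
enrolments = pd.read_csv("vle_enrolments.csv")   # e.g. student_id, module_code, department

# Attach module and department information to each similarity report.
linked = reports.merge(enrolments, on="student_id", how="left")

# An aggregate view of the kind a learning-analytics dashboard might surface:
# average similarity score per module.
print(linked.groupby("module_code")["similarity_score"].mean())
```

Even a simple aggregate like this runs into the matching problem described above: it only works if the similarity reports and the VLE agree on who the student is.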

Other Developments

Turnitin also showed off a previous acquisition, Gradescope, which was developed to mark hand-written paper scripts. This might be of particular use in subjects where assessments include numerical formulae or, if future developments deliver, hand-drawn graphs and diagrams.

They gave hints of what is on their longer-term road-map: improvements such as better rubric management, support for more complex workflows and an attempt to standardise the look and feel of their tools, making them easier for staff and students to use (if they like them). Delegates left with the impression that this is a rapidly changing area, where policies, pedagogy, technology and essay mills are competing in an arms race.