July 12, 2021

The Future of Writing Style Analyses: Ouriginal Metrics as a Learning Analytics Tool

In the last two parts of this series (part 1, part 2) on computational linguistics and writing style analytics, we mostly discussed how stylometry can be used to detect ghostwriters. Beyond this scenario, Ouriginal Metrics could also serve as a learning analytics tool: analyzing students' writing styles and helping them improve their writing skills.

A change in perspective: New measures and workflows for Ouriginal Metrics?

If Ouriginal Metrics is eventually extended into a learning analytics tool, the aim would be to identify not just individual students who are performing exceptionally well or poorly, but also groups of students who struggle with particular measures. This would provide the quantifiable data needed to help these students reach the targeted learning goals through pedagogical intervention. Such findings would also be a valuable resource for academic institutions that rely on standardized testing, by giving educators and students real-time performance feedback.
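To make this concrete, here is a minimal sketch of what such cohort-level aggregation could look like. The data layout, threshold values, and function names are illustrative assumptions, not part of Ouriginal Metrics.

```python
# Hypothetical sketch: aggregate per-measure scores across a cohort to flag
# measures on which a group of students falls short of a target learning goal.
# Data layout and thresholds are illustrative assumptions, not Ouriginal's API.
from statistics import mean

# scores[measure][student_id] = normalized score between 0 and 1
scores = {
    "gunning_fog":         {"s1": 0.72, "s2": 0.41, "s3": 0.38, "s4": 0.45},
    "lexical_originality": {"s1": 0.80, "s2": 0.77, "s3": 0.74, "s4": 0.69},
}

TARGET = 0.6            # illustrative learning-goal threshold
STRUGGLING_SHARE = 0.5  # flag a measure if at least half the cohort is below target

def struggling_measures(scores, target=TARGET, share=STRUGGLING_SHARE):
    flagged = {}
    for measure, by_student in scores.items():
        below = [sid for sid, val in by_student.items() if val < target]
        if len(below) / len(by_student) >= share:
            flagged[measure] = {
                "students": below,
                "cohort_mean": round(mean(by_student.values()), 2),
            }
    return flagged

print(struggling_measures(scores))
```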

Extending Ouriginal Metrics into learning analytics may be less about developing new measures than about establishing a sensible workflow. A single below-average score on one measure is probably not significant. If, however, the low performance appears consistently across assignments and the student scores worse than their academic peers, the result may indicate the need for a pedagogical intervention.
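A minimal sketch of that workflow might look as follows; the "three consecutive assignments" rule and the function name are assumptions made purely for illustration.

```python
# Hypothetical sketch of the workflow described above: a single below-average
# score is ignored; a flag is raised only when a student scores below the peer
# mean on the same measure across several consecutive assignments.

MIN_CONSECUTIVE = 3  # illustrative: how many assignments in a row count as "consistent"

def needs_intervention(student_scores, peer_means, min_consecutive=MIN_CONSECUTIVE):
    """student_scores and peer_means are lists ordered by assignment date."""
    streak = 0
    for own, peers in zip(student_scores, peer_means):
        streak = streak + 1 if own < peers else 0
        if streak >= min_consecutive:
            return True
    return False

# One low score: no flag.
print(needs_intervention([0.4], [0.6]))                                   # False
# Consistently below the peer average: flag for pedagogical intervention.
print(needs_intervention([0.40, 0.45, 0.42, 0.50], [0.6, 0.6, 0.6, 0.6]))  # True
```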

Additional usage scenarios for writing style analyses: What would it take to turn Ouriginal Metrics into a learning analytics tool?

Implementations of learning analytics may vary. One possibility would be to show the results of Ouriginal Metrics only to the instructor. Another would be to share the results of the analysis with the student as well. Finally, each student could see their own results alongside the anonymized results of their peer group; in this last implementation, students might also be told their rank on particular assignments.
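The third option could be implemented along the lines of the sketch below, which returns only anonymized cohort statistics and a percentile rank, never individual peers' scores. Everything here is an illustrative assumption rather than an existing Ouriginal Metrics feature.

```python
# Hypothetical sketch: a student sees their own score next to anonymized
# cohort statistics and a percentile rank, never named peers.
from statistics import mean, median

def peer_comparison(own_score, cohort_scores):
    """Return the student's score alongside anonymized cohort context."""
    below = sum(1 for s in cohort_scores if s < own_score)
    percentile = round(100 * below / len(cohort_scores))
    return {
        "your_score": own_score,
        "cohort_mean": round(mean(cohort_scores), 2),
        "cohort_median": round(median(cohort_scores), 2),
        "percentile": percentile,   # share of peers scoring below you
    }

cohort = [0.55, 0.62, 0.48, 0.71, 0.66, 0.59]
print(peer_comparison(0.62, cohort))
```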

As with all algorithms, many tacit ideological assumptions are embedded in the measures of Ouriginal Metrics, and not all of them may be appropriate for learning analytics. Short sentences (a factor that lowers the Gunning Fog index) and low lexical originality may be esteemed in some cultures or genres but regarded with disdain in others. Sentence length likewise varies from language to language. That said, the targeted learning goals may differ across educational contexts.
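For readers unfamiliar with the measure mentioned above, the Gunning Fog index is 0.4 times the sum of the average sentence length and the percentage of "complex" words (three or more syllables). The sketch below computes it with a rough vowel-group syllable heuristic; this is a textbook approximation, not the method Ouriginal Metrics uses.

```python
# Illustrative Gunning Fog computation:
#   0.4 * (average sentence length + percentage of words with 3+ syllables)
import re

def count_syllables(word):
    # Approximate syllables as runs of vowels; good enough to show the idea.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    complex_words = [w for w in words if count_syllables(w) >= 3]
    avg_sentence_length = len(words) / len(sentences)
    pct_complex = 100 * len(complex_words) / len(words)
    return 0.4 * (avg_sentence_length + pct_complex)

print(gunning_fog("Short sentences score low. Polysyllabic terminology inflates readability estimates considerably."))
```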

When adapting Ouriginal Metrics as a learning analytics tool, it may be necessary to rework the scoring system so that it effectively considers low scores on individual measures as well as mean scores. The tool's current operating assumption when scoring high outliers is that a professional ghostwriter's text will be more polished than that of a struggling student who turns to contract cheating.
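One way such a reworked scoring step could look is sketched below: deviations in either direction are surfaced via a z-score, so consistently low scores show up alongside the suspiciously high ones. The cut-off value and labels are illustrative assumptions only.

```python
# Hypothetical sketch: flag deviations in both directions, not just high
# outliers, so unusually low scores also surface for possible support.
from statistics import mean, stdev

Z_THRESHOLD = 1.5  # illustrative cut-off

def flag_outliers(scores, z_threshold=Z_THRESHOLD):
    mu, sigma = mean(scores), stdev(scores)
    flags = []
    for i, s in enumerate(scores):
        z = (s - mu) / sigma if sigma else 0.0
        if z >= z_threshold:
            flags.append((i, "high outlier: possible ghostwriting"))
        elif z <= -z_threshold:
            flags.append((i, "low outlier: possible need for support"))
    return flags

print(flag_outliers([0.61, 0.58, 0.63, 0.60, 0.62, 0.95, 0.20]))
```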

Stylometry: A way to catch ghostwriters and improve students’ writing skills?

What is really exciting is that stylometry can detect not only new forms of cheating and academic misconduct, such as work commissioned from ghostwriters or so-called 'essay mills', but it can also be used to improve students' writing skills. If researched properly, computational linguistics and writing style analyses can help students strengthen their own creative voice by developing their own unique writing style.

Currently, we are working to develop Ouriginal Metrics further so that the tool can also serve this latter objective of improving our users' writing skills.

We would love to hear your opinion on using stylometry to support teachers in identifying students’ writing skills! Discuss the topics with us on Twitter.
