Investigating the patterns of syntactic complexity predicting high-quality writing: a corpus-based study of the written text production at the B2+ English Language Exam at a Hungarian University
This pilot study explores the predictive role of syntactic complexity in assessing L2 writing proficiency, with a focus on its potential contribution to validating a high-stakes English language examination. Drawing on prior research that highlights the importance of syntactic complexity in writing evaluation, the study aims to identify specific syntactic measures that reliably distinguish between low-rated and high-rated L2 texts. The analysis is based on a corpus of written texts produced for the B2+-level, so-called "Basic" English Language Examination (BLE) administered at a Hungarian university. Although labeled "Basic", the BLE is a mandatory proficiency examination (B2+ according to the CEFR) required for academic advancement. Rather than examining inter-rater reliability, the research contributes to validation by investigating linguistic features associated with rated writing quality; grades assigned by human raters were used solely to group texts and build the corpus for analysis. A total of 60 syntactic complexity indices were extracted using the Multidimensional Analysis Tagger (MAT) (Nini, 2019) and the Coh-Metrix 3.0 software package (Graesser, McNamara & Kulikowich, 2011). These indices cover measures of clausal, phrasal, and overall structural complexity. The findings are expected to inform ongoing validation efforts for the BLE and to contribute to more robust, evidence-based practices in L2 writing assessment by identifying linguistic patterns that correlate with writing proficiency.
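As a rough illustration of the kind of analysis described above, the sketch below compares extracted complexity indices between low-rated and high-rated texts. It is not taken from the study: the CSV file name, column names, and the choice of a Mann-Whitney U test are assumptions, standing in for whatever export format and group-comparison procedure the authors actually used.

```python
# Minimal sketch (assumptions, not the study's procedure): compare syntactic
# complexity indices between low-rated and high-rated texts, assuming the
# MAT / Coh-Metrix output has been exported to a CSV with one row per text.
# File name, column names, and the "rating_group" column are hypothetical.
import pandas as pd
from scipy.stats import mannwhitneyu

df = pd.read_csv("ble_b2plus_indices.csv")  # hypothetical export of the 60 indices
index_columns = [c for c in df.columns if c not in ("text_id", "rating_group")]

results = []
for col in index_columns:
    low = df.loc[df["rating_group"] == "low", col].dropna()
    high = df.loc[df["rating_group"] == "high", col].dropna()
    stat, p = mannwhitneyu(low, high, alternative="two-sided")
    results.append({"index": col, "U": stat, "p": p,
                    "low_median": low.median(), "high_median": high.median()})

# Indices with the smallest p-values are the candidates that most clearly
# separate the two rating groups.
summary = pd.DataFrame(results).sort_values("p")
print(summary.head(10))
```

A nonparametric test is used here only because rating-based groups of exam scripts are often small and skewed; the study itself may rely on a different statistical procedure.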