Penny Kinnear

Early English language assessment to improve first-year success of engineering students

Two research-intensive universities, one in a large city and the other in a mid-size city, are dealing with changing student demographics that reflect changes in Canada’s population over the past several decades as well as the more recent trend toward internationalization of higher education. The diversity comes not only from international students, who make up an increasingly large proportion of the student bodies, but also from the ethnic, cultural, and linguistic diversity and complex linguistic histories of the Canadian population. According to Statistics Canada’s most recent available figures, immigrants make up 46% of the urban population where the University of Toronto is located. Although the share of newcomers settling in this urban area has declined slightly since the last (2006) census, the area still received the largest share of newcomers, nearly one-third (over 380,000). As 61.2% of the immigrant population and 66.8% of the newcomer population speak one of Canada’s official languages and one or more non-official languages [1], it would be surprising not to find this diversity reflected in the classrooms. The students accepted into the two programs are bright and hard-working, but not all have had the opportunity to work within and with the academic vocabulary and rhetorical discourses of the academy and the engineering discipline.

The IELTS and TOEFL scores that are used as admission criteria at both universities claim to provide a basis for predicting a student’s readiness to handle the language of an academic course of study, and their “scores are said to extrapolate to performance in real-life academic settings” [2]. There are two issues with the use of such tests. The first is the extent to which the extrapolation holds true “for the actual language use” [2], as the evidence has not been extensively investigated. Secondly, the language proficiency measures are not designed to provide diagnostic information about a student’s language use post-admission. The University of Toronto discarded an in-house writing task administered to all first-year students several years ago. Queens University has continued with an in-house English Proficiency Test that serves as a graduation requirement, but it does not explicitly provide diagnostic information and is usually administered too late to support incoming students with English support needs. At both universities, efforts to provide support and instruction to students who demonstrated language and communication needs have been hampered by the lack of an efficient, reliable, and timely identification and diagnostic strategy or instrument, as well as by the voluntary nature of the support activities and the additional workload that participation demanded.

At the University of Toronto, 1224 first-year students wrote the DELNA screening test during the first two weeks of classes; 315 of them wrote the subsequent diagnostic during the first two weeks of October on the basis of receiving a DELNA score below the 60% cutpoint (a designation of Band 1). At Queens University, 761 students wrote the DELNA screening test during the first week of classes, and 96 students were flagged to write the diagnostic two weeks later based on receiving a score below the 60% cutpoint. At the University of Toronto, the diagnostics were scored by a team of five assessors knowledgeable about second language learning and writing, preceded by a 2-hour benchmarking session. At Queens University, scoring was done by the coordinator of the English Support for Engineers program and writing tutors, preceded by a 1-hour benchmarking session.

We found that a significant number of students appear to struggle with the transition from studying language as a subject to using language to study engineering. This reflects the internationalization of higher education and recruitment efforts as well as the increasing diversity of our domestic population. Language proficiency measures provide one kind of measure but are not particularly helpful in identifying student needs during the transition from language as subject to language as medium of instruction or language as practice within a profession. The screening appears to reliably identify students who would benefit from a finer-grained analysis of their academic language and literacy strengths and weaknesses. The diagnostic appears to identify clusters of students with particular needs, including some who need support with lexicogrammatical issues and word choice and others who need support with the more complex elements of argumentation, concision, and inferencing. Further analysis of our results will help us revise and refine both the diagnostic and the analytic rubric.

[1] Statistics Canada, Immigration and Ethnocultural Diversity, Catalogue no. 99-010-X, May 2013, available: https://www12.statcan.gc.ca/nhs-enm/2011/as-sa/99-010-x/99-010-x2011001-eng.cfm

[2] L. Brooks and M. Swain, “Contextualizing performances: Comparing performances during TOEFL iBT™ and real-life academic speaking activities,” Language Assessment Quarterly, vol. 11, pp. 353-373, 2014.
