Word-recognition tests for second language learners have become more common in recent years with increased awareness of the importance of word recognition in lexical processing. However, the very short reaction times involved mean that the associated technology is complicated to handle and set up in school-based testing situations. Our solution to this problem is a quick-and-easy test called 'Q_Lex'. Its most innovative feature is that the stimuli are masked in nonsense strings. This approach slows recognition down to a point that PCs can reliably measure (1 or 2 seconds). Native speakers have little trouble identifying the hidden words, so their latencies can be used as a baseline to create norms for each item. Learners score points if they respond within these norms; in this way, Q_Lex produces a score rather than a response-time result.

The target words are all high-frequency, and a large number (typically 50) can be tested in a few minutes. Simple design, rapid completion and adequate vocabulary coverage are very important factors if the aim is to reflect the state of development of the second-language lexicon rather than to test individual words. Beyond careful experimental design, it is useful to try to understand what makes some tests better than others.

In this thesis, the approach to word-recognition testing described above is first set within the literature on second-language word recognition. Two areas are then addressed: the precise measurement of vocabulary recognition speed and the measurement of lexical accessibility.
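The norm-based scoring idea can be sketched as follows. This is an illustrative sketch only, not the actual Q_Lex implementation: the item norms, latencies and the tolerance multiplier are all hypothetical, and stand in for norms derived from native-speaker baselines.

```python
def qlex_score(latencies, norms, tolerance=1.5):
    """Award one point per item answered within `tolerance` times the
    native-speaker norm latency for that item; return the total score.

    latencies: dict mapping item -> learner response time in seconds
    norms:     dict mapping item -> native-speaker baseline in seconds
    """
    score = 0
    for item, latency in latencies.items():
        norm = norms.get(item)
        # A point is scored only if the learner responds within the norm.
        if norm is not None and latency <= norm * tolerance:
            score += 1
    return score


# Hypothetical data: latencies (seconds) for hidden words found in
# nonsense strings, slowed into the reliably measurable 1-2 s range.
norms = {"house": 1.2, "water": 1.4, "table": 1.3}
responses = {"house": 1.5, "water": 2.6, "table": 1.8}
print(qlex_score(responses, norms))  # -> 2
```

The result is a simple point total rather than a raw reaction-time measure, which is the property the abstract highlights.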