In a world increasingly driven by data and measurement, training procurement specialists are constantly looking for quantifiable evidence of the impact of the training they source. How else do you know whether the investment has been worth it?
At first glance, this is particularly challenging for language learning. Language can seem like a wild, sprawling beast of a thing: tens of thousands of words colliding across lines of grammar, sentence structure, pronunciation, slang, literary invention, collocation, group-speak, tech-speak, and so on.
Using a new language effectively can open doors, build relationships, and win deals. However, this is not just a question of knowing a list of words and tenses that can be tested with multiple-choice or gap-fill activities. It is more about articulating ideas, building rapport, and expressing yourself with empathy and clarity. This is about performance. Like other soft skills such as leadership, negotiation and strategy, it is harder to pin down and measure.
But that’s not to say it cannot be done.
You can take an external test. There are plenty around that attempt to quantify language skills and show both student and trainer what has been learnt. They are based on years of research and pegged to international competence standards. IELTS, BULATS, TOEFL, DELE, DELF, DALF, TestDaF, HSK and JLPT are all well-established exams. Contact me if you want to know which languages they test (and whether they are any good).
However, such tests raise two main issues for business language training sourced by companies.
- Learners end up learning for the test. They focus on what they need to pass, much of which may be irrelevant outside the training room, rather than on the language they need to communicate more effectively in their business relationships.
- Such tests tend to focus on discrete items like grammatical structures or vocabulary, which are easy to capture, rather than on the ability to communicate opinions or ideas, which is more difficult to assess.
Business-focused language courses tend to have specific objectives and therefore follow a tailored syllabus. Learners may be relocating, building relationships with overseas clients and partners, or setting up an office in a new territory, for example. The language they need to learn should be (if the training supplier is any good) reflected in the course design – input, materials, practice work, and training methodology.
The more generic exams mentioned above are not particularly useful here. For such courses, it makes more sense for the trainer to set an assessment activity designed to match the learners' specific requirements. This may include a company presentation, a written report that is then discussed, or scenario role plays. These are 'whole language' tasks involving the kind of grammar, vocabulary, delivery, skills mix and pronunciation that will be needed in practice.
Such assessment is ideal for short, targeted business language courses, and also for ongoing assessment. Reports can be given on performance and broken down into relevant focus areas, such as structural accuracy, communicative impact, stylistic appropriacy, and comprehensibility. These provide an accurate picture of progress to date and a template for further work and study.
If you are looking to source language training for yourself or your company, make sure the training provider gives you clear guidelines on how progress will be assessed. Language learning can be, and should be, measured qualitatively, giving you a clear view of outcomes and of what learners can actually do as a result of their training.