It all began innocently enough: with an infographic. A small corner of social media bubbled with controversy after data visualizer Latinometrics published a graph illustrating the relationship between English proficiency in Latin American countries and their proximity to the US.
The infographic, based on EF’s English Proficiency Index, became the centerpiece of a debate among language experts, teachers, entrepreneurs and others over the accuracy of the English proficiency rankings on display. Commenters disputed whether Argentina truly is the most English-proficient country in Latin America; whether Mexico deserves a better ranking; and whether the scores of Colombia, Perú, Chile and Brazil fairly represent their populations’ English-speaking skills.
(1/12) The farthest LatAm country from the US — Argentina — has the best English proficiency score in the region; Mexico has the 2nd worst.
A thread on English proficiency in Latin America 🗣️🇺🇸: pic.twitter.com/tJ1sJpVDQ4
— Latinometrics 📊 (@LatamData) January 16, 2023
Though a minor incident, the debate sparked by the infographic put the spotlight on a problem that has yet to be solved: when it comes to evaluating English proficiency in Latin America, everyone is still stumbling in the dark.
A Splintered Measuring Stick
The major problem with Latinometrics’ infographic is that it is based on data from the EF English Proficiency Index.
The EF Index is perhaps the most popular guideline for the evaluation of a country’s English proficiency levels. It provides an uncomplicated picture of the English-speaking skills of a given population, allowing for direct comparisons between territories.
Although convenient, the index is known for its weak methodology. Czech scientist and linguist Jakub Marian has characterized it as, to put it mildly, a very flawed tool. Here at NSAM, we’ve covered the issue too.

The EF Index’s problem lies in its data. Results are based on millions of tests, but the resulting dataset does not come from representative samples, which distorts the snapshot of a population’s actual level of English proficiency. The picture might not be entirely false, but it is certainly warped.
“I don’t like to rely on this type of ranking”, said Mauricio Velásquez, Managing Director of Bogotá-based consulting firm Velásquez & Company, in an interview for a previous article. “They arrive at certain conclusions while extrapolating from an inconsistent statistical basis”.
The problems don’t end with the EF Index. Language and education experts have for years pointed out the shortcomings of other tests and indexes as tools for evaluating proficiency.
“Several studies show that, for a person to be fluent in a specific language, practice and the acquisition of a specific volume of vocabulary are necessary. This is measurable in certain tests, but not in all”, explained Laura Pérez, a translator at ExpandShare. “Young people build a considerable part of their vocabulary through social media, for example, and that’s not measured”.
Higher-ed expert and observer of international education Karin Fischer has pointed to a similar problem with tests in colleges: they measure cultural performance, she explains, not actual language skills.
Although some countries provide data on English skills, gathered by governments, NGOs or private enterprise, observers risk running into an array of confusingly varied numbers and definitions of what proficiency actually means. In Mexico, for example, estimates of the percentage of the population that speaks English range from 2% to 11%, depending on the source. To make matters more difficult, the available information is not up to date.
In other words, though several tools and data points are available for evaluating English-speaking skills in Latin America, none has proven truly trustworthy.
“Today, there’s no organization that determines English level on a city, school, state or even country level”, commented Roberto Torres, a consultant who works closely with startups.
“We need a dedicated organization [that accurately measures English proficiency]. It can be regional or perhaps work on a country level. It would face major challenges, but it’s definitely necessary”, he added.
Driving the Hunt for Talents and Sites
The lack of a proper measuring stick for English proficiency in the region grows more relevant if one considers the trends pushing site location.
Industry analysts have pointed to “location agnosticism” as a growing trend among companies that seek to export global business services from the Nearshore. In an age of remote work and global hiring, talent is the major driver of site selection. Companies are now hunting for skills, both technical and linguistic, above all else.
“One of the most important parts that you’re seeing now, since all these costs are the same, are language skills […] You’ve got to find markets that have good English, and those are few and far between. So, you look at smaller markets, but then you have smaller centers”, explained Jeff Pappas, Senior Managing Editor at Newmark, during a panel at Nexus 2022.
“We need a dedicated organization [that measures English proficiency]. It can be regional or perhaps work on a country level. It would face major challenges, but it’s definitely necessary”—Roberto Torres, Tech Practice Lead at MDV Consultores
Not everyone has caught up to the necessity of English proficiency in business, though. Torres told NSAM that, in his practice, he has stumbled upon startup executives who give little weight to language as a skill worth developing. It must be noted that, in tech, English has become the default tool for communication between programmers.
“When we put forth API [application programming interface] documentation for our clients, it has to be in English. Documentation, variables, everything has to be in English”, he said. “If you want to do business in the world, you must speak English. And if you want to do business in tech, it’s even more important.”
What Options Are There?
As more people realize the shortcomings of the EF Index and similar tools, and as the nearshoring of business and tech processes matures into a more solid industry, investors, companies and the countries themselves will want access to tools that accurately measure language proficiency.
Though no such tool exists at the moment, there are options out there that can be relied upon, assured Laura Pérez. Harvard and Columbia both publish English proficiency reviews by country. Plus, GMAT and TOEFL test scores tend to be reliable, she added.
Then again, measuring linguistic proficiency is an inaccurate science in itself.
“I believe that one’s ability to speak a second language is measurable only to a certain point, because there are many factors that involve language acquisition. Environmental factors play a major part”, commented Pérez.
The accelerated growth of Nearshore service delivery is pushing some governments in the region to launch English teaching programs. Argentina in particular has a program that focuses on teaching the language in a way that works for the IT industry.
Yet, without a proper tool to evaluate proficiency on a country-by-country basis, companies will feel lost. Cornered by rising demand, some have begun to take the matter of English teaching into their own hands.