This is an Eval Central archive copy; find the original at engagewithdata.com.
Over the past few weeks, I learned just how wrong I was — and ate a big piece of humble pie in the process.
With colleagues, I’m working on a landscape analysis of how families and educators in California feel about family engagement and the state’s requirements for incorporating stakeholder feedback into district plans for improvement. We’re designing a training program around these topics, but to make sure our program will be relevant, we wanted to hear from the people who would be participating in it. We designed a survey and planned for focus groups, and I naively thought we were good to go.
Although Baltimore has a growing — but fairly localized — population of English Language Learners, the families at the schools where I worked were predominantly Black and English-speaking. When I worked at the district, we had a cadre of interpreters we regularly contracted with for events, and we used large-scale survey software that easily facilitated (mostly adequate) translations.
So when we decided to translate the California survey into nine additional languages, I didn’t anticipate just how difficult that would be.
Our survey was fairly basic and brief, so I built it out in Google Forms … only to learn that despite the widespread availability of Google's free translation technology, there was no mechanism for translating surveys in the tool. (I'm honestly still scratching my head about this.) The most straightforward (ha!) way I found to create a multilingual survey in Google Forms was to independently translate the survey into each language, build a separate page in the survey for each language, copy and paste each line of the translated survey onto its page, and then use skip logic to direct respondents to the page for the language they selected.
Umm what?
We gave up on Google. Then we found out that our client had a SurveyMonkey account that included the ability to create multilingual surveys. I was excited. Finally: a logical way to complete this seemingly simple task!
Nope. I was still wrong.
While this platform at least offers respondents a dropdown menu of languages on the survey page (making their experience easier and avoiding the skip-logic silliness on the back end), it turns out that this paid feature was just as cumbersome to use as the Google option. What I ended up having to do was download a coded text file for each language, pay to have each language independently translated (thank you, Stepes Translation, for coming to our rescue!), copy and paste each line of the translation into specific sections of the text file, and then upload the translated file back to the system. NINE TIMES.
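For anyone facing the same relay race, the copy-and-paste step at least can be scripted. Below is a minimal Python sketch for merging translated strings back into an exported survey file. It assumes the export is a simple tab-separated file of key/text pairs and that the translator returns one translated line per English line, in the same order; the real SurveyMonkey export format differs, and the function name `merge_translation` is hypothetical, so treat this as illustrative rather than a drop-in tool.

```python
# Hypothetical helper: merge translated strings back into an exported survey file.
# Assumes each export is a tab-separated file of "key<TAB>English text" lines,
# and each translation file contains one translated line per English line,
# in the same order. Real survey-platform exports may use a different format.

from pathlib import Path

def merge_translation(export_path, translation_path, output_path):
    """Pair each exported key with its translated text and write the result."""
    keys = [line.split("\t", 1)[0]
            for line in Path(export_path).read_text(encoding="utf-8").splitlines()
            if line.strip()]
    translations = [line
                    for line in Path(translation_path).read_text(encoding="utf-8").splitlines()
                    if line.strip()]
    if len(keys) != len(translations):
        raise ValueError(f"Line count mismatch: {len(keys)} keys "
                         f"vs {len(translations)} translations")
    merged = "\n".join(f"{key}\t{text}" for key, text in zip(keys, translations))
    Path(output_path).write_text(merged + "\n", encoding="utf-8")

# One call per language instead of nine rounds of copy and paste, e.g.:
# for lang in ("es", "zh", "vi"):
#     merge_translation("survey_en.txt", f"survey_{lang}.txt", f"upload_{lang}.txt")
```

The line-count check matters: if the translation file drops or adds a line, every answer choice after it would silently shift to the wrong question.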
With my hand cramping from all of that Ctrl-C and Ctrl-V action, I was stunned by how technically difficult and, frankly, inaccessible the survey translation process was. Who is actually going to go through all of this? More importantly, what does this mean for the voices of those who are not native English speakers? Without access to a large institutional subscription to a powerhouse survey platform, my gut tells me that very little translation is likely to happen. As a result, many important voices are being silenced.
I don’t have a solution to offer here, but I’m glad I learned this lesson. It has opened my eyes to the institutional roadblocks that prevent equitable language access in our country… and I know I’ve just scratched the surface. Translation services, though not 100% reliable, are widely accessible and free online, yet they are not integrated into lower-cost survey platforms. This not only causes a huge headache for survey designers; it also inhibits our ability to hear from non-English speakers about important issues. As I seem to say in a lot of my blog posts, we have to do better.
If anyone has a better solution than the relay race I just ran, please share in the comments! I do hope that a more accessible and user-friendly option exists.