People often talk about how machines don’t have bias, but that isn’t exactly true: machines are programmed, and trained, by people who do have biases. One place you might not expect to find bias is in machine translation. According to Fast Company, gender bias has been found in “Bing Translate, Google Translate, Systran, and other popular machine translation platforms.” Teachers and nurses are assumed to be women and given female word forms, while doctors and engineers are assumed to be men and given male word forms, and this happens in both directions of translation, to or from English. So don’t assume something is unbiased just because it comes from a machine. Think about your word choices, and don’t make those same assumptions yourself.
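To see how this kind of bias can creep in, here is a minimal toy sketch (not the code of any real translator, and with entirely made-up words and counts): when the source word carries no gender, a naive frequency-based model simply outputs whichever gendered form it saw most often in its training data.

```python
# Toy illustration of gender bias inherited from training data.
# The corpus counts below are invented for demonstration only.
from collections import Counter

# Hypothetical counts of gendered Spanish forms in a training corpus.
corpus_counts = {
    "nurse":    Counter({"enfermera": 900, "enfermero": 100}),  # skews female
    "engineer": Counter({"ingeniero": 850, "ingeniera": 150}),  # skews male
}

def translate(word: str) -> str:
    """Return the most frequent gendered form, ignoring actual context."""
    return corpus_counts[word].most_common(1)[0][0]

print(translate("nurse"))     # "enfermera" — female form, regardless of context
print(translate("engineer"))  # "ingeniero" — male form, regardless of context
```

The model isn’t deciding anyone’s gender; it is just echoing the statistics of the text it was trained on, which is exactly how human assumptions end up in machine output.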