Summary: New research shows that AI writing assistants can unintentionally homogenize global writing styles, pushing non-Western users to sound more American. In a study comparing Indian and American users, AI suggestions often promoted Western topics and writing patterns, diminishing Indian cultural expressions.
Indian users accepted more AI suggestions but frequently had to modify them, resulting in smaller productivity gains. Researchers call on AI developers to prioritize cultural sensitivity to preserve global diversity in writing.
Key Facts:
- Cultural Homogenization: AI writing suggestions nudged users toward Westernized language and topics.
- Reduced Productivity: Indian users gained less benefit from AI tools due to frequent corrections needed.
- Call for Cultural Sensitivity: Researchers urge AI developers to account for diverse cultural contexts, not just language.
Source: Cornell University
A new study from Cornell University finds that AI-based writing assistants may serve billions of users in the Global South poorly, generating generic language that makes them sound more like Americans.
The study showed that when Indians and Americans used an AI writing assistant, their writing became more similar, mainly at the expense of Indian writing styles.

While the assistant helped both groups write faster, Indians got a smaller productivity boost, because they frequently had to correct the AI’s suggestions.
“This is one of the first studies, if not the first, to show that the use of AI in writing could lead to cultural stereotyping and language homogenization,” said senior author Aditya Vashistha, assistant professor of information science.
“People start writing similarly to others, and that’s not what we want. One of the beautiful things about the world is the diversity that we have.”
The study, “AI Suggestions Homogenize Writing Toward Western Styles and Diminish Cultural Nuances,” will be presented by first author Dhruv Agarwal, a doctoral student in the field of information science, at the Association for Computing Machinery’s conference on Human Factors in Computing Systems.
ChatGPT and other popular AI tools powered by large language models are primarily developed by U.S. tech companies but are increasingly used worldwide, including by the 85% of the world’s population that lives in the Global South.
To investigate how these tools may affect people in non-Western cultures, the research team recruited 118 people, about half from the U.S. and half from India, and asked them to write about cultural topics.
Half of the participants from each country completed the writing assignments independently, while half had an AI writing assistant that provided short autocomplete suggestions. The researchers logged the participants’ keystrokes and whether they accepted or rejected each suggestion.
A comparison of the writing samples showed that Indians were more likely to accept the AI’s help, keeping 25% of the suggestions compared to 19% kept by Americans. However, Indians were also significantly more likely to modify the suggestions to fit their topic and writing style, making each suggestion less helpful, on average.
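The acceptance and modification rates described above can be derived from a log of suggestion events like the one the researchers collected. A minimal illustrative sketch (not the study’s actual analysis code; the event format here is a hypothetical simplification):

```python
# Illustrative only: compute how often a participant kept AI autocomplete
# suggestions, and how often kept suggestions were later edited.

def suggestion_stats(events):
    """events: list of dicts with keys 'accepted' (bool) and
    'modified' (bool, meaningful only when accepted)."""
    accepted = [e for e in events if e["accepted"]]
    acceptance_rate = len(accepted) / len(events) if events else 0.0
    modified_rate = (
        sum(e["modified"] for e in accepted) / len(accepted)
        if accepted else 0.0
    )
    return acceptance_rate, modified_rate

# Hypothetical log for one participant: 4 suggestions shown,
# 2 kept, 1 of the kept ones later edited.
log = [
    {"accepted": True, "modified": True},
    {"accepted": True, "modified": False},
    {"accepted": False, "modified": False},
    {"accepted": False, "modified": False},
]
acc, mod = suggestion_stats(log)
# acc == 0.5, mod == 0.5
```

In the study’s terms, a high acceptance rate paired with a high modification rate is exactly the Indian participants’ pattern: keeping suggestions but having to rework them, which eats into the productivity benefit.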
For example, when participants were asked to write about their favorite food or holiday, the AI consistently suggested American favorites: pizza and Christmas, respectively. When writing about a public figure, if an Indian user typed “S” intending Shah Rukh Khan, a famous Bollywood actor, the AI would suggest Shaquille O’Neal or Scarlett Johansson.
“When Indian users use writing suggestions from an AI model, they start mimicking American writing styles to the point that they start describing their own festivals, their own food, their own cultural artifacts from a Western lens,” Agarwal said.
This need for Indian users to continually push back against the AI’s Western suggestions is evidence of AI colonialism, researchers said. By suppressing Indian culture and values, the AI presents Western culture as superior, and may not only shift what people write, but also what they think.
“These technologies obviously bring a lot of value into people’s lives,” Agarwal said, “but for that value to be equitable and for these products to do well in these markets, tech companies need to focus on cultural aspects, rather than just language aspects.”
About this artificial intelligence research news
Author: Becka Bowyer
Source: Cornell University
Contact: Becka Bowyer – Cornell University
Image: The image is credited to Neuroscience News
Original Research: Closed access.
“AI Suggestions Homogenize Writing Toward Western Styles and Diminish Cultural Nuances” by Aditya Vashistha et al. arXiv
Abstract
AI Suggestions Homogenize Writing Toward Western Styles and Diminish Cultural Nuances
Large language models (LLMs) are being increasingly integrated into everyday products and services, such as coding tools and writing assistants. As these embedded AI applications are deployed globally, there is a growing concern that the AI models underlying these applications prioritize Western values.
This paper investigates what happens when a Western-centric AI model provides writing suggestions to users from a different cultural background. We conducted a cross-cultural controlled experiment with 118 participants from India and the United States who completed culturally grounded writing tasks with and without AI suggestions.
Our analysis reveals that AI provided greater efficiency gains for Americans compared to Indians. Moreover, AI suggestions led Indian participants to adopt Western writing styles, altering not just what is written but also how it is written.
These findings show that Western-centric AI models homogenize writing toward Western norms, diminishing nuances that differentiate cultural expression.