DMR News

Advancing Digital Conversations

Google’s Gemma AI Models Reach Over 150 Million Downloads

By Yasmeeta Oon

May 14, 2025


Google’s Gemma AI models have crossed a major milestone: as of this writing, they have been downloaded more than 150 million times since their launch in February 2024. The rapid adoption points to growing demand for capable, openly available artificial intelligence models. Google DeepMind developer relations engineer Omar Sanseviero announced the figure on X (the platform formerly known as Twitter).

Gemma stands out in the crowded AI space with more than 70,000 variants available on Hugging Face, a popular platform for sharing machine learning models. Developers can choose from fine-tuned versions optimized for specific use cases, and this large ecosystem of community-built variants makes the technology considerably more flexible and approachable.

Gemma’s Multilingual Reach and Flexibility

Gemma’s multimodal capability allows it to interpret both text and images, making it a flexible tool for a wide range of applications and attractive to developers seeking versatile solutions for their projects. It also supports more than 100 languages, broadening its potential audience and helping users in different regions and demographic groups work across language barriers.

Despite these successes, Gemma faces stiff competition from other AI models, particularly Meta’s Llama, which surpassed 1.2 billion downloads by late April. The gap in download numbers shows the uphill battle Gemma faces in establishing itself as a leader in the rapidly evolving AI space. Industry observers note that while Gemma has generated impressive early interest, it still trails the front-runners.

Gemma’s Development and Google’s AI Strategy

Gemma was created as Google’s answer to other “open” model families. Its development hub is located in Arnulfpark, Munich, and beyond that site Google has more than 2,500 employees working in other locations throughout Germany. The company’s ongoing investment in research and development reflects its strategic ambition to lead in AI technologies.

Gemma has not been without controversy. Both Gemma and Llama have been criticized for their custom, non-standard licensing terms. A growing number of developers have voiced concerns about these restrictions, arguing that they make commercial use of the models a risky proposition. Such bespoke terms can complicate the integration of these models into commercial products and, in some cases, deter developers from using them at all.

What The Author Thinks

Gemma’s rapid adoption is impressive, but it still faces significant challenges in overtaking competitors like Meta’s Llama. While its multimodal capabilities and broad language support are noteworthy, its lower download totals and non-standard licensing terms could hinder its long-term success. For Google to truly lead in the AI space, Gemma will need to overcome these barriers and prove itself not just as a popular tool, but as a reliable and flexible option for developers across the globe.


Featured image credit: via Flickr


Yasmeeta Oon

Just a girl trying to break into the world of journalism, constantly on the hunt for the next big story to share.
