Cohere Launches Open-Weight ‘Tiny Aya’ Multilingual Models at India AI Summit

AI News Hub Editorial
Senior AI Reporter
February 17th, 2026
At the India AI Summit this week, enterprise AI startup Cohere introduced a new family of open-weight multilingual models called Tiny Aya, designed to support more than 70 languages and run directly on everyday devices without requiring an internet connection. The models, developed by Cohere Labs, include regional variants tailored to South Asia, Africa, and broader Asia-Pacific and European markets, and are being released publicly on platforms including Hugging Face, Kaggle, Ollama, and the Cohere Platform.

The base Tiny Aya model contains 3.35 billion parameters and was trained on a single cluster of 64 Nvidia H100 GPUs. Cohere also unveiled TinyAya-Global, a version fine-tuned for stronger instruction following, alongside three regional adaptations: TinyAya-Earth for African languages, TinyAya-Fire for South Asian languages, and TinyAya-Water for Asia Pacific, West Asia, and Europe. The models support languages including Bengali, Hindi, Punjabi, Urdu, Gujarati, Tamil, Telugu, and Marathi.

Unlike many large foundation models that depend on cloud infrastructure, Tiny Aya is built for local deployment. Cohere said the models can operate directly on laptops and other standard devices, enabling offline use cases such as translation. The company added that it optimized the underlying software for on-device performance, requiring less computing power than comparable systems.
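To illustrate why a model of this scale can plausibly run on a laptop, the sketch below estimates the weight-memory footprint of a 3.35-billion-parameter model at common inference precisions. The parameter count comes from Cohere's announcement; the precision choices and the arithmetic are illustrative assumptions, ignoring activation memory and runtime overhead, and do not describe Cohere's actual deployment format.

```python
# Back-of-envelope weight-memory estimate for a 3.35B-parameter model
# (the reported size of the base Tiny Aya model) at common precisions.
# These are rough figures: they cover weights only and ignore
# activations, KV cache, and runtime overhead.

PARAMS = 3.35e9  # parameter count reported for the base Tiny Aya model

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,  # half precision, typical for GPU inference
    "int8": 1.0,       # 8-bit quantization
    "int4": 0.5,       # 4-bit quantization, common for on-device use
}

def weight_memory_gib(params: float, bytes_per_param: float) -> float:
    """Approximate memory needed to hold the weights, in GiB."""
    return params * bytes_per_param / 2**30

for precision, nbytes in BYTES_PER_PARAM.items():
    print(f"{precision:>9}: ~{weight_memory_gib(PARAMS, nbytes):.1f} GiB")
```

At 4-bit quantization the weights come to under 2 GiB, comfortably within the RAM of an ordinary laptop, which is consistent with the company's claim of offline, on-device operation.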

“This approach allows each model to develop stronger linguistic grounding and cultural nuance,” Cohere said in a statement, adding that all variants retain broad multilingual coverage while offering more region-specific depth.

The emphasis on multilingual, offline-capable AI is notable in markets such as India, where linguistic diversity and uneven internet access create deployment challenges for cloud-dependent systems. In such environments, lightweight models that can function without continuous connectivity could broaden the range of practical AI applications for developers and researchers.

Cohere said the models were trained using relatively modest computing resources compared to frontier-scale systems, positioning Tiny Aya as accessible infrastructure for builders targeting native-language audiences. In addition to releasing the models, the company is publishing associated training and evaluation datasets on Hugging Face and plans to release a technical report detailing its methodology.

The launch comes as Cohere continues to expand its enterprise footprint. CEO Aidan Gomez said last year that the company intends to go public “soon.” According to CNBC, Cohere ended 2025 with $240 million in annual recurring revenue and reported 50% quarter-over-quarter growth throughout the year.

By focusing on multilingual performance and edge deployment rather than sheer model size, Cohere is positioning Tiny Aya as practical infrastructure for developers building localized AI applications — particularly in regions where language diversity and connectivity constraints shape how technology is adopted.

This analysis is based on reporting from TechCrunch.

Image courtesy of Cohere.

This article was generated with AI assistance and reviewed for accuracy and quality.

Last updated: February 17th, 2026
