
Cohere Launches Tiny Aya: Open-Weight AI Models for 70+ Languages & Offline Use

February 17, 2026 | Lisa Park | Tech
Original source: techcrunch.com

Cohere, an enterprise AI company, has launched a family of open-weight multilingual models called Tiny Aya, designed to run on everyday devices without an internet connection. The release, announced on the sidelines of the India AI Summit, marks a significant step towards democratizing access to advanced language AI, particularly in regions with limited connectivity.

The Tiny Aya models are “open-weight,” meaning their trained parameters (weights) are publicly released for anyone to download, modify, and use. This contrasts with closed models, whose weights remain proprietary. Cohere Labs, the company’s research arm, developed the models with a focus on supporting over 70 languages, with a strong emphasis on South Asian languages such as Bengali, Hindi, Punjabi, Urdu, Gujarati, Tamil, Telugu, and Marathi. This regional focus addresses a historical imbalance in AI language-model development, which has often prioritized widely spoken languages such as English.

The base Tiny Aya model contains 3.35 billion parameters, a measure of its size and complexity. While smaller than some of the largest language models currently available, Cohere emphasizes that Tiny Aya’s architecture and training data allow it to achieve strong performance on a variety of tasks, even with limited computational resources. The company has also released TinyAya-Global, a version specifically fine-tuned to better follow user commands, making it suitable for applications requiring broad language support. Further specialization comes in the form of regional variants: TinyAya-Earth for African languages, TinyAya-Fire for South Asia, and TinyAya-Water for Asia Pacific, West Asia, and Europe.
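To see why a model of this size can plausibly run on everyday devices, a rough calculation of the weight storage alone is instructive. The 3.35-billion-parameter figure comes from the article; the numeric precisions shown are standard quantization options in the field, not formats Cohere has specified, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope memory footprint for a 3.35B-parameter model.
# Illustrative arithmetic only: real on-device usage also depends on
# activations, KV-cache, and runtime overhead, none of which are counted here.

PARAMS = 3.35e9  # parameter count reported for the base Tiny Aya model


def weights_gb(params: float, bytes_per_param: float) -> float:
    """Approximate storage for the weights alone, in gigabytes."""
    return params * bytes_per_param / 1e9


# Common precision widths (bytes per parameter); int4 packs two params per byte.
for label, width in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label:>9}: ~{weights_gb(PARAMS, width):.1f} GB")
```

At half precision the weights come to roughly 6.7 GB, and a 4-bit quantized copy fits in under 2 GB, which is within reach of recent phones and laptops; frontier models with hundreds of billions of parameters are orders of magnitude beyond that.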

According to Cohere, this regional specialization is key to improving the “linguistic grounding and cultural nuance” of the models. By tailoring the models to specific regions, the company aims to create AI systems that feel more natural and reliable for local communities. The models are designed to be flexible starting points for further adaptation and research, allowing developers to build upon the existing foundation.

The development of Tiny Aya is notable for its relatively modest computational requirements. Cohere reports that the models were trained on a cluster of 64 Nvidia H100 GPUs, which the company characterizes as a comparatively small investment in computing resources for a model family of this scope. This suggests that similar models could be developed by a wider range of organizations, potentially accelerating innovation in multilingual AI.

The ability to run directly on devices, without relying on cloud connectivity, is a crucial feature of Tiny Aya. This opens up a range of possibilities for offline applications, such as translation tools for travelers, educational resources for remote areas, and communication aids for individuals with limited internet access. In countries like India, where internet penetration varies significantly, this offline capability could be particularly impactful.

Developers can access the Tiny Aya models through HuggingFace, a popular platform for sharing and testing AI models, as well as the Cohere Platform. The models are available for download on HuggingFace, Kaggle, and Ollama for local deployment. Cohere is also releasing the training and evaluation datasets used to create the models on HuggingFace, promoting transparency and reproducibility in AI research. A technical report detailing the training methodology is also planned for release.

The launch of Tiny Aya comes as Cohere prepares for a potential initial public offering (IPO). According to CNBC, the company ended 2025 with $240 million in annual recurring revenue, representing a 50% growth rate quarter-over-quarter. CEO Aidan Gomez stated last year that the company plans to go public “soon,” suggesting a strong financial position and confidence in its future prospects.

Beyond the immediate applications of the Tiny Aya models, the release signals a broader trend towards more accessible and inclusive AI development. By prioritizing multilingual support and offline functionality, Cohere is addressing critical gaps in the current AI landscape and paving the way for a more equitable distribution of AI benefits.
