Announcing SUTRA Dual2L

A multilingual model that brings the best of SUTRA and Llama 3.1 to 50+ languages.

TWO Team

Llama 3.1 marks a significant milestone in accelerating AI adoption across markets with a series of frontier-quality open-source models.

Meanwhile, SUTRA’s dual-transformer architecture, multilingual capabilities, and enterprise solutions are already enabling AI for the non-English market.

With Llama 3.1, SUTRA just got even more powerful.

SUTRA Dual2L

We are excited to introduce SUTRA Dual2L, a new multilingual LLM that uses Llama 3.1 as its concept model. SUTRA Dual2L employs TWO AI’s groundbreaking dual-transformer (Dual2) approach, which separates concept learning from language learning. Instead of retraining from scratch, SUTRA’s architecture leverages the latest state-of-the-art open-source models to jump-start model development: it enhances and extends the concept model to many languages, improves efficiency, and enables cost-effective, enterprise-ready solutions.
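To make the separation of concept learning from language learning more concrete, here is a minimal conceptual sketch in PyTorch. It is not TWO AI's implementation; the module names (LanguageEncoder-style components, a concept model, a language decoder) are hypothetical stand-ins for the three roles the Dual2 description implies, with the pretrained concept model kept frozen so only the language-facing components are trained.

```python
# Illustrative sketch only, not the SUTRA Dual2L implementation.
# The three submodules are hypothetical stand-ins for the roles described above.
import torch
import torch.nn as nn


class Dual2Sketch(nn.Module):
    def __init__(self, language_encoder: nn.Module,
                 concept_model: nn.Module,
                 language_decoder: nn.Module):
        super().__init__()
        self.language_encoder = language_encoder   # learns language-specific form
        self.concept_model = concept_model         # e.g. a pretrained model such as Llama 3.1
        self.language_decoder = language_decoder   # maps concepts back to the target language

        # Reuse the concept model rather than retraining from scratch:
        # freeze its weights and train only the language-facing components.
        for p in self.concept_model.parameters():
            p.requires_grad = False

    def forward(self, embedded_input: torch.Tensor) -> torch.Tensor:
        concepts_in = self.language_encoder(embedded_input)   # text representation -> concept space
        concepts_out = self.concept_model(concepts_in)        # language-agnostic reasoning
        return self.language_decoder(concepts_out)            # concept space -> target language
```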

Improving multilingual performance across languages

SUTRA Dual2L efficiently handles many languages beyond Llama 3.1’s capabilities, expanding support to 50+ languages and improving multilingual performance by 5-10% (for the 70B model) for languages like Korean, Gujarati, Tamil, Malayalam, and Japanese.

The Dual2L model also utilizes SUTRA’s purpose-built multilingual tokenizer with a 256K vocabulary trained on a balanced dataset from multiple languages, offering efficient token representation and reducing token consumption costs for non-English languages by 3-5x.
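Because API pricing is typically per token, the token savings translate directly into lower per-request cost. The sketch below, assuming a Hugging Face-style tokenizer interface, shows one way to compare token counts for the same sentence; "path/to/multilingual-tokenizer" is a placeholder rather than a published SUTRA artifact, and the Llama 3.1 tokenizer requires gated access on Hugging Face.

```python
# Rough illustration of tokenizer efficiency and cost, assuming Hugging Face-style
# tokenizers. The multilingual tokenizer path is a placeholder, and the printed
# figures depend entirely on the tokenizers actually loaded.
from transformers import AutoTokenizer

baseline = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
multilingual = AutoTokenizer.from_pretrained("path/to/multilingual-tokenizer")

text = "안녕하세요, 오늘 회의 일정을 확인해 주세요."  # sample Korean sentence

baseline_tokens = len(baseline.encode(text))
multilingual_tokens = len(multilingual.encode(text))

# Fewer tokens for the same text means proportionally lower per-token API cost.
print(f"baseline tokens:     {baseline_tokens}")
print(f"multilingual tokens: {multilingual_tokens}")
print(f"token reduction:     {baseline_tokens / multilingual_tokens:.1f}x")
```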

AI for All

Combining the strengths of SUTRA's unique architecture with the advanced capabilities of models like Llama 3.1 is the gateway to cost-effective multilingual AI solutions for the global market.

SUTRA Dual2L models will be available soon via API and in AI Enterprise Solutions for our enterprise customers. Contact us for information about availability of SUTRA Dual2L for your business.

date published

Jul 24, 2024

Category

Full

reading time

5 mins
