The Economic Times

Indian AI startup Sarvam has open-sourced two reasoning models, Sarvam 30B and Sarvam 105B, marking a significant development in Indian language AI capabilities. Co-founder Pratyush Kumar announced that both models excel in Indian languages while competing strongly on global benchmarks.
The models use a Mixture-of-Experts architecture for computational efficiency. The 30B model has 30 billion parameters but activates only about one billion parameters per token, while the flagship 105B model contains 105 billion parameters with 10.3 billion active parameters and is designed for enterprise applications.
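The efficiency gain described above comes from routing each token to only a few experts, so most parameters sit idle on any given forward pass. The sketch below illustrates the idea with top-k gating; the expert counts, k, and parameter splits are hypothetical round numbers for illustration, not Sarvam's actual configuration.

```python
# Illustrative Mixture-of-Experts routing sketch. All numbers here are
# hypothetical; they are NOT Sarvam's published architecture details.

def top_k_experts(gate_scores, k):
    """Return the indices of the k highest-scoring experts for one token."""
    return sorted(range(len(gate_scores)),
                  key=lambda i: gate_scores[i], reverse=True)[:k]

def active_params(total_expert_params, num_experts, k, shared_params):
    """Parameters actually used per token: shared layers plus k routed experts."""
    per_expert = total_expert_params / num_experts
    return shared_params + k * per_expert

# A router scores every expert, but the token only visits the top k of them.
scores = [0.1, 0.7, 0.05, 0.9, 0.2, 0.3, 0.4, 0.15]
print(top_k_experts(scores, 2))   # -> [3, 1]: only experts 3 and 1 run

# With most parameters held in experts, the per-token cost stays small:
# 28B expert params across 64 experts, 2 active, plus 1B shared layers
print(active_params(28e9, 64, 2, 1e9))   # ~1.9e9 active of ~29e9 total
```

This is why a 105-billion-parameter MoE model can run with roughly the per-token compute of a much smaller dense model: the total parameter count and the active parameter count are decoupled.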
Both models were trained entirely in India using compute capacity provided under the "IndiaAI Mission", with Sarvam receiving Rs 98.68 crore in subsidies for access to 4,096 Nvidia H100 GPUs. The 30B model trained on 16 trillion tokens and the 105B on 12 trillion tokens, with significant portions covering the 10 most widely spoken Indian languages.
The models power Sarvam's products: Samvaad, an enterprise conversational agent, and Indus, an AI assistant for complex reasoning tasks positioned as an Indian alternative to ChatGPT.
