TLDR: Indian AI startups are increasingly focusing on developing and utilizing open-source small models, driven by factors like accessibility, cost-effectiveness, and the ability to tailor solutions for specific Indian contexts and languages. This trend is shaping the country’s AI ecosystem, with a push towards indigenous model development and a cautious approach to large language models.
Indian artificial intelligence (AI) startups are showing a marked preference for open-source small models, a strategic shift driven by ease of access, cost efficiency, and the potential for specialized applications. This preference is evident across India's burgeoning AI landscape, where companies are actively developing indigenous AI solutions.
One of the primary drivers behind this trend is the open availability of these models, which significantly aids research and development efforts. As one expert noted, “Just having them openly available, without any restrictions, helped a lot in my research journey.” This accessibility fosters innovation and allows startups to experiment and build upon existing frameworks without incurring substantial licensing costs.
Soket AI Labs, which focuses on ethical AGI, is at the forefront of this movement, with ambitious plans to build open-source Indic Large Language Models (LLMs) and to scale its small models to 120 billion parameters within ten months. The company has already unveiled Pragna-1B, a foundational AI model designed for 22 Indic languages, underscoring the emphasis on linguistic inclusivity and efficient on-device performance. Billed as the first of its kind in the open-source arena, Pragna-1B was developed in partnership with Google Cloud.
While there’s a clear focus on small and specialized models, the broader Indian AI ecosystem is also grappling with the challenges of building large language models from scratch. Industry experts emphasize that significant investments in hardware, data centers, talent, and innovation are crucial for such endeavors. There’s a recognized need for sustained government investment in funding, research, and education to support the development of frontier models and ensure AI sovereignty.
The Indian AI scene has seen a surge in activity, with approximately 10 startups focused on generative AI research and over 60 building products and wrappers on top of existing solutions. However, challenges persist, particularly in customer adoption: many Indian customers still prefer global products such as ChatGPT, which has roughly 14 million users in India.
Despite a slowdown in funding, Bengaluru remains a vital AI tech hub, benefiting from its skilled talent pool. The city's AI ecosystem shows resilience even amid rising competition from other cities. The focus is not just on building LLMs but on eventually developing advanced models for artificial general intelligence (AGI). Some initiatives propose a collaborative approach, with companies and the government contributing to a substantial shared fund for building frontier models.
However, the path is not without its hurdles. The recent shutdown of Subtl.ai, a GenAI startup that reportedly outperformed OpenAI on benchmarks and served major Indian institutions, serves as a cautionary tale. This incident highlights issues such as fragile moats, a lack of developer ecosystems, broken funding cycles, and an enterprise culture that may not yet be fully ready for AI adoption. While India has the potential to produce numerous AI unicorns, such failures underscore the need for a robust and supportive ecosystem.
In essence, Indian AI startups are navigating a dynamic landscape, prioritizing open-source small models for their practical advantages while simultaneously aspiring to contribute to the development of more complex, indigenous AI solutions. The journey involves balancing innovation with market realities and fostering a collaborative environment to achieve long-term AI prowess.


