DeepSeek to boost growth of India’s AI compute infra firms
Domestic artificial intelligence (AI) compute infrastructure and data centre solution providers such as E2E, Netweb Technologies, NxtGen Datacenter, Yotta and CtrlS are hopeful of a significant increase in business with the arrival of low-cost open-source models like DeepSeek.
The optimism rests largely on three factors. First, the development of models like DeepSeek at a fraction of the usual cost gives startups and companies a blueprint for building language models without needing large numbers of expensive GPUs.
Second, since DeepSeek is an open-source model, startups can use its APIs at significantly lower cost than competitors such as OpenAI, prompting more demand for domestic AI infrastructure providers (see the sketch after the third factor below).
Third, such low-cost models and lower compute requirements will also increase take-up of the GPU-as-a-service model, in which compute providers lease GPUs at an hourly rate, tech experts said.
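To make the second factor concrete, DeepSeek exposes an OpenAI-compatible chat API, so switching providers can be as simple as pointing the standard client at a different base URL. The following is a minimal sketch, not a definitive integration: the endpoint, model name and environment variable are assumptions drawn from DeepSeek's public documentation and may change, and actual pricing would need to be checked against current rate cards.

```python
# Minimal sketch: calling DeepSeek's OpenAI-compatible chat API.
# Assumes the `openai` Python SDK (v1.x) and a DEEPSEEK_API_KEY environment
# variable; the base URL and model name follow DeepSeek's public docs and
# may change.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",  # OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Summarise GPU-as-a-service in one sentence."}],
)
print(response.choices[0].message.content)
```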
Notably, DeepSeek is not the first open-source model; what is new is that it was built by a non-US company at a significantly lower cost of about $5.6 million, compared with the roughly $100 million reportedly spent training GPT-4.
“In terms of our own business, more of open source AI is good for the industry because this gives enterprises option to build models on their own instead of leveraging models like ChatGPT or Gemini,” said AS Rajgopal, managing director and chief executive officer of NxtGen Datacenter and Cloud Technologies.
“These open models, when deployed on domestic providers, will be 10 times cheaper than working with closed source models by big companies,” Rajgopal said.
The Chinese model DeepSeek, which rattled global technology stocks, has emerged as an alternative to OpenAI’s ChatGPT and Google’s Gemini. Unlike its US counterparts, it relies fully on open-source technology and lower-end chips.
Experts said DeepSeek takes a balanced loading approach when handling a prompt: rather than activating all of its knowledge and capabilities at once, as ChatGPT does, it loads only what is required, which reduces the compute needed.
Simply put, if ChatGPT is asked a medical or a mathematical question, it answers by drawing on its complete repository, since it is trained as a single super knowledge base. DeepSeek does not activate all of its capabilities at once when it starts interacting with users. It first works out the language and whether the question is biological, medical or business-related, and only then loads the necessary knowledge, which is more efficient, requires less compute and responds faster.
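The "load only what is needed" idea can be illustrated with a toy router. This is a deliberately simplistic sketch: the keyword matching and stand-in expert functions below are hypothetical and do not reflect DeepSeek's actual architecture, which relies on a learned mixture-of-experts mechanism rather than keyword rules.

```python
# Toy illustration of selective expert loading: only the expert relevant to the
# prompt is instantiated, instead of running every capability for every query.
# The keyword router and "experts" are hypothetical stand-ins for illustration.

EXPERT_KEYWORDS = {
    "medical": {"symptom", "diagnosis", "medicine", "disease"},
    "math": {"integral", "equation", "prime", "probability"},
    "business": {"revenue", "market", "pricing", "forecast"},
}

def load_expert(domain: str):
    # In a real system this would load domain-specific weights on demand;
    # here it just returns a labelled answering function.
    return lambda prompt: f"[{domain} expert] answering: {prompt}"

def route(prompt: str) -> str:
    words = set(prompt.lower().split())
    for domain, keywords in EXPERT_KEYWORDS.items():
        if words & keywords:                  # load only the matching expert
            return load_expert(domain)(prompt)
    return load_expert("general")(prompt)     # fallback: one general expert

if __name__ == "__main__":
    print(route("What is the probability of rolling two sixes?"))
    print(route("Summarise the pricing forecast for GPU leasing."))
```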
Sunil Gupta, co-founder, CEO and MD of Yotta Data Services, said, “Open-source models like DeepSeek lower entry barriers but simultaneously drive demand for infrastructure capable of supporting large-scale inferencing and deployment. This reinforces the critical role of data centers in powering India’s AI revolution.”
According to Gupta, scaling these models efficiently will require cutting-edge algorithms and optimisations that enhance performance while minimising costs.
Experts point out that DeepSeek’s innovative model architecture paves the way for more distributed and energy-efficient data centre designs.
“Providers who embrace this paradigm — offering flexible GPU leasing, prioritising energy efficiency, and supporting distributed computing — will be ideally positioned to capitalise on this expanding market,” said Sridhar Pinnapureddy, founder and CEO of CtrlS Datacenters.
One GPU leasing model likely to gain traction involves infrastructure providers procuring large GPU clusters and then leasing them out in smaller slices to companies building their own models, which works out more cost-effective for those companies.
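A back-of-the-envelope sketch shows why leasing GPU hours can beat buying hardware outright for a finite project. The purchase price, hourly rate and run lengths below are illustrative assumptions, not quoted market prices.

```python
# Back-of-the-envelope comparison: buying an 8-GPU server outright vs leasing
# GPU hours for a finite training project. All figures are illustrative
# assumptions, not quoted market prices.

PURCHASE_PRICE = 250_000.0     # assumed upfront cost of an 8-GPU server (USD)
LEASE_RATE_PER_GPU_HOUR = 2.5  # assumed hourly lease rate per GPU (USD)

def lease_cost(gpus: int, hours: int) -> float:
    """Total cost of leasing `gpus` GPUs for `hours` hours."""
    return gpus * hours * LEASE_RATE_PER_GPU_HOUR

def cheaper_option(gpus: int, hours: int) -> str:
    leased = lease_cost(gpus, hours)
    return (f"lease (${leased:,.0f})" if leased < PURCHASE_PRICE
            else f"buy (${PURCHASE_PRICE:,.0f})")

if __name__ == "__main__":
    # A short fine-tuning run: leasing wins easily.
    print("2-week run on 8 GPUs:", cheaper_option(8, 24 * 14))
    # Two years of continuous use: buying becomes the cheaper option.
    print("2-year run on 8 GPUs:", cheaper_option(8, 24 * 365 * 2))
```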
“We view the emergence of DeepSeek as a significant opportunity for our business growth. DeepSeek paves the way for inclusive AI adoption, expanding the market further,” said Netweb Technologies, a high-end computing solutions provider.
According to Netweb, lower cost barriers for advanced technology enable a wider range of customers to access and utilise appropriate computing resources.
Some experts, however, also want compute infrastructure providers to be wary of cybersecurity issues.
Munish Vaid, VP-digital strategy at Primus Partners, said, “This is also a crucial phase for compute and cloud companies in terms of maintaining data sovereignty and being transparent with the government on how they operate.”
According to Vaid, these AI models won’t be constrained by national borders, so cloud infrastructure companies need to be wary of cybersecurity threats and will have to ensure the security of data.