DeepSeek-R1 is a new AI model that has taken the technology world by storm. It was launched by DeepSeek, a Chinese AI startup, and built on innovative training techniques.
DeepSeek is newsworthy for many reasons:
- DeepSeek reportedly cost almost 100 times less to train than comparable models. It broke the myth that the only way to build better AI is to throw more data and compute at it, requiring ever more data centers.
- DeepSeek costs less to run at inference, so enterprises can build AI with lower token costs.
- DeepSeek published its approach and code as open source. This has opened the floodgates for startups and innovators globally to build on the new model.

Reasoning Model
DeepSeek competes with OpenAI's latest reasoning model, o3-mini. See below if you want to understand what reasoning means in LLMs and Agent AI.

Impact on AI markets
Many tech stocks crashed in valuation because investors realized that huge numbers of GPUs and large data centers were not required to build AI.
DeepSeek has broken the top tech companies' stronghold on the AI foundation model space by releasing its code as open source. Now AI is accessible to everyone. Here is DeepSeek's open-source GitHub repository.
What is the impact on industries?
Enterprises typically adopt a new technology when tools are built on top of it, making it easy to build applications. This middleware layer is still missing in the LLM space. Intelligent chatbots for customer service are among the only applications built on LLMs today.
Every time a customer chats with you for support, or inquires about your product during the pre-sales cycle (typically considered good for business), LLM-based AI incurs a token cost at inference.
What is token cost at inference?
After you build and deploy an AI model, that production version is called inference AI. With LLMs, you pay an incremental cost, measured in tokens, every time you run your application, and each chat with a customer costs a small amount that quickly adds up. Enterprise markets were therefore not adopting LLMs with full gusto because of these high token costs.
Now DeepSeek has shown that its token cost is much cheaper.
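To see why per-token pricing matters at enterprise scale, here is a minimal back-of-the-envelope sketch. The chat volumes and per-token prices below are made-up assumptions for illustration only, not actual vendor pricing.

```python
# Hypothetical illustration of inference token costs.
# All numbers below are assumptions, not real vendor pricing.

def monthly_token_cost(chats_per_day, tokens_per_chat, price_per_million_tokens):
    """Estimate monthly inference cost for a support chatbot."""
    tokens_per_month = chats_per_day * 30 * tokens_per_chat
    return tokens_per_month / 1_000_000 * price_per_million_tokens

# Assume 10,000 support chats per day, ~2,000 tokens each (prompt + response).
incumbent = monthly_token_cost(10_000, 2_000, 10.0)   # assumed $10 per 1M tokens
cheaper = monthly_token_cost(10_000, 2_000, 0.50)     # assumed $0.50 per 1M tokens

print(f"Incumbent model: ${incumbent:,.0f}/month")   # $6,000/month
print(f"Cheaper model:   ${cheaper:,.0f}/month")     # $300/month
```

Even at these modest volumes, a 20x difference in per-token price turns a meaningful line item into a rounding error, which is why cheaper inference changes the enterprise adoption calculus.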
Data privacy and Security
DeepSeek is new to the market, and just like the first version of GPT-3.5 that powered ChatGPT, it will go through iterations to adopt enterprise-grade security and privacy protections.
But in the meantime, because DeepSeek shared its model code as open source, several startups are offering hosted versions of DeepSeek. It remains to be seen whether this will evolve into a Linux-like model, with a Red Hat equivalent picking up market share by offering enterprise-grade SLAs built on open-source software.
I believe the opportunity is ripe for enterprise industry innovators to test new LLM options that are cost-effective and secure, and to demand that the overall AI market compete and innovate to serve them better.
—
Sudha Jamthe is a globally acclaimed technology futurist, author and educator with AI courses at LinkedIn Learning and at select university and EMBA programs. Sudha can be reached at her learner community at Business School of AI on LinkedIn.