Cost, security, and flexibility: the business case for open source gen AI


Travel and expense management company Emburse saw multiple opportunities where it could benefit from gen AI. It could be used to improve the experience for individual users, for example, with smarter analysis of receipts, or help corporate clients by spotting instances of fraud. Take, for example, the simple job of reading a receipt and accurately classifying the expenses.

Since receipts can look very different, this can be tricky to do automatically. To solve the problem, the company turned to gen AI and decided to use both commercial and open source models. Both types of gen AI have their benefits, says Ken Ringdahl, the company’s CTO.



The main commercial model, from OpenAI, was quicker and easier to deploy and more accurate right out of the box, but the open source alternatives offered security, flexibility, lower costs, and, with additional training, even better accuracy. On the security front, many commercial providers use their customers’ data to train their models, says Ringdahl. It’s possible to opt out, but there are caveats.

For instance, you might have to pay more to ensure the data isn’t used for training, and the data might still potentially be exposed to the public. “That’s one of the catches of proprietary commercial models,” he says. “There’s a lot of fine print, and things aren’t always disclosed.”

Then there’s the geographical issue. Emburse is available in 120 different countries, and OpenAI isn’t. Plus, some regions have data residency and other restrictive requirements.

“So we augment with open source,” he says. “It allows us to provide services in areas that aren’t covered, and check boxes on the security, privacy, and compliance side.” Right now, the company is using the French-built Mistral open source model.

“We’ve evaluated all the major open source large language models and have found that Mistral is the best for our use case once it’s up-trained,” he says. “Another consideration is the size of the LLM, which could impact inference time.” Meta’s Llama, for example, is very large, he says, which slows inference.

“Our open source LLM selection certainly could change as this space is evolving rapidly,” he adds. “We’ve developed our software such that the LLM — open source or proprietary — can be swapped in or out via configuration.”
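Emburse hasn’t published its implementation, but the pattern Ringdahl describes is straightforward to sketch: read the provider and model from configuration and route every request through a single interface. Below is a minimal Python sketch of that idea; the config keys, internal endpoint, and classify_receipt helper are illustrative assumptions, not Emburse’s actual code.

```python
# Illustrative sketch of a config-driven LLM switch; names and endpoints are hypothetical.
import os

CONFIG = {
    "provider": os.getenv("LLM_PROVIDER", "mistral"),   # "openai" or "mistral"
    "model": os.getenv("LLM_MODEL", "mistral-finetuned"),
}

def classify_receipt(text: str) -> str:
    """Send the same prompt to whichever model the configuration selects."""
    prompt = f"Classify this expense receipt into a category:\n{text}"
    if CONFIG["provider"] == "openai":
        # Hosted proprietary model via the OpenAI API (openai>=1.0 client).
        from openai import OpenAI
        client = OpenAI()
        resp = client.chat.completions.create(
            model=CONFIG["model"],
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
    # Self-hosted open source model served behind a hypothetical internal endpoint.
    import requests
    resp = requests.post(
        "http://llm.internal:8080/v1/completions",
        json={"model": CONFIG["model"], "prompt": prompt, "max_tokens": 64},
        timeout=30,
    )
    return resp.json()["choices"][0]["text"]
```

Switching from a proprietary model to an open source one then becomes a configuration change rather than a code change.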

Another benefit is that with open source, Emburse can do additional model training. The company has examples of receipts, already labeled and categorized, in many different formats and languages. “We’ve fine-tuned it for our own specific use case to be very good so the success rate is extremely high,” he says. That means, for non-English use cases, the fine-tuned open source models can be more accurate than the big commercial ones.

Open source models also offer companies more flexibility in when to upgrade. “OpenAI’s current model is GPT-4o, but they’ll come out with version five and eventually version four will go away — on their schedule, not mine,” says Ringdahl. That’s a problem, since building commercial products requires a lot of testing and optimization.

“With open source, you have control over where you’re using it and when it’ll go away,” he says. Finally, there’s the price. Open source isn’t completely free, as there are still infrastructure and management costs.

“In our case, we run it on AWS within our own private cloud,” he says. “So we’re still paying for usage. That can still lead to sticker shock if you don’t understand usage patterns and how that impacts your charges.”

But overall, there’s definitely a cost savings from not having to pay OpenAI’s API charges. “That’s probably one of the top two or three reasons to use an open source model,” he says. “You get more control over your costs.”

Other companies are also finding that open source gen AI models can offer more flexibility, security, and cost advantages, although there are risks.

An abundance of choice

In the most general sense, “open source” here means that the code is available, and that the model can be modified and used for free in a variety of contexts. And there are plenty of such models to choose from.

Hugging Face currently tracks more than 150,000 LLMs for text generation alone, up from 80,000 six months ago. Too many to choose from? Chatbot Arena ranks more than 160 top models — both proprietary and open source — and lists their licenses. And there are thousands of gen AI-related open source tools, on top of the models themselves.
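For developers sizing up that catalog, the Hub can also be queried programmatically. Here is a minimal sketch using the huggingface_hub client library; the tag filter, sort order, and result limit are illustrative choices.

```python
# List some of the most-downloaded models tagged for text generation on the Hugging Face Hub.
# Requires `pip install huggingface_hub`.
from huggingface_hub import list_models

models = list_models(filter="text-generation", sort="downloads", direction=-1, limit=10)
for model in models:
    print(model.id)
```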

GitHub lists more than 100,000 projects with LLM in their names, compared to 50,000 in May. But most companies stick with the big players. According to Baris Sarer, who leads the AI division of Deloitte’s technology, media, entertainment and telecommunications industry practice, Meta’s Llama model is the one that shows up most in industry deployments, followed by Mistral.

On the Chatbot Arena leaderboard, the latest Llama 3.1 is a bit behind OpenAI’s most recent model, the September release of GPT-4o, but is ahead of the August release of the same model. “Meta originally went to market with a number of smaller models,” says Sarer.

“But now they have a frontier model, too, and are going toe-to-toe with the major players.” And the market share numbers support this. According to predictive selling platform Enlyft, after GPT-4’s 41% market share, Llama is in second place with 16%.

Mistral also makes the list, though at less than 5% market share. Kong, which surveyed developers about their API use, found a similar balance, with OpenAI at 27%, Llama at 8%, and Mistral at 4%. And beyond the big-name frontier models at the top of the charts, there’s also a rapid proliferation of small language models (SLMs), designed for niche use cases.

“Studies have shown that small language models, with parameter counts in the range of millions to low billions, can outperform large, generic language models in specialized tasks,” says Anand Rao, AI professor at Carnegie Mellon University. They also require less computational power and can be fine-tuned more efficiently, he says. That makes them a better fit for deployment in resource-constrained environments.
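The fine-tuning workflow Rao alludes to is well established. As a minimal sketch of the idea rather than any vendor’s actual pipeline, the following adapts a small open model to a specialized task like the expense categorization discussed above; the model choice, CSV files, label count, and hyperparameters are all illustrative assumptions.

```python
# Minimal sketch: fine-tune a small open model for a specialized classification task.
# Requires `pip install transformers datasets`; file names and settings are hypothetical.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # small, permissively licensed model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=20)

# Hypothetical CSVs of labeled receipt text with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "receipts_train.csv",
                                          "test": "receipts_test.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="expense-classifier",
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
)
trainer.train()
```

On a narrow, well-labeled dataset like this, a small model trained on a single GPU can plausibly match or beat a generic frontier model, which is the trade-off Rao describes.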

Llama helps with sales support, coding

Deloitte’s Sarer recently worked with a data center technology company that was looking for gen AI to help transform both the front and back office. “They had a series of use cases — sales, marketing operations, field services,” he says. “We picked Meta’s Llama to be the model of choice due to cost, control, maintainability, and flexibility.”

For example, for sales prospecting, the AI is used to pull insights from internal and external sources to better prepare sales staff to pitch products and services to customers, and make upsell and cross-sell recommendations. “They rolled it out a few months ago to regions in the US and Europe, and are now making enhancements based on feedback, and will roll it out more broadly,” Sarer says. “We’re getting good feedback from the salespeople using it.”

Yet it’s too early to officially calculate ROI, he says, since that will require more data points over a longer time period. But early results are promising enough to expand the rollout. It’s true that proprietary gen AI, most often OpenAI, has the most adoption.

But there are many cases where an open source alternative makes sense, Sarer says. “If the client has a preference to deploy AI on-prem, open source is really the only game in town,” he says. “And on-prem is actually still quite prevalent in certain industries.”

And, like Emburse, many companies see geographic reasons to use open source. “Globally, we’re seeing AI increasingly viewed as important to national security and sovereignty, so there’s demand to keep AI in your geography,” he says. “That’s making open source, frankly, the only option.”

And many other companies are also finding benefits to fine-tuning their own models. “You can take a pre-trained open source model and fine-tune it with your own proprietary data,” he says.

And open source offers more flexibility in deployment, he adds. “If you want to deploy a smaller model on the edge, most of the models in that space are open source.”
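In practice, an edge deployment usually means a small, quantized open model running locally instead of calls to a hosted API. Below is a minimal sketch using llama-cpp-python, one of several options; the GGUF file name, settings, and prompt are illustrative assumptions.

```python
# Run a quantized open model locally on CPU; no external API calls required.
# Requires `pip install llama-cpp-python` and a GGUF model file downloaded separately.
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-7b-instruct-q4_k_m.gguf",  # hypothetical local file name
    n_ctx=2048,   # context window
    n_threads=4,  # CPU threads, sized for a small edge device
)

output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize this maintenance log: ..."}],
    max_tokens=128,
)
print(output["choices"][0]["message"]["content"])
```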

Finally, in addition to security and flexibility, cost is a key factor. With open source, companies still have to pay for infrastructure, but they don’t have to pay AI vendor margins.

“There’s a case for open source and that case will get stronger,” Sarer says. And even out of the box, some open source models might be better than the commercial alternatives at particular tasks. Agus Huerta, SVP of digital innovation and VP of technology at Globant, says he’s seen better performance on code generation using Llama 3 than ChatGPT.

“Llama 3 has a proven use case for providing an understanding of software and how it correlates with other lines of code,” he says. “It can also help with refactoring. Llama 3 has proven to be very good at that.”

When a new developer needs to quickly jump into a project and start being productive, it’s good for onboarding, he adds, and it’s great for maintaining a solution.

Why open source AI lags behind commercial

Lower costs, more flexibility, higher security — what’s not to love about open source? A year ago, there was a wide gap in performance between open source and proprietary models, but in this fast-moving field, last year is a long time ago. “The gap has significantly narrowed in 2024,” says Gartner analyst Arun Chandrasekaran.

“But while the gap has significantly narrowed, we don’t see a lot of open models in production yet.” One reason is that companies have made significant investments in closed source models and don’t see any urgent need to change, he says. Then there’s the operational complexity of running open source models, and the potential legal liabilities.

Legal indemnification is a common feature of gen AI contracts from OpenAI, Microsoft, Adobe, and other major vendors. That’s not the case with open source. “Model creators don’t often take on legal liability,” says Chandrasekaran.

And yes, open source models can be more easily re-trained or customized. But this process is complex and expensive, he says. “And the underlying base models are changing rapidly,” he adds.

“If you customize something and the base model changes, you have to re-customize it.” Finally, there’s the question of long-term sustainability. “It’s one thing to build an open model, release it, and have millions of people use it, versus building a business model around it and monetizing it,” he says.

“Monetizing is hard, so who’s going to continue bankrolling these models? It’s one thing to build version one, but it’s another thing to build version five.” In the end, we’re likely headed for a hybrid future, says Sreekanth Menon, global head of AI at Genpact. “Both open and closed-source models have their place, despite the popular sentiment of open source takeover,” he says.

“Enterprises are better off being model agnostic.” Closed source models, backed by well-funded companies, can push the boundaries of what’s possible in AI. “They can provide highly refined, specialized solutions that benefit from significant investment in research and development,” he says.

Why the open source definition matters to business

Meta’s Llama comes up first in any conversation about open source gen AI. But it might not technically be open source, and the distinction matters. In late October, the Open Source Initiative released the first formal definition of open source AI.

It requires open source AI to share not just the source code and supporting libraries, but also the model parameters, and a full description of the model’s training data, its provenance, scope, characteristics, and labeling procedures. But, more importantly, users must be able to use open source AI for any purpose without having to ask for permission. By that definition, Meta’s Llama models are open, but not technically open source, since there are limitations.

For example, some Llama models can’t be used to train other models. And if it’s used in an app or service with more than 700 million monthly users, a special license from Meta is required. Meta itself refers to it as a community license or a bespoke commercial license.

It’s important that corporate users understand these nuances, says Mark Collier, COO at OpenInfra Foundation, who helped work on the new definition. “To me, what matters most is that people and companies have the ability and freedom to take this fundamental technology and remix it, use it, and modify it for different purposes without having to ask a gatekeeper to give them permission.” So a company needs to feel assured it can incorporate the AI into a product and not have someone come back and say it can’t be used that way.

Vendors will sometimes announce their AI is open source because it helps with marketing and recruitment, and lets customers feel they’re not locked in. “They have this halo effect, but they’re not really living up to that,” Collier says. In the rush to adopt AI, companies might take a vendor’s description of their AI as open source at face value.

“The Meta example is a good one,” he says. “A lot of the mainstream tech coverage says this is open source AI, and that’s how Zuckerberg describes it, and it gets repeated that way. But when you get into the details, there are restrictions on the license.”

As companies get more serious about making big commercial bets on AI technology, they need to be careful with the license, he adds. There are also other benefits to using a model with a fully open source license. For example, having access to a model’s weights makes it easier to fine-tune and adapt.

Another thing for companies to watch out for is open source licenses that require all derivative works to also be open source. “If a company customized a model or fine-tuned it on their own proprietary data, they might not want to publish it,” he says. That’s because there are ways to get a model to expose its training data.

Staying on top of these issues is tricky, he admits, especially since the gen AI sector is evolving so quickly. It doesn’t help when model developers invent new licenses all the time. “If your company is releasing something open source and your lawyers are attempting to create yet another license — please don’t do that,” he says.

“There are plenty of good licenses out there. Just pick one that meets your goals.”