While large language models (LLMs) and generative AI have dominated enterprise conversations about AI over the past year, there are other ways businesses can benefit from the technology.
One alternative is large quantitative models (LQMs). These models are trained to optimize specific objectives and parameters relevant to an industry or application, such as material properties or financial risk metrics, in contrast to the more general language generation and comprehension tasks of LLMs. Among the leading advocates and commercial providers of LQMs is SandboxAQ, which today announced that it has raised $300 million in a new funding round. The company was originally part of Alphabet and was spun off as an independent business in 2022.
The funding is a testament to the company’s success and, more importantly, its future growth prospects as it seeks to solve enterprise AI use cases. SandboxAQ has established partnerships with leading consulting firms, including Accenture, Deloitte and EY, to distribute its enterprise solutions. The key advantage of LQMs is their ability to address complex, domain-specific problems in industries where the underlying physics and quantitative relationships are critical.
“This is about building core products in companies that use our AI,” SandboxAQ CEO Jack Hidary told VentureBeat. “And so if you want to create a drug, a diagnostic, a new material or you want to do risk management at a big bank, that’s where quantitative models shine.”
LQMs have different objectives and work differently from LLMs. Rather than ingesting text data scraped from the internet, LQMs generate their own training data from mathematical equations and physical principles. The goal is to address the quantitative challenges a company might face.
“We generate data and we get it from quantitative sources,” Hidary explained.
This approach allows advances in areas where traditional methods have stagnated. For example, in battery development, where lithium-ion technology has dominated for 45 years, LQMs can simulate millions of possible chemical combinations without the need for physical prototypes.
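To make the "data generated from equations" idea concrete, here is a minimal, hypothetical sketch: candidate parameters are labeled by evaluating a known physical relation, and a cheap surrogate model is then fit so millions of candidates can be screened without running the full calculation each time. The descriptors, the toy formula, and the surrogate are invented for illustration and are not SandboxAQ's actual pipeline.

```python
# Sketch: label sampled candidates with an equation, fit a surrogate, screen at scale.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical descriptors for candidate battery chemistries.
n_samples = 10_000
descriptors = rng.uniform(0.0, 1.0, size=(n_samples, 3))

def simulate_property(x: np.ndarray) -> np.ndarray:
    """Toy stand-in for a physics-based calculation (a solved equation)."""
    return 2.5 * x[:, 0] - 1.2 * x[:, 1] ** 2 + 0.8 * np.sin(3.0 * x[:, 2])

labels = simulate_property(descriptors)

# Fit a simple polynomial surrogate via least squares (a cheap stand-in for
# the neural-network models the article describes).
features = np.column_stack([descriptors, descriptors ** 2, np.ones(n_samples)])
coeffs, *_ = np.linalg.lstsq(features, labels, rcond=None)

# Screen a large batch of new candidates with the surrogate instead of the full simulation.
candidates = rng.uniform(0.0, 1.0, size=(1_000_000, 3))
cand_features = np.column_stack([candidates, candidates ** 2, np.ones(len(candidates))])
predicted = cand_features @ coeffs
best = candidates[np.argsort(predicted)[-5:]]
print("Top candidate descriptors:\n", best)
```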
Similarly, in pharmaceutical development, where traditional approaches face a high failure rate in clinical trials, LQMs can analyze molecular structures and interactions at the electronic level. Meanwhile, in financial services, LQMs address the limitations of traditional modeling approaches.
“Monte Carlo simulation is no longer sufficient to handle the complexity of structured instruments,” Hidary said.
A Monte Carlo simulation is a classic computational technique that uses random sampling to estimate results. With the SandboxAQ LQM approach, a financial services company can scale in ways that Monte Carlo simulation cannot. Hidary noted that some financial portfolios can be extremely complex, containing all kinds of structured instruments and options.
“If I have a portfolio and I want to know what the tail risk is, given the changes in this portfolio,” Hidary said. “What I would like to do is create 300 to 500 million versions of that portfolio with slight changes, and then I want to analyze the tail risk.”
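The tail-risk idea Hidary describes can be sketched with plain Monte Carlo: generate many perturbed versions of a portfolio, revalue each one, and examine the worst few percent of outcomes. This is an illustration of the concept, not SandboxAQ's LQM method; the positions, return assumptions, and scenario count are invented (the article cites 300 to 500 million versions, scaled down here to run on a laptop).

```python
# Sketch: tail risk (VaR / expected shortfall) from perturbed portfolio scenarios.
import numpy as np

rng = np.random.default_rng(42)

base_positions = np.array([1_000_000.0, 750_000.0, 500_000.0])  # notional per instrument
expected_return = np.array([0.04, 0.06, 0.08])                  # annualized
volatility = np.array([0.10, 0.20, 0.35])

n_scenarios = 500_000  # far below the article's 300-500 million, for illustration

# Slightly perturb the portfolio in each scenario, then draw returns.
weight_noise = rng.normal(1.0, 0.02, size=(n_scenarios, 3))
scenario_positions = base_positions * weight_noise
scenario_returns = rng.normal(expected_return, volatility, size=(n_scenarios, 3))

pnl = (scenario_positions * scenario_returns).sum(axis=1)

# Tail risk: 99% value-at-risk and expected shortfall of the simulated P&L.
var_99 = -np.percentile(pnl, 1)
expected_shortfall = -pnl[pnl <= -var_99].mean()
print(f"99% VaR: {var_99:,.0f}  Expected shortfall: {expected_shortfall:,.0f}")
```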
SandboxAQ’s LQM technology focuses on enabling companies to create new products, materials and solutions, rather than simply optimizing existing processes.
Among the business verticals in which the company has been innovating is cybersecurity. In 2023, the company first launched its Sandwich cryptography management technology, which has since been expanded into the company’s AQtive Guard enterprise solution.
The software can analyze a company’s files, applications and network traffic to identify the encryption algorithms being used, including detecting outdated or broken algorithms such as MD5 and SHA-1. SandboxAQ feeds this information into a management model that can alert the chief information security officer (CISO) and compliance teams to potential vulnerabilities.
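The scanning step described above can be illustrated with a toy example: walk a source tree, flag references to weak algorithms such as MD5 and SHA-1, and emit findings a CISO or compliance team could review. This is not AQtive Guard's actual detection logic; the file extensions and pattern list are assumptions made for the sketch.

```python
# Toy scanner: flag references to weak hash/encryption algorithms in a source tree.
import re
from pathlib import Path

WEAK_ALGORITHMS = re.compile(r"\b(md5|sha-?1|des|rc4)\b", re.IGNORECASE)

def scan_tree(root: str) -> list[dict]:
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix not in {".py", ".java", ".go", ".conf", ".yaml"}:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            match = WEAK_ALGORITHMS.search(line)
            if match:
                findings.append({
                    "file": str(path),
                    "line": lineno,
                    "algorithm": match.group(0).upper(),
                })
    return findings

if __name__ == "__main__":
    for finding in scan_tree("."):
        print(f"{finding['file']}:{finding['line']} uses {finding['algorithm']}")
```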
While an LLM could be used for the same purpose, the LQM provides a different approach. LLMs are trained on broad, unstructured internet data, which may include information about encryption algorithms and vulnerabilities. In contrast, SandboxAQ’s LQMs are built from specific quantitative data about encryption algorithms, their properties and known vulnerabilities. The LQMs use this structured data to create knowledge models and graphs specifically for encryption analysis, rather than relying on a general understanding of language.
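A minimal sketch of the knowledge-graph idea: encode structured facts about algorithms and their known weaknesses as triples, then answer questions by walking the graph rather than querying a language model. The schema and facts below are illustrative, not SandboxAQ's actual model.

```python
# Sketch: a tiny knowledge graph of algorithms and weaknesses, queried directly.
from collections import defaultdict

# (subject, relation, object) triples built from structured, quantitative sources.
triples = [
    ("MD5", "is_a", "hash_function"),
    ("MD5", "has_weakness", "collision_attack"),
    ("SHA-1", "is_a", "hash_function"),
    ("SHA-1", "has_weakness", "collision_attack"),
    ("AES-256", "is_a", "block_cipher"),
    ("RSA-2048", "vulnerable_to", "quantum_factoring"),
]

graph = defaultdict(list)
for subject, relation, obj in triples:
    graph[subject].append((relation, obj))

def weaknesses(algorithm: str) -> list[str]:
    """Return known weaknesses recorded for an algorithm in the graph."""
    return [obj for relation, obj in graph[algorithm]
            if relation in {"has_weakness", "vulnerable_to"}]

# Cross-reference algorithms detected in an environment against the graph.
in_use = ["MD5", "AES-256", "RSA-2048"]
for algo in in_use:
    found = weaknesses(algo)
    if found:
        print(f"ALERT: {algo} has known issues: {', '.join(found)}")
```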
Looking ahead, SandboxAQ is also working on a remediation module that can automatically suggest and deploy updates to the cryptography in use.
The original idea behind SandboxAQ was to combine artificial intelligence techniques with quantum computing.
Hidary and his team realized early on that real quantum computers would not be readily available or powerful enough in the short term. Instead, SandboxAQ applies quantum principles implemented on enhanced GPU infrastructure. Through a partnership with Nvidia, SandboxAQ has extended CUDA's capabilities to handle quantum techniques.
SandboxAQ also does not use transformers, which are the basis of almost all LLMs.
“The models we train are neural network models and knowledge graphs, but they are not transformers,” Hidary said. “You can generate them from equations, but you can also get quantitative data from sensors or other types of sources and networks.”
While LQMs are different from LLMs, Hidary doesn’t see it as an either/or situation for companies.
“Use LLMs for what they’re good at, then bring in LQMs for what they’re good at,” he said.