June 13, 2023

Creating Reliable and Transparent AI: Why We Invested in Normal Computing

In recent years, excitement about the potential of artificial intelligence has grown steadily, a hype cycle that only continues to build with the astonishing rise of Generative AI. People are right to be optimistic – AI has the potential to revolutionize many industries, from healthcare to manufacturing to transportation.

After the first AI wave of the past decade or so, this new phase of building applications on Large Language Models (LLMs) is being labeled by many as the second AI wave. However, serious challenges must be addressed before AI can reach its full potential. Reliability, privacy, power consumption, cost, and the sheer size of the computation workload are just some of the issues standing in the way of bringing AI to scale. One example is the phenomenon of AI "hallucinations," which are unacceptable for mission-critical applications.

Power consumption and the associated costs are already major issues. Large language models like ChatGPT require a massive amount of computing power and storage to train and run, fueled by some of the most powerful computational systems available. The cost of a single ChatGPT query is estimated to be more than 20X that of a Google search, an unsustainable cost differential that is only expected to widen as models continue to grow in complexity. For example, the GPT-4 training data set is estimated to be more than 500X larger than that of GPT-3.

Today a large number of startups are building applications on OpenAI's GPT-4 or other LLMs. One can easily imagine the implications if that query volume increases by 100x or 1000x. The additional power and compute resources needed to support this scale of growth are well beyond the capacity of current infrastructure. An entirely new realm of systems innovations must be developed to ensure applied AI tools can run far more efficiently.

Yet another massive challenge for future AI applications – one facing both AI developers and business leaders looking to exploit its capabilities – is reliability. How can AI begin to comprehend the limits of its own reasoning? How can its logic and data sourcing be more transparent and auditable? How do we ensure AI can be trusted for critical applications such as finance, defense, security, or health?

As industries experience rapid advancement or disruption, our approach at Celesta is to identify platforms and technologies with the potential to accelerate mass adoption.  

This thinking is precisely why we invested in Normal Computing, an emerging startup working to address these important questions and enable AI solutions for the most critical enterprise and government applications. Celesta was excited to participate in their recently announced seed funding round, along with others including First Spark Ventures and Micron Ventures.

Normal Computing is developing a new approach to AI to address some of these critical challenges. Their unique methodology is based on probabilistic machine learning, which allows AI models to account for the uncertainty in the data they are trained on, as well as their own limitations. This makes models more reliable, less likely to make inaccurate predictions, and more aware of how best to work with humans. At the same time, the overarching goal is to make AI more efficient and therefore more accessible.
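
To make the idea of uncertainty-aware prediction concrete, here is a minimal, self-contained sketch of probabilistic machine learning in general – not Normal Computing's actual method. It uses a toy Bayesian linear regression that reports a predictive standard deviation alongside each prediction; the DEFER_THRESHOLD policy and all names are hypothetical choices for illustration only.

```python
# Illustrative sketch only: a toy Bayesian linear regression showing how a
# probabilistic model reports its own uncertainty alongside a prediction.
# Not Normal Computing's method; DEFER_THRESHOLD is a hypothetical policy knob.
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: y = 2x + noise, observed only on a narrow input range.
X_train = rng.uniform(0.0, 1.0, size=(20, 1))
y_train = 2.0 * X_train[:, 0] + rng.normal(0.0, 0.1, size=20)

# Bayesian linear regression with a Gaussian prior on the weights.
alpha, beta = 1.0, 100.0                                 # prior precision, noise precision
Phi = np.hstack([np.ones((len(X_train), 1)), X_train])   # bias column + feature
S = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)  # posterior covariance
m = beta * S @ Phi.T @ y_train                             # posterior mean of the weights

def predict(x):
    """Return predictive mean and standard deviation for a scalar input x."""
    phi = np.array([1.0, x])
    mean = phi @ m
    var = 1.0 / beta + phi @ S @ phi   # observation noise + parameter uncertainty
    return mean, np.sqrt(var)

DEFER_THRESHOLD = 0.2  # hypothetical: above this uncertainty, hand off to a human

for x in [0.5, 5.0]:   # in-distribution input vs. far outside the training range
    mean, std = predict(x)
    action = "answer" if std < DEFER_THRESHOLD else "defer to a human"
    print(f"x={x:.1f}: prediction {mean:.2f} +/- {std:.2f} -> {action}")
```

On the in-distribution input the model answers with low uncertainty, while on the out-of-distribution input the predictive spread grows and the example policy defers – a simple illustration of a model that knows the limits of its own training data.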

Led by CEO and co-founder Faris Sbahi, with a founding team hailing from Google Brain, Palantir, and X, Normal Computing has exhibited impressive passion and a tireless work ethic. They are making rapid progress toward their goal of an adaptable full-stack product. In line with Celesta Capital's tradition, we are working closely with the Normal Computing team to support them in key areas such as team building, product strategy, and open-source strategy.

We see a huge opportunity for Normal to play a defining role in this next phase of AI advancement. The solutions they are working on are mission-critical for the industry, and we believe the team has the talent and experience to achieve its ambitious goals.
