EnCharge AI, a chip startup born at a Princeton University lab, on Wednesday said it raised $21.7 million as it looks to commercialize its computing technology that is designed to run artificial intelligence applications more efficiently.
Its first products will be cards that can be easily slotted into server racks for companies to run AI applications, said Naveen Verma, CEO and co-founder of EnCharge AI and a professor of electrical and computer engineering at Princeton. He didn’t give a timeline for the product launch.
EnCharge AI’s chips compute directly in on-chip memory, using a specialized chip design and software. As AI models grow larger and more diverse, more efficient computing becomes critical, Verma said: the traditional approach of shuttling data between memory and processor creates bottlenecks that slow computation.
The technology’s research and development was originally funded by the U.S. Department of Defense’s Defense Advanced Research Projects Agency (DARPA), the company said. DARPA, which develops military technologies, is credited with creating the internet and runs public contests for humanoid robots and self-driving cars.
The chips will first be used in factories, warehouses and retail spaces to run AI applications, Verma said. For now, he said, they will focus on inference work – for example, identifying whether something is a dog or a cat – rather than on training, which teaches a model what a dog or a cat is.
EnCharge AI said the latest funding round was led by Anzu Partners with participation from AlleyCorp, Scout Ventures, Silicon Catalyst Angels, Schams Ventures, E14 Fund and Alumni Ventures. Reuters