VSORA has announced a new AI inference processor aimed at large-scale data center workloads and energy-efficient machine learning deployment. The chip, called Jotunn8, is designed for inference rather than training, addressing the performance and energy challenges of running large AI models at scale. The processor delivers a peak compute capacity of 3,200 teraflops, operating at more than…
