(PR) Cerebras Systems Sets Record for Largest AI Models Ever Trained on a Single Device

Cerebras Systems, the pioneer in high-performance artificial intelligence (AI) computing, today announced, for the first time ever, the ability to train models with up to 20 billion parameters on a single CS-2 system, a feat not possible on any other single device. By enabling a single CS-2 to train these models, Cerebras reduces the systems engineering time needed to run large natural language processing (NLP) models from months to minutes. It also eliminates one of the most painful aspects of NLP: the partitioning of the model across hundreds or thousands of graphics processing units (GPUs).

“In NLP, bigger models are shown to be more accurate. But traditionally, only a very select few companies had the resources and expertise necessary to do the painstaking work of breaking up these large models and spreading them across hundreds or thousands of graphics processing units,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “As a result, only very few companies could train large NLP models; it was too costly, time-consuming and inaccessible for the rest of the industry. Today we are proud to democratize access to GPT-3 1.3B, GPT-J 6B, GPT-3 13B and GPT-NeoX 20B, enabling the entire AI ecosystem to set up large models in minutes and train them on a single CS-2.”
