(PR) Cerebras Systems Sets Record for Largest AI Models Ever Trained on a Single Device

Cerebras Systems, the pioneer in high-performance artificial intelligence (AI) computing, today announced, for the first time ever, the ability to train models with up to 20 billion parameters on a single CS-2 system – a feat not possible on any other single device. By enabling a single CS-2 to train these models, Cerebras reduces the systems engineering time needed to run large natural language processing (NLP) models from months to minutes. It also eliminates one of the most painful aspects of NLP: the partitioning of the model across hundreds or thousands of small graphics processing units (GPUs).

“In NLP, bigger models are shown to be more accurate. But traditionally, only a very select few companies had the resources and expertise necessary to do the painstaking work of breaking up these large models and spreading them across hundreds or thousands of graphics processing units,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “As a result, only very few companies could train large NLP models – it was too expensive, time-consuming and inaccessible for the rest of the industry. Today we are proud to democratize access to GPT-3 1.3B, GPT-J 6B, GPT-3 13B and GPT-NeoX 20B, enabling the entire AI ecosystem to set up large models in minutes and train them on a single CS-2.”
