(PR) Cerebras Systems Sets Record for Largest AI Models Ever Trained on a Single Device
Cerebras Systems, the pioneer in high-performance artificial intelligence (AI) computing, today announced, for the first time ever, the ability to train models with up to 20 billion parameters on a single CS-2 system – a feat not possible on any other single device. By enabling a single CS-2 to train these models, Cerebras reduces the system engineering time needed to run large natural language processing (NLP) models from months to minutes. It also eliminates one of the most painful aspects of NLP – namely, partitioning the model across hundreds or thousands of graphics processing units (GPUs).
“In NLP, bigger models are shown to be more accurate. But traditionally, only a very select few companies had the resources and expertise necessary to do the painstaking work of breaking up these large models and spreading them across hundreds or thousands of graphics processing units,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “As a result, only very few companies could train large NLP models – it was too expensive, time-consuming and inaccessible for the rest of the industry. Today we are proud to democratize access to GPT-3 1.3B, GPT-J 6B, GPT-3 13B and GPT-NeoX 20B, enabling the entire AI ecosystem to set up large models in minutes and train them on a single CS-2.”