530 Billion Parameters! Microsoft and NVIDIA Trained the Largest Generative Language Model | Synced

On October 11, Microsoft introduced the largest and “the most powerful monolithic transformer language model” trained to date: a 530-billion-parameter, GPT-3-style generative language model.

Source: Synced | AI Technology & Industry Review