Google has introduced PaLM 2 (Pathways Language Model 2), the next generation of its large language model, with improved multilingual, coding, and reasoning capabilities.
For multilingual tasks, PaLM 2 was pre-trained more heavily on multilingual text spanning more than 100 languages, improving its ability to understand and translate nuanced text, including idioms, poems, and riddles.
PaLM 2 was also pre-trained on a wider-ranging data set than its predecessor, including scientific papers, web pages, and mathematical expressions, Google said. As a result, PaLM 2 demonstrates improved capabilities in common-sense reasoning, mathematics, and logic.
Developers can sign up to use the PaLM 2 model directly or access it through Vertex AI.
Google noted that PaLM 2 is also faster and more efficient than previous models, and it comes in four sizes for a range of use cases, allowing PaLM 2 to be fine-tuned to support entire classes of products. PaLM 2 already powers more than 25 Google products and features.
The original PaLM was unveiled in April 2022 as a 540-billion-parameter, dense decoder-only Transformer model trained with the Pathways system.
Copyright © 2023 IDG Communications, Inc.