Generative pre-trained transformer

Type of large language model

Generative pre-trained transformers (GPTs) are a type of large language model (LLM) and a prominent framework for generative artificial intelligence. They are artificial neural networks that are used in natural language processing tasks. GPTs are based on the transformer architecture, pre-trained on large data sets of unlabelled text, and able to generate novel human-like content. As of 2023, most LLMs have these characteristics and are sometimes referred to broadly as GPTs.
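The "generative" part of the name refers to autoregressive text generation: the model repeatedly predicts a probability distribution over the next token given the tokens so far, and a token is sampled from it. A minimal sketch of that loop is shown below; the hand-written bigram table stands in for a real pre-trained transformer and is purely illustrative.

```python
import random

# Toy conditional distributions: P(next token | current token).
# A real GPT conditions on the whole context with a transformer;
# this bigram table only illustrates the shape of the loop.
BIGRAMS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"sat": 1.0},
    "sat": {"</s>": 1.0},
}

def generate(seed=0, max_tokens=10):
    """Autoregressively sample tokens until an end marker or a length cap."""
    rng = random.Random(seed)
    token, out = "<s>", []
    for _ in range(max_tokens):
        probs = BIGRAMS[token]
        # Sample the next token from the model's conditional distribution.
        token = rng.choices(list(probs), weights=list(probs.values()))[0]
        if token == "</s>":
            break
        out.append(token)
    return " ".join(out)

print(generate())
```

Scaling this same loop up, with a transformer providing the next-token distribution and billions of documents providing the training signal, is what lets GPT-style models produce novel human-like text.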
