OPT

Category: Text generation
Released: May 1, 2022

Overview

OPT-175B is Meta AI's 175B-parameter Open Pre-trained Transformer. It targets GPT-3 class capabilities with open weights for research, strong few-shot performance, and a standard decoder-only architecture suitable for analysis, baselines, and downstream fine-tuning.
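
As a quick illustration of the few-shot, decoder-only usage described above, here is a minimal sketch that prompts one of the smaller OPT checkpoints through the Hugging Face transformers library. The checkpoint name facebook/opt-1.3b and the sentiment prompt are assumptions for illustration; the 175B weights themselves are distributed separately for research use.

```python
# Minimal sketch (not from this listing): few-shot prompting with a smaller
# OPT checkpoint via Hugging Face transformers. "facebook/opt-1.3b" and the
# example prompt are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Decoder-only few-shot prompt: labeled examples followed by an unfinished case.
prompt = (
    "Review: The battery dies in an hour. Sentiment: negative\n"
    "Review: Crisp screen and great speakers. Sentiment: positive\n"
    "Review: Shipping took three weeks. Sentiment:"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=3, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```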

Description

OPT-175B was released to advance transparent large-scale language modeling. It follows the classic decoder-only Transformer recipe at very high capacity, trained on large public text corpora and accompanied by a suite of smaller OPT checkpoints (125M to 66B parameters) so researchers can reproduce scaling trends. The release prioritized openness, including training logbooks, evaluation details, and weights under a non-commercial research license, which made the model a common baseline for studies on efficiency, safety, and alignment. In practice it handles long-form generation, summarization, and few-shot tasks competently, transfers to instruction or domain tuning with standard methods, and serves as a reference point for systems work on memory, quantization, and distributed inference. Compared with newer generations it has a shorter context window (2,048 tokens) and lighter alignment, so teams often add guardrails or fine-tune it before deploying it in modern pipelines.
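
Since the description mentions quantization and memory work, the sketch below shows one common pattern: loading a mid-size OPT checkpoint in 8-bit with transformers and bitsandbytes. The checkpoint name facebook/opt-6.7b, the prompt, and the availability of a CUDA GPU are assumptions for illustration, not part of the original release.

```python
# Minimal sketch (assumes bitsandbytes, accelerate, and a CUDA GPU are available):
# 8-bit loading of a mid-size OPT checkpoint, the kind of memory/quantization
# experiment the OPT family is often used to baseline.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "facebook/opt-6.7b"  # illustrative checkpoint, not OPT-175B itself
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # int8 weights
    device_map="auto",  # place layers across available GPU/CPU memory
)

inputs = tokenizer(
    "Summarize: Open model weights let researchers audit and fine-tune large "
    "language models without retraining them from scratch.",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```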

About Meta

We're connecting people to what they care about, powering new, meaningful experiences, and advancing the state-of-the-art through open research and accessible tooling.

Location: Menlo Park, California, US

Last updated: October 14, 2025