
Phi 3.5 mini

Category: Text generation
Released: August 22, 2024

Overview

Phi-3.5-mini is Microsoft’s compact open-weight LLM (~3.8B parameters) tuned for strong instruction following, reasoning, and coding. It supports long contexts (up to 128K tokens), function calling, and JSON output, and is MIT-licensed for easy use on Azure AI or locally.
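
As a rough sketch of the "run locally" path, the snippet below loads the instruct variant with Hugging Face transformers and runs one chat turn. The model ID microsoft/Phi-3.5-mini-instruct, the transformers version, and the hardware setup are assumptions about your environment, not details from this listing.

```python
# Minimal local-inference sketch (assumes a recent transformers release and
# the microsoft/Phi-3.5-mini-instruct checkpoint on Hugging Face).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"  # assumed model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick bf16/fp16 automatically where supported
    device_map="auto",    # place weights on GPU if one is available
)

messages = [
    {"role": "system", "content": "You are a concise coding assistant."},
    {"role": "user", "content": "Write a Python one-liner that reverses a string."},
]

# Build the prompt with the model's chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```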

Description

Phi-3.5-mini is a small, efficient dense model in the Phi-3.5 family, designed to deliver high quality at low latency and cost. With ~3.8B parameters, it is well suited to mobile/edge devices or single-GPU servers while still handling non-trivial reasoning, multilingual tasks, and code generation. It offers a long context window (up to 128K tokens) for summarizing large documents and navigating codebases, plus structured JSON output, function/tool calling, and system/instruction tuning for reliable behavior. The model ships as open weights under an MIT license, making it straightforward to deploy on Azure AI or run locally with common inference stacks; lightweight fine-tuning (e.g., LoRA) and 8/4-bit quantization are supported to fit tighter memory budgets. Typical uses include chat assistants, RAG pipelines, code helpers, batch summarization, and on-device prototyping where speed, cost, and permissive licensing matter.
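
For the 4-bit quantization mentioned above, one common route is a bitsandbytes NF4 config passed to transformers; the sketch below assumes a CUDA GPU, the bitsandbytes and accelerate packages, and the same assumed model ID as before.

```python
# Sketch of 4-bit loading to fit tighter memory budgets (assumptions: CUDA GPU,
# bitsandbytes + accelerate installed, microsoft/Phi-3.5-mini-instruct on the Hub).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "microsoft/Phi-3.5-mini-instruct"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize linear layers to 4 bits
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # matmuls still run in bf16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
# From here, generation works the same as with the full-precision weights.
```

A quantized base like this is also what LoRA-style fine-tuning (e.g., via the peft library) typically attaches adapters to when memory is tight.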

About Microsoft

Location: Redmond, WA, US

Last updated: September 22, 2025