Overview
Phi-3 Small is Microsoft’s ~7B-parameter open-weight LLM, tuned for instruction following, reasoning, and coding. It is compact enough for single-GPU or edge deployment, supports tool/function calling and structured (JSON) outputs, and ships under a permissive license.
Description
Phi-3 Small is a dense ~7B-parameter model in Microsoft’s Phi-3 family, designed to deliver high quality at low latency and cost. It is text-in/text-out, instruction-tuned, and strong on practical reasoning, multilingual tasks, and code generation. The model supports function/tool calling and JSON-formatted responses, and integrates cleanly with RAG and agent frameworks. Open weights and a permissive license make it easy to run on Azure AI or locally; 8-bit/4-bit quantization and LoRA fine-tuning help it fit tight memory budgets. Typical uses include chat assistants, document QA/summarization, coding helpers, and on-device prototyping where efficiency and reliability matter.
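As a concrete illustration of the local, quantized deployment described above, the sketch below loads the model in 4-bit with Hugging Face transformers and bitsandbytes and runs a short chat-style generation. It is a minimal sketch, not an official recipe: the repo id "microsoft/Phi-3-small-8k-instruct", the use of trust_remote_code, and the quantization settings are assumptions to verify against the model card and your hardware.

```python
# Minimal sketch: 4-bit local inference with transformers + bitsandbytes.
# The repo id and trust_remote_code requirement are assumptions; check the
# model card for the exact identifier and dependencies before running.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "microsoft/Phi-3-small-8k-instruct"  # assumed Hugging Face repo id

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # fit ~7B weights on a single consumer GPU
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for speed/quality
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
    trust_remote_code=True,
)

# Build a chat-style prompt with the tokenizer's chat template and generate.
messages = [{"role": "user", "content": "Summarize retrieval-augmented generation in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same loading pattern extends to the other workflows mentioned above: swap the quantization config for a LoRA adapter via peft for fine-tuning, or constrain decoding with a JSON grammar/schema tool of your choice for structured outputs.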
About Microsoft
Microsoft is a global technology company headquartered in Redmond, Washington, known for Windows, Office, and Azure, and the publisher of the Phi family of small open-weight language models.
Location: Redmond, WA, US
Website: news.microsoft.com
Last updated: October 6, 2025