At this year's Microsoft Build, we introduced the Phi-3 series of small language models (SLMs), a groundbreaking addition to our Azure AI model catalog. The Phi-3 models, which include Phi-3-mini, Phi-3-small, and Phi-3-medium, represent a significant advancement in the realm of generative AI, designed to deliver large model performance in a compact, efficient package.
The Phi-3 series stands out by offering the capabilities of significantly larger models while requiring far less computational power. This makes Phi-3 models ideal for a wide range of applications, from enhancing mobile apps to powering devices with stringent energy requirements. These models support extensive context lengths—up to 128K tokens—pushing the boundaries of what small models can achieve.
Features and Benefits
Experience the efficiency and agility of Phi-3 small language models in the Azure AI model catalog through the Pay-As-You-Go (PAYGO) offering via serverless APIs. PAYGO lets you pay only for what you use, which is ideal for managing costs without compromising on performance. For consistent throughput and minimal latency, Phi-3 models offer competitive per-unit pricing, providing you with a clear and predictable cost structure. Pricing takes effect on June 1, 2024 at 00:00 UTC (May 31, 2024 at 5:00 pm PDT).
These models are available in the East US 2 and Sweden Central regions.
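Once a serverless deployment is created, calling it is a standard HTTPS request. The sketch below assembles such a request in Python; note that the endpoint URL and API key are placeholders, and the exact request path and authentication header can vary by deployment and API version, so treat this as an illustration rather than the definitive call shape.

```python
import json


def build_chat_request(endpoint: str, api_key: str, messages: list) -> tuple:
    """Assemble a chat-completions request for a Phi-3 serverless deployment.

    The endpoint and key come from your own deployment in Azure AI Studio;
    the values passed below are placeholders, not real credentials.
    """
    url = endpoint.rstrip("/") + "/v1/chat/completions"  # path may vary by API version
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "messages": messages,
        "max_tokens": 256,
    })
    return url, headers, body


url, headers, body = build_chat_request(
    "https://example-phi3.eastus2.models.ai.azure.com",  # hypothetical endpoint
    "YOUR_API_KEY",
    [{"role": "user", "content": "Summarize the Phi-3 family in one sentence."}],
)
# Send with any HTTP client, e.g.:
#   import urllib.request
#   req = urllib.request.Request(url, body.encode(), headers)
#   print(urllib.request.urlopen(req).read())
```

Because the request is plain JSON over HTTPS, any language or HTTP client works; no Azure-specific SDK is required.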
| Model | Context | Input (USD per 1,000 tokens) | Output (USD per 1,000 tokens) |
|---|---|---|---|
| Phi-3-mini-4k-instruct | 4K | 0.00013 | 0.00052 |
| Phi-3-mini-128k-instruct | 128K | 0.00013 | 0.00052 |
| Phi-3.5-mini-instruct | 128K | 0.00013 | 0.00052 |
| Phi-3-small-8k-instruct | 8K | 0.00015 | 0.0006 |
| Phi-3-small-128k-instruct | 128K | 0.00015 | 0.0006 |
| Phi-3-medium-4k-instruct | 4K | 0.00017 | 0.00068 |
| Phi-3-medium-128k-instruct | 128K | 0.00017 | 0.00068 |
| Phi-3.5-vision-instruct | 128K | 0.00013 | 0.00052 |
| Phi-3.5-MoE-instruct | 128K | 0.00016 | 0.00064 |
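With per-token rates from the table above, estimating a serverless bill is straightforward multiplication. The helper below is a minimal sketch; the function name and token counts are illustrative, using the Phi-3-mini rates as an example.

```python
def serverless_cost(input_tokens: int, output_tokens: int,
                    input_price: float, output_price: float) -> float:
    # Rates are quoted per 1,000 tokens, so scale token counts down by 1,000.
    return (input_tokens / 1000) * input_price + (output_tokens / 1000) * output_price


# Phi-3-mini rates: $0.00013 per 1K input tokens, $0.00052 per 1K output tokens.
monthly = serverless_cost(1_000_000, 250_000, 0.00013, 0.00052)
print(f"${monthly:.2f}")  # roughly $0.26 for 1M input + 250K output tokens
```

Swapping in the rates for another model from the table gives the corresponding estimate; actual invoices depend on exact token counts metered by the service.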
Fine-tuning is available in the East US 2 region.
| Model | Context | Fine-tune training job (USD per 1,000 tokens) | Fine-tuned model hosting (USD per hour) |
|---|---|---|---|
| Phi-3-mini-4k-instruct | 4K | 0.003 | 0.80 |
| Phi-3-mini-128k-instruct | 128K | 0.003 | 0.80 |
| Phi-3-medium-4k-instruct | 4K | 0.003 | 0.80 |
| Phi-3-medium-128k-instruct | 128K | 0.003 | 0.80 |
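Fine-tuning costs combine a one-time training charge (billed per 1,000 training tokens) with ongoing hosting (billed per hour deployed). A minimal sketch, with illustrative token counts and the rates from the table above:

```python
def finetune_cost(training_tokens: int, hosting_hours: float,
                  train_rate: float = 0.003, host_rate: float = 0.8) -> float:
    # Training is billed per 1,000 tokens processed; hosting per hour deployed.
    return (training_tokens / 1000) * train_rate + hosting_hours * host_rate


# e.g. a 10M-token training job plus one day (24 hours) of hosting
total = finetune_cost(10_000_000, 24)
print(f"${total:.2f}")  # about $30 for training plus $19.20 for hosting
```

Note that hosting accrues for as long as the fine-tuned model stays deployed, so for long-running deployments the hourly charge, not the training job, dominates the total.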
Stay tuned for more updates on Phi-3, and prepare to transform your applications with the efficiency, versatility, and power of Phi-3 small language models. For more information, visit our product page or contact our sales team to see how Phi-3 can fit into your technology stack.