Microsoft launched several new “open” AI models on Wednesday, the most capable of which is competitive with OpenAI’s o3-mini on at least one benchmark.
All of the new permissively licensed models, Phi 4 mini reasoning, Phi 4 reasoning, and Phi 4 reasoning plus, are “reasoning” models, meaning they can spend more time checking their solutions to complex problems. They expand Microsoft’s Phi family of “small models,” which the company launched a year ago to offer a foundation for developers building AI apps.
Phi 4 mini reasoning was trained on roughly 1 million synthetic math problems generated by DeepSeek’s R1 reasoning model. At around 3.8 billion parameters, Phi 4 mini reasoning is designed for educational applications, Microsoft says, such as “embedded tutoring” on lightweight devices.
Parameters roughly correspond to a model’s problem-solving ability, and models with more parameters generally perform better than those with fewer.
Phi 4 reasoning, a 14-billion-parameter model, was trained using “high-quality” web data as well as “curated demonstrations” from OpenAI’s o3-mini. It is best suited for math, science, and coding applications, according to Microsoft.
As for Phi 4 reasoning plus, it is Microsoft’s previously released Phi-4 model adapted into a reasoning model to achieve better accuracy on specific tasks. Microsoft claims that Phi 4 reasoning plus approaches the performance of R1, a model with significantly more parameters (671 billion). The company’s internal benchmarking also has Phi 4 reasoning plus matching o3-mini on OmniMath, a math skills test.
Phi 4 mini reasoning, Phi 4 reasoning, and Phi 4 reasoning plus are available on the AI dev platform Hugging Face, accompanied by detailed technical reports.
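For readers who want to try the models, a minimal sketch of loading one of them with the Hugging Face transformers library is below. The repo ID “microsoft/Phi-4-mini-reasoning” is an assumption based on Microsoft’s naming; check the actual model cards for the exact identifiers and recommended generation settings.

```python
# Minimal sketch: load a Phi 4 reasoning model from Hugging Face and run a prompt.
# The repo ID below is assumed, not confirmed by this article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-4-mini-reasoning"  # assumed Hugging Face repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Reasoning models are typically prompted via the chat template so the model
# can emit intermediate reasoning before its final answer.
messages = [{"role": "user", "content": "What is the derivative of x^3 + 2x?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```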
“Using distillation, reinforcement learning, and high-quality data, these [new] models balance size and performance,” Microsoft wrote in a blog post. “They are small enough for low-latency environments yet maintain strong reasoning capabilities that rival much bigger models. This blend allows even resource-limited devices to perform complex reasoning tasks efficiently.”
