Microsoft launches a “small” AI language model, Phi-3 Mini, trained on AI outputs

Microsoft has announced Phi-3 Mini, a small language model trained partly on children's books written by other language models. Its capabilities should be roughly in line with OpenAI's GPT-3.5.

Phi-3 Mini does not appear to be available online at the time of writing, but Microsoft has published a paper on the model. It has 3.8 billion parameters, which is relatively small; the advantage is that it is light enough to run on laptops and smartphones. According to Microsoft in an article from The Verge, the training data came partly from the web and partly from “children's books” created by AI chatbots using only the 3,000 most common words.
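To illustrate why 3.8 billion parameters is light enough for consumer hardware, a back-of-envelope calculation of the weight storage alone is sketched below. This is a rough estimate; the 16-bit and 4-bit storage formats are common assumptions, not figures from Microsoft.

    # Approximate memory needed just to hold 3.8 billion weights,
    # ignoring activations and the KV cache.
    PARAMS = 3.8e9

    def weight_memory_gb(bits_per_param: float) -> float:
        return PARAMS * bits_per_param / 8 / 1e9

    print(f"16-bit weights: ~{weight_memory_gb(16):.1f} GB")  # ~7.6 GB
    print(f"4-bit weights:  ~{weight_memory_gb(4):.1f} GB")   # ~1.9 GB

At 4-bit precision the weights fit comfortably in the memory of a recent laptop or high-end smartphone, which is the advantage Microsoft points to for small models.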

The goal is to prove that, with the right training data, it is possible to build an equally powerful model with far fewer parameters. According to Microsoft, Phi-3 Mini can compete with GPT-3.5 and Mixtral; the latter has 45 billion parameters.

The Mini version is the first in a series of three Phi-3 models that Microsoft plans to release. Phi-3 Small has seven billion parameters and Phi-3 Medium has fourteen billion; it is not known when they will be released. Microsoft publishes its Phi models on Hugging Face, and they can also be used via Microsoft's Azure cloud platform and the Ollama desktop app.
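For readers who want to try it, a minimal sketch of loading the model from Hugging Face with the transformers library could look like the following. The repository name "microsoft/Phi-3-mini-4k-instruct" and the need for trust_remote_code are assumptions and may not match the actual listing.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed repository name
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

    prompt = "Explain in one sentence why small language models matter."
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=60)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

With Ollama, the model would presumably be pulled with a command along the lines of "ollama run phi3" once it is listed there; the exact model tag is an assumption.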

It is not known what Microsoft's plans are for Phi. For its Copilot products it generally uses OpenAI's GPT models, which it adapts to the services into which Microsoft integrates them.

Image: Microsoft Phi-3 Mini