Last year, Apple’s WWDC keynote highlighted the company’s ambitious push into AI. This year, Apple played down Apple Intelligence and focused on updates to its operating systems, services, and software, introducing a new design aesthetic it calls “Liquid Glass” along with a new naming convention.
Still, Apple tried to appease the crowd with several AI-related announcements, including an image analysis tool, a workout coach, a live translation feature, and more.
Visual Intelligence
Visual Intelligence is Apple’s image analysis technology that lets you gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize a jacket someone is wearing.
Now, the feature will also be able to interact with what’s on your iPhone screen. For example, if you come across a post in a social media app, Visual Intelligence can run an image search on whatever you’re looking at while browsing, using Google Search, ChatGPT, and similar apps.
To access Visual Intelligence, open Control Center or customize the Action button (the same button typically used to take a screenshot). The feature will be available with iOS 26 when it launches later this year. Read more.
ChatGPT comes to Image Playground
Apple integrated ChatGPT into Image Playground, its AI-powered image generation tool. With ChatGPT, the app can now create images in new styles, such as “anime,” “oil painting,” and “watercolor.” There will also be an option to send a prompt to ChatGPT to generate additional images. Read more.
Workout Buddy
Apple’s new workout coach is exactly what it sounds like: it uses a text-to-speech model to deliver encouragement while you exercise, mimicking the voice of a personal trainer. When you start a run, the AI in the Workout app gives you a motivational pep talk, calling out key moments such as when you ran your fastest mile and your average heart rate. After you finish, the AI summarizes your average pace and heart rate and whether you hit any milestones. Read more.
Live translation
Apple Intelligence is powering a new live translation feature for Messages, FaceTime, and phone calls. The technology automatically translates text or spoken words into the user’s preferred language in real time. During FaceTime calls, users will see live captions, while on phone calls, Apple will speak the translation aloud. Read more.
AI helps with unknown callers
Apple introduced two new AI features for phone calls. The first, called Call Screening, automatically answers calls from unknown numbers in the background, letting users hear the caller’s name and reason for calling before deciding whether to pick up.
The second feature, Hold Assist, automatically detects hold music when you’re waiting on the line with a call center. Users can choose to stay connected while they wait, freeing them to use their iPhone for other tasks. A notification will alert them when a live agent becomes available. Read more.
Poll proposals in messages
Apple also introduced a new feature that lets users create polls in the Messages app. The feature uses Apple Intelligence to suggest polls based on the context of your conversations. For example, if people in a group chat are having trouble deciding where to eat, Apple Intelligence will suggest starting a poll to help the group land on a decision. Read more.
Shortcuts powered by AI
The Shortcuts app is becoming more capable with Apple Intelligence. The company explained that when building a shortcut, users will be able to select an AI model to power features such as AI summarization. Read more.
Updates to Spotlight
A small update is coming to Spotlight, the on-device search feature on the Mac. It will now use Apple Intelligence for better contextual awareness, surfacing suggestions for actions users typically perform, tailored to their current tasks. Read more.
Foundation Models for developers
Apple is now giving developers access to its AI models, even when offline. The company introduced the Foundation Models framework, which lets developers build more AI capabilities into their third-party apps using Apple’s on-device models. This will likely encourage more developers to ship new AI features as Apple competes with other AI companies. Read more.
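For developers curious what that looks like in practice, here is a minimal sketch of calling the framework from Swift, based on Apple’s developer announcement; the exact API surface shown (`LanguageModelSession`, `respond(to:)`) is an assumption and may differ in the shipping SDK:

```swift
import FoundationModels

// Hypothetical sketch of Apple's on-device Foundation Models framework,
// as announced at WWDC; names and signatures may differ in the final SDK.
func summarize(_ note: String) async throws -> String {
    // A session wraps a conversation with the on-device model,
    // so no network connection is required.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize this note in one sentence: \(note)"
    )
    return response.content
}
```

Because the model runs on-device, a call like this is expected to work offline, with no per-request cost to the developer.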
Apple’s Siri letdown
The most frustrating news from the event was that the long-awaited Siri upgrades still aren’t ready. Attendees had hoped to see the promised AI-powered features that were expected to debut. However, Craig Federighi, Apple’s SVP of software engineering, said the company wouldn’t have more to share until next year. The delay may raise questions about Apple’s strategy for its voice assistant in an increasingly competitive market. Read more.
