Not just Vision Pro! Counting the AI moments at Apple's WWDC23 last night
The highlight of last night's WWDC23 keynote was undoubtedly the surprise announcement of the $3,499 mixed-reality headset Vision Pro.
But attentive viewers will have heard "machine learning" mentioned more than once throughout the event. Familiar AI terms such as "machine learning" and "Transformer model" ran through the entire keynote.
From the Mac to the iPhone to AirPods, from hardware to software, machine learning was everywhere.
Hardware: serving real usage scenarios
M2 Ultra: unified memory shows its advantage in running large models
Last night, Apple officially unveiled the M2 Ultra chip, which joins two M2 Max dies to offer up to 192GB of unified memory.
At WWDC23, Apple emphasized that such a large pool of unified memory can handle machine learning workloads that other PCs simply cannot. For example, when running large Transformer models on it, you no longer have to worry about memory bottlenecks.
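A quick back-of-the-envelope calculation (my own illustrative numbers, not Apple's) shows why 192GB matters: a Transformer's weight memory is roughly its parameter count times the bytes per parameter.

```python
# Rough weight-memory estimate for a Transformer (illustrative, not Apple's
# figures): memory ≈ parameter count × bytes per parameter.
def weight_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """fp16/bf16 weights take 2 bytes per parameter."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 70B-parameter model in fp16 needs ~130 GB just for its weights:
# it fits comfortably in 192 GB of unified memory, but not in the
# 24 GB of VRAM on a typical consumer GPU.
print(round(weight_memory_gb(70), 1))  # -> 130.4
```

This ignores activations and the KV cache, so real usage is higher still, which only strengthens the unified-memory argument.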
AirPods: personalized volume lets your headphones "learn" your habits
Through machine learning, AirPods can "learn" to adjust the volume automatically, fine-tuning your media volume according to your preferences at different times of day and in different environments.
In addition, AirPods now support Adaptive Audio, which automatically adjusts volume and noise-cancellation settings based on ambient sound, so you can stay focused on your content or talk to people nearby.
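Apple hasn't said how this learning works; conceptually, though, a personalized volume preference can be as simple as an exponential moving average of the user's manual adjustments, keyed by listening context. The sketch below is purely my own illustration, not Apple's algorithm.

```python
# Toy sketch of personalized volume (my own assumption of how such a
# feature could work -- not Apple's actual implementation): keep an
# exponential moving average of manual volume choices per context.
class VolumePreference:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha                 # how quickly we adapt to new choices
        self.prefs: dict[str, float] = {}  # context -> learned volume (0..1)

    def observe(self, context: str, volume: float) -> None:
        """Record a manual volume adjustment made in some context."""
        old = self.prefs.get(context, volume)
        self.prefs[context] = (1 - self.alpha) * old + self.alpha * volume

    def suggest(self, context: str, default: float = 0.5) -> float:
        """Suggest a starting volume for a context, falling back to a default."""
        return self.prefs.get(context, default)

prefs = VolumePreference()
for v in (0.8, 0.7, 0.75):   # three late-evening listening sessions
    prefs.observe("evening", v)
print(round(prefs.suggest("evening"), 3))
```

The moving average drifts toward recent choices without overreacting to any single adjustment, which matches the "fine-tune over time" behavior described above.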
Software: a focus on on-device machine learning
Calls & iMessage: on-device real-time speech-to-text is no longer hard
With Apple's Live Voicemail, you can see a real-time transcription of a caller's voicemail as they leave it; thanks to the Neural Engine, the iPhone produces the transcription entirely on device. If it turns out to be a matter you want to handle right away, you can pick up the call at any time.
Voice messages in iMessage can now also be transcribed locally into text, so recipients can read the content immediately.
This feature is reminiscent of Whisper, the AI speech-transcription model that was hugely popular a while ago. Apple has packed similar technology into the iPhone, which again shows its maturity in combining software and hardware: the powerful Neural Engine provides the hardware foundation for these models, and efficient model execution makes real-time transcription possible.
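Apple hasn't published its speech pipeline, but one common technique in on-device recognizers is CTC decoding, whose final "collapse" step is simple enough to show. The example below is purely illustrative of that general technique, not of Apple's system.

```python
# The final step of CTC-style speech decoding (a common on-device
# technique; purely illustrative -- Apple's pipeline is unpublished):
# collapse repeated per-frame labels, then drop the blank symbol.
BLANK = "_"

def ctc_collapse(frames: list[str]) -> str:
    out: list[str] = []
    prev = None
    for label in frames:
        if label != prev and label != BLANK:
            out.append(label)
        prev = label
    return "".join(out)

# Per-frame model outputs for a short clip saying "hi"
print(ctc_collapse(["_", "h", "h", "_", "i", "i", "_"]))  # -> hi
```

Note that the blank symbol also separates genuinely repeated characters: `["h", "_", "h"]` collapses to `"hh"`, not `"h"`.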
Keyboard input: stronger autocorrect and dictation
The iPhone's built-in keyboard now offers better autocorrect as you type. With the computing power of Apple silicon, the iPhone can run a Transformer language model every time you press a key.
Beyond speed, corrections have become both stronger and longer-ranged. The presenter specifically noted that the feature "can tell when you actually mean to type an unconventional word", and that "predictions are personalized based on the way you write", which confirms that this is a word-prediction model that keeps learning, and it is remarkable that it does all of this on device.
As for dictation, a new Transformer-based speech recognition model running on the Neural Engine has made Apple's dictation noticeably better as well.
Journal app: machine learning selects the moments worth recording for you
iOS 17 adds a new native app: Journal. Using on-device machine learning, the iPhone identifies the important events and interesting moments in your daily life, and you can enrich any entry with photos, music, audio recordings, and other details.
Your iPhone also creates personalized suggestions for moments to revisit, drawn from your photos, music, workouts, and more, so that looking back helps you discover something new about yourself.
It's worth noting that because everything is created on your iPhone, you control what data the app includes, and thanks to end-to-end encryption, no one but you can read your journal.
Although the WWDC23 keynote didn't dwell on AI, the system-update highlights and hardware introductions reveal Apple's approach clearly: run every privacy-sensitive model locally on device, and pursue efficiency and practicality in machine learning.
This also reflects Apple's clear stance on machine learning and AI: these technologies should serve people and respect their privacy.
After all, one of the purposes of technology is to make life better.