Apple’s brainiacs seem to have cracked the code that lets iPhones handle their own fancy language models, opening the door for some pretty cool AI features in upcoming models. I got wind of this from a couple of research papers quietly posted to arXiv, the Cornell-hosted research-sharing platform. Feel free to dive into the papers if you’re feeling adventurous, but I’ll give you the lowdown.
The big headache when it comes to putting a massive language model on a mobile is memory: these new AI models pack billions of parameters, way more than a phone’s RAM can hold at once. Apple’s brain trust, in a paper titled “LLM in a Flash,” suggests parking the model in flash storage and streaming in only the pieces needed at each step, with two tricks to keep that fast. First up is “windowing,” where the model reuses the weights it already loaded for the last few tokens instead of fetching fresh data for every new one, cutting down on slow flash reads. Then there’s “row-column bundling,” which stores related rows and columns of the weight matrices side by side so each flash read pulls in one big contiguous chunk, squeezing far more throughput out of the storage.
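To make that concrete, here’s a toy sketch in Swift of what those two tricks boil down to. To be clear, this is my own illustration, not Apple’s code: every type and name here (`NeuronBundle`, `FlashStore`, `WindowedCache`) is made up, and the real system deals with actual flash I/O and sparsity prediction rather than a dictionary.

```swift
import Foundation

// "Row-column bundling" (illustrative): keep row i of the up-projection
// and column i of the down-projection together, so one contiguous read
// fetches both halves of a neuron's weights.
struct NeuronBundle {
    let upRow: [Float]      // row i of W_up
    let downColumn: [Float] // column i of W_down
}

// Stand-in for flash storage: reads are "slow" but contiguous.
struct FlashStore {
    let bundles: [NeuronBundle]
    func read(_ index: Int) -> NeuronBundle { bundles[index] }
}

// "Windowing" (illustrative): keep weights for neurons activated over the
// last k tokens in memory, and only hit flash for neurons not cached.
final class WindowedCache {
    private let windowSize: Int
    private var window: [Set<Int>] = []          // active neuron ids per recent token
    private var cache: [Int: NeuronBundle] = [:]
    private let flash: FlashStore

    init(windowSize: Int, flash: FlashStore) {
        self.windowSize = windowSize
        self.flash = flash
    }

    // Returns weights for this token's active neurons, loading from flash
    // only the ones that weren't used within the sliding window.
    func fetch(activeNeurons: Set<Int>) -> [NeuronBundle] {
        let misses = activeNeurons.filter { cache[$0] == nil }
        for id in misses { cache[id] = flash.read(id) }

        // Slide the window: evict neurons no longer referenced by any
        // of the last `windowSize` tokens.
        window.append(activeNeurons)
        if window.count > windowSize {
            let expired = window.removeFirst()
            let stillLive = window.reduce(into: Set<Int>()) { $0.formUnion($1) }
            for id in expired.subtracting(stillLive) { cache[id] = nil }
        }
        print("served \(activeNeurons.count) neurons with \(misses.count) flash reads")
        return activeNeurons.compactMap { cache[$0] }
    }
}

// Consecutive tokens tend to activate overlapping neuron sets,
// so flash reads drop sharply after the first token.
let store = FlashStore(bundles: (0..<8).map {
    NeuronBundle(upRow: [Float($0)], downColumn: [Float($0)])
})
let cache = WindowedCache(windowSize: 2, flash: store)
_ = cache.fetch(activeNeurons: [0, 1, 2, 3]) // 4 flash reads
_ = cache.fetch(activeNeurons: [1, 2, 3, 4]) // only neuron 4 hits flash
```

The payoff shows up on the second call: most of the weights are already sitting in the cache, so only the newcomers touch the slow storage. That’s the whole game here, since flash bandwidth, not compute, is the bottleneck.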
The second paper spills the beans on iPhones potentially cooking up animated 3D avatars from a short video shot with a single camera. Apple calls the technique HUGS (Human Gaussian Splats). Sure, similar tech existed before, but Apple claims its version can whip up an avatar up to a hundred times faster while still snagging all those little details like clothing and hair.
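If “Gaussian splats” sounds like word salad, the gist is that the avatar is a cloud of fuzzy 3D blobs glued to a skeleton, and animating it means moving the blobs with the bones. Here’s a bare-bones Swift sketch of that idea; the structure and names are my assumptions for illustration, not anything from Apple’s paper or code.

```swift
import simd

// One fuzzy blob of the avatar: a 3D Gaussian with appearance
// attributes plus skinning weights tying it to skeleton joints.
struct Gaussian {
    var mean: SIMD3<Float>    // center in the rest pose
    var scale: SIMD3<Float>   // per-axis extent of the blob
    var color: SIMD3<Float>   // RGB
    var opacity: Float
    var skinWeights: [Float]  // one weight per joint
}

// Linear blend skinning: move the Gaussian's center by the weighted
// sum of the joints' rigid transforms for the current animation frame.
func animate(_ g: Gaussian, jointTransforms: [simd_float4x4]) -> SIMD3<Float> {
    var blended = SIMD3<Float>(repeating: 0)
    for (w, t) in zip(g.skinWeights, jointTransforms) where w > 0 {
        let p = t * SIMD4<Float>(g.mean, 1)
        blended += w * SIMD3<Float>(p.x, p.y, p.z)
    }
    return blended
}

// A Gaussian fully attached to joint 0 simply follows that joint.
let g = Gaussian(mean: [0, 1, 0], scale: [0.01, 0.01, 0.01],
                 color: [1, 0, 0], opacity: 1, skinWeights: [1, 0])
let pose = [matrix_identity_float4x4, matrix_identity_float4x4]
print(animate(g, jointTransforms: pose)) // stays at (0, 1, 0) under identity pose
```

The speed claim makes more sense with this picture in mind: rendering blobs like these is much cheaper than the neural ray-marching older avatar methods leaned on.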
What Apple plans to do with these snazzy tricks is still a mystery, but the possibilities are wild: a beefed-up Siri, real-time language translation, slicker computational photography, and even chatbots. Rumor has it that a smarter, AI-infused Siri might be in the works, possibly showing up in the Messages app to answer tough questions or finish your sentences for you. There’s also buzz about Apple building a large language model framework known internally as Ajax, with “Apple GPT” floated as a name for the resulting chatbot.