Apple’s AI strategy should remain what it has always been

Apple is apparently working on its own form of generative AI, but analyst Ming-Chi Kuo tells us not to expect an Apple GPT service until 2024, or later.

The analyst says part of the reason Apple isn’t yet introducing these services is that executives haven’t figured out a “clear strategy” for deployment of the tech.

Why isn’t it clear?

The thing is, the go-to-market strategy for this (and for any other technology) has always been inherently understood at Apple: augment human capability while respecting innate humanity. That approach has never changed.

Apple knows this, and already follows that path. It’s put limited implementations of AI inside its products for years, including features such as fall and crash detection, the electrocardiogram (ECG) functionality on the Apple Watch, translation, image recognition, and, most recently, voice message transcription.

“These things are not only great features, but they’re also saving people’s lives out there,” Apple CEO Tim Cook said earlier this year. “We view AI as huge, and we’ll continue weaving it in our products on a very thoughtful basis.”

Stay focused, stay foolish

Apple’s approach really should reflect its traditional focus on giving users what they need. It makes sense to deploy an Apple large language model (LLM) in specific ways for use in specific apps.

Here are some ideas to show how the company could improve its products through focused deployment of generative AI.

Just as Adobe has done in Photoshop, Apple could deliver on-device, prompt-based edits and enhancements of images in Photos, along with background removal and other basic features. The ability to use Siri to initiate these commands would benefit iPhone-using photographers putting quick edits and compositions together on the fly for subsequent finalization in pro imaging apps.
In Health, the tech could combine physical measurements with location, scheduling, and other information to provide users not just with a summary of historical health habits, but also with insight into patterns of ill health and/or recovery. This information might even help people who fall sick identify when and where they became infected, which could make a big difference to public health.
Mail is the essential application everyone kind of hates. Apple has improved it over the last couple of years, and new features such as the capacity to delay message sending or be reminded of incoming emails make it more useful. But its search facilities are quite limited. An Apple LLM could help deliver richer associations between the information you already own, handling more complex requests such as, “Gather all the email that relates to the presentation scheduled for Sept. 19.”
Shortcuts are great, but I still think most users remain confused by some of the language used in the instruction templates. However, each available step in Shortcuts is predictable and…
