Apple Intelligence Brings Private, Secure AI to Entire Apple Ecosystem

A graphic shows a central user icon surrounded by various app icons in concentric circles, including social media, messaging, calendar, maps, health, photos, email, and shopping. The background features a network of interconnected purple lines and nodes.

Apple’s 35th annual Worldwide Developers Conference (WWDC) keynote address, considered by industry analysts to be one of the company’s most important ever, focused heavily on artificial intelligence (AI).

Apple, like seemingly every other tech company, is investing heavily in AI and finding ways to build different AI models into nearly every aspect of its software and hardware. Apple’s use of the term “AI” is a relatively recent shift in the company’s public-facing communications, and its relative slowness to fully embrace AI has weighed on its stock valuation.

More importantly, Apple users have noticed, with competing smartphones from the likes of Google and Samsung sporting flashy new AI features across the board and Microsoft leaning heavily on AI in Windows on the computer side of things.

After showing off all its operating system updates, Apple switched gears to focus on artificial intelligence at large. Apple says AI must be powerful, intuitive, deeply integrated, and personal to be useful. It must also be built with privacy in mind.

“Together, this goes beyond artificial intelligence — it’s personal intelligence,” says Apple.

Apple Intelligence

Apple Intelligence is Apple’s new personal intelligence system, which the company says makes its products “more useful and delightful.” With iOS 18, iPadOS 18, and macOS Sequoia, Apple is bringing “powerful generative models” to the core user experience of its biggest operating system ecosystems.

Starting with its capabilities, Apple Intelligence promises to let iPhone, iPad, and Mac users create and generate language, images, and actions. On the language and text side, its large language models promise improved notifications, better organization, and new system-wide writing tools that can proofread, rewrite, and summarize text. These tools are available across first- and third-party apps on all of Apple’s latest devices.

As for images, Apple says its new AI features enable text-to-image creation. This image creation can also recognize specific contacts and cater generative images to the individual, like an illustration of a specific person for a personalized birthday card. This is built into Notes, Freeform, Messages, Pages, and more.

Apple is also using AI to carry out specific tasks. For example, commands like “Show me all the photos of mom, Olivia, and me” or “Play the podcast that my wife sent the other day” can trigger automatic actions on a user’s device.

Given that Apple Intelligence builds upon a person’s data and content, it can be hyper-personal to a specific person. However, this opens the door to privacy concerns. Because Apple Intelligence can understand a person’s individual context, Apple says it “must be done right,” adding that “powerful intelligence goes hand-in-hand with powerful privacy.”

Privacy

The core of Apple’s privacy approach is on-device processing. Apple silicon already provides the hardware required for Apple Intelligence, enabling on-device large language models and semantic processing.

Security is a significant area of focus for Apple and its new Apple Intelligence features. If an AI feature requires more processing power than is available on a user’s device, Apple will send only the relevant data to a private cloud, and that data will not be stored once processing is complete. Apple says this is a brand-new standard for privacy and security in AI.

A person stands in front of a large screen in a modern, white auditorium. The screen displays three points: "Your data is never stored," "Used only for your requests," and "Verifiable privacy promise." The person is gesturing as they speak.

Experiences

Apple Intelligence underpins an all-new Siri experience on iPhone. Using voice or text, Siri can generate personalized information based on user requests. Later this year, it will also be able to interact with on-screen elements, understand what a user is currently viewing on their device, and take actions inside apps based on user requests.

Inside an app like Photos, Apple says its new AI will be able to automatically enhance and edit a user’s photos. Further, Siri will work alongside third-party apps, so a person could make a request like “Take a long exposure video,” and the iPhone will open a relevant app, such as Moment’s Pro Camera app.

A smartphone displays a night photography app interface capturing light trails on a highway, with various camera settings and a shutter button at the bottom of the screen.

Developing…
