Apple May Use a Samsung Sensor in the iPhone as Soon as 2026

[Illustration: a rendering of a computer chip with light trails representing data transfer. Credit: Samsung]

A new report alleges that Apple will start using Samsung sensors in its iPhones as soon as 2026, ending Sony's run as the exclusive supplier of iPhone camera sensors, a streak that has lasted more than a decade.

Camera manufacturers rarely say where their sensors come from when the technology isn't developed in-house, and that is usually true of Apple as well. In December 2022, however, Tim Cook shared that the iPhone had been using Sony camera sensors for the previous 10 years. That had long been the speculation, but Apple rarely discloses the specific hardware components that go into the iPhone. It will state megapixels, focal length, aperture, and other specifications, but it typically does not identify the components themselves.

“We’ve been partnering with Sony for over a decade to create the world’s leading camera sensors for iPhone. Thanks to Ken and everyone on the team for showing me around the cutting-edge facility in Kumamoto today,” Tim Cook wrote in a post on Twitter (now X).

That could change soon.

“Samsung is expected to begin shipping 1/2.6-inch 48MP ultra-wide CMOS image sensors (CIS) to Apple for iPhones as early as 2026, breaking Sony’s years-long monopoly on supplying CIS to Apple. To this end, Samsung has established a dedicated team to serve Apple,” writes industry analyst Ming-Chi Kuo.

As The Verge notes, Kuo does not say if Samsung will be tapped to manufacture any other sensors for the iPhone or if Apple will start using sensors from both companies simultaneously.

Even if Apple does mix sensors from two companies in a single camera array, it's unlikely the end user would notice. First, Apple reportedly wants to take sensor design in-house anyway, which would reduce Samsung's or Sony's role to fabricating hardware to Apple's specifications. That's not really any different from Nikon's current relationship with Sony Semiconductor.

Second, Apple puts a lot of emphasis on fine-tuning its hardware and software together to give images a consistent look.

“We replicate as much as we can to what the photographer will do in post,” Jon McCormack, Apple's Vice President of Camera Software Engineering, told PetaPixel earlier this year. “There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there.”
