Mobile augmented reality on iOS typically arrives through apps built with ARKit, but Apple is borrowing a few pages from Google's playbook and bringing AR capabilities directly into iOS 15.

During the keynote presentation at WWDC 2021, Craig Federighi, Apple's senior vice president of software engineering, unveiled Live Text, a new camera mode coming to iOS 15 that delivers much of the same functionality that Google Lens offers for Android smartphones and Google Photos.

Don't Miss: Apple's AR Spaces in Updated Clips App Uses iPhone & iPad LiDAR to Give You New AR Video Powers

Live Text will enable iPhone users to point their cameras at text, or capture photos containing text, and interact with it. In addition to copying text, users can run searches based on selected text or call a phone number recognized within an image. The same functionality will be available via the Photos app on iOS and in macOS Monterey. Live Text also replicates Google Lens's ability to recognize pet breeds, plants, products, art, and landmarks.

Apple/YouTube