Apple Patents Technology for AR Touch Detection Using ML and Depth-Mapping Cameras

Presentation Description

As the iPad and iPhone have amply demonstrated, much of Apple’s hardware nowadays relies heavily on accurate detection of direct touch inputs, such as a finger resting against a screen or, in the Mac’s case, on a trackpad. However, as people across the globe depend on augmented reality for both entertainment and work, they need to interact with digital objects that aren’t equipped with physical touch sensors. Apple has recently patented a technique for detecting touch using Machine Learning (ML) and depth-mapping cameras. Source: https://www.kashishipr.com/blog/apple-patents-technology-for-ar-touch-detection-using-ml-and-depth-mapping-cameras/

Presentation Transcript

slide 1:

Apple Patents Technology for AR Touch Detection Using ML and Depth-Mapping Cameras

As the iPad and iPhone have amply demonstrated, much of Apple’s hardware nowadays relies heavily on accurate detection of direct touch inputs, such as a finger resting against a screen or, in the Mac’s case, on a trackpad. However, as people across the globe depend on augmented reality for both entertainment and work, they need to interact with digital objects that aren’t equipped with physical touch sensors. Apple has recently patented a technique for detecting touch using Machine Learning (ML) and depth-mapping cameras.

As described in the patent, Apple’s depth-based touch detection system is fairly straightforward: the external cameras work together in a live, real-time environment to create a three-dimensional depth map, measuring the distance of an object (for instance, a finger) from a touchable surface and then determining when the object touches that surface (see the sketch after this transcript). Crucially, the distance measurement is designed to remain usable even when the cameras change position, relying in part on a trained ML model to discern touch inputs. Illustrations accompanying Apple’s patented technology show three external cameras working together to determine the relative position of a finger, a concept that might turn out to be

slide 2:

somewhat known to the users of Apple’s triple-camera iPhone 11 Pro models. Similar multi-camera arrays are likely to appear in future Apple devices, such as new dedicated AR glasses and iPad Pros, allowing each to determine finger input conveniently by applying ML knowledge and depth-mapping a scene to analyze the intent behind changes in the finger’s position. Equipped with this technology, future AR glasses could efficiently eliminate the need for trackpads and physical keyboards by replacing them with digital versions that only a user can see and interact with. Such AR glasses could also enable user interfaces to be anchored to other surfaces, like walls, conceivably creating a secure elevator that could only be operated or brought to particular floors with the help of the AR buttons.

Apple has been granted the patent US10572072 corresponding to the AR touch detection technology, invented by Sunnyvale-based Daniel Kurz and Lejing Wang. Apple had first filed the patent application at the end of September 2017. Tim Cook, Apple’s CEO, has suggested that AR will be of the utmost importance for the tech giant to move forward in the present highly competitive world.

For more, visit: https://www.kashishipr.com/

Don’t forget to follow us on social media:
Facebook – https://www.facebook.com/kashishipr/
Twitter – https://twitter.com/kashishipr
LinkedIn – https://www.linkedin.com/company/kashishipr/
Pinterest – https://www.pinterest.com/kashishipr/
Tumblr – https://kashishipr.tumblr.com/

Contact - US
Email Id: kashishipr@kashishipr.com
Website: www.kashishipr.com
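
To make the mechanism described in slide 1 more concrete, below is a minimal, hypothetical sketch (Python with NumPy) of depth-based touch detection: fit a plane to depth-map samples of the touchable surface, measure the fingertip’s distance to that plane each frame, and report a touch when that distance falls below a small threshold. The patent relies on a trained ML model to make this decision robust even as the cameras move; the fixed threshold here merely stands in for that classifier, and every function name and value is illustrative rather than Apple’s actual implementation.

    # Minimal sketch, NOT Apple's implementation: deciding "touch" from a
    # depth map by measuring fingertip-to-surface distance. All names and
    # thresholds are hypothetical.
    import numpy as np

    def fit_surface_plane(surface_points: np.ndarray):
        """Fit a plane n·x = d to 3D points sampled from the touchable
        surface (least squares via SVD). Returns unit normal n and offset d."""
        centroid = surface_points.mean(axis=0)
        _, _, vt = np.linalg.svd(surface_points - centroid)
        normal = vt[-1]                      # direction of least variance
        return normal, float(normal @ centroid)

    def finger_surface_distance(fingertip_xyz: np.ndarray,
                                normal: np.ndarray, d: float) -> float:
        """Signed distance (metres) from the fingertip to the surface plane.
        Because it is a geometric distance re-measured every frame, it stays
        meaningful even if the cameras change position."""
        return float(normal @ fingertip_xyz - d)

    def is_touch(distance_m: float, threshold_m: float = 0.005) -> bool:
        """Naive rule-based stand-in for the ML classifier described in the
        patent: report a touch when the fingertip is within ~5 mm."""
        return abs(distance_m) < threshold_m

    # Example frame: four surface samples and one fingertip from a depth map.
    surface_samples = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0],
                                [0.0, 0.1, 1.0], [0.1, 0.1, 1.0]])
    n, d = fit_surface_plane(surface_samples)
    fingertip = np.array([0.05, 0.05, 1.002])
    print(is_touch(finger_surface_distance(fingertip, n, d)))  # True

In practice, the per-frame distances (and how they change over time) would be fed to the trained model mentioned in the patent rather than compared against a single hard-coded threshold.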
