(19) United States
(12) Patent Application Publication (10) Pub. No.: US 2023/0065288 A1
    Valsan et al. (43) Pub. Date: Mar. 2, 2023

(54) ELECTRONIC DEVICES WITH BODY COMPOSITION ANALYSIS CIRCUITRY

(71) Applicant: Apple Inc., Cupertino, CA (US)

(72) Inventors: Gopal Valsan, Gilroy, CA (US); Thilaka S. Sumanaweera, San Jose, CA (US); Liliana T. Keats, Cupertino, CA (US); David J. Feathers, Williamsville, NY (US); Pavan Kumar Anasosalu Vasu, Sunnyvale, CA (US)

(21) Appl. No.: 17/855,184

(22) Filed:

Related U.S. Application Data

(60) Provisional application No. 63/238,714, filed on Aug. 30, 2021, provisional application No. 63/242,357, filed on Sep. 9, 2021.

Publication Classification

(51) Int. Cl.: A61B 5/00 (2006.01); G06T 7/00 (2017.01); G06T 7/521 (2017.01); G16H 30/40 (2018.01)

(52) U.S. Cl.: CPC A61B 5/0077 (2013.01); A61B 5/4872 (2013.01); A61B 5/7267 (2013.01); A61B 5/1075 (2013.01); A61B 5/7278 (2013.01); G06T 7/0012 (2013.01); G06T 7/521 (2017.01); G16H 30/40 (2018.01); A61B 2576/02 (2013.01); A61B 2090/502 (2016.02)

(57) ABSTRACT

An electronic device may include body composition analysis circuitry that estimates body composition based on captured images of a face, neck, and/or body (e.g., depth map images captured by a depth sensor, visible light and infrared images captured by image sensors, and/or other suitable images). The body composition analysis circuitry may analyze the image data and may extract portions of the image data that strongly correlate with body composition, such as portions of the cheeks, neck, waist, etc. The body composition analysis circuitry may encode the image data into a latent space. The latent space may be based on a deep learning model that accounts for facial expression and neck pose in face/neck images and that accounts for breathing and body pose in body images. The body composition analysis circuitry may output an estimated body composition based on the image data and based on user demographic information.

[Sheet 1 of 11, FIG. 1: block diagram of electronic device 10 showing control circuitry 16 and input-output devices 12, including display 14 and optical components 18 (visible light source 20, infrared light source 22, optical proximity sensor 24 (e.g., infrared light-emitting diode and infrared light detector), visible image sensor 26, infrared image sensor 28, and ambient light sensor 30).]

[Sheets 2-10 of 11, FIGS. 2-15: drawings as described in the Brief Description of the Drawings below.]
[Sheet 11 of 11, FIG. 16: flow chart of illustrative steps: capture images of user's face, neck, and/or body (step 100); extract image data for relevant portions of face, neck, and/or body and delete remaining image data (step 102); encode image data into latent space (step 104); compensate for effect of facial expression and neck pose in face/neck images and/or for effect of breathing and body pose in body images (step 106); estimate body composition based on image data in latent space and based on additional available user data (step 108); take action (step 110).]

ELECTRONIC DEVICES WITH BODY COMPOSITION ANALYSIS CIRCUITRY

[0001] This application claims the benefit of provisional patent application No. 63/238,714, filed Aug. 30, 2021, and provisional patent application No. 63/242,357, filed Sep. 9, 2021, both of which are hereby incorporated by reference herein in their entireties.

FIELD

[0002] This relates generally to electronic devices, and, more particularly, to electronic devices with sensors.

BACKGROUND

[0003] Electronic devices such as cellular telephones, wristwatches, and other equipment are sometimes provided with sensors such as fingerprint sensors, facial recognition sensors, and heart rate sensors.

[0004] It can be challenging to use devices such as these. The user may wish to obtain different types of health-related information that existing electronic devices are unable to provide. The user may need to rely on more than one piece of electronic equipment to obtain the desired health-related information, which can be inconvenient and cumbersome.

SUMMARY

[0005] An electronic device may include body composition analysis circuitry that estimates body composition based on captured images of a face, neck, and/or body (e.g., three-dimensional depth maps captured by a depth sensor, visible light and infrared images, and/or other suitable images). In some arrangements, a depth sensor in the electronic device may include an infrared light emitter that illuminates the face and neck with structured infrared light and an infrared light detector that detects infrared light reflected from the face and neck. The depth sensor may produce depth map image data capturing three-dimensional structural data based on the reflected infrared light. Other types of depth sensing technology and/or visible light cameras may be used to capture face and neck image data, if desired. In some arrangements, the images may be full body images or may be images of a portion of a user's body such as the torso or bicep.
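As a rough, self-contained illustration of this flow (the steps numbered 100-110 in FIG. 16), the Python sketch below strings the stages together. Every array, region choice, and weight in it is a synthetic stand-in for illustration only, not a trained model or device interface:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 100: captured depth-map image (synthetic stand-in for sensor data).
depth_map = rng.random((64, 64))

# Step 102: extract the relevant region and discard the rest.
roi = depth_map[20:44, 16:48].ravel()          # 24 x 32 = 768 values kept

# Step 104: encode into a low-dimensional latent space (a random basis
# standing in for a learned encoder).
basis = rng.standard_normal((roi.size, 8))
z = roi @ basis

# Step 106: keep only identity-related latent dimensions; in a trained
# model the dropped dimensions would capture expression and pose.
z_id = z[:4]

# Step 108: regress body composition from the latent code plus
# demographics (height m, weight kg, age) using stand-in weights.
demographics = np.array([1.70, 70.0, 35.0])
w = rng.standard_normal(z_id.size + demographics.size)
estimate = float(np.concatenate([z_id, demographics]) @ w)

# Step 110: take action, e.g., report the estimate.
print(f"estimated body-composition score (arbitrary units): {estimate:.2f}")
```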
[0006] In some arrangements, the body composition analysis circuitry may use a user-study-trained model to map the images to body composition information. The model may be trained on images of a specific body part and/or may be trained on images of an entire body. The body composition information may describe how fat is distributed throughout the body and/or may describe relative amounts of fat in the visceral and subcutaneous compartments of the body. The body composition analysis circuitry may use images of the face to scale images of the body in order to determine dimensions of the body. The electronic device may be a head-mounted device or any other suitable electronic device that is worn or used by a first user while capturing images of a second user. The electronic device may also be self-operated while capturing images of the user. If desired, the electronic device may capture images of the user while attached to a stationary fixture.

[0007] The body composition analysis circuitry may analyze the image data and may extract portions of the image data that strongly correlate with body composition, such as portions of the cheeks, neck, face, chest, waist, hips, thighs, and other areas. The body composition analysis circuitry may encode the image data into a latent space. The latent space may be based on a deep learning model that is trained from a user study.

[0008] When using face images, the latent space may include a first latent space representing a user identity, a second latent space representing a facial expression, and a third latent space representing a neck pose. The body composition analysis circuitry may compensate for facial expression and neck pose by using the face and neck image data in the user identity latent space to output an estimated body composition.

[0009] When using body images, the latent space may include a first latent space representing a user identity, a second latent space representing a breathing state, and a third latent space representing a body pose. The body composition analysis circuitry may compensate for breathing state and body pose by using the body image data in the user identity latent space to output an estimated body composition.

[0010] The body composition analysis circuitry may use a user-study-trained model to map the images to body composition information. The model may be trained on images of a specific body part and/or may be trained on images of an entire body. The body composition information may describe how fat is distributed throughout the body and/or may describe relative amounts of visceral and subcutaneous fat in specific body parts. The body composition analysis circuitry may use images of the face to scale images of the body in order to determine dimensions of the body. The electronic device may be a head-mounted device or any other suitable electronic device that is worn by a first user while capturing images of a second user.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] FIG. 1 is a schematic diagram of an illustrative electronic device in accordance with an embodiment.

[0012] FIG. 2 is a perspective view of an illustrative electronic device with a display having optical component windows overlapping optical components in accordance with an embodiment.

[0013] FIG. 3 is a cross-sectional side view of an illustrative electronic device that has optical components such as a light source and an image sensor in accordance with an embodiment.

[0014] FIG. 4 is a cross-sectional side view of an illustrative light source that includes a diffuser in accordance with an embodiment.

[0015] FIG. 5 is a front view of an illustrative object on which a dot pattern is projected using a light source of the type shown in FIG. 4 in accordance with an embodiment.

[0016] FIG. 6 is a schematic diagram of illustrative body composition analysis circuitry being used to analyze face and neck images in accordance with an embodiment.

[0017] FIG. 7 is a schematic diagram of illustrative body composition analysis circuitry being used to analyze body images in accordance with an embodiment.

[0018] FIG. 8 is a diagram of illustrative three-dimensional depth map image data associated with a face and neck in accordance with an embodiment.
[0019] FIG. 9 is a diagram showing how relevant portions of three-dimensional depth map image data of the type shown in FIG. 8 may be extracted for body composition analysis in accordance with an embodiment.

[0020] FIG. 10 is a diagram of illustrative image data corresponding to a front body view in accordance with an embodiment.

[0021] FIG. 11 is a diagram of illustrative image data corresponding to a side body view in accordance with an embodiment.

[0022] FIG. 12 is a diagram showing how relevant portions of image data of the type shown in FIGS. 10 and 11 may be extracted for body composition analysis in accordance with an embodiment.

[0023] FIG. 13 is a diagram illustrating how face image data captured at different times may be aligned for body composition analysis in accordance with an embodiment.

[0024] FIG. 14 is a diagram illustrating how body image data captured at different times may be aligned for body composition analysis in accordance with an embodiment.

[0025] FIG. 15 is a diagram of illustrative user study data that may be gathered over a period of time in accordance with an embodiment.

[0026] FIG. 16 is a flow chart of illustrative steps involved in estimating body composition based on captured image data in accordance with an embodiment.

DETAILED DESCRIPTION

[0027] A schematic diagram of an illustrative electronic device of the type that may be provided with an optical component is shown in FIG. 1. Electronic device 10 may be a computing device such as a laptop computer, a computer monitor containing an embedded computer, a tablet computer, a cellular telephone, a media player, or other handheld or portable electronic device, a speaker (e.g., a voice-controlled assistant or other suitable speaker), a smaller device such as a wristwatch device, a pendant device, a headphone or earpiece device, a device embedded in eyeglasses or other equipment worn on a user's head, or other wearable or miniature device, a television, a computer display that does not contain an embedded computer, a gaming device, a navigation device, an embedded system such as a system in which electronic equipment with a display is mounted in a kiosk or automobile, equipment that implements the functionality of two or more of these devices, or other electronic equipment.

[0028] As shown in FIG. 1, electronic device 10 may have control circuitry 16. Control circuitry 16 may include storage and processing circuitry for supporting the operation of device 10. The storage and processing circuitry may include storage such as hard disk drive storage, nonvolatile memory (e.g., flash memory or other electrically-programmable-read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random-access memory), etc. Processing circuitry in control circuitry 16 may be used to control the operation of device 10. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, application specific integrated circuits, etc.

[0029] Device 10 may have input-output circuitry such as input-output devices 12. Input-output devices 12 may include user input devices that gather user input and output components that provide a user with output. Devices 12 may also include communications circuitry that receives data for device 10 and that supplies data from device 10 to external devices. Devices 12 may also include sensors that gather information from the environment.

[0030] Input-output devices 12 may include one or more displays such as display 14. Display 14 may be a touch screen display that includes a touch sensor for gathering touch input from a user or display 14 may be insensitive to touch.
A touch sensor for display 14 may be based on an array of capacitive touch sensor electrodes, acoustic touch sensor structures, resistive touch components, force-based touch sensor structures, a light-based touch sensor, or other suitable touch sensor arrangements. Display 14 may be a liquid crystal display, a light-emitting diode display (e.g., an organic light-emitting diode display), an electrophoretic display, or other display.

[0031] Input-output devices 12 may include optical components 18. Optical components 18 may include light-emitting diodes and other light sources. As an example, optical components 18 may include one or more visible light sources such as light source 20 (e.g., a light-emitting diode). Light-emitting diode 20 may provide constant illumination (e.g., to implement a flashlight function for device 10) and/or may emit pulses of flash illumination for a visible light camera such as visible light image sensor 26. Optical components 18 may also include an infrared light source (e.g., a laser, lamp, infrared light-emitting diode, an array of vertical-cavity surface-emitting lasers (VCSELs), etc.) such as infrared light source 22. Infrared light source 22 may provide constant and/or pulsed illumination at an infrared wavelength such as 940 nm, a wavelength in the range of 800-1700 nm, etc. For example, infrared light source 22 may provide constant illumination for an infrared camera such as infrared image sensor 28. Infrared image sensor 28 may, as an example, be configured to capture iris scan information from the eyes of a user and/or may be used to capture images for a facial recognition process implemented on control circuitry 16.

[0032] If desired, infrared light source 22 may be used to provide flood illumination (e.g., diffused infrared light that uniformly covers a given area) and to provide structured light (e.g., a pattern of collimated dots). Flood illumination may be used to capture infrared images of external objects (e.g., to detect a user's face and/or to create a depth map), whereas structured light may be projected onto an external object to perform depth mapping operations (e.g., to obtain a three-dimensional map of the user's face). This is merely illustrative. Other types of depth sensors may be used, if desired (e.g., indirect time-of-flight sensors, stereo cameras, etc.).

[0033] To enable light source 22 to provide both flood illumination and structured light, light source 22 may include a switchable diffuser and a collimated light source such as a laser or an array of vertical-cavity surface-emitting lasers. When flood illumination is desired, the diffuser may be turned on to diffuse the light from the light source. When structured illumination is desired, the diffuser may be turned off to allow the collimated light to pass through the diffuser uninhibited. Diffusers such as the diffuser in light source 22 may be formed from liquid crystal material, electrophoretic material, or other switchable light modulators. In some implementations, light source 22 projects light through a diffractive optical element (DOE) to create replicas of the pattern of dots. This is, however, merely illustrative. If desired, infrared light source 22 may include a first light source that provides flood illumination and a second light source that provides structured light.
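The mode selection can be pictured as a small controller that toggles the diffuser ahead of the emitter. This is a minimal sketch under assumed abstractions: the `IlluminationMode` and `EmitterController` names below are hypothetical, not an actual device driver interface:

```python
from enum import Enum, auto

class IlluminationMode(Enum):
    FLOOD = auto()       # diffuser on: uniform infrared coverage
    STRUCTURED = auto()  # diffuser off: collimated dot pattern for depth

class EmitterController:
    """Toggles a switchable diffuser in front of a collimated IR source
    so that one emitter serves both 2D imaging and structured-light depth."""

    def __init__(self, diffuser_on=False):
        self.diffuser_on = diffuser_on

    def set_mode(self, mode: IlluminationMode) -> None:
        # Flood illumination requires the diffuser; structured light
        # requires the collimated beam to pass through uninhibited.
        self.diffuser_on = (mode is IlluminationMode.FLOOD)

controller = EmitterController()
controller.set_mode(IlluminationMode.STRUCTURED)  # e.g., before a depth capture
```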
[0034] Optical components 18 may also include optical proximity detector 24 and ambient light sensor 30.

[0035] Optical proximity detector 24 may include an infrared light source such as an infrared light-emitting diode and a corresponding light detector such as an infrared photodetector for detecting when an external object that is illuminated by infrared light from the light-emitting diode is in the vicinity of device 10.

[0036] Ambient light sensor 30 may be a monochrome ambient light sensor that measures the intensity of ambient light or may be a color ambient light sensor that measures ambient light color and intensity by making light measurements with multiple photodetectors, each of which is provided with a corresponding color filter (e.g., a color filter that passes red light, blue light, yellow light, green light, or light of other colors) and each of which therefore responds to ambient light in a different wavelength band.
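A multi-channel reading of this kind is typically mapped to usable intensity and color values with per-device calibration. The sketch below assumes invented channel counts and calibration weights purely for illustration; they are not calibration data from the patent:

```python
def ambient_light_estimate(channel_counts, lux_weights, xyz_matrix):
    """Map filtered-photodetector counts to an intensity (lux) estimate
    and a CIE 1931 chromaticity estimate via a linear calibration."""
    lux = sum(w * c for w, c in zip(lux_weights, channel_counts))
    X, Y, Z = (sum(m * c for m, c in zip(row, channel_counts))
               for row in xyz_matrix)
    total = X + Y + Z
    xy = (X / total, Y / total) if total else (0.0, 0.0)
    return lux, xy

# Example with made-up counts/weights for a 4-channel (R, G, B, clear) sensor:
lux, xy = ambient_light_estimate(
    channel_counts=[120, 340, 180, 600],
    lux_weights=[0.05, 0.30, 0.08, 0.10],
    xyz_matrix=[[0.40, 0.30, 0.15, 0.01],
                [0.20, 0.60, 0.10, 0.01],
                [0.02, 0.10, 0.80, 0.01]],
)
```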
Openings may be formed in housing 32 to form communications ports (e.g., an audio jack port, a digital data port, etc.), to form openings for buttons, etc. In some configurations, housing 32 may have a rear housing wall formed from a planar glass member or other transparent layer (e.g., a planar member formed on a rear face of device 10 opposing a front face of device 10 that includes a display cover layer).

[0040] Display 14 may have an array of pixels 38 in active area AA (e.g., liquid crystal display pixels, organic light-emitting diode pixels, electrophoretic display pixels, etc.). Pixels 38 of active area AA may display images for a user of device 10. Active area AA may be rectangular, may have notches along one or more of its edges, may be circular, may be oval, may be rectangular with rounded corners, and/or may have other suitable shapes.

[0041] Inactive portions of display 14 such as inactive border area IA may be formed along one or more edges of active area AA. Inactive border area IA may overlap circuits, signal lines, and other structures that do not emit light for forming images. To hide inactive circuitry and other components in border area IA from view by a user of device 10, the underside of the outermost layer of display 14 (e.g., the display cover layer or other display layer) may be coated with an opaque masking material such as a layer of black ink (e.g., polymer containing black dye and/or black pigment, opaque materials of other colors, etc.) and/or other layers (e.g., metal, dielectric, semiconductor, etc.). Opaque masking materials such as these may also be formed on an inner surface of a planar rear housing wall formed from glass, ceramic, polymer, crystalline transparent materials such as sapphire, or other transparent material.

[0042] In the example of FIG. 2, speaker port 34 is formed from an elongated opening (e.g., a strip-shaped opening) that extends along a dimension parallel to the upper peripheral edge of housing 32. A speaker may be mounted within device housing 32 in alignment with the opening for speaker port 34. During operation of device 10, speaker port 34 serves as an ear speaker port for a user of device 10 (e.g., a user may place opening 34 adjacent to the user's ear during telephone calls).

[0043] Optical components 18 (e.g., a visible digital image sensor, an infrared digital image sensor, a light-based proximity sensor, an ambient light sensor, visible and/or infrared light-emitting diodes that provide constant and/or pulsed illumination, etc.) may be mounted under one or more optical component windows such as optical component windows 40. In the example of FIG. 2, four of windows 40 have circular outlines (circular footprints when viewed from above) and one of windows 40 has an elongated strip-shaped opening (e.g., an elongated strip-shaped footprint when viewed from above). The elongated window 40 is mounted between the sidewall along the upper peripheral edge of device 10 and speaker port 34 and extends parallel to the upper peripheral edge of housing 32. If desired, windows such as optical windows 40 may have shapes other than circular and rectangular shapes.
The examples of FIG. 2 are merely illustrative.

[0044] Optical component windows such as windows 40 may be formed in inactive area IA of display 14 (e.g., an inactive border area in a display cover layer such as an inactive display region extending along the upper peripheral edge of housing 32) or may be formed in other portions of device 10 such as portions of a rear housing wall formed from a transparent member coated with opaque masking material, portions of a metal housing wall, polymer wall structures, etc. In the example of FIG. 2, windows 40 are formed adjacent to the upper peripheral edge of housing 32 between speaker port opening 34 in the display cover layer for display 14 and the sidewall along the upper edge of housing 32. In some configurations, an opaque masking layer is formed on the underside of the display cover layer in inactive area IA and optical windows 40 are formed from openings within the opaque masking layer. To help optical windows 40 visually blend with the opaque masking layer, a dark ink layer, a metal layer, a thin interference filter formed from a stack of dielectric layers, and/or other structures may overlap windows 40.

[0045] An infrared emitter and infrared detector in device 10 may be used to form a three-dimensional depth sensor. FIG. 3 is a side view of an illustrative depth sensor 36 in device 10 that may be used to produce three-dimensional depth maps such as eye scan information, facial images (e.g., images of a user's face for use in performing facial recognition operations to authenticate the user of device 10, images of a user's face and neck for producing Animojis, etc.), body images (e.g., images of a user's body for use in performing motion tracking or body segmentation), and/or other three-dimensional depth mapping information. Depth sensor 36 may include infrared light emitter 22 and infrared light detector 28. Device 10 may use infrared light source 22 (e.g., an infrared light-emitting diode, an infrared laser, etc.) to produce infrared light 48. Light 48 may illuminate external objects in the vicinity of device 10 such as external object 44 (e.g., a user's face and/or eyes). Reflected infrared light 46 from external object 44 may be received and imaged using infrared digital image sensor 28 to produce infrared images (e.g., three-dimensional depth maps) of the face and/or eyes. Depth information may also be captured by applying appropriate software algorithms to visible and/or near-infrared videos and/or using any other suitable depth sensor in the device.

[0046] Infrared light source 22 may operate in different modes depending on the type of infrared information to be gathered by infrared camera 28. For example, in a flood illumination mode, light source 22 may emit diffused light that uniformly covers a desired target area. In a structured light mode, light source 22 may emit a known pattern of light onto a desired area.

[0047] FIG. 4 illustrates illumination from light source 22 when light source 22 is operated in a flood illumination mode. As shown in FIG. 4, light source 22 may emit diffused infrared light 56 that continuously covers a given area of external object 44. Infrared camera 28 may capture an infrared image of the diffusely illuminated external object 44. In some arrangements, flood illumination from light source 22 may be used to detect a user's face during face identification operations.

[0048] FIG. 5 illustrates illumination from light source 22 when light source 22 is operated in a structured light mode. In the structured light mode, light source 22 may project a known pattern of infrared light 56 onto external object 44. In the example of FIG. 5, infrared light 56 forms a pattern of dots on external object 44. The dots may be in an ordered grid array (e.g., uniformly spaced from one another) or the dots may be projected in a random speckle pattern. This is, however, merely illustrative. If desired, light source 22 may emit structured light in other patterns (e.g., horizontal lines, vertical lines, a grid of horizontal and vertical lines, or other suitable predetermined patterns). Structured infrared light 56 of FIG. 5 may be based on laser interference or may be based on a projection display element that emits infrared light through a spatial light modulator to create the desired pattern.
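The patent does not spell out how depth is recovered from the dot pattern, but a common approach, offered here only as a hedged illustration, is triangulation: each dot's lateral shift relative to its position on a flat calibration plane is inversely proportional to distance. The focal length and baseline values below are assumed placeholders:

```python
import numpy as np

def depth_from_dot_disparity(observed_x, reference_x, focal_px, baseline_m):
    """Illustrative structured-light triangulation (not from the patent).

    observed_x:  x-coordinates (pixels) of detected dots in the IR image
    reference_x: x-coordinates of the same dots on a flat calibration plane
    focal_px:    detector focal length in pixels (assumed calibration value)
    baseline_m:  emitter-to-detector separation in meters (assumed)
    """
    disparity = np.abs(np.asarray(observed_x) - np.asarray(reference_x))
    disparity = np.maximum(disparity, 1e-6)   # guard against divide-by-zero
    return focal_px * baseline_m / disparity  # depth in meters, per dot

# Example: with a 600 px focal length and 25 mm baseline, a 2 px shift
# corresponds to about 7.5 m and a 30 px shift to about 0.5 m.
depths = depth_from_dot_disparity([310.0, 338.0], [308.0, 308.0], 600.0, 0.025)
```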
[0049] In some arrangements, light source 22 may include one light source that provides flood illumination and another light source that provides structured light. In other arrangements, the same light source may be used to provide both flood illumination and structured light. This may be achieved using a switchable diffuser element that selectively diffuses light emitted from the light source.

[0050] Data that is gathered using optical components 18 may be used for one or more health-related applications such as body composition assessments. For example, control circuitry 16 may use optical components 18 to capture images of the user's face, neck, and/or body (e.g., visible images, infrared images, three-dimensional depth map images, etc.), which may then be analyzed to provide user-specific body composition information, such as body mass index, body fat percentage (e.g., fat percentage of the total body, fat percentage in individual body parts, and/or fat percentage in different fat storage compartments such as the subcutaneous and visceral compartments), bone mass, and/or other health-related information.

[0051] Control circuitry 16 may store one or more models for mapping user image data to body composition information. The model may be a statistical model, may be a machine learning model, may be a model based on a combination of statistical modeling and machine learning, or may be a combination of multiple machine learning models. Models that are trained using machine learning may be implemented using principal component analysis, an autoencoder, and/or any other suitable data compression technique.

[0052] An autoencoder is an artificial neural network that learns to encode data into a latent space by reducing the dimensions of the data. The autoencoder is trained to encode a distribution of inputs within the latent space to minimize loss between the outputs and the inputs. Principal component analysis reduces the dimensionality of input data by removing redundant information and capturing the most important features of the input data (e.g., features with the highest variance). Principal component analysis is generally restricted to linear mapping, whereas autoencoders do not have any linearity constraints.
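To make the linear case concrete, here is a minimal NumPy sketch of PCA-based encoding and decoding. It illustrates the dimensionality-reduction idea named above; the feature layout and latent size are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def pca_fit(X, k):
    """X: (n_samples, n_features) rows of flattened depth-map features.
    Returns the mean and an (n_features, k) basis spanning the k
    highest-variance directions of the training data."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k].T

def pca_encode(x, mu, W):
    return (x - mu) @ W          # project into the k-dim latent space

def pca_decode(z, mu, W):
    return z @ W.T + mu          # linear reconstruction from the latent code

# Toy usage: 100 synthetic samples with 768 features compressed to 8 dims.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 768))
mu, W = pca_fit(X, k=8)
z = pca_encode(X[0], mu, W)
x_hat = pca_decode(z, mu, W)
```

An autoencoder would replace the linear projection with trained nonlinear encoder and decoder networks while serving the same compression role.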
[0053] FIG. 6 is a schematic diagram of body composition analysis circuitry 58 being used to determine body composition from face and/or neck images. Body composition analysis circuitry 58 may be part of control circuitry 16 and/or may be implemented as a standalone circuit. Body composition analysis circuitry 58 may receive information such as face and neck image data (e.g., three-dimensional depth map data of a user's face and/or neck from depth sensor 36, visible images of the user's face and/or neck from visible image sensor 26, etc.) and optional additional user data (e.g., user-specific demographic information such as gender, height, weight, age, ethnicity, and/or other user data stored in device 10 and/or otherwise provided to circuitry 58). Based on the received face and neck image data and optional user demographic data, body composition analysis circuitry 58 may output estimated body composition information such as body mass index, body fat percentage, fat percentage of the face and neck, bone mass, and/or other health-related information. If desired, user demographic information may be omitted and body composition analysis circuitry 58 may estimate the user's body composition based solely on the captured face and neck image data.

[0054] If desired, face and neck image data may be gathered as part of a dedicated body composition analysis (e.g., when depth sensor 36 is being used specifically for obtaining face and neck images for body composition analysis) and/or may be gathered when depth sensor 36 is already being used for some other purpose (e.g., when depth sensor 36 is already being used for facial recognition and user authentication purposes, when depth sensor 36 is already being used for creating an Animoji or other virtual reality applications that involve capturing a user's facial expressions, etc.). The face and neck image data may include one or more images that are captured of the face and neck at different times of the day and/or over multiple days.

[0055] User demographic information may be received from the user as part of a dedicated body composition analysis questionnaire and/or may be received from the user as part of some other health-related application.

[0056] Body composition analysis circuitry 58 may store a model that is trained using data from user studies. For example, data may be collected from a group of participants (e.g., ten participants, fifty participants, one hundred participants, one thousand participants, and/or any other suitable number of participants) over a given period of time (e.g., one month, two months, three months, six months, eight months, ten months, a year, more than a year, less than a year, etc.). At each point of data collection during the study, the study participant's face and neck shape and size may be measured and the user's body composition may be measured. Face and neck shape and size may be measured using a three-dimensional depth sensor of the type shown in FIG. 3, using anthropometric measurements (e.g., body landmarks and measurements), and/or using any other suitable measuring device (e.g., a three-dimensional body scanner). Body composition may be measured using any suitable body composition tracking technology such as magnetic resonance imaging, dual energy X-ray absorptiometry, air displacement plethysmography, underwater weighing, etc. Alternatively, a model can be trained to predict fat percentage in the face and the neck. Data collected during the user study may serve as training data for training the model that is stored in body composition analysis circuitry 58 in device 10.

[0057] Body composition analysis circuitry 58 may use principal component analysis, an autoencoder, and/or any other suitable data compression technique to reduce the dimensionality of the input data in a latent space. For example, the latent space may include an identity latent space that describes the identity of the subject, an expression latent space that describes the facial expressions of the subject, and a pose latent space that describes the neck pose of the subject. By including a facial expression latent space and a neck pose latent space, body composition analysis circuitry 58 can compensate for effects of facial expression and neck pose by using the identity latent space only to output an estimated body composition of the subject. Additionally, transfer learning methods can be used to selectively enhance pre-trained machine learning models using other data.
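One plausible realization of this identity/expression/pose split, sketched below with assumed layer sizes and names rather than details from the patent, is an encoder trunk with separate latent heads in which only the identity head feeds the downstream regressor:

```python
import torch
import torch.nn as nn

class DisentangledFaceEncoder(nn.Module):
    """Encoder with three latent heads: identity, expression, neck pose.
    Architecture and dimensions are illustrative assumptions."""

    def __init__(self, in_dim=768, id_dim=32, expr_dim=8, pose_dim=4):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
        )
        self.id_head = nn.Linear(128, id_dim)      # subject identity
        self.expr_head = nn.Linear(128, expr_dim)  # facial expression
        self.pose_head = nn.Linear(128, pose_dim)  # neck pose

    def forward(self, x):
        h = self.trunk(x)
        return self.id_head(h), self.expr_head(h), self.pose_head(h)

# Only the identity latent reaches the body-composition regressor, so
# expression and pose variation cannot perturb the estimate.
encoder = DisentangledFaceEncoder()
regressor = nn.Linear(32, 1)
x = torch.randn(5, 768)                 # batch of flattened face/neck features
z_id, z_expr, z_pose = encoder(x)
estimated_body_fat = regressor(z_id)    # shape (5, 1)
```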
[0058] FIG. 7 is a schematic diagram of body composition analysis circuitry 58 being used to determine body composition from body images. Body composition analysis circuitry 58 may be part of control circuitry 16 and/or may be implemented as a standalone circuit. Body composition analysis circuitry 58 may receive information such as body image data (e.g., three-dimensional depth map data of a user's body from depth sensor 36, visible images of the user's body from visible image sensor 26, etc.) and optional additional user data (e.g., user-specific demographic information such as gender, height, weight, age, ethnicity, and/or other user data stored in device 10 and/or otherwise provided to circuitry 58). Based on the received body image data and optional user demographic data, body composition analysis circuitry 58 may output estimated body composition information such as body mass index, body fat percentage, bone mass, and/or other health-related information. If desired, user demographic information may be omitted and body composition analysis circuitry 58 may estimate the user's body composition based solely on the captured body image data.

[0059] Body composition analysis circuitry 58 may analyze body composition using any suitable model. In a two-compartment model, the body is assumed to be made up of two compartments: a first compartment corresponding to fat and a second compartment corresponding to everything other than fat (e.g., muscle, bone, etc.). In a three-compartment model, the body is assumed to be made up of visceral fat, subcutaneous fat, and non-fat. If desired, body composition analysis circuitry 58 may use a three-compartment model and may estimate an amount of visceral fat, subcutaneous fat, and non-fat in a user based on images of the user. Body composition analysis circuitry 58 may estimate body composition of specific regions of the body (e.g., how much visceral fat and subcutaneous fat is located in a user's arms) or may estimate body composition across the entire body (e.g., how a total amount of visceral fat and subcutaneous fat is distributed across the user's body).
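A worked example of the three-compartment bookkeeping follows; the masses are invented for the example, not data from the patent:

```python
def three_compartment_summary(visceral_kg, subcut_kg, nonfat_kg):
    """Total mass = visceral fat + subcutaneous fat + everything else."""
    fat = visceral_kg + subcut_kg
    total = fat + nonfat_kg
    return {
        "total_mass_kg": total,
        "body_fat_pct": 100.0 * fat / total,
        "visceral_share_of_fat_pct": 100.0 * visceral_kg / fat,
    }

# Example: 2 kg visceral + 12 kg subcutaneous + 56 kg non-fat = 70 kg total,
# 20% body fat, with visceral fat making up about 14% of the total fat mass.
summary = three_compartment_summary(2.0, 12.0, 56.0)
```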
[0060] If desired, body image data may be gathered as part of a dedicated body composition analysis (e.g., when depth sensor 36 is being used specifically for obtaining body images for body composition analysis) and/or may be gathered when depth sensor 36 is already being used for some other purpose (e.g., when depth sensor 36 is already being used for some other body scanning purpose). The body image data may include one or more images that are captured of the body from different views (e.g., front view, side profile view, back view, etc.) at different times of the day and/or over multiple days. The image data may include a sequence of images, such as those from a video taken while the subject is breathing and/or moving.

[0061] User demographic information may be received from the user as part of a dedicated body composition analysis questionnaire and/or may be received from the user as part of some other health-related application.

[0062] Body composition analysis circuitry 58 may store a model that is trained using data from user studies. For example, data may be collected from a group of participants (e.g., ten participants, fifty participants, one hundred participants, one thousand participants, and/or any other suitable number of participants) over a given period of time (e.g., one month, two months, three months, six months, eight months, ten months, a year, more than a year, less than a year, etc.). At each point of data collection during the study, the study participant's body shape and size may be measured and the user's body composition may be measured. Body shape and size may be measured using a three-dimensional depth sensor of the type shown in FIG. 3, using anthropometric measurements (e.g., body landmarks and measurements), and/or using any other suitable measuring device (e.g., a three-dimensional body scanner).
