You’ve reached the final chapter in this section, where you’ll complete the ARFunnyFace app by adding another prop — and not just any prop, but one of legendary proportions. Prepare to come face-to-face with the mighty Green Robot Head!
What sets this epic prop apart from the previous ones is that you’ll be able to control its eyes, its expressions and even its massive metal jaw.
Like a true puppeteer, you’ll be able to animate this robot head using your own facial movements and expressions, thanks to facial blend shapes. Awesome!
What are facial blend shapes?
ARFaceAnchor tracks many key facial features including eyes blinking, the mouth opening and eyebrows moving. These key tracking points are known as blend shapes.
You can easily use these blend shapes to animate 2D or 3D characters to mimic the user’s facial expressions.
Here’s an example of a 2D smiley character that animates by tracking the user’s eyes and mouth.
Each key tracking feature is represented by a floating point number that indicates the current position of the corresponding facial feature.
These blend shape values range from 0.0, indicating the neutral position, to 1.0, indicating the maximum position. The floating-point values essentially represent a percentage ranging from 0% to 100%.
As the user blinks both eyes, the blend shape values start at 100% open, then gradually reduce to 0% open.
The mouth works the same way, starting at 100% open, then reducing to 0% open.
You use the percentage values to animate the corresponding facial features from a 100% open position to a 0% open position — aka, closed.
You can even prerecord the blend shape data, which you can play back at a later time to animate your game characters, for example. Sweet!
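Since each coefficient is just a 0-to-1 float, driving an animation comes down to plain linear interpolation. A small sketch of the idea; the function name and the angle endpoints here are illustrative, not values from this chapter:

```swift
// Map a blend shape coefficient (0.0 = neutral, 1.0 = maximum)
// onto an eyelid angle in degrees. Endpoint angles are illustrative.
func eyelidAngle(forBlink blink: Float,
                 openAngle: Float = -120,
                 closedAngle: Float = -90) -> Float {
  // blink == 0 -> fully open, blink == 1 -> fully closed.
  return openAngle + (closedAngle - openAngle) * blink
}
```

A coefficient of 0.5 lands exactly halfway between the two endpoint angles.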
Building the robot
Next, it’s time to build the Mighty Green Robot head. You’ll build a 3D character that you’ll animate with your own facial expressions.
Open starter/ARFunnyFace/ARFunnyFace.xcodeproj in Xcode, then select Experience.rcproject and open it in Reality Composer.
Open the Scenes panel and create a new scene that uses a Face Anchor. With the Properties panel open, rename the scene to Robot.
Now, you'll add a Model ▸ Capsule to the Robot scene.
Under the Transform section, give the capsule the Position and Rotation it needs to sit in place, and leave the Scale at 100%.
Finally, go to the Look section, choose Matte Paint for the Material and set the Material Color to Black. Set the Capsule Diameter and Height to proportions that fit the head.
Great, now the app can select the additional prop, making it a part of your available props. Now, you just need to handle the new case in the update code.
Add the following code to handle the robot prop in updateUIView(_:context:):
case 3: // Robot
// 1
let arAnchor = try! Experience.loadRobot()
// 2
uiView.scene.anchors.append(arAnchor)
// 3
robot = arAnchor
break
Here, you load the Robot scene from the Experience Reality file as arAnchor, then append it to the scene's anchors. Finally, you store the anchor in robot so you can access the elements of the robot head later.
Rih xoikp ti o vyuad qaye fi we o siovr vqont za fazi yuva ahuqdfjolm smivy hiynh od ulsujnut. Vi i soufj geegg oxg bop.
Alijxsxehn ug bpebt fedq rkedel… vau’ym ohpdeby lkaf yodl.
Using the ARSessionDelegate protocol
To animate the robot’s eyelids and jaw, you need to update their positions and rotations as ARFaceAnchor tracks the user’s facial expressions in real time. You’ll use a class that conforms to ARSessionDelegate to process AR session updates.
Updated Frame: Provides a newly-captured camera image, along with AR information, to the delegate as an ARFrame.
Added Anchors: Informs the delegate that one or more anchors have been added to the session.
Removed Anchors: Informs the delegate that one or more anchors have been removed from the session.
Updated Anchors: Informs the delegate that the session has adjusted the properties of one or more anchors. This is where you can monitor any changes to the blend shapes you're tracking. Adjusting a blend shape will trigger a session update.
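For reference, here's a sketch of how those four ARSessionDelegate callbacks are declared; all of them are optional, and in this chapter you'll only implement the one that receives updated anchors:

```swift
import ARKit

// Reference sketch of the ARSessionDelegate callbacks described above.
class SessionCallbacksSketch: NSObject, ARSessionDelegate {
  // Updated frame: a newly-captured camera image plus AR information.
  func session(_ session: ARSession, didUpdate frame: ARFrame) { }

  // Added anchors: one or more anchors were added to the session.
  func session(_ session: ARSession, didAdd anchors: [ARAnchor]) { }

  // Removed anchors: one or more anchors were removed from the session.
  func session(_ session: ARSession, didRemove anchors: [ARAnchor]) { }

  // Updated anchors: the session adjusted the properties of one or
  // more anchors; this is where blend shape changes arrive.
  func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) { }
}
```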
Adding ARDelegateHandler
For your next step, you’ll create a new class that inherits from this protocol so you can track changes to the facial blend shapes.
Add the following class to ARViewContainer:
// 1
class ARDelegateHandler: NSObject, ARSessionDelegate {
// 2
var arViewContainer: ARViewContainer
// 3
init(_ control: ARViewContainer) {
arViewContainer = control
super.init()
}
}
Here's a closer look at what this does:
This defines a new class called ARDelegateHandler that inherits from NSObject and adopts ARSessionDelegate.
When the class initializes, it receives an ARViewContainer and stores it in arViewContainer.
With a SwiftUI representable, you need to create a custom instance to communicate changes from the view back to the other parts of the SwiftUI interface. You'll use a makeCoordinator to create this custom instance.
To do so, add the following function to ARViewContainer:
This defines makeCoordinator and indicates that it will provide an instance of ARDelegateHandler. It then creates an actual instance of ARDelegateHandler, providing self as the ARViewContainer.
Now that everything's in place, you can set the session delegate for the view. Add the following line of code to makeUIView(context:), just after initializing arView:
arView.session.delegate = context.coordinator
Here, you set the view's session delegate to the context's coordinator, which now starts receiving the session updates as it detects any changes.
Handling ARSession updates
With the delegate class in place, you can now start tracking updates to any of the facial blend shapes.
Add the following function to ARDelegateHandler:
// 1
func session(_ session: ARSession,
didUpdate anchors: [ARAnchor]) {
// 2
guard robot != nil else { return }
// 3
var faceAnchor: ARFaceAnchor?
for anchor in anchors {
if let a = anchor as? ARFaceAnchor {
faceAnchor = a
}
}
}
Here's what's happening above:
This defines session(_:didUpdate:), which triggers every time there's an update available in the anchors.
You're only interested in anchor updates while the robot prop is active. When robot is nil, you simply skip all updates.
This extracts the first available anchor from the provided anchors that qualifies as an ARFaceAnchor, then stores it in faceAnchor. You'll extract all the anchor blend shape information from here.
Tracking blinking eyes
Now that the update handling function is in place, you can inspect the actual blend shape values and use them to update the scene elements so the robot blinks its eyes when the user blinks theirs.
Start by adding the following chunk of code to the bottom of session(_:didUpdate:):
let blendShapes = faceAnchor?.blendShapes
let eyeBlinkLeft = blendShapes?[.eyeBlinkLeft]?.floatValue
let eyeBlinkRight = blendShapes?[.eyeBlinkRight]?.floatValue
Here, you access the blendShapes through the unwrapped faceAnchor. You then extract the specific blend shape for eyeBlinkLeft to get its current value, which is provided as a floatValue.
Then you use the same approach to get the current value for eyeBlinkRight.
Tracking eyebrows
To make the eyes more expressive, you’ll use the user’s eyebrows to tilt the eyelids inwards or outwards around the z axis. This makes the robot look angry or sad, depending on the user’s expression.
browInnerUp: Tracks the inner, upward movement of both eyebrows.
browDownLeft and browDownRight: Tracks the left- and right-side downward movement of the user's eyebrows.
To put this into place, add the following to the bottom of session(_:didUpdate:):
let browInnerUp = blendShapes?[.browInnerUp]?.floatValue
let browLeft = blendShapes?[.browDownLeft]?.floatValue
let browRight = blendShapes?[.browDownRight]?.floatValue
Pmion, qow kee’fa wtivqovz pko efitqukv. Jba igzv nnipl mahq qo da aj ge uwiyh mfi ufianzifaos on tpo ohajivx xuzn dyagu lqikj fvuto fusoif. Pe mo il, ckuuwr, hoa’bg uvqu goop qi jpexz tnew fwu ajiq on qaedx xebr fsuob zit.
Tracking the jaw
Now, you'll track the user's jaw and use it to update the robot's jaw orientation. You'll use the jawOpen blend shape to track the user's jaw movement.
Add the following line of code to the bottom of session(_:didUpdate:):
let jawOpen = blendShapes?[.jawOpen]?.floatValue
Now, you're going to use special functions to orient both the eyelids and the jaw.
Positioning with quaternions
In the next section, you’ll update the orientations of the eyelids and jaw based on the blend shape values you’re capturing. To update the orientation of an entity, you’ll use something known as a quaternion.
A quaternion is a four-element vector used to encode any possible rotation in a 3D coordinate system. A quaternion represents two components: a rotation axis and the amount of rotation around that axis.
Three vector components, x, y and z, represent the axis, while a w component represents the rotation amount.
Quaternions are difficult to use directly. Luckily, there are a few handy functions that make working with them a breeze.
Here are the two important quaternion functions you'll use in this chapter:
simd_quatf(angle:axis:): Lets you specify a single rotation by means of an angle along with the axis the rotation will revolve around.
simd_mul(p:q:): Lets you multiply two quaternions together to form a single quaternion. Use this function when you want to apply more than one rotation to an object.
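To see what these two functions compute, here's a pure-Swift sketch of the underlying math: a quaternion built from an angle and an axis, and the Hamilton product that combines two rotations. This mirrors the simd API conceptually; it is not the simd implementation itself.

```swift
import Foundation

// Minimal quaternion math, for illustration only.
struct Quat {
  var w, x, y, z: Double

  // Rotation of `angle` radians around a unit `axis`,
  // like simd_quatf(angle:axis:).
  init(angle: Double, axis: (Double, Double, Double)) {
    let half = angle / 2
    w = cos(half)
    x = sin(half) * axis.0
    y = sin(half) * axis.1
    z = sin(half) * axis.2
  }

  init(w: Double, x: Double, y: Double, z: Double) {
    self.w = w; self.x = x; self.y = y; self.z = z
  }

  // Hamilton product: the combined rotation, like simd_mul.
  static func * (p: Quat, q: Quat) -> Quat {
    Quat(w: p.w * q.w - p.x * q.x - p.y * q.y - p.z * q.z,
         x: p.w * q.x + p.x * q.w + p.y * q.z - p.z * q.y,
         y: p.w * q.y - p.x * q.z + p.y * q.w + p.z * q.x,
         z: p.w * q.z + p.x * q.y - p.y * q.x + p.z * q.w)
  }
}
```

Multiplying two 45° rotations about the same axis yields the 90° rotation you'd expect.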
You have to provide angles in radians. To make life a little easier, you'll use a simple helper function that converts degrees into radians.
Add the following helper function to ARDelegateHandler:
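A one-line conversion is enough; here's a sketch of the helper the text describes (the name Deg2Rad is an assumption):

```swift
// Converts degrees to radians, since the quaternion initializers
// expect radians.
func Deg2Rad(_ value: Float) -> Float {
  return value * .pi / 180
}
```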
Similar to setting the eyelids, the jaw rests at a default rotation with a range of motion mapped to the jawOpen blend shape.
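At the bottom of session(_:didUpdate:), the captured values get composed into per-entity orientations. A sketch, assuming the Reality Composer scene exposes entities named eyeLidL, eyeLidR and jaw, that the degree-to-radian helper is named Deg2Rad, and with illustrative angle ranges rather than this book's exact values:

```swift
// Illustrative angles; tune them to your scene. The entity names
// (eyeLidL, eyeLidR, jaw) are assumptions that must match the names
// in your Reality Composer scene.
robot.eyeLidL?.orientation = simd_mul(
  // Blink: rotate the eyelid around the x axis.
  simd_quatf(angle: Deg2Rad(-120 + (90 * eyeBlinkLeft!)),
             axis: [1, 0, 0]),
  // Brow: tilt the eyelid around the z axis for expression.
  simd_quatf(angle: Deg2Rad((90 * browLeft!) - (30 * browInnerUp!)),
             axis: [0, 0, 1]))
robot.eyeLidR?.orientation = simd_mul(
  simd_quatf(angle: Deg2Rad(-120 + (90 * eyeBlinkRight!)),
             axis: [1, 0, 0]),
  simd_quatf(angle: Deg2Rad((-90 * browRight!) + (30 * browInnerUp!)),
             axis: [0, 0, 1]))
// Jaw: open around the x axis, driven by jawOpen.
robot.jaw?.orientation = simd_quatf(
  angle: Deg2Rad(-100 + (45 * jawOpen!)), axis: [1, 0, 0])
```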
And that's it, you're all done! Time for another build and run.
You can now blink, frown and control that huge robot jaw. However, this robot looks a bit on the angry side! :]
Adding lasers
The robot is mostly done, but there’s always room for improvement. Wouldn’t it be cool if it could shoot lasers from its eyes when it gets extra angry?
Your next goal is to really bring those lasers to life. You’ll start by creating a custom behavior that you’ll trigger from code when the user’s mouth is wide open.
While the lasers are firing, you have to wait for them to finish before you can fire another laser. To achieve this, you'll send a notification to your code to indicate when the lasers are finished.
The first thing you need to do is to hide the lasers when the scene starts. Open the Behaviors panel, then add a Start Hidden behavior.
Next, you want the lasers to become visible and make a noise, then disappear again.
To make this happen, add a custom Show action with a Play Sound action sequence. For the Show action, set a short Duration. For the Play Sound action, pick a suitable laser audio clip.
Now, to finish the sequence, add a custom Hide action with a Play Sound action sequence, mirroring the settings from the previous step.
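With the behaviors set up, you can trigger them from code. A sketch, with hypothetical identifiers: a notification trigger named showLasers and a Notify action named lasersDone; RealityKit generates matching members on the loaded scene for whatever identifiers you actually choose.

```swift
// Hypothetical trigger/action names; they must match the identifiers
// you set in the Reality Composer Behaviors panel.
func fireLasers(mouthWideOpen: Bool) {
  guard mouthWideOpen else { return }
  // Fires the behavior whose trigger is a "showLasers" notification.
  robot.notifications.showLasers.post()
}

// Get notified when the laser sequence finishes, so it's safe to
// fire again.
robot.actions.lasersDone.onAction = { _ in
  // Reset your "lasers are firing" flag here.
}
```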
Congratulations, you’ve reached the end of this chapter and section. Before grabbing a delicious cup of coffee, quickly take a look at some key points you’ve learned in this chapter.
To recap:
Facial blend shapes: You've learned about facial blend shapes and how they're used to track a face's key points.
ARSessionDelegate: You learned how to handle scene updates via the ARSessionDelegate. Every time a blend shape updates, it triggers a session update, allowing you to update the entities within the scene.
Using blend shapes: You've learned how to track blend shapes and use the data to update entity orientations.
Quaternions: You know what quaternions are and how to use helper functions to simplify them, making rotations a breeze to work with.