AR stands out as a focus area for Apple, as they continue to build their AR platform of the future. Thanks to AR Quick Look, AR has become extremely accessible and is now deeply integrated into iOS, iPadOS and macOS.
Creating immersive AR experiences has historically been difficult, requiring a vast amount of skill and knowledge. To deliver top-rate AR experiences, developers need to master rendering technologies, physics simulation, animation, interactivity and much more.
Thankfully, that all changed with the introduction of RealityKit.
With RealityKit in your toolbox, creating AR experiences has never been easier.
In this section, you’ll learn all about RealityKit and face tracking. You’ll create a Snapchat-like face filter app with SwiftUI called AR Funny Face, where you get to mock up your face with funny props. You’ll also create an animated mask that you can control with your eyes, brows and mouth.
What is RealityKit?
RealityKit is a new Swift framework that Apple introduced at WWDC 2019. Apple designed it from the ground up with AR development in mind. Its main purpose is to help you build AR apps and experiences more easily. Thanks to the awesome power of Swift, RealityKit delivers a high-quality framework with a super simple API.
RealityKit is a high-quality rendering technology capable of delivering hyper-realistic, physically-based graphics with precise physics simulation and collisions against the real-world environment. It does all of the heavy lifting for you, right out of the box. It makes your content look as good as possible while fitting seamlessly into the real world. Its impressive feature list includes skeletal animations, realistic shadows, lights, reflections and post-processing effects.
Are you ready to give it a try? First, take a quick look under RealityKit’s hood to see what’s inside.
At its core, you’ll find many of Apple’s other frameworks, but the ones doing most of the work are ARKit and Metal.
Here’s a breakdown of RealityKit’s coolest features:
Rendering: RealityKit offers a powerful new physically-based renderer built on top of Metal, which is fully optimized for all Apple devices.
Animation: It has built-in support for skeletal animation and transform-based animation. So, if you want, you can animate a zombie or you can move, scale and rotate objects with various easing functions.
Physics: With a powerful physics engine, RealityKit lets you throw anything at it — pun intended! You can adjust real-world physics properties like mass, drag and restitution, allowing you to fine-tune collisions.
Audio: Spatial audio understanding and automatic listener configuration let you attach sound effects to 3D objects. You can then track those sounds, which sound realistic based on their position in the real world.
ECS: From a coding perspective, RealityKit enforces the Entity Component System design pattern to build objects within the world.
Synchronization: The framework has built-in support for networking, designed for collaborative experiences. It even offers automatic synchronization of entities between multiple clients.
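To give you a taste of how these features come together in code, here’s a minimal sketch (not taken from this chapter’s project) that builds a box entity, attaches physics and collision components to it, and anchors it to a detected horizontal plane:

```swift
import RealityKit
import UIKit

// A minimal sketch: a dynamic box entity anchored to a horizontal
// plane, with physics and collision components attached.
func makeBoxAnchor() -> AnchorEntity {
  // An entity with a render mesh and a material.
  let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)])

  // Physics: give the box collision shapes, mass and gravity response.
  box.generateCollisionShapes(recursive: true)
  box.physicsBody = PhysicsBodyComponent(
    massProperties: .default, material: .default, mode: .dynamic)

  // Anchor the entity against a detected horizontal plane.
  let anchor = AnchorEntity(plane: .horizontal)
  anchor.addChild(box)
  return anchor
}
```

You would then add the result to a running view with `arView.scene.addAnchor(makeBoxAnchor())`.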
Enough talk, it’s time to dive into some code!
Creating a RealityKit project
Now that you have some understanding of RealityKit’s features, you’ll create your first RealityKit project. Launch Xcode and get ready to create a new Augmented Reality App project from scratch.
ContentView.swift: Since this is a SwiftUI-based app, the user interface is defined here. From the preview, you can see that the UI is currently a blank slate. Internally, the ContentView constructs an ARView that loads and presents the scene defined within the Experience.rcproject Reality Composer project. Note, too, how this file sets up the ARView.
Experience.rcproject: This is a Reality Composer project, which is essentially a 3D scene that contains the box and the box anchor you used in the previous step.
Assets.xcassets: This contains all of your project assets, like images and app icons.
LaunchScreen.storyboard: Here, you’ll find the UI the user will see while your app is launching.
Info.plist: Contains the app’s basic configuration settings. Note that there’s already a Camera Usage Description property, which you will need to change to something more appropriate for your app. This allows the app to request access to the camera from the user, which you need to deliver the AR experience through the camera feed.
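For orientation, the generated ContentView.swift looks roughly like this; `Experience.loadBox()` is the accessor Xcode generates for the Box scene in Experience.rcproject:

```swift
import SwiftUI
import RealityKit

struct ContentView: View {
  var body: some View {
    // Hosts the AR view inside the SwiftUI hierarchy.
    ARViewContainer().edgesIgnoringSafeArea(.all)
  }
}

struct ARViewContainer: UIViewRepresentable {
  func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)
    // Load the "Box" scene from Experience.rcproject and show it.
    let boxAnchor = try! Experience.loadBox()
    arView.scene.anchors.append(boxAnchor)
    return arView
  }

  func updateUIView(_ uiView: ARView, context: Context) {}
}
```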
RealityKit API components
Now, take a look at a few of the main components that make up the RealityKit API.
Here’s an example of a typical structure containing all of the important elements:
ARView: The ARView sits at the core of any RealityKit experience, taking responsibility for all of the heavy lifting. It comes with built-in gesture support, allowing you to attach gestures to entities. It also handles the post-processing camera effects, which are very similar to the effects you get in AR Quick Look.
Scene: Think of this as the container for all of your entities.
Anchors: RealityKit exposes ARKit’s available anchors (plane, face, body, image and object) as first-class citizens. Anchors form the base for entity structures. Note that content attached to an anchor only shows after you’ve successfully anchored it, and its content, to the real world.
Entities: You can consider each element of the virtual content in a scene as an entity: the basic building block of your experience. You can construct a tree-like hierarchical structure by parenting entities to other entities.
Components: Entities consist of different types of components. These components give the entities specific features and functionality, like how they look, how they respond to collisions and how they react to physics.
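Since this section’s project revolves around face tracking, here’s a short sketch of how those components fit together for a face-anchored experience. The `.face` anchoring target and `ARFaceTrackingConfiguration` are real APIs, but the entity and its placement are illustrative assumptions:

```swift
import RealityKit
import ARKit

// Sketch: run a face-tracking session and anchor an entity to the face.
// Requires a device with a TrueDepth camera.
func startFaceTracking(on arView: ARView) {
  // Start ARKit's face-tracking session under the hood.
  let config = ARFaceTrackingConfiguration()
  arView.session.run(config)

  // Entities parented to this anchor follow the user's face.
  let faceAnchor = AnchorEntity(.face)
  let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.03),
    materials: [SimpleMaterial(color: .red, isMetallic: true)])
  sphere.position = [0, 0.1, 0]  // roughly 10 cm above the face origin
  faceAnchor.addChild(sphere)
  arView.scene.addAnchor(faceAnchor)
}
```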
Building the UI with SwiftUI
When you created the app, you selected SwiftUI for the user interface. Now, you’ll take a closer look at what you need to build the UI for a basic RealityKit AR app with SwiftUI.
You’ll use the Next and Previous buttons to switch between various AR scenes, while the Shutter button will take the all-important selfie. Your first order of business is to learn how to track the active prop by implementing the Next and Previous buttons.
Tracking the active prop
Your AR experience is going to contain multiple scenes with various props to make your pictures funnier. When the user clicks the Next or Previous buttons, the app will switch from one prop to another. You’ll implement that functionality now.
Open ContentView.swift and define a variable to keep track of the active prop by adding the following line of code at the top of ContentView:
To overlay the UI content on the AR view, you place the elements into a ZStack.
You provide the $propId as a parameter for ARViewContainer(), establishing the binding property. So when the value of propId changes, it invalidates the ARView.
Finally, you stack the buttons horizontally inside an HStack.
Note: This will temporarily cause a compiler error. Ignore that for now; you’ll fix the problem in the next section.
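Assembled, those steps look roughly like this sketch; the exact modifiers may differ from the book’s listing, and ARViewContainer doesn’t accept the binding yet, which is the compiler error the note above mentions:

```swift
import SwiftUI
import RealityKit

struct ContentView: View {
  // Tracks which prop scene is currently active.
  @State var propId: Int = 0

  var body: some View {
    // The ZStack overlays the buttons on top of the AR view.
    ZStack(alignment: .bottom) {
      // Passing the binding; changes to propId invalidate the ARView.
      ARViewContainer(propId: $propId).edgesIgnoringSafeArea(.all)

      HStack {
        // The Previous, Shutter and Next buttons go here.
      }
    }
  }
}
```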
Adding buttons
The buttons all use images. So next, you’ll add the required images to the project by dragging and dropping all the image files from starter/resources/images into Assets.xcassets.
Under the Properties panel, make sure to set Image Set ▸ Render As to Original Image. Otherwise, the images will render with a blue tint.
This code works nearly the same as the Previous button. The only difference is that the action makes a call to a function named self.takeSnapshot(), which you’ll define at a later stage. Leave it commented out for now.
Finally, add the following code for the Next button:
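The contents of the HStack might end up looking like this sketch, which drops into ContentView’s body. The image asset names, the prop count of three and the takeSnapshot() helper are assumptions for illustration:

```swift
HStack {
  Spacer()
  // Previous: wrap around to the last prop when going below zero.
  Button(action: { self.propId = self.propId <= 0 ? 2 : self.propId - 1 }) {
    Image("PreviousButton").clipShape(Circle())
  }
  Spacer()
  // Shutter: stays commented out until takeSnapshot() exists.
  Button(action: { /* self.takeSnapshot() */ }) {
    Image("ShutterButton").clipShape(Circle())
  }
  Spacer()
  // Next: wrap around to the first prop past the end.
  Button(action: { self.propId = self.propId >= 2 ? 0 : self.propId + 1 }) {
    Image("NextButton").clipShape(Circle())
  }
  Spacer()
}
```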
You’re not done just yet. You still need to make sure the app asks for access to the camera and the photo’s album.
Open Info.plist and find Privacy - Camera Usage Description. Set the value to Access required for AR experience.
When the user starts the app for the first time, it will now request access to the camera.
Add a new key by pressing the Plus Sign button next to the current row.
For the new key, select Privacy - Photo Library Additions Usage Description. Set the value to Access required to save selfies.
This will request access to the photo library when the user takes a selfie.
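If you prefer editing the plist source directly, the two entries above correspond to these raw keys; the string values match the ones suggested in the steps:

```xml
<key>NSCameraUsageDescription</key>
<string>Access required for AR experience.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Access required to save selfies.</string>
```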
Finally, grab your phone, do a quick build and run and test the app.
The app starts and requests access to the camera. The scene loads and presents the cube. The UI you built overlays on top of the ARView. When you select the Shutter button, it requests access to the photo library, then it takes and stores a snapshot.
For now, the Previous and Next buttons won’t do much, but you’ll deal with them in the next chapter.
Key points
You’ve reached the end of this chapter. To recap some of the key takeaways: