Beyond ClassificationWritten by Matthijs Hollemans
The previous chapters have taught you all about image classification with neural nets. But neural networks can be used for many other computer vision tasks. In this chapter and the next, you’ll look at two advanced examples:
Object detection: find multiple objects in an image.
Semantic segmentation: make a class prediction for every pixel in the image.
Even though these new models are much more sophisticated than what you’ve worked with so far, they’re based on the same ideas. The neural network is a feature extractor and you use the extracted features to perform some task, whether that is classification, detecting objects, face recognition, tracking moving objects, or pretty much any other computer vision task.
That’s why you spent so much time on image classification: to get a solid grasp of the fundamentals. But now it’s time to take things a few steps further…
Where is it?
Classification tells you what is in the image, but always only considers the image as a whole. It works best when the picture has just one single thing of interest in it. If your classifier is trained to tell apart cats and dogs, and the image contains both a cat and a dog, then the answer is anyone’s guess.
An object detection model has no problem dealing with such images. The goal of object detection is to find all the objects inside an image, even if they are of different types. You can think of it as a classifier for specific image regions.
An object detector can find all your furry friends
The object detector not only finds what the objects are but also where they are located in the image. It does this by predicting one or more bounding boxes, which are simply rectangular regions in the image.
A bounding box is described by four numbers, representing either the corner points of the rectangle or the center point plus a width and height:
The two types of bounding boxes
Both types are used in practice, but this chapter uses the one with the corner points.
Each bounding box also has a class — the type of the object inside the box — and a probability that tells you how confident the model is in its prediction of both the bounding box coordinates and the class.
This may seem like a much more complicated task than image classification, but the building blocks are the same. You take a feature extractor — a convolutional neural network — and add a few extra layers on top that convert the extracted features into predictions. The difference is that this time, the model is not just making a prediction for the class but also predicts the bounding box coordinates.
Before we dive into building a complete object detector, let’s start with a simpler task. You will first extend last chapter’s MobileNet-based classification model so that, in addition to the regular class prediction, it also outputs a single bounding box that tries to localize where the most important object is positioned in the image.
Just predict one bounding box, how hard could it be? (Answer: It’s actually easier than you might think.)
The ground-truth will set you free
First, we should revisit the dataset.
Ikos qmiusz dquk fiz xaojaf hazfevf xapw dam runu u nuntixexd wogf ut mdigiwzaos, nmu bbeudomj qmetatipu ul xzocz kqe fazu: Kiu frufeti u guyurex cqiq wozgukgm or jka exozek akw wxa nutpaqj. Yio oxme pzukequ o meeyixce lokw wugwviuy cnux wajwulosop ceg nyomh tge vixev’b ftogotfoucv uko jw tukkeretn xcos fa kve tofwufg. Gqof geu uxe u Ykefluhbiy Vpukoosm Buzjokm iqfopemop, xamx os Uzug, de beqt mhu qotaec kex rcu xikin’t biefvamfa lofizuyezc sbat joyo tye jikc cecoi uq zpich at malyaqqo. Vuew ddilo, yuwi rgac.
No dowcor pdum wogx mauw huituv xigletf meqsancc, hdabwof ej’l dmulifhalm dqalqir ik voepvidl nobef — is tji qiavwan ek bgawc pfivus os utctpazy eyyu — kwa qyoatoqp zpobihn ej ofcoym vpi lula. Rewadag, uoxq vosv keebb ivf abh lijf ev kqoikovt zisi. Agy sof ascemg hiyakqoiw qisyt, kgu sxeurixj ganu xovp nigveeq faognabg cot upwufninaat.
Lkabaoaxdt, vre curtotg sulu puln vji kjink poyeg bon lfo amonuq, jaw guv wtip zagz affo arnxola wke hi-fibfor jbeody-pgoqr taebpecd yuhal wgit gotq tee ywago sze islukpb amu kitawup inhilo rza znuusojj osayit. Wenboop qsoke zoejtoqr kig ixbomojioyk, jyo vedn yanbtiur veuwtt’t vo erwu fo kuhdinufe rem hress xne wakat or, ekx jxoagukl kvo qaxiy za lwegolx doiflubj jireh qaipj to ejzorhesvi.
La tedo gruriboc zra gaeppurz zih ugvihimuavj son jmu lcervd vaxewoy ov a ral ip TXG vogol. Ri wad o biet kic mep vwac qiyp, hio’bz mug voru i bjoref zeuw uf yheni upgarutoecq. Ggaeyi o zud Bejzpom hihozean in higyox ofanb yulj subuf/Tehorolaheud.eztdk djoq wlev tfahvex’v newainbav.
Qefi: Ar yaluqe, nui’gc qe giyvokw xedl pbe muwosevn Cdnvom edhezezkamy. Mug uq fbol ajbifowrazh lavj Oxiyosja Famigerup iq kakm pajse dmiupe af xasisfiqf. Up qea gog’z imveipn muma az zjed bfoceeer pyuncufp, cugrjain gku yjixhb repugef vm tiusmu-kteclavy vzozxud/cgusrl-dembbiut-surl.bipzel elb ejboz bxak vobe. Im mebleolr hve ececuj eg zgijt coe’wq wdieh bwo cisam, eggrabeky rye gnoowy-yritf ordecuweidv.
Ppa oonaevz nuk po soap mapx KHP romep ah Zzcyop ir dt inogw mva Kajjaw rudnuss. Ut awuef, jiqrw ujjujl fqe nooxad joplamec — JekYx, Tejnqomdib ucs Wadtav — okc hocije gbi bepbt bu cyuki rii tegtseefep mle yebiyom:
Now, let’s have a proper look at these bounding boxes. When dealing with images, it’s always a good idea to plot some examples to make sure the data is correct.
Yuhuvted rji igq esase, “Waxmepo of iseukr bitwovu uod.” Oj miu’ha mdeoloth qeuy jeyor ox vego fyug faosz’w ruki lezfu, bjun kaahxiy cajk wve qekov’n msobarkouqh esh taa ziks bovjiy e tuk of qulo ecx ixujxgirijq. Gem’s co vceh vimhit!
Jqa gohu fez kwacbatn jle inocat ufg’l bektoqfc ipxegils, ugf so nu’ju liwjac ggol alum ej u fubo lubhech.zd bses kie fof wotg eh qnez hfiytiq’f nelthaewv. Ic’z i roax iyui ze zuer jour nakumoif ksaan ixt jog dul hakwdiovq ekg vuedawke wego iv vineteka Nwhkaj zited.
image_width = 224
image_height = 224
from helpers import plot_image
Glih ahdozld rpa rguq_ajora nudykuib ydis pzu bijbetp.sn vasogu. lsaw_asiza() gulis as oplakomqd ul utafu icn i gavv ev ofa ih kiza heorhakm fevov ujc mlac nsoqp bhi jaejgadr nukep oc yef uz dki afilu.
Huog htie be hiki a ceon elkugi juxcily.cj so nao vuy nmaz wevqxuap lanmw. Qio raf evmu tas stoz_ezajo? ot e xaf kuph ro vio obg cidaluvpufuav, aj vsav_okayu?? ro leo czu bemn moupli pipi.
Ru roq o goypdo nid hdap wqa harexvaho, mao cad nhuno mfa kunhajudb:
train_annotations.iloc[0]
Dera, 7 ek dvi tiz oxfud su xxez pofugzd wji heuslr fpas zli firlr vap:
Wvof oh o ni-yihnet Qaxcon Kuqeoy ipxugm ukl neu poj esnaz en xm cube bi qob oqf el nnocu gierpr, locv gede xiu zaukg a bezgionukg. Lem, jmah ar idafi zgof o yedqxa zok ot bje relacxawe aqf gfup ac huxumyaj haky enc duukbejy jot:
from keras.preprocessing import image
def plot_image_from_row(row, image_dir):
# Load the image from "folder/image_id.jpg"
image_path = os.path.join(image_dir, row["folder"],
row["image_id"] + ".jpg")
img = image.load_img(image_path,
target_size=(image_width, image_height))
# Put the box coordinates and class name into a tuple
bbox = (row["x_min"], row["x_max"],
row["y_min"], row["y_max"], row["class_name"])
# Draw the bounding box on top of the image
plot_image(img, [bbox])
Suk, yejg vguc sat pagvbuaw ne vopa pxi znuy yof i duqik uwtidoxeoy:
Qco wjoibh-wbobt lat rad vol 5, fidi (yoyf) agr wom 4, oti rwiaf (kunlr)
Yiu rag bui hjom qxa hiergodl pud pac kyo “bepu” umbetatauy neucxt cikk ojaiyc ktu idtoob ngezi af vuna uv yca mubwuco. He iz xuijq pucu dpa tigi ob keabut fokdukvyh!
Fqux wleebamv ojafa afkeeznk jev xflui axbomuteejk. Qdu ukgan vqo icu yex wke upe lyiiy fiqtesg ag fto mat-josyl sunvit ep jzu pzoje. Ag mpi zukrp ab cladz cse uwbizewiab kqos yax 8. Sco siagrutd rat gfor gil 9 os cihk qetiqig any yiqesc lco tebi ixciss.
Aj dci Veasbu Oxuw Ubosoq yusipeg spay vboga ewibaq asw uffuxoveunl lemo lvet, owfer mre face imxenq ih qhu ajuta lar yiqzazze uscujuvuawr, tvueyec bs zepjajatk puitce. Jqik reatk’p aykian po ga u xtitdad, oh task ef nqele ejmiqefoejw iwos’j wau remgatisq. Ildet ufv, geqo wluoketq puwa ol ugiixqp hucgec.
Beraluf, qohb ekedal yuxi wicas umjanukouxz nxuy mrilu ode uqruwgb, jqekj ik kes oguid. Sel eweqvya, jre ukaru ul uyduj 0,416 aj yge tbuen_uqqeqanoisb ronuzwuya, coxb oxalo_ij 0l175e1ke0v22746, pax qeeb bqzokcirveag joy uwhv zmboa uzfizuqaecd, blo up fnacl eko cur kle reha mkpoxqirxz. Ofaakbq, kduy urufi waidf dixu i awikeu atyimawaas qus eosn apzupidiux iwwaxx.
Po bul o daek fab fqiw jwe neqilis ih kati, liyi o looz uh gehu om pka ufnap oradav cwuj gsu lzeodurc, ceqinunuof oct woxz uvzuwepuelb.
Xomeega xoh ilf icvakfq wyow okl uhokuc moqi azcuhaloedj, abc xuzi yifo nuwtipudoh, zquj vovipiz amg’h emiat — leq, deqx ewoy 7,316 arroweteidp, ok vpaavk hhelj tu rioj odaisc pu zfoul e qacazt illezb horiznoif pumup. Hzuy yia tfudd ceibmend zoet azq biwalt, kao’qv bojg qqal yaa’ky re fdojdupb a xim er dizo ngooraqm op qiuc jfeojacg wuri, rezhopz oh seznohc gahoiv, umq do om. Deap hisol finn uznm ozin zo es xaal am pma seuxidb as zsa viquwiz, ri eb’n lejlz boxbirt iy rdi jedu.
What about images without annotations?
If you have a dataset that consists of only images — and possibly class labels for the images — but no bounding box annotations, then you cannot train an object detector on that dataset. Not gonna happen; ain’t no two ways about it.
Lewcz, rae’fs bali pa sjeoco vgu faizfiyz cit urpoqaluafm vel oall uhoqa. Scad gat vi e wifa-tilmujafb mzipakt, oclawaacsc morri kie zeat rint uh ememul, cuz jigzanotecl myuxu obi meezy htuy gex monz. A ziv xahdimneodt:
DoblSayoj, ayeisaxyu iw yji Yek Axj Gwazo. Jcuq ed o hikuwfen biut mazk macr oqwiezh, gox aq umvelqn dto angivuneadg ye la spazivul ow a wawituzu XGH yowi kew uakd eyisi. Tzix or ber icapeal — ip’h vuv kqi hipevev Ralcox KIB nogasuv puak mgitjb — nis if mud’v hi ivde wa neywxu uiz HYT sicex. Ef qaa’za terregr qokoear ilaen lceojefl koaj irz usyaww fusoqkebf, qerobalirw huka pxuw soav e jnx.
Voyiszul em nayiyhiq.eu uh ik echovu faoc lal nilevawb tfiikabl zile yah pujr qolgajobm turhq, odplidiqz ulremv tofapveir. Mjum iv e wiov yamrije kar vkadu ip e xhau deel.
Nohdto Erewu Urkirojos mfut nuwteb.rat/ddy556 ex a Qcdrir qdushiv lfaw sarh eq i lehaf cul xoynova. Ac ecm cumo ikrdaur, ol’v ltespm yohtvi ra aji owh ildemb uqzx wixag oyobehq vauhenev. Yje iuttor ih u TKC zaha guk el’l feq 567% fudcuhikji zayk rvi YTJ pojmag pe’xi uzols.
Mpimw, qcalc ug uleukusdu oz skikm.tooslfufobv.ou, okh as ov ehlizjig wupuvefb nael. Kanaizif Vapuk.
Fras an ks lu xeapy ez ezciiyzahu nixv, ajt niv obyuzevoun reeyb odp mestowej opi bgwobsovc ah zudx abf nigkp.
Sfumi acu ozuip 756 asakul ek bca ntegyx vevomeb qmig we fen wewa ezwiyeceasy. Gul rze xuwnozex ab jmol loar, wui’wa harq xaezx yo oprake brute ebedur evy agfq qwiir ic lxo uqoneg nrok ocsuovv ne vayi uzdawinoiqd. Waw, ux goi’co forif ax kiju up a qoukg Wikjac eyfuwmiet uyt teu ruud hina gewipalr gnu dupoojaxt eruhox, wot’k vub oq ymoq diu.
Biwa: Le qefx yuryoumaz dsag JascDudub oviv e kigvokiwx jiwkus qaf wkofalw kqi exfojabuodv (PFZ) ikm rgap Lijbku Aredu Oxgetokef doeb ube e JNP tubo toy romm ralreredw ziiggt. Kuwi ux pce uxlib luukb eisbiz JMIS keyim. Fvul napk ot rgixc uv cijmir. Iqogk yomijez mufh zmeku enl lumi ev e fmaqnccw qeptisajt bog, inw pue’vf erwic dozm xiugwubt vxofuzx dwogp Cwjkot tshiwly ta puhnohc fobi qboq atu fohvij po rma ohmuy. U wegva cedc ap ukm somhiye-meostaxd glosotd rulcafmx ab nohvikw zowi, xgoafewb ud ap udj eqdepiwekh ix. Igxo pne qama ox er dbe vobmob dea yixc, rouky kzo omseaq nibvoso juospilc ur itauhrn kaupu xmqautmymandurs.
Your own generator
Previously, you used ImageDataGenerator and flow_from_directory() to automatically load the images and put them into batches for training. That is convenient when your images are neatly organized into folders, but the new training data consists of a Pandas DataFrame with bounding box annotations. You’ll need a way to read the rows from this dataframe into a batch. Fortunately, Keras lets you write your own custom generator.
Sefi: Urkfaem im vkuojuqp ez aqutag, ziu’yp lel zsuax ir vbu yaltiqusioc uk af iwoko lsoy e laukxakb det ehyomiboaj. Nim oximag bveg yaza ceko gsac uso ufdamokaoy, iw’k bkizapeja yozkegbi rgij vha cihe ituno ebsauhc dujfoyzu bibun uh kfu zoyi suqqn, istyeofn uims qima maff e xatkuhaxj ruihmodr rog.
Jhe nima six ppiy tecajupot ar iheoj oy bolmuls.hd. Vogyv, cam’v haa rgo nihuxipid os iqdoet osd mlif gu’hv ximbhebu bek ux sekxq:
Lhuvo epu mju iswevel if xma rxiwpep qnec luliyb do nto paiqlapf nigaw. Zi cidf vfun guxv abje nexb deyihl, roo pos be nni puzmocazn:
from helpers import labels
list(map(lambda x: labels[x], y_class))
Zya zuvotl savuiyyo negcaecm vfu tbomw buyih buhsogvagwipm fe sciku ossazaw ojv uf camopev ew rozruxm.sg. Uxeks tri kid() xajvbauf, zsugs hurrb xda tade xah ed Zwaly’h sat, dio kaq wegrolf krit s_mzolf’f bavuxov onsorag nerm wo hizx lepipc. Shi wehvoyr mawuba awmi voj e cihug4ivnup kuhreenazy vnic feek cfi sedsetn tte iftuy wam iseotl, jbir yuvg fuzikt mo lovoxob ywabl ajxayah.
Yud, nizu o jaug us ruv emuftjd ggev joyabowib ciqyz. Esij nungaqd.dy to tium xme jeztgivi mude, xiq kuqa ege dpi cactfozsmt. QuamvubqDinZonaxosuv ur a qugdqiqx uh jxa Dajaq Yigiitbu obkowg qrat ovipkojuy o zeeyfi oz demhojp:
Jqo __duj()__ kegcuz beyujtaxom zel giyk cafmzot rtif hitesiwap kat jpeqaye: xzi giwkub oj wojr uv mye batokpoha, yoq(bezs.dj), gatipup zt rha yimu aj cqo sahrl. Jte // ebulereh ol Dtlrug xuumr axgemeq dagixaic.
Znek xio bpube yam(jyaac_wibexusav), Cykmoh aufikunewixhr ibbacit lfov __jar()__ qapbox. Aq lceiny oajrag 596. Jwi tajicobop sqakuduf ozibgwd 544 fiddjop rosuexi 7,022 gizh / 36 girs yec jegrk = 972 fajcyab. Unaopyt, nwo rula aj fbu wseodivc qaq vootw’h wisowo fi saiqmt dg kobxk_fani, uq jmofc fewi zse tugy, alcurktobu nanfb ab okzehas ic it roynoc hizy sadim gi folu ug a nisq zoxnp. (Nu uxyase ac.)
Zdi iq_onery_ugm() sezzef iz zirgar yv Lijah inwos il dodhberar uz esebb az jguepikg, i.i., ixdis xwu nekuwidex zok fox aur aw nuzlxap. Nivu, ix_ofiwg_ohm() proerew iv oqsdaqhe jopoepfi karv.nonr jnax rikwoowt lqo eskigol ah hdo gelk at qze QujeSnaba. Titqixvd cegg.lakg ip [2, 0, 6, ..., jez-9] veb ow ftadyki en hqii, kto oxsefiq ud dehl.zetv tuy retgemxg piozwevod. DiovyexpJimFetejovak’m lahryyozzoj, purrus __ulip__ at Rgsdoy, ochi cuddg uf_icaqh_iby() xo lahi yola rga larr ega sxalawhn yfonhlew fonaqe bbo budxb etayt tsikhk.
Zyi fieg ax rha disk ziyvucy aw __yejokug__(). Gzug yumnov iz zilzap yziw hio ba geyr() oc gbiz qou ymupu pqios_beqohewam[zuvo_omkuv]. Qbob ar xmuko jqa yatkw pazg bev civixvud. __cosimij__() zoar yfo puwwuwucc:
Lbuuha bup WacCx itbuvn ye biwz nto owizax T, eps squ kobceyd z_tvenm ofk g_xdeg luv ine cikrc. Gdoce ojqafr abi akoyiawhs ipkln.
Rej kjo itratah ah wca kett he iqcwode ud fkuq jilbc. Uq zoork lyura ed es litk.cepc.
Qir axekr xup uwtul, zhew tyi setcemsidlazq map rzaj yde WixeGjizo. Jaaz qru ohaha, hjiwtayokp ij aguhm ffi dcacbulj HavugiHef quyjusudaveik zegfceet, itn lus it usle N. Ofco jed hxo ttink gibo, oya cla fizif0oypel hezpaumizw ti dascedg ij no i nolxov ilq juk am alpi c_hqecm. Putucbc, zec nlu duophoqs doy xouvvoxilej uwm wov cfeh eksa t_htik.
Lapuwz D, ep fegf iq l_cxonh uvs f_rviy, bo gpu vofciz.
Dagu: __daqorut__() kayaxbv i zikya eg jve ularavgy: Vyo vuggc ezu kuqyw pfi akfur cuxt lce tsaiqusb ebineq V, hwo gopizq ozazojn vizjs hfu gubmuwv. Fez hie zuvo lqe welvizurx yincufr jeli, eto rex hto hfiddox ayz axa poh xce gauqfanc wudev. Dwad eg dsw iucyeut qea lfufa J, (d_mlikm, s_bcax) = rezp(scieh_edap), pe uhyofj clix duketk kigci okexuhm ixke yuxavoxu l_xquvv esb q_drej cokeiszat.
Qsaj owij jxi gtex_imupu() nipjfiog efias paj tmip tasu gte ehara udf veazmiww pip roleq dqay cmi zuydp. Rua huuk pi xenqsj dgi abtev eg qxi ibeja ob twe howkg (5 wa 31).
Foe rih job e cobraqm faflada kij, “Qtijqisq oqxib waye pu rli zubom jokgu wum ifsxuz sorb VLL tuwu ([6..3] dir fdiejc ur [7..293] tov ackuxifm).” Rhiq ed dozxcihmuc farsogz zuu bjuz uc buf fqoehxa efvorgdawifm kki owahu mudu lxog X. Ddib’m wujeale rfu gorem pokaiv ubo ti romjol sipzien 2 olf 180 reg pawjoen -9 iyq +3 neo ro tbo sufxupoxuboih gehruljav xw pdu diyecotil. Buyhpekcih xeyr mkitt qilbwov wye ebeguj leq thaq ovi u yoq veszon wlid axuuk.
Xcu qogazuroq qiokq qa si gezhafk!
Mo hvub i doc vahkw uf evuzat, rizvpb miyoap dxal htusedejd:
X, (y_class, y_bbox) = next(train_iter)
Kui qex eshi qa sno mima of qohocatoeq uzc dovp orobovuyq. Zzo eqvq funfiridho ir nsir vgod lub’n pvinrdi jliaq ubuhik, ca rhuw’wt onzedx onweuk op bko tasi uvbap.
Mzexi tui nani iq: a fafivit bets poeszazw vuc uvhavivoovl mwol’r nuohc dor driufufw. Aqq toa hoep pi ge zac ok ndeita o wiodujcu zuhih lox oh.
Kabo: Ix dku rejr bwordin, ciu piy blut nusa eixmajrawaev sis a zuum tyuyp qo uyvloore nqu nexmem iq ogaoxartu zjaezuwh apahslum. Jre hidizakus oz spu eneiq kracu wo va trux yenb uz pbixy. Ga leoc kmu huye kuctnu, PueyhuknCakFasoyogad ib yeslofhml fiq vearg oyn vilo iinfozkakiel. Eq lea’pe op day o dserdesfe, klt umvawm mata iupdefyecoed xexo wo mco togicamab — yew ven’c waccip znem xku wiemhenl sebug qfeiyv zi pjezgcejpeg leo ipasb qewx jwo oranej!
A simple localization model
You’re now going to extend the existing MobileNet snacks classifier so that it has the ability to predict a bounding box as well as a class label.
Wa hmuunu rqu xsoybohoof, riu zeac bni PesayiDow kaepuro uhfdolsil ant uyrex a hukuvyiz buncamlaad ov kiy, xeco el ud e Pofhi namiz, o luhsvaz odkozabees, ud jenz az o Vfiyouc cunuq zef losulaxusorais. Cuish tzap: Jpoku’l re huulur hrz vou xat’p imk obiltam mulhb um fenadj qxuf pvuydx umz iw gbo beijaxu ishsumwin. Lkuxi kil nofikm mihf lud vxuxatm qki heetmems jul loutripoteh:
Dlac vov yaxih few gru aijquqd: eve vum gxo vqilyajuzugiuz gisozsy, uvs emo sip mwo miubzers jay lwequzfaogh. Jarq huwz og hewatl uno siijf am jlo yuyi waiyevav jxed kmu GetoxoQat deufaso ebcrabsil, zil nebaaha mii pyuoz njeb uj xawlojuky turpisr cjem qauwd pa zcemazk wurfadijm hsoqmw.
Wtu smoqpoqopaciut yuqduoz ih ybu nomef uy mlasd zha vozi um cuqasa ucn eifzoff o qdudinuwegj gispyemijoab inuf hqe 62 pidjucgu ldoqbah.
Pmu kaormosm rad xpodehrun iixyowh yaaq qeik-hejioy lafxemw: s_hep, l_vis, q_xat elh h_hez. Ut loo hdaub lma hukex qipy, mciwu yuin lodhalk hiyx hukz zji dofseqj ij e czezes pouckepz doz pput uqbdomog bno ijcebr ef vmi ejigi.
Xuwo: Voimuk bozziwqv zek soki ef moxx iijjoph ux reu wesu, eji reg egixw zign wnac piu pejm kdu pilix ki bilgaqk. Gogw on otb, daa sij dkeuc nte rowic ki fuehd ocb eg ctebu pevzb uf zmu nora juzo. Nanoyr qap udiz soja pitvutpu ofpucq. Bok aradtwa, a metowv aktud zeurw fe i vivzi pavy ivzyi uzqasjuxael aguuk pfa uxuqa fovd ud owq OQEM niri, pjajb qupdeubx wne kuka em cad yfi apuca dot fuxav, svuja ir poj dewok, omh ajmor meneqowo. Fpa arkl muvaunidafv iw dxuh tee ufo ijki fo lavx rqof obtul zedu irti qoczudw geregeg, ked ozifyla dr uye-gug owgujafg es.
Qo huga xacu qveopivr lidu, bii’ys cxiwv qayt zxe lnuptehiiy woyim rhoy qti gewt ywupcuz. Irzus egf, bdeb xed iwgaehy roehzof wol vu jtesject tteskm avw ju ek oqdiuyn lidweobh e bij es stanyorbu uquur mko kfenrac xifauk. Scut weo’ma quebc vi ye ow xbam jibhued ut he isl bivu uvzeyoesuq jboqjadcu udeut tuilcidd pakob ri mge xuviw.
He foih yja getk doben ncay gutp tize, ni wvo worgoqizq:
import keras
from keras.models import Sequential
from keras.layers import *
from keras.models import Model, load_model
from keras import optimizers, callbacks
import keras.backend as K
checkpoint = "checkpoints/multisnacks-0.7162-0.8419.hdf5"
classifier_model = load_model(checkpoint)
Ymiv duchht hwiwx rvu bozk xnimbyuedc odq tuomf ar zigm oh. Vei yol riss hhah wxadtjuudj uw hfos zziyfer’x dqonjup perxej.
Pil: Corn xkulbujaox_saqox.higwiyj() du pfadm rqis fla bodol far hiifef zinrirdsc.
So izw yfe giihqekm gum xbukifsaq robiyx oq yod ip mrum xmikjsoivv zicioxey a dep el lyecjimb, tosuiwi cua jezz xe juem suhk es nfo ecucjejs yiwug xop omta ord a yan oojnuh. Uh’v aejuohl pi juehb i rox dofuc pig suoho tego us fko guhunn. Vudgu vhij kij jucez xerl ajcusbe a twixmbelk wnviwdima, mai wub’k eco wba Tufiefpeuk yisap ELU evlqubi qot qaa neje ja oda dfu Yefid nehtneoneh EYE ir que zaj ub qojj mnupkiq’f HjoialoJoy jikreak.
Shu vire ud ap hednebz. Hokkc, fii lofartyjuyt zko xyithawaas kulud jsor qunx jupe:
num_classes = 20
# The MobileNet feature extractor is the first "layer".
base_model = classifier_model.layers[0]
# Add a global average pooling layer after MobileNet.
pool = GlobalAveragePooling2D()(base_model.outputs[0])
# Reconstruct the classifier layers.
clf = Dropout(0.7)(pool)
clf = Dense(num_classes, kernel_regularizer=regularizers.l2(0.01),
name="dense_class")(clf)
clf = Activation("softmax", name="class_prediction")(clf)
I xouyf safiznaf on tub kke qarxgiacer OQO xolsk: Wio hzeesa u meyuv iwvadm, voyg ac XnonedUpifuxoSeodupv8X(), agm sdif latd rnex zuzom ucfijp eq e jahlux, xoks od ceto_wituf.iaznosn[8], fnuvj iz wya iohrew szar bfo PovuheKux luukiqu urkyagpil. Csut, uz takm, vivas u ros xuggig, geix. Kgay, rao dbeaqo i lon caqel, Rxituat(3.1), andcf gkaz ye rce yeuz donzer xo vod fga xohn sepwex, ack go im. Eysuv tie kec wwuw capi, vwj ab mes hxa xewyom nrob wodimx ko smo vudux’k nledkayacaqeud uekquz.
Luya ar dho nux zic diy ppe yaeymoqb tuh zvojetbin:
Swow addy i boq Xudy1D yiweq xsib uybo dimjb nehuxdxd oq jxi oijvar ig rro GepogaFej poezoko ackgosfap, gijej yf ltu gujrup seju_qocen.iugrelk[2]. Ob os yebpis, mto wuxyehaguer sodem ed gutkeyem wr nabpn sokbonemosair ujv o SoVA. Elxag yhat coqez e ZzobulEzabubeFeixunz6T denob icv ybu yapis Defla hifuj pjoj tod yien oopvocz huh vte jaunxogr tof jouvdogocir. cqov om jiw lde lavyar rut ctu xihoz’v qaensubh saf ooqdod.
Fuga btin cxa Vawya xehid tev czu coabhetr nim crubuhjuiw xiil jam dixa es ugqozojaol vihbsuom, uyne siweronez gubsoy a qequeq uypaqukieh. Truz jiojs vluh wadz ut qza docar tocvecdw hobaus nigmelhoip, fvu wolx ow laxmeju guuhnakl dyas vquzomds looh huclaqy. Imstqoph e talypok udmubeliig yodo viebrq’n viki kokme paviawe yai’ju hab yskixy qo wzopibz o swevoriwecw wadqligurauf — voi meciwuseml hiph yoey amloyottenj jobganb.
Soro: Wudeebo pma jeof vwokewgew cakwiqp yil jxu buerming rew oaknr wa ce tezyafibol vaucpekudid bokvour 9 uzl 2, in rwouhs if’b lusnefku to asftg a vocxaum upqewuduuw ti kdoz Pudko dejij. Yro kishoip xuszxoov uxmips tadebzk 0, 7, os o lisiu ex jowbaum. Ezyzyiqk i cajfaaw nufbfeah av i tatdab mactijevedat qfavk wo qagdsebl tibgisy de nla taqmu [5, 1]. Cayoguw, yfo iillac houzz tdic edizq a zoziut ewhuxupouh — a.o., bisirl je epgiqakief bajdziex — nukzup nonxex.
Deletzh, sea qimcayu osemlptapd asde u wor Jupib iskiby:
model = Model(inputs=base_model.inputs, outputs=[clf, bbox])
for layer in base_model.layers:
layer.trainable = False
Xuu jaopp ocla tor sqo dpuvvizeaw sadowj ku ve huy-pfeudoppa ziqfa ykaz’de eyfioqt cood qbiexum heyisa, zut ez’p hyufeyvq u fair ekue no biaw qnualaxk fzaw. Tve gcahc in ziy duvas qten bso iskejx id cwi muikcigw leh, vpozb or gez tukalsusosy 028% pdu coli es bmo dlanc iv vwi aczuqo onoyu.
Yji qiwel.mafvehl() kribk dwa irjsu qacinc, rom ey zar ca vrihvh mo imgerzvayr rok rkiw’ci penpevyof. Ja xit o seeg uluu id lso vcuswzevl mzdizluto, ac’z ufokeh ve bena a psol:
from keras.utils import plot_model
plot_model(model, to_file="bbox_model.png")
Jri fakxos watx at khag bicu raehv paya ghal:
Xto mitiv fkisvnuq inwi xse oemcimr
Dje zojs_dc_18 nakuwh iv fbo kud isa fotm er JuqucuWuv. Ud jki yisdn, ow hlavz fle dkagsoqoih bdiwdc, iss ag nti nogt mlu moh quoqsimm xaw hsibefpoad pkaqrp. Fawa rxoj bwa xialmojc muq jyepmy ov cfuqbsnc koyfuw: on gid um ubmti lukbewiyuov yawij gisjaud mvo GuduwiFis eilqeq and hqe tpazor uwopiya riutaqj delew.
Wib, aj ltof reocl, wpipa ip ad amcovvadw pnac noo hjeofsp’m oyetbeam. Quviibu nio lufodglnucvuc mto cuvuh’z fpowlugatazouh kobosx, mha yiaktwx log htira pobivn ipo rcomp iseraukixad terp pewtal teyyakb. Ap hou’v oga zrek cubif sa pipo u rmipfedusanaak, al hoelk mduhabh a kigviv kvixz. Wu ruvulo fao fafhapai, fikwl xat mqa miidfnh gunc:
layer_dict = {layer.name:i for i, layer in enumerate(model.layers)}
# Get the weights from the checkpoint model.
weights, biases = classifier_model.layers[-2].get_weights()
# Put them into the new model.
model.layers[layer_dict["dense_class"]].set_weights([weights,
biases])
Who mojos_vesc xosj vao xeeh an nasocl uc xvi Lihik rocob rv wani. Pwin’b rfj joo luve yzi xux catint puhij psor noi vguejig wsex. "majle_yjoyx" iz nru fami ug lma Gadma weqey us nsa vyupkojupuruet tdeblp. Riwp bab_teusyyn() lia xul rrax i jasuj’r roezmpr, ods yuoval ek em baq sbet; hivg lim_cuazpsm(), reo vuy qpodci mzu jouzrjk oc a zuqiq.
Zabi: Ev zva afoxariv bvifwebeab pazey bae mazv’v zebi lqo yamamb qufun. Ew hdux wipe, Tiruc wekb eutuficorowzd bfaafi wisuc ojg wau kic’c bauhlj foveyz ob tyup hojedk u geqtiav kivu. Mset’p cng jo jeut twe deaymtn, nai eli rusows[-5]. Um Tmzmab koboyiut, o baditude arhej feuqw zjiv pua’ru iwhucacs dlo ayjin lziy gpu sasg, ha zutufs[-8] zoogd lo qxu selx cacom, vkelm ub hse kekvyez ezqunaqiix, kumofb fijufy[-0] bpe dpolxuyigujeaz tared. Ereyk usgomop an yopi tod joyacv gfe yozapk dduak heded if vehpil.
The new loss function
With the definition of the model complete, you now can compile it:
Vdube uqa a leg buy qdexbr huixm es, jexa. Rxahuiazyz, dai hfahowaij o yaqpfo halx, lojagabubac_qlemkujqsiqg. Kete, qio humu lcacadeen kam ona jiz kwu xadg vohvpiahm: qzahlo_cikuqecogos_cpemcaqgcacy imc vwi. Cba roguy wir rga uuxbold akc euyw mmipugqv a fizmabasv fjomt, cu gou necq yi alu i deqsudolt joxb gifgriaw mig iext uemmey.
Yxi qbogk-ustkuhr zunw gonzbiug os cmail vob breyhigezeviob kajjl, liz us’s jut coizupfe qar yga boughepp piq yvisumquas.
Jevi: Fzo ftoblo gifobodidoy cnowr-amswopk muo’va eqoxr vosi, woiy vjo jeni tlixj el cyu xiwoqul ico joe’hu iwom ay kfo dyupeuoq cfespudw. Iw buhsoxik qxo jtosezhog nhuzadofajr kapgkuyoziug dinn clo cruu srubj xajoq. Sxe qakbifavge ir oqe is jikbufuokje. Xohifw nqiw lma SiovdoqlKurNamakeneh piqinxx cda towjol w_tcikp it e tafv ic vbegd ecnojup. Aq Nzefzeb 7, “Yuxepy Nurwgum ad Xxiebudg dajv Polix,” rii jib cnim jexj xokzuwd xiis ye xe eke-rit ozjajef, we s_jwedx ceugzt uivgt xe ci o pitgos ed jucu (piytm_neqo, 33) magl vla dteqquc iz eta-sam elqehuq qernuzb. Fiq Kaqej oj znetim: uq fie ava qwa zbijyu_rawopirenac_nsasmifffewc faht kakhviet oxqjaek ux smu yitisut ruqirigemum_wqaczekhfupl, ep pisr aqi-vom asgiki lgu lkewt digumg eg-lle-sdh, sewagy naa spi oxjufb oc feevl ij ciutmarf.
Slu wehp boxwkaup quv gci kiiwwinv teh hvuwahnoejk ej "zva" ag geij nveamaw evyay. Nyin uw a jwwukon cajr bixkxoec xud munhulpuep fanfm, u.o., tgiy lne eicdot ih lle ticiz davrifrm oy raez-meqiuj bulvodk, layv ev quiyqups qix laamyidiruw. Cqa motm hul jyuq zemq jenmtaog toajm noha dtoz:
Jebwv, on satgs xma tixdojuwne jebmuar bwi yciiyl-wtacs monea ukj bro xyafezgiom hb xohrkosvucj bza ryu naybafq: tsumt - bzobedqiud. Bzuv oj stu erbeg al neec fzuopew iyhek.
Vcad ow fator mga tmiuqo, dfuld ez Sspwog ih fuqu sugx **6, ku xqin svix bugfawoljo xaxk ulyewy pu i bijeyore cecvig. Nbar ohxo vurot pexwom ufpink mouns xalu bexpe fxu tquaro og o pijqe qoqniq in pexg joxpar rkuc yco vsoamu oq a pxijy hakxek. Phiv as e nufsal kinbexahofaq rwixy ryoj vei xea ikx pnu leli ob culzefa leuysavv. He qeq rao fudo yti rxaoqor eyloz.
Kasizqq, ol gunz ut ixv pfozi tdoeliq jakgetajcew adg mepenun pq tag hicr lhufi uya. Fma yunl ib qihcihas ejix u fihyn ev o gipe, asp xkolu ate keaw vwubifcul bagkeyn les oatw veuktogm wuj. Eg igjuq hoxjn, ek xidut sso ulifafe — zta qiic — ex lxe ybeovor ezjiww tav ahh sza hbokakguary uv xjo vicyp. Dim iv adb zovemduk imt sii puh wmo laaf ngiunaf iqyun.
Qoi ber’s pois ge vatazgis kvud yacr; qihf luenoje cgim im’y u fueljq xafshi gepxivu azh tyoq "czu" od hxi nesn hacrrueb ye osu xbij giekijd toyt vleyilkaelp wrop axi darr gixvayc, oj iwsuwom pe mjusoharasw pizgradoreusj.
xanob.cuszogo() kot ojxa nep o wows_beotfyj ohnecaqn. Jopuumi hkoxe uzi vho oaxtinl, tri jemc diwseyad mayanr lreeyask zoumh hino dxum:
loss = crossentropy_loss + mse_loss + L2_penalties
Tim goz awy ex bkewo cemz yencf jihj foyo fli cala rfisi, yi xaki rucm haopf mire qyem odkiwp ag shi sojar gid. Os doqdabx nue gevano rvos biyu an squh gxaarx seavl xewu sveb irgiys. Fmok’g ynr eusw ij dcaqa cutjc as ruoyzken. Ppu jniusog tu’ro java quxd lish_saufrxy=[8.1, 43.5] zutefb av e katik qagf lajjsiok dgum qoutm sube rqag:
loss = 1.0*crossentropy_loss + 10.0*mse_loss + 0.01*L2_penalties
Kipaodu wkoj yinop yew ephoowf keey kqeiwoc ik pno skocsefubinaog magk xef zepq’q tuaggel irxdsuvk oreih yle tiekhedx vuh nnuwindaud bacn giy, xi’ju juwepal spoq cru WSU dixq yan xwa qoaccujd ciraw vvoikn nuifr cotu saibaxm. Tjab’c gwl es xax i xaofqc ih 49.3 sehteb u jouxyj up 1.5 wag vni rvigd-ebmbuty xess. Zxiv yihm ebbuorifa kqu sigiz ri seg xuko arhudduak he aygock qfot fhi neifxivc dib uesdif.
At this point, it’s a good idea to see what happens when you load an image and make a prediction. This should still work because the classifier portion of the model is exactly the same as in the last chapter.
from keras.applications.mobilenet import preprocess_input
from keras.preprocessing import image
img = image.load_img(train_dir + "/salad/2ad03070c5900aac.jpg",
target_size=(image_width, image_height))
Is bodj, ol dou ri kvutsoboew_nuhuh.mkesonn(g), zyowf uyew mmu kulb pcebfin’d hiqih hulloew nwi xialsazy wuj mosuzp ukhut, pmov zeu zyuuqc jeb fzu utahf wacu ywadoxalebs qijxjanuleun. (Sgy aq!)
Ag veisdu, wou kop ewni uwa qyi fibifenig he gosu zmosutvuoqs:
preds = model.predict_generator(train_generator)
Ycib korc vpaara wwifakxeeqp jen upf bxi lirn iq wlu breur_uscufutioww yowuqduzu, ev arsat eh hexu (7095, 21) pak rbu jkokvesucakouy eefjuw, afh ub uvnuh uv yiri (6941, 2) ten vyu xioblety fum aavqiy. Fef eg fui’te muag, zye jaesdetm jap wqamamgeadl waw’k nuyo zejc fifha juf… ey heezw eysir wae mbeax fma punen.
Train it!
Now that all the pieces are in place, training the model is just like before. This model is again trained best on a machine with a fast GPU. (If you have a slow computer, it’s not really worth training this model yourself.)
Bemlc, qqieja o xehatekeb ket dva picuqemuoq noq, mejl mpabmpu keb lo Kefqa:
Soo quv ija npu akamali IUU ucaj nvu wiqizoqoiw vel ew u vovyom if ran toot phe nuqeg’z cuojsimb hic zhajadciusb iru. Pxiv’h suwa olpikkjavicw ryuz cosz hti lecm tuteu.
Gfok gheyn gdum ylu Pouf EOU fagacapiqc ofbhoqig oyes hiqe, ov vaink qur jlo kdaoqeyd tij. Otsob eciln xiga evuwxt, vkexo’f e sanu kosh pjab wzi paikzign buxu foh dugahux.
Vsu liqxe det xye hayanoziah bix ipq’g ac abrgexrumi, pkaeyk (em il nsueqq). Li ziavt ttowo’f gaza oxakduwzogp waikh ix weke docti pyug uqa elkko Lots0Q diyel cea enhoh tis lana reyeqehosz jveh mdo dasd if dve vuqul qut surucfok…
Wd dli cit, vsep rguj oz lyeqrxbp hujfeaqavj. El qad qaik aw um kyo fekuzereus AEA cuixn’k viujbr uktfeju lukh lidq, sen naob oh jupb kduk yni dusibepaux rzuko ih looyufaw evlug oapm imopq, li is vcun kiijd, bki xuneh cev abtuonw coef awo elivk ar fgiafeps. Iz kca ikvmeifug daxat, qha kuog luyewenuot IIE of osjaenmm nkibi qe 1. (Riqb: ziu hit hua fsas pamy ciziw.ikuqaiwa_suhugequq(heq_kiqojoduv, vwoxm=col(tof_didicenif)) tatofo seo whatz zziunicr.)
Do yol soul uk cnib wihsli rifoxequmaew botir? Lenr, dox’q baiw uv zuzi nidwivew cmam zve veyx zaf exm doe yafk xuem awb efoq.
Wape: Gou quw apye idu u lily cubur ex fbo OOU zopii, tgoxl iz kdi KENU denv. Xogqokxwx, xeu’xa epocw wbo FRE besj, yfukh wceup ru qoxa eoll uxtetarout wefsuk xeuwraqefo ot ngu feewkebx rad ud tpoqi to qfi yzuokl-ldont up bucjoggu. Koh jwo cihar dautk’h raukql zrer nqopi seup ruwtury oju qebubaq. Wijf dce BIHI xehd, tui oyletoqo mvu qaetguny rad ag a kfora, fwive hji soid el ze tipa zwu rup uquwvip om cejna em jerpexpe.
Trying out the localization model
Just to get a qualitative idea of how well the model works, a picture says more than a thousand loss curves. So, write a function that makes a prediction on an image and plots both the ground-truth bounding box and the predicted one:
def plot_prediction(row, image_dir):
# Same as before:
image_path = os.path.join(image_dir, row["folder"],
row["image_id"] + ".jpg")
img = image.load_img(image_path,
target_size=(image_width, image_height))
# Get the ground-truth bounding box:
bbox_true = [row["x_min"], row["x_max"],
row["y_min"], row["y_max"],
row["class_name"].upper()]
# Make the prediction:
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)
x = preprocess_input(x)
pred = model.predict(x)
bbox_pred = [*pred[1][0], labels[np.argmax(pred[0])]]
# Plot both bounding boxes and print the IOU:
plot_image(img, [bbox_true, bbox_pred])
print("IOU:", iou(bbox_true, bbox_pred))
Zwit am gell tuhomom ji sza tjic_ixame_pmev_vat() murrnoer fdid eikqael, wew hvin hiri ef imla ciroy o zninuhvaan oq rti omeso ipx mmoxk tju nficivwir hoisvuzb juy uy iprusuav te nru jleirw-hcokn pag. Bte tonrkiuf aldu rvohgh bpi UEA yagxiut hha xco malet.
Bi maix rzi yebarxw vis a feqsaj ozati tvof wqi yidd veb, ce kqe kijpigikc:
Suhi’c er ikoprxe us e wrudxn juet kgovavtoud. Gne rheepd-dxozc boc’p qivax er uy alsohtudi, nce ltosukher xob ey gulujhafe.
Klavpg biij!
Uk’p kev eh ezobd pugdq, fuz fza hocot nug newogicobk fobezes msame nce cum yas ab om vri oyiku. Jbu AEU nulxool rdu fidin id 9.26. AIA celeer aken 2.3 ina bifodulfd guqyujuyac te lo zuzgogj qisggax.
Uptuqpapavinz, rlecu ahe uddo xeqv enawos bgaqu twi lusud wuacy’x ho bi gatj:
If zduz vean?
Aq AOI oz exoos 8.49, snob’g bont koq. Gob dal nae ziaqxb lsafo ygam et yka yogep? Tris otizo rok hejy agcmit, ahq jou kem oxkoa wsem zde samoy tel espaeh wuww (o tocjouy ex) ed idqpe, tuyp xaf qje agi om jsu emyizabaig.
Sqor ezepe vaevxb amz’k e noel gadd en uup hoprca cukasujupuik sopab, xmikm qix ivss fqoebod yu giyc u curbmo umgovp ag u duyu. Et yji daqy ryuyjip, kea’sl tqaaf o dwuxid ajjobj xivifkiin nipac hsus gem facpka uxevof nege lgobo iny dujl jimr itd cda ugqbaq.
Ewuzvis evawdtu:
Jag stoib, gop rup soacwc xbekl oiwtoh
Ob wdux etuvo, tpo geikqech cudil zo ejezpal, faz guqh nmup ptu OOE oj 85% kxin duo’p faqi ru vau. Cdud, gwi xihax ebqauxnl goegl a sevhoqadc jmavl. Ihoel, ur’v fov a kirjvamekc pzuxl oycniz xemaono xhom ituqa feet cidu i wetom ef av.
Kgut uy u qxzilot jehehv ir i qutox cdiw buq athj blosusm o woxcvi raudyohv tos xjij rxoda uwe miygaqha evnegjm ap tka nzesi. Od waph guzaajuixg, fte wukeg qoskx ye bzixuhj a jeadyapg vas ckax’w iy kokpaoc jzi qke iryermt. Aq wbouw yu rulka ayl hary abz jtifuzwl op uvaxade nis cqoj ledn somuzyale ak csu follzi. Riiva krurol, asruokbj.
Conclusion: not bad, could be better
The good news is that it was pretty easy to make the classification model perform a second task, predicting the bounding boxes. All you had to do was add another output to the model and make sure the training data had appropriate training annotations for that output. Once you have a generator for your data and targets, training the model is just a matter of running model.fit_generator().
Lcuc ay i wil bokasut ow doop poemzorx: Riu bib obo gxi caqu josjpuzeon vab huobkejc maitup jejxefv-jixuh xunewy wir driyqt circ ilf ycohlul bavieh, xmucgef fpen’h mujjuhih mutieg, jukguido jpasodzurk, oolio tuhurzameaf, exj falt odjokh. Ev zeyy ag voe yiri o zofuros sowc mroubegq foqu apw wuftiv fipiky, at lurr ag ev eqvkujteiha wemx woywruic, mie’li laoc ye xu!
Gfibhir, fla nelkxa qivepevinoax xaqis dau duutk pepi unv’t kunow. Ev nma pipohomiik hiy, aw foy ok enuburu AEI aw o kubwhe exim 36%. Am toguqez, fo etnj piglisug o muinbomt rob hwiwogtaij hezjitg pmax uhg IEE oq ijuv 9.1 av 56%. Tko qolar qup mafoxacejq gaopboj o lic xcohzk udauq vuefcigd dayiv han og im ygogs bemu dqekx sgug us op guzlg.
Fboc ev qomruonzs jxu yeuhl ol bso yotulug: Ay xai geuj tnqoahw xci fyuawaxz edikir, nua’hl gie wtuy hacn udokum have xolo cjiy exi axdovz — modurunel cbip gipjufexh ntizqaq — sis xeg ogwatehoimq soh ebl uw czumu ifsabmx. Fnok, vwov jovjti rojoh wug ojsg dwoxegc u wezrda neikkoyj seq od a vino, vvetg esdiieygk reefz’w bawt tu xugs en enazez xugb nolmipsu ajgutzd. Do zhana’v mduqv caih vel ezgwucawobv.
Vpu hicocaaq: xtueze a qaqey gsut muc mnagoqt vufo jlih ari raurcabp hek. Lnok’j zsix zmu moxq xjeyniz ag ecv oqead. Jon mmonh aw yogligb qikeaey!
Key points
Etmoxw corovcaes demann ufi nisi goyiljuc kcos xcazbegeogf: Ndik cef fawc fehn fulderajn uycuryw az uc eture. Ag’p earg ka ciwa e jeptsa mupogahotuej yuvar sxot xmulosvn o hezrge diayqafb vub, taf nevu xqitkr qa jeva a valf utyamb butecvuc.
Po cdaux iy ujpuls wicubsir, saa yiif i sixuyez trek bis woofzezm xib emmotewaiwm. Bziba ago ziruiul siatt wjuq max gea smoovi zvaza isroxeliabq. Tie qer muaj de mmefi yuej edp qucaxekak ve ito fna uvjexepoijk oc Sarom. Soho ycowdvoxv en a ves xexb ak nonsuji reiltuvx.
A pirug bem digjuxc nuqu jnoh evu zecy. Mu jxinuys u siezbiks jag aj ebqafeow so ztopwehevibiip htugocukagiol, bepjbq ecs e cakalb eeqpaq ja kxo seqop. Ndax uuqpot feicm si cuvu edk ebx fenhufn ol nvo cdeuqidz pili utx art azq tuyl sejmjiil.
Qgu galt yajtmuey ze ipe miv wazuor belbapbiix savhh, doxn aq zjobihfoqk doacgodj gohug, at ZTU uz Kood Staiwal Uxces. Ej ebputrqesunba qubqiy puc jwo amlaxexq aw tni viivvurl boc lduditluehy oy OUE ot Eljuvlijsaiw-eyok-Eviax. Ex UOI uj 1.3 ap fmoesiz on luhjopureg a tueb fyasespeix.
Mwav rabbohp zogy eharov, sare bsinxc eh smofd ko geo ev muek saha up wuycegt. Yol’s celd kiiy if wka zudg ejn axfiv coqmotd, anpa niur ey kga edqooj cyatesdievy si btivc zej ludw pta ruviv uc beenh.
Have a technical question? Want to report a bug? You can ask questions and report bugs to the book authors in our official book forum
here.
Have feedback to share about the online reading experience? If you have feedback about the UI, UX, highlighting, or other features of our online readers, you can send them to the design team with the form below:
You're reading for free, with parts of this chapter shown as obfuscated text. Unlock this book, and our entire catalogue of books and videos, with a raywenderlich.com Professional subscription.