Advanced Convolutional Neural Networks
Written by Matthijs Hollemans
SqueezeNet
What you did in the previous chapter is very similar to what Create ML and Turi Create do when they train models, except the convnet they use is a little more advanced. Turi Create actually gives you a choice between different convnets:
SqueezeNet v1.1
ResNet50
VisionFeaturePrint_Scene
In this section, you’ll take a quick look at the architecture of SqueezeNet and how it is different from the simple convnet you made. ResNet50 is a model that is used a lot in deep learning, but, at over 25 million parameters, it’s on the big side for use on mobile devices and so we’ll pay it no further attention.
We’d love to show you the architecture for VisionFeaturePrint_Scene, but, alas, this model is built into iOS itself and so we don’t know what it actually looks like.
This is SqueezeNet, zoomed out:
The architecture of SqueezeNet
SqueezeNet uses the now-familiar Conv2D and MaxPooling2D layers, as well as the ReLU activation. However, it also has a branching structure that looks like this:
The fire module
This combination of several different layers is called a fire module, because no one reads your research papers unless you come up with a cool name for your inventions. SqueezeNet is simply a whole bunch of these fire modules stacked together.
In SqueezeNet, most of the convolution layers do not use 3×3 windows but windows consisting of a single pixel, also called 1×1 convolution. Such convolution filters only look at a single pixel at a time and not at any of that pixel’s neighbors. The math is just a regular dot product across the channels for that pixel.
Convolutions with a 1×1 kernel size are very common in modern convnets. They’re often used to increase or to decrease the number of channels in a tensor. That’s exactly why SqueezeNet uses them, too.
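To see why a 1×1 convolution is just a per-pixel dot product across the channels, here is a small numpy sketch (the shapes are made up for illustration):

```python
import numpy as np

# Hypothetical shapes: a 4x4 "image" with 8 channels, reduced to
# 2 channels by a 1x1 convolution whose weights have shape (8, 2).
x = np.random.randn(4, 4, 8)
w = np.random.randn(8, 2)

# The 1x1 convolution is a matrix multiply applied to every pixel.
y = x @ w                      # shape (4, 4, 2)

# Same result computed pixel-by-pixel with explicit dot products:
y2 = np.zeros((4, 4, 2))
for i in range(4):
    for j in range(4):
        y2[i, j] = x[i, j] @ w # one dot product per output channel

assert np.allclose(y, y2)
```

Notice that the spatial size stays 4×4 — only the number of channels changes, which is exactly how SqueezeNet uses these layers.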
The squeeze part of the fire module is a 1×1 convolution whose main job is to reduce the number of channels. For example, the very first layer in SqueezeNet is a regular 3×3 convolution with 64 filters. The squeeze layer that follows it reduces this back to 16 filters. What such a layer learns isn’t necessarily to detect patterns in the data, but how to keep only the most important patterns. This forces the model to focus on learning only things that truly matter.
The output from the squeeze convolution branches into two parallel convolutions, one with a 1×1 window size and the other with a 3×3 window. Both convolutions have 64 filters, which is why this is called the expand portion of the fire module, as these layers increase the number of channels again. Afterwards, the output tensors from these two parallel convolution layers are concatenated into one big tensor that has 128 channels.
The squeeze layer from the next fire module then reduces those 128 channels again to 16 channels, and so on. As is usual for convnets, the number of channels gradually increases the further you go into the network, but this pattern of reduce-and-expand repeats several times over.
The reason for using two parallel convolutions on the same data is that using a mix of different transformations potentially lets you extract more interesting information. You see similar ideas in the Inception modules from Google’s famous Inception-v3 model, which combines 1×1, 3×3, and 5×5 convolutions, and even pooling, into the same kind of parallel structure.
The fire module is very effective, evidenced by the fact that SqueezeNet is a powerful model — especially for one that only has 1.2 million learnable parameters. It scores about 67% correct on the snacks dataset, compared to 40% from the basic convnet of the previous section, which has about the same number of parameters.
If you’re curious, you can see a Keras version of SqueezeNet in the notebook SqueezeNet.ipynb in this chapter’s resources. This notebook reproduces the results from Turi Create with Keras. We’re not going to explain that code in detail here since you’ll shortly be using an architecture that gives better results than SqueezeNet. However, feel free to play with this notebook — it’s fast enough to run on your Mac, no GPU needed for this one.
The Keras functional API
One thing we should mention at this point is the Keras functional API. You’ve seen how to make a model using Sequential, but that is limited to linear pipelines that consist of layers in a row. To code SqueezeNet’s branching structures with Keras, you need to specify your model in a slightly different way.
In the file keras_squeezenet/squeezenet.py, there is a function def SqueezeNet(...) that defines the Keras model. It more-or-less does the following:
img_input = Input(shape=input_shape)
x = Conv2D(64, 3, padding='valid')(img_input)
x = Activation('relu')(x)
x = MaxPooling2D(pool_size=(3, 3), strides=(2, 2))(x)
x = fire_module(x, squeeze=16, expand=64)
x = fire_module(x, squeeze=16, expand=64)
x = MaxPooling2D(pool_size=(3, 3), strides=(2, 2))(x)
...
model = Model(img_input, x)
...
return model
Instead of creating a Sequential object and then using model.add(layer), here a layer is created by writing:
x = LayerName(parameters)
Then this new layer object is immediately applied to the output from the previous layer:
x = LayerName(parameters)(x)
Here, x is not a layer object but a tensor object. This syntax may look a little weird, but in Python, you’re allowed to call an object instance (the layer) as if it were a function. This is actually a very handy way to define models of arbitrary complexity.
To create the actual Model object, you need to specify the input tensor as well as the output tensor, which so far is named x:
model = Model(img_input, x)
You can see how the branching structure is made in the fire_module function, shown here in an abbreviated version:
def fire_module(x, squeeze=16, expand=64):
sq = Conv2D(squeeze, 1, padding='valid')(x)
sq = Activation('relu')(sq)
left = Conv2D(expand, 1, padding='valid')(sq)
left = Activation('relu')(left)
right = Conv2D(expand, 3, padding='same')(sq)
right = Activation('relu')(right)
return concatenate([left, right])
This is how it works: x holds the input data, sq gets the output of the squeeze layer, left gets the left branch and right gets the right branch. At the end, left and right are concatenated into a single tensor again. This is where the two branches come back together.
A lot of Keras code you’ll come across uses both Sequential models and models defined with this functional API, so it’s good to be familiar with it.
The final classification model you’ll be training is based on MobileNet. Just like SqueezeNet, this is an architecture that is optimized for use on mobile devices — hence the name.
MobileNet has more learned parameters than SqueezeNet, so it’s slightly bigger but it’s also more capable. With MobileNet as the feature extractor, you should be able to get a model that performs better than the one Turi Create gave you in the chapter “Digging Deeper Into Turi Create.” Plus, you’ll also be using some additional training techniques to make this model learn as well as possible from the dataset.
Either follow along with MobileNet.ipynb from the chapter’s resources, or create a new notebook and import the required packages:
import os
import numpy as np
from keras.preprocessing.image import ImageDataGenerator
from keras.models import Sequential
from keras.layers import *
from keras import optimizers, callbacks
import keras.backend as K
%matplotlib inline
import matplotlib.pyplot as plt
Keras already includes a version of MobileNet, so creating this model is easy.
Keras’s MobileNet was trained on the famous ImageNet dataset. But you only want to use MobileNet as a feature extractor, not as a classifier for the 1,000 ImageNet categories. That’s why you need to specify include_top=False when creating the model. This has Keras leave off the classification layers.
You can use base_model.summary() to see a list of all the layers in this model, or run the following code to save a diagram of the model to a PNG file (this requires the pydot package to be installed):
from keras.utils import plot_model
plot_model(base_model, to_file="mobilenet.png")
If you look at the architecture diagram of MobileNet in that PNG file, you’ll see that it is made up of the following repeating structure:
MobileNet uses depthwise separable convolutions
First, there is a so-called DepthwiseConv2D layer with kernel size 3×3, followed by a BatchNormalization layer, and a ReLU activation. Then there is a Conv2D layer with kernel size 1×1, which is also followed by its own BatchNormalization and ReLU. MobileNet consists of 13 of these building blocks stacked together.
There are a few new things going on here:
A depthwise convolution is a variation of convolution wherein each filter only looks at a single input channel. With a regular convolution, the filters always combine their dot products over all the input channels. But a depthwise convolution keeps the input channels separate. Because it doesn’t combine the input channels, depthwise convolution is simpler and faster than Conv2D and uses fewer model parameters.
The combination of a 3×3 DepthwiseConv2D followed by a 1×1 Conv2D is called a depthwise separable convolution. You can think of it as a 3×3 Conv2D layer that has been split into two smaller layers: the depthwise convolution filters the data, while the 1×1 convolution — also known as a pointwise convolution — combines the filtered data into new features. This pair of layers is an approximation of a “real” 3×3 Conv2D but is much faster: there are fewer parameters to learn and so it also performs fewer computations. This is why MobileNet is so suitable for mobile devices.
The batch normalization layer, BatchNormalization, is what makes it possible to train these very deep networks. This layer helps to keep the data “sane” as it flows between the layers. Without batch normalization, the values in the tensors could eventually blow up or vanish as you train — which is the problem of exploding or vanishing gradients — and then the model won’t be able to learn anything anymore. You’ll see BatchNormalization in pretty much all modern convnets.
Note: Depending on your version of Keras, you may also see ZeroPadding2D layers before the DepthwiseConv2D layers, which just add zeros around the input tensor so that the convolution works correctly for pixels at the edges. Another small detail: The activation function used is actually ReLU6, a variation of the ReLU activation you’ve seen before. It works in the same way as ReLU but also clamps the output of the activation when values get large — it limits the output to 6.0, hence the name — which allows for the use of certain limited-precision computations on mobile and embedded devices.
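The parameter savings from splitting a convolution into depthwise and pointwise parts are easy to count. A quick back-of-the-envelope sketch in plain Python (bias terms and batch norm parameters ignored for simplicity, and the channel counts are just an example):

```python
def conv_params(k, c_in, c_out):
    # Regular convolution: one k x k x c_in filter per output channel.
    return k * k * c_in * c_out

def separable_params(k, c_in, c_out):
    # Depthwise part: one k x k filter per input channel.
    # Pointwise part: a 1x1 convolution mapping c_in to c_out channels.
    return k * k * c_in + c_in * c_out

# Example: a 3x3 convolution over 256 channels producing 256 channels.
regular = conv_params(3, 256, 256)         # 589,824 weights
separable = separable_params(3, 256, 256)  # 67,840 weights
print(regular, separable, regular / separable)
```

For this example the separable version needs roughly 8.7 times fewer weights, which is where MobileNet gets its speed.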
Looking at the model.summary(), you may have noticed that MobileNet does not use any pooling layers, yet the spatial dimensions of the image tensors do become smaller over time — from 224×224 at the beginning to only 7×7 at the end.
MobileNet achieves this downsampling by setting the stride of some of the Conv2D and DepthwiseConv2D layers to 2 instead of 1.
The stride is the size of the steps the convolution window takes as it slides through the image. Usually, this step size is 1 and the convolution looks at all the pixels.
With a stride of 2, the window skips every other pixel, thereby only computing new properties for half the pixels in both the width and height dimensions. This is why you don’t need a special pooling layer to make the image smaller.
The final layer of this model outputs a tensor of size (7, 7, 1024). This tensor contains the features that MobileNet has extracted from the input image. You’re simply going to add a logistic regression on top of these extracted features, exactly like you’ve done before.
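As a sketch of how the strided layers shrink the image: with “same” padding, each stride-2 layer divides the spatial size by two, rounding up. Five such layers take a 224×224 input down to 7×7:

```python
import math

def output_size(input_size, stride):
    # With "same" padding, a strided convolution divides the spatial
    # size by the stride, rounding up for odd sizes.
    return math.ceil(input_size / stride)

size = 224
for _ in range(5):              # five stride-2 layers, as an illustration
    size = output_size(size, 2)
print(size)                     # 224 -> 112 -> 56 -> 28 -> 14 -> 7
```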
Note: MobileNet has more learned parameters than SqueezeNet, so it takes up more space in your app bundle and also more RAM at runtime. However, thanks to those additional parameters, MobileNet produces better quality results than SqueezeNet. Even better, it’s also faster than SqueezeNet thanks to the depthwise separable convolutions.
Choosing a feature extractor is always a trade-off between quality, storage size and compute speed. If MobileNet is too large for your app — it adds a fair number of megabytes to your app bundle — then SqueezeNet might be a better choice. But the predictions of a SqueezeNet-based model will be worse and also less stable.
The VisionFeaturePrint_Scene model that is built into iOS 12 is even more capable than MobileNet, and it doesn’t even take up any space in your app bundle, but again it is slower. And you can’t use it on iOS 11 or older platforms.
Which model is “best” depends on what you care most about: speed, download size or results. As they say, there is no free lunch in machine learning.
Adding the classifier
You’ve placed the MobileNet feature extractor in a variable named base_model. You’ll now create a second model for the classifier, to go on top of that base model:
top_model = Sequential()
top_model.add(base_model)
top_model.add(GlobalAveragePooling2D())
top_model.add(Dense(num_classes))
top_model.add(Activation("softmax"))
This should look familiar by now: it’s a logistic regression.
Just like before, it has a Dense layer followed by a softmax activation at the end.
The GlobalAveragePooling2D layer shrinks the 7×7×1024 output tensor from MobileNet to a vector of 1024 elements, by taking the average of each individual 7×7 feature map.
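What GlobalAveragePooling2D computes can be sketched in a couple of lines of numpy, using the tensor shape described above:

```python
import numpy as np

features = np.random.randn(7, 7, 1024)  # stand-in for MobileNet's output

# Global average pooling: average each 7x7 feature map down to a
# single number, leaving one value per channel.
pooled = features.mean(axis=(0, 1))
print(pooled.shape)                     # (1024,)
```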
Note: If you had used Flatten instead of global pooling, the Dense layer would have needed 49 times more parameters. This simple change would add almost another one million parameters to the model. Not only does global pooling give you a smaller model than using Flatten, it also works better because too many parameters in the model is a main cause of overfitting.
Note: You used a Dense layer for the logistic regression, but modern convnets often have a 1×1 Conv2D layer at the end instead. If you do the math, you’ll see that a 1×1 convolution that follows a global pooling layer is equivalent to a Dense or fully-connected layer. These are two different ways to express the same operation. However, this is only true after a global pooling layer, when the image is reduced to just a single pixel. Anywhere else, a 1×1 convolution is not the same as a Dense layer.
Next up, you need to freeze all the MobileNet layers:
for layer in base_model.layers:
layer.trainable = False
You’re not going to be training the MobileNet feature extractor. It was already trained on the large ImageNet dataset, just like SqueezeNet was. All you want to train is the logistic regression that you’ve placed on top. That is why it’s important to set the layers from the feature extractor to be not trainable. For, you guessed it, this kind of training is transfer learning in action.
When you run top_model.summary(), look at the totals at the bottom of the output.
The number of trainable parameters is only 20,500 since that’s how many the Dense layer has. The other 3.2 million parameters are from MobileNet and will not be trained.
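That trainable parameter count is easy to verify by hand, assuming the 20 snack categories on top of MobileNet’s 1,024-element feature vector:

```python
num_features = 1024   # length of the pooled MobileNet feature vector
num_classes = 20      # the snacks dataset has 20 categories

# A Dense layer has one weight per (input, output) pair, plus one
# bias value per output.
trainable = num_features * num_classes + num_classes
print(trainable)      # 20500
```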
Note that the first “layer” of this new model is MobileNet itself, so if you ask top_model to make a prediction on an image, it will first pass the image through base_model and then through the final logistic regression layers.
Before you start training this model, let’s first talk about a handy trick that can make your training set much bigger with almost no effort on your part.
Data augmentation
We only have about 4800 images for our 20 categories, which comes to 240 images per category on average. That’s not bad, but these deep learning models work better with more data. More, more, more! Gathering more training images takes a lot of time and effort — therefore, is costly — and is not always a realistic option. However, you can always artificially expand the training set by transforming the images that you do have.
Here’s a typical training image:
The original image
Notice how it’s leaning to the left? One easy way to instantly double the number of training images is to horizontally flip them, so that the model also learns to detect objects that lean to the right. There are many more of these transformations, such as rotating the image, shifting it by a random amount, zooming in or out, changing the colors slightly, etc. It’s smart to include only the transformations that you want your model to be robust against.
The flipped image
This is what we call data augmentation: You augment the training data through small random transformations. This happens on-the-fly during training. Every time Keras loads an image from the training set, it automatically applies this data augmentation to the image. For this, you have to use an ImageDataGenerator object.
You’ve already used ImageDataGenerator on previous occasions, where it was only responsible for loading the images and normalizing them.
Here, you tell the ImageDataGenerator that it should also rotate the images, flip them horizontally, shift the images up/down/sideways, zoom in/out, shear, and change the color channels by random amounts. That’s a lot of different transformations, and you don’t want to be overzealous and make the images unrecognizable, but doing this really helps to grow the amount of available training data.
For normalizing the image data, you previously used your own function, but here you use the preprocess_input function from the Keras MobileNet module because that works exactly how MobileNet expects the input data.
Note: MobileNet’s preprocess_input() actually does the exact same thing you’ve done in the previous chapters: divide the pixel values by 127.5 and subtract 1, so that the new values are in the range [-1, 1]. However, not all models use this particular method of preprocessing. Another common way to normalize images is to use the mean and standard deviation of all the pixel values in the training set. If you’re using a pretrained model, make sure to use the correct preprocessing for that model, or it will produce incorrect predictions.
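The normalization described in this note is a one-liner. A numpy sketch (not Keras’s actual implementation, but it computes the same thing):

```python
import numpy as np

def preprocess(pixels):
    # Map pixel values from [0, 255] into [-1, 1], as MobileNet expects.
    return pixels / 127.5 - 1.0

image = np.array([0.0, 127.5, 255.0])
print(preprocess(image))   # [-1.  0.  1.]
```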
For the validation and test sets, you create a plain ImageDataGenerator object that does not apply any of the data augmentations. You always want to evaluate the performance of the model on the exact same set of images.
Given these generator objects, you can now make the generators that will read the images from their respective folders. This works just like before, using flow_from_directory().
Training this model is no different than what you’ve done before: you can run model.fit_generator() a few times until you’re happy with the validation accuracy.
But before you tell Keras to train this model for real, allow us to introduce a very handy Keras feature: callbacks. A callback is a Python function that is called at various points in the training process, for example when a new epoch begins or an epoch has just finished.
You’ve seen that if you train for too long, the model will eventually start to overfit and the validation accuracy becomes worse.
It’s hard to tell in advance exactly when that will happen, and it often turns out that the last epoch isn’t necessarily the best one. Ideally, you’d stop training just before overfitting starts to happen.
For this, you can add an EarlyStopping callback that will halt the training once the "val_acc" metric, the validation accuracy, stops improving. The patience argument is the number of epochs with no improvement after which the training will be stopped.
It’s also smart to save a model checkpoint every so often. This is a copy of the model’s weights as they have been learned up to that point.
For this, you’d use the ModelCheckpoint callback. It saves a copy of the trained model whenever the metric you’re interested in has improved. Here, you’re monitoring "val_acc", so every time the validation accuracy goes up, a new model checkpoint is saved.
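The logic these two callbacks implement can be sketched in plain Python. This is a simplified illustration of the idea, not the actual Keras code:

```python
def should_stop(val_accs, patience=10):
    # EarlyStopping: stop once the best score is more than `patience`
    # epochs in the past.
    best_epoch = max(range(len(val_accs)), key=lambda i: val_accs[i])
    return len(val_accs) - 1 - best_epoch >= patience

def checkpoint_epochs(val_accs):
    # ModelCheckpoint: save whenever the metric improves on the best
    # value seen so far.
    saved, best = [], float("-inf")
    for epoch, acc in enumerate(val_accs):
        if acc > best:
            best = acc
            saved.append(epoch)
    return saved

accs = [0.50, 0.62, 0.60, 0.69, 0.70, 0.70, 0.69]
print(checkpoint_epochs(accs))        # [0, 1, 3, 4]
print(should_stop(accs, patience=2))  # True
```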
Remember that tools such as Turi Create and Create ML split training into two phases. First, they extract the features from all the training images. This can take a while. But once they have these feature vectors, training the logistic regression is fast.
By doing the feature extraction only once, Turi and Create ML can save a lot of time in the training phase. It is also possible to do this with Keras, see the SqueezeNet notebook for details. But because you’re doing a lot of data augmentation, it’s not really worth the trouble.
Caching the feature extraction is a shortcut that only makes sense if you plan to reuse the exact same images on every epoch. But with random data augmentation — where images are rotated, flipped and distorted in some other ways — no two images are ever the same. And so all the feature vectors will be different for every epoch.
That’s why this MobileNet-based model is trained end-to-end and not in two separate phases. On every epoch, Keras needs to compute all the feature vectors again because all the training images are now slightly different than last time. It’s a bit slower, but that’s a small price to pay for getting what is, in effect, a much bigger training set with very little effort.
After training for a while, the validation accuracy starts leveling off. Here is the code for plotting the accuracy again (same as in the last chapter):
def combine_histories():
history = {
"loss": [],
"val_loss": [],
"acc": [],
"val_acc": []
}
for h in histories:
for k in history.keys():
history[k] += h.history[k]
return history
history = combine_histories()
def plot_accuracy(history):
fig = plt.figure(figsize=(10, 6))
plt.plot(history["acc"])
plt.plot(history["val_acc"])
plt.xlabel("Epoch")
plt.ylabel("Accuracy")
plt.legend(["Train", "Validation"])
plt.show()
plot_accuracy(history)
At some point, Keras prints a message that looks like this:
Epoch 00010: val_acc did not improve from 0.70262
Because of the EarlyStopping callback, if there are more than a handful of such epochs in a row, Keras will stop the training. But, at this point, you’ve only trained for a small number of epochs, so that callback hasn’t kicked in yet. The other callback, ModelCheckpoint, did do its job and saved a new version of the model whenever the validation accuracy improved:
Epoch 00009: val_acc improved from 0.69215 to 0.70262, saving model to
checkpoints/multisnacks-1.0450-0.7026.hdf5
The two numbers in the filename, 1.0450 and 0.7026 respectively, are the validation loss and accuracy. After only nine epochs, this model already gets up to 70% accuracy. Sweet! That’s a lot better than your previous models and also improves on Turi’s results already. But you’re not done yet…
Fine-tuning the feature extractor
At this point, it’s a good idea to start fine-tuning the feature extractor. So far, you’ve been using the pre-trained MobileNet as the feature extractor. This was trained on the ImageNet dataset, which contains a large variety of photos from 1,000 different kinds of objects.
The pretrained MobileNet knows a lot about the kinds of objects that appear in photos, including photos of food items. That is why you’ve trained a classifier on top of MobileNet’s output: so that it can transfer that general knowledge about photos to your own 20 categories of snacks.
But the pretrained feature extractor contains a lot of irrelevant knowledge, too, about animals, vehicles and all sorts of other things that are not snacks. We don’t need that knowledge for our task of classifying snacks.
With fine-tuning, you can adjust the knowledge inside the feature extractor to make it more relevant to your own data. Now the feature extractor gets to learn what really matters for your specific task.
To fine-tune the MobileNet layers, first set them to be trainable and then compile the model again:
for layer in base_model.layers:
layer.trainable = True
top_model.compile(loss="categorical_crossentropy",
optimizer=optimizers.Adam(lr=1e-4),
metrics=["accuracy"])
It’s important to use a lower learning rate now, lr=1e-4. That’s because you don’t want to completely throw away everything the MobileNet layers have learned already — you only want to tweak those values a little.
It’s normal to cut down the learning rate when you get to this point, or you might end up destroying existing knowledge. The author found 1e-4 by experimenting a bit.
Run top_model.summary() and you’ll see that there are now over 3 million trainable parameters instead of just 20,500. There are still some non-trainable parameters; these are used by the BatchNormalization layers to keep track of internal state.
Simply run the cell with top_model.fit_generator() again to start fine-tuning. The loss may bounce around a bit at first because suddenly the optimizer has a lot more work to do.
Training also picks up from where it left off, even though you compiled the model again. But you should see the training and validation accuracy start to improve quite quickly again. If not, lower the learning rate.
Note: Training is noticeably slower now because this time Keras needs to train all the layers, not just the Dense layer. On the author’s iMac, the estimated time for a single epoch went up considerably. On the Linux machine with the GPU, the time per epoch also increased — but nearly not as bad. It’s also possible that you will get an out-of-memory error at this point. There are more parameters to update and so the GPU needs more RAM. If this happens, make the batch size smaller and run the cells that create the generators again.
After a number of epochs, the validation loss and accuracy no longer appear to improve. When that happens, it’s wise to reduce the learning rate; here, you can make it three times smaller.
Now, train again for five or so epochs. For the author, the validation accuracy improved a little further, even though it had stopped improving earlier.
When the learning rate is too large, the optimizer may not be able to home in on a good solution. This is why you start with a large-ish learning rate, to quickly get in the neighborhood of a good solution, and then make the learning rate smaller over time, in order to get as close to that solution as you can.
You can repeat this process of lowering the learning rate and training for a few more epochs several more times, until the loss and accuracy are no longer noticeably improving.
Tip: Keras also has a LearningRateScheduler callback that can automatically reduce the learning rate, which is especially useful for training sessions with hundreds of epochs that you don’t want to babysit. The ReduceLROnPlateau callback will automatically lower the learning rate when the validation accuracy or loss has stopped improving. Very handy!
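The schedule you’ve been applying by hand — cut the learning rate whenever the validation metric plateaus — can be sketched in a few lines of Python. This is a simplified illustration of the idea behind ReduceLROnPlateau, not the Keras implementation:

```python
def plateau_schedule(val_losses, lr=1e-4, factor=1/3, patience=2):
    # Multiply the learning rate by `factor` whenever the validation
    # loss has not improved for `patience` epochs in a row.
    best, waited, lrs = float("inf"), 0, []
    for loss in val_losses:
        if loss < best:
            best, waited = loss, 0
        else:
            waited += 1
            if waited >= patience:
                lr *= factor
                waited = 0
        lrs.append(lr)
    return lrs

losses = [1.2, 1.0, 1.01, 1.02, 0.99, 0.995, 1.0]
print(plateau_schedule(losses))
```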
The final accuracy on the test set comes out noticeably higher than before. That’s a lot better than the SqueezeNet model from Turi Create. There are two reasons for this: 1) MobileNet is more powerful than SqueezeNet; and 2) Turi Create does not use data augmentation. However, this is still not as good as the model from Create ML, but that one used a VisionFeaturePrint feature extractor that is more powerful than MobileNet. As we said before, it’s all about finding a reasonable compromise between accuracy, speed and size.
Note: Notice that in the first few epochs, the validation loss and accuracy are actually a bit better than the training loss and accuracy. This is not unusual, especially with a relatively small validation set. It can also happen when you have a Dropout layer, which is only active for the training set but not for predictions on the validation set. You’ll learn about dropout in the next section.
Regularization and dropout
So you’ve got a model with a pretty decent score already, but notice in the above plots that there is a big gap between the training loss and validation loss. Also, the training accuracy keeps increasing — reaching almost 100% — while the validation accuracy flattens out and stops improving.
That doesn’t necessarily mean that the model is overfitting. The training accuracy is almost always a little better than the validation accuracy because it’s always easier for the model to make good predictions on the training images than on images it has never seen before.
However, it is only a bad thing when the validation loss or accuracy becomes worse over time. That doesn’t appear to be happening here… while the validation score isn’t as good as the training score, it doesn’t actually become worse — it just flattens out.
Still, it would be better if the validation metrics were closer to the training metrics. You can do this by adding regularization to the model. This makes it harder for the model to get too attached to the training images. Regularization is very useful, but bear in mind that it isn’t some magic trick that makes your validation score suddenly a lot better — it actually hurts the accuracy and makes the training phase a bit slower.
There are different methods for regularization, but what they all have in common is that they make learning more difficult. This discourages the model from learning unimportant details, which can cause overfitting, and forces it to focus only on what is truly important.
You’ll use the following kinds of regularization:
Batch normalization
Dropout
L2 penalty
The MobileNet portion of the model already has a BatchNormalization layer after every convolution layer. These batch norm layers act as a type of regularizer. The main purpose of batch normalization is to make sure that the data that flows between the layers stays healthy.
The normalizations it performs introduce a small amount of noise, or random variations of the data, into the network. This noise prevents the model from memorizing specific image details. Regularization is not the main purpose of batch normalization, but it’s a nice side benefit.
You will add the other two types of regularization to the logistic regression portion of the model. Create this new classifier model:
from keras import regularizers
top_model = Sequential()
top_model.add(base_model)
top_model.add(GlobalAveragePooling2D())
top_model.add(Dropout(0.5)) # this line is new
top_model.add(Dense(num_classes,
kernel_regularizer=regularizers.l2(0.001))) # new
top_model.add(Activation("softmax"))
There are only two new things here: a Dropout layer after the global pooling layer and the Dense layer now has a kernel regularizer.
Dropout is a special kind of layer that randomly removes elements from the tensor by setting them to zero. It works on the 1,024-element feature vector that is the output from the global pooling layer. Since you used 0.5 as the dropout percentage, Dropout will randomly set half of the feature vector’s elements to zero. This makes it harder for the model to memorize things, because, at any given time, half of its input data is randomly missing — and it’s a different half for each training image.
Randomly removing elements from the feature vector may seem like a strange thing to do, but it keeps the neural network from becoming lazy. The connections from the Dense layer cannot rely too much on any single feature, since that feature might drop out of the network at random. Using dropout is a great technique to stop the neural network from simply memorizing specific training examples.
Aurélien Géron, in Hands-On Machine Learning with Scikit-Learn & TensorFlow, compares this to a workplace where, on any given day, some percentage of the people won’t show up for work. With such a workplace, everyone will have to be able to do several tasks and must cooperate with more co-workers. This makes the company more resilient and less dependent on any single worker.
The dropout rate is a hyperparameter, so you can tweak how much of it is applied. 0.5 is a good default choice. To disable dropout, simply set the rate to zero.
Note: Dropout is only active during training. It is always disabled at inference time. We wouldn’t want half of our predictions to randomly disappear!
The other form of regularization you're using is an L2 penalty on the Dense layer. You've already read about this in the chapter, "Digging Deeper Into Turi Create." When you use a weight regularizer, as Keras puts it, the weights for that layer are added to the loss term. L2 means that it actually adds the square of the weights to the loss term, so that large weights count as extra heavy.
Since it's the optimizer's job to make the loss as small as possible, it is now encouraged to keep the weights small, too, because large weights result in a large loss value. This prevents situations where some features get really large weights, making them seem more important than features with very small weights. Thanks to the L2 penalty, the weights are more balanced, reducing the chance of overfitting.
The value used here is a hyperparameter called weight decay. It lets you tweak how important the L2 penalty is in the loss function. If this value is too large, then the L2 penalty overshadows the rest of the loss terms and the model will have a hard time learning anything. If it's too small, then the L2 penalty doesn't really have any effect.
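To make the effect of the weight decay multiplier concrete, here is a plain-NumPy sketch of how an L2 penalty enters the loss. This is a simplified stand-in for what Keras does internally, and all the numeric values are made up for illustration:

```python
import numpy as np

# Made-up cross-entropy loss and layer weights
cross_entropy = 0.8
weights = np.array([0.5, -1.2, 0.3])

weight_decay = 0.001  # the L2 hyperparameter

# L2 regularization adds the scaled sum of squared weights to the loss
l2_penalty = weight_decay * np.sum(weights ** 2)
total_loss = cross_entropy + l2_penalty

print(total_loss)  # slightly larger than the cross-entropy alone
```

Because the penalty grows with the square of each weight, the optimizer can only lower the total loss by keeping the weights small.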
Ziz, bie qak jijbepa ndug dez pepat ukeaf ukh yhiun ul. Veji juru ja qomgr hpaey i ros ezejns xakn vba VomipeSuj yaroyd xniyeb, izy fpoq pow vqoahonki = Wgou ta cute-qile. Ust kow’z nuczix vo jahaunizexxn hikof xco buabxuvn suzo! Ygad cea hmon rba tivz robcom, lue’dc zudawo ltul msa piwumoyaip qopz dug qzewf tigj pqihiq qi yti djeepaqy link.
Raji: Xocn ej H4 zahajtn, kzu uhigoil soyp tur vu zazv loxbel xqag qta ibderjax hd.fix(lem_fqotnat). Bseg er guh xe zjvuyhe, kekuayi ot igxr rno Y2-lenb am wni tuexjgz ti cho zuvz ut yuqh. Pfadmotx uog yitd i meyx kikx woxoa uw ehaekqn wu dbecgem, ef miqn uc is qaus dubh yalilc zxiujosw. Ac cru wojm yiigc’l hi nocl, cjo tugqk xluxl co pdg ux ehikm a jabaq foutnent xoyu. Soxa rwad dko jajabehiip cagh doog qaz uclzoge zkik ixcri W5 qewb.
Tune those hyperparameters
You’ve seen three different hyperparameters now:
the learning rate
the dropout probability
the weight decay factor for L2 regularization
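One way to organize a search over these three hyperparameters is to enumerate a grid of candidate values with the standard library. The candidate values below are made up for illustration; every combination would mean one full training run, scored on the validation set:

```python
from itertools import product

# Hypothetical candidate values for each hyperparameter
learning_rates = [1e-2, 1e-3, 1e-4]
dropout_rates = [0.25, 0.5]
weight_decays = [0.0, 0.001, 0.01]

# A grid search simply tries every combination
grid = list(product(learning_rates, dropout_rates, weight_decays))
print(len(grid))  # 3 * 2 * 3 = 18 training runs
```

In practice you would loop over `grid`, train a model with each combination, and keep the one with the best validation accuracy. A random search samples from these ranges instead of trying every combination.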
Choosing appropriate values for these settings — which is hyperparameter tuning — is essential for getting the training process to work optimally.
The way most people do hyperparameter tuning is just by trying things and then seeing how the validation loss is affected. If you have a lot of hyperparameters, this can be a time-consuming job. There are ways to automate this, by using a grid search or a random search, which will try all possible combinations of the hyperparameters.
Ay’q zism uzqagtalk lsiy moe ofo syo natoziqauk sap zot tuponk cni xwqaycusulusutn, lam lxo yyeoyehw few ol wpo mucr fis. Vqi cefw sec rbeocj asqd lo iqey ci kasewr wez rixm baoz nelup fohib hefrn, pif kux isjuxusucqb hibq vcu kyxoryeduvoxilf.
Fcitu aj e vejz laes buufir fub swup: Stan bio dgiec qda nlfabyurogemadc noyin ux kyo nadodagaop vekumxj, mkuoy sni howoh gecz mho sip sepcihjn, xxuel jfa yhvokqokovapajl ofaor, ucz xo af… pfin wiu’ti ugyujopytz zjooroyt sbe yaden iq sma giyulaxien nug, xia.
Xeo’zo dix vayeorqg embguazdilb mqu wzaosogk syuruqq xr semecq lwatvej tiros ir kco vohicemuex sebiyyp. Ak e saq, fqa agoyus jbiv tbu videbotiis zub uka “houxomc” exwe ggo qtaunayf sfucarz. Pmiz’s AL cufye jpir’p pjup sbi rocecugiuh qip un giq. Toc goe xep’j wojg xdok ye dokkir re laox fatn yir, endetwoho ax ceb bi viltig meupb u quizowfed yesfico ow zin sozd xoiv hahat cumomisopej ey azecah el kun kohil been tugetu — zebioxa oqpikikyrk uy huxf toka aljuipm qoiq xcuca arohuw.
Koo wal ciuw tbuagigf sfuje gdbuxwahazenicg xo svuiepo u kux noma tidhuxlavti uuf un fno zayip, xom, oq ceji caukc, que bixe zi cewf ij zaey uyiavs. Rzu oivtic xuh che fazr galiwmf lefp i ycawaik mini ov 4.5 ehb e sainln pamam iw 6.02. Lbuk gilex qwavit 15% eq kju hajw dak, nzikh el ufeew a lek hegtawpuna biewpv rabjad ggof fusovo.
How good is the model really?
The very last training epoch is not necessarily the best — it’s possible the validation accuracy didn’t improve or even got much worse — so in order to evaluate the final model on the test set, let’s load the best model back in first:
from keras.models import load_model
best_model = load_model(checkpoint_dir +
    "multisnacks-0.7162-0.8419.hdf5")
Note: The multisnacks-0.7162-0.8419.hdf5 file is included in this chapter's resources. If you were unable to train the model on your own computer, feel free to use that version.
Evaluating this model on the test set prints the loss and the accuracy; the second number is the accuracy. The Turi Create SqueezeNet model scored lower, so we're doing quite a bit better, here, and we're in the neighborhood of Create ML's score.
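The accuracy is simply the fraction of test images whose predicted label matches the true label. With label arrays like the `predicted_labels` and `target_labels` used later in this section (toy stand-ins below), that is:

```python
import numpy as np

# Toy stand-ins for the real label arrays from the test set
target_labels = np.array([0, 1, 2, 2, 1, 0])
predicted_labels = np.array([0, 1, 2, 1, 1, 0])

# Accuracy is the fraction of predictions that match the true label
accuracy = np.mean(predicted_labels == target_labels)
print(accuracy)  # 5 out of 6 correct
```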
For example, instead of taking only one prediction for each test image, you could also do it for the mirrored image and for a cropped version. Then the final score is the average of these predictions. The more different variations of the test image you use, the better the average score will be.
This trick is often used in competitions to squeeze a few extra points out of the model's performance. Of course, making multiple predictions per image is also slower and therefore not really feasible for mobile apps.
Precision, recall, F1-score
It’s also useful to make a precision-recall report:
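All three numbers in such a report can be derived from a confusion matrix like the `conf` array used below. A small sketch with a made-up 3-class matrix, where rows are the true classes and columns the predictions:

```python
import numpy as np

# Made-up confusion matrix: conf[i, j] counts images of true class i
# that were predicted as class j
conf = np.array([[8, 1, 1],
                 [2, 6, 2],
                 [0, 1, 9]])

true_positives = np.diag(conf)
precision = true_positives / conf.sum(axis=0)  # column sums: all predictions per class
recall = true_positives / conf.sum(axis=1)     # row sums: all true images per class
f1 = 2 * precision * recall / (precision + recall)

print(precision, recall, f1)
```

The F1-score is the harmonic mean of precision and recall, so it is only high when both are high.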
Precision means: how many of the images that were classified as class X really are X? For example, the precision for hot dog is pretty good. Most of the time when the model thinks something is a hot dog, it really is a hot dog.
Precision is worse for pineapple, which means the model found a lot of objects it thinks are pineapple that really aren't. You can see this in the confusion matrix in the column for pineapple. When you add up the numbers in that column, you get the total number of pineapple predictions, of which only the ones on the diagonal are correct. Almost one out of four images that the model thinks is a pineapple, actually isn't a pineapple. Oops, there's room for improvement there!
# Get the class index for pineapple
idx = test_generator.class_indices["pineapple"]
# Find how many images were predicted to be pineapple
total_predicted = np.sum(predicted_labels == idx)
# Find how many images really are pineapple (true positives)
correct = conf[idx, idx]
# The precision is then the true positives divided by
# the true + false positives
precision = correct / total_predicted
print(precision)
This should print the same precision as in the report. As you can tell from the math, the more false positives there are, i.e. images the model thinks belong to class X but that aren't, the lower the precision.
Recall means: how many of the images of class X did the model find? This is in some ways the opposite of precision.
Recall for banana is good, so the images that contained bananas were often correctly found by the model. The recall for ice cream is quite low, so a sizable fraction of the ice cream images were classified as something else. To verify this in Python:
# Get the class index for ice cream
idx = test_generator.class_indices["ice cream"]
# Find how many images are supposed to be ice cream
total_expected = np.sum(target_labels == idx)
# How many ice cream images did we find?
correct = conf[idx, idx]
# The recall is then the true positives divided by
# the true positives + false negatives
recall = correct / total_expected
print(recall)
This should print the recall value from the report. The more false negatives there are, i.e., things that are wrongly predicted to not be class X, the lower the recall for X.
The classification report also includes the F1-score. This is a combination of precision and recall and is useful if you want to get an average of the two.
The classes with the highest F1-scores are in good shape; you can safely say that the classifier works very well for those categories. The class with the lowest F1-score is cake. If you wanted to improve this classifier, the first thing you might want to do is find more and better training images for the cake category.
Note: It's quite useful to be able to write a bit of Python code. Often you'll need to write small code snippets like the above to take a closer look at the predictions. Get comfortable with Python if you're interested in training your own models!
What are the worst predictions?
The confusion matrix and precision-recall report can already give hints about things you can do to improve the model. There are other useful things you can do. You’ve already seen that the cake category is the worst overall. It can also be enlightening to look at images that were predicted wrongly but that have very high confidence scores. These are the “most wrong” predictions. Why is the model so confident, yet so wrong about these images?
For example, you can use the following code to find the images that the model got the most wrong about. It uses some advanced NumPy indexing:
# Find for which images the predicted class is wrong
wrong_images = np.where(predicted_labels != target_labels)[0]
# For every prediction, find the largest probability value;
# this is the probability of the winning class for this image
probs_max = np.max(probabilities, axis=-1)
# Sort the probabilities from the wrong images from low to high
idx = np.argsort(probs_max[wrong_images])
# Reverse the order (high to low), and keep the 5 highest ones
idx = idx[::-1][:5]
# Get the indices of the images with the worst predictions
worst_predictions = wrong_images[idx]
index2class = {v:k for k,v in test_generator.class_indices.items()}
for i in worst_predictions:
    print("%s was predicted as '%s' %.4f" % (
        test_generator.filenames[i],
        index2class[predicted_labels[i]],
        probs_max[i]
    ))
This will output:
strawberry/09d140146c09b309.jpg was predicted as 'salad' 0.9999
apple/671292276d92cee4.jpg was predicted as 'pineapple' 0.9907
muffin/3b25998aac3f7ab4.jpg was predicted as 'cake' 0.9899
pineapple/0eebf86343d79a23.jpg was predicted as 'banana' 0.9897
cake/bc41ce28fc883cd5.jpg was predicted as 'waffle' 0.9885
It can also be instructive to actually look at these images:
from keras.preprocessing import image
img = image.load_img(test_data_dir +
    test_generator.filenames[worst_predictions[0]])
plt.imshow(img)
Now, it's not hard to see why the model got confused, here. You could make a good case that this image is labeled wrong in the test set, or at least is very misleading:
The worst prediction... or is it?
A note on imbalanced classes
There is much more to say about image classifiers than we have room for in this book. One topic that comes up a lot is how to deal with imbalanced data.
Consider a binary classifier that needs to distinguish between healthy tissue (negative) and bad tissue (positive) in X-ray images, where most X-rays will not show any disease at all. That's a good thing for the patients involved, but it also makes a tricky job for the classifier. If the disease occurs in only a small percentage of the patients, the classifier could simply always predict "disease not present" and it would be correct most of the time. Yet such a classifier is also pretty useless... A high percentage correct sounds impressive, but it's not always good enough.
So let's say you want to train a classifier that can distinguish between the following classes: cat, dog, neither cat or dog. In order to train such a classifier, you'll obviously need pictures of cats and dogs, but also pictures of things that are not cats and dogs. This last category will be much larger because it needs to cover a wide variety of objects, and the classifier will need to sort all of these into the "not cat or dog" category. The risk here is that the classifier will mostly learn about this large catch-all category, and not about the cat and dog categories, which have many fewer images.
There are various techniques you can use to deal with class imbalance, such as oversampling where you use the images from the smaller categories more often, undersampling where you use fewer images from the bigger categories, or putting weights on the classes so that the smaller category has a greater effect on the loss.
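The class-weighting idea can be sketched with inverse-frequency weights, which you could then pass to Keras's fit method via its class_weight argument. The image counts below are made up for illustration:

```python
import numpy as np

# Made-up number of training images per class
class_counts = np.array([500, 100, 25])

# Inverse-frequency weights: rare classes get larger weights.
# Normalized so that the average weight is 1.
weights = class_counts.sum() / (len(class_counts) * class_counts)
class_weight = dict(enumerate(weights))
print(class_weight)
```

With these weights, a mistake on an image from the rarest class contributes much more to the loss than a mistake on the most common class.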
Turi Create and Create ML currently have no options for this, so if you need to build a classifier on an imbalanced dataset, Keras is a better choice.
This ends our discussion of how to train image classifiers. Next up, you'll learn how to convert the trained Keras model to a Core ML model that you can use in your iOS and macOS apps.
Converting to Core ML
When you write model.save("name.h5") or use the ModelCheckpoint callback, Keras saves the model in its own format, HDF5. In order to use this model from Core ML, you have to convert it to a .mlmodel file first. For this, you’ll need to use the coremltools Python package.
Your Python environment may already have coremltools installed. Just in case you need to install it by hand, type this into a command line prompt:
pip install -U coremltools
You can enter the following commands into the Jupyter notebook you've been working in, MobileNet.ipynb. This chapter's resources also include a separate Python script, convert-to-coreml.py, that first loads the model from the best checkpoint and then does the conversion. Using a separate script makes it easy to add the model conversion step to a build script or CI (Continuous Integration) system.
First, import the package:
import coremltools
You may get some warning messages at this point about incompatible versions of Keras and TensorFlow. These change quicker than coremltools can keep up with but, usually, these warnings are not a problem. (If you get an error during conversion, you may need to downgrade your Keras install to the last supported version.)
Since this is a classifier model, coremltools wants to know what the class labels are. It's important that these are in the same order as in train_generator.class_indices:
input_names tells the converter what the inputs should be named in the .mlmodel file. Since this is an image classifier, it makes sense to use the name "image". This is also the name that's used by Xcode when it automatically generates the Swift code for your Core ML model.
image_input_names tells the converter that the input named "image" should be treated as an image. This is what lets you pass a CVPixelBuffer object to the Core ML model. If you leave out this option, the input is expected to be an MLMultiArray object, which is not as easy to work with.
output_names and predicted_feature_name are the names of the two outputs. The first one is "labelProbability" and contains a dictionary that maps the predicted probabilities to the names of the classes. The second one is "label" and is a string that contains the class label of the best prediction. These are also the names that Turi Create uses.
You can also supply metadata, which can be helpful for the users of your model, especially the descriptions of the inputs and outputs:
coreml_model.author = "Your Name Here"
coreml_model.license = "Public Domain"
coreml_model.short_description = "Image classifier for 20 different types of snacks"
coreml_model.input_description["image"] = "Input image"
coreml_model.output_description["labelProbability"] = "Prediction probabilities"
coreml_model.output_description["label"] = "Class label of top prediction"
At this point, it's useful to write print(coreml_model) to make sure that everything is correct. The input should be of type imageType, not multiArrayType, and there should be two outputs: one a dictionaryType and the other a stringType.
Finally, save the model to an .mlmodel file:
coreml_model.save("MultiSnacks.mlmodel")
If you weren't on your Mac already, then download this .mlmodel file to your Mac.
Double-click the file to open it in Xcode:
Your very own Core ML model
Put it in the app and try it out!
Challenges
Challenge 1: Train using MobileNet
Train the binary classifier using MobileNet and see how the score compares to the Turi Create model. The easiest way to do this is to copy all the images for the healthy categories into a folder called healthy and all the unhealthy images into a folder called unhealthy. (Or maybe you could train a “foods I don’t like” vs. “foods I like” classifier.)
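The folder shuffling for this challenge can be scripted with the standard library. A sketch under stated assumptions: the class groupings and paths below are made up, and the demo runs on a throwaway temporary directory with empty placeholder files:

```python
import shutil
import tempfile
from pathlib import Path

# Hypothetical split of the snack classes into two groups
healthy = ["apple", "banana", "carrot"]
unhealthy = ["cake", "candy", "waffle"]

def make_binary_dataset(src: Path, dst: Path) -> None:
    """Copy per-class image folders into healthy/ and unhealthy/ folders."""
    for group, classes in [("healthy", healthy), ("unhealthy", unhealthy)]:
        for name in classes:
            for img in (src / name).glob("*.jpg"):
                target = dst / group
                target.mkdir(parents=True, exist_ok=True)
                # Prefix with the class name to avoid filename collisions
                shutil.copy(img, target / f"{name}-{img.name}")

# Demo on a throwaway directory structure with empty placeholder files
with tempfile.TemporaryDirectory() as tmp:
    src, dst = Path(tmp) / "train", Path(tmp) / "binary"
    for name in healthy + unhealthy:
        (src / name).mkdir(parents=True)
        (src / name / "0.jpg").touch()
    make_binary_dataset(src, dst)
    result = sorted(p.name for p in (dst / "healthy").iterdir())

print(result)  # ['apple-0.jpg', 'banana-0.jpg', 'carrot-0.jpg']
```

Point the function at your real snacks folders to produce the two-class dataset for training.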
Note: For a binary classifier, you can keep using softmax and the loss function "categorical_crossentropy", which gives you two output values, one for each category. Alternatively, you can choose to have just a single output value, in which case the final activation should not be softmax but Activation("sigmoid"), the logistic sigmoid. The corresponding loss function is "binary_crossentropy". If you feel up to a challenge, try using this sigmoid + binary cross-entropy for the classifier. The class_mode for the ImageDataGenerator should then be "binary" instead of "categorical".
Challenge 2: Add more layers
Try adding more layers to the top model. For example, you could add an extra Conv2D layer.
Tip: To add a Conv2D layer after the GlobalAveragePooling2D layer, you have to add a Reshape layer in between, because global pooling turns the tensor into a vector, while Conv2D layers want a tensor with three dimensions.
Feel free to experiment with the arrangement of layers in this top model. In general, adding more layers will make the classifier more powerful, but also more likely to overfit. Keep an eye on the number of learnable parameters!
Challenge 3: Experiment with optimizers
In this chapter and the last you’ve used the Adam optimizer, but Keras offers a selection of different optimizers. Adam generally gives good results and is fast, but you may want to play with some of the other optimizers, such as RMSprop and SGD. You’ll need to experiment with what learning rates work well for these optimizers.
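The difference between these optimizers comes down to how they turn gradients into weight updates. Here is a simplified NumPy sketch of one update step for SGD versus RMSprop; this is not the actual Keras implementation, and the gradient values are made up:

```python
import numpy as np

grad = np.array([0.5, -0.1])   # made-up gradient for two weights
w_sgd = np.array([1.0, 1.0])
w_rms = np.array([1.0, 1.0])
lr = 0.01

# Plain SGD: step directly along the gradient
w_sgd -= lr * grad

# RMSprop: divide each step by a running average of squared gradients,
# so weights with consistently large gradients take smaller steps
avg_sq = np.zeros(2)
rho, eps = 0.9, 1e-7
avg_sq = rho * avg_sq + (1 - rho) * grad ** 2
w_rms -= lr * grad / (np.sqrt(avg_sq) + eps)

print(w_sgd, w_rms)
```

Because of this per-weight scaling, a learning rate that works well for SGD is usually too large for RMSprop or Adam, which is why each optimizer needs its own tuning.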
Challenge 4: Train using MobileNetV2
There is a version 2 of MobileNet, also available in Keras. MobileNet V2 is smaller and more powerful than V1. Just like ResNet50, it uses so-called residual connections, an advanced way to connect different layers together. Try training the classifier using MobileNetV2 from the keras.applications.mobilenetv2 module.
Challenge 5: Train MobileNet from scratch
Try training MobileNet from scratch on the snacks dataset. You’ve seen that transfer learning and fine-tuning works very well, but only because MobileNet has been pre-trained on a large dataset of millions of photos. To create an “empty” MobileNet, use weights=None instead of weights="imagenet". You’ll find that it’s actually quite difficult to train a large neural network from scratch on such a small dataset. See whether you can get this model to learn anything, and, if so, what sort of accuracy it achieves on the test set.
Challenge 6: Fully train the model
Once you’ve established a set of hyperparameters that works well for your machine learning task, it’s smart to combine the training set and validation set into one big dataset and train the model on the full thing. You don’t really need the validation set anymore at this point — you already know that this combination of hyperparameters will work well — and so you might as well train on these images too. After all, every extra bit of training data helps! Try it out and see how well the model scores on the test set now. (Of course, you still shouldn’t train on the test data.)
Key points
MobileNet uses depthwise convolutions because they're less expensive than regular convolution, ideal for running models on mobile devices. Instead of pooling layers, MobileNet uses convolutions with a stride of 2.
Training a large neural network on a small dataset is almost impossible. It's smarter to do transfer learning with a pre-trained model, but even then you will need to use data augmentation to artificially enlarge your training set. It's also a good idea to adapt the feature extractor to your own data by fine-tuning it.
Regularization helps to learn simple, reliable models. Besides increasing the amount of training data, you can use batch normalization, dropout and an L2 penalty to stop the model from memorizing specific training examples. The larger the number of learnable parameters in the model, the more important regularization becomes.
You can use Keras callbacks to do automated learning rate annealing, save model checkpoints, and many other handy tasks.
Try your model on the test set to see how good it really is. Use a confusion matrix and a precision-recall report to see where the model makes mistakes. Look at the images that it gets very wrong to see if they are really mistakes, or if your dataset needs improvement.
Use coremltools to convert your Keras model to Core ML.
Have a technical question? Want to report a bug? You can ask questions and report bugs to the book authors in our official book forum.