Getting creative with autoencoders

Tijmen Tieleman, CEO & CTO at minds.ai
WiSSAP 2019, Trivandrum


Overview

• This is a fairly general talk
  • Application-agnostic
  • General ideas, rather than recent research

• This is a fairly specialized talk
  • It's about Deep Learning
  • More specifically: representation learning
  • More specifically: unsupervised representation learning
  • More specifically: autoencoders

Autoencoders

• Training objective: reconstruct the input
• Components: encoder and decoder
• Usual purpose: representation learning
• Generative view: variational autoencoders
  • Components: encoder and decoder → generator

• Flexible
  • Get creative with encoder and/or generator
  • Interpretable codes can be arranged

• Code units are only partially hidden units
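To make the encoder/decoder split concrete, here is a minimal PyTorch sketch of a plain autoencoder trained to reconstruct its input. The layer sizes, the MSE loss, and the optimizer are illustrative assumptions, not choices prescribed by the talk.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, input_dim=784, code_dim=32):
        super().__init__()
        # Encoder: input -> code
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, code_dim),
        )
        # Decoder (the "generator" in the generative view): code -> reconstruction
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 256), nn.ReLU(),
            nn.Linear(256, input_dim),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

model = Autoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(64, 784)                          # stand-in batch of flattened images
recon, code = model(x)
loss = nn.functional.mse_loss(recon, x)          # training objective: reconstruct the input
loss.backward()
opt.step()
```

Every specialized autoencoder in the rest of the talk is a variation on this template: change the encoder, change the generator, or change what happens to `code` in between.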

Clustering

• “k-means”
• k-means as an (odd) autoencoder (sketched below)
  • Encoder (not an NN): assign each point to its nearest centroid
  • Generator (constant for each cluster): the centroid itself

• Implicit assumption: clusters are spherical (squared-error reconstruction)
• More powerful generator: non-trivial covariance
  • A different implicit assumption about cluster shape
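Read this way, k-means is an autoencoder whose encoder is not a neural network (nearest-centroid assignment) and whose generator outputs a constant per cluster (the centroid). A minimal sketch of that view, with the centroids as the only parameters:

```python
import torch

def kmeans_autoencode(x, centroids):
    """k-means viewed as an (odd) autoencoder."""
    dists = torch.cdist(x, centroids)    # (N, K) point-to-centroid distances
    code = dists.argmin(dim=1)           # encoder (not an NN): nearest-centroid assignment
    recon = centroids[code]              # generator: a constant output per cluster
    return code, recon

x = torch.randn(100, 2)
centroids = torch.randn(5, 2)
code, recon = kmeans_autoencode(x, centroids)
recon_error = ((x - recon) ** 2).sum()   # this is exactly the k-means objective
```

The squared reconstruction error is where the implicit assumption lives; a generator that also models a non-trivial per-cluster covariance would encode a different assumption about cluster shape.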

Clustering autoencoder

[Figure: clustering autoencoder diagrams: a naïve version, a version with free code units, and a version with free code units and discretization]
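The slide contrasts a naïve clustering autoencoder with one whose code combines free (continuous) units and a discretized cluster assignment. One plausible way to keep such a discretization trainable end-to-end is a straight-through one-hot on the cluster logits; the sketch below is an assumption about the construction, not the exact architecture from the talk.

```python
import torch
import torch.nn.functional as F

def discretize_cluster(logits):
    """Straight-through one-hot: the forward pass uses the hard assignment,
    the backward pass uses the softmax gradient."""
    soft = F.softmax(logits, dim=-1)
    hard = F.one_hot(soft.argmax(dim=-1), logits.shape[-1]).float()
    return hard + soft - soft.detach()

# Hypothetical encoder outputs: a few free code units plus cluster logits.
free_code = torch.randn(8, 4, requires_grad=True)
cluster_logits = torch.randn(8, 25, requires_grad=True)
code = torch.cat([free_code, discretize_cluster(cluster_logits)], dim=-1)
# `code` feeds the generator; gradients still reach cluster_logits.
```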

Intermission: a few minutes for questions about the past few slides.

Getting creative with autoencoders

• Clustering autoencoder
• Ignoring autoencoder
• Binarizing autoencoder
• Biased autoencoder
• Computer graphics autoencoder
• Independent features autoencoder
• Etc.
• Feel free to combine

Ignoring autoencoder

• Goal: a code that ignores some information
  • E.g. ignoring speaker information
  • E.g. ignoring the spoken text

• Application: focusing on essentials
• Application: domain adaptation
• Application: anti-discrimination
• Application: reconstructing with different information
• Method 1: provide that information to the decoder (sketched below)
• Method 2: insist that the to-be-hidden information cannot be reconstructed from the code
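A minimal sketch of method 1, using speaker identity as the example nuisance variable: the speaker id is handed to the decoder directly, so the code has no incentive to carry it. The sizes, the embedding, and `num_speakers` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class IgnoringAutoencoder(nn.Module):
    """Method 1: give the to-be-ignored information (here a speaker id)
    to the decoder, so the code need not encode it."""
    def __init__(self, input_dim=128, code_dim=16, num_speakers=100, spk_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU(),
                                     nn.Linear(64, code_dim))
        self.speaker_emb = nn.Embedding(num_speakers, spk_dim)
        self.decoder = nn.Sequential(nn.Linear(code_dim + spk_dim, 64), nn.ReLU(),
                                     nn.Linear(64, input_dim))

    def forward(self, x, speaker_id):
        code = self.encoder(x)                   # should end up free of speaker information
        decoder_input = torch.cat([code, self.speaker_emb(speaker_id)], dim=-1)
        return self.decoder(decoder_input), code
```

One common way to realize method 2 (not shown) is an auxiliary predictor that tries to recover the speaker from the code, with the encoder trained adversarially to make that prediction fail.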

Binarizing autoencoder

• Goal: a binary code
• Application: “semantic hashing”
• Application: extreme compression (lossy)
• Method: use discretization noise on a bank of logistic code units
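One standard realization of “discretization noise on a bank of logistic code units” (as in semantic hashing) is to add noise to the pre-sigmoid activations during training, which drives the units toward saturation at 0 or 1; at test time the code is simply thresholded into bits. The noise scale below is an illustrative assumption.

```python
import torch
import torch.nn as nn

class BinarizingCode(nn.Module):
    """A bank of logistic code units pushed toward 0/1 by noise during training."""
    def __init__(self, noise_std=4.0):
        super().__init__()
        self.noise_std = noise_std               # illustrative value

    def forward(self, pre_activations):
        if self.training:
            # Noise before the sigmoid rewards saturated (nearly binary) units,
            # so the reconstruction survives hard binarization later.
            noise = self.noise_std * torch.randn_like(pre_activations)
            return torch.sigmoid(pre_activations + noise)
        # Test time: actual bits, usable for hashing or extreme lossy compression.
        return (torch.sigmoid(pre_activations) > 0.5).float()
```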

Biased autoencoder

• Goal: codes that tend to be of certain kinds
  • E.g. a softmax with 2 commonly used units and 10 rarely used ones
  • E.g. binarizing and preferring sparsity

• Method: simply push in the objective (add a term that favours the preferred kind of code)
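A minimal sketch of such a push, using an L1 sparsity term on the code; the penalty form and weight are illustrative assumptions.

```python
import torch

def biased_loss(recon, x, code, sparsity_weight=0.01):
    """Reconstruction loss plus a term that pushes the code
    toward a preferred kind -- here, sparsity via an L1 penalty."""
    recon_loss = torch.nn.functional.mse_loss(recon, x)
    push = code.abs().mean()                     # prefer mostly-zero codes
    return recon_loss + sparsity_weight * push
```

For the softmax example, the added term could instead be a KL divergence between the average unit usage and a target distribution that favours the two “common” units.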

Intermission: a few minutes for questions about the past few slides.

Computer graphics autoencoder

• More generally: a domain-specific generator
  • E.g. an audio synthesis autoencoder

• Goal: highly interpretable codes
• Goal: help the encoder understand the nature of images
• Application: reverse rendering, for “image parsing” or as preparation for generating image variants
• Application: using the better encoder as a good featurizer
• Method: replace the generator

Computer graphics as generator

• Problem: it has to be differentiable
• Solution 1: implement a reduced CG system in TensorFlow
  • Weakness: “reduced” is unfortunate
  • Nice: it can contain learnables

• Solution 2: learn a second NN to approximate a real CG system; backpropagate through the approximation (sketched below)
  • Weakness: approximations are a pain
  • Nice: it has the potential for full CG
• The codes still have to be continuous
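A sketch of solution 2: a second network is fitted to imitate a renderer from (parameters, image) pairs, and the imitation is then used as the differentiable generator. `real_renderer` here is a hypothetical stand-in for an external, non-differentiable CG system, and all sizes are illustrative.

```python
import torch
import torch.nn as nn

_W = torch.randn(10, 784)                        # arbitrary fixed "parameters -> pixels" map

def real_renderer(params):
    """Hypothetical stand-in for a non-differentiable external CG system."""
    with torch.no_grad():
        return torch.sigmoid(params @ _W)

# The surrogate: a differentiable NN trained to imitate the renderer.
surrogate = nn.Sequential(nn.Linear(10, 256), nn.ReLU(),
                          nn.Linear(256, 784), nn.Sigmoid())
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

for _ in range(1000):                            # fit on sampled (parameters, image) pairs
    params = torch.randn(64, 10)
    loss = nn.functional.mse_loss(surrogate(params), real_renderer(params))
    opt.zero_grad(); loss.backward(); opt.step()

# The trained `surrogate` can then serve as the autoencoder's generator, so the
# reconstruction loss can be backpropagated through it into the encoder's codes.
```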

PoC: graphics for MNIST

• Uses “capsules”
• Learned constant: small image templates
• Learned function: free code unit values → affine transformations for the templates
• Hard-coded function: apply the affine transformations to the templates
• The affine-transformed templates are added together to create the final output
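A rough PyTorch sketch of this generator: learned templates, a learned map from code values to affine parameters, and a hard-coded warp-and-sum. The capsule count, template size, and exact parameterization are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CapsuleGraphicsGenerator(nn.Module):
    def __init__(self, num_capsules=16, code_dim=4, template_size=11, image_size=28):
        super().__init__()
        # Learned constants: one small image template per capsule.
        self.templates = nn.Parameter(torch.rand(num_capsules, 1, template_size, template_size))
        # Learned function: free code unit values -> affine parameters (2x3) per capsule.
        self.to_affine = nn.Linear(code_dim, 6)
        nn.init.zeros_(self.to_affine.weight)            # start every capsule at the identity
        with torch.no_grad():
            self.to_affine.bias.copy_(torch.tensor([1., 0., 0., 0., 1., 0.]))
        self.num_capsules = num_capsules
        self.image_size = image_size

    def forward(self, codes):                            # codes: (batch, num_capsules, code_dim)
        batch = codes.shape[0]
        theta = self.to_affine(codes).view(batch * self.num_capsules, 2, 3)
        templates = self.templates.repeat(batch, 1, 1, 1)
        # Hard-coded function: apply the affine transformations to the templates.
        grid = F.affine_grid(theta, [batch * self.num_capsules, 1,
                                     self.image_size, self.image_size], align_corners=False)
        warped = F.grid_sample(templates, grid, align_corners=False)
        # Sum the transformed templates into the final output image.
        return warped.view(batch, self.num_capsules, self.image_size, self.image_size).sum(dim=1)

gen = CapsuleGraphicsGenerator()
images = gen(torch.randn(32, 16, 4))                     # -> (32, 28, 28)
```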

Clustering MNIST

• k-means clustering wouldn't work well: the generator is too primitive
• We need a more powerful generator
  • It has to understand that a shifted image is still about the same image
• Let's use this computer graphics generator

• Unsupervised clustering into 25 clusters
• Good classification accuracy with just 25 labeled cases

Independent features autoencoder

• Goal: code components that are somewhat independently interpretable
• Application: identifying the patterns in the data
• Application: generating data variants with a higher chance of meaningful results
• Method: apply dropout to the code components (sketched below)
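The method is literally dropout applied to the code layer rather than to an ordinary hidden layer: with components randomly dropped, each one has to be useful on its own, which pushes them toward independent interpretability. A minimal sketch; the dropout rate and sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class IndependentFeaturesAutoencoder(nn.Module):
    def __init__(self, input_dim=784, code_dim=8, p_drop=0.5):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 256), nn.ReLU(),
                                     nn.Linear(256, code_dim))
        self.code_dropout = nn.Dropout(p_drop)   # dropout on the *code* components
        self.decoder = nn.Sequential(nn.Linear(code_dim, 256), nn.ReLU(),
                                     nn.Linear(256, input_dim))

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(self.code_dropout(code)), code
```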

Independent features

• This focuses on one image (see the central column)
• This shows data variants obtained by varying one feature at a time
• The 8 features are clearly somewhat independent
• The variants are meaningful

[Figure: grid of image variants; horizontal axis: feature value, vertical axis: feature index]

Tools for crafting specialized autoencoders

• Impose some structure on the code
  • E.g. clustering autoencoder, computer graphics autoencoder

• Apply deterministic editing to the code
  • E.g. ignoring autoencoder

• Apply random editing to the code
  • E.g. binarizing autoencoder, clustering autoencoder, independent features autoencoder

• Add a term to the objective function that acts on the code
  • E.g. sparse binarizing autoencoder

• Hard-code (part of) the generator
  • E.g. computer graphics autoencoder

Conclusion

• Creating custom specialized autoencoders is doable
• There is a collection of basic tools for this
• This can improve interpretability and learning power

• Next stage for DL: combining learnable and hard-coded components
  • Learnable components provide “intuition”
  • Hard-coded components provide the power of computation
  • Either alone is quite limited
  • AlphaGo has hard-coded as main() and learnable as a subroutine
  • The CG autoencoder has learnable as main() and hard-coded as a subroutine

Thank you