Question 1 (Single choice question) What does a neuron compute? ()
A. A neuron computes an activation function followed by a linear function (z = Wx + b)
B. A neuron computes a linear function (z = Wx + b) followed by an activation function
C. A neuron computes a function g that scales the input x linearly (Wx + b)
D. A neuron computes the mean of all features before applying the output to an activation function

Question 2 (Single choice question) Which of these is Logistic loss? ()
A. MSE loss  B. NCE loss  C. Focal loss  D. Cross-Entropy loss

Question 3 (Single choice question) Which of these is the basic unit in NN? ()
A. neuron  B. layer  C. parameter  D. Module

Question 4 (Single choice question) Which of these is NOT an activation function? ()
A. ReLU  B. Softmax  C. Tanh  D. Sigmoid

Question 5 (Single choice question) Consider the two following random arrays "a" and "b": a = np.random.randn(4, 3); b = np.random.randn(3, 2); c = a * b. What will be the shape of "c"? ()
A. (4, 3)  B. (4, 4)  C. (4, 2)  D. (3, 3)
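As a quick check of Question 5 (a sketch in plain NumPy): "a * b" is element-wise multiplication, which requires broadcast-compatible shapes, while matrix multiplication is a separate operation:

    import numpy as np

    a = np.random.randn(4, 3)
    b = np.random.randn(3, 2)

    # Element-wise multiplication tries to broadcast (4, 3) against (3, 2);
    # these shapes are not compatible, so NumPy raises an error.
    try:
        c = a * b
    except ValueError as e:
        print("a * b fails:", e)

    # Matrix multiplication is the operation that yields a (4, 2) result.
    print(np.dot(a, b).shape)  # (4, 2)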
Question 6 (Single choice question) Which of these is the abbreviation of multi-layer perceptron? ()
A. MLP  B. RNN  C. NLP  D. BTP

Question 7 (Single choice question) What's the difference between supervised learning and unsupervised learning? ()
A. Supervised learning has labels, unsupervised learning doesn't have labels.
B. Supervised learning needs to be trained, unsupervised learning doesn't need to be trained.
C. Supervised learning: a desired output is known and used to compute the error signal. Unsupervised learning: no such output is known.

Question 8 (Single choice question) Which of these is the basic supervised algorithm of neural networks? ()
A. forward computing  B. backpropagation  C. gradient computing

Question 9 (Single choice question) Which of these is the formula of Square loss? ()
A B C D (formula options shown as images in the original)

Question 10 (Single choice question) Which of these is the formula of Absolute value loss? ()
A B C D (formula options shown as images in the original)

Question 11 (Single choice question) Which of these is not a reason for Deep Learning recently taking off? ()
A. We have access to a lot more computational power.
B. Neural Networks are a brand new field.
C. We have access to a lot more data.
D. Deep learning has resulted in significant improvements in important applications such as online advertising, speech recognition, and image recognition.

Question 12 (Single choice question) When an experienced deep learning engineer works on a new problem, they can usually use insight from previous problems to train a good model on the first try, without needing to iterate multiple times through different models. ()
A. True  B. False

Question 13 (Single choice question) Which of these is the formula of Cross-Entropy loss? ()
A B C D (formula options shown as images in the original)

Question 14 (Single choice question) You are building a binary classifier for recognizing cucumbers (y = 1) vs. watermelons (y = 0). Which one of these activation functions would you recommend using for the output layer? ()
A. ReLU  B. Leaky ReLU  C. sigmoid  D. Tanh

Question 15 (Single choice question) Which of these is the formula of Hinge loss? ()
A B C D (formula options shown as images in the original)
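The formula options for Questions 9, 10, 13 and 15 are images in the source and did not survive extraction. For reference, the standard definitions of these losses are (my own rendering, not necessarily the exact options shown):

    \text{Square loss: } L(y, \hat{y}) = (y - \hat{y})^2
    \text{Absolute value loss: } L(y, \hat{y}) = |y - \hat{y}|
    \text{Cross-entropy loss: } L(y, \hat{y}) = -\left[ y \log \hat{y} + (1 - y) \log(1 - \hat{y}) \right]
    \text{Hinge loss: } L(y, f(x)) = \max(0,\ 1 - y f(x)), \quad y \in \{-1, +1\}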
Question 16 (Single choice question) Suppose you have built a neural network. You decide to initialize the weights and biases to be zero. Which of the following statements is True? ()
A. Each neuron in the first hidden layer will perform the same computation. So even after multiple iterations of gradient descent each neuron in the layer will be computing the same thing as other neurons.
B. Each neuron in the first hidden layer will perform the same computation in the first iteration. But after one iteration of gradient descent they will learn to compute different things because we have "broken symmetry".
C. Each neuron in the first hidden layer will compute the same thing, but neurons in different layers will compute different things, thus we have accomplished "symmetry breaking" as described in lecture.
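A minimal NumPy sketch of the symmetry problem in Question 16 (a hypothetical two-layer network of my own construction, sigmoid units, bias updates omitted since they stay symmetric too): with all-zero initialization every hidden unit receives identical gradients, so the columns of W1 (one per hidden unit) remain identical after every update.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 2))          # 5 samples, 2 features
    y = rng.integers(0, 2, size=(5, 1))      # binary targets

    W1 = np.zeros((2, 3))                    # zero-initialized hidden layer (3 units)
    W2 = np.zeros((3, 1))

    for _ in range(10):                      # a few steps of plain gradient descent
        h = sigmoid(X @ W1)                  # forward pass
        p = sigmoid(h @ W2)
        dz2 = (p - y) / len(X)               # cross-entropy gradient w.r.t. pre-activation
        dW2 = h.T @ dz2
        dz1 = (dz2 @ W2.T) * h * (1 - h)
        dW1 = X.T @ dz1
        W2 -= 0.5 * dW2
        W1 -= 0.5 * dW1

    print(W1)  # all three columns (hidden units) are still identical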
Question 17 (Single choice question) Among the following, which one is a "hyperparameter"? ()
A. learning rate  B. model weights  C. model bias

Question 18 (Single choice question) Which of the following statements is true? ()
A. The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
B. The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.

Question 19 (True or False question) During forward propagation, in the forward function for a layer l you need to know what the activation function in the layer is (Sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know what the activation function is for layer l, since the gradient depends on it. ()
A. True  B. False

Question 20 (True or False question) Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. ()
A. True  B. False

Exercise

Question 1 (True or False question) SVM is a binary classifier. ()
A. True  B. False

Question 2 (Single choice question) What is the task of SVM? ()
A. A binary classification task with y = +1/-1.
B. A binary classification task with y = 1/0.

Question 3 (True or False question) SVM is a method trying to find the largest margin between two features. ()
A. True  B. False

Question 4 (Single choice question) What's the meaning of the margin? ()
A. The margin is defined in terms of the distance from the boundary to examples.
B. The margin is based on the value of the linear function.

Question 5 (Single choice question) Which of these is the hyperplane of the SVM? ()
A B C (formula options shown as images in the original)

Question 6 (Single choice question) Which of these is the distance between H1 and the origin? ()
A B (formula options shown as images in the original)

Question 7 (True or False question) In SVM, we can try to find a unique solution by requiring that the training examples are classified correctly with a non-zero "margin"; the margin is defined in terms of the distance from the boundary to the examples rather than based on the value of the linear function. ()
A. True  B. False

Question 8 (Single choice question) Which one is the method of achieving the largest margin? ()
A B (formula options shown as images in the original)
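Questions 5, 6 and 8 refer to formulas shown as images. For reference, in the standard linear SVM formulation (my own notation, not necessarily the exact options):

    \text{Hyperplane: } w^\top x + b = 0, \qquad H_1 : w^\top x + b = +1, \quad H_2 : w^\top x + b = -1
    \text{Distance from } H_1 \text{ to the origin: } \frac{|1 - b|}{\lVert w \rVert}, \qquad \text{margin between } H_1 \text{ and } H_2 \text{: } \frac{2}{\lVert w \rVert}
    \text{Largest margin: } \min_{w, b}\ \tfrac{1}{2}\lVert w \rVert^2 \quad \text{s.t.} \quad y_i (w^\top x_i + b) \ge 1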
Question 9 (True or False question) SVM optimization is a relaxed quadratic optimization problem. ()
A. True  B. False

Question 10 (True or False question) (question text not preserved in the source)
A. True  B. False

Question 11 (Single choice question) What will happen if using a large C? ()
A. It will cause few violations.  B. It will cause many violations.

Question 12 (True or False question) Linear SVM can still handle the problem when the input space is mapped to a high-dimensional feature space. ()
A. True  B. False

Question 13 (True or False question) The Gaussian kernel is one type of kernel in non-linear SVM. ()
A. True  B. False

Question 14 (True or False question) (question text not preserved in the source)
A. True  B. False
Question 15 (Single choice question) What is the consequence of using a small C? ()
A. It will cause few violations.  B. It will cause many violations.

Question 16 (True or False question) Many (low dimensional) problems are solved well by a linear classifier with slack. ()
A. True  B. False

Question 17 (Single choice question) How can we get non-linear margin curves in the original space? ()
A. Mapping examples to feature vectors and maximizing a linear margin in the feature space.
B. Just use the linear classifier with slack.

Question 18 (Single choice question) What's the function of parameter C? ()
A. accuracy  B. efficiency  C. Regularization
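A small scikit-learn sketch of the role of C in Questions 11, 15 and 18 (assuming scikit-learn is available; the dataset is synthetic and the values of C are my own examples): C trades margin width against slack violations, which is why it acts as the regularization knob.

    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Two slightly overlapping clusters, so some slack is unavoidable.
    X, y = make_blobs(n_samples=200, centers=2, cluster_std=2.0, random_state=0)

    for C in (0.01, 100.0):
        clf = SVC(kernel="linear", C=C).fit(X, y)
        # Small C tolerates many margin violations (wide margin, more support
        # vectors); large C penalizes violations hard (narrow margin, fewer).
        print(f"C={C}: support vectors={clf.support_vectors_.shape[0]}")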
Question 19 (True or False question) An SVM with no kernel function can also be seen as using a linear kernel. ()
A. True  B. False

Question 20 (Single choice question) If we use a Gaussian kernel in SVM, what happens when we enlarge sigma? ()
A. high variance, low bias.  B. low variance, high bias.

Exercise

Question 1 (Single choice question) A Markov chain model is NOT defined by: ()
A. A set of states.  B. A set of transitions with associated probabilities.  C. A set of predictions.

Question 2 (True or False question) A set of transitions with associated probabilities means that the transitions emanating from a given state define a distribution over the possible next states. ()
A. True  B. False

Question 3 (True or False question) (question text not preserved in the source)
A. True  B. False

Question 4 (True or False question) (question text not preserved in the source)
A. True  B. False

Question 5 (True or False question) The Markov property specifies that the probability of a state depends only on the probability of the previous state. ()
A. True  B. False

Question 6 (True or False question) (question text not preserved in the source)
A. True  B. False

Question 7 (Single choice question) What's the meaning of N in the HMM λ = (N, M, A, B, π)? ()
A. The number of states in the model.
B. The number of distinct observation symbols per state.
C. The state transition probability distribution.
D. The observation symbol probability distribution.

Question 8 (Single choice question) What's the meaning of M in the HMM? ()
A. The number of states in the model.
B. The number of distinct observation symbols per state.
C. The state transition probability distribution.
D. The observation symbol probability distribution.

Question 9 (Single choice question) What's the meaning of A in the HMM? ()
A. The number of states in the model.
B. The number of distinct observation symbols per state.
C. The state transition probability distribution.
D. The observation symbol probability distribution.

Question 10 (Single choice question) What's the meaning of B in the HMM? ()
A. The number of states in the model.
B. The number of distinct observation symbols per state.
C. The state transition probability distribution.
D. The observation symbol probability distribution.
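For Questions 7-10, the model is the standard Rabiner-style HMM tuple (the λ and π symbols did not survive extraction; this is my rendering of the usual definitions):

    \lambda = (N, M, A, B, \pi)
    A = \{a_{ij}\}, \quad a_{ij} = P(q_{t+1} = S_j \mid q_t = S_i) \quad \text{(state transition distribution)}
    B = \{b_j(k)\}, \quad b_j(k) = P(o_t = v_k \mid q_t = S_j) \quad \text{(observation symbol distribution)}
    \pi = \{\pi_i\}, \quad \pi_i = P(q_1 = S_i) \quad \text{(initial state distribution)}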
Question 11 (Single choice question) Given: a model, a set of training sequences. Do: find model parameters that explain the training sequences with relatively high probability (the goal is to find a model that generalizes well to sequences we haven't seen before). Which task is this? ()
A. Learning  B. Classification  C. Segmentation

Question 12 (Single choice question) Given: a set of models representing different sequence classes, a test sequence. Do: determine which model/class best explains the sequence. Which task is this? ()
A. Learning  B. Classification  C. Segmentation

Question 13 (Single choice question) Given: a model representing different sequence classes, a test sequence. Do: segment the sequence into subsequences, predicting the class of each subsequence. Which task is this? ()
A. Learning  B. Classification  C. Segmentation

Question 14 (Single choice question) What is the most probable "path" for generating a given sequence? ()
A. the Forward algorithm  B. the Viterbi algorithm  C. the Forward-Backward (Baum-Welch) algorithm

Question 15 (Single choice question) How likely is a given sequence in an HMM? ()
A. the Forward algorithm  B. the Viterbi algorithm  C. the Forward-Backward (Baum-Welch) algorithm

Question 16 (Single choice question) How can we learn the HMM parameters given a set of sequences? ()
A. the Forward algorithm  B. the Viterbi algorithm  C. the Forward-Backward (Baum-Welch) algorithm

Question 17 (True or False question) The Expectation Maximization algorithm is a family of algorithms for learning probabilistic models in problems that involve hidden state. ()
A. True  B. False
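A compact NumPy sketch of the two dynamic-programming recursions behind Questions 14 and 15 (the matrices A, B, pi and the observation sequence are my own toy values). Each step touches every pair of states, which is where the standard O(S^2 L) cost asked about in Question 18 below comes from; Baum-Welch runs forward-backward over all M sequences, giving O(M S^2 L) per iteration for Question 19.

    import numpy as np

    A = np.array([[0.7, 0.3], [0.4, 0.6]])   # state transition probabilities a_ij
    B = np.array([[0.9, 0.1], [0.2, 0.8]])   # observation probabilities b_j(k)
    pi = np.array([0.6, 0.4])                # initial state distribution
    obs = [0, 1, 1, 0]                       # an example observation sequence

    # Forward algorithm: P(obs | model), summing over all state paths.
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]        # O(S^2) work per step
    print("sequence likelihood:", alpha.sum())

    # Viterbi algorithm: the single most probable state path.
    delta = pi * B[:, obs[0]]
    back = []
    for o in obs[1:]:
        trans = delta[:, None] * A           # score of each predecessor for each state
        back.append(trans.argmax(axis=0))
        delta = trans.max(axis=0) * B[:, o]
    path = [int(delta.argmax())]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    print("most probable path:", path[::-1])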
Question 18 (Single choice question) Calculate the computational complexity of HMM algorithms: given an HMM with S states and a sequence of length L, the complexity of the Forward, Backward and Viterbi algorithms is ().
A B (formula options shown as images in the original)

Question 19 (Single choice question) Calculate the computational complexity of HMM algorithms: given M sequences of length L, the complexity of Baum-Welch on each iteration is ().
A B (formula options shown as images in the original)

Question 20 (Single choice question) Which one is not a DP-based algorithm for HMMs? ()
A. Forward  B. Backward  C. Viterbi  D. Expectation Maximization

Exercise

Question 1 (Single choice question) Which of the following is a characteristic of unsupervised learning? ()
A. Each input corresponds to an output label.
B. The purpose of the task is to learn to predict the correct label based on the input characteristics.
C. Training data has no corresponding label.
D. The noise of the label will affect the quality of the model.

Question 2 (Single choice question) What is the purpose of unsupervised learning? ()
A. Explore the relationship between input features and output tags.
B. Establish the input representation that can be used for decision-making, predict the future input, and effectively transfer the input to another machine.
C. Explore data structures with class/output guidance.
D. Modeling of known models based on historical data.

Question 3 (Single choice question) Which of the following is not an unsupervised learning method? ()
A. Clustering  B. Dimension Reduction  C. Latent variable models  D. Regression

Question 4 (Single choice question) Which of the following is not a clustering method? ()
A. Partition clustering  B. Hierarchical clustering  C. Local clustering  D. Density-based clustering

Question 5 (Single choice question) ① Feature Selection/Extraction ② Inter-pattern Similarity ③ Grouping. What is the correct order of clustering? ()
A. ①②③  B. ①③②  C. ②③①  D. ③①②

Question 6 (Single choice question) Which kind of clustering method builds a clustering tree from data? ()
A. Partition clustering  B. Hierarchical clustering  C. Grid-based clustering  D. Density-based clustering

Question 7 (Single choice question) Which is not one of the methods of Partition clustering? ()
A. K-means  B. K-medoids  C. CLARA  D. BIRCH

Question 8 (Single choice question) Which is not one of the methods of Hierarchical clustering? ()
A. BIRCH  B. ROCK  C. K-means  D. Chameleon

Question 9 (Single choice question) Which is not one of the methods of Model-based clustering? ()
A. ROCK  B. EM  C. Concept clustering  D. Neural network based clustering

Question 10 (Single choice question) Which is the biggest advantage of Grid-based clustering? ()
A. Low memory cost  B. High quality  C. Fast processing speed  D. Visualization

Question 11 (Single choice question) Which is not one of the recent methods of unsupervised learning? ()
A. VAE  B. U-net  C. BERT  D. SimCLR

Question 12 (Single choice question) What is the type of VAE? ()
A. Contextual information  B. Contrast Learning  C. Reinforcement learning  D. Reconstruction

Question 13 (Single choice question) Which of the following statements about BERT is wrong? ()
A. The model has many layers.
B. The model has more than one kind of embedding.
C. The model has both encoder and decoder.
D. The model often performs better in downstream tasks.

Question 14 (Single choice question) Which of the following statements about Training Tricks is wrong? ()
A. All parameters need to be initialized with 0.
B. Weights are drawn from a Gaussian distribution with fixed mean 0 and fixed standard deviation (0.01); this is the most common initialization method.
C. Weights are drawn from a properly scaled uniform or Gaussian distribution with zero mean and a special variance.
D. (option not preserved in the source)

Question 15 (Multiple choice question) Which is not one of the methods of Hierarchical clustering? ()
A. STING  B. K-medoids  C. CLARA  D. WaveCluster
Correct answer: BC

Question 16 (Single choice question) Which kind of clustering method uses a multi-resolution grid data structure to quantify the space into a limited number of units? ()
A. Partition clustering  B. Hierarchical clustering  C. Grid-based clustering  D. Density-based clustering
Correct answer: C

Question 17 (Single choice question) Which kind of clustering method tries to find the model behind the data, and uses its probability distribution characteristics for clustering? ()
A. Partition clustering  B. Model-based clustering  C. Grid-based clustering  D. Density-based clustering
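Since several of the questions above turn on partition clustering, here is a minimal K-means sketch in NumPy (Lloyd's algorithm on synthetic data; the function name and data are my own illustration, with no empty-cluster handling):

    import numpy as np

    def kmeans(X, k, iters=20, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]  # random initial centers
        for _ in range(iters):
            # Assignment step: nearest center for every point.
            labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
            # Update step: each center moves to the mean of its cluster.
            centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        return labels, centers

    X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4])
    labels, centers = kmeans(X, k=2)
    print(centers)   # approximately (0, 0) and (4, 4)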
Question 18 (Single choice question) In which kind of clustering methods are clusters regarded as dense object regions separated by low-density regions in data space, where such low-density regions are sometimes regarded as noise? ()
A. Partition clustering  B. Model-based clustering  C. Grid-based clustering  D. Density-based clustering

Question 19 (Multiple choice question) Which of the following statements is true? ()
A. The bottom-up strategy gradually merges small categories into large categories, which is called cohesion.
B. The top-down strategy gradually merges small categories into large categories, which is called cohesion.
C. The bottom-up strategy gradually splits large categories into small categories, which is called splitting.
D. The top-down strategy gradually splits large categories into small categories, which is called splitting.
Correct answer: AD

Question 20 (Single choice question) Which kind of unsupervised learning method analyzes relationships between a set of documents and the terms they contain by producing a set of concepts related to the documents and terms? ()
A. Latent variable models  B. Graph model  C. Association analysis  D. Latent semantic analysis

Exercise
Question 1 (Single choice question) What is the foundation of artificial intelligence (AI) that solves problems that would prove impossible or difficult by human or statistical standards? ()
A. CNN  B. ANN  C. GNN  D. RNN
Question 2 (Single choice question) Which is not a motivation of CNN? ()
A. External stimuli are not transformed into electrical signals through nerve endings, which are not transduced to nerve cells (also called neurons).
B. Numerous neurons constitute the nerve center.
C. The nerve center synthesizes various signals and makes judgments.
D. The human body responds to external stimuli based on instructions from the nerve center.

Question 3 (Single choice question) Which is not an advantage of CNN compared with a fully connected network? ()
A. It needs fewer parameters
B. It costs less time for one pass
C. It is easier to overfit
D. It focuses more on local features

Question 4 (Single choice question) Which of the following is true? ()
A. A ConvNet is comprised of more than one convolutional layer (often with a pooling step) and then followed by one or more fully connected layers as in a standard multilayer neural network.
B. A ConvNet is comprised of one or more convolutional layers (often with a pooling step) and then followed by one or more fully connected layers as in a standard multilayer neural network.
C. A ConvNet is comprised of one or more convolutional layers (often without any pooling step) and then followed by one or more fully connected layers as in a standard multilayer neural network.
D. A ConvNet is comprised of one or more convolutional layers (often with a pooling step) and then followed by more than one fully connected layer as in a standard multilayer neural network.

Question 5 (Single choice question) Which of the following represents CNN? ()
A B (the options are figures in the original)

Question 6 (Multiple choice question) What measures did CNN take to solve the problem of "Dimensions are too large, parameters are too large, difficult to train"? ()
A. Local connectivity  B. Parameter sharing  C. Subsample  D. Dropout
Correct answer: ABC

Question 7 (Single choice question) What measure did CNN take to solve the problem of "The position is not used in ANN"? ()
A. Local connectivity  B. Parameter sharing  C. Subsample  D. Dropout

Question 8 (Multiple choice question) What measures did CNN take to solve the problem of "The number of network layers is limited and it is difficult to train a deep fully connected network through the gradient descent method"? ()
A. Local connectivity  B. Parameter sharing  C. Subsample  D. Dropout
Correct answer: ABC

Question 9 (Single choice question) (question text not preserved in the source) ()
A. 0  B. 1  C. -1  D. -2

Question 10 (Multiple choice question) What advantages can parameter sharing bring to CNN? ()
A. Save parameters  B. Less time cost  C. More difficult to overfit  D. Easier to design
Correct answer: AC

Question 11 (Single choice question) What is needed to avoid the matrix after convolution becoming smaller and smaller? ()
A. Stride  B. Filtering  C. Padding  D. Pooling
Question 12 (Single choice question) Which number is usually used as the padding number? ()
A. 1  B. 255  C. -1  D. 0
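Questions 11 and 12 follow from the standard convolution output-size relation o = (n + 2p - f)/s + 1 for input size n, padding p, filter size f and stride s. A quick check with my own example values:

    # Output size of a convolution: o = (n + 2*p - f) // s + 1.
    def conv_out(n, f, p=0, s=1):
        return (n + 2 * p - f) // s + 1

    print(conv_out(32, 3))        # 30: repeated unpadded 3x3 convs shrink the map
    print(conv_out(32, 3, p=1))   # 32: "same" zero padding (p = (f-1)/2) keeps the size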
Question 13 (Single choice question) Which of the following sets of correspondences is correct? ()
A. Convolution layer & Pooling layer - Extract features, Fully connected layer - Classification
B. Convolution layer & Pooling layer - Classification, Fully connected layer - Classification
C. Convolution layer & Pooling layer - Classification, Fully connected layer - Extract features
D. Convolution layer & Pooling layer - Extract features, Fully connected layer - Extract features

Question 14 (Single choice question) Pooling will perform a downsampling operation along which dimension? ()
A. Time  B. Spatial  C. Channel  D. Color

Question 15 (Multiple choice question) What advantages can pooling bring to CNN? ()
A. Save parameters  B. Less time cost  C. Add robustness to position  D. Provide translation invariance
Correct answer: ABCD
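A small NumPy illustration of 2x2 max pooling for Questions 14 and 15 (a sketch on a single-channel map of my own): pooling halves the spatial dimensions and keeps the strongest response in each window, which is what gives the robustness to position.

    import numpy as np

    x = np.arange(16.0).reshape(4, 4)       # a 4x4 single-channel feature map

    # 2x2 max pooling with stride 2: group into non-overlapping windows,
    # then take the maximum over each window's two spatial axes.
    pooled = x.reshape(2, 2, 2, 2).max(axis=(1, 3))
    print(pooled.shape)   # (2, 2): downsampled along the spatial dimension
    print(pooled)         # [[ 5.  7.] [13. 15.]]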
Question 16 (Single choice question) Is it possible to mimic a complex model with more layers but no activation functions? ()
A. Yes  B. No

Question 17 (Single choice question) Which is not a property of an activation function? ()
A. Continuous: gradient descent's requirement.
B. The range is preferably not saturated: if the optimization enters the saturation stage, the gradient is approximately 0, and the learning of the network will stop.
C. Linearity
D. Monotonicity: when the activation function is monotonic, the loss function of a single-layer neural network is convex, which is good for optimization.

Question 18 (Single choice question) Which kind of activation function is good at detecting differences and is commonly used in binary classification tasks? ()
A. Tanh  B. Sigmoid  C. ReLU  D. Leaky ReLU

Question 19 (Single choice question) Why does the Tanh function converge faster than the sigmoid function? ()
A. Tanh is zero centered so that gradients are not restricted to move in one particular direction.
B. Tanh needs less computation in backpropagation
C. Tanh needs less computation in forward propagation
D. Tanh is specially supported by the hardware
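A quick numeric check of option A in Question 19 (a sketch): sigmoid outputs are always positive, while tanh outputs are centered around zero.

    import numpy as np

    z = np.linspace(-4, 4, 9)
    sigmoid = 1 / (1 + np.exp(-z))
    tanh = np.tanh(z)

    print(sigmoid.mean())  # ~0.5: always in (0, 1), never negative
    print(tanh.mean())     # ~0.0: outputs are centered around zero

    # With all-positive activations feeding a layer, the gradients of that
    # layer's incoming weights all share one sign per unit, forcing zig-zag
    # updates; zero-centered tanh avoids this restriction.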
Question 20 (Multiple choice question) Which of the following sets of correspondences is correct? ()
A. Mean square error - often used in regression problems
B. Cross entropy loss function - often used in classification problems
C. Mean square error - often used in classification problems
D. Cross entropy loss function - often used in regression problems
Correct answer: AB

Exercise
Question 1 (Single choice question) Which of the following is not sequence data? ()
A. Time-series data  B. Gene Sequence  C. Speech sounds  D. Photos

Question 2 (Multiple choice question) Why do we need RNN? ()
A. Variable neurons and parameter sharing across different time steps
B. Huge amount of data
C. Long-term dependency challenge
D. Different domains of data
Correct answer: AC

Question 3 (Single choice question) Which is not a solution to the long-term dependency challenge? ()
A. Using recurrent connections with long delays  B. CNN  C. Leaky Units  D. LSTM (gated RNNs)

Question 4 (Single choice question) What is the training process in which the fed-back inputs are not the predicted outputs but the targets themselves? ()
A. LSTM  B. Dropout  C. Teacher forcing  D. Pooling

Question 5 (Single choice question) What can be viewed as a directed graphical model that estimates the conditional distribution of a sequence? ()
A. CNN  B. RNN  C. GNN  D. ANN

Question 6 (Single choice question) In many applications we want to output at time t a prediction regarding an output which may depend on the whole input sequence. To solve this problem, what model was proposed? ()
A. Bidirectional RNNs  B. MASS  C. CNN  D. GRU

Question 7 (Single choice question) Which is not a case where we want to output at time t a prediction regarding an output which may depend on the whole input sequence? ()
A. Speech Recognition  B. Handwriting Recognition  C. Face recognition  D. Bioinformatics

Question 8 (Single choice question) There are three blocks of parameters and associated transformations in an RNN; which is not included? ()
A. From output to hidden state
B. From input to hidden state
C. From hidden state to hidden state
D. From hidden state to output
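The three parameter blocks in Question 8 map directly onto a vanilla RNN cell; a minimal sketch (variable names are my own):

    import numpy as np

    # Parameter blocks of a vanilla RNN: input->hidden (W_xh),
    # hidden->hidden (W_hh), hidden->output (W_hy).
    rng = np.random.default_rng(0)
    W_xh = rng.standard_normal((8, 16))
    W_hh = rng.standard_normal((16, 16))
    W_hy = rng.standard_normal((16, 4))
    b_h, b_y = np.zeros(16), np.zeros(4)

    def rnn_step(x_t, h_prev):
        h_t = np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)  # recurrent update
        y_t = h_t @ W_hy + b_y                           # readout
        return h_t, y_t

    h = np.zeros(16)
    for x_t in rng.standard_normal((5, 8)):   # a length-5 input sequence
        h, y = rnn_step(x_t, h)               # the same weights are reused at every step
    print(y.shape)  # (4,)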
Question 9 (Single choice question) Which kind of network has a computational graph that generalizes that of the recurrent network from a chain to a tree? ()
A. Deep Recurrent Network  B. Recursive Neural Network  C. Convolutional Neural Network  D. Graph neural network

Question 10 (Single choice question) Arising from the repeated product of the same weight matrix W: when W's spectral radius is bigger than 1, it leads to ①, otherwise ②. ()
A. ① gradient exploding ② gradient exploding
B. ① gradient exploding ② gradient vanishing
C. ① gradient vanishing ② gradient exploding
D. ① gradient vanishing ② gradient vanishing
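A quick NumPy check of Question 10 (the scaling factors are my own examples): repeatedly multiplying by the same matrix scales a vector roughly by the spectral radius per step, so norms blow up when it exceeds 1 and shrink toward zero otherwise.

    import numpy as np

    def norm_after_steps(radius, steps=50):
        W = radius * np.eye(4)       # the simplest matrix with that spectral radius
        v = np.ones(4)
        for _ in range(steps):
            v = W @ v                # the repeated product in backprop through time
        return np.linalg.norm(v)

    print(norm_after_steps(1.1))     # ~ 1.1**50 scale: gradient exploding
    print(norm_after_steps(0.9))     # ~ 0.9**50 scale: gradient vanishing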
Question 11 (Single choice question) Which solution for long-term dependencies proposes to set those weights such that the recurrent hidden units do a good job of capturing the history of past inputs, and only learn the output weights? ()
A. Echo State Networks  B. Combining short and long paths  C. Leaky Unit  D. Gated RNNs

Question 12 (Single choice question) Which solution for long-term dependencies proposes to use recurrent connections with long delays? ()
A. Echo State Networks  B. Leaky Unit  C. Combining short and long paths  D. Gated RNNs

Question 13 (Single choice question) Which solution for long-term dependencies proposes to have units with linear self-connections and a weight near 1 on these connections? ()
A. Echo State Networks  B. Leaky Unit  C. Combining short and long paths  D. Gated RNNs

Question 14 (Single choice question) Which solution for long-term dependencies proposes to let the network learn to decide when to update or forget state? ()
A. Echo State Networks  B. Leaky Unit  C. Combining short and long paths  D. Gated RNNs

Question 15 (Single choice question) Which of the following statements is true? ()
A. LSTM is less powerful than GRU, and GRU is easier to train than LSTM
B. LSTM is more powerful than GRU, but GRU is less easy to train than LSTM
C. LSTM is less powerful than GRU, but GRU is more difficult to train than LSTM
D. LSTM is more powerful than GRU, but GRU is easier to train than LSTM

Question 16 (Single choice question) (question text not preserved in the source) ()
A. LSTM  B. BERT  C. CNN  D. GRU

Question 17 (Single choice question) (question text not preserved in the source) ()
A. LSTM  B. BERT  C. CNN  D. GRU

Question 18 (Single choice question) Which solution for long-term dependencies proposes to divide the small derivative by a small second derivative, while not scaling up in the directions where the second derivative is large? ()
A. Echo State Networks  B. Better optimization  C. Combining short and long paths  D. Gated RNNs
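For Question 13, the leaky-unit update is usually written as (standard form, my rendering):

    \mu_t = \alpha\, \mu_{t-1} + (1 - \alpha)\, v_t, \qquad \alpha \approx 1

A self-connection weight α near 1 lets information persist over many steps; the gated RNNs of Question 14 generalize this by letting the network learn the gating coefficients instead of fixing α.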
Question 19 (Single choice question) Which solution for long-term dependencies proposes to change the size of the gradient? ()
A. Echo State Networks  B. Better optimization  C. Clipping Gradients  D. Gated RNNs
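Gradient clipping from Question 19 in a few lines of NumPy (the threshold value is my own example):

    import numpy as np

    def clip_gradient(g, threshold=5.0):
        # Rescale the whole gradient vector when its norm exceeds the threshold;
        # the direction is kept, only the size of the gradient changes.
        norm = np.linalg.norm(g)
        return g * (threshold / norm) if norm > threshold else g

    print(np.linalg.norm(clip_gradient(np.full(100, 3.0))))  # 5.0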
Question 20 (Single choice question) (question text not preserved in the source) ()
A. breadth first search  B. beam search  C. greedy search  D. Core search

Exercise
Question 1 (Single choice question) Adversarial networks (GAN) usually consist of how many part(s)? ()
A. 1  B. 2  C. 3  D. 4

Question 2 (Single choice question) Which of the following descriptions of generative adversarial networks is incorrect? ()
A. A generative adversarial network consists of two parts: a generator and a discriminator
B. When the discriminator of an adversarial network is training, its input is the image generated by the generator and the real image from the training set.
C. GAN's generators generate images from random noise (usually taken from a uniform or Gaussian distribution)
D. Since generative adversarial networks are unsupervised models, no training data is required

Question 3 (Single choice question) The generator and discriminator of an adversarial network can only have the same learning frequency. ()
A. correct  B. incorrect

Question 4 (Single choice question) The goal of adversarial network training is to make the loss function of the generator as low as possible. ()
A. correct  B. incorrect
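For Question 4 above and Question 5 below, the standard GAN objective (the original minimax value function, my rendering) shows why training is a two-player game rather than simply driving the generator loss down:

    \min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]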
Question 5 (Single choice question) Which of the following statements is not true about the cost function of generative adversarial networks (GAN)? ()
A. The generator should minimize the accuracy of the discriminant model D, while the discriminator should maximize the accuracy of true and false classification
B. In order to achieve game balance between generator and discriminator, the cost function of GAN needs to consider the performance of both
C. By training the discriminator and generator alternately, the performance of the discriminator and generator can be improved to reach a balance point
D. Generally speaking, GAN can always reach the minimum value of the cost function through training

Question 6 (Single choice question) Which of the following statements is true for the training of generative adversarial networks (GAN)? ()
A. The generator is implemented by a feedforward neural network or deconvolution deep network and its goal is to make the generated image look like the real sample
B. If the discriminator is overfitted, the generator may generate a very strange sample

Question 7 (Single choice question) Which of the following statements is not true about the cross-entropy cost functions of generative adversarial networks (GAN)? (the question is cut off at this point in the source; its options are not preserved)