not only identify a single price for each interval, but to extract additional information: the open, high, low, and close prices for this interval, named OHLC data (see Fig. 1).
Fig. 1. On the left side we see the market rate of a certain equity during one day. On the right side, these data have been compressed and depicted using the so-called candlestick layout: the upper and lower shadows mark the day's highest and lowest traded prices, whereas the body of the candle spans from the open to the close price. The color of the body illustrates the equity's development during the day: if the price went up, the body is white; otherwise it is black.
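To make the OHLC compression concrete, the following pandas sketch (a generic illustration, not taken from any of the systems cited here) aggregates a series of intraday prices into one open/high/low/close record per day; the timestamps and price values are purely illustrative.

# Minimal sketch, assuming intraday prices are available as a pandas Series
# indexed by timestamp; the dates and values below are illustrative only.
import pandas as pd

ticks = pd.Series(
    [100.0, 100.4, 99.8, 100.9, 101.2, 100.7],
    index=pd.to_datetime([
        "2011-05-02 09:00", "2011-05-02 10:30", "2011-05-02 11:45",
        "2011-05-02 13:15", "2011-05-02 14:50", "2011-05-02 17:25",
    ]),
)

# Compress the raw prices into one OHLC record per day, as used for candlesticks.
ohlc = ticks.resample("1D").ohlc()   # columns: open, high, low, close

# The candle body is white (bullish) if close >= open, and black otherwise.
ohlc["bullish"] = ohlc["close"] >= ohlc["open"]
print(ohlc)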
C. Related Research in the Field of Technical Analysis

Over the last 15 years, there has been a vast amount of scientific investigation into using machine learning methods for technical analysis.
[8], for example, use a backpropagation neural network (a multilayer perceptron) with one hidden layer to predict the daily close prices of the stock index S&P 500, and compare the results to an ARIMA model. They show that although the neural network has a higher tolerance to market fluctuations, its output is too volatile to indicate long-term trends. A better-suited approach is described in [9], which utilizes recurrent Elman neural networks [10] for forecasting foreign exchange prices, combined with a mechanism to automatically choose and optimize the network's parameters. The authors highlight that the forecasts differ less between different models than between different input data. Reliable predictions are possible for only two out of five exchange rates (JPY/USD and GBP/USD), whereas for the other rates, the prediction accuracy is similar to that of a naive forecast.
[11] uses a modified SVM model for regression with the (static) Gaussian kernel. By adjusting the regularization constant C with a weight function, recent errors are penalized more heavily than distant errors, thus increasing the influence of the most recent stock prices. In addition, [12] adds a similar weight function to the threshold ε, which limits the tolerance of Vapnik's ε-insensitive error function [13]. This approach helps to further reduce the complexity of the built model and the number of support vectors. Further emphasizing the need for thorough data preparation, [14] uses support vector classification combined with a variety of different pre-processing methods. As kernel functions, they use the polynomial kernel in addition to the Gaussian kernel. Compared to a backpropagation network, the Gaussian version heavily increases the measured prediction accuracy.
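The weighting idea shared by [11] and [12] can be sketched briefly. The following is not a reimplementation of either method; it only illustrates the underlying trick of scaling the regularization constant C per training sample so that errors on recent observations are penalized more heavily, here via scikit-learn's per-sample weights for an ε-SVR with a Gaussian kernel. The window length, weight profile, and hyperparameters are assumptions made purely for illustration.

# Sketch: time-weighted epsilon-SVR with an RBF (Gaussian) kernel.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
prices = 100.0 + np.cumsum(rng.normal(0.0, 1.0, 200))   # synthetic price series

# Lagged prices as inputs, next price as regression target.
window = 5
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]

# Linearly increasing sample weights: recent errors effectively see a larger C.
weights = np.linspace(0.1, 1.0, len(y))

model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y, sample_weight=weights)
print("in-sample R^2:", model.score(X, y))
print("number of support vectors:", len(model.support_))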
Although all these articles were able to present some success in their experiments, the major flaw is obvious: with a static kernel function it is only possible to incorporate a certain (limited) amount of information about the chart's history. The inherent temporal structure of the data cannot be analyzed appropriately, leading to relatively poor and unstable prediction results.
III. SUPPORT VECTOR MACHINES WITH DYNAMIC KERNEL FUNCTIONS

A. Fundamentals of Support Vector Machines
In this article, cost-sensitive support vector machines (C-SVM) and ν-SVM are used to classify the time series, with characteristic attributes extracted from the time series as inputs. Basically, SVM use a hyperplane to separate two classes [15]–[18]. For classification problems that cannot be linearly separated in the input space, SVM find a solution using a non-linear mapping from the original input space into a high-dimensional, so-called feature space, in which an optimally separating hyperplane is sought. A hyperplane is called optimal if it has a maximal margin, where the margin is the minimal distance from the separating hyperplane to the closest (mapped) data points, the so-called support vectors. The transformation is usually realized by nonlinear kernel functions. C-SVM and ν-SVM both allow, but also minimize, misclassification.
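As a brief illustration of the two classifier variants (a generic scikit-learn sketch on synthetic data, not the experimental setup used later in this article), C-SVM and ν-SVM differ mainly in how the misclassification trade-off is parameterized: C penalizes margin violations directly, whereas ν bounds the fraction of margin errors and of support vectors.

# Sketch: C-SVM vs. nu-SVM with a Gaussian (RBF) kernel on toy data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, NuSVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C-SVM: C trades off margin width against misclassification penalties.
c_svm = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)

# nu-SVM: nu is an upper bound on the fraction of margin errors and a
# lower bound on the fraction of support vectors.
nu_svm = NuSVC(kernel="rbf", nu=0.2).fit(X_train, y_train)

print("C-SVM accuracy: ", c_svm.score(X_test, y_test))
print("nu-SVM accuracy:", nu_svm.score(X_test, y_test))
print("support vectors:", len(c_svm.support_), len(nu_svm.support_))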
Compared to the popular artificial neural networks, SVM have several key advantages: by describing the problem as a convex quadratic optimization problem, they are guaranteed to converge to a unique global optimum instead of only a possibly local optimum. Additionally, by minimizing the structural risk of misclassification, SVM are far less vulnerable to overfitting, one of the major drawbacks of standard neural networks.

B. Related Work in the Field of Dynamic Kernel Functions

An overview and comparison of methods for time series classification with SVM can be found in [19] or [20], for instance. One common method for classifying time series with SVM is to use one of the default static kernels (i.e., polynomial or Gaussian); this has successfully been done for speaker verification [21], phonetic classification [22], and instrument classification [23]. A big disadvantage of this approach is that static kernels are unable to deal with data of different lengths. Therefore, it is necessary to re-sample the time series to a common length, or to extract a fixed number of features, before static kernels can be applied. Obviously, the re-sampling or the reduction to some extracted features induces a loss of information and is not well suited to time series of variable length, where a linear function for re-scaling is not applicable. A more sophisticated approach is to use methods that directly compare the data points of two time series in a more flexible way, for example with