.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/plot_example1.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_plot_example1.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_plot_example1.py:


Regression
----------

.. GENERATED FROM PYTHON SOURCE LINES 7-25

The objective of this example is to create an SVM regression algorithm in order
to build a metamodel. otsvm enables you to:

- set the lists of tradeoff factors and kernel parameters with the methods
  setTradeoffFactor and setKernelParameter,
- select the kernel type from the following list: linear kernel, polynomial
  kernel, sigmoid kernel, RBF kernel,
- run the algorithm on input and output samples,
- run the algorithm on an experiment plane and a function,
- run the algorithm on input and output samples together with an
  isoprobabilistic distribution.

We recommend that users choose the RBF kernel (the Gaussian kernel). Moreover,
it is important to understand that the selection of the parameters (kernel
parameter and tradeoff factor) is essential. If you do not know which values to
choose, use wide logarithmic ranges, for example
:math:`tradeoff \in \{10^{-5}, 10^{-3}, 10^{-1}, \ldots, 10^{3}\}` and
:math:`kernel\ parameter \in \{10^{-15}, 10^{-13}, \ldots, 10^{3}\}`.
The algorithm usually converges, but this can take a long time, especially when
many parameter combinations have to be tested.

.. GENERATED FROM PYTHON SOURCE LINES 27-129

.. code-block:: Python


    import openturns as ot
    import otsvm

    # create a function, here we create the Sobol function
    dimension = 3
    meanTh = 1.0
    a = ot.Point(dimension)
    inputVariables = ot.Description(dimension)
    formula = "1.0"
    covTh = 1.0
    for i in range(dimension):
        a[i] = 0.5 * i
        covTh = covTh * (1.0 + 1.0 / (3.0 * (1.0 + a[i]) ** 2))
        inputVariables[i] = "xi" + str(i)
        formula += (
            " * ((abs(4.0 * xi"
            + str(i)
            + " -2.0) + "
            + str(a[i])
            + ") / (1.0 + "
            + str(a[i])
            + "))"
        )
    covTh = covTh - 1.0
    model = ot.SymbolicFunction(inputVariables, ot.Description(1, formula))

    # create the input distribution
    ot.RandomGenerator.SetSeed(0)
    marginals = ot.DistributionCollection(dimension)
    for i in range(dimension):
        marginals[i] = ot.Uniform(0.0, 1.0)
    distribution = ot.ComposedDistribution(marginals)

    # create the lists of kernel parameters and tradeoff factors
    tradeoff = [0.01, 0.1, 1, 10, 100, 1000]
    kernel = [0.001, 0.01, 0.1, 1, 10, 100]

    # first example: create the problem from input and output samples
    # first, we create the samples
    dataIn = distribution.getSample(250)
    dataOut = model(dataIn)

    # second, we create our svm regression object; the third parameter is
    # selected from the enumeration { NormalRbf, Linear, Sigmoid, Polynomial }
    algo = otsvm.SVMRegression(dataIn, dataOut, otsvm.LibSVM.NormalRbf)

    # third, we set the kernel parameters and tradeoff factors
    algo.setTradeoffFactor(tradeoff)
    algo.setKernelParameter(kernel)

    # Perform the algorithm
    algo.run()

    # Stream out the results
    result = algo.getResult()

    # get the residual error
    residual = result.getResiduals()

    # get the relative error
    relativeError = result.getRelativeErrors()
    print(f"residual={residual} error={relativeError}")
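
    # a minimal added sketch, assuming the usual OpenTURNS MetaModelResult API
    # (getMetaModel): retrieve the fitted surrogate and compare its predictions
    # with the exact model on a small hold-out sample
    metamodel = result.getMetaModel()
    testIn = distribution.getSample(10)
    exactOut = model(testIn)
    predictedOut = metamodel(testIn)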

    # second example: create the problem from an experiment plane
    # first, we create the plane
    myExperiment = ot.MonteCarloExperiment(distribution, 250)
    dataIn = myExperiment.generate()
    dataOut = model(dataIn)

    # second, we create our svm regression object from the generated samples,
    # this time with a linear kernel
    algo2 = otsvm.SVMRegression(dataIn, dataOut, otsvm.LibSVM.Linear)

    # third, we set the kernel parameters and tradeoff factors
    algo2.setTradeoffFactor(tradeoff)
    algo2.setKernelParameter(kernel)

    # Perform the algorithm
    algo2.run()

    # Stream out the results
    result = algo2.getResult()

    # get the residual error
    residual = result.getResiduals()

    # get the relative error
    relativeError = result.getRelativeErrors()
    print(f"residual={residual} error={relativeError}")

    # the third example presents the SVMResourceMap class.
    # Users can set other parameters, such as the degree and the constant of
    # the polynomial kernel, the cache size, the number of folds or the epsilon
    # first, we create the samples
    dataIn = distribution.getSample(250)
    dataOut = model(dataIn)

    # second, we create our svm regression object
    # here we select the polynomial kernel, whose default degree is 3; we want
    # a degree of 2
    ot.ResourceMap.Set("LibSVM-DegreePolynomialKernel", "2")

    # now the degree of the polynomial kernel is 2
    algo = otsvm.SVMRegression(dataIn, dataOut, otsvm.LibSVM.Polynomial)

    # third, we set the kernel parameters and tradeoff factors
    algo.setTradeoffFactor(tradeoff)
    algo.setKernelParameter(kernel)

    # Perform the algorithm
    # algo.run()

    # Stream out the results
    # result = algo.getResult()
    # print(result)

    # get the residual error
    # residual = result.getResiduals()

    # get the relative error
    # relativeError = result.getRelativeErrors()


.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    residual=[0.0043475] error=[0.00793795]
    residual=[0.0515409] error=[1.04162]


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 4.902 seconds)


.. _sphx_glr_download_auto_examples_plot_example1.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_example1.ipynb <plot_example1.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_example1.py <plot_example1.py>`