paramz.examples package¶
Submodules¶
paramz.examples.ridge_regression module¶
Created on 16 Oct 2015
@author: Max Zwiessele
class Basis(degree, name='basis')[source]¶
Bases: paramz.parameterized.Parameterized
Basis class for computing the design matrix phi(X). The weights are held in the regularizer, so that this only represents the design matrix.
basis(X, i)[source]¶
Return the ith basis dimension. In the polynomial case, this is X**i. You can write your own basis function by inheriting from this class; the gradient checks will still work.
Note: i is zero for the first degree. This means the model also contains a bias term, which makes an explicit bias obsolete.
class Polynomial(degree, name='polynomial')[source]¶
Bases: paramz.examples.ridge_regression.Basis
basis(X, i)[source]¶
Return the ith basis dimension. In the polynomial case, this is X**i. You can write your own basis function by inheriting from this class; the gradient checks will still work.
Note: i is zero for the first degree. This means the model also contains a bias term, which makes an explicit bias obsolete.
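To illustrate the behaviour described above, here is a minimal standalone sketch of the i-th polynomial basis dimension; the function name `polynomial_basis` is illustrative only and not part of the paramz API:

```python
import numpy as np

# Sketch of what basis(X, i) is described to return for the polynomial
# case: X**i, where i=0 yields the implicit bias column of ones.
def polynomial_basis(X, i):
    """Return the i-th polynomial basis column of X."""
    return np.asarray(X, dtype=float) ** i

X = np.array([[1.0], [2.0], [3.0]])
bias = polynomial_basis(X, 0)     # column of ones (the implicit bias)
squared = polynomial_basis(X, 2)  # element-wise squares
```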
class Regularizer(lambda_, name='regularizer')[source]¶
class RidgeRegression(X, Y, regularizer=None, basis=None, name='ridge_regression')[source]¶
Bases: paramz.model.Model
Ridge regression with regularization.
For any regularization to work, we need gradient-based optimization.
Parameters:
- X (array-like) – the inputs X of the regression problem
- Y (array-like) – the outputs Y
- regularizer (paramz.examples.ridge_regression.Regularizer) – the regularizer to use
- name (str) – the name of this regression object
objective_function()[source]¶
The objective function for the given algorithm.
This function is the true objective, which is to be minimized. Note that all parameters are already set and in place, so you only need to return the objective function here.
For probabilistic models this is the negative log likelihood (including the MAP prior), so we return it here. If your model is not probabilistic, just return the objective you want to minimize!
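For a non-probabilistic ridge regression, the objective is the penalized sum of squared residuals. A minimal numpy sketch of that quantity (the helper name and signature are illustrative, not the paramz implementation):

```python
import numpy as np

# Illustrative ridge objective ||Y - Phi w||^2 + lambda * ||w||^2,
# the quantity a gradient-based optimizer would minimize.
def ridge_objective(Phi, Y, w, lam):
    residual = Y - Phi @ w
    return residual @ residual + lam * (w @ w)

Phi = np.eye(2)
Y = np.array([1.0, 2.0])
w = np.array([1.0, 2.0])
obj = ridge_objective(Phi, Y, w, 0.1)  # perfect fit, only the penalty remains
```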
parameters_changed()[source]¶
This method gets called when parameters have changed. Another way of listening to parameter changes is to add self as a listener to the param, such that updates get passed through. See paramz.param.Observable.add_observer.
phi(Xpred, degrees=None)[source]¶
Compute the design matrix for this model, using the degrees given by the index array degrees.
Parameters:
- Xpred (array-like) – inputs to compute the design matrix for
- degrees (array-like) – array of degrees to use [default=range(self.degree+1)]
Returns: phi (array-like) – the design matrix [degree x #samples x #dimensions]
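Assuming a polynomial basis, the documented [degree x #samples x #dimensions] layout can be sketched in plain numpy; `design_matrix` is a hypothetical helper, not the phi method itself:

```python
import numpy as np

# Stack one basis evaluation per degree along a new leading axis,
# giving the shape (len(degrees), #samples, #dimensions).
def design_matrix(Xpred, degrees):
    Xpred = np.atleast_2d(np.asarray(Xpred, dtype=float))
    return np.stack([Xpred ** d for d in degrees])

Phi = design_matrix([[1.0], [2.0], [3.0]], degrees=range(3))
Phi.shape  # (3, 3, 1): 3 degrees, 3 samples, 1 input dimension
```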
degree¶

weights¶