Kernel Fusion¶
Kernel fusion is a machine-learning technique that combines multiple kernels (functions that measure the similarity between data points) into a single kernel. Combining kernels lets a model integrate different types of data or features and capture relationships that no single view exposes. Typically, kernel fusion weights and sums the outputs of individual kernels, each of which can be tailored to a specific data modality or feature set.
Kernel-space fusion via weighted sum of per-view kernel matrices.
Classes / functions¶
- center_kernel(K) – remove mean in RKHS
- normalize_kernel(K) – scale so K[i, i] = 1 for all i
- is_valid_kernel(K) – check symmetry + PSD
- KernelSpec – per-view kernel configuration
- KernelFusion – fit/transform estimator producing a fused kernel
References
Gönen & Alpaydin (2011). Multiple kernel learning algorithms. JMLR 12, 2211-2268.
- class polyview.fusion.kernel_fusion.KernelFusion(*args: Any, **kwargs: Any)¶
Bases: BaseMultiViewTransformer

Fuse views via a weighted sum of per-view kernel matrices.

Each view is mapped to a kernel matrix by its KernelSpec, then:

K_fused = sum_i (w_i * K_i)          (normalize_weights=False)
K_fused = sum_i ((w_i / W) * K_i)    (normalize_weights=True, where W = sum_i w_i)
The output is a raw (n_samples, n_samples) kernel matrix for use with any kernel method: spectral clustering, kernel SVM, kernel PCA, etc.
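The weighted sum itself is simple to write down. Here is a numpy-only sketch with toy kernel matrices; `fuse_kernels` is an illustrative stand-in, not the library's internals:

```python
import numpy as np

def fuse_kernels(kernels, weights, normalize_weights=False):
    # K_fused = sum_i w_i * K_i; optionally rescale weights to sum to 1
    w = np.asarray(weights, dtype=float)
    if normalize_weights:
        w = w / w.sum()  # convex combination
    return sum(wi * Ki for wi, Ki in zip(w, kernels))

# Two toy 3 x 3 kernels standing in for two views
K1 = np.eye(3)
K2 = np.ones((3, 3))
K = fuse_kernels([K1, K2], weights=[2.0, 1.0], normalize_weights=True)
```

With `normalize_weights=True` the weights [2, 1] become [2/3, 1/3], so each fused entry is a convex combination of the per-view entries.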
- Parameters:
specs (KernelSpec or list of KernelSpec or None) – One spec per view. None or a single spec is broadcast:
- None -> RBF, weight=1, center+normalize on each view
- a single KernelSpec -> the same spec applied to every view
normalize_weights (bool, default=False) – Divide weights by their sum (convex combination).
- kernels_¶
- Type:
list of ndarray (n_samples, n_samples)
- weights_¶
- Type:
ndarray (n_views,)
- K_fused_¶
- Type:
ndarray (n_samples, n_samples)
- specs_¶
- Type:
list of KernelSpec
Examples
>>> kf = KernelFusion()
>>> K = kf.fit_transform([X1, X2])  # RBF on each view

>>> specs = [KernelSpec("rbf", weight=2.0, gamma=0.1),
...          KernelSpec("linear", weight=1.0)]
>>> K = KernelFusion(specs).fit_transform([X1, X2])

>>> specs = [KernelSpec("precomputed"), KernelSpec("precomputed")]
>>> K = KernelFusion(specs).fit_transform([A1, A2])  # A* are n x n
- fit(views: List, y=None) KernelFusion¶
Compute per-view kernels and fuse them.
- Parameters:
views (list of array-like) – Feature arrays (n_samples, n_features_i), or square kernel matrices (n_samples, n_samples) paired with KernelSpec("precomputed").
y (ignored)
- Return type:
self
- kernel_matrix() numpy.ndarray¶
Return a copy of the fused kernel matrix.
- transform(views: List) numpy.ndarray¶
Return the fused kernel for a (possibly new) set of views.
- Parameters:
views (list of array-like)
- Returns:
K_fused
- Return type:
ndarray of shape (n_samples_new, n_samples_new)
- view_contributions() List[dict]¶
Per-view contribution fractions to the fused kernel (by Frobenius norm).
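One plausible reading of "contribution by Frobenius norm" is each weighted view's share of the total norm. The convention below (and the helper name `contribution_fractions`) is an assumption for illustration, not the library's verified formula:

```python
import numpy as np

def contribution_fractions(kernels, weights):
    # Assumed convention: share of ||w_i * K_i||_F in the sum over all views
    norms = [w * np.linalg.norm(K, "fro") for w, K in zip(weights, kernels)]
    total = sum(norms)
    return [n / total for n in norms]

# Two identical views with weights 3 and 1 -> fractions 0.75 and 0.25
fracs = contribution_fractions([np.eye(4), np.eye(4)], weights=[3.0, 1.0])
```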
- class polyview.fusion.kernel_fusion.KernelSpec(kernel: Literal['linear', 'rbf', 'polynomial', 'precomputed'] | Callable[[numpy.ndarray], numpy.ndarray] = 'rbf', weight: float = 1.0, center: bool = True, normalize: bool = True, kernel_params: dict | None = None, **kwargs)¶
Bases:
object

Configuration for one view’s kernel.
- Parameters:
kernel (str or callable) –
"linear"— dot-product kernel (sklearn linear_kernel)"rbf"— RBF/Gaussian (sklearn rbf_kernel, median heuristic when gamma=None)"polynomial"— polynomial kernel (sklearn polynomial_kernel)"precomputed"— the view array IS already a kernel matrix callable — any(X: ndarray) -> ndarrayfunctionweight (float, default=1.0) – Scalar weight in the weighted sum. Must be >= 0.
center (bool, default=True) – Center the kernel in the RKHS before fusion.
normalize (bool, default=True) – Normalize K so diagonals equal 1 before fusion.
kernel_params (dict, optional) – Extra kwargs forwarded to the kernel function, e.g.
{"gamma": 0.1}for RBF or{"degree": 2}for polynomial.**kwargs – Shorthand for kernel_params —
KernelSpec("rbf", gamma=0.5)is equivalent toKernelSpec("rbf", kernel_params={"gamma": 0.5}).
Examples
>>> KernelSpec("rbf")
>>> KernelSpec("rbf", gamma=0.5)
>>> KernelSpec("polynomial", kernel_params={"degree": 2, "coef0": 0.0})
>>> KernelSpec("precomputed", weight=2.0)
>>> KernelSpec(lambda X: np.tanh(X @ X.T), weight=0.5)
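The "rbf" entry above mentions a median heuristic when gamma=None. One standard formulation sets gamma from the median squared pairwise distance; the exact convention used by the library may differ, so treat this as a sketch:

```python
import numpy as np

def median_heuristic_gamma(X):
    # One common convention: gamma = 1 / (2 * median of squared pairwise distances)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T  # squared Euclidean distances
    med = np.median(d2[np.triu_indices_from(d2, k=1)])  # off-diagonal pairs only
    return 1.0 / (2.0 * med)

# Pairwise squared distances of [0, 1, 3] are {1, 9, 4}; median is 4
gamma = median_heuristic_gamma(np.array([[0.0], [1.0], [3.0]]))
```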
- build(X: numpy.ndarray) numpy.ndarray¶
Compute (and preprocess) the kernel matrix for view X.
- Parameters:
X (ndarray of shape (n_samples, n_features)) – Or shape (n_samples, n_samples) when kernel="precomputed".
- Returns:
K
- Return type:
ndarray of shape (n_samples, n_samples)
- polyview.fusion.kernel_fusion.center_kernel(K: numpy.ndarray) numpy.ndarray¶
Center a kernel matrix in the RKHS.
Equivalent to centering the implicit feature map phi(x) so its empirical mean is zero. Strongly recommended before fusing kernels from different views — removes the constant bias term and makes kernels comparable.
- Parameters:
K (ndarray of shape (n_samples, n_samples))
- Returns:
K_centered
- Return type:
ndarray of shape (n_samples, n_samples), symmetric
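The standard double-centering formula matching this description is K_c = H K H with H = I - (1/n) * ones((n, n)). A minimal sketch (not the package's exact code):

```python
import numpy as np

def center_kernel(K):
    # Double-centering: equivalent to centering the implicit feature map phi(x)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix, idempotent
    return H @ K @ H

# A constant-plus-identity kernel: centering removes the constant part entirely
K = np.ones((3, 3)) + np.eye(3)
Kc = center_kernel(K)
```

After centering, every row and column of Kc sums to zero, which is the RKHS analogue of mean-removal.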
- polyview.fusion.kernel_fusion.is_valid_kernel(K: numpy.ndarray, tol: float = 1e-06) bool¶
Return True if K is a valid symmetric PSD kernel matrix.
- Parameters:
K (ndarray)
tol (float) – Tolerance for symmetry check and minimum eigenvalue.
- Return type:
bool
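A straightforward check matching this description combines a symmetry test with the smallest eigenvalue of the symmetrized matrix (a sketch, not the package's exact code):

```python
import numpy as np

def is_valid_kernel(K, tol=1e-6):
    # Valid kernel: square, symmetric up to tol, and PSD up to tol
    K = np.asarray(K)
    if K.ndim != 2 or K.shape[0] != K.shape[1]:
        return False
    if not np.allclose(K, K.T, atol=tol):
        return False
    # eigvalsh assumes a symmetric matrix and returns real eigenvalues
    return bool(np.linalg.eigvalsh(K).min() >= -tol)
```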
- polyview.fusion.kernel_fusion.normalize_kernel(K: numpy.ndarray, eps: float = 1e-10) numpy.ndarray¶
Normalize a kernel so that K[i, i] = 1 for all i.
K_n[i, j] = K[i, j] / sqrt(K[i, i] * K[j, j])
Prevents views with larger-magnitude kernels from dominating the weighted sum in KernelFusion.
- Parameters:
K (ndarray of shape (n_samples, n_samples))
eps (float, default=1e-10) – Guards against zero-norm samples.
- Returns:
K_normalized
- Return type:
ndarray of shape (n_samples, n_samples), values in [-1, 1]
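The formula above is cosine normalization in the feature space: each sample's implicit feature vector is scaled to unit norm. A minimal sketch (not the package's exact code):

```python
import numpy as np

def normalize_kernel(K, eps=1e-10):
    # K_n[i, j] = K[i, j] / sqrt(K[i, i] * K[j, j]); eps guards zero-norm samples
    d = np.sqrt(np.clip(np.diag(K), eps, None))
    return K / np.outer(d, d)

# Diagonal entries 4 and 9 become 1; the off-diagonal 2 becomes 2 / (2 * 3)
K = np.array([[4.0, 2.0], [2.0, 9.0]])
Kn = normalize_kernel(K)
```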