Spectral clustering and biclustering of networks: large graphs and contingency tables

Explores regular structures in graphs and contingency tables by spectral theory and statistical methods. This book bridges the gap between graph theory and statistics by giving answers to the demanding questions which arise when statisticians are confronted with large weighted graphs or rectangular a...

Bibliographic Details
Online Access: Full text (MCPHS users only)
Main Author: Bolla, Marianna (Author)
Format: Electronic eBook
Language: English
Published: Chichester : Wiley-Blackwell, 2013
Local Note: ProQuest Ebook Central
Table of Contents:
  • Spectral Clustering and Biclustering: Learning Large Graphs and Contingency Tables; Contents; Preface; Acknowledgements; List of abbreviations; Introduction; References; 1 Multivariate analysis techniques for representing graphs and contingency tables; 1.1 Quadratic placement problems for weighted graphs and hypergraphs; 1.1.1 Representation of edge-weighted graphs; 1.1.2 Representation of hypergraphs; 1.1.3 Examples for spectra and representation of simple graphs; 1.2 SVD of contingency tables and correspondence matrices; 1.3 Normalized Laplacian and modularity spectra.
  • 1.4 Representation of joint distributions; 1.4.1 General setup; 1.4.2 Integral operators between L2 spaces; 1.4.3 When the kernel is the joint distribution itself; 1.4.4 Maximal correlation and optimal representations; 1.5 Treating nonlinearities via reproducing kernel Hilbert spaces; 1.5.1 Notion of the reproducing kernel; 1.5.2 RKHS corresponding to a kernel; 1.5.3 Two examples of an RKHS; 1.5.4 Kernel based on a sample and the empirical feature map; References; 2 Multiway cuts and spectra; 2.1 Estimating multiway cuts via spectral relaxation.
  • 2.1.1 Maximum, minimum, and ratio cuts of edge-weighted graphs; 2.1.2 Multiway cuts of hypergraphs; 2.2 Normalized cuts; 2.3 The isoperimetric number and sparse cuts; 2.4 The Newman-Girvan modularity; 2.4.1 Maximizing the balanced Newman-Girvan modularity; 2.4.2 Maximizing the normalized Newman-Girvan modularity; 2.4.3 Anti-community structure and some examples; 2.5 Normalized bicuts of contingency tables; References; 3 Large networks, perturbation of block structures; 3.1 Symmetric block structures burdened with random noise; 3.1.1 General blown-up structures.
  • 3.1.2 Blown-up multipartite structures; 3.1.3 Weak links between disjoint components; 3.1.4 Recognizing the structure; 3.1.5 Random power law graphs and the extended planted partition model; 3.2 Noisy contingency tables; 3.2.1 Singular values of a noisy contingency table; 3.2.2 Clustering the rows and columns via singular vector pairs; 3.2.3 Perturbation results for correspondence matrices; 3.2.4 Finding the blown-up skeleton; 3.3 Regular cluster pairs; 3.3.1 Normalized modularity and volume regularity of edge-weighted graphs.
  • 3.3.2 Correspondence matrices and volume regularity of contingency tables; 3.3.3 Directed graphs; References; 4 Testable graph and contingency table parameters; 4.1 Convergent graph sequences; 4.2 Testability of weighted graph parameters; 4.3 Testability of minimum balanced multiway cuts; 4.4 Balanced cuts and fuzzy clustering; 4.5 Noisy graph sequences; 4.6 Convergence of the spectra and spectral subspaces; 4.7 Convergence of contingency tables; References; 5 Statistical learning of networks; 5.1 Parameter estimation in random graph models.