
Reducing Computational Complexity with Correlate Traversals

What's this blog post about?

Graph computing is distinctive in that it applies both querying and statistical algorithms from graph theory and network science to structured data. Two canonical examples are "friend-of-a-friend" traversals (querying) and PageRank (statistics). Statistical techniques analyze the shape of a graph, focusing on how individual vertices contribute to the overall structure. A graph-centric statistic typically yields a single number describing the whole graph, while a vertex-centric statistic maps every vertex to a number. Eccentricity, for example, maps each vertex to the length of the longest shortest path from it to any other vertex. Betweenness centrality counts, for each vertex, the number of shortest paths between all pairs of vertices that pass through it; computing it can cost O(|V|^3). In large-scale graph computation it is therefore important to avoid geodesic (shortest-path) calculations where possible, for instance by substituting a cheaper centrality measure known to correlate with the expensive one. Such centrality correspondences must be identified on a case-by-case basis. The self-similar structure of scale-free graphs makes it possible to calculate a correlation matrix on a smaller subgraph and maintain confidence that the subgraph correlations will generalize to the full graph.
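
As a rough illustration of that subgraph-correlation idea, the Python sketch below samples a subgraph of a synthetic scale-free graph and computes a rank-correlation matrix between a cheap centrality (degree), a moderate one (PageRank), and an expensive geodesic-based one (betweenness). The post itself works with Gremlin traversals; the networkx/scipy code, graph sizes, and sampling scheme here are illustrative assumptions, not the post's code.

# Minimal sketch (assumed stand-in for the post's Gremlin traversals):
# estimate centrality correlations on a small sampled subgraph.
import random

import networkx as nx
from scipy.stats import spearmanr

# Scale-free graph standing in for the full dataset (assumption:
# Barabasi-Albert preferential attachment, 5,000 vertices).
full_graph = nx.barabasi_albert_graph(n=5000, m=3, seed=42)

# Induce a much smaller random subgraph and keep its largest connected
# component so shortest-path statistics are well defined.
sampled = random.Random(42).sample(list(full_graph.nodes), 500)
subgraph = full_graph.subgraph(sampled)
subgraph = subgraph.subgraph(max(nx.connected_components(subgraph), key=len))
nodes = list(subgraph.nodes)

measures = {
    # Cheap, local statistic: linear in the number of vertices.
    "degree": nx.degree_centrality(subgraph),
    # Moderate cost: iterative computation until convergence.
    "pagerank": nx.pagerank(subgraph),
    # Expensive, geodesic-based: shortest paths between all vertex
    # pairs -- O(|V|^3) naively, O(|V||E|) with Brandes' algorithm.
    "betweenness": nx.betweenness_centrality(subgraph),
}

# Spearman rank-correlation matrix over the subgraph. A high
# correlation between a cheap and an expensive measure suggests the
# cheap one can stand in for it on the full graph.
names = list(measures)
print(" " * 12 + " ".join(f"{n:>12}" for n in names))
for a in names:
    row = []
    for b in names:
        rho, _ = spearmanr([measures[a][v] for v in nodes],
                           [measures[b][v] for v in nodes])
        row.append(rho)
    print(f"{a:<12}" + " ".join(f"{r:12.3f}" for r in row))

If degree and betweenness rank vertices nearly identically on the sample, the far cheaper degree computation can serve as a proxy for betweenness at full scale.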

Company
DataStax

Date published
April 10, 2017

Author(s)
Artem Aliev

Word count
1752

Language
English

Hacker News points
None found.
