Reading:
Published on 2025-09-10
Notes and understandings I made while reading the paper. Comments are written in the callouts.

$$ O(m \log^{2/3} n) $$

Preliminaries

  • Directed graph $G=(V,\ E)$ with a non-negative weight function $w: E \to \mathbb{R}_{\geq0}$
  • $w_{uv}$ denotes the weight of each edge $(u,\ v)\in E$
  • $n=|V|,\ m=|E|$ denote the number of vertices and edges in the graph
  • The source vertex in the graph is denoted by $s$
  • The shortest-path distance from $s$ to a vertex $v\in V$ is denoted by $d(v)$
  • Assume every vertex in $G$ is reachable from $s$, thus $m\geq n-1$
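
Throughout these notes I will illustrate things with small Python sketches; they all assume the following adjacency-list representation, which is my own convention and not from the paper:

```python
# My convention for the sketches in these notes, not from the paper:
# a weighted directed graph as an adjacency list, where adj[u] is a list of
# (v, w_uv) pairs, vertices are the integers 0..n-1, and s is the source.
n = 5
s = 0
adj = [
    [(1, 2.0), (2, 5.0)],   # edges out of vertex 0 (the source s)
    [(2, 1.0), (3, 4.0)],   # edges out of vertex 1
    [(3, 1.0)],             # edges out of vertex 2
    [(4, 3.0)],             # edges out of vertex 3
    [],                     # vertex 4 has no outgoing edges
]
m = sum(len(out) for out in adj)   # here m = 6 >= n - 1, as assumed above
```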

Constant-Degree Graph.

For any given graph $G$, we may construct a graph $G'$ with constant in-degrees and out-degrees via a classical transformation:

  • Substitute each vertex $v$ with a cycle of vertices strongly connected with zero-weight edges. For every incoming or outgoing neighbor $w$ of $v$, there is a vertex $x_{vw}$ on this cycle
  • For every edge $(u,\ v)$ in $G$, add a directed edge from vertex $x_{uv}$ to $x_{vu}$ with weight $w_{uv}$

It's clear that shortest-path distances are preserved by the transformation. Each vertex in $G'$ has in-degree and out-degree at most 2.
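
For concreteness, here is a minimal sketch of this transformation, using edge triples `(u, v, w_uv)`; the function and variable names are mine, not the paper's:

```python
from collections import defaultdict

def to_constant_degree(n, edges):
    """Sketch of the transformation above. `edges` is a list of (u, v, w_uv)
    triples over vertices 0..n-1. Each vertex v becomes a zero-weight cycle
    with one node x[(v, w)] per in/out-neighbor w, and every original edge
    (u, v) is re-routed as x[(u, v)] -> x[(v, u)] with weight w_uv.
    Returns (x, new_edges) where x maps (v, w) to the new vertex id."""
    neighbors = defaultdict(dict)            # dict keys act as an ordered set
    for u, v, _ in edges:
        neighbors[u][v] = None
        neighbors[v][u] = None

    x, new_edges, next_id = {}, [], 0
    for v in range(n):
        cycle = []
        for w in neighbors[v] or {v: None}:  # an isolated vertex keeps one node
            x[(v, w)] = next_id
            cycle.append(next_id)
            next_id += 1
        if len(cycle) > 1:                   # the zero-weight cycle around v
            for i, node in enumerate(cycle):
                new_edges.append((node, cycle[(i + 1) % len(cycle)], 0))

    for u, v, w_uv in edges:                 # re-route the original edges
        new_edges.append((x[(u, v)], x[(v, u)], w_uv))
    return x, new_edges
```

Each new vertex ends up with at most one cycle edge plus at most one cross edge in each direction, which is where the degree bound of 2 comes from.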

Comparison-Addition Model.

Each addition and comparison takes unit time, and no other computations on edge weights are allowed.
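
One way to read this model: edge weights behave like opaque values that expose only addition and comparison. A hypothetical wrapper (my illustration, not anything from the paper) makes the restriction explicit:

```python
class Weight:
    """Illustration of the comparison-addition model: a weight supports
    addition and comparison in unit time, and nothing else (no rounding,
    hashing, bit manipulation, ...)."""
    __slots__ = ("value",)

    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        return Weight(self.value + other.value)   # unit-time addition

    def __lt__(self, other):
        return self.value < other.value           # unit-time comparison
```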

Labels Used in the Algorithm.

The algorithm maintains a global variable $\hat{d[u]}$ as a sound estimate of $d(u)$ (that is, $\hat{d[u]} \geq d(u)$):

  • Initialize $\hat{d[s]} =0$ and $\hat{d[v]} =\infty$ for any other $v\in V$
  • Update $\hat{d[v]}$ in a non-increasing manner by relaxing an edge $(u,\ v)\in E$ throughout the algorithm. That is, assign $\hat{d[v]} \leftarrow\hat{d[u]} +w_{uv}$ when $\hat{d[u]} +w_{uv}$ is no greater than the old value of $\hat{d[v]}$
  • If $\hat{d[x]}=d(x)$, we say $x$ is complete; otherwise, we say $x$ is incomplete. If all vertices in a set $S$ are complete, we say $S$ is complete
  • Maintain a shortest path tree according to current $\hat{d[\cdot]}$ by recording the predecessor $Pred[v]$ of each vertex $v$. When relaxing an edge $(u,\ v)$, set $Pred[v]\leftarrow u$

💡 The predecessor $Pred[v]$ simply records how the algorithm found the current best path to $v$: it is the vertex that comes immediately before $v$ on the current shortest path from the source $s$.
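
A minimal sketch of this bookkeeping (the names `Labels`, `d_hat`, `pred`, and `relax` are mine; ties are ignored here and are handled by the total order on paths introduced in the next subsection):

```python
import math

class Labels:
    """d-hat / Pred bookkeeping as described above."""

    def __init__(self, n, s):
        self.d_hat = [math.inf] * n   # invariant: d_hat[v] >= d(v) at all times
        self.pred = [None] * n        # predecessor on the current best path to v
        self.d_hat[s] = 0             # d_hat[s] = 0, everything else infinity

    def relax(self, u, v, w_uv):
        """Relax edge (u, v); d_hat[v] only ever decreases."""
        if self.d_hat[u] + w_uv < self.d_hat[v]:
            self.d_hat[v] = self.d_hat[u] + w_uv
            self.pred[v] = u          # record how the best path to v was found
            return True               # the label of v improved
        return False
```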

Total Order of Paths.

We make the following assumption:

Assumption 2.1. All paths we obtain in this algorithm have different lengths.

This assumption is required for two purposes:

  • To ensure that the predecessors $Pred[v]$ of all vertices $v\in V$ form a tree structure throughout the algorithm
  • To provide a relative order among vertices with the same value of $\hat{d[\cdot]}$

💡 Why can the tree structure not be kept if two paths could have the same length?
Keeping the tree structure means there can be only one possible value for $Pred[v]$ (one parent only). If $v$ had two possible parents $x_1$ and $x_2$, it would mean $\hat{d[x_1]} +w_{x_1v} = \hat{d[x_2]} +w_{x_2v}$, which is exactly what the assumption tries to avoid.

💡 How is the order preserved when $\hat{d[\cdot]}$ could have the same value?
Represent a path of length $l$ that traverses $\alpha$ vertices $v_1=s,\ v_2,\ \ldots,\ v_\alpha$ as a tuple $\langle l,\ \alpha,\ v_\alpha,\ v_{\alpha-1},\ \ldots,\ v_1\rangle$. In principle, paths are sorted in the lexicographical order of these tuples.

That is, first compare the length $l$. When we have a tie, compare $\alpha$. If there is still a tie, compare the sequence of vertices from $v_\alpha$ to $v_1$.

Comparison between the tuples can be done in only $O(1)$ time with extra information of $Pred[]$ and $\alpha$ stored for each $\hat{d[v]}$:

  • Relaxing an edge $(u,\ v)$: if $u\neq Pred[v]$, even when there is a tie in $l$ and $\alpha$, it suffices to compare $u$ with $Pred[v]$; if $u=Pred[v]$, then $\hat{d[u]}$ has just been updated to a currently “shorter” value, so $\hat{d[v]}$ needs to be updated as well

EXPLAIN: If $u=Pred[v]$ when performing a relaxation, we can deduce that $\hat{d[u]}^{new} + w_{uv} \leq \hat{d[u]}^{old} + w_{uv}$, so $\hat{d[u]}$ has indeed been updated, and $\hat{d[v]}$ is updated to $\hat{d[u]}^{new} + w_{uv}$ as a result. But could the tuple of $\hat{d[u]}^{new}$ precede that of $\hat{d[u]}^{old}$ while $\hat{d[u]}^{new} + w_{uv} = \hat{d[u]}^{old} + w_{uv}$? Such a situation would require $\hat{d[u]}^{new}= \hat{d[u]}^{old}$, i.e., two different paths of the same length. Wow, isn't that familiar? Yes, it is exactly what the first purpose of the assumption rules out, so we have a contradiction.

  • Comparing two different $\hat{d[u]}$ and $\hat{d[v]}$ for $u\neq v$: Even if there is a tie in $l$ and $\alpha$, it suffices to compare the endpoints $u$ and $v$ only

EXPLAIN: For $u\neq v$, the endpoints $u$ and $v$ are distinct vertices, so they can be compared directly (e.g., by their indices), which already breaks the tie.
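
A small sketch of why such a comparison stays $O(1)$, assuming each label carries its current length, its hop count $\alpha$, and its endpoint's id (this packaging of the information is mine, not the paper's):

```python
def precedes(label_u, label_v):
    """Return True if label_u's path comes first in the lexicographic order.
    A label is a triple (l, alpha, v): current path length, number of vertices
    on the path, and the id of its endpoint. For two labels with different
    endpoints, the full vertex sequence never has to be unrolled: once l and
    alpha tie, the endpoints v_alpha already differ and break the tie."""
    lu, au, u = label_u
    lv, av, v = label_v
    if lu != lv:
        return lu < lv      # compare path lengths first
    if au != av:
        return au < av      # then the number of vertices on the path
    return u < v            # finally the endpoints themselves
```

For the relaxation case, the same idea applies with $Pred[v]$ playing the role of one of the endpoints.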

Therefore, in the following sections, we may assume that each path starting from $s$ has a unique length.

The Algorithm

The main idea is based on divide-and-conquer on vertex sets.

Let $k:=\lfloor \log^{1/3}n\rfloor$ and $t:=\lfloor \log^{2/3}n\rfloor$. Partition a vertex set $U$ into $2^t$ pieces $U = U_1\cup U_2 \cup \ldots\cup U_{2^t}$ of similar sizes, where vertices in earlier pieces have smaller distances, and then recursively partition each $U_i$. In this way, the size of a sub-problem shrinks to a single vertex after roughly $(\log n)/t$ recursion levels.
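
For a concrete feel of these parameters, a tiny sketch (assuming base-2 logarithms, which is only my reading; the constants do not matter asymptotically):

```python
import math

def params(n):
    """Illustration only: k, t, and the rough recursion depth for a given n,
    assuming base-2 logarithms."""
    log_n = math.log2(n)
    k = math.floor(log_n ** (1 / 3))
    t = math.floor(log_n ** (2 / 3))
    depth = math.ceil(log_n / t)      # each level splits a piece into 2^t parts
    return k, t, depth

print(params(10 ** 6))   # roughly k = 2, t = 7, depth = 3 for n = 10^6
```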

To construct our structure dynamically, each time we try to compute the distances to a set of closest vertices (without necessarily recovering a complete ordering of their true distances), and report a boundary indicating how much progress we actually made.

Suppose at some stage of the algorithm, for every $u$ with $d(u)<b$, $u$ is complete and we have relaxed all the edges from $u$. We want to find the true distances to vertices $v$ with $d(v)\geq b$. To avoid the $\Theta(\log n)$ time per vertex in a priority queue, consider the “frontier” $S$ containing all vertices $v$ currently satisfying $b\leq \hat{d[v]} <B$ for some bound $B$ (without sorting them).

The shortest path to every incomplete vertex $v’$ with $b\leq d(v’) <B$ must visit some complete vertex in $S$. Thus to compute the true distance to every $v’$ with $b\leq d(v’) <B$, it suffices to find the shortest paths from vertices in $S$ and bounded by $B$. This subproblem is called bounded multi-source shortest path (BMSSP).
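
To pin down what the subproblem asks for, here is a naive baseline that collects the frontier and then runs a bounded multi-source Dijkstra from it. This is explicitly not the paper's BMSSP routine (it still pays the per-vertex heap cost the paper avoids); it only illustrates the interface, reusing the hypothetical `Labels` sketch from above.

```python
import heapq

def frontier(labels, b, B):
    """S = { v : b <= d_hat[v] < B }, collected by a linear scan, no sorting."""
    return [v for v, dv in enumerate(labels.d_hat) if b <= dv < B]

def bmssp_baseline(adj, labels, S, B):
    """Baseline bounded multi-source shortest paths (NOT the paper's routine):
    Dijkstra started from all of S with their current d_hat values as keys,
    never settling a vertex whose key is >= B. Returns the vertices settled."""
    heap = [(labels.d_hat[u], u) for u in S]
    heapq.heapify(heap)
    settled = set()
    while heap:
        du, u = heapq.heappop(heap)
        if du >= B:
            break                           # every remaining key is >= B
        if u in settled or du > labels.d_hat[u]:
            continue                        # stale heap entry
        settled.add(u)
        for v, w_uv in adj[u]:
            if labels.relax(u, v, w_uv) and labels.d_hat[v] < B:
                heapq.heappush(heap, (labels.d_hat[v], v))
    return settled
```

The $\Theta(\log n)$ cost per heap operation in this baseline is exactly the overhead the paper's recursive BMSSP procedure is designed to avoid.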

💡 Here “must visit some complete vertex in $S$” can be argued as follows. Take the shortest path from $s$ to such a $v’$ and let $v”$ be the last vertex on it with $d(v”)<b$; such a $v”$ exists because $d(s)=0<b\leq d(v’)$. Let $x$ be the vertex right after $v”$ on this path, so $d(x)\geq b$. Since $v”$ is complete and all edges out of it have been relaxed, $\hat{d[x]}\leq d(v”)+w_{v”x}=d(x)$, hence $\hat{d[x]}=d(x)$ and $x$ is complete. Moreover $b\leq \hat{d[x]}=d(x)\leq d(v’)<B$, so $x\in S$, and the shortest path to $v’$ indeed visits a complete vertex in $S$.
