1 Introduction
Primitives that reduce the sizes of graphs while retaining key properties are central to the design of efficient graph algorithms. Among such primitives, one of the most intriguing is the problem of vertex sparsification: given a set of terminal vertices, reduce the number of non-terminal vertices while preserving key information between the terminals. This problem has been extensively studied in approximation algorithms [MM10, CLLM10, EGK14, KR17, KW12, AKLT15, FKQ16, FHKQ16, GR16, GHP17a]. Recently, vertex sparsifiers were also shown to be closely connected with dynamic graph data structures [GHP17b, PSS19, GHP18, DGGP19].
Motivated by the problem of dynamic c-edge connectivity, which asks whether a pair of vertices has at least c edge-disjoint paths between them, we study vertex sparsifiers suitable for these problems.
Definition 1.1.
Two graphs G and H that both contain a subset of terminals T are c-cut-equivalent if for any partition of T into (T1, T2), we have
min(mincut_G(T1, T2), c) = min(mincut_H(T1, T2), c).
Here mincut_G(T1, T2) denotes the minimum number of edges leaving a vertex set V1 with V1 ∩ T = T1 in G, and mincut_H(T1, T2) is the same quantity in H.
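On small examples, this definition can be checked by brute force: enumerate every bipartition of the terminals and compare the two min-cut values, each clamped at c. The sketch below is our own illustration (the function names and the augmenting-path implementation are not from the paper); it uses unit-capacity augmenting paths with an early stop once the flow exceeds c.

```python
from collections import defaultdict, deque
from itertools import combinations

def min_cut_value(edges, side_a, side_b, c):
    """Size of a minimum cut separating side_a from side_b in a unit-capacity
    undirected multigraph, computed by augmenting paths; stops early and
    returns c + 1 once the cut is known to exceed c."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    SRC, SNK = "src", "snk"

    def add(u, v, amount):
        cap[(u, v)] += amount
        cap[(v, u)] += amount
        adj[u].add(v)
        adj[v].add(u)

    for (u, v) in edges:
        add(u, v, 1)
    for s in side_a:          # super-source attached to one terminal side
        add(SRC, s, c + 1)
    for t in side_b:          # super-sink attached to the other side
        add(t, SNK, c + 1)

    flow = 0
    while flow <= c:
        prev = {SRC: None}
        queue = deque([SRC])
        while queue and SNK not in prev:   # BFS for an augmenting path
            u = queue.popleft()
            for v in adj[u]:
                if v not in prev and cap[(u, v)] > 0:
                    prev[v] = u
                    queue.append(v)
        if SNK not in prev:
            return flow
        v = SNK
        while prev[v] is not None:         # push one unit along the path
            u = prev[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1
    return flow  # flow == c + 1: the min cut is larger than c

def cut_equivalent(edges_G, edges_H, T, c):
    """Brute-force check of c-cut-equivalence over all terminal bipartitions."""
    for r in range(1, len(T)):
        for T1 in combinations(T, r):
            T2 = [t for t in T if t not in T1]
            if min(min_cut_value(edges_G, T1, T2, c), c) != \
               min(min_cut_value(edges_H, T1, T2, c), c):
                return False
    return True
```

For instance, a triangle on {0, 1, 2} with terminals {0, 1} is 2-cut-equivalent to two parallel (0, 1) edges, but not to a single (0, 1) edge.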
Our main result is that for any graph G and terminals T, there is a graph H that is c-cut-equivalent to G, where the size of H depends linearly on the number of terminals |T|, but exponentially on c. We call H a vertex sparsifier for G on terminals T, as H has far smaller size than G while maintaining the same edge connectivity information, up to value c, on the terminals T.
Furthermore, we utilize ideas from recent works on edge connectivity [FY19, SW19, NSY19b] to obtain efficient algorithms for the case where c is a constant.
Theorem 1.2.
Given any graph G with n vertices and m edges, along with a subset of terminals T and a value c, we can construct a graph H which is c-cut-equivalent to G:

with edges in time,

with edges in time. (We use Õ(·) to hide polylogarithmic factors; in particular, Õ(f) denotes O(f · polylog(n)).)
Both components require algorithms for computing expander decompositions (Lemma 6.2). The first uses observations made in vertex cut algorithms [NSY19a, NSY19b, FY19], while the second uses local cut algorithms developed from such studies.
The more general problem of multiplicatively preserving all edge connectivities has been extensively studied. Here an upper bound with a multiplicative approximation factor [CLLM10, MM10] can be obtained without using additional vertices. It is open whether this bound can be improved when additional vertices are allowed, but without them, a lower bound is also known [MM10]. For our restricted version of only considering cut values up to c, better existential bounds are known for larger values of c [KW12, FHKQ16]. However, the construction time of these vertex sparsifiers is also critical for their use in data structures [PSS19]. For a moderate number of terminals, nearly-linear time constructions of vertex sparsifiers with few vertices were previously known only for small values of c [PSS19, MS18].
Vertex sparsification is closely connected with dynamic graph data structures, and directly plugging the sparsifiers described in Theorem 1.2 into the divide-and-conquer-over-time framework proposed by Eppstein [Epp94] (a more general form of it can be found in [PSS19]) gives an efficient offline algorithm for supporting fully dynamic connectivity queries.
Theorem 1.3 (Dynamic offline connectivity).
An offline sequence of edge insertions, deletions, and c-connectivity queries on an n-vertex graph can be answered in time per query.
In previously published works, the study of fully dynamic connectivity has been limited to the setting of small c [GI91, HdLT98, LS13, KL15, Kop12].
To our knowledge, the only results for maintaining exact connectivity for higher values of c are an incremental algorithm [DV94, DV95, DW98], and an unpublished offline fully dynamic algorithm by Molina and Sandlund [MS18]. These algorithms all require about time per query.
Furthermore, our algorithms give a variety of connections between graph algorithms, structural graph theory, and data structures:

The vertex sparsifiers we construct can be viewed as the analog of Schur complements (vertex sparsifiers for effective resistance) for edge connectivity, and raise the possibility that algorithms motivated by Gaussian elimination [KLP16, KS16] can in fact work for a much wider range of graph problems.

Our dependence on c is highly suboptimal: we were only able to construct instances that require at least edges in the vertex sparsifier, and are inclined to believe that an upper bound of is likely. Narrowing this gap between upper and lower bounds is an interesting question in combinatorial graph theory that directly affects the performance of data structures and algorithms that utilize such sparsifiers.
1.1 Paper Organization
In Section 2 we give preliminaries for our algorithm. In Section 3 we give an outline for our algorithms. In Section 4 we show the existence of good cut vertex sparsifiers. In Section 5 we give a polynomial time construction of cut vertex sparsifiers whose size is slightly larger than those given in Section 4. In Section 6 we use expander decomposition to make our algorithms run in nearly linear time.
2 Preliminaries
2.1 General Notation
All graphs that we work with are undirected and unit-weighted, but our treatment of cuts and contractions naturally requires (and leads to) multi-edges. We will refer to cuts both as subsets of edges and as the boundary ∂(S) of a subset of vertices S.
For symmetry, we will also denote cuts using the notation ∂(V1, V2), with V = V1 ⊔ V2, where ⊔ denotes disjoint union.
We will use T to denote a subset of terminals, and define k = |T|, the number of terminals. Note that each cut naturally induces a partition of the terminals into T1 and T2. For the reverse direction, given two subsets of terminals T1, T2 ⊆ T, we use mincut(T1, T2) to denote an arbitrary minimum cut between T1 and T2, and we let |mincut(T1, T2)| denote its size. Note that if T1 and T2 overlap, this value is infinite: such a case does not affect Definition 1.1 because it naturally takes the minimum with c. On the other hand, it leads us to focus more on disjoint splits of T, and we denote such splits using (T1, T2). For a set of terminals T, we refer to the set of cuts separating them with size at most c as the c-cuts.
A terminal cut is any cut that has at least one terminal on both sides of the cut. We also sometimes refer to these as Steiner cuts, as this language has been used in the past work of Cole and Hariharan [CH03]. The minimum terminal cut or minimum Steiner cut is the terminal cut with the smallest number of edges.
2.2 Contractions and Cut Monotonicity
Our algorithm will use the concept of contractions. For a graph G and edge e, we let G/e denote the graph with the endpoints of e identified as a single vertex, and we say that we have contracted edge e. The new vertex is made a terminal if at least one of the endpoints was a terminal. For any subset F ⊆ E, we let G/F denote the graph where all edges in F are contracted. We can show that for any split of terminals, the value of the min-cut between them is monotonically non-decreasing under such contractions.
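The contraction operation is easy to realize with a union-find structure. The following sketch is our own illustration (not the paper's code): it builds G/F, drops the self-loops created by contraction, and marks a merged vertex as a terminal whenever any endpoint merged into it was one.

```python
class DSU:
    """Union-find with path halving (illustrative helper)."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

def contract(n, edges, F, terminals):
    """Build G/F: identify the endpoints of every edge in F, keep the
    remaining edges as multi-edges (dropping self-loops), and make a merged
    vertex a terminal if any merged endpoint was a terminal."""
    dsu = DSU(n)
    for (u, v) in F:
        dsu.union(u, v)
    new_edges = [(dsu.find(u), dsu.find(v)) for (u, v) in edges
                 if dsu.find(u) != dsu.find(v)]   # self-loops are discarded
    new_vertices = {dsu.find(v) for v in range(n)}
    new_terminals = {dsu.find(t) for t in terminals}
    return new_vertices, new_edges, new_terminals
```

For example, contracting the middle edge of a 4-vertex path merges its endpoints into one vertex while leaving the endpoint terminals untouched.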
Lemma 2.1.
For any split of terminals (T1, T2), and any set of edges F, we have |mincut_G(T1, T2)| ≤ |mincut_{G/F}(T1, T2)|.
2.3 Observations about cut equivalence
We start with several observations about the notion of cut-equivalence given in Definition 1.1.
Lemma 2.2.
If G and H are c-cut-equivalent on terminals T, then for any subset T′ of T, and any c′ ≤ c, G and H are also c′-cut-equivalent on T′.
This notion is also robust to the addition of edges.
Lemma 2.3.
If G and H are c-cut-equivalent, then for any additional set of edges F with endpoints in T, G ∪ F and H ∪ F are also c-cut-equivalent.
When used in the reverse direction, this lemma says that we can remove edges, as long as we include their endpoints as terminal vertices.
Corollary 2.4.
Let F be a set of edges in G with endpoints X, and let T be a set of terminals in G. If H is cut-equivalent to G \ F on terminals T ∪ X, then H ∪ F, which is H with F added, is cut-equivalent to G.
We complement this partition process by showing that sparsifiers on disconnected graphs can be built separately.
Lemma 2.5.
If H1 is cut-equivalent to G1, and H2 is cut-equivalent to G2, then the vertex-disjoint union of H1 and H2 is cut-equivalent to the vertex-disjoint union of G1 and G2.
2.4 Edge Reductions
Furthermore, we can restrict our attention to sparse graphs only [NI92].
Lemma 2.6.
Given any graph G on n vertices and any c, we can find in O(cm) time a graph H on the same vertices, but with at most c(n − 1) edges, such that G and H are c-cut-equivalent.
Proof.
Consider the following routine: repeat c iterations of finding a maximal spanning forest of G, removing it from G, and adding it to H.
Each of the c steps takes O(m) time, for a total of O(cm). Also, a maximal spanning forest has the property that for any nonempty cut, it contains at least one edge of the cut. Thus, for any cut ∂(S), the c iterations add at least
min(c, |∂_G(S)|)
edges to H, which means that up to a value of c, all cuts in G and H are the same. ∎
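The routine in this proof can be sketched directly. The implementation below is our illustration of the [NI92]-style reduction (variable names are ours): it greedily peels off c maximal spanning forests with union-find, so every cut retains min(c, its size) edges.

```python
def sparsify(n, edges, c):
    """Keep the union of c maximal spanning forests, each found greedily
    with union-find; all cuts of the input are preserved up to value c."""
    remaining = list(edges)
    kept = []
    for _ in range(c):
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        forest, rest = [], []
        for (u, v) in remaining:
            ru, rv = find(u), find(v)
            if ru != rv:           # edge joins two components: forest edge
                parent[ru] = rv
                forest.append((u, v))
            else:                  # edge stays for the next forest
                rest.append((u, v))
        kept.extend(forest)
        remaining = rest
    return kept  # at most c * (n - 1) edges
```

On the complete graph K5 with c = 2, this keeps at most 8 edges while every vertex retains degree at least 2, so all singleton cuts keep min(2, their size) edges.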
Note however that sparse is not the same as bounded degree: for a star graph, we cannot reduce the degree of its center vertex without changing connectivity.
2.5 Edge Containment of Terminal Cuts
Our construction of vertex sparsifiers utilizes an intermediate goal similar to the construction of cut sparsifiers by Molina and Sandlund [MS18]. Specifically, we want to find a subset of edges C so that any separation of T has a minimum cut using only the edges from C.
Definition 2.7.
In a graph G with terminals T, a subset of edges C is said to contain all c-cuts if for any split (T1, T2) with |mincut(T1, T2)| ≤ c, there is a subset of C of size |mincut(T1, T2)| which is a cut between T1 and T2.
Note that this is different from containing all the minimum cuts: on a length-n path with the two endpoints as terminals, any single intermediate edge contains a minimum terminal cut, but there are up to n different such minimum cuts.
Such containment sets are useful because we can form a vertex sparsifier by contracting the rest of the edges.
Lemma 2.8.
If G is a connected graph with terminals T, and C is a subset of edges that contains all c-cuts, then the graph
H = G / (E \ C)
is c-cut-equivalent to G, and has at most |C| + 1 vertices.
Proof.
Consider any cut using entirely edges in C: contracting edges from E \ C only brings together vertices on the same side of the cut. Therefore, the separation of vertices given by this cut also exists in H as well.
To bound the size of H, observe that contracting all edges of E brings G to a single vertex. That is, G/E is a single vertex: uncontracting an edge can increase the number of vertices by at most 1, so H = G/(E \ C) has at most |C| + 1 vertices. ∎
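The vertex bound in this proof is easy to watch in action. The sketch below is our own illustration of Lemma 2.8 (names ours): it contracts every edge outside C with union-find and returns the resulting vertex set and kept edges.

```python
def contract_complement(n, edges, C):
    """Sketch of Lemma 2.8: contract every edge not in C. For a connected
    input graph this leaves at most |C| + 1 vertices, since uncontracting
    one edge adds at most one vertex to the fully contracted point."""
    Cset = set(C) | {(v, u) for (u, v) in C}   # treat edges as undirected
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for (u, v) in edges:
        if (u, v) not in Cset:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
    vertices = {find(v) for v in range(n)}
    kept_edges = [(find(u), find(v)) for (u, v) in edges if (u, v) in Cset]
    return vertices, kept_edges
```

For a 4-vertex path with C consisting of the single middle edge, contraction leaves exactly |C| + 1 = 2 vertices joined by that edge.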
Lemma 2.9.
Let F be a set of edges in G with endpoints X, and let T be a set of terminals in G. If the edges C contain all c-cuts in G \ F with terminals T ∪ X, then C ∪ F contains all c-cuts in G.
Lemma 2.10.
If the edges C1 contain all c-cuts in G1, and the edges C2 contain all c-cuts in G2, then C1 ∪ C2 contains all the c-cuts in the vertex-disjoint union of G1 and G2.
3 High Level Outline
Our construction is based on repeatedly finding edges that intersect all c-cuts.
Definition 3.1.
In a graph G with terminals T, a subset of edges F intersects all c-cuts if for any split (T1, T2) with |mincut(T1, T2)| ≤ c, there exists a cut ∂(V1, V2) such that:

∂(V1, V2) has size |mincut(T1, T2)|,

∂(V1, V2) induces the same separation of T: T1 ⊆ V1 and T2 ⊆ V2,

∂(V1, V2) contains at most c − 1 edges from any connected component of G \ F.
We can reduce the problem of finding edges that contain all small cuts to the problem of finding edges that intersect all small cuts. This is done by first finding an intersecting set F, and then repeating on the (disconnected) graph with F removed, but with the endpoints of F included as terminals as well.
Lemma 3.2.
If in some graph G with terminals T, a subset of edges F intersects all c-cuts, then consider the set
T′ = T ∪ V(F),
that is, T with the endpoints of F added. If a subset C contains all c-cuts in the graph G \ F with terminals T′, then C ∪ F contains all c-cuts in G as well.
Proof.
Consider a partition (T1, T2) with |mincut(T1, T2)| ≤ c. Because F intersects all c-cuts, there is a cut of size |mincut(T1, T2)| separating T1 and T2 that has at most c − 1 edges in each connected component of G \ F.
Sections 4, 5, and 6 are devoted to showing the following bound for generating a set of edges that intersects all c-cuts.
Theorem 3.3.
For any parameter φ and any value c, for any graph G with terminals T, we can generate a set of edges that intersects all c-cuts:

with size at most in time.

with size at most in time.
Then the overall algorithm is simply to iterate this process until reaches , as is done in Figure 1.
Proof of Theorem 1.2.
Let be a constant such that part 1 of Theorem 3.3 gives us a set of edges intersecting all cuts of size at most in time. We show by induction that before processing in line 2 of Figure 1 that
and
We focus on the bound on , as the bound on is similar. The induction hypothesis holds for . By Part 1 of Theorem 3.3 we have the size of after processing is at most
Now the size of is at most twice this bound, as desired. Taking shows that the final size of is at most . For (which we can assume by Lemma 2.6), the final size of is at most
We then apply Lemma 2.8 to produce a graph with at most vertices that is cut-equivalent to the current graph, and repeat the process times. The number of vertices in the graphs we process decreases geometrically until they have at most vertices, as desired. ∎
4 Existence via Structural Theorem and Recursion
Our algorithm is based on a divide-and-conquer routine that removes a small cut and recurses on both sides. The divide-and-conquer relies on the following observation about when cuts are able to interact completely with both sides of a cut.
Lemma 4.1.
Let C = ∂(A, B) be a cut given by a partition V = A ⊔ B of G such that both G[A] and G[B] are connected, and let TA and TB be the partition of T induced by this cut. If FA intersects all terminal cuts in GA, the graph formed by contracting all of B into a single vertex b, and similarly FB intersects all terminal cuts in GB, then FA ∪ FB ∪ C intersects all cuts in G as well.
Proof.
Consider some cut of size at most .
If uses an edge from , then it has at most edges in , and thus in any connected component as well.
If has at most edges in , then because removing already disconnected and , and removing can only further disconnect things, no connected component in can have or more edges.
So the only remaining case is if the cut is entirely contained on one of the sides. Without loss of generality, assume it is entirely contained in G[A]. Because no edges from G[B] are removed and G[B] is connected, all of B must be on one side of the cut, and can therefore be represented by a single vertex b.
So using the induction hypothesis on the cut in with the terminal separation given by all of replaced by gives that has at most edges in any connected component of
Because connected components are unchanged under contracting connected subsets, we get that has at most edges in any connected components of as well. ∎
However, for such a partition to make progress, we also need at least two terminals to become contracted together when either side is contracted. Building this into the definition leads to our key definition of a nontrivial separating cut:
Definition 4.2.
A nontrivial separating cut is a separation of V into (V1, V2) such that:

the induced subgraphs on V1 and V2, G[V1] and G[V2], are both connected,

|V1 ∩ T| ≥ 2 and |V2 ∩ T| ≥ 2.
Such cuts are critical for partitioning and recursing on the two resulting pieces. Connectivity of G[V1] and G[V2] is necessary for applying Lemma 4.1, and the conditions |V1 ∩ T| ≥ 2 and |V2 ∩ T| ≥ 2 are necessary to ensure that making this cut and recursing makes progress.
We now study the set of graphs and terminals for which a nontrivial cut exists. Consider, for example, the case where G is a star graph (a single center vertex with every other vertex connected to it) and all vertices are terminals. In this graph, the side of the cut not containing the center can only have a single vertex, hence there are no nontrivial cuts.
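The star example can be verified mechanically. The brute-force checker below is our illustration of Definition 4.2 (not an algorithm from the paper): it searches over all bipartitions for one with both sides connected, at least two terminals on each side, and at most c crossing edges.

```python
from itertools import combinations

def has_nontrivial_separating_cut(n, edges, terminals, c):
    """Brute-force test of Definition 4.2: is there a bipartition (V1, V2)
    with both induced sides connected, at least two terminals on each side,
    and at most c crossing edges? Exponential time; for illustration only."""
    adj = {v: set() for v in range(n)}
    for (u, v) in edges:
        adj[u].add(v)
        adj[v].add(u)

    def connected(S):
        S = set(S)
        seen = {next(iter(S))}
        stack = list(seen)
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w in S and w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen == S

    T = set(terminals)
    for size in range(1, n):
        for V1 in combinations(range(n), size):
            side1 = set(V1)
            side2 = set(range(n)) - side1
            crossing = sum(1 for (u, v) in edges if (u in side1) != (v in side1))
            if (crossing <= c and len(T & side1) >= 2 and len(T & side2) >= 2
                    and connected(side1) and connected(side2)):
                return True
    return False
```

A star with all vertices as terminals admits no nontrivial separating cut, while a 4-cycle with all vertices as terminals does.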
We can, in fact, prove the converse: if no such nontrivial separation exists, we can terminate by only considering the separations of T formed with one terminal on one of the sides. We define these cuts to be isolating cuts.
Definition 4.3.
For a graph G with terminal set T and some t ∈ T, a t-isolating cut is a split (V1, V2) of the vertices such that t is the only terminal in V1, i.e. V1 ∩ T = {t} and T \ {t} ⊆ V2.
Lemma 4.4.
If T is a subset of at least two terminals in an undirected graph G such that there does not exist a nontrivial separating cut of size at most c, then
C contains all c-cuts of G. Here, C is the union of all t-isolating cuts of size at most c, over all t ∈ T.
Proof.
Consider a graph with no nontrivial separating cut of size at most , but there is a partition of , , such that the minimum cut between and , and , has at most edges, and
Let be one such cut, and consider the graph
that is, we contract all edges except the ones on this cut. Note that has at least vertices.
Consider a spanning tree of . By minimality of , each node of must contain at least one terminal. Otherwise, we can keep one edge from such a node without affecting the distribution of terminal vertices.
We now show that no vertex of can contain terminals. If has exactly two vertices, then one vertex must correspond to and one must correspond to , so no vertex has terminals. If has at least vertices, then because every vertex contains at least one terminal, no vertex in can contain vertices.
Also, each leaf of can contain at most one terminal, otherwise deleting the edge adjacent to that leaf forms a nontrivial cut.
Now consider any non-leaf node of the tree. As it is a non-leaf node, it has at least two different neighbors that lead to leaf vertices.
Reroot this tree at , and consider some neighbor of , . If the subtree rooted at has more than terminals, then cutting the edge results in two components, each containing at least two terminals (the component including has at least one other neighbor that contains a terminal). Thus, the subtree rooted at can contain at most one terminal, and must therefore be a singleton leaf.
Hence, the only possible structure of is a star centered at (which may contain multiple terminals), with each leaf having exactly one terminal in it. This in turn implies that also must be a star, i.e. has the same edges as but possibly with multi-edges. This is because any edge between two leaves of a star forms a connected cut by disconnecting those vertices from .
By minimality, each cut separating the root from a leaf is a minimal cut for that single terminal, and these cuts are disjoint. Thus taking the union of the edges of all these singleton cuts gives a cut that splits the same way, and has the same size. ∎
Note that Lemma 4.4 is not claiming all the cuts of are singletons. Instead, it says that any cut can be formed from a union of single terminal cuts.
Combining Lemmas 4.1 and 4.4, we obtain the recursive algorithm in Figure 2, which demonstrates the existence of small cut-intersecting subsets. If there is a nontrivial separating cut, the algorithm in Line 3 finds it and recurses on both sides of the cut using Lemma 4.1. Otherwise, by Lemma 4.4, the union of the isolating cuts of size at most c contains all c-cuts, so the algorithm keeps the edges of those cuts in Line 4a.
Lemma 4.5.
RecursiveNonTrivialCuts, as shown in Figure 2, correctly returns a set of cut-intersecting edges of size at most .
Proof.
Correctness can be argued by induction. The base case, where we terminate by adding all min-cuts with one terminal on one side, follows from Lemma 4.4, while the inductive case follows from applying Lemma 4.1.
It remains to bound the size of the returned set. Once again there are two cases: in the case where we terminate with the union of singleton cuts, each such cut has size at most c, for a total of at most c · |T| edges.
For the recursive case, the recursion can be viewed as splitting terminals into two instances of sizes and where and . Note that the total values of across all the recursion instances is strictly decreasing, and is always positive. So the recursion can branch at most times, which gives that the total number of edges added is at most . ∎
5 Polytime Construction
While the previous algorithm in Section 4 gives our best bound on sparsifier size, it is not clear to us how it could be implemented in polynomial time. While we do give a more efficient implementation of it below in Section 6, the running time of that algorithm still has a term (as stated in Theorem 1.2 Part 1). In this section, we give a more efficient algorithm that returns sparsifiers of larger size, but ultimately leads to the faster running time given in Theorem 1.2 Part 2. It was derived by working backwards from the termination condition of taking all the cuts with one terminal on one side in Lemma 4.4.
Recall that a Steiner cut is a cut with at least one terminal on both sides. The algorithm has the same high-level recursive structure, but it instead only finds the minimum Steiner cut, or certifies that its size is greater than c. This takes time using an algorithm by Cole and Hariharan [CH03].
It is straightforward to check that both sides of a minimum Steiner cut are connected. This is important towards our goal of finding a nontrivial separating cut, as defined in Definition 4.2.
Lemma 5.1.
If ∂(V1, V2) is the global minimum separating cut in a connected graph G, then both G[V1] and G[V2] must be connected.
Proof.
Suppose for the sake of contradiction that is disconnected. That is, , there are no edges between and .
Without loss of generality assume contains a terminal. Also, contains at least one terminal because is separating.
Then because is connected, there is an edge between and . Then the cut has strictly fewer edges crossing, and also terminals on both sides, a contradiction to being the minimum separating cut. ∎
So the only bad case that prevents us from recursing is the case where the minimum Steiner cut has a single terminal on some side. That is, one of the isolating cuts from Definition 4.3 is also a minimum Steiner cut.
We can handle this case through an extension of Lemma 4.1. Specifically, we show that for a cut with both sides connected, we can contract a side of the cut along with the cut edges before recursing.
Lemma 5.2.
Let C = ∂(A, B) be a cut given by a partition V = A ⊔ B of G such that both G[A] and G[B] are connected, and let TA and TB be the partition of T induced by this cut. If FA intersects all terminal cuts in GA, the graph formed by contracting all of B and all edges in C into a single vertex b, and similarly FB intersects all terminal cuts in GB, then FA ∪ FB ∪ C intersects all cuts in G as well.
Proof.
Consider some cut of size at most .
If uses an edge from , then it has at most edges in , and thus in any connected component as well.
If has at most edges in , then because removing already disconnected and , and removing can only further disconnect things, no connected component in can have or more edges.
The only remaining case is if is entirely contained on one of the sides. Without loss of generality assume is entirely contained in , i.e. . Because no edges from and are removed and is connected, all edges in and must not be cut and hence can be contracted into a single vertex .
So using the induction hypothesis on the cut in with the terminal separation given by all of replaced by gives that has at most edges in any connected component of
Because connected components are unchanged under contracting connected subsets, we get that has at most edges in any connected components of as well. ∎
Now, a natural way to handle the case where a minimum Steiner cut has a single terminal on some side is to use Lemma 5.2 to contract across the cut to make progress. However, it may be the case that for some terminal t, there are many minimum isolating cuts: consider for example a long path with only the endpoints as terminals. If we always pick the edge closest to t as the minimum isolating cut, we may have to continue for as many rounds as there are edges, and thus add all edges to our set of intersecting edges.
To remedy this, we instead pick a “maximal” isolating minimum cut. One way to find such a cut is to repeatedly contract across an isolating minimum cut using Lemma 5.2 until its size increases. At that point, we add the last set of edges found in the cut to the set of intersecting edges. We have made progress because the value of the minimum isolating cut in the contracted graph must have increased by at least 1. While there are many ways to find a maximal isolating minimum cut, the way described here extends to our analysis in Section 6.2.
Pseudocode of this algorithm is shown in Figure 3, and the procedure for the repeated contractions to find a maximal isolating cut described in the above paragraph is in Line 3d.
Discussion of algorithm in Figure 3.
We clarify some lines in the algorithm of Figure 3. If the algorithm finds a nontrivial separating cut as the Steiner minimum cut, it returns the result of the recursion in Line 3(c)i, and does not execute any of the later lines in the algorithm. In Line 3(d)ii, in addition to checking that the isolating minimum cut size is unchanged, we also must check that the isolated terminal does not get contracted with another terminal; otherwise, contracting across that cut makes global progress by reducing the number of terminals by 1. In Line 3iiC, note that we can still view the contracted vertex as a terminal, as we have assumed that this contraction does not merge it with any other terminals.
Lemma 5.3.
For any graph G, terminals T, and cut value c, Algorithm RecursiveSteinerCuts, as shown in Figure 3, runs in time and returns a set of at most edges that intersect all c-cuts.
Proof.
We assume throughout that the graph is sparse, as we can reduce to this case by Lemma 2.6.
Note that the recursion in Line 3(c)i can only branch times, by the analysis in Lemma 4.5. Similarly, the case where gets contracted with another terminal in Line 3(d)ii can only occur times.
Therefore, we only create distinct terminals throughout the algorithm. Let be a terminal created at some point during the algorithm. By monotonicity of cuts in Lemma 2.1, the minimum isolating cut can only increase in size times, hence is the union of cuts of size at most . Therefore, has at most edges.
To bound the runtime, we use the total number of edges in the graphs in our recursive algorithm as a potential function. This potential function starts at . Note that the recursion of Line 3(c)i can increase the potential function by , hence the total potential function increase throughout the algorithm is bounded by .
Each loop of Line 3(d)ii decreases our potential function by at least from contractions. Thus, the total runtime of the loop involving Line 3(d)ii can be bounded by
where the former term is from running a max-flow algorithm up to flow value c, and the latter is from an execution of the Steiner minimum cut algorithm of [CH03]. As the total potential function increase is at most , the loop in Line 3(d)ii can only execute times, for a total runtime of as desired. ∎
Our further speedup of this routine in Section 6 also uses a faster variant of RecursiveSteinerCuts as a base case, which is invoked when the graph is too small. Here the main observation is that a modification to the algorithm shown in Figure 3 can reduce the runtime.
Lemma 5.4.
For any graph G, terminals T, and cut value c, there is an algorithm that runs in time and returns a set of at most edges that intersect all c-cuts.
Proof.
We modify RecursiveSteinerCuts as shown in Figure 3 and its analysis as given in Lemma 5.3 above. Specifically, we modify how we compute a maximal isolating minimum cut in Line 3(d)ii. For any partition , by submodularity of cuts it is known that there is a unique maximal subset such that
Also, this maximal set can be computed by running the Ford–Fulkerson augmenting path algorithm, with one side as the source and the other as the sink. A connectivity value of at most c means that at most c augmenting paths need to be found, and the set can be taken from the vertices that can still reach the sink set in the residual graph [FH75]. Thus, with an appropriate choice of source and sink, we can use the corresponding computed set as the representative of the maximal isolating Steiner minimum cut.
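The residual-graph step can be sketched concretely. The code below is our own illustration (names ours) on a unit-capacity undirected graph with a single source and sink: after Ford–Fulkerson terminates, the vertices that can still reach the sink in the residual graph form the minimal sink side, so their complement is the unique maximal source side of a minimum cut.

```python
from collections import defaultdict, deque

def max_source_side_min_cut(n, edges, s, t):
    """Run Ford–Fulkerson augmenting paths on a unit-capacity undirected
    graph, then return the maximal source side of a minimum s-t cut: the
    complement of the vertices that can still reach t in the residual graph."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for (u, v) in edges:
        cap[(u, v)] += 1
        cap[(v, u)] += 1
        adj[u].add(v)
        adj[v].add(u)

    def augmenting_path():
        prev = {s: None}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            if u == t:                       # rebuild the s-t path
                path = []
                while prev[u] is not None:
                    path.append((prev[u], u))
                    u = prev[u]
                return path
            for v in adj[u]:
                if v not in prev and cap[(u, v)] > 0:
                    prev[v] = u
                    queue.append(v)
        return None

    while (path := augmenting_path()) is not None:
        for (u, v) in path:                  # push one unit of flow
            cap[(u, v)] -= 1
            cap[(v, u)] += 1

    # backwards BFS: u reaches t if some residual edge (u, v) leads to a
    # vertex v already known to reach t
    reach_t = {t}
    queue = deque([t])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in reach_t and cap[(u, v)] > 0:
                reach_t.add(u)
                queue.append(u)
    return set(range(n)) - reach_t
```

On a path graph, every single edge is a minimum cut, and this procedure returns the source side that pushes the cut as far toward the sink as possible.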
Now we analyze the runtime of this procedure. First, we reduce the number of edges in time using Lemma 2.6. As in the proof of Lemma 5.3, all graphs in the recursion have at most edges. The recursion in Line 3(c)i can only branch times, and we only need to compute maximal isolating Steiner minimum cuts throughout the algorithm. Each call to the Cole–Hariharan algorithm [CH03] requires time, for a total runtime of as desired. ∎
6 NearlyLinear Time Constructions Using Expanders
We now turn our attention to efficiently finding these vertex sparsifiers. Here we utilize insights from recent results on finding vertex cuts [NSY19a, NSY19b, FY19], namely that in a well-connected graph, any cut of size at most c must have a very small side. This notion of connectivity is formalized through graph conductance.
Definition 6.1.
In an undirected unweighted graph G, denote the volume of a subset of vertices S, written vol(S), as the total degree of its vertices. The conductance of a cut ∂(S) is then
Φ(S) = |∂(S)| / min(vol(S), vol(V \ S)),
and the conductance of a graph is the minimum conductance over nonempty proper subsets of vertices:
Φ(G) = min_{∅ ⊂ S ⊂ V} Φ(S).
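As a small worked example of this definition (our own illustration, not code from the paper), the conductance of a single cut can be computed directly from degrees and boundary edges:

```python
def conductance(n, edges, S):
    """Conductance of the cut given by vertex set S: the number of boundary
    edges divided by the smaller of the two side volumes (sums of degrees)."""
    deg = [0] * n
    for (u, v) in edges:
        deg[u] += 1
        deg[v] += 1
    S = set(S)
    boundary = sum(1 for (u, v) in edges if (u in S) != (v in S))
    vol_S = sum(deg[v] for v in S)
    vol_rest = sum(deg) - vol_S
    return boundary / min(vol_S, vol_rest)
```

On a 4-cycle, splitting off two adjacent vertices gives 2 boundary edges against volume 4 on each side, for conductance 1/2.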
The ability to remove edges and add terminals means we can use expander decomposition to reduce to the case where the graph has high conductance. Here we utilize expander decompositions, as stated by Saranurak and Wang [SW19]:
Lemma 6.2.
(Theorem 1.2. of [SW19], Version 2 https://arxiv.org/pdf/1812.08958v2.pdf) There exists an algorithm ExpanderDecompose that for any undirected unweighted graph and any parameter , decomposes in time into pieces of conductance at least so that at most edges are between the pieces.
Note that if a graph has conductance at least φ, then any cut ∂(S) of size at most c must have

min(vol(S), vol(V \ S)) ≤ c / φ, (1)

which follows directly from rearranging the definition of conductance.
Algorithmically, we can further leverage this in two ways, both of which are directly motivated by recent works on vertex connectivity [NSY19a, FY19, NSY19b].
6.1 Enumeration of All Small Cuts by their Smaller Sides
In a graph with conductance φ, we can enumerate all cuts of size at most c in time exponential in c and 1/φ.
Lemma 6.3.
In a graph with conductance φ, we can enumerate all cuts of size at most c with connected smaller side in time .
Proof.
We first enumerate over all starting vertices. For a starting vertex , we repeatedly perform the following process.

perform a DFS from v until it reaches more than c/φ vertices.

Pick one of the edges among the reached vertices as a cut edge.

Remove that edge, and recursively start another DFS starting at .
After we have done this process at most c times, we check whether the removed edges form a valid cut, and store it if so.
By Equation 1, the smaller side of such a cut can involve at most c/φ vertices. Consider such a cut with S as the smaller side. Then if we picked some vertex v in S as the starting point, the DFS tree rooted at v must contain some edge of the cut at some point. Performing an induction with this edge removed then gives that the DFS starting from v will find this cut.
Because there can be at most different edges picked among the vertices reached, the total work performed in the layers of recursion is . ∎
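To make the object being enumerated concrete, the following brute-force baseline (our own sketch, deliberately exponential in the graph size rather than the branching DFS procedure of this proof) lists all cuts of size at most c whose smaller side is connected, indexed by that side:

```python
from itertools import combinations

def small_cuts_bruteforce(n, edges, c):
    """Exponential-time baseline for Lemma 6.3's output: every cut of size
    at most c with a connected smaller side, returned as (side, boundary)
    pairs. Illustration only; the DFS branching procedure finds the same
    cuts in time exponential only in the cut size and 1/conductance."""
    adj = {v: set() for v in range(n)}
    for (u, v) in edges:
        adj[u].add(v)
        adj[v].add(u)

    def connected(S):
        S = set(S)
        seen = {next(iter(S))}
        stack = list(seen)
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w in S and w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen == S

    cuts = []
    for size in range(1, n // 2 + 1):        # smaller side only
        for S in combinations(range(n), size):
            boundary = [(u, v) for (u, v) in edges
                        if (u in S) != (v in S)]
            if len(boundary) <= c and connected(S):
                cuts.append((set(S), boundary))
    return cuts
```

On a 4-cycle with c = 2, the enumerated sides are the four singletons and the four pairs of adjacent vertices, each with exactly two boundary edges.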
Furthermore, it suffices to enumerate all such cuts once at the start, and reuse them as we perform contractions.
Lemma 6.4.
If F is a set of edges that form a cut in G/E′, that is, in G with a subset of edges E′ contracted, then F is also a cut in G.
Note that this lemma also implies that an expander stays so under contractions. So we do not even need to repartition the graph as we recurse.
Proof of Theorem 3.3 Part 1.
First, we perform expander decomposition, remove the inter-cluster edges, and add their endpoints as terminals.
Now, we describe the modifications to GetIntersectingEdgesSlow that make it efficient.
At the start of each recursive call, we enumerate all cuts of size at most c, and store the vertices on the smaller side, which by Equation 1 has at most c/φ vertices. When such a cut is found, we only invoke recursion on the smaller side (in terms of volume). For the larger piece, we can continue using the original set of cuts found during the search.
To use a cut from a pre-contracted state, we need to:

check if all of its edges remain (using a union-find data structure).

check if both portions of the graph remain connected upon removal of this cut – this can be done by explicitly checking the smaller side, and certifying the bigger side using a dynamic connectivity data structure by removing all edges from the smaller side.
Since we contract each edge at most once, the total work done over all the larger side is at most
where we have included the logarithmic factors from using the dynamic connectivity data structure. Furthermore, the fact that we only recurse on pieces with half as many edges ensures that each edge participates in the cut enumeration process at most O(log m) times. Combining these then gives the overall running time. ∎
6.2 Using Local Cut Algorithms
A more recent development is local cut algorithms, which for a vertex v can determine whether there is a cut of size at most c such that the side containing v has volume at most ν. The runtime is linear in ν and c.
Theorem 6.5 (Theorem 3.1 of [NSY19b]).
Let G be a graph and let v be a vertex. For a connectivity parameter c and volume parameter ν, there is an algorithm that with high probability either

Certifies that there is no cut of size at most c such that the side containing v has volume at most ν.

Returns a cut of size at most c such that the side containing v has volume at most ν. It runs in time linear in ν and c.
Let v be a vertex. We now formalize the notion of the smallest cut that is local around v.
Definition 6.6 (Local cuts).
For a vertex define to be
We now combine Theorem 6.5 with the observation from Equation 1 in order to control the volume of the smaller side of the cut in an expander.
Lemma 6.7.
Let be a graph with conductance at most , and let be a set of terminals. If then for any vertex we can with high probability in time either compute or certify that