Component (graph theory)
From Wikipedia, the free encyclopedia

A graph with three components

In graph theory, a component of an undirected graph is a connected subgraph that is not part of any larger connected subgraph. The components of any graph partition its vertices into disjoint sets, and are the induced subgraphs of those sets. A graph that is itself connected has exactly one component, consisting of the whole graph. Components are sometimes called connected components.

The number of components in a given graph is an important graph invariant, and is closely related to invariants of matroids, topological spaces, and matrices. In random graphs, a frequently occurring phenomenon is the existence of a giant component, one component that is significantly larger than the others, and of a percolation threshold, an edge probability above which a giant component exists and below which it does not.

The components of a graph can be constructed in linear time, and a special case of the problem, connected-component labeling, is a basic technique in image analysis. Dynamic connectivity algorithms maintain components as edges are inserted or deleted in a graph, in low time per change. In computational complexity theory, connected components have been used to study algorithms with limited space complexity, and sublinear time algorithms can accurately estimate the number of components.

Definitions and examples

A cluster graph with seven components

A component of a given undirected graph may be defined as a connected subgraph that is not part of any larger connected subgraph. For instance, the graph shown in the first illustration has three components. Every vertex v of a graph belongs to one of the graph's components, which may be found as the induced subgraph of the set of vertices reachable from v.[1] Every graph is the disjoint union of its components.[2] Additional examples include the following special cases:

  • In an empty graph, each vertex forms a component with one vertex and zero edges.
  • In a connected graph, there is exactly one component: the whole graph.
  • In a forest, every component is a tree.
  • In a cluster graph, every component is a maximal clique.

Another definition of components involves the equivalence classes of an equivalence relation defined on the graph's vertices. In an undirected graph, a vertex v is reachable from a vertex u if there is a path from u to v, or equivalently a walk (a path allowing repeated vertices and edges). Reachability is an equivalence relation, since:

  • It is reflexive: There is a trivial path of length zero from any vertex to itself.
  • It is symmetric: If there is a path from u to v, the same edges in the reverse order form a path from v to u.
  • It is transitive: If there is a path from u to v and a path from v to w, the two paths may be concatenated together to form a walk from u to w.

The equivalence classes of this relation partition the vertices of the graph into disjoint sets, subsets of vertices that are all reachable from each other, with no additional reachable pairs outside of any of these subsets. Each vertex belongs to exactly one equivalence class. The components are then the induced subgraphs formed by each of these equivalence classes.[7] Alternatively, some sources define components as the sets of vertices rather than as the subgraphs they induce.[8]

Similar definitions involving equivalence classes have been used to define components for other forms of graph connectivity, including the weak components[9] and strongly connected components of directed graphs[10] and the biconnected components of undirected graphs.[11]

Number of components


The number of components of a given finite graph can be used to count the number of edges in its spanning forests: In a graph with n vertices and c components, every spanning forest will have exactly n − c edges. This number, n − c, is the matroid-theoretic rank of the graph, and the rank of its graphic matroid. The rank of the dual cographic matroid equals the circuit rank of the graph, the minimum number of edges that must be removed from the graph to break all its cycles. In a graph with m edges, n vertices, and c components, the circuit rank is m − n + c.[12]
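
As a worked illustration of these formulas (the numbers are a hypothetical example, not taken from the sources cited above), consider a graph with 10 vertices, 12 edges, and 3 components; the short Python sketch below computes its spanning-forest size and circuit rank.

    # Hypothetical example graph: n = 10 vertices, m = 12 edges, c = 3 components.
    n, m, c = 10, 12, 3
    spanning_forest_edges = n - c   # matroid-theoretic rank: 10 - 3 = 7
    circuit_rank = m - n + c        # rank of the dual cographic matroid: 12 - 10 + 3 = 5
    print(spanning_forest_edges, circuit_rank)  # 7 5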

A graph can be interpreted as a topological space in multiple ways, for instance by placing its vertices as points in general position in three-dimensional Euclidean space and representing its edges as line segments between those points.[13] The components of a graph can be generalized through these interpretations as the topological connected components of the corresponding space; these are equivalence classes of points that cannot be separated by pairs of disjoint closed sets. Just as the number of connected components of a topological space is an important topological invariant, the zeroth Betti number, the number of components of a graph is an important graph invariant, and in topological graph theory it can be interpreted as the zeroth Betti number of the graph.[3]

The number of components arises in other ways in graph theory as well. In algebraic graph theory it equals the multiplicity of 0 as an eigenvalue of the Laplacian matrix of a finite graph.[14] It is also the index of the first nonzero coefficient of the chromatic polynomial of the graph, and the chromatic polynomial of the whole graph can be obtained as the product of the polynomials of its components.[15] Numbers of components play a key role in Tutte's theorem on perfect matchings characterizing finite graphs that have perfect matchings[16] and the associated Tutte–Berge formula for the size of a maximum matching,[17] and in the definition of graph toughness.[18]
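
The connection between the number of components and the multiplicity of the zero eigenvalue of the Laplacian matrix can be checked directly on a small example. The following sketch assumes the NumPy library; the five-vertex graph used here (a triangle plus a single edge) is a made-up example.

    # Count components of a small graph as the multiplicity of the eigenvalue 0
    # of its Laplacian matrix (degree matrix minus adjacency matrix).
    import numpy as np

    # Adjacency matrix of a 5-vertex graph with two components:
    # a triangle {0, 1, 2} and a single edge {3, 4}.
    A = np.array([
        [0, 1, 1, 0, 0],
        [1, 0, 1, 0, 0],
        [1, 1, 0, 0, 0],
        [0, 0, 0, 0, 1],
        [0, 0, 0, 1, 0],
    ], dtype=float)

    L = np.diag(A.sum(axis=1)) - A       # graph Laplacian
    eigenvalues = np.linalg.eigvalsh(L)  # L is symmetric, so eigvalsh applies
    num_components = int(np.sum(np.isclose(eigenvalues, 0.0)))
    print(num_components)                # 2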

Algorithms


It is straightforward to compute the components of a finite graph in linear time (in terms of the numbers of vertices and edges of the graph) using either breadth-first search or depth-first search. In either case, a search that begins at some particular vertex v will find the entire component containing v (and no more) before returning. All components of a graph can be found by looping through its vertices, starting a new breadth-first or depth-first search whenever the loop reaches a vertex that has not already been included in a previously found component. Hopcroft & Tarjan (1973) describe essentially this algorithm, and state that it was already "well known".[19]
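
A minimal sketch of this approach in Python follows; the adjacency-list representation and vertex labels are illustrative choices rather than part of any cited algorithm. Each breadth-first search visits exactly one component, so the whole loop runs in linear time.

    from collections import deque

    def connected_components(graph):
        """Return the components of an undirected graph given as an adjacency-list dict."""
        seen = set()
        components = []
        for start in graph:
            if start in seen:
                continue
            # Breadth-first search finds exactly the component containing `start`.
            component = {start}
            queue = deque([start])
            seen.add(start)
            while queue:
                v = queue.popleft()
                for w in graph[v]:
                    if w not in seen:
                        seen.add(w)
                        component.add(w)
                        queue.append(w)
            components.append(component)
        return components

    # Example with three components, in the spirit of the first illustration.
    graph = {1: [2], 2: [1, 3], 3: [2], 4: [5], 5: [4], 6: []}
    print(connected_components(graph))  # [{1, 2, 3}, {4, 5}, {6}]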

Connected-component labeling, a basic technique in computer image analysis, involves the construction of a graph from the image and component analysis on the graph. The vertices are the subset of the pixels of the image, chosen as being of interest or as likely to be part of depicted objects. Edges connect adjacent pixels, with adjacency defined either orthogonally according to the Von Neumann neighborhood, or both orthogonally and diagonally according to the Moore neighborhood. Identifying the connected components of this graph allows additional processing to find more structure in those parts of the image or identify what kind of object is depicted. Researchers have developed component-finding algorithms specialized for this type of graph, allowing it to be processed in pixel order rather than in the more scattered order that would be generated by breadth-first or depth-first searching. This can be useful in situations where sequential access to the pixels is more efficient than random access, either because the image is represented in a hierarchical way that does not permit fast random access or because sequential access produces better memory access patterns.[20]
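
The following sketch illustrates the idea on a small binary image, using the four-pixel Von Neumann neighborhood as the adjacency relation. It simply flood-fills each component with breadth-first search; the specialized raster-order algorithms mentioned above produce the same labeling with more favorable access patterns, and the example image is made up for illustration.

    from collections import deque

    def label_components(image):
        """Label the connected components of foreground pixels in a 2D binary image."""
        rows, cols = len(image), len(image[0])
        labels = [[0] * cols for _ in range(rows)]   # 0 means background
        next_label = 0
        for r in range(rows):
            for c in range(cols):
                if image[r][c] and not labels[r][c]:
                    next_label += 1
                    labels[r][c] = next_label
                    queue = deque([(r, c)])
                    while queue:
                        y, x = queue.popleft()
                        # Von Neumann neighborhood: up, down, left, right.
                        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and image[ny][nx] and not labels[ny][nx]):
                                labels[ny][nx] = next_label
                                queue.append((ny, nx))
        return labels

    image = [[1, 1, 0, 0],
             [0, 1, 0, 1],
             [0, 0, 0, 1]]
    print(label_components(image))  # [[1, 1, 0, 0], [0, 1, 0, 2], [0, 0, 0, 2]]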

There are also efficient algorithms to dynamically track the components of a graph as vertices and edges are added, by using a disjoint-set data structure to keep track of the partition of the vertices into equivalence classes, replacing any two classes by their union when an edge connecting them is added. These algorithms take amortized time O(α(n)) per operation, where adding vertices and edges and determining the component in which a vertex falls are both operations, and α(n) is a very slowly growing inverse of the very quickly growing Ackermann function.[21] One application of this sort of incremental connectivity algorithm is in Kruskal's algorithm for minimum spanning trees, which adds edges to a graph in sorted order by length and includes an edge in the minimum spanning tree only when it connects two different components of the previously-added subgraph.[22] When both edge insertions and edge deletions are allowed, dynamic connectivity algorithms can still maintain the same information, in amortized polylogarithmic time per change and per connectivity query,[23] or in near-logarithmic randomized expected time.[24]
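
A minimal disjoint-set sketch in Python appears below, with path compression and union by size; the vertex labels and edge list are illustrative. The final loop shows the Kruskal-style use: an edge is kept only when it joins two different components.

    class DisjointSet:
        def __init__(self, vertices):
            self.parent = {v: v for v in vertices}
            self.size = {v: 1 for v in vertices}

        def find(self, v):
            # Find the root of v's class, compressing the path along the way.
            root = v
            while self.parent[root] != root:
                root = self.parent[root]
            while self.parent[v] != root:
                self.parent[v], v = root, self.parent[v]
            return root

        def union(self, u, v):
            """Merge the classes of u and v; return False if they were already merged."""
            ru, rv = self.find(u), self.find(v)
            if ru == rv:
                return False
            if self.size[ru] < self.size[rv]:
                ru, rv = rv, ru            # union by size: attach smaller under larger
            self.parent[rv] = ru
            self.size[ru] += self.size[rv]
            return True

    ds = DisjointSet([1, 2, 3, 4])
    for edge in [(1, 2), (3, 4), (2, 1), (2, 3)]:
        print(edge, "kept" if ds.union(*edge) else "skipped")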

Components of graphs have been used in computational complexity theory to study the power of Turing machines that have a working memory limited to a logarithmic number of bits, with the much larger input accessible only through read access rather than being modifiable. The problems that can be solved by machines limited in this way define the complexity class L. It was unclear for many years whether connected components could be found in this model, when formalized as a decision problem of testing whether two vertices belong to the same component, and in 1982 a related complexity class, SL, was defined to include this connectivity problem and any other problem equivalent to it under logarithmic-space reductions.[25] It was finally proven in 2008 that this connectivity problem can be solved in logarithmic space, and therefore that SL = L.[26]

In a graph represented as an adjacency list, with random access to its vertices, it is possible to estimate the number of connected components, with constant probability of obtaining an additive (absolute) error of at most εn, in sublinear time that depends only on ε and not on the size of the graph.[27]
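
One common style of such an estimator (a sketch of the general sampling approach, not necessarily the exact algorithm of the cited reference) uses the identity that the number of components equals the sum, over all vertices v, of 1 divided by the size of v's component. Sampling a few vertices and truncating each exploration keeps the running time independent of the graph size.

    import random
    from collections import deque

    def estimate_components(graph, eps, samples):
        """Estimate the number of components of an adjacency-list graph by sampling."""
        n = len(graph)
        cap = max(1, int(2 / eps))      # explore at most this many vertices per sample
        vertices = list(graph)
        total = 0.0
        for _ in range(samples):
            start = random.choice(vertices)
            seen = {start}
            queue = deque([start])
            while queue and len(seen) < cap:
                v = queue.popleft()
                for w in graph[v]:
                    if w not in seen:
                        seen.add(w)
                        queue.append(w)
            total += 1.0 / min(len(seen), cap)
        return n * total / samples

    graph = {1: [2], 2: [1], 3: [], 4: [5], 5: [4, 6], 6: [5]}
    print(estimate_components(graph, eps=0.5, samples=200))  # close to 3, the true count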

In random graphs

An Erdős–Rényi–Gilbert random graph with 1000 vertices at edge probability 1/1000 (in the critical range), showing a large component and many small ones

In random graphs the sizes of components are given by a random variable, which, in turn, depends on the specific model of how random graphs are chosen. In the G(n, p) version of the Erdős–Rényi–Gilbert model, a graph on n vertices is generated by choosing randomly and independently for each pair of vertices whether to include an edge connecting that pair, with probability p of including an edge and probability 1 − p of leaving those two vertices without an edge connecting them.[28] The connectivity of this model depends on p, and there are three different ranges of p with very different behavior from each other. In the analysis below, all outcomes occur with high probability, meaning that the probability of the outcome is arbitrarily close to one for sufficiently large values of n. The analysis depends on a parameter ε, a positive constant independent of n that can be arbitrarily close to zero.
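
The behavior described in the three ranges below can be observed with a small simulation. The following sketch (pure Python; the vertex count and edge probabilities are illustrative choices) generates graphs from this model and reports the size of the largest component for a probability in each range.

    import random
    from collections import deque

    def largest_component_size(n, p):
        """Generate a random graph on n vertices with edge probability p; return its largest component size."""
        graph = {v: [] for v in range(n)}
        for u in range(n):
            for v in range(u + 1, n):
                if random.random() < p:   # each possible edge is included independently
                    graph[u].append(v)
                    graph[v].append(u)
        seen, best = set(), 0
        for start in range(n):
            if start in seen:
                continue
            queue, size = deque([start]), 1
            seen.add(start)
            while queue:
                v = queue.popleft()
                for w in graph[v]:
                    if w not in seen:
                        seen.add(w)
                        size += 1
                        queue.append(w)
            best = max(best, size)
        return best

    n = 1000
    for p in (0.5 / n, 1.0 / n, 1.5 / n):   # subcritical, critical, supercritical
        print(p, largest_component_size(n, p))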

Subcritical p < (1 − ε)/n
In this range of p, all components are simple and very small. The largest component has logarithmic size. The graph is a pseudoforest. Most of its components are trees: the number of vertices in components that have cycles grows more slowly than any unbounded function of the number of vertices. Every tree of fixed size occurs linearly many times.[29]
Critical p = 1/n
The largest connected component has a number of vertices proportional to n^{2/3}. There may exist several other large components; however, the total number of vertices in non-tree components is again proportional to n^{2/3}.[30]
Supercritical p > (1 + ε)/n
There is a single giant component containing a linear number of vertices. For large values of pn its size approaches the whole graph: the giant component contains approximately yn vertices, where y is the positive solution to the equation e^{−pny} = 1 − y. The remaining components are small, with logarithmic size.[31]

In the same model of random graphs, there will exist multiple connected components with high probability for values of p below a significantly higher threshold, p < (1 − ε)(ln n)/n, and a single connected component for values above the threshold, p > (1 + ε)(ln n)/n. This phenomenon is closely related to the coupon collector's problem: in order to be connected, a random graph needs enough edges for each vertex to be incident to at least one edge. More precisely, if random edges are added one by one to a graph, then with high probability the first edge whose addition connects the whole graph touches the last isolated vertex.[32]
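
This coupon-collector behavior can also be checked experimentally. The sketch below (pure Python, with an illustrative number of vertices and trials) adds the possible edges in a random order and tests whether the step at which the last isolated vertex disappears is the same step at which the graph becomes connected.

    import random

    def thresholds_coincide(n):
        """Add random edges one by one; report whether connectivity arrives with the last isolated vertex."""
        parent = list(range(n))

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v

        edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
        random.shuffle(edges)
        degree = [0] * n
        isolated, components = n, n
        step_no_isolated = step_connected = None
        for step, (u, v) in enumerate(edges):
            for w in (u, v):
                if degree[w] == 0:
                    isolated -= 1
                degree[w] += 1
            if isolated == 0 and step_no_isolated is None:
                step_no_isolated = step
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                components -= 1
            if components == 1:
                step_connected = step
                break
        return step_no_isolated == step_connected

    trials = 20
    print(sum(thresholds_coincide(200) for _ in range(trials)), "of", trials, "trials coincide")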

For different models including the random subgraphs of grid graphs, the connected components are described by percolation theory. A key question in this theory is the existence of a percolation threshold, a critical probability above which a giant component (or infinite component) exists and below which it does not.[33]

References

  1. ^ Clark, John; Holton, Derek Allan (1995), A First Look at Graph Theory, Allied Publishers, p. 28, ISBN 9788170234630, archived from the original on 2025-08-06, retrieved 2025-08-06
  2. ^ Joyner, David; Nguyen, Minh Van; Phillips, David (May 10, 2013), "1.6.1 Union, intersection, and join", Algorithmic Graph Theory and Sage (0.8-r1991 ed.), Google, pp. 34–35, archived from the original on January 16, 2016, retrieved January 8, 2022
  3. ^ a b Tutte, W. T. (1984), Graph Theory, Encyclopedia of Mathematics and its Applications, vol. 21, Reading, Massachusetts: Addison-Wesley, p. 15, ISBN 0-201-13520-5, MR 0746795, archived from the original on 2025-08-06, retrieved 2025-08-06
  4. ^ a b Thulasiraman, K.; Swamy, M. N. S. (2011), Graphs: Theory and Algorithms, John Wiley & Sons, p. 9, ISBN 978-1-118-03025-7, archived from the original on 2025-08-06, retrieved 2025-08-06
  5. ^ Bollobás, Béla (1998), Modern Graph Theory, Graduate Texts in Mathematics, vol. 184, New York: Springer-Verlag, p. 6, doi:10.1007/978-1-4612-0619-4, ISBN 0-387-98488-7, MR 1633290, archived from the original on 2025-08-06, retrieved 2025-08-06
  6. ^ McColl, W. F.; Noshita, K. (1986), "On the number of edges in the transitive closure of a graph", Discrete Applied Mathematics, 15 (1): 67–73, doi:10.1016/0166-218X(86)90020-X, MR 0856101
  7. ^ Foldes, Stephan (2011), Fundamental Structures of Algebra and Discrete Mathematics, John Wiley & Sons, p. 199, ISBN 978-1-118-03143-8, archived from the original on 2025-08-06, retrieved 2025-08-06
  8. ^ Siek, Jeremy; Lee, Lie-Quan; Lumsdaine, Andrew (2001), "7.1 Connected components: Definitions", The Boost Graph Library: User Guide and Reference Manual, Addison-Wesley, pp. 97–98
  9. ^ Knuth, Donald E. (January 15, 2022), "Weak components", The Art of Computer Programming, Volume 4, Pre-Fascicle 12A: Components and Traversal (PDF), pp. 11–14, archived (PDF) from the original on January 18, 2022, retrieved March 1, 2022
  10. ^ Lewis, Harry; Zax, Rachel (2019), Essential Discrete Mathematics for Computer Science, Princeton University Press, p. 145, ISBN 978-0-691-19061-7, archived from the original on 2025-08-06, retrieved 2025-08-06
  11. ^ Kozen, Dexter C. (1992), "4.1 Biconnected components", The Design and Analysis of Algorithms, Texts and Monographs in Computer Science, New York: Springer-Verlag, pp. 20–22, doi:10.1007/978-1-4612-4400-4, ISBN 0-387-97687-6, MR 1139767, S2CID 27747202, archived from the original on 2025-08-06, retrieved 2025-08-06
  12. ^ Wilson, R. J. (1973), "An introduction to matroid theory", The American Mathematical Monthly, 80 (5): 500–525, doi:10.1080/00029890.1973.11993318, JSTOR 2319608, MR 0371694
  13. ^ Wood, David R. (2014), "Three-dimensional graph drawing", in Kao, Ming-Yang (ed.), Encyclopedia of Algorithms (PDF), Springer, pp. 1–7, doi:10.1007/978-3-642-27848-8_656-1, ISBN 978-3-642-27848-8, archived (PDF) from the original on 2025-08-06, retrieved 2025-08-06
  14. ^ Cioabă, Sebastian M. (2011), "Some applications of eigenvalues of graphs", in Dehmer, Matthias (ed.), Structural Analysis of Complex Networks, New York: Birkhäuser/Springer, pp. 357–379, doi:10.1007/978-0-8176-4789-6_14, ISBN 978-0-8176-4788-9, MR 2777924; see proof of Lemma 5, p. 361 Archived 2025-08-06 at the Wayback Machine
  15. ^ Read, Ronald C. (1968), "An introduction to chromatic polynomials", Journal of Combinatorial Theory, 4: 52–71, doi:10.1016/S0021-9800(68)80087-0, MR 0224505; see Theorem 2, p. 59, and corollary, p. 65
  16. ^ Tutte, W. T. (1947), "The factorization of linear graphs", The Journal of the London Mathematical Society, 22 (2): 107–111, doi:10.1112/jlms/s1-22.2.107, MR 0023048
  17. ^ Berge, Claude (1958), "Sur le couplage maximum d'un graphe", Comptes Rendus Hebdomadaires des Séances de l'Académie des Sciences, 247: 258–259, MR 0100850
  18. ^ Chvátal, Václav (1973), "Tough graphs and Hamiltonian circuits", Discrete Mathematics, 5 (3): 215–228, doi:10.1016/0012-365X(73)90138-6, MR 0316301
  19. ^ Hopcroft, John; Tarjan, Robert (June 1973), "Algorithm 447: efficient algorithms for graph manipulation", Communications of the ACM, 16 (6): 372–378, doi:10.1145/362248.362272, S2CID 14772567
  20. ^ Dillencourt, Michael B.; Samet, Hanan; Tamminen, Markku (1992), "A general approach to connected-component labeling for arbitrary image representations", Journal of the ACM, 39 (2): 253–280, CiteSeerX 10.1.1.73.8846, doi:10.1145/128749.128750, MR 1160258, S2CID 1869184
  21. ^ Bengelloun, Safwan Abdelmajid (December 1982), Aspects of Incremental Computing (PhD thesis), Yale University, p. 12, ProQuest 303248045
  22. ^ Skiena, Steven (2008), "6.1.2 Kruskal's Algorithm", The Algorithm Design Manual, Springer, pp. 196–198, Bibcode:2008adm..book.....S, doi:10.1007/978-1-84800-070-4, ISBN 978-1-84800-069-8, archived from the original on 2025-08-06, retrieved 2025-08-06
  23. ^ Wulff-Nilsen, Christian (2013), "Faster deterministic fully-dynamic graph connectivity", in Khanna, Sanjeev (ed.), Proceedings of the Twenty-Fourth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2013, New Orleans, Louisiana, USA, January 6-8, 2013, pp. 1757–1769, arXiv:1209.5608, doi:10.1137/1.9781611973105.126, ISBN 978-1-61197-251-1, S2CID 13397958
  24. ^ Huang, Shang-En; Huang, Dawei; Kopelowitz, Tsvi; Pettie, Seth (2017), "Fully dynamic connectivity in amortized expected time", in Klein, Philip N. (ed.), Proceedings of the Twenty-Eighth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2017, Barcelona, Spain, Hotel Porta Fira, January 16-19, pp. 510–520, arXiv:1609.05867, doi:10.1137/1.9781611974782.32, S2CID 15585534
  25. ^ Lewis, Harry R.; Papadimitriou, Christos H. (1982), "Symmetric space-bounded computation", Theoretical Computer Science, 19 (2): 161–187, doi:10.1016/0304-3975(82)90058-5, MR 0666539
  26. ^ Reingold, Omer (2008), "Undirected connectivity in log-space", Journal of the ACM, 55 (4): A17:1–A17:24, doi:10.1145/1391289.1391291, MR 2445014, S2CID 207168478
  27. ^ Berenbrink, Petra; Krayenhoff, Bruce; Mallmann-Trenn, Frederik (2014), "Estimating the number of connected components in sublinear time", Information Processing Letters, 114 (11): 639–642, doi:10.1016/j.ipl.2014.05.008, MR 3230913
  28. ^ Frieze, Alan; Karoński, Michał (2016), "1.1 Models and relationships", Introduction to Random Graphs, Cambridge University Press, Cambridge, pp. 3–9, doi:10.1017/CBO9781316339831, ISBN 978-1-107-11850-8, MR 3675279
  29. ^ Frieze & Karoński (2016), 2.1 Sub-critical phase, pp. 20–33; see especially Theorem 2.8, p. 26, Theorem 2.9, p. 28, and Lemma 2.11, p. 29
  30. ^ Frieze & Karoński (2016), 2.3 Phase transition, pp. 39–45
  31. ^ Frieze & Karoński (2016), 2.2 Super-critical phase, pp. 33–39; see especially Theorem 2.14
  32. ^ Frieze & Karoński (2016), 4.1 Connectivity, pp. 64–68
  33. ^ Cohen, Reuven; Havlin, Shlomo (2010), "10.1 Percolation on complex networks: Introduction", Complex Networks: Structure, Robustness and Function, Cambridge University Press, pp. 97–98, ISBN 978-1-139-48927-0, archived from the original on 2025-08-06, retrieved 2025-08-06