Rethinking Digital Architecture
Beyond the Network
12.2024
The internet feels bad. Early dreams of cyberspace egalitarianism failed to materialize—leading instead to an increasingly polarized, commercialized, and fragmented media environment with profound negative effects on public discourse. In the 2024 election, 70% of American adults said they found it difficult to access accurate claims amid online misinformation – and yet, around 60% of young people claim to receive their news primarily through social media. Meanwhile, Elon Musk transformed one of the largest social media sites, X, into a right-wing media hub—through reinstating banned accounts, rolling back moderation policies, and amplifying far-right voices. In light of all this, a truly democratic digital space appears far off, perhaps even a lost cause. Yet to retrieve a sliver of early internet optimism, it is worth asking: Is the badness of the internet inevitable? The root of many of the issues we face in online spaces, from polarization to political extremism, lies not solely in the behavior and choices of users, but in the underlying blueprint upon which digital platforms are built—principles rooted in network science and informed by neoliberal politics.
Network science, as defined by the National Research Council, studies “network representations of physical, biological, and social phenomena leading to predictive models of these phenomena.” It aims to understand how complex systems function on a fundamental level and to provide a comprehensive mapping of how they work. The formalization of network science as a field at the turn of the century coincided with the ‘neoliberal turn’ in politics–a shift that emphasized a logic of “capture, crisis, and optimization.” At the same time, the internet transitioned from an academic and government resource to a commercialized, privatized space. The principles of networks, combined with large datasets and computational tools, acquired a concrete infrastructure through which to embed themselves in social, political, and economic systems. As Manuel Castells observed in 1996, advances in information technologies made it possible to scale the networking logic of interconnected systems, transforming an abstract concept into an operational reality.
This entanglement of neoliberal ideals and network science shapes much of the contemporary internet’s architecture–an architecture that often entrenches existing power imbalances and undermines democratic values. To counter this, we then need to go beyond vague calls for regulation, content moderation, or even decentralization. Instead, we need to reimagine the very framework from which these outcomes arise. What principles could help structure a more just digital landscape? How might a social networking platform challenge, rather than reinforce, neoliberal ideals? Can other theoretical frameworks, such as complexity theory, offer alternative tools for designing algorithms and digital spaces?
In this paper, I will explore how the principles of network science don’t just describe but actively perpetuate injustice when implemented into digital architectures. To do so, I will first build upon Wendy Chun’s work examining the principle of homophily, extending her critique to include other key concepts in network science, such as power-law distributions. I will then explore how these principles shape algorithms to reinforce echo chambers and undermine democratic values. Finally, I will discuss the limitations of surface-level reforms that overlook the deeper issues embedded in the logic and architecture of our social technologies. In conclusion, I will call on progressive activists to actively imagine and create alternative infrastructures, drawing on other theoretical frameworks to provide a blueprint.
Network Science and Neoliberalism
Although network science is not inherently a neoliberal construction in the abstract, it provides a theoretical framework that can justify and operationalize the design of digital infrastructures that prioritize profit and growth. Most simply, networks are mathematical models that represent the behavior of nodes—whether individuals, cells, or entities—and edges, the connections between them. Our world is seemingly full of networks, from the neural to the social, all of which (according to network scientists) share a similar set of dynamics and traits. Network science can explain, for instance, why certain airports act as hubs for air travel just as well as it can explain the distribution of goods in a supply chain. According to Albert-László Barabási, a foundational scholar in network science, networks allow us to make sense of chaotic dynamics—whether political, economic, or environmental—while enabling predictions about how these systems will evolve. This predictive power can then be applied across disciplines, from medicine to economics, to optimize processes. Reflecting on the societal impacts of network science, Barabási highlights its role in shaping platforms like LinkedIn and Twitter, stating that "algorithms conceived by network scientists fuel these sites, driving everything from friend recommendations to targeted advertising.”
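To make the abstraction concrete, a network of nodes and edges can be represented in a few lines of code. The following sketch uses invented names and a simple adjacency-list structure purely for illustration:

```python
# Minimal sketch of the network abstraction described above: nodes (here,
# hypothetical people) and edges (the connections between them), stored as
# an undirected adjacency list.
from collections import defaultdict

def build_network(edges):
    """Return an undirected adjacency-list representation of a network."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

network = build_network([("ana", "ben"), ("ben", "caro"),
                         ("ana", "caro"), ("caro", "dev")])

# A node's "degree" is simply how many edges touch it; network science
# largely concerns how degree comes to be distributed across nodes.
degree = {node: len(neighbors) for node, neighbors in network.items()}
```

The same structure can stand in for airports and routes, websites and hyperlinks, or users and friendships, which is precisely the generality Barabási claims for the field.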
At face value, network science can appear neutral—an objective and mathematically sound explanation for the dynamics that arise in self-organizing systems. Yet, as Wendy Chun observes in Updating to Remain the Same, networks teeter between empirical reality and idealized projections. They both describe reality and predict it. In doing so, network analysis "replaces real-world events with a reductive and abstract mathematical model" in an effort to map complex phenomena into a simplified structure.
This modeling can be useful in many contexts, such as understanding the viral spread of a disease or mapping protein interaction. However, the transition from theoretical model to instrumentalization transforms these principles from passive representations into active tools. When network models are embedded into technologies–technologies that are now integrated into every aspect of contemporary life–they stop merely reflecting the world and start shaping it. They enact and reinforce social dynamics, often in ways that replicate existing inequalities. To illustrate this process more concretely, I will draw on Chun’s excellent exploration of homophily, a key principle in network science.
People like you also liked…
Chun’s chapter “Queering Homophily” makes a compelling case for why the principle of homophily—or, the idea that similarity breeds connection—insidiously perpetuates segregation and leads to echo chambers. Initially, homophily emerged from a 1947 study on friendship formation in a biracial housing project. The study noted both homophilic (similarity-based) and heterophilic (difference-based) dynamics. However, as Chun points out, the analysis focused only on white residents, categorizing Black residents uniformly as "liberal," thereby ignoring their varied responses. What emerged was a loosely proven observation for just one instance of friendship formation that, over the decades since, has been elevated to a universal assumption of how humans connect.
As network science embraced homophily as a grounding principle for how nodes connect, it was “no longer something to be accounted for, but rather something that ‘naturally’ accounts for and justifies persistence of inequality within facially equal systems.” This perspective has become deeply informative to contemporary social media algorithms. The "people like you also like" model epitomizes this principle in action. By analyzing user behavior, recommendation algorithms curate content that aligns with a user’s past interactions, thereby reinforcing their existing preferences. This feedback loop encourages users to interact primarily with those who share similar interests, ideologies, or demographic characteristics.
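The feedback loop described above can be caricatured in code. The items, tags, and scoring rule below are all invented for illustration (real recommender systems are vastly more complex), but the dynamic is the same: ranking by similarity to past behavior narrows what a user sees over time:

```python
# A toy "people like you also liked" loop: rank unseen items by their tag
# overlap with everything already consumed, then feed the result back into
# the history. All items and tags are hypothetical.
from collections import Counter

ITEMS = {
    "clip_a": {"gym"}, "clip_b": {"gym", "dating"}, "clip_c": {"cooking"},
    "clip_d": {"dating"}, "clip_e": {"gardening"}, "clip_f": {"gym", "dating"},
}

def recommend(history, items):
    """Return the unseen item most similar to the user's past interactions."""
    seen_tags = Counter(tag for item in history for tag in items[item])
    candidates = [i for i in items if i not in history]
    return max(candidates, key=lambda i: sum(seen_tags[t] for t in items[i]))

history = ["clip_a"]            # one initial interaction
for _ in range(3):              # each recommendation reinforces the profile
    history.append(recommend(history, ITEMS))
```

After three rounds, the history contains only gym- and dating-tagged clips; the cooking and gardening content is never surfaced, even though nothing about the user rules it out.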
Over time, this dynamic leads to the formation of echo chambers, digital spaces where users are consistently exposed to a narrow range of perspectives, reinforcing their existing beliefs and ideologies. In turn, diversity of opinion and experience is stifled as interactions remain within narrow groups. This prevents cross-group engagement and undermines the possibility of healthy democracy online. Yet because the core driver of these echo chambers is homophily, it is love of the same (not hatred of the other) that drives such separation – making the principle seem innocent and even natural.
This is not to say there isn’t some truth to it; people often gravitate toward those with whom they share commonality. Ultimately, though, it presents an incomplete picture of human relationships. Framing relationships primarily through the lens of homophily also adheres to a neoliberal conception of the individual as a rational actor, a self-contained node who makes choices purely in their best interest. It overlooks the reality that individuals are interconnected agents whose decisions are shaped by broader socio-cultural structures. By privileging similarity as the dominant principle, homophily risks reducing human relationships to transactional, efficiency-driven interactions while neglecting contextual factors, like race, class, and power dynamics.
The rich get richer
Building on Chun’s argument, we can apply a similar analysis to the emergence of power-law distributions (where a small number of nodes have the majority of connections) within networks. In the early days of network science, two influential mathematicians, Paul Erdős and Alfréd Rényi, believed that nodes in a network were primarily connected on the basis of randomness. In a social network, for instance, while a few nodes might have more connections than others, early network scientists assumed that the average number of connections would be about the same across all nodes. This initial randomized model worked well in theoretical mathematical contexts and formed the foundation of network science for decades; however, when applied to real-world networks, it failed to account for unequal distribution of connections.
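The Erdős–Rényi intuition can be simulated in a few lines. The parameters and seed below are arbitrary choices for illustration, but the result captures the point: when edges form at random, degrees cluster tightly around the mean, with no dominant hubs:

```python
# A rough simulation of a random (Erdős–Rényi-style) network: every pair of
# nodes connects independently with the same small probability.
import random

random.seed(0)
N, P = 500, 0.04                      # 500 nodes, 4% chance per pair
degree = [0] * N
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < P:       # each pair connects independently
            degree[i] += 1
            degree[j] += 1

mean = sum(degree) / N                # expected roughly (N - 1) * P, i.e. ~20
# In a random network, even the best-connected node sits only modestly above
# the average — nothing like the extreme hubs observed in real networks.
```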
In the 1990s, the emergence of the internet provided one of the first large-scale datasets with which to analyze network behavior. Recognizing the opportunity this presented, Albert-László Barabási began to study the architecture of the early web as a network, exploring how websites–acting as nodes–connected to one another. He quickly noticed that despite there being thousands of websites on the internet, only a few prominent sites had sufficient links to be discovered by the average user – a phenomenon that appeared incompatible with randomness.
One reason for these hubs, Barabási found, is the role of growth. The random-network model assumed networks to be static. All existing nodes could link randomly because no particular node had an advantage over the others. In real networks, nodes are added over time – with older nodes having more of an opportunity to accumulate connections than nodes that enter the network later. However, growth alone can’t fully account for the creation of hubs. If it did, the earliest websites would remain the most popular indefinitely, regardless of quality or relevance. Barabási argued that there was another central factor contributing to network power-laws: preferential attachment.
Preferential attachment, often summarized as "the rich get richer" principle, ensures that popularity begets more popularity. It suggests that new nodes are more likely to connect to popular nodes, amplifying their existing advantage. One way in which nodes gain prominence, as mentioned, is through growth. Early members of a network have more time to accumulate connections, thereby ‘grandfathering’ in their influence. Yet nodes can also gain popularity by having some sort of competitive edge or value over others, e.g. Google’s superior search engine helped it beat out existing ones.
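The growth-plus-preferential-attachment dynamic can be sketched as a minimal simulation. This is not Barabási's empirical analysis, just an illustrative toy model with arbitrary sizes and seed: each new node links to one existing node chosen with probability proportional to its current degree:

```python
# "The rich get richer": a new node attaches to an existing node with
# probability proportional to that node's degree. Listing every edge
# endpoint in a pool makes degree-weighted sampling trivial.
import random

random.seed(1)
degree = [1, 1]                         # start with two linked nodes
targets = [0, 1]                        # degree-weighted sampling pool
for new in range(2, 2000):
    chosen = random.choice(targets)     # popular nodes get picked more often
    degree[chosen] += 1
    degree.append(1)
    targets += [chosen, new]            # both endpoints re-enter the pool

degree.sort(reverse=True)
top_share = sum(degree[:20]) / sum(degree)   # share of links held by top 1%
```

Unlike the random model above, a handful of early, already-popular nodes end up holding a hugely disproportionate share of all connections.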
In digital systems, the principle of preferential attachment is further complicated when combined with factors like user preferences and past behaviors—both of which influence the "fitness" of a node. Fitness refers to a node’s appeal or relevance within the system, often quantified by its number of connections, reputation, or other traits that attract new users. On social media platforms, fitness plays a key role in recommendation algorithms, which prioritize and promote certain nodes—whether content, users, or pages—based on their predicted likelihood of engagement. These algorithms assume that highly connected nodes are more likely to draw further attention, while also incorporating user preferences to tailor content to individual tastes. By analyzing past behaviors, these systems predict and deliver content aligned with users’ established interests.
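A fitness-weighted variant can be sketched by scaling attachment probability with degree times fitness, in the spirit of Barabási's fitness model (the specific fitness values and sizes here are invented). The point is that an appealing latecomer can overtake older nodes, yet the winner-take-all dynamic remains:

```python
# Fitness-weighted preferential attachment: each new link goes to a node
# with probability proportional to (degree x fitness). Fitness values are
# hypothetical illustration only.
import random

random.seed(2)
fitness = [0.3] * 10 + [0.9]           # ten early nodes, one fitter latecomer
degree = [1] * 11
for _ in range(5000):                   # new links arrive one at a time
    weights = [d * f for d, f in zip(degree, fitness)]
    chosen = random.choices(range(11), weights=weights)[0]
    degree[chosen] += 1
# Despite arriving last, the high-fitness node becomes the dominant hub.
```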
Much like homophilic feedback loops, the fitness model reinforces existing patterns of engagement, fostering a cycle of sameness where the past dictates the future. This dynamic limits exposure to diverse or challenging perspectives, effectively locking users into predictable and self-reinforcing modes of interaction. More broadly, preferential attachment helps to perpetuate (and automate) the competitive, self-reinforcing logic of neoliberalism. It rewards those with preexisting advantages, whether those advantages come from early entry, superior fitness, or historical influence. In this framework, competition is celebrated, and market-driven dynamics are seen as inherently fair. However, this logic also exacerbates inequalities, as nodes with fewer initial resources or connections find it increasingly difficult to benefit from the network. As networks grow, the disparity becomes even more entrenched: "the penalty for being outside the network increases with the network’s growth because of the declining number of opportunities in reaching other elements outside the network.” This dynamic marginalizes less-connected nodes and reduces their ability to participate in an increasingly concentrated and insular system.
Unlike the random network model, the Barabási model incorporates real-world power dynamics into its abstraction. For Barabási, this suggests that the internet will naturally lead to emergent power laws:
“The hubs are the strongest argument against the utopian vision of an egalitarian cyberspace. Yes, we all have the right to put anything we wish on the Web, but will anybody notice? If the Web were a random network, we would all have the same chance to be seen and heard. In a collective manner, we somehow create hubs, Websites to which everyone links.”
When read closely, Barabási’s use of the word ‘somehow’ has significant implications for how we might conceptualize network dynamics. To him, hubs are inevitable outcomes that reflect human choice and social dynamics: they simply emerge. While this may be accurate for network models in the abstract, in the real world hubs are shaped by a whole slew of complex social, political, and economic forces. A news website, for instance, may gain popularity not solely for the quality of its content but for its ability to integrate enough search-engine-optimized keywords to gain higher visibility within Google’s algorithm. Similarly, Google itself did not rise to prominence solely because its services are the best in the market. It leveraged its existing power and centrality to outcompete or buy smaller nodes (alternative search engines) to widen its hub’s reach and reinforce its power.
The real-world implementation of network science principles illustrates how power concentration is rarely neutral. In reducing these forces to a naturally occurring law, network science can obfuscate the political realities underlying lived power dynamics–such as flows of capital and access to networks–and instead posit that power accumulation and centrality are predetermined realities of a networked society. To be fair, though, Barabási was writing in the early stages of the web, long before hyper-personalized algorithmic recommendations and filtering systems. These types of algorithms adopt power-law principles as the rule for decision-making–making the network model-to-representation transition especially problematic in the context of contemporary social media platforms. In them, network-informed algorithms move from describing how the world is to dictating how it ought to be; or as Chun puts it, performative algorithms “put in place the world they claim to discover.”
One illustrative example is the emergence of the manosphere–an interconnected network of online communities centered on reinforcing traditional gender hierarchies and antifeminist thought. A person’s entry into this community may begin with a relatively benign search, such as for advice on how to be a man or how to attract women, but it can soon trigger algorithms that prioritize content with higher engagement. This process, driven by homophily, creates a feedback loop where the algorithm increasingly surfaces content that aligns with the user’s past behaviors, gradually steering them toward more extreme material. At the same time, preferential attachment leads algorithms to prioritize already popular hubs, such as highly influential creators like Joe Rogan or Adin Ross, who come to dominate the network. Their high engagement (which is largely due to their contrarian views) ensures that their content is amplified as they become gateways to further radicalization within the manosphere. Of course, there are several socio-cultural forces that have contributed to this growing political sphere of thought, and we shouldn’t risk reducing the trend to algorithmic patterns. However, the rise of the manosphere serves as one example of how principles like homophily and preferential attachment can actively create (not just describe) digital communities.
If the badness of the internet is not inevitable but often algorithmically enforced, what is to be done? It's widely known (and decried) that online extremism and ideological echo chambers undermine healthy democratic discourse. Yet, policymakers and politicians often lack both the technical knowledge and the political will to implement meaningful reforms. Instead, they opt for cosmetic fixes that don't address the underlying foundation. For some, the solution is to hold platforms accountable and promote better content moderation. For others, the answer lies in reforms around notions of diversity: diversify the dataset, the team of software designers, or the metrics used to train and evaluate models. While these efforts may help mitigate some harms, such solutions often reflect a limited conception of diversity that fits within the neoliberal paradigm – one focused on outward markers of identity, rather than a true embrace of difference.
To embrace difference as something valuable in its own right would require more than diverse training data and DEI hiring initiatives. Instead, it would require us to conceptualize diversity as a more meaningful guiding principle to structure into our algorithms. Unfortunately, it is difficult to imagine the current underlying logic changing on a large scale, when major digital platforms continue to prioritize profit over the well-being of democracy and real people. The network-informed algorithm accomplishes what it sets out to do in optimizing efficiency—whether by quickly connecting users to the content that will engage them the longest or speeding up decision-making with automation. As long as the core incentive of algorithmic technology remains to optimize ease, speed, and profitability, we will continue to fuel the forces that undermine a healthy, just democracy. To create a digital environment that supports democracy, we must begin with the original blueprint of its architecture—not settle for surface-level fixes.
Although the pervasive influence of network dynamics makes it challenging to conceptualize a holistic solution, it presents progressive activists and thinkers with an opportunity. If our current online ecosystem is structured to lead to power concentration and inequality, we can imagine alternatives that actively work against these outcomes. While this kind of imaginative work and small-scale experimentation won’t immediately tackle the full scope of digital injustice, it can offer a vision of the otherwise—a means of refusing the prevailing technological determinism that frames our digital realities as inevitable.
One theoretical approach that could guide this vision is complexity theory. While it shares a broad intellectual lineage with network science, it offers a fundamentally different perspective. Rather than prioritizing efficiency and optimization, as network science does, complexity theory embraces decentralized, unpredictable dynamics and sustains complexity. It should be understood, as Castells put it, as “a method for understanding diversity, rather than a unified meta-theory.” A complexity-informed digital infrastructure could prioritize diversity, not as a surface-level representation but as a core value embedded in algorithms. It might incorporate mechanisms like negative feedback loops to regulate the concentration of power, ensuring no entity becomes disproportionately dominant. This approach would counter the homophily of engagement-maximizing content, fostering a more varied and democratic digital environment.
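What might a negative feedback loop look like in practice? The following speculative sketch damps a node's attachment weight as its degree grows, so that popularity saturates instead of compounding. The damping rule and all parameters are invented for illustration, not a proposal drawn from the complexity literature:

```python
# Comparing pure preferential attachment with a damped variant in which a
# node's attractiveness decays once it grows large — a crude negative
# feedback on concentration. Parameters are arbitrary.
import math
import random

random.seed(3)

def grow(n, weight_fn):
    """Grow a network by n nodes, attaching by the given weighting rule."""
    degree = [1, 1]
    for _ in range(n):
        weights = [weight_fn(d) for d in degree]
        chosen = random.choices(range(len(degree)), weights=weights)[0]
        degree[chosen] += 1
        degree.append(1)
    return degree

rich_get_richer = grow(2000, lambda d: d)              # pure preferential attachment
damped = grow(2000, lambda d: d * math.exp(-d / 10))   # concentration penalized
# The damped network should end up with markedly smaller hubs than the
# undamped one: popularity still helps, but it no longer compounds forever.
```

The design question the essay raises is exactly this: which weighting rule to automate is a choice, not a law of nature.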
While such a model, or other alternatives, may prove infeasible given their incompatibility with a profit-motivated market, they may eventually prove essential. Without a fundamental shift in how we value and measure success in the digital space, the current trajectory will only deepen existing inequality and divisiveness. Moreover, if we persist in assuming that digital infrastructures must inevitably rely on the “natural” dynamics found in network science, we overlook the fact that embedding these principles into algorithms is an active choice. Even if homophily and preferential attachment reflect real-world dynamics, we should question whether we want to automate and amplify these tendencies. Why not imagine an algorithmic infrastructure that shapes our collective behavior for the better—rather than worse?
Bibliography
Barabási, Albert-László, and Márton Pósfai. Network Science. Cambridge: Cambridge University Press, 2016.
Barabási, Albert-László. Linked: How Everything Is Connected to Everything Else and What It Means. New York: Plume, 2003.
Castells, Manuel. The Rise of the Network Society. Chichester: John Wiley & Sons, 2011.
Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. Cambridge: The MIT Press, 2016.
Chun, Wendy Hui Kyong. "Queering Homophily." In Pattern Discrimination, edited by Clemens Apprich, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl, 59–98. Minneapolis: University of Minnesota Press, 2018.
Committee on Network Science for Future Army Applications. Network Science. National Research Council, 2006. https://doi.org/10.17226/11516.
Corsi, G. “Evaluating Twitter’s Algorithmic Amplification of Low-Credibility Content: An Observational Study.” EPJ Data Science 13, no. 18 (2024). https://doi.org/10.1140/epjds/s13688-024-00456-3.
George Washington University. GW U.S. Adults Post-Election Trust in Government Study: Key Findings (December 2024 for Release). Washington, D.C.: George Washington University, 2024. https://gspm.gwu.edu/sites/g/files/zaxdzs5061/files/2024-12/GW_US_Adults_Post-Election_Trust_in_Government_Study_Key_Findings_Dec._2024_for_release.pdf.
Yu, Fei, An Zeng, Sébastien Gillard, and Matúš Medo. "Network-Based Recommendation Algorithms: A Review." Physica A: Statistical Mechanics and Its Applications 452 (2016): 192–208. https://doi.org/10.1016/j.physa.2016.02.021.
Network science, as defined by the National Research Council, studies “network representations of physical, biological, and social phenomena leading to predictive models of these phenomena.” It aims to understand how complex systems function on a fundamental level to provide a comprehensive mapping of how they work. As network science was formalized as a field at the turn of the century, it coincided with the ‘neoliberal turn’ in politics–a shift that emphasized a logic of “capture, crisis, and optimization.” At the same time, the internet transitioned from an academic and government resource to a commercialized, privatized space. The principles of networks, combined with large datasets and computational tools, acquired a concrete infrastructure to embed themselves into social, political, and economic systems. As Manuel Castells observed in 1996, advances in information technologies made it possible to scale the networking logic of interconnected systems, transforming an abstract concept into an operational reality.
This entanglement of neoliberal ideals and network science shapes much of the contemporary internet’s architecture–an architecture that often entrenches existing power imbalances and undermines democratic values. To counter this, we need to then go beyond vague calls for regulation, content moderation, or even decentralization. Instead, we need to reimagine the very framework from which these outcomes arise. What principles could help structure a more just digital landscape? How might a social networking platform challenge, rather than reinforce, neoliberal ideals? Can other theoretical frameworks, such as complexity theory, offer alternative tools for designing algorithms and digital spaces?
In this paper, I will explore how the principles of network science don’t just describe but actively perpetuate injustice when implemented into digital architectures. To do so, I will first build upon Wendy Chun’s work examining the principle of homophily, extending her critique to include other key concepts in network science, such as power-law distributions. I will then explore how these principles shape algorithms to reinforce echo chambers and undermine democratic values. Finally, I will discuss the limitations of surface-level reforms that overlook the deeper issues embedded in the logic and architecture of our social technologies. In conclusion, I will call on progressive activists to actively imagine and create alternative infrastructures, drawing on other theoretical frameworks to provide a blueprint.
Network Science and Neoliberalism
Although network science is not inherently a neoliberal construction in the abstract, it provides a theoretical framework that can justify and operationalize the design of digital infrastructures that prioritize profit and growth. Most simply, networks are mathematical models that represent the behavior of nodes—whether individuals, cells, or entities—and edges, the connections between them. Our world is seemingly full of networks, from the neural to the social, all of which (according to network scientists) share a similar set of dynamics and traits. Network science can explain, for instance, why certain airports act as hubs for air travel just as well as it can explain the distribution of goods in a supply chain. According to Albert-László Barabási, a foundational scholar in network science, networks allow us to make sense of chaotic dynamics—whether political, economic, or environmental—while enabling predictions about how these systems will evolve. This predictive power can then be applied across disciplines, from medicine to economics, to optimize processes. Reflecting on the societal impacts of network science, Barabási highlights its role in shaping platforms like LinkedIn and Twitter, stating that "algorithms conceived by network scientists fuel these sites, driving everything from friend recommendations to targeted advertising.”
At face value, network science can appear neutral—an objective and mathematically sound explanation for the dynamics that arise in self-organizing systems. Yet, as Wendy Chun observes in Updating to Remain the Same, networks teeter between empirical reality and idealized projections. They both describe reality and predict it. In doing so, network analysis "replaces real-world events with a reductive and abstract mathematical model" in an effort to map complex phenomena into a simplified structure.
This modeling can be useful in many contexts, such as understanding the viral spread of a disease or mapping protein interaction. However, the transition from theoretical model to instrumentalization transforms these principles from passive representations into active tools. When network models are embedded into technologies–technologies that are now integrated into every aspect of contemporary life–they stop merely reflecting the world and start shaping it. They enact and reinforce social dynamics, often in ways that replicate existing inequalities. To illustrate this process more concretely, I will draw on Chun’s excellent exploration of homophily, a key principle in network science.
People like you also liked…
Chun’s chapter Queering Homophily makes a compelling case for why the principle of homophily—or, the idea that similarity breeds connection—insidiously perpetuates segregation and leads to echo chambers. Initially, homophily emerged from a 1947 study on friendship formation in a bi-racial housing project. The study noted both homophilic (similarity-based) and heterophilic (difference-based) dynamics. However, as Chun points out, the analysis focused only on white residents, categorizing Black residents uniformly as "liberal," thereby ignoring their varied responses. What emerged was a loosely proven observation for just one instance of friendship formation that, over the decades since, has been elevated to a universal assumption of how humans connect.
As network science embraced homophily as a grounding principle for how nodes connect, it was “no longer something to be accounted for, but rather something that ‘naturally’ accounts for and justifies persistence of inequality within facially equal systems.” This perspective has become deeply informative to contemporary social media algorithms. The "people like you also like" model epitomizes this principle in action. By analyzing user behavior, recommendation algorithms curate content that aligns with a user’s past interactions, thereby reinforcing their existing preferences. This feedback loop encourages users to interact primarily with those who share similar interests, ideologies, or demographic characteristics.
Overtime, this dynamic leads to the formation of echo chambers, digital spaces where users are consistently exposed to a narrow range of perspectives, reinforcing their existing beliefs and ideologies. In turn, diversity of opinion and experience is stifled as interactions remain within narrow groups. This prevents cross-group engagement and undermines that possibility of healthy democracy online. Yet because the core driver of these echo chambers is homophily, it becomes love of the same (not hatred of the other) that drives such separation – making the principle seem innocent and even natural.
This is not to say there isn’t some truth to it; people often gravitate toward those with whom they share commonality. Ultimately, though, it presents an incomplete picture of human relationships. Framing relationships primarily through the lens of homophily also adheres to a neoliberal conception of the individual as a rational actor, a self-contained node who makes choices purely in their best interest. It overlooks the reality that individuals are interconnected agents whose decisions are shaped by broader socio-cultural structures. By privileging similarity as the dominant principle, homophily risks reducing human relationships to transactional, efficiency-driven interactions while neglecting contextual factors, like race, class, and power dynamics.
The rich get richer
Building on Chun’s argument, we can apply a similar analysis to the emergence of power-law distributions (where a small number of nodes have the majority of connections) within networks. In the early days of network science, two influential mathematicians, Paul Erdős and Alfréd Rényi, believed that nodes in a network were primarily connected on the basis of randomness. In a social network, for instance, while a few nodes might have more connections than others, early network scientists assumed that the average number of connections would be about the same across all nodes. This initial randomized model worked well in theoretical mathematical contexts and formed the foundation of network science for decades; however, when applied to real-world networks, it failed to account for unequal distribution of connections.
In the 1990s, the emergence of the internet provided one of the first large-scale datasets for analyzing network behavior. Recognizing the opportunity this presented, Albert-László Barabási began to study the architecture of the early web as a network, exploring how websites–acting as nodes–connected to one another. He quickly noticed that despite there being thousands of websites on the internet, only a few prominent sites had sufficient links to be discovered by the average user – a phenomenon that appeared incompatible with randomness.
One reason for these hubs, Barabási found, is the role of growth. The random-network model assumed networks to be static: all nodes exist from the start, and no particular node has an advantage over the others. In real networks, nodes are added over time, giving older nodes more opportunity to accumulate connections than nodes that enter the network later. However, growth alone can’t fully account for the creation of hubs. If it did, the earliest websites would remain the most popular indefinitely, regardless of quality or relevance. Barabási argued that there was another central factor contributing to network power-laws: preferential attachment.
Preferential attachment, often summarized as "the rich get richer" principle, ensures that popularity begets more popularity. It suggests that new nodes are more likely to connect to popular nodes, amplifying their existing advantage. One way in which nodes gain prominence, as mentioned, is through growth. Early members of a network have more time to accumulate connections, thereby ‘grandfathering’ in their influence. Yet nodes can also gain popularity by having some sort of competitive edge or value over others, e.g. Google’s superior search engine helped it beat out existing ones.
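The growth-plus-preferential-attachment process can be sketched in a few lines (a deliberately simplified version of the Barabási–Albert model, not his actual implementation; each newcomer makes a single link, and the numbers are arbitrary):

```python
import random

def preferential_attachment(n, seed=0):
    """Grow a network one node at a time; each newcomer links to one
    existing node chosen with probability proportional to its degree
    ("the rich get richer")."""
    rng = random.Random(seed)
    degrees = [1, 1]      # start from two linked nodes
    targets = [0, 1]      # node i appears in this list degrees[i] times
    for new in range(2, n):
        old = rng.choice(targets)   # degree-proportional pick
        degrees.append(1)
        degrees[old] += 1
        targets.extend([old, new])
    return degrees

# A few early or lucky nodes become hubs; most nodes keep a single link.
degrees = sorted(preferential_attachment(5000), reverse=True)
print(degrees[:5], degrees[-5:])
```

Unlike the random model, this process yields a handful of nodes with dozens of connections while the vast majority never attract a second link – a power-law profile produced by the growth rule alone.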
In digital systems, the principle of preferential attachment is further complicated when combined with factors like user preferences and past behaviors—both of which influence the "fitness" of a node. Fitness refers to a node’s appeal or relevance within the system, often quantified by its number of connections, reputation, or other traits that attract new users. On social media platforms, fitness plays a key role in recommendation algorithms, which prioritize and promote certain nodes—whether content, users, or pages—based on their predicted likelihood of engagement. These algorithms assume that highly connected nodes are more likely to draw further attention, while also incorporating user preferences to tailor content to individual tastes. By analyzing past behaviors, these systems predict and deliver content aligned with users’ established interests.
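What such a fitness-weighted ranking might look like can be sketched as follows (a hypothetical scoring rule; the item names, connection counts, and popularity-times-relevance formula are all invented for illustration, not taken from any real platform):

```python
def rank_recommendations(items, past_topics):
    """Hypothetical engagement-first ranking: score each candidate by its
    connection count (popularity) times its overlap with the user's past
    interests, so fitness compounds existing advantage."""
    def score(item):
        overlap = len(item["topics"] & past_topics) or 1
        return item["connections"] * overlap
    return sorted(items, key=score, reverse=True)

items = [
    {"name": "small_dissenting_blog", "connections": 12,   "topics": {"politics"}},
    {"name": "mid_size_outlet",       "connections": 400,  "topics": {"sports"}},
    {"name": "dominant_hub",          "connections": 9000, "topics": {"politics"}},
]
ranked = rank_recommendations(items, past_topics={"politics"})
print([item["name"] for item in ranked])
```

Note that even the topically irrelevant mid-size outlet outranks the topically matched but poorly connected blog: under a scheme like this, raw connectedness dominates the score, and relevance to past behavior only amplifies nodes that are already large.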
Much like homophilic feedback loops, the fitness model reinforces existing patterns of engagement, fostering a cycle of sameness where the past dictates the future. This dynamic limits exposure to diverse or challenging perspectives, effectively locking users into predictable and self-reinforcing modes of interaction. More broadly, preferential attachment helps to perpetuate (and automate) the competitive, self-reinforcing logic of neoliberalism. It rewards those with preexisting advantages, whether those advantages come from early entry, superior fitness, or historical influence. In this framework, competition is celebrated, and market-driven dynamics are seen as inherently fair. However, this logic also exacerbates inequalities, as nodes with fewer initial resources or connections find it increasingly difficult to benefit from the network. As networks grow, the disparity becomes even more entrenched: “the penalty for being outside the network increases with the network’s growth because of the declining number of opportunities in reaching other elements outside the network.” This dynamic marginalizes less-connected nodes and reduces their ability to participate in an increasingly concentrated and insular system.
Performative algorithms
Unlike the random network model, the Barabási model incorporates real-world power dynamics into its abstraction. For Barabási, this suggests that the internet will naturally lead to emergent power laws:
“The hubs are the strongest argument against the utopian vision of an egalitarian cyberspace. Yes, we all have the right to put anything we wish on the Web, but will anybody notice? If the Web were a random network, we would all have the same chance to be seen and heard. In a collective manner, we somehow create hubs, Websites to which everyone links.”
When read closely, Barabási’s use of the word ‘somehow’ has significant implications for how we might conceptualize network dynamics. To him, hubs are inevitable outcomes of collective human choice and social dynamics: they simply emerge. While this may be accurate for network models in the abstract, in the real world, hubs are shaped by a slew of complex social, political, and economic forces. A news website, for instance, may gain popularity not solely for the quality of its content but for its ability to integrate enough search-engine-optimized keywords to gain higher visibility within Google’s algorithm. Similarly, Google itself did not rise to prominence solely because its services are the best in the market. It leveraged its existing power and centrality to outcompete or buy smaller nodes (alternative search engines) to widen its hub’s reach and reinforce its power.
The real-world implementation of network science principles illustrates how power concentration is rarely neutral. In reducing these forces to a naturally occurring law, network science can obfuscate the political realities underlying lived power dynamics–such as flows of capital and access to networks–and instead posit that power accumulation and centrality are predetermined realities of a networked society. To be fair, though, Barabási was writing in the early stages of the web, long before hyper-personalized algorithmic recommendations and filtering systems. These types of algorithms adopt power-law principles as the rule for decision-making–making the network model-to-representation transition especially problematic in the context of contemporary social media platforms. In them, network-informed algorithms move from describing how the world is to dictating how it ought to be; or as Chun puts it, performative algorithms “put in place the world they claim to discover.”
One illustrative example is the emergence of the manosphere–an interconnected network of online communities centered on reinforcing traditional gender hierarchies and antifeminist thought. A person’s entry into this community may begin with a relatively benign search, such as for advice on how to be a man or how to attract women, but it can soon trigger algorithms that prioritize content with higher engagement. This process, driven by homophily, creates a feedback loop where the algorithm increasingly surfaces content that aligns with the user’s past behaviors, gradually steering them toward more extreme material. At the same time, preferential attachment leads algorithms to prioritize already popular hubs–influential creators like Joe Rogan or Adin Ross–who come to dominate the network. Their high engagement (which is largely due to their contrarian views) ensures that their content is amplified as they become gateways to further radicalization within the manosphere. Of course, there are several socio-cultural forces that have contributed to this growing political sphere of thought, and we shouldn’t risk reducing the trend to algorithmic patterns. However, the rise of the manosphere serves as one example of how principles like homophily and preferential attachment can actively create (not just describe) digital communities.
Limits of Reform
If the badness of the internet is not inevitable but often algorithmically enforced, what is to be done? It's widely known (and decried) that online extremism and ideological echo chambers undermine healthy democratic discourse. Yet, policymakers and politicians often lack both the technical knowledge and the political will to implement meaningful reforms. Instead, they opt for cosmetic fixes that don't address the underlying foundation. For some, the solution is to hold platforms accountable and promote better content moderation. For others, reform centers on notions of diversity: diversify the dataset, the team of software designers, or the metrics used to train and evaluate models. While these efforts may help mitigate some harms, such solutions often reflect a limited conception of diversity that fits within the neoliberal paradigm – one focused on outward markers of identity, rather than a true embrace of difference.
To embrace difference as something valuable in its own right would require more than diverse training data and DEI hiring initiatives. Instead, it would require us to conceptualize diversity as a more meaningful guiding principle to structure into our algorithms. Unfortunately, it is difficult to imagine the current underlying logic changing on a large scale, when major digital platforms continue to prioritize profit over the well-being of democracy and real people. The network-informed algorithm accomplishes what it sets out to do in optimizing efficiency—whether by quickly connecting users to the content that will engage them the longest or speeding up decision-making with automation. As long as the core incentive of algorithmic technology remains to optimize ease, speed, and profitability, we will continue to fuel the forces that undermine a healthy, just democracy. To create a digital environment that supports democracy, we must begin with the original blueprint of its architecture—not just address surface-level fixes.
Imagining the otherwise
Although the pervasive influence of network dynamics makes it challenging to conceptualize a holistic solution, it presents progressive activists and thinkers with an opportunity. If our current online ecosystem is structured to lead to power concentration and inequality, we can imagine alternatives that actively work against these outcomes. While this kind of imaginative work and small-scale experimentation won’t immediately tackle the full scope of digital injustice, it can offer a vision of the otherwise—a means of refusing the prevailing technological determinism that frames our digital realities as inevitable.
One theoretical approach that could guide this vision is complexity theory. While it shares a broad intellectual lineage with network science, it offers a fundamentally different perspective. Rather than prioritizing efficiency and optimization, as network science does, complexity theory embraces decentralized, unpredictable dynamics and sustains complexity. It should be understood, as Castells put it, as “a method for understanding diversity, rather than a unified meta-theory.” A complexity-informed digital infrastructure could prioritize diversity, not as a surface-level representation but as a core value embedded in algorithms. It might incorporate mechanisms like negative feedback loops to regulate the concentration of power, ensuring no entity becomes disproportionately dominant. This approach would counter the homophily of engagement-maximizing content, fostering a more varied and democratic digital environment.
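To make the idea of a negative feedback loop concrete, consider a toy recommender (entirely hypothetical; the item names, popularity figures, and quadratic damping factor are invented for illustration) in which an item's chance of being shown is damped by how often it has already been shown, rather than amplified by it:

```python
import random

def serve_feed(popularity, rounds=300, seed=0):
    """Toy negative-feedback recommender: an item's chance of being shown
    is its raw popularity damped by (1 + times already shown) squared, so
    heavy exposure suppresses future exposure instead of amplifying it."""
    rng = random.Random(seed)
    names = list(popularity)
    shown = {name: 0 for name in names}
    for _ in range(rounds):
        weights = [popularity[n] / (1 + shown[n]) ** 2 for n in names]
        shown[rng.choices(names, weights=weights)[0]] += 1
    return shown

# Despite a 100:10:1 popularity gap, exposure ends up far less skewed.
exposure = serve_feed({"hub": 1000, "mid": 100, "niche": 10})
print(exposure)
```

Popular items still surface more often, but the damping term guarantees the niche item a meaningful share of attention – the inverse of preferential attachment, where each exposure makes the next one more likely.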
While such models may prove infeasible given their incompatibility with a profit-motivated market, they may eventually prove essential. Without a fundamental shift in how we value and measure success in the digital space, the current trajectory will only deepen existing inequality and divisiveness. Moreover, if we persist in assuming that digital infrastructures must inevitably rely on the “natural” dynamics found in network science, we overlook the fact that embedding these principles into algorithms is an active choice. Even if homophily and preferential attachment reflect real-world dynamics, we should question whether we want to automate and amplify these tendencies. Why not imagine an algorithmic infrastructure that shapes our collective behavior for the better–rather than worse?
Bibliography
Barabási, Albert-László, and Márton Pósfai. Network Science. Cambridge: Cambridge University Press, 2016.
Barabási, Albert-László. Linked: How Everything Is Connected to Everything Else and What It Means. New York: Plume, 2003.
Castells, Manuel. The Rise of the Network Society. Chichester: John Wiley & Sons, 2011.
Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. Cambridge: The MIT Press, 2016.
Chun, Wendy Hui Kyong. "Queering Homophily." In Pattern Discrimination, edited by Clemens Apprich, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl, 59–98. Minneapolis: University of Minnesota Press, 2018.
Committee on Network Science for Future Army Applications. Network Science. National Research Council, 2006. https://doi.org/10.17226/11516.
Corsi, G. “Evaluating Twitter’s Algorithmic Amplification of Low-Credibility Content: An Observational Study.” EPJ Data Science 13, no. 18 (2024). https://doi.org/10.1140/epjds/s13688-024-00456-3.
George Washington University. GW U.S. Adults Post-Election Trust in Government Study: Key Findings (December 2024 for Release). Washington, D.C.: George Washington University, 2024. https://gspm.gwu.edu/sites/g/files/zaxdzs5061/files/2024-12/GW_US_Adults_Post-Election_Trust_in_Government_Study_Key_Findings_Dec._2024_for_release.pdf.
Yu, Fei, An Zeng, Sébastien Gillard, and Matúš Medo. "Network-Based Recommendation Algorithms: A Review." Physica A: Statistical Mechanics and Its Applications 452 (2016): 192–208. https://doi.org/10.1016/j.physa.2016.02.021.