#5. Designing Meaningful Lives in a Shifting Knowledge Environment
Reclaiming Our Capacity to Make Meaning Amidst Algorithmic Influence
Many thanks to the people who have provided feedback on this post and the previous ones. Especially Aditya SK, Hannah Yang, Emil Wasteson, Aron Mill and Guillaume Vorreux!
What if the LLM I talk to understood more about my existential inquiries than some of my closest friends? I’m not sure how true that is, but the very possibility unsettles me. It suggests something about the evolving nature of our meaning-making ecosystem, and the growing role AI systems play in helping us navigate life’s big questions.
Whether we’re having a hard conversation, scrolling headlines, or asking a chatbot for advice, we constantly lean on our surroundings to make sense of things. But not all environments support good thinking. Some distort, rush, or fragment it. Philosophers call this broader landscape our epistemic environment (Turner, 2020).
An epistemic environment refers to the entire ecosystem in which we form beliefs and assess truth. It includes:
the resources we depend on, such as media, education, digital platforms, and cultural narratives
the conditions that shape how we think: habits of attention, trust in institutions, and social norms.
It’s the background structure that makes collective sense-making either possible or precarious. This final piece proposes that a healthy epistemic environment is a precondition for meaning-making. As AI systems take on greater roles as knowledge mediators, what can we do to preserve a healthy epistemic environment where meaning can emerge?
Power, Knowledge, and the Shape of Our Reality
To understand how our capacity for meaning-making is shaped, sometimes subtly, sometimes forcibly, it helps to examine the underlying forces that mediate how we come to know anything at all. The environments in which we form beliefs are not neutral. They are designed, curated, and governed by dynamics of power and profit. Philosophers like Michel Foucault and Shoshana Zuboff have long warned that knowledge and agency are deeply entangled with systems of control, whether through institutional discourse or digital surveillance. If we are to foster a healthy epistemic environment, we must first name the forces that compromise it.
Michel Foucault: Power/Knowledge
Michel Foucault introduced the concept of power/knowledge, a useful lens for examining the epistemic landscape shaped by AI. Foucault argued that knowledge is never neutral but is always intertwined with power relations (Feder, 2010). What we accept as "truth" emerges not solely from objective reality, but from dominant discourses, systems of classification, and societal norms. In the context of AI, this means that the algorithms which curate our digital experiences are not merely objective tools; rather, they embody and reinforce specific power structures. For instance, search algorithms prioritize certain information sources, shaping public perception on a wide range of topics. This algorithmic curation creates what might be termed a "silent epistemic architecture", subtly guiding our understanding of the world (Michel Foucault on the Digital Panopticon: Artificial Intelligence, Privacy, and Wearable Technology, 2023).
Consider how social media algorithms and LLMs filter content, sometimes prioritizing engagement over factual accuracy. This can lead to the amplification of misinformation and the creation of echo chambers, where individuals are primarily exposed to viewpoints that confirm their existing beliefs.
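To make this dynamic concrete, here is a deliberately crude toy simulation in Python. To be clear, this is not how any real platform ranks content: the engagement function, the update rule, and every number are invented for illustration. The sketch only shows that ranking purely by predicted engagement, with no intent to filter anywhere in the code, is enough to narrow a feed around a user's existing stance.

```python
import random

random.seed(0)

# Toy model of an engagement-ranked feed. Everything here is an
# illustrative assumption, not a description of any real system.

USER_STANCE = 0.8  # the user's current viewpoint on a 1-D axis [-1, 1]

def engagement(item_stance):
    # Assumed user behavior: more engagement with stance-confirming items.
    return max(0.0, 1.0 - abs(USER_STANCE - item_stance))

items = [random.uniform(-1, 1) for _ in range(50)]  # item stances
scores = {i: 0.5 for i in range(len(items))}        # predicted engagement

for _ in range(200):
    # Rank by predicted engagement and "serve" the top five items.
    served = sorted(scores, key=scores.get, reverse=True)[:5]
    for i in served:
        # Observe actual engagement and update the prediction online.
        scores[i] += 0.1 * (engagement(items[i]) - scores[i])

top = sorted(scores, key=scores.get, reverse=True)[:5]
print("stances the feed converges on:", [round(items[i], 2) for i in top])
# Typically prints stances clustered near USER_STANCE (0.8): an echo
# chamber, even though nothing in the code "intends" to exclude dissent.
```

The narrowing emerges from the optimization target alone, which is precisely the sense in which the resulting architecture is "silent".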
Shoshana Zuboff: Surveillance Capitalism
Shoshana Zuboff's work on surveillance capitalism exposes how our personal data is extracted, commodified, and used to predict and shape our behavior in the digital sphere (Shandler, 2019). Zuboff introduces the concept of "instrumentarian power", a form of influence that does not rely on overt coercion but rather designs environments that subtly nudge us toward predetermined outcomes (Zuboff, 2015). This has profound implications for our autonomy and our ability to construct meaning in our lives and raises a crucial question: Are we genuinely choosing our beliefs, or are we being subtly nudged for profit?
Targeted advertising provides a clear illustration of instrumentarian power. Companies collect vast amounts of data about our online behavior, interests, and demographics, and then use this information to deliver personalized ads that are designed to influence our purchasing decisions. Some argue that surveillance capitalism simply provides consumers with more relevant products and services and that individuals are free to ignore targeted advertising. However, Zuboff contends that the scale and sophistication of data collection and analysis create a power imbalance that undermines our ability to make truly autonomous choices. The "Big Other" (Zuboff, 2015) operates in the interest of surveillance capital, often without democratic oversight (Shandler, 2019).
Agency, really?
It’s true that individuals still choose which books to read, podcasts to follow, or films to watch. Yet today’s AI-driven systems, particularly large language models, operate on a far broader, more personalized scale. Unlike a fixed newspaper or TV lineup, LLMs generate and rank content in real time, learning from your every query to predict which ideas you’ll likely accept next.
This creates a feedback loop far subtler than editorial bias: you’re guided toward freshly composed, algorithm-tailored narratives that you may mistake for neutral or expert opinions. We’re not complete bystanders (we can still seek out dissenting voices), but these invisible, data-hungry architectures increasingly pre-structure our knowledge horizons.
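The two-way nature of that loop can be sketched in the same toy style. Again, every function and constant below is an assumption made up for illustration, not a model of any actual LLM or recommender: the system tailors output to its estimate of the user, and the user's view drifts slightly toward whatever is served.

```python
import random

random.seed(1)

# Toy two-way feedback loop; all constants are invented for illustration.
user_stance = 0.2     # the user's initial view on a 1-D axis [-1, 1]
model_estimate = 0.0  # the system's running estimate of that view

for _ in range(100):
    # The system composes content tailored to its estimate, with a slight
    # pull toward the more "engaging" extreme (an assumed bias).
    served = model_estimate + 0.1 + random.gauss(0, 0.05)

    # The user accepts content close to their view, and accepting it
    # nudges their view toward it.
    if abs(served - user_stance) < 0.3:
        user_stance += 0.05 * (served - user_stance)

    # The system observes the interaction and refines its estimate.
    model_estimate += 0.2 * (user_stance - model_estimate)

print(f"user's view drifted from 0.20 to {user_stance:.2f}")
# Each step is tiny and locally "helpful", yet over time the loop walks
# the user toward territory the system selected, not territory they chose.
```

The point is not the numbers but the shape: neither party coerces, yet the horizon of what the user encounters is pre-structured.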
Put simply: the same power structures Foucault and Zuboff described are still at work, only now they’re built into algorithms that, to some extent, undermine our ability to think for ourselves.
From Epistemic Risks to Institutional Design
As AI curation increasingly shapes what we see and think, our meaning-making capacity risks being co-opted, shifting us from active authors to passive recipients of meaning. The goals and values we adopt may end up echoing algorithmic biases rather than our own considered convictions. Upholding transparency, rigor, and diversity in our information spaces is thus essential: it preserves the self-authored narratives on which genuine agency and true freedom of choice depend.
Below are three illustrative, non-exhaustive proposals, one each for cultural, social, and physical infrastructure, that respond to different facets of a healthy epistemic environment. Each corresponds to a key dimension of meaning explored in this series, and together they act as counterweights that support critical thinking and value formation.
Cultural Infrastructure: Meaning Incubators
In response to the moral flattening that can arise in an algorithmically curated world, meaning incubators offer structured spaces, virtual or physical, for exploring ethical frameworks, long-term thinking, and worldview formation. Inspired by CFAR workshops and life-design labs, these incubators could feature tools like belief sourcing, micro-philosophy, life-goals frameworks, and meaning menus. Participants might draft personal manifestos or long-range moral visions, practicing meaning-making under uncertainty.
Social Infrastructure: Meaning-Making Circles
As social media fragments discourse and nudges us into epistemic bubbles, meaning-making circles provide a corrective: small, regular gatherings where people put into practice the frameworks they have explored in the meaning incubator. Drawing on traditions like Socratic salons, secular havruta or vada vidhi debates, these circles encourage dialogic learning, narrative calibration, and shared metaphysical inquiry. In these intimate spaces, beliefs are examined, tested, and co-created. Circles can support participants in crafting meaning together, rather than alone or via algorithm.
Physical Infrastructure: Secular Sanctuaries
Finally, meaning also needs architecture: literal spaces that invite awe and self-transcendence. Secular sanctuaries, or epistemic commons, are civic spaces designed for reflection and ritual without religious doctrine. Think of the Long Now Foundation’s 10,000-year clock. Such spaces slow us down, reconnect us with deep time, and offer containers for communal meaning rituals. In a disembodied digital world, they remind us that meaning is also held by physical space.
Conclusion
Throughout this series, we’ve explored a working model of what constitutes a meaningful life. Drawing on research and philosophical inquiry, we examined a few key dimensions:
Coherence — making sense of reality
Purpose — aiming toward something larger than the self
Significance — the feeling that one’s life matters
A healthy epistemic environment — the background condition that makes the rest possible
After reading The Power of Meaning by Emily Esfahani Smith, I’d add transcendence and community as additional dimensions of meaning. Transcendence refers to moments of connection with something greater than oneself, whether through nature, art, spirituality, or awe, while community speaks to the deep sense of belonging that arises from shared values and mutual care.
Each of these pillars can be shaped, for better or worse, by the informational and social environments we inhabit. As artificial intelligence becomes more integrated into how we think, connect, and decide, the conditions under which we construct meaning are shifting in real time.
This is a first pass at the kinds of design questions that will become increasingly relevant. As AI systems grow in capability and influence, the task ahead is civic, cultural, and philosophical.
To ground this exploration in personal experience, I’d like to share the practices that have helped me in my own quest for meaningfulness.
Joining value-aligned communities. The Shapers, the EA community, Wise, my school of advaita vedanta yoga, and the community of practitioners of vipassana meditation.
Engaging in modern-day “rituals”. Conferences, get-togethers, chanting groups, co-learning circles, book clubs, silent retreats.
Bringing “awe” into my daily life
By exposing myself to various forms of art: visual arts, architecture, and music.
By creating an altar at home where I keep keepsakes of the people who matter in my life, light a candle, and spend a moment connecting with them through a short meditation practice.
Forming my views
On the nature of reality and how to approach knowledge acquisition.
On the type of moral framework I want to live by.
Through reading, writing, listening to podcasts, and discussing with fellow meaning-seekers.
Striving to be net positive. Making choices that don’t increase suffering in the world, and preferably reduce it.
This one is genuinely difficult, however. For instance, I tried being vegan for a few months but struggled and transitioned back to being a vegetarian, though I hope to become vegan again in the future.
I also aim to donate time and/or money where it’s most needed, though identifying where that is remains an ongoing inquiry.
I’d be curious to hear about your meaning-making practices. Feel free to share in the comments!

