ChakraView is an artistic exploration of how data extracted from human lives is processed, reshaped, and often distorted within algorithmic systems, frequently without meaningful human oversight. Presented as part of the Museum of Digital Society (MoDS), the exhibition examines how deeply personal data, drawn from our behaviors, choices, and identities, is transformed into algorithmic outputs that can reproduce bias, reinforce inaccuracies, and enable discrimination.

Drawing inspiration from the concept of “chakras,” ChakraView reflects on the lifecycle of data in the digital age. Just as chakras represent flows of energy within the body, the exhibition imagines data as a circulating force, collected, analyzed, and redeployed in ways that influence and even control human lives. Through immersive installations and visual narratives, it reveals how algorithmic systems can amplify harmful tendencies in artificial intelligence, including prejudice, exclusion, inequality, profiling, and fabrication.

By tracing the continuous cycle of data extraction and algorithmic manipulation, ChakraView invites viewers to confront an unsettling question: what happens when the very data used to understand and organize society also entrenches its inequities? The exhibition delves into themes of data mining, algorithmic bias, and “data fiction,” illustrating how digital systems not only reflect reality but actively reshape it.

At its core, ChakraView is both a critique and a call to action. It urges us to reimagine a future where AI systems are grounded in human values, guided by transparency, and held to ethical standards. In doing so, it advocates for a technological landscape where data serves humanity rather than becoming a tool that perpetuates harm through the very information it derives from us.