<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[kindred spirits: Zeitgeist]]></title><description><![CDATA[Essays tracing the intricate intersections between tech evolution and the human experience]]></description><link>https://www.readkindredspirits.com/s/zeitgeist</link><image><url>https://substackcdn.com/image/fetch/$s_!0qR3!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c51b1c7-b7ad-4acf-a598-21705dfc2eac_1280x1280.png</url><title>kindred spirits: Zeitgeist</title><link>https://www.readkindredspirits.com/s/zeitgeist</link></image><generator>Substack</generator><lastBuildDate>Tue, 12 May 2026 19:37:44 GMT</lastBuildDate><atom:link href="https://www.readkindredspirits.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Sindhu Shivaprasad]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[kindredspirits@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[kindredspirits@substack.com]]></itunes:email><itunes:name><![CDATA[Sindhu Shivaprasad]]></itunes:name></itunes:owner><itunes:author><![CDATA[Sindhu Shivaprasad]]></itunes:author><googleplay:owner><![CDATA[kindredspirits@substack.com]]></googleplay:owner><googleplay:email><![CDATA[kindredspirits@substack.com]]></googleplay:email><googleplay:author><![CDATA[Sindhu Shivaprasad]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[How to make $183 billion disappear]]></title><description><![CDATA[What Claude's viral NYC pop-up and "thinking" cap reveals about tech culture's hunger for 
meaning]]></description><link>https://www.readkindredspirits.com/p/how-to-make-183-billion-disappear</link><guid isPermaLink="false">https://www.readkindredspirits.com/p/how-to-make-183-billion-disappear</guid><dc:creator><![CDATA[Sindhu Shivaprasad]]></dc:creator><pubDate>Tue, 07 Oct 2025 13:02:48 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/271ee124-cf4f-4358-9d03-134fbde02f40_792x595.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Over the weekend, AI company Anthropic took over Air Mail, a chic newsstand and caf&#233; in New York&#8217;s West Village, to create a &#8220;thinking space&#8221; for users of their generative AI app, Claude.</p><p>The pop-up was simply designed: free caps with &#8220;thinking&#8221; printed on them, free tote bags when the caps ran out, free coffee if you downloaded the Claude app, and copies of Anthropic CEO Dario Amodei&#8217;s book up for grabs. You just had to show the app on your phone to get in. The broader &#8220;Keep Thinking&#8221; campaign included a &#8220;say no to slop&#8221; vibe that contrasted Claude with the recent deluge of low-quality AI content.</p><p>Lines to Air Mail&#8217;s front door snaked down the sidewalk all weekend. People waited two hours, three, for a cap and a latte. My feed&#8212;an actual ocean away from the West Village&#8212;was filled with photos of the caps aesthetically arranged on coffee tables, with captions gushing about the vibes and the aesthetic of it all.</p><p>The online commentary was <em>effusive</em>. People said the pop-up was &#8220;tasteful&#8221;, described it as feeling &#8220;like a warm room&#8221;, even called it &#8220;the best marketing campaign, almost Apple-like&#8221;. Indeed, nearly one in every five posts on my feed was a Claude billboard juxtaposed with Apple&#8217;s popular &#8220;Think Different&#8221; campaign.</p><p>Here&#8217;s where I got curious. Pop-ups aren&#8217;t new, and free merch is a tale as old as time.
David vs Goliath framings are practically a Silicon Valley clich&#233; at this point. So what made the Anthropic campaign land the way it did? I think the answer isn&#8217;t immediately obvious, which is part of what makes this entire thing so incredibly interesting.</p><h2><strong>The circle is actually a spiral</strong></h2><p>I&#8217;ll start with the cap which, I&#8217;ll admit, is right up my alley. It&#8217;s a simple baseball cap with &#8220;thinking&#8221; embroidered onto it in classic serif against weathered fabric, with no other obvious branding.</p><p>The simplicity of this cap is the whole point. It functions as a costly signal, where the cost is time (spent waiting in line) and knowledge (about Claude, having the app, being plugged into the right networks to know about the event). This is what Pierre Bourdieu called cultural capital: the ability to recognise and value this aesthetic choice becomes, in itself, a marker of belonging. The cap is legible only to insiders; to everyone else, it&#8217;s a cap with a word on it. And that&#8217;s precisely what makes it valuable to those in the know.</p><p>Giving away Dario&#8217;s book was a particular stroke of genius, IMO. In tech culture, origin stories matter enormously. The narrative of principled researchers leaving the behemoth OpenAI over safety concerns to do AI &#8220;the right way&#8221; gives people something to believe in beyond the features. Dario&#8217;s book makes that narrative tangible and portable, and giving it away for free makes that founder mythology spread that much more organically. The more people subscribe to that narrative, the more likely they are to choose that product of their own volition.</p><p>So on one level of the spiral, you have free offerings, which in general carry psychological weight.
In traditional marketing, even if there&#8217;s something free up for grabs, there&#8217;s a clear transaction: give us your email or data, and we&#8217;ll give you something in return. Economists would call this a market exchange.</p><p>This pop-up probably felt different. The giving felt generous rather than extractive. The staff weren&#8217;t upselling you or collecting detailed information; they were handing you a whole bunch of free stuff. The singular ask&#8212;download the app if you want coffee&#8212;felt more like an optional joining of a community than like completing a transaction. From the many tweets I saw, people walked away feeling like they&#8217;d been given something, and not like something had been extracted from them. That <em>feeling</em> matters more than the economic reality of the exchange.</p><p>On the next level: the venue. I saw a lot of people ask: why NYC? Why not SF? I don&#8217;t think that choice was arbitrary. I think SF has reached tech saturation, so an Anthropic pop-up would register as just another tech thing in a deluge of tech things. Preaching to the choir. Choosing NYC, on the other hand, signals something beyond &#8220;we&#8217;re a tech company&#8221;.</p><p>Air Mail, for starters, sells carefully curated magazines, CBD-infused tonics, and other tangible and intangible markers of IYKYK taste. It was also founded by Graydon Carter, the legendary Vanity Fair editor and taste-maker, which means it&#8217;s wrapped in cultural cachet. Its location in NYC&#8217;s West Village, a neighbourhood not lacking in cultural significance, adds to the value.</p><p>By choosing this location, Anthropic juiced some of that credibility for itself, bathed in its alternative-boujie halo. They aligned themselves with a particular kind of taste: intellectual, established, earnest, cultural rather than just trendy. It&#8217;s not an image you&#8217;d associate a tech company with, which was the point.
They made a statement as much through the venues they chose <em>not</em> to host in as by the one they <em>did</em>.</p><p>OpenAI already owns the tech mindshare. So Anthropic goes after something else: cultural legitimacy, intellectual seriousness, the sense that they care about craft and meaning. The aesthetic choices&#8212;classic serif typefaces, warm and intimate photography, a city of culture&#8212;all signalled care and intentionality at a human scale.</p><p>On the next level, you have the crowd itself. The people in line had inadvertently become collaborators in something much larger than a weekend pop-up, though most probably didn&#8217;t realise it.</p><p>Whether intentionally or intuitively, the event was designed for virality. Everything was photogenic: the warm lighting, the aesthetic cohesion, the free stuff. It was time-limited and location-specific, creating natural scarcity and FOMO. Conversations kicked up while people waited in line, about which AI tool they use and what that says about them. (I called out, in an earlier essay, that <a href="https://sindhu.live/garden/the-apps-we-live-by">our choices of software have become subtle performances of identity</a>).</p><p>Once enough people are doing something, others join not because they&#8217;ve independently evaluated whether it&#8217;s worth spending a sunny Saturday on, but because the crowd itself signals value. You see a long line, you assume it must be worth the wait. The line validates the event, which attracts more people, and so on. That happened here, and those lines made it to photos on the Twittersphere, creating an illusion of massive scale.</p><p>I think that digital amplification spiralled outwards in ways that broke past the walls of that chic little newsstand.
People who weren&#8217;t in NYC at the right time weren&#8217;t the traditional target of this pop-up, but <a href="https://mattcasmith.net/2025/10/05/anthropic-claude-ai-marketing-design">many of them became the actual converts</a>, downloading Claude to see what warranted this response (driven by social proof generated by people, half of whom might have been there because it was the weekend and there was free coffee).</p><p>To cap it all off, I don&#8217;t think Anthropic tried to hide that this was marketing. Their branding was visible. The request to download the app was explicit. They announced the pop-up from their official social accounts. They were admirably upfront about it all.</p><p>Design has this concept called &#8220;honest materials&#8221;, where something doesn&#8217;t try to pretend it&#8217;s something else. This campaign had that quality. It&#8217;s what people mean by anti-marketing marketing: it works <em>precisely</em> because it doesn&#8217;t feel like it&#8217;s trying to manipulate you. Commercial activity was laundered through such careful attention to taste and culture that it felt like something else entirely.</p><h2><strong>The perfect storm</strong></h2><p>So if those were the mechanics, what made them work the way they did? I spent hours going over the reactions to this campaign.
Some reactions pitched Anthropic as makers of thoughtful collaborators pitted against those high up above:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pkNU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pkNU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 424w, https://substackcdn.com/image/fetch/$s_!pkNU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 848w, https://substackcdn.com/image/fetch/$s_!pkNU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 1272w, https://substackcdn.com/image/fetch/$s_!pkNU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pkNU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png" width="1196" height="1426" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1426,&quot;width&quot;:1196,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1316838,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.readkindredspirits.com/i/175509071?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pkNU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 424w, https://substackcdn.com/image/fetch/$s_!pkNU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 848w, https://substackcdn.com/image/fetch/$s_!pkNU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 1272w, https://substackcdn.com/image/fetch/$s_!pkNU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d30c0a-dd88-43f6-8a3d-4b6317ed8720_1196x1426.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Other reactions weren&#8217;t as keen to drink the Kool-Aid:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tp2N!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tp2N!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 424w, 
https://substackcdn.com/image/fetch/$s_!tp2N!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 848w, https://substackcdn.com/image/fetch/$s_!tp2N!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 1272w, https://substackcdn.com/image/fetch/$s_!tp2N!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!tp2N!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png" width="1194" height="1256" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1256,&quot;width&quot;:1194,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1110171,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.readkindredspirits.com/i/175509071?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!tp2N!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 424w, https://substackcdn.com/image/fetch/$s_!tp2N!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 848w, https://substackcdn.com/image/fetch/$s_!tp2N!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 1272w, https://substackcdn.com/image/fetch/$s_!tp2N!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3a5f44-488b-46a0-9bf7-8553ef3ca0b8_1194x1256.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I somewhat agree that this says something about how tech culture&#8217;s tribalism operates, and how starved we are for anything that feels genuine. And granted, accounting for algorithmic bias, much of this reaction might have been contained within tech circles.</p><p>But to analyse why the philosophy and execution worked the way they did, it&#8217;s important to pay attention to the zeitgeist.</p><p>The current AI moment is uniquely anxious:</p><ul><li><p>excited about tech</p></li><li><p>terrified about job displacement</p></li><li><p>disgusted by slop</p></li><li><p>deeply cynical about big tech, and</p></li><li><p>desperate for goodness.</p></li></ul><p>This anxiety creates a specific receptivity. We are primed to want an alternative that <em>feels</em> different: agentic not predatory, collaborative not manipulative, thoughtful not slop-ful.</p><p>The anxiety reached a particular pitch over the past couple of weeks, when the AI landscape was especially active. OpenAI launched Sora 2, their video generation model, and socials were immediately flooded with the type of rapid-fire content production that starts to blur together after a while. Other companies like Google and Meta were also pulsing out updates and new models, adding to the general noise.</p><p>Beyond the immediate timing, something slower and more fundamental has been happening in how tech culture evolves. We&#8217;ve been living in meta-ironic tech culture for years. Everything is a meme, nothing is sincere.
Earnestness is met with scepticism, even mockery.</p><p>This ironic distance was very much a product of environments where hype cycles are short and disappointments are frequent. When you&#8217;ve watched enough companies promise to change the world and then pivot to selling, I don&#8217;t know, ads, &#8220;lol jk unless&#8221; becomes the only rational reaction, the sensible default.</p><p>But cultural exhaustion has been building, and the pressure cooker is about to explode. There is a generational shift towards post-post-ironic sincerity; people are tired of everything being deeply unserious and want to believe in something, anything, again. The people yearn for genuineness.</p><p>Anthropic&#8217;s pop-up, intentionally or not, landed in this cultural moment. It doubled as emotional reassurance: human-scaled, considered, and feeding a hunger for something that was neither spontaneous ragebait (ahem, Cluely and friends) nor A/B tested to death. It offered permission to engage as a community, not a cult or a permanent subscriber to The Future. The anti-slop framing emerged organically from the timing of it all; coming right after a wave of AI tools focused on output, generation, and just <em>more</em>, the emphasis on &#8220;keep thinking&#8221; took on the tone of implicit critique. For an audience that is fed up to the back teeth with hype cycles and dangerous tech, this particular restraint registered as different, curious.</p><p>(This positioning is also largely consistent with the product itself. Claude already has that particular tone and emotive predisposition; the models feel more conversational, more human, less like you&#8217;re talking to an optimisation engine.)</p><p>It helps that Anthropic&#8217;s position in the market made this dynamic all the more potent. If OpenAI had done this exact same pop-up, it would have read completely differently, corporate and calculated.
Anthropic benefits enormously from being in second position, because it lends credibility to that David vs Goliath narrative. They can position themselves as the thoughtful alternative, the principled choice, the underdog with better values.</p><p>All of this created the perfect storm behind the reaction to what was, on paper, a small event with free merch.</p><h2><strong>The marketing scales</strong></h2><p>Whether this positioning and posturing reflects reality is, rather cleverly, beside the point. That, I think, is the subtlest part of this whole thing, and the part we need to pay attention to.</p><p>Anthropic is valued at a whopping $183 billion. They&#8217;re backed by Google and Amazon. Anthropic is far, far from a scrappy startup operating out of a garage, fighting against impossible odds. They&#8217;re one of the most well-funded companies in Silicon Valley, competing for dominance in what might be the most important technology race of the decade. The pop-up was also a part of Anthropic&#8217;s major brand campaign, a multi-million dollar effort spanning TV, streaming, print, and OOH advertising. Put into perspective, this isn&#8217;t David vs Goliath, it&#8217;s Goliath vs Godzilla (or a better analogy).</p><p>And yet, in the lines stretching down West Village sidewalks and making their appearance on feeds all over the world, it seemed like people chose not to let that valuation complicate their enthusiasm. One could argue it conveyed the same sense of familiarity and warmth as a new neighbourhood coffee shop or mom-and-pop store. It makes the intimate, grassroots feeling that much more remarkable.</p><p>That leads me to think that the pop-up worked <em>precisely</em> because it made Anthropic&#8217;s massive scale feel invisible.
Every single choice created perceptual distance from what Anthropic actually is: a $183-billion company with backing from two of late-stage capitalism&#8217;s Final Bosses.</p><p>(To be clear: large companies must have principles. Size and values aren&#8217;t inherently incompatible. But the interesting bit is how readily Anthropic&#8217;s scale disappeared from the conversation.)</p><p>Here&#8217;s the other thing: we&#8217;re unlikely to know how successful this campaign was in traditional terms. How many people converted to Claude at the event, or got a paid plan there or after? How many will still use it a month later? How many were first-time users versus poached from competitors? Will they go after London, or Tokyo, or Paris next, or will this remain a one-hit wonder? I&#8217;d argue that releasing this data would puncture the narrative, and keeping it vague means everyone can project their own definition of success onto it.</p><p>Of course, this could all be speculation, smoke and mirrors. But whatever their actual reasoning, I think the pop-up&#8212;and the reactions to it&#8212;reveals a lot about how we construct meaning around our choices, especially in times of anxiety and uncertainty.</p><p>We want to believe in something. We want our consumption choices to align with our values while propelling the collective forward. We want to feel like we&#8217;re backing the good guys. And when a company provides the right signals, the right emotional narrative, we&#8217;re willing to buy it.</p><p>Whether those signals reflect deeper reality or sophisticated positioning is almost beside the point.
The feeling was real and, given time, feelings create their own truths.</p>]]></content:encoded></item><item><title><![CDATA[AI filters don't replace art any more than instant ramen replaces food]]></title><description><![CDATA[Reducing Studio Ghibli to an AI filter shows everything wrong with how we now see art and creativity]]></description><link>https://www.readkindredspirits.com/p/ai-filters-dont-replace-art-any-more</link><guid isPermaLink="false">https://www.readkindredspirits.com/p/ai-filters-dont-replace-art-any-more</guid><dc:creator><![CDATA[Sindhu Shivaprasad]]></dc:creator><pubDate>Fri, 28 Mar 2025 12:51:41 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!NrOG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Zeitgeist is a series of essays tracing the intricate connections between tech, culture and human experience. 
</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!NrOG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!NrOG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 424w, https://substackcdn.com/image/fetch/$s_!NrOG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 848w, https://substackcdn.com/image/fetch/$s_!NrOG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 1272w, https://substackcdn.com/image/fetch/$s_!NrOG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!NrOG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic" width="1456" height="787" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:787,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:155002,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.readkindredspirits.com/i/160055033?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!NrOG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 424w, https://substackcdn.com/image/fetch/$s_!NrOG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 848w, https://substackcdn.com/image/fetch/$s_!NrOG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 1272w, https://substackcdn.com/image/fetch/$s_!NrOG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4a6309c9-120a-4282-93a8-96273c5c03b5_1920x1038.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><em>via Howl's Moving Castle by Studio Ghibli. I have a tattoo dedicated to Studio Ghibli, because of how their works changed my life for the better.</em></figcaption></figure></div><p>A couple of days ago, OpenAI released a new ChatGPT model capable of turning any picture into a specific art style. Since then, my Twitter timeline has been flooded with hundreds of posts of people turning their photos into &#8220;Studio Ghibli artwork&#8221;. The problem is twofold. One, the model captures only a surface style, with none of the narrative and storytelling sensibility that Studio Ghibli is known for. 
Two, some of these art styles aren't even Studio Ghibli&#8212;more like watercolour and pastel&#8212;which shows people don't even know what defines Studio Ghibli's art style.</p><p>This zeitgeist is perfectly captured in this screenshot of recent Google Trends data. The trending searches are all about the quickest path to creating something superficially similar to Ghibli's work, without any interest in understanding what makes their artistic approach meaningful or distinctive. The Hindustan Times even calls it &#8220;ChatGPT's &#8216;Studio Ghibli&#8217;&#8221;. It&#8217;s a misattribution that shows how readily we're willing to transfer ownership of artistic styles from their original creators to the AI tools that imitate them.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IQ9X!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IQ9X!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 424w, https://substackcdn.com/image/fetch/$s_!IQ9X!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 848w, https://substackcdn.com/image/fetch/$s_!IQ9X!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 1272w, 
https://substackcdn.com/image/fetch/$s_!IQ9X!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!IQ9X!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png" width="1456" height="798" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:798,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!IQ9X!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 424w, https://substackcdn.com/image/fetch/$s_!IQ9X!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 848w, https://substackcdn.com/image/fetch/$s_!IQ9X!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 1272w, 
https://substackcdn.com/image/fetch/$s_!IQ9X!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb94eda23-77e2-4bad-a188-ed0f6fe0cd50_2314x1268.png 1456w" sizes="100vw"></picture></div></a></figure></div><p>There's a concerning collapse of nuance happening here. We're seeing cultural flattening where &#8220;Studio Ghibli style&#8221; has become a catch-all term for any anime-adjacent art with soft colours or watercolour effects. 
This surface-level imitation without understanding the underlying principles reminds me of what happened with Van Gogh's style&#8212;people often reduce his work to swirly strokes without acknowledging how those techniques expressed his unique vision of the world and his emotional state.</p><p>The ability to distinguish between genuine Studio Ghibli aesthetics and general anime-inspired watercolour styles reflects a deeper artistic literacy. Studios like Ghibli rely heavily on aesthetic ecosystems: their power lies not in any single image but in the distinct worlds they build. Yet we're reducing universes that took years and many minds to develop into Snapchat filters, and worse, we're happy to accept that that's all there is to these styles.</p><p>This pattern erodes our ability to recognise and appreciate quality in general&#8212;not just in art, but in the design, media, and visual communication we interact with daily. You don't know why certain colours work together, or why shadows need to fall at a certain angle to indicate a time of day, or what the significance is of a character's hair going from brown to white over the course of a movie. Sure, you could say all of that doesn't matter. But you can apply that reasoning to anything. &#8220;The purpose of food is to give us energy and nutrients, so let's all eat bland nutrient pellets and get on with our lives&#8221;. Or, &#8220;The purpose of college is to get a degree, so let's use an LLM to pass our exams instead of actually applying ourselves&#8221;.</p><p>Oddly, this reminds me of the discourse about porn. There have been plenty of studies showing how constant exposure to pornography can dull sexual response and create unrealistic expectations. I feel like it&#8217;ll be the same with AI-generated art: the endless flood of images risks desensitising us to the subtleties of artistic expression. 
Porn strips intimacy of its emotional and relational context; AI art generation strips creation of its cultural and personal meaning. The sheer volume and accessibility of content leads to quick consumption rather than deep appreciation, and quality gets lost in quantity. Both reduce rich human experiences to purely visual consumption; both can make the real thing seem inadequate by comparison.</p><h4>Just because you can, doesn't mean you should</h4><p>This rush to generate without reflection has darker implications too. One user on X (fka Twitter) turned a photo of the horrific murder of George Floyd into a cute Ghibli-style image. That feels to me like a prime example of how defaulting to AI generation can strip away human judgment and cultural sensitivity. That&#8217;s not to say Studio Ghibli never talks about difficult topics, because they do. But they do it through carefully constructed narrative frameworks that provide appropriate context and respect for the weight of these issues. <em>Grave of the Fireflies</em> depicts war trauma, but rather than making war &#8216;prettier&#8217; or more &#8216;palatable&#8217;, the film creates a specific narrative space for processing difficult truths.</p><p>There&#8217;s a crucial difference between making difficult topics accessible and making them consumable. 
Converting a documented instance of racial violence into a style associated with whimsy and childhood wonder for internet clout doesn't make it more approachable, it trivialises it.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hRJw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hRJw!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 424w, https://substackcdn.com/image/fetch/$s_!hRJw!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 848w, https://substackcdn.com/image/fetch/$s_!hRJw!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 1272w, https://substackcdn.com/image/fetch/$s_!hRJw!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hRJw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png" width="1210" height="1606" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1606,&quot;width&quot;:1210,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!hRJw!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 424w, https://substackcdn.com/image/fetch/$s_!hRJw!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 848w, https://substackcdn.com/image/fetch/$s_!hRJw!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 1272w, https://substackcdn.com/image/fetch/$s_!hRJw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1164fb5-2d2a-4f78-a3f0-1a6916d1a9fe_1210x1606.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Look at this post from the White House's <a href="https://x.com/WhiteHouse/status/1905332049021415862">official X account</a>: an actual photo of a detained immigrant converted into &#8216;Studio Ghibli style&#8217; art. When the highest office in one of the world's most powerful nations reduces law enforcement and human suffering to cutesy memes, we've moved beyond bad taste into something genuinely dystopian. This goes beyond cultural flattening and misappropriation: it's institutional trivialisation (and <em>of course</em> the style isn't accurate).</p><p>Aside from being poles apart from the ethics and values of Miyazaki and Studio Ghibli as a whole, these images spread rapidly without context; they shape public perception of what's acceptable and inadvertently normalise insensitive depictions. They also become attached to the Studio Ghibli canon, which is even more dangerous for the studio's reputation. 
The whole "just because you can doesn't mean you should" argument becomes painfully relevant here. Without taste and human judgment, there's no understanding of what's appropriate or respectful.</p><p>And don&#8217;t even get me started on how large corporations profit from this cultural strip-mining, with us helping them along by filling our timelines with the resulting slop. We may as well let burglars into our homes and help them load the van with our things, maybe offering them a cup of tea for their hard work.</p><p><a href="https://aftermath.site/studio-ghibli-ai-art-openai-gpt-sam-altman-is-just-the-biggest-pile-of-shit">Luke Plunkett</a> says it best: &#8220;Companies like OpenAI are hoping that the longer their tech can stay out there, the more it becomes part of the background noise of the modern internet, and the more likely it is that they themselves will become part of the fabric of the modern internet, and not a bunch of raiders stripping the place for its creative wiring&#8221;.</p><h4>The 'accessibility' argument is a cop-out</h4><p>Some people have argued that this AI-generated content is a good thing because it introduces new audiences to Studio Ghibli, potentially bringing more viewers to their work. But I wonder if that's actually the case. They're being introduced to a surface-level impression that might actually prevent them from engaging with the actual films and their deeper artistic and narrative elements. I think the Google Trends screenshot is a clear example of that: no one&#8217;s searching for &#8220;Studio Ghibli movies&#8221;. Our focus on output and our rush to produce content often come at the expense of genuine engagement, simply because we won't take the time to learn, engage, and let things breathe.</p><p>Another argument I&#8217;ve seen crop up about AI art generation is that it removes the barrier to entry to art. I&#8217;m not thoroughly convinced by that, either. 
You still have to pay 20 dollars a month for a subscription to a model that can generate quality images. It&#8217;s like renting an ability. That same amount can get you a good set of paints and a sketchbook that will last for months, with no token limits or downtime, and whose results can be preserved for as long as you choose.</p><p>Sure, it&#8217;s not equal to dropping millions of dollars on art school. But I&#8217;d argue that many prolific artists haven&#8217;t done that, either. They&#8217;ve just sat down with their supplies day after day to bring something to life. What, then, is the real barrier to entry? I think what people often mean by &#8220;barrier&#8221; is actually the time and effort required to develop skills. Art has always been accessible&#8212;people just didn't think it was, because they didn't want to put in the effort or make bad art before they got to good art.</p><div class="pullquote"><p>AI removing the barrier to entry to creation is like saying we've removed the barrier to entry to mountain climbing by installing a lift to the peak. Yes, more people can now reach the summit, but have we actually made mountain climbing more accessible, or have we fundamentally changed what it means to climb a mountain?</p></div><p>If creation is the process, the labour, then the creator themselves is the end product, not the art they produce. The art is a proxy, a stand-in for all the growing, learning, and perspective-shifting the creator has gone through during the process of creation. I think perhaps people forget why we create art in the first place. Sure, one aspect is getting eyeballs. But there was always the foil to that: creating to grow, creating for its own sake, creating to document. I fear that, with AI, the convenience of generation is compromising the other benefits that art truly has for us. Is the Sistine Chapel beautiful only aesthetically, or is its beauty enhanced by Michelangelo&#8217;s effort? 
Conversely, does the ease and speed of perfect art generation make it less meaningful&#8212;just another pretty picture in an infinite feed?</p><p>Art has historically been much more than what meets the eye: a means of documenting human experience and perspective, a process of personal growth and discovery. The benefit of AI is in mass production. It might give us more images, but it might also make us poorer in terms of personal growth, cultural understanding, and human connection.</p><p>The problem with a lot of common arguments against relegating art creation to AI is that they don't resonate with a society that values output over effort. Nobody wants to take the time to learn a skill&#8212;or pay someone who has the skill&#8212;if they can generate the result in just five minutes for under 20 dollars. Artists are consumers, but not all consumers are artists, so most are happy with a perfect &#8220;on spec&#8221; image. People don't want to develop the underlying skills to tackle new challenges, or work through the creative difficulties.</p><p>But again, the looming risk is that everything starts to feel and look the same, because true breakthroughs often come from deep understanding and novel combinations of knowledge. This has already become pretty obvious in the software world and will slowly creep into art and other creative spaces. The shortcut to an aesthetic image ultimately limits our capabilities in ways we might not recognise until it's too late. And because these degradations are gradual and systemic, they're hard to see in the moment. By the time the impacts become obvious, we may have lost something difficult to recover.</p><h4>Who serves whom?</h4><p>I want to be clear: this isn't an argument against AI, or against people having fun with new technology. These are tools for humanity, after all, and unbridled joy is a massive part of what makes us human. 
But what I&#8217;m wary of is when tools start shaping us instead of us shaping them. The question isn't whether to use AI or not&#8212;it's about understanding what we're gaining, who profits, and what we might be unconsciously losing in the process. Because once cultural literacy and artistic understanding erode, they're much harder to rebuild than they are to maintain.</p><div class="pullquote"><p>We need to examine our own motivations here: are we creating to express something genuine, to document, to grow&#8212;or to get attention from strangers on the internet? Is being seen more important than having something to say?</p></div><p>Perhaps we need to be more precise with our language, and differentiate between &#8216;generating&#8217; images and &#8216;creating&#8217; art. The former might produce beautiful outputs, but the latter involves a transformative process that changes both the art and the artist. As <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Venkatesh Rao&quot;,&quot;id&quot;:2264734,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/562e590a-9494-4f66-87f0-330c1be204c2_500x500.jpeg&quot;,&quot;uuid&quot;:&quot;4de1c12f-bcff-45b5-893c-5c2e8c6d2147&quot;}" data-component-name="MentionToDOM"></span> <a href="https://substack.com/@contraptions/note/c-91312059?utm_source=notes-share-action&amp;r=5tael">noted about writing with AI</a> (and I think this applies equally to art): AI is great at execution&#8212;except for &#8220;the very tip of agency which is actually making creative decisions about what's worth creating at all and why, and what to prioritise/ emphasise for a given purpose.&#8221; This is almost the essence of creation itself.</p><p>In the long run, it'll all depend on what we value more: the convenience or the skill.</p><div><hr></div><p><em>If you&#8217;ve made it all the way here, thank you for reading! 
This essay marks the start of a new section of Kindred Spirits called Zeitgeist, dedicated to tracing the intricate connections between tech, culture and human experience. There will definitely be plenty of overlap between Zeitgeist and Kindred Spirits because, ultimately, they both seek answers to the same question: How do we exercise more agency over what we do, choose, consume, make and become? While Kindred Spirits looks inwards, Zeitgeist will look outwards, at the external influences that impress upon us. </em></p>]]></content:encoded></item></channel></rss>