YouTuber IV 2026 – 35 x 50cm, acrylic, charcoal on canvas
YouTuber IV 2026 is a physically and psychologically overwhelming painting in the YouTuber sequence. The dominant head fills the left side and most of the upper register, massive, eyes narrowed in intense focus, mouth open in mid-explanation or mid-rant. The hands are raised in that familiar YouTube gesture, palms out, fingers splayed, the explanatory gestural wave that has become the series’ signature motif. But here they are enormous, almost monstrous, dominating the foreground as if trying to push the viewer back or pull the world in. The impasto is thick, gouged, scraped, rebuilt: black, grey, bruised ochre, with those corrosive green intrusions that read as screen glare, chemical burn, or the afterimage of too many ring-lights. The face is not dissolving; it is expanding, pressing against the edges of the canvas, refusing containment.
The subject of the painting is the surveillance/platform infrastructure that operates beneath the colorful surface of digital culture. The grim extractive machinery, the data centers, the monitoring systems, the algorithmic sorting: all of it functions in the background, invisible. What users see is the bright, engaging interface: the colorful thumbnails, the vibrant branding, the stimulating content. The painting foregrounds this hidden aspect of digital surveillance as something innately corrupting. The digital overlay is technically perfect as a system, but its moral purpose is corroded.
The painterly process makes both layers visible simultaneously. The grey understructure shows the reality of extraction, exhaustion, violence, dissolution. The color shows the performance, the oversaturated aesthetic required to compete for attention, the false vitality of digital presentation. Are we looking at a contaminated subject, or are we looking through contamination? Has the extraction process poisoned the figure, or has it poisoned the entire system of observation, making neutral viewing impossible?
The inset screen in the lower right, showing what appears to be a smaller, clearer figure, creates a mise en abyme effect. Is this the YouTuber watching themselves? A viewer watching the YouTuber? The next iteration waiting to replace the current one who’s collapsing? The relationship between the dissolving main figure and the contained screen-figure is asymmetrical: one version huge but disintegrating, the other small but still intact, still legible, still performing.
The architectural elements, dark vertical and horizontal structures, suggest an interior space: maybe a studio, maybe a slaughterhouse, maybe a data center. The ambiguity is intentional: all three are sites of extraction and processing. The palette remains punishing: tar blacks, ash greys, that nauseating yellow-green of decay or chemical contamination, dirty whites that look like exposed fat or bone.
The miniature head mirrors the large one: same intensity, same open mouth, same frantic energy, but reduced to a fraction, compressed, powerless. The graphite storm that swirls around the large head bleeds across the divide, tangling into the small frame, showing that the noise is shared, the pressure is shared, the extraction is total.

The technical perfection of surveillance systems, Palantir’s flawless data integration, facial recognition software’s frightening accuracy, the seamless operation of algorithmic sorting, all of that functions impeccably. The infrastructure works exactly as designed. The glass is crystal clear from an engineering perspective.
But the moral corrosion, the purposes these systems serve, the extractions they enable, the violence they facilitate, that’s what the yellow-green contamination visualizes. It’s not technical failure; it’s ethical rot made visible. The smearing, dripping, toxic overlay isn’t bugs in the code; it’s the moral consequence of deploying perfect systems toward corrupt ends.
This reframes the entire painting. The yellow-green overlays are not showing malfunction; they are showing what happens when perfectly functioning systems are weaponized against people. The surveillance apparatus works flawlessly to monitor, track, extract, control, and that’s precisely the problem. The contamination is intentional, operational, by design.
The figure reduced to toxic waste isn’t a glitch or accident; it’s the intended output of systems working exactly as programmed. Charter/free cities operate with technical precision to suspend democratic accountability. Data centers efficiently process every interaction into extractable value. Platform algorithms perfectly optimize for engagement regardless of human cost. All functioning as designed. All morally corroded.
This makes the “slipping, smearing” quality even more powerful. It’s not that the glass is dirty or the screen is failing. It’s that when you look through ethically bankrupt systems, even technically perfect ones, everything becomes contaminated by the viewing apparatus itself. You can’t get clean data from corrupt infrastructure. You can’t have neutral observation through extraction machinery.