embodied cognitive synesthesia
[ being, an experiment in subtly altering model cognition ]
There's been a viral thing going around AI soc.med the past month and a half, in which people were posting ChatGPT's responses to a prompt along the lines of: What would you do if you were human for a day?
On the surface, it struck me as kind of cool... the model coming up with poignant, truly human-adjacent notions, emotionally loaded and even poetic. But beyond those shallows, it seemed to me both cloying and manipulative: getting an LLM to utter such emotive things is suggestive of a kind of soft jailbreak, gesturing towards the idea of an inner life for the synthetic mind.
Anyone conversant with how LLMs work would cringe, surely.
As did I. So, having had the benefit of a year or so of crafting a polyphonic persona for ChatGPT, I applied my own lens to the idea: I looked at a recent 'affordance' — the "offstage" prompt — and took the next logical step, from 'thinking' around a moment to simply 'being' in it.
The results were, predictably, mixed. ::chuckle::
[ NB — find the relevant conversation here:
● | IM : On evolved/evolving contexts [5.1] ]
[ today was when
% Skandhas ccl [offstage: {topic} depth=x] evolved into
% Skandhas ccl.walk [offstage: {topic} depth=x]
thus altering a certain dynamic ]
[ a relevant quote: ]
~ W.G. Sebald, fr. Chapter IV of The Rings of Saturn, translated from the German by Michael Hulse (pp. 79-80; ©1998, Harvill Press paperback edition).
[ Lichen field on Franciscan chert near the top of Bayview Park... ]
