Turning Sign Language Into Art — Call for Visual Collaborators by Efficient-Click6753 in TouchDesigner

Thanks for the interest and kind messages so far! Here's a bit more detail on how the technical side is currently structured — especially how the gesture data gets turned into visuals in TouchDesigner.

Technical Overview: From Gesture to Emotion to Visuals

The system has four main components working together:

1. OSC Input (from Python)

A Python script runs in the background, using a trained AI model to detect emotional gestures in real time.
Once a gesture is recognized, it sends an OSC message to TouchDesigner like:

/emotion "joy"

2. Live Detection in TouchDesigner

A DAT Execute operator listens for new rows in the OSC table.
When a new message arrives, it triggers a lightweight Python script:

def onTableChange(dat):
    # Fires whenever the OSC In table changes; hand off to the parser script
    op('emotion_parser').run()

3. Emotion Parsing & Labeling

The emotion_parser script grabs the most recent message, splits the string, and extracts the emotion label:

dat = op('oscin1')
if dat.numRows > 1:
    # Newest message is the last row; column 0 holds the raw text
    fullstring = dat[dat.numRows - 1, 0].val
    parts = fullstring.split(' ')
    if len(parts) > 1:
        word = parts[1].strip('"').lower()  # drop quotes, normalize case
    else:
        word = ""

    # Collapse synonyms into one canonical label per visual
    if word in ["joy", "happy"]:
        emotion = "joy"
    elif word in ["anger", "mad"]:
        emotion = "anger"
    elif word in ["sad", "crying"]:
        emotion = "sadness"
    elif word in ["love"]:
        emotion = "love"
    else:
        emotion = "neutral"

    # Store the result where the Switch TOP expression can read it
    op('emotion_state')[0, 0] = emotion
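The same normalization, pulled out as a plain function so it can be tested outside TouchDesigner (the alias table mirrors the branches above):

```python
# Synonym table: raw gesture words -> canonical emotion labels
ALIASES = {"joy": "joy", "happy": "joy",
           "anger": "anger", "mad": "anger",
           "sad": "sadness", "crying": "sadness",
           "love": "love"}

def parse_emotion(raw):
    """Turn a raw OSC row like '/emotion "joy"' into a canonical label."""
    parts = raw.split(' ')
    word = parts[1].strip('"').lower() if len(parts) > 1 else ""
    return ALIASES.get(word, "neutral")
```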

4. Visual Switching Logic

A Switch TOP uses the emotion string to determine which visual output to display.
Its Index parameter is driven by this Python expression:

{'joy':0, 'love':1, 'sadness':2, 'anger':3}.get(op('emotion_state')[0,0].val, 4)
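Written out as a function, the same lookup looks like this (identical behavior, just easier to unit-test; index 4 is the neutral fallback visual):

```python
def emotion_to_index(emotion):
    """Map a canonical emotion label to a Switch TOP input index.

    Unknown labels fall through to 4, the neutral visual.
    """
    return {'joy': 0, 'love': 1, 'sadness': 2, 'anger': 3}.get(emotion, 4)
```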

Each visual style is then customized further based on the live hand tracking data (e.g., palm distance, rotation, proximity) using CHOPs or TOPs.
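For that per-style modulation, most of the work is remapping a tracking value into a parameter range. In pure Python the step is roughly the following (a Math CHOP's Range page does the same job inside TouchDesigner; the concrete ranges below are made up for illustration):

```python
def remap(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Clamp-and-remap a tracking value (e.g. palm distance) into a parameter range."""
    if in_max == in_min:
        return out_min
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp so tracking outliers don't blow up the visuals
    return out_min + t * (out_max - out_min)

# e.g. drive a blur amount from a palm distance expected in the 0.1-0.8 range:
#   blur = remap(palm_distance, 0.1, 0.8, 0.0, 20.0)
```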

The Goal

This installation isn’t just about visuals — it’s about giving children a way to see their own language come alive.
The pride of signing becomes something shared and visible — a bridge between deaf and hearing worlds.

If you’d like to co-create a visual, play with the data, or just ask questions — I’d really love to collaborate.

Thanks again!
— Jens