Creativity & Storytelling
Generative AI Explorations
Exploring the intersection of design, storytelling, and generative AI.


Where design meets storytelling
My journey into AI art direction began in late 2022 — a natural extension of my design practice into the emerging frontier of generative creativity. What started as curiosity quickly evolved into a hands-on exploration of how AI can serve as both a design partner and a storytelling medium, bridging my professional focus on systems and interfaces with a personal passion for narrative and film.
Since then, I’ve collaborated with a range of clients and creative teams — producing fashion spreads, commercials, and full short films that blend human composition with machine-driven ideation. My work has attracted viral attention on Reddit, leading to active partnerships with Curious Refuge, Narrative Coders, and other studios experimenting at the intersection of design and generative media.
Bespoke AI workflows
This exploration is not about the final image, but the workflow — an evolving toolkit that unites prompt design, motion direction, and spatial composition into a process designers can own and adapt. It demonstrates how generative tools can extend a designer’s capacity for rapid visualization, mood exploration, and storytelling — allowing ideas to move from concept to moving image with unprecedented speed and fidelity.
Recent showcases include concept and visual design for the World Animal Protection initiative, alongside a series of experiments demonstrating how current video and generative tools are shaping the future of creative direction. Each project functions as a live study in AI-assisted design orchestration, blending structured intent with cinematic imagination.
Behind each project sits a growing ecosystem of bespoke AI workflows — purpose-built pipelines that integrate visual generation, motion synthesis, and narrative direction. Drawing on tools like Midjourney, Runway, Google Veo, Gemini, Claude, Adobe, ComfyUI, Reve, and Sora, I’ve developed adaptive systems that balance automation with intentional creative control.
These workflows are less about tool mastery and more about design orchestration — how each layer of intelligence and media interacts within a coherent visual process. Whether generating conceptual imagery, directing motion sequences, or refining visual identity through AI-assisted iteration, each pipeline is designed to enhance speed, cohesion, and creative precision.
Interested in discussing generative AI?