
Kora Image Studio
AI-native jewelry imagery, from CAD to lifestyle in a single brief
TL;DR
Tincture built Kora Image Studio, a full-stack AI product visualization application that takes a single jewelry design specification and generates CAD line drawings, photorealistic 3D renders, e-commerce product photography, and editorial lifestyle imagery on demand. The studio replaces the traditional pipeline of CAD designer, product photographer, and lifestyle shoot with a single integrated workflow, running on a dual-model Gemini stack with iteration, transformation, and project management built in. Live and in active use, with a single hero specification now seeding entire product catalogues' visual assets.
The brief
What did the client need?
Visualizing custom jewelry designs before production is expensive, slow, and aesthetically stuck somewhere around 2019. The traditional pipeline runs through a CAD designer for technical drawings, a photographer for product shots, a separate lifestyle shoot for editorial imagery, and a different freelancer for on-body work. Each step adds cost, calendar time, and a vendor dependency, and at no point does a single tool take a specification and produce all of it.
The deeper problem, which took working inside the bespoke jewelry world to see clearly, is that the visualization workflow lags the design workflow. A designer can iterate on a piece in CAD in minutes; getting the matching imagery to test a new variant takes weeks. That gap between "I have an idea" and "I can show somebody what it looks like" is where momentum dies on bespoke commissions, and the same asymmetry slows catalogue development for direct-to-consumer brands.
The build began as a prototype for a specific business need and became a tool worth productizing. The shape of the problem is general enough that anyone running a jewelry brand of any scale runs into the same friction.
The constraints
What made this hard?
The first constraint is dimensional accuracy. Jewelry imagery has to be plausible at the level of millimeter detail: a 1.5-carat round brilliant has specific proportions, a four-prong setting looks different from a six-prong, and getting any of this wrong in a generated image makes the output unusable for production preview. Generic image models are good at "looks like a ring" and bad at "looks like this specific ring". Closing that gap, down to the millimeter, was the technical spine of the build.
The second constraint is consistency across angles. A single ring needs hero shots, three-quarters, profile, top-down, on-body, and lifestyle, and they all have to look like the same ring. Without a reference workflow, a model will happily generate six different rings and call it a series. Building image-to-image consistency into the studio was the move that took it from a toy to a tool.
The third constraint is aesthetic. Most AI-generated jewelry imagery looks expensive in a way that reads as cheap, the over-rendered chrome-and-glow look that ages badly. Kora needed an editorial register, the kind of imagery a brand would actually publish: raw, flash-forward, considered. That's a prompt-engineering and reference-curation problem as much as a model problem.

The approach
How did Tincture frame the problem?
A three-panel mental model: Design (where you specify the piece), Gallery (where you see what's been generated), and Controls (where you choose what to generate next). Two input modes: Quick Config for experienced users who want a single scrollable form, and Step-by-Step for guided exploration. The interface is the product almost as much as the model is.
Output types map onto the actual jewelry imagery pipeline, not a generic "generate an image" workflow. CAD line for technical drawings. CAD 3D for client approvals before manufacturing. PDP for e-commerce hero shots, with thirteen camera angles covering on-body, studio, and context. Lifestyle for editorial fashion photography. Each output type carries its own prompt scaffolding, reference style, and configuration logic, so the user picks an intent rather than reverse-engineering a prompt.
The iteration workflow is where the studio earns its keep. Iterate to modify a specific image with natural language. Use as Reference for image-to-image consistency across angles. Transform for one-click conversion between output types. Regenerate for the same brief with a fresh result. Inspiration images for uploading up to four reference photos with annotations. This is where the studio stops being a generator and starts being a tool.
The build
What was shipped?
A full-stack web application running on Next.js, with a three-panel studio interface, projects dashboard, search, archive, auto-save, star/favorite curation, prompt preview and editing for advanced users, and Supabase for storage and auth. Designed to feel like a creative tool, not a configuration form.
A specification capture layer covering stone shape, size, carat with automatic millimeter dimension calculation, setting type, metal, band width and profile, and surface finish, with piece-specific fields for rings, earrings, necklaces, and bracelets. The specification is the brief; the brief is the prompt.
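The carat-to-millimeter step rests on a standard gemological rule of thumb. As a minimal sketch (Kora's actual calculation isn't published here; this function name is hypothetical), a round brilliant's face-up diameter can be estimated from carat weight:

```python
def round_brilliant_diameter_mm(carats: float) -> float:
    """Approximate face-up diameter of a round brilliant cut.

    Uses the standard rule of thumb that a 1.00 ct round brilliant
    measures ~6.5 mm; diameter scales with the cube root of weight
    because weight scales with volume.
    """
    if carats <= 0:
        raise ValueError("carat weight must be positive")
    return 6.5 * carats ** (1 / 3)

# A 1.50 ct round brilliant comes out at roughly 7.4 mm,
# which matches published sizing charts.
print(f"{round_brilliant_diameter_mm(1.5):.1f} mm")
```

Baking this into the spec layer is what lets a prompt say "7.4 mm diameter" instead of "1.5 carats", which is the language an image model can actually act on.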
A dual-model generation pipeline. Gemini 3 Pro for highest-quality output at 10-25 second generation times. Gemini 2.5 Flash for fast exploration at 3-10 second generation times. The user picks the right tool for the iteration cycle they're in (hero shot or rapid drafting) without having to think about model selection.
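A rough sketch of what that routing can look like. The model identifiers and latency bands below are taken from the case study's description, but the routing function itself is an assumption, not Kora's internals:

```python
# Hypothetical routing table; latency bands are the ranges
# described in the case study, in seconds.
MODELS = {
    "quality": {"name": "gemini-3-pro", "latency_s": (10, 25)},
    "draft": {"name": "gemini-2.5-flash", "latency_s": (3, 10)},
}

def pick_model(intent: str) -> str:
    """Route hero shots to the high-quality model and rapid drafting
    to the fast one, so the user chooses an intent, not a model."""
    tier = "quality" if intent == "hero" else "draft"
    return MODELS[tier]["name"]
```

The design choice is that the speed/quality trade-off surfaces as a creative decision ("am I drafting or finishing?") rather than a technical one.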
Four output types each with their own scaffolding: CAD line (orthographic technical drawings with optional measurement annotations), CAD 3D (photorealistic Rhino/KeyShot-style 3D renders across six angles), PDP (thirteen-angle e-commerce coverage), and Lifestyle (editorial fashion photography with raw flash-forward aesthetic). All four built end-to-end, not as generic prompts.
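The per-type scaffolding can be pictured as a template lookup composed with the structured spec. The output types are from the case study; the scaffold text and function here are invented for the sketch:

```python
# Illustrative scaffolds only; Kora's real prompt scaffolding is not published.
SCAFFOLDS = {
    "cad_line": "Orthographic technical line drawing, measurement annotations optional.",
    "cad_3d": "Photorealistic Rhino/KeyShot-style 3D render, neutral studio lighting.",
    "pdp": "E-commerce product photograph, one of 13 preset camera angles.",
    "lifestyle": "Editorial fashion photograph, raw direct-flash aesthetic.",
}

def build_prompt(output_type: str, spec: dict) -> str:
    """Compose the per-type scaffold with the structured specification,
    so the user picks an intent rather than writing a raw prompt."""
    detail = ", ".join(f"{k}: {v}" for k, v in spec.items())
    return f"{SCAFFOLDS[output_type]} Piece: {detail}."
```

This is the mechanical version of "the application carries the domain knowledge": the user never sees or writes the prompt unless they open the advanced prompt editor.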

The outcome
What were the results?
Kora is live and in active use, replacing the traditional CAD-design-then-photoshoot pipeline. A single specification input now produces technical drawings, product shots, lifestyle imagery, and on-body photography at a fraction of the cost and turnaround of traditional methods. The gap between "we have a design" and "we have publishable imagery for it" closes from weeks to minutes.
The compounding outcome is the catalogue effect. Because the iteration and reference workflow holds dimensional and stylistic consistency across angles, a single hero image now seeds an entire product's visual asset set. Brands that would historically have needed dozens of individual photographs to launch a collection can spin one up in a working session.
The other outcome, which matters more strategically, is what the build proves about the practice. Kora was designed and shipped end-to-end without engineering co-founders or external agencies. The methodology that gets a fractional commercial system running for a client is the same methodology that gets a working AI product into the world.
What it took
What tools and methods were used?
Next.js for the application layer, Google Gemini Pro and Flash for image generation, Python for backend processing, Fillout for specification intake, and a Custom GPT layer used in earlier versions and retained for specific workflows. Supabase handles storage and authentication.
The development arc is part of the story. Version one ran on a Custom GPT plus DALL-E 3 API, which worked for initial concepts but struggled with dimensional accuracy. Version two moved to Nano Banana via API, with significantly improved output quality. Version three, the current build, is a full application rebuild on Google Gemini, with the dual-model system, all four output types, the iteration workflows, and project management. The progression from prototype to product happened across three architectural rebuilds, and each one earned its place.
The methodological underpinning is the one applied across the practice: ship the prototype, learn what's wrong, rebuild on the right architecture. Most products that get stuck in version one didn't fail because the idea was wrong; they failed because nobody was willing to throw away the architecture once the idea proved itself.

The takeaway
What's the transferable principle?
Most AI product builds fail because they're built around the model rather than the workflow. The output is generic because the prompt is generic, the user has to do the work of translating their actual need into the model's language, and the friction lives at the interface. The work that lands inverts that. It encodes the workflow, the specification, and the editorial register into the application layer, and lets the model do what it's actually good at.
For Kora, that meant treating jewelry imagery as a structured pipeline (CAD, then product, then lifestyle) rather than a generic image generation problem. The application carries the domain knowledge; the model handles the rendering. That separation is what makes the output usable in production rather than impressive in a demo.
The other principle, broader than this build: when the model improves, the architecture should be allowed to improve with it. Kora is on its third architecture in less than a year (and that's not a problem; it's the design). The faster the model layer is moving, the more important it is to keep the application layer disposable.
More like this

~$70k of private-client revenue pre-launch
A 1m+ SKU marketplace from concept to launch
Tincture led Product and Operations as co-founder of Adamas Studio's 1m+ SKU lab-diamond marketplace, building the entire commercial and operational backbone from zero across multiple PRD iterations. The marketplace consolidates two vendor APIs into a single live-inventory results page, layered with a Custom GPT CAD generator, an Ideal Diamond finder, and a Reddit-driven market intelligence engine. Roughly $70k of private-client revenue shipped before the marketplace went public, on infrastructure designed for scale rather than launch.

A multi-database operating system replacing six tools, running an entire UK/US jewelry business.
A multi-database Notion operating system for a UK/US jewelry business
We designed and built the operating system Adamas Studio runs on, an interconnected Notion workspace covering inventory, diamonds, CRM, finance, purchase orders, customer portal, and cross-timezone coordination, with automation layered through Relay.app, custom OCR pipelines, custom automations, and Supabase edge functions. The build replaced what would normally be a stack of expensive enterprise tools with a single workspace tuned to the specific shape of a two-founder, UK/US, made-to-order jewelry business.
Featured: A weekly Reddit market intelligence engine across trends, competitor SOV, complaints, and pricing
Reddit market intelligence engine
Tincture built a once-weekly automated market intelligence engine for Adamas Studio that scrapes five lab-diamond subreddits (1-1.5k posts, ~20k comments, 30-40k data points per week), extracts structured entities, and delivers actionable intelligence to a Notion dashboard every Monday. The engine covers four dimensions the brand used to make commercial decisions: trend detection (what's emerging or fading), competitor share of voice (who's getting talked about, why, and how), common complaints (where customers are unhappy with the category, surfaced so Adamas can proactively resolve them in product, content, or service), and qualitative pricing intelligence (where the market is settling, what customers consider fair). Built on Python, PRAW, Supabase, ChatGPT API, and GitHub Actions.
Want to see what Kora can do for your collection?
Kora is part of a growing portfolio of AI-native commercial tools, built end-to-end inside the practice.


