Adobe CTO Reveals How AI Transforms Creative Work, Highlights India's Key Role
Creative work rarely happens alone. Every advertisement, newspaper page, marketing campaign, or film sequence relies on a production line of people, tools, and processes that turn ideas into finished products at scale.
Over the past decade, this production line has become increasingly digital. Now, generative AI is reshaping it once again.
The Gap Between AI Suggestions and Practical Use
AI tools can now generate images, layouts, video clips, and text with a simple click, and designers, editors, publishers, and marketing teams keep hearing that years of training might compress into a prompt. On the actual production floor, however, deadlines, brand rules, and technical constraints create a wide gap between what AI can suggest and what people can actually use.
Ely Greenfield is CTO and senior vice president of the Creative Products Group at Adobe, whose software underpins much of the world's creative and document work. He points to a major misunderstanding about generative AI.
"People often assume creativity is a single-step act," Greenfield says. "You don't get the content you want in one shot. You iterate and iterate and iterate. That's the soul of creativity."
Overcoming Early AI Limitations
Early generative systems could create an image from a prompt, but they could not reliably keep everything else constant while making a small change. Asking for a different hat might give you a different person, background, or mood.
Greenfield explains why: these systems are vast statistical models with no memory, so each request essentially starts fresh. "They forget it immediately," he states.
Adobe has spent recent years compensating for this limitation, feeding models images as well as text, allowing them to "see" previous outputs, and wrapping them in interfaces that let users point, select, and refine instead of endlessly re-prompting.
Now a designer can select a specific object, describe a change in natural language, and expect everything else to remain untouched.
Automation's Limits in Commercial Contexts
The limits of automation become even clearer in commercial and industrial settings. Could AI simply refresh a newspaper layout every week by dropping new stories into place? Greenfield offers a blunt assessment of the trade-offs.
Yes, a model could attempt it; no, it would not be reliable enough on its own. These systems "compute averages and probabilities," he explains, which makes them ill-suited for environments with hard constraints.
In print, especially the analogue world of mechanical presses, even a slight color shift can render a page unusable. Tolerances are unforgiving, and color specifications are rigidly enforced.
The Final Mile Demands Human Judgment
Greenfield remains skeptical of claims that creative production can be fully automated. AI can accelerate parts of the workflow and excels at repetitive or large-scale tasks, but the final mile still demands human judgment.
He envisions hybrid systems in which professional tools enforce precision while AI boosts speed and scale.
At scale, the implications are profound. Enterprises managing vast content libraries can ask an AI agent to apply new style guidelines across thousands of assets, updating imagery across years of published material.
What once required days of manual effort or custom scripts from specialists becomes a conversational request. For organizations with "content supply chains" – publishers, marketing departments, media companies – this automation offers a significant competitive advantage.
Tempering Expectations in Film and Video
Even in areas buzzing with generative hype, like film and video, Greenfield tempers expectations. Tools like Firefly can already generate short clips and help assemble longer sequences, and creators now produce five- and ten-minute films.
Feature-length cinema, however, demands levels of resolution, color depth, and continuity that current models struggle to deliver. Professional filmmakers care deeply about consistency from shot to shot and precise color grading; they notice where a character places a glass on a table.
"If they don't like the decision the AI made, they need to be able to come in and say, 'Nope,'" Greenfield emphasizes.
Adobe works closely with studios on custom models trained on the studios' own data and designed to slot into existing production pipelines rather than replace them.
Greenfield predicts a blended future in which practical shooting and traditional effects combine with generative tools, much as computer-generated imagery is used today. Fully generative feature films may arrive, but they will be the exception, not the norm.
Engineering at Scale, from India
Much of this innovation is built far from Adobe's California headquarters. Greenfield reveals that roughly one-third of the company's development happens in India, spanning core creative tools, document workflows, and generative AI itself.
Entire products, like Illustrator, are developed out of India, and major parts of Photoshop, Firefly, and Adobe's document business have deep engineering roots in Noida and Bengaluru.
What stands out, Greenfield notes, is not just the scale of these teams. It's the originality of what they produce.
One recent example is Turntable, a feature in Illustrator that solves a long-standing frustration for illustrators working in two dimensions: the need to reuse a hand-drawn object from different angles without redrawing it from scratch.
Using generative AI, Turntable allows a flat illustration to rotate as if it were three-dimensional, preserving the original style while generating new views. "That's not the creative part of the work," Greenfield clarifies. "That's the rote part. And we can reduce hours of effort to seconds."
Building features like this requires a mix of skills that Adobe increasingly finds in India: deep expertise in machine learning alongside large-scale engineering and, crucially, product thinking grounded in real customer needs.