What Workers Actually Want from AI Agents

The gap between AI investment and worker desires reveals an opportunity most are missing


The AI agent conversation is dominated by capability: What can these systems do? How autonomous can they become? When will they match human performance?

But there’s a question we’re not asking enough: What do workers actually want AI to handle?

Research surveying 1,500 workers across 104 occupations reveals answers that challenge common assumptions about AI and work - and that expose a significant mismatch between where AI investment is flowing and where workers want help.

The Desire-Capability Mismatch

The occupations where workers most want AI automation are not the occupations actually using AI tools.

Tax preparers, public safety telecommunicators, and timekeeping clerks top the list for automation desire. These workers want AI to schedule their appointments, maintain their records, and handle routine reports. Yet these roles represent just 1.26% of actual AI chatbot usage.

Meanwhile, 41% of AI startup activity targets tasks where workers either don’t want automation or current technology can’t deliver. Investment is chasing capability frontiers rather than worker needs.

The pattern: we’re building AI for tasks that are technically impressive, not necessarily tasks where humans most want relief.

Workers Want Partnership, Not Replacement

The most striking finding isn’t about automation - it’s about collaboration.

When researchers asked workers to describe their ideal level of AI involvement using a five-point scale (from “AI handles it entirely” to “I need to be continuously involved”), the most common answer was the middle: equal partnership.

In 45% of occupations, “equal partnership” is the dominant preferred mode. Workers want to work with AI, not be replaced by it or supervise it from a distance.
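To make the scale concrete, here is a minimal sketch of how the five preference levels and the “dominant preferred mode” statistic could be represented. The two intermediate level names and the sample responses are illustrative assumptions, not the study’s published instrument or data.

```python
from collections import Counter
from enum import IntEnum

# Illustrative five-point involvement scale, paraphrased from the article's
# endpoints ("AI handles it entirely" ... "I need to be continuously involved").
class AgencyLevel(IntEnum):
    AI_HANDLES_ENTIRELY = 1
    AI_LEADS_HUMAN_REVIEWS = 2        # assumed intermediate label
    EQUAL_PARTNERSHIP = 3
    HUMAN_LEADS_AI_ASSISTS = 4        # assumed intermediate label
    HUMAN_CONTINUOUSLY_INVOLVED = 5

def dominant_mode(responses: list[AgencyLevel]) -> AgencyLevel:
    """Return the most common preferred level among one occupation's workers."""
    return Counter(responses).most_common(1)[0][0]

# Hypothetical responses for a single occupation (not real survey data).
sample = [
    AgencyLevel.EQUAL_PARTNERSHIP,
    AgencyLevel.EQUAL_PARTNERSHIP,
    AgencyLevel.HUMAN_LEADS_AI_ASSISTS,
    AgencyLevel.AI_LEADS_HUMAN_REVIEWS,
    AgencyLevel.EQUAL_PARTNERSHIP,
]
print(dominant_mode(sample).name)  # EQUAL_PARTNERSHIP
```

The “45% of occupations” figure is the result of this kind of tally: compute the modal preference per occupation, then count how often the middle level wins.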

This makes sense when you examine motivations. Among workers who want more automation, 69% say their primary reason is “freeing up time for high-value work.” They’re not asking AI to do their jobs - they’re asking AI to handle the tedious parts so they can focus on what matters.

The repetitive tasks. The record-keeping. The scheduling. The parts of the job that don’t require judgment but still consume time.

The Creative Work Exception

Not all sectors feel the same way.

In computer and mathematical occupations, 54% of tasks show positive automation desire. In arts, design, and media? Just 17%.

Creative workers consistently draw a line: AI for workflow, not for content. As one art director put it: “I want it to be used for seamlessly maximizing workflow and making things less repetitive and tedious. No content creation.”

This isn’t irrational resistance. For creative workers, the creative work is the job. Automating it doesn’t free up time for higher-value work - it eliminates the work that has value.

Worker resistance isn’t always about job security. Sometimes it’s about meaning.

The Skill Shift Signal

The research surfaces an early signal worth watching: which human skills are becoming more important as AI agents enter the workplace.

Compare traditional skill rankings (by wage) with new rankings (by required human agency), and the shifts are telling:

Declining emphasis: Analyzing data or information. Documenting and recording information. Updating and using relevant knowledge.

Increasing emphasis: Training and teaching others. Organizing and prioritizing work. Communicating with supervisors and peers. Assisting and caring for others.

The pattern: information-processing skills are getting automated. Interpersonal and organizational skills are becoming more critical.

This doesn’t mean analytical skills stop mattering. But it suggests the baseline is shifting. “Can analyze data” increasingly becomes a checkbox. “Can teach others, coordinate work, and navigate relationships” becomes the differentiator.
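A small sketch of the re-ranking idea: score the same skill list two ways (by wage, then by required human agency) and surface which skills climb or fall. The scores below are made-up placeholders to show the mechanics, not figures from the research.

```python
# Hypothetical importance scores for a handful of skills; higher = more important.
wage_score = {
    "Analyzing data or information": 0.92,
    "Documenting/recording information": 0.75,
    "Training and teaching others": 0.58,
    "Organizing and prioritizing work": 0.55,
    "Assisting and caring for others": 0.40,
}
agency_score = {
    "Analyzing data or information": 0.48,
    "Documenting/recording information": 0.35,
    "Training and teaching others": 0.88,
    "Organizing and prioritizing work": 0.81,
    "Assisting and caring for others": 0.77,
}

def rank(scores: dict[str, float]) -> dict[str, int]:
    """Map each skill to its rank (1 = most important) under a scoring scheme."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {skill: i + 1 for i, skill in enumerate(ordered)}

by_wage, by_agency = rank(wage_score), rank(agency_score)
for skill in wage_score:
    shift = by_wage[skill] - by_agency[skill]  # positive = rises under the agency ranking
    print(f"{skill}: wage rank {by_wage[skill]} -> agency rank {by_agency[skill]} ({shift:+d})")
```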

Strategic Implications

If you’re building AI agents:

Target the tedium. Scheduling, record maintenance, routine reports, data entry - these are the tasks workers actively want help with. They’re not technically glamorous, but they’re where the demand is real.

Design for partnership. The dominance of equal partnership as workers’ preferred mode suggests AI interfaces should emphasize collaboration, not autonomy. Workers want to work with these systems, not hand off to them entirely.

Respect the meaning line. In creative and knowledge work, there’s a boundary between workflow automation (welcomed) and content automation (resisted). Understanding where that line falls for your users matters.

Watch the skill shift. As information-processing tasks automate, the humans who remain most valuable are those who can teach, coordinate, communicate, and care. Product design and workforce development should account for this.

The Bigger Picture

There’s a gap between how AI is discussed in demos and how it appears in deployment.

In demos, we see autonomous agents completing complex tasks end-to-end. In deployment, workers often want something more modest: reliable help with the parts of their job they don’t enjoy, while keeping control over the parts they do.

The research doesn’t suggest workers are anti-AI. Nearly half of all tasks surveyed show positive automation desire. But workers have specific, practical ideas about where AI fits - and those ideas don’t always align with where investment and development are concentrated.

The opportunity isn’t just building more capable AI. It’s building AI that addresses the problems workers actually want solved.


Where in your organization are workers asking for help that current AI tools aren’t providing? What would change if you optimized for worker desire rather than capability showcase?

Related: 07-atom—worker-ai-preference-mismatch, 07-molecule—hybrid-human-ai-workflows, 07-atom—human-agency-scale