The Modern Software Developer
Survey | December 2025

The state of AI coding in 2025: Adoption, proficiency, and transformation

Our survey of 195 developers globally reveals how AI is reshaping software development workflows.

Key findings

1

AI software adoption is now nearly universal among developers: 98% of surveyed developers use AI coding tools several times a week. However, there is still some resistance among senior developers, who report lower adoption.

2

AI coding tools are widely used for a variety of task types: There is strong adoption of “explain this” workflows, debugging, test generation, documentation writing, UI development, and code review. Security use cases, however, trail behind.

3

Claude Code and Cursor dominate tool adoption, with a long tail of other platform usage: Claude Code and Cursor have emerged as the current coding platform category winners, though many developers are experimenting with other tools. Dedicated code review platforms are not yet as widely adopted.

4

Different categories of software engineering are seeing uneven efficiency gains: Frontend use cases are seeing the largest productivity gains. However, in other categories such as infrastructure, mobile, and design, the gains are not as widely perceived.

5

General satisfaction with AI coding tool setup is quite high: Developers are mostly happy with their AI coding tool setups, though satisfaction levels drop at larger companies.

6

AI coding tools still have open problems that developers want to see improved: Hallucination and lack of customization are reported as some of the biggest AI software failure modes. Notably, developers also say context switching between tasks has become worse with AI coding.

Methodology

Our survey collected responses from 195 professional software developers globally. The following describes the composition of our respondent pool across roles, company sizes, and seniority levels.

Exhibit 1

Survey respondent roles

Distribution of respondents by primary role (respondents could select multiple roles)

The different software areas were fairly well represented, with most respondents coming from backend, frontend, and machine learning roles.

Exhibit 2

Company size distribution

Breakdown of respondents by employer size

Companies of different sizes were fairly evenly represented in the responses.

Exhibit 3

Engineering seniority levels

Years of experience among survey respondents

The respondents were mostly senior software developers, with over 65% having 4+ years of professional experience.

Adoption

Understanding how developers adopt AI tools reveals patterns across seniority levels and organization sizes. This section examines usage frequency, tool preferences, customization practices, and spending patterns.

Exhibit 5

How often do you use AI software tools?

Frequency of AI coding tool usage among respondents

Breakdown by seniority level (0-1, 1-2, 2-4, 4-9, and 10+ years) and by company size (1-10, 10-50, 50-100, 100-1000, 1000-10000, and 10K+ employees)

AI coding tool adoption has become nearly universal, with 98% of respondents using AI software tools several times a week. When broken down by seniority level, the respondents who are not daily active users are concentrated among more experienced developers with 2+ years of experience; in some cases, up to 5% of them used these tools only once a week. Usage patterns are quite consistent when broken down by company size. Note that while several of the plots for junior developers indicate 100% adoption, this may reflect sampling bias.

Exhibit 6

Codebase percentage authored by AI

What percentage of your code in the last 30 days was AI-generated (i.e., AI had some part in creating it)?

Nearly 40% of developers report that AI generated 50% or more of their code in the last 30 days.

Exhibit 7

Which use cases have you used AI coding for in the last 30 days?

Primary ways developers use AI coding tools (respondents could select multiple)

Tab-complete usage is nearly universal, likely because it was one of the first standout AI features in IDEs. Additionally, there is very strong adoption of read-only “how does this work” workflows, debugging, test generation, documentation generation, UI development, and code review. Adoption for security use cases still trails significantly.

Exhibit 8

Tool usage

Which AI coding tools do developers use most?

Tool usage by seniority level (0-1, 1-2, 2-4, 4-9, and 10+ years) and by company size (1-10, 10-50, 50-100, 100-1000, 1000-10000, and 10K+ employees)

Claude Code and Cursor adoption is consistently the highest. GitHub Copilot is also well represented, especially among developers at larger companies, likely due to Microsoft enterprise contracts. Standalone code review platforms have not yet captured as much developer market share.

Exhibit 9

Agent customization

Do developers customize their AI coding agents (multiple responses allowed)?

Most developers are taking steps to control their agents' behavior, with the majority adopting some sort of agent-specific configuration file (AGENTS.md, CLAUDE.md, etc.).
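
For readers unfamiliar with these files, the following is a minimal illustrative sketch of what such a configuration file can contain; the project conventions, commands, and paths shown are hypothetical, and real files are free-form instructions checked into the repository.

    # CLAUDE.md (illustrative sketch; project specifics are hypothetical)

    ## Conventions
    - Use TypeScript strict mode; avoid `any`.
    - Keep functions small and covered by unit tests.

    ## Commands
    - Build: npm run build
    - Test: npm test

    ## Boundaries
    - Do not modify files under migrations/ without asking first.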

Exhibit 10

How much do you spend on AI coding tools?

Monthly spending on AI coding tools

Spending by seniority level (0-1, 1-2, 2-4, 4-9, and 10+ years) and by company size (1-10, 10-50, 50-100, 100-1000, 1000-10000, and 10K+ employees)

AI coding platforms are seeing increasing spend, with nearly 25% of developers spending $100+/month. More senior developers are also the biggest spenders: 16% of developers with 10+ years of experience report spending $1000+/month. Additionally, the highest spending tends to come from developers at either very small or very large companies.

Exhibit 11

How happy are you with your current AI coding setup?

Satisfaction levels among developers

Rating distribution (%), overall mean: 3.62

Satisfaction by seniority level (mean rating): 0-1 years: 3.36; 1-2 years: 3.62; 2-4 years: 3.55; 4-9 years: 3.68; 10+ years: 3.64

Satisfaction by company size (mean rating): 1-10 employees: 3.70; 10-50 employees: 3.71; 50-100 employees: 3.89; 100-1000 employees: 3.52; 1000-10000 employees: 3.56; 10K+ employees: 3.44

Developers are mostly satisfied with their AI coding tool setups. More senior developers are the happiest with their setups, though satisfaction drops at larger companies, likely due to red tape limiting the flexibility to experiment with and access different tools.

Efficiency Gains

AI coding tools promise productivity improvements, but the gains vary significantly by task type and functional area. This section breaks down reported efficiency multipliers across different categories of software engineering work.

Exhibit 12

Efficiency gains by task category

For each category of software engineering, how much more productive does AI coding make you (as a multiplier)?

Categories: Backend, Frontend, Machine Learning / Data Science, Infrastructure, Design, Mobile

For most software functional areas, the productivity gains are roughly normally distributed with a median around 1-2x (a 2x multiplier means completing the same work in about half the time). Frontend efficiency gains are significantly higher, with many developers seeing a multiplier of 2x or more; this is likely due to how prevalent frontend code is in public repositories on the internet. Machine learning and data science gains are even more tightly concentrated around the 1-2x multiplier. For design, mobile, and infrastructure, the distribution is much flatter, indicating the gains are less consistent across developers.

Exhibit 13

Efficiency gains by functional area

How AI tools impact different aspects of the development workflow

Functional areas: Speed, Quality, Learning (how fast you understand new code/concepts), Staying in flow, Context switching between tasks, Pull request throughput, Test coverage, Onboarding ramp, Incident response

Across different aspects of the development workflow, the efficiency gains are typically 1-2x or more, with especially prominent gains in development speed. AI software tools also tend to improve how quickly developers are able to learn and pick up new concepts. A concerning observation is that nearly 20% of developers say AI software is making them worse at context switching. This may be due to the additional time needed to supervise AI tasks, which impedes developers' ability to get into a flow state during deep work.

Looking Ahead

As AI coding tools continue to evolve, understanding current pain points and desired improvements helps chart the path forward. This section examines where these tools fall short and what developers hope to see next.

Exhibit 14

Coding agent failure modes

What are the most common ways AI coding tools fail (multiple responses allowed)?

While AI software tool adoption has been very strong, developers still report high failure rates due to hallucination, incomplete solutions, and a general failure to understand the context of a problem.

Exhibit 15

Desired features

What improvements do developers most want from AI coding tools?

The most cited categories of desired improvements are software agents adapting to developer and codebase practices, better codebase understanding, and reduced hallucinations. A growing percentage also wants even more autonomy from their agents.