TechRadar
Craig Hale

'That is unacceptable in a professional development workflow': Microsoft acts after VS Code gives Copilot credit for work a human developer did

[Image: A robot standing thoughtfully in front of a giant digital display with code on it.]
  • Microsoft hit with backlash after 'Co-authored-by: Copilot' started appearing widely in VS Code
  • The company has reversed this decision effective with version 1.119
  • Developers are still unhappy that the 'bug' reached production

Microsoft has reversed a controversial change in VS Code that automatically attributed GitHub commits partly to Copilot, even when the AI tool wasn't used.

Developers had previously taken to forums including Reddit to complain that 'Co-authored-by: Copilot' was getting added to their commits, even when they had not used the assistant and had even turned off Copilot chat functionalities.
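For context, the complaint concerns GitHub's long-standing co-author convention: a `Co-authored-by:` trailer at the end of a commit message credits an additional author. The reported behavior injected such a trailer automatically, so an affected commit message would have looked something like this (the email address shown is a placeholder, not the actual one used):

```
Fix null-pointer check in the parser

Co-authored-by: Copilot <copilot@example.invalid>
```

GitHub renders any `Co-authored-by:` trailer as a visible co-author on the commit, which is why developers saw Copilot credited for work it hadn't done.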

It remains unclear whether this earlier behavior was intended, but Microsoft appears to have acknowledged the mistake and rectified the issue in a new update.

This explains the 'Co-authored-by: Copilot' issue in GitHub

A March 2026 change within VS Code reportedly added the Copilot authorship label regardless of whether Copilot had been used, though a VS Code reviewer has since apologized: "There was no ill intent by [an] evil corporation, but rather a desire to support functionality that some customers expect of VS Code [with regard to] AI-generated code."

Following a more recent change applied to version 1.119, AI attribution will only be added if users explicitly choose it.

"Obviously, it should not be on when disableAIFeatures is on and it should not be reporting changes that were not done by AI," Dmitriy Vasyura wrote. "I'll work on fixing those and meanwhile revert default to off in 1.119 update."
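In practice, an opt-in like this would live in the user's `settings.json`. The sketch below is purely illustrative: `chat.disableAIFeatures` is the flag named in the quote above, while the attribution key is a hypothetical name, since the article does not state the actual setting identifier:

```
{
  // Flag referenced in the developer quote above
  "chat.disableAIFeatures": true,

  // Hypothetical setting name for the opt-in AI commit attribution;
  // per version 1.119 the default is reportedly off
  "git.aiCommitAttribution": false
}
```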

The company has also scaled back intrusive Copilot integrations following broader developer backlash, with coders less likely to trust a tool that automatically changes metadata without their explicit consent.

Despite the Microsoft worker confirming that the Copilot author label changes have been reversed, users still expressed their distrust of the company for allowing the feature to reach production in the first place. Many criticized the company for referring to such changes as bugs, arguing they were intentional all along.
