by EditorDavid
It was June 29th of 2021 that Microsoft-owned GitHub first announced its AI-powered autocompletion tool for programmers, trained on GitHub repositories and other publicly available source code. But after a year in "technical preview," GitHub Copilot has reached a new milestone, reports InfoQ: you'll now have to pay to use it after a 60-day trial:

The transition to general availability mostly means that Copilot ceases to be available for free. Interested developers will have to pay $10 USD/month or $100 USD/year to use the service, with a 60-day free trial.... According to GitHub, while not frequent, there is definitely a possibility that Copilot outputs code snippets that match those in the training set.

InfoQ also cites GitHub stats showing over 1.2 million developers used Copilot in the last 12 months, "with a shocking 40% figure of code written by Copilot in files where it is enabled." That's up from 35% earlier in the year, reports TechCrunch, which has more info on the rollout:

It'll be free for students as well as "verified" open source contributors — starting with roughly 60,000 developers selected from the community and students in the GitHub Education program... One new feature coinciding with the general release of Copilot is Copilot Explain, which translates code into natural-language descriptions. Described as a research project, the goal is to help novice developers or those working with an unfamiliar codebase. Ryan J. Salva, VP of product at GitHub, told TechCrunch via email... "As an example of the impact we've observed, it's worth sharing early results from a study we are conducting. In the experiment, we are asking developers to write an HTTP server — half using Copilot and half without. Preliminary data suggests that developers are not only more likely to complete their task when using Copilot, but they also do it in roughly half the time."

Owing to the complicated nature of AI models, Copilot remains an imperfect system.
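GitHub hasn't published the study's exact task specification or language, so purely as an illustration of the kind of program participants were asked to write, a minimal HTTP server might look like this Python sketch (the handler name and response body are assumptions, not details from the study):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """Respond to every GET request with a small plain-text body."""

    def do_GET(self):
        body = b"hello\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Silence per-request logging for this sketch.
        pass

def run(port=8000):
    # Blocks, serving on localhost:<port> until interrupted.
    HTTPServer(("127.0.0.1", port), HelloHandler).serve_forever()
```

Calling `run()` and fetching `http://127.0.0.1:8000/` returns a 200 response with the plain-text body, which is roughly the scope of a "write an HTTP server" exercise.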
GitHub said that it's implemented filters to block email addresses when shown in standard formats, as well as offensive words, and that it's in the process of building a filter to help detect and suppress code that's repeated from public repositories. But the company acknowledges that Copilot can produce insecure coding patterns, bugs, references to outdated APIs, or idioms reflecting the less-than-perfect code in its training data.

The Verge ponders where this is going, and how we got here:

"Just like the rise of compilers and open source, we believe AI-assisted coding will fundamentally change the nature of software development, giving developers a new tool to write code easier and faster so they can be happier in their lives," says GitHub CEO Thomas Dohmke. Microsoft's $1 billion investment into OpenAI, the research firm now led by former Y Combinator president Sam Altman, led to the creation of GitHub Copilot. It's built on OpenAI Codex, a descendant of OpenAI's flagship GPT-3 language-generating algorithm.

GitHub Copilot has been controversial, though. Just days after its preview launch, there were questions over the legality of Copilot being trained on publicly available code posted to GitHub. Copyright issues aside, one study also found that around 40 percent of Copilot's output contained security vulnerabilities.
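GitHub doesn't describe how its email filter is implemented; as a rough sketch only (not GitHub's actual approach), blocking "email addresses shown in standard formats" could be as simple as pattern-based redaction of a suggestion before it's displayed:

```python
import re

# Matches conventional user@domain addresses; real-world filters would
# need to handle more formats and edge cases than this illustration.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def redact_emails(suggestion: str) -> str:
    """Replace anything that looks like an email address in a code
    suggestion with a placeholder before showing it to the user."""
    return EMAIL_RE.sub("[email redacted]", suggestion)
```

For example, `redact_emails("contact: alice@example.com")` yields `"contact: [email redacted]"`, while text without addresses passes through unchanged.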