Useful information
Prime News delivers timely, accurate news and insights on global events, politics, business, and technology
Immediately after the launch of its new generative AI models, Google updated its Code Assist tools to work with Gemini 2.0 and expanded the external data sources it connects to.
Code Assist will now run on the recently released Gemini 2.0, offering a larger context window that helps it understand large enterprise codebases.
Google is also releasing Gemini Code Assist tools in a private preview. The platform will connect to data sources such as GitLab, GitHub, Google Docs, Sentry.io, Atlassian and Snyk, allowing developers to ask Code Assist for help directly in their IDEs. Previously, Code Assist integrated only with VS Code and JetBrains.
Google Cloud senior director of product management Ryan J. Salva told VentureBeat in an interview that the idea is to let programmers add more context to their work without interrupting their flow. Salva said Google will add more partners in the future.
Code Assist, formerly Duet AI, launched for enterprises in October. As organizations looked for ways to streamline coding projects, demand for AI coding platforms like GitHub Copilot grew. When the enterprise option launched, Code Assist added enterprise-grade security and legal indemnification.
Salva said connecting Code Assist to other tools developers use provides more context for their work without having to open multiple windows simultaneously.
“There are many other tools that a developer uses in the course of the day,” Salva said. “They could use GitHub or Atlassian Jira or Datadog or Snyk or all these other tools. What we wanted to do is allow developers to incorporate that additional context into their IDE.”
Salva said developers just need to open Code Assist’s chat window and ask it to summarize the most recent comments on particular issues or the most recent pull requests in repositories, “so that it queries the data source and returns context to the IDE and the large language model can synthesize it.”
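The flow Salva describes — a chat request routed to a connected data source, whose results are packaged as extra context for the model — can be sketched with a generic connector pattern. This is a hypothetical illustration, not Google's actual Code Assist API; the names (`Connector`, `fetch_context`, the stubbed GitHub results) are all invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Connector:
    """A registered data source the assistant can query (hypothetical)."""
    name: str
    fetch: Callable[[str], List[str]]  # query -> raw items from the source

def github_pulls(query: str) -> List[str]:
    # Stub standing in for a real GitHub API call that would list
    # the most recent pull requests in a repository.
    return ["PR #12: fix login race condition", "PR #11: bump dependencies"]

# Registry of connected sources (GitLab, Sentry.io, Snyk, etc. would plug in
# the same way).
REGISTRY: Dict[str, Connector] = {"github": Connector("github", github_pulls)}

def fetch_context(source: str, query: str) -> str:
    """Query a connected source and package its results as model context."""
    items = REGISTRY[source].fetch(query)
    return "\n".join(f"[{source}] {item}" for item in items)

# The assistant would prepend this context to the user's chat prompt
# before calling the language model.
context = fetch_context("github", "most recent pull requests")
print(context)
```

The point of the pattern is that the IDE chat window never shows the raw API traffic: the connector fetches, the results become context, and the model summarizes.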
AI code assistants were some of the first major use cases for generative AI, especially after software developers began using ChatGPT to help with coding. Since then, a large number of enterprise-focused coding assistants have been released. GitHub launched Copilot Enterprise in February and Oracle launched its Java and SQL coding assistant. Leverage launched a coding assistant built with Gemini that provides real-time suggestions.
Meanwhile, OpenAI and Anthropic began offering front-end features that let programmers work directly in their chat platforms. ChatGPT Canvas allows users to generate and edit code without copying and pasting it elsewhere. OpenAI also added integrations with tools like VS Code, Xcode, Terminal, and iTerm2 to the ChatGPT macOS desktop app. For its part, Anthropic launched Artifacts for Claude, so Claude users can generate, edit, and run code.
Salva noted that while Code Assist is now compatible with Gemini 2.0, it remains completely separate from Jules, the coding tool Google announced during the launch of the new Gemini models.
“Jules is really one of many experiments that came out of the Google Labs team to show how we can use autonomous or semi-autonomous agents to automate the coding process,” Salva said. “You can expect that over time, the experiments that graduate from Google Labs, those same capabilities, can become part of products like Gemini Code Assist.”
He added that his team works closely with the Jules team and is excited to see Jules’ progress, but Code Assist remains the only generally available enterprise-grade coding tool powered by Gemini.
Salva said early feedback from Code Assist and Jules users shows strong interest in Gemini 2.0’s latency improvements.
“When you’re sitting there trying to code and trying to stay in the flow state, you want those kinds of responses to come in milliseconds. Any time the developer feels like they are waiting for the tool is a bad thing, so we get faster and faster responses,” he said.
Coding assistants will continue to be crucial to the growth of the generative AI space, but Salva said the next few years could see a shift in how companies develop code-generation models and applications.
Salva pointed to the 2024 Accelerate State of DevOps report from Google’s DevOps Research and Assessment (DORA) team, which found that 39% of respondents distrust AI-generated code, alongside a decline in the quality of documentation and delivery.
“We have an industry with AI assistance tools largely focused on productivity and speed improvements over the course of the last four years,” Salva said. “And as we’re starting to see that being associated with a drop in overall stability, I suspect the conversation in the next year will really shift to how we use AI to improve quality across multiple dimensions.”