Microsoft Build Conference Heralds Era of AI-Assisted Software Development
Artificial Intelligence might not be coming for your coding job, but it sure is going to change it.
Years from now, the 2022 Microsoft Build developer conference might be looked back on as the dawn of a new era of AI-assisted development.
“AI and coding have become deeply entwined, and at Build, we are proud to detail all the platforms and tools being provided to developers to aid them with AI development,” Microsoft said in a post about developer tooling.
That makes for a very long list, as can be seen in the company’s Book of News roundup, which is chock full of AI news, especially as it relates to the Microsoft Azure cloud (see “Nadella Highlights AI and Cloud Native Apps at Build Event for Developers” from sister publication RedmondMag.com).
And that list starts with GitHub Copilot, described as an “AI pair programmer” when it debuted as a Visual Studio Code extension last summer. Last month, the still-in-preview product came to the Visual Studio IDE.
“We’ve been building GitHub Copilot together with the incredibly talented team at OpenAI for the last year, and we’re so excited to be able to show it off today,” said Nat Friedman, CEO of Microsoft-owned GitHub, in a June 29, 2021, Hacker News post. “Hundreds of developers are using it every day internally, and the most common reaction has been the head exploding emoji. If the technical preview goes well, we’ll plan to scale this up as a paid product at some point in the future.”
Heads are still exploding. Copilot started out with relatively simple, IntelliSense-like code completion (including whole-line completion) and has since improved to the point where commands typed in natural language (voice input is surely coming) can create entire projects, such as simple games.
That startling capability is discussed in the post from Microsoft’s John Roach titled “How AI makes developers’ lives easier, and helps everybody learn to develop software.”
“For example, a gamer could use natural language to program non-player characters in Minecraft to accomplish tasks such as build structures, freeing the gamer to attend to other, more pressing tasks,” Roach said. “Graphic designers can use natural language to build 3D scenes in the graphics rendering engine Babylon.js. Teachers can use 3D creation and collaboration tools like FrameVR to speak into existence a metaverse world such as a moonscape with rovers and an American flag.”
Copilot’s magic is provided via Codex, a machine learning model from AI research and development company (and Microsoft partner) OpenAI that can translate natural language commands into code in more than a dozen programming languages. A May 25 video shows a programmer quickly building 3D scenes with commands like “add teal spheres above and below the cube” and “make the cube spin,” which Codex translates into code for Babylon.js, a 3D rendering engine that runs in the browser.
Kevin Scott, Microsoft’s CTO, also weighed in: “You can describe to the AI system what you want to accomplish. It can try to figure out what it is you meant and show you part of the solution and then you can refine what the model is showing you. It’s this iterative cycle that’s free flowing and natural.”
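The iterative "describe, review, refine" cycle Scott describes can be sketched in miniature. This is purely an illustration, not how Codex actually works: `generate_code` is a hypothetical stand-in for a model call, here answering only a couple of canned prompts drawn from the demo.

```python
# Toy sketch of the iterative natural-language-to-code loop. `generate_code`
# is a hypothetical stand-in for a Codex-style model; it only knows two
# canned prompts and returns pre-written snippet strings for them.

CANNED_SNIPPETS = {
    "make the cube spin": "cube.rotation.y += 0.01  # applied every frame",
    "add teal spheres above and below the cube":
        "for y in (2, -2): scene.add_sphere(color='teal', position=(0, y, 0))",
}

def generate_code(prompt: str) -> str:
    """Hypothetical Codex-style call: natural language in, code out."""
    return CANNED_SNIPPETS.get(prompt.lower(), "# (model could not translate prompt)")

def refine(session: list[str], prompt: str) -> list[str]:
    """Each refinement step appends newly generated code to the running program."""
    return session + [generate_code(prompt)]

# The user iterates: describe, look at the result, describe again.
session: list[str] = []
session = refine(session, "add teal spheres above and below the cube")
session = refine(session, "Make the cube spin")
print("\n".join(session))
```

The point of the sketch is the loop shape, not the snippets: each round trip adds or adjusts code, and the human steers by describing the next change in plain language.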
It also works in reverse, as Scott detailed in his Build keynote address that described Microsoft’s Copilot Explain project: “Basically, it’s like Copilot in reverse. Just select some code and you can ask Copilot to explain it to you in plain language.
“The possibilities for exploration and creativity with Copilot are practically endless, but the best part is that tools like Copilot won’t just make developers more productive. They will increasingly make coding more accessible to everybody.”
GitHub Copilot, though, hasn’t advanced far enough in its technical preview for a splashy general availability announcement at Build, as much as Microsoft probably wanted that to happen (though Microsoft CEO Satya Nadella did say in his own keynote that one-third of the people who signed up for the preview are frequent users). Instead, that debut will come this summer, said Nadella, noting that the product will be free for students and open source contributors.
Another AI-assisted product in preview featured at Build is Azure OpenAI Service, part of the company’s Azure Cognitive Services. “OpenAI Service helps customers enable new reasoning and comprehension capabilities for building cutting-edge apps for use cases such as writing assistance, code generation and making sense of unstructured data,” Microsoft said. “With features like fine-tuning and built-in responsible AI, customers can also tailor the model to their specific needs to detect and mitigate harmful use.”
Another Azure Cognitive Services update introduced during Build is a new capability in Azure Cognitive Service for Language that provides summarization for documents and conversations, helping developers quickly surface key information in documents and contact center calls. Microsoft said: “Additional capabilities, now generally available, include custom-named entity recognition to help developers identify terms specific to a domain and custom-text classification to help developers organize and categorize text with a customer’s domain-specific labels, such as a support ticket or invoice.”
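Microsoft hasn’t published the service’s internals, but the general family of technique behind a summarization feature like this can be illustrated with a minimal extractive summarizer: score each sentence by how frequent its words are in the whole text, then keep the top scorers in their original order. This sketch is purely illustrative and makes no claim about the Azure service’s actual algorithm.

```python
# Minimal extractive summarization sketch (not the Azure service's algorithm):
# rank sentences by the corpus-wide frequency of their words, keep the best few.
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 2) -> list[str]:
    # Naive sentence split on end punctuation followed by whitespace.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    # Word frequencies across the whole document.
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> int:
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    ranked = sorted(sentences, key=score, reverse=True)[:max_sentences]
    # Emit the selected sentences in their original document order.
    return [s for s in sentences if s in ranked]

print(summarize("Azure is a cloud. The cloud runs models. Cats sleep."))
```

Real services layer much more on top (abstractive rewriting, conversation-turn handling), but the extract-and-rank skeleton is the simplest way to see what “surfacing key information” means mechanically.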
Also in preview is the Azure Machine Learning responsible AI dashboard, part of the Azure Machine Learning suite of offerings. Microsoft said it helps developers and data scientists more easily implement responsible AI. Related to that is Azure Machine Learning’s responsible AI scorecard, which summarizes model performance and insights in order to help technical and non-technical audiences understand the impact of applying responsible AI.
In looking ahead to a new era of AI and automation for all, not just developers, Microsoft showed off tools infused with AI and automation including:
- Microsoft Power Pages, a low-code development and hosting platform that allows anyone, from low-code maker to professional developer, to design, configure and publish websites for both desktop and mobile through a fluid, visual experience.
- Express Design in Power Apps, which allows you to upload a PDF, PowerPoint or even a hand-drawn sketch that Express Design will convert into a working app within seconds.
- Text summarization for customer support, which lets users recap complex conversations to help reduce handling time and improve job satisfaction.
What’s more, “Microsoft is creating a powerful, cross-platform development pattern for building AI experiences that span the cloud to the edge, using ONNX Runtime and Azure Machine Learning, along with an AI toolchain. In addition, the forthcoming Project Volterra is a development kit with AI capabilities that will come with a neural processor that has best-in-class AI computing capacity and mind-blowing efficiency.”
Microsoft said Project Volterra “will enable developers to take advantage of the powerful integrated neural processing unit (NPU) to build apps that execute local AI-accelerated workloads. As an Arm-powered device powered by the Snapdragon compute platform, it will enable Windows developers to build, test and debug Arm-native apps alongside all their favorite productivity tools, including Visual Studio, Windows Terminal, WSL, VSCode, Microsoft Office and Teams.”
The company also touted something it calls the “hybrid loop”: “We’ve built a powerful, cross-platform development pattern for building AI experiences that span the cloud and edge. This pattern allows you to make late binding runtime decisions on whether to run inferencing on Azure or the local client. It can also dynamically shift the load between client and cloud.”
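The “late binding” idea can be sketched in a few lines: defer the cloud-versus-client placement decision until the moment of inference, based on whatever the runtime can observe. Everything here is an assumption for illustration — the `Runtime` fields and the decision rules are invented, and a real implementation would run a local model via something like ONNX Runtime and call an Azure endpoint for the cloud path.

```python
# Illustrative sketch of the "hybrid loop" late-binding decision (invented
# heuristics; a real system would consult actual hardware and network state).
from dataclasses import dataclass

@dataclass
class Runtime:
    has_local_accelerator: bool   # e.g. an NPU like Project Volterra's is present
    cloud_reachable: bool
    battery_low: bool = False

def choose_target(rt: Runtime) -> str:
    """Decide at call time where inference should run."""
    if rt.has_local_accelerator and not rt.battery_low:
        return "local"            # the client can carry the load itself
    if rt.cloud_reachable:
        return "cloud"            # shift the load to Azure
    return "local"                # offline: local CPU is the only option

def run_inference(rt: Runtime, inputs: list[float]) -> tuple[str, float]:
    target = choose_target(rt)
    # Stub model: both paths compute the same answer; only placement differs.
    result = sum(inputs) / len(inputs)
    return target, result

print(run_inference(Runtime(has_local_accelerator=True, cloud_reachable=True), [1.0, 2.0, 3.0]))
```

The design point is that the caller never hard-codes a placement: the same inference request can land on the NPU one minute and on Azure the next, which is exactly the dynamic load-shifting the quote describes.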
Microsoft also announced several AI partnerships because “Microsoft is committed to advancing AI so that every person and organization on the planet can achieve more. In addition to evolving AI through Microsoft research-driven AI breakthroughs that are implemented into Azure tools and services customers can use today, Microsoft also works with other organizations to help the global AI community evolve, expand and thrive.”
Those partnerships include:
- Meta (formerly Facebook), which has selected Azure as a strategic cloud provider to help accelerate AI research and experimentation for developers. As part of the agreement, Meta will expand its use of Azure’s supercomputing power to accelerate AI research and development for its Meta AI group.
- AMD, as Azure will be the first public cloud to deploy AMD’s flagship Instinct MI200 GPUs for large-scale AI training. Microsoft is working with PyTorch and AMD to optimize the performance and developer experience for customers running PyTorch on Azure, and to ensure that developers’ PyTorch projects run optimally on AMD hardware.
- Hugging Face, an open source platform for data scientists and ML practitioners, will deepen its partnership with Microsoft and expand its Azure integration. The new Hugging Face Endpoints service, backed by Azure Machine Learning and available in Azure Marketplace, will help developers and data scientists more quickly and easily deploy thousands of custom or pretrained transformer models.
All of the above, while still an incomplete list of the AI-centric news at Build, shows how the conference might someday be seen as heralding an AI-assisted transformation. As Roach put it in his post: “This new era of AI-assisted software development can lead to greater developer productivity, satisfaction and efficiency and make software development more natural and accessible to more people, according to Scott.”
That era won’t come without some bumps and bruises, though, as GitHub Copilot has rekindled existential angst among developers who fear that AI will cost them their jobs. Opinions and surveys on that question differ.
Also, GitHub Copilot was decried as “unacceptable and unjust” by the Free Software Foundation (FSF). It also sparked security concerns, with developers warned to “remain awake” after a study found that roughly 40 percent of the code Copilot generated in security-relevant scenarios was vulnerable.
Even with all that, it has proved to be so game-changing that several open source alternatives have sprung up, so the AI-assisted train, just pulling out of the station, is sure to pick up speed. Time will tell how impactful the journey will be.