AI Summary
Little is known about how developers practically integrate AI-powered code generation tools, such as Amazon CodeWhisperer, into real-world programming workflows.
Method: We conducted two mixed-methods user studies, combining controlled experiments, a custom telemetry plugin for fine-grained interaction logging, and qualitative thematic analysis.
Contribution/Results: We identify four novel behavioral patterns: (1) incremental code refinement, (2) explicit natural-language prompting via comments, (3) structured baseline construction grounded in model suggestions, and (4) synergistic tool usage integrating external resources (e.g., documentation, Stack Overflow). This work provides the first empirical characterization of multi-layered human-AI collaboration in AI-assisted programming. It extends theoretical frameworks of developer-AI interaction and yields actionable, evidence-based design implications for next-generation AI coding assistants, particularly regarding prompt guidance, suggestion integration mechanisms, and context awareness.
Abstract
The use of AI code-generation tools is becoming increasingly common, making it important to understand how software developers are adopting these tools. In this study, we investigate how developers engage with Amazon's CodeWhisperer, an LLM-based code-generation tool. We conducted two user studies, each with a group of 10 participants interacting with CodeWhisperer: the first to understand which interactions were critical to capture, and the second to collect low-level interaction data using a custom telemetry plugin. Our mixed-methods analysis identified four behavioral patterns: 1) incremental code refinement, 2) explicit instruction using natural language comments, 3) baseline structuring with model suggestions, and 4) integrative use with external sources. We provide a comprehensive analysis of these patterns.