AI Summary
Patent drafting relies heavily on expert legal knowledge, leading to high costs, prolonged timelines, and insufficient standardization. To address these challenges, this paper proposes the first end-to-end automated generation framework that deeply integrates large language models (LLMs) with domain-specific patent-law writing norms. Methodologically, it combines a Transformer-based architecture with natural language generation, legal text modeling, and a patent knowledge graph to enable controllable, structured generation, from technical disclosure to compliant patent documents, including claims and specifications. Its key contributions are a patent-legal-semantic instruction-tuning paradigm and a structured output constraint mechanism, ensuring strict adherence to the *Patent Examination Guidelines*. Experimental results demonstrate that the system reduces drafting time by an average factor of 5.3, decreases formal error rates by 82%, and significantly improves the rigor and consistency of claim formulations.
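The summary does not detail how the structured output constraint mechanism works. As an illustration only, the following minimal sketch shows one way formal constraints could be checked on generated claims: sequential claim numbering and no forward references from a dependent claim to a later claim. The function name `check_claims` and these two rules are hypothetical assumptions, not the paper's actual mechanism.

```python
import re

# Hypothetical sketch of a formal-compliance check on a generated claim set.
# Rule 1: claims must be numbered 1, 2, 3, ... in order.
# Rule 2: a dependent claim may only reference an earlier claim.
CLAIM_RE = re.compile(r"^(\d+)\.\s+(.*)$", re.DOTALL)
DEP_RE = re.compile(r"claim\s+(\d+)", re.IGNORECASE)

def check_claims(claims):
    """Return a list of formal errors found in a claim set."""
    errors = []
    for i, text in enumerate(claims, start=1):
        m = CLAIM_RE.match(text.strip())
        if not m or int(m.group(1)) != i:
            errors.append(f"claim {i}: missing or out-of-order number")
            continue
        for ref in DEP_RE.findall(m.group(2)):
            if int(ref) >= i:
                errors.append(f"claim {i}: forward reference to claim {ref}")
    return errors

claims = [
    "1. A method for drafting a patent, comprising generating text with a language model.",
    "2. The method of claim 1, wherein the text is validated against formal rules.",
]
print(check_claims(claims))  # → []
```

A real system would enforce many more rules from the *Patent Examination Guidelines* (antecedent basis, claim categories, clarity requirements), but the pattern is the same: generated text is validated against machine-checkable formal constraints before output.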
Abstract
Patent drafting presents significant challenges due to its reliance on the extensive experience and specialized expertise of patent attorneys, who must possess both legal acumen and technical understanding of an invention to craft patent applications in a formal legal writing style. This paper presents a demonstration of Patentformer, an AI-powered automated patent drafting platform designed to support patent attorneys by rapidly producing high-quality patent applications adhering to legal writing standards.