Pipeline Method for Domain-specific Language Generation in Low-code Platforms Using Large Language Models

Authors: Xin Cui, Weixing Zhang, Linnan Jiang, Aimin Pan, and Fei Yang
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 1007-1020
Keywords: DSL generation, LLMs, Vector database, In-context learning, Prompt engineering

Abstract

Advancements in language models, particularly Large Language Models (LLMs), have propelled the evolution of front-end low-code platforms, transitioning from the traditional drag-and-drop approach to an automated Domain-Specific Language (DSL) code generation process. Within this context, the objective is to generate the appropriate DSL from textual descriptions using LLMs. Nonetheless, owing to the limited availability of DSL data, challenges persist in training or fine-tuning LLMs for DSL generation tasks such as those in front-end low-code platforms. This study proposes a novel pipeline approach for DSL generation that leverages the potential of prompt engineering. The methodology combines Named Entity Recognition (NER), a DSL knowledge vector database, and LLMs. Experiments demonstrate significant improvements in the quality of DSL generation while reducing token and time costs.
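To make the described pipeline concrete, the sketch below illustrates one plausible reading of the abstract: extract UI-related entities from a textual description, retrieve similar DSL examples from a small in-memory "vector database," and assemble an in-context prompt for an LLM. All names here (`extract_entities`, `embed`, `SNIPPETS`, the toy DSL syntax) are hypothetical stand-ins, not the paper's actual implementation.

```python
from typing import List, Tuple
import math
import re

# Toy DSL knowledge base: (description, DSL snippet) pairs.
SNIPPETS: List[Tuple[str, str]] = [
    ("a login form with username and password fields",
     'Form { Input(label="username"); Input(label="password", secret=true) }'),
    ("a table listing orders with pagination",
     'Table(source="orders") { Pagination(pageSize=20) }'),
]

def extract_entities(text: str) -> List[str]:
    """Very rough NER stand-in: pick out known UI component nouns."""
    vocab = {"form", "table", "input", "button", "pagination", "chart"}
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w in vocab]

def embed(text: str) -> List[float]:
    """Toy bag-of-words embedding; a real system would use a learned encoder."""
    dims = ["form", "table", "input", "button", "pagination", "chart"]
    words = re.findall(r"[a-z]+", text.lower())
    return [float(sum(w == d for w in words)) for d in dims]

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = math.sqrt(sum(x * x for x in a)), math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def build_prompt(request: str, k: int = 1) -> str:
    """Retrieve the top-k most similar DSL examples and build an in-context prompt."""
    entities = extract_entities(request)
    query_vec = embed(" ".join(entities) or request)
    ranked = sorted(SNIPPETS, key=lambda s: cosine(query_vec, embed(s[0])), reverse=True)
    examples = "\n\n".join(f"Description: {d}\nDSL: {code}" for d, code in ranked[:k])
    return f"{examples}\n\nDescription: {request}\nDSL:"

if __name__ == "__main__":
    # The resulting prompt would then be sent to an LLM for DSL generation.
    print(build_prompt("Generate a table of customers with pagination"))
```

In this sketch, the NER step narrows the query to domain-relevant terms before retrieval, which is one way a pipeline of this kind could reduce prompt length (and hence token cost) compared with sending the full description and knowledge base to the model.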