Knowledge Transfer from High-Resource to Low-Resource Languages for Code LLMs (2023)

[Submitted on 19 Aug 2023 (v1), last revised 22 Sep 2024 (this version, v6)]

Abstract: Over the past few years, Large Language Models of Code (Code LLMs) have started to have a significant impact on programming practice. Code LLMs are also emerging as building blocks for research in programming languages and software engineering. However, Code LLMs produce impressive results on programming languages that are well represented in their training data (e.g., Java, Python, or JavaScript), but struggle with low-resource languages that have limited training data available; these include OCaml, Racket, and several others.
This paper presents an effective approach for boosting the performance of Code LLMs on low-resource languages using semi-synthetic data. Our approach, MultiPL-T, translates training data from high-resource languages into training data for low-resource languages as follows. 1) We use a Code LLM to synthesize tests for commented code from a high-resource language, filtering out faulty tests and code with low test coverage. 2) We use a Code LLM to translate the Python code to a target low-resource language, and use the synthesized tests to validate the translation. We apply this approach to generate tens of thousands of validated training items for Julia, Lua, OCaml, R, and Racket. Furthermore, we use an open model (StarCoderBase) with open training data (The Stack), which allows us to decontaminate benchmarks, train models without violating licenses, and run experiments that could not otherwise be done.
With MultiPL-T generated data, we present fine-tuned versions of StarCoderBase and Code Llama for Julia, Lua, OCaml, R, and Racket. On established benchmarks (MultiPL-E), these models outperform other open Code LLMs. The MultiPL-T approach is easy to apply to new languages and is significantly more efficient and effective than alternatives such as training longer.
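To make the two-step pipeline concrete, here is a minimal Python sketch of it. The interface is an assumption for illustration only: the callables synthesize_tests, translate, tests_pass, and coverage stand in for the paper's actual Code LLM prompting and sandboxed test execution, and the 0.9 coverage threshold is a placeholder rather than the paper's setting.

from typing import Callable, Iterable

def build_training_items(
    python_fns: Iterable[str],
    synthesize_tests: Callable[[str], list[str]],  # LLM: Python fn -> candidate tests (assumed)
    translate: Callable[[str, str], str],          # LLM: code, target lang -> code (assumed)
    tests_pass: Callable[[str, list[str]], bool],  # sandboxed test runner (assumed)
    coverage: Callable[[str, list[str]], float],   # fraction of code covered, in [0, 1] (assumed)
    target_lang: str,
    min_coverage: float = 0.9,                     # illustrative threshold, not the paper's
) -> list[tuple[str, list[str]]]:
    """Turn commented Python functions into validated target-language training items."""
    items: list[tuple[str, list[str]]] = []
    for fn in python_fns:
        # Step 1: synthesize tests, keep only those that pass on the original
        # Python function, and discard functions with low test coverage.
        tests = [t for t in synthesize_tests(fn) if tests_pass(fn, [t])]
        if not tests or coverage(fn, tests) < min_coverage:
            continue
        # Step 2: translate the function and its tests to the low-resource
        # target language; keep the pair only if the translated tests still pass.
        fn_t = translate(fn, target_lang)
        tests_t = [translate(t, target_lang) for t in tests]
        if tests_pass(fn_t, tests_t):
            items.append((fn_t, tests_t))
    return items

The point of the sketch is the double validation: tests must pass against the original Python code before translation, and pass again in the target language afterwards, so only behavior-preserving translations enter the fine-tuning corpus.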

Submission history

From: Federico Cassano [view email]
[v1] Sat, 19 Aug 2023 03:19:01 UTC (803 KB)
[v2] Tue, 22 Aug 2023 01:51:54 UTC (802 KB)
[v3] Sat, 9 Dec 2023 20:17:09 UTC (476 KB)
[v4] Tue, 12 Dec 2023 04:00:31 UTC (790 KB)
[v5] Sat, 10 Feb 2024 18:40:31 UTC (476 KB)
[v6] Sun, 22 Sep 2024 03:53:23 UTC (457 KB)
