This is our latest curated selection of global English-language publications and patents on fast Fourier transform (FFT) examples, drawn from numerous scientific online journals and categorized under the topics fast Fourier, FFT, discrete Fourier, and Cooley-Tukey.
Patents: No recent patents were found on this specific topic. Please try a broader manual search in the patent databases linked above.
Automatic Configuration Search for Parameter-Efficient Fine-Tuning
Published on 2024-05-25 by Han Zhou, Xingchen Wan, Ivan Vulić, Anna Korhonen @MIT
Abstract: Large pretrained language models are widely used in downstream NLP tasks via task-specific fine-tuning, but such procedures can be costly. Recently, Parameter-Efficient Fine-Tuning (PEFT) methods have achieved strong task performance while updating much fewer parameters than full model fine-tuning (FFT). However, it is non-trivial to make informed design choices on the PEFT configurations, such as their architecture, the number of tunable parameters, and even the layers in which the PEFT modules[...]
Our summary: AutoPEFT achieves efficient fine-tuning with far fewer tunable parameters, using Bayesian optimization to discover strong PEFT configurations. It outperforms existing methods on GLUE and SuperGLUE tasks, and its discovered Pareto-optimal set balances task performance against parameter cost.
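To make the configuration-search idea concrete, here is a minimal sketch of searching a PEFT configuration space for a Pareto-optimal set of (score, parameter-cost) trade-offs. This is an assumption-laden illustration, not AutoPEFT's actual code: the search space, the cost formula, and the scoring function are all hypothetical, and plain random sampling stands in for the paper's Bayesian optimization so the example stays self-contained.

```python
import random

# Hypothetical PEFT configuration space (illustrative only, not AutoPEFT's
# actual space): which layers receive modules, adapter bottleneck size, LoRA rank.
SEARCH_SPACE = {
    "adapter_layers": [tuple(range(k)) for k in (4, 8, 12)],
    "bottleneck_dim": [16, 32, 64, 128],
    "lora_rank": [0, 4, 8],
}

def sample_config(rng):
    """Draw one candidate PEFT configuration from the search space."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def param_cost(cfg):
    """Rough proxy for the tunable parameters a config adds (hidden size 768)."""
    n_layers = len(cfg["adapter_layers"])
    return n_layers * 2 * 768 * (cfg["bottleneck_dim"] + cfg["lora_rank"])

def mock_dev_score(cfg, rng):
    """Stand-in for fine-tuning + validation; a real search would train here."""
    return 0.7 + 1e-8 * param_cost(cfg) + rng.uniform(-0.01, 0.01)

def pareto_front(results):
    """Keep configs not dominated on (higher score, lower cost)."""
    front = []
    for cfg, score, cost in results:
        dominated = any(
            s >= score and c <= cost and (s > score or c < cost)
            for _, s, c in results
        )
        if not dominated:
            front.append((cfg, score, cost))
    return front

rng = random.Random(0)
results = []
for _ in range(20):
    cfg = sample_config(rng)
    results.append((cfg, mock_dev_score(cfg, rng), param_cost(cfg)))

for cfg, score, cost in pareto_front(results):
    print(f"score={score:.3f}  added_params={cost}")
```

In the paper's setting, `mock_dev_score` would be replaced by actually fine-tuning with the candidate configuration and evaluating on a validation set, and the random sampler by a Bayesian-optimization acquisition loop that proposes the next configuration to try.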
automatic configuration search, parameter-efficient fine-tuning, neural architecture search, pretrained language models, task-specific fine-tuning
Publication