
Latest publications and patents on neural networks

Neural networks

This is our latest selection of English-language publications and patents on neural networks from around the world, drawn from many online scientific journals, classified and focused on neural networks, artificial neurons, epochs, neural architectures, machine learning, deep learning, and support vector machines.

Patents: there are no recent patents on this specific topic. You can try a broader manual search in the patent databases linked above.

Fast Prediction of Combustion Heat Release Rates for Dual-Fuel Engines Based on Neural Networks and Data Augmentation

Published on 2025-02-19 by Mingxin Wei, Xiuyun Shuai, Zexin Ma, Hongyu Liu, Qingxin Wang, Feiyang Zhao, Wenbin Yu @MDPI

Abstract: As emission regulations become increasingly stringent, diesel/natural gas dual-fuel engines are regarded as a promising solution and have attracted extensive research attention. However, their complex combustion processes pose significant challenges to traditional combustion modeling approaches. Data-driven modeling methods offer an effective way to capture the complexity of combustion processes, but their performance is critically constrained by the quantity and quality of the test data. To add[...]


Our summary: A fast prediction model for dual-fuel engine heat release rates based on neural networks and data augmentation. A hybrid regression data-augmentation architecture combined with a Bayesian neural network yields high-quality predictions, while an adaptive weight-allocation method balances the accuracy distribution and improves generalization.

Neural Networks, Data Augmentation, Combustion Prediction, Dual-Fuel Engines

Publication
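The summary above pairs data augmentation with a Bayesian neural network for uncertainty-aware regression. As a rough illustration of the idea (not the paper's actual architecture), the sketch below jitters the training inputs for augmentation and uses a bootstrap ensemble of simple regressors as a cheap stand-in for the Bayesian network's predictive mean and uncertainty; the data and all function names are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(X, y, n_copies=4, noise=0.01):
    """Jitter-based augmentation: add small Gaussian noise to the inputs
    while keeping the targets fixed (a stand-in for the paper's hybrid scheme)."""
    Xa = np.concatenate([X] + [X + rng.normal(0, noise, X.shape) for _ in range(n_copies)])
    ya = np.concatenate([y] * (n_copies + 1))
    return Xa, ya

def fit_linear(X, y):
    """Least-squares fit with a bias column (placeholder for a trained network)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

# Toy data: synthetic "operating point" -> "heat release rate" mapping.
X = rng.uniform(-1, 1, size=(50, 3))
y = np.sin(X).sum(axis=1)

Xa, ya = augment(X, y)

# Crude Bayesian-style uncertainty: an ensemble of fits on bootstrap resamples,
# whose spread approximates a predictive distribution.
preds = []
for _ in range(20):
    idx = rng.integers(0, len(Xa), len(Xa))
    w = fit_linear(Xa[idx], ya[idx])
    preds.append(np.hstack([X, np.ones((len(X), 1))]) @ w)
preds = np.array(preds)
mean, std = preds.mean(axis=0), preds.std(axis=0)
```

The per-point standard deviation `std` is what a Bayesian network would report as predictive uncertainty, which is useful for deciding where more test data is needed.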

Automatic Configuration Search for Parameter-Efficient Fine-Tuning

Published on 2024-05-25 by Han Zhou, Xingchen Wan, Ivan Vulić, Anna Korhonen @MIT

Abstract: Large pretrained language models are widely used in downstream NLP tasks via task-specific fine-tuning, but such procedures can be costly. Recently, Parameter-Efficient Fine-Tuning (PEFT) methods have achieved strong task performance while updating much fewer parameters than full model fine-tuning (FFT). However, it is non-trivial to make informed design choices on the PEFT configurations, such as their architecture, the number of tunable parameters, and even the layers in which the PEFT modules[...]


Our summary: Efficient fine-tuning achieved with far fewer parameters. AutoPEFT uses Bayesian optimization to discover strong PEFT configurations, outperforming existing methods on GLUE and SuperGLUE tasks; its Pareto-optimal set balances task performance against parameter cost.

automatic configuration search, parameter-efficient fine-tuning, neural architecture search, pretrained language models, task-specific fine-tuning

Publication
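AutoPEFT searches a space of PEFT configurations and returns a Pareto-optimal set trading off task performance against parameter cost. The Bayesian-optimization loop itself is beyond a short sketch, but the Pareto-front selection step can be illustrated as follows; the configuration space, cost, and score functions here are invented stand-ins, not AutoPEFT's actual interface:

```python
from itertools import product

# Hypothetical PEFT configuration space (names are illustrative only).
bottleneck_sizes = [8, 32, 128]
num_layers_tuned = [4, 8, 12]

def cost(cfg):
    size, layers = cfg
    return size * layers  # proxy for the number of tunable parameters

def score(cfg):
    size, layers = cfg
    return min(0.9, 0.01 * size * layers)  # mock task performance, saturating

configs = list(product(bottleneck_sizes, num_layers_tuned))
evals = [(cfg, cost(cfg), score(cfg)) for cfg in configs]

def pareto_front(evals):
    """Keep configurations that are not dominated: no other configuration
    is at least as cheap AND at least as accurate, with one strict."""
    front = []
    for cfg, c, s in evals:
        dominated = any(c2 <= c and s2 >= s and (c2 < c or s2 > s)
                        for _, c2, s2 in evals)
        if not dominated:
            front.append((cfg, c, s))
    return front

front = pareto_front(evals)
```

In a real search the surrogate model proposes which configuration to evaluate next; the dominance check above is then applied to all evaluated configurations to extract the final performance-cost trade-off set.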




Historical context

(Where a date is unknown or not relevant, e.g. "fluid mechanics", a rounded estimate of when the topic first became prominent is given.)

Topics covered: neural networks, artificial neurons, epochs, neural architectures, machine learning, deep learning, support vector machines, dynamic gesture recognition, benchmark datasets, pretrained weights, training data, training datasets, design snapshots, dendritic neuron models, synaptic plasticity, reinforced dynamic-grouping differential evolution, YOLOv8 and ByteTrack.
