Development of an explainable dendritic neural network with transparent architecture for medical image analysis
Project source
Principal investigator
Funded institution
Approval year
Approval date
Project number
Project level
Research period
Funding amount
Discipline
Discipline code
Fund category
Keywords
Participants
Participating institutions
1. Evaluating a novel incremental-input neural network for multivariate air temperature forecasting
- Keywords:
- Atmospheric temperature;Multivariable systems;Weather forecasting;Agricultural planning;Air temperature;Air temperature prediction;Energy;Incremental-input;Multivariate time series;Neural networks;Non-stationary time series;Temperature forecasting;Temperature prediction
- Song, Zhenyu;Song, Shuangyu;Song, Shuangbao;Tan, Lixing;Tang, Cheng;Ji, Junkai
- 《Engineering Applications of Artificial Intelligence》
- 2026
- Vol. 170
- Issue
- Journal article
Air temperature prediction (ATP) plays a crucial role in meteorological applications such as agricultural planning, disaster forecasting, and energy management. However, existing methods often struggle with the challenges posed by nonstationary and nonlinear time series data. In this paper, we introduce a novel incremental-input neural network (IINN) model designed to improve the accuracy and stability of multivariate ATP. By leveraging an incremental-input mechanism, the IINN addresses key challenges such as gradient vanishing and explosion while enhancing the model's robustness and nonlinear modelling capacity on high-dimensional datasets. Comprehensive evaluations conducted on the Seoul metropolitan summer temperature dataset demonstrate that the IINN achieves state-of-the-art performance across two forecasting horizons. Specifically, compared with the best-performing baseline model, the IINN yields a 6.1% MSE improvement for the minimum temperature (Tmin) and a 5.8% improvement for the maximum temperature (Tmax). This work thus provides a significant step forward in air temperature forecasting, offering a lightweight, efficient, and interpretable solution for modelling complex, nonstationary time series. The proposed approach offers a new and practical paradigm for modelling multivariate temperature time series and shows strong potential for broader applications in environmental forecasting scenarios. © 2026 Elsevier Ltd.
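The abstract does not specify the incremental-input mechanism in detail. A minimal NumPy sketch, assuming "incremental input" means re-injecting the raw input at every layer depth (a DenseNet-style shortcut that mitigates gradient vanishing); the authors' actual design may differ:

```python
import numpy as np

def incremental_input_forward(x, weights, biases):
    """Forward pass through a layer stack in which every layer receives
    the previous activations concatenated with the original input x.
    This is one plausible reading of 'incremental input', not the
    paper's confirmed architecture."""
    h = x
    for W, b in zip(weights, biases):
        z = np.concatenate([h, x])   # re-inject the raw input at each depth
        h = np.tanh(W @ z + b)       # bounded nonlinear activation
    return h

# Tiny usage example with random weights (illustrative only).
rng = np.random.default_rng(0)
d_in, d_hid = 4, 8
x = rng.standard_normal(d_in)
weights = [rng.standard_normal((d_hid, 2 * d_in))] + \
          [rng.standard_normal((d_hid, d_hid + d_in)) for _ in range(2)]
biases = [np.zeros(d_hid) for _ in range(3)]
y = incremental_input_forward(x, weights, biases)
print(y.shape)  # (8,)
```

Because each layer sees the unattenuated input alongside learned features, the gradient has a short path back to the data, which is the usual rationale for this family of shortcut designs.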
2. Enhancing nonlinear dependencies of Mamba via negative feedback for time series forecasting
- Keywords:
- Embeddings;Feedback;Forecasting;Memory architecture;Nonlinear analysis;Time series;Time series analysis;Complex pattern;Embedding channel attention;Historical data;Maclaurin;Mamba;Memory efficiency;Nonlinear dependencies;Performance;Time series forecasting
- Xiong, Sijie;Tang, Cheng;Zhang, Yuanyuan;Xiong, Haoling;Xu, Youhao;Shimada, Atsushi
- 《Applied Soft Computing》
- 2025
- Vol. 184
- Issue
- Journal article
Mamba is a rising model designed to distill complex patterns from historical data, providing predictive capabilities for time series forecasting tasks. However, Mamba has been criticized for its similarity to linear models and its limited ability to capture nonlinear dependencies. In this work, we propose a novel model named Embedding Channel Attention Maclaurin Einstein Mamba (CME-Mamba), built on the Mamba framework and incorporating both Embedding Channel Attention and Maclaurin mechanisms. To further address gradient vanishing, we integrate Einstein FFT algorithms, ensuring robust performance against the abnormal behaviours of Mamba-based architectures. Extensive experiments on 11 real-world datasets with different numbers of variates, domain focus, and granularity reveal that CME-Mamba achieves state-of-the-art performance in both MSE and MAE while maintaining reasonable memory efficiency and low time cost. The robustness and credibility of all results are substantiated by a comprehensive convergence and stability analysis. Statistically, as confirmed by the Friedman nonparametric test and the Wilcoxon signed-rank test, CME-Mamba ranks first, with significance over its counterparts. In terms of time and memory analysis, CME-Mamba is also among the top three models for efficiency. Our results further demonstrate that the main contributor is the Embedding Channel Attention Block, which greatly enhances nonlinear dependencies across datasets. The Einstein FFT Block effectively suppresses gradient vanishing and contributes considerably to performance improvements, making CME-Mamba both stable and promising. Moreover, the Maclaurin Block, based on negative feedback, is asymptotically stable without introducing additional gradient vanishing issues, and it achieves synergies with the other blocks that further enhance nonlinear dependencies.
With the enhanced nonlinear dependencies generated by the synergy of all three blocks, CME-Mamba excels at uncovering complex patterns and predicting future states across domains, particularly improving performance in periodic and high-variate settings such as traffic flow management (≈+8%) and electricity prediction (≈+6%). © 2025 Elsevier B.V.
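The statistical validation described above (a Friedman test over all models, followed by pairwise Wilcoxon signed-rank tests on per-dataset errors) can be sketched with SciPy. The MSE values below are purely illustrative placeholders, not the paper's numbers:

```python
import numpy as np
from scipy import stats

# Hypothetical MSE scores: rows = datasets, columns = models.
mse = np.array([
    [0.210, 0.245, 0.238],
    [0.180, 0.205, 0.199],
    [0.320, 0.355, 0.348],
    [0.150, 0.170, 0.168],
    [0.275, 0.300, 0.290],
    [0.190, 0.215, 0.212],
])
model_names = ["model_A", "model_B", "model_C"]

# Friedman test: do the models' error distributions differ overall?
stat, p_friedman = stats.friedmanchisquare(*mse.T)
print(f"Friedman p = {p_friedman:.4f}")

# Mean rank per model across datasets (lower MSE -> better rank).
ranks = stats.rankdata(mse, axis=1).mean(axis=0)
for name, r in zip(model_names, ranks):
    print(f"{name}: mean rank {r:.2f}")

# Pairwise Wilcoxon signed-rank tests of the first model vs. the rest.
for j in (1, 2):
    _, p = stats.wilcoxon(mse[:, 0], mse[:, j])
    print(f"{model_names[0]} vs {model_names[j]}: Wilcoxon p = {p:.4f}")
```

The Friedman test guards against reading noise as a ranking, and the pairwise Wilcoxon tests then establish which head-to-head differences are significant, which is the standard two-stage protocol for comparing forecasters over multiple datasets.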
