Aims and Scope
Research on Intelligent Manufacturing and Assembly (RIMA) (eISSN: 2972-3329) is an international, peer-reviewed, open access journal dedicated to the latest advancements in intelligent manufacturing and assembly. RIMA serves as a critical bridge between cutting-edge research and practical applications, fostering collaboration between the academic community and industry practitioners. The journal aims to publish high-impact research that pushes the boundaries of knowledge in the design, analysis, manufacturing, and operation of intelligent systems and equipment. RIMA focuses on innovative technologies and methodologies that are transforming the manufacturing landscape, driving efficiency, precision, and sustainability in industrial processes. By publishing rigorous research and fostering a vibrant community of scholars and practitioners, RIMA aims to be the go-to resource for advancing the state-of-the-art in intelligent manufacturing and assembly.
Topics of interest include, but are not limited to, the following:
• Digital design and manufacturing
• Theories, methods, and systems for intelligent design
• Advanced processing techniques
• Modelling, control, optimization, and scheduling of systems
• Manufacturing system simulation and digital twin technology
• Industrial control systems and the industrial Internet of Things (IIoT)
• Safety and reliability assessment
• Robotics and automation
• Artificial intelligence and machine learning in manufacturing
• Supply chain optimization and management
• Additive manufacturing and materials science
• Cybersecurity and data privacy in manufacturing
• Sustainability and circular economy in manufacturing
• Bio-fabrication and other advanced manufacturing methods
• Digital workforce and automation
Current Issue
Research Article
A Data-Driven Evaluation of ECD Measurement Techniques Across Traditional and AI-Based Modalities
Accurate measurement of corneal endothelial cell density (ECD) is crucial in evaluating the viability of donor corneas for transplantation, and the consistency of ECD measurements is critical for predicting post-transplant outcomes and monitoring corneal health. Measurement methods have evolved from manual counting to more complex semi-automated and fully automated systems, including AI-powered solutions. This study compares the accuracy, reliability, and efficiency of manual, semi-automated, and fully automated ECD measurement techniques, investigates the degree of heterogeneity among them, and evaluates their potential to improve clinical outcomes in corneal transplantation. The sample comprises corneal data from 300 participants (150 male and 150 female donors), divided into three groups by measurement method: manual, semi-automated, or fully automated. Gender distribution was also examined to determine whether results differed between male and female donor corneas. Manual counting is notably variable owing to differences in operator expertise and calibration, with mean ECD values ranging from 2146 to 2775 cells/mm² (p < 0.05). Semi-automated procedures, which combine manual input with software assistance, improve consistency: in the Cornea Preservation Time Study, eye banks reported a mean ECD of 2773 ± 300 cells/mm², while CIARC reported 2758 ± 388 cells/mm², with limits of agreement of [-644, 675] cells/mm² (p < 0.05). Among fully automated systems, the AxoNet deep learning model was the most accurate, with a mean absolute error (MAE) of 12.1 cells/mm² and an R² of 0.948. A separate study on AI-based detection of aberrant endothelial cells achieved an accuracy of 0.95, precision of 0.92, recall of 0.94, F1 score of 0.93, and AUC-ROC of 0.98 (p < 0.01).
Fully automated AI-based methods surpass manual and semi-automated approaches in accuracy and consistency, significantly reducing time and labor. The findings highlight the importance of adopting AI-driven technologies to enhance diagnostic precision and efficiency in clinical settings. However, the need for standardized calibration procedures and high-quality image acquisition remains critical for reliable ECD measurement.
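The classification figures reported above (accuracy, precision, recall, F1) follow the standard confusion-matrix definitions. A minimal sketch of those definitions, using illustrative labels rather than any data from the study:

```python
# Hedged sketch: standard definitions of the classification metrics
# reported for the AI-based abnormal-endothelial-cell detector.
# The label arrays below are illustrative, NOT data from the study.

def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Illustrative labels (1 = abnormal cell, 0 = normal)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
metrics = classification_metrics(y_true, y_pred)
```

The AUC-ROC additionally requires per-cell prediction scores rather than hard labels, so it is omitted from this sketch.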
Machine Learning Approaches to Predicting Pacemaker Battery Life
Accurate prediction of pacemaker battery life is critical to timely generator replacement and patient safety. We evaluated three regression approaches: multilayer perceptron neural networks (NNs), random forests (RF), and linear regression (LR), using 42 real-world interrogation reports spanning single-, dual-, and triple-chamber Medtronic devices. Key electrical parameters (battery voltage/current, lead impedance, capture thresholds, pacing percentages, etc.) were modelled. Performance was quantified with mean absolute error (MAE), mean squared error (MSE), and coefficient of determination (R²). NNs achieved the highest accuracy (R² ≈ 1.0; MAE < 0.1 months), RF provided robust results (R² ≈ 0.85), whereas LR exhibited limited predictive fidelity (R² ≤ 0.41). Monte Carlo simulations (n = 1000) and 95% prediction intervals characterized predictive uncertainty; residual and Q-Q analyses verified statistical assumptions. Our findings indicate that a data-driven NN framework can reliably forecast remaining battery longevity, enabling proactive replacement scheduling and reducing unexpected generator depletion. The methodology is compatible with devices from different manufacturers and suitable for integration within remote device follow-up systems to enhance longitudinal cardiac care.
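The regression metrics used to compare the three models above (MAE, MSE, R²) have standard definitions. A minimal sketch of those definitions, applied to illustrative longevity values rather than the study's interrogation data:

```python
# Hedged sketch: standard definitions of the regression metrics
# (MAE, MSE, R^2) used to compare NN, RF, and LR in the study.
# The longevity values below are illustrative, NOT the study's data.

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mse(y, yhat):
    """Mean squared error."""
    return sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

# Remaining battery longevity in months (illustrative)
y_true = [60.0, 48.0, 36.0, 24.0, 12.0]
y_pred = [59.5, 48.5, 36.0, 24.5, 11.5]
scores = (mae(y_true, y_pred), mse(y_true, y_pred), r2(y_true, y_pred))
```

An R² near 1.0, as reported for the NN, means the residual sum of squares is negligible relative to the total variance of the observed longevities.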
eISSN: 2972-3329
Abbreviation: Res Intell Manuf Assem
Editor-in-Chief: Prof. Matthew Chin Heng Chua (Singapore)
Publishing Frequency: Continuous publication
Article Processing Charges (APC): 0
Publishing Model: Open Access