
A vibrant hub of academic knowledge

Our mission is to inspire and empower the scientific exchange between scholars around the world, especially those from emerging countries. We provide a virtual library for knowledge seekers, a global showcase for academic researchers, and an open science platform for potential partners.

Recent Articles

Abstract

Continuous improvement in service quality assurance, based on customer satisfaction, is critical for loading and unloading activities at dry bulk ports. Many ports are now adopting and refining various methods in response to the advancements of Industry 4.0. This research aims to develop and implement Adaptive DMAIC 4.0. Key advantages of this method include IoT-based real-time monitoring systems, predictive data analytics, and process automation capabilities. Current measurements place the process at Six Sigma level 3 (DPMO of 11,800). While the Cp value of 1.19 indicates process stability, the Cpk value of 0.76 (< 1) reveals remaining issues requiring systematic, continuous improvement. To enhance process performance, the average loading/unloading time should be maintained closer to the target midpoint of 1.5 minutes/bulk, creating a more balanced distribution. This adjustment would help increase the Cpk value to meet the minimum standard of ≥ 1.33, ensuring consistently efficient operations. In theory, implementing the DMAIC 4.0 framework will establish a system that is more resilient to internal and external disruptions, enables sustained performance improvement, and drives toward zero defects and Six Sigma capability. In practice, this approach significantly enhances loading and unloading performance by boosting capacity, operational capability, and TKBM professionalism while eliminating human error.
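
To make the capability arithmetic behind the quoted figures concrete, here is a minimal sketch of the standard Cp/Cpk formulas. The spec limits and process mean below are assumptions for illustration; the abstract gives only the 1.5 min/bulk midpoint, Cp = 1.19, and Cpk = 0.76.

```python
# Hedged sketch: Cp/Cpk arithmetic consistent with the values quoted above.
# LSL/USL and the process mean are HYPOTHETICAL; only Cp = 1.19, Cpk = 0.76,
# and the 1.5 min/bulk target come from the abstract.

def capability(mean: float, std: float, lsl: float, usl: float):
    """Return (Cp, Cpk) for a process with the given mean/std and spec limits."""
    cp = (usl - lsl) / (6 * std)            # potential capability (spread only)
    cpk = min((usl - mean) / (3 * std),     # actual capability, penalising
              (mean - lsl) / (3 * std))     # an off-centre process mean
    return cp, cpk

# Assumed limits centred on the 1.5 min/bulk target; sigma chosen so Cp = 1.19.
lsl, usl = 1.0, 2.0
std = (usl - lsl) / (6 * 1.19)                          # ~0.14 min/bulk
print(capability(mean=1.68, std=std, lsl=lsl, usl=usl))  # ~(1.19, 0.76), as reported
print(capability(mean=1.50, std=std, lsl=lsl, usl=usl))  # ~(1.19, 1.19): centring
# lifts Cpk up to Cp; reaching the 1.33 standard also requires variance reduction.
```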
Open Access
Research article
A System Success Evaluation Framework for Digital Pension Platforms in Aging Societies
Rattanavalee Maisak, Sirikarn Tirasuwanvasee, Kasidech Sutthivanich, Warintorn Suwan, Darunee Bunma
Available online: 12-23-2025

Abstract

This study develops and validates a framework for evaluating the success of welfare-oriented digital platforms, with a focus on Thailand’s national pension system. The framework integrates the Information Systems Success Model (ISSM) and the Technology Acceptance Model (TAM) with trust as a socio-technical construct to evaluate stability, usability, and trustworthiness in aging societies. Data were collected through a survey of 400 elderly citizens and analyzed using structural equation modeling (SEM) with the Jamovi software. The findings were further supplemented by a thematic analysis of the open-ended responses, which provided context for anomalies, such as instability in use, fraud risk, and usability issues, among other concerns. System quality increased perceived ease of use but decreased perceived usefulness when instability occurred. Trust increased usefulness but was not a predictor of behavioral intention. Ease of use unexpectedly decreased intention. User satisfaction, rather than actual use, surfaced as the strongest predictor of net benefits. These findings underscore that the anomalies of adoption are a result of structural and institutional barriers rather than user reluctance. The study rethinks adoption constructs as indicators of system success, thereby expanding the ISSM-TAM integration. It provides policymakers and system architects with a means to diagnose problems and develop welfare information systems for aging societies that are more resilient, trustworthy, and accessible.
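
For readers who want to reproduce an ISSM-TAM path model of this shape, the sketch below uses the semopy Python package rather than Jamovi (the tool the study actually used), and every construct/column name is a hypothetical stand-in for the variables described above.

```python
# Hedged sketch: an ISSM-TAM style SEM in semopy. The study used Jamovi;
# construct names, the CSV path, and the exact paths are ASSUMPTIONS that
# mirror the relationships summarised in the abstract.
import pandas as pd
import semopy

desc = """
PEOU ~ SystemQuality
PU ~ SystemQuality + Trust + PEOU
Intention ~ PU + PEOU + Trust
Satisfaction ~ PU + SystemQuality
NetBenefits ~ Satisfaction + Use
"""

df = pd.read_csv("pension_survey.csv")   # hypothetical 400-respondent dataset
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())                   # path coefficients and p-values
```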
Open Access
Research article
Leveraging Real-Time GTFS and Integrated Data for High-Accuracy LRT Departure Delay Prediction Using Optimized Machine Learning
Rossi Passarella, Aulya Putri Ayu, Mastura Diana Marieska, Isbatudinia, Nurainiyah Solehan, Harumi Veny, Romi Fadillah Rahmat
Available online: 12-22-2025

Abstract


Efficient light rail transit (LRT) systems are crucial for sustainable urban mobility; however, unforeseen departure delays continue to be a major hurdle, undermining operational reliability and passenger satisfaction. This study establishes a data-driven framework for forecasting departure delays by combining static General Transit Feed Specification (GTFS) schedules with real-time GTFS operational data from the Canberra LRT system. The dataset included 15,538 records with 42 attributes, spanning from 28 August 2020 to 13 August 2022. A stringent preprocessing pipeline was implemented, encompassing temporal feature engineering and feature selection based on mutual information. The Random Forest regressor with feature engineering and selection (RFR-FEFS) attained the highest predictive performance on the test set ($R^2$ = 0.94, MAE = 2.93, MSE = 34.32). The high accuracy indicates the model’s efficacy, yet it necessitates careful evaluation of potential overfitting and its generalizability beyond the examined system. Ablation experiments were performed to assess the impact of various feature groups by omitting temporal, spatial, or operational attributes. The findings indicate that the exclusion of temporal features decreased $R^2$ to 0.90, the exclusion of spatial features reduced it to 0.93, and the exclusion of operational features resulted in the most significant decline to 0.23. These findings affirm that all three feature categories contribute distinctly and synergistically to model performance. This research illustrates the capability of integrating diverse GTFS data with sophisticated machine learning techniques to attain precise LRT delay forecasts. Nevertheless, the framework was validated solely on one system and time frame; future research should investigate its transferability to other cities and integrate supplementary contextual data, including meteorological conditions and incident reports, to improve robustness and practical applicability.
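
A pipeline of this shape (mutual-information feature selection feeding a Random Forest regressor) can be sketched in a few lines of scikit-learn. The file name, column names, and hyperparameters below are assumptions; only the evaluation metrics mirror those reported in the abstract.

```python
# Hedged sketch of an RFR-FEFS style pipeline: mutual-information feature
# selection + Random Forest regression. Data file, columns, k, and tree count
# are HYPOTHETICAL placeholders, not the study's actual configuration.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, mutual_info_regression
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("canberra_lrt_gtfs.csv")       # hypothetical merged GTFS table
X = df.drop(columns=["departure_delay_s"])      # engineered temporal/spatial/
y = df["departure_delay_s"]                     # operational features (assumed)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = make_pipeline(
    SelectKBest(mutual_info_regression, k=20),  # keep the most informative features
    RandomForestRegressor(n_estimators=300, random_state=42),
)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print(r2_score(y_te, pred),                     # the paper reports R^2 = 0.94,
      mean_absolute_error(y_te, pred),          # MAE = 2.93,
      mean_squared_error(y_te, pred))           # MSE = 34.32 on its test set
```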

Abstract

Modern image processing systems deployed on embedded and heterogeneous platforms face increasing pressure to deliver high performance under strict energy and real-time constraints. The rapid growth in image resolution and frame rates has significantly amplified computational demand, making uniform full-precision processing increasingly inefficient. This paper presents a significance-driven adaptive approximate computing framework that reduces energy consumption by tailoring computational precision and resource allocation to the spatial importance of image content. We introduce a statistical importance metric that captures local structural variability using low-complexity deviation-based analysis on luminance information. The metric serves as a lightweight proxy for identifying regions that are more sensitive to approximation errors, enabling differentiated processing without the overhead of semantic or perceptual saliency models. Based on this importance classification, the proposed framework dynamically orchestrates heterogeneous CPU–GPU resources, applies variable kernel sizes, and exploits dynamic voltage and frequency scaling (DVFS) to reclaim timing slack for additional energy savings. The framework is validated through two complementary case studies: (i) a heterogeneous software implementation for adaptive convolution filtering on an Odroid XU-4 embedded platform, and (ii) a hardware-level approximate circuit allocation approach using configurable-precision arithmetic units. Experimental results demonstrate energy reductions of up to 60% compared to uniform-precision baselines, while maintaining acceptable visual quality. Image quality is evaluated using both PSNR and the perceptually motivated SSIM metric, confirming that the proposed approach preserves structural fidelity despite aggressive approximation.
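
The deviation-based importance metric can be illustrated with a short NumPy sketch: per-block standard deviation of the luminance plane, thresholded into precision classes. The block size and thresholds below are assumptions; the paper's exact metric and class boundaries may differ.

```python
# Hedged sketch of a deviation-based significance metric: block-wise std-dev of
# luminance decides which regions tolerate approximation. Block size and the
# two thresholds are ASSUMED values, not the paper's calibrated parameters.
import numpy as np

def importance_map(luma: np.ndarray, block: int = 16) -> np.ndarray:
    """Per-block std-dev of the luminance plane as a structural-importance proxy."""
    h, w = luma.shape
    hb, wb = h // block, w // block
    tiles = luma[:hb * block, :wb * block].reshape(hb, block, wb, block)
    return tiles.std(axis=(1, 3))                 # one score per block

def precision_classes(scores: np.ndarray, lo: float = 5.0, hi: float = 20.0):
    """0 = aggressive approximation, 1 = moderate, 2 = full precision."""
    return np.digitize(scores, [lo, hi])

luma = np.random.default_rng(0).integers(0, 256, (512, 512)).astype(np.float32)
classes = precision_classes(importance_map(luma))
# Low-variance (flat) blocks would get coarse kernels / reduced precision or a
# lower DVFS point; high-variance blocks stay full precision to preserve SSIM.
```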

Abstract

The rapid urbanization and economic development in China have led to increasing demand for infrastructure systems such as utilities, water, gas, and communication networks, exacerbating urban challenges like land scarcity and congestion. Previous studies have highlighted the potential of underground space development as a means to address these issues. Underground utility tunnel construction has been identified as a key solution for efficient pipeline maintenance and the advancement of smart city initiatives. However, as the scale of such projects continues to grow, so does the associated risk. Traditional risk assessment frameworks have often overlooked the significance of intelligent operation and maintenance (O&M) in the context of the digital transformation of infrastructure. This study proposes an updated risk assessment approach that integrates smart O&M into the evaluation framework, reflecting the adoption of technologies such as Building Information Modeling (BIM), digital twins, and big data in construction processes. The Analytic Hierarchy Process (AHP), expert consultations, questionnaire surveys, and fuzzy evaluation methods are applied to identify and assess risks in an underground utility tunnel project in Q City. The results indicate that the overall risk level of the project is above average, with the most significant risks occurring during the construction and operational phases. Risk mitigation measures have been proposed for the identified high-risk areas, tailored to the specific characteristics of the project. This study underscores the importance of incorporating smart operation and information technology risks into traditional risk management frameworks. The findings emphasize the need for a paradigm shift in the risk management of underground utility tunnel projects, particularly in light of the ongoing digital transformation of infrastructure. Such an approach would enhance the safety and efficiency of project management across the entire life cycle of the tunnel system.
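
The AHP step in such an assessment reduces to extracting priority weights from a pairwise-comparison matrix via its principal eigenvector and checking consistency. The 3×3 matrix below is an illustrative placeholder, not the study's expert judgements.

```python
# Hedged sketch of the AHP weighting step: priority vector from the principal
# eigenvector of a pairwise-comparison matrix, plus Saaty's consistency ratio.
# The comparison values are INVENTED for illustration.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # e.g. construction vs. operation vs. design risk
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority weights, sum to 1

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print(w, ci / ri)                              # CR < 0.1 => judgements acceptable
```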
Open Access
Research article
Optimizing Resource Utilization in Industrial Symbiosis: A DEMATEL and FAHP Approach for Sustainable Manufacturing
Juan Carlos Muyulema-Allaica, Jaqueline Elizabeth Balseca-Castro, Francisco Xavier Aguirre-Flores, Paola Martina Pucha-Medina
Available online: 12-22-2025

Abstract


Industrial symbiosis (IS) represents a strategic framework for collaboration among companies through innovative partnerships aimed at optimizing resource utilization, reducing environmental impact, and promoting sustainable development in line with the principles of the circular economy. This study conducted a systematic literature review (SLR) and a quantitative analysis of the effectiveness of IS tools in resource management. Publications from January 2020 to December 2024 were retrieved from established databases such as SpringerLink, ScienceDirect, EBSCO, and DOAJ, with a focus on industrial engineering, environmental management, circular economy, sustainable development, resource conservation, and recycling. Advanced methodologies, including the Fuzzy Analytic Hierarchy Process (FAHP) and the Decision-Making Trial and Evaluation Laboratory (DEMATEL), were applied to evaluate four key dimensions, i.e., Decision-Making (DMD), Geographical Location (GLD), Strategic Planning (SD), and Lean Manufacturing (LMD), along with 21 subcriteria. The results indicated that DMD and GLD functioned as causal dimensions influencing SD and LMD, while alternatives such as Intelligent Waste Recycling Systems (IWRS) and Life Cycle Assessment (LCA) were considered to be highly efficient in resource utilization. The identification of dominant relationships via the threshold value of α = 0.58 highlighted strategic leverage points for implementing sustainable manufacturing practices. These findings emphasize that effective DMD, combined with strategic planning based on geographical considerations and application of technological tools, is critical for optimizing resources, enhancing environmental protection, and fostering economic and social development, thus providing clear guidance for the implementation of IS strategies in industrial settings.
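
The DEMATEL computation behind the causal-dimension finding can be sketched briefly: normalise a direct-influence matrix, derive the total-relation matrix T = X(I − X)⁻¹, and keep links above the threshold. The 4×4 influence scores below are invented placeholders for the four dimensions; only the α = 0.58 threshold comes from the abstract.

```python
# Hedged DEMATEL sketch for the four dimensions (DMD, GLD, SD, LMD).
# The 0-4 influence scores are INVENTED; alpha = 0.58 is the study's threshold.
import numpy as np

D = np.array([[0, 3, 2, 3],     # DMD -> others (illustrative scores)
              [2, 0, 3, 2],     # GLD
              [1, 1, 0, 3],     # SD
              [1, 1, 2, 0]])    # LMD

X = D / D.sum(axis=1).max()                   # normalised direct influence
T = X @ np.linalg.inv(np.eye(4) - X)          # total-relation matrix
prominence = T.sum(axis=1) + T.sum(axis=0)    # D + R: overall importance
relation   = T.sum(axis=1) - T.sum(axis=0)    # D - R: > 0 => causal dimension
significant = T > 0.58                        # dominant links above the threshold
print(np.round(T, 2), relation, significant, sep="\n")
```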

Abstract

Recent literature has explored the nexus between macroeconomic policy uncertainty (MPU) and the environment in compliance with Sustainable Development Goals (SDGs). This study contributes to the literature by exploring the possible positive or negative environmental effects of MPU. The present study reviewed 117 research articles published from 2020 to 2025 to understand the multifaceted association between MPU and environmental sustainability, considering sectoral and spatial dynamics, asymmetric responses, and heterogeneous responses across countries and regions. The findings suggested that the relationship was complex and varied with the economic sector, emissions source, policy regime, and geographical location. MPU reduced the speed of transition from the first to the second phase of the Environmental Kuznets Curve (EKC). In the short run, MPU can reduce emissions due to temporary economic slowdowns. Nevertheless, it can be responsible for negative long-term environmental performance by delaying green investments, increasing fossil fuel reliance, and weakening institutional effectiveness. Sectoral analyses revealed that MPU raised emissions in the energy and industrial sectors and reduced them in the agricultural sector. While strong institutional quality helped to mitigate emissions, weak institutions raised environmental problems. The findings of this review suggested that policymakers should design adaptive, sector-sensitive, and regionally coordinated environmental strategies to protect the environment from macroeconomic policy volatility.

Abstract


In recent years, humanitarian logistics has received much attention from practitioners and researchers due to the significant damage caused by natural disasters on a global scale. This case study investigated the potential of leveraging social media data to enhance the effectiveness of humanitarian logistics in Vietnam after the disaster caused by Typhoon Yagi. The research examined public sentiment about the disaster response efforts, pinpointed critical relief needs, and assessed the performance of various machine learning models in classifying disaster-related content on social media. Data was sourced from multiple platforms, preprocessed, and then categorized according to damage types, required relief supplies, and sentiment labels. Different machine learning models were then utilized to analyze the negative impact of the disaster. The analysis revealed that housing and transportation were the primary sources of negative public sentiment, indicating significant unmet needs in these areas. In contrast, generally more positive responses were received in relation to cash assistance, food, and medical support. A comparative evaluation of 12 machine learning models suggested that conventional algorithms, such as Random Forest, Support Vector Machine, and Logistic Regression, outperformed deep learning models in sentiment classification tasks. These findings shed light on the value of social media as a real-time indicator of public perception and logistical effectiveness. Therefore, incorporating sentiment analysis into the planning of disaster response can support more adaptive, timely, and community-informed decision-making for governments and humanitarian organizations.
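
The classical-model comparison reported above can be sketched with TF-IDF features and three of the named classifiers. The data file and column names are hypothetical stand-ins for the labelled social-media posts; the preprocessing and hyperparameters are assumptions.

```python
# Hedged sketch comparing the classical classifiers named above on TF-IDF
# features. Dataset path, columns, and hyperparameters are HYPOTHETICAL.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

df = pd.read_csv("yagi_posts_labelled.csv")   # hypothetical labelled posts
X, y = df["text"], df["sentiment"]            # e.g. negative/neutral/positive

for name, clf in [("logreg", LogisticRegression(max_iter=1000)),
                  ("svm", LinearSVC()),
                  ("rf", RandomForestClassifier(n_estimators=300))]:
    pipe = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2), clf)
    scores = cross_val_score(pipe, X, y, cv=5, scoring="f1_macro")
    print(name, scores.mean())                # macro-F1 per model
```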

Abstract

The suitability of the Garko area (Wudil Sheet 81 SE) for dam construction has been assessed through the analysis of aeromagnetic data with a spatial resolution of 500 m line spacing and a flight altitude of 80 m. The investigation, conducted in north-central Nigeria, aimed to delineate subsurface structural features and identify magnetic anomalies relevant to dam site selection. The integration of quantitative filtering techniques with magnetic interpretation significantly improved the reproducibility and reliability of the geophysical site evaluation process, thereby enhancing the accuracy of the assessment for sustainable dam development. The Total Magnetic Intensity (TMI) data was processed using upward continuation at a height of 1 km, with the resulting dataset serving as the primary input for the analysis. Several edge detection methods and interpretation techniques were employed, including the Gaussian filter (cut-off frequency of 0.05 cycles/km), Reduction to the Pole (RTP) for low latitudes, and Tilt Derivative filters, to delineate structural trends and boundary zones. From the Tilt Derivative map derived from the TMI data, three magnetic zones were identified: a low magnetic intensity zone (LM) with an amplitude range of -1.4 to -0.3 nT, a moderate magnetic zone (MM) with amplitudes ranging from -0.3 to 0.4 nT, and a high magnetic intensity zone (HM) with amplitudes from 0.4 to 1.3 nT. These zones were represented by color codes from blue to pink, corresponding to the magnetic amplitude values. Lineament analysis conducted on the Tilt Derivative map revealed prominent NE–SW and NW–SE structural trends, which are believed to control subsurface drainage and fracture systems. Areas characterized by low magnetic intensities and sparse lineament density were identified as geologically stable, suggesting their suitability for the foundation of a dam. This study demonstrates that magnetic data, when combined with advanced geophysical techniques, can play a pivotal role in site selection for sustainable infrastructure development.
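
The Tilt Derivative itself is a compact formula, TDR = arctan(vertical derivative / total horizontal derivative), and can be sketched on a grid in NumPy. The grid below is synthetic; the vertical derivative via a wavenumber-domain |k| filter is a standard choice, not necessarily the exact processing chain the study used.

```python
# Hedged sketch of the Tilt Derivative filter on a TMI grid:
# TDR = arctan(dT/dz / sqrt((dT/dx)^2 + (dT/dy)^2)). Synthetic data; the
# FFT-based vertical derivative is an ASSUMED standard implementation.
import numpy as np

def tilt_derivative(tmi: np.ndarray, dx: float, dy: float) -> np.ndarray:
    dTdy, dTdx = np.gradient(tmi, dy, dx)              # horizontal derivatives
    thdr = np.hypot(dTdx, dTdy)                        # total horizontal derivative
    ky = np.fft.fftfreq(tmi.shape[0], dy) * 2 * np.pi
    kx = np.fft.fftfreq(tmi.shape[1], dx) * 2 * np.pi
    K = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))  # |k| wavenumber magnitude
    dTdz = np.fft.ifft2(np.fft.fft2(tmi) * K).real     # vertical derivative
    return np.arctan2(dTdz, thdr)                      # bounded to (-pi/2, pi/2)

grid = np.random.default_rng(1).normal(size=(128, 128))  # stand-in for the TMI grid
tdr = tilt_derivative(grid, dx=500.0, dy=500.0)          # 500 m line spacing
# Zero-crossings of the TDR track source edges; positive/negative amplitude
# bands correspond to the HM/MM/LM zoning described above.
```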