Acadlore Transactions on Applied Mathematics and Statistics (ATAMS)
ISSN (print): 2959-4057
ISSN (online): 2959-4065
Current Issue: 2023, Vol. 1

Acadlore Transactions on Applied Mathematics and Statistics (ATAMS) is dedicated to advancing research in the fields of applied mathematics and statistics. Highlighting the pivotal role of mathematical methodologies and statistical techniques in diverse real-world applications, ATAMS strives to decode the complexities underpinning these domains. Published quarterly by Acadlore, this peer-reviewed, open access journal typically issues its editions in March, June, September, and December each year.

  • Professional Service - Every article submitted undergoes an intensive yet swift peer review and editing process, adhering to the highest publication standards.

  • Prompt Publication - Thanks to our proficiency in orchestrating the peer-review, editing, and production processes, all accepted articles see rapid publication.

  • Open Access - Every published article is instantly accessible to a global readership, allowing for uninhibited sharing across various platforms at any time.

Editors-in-Chief
Bisera Andrić Gušavac
University of Belgrade, Serbia
bisera.andric.gusavac@fon.bg.ac.rs | website
Research interests: Mathematical Modelling; Optimization; Industrial Engineering; Performance Analytics
Milena Popović
University of Belgrade, Serbia
milena.popovic@fon.bg.ac.rs | website
Research interests: Data Envelopment Analysis; Quantitative Models and Methods; Mathematical Modelling; Optimization; Business Analytics and Performance Analytics

Aims & Scope

Aims

Acadlore Transactions on Applied Mathematics and Statistics (ATAMS) stands as an academic beacon in the realms of applied mathematics and statistics, illuminating the academic horizon with profound insights. Designed to serve as a nexus for the global community of researchers, scholars, and professionals, ATAMS is committed to showcasing groundbreaking research articles, in-depth reviews, and technical notes that span the myriad intersections of mathematical applications and statistical methodologies.

As modern challenges beckon innovative solutions, the journal's core revolves around the transformative potential of mathematical and statistical theories. These theories, often intricately woven into sectors ranging from engineering to economics, physical to social sciences, form the fabric of contemporary advancements. ATAMS champions not just the formulation of avant-garde mathematical models but ardently promotes their practical applications, solving real-world conundrums.

Holding the torch of academic excellence, ATAMS seeks manuscripts that redefine boundaries, stir intellectual curiosity, and instigate meaningful discussions. By fostering a milieu of interdisciplinary dialogues and collaborative ventures, the journal becomes an academic crucible where theories meld and ideas crystallize.

Advocating for exhaustive explorations, ATAMS believes in unbridled knowledge dissemination. Consequently, there are no confines on the length of contributions. Authors are encouraged to elucidate with thoroughness, ensuring the replicability of their findings. Distinctive features of the journal encompass:

  • A commitment to equitable academic services, ensuring authors, irrespective of their geographical origins, receive unparalleled support.

  • An agile review mechanism that underpins academic rigor, paired with expedited post-approval publication timelines.

  • An expansive reach, powered by the journal's open access directive, ensuring research resonates globally.

Scope

In its pursuit of academic breadth and depth, ATAMS's scope is vast, intricately designed to cover the spectrum of applied mathematics and statistics. It includes:

  • Mathematical Modeling: A comprehensive exploration into how mathematical methods are tailored to describe, forecast, and resolve intricate real-world challenges, ranging from ecological systems to intricate urban planning.

  • Statistical Theory and Innovations: This section doesn't just introduce novel statistical methods but critically evaluates their properties, potential pitfalls, and adaptability in diverse scenarios. It shines light on emerging trends and their applicability in new domains.

  • Data Synthesis and Mining: Beyond just extraction, the focus here is on the holistic lifecycle of data. It delves into methods for preprocessing, transformation, deep analysis, interpretation, and the eventual representation of data to ensure informed decision-making.

  • Advanced Numerical Computations: Celebrating the confluence of pure mathematics, algorithm design, and computational sciences, this segment highlights the latest strides in numerical methods, iterative techniques, and high-performance computing applications.

  • Interdisciplinary Matrix: This isn't just a cursory glance but a deep dive. From the precision required in financial mathematics, the sensitivity of medical statistics, the predictive power of biostatistics, to the large-scale implications of environmental statistics, this section covers it all.

  • Probabilistic Systems and Stochastic Analysis: Investigate the realms of randomness and uncertainty, dissecting how probabilistic models and stochastic methodologies can offer insights in fields as varied as finance, quantum mechanics, and epidemiology.

  • Optimization Techniques: Be it linear programming, dynamic optimization, or the newer realms of quantum optimization, this domain touches upon the algorithms and strategies that strive for perfection, ensuring resources are utilized to their utmost potential.

  • Time Series Analysis and Forecasting: Engage with the rhythmic dance of data over time, understanding patterns, anomalies, and making informed predictions about future behaviors, critical for sectors like finance, meteorology, and even social sciences.

  • Machine Learning and Artificial Intelligence: In this age of automation and intelligence, understand the mathematical underpinnings of ML algorithms, neural network design, and the statistical validations that ensure AI operates within expected paradigms.

  • Graph Theory and Network Analysis: From social networks, biological pathways to the vast world wide web, delve into the intricate patterns, connectivity issues, and the cascading effects within networks.

Articles

Abstract

In contemporary military contexts, the determination of an optimal course of action (COA) in combat operations emerges as a critical challenge. This study delineates a decision support methodology for military applications, employing sophisticated decision analysis techniques. The initial phase entails the identification of pivotal criteria for assessing and ranking COAs, followed by the assignment of weight coefficients to each criterion via the full consistency method (FUCOM). Subsequently, the Einstein weighted arithmetic average operator (EWAA) was utilized for the aggregation of expert opinions, ensuring a consensual evaluation of these criteria and culminating in the final values of their weight coefficients. The ensuing phase focuses on the selection of an optimal COA, incorporating the grey complex proportional assessment (COPRAS-G) method. This method addresses uncertainties and varying criterion values. Expert ratings were again aggregated using the EWAA operator. The findings from this phase are designed to provide military commanders with precise, data-driven guidance for decision-making. To validate and verify the stability of the proposed model, a series of tests were conducted, including a rank reversal test, sensitivity analysis regarding changes in weight coefficients, and a comparative analysis with alternative methods. These assessments uniformly indicated the model's consistency, stability, and validity as a military decision support tool. Emphasizing a high degree of confidence in COA selection, the methodology advocated herein is applicable to decision-making processes in the planning and execution of military operations. The uniform application of professional terms, consistent with the broader context of this research, ensures clarity and coherence in its presentation. 
The approach outlined in this study stands as a testament to rigorous analytical methodologies in the realm of military strategic planning, offering a robust framework for decision-making under conditions of uncertainty and complexity.
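The FUCOM and COPRAS-G computations themselves are not reproduced in the abstract. As a simplified stand-in, the sketch below ranks courses of action by a plain weighted sum over max-normalized benefit criteria; the scores and weight coefficients are invented for illustration and are not the authors' data or method.

```python
# Minimal weighted-sum ranking of courses of action (COAs).
# Simplified stand-in for the FUCOM/COPRAS-G pipeline; all numbers
# below are hypothetical.

def rank_coas(scores, weights):
    """Rank alternatives by weighted sum over max-normalized benefit criteria."""
    n_crit = len(weights)
    # Normalize each criterion column by its column maximum.
    maxima = [max(row[j] for row in scores.values()) for j in range(n_crit)]
    totals = {
        name: sum(w * v / m for w, v, m in zip(weights, row, maxima))
        for name, row in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical expert scores for three COAs on three benefit criteria,
# and hypothetical FUCOM-style weight coefficients.
scores = {"COA-1": [7, 9, 6], "COA-2": [8, 6, 9], "COA-3": [5, 8, 7]}
weights = [0.5, 0.3, 0.2]

ranked = rank_coas(scores, weights)
# ranked[0][0] is the best-scoring course of action
```

COPRAS-G additionally handles interval-valued (grey) ratings and cost-type criteria, which this sketch deliberately omits.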

Abstract

In the realm of financial markets, the manifestation of volatility clustering serves as a pivotal element, indicative of the inherent fluctuations characterizing financial instruments. This attribute acquires pronounced relevance within the sphere of cryptocurrencies, a sector renowned for its elevated risk profile. The present analysis, conducted through the Autoregressive Moving Average - Generalized Autoregressive Conditional Heteroskedasticity (ARMA-GARCH) model, seeks to elucidate the enduring nature of volatility clustering and the occurrence of leverage effects within this domain. Over the course of a four-year time frame, it was observed that Bitcoin diverges from the anticipated Autoregressive Conditional Heteroskedasticity (ARCH) effects, in contrast to Ethereum and Cardano, which exhibit marked volatility clustering. Binance Coin, Ripple, and Dogecoin, whilst demonstrating moderate clustering, uniformly reflect the existence of leverage effects. An exception to this pattern was identified in Ripple, where it was discerned that positive market news exerts a disproportionate influence on log returns. The findings of this study illuminate the critical influence of both leverage effects and volatility clustering on the pricing dynamics of cryptocurrencies. It underscores the imperative for a nuanced comprehension of risk management in the context of cryptocurrency investments, given their susceptibility to abrupt price fluctuations. The distinct degrees to which these phenomena are manifested across diverse cryptocurrencies accentuate the necessity for a tailored risk management approach, resonant with the unique attributes of the asset in question. Such strategies, accounting for the potential amplification of losses through leverage, may encompass prudent position sizing, portfolio diversification, and the implementation of stress tests, thereby fortifying the investment against the dual perils of volatility clustering and leverage effects. 
The implications of this analysis serve to inform investors, providing a foundation upon which to construct risk management tactics that are responsive to the idiosyncrasies of the cryptocurrency market.
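The clustering effect the study measures can be seen directly in the GARCH(1,1) conditional-variance recursion. The sketch below uses illustrative parameter values, not estimates for any of the coins analyzed: a single large shock keeps the variance elevated for several periods when alpha + beta is close to 1.

```python
# GARCH(1,1) conditional-variance recursion, the building block of the
# ARMA-GARCH analysis described above. Parameters are illustrative only.

def garch_variance(returns, omega, alpha, beta):
    """Return the conditional-variance path sigma_t^2 for a return series.

    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
    Volatility clustering appears when alpha + beta is near 1: large
    shocks keep the variance elevated for many periods.
    """
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    path = [var]
    for r in returns[:-1]:
        var = omega + alpha * r * r + beta * var
        path.append(var)
    return path

# A large shock at t=0 followed by calm returns: the variance jumps,
# then decays only gradually back toward its unconditional level.
path = garch_variance([0.10, 0.0, 0.0, 0.0, 0.0],
                      omega=1e-5, alpha=0.1, beta=0.85)
```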

Abstract

In the present investigation, the phenomena of multi-scale volatility spillovers and dynamic hedging within the Chinese stock market are scrutinized, with particular emphasis on the implications of structural breaks. The decomposition of the returns from the CSI 300 and Hang Seng Index spot and futures is achieved through the application of the Maximum Overlap Discrete Wavelet Transform (MODWT), categorizing the data into three distinct temporal scales: short-term, medium-term, and long-term. An enhancement upon the conventional VAR-BEKK-GARCH (Vector Autoregressive - Baba, Engle, Kraft, and Kroner - Generalized Autoregressive Conditional Heteroskedasticity) model is proposed, yielding the asymmetric VAR-BEKK-GARCH model (VAR-BEKK-AGARCH), which adeptly integrates structural breaks in return volatility. A comprehensive analysis is conducted to elucidate the interactions and spillovers between the CSI 300 and Hang Seng Index, as well as their respective spot and futures markets, across the various identified time scales. Concurrently, a dynamic hedging portfolio, comprised of index spot and futures, is meticulously constructed, with its performance rigorously evaluated under the influence of the different time scales. To ensure the robustness and validity of the findings, wavelet coherence and phase difference methodologies are employed as verification tools. The results unequivocally reveal a heterogeneity in the behavior of mean spillover, volatility spillover, and asymmetric spillovers between the spot and futures markets of the CSI 300 and Hang Seng Index across the diverse scales. The inclusion of a structural break in the dynamic hedge portfolio is demonstrated to confer a marked advantage over its counterpart that omits this critical factor. Particularly in the short and medium-term scenarios, the dynamically hedged portfolio, enriched by the consideration of the structural break, exhibits superior performance in comparison to the static hedge portfolio.
Additionally, it is discerned that the CSI 300 Index and Hang Seng Index, along with their spot and futures components, predominantly manifest in synchrony, with no clear indication of a consistent lead-lag relationship. An intensification of correlation is observed in the long-term analysis, underscoring the utility of the spot and futures of the two indices as efficacious hedging tools.
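As background for the hedging results, the classical static minimum-variance hedge ratio is h* = Cov(r_s, r_f) / Var(r_f); the study's dynamic, scale-dependent hedges refine this baseline. A minimal sketch with toy return series (not index data):

```python
# Static minimum-variance hedge ratio h* = Cov(r_s, r_f) / Var(r_f),
# the classical baseline that dynamic, scale-dependent hedging refines.
# The return series below are toy data, not CSI 300 / Hang Seng returns.

def hedge_ratio(spot, futures):
    """Sample-covariance estimate of the minimum-variance hedge ratio."""
    n = len(spot)
    ms = sum(spot) / n
    mf = sum(futures) / n
    cov = sum((s - ms) * (f - mf) for s, f in zip(spot, futures)) / (n - 1)
    var = sum((f - mf) ** 2 for f in futures) / (n - 1)
    return cov / var

spot = [0.010, -0.004, 0.006, -0.002, 0.008]
futures = [0.012, -0.005, 0.007, -0.003, 0.009]
h = hedge_ratio(spot, futures)
# The hedged return is r_s - h * r_f: h units of futures are sold
# per unit of spot exposure.
```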

Abstract

This paper presents an investigation of traveling wave solutions and a sensitivity analysis for the unidirectional Dullin-Gottwald-Holm (DGH) system, a well-established model for wave propagation in shallow water. We apply the novel auxiliary equation method, a unique integration norm, to extract various soliton solutions, including kink, rational, bright, singular, and bright-singular solutions. Precise explicit solutions of the resultant ordinary differential equations are demonstrated using suitable parametric values. Furthermore, we explore the conditions that ensure the existence of these solutions. By applying the Galilean transformation, we convert the model into a planar dynamical system and evaluate its sensitivity performance. The selection of appropriate parameters enables the generation of two- and three-dimensional sketches, as well as contour plots, for each solution.
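For reference, the DGH equation is commonly written as follows, with $\alpha$ a length scale, $c_0$ the linear wave speed, and $\gamma$ a dispersion coefficient; the exact form and notation used in the article may differ.

```latex
u_t - \alpha^2 u_{xxt} + c_0 u_x + 3 u u_x + \gamma u_{xxx}
  = \alpha^2 \left( 2 u_x u_{xx} + u\, u_{xxx} \right)
```

Setting $\alpha = 0$ recovers the Korteweg-de Vries equation, while $\gamma = 0$ yields the Camassa-Holm equation, which is why the DGH system interpolates between the two classical shallow-water models.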

Abstract


In recent years, a surge in studies concerning indigenous knowledge (IK) has been observed, yet a clear definition of IK remains elusive. Discrepancies in international studies lead to fluid interpretations of the concept. The present study seeks to delineate the key elements characterizing knowledge as either indigenous or foreign to a specific community. Through a meticulous exploration of definitions surrounding indigenous knowledge, it is posited that all knowledge forms can be considered indigenous within the communities of their origination. To elucidate this argument, the impact of community demographics on the adoption of knowledge perceived as indigenous within the Chief Albert Luthuli Municipality was investigated. Data were collected using structured interviews, involving a total of 398 respondents. Analyses were conducted employing a mixed-method approach, utilizing Microsoft Excel and the Statistical Package for Social Sciences (SPSS). Findings revealed a significant relationship between variables such as commonly spoken language, cultural attributes, age, and employment level with IK practices within communities. Furthermore, the economic factors, including employment status, education levels, and household income, were examined for their association with the adoption of IK practices. It was discerned that such variables were correlated with the adoption of IK practices, especially as alternative strategies in the absence of consistent household income. Key determinants like the language proficiency of the household head, employment status, educational attainment, family size, household income level, age, and gender of the household heads were analyzed. The influence of these determinants on household adoption of indigenous practices was assessed using inferential statistical methods, specifically probability and regression analysis.

Abstract

In businesses entailing the distribution of goods, the vehicle routing problem (VRP) critically influences the minimization of distribution costs and the curtailment of excessive vehicle utilization. This study delves into the formulation of the VRP within a firm specializing in the distribution of appliances and consumer goods, emphasizing the firm's unique operational characteristics. A mathematical model addressing the vehicle routing issue is meticulously crafted and subsequently resolved, yielding exact solutions through the application of the GNU Linear Programming Kit (GLPK). Comparative insights into the pre-existing and newly devised routing methodologies within the firm are elucidated. Owing to the dynamism in customer demands and daily deliveries, the propounded model has been designed for facile adaptability and frequent utilization. It demonstrates a marked enhancement over the conventional routing paradigms prevalent within the company. Recognizing potential avenues for advancement, considerations such as multi-warehouse integration and the introduction of customer-specific time windows, wherein goods must be dispatched within stipulated intervals, are acknowledged as prospects for future research and implementation.
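The exact GLPK model used in the paper is not reproduced in the abstract. For very small instances the capacitated routing structure can be illustrated by brute force; the distances, demands, and capacity below are hypothetical, and the order-then-split enumeration is a sketch of the problem, not the firm's model.

```python
# Tiny capacitated vehicle-routing sketch: enumerate customer orderings,
# split each into capacity-feasible trips from depot 0, keep the cheapest.
# All data are hypothetical; assumes every single demand fits one vehicle.
from itertools import permutations

def solve_vrp(dist, demand, capacity):
    """Return (cost, routes) of the cheapest order-then-split solution."""
    customers = range(1, len(dist))
    best = (float("inf"), None)
    for order in permutations(customers):
        routes, trip, load = [], [], 0
        for c in order:
            if load + demand[c] > capacity:  # return to depot, start new trip
                routes.append(trip)
                trip, load = [], 0
            trip.append(c)
            load += demand[c]
        routes.append(trip)
        cost = sum(
            dist[0][r[0]] + dist[r[-1]][0]
            + sum(dist[a][b] for a, b in zip(r, r[1:]))
            for r in routes
        )
        if cost < best[0]:
            best = (cost, routes)
    return best

# Depot 0 and three customers; symmetric distances (hypothetical).
dist = [
    [0, 4, 6, 5],
    [4, 0, 3, 7],
    [6, 3, 0, 4],
    [5, 7, 4, 0],
]
demand = [0, 3, 4, 5]  # demand[0] is the depot
cost, routes = solve_vrp(dist, demand, capacity=8)
```

An integer-programming formulation, as solved with GLPK in the study, scales far beyond what this factorial enumeration can handle and also supports the time-window extensions mentioned as future work.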

Abstract


The process of decision-making involves selecting the most suitable management action from a range of options, thereby guiding the system towards its management objectives. Within the complex decision-making environment, uncertainty prevails, giving rise to the domain of risk. Effective risk management entails various activities that are implemented during distinct phases of system management. To address this, a systemic approach to risk management is crucial, along with the adoption of software solutions for risk analysis. This study examines the systemic approach to risk management and proposes a potential solution for managing uncertainties and risks by employing software tools that are rooted in system quality. System quality encompasses the development of novel models, methods, tools, and procedures, whose consistent application ensures reliable outcomes based on the best available information. Consequently, this study explores the application of innovative software solutions that support the risk management process across all phases. Given that risk management relies on data, which may not offer a comprehensive view of the environment, decision-making can be regarded as a process of managing the conversion of data into information. The acquisition of new information regarding the system's state determines the approach to modify the system through the chosen decision. Information serves as the essence of the decision-making process, as quality information facilitates quality decisions. However, in an information space characterized by incomplete data, the quality of decisions diminishes. Software solutions capable of providing the necessary level of information quality, despite uncertainties and incompleteness, enable decision-making based on partial information while upholding a minimum standard of quality.

Abstract


Goodwill impairment, resulting from the impairment tests conducted on goodwill generated during business mergers, serves as an effective indicator of a company's true and reliable goodwill value, as well as its operational and financial conditions. This study investigates the impact of earnings management motivations on goodwill impairment from the perspective of corporate governance, focusing on Chinese manufacturing listed companies between 2016 and 2020. Utilizing regression analysis and panel data models, the study examines the internal governance mechanisms, including the combined shareholding ratio of the top ten shareholders, and the external governance mechanisms, such as the role of the four major auditing firms. The findings reveal that both "big bath" and earnings smoothing motives can influence companies' decisions to recognize goodwill impairment, while effective internal and external governance mechanisms can help mitigate earnings management motivations. Further analysis shows that non-state-owned manufacturing listed companies are more likely to exhibit goodwill impairment behaviors driven by earnings management motives. These findings provide valuable insights for listed companies seeking to improve their corporate governance structures and for Chinese capital market regulators aiming to enhance relevant regulatory policies and refine goodwill measurement standards.

Abstract


The advent of air travel, originally proposed by the Wright brothers, has led to a significant surge in aircraft usage for human transportation. In its nascent stages, this mode of transport was linked with a high frequency of accidents and consequent fatalities, placing it in the high-risk category. To counter these risks, the International Civil Aviation Organization (ICAO) was established in 1947 as a collaborative effort among numerous countries with the primary goal of enhancing aviation safety regulations. This study analyzed archival data from the Bureau of Aircraft Accidents Archives (B3A), covering the period from 1918, the year of the first commercial airplane crash, until 2020. The objective was to understand the ICAO's impact on altering accident rates, fatalities, and underlying causes. Analytical methodologies encompassed descriptive statistics (data distribution, central tendencies, and category frequencies) and exploratory data analysis (EDA) to identify variable relationships and outliers. The results indicated that ICAO's interventions have led to a notable decline in accident rates, with an annual average reduction of 70.9%, and a corresponding decrease in incidents attributed to technical factors. However, an unexpected trend was the increase in fatalities despite the drop in accident numbers, attributable to the introduction of larger aircraft designs carrying more passengers per flight. The findings underscore the ICAO's successful efforts in reducing aircraft accidents, but also suggest a need for further exploration into factors contributing to the rise in fatalities.

Abstract


Existing legal and by-law regulations prescribe risk management methodologies for various domains, such as the transportation of hazardous materials, fire and explosion protection, environmental protection, and protection against chemical accidents. However, there is a lack of comprehensive methodological guidance that unifies the management of all risks associated with the transportation of explosive remnants of war (ERW), which pose significant threats to human life, cultural assets, and the environment. Furthermore, the transportation of ERW often occurs along traffic corridors with compromised infrastructure, increasing the range of potential risks affecting the safety of people, their property, and critical infrastructure. This study presents an integrated risk management model for ERW transportation in the Republic of Serbia, developed based on current legal and by-law regulations, as well as modern criteria and risk assessment methodologies. By applying this model, the various risks associated with ERW transportation can be effectively mitigated, ensuring the safety and protection of people, assets, and the environment.

Abstract


The continuous evolution of consumer behavior in the modern era of consumption has prompted enterprises to explore the underlying behavioral factors of consumers and cater to their particular needs. Moreover, developing a rational operational behavior model and responding effectively to the dynamic market environment have become critical concerns for businesses. This study examines the impact of consumer reference price effects and enterprise short-sighted behavior on strategic selection and performance, employing differential game theory to construct a game model between manufacturers and retailers. Utilizing Bellman's continuous-time dynamic programming, analytical solutions for various models are derived, followed by comparative analyses and numerical examples. The research reveals that: (1) Manufacturers' behavior patterns are found to be dominant, favoring far-sighted behavior, which not only enhances profits but also enables consumers to access higher quality and cost-effective products; retailers should opt for collaboration with far-sighted manufacturers and exhibit a preference for short-sighted behavior. (2) In terms of overall system profit, the FM model emerges as the optimal combination. (3) When the reference price effect has a small impact on market demand, enterprises can make use of the reference price effect to actively promote marketing and gain profit; as the influence increases, intensifying the degree of influence effectively augments profits.

Abstract


This study aims to identify efficient Information Technology (IT) candidates for a specific position and highlight areas for improvement using Data Envelopment Analysis (DEA). By streamlining the selection process and reducing costs, the findings can assist companies in making better-informed hiring decisions. Additionally, the results provide candidates with valuable feedback on areas for development, increasing their chances of securing employment in their desired company. The DEA model offers a unique advantage in this context by generating reference units for each candidate, enabling precise determination of the necessary changes in inputs or outputs for achieving efficiency. The Charnes, Cooper, and Rhodes (CCR) model served as the baseline, with parallel comparisons drawn against the Banker, Charnes, and Cooper (BCC) and categorical models to identify the most effective approach. The findings reveal the efficient candidates based on the assessed criteria, demonstrating that less experienced candidates can be evaluated as efficient compared to their more experienced counterparts. The hypothesis that the BCC model, with its more flexible efficiency frontier, results in poorer candidate differentiation was confirmed. This study highlights the value of adopting the DEA method in evaluating the employment efficiency of IT candidates, offering practical implications for both hiring organizations and job-seekers.
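In the special case of a single input and a single output, the CCR efficiency score reduces to each unit's output/input ratio divided by the best ratio in the sample, which makes the abstract's point about less experienced candidates easy to see. The candidate data below are hypothetical; the study itself uses multiple criteria and full LP-based CCR, BCC, and categorical models.

```python
# Single-input, single-output CCR efficiency: each candidate's
# output/input ratio divided by the best ratio in the sample.
# Candidate data are hypothetical, for illustration only.

def ccr_efficiency(units):
    """units: {name: (input, output)} -> {name: efficiency in (0, 1]}."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# input  = years of experience (a resource consumed, lower is better),
# output = score on a technical assessment (hypothetical values).
candidates = {"A": (2, 80), "B": (5, 90), "C": (4, 88), "D": (3, 75)}
eff = ccr_efficiency(candidates)
# Candidate A, the least experienced, is efficient: the highest
# assessment score per year of experience defines the frontier.
```

With multiple inputs and outputs this ratio comparison becomes a linear program per candidate, which is where the CCR/BCC distinction (constant vs. variable returns to scale) and the reference-unit targets mentioned in the abstract arise.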
