Acadlore Transactions on Applied Mathematics and Statistics
Latest open access articles published in Acadlore Transactions on Applied Mathematics and Statistics: https://www.acadlore.com/journals/ATAMS
Publisher: Acadlore | Language: en | License: Creative Commons Attribution (CC BY) | Contact: support@acadlore.com

Acadlore Transactions on Applied Mathematics and Statistics, 2024, Volume 2, Issue 1: Neighbourhood Degree-Based Graph Descriptors: A Comprehensive Analysis of Connectivity Patterns in Diverse Graph Families and Their Applicability
https://www.acadlore.com/article/ATAMS/2024_2_1/atams020105
In the field of graph theory, the exploration of connectivity patterns within various graph families is paramount. This study is dedicated to the examination of the neighbourhood degree-based topological index, a quantitative measure devised to elucidate the structural complexities inherent in diverse graph families. An initial overview of existing topological indices sets the stage for the introduction of the mathematical formulation and theoretical underpinnings of the neighbourhood degree-based index. Through meticulous analysis, the efficacy of this index in delineating unique connectivity patterns and structural characteristics across graph families is demonstrated. The utility of the neighbourhood degree-based index extends beyond theoretical graph theory, finding applicability in network science, chemistry, and social network analysis, thereby underscoring its interdisciplinary relevance. By offering a novel perspective on topological indices and their role in deciphering complex network structures, this research makes a significant contribution to the advancement of graph theory. The findings not only underscore the versatility of the neighbourhood degree-based topological index but also highlight its potential as a tool for understanding connectivity patterns in a wide array of contexts. This comprehensive analysis not only enriches the theoretical landscape of graph descriptors but also paves the way for practical applications in various scientific domains, illustrating the profound impact of graph theoretical studies on understanding the intricacies of networked systems.

Authors: Abdu Alameri, Abid Mahboob, Emad Toma Karash | doi: 10.56578/atams020105 | Published: 02-04-2024
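Neighbourhood degree-based indices are built from neighbour-degree sums. The abstract above does not state the paper's exact formula, so the sketch below uses the standard neighbour-degree sum S(u) = Σ_{v∈N(u)} deg(v) and an illustrative edge-wise combination f(S(u), S(v)) = S(u) + S(v); the toy graph and the combining function are assumptions for demonstration only.

```python
# Illustrative neighbourhood degree-based index on a small graph.
# S(u) = sum of the degrees of u's neighbours; the edge-wise combining
# function f(a, b) = a + b is an illustrative choice, not the paper's.

def neighbour_degree_sums(adj):
    """adj: dict mapping vertex -> set of neighbours."""
    deg = {u: len(nbrs) for u, nbrs in adj.items()}
    return {u: sum(deg[v] for v in nbrs) for u, nbrs in adj.items()}

def nd_index(adj, f=lambda a, b: a + b):
    s = neighbour_degree_sums(adj)
    # Each edge appears once as a frozenset {u, v}.
    edges = {frozenset((u, v)) for u, nbrs in adj.items() for v in nbrs}
    return sum(f(*sorted(s[w] for w in e)) for e in edges)

# Path graph P4: 1 - 2 - 3 - 4
p4 = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(neighbour_degree_sums(p4))  # {1: 2, 2: 3, 3: 3, 4: 2}
print(nd_index(p4))               # (2+3) + (3+3) + (3+2) = 16
```

Swapping the lambda for other functions of (S(u), S(v)) reproduces different members of this index family.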
Acadlore Transactions on Applied Mathematics and Statistics, 2024, Volume 2, Issue 1: Complex Polytopic Fuzzy Model and Their Induced Aggregation Operators
https://www.acadlore.com/article/ATAMS/2024_2_1/atams020104

Inducing variables are the parameters or conditions that influence the membership value of an element in a fuzzy set. These variables are often linguistic in nature and represent qualitative aspects of the problem. Thus, the objective of this paper is to introduce some aggregation operators based on inducing variables, such as the induced complex Polytopic fuzzy ordered weighted averaging aggregation operator (I-CPoFOWAAO) and the induced complex Polytopic fuzzy hybrid averaging aggregation operator (I-CPoFHAAO). Induced aggregation operators in the decision-making process are indispensable tools for managing uncertainty, integrating multiple criteria, facilitating consensus, and providing a formal and flexible framework for modeling and solving complex decision problems. At the end of the paper, an illustrative example is presented to demonstrate the ability and efficiency of the novel proposed aggregation operators.

Authors: Khaista Rahman, Jan Muhammad | doi: 10.56578/atams020104 | Published: 02-03-2024
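An induced ordered weighted averaging operator reorders its arguments by an auxiliary inducing variable rather than by the arguments' own magnitude, then applies position weights (Yager and Filev's I-OWA). The operators above apply this idea to complex Polytopic fuzzy values; a minimal real-valued sketch, with made-up weights and (inducing value, argument) pairs:

```python
def induced_owa(weights, pairs):
    """pairs: (inducing_value, argument); weights sum to 1.
    Arguments are reordered by descending inducing value, then
    combined with the position weights."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = [a for _, a in sorted(pairs, key=lambda p: p[0], reverse=True)]
    return sum(w * a for w, a in zip(weights, ordered))

# Inducing values 0.9 > 0.5 > 0.2 put the arguments in order 10, 20, 30,
# so the result is 0.5*10 + 0.3*20 + 0.2*30 = 17.
print(induced_owa([0.5, 0.3, 0.2], [(0.9, 10), (0.2, 30), (0.5, 20)]))
```

The papers' I-CPoFOWAAO/I-CPoFHAAO operators follow the same reorder-then-weight pattern, but with complex Polytopic fuzzy arithmetic in place of real multiplication and addition.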
Acadlore Transactions on Applied Mathematics and Statistics, 2024, Volume 2, Issue 1: Exploring Novel Topological Descriptors: Geometric-Harmonic and Harmonic-Geometric Descriptors for HAC and HAP Conjugates
https://www.acadlore.com/article/ATAMS/2024_2_1/atams020103

In this investigation, the exact formulas for geometric-harmonic (GH), neighborhood geometric-harmonic (NGH), harmonic-geometric (HG), and neighborhood harmonic-geometric (NHG) indices were systematically evaluated for hyaluronic acid-curcumin (HAC) and hyaluronic acid-paclitaxel (HAP) conjugates. Through this evaluation, a comprehensive quantitative assessment was conducted to elucidate the structural characteristics of these conjugates, highlighting the intricate geometric and harmonic relationships present within their molecular graphs. The study leveraged these indices to illuminate the complex interplay between geometric and harmonic properties, providing a novel perspective on the molecular architecture of HAC and HAP conjugates. This analytical approach not only sheds light on the structural nuances of these compounds but also offers a unique lens through which their potential in drug delivery applications can be assessed. Graphical analyses of the results further enhance the understanding of these molecular properties, presenting a detailed visualization that complements the quantitative findings. The integration of these topological descriptors into the study of HAC and HAP conjugates represents a significant advance in the field of medicinal chemistry, offering valuable insights for researchers engaged in the development of innovative drug delivery systems. The findings underscore the utility of these descriptors in characterizing the molecular topology of complex conjugates, setting the stage for further exploration of their applications in therapeutic contexts.

Authors: Ali Raza, Fikadu Tesgera Tolasa | doi: 10.56578/atams020103 | Published: 02-02-2024
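The GH/HG descriptor families above pair the geometric mean √(d_u·d_v) with the harmonic mean 2·d_u·d_v/(d_u+d_v) of endpoint degrees over the edges of a molecular graph. The published index formulas are given in the paper itself; the sketch below only computes an illustrative edge-wise sum of geometric-to-harmonic mean ratios on a toy graph, so both the combining form and the graph are assumptions.

```python
from math import sqrt

def gh_style_index(edges, deg):
    """Illustrative sum over edges of (geometric mean) / (harmonic mean)
    of the endpoint degrees; the published GH/HG formulas may differ."""
    total = 0.0
    for u, v in edges:
        gm = sqrt(deg[u] * deg[v])                  # geometric mean
        hm = 2 * deg[u] * deg[v] / (deg[u] + deg[v])  # harmonic mean
        total += gm / hm
    return total

# Star K_{1,3}: centre 0 joined to leaves 1, 2, 3.
edges = [(0, 1), (0, 2), (0, 3)]
deg = {0: 3, 1: 1, 2: 1, 3: 1}
# Each edge contributes sqrt(3) / 1.5, so the total is 3*sqrt(3)/1.5.
print(gh_style_index(edges, deg))
```

The "neighborhood" variants (NGH, NHG) replace vertex degrees with neighbour-degree sums, leaving the edge-wise structure of the computation unchanged.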
Acadlore Transactions on Applied Mathematics and Statistics, 2024, Volume 2, Issue 1: Picture Fuzzy Linear Programming Problems
https://www.acadlore.com/article/ATAMS/2024_2_1/atams020102

This study introduces an advanced framework for picture fuzzy linear programming problems (PFLPP), deploying picture fuzzy numbers (PFNs) to articulate diverse parameters. Integral to this approach are the three cardinal membership functions: membership, neutral, and non-membership, each contributing distinctly to the formation of the PFLPP. Emphasis is placed on employing these degrees to formulate the PFLPP in its most unadulterated form. Furthermore, the research delineates a novel optimization model, tailored specifically for the resolution of the PFLPP. A meticulous case study, accompanied by a numerical example, is presented, demonstrating the efficacy and robustness of the proposed methodology. The study culminates in a comprehensive discussion of the findings, highlighting pivotal insights and delineating potential avenues for future inquiry. This exploration not only advances the theoretical underpinnings of picture fuzzy sets but also offers practical implications for the application of linear programming in complex decision-making scenarios.

Authors: Chiranjibe Jana, Zdravko Nunić | doi: 10.56578/atams020102 | Published: 01-24-2024
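A picture fuzzy number carries the three degrees named above: membership μ, neutral η, and non-membership ν, constrained by μ + η + ν ≤ 1 (the remainder is the refusal degree). The sketch below validates a PFN and applies one commonly used score function, s = μ − ν; the paper's own optimization model and ranking may use a different score.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PFN:
    mu: float   # membership degree
    eta: float  # neutral membership degree
    nu: float   # non-membership degree

    def __post_init__(self):
        if not all(0.0 <= d <= 1.0 for d in (self.mu, self.eta, self.nu)):
            raise ValueError("degrees must lie in [0, 1]")
        if self.mu + self.eta + self.nu > 1.0 + 1e-12:
            raise ValueError("mu + eta + nu must not exceed 1")

    @property
    def refusal(self):
        return 1.0 - (self.mu + self.eta + self.nu)

    def score(self):
        # One common PFN score function; other definitions also subtract eta.
        return self.mu - self.nu

a = PFN(0.5, 0.2, 0.2)
print(a.score(), round(a.refusal, 2))  # 0.3 0.1
```

In a PFLPP, coefficients and right-hand sides become PFNs like `a`, and a score function of this kind is one way to reduce the fuzzy program to a crisp one.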
Acadlore Transactions on Applied Mathematics and Statistics, 2024, Volume 2, Issue 1: Complex Intuitionistic Hesitant Fuzzy Aggregation Information and Their Application in Decision Making Problems
https://www.acadlore.com/article/ATAMS/2024_2_1/atams020101

In the realm of decision-making, the delineation of uncertainty and ambiguity within data is a pivotal challenge. This study introduces a novel approach through complex intuitionistic hesitant fuzzy sets (CIHFS), which offer a unique multidimensional perspective for data analysis. The CIHFS framework is predicated on the concept that membership degrees reside within the unit disc of the complex plane, thereby providing a more nuanced representation of data. This method stands apart in its ability to simultaneously process and analyze data in a two-dimensional format, incorporating additional descriptive elements known as phase terms into the membership degrees. The study is bifurcated into two primary phases. Initially, a possibility degree measure is proposed, facilitating the ranking of numerical values within the CIHFS context. Subsequently, the development of innovative operational rules and aggregation operators (AOs) is undertaken. These AOs are instrumental in amalgamating diverse options within a CIHFS framework. The research dissects and deliberates on various AOs, including weighted average (WA), ordered weighted average (OWA), weighted geometric (WG), ordered weighted geometric (OWG), hybrid average (HA), and hybrid geometric (HG). Furthermore, the study extends to the realm of multi-criteria decision making (MCDM), where it proposes a methodology utilizing complex intuitionistic hesitant fuzzy information. This methodology emphasizes the objective management of weights, thereby enhancing the decision-making process. The study's findings hold significant implications for the optimization of resources and decision-making strategies, providing a robust framework for the application of CIHFS in practical scenarios.

Authors: Muhammad Ahmed, Shahzaib Ashraf, Daoud Suleiman Mashat | doi: 10.56578/atams020101 | Published: 01-15-2024
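In the complex-valued setting above, each membership grade lives in the closed unit disc of the complex plane and is conventionally written r·e^{2πiω}, with amplitude r ∈ [0, 1] and phase term ω ∈ [0, 1] carrying the second dimension of information. The sketch below builds such grades and checks the unit-disc constraint; the paper's possibility degree measure and AOs are not reproduced here, so the validation and example values are illustrative assumptions.

```python
import cmath

def complex_grade(r, omega):
    """Membership grade r * exp(2*pi*i*omega), with amplitude r and
    phase term omega both restricted to [0, 1]."""
    if not (0.0 <= r <= 1.0 and 0.0 <= omega <= 1.0):
        raise ValueError("amplitude and phase term must lie in [0, 1]")
    return r * cmath.exp(2j * cmath.pi * omega)

def in_unit_disc(z, tol=1e-12):
    # The CIHFS constraint: every grade stays inside the closed unit disc.
    return abs(z) <= 1.0 + tol

g = complex_grade(0.8, 0.25)  # amplitude 0.8, quarter-turn phase
print(in_unit_disc(g), abs(g))
```

A hesitant element is then simply a finite set of such grades, and aggregation operators combine the amplitude and phase components separately.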
Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 3: Stochastic Dynamics and Extinction Time in SIR Epidemiological Models
https://www.acadlore.com/article/ATAMS/2023_1_3/atams010305

In the realm of epidemiological modeling, the intricacies of epidemic dynamics are elucidated through the lens of compartmental models, with the SIR (Susceptible-Infectious-Recovered) and its variant, the SIS (Susceptible-Infectious-Susceptible) model, being pivotal. This investigation delves into both deterministic and stochastic frameworks, casting the SIR model as a continuous-time Markov chain (CTMC) in stochastic settings. Such an approach facilitates simulations via Gillespie's algorithm and integration of stochastic differential equations. The latter are formulated through a bivariate Fokker-Planck equation, originating from the continuous limit of the master equation. A focal point of this study is the distribution of extinction time, specifically, the duration until recovery in a population with an initial count of infected individuals. This distribution adheres to a Gumbel distribution, viewed through the prism of a birth and death process. The stochastic analysis reveals several insights: firstly, the SIR model as a CTMC encapsulates random fluctuations in epidemic dynamics. Secondly, stochastic simulation methods, either through Gillespie's algorithm or stochastic differential equations, offer a robust exploration of disease spread variability. Thirdly, the precision of modeling is enhanced by the incorporation of a bivariate Fokker-Planck equation. Fourthly, understanding the Gumbel distribution of extinction time is crucial for gauging recovery periods. Lastly, the non-linear nature of the SIR model, when analyzed stochastically, enriches the comprehension of epidemic dynamics. These findings bear significant implications for epidemic mitigation and recovery strategies, informing healthcare resource planning, vaccine deployment optimization, implementation of social distancing measures, public communication strategies, and swift responses to epidemic resurgences.

Authors: Rachid El Chaal, Said Bouchefra, Moulay Othman Aboutafail | doi: 10.56578/atams010305 | Published: 12-30-2023
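The CTMC formulation above can be simulated directly with Gillespie's algorithm: in state (S, I), infection fires at rate βSI/N and recovery at rate γI, waiting times are exponential in the total rate, and the extinction time is the first time I hits zero. A minimal sketch with assumed parameters β, γ, N (not taken from the paper):

```python
import random

def gillespie_sir(n=200, i0=5, beta=0.4, gamma=0.2, seed=42):
    """Simulate one stochastic SIR path until extinction (I == 0).
    Returns the extinction time and the final (S, I, R) state."""
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    t = 0.0
    while i > 0:
        rate_inf = beta * s * i / n   # infection: (S, I) -> (S-1, I+1)
        rate_rec = gamma * i          # recovery:  (I, R) -> (I-1, R+1)
        total = rate_inf + rate_rec
        t += rng.expovariate(total)   # exponential waiting time
        if rng.random() < rate_inf / total:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return t, (s, i, r)

t_ext, (s, i, r) = gillespie_sir()
print(f"extinct at t={t_ext:.1f}, final state S={s}, I={i}, R={r}")
```

Running this over many seeds yields an empirical sample of extinction times, against which the Gumbel fit discussed in the abstract could be checked.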
Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 3: Enhanced Industrial Control System of Decision-Making Using Spherical Hesitant Fuzzy Soft Yager Aggregation Information
https://www.acadlore.com/article/ATAMS/2023_1_3/atams010304

In the realm of emergency response, where time and information constraints are paramount, and scenarios often involve high levels of toxicity and uncertainty, the effective management of industrial control systems (ICS) is critical. This study introduces novel methodologies for enhancing decision-making processes in emergency situations, specifically focusing on ICS-security. Central to this research is the employment of the spherical hesitant fuzzy soft set (SHFSS), a concept that thrives in the presence of ambiguity and incomplete information. The research adopts and extends the parametric families of t-norms and t-conorms, as introduced by Yager, to analyze these sets. This approach is instrumental in addressing multi-attribute decision-making (MADM) problems within the ICS-security domain. To this end, four distinct aggregation operators (AOs) are proposed: spherical hesitant fuzzy soft Yager weighted averaging aggregation, spherical hesitant fuzzy soft Yager ordered weighted averaging aggregation, spherical hesitant fuzzy soft Yager weighted geometric aggregation, and spherical hesitant fuzzy soft Yager ordered weighted geometric aggregation. These operators are tailored to harness the operational benefits of Yager's parametric families, thereby offering a robust framework for dealing with decision-making problems under uncertainty. Further, an algorithm specifically designed for MADM is presented, which integrates these AOs. The efficacy and precision of the proposed methodology are demonstrated through a numerical example, applied in the context of an ICS security supplier. This example serves as a testament to the superiority of the approach in handling complex decision-making scenarios inherent in ICS-security management.

Authors: Razia Choudhary, Shabana Ashraf, Jafar Anafi | doi: 10.56578/atams010304 | Published: 12-12-2023
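Yager's parametric families, on which the four aggregation operators above are built, are the t-norm T_p(a, b) = max(0, 1 − ((1−a)^p + (1−b)^p)^{1/p}) and the t-conorm S_p(a, b) = min(1, (a^p + b^p)^{1/p}), where p > 0 is the Yager parameter. A direct sketch on plain membership values (the spherical hesitant fuzzy soft extension applies these component-wise):

```python
def yager_t_norm(a, b, p=2.0):
    # T_p(a, b) = max(0, 1 - ((1-a)^p + (1-b)^p)^(1/p))
    return max(0.0, 1.0 - ((1 - a) ** p + (1 - b) ** p) ** (1 / p))

def yager_t_conorm(a, b, p=2.0):
    # S_p(a, b) = min(1, (a^p + b^p)^(1/p))
    return min(1.0, (a ** p + b ** p) ** (1 / p))

# With p = 2: (1-0.7)^2 + (1-0.6)^2 = 0.25, so T = 1 - 0.5 = 0.5,
# and 0.3^2 + 0.4^2 = 0.25, so S = 0.5.
print(yager_t_norm(0.7, 0.6), yager_t_conorm(0.3, 0.4))
```

At p = 1 the family reduces to the Lukasiewicz t-norm/t-conorm, and as p grows it approaches min/max, which is what makes the parameter useful for tuning aggregation behaviour.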
Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 3: Optimizing Military Decision-Making: Application of the FUCOM–EWAA–COPRAS-G MCDM Model
https://www.acadlore.com/article/ATAMS/2023_1_3/atams010303

In contemporary military contexts, the determination of an optimal course of action (COA) in combat operations emerges as a critical challenge. This study delineates a decision support methodology for military applications, employing sophisticated decision analysis techniques. The initial phase entails the identification of pivotal criteria for assessing and ranking COAs, followed by the assignment of weight coefficients to each criterion via the full consistency method (FUCOM). Subsequently, the Einstein weighted arithmetic average operator (EWAA) was utilized for the aggregation of expert opinions, ensuring a consensual evaluation of these criteria and culminating in the final values of their weight coefficients. The ensuing phase focuses on the selection of an optimal COA, incorporating the grey complex proportional assessment (COPRAS-G) method. This method addresses uncertainties and varying criterion values. Expert ratings were again aggregated using the EWAA operator. The findings from this phase are designed to provide military commanders with precise, data-driven guidance for decision-making. To validate and verify the stability of the proposed model, a series of tests were conducted, including a rank reversal test, sensitivity analysis regarding changes in weight coefficients, and a comparative analysis with alternative methods. These assessments uniformly indicated the model's consistency, stability, and validity as a military decision support tool. Emphasizing a high degree of confidence in COA selection, the methodology advocated herein is applicable to decision-making processes in the planning and execution of military operations. The uniform application of professional terms, consistent with the broader context of this research, ensures clarity and coherence in its presentation. The approach outlined in this study stands as a testament to rigorous analytical methodologies in the realm of military strategic planning, offering a robust framework for decision-making under conditions of uncertainty and complexity.

Authors: Duško Tešić, Darko Božanić | doi: 10.56578/atams010303 | Published: 11-29-2023
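Under full consistency, FUCOM's weight coefficients follow directly from the comparative priorities of consecutively ranked criteria: if φ_k = w_k / w_{k+1}, setting the last weight to 1, chaining the ratios, and normalising yields the weight vector. The complete method also checks transitivity (w_k / w_{k+2} = φ_k · φ_{k+1}) and minimises the deviation χ; the sketch below assumes full consistency (χ = 0), with made-up ratios:

```python
def fucom_weights(ratios):
    """ratios[k] = phi(k/(k+1)) = w_k / w_(k+1) for consecutively ranked
    criteria (best first); returns normalised weights, best criterion first.
    Assumes full consistency (deviation chi = 0)."""
    raw = [1.0]                      # provisional weight of the last criterion
    for phi in reversed(ratios):
        raw.append(raw[-1] * phi)    # w_k = phi * w_(k+1)
    raw.reverse()
    total = sum(raw)
    return [w / total for w in raw]

w = fucom_weights([2.0, 1.5])  # w1/w2 = 2, w2/w3 = 1.5
print([round(x, 4) for x in w])  # [0.5455, 0.2727, 0.1818]
```

These weights would then feed the COPRAS-G ranking stage, with the EWAA operator aggregating the experts' individual ratio judgments beforehand.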
https://www.acadlore.com/article/ATAMS/2023_1_3/atams010302
In the realm of financial markets, the manifestation of volatility clustering serves as a pivotal element, indicative of the inherent fluctuations characterizing financial instruments. This attribute acquires pronounced relevance within the sphere of cryptocurrencies, a sector renowned for its elevated risk profile. The present analysis, conducted through the Autoregressive Moving Average - Generalized Autoregressive Conditional Heteroskedasticity (ARMA-GARCH) model, seeks to elucidate the enduring nature of volatility clustering and the occurrence of leverage effects within this domain. Over the course of a four-year time frame, it was observed that Bitcoin diverges from the anticipated Autoregressive Conditional Heteroskedasticity (ARCH) effects, in contrast to Ethereum and Cardano, which exhibit marked volatility clustering. Binance Coin, Ripple, and Dogecoin, whilst demonstrating moderate clustering, uniformly reflect the existence of leverage effects. An exception to this pattern was identified in Ripple, where it was discerned that positive market news exerts a disproportionate influence on log returns. The findings of this study illuminate the critical influence of both leverage effects and volatility clustering on the pricing dynamics of cryptocurrencies. It underscores the imperative for a nuanced comprehension of risk management in the context of cryptocurrency investments, given their susceptibility to abrupt price fluctuations. The distinct degrees to which these phenomena are manifested across diverse cryptocurrencies accentuate the necessity for a tailored risk management approach, resonant with the unique attributes of the asset in question. Such strategies, accounting for the potential amplification of losses through leverage, may encompass prudent position sizing, portfolio diversification, and the implementation of stress tests, thereby fortifying the investment against the dual perils of volatility clustering and leverage effects. 
The implications of this analysis serve to inform investors, providing a foundation upon which to construct risk management tactics that are responsive to the idiosyncrasies of the cryptocurrency market.11-27-2023<![CDATA[ In the realm of financial markets, the manifestation of volatility clustering serves as a pivotal element, indicative of the inherent fluctuations characterizing financial instruments. This attribute acquires pronounced relevance within the sphere of cryptocurrencies, a sector renowned for its elevated risk profile. The present analysis, conducted through the Autoregressive Moving Average - Generalized Autoregressive Conditional Heteroskedasticity (ARMA-GARCH) model, seeks to elucidate the enduring nature of volatility clustering and the occurrence of leverage effects within this domain. Over the course of a four-year time frame, it was observed that Bitcoin diverges from the anticipated Autoregressive Conditional Heteroskedasticity (ARCH) effects, in contrast to Ethereum and Cardano, which exhibit marked volatility clustering. Binance Coin, Ripple, and Dogecoin, whilst demonstrating moderate clustering, uniformly reflect the existence of leverage effects. An exception to this pattern was identified in Ripple, where it was discerned that positive market news exerts a disproportionate influence on log returns. The findings of this study illuminate the critical influence of both leverage effects and volatility clustering on the pricing dynamics of cryptocurrencies. It underscores the imperative for a nuanced comprehension of risk management in the context of cryptocurrency investments, given their susceptibility to abrupt price fluctuations. The distinct degrees to which these phenomena are manifested across diverse cryptocurrencies accentuate the necessity for a tailored risk management approach, resonant with the unique attributes of the asset in question. 
Such strategies, accounting for the potential amplification of losses through leverage, may encompass prudent position sizing, portfolio diversification, and the implementation of stress tests, thereby fortifying the investment against the dual perils of volatility clustering and leverage effects. The implications of this analysis serve to inform investors, providing a foundation upon which to construct risk management tactics that are responsive to the idiosyncrasies of the cryptocurrency market. ]]>The Cryptocurrency Market Through the Scope of Volatility Clustering and Leverage Effectsfilip peovskivioleta cvetkoskaigor ivanovskidoi: 10.56578/atams010302Acadlore Transactions on Applied Mathematics and Statistics11-27-2023Acadlore Transactions on Applied Mathematics and Statistics11-27-2023202313Article13010.56578/atams010302https://www.acadlore.com/article/ATAMS/2023_1_3/atams010302Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 3, Pages undefined: Risk Spillovers and Hedging in the Chinese Stock Market: An Asymmetric VAR-BEKK-AGARCH Analysis
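The volatility clustering and ARCH effects discussed in the cryptocurrency abstract above can be made concrete with a minimal, self-contained sketch. This is not the authors' ARMA-GARCH estimation code: it simulates a plain GARCH(1,1) process with illustrative parameter values, showing mechanically how calm and turbulent spells persist.

```python
import math
import random

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85, seed=7):
    """Simulate r_t = sigma_t * z_t with z_t ~ N(0, 1) and
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns, variances = [], []
    for _ in range(n):
        r = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(r)
        variances.append(var)
        var = omega + alpha * r * r + beta * var  # next conditional variance
    return returns, variances

def acf1(xs):
    """Lag-1 autocorrelation; clearly positive for the *squared* returns of a
    GARCH process, which is the statistical signature of volatility clustering."""
    m = sum(xs) / len(xs)
    num = sum((a - m) * (b - m) for a, b in zip(xs, xs[1:]))
    den = sum((a - m) ** 2 for a in xs)
    return num / den
```

With alpha + beta near 1, raw returns stay roughly uncorrelated while squared returns show positive lag-1 autocorrelation — the pattern the ARCH tests in the study probe for.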
https://www.acadlore.com/article/ATAMS/2023_1_3/atams010301
In the present investigation, the phenomena of multi-scale volatility spillovers and dynamic hedging within the Chinese stock market are scrutinized, with particular emphasis on the implications of structural breaks. The decomposition of the returns from the CSI 300 and Hang Seng Index’s spot and futures is achieved through the application of the Maximum Overlap Discrete Wavelet Transform (MODWT), categorizing the data into three distinct temporal scales: short-term, medium-term, and long-term. An enhancement upon the conventional VAR-BEKK-GARCH (Vector Autoregressive - Baba, Engle, Kraft, and Kroner - Generalized Autoregressive Conditional Heteroskedasticity) model is proposed, yielding the asymmetric VAR-BEKK-GARCH Model (VAR-BEKK-AGARCH), which adeptly integrates the structural break of return volatility. A comprehensive analysis is conducted to elucidate the interactions and spillovers between the CSI 300 and Hang Seng Index, as well as their respective spot and futures markets, across the various identified time scales. Concurrently, a dynamic hedging portfolio, composed of index spot and futures, is meticulously constructed, with its performance rigorously evaluated under the influence of the different time scales. To ensure the robustness and validity of the findings, wavelet coherence and phase difference methodologies are employed as verification tools. The results unequivocally reveal a heterogeneity in the behavior of mean spillover, volatility spillover, and asymmetric spillovers between the spot and futures markets of the CSI 300 and Hang Seng Index across the diverse scales. The inclusion of a structural break in the dynamic hedge portfolio is demonstrated to confer a marked advantage over its counterpart that omits this critical factor. Particularly in the short and medium-term scenarios, the dynamically hedged portfolio, enriched by the consideration of the structural break, exhibits superior performance in comparison to the static hedge portfolio. 
Additionally, it is discerned that the CSI 300 Index and Hang Seng Index, along with their spot and futures components, predominantly manifest in synchrony, with no clear indication of a consistent lead-lag relationship. An intensification of correlation is observed in the long-term analysis, underscoring the utility of the spot and futures of the two indices as efficacious hedging tools.11-22-2023<![CDATA[ In the present investigation, the phenomena of multi-scale volatility spillovers and dynamic hedging within the Chinese stock market are scrutinized, with particular emphasis on the implications of structural breaks. The decomposition of the returns from the CSI 300 and Hang Seng Index’s spot and futures is achieved through the application of the Maximum Overlap Discrete Wavelet Transform (MODWT), categorizing the data into three distinct temporal scales: short-term, medium-term, and long-term. An enhancement upon the conventional VAR-BEKK-GARCH (Vector Autoregressive - Baba, Engle, Kraft, and Kroner - Generalized Autoregressive Conditional Heteroskedasticity) model is proposed, yielding the asymmetric VAR-BEKK-GARCH Model (VAR-BEKK-AGARCH), which adeptly integrates the structural break of return volatility. A comprehensive analysis is conducted to elucidate the interactions and spillovers between the CSI 300 and Hang Seng Index, as well as their respective spot and futures markets, across the various identified time scales. Concurrently, a dynamic hedging portfolio, composed of index spot and futures, is meticulously constructed, with its performance rigorously evaluated under the influence of the different time scales. To ensure the robustness and validity of the findings, wavelet coherence and phase difference methodologies are employed as verification tools. 
The results unequivocally reveal a heterogeneity in the behavior of mean spillover, volatility spillover, and asymmetric spillovers between the spot and futures markets of the CSI 300 and Hang Seng Index across the diverse scales. The inclusion of a structural break in the dynamic hedge portfolio is demonstrated to confer a marked advantage over its counterpart that omits this critical factor. Particularly in the short and medium-term scenarios, the dynamically hedged portfolio, enriched by the consideration of the structural break, exhibits superior performance in comparison to the static hedge portfolio. Additionally, it is discerned that the CSI 300 Index and Hang Seng Index, along with their spot and futures components, predominantly manifest in synchrony, with no clear indication of a consistent lead-lag relationship. An intensification of correlation is observed in the long-term analysis, underscoring the utility of the spot and futures of the two indices as efficacious hedging tools. ]]>Risk Spillovers and Hedging in the Chinese Stock Market: An Asymmetric VAR-BEKK-AGARCH Analysisjia wangxun huangxu wangdoi: 10.56578/atams010301Acadlore Transactions on Applied Mathematics and Statistics11-22-2023Acadlore Transactions on Applied Mathematics and Statistics11-22-2023202313Article11110.56578/atams010301https://www.acadlore.com/article/ATAMS/2023_1_3/atams010301Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 2, Pages undefined: Analyzing Sensitivity and Solitonic Behavior Using the Dullin-Gottwald-Holm Model in Shallow Water Waves
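The multi-scale decomposition step described in the abstract above can be illustrated with a simplified Haar-filter pyramid in the spirit of the MODWT (undecimated, circular boundary). This is a sketch under stated assumptions, not the paper's MODWT implementation; the Haar filter choice and circular boundary handling are illustrative.

```python
def modwt_haar(x, levels):
    """Additive multi-scale decomposition with Haar-type MODWT filters
    {1/2, 1/2} (scaling) and {1/2, -1/2} (wavelet), applied undecimated
    with a lag of 2**(j-1) at level j and circular boundary handling."""
    details = []
    smooth = list(x)
    n = len(x)
    for j in range(1, levels + 1):
        lag = 2 ** (j - 1)
        d = [(smooth[t] - smooth[(t - lag) % n]) / 2.0 for t in range(n)]
        s = [(smooth[t] + smooth[(t - lag) % n]) / 2.0 for t in range(n)]
        details.append(d)   # detail (higher-frequency) series at scale j
        smooth = s          # smooth series carried to the next level
    return details, smooth
```

By construction the input series is recovered exactly as the final smooth plus the sum of the details at every time point, which is the scale-wise additivity a multi-scale spillover analysis relies on.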
https://www.acadlore.com/article/ATAMS/2023_1_2/atams010205
This paper presents an investigation of traveling wave solutions and a sensitivity analysis for the unidirectional Dullin-Gottwald-Holm ($DGH$) system, a well-established model for wave propagation in shallow water. We apply the novel auxiliary equation method, a unique integration norm, to extract various soliton solutions, including kink, rational, bright, singular, and bright-singular solutions. Precise explicit solutions of the resultant ordinary differential equations are demonstrated using suitable parametric values. Furthermore, we explore the conditions that ensure the existence of these solutions. By applying the Galilean transformation, we convert the model into a planar dynamical system and evaluate its sensitivity performance. The selection of appropriate parameters enables the generation of two and three-dimensional sketches, as well as contour plots for each solution.09-29-2023<![CDATA[ This paper presents an investigation of traveling wave solutions and a sensitivity analysis for the unidirectional Dullin-Gottwald-Holm ($DGH$) system, a well-established model for wave propagation in shallow water. We apply the novel auxiliary equation method, a unique integration norm, to extract various soliton solutions, including kink, rational, bright, singular, and bright-singular solutions. Precise explicit solutions of the resultant ordinary differential equations are demonstrated using suitable parametric values. Furthermore, we explore the conditions that ensure the existence of these solutions. By applying the Galilean transformation, we convert the model into a planar dynamical system and evaluate its sensitivity performance. The selection of appropriate parameters enables the generation of two and three-dimensional sketches, as well as contour plots for each solution. 
]]>Analyzing Sensitivity and Solitonic Behavior Using the Dullin-Gottwald-Holm Model in Shallow Water Wavesshaheera haroonmuhammad abuzarmuhammad faisal khandoi: 10.56578/atams010205Acadlore Transactions on Applied Mathematics and Statistics09-29-2023Acadlore Transactions on Applied Mathematics and Statistics09-29-2023202312Article9610.56578/atams010205https://www.acadlore.com/article/ATAMS/2023_1_2/atams010205Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 2, Pages undefined: Optimal Vehicle Routing in Consumer Goods Distribution: A GNU Linear Programming Kit-Based Analysis
https://www.acadlore.com/article/ATAMS/2023_1_2/atams010204
In businesses entailing the distribution of goods, the vehicle routing problem (VRP) critically influences the minimization of distribution costs and the curtailment of excessive vehicle utilization. This study delves into the formulation of the VRP within a firm specializing in the distribution of appliances and consumer goods, emphasizing the firm's unique operational characteristics. A mathematical model addressing the vehicle routing issue is meticulously crafted and subsequently resolved, yielding exact solutions through the application of the GNU Linear Programming Kit (GLPK). Comparative insights into the pre-existing and newly devised routing methodologies within the firm are elucidated. Owing to the dynamism in customer demands and daily deliveries, the propounded model has been designed for facile adaptability and frequent utilization. It demonstrates a marked enhancement over the conventional routing paradigms prevalent within the company. Recognizing potential avenues for advancement, considerations such as multi-warehouse integration and the introduction of customer-specific time windows, wherein goods must be dispatched within stipulated intervals, are acknowledged as prospects for future research and implementation.09-21-2023<![CDATA[ In businesses entailing the distribution of goods, the vehicle routing problem (VRP) critically influences the minimization of distribution costs and the curtailment of excessive vehicle utilization. This study delves into the formulation of the VRP within a firm specializing in the distribution of appliances and consumer goods, emphasizing the firm's unique operational characteristics. A mathematical model addressing the vehicle routing issue is meticulously crafted and subsequently resolved, yielding exact solutions through the application of the GNU Linear Programming Kit (GLPK). Comparative insights into the pre-existing and newly devised routing methodologies within the firm are elucidated. 
Owing to the dynamism in customer demands and daily deliveries, the propounded model has been designed for facile adaptability and frequent utilization. It demonstrates a marked enhancement over the conventional routing paradigms prevalent within the company. Recognizing potential avenues for advancement, considerations such as multi-warehouse integration and the introduction of customer-specific time windows, wherein goods must be dispatched within stipulated intervals, are acknowledged as prospects for future research and implementation. ]]>Optimal Vehicle Routing in Consumer Goods Distribution: A GNU Linear Programming Kit-Based Analysisuroš dedovićbisera andrić gušavacdoi: 10.56578/atams010204Acadlore Transactions on Applied Mathematics and Statistics09-21-2023Acadlore Transactions on Applied Mathematics and Statistics09-21-2023202312Article8710.56578/atams010204https://www.acadlore.com/article/ATAMS/2023_1_2/atams010204Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 2, Pages undefined: Demographic Influences on Indigenous Knowledge Practices in Chief Albert Luthuli Municipality
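The exact-solution idea behind the vehicle routing abstract above can be sketched without GLPK itself: for a toy capacitated VRP, brute force over orderings and route cuts recovers the optimum. The distance matrix, demands, and capacity below are invented for illustration; a production model would be written in GLPK's MathProg language as the study describes.

```python
from itertools import permutations

def route_cost(route, dist):
    """Cost of depot (node 0) -> customers in the given order -> depot."""
    cost, prev = 0, 0
    for c in route:
        cost += dist[prev][c]
        prev = c
    return cost + dist[prev][0]

def solve_cvrp(dist, demand, capacity):
    """Exact brute force: every customer ordering, every way of cutting the
    ordering into consecutive routes; keep the cheapest capacity-feasible
    plan. Viable only for tiny instances -- real work belongs in a solver."""
    customers = list(range(1, len(dist)))
    best_cost, best_routes = float("inf"), None
    for perm in permutations(customers):
        for mask in range(1 << (len(perm) - 1)):  # bit i set = cut after position i
            routes, cur = [], [perm[0]]
            for i in range(1, len(perm)):
                if (mask >> (i - 1)) & 1:
                    routes.append(cur)
                    cur = [perm[i]]
                else:
                    cur.append(perm[i])
            routes.append(cur)
            if any(sum(demand[c] for c in r) > capacity for r in routes):
                continue
            cost = sum(route_cost(r, dist) for r in routes)
            if cost < best_cost:
                best_cost, best_routes = cost, routes
    return best_cost, best_routes
```

The enumeration grows factorially, which is exactly why the study delegates the real instances to an LP/MIP toolkit rather than search.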
https://www.acadlore.com/article/ATAMS/2023_1_2/atams010203
In recent years, a surge in studies concerning indigenous knowledge (IK) has been observed, yet a clear definition of IK remains elusive. Discrepancies in international studies lead to fluid interpretations of the concept. The present study seeks to delineate the key elements characterizing knowledge as either indigenous or foreign to a specific community. Through a meticulous exploration of definitions surrounding indigenous knowledge, it is posited that all knowledge forms can be considered indigenous within the communities of their origination. To elucidate this argument, the impact of community demographics on the adoption of knowledge perceived as indigenous within the Chief Albert Luthuli Municipality was investigated. Data were collected using structured interviews, involving a total of 398 respondents. Analyses were conducted employing a mixed-method approach, utilizing Microsoft Excel and the Statistical Package for Social Sciences (SPSS). Findings revealed a significant relationship between variables such as commonly spoken language, cultural attributes, age, and employment level with IK practices within communities. Furthermore, the economic factors, including employment status, education levels, and household income, were examined for their association with the adoption of IK practices. It was discerned that such variables were correlated with the adoption of IK practices, especially as alternative strategies in the absence of consistent household income. Key determinants like the language proficiency of the household head, employment status, educational attainment, family size, household income level, age, and gender of the household heads were analyzed. 
The influence of these determinants on household adoption of indigenous practices was assessed using inferential statistical methods, specifically probability and regression analysis.09-21-2023<![CDATA[ <p>In recent years, a surge in studies concerning indigenous knowledge (IK) has been observed, yet a clear definition of IK remains elusive. Discrepancies in international studies lead to fluid interpretations of the concept. The present study seeks to delineate the key elements characterizing knowledge as either indigenous or foreign to a specific community. Through a meticulous exploration of definitions surrounding indigenous knowledge, it is posited that all knowledge forms can be considered indigenous within the communities of their origination. To elucidate this argument, the impact of community demographics on the adoption of knowledge perceived as indigenous within the Chief Albert Luthuli Municipality was investigated. Data were collected using structured interviews, involving a total of 398 respondents. Analyses were conducted employing a mixed-method approach, utilizing Microsoft Excel and the Statistical Package for Social Sciences (SPSS). Findings revealed a significant relationship between variables such as commonly spoken language, cultural attributes, age, and employment level with IK practices within communities. Furthermore, the economic factors, including employment status, education levels, and household income, were examined for their association with the adoption of IK practices. It was discerned that such variables were correlated with the adoption of IK practices, especially as alternative strategies in the absence of consistent household income. Key determinants like the language proficiency of the household head, employment status, educational attainment, family size, household income level, age, and gender of the household heads were analyzed. 
The influence of these determinants on household adoption of indigenous practices was assessed using inferential statistical methods, specifically probability and regression analysis.</p> ]]>Demographic Influences on Indigenous Knowledge Practices in Chief Albert Luthuli Municipalitymatlhodi famomachate machatedoi: 10.56578/atams010203Acadlore Transactions on Applied Mathematics and Statistics09-21-2023Acadlore Transactions on Applied Mathematics and Statistics09-21-2023202312Article7710.56578/atams010203https://www.acadlore.com/article/ATAMS/2023_1_2/atams010203Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 2, Pages undefined: A Systemic Approach to Risk Management: Utilizing Decision Support Software Solutions for Enhanced Decision-Making
https://www.acadlore.com/article/ATAMS/2023_1_2/atams010202
The process of decision-making involves selecting the most suitable management action from a range of options, thereby guiding the system towards its management objectives. Within the complex decision-making environment, uncertainty prevails, giving rise to the domain of risk. Effective risk management entails various activities that are implemented during distinct phases of system management. To address this, a systemic approach to risk management is crucial, along with the adoption of software solutions for risk analysis. This study examines the systemic approach to risk management and proposes a potential solution for managing uncertainties and risks by employing software tools that are rooted in system quality. System quality encompasses the development of novel models, methods, tools, and procedures, whose consistent application ensures reliable outcomes based on the best available information. Consequently, this study explores the application of innovative software solutions that support the risk management process across all phases. Given that risk management relies on data, which may not offer a comprehensive view of the environment, decision-making can be regarded as a process of managing the conversion of data into information. The acquisition of new information regarding the system's state determines the approach to modify the system through the chosen decision. Information serves as the essence of the decision-making process, as quality information facilitates quality decisions. However, in an information space characterized by incomplete data, the quality of decisions diminishes. 
Software solutions capable of providing the necessary level of information quality, despite uncertainties and incompleteness, enable decision-making based on partial information while upholding a minimum standard of quality.08-20-2023<![CDATA[ <p>The process of decision-making involves selecting the most suitable management action from a range of options, thereby guiding the system towards its management objectives. Within the complex decision-making environment, uncertainty prevails, giving rise to the domain of risk. Effective risk management entails various activities that are implemented during distinct phases of system management. To address this, a systemic approach to risk management is crucial, along with the adoption of software solutions for risk analysis. This study examines the systemic approach to risk management and proposes a potential solution for managing uncertainties and risks by employing software tools that are rooted in system quality. System quality encompasses the development of novel models, methods, tools, and procedures, whose consistent application ensures reliable outcomes based on the best available information. Consequently, this study explores the application of innovative software solutions that support the risk management process across all phases. Given that risk management relies on data, which may not offer a comprehensive view of the environment, decision-making can be regarded as a process of managing the conversion of data into information. The acquisition of new information regarding the system's state determines the approach to modify the system through the chosen decision. Information serves as the essence of the decision-making process, as quality information facilitates quality decisions. However, in an information space characterized by incomplete data, the quality of decisions diminishes. 
Software solutions capable of providing the necessary level of information quality, despite uncertainties and incompleteness, enable decision-making based on partial information while upholding a minimum standard of quality.</p> ]]>A Systemic Approach to Risk Management: Utilizing Decision Support Software Solutions for Enhanced Decision-Makingnenad komazeckatarina jankovicdoi: 10.56578/atams010202Acadlore Transactions on Applied Mathematics and Statistics08-20-2023Acadlore Transactions on Applied Mathematics and Statistics08-20-2023202312Article6610.56578/atams010202https://www.acadlore.com/article/ATAMS/2023_1_2/atams010202Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 2, Pages undefined: An Empirical Analysis of Corporate Governance and Earnings Management Motives Influencing Goodwill Impairment in Chinese Manufacturing Firms
https://www.acadlore.com/article/ATAMS/2023_1_2/atams010201
Goodwill impairment, resulting from the impairment tests conducted on goodwill generated during business mergers, serves as an effective indicator of a company's true and reliable goodwill value, as well as its operational and financial conditions. This study investigates the impact of earnings management motivations on goodwill impairment from the perspective of corporate governance, focusing on Chinese manufacturing listed companies between 2016 and 2020. Utilizing regression analysis and panel data models, the study examines the internal governance mechanisms, including the combined shareholding ratio of the top ten shareholders, and the external governance mechanisms, such as the role of the four major auditing firms. The findings reveal that both "big bath" and earnings smoothing motives can influence companies' decisions to recognize goodwill impairment, while effective internal and external governance mechanisms can help mitigate earnings management motivations. Further analysis shows that non-state-owned manufacturing listed companies are more likely to exhibit goodwill impairment behaviors driven by earnings management motives. These findings provide valuable insights for listed companies seeking to improve their corporate governance structures and for Chinese capital market regulators aiming to enhance relevant regulatory policies and refine goodwill measurement standards.06-18-2023<![CDATA[ <p>Goodwill impairment, resulting from the impairment tests conducted on goodwill generated during business mergers, serves as an effective indicator of a company's true and reliable goodwill value, as well as its operational and financial conditions. This study investigates the impact of earnings management motivations on goodwill impairment from the perspective of corporate governance, focusing on Chinese manufacturing listed companies between 2016 and 2020. 
Utilizing regression analysis and panel data models, the study examines the internal governance mechanisms, including the combined shareholding ratio of the top ten shareholders, and the external governance mechanisms, such as the role of the four major auditing firms. The findings reveal that both "big bath" and earnings smoothing motives can influence companies' decisions to recognize goodwill impairment, while effective internal and external governance mechanisms can help mitigate earnings management motivations. Further analysis shows that non-state-owned manufacturing listed companies are more likely to exhibit goodwill impairment behaviors driven by earnings management motives. These findings provide valuable insights for listed companies seeking to improve their corporate governance structures and for Chinese capital market regulators aiming to enhance relevant regulatory policies and refine goodwill measurement standards.</p> ]]>An Empirical Analysis of Corporate Governance and Earnings Management Motives Influencing Goodwill Impairment in Chinese Manufacturing Firmscheng wangting hudoi: 10.56578/atams010201Acadlore Transactions on Applied Mathematics and Statistics06-18-2023Acadlore Transactions on Applied Mathematics and Statistics06-18-2023202312Article5110.56578/atams010201https://www.acadlore.com/article/ATAMS/2023_1_2/atams010201Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 1, Pages undefined: An Integrated Risk Management Model for Transporting Explosive Remnants of War: A Case Study in the Republic of Serbia
https://www.acadlore.com/article/ATAMS/2023_1_1/atams010105
Existing legal and by-law regulations prescribe risk management methodologies for various domains, such as the transportation of hazardous materials, fire and explosion protection, environmental protection, and protection against chemical accidents. However, there is a lack of comprehensive methodological guidance that unifies the management of all risks associated with the transportation of explosive remnants of war (ERW), which pose significant threats to human life, cultural assets, and the environment. Furthermore, the transportation of ERW often occurs along traffic corridors with compromised infrastructure, increasing the range of potential risks affecting the safety of people, their property, and critical infrastructure. This study presents an integrated risk management model for ERW transportation in the Republic of Serbia, developed based on current legal and by-law regulations, as well as modern criteria and risk assessment methodologies. By applying this model, the various risks associated with ERW transportation can be effectively mitigated, ensuring the safety and protection of people, assets, and the environment.06-13-2023<![CDATA[ <p>Existing legal and by-law regulations prescribe risk management methodologies for various domains, such as the transportation of hazardous materials, fire and explosion protection, environmental protection, and protection against chemical accidents. However, there is a lack of comprehensive methodological guidance that unifies the management of all risks associated with the transportation of explosive remnants of war (ERW), which pose significant threats to human life, cultural assets, and the environment. Furthermore, the transportation of ERW often occurs along traffic corridors with compromised infrastructure, increasing the range of potential risks affecting the safety of people, their property, and critical infrastructure. 
This study presents an integrated risk management model for ERW transportation in the Republic of Serbia, developed based on current legal and by-law regulations, as well as modern criteria and risk assessment methodologies. By applying this model, the various risks associated with ERW transportation can be effectively mitigated, ensuring the safety and protection of people, assets, and the environment.</p> ]]>An Integrated Risk Management Model for Transporting Explosive Remnants of War: A Case Study in the Republic of Serbianenad komazeckatarina jankovicdoi: 10.56578/atams010105Acadlore Transactions on Applied Mathematics and Statistics06-13-2023Acadlore Transactions on Applied Mathematics and Statistics06-13-2023202311Article4410.56578/atams010105https://www.acadlore.com/article/ATAMS/2023_1_1/atams010105Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 1, Pages undefined: Evaluating the Influence of the International Civil Aviation Organization on Aircraft Accident Rates and Fatalities: A Seven-Decade Historical Data Analysis
https://www.acadlore.com/article/ATAMS/2023_1_1/atams010104
The advent of air travel, originally proposed by the Wright brothers, has led to a significant surge in aircraft usage for human transportation. In its nascent stages, this mode of transport was linked with a high frequency of accidents and consequent fatalities, placing it in the high-risk category. To counter these risks, the International Civil Aviation Organization (ICAO) was established in 1947 as a collaborative effort among numerous countries with the primary goal of enhancing aviation safety regulations. This study analyzed archival data from the Bureau of Aircraft Accidents Archives (B3A), covering a span of 72 years from 1918, the year of the first commercial airplane crash, until 2020. The objective was to understand the ICAO's impact on altering accident rates, fatalities, and underlying causes. Analytical methodologies encompassed both descriptive statistics—examining data distribution, central tendencies, and category frequencies—and exploratory data analysis (EDA) to identify variable relationships and outliers. The results indicated that ICAO's interventions have led to a notable decline in accident rates, with an annual average reduction of 70.9%, and a corresponding decrease in incidents attributed to technical factors. However, an unexpected trend was the increase in fatalities despite the drop in accident numbers, attributable to the introduction of larger aircraft designs carrying more passengers per flight. The findings underscore the ICAO's successful efforts in reducing aircraft accidents, but also suggest a need for further exploration into factors contributing to the rise in fatalities.06-13-2023<![CDATA[ <p>The advent of air travel, originally proposed by the Wright brothers, has led to a significant surge in aircraft usage for human transportation. In its nascent stages, this mode of transport was linked with a high frequency of accidents and consequent fatalities, placing it in the high-risk category. 
To counter these risks, the International Civil Aviation Organization (ICAO) was established in 1947 as a collaborative effort among numerous countries with the primary goal of enhancing aviation safety regulations. This study analyzed archival data from the Bureau of Aircraft Accidents Archives (B3A), covering a span of 72 years from 1918, the year of the first commercial airplane crash, until 2020. The objective was to understand the ICAO's impact on altering accident rates, fatalities, and underlying causes. Analytical methodologies encompassed both descriptive statistics—examining data distribution, central tendencies, and category frequencies—and exploratory data analysis (EDA) to identify variable relationships and outliers. The results indicated that ICAO's interventions have led to a notable decline in accident rates, with an annual average reduction of 70.9%, and a corresponding decrease in incidents attributed to technical factors. However, an unexpected trend was the increase in fatalities despite the drop in accident numbers, attributable to the introduction of larger aircraft designs carrying more passengers per flight. 
The findings underscore the ICAO's successful efforts in reducing aircraft accidents, but also suggest a need for further exploration into factors contributing to the rise in fatalities.
Evaluating the Influence of the International Civil Aviation Organization on Aircraft Accident Rates and Fatalities: A Seven-Decade Historical Data Analysis
Authors: Rossi Passarella, Harumi Veny, Muhammad Fachrurrozi, Samsuryadi Samsuryadi, Marsella Vindriani
doi: 10.56578/atams010104 | Published: 06-13-2023 | https://www.acadlore.com/article/ATAMS/2023_1_1/atams010104
Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 1, Pages undefined: Dynamic Operational Strategies Incorporating Consumer Reference Price Effects and Enterprise Behavior: A Differential Game Approach
https://www.acadlore.com/article/ATAMS/2023_1_1/atams010103
The continuous evolution of consumer behavior in the modern era of consumption has prompted enterprises to explore the underlying behavioral factors of consumers and cater to their particular needs. Moreover, developing a rational operational behavior model and responding effectively to the dynamic market environment have become critical concerns for businesses. This study examines the impact of consumer reference price effects and enterprise short-sighted behavior on strategic selection and performance, employing differential game theory to construct a game model between manufacturers and retailers. Using Bellman's continuous dynamic programming theory, analytical solutions for various models are derived, followed by comparative analyses and numerical examples. The research reveals that: (1) manufacturers' behavior patterns are dominant, favoring far-sighted behavior, which not only enhances profits but also gives consumers access to higher-quality, cost-effective products, so retailers should collaborate with far-sighted manufacturers while themselves preferring short-sighted behavior; (2) in terms of overall system profit, the FM model emerges as the optimal combination; (3) when the reference price effect has a small impact on market demand, enterprises can use it to actively promote marketing and gain profit, and as its influence grows, intensifying the degree of influence effectively augments profits.
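A minimal sketch of the reference-price state dynamics that differential-game models of this kind typically build on is shown below. The update rule r'(t) = beta * (p(t) - r(t)) and all parameter values are standard assumptions for illustration, not taken from the paper:

```python
# Euler simulation of memory-based reference-price dynamics often paired
# with differential-game models: r'(t) = beta * (p(t) - r(t)).
# beta, r0, and the price path are illustrative assumptions.

def simulate_reference_price(price, r0=10.0, beta=0.5, dt=0.01, t_end=20.0):
    """Return the reference-price trajectory under a given price path p(t)."""
    steps = int(t_end / dt)
    r = r0
    traj = [r]
    for k in range(steps):
        t = k * dt
        r += beta * (price(t) - r) * dt  # Euler step of r' = beta * (p - r)
        traj.append(r)
    return traj

# Under a constant price, the reference price relaxes toward that price,
# which is what makes pricing a dynamic (state-dependent) decision.
traj = simulate_reference_price(lambda t: 8.0)
print(round(traj[-1], 3))
```

In the full game, manufacturer and retailer strategies would feed back into p(t), and the Bellman/HJB machinery solves for those strategies; this fragment only illustrates the state equation.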
Authors: Fangfang Guo, Zhuang Wu, Yuanyuan Wang, Jiaqi Du, Wanshu Fu
doi: 10.56578/atams010103 | Published: 06-07-2023 | https://www.acadlore.com/article/ATAMS/2023_1_1/atams010103
Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 1, Pages undefined: Evaluating the Employment Efficiency of IT Candidates Using Data Envelopment Analysis
https://www.acadlore.com/article/ATAMS/2023_1_1/atams010102
This study aims to identify efficient Information Technology (IT) candidates for a specific position and highlight areas for improvement using Data Envelopment Analysis (DEA). By streamlining the selection process and reducing costs, the findings can assist companies in making better-informed hiring decisions. Additionally, the results provide candidates with valuable feedback on areas for development, increasing their chances of securing employment in their desired company. The DEA model offers a unique advantage in this context by generating reference units for each candidate, enabling precise determination of the necessary changes in inputs or outputs for achieving efficiency. The Charnes, Cooper, and Rhodes (CCR) model served as the baseline, with parallel comparisons drawn against the Banker, Charnes, and Cooper (BCC) and categorical models to identify the most effective approach. The findings reveal the efficient candidates based on the assessed criteria, demonstrating that less experienced candidates can be evaluated as efficient compared to their more experienced counterparts. The hypothesis that the BCC model, with its more flexible efficiency frontier, results in poorer candidate differentiation was confirmed. This study highlights the value of adopting the DEA method in evaluating the employment efficiency of IT candidates, offering practical implications for both hiring organizations and job-seekers.
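In the special case of a single input and a single output, the CCR efficiency score reduces to each unit's output/input ratio normalized by the best observed ratio, which makes the idea easy to sketch without a linear-programming solver. The candidate data below are illustrative placeholders, not the study's dataset:

```python
# CCR-style efficiency scores for the single-input, single-output case,
# where the CCR model reduces to normalized output/input ratios.
# Candidates and criteria here are hypothetical.

candidates = {
    "A": {"experience_years": 4.0, "test_score": 80.0},  # input, output
    "B": {"experience_years": 2.0, "test_score": 60.0},
    "C": {"experience_years": 5.0, "test_score": 75.0},
}

def ccr_scores(units, inp, out):
    """Efficiency = own output/input ratio divided by the best ratio."""
    ratios = {k: v[out] / v[inp] for k, v in units.items()}
    best = max(ratios.values())
    return {k: r / best for k, r in ratios.items()}

scores = ccr_scores(candidates, "experience_years", "test_score")
for name, s in sorted(scores.items()):
    print(name, round(s, 3))
```

Note that candidate B, the least experienced, comes out efficient (score 1.0) because it converts its input into output at the best rate, mirroring the study's observation that less experienced candidates can be evaluated as efficient. The general multi-input, multi-output CCR model requires solving one linear program per candidate.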
Authors: Andjela Mrdak, Tijana Nanuševski
doi: 10.56578/atams010102 | Published: 06-06-2023 | https://www.acadlore.com/article/ATAMS/2023_1_1/atams010102
Acadlore Transactions on Applied Mathematics and Statistics, 2023, Volume 1, Issue 1, Pages undefined: Temporal Analysis of Infectious Diseases: A Case Study on COVID-19
https://www.acadlore.com/article/ATAMS/2023_1_1/atams010101
Historically, infectious diseases have greatly impacted human health, necessitating a robust understanding of their trends, processes, and transmission. This study focuses on the COVID-19 pandemic, employing mathematical, statistical, and machine-learning methods to examine its time-series data. We quantify data irregularity using approximate entropy, revealing higher volatility in the U.S., Italy, and India compared to China. We employ the Dynamic Time Warping (DTW) algorithm to assess regional similarity, finding a strong correlation between the U.S. and Italy. Seasonal-Trend decomposition using LOESS (STL) reveals strong trends in all observed regions, with China's prevention measures showing marked effectiveness. These tools, whilst already valuable, still present opportunities for development in both theory and practice.
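Approximate entropy, used above to quantify the irregularity of case-count series, can be implemented directly from its definition ApEn(m, r) = Φ^m(r) − Φ^(m+1)(r). In practice the tolerance r is usually scaled to roughly 0.2 times the series' standard deviation; the fixed r below is an assumption for illustration:

```python
import math

def apen(series, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series (self-matches included)."""
    n = len(series)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        # C_i: fraction of templates within Chebyshev distance r of template i.
        log_counts = []
        for t1 in templates:
            c = sum(1 for t2 in templates
                    if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            log_counts.append(math.log(c / len(templates)))
        return sum(log_counts) / len(templates)

    return phi(m) - phi(m + 1)

# A perfectly regular (constant) series has ApEn of exactly 0.
print(apen([1.0] * 30))
# A strictly alternating series is still highly regular, so ApEn stays near 0.
print(apen([float(i % 2) for i in range(30)]))
```

Higher ApEn values indicate more irregular, less predictable series, which is how the study contrasts the volatility of the U.S., Italian, and Indian curves with China's.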
Authors: Jinyang Liu, Boping Tian, Jiaxuan Wu
doi: 10.56578/atams010101 | Published: 06-04-2023 | https://www.acadlore.com/article/ATAMS/2023_1_1/atams010101