In the context of rapid advancements in machine learning and causal inference methodologies, their integration into medical research is of paramount importance. Implementing appropriate methods in the medical domain facilitates robust assessment of treatment efficacy at the individual level. This study conducts experiments on synthetic data and evaluates the accuracy of predicting individual treatment effects using the T-learner and S-learner methods. The article presents an integrated approach to medical data analysis, combining causal inference techniques with machine learning algorithms. For the first time, a comprehensive comparison of the effectiveness of the T-learner and S-learner methods in assessing individual treatment effects has been conducted. Based on simulated data, the study experimentally determines the optimal conditions for applying these methods, depending on the characteristics of clinical data. The experiments revealed that the T-learner demonstrated higher accuracy (87%) than the S-learner (84%), making it preferable when there are significant differences between the treatment and control groups. However, the S-learner exhibited greater generalization capability in scenarios with limited data volume. The c-for-benefit index was employed to validate the predicted treatment effects, with the results confirming the high accuracy of both methods. These findings underscore the potential of integrating machine learning and causal inference methods to develop personalized therapeutic strategies and automate medical data analysis, thereby improving clinical outcomes and treatment quality. The developed approach enhances the precision of predicting treatment outcomes at the individual level and can be integrated into clinical decision support systems. The presented results open new opportunities for personalized healthcare and can serve as a foundation for subsequent research in this field.
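The abstract above contrasts two meta-learner schemes for individual treatment effects. As a minimal illustration (not the authors' implementation), the sketch below builds both learners on a synthetic dataset, using simple group means as the base learner; the data-generating process, variable names, and effect sizes are all invented for this example. Note that with such a saturated base learner the two estimates coincide; the accuracy differences reported in the abstract arise when more restrictive base models are used.

```python
import random
import statistics

# Synthetic data with a known individual treatment effect:
# effect = 2 when x == 1, effect = 1 when x == 0 (invented for illustration).
random.seed(0)
data = []
for _ in range(4000):
    x = random.randint(0, 1)   # binary covariate
    t = random.randint(0, 1)   # treatment indicator
    y = x + t * (2 if x else 1) + random.gauss(0, 0.1)
    data.append((x, t, y))

def group_mean(x, t):
    # Base learner: mean outcome within a (covariate, treatment) cell.
    return statistics.mean(y_ for x_, t_, y_ in data if x_ == x and t_ == t)

def t_learner(x):
    # T-learner: separate outcome models for treated and control groups,
    # effect estimate = mu1(x) - mu0(x).
    return group_mean(x, 1) - group_mean(x, 0)

def s_learner(x):
    # S-learner: a single model mu(x, t) with treatment as a feature,
    # effect estimate = mu(x, 1) - mu(x, 0).
    mu = {(x_, t_): group_mean(x_, t_) for x_ in (0, 1) for t_ in (0, 1)}
    return mu[(x, 1)] - mu[(x, 0)]

print(round(t_learner(1), 2), round(s_learner(0), 2))
```

Swapping in a regression model per group (T-learner) versus one joint regression (S-learner) reproduces the trade-off discussed above: separate models adapt to group differences, while the single model pools data and generalizes better on small samples.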
The article explores ways to increase the operational efficiency of IT companies by improving approaches to solving tasks at different stages of the software development life cycle (SDLC). It is shown that the greatest potential for growth in operational efficiency lies in the tasks of the requirements-processing stage. Automating these tasks with artificial intelligence tools, especially those based on large language models (LLMs), can significantly shorten the SDLC by cutting the number of iterations for rework and error correction. The paper analyses the capabilities of such tools, as well as the complexities associated with their implementation, including the risk of technological dependence, the problems of integrating new tools into the current IT landscape of companies, the risks of disrupting the continuity of processes during implementation, and the difficulties of assessing the economic effects. The paper concludes that a comprehensive approach to SDLC modernisation is needed, one that combines technological innovation with organisational changes in the form of a transformation of corporate culture and processes aimed at adapting to new technologies. Directions for further research are suggested, including the development of universal methodologies for implementing AI tools and of models for assessing their economic efficiency.
№ 2(116)
April 25, 2025
Rubric: Research of processes and systems. Authors: Puchkov A., Maksimkin M., Mashegov P., Prokimnov N.
Digital signal processing in cyber-physical technological systems is based on algorithms that operate on information discretized both in level and in time. In the latter case, a constant time quantization interval is assumed as one of the postulated conditions for applying the algorithms. In practice, however, such constancy is not always ensured, which leads to the omission of individual samples or even to a random character of the discretization. An urgent research task is therefore to develop methods and algorithms for signal processing under conditions of random discretization, in particular for the restoration of continuous signals from discrete samples taken in violation of the requirements of the Kotelnikov – Shannon theorem. If the discretization interval of a continuous signal satisfies those requirements (i.e., discretization is carried out at a frequency not lower than the Nyquist frequency), then exact restoration from the discrete samples is possible; otherwise it is not. Even in the latter situation, however, there are approaches to restoring continuous signals that draw on additional a priori information about the nature of the signal. Some of these approaches rest on a complex mathematical apparatus, which makes them difficult to apply and not universal, while others use deep machine learning models that are expensive in computational resources and demanding in training data volume. Under these conditions, a method is proposed for restoring a signal with a limited spectrum from discrete samples whose time interval is random and whose mathematical expectation is greater than the value determined by the Kotelnikov – Shannon theorem for regular discretization.
The novelty of the research results lies in the proposed method and algorithm for restoring a continuous signal, as well as in the analysis of a numerical experiment conducted with a software model implemented in the MATLAB environment that realizes the developed algorithm.
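For regular discretization, the exact restoration guaranteed by the Kotelnikov – Shannon theorem is given by the Whittaker – Shannon interpolation series. The sketch below is a generic illustration of that baseline case, not the paper's MATLAB model; the test signal, frequencies, and sample window are assumed values. The random-discretization case treated in the paper requires the additional a priori information discussed above.

```python
import math

F_SIG = 5.0    # signal frequency, Hz (spectrum limited to 5 Hz); assumed value
F_S = 50.0     # sampling frequency, Hz, well above the 10 Hz Nyquist rate
T = 1.0 / F_S  # regular discretization interval

def x(t):
    # Bandlimited test signal.
    return math.sin(2 * math.pi * F_SIG * t)

# Discrete samples on a regular grid, n = -500 .. 500 (finite window).
samples = {n: x(n * T) for n in range(-500, 501)}

def sinc(u):
    return 1.0 if u == 0 else math.sin(math.pi * u) / (math.pi * u)

def restore(t):
    # Kotelnikov–Shannon series: x(t) = sum_n x[nT] * sinc((t - nT) / T),
    # truncated here to the available samples.
    return sum(xn * sinc((t - n * T) / T) for n, xn in samples.items())
```

Because the series is truncated to a finite window, the restoration is exact only up to a small truncation error; in the sub-Nyquist random-sampling regime studied in the article this formula no longer applies, which is precisely what motivates the proposed method.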
Rubric: Research of processes and systems. Authors: Dli M. I., Chernovalova M., Sokolov A. M.
The article is devoted to constructing and using models of regional economic systems that take situational aspects of management into account. The specific character of the information about the functioning of these systems, which makes it difficult to form analytical and statistical dependencies, leads to the conclusion that fuzzy situational models based on precedents are advisable for solving this problem. A procedure for constructing fuzzy situational precedent models of regional economic systems is proposed. It is characterized by additional transitions between the nodes (states) of the network graph, owing to uncertainty factors that lead to different observed reactions of the system to similar control actions. The procedure also allows the use of natural-language information, which significantly expands the possibilities for economic and mathematical modeling of the situational aspects of managing regional systems and processes. This is achieved by applying a domain ontology to determine the degree of proximity between the elements of the precedent (case) base and the current situation. The results of applying the developed software tools, which implement the proposed procedure, to support decision-making on managing a regional IT cluster as its participants implement joint programs showed a fairly high degree of validity of the proposed recommendations.
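The precedent-based retrieval at the core of such models can be sketched in a few lines. The example below is purely hypothetical: the situation attributes, membership degrees, weights, and recommended actions are all invented, and the ontology-based proximity measure used in the article is replaced here by a simple weighted fuzzy similarity.

```python
# Hypothetical precedent (case) base for a regional IT cluster: attribute
# names, membership degrees, weights, and actions are invented for this sketch.
precedents = [
    {"demand": 0.8, "funding": 0.3, "staffing": 0.6,
     "action": "expand joint program"},
    {"demand": 0.2, "funding": 0.7, "staffing": 0.4,
     "action": "consolidate resources"},
]
weights = {"demand": 0.5, "funding": 0.3, "staffing": 0.2}

def similarity(case, situation):
    # Weighted fuzzy similarity: complement of the absolute difference of
    # membership degrees, aggregated over the situation attributes.
    return sum(w * (1 - abs(case[k] - situation[k]))
               for k, w in weights.items())

def recommend(situation):
    # Retrieve the closest precedent and reuse its recorded control action.
    return max(precedents, key=lambda c: similarity(c, situation))["action"]
```

In the article's procedure, the degree of proximity is computed via the domain ontology rather than a fixed weight vector, which is what allows natural-language case descriptions to be compared with the current situation.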
Rubric: Research of processes and systems. Authors: Dzizinskaya D., Ledneva O., Tindova M., Yazykova S.
To ensure effective management of energy systems, it is necessary to analyze and forecast time series of electricity consumption. The relevance of this study is explained by the need for accurate electricity consumption forecasts to optimize the operation of energy networks and to plan the production and distribution of electricity. This article presents a comparative analysis of medium-term electricity consumption forecasting models implemented in the R programming environment. The study encompasses classical forecasting models such as SARIMA and ETS, as well as machine-oriented models such as TBATS and Prophet, which are less commonly referenced in the scientific literature. The paper details the R functions necessary for performing the calculations and includes a code snippet intended for preliminary data analysis and forecasting. All examined models demonstrate high accuracy in medium-term electricity consumption forecasting. However, the quality of model fit varies across the regional branches of the Unified Energy System of Russia. The ETS and bagged ETS algorithms yield the best forecasts, with a minimal mean absolute error (slightly over 1%) for Russia as a whole as well as for the consolidated energy system of the Urals. The TBATS model is recommended for predicting electricity consumption in the Center and East zones, while SARIMA is suggested for the South zone. Although the Prophet model exhibited satisfactory forecasting quality, the analysis indicates that its effectiveness increases significantly when it is applied to high-frequency data, such as weekly or hourly time series.
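The paper's calculations are done in R (the forecast package's ets(), tbats(), and auto.arima() cover three of the four model families). As a language-neutral sketch of the building blocks, not the paper's code, the example below implements simple exponential smoothing, the simplest member of the ETS family, together with the percentage-error metric used to compare models; the smoothing parameter alpha is an assumed value, whereas ETS estimates it from the data.

```python
def ses_forecast(series, alpha=0.3):
    # Simple exponential smoothing: level = alpha*y + (1 - alpha)*level.
    # alpha = 0.3 is an assumed value for illustration.
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level  # flat one-step-ahead forecast

def mape(actual, forecast):
    # Mean absolute percentage error, the accuracy measure quoted above.
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)
```

A forecast that stays within about 1% of the actual consumption values yields a MAPE of about 1%, which is the order of accuracy reported above for ETS and bagged ETS.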