
Articles

№ 4(118), 28 August 2025
Rubric: Performance management
Authors: Dudarev V. A., Kiselyova N., Ludwig A.


The MatInf research data management system has been developed to support teams of researchers working with large volumes of data from high-throughput experiments in the field of inorganic materials science. One of the key features of the system is its architecture, which provides full support for user-defined data types specified after deployment. This is achieved through flexible system configuration, late binding of data types to web services, and integration of data validation, extraction and visualization mechanisms. As part of the work, a detailed analysis of the requirements for RDMS was carried out, which made it possible to formulate the main functional principles of the system, including support for data on chemical compounds, flexible object typing, access control and management of relationships between data. The developed relational database structure implements an information storage model that allows new types of objects related to the subject area to be created. The additional use of links between objects (a graph structure) makes it possible to manage the relationships between experimental results, materials and methods of their synthesis effectively. To ensure flexibility and extensibility, the system supports integration with external API services for handling user-defined data formats and provides API access to data, enabling integration with other systems such as machine learning tools. RDMS is developed on the basis of ASP.Net Core and the relational DBMS Microsoft SQL Server, which ensures its reliability, performance and scalability. Examples of using the system to accumulate experimental data, document experiments and improve the reproducibility of research are presented. The open architecture and free distribution of the system make it a fairly universal tool for digitalization of research in the field of inorganic materials science, allowing the platform to be adapted to various tasks, including support for new data types and integration with external analytical tools. The novelty of the development lies in the absence of freely available alternative solutions capable of maintaining typed storage of materials science data (and thus search by quantitative composition of a material), supporting an extensible user-defined type system and integrating arbitrary formats of research documents without changing the system core.
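
The abstract describes late binding of user-defined types and graph-like links between stored objects. As a purely illustrative aid (the actual MatInf implementation is ASP.Net Core with Microsoft SQL Server and is not shown here), a minimal Python sketch of the general idea might look as follows; all class, type and field names are hypothetical:

```python
# Hypothetical sketch (not the MatInf code) of the general idea: object types
# are defined at run time, instances are validated against them, and typed
# links between objects form a graph over the stored data.

class ObjectType:
    def __init__(self, name, fields):
        # fields: dict of field name -> Python type used for validation
        self.name = name
        self.fields = fields

class Repository:
    def __init__(self):
        self.types = {}      # type name -> ObjectType
        self.objects = {}    # object id -> (type name, data dict)
        self.links = []      # (source id, target id, relation name)
        self._next_id = 1

    def register_type(self, obj_type):
        # "late binding": new types can be added after deployment
        self.types[obj_type.name] = obj_type

    def add_object(self, type_name, data):
        obj_type = self.types[type_name]
        for field, expected in obj_type.fields.items():
            if not isinstance(data.get(field), expected):
                raise ValueError(f"field '{field}' must be {expected.__name__}")
        obj_id = self._next_id
        self._next_id += 1
        self.objects[obj_id] = (type_name, data)
        return obj_id

    def link(self, source_id, target_id, relation):
        # graph structure: relations between samples, experiments, methods, ...
        self.links.append((source_id, target_id, relation))

# Usage: a "Sample" type defined after the repository is already running.
repo = Repository()
repo.register_type(ObjectType("Sample", {"composition": str, "mass_mg": float}))
repo.register_type(ObjectType("Experiment", {"method": str}))
s = repo.add_object("Sample", {"composition": "Fe2O3", "mass_mg": 12.5})
e = repo.add_object("Experiment", {"method": "XRD"})
repo.link(e, s, "measured_on")
```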
№ 4(118), 28 August 2025
Rubric: Performance management
Authors: Novoselova I., Sharkova A.


The subject of the research is models and algorithms for calendar planning of projects within the target integrated program of ecological rehabilitation of a region, the peculiarity of which is the absence of technological links between the program projects. The purpose of the work is to form an economic and mathematical model for determining rational deadlines for the implementation of projects of the target program for environmental rehabilitation of the region in the absence of technological dependencies between projects, and to compare and improve project scheduling algorithms. The result of the work is an economic and mathematical model that includes two criteria – minimizing the duration of the target comprehensive program and maximizing the progressiveness of achieving its goals – as well as a system of restrictions on the annual amount of investment and on the relationship between the desired start and end dates of the program projects. Two well-known variants of the algorithm for sequentially assigning projects to the calendar plan are considered, along with a modification of the algorithm that makes it possible to form the optimal set of projects for each year of the program implementation. The proposed modification of the algorithm, which uses the solution of the problem of finding the optimal set of projects, makes it possible to satisfy both criteria of the economic and mathematical model. To test the operability and effectiveness of the analyzed alternative algorithms, a software package was developed in VBA for Excel. Numerical calculations are presented that demonstrate the advantage of the developed algorithm. Conclusions are drawn about the expediency of using this algorithm and about the possibility of adjusting it to take into account technological relationships between the program projects, which makes it possible to significantly expand the scope of its application.
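
The abstract does not reproduce the scheduling algorithm itself; the following Python sketch only illustrates the general idea of sequentially assigning projects to years under an annual investment limit. The data, the priority criterion and the greedy selection are illustrative assumptions, not the authors' model:

```python
# Hypothetical sketch of sequential project scheduling under an annual
# investment limit (the paper's actual model is richer: it also maximizes
# the "progressiveness" of reaching the program goals).

def schedule(projects, annual_budget, years):
    """projects: list of (name, cost, priority); higher priority scheduled first."""
    plan = {year: [] for year in range(1, years + 1)}
    remaining = sorted(projects, key=lambda p: p[2], reverse=True)
    for year in plan:
        budget = annual_budget
        scheduled = []
        for project in remaining:
            name, cost, _ = project
            if cost <= budget:
                plan[year].append(name)
                budget -= cost
                scheduled.append(project)
        for project in scheduled:
            remaining.remove(project)
    return plan, remaining  # projects that do not fit into the planning horizon

projects = [("P1", 40, 3), ("P2", 35, 5), ("P3", 30, 4), ("P4", 25, 2)]
plan, unscheduled = schedule(projects, annual_budget=60, years=2)
print(plan)         # with the data above: {1: ['P2', 'P4'], 2: ['P3']}
print(unscheduled)  # [('P1', 40, 3)] does not fit within two years
```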
№ 4(118), 28 August 2025
Rubric: Models and methods
Authors: Antipina E., Antipin A. F.


When searching for solutions to nonlinear optimal control problems, one may encounter difficulties related to the presence of local extrema. Traditional optimization methods are effective for convex problems, in which any local extremum found is also global. It is therefore important to develop methods and algorithms for solving multi-extremal optimal control problems. Since the operation of most optimization methods depends on the choice of the initial values of the optimized parameters, it is proposed to apply the differential evolution method. This method optimizes a set of possible solutions within the range of acceptable values of the desired parameters, the initial values of which are set randomly. The aim of the work is to develop an evolutionary algorithm for finding a solution to a multi-extremal optimal control problem. Escaping from a local optimum is possible by maintaining population diversity. If the solution falls into the region of a local extremum and the prescribed number of iterations of the algorithm is insufficient, an incorrect solution may be obtained. Therefore, in order to dislodge the population from the region of a local extremum, a modification of the differential evolution method is proposed – a dynamic population size. If the population is drawn into the region of a local extremum, its average fitness changes only slightly. In this case, the individual vectors with the lowest fitness are removed and new individuals are added. Computational experiments have been carried out on a model optimal control problem with a non-convex reachability domain. The developed evolutionary algorithm is compared with the method of variations in the control space and with the differential evolution algorithm with a constant population size. The effectiveness of the developed evolutionary algorithm in solving a multi-extremal optimal control problem is demonstrated.
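
A minimal Python sketch of the idea described above (differential evolution whose population is partially renewed when the average fitness stagnates) is given below; the parameter values and the stagnation test are assumptions for illustration, not the authors' exact algorithm:

```python
import numpy as np

# Hypothetical sketch of differential evolution with a dynamic population
# size: when the mean fitness stagnates (a sign of being drawn into a local
# extremum), the worst individuals are removed and fresh random ones added.

def de_dynamic(f, bounds, pop_size=30, F=0.7, CR=0.9, iters=300,
               stall_tol=1e-6, replace_frac=0.3, rng=np.random.default_rng(0)):
    low, high = np.asarray(bounds).T
    dim = low.size
    pop = rng.uniform(low, high, (pop_size, dim))
    fit = np.apply_along_axis(f, 1, pop)
    prev_mean = fit.mean()
    for _ in range(iters):
        for i in range(len(pop)):
            a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
            mutant = np.clip(a + F * (b - c), low, high)
            mask = rng.random(dim) < CR
            trial = np.where(mask, mutant, pop[i])
            ft = f(trial)
            if ft < fit[i]:
                pop[i], fit[i] = trial, ft
        mean = fit.mean()
        if abs(prev_mean - mean) < stall_tol:
            # population likely trapped: drop the worst, inject new individuals
            keep = np.argsort(fit)[: int(len(pop) * (1 - replace_frac))]
            fresh = rng.uniform(low, high, (len(pop) - keep.size, dim))
            pop = np.vstack([pop[keep], fresh])
            fit = np.concatenate([fit[keep], np.apply_along_axis(f, 1, fresh)])
        prev_mean = mean
    best = np.argmin(fit)
    return pop[best], fit[best]

# Example on a multimodal test function (Rastrigin), not an optimal control problem
x, fx = de_dynamic(lambda v: np.sum(v**2 - 10 * np.cos(2 * np.pi * v) + 10),
                   bounds=[(-5.12, 5.12)] * 2)
```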
№ 4(118), 28 August 2025
Rubric: Models and methods
Authors: Sorokin A., Dzhalmukhambetova E., Kartashova O.


The study proposes a hierarchical fuzzy rule-based model that makes it possible to assess the state of a client of the banking ecosystem and to calculate his or her credit rating. In addition, the proposed model allows multiple intermediate assessments to be formed. These intermediate assessments are obtained at the output of individual fuzzy rule-based models that form a hierarchical structure. The use of intermediate assessments makes it possible to determine the groups of parameters that influenced the value of the aggregated assessment and the value of the client’s credit rating. If the proposed model produces low values of the credit rating, an analysis of the intermediate variables is started to identify the causes. As a result of the analysis, a set of input variables is formed with the help of which the reasons for assigning a particular credit rating are explained. To correct a low credit rating value, control actions are formed; a feature of such actions is the involvement of the client in the business processes of the banking ecosystem. The experiment confirmed that the proposed theoretical provisions make it possible to distribute the objects of analysis among state classes and to determine the values of their aggregated assessments for various combinations of input parameter values. The possibility of using the proposed model to explain the obtained results by forming maps of the client’s state and to predict the result of applying control actions to the client’s state has also been confirmed.
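
A hierarchical fuzzy rule-based assessment of the kind described above can be illustrated, under many simplifying assumptions, by the following Python sketch; the membership functions, rules and variable names are hypothetical and are not taken from the paper:

```python
# Hypothetical two-level sketch of a hierarchical fuzzy rule-based assessment.
# Lower-level models turn groups of client parameters into intermediate
# assessments; the top-level model aggregates them into a credit rating.

def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(x):
    # degrees of "low", "medium", "high" for a value scaled to [0, 1]
    return {"low": tri(x, -0.5, 0.0, 0.5),
            "med": tri(x, 0.0, 0.5, 1.0),
            "high": tri(x, 0.5, 1.0, 1.5)}

def sugeno(rules, memberships):
    """Zero-order Sugeno inference: weighted average of rule outputs."""
    num = den = 0.0
    for antecedent, output in rules:
        w = min(memberships[var][term] for var, term in antecedent)
        num += w * output
        den += w
    return num / den if den else 0.0

def intermediate(values, rules):
    return sugeno(rules, {name: fuzzify(v) for name, v in values.items()})

# Group model for "financial behaviour" (inputs scaled to [0, 1]; illustrative rules)
fin_rules = [([("income", "high"), ("debt_load", "low")], 0.9),
             ([("income", "med")], 0.6),
             ([("debt_load", "high")], 0.2)]
# Top-level model combining intermediate assessments into an aggregated rating
top_rules = [([("financial", "high"), ("activity", "high")], 0.95),
             ([("financial", "med")], 0.5),
             ([("financial", "low")], 0.1)]

financial = intermediate({"income": 0.8, "debt_load": 0.3}, fin_rules)
activity = 0.7  # assumed output of another group model
rating = sugeno(top_rules, {"financial": fuzzify(financial),
                            "activity": fuzzify(activity)})
print(financial, rating)  # intermediate assessment and aggregated credit rating
```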
№ 4(118), 28 August 2025
Rubric: Algorithmic efficiency
Authors: Afanasyev S., Nechta I.


Random number generators produce numbers with a uniform distribution, and there is no correlation between these numbers that would allow the next number to be predicted better than by random guessing. Such generators are used in various tasks, such as modeling and information security. Hardware devices that transform a random physical process into a data stream are used as generators. Another type is software generators, which use a formula or algorithm to obtain a new number. It is believed that software generators, although they work faster, do not have a proven property of randomness. There are a number of known cases when patterns were found in such generators, which led to the refusal to use them. Test suites exist for checking generators; if all tests are passed successfully, the generator is recommended for use. The purpose of this article is to develop an algorithm for testing bit sequences for randomness. In this paper, a new algorithm for testing random number generators is proposed that can be included in the general set of statistical tests required for verification. Unlike previously known tests based on the entropy approach, the new test uses autocorrelation of the generated data. It is shown that the proposed test allows one to identify deviations from randomness in generators that have previously passed known statistical tests. During the experiments, the output sequences of a cipher, a hash function and some pseudo-random number generators were tested. As the analysis of the test results showed, some generators have autocorrelation in the generated data. The idea of the test is that any generator has a period within which all numbers from the generated set (the alphabet) appear in the sequence. In the ideal case, the period will be close to the size of the alphabet or slightly larger. However, if some symbols are repeated more often, the period will increase significantly. In other words, an increase in the period may indicate either a deviation from a uniform distribution or the presence of autocorrelation in the generated data. The latter phenomenon is the area of interest of this article. The research uses elements of probability theory and mathematical statistics. The experiments were conducted on a large volume of generated sequences of numbers.
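
The period idea described above is close to the classical coupon-collector setting; the following Python sketch, which is an illustration rather than the exact statistic from the paper, measures the observed coverage period and compares it with the theoretical expectation for a uniform, uncorrelated source:

```python
import random

# Illustrative sketch of the period idea: count how many outputs a generator
# needs before every symbol of the alphabet has appeared at least once, and
# compare it with the coupon-collector expectation n * (1 + 1/2 + ... + 1/n)
# for a uniform, uncorrelated source. A systematically larger observed period
# hints at a non-uniform distribution or autocorrelation.

def period_until_full_coverage(gen, alphabet_size):
    seen = set()
    count = 0
    while len(seen) < alphabet_size:
        seen.add(gen())
        count += 1
    return count

def expected_coupon_collector(n):
    return n * sum(1.0 / k for k in range(1, n + 1))

n = 256  # e.g. byte-valued generator outputs
uniform_gen = lambda: random.randrange(n)
trials = [period_until_full_coverage(uniform_gen, n) for _ in range(200)]
print("mean observed period:", sum(trials) / len(trials))
print("coupon-collector expectation:", expected_coupon_collector(n))
```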
№ 4(118), 28 August 2025
Rubric: Algorithmic efficiency
Authors: Andrianov M., Belousov K., Cherny R.


In space-ground interferometry systems, in order to improve UV-plane filling and the quality of synthesized images, antenna systems separated from each other are used, thus forming variable projections of the spacecraft – ground receiving station baseline. The increasing amount of information from highly sensitive receiving devices leads to rapid filling of the onboard memory with scientific data. In order to read data quickly from the onboard memory device, it is reasonable to increase the data transmission rate on the spacecraft – NSPI line. A variant of correcting group errors in a communication channel is considered that uses lossless interleaving coding/decoding algorithms distributed according to linear and pseudo-random laws, which makes it possible to increase the survivability of memory elements when transmitting data from autonomous systems in near and deep space. Some memory elements can be switched off until a certain moment of time. The group errors arising in this case can be transformed into single errors in accordance with the interleaving algorithms, with the latter corrected by a redundant noise-resistant code. In case of failure of certain memory elements, the same algorithms make it possible to transform group errors into single errors with subsequent correction in the same way. The application of lossless interleaving codes under the conditions of normalization of the basic packet frame makes it possible to increase the reliability of functioning of electronic memory elements. Nowadays, the development of computer technology allows data accumulation and input via a high-speed bus, which significantly increases the recording speed and the volume of stored data. A variant of the recording system is proposed that can be used for operation in satellite data reception centers and radio astronomical data processing centers. The versatility of this system provides the possibility to build in a video converter and a decoder and to expand the SSD data array.
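
The transformation of group errors into single errors by interleaving (the linear-law case) can be illustrated by the following Python sketch; the matrix dimensions and the burst length are arbitrary assumptions:

```python
# Hypothetical sketch of block (matrix) interleaving: data are written into a
# rows x cols matrix by rows and read out by columns. A burst (group) error in
# the transmitted stream is then spread into isolated single errors after
# de-interleaving, so a single-error-correcting code can repair them.
# The linear law is shown; a pseudo-random permutation could be used instead.

def interleave(data, rows, cols):
    assert len(data) == rows * cols
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(data, rows, cols):
    assert len(data) == rows * cols
    return [data[c * rows + r] for r in range(rows) for c in range(cols)]

rows, cols = 4, 8
payload = list(range(rows * cols))
tx = interleave(payload, rows, cols)

# A group error: four consecutive symbols corrupted in the channel
for i in range(10, 14):
    tx[i] = -1

rx = deinterleave(tx, rows, cols)
error_positions = [i for i, (a, b) in enumerate(zip(rx, payload)) if a != b]
print(error_positions)  # [3, 11, 18, 26]: the errors are now isolated single errors
```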