
Contacts:

+7 961 270-60-01
ivdon3@bk.ru

  • A technique for analyzing video files to detect faces and landmarks using recognition by key, non-repeating frames

    In this paper, we consider a technique for the automatic analysis of video files to detect faces and landmarks, using recognition by key, non-repeating frames based on algorithms for their extraction. Recognizing landmarks and faces only in keyframes significantly reduces computational costs and avoids flooding the results with repetitive information. The effectiveness of the proposed technique is evaluated in terms of accuracy and speed on a set of test videos.

    Keywords: keyframe, recognition, computer vision, algorithm, video
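    As an illustration of selecting key, non-repeating frames, here is a minimal sketch assuming a mean-absolute-difference criterion between consecutive kept frames (the abstract does not specify the extraction algorithm, so the threshold rule is a hypothetical stand-in):

```python
import numpy as np

def select_keyframes(frames, threshold=10.0):
    """Select key, non-repeating frames: keep a frame only when its mean
    absolute pixel difference from the last kept frame exceeds a threshold."""
    keyframes = []
    last = None
    for i, frame in enumerate(frames):
        if last is None or np.abs(frame.astype(float) - last).mean() > threshold:
            keyframes.append(i)
            last = frame.astype(float)
    return keyframes

# Synthetic "video": five identical dark frames, then a bright scene change
frames = [np.zeros((4, 4))] * 5 + [np.full((4, 4), 255.0)] * 3
print(select_keyframes(frames))  # → [0, 5]
```

    Only the kept indices would then be passed to the face and landmark recognizers, which is where the computational savings come from.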

  • Aliasing-grams for express control of the adequacy of the chosen sampling interval of the measured signal

    A new mathematical apparatus is proposed for checking whether the chosen signal sampling interval adequately captures the main high-frequency components and for identifying opportunities to increase it. It is based on the construction of special aliasing-grams from measured signal samples. Aliasing-grams are graphs of the standard deviations between the amplitude spectra of a conditionally reference discrete signal, specified at the highest sampling frequency, and auxiliary discrete signals obtained over the same observation interval but at lower sampling frequencies. By analyzing such graphs, it is easy to identify sampling frequencies that lead to the aliasing effect and, consequently, to distortion of the signal spectrum. To speed up and simplify the construction of aliasing-grams, it is proposed to use, as auxiliary signals, signals obtained from the reference one by decimation. The apparatus has also been shown to be effective in the presence of the spectrum-spreading effect, and it can be used in self-learning measuring systems.

    Keywords: sampling interval, aliasing, amplitude spectrum, aliasing-gram, sample decimation, spectrum spreading
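    The construction described above can be sketched in a few lines of numpy. Because the observation interval is the same, the frequency bins of the decimated signal align with the first bins of the reference spectrum, so the deviation can be computed bin by bin (the normalization and the choice of test tones below are illustrative assumptions):

```python
import numpy as np

def aliasing_gram(x, factors):
    """For each decimation factor d, compare the amplitude spectrum of the
    decimated signal x[::d] with the reference spectrum of x over the shared
    frequency bins (same observation interval, hence same bin width) and
    return the standard deviation of the differences."""
    ref = np.abs(np.fft.rfft(x)) / len(x)
    out = {}
    for d in factors:
        xd = x[::d]
        spec = np.abs(np.fft.rfft(xd)) / len(xd)
        out[d] = float(np.std(ref[:len(spec)] - spec))
    return out

fs, T = 1000, 1.0                         # reference sampling rate, observation time
t = np.arange(0, T, 1 / fs)
x = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 180 * t)
g = aliasing_gram(x, [2, 5])
# Decimation by 2 (fs' = 500 Hz) keeps both tones below Nyquist; decimation
# by 5 (fs' = 200 Hz, Nyquist 100 Hz) aliases the 180 Hz tone to 20 Hz.
print(g[2] < g[5])  # → True
```

    A sharp rise of the aliasing-gram value at a given decimation factor signals that this sampling frequency is no longer adequate.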

  • Development of a graphical notation for representing models as a whole for the Methodology of Automation of Intellectual Labor

    The article discusses a graphical notation using three-dimensional visualization for representing models of automated systems according to the Methodology of Automation of Intellectual Labor (MAIL). The research aims to enhance the efficiency of modeling automated systems by providing a more comprehensive representation of the models. Research methods employed include a systems approach. The study results in the formulation of descriptions and rules for creating the corresponding graphical notation for the initial and conceptual modeling stages of subject tasks in MAIL, as well as rules for forming representations for static and dynamic model structures and representing their interrelations. Additionally, rules for visually highlighting and concealing elements within the diagrams of the graphical notation are examined, rendering it suitable for implementation as a software module with a graphical interface for CASE tools, facilitating modeling according to MAIL. Such an approach enables the visualization of the model as a whole and enhances the efficiency of analysts conducting modeling following the methodology.

    Keywords: methodology of Automation of Intellectual Labor, modeling of automated systems, conceptual modeling, graphical notation, three-dimensional visualization

  • Models of inclusive learning in foreign language classes

    This paper addresses topical problems related to the modernization of inclusive education in Russia, with an emphasis on the practice of teaching foreign languages in higher educational institutions. The paper also presents models of inclusion of persons with disabilities relevant to the modern educational environment, along with a brief description of the historical and legal basis of inclusion in Russia. The authors note that inclusive education in higher education institutions is still at the stage of formation and that its successful implementation requires comprehending the problem and creating a methodological basis.

    Keywords: inclusive education, persons with disabilities, equal access, quality education, integration, synergy, legal framework, adaptation, transformation

  • Simulation of an autonomous control system for a slitting machine of a paper machine

    The work is aimed at modeling the control system of a slitting machine of a paper machine in order to improve the quality of products and eliminate defects in winding density. The developed automated system implements the functions of controlling the operating modes of the machine, distributing the loads of the bearing shafts, braking the roll and tensioning the paper web.

    Keywords: slitting machine, paper machine, automated control system, rewinder, pressure roller, decoiler, reeler, accelerating shaft, deflecting shaft, cutting section

  • On the issue of reducing the power consumption of wireless sensor nodes

    The article presents expressions for calculating the power consumption of end nodes when transmitting a message in a wireless sensor network. Values of end-node power consumption are obtained as a function of signal attenuation in the wireless channel and of the configured output power and spreading factor of the transmitted signals.

    Keywords: Internet of Things, sensor network, LoRaWAN, IoT system, end node power consumption, spreading factor, output power
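    The dependence of transmission energy on the spreading factor can be illustrated with the standard LoRa time-on-air formula (Semtech AN1200.13); the payload size, bandwidth and power level below are illustrative assumptions, not values from the article:

```python
from math import ceil

def lora_airtime(payload_bytes, sf, bw=125e3, cr=1, preamble=8,
                 crc=True, implicit_header=False, low_dr_opt=False):
    """Time on air of one LoRa frame (Semtech AN1200.13 formula)."""
    t_sym = (2 ** sf) / bw
    de, ih, crc_bits = int(low_dr_opt), int(implicit_header), int(crc)
    n_payload = 8 + max(
        ceil((8 * payload_bytes - 4 * sf + 28 + 16 * crc_bits - 20 * ih)
             / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25 + n_payload) * t_sym

def tx_energy_mj(payload_bytes, sf, p_tx_mw):
    """Energy (mJ) spent by the end node transmitting one message."""
    return p_tx_mw * lora_airtime(payload_bytes, sf)

# A higher spreading factor lengthens the airtime and raises the energy cost
print(tx_energy_mj(20, 7, 25) < tx_energy_mj(20, 12, 25))  # → True
```

    Attenuation enters such calculations indirectly: stronger attenuation forces a higher output power or spreading factor, both of which raise the per-message energy.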

  • Detection of local defect areas during non-destructive testing of extended products

    The article discusses a method for detecting local areas with hidden defects in products whose length exceeds their other dimensions by several orders of magnitude, based on processing non-destructive testing data. Various introscopy tools and radiation of different natures are used to obtain the necessary information. Processing of the data obtained by scanning control should detect areas with defects and determine their nature. To compare processing methods and select the optimal one, computer modeling was used to simulate both the acquisition and the processing of the data, which simplifies the choice of the most suitable method for detecting a defect. The article describes typical models of the received signal and presents the simulation results.

    Keywords: defects, non-destructive testing, extended products, simulation model, moving averaging, time series
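    Since moving averaging of a time series is named among the keywords, a minimal sketch of that idea is given below: a sample is flagged when it deviates from the moving average of the preceding window (the window size, threshold, and synthetic trace are assumptions for illustration):

```python
def detect_defect_zones(signal, window=5, threshold=2.0):
    """Flag indices where a sample deviates from the moving average of the
    preceding window by more than a threshold - a simple model of locating
    local defect areas in a scanning NDT record."""
    flagged = []
    for i in range(window, len(signal)):
        baseline = sum(signal[i - window:i]) / window
        if abs(signal[i] - baseline) > threshold:
            flagged.append(i)
    return flagged

# Smooth background with a local spike modeling a hidden defect
trace = [1.0] * 10 + [6.0] + [1.0] * 10
print(detect_defect_zones(trace))  # → [10]
```

    In a simulation study, such detectors can be compared on modeled signals with known defect positions before choosing one for real scan data.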

  • Investigation of measurement data in assessing the quality of mixing of dissimilar fibers

    The article presents studies of changes in the output signal of a measuring device used to assess the quality of mixing of natural and chemical fibers in semi-finished spinning products obtained on a belt machine at various passages. Constructing polynomial models in the data analysis makes it possible to interpret information about the uniformity of fiber distribution in the sliver without regard to the effect on changes in its linear density.

    Keywords: fiber mixing quality, linear density, infrared estimation method, data estimation, linear polynomial, polynomial function
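    The role of a polynomial model here is to separate the slow linear-density trend from the local fluctuations that carry the mixing-uniformity information. A minimal sketch with a synthetic signal (the trend and fluctuation values are assumptions, not measured data):

```python
import numpy as np

# Fit a linear polynomial to a simulated measuring-device signal so that the
# slow trend (linear-density drift) is separated from the local fluctuations
# reflecting the uniformity of fiber distribution.
x = np.arange(20, dtype=float)
signal = 0.3 * x + 5.0 + np.array([0.2, -0.1] * 10)  # trend + fluctuation
trend = np.polyfit(x, signal, 1)          # degree-1 polynomial model
residual = signal - np.polyval(trend, x)  # what remains is the mixing signal
print(round(trend[0], 2))  # recovered slope of the trend → 0.3
```

    The residual series, free of the trend, can then be analyzed for uniformity on its own.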

  • Artificial immune systems in cryptanalysis and solving Diophantine equations: a new approach to information protection

    The article considers the problem of cryptanalysis of an information security system based on the hard problem of solving Diophantine equations. A mathematical model of such a protection system is described, and a solution to the cryptanalysis problem using an artificial immune system adapted for solving Diophantine equations is proposed. The paper discusses the basic principles of building artificial immune systems and presents the results of experiments evaluating the effectiveness of the proposed approach on Diophantine equations of degree not exceeding six. The results demonstrate the possibility of using artificial immune systems to solve the problem of cryptanalysis of information security systems based on Diophantine equations.

    Keywords: cryptanalysis, information security system, Diophantine equations, artificial immune system, adaptive algorithm, efficiency assessment
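    The clonal-selection principle behind artificial immune systems can be sketched on a toy Diophantine equation. The equation x² + y² = 25, the affinity function, and all loop parameters below are illustrative assumptions, far simpler than the degree-six systems studied in the article:

```python
import random

def affinity(cand, target=25):
    """Antibody affinity: distance of (x, y) from solving x^2 + y^2 = 25
    (affinity 0 means the Diophantine equation is satisfied)."""
    x, y = cand
    return abs(x * x + y * y - target)

def clonal_selection(pop_size=20, generations=200, bound=10, seed=1):
    """Minimal clonal-selection loop: clone the best antibodies, mutate the
    clones by +/-1 per coordinate, and keep the fittest individuals."""
    rng = random.Random(seed)
    pop = [(rng.randint(0, bound), rng.randint(0, bound)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=affinity)
        if affinity(pop[0]) == 0:
            break
        clones = [(min(bound, max(0, x + rng.choice((-1, 0, 1)))),
                   min(bound, max(0, y + rng.choice((-1, 0, 1)))))
                  for x, y in pop[:5] for _ in range(4)]
        pop = sorted(pop + clones, key=affinity)[:pop_size]
    return min(pop, key=affinity)

best = clonal_selection()
print(best, affinity(best))
```

    The same clone-mutate-select loop scales to systems of equations by summing the affinities of all equations in the system.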

  • Implementation of a model for automatic recognition of human emotions from speech

    Determining human emotions from speech is currently a pressing task, as it can be applied in various fields such as economics, medicine, marketing, security and education. This work examines the recognition of human emotions specifically from speech, because speech is an informative indicator that is quite difficult to fake. The paper discusses a neural network approach to the problem: a recurrent neural network with LSTM memory was implemented, and our own dataset was collected on which the model was trained. The dataset includes the speech of Russian-speaking actors, which improves the quality of the model for Russian-speaking users.

    Keywords: neural network, emotion detection, speech, classification, deep learning, recurrent model, LSTM
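    The core of the recurrent approach is the LSTM cell, whose gating can be sketched in numpy. The feature dimension below (13, as for typical MFCC speech features), the hidden size, and the random weights are illustrative assumptions, not the article's trained model:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step: input, forget and output gates plus the cell
    candidate are computed from the current input x and previous state (h, c)."""
    z = W @ x + U @ h + b                  # stacked pre-activations, shape (4n,)
    n = len(h)
    i, f, o = (1 / (1 + np.exp(-z[k * n:(k + 1) * n])) for k in range(3))
    g = np.tanh(z[3 * n:])
    c_new = f * c + i * g                  # memory cell keeps long-term context
    h_new = o * np.tanh(c_new)             # hidden state summarizes the sequence
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 13, 8                        # e.g. 13 MFCC features per speech frame
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for frame in rng.normal(size=(5, n_in)):   # run over 5 synthetic feature frames
    h, c = lstm_step(frame, h, c, W, U, b)
print(h.shape)  # → (8,)
```

    In an emotion classifier, the final hidden state h would be fed to a softmax layer over the emotion classes.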

  • Designing an ontological model for the «Information security» domain

    This article describes aspects of ontology design for the information security domain. Examples are given of the use of ontologies in information security, including risk management, classification of threats and vulnerabilities and incident monitoring, as well as examples of existing information security ontologies. The relevance of developing legal ontologies is established and examples of their practical use are given. The importance of designing a legal ontology for the information security domain follows from its large body of legal regulation. The paper presents an ontology model developed for one of the regulatory documents in the field of personal data protection. The proposed approach to ontology design is intended to be applied in the development of an information security learning system.

    Keywords: security, information security, protection of information, information, domain model, normative legal act, ontology, ontological approach, design, legal ontology

  • Road sign detection based on the YOLO neural network model

    This article presents a research study dedicated to the application of the YOLOv8 neural network model for road sign detection. During the study, a model based on YOLOv8 was developed and trained, which successfully detects road signs in real-time. The article also presents the results of experiments in which the YOLOv8 model is compared to other widely used methods for sign detection. The obtained results have practical significance in the field of road traffic safety, offering an innovative approach to automatic road sign detection, which contributes to improving speed control, attentiveness, and reducing accidents on the roads.

    Keywords: machine learning, road signs, convolutional neural networks, image recognition
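    A step shared by YOLOv8 and the methods it is compared against is non-maximum suppression (NMS), which turns overlapping raw detections into final road-sign boxes. A minimal sketch (box coordinates and scores are made-up examples):

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter) if inter else 0.0

def nms(detections, iou_thr=0.5):
    """Greedy non-maximum suppression: keep the highest-scoring box and
    drop lower-scored boxes that overlap it too much."""
    dets = sorted(detections, key=lambda d: -d[1])  # (box, score) pairs
    kept = []
    for box, score in dets:
        if all(iou(box, k[0]) < iou_thr for k in kept):
            kept.append((box, score))
    return kept

dets = [((0, 0, 10, 10), 0.9), ((1, 1, 10, 10), 0.8), ((50, 50, 60, 60), 0.7)]
print(len(nms(dets)))  # two overlapping boxes collapse to one → 2
```

    Real-time detectors run this after every inference pass, so its cost matters as much as its correctness.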

  • A mathematical model for assessing the applicability of intelligent chatbots for studying foreign language dialects

    The article presents a mathematical model for assessing the applicability of intelligent chatbots in the context of studying dialects of foreign languages. The model is based on the analysis of key parameters and characteristics of chatbots, as well as their ability to adapt to various dialects. The model's parameters include questions, answers, evaluation criteria, types, and costs of errors. The quality of the chatbot's responses is evaluated both according to individual criteria and overall. To test the effectiveness of the proposed method, an experimental study was conducted using the dialects of the German language as examples. During the research, such intelligent chatbots as ChatGPT-3.5, GPT-4, YouChat, Bard, DeepSeek, and Chatsonic were evaluated. The analysis of the results of applying the developed mathematical model showed that at present, the models by OpenAI (ChatGPT-3.5 and GPT-4) offer the broadest range of possibilities. ChatGPT-3.5 demonstrated the best results in communication in Bavarian and Austrian dialects, while YouChat excelled in the Swiss dialect. The obtained results allow for important practical recommendations to be made for selecting intelligent chatbots in the field of studying dialects of foreign languages and serve as a basis for further research in the area of evaluating the effectiveness of educational technologies based on artificial intelligence.

    Keywords: large language model, chatbot, quality assessment, foreign language learning, artificial intelligence technology in education
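    The evaluation model combines per-criterion scores with error costs. A minimal sketch of such an aggregation (the criteria names, weights, and cost values are hypothetical, not the article's actual parameters):

```python
def chatbot_score(criterion_scores, weights, error_costs=()):
    """Aggregate per-criterion scores (0..1) into an overall applicability
    score: weighted mean of the criteria minus the total cost of errors."""
    total_w = sum(weights)
    weighted = sum(s * w for s, w in zip(criterion_scores, weights)) / total_w
    return max(0.0, weighted - sum(error_costs))

# Hypothetical evaluation of one chatbot on a dialect test set:
# criteria: grammatical correctness, dialect lexicon coverage, fluency
score = chatbot_score([0.9, 0.6, 0.8], weights=[2, 3, 1], error_costs=[0.05])
print(round(score, 3))  # → 0.683
```

    Ranking several chatbots on the same question set by this overall score is what allows statements like "ChatGPT-3.5 performed best on Bavarian" to be made systematically.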

  • Exploring a Long Short-Term Memory-based Encoder-Decoder Framework for Extractive Text Summarization

    In this article we present a study on Natural Language Processing (NLP) and Machine Learning (ML) techniques, specifically focusing on deep learning algorithms. The research explores the application of Long Short-Term Memory (LSTM) models with attention mechanisms for text summarization tasks. The dataset used for experimentation consists of news articles and their corresponding summaries. The article discusses the preprocessing steps, including text cleaning and tokenization, performed on the data. The study also investigates the impact of different hyperparameters on the model's performance. The results demonstrate the effectiveness of the proposed approach in generating concise summaries from lengthy texts. The findings contribute to the advancement of Natural Language Processing and Machine Learning techniques for text summarization.

    Keywords: extractive text summarization, sequence-to-sequence, long short-term memory, encoder-decoder, summarization model, natural language processing, machine learning, deep learning, attention mechanism
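    Extractive summarization selects existing sentences rather than generating new text. As a point of reference for what the learned encoder-decoder model must beat, here is the simplest frequency-based extractive baseline (the scoring rule and toy document are assumptions, not the article's method):

```python
import re
from collections import Counter

def extractive_summary(text, k=1):
    """Score each sentence by the average document frequency of its words
    and keep the top-k sentences - a minimal extractive baseline."""
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    freq = Counter(re.findall(r'[a-z]+', text.lower()))
    def score(s):
        words = re.findall(r'[a-z]+', s.lower())
        return sum(freq[w] for w in words) / max(len(words), 1)
    return sorted(sentences, key=score, reverse=True)[:k]

doc = ("The model reads the article. The model scores every sentence. "
       "Cats are unrelated here.")
print(extractive_summary(doc, k=1))  # → ['The model reads the article.']
```

    An LSTM encoder-decoder with attention replaces the hand-written score with a learned one, but the select-top-k structure of the task is the same.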

  • A hybrid LSTM-DNN model for predicting fuel consumption of dump trucks in open-pit mining

    The fuel efficiency of dump trucks is affected by real-world variables such as vehicle parameters, road conditions, weather and driver behavior. Predicting fuel consumption per trip from dynamic road-condition data can effectively reduce the cost and time associated with on-road testing. This paper proposes new models for predicting the fuel consumption of dump trucks in surface mining operations. The models combine locally collected data from dump truck sensors and analyze it to enhance their capabilities. The architecture consists of two distinct parts, based on dual Long Short-Term Memory (LSTM) layers and dual dense layers of a deep neural network (DNN). The hybrid architecture improves performance compared to other models, especially in terms of accuracy: the MAE, RMSE, MSE and R2 scores indicate high prediction accuracy.

    Keywords: LSTM algorithm, DNN, density, prediction, fuel consumption, quarries

  • Organization of data transmission over the communication network, taking into account the subscription fee for the use of communication channels

    The task of planning the sending of messages of known volumes from source points to destinations with known needs is considered. It is assumed that the costs of transmitting information are, on the one hand, proportional to the transmitted volumes and the unit cost of transmission over the selected communication channels and, on the other hand, include a fixed subscription fee for the use of channels that does not depend on the volume of transmitted information. The quality indicator of a plan under this formulation is the total cost of sending the entire planned volume of messages. The effectiveness of obtaining optimal plans with a linearized objective function is compared against an exact solution by one of the combinatorial methods.

    Keywords: message transmission, transport task, criterion of minimum total costs, computational complexity of the algorithm, linearization of the objective function
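    The two cost components, and the classic way the fixed fee is folded into a linear objective, can be sketched as follows (the channel costs, fees, and plan volumes are made-up numbers for illustration):

```python
def plan_cost(plan, unit_cost, fixed_fee):
    """True cost of a transmission plan: volume-proportional charges plus the
    subscription fee for every channel actually used."""
    cost = 0.0
    for (i, j), v in plan.items():
        if v > 0:
            cost += unit_cost[i, j] * v + fixed_fee[i, j]
    return cost

def linearized_unit_cost(unit_cost, fixed_fee, max_volume):
    """Classic linearization of the fixed-charge objective: spread the fee
    over the largest volume the channel could carry."""
    return {k: unit_cost[k] + fixed_fee[k] / max_volume[k] for k in unit_cost}

unit_cost = {(0, 0): 1.0, (0, 1): 2.0}
fixed_fee = {(0, 0): 10.0, (0, 1): 2.0}
max_vol = {(0, 0): 20.0, (0, 1): 20.0}
plan = {(0, 0): 20.0, (0, 1): 0.0}
print(plan_cost(plan, unit_cost, fixed_fee))                        # → 30.0
print(linearized_unit_cost(unit_cost, fixed_fee, max_vol)[(0, 1)])  # → 2.1
```

    The linearized costs make the problem an ordinary transportation problem, which is cheap to solve but only approximates the true fixed-charge optimum - hence the comparison with an exact combinatorial method.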

  • An overview of machine learning-based techniques for detecting outliers in data

    Outlier detection is an important area of data research in various fields. The aim of the study is to provide a non-exhaustive overview of methods for detecting outliers in data based on various machine learning techniques: supervised, unsupervised and semi-supervised. The article outlines the features of applying particular methods, their advantages and their limitations. It is established that there is no universal outlier detection method suitable for all data; the choice of a method should therefore be based on an analysis of the advantages and limitations of the candidate methods, with obligatory consideration of the available computing power and the characteristics of the available data, including whether they are labeled as outliers and normal data, as well as their volume.

    Keywords: outliers, machine learning, outlier detection, data analysis, data mining, big data, principal component analysis, regression, isolation forest, support vector machine
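    As one concrete instance of the unsupervised family surveyed above, here is the z-score rule: flag samples that lie more than a set number of standard deviations from the mean (the data and threshold are illustrative; note that with small samples the attainable z-score is bounded, so the threshold must be chosen accordingly):

```python
import statistics

def zscore_outliers(data, threshold=3.0):
    """Unsupervised outlier detection: flag samples whose z-score (distance
    from the mean in standard deviations) exceeds a threshold."""
    mu = statistics.mean(data)
    sigma = statistics.pstdev(data)
    return [x for x in data if abs(x - mu) / sigma > threshold]

data = [10, 11, 9, 10, 12, 10, 11, 9, 10, 95]
print(zscore_outliers(data, threshold=2.5))  # → [95]
```

    Its limitation is typical of the trade-offs discussed in the article: the rule assumes roughly unimodal data and a single extreme value can mask others by inflating the standard deviation.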

  • Optimizing Quantum Espresso for Nvidia GPUs with CUDA Technology

    This article explores how to optimize Quantum Espresso for efficient use of Nvidia graphics processing units (GPUs) with CUDA technology. Quantum Espresso is a powerful tool for quantum-mechanical simulation and calculation of material properties. However, the original version of the package was not designed for GPUs, so optimization is required to achieve the best performance.

    Keywords: Quantum Espresso, GPU, CUDA, compute acceleration

  • Problematic issues of cadastral valuation of cultural heritage objects

    This article discusses the process of collecting initial data for the cadastral assessment of cultural heritage objects. Cultural heritage objects are shown to have a number of features that distinguish them from other real estate objects, so special attention should be paid to the methodology for assessing them. The purpose of the study is to analyze and identify problematic issues in collecting initial data on cultural heritage objects during the state cadastral assessment. The study revealed a number of problematic issues related to the type of cultural heritage object, its linkage to a cadastral number, and the possibility of interpreting the collected information so that it can be processed automatically.

    Keywords: state cadastral valuation, cultural heritage object, collection of initial data, interdepartmental interaction, status of cultural heritage object

  • Automation of the competitive selection process for filling vacant positions of the University teaching staff

    The procedure for filling teaching staff positions at universities is regulated by federal laws and local regulations, and it requires storing and exchanging a large number of documents between the various participants of competitive events. The aim of the work was to automate the competitive selection process using a common data warehouse, which makes it possible to speed up paperwork, save time and consumables, and ensure the safety of storing, transmitting and processing information. The article presents the results of automating the competitive selection process at the Saint Petersburg State University of Architecture and Civil Engineering.

    Keywords: higher education institutions, competitive election, teaching staff, automation

  • Application of large language models in simulation modeling

    The modern cycle of creating simulation models involves analysts, modelers, developers and specialists from various fields. Numerous well-known tools exist to simplify simulation modeling; in addition, it is proposed to use large language models (LLMs) built on neural networks. The article considers the GPT-4 model as an example. Such models have the potential to reduce the financial and time costs of creating simulation models. Examples of using GPT-4 are presented, leading to the hypothesis that LLMs can replace, or significantly reduce the labor intensity of, employing a large number of specialists, and can even skip the formalization stage. The processes of creating models and conducting experiments with different simulation modeling tools were compared against the main simulation modeling criteria, and the results are presented in a comparative table. Experiments with GPT-4 demonstrated that the creation of simulation models using LLMs is significantly accelerated, and the approach holds great promise in this field.

    Keywords: Simulation modeling, large language model, neural network, GPT-4, simulation environment, mathematical model

  • Numerical experiments to investigate the relationship between Poisson's ratio and cohesion

    This is a pilot study. Its purpose is to identify the nature of the relationship between Poisson's ratio and cohesion, using a soil mass as an example. The main objective is to determine the dependence between Poisson's ratio and the cohesion coefficient at the fracture limit of the material (here, a soil massif), i.e., at the onset of plastic flow. The study is conducted by methods of mathematical modeling. To achieve the objective, it is necessary to justify performing the experiment by means of a boundary value problem and to rank the number of numerical experiments by the experiment planning method in order to obtain the extrema. The numerical experiment itself is then performed to reveal the relationship between Poisson's ratio and cohesion. The obtained data will be used to formulate the inverse problem when testing a new Russian software product for geotechnical and geomechanical modeling.

    Keywords: Poisson's ratio, cohesion, soil massif, numerical experiment, finite element method, mathematical modelling, plastic flow, deformation, stress

  • Vulnerability analysis in data security systems

    The article is a review of the methods and technologies used in analyzing vulnerabilities in information systems. It describes the main steps of a vulnerability analysis, such as collecting information about the system, scanning it for vulnerabilities, and analyzing the scan results. It also discusses protective measures, such as regularly updating software, conducting vulnerability analysis, and developing a data security strategy.

    Keywords: vulnerability analysis, data security, information security threats, attack protection, information security, computer security, security risk, network vulnerability, security system, protection

  • Basic criteria for choosing a common data environment for the work of design organizations

    The publication discusses the definition of a common data environment (CDE), puts forward the main criteria for choosing a CDE, and provides a generalized analysis of the weaknesses of existing CDE systems. The article will help the reader better understand CDEs and make the right choice of system.

    Keywords: common data environment, design, construction, information, information modeling, CDE, criteria, management, information organization, information transfer

  • Development of a malware detection method based on system call graphs and machine learning

    This article is devoted to the research and detection of malware. The implemented method dynamically detects Android malware from system call graphs using graph neural networks. The objective of this work is to create a computer model for a method designed to detect and investigate malware. Research on this topic is important for mathematical and software modeling, as well as for applying system call monitoring algorithms on Android devices. The originality of this direction lies in the constant improvement of approaches to fighting malware and in the limited information available on the use of computer simulation to study such phenomena.

    Keywords: system calls, Android, virus, malware, neural networks, artificial intelligence, fuzzy logic
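    The first step of such a method is turning a recorded system call trace into a graph that a graph neural network can consume. A minimal sketch (the trace below is a made-up example, not data from the study):

```python
from collections import Counter

def syscall_graph(trace):
    """Build a weighted, directed system-call graph from an execution trace:
    nodes are system calls, and the weight of edge (a, b) counts how often
    call b directly follows call a."""
    return Counter(zip(trace, trace[1:]))

trace = ["open", "read", "send", "read", "send", "close"]
g = syscall_graph(trace)
print(g[("read", "send")])  # the read→send edge occurs twice → 2
```

    Edge weights like the repeated read→send transition are exactly the kind of behavioral signature (e.g., data exfiltration loops) that a graph classifier can learn to flag.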