
  • Moving from a university data warehouse to a data lake: models and methods of big data processing

    The article examines the transition of universities from data warehouses to data lakes, revealing their potential in processing big data. The introduction highlights the main differences between warehouses and lakes, focusing on the difference in data-management philosophy. Data warehouses are typically used for structured data with a relational architecture, while data lakes store data in its raw form, supporting flexibility and scalability. The section "Data Sources Used by the University" describes how universities manage data collected from various departments, including ERP systems and cloud databases. The discussion of data lakes and data warehouses highlights their key differences in data processing and management methods, along with their advantages and disadvantages. The article examines in detail the problems and challenges of the transition to data lakes, including security, scale, and implementation costs. Architectural models of data lakes such as the "Raw Data Lake" and the "Data Lakehouse" are presented, describing various approaches to managing the data lifecycle and business goals. Big data processing methods in lakes cover the use of the Apache Hadoop platform and current storage formats. Processing technologies are described, including the use of Apache Spark and machine learning tools. Practical examples of data processing and of applying machine learning orchestrated through Spark are provided. In conclusion, the relevance of the transition to data lakes for universities is emphasized, security and governance challenges are noted, and the use of cloud technologies is recommended to reduce costs and increase productivity in data management.

    Keywords: data warehouse, data lake, big data, cloud storage, unstructured data, semi-structured data
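
    A minimal PySpark sketch of the raw-zone/curated-zone flow the abstract describes, assuming hypothetical lake paths and column names: raw JSON events are read as-is, lightly cleaned, and written out as a curated Parquet table for analytics.

```python
# Hypothetical paths and columns; illustrates the raw -> curated pattern.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("university-lake-demo").getOrCreate()

# Raw zone: data is stored as-is, schema inferred on read.
raw = spark.read.json("s3a://university-lake/raw/lms_events/")

# Light curation: drop malformed rows, normalize a timestamp column.
curated = (raw
           .dropna(subset=["student_id", "event_type"])
           .withColumn("event_date", F.to_date("event_ts")))

# Curated zone: columnar storage, partitioned for analytical queries.
curated.write.mode("overwrite").partitionBy("event_date") \
       .parquet("s3a://university-lake/curated/lms_events/")

# Example analytical query over the curated zone.
curated.groupBy("event_type").count().show()
```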

  • Development of a Simplified Calibration Method for a Collaborative Robot on a Mobile Cart

    The article develops calibration methods to improve the accuracy and reduce the operating costs of robotic systems in warehouse logistics. Special attention is given to the use of laser sensors and offset parameters, enabling the robot's position to adapt to changing conditions. The methodology includes the stages of initialization, orientation, and final verification, which help minimize deviations and reduce the need for manual adjustment. This approach ensures consistent operational accuracy and lowers costs through automated, adaptive calibration of the robot.

    Keywords: robot calibration, warehouse automation, laser sensor, offset, positioning accuracy, robotic system, adaptive calibration, automatic calibration, collaborative robot, cobot
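
    A toy sketch of the three-stage calibration loop (initialization, orientation, final verification). The laser-sensor stub, the 1 mm tolerance, and the convergence behavior are hypothetical placeholders, not the article's actual procedure.

```python
TOLERANCE_MM = 1.0

# Hypothetical "true" cart misalignment (dx, dy) in millimetres.
_true_offset = [2.4, -1.1]

def read_laser_offsets():
    """Stub for the laser sensor: the currently measured (dx, dy)
    deviation from the reference docking pose."""
    return tuple(_true_offset)

def apply_correction(offsets):
    """Shift the robot's base frame by the measured offsets; here we
    shrink the simulated misalignment to mimic a successful move."""
    for i, o in enumerate(offsets):
        _true_offset[i] -= o

def calibrate(max_iterations=5):
    for _ in range(max_iterations):
        offsets = read_laser_offsets()    # initialization / re-measure
        if all(abs(o) <= TOLERANCE_MM for o in offsets):
            return offsets                # final verification passed
        apply_correction(offsets)         # orientation / correction step
    raise RuntimeError("calibration did not converge; manual check needed")

print("residual deviation (mm):", calibrate())
```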

  • On the Development of Secure Applications Based on the Integration of the Rust Programming Language and PostgreSQL DBMS

    Currently, key aspects of software development include the security and efficiency of the applications being created. Special attention is given to data security and operations involving databases. This article discusses methods and techniques for developing secure applications through the integration of the Rust programming language and the PostgreSQL database management system (DBMS). Rust is a general-purpose programming language that prioritizes safety as its primary objective. The article examines key concepts of Rust, such as strict typing, the RAII (Resource Acquisition Is Initialization) idiom, macro definitions, and immutability, and shows how these features contribute to the development of reliable and high-performance applications when interfacing with databases. The integration with PostgreSQL is analyzed and shown to be both straightforward and robust, offering efficient data management while maintaining a high level of security and thereby mitigating common errors and vulnerabilities. Rust is currently used less than popular languages such as JavaScript, Python, and Java, partly because of its steep learning curve. However, major companies see its potential: Rust modules are being integrated into operating system kernels (Linux, Windows, Android), Mozilla is developing features for Firefox's Gecko engine in Rust, and StackOverflow surveys show rising usage of the language. A practical example involving the dispatch of information about class schedules and video content illustrates the advantages of using Rust in conjunction with PostgreSQL to create a scheduling management system that ensures data integrity and security.

    Keywords: Rust programming language, memory safety, RAII, metaprogramming, DBMS, PostgreSQL
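
    The article's practical example pairs Rust with PostgreSQL; purely as a language-neutral illustration of the described schedule-dispatch data flow, here is a sketch in Python with psycopg2 (table and column names are hypothetical). Parameter binding here is a runtime convention, whereas Rust's type system and RAII enforce comparable safety properties at compile time.

```python
import psycopg2

# Hypothetical connection settings for a university schedule database.
conn = psycopg2.connect(
    host="localhost", dbname="university", user="app", password="secret")

with conn, conn.cursor() as cur:
    # Parameterized query: the driver escapes the value, preventing
    # SQL injection regardless of what the group name contains.
    cur.execute(
        "SELECT class_time, room, video_url FROM schedule "
        "WHERE group_name = %s ORDER BY class_time",
        ("CS-101",))
    for class_time, room, video_url in cur.fetchall():
        print(class_time, room, video_url)

conn.close()
```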

  • The method of multiple initial connections as a tool for enhancing information security in peer-to-peer virtual private networks

    The article presents the method of multiple initial connections aimed at enhancing the information security of peer-to-peer virtual private networks. This method ensures the simultaneous establishment of several initial connections through intermediate nodes, which complicates data interception and minimizes the risks of connection compromise. The paper describes the algorithmic foundation of the method and demonstrates its application using a network of four nodes. An analysis of packet routing is conducted, including the stages of packet formation, modification, and transmission. To calculate the number of unique routes and assess data interception risks, a software package registered with the Federal Service for Intellectual Property was developed. The software utilizes matrix and combinatorial methods, providing high calculation accuracy and analysis efficiency. The proposed method has broad application prospects in peer-to-peer networks, Internet of Things systems, and distributed control systems.

    Keywords: multiple initial connections, peer-to-peer network, virtual private network, information security, data transmission routes, intermediate nodes, unique routes
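
    A small sketch of the matrix method mentioned in the abstract: for an adjacency matrix A, the number of walks of length k between nodes i and j equals (A^k)[i][j]. The four-node topology below is hypothetical, and counting walks (which may revisit nodes) is a simplification of the software package's unique-route analysis.

```python
import numpy as np

# Adjacency matrix of a hypothetical 4-node peer-to-peer network.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])

source, target = 0, 3

# Count walks from source to target using 1..3 hops via matrix powers.
total = 0
power = np.eye(len(A), dtype=int)
for hops in range(1, 4):
    power = power @ A                 # power now equals A**hops
    routes = power[source, target]
    total += routes
    print(f"walks of length {hops}: {routes}")
print("total walks up to 3 hops:", total)
```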

  • Development of a secure connection establishment algorithm for peer-to-peer virtual private networks using multi-level cryptographic protection

    The article presents an algorithm for establishing a secure connection in peer-to-peer virtual private networks aimed at enhancing information security. The algorithm employs modern cryptographic protocols such as IKEv2, RSA, and DH, providing multi-level data protection. The developed algorithm structure includes dynamic generation and destruction of temporary keys, reducing the risk of compromise. The proposed solution is designed for use in corporate network security systems, Internet of Things systems, and distributed systems.

    Keywords: virtual private network, peer-to-peer network, cryptographic protocols, RSA, Diffie-Hellman, IKEv2, secure connection, multi-layer protection, information security, distributed systems
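
    A sketch of one building block of the described algorithm, namely an ephemeral Diffie-Hellman exchange whose temporary keys exist only for the duration of the handshake, using the pyca/cryptography package. The parameter size is illustrative, and the IKEv2/RSA authentication layers the article also employs are omitted.

```python
from cryptography.hazmat.primitives.asymmetric import dh

# Both peers agree on public DH parameters (normally fixed per VPN).
parameters = dh.generate_parameters(generator=2, key_size=2048)

# Each peer generates a temporary (ephemeral) key pair.
alice_private = parameters.generate_private_key()
bob_private = parameters.generate_private_key()

# Peers exchange public keys and derive the same shared secret.
alice_secret = alice_private.exchange(bob_private.public_key())
bob_secret = bob_private.exchange(alice_private.public_key())
assert alice_secret == bob_secret

# "Destruction" of temporary keys: drop all references so the key
# material is no longer reachable, shrinking the compromise window.
del alice_private, bob_private
```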

  • A review of technologies for deceiving an attacker (traps, decoys, moving target defense, deception platform), their classification and interaction

    The purpose of the article is to review various approaches to deceiving attackers in the network and to analyze the applicability and variability of modern deception technologies. The research method is the analysis of existing articles in peer-reviewed Russian and foreign sources, the aggregation of research, and the formation of conclusions based on the analyzed sources. The review considers technologies for deceiving an attacker (Honeypot traps, Honeytoken decoys, moving target defense (MTD), and Deception platforms). The effectiveness of deception in terms of its impact on a person's mental state is discussed. The article describes different types of Honeypots and their components, and discusses their classification by target, place of introduction, level of interaction, location, type of introduction, homogeneity, and type of activity. Different strategies for using traps in the network are discussed: sacrificial lamb, hacker zoo, minefield, proximity traps, redirection screens, and deception ports. A classification of decoys is given, methods of their application in an organization's network are described, and additional conditions that increase the probability of detecting an attacker through decoys are specified. The basic techniques of the MTD strategy for obfuscating the infrastructure are presented, and the interaction of these methods with Honeypot and Honeytoken technologies is described. Research confirming the effectiveness of using MTD in conjunction with traps and decoys is cited in the article, and the difficulties in applying this strategy are pointed out. A description of the Deception platform is given, its distinctive features compared with conventional traps and decoys are described, and the possibility of its interaction with MTD is shown. As a result, the main technologies and strategies for deceiving attackers have been identified and described, their development traced, and their interaction with attackers and counteraction to them described.

    Keywords: Deception Platform, Honeypot, Honeytoken, Honeynet, MTD
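
    A minimal low-interaction Honeypot sketch: a TCP listener on an otherwise unused port that logs every connection attempt. The port and output destination are arbitrary choices here; production traps add service banners, isolation, and alerting.

```python
import socket
import datetime

HONEYPOT_PORT = 2222  # looks like an SSH service moved off port 22

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("0.0.0.0", HONEYPOT_PORT))
srv.listen(5)
print(f"honeypot listening on port {HONEYPOT_PORT}")

while True:
    conn, addr = srv.accept()
    # Any contact with this port is suspicious by construction:
    # legitimate users have no reason to connect to it.
    print(f"{datetime.datetime.now().isoformat()} "
          f"probe from {addr[0]}:{addr[1]}")
    conn.close()
```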

  • The optimal multi-frequency modem for intermodule interaction of hybrid communications systems of a digital city

    The development, research, and construction of devices that speed up interaction between various modules (for example, telemetry and remote control systems) and, more broadly, within the hybrid communication systems of a digital city that encompass the variety of systems used in an intelligent building, is an urgent problem. One such device presented in the article is the optimal multi-frequency modem developed by the authors. In addition to the developed modem, the article surveys examples of similar devices and systems developed by both Russian and foreign researchers. The authors demonstrate that the proposed modem provides a gain in spectral and energy efficiency in comparison with its analogues. The proposed approach can be used to organize high-speed data transmission over frequency-limited communication channels based on new wired technologies of the digital subscriber line standard, as well as in wireless systems.

    Keywords: telemetry and remote control system, intelligent building, digital city hybrid communications system, modem, multi-frequency modulation, digital subscriber line, optimal finite signal, modulator, demodulator, wireless communication system
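
    A toy multi-frequency (OFDM-style) modulation sketch: QPSK symbols are placed on orthogonal subcarriers with an inverse FFT and recovered with the forward FFT. The subcarrier count and constellation are illustrative; the article's optimal finite signals are a more refined construction.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subcarriers = 64

# Random bits -> QPSK symbols (one complex symbol per subcarrier).
bits = rng.integers(0, 2, size=(n_subcarriers, 2))
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Multi-frequency modulation: one IFFT spreads the symbols across
# orthogonal subcarriers in a single time-domain block.
time_block = np.fft.ifft(symbols)

# Demodulation is the forward FFT; orthogonality recovers each symbol.
recovered = np.fft.fft(time_block)
print("max symbol error:", np.max(np.abs(recovered - symbols)))
```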

  • Search for patent analogues based on a comparison of key phrases

    This study describes approaches to automating full-text keyword search in the field of patent information. Automating search by key phrases (n-grams) is a significantly more difficult task than searching by individual words; in addition, it requires morphological and syntactic analysis of the text. To achieve this goal, the following tasks were solved: (a) the full-text search systems Apache Solr, ElasticSearch, and ClickHouse were analyzed; (b) the architectures and basic capabilities of each system were compared; (c) search results in Apache Solr, ElasticSearch, and ClickHouse were obtained on the same dataset. The following conclusions were drawn: (a) all the systems considered perform full-text keyword search; (b) Apache Solr is the system with the highest performance and a convenient feature set; (c) ElasticSearch has a fast and powerful architecture; (d) ClickHouse has a high data processing speed.

    Keywords: search, keyphrases, patent, Apache Solr, Elasticsearch, ClickHouse
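
    A sketch of a key-phrase query against Apache Solr's standard /select endpoint; the core name "patents" and field name "text" are hypothetical. Elasticsearch and ClickHouse expose analogous HTTP and SQL interfaces.

```python
import requests

SOLR_URL = "http://localhost:8983/solr/patents/select"

params = {
    "q": 'text:"neural network"',  # phrase (n-gram) query, not single words
    "rows": 10,
    "wt": "json",
}
resp = requests.get(SOLR_URL, params=params, timeout=10)
resp.raise_for_status()

body = resp.json()
print("hits:", body["response"]["numFound"])
for doc in body["response"]["docs"]:
    print(doc.get("id"))
```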

  • A Systemic Approach to the Implementation and Development of Innovative Information Technologies for the Prevention of Offenses Committed by Foreign Citizens

    The article presents a comprehensive analysis of a systematic approach to the implementation and development of innovative information technologies aimed at preventing offenses committed by foreign citizens. The introduction provides an overview of the growing importance of employing advanced technological solutions in law enforcement, particularly in addressing challenges associated with foreign nationals. The main objectives of the study are to explore how the integration of technologies such as big data processing, artificial intelligence, and geographic information systems can enhance the efficiency of preventive measures. The article details the use of data analysis techniques, machine learning models, and system integration to create a unified information platform. This platform enables the consolidation of data from diverse sources, thereby improving the coordination between different law enforcement units and facilitating faster and more informed decision-making processes. The integration of these technologies also supports process standardization, reducing data inconsistencies and ensuring more reliable operations across various departments. The results highlight the benefits of utilizing big data analytics to process vast amounts of information that would be otherwise impossible to handle efficiently. Artificial intelligence, through predictive models and risk assessment tools, plays a crucial role in identifying potential threats and allocating resources effectively. Geographic information systems contribute by mapping crime hotspots and providing spatial analysis, which aids in targeted intervention strategies. The discussion emphasizes the importance of a unified approach to technology implementation, focusing on the creation of an integrated information system that can adapt to ongoing changes in the social and legal environment. The adaptability of the system is critical for maintaining its effectiveness in the face of new challenges and evolving regulatory requirements. The development of standardized data collection and processing protocols further enhances the system's resilience and operational efficiency. In conclusion, the article underscores that a systematic and integrated use of innovative information technologies significantly improves the effectiveness of crime prevention efforts and the overall efficiency of law enforcement agencies. The proposed approach not only facilitates proactive measures but also ensures a high level of responsiveness to emerging security threats, thereby strengthening public safety.

    Keywords: systemic approach, innovative information technologies, crime prevention, foreign citizens, big data, artificial intelligence, geoinformation systems, information platform, standardization, law enforcement agencies, efficiency management, data integration

  • Game-based training models in a simulation environment of organizational conflicts

    It is shown that specific forms of simulation games, combined with certain peculiarities of training sessions in organizational systems, can yield adaptable simulation models of a business situation. A cognitive model is recommended for problem analysis of organizational systems, as it allows a natural transition from cognitive to simulation models while remaining within visual topological descriptions. The AnyLogic software platform was chosen for developing the model, as it provides ample opportunities for creating an innovative educational environment with elements of game simulation and AI. Cognitive analysis of the game learning process revealed that it should comprise one cycle of a business game with two interactive nodes that introduce a host and a player into the game. It is noted that business games focused on developing management styles in a conflict are most in demand. Therefore, a simulation model has been developed to train executives to counteract an organizational conflict within the variability of authoritarian, democratic, and liberal management styles. The model uses the system dynamics paradigm and is implemented in the AnyLogic notation. To set the rules, the game host, in the initial state or when starting the next game cycle, sets the dynamic characteristics of the process of managing the organizational structure and changes the characteristic values of a pre-conflict situation. In response to the developing conflict, the player manages it using the auxiliary services available to him. Notably, the model is not limited to a fixed list of game tasks or a fixed set of options for a player's decision.

    Keywords: management diversification, production diversification, financial and economic diversification goals, production and technical goals to ensure production flexibility
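
    A toy system-dynamics sketch in the spirit of the described model: conflict intensity is a stock with a tension inflow and an outflow that depends on the management style chosen by the player. The per-style coefficients below are hypothetical placeholders, not the calibrated values of the AnyLogic model.

```python
STYLE_DAMPING = {
    "authoritarian": 0.50,  # fast suppression via direct orders
    "democratic": 0.35,     # slower, consensus-based de-escalation
    "liberal": 0.15,        # weak intervention
}

def simulate(style, steps=20, dt=1.0, growth=0.8):
    conflict = 10.0  # initial pre-conflict tension set by the game host
    damping = STYLE_DAMPING[style]
    history = []
    for _ in range(steps):
        # Stock-and-flow update (one Euler integration step).
        inflow = growth
        outflow = damping * conflict
        conflict += dt * (inflow - outflow)
        history.append(round(conflict, 2))
    return history

for style in STYLE_DAMPING:
    print(f"{style:13s} -> final conflict level {simulate(style)[-1]}")
```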

  • Development of a speed correction algorithm to reduce wear on rollers of a feeding machine based on metal tension data

    The article discusses wear of feeding machine rollers caused by speed mismatch in the material tracking mode. Existing methods of dealing with wear address the effect of the problem rather than its cause. One way to reduce the wear intensity of roller barrels is to develop a method of controlling the speed of the feeding machine that reduces the mismatch between the speeds of the rollers and the rolled product without violating the known technological requirements for creating pulling and braking forces. An algorithm is disclosed for calculating a speed adjustment based on metal tension, which compensates for roller wear and reduces the friction force. Modeling of the system with the developed algorithm showed the elimination of speed mismatch during material tracking, which is therefore expected to reduce the intensity of roller wear.

    Keywords: speed correction system, feeding machine, roller wear, metal tension, control system, speed mismatch, friction force reduction
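
    A toy sketch of tension-based speed correction: when measured tension deviates from the setpoint, the roller speed is trimmed proportionally, within a cap reflecting technological limits. The gain, limits, and sign convention are hypothetical, not the article's algorithm.

```python
TENSION_SETPOINT = 20.0   # kN, desired strip tension (illustrative)
K_CORRECTION = 0.004      # (m/s) per kN of tension error, illustrative
MAX_TRIM = 0.05           # m/s, cap so pulling/braking limits hold

def corrected_speed(base_speed, measured_tension):
    """Return roller speed adjusted toward the tension setpoint.
    If tension exceeds the setpoint, the rollers are sped up slightly
    to relieve it (the sign convention is configuration-dependent)."""
    error = measured_tension - TENSION_SETPOINT
    trim = max(-MAX_TRIM, min(MAX_TRIM, K_CORRECTION * error))
    return base_speed + trim

for tension in (20.0, 24.0, 31.0):
    print(f"{tension} kN -> {corrected_speed(1.50, tension):.4f} m/s")
```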

  • Development of a malicious traffic detection system to increase the number of detected anomalies

    Relevance of the research topic. Modern cyberattacks are becoming more complex and diverse, which makes classical anomaly detection methods, such as signature-based and heuristic ones, insufficiently effective. In this regard, it is necessary to develop more advanced network threat detection systems based on machine learning and artificial intelligence technologies. Problem statement. Existing methods of detecting malicious traffic often suffer from high false-positive rates and insufficient accuracy against real network threats. This reduces the effectiveness of cybersecurity systems and makes it difficult to identify new attacks. The purpose of the study. The purpose of this work is to develop a malicious traffic detection system that increases the number of detected anomalies in network traffic through the introduction of machine learning and AI technologies. Research methods. To achieve this goal, thorough analysis and preprocessing of data obtained from publicly available datasets such as CICIDS2017 and KDD Cup 1999 were carried out.

    Keywords: anomaly detection, malicious traffic, cybersecurity, machine learning, artificial intelligence, signature methods
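
    A minimal supervised baseline of the kind usable on CICIDS2017-style flow records: a random forest separating benign from malicious flows with scikit-learn. The CSV path stands for a hypothetical numeric-feature extract; the "Label"/"BENIGN" convention follows the public dataset.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("cicids2017_flows.csv")           # hypothetical extract
X = df.drop(columns=["Label"])
y = (df["Label"] != "BENIGN").astype(int)          # 1 = malicious flow

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=42)
clf.fit(X_train, y_train)

# Report precision/recall: false positives are the key concern here.
print(classification_report(y_test, clf.predict(X_test)))
```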

  • A fuzzy comparison method for managing the functioning of an organizational system

    Modern digitalization processes involve the use of intelligent systems at key stages of information processing. Given that the data available for intelligent analysis in organizational systems are often fuzzy, the problem arises of comparing the corresponding units of information with each other. Several methods for such a comparison are known. In particular, for random fuzzy variables with known distribution laws, the degree of coincidence of these laws can be used as a criterion of how closely one random variable corresponds to another. However, this approach lacks the flexibility required to solve practical problems. The approach we propose allows comparing fuzzy with fuzzy, fuzzy with crisp, and crisp with crisp data. The paper provides an example illustrating this approach. The material presented in the study was initially focused on managing organizational systems in education; however, its results can be extended to other organizational systems.

    Keywords: fuzzy data, weakly structured problems, comparison criteria, hierarchy analysis method, systems analysis, fuzzy benchmarking
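
    A small sketch of one generic way to compare fuzzy and crisp data: represent each value as a triangular fuzzy number (a crisp value being a degenerate triangle) and take the possibility measure sup_x min(mu1(x), mu2(x)) as the degree of match. This is a standard construction, not the article's specific criterion.

```python
def tri_membership(x, a, b, c):
    """Triangular membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 1.0 if x == b else 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def possibility(f1, f2, lo=0.0, hi=100.0, steps=2001):
    """Grid approximation of sup_x min(mu1(x), mu2(x))."""
    best = 0.0
    for i in range(steps):
        x = lo + (hi - lo) * i / (steps - 1)
        best = max(best, min(tri_membership(x, *f1),
                             tri_membership(x, *f2)))
    return best

fuzzy_grade = (60.0, 75.0, 90.0)   # "about 75 points"
crisp_grade = (82.0, 82.0, 82.0)   # exactly 82 points
print("degree of match:", round(possibility(fuzzy_grade, crisp_grade), 3))
```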

  • A mathematical model of a fault-tolerant nonlinear conversion unit for OFDM wireless systems with frequency hopping

    With the development of low-orbit satellite Internet systems, ensuring effective operation under intentional interference comes to the fore. One solution relies on systems that combine OFDM methods with generators implementing frequency hopping (FH). Obviously, the more complex the algorithm for selecting operating frequencies, the more effective the FH system. The article proposes using the SPN cipher "Grasshopper" (Kuznyechik) as the generator for selecting operating frequencies. As a result, the FH system becomes highly resistant to attempts by electronic warfare systems to compute the operating frequency numbers. However, faults and failures may occur during the operation of such a generator. To prevent their consequences, it is proposed to implement the SPN cipher using polynomial modular codes of residue classes (PMCRC). One of the transformations in Grasshopper is the nonlinear transformation that performs the substitution operation. Creating a new mathematical model for performing the nonlinear transformation using PMCRC will keep the SPN-cipher-based FH generator operational in the presence of faults and failures.

    Keywords: low-orbit satellite Internet systems, the Grasshopper SPN cipher, nonlinear transformations, modular codes of residue classes, mathematical model, fault tolerance, frequency hopping, polynomial modular code of residue classes
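
    A toy sketch of the underlying idea: the SPN cipher's nonlinear step is a table-driven S-box substitution, and redundant computation lets faults in that step be detected. The 16-entry S-box and the duplication-with-comparison check are illustrative stand-ins for Grasshopper's 256-byte S-box and the polynomial modular residue codes developed in the article.

```python
S_BOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
         0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]  # a 4-bit permutation

def substitute(nibbles):
    """Nonlinear transformation: substitute every 4-bit value."""
    return [S_BOX[n] for n in nibbles]

def substitute_checked(nibbles):
    """Fault-tolerant wrapper: compute the substitution over two
    redundant channels and flag a fault on disagreement."""
    first = substitute(nibbles)
    second = substitute(list(nibbles))  # redundant channel
    if first != second:
        raise RuntimeError("fault detected in nonlinear transformation")
    return first

block = [0x1, 0xA, 0x4, 0xF]
print([hex(n) for n in substitute_checked(block)])
```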

  • One of the Approaches to Analyzing Source Code in Student Projects

    When evaluating student work, the analysis of written assignments, and in particular of source code, becomes especially relevant. This article discusses an approach to evaluating the dynamics of feature changes in students' source code. Various source code metrics are analyzed and key ones identified, including quantitative metrics, program control flow complexity metrics, and the TIOBE quality indicator. A text dataset containing program source code from a website dedicated to practical programming was used to determine threshold values for each metric and categorize them. The obtained results were used to analyze students' source code with a developed service that allows evaluating work based on key features, observing the dynamics of code indicators, and understanding a student's position within the group based on the obtained values.

    Keywords: machine learning, text data analysis, program code analysis, digital footprint, data visualization
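
    A small sketch of extracting metrics of the kinds discussed above from Python source using the standard ast module: nonblank lines of code and a McCabe-style cyclomatic complexity estimate (1 plus branch points). Threshold values for categorization would then be fitted to the dataset, as the article describes.

```python
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try,
                ast.With, ast.BoolOp, ast.ExceptHandler)

def metrics(source: str) -> dict:
    tree = ast.parse(source)
    # Rough cyclomatic complexity: one plus the number of branch points.
    complexity = 1 + sum(isinstance(node, BRANCH_NODES)
                         for node in ast.walk(tree))
    loc = len([ln for ln in source.splitlines() if ln.strip()])
    return {"loc": loc, "cyclomatic_estimate": complexity}

student_code = """
def grade(scores):
    total = 0
    for s in scores:
        if s >= 0:
            total += s
    return total / len(scores) if scores else 0
"""
print(metrics(student_code))
```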