1. The analysis of the effects of a fare free public transport travel demand based on e-ticketing
Danijel Hojski, David Hazemali, Marjan Lep, 2022, original scientific article
Description: The traditional approach in public transport planning was to collect travel demand data over an extended period and compose timetables to serve that demand. Two significant issues can be identified. In rural areas and during off-peak hours, public transport operators provide far more capacity than needed; conversely, on certain lines and at certain departures, more capacity than scheduled is needed on sporadically occurring occasions. The problem is how to react to short-term (daily) changes in travel demand triggered by exceptional circumstances and events, and to mid-term changes (on a weekly or monthly basis). Changes in travel demand can be triggered chiefly by introducing a highly attractive (almost free) tariff applied to specific population groups. No long-term travel response data exists for this kind of intervention, yet an immediate response in public transport supply is needed. In Slovenia, fare-free public transport was introduced for the entire population over 65. With a modern ticketing system designed to be as simple as possible for users ("check-in only" at the moment of boarding), the research task was to analyze the travel behavior of the retired population, faced with this new attractive travel option, based on data on purchased tickets and their subsequent validation, to support better mid- and long-term planning. Our study finds that ITS technology (in this case, an e-ticketing system) can satisfactorily solve the discussed planning and management task.
Keywords: fare-free public transport, smart card data collecting, population mobility, travel demand
Published in DKUM: 13.03.2025
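The abstract's core data task, inferring demand from "check-in only" boarding records, can be sketched as a simple aggregation of check-ins by line and hour. The record layout and line identifiers below are hypothetical stand-ins for whatever the e-ticketing system actually exports.

```python
from collections import Counter
from datetime import datetime

def demand_by_line_and_hour(checkins):
    """Count boardings per (line, hour-of-day) from check-in records.

    Each record is a (line_id, iso_timestamp) pair -- a hypothetical
    layout, not the actual export format of the Slovenian system.
    """
    counts = Counter()
    for line_id, ts in checkins:
        hour = datetime.fromisoformat(ts).hour
        counts[(line_id, hour)] += 1
    return counts

checkins = [
    ("L6", "2022-05-10T07:45:00"),
    ("L6", "2022-05-10T07:52:00"),
    ("L6", "2022-05-10T13:10:00"),
    ("L19", "2022-05-10T07:58:00"),
]
demand = demand_by_line_and_hour(checkins)
# Two boardings on line L6 in the 07:00 hour suggest a morning peak there.
```

Aggregates like these, accumulated per week or month, are the kind of input a planner could use to resize capacity on specific departures.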
2. Application of machine learning to reduce casting defects from bentonite sand mixture
Žiga Breznikar, Marko Bojinović, Miran Brezočnik, 2024, original scientific article
Description: One of the largest Slovenian foundries (referred to as Company X) primarily focuses on casting moulds for the glass industry. In collaboration with Pro Labor d.o.o., Company X has been systematically gathering defect data since 2021. The analysis revealed that the majority of scrap caused by technological issues is attributable to sand defects. The initial dataset included information on defect occurrences, technological parameters of the sand mixture, and chemical properties of the cast material. This raw data was refined using data science techniques and statistical methods to support classification. Multiple binary classification models distinguishing good castings from scrap were developed with the k-nearest neighbours algorithm, using sand mixture parameters as inputs. Their performance was evaluated using various classification metrics. Additionally, recommendations were made for the development of a real-time industrial application to optimize and regulate pouring temperature in the foundry process. This application would simulate different pouring temperatures while keeping the other parameters fixed, selecting the temperature that maximizes the likelihood of a successful casting.
Keywords: gravity casting, machine learning, defects, classifier, data science
Published in DKUM: 11.03.2025
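The k-nearest neighbours classification described in the abstract can be sketched in a few lines: classify a sand-mixture parameter vector by majority vote among its nearest training examples. The two features here (compactability, moisture) and all values are illustrative stand-ins, not the study's actual parameters or data.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points under Euclidean distance -- basic k-NN, as in the study.
    `train` is a list of (feature_vector, label) pairs."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy (compactability, moisture %) vectors -- hypothetical stand-ins
# for the sand mixture parameters used as model inputs.
train = [
    ((38.0, 3.2), "good"), ((40.0, 3.4), "good"), ((39.0, 3.1), "good"),
    ((46.0, 4.6), "scrap"), ((47.0, 4.8), "scrap"), ((45.0, 4.5), "scrap"),
]
print(knn_predict(train, (39.5, 3.3)))  # near the "good" cluster
```

The recommended real-time application could then query such a model over a sweep of candidate pouring temperatures, holding the other inputs fixed, and pick the temperature predicted most likely to yield a good casting.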
3. Enhancing manufacturing precision: Leveraging motor currents data of computer numerical control machines for geometrical accuracy prediction through machine learning
Lucijano Berus, Jernej Hernavs, David Potočnik, Kristijan Šket, Mirko Ficko, 2024, original scientific article
Description: Direct verification of the geometric accuracy of machined parts cannot be performed simultaneously with active machining operations, as it usually requires subsequent inspection with measuring devices such as coordinate measuring machines (CMMs) or optical 3D scanners. This sequential approach increases production time and costs. In this study, we propose a novel indirect measurement method that uses motor current data from the controller of a Computer Numerical Control (CNC) machine, in combination with machine learning algorithms, to predict the geometric accuracy of machined parts in real time. Several machine learning algorithms, such as Random Forest (RF), k-nearest neighbors (k-NN), and Decision Trees (DT), were used for predictive modeling. Feature extraction was performed with Tsfresh and ROCKET, which allowed us to capture the patterns in the motor current data corresponding to the geometric features of the machined parts. Our predictive models were trained and validated on a dataset that included motor current readings and the corresponding geometric measurements of a mounting rail later used in an engine block. The results showed that the proposed approach enabled the prediction of three geometric features of the mounting rail with an accuracy (MAPE) below 0.61% during the learning phase and below 0.64% during the testing phase. These results suggest that our method could reduce the need for post-machining inspections and measurements, thereby reducing production time and costs while maintaining the required quality standards.
Keywords: smart production machines, data-driven manufacturing, machine learning algorithms, CNC controller data, geometrical accuracy
Published in DKUM: 10.03.2025
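The pipeline sketched in the abstract is: summarize a motor-current trace into features, feed the features to a regressor, and score predictions with MAPE. A minimal illustration of the first and last steps follows; the handful of summary statistics here only hints at the far richer feature sets Tsfresh and ROCKET produce, and the current values are made up.

```python
import statistics

def extract_features(signal):
    """A few summary features of a motor-current trace, in the spirit
    of (much richer) Tsfresh-style extraction."""
    return {
        "mean": statistics.fmean(signal),
        "std": statistics.pstdev(signal),
        "max": max(signal),
        "min": min(signal),
        "abs_energy": sum(v * v for v in signal),
    }

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy metric quoted in
    the abstract (below 0.61% / 0.64%)."""
    return 100.0 * statistics.fmean(
        abs(a - p) / abs(a) for a, p in zip(actual, predicted)
    )

current = [1.02, 1.05, 1.10, 1.08, 0.97, 1.01]  # illustrative trace (A)
feats = extract_features(current)
print(mape([10.0, 20.0, 40.0], [10.1, 19.8, 40.0]))
```

The feature dictionary would be assembled into a design matrix across many machining runs and paired with CMM measurements as regression targets.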
4. Enhancing trust in automated 3D point cloud data interpretation through explainable counterfactuals
Andreas Holzinger, Niko Lukač, Dzemail Rozajac, Emil Johnston, Veljka Kocic, Bernhard Hoerl, Christoph Gollob, Arne Nothdurft, Karl Stampfer, Stefan Schweng, Javier Del Ser, 2025, original scientific article
Description: This paper introduces a novel framework for augmenting explainability in the interpretation of point cloud data by fusing expert knowledge with counterfactual reasoning. Given the complexity and voluminous nature of point cloud datasets, derived predominantly from LiDAR and 3D scanning technologies, achieving interpretability remains a significant challenge, particularly in smart cities, smart agriculture, and smart forestry. This research posits that integrating expert knowledge with counterfactual explanations (speculative scenarios illustrating how altering input data points could lead to different outcomes) can significantly reduce the opacity of deep learning models processing point cloud data. The proposed optimization-driven framework uses expert-informed ad-hoc perturbation techniques to generate meaningful counterfactual scenarios when employing state-of-the-art deep learning architectures. The optimization process minimizes a multi-criteria objective comprising counterfactual metrics such as similarity, validity, and sparsity, tailored specifically to point cloud datasets. These metrics provide a quantitative lens for evaluating the interpretability of the counterfactuals. Furthermore, the framework allows explicit, interpretable counterfactual perturbations to be defined at its core, thereby involving the model's audience in the counterfactual generation pipeline and, ultimately, improving their overall trust in the process. Results demonstrate a notable improvement both in the interpretability of the model's decisions and in the actionable insights delivered to end users. Additionally, the study explores the role of counterfactual reasoning, coupled with expert input, in enhancing trustworthiness and enabling human-in-the-loop decision-making processes. By bridging the gap between complex data interpretations and user comprehension, this research advances the field of explainable AI, contributing to the development of transparent, accountable, and human-centered artificial intelligence systems.
Keywords: explainable AI, point cloud data, counterfactual reasoning, information fusion, interpretability, human-centered AI
Published in DKUM: 06.03.2025
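The idea of a counterfactual balancing validity, similarity, and sparsity can be illustrated on a toy problem: perturb an input just enough to flip a classifier's decision, changing as few coordinates as possible. The classifier, features, and greedy search below are deliberately simplistic stand-ins for the paper's deep models on point clouds and its optimization-driven generation.

```python
def classify(x, threshold=10.0):
    """Toy stand-in model: label 'tree' if the summed feature mass
    exceeds a threshold, else 'shrub'. A real model would be a deep
    network over point cloud features."""
    return "tree" if sum(x) > threshold else "shrub"

def counterfactual(x, target, step=0.5, max_iter=100):
    """Greedy sparse counterfactual: nudge one coordinate at a time
    (sparsity) until the prediction becomes `target` (validity),
    keeping the result close to the original (similarity)."""
    x = list(x)
    for _ in range(max_iter):
        if classify(x) == target:
            return x
        # increase the largest coordinate; with this toy model every
        # coordinate contributes equally, so this keeps changes in one place
        i = max(range(len(x)), key=lambda j: x[j])
        x[i] += step
    return None  # no valid counterfactual found within the budget

original = [3.0, 2.5, 2.0]          # classified as "shrub" (sum 7.5)
cf = counterfactual(original, "tree")
# cf flips the label while differing from `original` in one coordinate
```

A full multi-criteria objective would score candidates on all three metrics jointly instead of relying on this greedy, one-coordinate heuristic.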
5. Identification of Lithium-Ion Battery Parameter Variations Across Cells using Artificial Intelligence
Tine Lubej, 2025, master's thesis
Description: This thesis focuses on improving the simulation, estimation, and accuracy of parameter identification in lithium-ion battery models. The key objective was to enhance a previously developed program by transitioning it to an object-oriented design, making it more efficient, user-friendly, and modular. Additionally, efforts were made to optimize the parameter estimation process by upgrading the cost function used during simulations and integrating real-world battery measurement data, specifically for the LGM50 battery type.
The first step in the thesis involved reworking the codebase to an object-oriented structure, which improved not only the code’s clarity but also its extensibility and efficiency. With this change, the program was better suited for future improvements and became more accessible for other users through simplified installation procedures. This was accompanied by the implementation of unit testing to ensure the reliability of the code.
Experiments were conducted across a range of discharge rates (from 0.05C to 1C) to evaluate the performance of the model under different conditions. These tests helped to identify trends in how the model responded to changes in operational parameters. Additionally, a dynamic pulse test was performed, which allowed for more precise estimation of the parameters. The results of these tests demonstrated the robustness of the methodology, especially under dynamic conditions.
A major innovation introduced in this thesis was the development of a new cost function, which led to noticeable improvements in parameter estimation accuracy, particularly under high discharge rates and when estimating multiple parameters simultaneously. This new cost function proved especially effective in more complex scenarios, where the original cost function struggled to maintain the same level of accuracy.
The program’s capabilities were further extended by incorporating real experimental data. Using a constant discharge profile for the LGM50 battery, the results showed some challenges when dealing with real-world data, particularly due to issues in measurement or data preprocessing. Nonetheless, the model consistently produced solutions, although the accuracy was influenced by the quality of the input data.
The thesis concludes by highlighting the success of the improvements made, both in terms of the program's structure and the precision of its estimations. However, it also emphasizes the importance of improving the quality of real-world data to fully leverage the model's potential in practical applications. This work lays a foundation for future developments in battery modeling, providing a framework that is adaptable for further research and practical use.
Keywords: Machine Learning, Lithium-Ion Batteries, Parameter Estimation, Uncertainty Quantification, Real-experimental data
Published in DKUM: 03.03.2025
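The core loop of parameter identification, simulate a discharge, compare with measurements through a cost function, and pick the parameters minimizing that cost, can be sketched with a deliberately crude model. The linear discharge model, the plain sum-of-squares cost, and the grid search below are simplified stand-ins for the thesis's electrochemical model, upgraded cost function, and optimizer.

```python
def simulate(ocv, r, k, current, times):
    """Toy discharge model: terminal voltage = OCV minus ohmic drop
    minus a linear fade term. A stand-in for a real battery model."""
    return [ocv - current * r - k * t for t in times]

def sse(observed, predicted):
    """Sum-of-squared-errors cost, the basic form more elaborate
    simulation cost functions build on."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted))

def fit_r_k(observed, ocv, current, times):
    """Estimate (R, k) by brute-force grid search minimizing the cost."""
    best = None
    for r in [i * 0.005 for i in range(1, 21)]:        # 5..100 mOhm
        for k in [j * 0.0001 for j in range(1, 21)]:   # fade rates
            cost = sse(observed, simulate(ocv, r, k, current, times))
            if best is None or cost < best[0]:
                best = (cost, r, k)
    return best[1], best[2]

times = list(range(0, 600, 60))
truth = simulate(4.2, 0.03, 0.0005, 1.0, times)  # synthetic "measurement"
r_hat, k_hat = fit_r_k(truth, 4.2, 1.0, times)   # recovers R and k
```

On noisy real measurements, as the thesis observes for the LGM50 data, the minimum of the cost surface flattens and the estimates degrade with input data quality.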
6. The use of artificial intelligence in building engineering for historic buildings build in the Austro-Hungarian monarchy
Daniela Dvornik Perhavec, Rok Kamnik, 2025, original scientific article
Description: Knowledge discovery in databases (KDD) and data mining (DM) belong to the field of artificial intelligence (AI). The integration of artificial intelligence into various segments of the construction industry is still in its infancy, but it is expected to be used more widely soon, driven by the development of databases and data warehouses. By using BIM (Building Information Modelling) technologies in the planning of new buildings, we will be able to obtain valuable data. The situation is different for old, existing buildings and the building engineering associated with them. Civil engineers, renovation planners, and architects need knowledge of a building before its renovation, and what is available to them is far less than what could be known. Information about a building can be found in provincial archives, but for historic buildings only 10-15% of the plans, drawings, descriptions, or project documents are available. The remaining 85% must be researched on site, which is a lengthy and costly process that hinders construction. The question therefore arose of how findings from the study of buildings with written, preserved sources can be applied to the 85% of buildings for which no data are available. This paper presents the use of the collected data as the basis for an initiative to develop a database and to build models using artificial intelligence algorithms. The research study investigates the "load-bearing wall" feature for residential buildings with basements and upper storeys built between 1857 and 1948 in the former Austro-Hungarian Empire. The aim is to create a model that predicts the characteristics of a building for which no archival material is available. The study uses artificial intelligence to build decision trees that help engineers improve their knowledge of historic buildings in the former Austro-Hungarian Empire and of building engineering for historic structures.
Keywords: knowledge discovery from data, machine learning, Austro-Hungarian Monarchy buildings
Published in DKUM: 03.03.2025
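A learned decision tree for the "load-bearing wall" feature would reduce to nested threshold tests on attributes such as construction year and storey count. The hand-written stub below shows the shape of such a model; the split values and thickness classes are purely illustrative, not the thresholds learned in the study.

```python
def predict_wall_thickness(year, storeys):
    """Stand-in for a learned decision tree: predict a load-bearing
    wall thickness class (cm) from construction year and storey count.
    Both the splits and the output classes are illustrative only."""
    if year < 1900:                    # earlier construction period
        return 60 if storeys >= 3 else 45
    return 45 if storeys >= 3 else 30  # later period, thinner walls

# A renovation planner with no archival material for a building could
# still query the model from attributes observable on site.
print(predict_wall_thickness(1880, 4))
```

Trained on the 10-15% of buildings with archival documentation, such a tree becomes the predictor for the 85% without it.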
9. Survey of inter-prediction methods for time-varying mesh compression
Jan Dvořák, Filip Hácha, Gerasimos Arvanitis, David Podgorelec, Konstantinos Moustakas, Libor Váša, 2025, original scientific article
Description: Time-varying meshes (TVMs), that is, mesh sequences with varying connectivity, are a greatly versatile representation of shapes evolving in time, as they allow a surface topology to change or details to appear or disappear at any time during the sequence. This, however, comes at the cost of large storage size. Since 2003, there have been attempts to compress such data efficiently. While the problem may seem trivial at first sight, considering the strong temporal coherence of the shapes represented by the individual frames, it turns out that the varying connectivity, and the absence of the implicit correspondence information that stems from it, makes it rather difficult to exploit the redundancies present in the data. Therefore, efficient and general TVM compression is still considered an open problem. We describe and categorize existing approaches while pointing out the current challenges in the field, and we hint at some related techniques that might be helpful in addressing them. We also provide an overview of the reported performance of the discussed methods and a list of datasets that are publicly available for experiments. Finally, we discuss potential future trends in the field.
Keywords: compression algorithms, data compression, modelling, polygonal mesh reduction
Published in DKUM: 07.02.2025
10. Differences in user perception of artificial intelligence-driven chatbots and traditional tools in qualitative data analysis
Boštjan Šumak, Maja Pušnik, Ines Kožuh, Andrej Šorgo, Saša Brdnik, 2025, original scientific article
Description: Qualitative data analysis (QDA) tools are essential for extracting insights from complex datasets. This study investigates researchers' perceptions of the usability, user experience (UX), mental workload, trust, task complexity, and emotional impact of three tools: Taguette 1.4.1 (a traditional QDA tool), ChatGPT (GPT-4, December 2023 version), and Gemini (formerly Google Bard, December 2023 version). Participants (N = 85), Master's students from the Faculty of Electrical Engineering and Computer Science with prior experience in UX evaluations and familiarity with AI-based chatbots, performed sentiment analysis and data annotation tasks using these tools, enabling a comparative evaluation. The results show that the AI tools were associated with lower cognitive effort and more positive emotional responses than Taguette, which caused higher frustration and workload, especially during cognitively demanding tasks. Among the tools, ChatGPT achieved the highest usability score (SUS = 79.03) and was rated positively for emotional engagement. Trust levels varied, with Taguette preferred for task accuracy and ChatGPT rated highest in user confidence. Despite these differences, all tools performed consistently in identifying qualitative patterns. These findings suggest that AI-driven tools can enhance researchers' experiences in QDA, while emphasizing the need to align tool selection with specific tasks and user preferences.
Keywords: user experience, UX, usability, qualitative data analysis, QDA, chatbots
Published in DKUM: 07.02.2025
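The SUS value reported for ChatGPT (79.03) comes from the standard System Usability Scale scoring rule: ten Likert items (1-5), odd items contribute (response - 1), even items contribute (5 - response), and the sum is scaled by 2.5 to a 0-100 range. The response vector below is invented for illustration, not the study's data.

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.
    Odd-numbered items score (r - 1), even-numbered items (5 - r);
    the total is multiplied by 2.5 to land on a 0-100 scale."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so items 1,3,... are odd
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Illustrative respondent: 4s on odd items, 2s on even items.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

A study-level SUS, like the 79.03 quoted above, is the mean of these per-respondent scores across all participants.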