1.
Vine canopy reconstruction and assessment with terrestrial lidar and aerial imaging
Igor Petrović, Matej Sečnik, Marko Hočevar, Peter Berk, 2022, original scientific article

Abstract: Successful dosing of plant protection products requires knowledge of the vine canopy characteristics on which the spray amount should be based. In a field experiment, we compared two optical measurement methods, terrestrial lidar and aerial photogrammetry, against manual defoliation of selected vines. In agreement with other authors, our results show that both terrestrial lidar and aerial photogrammetry represented the canopy well, with correlation coefficients of around 0.9 between the measured variables and the number of leaves. Aerial photogrammetry produced significantly more points in the point cloud, although this depended on the choice of ground sampling distance. Our results show that in aerial UAS photogrammetry, subdividing the vine canopy into 5 × 5 cm segments gives the best representation of canopy volume.
Keywords: precision agriculture, remote sensing, 3D point clouds, vineyard, canopy reconstruction, terrestrial lidar, aerial photogrammetry, manual defoliation
Published in DKUM: 15.07.2024; Views: 123; Downloads: 4
URL Link to full text
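The 5 × 5 cm subdivision the abstract reports as best could be illustrated with a simple voxel count over a 3D point cloud: snap each point to a grid cell and treat the occupied-cell count times the cell volume as a volume estimate. This is a hypothetical sketch of the general technique, not the authors' code; the function name, the random test cloud, and the per-axis 5 cm cell size are all assumptions.

```python
import numpy as np

def canopy_volume(points, cell=0.05):
    """Estimate canopy volume by counting occupied grid cells.

    points: (N, 3) array of XYZ coordinates in metres.
    cell:   grid resolution in metres (0.05 m = the 5 x 5 cm
            subdivision reported as best in the abstract).
    """
    # Snap each point to its integer grid-cell index.
    idx = np.floor(points / cell).astype(np.int64)
    # Count distinct occupied cells (unique rows of the index array).
    occupied = np.unique(idx, axis=0).shape[0]
    # Occupied-cell count times the volume of one cell.
    return occupied * cell ** 3

# Toy example: 10 000 random points inside a 1 m cube,
# so the estimate cannot exceed 1 m^3.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(10000, 3))
vol = canopy_volume(pts)
```

A finer cell gives a tighter fit to the points but a sparser, noisier occupancy grid, which is why the choice of subdivision (and of ground sampling distance) matters in the abstract's comparison.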

2.
Sensor fusion-based approach for the field robot localization on Rovitis 4.0 vineyard robot
Jurij Rakun, Matteo Pantano, Peter Lepej, Miran Lakota, 2022, original scientific article

Abstract: This study proposed an approach to robot localization using data from multiple low-cost sensors, with two goals in mind: to produce accurate localization data and to keep the computation as simple as possible. The approach used wheel odometry, inertial motion data from an Inertial Measurement Unit (IMU), and a location fix from a Real-Time Kinematic Global Positioning System (RTK GPS). Each of these sensors is prone to errors in certain situations, resulting in inaccurate localization: odometry suffers from wheel slip when the robot turns or operates on slippery ground, the IMU drifts due to vibrations, and RTK GPS does not return an accurate fix in (semi-)occluded areas. None of these sensors alone is accurate enough for sound localization of the robot in an outdoor environment. To address this, sensor fusion was implemented on the robot to prevent localization errors: at each moment, it selects the most accurate readings to produce a precise pose estimate. To evaluate the approach, two tests were performed, one with robot localization from the Robot Operating System (ROS) repository and the other with the presented Field Robot Localization. The first did not perform well; the second did, and was evaluated by comparing its position and orientation estimates with ground truth captured by a drone hovering above the testing ground, which revealed an average error of 0.005 m ± 0.220 m in position and 0.6° ± 3.5° in orientation. The tests showed that the developed field robot localization is accurate and robust enough for use on the ROVITIS 4.0 vineyard robot.
Keywords: localization, odometry, IMU, RTK GPS, vineyard, robot, sensor fusion, ROS, precision farming
Published in DKUM: 02.07.2024; Views: 123; Downloads: 9
Full text (.pdf, 690.56 KB)
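The per-step selection the abstract describes (picking the most accurate reading at a given moment) could be sketched as choosing the reading with the lowest reported uncertainty. This is a minimal stand-in for the idea, not the authors' implementation; the `Reading` class, the `variance` field, and all the sample values are assumptions, and a real system would also gate stale fixes or blend sources, e.g. with an extended Kalman filter.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    source: str            # e.g. "odometry", "imu", "rtk_gps"
    pose: tuple            # (x, y, heading) estimate from this sensor
    variance: float        # reported uncertainty of this reading

def fuse(readings):
    """Select the most trustworthy reading for this time step.

    Picks the reading with the smallest reported variance,
    mirroring the 'most accurate reading at a given moment'
    selection described in the abstract.
    """
    return min(readings, key=lambda r: r.variance)

# One time step: odometry is slipping, the IMU is drifting,
# and the RTK GPS has a good fix, so the GPS reading wins.
step = [
    Reading("odometry", (1.02, 0.48, 0.10), 0.25),
    Reading("imu",      (1.00, 0.50, 0.12), 0.40),
    Reading("rtk_gps",  (1.01, 0.51, 0.11), 0.02),
]
best = fuse(step)
```

Under this scheme the three failure modes listed in the abstract (slip, drift, occlusion) each show up as a temporarily inflated variance, so the selection naturally falls back to whichever sensor is currently reliable.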
