Search the digital library catalogue

Results 1 - 3 of 3
1.
Vine canopy reconstruction and assessment with terrestrial lidar and aerial imaging
Igor Petrović, Matej Sečnik, Marko Hočevar, Peter Berk, 2022, original scientific article

Description: For successful dosing of plant protection products, the characteristics of the vine canopy should be known so that the spray amount can be dosed accordingly. In a field experiment, we compared two optical measurement methods, terrestrial lidar and aerial photogrammetry, against manual defoliation of selected vines. In line with other authors, our results show that both terrestrial lidar and aerial photogrammetry represented the canopy well, with correlation coefficients of around 0.9 between the measured variables and the number of leaves. Aerial photogrammetry produced significantly more points in the point cloud, although this depended on the chosen ground sampling distance. Our results show that for aerial UAS photogrammetry, subdividing the vine canopy into 5 × 5 cm segments gives the best representation of canopy volume (see the voxelization sketch after this record).
Keywords: precision agriculture, remote sensing, 3D point clouds, vineyard, canopy reconstruction, terrestrial lidar, aerial photogrammetry, manual defoliation
Published in DKUM: 15.07.2024; Views: 123; Downloads: 11
URL Link to the full text
This material has multiple files! More...
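The abstract above reports that subdividing the canopy into 5 × 5 cm segments best captures canopy volume. The snippet below is a minimal, illustrative voxelization sketch in Python, not the authors' actual pipeline: it assumes the point cloud is an N×3 NumPy array in metres, and the function name, the synthetic example cloud, and the volume estimate (occupied voxels times voxel volume) are all assumptions made for illustration.

import numpy as np

def estimate_canopy_volume(points, cell=0.05):
    # Hypothetical helper: bin a point cloud into cubic voxels of edge `cell`
    # metres (0.05 m matches the 5 x 5 cm subdivision mentioned in the abstract)
    # and approximate canopy volume as occupied-voxel count times voxel volume.
    shifted = points - points.min(axis=0)            # move cloud to non-negative coords
    idx = np.floor(shifted / cell).astype(np.int64)  # voxel index of every point
    occupied = np.unique(idx, axis=0).shape[0]       # number of distinct voxels hit
    return occupied * cell ** 3

# Synthetic stand-in for a lidar / photogrammetry cloud of one vine row (metres).
cloud = np.random.rand(10000, 3) * np.array([1.0, 0.4, 2.0])
print(estimate_canopy_volume(cloud, cell=0.05))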

2.
Sensor fusion-based approach for the field robot localization on Rovitis 4.0 vineyard robot
Jurij Rakun, Matteo Pantano, Peter Lepej, Miran Lakota, 2022, original scientific article

Description: This study proposes an approach for robot localization using data from multiple low-cost sensors, with two goals in mind: to produce accurate localization data and to keep the computation as simple as possible. The approach uses data from wheel odometry, inertial motion data from an Inertial Measurement Unit (IMU), and a location fix from a Real-Time Kinematic Global Positioning System (RTK GPS). Each of these sensors is prone to errors in certain situations, which leads to inaccurate localization: odometry is affected by slipping when the robot turns or drives on slippery ground, the IMU drifts due to vibrations, and the RTK GPS does not return an accurate fix in (semi-)occluded areas. None of these sensors alone is accurate enough for sound localization of the robot in an outdoor environment. To address this, sensor fusion was implemented on the robot: at each moment the most accurate readings are selected to produce a precise pose estimate. To evaluate the approach, two tests were performed, one with the robot localization package from the Robot Operating System (ROS) repository and one with the presented Field Robot Localization. The first did not perform well; the second did, and was evaluated by comparing its position and orientation estimates with ground truth captured by a drone hovering above the test area, revealing an average error of 0.005 m ± 0.220 m in position and 0.6° ± 3.5° in orientation. The tests showed that the developed Field Robot Localization is accurate and robust enough to be used on the ROVITIS 4.0 vineyard robot (a minimal fusion sketch follows this record).
Keywords: localization, odometry, IMU, RTK GPS, vineyard, robot, sensor fusion, ROS, precision farming
Published in DKUM: 02.07.2024; Views: 123; Downloads: 17
.pdf Full text (690.56 KB)
This material has multiple files! More...
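The abstract describes fusing wheel odometry, IMU, and RTK GPS readings by selecting the most accurate ones at each moment. The Python sketch below is only a rough stand-in for that idea, not the ROVITIS 4.0 implementation: it uses a simple inverse-variance weighted average over whichever readings are currently valid, and every class name, field, and number is an illustrative assumption.

import numpy as np
from dataclasses import dataclass

@dataclass
class PoseReading:
    source: str           # e.g. "odometry", "imu", "rtk_gps"
    pose: np.ndarray      # hypothetical planar pose [x, y, yaw]
    variance: np.ndarray  # per-component variance reported for this reading
    valid: bool = True    # False when the sensor flags its fix as unreliable

def fuse_pose(readings):
    # Inverse-variance weighted average of all currently valid readings;
    # an unreliable sensor (large variance or valid=False) barely contributes.
    usable = [r for r in readings if r.valid]
    if not usable:
        raise ValueError("no valid sensor readings at this time step")
    weights = np.array([1.0 / r.variance for r in usable])
    poses = np.array([r.pose for r in usable])
    return (weights * poses).sum(axis=0) / weights.sum(axis=0)

# Example: the RTK GPS fix is lost, so odometry and IMU dominate the estimate.
readings = [
    PoseReading("odometry", np.array([1.02, 0.48, 0.10]), np.array([0.02, 0.02, 0.05])),
    PoseReading("imu",      np.array([1.00, 0.50, 0.12]), np.array([0.05, 0.05, 0.01])),
    PoseReading("rtk_gps",  np.array([3.50, 2.10, 0.00]), np.array([5.0, 5.0, 1.0]), valid=False),
]
print(fuse_pose(readings))

A real pose filter would also handle yaw wrap-around and run inside a ROS node; the weighted average here only illustrates the selection idea described in the abstract.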

3.
Search performed in 0.09 sec.