Elias Marks
PhD Student

Contact:
Email: elias.marks@uni-bonn.de
Tel: +49 228 73 29 03
Fax: +49 228 73 27 12
Office: Nussallee 15, 1st floor, room 1.007
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn
Research Interests
- Computer Vision
- Machine Learning
- Agricultural Robotics
Short CV
Elias Marks has been a Ph.D. student at the Photogrammetry Lab of the Rheinische Friedrich-Wilhelms-Universität Bonn since February 2021. He received his M.Sc. in Artificial Intelligence and Robotics from Sapienza University of Rome with a thesis on perception and path planning for mobile robots. He previously worked at the Fraunhofer Institute for Manufacturing Engineering and Automation in Stuttgart.
Projects
- RegisTer
Publications
2024
- F. Magistri, T. Läbe, E. Marks, S. Nagulavancha, Y. Pan, C. Smitt, L. Klingbeil, M. Halstead, H. Kuhlmann, C. McCool, J. Behley, and C. Stachniss, “A Dataset and Benchmark for Shape Completion of Fruits for Agricultural Robotics,” arXiv Preprint, 2024.
[BibTeX] [PDF]
@article{magistri2024arxiv,
  title = {{A Dataset and Benchmark for Shape Completion of Fruits for Agricultural Robotics}},
  author = {F. Magistri and T. L\"abe and E. Marks and S. Nagulavancha and Y. Pan and C. Smitt and L. Klingbeil and M. Halstead and H. Kuhlmann and C. McCool and J. Behley and C. Stachniss},
  journal = arxiv,
  year = 2024,
  eprint = {2407.13304},
}
- J. Weyler, F. Magistri, E. Marks, Y. L. Chong, M. Sodano, G. Roggiolani, N. Chebrolu, C. Stachniss, and J. Behley, “PhenoBench: A Large Dataset and Benchmarks for Semantic Image Interpretation in the Agricultural Domain,” IEEE Trans. on Pattern Analysis and Machine Intelligence (TPAMI), 2024. doi:10.1109/TPAMI.2024.3419548
[BibTeX] [PDF] [Code]
@article{weyler2024tpami,
  author = {J. Weyler and F. Magistri and E. Marks and Y.L. Chong and M. Sodano and G. Roggiolani and N. Chebrolu and C. Stachniss and J. Behley},
  title = {{PhenoBench: A Large Dataset and Benchmarks for Semantic Image Interpretation in the Agricultural Domain}},
  journal = tpami,
  year = {2024},
  doi = {10.1109/TPAMI.2024.3419548},
  codeurl = {https://github.com/PRBonn/phenobench},
}
- E. A. Marks, J. Bömer, F. Magistri, A. Sah, J. Behley, and C. Stachniss, “BonnBeetClouds3D: A Dataset Towards Point Cloud-Based Organ-Level Phenotyping of Sugar Beet Plants Under Real Field Conditions,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2024.
[BibTeX] [PDF]
@inproceedings{marks2024iros,
  author = {E.A. Marks and J. B\"omer and F. Magistri and A. Sah and J. Behley and C. Stachniss},
  title = {{BonnBeetClouds3D: A Dataset Towards Point Cloud-Based Organ-Level Phenotyping of Sugar Beet Plants Under Real Field Conditions}},
  booktitle = iros,
  year = 2024,
}
- J. Bömer, F. Esser, E. A. Marks, R. A. Rosu, S. Behnke, L. Klingbeil, H. Kuhlmann, C. Stachniss, A.-K. Mahlein, and S. Paulus, “A 3D Printed Plant Model for Accurate and Reliable 3D Plant Phenotyping,” GigaScience, vol. 13, p. giae035, 2024. doi:10.1093/gigascience/giae035
[BibTeX] [PDF]
@article{boemer2024giga,
  author = {J. B\"omer and F. Esser and E.A. Marks and R.A. Rosu and S. Behnke and L. Klingbeil and H. Kuhlmann and C. Stachniss and A.-K. Mahlein and S. Paulus},
  title = {{A 3D Printed Plant Model for Accurate and Reliable 3D Plant Phenotyping}},
  journal = giga,
  volume = {13},
  pages = {giae035},
  issn = {2047-217X},
  year = 2024,
  doi = {10.1093/gigascience/giae035},
  url = {https://academic.oup.com/gigascience/article-pdf/doi/10.1093/gigascience/giae035/58270533/giae035.pdf},
}
- F. Magistri, R. Marcuzzi, E. A. Marks, M. Sodano, J. Behley, and C. Stachniss, “Efficient and Accurate Transformer-Based 3D Shape Completion and Reconstruction of Fruits for Agricultural Robots,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2024.
[BibTeX] [PDF] [Code] [Video]
@inproceedings{magistri2024icra,
  author = {F. Magistri and R. Marcuzzi and E.A. Marks and M. Sodano and J. Behley and C. Stachniss},
  title = {{Efficient and Accurate Transformer-Based 3D Shape Completion and Reconstruction of Fruits for Agricultural Robots}},
  booktitle = icra,
  year = 2024,
  videourl = {https://youtu.be/U1xxnUGrVL4},
  codeurl = {https://github.com/PRBonn/TCoRe},
}
2023
- R. Marcuzzi, L. Nunes, L. Wiesmann, E. Marks, J. Behley, and C. Stachniss, “Mask4D: End-to-End Mask-Based 4D Panoptic Segmentation for LiDAR Sequences,” IEEE Robotics and Automation Letters (RA-L), vol. 8, iss. 11, pp. 7487-7494, 2023. doi:10.1109/LRA.2023.3320020
[BibTeX] [PDF] [Code] [Video]
@article{marcuzzi2023ral-meem,
  author = {R. Marcuzzi and L. Nunes and L. Wiesmann and E. Marks and J. Behley and C. Stachniss},
  title = {{Mask4D: End-to-End Mask-Based 4D Panoptic Segmentation for LiDAR Sequences}},
  journal = ral,
  year = {2023},
  volume = {8},
  number = {11},
  pages = {7487-7494},
  issn = {2377-3766},
  doi = {10.1109/LRA.2023.3320020},
  codeurl = {https://github.com/PRBonn/Mask4D},
  videourl = {https://youtu.be/4WqK_gZlpfA},
}
- Y. Pan, F. Magistri, T. Läbe, E. Marks, C. Smitt, C. S. McCool, J. Behley, and C. Stachniss, “Panoptic Mapping with Fruit Completion and Pose Estimation for Horticultural Robots,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2023.
[BibTeX] [PDF] [Code] [Video]
@inproceedings{pan2023iros,
  author = {Y. Pan and F. Magistri and T. L\"abe and E. Marks and C. Smitt and C.S. McCool and J. Behley and C. Stachniss},
  title = {{Panoptic Mapping with Fruit Completion and Pose Estimation for Horticultural Robots}},
  booktitle = iros,
  year = 2023,
  codeurl = {https://github.com/PRBonn/HortiMapping},
  videourl = {https://youtu.be/fSyHBhskjqA},
}
- N. Zimmerman, M. Sodano, E. Marks, J. Behley, and C. Stachniss, “Constructing Metric-Semantic Maps using Floor Plan Priors for Long-Term Indoor Localization,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2023.
[BibTeX] [PDF] [Code] [Video]
@inproceedings{zimmerman2023iros,
  author = {N. Zimmerman and M. Sodano and E. Marks and J. Behley and C. Stachniss},
  title = {{Constructing Metric-Semantic Maps using Floor Plan Priors for Long-Term Indoor Localization}},
  booktitle = iros,
  year = 2023,
  codeurl = {https://github.com/PRBonn/SIMP},
  videourl = {https://youtu.be/9ZGd5lJbG4s},
}
- J. Weyler, F. Magistri, E. Marks, Y. L. Chong, M. Sodano, G. Roggiolani, N. Chebrolu, C. Stachniss, and J. Behley, “PhenoBench – A Large Dataset and Benchmarks for Semantic Image Interpretation in the Agricultural Domain,” arXiv preprint, vol. arXiv:2306.04557, 2023.
[BibTeX] [PDF] [Code]
@article{weyler2023arxiv,
  author = {Jan Weyler and Federico Magistri and Elias Marks and Yue Linn Chong and Matteo Sodano and Gianmarco Roggiolani and Nived Chebrolu and Cyrill Stachniss and Jens Behley},
  title = {{PhenoBench --- A Large Dataset and Benchmarks for Semantic Image Interpretation in the Agricultural Domain}},
  journal = {arXiv preprint},
  volume = {arXiv:2306.04557},
  year = {2023},
  codeurl = {https://github.com/PRBonn/phenobench},
}
- E. Marks, M. Sodano, F. Magistri, L. Wiesmann, D. Desai, R. Marcuzzi, J. Behley, and C. Stachniss, “High Precision Leaf Instance Segmentation in Point Clouds Obtained Under Real Field Conditions,” IEEE Robotics and Automation Letters (RA-L), vol. 8, iss. 8, pp. 4791-4798, 2023. doi:10.1109/LRA.2023.3288383
[BibTeX] [PDF] [Code] [Video]
@article{marks2023ral,
  author = {E. Marks and M. Sodano and F. Magistri and L. Wiesmann and D. Desai and R. Marcuzzi and J. Behley and C. Stachniss},
  title = {{High Precision Leaf Instance Segmentation in Point Clouds Obtained Under Real Field Conditions}},
  journal = ral,
  volume = {8},
  number = {8},
  pages = {4791-4798},
  issn = {2377-3766},
  year = {2023},
  doi = {10.1109/LRA.2023.3288383},
  codeurl = {https://github.com/PRBonn/plant_pcd_segmenter},
  videourl = {https://youtu.be/dvA1SvQ4iEY},
}
- S. Kelly, A. Riccardi, E. Marks, F. Magistri, T. Guadagnino, M. Chli, and C. Stachniss, “Target-Aware Implicit Mapping for Agricultural Crop Inspection,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2023.
[BibTeX] [PDF] [Video]
@inproceedings{kelly2023icra,
  author = {Shane Kelly and Alessandro Riccardi and Elias Marks and Federico Magistri and Tiziano Guadagnino and Margarita Chli and Cyrill Stachniss},
  title = {{Target-Aware Implicit Mapping for Agricultural Crop Inspection}},
  booktitle = icra,
  year = 2023,
  videourl = {https://youtu.be/UAIqn0QnpKg},
}
- A. Riccardi, S. Kelly, E. Marks, F. Magistri, T. Guadagnino, J. Behley, M. Bennewitz, and C. Stachniss, “Fruit Tracking Over Time Using High-Precision Point Clouds,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2023.
[BibTeX] [PDF] [Video]
@inproceedings{riccardi2023icra,
  author = {Alessandro Riccardi and Shane Kelly and Elias Marks and Federico Magistri and Tiziano Guadagnino and Jens Behley and Maren Bennewitz and Cyrill Stachniss},
  title = {{Fruit Tracking Over Time Using High-Precision Point Clouds}},
  booktitle = icra,
  year = 2023,
  videourl = {https://youtu.be/fBGSd0--PXY},
}
2022
- F. Magistri, E. Marks, S. Nagulavancha, I. Vizzo, T. Läbe, J. Behley, M. Halstead, C. McCool, and C. Stachniss, “Contrastive 3D Shape Completion and Reconstruction for Agricultural Robots using RGB-D Frames,” IEEE Robotics and Automation Letters (RA-L), vol. 7, iss. 4, pp. 10120-10127, 2022.
[BibTeX] [PDF] [Video]
@article{magistri2022ral-iros,
  author = {Federico Magistri and Elias Marks and Sumanth Nagulavancha and Ignacio Vizzo and Thomas L{\"a}be and Jens Behley and Michael Halstead and Chris McCool and Cyrill Stachniss},
  title = {{Contrastive 3D Shape Completion and Reconstruction for Agricultural Robots using RGB-D Frames}},
  journal = ral,
  year = {2022},
  volume = {7},
  number = {4},
  pages = {10120-10127},
  url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/magistri2022ral-iros.pdf},
  videourl = {https://www.youtube.com/watch?v=2ErUf9q7YOI},
}
- E. Marks, F. Magistri, and C. Stachniss, “Precise 3D Reconstruction of Plants from UAV Imagery Combining Bundle Adjustment and Template Matching,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2022.
[BibTeX] [PDF]
@inproceedings{marks2022icra,
  author = {E. Marks and F. Magistri and C. Stachniss},
  title = {{Precise 3D Reconstruction of Plants from UAV Imagery Combining Bundle Adjustment and Template Matching}},
  booktitle = icra,
  year = 2022,
}
2021
- F. Görlich, E. Marks, A.-K. Mahlein, K. König, P. Lottes, and C. Stachniss, “UAV-Based Classification of Cercospora Leaf Spot Using RGB Images,” Drones, vol. 5, iss. 2, 2021. doi:10.3390/drones5020034
[BibTeX] [PDF]
Plant diseases can impact crop yield. Thus, the detection of plant diseases using sensors that can be mounted on aerial vehicles is in the interest of farmers to support decision-making in integrated pest management and to breeders for selecting tolerant or resistant genotypes. This paper investigated the detection of Cercospora leaf spot (CLS), caused by Cercospora beticola in sugar beet using RGB imagery. We proposed an approach to tackle the CLS detection problem using fully convolutional neural networks, which operate directly on RGB images captured by a UAV. This efficient approach does not require complex multi- or hyper-spectral sensors, but provides reliable results and high sensitivity. We provided a detection pipeline for pixel-wise semantic segmentation of CLS symptoms, healthy vegetation, and background so that our approach can automatically quantify the grade of infestation. We thoroughly evaluated our system using multiple UAV datasets recorded from different sugar beet trial fields. The dataset consisted of a training and a test dataset and originated from different fields. We used it to evaluate our approach under realistic conditions and analyzed its generalization capabilities to unseen environments. The obtained results correlated to visual estimation by human experts significantly. The presented study underlined the potential of high-resolution RGB imaging and convolutional neural networks for plant disease detection under field conditions. The demonstrated procedure is particularly interesting for applications under practical conditions, as no complex and cost-intensive measuring system is required.
@article{goerlich2021drones,
  author = {Görlich, Florian and Marks, Elias and Mahlein, Anne-Katrin and König, Kathrin and Lottes, Philipp and Stachniss, Cyrill},
  title = {{UAV-Based Classification of Cercospora Leaf Spot Using RGB Images}},
  journal = {Drones},
  volume = {5},
  number = {2},
  article-number = {34},
  year = {2021},
  url = {https://www.mdpi.com/2504-446X/5/2/34/pdf},
  issn = {2504-446X},
  doi = {10.3390/drones5020034},
}
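The core of the pipeline described in the abstract (pixel-wise semantic segmentation of RGB imagery into CLS symptoms, healthy vegetation, and background, followed by quantifying the grade of infestation) can be illustrated with a minimal sketch. This is not the paper's actual network: it is a toy fully convolutional forward pass in NumPy with hypothetical, untrained weights, showing only the shape of the computation (per-pixel class logits, argmax label map, symptom ratio).

```python
import numpy as np

CLASSES = ["background", "healthy", "cls_symptom"]  # 3 output channels

def conv2d(img, kernels, bias):
    """Naive 'same'-padded 2D convolution: img (H,W,Cin), kernels (k,k,Cin,Cout)."""
    k = kernels.shape[0]
    pad = k // 2
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)))
    h, w, _ = img.shape
    out = np.empty((h, w, kernels.shape[3]))
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + k, x:x + k, :]
            out[y, x] = np.tensordot(patch, kernels, axes=([0, 1, 2], [0, 1, 2])) + bias
    return out

def segment(rgb, kernels, bias):
    """Fully convolutional: conv layer + per-pixel argmax -> label map (H,W)."""
    logits = conv2d(rgb.astype(float) / 255.0, kernels, bias)
    return logits.argmax(axis=-1)

def infestation_grade(labels):
    """Fraction of vegetation pixels labeled as CLS symptoms."""
    veg = np.isin(labels, [1, 2]).sum()
    return float((labels == 2).sum()) / veg if veg else 0.0

rng = np.random.default_rng(0)
kernels = rng.normal(size=(3, 3, 3, 3))  # hypothetical weights; a trained FCN would go here
bias = np.zeros(3)
img = rng.integers(0, 256, size=(16, 16, 3))  # stand-in for a UAV RGB tile
labels = segment(img, kernels, bias)
grade = infestation_grade(labels)
```

A real system would replace the single convolution with a trained deep network, but the output contract is the same: one class label per input pixel, from which the infestation grade is a simple ratio.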