About Me

I am a doctoral student in the Distributed RObotics and Networked Embedded Sensing (DRONES) Lab at the University at Buffalo (UB), advised by Dr. Karthik Dantu (Dept. of Computer Science and Engineering) and co-advised by Dr. John Crassidis of the Advanced Navigation and Control Systems (ANCS) Lab (Dept. of Mechanical and Aerospace Engineering). I am currently a Pathways Student at NASA Goddard Space Flight Center (GSFC) in the Science Data Processing Branch (Code 587).

My research interests include spacecraft perception and autonomy, optical navigation systems, simultaneous localization and mapping (SLAM), embedded computing, and computer vision. My current research focuses on unsupervised, generative, and representation-learning-based solutions for space vision tasks such as visual terrain detection, scene reconstruction, and landmark recognition. In the past, I have also worked on dynamic feature reasoning, 3D feature location estimation, and sim-to-real domain adaptation.

Education

Aug. 2020 - Present
University at Buffalo
Ph.D. Candidate, Computer Science and Engineering
Distributed RObotics and Networked Embedded Sensing (DRONES) Lab
Advised by Dr. Karthik Dantu
Advanced Navigation and Control Systems (ANCS) Lab
Co-advised by Dr. John Crassidis
Aug. 2020 - Feb. 2023
University at Buffalo
M.S. in Computer Science and Engineering
Aug. 2016 - May 2020
University at Buffalo
B.S. in Computer Science
Aug. 2016 - May 2020
University at Buffalo
Certificate, Data Intensive Computing

Selected Work Experience

Dec. 2021 - Present
NASA Goddard Space Flight Center
Pathways Student, Science Data Processing Branch (Code 587)
Embedded Autonomy and AI
R&D Flight Software
May 2018 - Dec. 2021
NASA Goddard Space Flight Center - Wallops Flight Facility
Pathways Student, Wallops Systems Software Engineering Branch (Code 589)
Cube/Small-satellite Flight Software
Sep. 2019 - Jan. 2020
NASA Jet Propulsion Laboratory
Intern, Robot Operations Group (347K)
Simulation, Mars 2020 Rover Operations
Jan. 2019 - Jan. 2020
NOVI Aerospace
Machine Learning Consultant
Dataset Curator
Mar. 2016 - May 2020
UB Nanosatellite Laboratory
Flight Software Lead (team of ~15-45 students)
Three CubeSat Missions

Selected Publications

You Only Crash Once: Improved Object Detection for Real-Time, Sim-to-Real Hazardous Terrain Detection and Classification for Autonomous Planetary Landings
Timothy Chase Jr, Chris Gnam, John Crassidis, Karthik Dantu
AAS/AIAA Astrodynamics Specialist Conference, 2022 (Oral Presentation)
In this work, we introduce You Only Crash Once (YOCO), a learning-based visual hazardous terrain detection and classification technique for autonomous spacecraft planetary landings. Through the use of unsupervised domain adaptation, we tailor YOCO for training by simulation, removing the need for real-world annotated data and expensive mission surveying phases. We further improve the transfer of representative terrain knowledge between simulation and the real world through visual similarity clustering. We demonstrate the utility of YOCO through a series of terrestrial and extraterrestrial simulation-to-real experiments and show substantial improvements in the ability to both detect and accurately classify instances of planetary terrain.
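
To picture the visual similarity clustering step, here is a minimal, hypothetical sketch that groups terrain-patch embeddings from simulated and real imagery by visual similarity. The embedding source, cluster count, and use of scikit-learn are assumptions for illustration only, not the paper's method.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_terrain_embeddings(sim_embeds, real_embeds, n_clusters=8):
    """Group simulated and real terrain-patch embeddings by visual similarity.

    sim_embeds, real_embeds : (N, D) arrays of feature embeddings produced by
    any pretrained vision backbone (the backbone itself is not shown here).
    Returns the cluster labels split back into their sim and real portions, so
    that visually similar terrain from both domains can be treated jointly.
    """
    all_embeds = np.vstack([sim_embeds, real_embeds])
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(all_embeds)
    return labels[: len(sim_embeds)], labels[len(sim_embeds):]
```
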
Efficient Feature Matching and Mapping for Terrain Relative Navigation Using Hypothesis Gating
Chris Gnam*, Timothy Chase Jr*, Karthik Dantu, John Crassidis
*Equal Contribution
AIAA SciTech Forum, 2022 (Oral Presentation)
This paper tackles the inaccuracies and inefficiencies of standard image feature matching on spaceflight processors by leveraging information already available in the onboard navigation filter to drastically reduce the number of matching candidates. Estimated feature locations are used to form statistical prediction gates around each feature; all points lying inside a gate are treated as inliers and fed to the matching process. Using a simulated trajectory around a high-fidelity 3D asteroid model and a single monocular camera, we demonstrate an overall reduction of roughly 87% in average matching time for three popular feature description techniques. We also show that feature gating substantially increases matching accuracy, demonstrating its utility for purely monocular terrain relative navigation.
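
As a rough illustration of the gating idea (the function names, the 2-DOF chi-square threshold, and the filter interface below are assumptions for this sketch, not the paper's implementation), a predicted feature location and its covariance from the navigation filter can be used to keep only the detections that fall inside the statistical gate:

```python
import numpy as np

# Chi-square gate for 2 degrees of freedom at ~99% confidence (illustrative value).
CHI2_GATE_2DOF = 9.21

def gate_candidates(predicted_px, cov_px, detected_px):
    """Return indices of detected keypoints inside the prediction gate.

    predicted_px : (2,)   predicted pixel location from the navigation filter
    cov_px       : (2, 2) covariance of that prediction, in pixels^2
    detected_px  : (N, 2) detected keypoint locations in the current image
    """
    cov_inv = np.linalg.inv(cov_px)
    diff = detected_px - predicted_px                # innovation per keypoint
    # Squared Mahalanobis distance of each detection from the prediction.
    d2 = np.einsum("ni,ij,nj->n", diff, cov_inv, diff)
    # Only these candidates are passed on to descriptor matching.
    return np.nonzero(d2 <= CHI2_GATE_2DOF)[0]
```

Because descriptor comparisons are only run against the handful of gated candidates rather than every detection in the image, both matching time and the chance of geometrically implausible matches drop.
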
PRE-SLAM: Persistence Reasoning in Edge-assisted Visual SLAM
Timothy Chase Jr, Ali J. Ben Ali, Steven Y. Ko, Karthik Dantu
IEEE Conference on Mobile Ad Hoc and Smart Systems (MASS), 2022 (Oral Presentation)
We introduce PRE-SLAM, an edge-assisted visual SLAM system that incorporates feature persistence filtering. We revisit the centralized persistence filter architecture and make a series of modifications to enable dynamic feature filtering in an edge-assisted setting. Using two locally collected datasets, we show that our split persistence filter implementation reduces map-point and keyframe retention by 26.6% and 16.6%, respectively. By filtering dynamic map-points out of the system, we demonstrate an improvement of more than 50% in average localization accuracy. We also show that incorporating feature persistence filtering into Edge-SLAM retains the key benefits and performance enhancements of an edge-assisted visual SLAM system, with an added communication overhead of only 500 KB while decreasing overall map size by 8.6%.
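
A much simplified sketch of per-map-point persistence reasoning is shown below; the probabilities, pruning threshold, and data layout are illustrative assumptions and not the split persistence filter described in the paper. Each visible map point carries a belief that it still exists, which is updated whenever it is (or is not) re-detected, and low-belief points are pruned from the map.

```python
# Illustrative parameters (assumptions, not values from the paper).
P_SURVIVE = 0.99   # chance a real map point still exists at the next frame
P_DETECT  = 0.80   # chance a persistent, visible point is re-detected
P_FALSE   = 0.10   # chance a vanished point still yields a detection
PRUNE_AT  = 0.25   # drop map points once the persistence belief falls below this

def update_persistence(belief, detected):
    """One Bayesian update of the belief that a visible map point persists."""
    prior = belief * P_SURVIVE
    if detected:
        num = prior * P_DETECT
        den = num + (1.0 - prior) * P_FALSE
    else:
        num = prior * (1.0 - P_DETECT)
        den = num + (1.0 - prior) * (1.0 - P_FALSE)
    return num / den

def prune_map(points):
    """Keep only map points whose persistence belief stays above the threshold."""
    return {pid: p for pid, p in points.items() if p["belief"] >= PRUNE_AT}
```
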

News
