Readme should be good for now

2026-03-16 01:02:59 -05:00
parent 1389959e96
commit 59b4ef2415


# GDC_ATRIUM
Repository for robust, adaptive 3D person tracking and reidentification in busy areas. Please contact adipu@utexas.edu with questions. Property of AMRL - UT Austin.
## Environment Setup
**Next, your environment MUST have MMPOSE and the ```tracking_re_id``` ROS2 package properly installed.** Follow the instructions below to set them up.
Many nodes in the ```tracking_re_id``` package rely on the ```/stereo/left/camera_info``` topic, so the stereo cameras must be calibrated first.
Visit https://docs.ros.org/en/jazzy/p/camera_calibration/doc/tutorial_stereo.html for instructions on running the calibration. The generated left and right YAML files always had ```camera_name=narrow_stereo/left``` and ```camera_name=narrow_stereo/right```; these must be changed to ```stereo/left``` and ```stereo/right```, respectively.
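The ```camera_name``` fix above can be scripted. A minimal sketch (the file names ```left.yaml``` / ```right.yaml``` are examples; adjust to wherever the calibration wrote its output):

```python
from pathlib import Path

def patch_camera_name(text: str, old: str, new: str) -> str:
    """Swap the camera_name value inside calibration YAML text."""
    return text.replace(f"camera_name: {old}", f"camera_name: {new}")

def patch_file(path: str, old: str, new: str) -> None:
    """Rewrite a calibration YAML file in place."""
    p = Path(path)
    p.write_text(patch_camera_name(p.read_text(), old, new))

# Example usage (paths are assumptions, not fixed by the package):
# patch_file("left.yaml", "narrow_stereo/left", "stereo/left")
# patch_file("right.yaml", "narrow_stereo/right", "stereo/right")
```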
To run the camera driver after calibration has been conducted, simply execute ```ros2 launch tracking_re_id start_cameras_only.launch.py```. Running ```colcon build``` again is **NOT** required, as you should have run it with ```--symlink-install``` previously.
# The following nodes must only be run after the camera driver is started and MMPOSE is installed:
## single_person_loc_node
This node forms the basis for the 3D pose tracking methods. It takes raw image feeds and ```camera_info``` from both cameras, generates joint keypoints using MMPOSE on each image plane, then traces a ray from each camera's copy of a keypoint through that camera's focal center; the intersection of the two rays is the 3D keypoint.
```ros2 launch tracking_re_id single_person_demo.launch.py``` will start a visualization of the keypoints and show the average normal distance and average coordinate relative to the left camera.
Other launchfiles run ```single_person_loc_node``` in headless mode, drawing upon the published 3D keypoints.
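The two-ray intersection step can be sketched as closest-point triangulation. This is a minimal illustration, not the node's actual implementation; in practice the two back-projected rays are skew, so the midpoint of the shortest segment between them is a common stand-in for the intersection:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Triangulate two rays (origin o, unit direction d).

    Solves for the ray parameters t1, t2 minimizing
    |(o1 + t1*d1) - (o2 + t2*d2)| and returns the segment midpoint.
    """
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2))
    b = o2 - o1
    # Normal equations of the 2-parameter least-squares problem
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    t1, t2 = np.linalg.solve(A, np.array([b @ d1, b @ d2]))
    p1 = o1 + t1 * d1   # closest point on ray 1
    p2 = o2 + t2 * d2   # closest point on ray 2
    return 0.5 * (p1 + p2)
```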
## ground_plane_node
Listens to keypoints from ```single_person_loc_node```, waiting for ankle keypoints that stay within a 5 cm radius of a 3D location for more than 5 frames. Once at least 3 such keypoints are found, with at least one of them at least 25 cm off the line connecting the other two, a plane is fit via a least-squares method. The plane is published as a large, disc-shaped marker, along with the keypoints it was fit to.
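The least-squares fit and the collinearity guard described above can be sketched as follows (a minimal illustration, not the node's actual code):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (unit normal n, centroid c).

    The normal is the singular vector with the smallest singular value
    of the centered point cloud, i.e. the direction of least variance.
    """
    pts = np.asarray(points, dtype=float)
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    return vt[-1], c

def off_line_distance(p, a, b):
    """Distance of point p from the line through a and b.

    Used as the collinearity check: a third keypoint must be far enough
    off the line through the other two for the plane to be well-defined.
    """
    p, a, b = map(np.asarray, (p, a, b))
    d = (b - a) / np.linalg.norm(b - a)
    v = p - a
    return np.linalg.norm(v - (v @ d) * d)
```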
## overlay_node
Provides visualization of the ground plane, transformed back into the camera feed and overlaid on the undistorted image along with the 3D keypoints.
**```ros2 launch tracking_re_id full_pipeline.launch.py``` will launch ```single_person_loc_node```, ```ground_plane_node```, and ```overlay_node```, allowing for visualization of the ground plane while people are walking through.**
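The "transformed back into the camera feed" step amounts to a pinhole projection. A minimal sketch, assuming the 3D points are already expressed in the left camera frame (Z forward) and ```K``` is the 3x3 intrinsic matrix from ```camera_info```:

```python
import numpy as np

def project_points(K, pts_cam):
    """Project 3D points in the camera frame to pixel coordinates."""
    pts = np.asarray(pts_cam, dtype=float)
    uvw = (K @ pts.T).T              # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]  # perspective divide by depth
```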
## reid_utils and reid_node
Work together to deliver KeyRe-ID functionality (from ```KeyRe_ID_model.py```) based on weights stored in ```tracking_re_id/weights```. Currently, the pipeline uses weights I fine-tuned from ViT-Base on the iLIDS-VID dataset.
The Re-ID system works on top of MMPOSE, using the generated keypoints to create attention heatmaps that guide reidentification of subjects.
Running ```ros2 launch tracking_re_id reid_pipeline.launch.py``` will show a visualization of the live camera feed with reidentification bounding boxes. The confidence value on the first identification is always 1.0; the confidence on a new box with the same ID is the similarity between the current subject and the subject they were matched with; and the confidence on a new box with a different ID is the highest similarity KeyRe-ID found when attempting to match to an existing subject.
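The confidence semantics above can be sketched as match-or-enroll against a feature gallery. This is an illustrative sketch, not the node's actual code: the function names, the cosine-similarity metric, and the 0.7 threshold are all assumptions.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_or_enroll(feature, gallery, threshold=0.7):
    """Return (track_id, confidence) per the README's semantics.

    gallery: dict mapping track id -> stored feature vector.
    - first sighting (empty gallery): new id, confidence 1.0
    - best similarity >= threshold: matched id, confidence = similarity
    - otherwise: new id, confidence = best (rejected) similarity
    """
    if not gallery:
        gallery[0] = feature
        return 0, 1.0
    best_id, best_sim = max(
        ((tid, cosine(feature, f)) for tid, f in gallery.items()),
        key=lambda x: x[1],
    )
    if best_sim >= threshold:
        gallery[best_id] = feature  # refresh stored appearance
        return best_id, best_sim
    new_id = max(gallery) + 1
    gallery[new_id] = feature
    return new_id, best_sim
```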
The Re-ID implementation is based on [KeyRe-ID](https://arxiv.org/abs/2507.07393):
```
@article{kim2025keyreid,
  title   = {KeyRe-ID: Keypoint-Guided Person Re-Identification using Part-Aware Representation in Videos},
  author  = {Jinseong Kim and Jeonghoon Song and Gyeongseon Baek and Byeongjoon Noh},
  journal = {arXiv preprint arXiv:2507.07393},
  year    = {2025}
}
```