Contact Dr. Cang Ye

Phone: (501) 683-7284

Google Voice: (501) 237-1818

Fax: (501) 569-8020   

Email: cxye@ualr.edu

Laboratory

Dept. of Systems Engineering

ETAS 523, 2801 S. University Ave

Little Rock, AR 72204

Office

Dept. of Systems Engineering

EIT 532, 2801 S. University Ave

Little Rock, AR 72204


Intelligent Autonomous Systems Laboratory

College of Engineering & Information Technology

University of Arkansas at Little Rock


A Co-Robotic Navigation Aid for the Visually Impaired

The objective of the project is to develop a co-robot cane that may be used by a visually impaired person for wayfinding. The device consists of a computer vision module, a robotic guide module, and a human-device interaction module. The computer vision module performs 6-DOF SLAM for device pose estimation and mapping, as well as 3D object recognition. The robotic guide module steers the cane in the desired direction of travel when the robotic guide mode is selected. The human-device interaction module includes a speech interface that allows communication between the cane and the user, and a human intent detection interface that senses the user's intended mode (conventional white cane mode or robotic guide mode) and automatically selects that mode for the user. This project is co-funded by the National Institute of Biomedical Imaging and Bioengineering and the National Eye Institute of the NIH under award R01EB018117.
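To make the mode-switching behavior described above concrete, the short Python sketch below shows one way the intent-driven mode selection and steering command could be organized. All names (Mode, select_mode, steering_command) and the gain value are hypothetical illustrations under assumed interfaces, not the project's actual implementation.

from enum import Enum, auto

class Mode(Enum):
    WHITE_CANE = auto()     # conventional white cane: no active steering
    ROBOTIC_GUIDE = auto()  # cane actively steers toward the planned heading

def select_mode(intent_is_guide: bool) -> Mode:
    # Map the detected user intent to an operating mode.
    return Mode.ROBOTIC_GUIDE if intent_is_guide else Mode.WHITE_CANE

def steering_command(mode: Mode, current_heading: float, desired_heading: float) -> float:
    # Return a steering command (radians) for one control cycle.
    # In white cane mode no steering is applied; in robotic guide mode a
    # simple proportional law turns the tip toward the desired heading.
    if mode is Mode.WHITE_CANE:
        return 0.0
    k_p = 0.8  # illustrative gain only, not a tuned value
    return k_p * (desired_heading - current_heading)

if __name__ == "__main__":
    mode = select_mode(intent_is_guide=True)
    print(mode, steering_command(mode, current_heading=0.1, desired_heading=0.6))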

Overview of the Co-Robot Cane

Video: wayfinding in a home environment

Publications

  1. H. Zhang and C. Ye*, "An Indoor Wayfinding System based on Geometric Features Aided Graph SLAM for the Visually Impaired," submitted to IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2016. (in revision)
  2. C. Ye* and X. Qian, "3D Object Recognition of a Robotic Navigation Aid for the Visually Impaired," submitted to IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2016. (in revision)
  3. H. Zhang and C. Ye*, "An Indoor Wayfinding Method for Robotic Navigation Aids," submitted to 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems.
  4. C. Ye, S. Hong, X. Qian, and W. Wu, "A Co-robot Cane for the Visually Impaired," to appear in IEEE Systems, Man, and Cybernetics Magazine, 2015.
  5. C. Ye, S. Hong, and A. Tamjidi, "6-DOF Pose Estimation of a Robotic Navigation Aid by Tracking Visual and Geometric Features," IEEE Transactions on Automation Science and Engineering, vol. 12, no. 4, pp. 1169-1180, 2015.
  6. X. Qian and C. Ye, "NCC-RANSAC: A Fast Plane Extraction Method for 3D Range Data Segmentation," IEEE Transactions on Cybernetics, vol. 44, no. 12, pp. 2771-2783, 2014.
  7. S. Hong and C. Ye, "A Pose Graph based Visual SLAM Algorithm for Robot Pose Estimation," in Proceedings of 2014 World Automation Congress, Big Island, HI.
  8. C. Ye, S. Hong and X. Qian, "A Co-Robotic Cane for Blind Navigation," in Proceedings of 2014 IEEE International Conference on Systems, Man, and Cybernetics, San Diego, CA.
  9. X. Qian and C. Ye, "3D Object Recognition by Geometric Context and Gaussian-Mixture-Model-Based Plane Classification," in Proceedings of 2014 IEEE International Conference on Robotics and Automation, Hong Kong, China.
  10. S. Hong and C. Ye, "A Fast Egomotion Estimation Method based on Visual Feature Tracking and Iterative Closest Point," in Proceedings of 2014 IEEE International Conference on Networking, Sensing and Control, Miami, FL. (Best Student Paper Award finalist)