Human-Centered Robotics & Automation

Dejun Guo   

Office: 2-200, SEIEE Building

Tel: (+86) 13127910824
Research: Mobile Robots, Multi-Robot Systems, Machine Vision

Greetings! I'm a second-year master's student in the Autonomous Robot Lab at Shanghai Jiao Tong University (SJTU), China. I work with my advisor, Prof. Weidong Chen, and my co-supervisor, Prof. Hesheng Wang. I received my B.E. degree in Automation from Northwestern Polytechnical University, China, in 2012.

My research interests include visual servoing, adaptive control, mobile robots, manipulators, and computer vision. Here is my [CV].

Publications:

  1. Dejun Guo, Hesheng Wang, Weidong Chen, Xinwu Liang, "Adaptive Image-based Leader-Follower Approach of Mobile Robot with Omnidirectional Camera", Journal of Applied Mathematics, published, 2014. [Video Youtube] [Paper URL]
  2. Dejun Guo, Hesheng Wang, Weidong Chen, "Adaptive Image-Based Leader-Follower Approach Using Uncalibrated Cameras", IEEE Trans. Industrial Electronics, major revision, 2014. [Video Youtube]
  3. Xinwu Liang, Hesheng Wang, Weidong Chen, Dejun Guo, "Adaptive Image-based Trajectory Tracking Control of Wheeled Mobile Robots with An Uncalibrated Fixed Camera", IEEE Trans. Control Systems Technology, accepted, 2014. [Video Youtube]
  4. Dejun Guo, Hesheng Wang, Weidong Chen, "Unified Leader-Follower Scheme for Mobile Robots Using An Uncalibrated On-board Camera", IROS 2015, submitted.

Honors and Awards:

  • Three-dan rank in the game of Go
  • The Champion of the Go Competition at NWPU, 2009
  • The Third Prize in Mathematical Modeling Contest, NWPU, 2010
  • The First Prize Scholarship, NWPU, 2010 & 2011
  • The Guang Hua Educational Scholarship, SJTU, 2013
  • The Second Prize of Family Service Competition in Chinese RoboCup, SJTU, 2013
  • National Graduate Scholarship, SJTU, 2014 [News]

Research Experiences:

►Innovation on Control Algorithm

  • Adaptive Vision-based Leader-Follower Formation Control of Mobile Robots with an Uncalibrated On-board Camera (Independently Conducted)

09 / 2013 ~ 07 / 2014



This research addresses image-based leader-follower formation control of mobile robots. An uncalibrated camera with arbitrary pose is mounted on the follower, and a feature point is fixed at an arbitrary position on the leader. The controller relies on image information only and is independent of the leader's motion; it requires neither inter-robot communication nor any positioning sensors. A new observer is proposed to estimate the unknown intrinsic and extrinsic camera parameters as well as the unknown coefficients of the plane in which the feature point moves. Finally, the Lyapunov method proves Uniform Semiglobal Practical Asymptotic Stability (USPAS) of the closed-loop system. Experiments are conducted to validate the performance of the algorithm. Relevant Papers: [1-2], [4]
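As a rough illustration of the adaptive-estimation idea above (not the observer from the papers), the following sketch runs a generic gradient adaptive law on a measurement model that is linear in the unknown parameters; all symbols (Y, theta, gamma) are placeholders, not quantities from the actual controller:

```python
import numpy as np

# Illustrative sketch only: a gradient adaptive law
#   theta_hat' = -gamma * Y(t)^T * e
# for a measurement model y = Y(t) @ theta that is linear in the
# unknown parameters. The regressor below is a hypothetical,
# persistently exciting trajectory.

theta = np.array([1.5, -0.7, 0.3])   # unknown parameters (ground truth)
theta_hat = np.zeros(3)              # initial estimate
gamma, dt = 2.0, 0.01                # adaptation gain, time step

for k in range(20000):
    t = k * dt
    Y = np.array([[np.sin(t), np.cos(2 * t), 1.0]])  # regressor row
    e = Y @ theta_hat - Y @ theta    # output estimation error
    theta_hat -= gamma * dt * (Y.T @ e).ravel()      # gradient update

print(np.round(theta_hat, 3))  # estimate should approach theta
```

With a persistently exciting regressor the parameter error decays exponentially, which is the same mechanism that lets the observer recover the unknown camera parameters online.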

  • Dynamic Eye-in-hand Visual Tracking of Space Manipulator (Independently Conducted)

08 / 2014 ~ present


This research focuses on tracking control of an eye-in-hand free-floating space manipulator. A new adaptive controller is proposed to lock the target at the desired position in the camera's view and to keep the relative pose. Adaptive laws are developed to estimate the unknown motion of the target and the uncertain dynamic parameters. Simulation results support the control scheme. Relevant Paper: [5]

  • Adaptive Image-based Trajectory Tracking Control of Wheeled Mobile Robots with an Uncalibrated Fixed Camera (Experiment Part)

10 / 2014

This research focuses on the uncalibrated image-based trajectory tracking control problem of wheeled mobile robots. The motion of the robot is observed by an uncalibrated camera fixed on the ceiling, which can be placed at a general position. A new adaptive image-based trajectory tracking control approach is proposed. An adaptive law is designed to estimate the unknown camera intrinsic and extrinsic parameters and the position parameter of the feature point online; the depth of the feature point is allowed to be time-varying. Lyapunov stability analysis shows asymptotic convergence of the image position and velocity tracking errors. Simulation results based on a two-wheeled mobile robot illustrate the performance of the proposed approach. Relevant Paper: [3]
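For context, here is a minimal sketch of the underlying two-wheeled (unicycle) robot model being driven by a standard Cartesian-space posture-tracking law with known parameters; it is not the uncalibrated image-based controller proposed in the paper, and the gains kx, ky, kth are hypothetical:

```python
import numpy as np

# Unicycle kinematics tracking a circular reference with a classical
# posture-tracking law (Kanayama-style). Known parameters, Cartesian
# feedback: a simplified stand-in for the image-based scheme.

dt = 0.01
x = np.array([0.5, -0.5, 0.0])   # robot pose (x, y, theta)
kx, ky, kth = 2.0, 10.0, 5.0     # illustrative gains

for k in range(3000):
    t = k * dt
    # reference: unit circle traversed at 0.5 rad/s
    xr = np.array([np.cos(0.5 * t), np.sin(0.5 * t), 0.5 * t + np.pi / 2])
    vr, wr = 0.5, 0.5            # reference linear/angular velocities
    # tracking error expressed in the robot frame
    c, s = np.cos(x[2]), np.sin(x[2])
    ex = c * (xr[0] - x[0]) + s * (xr[1] - x[1])
    ey = -s * (xr[0] - x[0]) + c * (xr[1] - x[1])
    eth = xr[2] - x[2]
    # tracking control law
    v = vr * np.cos(eth) + kx * ex
    w = wr + vr * (ky * ey + kth * np.sin(eth))
    # integrate unicycle kinematics
    x += dt * np.array([v * c, v * s, w])

print(np.round([ex, ey], 3))  # tracking errors shrink toward zero
```

The image-based controller in the paper closes the same kind of loop, but on image-plane coordinates with the camera parameters estimated online rather than on known Cartesian pose.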

►Experiences on Engineering Practices

  • Global Visual Positioning System (GVPS) (Independently Conducted)

10 / 2012 ~ 03 / 2013

The GVPS, consisting of only two wide-angle-lens cameras, can precisely locate up to 16 mobile robots moving in a 6 m × 9 m area with ±1 cm and ±1° positioning error. It costs only $3,000, at least $2,000 cheaper than existing commercial products with comparable performance.

  • Vision-based Control Platform for Multi-Robots (Independently Conducted)

04 / 2013 ~ 06 / 2013

The goal was to develop a flexible and efficient vision-based control experimental platform supporting both perspective and omnidirectional on-board cameras. The visual recognition module can identify the relative positions and IDs of up to 9 different neighbours within a range of 2.5 m at 20 Hz (running on a Pentium IV CPU).

  • Autonomous Shopping in the Supermarket (Visual Recognition part)

07 / 2013 ~ 10 / 2013

In the Chinese RoboCup competition, 2013, my goal was to enable an intelligent mobile robot with a monocular camera to follow a special mark worn by a "manager" and to remember and recognize different commodities on the shelves. The robot follows the manager to memorize the commodities at different locations and is then asked to bring the requested one back to the customer. In the end, we stood out among 15 teams from all around the country and won the second prize.

Teaching Experience

  • Graduate Teaching Assistant of Fundamentals and Control of Robotics at SJTU

03 / 2014 ~ Present

Designed a presentation on visual servoing in robotics and computer vision, combining the latest videos with basic theory. Delivered the presentation to three 30-person classes comprising undergraduates, graduate students, and practicing engineers, and received appreciation from the audiences. [Visual Servo in Robotics.ppt] [Computer Vision.ppt]

Mentored students in an experiment on grasping a static cup with an eye-in-hand manipulator.