This project demonstrates an implementation of the classical Bug2 navigation algorithm in a simulated environment using Webots. It includes the development of a custom robot model and a designed simulation environment with obstacles. This project was completed as part of the Intelligent Robotics course at TUKE.
```
Bug2-Algorithm-Webots-Bot/
├── controllers/
│   └── bug_2_controller/
│       ├── bug_2_controller.py   # Main control script implementing the Bug2 algorithm
│       ├── initialization.py     # Initialization of sensors, motors, and robot parameters
│       └── navigation_utils.py   # Utility functions for navigation logic and state handling
├── protos/
│   ├── boxObject.proto           # Custom obstacle/object prototype
│   └── robotModel.proto          # Custom robot model definition
└── worlds/
    └── world.wbt                 # Webots world file with the robot and obstacles
```
- Language: Python
- Environment: Webots
- Robot Model: Custom
- Algorithm: Bug2
Demo video: `bug2_demo_video.mp4`
1. Install Webots.
2. Clone the repository: `git clone https://github.com/dmytro-varich/Bug2-Algorithm-Webots-Bot.git`
3. Open the `world.wbt` file in Webots.
4. Start the simulation.
5. The robot will begin moving toward the goal, navigating around obstacles using the Bug2 algorithm.
The Bug2 algorithm is a classical path-planning algorithm for mobile robots navigating in environments with unknown obstacles. It combines goal-oriented motion with obstacle following, making it both simple and effective for many robotic applications. Bug2 allows a robot to move towards a goal in a straight line (called the M-line) until it encounters an obstacle. When that happens, the robot follows the obstacle’s boundary until it can rejoin the M-line and continue toward the goal.
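The M-line test at the heart of Bug2 can be sketched in a few lines of plain Python. This is an illustrative sketch, not the project's controller code; the tolerance value is an assumption:

```python
import math

def point_to_mline_distance(start, goal, point):
    """Perpendicular distance from `point` to the M-line through
    `start` and `goal` (all 2D (x, y) tuples)."""
    sx, sy = start
    gx, gy = goal
    px, py = point
    dx, dy = gx - sx, gy - sy
    # |cross product| / segment length gives the point-line distance.
    return abs(dx * (py - sy) - dy * (px - sx)) / math.hypot(dx, dy)

def on_mline(start, goal, point, tolerance=0.05):
    """True if the robot is back on the M-line within `tolerance` metres."""
    return point_to_mline_distance(start, goal, point) <= tolerance
```

In the wall-following state, this check (combined with a distance-to-goal comparison against the hit point) decides when the robot may rejoin the M-line.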
The robot model in Webots represents a mobile platform with a differential drive and a set of sensors for navigation and spatial orientation. The robot is equipped with various sensors, including lidars, a compass, GPS, and ultrasonic sensors (sonars), enabling obstacle detection and environmental awareness. Each wheel is controlled by a motor and tracked with a position encoder to enable precise motion control. Both the wheels and the robot body have `boundingObject`s defined via `Shape` (`Cylinder`). Simplified physical properties (mass, density) are specified, suitable for motion simulation.
Drives and Wheels
Wheel | Position | Mass | Size | Motor |
---|---|---|---|---|
`wheel_left` | Rear left | 1 kg | r = 0.05 m, h = 0.03 m | `motor_left` |
`wheel_right` | Rear right | 1 kg | r = 0.05 m, h = 0.03 m | `motor_right` |
`wheel_rear` | Support (rear) | 5 kg | r = 0.05 m, h = 0.04 m | `motor_rear` |
Lidars
Name | Position | Field of View | Range | Resolution |
---|---|---|---|---|
`lidar_front` | Front | ~60° | 0.5 m | 256 |
`lidar_left` | Left | ~45° | 0.6 m | 256 |
`lidar_right` | Right (fixed) | ~45° | 0.6 m | 256 |
Ultrasonic Distance Sensors (Sonars)
Name | Position | Direction | Max Range |
---|---|---|---|
`sonar_front` | Front | Forward | 0.5 m |
`sonar_right` | Right | Right | 0.5 m |
`sonar_back_right` | Back right | Back-right (~30°) | 0.5 m |
`sonar_left` | Left | Left | 0.5 m |
`sonar_back_left` | Back left | Back-left (~30°) | 0.5 m |
Additional Sensors
- GPS — determines the global coordinates of the robot.
- Compass — measures orientation relative to magnetic north.
The algorithm is implemented in Python using the Webots simulation environment with a custom robot model. The navigation logic is based on two main states: `go_to_goal` (moving towards the target) and `follow_wall` (obstacle avoidance). The robot relies on GPS, compass, lidar, and ultrasonic sensors to perceive its environment and make decisions.
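As one illustrative piece of the `go_to_goal` logic, GPS and compass readings can be turned into a steering error toward the goal. The sketch below is an assumption-laden example, not the controller's code: it takes the (x, y, z) vector a Webots `Compass.getValues()` call returns and assumes a planar (x, z) axis convention, which may need adapting to the world's coordinate system:

```python
import math

def heading_from_compass(compass_values):
    """Yaw angle (radians) from a Webots-style compass vector.
    The (x, z) planar convention here is an assumption."""
    x, _, z = compass_values
    return math.atan2(x, z)

def bearing_to_goal(position, goal):
    """Angle (radians) from the robot's planar position to the goal,
    using the same convention as heading_from_compass()."""
    return math.atan2(goal[0] - position[0], goal[1] - position[1])

def steering_error(heading, bearing):
    """Signed angular error wrapped to [-pi, pi]; suitable as input
    to a simple proportional controller on the wheel-speed difference."""
    err = bearing - heading
    return math.atan2(math.sin(err), math.cos(err))
```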
At the start, the robot determines its initial position and begins moving in a straight line towards the goal (the so-called M-line). If it encounters an obstacle, it switches to the `follow_wall` state, where it follows the boundary of the obstacle. The robot records the hit point (where it first touches the obstacle) and searches for a leave point on the M-line that is closer to the goal.
While following the wall, the robot transitions between substates:
- `turn_right` — turning right when an obstacle is directly in front.
- `turn_left` — turning left if the wall is lost on the left side.
- `move_forward` — moving forward along the wall.
- `search_wall` — searching for the wall after a turn.
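A simplified sketch of how such substates could be chosen from two distance readings is shown below. The thresholds and the left-hand-wall convention are illustrative assumptions, and the `search_wall` substate is omitted for brevity:

```python
def select_wall_substate(front_dist, left_dist, safe=0.3, lost=0.6):
    """Pick a wall-following substate from front and left distance
    readings (metres). Thresholds are illustrative assumptions."""
    if front_dist < safe:
        return "turn_right"    # obstacle directly ahead: turn away from it
    if left_dist > lost:
        return "turn_left"     # wall lost on the left: turn back toward it
    return "move_forward"      # wall alongside: keep following it
```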
When the robot returns to the M-line and is closer to the goal than it was at the hit point, it switches back to `go_to_goal` and continues moving towards the target. The implementation also includes a safeguard against infinite loops: if the robot returns to the hit point on the M-line and the goal is still unreachable, it stops and concludes that the goal cannot be reached.
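The leave/abort decision described above can be sketched as a pure function. Again, this is an illustrative reconstruction of the rules, not the controller's code; the tolerance is an assumed value, and the `has_left_hit_point` flag prevents aborting immediately at the moment of first contact:

```python
import math

def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def leave_or_abort(point, hit_point, goal, on_mline_now,
                   has_left_hit_point, tol=0.1):
    """Next top-level action while wall-following: rejoin the M-line
    when measurably closer to the goal than at the hit point, abort
    after a full loop back to the hit point, else keep following."""
    if on_mline_now and dist(point, goal) < dist(hit_point, goal) - tol:
        return "go_to_goal"
    if has_left_hit_point and dist(point, hit_point) < tol:
        return "abort"  # looped around the obstacle: goal unreachable
    return "follow_wall"
```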
Dmytro Varich is the creator of this robotics project. You can learn more about his projects on his personal Telegram channel, as well as connect with him via LinkedIn (dmytro-varich) and email (varich.it@gmail.com).