Final Product
Finite State Machine
```mermaid
graph TD
    A[Wait]
    B[Forward]
    C[Backward]
    D[Left]
    E[Right]
    F[Set home]
    G[Go home]
    H[Go to Apriltag]
    I[Go to cup]
    H --> |distance from Apriltag| I
    I --> |distance from cup| G
    G --> |distance from home| A
    F --> |immediately| A
    B --> |time| A
    C --> |time| A
    D --> |time| A
    E --> |time| A
```
This is our finite state machine. You may notice that many states have no arrows leading into them. No other state transitions to them automatically, but they can still be entered: our FSM is structured so that any state can be entered at any time via an external command.
Every state has three functions: `enter()`, `step()`, and `exit()`. Whenever the state is switched, the `exit()` function of the current state runs first, then the `enter()` function of the new state runs. The run loop calls the current state's `step()` function 20 times per second.
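The enter/step/exit pattern above can be sketched in plain Python. The class and state names here are illustrative, not the actual identifiers in `neato_control.py`; in the real node the 20 Hz loop would be a ROS timer.

```python
class State:
    """Base class for FSM states (names are assumptions, not the real code)."""
    def enter(self):
        pass  # one-time setup when the state becomes active

    def step(self):
        pass  # called 20 times per second while the state is active

    def exit(self):
        pass  # cleanup when leaving the state


class FSM:
    def __init__(self, initial):
        self.current = initial
        self.current.enter()

    def switch(self, new_state):
        # exit() of the old state always runs before enter() of the new one
        self.current.exit()
        self.current = new_state
        self.current.enter()

    def spin_once(self):
        # invoked by the run loop at 20 Hz (a ROS timer in practice)
        self.current.step()
```

A state that needs timing (e.g. Forward returning to Wait after a delay) would record a start time in `enter()` and check it in `step()`.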
The `neato_control.py` node subscribes to the `vora_command` topic, and whenever a message arrives, the node attempts to enter the commanded state. If the state is recognized, the node switches to it, cleanly exiting the previous state and entering the new one. It switches even when the commanded state is already the current state, so that a state can be restarted if necessary.
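The command-handling behavior described above can be sketched as a small dispatcher. The structure and names are assumptions for illustration; in the real node, `on_command` would be the `vora_command` subscription callback. Note that a repeated command still triggers a full exit/enter cycle, which is what allows restarting a state.

```python
class DummyState:
    """Stand-in for a real state object with enter/step/exit."""
    def __init__(self, name):
        self.name = name
        self.enter_count = 0

    def enter(self):
        self.enter_count += 1

    def exit(self):
        pass


class CommandDispatcher:
    """Sketch of the vora_command callback logic (hypothetical names)."""
    def __init__(self, states, initial):
        self.states = states            # command string -> state object
        self.current = states[initial]
        self.current.enter()

    def on_command(self, msg):
        new_state = self.states.get(msg)
        if new_state is None:
            return                      # unrecognized command: keep current state
        # switch even if it's the same state, so the state restarts
        self.current.exit()
        self.current = new_state
        self.current.enter()
```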
Node Diagram
```mermaid
graph TD
    A[neato_control]
    B[voice_handler]
    C[teleop]
    D[Neato]
    B & C --> |vora_command| A
    A --> |cmd_vel| D
    D --> |odom| A
    D --> |camera/image_raw| A
```
This is the structure of our ROS2 nodes and the topics that link them. The `vora_command` topic controls the finite state machine in the `neato_control` node, and both the `voice_handler` and `teleop` nodes publish to it. This lets us easily test the control algorithms on the Neato without using voice commands, while making the integration of voice control as simple as running a different node.
The Neato is controlled by the `cmd_vel` topic, and the only node in our system that publishes to that topic is `neato_control`. This ensures that the Neato is only ever told to do one thing at a time, which makes isolating problems during debugging much simpler. The Neato also publishes its odometry data to `odom` and its camera feed to `camera/image_raw`, which the `neato_control` node uses for odometry-based closed-loop navigation, computer vision navigation with Apriltags, and object identification and tracking.
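As one example of the odometry-based closed-loop navigation, the "Go home" behavior could be a proportional controller that steers toward the stored home position and stops within a tolerance. This is a sketch under assumed gains and function names, not the actual controller in `neato_control.py`; in the real node the returned velocities would be published as a `Twist` on `cmd_vel`.

```python
import math

def go_home_cmd(pose, home, k_lin=0.5, k_ang=1.0, tol=0.05):
    """Compute (linear, angular, done) velocities toward a home point.

    pose: (x, y, heading) from odom; home: (x, y) saved by "Set home".
    Gains k_lin/k_ang and tolerance tol are illustrative values.
    """
    x, y, theta = pose
    dx, dy = home[0] - x, home[1] - y
    distance = math.hypot(dx, dy)
    if distance < tol:
        # close enough: the FSM would transition Go home -> Wait here
        return 0.0, 0.0, True
    # heading error, wrapped to [-pi, pi]
    heading_error = math.atan2(dy, dx) - theta
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    # drive forward proportionally to distance, turn proportionally to error
    return k_lin * distance, k_ang * heading_error, False
```

The `step()` function of a "Go home" state would call something like this each tick and publish the result, while the `done` flag drives the "distance from home" transition in the FSM diagram.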