An example of an interaction is the following: pressing the Left Mouse Button (LMB) on a robot - which is both an Object3D and an Event Handler (EH) - selects it and makes it the Active Event Handler (AEH); subsequently clicking on a person - which is just an Object3D - delivers the event to the robot, because it is the AEH. The robot's EH can then display a popup listing the actions that the robot can perform on the clicked person. A simplified view of how interaction is processed is given by the chain of responsibility represented in the rightmost diagram.
To make any C++ class an EH, the developer only has to inherit from the EH virtual base class. When an instance becomes the AEH (normally because the user clicked it while no other AEH was active), the user's input is redirected to it. From the user input - keyboard or mouse - the AEH can decide which actions to take. The AEH knows the type of the clicked Object3D by inspecting its Tags, which can be attributes, names, etc.
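The dispatch described above can be sketched roughly as follows. The class names Object3D, EventHandler, and the robot/person scenario come from the text; the member names (onEvent, click, tag) and the string-based tags are illustrative assumptions, not the framework's actual API.

```cpp
#include <cassert>
#include <string>
#include <utility>

// Assumed stand-in for the framework's Object3D; the real class would
// carry geometry and a richer Tags structure than a single string.
struct Object3D {
    std::string tag;  // e.g. "robot" or "person"
    explicit Object3D(std::string t) : tag(std::move(t)) {}
    virtual ~Object3D() = default;
};

// Assumed stand-in for the EH virtual base class the developer inherits from.
struct EventHandler {
    virtual ~EventHandler() = default;
    // Called when an event is delivered to this handler as the AEH;
    // returns a description of the action taken (illustrative only).
    virtual std::string onEvent(const Object3D& clicked) = 0;
};

// A robot is both an Object3D and an EventHandler, as in the example.
struct Robot : Object3D, EventHandler {
    Robot() : Object3D("robot") {}
    std::string onEvent(const Object3D& clicked) override {
        // The AEH decides what to do by inspecting the clicked object's tag.
        if (clicked.tag == "person")
            return "show action popup for person";
        return "ignore";
    }
};

// Hypothetical dispatcher modelling the chain of responsibility: with no
// AEH, clicking an EventHandler activates it; otherwise the event is
// delivered to the current AEH.
struct Dispatcher {
    EventHandler* active = nullptr;  // the AEH, if any
    std::string click(Object3D& obj) {
        if (active) return active->onEvent(obj);
        if (auto* eh = dynamic_cast<EventHandler*>(&obj)) {
            active = eh;  // the clicked object becomes the AEH
            return "activated " + obj.tag;
        }
        return "no handler";  // plain Object3D clicked with no AEH
    }
};
```

With this sketch, clicking the robot first activates it, and a subsequent click on a person is routed to the robot's onEvent, mirroring the LMB example in the text.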
The major guideline when developing the interaction part of the framework was to impose as few constraints on the developer as possible, while ensuring a minimum set of rules that guarantee consistent behavior across the composed applications. The interaction logic is entirely contained in ERF, which keeps the components free of interaction code and further eliminates the possibility of unpredictable behavior.