We explore the degree to which movement, together with facial features such as the eyes and mouth, can be used to convey mental states in a humanoid robot. Several animation variants were iteratively tested in a series of experiments, resulting in a set of five expressive states that the robot can convey reliably. These expressions combine biologically motivated cues, such as eye movements and pupil dilation, with elements that carry only conventional significance, such as changes in eye color.