The human haptic system, unique among the senses, provides bidirectional communication between humans and their physical environment. Yet, to date, most human-computer interactive systems have focused primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Extending the frontier of visual computing, haptic interfaces, or force-feedback devices, have the potential to improve the quality of human-computer interaction by engaging the sense of touch. They provide an attractive augmentation to visual display and enhance understanding of complex data sets. They have been used effectively in a number of applications, including molecular docking, manipulation of nano-materials, surgical training, virtual prototyping, and digital sculpting. Compared with visual and auditory display, haptic rendering has extremely demanding computational requirements: to maintain a stable system while displaying smooth and realistic forces and torques, haptic update rates of 500-1000 Hz or more are typically required. Haptics presents many new challenges to researchers and developers in computer graphics and interactive techniques. Critical issues include the development of novel data structures to encode shape and material properties, as well as new techniques for geometry processing, data analysis, physical modeling, and haptic visualization.

This synthesis examines some of the latest developments in haptic rendering, while looking forward to exciting future research in this area. It presents novel haptic rendering algorithms that take advantage of the human haptic sensory modality. Specifically, it discusses rendering techniques for various geometric representations (e.g., point-based, polygonal, multiresolution, and distance fields), as well as for textured surfaces.
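To make the update-rate requirement concrete, the sketch below illustrates a simple penalty-based haptic servo loop running at roughly 1 kHz, where the force on the probe is derived from a signed distance field (here, a sphere). This is only a minimal illustration under assumed conventions; the device hooks readProbePosition and commandForce are hypothetical stubs, not part of any particular haptic SDK or of the specific algorithms discussed in this synthesis.

// Minimal sketch of a ~1 kHz penalty-based haptic servo loop driven by a
// signed distance field. The device hooks (readProbePosition, commandForce)
// are hypothetical stubs standing in for a vendor force-feedback API.
#include <array>
#include <chrono>
#include <cmath>
#include <cstdio>
#include <thread>

using Vec3 = std::array<double, 3>;

// Signed distance and outward gradient for a sphere obstacle; a negative
// distance means the haptic probe has penetrated the surface.
struct SphereField {
    Vec3 center{0.0, 0.0, 0.0};
    double radius = 0.05;  // meters

    double distance(const Vec3& p, Vec3& grad) const {
        Vec3 d{p[0] - center[0], p[1] - center[1], p[2] - center[2]};
        double len = std::sqrt(d[0]*d[0] + d[1]*d[1] + d[2]*d[2]);
        double inv = (len > 1e-9) ? 1.0 / len : 0.0;
        grad = {d[0]*inv, d[1]*inv, d[2]*inv};
        return len - radius;
    }
};

// Stub: pretend the probe sits 1 cm inside the sphere surface.
Vec3 readProbePosition() { return {0.04, 0.0, 0.0}; }

// Stub: a real loop would send this force to the device every cycle.
void commandForce(const Vec3& f) {
    std::printf("force = (%.2f, %.2f, %.2f) N\n", f[0], f[1], f[2]);
}

int main() {
    SphereField obstacle;
    const double stiffness = 800.0;                        // N/m, illustrative gain
    const auto period = std::chrono::microseconds(1000);   // ~1 kHz update rate

    for (int i = 0; i < 1000; ++i) {  // one simulated second of servo cycles
        auto next = std::chrono::steady_clock::now() + period;

        Vec3 grad;
        Vec3 p = readProbePosition();
        double d = obstacle.distance(p, grad);

        // Penalty force pushes back along the surface normal only while the
        // probe penetrates the obstacle (d < 0).
        Vec3 force{0.0, 0.0, 0.0};
        if (d < 0.0) {
            force = {-stiffness * d * grad[0],
                     -stiffness * d * grad[1],
                     -stiffness * d * grad[2]};
        }
        commandForce(force);

        std::this_thread::sleep_until(next);
    }
}

In a real system the loop must read the device position and command a new force every millisecond or so; missing that deadline, or raising the penalty stiffness beyond what the update rate can support, quickly produces unstable, buzzing contact. This is one reason haptic rendering is so much more computationally demanding than visual or auditory display.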