Abstract: Virtual Reality allows the simulation of, and interaction with, Virtual Environments (VEs) through different sensory modalities. However, interacting with complex physically based VEs, such as non-rigid or large environments, presents many challenges in terms of interaction and sensory feedback. The first part of this Ph.D. thesis addresses haptic and multimodal feedback issues during the manipulation of non-rigid media. We first present a novel approach for 6-degree-of-freedom haptic interaction with fluids, allowing the generation of force feedback from viscous fluids acting on arbitrarily shaped rigid bodies. This approach is then extended to include haptic interaction with deformable bodies, enabling unified haptic interaction with the different states of matter. A perceptual experiment showed that users could efficiently identify the different states through the haptic modality alone. Next, we introduce a novel vibrotactile fluid rendering model that leverages previous work on fluid sound synthesis. This approach enables interaction with fluids with multimodal feedback through the vibrotactile, kinesthetic, acoustic, and visual sensory channels.

The second part of this Ph.D. thesis addresses interaction issues during walking navigation in large VEs. Since the VE is often larger than the available real workspace, we introduce a novel navigation metaphor that informs users about the real physical boundaries. Using hybrid position/rate control, this technique provides a simple and intuitive metaphor for navigation that is safe from collisions and breaks in immersion. Other workspaces, such as CAVE-like systems, present rotation boundaries due to missing screens; we therefore present three novel metaphors that handle these additional boundaries. Overall, the evaluation of these navigation techniques showed that they efficiently fulfilled their objectives while being highly appreciated by users.