Abstract: The multi-disciplinary work of this thesis contributes to two research domains: Human-Computer Interaction (HCI), in particular Augmented Reality (AR), and Computer Assisted Medical Intervention (CAMI), a concrete application domain of AR. The contribution is twofold: on the one hand, it enriches the HCI design concepts and methods for AR by adopting a task-centered point of view; on the other hand, it improves the clinical usability of CAMI systems. Merging the real and virtual worlds opens a wide range of possibilities for the design of AR systems, such as CAMI systems. However, there is currently no consensus on either a precise definition of AR or a design space for it. Within this context, we have identified the intrinsic characteristics of AR systems and then developed a classification space. To gain an understanding of the duality of interaction (users interacting with part of the real world and simultaneously with part of the virtual world), we have developed the ASUR notation, which describes this interaction. ASUR identifies the entities involved in the system, in particular the adapters: one type of ASUR component, which establishes a bridge between the real and virtual worlds. The ASUR notation constitutes the basis of our ergonomic and software design methods for AR systems. Our ergonomic design method is grounded in the translation of ergonomic properties into ASUR terms, while our software design method is based on a specialization of the PAC-Amodeus software architecture model for AR systems. In collaboration with a surgeon, we have applied and tested our methodological results on a cardiac surgery application, CASPER, which provides the surgeon with on-screen guidance information while performing a pericardial puncture. A new version of CASPER has been designed that uses a see-through head-mounted display, enabling the surgeon to perceive the guidance information superimposed on the patient.