This paper addresses the design of embedded systems for outdoor augmented reality (AR) applications integrated into see-through glasses. The set of tasks includes object positioning, graphics computation, and wireless communications, under constraints such as real-time operation, low power, and low footprint. We introduce an original sailor-assistance application as a typical, useful, and complex outdoor AR application, in which context-dependent virtual objects must be placed in the user's field of view according to head motions and ambient information. Our study demonstrates that power optimization is worthwhile, since an embedded system based on a standard general-purpose processor (GPP) plus a graphics processing unit (GPU) consumes more power than high-luminosity see-through glasses. This work then presents three main contributions. The first is the choice and combination of position and attitude algorithms that fit the application context. The second is the architecture of the embedded system, in which we introduce a fast and simple object processor (OP) optimized for the domain of mobile AR. Finally, the OP implements a new pixel rendering method, the incremental pixel shader (IPS), which is implemented in hardware and takes full advantage of the OpenGL ES light model. A complete GP+OP(s) architecture is described and prototyped on a field-programmable gate array (FPGA). It includes hardware/software partitioning based on an analysis of application requirements and ergonomics.