Google has released a new set of best practices aimed at making augmented reality more enjoyable on the company's Android mobile platform. The best practices should also enable developers to bring AR functionality to their existing applications more easily and to create new AR-driven applications from the ground up. The primary point of the guidelines is to make interactions with user-facing AR features more intuitive, with a specific focus on the design elements involved. Google also plans to publish a more comprehensive guide on its developer pages in the near future for those who want to go in-depth with the design principles.
The first area of AR development addressed by Google involves the constraints placed on AR smartphone applications by their very nature. Since the interactions take place on a mobile device, users will typically need to hold the phone in at least one hand while interacting with a given application. That means developers should create immersion by using as many of the tools available on a mobile device as possible, from the device's sensors and camera to the touchscreen and the use of real-world coordinates for in-app digital objects. It also means building interactions with an understanding of the constraints placed on the user, including varying screen sizes and the environment the app is intended to be used in. In fact, Google says that every application needs what it calls a "dedicated experience space." Developers should consider not only the physical space the user will be interacting within but also the range of motion the app requires, while taking advantage of ARCore's built-in capabilities, such as its ability to detect multiple, overlapping planes at various elevations in the user's environment. According to the company, that makes using visual cues to direct AR interactions much more intuitive and immersive. It also leads to another of Google's best practices: developers should use plane detection, digital object depth, and phone position to world-lock objects and UI elements in order to encourage users to move throughout the experience space they've created.
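To make the plane-detection and world-locking ideas concrete, here is a minimal Kotlin sketch against ARCore's public API. It assumes an already-configured Session and the latest Frame from session.update(); the function name and tap coordinates are illustrative, not from Google's guide.

import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Enumerate the planes ARCore has detected at different elevations,
// then world-lock a digital object to the plane the user tapped.
fun placeObjectOnTappedPlane(session: Session, frame: Frame, tapX: Float, tapY: Float): Anchor? {
    // getAllTrackables returns every plane detected so far; planes can
    // sit at various elevations (floor, tabletop, and so on).
    val trackedPlanes = session.getAllTrackables(Plane::class.java)
        .filter { it.trackingState == TrackingState.TRACKING }
    trackedPlanes.forEach { plane ->
        // centerPose.ty() is the plane's elevation in world coordinates.
        android.util.Log.d("AR", "Plane type=${plane.type} elevation=${plane.centerPose.ty()}")
    }

    // Ray-cast the tap through the camera image; keep only hits that land
    // inside a detected plane's polygon, so objects are never placed
    // somewhere a user would find implausible.
    val hit = frame.hitTest(tapX, tapY).firstOrNull { result ->
        val trackable = result.trackable
        trackable is Plane && trackable.isPoseInPolygon(result.hitPose)
    } ?: return null

    // Creating an anchor from the hit world-locks the object: ARCore keeps
    // updating the anchor's pose as its understanding of the world improves.
    return hit.createAnchor()
}

Anchors, rather than fixed coordinates, are what keep a placed object stable as the user walks around it, which is the behavior Google's guidelines are encouraging.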
Meanwhile, 2D screen-locked UI can be added to the mix to vary the experience and make things more immersive. Plane detection should also be used to ensure that digital objects are only placed where a user would accept that they could feasibly sit, so that placement doesn't break immersion. Those objects should stand out in a way that makes clear they can be interacted with, while using the available display and lighting technologies to ensure they still appear natural in the augmented scene. Finally, immersion can be achieved through interaction mechanics ranging from digital and real-world object interactions to browsing, information displays, and visual guidance. Developers should also consider using raycasting or an in-app reticle to help users judge distance and to make implementing intent and focus easier, with those UI elements scaling or reacting based on distance and where the focus is centered. The same rule can be applied to interactive elements or characters in the in-app world, which can react differently as the user moves closer to or farther from a given object or character.
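As a rough sketch of the reticle and distance-scaling ideas, the snippet below casts a ray from the screen center using ARCore's hit-test API and derives a scale factor from the resulting distance. The scaling rule is an illustrative assumption, not a value from Google's guidelines, and screenWidth/screenHeight are simply the viewport dimensions in pixels.

import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Drive a screen-center reticle with a ray cast through the middle of
// the viewport, returning the distance to the surface it lands on.
fun reticleDistance(frame: Frame, screenWidth: Float, screenHeight: Float): Float? {
    val hit = frame.hitTest(screenWidth / 2f, screenHeight / 2f)
        .firstOrNull { it.trackable is Plane } ?: return null
    // HitResult.getDistance() reports the distance from the camera to the
    // hit point in meters, which is what a distance-judging reticle needs.
    return hit.distance
}

// Illustrative heuristic (an assumption, not from the guide): grow a
// world-locked label as the user backs away so it reacts to proximity,
// clamped so it stays readable at both extremes.
fun labelScaleFor(distanceMeters: Float): Float =
    (distanceMeters / 1.5f).coerceIn(0.5f, 3.0f)

The same distance value could just as easily drive a character's behavior, for instance switching from an idle to a greeting animation once the user comes within a meter or two.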