If you’re looking to develop an Augmented Reality (AR) app, or extend the functionality of a mobile software offering, there’s a lot to consider before you get started. The truth is, there are many ways to approach the design and development of an AR app, and the best approach will always depend on the requirements of your project. Start with the end in mind: establish your key measures of success and make sure you have a mechanism in place to measure the performance of your app. Measuring success can be tricky, which is why having the right data and analytics is essential. Whether you’re trying to engage users and stimulate downloads, generate a new revenue stream or cut costs by enhancing your business processes, you’ll need to consider how to measure performance before a single line of code is written. Today we’re exploring how to develop an Augmented Reality app in 5 easy steps.
Step 1: The environment
The first part of the process is to consider your requirements and establish which platform you’re trying to target. If you’re looking to develop an AR app that will be accessible to millions of users, you’ll need to think about developing software for mobile devices and consider how iOS and Android will fit into your technical roadmap. If you’re a software development manager, you might be looking to extend the functionality of an existing mobile app into AR. Alternatively, you might be looking to create an AR initiative from scratch which will mean recruiting AR developers or engaging with an agency to create the appropriate development capability.
If you’re developing an AR app for enterprise, the options will be slightly different, in that you’ll be developing apps for HUDs (heads-up displays) such as the Microsoft HoloLens or the DAQRI Smart Helmet. These platforms focus heavily on the application of Augmented Reality within an industrial or enterprise context, where AR technologies can be deployed to generate new revenue streams and optimise business processes and production workflows. If you’re developing an Augmented Reality app for HoloLens or DAQRI (or any of the other HUD manufacturers) you’ll need to think about voice control, gesture and gaze recognition. If your AR app is being developed in an industrial context, specifically for DAQRI or a similar product, you’ll need to think carefully about how to utilise features such as the stereo infrared camera.
Step 2: Development system
There is a range of tools that can help you create Augmented Reality systems. Game development engines such as Unity and Unreal are usually the tool of choice for developing Augmented Reality apps (primarily Unity), but there are plenty of other ways to go. ARToolKit is open source, providing AR capabilities in a range of environments including iOS and Android. If you want to use native development environments such as Xcode and have AR as an added feature, this could be a good way to go. Then there are toolkits that give you specific abilities. Here are some of the toolkits you can consider using (we’ll cover this in more detail in a later post):
Vuforia is great for creating tracking-based AR, where you tie 3D objects to a 2D image. It generates a digital code representing the image, then recognises that image in real time and places your 3D object in the appropriate location. This is often used to show 3D images from catalogues or magazines, though there are many other uses, such as identifying industrial components. It works as a plugin to Unity.
Vuforia Pro gives a 1-5 rating for how easily recognisable an image is. By tweaking images subtly you can dramatically improve the score, though this takes some serious work in Photoshop. At this point it’s very useful to have a skilled team of artists you can hand the problem to and say ‘make it so’ in order to deliver the appropriate results.
Wikitude is a toolkit with a range of capabilities such as location-based tracking (think Pokémon Go). There is a whole range of applications for this, from virtual tours in museums and visitor attractions to site management. It also has some useful cloud-based object recognition built in, and offers a Unity plugin as well as a standalone SDK.
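The heart of location-based tracking like this is deciding whether a point of interest is close enough to the user to display. Here is a minimal sketch using the haversine great-circle distance; the coordinates, the 100-metre radius and the function names are illustrative assumptions, not any SDK's API.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def poi_in_range(user, poi, radius_m=100):
    """Show a point of interest only when the user is within radius_m of it."""
    return haversine_m(user[0], user[1], poi[0], poi[1]) <= radius_m

user = (51.5007, -0.1246)                      # illustrative user position
nearby = poi_in_range(user, (51.5008, -0.1247))  # a few metres away
```

A real toolkit wraps this up with GPS polling, compass heading and screen placement, but the in-range test is the trigger that decides what to render.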
Kudan is another SDK which has SLAM (simultaneous localisation and mapping) incorporated. This is a set of algorithms which analyses the environment and places visual artefacts into that space. The Hololens development kit does the same sort of thing. It allows a high level of interaction with the environment, analysing surfaces, depth, etc. This comes at the cost of a lot of processing power but can give truly impressive results.
These are just some of the tools available. Some are free, some come at a cost, and new ones are coming out all the time, so it’s difficult to create an exhaustive list. If you’re a software development manager or CTO, you also have the option of writing the code from scratch. Effectively, you are displaying a view of the real world using the inbuilt cameras, then overlaying 3D objects and mapping them into that environment. The toolkits above are shortcuts. You may get more control by writing from scratch; it will take a lot of time, but it gives you the ultimate level of control if your project requires it. For most purposes, using a toolkit is a great way to get started in AR, but writing from scratch is always an option.
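To make the "overlaying 3D objects" idea concrete, here is a minimal sketch of the one calculation every from-scratch AR renderer performs: projecting a 3D point in camera space onto the 2D image plane using pinhole camera intrinsics. The intrinsic values are illustrative; real toolkits also handle tracking, lens distortion and pose estimation.

```python
def project_point(x, y, z, fx, fy, cx, cy):
    """Project a 3D point (camera coordinates, metres) to pixel coordinates.

    fx, fy are the focal lengths in pixels; cx, cy is the principal point.
    """
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = fx * (x / z) + cx  # horizontal pixel position
    v = fy * (y / z) + cy  # vertical pixel position
    return u, v

# Example: a point one metre ahead and 10 cm to the right of centre,
# with illustrative intrinsics for a 1280x720 camera.
u, v = project_point(0.1, 0.0, 1.0, fx=1000, fy=1000, cx=640, cy=360)
```

Everything a toolkit adds — image targets, SLAM, gaze — ultimately feeds a pose into a projection like this so the virtual object lands on the right pixels.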
Step 3: Assets
Your artwork: the 3D objects you are going to display. Ultimately, no matter how clever the background code is, this is how people will judge your app. It has to look and feel good, which means having a slick UI and UX. If you’re using animation, it has to be smooth and intuitive. You can download models from the internet, paid or free, or you can go the high-quality route and design custom assets in conjunction with an experienced AR development agency.
There are performance considerations as well, of course. Jamming thousands of objects onto the screen can be technically fun, but it can overwhelm your users. You are better off using fewer screen objects and using them well. The psychologist George Miller wrote a famous paper, ‘The Magical Number Seven, Plus or Minus Two’, which argues that we can really only hold about seven things in mind at a time (plus or minus two). So putting hundreds of items up isn’t particularly useful if you need to focus the user’s attention. If you’re mapping Bitcoin transactions across the globe, then go for it: jam in as many objects as your GPU can handle. But as a general rule, be a miser with your objects. There is nothing impressive about the frame rate crawling along, no matter how good the art is.
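One simple way to act on Miller's rule is to score your overlay objects by relevance and cap how many are drawn per frame. This sketch assumes a hypothetical `relevance` score on each object; in Unity you would do the equivalent by toggling GameObjects on and off.

```python
def visible_objects(objects, limit=7):
    """Keep only the `limit` most relevant objects (Miller's seven, plus or minus two)."""
    ranked = sorted(objects, key=lambda o: o["relevance"], reverse=True)
    return ranked[:limit]

# Fifty candidate overlay objects with illustrative relevance scores.
candidates = [{"id": i, "relevance": i % 10} for i in range(50)]
shown = visible_objects(candidates, limit=7)
```

The cap keeps both the user's attention and the GPU budget under control; the scoring function is where your domain knowledge goes.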
Step 4: Coding time
So you know your target environment, your development environment and you’ve got your art assets all loaded, shiny and ready for use.
OK, so stop – let me ask you a question. When will you know you’ve finished? What criteria does the app have to fulfil? I’m not saying you have to go into a full TDD (test-driven development) cycle, but know your goals and how you’ll identify when you’ve achieved them. This relates back to understanding your key measures of success and KPIs at the start of the project.
Once you start throwing code into the system it’s very easy to get carried away. Set your goals. Write code. Test to check it has met your goals. There is a whole range of development methodologies to guide you through this process. The simple fact is, if you don’t build quality in from the start, you won’t get it. Unity has its own unit testing framework, and it integrates with some fantastic tools, such as Visual Studio. Use them to keep your code clean.
Do the same in Unity, or whatever you choose to use: keep the hierarchies of objects, scripts, assets and so on clean. It’ll make your life so much easier.
The code can be broken down into three subsystems:
Display. This is pretty obvious. It shows the 3D artefacts in the environment, dealing with cameras, positions, orientations, skins/textures, all the pretty stuff.
Interactions. This is vitally important. How can the user interact with the objects? How do the objects interact with the environment, and specifically, how will they interact in the target environment(s)? Will they be triggered by mouse, by finger swipes, by voice or by gaze? Putting these in their own specific section makes life much, much easier because, if something has gone wrong, ninety percent of the time it’s here. This is also where you will make the most changes as requirements evolve. So make it easy for yourself: extract this code from the display code, even if it’s just by putting it in a different area of the source file.
Support. Any program, whether AR or not, needs a purpose. This is the logic, the underlying code that actually does the work. Again, keep it separate from the Display and Interaction code. The beauty of doing this is that it allows you to test it independently and build up unit tests you can run automatically, without the need for user clicks, swipes, etc. This makes testing faster and automatically checks for regressions: fewer headaches for the developer and better quality code.
Breaking it down in this way will also allow you to easily extend and reuse your code. I know I’m preaching to the choir with experienced developers but newer developers may not have experienced the sheer joy of going back into your code library, pasting in tested and fully refactored code then sitting back in your chair with the happy sigh of a developer who has a free afternoon ahead of them.
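The three-way split above can be sketched in a few lines. This is plain Python for illustration (in Unity these would be separate C# scripts); the class and method names are hypothetical. The point is that the Support layer takes no UI input at all, so it can be unit-tested without simulated clicks or swipes.

```python
class Support:
    """Underlying logic: here, tracking which item the user has selected."""
    def __init__(self):
        self.selected = None

    def select(self, item_id):
        self.selected = item_id

class Interactions:
    """Maps raw input events (tap, voice, gaze) onto Support-layer actions."""
    def __init__(self, support):
        self.support = support

    def on_tap(self, item_id):
        self.support.select(item_id)

class Display:
    """Renders state; knows nothing about input devices or business rules."""
    def render(self, support):
        return f"highlight:{support.selected}"

support = Support()
ui = Interactions(support)
ui.on_tap("engine_part_3")       # a tap arrives from the target environment
frame = Display().render(support)  # the display just reflects Support state
```

Swapping finger taps for gaze selection later means touching only the Interactions class, which is exactly where the text above says most of your changes will land.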
Step 5: Errors and Testing
The image you’re tracking to might not be recognised. The surface you want to put the display on won’t track. The GPS isn’t picking up the location, and your wonderful car design is currently displaying three miles down the road. These are all factors you will have to consider carefully when testing your AR app.
It is vital to understand the limitations of the technology and come up with a backup plan. For a game it’s frustrating if things go wrong. For a medical or industrial application it can be catastrophic so always, always have a plan B.
Assuming you’ve used good testing practices in building the code, we’re ready to test the whole thing. Make it fail. Make it fail in every way you can. Test what happens when it fails. Check when the GPS is turned off. Test when you spill coffee on the image you’re tracking. (That happened to one of our sales guys – fortunately we’d built in a ‘no-tracker’ switch so he could still display the images while a new image was printed out.) Never ever leave the end-user scratching their heads wondering what is happening. Also don’t leave drones flying around the sky with you disconnected from them, but that’s another story.
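A 'plan B' like that no-tracker switch can be as simple as an explicit fallback chain: pick the richest display mode the current conditions support, and never leave the user with a blank screen. The mode names and conditions here are illustrative assumptions, not any SDK's API.

```python
def choose_display_mode(tracker_found, gps_available):
    """Degrade gracefully: anchored tracking, then GPS, then a screen-locked plan B."""
    if tracker_found:
        return "anchored"      # 3D object pinned to the tracked image/surface
    if gps_available:
        return "location"      # coarse placement from GPS
    return "screen_locked"     # plan B: show the model without any tracking

# Coffee spilled on the tracker image, GPS still on:
mode = choose_display_mode(tracker_found=False, gps_available=True)
```

Testing then means forcing each condition to fail in turn and checking that the app lands in the mode you intended rather than in an undefined state.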
There is a great deal of satisfaction in clicking that build button, running your app and seeing the virtual and physical worlds bridged. Whether it’s a simple object displayed on your coffee table or a high quality medical application supporting laser surgery the key values are the same. Display what’s needed. Don’t display the irrelevant. Always degrade gracefully and in a controlled way.
Of course, it’s much simpler to have the software developed for you by specialists. Here at Augnite, we can deploy a team of highly trained project managers, developers, artists and optimisation experts who have been working with Augmented Reality for over five years, helping our clients to develop successful AR apps through an agile approach to data-driven development and execution. We have experts in artificial intelligence and commercial data systems. For simple AR applications, though, it is entirely possible to do it on a smaller scale. Going through the five steps above will get you there.
If you’re a software development manager or CTO seeking a development partner and looking for a quote, scope of work and timescale for delivery, it’s worth working through each of the 5 steps before you engage a partner. Understand their processes for developing art assets as well as code. Understand how they test both the code and the final product, and what the environment you want to deploy to is capable of. Understand the development environment and any toolkits it commits you to. Understand what happens when the system fails.
Augmented and mixed reality are bridging worlds, bringing devices from the Internet of Things (IoT) into our virtual space alongside virtual artefacts. Everything from how we do business to how we control our homes is set to change. It allows us to integrate artificial intelligence into our daily processes to ensure we limit our attention to the relevant and significant. Or it can put a cute kitten in your coffee cup. We’re on the verge of reality engineering, and these first steps in augmented reality are opening the door. If you’re a business or brand looking to develop an Augmented Reality app, contact Augnite today to kick-start the conversation.