Choosing An Engine For Your Mobile Game, AR, Or VR App

Whether you’re making a mobile game, an AR app, or a VR app, you’ll need to choose the right tools for the job. You may prefer to develop your own custom tools, or you may opt for off-the-shelf solutions to save money and time. We’ll focus on the latter and reveal the game engines that can bring your app ideas to fruition.

1. Why You Need A Game Engine For Developing Interactive Experiences

Creating interactive experiences such as games, AR, and VR apps is usually a lot harder than developing standard applications. Developers often spend thousands of hours developing, debugging and testing their interactive applications before deployment. With the right tools, they can reduce their costs and time to market (TTM) significantly. And the most suitable tools for making interactive applications are game engines.

What a good game engine brings to the table is a suite of tools that integrate properly with one another and with third-party tools. These tools may include an animator component, audio mixer, content management system, scene graph, shader graph, scripting language, level editor, mesh editor, and tilemap editor, to name a few. While any talented development team can build all these tools themselves, it’s a costly and time-consuming process.

But what makes modern commercial game engines so compelling is their ability to export projects to all the most popular platforms with a single click. Thus, it’s no longer necessary to use multiple programming languages and toolchains when targeting more than one platform.

2. Not All Engines Are Created Equal

The two most popular game engines on the market at the moment are Unity and Unreal Engine. And there’s a good reason for this, as both offer more comprehensive and robust suites of tools than their competitors. Furthermore, the companies behind these engines, namely Unity Technologies and Epic Games, are well-funded and invest heavily in their respective flagship tools.

However, the game engine development space doesn’t stand still, and there’s a growing number of alternatives on the market. In recent years, the open-source Godot engine has made significant inroads in this space. It’s a more lightweight alternative to Unity that offers comparable features and tools, especially for developing 2D games. Yet it doesn’t quite match Unity’s 3D, AR, and VR capabilities, nor does it export to as many platforms.

3. Costs Of Using Commercial Engines

The game engine market is incredibly competitive, and that’s forced companies to rethink their pricing policies in recent years. Both Unity and Unreal Engine have a free tier aimed at indie developers who operate on a shoestring budget.

With Unity Personal, an individual or small development team doesn’t have to pay a cent if they earn less than $100,000 in 12 months. And if they make more than that amount, they’ll have to upgrade to the Plus or Pro tier. Unity Plus costs $399 per year for one seat, and Unity Pro costs $1,800 per year for one seat.

On the other hand, Unreal Engine has an entirely different licensing and pricing model. Developers can choose either the Creators License or the Publishing License, both of which are free to use. Those working on custom, free, internal, or linear projects should choose the Creators License, while those developing off-the-shelf interactive experiences should opt for the Publishing License. The latter requires developers to pay a 5% royalty once a product earns over $1 million in gross revenue during its lifetime.
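To put that royalty model in concrete terms, here is a small illustrative sketch, assuming the 5% applies only to lifetime gross revenue above the $1 million threshold:

// Illustrative sketch of the Publishing License royalty, assuming the 5%
// applies only to lifetime gross revenue above the $1,000,000 threshold.
fun unrealRoyaltyOwed(lifetimeGrossRevenue: Double): Double {
    val threshold = 1_000_000.0
    val royaltyRate = 0.05
    return maxOf(0.0, lifetimeGrossRevenue - threshold) * royaltyRate
}

// Example: a title that has grossed $1.5 million owes 5% of $500,000 = $25,000.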

4. Cross-Platform Considerations

Most modern game engines make it possible to export to a wide variety of platforms. Both Unity and Unreal Engine development teams work closely with all the leading platform holders. When new game consoles, mobile devices, AR, or VR headsets hit the market, Unity and Unreal Engine will almost always support these from the get-go. So, if you plan to target multiple platforms and future-proof your upcoming project, then you can’t go wrong with either engine.

5. When To Choose An Open Source Engine Over A Commercial Game Engine

In most cases, you’ll want to work with a commercial engine vendor, as they’ll regularly provide the features, updates, and support you’ll need. But an open-source engine could have certain unique features and tooling that are more suitable for your project. Ultimately, you’ll want to complete your project quickly and efficiently, so choose the right tool for the job.

An open-source engine also allows you to view and change its code, which isn’t possible with most commercial game engines. For example, Unity feels like a black box to most developers because they don’t have access to its source code and can’t comprehend the engine’s inner workings.

6. Why Unity Is The Most Popular Mobile Game Development Tool

Unity has gained a reputation for being a beginner-friendly engine and attracts many would-be mobile game developers. And with the Unity Asset Store, it’s easy for developers to download free and paid 3D models, game kits, sprites, sound clips, scripts, and various other assets to complete their projects quickly and cost-effectively.

Nowadays, over 50% of mobile games have been made with Unity, solidifying the engine’s dominance in this market segment. Furthermore, Unity makes it easy to integrate a wide variety of ad APIs and monetization components and distribute Android games worldwide via a single hub.

7. How Unreal Engine Can Bring Your AR & VR Ideas To Life

Now, Unity is a powerful engine that should meet the needs of most developers. And the Unity development team has made great strides in improving its performance in recent years. However, it doesn’t quite match the performance and visual fidelity of Unreal Engine, which is used extensively by triple-A game developers.

If you’re planning on developing an AR or VR app that requires photorealistic 3D visuals, then Unreal Engine is your best bet! And since Unreal Engine users have access to the Quixel Megascans library, it’s a relatively quick and painless process to get hold of various high-quality 3D assets. Moreover, the engine’s versatility makes it a great choice for developers working on architectural, automotive, broadcast, film, and simulation projects.

8. What Development Environments Are Available For ARCore?

With the growing popularity of AR, both Apple and Google have released powerful technologies to help developers. In Google’s case, they’ve released ARCore, which facilitates the creation of compelling AR applications. It’s designed so that developers don’t need extensive knowledge of OpenGL or rendering to bring their applications to life. Furthermore, ARCore seamlessly integrates environmental understanding, light estimation, and motion tracking components.

But what’s of great interest to developers is how ARCore works with their favorite development environments. It fully supports Android Studio and Android NDK and interfaces with Apple’s ARKit to provide iOS support via Cloud Anchors and Augmented Faces. Also, Google provides an ARCore plugin and SDK for Unity and an ARCore plugin for Unreal Engine.
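For teams going the native Android route, here is a minimal sketch of what creating and configuring an ARCore session might look like in Kotlin. It assumes the com.google.ar.core Gradle dependency and the camera permission are already in place, and it glosses over the install-and-retry flow a production app would need.

import android.app.Activity
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: create and configure an ARCore session in a native Android app.
// A production app would also handle ArCoreApk.requestInstall() and the
// availability check's transient "checking" states.
fun createArSession(activity: Activity): Session? {
    val availability = ArCoreApk.getInstance().checkAvailability(activity)
    if (!availability.isSupported) return null

    val session = Session(activity)
    val config = Config(session).apply {
        // Light estimation and plane detection back the environmental
        // understanding features mentioned above.
        lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
        planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
    }
    session.configure(config)
    return session
}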

9. Why You Should Work With A Development Partner

It’s no easy task creating an engaging mobile game or a trailblazing AR application. Thus, you’ll need the expertise of a development partner that understands the intricacies of custom development. The right partner will choose the right engine and tools to complete your project as efficiently as possible and advise you throughout the planning, development, and deployment phases of your app to ensure its success. Contact us today to learn how NS804 can help you create exciting interactive experiences using the latest technologies.

An AR Case Study – NS804’s First AR Venture

Takeaways From NS804’s First AR Venture – An AR Case Study

Augmented reality (AR) is still being mastered by tech and software professionals alike. Leaps and bounds have been made in the pursuit of developing more complex and more robust AR technology, and these advances support the delivery of more immersive, realistic, and functional AR applications. NS804’s introduction to building an AR application came a few years ago and was mostly accomplished using the now-antiquated ARKit 2. While there have been major advances in the software available for building AR, a lot of the core concepts NS804 learned through this build still apply, and will continue to apply, to all AR applications now and in the future. This AR case study will evaluate and address some of the issues and complications that arose over the course of this AR project, and then detail some of the main lessons and takeaways from it.

The Ask

The client in this instance was asking for an AR application that would relay data from machinery and equipment to a collector, without the need for the collector to ever come within eyesight or physical contact of the machine from which they’re pulling data. This data refers to the ‘vitals’ of the machinery and included aspects like temperature, pressure, load capacity, and other integral information regarding the machine’s maintenance and operational efficiency. This was a complicated project because it involved using coordinates and GPS navigation to pinpoint the equipment’s location and feed that information into the AR app’s map.

The Obstacles

In approaching this challenge, there were multiple hurdles that needed to be addressed. Firstly, there was an issue with accuracy. Since even the most powerful satellite mapping can only provide an accuracy of about five meters, pinpointing the exact location of the machinery became difficult. What added to this difficulty was the equipment’s proximity to other equipment. Oftentimes, different machines would be less than 100 yards away from one another. This accentuated the accuracy issue, providing another obstacle NS804 needed to work around.
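To put those numbers in perspective, here is an illustrative sketch (not taken from the original project) of the great-circle distance between two GPS fixes. With roughly five meters of error on each fix, two machines reported a few dozen meters apart can appear noticeably closer or farther than they really are, which is why coordinates alone could not reliably separate closely spaced equipment.

import kotlin.math.asin
import kotlin.math.cos
import kotlin.math.pow
import kotlin.math.sin
import kotlin.math.sqrt

// Illustrative haversine distance between two GPS fixes, in meters.
// With ~5 m of error on each fix, the computed separation between two
// nearby machines can be off by up to ~10 m.
fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val earthRadius = 6_371_000.0 // mean Earth radius in meters
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * earthRadius * asin(sqrt(a))
}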

In addition to accuracy being an issue for the machines themselves and their placement on the AR interface, the tags that were required to populate also ran into a proximity issue. The tags would populate overlaying one another, or start to flicker instead of holding solid when appearing in AR.

Another complication occurred in building the perspective of the AR interface. Since positioning in AR has its own specific quirks, and since AR was still brand new at the time, there was a lot of learning and trial and error involved. Luckily for you, NS804 has done the work of pioneering, so they’re in a position to help design, consult, and guide on the most sophisticated and robust AR apps on the market.

Solutions

The solutions for the main obstacles above were all integrated and related. The first aspect of the comprehensive solution to these positioning and placement problems had to do with orientation. In AR, everything has to be oriented to true north, as that’s how the position of virtually everything is calculated. Anchoring the AR interface this way was the first step to solving these issues.
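As a hypothetical sketch of what that anchoring enables (the names and axis conventions below are illustrative, not the project’s actual code): once the AR world is aligned to true north, a machine’s bearing and distance from the user can be converted directly into a position in the scene.

import kotlin.math.cos
import kotlin.math.sin

// Hypothetical sketch: with the AR world anchored to true north (+X = east,
// -Z = north, Y = up), a machine's bearing and distance from the user map
// directly to a position in the scene. Axis conventions are illustrative.
data class WorldPosition(val x: Float, val y: Float, val z: Float)

fun positionFromBearing(bearingDegrees: Double, distanceMeters: Double): WorldPosition {
    val bearing = Math.toRadians(bearingDegrees) // 0 degrees = true north, clockwise
    val east = distanceMeters * sin(bearing)
    val north = distanceMeters * cos(bearing)
    return WorldPosition(east.toFloat(), 0f, (-north).toFloat())
}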

Once the app was oriented toward true north, the next step was creating a more user-friendly perspective. In the first iterations, the perspective was delivered through a sort of cone-view, which was disorienting and hard to use. Instead, the app was designed to place the user in the middle of the area they were surveying. This allowed the user to use the AR app as a sort of lens: as they moved it across their survey area, different tags would populate.

This leads to the next issue that NS804 needed to address in order to deliver a functional, user-friendly experience to the client. As the user viewed their survey area and information tags began to populate, the info-tags would begin to flicker in and out. It was soon discovered that these info-tags would flicker if they were set to the same depth. Setting the tags to varying depths solved this issue, making it possible to keep the tags from overlapping and flickering in and out.
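A hypothetical sketch of that fix (the tag type and depth values are illustrative, not the project’s actual code): spreading the tags across slightly different depths ensures that no two overlapping tags ever sit on the exact same plane, which is what produced the flickering.

// Hypothetical sketch of the layering fix: nudge each info tag to a slightly
// different depth so overlapping tags never share the exact same plane.
// Identical depths caused the tags to fight for visibility and flicker.
data class InfoTag(val label: String, var depthMeters: Float)

fun spreadTagDepths(tags: List<InfoTag>, baseDepth: Float = 2.0f, step: Float = 0.05f) {
    tags.forEachIndexed { index, tag ->
        tag.depthMeters = baseDepth + index * step // e.g. 2.00, 2.05, 2.10, ...
    }
}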

All in all, the bulk of the complications that arose during this build involved the visual aspect and perspective in one way or another. NS804 was able to successfully deliver the client-ready version in about 60 days and after around 25 iterations. That project came at the onset of the AR era; today, NS804 could build an app of similar capability in two to three weeks and with fewer than half the iterations. This massive jump in efficiency stems from the more advanced and more robust AR design software now available, as well as the experience gained from building this AR app and learning a lot of the core theories of AR design.

Key Takeaways From the AR Case Study

This AR case study should highlight a variety of important lessons regarding the development of AR applications, the trial-and-error process, and knowledge that comes with experiencing an AR design and build firsthand.

True North: In designing an AR application, NS804’s first big lesson was in orientation. Orienting everything that renders within the AR universe to true north is how an AR application locates everything. This enables other functionality, like mapping and positional population, as was necessary in this design.

Centered Perspective: This was another positional setting NS804 learned about while creating AR applications. Centering the user’s perspective within the AR universe allows for an expanded, more user-friendly visual field. Rather than viewing the world through a distorted “cone-view”, centering the user allows for a more comfortable user experience that’s also less straining on the eyes.

Layering: The third lesson, related to positioning and orientation as well as the visual experience, had to do with layering the environment successfully. This involved placing the information tags that carried the sought-after data at different depths within the AR environment. This eliminated the issues posed by overlapping tags, which had a bad habit of flickering.

Prepared to Pivot: The final takeaway from this client project was to be flexible and capable of pivoting. After the successful delivery of this build, the client immediately began discovering additional uses for the software that would require additional builds and versions. These future visions revolved around converting the AR application into a marketing and sales tool. While these iterations have yet to be realized, the ability to adapt, evolve, and improve should always be a foremost priority for software developers and mobile app designers.

This AR case study generated a lot of knowledge about AR design for NS804, and it’s our hope that it acts as a good guide for other industry professionals looking for documentation on developing an AR application.

A similar project undertaken by NS804 today could be accomplished in a fraction of the time. Rather than a two-month turnaround with over 25 iterations and a prolonged testing period, NS804 could deliver an AR application of the same caliber within a two-week timeframe. This is possible today because of the knowledge learned from the original project, an upgrade in the AR-building software available, and the more accessible and comprehensive information regarding AR that is now available.

NS804 is dedicated to making mobile app development services of any scale available to anybody. Armed with years of experience across industries, software, and platforms, NS804 is an excellent choice for appreneurs looking for assistance designing their next mobile-app venture. Get in touch with NS804 today to start the design discussion and receive expert-level guidance on your mobile-app venture.