Building Augmented Reality Features in XML Apps

Augmented Reality (AR) is rapidly transforming how users interact with mobile applications, blending the digital world with the real one. While modern frameworks like ARCore and Sceneform offer powerful tools for building AR experiences, it’s also possible to integrate AR features into traditional Android applications built with XML layouts and the View system. This blog post explores several ways to add AR functionality to XML-based Android apps.

What is Augmented Reality?

Augmented Reality is an interactive experience that combines a real-world environment and computer-generated content. It enhances the user’s current perception of reality through the use of digital visual elements, sound, or other sensory stimuli delivered via technology.

Why Add AR Features to XML Apps?

  • Enhanced User Experience: Adds immersive elements that can captivate users.
  • Innovative Features: Opens new possibilities for app functionality, like virtual try-ons or interactive product displays.
  • Compatibility: Provides a path to integrating AR in older or simpler applications that might not be ready for a complete rewrite with modern frameworks.

Methods for Integrating AR Features in XML Apps

Several methods exist to add AR functionalities to XML-based Android apps. Let’s explore some popular options:

1. Using Web Views and Web-Based AR Libraries

One approach is to use a WebView to incorporate web-based AR libraries. These libraries abstract the complexity of AR, making it easier to integrate into your app. Popular libraries include AR.js and A-Frame.

Step 1: Add a WebView to Your XML Layout

First, include a WebView in your activity’s XML layout file:


<WebView
    android:id="@+id/arWebView"
    android:layout_width="match_parent"
    android:layout_height="match_parent"/>

Step 2: Enable JavaScript in the WebView

In your activity’s onCreate method, enable JavaScript for the WebView:


import androidx.appcompat.app.AppCompatActivity
import android.os.Bundle
import android.webkit.WebView
import android.webkit.WebSettings

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        val arWebView: WebView = findViewById(R.id.arWebView)
        val webSettings: WebSettings = arWebView.settings
        webSettings.javaScriptEnabled = true

        // Load your AR content
        arWebView.loadUrl("file:///android_asset/ar/index.html") // Ensure your HTML file is in the assets folder
    }
}
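
AR.js accesses the camera through the browser’s getUserMedia API, so the WebView must be allowed to use the camera. That means two things: the app itself needs the CAMERA runtime permission, and the WebView needs a WebChromeClient that grants the page’s permission request. Below is a minimal sketch of the additions to the MainActivity above (the request code constant is arbitrary, and depending on the WebView version you may also need to serve the page over https, for example via androidx.webkit’s WebViewAssetLoader, for getUserMedia to work):


// Additional imports: android.Manifest, android.content.pm.PackageManager,
// android.webkit.PermissionRequest, android.webkit.WebChromeClient,
// androidx.core.app.ActivityCompat, androidx.core.content.ContextCompat

private val cameraPermissionRequestCode = 100 // arbitrary request code

// Call this from onCreate() before arWebView.loadUrl(...)
private fun setUpCameraAccess(arWebView: WebView) {
    // The app must hold the CAMERA permission; also declare
    // <uses-permission android:name="android.permission.CAMERA"/> in the manifest.
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED
    ) {
        ActivityCompat.requestPermissions(
            this, arrayOf(Manifest.permission.CAMERA), cameraPermissionRequestCode
        )
    }

    // Forward the web page's getUserMedia request to the permission the app already holds.
    arWebView.webChromeClient = object : WebChromeClient() {
        override fun onPermissionRequest(request: PermissionRequest) {
            request.grant(request.resources)
        }
    }
}
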
Step 3: Create AR Content using AR.js and A-Frame

AR.js and A-Frame let you create AR experiences with plain HTML and a small amount of JavaScript. Create an index.html file with your AR content:


<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8">
    <title>AR.js A-Frame Marker-based</title>
    <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
</head>
<body style='margin: 0px; overflow: hidden;'>
    <a-scene embedded arjs='sourceType: webcam; debugUIEnabled: false; detectionMode: mono_and_matrix; matrixCodeType: 3x3;'>
        <a-marker type='barcode' value='7'>
            <a-entity
                geometry='primitive: box; depth: 0.2; height: 0.2; width: 0.2'
                material='color: red; opacity: 0.6'
                scale="2 2 2"
                animation="property: rotation; to: 0 360 0; dur: 5000; easing: linear; loop: true"
            ></a-entity>
        </a-marker>
        <a-entity camera></a-entity>
    </a-scene>
</body>
</html>

Place this index.html file in the assets/ar/ folder of your Android project.

Key Points:

  • The WebView loads local HTML content containing AR functionalities.
  • AR.js and A-Frame handle the complexities of AR rendering and tracking.
  • The HTML file uses JavaScript libraries to display AR elements over the camera view.

2. Using ARCore with Native Integration

For more sophisticated AR features, you can integrate ARCore (Google’s AR SDK) directly into your XML app, but this is a bit more complex.

Step 1: Add ARCore Dependency

Add the ARCore dependency to your app’s build.gradle file:


dependencies {
    implementation("com.google.ar:core:1.40.0") // Use the latest version
    implementation("androidx.appcompat:appcompat:1.6.1") // Optional, AppCompat Library for Material design support
}
Step 2: Verify AR Support

Check whether the device supports ARCore and prompt the user to install it (Google Play Services for AR) if needed. Note that an ARCore app’s manifest also needs the CAMERA permission and a com.google.ar.core meta-data entry declaring AR as "required" or "optional":


import androidx.appcompat.app.AppCompatActivity
import android.os.Bundle
import android.content.Intent
import android.net.Uri
import android.util.Log
import android.widget.Toast
import com.google.ar.core.ArCoreApk

class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        checkARSupport()
    }

    private fun checkARSupport() {
        when (ArCoreApk.getInstance().checkAvailability(this)) {
            ArCoreApk.Availability.SUPPORTED_INSTALLED -> {
                Log.d("AR", "ARCore is installed and supported on this device.")
                // Proceed to use AR features
            }
            ArCoreApk.Availability.SUPPORTED_NOT_INSTALLED,
            ArCoreApk.Availability.SUPPORTED_APK_TOO_OLD -> {
                try {
                    // Prompt the user to install or update Google Play Services for AR
                    ArCoreApk.getInstance().requestInstall(this, true)
                } catch (e: Exception) {
                    Toast.makeText(this, "ARCore installation failed.", Toast.LENGTH_LONG).show()
                }
            }
            ArCoreApk.Availability.UNSUPPORTED_DEVICE_NOT_CAPABLE -> {
                Toast.makeText(this, "ARCore is not supported on this device.", Toast.LENGTH_LONG).show()
                finish()
            }
            ArCoreApk.Availability.UNKNOWN_CHECKING -> {
                // Availability is still being determined; check again shortly.
            }
            else -> {
                // UNKNOWN_ERROR or UNKNOWN_TIMED_OUT: handle the error condition.
            }
        }
    }
}
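
In the snippet above, requestInstall() is called fire-and-forget, but it actually returns an ArCoreApk.InstallStatus, and because the user leaves the app to install Google Play Services for AR, the check is usually repeated from onResume(). A minimal sketch of that flow, assuming a userRequestedInstall flag added to the Activity above:


// Additional imports: com.google.ar.core.ArCoreApk.InstallStatus is referenced via ArCoreApk,
// plus com.google.ar.core.exceptions.UnavailableUserDeclinedInstallationException

private var userRequestedInstall = true // only ask the user once

override fun onResume() {
    super.onResume()
    try {
        when (ArCoreApk.getInstance().requestInstall(this, userRequestedInstall)) {
            ArCoreApk.InstallStatus.INSTALLED -> {
                // ARCore is ready; it is now safe to create or resume the AR session.
            }
            ArCoreApk.InstallStatus.INSTALL_REQUESTED -> {
                // The install flow was launched; onResume() runs again when the user returns.
                userRequestedInstall = false
            }
        }
    } catch (e: UnavailableUserDeclinedInstallationException) {
        Toast.makeText(this, "ARCore installation was declined.", Toast.LENGTH_LONG).show()
    }
}
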
Step 3: Create a GLSurfaceView for AR Rendering

Add a GLSurfaceView to your XML layout for rendering the AR scene:


<android.opengl.GLSurfaceView
    android:id="@+id/surfaceview"
    android:layout_width="match_parent"
    android:layout_height="match_parent"/>

In your activity, set up the GLSurfaceView with an ARCore session:


import androidx.appcompat.app.AppCompatActivity
import android.opengl.GLES20
import android.opengl.GLSurfaceView
import android.os.Bundle
import android.util.Log
import android.widget.Toast
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Session
import javax.microedition.khronos.egl.EGLConfig
import javax.microedition.khronos.opengles.GL10

class MainActivity : AppCompatActivity() {

    private lateinit var surfaceView: GLSurfaceView
    private var arSession: Session? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        surfaceView = findViewById(R.id.surfaceview)
        surfaceView.preserveEGLContextOnPause = true
        surfaceView.setEGLContextClientVersion(2)
        surfaceView.setRenderer(object : GLSurfaceView.Renderer {
            override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
                GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f)
            }

            override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
                GLES20.glViewport(0, 0, width, height)
            }

            override fun onDrawFrame(gl: GL10?) {
                GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT or GLES20.GL_DEPTH_BUFFER_BIT)
            }
        })

        // Ensure ARCore is installed and supported
        checkARSupport()

        try {
            // Note: the CAMERA permission must be granted before the session is created and resumed.
            arSession = Session(this)
            val config = arSession!!.config
            arSession!!.configure(config)
        } catch (e: Exception) {
            // Handle ARCore session creation errors
            Log.e("AR", "Failed to create AR session", e)
        }
    }

    override fun onResume() {
        super.onResume()
        arSession?.resume()
        surfaceView.onResume()
    }

    override fun onPause() {
        super.onPause()
        // Pause the GLSurfaceView first so it stops querying the session, then pause the session.
        surfaceView.onPause()
        arSession?.pause()
    }

    // Implement checkARSupport() exactly as shown in Step 2 above.

}

Key Points:

  • Requires manual management of OpenGL ES for rendering the AR scene (see the render-loop sketch below).
  • Involves handling ARCore lifecycle events and session configurations.
  • Offers deeper integration and access to ARCore’s advanced features.
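
To give a sense of what that manual rendering involves, here is a heavily simplified sketch of how the renderer above would hook into ARCore on each frame. It omits the actual camera-background and object drawing (which require OpenGL shaders for an external OES texture); cameraTextureId stands in for a texture you would have created in onSurfaceCreated:


override fun onDrawFrame(gl: GL10?) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT or GLES20.GL_DEPTH_BUFFER_BIT)

    val session = arSession ?: return
    // Tell ARCore which OpenGL texture to write the camera image into
    // (cameraTextureId is assumed to be a GL_TEXTURE_EXTERNAL_OES texture created earlier).
    session.setCameraTextureName(cameraTextureId)

    try {
        // Advance ARCore: updates the camera image, device pose, and any trackables.
        val frame = session.update()
        val camera = frame.camera

        // View and projection matrices for drawing virtual content from the device's viewpoint.
        val viewMatrix = FloatArray(16)
        val projectionMatrix = FloatArray(16)
        camera.getViewMatrix(viewMatrix, 0)
        camera.getProjectionMatrix(projectionMatrix, 0, 0.1f, 100f)

        // ...draw the camera background, then your virtual objects, using these matrices...
    } catch (e: Exception) {
        Log.e("AR", "Exception on the GL thread", e)
    }
}
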

3. Utilizing External AR SDKs and Libraries

Various third-party AR SDKs, like Vuforia, Wikitude, and Kudan, can also be integrated into your XML-based Android applications.

Implementation Steps:
  1. Select and Integrate the SDK: Choose an AR SDK based on your project requirements and integrate its libraries into your Android project.
  2. License Setup: Obtain and configure any necessary licenses or API keys for the chosen SDK.
  3. AR Scene Configuration: Use the SDK’s APIs to set up the AR scene, including marker detection, object rendering, and interaction handling.

Using External SDKs Provides:

  • Extended Feature Sets: Third-party SDKs often offer specialized functionalities like advanced tracking or cloud recognition.
  • Simplified Implementation: Some SDKs abstract away much of the low-level AR complexities.
  • Cross-Platform Support: Many external SDKs support multiple platforms, allowing you to reuse AR functionalities across projects.

4. Using Fragments and Modern AR Frameworks (Bridge Approach)

A more advanced option is to host modern AR components inside a Fragment: the surrounding screen stays in XML, while the Fragment wraps ARCore-based UI, for example Sceneform’s ArFragment or, via interop, Jetpack Compose content.

Implementation Overview:
  1. XML Layout with Fragment: Define an XML layout that contains a FragmentContainerView (or the legacy <fragment> tag).
  2. Fragment Implementation: Create a Fragment that hosts the AR content, for example ARCore rendered through Compose interop (see the sketch after this list).
  3. Communication: Implement communication mechanisms between the Fragment and the Activity.
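
As a rough illustration of step 2, the Fragment below (ArContainerFragment is a hypothetical name) returns a ComposeView, so Compose-based AR UI can live inside an otherwise XML-driven screen. The XML layout would reference it from an androidx.fragment.app.FragmentContainerView whose android:name points at this class:


import android.os.Bundle
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import androidx.compose.ui.platform.ComposeView
import androidx.fragment.app.Fragment

// Hypothetical Fragment that bridges an XML-hosted screen and Compose-based AR content.
class ArContainerFragment : Fragment() {

    override fun onCreateView(
        inflater: LayoutInflater,
        container: ViewGroup?,
        savedInstanceState: Bundle?
    ): View {
        // Compose interop: the Fragment's view is a ComposeView, so Compose-based
        // AR components (or plain Compose UI wrapping an AR view) can be placed here.
        return ComposeView(requireContext()).apply {
            setContent {
                // AR composables / AR view wrappers would go here.
            }
        }
    }
}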

This hybrid approach benefits from:

  • Leveraging modern AR capabilities within existing XML layouts.
  • Flexibility in incrementally adopting modern frameworks like Jetpack Compose.
  • Better separation of concerns between UI structure and AR components.

Conclusion

Integrating AR features into XML-based Android applications offers compelling ways to enhance user experiences. Whether through WebViews, native ARCore integration, or external AR SDKs, there are several pathways to incorporating augmented reality into your apps. Choosing the right method depends on your project’s complexity, performance requirements, and development resources. With the right approach, you can seamlessly blend the digital and physical worlds, providing innovative and engaging experiences for your users.