14 Dec 2024
The world of extended reality (XR) is expanding rapidly, merging physical and digital realms to create immersive experiences. Android XR offers a versatile platform for developers to build applications that blend augmented reality (AR) and virtual reality (VR) into everyday life. In this post, we’ll explore the essentials of Android XR and provide you with a starting point to dive into this exciting technology.
What is Android XR?
XR (Extended Reality) encompasses all immersive technologies: AR, VR, and mixed reality (MR). Android XR integrates these experiences seamlessly into Android devices, allowing developers to create cutting-edge applications that:
- Overlay digital objects on the real world (AR).
- Fully immerse users in virtual environments (VR).
- Combine real and virtual objects that interact in real time (MR).
Android's XR ecosystem is built on frameworks like ARCore and leverages powerful hardware capabilities available in modern devices.
Key Components of Android XR
1. ARCore
ARCore is Android's primary SDK for building AR applications. It provides tools to:
- Track motion in 3D space.
- Understand environmental features like flat surfaces.
- Estimate lighting conditions for realistic AR rendering.
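As a concrete illustration, here is a minimal sketch of how these capabilities are typically enabled when configuring an ARCore Session in Kotlin. The function name is mine, and availability checks and error handling are omitted:

```kotlin
import android.content.Context
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: create and configure an ARCore session with plane
// detection and HDR light estimation enabled. ARCore availability and
// install checks are omitted for brevity.
fun createArSession(context: Context): Session {
    val session = Session(context)
    val config = Config(session).apply {
        planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
        lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    }
    session.configure(config)
    return session
}
```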
2. XR Interaction Tools
Android XR provides APIs and libraries to simplify interactions, such as detecting gestures or recognizing physical objects. Developers can use Unity or Unreal Engine to create rich 3D experiences or integrate ARCore directly into Android apps for custom solutions.
3. Cross-Platform Development
Android XR supports frameworks like OpenXR, making it easier to build applications that work across multiple devices, from smartphones to head-mounted displays (HMDs).
Getting Started with Android XR Development
1. Set Up Your Development Environment
Start by installing Android Studio and configuring it for XR development:
- Install the latest version of Android Studio.
- Add the ARCore dependency to your project (see the snippet after this list).
- Use a physical device with ARCore support for testing.
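For reference, adding the dependency in a Gradle Kotlin DSL build script looks roughly like this; the version number is only an example, so check Maven Central for the latest ARCore release:

```kotlin
// app/build.gradle.kts
dependencies {
    // ARCore (Google Play Services for AR); use the latest available version.
    implementation("com.google.ar:core:1.44.0")
}
```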
2. Learn the Basics
Explore Android XR's official documentation and the ARCore developer guides.
3. Build Your First App
Try creating a simple AR app that displays a 3D object on a flat surface. ARCore’s Plane Detection API can help you get started quickly.
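As a rough sketch of that idea (the function name and parameters are illustrative, and rendering the 3D object is out of scope), a screen tap can be hit-tested against detected planes to create an anchor where the object will be placed:

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Minimal sketch: on a tap at screen coordinates (x, y), hit-test against
// detected planes and create an Anchor where the tap intersects a plane.
// The anchor can then be used to position a 3D model in the scene.
fun placeObjectOnPlane(frame: Frame, x: Float, y: Float): Anchor? {
    for (hit in frame.hitTest(x, y)) {
        val trackable = hit.trackable
        if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
            return hit.createAnchor()
        }
    }
    return null
}
```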
My Android XR Demo Project
To help you jumpstart your journey, I’ve created a simple demo app showcasing basic XR features using ARCore and Jetpack Compose. This project serves as a practical example to learn XR development fundamentals.
Check it out on GitHub: Android XR Demo
Note that you can also check out some samples from the Google team.
03 Dec 2024
Last week, I had the privilege of presenting at The Test Tribe 12th Calgary Meetup, hosted at the Neo Financial office in Calgary. The event, held on November 28, 2024, brought together an amazing community of testers and developers passionate about creating better user experiences.
During the session, titled “Creating Inclusive Experiences: A Guide to Accessibility in Android Apps with Jetpack Compose,” I delved into the vital role accessibility plays in shaping a truly universal user experience. We explored topics such as:
The challenges faced by users with disabilities, including visual, mobility, cognitive, and hearing impairments.
The accessibility services provided by Android, like TalkBack, Switch Access, and Voice Access.
Practical guidelines for creating inclusive designs, such as optimizing touch targets, simplifying gestures, and ensuring sufficient color contrast.
Tools and methods for testing accessibility, including manual testing with TalkBack and automated testing using semantics in Jetpack Compose.
The presentation also included practical demonstrations of accessibility testing, featuring examples from an app I created specifically to highlight accessibility issues and solutions (InaccessibleApp).
Why Accessibility Matters
As the World Health Organization notes, over 1.3 billion people live with some form of disability. Accessibility isn’t just about compliance; it’s about empathy and inclusion. By leveraging Jetpack Compose’s accessibility tools, we can build Android apps that make everyone feel welcome and empowered.
I want to extend a heartfelt thank you to the organizers, attendees, and everyone who made this event possible. The energy in the room was fantastic, with engaging questions and thoughtful discussions. It was a delightful evening where we all learned so much about building better, more inclusive experiences.
Looking forward to more opportunities to share knowledge and grow together with this vibrant community!
20 Oct 2024
Android development just got smarter with the introduction of Gemini, an AI-powered assistant integrated into Android Studio. Gemini is designed to enhance productivity, reduce repetitive tasks, and support developers throughout the app lifecycle. Here’s a closer look at how Gemini transforms your coding experience.
1. Code Writing and Refactoring Made Easy
Gemini doesn’t just suggest code snippets—it can write, refactor, and document code. With Gemini, you can:
Generate commit messages: Analyze your code changes and get suggested descriptions for version control.
Refactor code with ease: Rename variables, classes, and methods using intuitive AI-driven suggestions.
Streamline prototypes to production: Implement common design patterns and iterate faster than ever.
2. UI Automation for Jetpack Compose
Visualizing and fine-tuning UI designs can be tedious. Gemini enhances Compose workflows by:
Auto-generating UI previews: Use AI to create relevant mock data and preview your Composables without manual setup.
Simplifying multimodal design: Gemini can process contextual image attachments to assist in crafting visually engaging user interfaces.
3. Improving App Quality with AI Insights
Gemini integrates directly with the App Quality Insights tool, leveraging local code context to:
Suggest fixes for crashes reported via Firebase Crashlytics or Google Play Console.
Generate unit test scenarios based on your codebase, helping ensure robustness.
Provide insights into build and sync errors for faster troubleshooting.
4. Streamlined Documentation
With Gemini, generating documentation is no longer a chore. Simply highlight your code, and Gemini will produce clear, comprehensive comments, making it easier to onboard new team members and maintain codebases.
5. Why Gemini Matters
AI tools like Gemini represent the future of app development. By automating routine tasks, developers can focus on solving complex problems and innovating in their projects. With features like commit message generation and error analysis, Gemini ensures your codebase is not only efficient but also of high quality.
11 Sep 2024
Android has a collection of tools that can help people with disabilities, and the following list covers the most important features, in my opinion:
✨ TalkBack screen reader and Select to Speak
This feature helps you as you navigate your smartphone. On a given screen, it tells you what kind of screen it is and what's on it. For example, if you're on a settings page, TalkBack will read out the section name (such as Notifications). When you tap an icon or item, your selection gets a green outline and the assistant identifies it. Double-tapping the same icon opens it, and TalkBack reminds you to double-tap whenever you tap on an item.
✨ Font Size and High Contrast Text
This setting lets you change the font size on your device from the default. You can make the text smaller than the default, or larger at several levels. As you make adjustments, you can preview how the text will look.
✨ Magnification 🔍
You can use a gesture to zoom in on certain parts of the screen. Once you enable the feature in Settings, you can zoom in by tapping the screen three times.
✨ Switch Access
Users can interact with an Android app using one or more switches, or a keyboard, instead of the touchscreen. It is also possible to use Camera Switches to navigate with facial gestures.
Jetpack Compose & Accessibility
Jetpack Compose is great when it comes to accessibility: it uses semantics properties to pass information to accessibility services, which transform what's shown on screen into a format better suited to users with specific needs.
Most built-in composables like Text and Button fill these semantics properties with information inferred from the composable and its children. Some modifiers like toggleable and clickable will also set certain semantics properties.
However, sometimes the framework needs more information to understand how to describe a UI element to the user.
Improving accessibility using Jetpack Compose
This list is mainly focused on some approaches you should think about while developing the app.
There are a bunch of other items related to UI/UX that weren't included in this list (e.g. color contrast ratios, minimum touch target sizes, a high-contrast theme, etc.).
The following items are meant to help people with accessibility needs use the app successfully:
Describe visual elements
Pass a textual description of a visual element whenever it is relevant.
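A minimal sketch, assuming a hypothetical share button built with Material icons; the contentDescription is what TalkBack announces for the otherwise purely visual icon:

```kotlin
import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.Share
import androidx.compose.material3.Icon
import androidx.compose.material3.IconButton
import androidx.compose.runtime.Composable

@Composable
fun ShareAction(onShare: () -> Unit) {
    IconButton(onClick = onShare) {
        Icon(
            imageVector = Icons.Filled.Share,
            // Read aloud by screen readers instead of an unlabelled button.
            contentDescription = "Share article"
        )
    }
}
```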
Describe click labels
You can use a click label to add semantic meaning to the click behavior of a composable. This way, accessibility services can explain to the user what will happen when the user interacts with the component.
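A small sketch of this idea (the composable and label text are hypothetical), using the onClickLabel parameter of Modifier.clickable:

```kotlin
import androidx.compose.foundation.clickable
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier

@Composable
fun ArticleRow(title: String, onOpen: () -> Unit) {
    Text(
        text = title,
        modifier = Modifier.clickable(
            // TalkBack announces "Double tap to open article" instead of
            // the generic "Double tap to activate".
            onClickLabel = "open article",
            onClick = onOpen
        )
    )
}
```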
Describe an element’s state
You can describe the state of a component so that Android reads out the state the component is in. For example, a toggleable checkbox can be in either a "Checked" or an "Unchecked" state.
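A minimal sketch, assuming a hypothetical subscription checkbox; stateDescription overrides the default "Checked"/"Unchecked" announcement:

```kotlin
import androidx.compose.material3.Checkbox
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.semantics
import androidx.compose.ui.semantics.stateDescription

@Composable
fun SubscribeOption(checked: Boolean, onCheckedChange: (Boolean) -> Unit) {
    Checkbox(
        checked = checked,
        onCheckedChange = onCheckedChange,
        modifier = Modifier.semantics {
            // Announced instead of the default "Checked"/"Unchecked".
            stateDescription = if (checked) "Subscribed" else "Not subscribed"
        }
    )
}
```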
Merge visual elements
Sometimes you need to merge elements to make Talkback and Switch Access more efficient.
For example, if every single low-level visual element on your screen is focused independently, a user will have to interact many times just to move across the screen. Another issue you may face is users not being able to make sense of lists in the app, because each element inside a list item is focused independently instead of the item as a whole.
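A small sketch of merging, assuming a hypothetical contact list row; with mergeDescendants = true, TalkBack focuses the whole row and reads the name and message together:

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.Row
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.semantics

@Composable
fun ContactListItem(name: String, lastMessage: String) {
    // Without merging, TalkBack would focus the name and the message
    // separately; merging makes the whole row a single focusable item.
    Row(modifier = Modifier.semantics(mergeDescendants = true) {}) {
        Column {
            Text(text = name)
            Text(text = lastMessage)
        }
    }
}
```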
04 Sep 2024
In the ever-evolving world of mobile app development, creating seamless experiences across Android and iOS is no small feat. Traditionally, developers have had to choose between platform-specific native development, which provides the best performance and user experience but involves duplicated effort, and cross-platform solutions like React Native, Flutter, or Xamarin, which promise "write once, run everywhere" with some compromises.
Kotlin Multiplatform (KMP) takes a unique approach by enabling code sharing while retaining native capabilities, making it a compelling choice for modern developers. Let’s dive into the benefits of KMP for Android and iOS development, compare it with other alternatives, and explore why KMP might be the right fit for your next project.
Kotlin Multiplatform (KMP) is a feature of Kotlin, a programming language developed by JetBrains and widely adopted for Android development. Unlike other cross-platform tools that require you to build the entire application using their framework, KMP focuses on sharing business logic, network code, and data layers while allowing you to write platform-specific UI code. This balance provides a unique blend of code reusability and platform fidelity.
Key Benefits of KMP
- Code Reuse with Platform Flexibility: KMP allows you to write common business logic once and share it across platforms, reducing duplication and development time. At the same time, you can implement platform-specific UI and interactions, ensuring that your app feels native on both Android and iOS (see the expect/actual sketch after this list).
- Native Performance: Unlike JavaScript-based frameworks, KMP compiles shared code with Kotlin/Native to native binaries for iOS and to JVM bytecode for Android. This results in high-performance apps that leverage each platform's full capabilities.
- Leverage Existing Ecosystems: KMP integrates seamlessly with native development tools. For Android, it works directly with Android Studio, Gradle, and Kotlin extensions; for iOS, you can use Xcode and Swift alongside KMP-generated binaries.
- Interoperability: KMP provides robust interoperability with Java on Android and Swift/Objective-C on iOS, enabling you to integrate shared and native code effortlessly.
- Future-Proof Solution: As Kotlin is officially supported by Google and widely adopted for Android, KMP aligns with the long-term direction of modern Android development. Its active community and JetBrains' backing ensure continued innovation.
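To make the code-sharing model concrete, here is a minimal sketch of Kotlin's expect/actual mechanism (names are illustrative): the shared declaration lives in commonMain and each platform supplies its own implementation.

```kotlin
// commonMain/kotlin/Greeting.kt: shared code declares what it needs.
expect fun platformName(): String

class GreetingService {
    fun greeting(): String = "Hello from ${platformName()}!"
}

// androidMain/kotlin/Platform.android.kt: Android implementation.
actual fun platformName(): String = "Android ${android.os.Build.VERSION.SDK_INT}"

// iosMain/kotlin/Platform.ios.kt: iOS implementation (Kotlin/Native).
actual fun platformName(): String =
    platform.UIKit.UIDevice.currentDevice.systemName
```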
Why KMP is Better for Modern Teams
- Focus on Native-Like User Experiences: Unlike Flutter's custom rendering engine or React Native's JavaScript-driven UI layer, KMP encourages developers to craft native UIs for each platform. This ensures the app feels truly native and aligns with platform-specific design guidelines.
- Incremental Adoption: You don't need to rewrite your app to use KMP. It's easy to adopt incrementally, sharing only selected modules such as data access or networking while leaving existing code intact (a minimal shared-module build script follows this list).
- Shared Code Without Compromising Control: KMP provides the best of both worlds: code reuse where it matters (business logic) and full control over platform-specific implementations. This is ideal for teams that prioritize user experience.
- Fewer External Dependencies: KMP uses Kotlin, which is already a familiar and powerful language for Android developers. There's no need to learn a new framework or pull in third-party dependencies for basic functionality.
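As a sketch of what incremental adoption can look like, here is a minimal shared-module build script in the Gradle Kotlin DSL; the plugin setup, dependency versions, and namespace are illustrative and depend on your project:

```kotlin
// shared/build.gradle.kts
plugins {
    kotlin("multiplatform")
    id("com.android.library")
}

kotlin {
    // Compile the shared code for Android and both iOS targets.
    androidTarget()
    iosArm64()
    iosSimulatorArm64()

    sourceSets {
        val commonMain by getting {
            dependencies {
                // Shared, platform-agnostic dependencies go here.
                implementation("org.jetbrains.kotlinx:kotlinx-coroutines-core:1.8.1")
            }
        }
    }
}

android {
    namespace = "com.example.shared" // illustrative
    compileSdk = 34
}
```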
When to Choose KMP
KMP is ideal if:
- You already have an Android development team familiar with Kotlin.
- You need a shared codebase for business logic but want platform-specific UIs.
- Performance is critical, and you can't accept the compromises of JavaScript-based solutions.
- You’re migrating an existing codebase and prefer incremental adoption.
Conclusion
Kotlin Multiplatform offers a pragmatic approach to cross-platform development, empowering developers to share code where it matters while crafting tailored, high-performance native experiences. Its seamless integration with native ecosystems, native performance, and incremental adoption make it a future-ready choice for teams looking to optimize their development processes. If you’re building a new app or looking to reduce technical debt in an existing one, Kotlin Multiplatform might just be the game-changer you need.