Exploring Android Studio's Gemini Journeys: AI-Powered Testing Revolution

Android development just got a significant upgrade with the introduction of Gemini Journeys in Android Studio. This innovative AI-powered feature promises to transform how we approach end-to-end testing by leveraging natural language prompts instead of traditional manual test creation.

What is Gemini Journeys?

Gemini Journeys represents a paradigm shift in mobile testing methodology. Instead of writing complex test scripts line by line, developers can now describe their testing intentions in plain English, and Gemini AI translates these prompts into comprehensive end-to-end tests.

The feature integrates seamlessly with Android Studio’s preview environment, offering developers an intuitive way to:

  • Generate automated UI tests through conversational prompts
  • Create comprehensive test scenarios without deep testing framework knowledge
  • Accelerate the testing workflow significantly
  • Reduce the barrier to entry for comprehensive mobile testing

Hands-On Experience: Building with KoinBase

To explore Gemini Journeys’ capabilities, I created a demo project called KoinBase, a simple cryptocurrency tracking application built with Jetpack Compose. The app showcases modern Android development practices while serving as a perfect testing ground for AI-assisted test generation.

Key Features of the Demo:

  • Clean Architecture: Implementing MVVM pattern with proper separation of concerns
  • Jetpack Compose UI: Modern declarative UI framework
  • Dependency Injection: Using Koin for lightweight DI
  • Network Integration: RESTful API consumption for crypto data
  • Material 3 Design: Following latest design guidelines

First Impressions: A Game Changer

After experimenting with Gemini Journeys on the KoinBase project, here are my initial thoughts:

The Good:

  • Intuitive Workflow: Describing test scenarios in natural language feels remarkably natural
  • Productivity Boost: Test creation time reduced significantly compared to manual approaches
  • Intelligent Context: Gemini understands app structure and suggests relevant test scenarios
  • Quality Output: Generated tests are comprehensive and well-structured

The Promise: This technology represents a fundamental shift toward more accessible and efficient mobile testing. For teams struggling with testing coverage or developers new to automated testing, Gemini Journeys could be transformational.

Looking Forward

Gemini Journeys appears to be more than just another AI tool; it’s positioning itself as a genuine game changer for mobile testing workflows. The ability to generate robust E2E tests through conversational prompts could democratize comprehensive testing practices across development teams of all skill levels.

As AI continues to integrate deeper into development workflows, features like Gemini Journeys demonstrate how machine learning can augment human creativity rather than replace it. The future of Android development looks increasingly collaborative between human insight and artificial intelligence capabilities.

Try It Yourself

Interested in exploring Gemini Journeys? Check out the official documentation and consider experimenting with your own projects. The KoinBase demo is also available as a reference implementation.

The intersection of AI and mobile development continues to evolve rapidly, and Gemini Journeys represents an exciting step toward more intelligent, efficient development practices.


Untangling State - Easier Android App Management with Compose

Building Android apps today is a lot about managing “state.” Think of state as all the information that makes your app tick: the text a user typed, whether a button is enabled, a list of items to display. As your app grows, managing this state can get tricky, making your code messy and hard to maintain.

Thankfully, Jetpack Compose, Android’s modern UI toolkit, offers some elegant patterns to keep your state under control. Let’s break down some of the key ideas, making them easier to understand than a complex technical paper.

The Core Idea: State Hoisting

Imagine you have a Checkbox in your app. It has two states: checked or unchecked. If the Checkbox manages its own state, it’s called “internal state.” But what if another part of your app needs to know if it’s checked?

This is where State Hoisting comes in. Instead of the Checkbox holding its own “checked” status, we “hoist” that status up to a parent component. The Checkbox then becomes a “dumb” component. It just shows what it’s told to show and tells its parent when it’s clicked.

Think of it like a child asking a parent for permission. The child (our Checkbox) doesn’t decide if it can have a cookie (change its state). It asks the parent (the higher-level component), and the parent makes the decision and tells the child what to do.

In Compose, this often looks like:

@Composable
fun MyFancyCheckbox(
    isChecked: Boolean, // The state is passed in
    onCheckedChange: (Boolean) -> Unit // An event is passed out
) {
    Checkbox(
        checked = isChecked,
        onCheckedChange = onCheckedChange // The parent handles the actual state update
    )
}

@Composable
fun ParentScreen() {
    var checkedState by rememberSaveable { mutableStateOf(false) } // Parent manages the state
    MyFancyCheckbox(
        isChecked = checkedState,
        onCheckedChange = { newCheckedState -> checkedState = newCheckedState }
    )
}

This makes MyFancyCheckbox reusable and testable because it doesn’t care how its state is managed, only what its state is and when it’s interacted with.

State Holders: Your State Organizers

As your app gets more complex, you’ll have more and more state. Just having a bunch of vars in your @Composable function can get unwieldy. This is where State Holders come in handy.

A State Holder is essentially a plain old Kotlin class that holds and manages a piece of your UI’s state. It centralizes all the logic related to that state.

Imagine a user profile screen. It might have the user’s name, email, and a “save” button. Instead of managing all these bits of information directly in your ProfileScreen Composable, you could have a ProfileScreenStateHolder (or ViewModel if it’s lifecycle-aware).

// A simple example of a State Holder
class MyLoginScreenStateHolder {
    var username by mutableStateOf("")
    var password by mutableStateOf("")

    fun onUsernameChanged(newUsername: String) {
        username = newUsername
    }

    fun onPasswordChanged(newPassword: String) {
        password = newPassword
    }

    fun login() {
        // Perform login logic using username and password
        println("Attempting to log in with username: $username")
    }
}

@Composable
fun LoginScreen(stateHolder: MyLoginScreenStateHolder = remember { MyLoginScreenStateHolder() }) {
    Column {
        TextField(
            value = stateHolder.username,
            onValueChange = stateHolder::onUsernameChanged,
            label = { Text("Username") }
        )
        TextField(
            value = stateHolder.password,
            onValueChange = stateHolder::onPasswordChanged,
            label = { Text("Password") }
        )
        Button(onClick = stateHolder::login) {
            Text("Login")
        }
    }
}

This separates the UI (LoginScreen) from the logic and state management (MyLoginScreenStateHolder), making your code cleaner and easier to understand.

ViewModels: The Android-Aware State Holders

When your State Holder needs to survive configuration changes (like rotating your phone) or interact with data from your app’s deeper layers (like a database or network), you often use a ViewModel.

A ViewModel is a special kind of State Holder provided by Android Architecture Components. It’s designed to hold UI-related data in a way that survives app lifecycle events. It’s often where you’ll find your network calls, database operations, and other business logic that feeds into your UI.

Think of it as the brain of your screen or feature. It fetches data, processes it, and then exposes that data to your Composables.
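To make that shape concrete, here is a minimal plain-Kotlin sketch. The names ProfileUiState and ProfileViewModelSketch are hypothetical; in a real app the class would extend androidx.lifecycle.ViewModel, expose its state as a StateFlow, and run work in viewModelScope.

```kotlin
// Hypothetical sketch of the ViewModel shape in plain Kotlin.
// A real implementation would extend androidx.lifecycle.ViewModel,
// hold a MutableStateFlow<ProfileUiState>, and launch work in viewModelScope.
data class ProfileUiState(
    val name: String = "",
    val email: String = "",
    val isSaving: Boolean = false
)

class ProfileViewModelSketch {
    var uiState: ProfileUiState = ProfileUiState()
        private set

    fun onNameChanged(newName: String) {
        uiState = uiState.copy(name = newName)
    }

    fun onEmailChanged(newEmail: String) {
        uiState = uiState.copy(email = newEmail)
    }

    // Returns false when validation fails; a real ViewModel
    // would hand the data to a repository here.
    fun save(): Boolean {
        if (uiState.name.isBlank() || uiState.email.isBlank()) return false
        uiState = uiState.copy(isSaving = true)
        return true
    }
}
```

A Composable would then read uiState and call these handlers, exactly like LoginScreen does with its state holder above.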

When to Choose What?

  • State Hoisting: For simple UI elements where the parent needs to control the state. It makes components reusable and less coupled.
  • Simple State Holders (Plain Kotlin classes): When you have a group of related UI state that needs to be managed together within a single Composable, and it doesn’t need to survive lifecycle changes or interact with deeper app layers.
  • ViewModels: For complex screens or features where you need to manage state that survives configuration changes, interacts with data sources (like network or database), or requires more complex business logic. They are typically used for a whole screen or a significant portion of it.

The Benefits of Good State Management

By applying these patterns, you gain:

  • Cleaner Code: Your UI code focuses solely on how things look, not what data they hold or how that data changes.
  • Easier Testing: You can test your State Holders and ViewModels independently of your UI.
  • Better Reusability: Components become generic and can be used in different parts of your app.
  • Improved Maintainability: When something breaks, it’s easier to pinpoint where the issue lies.

Understanding and applying these state management patterns in Jetpack Compose will significantly improve the quality and maintainability of your Android applications. It’s a fundamental concept that will serve you well as you build more complex and robust experiences.


Execution Order in Jetpack Compose Explained with Analogies

Understanding when LaunchedEffect, DisposableEffect, and composables run in Jetpack Compose can be tricky. Let’s simplify with a few real-world analogies.

🎭 Composables = Stage Actors

Composables are like actors:

  • They enter when the screen appears.

  • They update their lines when state changes (recomposition).

  • They exit when removed from the UI.

🕯️ LaunchedEffect = Candle in a Room

  • You light a candle when entering a room → LaunchedEffect runs.

  • If the room changes (key changes), you blow it out and light a new one.

  • If you leave, the candle is blown out.

  • Use it for one-time effects or state collection.

🧹 DisposableEffect = Hotel Housekeeping

  • Housekeeper sets up the room → DisposableEffect runs.

  • When you check out (or key changes), the room is cleaned → onDispose is called.

  • Perfect for listeners or subscriptions that need cleanup.

🔄 Recomposition = Changing Actor’s Lines

If the script (state) changes, actors stay on stage but adjust their lines. No need to re-run effects unless keys change.

Quick Comparison

Concept            Analogy        When It Runs                   When It Cleans Up
Composable         Actor          On screen draw / state change  On removal
LaunchedEffect     Candle         On enter / key change          On key change / removal
DisposableEffect   Housekeeping   On enter / key change          On key change / removal
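The run/cleanup rules in the table can be modeled in a few lines of plain Kotlin. This KeyedEffect class is purely illustrative (it is not the real Compose runtime); it just mimics how a keyed effect starts, restarts when its key changes, and disposes when the composable leaves.

```kotlin
// Illustrative model of keyed-effect semantics; NOT the real Compose API.
// onStart plays the role of the effect body, onDispose the role of onDispose {}.
class KeyedEffect(
    private val onStart: () -> Unit,
    private val onDispose: () -> Unit
) {
    private var currentKey: Any? = null
    private var active = false

    // Simulates a (re)composition that sees the effect with the given key.
    fun compose(key: Any?) {
        if (active && key == currentKey) return // same key: recomposition does NOT restart it
        if (active) onDispose()                 // key changed: clean up the old effect first
        currentKey = key
        active = true
        onStart()
    }

    // Simulates the composable leaving the composition.
    fun leaveComposition() {
        if (active) onDispose()
        active = false
    }
}
```

Composing twice with the same key starts the effect once (the actor just changes lines); a new key blows out the candle and lights a new one; leaving the composition always cleans up.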

✅ Final Tip

So next time you add a LaunchedEffect or a DisposableEffect, ask yourself:

  • Is this a one-time action? → Use LaunchedEffect.

  • Does it need cleanup? → Use DisposableEffect.

Thinking this way makes Compose easier and your code cleaner.

Further reading:

  • Side-Effect official docs.

  • LaunchedEffect official docs.


Exploring Android XR

The world of extended reality (XR) is expanding rapidly, merging physical and digital realms to create immersive experiences. Android XR offers a versatile platform for developers to build applications that blend augmented reality (AR) and virtual reality (VR) into everyday life. In this post, we’ll explore the essentials of Android XR and provide you with a starting point to dive into this exciting technology.

What is Android XR?

XR (Extended Reality) encompasses all immersive technologies—AR, VR, and mixed reality (MR). Android XR integrates these experiences seamlessly into Android devices, allowing developers to create cutting-edge applications that:

  • Overlay digital objects on the real world (AR).

  • Fully immerse users in virtual environments (VR).

  • Combine real and virtual objects that interact in real time (MR).

Android’s XR ecosystem is built on frameworks like ARCore and leverages powerful hardware capabilities available in modern devices.

Key Components of Android XR

1. ARCore

ARCore is Android’s primary SDK for building AR applications. It provides tools to:

  • Track motion in 3D space.

  • Understand environmental features like flat surfaces.

  • Estimate lighting conditions for realistic AR rendering.

2. XR Interaction Tools

Android XR provides APIs and libraries to simplify interactions, such as detecting gestures or recognizing physical objects. Developers can use Unity or Unreal Engine to create rich 3D experiences or integrate ARCore directly into Android apps for custom solutions.

3. Cross-Platform Development

Android XR supports frameworks like OpenXR, making it easier to build applications that work across multiple devices, from smartphones to head-mounted displays (HMDs).

Getting Started with Android XR Development

1. Set Up Your Development Environment

Start by installing Android Studio and configuring it for XR development:

  • Install the latest version of Android Studio.

  • Add the ARCore dependency to your project.

  • Use a physical device with ARCore support for testing.
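For the ARCore step, the dependency goes in your module’s build file. The version below is only illustrative, so check the latest ARCore release before copying it.

```kotlin
// app/build.gradle.kts — the version number is illustrative; use the latest ARCore release.
dependencies {
    implementation("com.google.ar:core:1.41.0")
}
```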

2. Learn the Basics

Explore Android XR’s official documentation.

3. Build Your First App

Try creating a simple AR app that displays a 3D object on a flat surface. ARCore’s Plane Detection API can help you get started quickly.
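Under the hood, placing an object on a detected surface boils down to intersecting a camera ray with a plane. The sketch below illustrates that math only, in plain Kotlin; ARCore’s own hit-test API handles this (plus motion tracking) for you, and none of these names belong to any ARCore API.

```kotlin
// Plain-Kotlin illustration of ray/plane intersection, the math behind
// AR plane hit-testing. ARCore performs this against detected planes for you.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
}

// Intersect a camera ray with a horizontal plane at height planeY.
// Returns the hit point, or null if the ray never reaches the plane.
fun hitHorizontalPlane(origin: Vec3, direction: Vec3, planeY: Double): Vec3? {
    if (direction.y == 0.0) return null        // ray parallel to the plane
    val t = (planeY - origin.y) / direction.y  // distance along the ray
    if (t <= 0.0) return null                  // plane is behind the camera
    return origin + direction * t
}
```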

My Android XR Demo Project

To help you jumpstart your journey, I’ve created a simple demo app showcasing basic XR features using ARCore and Jetpack Compose. This project serves as a practical example to learn XR development fundamentals.

android-xr-bitcoin-ethereum

Check it out on GitHub: Android XR Demo

Note that you can also check out some official samples from the Google team.


A Guide to Accessibility in Android Apps with Jetpack Compose

Last week, I had the privilege of presenting at The Test Tribe 12th Calgary Meetup, hosted at the Neo Financial office in Calgary. The event, held on November 28, 2024, brought together an amazing community of testers and developers passionate about creating better user experiences.

During the session, titled “Creating Inclusive Experiences: A Guide to Accessibility in Android Apps with Jetpack Compose,” I delved into the vital role accessibility plays in shaping a truly universal user experience. We explored topics such as:

  • The challenges faced by users with disabilities, including visual, mobility, cognitive, and hearing impairments.

  • The accessibility services provided by Android, like TalkBack, Switch Access, and Voice Access.

  • Practical guidelines for creating inclusive designs, such as optimizing touch targets, simplifying gestures, and ensuring sufficient color contrast.

  • Tools and methods for testing accessibility, including manual testing with TalkBack and automated testing using semantics in Jetpack Compose.

The presentation also included practical demonstrations of accessibility testing, featuring examples from an app I created specifically to highlight accessibility issues and solutions (InaccessibleApp).


Why Accessibility Matters

As the World Health Organization notes, over 1.3 billion people live with some form of disability. Accessibility isn’t just about compliance; it’s about empathy and inclusion. By leveraging Jetpack Compose’s accessibility tools, we can build Android apps that make everyone feel welcome and empowered.

Thank You to the Community

I want to extend a heartfelt thank you to the organizers, attendees, and everyone who made this event possible. The energy in the room was fantastic, with engaging questions and thoughtful discussions. It was a delightful evening where we all learned so much about building better, more inclusive experiences.

Looking forward to more opportunities to share knowledge and grow together with this vibrant community!