Build hybrid and on-device experiences with Firebase AI Logic

You can build AI-powered Android and Web apps and features with hybrid inference using Firebase AI Logic. Hybrid inference runs inference with an on-device model when one is available and seamlessly falls back to a cloud-hosted model otherwise (and vice versa).

  • Using an on-device model for inference offers:

    • Enhanced privacy
    • Local context
    • Inference at no cost
    • Offline functionality
  • Using hybrid functionality offers:

    • Broader reach for your audience, by accommodating differences in on-device model availability and internet connectivity
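The fallback behavior described above can be sketched as a small dispatcher: try the on-device model first, and use the cloud-hosted model when the local one is missing or fails. This is an illustrative pattern only, not the Firebase AI Logic API; all names (`hybridGenerate`, `HybridOptions`, and the stub models) are hypothetical.

```typescript
// Sketch of the hybrid-inference pattern: prefer an on-device model,
// fall back to a cloud-hosted model when it is unavailable.
// All identifiers here are illustrative, not the Firebase AI Logic API.

type GenerateFn = (prompt: string) => Promise<string>;

interface HybridOptions {
  onDevice?: GenerateFn;   // undefined when no on-device model is available
  cloud: GenerateFn;       // cloud-hosted fallback
  online: () => boolean;   // connectivity check
}

async function hybridGenerate(prompt: string, opts: HybridOptions): Promise<string> {
  if (opts.onDevice) {
    try {
      // On-device path: enhanced privacy, no per-call cost, works offline.
      return await opts.onDevice(prompt);
    } catch {
      // On-device inference failed; fall through to the cloud model.
    }
  }
  if (!opts.online()) {
    throw new Error("No on-device model available and no connectivity");
  }
  return opts.cloud(prompt);
}

// Example: no on-device model is present, so the cloud stub answers.
async function demo(): Promise<void> {
  const answer = await hybridGenerate("Summarize this note", {
    cloud: async (p) => `cloud: ${p}`,
    online: () => true,
  });
  console.log(answer); // prints "cloud: Summarize this note"
}

demo();
```

The same shape generalizes to the reverse preference (cloud first, on-device as the offline fallback) by swapping which model is tried first.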

Follow our get started guides

These guides provide step-by-step instructions for setting up hybrid experiences in your apps.