When you're ready to launch your app and have real end users interact with your generative AI features, make sure to review this checklist of best practices and important considerations.
General
Review the general launch checklist for apps that use Firebase
This Firebase launch checklist describes important best practices before launching any Firebase app to production.
Make sure your Firebase projects follow best practices
For example, make sure that you use different Firebase projects for development, testing, and production. Review more best practices for managing your projects.
Access and security
Review the general security checklist for apps that use Firebase
This security checklist describes important best practices for access and security for Firebase apps and services.
Start enforcing Firebase App Check
App Check helps protect the Vertex AI Gemini API by verifying that requests are from your actual app. It supports attestation providers for Apple platforms (DeviceCheck or App Attest), Android (Play Integrity), and Web (reCAPTCHA Enterprise).
Set up restrictions for your Firebase API keys
Review each Firebase API key's "API restrictions" allowlist:
- Make sure that the Vertex AI in Firebase API is in the allowlist.
- Make sure that the only other APIs in the key's allowlist are for Firebase services that you use in your app. See the list of which APIs are required to be on the allowlist for each product.
Set "Application restrictions" to help restrict usage of each Firebase API key to only requests from your app (for example, a matching bundle ID for the Apple app). Note that even if you restrict your key, Firebase App Check is still strongly recommended.
Note that Firebase-related APIs use API keys only to identify the Firebase project or app, not for authorization to call the API.
Disable any unused APIs in your Firebase project
For example, if you first tried out the Gemini API using Google AI Studio, you can now disable the Generative Language API. Your app now uses Vertex AI in Firebase, which relies on the Vertex AI API and the Vertex AI in Firebase API instead.
Billing and quota
Review your quotas for the required underlying APIs
Using Vertex AI in Firebase requires two APIs: the Vertex AI API and the Vertex AI in Firebase API.
Each API's quota is measured slightly differently, so the two quotas serve different purposes. For important considerations, see Understand the quotas for each API.
Note that quotas are also variable according to model and region, so make sure that your quotas are set accordingly for your users and use cases.
You can also edit quota or request a quota increase, as needed.
Avoid surprise bills
As a best practice for production, monitor your usage and set up budget alerts.
Management of configurations
Use a stable model version in your production app
In your production app, only use stable model versions (like gemini-1.5-flash-002), not a preview version or an auto-updated version.
Even though an auto-updated version points to a stable version, the actual model version it points to will automatically change whenever a new stable version is released, which could mean unexpected behavior or responses. Also, preview versions are only recommended during prototyping.
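One way to enforce this in a build step or release check is a small guard that only accepts pinned stable version names. The naming patterns checked below (a "preview" segment, a "-latest" alias, and a trailing numeric suffix like "-002") are assumptions for illustration, not an official naming rule:

```typescript
// Hypothetical guard: accept only model names pinned to a stable version.
// The patterns below are illustrative assumptions, not an official rule.
function isPinnedStableModel(modelName: string): boolean {
  if (modelName.includes("preview")) return false; // preview release
  if (modelName.endsWith("-latest")) return false; // auto-updated alias
  // A pinned stable version ends in an explicit numeric suffix, e.g. "-002".
  return /-\d{3}$/.test(modelName);
}

console.log(isPinnedStableModel("gemini-1.5-flash-002")); // true
console.log(isPinnedStableModel("gemini-1.5-flash"));     // false (unversioned alias)
```

A check like this can run in CI so that a preview or auto-updated model name never ships in a release build.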
We also strongly recommend using Firebase Remote Config to control and update the model name used in your app (see the next section for details).
Set up and use Firebase Remote Config
With Remote Config, you can control important configurations for your generative AI feature in the cloud rather than hard-coding values in your code. This means that you can update your configuration without releasing a new version of your app. You can do a lot with Remote Config, but here are the top values that we recommend you control remotely for your generative AI feature:
Keep your app up-to-date:
- Model name: Update the model your app uses as new models are released or others are discontinued.

Adjust values and inputs based on client attributes, or to accommodate feedback from testing or users:
- Model configuration: Adjust the temperature, max output tokens, and more.
- Safety settings: Adjust safety settings if too many responses are getting blocked or if users report harmful responses.
- System instructions and any prompts that you provide: Adjust the additional context that you're sending to the model to steer its responses and behavior. For example, you might want to tailor prompts for specific client types, or personalize prompts for new users in ways that differ from the prompts used for existing users.
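The client-side pattern behind all of these values is the same: prefer the fetched remote value and fall back to an in-app default when nothing has been fetched yet. As a sketch (the parameter names like model_name and the default values here are illustrative assumptions, not an official schema):

```typescript
// Illustrative in-app defaults for remotely controlled generative AI
// parameters. Names and values are assumptions for this sketch.
const defaults: Record<string, string | number> = {
  model_name: "gemini-1.5-flash-002",
  temperature: 0.7,
  max_output_tokens: 1024,
};

// Stand-in for fetched Remote Config values; in a real app these would
// come from the Remote Config SDK after a fetch-and-activate step.
function getRemoteValue(
  remote: Record<string, string | number>,
  key: string
): string | number {
  // Prefer the remote value; fall back to the in-app default.
  return key in remote ? remote[key] : defaults[key];
}

// Example: the server overrides only the model name.
const fetched: Record<string, string | number> = {
  model_name: "gemini-1.5-pro-002",
};
console.log(getRemoteValue(fetched, "model_name"));  // "gemini-1.5-pro-002"
console.log(getRemoteValue(fetched, "temperature")); // 0.7
```

Shipping sensible in-app defaults means your feature still works on first launch, before the first Remote Config fetch completes.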
You can also optionally set a minimum_version parameter in Remote Config to compare the app's current version with the Remote Config-defined latest version, and then either show an upgrade notification to users or force users to upgrade.
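The version comparison itself can be a small client-side helper. This sketch assumes simple dotted numeric version strings (the helper name and the comparison rule are illustrative, not part of any SDK):

```typescript
// Compare dotted version strings numerically (so "1.10.0" > "1.9.2").
// Returns true when the current app version is below the minimum.
function isOutdated(currentVersion: string, minimumVersion: string): boolean {
  const cur = currentVersion.split(".").map(Number);
  const min = minimumVersion.split(".").map(Number);
  const len = Math.max(cur.length, min.length);
  for (let i = 0; i < len; i++) {
    const a = cur[i] ?? 0; // missing components count as 0 ("1.2" == "1.2.0")
    const b = min[i] ?? 0;
    if (a !== b) return a < b;
  }
  return false; // equal versions are not outdated
}

console.log(isOutdated("1.9.2", "1.10.0")); // true: prompt or force an upgrade
console.log(isOutdated("2.0.0", "1.10.0")); // false
```

Note the numeric comparison: a naive string comparison would wrongly treat "1.10.0" as lower than "1.9.2".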
Set the location where the Vertex AI service runs and where you access a model
Setting a location can help control costs and reduce latency for your users.
If you don't specify a location, the default is us-central1. You can set this location during initialization, or you can optionally use Firebase Remote Config to dynamically change the location based on each user's location.
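The fallback logic is simple either way; this sketch assumes the override string comes from Remote Config (the helper name and the override source are illustrative assumptions):

```typescript
// Documented default location for the Vertex AI service.
const DEFAULT_LOCATION = "us-central1";

// Pick a per-user override (e.g. a Remote Config value) when present,
// otherwise fall back to the default location.
function resolveLocation(remoteOverride?: string): string {
  return remoteOverride && remoteOverride.length > 0
    ? remoteOverride
    : DEFAULT_LOCATION;
}

console.log(resolveLocation());               // "us-central1"
console.log(resolveLocation("europe-west1")); // "europe-west1"
```

Resolving the location once at initialization, rather than per request, keeps all of a user's requests in one region.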