# Firebase AI Live API Demo

**Target Platforms:** iOS, Android, Web

**Tech Stack:** [Flutter](https://flutter.dev/) (frontend), [Firebase AI Logic](https://firebase.google.com/docs/ai-logic) (Gemini API in Vertex AI for the backend)

This app demonstrates how to build a Flutter app with real-time bidirectional
audio & video streaming to the Gemini "Live API" using Firebase AI Logic.

As seen in the
["What's New in Flutter"](https://youtu.be/v6Rzo5khNE8?si=0316B2O7xDM4Zp4S&t=2278)
segment of the Google I/O 2025 keynote.
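
Under the hood, the app opens a live session with a Gemini Live model and
streams microphone and camera frames over it. Here is a minimal sketch of
opening such a session with the `firebase_ai` package; the model name and the
exact `send`/`receive` signatures are assumptions based on the package's Live
API surface at the time of writing, so check the Firebase AI Logic docs for
the current API.

```dart
import 'package:firebase_ai/firebase_ai.dart';

Future<void> startLiveSession() async {
  // The live-capable model name is an assumption; pick one from the docs.
  final model = FirebaseAI.vertexAI().liveGenerativeModel(
    model: 'gemini-2.0-flash-live-preview-04-09',
    liveGenerationConfig: LiveGenerationConfig(
      // Ask the model to reply with spoken audio rather than text.
      responseModalities: [ResponseModalities.audio],
    ),
  );

  // Open the bidirectional streaming session.
  final session = await model.connect();

  // Send one text turn, then stream the model's responses back.
  await session.send(input: Content.text('Hello Gemini!'), turnComplete: true);
  await for (final response in session.receive()) {
    // Decode and play audio chunks (or render text) from `response` here.
  }
}
```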

## Getting Started

1. Follow [these instructions](https://firebase.google.com/docs/ai-logic/get-started?&api=vertex#set-up-firebase)
to set up a Firebase project.

1. Connect the app to your Firebase project using `flutterfire configure`.

Install `flutterfire_cli`:

```console
flutter pub global activate flutterfire_cli
```

Then run the `flutterfire` command to configure the app for your Firebase project:

```console
rm lib/firebase_options.dart
flutterfire configure
```
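
`flutterfire configure` regenerates `lib/firebase_options.dart`, which the app
uses to initialize Firebase before calling `runApp`. For reference, the
standard FlutterFire initialization pattern looks like this (a sketch, not
necessarily the demo's exact `main.dart`):

```dart
import 'package:firebase_core/firebase_core.dart';
import 'package:flutter/widgets.dart';

import 'firebase_options.dart'; // generated by `flutterfire configure`

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Use the per-platform options the CLI generated for your project.
  await Firebase.initializeApp(options: DefaultFirebaseOptions.currentPlatform);
  // runApp(...) follows here.
}
```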

1. Run `flutter pub get` in the root of the project directory, `firebase_ai_live_api_demo`, to
install the Flutter app dependencies.

1. Run `flutter run -d <device-id>` to start the app on iOS, Android, or Web.

> [!TIP]
> Get available devices by running `flutter devices`; device IDs look like `AA8A7357`, `macos`, or `chrome`.
> The live video functionality won't work on iOS simulators due to camera restrictions.
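
For example, to list your devices and then launch the web build in Chrome:

```console
flutter devices
flutter run -d chrome
```
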
## How to use the demo app

1. When prompted, allow the app permission to access your camera and microphone.

1. Click the call button and say "Hello Gemini!"

1. Turn on your device camera by clicking the camera button.

1. You'll see in `lib/src/flutterfire_ai_live_api_demo.dart` that the app
is pre-configured with a "plant identifier" system instruction. So point your camera at a plant and ask Gemini to identify it!
```dart
systemInstruction: Content.text(
  'You are a plant identifier. Greet the user by telling them that you '
  'are a plant identifier. Ask them to turn on their camera and show '
  'you a plant and you can help them identify plants and flowers. '
  'Your job is to help the user identify plants and flowers. '
  'When the user asks you to identify a plant or flower, respond '
  'by telling them what it is along with a fun fact about it. '
  'If you\'re unable to identify the plant or flower, you may ask the user '
  'for more information about it or ask for a closer look.',
)
```

If you have some time on your hands, try modifying the system instruction or
the model configuration, or add tools that let Gemini retrieve real-time info
or take some sort of action; a rough sketch of the tool route follows below.
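
For instance, a tool is declared as a `FunctionDeclaration` and passed to the
model via `tools`. The sketch below assumes the live model constructor accepts
`tools` the same way the standard `generativeModel` does, and
`fetchPlantCareTips` is a hypothetical function invented for illustration:

```dart
// Hypothetical tool -- `fetchPlantCareTips` is not part of the demo.
final plantCareTool = FunctionDeclaration(
  'fetchPlantCareTips',
  'Returns care tips for a plant species the model has identified.',
  parameters: {
    'species': Schema.string(description: 'Latin name of the plant species.'),
  },
);

// Register the tool when constructing the model; when Gemini decides to call
// it, the app receives a function call to execute and sends the result back.
final model = FirebaseAI.vertexAI().liveGenerativeModel(
  model: 'gemini-2.0-flash-live-preview-04-09',
  tools: [Tool.functionDeclarations([plantCareTool])],
);
```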
| 71 | + |
| 72 | +## Additional Resources |
| 73 | +- [[Codelab] Build a Gemini powered Flutter app with Flutter & Firebase AI Logic](https://codelabs.developers.google.com/codelabs/flutter-gemini-colorist) |
| 74 | +- [Firebase AI Logic docs](https://firebase.google.com/docs/ai-logic) |
| 75 | + |
| 76 | +Feeling inspired? Check out these other Flutter & Firebase AI Logic sample apps! |
| 77 | +- [Agentic App Manager](https://github.com/flutter/demos/tree/main/agentic_app_manager): Build an agentic experience in a Flutter app using Firebase AI Logic with the Gemini API in Vertex AI. |
| 78 | +- [Colorist](https://github.com/flutter/demos/tree/main/vertex_ai_firebase_flutter_app): Explore LLM tooling interfaces by allowing users to describe colors in natural language. The app uses Gemini LLM to interpret descriptions and change the color of a displayed square by calling specialized color tools. |