Add internal generative model provider architecture #7722
base: feature/hybrid
Conversation
Introduces a new internal architecture for generative model providers, allowing for flexible integration of different generative AI backends. Key changes include:

- Added `GenerativeModelProvider` interface to define common operations for generative models.
- Implemented `CloudGenerativeModelProvider` for interacting with cloud-based generative AI services.
- Implemented `OnDeviceModelProvider` to support on-device generative AI models, including logic for handling text and image parts in prompts and mapping interop types.
- Introduced `FallbackGenerativeModelProvider` to enable a primary model with a configurable fallback to a secondary model based on a precondition or exceptions.
- Added `MissingOnDeviceModelProvider` as a placeholder for when on-device models are not available.
- Modified `Candidate` to include a `fromInterop` method for converting on-device model responses.
- Changed `GenerateContentResponse` constructor visibility to `internal` to accommodate new provider implementations.
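For readers skimming the conversation, here is a minimal Kotlin sketch of how a provider interface plus a fallback wrapper could be shaped. The class names follow the PR description, but every signature and the simplified `Content`/`GenerateContentResponse` placeholders are assumptions made for illustration, not the actual firebase-ai types.

```kotlin
// Illustrative sketch only: the real firebase-ai request/response types are richer than these.
data class Content(val text: String)
data class GenerateContentResponse(val text: String)

/** Common operations that every generative model backend is expected to implement. */
interface GenerativeModelProvider {
  suspend fun generateContent(vararg prompt: Content): GenerateContentResponse
}

/**
 * Delegates to [primary] while a precondition holds; falls back to [secondary] when the
 * precondition fails or when the primary throws an exception matching [fallbackOn].
 */
class FallbackGenerativeModelProvider(
  private val primary: GenerativeModelProvider,
  private val secondary: GenerativeModelProvider,
  private val usePrimary: () -> Boolean = { true },
  private val fallbackOn: (Throwable) -> Boolean = { true },
) : GenerativeModelProvider {
  override suspend fun generateContent(vararg prompt: Content): GenerateContentResponse {
    if (!usePrimary()) return secondary.generateContent(*prompt)
    return try {
      primary.generateContent(*prompt)
    } catch (t: Throwable) {
      if (fallbackOn(t)) secondary.generateContent(*prompt) else throw t
    }
  }
}
```

Under these assumptions, `OnDeviceModelProvider` and `CloudGenerativeModelProvider` would each implement the same interface, and a caller could compose them, for example `FallbackGenerativeModelProvider(primary = onDevice, secondary = cloud)`, so the rest of the SDK only ever talks to `GenerativeModelProvider`.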
/gemini review
Code Review
This pull request introduces a well-designed internal architecture for generative model providers, which significantly improves flexibility for integrating different AI backends. The separation of concerns with CloudGenerativeModelProvider, OnDeviceModelProvider, and FallbackGenerativeModelProvider is clean and effective. My review includes a couple of suggestions for the OnDeviceModelProvider to enhance its robustness and configurability.
firebase-ai/src/main/kotlin/com/google/firebase/ai/generativemodel/OnDeviceModelProvider.kt (2 outdated review threads, resolved)
/gemini review
Code Review
This pull request introduces a new internal architecture for generative model providers, which is a great step towards a more flexible and maintainable system. The new GenerativeModelProvider interface and its various implementations (Cloud, OnDevice, Fallback, Missing) are well-designed and effectively abstract away the specifics of different AI backends. The changes to related data classes like Candidate and GenerateContentResponse are consistent with this new architecture. My review includes a few suggestions to further improve maintainability by reducing some code duplication in the new provider implementations. Overall, this is a solid contribution.
...se-ai/src/main/kotlin/com/google/firebase/ai/generativemodel/CloudGenerativeModelProvider.kt (resolved)
firebase-ai/src/main/kotlin/com/google/firebase/ai/generativemodel/OnDeviceModelProvider.kt (2 outdated review threads, resolved)
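As background on the `Candidate.fromInterop` change called out in the PR description, here is a hypothetical sketch of converting an on-device interop response into the public `Candidate` type. All of the interop type names and shapes below (`InteropCandidate`, `InteropTextPart`, `InteropImagePart`) and the simplified public-side types are invented for this sketch; the actual mapping lives in the PR's `OnDeviceModelProvider` and `Candidate` changes and is not shown here.

```kotlin
// Hypothetical stand-ins for the on-device interop layer; not the real interop API.
sealed interface InteropPart
data class InteropTextPart(val text: String) : InteropPart
class InteropImagePart(val imageBytes: ByteArray) : InteropPart
data class InteropCandidate(val parts: List<InteropPart>)

// Simplified public-side types for the sketch.
sealed interface Part
data class TextPart(val text: String) : Part
class ImagePart(val bytes: ByteArray) : Part

data class Candidate(val parts: List<Part>) {
  companion object {
    /** Maps each interop part to its public counterpart. */
    internal fun fromInterop(interop: InteropCandidate): Candidate =
      Candidate(
        interop.parts.map { part ->
          when (part) {
            is InteropTextPart -> TextPart(part.text)
            is InteropImagePart -> ImagePart(part.imageBytes)
          }
        }
      )
  }
}
```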
The public api surface has changed for the subproject firebase-ai: Please update the api.txt files for the subprojects being affected by this change by running ./gradlew ${subproject}:generateApiTxtFile. Also perform a major/minor bump accordingly.
emilypgoogle left a comment:
Surface-level naming consistency comments; the contents seem fine.
...se-ai/src/main/kotlin/com/google/firebase/ai/generativemodel/MissingOnDeviceModelProvider.kt (outdated; resolved)
firebase-ai/src/main/kotlin/com/google/firebase/ai/generativemodel/OnDeviceModelProvider.kt (outdated; resolved)
Test coverage should be good now.

Test added for:
- `FallbackGenerativeModelProvider.kt`
- `OnDeviceGenerativeModelProvider.kt`

Test not needed for:
- `GenerativeModelProvider.kt`: is an interface
- `MissingOnDeviceGenerativeModelProvider.kt`: only throws exceptions
- `CloudGenerativeModelProvider.kt`: all the SDK test suite verifies that it works
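To give a flavour of the coverage described above, here is a hedged sketch of what a unit test for the fallback behaviour could look like, reusing the hypothetical `GenerativeModelProvider`, `Content`, `GenerateContentResponse`, and `FallbackGenerativeModelProvider` shapes sketched earlier on this page. The actual tests added in this PR are not shown and will differ.

```kotlin
import kotlinx.coroutines.runBlocking
import kotlin.test.Test
import kotlin.test.assertEquals

class FallbackGenerativeModelProviderTest {

  // Simple fakes for the hypothetical GenerativeModelProvider interface sketched above.
  private class FixedProvider(private val reply: String) : GenerativeModelProvider {
    override suspend fun generateContent(vararg prompt: Content) = GenerateContentResponse(reply)
  }

  private class FailingProvider : GenerativeModelProvider {
    override suspend fun generateContent(vararg prompt: Content): GenerateContentResponse =
      throw IllegalStateException("on-device model unavailable")
  }

  @Test
  fun `falls back to secondary when primary throws`() = runBlocking {
    val provider = FallbackGenerativeModelProvider(
      primary = FailingProvider(),
      secondary = FixedProvider("from cloud"),
    )
    val response = provider.generateContent(Content("hello"))
    assertEquals("from cloud", response.text)
  }

  @Test
  fun `skips primary when precondition is false`() = runBlocking {
    val provider = FallbackGenerativeModelProvider(
      primary = FixedProvider("from device"),
      secondary = FixedProvider("from cloud"),
      usePrimary = { false },
    )
    assertEquals("from cloud", provider.generateContent(Content("hello")).text)
  }
}
```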