Implementation of a Question Answering System using LLMs
Updated May 16, 2024 · Python
Matrix Factorization Recommender System: A scalable, cloud-based implementation using MongoDB, BigQuery, and Google Vertex AI for data processing, model training, and deployment.
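The core of such a recommender can be illustrated with a minimal matrix-factorization sketch. This is illustrative only, not the repository's pipeline (which uses MongoDB, BigQuery, and Vertex AI): it factorizes a tiny user-item rating matrix as R ≈ P·Qᵀ with plain SGD, and all names and hyperparameters here are hypothetical.

```python
import random

def factorize(ratings, n_users, n_items, k=2, steps=2000, lr=0.01, reg=0.02):
    """Learn user factors P and item factors Q from (user, item, rating) triples."""
    random.seed(0)
    P = [[random.random() for _ in range(k)] for _ in range(n_users)]
    Q = [[random.random() for _ in range(k)] for _ in range(n_items)]
    for _ in range(steps):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):  # SGD step with L2 regularization
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0)]
P, Q = factorize(ratings, n_users=2, n_items=3)
# Reconstruct a known rating; it should land near the observed 5.0
pred = sum(P[0][f] * Q[0][f] for f in range(2))
```

Unobserved cells of P·Qᵀ then serve as rating predictions for recommendation.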
A demo for trying things out and seeing how they work (for remind ryt)
A simple Retrieval-Augmented Generation (RAG) web application chatbot called Raggy 🤖
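The retrieval step of a RAG chatbot like Raggy can be sketched in a few lines. This is a hedged, self-contained toy (bag-of-words cosine similarity in place of the embedding model a real app would use); the retrieved passage would then be prepended to the prompt sent to the LLM.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    """Return the document most similar to the query."""
    q = Counter(query.lower().split())
    return max(docs, key=lambda d: cosine(q, Counter(d.lower().split())))

docs = [
    "Vertex AI hosts Gemini models on Google Cloud.",
    "Ruby gems are packaged libraries for Ruby.",
]
best = retrieve("which cloud hosts gemini", docs)
```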
Building an AI chatbot using the Google Gemini AI API in Python
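For a chatbot built on the Gemini API, the request body sent to the `generateContent` endpoint carries the conversation as alternating `user`/`model` turns. The sketch below only builds that payload as a plain dict so it runs without credentials; in practice you would POST it (or use the official Python client) with a valid API key, and the helper name here is hypothetical.

```python
import json

def build_request(history: list[tuple[str, str]], user_message: str) -> dict:
    """Build a generateContent payload from (role, text) chat history."""
    contents = [
        {"role": role, "parts": [{"text": text}]} for role, text in history
    ]
    # Append the new user turn last, as the model responds to the final entry
    contents.append({"role": "user", "parts": [{"text": user_message}]})
    return {"contents": contents}

payload = build_request([("user", "Hi"), ("model", "Hello!")], "What is Vertex AI?")
body = json.dumps(payload)  # serialized JSON request body
```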
AI conversation bot for foreign language practice
Modify a product image by replacing the background with Imagen
A Ruby Gem created to communicate with Gemini via Vertex AI, Generative Language API, or AI Studio, Google's generative AI services. It works with Ruby versions 2.6.0 and higher.
HTTP API for Nano Bots: small, AI-powered bots that can be easily shared as a single file, designed to support multiple providers such as Cohere Command, Google Gemini, Maritaca AI MariTalk, Mistral AI, Ollama, OpenAI ChatGPT, and others, with support for calling tools (functions).
Easy "1-line" calling of all LLMs from OpenAI, MS Azure, AWS Bedrock, GCP Vertex, and Ollama
Gemini is a code generator and code interpreter for Google Gemini.
Ruby Implementation of Nano Bots: small, AI-powered bots that can be easily shared as a single file, designed to support multiple providers such as Anthropic Claude, Cohere Command, Google Gemini, Maritaca AI, Mistral AI, Ollama, OpenAI ChatGPT, and others, with support for calling tools (functions).
A Ruby Gem for interacting with Gemini through Vertex AI, Generative Language API, or AI Studio, Google's generative AI services.