- Open Source and In-House: How Uber Optimizes LLM Training
Generative AI powered by LLMs (Large Language Models) has a wide range of applications at Uber, such as Uber Eats recommendations and search, customer support chatbots, code development, and SQL query generation.
- GitHub - ericzakariasson/uber-eats-mcp-server
What is MCP? The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external tools (a minimal server sketch appears after this list).
- Uber: LLM-Driven Developer Experience and Code Migrations at Scale . . .
This case study from Uber details their journey in implementing LLMs across their developer platform, highlighting three major initiatives and the lessons learned from each.
- Continuous Experiment Framework at Uber | Open Data Science Conference
Uber Eats ran ML algorithms on Uber's ML platform, Michelangelo, and then used Bayesian optimization to choose the parameters that maximize conversion rates and revenue (see the optimization sketch after this list). Uber AI Labs is also our collaborator.
- LLM Integration | ericzakariasson/uber-eats-mcp-server | DeepWiki
This document details the Large Language Model (LLM) integration within the Uber Eats MCP Server. It covers how the system leverages LLMs to power browser automation, allowing the server to intelligently …
- Engineering | Uber Blog
- Uber Eats MCP Server | Smithery
Enable seamless integration of Uber Eats data and actions with LLM applications using the Model Context Protocol. Facilitate interaction with Uber Eats services through a standardized MCP server interface.
- Uber enables outstanding on-demand experiences with AI - OpenAI
For example, we measure the quality of LLM responses and conduct controlled experiments to compare AI-augmented workflows with traditional ones. Metrics like user engagement and incremental gross bookings across geographic segments help us assess both customer satisfaction and business impact (see the experiment-comparison sketch after this list).
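
To make the MCP items above concrete, here is a minimal sketch of an MCP server exposing a single tool, written with the official `mcp` Python SDK's FastMCP helper. The `search_menu` tool and its canned response are hypothetical illustrations, not the actual ericzakariasson/uber-eats-mcp-server implementation.

```python
# Minimal MCP server sketch (illustrative only, not the actual
# ericzakariasson/uber-eats-mcp-server implementation).
# Requires the official MCP Python SDK: pip install "mcp[cli]"
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("uber-eats-demo")

@mcp.tool()
def search_menu(restaurant: str, query: str) -> str:
    """Search a restaurant's menu for items matching `query`.

    In the real server this is where an LLM-driven browser-automation
    step would run; here a canned string is returned so the sketch
    stays self-contained.
    """
    return f"No live data: would search {restaurant} for '{query}'."

if __name__ == "__main__":
    # Serve over stdio so an MCP-capable LLM client can call the tool.
    mcp.run()
```

An MCP-capable client (for example, an LLM application that speaks the protocol) can launch this process and invoke `search_menu` as a tool call over stdio.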
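The Michelangelo item above mentions Bayesian optimization for choosing parameters that maximize conversion. Below is a hedged sketch of that idea using scikit-optimize's `gp_minimize`; the synthetic objective and the parameter ranges are invented for illustration and are not Uber's internal setup.

```python
# Illustrative Bayesian-optimization loop with scikit-optimize
# (pip install scikit-optimize). The objective is a synthetic stand-in
# for an offline estimate of conversion rate, not Uber code.
from skopt import gp_minimize
from skopt.space import Real

def negative_conversion_rate(params):
    """gp_minimize minimizes, so return the negated conversion-rate
    estimate for a (discount, ranking_weight) pair."""
    discount, ranking_weight = params
    # Pretend conversion peaks around discount=0.15, ranking_weight=0.6.
    conversion = 0.08 - (discount - 0.15) ** 2 - 0.5 * (ranking_weight - 0.6) ** 2
    return -conversion

search_space = [
    Real(0.0, 0.3, name="discount"),
    Real(0.0, 1.0, name="ranking_weight"),
]

result = gp_minimize(
    negative_conversion_rate,
    search_space,
    n_calls=30,       # number of objective evaluations
    random_state=0,
)

print("best params:", result.x)
print("best estimated conversion:", -result.fun)
```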
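The OpenAI case-study snippet describes controlled experiments comparing AI-augmented workflows with traditional ones. One bare-bones way to read such an experiment on a conversion-style metric is a two-proportion z-test; the counts below are invented purely to show the arithmetic.

```python
# Two-proportion z-test for a controlled experiment: did the
# AI-augmented workflow convert better than the traditional one?
# The counts are invented; only the statistics are real.
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Return (z, one-sided p-value) for H1: rate_b > rate_a."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal survival function.
    p_value = 0.5 * (1 - erf(z / sqrt(2)))
    return z, p_value

# control = traditional workflow, treatment = AI-augmented workflow
z, p = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=545, n_b=10_000)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```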