A large North American retail chain engaged Cohere because it wanted to give its managers and employees quick answers to common human resources and operational questions. Answers weren’t always obvious, and busy managers would often file tickets that overwhelmed IT and took weeks to resolve.
Using Cohere Command and Rerank with retrieval-augmented generation (RAG), the customer built a search app connected to the company’s HR database. Users could ask questions in plain language and receive simple, conversational responses based on location, union affiliation, and job level.
STEP 1. Cohere Command builds a query from the employee’s level, union affiliation, and search terms for employment contracts.
STEP 2. The database is queried for the employee’s union affiliation, then the relevant manuals and employment contracts are searched.
STEP 3. Cohere Rerank sorts the collected results by semantic relevance to the original query.
STEP 4. Command generates a conversational response, with citations, from the most relevant documents.
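The four steps above can be sketched end to end. This is a minimal, self-contained illustration: the function names, the keyword-overlap relevance score, and the sample documents are assumptions made for the sketch, not Cohere’s production API (in the real system, retrieval hits the HR database, reranking calls Cohere Rerank, and generation calls Command with the documents attached).

```python
# Illustrative sketch of the four-step RAG flow; all names and data are hypothetical.

def build_query(level: str, union: str, terms: str) -> str:
    """Step 1: combine employee attributes and search terms into one query."""
    return f"{terms} (level: {level}, union: {union})"

def search_documents(query: str, corpus: list[str]) -> list[str]:
    """Step 2: naive keyword search over manuals and contracts (stand-in for the DB query)."""
    words = set(query.lower().split())
    return [d for d in corpus if words & set(d.lower().split())]

def rerank(query: str, docs: list[str], top_n: int = 2) -> list[str]:
    """Step 3: sort candidates by a toy relevance score (word overlap, not semantic)."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:top_n]

def generate_answer(query: str, docs: list[str]) -> str:
    """Step 4: compose a conversational response citing the top documents."""
    cites = "; ".join(f"[{i + 1}] {d}" for i, d in enumerate(docs))
    return f"Answer to '{query}', based on: {cites}"

# Example run over a tiny hypothetical corpus.
CORPUS = [
    "vacation policy for union staff",
    "overtime rules manual",
    "store parking map",
]

query = build_query("store manager", "Local 12", "vacation policy")
top_docs = rerank(query, search_documents(query, CORPUS))
print(generate_answer(query, top_docs))
```

The point of the sketch is the pipeline shape: each stage narrows the document set before the generation step, so the final response only cites material that survived both retrieval and reranking.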
Improved employee satisfaction
Reduced burden on store managers
>50% lower burden on IT support
Cohere’s retrieval prioritizes accurate responses and citations
Rerank comes with connectors to common data sources
Rerank can be fine-tuned to further improve domain performance
Cohere’s powerful inference frameworks optimize throughput and reduce compute requirements
Cohere models can be accessed through a SaaS API, on cloud infrastructure (Amazon SageMaker, Amazon Bedrock, OCI Data Science, Google Vertex AI, Azure AI), and private deployments (virtual private cloud and on-premises)
Over 100 languages are supported, so the same topics, products, and issues are identified consistently across languages