A leading North American asset management company found its analysts were spending significant time painstakingly sifting through news, commentary, and financial reports in their cloud-based repository to find the right information to answer customer questions about company performance.
The company used Cohere Command and Rerank with retrieval-augmented generation (RAG) to build an AI knowledge assistant connected to its cloud repository. Analysts could ask the assistant questions in plain English and immediately receive conversational responses they could use to answer customer questions, along with citations to the source documents within the repository.
STEP 1.
Cohere Command generates queries for the cloud repository based on questions and conversation history
STEP 2.
Cohere Rerank ranks repository search results by their relevance to user questions
STEP 3.
Cohere Command generates a conversational answer, along with citations to source documents
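The three-step flow above can be sketched in Python. This is a minimal, self-contained illustration of the pipeline shape only: the repository contents, function names, and the keyword-overlap scoring are hypothetical stand-ins for what Cohere Command (query generation, answer synthesis) and Cohere Rerank (relevance scoring) actually do.

```python
# Illustrative sketch of the three-step assistant pipeline.
# The real system uses Cohere Command and Rerank; here those models are
# replaced by simple stand-ins so the end-to-end flow is runnable.

# Hypothetical cloud repository: document name -> text.
REPOSITORY = {
    "q3_report.pdf": "Quarterly revenue rose 12 percent on strong fund inflows.",
    "analyst_note.txt": "Commentary: fee compression continues across the sector.",
    "press_release.txt": "The firm announced a new sustainable equity fund.",
}

def generate_query(question: str, history: list[str]) -> str:
    # Step 1 (Command): turn the question plus conversation history into
    # a repository search query. Stand-in: just normalize the question.
    return question.lower()

def rerank(query: str, docs: dict[str, str]) -> list[tuple[str, float]]:
    # Step 2 (Rerank): score each document by relevance to the query.
    # Stand-in: fraction of query words that appear in the document.
    words = set(query.split())
    scored = [
        (name, len(words & set(text.lower().split())) / len(words))
        for name, text in docs.items()
    ]
    return sorted(scored, key=lambda s: s[1], reverse=True)

def answer(question: str, ranked: list[tuple[str, float]]) -> dict:
    # Step 3 (Command): generate a conversational answer with citations
    # to source documents. Stand-in: quote the top-ranked document.
    top_doc, _ = ranked[0]
    return {"text": REPOSITORY[top_doc], "citations": [top_doc]}

query = generate_query("How did revenue change this quarter?", history=[])
ranked = rerank(query, REPOSITORY)
result = answer("How did revenue change this quarter?", ranked)
print(result["citations"])  # the cited source document(s)
```

In production, each stand-in would be a model call (query generation and grounded answering by Command, relevance scoring by Rerank), but the data flow between the three steps is the same.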
Less analyst time spent finding information
Faster research and due diligence process
Higher analyst job satisfaction
Cohere’s retrieval prioritizes accurate responses and citations
Rerank comes with connectors to common data sources
Rerank can be fine-tuned to further improve domain performance
Cohere’s powerful inference frameworks optimize throughput and reduce compute requirements
Cohere models can be accessed through a SaaS API, on cloud infrastructure (Amazon SageMaker, Amazon Bedrock, OCI Data Science, Google Vertex AI, Azure AI), and private deployments (virtual private cloud and on-premises)
Over 100 languages are supported, so the same topics, products, and issues are identified consistently across languages