Introducing Command R+: Our new, most powerful model in the Command R family.
A global provider of video conferencing services needed to add call summarization and action item extraction features to its offering in response to strong customer demand. Previous attempts with competing generative AI models had proven difficult to customize, and their output was frequently inaccurate and too verbose for customer requirements.
The customer built a solution by fine-tuning Cohere’s Command model on the nuances of transcript summarization. It accurately summarized live meetings in bullet points, including action items for call participants, and delivered the results on the company’s proprietary conferencing platform. Command delivered higher quality output than competitor models out of the box, and within two weeks of fine-tuning, achieved a 4:1 preference ratio on output quality.
STEP 1. The fine-tuned Command model ingests a meeting transcript.
STEP 2. Command delivers a meeting summary.
STEP 3. Command delivers a separate list of action items with owner names.
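The three steps above can be sketched with Cohere's Python SDK. This is a minimal illustration, not the customer's actual implementation: the model name, prompt wording, and helper names are assumptions, and a production deployment would point at the customer's fine-tuned model ID instead of a base model.

```python
# Sketch of the workflow: ingest a transcript, then request a summary
# and a separate action-item list. Prompts and model name are illustrative.

SUMMARY_PROMPT = (
    "Summarize the following meeting transcript as concise bullet points:\n\n{transcript}"
)
ACTION_ITEMS_PROMPT = (
    "List the action items from the following meeting transcript, "
    "one per line, each with its owner's name:\n\n{transcript}"
)


def build_prompts(transcript: str) -> tuple[str, str]:
    """Step 1: ingest the transcript and prepare the two requests."""
    return (
        SUMMARY_PROMPT.format(transcript=transcript),
        ACTION_ITEMS_PROMPT.format(transcript=transcript),
    )


def summarize_meeting(co, transcript: str, model: str = "command-r-plus"):
    """Steps 2 and 3: request a summary, then a separate action-item list.

    `co` is assumed to be a `cohere.Client` (created with your API key);
    each call to the Chat endpoint returns a response whose `.text` holds
    the model output.
    """
    summary_prompt, actions_prompt = build_prompts(transcript)
    summary = co.chat(model=model, message=summary_prompt).text
    action_items = co.chat(model=model, message=actions_prompt).text
    return summary, action_items
```

In a fine-tuned deployment like the one described, `model` would be replaced with the identifier of the customer's fine-tuned Command model, and the transcript would stream in from the conferencing platform.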
Higher end-customer satisfaction and retention
Higher end-customer productivity
High quality summaries and action items
Cohere’s retrieval prioritizes accurate responses and citations
Cohere’s models come with connectors to common data sources
Cohere’s models can be fine-tuned to further improve domain performance
Cohere’s powerful inference frameworks optimize throughput and reduce compute requirements
Cohere models can be accessed through a SaaS API, on cloud infrastructure (Amazon SageMaker, Amazon Bedrock, OCI Data Science, Google Vertex AI, Azure AI), and private deployments (virtual private cloud and on-premises)
Over 100 languages are supported, so the same topics, products, and issues are identified consistently across languages