RNS Number : 3014L
Sama
04 June 2025
Sama Launches Multimodal AI, Leveraging Diverse Data Types Alongside Human
Intelligence for Next-Gen AI Models
Initial implementations have delivered a 35% accuracy improvement and a 10%
reduction in product returns
SAN FRANCISCO, CA / ACCESS Newswire (https://www.accessnewswire.com/) /
June 4, 2025 / Sama (https://pr.report/b348), the leader in purpose-built,
responsible enterprise AI with agile data labeling for model training and
performance evaluation, today announced the launch of Sama Multimodal, a new
solution that combines multiple data types and inputs with human-in-the-loop
(HITL) validation to create more powerful, accurate AI systems. By integrating
diverse modalities including images, video, text, audio, LiDAR and radar data,
Sama Multimodal has demonstrated significant improvements in model accuracy
for industries such as automotive and retail. Early results have been
impressive, including a large retail implementation that saw a 35% increase in
model accuracy and a 10% reduction in product returns.
Sama Multimodal offers enterprise AI teams a flexible framework whose
widget-based architecture makes it easy to rapidly integrate multiple AI
models at different stages of the workflow, including pre-annotations from
open-source, client and/or Sama-based models, while incorporating strategic
HITL validation to ensure quality and mitigate bias in model outputs.
"With Sama Multimodal, organizations can build differentiated AI solutions
using the full spectrum of data available, including sensor data which is
growing ever more prolific," said Duncan Curtis, SVP of AI Product and
Technology at Sama. "What makes our platform truly unique is its flexibility -
teams can ingest, align, and annotate any combination of modalities, then
transition from pre-trained to proprietary models at the right moment in their
development workflow. It's designed to evolve with AI itself."
Sama Multimodal democratizes access to advanced AI technologies and creates
differentiated customer experiences for the retail and automotive industries.
In retail applications, for example, Sama's multimodal capabilities
significantly improve search relevance and product discovery by combining
image, text, and video annotations. In automotive, Sama
Multimodal excels at integrating camera, LiDAR, and radar data to create more
comprehensive environmental understanding for advanced driver assistance
systems and autonomous vehicles.
Sama's multimodal infrastructure is future-proof, enabling enterprises to
scale model sophistication without rebuilding data pipelines from scratch. By
leveraging human expertise for complex contextual understanding while
automating routine data processing tasks, Sama Multimodal is ideal not only
for today's applications but also for emerging needs, such as voice-assisted
retail search, vision-enhanced robotics, and personalized customer experiences
powered by real-time behavioral data.
Sama Multimodal is fully supported by SamaHub™, a collaborative workspace,
and by SamaAssure™, the industry's highest quality guarantee, which routinely
delivers a 98% first-batch acceptance rate.
About Sama
Sama is a global leader in data annotation solutions for computer vision,
generative AI and large language models. Our solutions minimize the risk of
model failure and lower the total cost of ownership through an
enterprise-ready, ML-powered platform and SamaIQ™, actionable data insights uncovered by
proprietary algorithms and a highly skilled on-staff team of over 5,000 data
experts. 40% of FAANG companies and other major Fortune 50 enterprises,
including GM, Ford and Microsoft, trust Sama to help deliver industry-leading
ML models.
Driven by a mission to expand opportunities for underserved individuals
through the digital economy, Sama is a certified B-Corp and has helped more
than 68,000 people lift themselves out of poverty. An MIT-led Randomized
Controlled Trial has validated its training and employment program. For more
information, visit www.sama.com (https://pr.report/b349).
Sama Media Contact:
press@samasource.org
SOURCE: Sama