220 AI developers voted in a competitive, clearly defined category, and the results reveal a split between market adoption and innovation leadership.
The March 2026 IT Brand Pulse AI Brand Leader survey asked 220 members of the AI developer community to vote for Market Leader and Intelligence & Innovation Leader in Multimodal Memory Platforms, one of 26 product categories in the AI Engineering stack. Twelve Labs was voted Market Leader with 30.0% of votes, ahead of Memories.ai at 20.0%. However, Memories.ai was voted Innovation Leader with 31.8% of votes, ahead of Twelve Labs at 25.0%.
This split between market leadership and innovation leadership suggests that Twelve Labs currently holds stronger developer mindshare and adoption, driven by its video-first multimodal intelligence positioning. Memories.ai, meanwhile, is increasingly viewed as the vendor pushing the category forward technically, with its focus on persistent multimodal memory and cross-session intelligence. The IT Brand Pulse analyst team expects this category to grow rapidly as agentic AI systems require richer multimodal context.
What Are Multimodal Memory Platforms?
IT Brand Pulse defines Multimodal Memory Platforms as systems that ingest, index, store, and retrieve memory across multiple data modalities, including video, audio, images, and text, to enable AI applications and agents to reason over rich, unstructured data. These platforms provide capabilities such as semantic video search, temporal indexing, multimodal embeddings, cross-modal retrieval, and persistent context across media types. They represent a foundational layer for AI systems that operate beyond text, enabling perception, memory, and reasoning across real-world data. The Multimodal Memory Platforms category sits within the Context & Memory sub-layer of the broader AI Engineering stack, alongside AI Context Engineering Platforms, AI Memory Platforms, Knowledge Graph Platforms, and Vector Databases.
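To make the cross-modal retrieval idea above concrete, here is a minimal sketch in plain Python. Everything in it is illustrative: the `MemoryStore` class, the hand-picked three-dimensional vectors, and the item descriptions are all hypothetical stand-ins for a real multimodal embedding model and index, not any vendor's actual API.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class MemoryStore:
    """Toy in-memory store: items from different modalities share one
    embedding space, so a text query can retrieve a video clip."""

    def __init__(self):
        self.items = []  # list of (modality, description, embedding)

    def add(self, modality, description, embedding):
        self.items.append((modality, description, embedding))

    def search(self, query_embedding, top_k=1):
        # Rank all items, regardless of modality, by similarity to the query.
        ranked = sorted(self.items,
                        key=lambda item: cosine(item[2], query_embedding),
                        reverse=True)
        return ranked[:top_k]

store = MemoryStore()
# Hand-assigned toy embeddings standing in for model output.
store.add("video", "dog catching a frisbee", [0.9, 0.1, 0.0])
store.add("audio", "dog barking", [0.7, 0.3, 0.0])
store.add("text", "quarterly sales report", [0.0, 0.1, 0.9])

# A text query embedded into the same space retrieves the video clip,
# because nearness in the shared space, not modality, drives retrieval.
modality, description, _ = store.search([0.95, 0.05, 0.0], top_k=1)[0]
print(modality, "-", description)  # → video - dog catching a frisbee
```

The single shared embedding space is the design point: because video, audio, and text land in the same vector space, one similarity search covers all modalities, which is what distinguishes this category from a single-modality vector database.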
Download the Brand Leader Report for Multimodal Memory Platforms.