
Liquid AI Launches Liquid Foundation Models: A Game-Changer in Generative AI

In a groundbreaking announcement, Liquid AI, a spin-off from MIT, has introduced its first series of Liquid Foundation Models (LFMs). Designed from first principles, these models set a new benchmark in the generative AI space, offering strong performance at different scales. With their innovative architecture and advanced capabilities, LFMs are poised to challenge leading AI models, including ChatGPT.

Liquid AI was founded by a team of MIT researchers including Ramin Hasani, Mathias Lechner, Alexander Amini, and Daniela Rus. Headquartered in Boston, Massachusetts, the company’s mission is to create capable and efficient general-purpose AI systems for enterprises of all sizes. The team originally pioneered liquid neural networks, a class of AI models inspired by brain dynamics, and now aims to expand the capabilities of AI systems at every scale, from edge devices to enterprise-scale deployments.

What are Liquid Foundation Models (LFMs)?

Liquid Foundation Models represent a new generation of AI systems that are highly efficient in both memory use and computing power. Built on a foundation of dynamical systems, signal processing, and numerical linear algebra, these models are designed to process various types of sequential data, such as text, video, audio, and signals, with remarkable accuracy.

Liquid AI has developed three primary language models as part of this launch:

  • LFM-1B: A compact model with 1.3 billion parameters, optimized for resource-constrained environments.
  • LFM-3B: A 3.1 billion parameter model ideal for edge deployment scenarios, such as mobile applications.
  • LFM-40B: A Mixture of Experts (MoE) model with 40.3 billion parameters, designed to perform complex tasks with exceptional performance.

These models have already shown state-of-the-art results in major AI benchmarks, making them a formidable competitor to existing generative AI models.

State-of-the-art performance

Liquid AI’s LFMs deliver best-in-class performance across a variety of benchmarks. For example, LFM-1B outperforms transformer-based models in its size category, while LFM-3B competes with larger models such as Microsoft’s Phi-3.5 and Meta’s Llama series. Despite its size, the LFM-40B is efficient enough to compete with models with even larger parameter counts, offering a unique balance between performance and resource efficiency.

Some highlights of the LFM performance include:

  • LFM-1B: Dominates benchmarks such as MMLU and ARC-C, setting a new standard for 1B-parameter models.
  • LFM-3B: Outperforms models like Phi-3.5 and Google’s Gemma 2 in efficiency, while maintaining a small memory footprint, making it ideal for mobile and edge AI applications.
  • LFM-40B: This model’s MoE architecture offers comparable performance to larger models, with 12 billion active parameters at any time.
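Liquid AI has not published the internals of LFM-40B’s routing, but the gap between 40.3 billion total and 12 billion active parameters is the defining property of a Mixture of Experts: each token is routed to only a few experts, so most weights sit idle on any given forward pass. A minimal, generic top-k gating sketch (expert counts and scores here are illustrative, not LFM-40B’s actual configuration):

```python
import numpy as np

def top_k_routing(gate_logits, k=2):
    """Select the k highest-scoring experts for one token and
    softmax-normalize their gating weights. Only these k experts'
    parameters participate in the computation for this token."""
    top = np.argsort(gate_logits)[-k:]        # indices of the k chosen experts
    weights = np.exp(gate_logits[top])
    weights /= weights.sum()                  # mixture weights sum to 1
    return top, weights

# Toy example: 8 experts, route one token to its top 2.
rng = np.random.default_rng(0)
logits = rng.normal(size=8)
experts, weights = top_k_routing(logits, k=2)
```

Because only the selected experts run, the per-token compute and memory cost scales with the active parameter count rather than the total, which is how a 40B-class MoE can rival much larger dense models at a fraction of the inference cost.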

A new era in AI efficiency

A key challenge in modern AI is managing memory and computation, especially when working with long-context tasks such as document summarization or chatbot interactions. LFMs excel in this area by efficiently compressing input data, resulting in less memory usage during inference. This allows the models to process longer sequences without the need for expensive hardware upgrades.

For example, LFM-3B offers a 32k-token context length, making it one of the most efficient models for tasks that require processing large amounts of data at once.
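To see why long contexts strain memory in the standard transformers that LFMs are compared against, consider the attention key/value cache, which grows linearly with sequence length. A rough back-of-the-envelope estimate (the model dimensions below are hypothetical for a 3B-class transformer, not LFM-3B’s actual configuration):

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_value=2):
    """Approximate size of a transformer's key/value cache for one sequence.
    The factor of 2 covers keys and values; bytes_per_value=2 assumes
    fp16/bf16 storage."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_value

# Illustrative 3B-class dense transformer at a 32k-token context:
size = kv_cache_bytes(n_layers=32, n_kv_heads=32, head_dim=128, seq_len=32_000)
gib = size / 2**30  # roughly 15.6 GiB for a single sequence
```

A cache of that size for one long sequence can exceed the memory of consumer GPUs, which is why architectures that compress or avoid a per-token cache are attractive for edge deployment.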

A revolutionary architecture

LFMs are built on a unique architectural framework, which differs from traditional transformer models. The architecture is centered around adaptive linear operators, which modulate the computation based on the input data. This approach allows Liquid AI to significantly optimize performance across a variety of hardware platforms, including NVIDIA, AMD, Cerebras, and Apple hardware.
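Liquid AI has not published the exact form of these operators, but as a loose, hypothetical sketch of what "modulating the computation based on the input" can mean, consider a linear layer whose effective weights are gated by the input itself (all names and dimensions here are illustrative):

```python
import numpy as np

class AdaptiveLinear:
    """Toy input-conditioned linear operator: a sigmoid gate computed
    from the input modulates which components of x pass through the
    fixed weight matrix. A generic illustration only; it does not
    reproduce Liquid AI's proprietary architecture."""
    def __init__(self, d_in, d_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=d_in ** -0.5, size=(d_out, d_in))
        self.G = rng.normal(scale=d_in ** -0.5, size=(d_in, d_in))

    def __call__(self, x):
        gate = 1.0 / (1.0 + np.exp(-self.G @ x))  # input-dependent gate in (0, 1)
        return self.W @ (gate * x)                # computation varies with the input

layer = AdaptiveLinear(d_in=4, d_out=3)
y = layer(np.ones(4))
```

Unlike a plain linear layer, the same weights here produce different effective transformations for different inputs, which is the general idea behind input-adaptive operators.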

The design space for LFMs includes a new blend of token-mixing and channel-mixing structures that improve how the model processes data. This leads to superior generalization and reasoning capabilities, especially in long-context tasks and multimodal applications.

Expanding the AI frontier

Liquid AI has big ambitions for LFMs. In addition to language models, the company is working to expand its base models to support different data modalities, including video, audio, and time series data. These developments will enable LFMs to scale across multiple sectors, such as financial services, biotechnology and consumer electronics.

The company is also focused on contributing to the open science community. While the models themselves are not open source at this time, Liquid AI plans to release relevant research results, methods, and datasets to the broader AI community, encouraging collaboration and innovation.

Early access and adoption

Liquid AI currently offers early access to its LFMs through several platforms, including Liquid Playground, Lambda (Chat UI and API), and Perplexity Labs. Companies looking to integrate advanced AI systems into their operations can explore the potential of LFMs in various deployment environments, from edge devices to on-premise solutions.

Liquid AI’s open science approach encourages early adopters to share their experiences and insights. The company is actively seeking feedback to refine and optimize its models for real-world applications. Developers and organizations interested in being part of this journey can contribute to the red team’s efforts and help Liquid AI improve its AI systems.

Conclusion

The release of Liquid Foundation Models marks a significant advancement in the AI landscape. With a focus on efficiency, adaptability and performance, LFMs are poised to reshape the way companies approach AI integration. As more organizations adopt these models, Liquid AI’s vision of scalable general-purpose AI systems will likely become a cornerstone of the next era of artificial intelligence.

If you are interested in exploring the potential of LFMs for your organization, Liquid AI invites you to reach out and join the growing community of early adopters shaping the future of AI.

For more information, visit the official Liquid AI website and start experimenting with LFMs today.
