Red Hat announces definitive agreement to acquire Neural Magic

Publisher: HeavenlyWhisper | Last updated: 2024-11-14 | Source: eepw

● This transaction reflects Red Hat's commitment to giving customers flexibility across hybrid cloud environments, delivering applications and workloads, including artificial intelligence (AI), from on-premises data centers to public clouds and the edge

● Neural Magic brings to Red Hat expertise in generative AI performance engineering, along with advanced model optimization algorithms and efficient inference serving on both GPUs and CPUs

Red Hat, Inc., the world’s leading provider of open source solutions, today announced that it has signed a definitive agreement to acquire Neural Magic, a pioneer in software and algorithms for accelerating generative AI (gen AI) inference workloads.

Neural Magic's expertise in inference performance engineering and its commitment to open source are highly aligned with Red Hat's vision for high-performance AI workloads that are precisely matched to customers' specific use cases and data, across all layers of hybrid cloud environments, whether on-premises, in the public cloud, or at the edge.

Generative AI dominates today's technology landscape, and the large language models (LLMs) that underpin these systems continue to grow in size. As a result, building efficient and reliable LLM services requires enormous computing power, energy, and specialized operational skills. These demands put customized, deployment-ready, security-focused AI solutions out of reach for most enterprises.

Red Hat plans to help more companies meet these challenges by simplifying access to generative AI through the open source vLLM project. Started at the University of California, Berkeley, vLLM is a community-driven open source project for open model serving, that is, how generative AI models run inference and produce answers. It supports all mainstream model families, incorporates the latest inference acceleration research, and works across a variety of hardware backends, including AMD GPUs, AWS Neuron, Google TPUs, Intel Gaudi, NVIDIA GPUs, and x86 CPUs.
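To give a concrete sense of the model-serving workflow vLLM provides, the short sketch below runs offline batch inference with vLLM's Python API. The model identifier, prompts, and sampling settings are illustrative assumptions rather than details from the announcement.

```python
# Minimal offline-inference sketch using vLLM's Python API.
# The model ID and sampling settings are illustrative, not from the announcement.
from vllm import LLM, SamplingParams

# Load an open-licensed model from the Hugging Face Hub; vLLM runs on a
# supported hardware backend (e.g., NVIDIA or AMD GPUs, or x86 CPUs).
llm = LLM(model="ibm-granite/granite-3.0-8b-instruct")

sampling = SamplingParams(temperature=0.7, top_p=0.95, max_tokens=128)

prompts = [
    "Summarize the benefits of hybrid cloud deployments for AI workloads.",
    "Explain inference serving in one sentence.",
]

# generate() batches the prompts and returns one RequestOutput per prompt.
for output in llm.generate(prompts, sampling):
    print(output.outputs[0].text)
```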

The combination of Neural Magic’s leadership in vLLM and Red Hat’s powerful hybrid cloud AI technology platform will provide enterprises with an open and flexible solution to build an AI strategy that meets their unique needs, regardless of where their data is stored.

Red Hat + Neural Magic: Leading the way to a hybrid cloud-ready, generative AI future

Neural Magic spun out of MIT in 2018 and builds efficient inference software for deep learning. Drawing on Neural Magic's depth in technology and performance engineering, Red Hat will accelerate its AI vision and, through its AI technology portfolio, remove the bottlenecks that hold back large-scale enterprise AI adoption. Red Hat works to make AI's transformative power broadly accessible through open source innovation, specifically by providing:

● Open source licensed models, ranging from 1B to 405B parameters, that can run anywhere in the hybrid cloud: enterprise data centers, multi-cloud platforms, and edge devices.

● Fine-tuning capabilities that let organizations easily customize LLMs to their private data and specific use cases while improving security.

● Inference performance engineering expertise that improves operational and infrastructure efficiency while delivering higher performance.

● A strong partner and open source ecosystem that broadens customer choice, with comprehensive support spanning LLMs and tools to certified server hardware and underlying chip architectures.

Expanding Red Hat AI's leadership in vLLM

Building on its deep expertise in vLLM, Neural Magic has developed an enterprise-grade inference stack that lets customers optimize, deploy, and scale LLM workloads across hybrid cloud environments while retaining full control over infrastructure choices, security policies, and model lifecycles. Neural Magic also conducts in-depth model optimization research: it developed LLM Compressor, a unified library for optimizing LLMs with state-of-the-art sparsification and quantization algorithms, and maintains a repository of pre-optimized models that customers can deploy directly with vLLM.
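To give a flavor of this model optimization workflow, the sketch below follows the one-shot quantization pattern from the LLM Compressor project's published quickstart. The import paths, modifier name, and arguments are assumptions based on that quickstart and may differ between releases; the model and calibration dataset are chosen purely for illustration.

```python
# Illustrative one-shot quantization sketch in the style of LLM Compressor's
# quickstart; import paths and arguments are assumptions and may vary by release.
from llmcompressor.modifiers.quantization import GPTQModifier
from llmcompressor.transformers import oneshot

# Quantize the model's linear layers to 4-bit weights (W4A16), leaving the
# output head in full precision.
recipe = GPTQModifier(targets="Linear", scheme="W4A16", ignore=["lm_head"])

oneshot(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # small open model, for illustration only
    dataset="open_platypus",                     # calibration dataset
    recipe=recipe,
    output_dir="TinyLlama-1.1B-Chat-W4A16",
    max_seq_length=2048,
    num_calibration_samples=512,
)

# The compressed checkpoint saved to output_dir can then be loaded and served
# directly with vLLM, which is how the pre-optimized model repository is meant
# to be consumed.
```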

Red Hat AI helps customers lower the cost and skills barrier of adopting AI through a set of powerful technologies, including:

● Red Hat Enterprise Linux AI (RHEL AI): A foundation model platform for seamlessly developing, testing, and running the IBM Granite family of open source LLMs to power enterprise applications on Linux servers.

● Red Hat OpenShift AI: A comprehensive AI platform that provides tools to quickly develop, train, serve, and monitor machine learning models, supporting deployment in distributed Kubernetes environments, whether on-premises, in the public cloud, or on edge devices.

● InstructLab: An open source AI community project co-created by Red Hat and IBM that improves the open source licensed Granite LLMs through its fine-tuning technology, enabling anyone to help shape the future of generative AI.

Neural Magic’s leadership in vLLM technology will further enhance Red Hat AI’s ability to support LLM deployments in hybrid cloud environments, providing a ready-made and highly optimized open source inference architecture.
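As an illustration of how an application consumes such an inference service, the sketch below queries a model behind the OpenAI-compatible HTTP API that vLLM-based servers expose. The endpoint URL, API key, and model name are placeholders, not details from the announcement.

```python
# Querying a model served behind an OpenAI-compatible endpoint (as exposed by
# vLLM's server). The URL, key, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # placeholder endpoint
    api_key="placeholder-token",           # placeholder credential
)

response = client.chat.completions.create(
    model="ibm-granite/granite-3.0-8b-instruct",  # illustrative model name
    messages=[{"role": "user", "content": "What is hybrid cloud AI in one sentence?"}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```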

The transaction is subject to relevant regulatory review and other customary closing conditions.

Supporting Testimonials

Matt Hicks, President and CEO, Red Hat

“AI workloads need to run wherever customer data resides, which requires platforms and tools that are flexible, standardized, and open, helping enterprises choose the most appropriate environment, resources, and architecture based on their unique operational and data needs. We are excited to further enhance our hybrid cloud AI portfolio with Neural Magic’s innovations, advancing Red Hat’s position as a leader not only in open source, but also in AI.”

Brian Stevens, CEO, Neural Magic

"Open source has proven time and again that it can drive innovation through the power of community collaboration. At Neural Magic, we have brought together the industry's top AI performance engineering experts, focused on building open, cross-platform, and efficient LLM services. Joining Red Hat is not only a cultural fit, but will also help all types of companies, large and small, meet their AI transformation needs."

Dario Gil, Senior Vice President and Director of Research, IBM

"As customers scale AI applications across hybrid environments, virtualized, cloud-native LLMs built on an open foundation will become the industry standard. Red Hat's leadership in open source, coupled with effective open source models like IBM Granite and Neural Magic, will give enterprises more choice and the control and flexibility they need to scale AI across platforms."

