Red Hat Announces Podman AI Lab

Accelerates developer adoption of generative AI in applications with tools that simplify and guide each step, from ideation to production, all from a local desktop environment

DENVER – RED HAT SUMMIT 2024 –

Red Hat, Inc., the world's leading provider of open source solutions, today announced Podman AI Lab, an extension for Podman Desktop that gives developers the ability to build, test and run generative artificial intelligence (GenAI)-powered applications in containers using an intuitive, graphical interface on their local workstation. This helps democratize GenAI and gives developers the convenience, simplicity and cost efficiency of a local development experience while maintaining ownership and control over sensitive data.

The recent surge of GenAI and open source large language models (LLMs) has ushered in a new era of computing that relies heavily on the use of AI-enabled applications, and organizations are moving quickly to establish expertise, processes and tools to remain relevant. Industry analyst firm IDC notes this shift, predicting “By 2026, 40% of net-new applications will be intelligent apps, where developers incorporate AI to enhance existing experiences and form new use cases.”1

As AI and data science move into mainstream application development, tools like Podman AI Lab can help fuel developer adoption of GenAI for building intelligent applications or enhancing their workflow using AI-augmented development capabilities. AI Lab features a recipe catalog with sample applications that give developers a jump start on some of the more common use cases for LLMs, including:

  • Chatbots that simulate human conversation, using AI to comprehend user inquiries and offer suitable responses. These capabilities are often used to augment applications that provide self-service customer support or virtual personal assistance.
  • Text summarizers, which provide versatile capabilities across many applications and industries, where they can deliver effective and efficient information management. Using this recipe, developers can build applications to assist with things like content creation and curation, research, news aggregation, social media monitoring, and language learning.
  • Code generators, which empower developers to concentrate on higher-level design and problem-solving by automating repetitive tasks like project setup and API integration, or to produce code templates.
  • Object detection, which identifies and locates objects within digital images or video frames. It is a fundamental component in various applications, including autonomous vehicles, retail inventory management, precision agriculture, and sports broadcasting.
  • Audio-to-text transcription, which automatically transcribes spoken language into written text, facilitating documentation, accessibility, and analysis of audio content.

These examples provide an entry point for developers where they can review the source code to see how the application is built and learn best practices for integrating their code with an AI model.

For developers, containers have traditionally provided a flexible, efficient and consistent environment for building and testing applications on their desktops without worrying about conflicts or compatibility issues. Today, they are looking for the same simplicity and ease of use for AI models. Podman AI Lab helps meet this need by giving them the ability to provision local inference servers, making it easier to run a model locally, get an endpoint, and start writing code to wrap new capabilities around the model.
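As an illustration of this workflow (not taken from the Podman AI Lab documentation), the minimal sketch below assumes the locally provisioned inference server exposes an OpenAI-compatible chat completions endpoint; the URL, port and model name are placeholders that would come from the endpoint details shown when the server is started.

```python
# Minimal sketch: call a locally provisioned inference server.
# Assumes an OpenAI-compatible /v1/chat/completions endpoint on
# localhost:8000; the port and model name are hypothetical and would
# be taken from the endpoint details shown in Podman AI Lab.
import requests

ENDPOINT = "http://localhost:8000/v1/chat/completions"  # placeholder URL

payload = {
    "model": "local-model",  # placeholder model identifier
    "messages": [
        {"role": "system", "content": "You are a concise support assistant."},
        {"role": "user", "content": "Summarize the benefits of running models locally."},
    ],
    "temperature": 0.2,
}

response = requests.post(ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```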

In addition, Podman AI Lab includes a playground environment that allows users to interact with models and observe their behavior. This can be used to test, experiment and develop prototypes and applications with the models. An intuitive prompt interface helps developers explore the capabilities and accuracy of various models and find the best model and settings for their application's use case.
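To make that kind of experimentation concrete, a hypothetical sketch along these lines could sweep sampling temperatures against the same placeholder endpoint used above and compare the responses; the endpoint and model name remain assumptions, not values from the product.

```python
# Hypothetical sketch: probe how sampling temperature changes a model's
# answers, mirroring the kind of experimentation a playground enables.
# Reuses the placeholder endpoint from the previous example.
import requests

ENDPOINT = "http://localhost:8000/v1/chat/completions"  # placeholder URL
PROMPT = "Suggest a one-sentence product description for a travel app."

for temperature in (0.0, 0.5, 1.0):
    payload = {
        "model": "local-model",  # placeholder model identifier
        "messages": [{"role": "user", "content": PROMPT}],
        "temperature": temperature,
    }
    reply = requests.post(ENDPOINT, json=payload, timeout=60).json()
    print(f"temperature={temperature}: {reply['choices'][0]['message']['content']}")
```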

As AI becomes more ubiquitous in the enterprise, Red Hat is leading the way in unlocking the potential for AI to drive innovation, efficiency and value through its portfolio of consistent, trusted and comprehensive AI platforms for the hybrid cloud.

Podman AI Lab builds on the strength of Podman Desktop, an open source project founded at Red Hat that now has more than one million downloads. It also offers tight integration with image mode for Red Hat Enterprise Linux, a new deployment method for the world’s leading enterprise Linux platform that delivers the operating system as a container image. This integration enables developers to more easily go from prototyping and working with models on their laptop to turning the new AI-infused application into a portable, bootable container that can easily be run anywhere across the hybrid cloud, from bare metal to a cloud instance, using Red Hat OpenShift.

The cloud is hybrid. So is AI.

For more than 30 years, open source technologies have paired rapid innovation with greatly reduced IT costs and lowered barriers to entry. Red Hat has been leading this charge for nearly as long, from delivering open enterprise Linux platforms with RHEL in the early 2000s to driving containers and Kubernetes as the foundation for open hybrid cloud and cloud-native computing with Red Hat OpenShift.

This drive continues with Red Hat powering AI/ML strategies across the open hybrid cloud, enabling AI workloads to run where data lives, whether in the datacenter, multiple public clouds or at the edge. More than just the workloads, Red Hat’s vision for AI brings model training and tuning down this same path to better address limitations around data sovereignty, compliance and operational integrity. The consistency delivered by Red Hat’s platforms across these environments, no matter where they run, is crucial in keeping AI innovation flowing.

Red Hat Summit

Join the Red Hat Summit keynotes to hear the latest from Red Hat executives, customers and partners.

Supporting Quotes

Sarwar Raza, vice president and general manager, Application Developer Business Unit, Red Hat

“The AI era is here, but for many application developers, building intelligent applications presents a steep learning curve. Podman AI Lab presents a familiar, easy-to-use tool and playground environment to apply AI models to their code and workflows safely and more securely, without requiring costly infrastructure investments or extensive AI expertise.”

Jim Mercer, program vice president, Software Development, DevOps and DevSecOps, IDC

"The new wave of generative AI (GenAI) is washing over the industry, with the impact on developers and DevOps professionals at the leading edge of this new paradigm. Across the seascape of development and DevOps, it can be assumed that change will accelerate. The next few years are likely to go down as some of the most important in the history of the development and deployment of applications."

ABOUT RED HAT

Red Hat is the world’s leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.



FORWARD-LOOKING STATEMENTS

Except for the historical information and discussions contained herein, statements contained in this press release may constitute forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are based on the company’s current assumptions regarding future business and financial performance. These statements involve a number of risks, uncertainties and other factors that could cause actual results to differ materially. Any forward-looking statement in this press release speaks only as of the date on which it is made. Except as required by law, the company assumes no obligation to update or revise any forward-looking statements.

    ###

    Red Hat, Red Hat Enterprise Linux, the Red Hat logo and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux® is the registered trademark of Linus Torvalds in the U.S. and other countries.