Akamai Sharpens Its AI Edge with Launch of Akamai Cloud Inference

New service gives companies the ability to realize a 3x improvement in throughput, 60% less latency, and 86% lower cost than traditional hyperscale infrastructure

CAMBRIDGE, Mass., March 27, 2025 /PRNewswire/ — Akamai (NASDAQ: AKAM), the cybersecurity and cloud computing company that powers and protects business online, today unveiled Akamai Cloud Inference to usher in a faster, more efficient wave of innovation for organizations looking to turn predictive and large language models (LLMs) into real-world action. Akamai Cloud Inference runs on Akamai Cloud, the world’s most distributed platform, to address the escalating limitations of centralized cloud models.

“Getting AI data closer to users and devices is hard, and it’s where legacy clouds struggle,” said Adam Karon, Chief Operating Officer and General Manager, Cloud Technology Group at Akamai. “While the heavy lifting of training LLMs will continue to happen in big hyperscale data centers, the actionable work of inferencing will take place at the edge where the platform Akamai has built over the past two and a half decades becomes vital for the future of AI and sets us apart from every other cloud provider in the market.”

AI inference on Akamai Cloud
Akamai’s new solution provides tools for platform engineers and developers to build and run AI applications and data-intensive workloads closer to end users, delivering 3x better throughput while reducing latency up to 2.5x. Using Akamai’s solution, businesses can save up to 86% on AI inference and agentic AI workloads compared to traditional hyperscaler infrastructure. Akamai Cloud Inference includes:

- Compute: Akamai Cloud offers a versatile compute arsenal, from classic CPUs for fine-tuned inference, to powerful accelerated-compute options in GPUs, to tailored ASIC VPUs, providing the right horsepower for a spectrum of AI inference challenges. Akamai integrates with NVIDIA’s AI Enterprise ecosystem, leveraging Triton, TAO Toolkit, TensorRT, and NVFlare to optimize the performance of AI inference on NVIDIA GPUs.

- Data management: Akamai enables customers to unlock the full potential of AI inference with a cutting-edge data fabric purpose-built for modern AI workloads. Akamai has partnered with VAST Data to provide streamlined access to real-time data and accelerate inference-related tasks, essential to delivering relevant results and a responsive experience. This is complemented by highly scalable object storage to manage the volume and variety of datasets critical to AI applications, and by integration with leading vector database vendors, including Aiven and Milvus, to enable retrieval-augmented generation (RAG). With this data management stack, Akamai securely stores fine-tuned model data and training artifacts to deliver low-latency AI inference at global scale.

- Containerization: Containerizing AI workloads enables demand-based autoscaling, improved application resilience, and hybrid/multicloud portability, while optimizing both performance and cost. With Kubernetes, Akamai delivers faster, cheaper, and more secure AI inference at petabyte-scale performance. Underpinned by Linode Kubernetes Engine (LKE)-Enterprise, a new enterprise edition of Akamai Cloud’s Kubernetes orchestration platform designed for large-scale enterprise workloads, and the recently announced Akamai App Platform, Akamai Cloud Inference can quickly deploy an AI-ready platform of open source Kubernetes projects, including KServe, Kubeflow, and SpinKube, seamlessly integrated to streamline the deployment of AI models for inference.

- Edge compute: To simplify how developers build AI-powered applications, Akamai Cloud Inference includes WebAssembly (Wasm) capabilities. Working with Wasm providers such as Fermyon, Akamai enables developers to execute LLM inference directly from serverless apps, allowing customers to run lightweight code at the edge to power latency-sensitive applications.
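The data management layer described above pairs object storage with vector databases to enable retrieval-augmented generation (RAG). As an illustrative sketch only (this is not Akamai's implementation; the documents and vectors below are invented toy data standing in for a real embedding model and a vector database such as Milvus), the core RAG lookup is: embed a query, rank stored document vectors by similarity, and prepend the best matches to the prompt.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "index": document text -> precomputed embedding.
# In production these vectors would come from an embedding model
# and live in a vector database, not an in-memory dict.
index = {
    "Akamai Cloud spans 4,200+ points of presence.": [0.9, 0.1, 0.0],
    "KServe serves ML models on Kubernetes.": [0.1, 0.9, 0.2],
    "Wasm runs lightweight code at the edge.": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k documents whose embeddings are most similar to the query."""
    ranked = sorted(index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc for doc, _ in ranked[:k]]

def build_prompt(question, query_vec):
    """Prepend retrieved context to the user's question for the LLM."""
    context = "\n".join(retrieve(query_vec))
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How are models served?", [0.2, 0.95, 0.1])
print(prompt)
```

The ranking logic is the same at scale; a vector database simply performs the nearest-neighbor search over millions of embeddings instead of a three-entry dictionary.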

Together, these tools create a powerful platform for low-latency, AI-powered applications that allows companies to deliver the experiences their users demand. Akamai Cloud Inference runs on the company’s massively distributed platform, capable of consistently delivering over one petabyte per second of throughput for data-intensive workloads. Comprising more than 4,200 points of presence across more than 1,200 networks in over 130 countries, Akamai Cloud makes compute resources available from cloud to edge while accelerating application performance and increasing scalability.

The shift from training to inference
As AI adoption matures, enterprises are recognizing that the hype around LLMs has created a distraction, drawing focus away from practical AI solutions better suited to specific business problems. LLMs excel at general-purpose tasks such as summarization, translation, and customer service, but they are very large models that are expensive and time-consuming to train. Many enterprises have found themselves constrained by architectural and cost requirements, including data center and computational power; well-structured, secure, and scalable data systems; and the challenges that location and security requirements place on decision latency. Lightweight AI models, designed to address specific business problems, can be optimized for individual industries, can use proprietary data to create measurable outcomes, and represent a better return on investment for enterprises today.

AI inference needs a more distributed cloud
Increasingly, data will be generated outside of centralized data centers or cloud regions. This shift is driving demand for AI solutions that leverage data generation closer to the point of origin. This fundamentally reshapes infrastructure needs as enterprises move beyond building and training LLMs, toward using data for faster, smarter decisions and investing in more personalized experiences. Enterprises recognize that they can generate more value by leveraging AI to manage and improve their business operations and processes. Distributed cloud and edge architectures are emerging as preferable for operational intelligence use cases because they can provide real-time, actionable insights across distributed assets even in remote environments. Early customer examples on Akamai Cloud include in-car voice assistance, AI-powered crop management, image optimization for consumer product marketplaces, virtual garment visualization shopping experiences, automated product description generators, and customer feedback sentiment analyzers.

“Training an LLM is like creating a map, requiring you to gather data, analyze terrain, and plot routes. It’s slow and resource-intensive, but once built, it’s highly useful. AI inference is like using a GPS, instantly applying that knowledge, recalculating in real time, and adapting to changes to get you where you need to go,” explained Karon. “Inference is the next frontier for AI.”

About Akamai
Akamai is the cybersecurity and cloud computing company that powers and protects business online. Our market-leading security solutions, superior threat intelligence, and global operations team provide defense in depth to safeguard enterprise data and applications everywhere. Akamai’s full-stack cloud computing solutions deliver performance and affordability on the world’s most distributed platform. Global enterprises trust Akamai to provide the industry-leading reliability, scale, and expertise they need to grow their business with confidence. Learn more at akamai.com and akamai.com/blog, or follow Akamai Technologies on X and LinkedIn.

Contacts
Akamai Media Relations
akamaipr@akamai.com 

Akamai Investor Relations
invrel@akamai.com 

View original content to download multimedia:https://www.prnewswire.com/news-releases/akamai-sharpens-its-ai-edge-with-launch-of-akamai-cloud-inference-302412571.html

SOURCE Akamai Technologies, Inc.


Chef Robotics Physical AI Models Can Now Automate Baked Goods Packing


SAN FRANCISCO, April 29, 2026 /PRNewswire/ — Chef Robotics, a leader in physical AI for the food industry, today announced that Chef robots can now automate tray assembly for baked goods packing. The application places baked products, such as burger buns, chocolate chip cookies, biscotti, butter cookies, biscuits, fortune cookies, granola bars, rusks, and shortbreads into trays and packaging containers before sealing.

Watch Chef robots in action.

Baked goods packing has historically been difficult to automate for high-mix production. Each item behaves differently on the production line—a granola bar compresses under the wrong grip, while a biscotti or rusk can crack if placed at the wrong angle. Surface textures range from glazed and smooth to crumbly and irregular, and strict presentation requirements leave little room for error. This variability has made it challenging for automation systems to reliably handle baked goods at production speeds, leaving food manufacturers dependent on manual labor and traditional bakery equipment.

To address this, Chef built its baked goods packing application on its existing piece-picking capability, which uses Chef’s AI-powered computer vision and physical AI models trained across diverse real-world production environments. This allows Chef robots to assess each item’s position, shape, and orientation in real time and determine how to pick the items from the pan and place them quickly and precisely without damaging them.

The baked goods packing application supports four distinct placement capabilities.

First, Chef’s vision system detects the angle at which each item sits in the pan and reorients it after picking, placing it on the tray at the exact angle required, regardless of its original position, enabling retail-ready presentation for SKUs that require precise angular placement.

Second, Chef robots can place multiple baked goods into the same packaging container in a single automated pass, completing full tray assembly without manual intervention.

Third, for packaging containers with multiple small compartments, Chef robots can precisely place items into each designated section, including multiple items in the same compartment, using Chef’s AI vision model to detect compartment positions and orientations in real time.

Fourth, Chef’s vision system identifies the exact center of each tray and places every item at a predefined offset from that center, ensuring a uniform, consistent arrangement across every pack regardless of how trays arrive on the conveyor.
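The fourth capability, placing every item at a predefined offset from a detected tray center, reduces to simple coordinate arithmetic. The following is a hypothetical sketch, not Chef's actual code, and the layout values are invented for illustration: given the tray center the vision system reports, each slot's absolute target is the center plus that slot's fixed offset, so the arrangement stays uniform no matter where the tray sits on the conveyor.

```python
# Per-slot (dx, dy) offsets from the tray center, in millimetres.
# These values are invented for illustration.
LAYOUT = [(-30.0, 0.0), (30.0, 0.0), (0.0, 40.0)]

def placement_targets(tray_center, layout=LAYOUT):
    """Compute absolute (x, y) placement targets from a detected tray center."""
    cx, cy = tray_center
    return [(cx + dx, cy + dy) for dx, dy in layout]

# Two trays detected at different conveyor positions produce the
# same relative arrangement, shifted to each tray's center.
print(placement_targets((100.0, 50.0)))
print(placement_targets((250.0, 48.5)))
```

A real cell would also fold in the item's detected pick angle (the first capability) when commanding the place pose, but the center-plus-offset step shown here is what keeps pack presentation consistent across trays.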

For food manufacturers evaluating bakery systems and baked goods packaging automation, the application offers higher throughput, reduced labor dependency, and consistent presentation across shifts. The capability runs on Chef’s existing robotic hardware and software, allowing manufacturers to deploy it without requiring any changes to their production lines.

Chef’s baked goods packing application is available in the U.S., Canada, Germany, and the UK and is included as part of Chef’s robotics-as-a-service (RaaS) pricing model.

About Chef Robotics
Chef is the first company to have commercialized a scalable AI-driven food robotics solution. With over 104 million servings made in production, Chef leverages ChefOS, an AI platform for food manipulation, to offer a Robotics-as-a-Service solution that helps industry-leading food companies increase production volume and meet demand. Headquartered in San Francisco, CA, Chef aims to empower humans to do what humans do best by accelerating the advent of intelligent machines. Visit https://chefrobotics.ai to learn more.

View original content:https://www.prnewswire.com/news-releases/chef-robotics-physical-ai-models-can-now-automate-baked-goods-packing-302756923.html

SOURCE Chef Robotics


Air Products to Expand Industrial Gas Supply for Samsung Electronics’ Next-Generation Semiconductor Fab in South Korea


New investment underscores the company’s long-term commitment to Korea and its leading role in the global semiconductor industry 

LEHIGH VALLEY, Pa., April 29, 2026 /PRNewswire/ — Air Products (NYSE: APD), a world-leading industrial gases company serving Samsung globally, today announced it has been selected by Samsung to supply industrial gases for its new advanced semiconductor fab in Pyeongtaek, Gyeonggi Province, South Korea.

Under the agreement, Air Products will build, own and operate multiple state-of-the-art production facilities and a bulk specialty gas supply system to supply nitrogen, oxygen, argon, and hydrogen for Samsung’s new semiconductor fab. The new facilities are expected to come onstream in multiple phases from 2028 through 2030.

Air Products has a long track record of executing multiple phased expansions in Pyeongtaek to support Samsung’s growing manufacturing needs. This latest project represents Air Products’ largest investment to date in the semiconductor industry and will establish Pyeongtaek as the company’s single largest operations site globally supporting the electronics industry.

“Air Products is honored to be selected once again by Samsung and to have their continued confidence as a trusted partner supporting their strategic growth plans,” said SR Kim, President, Air Products Korea. “This significant investment reinforces Air Products’ role as a leading global supplier to the semiconductor industry and underscores our long-standing commitment to supporting our strategic customers with safety, reliability, efficiency and excellent service.”

Air Products has served the global electronics industry for more than 40 years, supplying industrial gases safely and reliably to many of the world’s leading technology companies. The company has operated in Korea for more than 50 years and has established a strong position in electronics and manufacturing sectors.

About Air Products

Air Products (NYSE: APD) is a world-leading industrial gases company in operation for over 85 years focused on serving energy, environmental, and emerging markets and generating a cleaner future. The Company supplies essential industrial gases, related equipment and applications expertise to customers in dozens of industries, including refining, chemicals, metals, electronics, manufacturing, medical and food. As the leading global supplier of hydrogen, Air Products also develops, engineers, builds, owns and operates some of the world’s largest clean hydrogen projects, supporting the transition to low- and zero-carbon energy in the industrial and heavy-duty transportation sectors. Through its sale of equipment businesses, the Company also provides turbomachinery, membrane systems and cryogenic containers globally.

Air Products had fiscal 2025 sales of $12 billion from operations in approximately 50 countries. For more information, visit airproducts.com or follow us on LinkedIn, X, Facebook or Instagram.

This release contains “forward-looking statements” within the safe harbor provisions of the Private Securities Litigation Reform Act of 1995. These forward-looking statements are based on management’s expectations and assumptions as of the date of this release and are not guarantees of future performance. While forward-looking statements are made in good faith and based on assumptions, expectations and projections that management believes are reasonable based on currently available information, actual performance and financial results may differ materially from projections and estimates expressed in the forward-looking statements because of many factors, including the risk factors described in our Annual Report on Form 10-K for the fiscal year ended September 30, 2025 and other factors disclosed in our filings with the Securities and Exchange Commission. Except as required by law, we disclaim any obligation or undertaking to update or revise any forward-looking statements contained herein to reflect any change in the assumptions, beliefs or expectations or any change in events, conditions or circumstances upon which any such forward-looking statements are based.

View original content to download multimedia:https://www.prnewswire.com/news-releases/air-products-to-expand-industrial-gas-supply-for-samsung-electronics-next-generation-semiconductor-fab-in-south-korea-302757497.html

SOURCE Air Products
