AI on Edge: the next frontier for South-East Asia

Building AI on Edge applications, HP's Garage 2.0 program, and AI on Edge startups

AI on Edge is the next frontier for GenAI innovation, especially in manufacturing, healthcare, smart cities, and more.

On 13th March 2025, we hosted an event with our partner HP to discuss AI on Edge.

On our panel, we had esteemed AI leaders - Amit Mangwani (HP), Leo An (Intel), Calvin (CTO, Pints AI), and Alfred Siew (Founder, Techgoondu) - who discussed the challenges, opportunities, and innovations in the field of AI on Edge.

We had two Singapore startups - Pints AI and Elgo AI - demo their products and show how they can be deployed on-premise for a safer, privacy-first experience.

Left to right: Amit, Leo, Calvin, Alfred, Shivang

What is AI on Edge, and why does it matter for South-East Asia?

AI on Edge brings artificial intelligence processing directly to devices rather than relying on cloud servers. This enables real-time, self-contained decision-making on the edge device and ensures data privacy, which is especially important for sensitive industries. The technology is crucial for applications requiring immediate responses and robust security. As the next frontier, it holds significant potential for innovation in South-East Asia across manufacturing, healthcare, fintech/banking, and smart cities.

Insights from the panel discussion

Q: What are the key challenges in scaling edge AI solutions? How can startups address them?
A: Cost, compute, and LLM size are some of the limitations for scaling AI on Edge. There is a lot of innovation happening around building smaller, more powerful, and cheaper chips to power edge devices, which can deliver faster inference for real-time decision-making.

This is critical for industries such as healthcare, manufacturing, smart cities, and self-driving vehicles, where real-time data processing and output are essential for safety and decision-making. In these scenarios, you cannot rely entirely on the cloud but must explore a hybrid approach.

Startups should look at the use cases that require real-time decision-making and decide which ones can be processed on the cloud versus on the edge. Building Small Language Models, or pretraining larger models, that can easily sit on edge devices for the specific use case will be super important.

DeepSeek is the most recent example that size does not matter when it comes to advancements in reasoning models. Startups must continue to experiment and see which LLM plus cloud/edge combination works best.

The future is leaning towards a hybrid approach, where your solution can smartly switch between Edge and Cloud.
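As a rough illustration of that hybrid switching idea, here is a minimal sketch (all names and thresholds are hypothetical, not from any specific product) of routing each request to the edge or the cloud based on latency sensitivity and the capacity of the on-device model:

```python
# Hypothetical sketch of hybrid edge/cloud routing.
# The threshold and request fields are illustrative assumptions.
from dataclasses import dataclass

EDGE_TOKEN_BUDGET = 2_000  # assumed capacity of the on-device model


@dataclass
class Request:
    name: str
    latency_sensitive: bool  # e.g. a real-time surveillance alert
    tokens_needed: int       # rough proxy for compute demand


def route(req: Request) -> str:
    """Return 'edge' or 'cloud' for a given request."""
    if req.latency_sensitive and req.tokens_needed <= EDGE_TOKEN_BUDGET:
        return "edge"   # real-time path: keep inference on-device
    return "cloud"      # heavy or batch path: use cloud compute


print(route(Request("defect-alert", True, 500)))       # edge
print(route(Request("weekly-report", False, 50_000)))  # cloud
```

In practice the routing signal would be richer (network availability, data-sensitivity rules, battery state), but the core pattern of a small decision layer in front of two backends stays the same.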

Q: How can integration of AI on Edge impact data privacy and security, especially in industries that handle sensitive information like healthcare?
A: Data privacy and security is one of the most important factors driving the adoption of on-premise AI solutions, the building of smarter edge devices, and the assurance that ownership of data sits with the user.

The PII data of a user or organization is at risk when data flows out of the owner's data warehouse into third-party software or cloud infrastructure, or is shared via unsecured channels.

Healthcare, fintech, banking - these industries rely on protecting user data and financial data to avoid misuse. The data is critical for insights and real-time decision-making, but building guardrails around data sharing becomes critical.

With AI on Edge, ensuring the data stays within the organization's ecosystem makes AI applications safer and more trustworthy. This enables organizations to scale, abide by local laws, and follow global best practices and industry standards.

Q: How can businesses balance the power of cloud-based AI with the need for on-edge, real-time processing?
A: Hybrid is the key - switching between Cloud and Edge depending on the use case and scenario will drive better efficiency and cost. Businesses need to identify whether they need real-time inferencing, such as video surveillance to catch bad actors, or high compute, such as model training. The former calls for an edge solution, the latter for the cloud.

Startups need to understand their customers and the workflow to be able to balance the use of cloud and edge into their product experience.

While the cloud can provide fast inferencing, colocation distance and other factors introduce latency variance. That variance is manageable when time is not of the essence but compute is. However, for real-time data processing and decision-making, internet speed, geographical location, hardware setup, and many other factors can affect processing time. Hence, AI on Edge becomes critical to deliver a consistent experience.

Q: For startups working on edge AI, what are the most important considerations when designing solutions for scalability and interoperability?
A: Cost, R&D, and the customer - these must always be top of mind.

The cost of building and deploying an edge solution is critical for startups, especially in the early stages of testing product-market fit. This means finding the right GPU hardware, or building your own if you need to, but never letting the costs get out of hand.

Continuous experimentation with the latest LLMs to elevate your product experience is the key to staying relevant. You cannot let your current model be the driver for the next two years, because the next disruptive model (DeepSeek) may completely change your product roadmap two months later. Build your product in such a way that switching between models is easy.
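One common way to make model switching easy is to hide the backend behind a small interface so callers never depend on a specific vendor. The sketch below is a hypothetical illustration of that design (the class names and registry are assumptions, not a real API):

```python
# Hypothetical sketch: isolate the model behind a small interface so
# swapping LLM backends (on-device or cloud) is a config change, not a rewrite.
from typing import Protocol


class TextModel(Protocol):
    def generate(self, prompt: str) -> str: ...


class LocalSLM:
    """Stand-in for a small on-device language model."""
    def generate(self, prompt: str) -> str:
        return f"[local] response to: {prompt}"


class CloudLLM:
    """Stand-in for a hosted model API."""
    def generate(self, prompt: str) -> str:
        return f"[cloud] response to: {prompt}"


# Registry keyed by a config value; adding a new backend is one entry.
MODELS = {"local": LocalSLM, "cloud": CloudLLM}


def build_model(name: str) -> TextModel:
    return MODELS[name]()


model = build_model("local")  # flip to "cloud" without touching callers
print(model.generate("Summarise the shift report"))
```

The payoff is exactly the scenario described above: when a disruptive new model appears, you add one backend class and change one config value, rather than rewriting the product.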

Customers are the key to unlocking scale and revenue, but you need paid POCs, and you need to move beyond POCs to production as soon as possible. AI on Edge can quickly become a very competitive space, so understanding your customers' pain points quickly and addressing them with your unique proposition will help you lock them in.

Q: How can AI at the edge transform industries like manufacturing, healthcare, logistics, and smart cities?
A: AI on Edge can drive innovation and unlock new use cases and delivery methods. It can lead to a faster, more efficient, and self-serve network of autonomous decision-making, without sacrificing data privacy and security.

Defect detection in manufacturing, powered by AI on Edge, helps teams address issues quickly without halting production. This has huge implications for production quality and quantity. AI on Edge can also enhance workplace safety while elevating staff productivity.

Similar challenges cut across manufacturing, construction, logistics, warehousing, agri-tech, and other industries, and AI on Edge can address them all.

Healthcare is likely the most critical industry where the risks of making the ‘wrong decision’ are high. With lives at stake and tight industry standards, the AI on Edge solutions must deliver high accuracy, ease of use, and real-time response.

The concept of smart cities has been around for a decade; however, we have yet to see traffic decongestion solutions (especially in Asian markets) that analyze traffic CCTV footage to reroute traffic in real time.

Note: all these insights are from the perspective of the panel moderator (Shivang Gupta) based on the conversations and insights shared by the panelists.

Connect with AI founders and builders in Asia, stay updated with the latest innovations in Asia, and join exciting events, with The Generative Beings community.

Join HP’s Garage 2.0 innovation program

HP Garage 2.0 is where startups and innovation come together. As an incubator designed to fuel groundbreaking ideas, it provides the resources, ecosystem access, and mentorship startups need to turn concepts into reality, while co-creating alongside HP's experienced teams.

If you are building an AI on Edge solution, you should consider applying for HP’s Garage 2.0 innovation program and fast-track your growth and validation.

Upcoming Events

Stay connected with our community events across Singapore, Malaysia, Indonesia, Thailand, and India. Soon, we will be running events in more countries across SEA.

Stay ahead of the curve with the latest in AI on Edge by joining The Generative Beings community.

Cheers,
Shivang
