
QANA: Bringing Advanced Skin Disease Diagnosis to Portable Devices with Neuromorphic AI

TL;DR: QANA is a new, energy-efficient AI architecture for skin disease classification on small, portable devices. It uses a quantization-aware design that allows it to be converted into a Spiking Neural Network (SNN) for deployment on neuromorphic hardware like BrainChip Akida. This enables high accuracy (over 90%), extremely low latency (1.5 ms), and minimal energy consumption (1.7 mJ per image), making advanced dermatological diagnosis accessible and privacy-preserving at the edge.

Diagnosing skin diseases accurately and efficiently, especially in remote or resource-limited settings, is a significant challenge in healthcare. Traditional deep learning models often require powerful cloud servers, raising concerns about data privacy and accessibility. Imagine a system that could provide expert-level dermatological care directly on a small, portable device, without needing to send sensitive patient data to the cloud. This is the vision behind QANA, a new approach designed to bring advanced skin lesion classification to the edge.

QANA, which stands for Quantization-Aware Neuromorphic Architecture, is a groundbreaking framework specifically developed for incremental skin lesion classification on hardware with limited resources. Unlike conventional deep learning models that consume significant computational power and energy, QANA is built to be highly efficient. It achieves this by integrating several clever components: ghost modules for diverse feature representation with minimal computation, efficient channel attention to focus on important visual cues, and squeeze-and-excitation blocks to refine channel information. These elements work together to ensure robust feature extraction while keeping latency and energy consumption remarkably low.
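The channel-recalibration idea behind the squeeze-and-excitation blocks can be sketched in a few lines. The version below is a generic, dependency-free illustration of the mechanism (pool each channel to a scalar, pass it through a tiny bottleneck, rescale the channel by the resulting gate); the weight shapes and function names are assumptions for illustration, not QANA's actual layers:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def squeeze_excite(feature_maps, weights1, weights2):
    """Toy squeeze-and-excitation block.

    feature_maps: list of channels, each a flat list of activations.
    weights1/weights2: the two bottleneck weight matrices (lists of rows).
    """
    # Squeeze: global average pooling gives one scalar per channel
    squeezed = [sum(ch) / len(ch) for ch in feature_maps]
    # Excite: tiny two-layer MLP (ReLU then sigmoid) produces per-channel gates
    hidden = [max(0.0, sum(w * s for w, s in zip(row, squeezed))) for row in weights1]
    gates = [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in weights2]
    # Rescale: amplify informative channels, suppress the rest
    return [[a * g for a in ch] for ch, g in zip(feature_maps, gates)]
```

With zero second-layer weights every gate is sigmoid(0) = 0.5, so each channel is simply halved; trained weights would instead learn which channels carry the important visual cues.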

One of QANA’s most innovative aspects is its compatibility with neuromorphic computing. Traditional deep learning models, known as Convolutional Neural Networks (CNNs), process information continuously. However, QANA is designed with a “quantization-aware head” and “spike-compatible transformations,” allowing it to be seamlessly converted into a Spiking Neural Network (SNN). SNNs mimic the human brain more closely by using discrete “spikes” of information, leading to extremely low power consumption and event-driven computation. This makes them ideal for deployment on specialized neuromorphic processors like BrainChip Akida, which are built for energy efficiency and can even support on-chip incremental learning, meaning the system can adapt to new data without needing a full retraining from scratch.
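One way to picture how a continuous activation becomes spike-compatible is simple rate coding: quantize the value to a fixed number of discrete levels, then emit a binary spike train whose firing rate tracks the quantized value. The sketch below is an illustrative assumption about the general idea, not Akida's actual conversion scheme (the `levels` and `timesteps` parameters are hypothetical):

```python
def rate_encode(intensity, timesteps, levels=16):
    """Toy rate coding of a normalized intensity in [0, 1].

    First quantize to `levels` discrete steps (the quantization-aware part),
    then emit a spike train whose rate is proportional to the quantized value.
    """
    q = round(intensity * (levels - 1)) / (levels - 1)
    n_spikes = round(q * timesteps)
    spikes = [0] * timesteps
    # Spread the spikes as evenly as possible across the time window
    for k in range(n_spikes):
        spikes[(k * timesteps) // n_spikes] = 1
    return spikes
```

Because computation is event-driven, a downstream spiking neuron only does work on the 1s; a mostly dark image region produces few spikes and therefore costs almost no energy.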

The development of QANA involved a comprehensive, end-to-end pipeline. It begins with meticulous data preprocessing, including screening images for quality, augmenting them to increase diversity, and using a technique called SMOTE to balance out rare lesion categories. Next, the core QANA network extracts features using its specialized Ghost, ECA, and SE blocks, ensuring the features are ready for conversion to spikes. The trained network is then automatically converted into an SNN using the Akida MetaTF toolkit, which handles the complex mapping of CNN operations to spike-based equivalents and quantizes the data for hardware compatibility. Finally, the SNN is deployed and optimized on the BrainChip Akida neuromorphic platform, enabling real-time, energy-efficient inference.
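The SMOTE balancing step mentioned above works by interpolating between a minority-class sample and one of its nearest neighbours to synthesize new examples. Here is a minimal pure-Python sketch of that idea (feature vectors as tuples; the neighbour count `k` and helper names are assumptions, not the paper's exact configuration):

```python
import random

def smote_oversample(minority, n_new, k=2, seed=0):
    """Toy SMOTE: synthesize n_new minority samples by linear interpolation
    between a random sample and one of its k nearest neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # k nearest neighbours by squared Euclidean distance (excluding base)
        neighbours = sorted(
            (p for p in minority if p is not base),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(base, p)),
        )[:k]
        nb = rng.choice(neighbours)
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + t * (b - a) for a, b in zip(base, nb)))
    return synthetic
```

Because every synthetic point lies on a segment between two real minority samples, the rare lesion classes gain plausible new examples rather than exact duplicates.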

The results of QANA’s evaluation are highly impressive. Tested on the large-scale HAM10000 benchmark dataset and a real-world clinical dataset, QANA consistently outperformed leading CNN-to-SNN models. On HAM10000, it achieved a Top-1 accuracy of 91.6% and a macro F1 score of 82.4%. On the clinical dataset, it reached 90.8% Top-1 accuracy and 81.7% macro F1. These figures demonstrate QANA’s strong ability to accurately classify various skin lesions, including less common types, even with limited training data.
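The two headline metrics are straightforward to compute. Top-1 accuracy is the fraction of images whose top prediction is correct, while macro F1 averages per-class F1 scores with equal weight, so rare lesion types count as much as common ones; a minimal sketch:

```python
def top1_accuracy(y_true, y_pred):
    """Fraction of samples whose predicted label matches the true label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    classes = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        denom = 2 * tp + fp + fn
        f1s.append(2 * tp / denom if denom else 0.0)
    return sum(f1s) / len(f1s)
```

The gap between QANA's 91.6% accuracy and 82.4% macro F1 on HAM10000 is exactly what this metric exposes: a model can be highly accurate overall while still finding the rare classes harder.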

Beyond accuracy, QANA truly shines in its efficiency. When deployed on BrainChip Akida hardware, it achieved an inference latency of just 1.5 milliseconds per image while consuming only 1.7 millijoules of energy per image. That is a reduction of over 94.6% in inference latency and over 98.6% in energy use compared to GPU-based CNNs. Such performance makes QANA an ideal candidate for portable diagnostic instruments, enabling real-time and privacy-sensitive medical analysis directly at the point of care.
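Those percentages imply a GPU baseline of roughly 28 ms and about 120 mJ per image; a quick back-of-envelope check (the baseline figures here are inferred from the stated reductions, not reported directly in this summary):

```python
def implied_baseline(edge_value, reduction_pct):
    """Back out the baseline implied by an edge measurement and a stated
    percentage reduction: baseline = edge / (1 - reduction)."""
    return edge_value / (1.0 - reduction_pct / 100.0)

latency_baseline_ms = implied_baseline(1.5, 94.6)  # roughly 28 ms per image
energy_baseline_mj = implied_baseline(1.7, 98.6)   # roughly 120 mJ per image
```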

In conclusion, QANA offers a compelling solution for accessible dermatological care on resource-constrained devices. By combining a novel quantization-aware architecture with the power of neuromorphic computing, it delivers high accuracy, ultra-low latency, and exceptional energy efficiency. This advancement paves the way for more widespread and privacy-preserving AI deployment in dermatology, bringing expert diagnostic capabilities to edge environments. For more in-depth technical details, you can refer to the full research paper: Quantization-Aware Neuromorphic Architecture for Efficient Skin Disease Classification on Resource-Constrained Devices.

Nikhil Patel
https://blogs.edgentiq.com
Nikhil Patel is a tech analyst and AI news reporter who brings a practitioner's perspective to every article. With prior experience working at an AI startup, he decodes the business mechanics behind product innovations, funding trends, and partnerships in the GenAI space. Nikhil's insights are sharp, forward-looking, and trusted by insiders and newcomers alike. You can reach him at: [email protected]
