Project Detail
In the ExplainableAD project, we address the problem of eXplainable Anomaly Detection (XAD) on data streams with evolving, previously unknown, complex anomaly types. To bridge the gap between growing demand in the data industry and the severely limited industry-grade solutions available today, we propose a breakthrough XAD system that offers unprecedented capabilities for safeguarding digital services. To this end, it seamlessly integrates deep learning-based anomaly detection (AD), which unlocks the power to detect evolving, previously unknown, complex anomaly types, with explanation discovery (ED), which overcomes the non-interpretability of deep learning by returning human-readable, actionable insights.

To establish a pathway from fundamental research to innovation, we will conduct (1) additional research activities on automated model selection, scalable execution, and data visualization; (2) prototyping, benchmarking analysis, and system demonstration; (3) large-scale use case studies with industry partners, leveraging ongoing collaborations and building new use cases across the banking, finance, and healthcare domains; and (4) market analysis and market entry strategy development to enable technology transfer.

Our XAD system will have a direct impact on end users across a variety of sectors: fraud analysts in banking and financial services will combat ever-growing, complex fraud types; IT departments and cloud services will manage malfunction and performance challenges proactively; and biomedical researchers and staff will benefit from enhanced diagnostic accuracy and improved confidence. The PI will bring extensive experience from her ERC CoG grant, including a large number of publications and a highly skilled team, to this project. With a proven track record of delivering cutting-edge solutions to critical challenges faced by enterprises and society, she will leverage her strong industry connections to facilitate the transition from research to innovation.