AI-Powered Multimodal Disaster Response Enhancement using Social Media Streams
Abstract
Social media serves as a valuable source of real-time disaster data, providing timely updates that are critical for effective disaster monitoring and response. It plays an essential role in raising awareness, enhancing preparedness, and facilitating prompt action during emergencies. While much current research focuses on disaster prediction, real-time updates, which are key to saving lives and coordinating response efforts, are often overlooked. Our project addresses this gap by collecting and analyzing real-time data from social media platforms, including disaster-related images and textual data such as tweets. We use multiple APIs to gather real-time data from platforms such as Instagram, Facebook, and Twitter. To analyze visual data, we employ state-of-the-art convolutional neural network models such as ResNet50 and EfficientNet for image classification. For textual data, advanced natural language processing models such as BERT and XLNet are used for sentiment analysis, event detection, and severity assessment. By fusing the image and text modalities, our system identifies the type of disaster, estimates its severity, and provides actionable insights.
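The abstract does not specify how the image and text predictions are combined. One common approach is late fusion, in which the per-class probabilities from the image classifier (e.g. ResNet50) and the text classifier (e.g. BERT) are merged by a weighted average. The sketch below illustrates that idea only; the class list, weight value, and function name are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical disaster taxonomy for illustration.
DISASTER_CLASSES = ["flood", "wildfire", "earthquake", "hurricane"]

def fuse_predictions(image_probs, text_probs, image_weight=0.5):
    """Late fusion of two modality-specific classifiers.

    image_probs: per-class probabilities from the image model
                 (e.g. ResNet50 or EfficientNet softmax output).
    text_probs:  per-class probabilities from the text model
                 (e.g. BERT or XLNet softmax output).
    image_weight: tunable hyperparameter balancing the two modalities.
    """
    image_probs = np.asarray(image_probs, dtype=float)
    text_probs = np.asarray(text_probs, dtype=float)
    # Weighted average of the two probability vectors, then renormalize.
    fused = image_weight * image_probs + (1.0 - image_weight) * text_probs
    fused /= fused.sum()
    label = DISASTER_CLASSES[int(np.argmax(fused))]
    confidence = float(fused.max())
    return label, confidence
```

For example, if the image model assigns 0.7 to "flood" and the text model assigns 0.6, equal weighting yields a fused "flood" confidence of 0.65. In practice the weight would be tuned on validation data, or replaced by a learned fusion layer over the two models' feature vectors.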