Description
Advancing Steerable LLM-powered Agents for Dynamic Multimodal Interactions. This project aims to develop advanced, interactive AI systems by leveraging large language models (LLMs) to create adaptive, multimodal agents capable of understanding and interacting through text, images, and audio. The research will advance decision-making, planning, and alignment with user preferences by developing new multimodal alignment and reinforcement learning techniques. Its significance lies in addressing current limitations to make AI adaptive, context-aware, and responsive in real-world settings. Expected outcomes include improved AI performance in dynamic environments such as manufacturing and education, with benefits spanning scholarly advances and practical applications in user-centred AI systems. Scheme: ARC Future Fellowships. Field: 4605 - Data Management and Data Science. Lead: Prof Lina Yao