The International Workshop on

Edge and Mobile Foundation Models (EdgeFM)

MobiSys 2024, Tokyo, Japan | June 7, 2024

Friday, June 7th (Location: 5F Hall A)
9:00 - 9:05 Opening (Host: Mengwei Xu)
9:05 - 9:50 Keynote 1 by Lili Qiu (Host: Mengwei Xu)
The future of healthcare powered by wireless sensing and AI
Abstract: The multifaceted nature of individual health can be captured using an array of physiological and behavioral indicators, including but not limited to respiratory patterns, cardiac rhythms, neural activity as evidenced by brain waves, articulation and linguistic nuances, kinesthetic dynamics, and the intricate details captured in medical imaging. This talk will present Microsoft Research Asia’s innovative wireless sensing and advanced machine learning technologies, which have the potential to transform daily health monitoring and revolutionize the accuracy and efficiency of disease diagnosis. Furthermore, the talk will share the valuable lessons we have learned through our collaboration with leading hospitals. In the end, I will briefly introduce other wireless research projects we have done at MSRA Shanghai.
Biography: Dr. Lili Qiu is an Assistant Managing Director of Microsoft Research Asia and is mainly responsible for overseeing the research, as well as the collaboration with industries, universities, and research institutes, at Microsoft Research Asia – Shanghai. She obtained her MS and PhD degrees in computer science from Cornell University. Dr. Qiu is an expert in Internet and wireless networking. In 2005, she joined the University of Texas at Austin as an assistant professor in the Department of Computer Science, and later, in view of her outstanding achievements in the Internet and wireless networking fields, she was promoted to a tenured professor and doctoral advisor. Dr. Qiu is an IEEE Fellow, an ACM Fellow, and an NAI Fellow. She also serves as the ACM SIGMOBILE chair. She was named an ACM Distinguished Scientist and was a recipient of the NSF CAREER award, among many other honors.
9:50 - 10:30 Session 1: LLM and Sensing (Host: Yuanchun Li)
ChainStream: A Stream-based LLM Agent Framework for Continuous Context Sensing and Sharing
Jiacheng Liu (Beijing Institute of Technology), Wenxing Xu (Beijing University of Posts and Telecommunications), Yuanchun Li (Institute for AI Industry Research (AIR), Tsinghua University)
[online] Are Large Language Models Capable of Causal Reasoning for Sensing Data Analysis?
Zhizhang Hu (University of California, Merced), Yue Zhang (University of California, Merced), Ryan Rossi (Adobe Research), Tong Yu (Adobe Research), Sungchul Kim (Adobe Research), Shijia Pan (University of California, Merced)
A Solution for Reducing MLLM-Based Agent Interaction Overhead
Wenjie Li (Huawei Technologies Co., Ltd.), Xiaoyang Liu (Huawei Technologies Co., Ltd.), Zihao Zheng (Huawei Technologies Co., Ltd.), Jishun Wang (Huawei Technologies Co., Ltd.), Kang Ling (Huawei Technologies Co., Ltd.), Ming Fu (OS Kernel Lab, Huawei Technologies)
10:30 - 11:00 Coffee Break
11:00 - 11:40 Session 2: LLM for Mobile Scenarios (Host: Yuanchun Li)
Large Language Models on Mobile Devices: Measurements, Analysis, and Insights
Xiang Li (Beijing University of Posts and Telecommunications), Zhenyan Lu (Beijing University of Posts and Telecommunications), Dongqi Cai (Beijing University of Posts and Telecommunications), Xiao Ma (Beijing University of Posts and Telecommunications), Mengwei Xu (Beijing University of Posts and Telecommunications)
[online] Hybrid SLM and LLM for Edge-Cloud Collaborative Inference
Zixu Hao (Tsinghua University, Microsoft Research), Huiqiang Jiang (Microsoft Research), Shiqi Jiang (Microsoft Research), Ju Ren (Tsinghua University), Ting Cao (Microsoft Research)
An On-device LLM-based Approach to Query Privacy Protection
Yizhen Yuan (Institute for AI Industry Research (AIR), Tsinghua University), Rui Kong (Shanghai Jiao Tong University), Yuanchun Li (Institute for AI Industry Research (AIR), Tsinghua University), Yunxin Liu (Institute for AI Industry Research (AIR), Tsinghua University)
11:40 - 12:30 Panel: When LLM meets mobile and edge computing
Ting Cao (Microsoft Research)
Felix Xiaozhu Lin (University of Virginia)
Mi Zhang (The Ohio State University)
Yuanchun Li (Tsinghua University)
Host: Steven Y. Ko
14:00 - 14:45 Session 3: LLM for Mobile Hardware (Host: Ting Cao)
Towards a Task-agnostic Distillation Methodology for Creating Edge Foundation Models
Swarnava Dey (TCS Research, Tata Consultancy Services), Arijit Mukherjee (TCS Research, Tata Consultancy Services), Arijit Ukil (TCS Research, Tata Consultancy Services), Arpan Pal (TCS Research, Tata Consultancy Services)
Towards Light Adaptation of Large Language Models For Personal Hardware
Liangyu Wang (King Abdullah University of Science and Technology), Junxiao Wang (King Abdullah University of Science and Technology), Di Wang (King Abdullah University of Science and Technology)
Efficient LLM Prefilling with Mobile NPU
Daliang Xu (Peking University), Hao Zhang (Beijing Jiaotong University), Liming Yang (Peking University), Ruiqi Liu (Peking University), Mengwei Xu (Beijing University of Posts and Telecommunications), Xuanzhe Liu (Peking University)
14:45 - 15:30 Keynote 2 by Nic Lane (Host: Ting Cao)
The Future of Large Language Models (and AI) is Federated
Abstract: As established scaling laws indicate, the future performance improvements of LLMs depend on the amount of compute and data we can leverage. Where will we get the necessary compute and data to drive the continued advances in LLMs that the world has now grown to expect? I believe all roads lead to federated learning. Federated and de-centralized approaches to machine learning will be how the strongest LLMs (and foundation models more generally) are trained in the relatively near future, and in time, we will see federated learning become one of the core enablers of the entire AI revolution. In this talk, I will describe why the future of AI will be federated, and describe early solutions developed by Flower and CaMLSys that address the underlying technical challenges as the world shifts from a centralized data-center mindset to de-centralized alternatives that can facilitate the continued scaling of AI capabilities.
Biography: Nic Lane (http://niclane.org) is a full Professor in the Department of Computer Science and Technology, holds a Royal Academy of Engineering Chair in De-centralized AI, and is a Fellow of St. John's College, at the University of Cambridge. Nic also leads the Cambridge Machine Learning Systems Lab (CaMLSys -- http://mlsys.cst.cam.ac.uk/). Alongside his academic appointments, he is the co-founder and Chief Scientific Officer of Flower Labs (https://flower.dev/), a venture-backed AI company (YC W23) behind the Flower framework. Nic has received multiple best paper awards, including ACM/IEEE IPSN 2017 and two from ACM UbiComp (2012 and 2015). In 2018 and 2019, he (and his co-authors) received the ACM SenSys Test-of-Time award and the ACM SIGMOBILE Test-of-Time award for pioneering research, performed during his PhD, that devised machine learning algorithms used today on devices like smartphones. Nic was the 2020 ACM SIGMOBILE Rockstar award winner for his contributions to “the understanding of how resource-constrained mobile devices can robustly understand, reason and react to complex user behaviors and environments through new paradigms in learning algorithms and system design.”

Supported by:

SIGMOBILE, SIGBED China Chapter