Introduction


Talk Overview
Talk Outline
Part 1: Background on Large Models for Time Series and Spatio-Temporal Data
Part 2: Talk Topics for This Season's Reading Group
Topic 1: Large Models for Time Series (Di Yao)
A survey of large models for time series and spatio-temporal data
Large-model-empowered time series analysis
Foundation models for time series
Topic 2: Large Models for Spatio-Temporal Data (Yuxuan Liang)
Large models as spatio-temporal enhancers (LLM-as-Enhancer)
Large models as spatio-temporal predictors (LLM-as-Predictor)
Large models as spatio-temporal agents (LLM-as-Agent)
Topic 3: Large Models for Trajectories (Hao Xue)
Trajectory representation learning
Large-model applications: POI trajectory prediction, urban mobility simulation and generation
Part 3: Paper Presentation on Time-LLM
Background on large models for time series
Introduction to Time-LLM
Design motivation
Framework architecture
Experimental results
Other representative time-series large-model works
e.g. OFA, LLMTime, TimesFM
Outlook on future directions
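Time-LLM, presented above, reprograms a frozen LLM for forecasting by cutting a series into patches and mapping them into the model's embedding space. The sketch below illustrates only that general patch-and-project idea, not the paper's actual implementation: the real reprogramming layer cross-attends to text prototypes and the backbone is a pretrained LLM, whereas here the backbone is an identity map, the projections are untrained random matrices, and every dimension (`SERIES_LEN`, `PATCH_LEN`, `D_MODEL`, `HORIZON`) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions chosen for illustration only.
SERIES_LEN, PATCH_LEN, STRIDE, D_MODEL, HORIZON = 96, 16, 8, 32, 24

def patchify(series: np.ndarray, patch_len: int, stride: int) -> np.ndarray:
    """Slice a univariate series into overlapping patches."""
    starts = range(0, len(series) - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# 1. Instance-normalize the input series, then cut it into patches.
series = np.sin(np.linspace(0, 8 * np.pi, SERIES_LEN))
series = (series - series.mean()) / (series.std() + 1e-8)
patches = patchify(series, PATCH_LEN, STRIDE)   # shape: (11, 16)

# 2. "Reprogram" each patch into the backbone's embedding space via a
#    projection (a random matrix standing in for trained weights).
W_proj = rng.normal(scale=0.1, size=(PATCH_LEN, D_MODEL))
token_embeds = patches @ W_proj                 # shape: (11, 32)

# 3. Pass the embeddings through a frozen backbone. A real setup would call
#    a pretrained LLM; an identity map keeps this sketch dependency-free.
hidden = token_embeds

# 4. Flatten and linearly project to the forecast horizon (output head).
W_out = rng.normal(scale=0.1, size=(hidden.size, HORIZON))
forecast = hidden.reshape(-1) @ W_out           # shape: (24,)
```

The key design point this mirrors is that only the input and output projections would be trained; the backbone stays frozen, which is what makes the approach cheap relative to full fine-tuning.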
Part 4: Roundtable Discussion and Summary by the Organizers
Core Concepts
Time Series
Spatio-Temporal Data
Large Language Model
Foundation Model
Deep Learning
Trajectory
About the Hosts
(1) Yuxuan Liang

(2) Di Yao

(3) Hao Xue

(4) Ming Jin

(5) Qingsong Wen

Key References for This Session
Paper walkthrough post: How do large models model time series? The latest survey and outlook, "Large Models for Time Series and Spatio-Temporal Data", from Monash University and collaborators
https://zhuanlan.zhihu.com/p/665409859
Additional References
Y. Zheng et al. Urban Computing: Concepts, Methodologies, and Applications. TIST 2014.
G. Jin et al. Spatio-Temporal Graph Neural Networks for Predictive Learning in Urban Computing: A Survey. TKDE 2023.
Y. Liang et al. Modelling Trajectories with Neural Ordinary Differential Equations. IJCAI 2021.
Y. Liang et al. TrajFormer: Efficient Trajectory Classification with Transformers. CIKM 2022.
Y. Liang et al. Fine-grained Urban Flow Prediction. WWW 2021.
Y. Xia et al. Deciphering Spatio-Temporal Graph Forecasting: A Causal Lens and Treatment. NeurIPS 2023.
Y. Liang et al. AirFormer: Predicting Nationwide Air Quality in China with Transformers. AAAI 2023.
M. Jin et al. Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook. arXiv 2023.
M. Jin et al. Position Paper: What Can Large Language Models Tell Us about Time Series Analysis. arXiv 2024.
Y. Liang et al. Foundation Models for Time Series Analysis: A Tutorial and Survey. arXiv 2024.
Y. Liang et al. Exploring Large Language Models for Human Mobility Prediction under Public Events. arXiv 2023.
X. Xu et al. Temporal Data Meets LLM - Explainable Financial Time Series Forecasting. arXiv 2023.
Q. Zhang et al. Spatio-Temporal Graph Learning with Large Language Model. OpenReview 2023.
S. Zhang et al. TrafficGPT: Viewing, Processing and Interacting with Traffic Foundation Models. arXiv 2023.
Y. Yan et al. UrbanCLIP: Learning Text-enhanced Urban Region Profiling with Contrastive Language-Image Pretraining from the Web. WWW 2024.
M. Jin et al. Time-LLM: Time Series Forecasting by Reprogramming Large Language Models. ICLR 2024.
X. Liu et al. UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting. WWW 2024.
C. Liu et al. Spatial-Temporal Large Language Model for Traffic Prediction. arXiv 2024.
X. Wang et al. Where Would I Go Next? Large Language Models as Human Mobility Predictors. arXiv 2024.
X. Liu et al. LargeST: A Benchmark Dataset for Large-Scale Traffic Forecasting. NeurIPS 2023.
Z. Zhou et al. Large Language Model Empowered Participatory Urban Planning. arXiv 2024.
S. Lai et al. LLMLight: Large Language Models as Traffic Signal Control Agents. arXiv 2024.
M. Jin et al. A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection. arXiv 2023.
H. Xue and F. D. Salim. PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting. arXiv 2022.
T. Zhou et al. One Fits All: Power General Time Series Analysis by Pretrained LM. arXiv 2023.
C. Sun et al. TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series. arXiv 2023.
C. Chang et al. LLM4TS: Two-Stage Fine-Tuning for Time-Series Forecasting with Pre-Trained LLMs. arXiv 2023.
D. Cao et al. TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting. arXiv 2023.
S. Palaskar et al. AutoMixer for Improved Multivariate Time-Series Forecasting on BizITOps Data. arXiv 2023.
A. Garza and M. Mergenthaler-Canseco. TimeGPT-1. arXiv 2023.
K. Rasul et al. Lag-Llama: Towards Foundation Models for Time Series Forecasting. arXiv 2023.
G. Woo et al. Unified Training of Universal Time Series Forecasting Transformers. arXiv 2024.
M. Goswami et al. MOMENT: A Family of Open Time-Series Foundation Models. arXiv 2024.
S. Gao et al. UniTS: Building a Unified Time Series Model. arXiv 2024.
A. F. Ansari et al. Chronos: Learning the Language of Time Series. arXiv 2024.

Livestream Information
Time:
How to join:
1. Free livestream on the Swarma Club (集智俱乐部) Bilibili channel; scan the QR code to reserve a spot.

Apply to be a speaker:
The reading group on large models for time series and spatio-temporal data is now recruiting

Click "Read the original" to sign up for the reading group
If any images in this post involve copyright issues, please contact us promptly for removal