TY - GEN
T1 - CityLifeSim
T2 - 2nd IEEE International Conference on Intelligent Reality, ICIR 2022
AU - Wang, Cheng Yao
AU - Nir, Oron
AU - Vemprala, Sai
AU - Kapoor, Ashish
AU - Ofek, Eyal
AU - McDuff, Daniel
AU - Gonzalez-Franco, Mar
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022/1/1
Y1 - 2022/1/1
N2 - Simulations are powerful tools, particularly for safety-critical scenarios. However, simulating complex temporal events in multi-agent scenarios with vehicles and pedestrians, such as those found in urban environments, is challenging. We present CityLifeSim, a simulation for the research community that focuses on rich pedestrian behavior, such as that which arises when different personalities, environmental events, and group goals are simulated. In our simulations we can observe people jaywalking at a red light, sitting on a bench, waiting for the bus, or talking on the phone, as well as more complex creation and management of crowds that may line up or keep moving while observing interpersonal distances. CityLifeSim is configurable and can create unlimited scenarios with detailed logging capabilities. As a demonstration, we have run CityLifeSim to create a demo dataset for training setups that includes 17 different cameras, views from a moving vehicle in the street under different weather conditions (rain, snow, sun), and from a drone with frontal and downward views. All content is released with the corresponding original configuration files, ground-truth pedestrian segmentation, and RGB-D frames. We evaluate our dataset on a pedestrian detection and identification task with a state-of-the-art Multi-Object Tracker (MOT), showing the limitations and opportunities for synthetic data in this use case.
AB - Simulations are powerful tools, particularly for safety-critical scenarios. However, simulating complex temporal events in multi-agent scenarios with vehicles and pedestrians, such as those found in urban environments, is challenging. We present CityLifeSim, a simulation for the research community that focuses on rich pedestrian behavior, such as that which arises when different personalities, environmental events, and group goals are simulated. In our simulations we can observe people jaywalking at a red light, sitting on a bench, waiting for the bus, or talking on the phone, as well as more complex creation and management of crowds that may line up or keep moving while observing interpersonal distances. CityLifeSim is configurable and can create unlimited scenarios with detailed logging capabilities. As a demonstration, we have run CityLifeSim to create a demo dataset for training setups that includes 17 different cameras, views from a moving vehicle in the street under different weather conditions (rain, snow, sun), and from a drone with frontal and downward views. All content is released with the corresponding original configuration files, ground-truth pedestrian segmentation, and RGB-D frames. We evaluate our dataset on a pedestrian detection and identification task with a state-of-the-art Multi-Object Tracker (MOT), showing the limitations and opportunities for synthetic data in this use case.
KW - causal ML
KW - dataset
KW - pedestrian simulation
KW - self-driving cars
UR - https://www.scopus.com/pages/publications/85151915183
U2 - 10.1109/ICIR55739.2022.00018
DO - 10.1109/ICIR55739.2022.00018
M3 - Conference contribution
AN - SCOPUS:85151915183
T3 - Proceedings - 2022 IEEE 2nd International Conference on Intelligent Reality, ICIR 2022
SP - 11
EP - 16
BT - Proceedings - 2022 IEEE 2nd International Conference on Intelligent Reality, ICIR 2022
PB - Institute of Electrical and Electronics Engineers
Y2 - 14 December 2022 through 16 December 2022
ER -