Ever struggled with multi-sensor data from cameras, depth sensors, and other custom sensors? Meet AnySense—an iPhone app for effortless data acquisition and streaming. Working with multimodal sensor data will never be a chore again!
Even if you don’t have external sensors, you can start using AnySense immediately to record and stream:
✅ RGB + Depth + Pose data
✅ Audio from the iPhone mic or custom contact microphones
✅ Seamless Bluetooth integration for external sensors
Need to connect external sensors? No problem! AnySense supports data streaming over Bluetooth at the press of a button! Here’s a visualization of data collected by connecting the AnySkin (https://any-skin.github.io) tactile sensor with AnySense via Bluetooth.
Why does this matter? Here, we use AnySense to scale data and train visuo-tactile policies using Robot Utility Models (https://robotutilitymodels.com) for a whiteboard erasing task. With AnySense-enabled live streaming, you can just plug your iPhone into your robot and seamlessly deploy your policies!
AnySense is built to empower researchers, engineers, and developers with better tools for sensor-based AI. Our code is fully open-source; see the links below.
Making an app was uncharted territory for us as a research group, and it would not have been possible without my wonderful collaborators: Zeyu (Michael) Bian, @venkyp.bsky.social, @haritheja.bsky.social, @eneserciyes.bsky.social, @notmahi.bsky.social, and @lerrelpinto.com.
Github: https://github.com/NYU-robot-learning/AnySense
Website: https://AnySense.app