Advances in microscopy mean we can now do more than just observe biology: we can control it. But how far can we really push this in mammalian cells with all their (beautiful but annoying) heterogeneity? 🧪🔬 (🧵)
In our recent preprint, we address this question by combining smart microscopy with optogenetics into a platform for 'outcome-driven' microscopy: precisely controlling cells to bring them to the same outcome, despite all the variation. https://www.biorxiv.org/content/10.1101/2024.12.12.628240v1
First, we had to build the smart microscopy platform. We chose a modular structure, with plug-and-play modules for image analysis, control, and microscope bridging, swapped in as required by the user. All of this is coordinated by a user-specified 'outcome-driven strategy', with a handy UI for ease of use.
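Conceptually, the modular design boils down to one acquire → analyse → decide → actuate loop. Here's a minimal sketch of that idea; all names are hypothetical, not our actual API:

```python
# Hypothetical sketch of the modular loop (not the preprint's real code):
# analysis, control, and microscope-bridge modules are swappable, and an
# outcome-driven strategy coordinates one acquire -> analyse -> act cycle.
from dataclasses import dataclass
from typing import Protocol


class Analyzer(Protocol):
    def analyze(self, image): ...

class Controller(Protocol):
    def decide(self, measurement): ...

class MicroscopeBridge(Protocol):
    def acquire(self): ...
    def illuminate(self, pattern): ...


@dataclass
class OutcomeStrategy:
    analyzer: Analyzer
    controller: Controller
    scope: MicroscopeBridge

    def step(self):
        # One closed loop: acquire -> analyse -> decide -> actuate.
        image = self.scope.acquire()
        measurement = self.analyzer.analyze(image)
        pattern = self.controller.decide(measurement)
        self.scope.illuminate(pattern)
        return measurement


# Dummy plug-ins, standing in for real segmentation / optics modules.
class DummyScope:
    last_pattern = None
    def acquire(self): return 21
    def illuminate(self, pattern): self.last_pattern = pattern

class Double:
    def analyze(self, image): return image * 2

class Echo:
    def decide(self, measurement): return measurement


scope = DummyScope()
strategy = OutcomeStrategy(Double(), Echo(), scope)
result = strategy.step()
```

Because each module only has to satisfy a small interface, any piece (segmentation model, controller, microscope driver) can be replaced without touching the rest.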
We first tested this with directed cell migration, with the 'outcome' being to guide cells along specific migration paths. Using optogenetic recruitment of TIAM1 (Coppey lab), we could automatically update an area of illumination and precisely guide cells along predefined paths, like tiny little robots! 🤖
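The guidance loop can be sketched roughly like this (hypothetical names and parameters, not our implementation): each iteration, find the cell centroid, aim at the next waypoint on the predefined path, and place the light spot on the cell edge facing it, since TIAM1 recruitment biases protrusion toward the illuminated side:

```python
# Illustrative sketch of one guidance-loop iteration (assumed parameters).
import math


def next_spot(centroid, waypoints, reached_tol=5.0, spot_offset=10.0):
    """Return (illumination_spot, remaining_waypoints) for one control loop."""
    # Drop waypoints the cell has already reached.
    while waypoints and math.dist(centroid, waypoints[0]) < reached_tol:
        waypoints = waypoints[1:]
    if not waypoints:
        return None, waypoints  # outcome reached: path completed
    cx, cy = centroid
    tx, ty = waypoints[0]
    d = math.dist(centroid, (tx, ty))
    # Unit vector toward the waypoint; illuminate just ahead of the cell.
    ux, uy = (tx - cx) / d, (ty - cy) / d
    return (cx + spot_offset * ux, cy + spot_offset * uy), waypoints
```

For example, a cell at the origin heading for a waypoint at (100, 0) would get its illumination spot placed 10 px ahead of the centroid, along the path.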
If we changed the irradiance over time (in between loops), we could even slow the cells down and speed them up, demonstrating even finer control by making use of the dose-responsiveness of optogenetics.
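In the simplest form, this is just a per-loop irradiance schedule; a toy sketch (values purely illustrative, higher dose → faster migration):

```python
# Illustrative only: alternate slow (low-dose) and fast (high-dose) phases
# between control loops, exploiting optogenetic dose-responsiveness.
def irradiance_at(loop_index, low=0.2, high=1.0, period=20):
    """Return the irradiance to apply on a given control loop."""
    phase = (loop_index // period) % 2  # 0 = slow phase, 1 = fast phase
    return high if phase else low
```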
Of course, what's the use of making little cell robots if you can't control a few at once, with some communication between them? Here, we added an 'active avoidance' system, giving each cell a searchlight and temporarily reversing its migration when a collision is imminent.
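The avoidance rule can be sketched like so (illustrative geometry only, with assumed cone and range parameters): scan a "searchlight" cone ahead of the cell's heading, and if another cell falls inside it, temporarily reverse the guidance direction:

```python
# Hedged sketch of the 'active avoidance' idea (not our actual parameters).
import math


def avoidance_heading(pos, heading, others, cone_deg=60.0, reach=50.0):
    """Return the heading to steer toward: reversed if a collision is imminent."""
    hx, hy = math.cos(heading), math.sin(heading)
    for ox, oy in others:
        dx, dy = ox - pos[0], oy - pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > reach:
            continue  # outside the searchlight's range
        # Angle between our heading and the direction to the other cell.
        cos_a = max(-1.0, min(1.0, (dx * hx + dy * hy) / dist))
        if math.degrees(math.acos(cos_a)) < cone_deg / 2:
            return heading + math.pi  # reverse: back away until the path clears
    return heading
```

A cell 30 px dead ahead triggers a reversal; one 80 px away (beyond the searchlight's reach) is ignored.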
To push outcome-driven microscopy further on the controller side, we applied the same approach to LEXY, the light-inducible nuclear export system (Di Ventura lab), aiming to hold precise protein concentrations in the nucleus or the cytosol by placing irradiance under PID control.
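The PID idea in a nutshell: each imaging loop, measure the nuclear concentration and adjust the irradiance toward the setpoint. A minimal sketch with a toy "cell" model, where (as with LEXY) more light means more nuclear export; gains and rate constants are made up for illustration:

```python
# Textbook PID controller driving a toy nuclear-export model (not our code).
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt=1.0):
        # Light drives export, lowering nuclear concentration, so we raise
        # irradiance when the measurement sits above the setpoint.
        error = measurement - self.setpoint
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv


# Toy plant standing in for the cell: import refills the nucleus,
# light-driven export empties it.
pid = PID(kp=0.5, ki=0.1, kd=0.0, setpoint=0.3)
conc = 1.0  # normalized nuclear concentration
for _ in range(400):
    irradiance = max(0.0, pid.update(conc))
    conc += (1.0 - conc) * 0.05 - 0.1 * irradiance * conc
```

The integral term is what lets the controller hold the concentration at the setpoint despite cell-to-cell differences in expression level and export kinetics.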