lhinderling.bsky.social
tries to make microscopes smarter · bioimage analysis, optogenetics, ml, 3d printing, open science · phd student in cellular signalling dynamics @PertzLab
51 posts 1,801 followers 1,253 following
comment in response to post
will check it out thanks!!
comment in response to post
closest thing i've printed is traps for parasites (see pic). the catch features in the paper you linked are close to the limit of what's possible with our setup, but should be doable. never tried valves; those require stacking multiple layers rather than direct printing, so the process should be similar?
comment in response to post
yes, we re-purpose our microscopes already set up for optogenetics, no modifications! very similar hardware requirements
comment in response to post
and the simplified protocol we present should be compatible with alveole hardware. especially the complete platform alveole recently introduced could be a good solution for labs that want to try the protocol and are looking to buy a dedicated microfabrication microscope. happy to hear your inputs!
comment in response to post
yes great question, tried to address this a bit in the paper... i never got the chance to try the system, but maybe think of our hardware/software stack as an alternative open-source approach, compatible with microscopes people might already have in their labs. 1/2
comment in response to post
many people involved in this project, incredibly grateful for such amazing colleagues and mentors ✨ special huge thank you to Pertz Lab and MSc students Remo, Moritz and Joël!
comment in response to post
PAPER: detailed methods with tips and tricks, more examples and QC in the pre-print! happy for feedback and questions! www.biorxiv.org/content/10.1...
comment in response to post
as everything is controlled from python, automating prints becomes really easy; for example, it simplifies exposure calibration. excited to see what else is possible, especially with image feedback.
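to give an idea of what scripted exposure calibration could look like: a minimal sketch in python, sweeping exposure times and picking the shortest one that yields a sufficiently cross-linked feature. the function names and the saturation model are hypothetical stand-ins, not the actual project code.

```python
import math

def calibrate_exposure(measure, exposures, target=0.5):
    """Return the shortest exposure time whose measured feature
    intensity reaches `target` (fraction of saturation).
    `measure(t)` stands in for one print-and-image cycle."""
    for t in sorted(exposures):
        if measure(t) >= target:
            return t
    return None  # no tested exposure reached the target

# toy stand-in for a real print-and-measure cycle:
# cross-linking saturates with dose, 1 - exp(-t / tau)
fake_measure = lambda t: 1 - math.exp(-t / 2.0)

best = calibrate_exposure(fake_measure, [0.5, 1, 2, 4, 8])
print(best)  # shortest exposure crossing the threshold
```

on real hardware the `measure` callback would trigger a print, acquire an image, and quantify the feature, which is exactly the kind of image-feedback loop the comment hints at.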
comment in response to post
CODE: if you have a micro-manager compatible microscope with a DMD/SLM, UV light source and XY-stage, you have all the hardware required to try it yourself! code on github: github.com/hinderling/f... don't have a DMD microscope? stay tuned for our hardware project! 🛠️
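for designs larger than a single DMD field of view, the XY stage lets you tile exposures across the substrate. a minimal sketch of that tiling idea in python; this is an illustration only, not the code from the linked github repo, and all names here are made up.

```python
import math

def tile_positions(width_um, height_um, fov_um):
    """XY stage positions (in um) to cover a design larger than
    one DMD field of view, in snake order to minimize travel."""
    nx = math.ceil(width_um / fov_um)
    ny = math.ceil(height_um / fov_um)
    positions = []
    for row in range(ny):
        # alternate scan direction every row (boustrophedon)
        cols = range(nx) if row % 2 == 0 else reversed(range(nx))
        for col in cols:
            positions.append((col * fov_um, row * fov_um))
    return positions

# a 2x2 tiling of a 1000 x 800 um design with a 500 um FOV
print(tile_positions(1000, 800, 500))
```

in practice each position would be followed by moving the stage, loading the corresponding mask tile onto the DMD, and triggering the UV exposure, e.g. through a micro-manager control layer.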
comment in response to post
we also printed circular microchambers, so worms cannot wiggle away. video shows a worm's entire lifecycle, from hatching to laying eggs. grainy texture is bacteria food source. collab with @betowbin.bsky.social Lab next door.
comment in response to post
micro pillars and pits for cells to crawl on. @bgraedel.bsky.social managed to get down to 1um sized features!
comment in response to post
microfluidic chips to study how cells squeeze through tight gaps. movie shows actin (confocal slice) and brightfield. look at these blebs 💕
comment in response to post
this made printing new microstructures so easy that we started applying it to projects all over the lab where we previously didn't even consider it!
comment in response to post
FAST: from idea to prototype within a day. we created a simple protocol combining ideas from the literature and adding our own tricks, with a focus on replacing expensive and toxic chemicals: SU8 → consumer 3D printing resin, chlorosilane → methanol, silicon wafers → glass slides...
comment in response to post
I would also very much second your opinion. Inkscape is amazing and is constantly improving. Here is also a link to a video from the Halfway to I2K conference showing the combination of ImageJ/Fiji with Inkscape for proper editing and high-quality figure output. Hope it helps somebody!
comment in response to post
i remember how this blew my mind when i first saw it hahaha so well done
comment in response to post
Thing 3 to gain attention and followers: This was #NotTheCover of JCB - 3D rendering of a bunch of cancer cells imaged in zebrafish, pseudocolored by morphotype (e.g. shape classification)
comment in response to post
There are several, our own paper was the easiest to find ;-) elifesciences.org/articles/84364
comment in response to post
Here is a movie locally activating RhoA from our paper doi.org/10.7554/eLif... See fig. 4 for details. Our OptoLARG construct (pB3.0-optoLARG-mVenus-SspB-p2A-stargazin-mtq2-iLID) is adapted from this publication: doi.org/10.1038/ncom...
comment in response to post
Klaus Hahn has some classic movies of manually directing migration: tinyurl.com/49ftza3h. The Weiner lab use HL60 (v fast) continually illuminating one side: tinyurl.com/mryvy9m7 . Here is an example of our automatic guidance of HT1080 (slower) to specific paths, look out for a bioRxiv upload soon!🌀
comment in response to post
@jpassmore.bsky.social has beautiful migration with fast HT1080? Here are some slow & sticky REF52 from our lab: bsky.app/profile/lhin...
comment in response to post
shows the optogenetic stimulation mask 🔦 www.biorxiv.org/content/10.1...
comment in response to post
It may be time to face the fact that, after 300 years,
A better way of doing things occasionally appears.
In terms of classification, microscopists we should perhaps allege
Now be replaced by computers which have not been trained to hedge.