Part of what’s really interesting in thinking about new computational microscopes has to do with ideas popularized centuries ago, in texts like Goethe’s The Metamorphosis of Plants – that systems at different scales tell us different things about the universe.

We get more detail on some of those ideas from Bill Freeman at IIA as he talks about a motion microscope – a tool that amplifies motions too small for the eye to see.

The example he opens with hits parents particularly hard – if you’ve ever struggled to see whether your baby is breathing, Freeman shows how one of these tools can make that tiny motion plainly visible. Check out the demo!

It’s a fair question to ask whether, with new tools like this, humans might get lazy about their own direct observations, but it’s not hard to imagine this being useful, as Freeman points out, in many different kinds of applications.

Look at the slide where he talks about measuring local phase with sine- and cosine-phase wavelets, and how to amplify and reconstruct that data.

“Normally, an image is represented by pixels: you can do a transformation and represent an image instead, as a sum of lots and lots of little wavelets, little tiny ripples at different positions, different phases, different orientations and different scales, okay. But if you have a pair of those wavelets, a sine phase and a cosine phase, the change in the ratio of the sine to the cosine phase tells you about perhaps a very small motion that’s going on at that location, orientation and scale.”

It’s also worth studying the sub-patterns in the visual interaction of the sine- and cosine-phase wavelets in his visualization, which gives some intuition for how the technique works.
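Freeman’s phase idea can be sketched in one dimension: project a signal onto a cosine-phase and a sine-phase wavelet, and the arctangent of the sine-to-cosine ratio gives the local phase, whose change between frames reveals a sub-sample motion. This is a minimal numpy sketch under those assumptions, not Freeman’s actual implementation; all function names and parameters are illustrative.

```python
import numpy as np

def quadrature_pair(x, center, freq, sigma):
    """Cosine- and sine-phase Gabor wavelets at one position and scale (1-D)."""
    envelope = np.exp(-0.5 * ((x - center) / sigma) ** 2)
    carrier = 2 * np.pi * freq * x
    return envelope * np.cos(carrier), envelope * np.sin(carrier)

x = np.linspace(0.0, 1.0, 1000)
freq, sigma = 10.0, 0.1
cos_w, sin_w = quadrature_pair(x, center=0.5, freq=freq, sigma=sigma)

def local_phase(signal):
    # "The ratio of the sine to the cosine phase": arctan2 of the two projections.
    return np.arctan2(signal @ sin_w, signal @ cos_w)

shift = 0.001                                      # a tiny, sub-sample motion
frame1 = np.cos(2 * np.pi * freq * x)              # first "frame"
frame2 = np.cos(2 * np.pi * freq * (x - shift))    # second "frame", barely shifted

dphi = local_phase(frame2) - local_phase(frame1)
# For a pure sinusoid the phase change equals 2*pi*freq*shift,
# so the invisible shift is read straight off the wavelet pair.
print(dphi)                      # ≈ 0.0628
print(2 * np.pi * freq * shift)  # 0.0628...
```

Magnifying the motion then amounts to scaling that phase change before reconstructing; the full 2-D method does this at every position, orientation, and scale.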

Calling the concept both “brain-dead simple” and “rock solid,” Freeman talks about selecting a temporal frequency and applying the transformations frame by frame.
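The temporal side can be sketched on a single pixel’s time series: isolate a band of temporal frequencies and multiply just that band by an amplification factor before adding it back. An FFT-based band boost is used here purely for brevity; the real system’s filters differ, and the names and numbers below are illustrative.

```python
import numpy as np

fps = 30.0
t = np.arange(0, 10, 1 / fps)                 # 10 s of "video" at 30 fps
wobble = 0.2 * np.sin(2 * np.pi * 2.0 * t)    # a tiny 2 Hz oscillation
pixel = 100.0 + wobble                        # one pixel's intensity over time

def amplify_band(series, fps, lo_hz, hi_hz, alpha):
    """Boost temporal frequencies in [lo_hz, hi_hz] Hz by a factor alpha."""
    spectrum = np.fft.rfft(series)
    freqs = np.fft.rfftfreq(len(series), d=1 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    spectrum[band] *= alpha                   # amplify only the chosen band
    return np.fft.irfft(spectrum, n=len(series))

out = amplify_band(pixel, fps, lo_hz=1.5, hi_hz=2.5, alpha=10.0)
print(np.ptp(pixel))   # ~0.4: the original wobble is barely there
print(np.ptp(out))     # ~4.0: the 2 Hz band is now ten times larger
```

This is also how a structure’s resonance can be singled out: tune the band to the “temporal frequency corresponding to one of the torsional modes” and only that vibration gets magnified.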

If you’re queasy about using American infrastructure, the next part might give you a little heartburn, as Freeman shows vibrations at a larger scale. Pay particular attention as he describes a bridge in New England and the “temporal frequency corresponding to one of the torsional modes,” showing how the motion amplifier can help with safety efforts.

“That (motion) information was in the original video, but you just couldn’t see it because it was too small to visualize,” he says. “And this lets us see that vibration that’s going on. Now, in this case, it wasn’t a dangerous vibration. But our microscope lets us see it, lets us analyze it.”

He also points out how you can analyze people, starting with an example based on his daughter and moving on to examples like a human eye and a ballet dancer.

“When a ballet dancer stands on one foot and stands motionless, you know, there’s control systems going on there,” he explains. “You know, if you just really froze her, she would fall over…. So … the left is the input video, and the right is motion magnifying it, revealing all the little controls that a dancer has to make, in order to keep from falling over. So we might use that to train the dancer – we might use that to identify motion deficits. And you can also take this computational microscope and point it through an actual microscope to amplify small motions in microscope videos.”

He also illustrates how this technology can be applied to the senses, hearing in this case, complete with the very technical term ‘tectorial membrane’ (a structure of the inner ear).

At the end, Freeman brings it back to healthcare, discussing possibilities for internal imaging, likening a motion amplifier to a stethoscope, and suggesting that this technology, in something like an ultrasound, could be diagnostically important.

We also listen along to Freeman’s thinking about applications in mechanical systems and in human anatomy, such as blood flow.

Indeed, when you magnify things that looked static, you see more dynamic movement than you might expect.

Don’t rule out algorithms showing us new data hidden in routine, everyday observations.

Freeman ends with an appeal to the crowdsourcing of new ideas:

“I’d love to hear your suggestions for other places where this motion microscope might be useful,” he says.

Video: Some of the newest technology lets us see motions that were previously invisible – and that will have big ramifications for all kinds of research.