# Amazing Slow Motion Videos With Optical Flow | Two Minute Papers #119

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=7aLda2E0Yyg
- **Date:** 11.01.2017
- **Duration:** 6:56
- **Views:** 23,254
- **Source:** https://ekstraktznaniy.ru/video/14728

## Description

The paper "An Iterative Image Registration Technique
with an Application to Stereo Vision" is available here:
http://cseweb.ucsd.edu/classes/sp02/cse252/lucaskanade81.pdf

Our earlier episode on extrapolation:
https://www.youtube.com/watch?v=AHl2JjGsu0s

We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Sunil Kim, Daniel John Benton, Dave Rushton-Smith, Benjamin Kang.
https://www.patreon.com/TwoMinutePapers

Other video credits:
Simulating Viscosity and Melting Fluids - https://www.youtube.com/watch?v=KgIrnR2O8KQ&list=PLujxSBD-JXgnnd16wIjedAcvfQcLw0IJI&index=2
Modeling Colliding and Merging Fluids - https://www.youtube.com/watch?v=uj8b5mu0P7Y&list=PLujxSBD-JXgnnd16wIjedAcvfQcLw0IJI&index=8
Multiphase Fluid Simulations - https://www.youtube.com/watch?v=cUWDeDRet4c&list=PLujxSBD-JXgnnd16wIjedAcvfQcLw0IJI&index=11

Thumbnail background image credit: https://pixabay.com/photo-1032741/

Subscribe if you would like to see more of these! - http://www.

## Transcript

### Intro [0:00]

Dear Fellow Scholars, this is Two Minute Papers with Károly Zsolnai-Fehér. I am really excited to show this to you, as I have been looking to make this episode for quite a while. You'll see lots of beautiful slow-motion footage during the narration. At first, it may seem disconnected from the narrative, but by the end of the video, you'll understand why it looks the way it does. Now, before we proceed, let's talk about the difference between interpolation and extrapolation. Interpolation means that we have measurement points for a given quantity, and we'd like to know what happened between these points. For instance, we have two samples of a person's location at four and at five o'clock, and we'd like to know where the guy was at four thirty. However, if we're doing extrapolation, we're interested in guessing a quantity beyond the reach of our sample points. For instance, extrapolation would be predicting what happens after the very last frame of the video. In our earlier episode, we talked about financial extrapolation - make sure to have a look, it was super fun. The link is in the video description. Optical flow is really useful because it can do this kind of interpolation and extrapolation for images. So let's do one of them right now.
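The four-o'clock/five-o'clock example above can be made concrete with a few lines of code. This is not from the video - just a minimal linear sketch, with made-up positions in kilometers, showing that the same formula interpolates inside the samples and extrapolates beyond them:

```python
# Linear interpolation/extrapolation between two measurements.
# Hypothetical data: a person's position (km along a road),
# sampled at 4:00 (10 km) and 5:00 (14 km).
def lerp(t0, x0, t1, x1, t):
    """Estimate x at time t from the samples (t0, x0) and (t1, x1)."""
    return x0 + (x1 - x0) * (t - t0) / (t1 - t0)

# Interpolation: 4:30 lies between the two samples.
print(lerp(4.0, 10.0, 5.0, 14.0, 4.5))  # 12.0

# Extrapolation: 5:30 lies beyond the last sample - same formula,
# but now we are guessing outside the measured range.
print(lerp(4.0, 10.0, 5.0, 14.0, 5.5))  # 16.0
```

Extrapolation is riskier for exactly the reason the narration hints at: nothing guarantees the motion stays linear past the last sample.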

### Frame Interpolation [1:13]

You'll see which it will be in a second. So this is a classical scenario that we often encounter when producing a new Two Minute Papers episode - here, we have a 25 or 30 frames per second video on a 60 frames per second timeline. This means that roughly every other frame is a duplicate and offers no new information. You can see this as I step through these individual frames. The more astute Fellow Scholars would immediately point out that wait, we have a lot of before and after image pairs, so we could do a lot better! Why don't we try to estimate what happened between these images? And that is exactly what we call frame interpolation - interpolation, because it is something between two known measurement points. And if we run an optical flow algorithm that can accomplish this, we can fill in these duplicated frames with new ones that actually carry new information. So the ratio here was roughly two to one: roughly every other frame provides new information. Super cool! So, what are the limits of this technique?
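To see what "filling in" a frame means, here is a deliberately tiny sketch - not the pipeline used in the video - where a "frame" is a 1-D list of brightness values and the optical flow (per-pixel displacement between the two frames) is assumed to be already known. The in-between frame is synthesized by moving each pixel halfway along its flow vector:

```python
# Toy flow-based frame interpolation on a 1-D "frame".
# Assumption: the flow field is given; real optical flow
# algorithms (e.g. Lucas-Kanade) would have to estimate it.
def interpolate_midframe(frame_a, flow):
    """Warp frame_a halfway along the flow to synthesize the
    in-between frame (nearest-neighbor splatting, kept simple)."""
    mid = [0.0] * len(frame_a)
    for x, value in enumerate(frame_a):
        target = round(x + 0.5 * flow[x])  # move each pixel halfway
        if 0 <= target < len(mid):
            mid[target] = max(mid[target], value)  # keep brightest splat
    return mid

frame_a = [0, 0, 9, 0, 0, 0]   # a bright "object" at index 2
flow    = [0, 0, 2, 0, 0, 0]   # it moves 2 pixels to the right
# The synthesized middle frame shows the object at index 3,
# halfway along its motion:
print(interpolate_midframe(frame_a, flow))
```

Real interpolators also blend a backward warp from the second frame and handle occlusions, which is where the artifacts discussed later come from.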

### Limitations [2:28]

What if we artificially slow the video down, so that it's much longer, so not only every other, but most of the frames are just duplicates? This results in a boring and choppy animation. Can we fill those in too? Note that the basic optical flow equations are written for tiny changes in position, so we shouldn't expect it to be able to extrapolate or interpolate any quantity over a longer period of time. But of course, it always depends on the type of motion we have at hand, so let's give it a try!
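The "tiny changes in position" remark refers to the derivation behind methods like the Lucas-Kanade paper linked above: the equations come from a first-order Taylor expansion of image brightness, which only holds for small displacements. In standard textbook notation (not quoted from the video):

```latex
% Brightness constancy: a moving point keeps its intensity
I(x + \Delta x,\; y + \Delta y,\; t + \Delta t) = I(x, y, t)

% A first-order Taylor expansion -- valid only when the
% displacements are small -- yields the optical flow constraint:
I_x u + I_y v + I_t = 0
```

Here \(I_x, I_y, I_t\) are image derivatives and \((u, v)\) is the flow. When the motion between frames is large or abrupt, the linearization breaks down, which is exactly why heavily slowed-down footage is a stress test.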

### Demo [3:18]

As you can see, with optical flow, the algorithm has an understanding of the motions that take place in the footage, and because of that, we can get some smooth, buttery slow-motion footage that is absolutely mesmerizing - almost like shooting with a slow-motion camera. And note that the majority of these frames did not contain any new information; this motion was synthesized from distant sample points that are miles and miles away from each other. However, it is also important to point out that optical flow is not a silver bullet, and it should be used with moderation and special care, as it can also introduce nasty artifacts like the one that you see here. This is due to an abrupt, high-frequency change that is more difficult to predict than a slow and steady translation or rotation. To avoid these cases, we can use a much simpler frame interpolation technique that we call frame blending. This is a more naive technique that doesn't do any meaningful guesswork and simply computes the average of the two neighboring frames.
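Frame blending is simple enough to sketch in full - again a toy illustration, not footage from the episode. Averaging the two neighbors produces a cross-fade "ghost" of both positions rather than an object that actually moves, which is why the results are more limited than optical flow but also harder to break:

```python
# Minimal frame blending sketch: the in-between frame is the
# per-pixel average of its two neighbors. No motion estimation.
def blend(frame_a, frame_b):
    """Average two frames pixel by pixel."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

frame_a = [0, 0, 9, 0, 0]   # object at index 2
frame_b = [0, 0, 0, 0, 9]   # object at index 4
# Instead of placing the object at index 3 (as a correct flow
# would), blending leaves two half-bright ghosts:
print(blend(frame_a, frame_b))  # [0.0, 0.0, 4.5, 0.0, 4.5]
```

Because there is no guesswork, there is also nothing to guess wrong - hence no high-frequency warping artifacts, only ghosting.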

### Comparison [4:19]

Why don't we give this one a try too? Or even better, let's have a look at the difference between the original choppy footage and the interpolated versions with frame blending and optical flow. If we do that, we see that frame blending is unlikely to give us nasty artifacts, but in return, the results are significantly more limited compared to optical flow, because it doesn't have an understanding of the motion taking place in the footage. So, the question is, when to use which? Well, until we get an algorithm that is able to adaptively decide when to use what, it still comes down to individual judgment and sometimes quite a bit of trial and error. I'd like to make extremely sure that you don't leave this video thinking that this is the only application of optical flow. It's just one of the coolest ones! But this motion estimation technique also has many other uses. For instance, if we have an unmanned aerial vehicle, it's really great if we can endow it with an optical flow sensor, because then it will be able to know in which direction it needs to rotate to avoid a tree, or whether it is stable or not at a given point in time.

### Conclusion [5:21]

And, with your support on Patreon, we were not only able to bump up the resolution of future Two Minute Papers episodes to 4K, but we're also running them at true 60 frames per second, which means that all of our footage can undergo either a frame blending or an optical flow step to make the animations smoother and more enjoyable for you. This takes a bit of human labor and is computationally expensive, but our new Two Minute Papers rig is now capable of handling it. It is fantastic to see that you Fellow Scholars are willing to support the series, and through this, we can introduce highly desirable improvements to the production pipeline. This is why we thank you at the end of every episode for your generous support. You Fellow Scholars are the best YouTube audience anywhere. And who knows, maybe one day we'll be at a point where Two Minute Papers can be a full-time endeavor, and we'll be able to make even more elaborate episodes. As I am tremendously enjoying making these videos, that would be absolutely amazing. Have you found any of these disturbing optical flow artifacts during this episode? Have you spotted some of these in other videos on YouTube? Let us know in the comments section so we can learn from each other. Thanks for watching and for your generous support, and I'll see you next time!
