Every single vintage effect misses out on one crucial factor: resolution. Vintage films (even when scanned in 4K) don't show the same kind of aliasing and sharpening artifacts that digital cameras produce. That quality is what “makes” it vintage for me.
If you haven’t seen it, the Apollo 11 (2019) documentary is simply stunning, and even though it’s in 4K, it appears vintage.
I'm curious what you're referring to specifically, since there is incredibly sharp film footage out there.
Are you talking about resolution from film grain? Because 35mm film is generally considered to be a good bit higher resolution than 4K.
Or the noise properties of film grain, which is a separate issue from resolution? That shows up some in well-lit conditions, but is even more obvious in nighttime shots. That's certainly easy to add (quick sketch at the end of this comment).
Or are you talking about lower-quality lenses (e.g. chromatic aberration), or focus issues?
Or something else? I don't know at all what you mean by the "aliasing and sharpening" that you claim professional digital cameras perform. (Since we're not talking about iPhone camera processing, but rather the cameras movies and broadcasters use.)
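If it is the grain noise, for what it's worth, a rough version of it in FFmpeg is basically a one-liner. A minimal sketch (the strength value is arbitrary, and real grain also varies in size and per channel, which this doesn't model):

    # temporal (t) + uniform (u) noise on all planes, strength 20
    ffmpeg -i in.mp4 -vf "noise=alls=20:allf=t+u" out.mp4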
I've done A-list visual effects work since the 90s, and I can say that film for motion picture production[1] varies widely in its grain density and size. This variation is amplified by the fact that DPs often "push" or "pull" the stock, which means they treat it as if it were a different ISO speed than labeled (exposure and lab processing change to compensate). In practical application, digital elements nearly always have to be reduced in effective resolution to match film plates when composited.
Of course, under laboratory conditions, using a larger negative size (like 65mm), Velvia, or some other extraordinary film stock, you can produce extremely high-resolution individual frames, but as we know from a lifetime of movies this is neither practical nor necessary. Though it is a great source of religious wars.
However, there is a reason why film's "lower resolution" still produces superior resolving power...
Every single frame of film has a different randomized set of sample points.
This, in combination with human persistence of vision[2] and our ability to see pattern but discard noise, means that a static shot on film is significantly more resolved in our minds after only a few frames than the equivalent shot recorded through a color filter array[3]. It's not just perceptual: temporal sampling truly produces a more accurate representation of the image content over time.
This is a fascinating quality of film that can give you the sense of graininess and low quality while still producing superior image detail in your mind, but as soon as you stop the projector and evaluate the film image by image, the effect is lost. You might be left wondering where all that detail went.
Analog media don't sample individual pixels in a grid format. That's why it's rather meaningless to talk about the resolution of analog photos or video: the actual information density will vary depending on a large number of factors and may not even be the same depending on what you measure. For instance, old-school analog TV had significantly more information in the luma than in the chroma.
And then you have old school lenses vs. modern lenses and many other factors.
I agree with the parent that the end result is an amusing novelty, but if the intention was to fool me into thinking it was actually an ancient recording, I don't think I would've fallen for it. It's too sharp, the blurring effect on movement feels wrong too, and the dynamics are off, I think (film has amazing contrast that digital sensors still struggle to emulate).
Sure, but I wouldn't go so far as to say resolution of analog is "meaningless". I'm not aware of a precise quantitative measure of film "resolution", but simply from a qualitative sense of resolving detail, it's quite meaningful and useful to say that 35mm film is generally considered to lie somewhere between 4K and 8K. Which is far higher than 1080p.
In other words, if you're watching a 1080p video and complaining it appears too "high-resolution" to have been transferred from film, that's almost certainly incorrect, and it's something else that's cluing you in that it wasn't film: probably grain and/or color characteristics.
Isn't the opposite the case? If I see aliasing and sharpening, I see it as digital distortion you wouldn't expect from film. When I see a bunch of digital artifacts on a film transfer, I think of it as a bad transfer.
For home movies, at least, 8mm and Super 8mm were very common formats. An 8mm frame has a lot less resolution than 35mm. Kodachrome and Ektachrome film have very identifiable artifacts, too.
There's vintage and vintage. Old film scanned today (with modern resolution and chromatic fidelity, accurately reproducing the chemical and optical issues in the original) is very different from old TV footage (with anisotropic filtering and old magnetic tape degradation effects) and from early digital cameras.
Then add that effect lol. In production you don't just apply one filter and be done with it, and you don't just apply the filter evenly; you want to layer, mask, and add something dynamic to it. A proper tutorial won't tell you about all the other filters, just the one specific effect it is talking about.
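To make that concrete, a purely hypothetical FFmpeg stack (filter choices and values are arbitrary placeholders, not the tutorial's recipe): a tone curve, a vignette, temporal grain, and a slow brightness flicker, each of which you'd normally tune, and often mask to regions, rather than apply evenly:

    # curves: tone shift; vignette: edge falloff; noise: temporal grain;
    # eq with eval=frame: slow sinusoidal brightness flicker over time
    ffmpeg -i in.mp4 -vf "curves=preset=vintage,vignette,noise=alls=10:allf=t+u,eq=brightness='0.02*sin(2*PI*t)':eval=frame" out.mp4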
Nice! I am actually just using my iPhone videos and converting them this way! The output is pretty good and when I shared them with family, they loved it. Thanks for the lens links.
> The film consists solely of archival footage, including 70 mm film previously unreleased to the public, and does not feature narration, interviews or modern recreations.
> In late 2016, Todd Douglas Miller had recently completed work on The Last Steps, a documentary about Apollo 17, when British archivist and film editor Stephen Slater suggested making a similarly themed documentary for the upcoming 50th anniversary celebrations of Apollo 11.
> Miller's conception of the film was centered on a direct cinema approach. The final film contains no voice-over narration or interviews beyond what was available in the contemporary source material.
Yes indeed, this one. It gives me chills what humans did back in the 1960s. The quality of the film amplifies this emotion by making it seem like it was just normal everyday life, captured in modern fidelity.
That clip may be 1080p at most. What's crazy is that while the film was released in 4K, the content sourced from 70mm film has a far higher (8K? more?) potential resolution. Combine that with something like https://news.ycombinator.com/item?id=25105713 and we may still have really cool things to see from the 60s.
It’s cool to see more people writing posts and experimenting on this topic. I was surprised at how little info I could find trying to research this about a year ago, and spent a decent chunk of this year experimenting with trying to create vintage/retro video filters using FFmpeg. If anyone is curious, I shared a bunch of my notes on this topic here a little while ago: https://zayne.io/articles/vintage-camera-filters-with-ffmpeg
To me the film damage/dust/hair overlay seems a bit severe. It’s kind of a default “old stuff looks like this” filter, but I don’t think old films were actually this damaged unless they were stored outside for years or something.
Slow. If you look back at footage of, say, the 1975 Super Bowl, everything just moves slower than real life. And they don't make an effort to match the timing of the film to real-life time, like when they show clips from 40 years ago or whatever during football games.
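If you ever want to correct that yourself, speeding a clip up is simple enough; the 1.1x factor here is just a placeholder you'd have to eyeball per transfer:

    # play the clip 1.1x faster: shrink video timestamps, speed up audio to match
    ffmpeg -i old_clip.mp4 -vf "setpts=PTS/1.1" -af "atempo=1.1" out.mp4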
Does anybody know how to overlay a shorter piece of footage on a longer one? Because the shorter footage just freezes at its last frame in the tutorial. I want the shorter overlaid footage to loop on top of the longer one. Here is what I've tried:
I like to get the duration of the base video using ffprobe, then use -stream_loop -1 to loop the overlay video on top of it, while using -t with the duration before the output to ensure it will only be as long as the duration of the base video. Here's an example:
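Something along these lines (filenames are placeholders; blend=all_mode=screen suits a black-background dust/grain clip and assumes both clips are the same resolution; if your overlay has an alpha channel, use the overlay filter instead):

    # duration of the base clip, in seconds
    D=$(ffprobe -v error -show_entries format=duration \
        -of default=noprint_wrappers=1:nokey=1 base.mp4)

    # loop the shorter clip forever (-stream_loop -1), composite it on the base,
    # and cut the output at the base clip's duration (-t "$D")
    ffmpeg -i base.mp4 -stream_loop -1 -i overlay.mp4 \
        -filter_complex "[0:v][1:v]blend=all_mode=screen[v]" \
        -map "[v]" -map 0:a? -t "$D" out.mp4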
Also, the vintage preset for the curves filter is a bit over-the-top; you may want a smoother preset (the curves filter can import Photoshop preset files, so you could look for a free preset somewhere). Alternatively, you can find strategic keypoints within GIMP and translate them to FFmpeg parameters (a bit more hassle, I admit), but note that the interpolation between points might not be exactly the same.
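For reference, both routes look roughly like this (the filename and keypoint values are made-up examples, not a tuned look):

    # load a Photoshop curves preset (.acv) directly
    ffmpeg -i in.mp4 -vf "curves=psfile=my_preset.acv" out.mp4

    # or translate keypoints read off a GIMP curve into explicit x/y pairs
    ffmpeg -i in.mp4 -vf "curves=r='0/0.05 0.5/0.52 1/0.95':b='0/0.08 1/0.92'" out.mp4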
If mega.nz can keep users' attention during download and decryption (yes, that all happens client-side), then I don't see why a user wouldn't wait for a local transcoding process to finish. Show them pre/post-processed keyframes as a progress indicator.
The last step involves video compression which could take a bit of time depending on the size of the video. If someone can put it on the cloud, I'd love to take a look :)! Cheers!