With my 14.5TB hard drive nearly full, and wanting to wait a bit longer before shelling out for a 60TB RAID array, I’ve been trying to replace as many x264 releases in my collection as possible with x265 releases of equivalent quality. While popular movies are usually available in x265, less popular ones and TV shows tend to have fewer x265 options, with low-quality MeGusta encodes often being the only x265 choice.
While x265 playback is more demanding than x264 playback, its compatibility is much closer to x264 than the new x266 codec. Is there a reason many release groups still opt for x264 over x265?
A lot of TV shows are direct rips from streaming services and they don’t use H.265 because of the ridiculous licensing it comes with.
I suspect AV1 will become much more popular for streaming in a few years when the hardware support becomes more common. It’s an open source codec, so licensing shouldn’t be an issue. Then we will see a lot more AV1 releases.
What’s AV1 compression like compared to x265?
In my experience, you always gain space savings going to AV1 from x264 and x265 as well. For me it’s always been significant savings at the same quality level.
Ofc YMMV, and use a very recent ffmpeg with the best AV1 libraries.
Some notes: Don’t use the GPU to re-encode; you will lose quality.
Don’t worry about long encoding times, especially if the objective is long-term storage.
Power consumption might be significant. I run mine while the sun shines and my photovoltaic setup picks up the tab.
And go AV1: it’s open source and the big players seem pretty committed to it, much more so than H.265.
Why is GPU re-encoding bad for quality? Any source for this?
Yeah that caught my eye too, seems odd. Most compression/encoding schemes benefit from a large dictionary but I don’t think it would be constrained by the sometimes lesser total RAM on a GPU than the main system - in most cases that would make the dictionary larger than the video file. I’m curious.
The way it was explained to me once is that the ASIC in the GPU makes assumptions that are baked into the chip. It made sense, because they can’t reasonably “hardcode” for every possible variation of input the chip will get.
The great thing, though, is that if you’re transcoding you can use the GPU for the decoding part, which works fine, and free up more CPU for the encoding half.
RARBG was so good for this; their releases were of such consistent quality.
If you search for ORARBG on therarbg site you can still find some OG releases and not random YIFY crap
I just use qBittorrent and the search feature
Didn’t even know that was a thing ngl and I use qbit nox on my server. Kinda obsoletes the *arr suite
I only found out shortly after RARBG’s closure. This is the best way I’ve found to locate old RARBG torrents: just search for whatever, then filter for RARBG or 265.
Eh, the *arr apps are more about the freedom to be hands-off. Ideally you will just request something in a frontend like Overseerr from your phone and it will handle the rest. Or automatically grab upon release.
Not if you have lots of specific filters set up in the *arr suite. So much better getting an HQ rip automatically than choosing a random one in qbit.