It’s not Jellyfin, but here’s my N100 simultaneously doing two 4K HDR transcodes with tone mapping enabled. Neither stream had buffering.
So it’s definitely a capable chip, but might be dependent on transcode settings.
Indeed. Sounds like the i5-6500 you have already suits your needs, so there’s really no need for the extra expense. For someone who doesn’t already have something like that and needs to make a purchase, though, I’ve come around to generally recommending something like the N100 over a used older-generation processor: they cost about the same, but I feel you get a bit more with the more recent chips thanks to the modern HW encode/decode and low power use.
The N100 mini PCs are a fantastic choice for hosting media server software, primarily because of the chip’s transcoding capabilities.
The i5-6500 you have and the N100 perform very similarly in general compute tasks (though the TDP of the N100 is 6W vs 65W for the same performance). However, the N100 comes with the full Alder Lake Quick Sync engine, compared to the Skylake engine on the i5-6500. If you review the hardware encode/decode table here, you can see Skylake HW encode/decode caps out at 8-bit HEVC (4K HDR content is typically 10- or 12-bit HEVC), whereas the N100 supports even very recent codecs like 10-bit AV1. I recently set up Plex on an N100 mini PC I got for $150 (with 8GB RAM and a 256GB NVMe drive included), and it was able to do 2x 4K HDR transcodes with tone mapping simultaneously while also running a full library scan and credits detection. Of course, if you’re picky about which clients watch your content and make sure they always play the original quality, you may not need to transcode at all.
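If you want to sanity check the hardware side yourself (beyond the “(hw)” indicator on Plex’s dashboard), two standard tools will confirm it on Debian/Ubuntu-based systems; package names are from memory, so adjust for your distro:

    # List the codecs the iGPU can encode/decode through VA-API
    sudo apt install vainfo
    vainfo

    # Watch the iGPU's Video engine load live while a transcode runs
    sudo apt install intel-gpu-tools
    sudo intel_gpu_top

If Quick Sync is actually being used, the Video/VideoEnhance engines in intel_gpu_top show the activity instead of the CPU cores pegging.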
That said, the N100 mini PC I purchased only has slots for one NVMe drive and one 2.5" SATA drive. In my case this was perfect because all my media is on a NAS, which the N100 now accesses over an NFS mount, and I can easily back up the minimal persistent data on the N100 itself.
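For anyone who wants to replicate that, the NFS part is pretty minimal; a rough sketch with made-up hostnames/paths (on Debian/Ubuntu you need nfs-common installed first):

    # One-off mount for testing
    sudo apt install nfs-common
    sudo mkdir -p /mnt/media
    sudo mount -t nfs nas.local:/volume1/media /mnt/media

    # Roughly equivalent /etc/fstab entry to make it persistent across reboots:
    # nas.local:/volume1/media  /mnt/media  nfs  defaults,_netdev  0  0

Then the Plex libraries just point at /mnt/media like any local folder.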
But it sounds like it wouldn’t 100% satisfy everything OP is looking for on its own. If they still wanted an N100 for the transcode capabilities, they may be able to use a USB HDD enclosure to add storage without needing a separate system, but because I already had a NAS for dedicated storage, it isn’t something I looked into in detail.
Oh I know, but my thermostat and a handful of other devices are Zwave, so for me specifically it’s probably not worth changing things up at this time.
I was weighing the same options recently and went with an N100 mini PC with 8GB RAM and a 256GB M.2 SSD for $150. Absolutely no regrets.
I noticed you didn’t list storage with your RPi5. Are you just using eMMC? I’d strongly recommend against eMMC as your only storage if you’re doing anything write-intensive, since the lifespan of eMMC is generally much shorter than even cheap SSDs (and performance is much lower than M.2 over PCIe), and it’s not something you can just swap out if it dies. On my existing Pis and other SBCs, I use eMMC only for the bootloader and/or core OS image (if at all) and store everything else on physically attached SD cards, SSDs, or mounted network volumes.
This additional storage adds even more cost to the Pi, even if you go with my recommended minimum of an SD card (limited lifespan, but at least you can replace it). So now the 8GB Pi is $80 + $10-15 for a case with fan and heatsinks + $10-15 for a power supply + $15+ for an SD card or other storage = $115-125+ total.
In comparison, the $150 N100 mini PC comes with case, power supply, and storage. Both the included 256GB M.2 SSD and 8GB RAM are easily replaced or upgraded using standard SSDs and laptop memory (up to 16GB DDR4-3200). The Intel N100 scores more than twice as high in Passmark as the ARM Cortex-A76, and includes the full Alder Lake Quick Sync engine (meaning it can hardware encode/decode a large variety of video codecs on the integrated GPU, including very new and demanding ones like 10-bit AV1). I stress-tested it recently and it was capable of simultaneously transcoding 2x 4K HDR movies (both full UHD Blu-ray quality, one of them 60fps at a 100Mbps bitrate) with tone mapping in Plex in real time while also doing a full library scan and credits detection. In addition, x86 is still more broadly supported than ARM, so compatibility is less of an issue. (That said, in this particular case, the N100 is only fully supported in newer Linux kernels. I upgraded Ubuntu 22.04.4 to the 6.5 kernel and installed a few other driver packages to get it fully working, which wasn’t hard, but it’s an additional step.)
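For reference, that kernel/driver step on Ubuntu 22.04 was basically just the HWE stack plus Intel’s VA-API media driver; roughly like this, though double-check the package names against your own image:

    # Hardware-enablement kernel (the 6.5 series on 22.04.4)
    sudo apt install --install-recommends linux-generic-hwe-22.04

    # VA-API driver for the N100's iGPU, needed for Quick Sync transcoding
    sudo apt install intel-media-va-driver-non-free
    sudo reboot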
For me, in the end the price difference was at most $25 and the advantages made it clearly worth it.
That said, if all I wanted was a much lower powered SBC just to run a handful of light services, I might look at one of the cheaper Pis or similar and just accept that it’ll eventually die when the eMMC dies (and back up any persistent data I’d want to retain).
That does also look like a good option. In my case, I have a Pi 4 running both zigbee2mqtt and zwave-js-ui using connected Zigbee and Zwave USB dongles placed centrally in the house (Eclipse Mosquitto is running on a separate 3-server cluster). I’ve only briefly searched, but network Zwave controllers seem to be much less common or more expensive, so I probably wouldn’t benefit much from changing my Zigbee controller at the moment.
I’ll check it out. I suspect configuration would likely be a little bit more complicated in my case because I’m using Authentik for proxy forward authentication and had also been using access control groups in NPM (both a LAN group and a WAN group containing Cloudflare proxy IP addresses, since currently all my publicly accessible domains proxy through Cloudflare).
This 100%. A FLAC CD rip is maybe 400MB. That’s 2,500 albums per terabyte, and I just recently got an 18TB drive for my NAS for $180, which works out to about $0.004 in storage cost per album. I’d rather have a lossless permanent copy of any of my CDs than save fractions of a penny per album.
Nginx is a lot less painful if you use Nginx Proxy Manager. You get a nice GUI and can easily get SSL certificates through Let’s Encrypt, including wildcard certs. I’m running it in front of a Docker swarm and 3 other servers, and in most cases it takes me about 30 seconds to add a new proxy host and set it up with HTTPS using my *.domain.com wildcard cert. I also use it with Authentik as forward proxy auth for SSO (since many containers out there don’t have the best security).
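If anyone wants to try it, NPM itself is a single container; a minimal sketch from memory (image name, ports, and volume mounts should be checked against the current NPM docs, and the host paths here are arbitrary):

    docker run -d --name nginx-proxy-manager --restart unless-stopped \
      -p 80:80 -p 443:443 -p 81:81 \
      -v /opt/npm/data:/data \
      -v /opt/npm/letsencrypt:/etc/letsencrypt \
      jc21/nginx-proxy-manager:latest

Port 81 is the admin GUI; proxy hosts, wildcard certs, and access lists all get set up from there.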
It’s good to be cautious about nonprofit organizations, but in the case of DSI, they’ve been around a while, have a good reputation, and score well on third-party sites like Charity Navigator.
I’d also like to make clear that their Dark Sky Sanctuary certification isn’t a purely scientific one based solely on light pollution measurements; it also requires that local/state/etc. governments have implemented certain policies to help ensure the area remains a dark sky area. It’s best to think of it as something akin to a designated “wilderness” or “wildlife sanctuary” area, but for starry skies. Because DSI works with governments to certify these areas, you’re right that certain regions are likely to be more represented, and some not represented at all due to geographic and political barriers.
P.S. I’m not affiliated with DSI, but have used their accreditations in the past to pick wilderness areas to visit for hiking/camping/photography.
They do accreditations internationally too, though you’re almost certainly right that some locations aren’t as represented, especially since it requires communication and cooperation with the governments for these areas.
To be fair, the add-ons are just containers installed and managed by HA. In most cases, you can install all of them as separate containers via something like Docker, but configuration takes more steps (though you also get more control).
Example: I have HA, Eclipse Mosquitto, zigbee2mqtt, zwave-js-ui, Node-RED, Grafana, and InfluxDB all running as Docker containers on two different devices (my main HA host wasn’t ideal for the Zigbee and Zwave USB dongles, so those are on a Pi 4). The other containers are accessible separately or from within HA as iframe panels.
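As a concrete example of what one of those looks like outside of HA’s add-on store, the zigbee2mqtt container on the Pi boils down to something like this (the device path, port, and host paths are just placeholders from my setup, and the MQTT broker address lives in data/configuration.yaml):

    docker run -d --name zigbee2mqtt --restart unless-stopped \
      --device /dev/ttyACM0 \
      -v /opt/zigbee2mqtt/data:/app/data \
      -v /run/udev:/run/udev:ro \
      -e TZ=America/New_York \
      -p 8080:8080 \
      koenkk/zigbee2mqtt

zwave-js-ui runs much the same way, just with its own image and the Zwave dongle passed through instead.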
Technically 3.5" SSDs are still out there, but they’re massive (16-64 TB) and target enterprise use (with a price to match).
And 3.5" is still the standard for platter HDDs, which are still the more economical option if you need large amounts of storage.
Now if you meant no more 3.5" floppy disk drives, then yes, those are definitely gone. ;)
I think you may have misread OP’s post. They haven’t built a PC since shortly after they were 10-11, which was almost 30 years ago. So developments since the turn of the century are in fact relevant here, heh.
This is the way. I’ve had absolutely zero issues with my Hue bulbs directly connected to a USB Zigbee controller and running zigbee2mqtt. With Zigbee bindings to smart switches, they respond practically instantly as well whenever we decide to control them that way.
Traditionally in systems like D&D and Pathfinder, Wisdom is more representative of inherent common sense, awareness, and intuition, while Intelligence represents the ability to reason and learn. What actually represents accumulated knowledge is your bonus in the related skill, which is a combination of ranks (representing time spent training or studying that skill) plus an ability modifier (for Knowledge skills, usually Intelligence, since Int represents your ability to learn).
Some skills end up seeming like they should be a bit of both, though. For example, Healing/Medicine requires knowledge of anatomy, ailments, treatments, etc. (which would be Int-aligned), but actually diagnosing and treating requires perception to recognize symptoms and often some intuitive choices about how best to treat ailments (which is more Wisdom-aligned). It’s simpler to just pick one, and Wisdom is often chosen to avoid too many things being Int-based.
The only way I see a company like this having “significant economic harm” from you not using their free app is if 1) they eventually plan to charge a fee to use the app or 2) they profit from data their app collects about you (third party data sales, for example).
Not something I’m interested in either way, so they’ve lost a potential customer.
That’s a good point I hadn’t considered from a legal standpoint before. I believe there are also some network media players out there that can load ISO files, so in theory you could have a library of ISOs that you load up as if you were playing the disc, complete with menus and all.
I have no idea whether this is any better from a legal standpoint, though, since you’d still be using what I assume is unauthorized software to bypass the DVD and Blu-ray encryption whenever you play the ISO.
Long story short, they really need to carve out a DMCA exception for this specific conflicting case (which they’ve done for other conflicting situations), but I suspect there’s some strong lobbying against it by interested parties…
That one certainly came to mind. ;)
Makes sense. Pathfinder already shifted over to Ancestries in their 2nd Edition. Paizo has a pretty good history of representation and sensitivity to stuff like this though.