I use node_exporter on all of my systems. They get scraped by a VictoriaMetrics instance on one of my servers; AlertManager pushes out alerts if any of my configured thresholds are crossed (high temperature, high memory usage, high swap usage, high disk usage, etc.), and Grafana plots it all.
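For anyone curious what one of those threshold alerts looks like, here's a minimal sketch of a disk-usage rule in the Prometheus-compatible format that vmalert/AlertManager consume. The 90% threshold, group name, and severity label are illustrative, not my exact config; the metric names are the real ones node_exporter exposes.

```yaml
groups:
  - name: node-thresholds  # illustrative group name
    rules:
      - alert: HighDiskUsage
        # fraction of each filesystem in use, skipping pseudo-filesystems
        expr: >
          (1 - node_filesystem_avail_bytes{fstype!~"tmpfs|overlay"}
             / node_filesystem_size_bytes) * 100 > 90
        for: 10m  # only fire if it stays above the threshold
        labels:
          severity: warning
        annotations:
          summary: "Disk usage above 90% on {{ $labels.instance }} ({{ $labels.mountpoint }})"
```

The temperature, memory, and swap alerts are the same shape, just with different node_exporter metrics and thresholds.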
e.g. CPU temperatures, drive temperatures, RAM usage, disk usage, and many more. Disk usage at that scale is pretty boring, but there's a lot to see if you zoom out.
It records thousands of metrics from every system every 15 seconds and keeps them for 6 months, so it's easy to spot trends, review history, see the difference a hardware change makes, etc. I also use smart plugs that feed into the same database, so I can look at power consumption across all of the systems.
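The 15-second cadence is just standard Prometheus-style scrape config, which VictoriaMetrics can read directly via its `-promscrape.config` flag; the 6-month window is its `-retentionPeriod` flag. A rough sketch (hostnames are placeholders):

```yaml
# Prometheus-compatible scrape config, loaded with:
#   victoria-metrics -promscrape.config=scrape.yml -retentionPeriod=6
# (-retentionPeriod is in months by default)
global:
  scrape_interval: 15s
scrape_configs:
  - job_name: node
    static_configs:
      - targets: ["host1:9100", "host2:9100"]  # node_exporter's default port
```

The smart plugs just show up as another job in the same file, so everything lands in one database and can be graphed side by side.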
What's the point? Even if you pay extra for "4K" streaming, it's compressed to hell and the quality is no better than 1080p. What are you even going to watch on an 8K TV?