Is PoE more efficient than plugging in adapters for each network device?

And at what scale does it start to matter?

For context: I’m going for a 3-node mesh router setup plus 2 switches, and was wondering whether, over 5 years, the electricity savings would outweigh the extra upfront cost. The absolute max cable length would probably be around 30 m.

  • Max-P@lemmy.max-p.me
    9 months ago

    I’ll add that it also depends on the efficiency of the local power supplies if those devices were using wall warts. Those are often pretty generic, and may only be loaded at 25% of their rating, which for some wall warts falls outside their peak efficiency range. A single power supply in the form of PoE can be more efficient if it lets both the switch’s PSU and the PoE regulator on the device operate at a better efficiency point.
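To make that concrete: end-to-end efficiency is just the product of each stage's efficiency. A rough sketch, where every percentage is an illustrative assumption rather than a measured value:

```python
# Rough comparison of end-to-end efficiency for two power paths.
# All efficiency numbers below are illustrative assumptions, not measurements.

def chain_efficiency(*stages):
    """Multiply per-stage efficiencies to get end-to-end efficiency."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Path A: a cheap wall wart loaded at ~25% of rating (assumed 75% efficient)
wall_wart = chain_efficiency(0.75)

# Path B: PoE -- switch PSU near its sweet spot (assumed 90%) feeding the
# device's PoE buck regulator (assumed 92%)
poe = chain_efficiency(0.90, 0.92)

print(f"wall wart: {wall_wart:.0%}")  # 75%
print(f"PoE chain: {poe:.1%}")        # 82.8%
```

With these made-up numbers PoE wins, but flip the assumptions (an efficient wart, a mediocre switch PSU) and the result flips too, which is exactly the point.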

    In some ways, stepping 48 V DC down to 3.3/5 V is a bit easier than stepping down the ~170 V DC that results from rectifying 120 V AC. But the wart could be stepping the 120 V down to 5 V first with a simple AC transformer, which is nearly always more efficient (95%+) than a DC/DC buck converter, though good buck converters can still reach 90%+ as well.
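That ~170 V figure is just the peak of the rectified sine wave: RMS voltage times √2, ignoring diode drops. A quick check:

```python
import math

# Peak DC voltage after rectifying mains AC: V_peak = V_rms * sqrt(2),
# ignoring the rectifier's diode drops.
def rectified_peak(v_rms):
    return v_rms * math.sqrt(2)

print(round(rectified_peak(120), 1))  # 169.7
print(round(rectified_peak(240), 1))  # 339.4
```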

    In terms of cabling, power loss is a function of current and resistance (which grows with length). AC is nice because we can step it up easily and efficiently to extremely high voltages so as to minimize the current flowing through the wire, and then step it back down to a manageable voltage. In that way, American 120 V has more loss than the rest of the world’s 240 V, although it only matters for higher-power devices.

    That also means the location of the step-down matters: if you run 30 m of Ethernet and a parallel 30 m run of 5 V power, there will be more loss than if you just ran PoE. But again, you need to account for the efficiency of the system as a whole. Maybe you’d have a wart that’s 5% more efficient, but you lose that 5% in the cable and it’s a wash. Maybe the wart is super efficient and it’s still way better. Maybe the switch is more efficient.
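Here's a back-of-the-envelope I²R sketch for the 30 m case. The 5 W load, 24 AWG conductor resistance, and two-pair PoE delivery are all assumptions for illustration:

```python
# Back-of-the-envelope I^2*R cable loss for delivering ~5 W over 30 m.
# Assumptions: 24 AWG copper (~0.084 ohm/m per conductor), ideal load,
# and basic PoE carrying power on two pairs in parallel.

def cable_loss_w(power_w, volts, ohms_per_m, length_m, pairs=1):
    # Round-trip resistance: out and back, divided across parallel pairs.
    r = 2 * length_m * ohms_per_m / pairs
    i = power_w / volts          # current drawn by the load
    return i ** 2 * r            # power dissipated in the cable

r_per_m = 0.084
print(f"5 V run:  {cable_loss_w(5, 5,  r_per_m, 30):.2f} W lost")   # 5.04 W
print(f"48 V PoE: {cable_loss_w(5, 48, r_per_m, 30, pairs=2):.3f} W lost")  # 0.027 W
```

At 5 V the cable would dissipate about as much as the device itself (which is why nobody runs 30 m of 5 V), while at 48 V the same wire loses well under a tenth of a watt.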

    It’s going to be highly implementation-dependent: it comes down to how well tuned all the power supplies are across the whole system. You’d need either the exact specs of the gear you’ll run, or to measure both options and see which uses the least power.

    I would just run PoE for the convenience of not having to also have an outlet near the device, especially for APs, which typically work best installed on ceilings. Technically, if you run the heat at all during the winter, the loss from the power supplies will contribute to your heating ever so slightly, but it will also work against your AC during summers. In the end, I’d still expect the losses to amount to pennies, or at most a few dollars. It may end up more expensive just in wiring if some devices are far from an outlet.
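To put a number on "pennies or a few dollars": here's what a constant extra draw costs over 5 years, at an assumed (illustrative) electricity price of $0.15/kWh:

```python
# Cost of a constant extra power draw over 5 years.
# The $0.15/kWh price is an illustrative assumption -- use your own rate.
def five_year_cost(extra_watts, price_per_kwh=0.15, years=5):
    kwh = extra_watts / 1000 * 24 * 365 * years
    return kwh * price_per_kwh

for w in (1, 3, 5):
    print(f"{w} W extra -> ${five_year_cost(w):.2f} over 5 years")
# 1 W extra -> $6.57 over 5 years
# 3 W extra -> $19.71 over 5 years
# 5 W extra -> $32.85 over 5 years
```

So even a few watts of difference stays in single-digit to low-double-digit dollars over the whole 5-year window, which is easily dwarfed by hardware and wiring costs.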