Many people find Ethernet confusing.
CAT6 and CAT5e are very similar, but plain CAT5 can’t do more than 100 Mbps. CAT5e can carry gigabit (its tighter crosstalk specs are what made that reliable), whereas CAT6 is rated to a higher bandwidth, 250 MHz vs 100 MHz, and is designed with gigabit and short 10-gig runs in mind.
CAT6 doesn’t actually need shielding to qualify as CAT6, but in the early days shielding was a common way to meet the spec.
The biggest difference is actually the connectors. You need CAT6-rated RJ45 plugs to make a CAT6 cable. In my own testing, CAT5e cable with a CAT6 connector works better than CAT6 cable with a CAT5 RJ45.
CAT5 connectors have pins that all line up straight across; CAT6 connectors stagger the pins, which helps reduce crosstalk between the pairs.
CAT8 is the newest standard, and I’m not brushed up on its specs yet.
CAT7 was never made an official TIA standard; manufacturers jumped ahead of the standards process. It might be a little better than CAT6, but it almost never meets CAT8 specs, and it costs about the same.
Cat7 doesn’t use RJ45 connectors (the spec calls for GG45 or TERA plugs), so it’s not useful for residential use. Any “Cat7” sold with RJ45 connectors is fake.
Cat8 isn’t worth it for residential use, and IMO Cat6A isn’t worth it either. Cat6 will do 10 Gbps on shorter runs. For anything faster than 10 Gbps, I’d use fiber. Big data centers are similar: they almost always use Cat6, Cat6A, and fiber (for 25, 40, and 100 Gbps links). No Cat7 or Cat8 or anything like that.
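As a quick-reference sketch of the ratings above (the numbers are the commonly cited TIA/IEEE figures, not vendor data; the helper function is just illustrative, and real-world results vary with cable quality and termination):

```python
# Rough quick-reference for the categories discussed above.
# (rated bandwidth in MHz, max specified 10GBASE-T run in meters)
CABLE_SPECS = {
    "Cat5e": (100, None),   # gigabit to 100 m; 10GBASE-T not in the spec
    "Cat6":  (250, 55),     # 10 Gbps supported up to ~55 m
    "Cat6a": (500, 100),    # 10 Gbps to the full 100 m channel
    "Cat8":  (2000, 30),    # 25/40 Gbps, but only to ~30 m
}

def supports_10g(category: str, run_m: float) -> bool:
    """True if the category is specified for 10GBASE-T at this run length."""
    _, max_10g = CABLE_SPECS[category]
    return max_10g is not None and run_m <= max_10g
```

So a 40 m Cat6 run is within spec for 10 Gbps, while the same run on Cat5e is officially unsupported (even if it often works in practice).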
The one place I’d always use fiber instead of Cat6, regardless of distance, is a link to a different building. There are risks from lightning strikes, ground loops, etc. that fiber avoids, since it’s just light.
I’m getting 10 Gbps over my 100 ft 5e run. But IIRC about 100 ft (~30 m) is the practical max for that, since 10GBASE-T over Cat5e isn’t in the spec at all.