I have a managed network with a Domain Controller. Some PCs are running at a speed of 10.0 Mbps and others at 100.0 Mbps. What determines this setting? I feel strange asking such a simple question, but since my machines' NICs are rated at 100 Mbps, I have always assumed that new builds and rebuilds were running at 100.
1. All hardware has to be 100 Mb/sec compliant.
2. Network settings (mainly in the NIC Properties) have to be set to 100 Mb/sec or Auto.
3. Cables, plugs, etc. have to be in good shape.
The most common cause of reduced "speed" on a 100 Mb/sec network is bad cables.
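If you want to verify what each adapter actually negotiated rather than assume it, here is a minimal sketch using the third-party psutil package (my choice, not part of the advice above; install with pip install psutil):

```python
# Print each active NIC's negotiated link speed and duplex mode.
# Requires the cross-platform psutil package (pip install psutil).
import psutil

DUPLEX_NAMES = {
    psutil.NIC_DUPLEX_FULL: "full",
    psutil.NIC_DUPLEX_HALF: "half",
    psutil.NIC_DUPLEX_UNKNOWN: "unknown",
}

def report_link_speeds():
    # net_if_stats() maps interface name -> (isup, duplex, speed in Mbps, mtu).
    for name, stats in psutil.net_if_stats().items():
        if not stats.isup:
            continue
        # A 100 Mbps-capable NIC reporting 10 Mbps (or half duplex) is a
        # classic symptom of a bad cable or a failed auto-negotiation.
        print(f"{name}: {stats.speed} Mbps, {DUPLEX_NAMES[stats.duplex]} duplex")

if __name__ == "__main__":
    report_link_speeds()
```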
The following is a standard way to measure network "speed" and debug it.
Both the computer and the device on the other end of the wire determine network speed for most modern Ethernet setups. So even if your computer is capable of 100 Mbps, it'll only run at 10 if that's all your hub/switch is capable of. In some cases this auto-negotiation of speeds doesn't work properly, so you can try to force a certain speed (steps below), though that doesn't always work either.
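If auto-negotiation is in doubt, it can also help to measure what the link actually delivers end to end before forcing anything. A rough sketch (the port number and transfer size are arbitrary choices; this is a sanity check, not a proper benchmark):

```python
# Crude point-to-point throughput check. Run "python nettest.py server" on one
# machine and "python nettest.py <server-ip>" on another.
import socket
import sys
import time

PORT = 50007                      # arbitrary unprivileged port
CHUNK = b"x" * 65536              # 64 KB per send
TOTAL_BYTES = 50 * 1024 * 1024    # move 50 MB per run

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            while conn.recv(65536):   # drain until the client disconnects
                pass

def client(host):
    start = time.time()
    with socket.create_connection((host, PORT)) as sock:
        sent = 0
        while sent < TOTAL_BYTES:
            sock.sendall(CHUNK)
            sent += len(CHUNK)
    elapsed = time.time() - start
    # 10BASE-T tops out near 1 MB/s of payload; 100BASE-TX near 11-12 MB/s.
    print(f"{sent * 8 / elapsed / 1e6:.0f} Mbit/s over {elapsed:.1f} s")

if __name__ == "__main__":
    server() if sys.argv[1] == "server" else client(sys.argv[1])
```

A link that negotiated down to 10 Mbps, or to half duplex over a bad cable, shows up immediately as a number far below the NIC's rated speed.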
1. Open Device Manager.
2. Find your network card and open its Properties.
3. Click the Advanced tab.
4. Look for a setting called "Link Speed & Duplex" or something close to that.
5. Change it to the desired speed.
The exact settings vary per network card, but most modern cards allow this kind of change.
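If you'd rather inspect that setting without clicking through Device Manager, the value behind "Link Speed & Duplex" is typically stored as the standardized *SpeedDuplex keyword under the network adapter class key in the registry. A Windows-only, read-only sketch (value meanings beyond 0 = auto-negotiate vary by driver, so treat the output as informational):

```python
# Dump the *SpeedDuplex advanced setting for each NIC instance key under the
# network adapter device class in the registry (read-only; Windows only).
import winreg

NET_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4D36E972-E325-11CE-BFC1-08002BE10318}")

def dump_speed_duplex():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, NET_CLASS) as cls:
        index = 0
        while True:
            try:
                sub = winreg.EnumKey(cls, index)   # instance keys: 0000, 0001, ...
            except OSError:
                break                              # no more subkeys
            index += 1
            try:
                with winreg.OpenKey(cls, sub) as adapter:
                    desc, _ = winreg.QueryValueEx(adapter, "DriverDesc")
                    try:
                        mode, _ = winreg.QueryValueEx(adapter, "*SpeedDuplex")
                    except FileNotFoundError:
                        continue                   # driver doesn't expose it
                    note = " (auto-negotiate)" if str(mode) == "0" else ""
                    print(f"{sub}  {desc}: *SpeedDuplex = {mode}{note}")
            except OSError:
                continue   # some subkeys (e.g. "Properties") are ACL-protected

if __name__ == "__main__":
    dump_speed_duplex()
```

Changing the value is better done through the Device Manager UI above; drivers usually reset the adapter when the setting changes.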