Using a resistor for sensing current should be a simple affair. After all, one has only to apply Ohm's law, I = V/R. So, all it takes is to measure the voltage drop across a resistor to find the current flowing through it. However, things are not as simple as that. The sticking point is the choice of resistor value.

Using a large resistor value has the advantage of offering a large reading magnitude, greater resolution, higher precision, and an improved signal-to-noise ratio (SNR). However, the larger value also wastes more power, since P = I²R. It may also affect loop stability, because the larger value adds more series resistance between the load and the power source. Additionally, the resistor's self-heating increases.

Would a lower resistor value be better? Not necessarily: it offers a lower SNR, lower precision and resolution, and a smaller reading magnitude. The solution lies in a tradeoff.

Experimenting with various resistor values to sense different ranges of currents, engineers have concluded that a resistor offering a voltage drop of about 100 mV at the highest current is a good compromise. However, this should be treated only as a starting point; the best value for the current-sense resistor depends on the priorities for sensing current in the specific application.
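The 100 mV starting point translates directly into a quick sizing calculation. The following Python sketch (the function name and the 10 A full-scale figure are illustrative assumptions, not values from a specific design) picks a resistor value from the target drop and reports the resulting full-scale dissipation:

```python
def size_sense_resistor(i_max_a, target_drop_v=0.100):
    """Size a sense resistor for ~100 mV drop at full-scale current."""
    r_ohm = target_drop_v / i_max_a   # Ohm's law: R = V / I
    p_watt = i_max_a ** 2 * r_ohm     # full-scale dissipation, P = I^2 * R
    return r_ohm, p_watt

# Illustrative example: a 10 A full-scale load
r, p = size_sense_resistor(10.0)
print(f"R = {r * 1e3:.0f} mOhm, P = {p:.1f} W at full scale")  # R = 10 mOhm, P = 1.0 W
```

Note how even this well-chosen value dissipates a full watt at maximum current, which leads directly to the self-heating problem discussed next.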

The voltage (IR) drop is only one of two related problems, the second being a consequence of the chosen resistor value. This second issue, resistive self-heating, is a potential concern, especially when a large current flows through the resistor. Considering the equation P = I²R, even for a milliohm-range resistor, the dissipation can reach several watts when the current runs to tens of amperes.
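The square-law growth of dissipation is easy to see numerically. This short sketch (the 2 mΩ value and the current list are illustrative assumptions) tabulates P = I²R over a range of currents:

```python
R = 0.002  # 2 mOhm sense resistor (illustrative value)
for i in (1, 10, 30, 50):  # test currents in amperes
    p = i ** 2 * R         # dissipation grows with the square of current
    print(f"I = {i:2d} A -> P = {p:.3f} W")
```

At 1 A the resistor dissipates a harmless 2 mW, but at 50 A the same part must shed 5 W, which is why self-heating dominates the analysis at high currents.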

Why should self-heating be a concern? Because self-heating shifts the nominal value of the sense resistor, and this corrupts the current-value reading.

Therefore, unless the designer is measuring microamperes or milliamperes, where self-heating can be neglected, they need to analyze how the resistance changes with temperature. To do this, they will need the TCR (temperature coefficient of resistance) data typically available from the resistor's vendor.
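The first-order TCR model is a one-line calculation. In this sketch, the 10 mΩ nominal value, 50 ppm/°C TCR, and 40 °C rise are illustrative assumptions:

```python
def resistance_at(r_nominal, tcr_ppm_per_c, delta_t_c):
    """First-order TCR model: R(T) = R0 * (1 + TCR * dT)."""
    return r_nominal * (1 + tcr_ppm_per_c * 1e-6 * delta_t_c)

# 10 mOhm resistor, 50 ppm/degC TCR, 40 degC temperature rise
r_hot = resistance_at(0.010, 50, 40)
print(f"shift: {(r_hot / 0.010 - 1) * 100:.2f} %")  # shift: 0.20 %
```

A 0.2 % shift may sound small, but it can exceed the initial tolerance of a precision part, so it cannot simply be ignored.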

The above analysis is usually an iterative process. That is because the resistance change affects the current flow, which in turn affects self-heating, which again affects resistance, and so on.
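That feedback loop can be resolved with a simple fixed-point iteration. The sketch below is a minimal model under stated assumptions: a voltage source driving a load resistance in series with the sense resistor, a single thermal resistance converting dissipation to temperature rise, and illustrative numbers throughout:

```python
def settle_resistance(r0, tcr_ppm, v_source, r_load, theta_c_per_w,
                      t_amb=25.0, iters=20):
    """Iterate current -> dissipation -> temperature -> resistance until stable."""
    r = r0
    for _ in range(iters):
        i = v_source / (r_load + r)                   # current through the sense path
        p = i ** 2 * r                                # self-heating dissipation
        t = t_amb + p * theta_c_per_w                 # temperature via thermal resistance
        r = r0 * (1 + tcr_ppm * 1e-6 * (t - t_amb))   # TCR-shifted resistance
    return r, i

# Illustrative case: 10 mOhm, 200 ppm/degC, 5 V source, 0.5 Ohm load, 20 degC/W
r_final, i_final = settle_resistance(0.010, 200, 5.0, 0.5, 20.0)
print(f"settled R = {r_final * 1e3:.4f} mOhm, I = {i_final:.3f} A")
```

For well-behaved (low-TCR) parts the loop converges in a handful of iterations; the settled resistance is slightly above nominal, and the measured current must be corrected accordingly.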

Therefore, the current-sensing accuracy depends on three considerations: the initial resistor value and tolerance, the TCR error due to ambient-temperature change, and the TCR error due to self-heating. To avoid the iterative calculations, vendors offer resistors with very low TCR.
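Those three considerations can be rolled into a simple worst-case error budget. The function below (its name and the example figures are illustrative assumptions) sums the terms linearly, which is the pessimistic bound; some designers instead root-sum-square uncorrelated terms:

```python
def worst_case_error_pct(tolerance_pct, tcr_ppm, dt_ambient_c, dt_selfheat_c):
    """Worst-case linear sum of the three sense-resistor error terms, in percent."""
    tcr_pct_per_c = tcr_ppm * 1e-4  # 1 ppm/degC = 1e-4 %/degC
    return tolerance_pct + tcr_pct_per_c * (dt_ambient_c + dt_selfheat_c)

# 0.5 % tolerance, 50 ppm/degC, 30 degC ambient swing, 20 degC self-heating rise
print(f"{worst_case_error_pct(0.5, 50, 30, 20):.3f} %")  # 0.750 %
```

Here the two TCR terms add a third again to the initial tolerance, showing why a very low TCR part can matter more than a tighter tolerance grade.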

These resistors are precision, specialized metal-foil types. Manufacturers make them from alloys of copper, manganese, and other elements, using special production techniques to manage and minimize TCR. To reduce self-heating and improve thermal dissipation, some manufacturers add copper to the mix.

Instrumentation applications demand the ultimate measurement precision. For these, manufacturers offer very-low-TCR resistors along with fully characterized curves of resistance versus temperature. The shape of the curve depends on the alloy mix and is typically parabolic.
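When the vendor supplies such a characterized curve, it can be applied as a second-order correction. The sketch below uses a generic parabolic model; the coefficient values are illustrative assumptions, and real coefficients must come from the vendor's characterization data:

```python
def r_of_t(r_ref, a, b, t_c, t_ref=25.0):
    """Parabolic R(T) model: R(T) = R_ref * (1 + a*(T - T_ref) + b*(T - T_ref)**2).
    Coefficients a (1/degC) and b (1/degC^2) come from vendor characterization."""
    dt = t_c - t_ref
    return r_ref * (1 + a * dt + b * dt ** 2)

# Illustrative coefficients for a 10 mOhm foil resistor, evaluated at 125 degC
r125 = r_of_t(0.010, 1e-6, -2e-8, 125.0)
print(f"R(125 degC) = {r125 * 1e3:.4f} mOhm")
```

Because the parabola's linear and quadratic terms can partially cancel near the reference temperature, such parts hold their value remarkably flat over the usual operating range.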