[lm-sensors] Using hwmon in-kernel

Matthew Garrett mjg59 at srcf.ucam.org
Sun Oct 19 20:02:57 CEST 2008


On Sun, Oct 19, 2008 at 06:20:00PM +0200, Jean Delvare wrote:

> If you instead refer to a board-specific offset that should be applied
> to compensate for the distance between the thermal sensor and the
> graphics core, or for a non-standard thermal diode, the lm90 driver
> exposes attribute temp2_offset so user-space can set and read the
> temperature offset.

Right. My kernel driver is in the privileged position of knowing 
precisely what offset should be applied to the lm90 readings, so doing 
this in-kernel would be advantageous :)

> Why do you want to retrieve the temperature value from the kernel?
> Please explain your use case.

I'm implementing power management for GPUs. These typically have several 
different performance constraints, but one of them is chip temperature. 
The maximum supported temperature is generally exported via tables in 
the graphics card BIOS, so it's necessary for the kernel driver to be 
aware of the current temperature in order to limit the available 
performance modes to ensure the GPU stays within its thermal envelope.

-- 
Matthew Garrett | mjg59 at srcf.ucam.org



