InsanelyMac Forum

Pietruszka

Members
  • Content count: 89
  • Joined
  • Last visited

About Pietruszka
  • Rank: InsanelyMac Protégé

Profile Information
  • Gender: Male
  1. Pietruszka

    HWSensors

    Hi kozlek, can you tell me where to look for the "get NVIDIA GPU temp" functions? I could find them in the nvclock code, but not in the nouveau project. Is it still nv50.cpp for an 8800GS card, as in the nvclock project, or has it been changed? Thank you very much.
  2. Pietruszka

    HWSensors

    GPU TEMP 239C. I will try to open the nvthermaldiode source code with Visual C++ and find which part of the code handles my GPU. Update: VC++ 2010 can't open the project (it can't upgrade it); I will try an older version.
  3. Pietruszka

    HWSensors

    No luck, GPU TEMP 0C. Sorry kozlek, my mistake... too many tests.

        NV_20400 : 0000002f (hex, DEC 47)
        NV_20400 : 00000030 (hex, DEC 48)

    These readings are from a GT230 NVIDIA card (the same G92 architecture). With my 8800GS, NV_20400 reads 00000000. I need to find out how RivaTuner gets the GPU temp. Thank you.
    Update: RivaTuner shows the GPU temp because it uses nvthermaldiode.dll. There is source code for it, but I don't know if it can help you, kozlek.
    Update 2: nvthermaldiode.dll is the only way to see the GPU temp on 90% of 8800GS cards; there is something strange with the built-in thermal diode, so don't bother, kozlek.
  4. Pietruszka

    HWSensors

    Thanks, I will try after work. With RivaTuner (Windows) I see something like this:

        NVIDIA graphics processor registers:
        NV_20400 : 0000002f (hex, DEC 47)
        NV_20400 : 00000030 (hex, DEC 48)

    Thanks for your help and your time. I will check the new version.
  5. Pietruszka

    HWSensors

    I think you are so close, kozlek. With this kext (the same with the latest installer) there is a GPU TEMP!... but it shows 186C. This is from the kernel log:

        GeForceX: VBIOS successfully read from PRAMIN
        Apr 26 20:52:01 localhost kernel[0]: GeForceX: BIT VBIOS found
        Apr 26 20:52:01 localhost kernel[0]: GeForceX: detected an NV50 generation card (0x092300a2) with 384Mb of GDDR3 memory (8)

    I attached an ioreg dump. I know nvclock is not supported, but with nvclockx:

        G84 arch  -> 0C
        G92 arch  -> 139C
        NV50 arch -> 18C
        NV50 with the correction posted earlier -> 48C (OK)

    I don't know how else I can help... I know you've got other things to do... one more time: this is a "MUST HAVE" tool. THX
    Mac Pro (Pietruszka).zip
  6. Pietruszka

    HWSensors

    Hi, 8800GS here again. GPU, shader and memory clocks are OK!!!! But still no GPU TEMP. Some info from the kernel log:

        GeForceX: detected an NV50 generation card (0x092300a2)
        Apr 25 21:23:17 localhost kernel[0]: GeForceX: VBIOS successfully read from PRAMIN
        Apr 25 21:23:17 localhost kernel[0]: GeForceX: BIT VBIOS found
        Apr 25 21:23:17 localhost kernel[0]: GeForceX: 384Mb of GDDR3 (8)

    VBIOS enabled, but my card's id is (0x060600a2) - the NVIDIA 0x10de 0x0923 id looks like something else entirely, a webcam's id??? This tool is great!!!! THX
  7. Pietruszka

    HWSensors

    Hi. Without VBIOS: GPU temp 0. With VBIOS: GPU temp 0. No KP... my card is kind of magic. I will try to investigate your new code... thanks.
  8. Pietruszka

    HWSensors

    Hi kozlek, can you make a small exception for my 8800GS??? No more KP. It's a simple IF...ELSE construction: if it's my card's device id, use the NV50 temp path, else the normal G92 one. Here is what I've changed.

    info.cpp:

        case 0x600: /* G92 */
            arch = G92; // CHANGED for G92 PALIT 8800GS
            break;
        case 0x610: /* G92 */
            arch = G84;
            break;

    nv50.cpp (change no. 1), in static int nv50_get_gpu_temp(void *sensor):

        ...
        slope = 430.0/10000.0;
        if (nv_card->device_id == 0x606) { // ADDED for G92 PALIT 8800GS
            correction = nv_card->bios->sensor_cfg.temp_correction<<2; // ADDED for G92 PALIT 8800GS
        } // ADDED for G92 PALIT 8800GS
        if(nv_card->debug)
        ...

    nv50.cpp (change no. 2), in the temperature monitoring section (all NV50 cards feature an internal temperature sensor, but it is only used when there is no I2C sensor around):

        ...
        else if((nv_card->arch & G92) && !(nv_card->caps & GPU_TEMP_MONITORING)) {
            /* Nearly all G92 boards use an ADT7473 except some Asus models.
               They don't use the bios data properly, so give it its own function */
            nv_card->caps |= GPU_TEMP_MONITORING;
            nv_card->sensor_name = (char*)STRDUP("GPU Internal Sensor", sizeof("GPU Internal Sensor"));
            if (nv_card->device_id == 0x606)  // CHANGED for G92 PALIT 8800GS
            {                                 // CHANGED for G92 PALIT 8800GS
                nv_card->get_gpu_temp = (int(*)(I2CDevPtr))nv50_get_gpu_temp; // CHANGED for G92 PALIT 8800GS
            }                                 // CHANGED for G92 PALIT 8800GS
            else                              // CHANGED for G92 PALIT 8800GS
                nv_card->get_gpu_temp = (int(*)(I2CDevPtr))g92_get_gpu_temp;
        ...

    This could solve the problem with future HWSensors updates for G92 0x606 cards. Thank you for your help.
    8800GS_0x606.zip
  9. Pietruszka

    HWSensors

    Hi kozlek, this is a .zip with the changes for my Palit 8800GS. I sent you a PM with more info. Thank you. Archiwum.zip
  10. Pietruszka

    HWSensors

    OK, everything works... GPU temp 48C (idle), as it should be. I just made the changes from post #320 (some fight with Xcode). It would be nice to write more universal code... I need to patch nvclockx after every HWSensors update. Thank you for all your help, great work!!!
  11. Pietruszka

    HWSensors

    Hi, can someone help with compiler errors??? Still fighting with the Palit 8800GS GPU temp. Thanks for any help.

        else if (nv50_get_gpu_temp(0)>0) {
            nv_card->sensor_name = malloc(64); // Use of undeclared identifier 'malloc'
            sprintf(nv_card->sensor_name,"NV50 GPU Internal Sensor (correction=%d)",
                    nv_card->bios->sensor_cfg.temp_correction<<2);
            nv_card->get_gpu_temp = (int(*)(I2CDevPtr))nv50_get_gpu_temp;

    http://dl.dropbox.co...532/compile.png
    Update: OK, no errors now, but a KP with nvclockx during boot. Need to learn more and more.
  12. Pietruszka

    HWSensors

    OK, I think I found something... (compared with this version: http://www.projectos...?showtopic=1246)

    1) info.cpp: there is no case for my PCI id (case 0x606, arch = nv50), and it works.

    2) nv50.cpp, working version:

        static int nv50_get_gpu_temp(void *sensor)
        ...
        slope = 430.0/10000.0;
        // here is something like this (working version)
        if (nv_card->arch & (G92 | GT200)) {
            correction = nv_card->bios->sensor_cfg.temp_correction<<2;
        }
        if(nv_card->debug)
        ...

    3) nv50.cpp, kozlek's version of the temperature monitoring section:

        /* Nearly all G92 boards use an ADT7473 except some Asus models.
           They don't use the bios data properly, so give it its own function */
        nv_card->caps |= GPU_TEMP_MONITORING;
        nv_card->sensor_name = (char*)STRDUP("GPU Internal Sensor", sizeof("GPU Internal Sensor"));
        nv_card->get_gpu_temp = (int(*)(I2CDevPtr))g92_get_gpu_temp;

    Working version:

        /* Nearly all G92 boards use an ADT7473 except some Asus models.
           They don't use the bios data properly, so give it its own function */
        nv_card->caps |= GPU_TEMP_MONITORING;
        if (g92_get_gpu_temp(0)>0) {
            nv_card->sensor_name = (char*)strdup("ASUS GPU Internal Sensor");
            nv_card->get_gpu_temp = (int(*)(I2CDevPtr))g92_get_gpu_temp;
        }
        else if (g84_get_gpu_temp(0)>0) {
            nv_card->sensor_name = (char*)strdup("G84 GPU Internal Sensor");
            nv_card->get_gpu_temp = (int(*)(I2CDevPtr))g84_get_gpu_temp;
        }
        else if (nv50_get_gpu_temp(0)>0) {
            nv_card->sensor_name = malloc(64);
            sprintf(nv_card->sensor_name,"NV50 GPU Internal Sensor (correction=%d)",
                    nv_card->bios->sensor_cfg.temp_correction<<2);
            nv_card->get_gpu_temp = (int(*)(I2CDevPtr))nv50_get_gpu_temp;
        }

    Sorry about the code format, I'm not an expert, just trying to understand the problem. With nvclock -i I see "NV50 GPU Internal Sensor", so the working function for my card is nv50_get_gpu_temp??? If so, it checks temp_correction for the G92 arch; it looks like my card is G92 arch, yet the nv50_get_gpu_temp function shows the right temp??? Or am I wrong? Can I compile the code with my changes using Xcode? Do I just need to download the nvclockx project, make the changes and compile, or is this not so easy for a noob like me? Thanks for any help.
  13. Pietruszka

    HWSensors

    Thanks for your help, but it doesn't work... the GPU temp is still 0. Maybe adding my device id is not enough??? I tried with the latest installer. I'll try to compare some code with this version: http://www.projectosx.com/forum/index.php?showtopic=1246 - with that one the temp is OK, 48C. Thank you for your help.
  14. Pietruszka

    HWSensors

    OK, I've checked with the NVClock Linux port and this is what I got:

        -- General info --
        Card:           G92 [GeForce 8800 GS]
        Architecture:   G92 A2
        PCI id:         0x0606
        Subvendor id:   0x0000
        GPU clock:      576.000 MHz
        Bustype:        PCI-Express

        -- Shader info --
        Clock:          1458.000 MHz
        Stream units:   96 (11110011b)
        ROP units:      12 (1110b)

        -- Memory info --
        Amount:         384 MB
        Type:           192 bit DDR3
        Clock:          848.568 MHz

        -- PCI-Express info --
        Current Rate:   16X
        Maximum rate:   16X

        -- Sensor info --
        Sensor:          NV50 GPU Internal Sensor (correction=32)
        GPU temperature: 48C

        -- VideoBios information --
        Version:        62.92.1f.00.09
        Signon message: GeForce 8800 GS VGA BIOS
        Performance level 0: gpu 575MHz/shader 1438MHz/memory 850MHz/0.00V/100%
        VID mask:       3
        Voltage level 0: 0.95V, VID: 0
        Voltage level 1: 1.00V, VID: 1
        Voltage level 2: 1.05V, VID: 2
        Voltage level 3: 1.10V, VID: 3

    So my card (Architecture: G92 A2, PCI id 0x0606) should use the nv50 (not g84, not g92) get_temp function to get the proper TEMP value. I've found this in info.cpp of nvclockx:

        case 0x600: /* G92 */
        case 0x610: /* G92 */
            arch = G84; //NV50;

    So adding my device id (0x0606) with arch = NV50 should fix my problem??? Can you, kozlek, make a small change in the code of nvclockx? Thanks for your help, great tool.