jkwarras

Ati Radeon HD4850 (10bit)

9 posts in this topic


Hi,

 

I'm not on a hackintosh anymore (I was a hackintosh user a few years ago); I'm using an iMac (ATI Radeon HD4850) and I work in (low-budget) video editing. I wondered if someone in the community knows whether there's a way of doing the following:

 

1) Enable 10-bit display support on Mac OS X (Apple doesn't seem to care about it, even though it's already available on Windows). In my case, with a 10-bit monitor connected via DisplayPort, it's a shame not to be able to use it natively on the Mac. According to the specs, the card should support 10-bit (http://www.amd.com/u...4870-specs.aspx), as it is "Full 30-bit display processed".

 

2) Enable a secondary monitor (a computer display connected via DVI/HDMI/DisplayPort) to be "seen" as a video monitor without an I/O card/box, just attached via DisplayPort to the iMac's graphics card. This would be useful for applications that can only show an accurate full-screen preview on a broadcast video monitor, such as Apple Color and FCP. Premiere CS6 can do that, but others can't; they expect a video monitor.

 

Thanks.


I realize that enabling 10-bit should be almost impossible because, AFAIK, it would have to be done by Apple system-wide. This sucks, because on Windows 7 you can already do this.

 

As for the second one, the ability to change the "display type" to video/TV instead of a desktop monitor: does anyone have any ideas?


UPDATE: I've enabled 10-bit display support on Windows 7 (64-bit), under Boot Camp. I had to softmod the card, installing the FirePro V8700 drivers and the Catalyst Control Center. Then I could enable 10-bit display, and I checked in Photoshop (with 10-bit display also enabled in the application) using a 10-bit grey ramp test PSD, and boom: smooth gradient. It works!

 

If anyone is interested in getting 10-bit output from the iMac (late 2009) video card on Windows 7 (64-bit), this is how you do it:

http://forums.guru3d...ad.php?t=313065

 

So, it's a driver thing. I have 10-bit support on the iMac's card, but only under Windows. If I can softmod it (actually, trick the installer into using the 9456 ID (FirePro V8700) drivers on the 944A (HD 4850)) and it works, can I do something similar in Mac OS X? Is there a way to trick the system into using a FireGL kext on the internal PCIe card? I could reflash the card and change the device ID, but I'd rather not if a softmod is possible somehow.
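For reference, OS X graphics kexts match cards by PCI id strings of the form 0xDDDDVVVV (device id first, then vendor id), so any softmod or fake-id revolves around those two values. A quick sketch of how the ids mentioned above compose (this just prints the strings; it doesn't modify anything):

```shell
# PCI ids from the post: vendor 0x1002 is ATI/AMD,
# device 0x944a is the HD 4850, 0x9456 the FirePro V8700.
# Kext IOPCIMatch strings put the device id before the vendor id.
real_id="$(printf '0x%04x%04x' 0x944a 0x1002)"
spoof_id="$(printf '0x%04x%04x' 0x9456 0x1002)"
echo "real:  $real_id"    # → real:  0x944a1002
echo "spoof: $spoof_id"   # → spoof: 0x94561002
```

A FireGL-style softmod would have to make the driver accept the first string where it expects the second.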

 

PLEASE?


So, anybody? Is there no way to do a softmod of the ATI device ID on a real iMac?

 

I've explored the possibility of reflashing the card under Windows to change the device ID, but apparently that can leave the Mac unbootable.


Bringing this old subject back.

 

Using the Clover bootloader on a USB stick and booting the iMac from it, I've been able to fake the graphics card ID of my HD 4850 (944A) as 9440 (the ATI HD 4870, one of the few 10-bit-capable cards supported under Mac OS X), and I tried to force the "Motmot" framebuffer (which is apparently the framebuffer the 4870 uses). But it doesn't work. It always loads the "Quail" framebuffer, which is tied to the 4850 in an iMac, verified with "grep | ATY".
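For anyone wanting to reproduce the attempt: the Clover settings described above live under the Graphics section of config.plist (the FakeID → ATI and FBName keys are Clover's own). A sketch of setting them with PlistBuddy; the mount path of the Clover USB stick is an assumption, and this is the attempted (non-working) config, not a verified fix:

```shell
# Sketch only: Clover's Graphics section, set via PlistBuddy (macOS).
# Adjust CONFIG to wherever the USB stick's EFI partition is mounted.
CONFIG="/Volumes/CLOVER/EFI/CLOVER/config.plist"
# Fake the 944A (HD 4850) as 9440 (HD 4870); Clover expects 0xDDDDVVVV.
/usr/libexec/PlistBuddy -c 'Add :Graphics:FakeID:ATI string 0x94401002' "$CONFIG"
# Try to force the 4870's framebuffer instead of Quail.
/usr/libexec/PlistBuddy -c 'Add :Graphics:FBName string Motmot' "$CONFIG"
```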

 

gpusniffer reveals that it's using the 4850 OpenGL engine, so I suspect this is only a "cosmetic" change, not a real one. Or I'm failing somewhere in doing it right. :)


Hi, :)

 

The HD 4850 is 256-bit, not 10-bit. ;)

It's related to deep color: 30 bits (1.073 billion colors), often called 10-bit (per channel). :) I have a 10-bit display (an HP DreamColor), which is why I'd like to get 10-bit output from the card.
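The terminology works out like this: "10-bit" counts bits per RGB channel, "30-bit" counts the whole pixel. A quick sanity check of the color counts mentioned above:

```shell
# 8 bits per channel x 3 channels = 24-bit color
echo $((2**24))   # → 16777216  (~16.7 million colors)
# 10 bits per channel x 3 channels = 30-bit "deep color"
echo $((2**30))   # → 1073741824  (~1.073 billion colors, as above)
```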

 

The card is 10-bit capable, and it works in Windows under Boot Camp when it's faked as a FirePro.


Ha!! OK,

 

and "Graphics Mode"="monitor resolution, e.g.: 1280x1024x32@60"

 

so the correct flag is: "Graphics Mode"="1280x1024x32@60" ;) or 1920x1080x32@60 (the @60 is optional)
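Worth noting, though: the 32 in a "Graphics Mode" string like 1280x1024x32 is bits per pixel, not bits per channel, so it corresponds to ordinary 8-bit-per-channel output rather than 30-bit deep color. The arithmetic:

```shell
# 32 bpp = 8 bits each for R, G, B, A -> 8-bit-per-channel output
echo $((32 / 4))    # → 8
# deep color needs 10 bits per RGB channel = 30 bits per pixel
echo $((3 * 10))    # → 30
```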


Ha!! OK,

 

and "Graphics Mode"="monitor resolution, e.g.: 1280x1024x32@60"

 

so the correct flag is: "Graphics Mode"="1280x1024x32@60" ;) or 1920x1080x32@60 (the @60 is optional)

 

Tried that, but it doesn't work. It uses regular 8-bit output.

