
Intel HD Graphics / GMA 5700


sockerkid

1,324 posts in this topic

Recommended Posts

I can't seem to find a thread about switchable graphics in OSX. Does anyone know if switchable graphics works? My laptop has an ATI Radeon 5650M, which is supported, but I can't use it because my BIOS can only be set to Hybrid Graphics, so when OSX boots it loads (or at least tries to load) the Intel HD graphics. Once the Intel HD graphics works, will I be able to switch from the Intel HD to the ATI card after booting?

 

btw, jlp, I have made my donation. Please get the dropbox folder sorted and let me in :)


Done, thanks SaltSachet.

I'll be updating the spreadsheet and sending Dropbox invites at least once a day, but I can get pretty busy at the office, so bear with me.

 

As a side note, I'm here for technical help too and am not the one to answer questions about hardware.

Happy to get this started anyway, and thanks to Alex.


Yes, if someone can verify that the Dell Latitude E4310 does indeed use the GMA 5700, I will chip in as well.

 

Short answer: Yes

 

Long answer: very likely, although I have seen some i3 machines specified with the 4500MHD. Don't look at me...

 

I don't know why any manufacturer would install a 4500MHD in an i3-class machine (or maybe it's just listed wrongly in the specifications).

 

As far as I know, the Arrandale (i3/i5/i7) chips have two dies (separate pieces of silicon): one is the CPU (the i3/i5/i7), and the other is the GPU. The CPU is 32 nm (the latest and greatest process), whereas the GPU is 45 nm. That GPU, as far as I know, is the HD 5700, and there is only one iteration of it.

 

That's unlike the Sandy Bridge class machines, where, depending on the SKU, the GPU can be an HD 3000 or an HD 2000. Nobody has got the HD 2000 working yet (another thing that's cooking my noodle)...

 

So yes, I believe that all i3/i5/i7 CPUs that are not Sandy Bridge, i.e. the 3xxM, 4xxM and 5xxM parts, have the HD 5700.

 

Of course, if I'm wrong, somebody please correct me, lest we gather money from those without these GPUs.

 

As for Dells, ahhhh... that's another story, especially the Latitudes. Getting CPU power management on them requires hacking, yet again...

I have a Latitude E6400, and a Latitude D420 (both working perfectly, Power Management and all - again, lots of hacking required).

 

For these Dells, I had to physically open them up (with a screwdriver, beer, and lots of bad language). Then I had to measure the CPU power rails to verify that I had actually got the C4/C6 states working (again, more bad language).

 

The Dell Vostro 1510 (which I also have) was much easier to hack; I got it working pretty much first time.

 

The E6400 is from the same family as the E4310, and yes, I actually thought of getting one as well. I like the looks of the E4310, but alas, nobody has got the GPU working yet.

 

That's why we're here.

 

Regards,

 

Alex

 

 

I can't seem to find a thread about switchable graphics in OSX. Does anyone know if switchable graphics works? My laptop has an ATI Radeon 5650M, which is supported, but I can't use it because my BIOS can only be set to Hybrid Graphics, so when OSX boots it loads (or at least tries to load) the Intel HD graphics. Once the Intel HD graphics works, will I be able to switch from the Intel HD to the ATI card after booting?

 

btw, jlp, I have made my donation. Please get the dropbox folder sorted and let me in :)

 

This is another thing that has me scratching my head - and my hair is actually beginning to thin - you know, I'm not young anymore.

 

I'll tell you what I know (or think I know).

 

Apple machines with dual GPUs have an electronic switch. We engineers call it a multiplexer (MUX). It's a switch, one you can control electronically, via software.

 

Looking inside AppleGraphicsControl.kext, it's actually quite fancy.

 

You have the Intel GPU (HD 5700, or SNB) and another one (NVIDIA or ATI). Both outputs are wired to this switch (think of VGA cables coming out of two computers into one switch), and the switch then has a single wire to the monitor.

 

Depending on workload, the kext decides whether to use the slow Intel GPU or the fast one (ATI/NVIDIA). It's quite fancy, like I said, because it handles timing as well, so it presumably switches at the right moment (at the bottom of the screen scanline), and there should be no glitches (monitor flickering, picture jumping, etc.) during the switch.

 

Somebody with a REAL MacBook Pro, please confirm this...
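
If somebody does have one handy, a quick sanity check from Terminal would be something along these lines (a sketch; the exact device node names like IGPU/GFX0 depend on the ACPI tables, so treat the grep pattern as an assumption):

# Is the automatic GPU-switching driver loaded?
kextstat | grep -i AppleGraphicsControl

# Which GPUs does IOKit see on the PCIe bus?
ioreg -rtc IOPCIDevice | grep -iE "IGPU|GFX"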

 

From a software (WindowServer) point of view, you render the screen to this 'virtual video card' rather than directly to the Intel or NVIDIA/ATI one, which again makes it seamless. Very nifty.

 

There is another class of PC hardware (Sony VAIO) with a similar setup. A friend has one.

 

It's got a physical switch for "Speed" and "Stamina", i.e. the NVIDIA or the Intel GPU. When you turn the machine on, the BIOS reads the position of the switch and sets up the hardware accordingly (and the DSDT also changes to reflect this!). Every time you want to change GPUs, you switch the machine off first, then flick the switch and turn it back on. This also has a MUX; however, it isn't controllable in real time, since you need to shut down first.

 

Dell also had a similar setup with their Vostro 3300/3400/3500 series. There it was not a physical switch but a BIOS setting (read: turn off and reboot).

 

This class of laptop has no issue with OSX: switch (or set in the BIOS) which GPU you want to use, and Bob's your uncle.

 

So the PC laptop manufacturers began thinking: why pay for a hardware MUX if we don't have to? Why not save some money?

 

So now we have Optimus and ATI's switchable graphics. Not having one of these machines (because they don't work with OSX), I can't tell you for sure, but again, here's what I think.

 

The two GPUs are wired permanently into the PCIe bus, not switched in/out at power-up, i.e. you can see both of them in ioreg. From what I've read, only the Intel GPU has its wire connected to the display, because they want to save money, hence no switch.

 

The tricky part is: how do you get the NVIDIA output to the display?

 

From what I've read, it's copied via PCIe to the Intel framebuffer (memory). So you set up the Intel GPU, then define a 'port', sort of like an overlay, and then you instruct the NVIDIA GPU to write into that section of memory (likely over PCIe), and magic... it's as if you have faster graphics.

 

I doubt this is copied over by the CPU; my guess would be that they have some sort of DMA channel set up.

 

I've read lots of nightmare stories about this: Linux users have a lot of grief with it, and likewise OSX users.

 

In the best case, you get the Intel HD working (since it's the one wired to the display), but the NVIDIA or ATI chip is still there, consuming power and burning up battery life like a cancer, while doing nothing useful whatsoever... nice.

 

Some users can switch off the ATI/NVIDIA hardware via ACPI calls, but nobody (as far as I know) has managed to get the actual switching working right outside of Windows 7.

 

It would be quite a monumental task, I would think, to work out how to copy the NVIDIA/ATI output across to the Intel framebuffer, unless NVIDIA/ATI release programmer documentation. Reverse engineering an NVIDIA driver would be beyond the man-hours of one person; you'd need a distributed team of reverse engineers to take it apart and work it out.

 

Regards,

 

Alex

  • Like 1

Thanks for that reply, Alex Chin, very informative :). I have an HP Envy. It has an option in the BIOS setup menus to set switchable graphics to integrated or discrete, but it doesn't work :/. When set to discrete (dGPU), it just crashes when you save and exit (the screen goes black and the caps lock light flashes). When set to integrated (iGPU), I think it crashes too, but it manages to restart itself, and by then it has reverted to the previous BIOS setting ('Hybrid'/Switchable).

 

Any thoughts?

 

According to what I've read on forums, HP deliberately made it so you can't actually switch away from switchable graphics in the BIOS, even though the option is there when you access the advanced menu. The BIOS is an InsydeH2O...


Thanks for that reply, Alex Chin, very informative :). I have an HP Envy. It has an option in the BIOS setup menus to set switchable graphics to integrated or discrete, but it doesn't work :/. When set to discrete (dGPU), it just crashes when you save and exit (the screen goes black and the caps lock light flashes). When set to integrated (iGPU), I think it crashes too, but it manages to restart itself, and by then it has reverted to the previous BIOS setting ('Hybrid'/Switchable).

 

Any thoughts?

 

According to what I've read on forums, HP deliberately made it so you can't actually switch away from switchable graphics in the BIOS, even though the option is there when you access the advanced menu. The BIOS is an InsydeH2O...

 

I can believe that.

 

It basically boils down to whether yours is one of those laptops built on the principle of 'can we save money by not installing a MUX?'.

 

One way to find out: take your trusty rusty screwdriver and open it up, naturally voiding your warranty (if you're not careful). I'm careful. Never any evidence of this <evil grin>

 

Another dead giveaway: you need special ATI drivers; the stock ones don't work, and you need the vendor-provided (HP) ones.

 

Another giveaway: in the (Windows 7) Device Manager, you see both the Intel and the AMD (ATI) video cards at the same time.

 

For true switchable graphics machines with a MUX, you only see one: either the Intel GMA *OR* the ATI (AMD, as they are called now).
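
On the OSX side (if you can boot at all), the rough equivalent of that Device Manager check is counting how many display-class PCI devices IOKit enumerates. A sketch, assuming the standard PCI display class code 0x030000, which ioreg prints little-endian:

# Two matches = both GPUs are permanently on the bus, i.e. most likely no MUX
# (some discrete GPUs report the 0x038000 "other display" subclass instead, so eyeball the full list too)
ioreg -l -w0 | grep -c '"class-code" = <00000300>'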

 

My guess: yours is the cheap, without-the-MUX variety.

 

In which case you can't switch, because the AMD card is headless, i.e. its video output is not connected to anything. It's wired to fresh air.

 

Probably the BIOS writer was lazy and didn't take out the code for switching. We have a term for people like that in the IT industry.

 

It starts with the letter F and ends with ERS.

 

Alex


How about this: if we haven't raised the total in 3 weeks, then we'll only release the solution to the donors, and to everyone else 6 months after the solution has been found? Maybe this will push people into donating, muahahaaa ^^

 

I had a quick read of the Linux forums regarding your machine earlier.

 

They have grief with the ATI in yours as well, but I think somebody has worked out how to shut it down.

 

With both running, the battery life is about 2.5 hrs, and with the ATI shut down, about 4 hours.
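
For the Linux folk reading along, the shut-down they use is (I believe) the kernel's vgaswitcheroo interface, roughly like this (assuming debugfs is mounted and the driver supports the machine):

# Power off whichever GPU is currently inactive (normally the discrete ATI)
echo OFF | sudo tee /sys/kernel/debug/vgaswitcheroo/switch

# Check the result; the discrete card's line should now read Off
sudo cat /sys/kernel/debug/vgaswitcheroo/switch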

 

Nobody has figured out how to get the ATI working, and from what I've read, the HDMI port is actually hard-wired to the ATI. This means there is no chance of you getting HDMI output on OSX, i.e. you will only have the internal LCD screen on OSX.

 

That being said, if a solution is found, you may end up with split graphics, i.e. HDMI over the ATI for external, and the LCD via the Intel 5700. I know OSX supports multiple video cards.

 

Also, they have grief with the internal LCD brightness on your HP Envy. Initially they got the Intel HD to work, but stuck at full brightness.

 

Somebody has figured out how to dim it (apparently the HP has no ACPI _BCM method). I don't know if this affects you directly, but in my experience with OSX on Intel graphics, the brightness control is independent of the ACPI BIOS.
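
If you want to check your own machine for the same thing, and assuming you have already dumped and decompiled your DSDT to dsdt.dsl, a quick look is:

# Zero means the BIOS never defined an ACPI backlight method at all
grep -c _BCM dsdt.dsl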

 

We shall see; it's too early to tell.

 

As a side note, I did see some Dell 11z i3 machines go for about $280 on eBay. Hopefully, when the time comes, we can get one cheap.

 

Alex


Short answer: Yes

 

Long answer: very likely, although I have seen some i3 machines specified with the 4500MHD. Don't look at me...

 

I don't know why any manufacturer would install a 4500MHD in an i3-class machine (or maybe it's just listed wrongly in the specifications).

 

As far as I know, the Arrandale (i3/i5/i7) chips have two dies (separate pieces of silicon): one is the CPU (the i3/i5/i7), and the other is the GPU. The CPU is 32 nm (the latest and greatest process), whereas the GPU is 45 nm. That GPU, as far as I know, is the HD 5700, and there is only one iteration of it.

 

That's unlike the Sandy Bridge class machines, where, depending on the SKU, the GPU can be an HD 3000 or an HD 2000. Nobody has got the HD 2000 working yet (another thing that's cooking my noodle)...

 

So yes, I believe that all i3/i5/i7 CPUs that are not Sandy Bridge, i.e. the 3xxM, 4xxM and 5xxM parts, have the HD 5700.

 

Of course, if I'm wrong, somebody please correct me, lest we gather money from those without these GPUs.

 

As for Dells, ahhhh... that's another story, especially the Latitudes. Getting CPU power management on them requires hacking, yet again...

I have a Latitude E6400, and a Latitude D420 (both working perfectly, Power Management and all - again, lots of hacking required).

 

For these Dells, I had to physically open them up (with a screwdriver, beer, and lots of bad language). Then I had to measure the CPU power rails to verify that I had actually got the C4/C6 states working (again, more bad language).

 

The Dell Vostro 1510 (which I also have) was much easier to hack; I got it working pretty much first time.

 

The E6400 is from the same family as the E4310, and yes, I actually thought of getting one as well. I like the looks of the E4310, but alas, nobody has got the GPU working yet.

I ran an AIDA report, but it doesn't specify what is in my E4310.

 

 

Windows Video

 

--------------------------------------------------------------------------------

 

 

[ Intel® HD Graphics ]

 

Video Adapter Properties:

Device Description Intel® HD Graphics

Adapter String Intel® HD Graphics

BIOS String Intel Video BIOS

Chip Type Intel® HD Graphics (Core i5)

DAC Type Internal

Driver Date 7/18/2010

Driver Version 8.15.10.2182

Driver Provider Intel Corporation

Memory Size 3861428 KB

 

Installed Drivers:

igdumd64 8.15.10.2182

igd10umd64 8.15.10.2182

igdumdx32 8.15.10.2182

igd10umd32 8.15.10.2182

 

Video Adapter Manufacturer:

Company Name Intel Corporation

Product Information http://www.intel.com/products/chipsets

Driver Download http://support.intel.com/support/graphics

Driver Update http://www.aida64.com/driver-updates

 

 

GPU

 

--------------------------------------------------------------------------------

 

 

[ Integrated: Intel Auburndale/Arrandale Processor - Integrated Graphics Controller ]

 

Graphics Processor Properties:

Video Adapter Intel Auburndale/Arrandale Processor - Integrated Graphics Controller

GPU Code Name Ironlake-M

PCI Device 8086-0046 / 1028-0410 (Rev 02)

Process Technology 45 nm

Bus Type Integrated

GPU Clock 500 MHz

RAMDAC Clock 350 MHz

Pixel Pipelines 4

TMU Per Pipeline 1

Unified Shaders 12 (v4.0)

DirectX Hardware Support DirectX v10

Pixel Fillrate 2000 MPixel/s

Texel Fillrate 2000 MTexel/s

 

Graphics Processor Manufacturer:

Company Name Intel Corporation

Product Information http://www.intel.com/products/chipsets

Driver Download http://support.intel.com/support/graphics

Driver Update http://www.aida64.com/driver-updates

 

------

 

Have you written anything up about getting power management to work? That's a main concern I have running OS X: in Windows I get double the battery life vs OS X. I did add some P-States stuff to the boot.plist, but I'm not sure if it's doing anything.
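
For what it's worth, this is how I check what is actually set at the moment (assuming the usual Chameleon keys are what counts as 'P-States stuff'):

# Print the speedstep-related Chameleon flags, if they exist in the boot plist
/usr/libexec/PlistBuddy -c "Print :GeneratePStates" -c "Print :GenerateCStates" /Extra/org.chameleon.Boot.plist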


I ran an AIDA report, but it doesn't specify what is in my E4310.

 

GPU

 

[ Integrated: Intel Auburndale/Arrandale Processor - Integrated Graphics Controller ]

 

GPU Code Name Ironlake-M

PCI Device 8086-0046 / 1028-0410 (Rev 02)

Process Technology 45 nm

 

If you have been reading my posts, you'll know the HD 5700 is 45 nm.

 

Look in:

 

/System/Library/Extensions/AppleIntelHDGraphicsFB.kext/Contents/Info.plist

 

<key>IOPCIPrimaryMatch</key>

<string>0x468086 0x428086</string>

 

That matches your PCI Device. Looks like you're a good candidate.
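
If anyone else wants to double-check their own machine from a running install, something like this will do (a sketch; 0x0046 is the device id from the AIDA report above, and 0x468086 is just that id glued to Intel's 0x8086 vendor id):

# What device-id does IOKit report for the integrated graphics?
ioreg -l -w0 | grep -i "device-id"

# And does the Ironlake framebuffer kext claim that id?
grep -A1 IOPCIPrimaryMatch /System/Library/Extensions/AppleIntelHDGraphicsFB.kext/Contents/Info.plist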

 

Myself, I have always found Dells a pretty good bet for hackintoshing (I own three). Primarily this is because their brightness control is handled via SMM and the BIOS, which means you can *always* control the brightness on these Dells with Fn-<blue sun icon>, regardless of OS, be it Linux or DOS or whatever under the sun.

 

Some others (Toshiba and Sony, I'm looking at you) handle it through their own vendor-specific software. Without that software, it can be hard to control the brightness. See my post about the HP above.

 

Have you written anything up about getting power management to work? That's a main concern I have running OS X: in Windows I get double the battery life vs OS X. I did add some P-States stuff to the boot.plist, but I'm not sure if it's doing anything.

 

No, I have not; that's also beyond the scope of this thread.

 

However, if it's working, the dead giveaway is that you get battery life equal to Windows. In rare cases you can get even more; I've managed that on an EeePC 901 SSD 20G and an HP 2510p. If the BIOS is broken, you can patch the DSDT manually to enable specific C-states of your CPU that your BIOS vendor didn't expose. Sometimes BIOS writers are lazy and don't write the C-states correctly. There is a term for that kind of person in the IT industry; read my previous posts.
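
If you do want to go down that road, the usual round trip looks like this (a sketch; iasl is Intel's ACPI compiler, and dsdt.aml stands in for whatever dump your extractor produced):

# Decompile the dumped table into editable source
iasl -d dsdt.aml

# ...edit the _CST / _PSS packages in dsdt.dsl by hand...

# Recompile and drop the result where your bootloader picks it up (e.g. /Extra/DSDT.aml for Chameleon)
iasl dsdt.dsl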

 

Alex


I own a Lenovo Y560 with an i3 350M and an ATI 5730M. I think I have a hardware MUX, because under 10.6 Snow Leopard I managed to get to the Dashboard, with GL enabled and switchable graphics enabled, simply by following the steps posted in post no. 1. As far as I understand it, this would not be possible if I had normal (MUX-less) switchable graphics. Also, in Windows 7, when I switch graphics I get a black screen for a few seconds.


Alex Chin, I actually had a dream last night that my old laptop had a MUX I could use in my HP Envy. I woke up and had to think to myself, 'Oh my god, does my old laptop have a MUX?! Oh, wait, it was just a dream :P'

 

Alex, I am willing to trust you -_- Don't listen to that dickhead... Listen to us, your loyal admirers :)


Maybe off-topic, sorry for that.

I'll be glad if it's helpful for anyone, and sorry again if it's already been discussed. It's possible to use the discrete graphics card (NVIDIA) on laptops with Optimus technology in Linux. Although there isn't automatic switching, it's better than not using the card at all. Applications need to be run from the command line. Here is the project:

 

https://github.com/Bumblebee-Project/Bumblebee

Some explanatory info from the creators:

"In order to do that, we set up a second X server running on the nVidia card (the default one runs on the Intel integrated graphics processor), run graphical applications in it, and then transfer frames from this one to the normal one using VirtualGL."

 

I'm using it on my Acer AS5742G (full specs) with Ubuntu 11.10.
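
Typical usage once the Bumblebee daemon is set up looks like this:

# Run a single OpenGL application on the nVidia card via Bumblebee/VirtualGL
optirun glxgears

# Anything started without optirun stays on the Intel IGP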

As for the donation, I'm not sure it's possible from my country, so sorry about that.

 

If there is anything to test, I think I can help; I'm currently running 10.7.2 with all video kexts (ATI, NVDA, Intel) deleted.


Seriously dude, don't expect people to believe you know {censored} before you provide any proof of your work... it just seems plain silly that you asked for donations before providing {censored}... actually, that was a bright idea, because insanelymac is full of {censored}heads who are willing to sodomize themselves with a .50 caliber sniper rifle in exchange for anything that may make them feel like there's hope... seriously... if by any chance you do know your {censored}, that's fine, just don't ask for donations before you provide actual proof of your work.

I can say whatever I want on the internet, and so can you, but that doesn't mean it's the truth. I apologize if that really is your work, but the way you're doing it just looks like you're trying to scam people...

 

So our troll is back... Who is this guy anyhow?

 

I do thank everybody here who believes in the project.

 

I can surely tell you one thing... I am not pleased (and that's putting it mildly). John himself is disheartened.

 

He sent me an email saying he'll just tally up the list once daily, but not promote this. Looking at the responses so far, I'm also thinking that the interest in the project is not as big as we thought it would be.

 

I thought we'd be able to get plenty of people to chip in; perhaps everybody has bought Sandy Bridge machines like I have, or there aren't that many Arrandale hacks out there.

 

I am going to say that it just takes one bad Apple (pun intended) to spoil it for everybody.

 

As John has said, why would a scammer use his real name and post on a public forum, all for a cheap laptop? I have to agree with him.

 

I actually sent him an email earlier, because if I do manage to make this work, it potentially unlocks the ability to convert every PC laptop sold in the past two years (they are all i3/i5/i7) into a hack. I'm sure a certain fruit company won't be happy about that.

 

They'd either hire me or sue me... and the sue part I'm not so crazy about...

 

As for the troll, fgtmoron - why don't you post your grievances out in the forum for the community to see?

 

Yeah, you'll probably get booed off the stage. It takes people like John and me to put things out there and find followers...

And it takes just one person like you to spoil it for everybody. John certainly has not much to gain, and likewise for myself. All this over a cheap laptop? Surely...

 

John even said that unless I am *THAT* keen on hacking this, there is no point in me cancelling the project and funding it myself, unless I want another laptop. He's right. Why would I want to waste effort on an older-class machine that I don't even have?

 

I tell you, post here and surely you'll get flamed... I bet there are many people who want a solution, and by the looks of it, if you keep this trolling up, maybe only the 20 investors will ever see the solution... those who believe in me and John...

 

Like Leon Hong said, he can afford more, but wants to give other people an opportunity.

 

Luckily I made one of the terms a vote; you had better hope that the investors, John and I vote to release it for free if it's successful...

 

Regards,

 

Alex

 

Alex Chin, I actually had a dream last night that my old laptop had a MUX I could use in my HP Envy. I woke up and had to think to myself, 'Oh my god, does my old laptop have a MUX?! Oh, wait, it was just a dream ;)'

 

Alex, I am willing to trust you :D Don't listen to that dickhead... Listen to us, your loyal admirers :D

 

The thing with MUXes is that it's basically just that: an electrical multiplexer. They are truly one of man's greater inventions, and also a pain in the posterior region.

 

I can tell you this from the perspective of someone who has designed PCBs (me). What you need to do is take the outputs, LVDS for example. Going from memory, which may be hazy, LVDS is 3 pairs (6 wires) per channel: each pair is a 2-wire current loop, and the bits are transferred over the 3 pairs. Dual channel (for panels above a certain resolution, 1440x900 or something like that) requires more wires, so 12 wires I believe.

 

So now you have this chip with 12 wires for one GPU: 12 wires in from Intel, another 12 wires in from NVIDIA, and 12 wires out to the LVDS panel. And that's not even counting HDMI, DVI and VGA...

 

Never mind having all these wires; you have to route that many wires from all over the motherboard, where the NVIDIA chip is here (finger pointing to one corner), the Intel chip is there (pointing to another corner), and the LVDS socket is here (pointing to another section of the board). And how about the VGA socket and the HDMI socket...

 

You get the idea.

 

Now you need to run these wires (we call them tracks) all over the motherboard and ensure signal integrity as well (so you follow rules like: they can't be too close, they must run in parallel, etc.). All these tracks lead to a tiny chip on the board the size of a poofteenth of an inch square.

 

So let me ask you: is it worth the effort, or do you just put the NVIDIA chip in, not bother wiring it up to the displays, and call it OPTIMUS?

 

Maybe Apple will think the same, and build their machines this way too.

 

The main issue with the OPTIMUS solution (it's also called hybrid graphics) is that you need to copy data back and forth between the GPUs, which accounts for a performance loss. I believe AnandTech or some other website benchmarked it at 5-10%. That's why real gaming laptops (I have a friend with one) don't use this technology.

 

Regards,

 

Alex


Thanks for the vivid illustration, Alex. Here's what I'm thinking: do we Ironlake-M users really need the MUX at all? We already have the direct hard-wired output to the LVDS panel in the first place, even without running OS X. So a possible explanation is that OS X misunderstands our hardware configuration and redirects output to the wrong (NVIDIA) card, causing the black screen. I found the same thing happening when upgrading an NVIDIA 9650 machine from 10.7.0 to 10.7.2: it would get a black screen right after the post-upgrade reboot, yet you could access it remotely and see everything in the right place, just like what happens on Ironlake-M. After some DSDT editing, the NVIDIA 9650 came back to fully working again, without any extra kexts or Chameleon GraphicsEnabler options!! Changing the DSDT "device-type" string from "NVDA,GeForce" to "NVDA,Parent" does the magic!!

 

I cannot find the correct string for Intel, something like "Intl,Parent" or "Gen575,Parent", anywhere on the web so far. Does anybody have a good idea?
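
In case it helps anyone reproduce the NVIDIA fix while we hunt for the Intel string, the edit amounts to a one-line substitution in the decompiled DSDT before recompiling (a sketch; it assumes your _DSM injects device-type as a plain string, and uses the BSD sed syntax from OS X):

# Swap the injected device-type and rebuild the table
sed -i '' 's/NVDA,GeForce/NVDA,Parent/' dsdt.dsl
iasl dsdt.dsl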


The issue with the Toshiba HD 3000 is getting the internal screen to work. I do have HDMI output, with full resolution and acceleration, but I can't get the laptop screen to work if the framebuffer loads. Editing connector tables, DSDT patches, etc. don't work. Without the framebuffer it's the same old 1024x768 screen with no acceleration. I've spent more hours than I care to count looking for the solution.


Alex, if you have the only working Sandy Bridge HD 3000 laptop, then why are those graphics supported by Chimera?

 

I direct you to:

 

http://www.tonymacx86.com/viewtopic.php?f=170&t=28471

 

Starts Aug 09

 

http://www.tonymacx86.com/viewtopic.php?f=...26&start=40

 

To..

 

Oct 23.

 

Also:

 

http://www.insanelymac.com/forum/index.php?showtopic=261043

 

I've had this working since early August. Video proof is uploading now.

 

Regards,

 

Alex


Okay, I did some testing today.

 

I did a clean install of Lion 10.7.2 and injected the os-info string of a MacBookAir4,2.

 

Now the external monitor connected via VGA is recognized, and I have a transparent menu bar in System Preferences (see the screenshot).

 

The internal LCD is still black.

  • Like 1

Guest 123osx867
I suggest to all the people involved here that they STOP now.

 

Best regards.

 

 

What you did there is censorship. If you really want to put an end to this, I suggest you delete the posts of everyone involved; you clearly did it only to the people against it. So either restore our posts and our posting permissions, or actually delete everyone else's posts too.

Also, may I know why you deleted my post and merely edited shepdog's?

 

My previous post contained no offensive content, and the same goes for shepdog's post, so may I know why you deleted them? You are not willing to respond to PMs, so I made this account. I realize this may be the last post on this account, as it will probably get deleted for no reason and no clarification will be given. Mod, stop playing god. I'm pretty sure someone talked about freedom of speech in this thread at some point... you may as well hunt that down, revoke his posting permissions and delete his post too.

 

Thank you all.


What you did there is censorship.

 

No, it isn't. I have just chimed in, but regardless of who is right or wrong, you have broken all our rules.

You do not retaliate, especially in the way you did. You do not create multiple accounts.

I have banned you as fgtmoron and I am going to delete all your other accounts but one; you can use funnybot now (provided, of course, that you behave yourself).

Then I'll look into the other users' responsibility and, believe me, if they are in the wrong I won't be soft.

And next time, use the report function instead. If you don't get a satisfactory response, PM me.


Hi everybody, I have been watching this thread because I have a GMA 5700. I plan on donating when I get paid on Friday. So what's the pot at? I look forward to Alex giving it a shot. I for one think $10.00 to $20.00 is about the price of one pizza and totally worth it even if he doesn't succeed, because then at least we are that much closer to knowing whether it's possible or not. Me, I personally hate not knowing, and I'll gladly contribute a little just to know.


This topic is now closed to further replies.