Computer Guru Posted May 16, 2009

Hi all,

I have some very basic semaphore code that works great on Linux, but I cannot for the life of me get it to run properly on OS X. It returns the oddest of results:

    #include <iostream>
    #include <fcntl.h>
    #include <stdio.h>
    #include <semaphore.h>

    int main() {
        sem_t* test;
        test = sem_open("test", O_CREAT, 0, 1);
        int value;
        sem_getvalue(test, &value);
        printf("Semaphore initialized to %d\n", value);
    }

Compiling this on OS X with g++ gives the following output:

    iQudsi:Desktop mqudsi$ g++ test.cpp
    iQudsi:Desktop mqudsi$ ./a.out
    Semaphore initialized to -1881139893

Whereas on Ubuntu, I get the decidedly more sane result:

    iQudsi:Desktop mqudsi$ g++ test.cpp -lrt
    iQudsi:Desktop mqudsi$ ./a.out
    Semaphore initialized to 1

I've been at this for 3 hours straight and cannot figure out why OS X is returning such bizarre results. I've tried using file paths as the semaphore name; it didn't make a difference. Any ideas?
realityiswhere Posted May 19, 2009

Guess you found your answer: http://forums.macosxhints.com/showthread.php?t=101647