Bizarre Semaphore Behavior



Hi all,

 

I have some very basic semaphore code that works great on Linux, but I cannot for the life of me get it to run properly on OS X. It returns the oddest of results...

 

#include <iostream>
#include <fcntl.h>
#include <stdio.h>
#include <semaphore.h>

int main()
{
    sem_t* test;
    test = sem_open("test", O_CREAT, 0, 1);

    int value;
    sem_getvalue(test, &value);
    printf("Semaphore initialized to %d\n", value);
}

 

Compiling and running this on OS X with g++ produces the following output:

iQudsi:Desktop mqudsi$ g++ test.cpp
iQudsi:Desktop mqudsi$ ./a.out 
Semaphore initialized to -1881139893

 

Whereas on Ubuntu, I get the decidedly more sane result:

iQudsi:Desktop mqudsi$ g++ test.cpp -lrt
iQudsi:Desktop mqudsi$ ./a.out 
Semaphore initialized to 1

 

I've been at this for 3 hours straight, and cannot figure out why OS X is returning such bizarre results...

 

I've tried using file paths as the semaphore name, but it didn't make a difference.

 

Any ideas?
