Bizarre Semaphore Behavior


Computer Guru

Hi all,

I have some very basic semaphore code that works great on Linux, but I cannot for the life of me get it to run properly on OS X. It returns the oddest of results:

#include <iostream>
#include <fcntl.h>
#include <stdio.h>
#include <semaphore.h>

int main()
{
	sem_t* test;
	test = sem_open("test", O_CREAT, 0, 1);
	int value;
	sem_getvalue(test, &value);
	printf("Semaphore initialized to %d\n", value);
	return 0;
}

Compiling this on OS X with g++ returns the following output:
iQudsi:Desktop mqudsi$ g++ test.cpp
iQudsi:Desktop mqudsi$ ./a.out 
Semaphore initialized to -1881139893

Whereas on Ubuntu, I get the decidedly more-sane result:
iQudsi:Desktop mqudsi$ g++ test.cpp -lrt
iQudsi:Desktop mqudsi$ ./a.out 
Semaphore initialized to 1

I've been at this for 3 hours straight and cannot figure out why OS X is returning such bizarre results.

I've tried using file paths as the semaphore name; it didn't make a difference.

Any ideas?



    Insanely Unreal

Guess you found your answer :D


© 2017 InsanelyMac