pouët.net

So, you learned Objective C...

category: general [glöplog]
Neoneye, what I'm trying to say, AGAIN, is: As it is described above, autorelease doesn't solve anything that couldn't be solved without it.

Simply invent the notion of transferring ownership and you're done. In fact, it's even more consistent than how ObjC seems to handle it now: the alloc class method already DOES transfer ownership of the object to the caller (as you don't have to do a call to retain afterwards). So instead of declaring this behavior an exception and then inventing workarounds for the problems arising from that, one could simply give the reference to the caller (and keep the reference count where it is), and the problem is solved.
added on the 2008-12-01 13:10:31 by kb_ kb_
@kb, that sounds pretty cool. double linked list, maybe I should try that out instead of gc. thanks for the tip.
added on the 2008-12-01 13:10:43 by neoneye neoneye
@kb, agree. ownership can be transferred in many ways.
added on the 2008-12-01 13:24:02 by neoneye neoneye
Quote:
c++
person = NULL;
person->age(); // boom

objc
person = nil;
[person age]; // ok


Quote:
or even better: pointing to a dummy object that does nothing


Maybe it's just me being sort of oldschool here, but isn't that asking for trouble? I mean, as convenient as it is, you still have errors occurring, you just choose to ignore them. It seems to me this would only cure the most obvious symptoms of those errors and further obscure their more subtle implications.

I have the same problem with things like GC. Smart pointers that are able to throw exceptions when they've been abused, reference counting to tell the coder if he's releasing an object prematurely or neglecting to do it altogether, I can relate to that, but I don't feel convinced that curing the surface symptoms of sloppy programming (or worse, poor overall design) with automated GC is a good idea, even in the short term.
added on the 2008-12-01 13:56:26 by doomdoom doomdoom
@command cybord, in objc it's used everywhere as a way to save typing..
Code:
-(void)dealloc {
  if(_cpp_object) delete _cpp_object; // c++
  [_objc_object release];             // objc
  [super dealloc];
}

added on the 2008-12-01 14:12:11 by neoneye neoneye
I just found this:

http://code.google.com/p/cocos2d-iphone/

"cocos2d for iPhone is a framework for building 2D games, demos, and other graphical/interactive applications. It is based on the cocos2d design: it uses the same API, but instead of using python it uses objective-c."
added on the 2008-12-01 14:14:02 by maw maw
/cybord/cyborg/ sorry
added on the 2008-12-01 14:17:37 by neoneye neoneye
Quote:
Maybe it's just me being sort of oldschool here, but isn't that asking for trouble? I mean, as convenient as it is, you still have errors occurring, you just choose to ignore them. It seems to me this would only cure the most obvious symptoms of those errors and further obscure their more subtle implications.


There's only one answer to that: "depends". Normally I'm all for punishing null pointer dereferences, but sometimes it makes sense.

Actual example: Our in-house audio engine does this full reference tracking thing. Which is nice because on PC, every sound that's running holds a reference to itself so that even if you release a sound bank you can be sure that all sounds that are still running aren't cut off but will fade out gracefully. As soon as all sounds have finished playing all the references will be cleared, total count goes to zero and the bank will be purged from memory.

So far, so good, but: On consoles this is a different matter. You've got a quite limited amount of memory and you simply don't want a sound bank you just decided you don't need anymore lying around in memory while some other stuff is already loading. Fragmentation is a bitch.

So, at one point, and because you've got full reference tracking instead of only counting, you as console coder call the sound bank's "shut the fuck down" routine - and it nulls out all references and then deletes itself. But there's a catch: We're talking about a real world project with maybe 15 different coders and hundreds of thousands of lines of code here, so you can't be sure that at the point where you force the bank out of memory the rest of the game code has already let go of all of its references to sound effects, music clips or whatever. Perhaps some guy got the shutdown order wrong, perhaps he simply forgot to release a reference.

Now, in this very case, having a nulled-out reference point at a dummy object that always says "yeah, no, i'm not playing anymore, whatever" might not be the cleanest solution, but in the context that there's a product to ship and you don't want to send your whole gameplay programming team onto a two-week bug hunt that ends in "oops, the error is systemic, that'll take another two weeks to fix for good", it certainly is the better solution. And it doesn't have any negative effects - except for implicitly nurturing programmers' sloppiness by not explicitly punishing them.

added on the 2008-12-01 14:38:05 by kb_ kb_
Perhaps I should add that the stuff above is at least my personal approach to console coding: The game MUSTN'T ever crash. So after making sure you're reasonably bug free it's always good to make the code handle errors as gracefully as possible.

The extreme example is perhaps that in at least one console title I shipped there was a bug that sometimes crashed the GPU (as in once every few days). After very much not finding it for a week I simply hacked together a full GPU reset if it doesn't reply after 1/4 second, effectively reducing the bug to "the game is stuttering for one frame every few days".

And yes, I'm still feeling dirty :)
added on the 2008-12-01 14:52:54 by kb_ kb_
crashing is a terrible invention and because of it lots of safety mechanisms have been invented. and humans are not entirely perfect, either.
1. assert
2. unittests
3. debug/trace logging
4. references
5. harmless null-pointers
6. scripting languages
7. template hell
8. ...

if pouet is the church for coders then its possible to be for(given);

ok, maybe a bit far fetched :-)
added on the 2008-12-01 16:53:24 by neoneye neoneye
Just write the code in 100% asm, and write it correctly in the first place.
added on the 2008-12-01 17:07:36 by kusma kusma
9. profit!
Yep, computers are just binary machines. Either the code works perfectly, or it is bugged.
added on the 2008-12-01 18:08:10 by pmdata pmdata
Quote:
if(_cpp_object) delete _cpp_object; // c++
[_objc_object release]; // objc
[super dealloc];


"The C++ language guarantees that delete p will do nothing if p is equal to NULL. Since you might get the test backwards, and since most testing methodologies force you to explicitly test every branch point, you should not put in the redundant if test (source)."

C++ saves more typing here :)
added on the 2008-12-01 19:06:16 by slack slack
Guys, could you please stop making me feel stupid? Thank you.
added on the 2008-12-01 19:35:34 by maw maw
kb: Well, I get it, but still, the way I've always thought of "good design", even when it's in this fancy new "OOP" context, if you have so little control over your objects that you don't know when they exist and when they don't, then you've made some bad overall design decisions. Keeping track of objects is what things like container/factory classes are for. And it seems to me that especially on limited-memory systems like consoles you'd want the extra control you get from that. But then again, maybe I'm underestimating how efficient automatic GC is in modern implementations? And there could be stuff people aren't telling me (I hate that!), like, does Objective-C try to let you know if you fail to properly release an object, say if two objects that both need to be released end up stuck in memory because they're referencing each other, that sort of thing?
added on the 2008-12-01 20:02:13 by doomdoom doomdoom
Quote:
kb: Well, I get it, but still, the way I've always thought of "good design", even when it's in this fancy new "OOP" context, if you have so little control over your objects that you don't know when they exist and when they don't, then you've made some bad overall design decisions.


I literally LOLed when I read the "fancy new OOP" bit. But yeah, that's demoscene for you :)

As I said, the only "bad design decision" I tried to circumvent with the whole reference tracking stuff is "people making mistakes". Sound effects behave a bit differently than most other stuff you have to deal with in game programming, because they're much more dynamic and get created, deleted, forgotten about and modified all the time, all while it's YOU as game programmer who has a reference to it. And sometimes people forget to check for stale references in all of the many places you would have to check for them.

Point is: Programmers ARE sloppy and make mistakes, even the very best ones such as you and I [<- well hidden irony]. You can deny this simple fact as long as you want, but I prefer a system that handles the occasional game programmer fuckup gracefully any time over a system that throws a null pointer exception while the game is in the console manufacturer's approval process and the tester skipped between two menu screens just in the ONE millisecond that triggered a bug (and therefore passed in-house QA for months).

"good design" is a very important thing to have, but things in the real world sometimes work differently.
added on the 2008-12-01 20:32:25 by kb_ kb_
And if you try to translate the above post into something that at least resembles actual English, you might even understand me! Haha!
added on the 2008-12-01 20:34:23 by kb_ kb_
I don't imagine programmers are perfect ;). And I'm not dismissing things like GC as "an easy way out for the sloppy programmer" or anything. It could well be that GC built into the language is just the natural evolution of what everyone's doing anyway with their own code, same as the way more and more programmers discovered the benefits of grouping together data structures and functions (etc. etc.), which I guess ultimately led to a standard notion of object-orientation that found its way into compilers.

Like I said, I get why you end up with objects that are referenced from all over the place. Haven't done much in the way of sound engines, but I have done physics, and it's more or less the same deal. I'd love to have objects live and die on their own without any containers to answer to. I've even thought of implementing a single super container for all objects, which I guess amounts to implementing my own GC system. The problem is I can't come up with a good enough scheme for it that wouldn't just add a whole other category of potential fuck-ups to the system, unless, that is, I make it extremely strict so that it requires all the connected objects to constantly tell it what they are (and intend to be) doing, which would defeat the purpose of the exercise.

I might just be underestimating modern GC, or the ingenuity of clever people who invent custom pointer classes, but I can't see how something like reference counting actually solves the problem. Just off the top of my head, suppose A references B and wants to release the reference, to hand it over to C. C then registers that it's hanging on to B. Ultimately C is done with B and the system makes sure B is deallocated some time after that.

But, if something happens in the wrong order, the system may or may not delete B after A has released its hold on it. If GC is done periodically, you have an intermittent bug that probably only shows very rarely. And if it does, A would be unaware of that and pass a NULL pointer to C, and C in turn would carry on using it, unaware that it's already deleted because *NULL is treated as a valid (though inert) object. So you'll still be checking that pointers are not NULL and/or you'll have to keep the inner workings of the garbage collector in mind all the time.

Anyway I don't disagree that once an application or game leaves the developers, it should at most log errors and otherwise try to carry on as if nothing happened. Some object that stops respecting gravity is a lot less frustrating to a player than the whole game crashing.
added on the 2008-12-01 21:55:43 by doomdoom doomdoom
Code:
uint* ptr = NULL;
delete [] ptr;


embarrassing, didn't know about this one. thank you @slack.
added on the 2008-12-01 22:52:48 by neoneye neoneye
Doom: Did you even read what I was saying?

First, the A/B/C thing is pretty much easy as pie - as soon as you allow transfer of ownership:

If Function A wants to give a reference to Object X to Function B, it simply DOESN'T release the reference but just passes it on. Symmetrically, B assumes that every reference it gets from another object is from now on its own. Poof, all of the problems you described vanish, and everything in two lines of code LESS. The only thing "problematic", and this only as in "but this isn't how i want to do it!!!11!! *weep* *cut vein* *call ambulance, then pass out because of all the blood*", is that A sometimes needs to increase the reference count itself before passing it on - depending on whether it still needs X or not.

Oh, and custom pointer classes work because the custom pointers manage the reference counts, not the objects using them. Imagine the following (AddRef() and Release() being the refcounting functions of the addressed object like in COM):

(and typed from memory without looking at it twice)

template <typename T> class Ptr
{
private:
T *blah;
static T *dummy;

friend class T;
Ptr(T *ptr) { blah=ptr; if (blah) blah->AddRef(); }

public:
Ptr() { blah=0; }
Ptr(const Ptr &b) { blah=b.blah; if (blah) blah->AddRef(); }

~Ptr() { Release(); }

Ptr& operator= (const Ptr &b) { Release(); blah=b.blah; blah->AddRef();

T * operator->() { if (blah) return blah; else return dummy; }

void Release() { if (blah) blah->Release(); blah=0; }

};

From now on you just need to pass Ptr<whatever> around and the reference counting is taken care of automatically. And with reference tracking (all Ptrs to an object are in a list) you can even forcibly invalidate references and distinguish between strong and weak references easily, thus avoiding the problems arising from circular references.

Simple as that :)
added on the 2008-12-02 00:03:40 by kb_ kb_
Yep, there's a "return *this; }" missing at the end of operator=, and the initial assignment sucks and only works with a factory pattern behind it, but you get the idea. Also, I like posting twice in a row.
added on the 2008-12-02 00:07:17 by kb_ kb_
You do like posting twice in a row, don't you. Please don't misunderstand, I did read what you said. I'm only probing the issue because it's interesting and relevant to me. Smart pointers like that might actually solve some problems I'm having, I mean, if nothing else I could use the reference count to test if my code is really in control of all the pointers it's throwing around (I use many more references than I should. In that respect I'm what you might call an "idiot"). Thanks for sharing. ;)
added on the 2008-12-02 01:20:38 by doomdoom doomdoom
pretty neat impl.
added on the 2008-12-02 07:17:34 by neoneye neoneye
For more details on what kb said have a look at Charles Bloom's explanation:
http://www.cbloom.com/3d/techdocs/smart_pointers.txt
added on the 2008-12-12 13:21:08 by arm1n arm1n
