Over the years, since I was first introduced to this "pattern", this happens once a year or so. It would happen more if I used it more. This most recent case was one where I had broken my personal rule a while back, and it caught up with me. Again.
It's not a totally evil pattern; I see its value (and use it) for things like class var singletons.
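A class-side singleton accessor is the benign case, because the lazy branch depends on nothing but its own variable. A minimal sketch, assuming a class variable named Default and a class-side #default accessor (both names are just illustrative):

default
"Class-side accessor. Default is a class variable; the lazy branch touches nothing but it."
^Default ifNil: [Default := self new]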
The problem I have with it is one of Predictability and Responsibility. It seems like an encapsulated thing to do. You 'grow' the state of an object as different callers call upon it for its services. So the state comes into being only as it's needed. For simple patterns, this works fine:
foo
^foo ifNil: [foo := 42]
and
bar
^bar ifNil: [bar := 18]
No problems. It's hard to imagine what will go wrong at this point. What often happens, though, is that objects evolve over time. Different people come along and maintain them. They do so not with the whole object in mind, but just looking at one view of it. So someone discovers that foo and bar actually have some interplay. And we end up with
foo
foo ifNil: [self useConsistentFooBars ifTrue: [foo := self bar * 10]].
^foo
And then someone later does something like
bar
bar ifNil: [self useConsistentFooBars ifTrue: [bar := self foo / 10]].
^bar
The thing is, you might get away with this for a while. It's quite possible that when this was done, all uses of the object set foo with a setter before either accessor was invoked. So things Just Work(tm). Until later, when someone changes the order in which the object is talked to.
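To make that order dependence concrete, here is a workspace-style sketch. It assumes the two accessors above live on a hypothetical Thing class whose useConsistentFooBars answers true, and that the class has an ordinary foo: setter:

"Order that happens to work: the setter runs before either lazy accessor."
| thing |
thing := Thing new.
thing foo: 70.
thing bar "answers 7"

"Order that breaks: nothing was set first, so #bar lazily sends #foo, #foo lazily sends #bar, and the two recurse until the stack gives out."
| thing |
thing := Thing new.
thing bar

Nothing in the accessors themselves warns the second caller; the requirement that a setter run first is invisible until it fails at runtime.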
In short, as lazy initializers grow in complexity, the odds rise that the object has hidden expectations about how it must be interfaced with. And that is anything but encapsulated. Now you have the internal implementation of the object leaking out in ways that are hard to see or document.
It's an interesting problem, Travis. I've suffered similarly on occasion over the years :-) I've just written up a few thoughts on the topic here: http://www.eighty-twenty.org/index.cgi/tech/smalltalk/declarative-laziness-20110428.html
Don't get carried away... :-)
This has nothing to do with lazy initialization; this is just a plain programming error.
Consider the would-be initialization expressions (cruft removed) in the alternative #initialize method:
foo := bar * 10.
bar := foo / 10.
This is a simple bug calling for a reassessment of the algorithm.
This kind of bug is even benign, since it gets triggered very deterministically (on access) and it hits you the very first time you try what you have worked on.
To tonyg: since this is a bug, I want the exception. Having a system that detects this situation at definition time would be interesting in an academic sense, but not really necessary.
I like Bobby Woolf's old classification of instvars into key, state and cache. I use lazy initialisation for cache variables. I use explicit initialisation for state instvars. I use creation methods to initialise key instvars.
- When time permits, I get rid of lazy initialisation in areas I rewrite whenever I judge the instvars to be key or state, since that is where I expect the pattern to hide or cause bugs.
- When using the pattern with cache instVars, I like to pair them with flush methods that indicate any interdependencies (a small sketch of this follows below).
I find shoving all the instVars I can into this categorisation helps. It makes my initialisation decisions for me in most cases, and a debatable case at least reveals itself as a debatable case.
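As a rough illustration of that split, here is a sketch assuming a hypothetical Invoice class where customer is a key instVar, status is a state instVar, and total is a cache; setCustomer: and computeTotal are assumed helpers:

for: aCustomer
"Class-side creation method: the key instVar is supplied up front."
^self new setCustomer: aCustomer

initialize
"State instVar: explicit default, never lazy."
super initialize.
status := #open

total
"Cache instVar: the one place lazy initialisation is used, because the value is derived and recomputable."
^total ifNil: [total := self computeTotal]

flushTotal
"Paired flush method; anything that invalidates the cached total sends this."
total := nil

The lazy branch in total reaches only recomputable data, so the access-order surprises from the post have no way to creep back in.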