I am trying to understand how the `defer` statement works in Swift 2, because I apparently do not understand it correctly. I have a `postprocess()` method that should only be called once the view is instantiated:
```swift
internal func postprocess() {
    assert(self.node.isViewInstantiated()) // <- this fails with the first approach below
    ...
}
```
I was originally attempting to call it like so:
```swift
public var view: UIView {
    get {
        if !node.isViewInstantiated() {
            defer {
                postprocess()
            }
        }
        return node.view // the node.view getter creates the view
    }
}
```
But this set off the assert. When I changed it to the following, it started working:
```swift
public var view: UIView {
    get {
        if node.isViewInstantiated() {
            return node.view
        } else {
            let result = node.view
            postprocess()
            return result
        }
    }
}
```
(Note that `node.view` is self-instantiating, hence `isViewInstantiated()`.)
Can someone please explain why the `defer` doesn't actually defer?
Swift's `defer` keyword queues up a block to execute when the current scope is exited, which is not necessarily the same as when the function returns. A `defer` block inside an `if` block executes as you leave the scope of that `if` block.
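To see the difference, here is a minimal sketch (the `demo()` function and `log` array are just illustrative names) showing that a `defer` inside an `if` fires when the `if` scope exits, not when the function returns:

```swift
func demo() -> [String] {
    var log: [String] = []
    if true {
        defer { log.append("deferred") }
        log.append("inside if")
    } // <- the deferred block runs here, as the if scope exits
    log.append("after if")
    return log
}
// demo() yields ["inside if", "deferred", "after if"]
```

The deferred append lands *before* "after if", because the `if` scope ends before the rest of the function body runs.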
You could rewrite your code as such:
```swift
public var view: UIView {
    get {
        let shouldPostProcess = !node.isViewInstantiated()
        defer {
            if shouldPostProcess {
                postprocess()
            }
        }
        return node.view // the node.view getter creates the view
    }
}
```
And it should work fine. Now the deferred block is scoped to the getter and executes as it returns, whereas before it was scoped to the `if` block and executed as soon as that scope exited.
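As a rough, self-contained sketch of why this ordering works (the `Node` class, `liveView` property, `fetchView()` function, and `events` array are hypothetical stand-ins for the question's code): the return expression is evaluated first, so the view already exists by the time the deferred block runs:

```swift
final class Node {
    private var view: String?
    func isViewInstantiated() -> Bool { return view != nil }
    var liveView: String {          // stands in for the self-instantiating node.view
        if view == nil { view = "a view" }
        return view!
    }
}

var events: [String] = []
let node = Node()

func fetchView() -> String {
    let shouldPostProcess = !node.isViewInstantiated()
    defer {
        if shouldPostProcess {
            events.append("postprocess, instantiated: \(node.isViewInstantiated())")
        }
    }
    return node.liveView // evaluated before the deferred block runs
}

_ = fetchView()
// events == ["postprocess, instantiated: true"]
```

The key detail is that `shouldPostProcess` must be captured *before* the return expression runs, since evaluating `node.liveView` changes what `isViewInstantiated()` reports.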