Merge pull request #159 from cch123/master

Missed a translation..
chai2010 2015-12-11 14:59:39 +08:00
commit 92ae7fad4f


@@ -82,7 +82,7 @@ func crawl(url string) []string {
 }
 ```
-The second problem is that the program never terminates, even when it has discovered all the links reachable from the initial URLs. (Of course, you're unlikely to notice this problem unless you choose the initial URLs carefully or implement the depth-limiting feature of Exercise 8.6.) For the program to terminate, we need to break out of the main loop when the worklist is empty and no crawl goroutines are active.
+第二个问题是这个程序永远都不会终止,即使它已经爬到了所有初始链接衍生出的链接。(当然,除非你慎重地选择了合适的初始URL,或者已经实现了练习8.6中的深度限制,否则你可能不会注意到这个问题。)为了使这个程序能够终止,我们需要在worklist为空并且没有crawl goroutine在运行时退出主循环。
 ```go