From 195e5d450fdbc653906c9ada64f8f575405c64ad Mon Sep 17 00:00:00 2001
From: Xargin
Date: Fri, 11 Dec 2015 13:03:04 +0800
Subject: [PATCH] =?UTF-8?q?=E6=BC=8F=E7=BF=BB=E8=AF=91?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 ch8/ch8-06.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ch8/ch8-06.md b/ch8/ch8-06.md
index 3fbdc91..d98a573 100644
--- a/ch8/ch8-06.md
+++ b/ch8/ch8-06.md
@@ -82,7 +82,7 @@ func crawl(url string) []string {
 }
 ```

-The second problem is that the program never terminates, even when it has discovered all the links reachable from the initial URLs. (Of course, you’re unlikely to notice this problem unless you choose the initial URLs carefully or implement the depth-limiting feature of Exercise 8.6.) For the program to terminate, we need to break out of the main loop when the worklist is empty and no crawl goroutines are active.
+第二个问题是这个程序永远都不会终止,即使它已经爬到了所有初始链接衍生出的链接。(当然,除非你慎重地选择了合适的初始URL或者已经实现了练习8.6中的深度限制,否则你可能不会注意到这个问题)。为了使这个程序能够终止,我们需要在worklist为空并且没有crawl的goroutine在活动时退出主循环。

 ```go