How to send N GET requests for N > 10 URLs

I am trying to make N GET requests. My code works with 8 URLs, but with 10 it always gets stuck.

I am new to Go, so I cannot figure out what the problem is.

I am trying to write an application that beats a .NET application doing the same task.

Can you suggest what the problem might be?


package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
    "os"
    "time"
)

type HttpResponse struct {
    url      string
    response *http.Response
    err      error
}

func main() {
    fmt.Println("Hello, world3.")

    urls := []string{
        "www.webmagnat.ro",
        "nickelfreesolutions.com",
        "scheepvaarttelefoongids.nl",
        "tursan.net",
        "plannersanonymous.com",
        "saltstack.com",
        "deconsquad.com",
        "migom.com",
        "tjprc.org",
        "worklife.dk",
        "food-hub.org",
    }

    start := time.Now()
    results := asyncHttpGets(urls)

    f, err := os.Create("test.txt")
    if err != nil {
        fmt.Println(err)
        return
    }

    for _, result := range results {
        fmt.Printf("%s status: %s\n", result.url, result.response.Status)

        _, err := f.WriteString(result.url + "\n")
        if err != nil {
            fmt.Println(err)
            f.Close()
            return
        }
    }

    elapsed := time.Since(start)
    fmt.Printf("Elapsed: %s\n", elapsed)

    err = f.Close()
    if err != nil {
        fmt.Println(err)
        return
    }

    fmt.Println("Bye, world2.")
}

func asyncHttpGets(urls []string) []*HttpResponse {
    ch := make(chan *HttpResponse, len(urls)) // buffered
    responses := []*HttpResponse{}
    for _, url := range urls {
        go func(url string) {
            fmt.Printf("Fetching %s \n", url)
            resp, err := http.Get("http://" + url)
            if err != nil {
                fmt.Printf("Failed to fetch %s\n", err)
                return
            }
            defer resp.Body.Close()
            if resp.StatusCode == http.StatusOK {
                bodyBytes, err := ioutil.ReadAll(resp.Body)
                if err != nil {
                    log.Fatal(err)
                }
                bodyString := string(bodyBytes)
                fmt.Printf("HTTP Response Content Length : %v\n", len(bodyString))
            }
            ch <- &HttpResponse{url, resp, err}
        }(url)
    }

    for {
        select {
        case r := <-ch:
            fmt.Printf("%s was fetched\n", r.url)
            responses = append(responses, r)
            if len(responses) == len(urls) {
                return responses
            }
        case <-time.After(50 * time.Millisecond):
            fmt.Printf(".")
        }
    }
}



https://play.golang.org/p/pcKYYM_PgIX


大话西游666

3 Answers

慕尼黑5688855

The first problem here is that you don't return a response in the case of an error, so len(responses) == len(urls) will most likely never match, forcing your loop to continue forever.

First, add a sync.WaitGroup for the concurrent requests:

    var wg sync.WaitGroup
    ch := make(chan *HttpResponse)
    responses := []*HttpResponse{}
    for _, url := range urls {
        wg.Add(1)
        go func(url string) {
            defer wg.Done()

Then you can collect the responses until all the outstanding goroutines have finished:

    go func() {
        wg.Wait()
        close(ch)
    }()

    for r := range ch {
        fmt.Printf("%s was fetched\n", r.url)
        responses = append(responses, r)
    }
    return responses

Then you have to decide how to handle the responses: do you want to read them inside the concurrent calls, or return them with their bodies unread? Since you will always want to attempt to read the body and call Body.Close() if you intend to reuse connections, and since you have already deferred the Close, for now that has to happen inside the same function call. You could change the HttpResponse type to make this possible, or replace resp.Body with a buffer holding the response. Finally, you will want to set some sort of timeout on the client (probably using a Context), and put a limit on the number of concurrent requests being made.
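Assembled into one piece, a minimal sketch of what that advice could look like (reusing the HttpResponse type from the question and adding "sync" to its imports; the 10-second client timeout and sending a result even on error are assumptions added here, not spelled out in the answer above):

    func asyncHttpGets(urls []string) []*HttpResponse {
        // A client timeout keeps one slow server from stalling its goroutine forever.
        client := &http.Client{Timeout: 10 * time.Second}

        var wg sync.WaitGroup
        ch := make(chan *HttpResponse)

        for _, url := range urls {
            wg.Add(1)
            go func(url string) {
                defer wg.Done()
                resp, err := client.Get("http://" + url)
                if err == nil {
                    resp.Body.Close() // close the body so the connection can be reused
                }
                // Send a result even on error, so the collector sees every URL.
                ch <- &HttpResponse{url, resp, err}
            }(url)
        }

        // Close the channel once every request goroutine is done;
        // that ends the range loop below.
        go func() {
            wg.Wait()
            close(ch)
        }()

        responses := []*HttpResponse{}
        for r := range ch {
            responses = append(responses, r)
        }
        return responses
    }

With this shape the caller has to check r.err before touching r.response, because failed fetches arrive with a nil response.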

UYOU

The problem is that you return on error without writing to the channel. Look at your if err != nil { return } statement: because nothing is written to the channel in that case, len(responses) == len(urls) can never become true. The goroutine below reports the failure instead of dropping it:

    go func(url string) {
        fmt.Printf("Fetching %s \n", url)
        resp, err := http.Get("http://" + url)
        if err != nil {
            fmt.Printf("Failed to fetch %s\n", err)
            ch <- &HttpResponse{url, nil, err} // send the error result so the channel still gets one message per URL
            return
        }
        defer resp.Body.Close()
        if resp.StatusCode == http.StatusOK {
            fmt.Printf("HTTP Response Status : %v", resp.StatusCode)
            bodyBytes, err := ioutil.ReadAll(resp.Body)
            if err != nil {
                log.Fatal(err)
            }
            bodyString := string(bodyBytes)
            fmt.Printf("HTTP Response Content Length : %v\n", len(bodyString))
        }
        ch <- &HttpResponse{url, resp, err}
    }(url)
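One consequence of writing failures to the channel (not covered in the answer above): result.response is nil for a failed fetch, so the printing loop in main, which reads result.response.Status unconditionally, would panic. A minimal guard might look like:

    for _, result := range results {
        if result.err != nil {
            fmt.Printf("%s failed: %v\n", result.url, result.err)
            continue // no response to inspect
        }
        fmt.Printf("%s status: %s\n", result.url, result.response.Status)
    }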

MYYA

You could use the following library: Requests, a Go library for reducing the headache of making HTTP requests (20k req/s):

https://github.com/alessiosavi/Requests

It was developed to work around the "too many open files" problem when handling parallel requests. The idea is to build a list of requests and then send them with a configurable "parallel" factor that allows only N requests to run at a time.

Initialize the requests (you already have a set of URLs):

    // This array will contain the list of requests
    var reqs []requests.Request
    // N is the number of requests to run in parallel, in order to avoid
    // "too many open files". N has to be lower than the ulimit threshold.
    var N int = 12
    // Create the list of requests
    for i := 0; i < 1000; i++ {
        // In this case, we init 1000 requests with the same URL, method, body, and headers
        req, err := requests.InitRequest("https://127.0.0.1:5000", "GET", nil, nil, true)
        if err != nil {
            // Request is not compliant and will not be added to the list
            log.Println("Skipping request [", i, "]. Error: ", err)
        } else {
            // If no error occurs, append the request to the list of requests to send
            reqs = append(reqs, *req)
        }
    }

At this point, we have a list containing the requests that have to be sent. Let's send them in parallel!

    // This array will contain the responses from the given requests
    var response []datastructure.Response
    // Send the requests, running N of them in parallel
    response = requests.ParallelRequest(reqs, N)
    // Print the responses
    for i := range response {
        // Dump is a method that prints every piece of information related to the response
        log.Println("Request [", i, "] -> ", response[i].Dump())
        // Or use the data present in the response
        log.Println("Headers: ", response[i].Headers)
        log.Println("Status code: ", response[i].StatusCode)
        log.Println("Time elapsed: ", response[i].Time)
        log.Println("Error: ", response[i].Error)
        log.Println("Body: ", string(response[i].Body))
    }

You can find example usage in the example folder of the repository.
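If you would rather avoid a dependency, the same "at most N in flight" idea that the library implements can be sketched with just the standard library, using a buffered channel as a semaphore (the function below is illustrative, reuses the question's HttpResponse type, and is not part of the Requests library):

    // fetchAll runs the requests with at most n of them in flight at once.
    func fetchAll(urls []string, n int) []*HttpResponse {
        sem := make(chan struct{}, n) // n tokens -> at most n concurrent requests
        ch := make(chan *HttpResponse)
        var wg sync.WaitGroup

        for _, url := range urls {
            wg.Add(1)
            go func(url string) {
                defer wg.Done()
                sem <- struct{}{}        // acquire a slot before dialing
                defer func() { <-sem }() // release the slot when done
                resp, err := http.Get("http://" + url)
                if err == nil {
                    resp.Body.Close()
                }
                ch <- &HttpResponse{url, resp, err}
            }(url)
        }

        go func() {
            wg.Wait()
            close(ch)
        }()

        var responses []*HttpResponse
        for r := range ch {
            responses = append(responses, r)
        }
        return responses
    }

Keeping n below the process's open-file limit (ulimit -n) is what avoids the "too many open files" error the answer mentions.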