We're almost done! Our web crawler now extracts rich data from every page and stores it efficiently in a map. But printing this data to the console isn't very useful for analysis. Let's export it to CSV format so it's easier to read and parse.
For example, from this page data dictionary:
```go
pages := map[string]PageData{
	"blog.boot.dev": {
		URL:            "https://blog.boot.dev",
		H1:             "Learn Backend Development",
		FirstParagraph: "Boot.dev teaches backend development...",
		OutgoingLinks:  []string{"https://boot.dev/courses", "https://boot.dev/about"},
		ImageURLs:      []string{"https://blog.boot.dev/logo.png"},
	},
}
```
We want to create a CSV file with columns: page_url, h1, first_paragraph, outgoing_link_urls, image_urls
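As a rough sketch, the example map above would produce a report along these lines (assuming the page_url column comes from the URL field and multiple URLs are joined with a semicolon, as described below):

```text
page_url,h1,first_paragraph,outgoing_link_urls,image_urls
https://blog.boot.dev,Learn Backend Development,Boot.dev teaches backend development...,https://boot.dev/courses;https://boot.dev/about,https://blog.boot.dev/logo.png
```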
import "encoding/csv"
func writeCSVReport(pages map[string]PageData, filename string) error:
- `pages` is the map returned by your crawler (keys are normalized URLs, values are `PageData` structs).
- `filename` is the CSV file to create (defaults to `"report.csv"` in `main.go`).

The CSV should have a header row with the columns `page_url`, `h1`, `first_paragraph`, `outgoing_link_urls`, and `image_urls`, and use a semicolon (`;`) to separate multiple values in the link and image URL columns.

Here are some tips to get you started (a hedged sketch follows after the steps below):

- Create the file with `os.Create(filename)`.
- Use `csv.NewWriter(file)` to write the data.
- Write the header row with `writer.Write([]string{...})`.
- For each `PageData` in the map, write a row with its fields.
- Join multiple URLs with `strings.Join(page.OutgoingLinks, ";")`.

Once your `writeCSVReport` function is working, call it from `main.go` with `writeCSVReport(cfg.pages, "report.csv")` and verify that `report.csv` is created after running the crawler. Run and submit the CLI tests.
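For reference, here is a minimal sketch of one way `writeCSVReport` could look. It assumes the `PageData` struct shown below (your crawler's field names may differ) and a semicolon separator for the multi-URL columns; the `main` function at the bottom is a hypothetical stand-in for your real crawler, which would pass `cfg.pages` instead.

```go
// A minimal sketch of writeCSVReport, assuming the PageData struct below and a
// semicolon separator for the multi-URL columns. Adjust field names to match
// your own crawler.
package main

import (
	"encoding/csv"
	"fmt"
	"os"
	"strings"
)

// PageData is assumed to look roughly like this in your crawler.
type PageData struct {
	URL            string
	H1             string
	FirstParagraph string
	OutgoingLinks  []string
	ImageURLs      []string
}

// writeCSVReport writes a header row plus one row per crawled page to filename.
func writeCSVReport(pages map[string]PageData, filename string) error {
	// Create (or truncate) the output file.
	file, err := os.Create(filename)
	if err != nil {
		return err
	}
	defer file.Close()

	writer := csv.NewWriter(file)

	// Header row.
	header := []string{"page_url", "h1", "first_paragraph", "outgoing_link_urls", "image_urls"}
	if err := writer.Write(header); err != nil {
		return err
	}

	// One row per page; multiple URLs are joined with ";".
	// Note: map iteration order is not deterministic in Go.
	for _, page := range pages {
		row := []string{
			page.URL,
			page.H1,
			page.FirstParagraph,
			strings.Join(page.OutgoingLinks, ";"),
			strings.Join(page.ImageURLs, ";"),
		}
		if err := writer.Write(row); err != nil {
			return err
		}
	}

	// Flush buffered data and report any write error.
	writer.Flush()
	return writer.Error()
}

func main() {
	// Hypothetical usage: in the real crawler this would be cfg.pages.
	pages := map[string]PageData{
		"blog.boot.dev": {
			URL:            "https://blog.boot.dev",
			H1:             "Learn Backend Development",
			FirstParagraph: "Boot.dev teaches backend development...",
			OutgoingLinks:  []string{"https://boot.dev/courses", "https://boot.dev/about"},
			ImageURLs:      []string{"https://blog.boot.dev/logo.png"},
		},
	}
	if err := writeCSVReport(pages, "report.csv"); err != nil {
		fmt.Printf("error writing CSV report: %v\n", err)
	}
}
```

Returning `writer.Error()` after an explicit `Flush` is a small but important detail: `csv.Writer` buffers its output, so write failures may only surface once the buffer is flushed.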