Auto Generating Blog Feed


📅 Thu, Jul 10, 2025 ⏱️ 4-minute read

I don’t know about you, but I love RSS feeds, and over the years I’ve accumulated various blogs that I follow. However, it’s hard to keep track of them all, since I don’t use an RSS client. So I decided to create a GitHub Action that automatically generates and publishes a reading list from the RSS feeds of the blogs I follow.

Now I can open a single page to see the latest posts from all the blogs I follow, and it updates automatically every night. Feel free to share your favorite software engineering blogs with me, and I’ll add them to the list.

```go
package main

import (
	"fmt"
	"log"
	"os"
	"sort"
	"time"

	"github.com/mmcdole/gofeed"
)

// Some great blogs to follow
var feeds = []RSSFeed{
	{URL: "https://go.dev/blog/feed.atom", Name: "Go Blog"},
	{URL: "https://notes.eatonphil.com/rss.xml", Name: "eatonphil.com"},
	{URL: "https://mcyoung.xyz/feed.xml", Name: "mcyoung"},
	{URL: "https://matklad.github.io/feed.xml", Name: "matklad"},
	{URL: "https://tpaschalis.me/feed.xml", Name: "tpaschalis"},
	{URL: "https://davi.sh/rss.xml", Name: "davi.sh"},
	{URL: "https://www.scattered-thoughts.net/feed.xml", Name: "scattered-thoughts.net"},
}

type Article struct {
	Title     string
	Link      string
	Published time.Time
	BlogName  string
}

type RSSFeed struct {
	URL  string
	Name string
}

func main() {
	allArticles := []Article{}
	fp := gofeed.NewParser()

	for _, feed := range feeds {
		log.Printf("fetching articles from %s (%s)", feed.Name, feed.URL)

		rssFeed, err := fp.ParseURL(feed.URL)
		if err != nil {
			log.Printf("error fetching feed %s: %v", feed.URL, err)
			continue
		}

		// Take at most the 5 most recent items from each feed
		for i, item := range rssFeed.Items {
			if i >= 5 {
				break
			}

			// Prefer the published date, fall back to updated, then to now
			pubDate := time.Now()
			if item.PublishedParsed != nil {
				pubDate = *item.PublishedParsed
			} else if item.UpdatedParsed != nil {
				pubDate = *item.UpdatedParsed
			}

			allArticles = append(allArticles, Article{
				Title:     item.Title,
				Link:      item.Link,
				Published: pubDate,
				BlogName:  feed.Name,
			})
		}
	}

	// Newest articles first
	sort.Slice(allArticles, func(i, j int) bool {
		return allArticles[i].Published.After(allArticles[j].Published)
	})

	markdownContent := `+++
type = "page"
title = "Reading List"
date = "` + time.Now().Format(time.RFC3339) + `"
+++

This page is auto-generated from a GitHub Actions workflow that runs every day at night and fetches the 5 latest articles from each of my favorite blogs.

### Latest Articles

`

	for _, article := range allArticles {
		dateStr := article.Published.Format("2006-01-02")
		markdownContent += fmt.Sprintf(
			"- %s [%s](%s) - %s\n",
			dateStr, article.Title, article.Link, article.BlogName,
		)
	}

	outputPath := "../content/reading-list.md"
	err := os.WriteFile(outputPath, []byte(markdownContent), 0644)
	if err != nil {
		log.Fatalf("failed to write Markdown file: %v", err)
	}

	log.Printf("successfully generated %s", outputPath)
}
```

Daily at midnight, a GitHub Action runs this code and pushes the generated Markdown file to my website repository. A second GitHub Action then builds the website and deploys it to the server.
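The scheduled workflow could look roughly like this. This is a sketch, not my exact setup: the repository layout (generator in a `scripts/` directory), the Go version, and the commit step are all assumptions.

```yaml
name: Update reading list

on:
  schedule:
    - cron: "0 0 * * *" # every day at midnight UTC
  workflow_dispatch: # allow manual runs too

jobs:
  generate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-go@v5
        with:
          go-version: "1.22"

      - name: Generate reading list
        working-directory: scripts # hypothetical location of the generator
        run: go run .

      - name: Commit and push if changed
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add content/reading-list.md
          git diff --cached --quiet || git commit -m "chore: update reading list"
          git push
```

The `git diff --cached --quiet ||` guard keeps the job from failing on nights when no new articles were published.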

Now I have a stable and automated way to keep track of the latest posts from my favorite blogs, and I hope you find it useful too. If you have any suggestions or improvements, feel free to reach out!
