Ask HN: Fastest Crawl of HN Articles

by agencies on 4/29/22, 3:49 PM with 13 comments

HN links to over 6 million URLs in stories and comments. Many of those domains have expired, or the content is no longer available. The Internet Archive has much of the content but throttles requests. What's the fastest way to get the historical content?
  • by arinlen on 4/29/22, 4:06 PM

    HN does have a REST API which is quite easy to use.

    https://github.com/HackerNews/API

    I'm not sure what rate-limiting policy is in place, but in theory you can start with a request for maxitem and from there just GET every item down to zero until you hit some sort of blocker.
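
    A minimal sketch of that walk-down in Python, using the two documented endpoints (maxitem and item); the retries, concurrency, and politeness delays you'd want for 6 million requests are left out:

      import requests

      BASE = "https://hacker-news.firebaseio.com/v0"

      def fetch_item(item_id):
          # GET a single item; the API returns null (None here)
          # for deleted or never-assigned ids.
          resp = requests.get(f"{BASE}/item/{item_id}.json", timeout=10)
          resp.raise_for_status()
          return resp.json()

      # Highest item id currently assigned.
      max_item = requests.get(f"{BASE}/maxitem.json", timeout=10).json()

      # Walk from the newest item id down toward zero.
      for item_id in range(max_item, 0, -1):
          item = fetch_item(item_id)
          if item is None:
              continue
          # ... store item.get("url"), item.get("text"), etc.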

  • by jpcapdevila on 4/30/22, 12:22 AM

    The best way to do it is with Google BigQuery.

    There's a dataset containing everything: bigquery-public-data.hacker_news.full

    You can write SQL and it's super fast. Sample:

    SELECT * FROM `bigquery-public-data.hacker_news.full` LIMIT 1
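
    A hedged sketch of running a query like that from Python with the google-cloud-bigquery client; it assumes you have a GCP project and application-default credentials set up, and that the id/title/url columns exist in the public dataset's schema:

      from google.cloud import bigquery

      # Uses application-default credentials and your default project.
      client = bigquery.Client()

      sql = """
          SELECT id, title, url
          FROM `bigquery-public-data.hacker_news.full`
          WHERE url IS NOT NULL
          LIMIT 10
      """

      # result() blocks until the query finishes, then iterates rows.
      for row in client.query(sql).result():
          print(row.id, row.url)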

  • by python273 on 4/29/22, 7:37 PM