
How to Download New Articles From a List of Websites? Recommendations?

I've run into several dead ends trying to come up with a result from a Google search. Essentially, I have a list of, say, 20 websites, all research institutes that occasionally update their websites/blogs with their latest findings.

I'm trying to either A - find software that can check for new articles and then send me the title and a link to the article, or B - write a script that checks for new articles and then sends me the title and link.

Any suggestions or software recommendations?

You should first see if any of the sites have an RSS feed. That is fairly common, and it will do the work for you.
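If some of the sites do expose feeds, a small script could poll them and print the latest titles and links. This is only a minimal sketch in Python, assuming the feedparser library is installed and that the URLs in FEEDS are placeholders you would replace with the real feed addresses:

```python
# pip install feedparser
import feedparser

# Placeholder feed URLs -- replace with the RSS/Atom feeds of the sites you follow.
FEEDS = [
    "https://example-institute.org/rss.xml",
    "https://another-institute.org/feed",
]

def latest_entries(feed_url, limit=5):
    """Return (title, link) pairs for the most recent entries of one feed."""
    parsed = feedparser.parse(feed_url)
    return [(entry.title, entry.link) for entry in parsed.entries[:limit]]

if __name__ == "__main__":
    for url in FEEDS:
        print(f"--- {url} ---")
        for title, link in latest_entries(url):
            print(f"{title}\n  {link}")
```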

I've built similar things. If the articles have a published date, you could keep a file or database of the articles you have already seen and compare against it on each run to find the new ones.
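One way to do the "keep a file" part is to remember which links have already been seen and only report the difference on the next run. Another minimal sketch, again assuming feedparser; the state file name seen_articles.json and the feed URL are illustrative choices, not anything standard:

```python
# pip install feedparser
import json
import os
import feedparser

SEEN_FILE = "seen_articles.json"                    # hypothetical local state file
FEEDS = ["https://example-institute.org/rss.xml"]   # placeholder feed URL

def load_seen():
    """Load the set of links already reported on previous runs."""
    if os.path.exists(SEEN_FILE):
        with open(SEEN_FILE) as f:
            return set(json.load(f))
    return set()

def save_seen(seen):
    with open(SEEN_FILE, "w") as f:
        json.dump(sorted(seen), f)

def check_for_new():
    """Return (title, link) pairs not seen before, and update the state file."""
    seen = load_seen()
    new_items = []
    for url in FEEDS:
        for entry in feedparser.parse(url).entries:
            if entry.link not in seen:
                new_items.append((entry.title, entry.link))
                seen.add(entry.link)
    save_seen(seen)
    return new_items

if __name__ == "__main__":
    for title, link in check_for_new():
        print(f"NEW: {title}\n  {link}")
```

To actually have the results "sent" to you, a script like this could be scheduled (for example with cron) and wired to email using Python's standard smtplib; that part is left out here.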

The easiest way to save a web page is to download it to your computer. In Chrome, open the three-dot menu and select More Tools > Save page as. For Firefox, open the hamburger menu and choose Save Page As. On Safari, go to File > Save as or File > Export as PDF, and in Microsoft Edge, open the three-dot menu and choose More tools > Save page as.
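If you want to do the same thing from a script rather than through the browser menu, fetching and saving the page takes only a few lines. A sketch assuming the requests library; the URL and output filename are illustrative:

```python
# pip install requests
import requests

url = "https://example-institute.org/latest-article"  # placeholder URL
response = requests.get(url, timeout=30)
response.raise_for_status()

# Save the raw HTML next to the script.
with open("saved_page.html", "w", encoding=response.encoding or "utf-8") as f:
    f.write(response.text)
```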
