Scraping multiple web pages with Cheerio
I'm learning to use Cheerio to scrape data from web pages. I already know how to get data from a single page, but now I'm trying to figure out how to do the same with multiple pages.

I have two separate functions, one for each URL. In my index.js I'm using the functions like this:
const express = require('express');
const scraper = require('./scraper');
const fs = require('fs');
const app = express();
app.get('/search/:title', (req, res) => {
  scraper.func1(req.params.title).then(cars => {
    res.json(cars);
    fs.writeFile(
      './json/cars.json',
      JSON.stringify(cars, null, 2), // optional params to format it nicely
      err =>
        err
          ? console.error('Data not written!', err)
          : console.log('Data written!')
    );
  });
  scraper.func2(req.params.title).then(cars => {
    res.json(cars);
    fs.writeFile(
      './json/cars2.json',
      JSON.stringify(cars, null, 2), // optional params to format it nicely
      err =>
        err
          ? console.error('Data2 not written!', err)
          : console.log('Data2 written!')
    );
  });
});
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`Listening on ${port}`);
});
Obviously these two functions don't work when chained like this. Separately they both work just fine. So my question is, how should I chain these two functions to use them correctly?
I would use the async / await syntax for this purpose; it keeps the code a bit cleaner. The chained version fails because each handler calls res.json, and a response can only be sent once per request. Instead, we'll call each function in sequence, then combine the results and send a single response back to the client.
const express = require('express');
const scraper = require('./scraper');
const fs = require('fs');
const app = express();
function writeJsonToFile(fileName, data) {
  fs.writeFile(
    fileName,
    JSON.stringify(data, null, 2), // optional params to format it nicely
    err =>
      err
        ? console.error('Data not written!', err)
        : console.log(`Data written to file: ${fileName}!`)
  );
}

app.get('/search/:title', async (req, res) => {
  try {
    let cars1 = await scraper.func1(req.params.title);
    writeJsonToFile('./json/cars1.json', cars1);
    let cars2 = await scraper.func2(req.params.title);
    writeJsonToFile('./json/cars2.json', cars2);
    let combinedResponse = { cars1, cars2 };
    res.json(combinedResponse);
  } catch (err) {
    res.json({ error: `Something bad happened: ${err.message}` });
  }
});
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`Listening on ${port}`);
});