google cloud pubsub node.js client not compatible with google cloud functions
We have an architecture using 2 pubsub topic/subscription pairs:

T1 is triggered by a cronjob periodically (every 5 minutes, for example). Subscription S1 is the trigger for our cloud function.

T2 serves as a queue for background jobs that are published by one of our services. Subscription S2 is read by the cloud function on each execution to service the queued background jobs.

This allows us to control how frequently the background jobs are serviced, independent of when they are added to the queue.
The cloud function (triggered by S1) reads messages from S2 by pulling. It decides which background jobs are ready and, upon successfully servicing a job, it ACKs the associated message. Jobs that are not ready, or that failed, are not ACK'ed, so they can be serviced later.
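The servicing pass above can be sketched roughly as follows. This is a hedged illustration, not the question author's actual code: `serviceQueuedJobs`, `isReady`, and `service` are hypothetical names, and `messages` stands in for whatever a pull from S2 returns.

```javascript
// Sketch of the per-execution servicing logic. A message is ACK'ed
// only after its job is serviced successfully; anything not ready,
// or anything that throws, is left un-ACK'ed for later redelivery.
function serviceQueuedJobs(messages, isReady, service) {
  const acked = [];
  for (const message of messages) {
    if (!isReady(message)) continue; // not ready: leave for a later run
    try {
      service(message);
      message.ack(); // success: acknowledge so it is not redelivered
      acked.push(message.id);
    } catch (err) {
      // failure: deliberately no ack(), so the job is retried later
    }
  }
  return acked;
}
```

The key property is that `ack()` is only ever called after the job has completed, which is exactly why losing buffered ACKs (as described below) results in duplicate servicing rather than lost jobs.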
We have issues using the official node.js pubsub client from google: we verified, by investigating our logs, that our function calls ack(), yet messages stay queued in S2. We can see this either by verifying the unacknowledged message count in stackdriver or by re-deploying the function and seeing the messages getting serviced. We believe this is a problem with google's node.js pubsub client. The cloud function docs clearly state not to start background activities. However, looking into the node.js pubsub client source, it clearly services acknowledgements in the background using timeouts.
Is google's node.js pubsub client not compatible with google cloud functions? Google recommends accessing the service APIs only when a client library does not exist or does not meet other needs. Is running the client in a cloud function one of those "other needs", requiring us to write our own client against the service APIs?
As a "workaround" we tried delaying the end of the cloud function's execution to allow any "background" processes in the node.js pubsub client to complete, but this did not consistently eliminate our issue. It seems the pubsub client is not cloud-function friendly and cannot recover from being stopped between cloud function executions.
I wrote an article on our blog that describes in detail why we used PubSub in this way and how we are working around the fact that the node.js pubsub client is not compatible with cloud functions.
How are you triggering your functions?
According to the docs, if your function is consuming pubsub messages, then you should use the pubsub trigger. When using the pubsub trigger, the library is not needed. Simply call callback() at the end of your function, and the pubsub message will be properly acknowledged.
For what you intend to do, I don't think your current architecture is the proper option.
I would move your first step to Google App Engine with a cron task, making this task simply move messages from T2 to T1, and leaving the function, with its trigger S1, to process the message.
So, your jobs would be published on T2, and you'd have a GAE app with a pull subscription S2, triggered by a cron task, and this app would re-publish the messages to T1. Then your function would be triggered by a subscription S1 to topic T1, and would run the job in the message, avoiding the extra processing of importing the pubsub library, and using the product as expected.
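The relay step described above might look roughly like this. The `pubsub` object and its `pull`/`publish`/`ack` methods are stand-ins injected for illustration, not the real client API:

```javascript
// Hedged sketch of the cron-triggered GAE handler: drain the pull
// subscription S2 and re-publish each message to T1, so the function
// bound to S1 does the actual work. ACK only after re-publishing.
async function relayJobs(pubsub) {
  const messages = await pubsub.pull('S2');
  for (const m of messages) {
    await pubsub.publish('T1', m.data);
    await pubsub.ack('S2', m.ackId); // safe: the job now lives on T1
  }
  return messages.length;
}
```

ACKing only after the re-publish succeeds means a crash mid-relay re-delivers the message rather than losing it, at the cost of possible duplicates downstream.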
Furthermore, I'm not sure how you are originally publishing your jobs to the topic, but Task Queues are a good GAE (and, in Alpha, product-agnostic) option for rate-limiting tasks.
A GAE app used only for this (with max instances set to 1) would stay within the always-free limit, so costs would not noticeably increase.
A developer from the node.js pubsub client confirmed that using the client to pull messages from a Cloud Function is not a supported use case.
The alternative is to use the service APIs. However, the REST APIs have their own caveats when attempting to pull all messages from a subscription.
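For reference, the REST method in question is `projects.subscriptions.pull`. A sketch of building that request follows; auth (a Bearer token) and the actual HTTP call are omitted, and note that `returnImmediately: true` may return fewer messages than are queued, which is one of the caveats when trying to drain a subscription:

```javascript
// Builds the Pub/Sub REST pull request for a subscription. Only the
// request shape is shown; a real call also needs OAuth credentials.
function buildPullRequest(project, subscription, maxMessages) {
  return {
    method: 'POST',
    url: 'https://pubsub.googleapis.com/v1/projects/' + project +
         '/subscriptions/' + subscription + ':pull',
    body: { maxMessages: maxMessages, returnImmediately: true },
  };
}
```

Draining "all" messages therefore means looping pull requests until an empty response, and even then a subsequent pull may still return more.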
I ran into the same problem; I wanted better control over .ack(). Looking at the nodejs library from google, one option would be to refactor ack_() to return a promise, so the function can wait for the acknowledgement to complete:
Subscriber.prototype.ack_ = function(message) {
  var breakLease = this.breakLease_.bind(this, message);
  this.histogram.add(Date.now() - message.received);
  if (this.writeToStreams_ && this.isConnected_()) {
    // refactored: return the promise instead of discarding it,
    // so a caller can wait for the acknowledgement to complete
    return this.acknowledge_(message.ackId, message.connectionId).then(breakLease);
  }
  this.inventory_.ack.push(message.ackId);
  // refactored: likewise return the flush promise
  return this.setFlushTimeout_().then(breakLease);
};
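If ack() did return a promise as suggested, the cloud function could then hold off finishing until every acknowledgement has settled. `ackAll` is a hypothetical helper, and `messages` stands in for the pulled S2 messages:

```javascript
// Collect the promises a promise-returning ack() would produce, and
// wait for all of them before letting the function's execution end.
function ackAll(messages) {
  return Promise.all(messages.map((m) => m.ack()));
}
```

The function would await this before returning, instead of relying on the client's background flush timers surviving until the next invocation.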