Stream reading from a database using Spark Streaming
I want to use Spark Streaming to read data from an RDBMS such as MySQL, but I don't know how to do this with JavaStreamingContext:
JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.milliseconds(500));
DataFrame df = jssc. ??
I searched on the internet but didn't find anything. Thanks in advance.
You cannot do it like that without installing some third-party software. What you CAN do is create a custom receiver that does what you want, combining the Spark SQL and Spark Streaming packages: implement a class extending Receiver and, inside it, do all the connections and queries needed to pull the data from the DB.
I am at work now, so instead of producing the code I'll give you some links to look at, sorry:
http://spark.apache.org/docs/latest/streaming-custom-receivers.html
https://medium.com/@anicolaspp/spark-custom-streaming-sources-e7d52da72e80
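To make the receiver idea concrete, here is a minimal sketch of a custom receiver that polls MySQL over JDBC. The table name (`events`), columns (`id`, `payload`), the auto-increment-id offset tracking, and the poll interval are all assumptions for illustration; adapt them to your schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.receiver.Receiver;

public class JdbcReceiver extends Receiver<String> {
    private final String url;      // e.g. "jdbc:mysql://localhost:3306/mydb"
    private final String user;
    private final String password;
    // Naive offset tracking: assumes an auto-increment "id" column.
    private long lastSeenId = 0;

    public JdbcReceiver(String url, String user, String password) {
        super(StorageLevel.MEMORY_AND_DISK_2());
        this.url = url;
        this.user = user;
        this.password = password;
    }

    @Override
    public void onStart() {
        // onStart() must not block; poll the database on a separate thread.
        new Thread(this::poll, "jdbc-receiver").start();
    }

    @Override
    public void onStop() {
        // The polling loop checks isStopped(), so nothing extra is needed here.
    }

    private void poll() {
        try (Connection conn = DriverManager.getConnection(url, user, password)) {
            while (!isStopped()) {
                try (Statement st = conn.createStatement();
                     ResultSet rs = st.executeQuery(
                         "SELECT id, payload FROM events WHERE id > " + lastSeenId)) {
                    while (rs.next()) {
                        lastSeenId = rs.getLong("id");
                        store(rs.getString("payload")); // hand each record to Spark
                    }
                }
                Thread.sleep(500); // poll interval; tune to your latency needs
            }
        } catch (Exception e) {
            restart("Error polling MySQL", e); // let Spark restart the receiver
        }
    }
}
```

You would then plug it into the streaming context with something like `jssc.receiverStream(new JdbcReceiver(url, user, password))`, which gives you a `JavaReceiverInputDStream<String>` to transform as usual.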
The best and most reliable solution would be to avoid reading from MySQL at all: when you insert your records into MySQL, also put them into Kafka (via a Kafka producer) in a transaction, and then consume them in your streaming application.
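A rough sketch of that dual-write path, assuming a hypothetical `events` table and topic. Note this is not a true distributed transaction across MySQL and Kafka; the DB commit is simply deferred until the Kafka send succeeds, so a send failure rolls the row back. For stronger exactly-once guarantees, look at the transactional-outbox pattern or CDC tools such as Debezium.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DualWriter implements AutoCloseable {
    private final Connection conn;
    private final KafkaProducer<String, String> producer;

    public DualWriter(String jdbcUrl, String user, String pass, String brokers) throws Exception {
        conn = DriverManager.getConnection(jdbcUrl, user, pass);
        conn.setAutoCommit(false); // commit manually, after the Kafka send
        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    public void write(String key, String payload) throws Exception {
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO events (k, payload) VALUES (?, ?)")) {
            ps.setString(1, key);
            ps.setString(2, payload);
            ps.executeUpdate();
            // Block until Kafka acknowledges; only then commit the DB row.
            producer.send(new ProducerRecord<>("events", key, payload)).get();
            conn.commit();
        } catch (Exception e) {
            conn.rollback(); // Kafka send (or insert) failed: undo the DB write
            throw e;
        }
    }

    @Override
    public void close() throws Exception {
        producer.close();
        conn.close();
    }
}
```

The streaming application then reads the `events` topic with the spark-streaming-kafka integration instead of touching MySQL at all.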
I don't think it's possible to stream from MySQL directly. Data can be ingested from many sources such as Kafka, Flume, Twitter, ZeroMQ, Kinesis, or TCP sockets.
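For comparison, the built-in sources need no custom receiver at all. A minimal sketch using the TCP socket source (host, port, and app name are placeholders):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SocketStreamDemo {
    public static void main(String[] args) throws InterruptedException {
        // local[2]: one core for the receiver, one for processing
        SparkConf conf = new SparkConf().setAppName("socket-demo").setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.milliseconds(500));

        // TCP sockets are a built-in source, created directly from the context.
        JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);
        lines.print(); // dump each micro-batch to stdout

        jssc.start();
        jssc.awaitTermination();
    }
}
```

The Kafka and Kinesis sources follow the same pattern via their own connector libraries (e.g. spark-streaming-kafka).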