How to read data from a website through its API?
I'm very new to Spark. I need to read data from the OpenSky website, using the API they provide for it ( https://openskynetwork.github.io/opensky-api/python.html ). The bbox parameter is a tuple of exactly four values (min_latitude, max_latitude, min_longitude, max_longitude). The following code shows the flights registered within the given coordinates:
import json
from random import sample
from opensky_api import OpenSkyApi

api = OpenSkyApi()
states = api.get_states(bbox=(45.8389, 47.8229, 5.9962, 10.5226))

for s in sample(states.states, 5):
    flight = {
        'callsign': s.callsign,
        'country': s.origin_country,
        'longitude': s.longitude,
        'latitude': s.latitude,
        'velocity': s.velocity,
        'vertical_rate': s.vertical_rate,
    }
    flight_data = json.dumps(flight, indent=2).encode('utf-8')
    print("(%r, %r, %r, %r, %r, %r)" % (s.callsign, s.origin_country, s.longitude, s.latitude, s.velocity, s.vertical_rate))
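One thing to guard against: `get_states()` can return `None` (for example on a failed or rate-limited request), and `random.sample` raises `ValueError` when asked for more items than exist. A small defensive helper could cover both cases; `pick_flights` is a hypothetical name introduced here, not part of the OpenSky API:

```python
from random import sample

def pick_flights(states, k=5):
    """Return up to k random state vectors, tolerating empty responses.

    get_states() may return None, and sample() raises ValueError when
    k exceeds the number of available states, so both are guarded.
    pick_flights is a hypothetical helper, not an OpenSky API call.
    """
    if states is None or not states.states:
        return []
    return sample(states.states, min(k, len(states.states)))
```

With this guard, the loop becomes `for s in pick_flights(states):` and the program survives an empty or missing response instead of crashing.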
I need to create a Python program that sends flight information every 10 seconds (through a port that I have assigned). First I have to run the Python program with the socket server that reads from OpenSky in one terminal, and then I have to run the Spark program with Structured Streaming in another terminal. I need to send the data and display it in the terminal in JSON format (using the json.dumps function).
I have the following templates to do it, but I don't know how I should modify them to be able to read the data. The templates are as follows:

Server socket:
import socket
server = socket.socket()
host = ????
port = ????
server.bind((host, port))
server.listen(2)
client_socket, addr = server.accept()
print("connection established.")
# Sending data
client_socket.sendall("Text".encode())
Spark Structured Streaming:
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode
from pyspark.sql.functions import split

spark = SparkSession \
    .builder \
    .appName("FlightsInformation") \
    .getOrCreate()

flights = spark \
    .readStream \
    .format("socket") \
    .option("host", "????") \
    .option("port", ????) \
    .load()

flights_information = ????

query = flights_information \
    .writeStream \
    .outputMode("complete") \
    .format("console") \
    .start()

query.awaitTermination()
How can I do it?

This is how I create the socket to send the JSON data through the socket:
import socket
import sys
import json
from random import sample
from time import sleep
from opensky_api import OpenSkyApi

api = OpenSkyApi()
states = api.get_states(bbox=(45.8389, 47.8229, 5.9962, 10.5226))

# Create a socket (SOCK_STREAM means a TCP socket)
try:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
except socket.error as err:
    print('Socket error because of %s' % (err))

try:
    # Bind the server socket to the local address
    sock.bind(('127.0.0.1', PORT))
except socket.error as err:
    print('Error, could not bind to server because of %s' % (err))
    sys.exit()

sock.listen(2)
client_socket, addr = sock.accept()
print("connection established.")

while True:
    for s in sample(states.states, 5):
        vuelo_dict = {
            'callsign': s.callsign,
            'country': s.origin_country,
            'longitude': s.longitude,
            'latitude': s.latitude,
            'velocity': s.velocity,
            'vertical_rate': s.vertical_rate,
        }
        flight_data = json.dumps(vuelo_dict, indent=2).encode('utf-8')
        print("(%r, %r, %r, %r, %r, %r)" % (s.callsign, s.origin_country, s.longitude, s.latitude, s.velocity, s.vertical_rate))
        try:
            client_socket.sendall(flight_data)
            sleep(10)
            # print('Sent: {}'.format(flight_data))
        except socket.gaierror:
            print('There was an error resolving the host')

sock.close()
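One detail worth checking in the sender above: Spark's socket source splits its input stream on newlines, so each flight must arrive as exactly one JSON line. `json.dumps(..., indent=2)` spreads a single object over several lines, which the socket source would then read as separate, unparseable rows. A minimal framing helper using only the standard library could look like this (`frame_flight` is a hypothetical name introduced here, not a library call):

```python
import json

def frame_flight(flight: dict) -> bytes:
    """Serialize one flight as a single newline-terminated JSON line.

    A line-oriented consumer (such as Spark's socket source) splits the
    stream on newlines, so each record must occupy exactly one line;
    json.dumps without indent guarantees that.
    """
    return (json.dumps(flight) + "\n").encode("utf-8")

# Example record shaped like the dict built in the server loop
record = {"callsign": "SWR123", "country": "Switzerland",
          "longitude": 8.55, "latitude": 47.45,
          "velocity": 230.5, "vertical_rate": 0.0}

line = frame_flight(record)
assert line.count(b"\n") == 1        # exactly one line per record
assert json.loads(line) == record    # round-trips losslessly
```

In the server loop, `client_socket.sendall(frame_flight(vuelo_dict))` would then replace the pretty-printed `sendall` call.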
Spark Structured Streaming:
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode
from pyspark.sql.functions import split

spark = SparkSession \
    .builder \
    .appName("FlightsInformation") \
    .getOrCreate()

flights_information = spark \
    .readStream \
    .format('socket') \
    .option('host', 'localhost') \
    .option('port', XXXXX) \
    .load()

query = flights_information \
    .writeStream \
    .outputMode("append") \
    .format("console") \
    .start()

query.awaitTermination()
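One way the missing parsing step could be filled in is to declare an explicit schema for the JSON the server sends and apply `from_json` to the socket source's `value` column. This is a sketch under stated assumptions, not a verified solution: the port 9999 is a placeholder for whatever port the server binds, the schema simply mirrors the dict built on the server side, and it assumes the server sends one JSON object per newline-terminated line:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import (StructType, StructField,
                               StringType, DoubleType)

# Schema mirroring the dict serialized by the socket server
flight_schema = StructType([
    StructField("callsign", StringType()),
    StructField("country", StringType()),
    StructField("longitude", DoubleType()),
    StructField("latitude", DoubleType()),
    StructField("velocity", DoubleType()),
    StructField("vertical_rate", DoubleType()),
])

spark = SparkSession.builder.appName("FlightsInformation").getOrCreate()

# Each newline-delimited socket line lands in a string column named "value"
lines = spark \
    .readStream \
    .format("socket") \
    .option("host", "localhost") \
    .option("port", 9999) \
    .load()

# Parse each JSON line into typed columns
flights_information = lines \
    .select(from_json(col("value"), flight_schema).alias("flight")) \
    .select("flight.*")

query = flights_information \
    .writeStream \
    .outputMode("append") \
    .format("console") \
    .start()

query.awaitTermination()
```

For this to work the server must send exactly one `json.dumps(flight) + '\n'` per record, since the socket source treats each newline-delimited line as one row; lines that fail to parse against the schema come out as nulls.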