

Data streaming to MongoDB using sockets with Structured Text programming

How do I stream a string (for example, "hello world") from a PLC to MongoDB over sockets, using Structured Text programming (IEC 61131-3)? I heard that function blocks from libraries like SysLibSockets have to be used, but I am not sure how.

It would be great if you could help me with this, as I am relatively new to ST and want to learn the language.

Thanks.

Your question does not have sufficient information.

Structured Text is a variation on Pascal. Like "C" without any of its standard libraries (such as the BSD sockets library), it has no inherent communication capabilities of its own. Communication capabilities are defined by the "system"-level details of the PLC itself.
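For example, on a CODESYS-based PLC the socket calls come from the SysLibSockets library the question mentions (SysSocket on newer runtimes). Below is a minimal ST sketch that opens a TCP connection and sends "hello world" to a receiving program on a PC; it does not talk to MongoDB directly, because MongoDB uses its own binary wire protocol that you would not implement in ST. The IP address, port, and state machine are placeholders, and exact function signatures and constants vary between runtimes, so treat this as a sketch and check your vendor's documentation.

PROGRAM PLC_PRG
VAR
    diSocket : DINT;                               (* socket handle returned by SysSockCreate *)
    stAddr   : SOCKADDRESS;                        (* address of the PC-side receiver, not of MongoDB itself *)
    sPayload : STRING(255) := 'hello world$N';     (* $N appends a newline as a simple message terminator *)
    diSent   : DINT;
    iState   : INT := 0;                           (* tiny state machine so the PLC cycle is not blocked *)
END_VAR

CASE iState OF
    0:  (* create a TCP socket *)
        diSocket := SysSockCreate(SOCKET_AF_INET, SOCKET_STREAM, SOCKET_IPPROTO_TCP);
        IF diSocket <> SOCKET_INVALID THEN
            iState := 1;
        END_IF;

    1:  (* connect to the PC-side receiver; IP and port are placeholders *)
        stAddr.sin_family := SOCKET_AF_INET;
        stAddr.sin_port   := SysSockHtons(9000);
        stAddr.sin_addr   := SysSockInetAddr('192.168.0.10');
        IF SysSockConnect(diSocket, ADR(stAddr), SIZEOF(stAddr)) THEN
            iState := 2;
        END_IF;

    2:  (* send the string once, then close the socket *)
        diSent := SysSockSend(diSocket, ADR(sPayload), LEN(sPayload), 0);
        SysSockClose(diSocket);
        iState := 3;

    3:  (* done *)
        ;
END_CASE;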

Furthermore, many PLCs don't have a "generic" UDP or TCP interface, so if that is the case you have to implement the correct protocol on the PC side.
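In practice this usually means defining a trivial application protocol yourself, for example one JSON document per line, and running a small bridge program on the PC that listens on a TCP port, parses each received line, and inserts it into MongoDB with a standard driver; the PLC never speaks MongoDB's wire protocol. A sketch of the PLC side of such a protocol, using only standard ST string functions (the tag name and value are illustrative, and the bridge program itself is assumed):

VAR
    rTemperature : REAL := 23.5;
    sJson        : STRING(255);
END_VAR

(* Build one newline-terminated JSON document per sample,
   e.g. {"tag":"temperature","value":23.5} *)
sJson := CONCAT('{"tag":"temperature","value":', REAL_TO_STRING(rTemperature));
sJson := CONCAT(sJson, '}$N');

(* sJson can then be sent with SysSockSend as in the sketch above;
   the PC-side bridge reads line by line and performs the MongoDB insert. *)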


