Currently I have a Shiny web app that performs calculations on a 3 GB data.frame loaded in memory. Instead of implementing this feature in the Shiny app, I now need to make it a RESTful service that returns its calculation results to another app as JSON, so that people can use it by sending an HTTP request to a URL like http://my-app.com/function
I'm trying OpenCPU right now, but I don't quite understand how I can load the large data and keep it in memory, so that the OpenCPU API calls package functions that only do the calculations, instead of loading the large data from disk on every HTTP request.
One workaround might be to use HBase as an in-memory database and load the data through rhbase. But before I invest time in learning it, I want to know whether it is a reasonable choice for a 3 GB data.frame, since the serialization and other overhead it adds might offset its speed benefit.
What would be a better way to implement this functionality? Solutions using packages other than OpenCPU are also welcome, preferably free ones.
You may want to look at Plumber. You decorate your R functions with special comments (and the file can include code that loads your data), and Plumber makes them available via a REST API.
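A minimal sketch of that approach (the file name `big_data.rds`, the columns, and the endpoint logic are all placeholders, not from your app): code outside the endpoint functions runs once when the API starts, so the 3 GB table is read from disk a single time and stays in memory across requests.

```r
# plumber.R -- a sketch; file and column names are placeholders
library(plumber)

# Runs once at startup, not per request, so the large
# data.frame is loaded into memory a single time.
big_df <- readRDS("big_data.rds")

#* Run a calculation on the in-memory data.frame
#* @param group a value of the `group` column to filter on
#* @get /function
function(group = "") {
  subset_df <- big_df[big_df$group == group, , drop = FALSE]
  # Plumber serializes the returned object to JSON by default
  list(n = nrow(subset_df), mean_value = mean(subset_df$value))
}
```

Start it with `plumber::plumb("plumber.R")$run(port = 8000)` and the calculation is reachable at `http://localhost:8000/function?group=...`, returning JSON.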
You should put the data into a package, then add this package to the server configuration under `preload`.
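Concretely (a sketch, assuming your package is called `mypkg` and the data.frame is named `big_df`; both names are placeholders): save the object as a dataset inside the package, so it ships with the package and is lazy-loaded into memory, then list the package in OpenCPU's `preload` setting so it is loaded once at server startup rather than on each request.

```r
# Run once inside the package source tree to create data/big_df.rda
save(big_df, file = "data/big_df.rda")
```

Then, in `/etc/opencpu/server.conf`:

```json
{
  "preload": ["mypkg"]
}
```

After installing the package on the server and restarting OpenCPU, requests such as `POST /ocpu/library/mypkg/R/yourfunction/json` call your functions against the already-loaded data.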