This article covers sharing library code across Rserve connections; it should be a useful reference for anyone facing the same problem.

Problem description

Is it possible for the processes spawned by Rserve to share common libraries that are loaded into memory only once? Imagine that I need to execute the code below on 100 different RConnections concurrently.

library(libraryOfSize40MB)
fun()

That means I would need about 3.9GB of memory just to load the library. I would prefer to load the library once and then execute fun() one hundred times, so that I can run this on a cheap host.

Might this be relevant? https://github.com/su/Rserve/blob/master/NEWS#L40-L48

Recommended answer

It is possible. You have to start Rserve from an R shell with run.Rserve(), loading the libraries first:

library(Rserve)

#load libraries so all connections will share them
library("yaml")
library("reshape")
library("rjson")
library("zoo")
(...)
library("stringr")

run.Rserve(debug = TRUE, port = 6311, remote=TRUE, auth=FALSE, args="--no-save", config.file = "/etc/Rserve.conf")
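The call above also points at /etc/Rserve.conf. The question does not show that file's contents, so the following is only a hypothetical sketch using standard Rserve configuration directives; the source path is illustrative:

```
# /etc/Rserve.conf -- hypothetical minimal configuration
remote enable
port 6311
auth disable
# optionally preload shared code at startup instead of from the R shell:
# source /opt/r/preload.R
```

Note that command-line-style settings such as remote and port can be given either here or as arguments to run.Rserve(), as in the call above.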

Every new connection will be able to see these libraries:

library(RSclient)
con = RS.connect(host='10.1.2.3')
RS.eval(con, quote(search()))
> #lots of libraries available
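Because stringr was attached before run.Rserve() above, a client can call its functions straight away, without paying the load cost on each connection. A minimal sketch, assuming the same illustrative host address as above:

```r
library(RSclient)

con = RS.connect(host='10.1.2.3')

# stringr was preloaded in the server process, so no library()
# call is needed on this connection before using its functions:
RS.eval(con, quote(str_detect("Rserve", "serve")))

RS.close(con)
```

Each connection still gets its own evaluation environment; what is shared is the library code loaded in the parent process before the connections were forked.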

That concludes this article on sharing library code with Rserve; hopefully the answer above is helpful.
