A better R workflow on a webserver
We see that R uses a lot of processing time, especially for this call:
s4r <- read.delim(quote="", file="./stats_4_R.txt", header=T, sep="|")
For each single graph, R recreates the s4r object. Is there a way to make it persistent, so we can just use the s4r object directly in our wiki pages?
https://wikispiral.org
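Something like the caching helper below is what I have in mind, in case it helps to explain. This is only an untested sketch; `load_s4r` and the `.rds` cache path are my own invention, not anything our wiki currently does. The idea is to parse the text file once, save the resulting data frame as a binary `.rds` file, and let every later session load that instead:

```r
# Untested sketch: cache the parsed table as a binary .rds file so each
# new R session loads it with readRDS() instead of re-running read.delim().
load_s4r <- function(txt = "./stats_4_R.txt", cache = "./stats_4_R.rds") {
  if (file.exists(cache) && file.mtime(cache) >= file.mtime(txt)) {
    readRDS(cache)                                     # fast binary load
  } else {
    s4r <- read.delim(txt, quote = "", header = TRUE, sep = "|")
    saveRDS(s4r, cache)                                # refresh the cache
    s4r
  }
}
```

The mtime check would also refresh the cache automatically whenever the stats file is regenerated.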
If R opens a new session for each graph (which is more or less OK), can't it define some objects once for all the sessions, for instance via a bash script?
Is R running as an app or as a server? Do we need a special parameter to run R as a server? Does R empty all its memory when it closes? Can't it keep some data in memory for future use?
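One thing I did find is that R sources an `.Rprofile` file at the start of every session, so perhaps a shared object could be preloaded there without touching each graph script. Again only an untested sketch, and it assumes a pre-saved `stats_4_R.rds` binary copy of the table already exists:

```r
# .Rprofile sketch: R sources this file when a session starts, so a
# shared object can be defined here once for every graph script.
# (Each graph still pays the cost of starting a fresh R process.)
if (file.exists("./stats_4_R.rds")) {
  s4r <- readRDS("./stats_4_R.rds")   # preload the cached table
}
```

This would not keep data in memory between sessions, but at least every session would start with `s4r` already defined.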
Sorry for all these questions, but 12 seconds to load three simple pie charts is a bit harsh...
Have a nice day,
Joël