R: Painfully slow read performance using RODBC & SQL Server


I am new to R and interested in using Shiny to create dynamic charts from data stored in a SQL Server database. To enable interactivity, I want to bring the raw data from the database into R and perform the calculations within R, rather than have the database summarize the data.

I am able to connect to the database using RODBC, execute a query, and receive the results in a data.frame. However, the read time in R is about 12x longer than the same query executed in SQL Server Management Studio (SSMS): SSMS takes ~600 ms, whereas R takes 7.6 seconds. My question is whether I am doing something wrong, or is R simply slow at database access? And if so, are there faster alternatives (e.g. writing the database output to a file and then reading the file)?
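To make that file-based alternative concrete, here is a rough sketch of what I have in mind (the column names, server name, and output path are placeholders, and it assumes bcp is available with a trusted connection):

# Sketch only: export the result set to a flat file with bcp, then read it in R.
# "server.location", the column names, and C:/temp/out.csv are placeholders.
system('bcp "SELECT date_col, x, y, z FROM my_table" queryout C:/temp/out.csv -c -t, -S server.location -T')
df <- read.csv("C:/temp/out.csv", header = FALSE,
               col.names = c("date_col", "x", "y", "z"),
               colClasses = c("character", "numeric", "numeric", "numeric"))
df$date_col <- as.Date(df$date_col)  # bcp writes dates as text; convert after the read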

Some information about the query that may help: it retrieves 250k rows with 4 columns. The first column is a date and the other 3 are numeric values. The machine running R and SSMS is a high-end Windows 7 workstation with 32 GB of memory. The R command I am running is:

system.time(df <- sqlQuery(cn, query))

which returns:

   user  system elapsed 
   7.17    0.01    7.58 

Interestingly, the data transfer from the SQL Server machine appears to be fast, but R is busy doing things internally for several seconds before returning the data.frame. I can see this because network utilization spikes in the first second and then returns to near 0; the data.frame only appears several seconds later.
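In case it matters, here is a variant of the RODBC call using two documented options I plan to experiment with, rows_at_time (the fetch batch size) and as.is (skip the post-fetch conversion); the DSN and credentials are placeholders, and whether these actually change the timing here is an open question:

library(RODBC)
# Placeholder DSN and credentials; rows_at_time raises the fetch batch size (1024 is RODBC's maximum).
cn <- odbcConnect("my_dsn", uid = "username", pwd = "password", rows_at_time = 1024)
# as.is = TRUE skips RODBC's post-fetch conversion (dates come back as character);
# convert afterwards with as.Date() if needed.
system.time(df <- sqlQuery(cn, query, as.is = TRUE, rows_at_time = 1024))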

I will also try RJDBC: http://cran.r-project.org/web/packages/rjdbc/rjdbc.pdf

with these drivers: https://msdn.microsoft.com/en-us/sqlserver/aa937724.aspx

library(RJDBC)
drv <- JDBC("com.microsoft.sqlserver.jdbc.SQLServerDriver", "/sqljdbc4.jar")
con <- dbConnect(drv, "jdbc:sqlserver://server.location", "username", "password")
dbGetQuery(con, "SELECT column_name FROM table")
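Once that connection works, I can run the same timing comparison against it (query being the same 250k-row SELECT as above):

system.time(df <- dbGetQuery(con, query))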
