
mysql - In is(object, Cl): error while fetching row in R


I have a MySQL table that I am trying to access from R using RMySQL.

The following query should return 1,690,004 rows:

dbGetQuery(con, "SELECT * FROM tablename WHERE export_date ='2015-01-29'")

Unfortunately, I get the following warning messages:

In is(object, Cl) : error while fetching row
In dbGetQuery(con, "SELECT * FROM tablename WHERE export_date ='2015-01-29'",  : pending rows

and only receive ~400K rows.

If I split the query into several fetches using dbSendQuery, the warning messages start appearing after ~400K rows have been received.
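The fetch loop I am using looks roughly like this (a minimal sketch of the dbSendQuery/dbFetch pattern; the chunk size of 10,000 is arbitrary):

library(RMySQL)
res <- dbSendQuery(con, "SELECT * FROM tablename WHERE export_date = '2015-01-29'")
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 10000) # pull the next 10k rows of the pending result
  # ... process or accumulate chunk ...
}
dbClearResult(res)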

Any help would be greatly appreciated.

Solution:

So, this appears to be due to a 60-second timeout imposed by my hosting provider (damn you, Arvixe!). I worked around it by "paging/chunking" the output. Because my data has an auto-incrementing primary key, the rows come back in a stable order, which lets me take the next X rows after each iteration.
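Before settling on chunking, it is worth confirming where the timeout comes from. A quick server-side check looks like the sketch below (MySQLConnect is the same connection helper used in the code further down; note that a host-imposed limit like this one typically cannot be raised from the client side):

library(RMySQL)
con <- MySQLConnect() # mysql connection function (as below)
# list the server's timeout-related settings (wait_timeout, net_write_timeout, ...)
print(dbGetQuery(con, "SHOW VARIABLES LIKE '%timeout%'"))
dbDisconnect(con)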

To get all 1.6 million rows, I did the following:

library(RMySQL)
con <- MySQLConnect() # mysql connection function
day <- '2015-01-29'   # date of interest
numofids <- 50000     # number of rows to include in each 'chunk'
# get the total number of rows the query will return
count <- dbGetQuery(con, paste0("SELECT COUNT(*) AS count FROM tablename WHERE export_date = '", day, "'"))$count
dbDisconnect(con)
ns <- seq(0, count - 1, numofids) # 0-based OFFSET value for each chunk
tosave <- data.frame() # data frame to bind results to
# iterate through the table, pulling the data in 50k-row chunks
# (NOTE: MySQL does not guarantee row order without ORDER BY; for a stable
# order across chunks, add ORDER BY on the auto-increment primary key)
for (nextseries in ns) {
  print(nextseries) # print the offset the loop is on
  con <- MySQLConnect()
  # extract the next chunk of up to 50k rows; LIMIT offset,count uses a 0-based offset
  d1 <- dbGetQuery(con, paste0("SELECT * FROM tablename WHERE export_date = '", day, "' LIMIT ", nextseries, ",", numofids))
  dbDisconnect(con)
  # bind the chunk to the tosave data frame (the if/else avoids an error when
  # rbind-ing d1 to an empty zero-column data frame on the first pass)
  if (nrow(tosave) > 0) {
    tosave <- rbind(tosave, d1)
  } else {
    tosave <- d1
  }
}
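As a side note, OFFSET-based paging makes MySQL scan and discard every row before the offset, so later chunks get progressively slower. A keyset-pagination variant avoids that by filtering on the key itself. The sketch below assumes the auto-increment primary key column is named id, which is not given above:

library(RMySQL)
day <- '2015-01-29'
numofids <- 50000
lastid <- 0            # largest id fetched so far (assumes the PK column is named 'id')
tosave <- data.frame()
repeat {
  con <- MySQLConnect()
  # fetch the next chunk strictly after the last id seen, in key order
  d1 <- dbGetQuery(con, paste0(
    "SELECT * FROM tablename WHERE export_date = '", day,
    "' AND id > ", lastid,
    " ORDER BY id LIMIT ", numofids))
  dbDisconnect(con)
  if (nrow(d1) == 0) break   # no rows left
  lastid <- max(d1$id)       # advance the resume point
  tosave <- if (nrow(tosave) > 0) rbind(tosave, d1) else d1
}

Each iteration still reconnects, so every individual query stays well under the provider's 60-second limit, same as the OFFSET version.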

Tags: rmysql, mysql, r
Source: https://codeday.me/bug/20191120/2045818.html