Downloading images from different URLs and saving them

r

#1

I have a list of URLs, each containing an image. I want to download the image from each URL and save it in a folder using R. I saw that there are some solutions for the Linux interface but no solution for Windows. Can anyone help, please?


#2

@Tapojyoti_Paul
Hi there,
Let L be the list of URL names, n its length, and wd your working directory.

Then,
library("RCurl") name<-seq(1:n) for(i in 1:n){ if(!url.exists(L[i]) next if(!file.exists(paste("folder",i,sep="")){ dir.create(paste("folder",i,sep="")) download.file(L[i],paste(wd,"/","folder",i,"/","image.jpg",sep = ""),mode = "wb") } }

Though it is just a workaround, it should serve the purpose.
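
For completeness, a minimal setup sketch before running the loop; the example URLs are placeholders, not from the thread:

library(RCurl)

wd <- getwd()                              # working directory ("wd" above)
L  <- c("https://example.com/first.png",   # placeholder URLs; replace with your own list
        "https://example.com/second.png")
n  <- length(L)                            # number of URLs

A plain character vector works with L[i] here; post #12 below builds L with list() and indexes it as L[[n]] instead.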


#3

@NSS
It is showing the error below:
"Error in download.file(url = L[i]) :
argument "destfile" is missing, with no default"
I also tried lapply(L[i], download.file); it shows the same error there as well.
Thanks


#4

@Tapojyoti_Paul
Sorry, my bad. I erred in the parameter. I have edited the code and tested it for a single URL. It should work for the batch too.
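
For reference, the corrected call supplies the missing destfile argument explicitly, following the same path pattern as the loop above:

download.file(url      = L[i],
              destfile = paste(wd, "/", "folder", i, "/", "image.jpg", sep = ""),
              mode     = "wb")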

Hope this helps.

Neeraj


#5

Thanks a ton, Neeraj. It is working :slight_smile:


#6

@Tapojyoti_Paul
I am glad I was of help :slight_smile:


#7

@NSS
I need another bit of help from you. Suppose that in a batch of URLs, some of them are not valid. How do I skip those URLs and move on to the next one? For now it's showing:
Error in download.file(L[i], paste("C:\Users\Tapo\Documents\R", "/", :
cannot open URL 'https://…/sites/all/themes/custom/akeliustheme/logo.png'

Please help. Thanks in advance!!


#8

@Tapojyoti_Paul Give me a sample URL with the problem you mentioned.


#9

www.akelius.com/sites/all/themes/custom/akeliustheme/logo.png


#10

@Tapojyoti_Paul

I edited the code for the task. I hope it works.

Regards.


#11

@NSS
Thanks again :slight_smile: It really helped me a lot today! But with that existence check, it's taking too much time to run for 10,000 URLs. I will try to improve the code to reduce the run time, but in the meantime, if you can improve it, please share.
Thanks a lot!!
Regards,
Tapojyoti
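
A quick way to see how much of the run time the existence checks alone account for, assuming L and RCurl are loaded as in the earlier posts:

system.time(sapply(L, url.exists))   # "elapsed" is the wall-clock time in seconds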


#12

@Tapojyoti_Paul

require(RCurl)

a  <- 0          # counts the URLs that could not be fetched
wd <- getwd()
L  <- list("http://www.akelius.com/sites/all/themes/custom/akeliustheme/logo.png",
           "https://dl2.pushbulletusercontent.com/xg2R7m3iOLGa9DhjR6I8KfP2gaH1uJTq/Screenshot%20%2843%29.png")

for (n in 1:2) {
  z <- ""
  # try() keeps the loop alive even when the URL cannot be fetched
  try(z <- getBinaryURL(L[[n]], failonerror = TRUE))
  if (length(z) > 1) {
    # the fetch returned data, so create a folder and save the image there
    dir.create(paste("folder", n, sep = ""))
    download.file(L[[n]], paste(wd, "/", "folder", n, "/", "image.jpg", sep = ""), mode = "wb")
  } else {
    a <- a + 1   # the fetch failed; count it and move on
  }
}
print(a)         # number of URLs that failed

Check out this code and please tell me if it works faster :smiley:

Regards
Neeraj


#13

Much better!! Thanks :slight_smile:

This also works! Instead of checking the URLs up front, and instead of using try() around the fetch, we can wrap download.file() itself in try():

for (n in 123:f) {
  dir.create(paste("folder", n, sep = ""))
  # try() skips over any URL that cannot be opened and moves to the next one
  try(download.file(L[[n]], paste(wd, "/", "folder", n, "/", "image.png", sep = ""), mode = "wb"))
  print(n)   # progress indicator
}
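
One possible refinement of the same idea, sketched under the same assumptions (L, f, and wd defined as in the posts above; the temp-file step and the failed counter are my own additions, not from the thread): download to a temporary file first and create the folder only when the download succeeds, so invalid URLs do not leave empty folders behind.

failed <- 0
for (n in 1:f) {
  tmp <- tempfile(fileext = ".png")
  ok  <- tryCatch({
    download.file(L[[n]], tmp, mode = "wb")   # fetch into a temp file first
    TRUE
  }, error = function(e) FALSE)
  if (ok) {
    dir.create(paste("folder", n, sep = ""), showWarnings = FALSE)
    file.copy(tmp, paste(wd, "/", "folder", n, "/", "image.png", sep = ""))
  } else {
    failed <- failed + 1                      # invalid URL: skipped, no folder created
  }
}
print(failed)                                 # how many URLs were skipped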