Scraping the web is fraught with peril. URLs die, networks get disrupted, and the best-laid plans for building a corpus from links can quickly go awry. Use this function to mitigate some of the pain of retrieving web resources.
safeGET(url = NULL, config = list(), timeout = httr::timeout(5), ..., handle = NULL)
| Argument | Description |
|---|---|
| `url` | the URL of the page to retrieve |
| `config` | additional configuration settings, such as HTTP authentication |
| `timeout` | a call to `httr::timeout()`; defaults to a five-second timeout |
| `...` | further named parameters passed on to `httr::GET()` |
| `handle` | the handle to use with this request; if not supplied, one is retrieved and reused from the handle pool |
This is a thin wrapper for `httr::GET()` using `purrr::safely()` that will either return an httr `response` object or `NULL` if there was an error. If you need the reason for the error (e.g. `Could not resolve host...`), you should write your own wrapper.
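The `purrr::safely()` pattern described above can be sketched roughly as follows. This is a minimal illustration of how such a wrapper might look, not the package's actual implementation; the internal name `safe_get_impl` is invented for this sketch.

```r
library(httr)
library(purrr)

# purrr::safely() turns GET() into a function that never throws:
# it returns a list with $result (the response, or NULL on failure)
# and $error (NULL on success, or the condition object on failure).
safe_get_impl <- safely(GET)

safeGET <- function(url = NULL, config = list(),
                    timeout = httr::timeout(5), ..., handle = NULL) {
  res <- safe_get_impl(url = url, config = config, timeout, ...,
                       handle = handle)
  # Discard the error detail and keep only the response (or NULL),
  # matching the behavior documented above.
  res$result
}
```

If you do need the error message (e.g. `Could not resolve host...`), call the `safely()`-wrapped function directly and inspect the `$error` element instead of discarding it.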