Download files from the Web based on a list of URLs in a text file

Here is some code that shows how to download files from a web server using a list of URLs in a text file. I wrote this because I wanted to download a bunch of files from the Internet Archive, where the links to the files are stored in a playlist.

This example uses a list of DocObject instances, each of which has a name and a URL. The snippets assume using directives for System, System.Collections.Generic, System.IO, and System.Net.

internal class DocObject
{
    // File name to save the download as
    public string Name { get; set; }
    // Source URL of the file
    public string Url { get; set; }
}

Then we need a method to read the text file. The part that extracts the file name is not shown here; a possible sketch follows the snippet. You can do something like this:

List<DocObject> doclist = new List<DocObject>();
using (var reader = new StreamReader(@"C:\Path_toTextfile.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        doclist.Add(new DocObject
        {
            Url = line,
            Name = ExtractNameFromLine(line)
        });
    }
}
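
ExtractNameFromLine is not shown in the original; a minimal sketch, assuming the file name is simply the last path segment of the URL, could look like this:

// Hypothetical helper: derives a file name from a URL line,
// assuming the last path segment is the file name.
private string ExtractNameFromLine(string line)
{
    var uri = new Uri(line.Trim());
    return Path.GetFileName(uri.LocalPath);
}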

The actual download runs synchronously with the following method:

private void DownloadFile(DocObject docObj)
{
    try
    {
        using (WebClient webClient = new WebClient())
        {
            // Full path of the file to create, not just the directory
            string downloadPath = Path.Combine(@"C:\Your downloadPath", docObj.Name);
            webClient.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
            webClient.DownloadFile(new Uri(docObj.Url), downloadPath);
        }
    }
    catch (Exception ex)
    {
        // do some logging, e.g. write docObj.Url and ex.Message to a log file
    }
}
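
With the list in place, the files can then be downloaded one after another; a minimal sketch:

// Download the files sequentially, one request at a time
foreach (DocObject docObj in doclist)
{
    DownloadFile(docObj);
}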

Running it asynchronously would not require many changes to the code; however, I don't want to put too much load on the Internet Archive's servers.
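
For completeness, an asynchronous variant could look like the sketch below, using WebClient.DownloadFileTaskAsync (available since .NET Framework 4.5; it also needs a using directive for System.Threading.Tasks). Awaiting each download keeps the requests sequential, so the server sees the same load as the synchronous version:

private async Task DownloadFileAsync(DocObject docObj)
{
    try
    {
        using (WebClient webClient = new WebClient())
        {
            string downloadPath = Path.Combine(@"C:\Your downloadPath", docObj.Name);
            webClient.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
            // Awaiting here means the next download starts only after this one finishes
            await webClient.DownloadFileTaskAsync(new Uri(docObj.Url), downloadPath);
        }
    }
    catch (Exception ex)
    {
        // do some logging
    }
}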