
I'm trying to go through a list of anchor (a href) links on a web page and click each one in turn. I have managed to list all the links and display them using the GetAttribute method, but I'm struggling to click each link. Any advice, or a similar example in C# using Selenium, would be appreciated.

The reason I need to do this is to check that all the links can be clicked and don't return a "page not found" error.

Thanks in advance.

foreach (IWebElement item in OpenPageSteps.driver1.FindElements(By.TagName("a")))
{
    if (item.Displayed)
    {
        Console.WriteLine(item.GetAttribute("href"));
    }
}
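
For reference, here is roughly what I have in mind: collect the href values first and then visit each URL one by one, so the element references don't go stale after navigating away (untested sketch):

var hrefs = new List<string>();
foreach (IWebElement link in OpenPageSteps.driver1.FindElements(By.TagName("a")))
{
    if (link.Displayed)
    {
        string href = link.GetAttribute("href");
        if (!string.IsNullOrEmpty(href))
        {
            hrefs.Add(href);
        }
    }
}

// Visit each collected URL in turn instead of clicking the (now stale) elements.
foreach (string href in hrefs)
{
    OpenPageSteps.driver1.Navigate().GoToUrl(href);
    // ...somehow check the page here, then return to the original page
    OpenPageSteps.driver1.Navigate().Back();
}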

    2 Answers


    Your objective is to check whether each link works and does not return a "page not found" error. One simple way, without Selenium, is to use the WebRequest and WebResponse classes in .NET to request each link and read its content.

    public void callurl(string url)
    {
        WebRequest request = WebRequest.Create(url);
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            // The response body from the URL; use it however you need.
            string urlText = reader.ReadToEnd();
            Console.WriteLine(urlText);
        }
    }

    Call this function inside the loop and check whether it returns the page content or not.
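
    For example, you can loop over the link URLs you already gathered with Selenium and call this function for each of them. A broken link surfaces as a WebException, which carries the HTTP status code (rough sketch; hrefs is assumed to be your list of collected link URLs):

    foreach (string url in hrefs)
    {
        try
        {
            callurl(url);
            Console.WriteLine("OK: " + url);
        }
        catch (WebException ex)
        {
            // A 404 or other HTTP error ends up here as a WebException.
            var status = (ex.Response as HttpWebResponse)?.StatusCode;
            Console.WriteLine("Broken: " + url + " (" + status + ")");
        }
    }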

    • The initial step should be to retrieve the page and parse it to get the "a" tags and their href attributes; this can also be done bypassing the browser entirely. The HtmlAgilityPack library (html-agility-pack.net) can be used to parse the result.
      – Piotr M.
      Oct 25, 2020 at 20:10

    Selenium is not the best tool for checking links for 404 errors. You can gather the links with Selenium, but then just use HttpClient to check them; it will be much more efficient.

    var client = new HttpClient();
    try
    {
        var response = await client.GetAsync(url);
        if (response.IsSuccessStatusCode)
        {
            // Normal link
        }
        else
        {
            // Something wrong with the link here (500 etc.)
        }
    }
    catch (Exception e)
    {
        // Network-related issues (site does not exist etc.)
    }
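
    Putting it together, you can collect the href values with Selenium first and then check each one with a single shared HttpClient (rough sketch; driver stands for your IWebDriver instance, and System.Linq is needed for the query):

    // Collect distinct, non-empty href values from the current page.
    var hrefs = driver.FindElements(By.TagName("a"))
                      .Select(a => a.GetAttribute("href"))
                      .Where(h => !string.IsNullOrEmpty(h))
                      .Distinct()
                      .ToList();

    using var client = new HttpClient();
    foreach (var url in hrefs)
    {
        try
        {
            var response = await client.GetAsync(url);
            Console.WriteLine(response.IsSuccessStatusCode
                ? "OK: " + url
                : "Broken (" + (int)response.StatusCode + "): " + url);
        }
        catch (Exception)
        {
            // Network-level failure (DNS, connection refused, etc.)
            Console.WriteLine("Unreachable: " + url);
        }
    }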
    • Thanks, but the first thing I need to do is go through the list of anchor links. There are many links on the page, so I first need to loop through each one and check that it exists. To do this I use driver1.FindElements(By.TagName("a")), but then I need to loop through all the links and click each one, one by one.
      – hm9
      Oct 25, 2020 at 16:07
    • With Selenium you can only click one link at a time, and after that Selenium will load that link, so you would need to build something like a crawler. Why have you decided to use Selenium?
      – Anatoly
      Oct 25, 2020 at 16:18
    • I'm working with Selenium. It is possible to go through all the links and click them by using FindElements: ReadOnlyCollection<IWebElement> links = driver.FindElements(By.TagName("a")); foreach (IWebElement link in links) { String href = link.GetAttribute("href"); /* do something with href */ }
      – hm9
      Oct 25, 2020 at 16:48
    • Selenium is a browser automation framework. What result do you expect if you click every link on a webpage? You won't get anything meaningful that way, so you need to store the links you find and then navigate to them one by one.
      – Anatoly
      Oct 25, 2020 at 17:49
