You are tasked with swapping hundreds or thousands of URLs in your account.
One character off and you could be burning dollars sending traffic to a “does not exist” 404 page.

After opening a few URLs in your browser to check, a heavy feeling of futility descends: opening every single one is out of the question.

Here's Python to the rescue!

3 simple steps to get this to work:

1) Save your URLs in a CSV file (save as test.csv)

2) Run the code

3) Get the flagged URLs
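For step 1, the script below expects test.csv to contain plain URLs, one or more per row. If you want to try it out first, here is a quick sketch that writes a sample file; the URLs are placeholders, not from the original post:

```python
import csv

# Hypothetical sample URLs; swap in your own before running for real
sample_urls = [
    ["https://example.com/page-one"],
    ["https://example.com/page-two"],
]

# Write one URL per row to test.csv
with open('test.csv', 'w', newline='') as f:
    csv.writer(f).writerows(sample_urls)
```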

Hit me up if I can help! Meanwhile, I'll be working on a more user-friendly version.

import csv
import requests


with open('test.csv', newline='') as csvfile:
    # Each row may hold one or more URLs; flatten them into a single list
    urllist = [item for sublist in csv.reader(csvfile) for item in sublist]

# Request each URL once and flag anything that does not come back as 200 OK
result = []
for url in urllist:
    status = requests.get(url).status_code
    if status != 200:
        result.append(str(status) + " detected for: " + url)

for line in result:
    print(line)
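One caveat worth noting: if a URL is malformed or the host is unreachable, requests.get raises an exception instead of returning a status code, which would crash the loop midway through a long list. Here is a hedged sketch of a more defensive version; the check_urls name and the timeout value are my own additions, not part of the original script:

```python
import requests

def check_urls(urllist, timeout=10):
    """Return a list of flagged URLs: non-200 responses and failed requests."""
    flagged = []
    for url in urllist:
        try:
            status = requests.get(url, timeout=timeout).status_code
            if status != 200:
                flagged.append(str(status) + " detected for: " + url)
        except requests.RequestException as exc:
            # Covers bad schemes, DNS failures, timeouts, refused connections
            flagged.append("request failed for: " + url + " (" + type(exc).__name__ + ")")
    return flagged
```

With this version, a dead host or a typo like "htp://..." shows up in the flagged list alongside the 404s instead of killing the run.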

Running this for a dummy input of:

would generate the following output: