Background

This tour is based on a list from PBS Food where people submitted their favorite pizza places. Since the list has 97 pizza places (96 once you filter out the duplicate and the one that isn't in Michigan) from all around the state, I figured it would have a pretty good spread. The places range from right where I am (Michigan State University in East Lansing) to about 8 hours away.

Gathering Data

So I have a site with a list and a load of data that I need. The first thing I thought of doing was grabbing Python and writing a quick script to pull all the items off the list.

from urllib import request
from bs4 import BeautifulSoup
import csv

pizza_page = "http://www.pbs.org/food/features/best-pizza-michigan/"

# Fetch the PBS Food page and parse it
page = request.urlopen(pizza_page)
soup = BeautifulSoup(page, 'html.parser')

# On this page each city sits in an <em> tag and each restaurant name in an <h2> tag
location_list = soup.find_all('em')
name_list = soup.find_all('h2')
combined_list = []

# Pair each city with its restaurant, tacking "Michigan" onto the city
for x in range(len(location_list)):
    combined_list.append((location_list[x].string + " Michigan", name_list[x].string))

# Write the (city, name) pairs out to a CSV
with open('pizzalist.csv', 'w', newline='') as out:
    csv_out = csv.writer(out)
    csv_out.writerows(combined_list)
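As a quick sanity check (not part of the original script), you can read the file straight back and spot-check that the cities and restaurant names lined up the way you expect:

import csv

# Read the scraped file back and print a few rows to verify the pairing
with open('pizzalist.csv', newline='') as f:
    rows = list(csv.reader(f))

print(len(rows), "places scraped")
for city, name in rows[:5]:
    print(name, "-", city)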

Now we have a CSV file with all of our pizza places and the cities they are located in. Next, we need to know exactly where they are and how long it will take to get to each of them.

from urllib import request
from bs4 import BeautifulSoup
import csv
import googlemaps

# The MSU Union, our starting (and ending) point
origin_location = ["49 Abbot Rd, East Lansing, MI 48824"]

gmaps = googlemaps.Client(key='Use your own secret key')

pizza_page = "http://www.pbs.org/food/features/best-pizza-michigan/"

# Scrape the page again, exactly as before
page = request.urlopen(pizza_page)
soup = BeautifulSoup(page, 'html.parser')

location_list = soup.find_all('em')
name_list = soup.find_all('h2')
combined_list = []

for x in range(len(location_list)):
    combined_list.append((location_list[x].string + " Michigan", name_list[x].string))

print(combined_list)

# Travel-time-to combined list: one (city, name, address, distance, duration) tuple per place
t_t_c_l = []

for i in combined_list:
    # Look the place up by a "city name" text query to get its place_id
    address_result = gmaps.find_place(input=i[0] + " " + i[1], input_type='textquery')
    # Turn the place_id into a formatted street address
    address = gmaps.reverse_geocode(latlng=address_result['candidates'][0]['place_id'])[0]['formatted_address']
    # Ask the Distance Matrix API for driving distance and time from the MSU Union
    dmatrix = gmaps.distance_matrix(origin_location, address)
    duration = dmatrix['rows'][0]['elements'][0]['duration']['text']
    distance = dmatrix['rows'][0]['elements'][0]['distance']['text']
    t_t_c_l.append((i[0], i[1], address, distance, duration))

# Overwrite the CSV with the enriched rows
with open('pizzalist.csv', 'w', newline='') as out:
    csv_out = csv.writer(out)
    csv_out.writerows(t_t_c_l)
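If you want to eyeball how far-flung the list really is, you can read the enriched CSV back and sort by driving time. This is just a rough sketch of my own, assuming the Distance Matrix duration text comes back in the usual "2 hours 31 mins" style:

import csv
import re

def duration_to_minutes(text):
    # Convert strings like "2 hours 31 mins" or "48 mins" into a minute count
    hours = re.search(r'(\d+)\s*hour', text)
    mins = re.search(r'(\d+)\s*min', text)
    return (int(hours.group(1)) * 60 if hours else 0) + (int(mins.group(1)) if mins else 0)

with open('pizzalist.csv', newline='') as f:
    rows = list(csv.reader(f))

# Each row is (city, name, address, distance, duration); sort by drive time
rows.sort(key=lambda r: duration_to_minutes(r[4]))

for city, name, address, distance, duration in rows[:10]:
    print(f"{name} ({city}): {distance}, {duration}")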

Now we have a list with all the information we need. On to the next part: figuring out what order to visit all of these places in.

Making a plan

So I had a list of nearly 100 pizza places in Michigan; now the question was how I was going to visit all of them. The answer: RouteXL. Since I didn't want to learn how to find the shortest, most optimized path between all of these locations, I decided to pay $3.60 to have it done for me. With RouteXL all you have to do is specify a starting point and an ending point, which for me were both the MSU Union. Then you drop in a plain list of addresses (gathering all the additional information above was just for my own curiosity) and RouteXL finds the shortest route. Since I paid for the service, I downloaded every format available to me. Now we have a nice GPX file to throw into a Garmin, and we can start the road trip.
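If you'd rather preview the stop order before loading the file into a Garmin, a GPX file is just XML and can be read with the standard library. Here's a small sketch assuming the export is standard GPX 1.1 with one <wpt> waypoint per stop; the file name route.gpx is a placeholder, and the namespace may differ if your export uses GPX 1.0:

import xml.etree.ElementTree as ET

# GPX 1.1 documents use this XML namespace
ns = {'gpx': 'http://www.topografix.com/GPX/1/1'}

tree = ET.parse('route.gpx')  # placeholder name for the RouteXL download
root = tree.getroot()

# Each <wpt> element is one stop, with lat/lon attributes and usually a <name> child
for stop_number, wpt in enumerate(root.findall('gpx:wpt', ns), start=1):
    name = wpt.find('gpx:name', ns)
    print(stop_number,
          name.text if name is not None else '(unnamed stop)',
          wpt.get('lat'), wpt.get('lon'))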

A map of our trip