We will use BeautifulSoup to scrape data from ESPN and upload each team name and roster via the Notion API.
This way we can run all of the code in one environment.
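Before we get to the Notion side, here is a minimal sketch of the scraping half. The roster URL and the table selectors below are assumptions for illustration; ESPN's actual page layout may differ, so adjust them to match what you see in the page source.

import requests
from bs4 import BeautifulSoup

# Hypothetical roster URL; ESPN's real page structure may differ
roster_url = "https://www.espn.com/nba/team/roster/_/name/bos"

response = requests.get(roster_url, headers={"User-Agent": "Mozilla/5.0"})
soup = BeautifulSoup(response.text, "html.parser")

# Use the page heading as a stand-in for the team name,
# then collect the text of each row in the roster table
heading = soup.find("h1")
team_name = heading.get_text(strip=True) if heading else "Unknown"
players = [row.get_text(" ", strip=True) for row in soup.select("table tbody tr")]
print(team_name, "-", len(players), "rows scraped")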
Once we have created our Notion page, we will want to add a table to it so that we have a database to upload the info into.
Now we are synced up and can upload data to the table through the API.
Let’s go back over to the Google Colab notebook now and start inputting some of the functions we will need.
Your notebook should look like this now:
import pandas as pd  # we'll use pandas later in the notebook
from urllib.parse import urlparse

# Paste your Notion share link when prompted
url = input()

# The database ID lives in the path component of the link
parseurl = urlparse(url)

# Drop the slashes to leave the bare 32-character ID
database = parseurl.path.strip("/")
print(database)
This code creates an input prompt that lets you paste the Notion URL you generate from the shared link in Notion. Click Copy link; the Share button can be found in the tab area at the top of the page.
Your code should now look something like this.
We now have our database ID.
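For example, with a hypothetical share link (the 32-character ID below is made up), the snippet behaves like this:

# url = "https://www.notion.so/0a1b2c3d4e5f60718293a4b5c6d7e8f9?v=abc123"
# urlparse(url).path  ->  "/0a1b2c3d4e5f60718293a4b5c6d7e8f9"
# printed database ID ->  "0a1b2c3d4e5f60718293a4b5c6d7e8f9"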
Next, we need to take the secret from our Notion integration; it will be stored in your notebook:
secret = "YOUR-NOTION-SECRET"
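With the database ID and the secret in hand, we can talk to the Notion API. Here is a minimal sketch of adding one row to the table using the public Notion REST API. It assumes the table's title property is called "Name" and uses a placeholder team name; rename both to match your database.

import requests

headers = {
    "Authorization": f"Bearer {secret}",
    "Notion-Version": "2022-06-28",  # a published Notion API version
    "Content-Type": "application/json",
}

# Create one page (row) in the database; "Name" is assumed to be
# the title property of your table -- rename it to match yours
payload = {
    "parent": {"database_id": database},
    "properties": {
        "Name": {"title": [{"text": {"content": "Boston Celtics"}}]},
    },
}

response = requests.post("https://api.notion.com/v1/pages", headers=headers, json=payload)
print(response.status_code)

A 200 status code means the row landed in your table; anything else usually means the integration has not been shared with the page or the property name does not match.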