
Live Data From Bigquery Into A Python Dataframe

I am exploring ways to bring BigQuery data into a Python DataFrame. Here is my code so far:

from google.cloud import bigquery
from pandas.io import gbq

client = bigquery.Client.from_service_account_json('path_to_my.json')

Solution 1:

The method read_gbq expects a query string (str) as input, not a QueryJob object.

Try running it like this instead:

query = """
    #standardSQL
    SELECT date,
        SUM(totals.visits) AS visits
    FROM `projectname.dataset.ga_sessions_20*` AS t
    WHERE PARSE_DATE('%y%m%d', _TABLE_SUFFIX) BETWEEN
        DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY) AND
        DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    GROUP BY date
"""

results_df = gbq.read_gbq(query, project_id=project_id, private_key='path_to_my.json')
