How to get daily Google Trends data for any period with R

Last updated in June 2021

Recently, I needed some seven years of daily Google Trends data. It turned out that by default it’s not possible to get it either through the web interface or via the API. So I wrote a tiny script that pulls daily Google Trends data for any period using the gtrendsR package.

What’s the problem with Google Trends?

Google Trends returns data at daily granularity only if the timeframe is shorter than 9 months. If the timeframe is between 9 months and 5 years, you’ll get weekly data, and if it’s longer than 5 years, you’ll get monthly data.
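
You can see this behaviour directly with gtrendsR; a quick sketch (the keyword and date ranges are arbitrary examples):

library(gtrendsR)

# under 9 months: daily granularity
daily <- gtrends('Taylor Swift', geo = 'US', time = '2019-01-01 2019-06-30')$interest_over_time

# between 9 months and 5 years: weekly granularity
weekly <- gtrends('Taylor Swift', geo = 'US', time = '2017-01-01 2019-06-30')$interest_over_time

# over 5 years: monthly granularity
monthly <- gtrends('Taylor Swift', geo = 'US', time = '2010-01-01 2019-06-30')$interest_over_time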

A trivial solution like querying the data month by month and then stitching it together won’t work here, because Google Trends reports interest in relative values within the requested time period. This means that for a given keyword, every queried month is scaled identically, with a local minimum of 0 and a local maximum of 100, even if one month had twice as many searches as the other.
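
You can check this by querying two adjacent months separately; each series peaks at 100 regardless of which month actually had more searches (again, the keyword and dates are arbitrary):

jan <- gtrends('Taylor Swift', geo = 'US', time = '2019-01-01 2019-01-31')$interest_over_time
feb <- gtrends('Taylor Swift', geo = 'US', time = '2019-02-01 2019-02-28')$interest_over_time

max(jan$hits)  # peaks at 100, because January is scaled against its own maximum
max(feb$hits)  # also 100, even if February had far fewer searches overall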

Querying Google Trends daily data properly

To get proper daily estimates, I do the following:

  1. Query daily estimates for each month in the specified timeframe;
  2. Query monthly data for the whole timeframe;
  3. Multiply the daily estimates for each month from step 1 by that month’s weight from step 2.

Here is the R code:

library(gtrendsR)
library(tidyverse)
library(lubridate)

get_daily_gtrend <- function(keyword = c('Taylor Swift', 'Kim Kardashian'), geo = 'US', from = '2013-01-01', to = '2019-08-15') {
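  # If 'to' falls in the current (still incomplete) month, pull it back to the end of the previous month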
  if (ymd(to) >= floor_date(Sys.Date(), 'month')) {
    to <- floor_date(ymd(to), 'month') - days(1)
    
    if (to < from) {
      stop("Specifying a 'to' date in the current month is not allowed")
    }
  }
  
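  # Weights: query the whole timeframe at once (weekly or monthly granularity),
  # aggregate to months, and scale each month's hits by the overall maximum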
  mult_m <- gtrends(keyword = keyword, geo = geo, time = paste(from, to))$interest_over_time %>%
    group_by(month = floor_date(date, 'month'), keyword) %>%
    summarise(hits = sum(hits)) %>%
    ungroup() %>%
    mutate(ym = format(month, '%Y-%m'),
           mult = hits / max(hits)) %>%
    select(month, ym, keyword, mult) %>%
    as_tibble()
  
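  # Month-long periods (start and end dates) to query one at a time at daily granularity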
  pm <- tibble(s = seq(ymd(from), ymd(to), by = 'month'), 
               e = seq(ymd(from), ymd(to), by = 'month') + months(1) - days(1))
  
  raw_trends_m <- tibble()
  
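  # Query daily data for each month and stack the results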
  for (i in seq(1, nrow(pm), 1)) {
    curr <- gtrends(keyword, geo = geo, time = paste(pm$s[i], pm$e[i]))
    if (is.null(curr$interest_over_time)) {
      stop(paste('Google Trends returned no data for', pm$s[i], pm$e[i], '- possibly rate-limited, try again later'))
    }
    print(paste('for', pm$s[i], pm$e[i], 'retrieved', nrow(curr$interest_over_time), 'rows of data (all keywords)'))
    raw_trends_m <- rbind(raw_trends_m,
                          curr$interest_over_time)
  }
  
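  # Tag each daily observation with its year-month so it can be joined to the monthly weights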
  trend_m <- raw_trends_m %>%
    select(date, keyword, hits) %>%
    mutate(ym = format(date, '%Y-%m')) %>%
    as_tibble()
  
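  # Rescale each month's daily hits by its monthly weight to make months comparable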
  trend_res <- trend_m %>%
    left_join(mult_m, by = c('ym', 'keyword')) %>%
    mutate(est_hits = hits * mult) %>%
    select(date, keyword, est_hits) %>%
    as_tibble() %>%
    mutate(date = as.Date(date))
  
  return(trend_res)
}

get_daily_gtrend(keyword = c('Taylor Swift', 'Kim Kardashian'), geo = 'US', from = '2013-01-01', to = '2013-09-01')

The get_daily_gtrend function returns a tibble with the daily trend estimates (date, keyword, est_hits). Now you can plot it nicely or use it in further analysis.
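
For example, a minimal plotting sketch (the variable name daily_trend is just for illustration; ggplot2 is already loaded via tidyverse):

daily_trend <- get_daily_gtrend(keyword = c('Taylor Swift', 'Kim Kardashian'),
                                geo = 'US', from = '2013-01-01', to = '2013-09-01')

ggplot(daily_trend, aes(x = date, y = est_hits, color = keyword)) +
  geom_line() +
  labs(x = NULL, y = 'Estimated daily interest', color = NULL)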

4 comments
mary azari 2 mon

Hi Alex
Many thanks for sharing the code
I applied it but found an error like this:
Error in UseMethod("count") :
no applicable method for 'count' applied to an object of class "NULL"
I’d greatly appreciate it if you could kindly guide me.

Alex Dyachenko 2 mon

Hi Mary, I just double-checked the script and it seemed to work fine. Most likely it’s something on your side. I just replied to you via email.

Giovanni da Rosa 1 mon

Hi Alex!

Thanks for sharing this code.
Do you have any idea how I can get daily city-level data?
I’ve never used R, but I really need to work with data grouped by day, especially for Brazil.
I was trying to do that in Python, but I could not find a way to do it with pytrends, so here I am.

If you have any ideas, it would be of great help!

Alex Dyachenko 2 d

Hi Giovanni,

As far as I can see, there is no simple way to implement this. The Google Trends web UI, as well as the gtrendsR library, does not provide trends broken down by geo. You can have either the trend or the geo breakdown, but not both.

The only workaround I can think of is to query daily data separately for each city and then normalize it by the aggregated popularity of the query in those cities.

I’d suggest taking a look at the gtrendsR documentation: https://cran.r-project.org/web/packages/gtrendsR/gtrendsR.pdf. It has a function called ‘gtrends’ that returns an object with multiple components, including ‘interest_over_time’ (the trend) and ‘interest_by_city’ (a snapshot). Combine the two and you’ll get what you want; a rough sketch is below.
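
Something along these lines (an untested sketch; the keyword is arbitrary, and I’m assuming interest_by_city comes back with ‘location’ and ‘hits’ columns):

library(gtrendsR)
library(tidyverse)

res <- gtrends('Taylor Swift', geo = 'BR', time = '2021-01-01 2021-03-31')

daily  <- res$interest_over_time   # country-level daily trend
cities <- res$interest_by_city     # one-off snapshot of relative popularity by city

# Spread the country-level daily trend across cities in proportion to the snapshot.
# This is only an approximation: the city shares are a single snapshot, not a daily series.
city_daily <- cities %>%
  mutate(share = hits / sum(hits, na.rm = TRUE)) %>%
  select(location, share) %>%
  crossing(daily %>% select(date, hits)) %>%
  mutate(est_city_hits = hits * share)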

Thoraya R 12 d

Hi Alex,
Thanks for this, it’s very useful! I got the same error as Mary; any thoughts on how to resolve it?

Alex Dyachenko 2 d

Hi Thoraya,

I wasn’t able to reproduce Mary’s case. If you still have this problem, you can send me your code via email and I’ll try to take a look at it.

Yoan Aleksandrov 7 d

Hi Alex,
Thank you for providing this solution. I tried the code and it works for me, but I am not an expert in R and I am not able to view or plot the data I obtained. plot() does not work for displaying the data returned by the get_daily_gtrend function and gives an error. So in short, my question is: how do I access the complete daily data for use in further analysis?

Alex Dyachenko 2 d

Hi Yoan,

I’d suggest going over some basic books on R, such as https://r4ds.had.co.nz/. If you need to plot the data ASAP and don’t have time to read a whole book, try this: https://uoftcoders.github.io/rcourse/lec04-dplyr.html
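
In the meantime, a quick sketch of getting at the data (the variable name trend is just for illustration; write_csv comes from readr, which is loaded with tidyverse):

trend <- get_daily_gtrend(keyword = 'Taylor Swift', geo = 'US', from = '2013-01-01', to = '2013-09-01')

head(trend)                    # peek at the first rows
write_csv(trend, 'trend.csv')  # export the full daily data for use elsewhere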