Track your ad tests at scale with this advanced AdWords script



Keeping track of when your ad tests have enough data to be concluded is a real challenge when you have dozens of different tests running within an account.

The most common way to split test revolves around labeling sets of ad copies “Ad copy 1” and “Ad copy 2” and using the dimensions tab to see the aggregate performance of each set of ads.

The problem here is that you have to check your dimensions tab regularly to see when a test has enough data to conclude.

It’s also easy to lose track of which labels are being tested against each other. You’ll sometimes find yourself wondering whether ad copy 50 is being tested against ad copy 49 or ad copy 51.

So the developers at my agency built this script to give you a quick top-line summary of all the split tests running within your account.

It also allows you to quickly determine when an ad test has finished, even if you’re running hundreds of tests at a time.

Better still, it will notify you of the completion, so you can go into your account and create a new ad copy.

How the script works

The script works by letting you design experiments within an input sheet on Google Sheets.

Here you tell the program when the test will start and which labels will be compared. For example, “Ad copy 1” and “Ad copy 2.”

The script then pulls the data for the labels you have selected over that time range and shows you how the ad test is performing. The results are written to an output sheet in the same Google Sheets document.
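The aggregation step can be sketched in plain JavaScript. This is a minimal illustration, not the script's actual code: the `adStats` shape and the `summariseByLabel` helper name are assumptions.

```javascript
// Aggregate per-ad stats into one row per label, then compute CTR.
// Each entry in adStats is assumed to look like:
//   { label: 'Ad copy 1', clicks: 120, impressions: 4000 }
function summariseByLabel(adStats) {
  var totals = {};
  adStats.forEach(function (ad) {
    if (!totals[ad.label]) {
      totals[ad.label] = { clicks: 0, impressions: 0 };
    }
    totals[ad.label].clicks += ad.clicks;
    totals[ad.label].impressions += ad.impressions;
  });
  // Derive CTR for each label once the totals are in
  Object.keys(totals).forEach(function (label) {
    var t = totals[label];
    t.ctr = t.impressions > 0 ? t.clicks / t.impressions : 0;
  });
  return totals;
}
```

In the real script the per-ad rows would come from the Google Ads account itself, filtered to the experiment's labels and date range, before being written to the output sheet.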


It will also run daily and email you once a test has enough data to reach a conclusion.

Setting up the script

To get the script to run, there are a few things that you need to do first:

  1. Go to this link and create a copy of my experiments studio dashboard sheet.
  2. Paste the link to the dashboard into line 17 of the script.
  3. On line 20, select the metric which you want to split test by. This can take either CTR, CvR or CvR*.
  4. On line 23, select the statistical significance threshold that you want to test towards. This can take the following values (0.90, 0.95 or 0.99).
  5. On line 27, enter the currency that your account runs in (e.g., “£”).
  6. On line 31, enter the email address that should receive the daily notifications telling you which ad tests have finished.
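Taken together, the configuration section of the script (around lines 17–31) might look something like the following. The variable names here are illustrative assumptions — match them to the comments in your copy of the script.

```javascript
// Line 17: link to your copy of the experiments studio dashboard sheet
var SPREADSHEET_URL = 'https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID';

// Line 20: metric to split test by
var TEST_METRIC = 'CTR';

// Line 23: statistical significance threshold — 0.90, 0.95 or 0.99
var SIGNIFICANCE_THRESHOLD = 0.95;

// Line 27: the currency your account runs in
var CURRENCY = '£';

// Line 31: address that receives the daily "test finished" notifications
var NOTIFICATION_EMAIL = 'you@example.com';
```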

Setting up the studio sheet

Once you have set up the script within Google Ads, you will need to configure the studio Google Sheet. Here is what you need to do to create a new experiment.

  1. Within the input tab, give your experiment a name, for example “CTA test.”
  2. Set the start date for each experiment to the day the experiment started.
  3. In the compare labels column, select the two labels that you want to compare, separated by a comma (e.g., “label 1,label 2”).
  4. Name the sheet that you want your tests to be displayed on. The script will then automatically generate these tabs.
  5. If you only want to include certain conversions you can use the custom conversions tab to select the conversions that you want to include.
  6. In the output template, you can select which metrics you want to include or exclude. If you change these, you will need to delete the experiment tabs and then re-run the script within Google Ads.
  7. You should then set the script to run daily so it can work out when the test has been completed.
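The daily “has this test concluded?” check is essentially a significance test between the two labels. A minimal sketch, assuming a CTR test and a standard two-proportion z-test (the script's own statistics may differ):

```javascript
// Two-proportion z-test: is the CTR difference between two labels
// significant at the chosen confidence level?
// a, b: { clicks: ..., impressions: ... } totals for each label
function testConcluded(a, b, confidence) {
  var p1 = a.clicks / a.impressions;
  var p2 = b.clicks / b.impressions;
  // Pooled proportion under the null hypothesis of equal CTRs
  var pooled = (a.clicks + b.clicks) / (a.impressions + b.impressions);
  var se = Math.sqrt(pooled * (1 - pooled) *
                     (1 / a.impressions + 1 / b.impressions));
  var z = Math.abs(p1 - p2) / se;
  // Two-tailed critical values for the supported thresholds
  var critical = { 0.90: 1.645, 0.95: 1.96, 0.99: 2.576 };
  return z >= critical[confidence];
}
```

Scheduling the script to run daily means this check fires automatically, and the email notification is sent as soon as the winner is statistically clear.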

Script


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Wesley is managing director at Clicteq. He currently manages a $4M-plus Google Ads portfolio across a range of different sectors. He regularly writes in leading marketing publications such as Econsultancy and Campaign Magazine.





