Frequently Asked Questions

What does Sticky's eye-tracking platform measure?
Eye tracking is the process of measuring the point of gaze (where one is looking) and can tell you three things:

  1. How many people saw a specific area of interest (AOI)
  2. How long they looked at it
  3. In what order they looked at different objects

Eye-tracking technology makes it easy to obtain hard facts and statistics as well as intuitive visualizations of what captured consumers' attention.

What are the metrics that I can get out of a Sticky experiment?
Sticky records several different metrics when an area of interest is defined. The metrics Sticky collects are:

  1. Percentage Seen: The percentage of respondents who saw the stimulus and also saw the AOI.
  2. Earned Attention: The average amount of time a respondent spent on the AOI. Respondents who did not see the AOI are not included in the statistics.
  3. Average Exposure: The average number of seconds the stimulus was shown on respondents' screens.
  4. Max Exposure: The maximum number of seconds the stimulus was shown on a respondent's screen.
  5. Percentage First Look: The percentage of respondents who saw the stimulus and looked at this AOI first.
  6. Time to First Look: The average amount of time it takes a respondent to see the AOI. Respondents who did not see the AOI are not included in the statistics.
  7. Percentage Doubletake: The percentage of respondents who saw this AOI and revisited it (looked at it, looked away, then looked back).
  8. Percentage of Clicks: The number of participants who clicked in the AOI divided by the total number of participants who saw the stimulus.
  9. Number of Clicks: The number of clicks that happened within the AOI.
  10. Average Time to Click: The average amount of time it takes a respondent to click inside the AOI.
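To make the ratio-style metrics concrete, here is a minimal sketch of how Percentage Seen and Percentage of Clicks could be computed from per-respondent records. The field names and data layout are illustrative assumptions, not Sticky's actual data model or API:

```python
# Hypothetical per-respondent records for one stimulus/AOI pair.
# Field names are illustrative, not Sticky's actual data model.
respondents = [
    {"saw_stimulus": True,  "saw_aoi": True,  "clicked_aoi": True},
    {"saw_stimulus": True,  "saw_aoi": True,  "clicked_aoi": False},
    {"saw_stimulus": True,  "saw_aoi": False, "clicked_aoi": False},
    {"saw_stimulus": False, "saw_aoi": False, "clicked_aoi": False},
]

# Both metrics use respondents who saw the stimulus as the denominator.
saw_stimulus = [r for r in respondents if r["saw_stimulus"]]

# Percentage Seen: share of stimulus viewers who also saw the AOI.
percentage_seen = 100 * sum(r["saw_aoi"] for r in saw_stimulus) / len(saw_stimulus)

# Percentage of Clicks: AOI clickers divided by stimulus viewers.
percentage_clicks = 100 * sum(r["clicked_aoi"] for r in saw_stimulus) / len(saw_stimulus)

print(percentage_seen)    # 2 of 3 viewers saw the AOI, ≈ 66.7
print(percentage_clicks)  # 1 of 3 viewers clicked it, ≈ 33.3
```

Note that respondents who never saw the stimulus are excluded from both denominators, mirroring the metric definitions above.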


Are there any limitations for picture size and dimensions?
The limitation is the resolution of the user's screen. Anything larger than roughly 1200 x 768 pixels will not fit on most users' screens.


How do I compare stationary/hardware eye trackers to webcam eye tracking?
Specialized hardware and webcam-based eye tracking differ mainly in the frequency at which the respondent's gaze is captured. Sticky's algorithm captures and calculates respondent gaze at a 10 Hz frequency. Parallel studies have shown a high correlation of results between hardware and webcam eye tracking at an aggregated level.
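A 10 Hz capture rate means one gaze sample every 100 ms, so time-based metrics such as dwell time can be estimated by counting samples that land inside an AOI. The sketch below illustrates that relationship only; it is an assumption about how sampled gaze data maps to seconds, not Sticky's internal algorithm:

```python
SAMPLE_RATE_HZ = 10  # 10 Hz: one gaze sample every 100 ms

# Hypothetical 1-second gaze trace: True when the sample fell inside an AOI.
samples_in_aoi = [False, False, True, True, True, False, True, True, False, False]

# Dwell-time estimate: number of in-AOI samples times the sample period.
dwell_seconds = sum(samples_in_aoi) / SAMPLE_RATE_HZ
print(dwell_seconds)  # 5 samples * 0.1 s = 0.5 s
```

This also shows why sampling frequency matters: at 10 Hz, dwell times are resolved in 100 ms steps, while specialized hardware sampling at several hundred Hz resolves much finer steps.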


What is an AOI (Area of Interest)?
AOI is an abbreviation for "Area of Interest". To understand the engagement and interaction with each tagged element on the page, the system outputs the standard area-of-interest statistics for eye tracking.
After uploading stimuli onto the platform, you have the option to tag AOIs on your stimuli. You can click and drag boxes or polygons to create AOIs surrounding specific areas. Tagging these elements allows the analysis engine to output AOI statistics including:

  • Percentage of users who saw the area
  • Earned Attention for the area of interest
  • Avg. Exposure
  • Max. Exposure
  • Percentage of users with First Look on the AOI
  • Time to 1st look
  • Percentage of clicks on the area
  • Total clicks on the area
  • Avg. Time to Click

    Please note: It is important that your AOI names are descriptive, as your AOI statistics table will only show the name that you have given an area.
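Since AOIs can be boxes or polygons, attributing a gaze point to an AOI is a geometric hit test. The sketch below shows generic versions of both tests (a simple rectangle check and the standard ray-casting polygon check); it is an illustration of the concept, not Sticky's implementation, and the coordinate conventions are assumptions:

```python
# Illustrative AOI hit-testing: a gaze point counts toward an AOI's
# statistics when it falls inside the tagged box or polygon.
# Generic sketch, not Sticky's internal implementation.

def point_in_box(x, y, box):
    # box = (left, top, right, bottom) in screen coordinates.
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom

def point_in_polygon(x, y, vertices):
    # Standard ray-casting test: cast a ray to the right of (x, y)
    # and count how many polygon edges it crosses; odd means inside.
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside
```

For example, with a 10x10 box at the origin, the gaze point (5, 5) is inside and (15, 5) is not; the same idea extends to polygon AOIs of any shape.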


What's the best way for me to get support?
Since every Sticky experiment is unique, we suggest that you contact support with a detailed description of the issue you are facing, using the template below. Including all of these details makes it easy for the support team to understand your issue and determine the best way to diagnose it.

Hi Team,
I had an issue while doing _________.

The steps I took to reproduce it are as follows:

Experiment / Study ID: ____
Participant ID: ____
Link to study: ______
Browser: _____
Screenshot: ____


What languages do you support?
On Sticky, you can create experiments in over 40 languages. Some of these are: Arabic, Chinese, Dutch, English, Finnish, French, German, Hebrew, Hindi, Hungarian, Indonesian, Italian, Japanese, Korean, Malay, Norwegian, Polish, Portuguese/Brazil, Portuguese/Portugal, Romanian, Russian, Spanish (formal), Spanish (informal), Swedish, Tamil, Thai, Tagalog, Turkish, Vietnamese.

If there is a language that you would like to use but is not part of the list, please reach out to our support team for more help, at


What are the main reasons for a low usable rate on my experiment?
The following reasons can contribute to low usable rates:

  1. Minimize time - the longer the session, the higher the probability that users move their heads, invalidating the session.
  2. Don't use too many AOIs - if there are too many AOIs, or the AOIs are too small, the sample size necessary to differentiate results can become prohibitive.
  3. Consider the attention span or fatigue threshold of the respondent - including too many cluttered images or directions integrated with your eye tracking can lose the respondent's attention, potentially lowering your usable counts.
  4. Avoid showing the same page/image/ad to the same respondent - even if you change elements in it, they may not catch the differences and their attention may be lost.
  5. Direct your respondents to behave well during testing - if respondents are told how to act during testing, there is a higher chance their sessions will be usable. Remind respondents to keep their heads still, sit in a well-lit area, remove glasses if they are not necessary, etc.
  6. Use a larger sample size - our tests are usually run on 30 usable sessions. At small sample sizes, individual biases are more clearly seen.


Are we missing a key FAQ? Please let us know