Common reporting-related questions and issues.
For newly onboarded traits, why does the Trait Graph sometimes display lower than expected numbers, or even 0?
Sometimes, after you upload traits, the Trait Graph shows no results or lower than expected numbers. This happens when the volume of inbound data is so great that the processing job cannot finish ingesting it before the reporting deadline for that day.
As a result, this data is sent to the reporting system late and won't show up in the 1-day reporting interval which is used for plotting the Trait Graph. However, you can view this data in the 7, 14, 30, and 60-day report intervals in a Trend or General Report on the following day.
Some segments are missing from an Overlap report. Where are they?
To help reduce computational demand, these reports omit statistically insignificant data from the results. Your segments are not missing; they're dropped because they do not yield meaningful results or useful pools of users that you can target.
If I run an email marketing campaign, how can I determine if redirected users come to my site from that campaign or from other sources?
Append a campaign-specific query string to the URL of the site section you want to monitor. Next, set up a trait rule to capture this variable. For example, if your URL passes in a campaign ID like this, www.test123.com/electronics?campaign=123, then create a trait rule that captures that data from the h_referer variable, such as h_referer = 'campaign=123'.
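The matching that such a trait rule performs can be sketched in Python. This is an illustrative helper only (the function name extract_campaign_id is hypothetical, and AAM evaluates trait rules server-side against the h_referer signal); it simply pulls the campaign parameter out of a referrer URL's query string:

```python
from urllib.parse import urlparse, parse_qs

def extract_campaign_id(referrer):
    """Return the campaign ID embedded in a referrer URL, or None.

    Approximates a trait rule that looks for 'campaign=...' in h_referer.
    """
    params = parse_qs(urlparse(referrer).query)
    values = params.get("campaign")
    return values[0] if values else None

print(extract_campaign_id("http://www.test123.com/electronics?campaign=123"))  # -> 123
```

A visit arriving without the campaign parameter yields None, which is how you distinguish campaign-driven visits from other traffic sources.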
What is the difference between real-time and total segment population counts?
I have a segment consisting of just one trait. When I look at Reporting metrics, the counts for the trait and the segment don't match. Why is that?
I onboarded a file, and my Inbound receipt shows a high number of successfully processed records, but reporting shows much lower numbers. Why?
In the backend, onboarded data gets attached only to users that are still active in AAM (the user must have had DCS activity in the past 120 days). Therefore, if you onboard data for users that have already expired in Audience Manager, Inbound might tell you that a certain number of user records were onboarded, but if these users have not had any recent activity, this data is dropped when it reaches our User Profile Store, and reporting reflects the lower number.
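The drop described above can be sketched as a filter on profile recency. This is a simplified Python illustration, not AAM's actual implementation; the function name keep_active_records and the dict-based inputs are assumptions for the example:

```python
from datetime import datetime, timedelta

ACTIVITY_WINDOW_DAYS = 120  # profiles with no DCS activity in this window are expired

def keep_active_records(onboarded, last_activity, now):
    """Filter onboarded records to users whose profile is still active.

    onboarded: dict mapping user ID -> onboarded record
    last_activity: dict mapping user ID -> datetime of last DCS activity
    Records for users outside the activity window are dropped, which is
    why reporting can show fewer records than the Inbound receipt.
    """
    cutoff = now - timedelta(days=ACTIVITY_WINDOW_DAYS)
    return {
        uid: rec for uid, rec in onboarded.items()
        if last_activity.get(uid, datetime.min) >= cutoff
    }
```

With two onboarded users, one active 10 days ago and one inactive for 300 days, only the active user's record survives the filter, matching the behavior the answer describes.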
Why are the trait uniques for my cross-device onboarded traits much higher than the total number of onboarded records?
If you onboard a file for a cross-device data provider keyed off the customer ID, Audience Manager performs a lookup to get all device IDs that are associated with each of the onboarded customer IDs. Audience Manager then assigns the onboarded traits to the device ID associated with the customer ID.
As an example, suppose you have onboarded 100 records, and on average AAM has associated three device IDs with each of these customer IDs. As a result, the onboarded trait is assigned to 300 device IDs.
There are two reasons why a single cross-device customer ID can be associated with multiple device IDs:
- Users are logging in to the same cross-device account from multiple computers/browsers.
- Users are clearing their cookies. Note: “Abandoned” cookies are deleted after 120 days of user inactivity.
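The fan-out above can be sketched as a lookup against a customer-to-device mapping. This is an illustrative Python sketch only (the function name and the dict-based device graph are assumptions; AAM performs this lookup internally):

```python
def assign_trait_to_devices(onboarded_customer_ids, device_graph):
    """Expand customer-keyed onboarded records to device IDs.

    device_graph: dict mapping customer ID -> list of associated device IDs.
    The onboarded trait is assigned to every device linked to each
    customer ID, so trait uniques can exceed the onboarded record count.
    """
    devices = set()
    for cid in onboarded_customer_ids:
        devices.update(device_graph.get(cid, []))
    return devices

# 2 onboarded records, 3 associated devices each -> 6 trait uniques
graph = {"cust1": ["d1", "d2", "d3"], "cust2": ["d4", "d5", "d6"]}
print(len(assign_trait_to_devices(["cust1", "cust2"], graph)))  # -> 6
```

Scaling this up gives the arithmetic in the answer: 100 onboarded records with three devices each yield 300 trait uniques.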
Why are Total Trait Realizations for my onboarded traits always 0?
Total Trait Realizations correspond to page loads: they count the number of times a specific trait has fired in real time. This metric is calculated for rule-based traits only, so onboarded traits always show Total Trait Realizations of 0.
I created a trait and the Trait Graph shows a larger number of Unique Trait Realizations than the Total Trait Population. Is this normal?
You are seeing this because the Unique Trait Realizations are real-time metrics, but the reporting jobs that calculate the Total Trait Population are not real-time. The Total Trait Population should exceed the Unique Trait Realizations within a couple of days.