The Official Story on AT&T Mark the Spot

by: Mark Austin and Mike Wish, Tue Oct 05 16:47:00 EDT 2010

If you’re an AT&T smart phone customer and are having a problem, AT&T wants to know where, what, when, and why. It’s a lot of information, and to get it AT&T created the AT&T Mark the Spot℠ app, which lets customers easily mark problem types and locations. This data is reported directly to engineers, who use it to optimize network performance and prioritize improvements.

It’s a win-win situation: AT&T gets real-time data about where customers are having performance or coverage issues, and customers get a voice in where AT&T should make upgrades or fixes.


Overlaid on a map, customer reports help AT&T pinpoint exactly where (and what) problems are occurring in relation to cell sectors. Color coding identifies the report type.


But when AT&T Mark the Spot was released in December 2009, some questioned it. Why did customers need to report problems? Surely AT&T already had the information it needed to improve its network.

Yes, AT&T does maintain exhaustive metrics and statistics—hundreds of them—on network performance. There are also call records that detail dropped calls and other issues. And for monitoring coverage levels, AT&T conducts extensive drive tests and uses modeling tools as well.

But these monitoring mechanisms look at the network solely from the engineering perspective. It’s unclear how specific network metrics, and which of the hundreds collected, most accurately reflect the quality issues customers care about, such as no coverage, dropped calls, and slow data connections.


The difficult problem of evaluating the customer experience

In fact, it’s hard to evaluate quality issues as perceived by the customer, even for engineers knowledgeable about the local area. A customer is there when a problem occurs; engineers are not, and must depend on characteristics collected from the device or from network management systems to interpret quality issues.

But not all issues are associated with readily collected characteristics. Holes in coverage are one example. AT&T has very good coverage, reaching 97% of US homes; however, obstructions such as hills or buildings create gaps that don’t provide easily detected characteristics. Levels of coverage are also hard to gauge. Technicians can’t enter homes or apartments. A signal might be strong outside a building but weak inside, particularly on the 50th floor.

Many failed call attempts (especially those due to no coverage) are also hard to detect simply because, without a connection, no call record is created.


Sometimes the customer and the system classify the same event differently. Take dropped calls: customers may perceive a call as dropped when they hear nothing and hang up quickly, but the system requires a certain interval before registering the call as dropped. Or the audio may be present, too faint for the listener but detectable by the system; for the customer it’s a dropped call, for the system a poor-quality audio issue.
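The classification mismatch above can be illustrated with a toy sketch. The 2-second threshold, function names, and labels here are purely illustrative, not AT&T's actual logic:

```python
# Toy illustration: the same call event can be labeled differently by the
# customer and by the system. Threshold and labels are hypothetical.

SYSTEM_DROP_THRESHOLD_S = 2.0  # dead air the system needs before logging a drop

def system_label(silence_s, audio_detectable):
    """How a network-side monitor might classify the event."""
    if audio_detectable:
        return "poor-quality audio"
    return "dropped call" if silence_s >= SYSTEM_DROP_THRESHOLD_S else "normal"

def customer_label(silence_s, audio_audible):
    """A customer hangs up as soon as they hear nothing."""
    return "dropped call" if not audio_audible else "normal"

# Customer hangs up after 1 second of silence: the customer calls it dropped,
# but the system never reaches its threshold and calls it normal.
print(system_label(1.0, audio_detectable=False))   # normal
print(customer_label(1.0, audio_audible=False))    # dropped call
```

The faint-audio case diverges the same way: `system_label(5.0, True)` reports poor-quality audio while the customer reports a drop.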

Customers easily detect quality issues but certainly don’t bother reporting them every time, and many don’t report at all. Granted, some do call customer care centers, but usually after the fact, when the details of where and when are fuzzy. Data-centric customers—those who predominantly use the data network—call even less.

As valuable as they are, customer evaluations and surveys lack the necessary, specific information—exact time and location, sector, cell tower, device type, network characteristics at the time of the problem—needed by engineers to get a clear fix on the likely cause.

So while general satisfaction or other survey information has the customer perspective, it lacks the specifics needed by engineers. Engineers have the metrics but lack the customer perspective.


Growing need for customer experience data

With the introduction of the iPhone in 2007, data traffic started increasing markedly on AT&T mobile networks; traffic levels would jump even more dramatically in 2008 with the release and phenomenal sales of the iPhone 3G. The resulting congestion was beginning to strain the network and negatively impact customers. With more people, more data, and increased expectations, quality issues for voice and data were suddenly very noticeable.

AT&T was already spending billions on upgrades, but with the network strained, it was becoming critical to concentrate on those upgrades that would most help customers. But what were they?


To answer this question, AT&T Chief Technology Officer John Donovan initiated a wide-ranging effort to better understand the customer experience. He asked two groups that had previously collaborated—Network Planning and Engineering (NP&E) and AT&T Labs Research—to form a team and come up with a way to correlate customer experience with AT&T's extensive network knowledge.

AT&T already had a tool called NIT (network incident tracker) that could help. It was a PC application given to some employees in early 2008 to log holes in coverage or other incidents, so planners would better know where to put upgrades. And NIT worked, to the extent that employees recorded the location and time accurately. The resulting data did locate problem areas, at least those clustered near where employees lived and traveled.

Since NIT’s creation, GPS had been embedded in the iPhone, and the iTunes store had filled with apps. The team took full advantage of these developments to create the iPhone app, AT&T Mark the Spot. While it shared some functionality with NIT, it looked, felt, and worked nothing like the original.

AT&T Mark the Spot was easy to use. Customers simply selected a problem type from a list, and the app did the rest, appending time and exact location (grabbed from the device) as well as network characteristics (captured from the network after the report was submitted). While NIT had been restricted to coverage holes and dropped calls, Mark the Spot expanded the scope of problems to include data speed and other connection information. Reports were submitted primarily in real time, with the option to provide more detail, even a brief survey.
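A report of the kind described above can be sketched as a small record. The field names and structure here are hypothetical; the actual wire format is not public:

```python
from dataclasses import dataclass, asdict
import time

# Hypothetical sketch of what a Mark the Spot report might carry.
# Network characteristics are appended server-side after submission,
# so they are not part of the device payload here.
@dataclass
class SpotReport:
    problem_type: str      # e.g. "dropped call", "no coverage", "slow data"
    timestamp: float       # grabbed from the device when the user taps
    latitude: float        # GPS fix from the device
    longitude: float
    comment: str = ""      # optional free-text detail

report = SpotReport("dropped call", time.time(), 40.7128, -74.0060)
print(asdict(report)["problem_type"])   # dropped call
```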


It’s important to point out that AT&T Mark the Spot is only one element of Donovan’s initiative to learn more about the customer experience—customer care, trouble tickets, external surveys, feedback from sales, and data mining of tweets are others. But because of its immediacy and the metrics each report carries, Mark the Spot data is particularly valuable.


Mark the Spot goes public

AT&T Mark the Spot for the iPhone was ready for release in December 2009.

Releasing AT&T Mark the Spot to the public was not without risks since it would focus even more attention on AT&T’s network. But Donovan cared enough about the customer experience to go ahead. And crowdsourcing as a means of data collection was gaining mainstream respectability (see Statistics Can Find You a Movie).

The bigger risk was that customers would expect the problems they reported to be fixed immediately. Some fixes (especially equipment problems) are easy, but others, particularly those requiring new capacity, take time. Placing new cell towers is sometimes an 18-month (or longer) process: permits must be obtained, and local ordinances, community concerns, and neighborhood objections addressed. It was thus recognized from the beginning that the app would need two-way communication so customers could be notified of improvements coming soon to their area. (Two-way communication was enabled with Version 2.0’s Network News feature in August 2010.)


What happens to the data?

As Mark the Spot data arrives, it feeds into a database (along with data from many other sources), where it is accessible to AT&T engineers and scientists, who look at it from different perspectives using a variety of tools.

Some groups incorporate the data into map-based visualization tools along with existing network information (close to 50 other metrics in all). From a high level, engineers can quickly see where and what the problems are and which zip codes have the most dropped calls or slowest data connections, then zoom in for a closer look.


By filtering reports by type and location while looking at network information (coverage and load levels, physical obstructions, cell tower alarms), engineers zero in on the likely cause, even before the problems are reported by other means.


Customer reports show clearly where a new cell site is needed. Short-term solution: Re-orient some antennas to cover complaint areas. Long-term solution: Build a cell site at area of no-coverage reports.


Other groups, including Network Planning and Engineering (NP&E) and AT&T Research, have created a range of analytical tools specifically for Mark the Spot data, including one that detects clusters of submissions and then automatically generates service alarms to the appropriate work group. Other tools aggregate reports and plot them on a graph to reveal overall trends and patterns; anomalies in the data graphically highlight issues to be investigated.
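The cluster-then-alarm idea described above can be sketched with a simple grid bucketing scheme. The grid size, threshold, and function names are illustrative assumptions, not AT&T's actual parameters or algorithm:

```python
from collections import defaultdict

# Sketch of cluster detection: bucket reports into coarse lat/lon grid
# cells and flag any cell whose report count crosses a threshold.
GRID = 0.01      # degrees; roughly 1 km in latitude (assumption)
THRESHOLD = 3    # reports in one cell before raising an alarm (assumption)

def cell(lat, lon):
    """Map a coordinate to its grid cell."""
    return (round(lat / GRID), round(lon / GRID))

def alarms(points):
    """Return the grid cells (and their reports) that trip the threshold."""
    buckets = defaultdict(list)
    for lat, lon in points:
        buckets[cell(lat, lon)].append((lat, lon))
    return {c: pts for c, pts in buckets.items() if len(pts) >= THRESHOLD}

pts = [(40.7410, -74.0010), (40.7412, -74.0013), (40.7409, -74.0008),
       (41.2000, -73.5000)]
print(len(alarms(pts)))   # 1: three nearby reports form one alarmed cluster
```

A production system would use real clustering and route the alarm to the appropriate work group; this only shows the shape of the idea.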

The more fundamental, long-term strategic investment prioritization will come from systematically analyzing AT&T Mark the Spot reports in combination with other data.

Statisticians, mathematicians, data miners, and others at AT&T Research will soon be analyzing Mark the Spot data using the network reporting architecture created for tracking the complicated end-to-end paths within the IP network. (See The Data-Driven Approach to Network Management: Innovation Delivered.) As mobile devices are used increasingly for data connections and data issues become a larger component of the customer experience, this ability to look deeper into the network, tracking events with customer reports, will minimize data connection problems.


When one Mark the Spot respondent submitted the message: “Coverage usually good. No coverage today,” engineers easily traced the data session to the cell tower that handled it, finding there the responsible alarm.


Expert systems will soon be able to automatically classify problems and suggest likely causes based on network conditions at the time, alert the responsible work groups, and auto-generate feedback for customers.


More uses of the data

Mark the Spot data also helps validate other metrics and data sources. By linking reports to network KPIs (key performance indicators), AT&T has been able to statistically validate that each report type (dropped calls, failed call attempts, voice quality, and so on) relates to its own set of objective network metrics. This validates both Mark the Spot and the KPIs as relevant customer experience metrics. Engineers now know which KPIs to measure and improve in order to have the greatest customer benefit. And if Mark the Spot respondents report higher rates of dropped calls than other sources suggest, those other sources need to be examined more closely.
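The validation idea above boils down to checking that a report type tracks its candidate KPI. A minimal sketch with invented numbers; real validation would use proper statistical tests over far more data:

```python
# Sketch: does a network-side dropped-call KPI track customer reports?
# Both series below are made up for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed directly."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

daily_reports = [12, 30, 25, 8, 40]        # Mark the Spot dropped-call reports
kpi_drop_rate = [1.1, 2.9, 2.4, 0.9, 3.8]  # hypothetical drop-rate KPI (%)

r = pearson(daily_reports, kpi_drop_rate)
print(round(r, 3))   # close to 1.0: this KPI tracks the customer reports
```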

AT&T Mark the Spot also paints a clearer picture of the types of problems and their importance for different customers. What matters is not only where and when customers use the network, but also how (e.g., calls vs. data). Customers often blame the network, whether or not the network is the cause of the problem. The real cause may be a device issue, a configuration issue, a problem with the called party’s network, a slow server or web site, or possibly a mismatch between the customer and the device (a data-centric customer using a 2G phone, for example).


To know how to help customers based on how they use the network, AT&T is incorporating Mark the Spot data into predictive models to understand what solutions are most appropriate: It may be a 3G microcell or identifying a nearby Wi-Fi location. These models also help AT&T understand which customers are most vulnerable so that AT&T can take corrective action.


Is it working?

AT&T Mark the Spot will work only if customers download it and use it. On this measure, it has been successful, with more than 1,000,000 downloads and 4,000,000 submissions, numbers that will increase as the app becomes available on more devices. AT&T Mark the Spot is available on the iPhone and Android (since August 2010); it is coming soon to BlackBerry, Windows, and Symbian smart phones.

For AT&T, Mark the Spot is working because it provides high-quality, specific, timely customer experience data, and in a way that’s a lot easier and faster for customers than calling customer care. AT&T saves the cost of avoidable calls while getting information from customers who don’t want the hassle of calling. Mark the Spot gives an early warning, what John Donovan calls a whisper rather than the shout more typical of customer care.

But is it working to make customers more satisfied? While it’s still early in the process, initial signs are encouraging: thousands of Mark the Spot respondents surveyed reported improved service overall or in some locations.


Closing the loop

Some customers who would have left AT&T service have stayed thanks to the Network News feature introduced in version 2.0 (August 2010). This feature lets customers know that their reports are being addressed and sends notifications within the app when improvements are scheduled for areas they care about. Several hundred thousand submissions have received auto-feedback so far.


News items respond directly to specific submissions.


The news items are not generic updates. AT&T can link a registered customer’s previous submissions to a specific cell sector being upgraded (though customers can also see upgrades anywhere in the country or in a specific zip code). Notifications are for both fixes in the works (typically 1-90 days) and those already completed.
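Linking past submissions to a sector upgrade, as described above, is essentially a join. A minimal sketch with hypothetical sector IDs and record layouts:

```python
# Sketch of driving Network News items from upgrade records. Sector IDs,
# field names, and messages are all invented for illustration.
submissions = [
    {"customer": "c1", "sector": "NYC-014", "type": "no coverage"},
    {"customer": "c2", "sector": "NYC-022", "type": "dropped call"},
    {"customer": "c1", "sector": "NYC-022", "type": "slow data"},
]
upgrades = {"NYC-022": "Capacity upgrade completed"}

# Each past submission against an upgraded sector yields a notification.
notices = [
    (s["customer"], s["sector"], upgrades[s["sector"]])
    for s in submissions if s["sector"] in upgrades
]
for customer, sector, news in notices:
    print(f"{customer}: update for {sector}: {news}")
```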

The communication does not stop there. AT&T can tell customers about improvements, but customers can tell AT&T whether or not the improvement actually resolved the initial concern. Learning which improvements helped customers (and which did not) provides additional information about what network events most impact customers.

There’s a dynamic quality to the information exchange. Customer feedback helps AT&T understand the customer experience and solicit more relevant information. The network upgrades and improvements themselves change customer behavior: customers can more easily make calls where it was difficult before, or download more data on ever more devices. This generates more and different feedback.

Things will continue changing. But with Network News and other channels, AT&T has opened a direct, two-way communications channel with customers, creating in essence a constant, ongoing customer survey that will help the company keep up in real time with changing customer behavior and rising expectations.

New features will be coming in 2011 and beyond as AT&T learns more from the data.  AT&T’s challenge and opportunity is to find new ways to keep customers informed and engaged, thus increasing satisfaction and lifetime value.



What AT&T is up against

A picture best tells the story:


Since 2007 (when the first iPhone came out), traffic over AT&T mobile networks has increased 5,000%. This growth is due to the increase in data.

No network provider, AT&T or any other, can build physical infrastructure as fast as people can find new ways to download or stream extraordinary amounts of data: full-length movies (U-verse is soon to be available on mobile phones), music, and even graphics-laden, interactive video games.


What can be done

There are many solutions. The trick is to match the solution to the problem (adding a cell tower might not be the best fix if radio interference is the cause). And that’s where Mark the Spot data comes in; it gives AT&T a handle on what the problems are from a customer perspective.

Add cell sites. Solves no-coverage problems and dropped calls. In 2009, AT&T added 1,900 cell towers; in 2010, it expects to add even more.

Re-aim antenna. Prevents dropped calls caused by interference between towers.

Set up Wi-Fi hotspots in business locations. Helps offload data traffic from overburdened cell towers.

Change the tower to a different spectrum or change the frequency. A lower frequency enables signals to penetrate walls more easily.


Why are there performance problems?

There are many reasons:

Network problems (outages, cut cables, glitches, etc.)

Overburdened cell tower (due to physical limitations, a cell tower has a limit to the amount of voice and data it can carry).

Physical obstructions, including buildings and leaf coverage.

Radio interference, often from signals from neighboring cell towers.

The caller on the other end drops the call.

A problem with the device.


Why are data connections sometimes slow?

A phone cannot match a PC in power, so data connections from phones will be slower than from PCs.

Heavy voice and data traffic during peak hours causes the network to apportion bandwidth among users.

A cell tower or Wi-Fi hotspot is out, and the signal is being diverted to a tower or hotspot farther away.

The web site itself may be slow or experiencing problems.

The app or application may be slow.


To answer some questions

How can a no-coverage problem be submitted when there’s no coverage?

Having no coverage prevents a report from being submitted immediately, but the information, including the GPS location, can still be compiled and saved until a network connection or Wi-Fi hotspot is detected.

GPS positions come from satellites, not from triangulating cell towers, so location information does not depend on network coverage.
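The store-and-forward behavior described above can be sketched as a small queue. The class and method names here are illustrative, not the app's actual implementation:

```python
# Sketch of store-and-forward submission: a no-coverage report is queued
# on the device and flushed once connectivity returns.
class ReportQueue:
    def __init__(self):
        self.pending = []   # reports saved while offline
        self.sent = []      # reports delivered to the network

    def submit(self, report, have_network):
        if have_network:
            self.sent.append(report)      # send immediately
        else:
            self.pending.append(report)   # save, GPS fix already captured

    def on_network_restored(self):
        """Flush everything queued while offline."""
        self.sent.extend(self.pending)
        self.pending.clear()

q = ReportQueue()
q.submit({"type": "no coverage", "lat": 40.74, "lon": -74.00},
         have_network=False)
q.on_network_restored()
print(len(q.sent))   # 1: the offline report went out after reconnecting
```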

Won’t all these submissions just add to the congestion on the network?

A Mark the Spot submission is only several Kbytes. With an average of 15,000 submissions per day (23,000 in summer, when leaves block reception), the total added traffic amounts to less than 120 Mbytes (approximately 600 emails). Mark the Spot submission traffic is barely noticeable.
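The arithmetic above checks out if a submission runs about 8 Kbytes; the article says only "several Kbytes," so the per-submission size here is an assumption:

```python
# Back-of-envelope check of the traffic claim.
KB_PER_SUBMISSION = 8        # assumption consistent with "several Kbytes"
SUBMISSIONS_PER_DAY = 15_000  # average daily figure from the article

total_mb = SUBMISSIONS_PER_DAY * KB_PER_SUBMISSION / 1024
print(round(total_mb, 1))    # about 117 MB/day, under the 120 MB figure
```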

Why does the customer need to go through the steps? Can’t Mark the Spot automatically collect information about dropped calls?

Mark the Spot could do that, but there are two reasons not to collect information passively. First, having customers go through the explicit steps (three taps) sends a stronger message about what they care about. Second, customers know when there’s no coverage or a call has dropped; the system must infer such problems from characteristics that may or may not be present.