Search Engine Objectivity

Mark R. Patterson in Concurring Opinions, November 23, 2013


(This is a guest post from Professor Mark R. Patterson of Fordham Law School. As someone who has participated in panels on antitrust with Prof. Patterson, I thought our readers would be interested in his perspective. –Frank Pasquale.)

“Search is inherently subjective: it always involves guessing the diverse and unknown intentions of users. Regulators, however, need an objective standard to judge search engines against.”

The two claims above, from an essay by James Grimmelmann, are at the center of the conflict over regulation of search engines. Some argue that Google is a powerful gatekeeper for competing firms’ access to customers, so that it must operate in an objective or neutral manner to preserve a level competitive playing field. Those who make this argument necessarily assume that we can assess objectivity or neutrality in this context. Others, like Grimmelmann, support the first statement above, arguing that there is no objective, neutral means of assessing search results, so that there is no way to regulate search engines.

The European Commission (EC), having investigated Google’s practices and concluded that there are “competition concerns,” is apparently on the pro-regulation side, because it is entertaining proposed commitments from Google to address those concerns. (The U.S. F.T.C. conducted its own investigation and closed it without action, concluding that there was insufficient evidence to support the claim that Google’s practices lacked a legitimate business justification.) Google proposed a first set of commitments to the EC in April, but the Commission received “very negative” feedback from a market test of those commitments, so it asked Google for an improved proposal. Last month, Google proposed a second set of commitments. This new proposal was not put to a market test. Instead, the EC sent private inquiries to the complainants in the case and other market participants. Nevertheless, the proposal was leaked, and it offers much food for thought.


Google’s second proposal throws light on the two sides of the search-engine debate. One of the EC’s four concerns is “[t]he favourable treatment, within Google’s web search results, of links to Google’s own specialised web search services as compared to links to competing specialised web search services.” Google’s proposal would address this concern by displaying what it calls “Rival Vertical Search Sites” near Google’s own results. The critical question, of course, is which Rival Vertical Search Sites? Here is where it gets interesting. The choice of sites to be displayed is to be based on an auction, with each rival’s bid multiplied by its “position-independent predicted click-through-rate (pCTR),” i.e., a prediction of the percentage of those viewing the link who click on it, corrected for the position of the link on the page (because more prominent locations get a greater percentage of clicks). The pCTRs would be “calculated using solely a machine-learning regression model that will rely only on objective and verifiable explanatory features and will follow standard industry practices for such models as described in the scientific literature.”
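
The leaked proposal does not spell out these mechanics in code, but a minimal sketch of the selection rule it describes, using invented bids, click rates, and a crude position adjustment (none of which come from Google's proposal), might look like this:

```python
# Minimal sketch of the selection rule described above: each rival's bid is
# multiplied by a position-independent predicted click-through rate (pCTR),
# and the highest-scoring rivals are displayed. The bids, click rates, and
# the crude position adjustment are invented for illustration.

# Assumed relative click propensity of each display slot (slot 1 = most prominent).
POSITION_BOOST = {1: 1.0, 2: 0.6, 3: 0.4}

rivals = [
    {"site": "rival-a.example", "bid": 0.40, "observed_ctr": 0.12, "position": 1},
    {"site": "rival-b.example", "bid": 0.55, "observed_ctr": 0.05, "position": 3},
    {"site": "rival-c.example", "bid": 0.30, "observed_ctr": 0.08, "position": 2},
]

def position_independent_pctr(observed_ctr: float, position: int) -> float:
    """Strip out the boost a link got from the prominence of its slot."""
    return observed_ctr / POSITION_BOOST[position]

def auction_score(rival: dict) -> float:
    """Ranking score: bid multiplied by position-independent pCTR."""
    return rival["bid"] * position_independent_pctr(rival["observed_ctr"], rival["position"])

# The proposal would fill a fixed number of slots with the top-scoring rivals.
for r in sorted(rivals, key=auction_score, reverse=True):
    print(f'{r["site"]}: score = {auction_score(r):.4f}')
```

Nothing in the proposal specifies how the position correction would be performed; the lookup table above is only a stand-in for whatever "standard industry practices" the regression model would follow.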

Google has no doubt made an “objective and verifiable” proposal in response to EU competition law’s rule that even conduct that is prima facie abusive may be permissible if it has an “objective business justification.” But what does the proposal tell us about the two statements in the epigraph above? Maybe Google’s proposal is evidence against the search-is-subjective position, and shows that effective regulation is possible. Or maybe it shows no such thing. Maybe Google’s offer says nothing about the feasibility of regulation, because maybe pCTR is actually an ineffective, even if “objective,” means of addressing the competition problem with which the EC is concerned. That is, maybe the pCTR approach is objective, but would be an ineffective remedy because it does not provide a level playing field. (There are also objections to Google’s proposed remedy because it requires rivals to pay for their placement as Rival Vertical Search Sites, but here my concern is only with the pCTR mechanism.)

What do we want from a remedy, assuming that Google’s conduct poses a competition problem? Does a remedy need to be objective? In what sense? Certainly not in the sense that it is universally agreed to be the best, or even a good, remedy. Few remedies in antitrust, or at least few behavioral remedies, would satisfy that criterion. But “objective” does not mean “correct.” Instead, it means “not influenced by personal feelings or opinions.” Google’s offer seems to satisfy that definition. Is that enough?

The offer could at least allow the Monitoring Trustee provided for in the proposal to assess whether Google is complying with its commitments, because the Trustee would receive information about how the pCTR mechanism works (though the proposal is not very clear about how much information Google would be required to provide). In that sense, the objectivity is of value: it provides a degree of transparency, though only for the Rival Vertical Search Sites remedy, that at least allows the Trustee to verify compliance.

The objectivity of the proposed remedy does not, however, tell us (or the Trustee) whether the remedy is correct in some other sense. In that respect, it is not clear that the EC asked the right question in its inquiry to market participants regarding the proposal: “In your opinion, is this mechanism objective, neutral and non-discriminatory or can it be subject to manipulation?” The pCTR approach could be “objective, neutral and non-discriminatory” but still not deliver the most appropriate and relevant results. Sites can produce click-throughs, and thus presumably high pCTRs, without being sites that consumers really want to visit. We all have clicked on sites that turn out not to serve our purposes. That is why Google uses a Quality Score in its AdWords advertising program, yet the proposed remedy does not include such a quality measure (though it does include two different, somewhat odd “quality-protection thresholds”). Nevertheless, the proposal states that “[t]he sole purpose of the machine-learning regression model shall be to calculate the pCTR as a means to evaluate the expected quality of a particular Rival Link.”
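
A toy example may make the point concrete. In the sketch below, all numbers are invented and the "satisfaction" adjustment is purely hypothetical (it is not how AdWords' Quality Score is computed); it only illustrates how a site can win on raw click-through while serving users poorly:

```python
# Toy illustration (all numbers invented): a site can win on raw click-through
# rate while leaving visitors unsatisfied. The "satisfaction rate" below is a
# hypothetical proxy, not how AdWords' Quality Score is actually computed.

sites = [
    # (name, predicted CTR, hypothetical fraction of clicks ending in a satisfied visit)
    ("clickbait.example", 0.15, 0.20),
    ("useful.example",    0.08, 0.85),
]

print("Ranked by pCTR alone:")
for name, pctr, _ in sorted(sites, key=lambda s: s[1], reverse=True):
    print(f"  {name}: pCTR = {pctr:.2f}")

print("Ranked by pCTR x hypothetical satisfaction:")
for name, pctr, satisfied in sorted(sites, key=lambda s: s[1] * s[2], reverse=True):
    print(f"  {name}: adjusted score = {pctr * satisfied:.3f}")
```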

Regardless of whether the pCTR is a good approach to ranking, though, the very fact that Google has offered it as one raises interesting possibilities. For one thing, it should allow the Trustee to compare Google’s pCTR rankings to Google’s organic results. If the pCTR approach produces rankings that are very different from the organic results, that should raise concerns. Such a divergence might suggest that Google itself finds the pCTR rankings inadequate and corrects for their flaws in its organic results. In that case, the remedy would arguably be shown to be inadequate, even on Google’s own terms.
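
One way a Trustee might make "very different" concrete is a standard rank-agreement measure between the two orderings. The sketch below applies Spearman's rank correlation to an invented set of rival sites; the sites and ranks are hypothetical, and nothing in the proposal prescribes this particular test:

```python
# Sketch of one way to compare the pCTR-based ordering of rival sites with
# their ordering in Google's organic results: Spearman's rank correlation.
# The site names and rank positions are hypothetical.

def spearman_rho(rank_a: dict, rank_b: dict) -> float:
    """Spearman correlation from two rank dictionaries over the same keys (no ties)."""
    keys = list(rank_a)
    n = len(keys)
    d_squared = sum((rank_a[k] - rank_b[k]) ** 2 for k in keys)
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Hypothetical rankings (1 = most prominent).
pctr_rank    = {"rival-a": 1, "rival-b": 2, "rival-c": 3, "rival-d": 4, "rival-e": 5}
organic_rank = {"rival-a": 4, "rival-b": 1, "rival-c": 5, "rival-d": 2, "rival-e": 3}

rho = spearman_rho(pctr_rank, organic_rank)
print(f"Spearman rank correlation: {rho:.2f}")
```

A value near 1 means the two orderings broadly agree; a low or negative value is the kind of divergence that, as argued above, should raise concerns.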

However, it is unclear whether, if the remedy were inadequate, the Trustee would have the authority to improve it. The proposal states that “[t]he Commission, upon advice from the Monitoring Trustee, may require changes to the detailed mechanism for calculating pCTRs or to the level of the quality-protection thresholds, if these elements . . . , without objective justification, discriminate against or exclude Rival Vertical Search Sites.” But it is not clear that changes to the “detailed mechanism” would be sufficient. It could be that the pCTR mechanism is fundamentally inadequate. The commitments should provide for the possibility of more extensive changes, or even termination, if the Trustee concludes that the pCTR mechanism is unsatisfactory.

Suppose, though, that the pCTR ranking matched Google’s organic results fairly well. Would that show that it is a good ranking approach? Not necessarily. The organic results might not be a satisfactory benchmark, because Google might use a similar approach in producing them, so the organic results could share whatever flaws the pCTR rankings have. After all, the sites at issue are Google competitors, so there seems no particular reason why Google would object to those competitors being downgraded in the organic results as well as under the remedial proposal. If that happened, would we know? Maybe we would not have to. There may be ways in which the Trustee could use the data to provide valuable information even if the remedy is inadequate.

First, presumably the Trustee could apply the pCTR approach to Google’s own sites and compare their rankings to those of Google’s rivals. If the rivals’ sites have higher pCTRs than Google’s own, then, if the pCTR ranking is a valid one, the rivals’ sites presumably deserve similar prominence in Google’s display, perhaps even without having to pay for that placement. It would be awkward, it seems, for Google simply to respond that the pCTR approach is invalid, since Google itself proposed the approach and stated that it was intended to reveal the “expected quality” of sites. At the least, Google should have to offer an “objective justification” for displaying sites less prominently than its own even though they have higher pCTRs.
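
That comparison could be reduced to a simple check: flag every rival whose pCTR exceeds that of the comparable Google property but which is displayed less prominently. The sketch below uses invented figures and a hypothetical pairing of rivals with Google properties; it is only meant to show how mechanical the check could be:

```python
# Sketch of the check suggested above: flag each vertical in which a rival's
# pCTR exceeds the comparable Google property's pCTR yet the rival is shown
# less prominently. All pCTRs and display positions are invented; lower
# position numbers mean more prominent placement.

comparisons = [
    # (vertical, Google property, rival vertical search site)
    ("shopping", {"pctr": 0.10, "position": 1},
                 {"site": "rival-shop.example", "pctr": 0.14, "position": 4}),
    ("flights",  {"pctr": 0.12, "position": 1},
                 {"site": "rival-fly.example", "pctr": 0.09, "position": 3}),
]

for vertical, google, rival in comparisons:
    if rival["pctr"] > google["pctr"] and rival["position"] > google["position"]:
        print(f"{vertical}: {rival['site']} has the higher pCTR "
              f"({rival['pctr']:.2f} vs {google['pctr']:.2f}) but less prominent "
              "placement -- an objective justification would be needed.")
```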

Second, perhaps we could rely on the market, as those who object to regulation of Google often argue. Not everyone clicks on sponsored links, whether to Google’s own sites or to its rivals’; some look only at the organic results. Google might therefore be unwilling to provide poor organic results, so it might in fact be appropriate to rely on similarity between the pCTR rankings and the organic results as an indication of Google’s neutrality. This might not be true, as suggested above, because Google might be willing to let its organic results deteriorate on the theory that doing so would simply push consumers to the sponsored results. But the Trustee should be able to assess that possibility by comparing clicks from the organic results with clicks from the sponsored ones. Moreover, if sponsored clicks predominate, the comparison suggested in the previous paragraph, between the pCTRs of Google’s and its rivals’ sites and the prominence with which those sites are displayed, becomes all the more important.
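
That assessment, too, could be mechanical: compare the share of result-page clicks going to organic links with the share going to sponsored ones. The click counts below are invented placeholders for the kind of data a Trustee might receive:

```python
# Sketch of the organic-versus-sponsored comparison: if sponsored clicks
# predominate, the organic results are a weaker market check. The click
# counts are invented placeholders for data a Trustee might receive.

clicks = {"organic": 640_000, "sponsored": 360_000}

total = sum(clicks.values())
for source, count in clicks.items():
    print(f"{source}: {count / total:.1%} of result-page clicks")

if clicks["sponsored"] > clicks["organic"]:
    print("Sponsored clicks predominate: the pCTR-versus-prominence comparison matters more.")
```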

The fundamental point here is that Google has offered an objective ranking system, and by doing so has presumably represented it as one that is reasonable. It is not perfect, of course, and maybe it is not even good, but it provides a benchmark from which to develop techniques for evaluating search rankings. It provides such a benchmark because Google’s proposal of the mechanism would make it difficult for Google to simply reject comparisons based on it. It would still be perfectly valid for Google to explain why any comparisons to the pCTR benchmark were invalid, but those objections should start from the basis that the proposal itself is objective and neutral. In that sense, the proposal moves the ball forward, and it moves it forward in the direction favored by those who argue that regulation of search engines is feasible.

All this depends, of course, on the effectiveness of the Monitoring Trustee. For the proposal to provide value, the Trustee must receive sufficient information to make assessments like those suggested above. The Trustee also must have the authority to make these sorts of assessments and to force changes, perhaps significant changes, to the remedial proposals, or even to conclude that the commitments as a whole are inadequate to address the Commission’s concerns. The proposal presents some concerns in these respects. The Trustee’s powers appear to be limited to monitoring Google’s compliance with the proposed commitments, rather than to exploring the validity of those commitments. The proposal explicitly states that the Trustee “shall have no decision-making powers or powers of investigation of the kind vested in the Commission” and that “the Monitoring Trustee’s functions shall not include the power to review or resolve individual complaints relating to the ranking of websites in Google’s Search Results.” In this respect, the proposal may be unsatisfactory.

These evaluations could be performed by private parties, of course, if the pCTR model were made public, but the proposal, not surprisingly, does not contemplate public disclosure. Some value is certainly lost if only the EC and the Monitoring Trustee are privy to the results of any evaluations of this kind. But it is also true that in novel circumstances like this, there is considerable value in the opportunity for enforcement authorities to gather information about alternative means of evaluating possibly anticompetitive practices by search engines. Although competition law recognizes the importance of non-price competition, such as competition over the quality of search results, the law has few techniques for assessing the effectiveness of such competition.

In sum, Google’s proposed commitments offer an “objective” remedy. It may or may not be an effective remedy, but the fact that Google has proposed it can be taken as Google’s acceptance of it as a valid starting point for considering alternative ways of evaluating the validity of search results. To advance beyond that starting point, though, the EC should require Google to allow it a reasonable scope for evaluating the validity of the remedy, rather than just providing for a narrow monitoring of whether Google is complying with it. If Google’s proposal provides that scope, it will go at least some way toward providing the Commission with what Grimmelmann says it needs, “an objective standard to judge search engines against.”
