FEDERAL COURT OF AUSTRALIA
Australian Competition and Consumer Commission v Google LLC (No 2) [2021] FCA 367
ORDERS
BETWEEN: AUSTRALIAN COMPETITION AND CONSUMER COMMISSION
Applicant
AND: GOOGLE LLC
First Respondent
GOOGLE AUSTRALIA PTY LTD
Second Respondent
DATE OF ORDER:
THE COURT ORDERS THAT:
1. The parties confer with a view to providing within 14 days agreed orders reflecting the conclusions reached by the Court and appropriate further steps.
THE COURT NOTES THAT:
1. Limited parts of these reasons for judgment have been redacted to give effect to non-publication orders made on 4 and 10 December 2020.
Note: Entry of orders is dealt with in Rule 39.32 of the Federal Court Rules 2011.
THAWLEY J:
1 The Australian Competition and Consumer Commission (ACCC) alleges that Google LLC and Google Australia Pty Ltd (GAPL) contravened ss 18, 29 and 33 or 34 of the Australian Consumer Law (ACL), being Sch 2 of the Competition and Consumer Act 2010 (Cth). Google LLC is incorporated in the United States of America. GAPL is incorporated in Australia. They are referred to collectively in these reasons as Google.
2 The ACCC’s case was that particular users of mobile devices with Android operating systems (Android OS) were misled by reason of the content of various screens those users saw on their devices. Two settings were central to the ACCC’s case: “Web & App Activity” and “Location History”. When setting up a device relevant to the proceedings, Web & App Activity was defaulted to “on” and Location History was defaulted to “off”. These default settings meant that Google LLC could obtain, retain and use personal location data when a user was using various apps, including Google services such as Google Maps. At the core of the ACCC’s case was the contention that there were users who were misled or likely to have been misled by what was, and what was not, stated or shown on various relevant screens on the users’ devices; there were users who, acting reasonably, would have been led into thinking that, with Location History “off”, Google LLC would not obtain, retain and use personal data about a user’s location, and that this was not relevantly changed by the fact that Web & App Activity was “on”.
3 The ACCC ran its case by reference to particular classes of users in three different scenarios, confined to particular time periods. The classes of users in each scenario were quite specifically identified and identified in a way which necessarily carried an implication that the relevant users had certain characteristics.
4 Scenario 1 concerned users who:
(1) used an Android OS on a mobile device, between 30 April 2018 and 19 December 2018, to set up his or her Google Account;
(2) when viewing the Privacy and Terms screen (Annexure A to these reasons), chose to click on “More Options” rather than “I agree” or “Don’t create the account” and were, accordingly, shown the More Options screen (Annexure B to these reasons).
5 The ACCC’s case in relation to Scenario 1 was not that Google should have given some general warning to all users at the commencement of setting up a Google Account. On the ACCC’s case and the evidence, the vast majority of users would not have clicked on “More Options” and would not have reached the More Options screen. One reason a user may have clicked on “More Options” was that the user had a concern about privacy generally or, more specifically, about personal data concerning the user’s location being obtained or used. Such a user would be expected to pay more attention to material relevant to his or her privacy concerns and perhaps also to seek out further information. The extent to which that would be so is addressed later.
6 At its simplest, the ACCC’s case in relation to Scenario 1 was that, as a result of what users saw on the Privacy and Terms screen (Annexure A) and the More Options screen (Annexure B), users would have been misled into thinking that Google LLC would not obtain, retain and use personal location data whilst Location History was defaulted to “off”. In fact, however, Google would continue to obtain, retain and use location data when a user used Google products or services because Web & App Activity was “on”.
7 Scenario 2 concerned events after the initial set-up of the device and the opening of a Google Account. Scenario 2 concerned users who:
(1) either during set-up or at some other time had turned the Location History setting to “on”, from its default position of “off”; and
(2) had then later made a decision to turn Location History back to “off”.
8 The class of users was confined to users who had already made a decision to turn Location History “off”. That decision must have been made for a reason, presumably connected with what the user knew or thought about the function of the Location History setting. This class of user had a specific objective beyond simply setting up their Google Account.
9 In summary, the ACCC’s case was that Google LLC incorrectly represented to certain users who had decided to turn Location History “off” that:
(1) Google LLC would not continue to obtain, retain and use personal data about the user’s location after the Location History setting was turned “off”; and
(2) Google LLC would only obtain, retain and use personal data about location for the user’s purposes, not for Google’s purposes.
10 In fact Google LLC could obtain, retain and use personal data about location whilst Web & App Activity was “on”, even when Location History was “off”. That data would be used both for Google’s purposes and the user’s purposes.
11 Scenario 3 concerned users considering whether to turn the Web & App Activity setting “off”. Although this could occur during the set-up process, the case was argued on the basis of this occurring after set-up of the device and the opening of a Google Account.
12 The class of persons to whom Scenario 3 applied was stated to be confined to users considering whether to turn Web & App Activity “off”. The relevant users must have been considering whether to turn the setting off for some reason, presumably connected with what the person knew or thought about the function of the Web & App Activity setting. The relevant user identified by the ACCC was someone who had an objective, namely to make a decision about something that the user had decided to look into.
13 In summary, the ACCC’s case in relation to Scenario 3 was that Google incorrectly represented to certain users considering whether to turn Web & App Activity “off” that:
(1) having Web & App Activity “on” would not allow personal data in relation to the users’ location being obtained, retained and used by Google LLC; and
(2) Google LLC would only obtain, retain and use personal data about location for the users’ purposes, not for Google’s purposes.
14 Whilst Google did not dispute that the ACCC could run its case by reference to users in the classes and scenarios identified by the ACCC, Google disputed that it breached any of the relevant provisions of the ACL.
15 The ACCC ran its case on the basis of numerous variations of screens presented to users over various and overlapping time periods. The possible permutations of precisely what users might have seen were not reasonably calculable. Ultimately, the case was run by reference to an amended aide memoire identifying representative screens. This was a practical and appropriate way to deal with the issues. The aide memoire, updated after closing submissions, is MFI 3 (Annexure C to these reasons).
16 One matter of principle which divided the parties was whether, within each class, the Court had to assess the case by reference to a single hypothetical user having only one possible reaction or response to the material presented to the user by the screens. Google submitted that the law was to that effect. For the reasons given below, I reject that submission. So far as concerns s 18 of the ACL, it is sufficient for the ACCC to establish that Google’s conduct was misleading or deceptive or likely to mislead or deceive ordinary or reasonable members of the relevant class, extreme or fanciful responses being disregarded.
17 For the reasons which follow, I have concluded that the ACCC’s case under s 18 of the ACL is partially made out in respect of each of the three scenarios. Google’s conduct would not have misled all reasonable users in the classes identified; but Google’s conduct misled or was likely to mislead some reasonable users within the particular classes identified. The number or proportion of reasonable users who were misled, or were likely to have been misled, does not matter for the purposes of establishing contraventions.
18 I have also concluded that the ACCC’s case under ss 29(1)(g) and 34 of the ACL is partially made out in respect of each scenario.
19 Many of the facts were agreed between the parties for the purposes of s 191 of the Evidence Act 1995 (Cth). The following account of the facts is drawn largely from the “Further Amended Statement of Agreed Facts” and the “Further Agreed Facts”.
Overview of Google’s services in Australia
20 Google LLC supplied a range of software products and services to consumers in Australia, referred to in these reasons as “Google services”, including:
(a) the “Google Play Store”, a mobile app and online entertainment store;
(b) “Google Search”, an internet search engine;
(c) “Google Chrome”, a web browser;
(d) “Google Maps”, a mapping app;
(e) “Gmail”, an email service and app; and
(f) “YouTube”, an online video platform.
21 From at least 1 January 2017, Google supplied:
(1) various goods and services to consumers in Australia, including certain Google services;
(2) the Android OS (which is not comprised within the Google services), namely a mobile operating system developed by Google LLC which it regularly updated. Google LLC permitted the Android OS to be used by other companies on an “open source” basis.
22 Google Accounts are accounts available to individual users. Use of some Google services (such as Google Search, Google Maps and YouTube) does not require a Google Account. Other Google services, such as the Google Play Store, require the user to be signed into his or her Google Account. Some product features in certain Google services are not available if the user does not sign into a Google Account (for example, some features of YouTube and Google Maps).
23 The content of the Android OS was available to millions of Australian users in the relevant period. Around 6.3 million Australian users set up a new Google Account on devices using the Android OS between January 2017 and August 2019.
24 Third-party manufacturers of mobile devices that used the Android OS may have also licensed Google Mobile Services (GMS) from Google LLC and pre-installed GMS on their mobile devices. GMS is a set of Google apps (including the Google Play Store, Google Search, Google Chrome and YouTube) and application programming interfaces used on mobile devices running on the Android OS.
25 Google LLC required licensees of GMS to meet certain requirements, including that no modifications were made to the Google Account settings on devices with GMS pre-installed. The significant majority of third-party manufacturers who used the Android OS for mobile devices supplied in Australia also licensed GMS.
26 The Pixel is a mobile phone which Google LLC caused to be manufactured by third parties before October 2018. Google LLC has manufactured the Pixel since October 2018. Around 280,000 Pixels were sold in Australia either indirectly by third-party resellers or directly by GAPL in financial years 2017 to 2019.
27 GAPL supplied Pixels to third-party suppliers of mobile devices in Australia at all material times and, since around June 2018, also supplied Pixels directly to consumers in Australia. The Pixels supplied by GAPL to third-party suppliers and consumers in Australia were pre-installed with the Android OS and GMS. GAPL was not responsible for the Google services, Google Accounts or the Android OS.
28 The parties referred to a “user” as a person who:
(a) had a mobile device which used the Android OS and on which GMS was installed; and
(i) either had a Google Account and was signed into his or her Google Account on a mobile device which used the Android OS and on which GMS was installed (those devices being referred to as Linked Devices); or
(ii) alternatively, was in the process of setting up a Google Account on the mobile device.
29 The parties referred to personal data as data which is identifiable as being associated with the holder of a particular Google Account. For the purposes of this proceeding, the parties agreed that IP address data is not personal data in relation to or about a user’s location.
30 From 1 January 2017 to 29 October 2019, the settings within users’ Google Accounts included Location History and Web & App Activity.
31 At all relevant times, the default setting when a user set up a Google Account was that the Location History setting was turned “off” (or “paused” or “disabled”).
32 At all relevant times, the default setting when a user set up a Google Account was that the Web & App Activity setting was turned “on” (or “enabled”).
Collection, storage and use of personal data about a user’s location
33 Google LLC collected and stored personal data about a user’s location from a Linked Device if the “Device-level Location Setting” (described in paragraphs 39 to 44 below) was enabled, the user was signed in to their Google Account, and either or both of the following applied:
(1) the Web & App Activity setting was enabled and the user had used certain Google services (for example Google Maps or Google Search); and
(2) the Location History setting was enabled.
34 Further, some apps, such as Google Photos, included settings that permitted data from the use of that app to be associated with the user’s Google Account. These discrete app settings were not the subject of these proceedings. Leaving this aside, Google LLC did not collect or store personal data about a user’s location except in the circumstances identified immediately above.
35 The personal data about a user’s location that Google LLC collected from the user’s Linked Device in the circumstances described at [33(1)] above was stored by Google LLC in association with the user’s Google Account and was accessible to the user in the “My Activity” feature of their Google Account.
36 The personal data about a user’s location that Google LLC collected from the user’s Linked Device in the circumstances described at [33(2)] above was stored by Google LLC in association with the user’s Google Account and was accessible to the user in the “Timeline” feature of Google Maps.
37 The Location History and Web & App Activity settings could be enabled or paused:
(a) between 1 January 2017 and 30 April 2018: at any time after the set-up of a Google Account; and
(b) after 30 April 2018: at any time during or after the set-up of a Google Account.
38 Google LLC collected, stored and used personal data (including personal data in relation to a user’s location) for purposes including the following:
(a) the user’s use of Google services;
(b) to personalise advertisements for the user;
(c) in an anonymised form, to personalise advertisements for any other user or users;
(d) in an anonymised form, to infer demographic information;
(e) in an anonymised form, to measure the performance of advertisements;
(f) in an anonymised, aggregated form, to promote, offer to supply or supply advertising services to third parties; and
(g) to produce anonymised, aggregated statistics (such as store visit conversions statistics) and sharing those statistics with advertisers.
39 Whether location data might have been collected from a Pixel depended upon whether the “Use Location” or “Location” setting on that Pixel was enabled. Similar settings were available on other Android devices, but may have been labelled differently. The parties referred to these device settings as the Device-level Location Setting.
40 At all relevant times when a user set up their Linked Device, the user was presented with an option to disable the Device-level Location Setting, which was enabled by default.
41 The location data collected by Google LLC when the Device-level Location Setting was enabled was not personal data unless the circumstances described at [33] above applied, namely Web & App Activity and/or Location History was enabled. This is because the data was not identifiable as being associated with the holder of a particular Google Account.
42 If a user disabled the Device-level Location Setting, no apps (whether Google apps or third-party apps) were able to access the location of the user’s Linked Device (even if either or both of Location History or Web & App Activity were enabled).
43 Where the Device-level Location Setting was disabled, some product features in certain apps (including some Google services) were not available to the user. In circumstances where those features would otherwise be available to the user, the relevant apps (including some Google services) might prompt the user to enable the Device-level Location Setting on the Linked Device. For example, with the Device-level Location Setting disabled, Google Maps would:
(a) not show the user their location on a map or give directions using the user’s current location (but could give point-to-point directions between two specified locations); and
(b) prompt the user to enable the Device-level Location Setting on that Linked Device.
44 Where the user chose to enable the Device-level Location Setting in this example, that had the effect of toggling a particular switch to “on”.
45 The “Further Amended Statement of Agreed Facts” referred to a “Screenshot Bundle” (SB) and “Supplementary Screenshot Bundle”, which the parties agreed formed part of the agreed facts. The Screenshot Bundle contained screenshots, typically from a Pixel device, which were representative of screens shown to users of Android devices running the then-current versions of the Android OS in the period 1 January 2017 to 29 October 2019. It was divided into sections in relation to setting controls presented during “Set-up”, and post set-up “Settings”.
46 During the period 1 January 2017 to 29 October 2019, the screens that a user would see during the set-up of their Linked Devices varied. This depended on, for example, the device used and the point in time at which the user was setting up the Linked Device.
47 When setting up a new Linked Device, a user could select the following options:
(a) create a new Google Account;
(b) sign in to an existing Google Account; or
(c) skip signing in to or creating a Google Account.
48 A user’s options in the period 1 January 2017 to 29 April 2018 were shown at pages 8 – 9 of the Screenshot Bundle. A user’s options in the period 30 April 2018 to 29 October 2019 were shown at pages 47 – 50 of the Screenshot Bundle.
49 The various permutations of what would be seen depending on what link or option a user chose are not practical to put in writing. They are referred to below in resolving the issues raised in the proceedings and can be partially seen by an examination of MFI 3 at Annexure C.
50 The ACCC and Google relied on evidence of distinguished economists with particular expertise in behavioural economics. The ACCC relied on evidence from Professor Robert Slonim, a professor at the School of Economics at the University of Sydney. Google relied on evidence from Professor John List, a professor of economics at the University of Chicago. Professor Slonim prepared the first report filed in the proceedings, Professor List responded and Professor Slonim prepared a report in reply. In addition, Professors Slonim and List prepared a joint report. I was impressed and assisted by both professors.
51 The expert evidence was to the effect that the appropriate framework for understanding how users approached the process of navigating the relevant screens involved a cost-benefit analysis, subject to certain behavioural biases. At any point in the navigation process, if users perceived the marginal benefit of navigating or searching further exceeded the marginal cost, the user would continue. Conversely, users would not search or navigate further where they perceived the marginal cost to exceed the marginal benefit. A user may have known that information was available but rationally chose not to read it. If a user was particularly concerned to know about a particular topic, all other things being equal, such a user was more likely to look for that information. But even a person interested in a topic may have reached a point where he or she would stop notwithstanding that the person knew there was more information available.
52 Professor Slonim and Professor List drew a distinction between traditional economic models of decision making and that indicated by the field of behavioural economics. At its most severe, traditional economic modelling answers what should be the result by analysing the decision which would be made by a person who, to adopt Professor List’s language, “is unswervingly rational, completely selfish, analytical, reliable, and can effortlessly and costlessly solve even the most difficult optimization problems”. Behavioural economics recognises that actual human behaviour deviates from the traditional rational model in predictable ways. Behavioural economics starts from the premise that people are time constrained, do not have all (or even most) information easily accessible and have cognitive capacity with serious limits in processing information when making choices. These constraints cause people to use short-cuts (referred to as “heuristics”) to make choices. The heuristics are subject to many biases, which result in systematic and predictable deviations from making the optimal choices which would be assumed in the traditional economic approach. Behavioural economists would say that people are “boundedly rational”.
53 Professor Slonim and Professor List agreed that people are generally subject to a number of behavioural biases. These include the following.
54 First, risk aversion: This refers to a bias whereby people prefer to receive the expected value of a risky prospect with certainty rather than face the risky prospect itself. Professor List gave the following example:
To understand risk aversion, consider a simple example in which a decision maker has to value a lottery that pays either $100 or $0, both with a 50% probability. This might reflect a random draw from an urn with 5 red balls and 5 black balls, where drawing a red ball results in a payment of $100 and a black ball results in no payment. The expected value of this lottery is $50 (= 0.5*$100 + 0.5*0). Risk aversion implies that a person prefers a certain $50 payment to this lottery and thus would be willing to pay less than $50 to participate in the lottery.
55 The point, in its application to the present case, is that where the outcome is unknown, risk aversion will reduce the value to the user to some amount below the expected value.
56 Secondly, ambiguity aversion: This refers to a bias whereby people prefer a risky choice with known probabilities to an ambiguous choice where the probabilities are unknown. Ambiguity aversion and risk aversion are usefully considered together, because it often happens that people do not know the actual probabilities associated with outcomes. Professor List explained (footnotes omitted):
In many situations, however, people do not know the actual probabilities associated with outcomes. For example, consider the above example of a lottery determined by an urn filled with red and black balls, but where the number of balls of each color is unknown. This problem is referred to as an “ambiguous” choice in contrast to the “risky” choice involving known probabilities. In 1961, Daniel Ellsberg showed that people preferred a risky choice (with known probabilities) to ambiguous choices and hence are willing to pay less for ambiguous choices than risky ones. This preference is known as “ambiguity aversion”.
57 Thirdly, present bias: This refers to a bias causing people to place an inordinate amount of weight on costs or benefits that affect them in the present when compared to future costs or benefits. Both experts agreed that present bias could have affected users’ decisions regarding effort that they expended in navigating the screens.
58 Fourthly, status quo bias: Status quo bias refers to a common preference people have for the existing state of affairs rather than choosing an alternative. This applies in a variety of situations, including a preference for items people already own, current service providers, current insurance policies and default options. Status quo bias implies that default settings – for example Location History switched “off” and Web & App Activity switched “on” – may play an important role in user decisions regarding privacy settings. Referring to literature addressing online choices and technology more generally, Professor List stated (footnotes omitted):
Economic literature recognizes that the use of defaults can be efficient in the sense that in their absence Users would face more complex and demanding choices, potentially resulting in greater confusion and more frequent “mistakes”. The use of defaults is ubiquitous in the [technology] industry.
59 Fifthly, loss aversion: Loss aversion refers to the tendency for people to dislike losing more than they like winning. For example, Professor Slonim explained that participants in many experiments demanded much more money to give up an item than they were willing to pay to obtain the same item. In traditional economics, it would be assumed that people determine the correct value of alternatives and choose the better option. It is well established that this does not in fact occur in decision-making because people often bypass a sure win in order to avoid a possible, equivalently sized, loss.
60 The experts also addressed other issues. The experts referred to “cognitive cost”, which recognises that a person has limited mental resources to evaluate all possible options. Both experts considered the cognitive costs that users faced in navigating the various screens, particularly those involved in the set-up process. The amount of effort a user might be prepared to expend depends on a number of matters including how important a particular issue is to the person. A person with a particular concern about privacy might be prepared to expend greater mental resources on the topic than someone who did not have such a concern.
61 The experts agreed that there is a technological trade-off between privacy and service quality. This was described by Professor List in the following way (footnotes omitted):
Digital platforms and their users face a technological tradeoff between privacy and service quality. The use of personal data enables Google to offer a variety of personalized services based on individuals’ historical locations, travel patterns, web search and browsing activity, etc. Personal data enables Google to tailor advertisements to Users’ interests and enables Google to generate higher revenues and better serve advertisers. The tradeoff between privacy and service quality is central to the value proposition that Google offers its users and Google has been enormously successful. In 2017, for example, Google’s Android operating system was used in 65% of smartphones in Australia. Google’s market share of Australian search queries has averaged approximately 94% since 2014.
The tradeoff between privacy and service quality is well understood in the economics literature. In a paper that summarizes theoretical and empirical research on the economics of privacy, Acquisti et al. (2016) conclude that:
[I]ndividuals and organizations face complex, often ambiguous, and sometimes intangible trade-offs. Individuals can benefit from protecting the security of their data to avoid the misuse of information they share with other entities. However, they also benefit from the sharing of information with peers and third parties that results in mutually satisfactory interactions.
62 The experts referred to “choice architecture”. This refers to the entirety of the design of the screens. The experts agreed that this can affect whether, how much and how carefully users will invest effort to read and understand the content as well as affect the paths they will use to navigate through the screens.
63 The experts referred to “salience” and addressed the role of headings on the screens. On the basis that people have limited time and energy, a heading or some “salient” feature may capture the audience’s attention and perhaps also divert attention from some other aspect of the screen.
64 Professor Slonim and Professor List accepted that the various biases pulled in different directions. Some implied that users would spend more time assessing the content of the various screens; some implied that the user would spend less time. The experts ultimately agreed in their Joint Report that “[b]ehavioural economics … yields ambiguous predictions regarding the effort users will put into reading and navigating” the screens. As both parties ultimately submitted, in short, the economists agreed that behavioural economics alone could not predict how users’ decision-making would be affected by the screens. Nevertheless, the various points the experts made, which accord with common-sense, are useful in considering how a user faced with the various screens referred to in the proceedings might have reacted.
65 Google relied on the evidence of Mr David Monsees, a Senior Product Manager at Google LLC. Mr Monsees was responsible for the user-facing aspect of user data controls (UDC) or “UDC settings”, which included the Location History and Web & App Activity settings.
66 Mr Monsees gave evidence about Google’s products and accounts, and more specifically about the Location History and Web & App Activity settings and how those settings functioned.
67 In cross-examination, Mr Monsees was taken to a number of internal Google documents, which disclosed ongoing discussions regarding the Location History and Web & App Activity settings. The ACCC submitted that those documents demonstrated that Google employees considered the information provided to users in respect of Location History and Web & App Activity to be wholly deficient.
68 The first of these documents was a document labelled “go/ul2017”, and associated emails about the go/ul2017 document. The go/ul2017 document was a document prepared by Mr Lopyrev, the head engineer for Location History, and Mr Lopyrev’s team. XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX:
XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX
69 The go/ul2017 document had been sent to Mr Horling, Mr Monsees’ boss, on 23 February 2017, and forwarded to Mr Monsees by Mr Horling three days later. In his email to Mr Horling on 23 February 2017, Mr Lopyrev stated that the user story in relation to Location History and Web & App Activity was “crazy confusing”:
… I have two more topics for us:
- there was a plan to merge LH [Location History] and WAAH [Web & App Activity] - where did we land on this? should we consider it again (now that LH and WAAH world have so much overlap, our user story is crazy confusing - see go/ul2017)
- can you tell me the latest on Context Manager / Footprints ?
70 Mr Monsees replied to Mr Horling’s email, stating “that go/ul2017 doc from Mike is very unsettling”. Mr Monsees gave evidence that when he wrote that the go/ul2017 document was “very unsettling”, he was referring to XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX. Whilst acknowledging that at the time he read the document he would have understood that XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XXXX XX, Mr Monsees stated that this was not what “unsettled” him about the document.
71 Mr Monsees also gave evidence about internal Google communications regarding an article entitled “AP Exclusive: Google tracks your movements, like it or not” published by the Associated Press on or around 13 August 2018 (AP Article). The AP Article criticised the fact that Google retained users’ location information, even after a user had “paused” the Location History setting. The AP Article included:
An Associated Press investigation found that many Google services on Android devices and iPhones store your location data even if you’ve used privacy settings that say they will prevent it from doing so.
…
Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.”
That isn’t true. Even with Location History paused, some Google apps automatically store timestamped location data without asking.
…
To stop Google from saving these location markers, the company says, users can turn off another setting, one that does not specifically reference location information. Called “Web and App Activity” and enabled by default, that setting stores a variety of information from Google apps and websites to your Google account.
When paused, it will prevent activity on any device from being saved to your account. But leaving “Web & App Activity” on and turning “Location History” off only prevents Google from adding your movements to the “timeline,” its visualization of your daily travels. It does not stop Google’s collection of other location markers …
72 After the publication of the AP Article, an urgent meeting was held between various Google employees, where the AP Article was discussed. It was referred to internally as the “Oh Shit” meeting. In response to the AP Article and following the meeting, Google’s Director of Communication & Public Affairs circulated a document entitled “Making Location History simple”, which referred to works being carried out to “reduce user confusion re: how location is used across our products and services”. The document was sent to Mr Monsees, who added information to the document regarding Location History projects that had been previously discussed or were already in progress.
73 Mr Monsees was also taken to a number of communications regarding the AP Article on “Industryinfo”, an internal Google platform used for discussion of tech industry news. A number of Google staff members commented about the AP Article on Industryinfo. For example, a Staff Software Engineer said (emphasis and use of word “redacted” in original):
I agree with the article. Location off should mean location off; not except for this case or that case.
The current UI [User Interface] feels like it is designed to make things possible, yet difficult enough that people won’t figure it out. New exceptions, defaulted to on, silently appearing in settings menus you may never see is <redacted>.
74 A Senior Technical Solutions Consultant also said:
Although I know how it works and what the difference between “Location” and “Location History” is, I did not know Web and App Activity had anything to do with location.
Also seems like we are not very good at explaining this to users.
75 Internal Google documents showed that following the publication of the AP Article, there was at least a 500% increase in the number of users disabling Location History and Web & App Activity.
76 It is not clear that the settings referred to in the documents to which Mr Monsees was taken, or to which the AP Article referred, were the same as the settings which were the subject of these proceedings, although I infer that there are likely to have been broad similarities. Mr Monsees’ evidence, and the documents to which he was taken, do not establish, or assist in establishing in any meaningful way, that Google’s conduct in the specific respects pleaded in the present proceedings was misleading or likely to be so to the particular classes identified. Accordingly, I have not placed any reliance on this evidence.
77 Section 18(1) of the ACL provides that:
A person must not, in trade or commerce, engage in conduct that is misleading or deceptive or is likely to mislead or deceive.
78 The relevant general principles may be stated as follows.
79 First, because s 18 is focussed on “conduct” that is misleading or deceptive or is likely to mislead or deceive, it is critical to commence consideration of whether the section applies by first identifying the relevant conduct with precision: Campbell v Backoffice Investments Pty Ltd (2009) 238 CLR 304 at [32] (French CJ); Google Inc v Australian Competition and Consumer Commission (2013) 249 CLR 435 at [89] (Hayne J).
80 Secondly, it is necessary to consider whether the identified conduct was conduct “in trade or commerce”. In the present case, there is no dispute that the conduct was in trade or commerce.
81 Thirdly, it is necessary to consider whether the conduct as a whole, and in context, was misleading or deceptive or likely to mislead or deceive: Campbell at [102]; Comité Interprofessionnel du Vin de Champagne v Powell (2015) 330 ALR 67 at [172].
82 Where conduct includes the making of express representations or where it is alleged that conduct gives rise to representations being made, it is necessary to bear in mind that “[r]eferences to misrepresentation or reliance must not be permitted to obscure the need to identify contravening conduct”: Campbell at [102]. The relevant “conduct is not to be pigeon-holed into the framework or language of representation”: Comité Interprofessionnel at [172].
83 The following further observations should be made in relation to cases, such as the present, where it is asserted that conduct gave rise to representations which were misleading or deceptive or likely to mislead or deceive:
(1) representations may be express or they may be implied from words or conduct: Given v Pryor (1979) 24 ALR 442 at 446; Aqua-Marine Marketing Pty Ltd v Pacific Reef Fisheries (Australia) Pty Ltd (No 5) [2012] FCA 908 at [78];
(2) it is necessary to determine whether the representations were in fact conveyed by the relevant conduct as a whole, assessed in context; examining only isolated parts of the conduct, for example individual express representations, “invites error”: Butcher v Lachlan Elder Realty Pty Limited (2004) 218 CLR 592 at [109] per McHugh J; approved in Campbell at [102];
(3) where a Court is concerned to ascertain the overall impression created by a number of express and implied representations conveyed by one communication, or by a series of representations made during an online process or presentation, it is wrong simply to analyse the separate effect of each representation;
(4) where a publication, or online process or presentation, contains a misleading statement in one place, but contains material which corrects or explains the misleading statement in another, the question is one of overall assessment of the whole publication or online process: Australian Competition and Consumer Commission v TPG Internet Pty Ltd (2020) 381 ALR 507 at [25]; a variety of considerations will be relevant, including the prominence of the various statements and the likelihood of the consumer reading or absorbing any neutralising material;
(5) the conduct must ultimately be assessed as a whole and this means that individual representations must be assessed in the context of the whole of the conduct: Parkdale Custom Built Furniture Pty Ltd v Puxu Pty Ltd (1982) 149 CLR 191 at 199; Butcher at [39]; Australian Competition and Consumer Commission v TPG Internet Pty Ltd (2013) 250 CLR 640 (ACCC v TPG) at [52].
84 Fourthly, it is necessary to determine whether the conduct was misleading or deceptive or likely to mislead or deceive. Conduct will be misleading or deceptive if it induces or is capable of inducing error or has a tendency to lead into error. In ACCC v TPG at [39], French CJ, Crennan, Bell and Keane JJ stated:
… Conduct is misleading or deceptive, or likely to mislead or deceive, if it has a tendency to lead into error. That is to say there must be a sufficient causal link between the conduct and error on the part of persons exposed to it [Elders Trustee and Executor Co Ltd v E G Reeves Pty Ltd (1987) 78 ALR 193 at 241 (Gummow J)].
85 Conduct which merely causes confusion or uncertainty or wonderment is not necessarily misleading or deceptive: Google Inc at [8]; Australian Competition and Consumer Commission v Coles Supermarkets Australia Pty Ltd (2014) 317 ALR 73 at [39] (Allsop CJ).
86 The Court must put itself in the position of the relevant consumer. There is no question that the more one pores over the relevant screens, the more one notices matters of detail, the more one appreciates the literal meaning rather than what might first have been understood and the more one sees nuances and subtleties which might have been overlooked by the consumer. The relevant consumers in the classes identified by the ACCC would have read the material in a manner consistent with the consumer’s context. The question is not whether, on close analysis of written material by the Court after detailed argument, the various screens can be seen to be strictly accurate. The question is whether Google’s conduct as a whole, including what was and what was not stated on the various screens, was misleading or deceptive or likely to mislead or deceive reasonable members of the class of consumers likely to be affected by the conduct. The consumers in the relevant classes are in a different position to the Court.
Principles where conduct directed to a class of persons
87 Where the question whether conduct is misleading or deceptive arises in relation to a particular class of persons, as opposed to specified individuals, the Court must assess whether the conduct is likely to mislead or deceive by reference to the ordinary or reasonable members of that class, namely the “class of consumers likely to be affected by the conduct”: Puxu at 199; see also: Campomar Sociedad, Limitada v Nike International Limited (2000) 202 CLR 45 at [102]; Google Inc at [7]; TPG Internet at [23].
88 It was common ground between the parties that there is no “not insignificant number” test – that is, it is not necessary for an applicant to prove that a “not insignificant number” of people within the class were likely to be misled: TPG Internet at [23] and Trivago NV v Australian Competition and Consumer Commission (2020) 384 ALR 496 at [192].
89 During oral argument, Google submitted that:
(1) the law requires the identification of a single hypothetical person within the relevant class of users to test, by reference to that hypothetical person, whether the members of the class would have been misled by the conduct;
(2) the question is whether that hypothetical person would have been misled, the hypothetical person being capable of only one response.
90 Google relied upon what the High Court stated in Campomar at [103] (footnotes omitted; Google’s emphasis added):
Where the persons in question are not identified individuals to whom a particular misrepresentation has been made or from whom a relevant fact, circumstance or proposal was withheld, but are members of a class to which the conduct in question was directed in a general sense, it is necessary to isolate by some criterion a representative member of that class. The inquiry thus is to be made with respect to this hypothetical individual why the misconception complained has arisen or is likely to arise if no injunctive relief be granted. In formulating this inquiry, the courts have had regard to what appears to be the outer limits of the purpose and scope of the statutory norm of conduct fixed by s 52. Thus, in Puxu, Gibbs CJ observed that conduct not intended to mislead or deceive and which was engaged in “honestly and reasonably” might nevertheless contravene s 52. Having regard to these “heavy burdens” which the statute created, his Honour concluded that, where the effect of conduct on a class of persons, such as consumers, was in issue, the section must be “regarded as contemplating the effect of the conduct on reasonable members of the class”.
91 I do not accept that in all misleading or deceptive conduct cases where the conduct is directed to the public at large, or a segment of the public, it is necessary to isolate one hypothetical person within the class to determine whether there has been a contravention. It may be necessary to isolate a number of hypothetical persons within the class, assuming the variable characteristics of the reasonable members of the class suggest that such an approach is appropriate. As was made clear in Campomar at [99], cases should not be considered in the abstract; regard must be had to the circumstances of the particular case and the remedy sought in respect of the contravention alleged to have occurred. I do not read Campomar at [103] as requiring the identification of only one hypothetical person in all cases. The final sentence of [103] recognises that “where the effect of conduct on a class of persons, such as consumers, was in issue, the section must be ‘regarded as contemplating the effect of the conduct on reasonable members of the class’”, citing Puxu at 199. It may be that reasonable members of the class cannot be distilled into a single hypothetical reasonable person.
92 In any event, even if there must be an identification of a single hypothetical member of the class, it does not follow that in all cases the identified hypothetical person is only capable of one response or reaction. There may well be situations where a hypothetical person might reasonably have been misled and might reasonably not have been misled. The law recognises that there may be a number of different “reasonable” responses to conduct. Indeed, the High Court in Campomar expressly so concluded at [105] (footnotes omitted; emphasis added):
Nevertheless, in an assessment of the reactions or likely reactions of the “ordinary” or “reasonable” members of the class of prospective purchasers of a mass-marketed product for general use, such as athletic sportswear or perfumery products, the court may well decline to regard as controlling the application of s 52 those assumptions by persons whose reactions are extreme or fanciful. For example, the evidence of one witness in the present case, a pharmacist, was that he assumed that “Australian brand name laws would have restricted anybody else from putting the NIKE name on a product other than that endorsed by the [Nike sportswear company]”. Further, the assumption made by this witness extended to the marketing of pet food and toilet cleaner. Such assumptions were not only erroneous but extreme and fanciful. They would not be attributed to the “ordinary” or “reasonable” members of the classes of prospective purchasers of pet food and toilet cleaners. The initial question which must be determined is whether the misconceptions, or deceptions, alleged to arise or to be likely to arise are properly to be attributed to the ordinary or reasonable members of the classes of prospective purchasers.
93 In National Exchange Pty Ltd v Australian Securities and Investments Commission (2004) 49 ACSR 369, Dowsett J stated at [24]:
While it is true that members of a class may differ in personal capacity and experience, that is usually the case whenever a test of reasonableness is applied. Such a test does not necessarily postulate only one reasonable response in the particular circumstances. Frequently, different persons, acting reasonably, will respond in different ways to the same objective circumstances. The test of reasonableness involves the recognition of the boundaries within which reasonable responses will fall, not the identification of a finite number of acceptable reasonable responses.
94 In Comité Interprofessionnel at [171], Beach J accepted that there was scope for different responses:
[W]here the issue is the effect of conduct on a class of persons such as consumers (rather than identified individuals to whom a particular misrepresentation has been made or particular conduct directed), the effect of the conduct or representations upon ordinary or reasonable members of that class must be considered (Campomar Sociedad, Limitada v Nike International Ltd (2000) 202 CLR 45 at [102] and [103]). This hypothetical construct avoids using the very ignorant or the very knowledgeable to assess effect or likely effect; it also avoids using those credited with habitual caution or exceptional carelessness; it also avoids considering the assumptions of persons which are extreme or fanciful. Further, the objective characteristics that one attributes to ordinary or reasonable members of the relevant class may differ depending on the medium for communication being considered. There is scope for diversity of response both within the same medium and across different media.
95 In other areas of the law where a hypothetical reasonable individual is considered, the Court might be bound to land upon one response. For example, the determination of whether a published matter conveys a pleaded imputation requires the Court to identify one meaning, despite the obvious truth that the publication is likely to mean different things to different people. As Diplock LJ stated in Slim v Daily Telegraph Ltd [1968] 2 QB 157 at 173:
[When] words are published to the millions of readers of a popular newspaper, the chances are that if the words are reasonably capable of being understood as bearing more than one meaning, some readers will have understood them as bearing one of those meanings and some will have understood them as bearing others of those meanings. But none of this matters. What does matter is what the adjudicator at the trial thinks is the one and only meaning that the readers as reasonable men should have collectively understood the words to bear. That is “the natural and ordinary meaning” of words in an action for libel.
96 This principle is one developed in respect of a private action for the tort of defamation. Applying such an approach to assessing alleged misleading or deceptive conduct might, on many occasions, fail to protect ordinary or reasonable consumers, an outcome that is unlikely to have been intended given the ACL is directed at broad consumer protection. One would not condone misleading conduct directed to the public at large just because 51%, or an even greater majority, of consumers would not be misled. The law in the consumer protection field does not confine a hypothetical member of the class to one response. As Allsop CJ stated in Australian Competition and Consumer Commission v Coles Supermarkets Australia Pty Ltd (2015) 327 ALR 540 at [95]: “[t]he fact that some people may not be misled is not the point”.
97 In summary, the test was accurately stated by the Full Court in TPG Internet at [22(e)]:
[W]here the impugned conduct is directed to the public generally or a section of the public, the question whether the conduct is likely to mislead or deceive has to be approached at a level of abstraction where the Court must consider the likely characteristics of the persons who comprise the relevant class to whom the conduct is directed and consider the likely effect of the conduct on ordinary or reasonable members of the class, disregarding reactions that might be regarded as extreme or fanciful: Campomar at [101]-[105]; Google at [7] per French CJ and Crennan and Kiefel JJ.
98 If there were a requirement to land upon a single response by a hypothetical person, this would sit in significant tension with, if not be inconsistent with, the proposition advanced by both parties that it is not necessary for an applicant to prove that a “not insignificant number” of people within the class were likely to be misled. If one had to land upon one response, this would naturally invite consideration of the response of the majority of reasonable members of the class. If the majority were not misled, then the case would fail notwithstanding that the conduct misled (a not insignificant number of) reasonable members of the class.
99 Section 29(1)(g) of the ACL provides:
A person must not, in trade or commerce, in connection with the supply or possible supply of goods or services or in connection with the promotion by any means of the supply or use of goods or services: …
(g) make a false or misleading representation that goods or services have sponsorship, approval, performance characteristics, accessories, uses or benefits;
100 It was not in dispute that, if the pleaded representations were made, then they were made “in trade or commerce”. It follows that, to establish a breach of s 29(1)(g), the ACCC must establish that Google, “in connection with the supply or … promotion … of goods or services”, made a “representation that goods or services have … performance characteristics … uses or benefits” which was “false or misleading”.
101 The words “in connection with” in this context are broad; see also [340] below. The various pleaded representations, if made, were made “in connection with” the supply or promotion of the Android OS, the Pixel phone and various Google services. Google did not submit that this requirement was not made out in relation to Google LLC. The ACCC relied upon a number of performance characteristics, uses and benefits – see: letter from Norton Rose Fulbright to Corrs Chambers Westgarth dated 21 July 2020. It is not necessary to set these out in detail. By way of example, it was alleged that Google represented that the Android OS had “performance characteristics” which it did not have including that the Location History setting controlled whether Google would obtain personal data about a user’s location and that if the setting was “off” Google would not obtain personal data about a user’s location from a linked device or use that data. Equivalent particulars were provided with respect to “uses” and “benefits” of the Android OS.
102 A comparison of the text of ss 18 and 29 suggests potentially significant differences in their respective operation. For example:
(1) First, the legislative drafter has chosen quite different language with “misleading or deceptive” and “false or misleading”. Ordinarily, this might be thought to indicate a different test was intended. However, as will be explained below, the different wording simply reflects the fact that the provisions were drawn from different origins and it was probably not intended from the choice of words to indicate substantially different tests. This result is also suggested by the fact that both tests cover conduct or representations which are “misleading”.
(2) Secondly, s 18 includes the words “likely to mislead or deceive” which reflects an amendment to s 52 of the Trade Practices Act 1974 (Cth) (TPA) made in 1977; these words do not have a counterpart in s 29 and did not have a counterpart in s 53 of the TPA (from which s 29 of the ACL was drawn).
(3) Thirdly, s 18 is concerned with “conduct” and s 29 is concerned with “representations”.
103 Section 18 of the ACL was based on s 52 of the TPA. Section 52 of the TPA, headed “misleading or deceptive conduct”, enacted a generally applicable minimum standard, applicable to any conduct in trade or commerce. Section 52 of the TPA (and now s 18 of the ACL) only ever carried civil consequences. Section 52 reflected, at least in the Australian context, a new regulatory approach. It was based at least in part on s 5 of the US Federal Trade Commission Act 1914, although there were significant differences; in particular, s 52 did not prohibit “unfair” practices – see: Hornsby Building Information Centre Pty Ltd v Sydney Building Information Centre Ltd (1978) 140 CLR 216 at 226-227.
104 Section 29 of the ACL was based on s 53 of the TPA. Section 53 of the TPA, as enacted in 1974, was headed “false representations” and created criminal offences in respect of specifically enumerated false representations and “false or misleading statements”. Section 53, being specific in its focus and carrying criminal consequences, was by no means novel. At least in part, it was modelled on or influenced by State and UK legislation including the Consumer Affairs Act 1972 (Vic), the Trade Descriptions Act 1968 (UK) and the Merchandise Marks Act 1887 (UK).
105 Whilst ss 52 and 53 were both within Div 1 of the TPA, entitled “Unfair practices”, the purpose of s 53 was to create specific offences carrying criminal consequences, whereas the purpose of s 52 was to create a minimum standard of conduct, breach of which – through other provisions – would carry civil consequences. It has been said that “ACL s 29 (formerly TPA s 53) supports ACL s 18 [formerly TPA s 52] by enumerating specific types of conduct which, if engaged in trade or commerce in connection with the promotion or supply of goods or services, will give rise to a breach of the Act”: Miller RV, Miller’s Australian Competition and Consumer Law Annotated (43rd ed, Thomson Reuters, 2021) at [ACL.29.20]. It is important, however, to appreciate that the provisions have different origins, objectives and operation even if s 29 can be described as “supporting” s 18 or it can be seen that there is partial overlap in purpose.
106 In considering the operation of s 52, it has been said that the provisions of Part V of the TPA (which included ss 52 and 53) are “remedial” and should, accordingly, be given a liberal construction: Accounting Systems 2000 (Developments) Pty Ltd v CCH Australia Ltd (1993) 42 FCR 470 at 503. That proposition has been stated in relation to ss 18 and 29 of the ACL: Australian Competition and Consumer Commission v Australian Private Networks Pty Ltd [2019] FCA 384 at [15]. Authority establishes that “remedial” legislation should be construed so as to give the fullest relief which the fair meaning of its language will allow, even where a person contravening the provision will be liable to a penalty: Devenish v Jewel Food Stores Pty Ltd (1991) 172 CLR 32 at 44 (Mason CJ). It should also be recognised, however, that there is a distinction between s 18 and s 29 which is evident from the statutory language and purpose. Section 29 has a penal operation and s 18 does not. There is nothing unusual in a provision having more than one purpose – cf: Rich v Australian Securities and Investments Commission (2004) 220 CLR 129 at [35]; New South Wales Aboriginal Land Council v Minister Administering the Crown Lands Act (2016) 260 CLR 232 at [92]. The ordinary rules of construction apply to a “penal” provision: Beckwith v The Queen (1976) 135 CLR 569 at 576. The ordinary rules of construction also apply to “remedial” or “beneficial” legislation. Underlying the observations which are made from time to time that penal provisions should be construed strictly, remedial provisions liberally, or beneficial provisions beneficially, is the simple proposition that, where different constructions are available, a legislative provision should receive a construction which promotes its purpose over one which does not. The labelling of a provision as “penal”, “remedial” or “beneficial” merely reflects a conclusion about purpose as revealed by the statutory text read in context. A label can be unhelpful if it disguises that a provision might have multiple purposes or if it leads to an assumption that the purpose is to be pursued at all cost.
107 Ultimately, it is the terms of the particular provision which must be applied to the facts. There are differences between ss 18 and 29 which might be important in particular cases. First, as mentioned, s 29 is a civil penalty provision – see: s 224 of the ACL. In a civil proceeding, the court must find the case of a party proved if it is satisfied that the case has been proved on the balance of probabilities: s 140(1) of the Evidence Act. Section 140(2) of the Evidence Act provides:
Without limiting the matters that the court may take into account in deciding whether it is so satisfied, it is to take into account:
(a) the nature of the cause of action or defence; and
(b) the nature of the subject-matter of the proceeding; and
(c) the gravity of the matters alleged.
108 Whilst s 140(2) applies according to its terms, it is appropriate to observe that s 140(2)(c) reflects the common law position that the more serious the allegation, the less likely it might be expected, all other things being equal, that a respondent would have committed the relevant act: Briginshaw v Briginshaw (1938) 60 CLR 336. The Court must have regard to the gravity of what is sought to be established in assessing whether the party bearing the onus (the ACCC) has discharged that onus: Patrick Stevedores Holdings Pty Ltd v Construction, Forestry, Maritime, Mining and Energy Union (2019) 286 IR 52 at [17]-[18] (Lee J).
109 Secondly, s 29 revolves around the making of representations. Whilst it may be accepted that representations can arise through conduct, the application of s 18 requires identification of “conduct” and s 29 requires the identification of “representations”. A representation is a statement about or related to a matter of fact, which might be made in writing, orally, pictorially or through conduct: Given v Pryor at 445-446 (Franki J).
110 Thirdly, unlike s 18, there is no express reference in s 29 to a concept of “likely to mislead”. Google submitted that, by reason of the lack of reference in s 29 to the concept of something being “likely to mislead”, the ACCC had to prove to the requisite standard that Google made representations that were actually false or misleading; it was not sufficient for the ACCC to prove that it was likely, or that there was a real or not remote chance or possibility, that relevant users would be misled. This would be sufficient for s 18, but not for s 29. For the reasons which follow, I accept this submission.
111 The words “likely to mislead or deceive” were introduced in 1977 into what was then s 52 of the TPA. This followed a recommendation made by the Swanson Committee in its August 1976 report: Trade Practices Act Review Committee, Report to the Minister for Business and Consumer Affairs (Australian Government Publishing Service, Canberra, 1976). The Swanson Committee explained at [9.55]:
A number of submissions also suggested to the Committee that it was not certain whether section 52 required proof of actual damage or whether the mere possibility of damage were sufficient to invoke the section. The Committee considers that the section should apply to conduct which is likely to mislead or deceive, without requiring proof that the conduct has misled or deceived, but should not apply to conduct which has merely a tendency to mislead or deceive. We recommend that section 52 should be amended to make it clear that it applies only to conduct ‘that is, or is likely to be, misleading or deceptive’.
112 In relation to s 53 of the TPA (upon which s 29 is based), the Committee stated at [9.64]:
… [T]he Committee would like to express the general view that it should not be the function of section 53, which has criminal law sanctions, to prohibit as wide a sweep of false and misleading conduct as possible. Section 53 should deal only with conduct which has demonstrably led to abuses and involves a real potential for harm. Section 52, which has sanctions of a civil nature, provides a more appropriate approach to a general prohibition of undesirable practices.
113 A number of cases have recognised that, although s 29 of the ACL uses the term “false or misleading” rather than “misleading or deceptive” as used in s 18, there is no meaningful difference between the two phrases; indeed, it is at least implicitly suggested in some of the cases that there is no meaningful difference between a representation being “false or misleading” and conduct being “likely to mislead or deceive”. This line of authority, at least in the recent past, starts with the decision of Gordon J in Australian Competition and Consumer Commission v Dukemaster Pty Ltd [2009] FCA 682. At [14] and [15] her Honour stated:
In relation to the first element, s 53(e) [of the TPA] requires the representation to be “false or misleading” as opposed to “misleading or deceptive” (in s 52). I was not taken to, and I have not found, any authority which attributes a meaningful difference to this dichotomy for the purposes of the TPA. (For a discussion of the phrase “false and misleading” under a different Act, see Construction, Forestry, Mining and Energy Union v Hadkiss (2007) 160 FCR 151). Indeed, the vast majority of cases that discuss an alleged breach of s 53(e) couple it with a breach of s 52 and deal with the “false or misleading” and “misleading or deceptive” aspect of the conduct mutatis mutandis: see Foxtel Management Pty Ltd (2005) 214 ALR 554 at [94]; ACCC v Target Australia Pty Ltd (2001) ATPR 41-840; ACCC v Harbin Pty Ltd [2008] FCA 1792; ACCC v Prouds Jewellers Pty Ltd [2008] FCAFC 199 at [42].
That is not altogether surprising. The purpose of s 53 has been described as being to “[support] s 52 by enumerating specific types of conduct which, if engaged in by a corporation in trade or commerce in connection with the promotion or supply of goods or services, [would] give rise to a breach of the Act”: see R Miller, Miller’s Annotated Trade Practices Act (30th ed, 2009), [1.53.5]. Accordingly, and in the absence of any submission to the contrary, I see no reason why the application of s 53(e) should not fall to be determined upon the conclusions I reach in relation to s 52 – namely that the representations were misleading and deceptive or likely to mislead or deceive.
114 In Coles Supermarkets at [38] to [40], Allsop CJ stated:
For the enquiry under s 18, it is necessary to identify the impugned conduct and then to consider whether that conduct, considered as a whole and in context, is misleading or deceptive or likely to mislead or deceive: Google Inc v Australian Competition and Consumer Commission (2013) 249 CLR 435; 294 ALR 404; 99 IPR 197; [2013] HCA 1 at [89], [102] and [118]; and Campomar Sociedad Limitada v Nike International (2000) 202 CLR 45; 169 ALR 677; 46 IPR 481; [2000] HCA 12 at [100]–[101] (Campomar). The same applies to the enquiry as to representations and conduct under ss 29(1)(a) and 33, respectively.
Conduct is misleading or deceptive or likely to mislead or deceive if it has the tendency to lead into error, if there is a sufficient causal link between the conduct and the error on the part of the person exposed to the conduct: Australian Competition and Consumer Commission v TPG Internet Pty Ltd (2013) 250 CLR 640; 304 ALR 186; 96 ACSR 475; [2013] HCA 54 at [39] (TPG). The causing of confusion or questioning is insufficient; it is necessary to establish that the ordinary or reasonable consumer is likely to be led into error.
There is no meaningful difference between the words and phrases “misleading or deceptive” and “mislead or deceive” (s 18), “false or misleading” (s 29(1)(a)) and “mislead” (s 33): Australian Competition and Consumer Commission v Dukemaster Pty Ltd [2009] FCA 682 at [14].
115 In Comité Interprofessionnel at [170], Beach J accepted that there “is no meaningful difference” between the words and phrases “misleading or deceptive”, “mislead or deceive” or “false or misleading”, citing Dukemaster at [14] and Coles Supermarkets at [40].
116 In Australian Competition and Consumer Commission v HJ Heinz Co Australia Ltd (2018) 363 ALR 136 at [37], White J stated:
The principles which the Court applies when considering alleged contraventions of s 29(1) and s 33 of the ACL are settled. Section 29 of the ACL is the counterpart to s 53 of the Trade Practices Act 1974 (Cth). It was common ground that the case law developed in relation to s 53 may be applied in relation to s 29. It was also common ground that, despite the slight differences in language between the terms “misleading or deceptive” and “mislead or deceive” used in s 18 of the ACL and the term “false or misleading” used in s 29(1), the terms have the same meaning: Australian Competition and Consumer Commission v Dukemaster Pty Ltd [2009] FCA 682 at [14], cited with approval by Allsop CJ in Australian Competition and Consumer Commission v Coles Supermarkets Australia Pty Ltd [2014] FCA 634; (2015) 317 ALR 73 at [40]. It has, however, been held that conduct which is “liable to mislead” (being the term used in s 33) applies to a narrower range of conduct than does conduct which is “likely to mislead or deceive” (being the term used in s 18): Coles Supermarkets at [44] and the cases cited therein. Under s 33, what is required is that there be an actual probability that the public would be misled: Trade Practices Commission v J&R Enterprises Pty Ltd (1991) 99 ALR 325 at 339.
117 In Australian Competition and Consumer Commission v GlaxoSmithKline Consumer Healthcare Australia Pty Ltd (2019) 371 ALR 396 at [6] and [7], Bromwich J stated:
The three pleaded provisions have aspects in common, and to that extent overlap, but also some important areas of difference. There is no material difference in the factual inquiry as between s 18 and s 29(1)(g) in this case because:
(1) there is no meaningful distinction between “misleading or deceptive” and “false or misleading”: Australian Competition and Consumer Commission v Dukemaster Pty Ltd [2009] FCA 682 at [14]–[15]; and
(2) the conduct, both admitted and denied, is by way of the same express or implied representations in relation to both provisions, even though s 18 is general in its scope and does not attract any civil penalty consequences, while s 29(1)(g) is specific in its focus and does have civil penalty consequences.
Section 33 is in a somewhat different category, because the requirement to establish that the impugned conduct was “liable to mislead the public” as to characteristics or suitability is both narrower than “likely to mislead or deceive” in s 18, and requires proof of an actual probability that the public would be misled: see Australian Competition and Consumer Commission v Coles Supermarkets Australia Pty Limited (2014) 317 ALR 73; [2014] FCA 634 (ACCC v Coles) at [44] and the cases there cited.
118 The Full Court in TPG Internet at [21] stated:
Although s 18 takes a different form to s 29, the prohibitions are similar in nature. Whilst s 29 uses the phrase “false or misleading” rather than “misleading or deceptive”, it has been said that there is no material difference in the two expressions: see Australian Competition and Consumer Commission v Dukemaster Pty Ltd [2009] FCA 682 at [14] per Gordon J; Australian Competition and Consumer Commission v Coles Supermarkets Australia Pty Ltd (2014) 317 ALR 73; [2014] FCA 634 at [40] per Allsop CJ; Comité Interprofessionnel du Vin de Champagne v Powell (2015) 330 ALR 67; 115 IPR 269; [2015] FCA 1110 at [170] per Beach J.
119 The amendment which was made in 1977 to s 52 of the TPA was apparently motivated by an intention to make clear that, in order to establish a contravention of s 52, it was not necessary to adduce evidence that someone had actually sustained loss or was in fact misled: see [111] above. It may be questioned whether the amendment was strictly necessary. Section 52 set a standard of conduct. It is not necessary to establish loss in order to establish a failure to meet a certain standard of conduct. It would have been necessary to prove loss if a person sought a remedy in respect of contravening conduct, and the relevant remedy was premised on loss being suffered (for example under s 82 of the TPA). Further, leaving loss to one side, if the proper inference to draw is that a reasonable person was likely to have been misled by relevant conduct, the Court would ordinarily conclude that the conduct was “misleading or deceptive”. Nevertheless, the amendment to s 52 indicates, both textually and as a matter of legislative history, that there is a difference in what may be described as the “purposes” (and operation) of s 52 (s 18 of the ACL) and s 53 (s 29 of the ACL).
120 Section 29(1)(g) requires that a representation be “false or misleading”. The representation must actually be “false or misleading”. It is not necessary to adduce evidence that a reasonable person in the relevant class was in fact misled. That is an inference which the Court can draw from the objective circumstances. In my view, it is an inference the Court would have to draw, having regard to the correct standard of proof, in order to be satisfied that there had been a contravention of s 29(1)(g). If the Court only considered that there was a “real or not remote chance or possibility” that a reasonable person in the relevant class was in fact misled, but was not prepared to conclude that any person was in fact misled, then a contravention of s 29(1)(g) would not be established even though that might be sufficient to establish a breach of s 18 – see: Global Sportsman Pty Ltd v Mirror Newspapers Pty Ltd (1984) 2 FCR 82 at 87. If the proper inference to draw is that the representation was “false or misleading” to some members of the class acting reasonably, but not to other reasonable members, the fact that some reasonable members of the class would not have been misled can be taken into account in determining an appropriate penalty, but a contravention will still have been established.
121 Section 33 – which relates to goods – provides:
A person must not, in trade or commerce, engage in conduct that is liable to mislead the public as to the nature, the manufacturing process, the characteristics, the suitability for their purpose or the quantity of any goods.
122 Section 34 – which relates to services – provides:
A person must not, in trade or commerce, engage in conduct that is liable to mislead the public as to the nature, the characteristics, the suitability for their purpose or the quantity of any services.
123 To establish a breach of ss 33 or 34, the ACCC must establish that:
(1) in trade or commerce;
(2) Google engaged in conduct that was “liable to mislead”;
(3) the conduct was liable to mislead “the public”;
(4) the conduct was liable to mislead as to the nature, characteristics or suitability for purpose of the goods (s 33) or services (s 34).
124 The first matter was not in dispute.
125 As to the second matter, “liable to mislead” is a higher standard than “likely to mislead or deceive” under s 18. The ACCC is required to demonstrate that there was an “actual probability that the public would be misled”: Coles Supermarkets at [44]. In Coles Supermarkets at [44], Allsop CJ stated:
While the words and phrases “misleading or deceptive”, “mislead or deceive”, “false or misleading” and “mislead” are synonymous, the authorities reveal that a distinction is to be made between “likely to mislead or deceive” (in s 18) and “liable to mislead” (in s 33). The latter has been said to apply to a narrower range of conduct: Westpac Banking Corporation v Northern Metals Pty Ltd (1989) 14 IPR 499 at 502; Trade Practices Commission v J & R Enterprises Pty Ltd (1991) 99 ALR 325 at 338–9 (J & R Enterprises); and Australian Competition and Consumer Commission v Turi Foods Pty Ltd (No 4) [2013] FCA 665 at [79]. Under s 33, what is required is that there be an actual probability that the public would be misled: J & R Enterprises at 339. (This citation of J & R Enterprises at 338–9 should not be taken to endorse the comments of O’Loughlin J as to a burden beyond reasonable doubt at 339.)
126 In GlaxoSmithKline at [7], Bromwich J accepted that s 33 required proof of an actual probability that the public would be misled, referring to Coles Supermarkets at [44].
127 As to the third matter, a representation will be made to the public if the approach is general and the number of people who are approached is sufficiently large or if the approach is to all within a sufficient segment of the community at large. In Trade Practices Commission v J & R Enterprises (1991) 99 ALR 325 at 347-348, O’Loughlin J said:
The word “public” is not to be taken as meaning the world at large or the whole community. There will be a sufficient approach to the public if, first, the approach is general and at random and secondly, the number of people who are approached is sufficiently large. In dealing with the phrase “invitation to the public”, Barwick CJ said in Lee v Evans (1964) 112 CLR 276 at 285:
… the basic concept is that the invitation, though maybe not universal, is general; that it is an invitation to all and sundry of some segment of the community at large. This does not mean that it must be an invitation to all the public either everywhere, or in any particular community.
128 As to Scenario 1, concerning the set-up of a device, the relevant users became Google Account holders during the set-up process, assuming that they so agreed. I am satisfied that, if any conduct otherwise breached ss 33 or 34, the requirement that the conduct mislead “the public” is satisfied.
129 As to Scenarios 2 and 3, it was accepted that there were approximately 6.3 million users who opened a Google Account on their Android phone during the period January 2017 to August 2019. Google contended that the various representations concerning Scenarios 2 and 3 were not “to the public” on the basis that the relevant users had entered into the Terms of Service, so that the group was not random; the class of persons to whom the representations were made were people who had a specific relationship with Google as Google Account holders.
130 Google referred in this regard to Shahid v Australasian College of Dermatologists (2008) 168 FCR 46, a case concerning s 55A of the TPA, which was the predecessor to s 34 of the ACL. Google referred to the observations of Jessup J at [206] (with whom Branson and Stone JJ relevantly agreed):
... I accept, of course, that the concept of “the public” is, in an appropriate context, narrower than the world at large, and narrower even than all persons who, for example, live or work within a particular area. I would accept that a representation might be regarded as being addressed to the public in the relevant sense notwithstanding that the potential users of the services in question were, in the nature of things, few in number. I have in mind, for example, a representation made in an advertisement for services of a very specialised kind. It would be the generality of the range of persons to whom the representation was addressed, rather than the practical likelihood of many of them being interested in acting upon the representation, that would justify the conclusion that it was addressed to the public: see Lee v Evans (1964) 112 CLR 276 and J & R Enterprises 99 ALR at 347-348 ...
131 The ACCC noted that Jessup J concluded that there was no representation to the public, with his Honour stating immediately after the part quoted by Google (ACCC’s emphasis added):
However, in the present case the representations were not made to the public at all. They were made to medical practitioners who enquired about, or showed an interest in, becoming Fellows of the College, and even then only to such practitioners as had passed the Part 1 examination or its equivalent. In this respect I would add that, in their endeavour to persuade us that s 55A was activated in the circumstances of the present case, counsel for the appellant referred to no evidence as to the public availability of the training handbooks, or to the public accessibility of the College’s web site, from which the appellant downloaded at least some of those handbooks; and his Honour made no findings on these questions. In the circumstances, I do not think it has been established as a matter of fact that the representations upon which the appellant sued were addressed to the public for the purposes of s 55A of the Trade Practices Act. For those reasons, I would dismiss so much of the appellant’s case as relies upon that section.
132 The ACCC submitted that the key reason in Shahid that the representations were found not to be “to the public” was that they were not freely available to the public, but were only available to persons who had passed an exam as a step to becoming a member of the College of Dermatologists. The ACCC submitted that passing that exam was no small endeavour and that one could well understand why the recipients of the representation were not found to be “the public”. The ACCC contrasted that to the present case, where the only requirement was a Google Account. That was a service that was available freely to all.
133 Most members of the public were entitled to become Google Account holders. It may be that there were some age restrictions, but that can be put to one side. I am satisfied that Google Account holders constituted a sufficient “segment of the community at large” for this requirement to be satisfied: Lee v Evans (1964) 112 CLR 276 at 285 (Barwick CJ). I am satisfied that, if ss 33 or 34 is otherwise shown to have been breached in relation to Scenarios 2 and 3, the requirement that the relevant conduct mislead “the public” is satisfied.
134 As to the fourth matter, the ACCC alleged that the requirement was satisfied in various ways set out in a letter from Norton Rose Fulbright to Corrs Chambers Westgarth dated 21 July 2020. It is not necessary to set these out in detail. By way of example, it was alleged that Google represented the “nature” of the Android OS in a manner that was liable to mislead the public, including by representing that the Location History setting controlled whether Google would obtain personal data about a user’s location and that if the setting was “off” Google would not obtain or use personal data about a user’s location.
135 Both ss 33 and 34 are civil penalty provisions – see: s 224 of the ACL. Accordingly, the observations referred to above concerning s 140(2) of the Evidence Act and the principles in Briginshaw apply also to ss 33 and 34.
THE CLASS OF USERS IDENTIFIED BY THE ACCC
136 In his first report, Professor Slonim identified two groups of users: “typical” and “atypical”. “Typical” users were those users who were not “atypical”. “Atypical” users were those for whom the benefits of further navigation were extremely high, or the costs of further navigation were extremely low, such that the marginal benefit exceeded marginal cost sufficiently to favour further navigation and the design of the screens was unlikely to affect this cost/benefit analysis. In his view, atypical users included technical or legal experts, or users with special interest or motivation and included persons “especially concerned about privacy”.
137 In Professor Slonim’s view, typical users constituted the “vast majority of users”. Professor List agreed with Professor Slonim that the majority of users would “blow through” the set-up process, by which it was meant that the user would take the shortest possible route through the set-up process, without paying much attention at all, so as to be able to begin using the device as quickly as possible.
138 Google submitted that the end result was that the experts agreed that the “vast majority of users” were unlikely to reach any of the screens central to the ACCC’s case. This submission is correct, but it is important to understand that the ACCC’s case was focussed on the particular users who did in fact reach the relevant screens. As to the ACCC’s three scenarios:
(1) Scenario 1 – set-up : Professor Slonim agreed that there was a “low likelihood” that a typical user would read the Privacy and Terms screen and a lower likelihood that he or she would read it carefully. In his first report, Professor Slonim stated that only atypical users would reach the More Options screen during the set-up process. Although it is correct that his reports stated that view, this is best viewed as a flourish intended to emphasise his point that it was very few people. Professor Slonim made it clear in his oral evidence that he considered some typical users might reach the More Options screen, although it was “very few” or a “very small per cent”. I accept his oral evidence in this respect.
(2) Scenario 2 – turning Location History “off”: Professor Slonim’s view was that most typical users would not reach the Location History landing page. Indeed, he considered it unlikely that even an atypical user would do so, but that those users who did reach it would “mostly” be atypical. Professor Slonim’s evidence did not specifically address the position of a user who had turned Location History “on” and then later wanted to turn it “off”. The ACCC’s case was directed to the user who had already made a conscious decision to turn Location History “off”. Such a user necessarily knew that the setting existed, because the user had already formed the view that he or she wanted to turn it “off”. Such a user would have an understanding about the Location History function, which might or might not be correct or complete but which led the user to have formed the desire to turn it “off”. Common sense indicates that a user wanting to turn Location History “off” would be likely to reach the Location History landing page. It was not suggested that a reasonable user who wanted to turn Location History “off” would not be able to find the Location History landing page or that it was in any way difficult to find.
(3) Scenario 3 – turning Web & App Activity “off”: Professor Slonim’s evidence was that it was not likely that many typical users would reach the Web & App Activity landing page. Again, this evidence did not specifically address the position of a user who was considering turning Web & App Activity “off”. Such a user necessarily knew that the setting existed. Common sense indicates that a user considering whether to turn Web & App Activity “off” would be likely to reach the Web & App Activity landing page. It was not suggested that a reasonable user considering turning Web & App Activity “off” would not be able to find the Web & App Activity landing page or that it was in any way difficult to find.
139 Google submitted that the relevant audience in the ACCC’s case was comprised of atypical users who had the ability and incentive to navigate and search until they understood the true function of the setting and referred to Professor Slonim’s evidence that atypical users are “going to keep looking until they understand everything”. There is no neat dividing line between typical and atypical users. For example, those users who do not “blow through” the set-up process by taking the path of least resistance will have different reasons for seeking out further information. Not all such users are going to keep looking until they understand everything. It depends on various matters, including the reason the particular user had for deviating from the typical path or seeking out further information.
140 It is also relevant to note that, whilst the classes of users that the ACCC relied upon were quite precisely articulated, the classes included a large variety of consumers. To name a few obvious possible differentiating characteristics, the classes would have comprised both intelligent and less intelligent people, educated and not so well educated people, old people and young people.
141 As noted earlier, the first scenario was directed to users who, between 30 April 2018 and 19 December 2018, used the Android OS on his or her mobile device to set up his or her Google Account and, when viewing the Privacy and Terms screen (Annexure A), chose to click on “More Options” rather than “I agree” or “Don’t create the account”.
142 The ACCC’s case in relation to the first scenario was not that Google should have given some general warning to all users at the commencement of setting up an account. Rather, the ACCC’s case was concerned only with users who did in fact reach the More Options screen.
143 The necessary consequence of the way the ACCC has defined the class of users is that the relevant user (being a user who behaved differently from the majority) necessarily took or was involved in three steps:
(1) deciding that the user would create an account, but not to click “I agree” on the Privacy and Terms screen (Annexure A), and instead click on “More Options”;
(2) being then shown the More Options screen (Annexure B), through which the user would need to scroll, which showed default settings with respect to various matters and which contained links to further information (“Learn more” links) about the relevant settings; and
(3) then having to click on “I agree” at the bottom of the More Options screen in order to continue the set-up process.
144 In its opening submissions, the ACCC submitted that the users who clicked on “More Options” on the Privacy and Terms screen were particularly interested in privacy and that clicking on “More Options” was “not the natural pathway” for the majority of users. The ACCC’s case was that most users would not click on “More Options” and so would not be misled in the way contended. The evidence of both Professor Slonim and Professor List confirmed this: the vast majority of users would not click on “More Options”.
145 In its closing submissions, the ACCC referred to the users to which this part of the case was directed as a “broad class”, stating:
That is the broad class to whom the conduct was directed. There is no evidence of any restrictions on persons who could purchase an Android phone and set up a Google Account (other than some restrictions that applied to children). For that reason, the class is that part of the public at large who pressed on the “More Options” screen while setting up their Google Account on their Android phone.
146 This submission might be seen as attempting to depart from the way the case had been opened. There might be many different kinds of person who are in the class of people who clicked on “More Options”. But that potentially obfuscates an important point about the class. The class itself is narrow, in the sense that most people would not click on “More Options”.
147 Given that most people would not have clicked on “More Options”, what are the characteristics of the user who would have clicked on “More Options”? The ACCC submitted:
(1) “what can be inferred about this broad class of persons to whom the conduct was directed is that it included reasonable and ordinary people who were interested enough in the matters the subject of the ‘More Options’ link so as to invest the time to look at that information”;
(2) “[t]here might be a multitude of reasons for that”, including that the user:
(a) may have a particular interest in the use of their personal information and, in this regard, a particularly sensitive category of personal information is user location data;
(b) might be concerned about his or her search history;
(c) was particularly interested as a matter of intellectual curiosity or was diligent enough to seek further information or did so because they were not familiar with the technology; and
(3) “persons who click on ‘More Options’ can be taken to be particularly concerned to control how Google collects and uses data which is to be treated as personal information”.
148 As to (1), the “More Options” link was at the bottom of the Privacy and Terms screen. Every user had to pass through the Privacy and Terms screen during the process of creating a new Google Account when setting up a Linked Device. A user would not know, on arriving at the Privacy and Terms screen, what was on the More Options screen otherwise than by inferences drawn from the content of what the user had read to that point, including in particular what the user had read on the Privacy and Terms screen.
149 The ACCC’s case was that most users would not have clicked on “More Options” at all, because they would have scrolled through the Privacy and Terms screen without digesting its content and clicked “I agree” rather than “More Options”. It is likely that the sort of user who would have clicked on “More Options” was the sort of user who would have paid more attention to the content of the Privacy and Terms screen than those (the majority of users) who effectively ignored its content. That might be because of the sort of person that the user was or because of some particular interest such as privacy or because the user’s interest increased by reason of what the user read on the Privacy and Terms screen.
150 Whatever the reason, I conclude that users who clicked on “More Options” would have read the Privacy and Terms screen to a greater extent than the majority of users who, the parties agreed, would have simply scrolled through the screen without paying any particular attention to its content and clicked “I agree”.
151 It would have been necessary for a user to scroll to the bottom of the Privacy and Terms screen in order to get to the “More Options” link (or to click “I agree” or “Don’t create the account”). The bottom part of the Privacy and Terms screen was as follows:
152 In my view, a reasonable hypothetical user in the class identified by the ACCC is likely to have focussed on the heading “You’re in control” and the content underneath. That material is the material on the Privacy and Terms screen most relevant to explaining the “More Options” link. It is the material most proximate to the “More Options” link. A user scrolling through the Privacy and Terms screen, whether reading its content slowly and carefully or scanning it, would have come to a rest at the bottom of the screen where that material was located. That material is a likely reason a user in the class would click on “More Options”. It is likely that a user in the class would have understood that, by clicking on “More Options”, the user would be provided with options through which the user could control how Google collected and used data associated with the user’s Google Account.
153 The second step for the particular type of user which the ACCC says was misled was that the user, having clicked on “More Options”, was shown an additional screen (Annexure B) which showed default settings with respect to various matters and which contained links to further information (“Learn more”). The version of this screen which the ACCC relied on was shown to all users who clicked on “More Options” between 30 April 2018 and 22 October 2018 and some (but not all) users between 23 October 2018 and 19 December 2018.
154 The two critical settings on the More Options screen were Web & App Activity and Location History. The former was defaulted to “on”. The latter was defaulted to “off”.
155 The Web & App Activity setting was:
156 The More Options screen then referred to five other settings. It was necessary to scroll down on a mobile device to see all of them. After “Web & App Activity” came “Ads Personalisation”, then “YouTube Search History”, then “YouTube Watch History”, then “Location History” and finally “Voice & Audio Activity”.
157 The Location History setting was:
158 The third step was that the user scrolled to the bottom of the More Options screen and clicked “I agree”. The ACCC submitted, and it was implicit in the way it ran its case, that users who clicked on “More Options”, whilst different from the typical user in doing so, were generally likely not to have clicked on the “Learn more” links. The ACCC acknowledged, however, that some users would have clicked on one or more of those links.
159 If a user clicked on the “Learn more” link which appeared under the Location History setting, a pop up message appeared (the LH Set-up Pop Up). This was in the following form:
160 The text in the LH Set-up Pop Up changed from time to time. I do not consider that the various iterations alter the end result.
161 If a user clicked on the “Learn more” link under the Web & App Activity setting, he or she was shown a pop up message (the WAA Set-up Pop Up) in the following form:
162 The text in the WAA Set-up Pop Up changed from time to time. Again, I do not consider that the various iterations alter the end result.
163 The ACCC accepted that, if a user clicked on all of the “Learn more” links, the user would not have been misled. In my view, a user who clicked on the Web & App Activity “Learn more” link is the sort of person who would have read what was stated on the WAA Set-up Pop Up and would not have been misled.
164 The ACCC’s case is that the user who went through the three identified steps (some of whom reasonably would not have clicked on the “Learn more” links) and saw the default settings on the More Options screen was not notified that, even if Location History was “off”, Google might continue to obtain, retain and use “personal data” about the user’s location. The ACCC noted in respect of the More Options screen that:
(1) the only setting which had a title or description which used the word “location” was the Location History setting;
(2) the toggles listed under the Location History setting made it apparent that the default option (described as “Don’t save my Location History in my Google Account”) was that Location History was turned “off”;
(3) the description for Web & App Activity did not make any reference to location or any reference to Google obtaining, retaining or using personal data about the user’s location.
165 The ACCC’s case put simply was that there were reasonable users in the class of users who clicked on “More Options” who would have concluded from the More Options screen – in particular, the Location History setting being defaulted to “off” and with little else on the page to indicate that any other setting might control the collection and use of location data – that Location History was the only setting which controlled the collection and use of location data or that with Location History “off” Google would not obtain, retain or use location data.
166 Google made twelve submissions concerning what the hypothetical user would have done and understood during the three steps referred to earlier.
167 One difficulty with the submissions made by Google is that reasonable members of the class would have behaved in various ways. Google’s submissions presume only one reaction or response. Some of the members of the class would have acted in the way Google submitted in its twelve particular points. But others would not have. I am not satisfied that all reasonable members of the class would have behaved in the way Google submitted. Accepting that the class the ACCC has selected was largely comprised of atypical users, because very few typical users would have clicked on “More Options”, I am not satisfied that the appropriate hypothetical user (if one is confined to one hypothetical user) would have behaved in the extremely careful and attentive way in which Google submitted that user would behave.
168 Overarching Google’s twelve submissions was the submission that the Privacy and Terms screen and the More Options screen must be read as a whole (and with the links contained in them). I accept that the Privacy and Terms screen and the More Options screen must be read as a whole. They must be read in the way in which consumers would have read them. As noted at [86] above, it is necessary to approach the assessment of the relevant conduct by placing oneself in the position of the relevant consumers, namely people setting up their device for the first time. As to the links contained in the Privacy and Terms screen (other than the “More Options” link) and the More Options screen, only some people in the class identified by the ACCC would have clicked on those further links.
169 First, Google noted that the Privacy and Terms screen provided information under the heading “Data that we process when you use Google”. The opening part of the screen, including the content under the heading “Data that we process when you use Google”, was:
170 Google placed particular emphasis on the third and fourth dot points set out above. Google noted that location is specifically identified as information that is processed. The dot points also explained that this kind of information (including information about location) is processed when the user uses apps or sites that use Google services.
171 The ACCC submitted that a user would not realise from this that information was being given to Google about the user’s location and that Google’s submission otherwise suffered from the vice of reading the Privacy and Terms screen as if it were a contract.
172 In response, Google noted that the reasonable member of the relevant class of users, on the ACCC’s case, was privacy-focussed and that such a user would be expected to read the text under the heading “Data that we process when you use Google”. Such a person would pay attention to a statement that “we process information about that activity – including information such as … location”.
173 Google also submitted that “a person who is taking proper care of his or her own interests will immediately perceive that what is being asked of them is to agree to ‘Terms of Service’ and that the person is entering into a contract with Google LLC”. The first words on the Privacy and Terms screen were: “To create a Google Account, you’ll need to agree to the Terms of Service below”. The last words on the screen required the user to press “I agree”.
174 Google submitted that the next thing a user was told after stating that the user needed to agree to the Terms of Service was that when the user created a Google Account “we process your information as described in our Privacy Policy”. A summary of the key points of the Privacy Policy then followed. In Google’s submission the message was unambiguous: if you were interested in privacy and how information was processed, you should have read this screen carefully.
175 All of this may be accepted. The issue is that the class identified by the ACCC, included members who – even though they may have been more conscious about privacy issues, or location data in particular – would not have paid the degree of careful attention to the content of the Privacy and Terms screen as Google suggested. The degree of attention would have varied according to the level of the user’s interest and concern. Some may have paid careful attention. Others would not have. On balance, it is likely that most users scanned through the Privacy and Terms screen or read it relatively quickly.
176 Secondly, Google pointed to the fact that the Privacy and Terms screen then provided information about why Google processed the data under the heading: “Why we process it”. The information included:
177 The word “it” in the heading referred to the data that is being processed. It would have been obvious to a reasonable user, Google submitted, that information about location may have been relevant to these things. For example, things such as “more relevant search results” and “personalised ads” may have been expected to draw on the user’s location.
178 It follows, Google submitted, that users were expressly informed that Google would process information about location and that it could do so for a variety of purposes. Google submitted that at least some of those purposes appeared to be, at least principally, for the benefit of Google, for example, the purpose of conducting analytics and measurement to understand how Google’s services are used.
179 In opening the case, the ACCC submitted that reasonable users would not understand, from the repeated references to “processing” on the Privacy and Terms screen, that “information would be collected and used”. I accept that there would have been users who, if they read this information at all, acting reasonably, thought that “processing” did not involve retaining data and using it later. I would think that most reasonable users, if they read this material, would have understood that Google received data, otherwise it could not “process” it. I do not accept that all, or even the majority, of users reading the material under “Why we process it” would have turned their minds to whether, or understood that, location or location data might be relevant to such things as “more relevant search results” and “personalised ads” or appreciated that location data was included in the data processed.
180 In my view, there would have been reasonable users within the class identified by the ACCC who would not have paid the degree of attention necessary to appreciate the matters to which Google pointed. The likelihood is that reasonable users in the class identified by the ACCC would have scrolled relatively quickly through the Privacy and Terms screen, seeking to gain a general but not detailed understanding of what was written, and then focussed on the last segment under the heading “You’re in control”.
181 Even if reasonable users had understood from the Privacy and Terms screen that Google might be processing location data, reasonable users in the class identified by the ACCC would have thought that could be modified (or “controlled”) by clicking on “More Options”. Further, the class would have included reasonable users who would not think that, because data was processed, it was retained for later use.
182 Thirdly, Google observed that the Privacy and Terms screen included several prominent hyperlinks to Google’s Privacy Policy with an explanation that the user could obtain from that policy more information about what data Google processed and why it did so. The ACCC did not allege that the Privacy Policy failed to disclose the detail about how and why Google obtained and used data, including location data. Rather, the ACCC contended that few users would have turned to or digested the Privacy Policy. Google submitted that users were made aware in clear terms that, if they wished to learn the detail about how and why Google processed the user’s information, the Privacy Policy was the place to go to for a full description. Google submitted that there was no reason to think that the “notional representative class member”, who reached the More Options screen, would have overlooked the Privacy Policy. Google submitted that the hypothetical user would be expected to have gone to and read the Privacy Policy.
183 As mentioned, the question whether conduct is misleading or deceptive does not depend on an artificial selection of a singular response from a single representative member of the relevant class. Even if it did, it would not be reasonable to conclude that such a person would be diverted into reading the full terms of the Privacy Policy whilst setting up his or her device. In my view, reasonable members of the class would not have read the Privacy Policy and would have assumed that it had been accurately summarised.
184 Fourthly, Google pointed to the following words on the Privacy and Terms screen:
185 Google submitted that, by the time the user reached this part of the screen, he or she had been informed about what the relevant “data” included. The user had been expressly informed that when he or she searched for a restaurant on Google Maps, for example, information about that activity was processed, including information such as location. The user had been told, in terms, that this kind of information was processed when he or she used apps or sites that used Google services. That is, the user was informed that Google might obtain and use information about the user’s location, being information that was processed by Google when the user used apps or sites that use Google services or conducted Google searches.
186 Some users may well have understood these matters. But it is likely that reasonable members within the class identified by the ACCC would not have. Further, even if users had understood these matters, as submitted by Google, those users would have considered that they would be able to control the obtaining, retention and use of the data by clicking on “More Options”.
187 Fifthly, Google observed that the More Options screen included information about the Web & App Activity setting. The relevant text on this screen changed over time. In the period 30 April 2018 to 19 December 2018, the following text was shown under the heading “Web & App Activity” to some users (other users were shown some different text for some of the period after 23 October 2018):
188 Importantly, this text did not use the word “location”. It stated that Web & App Activity saved “activity from sites and apps that use Google services”. Google submitted that the word “activity” clearly included location, emphasising that this is made plain from the description of “activity” on the Privacy and Terms screen. Google submitted that, having just come from the Privacy and Terms screen, the reasonable user would have known that his or her activity from sites and apps provided data to Google about his or her location and that Google might have used that data. At all relevant times, the Web & App Activity setting’s default position was “on”.
189 In my view, there were reasonable users in the class identified by the ACCC who would not have concluded from this part of the More Options screen that Web & App Activity was directed to obtaining, retaining or using location data. It is true that, if one paid a lot of attention to the content of both the Privacy and Terms screen and the More Options screen and, in particular, the Web & App Activity and Location History parts of the More Options screen, then one might have worked out that Web & App Activity was likely to allow the obtaining and use, and potentially retaining, of personal data about the user’s location. However, there were reasonable users in the class who would not have paid significant attention to the detail provided in relation to Web & App Activity. I am comfortably satisfied that there were reasonable users within the class identified by the ACCC who would have assumed from the content of the various screens which the user had read to that point that, if they wanted to prevent Google obtaining, retaining or using personal data about the user’s location, then the way to do that was by ensuring that “Location History” was turned “off”. This was the default setting, so the user would have assumed that the user did not need to do anything. Further, I am satisfied that other users, also acting reasonably, would have concluded that Google would be able to obtain and use personal data about location, but would have assumed that Google would not retain that information with Location History defaulted to “off”.
190 Sixthly, Google observed that the More Options screen also included a setting concerning “Ads Personalisation”. The text concerning that setting provided:
191 Google submitted that the user reading this page would have reasonably supposed that Google may use the user’s location data to undertake this personalisation. According to Google, a person’s location was an obvious aspect of the personalisation of advertisements.
192 In my view, there were reasonable users in the class identified by the ACCC who would not have attached any significance to this aspect of the More Options screen in terms of what it meant for obtaining, retaining or using personal information about location. It may be accepted that some may have, but in my view, it would simply not occur to many reasonable users in the ACCC’s class that location had anything to do with personalising ads.
193 Seventhly, Google referred to the text on the More Options screen under the heading “Location History”. For reasons I will come to, it is better to focus on the whole of the Location History entry:
194 Google submitted that “[i]t is a seriously distorted reading of this text to suggest that it conveys that Location History is the setting, and the only setting, that controlled whether Google would obtain personal data about the relevant User’s location from a Linked Device”. Rather, Google submitted “it tells the User that the Location History setting saves a private map of where he or she goes, and does so even when a Google product is not being actively used, and that this gives the User better map searches, commute routes, and more”. Google submitted that “[i]t is not stated that this is the setting that controls everything concerning location”.
195 It is true that if one spent a lot of time analysing what was stated on the Privacy and Terms screen and the More Options screen then it is less likely that one would conclude that Location History was the only setting that controlled everything to do with location. However, it must be borne steadily in mind that one is talking about a reasonable user setting up his or her device, sufficiently interested to click on “More Options”, but not necessarily meticulous. Google’s arguments have more and more attraction the longer one pores over the various screens and analyses their content. This is not the appropriate approach. What is necessary is to put oneself in the position of reasonable users in the class identified by the ACCC and determine whether users in that class would have been misled. I am satisfied that there were reasonable users in that class who would have gleaned (incorrectly) from scanning over or reading quickly through the More Options screen, that Google would not be obtaining, retaining or using personal data about location whilst the Location History setting was “off”.
196 In my view, reasonable users would have paid more attention to the heading “Location History” and the descriptions next to the toggles than they would have to the explanatory words under the heading “Location History”.
197 I would interpolate that a number of reasonable users would later have worked out that the user’s initial assumption, formed during the set-up process, was likely to have been wrong. Perhaps that is why those users might return to “Location History” or “Web & App Activity”. For example, a reasonable user in the class identified by the ACCC might have reasonably formed the impression during set-up that Google was not obtaining, retaining or using personal data about location because Location History was defaulted to “off”, but then have conducted a Google Maps search which indicated that Google must be obtaining and using location data.
198 Eighthly, Google submitted that the express disclosure that Location History (if turned “on”) would access location data “even when you’re not actively using a Google product” conveyed (accurately) that normally only the active use of a Google product would involve the collection and use of location data but that the position was changed if Location History was turned “on” because Google would access location data “even when you’re not actively using a Google product”.
199 Google submitted that it was a significant agreed fact that, when the Location History setting was enabled, Google collected and stored data about the user’s location even if the user was not actively using a Google product (what is sometimes described as the collection of location data in the “background”). In contrast, if the Location History setting was disabled but the Web & App Activity setting was enabled, Google only collected and stored data about the user’s location if the user was actively using certain Google services (for example, Google Maps or Google Search). Google gave the example that, with Location History turned “on”, a user’s location would be recorded as they commuted from their home to their place of work. That would not occur if the user had Location History turned “off” but Web & App Activity turned “on”, unless that user was actively using a Google Service that collected and used location data (such as Google Maps) during that commute. This might be a significant agreed fact but it was not something which was known to reasonable consumers in the class.
200 I do not accept that the words on this part of the More Options screen are as clear in meaning as Google would have it. In particular, the words, in context, are easily understood as suggesting that Location History is the setting which controls whether location data is saved or retained by Google irrespective of how that data was obtained or whether it was used and that, with Location History “off”, location data could not be retained by Google for later use.
201 Leaving that aside, it may be accepted that some users in the class identified by the ACCC would have paid sufficient attention to the words under the heading “Location History” and the remaining content of both the Privacy and Terms screen and More Options screen and reached the conclusion that, even with Location History defaulted to “off”, Google was obtaining, using and possibly retaining personal information about a user’s location. However, there were reasonable users in the class identified by the ACCC who reasonably would not have paid close or detailed attention or have reached this conclusion and would instead have concluded that, with Location History defaulted to “off”, Google would not be obtaining, retaining or using personal data about location.
202 The logic of some of the arguments put forward by Google increases in attraction the more they are analysed and the more frequently the screens are read, but this distracts from the real inquiry and is ultimately apt to mislead. The inquiry is into what reasonable users in the class would have concluded, being consumers who were not armed with a statement of the agreed facts. True it is that, after a careful examination of the content of both screens, the conclusions Google urged might be reached. That is not the issue. The screens were read by users setting up the device. Such users, even ones with heightened privacy concerns, would not re-read screens with the kind of careful attention that has been necessary in considering the various arguments put by the parties.
203 Ninthly, Google submitted that, when one reads the Privacy and Terms and More Options screens as a whole, it is apparent that the first three settings on the More Options screens (Web & App Activity, Ads Personalisation and YouTube Search History) related to how information concerning a user’s “activity” on Google services was retained and used, which corresponded to the statement on the Privacy and Terms screen that Google processed information about a user’s activity on Google services. The Location History setting was then described in different terms on the More Options screen. That description expressly noted that the setting saved location information “even when you’re not using a Google product”. Accordingly, Google submitted, the user was informed, and would have understood, that activating that setting involved an extension, being that location information will be saved even when there is no “activity” on the part of the user.
204 It may just be accepted that some users may conceivably have had the high degree of observation suggested by Google and have reached the conclusion advocated by Google. But the structure referred to would not have been “apparent” to average users in the class identified by the ACCC, the relevant users being ones setting up their devices and probably being quite uninterested in the structural similarities or otherwise between the Privacy and Terms and More Options screens as a whole.
205 Tenthly, Google referred to the fact that the default setting for Location History was “off” and submitted that this would serve to confirm for the relevant user that the Location History setting was not the one setting that governed the collection and storage of location data. That is because, Google submitted, the user was expressly informed by the Privacy and Terms screen that Google collected and used location data. Google submitted that, in those circumstances, it would be “incongruous” if the only setting that uniquely controlled the obtaining and use of location data was defaulted to “off”.
206 For reasons given earlier, I do not accept that reasonable users would necessarily have understood from the Privacy and Terms screen that Google collected and used location data. Nor do I consider it necessarily incongruous that Location History was defaulted “off” given the fact that the Privacy and Terms screen informed users that they could control what Google does – see [186] above.
207 I am satisfied that there were reasonable users in the class identified by the ACCC who would not have thought from seeing that the Location History setting was “off” that this confirmed that Location History was not the only setting which controlled the collection and storage of location data.
208 Eleventhly, Google noted that, under the text concerning Web & App Activity, there was a prominent hyperlink which invited the relevant user to “Learn more”. The user who accessed the “Learn more” pop up was, Google submitted, informed in “crisp terms” that data saved in the user’s Google Account may include location data.
209 Google submitted:
At this point, the weakness of the ACCC’s case is truly laid bare. The User that the ACCC apparently has in mind is a person who: (a) is interested in privacy but does not carefully read four points made under the heading “Data that we process when you use Google”, or review the terms of the Privacy Policy; (b) having not read or digested that material, this User nevertheless (because they are interested in privacy issues) clicks on the “MORE OPTIONS” tab; (c) having clicked on that tab, this person then ‘runs out of puff’ and does not click [on] the “Learn more” tab under the Web & App Activity heading. This kind of User is a theoretical construct that finds no reflection in the economic evidence or in common sense.
210 The ACCC’s case was expressly confined to a particular type of user. The ACCC’s case was not that most users were misled. Its case was that most users, whether or not they were interested in privacy at a theoretical level, simply clicked through the set-up process without paying any particular attention to what was put before them. The ACCC’s case was targeted to those users who were sufficiently concerned about privacy (or for some other reason felt it desirable) to click on “More Options”. The level of interest of such a person, or the level of concern about privacy, would determine the extent to which such a person would seek out information. A user particularly paranoid about having his or her location used might have clicked on “Learn more” underneath Web & App Activity despite there being no reference to “location” connected to that setting on the More Options screen. Others may have too. However, I am satisfied that there were reasonable users who had clicked on “More Options”, who would choose not to continue and click on each of the “Learn more” links or one or other of them. There is a point where reasonable people give up drilling down to plumb the depths for further information. I would think the lack of desire increases with each link.
211 Twelfthly, Google noted that, referring to the text under the heading “Location History”, Professor Slonim agreed that “it’s very unlikely that a typical user would reach that particular part of the screenshot”, and that “the users who would get to [that part] would be atypical”. Google submitted that it should be expected that the user who reached this page would also click on “Learn more”. Google submitted that the user who took up that invitation would learn some more things that confirmed that the Location History setting was not the setting that controlled everything to do with data collection and use.
212 I repeat what I have said above in relation to the eleventh matter raised by Google.
213 I accept that reasonable users in the class included users who, acting reasonably, would not have clicked on any of the “Learn more” links. Other users, acting reasonably, would have clicked on the “Learn more” links. More users would have clicked on the “Learn more” link related to the Location History setting than the Web & App Activity setting. This is because of the position of the Location History setting on the More Options screen and the fact that reasonable members of the class were likely to be more interested (by reason of what they had read up until that point) in Location History than in Web & App Activity (which did not refer to location).
214 The LH Set-up Pop Up (which appeared when the “Learn more” link was clicked) did not inform users that regardless of the Location History setting, data about the user’s location would still be collected by reason of the Web & App Activity setting. Some reasonable users would have assumed that the Location History setting was the only setting which needed to be “off” in order for personal data about location not to be retained and later used. Some reasonable users would not have thought also to look at Web & App Activity which did not refer to “location” on the More Options screen.
215 If a user did click on “Learn more” in relation to the Web & App Activity setting, the user would have realised from the WAA Set-up Pop Up that data about the user’s location would be collected by reason of the Web & App Activity setting. Such a user would probably have assumed that the data would be retained and used; in any event, I understood the ACCC to have accepted that this was the case. There was little on the More Options screen to indicate that Web & App Activity had anything to do with location, meaning it was less likely that a user concerned about privacy generally or location data in particular would click on “Learn more” under the Web & App Activity setting.
216 Each of the users in the class had to have passed through the Privacy and Terms screen. The users in the class were ones who behaved differently to the vast majority of users in clicking “More Options”. That is probably because the user had a higher degree of interest than the majority of users in privacy generally or location data specifically. I accept that such a user would ordinarily be expected to pay some attention to what was written on both the Privacy and Terms screen and the More Options screen. I do not accept that they would have all parsed and analysed the screens in the careful and meticulous way suggested by Google.
217 If the correct understanding of the law is, as Google submitted, that the hypothetical reasonable member of the class might only have one reaction, I would have held that the hypothetical member of the class taking the identified steps, and noting that the Location History setting was switched “off”, would have:
(1) correctly understood that Google might obtain and use the user’s location from time to time when the user used certain Google services;
(2) incorrectly understood that Google would not retain personal information about the user’s location;
(3) incorrectly understood that Google would not later use the personal information about the user’s location.
218 As noted earlier, I do not consider that the question is resolved by assuming that the hypothetical consumer is capable of only one response. I consider that construction to be antithetical to the statutory objectives of consumer protection legislation such as this. Accordingly, my conclusions in relation to s 18 of the ACL are as follows:
219 First, I conclude that the class of users who clicked on “More Options” included users who, acting reasonably, did not click on any “Learn more” links and would have incorrectly concluded from Google’s conduct as a whole (which conveyed a representation to that effect) that the Location History setting was the setting which controlled whether Google would obtain personal data about the user’s location.
220 Secondly, some users within the class, acting reasonably, would have thought that Google would not, even on a once off basis, obtain or use personal data about a user’s location if the Location History setting was “off”. I therefore also conclude that the relevant class of users, being users who clicked on “More Options” (including those who clicked on the “Learn more” link to the Location History setting but not to the Web & App Activity setting), included users who, acting reasonably, would have incorrectly concluded from Google’s conduct as a whole (which conveyed a representation to that effect) that, with Web & App Activity turned “on” and Location History turned “off”, Google would not obtain or use the user’s location at all, even when the user used certain Google services.
221 Thirdly, many reasonable users in the class would have thought that Google might obtain and use a user’s location on a “once off” basis, for example, when the user was using certain Google services (for example, to provide directions on Google Maps), notwithstanding that the Location History setting was “off”. However, of these users, many would have reasonably assumed that the data so obtained would not be retained or subsequently used. Therefore, I also conclude that users within the class (including those who clicked on the “Learn more” link to the Location History setting but not to the Web & App Activity setting), acting reasonably, would have incorrectly concluded from Google’s conduct as a whole (which conveyed representations to that effect) that, although Google might obtain and use personal data about the user’s location on “one off” occasions to provide a particular service, Google would not:
(1) retain personal data about the user’s location after having so obtained location data; or
(2) later use the personal data about the user’s location which had been so retained.
222 I accept that there were also reasonable users in the class who would not have been misled at all.
223 Both parties denied that it was relevant to seek to determine the number of users who would have been misled. It was not in dispute that the representations, if conveyed by the conduct as a whole, were misleading or deceptive because they were incorrect.
224 I am satisfied that the ACCC has established contraventions of s 18 to the extent indicated at [219] to [221] above.
225 The ACCC contended that s 33 or, alternatively, s 34 of the ACL was also contravened. I am satisfied that Google’s conduct in the respects indicated at [219] to [221] was “liable to mislead the public” in that there is a probability that members within the class identified by the ACCC, acting reasonably, were misled “as to the nature, the characteristics, [or] the suitability for their purpose … of … services”. The breaches are more appropriately classified as ones with respect to services than goods. Given that the ACCC’s case was s 33 or s 34 as alternatives, I conclude that Google LLC breached s 34.
226 Further, I would draw the inference that, within the class identified by the ACCC, there were users who, acting reasonably, were in fact misled by representations to the effect indicated at [219] to [221], which I find Google made through what it stated and what it did not state – see: [120] above. I therefore conclude that Google breached s 29(1)(g) of the ACL by representing that the services which Google provided had “performance characteristics”, “uses” or “benefits” which they did not have in breach of s 29(1)(g). Such representations were “false or misleading” within the meaning of s 29(1)(g).
227 I would observe in relation to ss 29(1)(g) and 34 (being civil penalty provisions), that, if users had read all of the material made available by Google, or read the information on the Privacy and Terms screen with greater care, or had clicked on the “Learn more” link to the Web & App Activity setting on the More Options screen, reasonable users in the class identified by the ACCC would probably not have been misled. This matter may be relevant to penalties.
228 Also potentially relevant to penalties are the following:
(1) the class of users was narrowly defined and there was no suggestion that Google engaged in contraventions applicable to the public at large, as opposed to that portion of the public which, acting unusually, clicked on “More Options”;
(2) whilst there would have been, within the class identified by the ACCC, users who reasonably would not have clicked on the “Learn more” link to the Web & App Activity setting, those users who did not click knew that there was additional information available (but not that it necessarily related to location) and chose not to look at it;
(3) there was no express representation to the effect that Google would not obtain, retain or use personal location data when Location History was turned “off”. The representations (relevant to s 29(1)(g)) are ones which can only be implied by taking into account conduct, and silence, and then only with respect to those unusual users who read some but not all screens.
SCENARIO 2: USERS WANTING TO TURN LOCATION HISTORY “OFF”
229 The second aspect of the case was directed to users who had previously turned Location History “on” and wished to turn it “off”. The allegations concerned three different types of “Location History Statements”, alleged to have been made where a user had previously turned the Location History setting “on” and wished to turn it “off”.
The class of users
230 As noted earlier, the Location History setting was “off” by default. The ACCC’s case concerned a class of users who had “consciously navigated back to [the Location History setting] page” in order to turn Location History “off”. The members of the class must have had at least the following characteristics:
the user must have already created a Google Account and seen screens relevant to the creation of the account;
the user must have had a mobile device which used the Android OS and had one or more mobile devices linked with their Google Account;
the user must have turned the Location History setting from its default position of “off” to “on” – this could have occurred during the set-up process, including on the More Options screen, or at some later time; and
the user must later have made a decision to turn Location History back to “off”.
231 As Google submitted, every member of the class of persons to whom the Location History Statements were made must have known (or at least determined) the following things:
(1) first, that the Location History setting existed;
(2) secondly, how to navigate to that setting; and
(3) thirdly, that the setting was “on”.
232 The ACCC did not put a positive case as to why a user might have turned Location History “on”. There was no allegation that any user was misled at the time of turning the Location History setting to “on”.
233 There were various ways that a user might turn Location History “on” apart from the obvious one of navigating during set-up to the More Options screen and turning it “on”. One was by turning on Google Assistant. Part of the Screenshot Bundle dealt with the set-up of a Google Account in the early period, and included screens relevant to turning Google Assistant “on”. The process by which this was done involved the following screen:
234 The user who turned Location History “on” through Google Assistant was informed that Location History “Creates a private map of where you go with your signed-in devices”. At least the users who had set up an Android OS mobile device during the relevant period and created a Google Account in the set-up process are likely to have previously seen the Privacy and Terms screen.
235 Information was also provided to users who had turned Location History “on” in other ways:
(1) If the user had turned Location History “on” during set-up (for example at SB 61), the user would have just been through the Privacy and Terms screen (SB 15, 56) and, after 30 April 2018, the More Options screen.
(2) If the user had turned Location History “on” at a later point, the user would have been shown the screens at SB 119-121, which convey equivalent information. Having pressed the toggle next to the word “Off” or “Use Location History” on those pages, the user would have been shown the screen at SB 117, and would then have pressed “OK”.
236 The information on the screen at SB 117 included such matters as:
Location History being a setting which “helps you get useful information … by creating a private map of where you go”;
that to “create this map, Google regularly obtains location data from devices … [including] when you aren’t using a specific Google product”;
that the private map helps the user in various ways, including, for example, by getting commute predictions and improved search results.
Reasonable users in the class would have included users who absorbed little or none of this information and users who absorbed much of this information. Much would depend on the reason why the user was turning Location History “on”.
237 It was not made clear, either through pleadings or through submissions, what the ACCC relied upon as to the reasons why a user wanted to turn Location History “off”. However, I infer that users who wanted to turn Location History “off” knew something about the function of the Location History setting, or at least, had a view or opinion about what that function was. More likely than not, users would at least have thought that the Location History setting, when turned “on”, meant that Google kept a history of the user’s location. Users who wanted to turn Location History “off” included users who knew or thought that Google had been keeping, or was able to keep, a history of the user’s location and did not want Google to retain or subsequently use a history of the user’s location. The class of users would also have included users who knew or thought that Google had been obtaining and using, or could obtain and use, location data, but did not want Google obtaining or using personal data about the user’s location in the future.
238 I do not accept Google’s submission that the hypothetical reasonable user would have known at the time of making the decision to turn Location History “off” what the function of Location History was, at least at the level of precision for which Google contended. I do accept that reasonable users who had made a decision to turn Location History “off” had, for some reason or other, a concern about the person’s location being obtained and used or retained for later use. The users in the class are likely, therefore, to have paid more attention to the content of the screens shown during the process of turning Location History “off”, particularly the first two. Nevertheless, reasonable users would have viewed the screens at different speeds and with different degrees of attention.
Steps in turning Location History “off”
239 There were three stages which users went through when turning Location History “off”:
(1) First, users navigated to a page within the settings menu of their devices which contained the toggle to turn Location History “on” and “off” (the Landing Page).
(2) Secondly, users pressed on the toggle, which caused a pop up message to appear on the screen (the Pop Up).
(3) Thirdly, after pressing “OK”, users came to the Exit Page.
240 The screens changed over the relevant period. For ease of analysis, the screens were categorised by the ACCC into three different groups referred to as “Type A”, “Type B” and “Type C”.
Type A Location History Statements
241 The “Type A Location History Statements” were used in the period 1 January 2017 until 22 October 2018. There were different screens during that period, depending on the precise time and the type of device being used. The ACCC’s allegations concerned users who took each of the following steps:
(1) The first step was that the users obtained access to the Landing Page, using one of the steps articulated in the Concise Statement at [21(a)]-[21(b)]: Concise Statement [27(a)]. The Landing Page differed slightly for different users: see SB 113 to 116.
(2) The second step was that the user pressed a toggle situated next to the words “On” or “Use Location History” (so as to turn the setting “off”), and was then shown a Pop Up similar to the screen at SB 118: Concise Statement at [27(b)].
(3) The third step was that the user would press “OK”, and be shown an Exit Page similar to the screens at SB 119, 120 and 121: Concise Statement at [27(c)].
242 The ACCC alleged that the user who went through these steps was not notified that, even if Location History was “off”, Google might continue to obtain, retain and use personal data about the user’s location: Concise Statement at [29]. The ACCC contended that by providing users with the Type A Location History Statements, and not at the same time providing this information, Google (impliedly) represented to users that, if they turned Location History “off”:
(1) Google would not “obtain” further personal data about the user’s location from a Linked Device (Concise Statement [49(i)]); and
(2) Google would not “use” any further personal data about the user’s location from a Linked Device (Concise Statement [49(ii)]).
243 I address next each of the relevant screens. I make some observations about what users may have thought at the end of the first and second steps of the process. It is necessary to remember, however, that for the purposes of s 18 of the ACL, the question is whether the conduct as a whole was misleading or deceptive or likely to mislead or deceive. Contravention of s 18 turns on whether Google’s conduct as a whole – assessed at the end of the relevant process, namely at the end of the third step – misled or was likely to mislead reasonable members of the class identified by the ACCC – see: [83] above.
244 There were five variants of the Type A Landing Page.
245 The first two variants of the Type A Landing Page referred to the creation of a “private map of where you go”. Little information about that map was provided. Variant 2 was as follows:
246 Variant 1 was broadly the same except that the words were:
Creates a private map of where you go with your signed-in devices in order to provide improved map searches, commute routes, and more.
247 Users arriving on the Variant 1 or 2 Type A Landing Page would, at that point, know that Location History created a private map of where the user went with signed-in devices. This is what these variants of the Landing Page expressly stated. It does not follow that those users would have thought that the creation of a private map was a complete description, or an exhaustive statement of the function of, the Location History setting. Nor does it follow that reasonable users would have known precisely what was meant by, or what follows from, the creation of a private map. Users would have concluded that the statement was a summary of what the Location History setting did and not necessarily a complete summary.
248 These two variants did not indicate that personal data about location might be obtained by Google LLC through some other channel, such as when the user used a Google product or service such as Google Maps.
249 In my view, there were users in the class (seeing Variants 1 or 2 of the Type A Landing Page) who would reasonably have understood or thought from Google’s conduct to this point that with Location History turned “off”:
(1) Google LLC would not be able to obtain, retain or use location data; or
(2) Google LLC would continue to be able to obtain and use location data when the user was using a Google product or service, but Google LLC would not be able to retain data so obtained or later use that data.
250 I accept that there were also users in the class who would not have been misled at this point and who would have concluded that, with Location History “off”, Google might still obtain, retain and use location data when the user used a Google product or service.
251 Variants 3 to 5 referred to Location History allowing Google to “obtain” or “get” location data from Linked Devices: SB 114-116. These variants did not contain any reference to a “map”. They referred to the fact that Google could get the data “even when” the user was not using a Google product or service. Variant 4 (at SB 115) was as follows:
252 The text in Variant 3 was: “Google gets location data from devices that have Location History turned on, even if you aren’t using a Google product. Learn more”. The text in Variant 5 was: “Google gets location data from devices that have Location History turned on, even if you aren’t using a Google product. Find out more”.
253 Given that the class of users identified by the ACCC were ones who wanted to turn Location History “off” for the reasons which would have included those identified at [237] above, it is reasonable to conclude that the users would have paid more attention to the words used than “typical” users. The reasonable user was one interested in the topic and who had a view about the function of the Location History setting. The user is one who was likely to have used the device for some time and appreciated that location data was being obtained and used in providing services. In my view, users in the class, acting reasonably, would have interpreted the text in two different ways of present relevance:
(1) some users would have understood that the Location History setting was confined to obtaining location data from devices only and that location data could still be obtained when the user was using a “Google product” (in particular those users viewing Variant 4);
(2) some users would have understood that Location History allowed Google LLC to obtain location data from devices and when the user was using a “Google product” (in particular those users viewing Variants 3 and 5).
254 Whether or not the former interpretation accords better with the literal words, the latter interpretation is one to which I consider reasonable members of the class would have come in the context in which the words were viewed. It is not clear, particularly in the case of Variants 3 and 5, whether the statements are intended to convey: (a) what the function of Location History is or (b) that Google obtains location data both from devices and otherwise and this is what Location History controls. Accordingly, and having regard to the name of the setting being “Location History”, it is likely that, within the class identified by the ACCC, there were users who would have understood from Google’s conduct as a whole to the point of reading Type A Landing Page Variants 3 to 5 that, if Location History were turned “off”:
(1) Google LLC would not be able to obtain, retain or use location data; or
(2) Google LLC would continue to be able to obtain and use location data when the user was using a Google product or service, but Google LLC would not be able to retain data so obtained for later use.
255 I accept that there were also users in the class who would not have been misled at this point and who would have concluded that, with Location History “off”, Google might still obtain, retain and use location data when the user used a Google product or service.
256 After pressing the toggle on the relevant Landing Page to “off”, users then saw a Pop Up in the following form (SB 118; noting that the wording changed from time to time):
257 In relation to this page, the ACCC submitted that:
(1) The heading “Pause Location History?” was large and bolded and “in a form akin to a call to action” and that it was likely that a user would read the heading and press “OK”, without reading the content of the message in detail.
(2) Users were likely to understand “Location History” to concern data about the user’s historical location. That is particularly the case given the “salience” of location data (and the use of “location” together with “pause” in the bolded heading). The ACCC referred, in this regard to the name of the setting, “Location History”, which suggested the setting concerns data about the user’s historical location.
(3) The “call to action” in the heading also conveyed the impression that, if one were to take that action (by pressing “OK”), the data about a user’s location would no longer be obtained by Google LLC.
(4) If users did read the text under the heading:
(a) The first paragraph referred to the “places you go with your devices” no longer being added to a Location History map and the fact that the functionality of other Google services would be affected, emphasising to users the advantages of having Location History turned “on”. This heightened the sense of “losing” those advantages were it to be turned “off”, without reference to what that choice meant for the collection of data;
(b) The second paragraph referred to the fact that users should “remember” that past activity would be retained even if Location History was paused, but did not identify (as a matter to “remember”) that Google might continue to collect and store personal data in relation to location if the Web & App Activity setting was “on”.
258 As to these submissions, reasonable users in the class would interpret the heading as inviting reconsideration of the decision to turn Location History “off”. Given that the class of users identified by the ACCC were ones who wanted to turn Location History “off” for reasons which would have included those identified at [237] above, I consider that the heading would not have discouraged users from reading the text underneath and that reasonable users in the class would have read the text. The heading is appropriately adapted to informing the user that he or she had got to the correct place to turn Location History “off”.
259 The ACCC noted that the Type A Pop Up contained a warning in respect of past data. The ACCC submitted that this created the impression that a warning was needed about the past, but none was necessary about data going forward. The ACCC noted that the Pop Up did not indicate that Google would continue to collect location data going forward by reason of the Web & App Activity setting, unless the user also located and turned that setting “off”.
260 I accept that these matters are part of the context which must be taken into account in determining whether there were members in the class who would have been misled by Google’s conduct assessed as a whole.
261 The Pop Up screen reinforced for those users who saw Variant 1 or 2 of the Type A Landing Page that the places the user went with the device would not be added to the “Location History map”. Those users who had seen Variants 3 to 5 of the Type A Landing Pages saw a reference to “map” for the first time.
262 The representation that turning Location History “off” might limit the functionality of some Google products over time would have suggested to some users that turning Location History “off” would mean that Google would not be able to obtain or use location data.
263 In my view, at this point in the process of turning Location History “off”, and noting that users would have just seen and read one of the five variants of the Type A Landing Page, there were reasonable users within the class who would have thought or understood from Google’s conduct to this point that:
(1) Google LLC would not be able to obtain, retain or use location data; or
(2) Google LLC would continue to be able to obtain and use location data when the user was using a Google product or service, but Google LLC would not be able to retain data so obtained or later use that data.
264 I accept that there were also users in the class who would not have been misled at this point and who would have concluded that, with Location History “off”, Google might still obtain, retain and use location data when the user used a Google product or service.
265 After a user pressed “OK” in the Type A Pop Up, the user was taken to the Exit Page showing that Location History was toggled “off”.
266 In my view, there were users who, acting reasonably, would not have read the Exit Page at all or with the degree of care with which the user might have read the first two screens. The user had, at this point, achieved the objective of turning Location History “off”. For this reason, reasonable users might be less likely to read the Exit Page screen in detail or at all.
267 There were three variants of the Exit Page (SB 119–121), each of which contained a message referring to what would happen if the user turned Location History back “on”. The variants referred to Google creating a map from location data obtained from devices with Location History enabled, and contained a reference to the advantages to users of having the setting enabled, such as “improved search results” and “more useful ads”. An example of an Exit Page is as follows:
268 This marks the end of the three steps which users in the class went through to turn Location History “off”. As noted earlier, the question for the purposes of s 18 of the ACL is whether Google engaged in conduct which, assessed at the end of the process, was such as to mislead or was likely to mislead reasonable users in the class.
269 Google contended that Location History would be understood by users to relate to a specific product, being the creation of a private “map”. In response to this submission, the ACCC contended that the language in the three variants of the Landing Page was inconsistent with such a notion because some variants of the Landing Page were not linked to the preparation of a “map”. Whilst it is true that Variants 3 to 5 of the Landing Page did not refer to a “map”, it is also true that the Pop Up and Exit Page both referred to a “map”. Nevertheless, I consider that there were users in the class who, acting reasonably, would not have thought that the creation of a map was a complete description or summary of the function of the Location History setting or known exactly what was meant by the creation of a private map. In my view, there were reasonable users who, having passed through the three steps, would have formed the view that Location History was concerned with retaining location data about a user’s location for purposes which included creating and retaining a “map” of where the user had been and goes in order, amongst other things, to improve the services Google provided to the user. There were reasonable users who would have understood that one purpose of retaining the information was so that the information could be later accessed by Google in order to improve such things as search results and to provide more useful ads (see the text of the Exit Page).
270 The ACCC submitted that there was no indication in the screens that location information, including personal data about location, might be collected through other channels and that, therefore, users were likely to understand that there was no other setting which they needed to consider if seeking to stop personal location data being collected. I consider that many users having passed through the three Type A Location History screens would have appreciated that, when Location History was “off”, Google could still obtain and use location data when the user was using a “Google product” or service. In particular, I consider that many users who had seen Variants 3 to 5 of the Landing Page would have reached that conclusion. Those users might not have known how to stop that happening or that Web & App Activity was a possible means of preventing Google LLC obtaining and using location data.
271 Nevertheless, I accept that there were users in the class identified by the ACCC (including users who had seen any of Variants 1 to 5) who, acting reasonably, would not have appreciated that Google LLC would be continuing to obtain, retain and use location data via other channels.
272 In my view, the objective of turning Location History “off” having been achieved, and noting that users would have seen and read one of the five variants of the Type A Landing Page, there were reasonable users within the class who would have thought or understood from Google’s conduct (and I conclude that Google made representations to that effect) that, by turning Location History “off”:
(1) Google LLC would not be able to obtain, retain or use location data; or
(2) Google LLC would continue to be able to obtain and use location data when the user was using a Google product or service, but Google would not be able to retain data so obtained or later use that data.
273 In reaching that conclusion, I accept that there were users in the class who would not have been misled and who would have concluded that, with Location History “off”, Google might still obtain, retain and use location data when the user used a Google product or service.
274 If I were confined to determining one response, I would say that (2) in [272] above would be the response of the hypothetical reasonable member of the class. I have explained why I do not think one is confined to one response.
275 It follows that I accept that the ACCC has established contraventions of s 18 to the extent indicated at [272] above.
276 The ACCC contended that s 33 or, alternatively, s 34 was also contravened. I am satisfied that Google LLC’s conduct in the respects indicated at [272] above was “liable to mislead the public” in that there is a probability that members within the class identified by the ACCC, acting reasonably, were misled “as to the nature, the characteristics, [or] the suitability for their purpose … of … services”. I therefore conclude that Google LLC breached s 34.
277 Further, I would draw the inference that, within the class identified by the ACCC, there were users who, acting reasonably, were in fact misled by representations to the effect indicated at [272] above, which I find Google LLC made through what it stated and what it did not state. I therefore conclude that Google LLC breached s 29(1)(g) by representing that the services which Google LLC provided had “performance characteristics”, “uses” or “benefits” which they did not have in breach of s 29(1)(g). Such representations were “false or misleading” within the meaning of s 29(1)(g).
Type B Location History Statements
278 The “Type B Location History Statements” were shown to users in the period 9 May 2018 until 1 December 2018: Concise Statement at [31]. The ACCC ultimately relied on a number of variants of the Type B screens:
Landing Page: Variant 1 of 3 (SB 113); Variant 2 of 3 (ACCC abandoned reliance on this variant); Variant 3 of 3 (SB 166);
Pop Up: Variant 1 of 2 (SB 118); Variant 2 of 2 (SB 169-170);
Exit Page: same as Landing Page but with toggle turned “off”.
279 According to the ACCC, both variants of the Type B Landing Page contained identical text. The screen at SB 166 was as follows:
280 The ACCC noted that, by the Type B Landing Page, Google informed users that Location History would save where users had been and go with their devices in order to provide personalised maps, recommendations based on where the user had been “and more”.
281 The ACCC submitted that, when referring to the uses to which the data would be put, the Landing Page referred only to the benefits to users of the data being collected (“where you go”, “to give you …”). The ACCC submitted that there was no mention of personalised location data being collected and stored by Google for its own purposes. As to this second submission, in my view reasonable users would fully appreciate that Google also stood to benefit from the use of location data and, indeed, perhaps that Google had more to gain than the user.
282 After pressing the toggle to “off” or “paused”, users then saw one of the two variants of the Type B Pop Up. The text in Variant 1 was:
Pause Location History?
Pausing Location History may limit or disable personalised experiences across Google services. For example, you may not see recommendations based on places you’ve visited or helpful tips about your commute.
This setting does not affect other location services on your device, like Google Location Services and Find my Device.
Some location data may be saved as part of your activity on other Google services, like Search and Maps.
Pausing this setting doesn’t delete any of your past data. You can see or delete your data and more at maps.google.com/timeline.
Learn about the data Google continues to collect and why at policies.google.com.
283 The text in Variant 2 was:
Pause Location History?
Pausing Location History may limit or disable personalised experiences across Google services. For example, you may not see recommendations based on places that you’ve visited or helpful tips about your commute.
This setting does not affect other location services on your device, like Google Location Services and Find My Device.
Some location data may be saved as part of your activity on other Google services, like Search and Maps.
Pausing this setting doesn’t delete any of your past data. You can see or delete your data and more at maps.google.com/timeline.
Learn more about the data Google continues to collect and why at policies.google.com.
284 The ACCC submitted:
The Type B Pop Up informed Users that it did not affect other location services. However, it went on to say:
Some location data may be saved as part of your activity on other Google services like Search and Maps.
That was presumably an oblique reference to the fact that Web & App Activity may still be turned on, so that such data would be collected by Google. However, importantly, neither the Type B Pop Up nor Landing Page referred to the Web & App Activity setting, let alone indicated its relevance to the collection of location data.
285 Google emphasised the following aspects of the text:
This setting does not affect other location services on your device, like Google Location Services and Find my Device.
Some location data may be saved as part of your activity on other Google services, like Search and Maps …
Learn about the data Google continues to collect and why at policies.google.com
286 It is true that the Type B Pop Up did not refer to Web & App Activity. But it is equally true that the Pop Up plainly stated that location data might be saved as part of activity on other Google services. Although this page did not inform the user how to turn off the saving of location data through activity on other Google services, the Pop Up did not represent that location data was not being obtained or used by Google simply because Location History was “off”.
287 The ACCC submitted that like the Type B Landing Page, the Type B Pop Up referred only to the benefits of Location History to users: the first paragraph referred to limiting or disabling “personalised” experiences across Google services; the examples given were missing out on personalised recommendations and tips; the third paragraph of the Pop Up referred to the location data that continued to be saved as being part of “your” activity. The ACCC submitted that there was no mention of the location data that continued to be collected, stored and used by Google for Google’s own purposes.
288 In my view, reasonable users would fully appreciate that Google also stood to benefit from the use of location data. Further, reasonable users in the class identified by the ACCC would have understood that location data continued to be collected, retained and used by Google, including for Google’s purposes or purposes which benefitted Google and the user, when users used other Google services.
Were the Type B Location History statements misleading?
289 The ACCC submitted that the Type B screens were misleading for two reasons:
(1) First, the ACCC contended that by providing users with the Type B Location History Statements and not at the same time stating that users could stop Google continuing to obtain personal data in respect of location by turning the Web & App Activity setting “off”, Google represented to users that the only way to prevent Google from obtaining personal data about the user’s location arising from the user’s use of other Google services (such as Search and Maps) was to cease using those services: Concise Statement [50(i)] and [50(ii)].
(2) Secondly, the ACCC contended that by providing users with the Type B Location History Statements and not at the same time indicating that Google might continue to obtain, use and retain personal data about the user’s location for one or more of “Google’s Purposes”, Google represented to users that, if they turned Location History “off”, this would have the result that Google would:
(a) only obtain or use further personal data about the user’s location obtained from a Linked Device for the user’s purpose; and
(b) not obtain or use further personal data about the user’s location obtained from a Linked Device for any of Google’s purposes: Concise Statement at [51], [55].
290 The ACCC submitted that, by reason of the Type B screens, Google told users that even with Location History turned “off”, Google might continue to obtain personal data in respect of location if the user used Google services like Search and Maps. The ACCC submitted that the statement was technically accurate but incomplete: users could stop that occurring by turning the Web & App Activity Setting “off”.
291 The ACCC submitted that the representation was misleading: in fact, users could continue to use Google services (including Search and Maps) without Google collecting personal data in respect of location, by turning Web & App Activity “off”. The ACCC referred to the evidence of Mr Monsees that a user could continue to use the “central feature” of both Maps (being a map that reflects where you are) and Search (being search results that are meaningful because they reflect where you are) when Location History and Web & App Activity are “off”.
292 I do not accept that reasonable users would have thought that the only way to stop Google using location data when using Google services was not to use those services. Reasonable users would have been left wondering where or how to prevent Google using location data when using Google products or services, and would justifiably have felt frustrated if the user’s objective in turning Location History “off” had been to prevent the use of such data, but reasonable users would not have thought the only option was not to use Google services. I accept Google’s submission that “[i]t would be a wholly unreasonable response for the user to jump to the false conclusion that the ‘only’ thing he or she could do was to stop using those other Google Services altogether”.
293 The ACCC submitted that, when referring to the uses to which the data may be put, the Type B screens did not mention the possibility that Google would use such data for its own purposes. The ACCC submitted that it was logical for users to conclude that the data would be used for purposes such as those mentioned on the Type B Landing Page and Pop Up, all of which pointed towards use to serve the user’s purposes. According to the ACCC, considered as a whole, the screens represented that the location data which continued to be provided to Google would be used to serve the user’s purposes. The ACCC submitted that such a representation was misleading. The personal data about users’ locations that Google continued to collect by reason of the Web & App Activity setting was used for various purposes, including Google’s own purposes.
294 Google submitted that the pleaded distinction (see Concise Statement at [15]) between a “user’s purpose” and “Google’s purpose” was unsatisfactory. According to Google, there was no neat distinction between the two concepts. For example, reasonable users would have understood that the use of personal location data to provide more relevant and personalised advertisements benefitted both users and Google.
295 The ACCC stated that it accepted that the concepts are not mutually exclusive, but submitted that “it is obvious that the use of data by Google for its own purposes does not necessarily serve the interests of the User whose data is being used”. The ACCC gave two examples:
(1) The ACCC submitted that data from a user might be used to personalise ads both for Person A and for Person B. Person A’s data is not being used for the purpose of services to Person A. It is being used for the purpose of services to Person B. That is not a user’s purpose.
(2) The ACCC submitted that an “even more egregious example” is Google’s use of Person A’s data in order to promote services to advertisers. The ACCC submitted that it cannot be said that such a use was for the purpose of providing services to Person A; it was being used to assist Google to make money from advertising.
296 The Concise Statement acknowledged that there was overlap, because “user’s purpose” was defined by reference to the purpose “only” of the user, whereas “Google’s purpose” was not defined by reference to the purpose “only” of Google.
297 I do not accept that Google’s statement that, when Location History was turned “off”, some location data may be saved as part of the user’s activity on other Google services, carried with it an implied representation that such location data would be used exclusively for the purposes of the user. I do not accept that reasonable users would not have known that Google’s obtaining, retention and use of personal data about location served the purposes of Google as well as the user. Reasonable users within the class of user who had made a decision to turn Location History “off” would have assumed that Google was obtaining as much commercial advantage as it could from use of the user’s personal location data.
298 It follows that the ACCC has not made out its case in relation to the Type B Location History Statements.
Type C Location History Statements
299 The “Type C Location History Statements” were made from 16 October 2018 to the date of commencement of these proceedings: Concise Statement at [35]. The relevant steps taken by users and the screens that were accessed were similar to those that arose in connection with the Type B Location History Statements. The ACCC relied on variants of the Type C screens:
Landing Page: Variant 2 of 2 (SB 166); ACCC abandoned reliance on Variant 1 of 2;
Pop Up: the ACCC relied on SB 169-170;
Exit Page: same as Landing Page but with toggle turned “off”.
300 The relevant screens were not sufficiently different from the Type B Landing Pages, Pop Ups and Exit Pages as to warrant any different conclusion from that reached in relation to the Type B Location History Statements.
301 It follows that the ACCC has not made out its case in relation to the Type C Location History Statements.
302 The ACCC has established its case under ss 18, 29(1)(g) and 34 of the ACL in relation to the Type A Location History Statements to the extent indicated at [275]-[277] above. The ACCC has not made out its case in relation to the Type B or Type C Location History Statements.
SCENARIO 3: USERS CONSIDERING WHETHER TO TURN WEB & APP ACTIVITY “OFF”
303 The third aspect of the case was directed to users considering whether to turn Web & App Activity “off”. The ACCC relied upon four different types of Web & App Activity Landing Pages, the second of which had two variants. Two allegations were made in relation to the “Web & App Activity Statements”:
(1) First, the ACCC alleged that Google represented, before October 2018, to those users who were presented with the Type 1 and Type 2 Web & App Activity Statements that having Web & App Activity turned “on” would not allow personal data in relation to the user’s location to be obtained, retained and used by Google: Concise Statement [53].
(2) Secondly, the ACCC alleged that Google, at all material times, represented to those users who were presented with any of the four types of Web & App Activity Statements that Google would only obtain and use location data for the user’s purposes and not for Google’s purposes: Concise Statement [54].
The class of users
304 On the ACCC’s case, the representations were alleged to have been made to users who “wished to consider whether to turn Web & App Activity ‘off’ (or ‘pause’ it) using their Linked Device” – see: Concise Statement at [37] to [46], [53] to [54], [57] and [58]. The relevant user was someone who had turned his or her mind to the issue of whether or not to turn Web & App Activity “off”, but had not yet decided that it should be turned “off”.
305 The pleading does not make it clear why the person wished to consider turning Web & App Activity “off”. One reason might have been that the user learned during the set-up process that Web & App Activity permitted Google to obtain, retain and use location data when the user was using a Google product or service or had come to understand that fact. There may have been other reasons. The ACCC submitted that one type of user by which the case in respect of Web & App Activity could be considered was a user who, during set-up or perhaps otherwise, read what was said about Location History but, for whatever reason, also chose to check what the Web & App Activity Landing Page said about location data.
306 The reasonable user in the class must have known how to obtain access to the Web & App Activity Landing Page, because the relevant user was identified in the Concise Statement as a person who obtained access to the “Web & App Activity” page: Concise Statement at [37]. That access was obtained in various ways – see: Concise Statement at [38]. As mentioned, the relevant users would land on one of four different types of Web & App Activity Landing Pages. These pathways required the user to know:
(1) in respect of the pathway at Concise Statement [38(a)]: that the Web & App Activity setting was to be found within a group of settings entitled “Personal info & privacy”, and within a further sub-group of settings entitled “Activity Controls”;
(2) in respect of the pathway at Concise Statement [38(b)-(c)]: that the Web & App Activity setting was to be found within a sub-group of settings entitled “Data & personalisation”.
307 Google submitted that reasonable members of the class of users, who knew where to find the Web & App Activity setting, should therefore be presumed to have knowledge that the setting was relevant to “Personal info & privacy” or “Data & personalisation”.
308 I accept Google’s submissions in this respect.
The content of the Web & App Activity Landing Pages
309 The content of the various screens is similar. Type 1 is similar in format to Type 2. The text of Type 1 was as follows (emphasis in original):
Web & App Activity
On [Toggle appeared here]
If you use more than one account at the same time, some data may get saved in your default account. Learn more at support.google.com.
Save your search activity on apps and in browsers to make searches faster and get customized experiences in Search, Maps, Now, and other Google products.
Learn more
Include Chrome browsing history and activity from websites and apps that use Google services
Data from this device
Control reporting of App Activity from this device
MANAGE ACTIVITY
310 Type 2 variant 1 of 2 was as follows:
311 Type 2 variant 2 of 2 was as follows:
312 The text of Type 3 was as follows:
Activity Controls
The data saved in your account helps give you more personalised experiences across all Google services. Choose which settings you want to save data in your Google Account.
Web & App Activity [Toggle appeared here]
Saves your activity on Google sites and apps, including associated info like location, to give you faster searches, better recommendations and more personalised experiences in Maps, Search, and other Google services.
Learn more
Include Chrome history and activity from sites, apps and devices that use Google services
MANAGE ACTIVITY
313 The text of Type 4 was as follows:
Activity Controls
The data saved in your account helps give you more personalised experiences across all Google services. Choose which settings you want to save data in your Google Account.
Web & App Activity [Toggle appeared here]
Saves your activity on Google sites and apps, including searches and associated info like location.
Learn more
Include Chrome history and activity from sites, apps and devices that use Google services
MANAGE ACTIVITY
314 The ACCC’s case focussed on two aspects of the information provided by the various screens: first, what data was to be collected (or “saved”) by Google; and secondly, what Google would do with that data.
315 In respect of the first aspect (what data is to be collected by Google), the ACCC noted that:
(1) Type 1 referred to “your search activity on apps and in browsers” and did not identify that this included a user’s location;
(2) Type 2 referred to “your activity on Google sites and apps” and did not identify that this included a user’s location;
(3) Types 3 and 4 both specifically referred to the fact that the data to be saved included data about the user’s location when using Google services.
316 In respect of the second aspect (what Google will do with that data), the ACCC noted that:
(1) Types 1, 2 and 3 referred to search activity being saved in order to provide the user with faster searches and customised/personalised experiences using Google services;
(2) Type 4 referred to the fact that the data saved related to the user’s activity, including for “more personalised experiences” across Google services, but did not provide any other indication of the reason for the data being saved.
What representations were conveyed?
317 The ACCC submitted that two representations were conveyed by the relevant screens:
(1) First, that by making the Type 1 and 2 statements, and not alerting the user to the fact that if Web & App Activity was “on” Google might continue to obtain, retain and use personal data about a user’s location, Google impliedly represented that having Web & App Activity turned “on” would not allow personal data in relation to the user’s location to be obtained, used and retained by Google (“first representation”): Concise Statement [44], [45], [53].
(2) Secondly, that by the Type 1, 2, 3 and 4 statements, Google represented that it would only obtain and use data from users for the purpose of the user’s use of Google’s services and not for Google’s purposes (“second representation”).
First representation: Type 1 and 2 Statements
318 Google submitted that the implied representation was not conveyed and relied in that respect on the following submissions:
(1) First, the relevant landing pages spoke at the beginning of “data” being “saved” in a user’s account. The reference to data was unqualified. The pages did not say in terms that the “data” included location data, but (more importantly) they did not say that “data” excluded location data. According to Google, the ACCC had not explained why the concept of data would be read as excluding location data.
(2) Secondly, the pages stated that Web & App Activity “Save[s] your search activity on apps and in browsers” (Type 1) or “Saves your activity on Google sites and apps” (Type 2). According to Google, a reasonable user would not draw the conclusion that this excluded information about location. The opposite is true. A reasonable user would expect that his or her activity on Google sites and apps could include information about their location. That expectation would be confirmed when the user saw that one of the benefits was “more personalised experience in Maps”. This was a clear indication that location was part of what was being saved by this setting.
(3) Thirdly, the user who “wished to consider whether to turn Web & App Activity ‘off’” (see Concise Statement at [37]) was a user who might reasonably be expected to wish to learn more by clicking on the “Learn more” link. That user was taken to the page shown at SB 224 and following. On that page the user was expressly informed that when Web & App Activity was on, Google saved the user’s location (see SB 225).
(4) Fourthly, such a user would also be likely to click the “Manage My Activity” link (SB 110) from the Web & App Activity Landing Page, which would show the user what activity had been saved by Google as a result of the setting being on and which Google service had saved that information. The My Activity page would reveal to the user what location information had been saved and which Google service had caused that information to be saved.
319 As to (1) above, whilst I accept that some users in the class would have assumed that “data” included data about location, I consider that the class included users who, acting reasonably, would not have considered that “data” had anything to do with information concerning the user’s location.
320 As to (2) above, I accept that some users would have concluded that the statement “Save[s] your search activity on apps and in browsers” or “Saves your activity on Google sites and apps” might include information about location. I also consider that the class included users who, acting reasonably, would not have considered that the statement had anything to do with the user’s location.
321 As to (3) and (4) above, I accept that there were users in the class identified by the ACCC who would have taken the further steps indicated by Google and worked out that Web & App Activity being turned “on” enabled Google to obtain, use and retain location data when a user was using a Google service. Equally, however, I accept that there were reasonable users within the class identified by the ACCC who would not have taken these steps.
322 In my view, the class included users who, acting reasonably, would have formed the view, from what was stated and by the absence of any specific reference to location, that having Web & App Activity turned “on” did not enable Google to obtain, retain and use location data. I accept that Google by its conduct made a representation to that effect.
Second representation: Type 1, 2, 3 and 4 Statements
323 As noted at [317](2) above, the ACCC alleged that the user who wished to consider whether to turn Web & App Activity “off” and navigated to or landed on one of the pages containing the four types of Web & App Activity Statements was not notified or alerted to the fact that if Web & App Activity was left “on”, Google might continue to obtain, retain and use personal data about the user’s location for one or more of Google’s purposes: Concise Statement [46].
324 Google submitted that this allegation turned on the unsustainable distinction between a “User’s Purpose” and “Google’s Purpose”; giving a user “faster searches, better recommendations, and more personalised experiences in Maps, Search and other Google services” (which appeared on Types 2 and 3, and similar text in Type 1) was not for the purpose “only” of the user’s use of Google services.
325 Google also submitted that, given the Web & App Activity Statements all related to screens that a user would access having already set up a Google Account, by the time the relevant user accessed those screens, he or she already knew that his or her use of Google’s products and services was governed by the Terms of Use and that Google processed users’ information as described in its Privacy Policy. Even if the individual user had not read the Privacy Policy and Terms of Use, he or she would have at least understood that those documents existed. Google submitted that, for this reason, the ordinary and reasonable user would not assume that the Web & App Activity Statements were a comprehensive statement of the purposes for which the user’s data could be used by Google. That was the very function of the Privacy Policy and there was no allegation that the Privacy Policy was in any way defective.
326 I do not accept that the Web & App Activity Landing Page conveyed to reasonable users in the class identified by the ACCC an implied representation that Google would only obtain and use data from users for the purpose of the users’ use of Google’s services and not for “Google’s purposes”. I do not accept that reasonable users would not have known that Google’s obtaining, retention and use of personal data served the purposes of Google as well as the user. Reasonable users would have assumed that Google was obtaining as much commercial advantage as it could from use of the user’s personal data.
Were the Web & App Activity Statements misleading?
327 The ACCC submitted that the first representation was misleading because if the Web & App Activity setting was turned “on”, Google continued to collect and store such data. I accept this submission.
328 If this representation had been conveyed (contrary to the conclusion I have reached at [326]) it would have been misleading in the limited sense that it is conceivable that Google could use personal data for a purpose which could be seen to benefit only Google and not the user.
329 I accept that by the Type 1 and 2 statements, Google’s conduct as a whole represented to reasonable users in the class identified by the ACCC that having Web & App Activity turned “on” would not allow personal data in relation to the user’s location to be obtained, retained and used by Google. Reasonable users in this class included users who were looking to prevent or limit Google from obtaining, retaining and using data about the user’s location when using Google products or services. There were users in the class who, acting reasonably, would have incorrectly come to the conclusion from reading the Type 1 and 2 screens that:
(1) Web & App Activity was not the relevant setting which prevented personal data in relation to the user’s location being obtained, retained and used by Google;
(2) having the setting turned “on” would not allow personal data in relation to the user’s location to be obtained, retained and used by Google.
330 I am satisfied that Google’s conduct assessed as a whole was misleading or deceptive of, or likely to mislead or deceive, ordinary members within the class identified by the ACCC, acting reasonably. I conclude that Google’s conduct assessed as a whole conveyed a representation that having Web & App Activity turned “on” would not allow Google to obtain, retain and use personal data about the user’s location – see Concise Statement at [53]. I therefore conclude that Google LLC breached s 18 of the ACL.
331 I am also satisfied that Google’s conduct in this respect was “liable to mislead the public” in that there was a probability that members within the class identified by the ACCC, acting reasonably, were misled “as to the nature, the characteristics, [or] the suitability for their purpose … of … services”. I therefore conclude that Google LLC breached s 34 of the ACL.
332 Further, I would draw the inference that, within the class identified by the ACCC, there were users who, acting reasonably, were in fact misled by a representation (which I find Google made through what it stated and what it did not state) that having Web & App Activity turned “on” would not allow Google to obtain, retain and use personal data about the user’s location – see Concise Statement at [53]. I therefore conclude that Google LLC breached s 29(1)(g) by representing that the Google services had “performance characteristics”, “uses” or “benefits” which they did not have. Such representations were “false or misleading” within the meaning of s 29(1)(g).
333 I do not accept that Google’s conduct represented that Google would only obtain or use personal data about the user’s location for the user’s purposes – see: Concise Statement at [55].
334 The ACCC alleged that GAPL contravened ss 18, 29(1)(g) and 33 or 34 of the ACL on the basis that GAPL passed on or adopted the conduct and representations made by Google LLC. The ACCC relied on the following statement made by French CJ, Crennan and Kiefel JJ in Google Inc at [15] (citations omitted):
[T]he question whether a corporation which publishes, communicates or passes on the misleading representation of another has itself engaged in misleading or deceptive conduct will depend on whether it would appear to ordinary and reasonable members of the relevant class that the corporation has adopted or endorsed that representation. It has also been established that, if that question arises, it will be a question of fact to be decided by reference to all the circumstances of a particular case.
335 The ACCC submitted that GAPL passed on or adopted Google LLC’s conduct and representations because:
(1) GAPL was responsible for the supply of Pixel phones directly to consumers in Australia, and to third-party suppliers in Australia;
(2) The Pixel phones had the Android OS and Google Mobile Services pre-installed; and
(3) The screens to which these proceedings relate referred to “Google” generally and there was no reason to think that ordinary and reasonable members of the relevant class of users would distinguish between Google LLC and GAPL in this respect.
336 The ACCC submitted that GAPL would be understood as having adopted or endorsed the conduct and representations made by Google LLC concerning Location History and Web & App Activity in the settings of the phone.
337 In my view, it would appear to ordinary and reasonable members of the relevant class that GAPL, being the wholly owned subsidiary of Google LLC, adopted and endorsed the conduct and representations of its parent. GAPL’s circumstances are not equivalent to the circumstances of a third party seller of products which might not be regarded as endorsing statements made by an unrelated manufacturer.
338 Contrary to Google’s submission, it is not to the point that the Terms of Service expressly stated that the Google products and services were provided by Google LLC. The ordinary and reasonable members of the relevant classes would not have read the Terms of Service and, even if they had, they would nevertheless have understood GAPL to adopt and endorse the representations made by Google LLC.
339 In relation to Google LLC’s breach of s 29(1)(g), Google submitted that it could not be said that GAPL made any of the alleged representations “in connection with the supply” of the Pixel phones because the representations were “only logically made to [u]sers who had already purchased, and commenced to use their Pixel phones”.
340 In the circumstances of this case, it does not matter that the representations were made after the relevant consumers took possession of the relevant devices. Section 29(1)(g) prohibits false or misleading representations “in connection with” the supply of goods or services. The words “in connection with” are of wide import. It may be accepted, as Google submitted, that the representations were “only logically made to [u]sers who had already purchased, and commenced to use their Pixel phones”, but the representations were still made “in connection with”: (a) the supply of the phones (being goods); and (b) the services provided by and through those phones via the pre-installed software. The words “in connection with” do not imply a necessary temporal limitation that the representations occur before or at the time of supply of goods or services. I do not understand Lindgren J in Monroe Topple & Associates Pty Ltd v Institute of Chartered Accountants [2001] FCA 1056 at [260] to have intended to state otherwise by his identification of potential synonyms for the phrase “in connection with”; see also Commissioner of Taxation v Hart (2004) 217 CLR 216 at [91]. For example, representations might be made after a supply of goods in order to dissuade a consumer from returning the goods. Such a representation would likely be “in connection with” the supply of the goods. That is not to say that representations made after the act of supply might not lose the necessary “connection” or “relationship” with the supply because of the timing of the representation; it all depends on the particular facts.
341 The ACCC has established breaches or contraventions of ss 18, 29(1)(g) and 34 of the ACL to the extent indicated above. The parties should confer with a view to agreeing orders to reflect the conclusions reached and agreeing the appropriate further steps.
I certify that the preceding three hundred and forty-one (341) numbered paragraphs are a true copy of the Reasons for Judgment of the Honourable Justice Thawley.
Associate: