Ari Ezra Waldman*
Yet, privacy policies are confusing, inconspicuous, long, and difficult to understand. They are also ineffective: most people never read them, and even experts find them misleading. And, as I argue elsewhere, privacy notices are often designed, displayed, and presented to users in ways that make their substance even more inscrutable. For example, many are written in grey tones on white backgrounds, in small font sizes and single-spaced text, without white space or noticeable headings. And even aesthetically pleasing designs can be deployed to trick confused consumers into making risky privacy choices.
This Essay takes a further step in a developing research agenda on the design of privacy policies. As described in more detail in Part II, I created an online survey in which respondents were asked to choose one of two websites that would better protect their privacy given images of segments of their privacy policies. Some of the questions paired notices with, on the one hand, privacy protective practices displayed in difficult-to-read designs, and, on the other hand, invasive data use practices displayed in graphical, aesthetically pleasing ways. Many survey respondents seemed to make their privacy decisions based on design rather than substance. Furthermore, using statistical modeling, this Essay shows that increased knowledge about the legal implications of privacy policies is associated with lower odds of being confused by aesthetically pleasing designs. Although this study is subject to certain limitations, all of which are discussed at the end of Part II, it suggests several avenues for future research and several ways policymakers can improve the efficacy of notice-and-choice.
I. The Importance of Design
The law has occasionally recognized that the design of legal documents is an important part of validity and transparency. In Carnival Cruise Lines, Inc. v. Shute, for example, Justice Stevens argued in dissent that a forum selection clause written in tiny print on the back of a passenger ticket should not be enforceable because it was designed in a way to give consumers “little real choice.” Similarly, the D.C. Circuit held that incomprehensible design, typified by tiny fine print, could make a contract unconscionable. And states have passed laws with design requirements. South Carolina requires employers to design disclaimers in employee handbooks so that they stand out. California prescribes both the design and content of arbitration agreements in the name of enhancing understanding, transparency, and comprehension.
The executive branch has taken notice, too. The Securities and Exchange Commission has a Plain English Handbook that requires individuals to design documents in aesthetically pleasing ways so investors and other members of the public can understand them. The Consumer Financial Protection Bureau (CFPB) has gone even further. Its Design+Technology program recruited graphic designers to, among other things, create “[d]esign tools that enable millions of people to make informed financial choices.” And it follows an open source design manual for its own documents. This manual, which provides guidance on everything from the CFPB color palette to typography and different types of icons, is used to create “[h]onest, transparent design that wins the public’s trust” and empowers users.
Nor have the design and aesthetics of privacy policies gone entirely unnoticed. In 2001, for example, former FTC Commissioner Sheila Anthony called for a “standard format” for privacy policies along the lines of the Nutrition Labeling and Education Act’s standard format for food labels. Commissioner Anthony recognized that inconsistent and confusing policy design was preventing consumers from becoming aware of their data privacy rights. This was one of the reasons why implementing regulations of the Gramm-Leach-Bliley Act (GLBA), which regulates certain financial information, included some voluntary standardized notice design elements. In a report on how to comply with the California Online Privacy Protection Act, the California Attorney General’s Office included a recommendation that policies be drafted in “a format that makes the policy readable, such as a layered format.” In response, the International Association of Privacy Professionals suggested “us[ing] graphics and icons in [ ] privacy policies to help users more easily recognize privacy practices and settings.” California also went so far as to recommend that companies publish two different policies, one that is easy to read and geared toward ordinary consumers and another for regulators.
A. Research Design and Methodology
I designed a survey that asked respondents to choose one website over another based solely on images of privacy policies. The survey was created using Google Forms and conducted through Amazon Mechanical Turk. A total of 513 unique Turkers took the survey. Eighteen responses were eliminated from the analysis due to missing or incomplete data.
The first part of the survey asked for basic demographic data. Respondents listed their age, gender, income, and education level, how much time they spend online per day, and to what extent they read privacy policies. They were then asked to select the social networking websites on which they maintain active profiles, where “active” referred to any website that respondents viewed or updated regularly. Ten of the most popular social networks were listed; the eleventh option was an “other” category. Time online and number of social networking profiles help assess how “networked” an individual is—significant time online per day and a high number of active profiles may be correlated with increased digital savviness.
To test the impact of design, the third part of the survey varied designs, but kept the underlying data use practices constant. The final part changed designs and data use practices. Sometimes, designs were paired with privacy protective practices; in other questions, the designs displayed highly invasive practices. The pairs were mixed and matched.
The sample population can be characterized as follows: there were 495 valid responses, of which 39.8% (197) identified as female and 60% (297) identified as male. College graduates made up 45.3% of the sample, and those with postcollege advanced degrees constituted an additional 12.5%, for a total of 286 respondents. Income levels varied: 32.7% earned under $30,000 per year; 23.2% earned between $30,000 and $50,000 per year; 24.2% earned between $50,001 and $75,000; and 19.8% earned $75,001 or above. More than 82% of the sample reported that they are online more than three hours per day. The sample was also relatively networked. Nearly half of the respondents maintain active profiles on three or more social networking sites.
The majority of respondents concede that they never (16.2%) or rarely (43%) read privacy policies. Another 32.1% suggest that they “sometimes” read privacy notices. Fewer than 9% of respondents do so “always” or “often.” Finally, a majority of the sample exhibited incomplete or inadequate knowledge of the legal implications of privacy policies: 57.8% answered incorrectly; 30.1% answered True to one correct statement, while 12.1% answered True to both correct statements.
In two questions, the survey asked users to choose between policies with identical substance, but different designs: 74.5% and 68% of respondents recognized that the policies were the same. Sizeable majorities were expected here, as it is easy to compare identical language in side-by-side images.
Survey respondents then had two opportunities to choose between an invasive policy designed with a readable, modern aesthetic and a privacy protective policy presented in the traditional way. The first question offered the following choice:
Similarly, when given the choice between another set of policies that paired a graphically designed notice with invasive practices (Figure 3), on the one hand, and a traditionally designed policy with protective practices (Figure 4), on the other, the sample split down the middle, with 49.1% choosing Figure 4. Again, knowledge of privacy law was a statistically significant predictor of identifying the stronger privacy protections.
Specifically, in this question, the odds of identifying the privacy protective practices were 1.81 times higher for those who understood the legal implications of privacy policies than for those who did not. These results are also displayed in Table 1.
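For readers unfamiliar with odds ratios, a bit of arithmetic shows what a figure like 1.81 means in terms of probabilities. The sketch below is purely illustrative and is not the Essay’s actual regression model; the 49.1% baseline is borrowed from the sample split reported above only to make the example concrete.

```python
def apply_odds_ratio(baseline_prob, odds_ratio):
    """Convert a baseline probability into the probability implied by
    multiplying the baseline odds by a given odds ratio."""
    baseline_odds = baseline_prob / (1 - baseline_prob)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

# Hypothetical illustration: if 49.1% of less knowledgeable respondents
# chose the protective policy, an odds ratio of 1.81 implies that roughly
# 63.6% of knowledgeable respondents did so.
p = apply_odds_ratio(0.491, 1.81)
print(round(p, 3))
```

The key point of the arithmetic is that an odds ratio multiplies odds, not probabilities, so a 1.81 odds ratio does not mean respondents were 81% more likely in absolute terms to choose correctly.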
C. Discussion and Limitations
The data suggests that greater awareness of the legal implications of privacy notices is associated with a more discerning approach to interpreting those policies. This makes sense. More than twenty years ago, Alan Westin suggested that “privacy fundamentalists,” or those who value privacy highly, are more active about protecting their information than the “privacy unconcerned,” or people who have few qualms about giving over personal information to others. Assuming Westin was, and still is, correct, greater concern about privacy is likely to translate into greater self-education, which, in turn, will likely result in more effective decisionmaking.
This suggests that if policymakers would like to enhance internet users’ ability to make discerning privacy choices under the notice-and-choice regime, the most effective steps involve greater education. Granted, privacy policies must be readable. They also must be designed with real users in mind. But the data presented here suggests that in addition to improving the transparency of the policies themselves, greater public education about privacy and privacy notices can improve consumers’ ability to interact with those policies and make the choices they want.
That said, this study is subject to certain limitations. A sample set of approximately 500 respondents is adequate but still relatively small. Additional research is necessary to replicate this study on a larger scale. Furthermore, knowledge of privacy law could be measured in different ways. For ease of statistical analysis, anyone who marked a false statement as “True” was considered to lack knowledge of the legal implications of privacy policies. Only those who marked either one or both true statements as “True,” without any others, were categorized as knowledgeable. Although this strategy accurately reflects knowledge on a dichotomous scale, it misses nuance and partial accuracy.
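The dichotomous coding rule described above can be stated precisely. The short sketch below is a hypothetical reconstruction of that rule, not the survey’s actual instrument; the statement labels are invented for illustration.

```python
# Hypothetical labels for the two legally correct statements in the
# knowledge question; any other statement marked "True" was incorrect.
TRUE_STATEMENTS = {"stmt_1", "stmt_2"}

def coded_knowledgeable(marked_true):
    """A respondent is coded as knowledgeable only if they marked at least
    one correct statement 'True' and marked no incorrect statement 'True'."""
    marked = set(marked_true)
    return len(marked & TRUE_STATEMENTS) >= 1 and marked <= TRUE_STATEMENTS

print(coded_knowledgeable({"stmt_1"}))            # knowledgeable
print(coded_knowledgeable({"stmt_1", "stmt_3"}))  # not knowledgeable
```

As the Essay notes, this binary coding is convenient for statistical analysis but discards partial accuracy: a respondent who marked one true and one false statement is grouped with respondents who answered entirely incorrectly.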
The design of privacy policies affects users’ ability to comprehend the substance of those policies. Design can make information more readable and understandable; it also can obscure, confuse, and manipulate. Design is not neutral. This Essay adds to the scholarship on the relationship between notice design and user comprehension by showing that greater awareness of the legal implications of privacy policies is associated with more discerning approaches to interpreting those policies. In particular, those internet users who correctly identified certain facts about the law of notice-and-choice were statistically more likely to identify privacy policies that offered stronger privacy protections in spite of manipulative design strategies. This suggests that in addition to mandating improvements in notice readability and design, policymakers should commit themselves to educating the public about privacy law basics and the legal implications of privacy notices.
© 2018 Ari Ezra Waldman. Individuals and nonprofit institutions may reproduce and distribute copies of this Essay in any format, at or below cost, for educational purposes, so long as each copy identifies the author, provides a citation to the Notre Dame Law Review Online, and includes this provision and copyright notice.
* Associate Professor of Law and Director, Innovation Center for Law and Technology, New York Law School. Ph.D., Columbia University; J.D., Harvard Law School. Affiliate Scholar, Princeton University, Center for Information Technology Policy. A few paragraphs of this Essay were adapted from Ari Ezra Waldman, Privacy, Notice, and Design, 21 Stan. Tech. L. Rev. 74 (2018), but the study and analysis are entirely new.
 “Notice-and-choice” refers to the legal regime whereby web platforms are required to tell consumers what information they collect, how and for what purpose they collect it, and with whom they share it (notice). See Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 Colum. L. Rev. 583, 592 (2014). Consumers then have the opportunity to opt out (choice). See id. This Essay leaves to one side the broader debate over whether privacy law should maintain or replace notice-and-choice. Rather, it accepts notice-and-choice as the current approach to consumer privacy law and seeks to improve notice within that regime.
 See Chris Jay Hoofnagle, Federal Trade Commission Privacy Law and Policy 145–305 (2016) (discussing the FTC’s regulation of privacy); Solove & Hartzog, supra note 1, at 627–66 (discussing the FTC’s jurisprudence on the new common law of privacy).
 For example, the Gramm-Leach-Bliley Act requires certain financial institutions to explain their data collection and use practices to their customers. The policy must state what information is collected, the names of affiliated and outside third parties with whom information is shared, which data is shared with them, and how to opt out. 15 U.S.C. §§ 6803(a)(1)–(2) (2012); 16 C.F.R. §§ 313.6(a)(3), (6) (2018). The Children’s Online Privacy Protection Act, which guards against unauthorized use, collection, and dissemination of information of children thirteen years old and younger, requires certain child-oriented websites to post privacy policies with what data they collect, whether it is obtained actively or passively, how it will be used, whether it will be shared with others, and how to delete data or opt out of collection. 15 U.S.C. §§ 6502(b)(1)(A)(i)–(ii) (2012). For a more comprehensive list of consumer privacy statutes, see Daniel J. Solove & Paul M. Schwartz, Information Privacy Law 37–39 (4th ed. 2011).
 See Joel R. Reidenberg et al., Disagreeable Privacy Policies: Mismatches Between Meaning and Users’ Understanding, 30 Berkeley Tech. L.J. 39, 40, 87–88 (2015) (“[A]mbiguous wording . . . undermines the ability of privacy policies to effectively convey notice of data practices to the general public.”).
 See, e.g., George R. Milne & Mary J. Culnan, Strategies For Reducing Online Privacy Risks: Why Consumers Read (or Don’t Read) Online Privacy Notices, 18 J. Interactive Marketing 15 (2004); Jonathan A. Obar & Anne Oeldorf-Hirsch, The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services 19–22 (Aug. 24, 2016) (unpublished paper), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757465.
Although most privacy policies were displayed in black text on white backgrounds, 35% were written in grey on white. Half of those greys were light-to-medium (40%–60% opaque). The median font size was 11: nearly 20% were written in the median size (n=37), which is roughly the same number of policies that were written in size seven or eight. All the policies reviewed included headings and subheadings for their sections, but nearly half of those headings were written in the same font size and color.
Id. at 82.
 Williams, 350 F.2d at 449–50; see also In re RealNetworks, Inc., Privacy Litig., No. 00-C-1366, 2000 WL 631341, at *5 (N.D. Ill. May 8, 2000) (“[B]urying important terms in a ‘maze of fine print’ may contribute to a contract being found unconscionable . . . .”).
 See S.C. Code Ann. § 41-1-110 (2016) (“[A] disclaimer in a handbook or personnel manual must be in underlined capital letters on the first page of the document and signed by the employee. For all other documents referenced in this section, the disclaimer must be in underlined capital letters on the first page of the document.”).
 See Cal. Civ. Proc. Code § 1295(b) (West 2016) (“Immediately before the signature line provided for the individual contracting for the medical services must appear the following in at least 10-point bold red type: ‘NOTICE: BY SIGNING THIS CONTRACT YOU ARE AGREEING TO HAVE ANY ISSUE OF MEDICAL MALPRACTICE DECIDED BY NEUTRAL ARBITRATION AND YOU ARE GIVING UP YOUR RIGHT TO A JURY OR COURT TRIAL. SEE ARTICLE 1 OF THIS CONTRACT.’”).
 Chris Willey, Design+Technology Fellows: Changing the Way Government Works, CFPB: Blog (June 21, 2012), https://www.consumerfinance.gov/about-us/blog/designtechnology-fellows-changing-the-way-government-works/.
 Id. (“If the goal of the industry’s self-regulatory efforts is to provide informed consent for consumers, it has failed. . . . As a general rule, privacy policies are confusing, perhaps deliberately so, and industry has no incentive to make information sharing practices transparent. If privacy policies were presented in a standard format, a consumer could more readily ascertain whether an entity’s information sharing practices sufficiently safeguard private information and consequently whether the consumer wishes to do business with the company.”). But see Gill Cowburn & Lynn Stockley, Consumer Understanding and Use of Nutrition Labelling: A Systematic Review, 8 Pub. Health Nutrition 21 (2005) (arguing that standardized labeling does not alleviate all comprehension problems).
 See, e.g., Nat’l Telecomm. & Info. Admin., Short Form Notice Code of Conduct to Promote Transparency in Mobile App Practices (2013), https://www.ntia.doc.gov/files/ntia/publications/july_25_code_draft.pdf; Patrick Gage Kelley et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach, Carnegie Mellon Univ. CyLab (2010), http://repository.cmu.edu/cgi/viewcontent.cgi?article=1002&context=cylab.
 See Lorrie Faith Cranor et al., Are They Actually Any Different? Comparing Thousands of Financial Institutions’ Privacy Practices, The Twelfth Workshop on the Econs. of Info. Sec. (2013), http://www.econinfosec.org/archive/weis2013/papers/CranorWEIS2013.pdf.
 See Aleecia M. McDonald et al., A Comparative Study of Online Privacy Policies and Formats, in Privacy Enhancing Technologies: 9th International Symposium 37, 49–50 (Ian Goldberg & Mikhail J. Atallah eds., 2009).
 Several studies have shown that Amazon Turk offers researchers a random sample of respondents with a demographic distribution roughly comparable to the United States population. See, e.g., Tara S. Behrend et al., The Viability of Crowdsourcing for Survey Research, 43 Behav. Res. Methods 800 (2011); Gabriele Paolacci et al., Running Experiments on Amazon Mechanical Turk, 5 Judgment & Decision Making 411 (2010).
 These numbers likely suffer from response biases. Individuals are often disinclined to admit that they do not do things they know or perceive they really should. See generally Eunike Wetzel et al., Response Biases, in The ITC International Handbook of Testing and Assessment 349–63 (Frederick T.L. Leong & Dragos Iliescu eds., 2016).
 Notably, age was a significant factor here, as well. The data suggests that every one-year increase in age is associated with 1.025 times greater odds of seeing through design differences to identify the privacy protective practices. Age was not a significant factor anywhere else in this study, suggesting that it does not play a strong role overall.
 See Opinion Surveys: What Consumers Have to Say About Information Privacy: Hearing Before the Subcomm. on Commerce, Trade & Consumer Prot. of the H.R. Comm. on Energy & Commerce, 107th Cong. 15–16 (2001) (statement of Alan F. Westin, Professor Emeritus, Columbia University, President, Privacy and American Business). Westin referred to everyone else as “privacy pragmatists,” or those who make case-by-case privacy decisions based on midlevel concern about privacy and average distrust in government, business, and technology. See id. at 16.