By: Michael Mellon*
I. Introduction
In the 1980s, a form of advertising known as direct mail began to prove itself as a lucrative revenue stream, generating an estimated $140 billion in sales in 1982.[1] Changes in mail, including work-sharing[2] and the expanded use of personalized marketing,[3] propelled this business model. While revolutionary for the time, the direct mail practices of that decade are like mass spam campaigns by today’s standards.[4]
A little over a decade later, the internet evolved into a consumer network and found its way into homes across the United States.[5] With growing connectivity came a new means of reaching customers and an opportunity to fine-tune consumer marketing.[6] Websites began collecting increasing amounts of data from visitors to gain insight into exact consumer preferences.[7] Companies emerged with the sole purpose of helping other businesses collect and analyze digital information to predict future behavior.[8] Marketing became personalized in ways that it had never been before.[9] There is, however, a tradeoff implicit in that shift toward individualized marketing.[10] Consumer data privacy interests began to suffer unnecessary exposure, all in the name of personalization.[11]
Personal information in the data privacy context is often referred to as “personally identifiable information” (“PII”) or “personal data.”[12] Personally identifiable information is defined differently depending on the jurisdiction and industry in question.[13] “Personal data” generally may be thought of as any information not publicly available that can be utilized to identify a person.[14] This might include specific information such as a social security number, home address, or bank account number, or a collection of information that, in the aggregate, can be used to identify someone.[15] In some jurisdictions, personally identifiable information may fall under this more generalized term of “personal data.”[16]
Consumers have grown accustomed to sharing this sort of personal information online.[17] Many internet users have at some point provided payment information to make purchases, provided their address to receive purchases, provided identifying information to sign up for services, and more.[18] This personal data makes transactions more seamless while allowing companies to provide a product tailored to individuals.[19] It is information vital to the internet economy. Yet, the methods of collecting and using this personal information are questionable.[20]
Many individuals are unaware of the extent to which businesses track their data.[21] It is far from a simple matter of retaining login information to make the next visit to a website easier. Virtually all forms of communication, information, and contact data are potentially valuable and may be subject to retention.[22] This raises significant concerns regarding consumer choice and privacy,[23] especially given that personal data is highly sought after by nefarious actors.[24]
Personal data may as well be black market gold[25]—it contains all the necessary ingredients to commit identity theft, extortion, and more.[26] As a result, whether it be through opportunistic hacks, sophisticated breaches committed with the aid of state actors, or something else, data is compromised at an increasing rate. For example, the Identity Theft Resource Center reported 3,205 instances of compromised data in 2023 alone, a 78% increase from 1,801 instances in 2022.[27]
Unfortunately, these data breaches are nothing new. Over the years, there have been several instances where companies failed to adequately secure personal data, harming numerous data subjects.[28] When a breach occurs, all fifty states require the issuance of breach notifications in some form.[29] However, the law has historically failed to adequately address data protection and related privacy concerns.[30] While the federal government has made attempts to address this issue, none have come to fruition.[31]
II. Federal Data Privacy’s Failure to Launch
In 2022, the American Data Privacy and Protection Act (“ADPPA”) sought to tackle federal data privacy, intending to regulate the way companies collect, process, and retain personal data.[32] The ADPPA had promise as it was the first comprehensive data privacy bill to make it past committee.[33] However, the ADPPA never made it to either the House or Senate floor for a vote.[34]
Privacy advocates saw their hopes restored with the introduction of the American Privacy Rights Act (“APRA”) on April 7, 2024.[35] The bill was ushered in with bipartisan, bicameral support by Republican Representative Rodgers and Democratic Senator Cantwell as co-sponsors.[36] Both sponsors chair the committees most concerned with privacy, the House Committee on Energy and Commerce and the Senate Committee on Commerce, Science, and Transportation, respectively.[37] The APRA sought to provide consumers with certain rights related to their data, including the rights to access, correct, delete, and export their data; to know to whom their data is transferred and why; and to have requested services fulfilled within specific timeframes.[38] The APRA, if enacted, would bring uniform privacy regulation to the patchwork of state laws currently in existence, significantly benefiting citizens seeking to protect their personal information.[39] There is just one issue—progress on the APRA appears to have stalled.[40]
The House Committee on Energy and Commerce scheduled a markup of the bill for June 27, 2024.[41] Minutes before the markup was set to begin, congressional staff announced it was not happening.[42] The cancellation is said to have stemmed from a disagreement between Representative Rodgers and Republican leadership.[43] However, there has been no noteworthy news on the bill since. Lawmakers likely shifted their focus to the 2024 election and, consequently, early-stage legislation like the APRA became at risk of being left by the wayside.[44] While it may be early to call the bill “dead in the water,” history suggests the APRA’s chances of success are evaporating rapidly.[45] Unfortunately, a trend is emerging at the federal level. Like a flash and a bang, federal data privacy legislation draws much fanfare upon introduction. Yet, in an instant, it disappears, leaving states to fill the void.[46]
III. The Maryland Approach
While the APRA may be floundering in Washington, D.C., Maryland has taken action to protect consumer data.[47] On April 8, 2024, Maryland passed House Bill 567, the Maryland Online Data Privacy Act (“MODPA”).[48] MODPA will implement a number of protections for consumer data and place certain guardrails to regulate the collection of that data.[49] On the same day, Maryland passed House Bill 603, known as the Maryland Kids Code (the “Maryland Kids Code” or “Kids Code”).[50] The Kids Code seeks to better protect Maryland’s youth by prohibiting the tracking of individuals under the age of 18 and discouraging the use of “manipulative techniques” to keep children engaged online.[51] Another privacy bill recently proposed in Maryland is H.B. 645, which, if enacted, would allow minors featured in social media videos to seek compensation in certain situations and provide them with a right to request the deletion of said content upon reaching the age of majority.[52]
Collectively, this recent legislative activity demonstrates that Maryland is taking the right to data privacy seriously. Lawmakers are not waiting to see what happens on the other side of the Potomac; they are leading the charge. As a result, Maryland citizens are poised to enjoy greater rights in the privacy context.[53]
a. The Maryland Online Data Privacy Act
The Maryland Online Data Privacy Act (“MODPA”) goes into effect on October 1, 2025.[54] The law applies to entities that conduct business in Maryland or provide goods or services to Maryland residents and that, in the previous calendar year, either “[c]ontrolled or processed[55] the personal data[56] of at least 35,000 consumers” or “[c]ontrolled or processed[57] the personal data of at least 10,000 consumers[58] and derived more than 20% of its gross revenue from the sale of personal data.”[59] Generally, MODPA requires a Maryland consumer’s affirmative consent for qualifying entities to process their personal data.[60] It also provides consumers with certain rights concerning this data processing, including the rights to know whether an entity is processing the consumer’s personal data; to access any personal data processed; to correct any inaccuracies in the processed data; to require deletion of the processed data unless retention is required by law; to obtain an easily portable copy of the consumer’s personal data processed through automation; to obtain a list of the categories of third-party entities with which the consumer’s personal data has been shared; and to opt out of various forms of processing.[61] Additionally, MODPA requires a qualifying entity to comply with a consumer’s exercise of these rights within forty-five days unless an extension is requested or an exception to the compliance requirement applies.[62] The law further requires an entity to establish a process for consumers to appeal when the entity denies an exercise of these rights.[63] Much of this information must be set out in an easily understood privacy policy.[64]
Further, MODPA requires entities that collect consumers’ personal data to: (1) limit data collection “to what is reasonably necessary and proportionate to provide or maintain a specific product or service requested by the consumer;” (2) limit the collection or processing of sensitive data[65] to what is “strictly necessary to provide or maintain a specific product or service requested by the consumer;” and (3) cease the sale of sensitive data.[66] These provisions are said to be among the most restrictive state data privacy regulations to date.[67]
Additionally, MODPA contains several provisions that strengthen security protections for consumer data. Controllers[68] must implement “reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data appropriate to the volume and nature of the personal data at issue.”[69] Controllers also must enter into contracts with processors containing certain requirements to ensure that those processors process data in an ethical and secure way.[70] Moreover, controllers must regularly conduct data protection assessments on “processing activities that present a heightened risk of harm to a consumer,” weighing the benefits of the activity against its potential harm.[71] The Attorney General’s Division of Consumer Protection (the “Division”)[72] may later evaluate these assessments to determine whether an entity has been compliant with MODPA.[73] These assessments may also be used in a subsequent enforcement action against an entity.[74]
b. The Maryland Kids Code
The Maryland Kids Code went into effect on October 1, 2024.[75] Generally, covered entities under the law include those that have annual revenue exceeding $25 million, process the personal data of 50,000 or more consumers, or derive at least 50% of their annual revenue from the sale of personal data.[76] However, the Kids Code specifically applies to covered entities that process children’s data or provide an online product “reasonably likely to be accessed by children.”[77]
The Kids Code “seeks to limit data that could be collected from children online and protect them from being flooded with harmful material they were not trying to find.”[78] To achieve this statutory purpose, a substantial portion of the law is dedicated to requiring data protection impact assessments (“DPIAs”) that determine whether an entity’s online interaction with a child could cause various forms of harm.[79] These DPIAs also must state any steps that an entity has taken or will take to comply with the law “in a manner consistent with the best interests of children.”[80] These assessments must be retained for as long as the analyzed product or service is likely to be accessed by children, reviewed within ninety days of any “material changes to processing pertaining to the online product,” and provided for review within seven days of a request by the Division.[81] The law also requires relevant covered entities to implement privacy settings by default and to provide an easily understood privacy policy.[82] Furthermore, the law prohibits covered entities from processing child-related data in excess of what is reasonably necessary to achieve the purpose for which it was collected.[83]
IV. A Lingering Question: Are the Kids Alright?
Together, MODPA and the Kids Code will greatly enhance the privacy interests of Maryland residents. These laws should be lauded as substantial advancements for the privacy rights of the consumer. Yet, within the shadow of these monumental bills sits House Bill 645 (“H.B. 645”), which implicitly raises an emerging youth privacy issue left unaddressed in Maryland.[84] While H.B. 645 has seen no movement since February 2024,[85] the bill’s proposal is worth brief commentary, particularly in light of the rise in family vlogging.[86]
Consent is one of the principal requirements for processing data under MODPA.[87] MODPA defines consent as “a clear affirmative act signifying a consumer’s freely given, specific, informed, and unambiguous agreement to allow the processing of personal data relating to the consumer for a particular purpose.”[88] MODPA also defines personal data as “any information that is linked or can be reasonably linked to an identified or identifiable consumer.”[89] However, excluded from that definition is publicly available information “that a person . . . reasonably believes a consumer or widely distributed media has lawfully made available to the general public; or . . . if the consumer has not restricted the information to a specific audience, obtains from a person to whom the consumer disclosed the information.”[90]
In addition to its limitations on the processing of personal data, MODPA includes specific restrictions on “sensitive data.”[91] Sensitive data includes, among other things, “personal data of a consumer that the controller knows or has reason to know is a child.”[92] It is reasonable to infer that the child-specific personal data included in the definition of sensitive data is subject to the same exceptions that apply to personal data under the statute.[93] Thus, consensually distributed media depicting a child would be considered publicly available data and would likely not constitute personal data or sensitive data.
MODPA notes that “controllers and processors that comply with the verifiable parental consent requirements of [the Children’s Online Privacy Protection Act (“COPPA”)] shall be considered compliant with an obligation to obtain parental consent in accordance with this subtitle with respect to a consumer who is a child.”[94] The Federal Trade Commission (“FTC”) enforces COPPA under the Children’s Online Privacy Protection Rule.[95] There, it notes only that:
Obtaining verifiable consent means making any reasonable effort (taking into consideration available technology) to ensure that before personal information is collected from a child, a parent of the child: (1) Receives notice of the operator’s personal information collection, use, and disclosure practices; and (2) Authorizes any collection, use, and/or disclosure of the personal information.[96]
The FTC does not otherwise explicitly state a required method of obtaining such parental consent.[97]
Accordingly, it is reasonable to conclude that a guardian may post personal information in the form of videos, pictures, and commentary about their children[98] on social media and other online sites, where it can then be disseminated, provided the guardian gives consent in accordance with COPPA. This posting would be deemed a consensual sharing of the child’s information despite the child having no real say in the matter.
This notion of the guardian acting on the child’s behalf aligns with societal thinking.[99] Children cannot make legally binding decisions, and they should not be expected to know what is—or is not—in their best interests.[100] This essay does not intend to suggest that people should be prohibited from showing their children and their accomplishments on social media. Rather, it merely points out the inevitable: Kids grow up eventually.
Most children will reach the age of majority, rendering them capable of making their own decisions. Yet, under a fair reading of Maryland’s new privacy laws, children will remain bound by the choices that their parents made concerning the social media publication of their information.[101] They will have no discernable ability to require the deletion of the information that was shared about them because it was consensually[102] made public by their guardians and is no longer afforded heightened protection.[103]
These publications related to childhood can contain traditionally private information, and their dissemination can have an impact on subjects well into adulthood.[104] This is the issue intrinsically raised by Maryland’s H.B. 645, which seeks to afford some recourse, at least in limited circumstances.[105] The bill would permit individuals depicted as children in videos on social media to request deletion of that content upon reaching the age of majority.[106] H.B. 645 implicitly recognizes the permanence and long-term impact that social media posts of children can have and intends to offer an escape.
Despite good intentions, H.B. 645 seems unlikely to make a resurgence. While the general premise behind H.B. 645 may appeal to some, such a state law would be an outlier, setting aside California’s analogous Eraser Law.[107] Furthermore, if recent jurisprudence is any indication, it is likely that an enacted H.B. 645 would be subject to legal attack on various grounds, including First Amendment challenges.[108] Finally, H.B. 645 seems unlikely to gain renewed momentum now that MODPA and the Kids Code have been codified. Those laws significantly advance the privacy interests of Maryland consumers in a way that aligns with legislation coming into existence around the country.[109] The failure to incorporate the provisions of H.B. 645 into those laws suggests that the Legislature was either unwilling or unable to push the envelope any further. Thus, while Maryland has made great strides in the name of data privacy, residents should not expect to be able to delete mom’s embarrassing posts anytime soon.
V. Conclusion
In sum, companies are interacting with customers in an increasingly intimate fashion. To facilitate that interaction, companies obtain and retain personal information to tailor content to individuals, greasing the wheels of the digital economy. However, that personalization should not completely erode consumer privacy. The law needs to ensure that businesses process personal data in an ethical and secure way that protects consumer interests. Although the federal government has been slow to address the issue, Maryland has taken substantive action. MODPA and the Kids Code significantly advance the privacy interests of residents and recognize that the private sector must respect data privacy—a position that the rest of the nation should take notice of.

* Michael Mellon is an appellate attorney practicing in New Jersey. He attended Rutgers-Camden School of Law, where he was a member of the Journal of Law and Public Policy. In 2015, he was admitted to the bars of New Jersey and Pennsylvania. He has since been admitted to the District Court of New Jersey and the Third Circuit Court of Appeals. Michael has briefed to, and appeared in, all levels of state court in New Jersey. He has also participated in briefing to the United States Supreme Court. In addition to practicing as an appellate attorney, Michael is a student at Drexel University’s Thomas R. Kline School of Law, where he is enrolled in the Cyber Law and Data Privacy LLM program. Michael has previously published scholarship in the areas of public policy and the intersection of law and technology.
[1] Eric Pace, Advertising; The Boom in Direct Marketing, N.Y. Times (Nov. 29, 1982), https://www.nytimes.com/1982/11/29/business/advertising-the-boom-in-direct-marketing.html; see also Fed. Rsrv. Bank of Minneapolis, Inflation Calculator, Minneapolis Fed., https://www.minneapolisfed.org/about-us/monetary-policy/inflation-calculator (last visited Sept. 9, 2024) (calculating that the same revenue would equal approximately $456.2 billion as of September 9, 2024).
[2] See U.S. Gen. Acct. Off., U.S. Postal Service: A Primer on Postal Worksharing 1 (2003), https://www.gao.gov/assets/gao-03-927-highlights.pdf (explaining that “[p]ostal worksharing activities generally involve mailers preparing, sorting, or transporting mail to qualify for reduced postage rates, that is, worksharing rates.”); see also Edward S. Pearsall, The Effects of Worksharing, Other Product Innovations and the 9/11-Anthrax Attack on U. S. Postal Volumes and Revenues 2 (2004) (sharing that innovations like worksharing “caused an increase in total revenue of about 26 percent” for the United States Postal Service between 1976 and 2003).
[3] See Julien Boudet et al., The Future of Personalization—and How to Get Ready for It 2 (2019) (providing that personalized marketing can “drive 5 to 15 percent increases in revenue and 10 to 30 percent increases in marketing.”).
[4] Robin Kurzer, The Story of Data: How Did We Get Here?, MarTech (May 8, 2018), https://martech.org/the-story-of-data-how-did-we-get-here/ (statement of Julia Stead, vice president of marketing of Invoca) (“The earliest direct mail campaigns were sometimes inartful, if not spammy.”).
[5] Id.
[6] Id.
[7] Id.; see also Tom Biselli et al., Supporting Informed Choices about Browser Cookies: The Impact of Personalised Cookie Banners, Proc. on Priv. Enhancing Techs. Symp. 171, 171 (2024) (providing an explanation and overview of the use of cookies to track users across the web); Jonathan Zhang et al., Know Your Customer: How Retailers Have Used Data Throughout History 3 (2023), https://netchoice.org/wp-content/uploads/2023/06/NetChoice_Know-Your-Customer%E2%80%94How-Retailers-Have-Used-Data-Throughout-History_Full-Report-1.pdf (discussing the various means of data analysis that companies have developed in the digital age).
[8] Kurzer, supra note 4.
[9] Id.
[10] See generally Julien Cloarec, The Personalization–Privacy Paradox in the Attention Economy, 161 Tech. Forecasting & Soc. Change 1, 2 (2020) (discussing the “personalization-privacy paradox[, which] refers to a continuous tension[] between a firm’s need for consumer information to personalize consumer experiences and a consumer’s need for privacy.”).
[11] Id. The United States Government Accountability Office has released a number of blog posts calling for regulation on this issue. See Internet Privacy: Additional Federal Authority Could Enhance Consumer Protection and Provide Flexibility, U.S. Gov’t Accountability Off. (Jan. 15, 2019), https://www.gao.gov/products/gao-19-52; see also To Address Emerging Privacy Issues, Congress Should Consider Comprehensive Legislation, U.S. Gov’t Accountability Off. (Sept. 13, 2022), https://www.gao.gov/blog/address-emerging-privacy-issues-congress-should-consider-comprehensive-legislation.
[12] See, e.g., Erika McCallister et al., Guide to Protecting the Confidentiality of Personally Identifiable Information (PII) at 1-1 (2010), https://nvlpubs.nist.gov/nistpubs/legacy/sp/nistspecialpublication800-122.pdf (discussing the use of personally identifiable information); Brooke Auxier et al., Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information 2–6 (2019), https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2019/11/Pew-Research-Center_PI_2019.11.15_Privacy_FINAL.pdf (discussing use of personal data) [hereinafter Americans and Privacy Report].
[13] See Paul M. Schwartz & Daniel J. Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 N.Y.U. L. Rev. 1814, 1816 (2011) (“Given PII’s importance, it is surprising that information privacy law in the United States lacks a uniform definition of the term.”). Compare Requesting Maryland Medical Assistance Data – Frequently-Asked Questions, Md. Dep’t of Health, https://health.maryland.gov/mmcp/datarequests/Pages/Frequently-Asked-Questions-Step-2.aspx (last visited Sept. 17, 2024) (“[I]nformation which can be used to distinguish or trace an individual’s identity.”), with 2023 Minnesota Statutes–Chapter 325M. Internet Privacy, Minn. Legislature, https://www.revisor.mn.gov/statutes/cite/325M.01 (last visited Sept. 17, 2024) (“‘Personally identifiable information’ means information that identifies: (1) a consumer by physical or electronic address or telephone number; (2) a consumer as having requested or obtained specific materials or services from an Internet service provider; (3) Internet or online sites visited by a consumer; or (4) any of the contents of a consumer’s data-storage devices.”).
[14] Daniel J. Solove, Data Is What Data Does: Regulating Based on Harm and Risk Instead of Sensitive Data, 118 Nw. U. L. Rev. 1081, 1086 (2024) (“The most common definition of ‘personal data’ is from the GDPR[,] which defines it as ‘any information relating to an identified or identifiable natural person.’”).
[15] See McCallister et al., supra note 12, at ES-1, -2.
[16] Solove, supra note 14, at 1087 (“Under the more common definition of personal data, which involves identified and identifiable data, the existence of the identifiability prong gives personal data a broad, open-ended, and dynamic scope. First, any data that is associated with personal data becomes personal data.”). “Personal data” is the term used in the Maryland legislation considered in this work. See discussion infra Part III.
[17] Americans and Privacy Report, supra note 12.
[18] Id. at 6, 13.
[19] See Julien Boudet et al., McKinsey & Co., Consumer Data Privacy and Personalization at Scale: How Leading Retailers and Consumer Brands Can Strategize for Both, 6 (2021), https://www.mckinsey.com/~/media/McKinsey/Business%20Functions/Marketing%20and%20Sales/Our%20Insights/Consumer%20data%20privacy%20and%20personalization%20at%20scale/Consumer-data-privacy-and-personalization-at-scale.ashx (“By reducing hold times, simplifying and tailoring multifactor authentication to meet consumer preferences, and placing data-protection controls for consumer-facing applications in the hands of the consumer, they are improving customer experience without compromising underlying security and privacy.”).
[20] Id. (concluding that businesses must balance personalization with “reputational, operational, legal, and financial risks” to “create a net positive consumer experience”).
[21] Fed. Trade Comm’n, A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services 46 (2024) (“Much of this type of tracking occurs behind the scenes, with users unaware and unable to avoid what’s happening.”).
[22] See Your Data Is Shared and Sold…What’s Being Done About It?, Knowledge at Wharton (Oct. 28, 2019), https://knowledge.wharton.upenn.edu/article/data-shared-sold-whats-done/ (statement of Elea Feit, senior fellow at Wharton Customer Analytics and marketing professor for Drexel University LeBow College of Business) (“Most companies are collecting data these days on all the interactions, on all the places that they touch customers in the normal course of doing business[.]”).
[23] While there is no singular data privacy law, generally, companies are legally mandated to make disclosures related to the information they obtain and retain from users online. See Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 Colum. L. Rev. 583, 587 (2014) (“Privacy law in the United States has developed in a fragmented fashion and is currently a hodgepodge of various constitutional protections,
federal and state statutes, torts, regulatory rules, and treaties.”); see generally Stephen P. Mulligan & Chris D. Linebaugh, Cong. Rsch. Serv., R45631, Data Protection Law: An Overview (2019) (summarizing several federal laws and state laws that offer statutory protection). However, research suggests that online users are not grasping the information disclosed in these privacy policies. See Johnathan Yerby & Ian Vaughn, Deliberately Confusing Language in Terms of Service and Privacy Policy Agreements, 23 Issues in Info. Sys. 138, 140 (2022) (finding that Terms of Service documents and privacy policies are too complex for average Americans); see also Fed. Trade Comm’n, supra note 21, at iv, 59 (suggesting that several tech giants have collected and retained user data in ways consumers would not expect, and that those same companies have failed to provide an easy way to opt out of said data collection). Companies also have disclosed customer information after indicating they would not. For example, the FTC has taken recent action against healthcare providers, who allegedly promised to keep customer information confidential but then shared it with third parties. See U.S. v. Cerebral, Inc. et al., No. 1:24-cv-21376 (filed Apr. 15, 2024); U.S. v. Movement, Inc., No. 1:24-cv-01034-BAH (filed Apr. 11, 2024).
[24] See Identity Theft Res. Ctr., 2023 Data Breach Report 3 (2024), https://www.idtheftcenter.org/wp-content/uploads/2024/01/ITRC_2023-Annual-Data-Breach-Report.pdf (noting record highs of personal data breaches in 2023).
[25] See Neelie Kroes, Vice-President, European Commission, Opening Remarks at the Press Conference on Open Data Strategy (Dec. 2011) (“Just as oil was likened to black gold, data takes a new importance and value in the digital age.”).
[26] See Lillian Ablon et al., Consumer Attitudes Toward Data Breach Notifications and Loss of Personal Information 1, 29 (2016), https://www.rand.org/content/dam/rand/pubs/research_reports/RR1100/RR1187/RAND_RR1187.pdf (providing that “studies have shown that nonfinancial information, such as account credentials (username and password combination), can be considered more valuable and often command a higher price on the cybercrime black market.”).
[27] See Identity Theft Res. Ctr., supra note 24, at 1, 5 (breaking down a total of 3,205 compromises into 3,122 data breaches, 25 data exposures, 2 data leaks, and 56 unknown compromises). The report also provides definitions for the relevant terms as used in the report, noting that a “data breach” occurs “[w]hen unauthorized individuals access and/or remove personal information from the place where it is stored,” a “data exposure” occurs “[w]hen personal information is available for access and/or removal from the place where it is stored, but there is no evidence the information has been accessed by unauthorized individuals,” and a “data leak” occurs “[w]hen personal information that is publicly available or willingly shared on social media and represents no or low risk when viewed as individual records; however, when aggregated, the sheer volume of personal information available in a single database creates risk to the data subjects and value for identity criminals who specialize in social engineering and phishing.” Id. at 4.
[28] See, e.g., Vindu Goel & Nicole Perlroth, Yahoo Says 1 Billion User Accounts Were Hacked, N.Y. Times (Dec. 14, 2016), https://www.nytimes.com/2016/12/14/technology/yahoo-hack.html; Equifax Data Breach Settlement, Fed. Trade Comm’n (Nov. 2024), https://www.ftc.gov/enforcement/refunds/equifax-data-breach-settlement (“In September of 2017, Equifax announced a data breach that exposed the personal information of 147 million people.”).
[29] See Security Breach Notification Laws, NCSL (Jan. 17, 2022), https://www.ncsl.org/technology-and-communication/security-breach-notification-laws.
[30] See, e.g., Schwartz & Solove, supra note 13, at 1828 (“[P]rivacy law offers multiple competing definitions, each with some significant problems and limitations.”); Daniel J. Solove, The Limitations of Privacy Rights, 98 Notre Dame L. Rev. 975, 978 (2023) (“[Data privacy rights] can give people a small amount of power in a few isolated instances, but this power is too fragmented and haphazard to have a meaningful impact on protecting privacy.”).
[31] See discussion infra Part II.
[32] See American Data Privacy and Protection Act, H.R. 8152, 117th Cong. (2022).
[33] See Paul Anderson, H.R. 8152–The American Data Privacy and Protection Act: The United States’ Solution for the Current “Patchwork” of Data Privacy & Protection Laws, Univ. of Md. Francis King Carey Sch. of L. (Apr. 4, 2023), https://www.law.umaryland.edu/content/articles/name-659578-en.html; see also Solove & Hartzog, supra note 23, at 587.
[34] Steve Alder, Revised American Data Privacy and Protection Act Due to Be Released, HIPAA J. (Apr. 14, 2023), https://www.hipaajournal.com/revised-american-data-privacy-and-protection-act-due-to-be-released/.
[35] See American Privacy Rights Act of 2024, H.R. 8818, 118th Cong. (2024); see also Chris D. Linebaugh et al., Cong. Rsch. Serv., LSB11161, The American Privacy Rights Act 1 (2024), https://crsreports.congress.gov/product/pdf/LSB/LSB11161.
[36] Committee Chairs Rodgers, Cantwell Unveil Historic Draft Comprehensive Data Privacy Legislation, Energy & Com. Chair Rodgers (Apr. 7, 2024), https://energycommerce.house.gov/posts/committee-chairs-rodgers-cantwell-unveil-historic-draft-comprehensive-data-privacy-legislation.
[37] Id.
[38] See H.R. 8818 § 105.
[39] See Linebaugh et al., supra note 35, at 1 (“The APRA would create a comprehensive federal consumer privacy framework.”).
[40] Joe Duball, American Privacy Rights Act Markup Canceled, Next US House Steps Uncertain, IAPP (June 27, 2024), https://iapp.org/news/a/american-privacy-rights-act-markup-canceled-next-us-house-steps-uncertain.
[41] Id.
[42] Id.
[43] Id.
[44] Cf. Anthony Adragna & Daniella Diaz, The New Normal: No Big Legislating in Election Years?, Politico (Feb. 16, 2024), https://www.politico.com/newsletters/inside-congress/2024/02/16/the-new-normal-no-big-legislating-in-election-years-00141975 (discussing the difficulty of passing significant bills during an election cycle); see also United States Congress Elections, 2024, Ballotpedia, https://ballotpedia.org/United_States_Congress_elections,_2024 (last visited Sept. 21, 2024) (noting that 33 Senate seats and all 435 House seats were up for election on November 5, 2024).
[45] See generally Alder, supra note 34; see also Gilad Edelman, Don’t Look Now, but Congress Might Pass an Actually Good Privacy Bill, Wired (July 21, 2022), https://www.wired.com/story/american-data-privacy-protection-act-adppa/ (noting that a bipartisan agreement on federal privacy regulation has been “just around the corner” since 2019).
[46] Nearly twenty states have enacted comprehensive data privacy laws. See, e.g., S.B. 1121, 2018 Leg., Reg. Sess. (Cal. 2018) (amending multiple sections and creating Cal. Bus. & Prof. Code § 1798.199 (Deering 2024)); S.B. 619, 2023 Leg., 82d Sess. (Or. 2023) (creating Or. Rev. Stat. §§ 646A.570–.589 (2024)); S.B. 21-190, 71st Gen. Assemb., Reg. Sess. (Colo. 2021) (creating Colo. Rev. Stat. §§ 6-1-1301 to -1314 (2024)); 2023 Conn. Acts 23–56 (Reg. Sess.) (amending 2022 Conn. Acts 22-15 (Reg. Sess.)); H.R. 154, 152d Gen. Assemb., Reg. Sess. (Del. 2023) (amending Del. Code Ann. tit. §§ 12D-101 to -111 (2024)); S.B. 262, 2023 Leg., Reg. Sess. (Fla. 2023) (creating Fla. Stat. §§ 501.701-.721, 501.1735 (2024), and amending Fla. Stat. § 501.171 (2024)); S.B. 5, 2023 Leg., 123d Sess. (Ind. 2023) (creating Ind. Code §§ 24-15-1-1 to -11-2 (2024)); S.F. 262, 90th Gen. Assemb., Reg. Sess. (Iowa 2023) (creating Iowa Code §§ 715D.1–715D.9 (2024)); H.R. 15, 2024 Leg., Reg. Sess. (Ky. 2024) (creating Ky. Rev. Stat. Ann. §§ 367.3611–.3629 (LexisNexis 2024)); H.R. 2309, 2024 Leg., 93d Sess. (Minn. 2024) (creating Minn. Stat. §§ 325M.10–.21 (2024)); S.B. 384, 2023 Leg., 68th Sess. (Mont. 2023) (generally revising Mont. Code §§ 30-14-2801 to -2817 (2024)); S.B. 225, Gen. Ct., Reg. Sess. (N.H. 2024) (creating N.H. Rev. Stat. §§ 507-H:1–12 (LexisNexis 2024)); L.B. 1074, 108th Leg., Reg. Sess. (Neb. 2024) (creating Neb. Rev. Stat. §§ 87-1101 to -1130 (2024)); S.B. 332, 2023 Leg., Reg. Sess. (N.J. 2023) (supplementing N.J. Stat. §§ 56:8-166.4 to -166.19 (2024)); S. 2500, Gen. Assemb., Reg. Sess. (R.I. 2024) (creating R.I. Gen. Laws §§ 6-48.1-1 to -10 (2024)); H.R. 1181, Gen. Assemb., Reg. Sess. (Tenn. 2023) (creating Tenn. Code Ann. §§ 47-18-3201 to -3214 (2024)); H.R. 4, 2024 Leg., 88th Sess. (Tex. 2024) (creating Tex. Bus. & Com. Code § 541.001 (2024)); S.B. 227, 2022 Leg., 64th Sess. (Utah 2022) (amending Utah Code Ann. § 13-2-1 (LexisNexis 2024)); S.B. 361, Gen. Assemb., Reg. Sess. (Va. 2024) (amending Va. Code Ann. §§ 59.1-575 to -578 (2024)). Some states have narrowed their data privacy laws. See, e.g., H.R. 1155, 68th Leg., Reg. Sess. (Wash. 2023) (creating Wash. Rev. Code Ann. § 19.373.005 (LexisNexis 2024)); S.B. 370, 2023 Leg., 82d Sess. (Nev. 2023) (amending Nev. Rev. Stat. Ann. § 603A (LexisNexis 2024)). The International Association of Privacy Professionals provides a valuable source for keeping tabs on state privacy legislation throughout the United States. See US State Privacy Legislation Tracker, IAPP (Dec. 16, 2024), https://iapp.org/resources/article/us-state-privacy-legislation-tracker/.
[47] See H.D. 567, 446th Gen. Assemb., Reg. Sess. (Md. 2024); H.D. 603, 446th Gen. Assemb., Reg. Sess. (Md. 2024); see also Natasha Singer, Maryland Passes 2 Major Privacy Bills, Despite Tech Industry Pushback, N.Y. Times (Apr. 7, 2024), https://www.nytimes.com/2024/04/07/technology/maryland-children-privacy-laws-technology.html.
[48] See H.D. 567.
[49] Id.
[50] See H.D. 603.
[51] Id.
[52] See H.D. 645, 446th Gen. Assemb., Reg. Sess. (Md. 2024).
[53] See discussion infra Part III.A (providing an account of the specific privacy rights provided for under MODPA); see also discussion infra Section III.B (discussing certain limitations placed on the processing of child-related data under the Maryland Kids Code).
[54] See H.D. 567, 446th Gen. Assemb., Reg. Sess. (Md. 2024); see also Md. Code Ann., Com. Law § 14-4701 (LexisNexis 2024).
[55] Whether an entity controls or processes personal data determines whether it is considered a “controller” or “processor” under the law. See Com. Law § 14-4701 (noting that “‘[c]ontroller’ means a person that, alone or jointly with others, determines the purpose and means of processing personal data” and “‘[p]rocessor’ means a person that processes personal data on behalf of a controller.”). This designation is important because it impacts the responsibilities and potential liabilities of an entity. See id. §§ 14-4701 to 14-4714 (providing the responsibilities attached to each role). As this essay is intended to be a high-level overview, these distinctions are seldom discussed here; nevertheless, in practice, it is important for an entity to know its specific role in order to comply with MODPA and other data privacy-related laws.
[56] “Personal data” is defined as “any information that is linked or can be reasonably linked to an identified or identifiable consumer.” Id. § 14-4701(w)(1). This does not include publicly available data or de-identified data. See id. § 14-4701(w)(2). An “identifiable consumer” is defined as “a consumer who can readily be identified, either directly or indirectly.” Id. § 14-4701(u).
[57] “Process” is defined as “an operation or set of operations performed by manual or automated means on personal data . . . includ[ing] collecting, using, storing, disclosing, analyzing, deleting, or modifying personal data.” Com. Law § 14-4701(y)(1)-(2).
[58] “Consumer” refers to a resident of Maryland. See id. § 14-4701(h)(1).
[59] Id. §§ 14-4702(1)-(2). Various exceptions exist that exempt otherwise qualifying entities, information, and data. These include, but are not limited to, exemptions related to the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act. See generally id. § 14-4703 (providing exceptions); see also id. § 14-4704 (providing limits for entities processing health-related data).
[60] When a consumer revokes consent to the processing of their personal data, the law requires the controller to stop the processing “as soon as practicable, but not later than 30 days after receiving the request.” See Com. Law § 14-4707(b)(2).
[61] Id. § 14-4705(b).
[62] See id. § 14-4705(e).
[63] Id. § 14-4705(f).
[64] Id. §§ 14-4707(d)-(e).
[65] Sensitive data is defined in the statute as:
- Data revealing: (i) Racial or ethnic origin; (ii) Religious beliefs; (iii) Consumer health data; (iv) Sex life; (v) Sexual orientation; (vi) Status as transgender or nonbinary; (vii) National origin; or (viii) Citizenship or immigration status;
- Genetic data or biometric data;
- Personal data of a consumer that the controller knows or has reason to know is a child; or
- Precise geolocation data.
Com. Law § 14-4701.
[66] Id. §§ 14-4707(a)(1)–(2). But see Cobun Zweifel-Keegan, A View from DC: Privacy Law Flirts with Its ‘Ban It’ Era, IAPP (Sept. 13, 2024), https://iapp.org/news/a/a-view-from-dc-privacy-law-flirts-with-its-ban-it-era (concluding that MODPA does not present an absolute ban on the sale of sensitive data, despite appearing to do so on its face).
[67] See, e.g., S.F. 262, 2023 Leg., 90th Sess. (Iowa 2023) (creating Iowa Code §§ 715D.1–.9 (2024)); S.B. 227, 2022 Leg., 64th Sess. (Utah 2022) (amending Utah Code Ann. § 13-2-1 (LexisNexis 2024)). Iowa’s data privacy law is considered more business-friendly; for example, the law does not grant consumers a private right of action nor the right to delete or correct data collected by third parties. Utah’s data privacy law is narrower in scope, thereby excluding many businesses from compliance, and it also excludes employee data from protection.
[68] See Com. Law § 14-4701 (explaining the distinction between a controller and a processor).
[69] Id. § 14-4707(b)(ii).
[70] Id. §§ 14-4708(a)–(b).
[71] Id. §§ 14-4710(a)–(c).
[72] Id. § 14-4710(a). It is assumed that the “Division” referred to in MODPA is the Division of Consumer Protection of the Office of the Attorney General. However, it is not expressly stated in MODPA. See discussion infra note 81.
[73] Com. Law § 14-4710(d)(2).
[74] Id. § 14-4710(d)(2); see also id. §§ 14-4714(b)–(d) (providing that the Division “may” allow a non-compliant entity 60 days to come into compliance before bringing an action, depending on a number of factors that are set out in the statute).
[75] See H.D. 603, 446th Gen. Assemb., Reg. Sess. (Md. 2024); see also Com. Law §§ 14-4801 to -4813. Many of the definitions and provisions contained therein are the same or substantially similar to those discussed in MODPA, albeit tailored specifically for children and their interactions with online products and services. Accordingly, many definitions under the Kids Code are not restated here to avoid redundancy. However, this limited analysis should not be understood to imply that compliance with MODPA equals compliance with the Kids Code. The two are separate statutory provisions and have distinctions in addition to those discussed in this essay. H.D. 603; Com. Law §§ 14-4801 to -4813.
[76] Com. Law § 14-4801(h).
[77] Id. § 14-4803; but see id. § 14-4802 (providing a list of exemptions, including but not limited to those relating to entities required to comply with Title V of the Gramm-Leach-Bliley Act and entities required to comply with the Health Information Technology for Economic and Clinical Health Act).
[78] See Brian Witte, Maryland Governor Signs Online Data Privacy Bills, AP (May 9, 2024, 3:29 PM), https://apnews.com/article/online-data-privacy-maryland-7f691026b58f30be4b535c015c24845a; but see Com. Law § 14-4810 (noting that “nothing in this subtitle may be interpreted or construed to . . . (3) Prevent or preclude a child from deliberately or independently searching for or specifically requesting content; or (4) Require a covered entity to implement an age-gating requirement.”).
[79] Com. Law §§ 14-4810(3)–(4); see also id. § 14-4804(2) (noting that entities must begin preparing DPIAs on or before April 1, 2026).
[80] Com. Law § 14-4804. The term “best interests of children” is defined as:
a covered entity’s use of the personal data of children or the design of an online product in a way that does not:
(1) Benefit the covered entity to the detriment of children; and
(2) Result in:
(i) Reasonably foreseeable and material physical or financial harm to children;
(ii) Severe and reasonably foreseeable psychological or emotional harm to children;
(iii) A highly offensive intrusion on children’s reasonable expectation of privacy; or
(iv) Discrimination against children based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation.
Id. § 14-4801(c).
[81] Com. Law §§ 14-4805, -4807; see also id. § 14-4809 (indicating that when an entity “is in substantial compliance with the requirements of §§ 14-4804 through 14-4806,” but the Division still alleges a violation, it shall notify the relevant entity of the alleged violation in writing and allow 90 days to cure the alleged violation before commencing an action); id. § 14-4808 (discussing applicable fines and penalties).
[82] Com. Law § 14-4810.
[83] Id.
[84] See generally H.D. 645, 446th Gen. Assemb., Reg. Sess. (Md. 2024).
[85] Id. (noting that this legislation never made it out of committee during the 2024 legislative session).
[86] See Rachel C. Abrams, Family Influencing in the Best Interests of the Child, 2 Chi. J. Int’l L. 97, 99–100 (2023).
[87] Com. Law §§ 14-4701(y)(1)-(2).
[88] Id. § 14-4701(g)(1).
[89] Id. § 14-4701(w)(1); cf. id. § 14-4801(n)(1) (showing that while the Maryland Age Appropriate Design Code Act is aimed at a different purpose, it shares its definition of “personal data” with the Maryland Online Data Privacy Act).
[90] Com. Law § 14-4701(cc); cf. id. § 14-4701 (noting the separate definition of “publicly available information” in both the Maryland Online Data Privacy Act and the Maryland Age-Appropriate Design Code Act).
[91] Com. Law §§ 14-4707(a)(1)-(2).
[92] Id. § 14-4701(gg)(3).
[93] See Abrams, supra note 86.
[94] Com. Law § 14-4703(c).
[95] See generally 15 U.S.C. §§ 6501-6506 (2024); see also 16 C.F.R. § 312.2 (2024).
[96] Id.
[97] The FTC offers examples of methods that have met the standard. See Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business, Fed. Trade Comm’n (June 2017), https://www.ftc.gov/business-guidance/resources/childrens-online-privacy-protection-rule-six-step-compliance-plan-your-business#step4.
[98] See Social Media Platforms–Vloggers and Video Content Featuring Minors, H.D. 645, 446th Gen. Assemb., Reg. Sess. (Md. 2024); cf. Complying with COPPA: Frequently Asked Questions, Fed. Trade Comm’n (July 2020), https://www.ftc.gov/business-guidance/resources/complying-coppa-frequently-asked-questions#A.%20General%20Questions (noting that personal information under COPPA includes “[a] photograph, video, or audio file, where such file contains a child’s image or voice”).
[99] See Jonathan Todres, Confronting Categorical Exclusions Based on Age: The Right of Children and Youth, 36 Harv. Hum. Rts. J. 283, 286 n.24 (finding that minimum age laws may protect society from children but can create excessive exclusivity).
[100] Id. at 286 (“[T]he law opts for rules that treat children as lacking capacity[.]”).
[101] Note that COPPA provides for the deletion of personal information of children posted by children, but not that which is posted by their parents. See Complying with COPPA: Frequently Asked Questions, supra note 98 (noting in response to the prompt “Does COPPA apply to information about children collected online from parents or other adults?” that, “No. COPPA only applies to personal information collected online from children, including personal information about themselves, their parents, friends, or other persons.”). Under the law, “[p]rior to posting photos, videos, and audio recordings that contain a child’s image or voice, operators covered by COPPA must either (i) prescreen and delete from children’s submissions any photos, videos, or audio recordings of themselves or other children or (ii), first give parents notice and obtain their consent prior to permitting children to upload any photos, videos, or audio recordings of themselves or other children.” Id.; see also Children and Teens Online Privacy Protection Act, S. 1418, 118th Cong. (2023) (providing the text for COPPA 2.0, which would seek to amend COPPA). COPPA 2.0 would create an exception to the deletion requirement if the content in question is posted “by any person other than the user who is attempting to erase or otherwise eliminate the content or information, including content or information submitted by the user that was republished or resubmitted by another person.” S. 1418.
[102] Md. Code Ann., Com. Law §§ 14-4707(a)(1)–(g)(2) (LexisNexis 2024).
[103] See Abrams, supra note 86, at 99–100.
[104] See, e.g., Fortesa Latifi, Activists Urged Maryland Legislators to Protect Child Influencers from a “Monetized Childhood” with No Profit, Teen Vogue (Feb. 14, 2024), https://www.teenvogue.com/story/activists-urged-maryland-legislators-to-protect-child-influencers (testimony of Cam Barrett) (“I know what it’s like to grow up with a digital footprint I never asked for . . . As my mom posted to the world my first ever menstrual cycle . . . [and] the intimate details of me being adopted, her platform grew and I had no say in what was posted.”). Barrett went on to discuss the lingering anxiety she felt as a result of this objectification, as well as the various ways children may be exploited by their guardians for social media-related profit. Id. While most parents who share content about their children online do not act with such indifference, postings about children may present inadvertent privacy and security risks of which parents should be aware. See Stacey B. Steinberg, Sharenting: Children’s Privacy in the Age of Social Media, 66 Emory L.J. 839, 878 (2017) (“While the law might never offer children complete protection from their parents’ choice to disclose personal information online, society is beginning to recognize that there are inherent safety and moral risks involved in many of today’s common parental sharing practices.”).
[105] H.D. 645, 446th Gen. Assemb., Reg. Sess. (Md. 2024).
[106] Id. (noting, however, that the bill contains a number of exceptions to this right of deletion). The proposal mirrors California’s “Eraser Law,” which allows minors to request the deletion of content parents have posted online. See Cal. Bus. & Prof. Code § 22581 (2024).
[107] Bus. & Prof. § 22581.
[108] See Bryan P. Sears & Josh Kurtz, Sponsors of ‘Kids Code’ Bill Confident It Will Withstand Court Challenge, Md. Matters (May 10, 2024), https://marylandmatters.org/2024/05/10/sponsors-of-kids-code-bill-confident-it-will-withstand-court-challenge/ (statement of Del. Jared Solomon) (“With this bill, we are now only the second state in the country and we hope that we will be the first state in the country to withstand what we think will probably be a likely court challenge and we’ve done a lot of work to make that happen[.]”). But see NetChoice, LLC v. Bonta, 113 F.4th 1101, 1121 (9th Cir. 2024) (suggesting that a similar provision in California’s Age-Appropriate Design Code Act may be unconstitutional); Moody v. NetChoice, LLC, 603 U.S. 707, 716-17 (2024) (suggesting that social media content moderation and curation determinations are protected under the First Amendment); see generally Mahanoy Area Sch. Dist. v. B.L., 594 U.S. 180 (2021) (finding among other things that a post on social media can constitute protected free speech).
[109] See H.D. 567, 446th Gen. Assemb., Reg. Sess. (Md. 2024); H.D. 603, 446th Gen. Assemb., Reg. Sess. (Md. 2024); see also Singer, supra note 47.





