Shakes off bones; emerges from coffin. Let's walk through both bills. That is, the actual text, and not whatever bullshit has been distilled through word-of-mouth, because none of the people posting about this stuff online bother to read anything other than other people's takes.
First, KOSA. Let's look at the "big scary thing" that the EFF rants about:
SEC. 3. DUTY OF CARE.
(a) Best Interests.—A covered platform has a duty to act in the best interests of a minor that uses the platform's products or services.
(b) Prevention Of Harm To Minors.—In acting in the best interests of minors, a covered platform has a duty to prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials on, or engagement with, the platform, including—
(1) promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor;
(2) patterns of use that indicate or encourage addiction-like behaviors;
(3) physical harm, online bullying, and harassment of a minor;
(4) sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material;
(5) promotion and marketing of products or services that are unlawful for minors, such as illegal drugs, tobacco, gambling, or alcohol; and
(6) predatory, unfair, or deceptive marketing practices.
(I'm not going to bother formatting this stuff nicely; you can just read the actual bill)
Notably, this is the third section, after the title and definitions. This is not an actual list of things that all websites are now suddenly legally forced to uphold. This is the goal of the bill, not a dictate. "Has a duty" is not legally enforceable. What is actually being requested here?
SEC. 4. SAFEGUARDS FOR MINORS.
(a) Safeguards For Minors.—
(1) IN GENERAL.—A covered platform shall provide a minor, or a parent acting on a minor's behalf, with readily accessible and easy-to-use safeguards to control their experience and personal data on the covered platform, including settings to—
(A) limit the ability of other individuals to contact or find a minor, in particular adults with no relationship to the minor;
(B) prevent other individuals from viewing the minor’s personal data collected by or shared on the covered platform, in particular restricting public access to personal data;
(C) limit features that increase, sustain, or extend use of the covered platform by a minor, such as automatic playing of media, rewards for time spent on the platform, and notifications;
(D) opt out of algorithmic recommendation systems that use a minor’s personal data;
(E) delete the minor's account and request removal of personal data;
(F) restrict the sharing of the geolocation of a minor and to provide notice regarding the tracking of a minor’s geolocation; and
(G) limit time spent by a minor on the covered platform.
(2) DEFAULT SAFEGUARD SETTINGS FOR MINORS.—A covered platform shall provide that, in the case of a user that the platform knows or reasonably believes to be a minor, the default setting for any safeguard described under paragraph (1) shall be the strongest option available.
(3) ACCESSIBILITY FOR MINORS.—With respect to safeguards described under paragraph (1), a covered platform shall provide information and control options in a manner that is age appropriate and does not encourage minors to weaken or turn off safeguards.
(b) Parental Tools.—
(1) PARENTAL TOOLS.—A covered platform shall provide readily accessible and easy-to-use parental tools for parents to appropriately supervise the use of the covered platform by a minor.
(2) REQUIREMENTS.—The parental tools provided by a covered platform shall include—
(A) the ability to control privacy and account settings, including the safeguards established under subsection (a)(1);
(B) the ability to restrict purchases and financial transactions by a minor;
(C) the ability to track total time spent on the platform;
(D) a clear and conspicuous mechanism for parents to opt out of or turn off any default parental tools put in place by the covered platform; and
(E) access to other information regarding a minor's use of a covered platform and control options necessary to a parent's ability to address the harms described in section 3(b).
(3) NOTICE TO MINORS.—A covered platform shall provide clear and conspicuous notice to a minor when parental tools are in effect.
(4) DEFAULT PARENTAL TOOLS.—A covered platform shall provide that, in the case of a user that the platform knows or reasonably believes to be a minor, parental tools shall be enabled by default.
(c) Reporting Mechanism.—
(1) PARENTAL REPORTS.—A covered platform shall provide minors and parents with—
(A) a readily accessible and easy-to-use means to submit reports of harms to a minor, including harms described in section 3(b);
(B) an electronic point of contact specific to matters involving harms to a minor; and
(C) confirmation of the receipt of such a report and a means to track a submitted report.
(2) TIMING.—A covered platform shall establish an internal process to receive and respond to reports in a reasonable and timely manner.
(d) Illegal Content.—A covered platform shall not facilitate the advertising of products or services to minors that are illegal to sell to minors based on applicable State or Federal law.
Everything listed in 4.a.1. is already either an option or could be made an option without concern. The rest of 4.a. just covers the fact that this should be the default if it's reasonable to presume that the user is a minor, which is generally covered by asking their age. Nothing particularly special there; websites do this already. Everything in 4.b. just requires attaching a minor's account to an account of a parent who has control; nothing special there, either.
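To make concrete how mundane 4.a.2 is, here's a rough sketch of what "strongest option available" defaults look like when keyed off a self-declared age at signup. The type and field names are mine, invented for illustration; the bill doesn't prescribe any of them:

    // A sketch of 4.a.2: if the declared age says "minor", every safeguard
    // from 4.a.1 starts at its strictest setting. Names are hypothetical.
    interface SafeguardSettings {
      discoverableByStrangers: boolean;      // 4.a.1.A
      personalDataPublic: boolean;           // 4.a.1.B
      autoplayAndNotifications: boolean;     // 4.a.1.C
      personalizedRecommendations: boolean;  // 4.a.1.D
      shareGeolocation: boolean;             // 4.a.1.F
      dailyTimeLimitMinutes: number | null;  // 4.a.1.G
    }

    function defaultSafeguards(declaredAge: number): SafeguardSettings {
      // The bill's reporting brackets top out at 16, i.e. "minor" means under 17.
      const isMinor = declaredAge < 17;
      if (!isMinor) {
        return {
          discoverableByStrangers: true,
          personalDataPublic: true,
          autoplayAndNotifications: true,
          personalizedRecommendations: true,
          shareGeolocation: true,
          dailyTimeLimitMinutes: null,
        };
      }
      // 4.a.2: "the default setting for any safeguard ... shall be the strongest option available"
      return {
        discoverableByStrangers: false,
        personalDataPublic: false,
        autoplayAndNotifications: false,
        personalizedRecommendations: false,
        shareGeolocation: false,
        dailyTimeLimitMinutes: 60, // an arbitrary conservative default
      };
    }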
Then there's 4.c, which is the enforcement mechanism of section 3. Which consists entirely of... giving parents a mechanism to report all of the stuff in section 3 to whoever runs the website.
This already exists; it's called a report button.
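If you want to see how unremarkable 4.c is, here's a minimal sketch of a report mechanism with the confirmation-and-tracking bit from 4.c.1.C bolted on. The in-memory store, names, and shapes are hypothetical; the bill doesn't prescribe an API:

    // A sketch of the 4.c reporting mechanism: take a report, confirm receipt,
    // and let the reporter track it later.
    type ReportStatus = "received" | "under_review" | "resolved";

    interface HarmReport {
      id: string;
      reporterId: string;
      targetContentId: string;
      category: string; // e.g. one of the harms enumerated in section 3(b)
      status: ReportStatus;
      submittedAt: Date;
    }

    const reports = new Map<string, HarmReport>();
    let nextReportId = 1;

    function submitReport(reporterId: string, targetContentId: string, category: string): HarmReport {
      const report: HarmReport = {
        id: String(nextReportId++),
        reporterId,
        targetContentId,
        category,
        status: "received", // 4.c.1.C: confirmation of receipt
        submittedAt: new Date(),
      };
      reports.set(report.id, report);
      return report;
    }

    function trackReport(id: string): ReportStatus | undefined {
      return reports.get(id)?.status; // 4.c.1.C: a means to track a submitted report
    }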
4.d. bars advertising age-restricted products to minors, which isn't exactly something to write home about.
Section 5 is just disclosure requirements: telling users the relevant information about all of this. Nothing special.
Then there's section 6:
SEC. 6. TRANSPARENCY.
(a) Audit Of Systemic Risks To Minors.—
(1) IN GENERAL.—Not less frequently than once a year, a covered platform shall issue a public report identifying the foreseeable risks of harm to minors based on an independent, third-party audit conducted through reasonable inspection of the covered platform and describe the prevention and mitigation measures taken to address such risks.
(2) CONTENT.—
(A) TRANSPARENCY.—The public reports required of a covered platform under this section shall include—
(i) an assessment of whether the covered platform is reasonably likely to be accessed by minors;
(ii) a description of the commercial interests of the covered platform in use by minors;
(iii) an accounting of the number of individuals using the covered platform reasonably believed to be minors in the United States, disaggregated by the age ranges of 0–5, 6–9, 10–12, and 13–16;
(iv) an accounting of the time spent by the median and average minor in the United States on a daily, weekly, and monthly basis, disaggregated by the age ranges of 0–5, 6–9, 10–12, and 13–16;
(v) an accounting, disaggregated by category of harm, of—
(I) the total number of reports of the dissemination of illegal or harmful content involving minors; and
(II) the prevalence of content that is illegal or harmful to minors; and
(vi) a description of any material breaches of parental tools or assurances regarding minors, unexpected use of the personal data of minors, and other matters regarding non-compliance.
(B) SYSTEMIC RISKS ASSESSMENT.—The public reports required of a covered platform under this section shall include—
(i) an audit of the known and emerging risks to minors posed by the covered platform, including the harms described in section 3(b);
(ii) an assessment of how algorithmic recommendation systems and targeted advertising systems can contribute to harms to minors;
(iii) a description of whether and how the covered platform uses system design features to increase, sustain, or extend use of a product or service by a minor, such as automatic playing of media, rewards for time spent, and notifications;
(iv) a description of whether, how, and for what purpose the platform collects or processes geolocation, contact information, health data, or other categories of personal data of heightened concern regarding minors, as determined by the Commission;
(v) an evaluation of the efficacy and any issues in delivering safeguards to minors under section 4; and
(vi) an evaluation of any other relevant matters of public concern over risks to minors.
(C) MITIGATION.—The public reports required of a covered platform under this section shall include—
(i) a description of the safeguards and parental tools available to minors and parents on the covered platform;
(ii) a description of interventions by the covered platform when it had or has reason to believe that harm could occur to minors;
(iii) a description of the prevention and mitigation measures intended to be taken in response to the known and emerging risks identified in its audit of system risks, including steps taken to—
(I) adapt or remove system design features that expose minors to risks;
(II) set safeguards to their most safe settings by default;
(III) prevent the presence of illegal and illicit content on the covered platform; and
(IV) adapt algorithmic recommendation system to prioritize the best interests of users who are minors;
(iv) a description of internal processes for handling reports and automated detection mechanisms for harms to minors, including the rate, timeliness, and effectiveness of responses under the requirement of section 4(c);
(v) the status of implementing prevention and mitigation measures identified in prior assessments; and
(vi) a description of the additional measures to be taken by the covered platform to address the circumvention of safeguards and parental tools.
(3) REASONABLE INSPECTION.—In conducting an inspection of the systemic risks of harm to minors, a covered platform shall—
(A) take into consideration the function of algorithmic recommendation systems;
(B) consult parents, experts, and civil society with respect to the prevention of harms to minors;
(C) conduct research based on experiences of minors that use the covered platform, including harms reported under section 4(c);
(D) take account of research, including research regarding system design features, marketing, or product integrity, industry best practices, or outside research; and
(E) consider indicia or inferences of age of users, in addition to any self-declared information about the age of individuals.
(4) PRIVACY SAFEGUARDS.—In issuing the public reports required under this section, a covered platform shall take steps to safeguard the privacy of its users, including ensuring that data is presented in a de-identified, aggregated format.
Please explain to me how any of the things listed here are the "doomsday mechanisms for the Internet" that the EFF is screaming bloody murder about. "Please report the number of minors on your platform once a year." Ooh, real scary. This is literally "please bring in a third party to report on your moderation activities regarding minors" and that's not exactly a significant thing.
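As an illustration of how mundane the accounting in 6.a.2.A.iii is, here's what the age-bracket tally might look like against self-declared ages. The user record shape is invented for the example:

    // A sketch of the 6.a.2.A.iii accounting: count US users reasonably believed
    // to be minors, bucketed into the bill's age ranges.
    interface UserRecord {
      declaredAge: number;
      country: string;
    }

    const AGE_BRACKETS: [string, number, number][] = [
      ["0-5", 0, 5],
      ["6-9", 6, 9],
      ["10-12", 10, 12],
      ["13-16", 13, 16],
    ];

    function countMinorsByBracket(users: UserRecord[]): Record<string, number> {
      const counts: Record<string, number> = {};
      for (const [label] of AGE_BRACKETS) counts[label] = 0;
      for (const user of users) {
        if (user.country !== "US") continue; // the bill only asks about US users
        for (const [label, min, max] of AGE_BRACKETS) {
          if (user.declaredAge >= min && user.declaredAge <= max) {
            counts[label]++;
            break;
          }
        }
      }
      return counts;
    }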
Section 7 then outlines the requirements for who is eligible to be one of these "third-party researchers". Section 8 is just guidance on how to approach market research on minors.
Section 9, the infamous "age verification study":
SEC. 9. AGE VERIFICATION STUDY AND REPORT.
(a) Study.—The Director of the National Institute of Standards and Technology, in coordination with the Federal Communications Commission, Federal Trade Commission, and the Secretary of Commerce, shall conduct a study evaluating the most technologically feasible options for developing systems to verify age at the device or operating system level.
(b) Contents.—Such study shall consider—
(1) the benefits of creating a device or operating system level age verification system;
(2) what information may need to be collected to create this type of age verification system;
(3) the accuracy of such systems and their impact or steps to improve accessibility, including for individuals with disabilities;
(4) how such a system or systems could verify age while mitigating risks to user privacy and data security and safeguarding minors' personal data; and
(5) the technical feasibility, including the need for potential hardware and software changes, including for devices currently in commerce and owned by consumers.
(c) Report.—Not later than 1 year after the date of enactment of this Act, the agencies described in subsection (a) shall submit a report containing the results of the study conducted under such subsection to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives.
Note that nothing here develops such a mechanism; it just evaluates what such a mechanism would even look like, and whether it's feasible. The EFF naturally jumps to the conclusion that such a hypothetical system will be mandated, but of course, that's not in the bill. That's just the EFF tilting at legislation-shaped windmills, and every other tech reporter taking their word as gospel.
The rest is just boilerplate enforcement (it's civil, so we're talking lawsuits) and implementation (making a council to figure out how the hell to do it).
The only thing I see coming of KOSA, assuming it even passes, is that more websites are going to ask your age when you make an account.
That's about it.
Now, onto INFORM.
I'm not gonna bother breaking the text out into snippets because it's rather straightforward. If you're selling through an online marketplace as a third-party seller in "high volume" ("a third party seller... who, in any continuous 12-month period during the previous 24 months, has entered into 200 or more discrete sales or transactions of new or unused consumer products resulting in the accumulation of an aggregate total of $5,000 or more in gross revenues."), you're obliged to provide your name and contact information. Notably, if you lack a business address to provide, the only requirement is your country (and, if applicable, state), or the platform simply stating that no business address is available.
Notably, none of this applies if you're just selling this stuff through your own storefront. Only if you're trying to sell your "politically provocative" book on Amazon, sans publisher.
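For reference, here's a rough sketch of the "high volume" threshold check as I read the quoted definition. The data shapes are mine; a real implementation would hang off whatever sales records the marketplace already keeps:

    // A sketch of INFORM's "high-volume third party seller" test as quoted above:
    // 200+ discrete sales of new or unused consumer products AND $5,000+ in gross
    // revenue, within some continuous 12-month window in the previous 24 months.
    interface Sale {
      date: Date;
      grossRevenue: number; // in dollars
    }

    function isHighVolumeSeller(sales: Sale[], now: Date = new Date()): boolean {
      const DAY_MS = 24 * 60 * 60 * 1000;
      const earliest = now.getTime() - 730 * DAY_MS; // roughly the previous 24 months
      const recent = sales
        .filter((s) => s.date.getTime() >= earliest)
        .sort((a, b) => a.date.getTime() - b.date.getTime());

      // Slide a roughly 12-month window across the recent sales.
      for (const start of recent) {
        const windowEnd = start.date.getTime() + 365 * DAY_MS;
        const inWindow = recent.filter(
          (s) => s.date.getTime() >= start.date.getTime() && s.date.getTime() <= windowEnd
        );
        const revenue = inWindow.reduce((sum, s) => sum + s.grossRevenue, 0);
        if (inWindow.length >= 200 && revenue >= 5000) return true;
      }
      return false;
    }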
No, I will not reply to anything else here. I linked the text of the bills; those are going to be more authoritative than anything else.