Privacy Update: FCC Seeks Public Comment on the Current and Future Regulation of the "Internet of Things"

Headlines that Matter for Privacy and Data Security

US News

FCC Seeks Public Comment on the Current and Future Regulation of the “Internet of Things” 

The FCC is requesting public comment in a proceeding that will help determine the scope and nature of regulation of the “Internet of Things” (IoT) for the next several decades. Industries and technologies affected by this proceeding include, but are not limited to, those operating in the following settings: industrial/commercial (e.g., automotive, factory automation), residential (e.g., “connected homes”), public safety, and government. The FCC is seeking input on topics such as the role of licensed and unlicensed spectrum in IoT growth, FCC rules that could be modified to facilitate greater spectrum access for IoT deployments, and regulatory barriers to particular IoT use cases in specific frequency bands. Initial Comments are due by November 1, 2021; Reply Comments are due by November 16, 2021.

Sonic Corp. Data Breach Class Action Moves Forward

An Ohio federal judge denied fast-food company Sonic Corp.’s motion for summary judgment in a class action lawsuit filed by a class of financial institutions over a breach in which hackers used malware to access customers’ payment card data through Sonic’s point-of-sale systems. In denying the motion, the court noted that Sonic owed an obligation to the financial institutions, stating, “Sonic’s affirmative acts created a risk of harm, and Sonic knew or should have known that the risk of hacking made its flawed security practices unreasonably dangerous.” Those acts included maintaining a “permanently-enabled VPN tunnel” that granted system access to anyone with credentials, without requiring multifactor authentication. The case is ongoing.

SEC Reaches $10M Settlement with Mobile App Alternative Data Provider Over Securities Fraud Charges

In its first enforcement action against an alternative data provider, the SEC recently reached a settlement with App Annie Inc., an alternative data provider for the mobile app industry, and its co-founder and former CEO and Chairman, Bertrand Schmitt, over securities fraud charges for engaging in deceptive practices and making material misrepresentations about how App Annie’s alternative data was derived.

According to the order, App Annie is one of the largest sellers of market data on mobile app performance, including estimates of the number of times a particular company’s app is downloaded, how often it’s used, and the amount of revenue the app generates for the company (so-called “alternative data”).

The order finds that App Annie and Schmitt understood that companies would share their confidential app performance data with App Annie only if it promised not to disclose the data to third parties. As a result, App Annie and Schmitt assured companies that their data would be aggregated and anonymized before being used by a statistical model to generate estimates of app performance. Contrary to these representations, the order finds that from late 2014 through mid-2018, App Annie used non-aggregated and non-anonymized data to alter its model-generated estimates to make them more valuable to sell to trading firms. The SEC found that App Annie and Schmitt violated the antifraud provisions of Section 10(b) of the Exchange Act and Rule 10b-5 and ordered them to pay $10 million and $300,000, respectively. Schmitt is also prohibited from serving as an officer or director of a public company for three years.

National Institute of Standards and Technology (NIST) Workshop Considers Internet of Things Labeling Improvements

In a September 2021 virtual public workshop, NIST worked to (1) “identify IoT cybersecurity criteria for a consumer labeling program;” and (2) “identify secure software development practices or practices for a consumer software labeling program.” The workshop, which was recorded, featured panel discussions and presentations based on consumer software labeling position papers submitted to NIST and on potential IoT baseline security criteria NIST shared in August.

California Invites Preliminary Comments on Proposed Rulemaking Under the CCPA

California’s Privacy Protection Agency has invited the public to submit preliminary comments on topics including cybersecurity audits and risk assessments performed by businesses, potential consumer opt-outs with respect to automated decision-making technology, agency-conducted audits, consumers’ rights to delete, to correct, and to know, consumers’ right to opt out of the selling or sharing of their personal information, consumers’ right to limit the use and disclosure of their sensitive personal information, and the information provided in response to a consumer request to know, among others. The deadline for comments is November 8, 2021.

Illinois First District Appellate Court Rules that Only BIPA Claims Rooted in Unlawful Disclosure are Subject to IL’s One-Year Privacy Claim Limit

The Illinois Appellate Court, First District, has ruled that different statutes of limitations apply under the state’s biometric privacy law depending on the type of claim made. Specifically, the court held that the filing window for BIPA claims is one year for claims of unlawful disclosure and five years for claims involving notice, consent, and retention. The decision came in a 2019 proposed class action in which two Black Horse drivers claimed the company failed to obtain consent to use drivers’ fingerprints, failed to institute a retention schedule, and unlawfully disseminated the biometric data by sharing fingerprints with a company that processed timekeeping records. The appellate court’s decision was handed down on appeal after Black Horse sought to dismiss the claims as time-barred.

Global News

Irish Data Protection Commission Publishes Guidance on Risk-Based Approaches to Data Processing

In its guidance, the Commission notes that organizations that process personal data must take steps to ensure that the data is handled legally, securely, efficiently, and effectively to deliver the best possible care. To do so, organizations may construct a risk profile, determined according to the personal data processing operations carried out, the complexity and scale of data processing, the sensitivity of the data processed, and the protection required for the data being processed. 

In creating this profile, organizations may wish to review Recital 75 of the GDPR, which describes the tangible harms organizations must safeguard against, such as discrimination, identity theft or fraud, and financial loss. Organizations may also wish to refer to the companion concepts in Article 25 of the GDPR: Data Protection by Design (embedding data privacy features and privacy-enhancing technologies directly into the design of projects at an early stage) and Data Protection by Default (ensuring that user service settings are automatically data protection-friendly and that only the data necessary for each specific purpose of the processing is gathered). Finally, the Commission advises that Data Protection Impact Assessments are useful tools to help data controllers demonstrate compliance, even when not mandatory.

UK Announces National AI Strategy

The UK announced its national AI strategy aimed at facilitating the country’s capabilities for AI and machine learning tech over the next ten years. The plan aims to:

  1. Invest and plan for the long-term needs of the AI ecosystem.
  2. Support the transition to an AI-enabled economy.
  3. Ensure appropriate national and international governance of AI technologies.

Among the actions to be taken in support of these aims are publishing a framework for the government’s role in enabling better data availability, publishing the Ministry of Defence’s approach to the use of AI, and undertaking a review of international and domestic approaches to semiconductor supply chains.

Ontario Government Publishes White Paper Outlining Proposals for Private-Sector Privacy Regulation

As part of its Digital and Data Strategy, the Government of Ontario published a white paper proposing regulations to address gaps in Ontario’s legislative privacy framework, including changes to automated decision-making, consent laws, data transparency, and children’s privacy. The paper also proposes a “fundamental right to privacy” and “fair and appropriate measures” for organizations to collect, use, or disclose personal data. Many of the paper’s proposals mirror established privacy regimes such as the GDPR. The comment period for this proposal closed on September 3, 2021.
