What are the general policies businesses should follow when interacting through virtual assistants and chatbots? A recent panel at South by Southwest in Austin discussed best practices, current laws, and upcoming legislation that will affect how companies do business.

“The law is the tortoise and technology is the hare,” said Hannah Taylor, counsel to Frankfurt Kurnit Klein & Selz.

Proof of that statement can be found in the lack of a federal privacy law in the U.S. and in older guidelines from the Federal Trade Commission on what is needed for a disclosure to be lawful, said Taylor.

The FTC has stated its position on privacy disclosures online as being “if a platform does not provide an opportunity to make proper disclosures, then it should not be used to disseminate advertisements that require such disclosures.”

Under this policy, the FTC effectively argued in its statement that Twitter should not be used as a platform for advertising due to its character limit, said Taylor.

Platforms, including Facebook, have developed their own privacy policies that they expect developers to adhere to. Facebook's policies are under the spotlight in the wake of the Cambridge Analytica controversy.

“You have to align your privacy policy to [Facebook’s] practices,” said Daniel Goldberg, counsel to Frankfurt Kurnit Klein & Selz. “There are prohibitions within their terms and policies, so you can’t go out and use that data if it says otherwise.”

However, Audrey Wu, CEO and co-founder of Convrg, noted that many of these policies are hardly, if ever, enforced by the platforms, and app developers often find ways to circumvent such policies.

Cambridge Analytica provides a textbook example of a developer that did not adhere to Facebook’s policy, which states: “If you’re going to use [Facebook’s] data for purposes, it has to be for very specific scientific research; and it can’t be used for political and other purposes.” Cambridge Analytica acquired the personal data of 87 million Facebook users through an app called “thisisyourdigitallife.”

The Cambridge Analytica/Facebook debacle drew greater scrutiny of industry practices from the Federal Trade Commission, state legislatures, and other entities.

California was the first to respond, creating new laws to address personal data on the internet, including the Internet of Things Bill (requiring reasonable security when customers are using a connected device, such as an Amazon Echo running Alexa) and the California Consumer Privacy Act.

The California Attorney General’s Office identified six items as privacy principles, and these have been largely adopted in some shape or form by attorneys general in other states, said Goldberg. Those principles are: transparency, choice, reasonable security, limit collection and retention, sensitive data, and reasonable expectations.

Goldberg defined the six terms as:

• Transparency—“Are you transparent about how you’re collecting information, how you’re using information, what the purpose is, and how you’re sharing it?”
• Choice—“Does the user actually have a choice on the type of data you’re collecting from them?”
• Reasonable security—“Are you taking reasonable security measures?”
• Limit collection and retention—You’re “not just collecting data for the purposes of collecting it.”
• Sensitive data—“Limit the amount you collect.”
• Reasonable expectations—“Don’t be creepy. If you’re using the data and people would really not expect you’re using the data on the platform for that reason, and you’re being creepy, you should reevaluate how you’re using that data.”

Every state in the U.S. has laws addressing reasonable security through data breach laws, but these laws don’t necessarily cover the type of data collection being done by companies online, said Goldberg.

“What’s really interesting about Facebook/Cambridge Analytica is that it was not considered an actual data breach under the law … The type of data that was leaked through that, or was used improperly, was not the type of personal data as defined under those applicable statutes. That’s changing,” said Goldberg.

Taylor said this was an occasion where “the law is moving quickly enough to address things in real life.” However, the panel agreed the law was possibly moving “too quickly.”

Goldberg pinpointed the California Consumer Privacy Act, which goes into effect in January 2020, as a law that moved too quickly and without proper research.

“There are a lot of issues with that bill,” said Goldberg. “Everyone in the tech industry is trying to scramble to find out how to deal with it.”

The act gives users the ability to opt out of the collection of their personal data by apps and websites.