
Roomba testers feel cheated after intimate images ended up on Facebook

“A lot of this language seems designed to exempt the company from applicable privacy laws, but none of it reflects the reality of how the product operates.”

Moreover, all test participants must agree that their data may be used for machine-learning training and object detection. Specifically, the “use of research information” section of the global test agreement requires an acknowledgment that “text, video, images, or audio… may be used by iRobot for analyzing statistics and usage data, diagnosing technology problems, enhancing product performance, product and feature innovation, market research, trade presentations, and internal training, including machine learning and object detection.”

What is not stated explicitly is that iRobot carries out its machine-learning training through human data labelers who teach the algorithms, click by click, to recognize individual elements captured in the raw footage. Put another way, the agreements shared with us never explicitly mention that personal images will be seen and analyzed by other humans.

Baussmann, a spokesperson for iRobot, says that the language we highlighted “covers a variety of testing scenarios” and is not specific to images submitted for data annotation. “For example, testers are sometimes asked to take photos or videos of a robot’s behavior, such as when it gets stuck on a certain object or fails to fully dock, and send those photos or videos to iRobot,” he wrote, adding that “for tests in which images will be captured for annotation purposes, there are specific terms that are outlined in the agreement pertaining to that test.”

He also wrote that “we cannot be sure the people you have spoken with were part of the development work that you are referencing,” though it is worth noting that he did not contest the authenticity of the global test agreement, which ultimately allows all test users’ data to be collected and used for machine learning.

What do users really understand?

When we asked privacy lawyers and scholars to review the consent agreements, and shared the test users’ concerns with them, they said the documents and the privacy violations that ensued are emblematic of a broken consent framework that affects all of us, whether we are beta testers or everyday consumers.

Experts say companies are well aware that people rarely read privacy policies closely, if at all. But what iRobot’s global test agreement attests to, said Ben Winters, an attorney with the Electronic Privacy Information Center who focuses on AI and human rights, is that “even if you do read it, you still don’t get clarity.”

Rather, “a lot of this language seems designed to exempt the company from applicable privacy laws, but none of it reflects the reality of how the product operates,” says Cahn, pointing to the robot vacuum’s mobility and the impossibility of controlling where potentially sensitive people or objects, in particular children, are located in one’s home at any given moment.

Ultimately, the agreement “place[s] much of the responsibility… on the end user,” said Jessica Vitak, an information scientist at the University of Maryland’s College of Information Studies who studies best practices in research and consent policies. Yet it fails to give them a real accounting of “how things might go wrong,” she said, “which would be really valuable information when deciding whether to participate.”


