Human Rights Watch discovered that photos of Australian children were included in Laion-5B, a large AI training dataset, without the consent of the children or their parents. The dataset has been used by AI companies such as Stability AI and Midjourney to train their systems. This is a serious privacy violation, and it shows there is too little oversight of how data is collected and used in AI.
Using these photos without permission is not only a privacy problem but also raises serious ethical questions. It exposes a gap in how training data is managed and points to the need for tighter rules and clearer consent processes.
The investigation found that the children in the photos could be easily identified, and some images were even accompanied by names and other personal details. That kind of exposure could lead to harms such as identity theft or misuse of the images.
Laion, the organization behind the dataset, is working with Human Rights Watch to fix the problem by removing the sensitive data. However, it noted that even once the images are removed from the dataset, copies remain on the internet for others to collect and use.
There is a growing push for stronger laws to protect children's photos and to ensure their images do not end up in AI datasets without explicit permission.
AI companies also bear responsibility: they should not use personal information unless it is genuinely necessary and they have a proper legal basis to do so.
This piece looks at the ethical issues and privacy concerns around using children's photos in AI datasets, including what went wrong with Laion-5B and what steps are being taken to keep digital identities safe.
Source: The Guardian