
Google’s Photo App Still Can’t Find Gorillas. And Neither Can Apple’s. – UnlistedNews

Google and Apple are two of the biggest tech companies in the world, constantly pushing the boundaries of artificial intelligence. Yet despite all of their advances, both companies' photo apps share a conspicuous failure: they still cannot correctly label gorillas in photos.

While this issue may seem small in the grand scheme of things, it points to a larger problem within the field of AI and machine learning: despite incredible advances, significant limitations remain unsolved.

The problem was first brought to light in 2015, when Google Photos was found to be mislabeling photos of African Americans as gorillas. Google apologized and suppressed the offending label quickly, but the fix was a workaround rather than a solution: the app simply stopped applying primate labels at all, and it never regained the ability to identify gorillas correctly. A recent test by a New York Times journalist found that both Google Photos and Apple Photos still fail when asked to find photos of gorillas.

This failure is rooted in how AI and machine learning work. These systems learn from large amounts of data, identifying patterns that they then use to make predictions. When that data is biased or limited in some way, the patterns the system learns can be wrong. The trouble with labeling gorillas likely stems from a lack of diversity in the training data: when a category is underrepresented, the model has fewer examples to learn from and recognizes it less reliably.
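To make the underrepresentation point concrete, here is a toy sketch, nothing like Google's or Apple's actual systems: a k-nearest-neighbour classifier trained on imbalanced data. The feature values, class names, and counts are all invented for illustration.

```python
# Toy illustration of class imbalance: class "cat" has many training
# examples, class "gorilla" has only two. With k = 5 neighbours, even a
# query sitting right on top of the minority examples gets outvoted.
from collections import Counter

def knn_predict(train, query, k=5):
    """Label `query` by majority vote among its k nearest training points."""
    nearest = sorted(train, key=lambda item: abs(item[0] - query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# One-dimensional feature values stand in for learned image embeddings.
training_data = [(x / 10, "cat") for x in range(10)]    # 10 examples near 0
training_data += [(5.0, "gorilla"), (5.1, "gorilla")]   # only 2 examples near 5

print(knn_predict(training_data, 5.05))  # "cat": the minority class is outvoted

# Adding more minority examples (more diverse data) fixes the prediction.
training_data += [(5.0 + i / 100, "gorilla") for i in range(3)]
print(knn_predict(training_data, 5.05))  # now "gorilla"
```

The point of the sketch is not the algorithm but the data: the classifier's logic never changes, yet its answer flips once the underrepresented class is given enough examples.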

While this issue may seem small, it highlights a larger problem within the field of AI: bias. Bias creeps into AI systems when the data used to train them is skewed or unrepresentative, and it can lead those systems to draw incorrect or harmful conclusions. For example, a facial recognition system trained on a dataset dominated by one demographic may misidentify people of other races or ethnicities at much higher rates.

To combat this issue, those working in AI and machine learning need to make a conscious effort to ensure that the data being used to train these systems is diverse and free from bias. This means working to collect and include more diverse data, as well as implementing more rigorous testing and analysis of these systems to identify and correct bias. Only by doing so can we ensure that AI and machine learning continue to advance in a way that is fair and equitable for everyone.
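One concrete form of the "more rigorous testing" described above is disaggregated evaluation: reporting accuracy per class or per group rather than a single overall number, so that under-served categories cannot hide inside an average. The sketch below uses made-up results purely for illustration.

```python
# A minimal disaggregated evaluation: compute accuracy separately for
# each true label instead of one aggregate score. The numbers are invented.
from collections import defaultdict

def accuracy_by_class(examples):
    """examples: list of (true_label, predicted_label) pairs."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, guess in examples:
        total[truth] += 1
        correct[truth] += (truth == guess)
    return {label: correct[label] / total[label] for label in total}

results = [("cat", "cat")] * 95 + [("cat", "dog")] * 5 \
        + [("gorilla", "gorilla")] * 3 + [("gorilla", "cat")] * 7

print(accuracy_by_class(results))
# -> {'cat': 0.95, 'gorilla': 0.3}; the overall accuracy of ~0.89 hides the gap
```

An audit like this would surface the gorilla-labeling gap immediately, whereas a single headline accuracy figure would not.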

In conclusion, while the issue of correctly labeling gorillas in photos may seem insignificant, it is a key example of the larger issue of bias in AI and machine learning. Companies like Google and Apple need to work to ensure that their systems are free from bias and that the data used to train them is diverse and inclusive. Only then can we truly unlock the potential of these technologies for the betterment of society.


Sara Marcus