
SELECTED BLOGS

I chose these three blog posts because I feel the most confident about them and because they carry the passion that makes me proudest of my writing. They grasp much of the context of the specified chapters, and they either expanded my knowledge or solidified ideas I already held, which became more definite through the readings. I feel they encapsulate what I gathered from these readings, and I view them as an important record of putting what I learned into words.


SYNTHESIS REPORT

     From all of the readings, I found that tribal sovereignty, misrepresentation, and algorithm glitches are three concepts that intersect in various ways. To understand this, you must know that tribal sovereignty refers to the inherent right of indigenous nations to govern themselves and their territories. Misrepresentation, on the other hand, refers to the act of distorting or misinterpreting information about an individual or group. Algorithm glitches occur when software or computer programs malfunction or don’t have the correct information, leading to unintended consequences. Throughout this essay, I will cover how these three concepts intersect and why it is important to pay attention to these intersections.

     To begin, one intersection between tribal sovereignty, misrepresentation, and algorithm glitches is in the realm of data collection. Many companies and government agencies collect data about indigenous peoples, often without their consent or knowledge. This data can be used to make decisions that affect indigenous communities, but if it is collected and analyzed without an understanding of the unique cultural and historical context of those communities, the results can be skewed or misinterpreted. In addition, algorithm glitches can produce inaccurate data that is then used to make decisions, further adding to the problem of misrepresentation.
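     To make that mechanism concrete, here is a minimal sketch of how sparse or glitch-corrupted records can skew an automated decision. Everything in it is invented for illustration: the communities, the records, and the simple majority-vote rule all stand in for far more complex real systems.

from collections import Counter

# Hypothetical records of (community, outcome). Community B is
# underrepresented, and its few records were corrupted by a logging glitch.
records = (
    [("A", "approve")] * 90
    + [("A", "deny")] * 10
    + [("B", "deny")] * 5   # sparse, glitch-corrupted entries
)

def majority_rule(data, community):
    """Predict whatever outcome was most common for a community."""
    outcomes = Counter(outcome for c, outcome in data if c == community)
    return outcomes.most_common(1)[0][0]

print(majority_rule(records, "A"))  # approve
print(majority_rule(records, "B"))  # deny, driven entirely by five bad records

     The point is not the specific rule but the pattern: when one community's data is thin or wrong, the "decision" simply echoes the flaw.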

     Another intersection can be seen in the domain of social media and online platforms. As mentioned previously, indigenous peoples are often misrepresented or stereotyped in mainstream media and popular culture, and this can also occur on social media platforms. For example, the use of indigenous cultural symbols or practices by non-indigenous individuals can be seen as cultural appropriation and can perpetuate harmful stereotypes. Algorithm glitches can also amplify these issues, as social media algorithms may promote or amplify content that is based on stereotypes or misrepresentations.
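     The amplification piece is easy to see in a deliberately simplified feedback loop. The sketch below assumes a feed that ranks purely by engagement; real recommender systems are far more elaborate, and the posts and numbers here are made up.

import random

# Hypothetical posts with starting engagement counts; sensational,
# stereotype-driven content often starts with a small edge.
posts = {"nuanced history thread": 10, "stereotyped meme": 12}

for step in range(5):
    top = max(posts, key=posts.get)      # engagement-only ranking
    posts[top] += random.randint(1, 3)   # exposure earns the top post more clicks

print(posts)  # the early leader snowballs: the loop amplifies whatever it shows

     Even this toy loop never lets the second post catch up, which is the core of the amplification worry.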

     A specific current issue that illustrates these intersections is the ongoing struggle of indigenous peoples to protect their sacred sites and cultural practices. Many indigenous sacred sites have been desecrated or destroyed due to development projects or resource extraction. In addition, non-indigenous individuals and organizations may appropriate indigenous cultural practices or symbols without understanding their cultural significance, leading to misrepresentation and erasure of indigenous voices and perspectives. Algorithm glitches can worsen these issues, as search engine algorithms may prioritize information from non-indigenous sources, leading to further misrepresentation and erasure of indigenous perspectives.

     The intersections of tribal sovereignty, misrepresentation, and algorithm glitches are important to understand because they have real-world implications for indigenous peoples. For example, inaccurate data or misrepresentation can lead to policies or decisions that harm indigenous communities or perpetuate harmful stereotypes. In addition, the erasure of indigenous voices and perspectives can lead to the loss of important cultural knowledge and practices.

     It is also important to recognize these intersections because they are often interconnected with larger issues of systemic oppression and colonialism. The history of colonization and forced assimilation has led to a legacy of mistrust and exploitation between indigenous peoples and non-indigenous governments and institutions. In this context, the collection and use of data can be seen as an extension of colonialism, as it often occurs without the consent or input of indigenous communities. Misrepresentation and erasure of indigenous voices can also be seen as part of a larger pattern of cultural erasure and assimilation.

     These intersections of tribal sovereignty, misrepresentation, and algorithm glitches are complex and multifaceted. They have real-world implications for indigenous peoples, and it is important to recognize and understand them in order to address the root causes of these issues. By doing so, we can work towards a more equitable and just society that respects the inherent right of indigenous nations to govern themselves and their territories.

     In addition to understanding the intersections of tribal sovereignty, misrepresentation, and algorithm glitches, it is also important to consider the ways in which these issues can be addressed. One approach is to involve indigenous communities in the data collection process, ensuring that data is collected with their consent and input. This can help to ensure that data is accurate and reflects the unique cultural and historical context of indigenous communities.

     Another approach is to prioritize the voices and perspectives of indigenous peoples in media and online platforms. This can be done by promoting indigenous-led media outlets and organizations, and by developing algorithms that prioritize indigenous voices and perspectives. By doing so, we can begin to shift the narrative away from harmful stereotypes and towards a more accurate and nuanced understanding of indigenous cultures and histories.
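     As one illustration of what "algorithms that prioritize indigenous voices" could mean in practice, here is a minimal re-ranking sketch. The result list, scores, and boost value are all invented, and the domain list is only an example; a production system would need community input on who belongs on it.

# Hypothetical search results as (url, relevance_score) pairs.
indigenous_led = {"indiancountrytoday.com", "nativenewsonline.net"}  # example outlets

results = [
    ("bigmedia.example.com/article", 0.92),
    ("indiancountrytoday.com/story", 0.88),
    ("blog.example.org/post", 0.85),
    ("nativenewsonline.net/report", 0.80),
]

def rerank(items, boost=0.10):
    """Sort results after adding a fixed boost to indigenous-led domains."""
    def adjusted(item):
        url, score = item
        domain = url.split("/")[0]
        return score + (boost if domain in indigenous_led else 0.0)
    return sorted(items, key=adjusted, reverse=True)

for url, score in rerank(results):
    print(url, score)  # indigenous-led sources now surface above similar scores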

     It is also important to recognize that addressing the intersections of tribal sovereignty, misrepresentation, and algorithm glitches requires systemic change. This means addressing the root causes of these issues, such as the legacy of colonization and the ongoing exploitation of indigenous lands and resources. It also means working towards a more equitable and just society that recognizes the inherent rights of indigenous nations and prioritizes their voices and perspectives in decision-making processes.

     Overall, the intersections of tribal sovereignty, misrepresentation, and algorithm glitches highlight the ongoing struggles of indigenous peoples to protect their cultures, lands, and communities in the face of systemic oppression and exploitation. By recognizing and addressing these intersections, we can work towards a more equitable and just society that respects and values the diversity of indigenous cultures and histories.


MULTIMODAL DESIGN

     In today’s world, the internet has been hailed as the great equalizer, providing access to information and opportunities to people around the world. However, recent studies have shown that search engines are far from neutral and unbiased. Search algorithms can reflect and reinforce existing societal biases, including racism. The purpose of this multimodal redesign is to examine the ways in which search engines can perpetuate racial biases and to suggest ways in which these algorithms can be improved. I chose this topic because search engines have become an integral part of our lives, and the information they provide can shape our beliefs and attitudes. By raising awareness about the issue and suggesting ways to work towards a more equitable future, we can begin to make real change.

     Search engines provide us with information on a huge variety of topics, but they are not neutral, and their algorithms can reflect and reinforce existing biases. In the reading "Algorithms of Oppression: How Search Engines Reinforce Racism," Safiya Umoja Noble examines how search algorithms can perpetuate and amplify racial biases. She argues that search algorithms have the power to shape our perceptions of race and ethnicity, and that they can reinforce harmful stereotypes. For example, a search for "black girls" may yield results that reinforce negative stereotypes about black women. Similarly, a search for "Mexican immigrants" may return results that perpetuate harmful stereotypes about the Latino community.

     Others have also discussed the issue of algorithmic bias in search engines. In the article "Race and Gender Bias in Search Engines," Latanya Sweeney and her collaborators explore how search algorithms can reflect and reinforce existing biases. They found that search algorithms can produce racially biased results for certain queries. For example, a search for "black-sounding" names may produce results that are more likely to be associated with criminal activity, while a search for "white-sounding" names may produce results that are more likely to be associated with higher-paying jobs, which should not be the case at all.
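     An audit in the spirit of Sweeney's findings can be sketched in a few lines. The search() function below is a placeholder with a fabricated index (a real audit would query an actual search or ad API), and the names echo the kinds of name lists used in that research.

def search(name):
    """Placeholder search; a real audit would query a live engine or ad API."""
    fake_index = {  # fabricated results, for illustration only
        "DeShawn": ["arrest record lookup", "background check"],
        "Geoffrey": ["executive profile", "alumni news"],
    }
    return fake_index.get(name, [])

def flag_rate(names, phrase):
    """Share of returned items that contain a flagged phrase."""
    hits = total = 0
    for name in names:
        for item in search(name):
            total += 1
            hits += phrase in item
    return hits / total if total else 0.0

print(flag_rate(["DeShawn"], "arrest"))   # 0.5 in this toy index
print(flag_rate(["Geoffrey"], "arrest"))  # 0.0

     Comparing flag rates across the two name lists is the whole audit; a large, consistent gap is the evidence of disparate results.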

     In addition to scholarly sources, several popular sources have also covered the issue of algorithmic bias in search engines. In the article "Google Searches Can Deliver Fake News, Study Finds," the New York Times reports on how search algorithms can reinforce fake news and misinformation. They note that search algorithms can amplify certain viewpoints and bury others, leading to a distorted view of the world. Similarly, the Guardian's article "Google Staff Stage Walkout over Handling of Sexual Harassment Claims" discusses how search algorithms can reinforce systemic biases, including sexism and racism. The article highlights the need for greater transparency and accountability in search engine algorithms.

     Based on the sources discussed above, it is clear that search engines can perpetuate racial biases and harmful stereotypes. This can have serious consequences, including reinforcing existing inequalities and shaping our perceptions of different racial and ethnic groups. In order to address this issue, it is important to increase transparency and accountability in search algorithms. Search engines should be more transparent about how their algorithms work and the factors that influence search results. They should also be held accountable for the impact of their algorithms on different communities.

     One potential solution is to increase diversity in the teams that develop search algorithms. By ensuring that the teams that develop search algorithms are diverse, search engines can incorporate a broader range of perspectives and experiences, reducing the likelihood of biases creeping into the algorithms. Another solution is to increase public awareness about the issue of algorithmic bias in search engines. By educating people about the ways in which search engines can reinforce harmful stereotypes, we can help people to become more critical consumers of information.

     In conclusion, search engines have become an integral part of our lives, but they are far from neutral and unbiased. Search algorithms can reflect and reinforce existing societal biases, including racism. Through this project, we have examined the issue of algorithmic bias in search engines and suggested ways in which these algorithms can be improved. This includes increasing transparency and accountability, diversifying the teams that develop search algorithms, and increasing public awareness about the issue of algorithmic bias. By taking these steps, we can work towards a more equitable future and ensure that search engines do not perpetuate harmful stereotypes and reinforce existing inequalities.

BIBLIOGRAPHY


Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Conference on Fairness, Accountability and Transparency, Proceedings of Machine Learning Research, 81, 77-91. Retrieved from https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf

Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press.

Hicks, D. (2019). Digital media ethics. Polity.

Hill, K. (2019). Why algorithms can be racist and sexist. The Guardian. Retrieved from https://www.theguardian.com/technology/2019/apr/01/why-algorithms-can-be-racist-and-sexist

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Wakabayashi, D. (2018). Google employees discussed tweaking search function to counter travel ban. The New York Times. Retrieved from https://www.nytimes.com/2018/09/21/technology/google-search-travel-ban.html
