This week’s readings and media posts introduced many new themes while also reintroducing some from previous weeks. We went over how methods can be biased not only from a societal standpoint but from an algorithmic standpoint as well. The aim of digital humanities is to further develop our techniques by thinking critically and adapting to new situations, in order to identify, reduce, and hopefully avoid biases.
In “Digital Creativity as Critical Material Thinking: The Disruptive Potential of Electronic Literature,” it was argued that digital humanities consist of two modes of thinking: gnosis, the “conceptual” side, and poiesis, the “active” side. The paper noted that the main focus of digital humanities has been gnosis, and that the distinction between the two is rarely considered in the humanities.
“The era of blind faith in big data must end | Cathy O’Neil” argued that algorithms can be biased because they are nothing but digitally programmed versions of someone’s opinions, constructed in ways that trick society into thinking along particular lines. The issue also stems from the fact that algorithms are trained on past achievements and events. But to progress as a society, we must re-evaluate and build anew from the past rather than repeat its mistakes. The dangerous part of algorithms is that they are essentially silent: no one will ever really know what is causing a problem unless intense scrutiny is brought to the code. Cathy O’Neil compared the two failure modes: an ill-designed plane crashes in full public view, while an ill-designed algorithm can hide in the depths for long periods before its effects surface. To ensure such issues do not arise, we must find ways to better understand algorithms so that we can audit them. Through auditing we can lessen the bias in these sequences of code, but even then we should remain careful and never fully trust them. This connects with “Algorithms of Oppression: How Search Engines Reinforce Racism – Chapter 1: A Society, Searching,” where Safiya Noble’s findings on Google’s autosuggest searches offer a striking example of how algorithms can silently harm or bias people in today’s society. The autosuggestions were extremely demeaning toward different races and sexes (i.e., the “three black teenagers” vs. “three white teenagers” Google image search results that sparked outrage back in 2016). Although these algorithmic issues still remain today, they have decreased drastically compared to even a decade ago.
Throughout the semester, we have been slowly working through a multitude of different tools and techniques, building upon each concept and method learned the previous week. These techniques helped us model and visualize our thoughts and impressions about digital humanities as well as other topics about the world. Some tools helped us visualize the general ideas across multiple publications (i.e., Voyant), whereas others helped display or spread awareness of a cause or idea (i.e., Omeka, visual timelines, and maps). But to truly summarize ideas in this way, we must overcome many hurdles. As mentioned in “Is a Critical Digital Humanities Possible? | Roopika Risam,” some communities do not want sensitive materials, such as religious scriptures or cultural findings, to be digitized for the general public, where they can at times go unmonitored. In my opinion, this is a very valid concern: wanting to know that what is sacred to you and your religion is being respected as it should be, and not desecrated by others online. I also think there are ways to address this, such as monitoring digital formats. By this I mean that access keys could give a select few access to the data. Although this would not allow the general public to access the files, at least a digital format would exist that could be preserved for the future.
Today we live in an age where we have instant access to almost any information we wish to attain, right from our computers. “Digital transformation – a revolution in the humanities?” noted that the use of archived digital data is a key process on which digital humanities is based, and that this archived data helps us understand the processes of our past. It emphasized that although not everyone utilizes technology, and technology cannot replace humans or the social notions we have constructed for ourselves, it remains a crucial part of understanding our views of the world. Relating this to “Can Digital Humanities Mean Transformative Critique?” and, as mentioned in “Digital Humanities: Knowledge and Critique in a Digital Age,” Chapter 8, “Towards a Critical Digital Humanities”: for digital humanities to progress, we must be open to debate and tension among peers. Critically assessing the work allows a fuller, more substantial meaning to emerge from the ideas being processed and presented. There is a “creative tension between those who’ve been in the field for a long time and those who are coming to it today.” This is only possible in an ever-changing future where the definitions of words and phrases, and even the applications of tools, are being altered and redefined toward a more complete and comprehensive digital humanities (i.e., more socially justice-minded practices, identities, and collaborations made viral with the aid of the #transformDH movement).