June 7: Computational Thinking
Intro to Coding and ChatGPT
Abstract: Today, my lovely readers, I will share with you my first experience with coding, as well as my experience with ChatGPT.
Coding
My Background in Coding: Absolutely no coding experience whatsoever
As I had zero experience in coding, I decided to start slow, easing my way into it by choosing the project "Intro to Coding". Using Codecademy, I was able to learn Python, completing 4 1/2 of the 6 required lessons and reaching 23% of the 30% required for the project. Overall, I learned many of the fundamental basics of Python, covering subjects such as syntax, strings and console output, date and time, conditionals and control flow, and PygLatin.
Picture of a Codecademy lesson problem
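To give a flavor of what those lessons cover, here is a minimal sketch of the kinds of exercises involved. This is my own illustration, not Codecademy's exact code; the `pyglatin` function below is a simplified version of the Pig Latin translator built in the PygLatin lesson.

```python
from datetime import datetime

# Strings and console output
greeting = "Hello, Python!"
print(greeting.upper())  # string method, then print to the console

# Date and time
now = datetime.now()
print(now.strftime("%m/%d/%Y %H:%M"))  # format the current date and time

# Conditionals and control flow
number = 7
if number % 2 == 0:
    print("even")
else:
    print("odd")

# PygLatin: move the first letter to the end and add "ay"
def pyglatin(word):
    word = word.lower()
    if word.isalpha():          # only translate alphabetic input
        return word[1:] + word[0] + "ay"
    return "empty"

print(pyglatin("Python"))  # prints "ythonpay"
```

Each snippet matches one of the lesson topics: string manipulation, console output, formatting dates, branching with `if`/`else`, and a small function combining them.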
Reflection: After working through the lessons, I learned the basic rules and principles of coding. However, I also learned how easy it is to get confused by those rules, since you must add constraints and specific instructions in order to get the desired result. So, while this was a great introduction to coding, I think I definitely need a little extra practice to better my coding skills and my understanding of it.
ChatGPT
In this lesson covering ChatGPT in STEM Skills, we examined the level of racial and gender bias, if any, found within AI systems.
After completing the assignment of reading two articles on racial and gender bias within AI systems, as well as testing ChatGPT by asking it to summarize the articles, I have concluded that there is racial and gender bias within AI systems, as the training and programming of this software predominantly focuses on lighter-skinned men.
Sources/Articles:
- Artificial Intelligence Has a Racial and Gender Bias Problem | TIME
- Why algorithms can be racist and sexist
Evidence of Bias:
- Though technology is assumed to be unbiased, research has uncovered "large gender and racial bias in AI systems sold by tech giants like IBM, Microsoft, and Amazon."
- "Less than 2% of employees of technical roles at Facebook and Google are black."
- "One government dataset of faces collected for testing that contained 75% men and 80% lighter-skinned individuals and less than 5% women of color—echoing the pale male data problem that excludes so much of society in the data that fuels AI."
- It is difficult to identify "exactly how systems might be susceptible to algorithmic bias"
Reflection:
So, after reflecting on and analyzing this information, it is undoubtedly clear that there is racial and gender bias within AI systems. Although such bias exists, we are researching and improving our AI systems, and we should continue to do so until they treat all users equally.